8 exceptional insights into the present and future of PLM

A PLM Talk with Prof. Jörg W. Fischer and Christoph Golinski.


You can find the original article in Christoph Golinski's PLM Blog.

After a few technical contributions, today it's time for another PLM Talk. I am very happy to have won an excellent PLM expert for today's episode: Prof. Jörg W. Fischer from the Karlsruhe University of Applied Sciences - Technology and Economics and the Steinbeis Transfer Center for Computer Applications in Mechanical Engineering (STZ-RIM) in Karlsruhe. I hope the interested visitors of this blog enjoy reading this interview.

Welcome to my blog couch, Prof. Fischer. Please make yourself quite comfortable. Many readers of this article will know you from your work at the Steinbeis Transfer Center for Computer Applications in Mechanical Engineering (STZ-RIM) Karlsruhe and from your numerous publications, e.g. on your YouTube channel. Nevertheless, could you please say a few words about yourself and your professional career to date?

Yes, of course! Hello Mr. Golinski, I am pleased to be here. Unlike many others in the PLM field, I am a production engineer, i.e. I studied mechanical engineering with a focus on production engineering at KIT in Karlsruhe. Back then, when the Internet and object-oriented programming came up, I realized that the interface between economics, mechanical engineering and information technology would be of central importance in the future. Consequently, I was looking for a job where I could learn C++ programming. So I came to the Institute of Industrial Engineering and Management (ifab), first as a student programmer and then as a research assistant, and helped to develop simulation tools for production systems. At that time, we also used these in practical industrial projects. After my doctorate, I joined Tecnomatix in 2004, which was then acquired by UGS in 2005.

That's how I got from the digital factory to PLM. Fortunately, I was working as a consultant at GM Europe, i.e. Opel, during that time. UGS had always put a lot of emphasis on supporting GM, so it was easy for me to get a very strong communication channel directly into UGS development in North America. When UGS was acquired by Siemens in 2007, I joined the team as a Business Consulting Manager and started developing PLM methodologies for the automotive industry. In 2008, I was appointed professor of manufacturing engineering with CAD-CAM-CNC at the Berlin University of Applied Sciences. The fact that part of the Siemens CAM and CNC development was located in Berlin helped me a lot. It was the perfect way to get a holistic view of the manufacturing process chain. During this time, I began to develop the topic of structure theory in PLM. There is also a whole series of scientific publications from that time.

In 2012, Prof. Hoheisel, then Dean of the Department of Mechanical Engineering at the Karlsruhe University of Applied Sciences, asked me to apply for a professorship in Production Management and Digital Factory in the Faculty of Mechanical Engineering and Mechatronics (MMT). That's how I came back to Karlsruhe as a Karlsruhe native. At the end of 2013, I then also joined the Steinbeis Transfer Center - Computer Applications in Mechanical Engineering (STZ-RIM), which was founded by Prof. Hoheisel back in 1985.

Behind this was my desire to be able to use the developed methods even better in practical applications. So in this role, I have been advising primarily large medium-sized companies on the path to digitization since 2013. The subject area I cover has become much broader as a result. Today, I advise companies holistically. PLM is an important part of this.

Let's look back a few years to the beginnings of PDM/PLM. In my case, it was my professor for design methodology who infected me with the PDM virus in the CAD lab during my mechanical engineering studies. What actually motivated you back then to deal with this topic professionally?

I actually came to PLM rather by chance, back in 2005, when UGS bought Tecnomatix. But in fact, without knowing it, I had been in contact with PLM much earlier. From 1990 to 1995, there was KIT's SFB 346 "Computer Integrated Design and Manufacturing of Components", in which the ifab, with my doctoral advisor Prof. Zuelch, played a substantial part. The foundations for PLM were laid there. Many of the technologies we used back then have long since become components of PLM systems. Many of the people involved in SFB 346 are still active in the industry today, and I'm always happy when I meet one of them. Consequently, I found my way into PLM through SFB 346 - even though it wasn't called PLM back then.

At STZ-RIM Karlsruhe, you are intensively involved in PLM strategy and process consulting. What has changed in your daily work at the company in the last 5-10 years? Are you experiencing a shift in your customers' focus on PLM?

In the last 5-10 years, from my perspective, a lot of very positive things have happened. This has an incredible number of facets, as various effects have had an impact.

In the past, people in PDM were very fixated on features & functions and the return on investment (ROI) calculation. Many believed in the Messiah of software. It was as if the mere purchase of the PDM system would make all the problems that companies had with the creation and lifecycling of product information disappear and the ROI that the PLM providers had calculated would then occur by itself.

The situation is roughly comparable to having a tidiness problem at home and believing that buying a new cabinet would solve it by itself. The right cabinet creates the basic prerequisite for good order, but that's it. This is exactly one of the insights that is finally catching on. With customers, I like to use the example of the Netflix series "Tidying Up with Marie Kondo". I then say that, in principle, what we do in the project is not unlike what Marie Kondo does.

There are a few more points that have changed. Not so long ago, the one-structure concept was preached. This was based on the idea that one product structure was sufficient to represent all aspects of the product. This approach assumed that the necessary views could be created ad hoc. The idea of the single-structure concept has turned out not to correspond to reality. It is only suitable for a small number of very special cases. For all others, it has only led to even more Excel lists and effort.

In the meantime, the multi-structure concept is being propagated. The name xBOM is also often used for it. The multi-structure concept acknowledges the fact that several different structures are created during product development, each of which also requires its own persistent structural semantics. Today, we can say very precisely in which case we need to establish which structures and how they should be designed in terms of their structural semantics. This is already an enormous step, but it is hardly ever noticed, because only a few experts are able to penetrate the subject matter and then explain it in a comprehensible way from the perspective of the companies.
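
To make the multi-structure idea more tangible, here is a minimal sketch (my illustration, not any concrete system's data model; all class and field names are invented) in which the same items are referenced by several structures, each with its own breakdown and therefore its own semantics:

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    """A part/material, identified independently of any structure."""
    number: str
    name: str

@dataclass
class Node:
    """One usage of an item inside a specific structure."""
    item: Item
    quantity: float = 1.0
    children: list["Node"] = field(default_factory=list)

@dataclass
class ProductStructure:
    """A structure with its own persistent semantics, e.g. EBOM or MBOM."""
    kind: str
    root: Node

gearbox = Item("100-000", "Gearbox")
housing = Item("100-001", "Housing")
grease  = Item("900-010", "Grease")

# Engineering view: functional breakdown.
ebom = ProductStructure("EBOM", Node(gearbox, children=[Node(housing)]))
# Manufacturing view: same items plus process materials, different breakdown.
mbom = ProductStructure("MBOM", Node(gearbox, children=[Node(housing), Node(grease, quantity=0.2)]))
```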

However, there is one change that is the most important from my point of view. Up to now, the prevailing conviction was that it was sufficient to use tools of classic process modeling in order to be able to model and design the processes in PLM. Due to this conviction, companies often create wallpapers of process sequences that are built up over years for a lot of money and then are of little or no use.

This has historical roots: process modeling was, and is, mainly used in the context of ERP implementations. There it also works very well, for good reasons. In ERP, there are very formal processes that are transactional at one hierarchy level. This is a perfect use case for process modeling, and this is where it originated.

Over 90% of the creative activities that generate information about the product in the product lifecycle are not sufficiently formal, so modeling focused solely on processes fails. To design these processes, it is necessary to model the flow of product information across the different product structures. We call this the information architecture. In order to shed light on this, we have developed tools at our Steinbeis Transfer Center (STZ-RIM) for modeling the flow of information across the information architecture, as well as for modeling the structure semantics.
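
The STZ-RIM tools themselves are not described in more detail here; purely as an illustration of the underlying idea (the object and structure names below are my own assumptions), a flow of product information across the information architecture could be recorded as source/consumer relationships and then queried:

```python
# Hypothetical flow records: which information object originates in which
# structure and which structures consume it downstream.
information_flows = [
    {"object": "geometry",          "origin": "CAD structure", "consumers": ["EBOM"]},
    {"object": "make-or-buy flag",  "origin": "EBOM",          "consumers": ["MBOM", "ERP material master"]},
    {"object": "assembly sequence", "origin": "MBOM",          "consumers": ["MES routing"]},
]

def inputs_of(structure: str) -> list[str]:
    """All information objects a given structure receives from upstream."""
    return [flow["object"] for flow in information_flows if structure in flow["consumers"]]

print(inputs_of("MBOM"))  # ['make-or-buy flag']
```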

When I used to show these approaches, people from the companies looked at me with many question marks in their eyes. Today, they are often excited and eager to learn the methods to be able to discuss, understand and implement these topics on their own.

The pressure for digitization that prevails today naturally forces this. In the past, the focus was not on digitally available information. Everything revolved around the mostly order-related development and construction of the physical product. After the product left the ramp and was accepted by the customer, the next order was taken care of.

In the scenario described, it was enough that the essential carriers of information were people who shouted things across to each other. The artist-and-hero mentality was common in mechanical engineering companies. Then there were the heroes in production; let's call them Kalle and Paule. Of Kalle and Paule you could say, as we put it in the Southwest, that "they can really do something" - and that is the highest praise here. By this we mean that you can give these people an almost unsolvable technical task and they still manage to do it perfectly and on time. So Kalle and Paule often got rough drawings and pages of e-plans thrown at them by the artists from development, and then they got it right. In the end, the product was delivered to the customer on time and everyone was satisfied.

That has changed fundamentally. Today, customers expect the necessary digital information about the product in its current modification state, and they also expect much more comprehensive after-sales service than in the past. If a service employee first has to come on site to find out which components and which software have been installed, this is increasingly met with a lack of understanding. As a result of the pressure building up from this direction, the realization is increasingly gaining ground that end-to-end management of product information and, in particular, the clean, explicit, digital storage of this information in backbone systems is a central enabler for the business process capability of companies.

For consultants like me, this makes many things easier, because a lot of effort that used to be put into persuasion is no longer needed, and you can tackle the essential issues more directly.

When you look back on your projects in recent years, what was the biggest hurdle to the success of PLM initiatives? Was it a purely commercial challenge to get the investment off the ground? Or was it more the changing ways of working and the new tools that come with it, forcing change on employees and fighting the attitude of "but we've always done it that way"? Or was it a technical hurdle because the technology and architecture of the PLM systems had too many limits and boundaries?

The hurdles you mention all apply. I think one of the primary hurdles in PLM projects is the lack of PLM readiness of the companies in question.

A PLM-ready company should, in its basic tenor, have developed the following convictions, from top management down to the working level:

  • The availability of well-constructed and correct information is seen as a basic prerequisite for business process capability and thus the success of companies.
  • The organization and design of the flow of product information in the company are understood as a lever to achieve the company's goals.
  • It is clear that the design of the information architecture must also take place at the strategic/tactical level and cannot be left solely to clerks at the operational level.
  • It is recognized that the flow of information from the market/customer, through the company and back to the customer must be considered holistically, i.e. end-to-end.
  • Measures have been taken to ensure that shaping the flow of information does not stop at departmental boundaries.
  • There is an awareness that the early availability of information upstream (towards development) generates more work, but this is well invested as it avoids a lot of problems downstream (towards production). This extra work that occurs upstream is then also accounted for in terms of capacity.
  • A PLM introduction is not compared with the introduction of an application in the sense of authoring systems, but it is known that it has an effect on a large number of central business processes of the company.

However, PLM readiness alone is not enough. There are a number of things that were hidden for a long time, and you just had to hope that they would go right.

Today, we can formulate these hidden things much more precisely. PLM implementations are based on different process patterns of information architecture. These patterns do not depend primarily on the industry, as is often assumed, but on how a company wants to position itself strategically in relation to the market. Which of these process patterns a company wants or needs to master is therefore a strategic management decision made by the company itself. This must not and cannot be made by a solution architect who accompanies the PLM implementation on the basis of gut instinct.

Added to this are the weaknesses of today's PLM systems. These descend from PDM systems, which often had no concept of the material at all. Even today, most of them have large gaps in dealing with material, especially when it becomes necessary to model the material with reference to the plant. When I see how this is sometimes implemented and what solutions are propagated, my hair stands on end.

This sometimes has the effect that some solution architects push the customer toward an unsuitable solution because they sense this gap around the material and want to proactively avoid it. The customer is then often unable to see through this and goes along with the proposed path until it comes to a crashing halt in the project.

As you can see, PLM projects contain a very explosive mix without anyone being aware of it. If, against the background explained, dissatisfaction arises among the employees concerned, it is of course justified, since what they are getting does not correspond to what they really need.

I believe, and argue, that the implementation of PLM needs to be thought of in a fundamentally different way. Before the project, the question must be answered as to which process capability a company wants to achieve tomorrow and in the future. This also makes the statement "but we have always done it this way" superfluous. If this statement were correct, the project would not exist, since the process capability would already exist in the status quo. The answer to the question of the necessary process capability is followed by the choice of the necessary process pattern for the information architecture. Such a process pattern affects the company as a whole, and thus also all backbone systems implemented or to be implemented there, i.e. CRM, PLM, ERP and MES.

The process patterns are then tailored by the companies into an overall concept for process capability and information architecture adapted to the company, which must first of all be agreed with the important stakeholders. It is not just a matter of getting the okay for the concept, but rather of generating acceptance of the change that goes hand in hand with it.

At this point, a state is reached where it should be clear to all stakeholders what needs to be done, why it is being done, and what the big goal behind it all is. Then the essential basis has been created.

Now the concept must be detailed, distributed across the backbone systems and cut into realizable slices. In principle, I recommend testing this detailed concept against a shortlist of possible backbone systems. It is not enough to be shown a solution and then to query the existence of desired functions on the basis of long Excel lists. This is a procedure from the 1990s, and when I see consulting companies propagating it today, I can only shake my head. Testing in a kind of digitization lab in the company is not just about checking the suitability of the system in question. It must be understood much more as a place for developing the future process and PLM capabilities of a company. In this digitization lab, employees are introduced to and trained in the technologies. At the same time, the technology that is right for the company is evaluated.

If the whole thing is flanked by a suitable change management approach, so that communication into the company also runs smoothly, implementation becomes much easier. The problem with this approach is the need for a larger budget. No matter how you slice it, eventually the costs are incurred anyway. Process capability is a capability that is driven by the self-image of one's own employees. It's not something you buy in with an IT system. If you ignore the preparation at the beginning and act as if it were not necessary, you will be in for a nasty surprise later.

I also have an example of this, based on a true story. Once upon a time, there was a PLM system consultant who was involved in a PLM project at a large medium-sized company. This PLM system consultant had the problem that he had to set up rights and roles in the system. So he raised the question, at the clerk level, of who was allowed to access what and when. The clerks did not know and asked their manager. The manager was upset: roles and rights were something that needed to be clarified internally at a higher management level, not at the clerk level in the PLM project. When the PLM system consultant came to the next meeting, he wondered why it felt like he was talking to a deaf, cold wall. He neither got the answer he needed nor could he continue to do his job. When, over a year later, concept slides with the management's view of roles and rights, created in the project with the support of expensive strategy consultants, finally reached the PLM system consultant, the concept was in no way implementable.

In the end, the system provider, with its supposedly poor PLM system consultants, was to blame. The system provider's sales staff had to grovel to the customer and provided him with their best PLM methods consultant for over a year.

Yes, that's how it was back then. Admittedly, this story is a little exaggerated, but in essence it really did happen that way.

In recent years, new initiatives have kept coming, and at times you already had the feeling that, with PLM, you belonged to the establishment, while the young wild ones, such as the Internet of Things or Industry 4.0, were rattling at the door of digitization. From your personal point of view and experience: are these new concepts and the potentials they hold competing with our classic PLM thinking, or will the two have to complement each other and then more or less merge?

Here I would like to broaden the question and distinguish between the technologies and the people who represent Silicon Valley's way of thinking - or what of it has reached us.

First of all, the people: I often encounter the "young wild ones" as consultants or as employees in companies. They want to shake up anything and everything, but often lack a little understanding of the complex interrelationships. I think it's simply a question of where you put them. Imagine a vertical dividing line between the market and the company. The first question a company has to ask itself, toward the market, is whether its current business models are viable in the future, whether new business models are emerging and what it wants to earn money with in the future. It is essential to answer this question, and in this context it makes perfect sense to put the "young wild ones" to work on answering it. However, the result must be viable proposals for new business models and not a collection of wild ideas.

If a company now decides to implement such a new business model, then it must be examined to what extent the internal process capability must be developed in order to be able to support this business model.

Here is an example to help you understand: in the future, it is expected that the plant developer will find all the necessary components from all manufacturers, with all the necessary data, in the IT tools with which he develops the plant. Until now, plant manufacturers have had to laboriously build such libraries themselves in the PLM. Why is that? The component manufacturers could do this much better for all their customers. Then it would only have to be done once per component manufacturer and not over and over again by every plant manufacturer. Now, you could say they already do this, because they usually offer 3D data on their websites, e.g. via CADENAS, along with data sheets. But that is not what I mean. What I am talking about is a fully classified library that can be retrieved from the PLM system, so that you can drag and drop the component into the 3D model and the e-plans, fluid plans and BOM information are then available in the PLM in a buildable form.
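
As an illustration of what such a fully classified catalog entry could bundle (a sketch; the field names, classification code and URIs are invented, not a real manufacturer's catalog format):

```python
from dataclasses import dataclass

@dataclass
class CatalogComponent:
    """Hypothetical, fully classified catalog entry that a plant builder could
    drag and drop into the 3D model; all fields are illustrative."""
    manufacturer: str
    order_number: str
    classification: str    # e.g. an eCl@ss-like class code
    cad_model_uri: str     # 3D geometry for mechanical design
    eplan_macro_uri: str   # electrical engineering artifact
    fluid_plan_uri: str    # fluid engineering artifact
    bom_attributes: dict   # weight, voltage, ... in buildable form

valve = CatalogComponent(
    manufacturer="ExampleCo",
    order_number="V-4711",
    classification="27-27-26-01",
    cad_model_uri="https://example.com/cad/V-4711.step",
    eplan_macro_uri="https://example.com/eplan/V-4711.ema",
    fluid_plan_uri="https://example.com/fluid/V-4711.fds",
    bom_attributes={"weight_kg": 0.4, "voltage_V": 24},
)
```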

If I were to design the digitization roadmap for a component manufacturer, there would be a number of projects on it working on setting up the internal processes in such a way that the information needed for such catalogs of my products would already be generated as a matter of course during product creation. Of course, this would require an adaptation of the internal PLM processes. This would follow the classic PLM idea very closely, and that idea is still highly topical. I would then entrust the implementation to the experienced experts and not to the "young wild ones". However, it would be good if the "young wild ones" had already worked out viable concepts for earning money with this in advance.

Now to the actual aspect of the question, namely whether the classic PLM idea will remain. There are a number of technologies that could fundamentally change PLM and PLM systems as we know them in the long term. Here, for example, I see the topic of graph databases as central. Product structures are rooted trees in the sense of graph theory, so it is obvious that they can be handled much better in graph databases. When you see the speed with which graph databases deliver millions of records in milliseconds, it is quite impressive.
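
To illustrate why rooted-tree product structures lend themselves to graph traversal, here is a minimal sketch in plain Python (a graph database would answer the same question with a single query over far larger data; the structure and quantities are invented):

```python
# Product structure as an adjacency list: parent -> [(child, quantity), ...]
structure = {
    "gearbox": [("housing", 1), ("shaft", 2)],
    "housing": [("bolt", 8)],
    "shaft":   [("bearing", 2)],
    "bolt":    [],
    "bearing": [],
}

def explode(node: str, qty: float = 1.0, level: int = 0) -> None:
    """Multi-level BOM explosion by recursive traversal of the rooted tree."""
    print("  " * level + f"{node} x {qty:g}")
    for child, child_qty in structure[node]:
        explode(child, qty * child_qty, level + 1)

explode("gearbox")
```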

If this idea is paired with distributed, decentralized data nodes, whose access authorization is handled via transactions using blockchain, then a central database would no longer be necessary and thus completely different PLM scenarios would become possible. Protected data on products and components could be located on every computer in the world. Distributed teams, and also freelancers, could develop very complex products together. There could be a kind of open-source catalog for solution principles and components, in which everyone who has a good idea contributes his solution principle and, as with open-source software, attaches a corresponding license condition.

If such an approach became big, then the established PLM manufacturers could well start sweating. This idea can now be extended to production and supply chains. In the case of physical products, it is often not only the development that is at stake, but also the ability to produce them. If this could be solved, for example through additive and subtractive production processes, then this would have the potential to fundamentally change industry and the structures of companies as we know them today. So far, I have only seen such approaches as very tender shoots, and therefore I think they will still be a while in coming.

Back to more immediate topics: one that I find very interesting and worth mentioning is the topic of the data octopus, as my dear colleague Martin Eigner likes to put it. The idea behind this is to leave the legacy systems, i.e. the implemented basis of the backbone systems, untouched and to build new processes on a clean (PLM) platform above them. This platform connects to the legacy systems and extracts the necessary data from there.

Different approaches are often mixed up in the discussion about this. One is to persist data on a new platform and run business processes over this data. This is then ultimately the introduction of a new parallel PLM system and brings with it major challenges in data synchronization between the legacy systems and the new system.

The other focuses on better data representation and data analysis, which will be urgently needed in the future. Here, the data from the legacy systems is simply collected in order to present it to the user in a common context. Since the data remains unchanged and is ideally only pulled ad hoc, the synchronization problem is not relevant in this case.
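
A minimal sketch of this second, federation-style variant (the connector functions and returned fields are placeholders, not real system APIs): data is pulled ad hoc from the legacy systems and only combined for presentation, so there is nothing to keep in sync.

```python
# Placeholder connectors; in reality these would call the ERP/PDM APIs.
def fetch_from_erp(material_number: str) -> dict:
    return {"number": material_number, "plant": "0001", "stock": 42}

def fetch_from_pdm(material_number: str) -> dict:
    return {"number": material_number, "cad_document": "DOC-123", "revision": "B"}

def federated_view(material_number: str) -> dict:
    """Combine legacy data ad hoc into one context; nothing is persisted
    in the overlay platform, so no synchronization problem arises."""
    view: dict = {}
    view.update(fetch_from_erp(material_number))
    view.update(fetch_from_pdm(material_number))
    return view

print(federated_view("100-001"))
```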

Since many companies are trapped in very old customizations of their ERP, and the step out of there would be equivalent to a new implementation of the ERP, the data octopus approach is very tempting at a high management level, since it suggests that you can leave everything as it is. Therefore, I think we will see a trend in this direction. From my point of view, it is difficult to judge today to what extent the approach is sustainable. If you set up such a concept with a flexible, lightweight PLM platform, it can be very successful in some cases. You just have to look very carefully at what you are doing.

Against the backdrop of an increasingly complex data world, the question arises as to whether new architectures and concepts for product data management need to be considered here. Do you see promising approaches from research and science there?

The thing with research and science is a special topic from my point of view. It seems to me that many in the research community believe that the structural issue in terms of information architecture that needs to be solved in PLM has already been solved. Since it is not at all easy to formulate research questions from these questions that arise during the introduction of PLM, and even more difficult to convince reviewers to mobilize funds for projects, one can certainly take this standpoint.

The consequence is that the keyword topics for which there is funding are researched. These include topics such as Model Based Systems Engineering (MBSE), Artificial Intelligence (AI) and topics related to the Internet of Things, i.e., communication between machines and end devices, etc. There are a number of research projects in this direction that have delivered great results. First and foremost, I would like to mention the MecPro2 project by my colleague Eigner, whose results I liked very much.

But let's go back to the topic of new architectures and concepts for product lifecycle management. This topic can only be penetrated with very good industry experience. It is therefore very difficult for a research institute with young scientific staff without the necessary experience to go into the depths of this topic. Another hurdle is the problem that lifecycle effects cannot be replicated in the laboratory, or only with difficulty. Data in the laboratory is not subject to the dynamics of the living data of companies that have to process thousands of orders per week, for example.

I think the whole thing is similar to ERP back then. After the basic functions of ERP had been defined, for example by Hackstein, that was more or less it from a research perspective. The rest was implemented by the solution providers, first and foremost, of course, in the SAP environment. Even today, most books that deal with ERP, e.g. practical scheduling or the like, come from the environment of the ERP providers and are usually written by consultants and not by professors.

A shining exception was, and is, Professor Scheer, who achieved real breakthrough power with IDS Scheer and the modeling approach of event-driven process chains. From my perspective, however, Scheer is and was more of a consultant than a professor. In my years as a young assistant, I read many of his books, and they inspired me a lot.

Today, we have a similar situation in PLM. The complex interrelationships can be researched much better in practice. That's why I don't know of any research project that deals intensively with these topics. Only now and then do I come across an industry dissertation that deals with aspects of the topic.

In order to break down topics around information architecture and to further develop future concepts for product data management, some consultant friends, selected companies and our STZ-RIM team have formed a kind of virtual network. Our major goal is to define and systematize process patterns in PLM. Such a process pattern will then form the frame of reference for the implementation. As a work result, we want to formulate a catalog of process patterns that we can assign to companies according to their needs. The assignment is then based on characteristics of the respective company itself and characteristics of the products manufactured by the companies. Since we are convinced that this topic can be solved better in industrial projects, we do not primarily act in research projects, but work with our industrial partners on corresponding concepts. Since concepts only have a value if they can be implemented, we also accompany the implementation and, if necessary, adapt the concepts according to the requirements of the implementation.

What about the cloud? Is it just a server "located somewhere else", or is it the strategy and solution for IT infrastructure management for enterprise software like PLM?

The cloud in the sense of "cloud and nothing else" is, in a first stage, an issue for infrastructure management. Suddenly it is no longer necessary to keep one's own installation alive. This in itself will have a major impact on IT departments in medium-sized companies. To put it flippantly, IT in midsize companies often works like this: the IT department hangs under the commercial director and has been cut to the bone. It sees its task as keeping SAP running, writing Z-transactions and installing laptops. The height of sophistication is then perhaps a ticket system.

With the cloud, IT departments with this kind of self-image quickly come under strong pressure, since on the one hand the essential work content falls away and on the other hand a vacuum is created, because no one takes on the design of an end-to-end information architecture across the backbone systems. Strategic departments then emerge whose task it is to define the processes and information architecture end to end and thus compensate for the vacuum in the IT departments.

I see this already today in many companies. It often starts in the places where the vacuum is first noticed, e.g. in development or in organizational development - both of which are obvious. Let's look at the development department: there, new services for the customer are being considered, e.g. extended service offerings. These are often not manageable with the existing processes and on the basis of the existing information architecture, which creates pressure from this direction to change something. In the case of organizational development, the situation is similar. These departments start to model processes and thus create awareness, understanding and responsible persons for these very processes. If new requirements are now placed on the processes, they often arrive first at process management, and this department then takes the reins.

Against this background, I think we will see a very interesting change in IT departments, as the classic system and installation mindset no longer has a future. CIOs and IT managers will then have to ask themselves what their future role will be.

In the implementation support of PLM projects, your experience with Siemens Teamcenter stands out. Is this an exclusive partnership, or do you also support projects with other PLM system manufacturers?

Of course, as a former Tecnomatix employee who came to Siemens via UGS, I owe a lot to Siemens. In addition, I have a strong network in Siemens Industry Software and will always remain on very friendly terms with them.

However, it would be unforgivable for the standard of my consulting if I did not deal intensively with different PLM, ERP, MES and CRM systems.

One should also not close one's eyes to the fact that there are some very good providers in the PLM area who have great PLM systems.

I see my task in helping my customers to find the suitable system for the respective application. To do this, it is necessary to transparently compile the many very different criteria that form the basis for such a decision in order to then arrive at a suitable selection.

Because it's all the rage right now: how do you rate the cooperation between Siemens and SAP? According to some publications, SAP will sell Teamcenter in the future and no longer focus so much on the development of its own SAP PLM.

For the European industry in particular, I am very pleased that these two great companies are now cooperating with the aim of being able to offer end-to-end digitization solutions. This is one of the most interesting and best things that could happen.

For customers, this can become unique. However, for this to happen, the methodical technical integration must be right. That's why I would like to explain the topic here from this perspective.

The question is how the connection between a PLM system and ERP system will and should be realized in the future.

This question must be answered both at the information object level and at the structure level. At the information object level, the question arises as to which object type is held where, and who has the master for it and when. In concrete terms, this concerns the types CAx document and material. At the structure level, the question arises as to which of the central structures CAD-BOM, EBOM or MBOM are synchronized and where the respective master is located. Leaving aside topics such as configuration and the need for a uniform configuration engine, routing, assembly instructions, NC program supply, the digital twins as-delivered and as-maintained, and the service BOM, there are essentially two integration scenarios.

Scenario I:

The PLM system acts as a management system for document data storage and for lifecycling the data from the authoring systems. Material creation and maturity development of the material then take place in the ERP. In this scenario, creating and holding the EBOM in PLM would be possible in the early phase of development.

Scenario II:

As in scenario I, except that, in addition, the EBOM and all plant-specific MBOMs are created in PLM and fully synchronized from there into ERP. The structure and maturity development of the material takes place in PLM; thanks to the full synchronicity between the systems, the attribution can then be maintained in both systems at any time.
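
Purely to make Scenario II tangible, here is a much simplified sketch (BOMs reduced to parent/children lists, the names invented; real EBOM/MBOM synchronization is far richer): the EBOM and the plant-specific MBOMs are mastered in PLM and pushed fully into ERP.

```python
# PLM masters the EBOM and all plant-specific MBOMs (Scenario II).
plm_structures = {
    "EBOM":         {"gearbox": ["housing", "shaft"]},
    "MBOM:plant-1": {"gearbox": ["housing", "shaft", "grease"]},
    "MBOM:plant-2": {"gearbox": ["housing", "shaft"]},
}
erp_structures: dict = {}

def sync_plm_to_erp() -> None:
    """Full one-way synchronization of the PLM-mastered structures into ERP.
    Full synchronicity in both directions would additionally require feeding
    ERP-side attribute changes back, which is omitted in this sketch."""
    for name, bom in plm_structures.items():
        erp_structures[name] = {parent: list(children) for parent, children in bom.items()}

sync_plm_to_erp()
print(erp_structures["MBOM:plant-1"]["gearbox"])  # ['housing', 'shaft', 'grease']
```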

Scenario I is what can be implemented well today between ERP and most PLM systems. However, PLM would not act as PLM, but merely as a PDM/TDM system. In reality, the PLM layer would then be in ERP and would be based on the data model and functions of SAP ERP that exist today.

However, scenario I has the decisive disadvantage that precisely the added value that is to be achieved through cooperation cannot be achieved. This is due to the fact that PLM in this scenario would be disconnected from the downstream changes in the plants and the idea of the digital thread or the feedback loop approach would not work.

In my view, Scenario II is the option that should definitely be chosen. However, it is also the ideal case in terms of technical implementation in the PLM systems.

As already mentioned, the underlying problem here is that all of today's PLM systems, which are descended from PDM systems, cannot handle material very well. I know some of our readers will object that PLM systems certainly do have the object type Part or Material and thus do know material. However, those who are more deeply involved in the matter know that there are a number of question marks at this point, which I do not want to explain further here.

The real hurdle is the lack of ability of most PLM systems today to map plant views of the material, plant BOMs and BOM usages. An ERP that maps different plants can do this, of course.
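
To illustrate what plant views of the material can mean (a simplified sketch with invented field names, not any specific ERP or PLM data model): the same material carries plant-specific attributes and plant-specific BOM usages.

```python
from dataclasses import dataclass, field

@dataclass
class PlantView:
    """Plant-specific data of a material, as an ERP typically holds it."""
    plant: str
    procurement_type: str  # e.g. "make" in one plant, "buy" in another
    bom_usage: str         # e.g. "production" or "engineering"

@dataclass
class Material:
    number: str
    description: str
    plant_views: list[PlantView] = field(default_factory=list)

housing = Material("100-001", "Housing", plant_views=[
    PlantView(plant="0001", procurement_type="make", bom_usage="production"),
    PlantView(plant="0002", procurement_type="buy",  bom_usage="production"),
])
```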

It is possible to build up something comparable in PLM systems with on-board resources during an implementation, but then you have to live with quite painful restrictions. Mostly, the revision mechanism at the part level is used for this. However, this contradicts the actual intended use of this mechanism and therefore often has a very disruptive effect on the implementation. As a result, there is always the danger that the heavily customized data model will fly apart in the event of minor extensions to the PLM and that, so to speak, the gate to hell will open.

In order to provide a real remedy here in the future, it will be necessary to build fundamental changes deep into the data models of the PLM systems so that true persistable plant views can be mapped there as well.

From my perspective, this is homework for all PLM vendors on the market. I think the PLM vendor that solves it first, and thus achieves full synchronicity of the EBOM and the MBOMs between PLM and ERP, will create an excellent market position for itself.

If, in addition, this provider succeeds in making customers understand the need for and the added value of this, it has a great chance of achieving resounding market success.

If Siemens were to tackle this in this depth together with SAP, I would be very excited. That would be a major step, and probably the decisive one, in the direction of digitization.

Prof. Fischer, thank you very much for this informative discussion, and I wish you every success with the tasks ahead.
