PLM, Microservices & Composable Architecture

The Future of PLM, ERP and CRM

Microservices and composable architecture. Will TDM, PLM, ERP, CRM... become obsolete in the future? 👨‍💻🚀


Post 1

PLM, Microservices & Composable Architecture

In discussions about the future of PLM, ERP etc., I usually hear two lines of argument:

1️⃣ The previous system classes (ERP...) are coming to an end because of emerging microservice architectures.
2️⃣ Previous system classes will merge into one platform in the future.

What is actually behind the three-letter abbreviations ERP, PLM etc.?
I call this category system classes. So far, they have been used to group purchasable IT systems according to the type of business functions they implement.
Each crystallized around a specific business problem and was then functionally expanded. Today the classes partially overlap, which also leads to a battle for supremacy among the systems within companies.

What does the future hold?

1️⃣ The previous system classes are now cumbersome on-prem monoliths that can hardly be changed, and integrating them into a common scenario is time-consuming. For cloud-native applications, the idea of microservice architectures, and the derived idea of composable architectures, has become established: the aim is to make services combinable as required. If the existing system classes move to a cloud-native architecture in the future, it makes sense to demand these paradigms from them too.

2️⃣ Does this mean the end of the system classes? The capabilities contained in them will continue to exist, but they will detach themselves from the previous system-class brackets. From this perspective, the argument that the system classes are coming to an end is valid.

Now to the practical reality.

A microservice brings its information with it. That is the key to understanding!

The IT system classes are based on complex information models for their respective business problems. These models cannot be cut arbitrarily: an information model must always retain its semantic consistency in order to solve the business problem, so arbitrary cuts are doomed to failure. The models must therefore be filleted cleanly.

The exciting thing is that the system classes form good cutting lines for filleting.
That's why I still like to use them 😊.
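The "a microservice brings its information with it" idea can be sketched in a few lines of Python. All class and field names below are my own illustrative assumptions, not any vendor's actual schema; the point is only that each service owns its own information model and hands over a semantically complete, filtered slice rather than an arbitrary fragment.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class DesignDocument:
    """A document in the PLM-side information model (illustrative)."""
    doc_id: str
    part_number: str
    revision: str

@dataclass
class PlmService:
    """Owns the design-side information model."""
    documents: list = field(default_factory=list)

    def handover_view(self, part_number: str) -> list:
        # A clean cut: pass on a complete, filtered slice of the model
        # (all documents of one part), never arbitrary fragments.
        return [d for d in self.documents if d.part_number == part_number]

@dataclass
class ErpService:
    """Owns its own material model; stores only the handover slice."""
    materials: dict = field(default_factory=dict)

    def receive(self, part_number: str, docs: list) -> None:
        self.materials[part_number] = [d.doc_id for d in docs]

plm = PlmService()
plm.documents.append(DesignDocument("D-1", "4711", "B"))
erp = ErpService()
erp.receive("4711", plm.handover_view("4711"))
print(erp.materials)  # {'4711': ['D-1']}
```

Note that the ERP side never reaches into the PLM model directly; it only keeps what was handed over, which is exactly the filleting along a system-class boundary described above.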

More on this in a follow-up article. Or contact us at STZ-RIM Reshape Information Management (👉 https://lnkd.in/eei3UYdm)

What do you think?
I look forward to your comments. 🤔💬

 

Future PLM - What is the future PLM area? 🚀

I believe that the PLM market will change fundamentally. We will see new players and unexpected players will come more into focus. 🧐

Post 2

To get to the heart of the matter, I like to use two dimensions. ✅


Dimension 1: PLM IT architecture 🌐

Everyone is talking about the cloud and composable architectures. Yet if you look closely at the cloud offerings of many PLM providers, you will see something that could be described as a "containerized cloud": the existing codebase is packed unchanged into a cloud container and presented as a cloud offering.

Of course, this is a far cry from a cloud-native application that may also be composable, i.e. easy to integrate with other cloud services.

This is where dimension 2 comes into play. 🔍

Dimension 2: PLM layer 🛠️

I like to differentiate between different PLM layers and like to use TDM as layer 1. Sometimes I am told in discussions that this division was necessary in the 1990s but is outdated today. Microservices are then often used as an argument.

Well, this division ultimately has a methodological background. One of the core ideas of microservices is that they bring their data with them. PLM processes are based on the lifecycle of data, so if you want to split them into services, you need natural interfaces. In my opinion, these interfaces run along the PLM layers.

  • Layer 1 - Team Data Management (TDM) Layer
    Subsumes the data management of application-generated data. These systems will be transformed from the previous file-based systems into cloud-native services. At the interface to layer 2, a time-correct (e.g. geometric) document for one or more product entities is handed over as transfer information.
  • Layer 2 - Multi-structure PLM layer
    Manages the maturity development of the product-relevant physical entities via a series of structures (requirements structure, system/functional structure, installation space structure, EBOM ...). The documents from layer 1 are assigned time-correctly to these entities. The transfer to layer 3 is a complete definition of the realized product entities from the R&D perspective.
  • Layer 3 - ERP-related PLM layer
    The layer 2 definitions must now be industrialized in the plant network. This is where the plant-specific material numbers and additional planning levels are created.

 

In future, the three layers of dimension 2 can be separated into different services and handled as packaged business capabilities.
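The layer handoffs above can be sketched as typed transfer information between services. This is a minimal sketch under my own assumptions; the type and field names are illustrative, not a real PLM API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TdmDocument:
    """Layer 1 -> layer 2 transfer information (illustrative)."""
    doc_id: str
    entity_id: str    # product entity the document describes
    valid_from: str   # makes the handover "time-correct"

@dataclass(frozen=True)
class ProductEntity:
    """Layer 2 -> layer 3 transfer information (illustrative)."""
    entity_id: str
    ebom_children: tuple   # realized structure from the R&D view
    documents: tuple       # time-correct documents from layer 1

def industrialize(entity: ProductEntity, plant: str) -> str:
    """Layer 3 (sketch): derive a plant-specific material number."""
    return f"{plant}-{entity.entity_id}"

doc = TdmDocument("D-1", "E-100", "2024-01-01")
entity = ProductEntity("E-100", ebom_children=("E-101",), documents=(doc,))
print(industrialize(entity, "PLANT1"))  # PLANT1-E-100
```

The design point is that each layer only consumes the transfer type published by the layer below, which is what makes the layers separable into packaged business capabilities.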

In the picture I have drawn a PLM space for you from these dimensions. 📊

How I think today's PLM players should be classified will follow in the next post. Feel free to post your questions or comments below. I look forward to the discussion. 💬👀

 

The oligopoly of the big 3 CAx providers in PLM will end 🛑 📆

We will see the rise of new and unexpected PLM vendors 🚀.

Post 3

PLM Space Today and Future

Here's why 📝.

👉 What we think it is: today's common view 🧐
  • PLM IT architecture (dimension 1)
    The major PLM vendors today offer both on-premise and cloud ☁️ solutions. If you take a closer look at the cloud offerings, you find something that could be called a "containerized cloud": the existing codebase is packed unchanged into a cloud container and offered as a cloud service. This is a far cry from a cloud-native solution.
  • PLM layers (dimension 2)
    I differentiate between layer 1, the Team Data Management (TDM) layer 🌀, which contains the engineering applications (CAx apps); layer 2, the multi-structure PLM layer; and layer 3, the ERP-related PLM layer, in which the M(RP)-BOM snippets are created and managed 🎬.

 

In today's common view, many assume that the current PLM systems of the big 3 CAx providers can easily be extended to layer 3, the ERP-related PLM layer, and that the ERP systems will therefore largely be reduced to mere pass-through systems 🌡️.

👉 How it will be: A look into the future 🔮.
  • PLM IT architecture (dimension 1)
    I am convinced that during the transition we will see true cloud-native offerings as well as hybrid scenarios (i.e. cloud-native mixed with containerized cloud) ⚙️.
  • PLM layers (dimension 2)
    Today, CAD systems are being expanded into 3D platforms with a wide range of automation options ⚡️. These new possibilities will force the big 3 to integrate their CAD systems much more deeply into their classic PDM/PLM solutions 🔄. This will displace PDM data models whose core dates back to the 1980s 🕰️. At the same time, it will become harder to occupy the multi-structure layer with these data models 🧩. In addition, the #erp-related PLM layer is clearly crystallizing as layer 3 ⭐️. It is already clear that layer 3 is very difficult to occupy with the existing #PDM data models 🚧!
👉 Conclusion

This means that the three major PLM companies are facing several challenges 🌪️.
Should they redevelop their PLM systems to be cloud-native?
Shouldn't they rather develop CAx as a cloud-native solution?
Are they able to define the urgently needed new data models for layer 2 and layer 3?
These are very complex questions that need to be answered 🤔.

At the same time, we see that existing players and also completely new players, freed from the CAx legacy, are happily occupying PLM layers 2 and 3 🌟 in the various architectures.

I think that SAP and CONTACT Software have the potential to be at the forefront 🥇. But previous niche providers such as Propel Software or Oleg Shilovitsky with his OpenBOM will certainly also play their part, as will others I haven't mentioned here 🎭 (feel free to name them in the comments).

As you can see, the future will be super exciting 😃.

Of course we shouldn't write off the big 3!
They are strong, they have top teams and of course the ability to develop into the future 🌱.
It would be interesting to see which path they choose.

Tough stuff! 💪

What do you think? 📢💬

 
