How PLM can “build itself” using artificial intelligence technologies

January 7, 2015


I had a chance to visit The Art of Brick exhibition at Boston’s Faneuil Hall a few days ago. If you follow me on social media, there is a chance you noticed a few pictures. Afterwards, I read more about LEGO artist Nathan Sawaya. What impressed me is the power of the “simple LEGO brick”. A simple plastic brick and a huge amount of imagination make it possible to create such incredible models.


You can ask me – how is that connected to engineering, manufacturing and product lifecycle management? Here is the thing… It made me think about the ways PLM systems are implemented these days. I’m sure you are familiar with the “best practices” approach. The topic isn’t new. I found my old post – PLM best practices torpedo. After five years, I still like my conclusion – PLM best practices are good for showing what PLM technology and software are capable of doing. However, for a real implementation, they are not very useful. You have to come back to the “simple bricks” of PLM technology – data models, documents, lifecycle statuses, bills of materials, processes.

I recently came across a slightly different perspective on PLM best practices. Navigate to the PLM cultural change blog – PLM design patterns. It took me back to thinking about best practices. How can we define implementation patterns and make a real PLM implementation much easier? The article speaks about how a PLM implementation can generally be done, along with organizational transformation and change. Read the article. I found the following few passages interesting:

In general you can setup all required supporting procedures in using the PLM design patterns. Even for specific supporting procedures of a business process pattern like Engineer to Order (ETO) you can derive patterns, which consist of a framework of general PLM design patterns and are adapted to the specific business needs. There is enough freedom to derive, based on these patterns, supporting procedures to fulfill specific business needs.

If some organizations would have implemented supporting procedures based on patterns already, then consultants in introducing PLM to an organization could refer to “state of the art” implementation examples of these organizations. The target is to convince an organization, that the decision for a new practice requesting organizational change is required and works. Only then the organization can enable the full potential of the PLM methodology without remaining stuck in the current practice.

Instead of inventing a Ping-Pong table “from scratch” with a cabinetmaker we can make a clear decision based on all the options available, fulfilling and probably exceeding our originally perceived needs (with a safe and easy-to-use folding mechanism). And we can afford it, because a stock table is cheaper than a custom built one.

The time saved in avoiding the endless discussions and continual redesign of processes because of paradigm paralysis, based on current methods, could be better used in a well-planned, strategic deployment of the new processes leading to an improved business solution.


The idea and vision of configurable patterns and best practices is interesting. In my view, it was invented earlier in the form of PLM toolkits, flexible data models and process templates. The key problem here is not related to technology – software does what it does. The problem is related to people and organization. Remember: technology is simple, but people are really hard. What is called “convincing people” is actually a process of getting an organization and its people to understand their business and product development patterns. Without that understanding, the chances of a successful PLM implementation are very low and the probability of PLM project failure is high.

So, what could be a 21st-century solution to that problem?

My attention today was caught by a new startup – The Grid. The tagline states: AI websites that design themselves. The vision of The Grid is to change the paradigm of website building. The idea of self-building websites driven by artificial intelligence and data analysis is something worth thinking about. Watch the video.

Now let me get back to manufacturing companies and PLM implementations. All manufacturing organizations are different. The approach most PLM vendors take these days is to classify companies by size (small, medium, large), industry (aero, auto, industrial equipment, etc.), manufacturing model (mass production, configure to order, engineer to order, etc.) and many other dimensions such as locations, supply chain and existing enterprise systems (ERP, SCM, etc.). The decision matrix is huge. The analysis of an existing manufacturing company – its processes, existing systems and requirements – is what takes time and money during a PLM implementation.
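To give a feeling for the size of that decision matrix, here is a minimal sketch (Python; the dimension values are illustrative, not a real vendor taxonomy) of how a company profile might be matched against pre-built implementation templates:

```python
from itertools import product

# Hypothetical classification dimensions a PLM vendor might use.
SIZES = ["small", "medium", "large"]
INDUSTRIES = ["aero", "auto", "industrial_equipment"]
MODELS = ["mass_production", "configure_to_order", "engineer_to_order"]

# Every combination is, in principle, a separate implementation template.
decision_matrix = list(product(SIZES, INDUSTRIES, MODELS))
print(len(decision_matrix))  # 27 combinations from just three dimensions

def match_template(size, industry, model):
    """Return the template key for a company profile, if one exists."""
    profile = (size, industry, model)
    return profile if profile in decision_matrix else None

print(match_template("medium", "auto", "engineer_to_order"))
```

Add locations, supply chain and existing enterprise systems as further dimensions and the number of combinations explodes multiplicatively – which is exactly why the up-front analysis is so expensive.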

What is my conclusion? The opportunity we have today comes from new ways to process data. Call it cloud computing, big data, whatever. Facebook reports the capability to index a trillion posts. Would it be possible to capture data from an existing manufacturing company and ask a PLM system to build itself? Is it a dream or the future of PLM? Just my thoughts…

Best, Oleg

Pictures credit: The Grid website and PLM cultural change blog

What is PLM software replacement cycle?

December 13, 2014


PLM selection is a complex process. It takes time to make a decision, evaluate, build a pilot and implement a PLM system. I’ve been thinking about how this process can change in the future. Navigate to my Future PLM selection post to catch up. One of my discoveries was the following data point about the age of ERP systems.

Bluelinkerp blog – When should you replace your ERP software brings an interesting diagram – the majority of ERP implementations are up to 7 years old. The chart is based on data provided by an Aberdeen study – Aging ERP – When your ERP is too old.


This data point is not scientific, but I can predict that companies replace their ERP systems every 7-10 years. This number is actually aligned with similar numbers I’ve heard from ERP resellers in the past.

It made me think about the replacement cycle of PLM systems. I guess we can probably see a similar trend in the PLM market too. PLM systems are aging and we can probably discover a lifecycle of PLM implementations. A sort of PLM recycling. I’ve been trying to find some information to support it, but didn’t find many references online.

Joe Barkai’s blog – Product Innovation Congress 2014 San Diego brings some interesting facts about PLM system replacements. Here is the passage from Joe’s blog.

There appears to be much activity in selecting, replacing and upgrading PLM software. Some were first time PLM buyers, but there were a surprising number of companies expressing dissatisfaction with the existing solution and seeking a “better” PLM system. I did not conduct a structured survey, but anecdotally it appears that a good number of those in search of a PLM replacement are users of ENOVIA SmarTeam and ENOVIA MatrixOne.

My observation: The continued search for a “better” PLM system will continue to drive activity and put pressure on PLM vendors to deliver greater value in enhanced functionality, lower cost, faster deployment, and new delivery and ownership models. The move of reluctant PLM vendors such as Oracle Agile to offer a cloud delivery model is but one recent example and I expect other PLM vendors are in the process of following suit. This dynamic keeps the door open for vendors such as Aras PLM that continues to challenge the hegemony of the incumbents.

That being said, buyers should realize that the PLM software itself isn’t a substitute or remedy for flawed and suboptimal product development processes. For each dissatisfied PLM user company you will find many others who are highly successful and are able to reap the full potential of the very same PLM software. It isn’t the SW. It’s you. Don’t blame the vendor.

My hunch is that most large manufacturing companies have already gone through a few PLM implementations. They made mistakes and probably want to fix them. In addition, business and system requirements are evolving. People turnover is another factor. An enterprise system’s lifecycle can be triggered by new people coming into the role of managing enterprise and engineering IT. So, 7-9 years is a good time period in which to make an analysis, fix problems and re-think PLM implementation and strategy.

What is my conclusion? Understanding the PLM software replacement cycle and the lessons learned from an implementation can help build a better PLM industry eco-system. It is less about blaming vendors or companies for software problems. It is more about understanding business, technologies and implementation needs. Just my thoughts….

Best, Oleg

PLM implementations: nuts and bolts of data silos

July 22, 2014


Data is an essential part of every PLM implementation. It all starts from data – design, engineering, manufacturing, supply chain, support, etc. Enterprise systems are fragmented, each representing an individual silo of the enterprise organization. Managing product data located in multiple enterprise data silos is a challenge for every PLM implementation.

To "demolish enterprise data silos" is a popular topic in PLM strategies and deployments. The idea of having one single point of truth is always on the minds of PLM developers. Some of my latest notes about that are here – PLM One Big Silo.

MCADCafe article – Developing Better Products is a “Piece of Cake” by Scott Reedy also speaks about how a PLM implementation can help aggregate all the product development information scattered across multiple places into a single PLM system. The picture from the article presents the problem:


The following passage is the most important, in my view:

Without a PLM system, companies often end up with disconnected silos of information. These silos inhibit the ability to control the entire product record and employees waste unnecessary time searching for the correct revision of the product design. As companies outsource design or manufacturing, it becomes even harder to ensure the right configuration of the product is leveraged by external partners.

Whether your company makes medical devices, industrial equipment, laptops, cell phones or other consumer products – PLM provides a secure, centralized database to manage the entire product record into a “Single Record of the Truth”… With a centralized product record, it is easy to propose and submit changes to the product design, track quality issues and collaborate with your internal teams and supply-chain partners.

The strategy of a "single record of truth" is a centerpiece of each PLM implementation. However, here is the thing… if you look at the picture above you can certainly see some key enterprise systems – ERP, CRM, MES, project and program management, etc. A PLM system can contain scattered data about product design, CAD files, part data, ECO records and bills of materials. However, some of the data will still remain in other systems. Some of the data gets duplicated. This is what happens in the real world.

It made me think about three important data architecture aspects of every PLM implementation: data management, data reporting and data consistency.

The data management layer focuses on which system controls the data and provides the master source of information. Data cannot be mastered in multiple places. The implementation needs to organize a logical split of information as well as the ability to control the "data truth". This is the most fundamental part of the data architecture.
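The logical split can be sketched as an ownership map – a minimal Python example, with hypothetical system and entity names, where each data entity is mastered in exactly one system:

```python
# Hypothetical ownership map: each entity type has exactly one master system.
MASTER_OF = {
    "cad_file": "PLM",
    "eco": "PLM",
    "bom": "PLM",
    "inventory": "ERP",
    "purchase_order": "ERP",
}

def master_system(entity_type):
    """Which system is allowed to create and modify this entity type."""
    try:
        return MASTER_OF[entity_type]
    except KeyError:
        raise ValueError(f"No master system defined for {entity_type!r}")

def can_write(system, entity_type):
    """Only the master system may write; everyone else holds a copy."""
    return master_system(entity_type) == system

print(can_write("PLM", "bom"))  # True
print(can_write("ERP", "bom"))  # False
```

The point of such a map is not the code, but the agreement behind it: an entity type missing from the map means nobody has decided who controls that "data truth" yet.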

Data reporting focuses on how PLM can extract data from multiple sources and present it to the end user in a seamless way. Imagine you need to provide an "open ECO" report. The information can reside in PLM, ERP and maybe some other sources. Getting the right data at the right moment in time can be another problem to solve.
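A hypothetical "open ECO" report could be sketched like this (Python; the two record lists are made up and stand in for queries against PLM and ERP):

```python
from datetime import date

# Stand-ins for query results from two separate systems.
plm_ecos = [
    {"id": "ECO-101", "status": "open",   "created": date(2014, 6, 1)},
    {"id": "ECO-102", "status": "closed", "created": date(2014, 5, 3)},
]
erp_ecos = [
    {"id": "ECO-101", "status": "open",   "created": date(2014, 6, 1)},  # duplicate
    {"id": "ECO-201", "status": "open",   "created": date(2014, 6, 20)},
]

def open_eco_report(*sources):
    """Merge ECO records from several systems, de-duplicating by id."""
    merged = {}
    for source in sources:
        for eco in source:
            if eco["status"] == "open":
                merged.setdefault(eco["id"], eco)  # first source wins
    return sorted(merged.values(), key=lambda e: e["created"])

for eco in open_eco_report(plm_ecos, erp_ecos):
    print(eco["id"])  # ECO-101, then ECO-201
```

Even this toy version exposes the real questions – which source wins on a duplicate, and whether both sources were in sync at the moment the report ran.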

Last, but not least – data consistency. When data is located in multiple places, the system will rely on so-called "eventual consistency" of information. A system of events and related transactions keeps the data in sync. This is not a trivial process, but many systems operate this way. What is important is to have a coordinated data flow between systems, supporting eventual consistency, together with data management and reporting tools.
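A toy sketch of such event-driven synchronization (Python; the queue and record names are invented for illustration) shows what "eventual" means in practice:

```python
from collections import deque

# PLM is the master for ECO status; ERP holds an eventually consistent replica.
plm = {"ECO-101": "open"}
erp_replica = {"ECO-101": "open"}
event_queue = deque()  # stands in for a message bus between the systems

def plm_update(eco_id, status):
    """Change data in the master and publish an event; the replica lags behind."""
    plm[eco_id] = status
    event_queue.append({"eco_id": eco_id, "status": status})

def sync_replica():
    """Drain pending events; after this the replica converges with the master."""
    while event_queue:
        event = event_queue.popleft()
        erp_replica[event["eco_id"]] = event["status"]

plm_update("ECO-101", "closed")
print(erp_replica["ECO-101"] == plm["ECO-101"])  # False: not yet consistent
sync_replica()
print(erp_replica["ECO-101"] == plm["ECO-101"])  # True: eventually consistent
```

The window between the two prints is exactly where a report can return stale data – which is why data management, reporting and consistency have to be designed together.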

What is my conclusion? To demolish silos and manage a single point of truth is a very good and important strategic message. However, when it comes to the nuts and bolts of implementation, an appropriate data architecture must be in place to ensure you will have the right data at the right time. Many PLM implementations underestimate the complexity of data architecture. That leaves them with marketing slogans, burned budgets and wrong data. Just my thoughts…

Best, Oleg

picture credit MCADCafe article.

PLM Implementations Challenges and 3 Organizational Lenses

June 4, 2014


It is not unusual to hear people speaking about PLM implementation and the changes that need to be made in the organization. Very often, PLM vendors or implementers call this process business transformation, which is literally supposed to change everything related to product design, engineering, manufacturing, support and services.

So, implementing PLM is hard. I can admire the power of some PLM technologies. At the same time, getting customers to implement and use them takes time and energy. PLM vendors understand that. PLM service companies aim to make implementations successful by understanding organizational specifics, adapting business processes and configuring PLM software.

It made me think about how PLM implementation challenges can be mapped onto some fundamental organizational behaviors. Organizations are often inspected through 3 fundamental lenses: (1) strategic, (2) political, and (3) cultural.

PLM and Strategic Lens

The strategic lens is the most often applied perspective. Under this lens, managers look at how to optimize work and meet corporate goals. This lens is responsible for processes and procedures. This is the place where all the benefits of PLM can shine. However, very often, the application of PLM strategic transformation triggers significant organizational turbulence in everything related to processes and procedures. People dislike change and tend to spend a lot of time discussing how to make an optimal strategic process alignment. It drives lots of confusion and can derail a PLM implementation.

Mapping existing organizational processes can be a good starting point for overcoming the challenges a PLM implementation faces when changing processes and procedures. A good approach can be to apply changes to specific processes without changing the whole organization in a single shot.

PLM and Political Lens

The political lens looks at the distribution of power and influence. This is one of the most complicated parts. This is where the organization distributes power and authority. The major challenge for PLM is related to the need to cross organizational silos. Organizational silos distribute power by separating data, people and responsibility. They also separate the IT stack and data ownership.

Bridging organizational silos can be a good way to optimize organizational behaviors and establish "political contracts". PLM can be a factor that consolidates people and helps them turn organizational silos into "cylinders of excellence".

PLM and Cultural Lens

This is probably the most unclear thing. It reflects underlying attitudes and beliefs. In many situations it reflects how the culture and history of the company affect its decisions. This is where you can expect lots of historical "PLM pitfalls" to happen. You need to understand the motives of the people in power in order to understand and predict their decisions. That will help you influence them and increase PLM adoption.

What is my conclusion? Don’t ignore fundamental organizational structures and mechanisms. The strategic, political and cultural lenses can give you a good model to survive a PLM implementation and make it successful for the organization in all aspects. Just my thoughts…

Best, Oleg

PLM Services, Department Stores and Digital Future

June 2, 2014


Don’t be surprised if your most trusted CAD/PLM service provider is acquired tomorrow. According to Joe Barkai’s post – PLM Service Providers Ready To Deliver Greater Value, we have been witnessing a wave of mergers and acquisitions of PLM services companies (examples: Accenture / PRION Group, Accenture / PCO Innovation, KPIT-Tech / I-Cubed / Akoya; the Kalypso / Integware merger). The following passage gives you a feeling for the core reason behind it.

For years, PLM companies focused more on PLM /PDM implementation than on actually improving business processes. While the business benefits of PLM were well articulated and supported by rosy ROI models and complex colorful architecture slides, many manufacturing companies were unable to achieve the process changes and enterprise software integration that were needed to reap the promised benefits, and ended up implementing a PDM system. Albeit critical for managing product data, this reality might explain why some manufacturers feel they might have overpaid for their PLM implementation efforts.

The status quo may be changing, and organizations that have gone through massive implementation projects are ready for more. They need to improve their capacity for more complex multidisciplinary decisions using product data, whether it’s stored in PLM/PDM, ERP or in other, less structured forms; they need to improve collaboration in elongated and fragmented design partner networks and supply chains; they need to leverage product and consumer insight garnered from social media, warranty claims, and channel activities.

The story makes sense to me. In my post a few weeks ago – Why PLM stuck in PDM? – I was talking about exactly the same reasons behind the problem of deep and broad PLM adoption – (1) focus on CAD, (2) poor integration between PLM and ERP, (3) absence of process thinking, etc.

Joe’s article made me think about the role PLM service providers will play in future PLM implementation strategies. It reminded me of department stores. Think Macy’s, JCPenney, Bloomingdale’s, Nordstrom… Large manufacturing companies own a huge chunk of PLM software. Every PLM vendor has its own strong characteristics. One size doesn’t fit all. Customers’ existing investments are huge. I don’t see these manufacturing companies starting to jump between vendors. So, how to make an existing PLM system work and show bigger value becomes very important. This obviously raises the question of qualified service providers. Large teams and the ability to implement any PLM software will be the key to success and profit. Customers will come to the PLM service department store and be guided to the right brand(s), or configuration of brands, depending on their preferences and constraints.

You may ask me who will play the role of Amazon in the growing PLM service eco-system? This is a very interesting question to ask. Will e-commerce come and disrupt B&M PLM services? Who will provide a new class of systems, which requires different service capabilities? Who will provide online PLM services in a lean way? Joe mentions Autodesk PLM360, GrabCAD and Aras on the list of potential candidates. Who knows…

What is my conclusion? History often repeats itself. Paying attention to existing trajectories – department stores and e-commerce – is important. New e-commerce vendors are growing up, but existing B&M department stores are still selling lots of stuff. The same happens in PLM. Today, large vendors provide solutions for companies that are ready to implement existing PLM software. It sounds like a good strategy for large manufacturing companies with deep pockets – until the question of “lean and digital” comes up. Will online and lean PLM offerings compete with existing PLM vendors? This is a good question. There is a good chance for newcomers to play disruptive strategies. However, alternatives are possible as well. Either the newborns in the cloud will outgrow the existing B&M players, or the existing vendors will develop the right digital skills and experience. Just my thoughts…

Best, Oleg

Why My PLM Won’t Work For You?

February 6, 2014


To implement PLM is a process of change. Speak to anyone in the engineering and manufacturing community and they will bring you lots of stories about the complexity of PLM implementations and the associated cost. You can also hear lots of stories about the complexity of moving from one PLM implementation to another, or switching from one PLM system to another. Companies spend tons of money aligning PLM systems to a specific set of requirements to fit the company’s data management and process needs.

A couple of years ago, I posted "Is PLM customization a data management Titanic?". For many manufacturing companies, it is a reality these days. Implementations done 10 years ago can hardly be maintained. Updating a current implementation to a new PLM system version, or to another PLM system, is mission impossible. Companies hire advisers and the consulting companies involved in the implementation and development of PLM systems to run migrations and adjust the PLM system to a new set of requirements.

Vendors have been trying to resolve the complexity of PLM systems with out-of-the-box configurations. However, the success of these ready-to-go systems has been somewhat mixed. Pre-configured templates worked well during marketing shows, presentations and evaluations. However, bringing the system to production mode (still) required customization. Very often, the customization ended up replacing the pre-configured template entirely. The problem is not unique to the PLM space. I posted the How to de-customize PLM article a few weeks ago. I discussed the importance of decreasing the customization level, and presented some similar customization complexities coming from SharePoint implementations.

Today I want to provide some recommendations you can follow in order to stay away from costly PLM customizations. These recommendations will also help you avoid some typical PLM implementation pitfalls. Here are 4 steps to follow:

1- Ask yourself what problem you want to solve with PLM over the next 2 years. The term PLM is used by many people in a variety of forms and meanings. Going with a specific scope (e.g. change management, quality, bill of materials, etc.) will help you chart the functionality you expect a PLM system to support.

2- Outline the main data elements and structures a PLM system needs to support in order to solve the list of problems from the previous step. Do it with no connection to a specific PLM system or vendor. Make an agreement in your extended team about it. It can take some time to reach an agreement, but in many situations this is one of the best investments you can make in order to eliminate extra customization steps.

3- Pick a few PLM systems and try to map your requirements to what these systems can provide without customization. Don’t be afraid to change your terminology along the way. However, ensure that whatever name a PLM system uses, it does what you expect from a functional standpoint. It can be a good idea to hire a consultant during this stage. It is worth spending some dollars to avoid future budget waste.

4- Last, but very important. You need to test that the selected system is flexible enough to apply changes on top of pre-configured parameters/templates. It is not unusual to see an out-of-the-box system configuration that cannot practically be changed. "Practically" means, in this context, the ability to add to or modify the system and data model while (at the same time) keeping most of the existing functionality in place. Stay away from system configurations with scripts and customized behaviors hard-coded to particular data models and workflows.
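The flexibility test in step 4 can be illustrated with a tiny sketch (Python; a made-up stand-in for a real PLM data model, not any vendor’s API) – adding a customer-specific attribute must not break the pre-configured behavior:

```python
# Minimal stand-in for a pre-configured PLM part template.
part_template = {"number": None, "revision": "A", "lifecycle": "In Work"}

def create_part(number, **custom_attributes):
    """Pre-configured behavior plus customer-specific attribute extensions."""
    part = dict(part_template)          # template stays untouched
    part["number"] = number
    part.update(custom_attributes)      # extension point: no template rewrite
    return part

def release(part):
    """Existing functionality must keep working after the model is extended."""
    part["lifecycle"] = "Released"
    return part

p = create_part("P-1000", supplier_code="ACME-42")  # custom attribute added
release(p)
print(p["lifecycle"], p["supplier_code"])  # Released ACME-42
```

If the system you evaluate cannot pass the real-world equivalent of this test – extend the data model, then re-run the out-of-the-box workflows – the pre-configured template will not survive your first change request.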

What is my conclusion? The combination of flexibility and a pre-configured environment is the key to staying away from costly PLM customizations. However, these two characteristics are very often mutually exclusive. Vendors can show a ready-to-be-used PLM configuration that will be literally destroyed as soon as you have to change something. Running a test of how flexible the out-of-the-box PLM model is, is the key to not being punched in the face by future PLM system re-configuration and customization costs. Just my thoughts…

Best, Oleg

Why PLM should not think about ownership?

October 10, 2013


The enterprise software business is complicated. It often comes with questions of system responsibility and ownership. IT usually holds primary responsibility for the cross-company software stack. At the same time, divisions and departments often try to influence IT and make their own independent decisions about what software to use. One of the terms that came out of this political and technological confrontation is shadow IT.

Ownership is a sensitive question. In an enterprise organization (manufacturing companies included) it often involves a lot of politics, hidden agendas and power influence. This is the reality of many enterprise organizations. By its nature, PLM gets deeply involved in these politics. If you think about PLM as a business strategy focused on organizational processes, it becomes clear why it is hard to avoid cross-department conflicts. Then, if you think about PLM as a technology, you might appreciate why ownership of a particular set of technologies and/or data can be involved in the same cross-department politics.

Earlier this week, I read the Aras blog – Who should take ownership of PLM? Navigate to it and have a read. Draw your own opinion. I think it provides an interesting perspective on PLM ownership. This is how the Aras blog explains it:

With this business dynamic in mind, have you reexamined who should take ownership of PLM in your organization? Something to consider, there will be different driving forces behind PLM depending upon who has ownership. Here are some to keep in mind: Should Design select and own the PLM system? As you’ve likely seen, the main focus of PLM is in CAD file management, change management and BOM management when design is at the wheel. What about Operations? With cost reduction, inventory management and purchasing at the top of their priorities you can bet these will be the focus for Operations driven PLM as well. What about Quality? With the Quality organization leading, the driving forces for PLM are likely to be compliance, regulations and overall quality of products.

I agree, the story is really not black and white. However, a good organization will break this story down into pieces of business processes and technologies. There are clear benefits to divisions and departments being responsible for their own processes. This should be an ultimate goal. Nevertheless, I can still see some overall responsibility for product development processes (especially when it comes to cross-functional teams). So, having somebody responsible for the product can provide a lot of value. At the same time, it doesn’t mean the technology needs to be driven by the same department’s people. I see technological and software stack responsibility separately. Sometimes, it can come as multiple pieces of products and technologies.

What is my conclusion? In my view, the question of "PLM ownership" is the wrong one. It must be split into business, process, communication and technology. It is obvious that the Quality (or any other) department should be responsible for how its processes are organized. However, that doesn’t imply ownership of the technologies and software pieces. Just my thoughts…

Best, Oleg

Image is courtesy of Aras blog.

