How PLM can “build itself” using artificial intelligence technologies

January 7, 2015


I had a chance to visit The Art of Brick exhibition at Boston's Faneuil Hall Museum a few days ago. If you follow me on social media, there is a chance you noticed a few pictures. Afterwards, I read more about LEGO artist Nathan Sawaya. What impressed me is the power of the "simple LEGO brick". A simple plastic brick and a huge amount of imagination make it possible to create such incredible models.


You can ask me - how is that connected to engineering, manufacturing and product lifecycle management? Here is the thing… It made me think about the way PLM systems are implemented these days. I'm sure you are familiar with the "best practices" approach. The topic isn't new. I found my old post - PLM best practices torpedo. After five years, I still like my conclusion - PLM best practices are good for showing what PLM technology and software are capable of doing. However, for a real implementation, they are not very useful. You have to come back to the "simple bricks" of PLM technology - data models, documents, lifecycle statuses, bills of materials, processes.
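To make the "simple bricks" concrete, here is a minimal Python sketch - class and field names are invented for illustration, not taken from any real PLM product - of items carrying a lifecycle status, attached documents and a bill of materials:

```python
from dataclasses import dataclass, field

# Hypothetical "simple bricks": an item with a lifecycle status,
# attached documents, and a bill of materials built from other items.
@dataclass
class Item:
    number: str
    name: str
    status: str = "In Work"                        # lifecycle status brick
    documents: list = field(default_factory=list)  # document brick
    bom: list = field(default_factory=list)        # (child Item, quantity) pairs

    def release(self):
        """Move the item through a minimal lifecycle."""
        self.status = "Released"

# Assemble a two-level product structure from the bricks.
wheel = Item("P-100", "Wheel")
bike = Item("P-001", "Bicycle", documents=["bike-spec.pdf"])
bike.bom.append((wheel, 2))
bike.release()
```

A real implementation would of course add revisions, effectivity and access control, but every PLM model I have seen is built from roughly these bricks.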

I came across a somewhat different perspective on PLM best practices. Navigate to the PLM cultural change blog - PLM design patterns. It took me back to thinking about best practices. How can we define implementation patterns and make a real PLM implementation much easier? The article speaks about the general way a PLM implementation can be done, organizational transformation and change. Read the article. I found the following few passages interesting:

In general you can setup all required supporting procedures in using the PLM design patterns. Even for specific supporting procedures of a business process pattern like Engineer to Order (ETO) you can derive patterns, which consists of a framework of general PLM design patterns and are adapted to the specific business needs. There is enough freedom to derive based on these patterns supporting procedures to fulfill specific business needs.

If some organizations would have implemented supporting procedures based on patterns already, then consultants in introducing PLM to an organization could refer to “state of the art” implementation examples of these organizations. The target is to convince an organization, that the decision for a new practice requesting organizational change is required and works. Only then the organization can enable the full potential of the PLM methodology without remaining stuck in the current practice.

Instead of inventing a Ping-Pong table “from scratch” with a cabinetmaker we can make a clear decision based on all the options available, fulfilling and probably exceeding our originally perceived needs (with a safe and easy-to-use folding mechanism). And we can afford it, because a stock table is cheaper than a custom built one.

The time saved in avoiding the endless discussions and continual redesign of processes because of paradigm paralysis, based on current methods, could be better used in a well-planned, strategic deployment of the new processes leading to an improved business solution.


The idea and vision of configurable patterns and best practices is interesting. In my view, it was invented earlier in the form of PLM toolkits, flexible data models and process templates. The key problem here is not related to technology - software does what it does. The problem is related to people and organization. Remember, technology is simple, but people are really hard. What is called "to convince people" is actually a process that brings an organization and its people to understand their business and product development patterns. Without that understanding, the chances of a successful PLM implementation are very low and the probability of PLM project failure is high.

So, what could be a 21st-century solution to that problem?

My attention today was caught by a new startup - The Grid. The tagline states - AI websites that design themselves. The vision of The Grid is to change the paradigm of website building. The idea of self-building websites driven by artificial intelligence and data analysis is something worth thinking about. Watch the video.

Now let me get back to manufacturing companies and PLM implementations. All manufacturing organizations are different. The approach most PLM vendors take these days is to classify companies by size (small, medium, large), industry (aero, auto, industrial equipment, etc.), manufacturing model (mass production, configure to order, engineer to order, etc.) and many other dimensions such as locations, supply chain and existing enterprise systems (ERP, SCM, etc.). The decision matrix is huge. Analyzing an existing manufacturing company - its processes, existing systems and requirements - is what takes time and money during a PLM implementation.
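The classification can be pictured as a lookup over a company profile. This toy sketch - field values and template names are entirely hypothetical - hints at why the real decision matrix becomes huge: every added dimension multiplies the number of cases:

```python
from dataclasses import dataclass

# Hypothetical profile capturing the classification dimensions mentioned
# above; a real vendor matrix would have far more dimensions and cells.
@dataclass(frozen=True)
class CompanyProfile:
    size: str                 # "small" | "medium" | "large"
    industry: str             # "aero", "auto", "industrial equipment", ...
    manufacturing_model: str  # "mass production", "configure to order", "engineer to order"
    erp_in_place: bool

def suggest_template(profile: CompanyProfile) -> str:
    """Toy lookup standing in for a vendor's huge decision matrix."""
    if profile.size == "small" and not profile.erp_in_place:
        return "express-offering"
    if profile.manufacturing_model == "engineer to order":
        return f"{profile.industry}-eto-template"
    return f"{profile.industry}-standard-template"

profile = CompanyProfile("medium", "auto", "engineer to order", True)
template = suggest_template(profile)
```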

What is my conclusion? The opportunity we have today comes from a new way to process data. Call it cloud computing, big data, whatever. Facebook is reporting the capability to index a trillion posts. Would it be possible to capture data from an existing manufacturing company and ask a PLM system to build itself? Is it a dream or the future of PLM? Just my thoughts…
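As a toy illustration of the "PLM builds itself" idea, one could infer a data model from records captured in an existing company instead of designing the schema up front. This is a sketch of the principle, not any vendor's actual capability; the sample fields are invented:

```python
from collections import defaultdict

def infer_model(records):
    """Infer attribute names and types from captured sample records."""
    attrs = defaultdict(set)
    for record in records:
        for key, value in record.items():
            attrs[key].add(type(value).__name__)
    # Each attribute maps to the sorted list of types observed for it.
    return {key: sorted(types) for key, types in attrs.items()}

# Records as they might be captured from an existing company's files.
samples = [
    {"part_number": "P-100", "mass_kg": 1.2, "supplier": "Acme"},
    {"part_number": "P-101", "mass_kg": 0.8},
]
model = infer_model(samples)
```

A real system would need far more than this - relationships, processes, cleansing of conflicting data - but schema inference from existing data is the seed of the self-building idea.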

Best, Oleg

Pictures credit: The Grid website and PLM cultural change blog

PLM Technology vs Vertical Industries: Wrong balance?

April 14, 2014


Let's talk about PLM technologies. Err… PLM is not a technology. Even more, PLM is not even a product. So, what is it? A business strategy? Product development politics? For the sake of this conversation, let's leave these debates aside. I want to speak about PLM technologies that allow you to manage product data, CAD files, bills of materials and a rich set of related information, as well as the processes around it. This technology came to us about 20-25 years ago, first as a very hard-coded set of tools. You literally had to build it differently for every customer. So, it supported only large customers that were able to pay for software, infrastructure and implementation. Later on, PDM/PLM turned into software toolkits. The next step in PDM/PLM technology evolution was called flexible data modeling. The first flexible (dynamic) PLM data modeling tools were released back in 1995-2000 and… not much has changed since then.
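A minimal sketch of what "flexible (dynamic) data modeling" means in practice: type definitions live as data, so an administrator can extend the schema without code changes. Class and attribute names here are invented for illustration:

```python
class TypeDefinition:
    """A data type defined at runtime rather than hard-coded per customer."""

    def __init__(self, name, attributes):
        self.name = name
        self.attributes = attributes   # attribute name -> expected Python type

    def create(self, **values):
        """Instantiate an object, validating values against the schema."""
        for key, value in values.items():
            expected = self.attributes.get(key)
            if expected is None:
                raise KeyError(f"unknown attribute: {key}")
            if not isinstance(value, expected):
                raise TypeError(f"{key} must be {expected.__name__}")
        return {"type": self.name, **values}

# An administrator (not a programmer) defines and extends the model.
part_type = TypeDefinition("Part", {"number": str, "mass_kg": float})
part_type.attributes["color"] = str   # schema change without a code change
part = part_type.create(number="P-100", mass_kg=1.2, color="red")
```

The point is exactly the one in the paragraph above: this core idea was in place by the late 1990s, and most of what changed since is packaging around it.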

So, what has happened since that time? PLM vendors went on to develop out-of-the-box and vertical industry solutions in a massive way. David Linthicum's article officially is out of ideas reminded me of the joke comparing technology vs. industry play. Here is the passage:

When you run out of new ways to provide innovative technology, you go vertical. That was the running joke among CTOs back in the day. It usually meant the market had reached the saturation point and you could not find new growth

I found this message very relevant to what happens in the PLM industry. PLM vendors are trying to compete by providing more comprehensive sets of data models, best practices and process templates. By doing so, vendors want to reduce the TCO of PLM implementations. It actually brings success, and many customers are using these solutions as a starting point for their PLM implementation.

So, where is the problem? In most situations, PLM is still a costly and expensive implementation. Services may take up to 50% of the cost. Here is the issue - core PLM data and process modeling technology hasn't changed much in the last 10-15 years. Data models, CAD file management, product structure, process orchestration. All these things are evolving, but very little. The fundamental capabilities are the same. And it is very expensive to develop solutions using these technologies.

You may ask me about cloud technologies. Cloud is the answer, but only partially. It solves problems related to infrastructure, deployment and updates. Cloud provides clear benefits here. However, from the implementation technology standpoint, it is very similar to what non-cloud solutions can offer. Another interesting passage from the InfoWorld cloud computing article explains the problem new SaaS/cloud products can experience when trying to displace existing vendors:

So many companies have tried this approach — many times — but most found limited success. I can’t help but think the same will occur here. Salesforce will soon discover that when you get into vertical industries, the existing foundation of industry-specific applications is difficult to displace. Although Salesforce can always play the SaaS card, most of those industry-specific providers have already moved to SaaS or are in the midst of such a move. That means SaaS won’t be the key differentiator it was when Salesforce first provided its powerful sales automation service more than a decade ago.

What is my conclusion? Efficiency and cost. These are the two most important things that make a PLM implementation successful. So, the technology must be improved. Data and model capturing tools, flexibility and ease of use - everything must become more efficient to support the future of manufacturing processes. How to do so? This is a good topic to discuss with technology leaders and strategists. I'm going to attend COFES 2014 in 10 days. I hope to find some answers there and share them with you.

Best, Oleg

PLM Best Practices Torpedo

March 10, 2010

One of the very important aspects of PLM, as enterprise software, is the ability to be implemented in a fast and easy way. I think you will agree that a long implementation cycle can put your PLM project into the wastebasket, as well as your career on hold. I'm observing a strong move by PLM companies towards proposing "best practices" or so-called "industry best practices". Such best practices are normally a set of data models, process definitions and other recommendations about how to implement a system in the organization.

The whole approach made me think this could represent the mainstream trend in PLM implementations. On the surface, it sounds like a silver bullet. You are getting a "ready-to-use" system with all the bells and whistles. All you need to do is run it in your organization.

However, when I thought about it more, I came to see it differently. A significant piece of these best practices is the data modeling schema. Basically, it represents the way the system works. On top of these data models, different business rules and process definitions are applied. It reminds me of taxonomies (Wikipedia: Taxonomy is the practice and science of classification) applied to the data you manage. Why do I have a concern about such models? In my view, a taxonomy-based approach is good when your data is stable and you have an agreed way to organize it. Applied to a manufacturing organization, it means you have an agreed way to handle product data and the processes around it. However, this is not true in modern manufacturing organizations. Today's organizations are dynamic and experience ongoing change due to economic, business and regulatory activities. How do you think it will work when pre-built best practices are applied?
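The concern can be sketched in a few lines: a fixed taxonomy encodes today's organization, and a reorganization forces every path - and everything that references those paths - to move. The paths below are hypothetical:

```python
# A fixed taxonomy works until the organization changes.
taxonomy = {
    "Products/Bikes/Road": ["P-100", "P-101"],
    "Products/Bikes/Mountain": ["P-200"],
}

def reclassify(tax, old_prefix, new_prefix):
    """A reorg forces every path under the old branch to move."""
    moved = {}
    for path, items in tax.items():
        if path.startswith(old_prefix):
            path = new_prefix + path[len(old_prefix):]
        moved[path] = items
    return moved

# Business change: "Bikes" is split out into its own division.
taxonomy = reclassify(taxonomy, "Products/Bikes", "BikeDivision")
# Every saved query, report and integration that pointed at the old
# paths now needs to be updated too - that is the hidden cost.
```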

Here, I'm coming to the second question. The most important activity when implementing PLM is the ability to make changes and react quickly when your organizational processes change. Customers these days are interested in making small and lean implementations and moving in short steps. And that sounds like a contradiction with pre-defined templates and best practices. Whatever you apply in the beginning will be changed later.

And my third and final question is about the initial implementation. Each organization has its own unique skills, rules and business processes. How can these practices be mapped to the predefined best practices? Hm… It sounds like another point where changes will be applied.

This is my take. Best practices are like a torpedo. When they come as a bunch of models, processes and rules, you need to spend organizational time to apply them to the way your company does business. This is the first explosion. Over time, you'll need to spend more time changing various aspects of the predefined pieces. So, this is your next explosion. After a few of these explosions, I think your model will be completely different from the original best practices.

So, what is my conclusion? What should the implementation starting point be? I believe best practices are an excellent way to show what your PLM product is capable of doing. However, as implementation practices, they don't work. You need to have a system that can capture your business practices. Once you have done that, you can change and optimize. The system needs to be flexible enough. The cost of the initial application of best practices is too high. Instead, you'd better invest in a system that can capture your business processes.

Just my thoughts… It would be great if you could share your experience. I hope we'll have a good conversation.

Best, Oleg


Best-practices aren’t good enough for PLM?

October 6, 2009

What I want to discuss today is PLM best practices. Frankly speaking, my thoughts on the topic were accelerated by the publication of the COFES 2010 theme - "COFES 2010: Best-practices aren't good enough". So, it led me to think out loud about what we call "best practices". I think you can hear a lot about this in the PLM community. Best practices have become very popular and have been promoted as the way to achieve an efficient PLM implementation. You can find them in different "envelopes" and combinations - best practices, industry solutions, express offerings, etc. But speaking with different people in our community, I identified two main trends, if you will, in everything related to best practices:

1. Support best practices coming from PLM vendors.
This community of people truly believes that PLM providers, by supporting best practices, will release packages that are ready to use and will be adopted very quickly by organizations. For small organizations, it will help reduce the cost of implementation. For big organizations, it will provide a framework to change the way the organization works. Customers don't need to spend time defining processes, models and implementation-specific stuff. Just install and go…

2. Support flexible, configurable PLM software. This community is actually the opposite of the previous one. From the standpoint of these people, PLM vendors don't have sufficient knowledge to provide pre-packaged configurations. In addition, they believe in the uniqueness of product development processes in each organization (even within the same industry). In their view, PLM software vendors need to focus on producing highly configurable, flexible software that can be customized, configured and adapted for specific customer needs.

So, what is my conclusion for today? I don't see either of these approaches as a "silver bullet". And I definitely see advantages in both directions. I'd be interested in an open discussion with you to share and discuss your experience, vision and thoughts about the future.

Best, Oleg

Should I keep secrets from my PLM system?

August 12, 2009

Product Lifecycle Management pretends to manage everything in the organization related to a product - information about the product, development IP, etc. So far, a pretty long list. The ultimate PLM goal is to manage the product lifecycle from the initial idea until disposal. When I think about such a goal, it always sounds big and ambitious. However, do we really need it to succeed with PLM?

Maybe we can learn from people's experience in non-PLM areas. I was reading an interesting post on New Com Biz related to people's notebooks. I discovered some interesting observations related to notebook (paper, not computer :)) management. You can see below a picture of weekly pages separated into four zones - Work, Projects, Personal, Misc. You can consider it a weekly lifecycle. This approach will not manage all your tasks, but it will give you quite good control over what is going on.

My experience in the area of digital control is related to my own implementation of GTD (Getting Things Done) in my Outlook system. If you are not familiar with the GTD approach, take a look at this book. Take a look at the picture. You can see a stream of incoming messages, contextual folders, next actions, projects, etc. This is an ultimate system to control your digital life.

My Getting Things Done Folders


Now, back to PLM. Should we take the "total approach" and manage everything related to a product with PLM? The answer is probably yes, but in the long term. For the short term, please keep secrets from PLM. Don't give the PLM system the power to make your life complex. You'd better keep your PLM implementation simple and get your things done on time.

Best, Oleg

PLM Transformation: Easy, No; Costly, Yes.

July 31, 2009

Over the last few weeks, I have had the chance to speak with a few of my colleagues about PLM implementations and technologies in different contexts. The main point of these conversations was how to make PLM implementation easy. A few interesting blog posts relate to this topic - "A PLM Success Story with ROI" by Jos Voskuil and "Why is implementing PLM Hard" by Jim Brown. A few days ago I wrote about how to move PLM to the mainstream. However, my take in that post was mostly about technologies; Jim and some of my other blogging colleagues raised a valid point - PLM is about people and organizational transformation, and therefore it cannot be easy. This is about change! Change is hard… So, my thoughts today are about analyzing what PLM transformation means from both sides - technological and organizational.

Here is my short summary of the main factors that make PLM transformation a very nontrivial task in an organization.

New operational environment for many people in organization

In most cases, the introduction of a PLM system brings a new environment to users. New data and process tools, viewing software, etc. A PLM environment is normally combined from a few connected pieces - Product, Process and Organization related. Because of the complex dependencies between these pieces, the resulting environment is not easy. So what do we have in the organization - confused customers and increased training budgets. A key conclusion - PLM needs to look for a Trojan horse that will help it enter the organization easily.

Complexity of Infrastructure and Implementation

PLM is naturally positioned in the middle of everything in the organization. By connecting requirements and design, design and manufacturing, and supply chains, PLM is positioned as a system that needs quite a heavy integration portion with other systems. We know that integration work in the enterprise is complicated, painful and very expensive once you start touching the main enterprise systems such as CRM, ERP, SCM and others. So, in many cases PLM gets stuck in the middle of these integrations and requires significant effort and resources to move forward to completion. What can be improved in this integration journey? My recommendation is to make your integration project part of other projects that bring value. Even if you are 100% sure you need two systems to work together, never start this task alone. You will end up in an endless process of integration. Instead, make the integration achieve a specific task and show the resulting value. Integration technologies are very painful; don't try to re-invent the wheel. You will not change "the integration kingdom". So keep your budgets here.

Changes in Processes

This is another painful point. When we start the PLM journey, we always say - we are going to change how people work, improve it, optimize it. This initial value proposition sounds great and can be appreciated by many stakeholders. However, as soon as people start this process of change, they face so many organizational problems and obstacles that they either go into the next 1-2 years of discussion about how they need to work in the organization, or try to implement processes that have relatively low maturity. The result is obvious - a drain of resources and people's dissatisfaction. Meanwhile, you have spent your PLM $$$ and, most importantly, the credit needed to make the PLM implementation successful. My recommendation here - keep focus and boundaries.

Content Transformation

This is last, but not least. In many cases, a PLM system comes into a place where different legacy systems or just handmade processes worked. As a result, the organization faces a huge requirement to transform a lot of organizational content - metadata, documents, process information, etc. - into the new system. These legacy data imports are also not simple. People worry when their data is moved; they cannot always find it. Also, the organization has a tendency to get rid of what it doesn't need at the same time. My conclusion - be prepared to import legacy content and handle it in your system from the beginning.
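A minimal sketch of such a legacy import: map the legacy fields you understand into the new schema, and park the rest instead of silently dropping it. The field names are invented for illustration:

```python
# Hypothetical mapping from legacy metadata fields to the new system's fields.
FIELD_MAP = {"PARTNO": "number", "DESCR": "name", "REV": "revision"}

def migrate(legacy_record):
    """Split a legacy record into migrated fields and unmapped leftovers."""
    migrated, unmapped = {}, {}
    for key, value in legacy_record.items():
        target = FIELD_MAP.get(key)
        if target:
            migrated[target] = value
        else:
            unmapped[key] = value   # park it; decide later, don't lose it
    return migrated, unmapped

record, leftovers = migrate({"PARTNO": "A-17", "DESCR": "Bracket",
                             "REV": "B", "OLDCODE": "X9"})
```

Keeping the leftovers visible is exactly the point made above: people worry when their data is moved, so the import should make it clear what went where.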

So, what is my conclusion? PLM is not an easy task. If you don't plan it properly, you can get stuck many times on your way to transforming the company towards the future way of developing and supporting a product. As soon as you get there you will enjoy the benefits, but the transformation will be hard and costly. You will ask me - what to do? My short answer - plan it in advance, have a good team of people committed to making it happen, and go!

PS. I tried not to talk about technologies in this post. Technologies can be helpful, but sometimes you need to see beyond magic PLM technologies. So I did, and I hope you will too. I'm open to discussions and feedback.

Best, Oleg

My Slice of PLM Single Version of Truth

June 18, 2009

I have a crazy idea to discuss today. I'd like to talk about a topic that we like very much and that is often discussed when we mention PLM. The topic is the "single version of the truth". In my opinion, in many cases we present it as being obvious. Yes, the fundamental intention of Product Lifecycle Management is to cover a product from the initial concept up until the product is manufactured, released, supported and recycled. So, having a unified way to manage products, processes and resources is one of the most important ideas in PLM.

Today's enterprises are becoming very dynamic: changes are happening all the time; companies are working with a wide range of suppliers for different purposes. How can PLM provide affordable and scalable solutions for such a dynamic eco-system? This creates a lot of challenges for a company providing product data and lifecycle management solutions. How can you get everybody synchronized in the way you do business processes, and how can you keep your PLM systems up-to-date in this environment?

So, I came to a working conclusion that I'd like to discuss. My point is that in today's enterprise eco-system, you cannot demand that people agree on how to manage your product data and processes. Ah… I know, it sounds bad, but bear with me for a few more minutes, don't close this post:)… I think today's data management is too complex to allow large organizations to agree on a single way to do business and implement a PLM system to follow this agreement. This task is too complex and takes too long. You won't be able to finish it before you have to start a new one! So, this is probably the most fundamental problem in today's system implementations. It's too long and too expensive, since we are trying (and we need) to agree on how to implement the systems.

Here’s my 5-point view on the subject as follows:

  1. Organizations and systems are too complex to agree on PLM related data, processes and best practices.
  2. Successful PLM implementations need to focus on how to manage ongoing system changes.
  3. Best practices and processes in an organization will be a result of multiple changes and improvements in the PLM system implementation.
  4. The system needs to keep track of all changes.
  5. We need a very flexible PLM system, and I don't believe we have one yet.
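Point 4 above can be sketched as a minimal change log for system configuration changes - the raw material from which best practices could later emerge (a sketch, not any specific PLM system's API):

```python
from datetime import datetime, timezone

class ChangeLog:
    """Record every configuration change so practices can be mined later."""

    def __init__(self):
        self.entries = []

    def record(self, who, what):
        self.entries.append({
            "when": datetime.now(timezone.utc),  # timezone-aware timestamp
            "who": who,
            "what": what,
        })

log = ChangeLog()
log.record("admin", "added attribute 'supplier' to type Part")
log.record("admin", "renamed lifecycle state 'Frozen' to 'Released'")
```

If every implementation change is captured like this, the "best practice" stops being a frozen template and becomes the observable history of how the organization actually evolved its system.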

What's my conclusion? I was reading Jos Voskuil's blog post about PLM ROI yesterday and thought about why the ROI of PLM is not obvious. My take today is that, probably because our implementations are still too big and too complex, people see this as a very big and fundamental investment. So, they need to double-check themselves with many ROI calculations. Allowing ongoing changes and modifications of PLM systems will make implementations simpler and ROI calculations easier…

So, don’t keep quiet… I know you won’t all agree with each other – but let me know what you think.

