Why My PLM Won’t Work For You?

February 6, 2014


Implementing PLM is a process of change. Speak to anyone in the engineering and manufacturing community and they will tell you plenty of stories about the complexity of PLM implementations and the associated cost. You can also hear many stories about the complexity of moving from one PLM implementation to another, or of switching from one PLM system to another. Companies spend tons of money aligning PLM systems to a specific set of requirements to fit company data management and process needs.

A couple of years ago, I posted "Is PLM customization a data management Titanic?". For many manufacturing companies, this is a reality these days. Implementations done 10 years ago can hardly be maintained. Updating a current implementation to a new PLM system version, or to another PLM system, is mission impossible. Companies hire advisers and consulting firms involved in the implementation and development of PLM systems to run migrations and adjust the PLM system to a new set of requirements.

Vendors have been trying to resolve the complexity of PLM systems by providing out-of-the-box configurations. However, the success of these ready-to-go systems has been mixed. Pre-configured templates worked well during marketing shows, presentations and evaluations. However, bringing the system to production mode (still) required customization. Very often, the customization ended up replacing the pre-configured template entirely. The problem is not unique to the PLM space. I posted the article "How to de-customize PLM" a few weeks ago, where I discussed the importance of decreasing the customization level and presented some similar customization complexities coming from SharePoint implementations.

Today I want to share some recommendations you can follow in order to stay away from costly PLM customizations. These recommendations will also help you avoid some typical PLM implementation pitfalls. Here are 4 steps to follow:

1- Ask yourself what problem you want to solve with PLM over the next 2 years. The term PLM is used by many people in a variety of forms and meanings. Going in with a specific scope (e.g. change management, quality, bill of materials, etc.) will help you chart the functionality you expect the PLM system to support.

2- Outline the main data elements and structures the PLM system needs to support in order to solve the list of problems from the previous step. Do it with no connection to a specific PLM system or vendor. Get agreement about it within your extended team. It can take some time to reach that agreement, but in many situations this is one of the best investments you can make to eliminate extra customization steps.

3- Pick a few PLM systems and try to map your requirements to what these systems can provide without customization. Don’t be afraid to change your terminology along the way. However, ensure that whatever name the PLM system uses, it does what you expect from a functional standpoint. It is a good idea to hire a consultant during this stage. It is worth spending some dollars to avoid future budget waste.

4- Last, but very important. You need to test that the selected system is flexible enough to apply changes on top of pre-configured parameters/templates. It is not unusual for an out-of-the-box system configuration to be practically unchangeable. "Practically" means, in this context, the ability to add to or modify the system and data model while (at the same time) keeping most of the existing functionality in place. Stay away from system configurations with scripts and customized behaviors hard-coded to particular data models and workflows.
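The flexibility test in step 4 can even be automated as a smoke test: extend the pre-configured data model with a custom attribute and verify that existing records and queries still work. The sketch below uses an in-memory stand-in for a real PLM system's API, since every vendor's actual API differs; the class, method and attribute names are hypothetical illustrations of the test's shape, not any product's interface:

```python
# Sketch of a flexibility smoke test: extend a pre-configured data model with
# a custom attribute and check that existing functionality still behaves.
# FakePLMSystem is a hypothetical in-memory stand-in for a real PLM API.
class FakePLMSystem:
    def __init__(self):
        # Pre-configured template: a Part entity with standard attributes.
        self.schema = {"Part": ["part_number", "revision"]}
        self.records = [{"part_number": "P-100", "revision": "A"}]

    def add_attribute(self, entity, attribute, default=None):
        """The operation under test: extend the data model in place."""
        self.schema[entity].append(attribute)
        for record in self.records:
            record[attribute] = default  # existing data must survive

    def find(self, entity, **query):
        return [r for r in self.records
                if all(r.get(k) == v for k, v in query.items())]

def smoke_test_flexibility(system):
    before = system.find("Part", part_number="P-100")
    system.add_attribute("Part", "supplier_code", default="UNKNOWN")
    after = system.find("Part", part_number="P-100")  # old query still works
    return (len(before) == 1 and len(after) == 1
            and after[0]["supplier_code"] == "UNKNOWN")

print(smoke_test_flexibility(FakePLMSystem()))  # -> True
```

If the equivalent of `add_attribute` on a real candidate system breaks existing queries, reports or workflows, that is exactly the hard-coded-template smell step 4 warns about.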

What is my conclusion? The combination of flexibility and a preconfigured environment is the key to staying away from costly PLM customizations. However, these two characteristics are very often mutually exclusive. Vendors can show a ready-to-use PLM configuration that will be literally destroyed as soon as you have to change something. Testing how flexible the out-of-the-box PLM model is, is the key to not being punched in the face by future PLM system re-configuration and customization costs. Just my thoughts…

Best, Oleg

Why PLM should not think about ownership?

October 10, 2013


The enterprise software business is complicated. It often comes with questions of system responsibility and ownership. IT usually holds primary responsibility for the cross-company software stack. At the same time, divisions and departments often try to influence IT and make their own decisions about what software to use independently. One of the terms that came out of this political and technological confrontation is shadow IT.

Ownership is a sensitive question. In an enterprise organization (manufacturing companies included), it often involves lots of politics, hidden agendas and power plays. This is the reality of many enterprise organizations. By its nature, PLM gets deeply involved in these politics. If you think about PLM as a business strategy focused on organizational processes, it becomes clear why it is hard to avoid cross-department conflicts. Then, if you think about PLM as a technology, you can appreciate why ownership of a particular set of technology and/or data can get drawn into the same cross-department politics.

Earlier this week, I was reading the Aras blog – Who should take ownership of PLM? Have a read and draw your own opinion. I think it provides an interesting perspective on PLM ownership. This is how the Aras blog explained it:

With this business dynamic in mind, have you reexamined who should take ownership of PLM in your organization? Something to consider: there will be different driving forces behind PLM depending upon who has ownership. Here are some to keep in mind:

Should Design select and own the PLM system? As you’ve likely seen, the main focus of PLM is in CAD file management, change management and BOM management when design is at the wheel.

What about Operations? With cost reduction, inventory management and purchasing at the top of their priorities you can bet these will be the focus for Operations driven PLM as well.

What about Quality? With the Quality organization leading, the driving forces for PLM are likely to be compliance, regulations and overall quality of products.

I agree, the story is really not black and white. However, a good organization will break this story down into pieces of business processes and technologies. There are clear benefits to divisions and departments being responsible for their own processes. This should be the ultimate goal. Nevertheless, I can still see some overall responsibility for product development processes (especially when it comes to cross-functional teams). So, having somebody responsible for the product can provide a lot of value. At the same time, it doesn’t mean the technology needs to be driven by people from the same department. I see technology and software stack responsibility separately. Sometimes, it can come as multiple pieces of products and technologies.

What is my conclusion? In my view, the question of "PLM ownership" is the wrong one. It must be split into business, process, communication and technology. It is obvious that the Quality (or any other) department should be responsible for how its processes are organized. However, that doesn’t imply ownership of technologies and software pieces. Just my thoughts…

Best, Oleg

Image is courtesy of Aras blog.

PLM real implementations: too long to be on-time?

July 7, 2013

One of the looongest US weekends I can remember is coming to an end. It is time to get back from the relaxing holiday atmosphere to business reality. I’ve been skimming social channels and stumbled on PLM statistics posts published by my good friend and PLM blogging buddy Jos Voskuil. Navigate to the following link – PLM statistics – the result. Read and form your own opinion. Jos’ main conclusion – PLM is more vision than tech. PLM implementation is a journey that takes time, effort and resources. Some interesting and funny things came out of the comparison of experience and PLM implementation time. Here is the passage I liked:

Here, it was interesting to see that more than 60 % of the respondents have over 8 years of experience. As mentioned related to the previous questions it is necessary to have a long term experience. Sometimes I meet a “Senior” PLM Consultant (business card) with two or three years of experience. I believe we should reserve the word “senior” for PLM with a minimum amount of 5 years experience. And it is also depending on the amount of projects you were involved in. Interesting thought came into my mind. Some vendors claim they provide extreme rapid implementations for PLM (2 weeks / 30 days / 3 months). If this is real PLM you could do 25, 12 or 4 PLM projects per year full time.

It made me think about PLM implementations the way they exist today – journey-type, specialized implementations requiring time and effort. I certainly agree with Jos – changing the way companies work requires vision, time and effort. In some situations, PLM implementations come to change product development processes established over decades.

However, here is a different angle from which to look at the PLM problem. Business is very dynamic these days – business environment, ecosystem, technology, human resources, business landscape. What if the current lifecycle of many PLM implementations is not in line with business needs? It reminds me of one of the old PTC slides from COFES Israel – people just want to drink beer!

What is my conclusion? The new enterprise landscape and business reality will require a different approach to everything – IT, computing models, enterprise software and implementations. We’ve seen lots of changes happen in the consumer technology space, open source and other places over the past 10 years. People are looking for ways to build new products faster and provide a quick response to customer demands. So, my hunch is that some PLM journeys will be too late to deliver results. Just my thoughts…

Best, Oleg

Who should be the first PLM user in a company?

May 9, 2013

Enterprise software implementations are usually not a simple task. Compared to selecting your next mobile device or RSS reader, it is an organizational effort. Enterprise software gets really complicated at the point where implementation requires the involvement of people. Product lifecycle management (PLM) is one of these systems. Implementation of PLM is deeply connected to product development and manufacturing processes. The success or failure of a PLM implementation is directly impacted by how people are involved in PLM system adoption and use.

Companies take different approaches to implementing PLM. Fundamentally, however, I can see two different ways to implement. The first is a holistic approach, usually called "business transformation". It implies significant process changes as a result of the PLM system implementation. Companies analyze their existing processes, optimizing and restructuring the way they do business. The second approach focuses on a specific process or problem to solve. It usually comes as an improvement of a specific activity and/or process.

There are lots of debates about PLM implementations these days. The value of PLM system implementations becomes clear to organizations on different levels. At the same time, it is obviously not easy for people to understand how to start using a PLM system that will have such a significant impact on everything they do.

I was reading a Minerva blog post – Should we pull PLM deployment? A new lean deployment strategy by Yoann Maingon. In this article Yoann shares his view on different approaches to implementing PLM. The idea of lean and "pulling data" resonated with me. Here is an interesting passage:

The lean concept is highly based on a pull flow. Most of the arguments I’ve had were about the fact that the main data is created in Engineering so we should start deployment in engineering. Well, what if you should provide a system to the first person who enter the system. The one who will pull the flow, the customer? the marketing? assistance & support?

It made me think about how to maximize the value of a PLM implementation within a short period of time. Here is the idea. Every company manufactures products for customers in some way. The biggest process loop in every manufacturing company starts with requirements and ends with the "release" of the product to the customer. Controlling the loop between requirements and results can be an interesting problem to handle first.

The idea of "pull" would relate to pulling product requirements and the documents representing released products, and combining them together in a single system. In my view, it can provide interesting insight into company operations. It is also a very useful information source that every company can "re-use" for different purposes – new projects, customer support, etc.
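The requirements-to-release loop can be sketched as a simple traceability check: pull requirements and released documents into one structure and ask which requirements have no released result yet. The identifiers and field names below are made up for illustration:

```python
# Hypothetical sketch of the "pull" idea: combine requirements and released
# documents in one place and report requirements with no released result.
requirements = ["REQ-001", "REQ-002", "REQ-003"]
released_documents = [
    {"doc": "DRW-10", "satisfies": ["REQ-001"]},
    {"doc": "SPEC-22", "satisfies": ["REQ-001", "REQ-003"]},
]

def uncovered_requirements(reqs, docs):
    """Return requirements not satisfied by any released document."""
    covered = {r for d in docs for r in d["satisfies"]}
    return [r for r in reqs if r not in covered]

print(uncovered_requirements(requirements, released_documents))  # -> ['REQ-002']
```

Even this tiny view answers a management question ("what did we promise that has not been released?") that is otherwise buried across two disconnected systems.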

What is the conclusion? It all starts with ROI, and how to achieve it faster. This is the challenge most PLM implementations are facing these days. For most implementations, the process of getting to results can be slow. Providing a system that connects captured requirements to release control can be an interesting option. Lots of valuable information is hidden in the relationship between requirements and results. It can also drive management attention and focus in a company. Just my thoughts…

Best, Oleg

PLM adoption and CAD management valley of death

March 5, 2013

The issue of PLM adoption remains critical, in my view. Even if we can see more examples of PLM implementations, companies usually treat a "PLM project" as something that needs to be taken on with care, a significant amount of planning and justification. So, I wanted to ask, "why does this happen"? The traditional answer, mostly coming from PLM vendors and PLM consultants, points to the complexity of business processes, the need for people to change, and the technological challenges related to implementation and customization of systems. Usually, vendors and consultants work with the CAD/IT managers who lead the "engineering part" of the implementation. In parallel, PLM consultants often dream of working with C-level people in a company to define better alignment of PLM strategy.

I want to raise the question of the balance between engineering IT / CAD managers and more strategic PLM business planning. Here is the thing – even if companies succeed in strategic planning for future business and product development changes, CAD management can still create too many complications along the way. CAD management is complicated, requires lengthy implementations and data imports and, more importantly, changes the way engineers and designers work by introducing "data management". The last one is hated by everybody because of its complexity.

In addition to complexity, CAD data management introduces the issue of compatibility between different CAD systems and PDM components. A new trend to solve this problem is to use the PDM system of the same CAD vendor. It simplifies CAD/PDM integration, but introduces the problem of integrating multiple PDM/PLM systems. CAD management (PDM) projects, often positioned as an introductory step in PLM implementations, frequently end up as a long and painful journey.

What is my conclusion? In my view, CAD management is the valley of death for many PLM implementations. Many PLM consultants and even some PLM vendors try to avoid it and position PLM implementations "beyond CAD/PDM". It certainly gives some advantages, but (in my view) just hides the real problem of bad engineering data management. Engineering data from CAD is an important element of change management and many other PLM-related processes. As we move toward more agile and efficient product development process management, solving the problem of CAD data management and PDM becomes very important. Just my thoughts…

Best, Oleg

Dogfooding and PLM APIs random thoughts

November 14, 2012

If you have been in the software business long enough, you should be familiar with the term "dogfooding" (or "eat your own dog food"). The term is used to describe the situation in which a company uses its own products to demonstrate their capabilities and quality. If you are not familiar with this practice, navigate to the Wikipedia article to read more. I liked some of the examples there, specifically the Apple one, which I wasn’t aware of:

Apple Computer president Michael Scott in 1980 wrote a memo announcing that "EFFECTIVE IMMEDIATELY!! NO MORE TYPEWRITERS ARE TO BE PURCHASED, LEASED, etc., etc." by the computer company, with a goal to eliminate typewriters by 1 January 1981.[9]

The following passage brings a few more examples:

One perceived advantage beyond marketing is that dogfooding allows employees to test their company’s products in real-life scenarios,[3][5] and gives management a sense of how the product will be used, all before launch to consumers.[5] In software development, the practice of dogfooding with build branches, private (or buddy) builds, and private testing can allow several validation passes before the code is integrated with the normal daily builds. The practice leads to more stable builds[citation needed], and proactive resolution of potential inconsistency and dependency issues, especially when several developers or teams work on the same product. For example, Microsoft and Google emphasize the internal use of their own software products[citation needed]. For Microsoft, especially during the development stage, all employees across the corporation have access to daily Software builds of most products in development, including the Windows operating system.[citation needed]

Today, I want to speak about a specific kind of "dogfooding" related to PDM/PLM APIs (Application Programming Interfaces). In the world of PLM implementations, the role of open APIs becomes very important. Usually, when I work with customer requirements, I see notes like this – external programming or customization as a way to provide features or functions absent from the product. Yesterday, I had a chance to read the following TechCrunch article – 5 Rules for API Management. Even if you are not a programmer or software engineer, have a read and form your own opinion.

The article made me think about the complexity of API delivery in PDM/PLM, as well as about "lifecycle". The latter is important – PDM/PLM products live for a very long period of time, and the development of stable APIs is a separate and almost "must have" prerequisite. The 5 rules – design, documentation, analytics, universal access and uptime – made perfect sense to me. I found an interesting note about the relationship between IT and business groups (which is also very typical for many PDM/PLM implementations):

Enterprise API Management must include the entire Enterprise, not just the techies in IT. The SOA solution, and the other gateways as well, is focused on the IT person and not the business owner of the API program. This is reflected in the UI that they present in their free version as well as their language that includes things like “policies”; too much of the business rules are codified in complex policies that require a technical expert to really use.

However, I found the notion of analytics most interesting, since it can address the idea and requirements of API management through the lifecycle of the product. Here is the passage to think about:

[how to] think about the collection and processing of all the statistics associated with the use of the API, with an eye toward supporting and encouraging effective usage and discouraging/limiting usage that is counter to your business or technology goals.
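The kind of usage statistics this passage describes can start with something as simple as a counter around every endpoint call, plus a quota to flag usage that runs counter to your goals. This is a minimal, hedged sketch; the endpoint names and quota mechanism are assumptions, not any vendor's API:

```python
# Minimal sketch of API analytics: count calls per endpoint and flag usage
# above a quota, so heavy or counter-productive usage can be spotted.
from collections import Counter

class ApiAnalytics:
    def __init__(self, quota_per_endpoint):
        self.calls = Counter()          # endpoint name -> call count
        self.quota = quota_per_endpoint

    def record(self, endpoint):
        """Record one call against an endpoint."""
        self.calls[endpoint] += 1

    def over_quota(self):
        """Return endpoints whose usage exceeded the quota."""
        return [e for e, n in self.calls.items() if n > self.quota]

analytics = ApiAnalytics(quota_per_endpoint=2)
for endpoint in ["get_bom", "get_bom", "get_bom", "checkout"]:
    analytics.record(endpoint)
print(analytics.over_quota())  # -> ['get_bom']
```

In a real PDM/PLM deployment the same counters, kept per API consumer over years of releases, are what let a vendor deprecate or evolve an API without breaking customers silently.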

What is my conclusion? The days of single PLM platforms are almost gone. The future belongs to networks – data networks, product and cloud service networks. The ability to adapt a product to customer needs, to continue product development in a fast-changing customer environment, and the strategic goal of cloud deployment set new goals for PDM/PLM developers. The importance of having an agile and flexible API that can sustain many product releases and development cycles has never been as high as it is today. Just my thoughts…

Best, Oleg

Image is courtesy of TechCrunch article (Feature image courtesy of XPlane – under Creative Commons.)

How Do You Know Your PLM Project is in Trouble?

December 19, 2011

The blogosphere and other literature are full of remarks about companies stuck in different phases of the PLM process. You probably had a chance to read Aras’ Frustrated by a stuck PLM project? blog post last year. The recent Autodesk announcement of Nexus PLM again prompted many publications about the complexity and sophistication of existing PLM implementations. While time will show whether Autodesk cloud PLM technologies will be able to reduce the complexity of PLM implementations, I’ve been thinking about how you can identify today whether your own PLM project (if you are already running one) is in trouble.

I had a chance to read an InfoWorld article earlier this week – Six lessons from a lightning ERP rollout. Have a read and form your own opinion. I especially like the following passage from the beginning:

Here’s something you don’t hear everyday: "Our SAP implementation finished ahead of schedule." Sorry, let me rephrase that. Hearing about an SAP implementation that finished ahead of schedule is like hearing that someone captured the Loch Ness Monster and turned it into a kiddie ride. It’s as likely as Bigfoot singing "La Traviata" at Lincoln Center. It’s as if you called a software company’s tech support line and the voice on the other end didn’t insist you reboot your PC.

This article made me think (again and again) about how you need to plan your PLM implementations. Here are my top 5 symptoms you should watch out for. When you discover them, you had better check what is happening with your PLM project:

1. You cannot control your PLM project budget. As in R&D, you know that "shit happens" all the time. However, be aware – the achievements of your PLM system will be significantly diminished if you overspend by 200-300%.

2. Engineers and other people in your company work around the PLM system. This should be a "red flag" for you. If people think the system doesn’t work (or is way too complex), fasten your seat belt and run fast to understand the core reason for it.

3. The infrastructure becomes more and more complicated. You need more databases, storage, CPU, etc. You are probably familiar with this – after the first pilot, the system requirements grow. Watch the first production data load carefully. Your bill of materials, check-in/out operations and some other elements are sensitive, and you can run out of budget fast.

4. The vendor is pushing you towards the next release of their flagship product. This is another "red flag". Normally, it means something was over-promised by the sales fellows. Watch this moment as well.

5. You start hearing that you will be able to take full advantage of your PLM system only when you completely integrate it with your ERP (and other systems) as well as migrate to another CAD system. This is actually the right time to stop and re-think what you are doing. Best to talk with somebody who is not involved in the business of PLM vendors.

What is my conclusion? Looking at what I wrote, I see symptoms that will probably hold true not only for PLM, but for a broader range of enterprise software. However, as you probably know, in PLM and the enterprise, one size doesn’t fit all. You need a diversity of knowledge and experience to make things work. Just my thoughts…

Best, Oleg

picture courtesy digitalart / FreeDigitalPhotos.net

PLM Integration Failures

January 13, 2011

There is one topic that always raises lots of controversy, in my view. I’m talking about integrations, or even more specifically about PLM-oriented integration. I want to point to the following two articles I posted previously about PLM integrations:

PLM Integration Gotchas
PLM and Enterprise Integration Game

I read the "Reasons Why PLM Integration Fails?" article on the To-Increase Blog. To-Increase is a company from the Netherlands specializing in Microsoft ERP products (Dynamics AX, NAV) and the product configuration software e-Con. Read the article and form your own opinion. The author makes a point about various difficulties related to PLM integrations. Here is my favorite passage from this article:

A fundamental risk within any manufacturing firm, especially a firm with global operations, is the risk of information becoming siloed within individual teams. For example, if information concerning a flaw in the development of a product is available only to the engineering team, and kept from marketing, there exists the risk of gearing up product launch tasks too quickly – resulting in wheel spinning at best, and a significant loss in resources at worst.

In much the same way, if a PLM system is implemented – but not integrated with all other systems related to manufacturing processes (think Enterprise Resource Planning systems, think Manufacturing Execution Systems) the risk exists for information to be siloed in one system.

PLM Integration and Competition

The focus on PLM-ERP integration is interesting. These are two systems that very often try to establish dominance in the culture of a manufacturing organization. Are you PLM- or ERP-driven? Which system "owns" Part or Bill of Material information? Who is authoring the BOM? I heard such statements many times when talking to customers during implementations. The integration point often becomes a competitive advantage. I believe that for To-Increase, integration with other products is a significant competitive advantage. Manufacturing companies will think twice before deciding which system will drive product development processes.
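One way to defuse the "which system owns the BOM" fight is to write the ownership down at the attribute level: for each field, exactly one system is the source of record, and synchronization copies one way only. The field names below are hypothetical examples of such a split, not a prescription:

```python
# Hypothetical sketch of attribute-level ownership between PLM and ERP:
# each field has exactly one source of record, and sync copies one way only.
OWNERSHIP = {  # attribute -> system of record
    "part_number": "PLM",
    "revision": "PLM",
    "standard_cost": "ERP",
    "supplier": "ERP",
}

def sync_part(plm_record, erp_record):
    """Return a merged view where each attribute comes from its owner."""
    merged = {}
    for attr, owner in OWNERSHIP.items():
        source = plm_record if owner == "PLM" else erp_record
        merged[attr] = source[attr]
    return merged

plm = {"part_number": "P-100", "revision": "B", "standard_cost": 0.0, "supplier": ""}
erp = {"part_number": "P-100", "revision": "A", "standard_cost": 12.5, "supplier": "ACME"}
print(sync_part(plm, erp))
# -> {'part_number': 'P-100', 'revision': 'B', 'standard_cost': 12.5, 'supplier': 'ACME'}
```

Note how the stale revision "A" in ERP simply loses: the ownership table, not organizational politics, decides which value wins.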

Partners and Integration Complexity

Integration is not a simple task. You need enough technological and process knowledge, as well as technical skills, to make it work. In addition, you will rarely find two identical integration solutions. Each manufacturing company has its own practices, systems and specifics. Because of this high level of complexity, software vendors try to rely on partners to deliver integration solutions for end users. The ability of a partner to deliver the integration becomes key to making the overall implementation a success.

PLM integration is hard. The cost of implementation is high. The cost of failure is even higher. Vendors are pushing integrations out of the scope of their deliveries. Partners like To-Increase can provide a significant advantage by helping customers make integration happen. These are the realities of ERP and PLM implementations.

What is my conclusion? Integrations are important and complicated at the same time. Integration failures are one of the main causes of overall implementation failure. The reliance on services increases the implementation cost and creates customer dependency on implementation services. I’d expect software vendors to re-think their view of how they can make integration easier. It can be a significant differentiation factor in future PLM systems. Just my thoughts…

Best, Oleg
Freebie. Nobody paid me to write this article.

PLM Prompt: A Roadmap to PLM Success?

September 30, 2009

I wanted to share a funny picture from flowingdata with you. This picture reminds me a lot of PLM implementations and various PLM methodologies. Nevertheless, let me now ask you a few serious questions.

What is a roadmap to PLM success in the organization?

What is a roadmap to successful PLM implementation?

I’m going to think and blog about this next week, but I would be interested to hear your voices…

Best, Oleg

PLM Transformation: Easy, No; Costly, Yes.

July 31, 2009

Over the last few weeks, I have had the chance to speak with a few of my colleagues about PLM implementations and technologies in different contexts. The main point of these conversations was how to make PLM implementation easy. A few interesting blog posts relate to this topic – “A PLM Success Story with ROI” by Jos Voskuil and “Why is implementing PLM Hard” by Jim Brown. A few days ago I wrote about how to move PLM to the mainstream. However, my take in that post was mostly about technologies; Jim and some of my other blogging colleagues raised a valid question – PLM is about people and organizational transformation, and therefore it cannot be easy. This is about change! Change is hard… So, my thoughts today are about analyzing what PLM transformation means from both sides – technological and organizational.

Here is my short summary of the main factors that make PLM transformation a very non-trivial task in an organization.

New operational environment for many people in organization

In most cases, the introduction of a PLM system brings a new environment to users – new data and process tools, viewing software, etc. A PLM environment is normally combined from a few connected pieces – Product-, Process- and Organization-related. Because of the complex dependencies existing between these pieces, the resulting environment is not easy. So what do we get in the organization – confused users and increased training budgets. A key conclusion – PLM needs to look for a Trojan horse that will help it enter the organization easily.

Complexity of Infrastructure and Implementation

PLM is naturally positioned in the middle of everything in the organization. By connecting requirements and design, design and manufacturing, and supply chains, PLM is positioned as a system that needs a quite heavy integration footprint with other systems. We know integration work in the enterprise is complicated, painful and very expensive once you start touching main enterprise systems such as CRM, ERP, SCM and some others. So, in many cases PLM gets stuck in the middle of these integrations and requires significant effort and resources to move forward to completion. What is possible to improve in this integration journey? My recommendation is to make your integration project part of other projects that bring value. Even if you are 100% sure you need to integrate two systems, never start this task alone. You will go into an endless process of integration. Instead, make the integration achieve a specific task and show the resulting value. Integration technologies are very painful – don’t try to re-invent the wheel. You will not change “the integration kingdom”. So keep your budgets here.

Changes in Processes

This is another painful point. When we start a PLM journey, we always say – we are going to change how people work, improve it, optimize it. This initial value proposition sounds great and can be appreciated by many stakeholders. However, as soon as people start this process of change, they face so many organizational problems and obstacles that they either go into the next 1-2 years of discussion about how they need to work in the organization, or try to implement processes that have relatively low maturity. The result is obvious – a drain of resources and people's dissatisfaction. Meanwhile, you have spent your PLM $$$ and, most importantly, the credit needed to make the PLM implementation successful. My recommendation here – keep focus and boundaries.

Content Transformation

This is last, but not least. In many cases, a PLM system comes into a place where different legacy systems or just handmade processes worked. As a result, the organization faces a huge requirement to transform a lot of organizational content – metadata, documents, process information, etc. – into the new system. These legacy data imports are also not simple. People worry when their data is moved; they cannot always find it. Also, the organization has a tendency to get rid of what it doesn’t need at the same time. My conclusion – be prepared to import legacy content and handle it in your system from the beginning.
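A simple guard against the legacy-import worry ("people worried when their data moved, they not always can find it") is a reconciliation check after migration: every legacy record must be either imported or deliberately excluded, and the difference must be reviewed. The identifiers below are made up for illustration:

```python
# Hedged sketch of a post-migration reconciliation check: each legacy item
# must be either imported into the new system or explicitly marked as
# intentionally dropped; anything else is a migration gap to investigate.
legacy_ids = {"DOC-1", "DOC-2", "DOC-3", "DOC-4"}
imported_ids = {"DOC-1", "DOC-2"}
intentionally_dropped = {"DOC-4"}

def unaccounted_for(legacy, imported, dropped):
    """Return legacy items that silently disappeared during migration."""
    return sorted(legacy - imported - dropped)

print(unaccounted_for(legacy_ids, imported_ids, intentionally_dropped))  # -> ['DOC-3']
```

Running this kind of check before cutting over the old system turns "people cannot find their data" from a post-go-live complaint into a pre-go-live work item.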

So, what is my conclusion? PLM is not an easy task. If you don’t plan it properly, you can get stuck many times on your way to transforming the company toward the future way of developing and supporting a product. As soon as you get there you will enjoy the benefits, but the transformation will be costly and not easy. You will ask me – what to do? My short answer – plan ahead, have a good team of people committed to making it happen, and go!

PS. I tried not to talk about technologies in this post. Technologies can be helpful, but sometimes you need to see beyond magic PLM technologies. So I did, and I hope you will too. I’m open to discussion and feedback.

Best, Oleg

