Why is PLM taking “best practices” the wrong way?

October 19, 2015


The topic of “best practices” in PLM implementations usually brings a lot of controversy. Since PDM and PLM technologies moved from a toolkit approach to products and solutions, vendors and industry pundits have been looking for a silver bullet to rationalize and optimize the deployment and implementation process.

For the purpose of this article, best practices are NOT general guidance on how to make a PLM implementation successful. You can find a good example of general PLM implementation best practices from Gartner here, but it is just one example.

In practice, the traditional PLM implementation approach focuses on two specific elements – data models and processes.


What I call “best practices” relates to specific PLM implementation techniques. I first touched on this a few years ago in my PLM Best Practice Torpedo article. Think about best practices as a torpedo. When they arrive as a bundle of models, processes and rules, your organization needs to spend time applying them to the way your company does business. That is the first explosion. Over time, you will need to spend more time changing various aspects of the predefined pieces. That is your next explosion. After a few such explosions, your model will be completely different from the original best practices.

There is a lot of research looking at “best practices” and trying to find the right balance in applying predefined templates. The Tech-Clarity research – How to identify and implement PLM best practices – is a good example. You can find more information about the research in the PTC blog here:

The short version of the results is this – The top performers focus on defining new processes, but they do it in context with the capabilities and processes supported by the software. This is not the only way to implement PLM, but those that got the most from PLM were more likely to take this approach, so it’s good advice to consider. See more in the guest blog post.


While in theory it might be a great approach, in practice it is very hard to improve processes and software concurrently. From a practical standpoint, it means (1) taking a few data models and process templates and (2) applying configurations and customizations based on company requirements.

This “two stages” approach is nothing new and is used by almost all implementation and service providers. But I actually think this is where PLM vendors and products took it wrong. Applying a “standard” first and then starting to change it is the wrong thing to do. What if we think about it in reverse – first capturing existing processes, then applying best practices?
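
To make the idea more concrete, here is a minimal sketch (in Python, with a hypothetical event log – the data shape and activity names are my assumptions, not any vendor's format) of how a tool could discover a company's existing change process from activity history instead of starting from a predefined template:

```python
from collections import Counter, defaultdict

# Hypothetical event log captured from existing systems (audit trails,
# PDM history, email): (case id, activity, timestamp).
events = [
    ("ECO-1", "create", 1), ("ECO-1", "review", 2), ("ECO-1", "approve", 3),
    ("ECO-2", "create", 1), ("ECO-2", "review", 2), ("ECO-2", "rework", 3),
    ("ECO-2", "review", 4), ("ECO-2", "approve", 5),
]

# Group activities per case, ordered by time.
cases = defaultdict(list)
for case_id, activity, ts in sorted(events, key=lambda e: (e[0], e[2])):
    cases[case_id].append(activity)

# Count "directly follows" transitions -- the organization's real process,
# discovered from data rather than prescribed by a template.
transitions = Counter()
for trace in cases.values():
    for a, b in zip(trace, trace[1:]):
        transitions[(a, b)] += 1

for (a, b), count in transitions.most_common():
    print(f"{a} -> {b}: observed {count} time(s)")
```

Starting from the discovered flow, “best practices” become suggestions applied on top of what the company already does, not a template the company has to bend around.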

To understand it better, navigate to the Quartz article about Tesla's autopilot feature – Tesla’s master plan uses its drivers to map every lane on the road. Tesla is planning to use its own cars to map roads.

That’s why Tesla is not just in the process of creating its own maps but is deciphering where each individual lane is on every road, across the globe. It’s doing this in part by tracking every one of its Model S cars each time a customer takes a drive, to learn where traffic typically moves. The project is immense but it is necessary if autonomous cars—which Tesla expects to be a reality in three years—are to work properly.

A similar approach is used by the social navigation system Waze (acquired by Google in 2013), which uses the power of its users to capture maps and traffic.

By connecting drivers to one another, Waze helps people create local driving communities that work together to improve the quality of everyone’s daily driving. After typing in their destination address, users just drive with the app open on their phone to passively contribute traffic and other road data, but they can also take a more active role by sharing road reports on accidents, police traps, or any other hazards along the way, helping to give other users in the area a ‘heads-up’ about what’s to come. In addition to the local communities of drivers using the app, Waze is also home to an active community of online map editors who ensure that the data in their areas is as up-to-date as possible.

In both examples, capturing real-world data allows the system to improve its future behavior – navigation guidance.

What is my conclusion? The best practices approach many PLM implementations took is wrong. The current approach focuses on delivering OOTB applications with predefined functionality (best practices) and then allowing the use of product configuration tools and available APIs to change their behavior. This process is complicated, expensive and leads to significant inefficiency in PLM implementations. Now, think in reverse – PLM tools capable of capturing existing data models and processes could outsmart traditional PLM implementations. The starting point for PLM tools should not be OOTB data models and processes, but the organization's existing data and processes captured by tools. Among the many discussions about big data and predictive analytics, this should be one of the future PLM implementation technologies to focus on. Just my thoughts…

Best, Oleg

Image courtesy of winnond at FreeDigitalPhotos.net

Integration with manufacturing robots might be the next PLM challenge

August 21, 2015


Integration is one of the most painful aspects of PLM deployment and implementation, especially when you need to integrate engineering, manufacturing planning and shopfloor systems. Usually it comes down to a large amount of data synchronization between each and every system in the loop. Integration failures can slow the production process and lead to mistakes. In one of my earlier posts I discussed why the future of manufacturing will depend on solving the old PLM / ERP integration problem. Earlier visibility of product information in the manufacturing process can reduce cost and optimize the production schedule.
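
To see why syncing "each and every system in the loop" gets painful, here is a minimal sketch of one direction of one point-to-point mapping between a PLM item and an ERP material record. The field names are illustrative (loosely echoing common ERP conventions), not a real vendor API; the point is that n systems can require up to n * (n - 1) such mappings:

```python
# Hypothetical record shapes -- real PLM and ERP schemas differ per vendor.
plm_item = {"number": "PN-100", "rev": "B", "title": "Bracket", "uom": "EA"}

def plm_to_erp(item: dict) -> dict:
    """One direction of one point-to-point integration.

    Every pair of systems needs a mapping like this in each direction,
    so the number of mappings grows roughly with the square of the
    number of systems -- and each one breaks independently.
    """
    return {
        "material_number": item["number"],   # part number -> material number
        "revision_level": item["rev"],
        "description": item["title"][:40],   # truncated to an ERP field limit
        "unit_of_measure": item["uom"],
    }

print(plm_to_erp(plm_item))
```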

As the complexity of the product lifecycle grows, the role of integration becomes even more important. The following Forbes article caught my attention earlier today – In the Factories of the Future: A Conversation With Jabil’s John Dulchinos. It speaks about changes in manufacturing driven by factors such as mass customization, shrinking product lifecycles and offloading manufacturing cost to regions with low-cost labor. These trends are quite common these days. But here is where it gets really interesting – robots. Jim Lawton of Rethink Robotics speaks about smart collaborative robots. The following passage is my favorite:

Think about it – every robot we deploy is a computer. That means, going back to what I said earlier about the role of data in production environments, is that these robots will become critical in that model. Robots will be information management systems that can collect and analyze data on the floor, in real-time and make it available for interpretation.

That represents a real break-through in manufacturing allowing us to not only see what is happening now, but able to apply predictive technologies to the information. Everything from when a machine needs to be serviced to when a process needs to be adjusted will become available to us.

With that ability, we’ll no longer be simply looking at the past, but able to see ahead – a significantly more powerful tool for increasing efficiency and productivity. More compelling though, may in fact be the contribution that it makes toward accelerating innovation and creativity.

The future of collaborative robots is really exciting. At the same time, it made me think about the complexity of the product data integration between systems needed to support it. To support predictive analytics models and many other aspects of robotic operations, information about a variety of product characteristics should be available to robot information management systems. It would be interesting to learn more about the potential information and process flow – I’m sure it will pose many challenges as soon as we demand that robots make decisions about building a specific configuration requested by a customer in real time. It might represent the next level of complexity compared to traditional configure-to-order models.
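
As a thought experiment, here is a hypothetical sketch of the kind of message a robot work cell would need to consume to build one customer-specific configuration. None of this is a real robot API – the names and shapes are my assumptions – but it illustrates how much resolved product data has to reach the shopfloor in real time:

```python
from dataclasses import dataclass, field

@dataclass
class ConfiguredOrder:
    """Hypothetical payload for building one customer-specific unit."""
    order_id: str
    product: str
    options: dict                                  # customer selections
    bom: list = field(default_factory=list)        # parts resolved for THIS unit
    work_instructions: list = field(default_factory=list)

def resolve_bom(options: dict) -> list:
    """Resolve a customer configuration into a unit-specific BOM.

    In a real system this logic lives in the PLM/configurator; the robot
    only consumes the result -- which is why the integration matters.
    """
    bom = ["base-frame", "controller"]
    bom.append(f"housing-{options.get('color', 'black')}")
    return bom

order = ConfiguredOrder(
    order_id="SO-1001",
    product="smart-camera",
    options={"color": "red", "lens": "wide"},
)
order.bom = resolve_bom(order.options)
print(order)
```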

What is my conclusion? Product complexity, shortening lifecycles and cost pressure are forcing manufacturing companies to innovate in everything related to the optimization of product planning, manufacturing and shopfloor operations. I can see a new type of manufacturing line equipped with collaborative robots capable of producing a specific product configuration on demand, driven by a customer order. It will create the next challenge for PLM systems integration. Do you think traditional PLM architectures and platforms are ready for future integration with robots? This is a good question to ask PLM architects. Just my thoughts…

Best, Oleg

Image credit Rethink Robotics

PLM vs Excel: Bullfight and Prohibition

July 27, 2015


PLM has a love-hate relationship with Excel spreadsheets. PLM vendors are spending marketing dollars on campaigns to replace Excel. The latest post by Lionel Grealou caught my attention over the weekend. Navigate to the PLM vs Excel post here. In addition to the almost traditional confirmation that PLM can outperform Excel spreadsheets, I noticed an interesting statement about the fact that some organizations are prohibiting access to Excel reports. This is the passage I captured:

Some organizations considered removing access to import from / export to Excel from their PLM applications to limit uncontrolled usage of data and reports. Most PLM applications now have advanced data search, live feed dashboard which can be tailored to business needs, with Excel-like features for data mining, profiling, compiling, formatting, presenting in various chart for analysis.

The comparison of PLM vs Excel capabilities for BOM management looks like a bullfight. But honestly, I’m not sure who is who in this fight. PLM vendors have been fighting Excel spreadsheets for decades with no visible success. The number of whitepapers and sales materials trying to convince users how much damage Excel can do is skyrocketing. But customers are still using Excel.

Removing access to Excel reports is actually something new and unexpected. I have seen many situations where IT developed a bunch of Excel-based solutions to eliminate the need for end users to touch complex enterprise systems in manufacturing, supply chain and finance. Customers loved Excel and hated complex enterprise software. It reminded me of my very old blog post – Why do I like my PLM Excel Spreadsheets – with the top 5 reasons why I prefer Excel over a PLM system. To balance the PLM point of view, I can recommend another post – What does PLM need to take over Excel spreadsheets?
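
For a concrete taste of why vendors keep picking this fight, here is a minimal sketch (with made-up data) of the classic failure mode of an Excel BOM – the same part duplicated across free-standing rows with drifting attributes, which a PLM item master prevents by construction:

```python
# Hypothetical Excel-style BOM rows: each row is free-standing text,
# so the same part can drift into conflicting descriptions.
excel_rows = [
    {"part": "PN-100", "desc": "Bracket, steel", "qty": 2},
    {"part": "PN-200", "desc": "Screw M4",       "qty": 8},
    {"part": "PN-100", "desc": "Bracket, STEEL", "qty": 1},  # drifted copy
]

# The single-source-of-truth check a PLM item master does implicitly.
seen = {}
for row in excel_rows:
    prior = seen.setdefault(row["part"], row["desc"])
    if prior != row["desc"]:
        print(f"Conflict for {row['part']}: {prior!r} vs {row['desc']!r}")
```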

What is my conclusion? I’m sure you know that bulls are colorblind. Matadors used red material to mask the bull’s blood. PLM sales materials explaining the value of PLM systems over Excel spreadsheets are like red capes, and users seem colorblind to them. Prohibition also seems to me a bad way to convince users, especially in our era of consumerization and total focus on user experience. Enterprise UX is going through a paradigm shift. Old, bulky, cumbersome and hard-to-use environments that block a productive flow will be replaced with new tools. It is all about the need for speed. When each function engineers need is “15 clicks away”, you cannot expect the company to perform well. Just my thoughts…

Best, Oleg

Image courtesy of vectorolie at FreeDigitalPhotos.net

How to escape the “listing” paradigm and reinvent the ECO

June 30, 2015


Do you remember what a "hard copy listing" is? If you had a chance to write software in the 1980s, you might remember "listings" – a printed list of computer code or digital data in human-readable format. Navigate to the following Wikipedia article to refresh your memory. We don’t do listings anymore. They are gone. Wikipedia explains it in dead simple language:

Today, hard copy listings are seldom used because display screens can present more lines than formerly, programs tend to be modular, storage in soft copy is considered preferable to hard copy, and digital material is easily transmitted via networks, or on disks or tapes. Furthermore, data sets tend to be too large to be conveniently put on paper, and they are more easily searched in soft-copy form.

My long-time blogging buddy Ed Lopategui discusses common practices and challenges manufacturing companies experience with ECOs (Engineering Change Orders). I recommend taking a look at the following two articles on the GrabCAD blog: ECOs Aren’t Dead, But They Are Slow and Stupid and ECOs are stupid II: The price of unincorporated change. ECO best practices are heavily influenced by the traditional approach of the paper-oriented world, following old-school configuration management standards. The following passage explains it very well:

Most engineering change processes are rooted in very formal and traditional frameworks. ECOs can be traced back to Configuration Management (CM) practices that literally come from a time well before CAD (much less PDM/PLM) where manual drawings ruled the earth. Engineering data was neither readily portable nor widely accessible. These effective but complex practices were established in the larger, older manufacturing companies that became the first natural customers to afford PDM/PLM.. As a result, these processes live on and are perceived as absolutes. They remain relatively intact, buoyed by large company process culture despite the opportunity to evolve.

Unincorporated change is another archaic practice, a reflection of the old way of making changes to drawings. The complexity of changing documentation created the change process practice itself. The so-called "was/now" practice was related to the fact that comparing two states of the design was very complicated:

The concept of an unincorporated change was a necessary compromise in the past because of limitations in changing design data, especially in the era of manual drawings and early CAD, when drawing views didn’t just update with a click or two. Understanding the difference from one version of a design to another chiefly involved intense staring for prolonged periods of time until the change was understood and/or blindness occurred. To minimize eye strain, ECOs often highlighted the changes specifically, sometimes with designation of WAS/NOW views side by side on the form.

Now, let’s move to the modern world of software. Imagine a software engineer writing a paper document explaining the code changes he is going to implement tomorrow. He prints a "listing" and marks the "was/now" differences in the code with a yellow highlighter. Then he submits it for approval and documentation. Only after that is the actual change performed. It probably sounds strange.
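
By contrast, here is how cheap the "was/now" view is in modern software tooling – a minimal sketch using Python's standard difflib, with two toy versions of a part description standing in for design data:

```python
import difflib

# Two versions of a toy "design" -- in software, the was/now view is
# computed on demand instead of being drawn by hand on a form.
was = ["bracket: steel", "holes: 4", "finish: none"]
now = ["bracket: aluminum", "holes: 4", "finish: anodized"]

for line in difflib.unified_diff(was, now, fromfile="rev A", tofile="rev B", lineterm=""):
    print(line)
```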

Traditional PLM paradigms are very much like old-school software listings – slow and complicated. I consistently hear that engineers skip formal changes and abandon lifecycle processes because performing them is slow and complicated. Nobody wants to deal with complicated forms and processes. In many situations the process is just too complex for teams and people to deal with. I pointed out some of these problems in my earlier article – why PLM should revise NPI processes.

Agile software development has become very popular over the last decade. It introduced many concepts that can be adopted by manufacturing companies. One of the most interesting opportunities I can see is around making change management fast and easy. When it comes to change, speed is very important.

What is my conclusion? Old habits die hard. The ECO is one of them, going back 20-30 years to best practices developed before CAD systems were capable of comparing two versions of a model and visualizing the differences. Check my earlier post about how to compare versions and changes. New technologies and new practices should come and displace old "ECO listings" in an agile, paperless and easy way. Just my thoughts…

Best, Oleg

Picture credit: Wikipedia article – Computer Listing.

The brutal reality of process management for hardware startups

June 26, 2015


Startup companies and process management. These are probably two of the most conflicting notions you can think of. Everyone is familiar with Zuckerberg’s famous statement – move fast and break things. How can process management survive in such an environment? Although Facebook is more careful these days about "breaking things", startups still operate on the edge to move as fast as possible. The outcome is questionable for many hardware startups. A large majority of young companies are not delivering on time. Rough statistics say 75-80% of hardware projects on Kickstarter do not ship products on time. That piece of information is what made me write the following post last year – Why Kickstarter projects need PLM?

The blog post Speed Can Kill: the importance of process for hardware, written by Ben Einstein of Bolt, puts a great perspective on what process means for manufacturing startups dealing with atoms (not bits, like software companies). Take a look at the article – I found it very interesting. The so-called "long shadow effect" is brutal if you think about the potential impact of decisions made during the product development lifecycle. The following passage explains how it can happen:

Early mistakes often don’t have a measurable impact until first shots are coming off tooling and the manufacturing process grinds to a halt. My partner Scott calls this the “long shadow effect.” An early decision about which microcontroller to use or the shape of a housing can appear correct until months or even years later during the first production run. Sometimes parts can have exceptionally long lead times, require odd financing terms, demand manual rework, or be entirely un-moldable. None of these problems can be uncovered by moving quickly to get to production.

The article speaks about four "trunks" as the way to organize processes. I found it interesting compared to the more traditional department organization you can find in PLM implementations done at larger companies. It reminded me of my earlier blog about why PLM companies should revise NPI processes. The waterfall process is complicated and can introduce many artificial breakpoints that prevent a company from moving fast. At the same time, running without process organization will fail you at the late stages of product development and manufacturing. My favorite passage from Bolt’s blog is the following conclusion:

This is not to say that hardware startups can’t move quickly; in fact they can move faster than ever before. But the ability to go fast and build good products on time and under budget comes from process, not pure iteration. The fastest companies tend to be exceptionally organized about their product development and manufacturing process. Many people have asked us to cover this in much more detail and we’re working on a 4 part series exploring each of these trunks.

What is my conclusion? I’d change the original "breaking things" statement to fit the reality of manufacturing companies as follows – "move fast and respect process". It is easy to say and hard to implement. The complexity of products and processes is high. Products combine mechanical parts, electronics and software. Work and teams are distributed. Everything is interconnected and can create "long shadow effects". Hardware startups struggle to set up the basic elements of information and process organization. My hunch is that the companies able to organize the four trunks right will survive. This is a note for PLM architects and other people thinking about how to apply PLM technologies to agile and dynamic processes. Just my thoughts…

Best, Oleg

Image courtesy of hywards at FreeDigitalPhotos.net

How can PLM vendors compete with the manufacturing status quo?

June 4, 2015


I attended a startup event at MIT yesterday, organized by Startup Secrets with the participation of Michael Skok and Alex Osterwalder. If you’re not familiar with Alex’s books – Business Model Generation and Value Proposition Design – I certainly recommend checking them out. In my view they are good not only for startups, but for any project activity with customers. For the purpose of this conversation, think about a typical project of selling a PLM system to a customer.

One of the things that caught my special attention was the topic of inertia. Michael Skok talked about inertia as one of the most critical factors in the success of a startup or any other project. Think about it as competition with the status quo. You may think your product or project is the thing the customer needs most. But guess what? Life happens anyway. The customer manages his own priority list, and your project is probably not at the top of it (or not even among the top 10 problems).


It made me think about the PLM business, and more specifically about how PLM vendors can compete with the status quo of manufacturing companies. Think about it – all manufacturing companies today are actually manufacturing things and successfully shipping products to customers. Some of them do it better than others. However, very often, when you come to sell PLM to a customer, your activity is not driven by an emergency. It is usually in the area of information control, streamlining processes, quality improvements, better collaboration, etc.

Historically, PLM solutions came to large aerospace and automotive companies for a single reason – it was the only way for these manufacturers to produce airplanes and cars. Without these tools, companies could not manage product structures, configurations, etc. Now think about the large number of manufacturing companies using the #1 PLM software in the world – Excel. They get by with the help of CAD, ERP, Excel and other DIY solutions. This is the status quo. PLM is maybe a good idea for them, but it requires a special effort, and the balance between pains and gains doesn’t work out – this is where I found the Value Proposition Design approach really helpful.


Customers would probably consider a PLM system, but the entry barrier is too high. I can recall 3 examples of how PDM/PLM vendors removed entry barriers in the past.

1- CAD/PDM integration. Back in the 1990s, PDM vendors came up with the idea of integrating CAD and PDM systems. It was hard for engineers to use a data management system outside of the design (CAD) environment. Integrating the UI was a way to eliminate the complexity of use.

2- Out-of-the-box PLM solutions. Earlier PLM systems came in the form of a toolkit combining a data modeling environment, an API and some generic tools. Application engineers helped companies turn it into products, which required time and resources. The idea of templates that come out of the box was a good one. It moved PLM adoption one step forward.

3- Cloud PLM. IT resources and the cost of hardware were another significant problem for PLM adoption (especially for smaller companies with tight budgets). The idea of SaaS software and cloud PLM, developed over the last 3-5 years, is removing an additional barrier to PLM adoption.

What is my conclusion? Do you think PLM vendors have removed all the barriers and PLM adoption will skyrocket in the next few years? Unfortunately, I don’t think so. Cloud is a good way to remove the IT barrier. However, PLM abstraction models and implementations are still too complex. It is hard for many companies to grasp the idea of PLM and form a strategy to implement it. Consultants and advisers can help, but the scalability of that approach is questionable. Therefore, one of my earlier conclusions was – cloud is not the final way to rethink PLM. We still need to think about how to re-imagine PLM to make it affordable yet easy to implement for many manufacturing companies in the world. Just my thoughts…

Best, Oleg

Which cloud CAD data management is right for me?

April 6, 2015


The amount of data created in and transferred to the cloud is growing. You probably noticed a few of my recent blog posts about cloud CAD – The stage for cloud CAD competition and How CAD vendors “murdered” PDM business. CAD vendors are moving to the cloud, but the truth is that competition among other cloud vendors is heating up over the ability to generate content and manage it in the cloud. The following article caught my attention over the weekend – Dropbox is working on a new note-taking application. Together with a few other larger and smaller vendors, the momentum of getting our data up to the cloud is increasing.

This is probably a good time to ask a question – what products can help you organize and manage your engineering data in the cloud? A few years ago, I shared some of my thoughts about CAD file sharing in my public discussion with Hardi Meybaum of GrabCAD. You can navigate to my old post here – Debunking the cons to CAD file sharing tools.

Today I want to do a short review of the tools that have become available since then, focusing primarily on managing engineering and CAD data in the cloud.

Generic cloud data (document) management tools

Yes, there are many generic cloud data management tools. Most of them come from vendors focused on cloud data storage – Google, Microsoft, Dropbox and a few others. These tools give you a way to put your files in the cloud without much focus on what is in them (3D models, drawings, specifications, etc.)

A bit of a standout, but still a generic tool, is Box. You can learn from its website about Box's focus on industries. I shared some of my thoughts about that here – Can BOX become a platform for PLM?

Another interesting recent development is Adobe Document Cloud.

Cloud CAD data management tools

New cloud CAD systems come with a solid data management foundation. Two examples here – Autodesk A360 and Onshape data management. Both are capable of managing CAD data coming from multiple CAD systems.

You have probably heard of and seen Autodesk Fusion 360. In fact, Fusion 360 runs on top of A360 – a backbone and platform to manage data and collaborate socially around projects and changes. The project collaboration approach is a central concept of A360. You can read more here. A360 is a platform to create, collaborate and compute in the cloud, and it is capable of managing different CAD files from Autodesk and other CAD vendors. More about the features and what you can do is here.

Onshape provides core data management capabilities around Onshape documents. In my experiments with Onshape, I learned that I can upload other CAD files into Onshape documents, manage their versions and translate them into native Onshape data too. You may find this approach a bit different from A360 projects. However, we can only guess where future Onshape product development will go. I also found the concept of organization management in Onshape, which may be developed further in the future.

Cloud PDM tools

GrabCAD Workbench is probably one of the earliest cloud CAD data management tools. Workbench gives you an option to put CAD data from multiple systems into the cloud and manage versions. It is combined with the GrabCAD open engineering community, which allows you to leverage CAD data openly shared by about 2 million GrabCAD community members.

Kenesto Drive is another product you might pay attention to. After a few product pivots, Kenesto came up with the simple concept of a “Drive” – a place where you can synchronize engineering data (including CAD files, of course) and keep using this data with your desktop tools. The following video gives you an overview of what Kenesto Drive does.
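
To give a flavor of what such drive-style synchronization has to do under the hood, here is a minimal sketch of one common technique – detecting which local files changed since the last sync by comparing content hashes. This is a generic illustration, not Kenesto's actual implementation; the folder path and persisted state are assumptions:

```python
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """Hash file contents; a changed digest means the file needs re-upload."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def changed_files(folder: Path, last_sync: dict) -> list:
    """Return files whose content differs from the recorded sync state."""
    changed = []
    for path in folder.rglob("*"):
        if path.is_file():
            digest = file_digest(path)
            if last_sync.get(str(path)) != digest:
                changed.append(path)
                last_sync[str(path)] = digest
    return changed

# Usage: the state dict would persist between runs (e.g. in a local store),
# and the folder is a hypothetical local working directory.
state: dict = {}
print(changed_files(Path("./cad-projects"), state))
```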

What is my conclusion? The growing interest in uploading, sharing and managing data in the cloud will require better tools and probably a new concept of data management. Customers won’t be happy with a “double PDM tax”. I’m sure the idea of moving the existing complexity of CAD data management into the cloud won’t excite users. My hunch is that customers won’t move to a 100% cloud environment, and we will be using both cloud and desktop in parallel for some time (I don’t even want to predict for how long). So new paradigms will be developed to manage and collaborate on heterogeneous CAD and engineering data across both cloud and desktop. Just my thoughts…

Best, Oleg

Image courtesy of watcharakun at FreeDigitalPhotos.net



