PLM Technology vs Vertical Industries: Wrong balance?

April 14, 2014


Let’s talk about PLM technologies. Err… PLM is not a technology. Moreover, PLM is not even a product. So, what is it? A business strategy? Product development politics? For the sake of this conversation, let’s leave these debates aside. I want to speak about the PLM technologies that allow you to manage product data, CAD files, bills of materials, a rich set of related information, as well as the processes around it. This technology came to us about 20-25 years ago, first as a very hard-coded set of tools. You literally had to build it differently for every customer. So, it supported only large customers that were able to pay for software, infrastructure and implementation. Later on, PDM/PLM turned into a software toolkit. The next step in PDM/PLM technology evolution was called flexible data modeling. The first flexible (dynamic) PLM data modeling tools were released back in 1995-2000 and… not much has changed since then.
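To show what “flexible (dynamic) data modeling” means in practice, here is a minimal sketch in Python. It is not any vendor’s API – the names are made up for illustration – but it captures the idea: item types and their attributes are declared at runtime as data, instead of being hard-coded differently for every customer.

```python
class ItemType:
    """An item type defined at runtime, not compiled into the system."""

    def __init__(self, name: str, attributes: dict):
        self.name = name
        self.attributes = attributes  # attribute name -> expected type

    def create(self, **values) -> dict:
        # Validate against the dynamic schema before creating the item.
        for attr, expected in self.attributes.items():
            if attr not in values or not isinstance(values[attr], expected):
                raise ValueError(f"{self.name} requires '{attr}' as {expected.__name__}")
        return {"type": self.name, **values}


# An administrator (not a programmer) can declare customer-specific types:
part = ItemType("Part", {"item_number": str, "revision": str, "mass_kg": float})
bolt = part.create(item_number="P-1001", revision="A", mass_kg=0.012)
```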

So, what has happened since then? PLM vendors went on to develop out-of-the-box and vertical industry solutions in a massive way. David Linthicum’s article Salesforce.com officially is out of ideas reminded me of the joke comparing a technology play vs. an industry play. Here is the passage:

When you run out of new ways to provide innovative technology, you go vertical. That was the running joke among CTOs back in the day. It usually meant the market had reached the saturation point and you could not find new growth

I found this message very relevant to what is happening in the PLM industry. PLM vendors are trying to compete by providing a more comprehensive set of data models, best practices and process templates. By doing so, vendors want to reduce the TCO of PLM implementations. It actually brings success, and many customers are using these solutions as a starting point for their PLM implementations.

So, where is the problem? In most situations, PLM is still a costly and complex implementation. Services may take up to 50% of the cost. Here is the issue – core PLM data and process modeling technology hasn’t changed much over the last 10-15 years. Data models, CAD file management, product structure, process orchestration. All these things are evolving, but very slowly. The fundamental capabilities are the same. And it is very expensive to develop solutions using these technologies.
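For reference, “product structure” here is the classic parent-child bill of materials. A rough sketch of that fundamental capability (hypothetical data, no particular system) fits in a few lines, which is exactly the point – the core hasn’t moved much in years:

```python
from dataclasses import dataclass, field


@dataclass
class Item:
    number: str
    children: list = field(default_factory=list)  # list of (Item, quantity)

    def add(self, child: "Item", qty: int) -> None:
        self.children.append((child, qty))


def explode(item: Item, qty: int = 1, level: int = 0) -> None:
    """Print the multi-level BOM by walking the structure recursively."""
    print("  " * level + f"{item.number} x{qty}")
    for child, child_qty in item.children:
        explode(child, child_qty, level + 1)


bike, wheel, spoke = Item("BIKE-100"), Item("WHEEL-10"), Item("SPOKE-1")
wheel.add(spoke, 32)
bike.add(wheel, 2)
explode(bike)  # BIKE-100 x1 / WHEEL-10 x2 / SPOKE-1 x32
```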

You may ask me about cloud technologies. Cloud is the answer, but only partially. It solves the problems related to infrastructure, deployment and updates. Cloud provides clear benefits here. However, from the implementation technology standpoint, it is very similar to what non-cloud solutions can offer. Another interesting passage from the InfoWorld cloud computing article explains the problem new SaaS/cloud products can experience when trying to displace existing vendors:

So many companies have tried this approach — many times — but most found limited success. I can’t help but think the same will occur here. Salesforce will soon discover that when you get into vertical industries, the existing foundation of industry-specific applications is difficult to displace. Although Salesforce can always play the SaaS card, most of those industry-specific providers have already moved to SaaS or are in the midst of such a move. That means SaaS won’t be the key differentiator it was when Salesforce first provided its powerful sales automation service more than a decade ago.

What is my conclusion? Efficiency and cost. These are the two most important things that make a PLM implementation successful. So, the technology must be improved. Data and model capturing tools, flexibility and ease of use – everything must become more efficient to support the future of manufacturing processes. How to do so? This is a good topic to discuss with technology leaders and strategists. I’m going to attend COFES 2014 in 10 days. I hope to find some answers there and share them with you.

Best, Oleg


PLM Best Practices Torpedo

March 10, 2010

One of the very important aspects of PLM as enterprise software is the ability to be implemented in a fast and easy way. I think you will agree that a long implementation cycle can put your PLM project into the wastebasket, as well as your career on hold. I’m observing a strong move by PLM companies towards proposing “best practices” or so-called “industry best practices”. Such best practices are normally a set of data models, process definitions and other recommendations about how to implement a system in the organization.

The whole approach made me think it could represent the mainstream trend in PLM implementations. On the surface, it sounds like a silver bullet. You get a “ready-to-use” system with all the bells and whistles. All you need to do is run it in your organization.

However, when I thought about it more, I came to see it differently. The significant piece of these best practices is a data modeling schema. Basically, it represents the way the system works. On top of these data models, different business rules and process definitions are applied. It reminds me of taxonomies (Wikipedia: Taxonomy is the practice and science of classification) applied to the data you manage. Why am I concerned about such models? In my view, a taxonomy-based approach is good when your data is stable and you have an agreed way to organize it. Applied to a manufacturing organization, it means you have an agreed way to handle product data and the processes around it. However, this is not true in modern manufacturing organizations. Today’s organizations are dynamic and experience ongoing change due to economic, business and regulatory activities. How do you think it will work when pre-built best practices are applied?
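To make the concern concrete, here is a tiny sketch (the categories are hypothetical, not taken from any vendor’s template). The pre-built taxonomy works until the organization changes; then the schema itself has to change, together with every rule and process built on top of it:

```python
# A pre-built "best practice" classification shipped with the system.
BEST_PRACTICE_TAXONOMY = {
    "Document": ["Specification", "Drawing", "Report"],
}


def classify(doc_type: str) -> str:
    """Return the parent category of a document type."""
    for parent, children in BEST_PRACTICE_TAXONOMY.items():
        if doc_type in children:
            return parent
    raise KeyError(f"'{doc_type}' has no place in the pre-built taxonomy")


print(classify("Drawing"))  # fits the template nicely

try:
    classify("FDA 510(k) Filing")  # a new regulatory reality appears...
except KeyError as err:
    print(err)  # ...and the schema, rules and processes must all change
```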

Here I’m coming to the second question. The most important capability when implementing PLM is the ability to make changes and react quickly when your organizational processes change. Customers these days are interested in making small and lean implementations and moving in short steps. And that sounds like a contradiction with pre-defined templates and best practices: whatever you apply in the beginning will be changed later.

And my third and final question is about the initial implementation. Each organization is a unique mix of skills, rules and business processes. How can these practices be mapped to the predefined best practices? Hm… It sounds like another point where changes will be applied.

This is my take. Best practices are like a torpedo. When they come as a bundle of models, processes and rules, you need to spend organizational time applying them to the way your company does business. This is the first explosion. Over time, you’ll need to spend more time changing various aspects of the predefined pieces. This is your next explosion. After a few such explosions, I think, your model will be completely different from the original best practices.

So, what is my conclusion? What should the implementation starting point be? I believe best practices are an excellent way to show what your PLM product is capable of doing. However, as an implementation practice, they don’t work. You need a system that can capture your business practices. Once you have done that, you can change and optimize. The system needs to be flexible enough. The cost of applying initial best practices is too high. Instead, you’d better invest in a system that can capture your business processes.

Just my thoughts… It would be great if you could share your experience. I hope we’ll have a good conversation.

Best, Oleg



Best-practices aren’t good enough for PLM?

October 6, 2009

What I want to discuss today is PLM best practices. Frankly, my thoughts about the topic were accelerated by the publication of the COFES 2010 theme, “COFES 2010: Best-practices aren’t good enough“. So, it brings me to think out loud about what we call “best practices”. You can hear a lot about this in the PLM community. Best practices have become very popular and have been promoted as the way to achieve an efficient PLM implementation. You can find them in different “envelopes” and combinations – best practices, industry solutions, express offerings etc. But speaking with different people in our community, I identified two main trends, if you will, in everything related to best practices:

1. Support best practices coming from PLM vendors.
This community of people truly believes that PLM providers, by supporting best practices, will release packages that are ready to use and will be adopted very quickly by organizations. For small organizations, it will help reduce the cost of implementation. For big organizations, it will provide a framework to change the way the organization works. Customers don’t need to spend time defining processes, models and implementation-specific stuff. Just install and go…

2. Support flexible, configurable PLM software. This community is the very opposite of the previous one. From the standpoint of these people, PLM vendors don’t have sufficient knowledge to provide pre-packaged configurations. In addition, they believe in the uniqueness of product development processes in each organization (even within the same industry). In their view, PLM software vendors need to focus on producing highly configurable, flexible software that can be customized, configured and adapted to specific customer needs.

So, what is my conclusion for today? I don’t see either of these approaches as a “silver bullet”. And I definitely see advantages in both directions. I’d be interested in an open discussion with you to share and discuss your experience, vision and future thoughts.

Best, Oleg


Should I keep secrets from my PLM system?

August 12, 2009

Product Lifecycle Management pretends to manage everything in the organization related to the product – information about the product, development IP etc. A pretty long list so far. The ultimate PLM goal is to manage the product lifecycle from the initial idea until disposal. Whenever I think about such a goal, it always sounds big and ambitious. However, do we really need it to succeed with PLM?

Maybe we can learn from people’s experience in non-PLM areas. I was reading an interesting post on New Com Biz related to people’s notebooks. I discovered some interesting observations related to notebook (paper, not computer :)) management. You can see below a picture of weekly pages separated into four zones – Work, Projects, Personal, Misc. You can consider it a weekly lifecycle. This approach will not manage all your tasks, but it will give you quite good control over what is going on.

My experience in the area of digital control is related to my own implementation of GTD (Getting Things Done) in my Outlook system. If you are not familiar with the GTD approach, take a look at this book. Take a look at the picture. You can see a stream of incoming messages, contextual folders, next actions, projects etc. This is an ultimate system to control your digital life.

My Getting Things Done Folders

Now, back to PLM. Should we take the “total approach” and manage everything related to a product with PLM? The answer is probably yes, but in the long term. For the short term, please keep secrets from PLM. Don’t give the PLM system the power to make your life complex. You’d better keep your PLM implementation simple and get your things done on time.

Best, Oleg


PLM Transformation: Easy, No; Costly, Yes.

July 31, 2009

Over the last few weeks, I have had a chance to speak with a few of my colleagues about PLM implementations and technologies in different contexts. The main point of these conversations was how to make PLM implementation easy. A few interesting blog posts relate to this topic – “A PLM Success Story with ROI” by Jos Voskuil and “Why is implementing PLM Hard” by Jim Brown. A few days ago I wrote about how to move PLM to the mainstream. However, my take in that post was mostly about technologies; Jim and some of my other blogging colleagues raised a valid question – PLM is about people and organizational transformation, and therefore it cannot be easy. This is about change! Change is hard… So, my thoughts today are an attempt to analyze what PLM transformation means from both sides – technological and organizational.

Here is my short summary of the main factors that make PLM transformation a very non-trivial task in an organization.

New operational environment for many people in the organization

In most cases, the introduction of a PLM system brings a new environment to users. New data and process tools, viewing software etc. A PLM environment is normally combined from a few connected pieces – product-, process- and organization-related. Because of the complex dependencies existing between these pieces, the resulting environment is not easy. So what do we have in the organization? Confused customers and increased training budgets. A key conclusion – PLM needs to look for a Trojan horse that will help it enter the organization easily.

Complexity of Infrastructure and Implementation

PLM is naturally positioned in the middle of everything in the organization. Connecting requirements and design, design and manufacturing, and supply chains, PLM is positioned as a system that needs quite a heavy integration portion with other systems. We know that integration work in the enterprise is complicated, painful and very expensive once you start touching the main enterprise systems such as CRM, ERP, SCM and some others. So, in many cases PLM gets stuck in the middle of these integrations and requires significant effort and resources to move forward to completion. What can be improved in this integration journey? My recommendation is to make your integration project part of other projects that bring value. Even if you are 100% sure you need to integrate two systems, never start this task on its own. You will end up in an endless process of integration. Instead, make the integration achieve a specific task and show its value in results. Integration technologies are very painful; don’t try to re-invent the wheel. You will not change “the integration kingdom”. So keep your budgets in check here.
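To illustrate the “integrate for a specific task” recommendation, here is a minimal sketch. The REST endpoints and field names are hypothetical – the point is that the integration fires from a concrete business event (an engineering change release) and pushes only the released BOM to ERP, instead of trying to synchronize everything:

```python
import requests

# Hypothetical endpoints - stand-ins for whatever your PLM and ERP expose.
PLM_URL = "https://plm.example.com/api"
ERP_URL = "https://erp.example.com/api"


def on_change_released(change_id: str) -> None:
    """Triggered by the PLM release workflow, not by a batch scheduler.

    The integration does one specific, valuable task: push the released
    BOM to ERP so planning can start. Nothing else is synchronized.
    """
    change = requests.get(f"{PLM_URL}/changes/{change_id}").json()
    bom = requests.get(f"{PLM_URL}/items/{change['item_id']}/bom").json()

    # Map only the fields ERP actually needs; avoid a generic model sync.
    payload = {
        "item_number": bom["item_number"],
        "revision": bom["revision"],
        "lines": [
            {"component": line["component_number"], "qty": line["quantity"]}
            for line in bom["lines"]
        ],
    }
    requests.post(f"{ERP_URL}/boms", json=payload).raise_for_status()
```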

Changes in Processes

This is another painful point. When we start a PLM journey, we always say – we are going to change how people work, improve it, optimize it. This initial value proposition sounds great and is appreciated by many stakeholders. However, as soon as people start this process of change, they face so many organizational problems and obstacles that they either enter the next 1-2 year discussion about how they should work in the organization, or try to implement processes that have relatively low maturity. The result is obvious – a drain on resources and people’s dissatisfaction. Meanwhile, you have spent your PLM $$$ and, most importantly, the credit needed to make the PLM implementation successful. My recommendation here – keep focus and boundaries.

Content Transformation

This is last, but not least. In many cases, a PLM system comes into a place where different legacy systems, or just handmade processes, worked before. As a result, the organization faces a huge requirement to transform a lot of organizational content – metadata, documents, process information etc. – into the new system. These legacy data imports are not simple either. People worry when their data is moved; they cannot always find it. Also, the organization tends to get rid of what it doesn’t need at the same time. My conclusion: be prepared to import legacy content and handle it in your system from the beginning.
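As a rough sketch of what “be prepared to import legacy content” can look like in practice (the legacy column names and target schema below are made up for illustration), one simple tactic is to map known fields and keep everything unknown under a legacy prefix, so people can still find their old data after the move:

```python
import csv

# Hypothetical mapping from legacy export columns to the new PLM schema.
FIELD_MAP = {
    "PartNo": "item_number",
    "Rev": "revision",
    "Descr": "description",
    "Status": "lifecycle_state",
}


def import_legacy_items(path: str) -> list[dict]:
    """Read a legacy CSV export and translate it into the new schema.

    Unknown columns are kept under a 'legacy_' prefix instead of being
    dropped, so nothing silently disappears during the migration.
    """
    items = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            item = {}
            for legacy_name, value in row.items():
                target = FIELD_MAP.get(legacy_name, f"legacy_{legacy_name.lower()}")
                item[target] = value
            items.append(item)
    return items
```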

So, what is my conclusion? PLM is not an easy task. If you don’t plan it properly, you can get stuck many times on your way to transforming the company towards the future way of developing and supporting a product. As soon as you get there, you will enjoy the benefits, but the transformation will be hard and costly. You will ask me – what to do? My short answer – plan it in advance, have a good team of people committed to making it happen, and go!

PS. I tried not to talk about technologies in this post. Technologies can be helpful, but sometimes you need to see beyond magic PLM technologies. So I did, and I hope you will too. I’m open to discussions and feedback.

Best, Oleg


My Slice of PLM Single Version of Truth

June 18, 2009

I have a crazy idea to discuss today. I’d like to talk about a topic that we like very much and that is often discussed when we mention PLM. The topic is a “single version of the truth”. In my opinion, in many cases we present it as being obvious. Yes, the fundamental intention of Product Lifecycle Management is to cover a product from the initial concept until the product is manufactured, released, supported and recycled. So, having a unified way to manage products, processes and resources is one of the most important ideas behind PLM.

Today’s enterprises are becoming very dynamic: changes happen all the time; companies work with a wide range of suppliers for different purposes. How can PLM provide affordable and scalable solutions for such a dynamic eco-system? This creates a lot of challenges for a company providing product data and lifecycle management solutions. How can you get everybody synchronized in the way you run business processes, and how can you keep your PLM systems up-to-date in this environment?

So, I came to a working conclusion that I’d like to discuss. My point is that in today’s enterprise eco-system, you cannot demand that people agree on how to manage your product data and processes. Ah… I know it sounds bad, but bear with me for a few more minutes; don’t close this post :)… I think today’s data management is too complex to allow large organizations to agree on a single way of doing business and then implement a PLM system to follow this agreement. The task is too complex and takes too long. You won’t be able to finish it before you have to start a new one! So, this is probably the most fundamental problem in today’s system implementations. It is too long and too expensive because we are trying (and we need) to agree on how to implement the systems.

Here’s my 5-point view on the subject:

  1. Organizations and systems are too complex to agree on PLM-related data, processes and best practices.
  2. Successful PLM implementations need to focus on how to manage ongoing system changes.
  3. Best practices and processes in an organization will be the result of multiple changes and improvements during the PLM system implementation.
  4. The system needs to keep track of all changes (see the sketch after this list).
  5. We need a very flexible PLM system, and I don’t believe we have one yet.
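To make point 4 a bit more concrete, here is a minimal sketch of attribute-level change tracking (an in-memory toy, not a real PLM data store): every modification is recorded rather than overwritten, so the way of working can keep evolving without losing history.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ChangeRecord:
    attribute: str
    old_value: object
    new_value: object
    author: str
    timestamp: datetime


@dataclass
class TrackedItem:
    """An item whose attribute changes are recorded, never silently lost."""
    attributes: dict = field(default_factory=dict)
    history: list = field(default_factory=list)

    def set(self, name: str, value, author: str) -> None:
        self.history.append(ChangeRecord(
            attribute=name,
            old_value=self.attributes.get(name),
            new_value=value,
            author=author,
            timestamp=datetime.now(timezone.utc),
        ))
        self.attributes[name] = value


# Usage: processes change over time, but every step stays auditable.
item = TrackedItem()
item.set("lifecycle_state", "In Work", author="oleg")
item.set("lifecycle_state", "Released", author="oleg")
print([(c.attribute, c.old_value, c.new_value) for c in item.history])
```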

What’s my conclusion? I was reading Jos Voskuil’s blog post about PLM ROI yesterday and thought about why ROI for PLM is not obvious. My take today is that, probably, because our implementations are still too big and too complex, people see them as a very big and fundamental investment. So, they need to double-check themselves with many ROI calculations. Allowing ongoing changes and modifications of PLM systems will make implementations simpler and ROI calculations easier…

So, don’t keep quiet… I know you won’t all agree with each other – but let me know what you think.


Why do I Need to Change My “Out-of-the-Box PLM”?

June 4, 2009

I’d like to discuss a topic which is probably the most “non-technological” topic I have ever discussed in this blog. This is what we refer to as “best practices”. They exist in PLM, ERP and many other business and enterprise systems. But I’d like to discuss what is behind this topic, particularly the business and technological drivers that will make you change an “out-of-the-box” PLM system.

So, what are the benefits of a plain, vanilla, out-of-the-box PLM implementation?

1. No need for expensive implementation services; you just need to install it

2. You don’t need to define processes; you just need to map your organizational roles to those that already exist in the system

3. Your future PLM version will be easily implemented on top of the existing one

I’m sure, in the beginning, your first impulse will be to opt for “out of the box”. But I suggest that you look realistically at the factors that prevent you from doing so:

1. You need to integrate data with existing systems. As a result, you need to enhance your data model

2. You are running the system inside an organization with all its related business systems (ERP, CRM etc.), so you need to adapt to the existing business processes

3. You are working with OEMs/suppliers, so you need to align your processes with theirs

Therefore, how can you handle system deployment in order to prevent future hassles? Here’s how (a short sketch follows the list):

1. Data models: reuse what comes out-of-the-box and add what you need. Try to avoid changes to existing models.

2. Business processes: define processes declaratively as much as possible. For any additional implementation, keep the process definition separate from any custom code you write in a programming language.

3. Integrations: set up integration between systems as part of the business process. Try to avoid batch data transfers. Keep the logic separate.
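Here is a minimal sketch of points 1 and 2 (the modeling structures are hypothetical, not a specific vendor’s API): the out-of-the-box type stays untouched and is extended by a subtype, and the process lives as declarative data so it can be changed without rewriting code:

```python
# Out-of-the-box type shipped by the vendor - left unchanged (point 1).
OOTB_PART_TYPE = {
    "name": "Part",
    "attributes": ["item_number", "revision", "description"],
}

# Customer-specific needs are added as a subtype, not as modifications.
CUSTOM_PART_TYPE = {
    "name": "ElectronicPart",
    "extends": OOTB_PART_TYPE["name"],
    "attributes": ["rohs_compliant", "manufacturer_part_number"],
}

# The release process is declarative data, not code (point 2), so it can
# be changed and migrated without touching custom logic.
RELEASE_PROCESS = {
    "name": "release",
    "steps": [
        {"state": "In Work", "next": "In Review", "approver": "engineer"},
        {"state": "In Review", "next": "Released", "approver": "manager"},
    ],
}


def next_state(process: dict, current: str) -> str | None:
    """Advance declaratively: the transition is looked up in data."""
    for step in process["steps"]:
        if step["state"] == current:
            return step["next"]
    return None


print(next_state(RELEASE_PROCESS, "In Work"))  # -> "In Review"
```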

In short, here’s my conclusion: (1) you cannot implement totally out-of-the-box; (2) you need to minimize the amount of customization you do; (3) you need to apply tools and technology that minimize the cost and time of future migrations.

I’d like to point out which tools and technologies can help optimize the cost of your implementation:

1. Use declarative tools as much as possible, as well as the tools provided by your vendor (for customization, development, scripting etc.)

2. Use standards-based customization where possible (e.g. BPMN for process management and workflow)

3. Use low-cost customization and development tools for easy implementation. Your services will cost less, and in the future it will be easier to find suppliers for the next customization.

To sum up, these are basic, if not obvious, principles. I’d like to hear whether or not you think they are applicable, and in which instances.

