You need a PLM project to fail… to start the lifecycle

June 23, 2014


One day you discover that your PLM implementation project is not doing so well. It happens, and it is called failure. Scott Cleveland's blog took me back to the topic of PLM implementation failures. Unfortunately, I couldn't find a link to the CIMdata research to read the paper mentioned by Scott. According to the post, wrong scoping and failure to get buy-in from users are at the top of the list. Not a surprise.

Failure is not such a rare situation in the IT world. Google "IT project failure" and you can quickly discover that 68% of all IT projects fail. A few months ago, I had a long discussion around my Why PLM stuck in PDM? article on LinkedIn. I cannot publish all comments from a closed discussion group, but the question of how to identify a PLM project failure was one of the dominant topics in the discussion. Guess what… there was no agreement on how to identify a "failed PLM project". A few other notable publications on PLM failure: PLM Failure, you didn't see anything; keynote presentation by Martin Eigner at the ProSTEP iViP Symposium 2009.

Unfortunately, most PLM events and publications paint a shining picture of success in their PLM references. The problem is that all these successes look the same. It is time to remember the Leo Tolstoy passage from Anna Karenina – happy families are all alike; every unhappy family is unhappy in its own way. One of the interesting places to learn about failures is FailCon – the conference for startup founders to study their own and others' failures. There is no PLM FailCon… yet. And I doubt companies would be ready for one.

Reading and discussing PLM failure articles made me think that you really want your first PLM project to fail. Why so? Here are a few thoughts…

Challenge the status quo. As people often say, PLM exists in every manufacturing company. You do product lifecycle management through the way you manage product development processes, store data, and communicate within the organization and with outside contractors. On the first attempt, you will try to build a PLM system that mimics all existing practices. I've heard it many times – if you have a mess, don't bring in a computer. Otherwise, you will have a computerized mess. First fix the mess, then bring in computers.

Get rid of outdated stuff. Every manufacturing company usually trails lots of legacy software and practices. It is hard to cut the cord, switch, and leave the outdated stuff behind. A PLM project failure can bring awareness to the problem and force the company to make a change. It is hard for a company and its people to admit they are doing something wrong, especially if they have been doing it for many years.

Learn as you go. You have the best chance to learn when you actually do something. Regardless of your experience, every manufacturing company is different. How do you see whether a new system will fit? Put it into action! When it comes to people, the only way to see if it works is to try it. Then you fail, and only after that do you find the way to do it right.

Think about your PLM system the same way you think about product development processes. Your design doesn't fit the manufacturing plan, some requirements fail to be communicated and some get misunderstood. Your first manufactured item fails and you need to fix the issues. These are absolutely normal things for every manufacturing company. Your PLM is not much different.

What is my conclusion? "Failure is not an option" is probably the wrong PLM implementation strategy. Quite the opposite: you need to bring the system in fast, engage with users, fail, fix it and bring it back fixed. Lifecycle. This is the only way to make it right. Just my thoughts…

Best, Oleg


Traditional PLM has reached its limits

June 11, 2014


Selling PLM to small and medium enterprise (SME) companies is a challenging task. I guess many of my readers will agree with me. I expressed some of my thoughts here – Why PLM stuck to provide solutions for SME? In contrast, large manufacturing companies, especially in the aerospace, automotive and defense industries, have always been a sweet spot for selling PLM solutions. Actually, not any more…

Earlier this week, my attention was caught by a CIMdata article – CIMdata Announces the Formation of an Aerospace & Defense PLM Action Group. Here is how CIMdata President Peter Bilello defines the objective of the group:

“PLM solution providers continually deliver new products, architectures, and solutions to market, while industrial customers must cope with previous product launches, attempting to realize the value from existing PLM investments. The Aerospace & Defense PLM Action Group will define and direct the research efforts on key areas needed to meet future challenges. Current experiences with PLM-related implementations, best practice research, and close examination of emerging technologies will help define what the PLM solution providers should be offering.”

Another article, by ConnectPress, speaks about the top three challenges of PLM in the A&D industry – global collaboration, integration and obsolescence management.

#CIMdata Aerospace & Defense #PLM Action Group Addresses The Big 3: http://goo.gl/flMUx4 via @ConnectPressLtd

Integration is a topic that is near and dear to my heart. In my view, the future of manufacturing will heavily depend on solving old integration problems. Multiple enterprise software systems are a reality in all large manufacturing companies. I guess aerospace and defense companies are an absolutely extreme case of multiple systems integrated together. This is a place where existing PLM systems might have a challenge. Here is my favorite passage from the ConnectPress article:

According to Roche, most major aerospace companies make a major investment in PLM, either changing to a new system or upgrading their current system, roughly every five to 10 years. But in more recent iterations of this cycle aerospace companies have been spending more money and seeing smaller returns on their investments. The reason for this appears to be because the traditional PLM components have reached the limits of what they can offer.

The areas mentioned as expected to bring the maximum ROI from PLM investment are non-core PLM domains such as requirements management, configuration management, change management, service lifecycle management, etc.

It made me think that integrating these solutions with core PLM modules can introduce a significant problem. For most PLM systems, the architecture and technologies of core functions such as CAD data management and BOM management were designed 10-15 years ago. Connecting heavily customized core PLM modules with expanded PLM solutions can bring significant service and implementation expenses.

In my view, the following four major paradigms used by existing core PLM modules will require some sort of architectural upgrade to take them to the next level of integration in large global companies: 1. global data management; 2. concurrent design and related content access; 3. management of multiple Bills of Materials; 4. cross-system data federation and integration.
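To make the last point a bit more concrete, here is a minimal sketch of the kind of reconciliation a cross-system federation layer has to perform continuously. All class names, fields and part numbers (PlmItem, ErpItem, P-100, etc.) are hypothetical and invented purely for illustration; they do not reflect any specific PLM or ERP product API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PlmItem:
    """Engineering (EBOM) view of a part -- hypothetical, simplified."""
    part_number: str
    revision: str
    description: str
    cad_file: Optional[str] = None

@dataclass
class ErpItem:
    """Manufacturing (MBOM) view of the same part -- hypothetical, simplified."""
    part_number: str
    revision: str
    plant: str
    unit_cost: float

def federate(plm_items, erp_items):
    """Join the engineering and manufacturing views on part number and
    flag revision mismatches -- the kind of reconciliation a cross-system
    federation layer has to do continuously."""
    erp_by_pn = {item.part_number: item for item in erp_items}
    merged = []
    for plm in plm_items:
        erp = erp_by_pn.get(plm.part_number)
        merged.append({
            "part_number": plm.part_number,
            "description": plm.description,
            "eng_revision": plm.revision,
            "mfg_revision": erp.revision if erp else None,
            "unit_cost": erp.unit_cost if erp else None,
            "revision_mismatch": bool(erp) and erp.revision != plm.revision,
            "missing_in_erp": erp is None,
        })
    return merged

if __name__ == "__main__":
    plm = [PlmItem("P-100", "B", "Bracket"), PlmItem("P-200", "A", "Shaft")]
    erp = [ErpItem("P-100", "A", "Plant-1", 12.5)]
    for row in federate(plm, erp):
        print(row)
```

Even this toy example shows why the problem is hard: the two systems hold different revisions of the "same" part, and the federation layer has to decide which one to trust and how to keep them in sync.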

What is my conclusion? Redesigning the core PLM functions can be an interesting challenge for major PLM suppliers. In my view, this is something that will require a significant revamp of existing platforms and data management paradigms. The cloud can help to solve the global collaboration challenge. Will the cloud help vendors solve the problem of multiple-system integration? It looks to me like a good topic to discuss. Just my thoughts…

Best, Oleg


Why PLM stuck in PDM?

April 5, 2014


I've been following the CIMdata PLM Market & Industry Forum earlier this week on Twitter. If you are on Twitter, navigate here or search for the #PLM4UM hashtag. The agenda of the PLM forum is here. The following session discussed one of my favorite topics – PDM vs. PLM: "PLM: Well Beyond Just PDM" by Peter Bilello. This passage explains what the session is about:

CIMdata’s research reveals that leading industrial companies are looking to expand beyond PDM functionality to truly enable a more complete PLM strategy. This becomes even more important in a circular economy. In this presentation, CIMdata will discuss which areas are most important, and what opportunities they create for PLM solution and service providers.

My attention was caught by the following tweets coming from this session:

According to CIMdata, leading Mfrs are now looking to move beyond PDM. #PLM4um
— ScottClemmons (@ScottClemmons) link to tweet.

Peter B / CIMdata explains that it’s hard to find a ‘real’ end-to-end #PLM implementation that works #plm4um
— Marc Lind (@MarcL_) link to tweet.

It made me think about why, after so many years of PLM implementations, most vendors are still solving mostly PDM problems for customers, and why it is hard to move on to broad downstream and upstream adoption of PLM beyond CAD data management functions. Here are my four points explaining in a nutshell why I think "PLM stuck in PDM".

1- Focus on design and CAD.

Most PLM vendors historically came from the CAD domain. Therefore, the PLM business for them was an expansion of their CAD, design and engineering business. As a result, use cases, business needs and customer focus were heavily influenced by the design domain. The result – a PDM focus was the clear priority.

2- PLM is a glorified data management toolkit

The initial focus of many PLM systems was to provide a flexible data management system with an advanced set of integration and workflow capabilities. There are many reasons for that – functionality, competition, enterprise organization politics. Flexibility was considered one of the competitive advantages PLM could provide to satisfy the diversity of customer requirements. It resulted in complicated deployments, expensive services and a high rate of implementation failures.

3- Poor integration with ERP and other enterprise systems

PLM sits on the bridge between engineering and manufacturing. Therefore, in order to be successful, integration with ERP systems is mandatory. However, PLM-ERP integration is never easy (even these days), which puts up a barrier to deploying a PLM system beyond the engineering department.

4- CAD oriented business model

Because of their CAD and design roots, PLM sales were always heavily influenced by CAD sales. Most PLM systems initially came to market as extensions of CAD/PDM packages. With an unclear business model and complicated support from VARs and service companies, mainstream PLM deployment was always focused on how not to slow down CAD sales.

What is my conclusion? Heavy CAD roots and a traditional orientation toward engineering requirements keep existing PLM systems from expanding beyond PDM for midsize manufacturing companies. The success rate of large enterprise PLM is higher, but it comes at a high price, including heavy customization and service offerings. Just my thoughts…

Best, Oleg


PLM Is Challenged With Compliance Data

October 24, 2013


Manufacturing is going global. This is not about the future. This is the reality of all manufacturing companies today. With such a reality, supply chain solutions are getting more and more traction and interest from IT managers and other people in companies. To use the language of one of my sales-exec friends, the supply chain solution is turning from a vitamin into a painkiller. Which means a sales opportunity.

At the same time, the supply chain is not a new topic for manufacturing companies. Very often, PLM companies focus on the supply chain with "CAD in mind", which makes them look mostly at design supply solutions. Many supply chain management opportunities go directly to ERP vendors and other vendors specializing in vertical supply chain management solutions. In my view, data is becoming one of the most critical topics in every supply chain solution. In most situations, the data needed for supply chain solutions is "stuck in the middle" between ERP, PLM and SCM. It is combined from design data, manufacturing data and supplier data. It is not easy. It requires openness and transparency between multiple systems. Let's tell the truth – it doesn't work well in existing enterprise systems. These systems were never designed for openness and transparency.
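As an illustration only, here is a minimal sketch of how the data needed for a single supply chain decision – a compliance check, for example – ends up scattered across PLM, SCM and ERP, and what a combined view might look like. Every field name, value and threshold below is hypothetical and invented for the example; real compliance schemas (RoHS, REACH, conflict minerals) are standardized and far more detailed.

```python
# Hypothetical, simplified records: no real PLM/ERP/SCM schema is implied.

plm_record = {            # design view: what the part is made of
    "part_number": "P-100",
    "material": "ABS",
    "substances": {"lead_ppm": 40, "cadmium_ppm": 5},
}

scm_record = {            # supplier view: who supplies it and what they declared
    "part_number": "P-100",
    "supplier": "Acme Plastics",
    "rohs_declaration": True,
}

erp_record = {            # manufacturing/sales view: where the part is sold
    "part_number": "P-100",
    "sold_in": ["EU", "US"],
}

LEAD_LIMIT_PPM = 1000     # illustrative threshold only

def compliance_view(plm, scm, erp):
    """Combine the three partial views into one compliance record."""
    return {
        "part_number": plm["part_number"],
        "supplier": scm["supplier"],
        "markets": erp["sold_in"],
        "supplier_declared_rohs": scm["rohs_declaration"],
        "lead_within_limit": plm["substances"]["lead_ppm"] <= LEAD_LIMIT_PPM,
    }

print(compliance_view(plm_record, scm_record, erp_record))
```

The point is not the code – it is that none of the three systems holds the full picture, so a "common dataset" has to be assembled from all of them and kept consistent.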

The topic of supply chain compliance was discussed during the recent PI Congress in Chicago two weeks ago. I found a good summary of the PI Congress provided by CIMdata here. Here is an interesting passage from this write-up related to the supply chain compliance topic:

The first [panel] focused on supply chain transparency and traceability. This issue occurs at the intersection of PLM and ERP, and is critically important to firms that must increasingly compete on a global basis. The panel agreed there was a need for a common dataset on compliance issues, a common problem when selling in many countries. They recognized that PLM solution providers are challenged to provide this information in a timely fashion, and challenged the audience to help find or create new alternatives.

A common dataset is an interesting topic. It made me think about the trend towards open data, which I see as part of the broader discussion about the BigData opportunity these days. In that context, I want to mention the Open Data initiative led by organizations such as the Open Data Institute (ODI) and the Open Knowledge Foundation (OKF). The first one, ODI, was founded by Tim Berners-Lee. The topic of open data is complex and I hope to speak about it in future blog posts. You can find some information about open data on the ODI website here. Another thing derived from open data is the Open Definition initiative. These are all new initiatives that are mostly unknown in the enterprise software domain.

What is my conclusion? I think we are at the beginning of a data revolution. To provide better solutions these days, we need to have data available. That includes openness and data access. It also relates to company data stored on company servers and, even more important, open data outside of organizations that must be available to everyone. In my view, a common compliance dataset is a perfect example of open data that must be available to enable future development of PLM supply chain solutions. Just my thoughts…

Best, Oleg

image courtesy of Open Knowledge Foundation.


PLM, BigData and the Importance of the Information Lifecycle

September 25, 2013

BigData is trending these days. It goes everywhere. Marketing people are in love with the name. It carries such a good flavor of "big something" – big dollars, the big amount of problems it is supposed to solve, or a potentially big value proposition. Net-net, the number of people and articles around you referring to the big data opportunity is probably skyrocketing. If you want to read more about big data, navigate to the following Wikipedia article – it is a good starting point.

CIMdata, a well-known PLM advisory outfit, recently published an interesting paper about PLM and BigData. Navigate to this link, download the research paper (it requires registration) and have a read. I'd say this is the best reference on the intersection of the PLM and BigData worlds. Here is what the document is about:

This paper focuses on the intersection of PLM and what has come to be known as “Big Data.” The increasing volume and growth rate of data applicable to PLM is requiring companies to seek new methods to turn that data into actionable intelligence that can enhance business performance. The paper describes methods, including search-based techniques, that show promise to help address this problem.

Search and analytics are one way to dig into the big data problem. Last year, I wrote about why PLM vendors need to dig into BigData. Here is the link to my post – Will PLM vendors dig into Big Data?. I believe BigData can provide huge value to an organization, and unlocking this value is extremely important. However, looking at the BigData hype these days, I get a feeling of wrong priorities and some gaps between the vision of BigData and the reality of PLM implementations.

I've been reading an ITBusinessEdge article – Three Reasons Why Life Cycle Management Matters More with Big Data. The main thing I learned from this article: even though big data is going to change a lot, it won't change some fundamental data management laws. The data lifecycle is one of them. Here is my favorite passage:

“With Big Data, which can be unpredictable and come in many different sizes and formats, the process isn’t so easy,” writes Mary Shacklett, president of technology research and market development firm Transworld Data. “Yet if we don’t start thinking about how we are going to manage this incoming mass of unstructured and semi-structured data in our data centers.

It means a lot in the context of PLM systems. This is where I can see the biggest gap between BigData and PLM. It is easy to collect data from multiple sources. That's what everybody talks about. However, big data needs to be managed as well, together with the other information managed by PLM. Big data goes through a lifecycle of processing, classification, indexing and annotation. Connecting the pieces and relating big data to the information in the PLM system is a significant problem to think about. Engineers and other people in the company probably won't be interested in accessing the data itself, but in analytics, insights and recommendations.
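Here is a minimal sketch of that lifecycle, assuming nothing about any particular PLM or big data product – every stage, field and part number below is invented for illustration only:

```python
from collections import defaultdict

# Hypothetical raw records collected from multiple sources (field reports,
# sensors, supplier feeds). All fields and part numbers are invented.
raw_records = [
    {"source": "field_report", "text": "Bearing overheating on unit 7"},
    {"source": "sensor",       "text": "Vibration spike on pump P-100"},
    {"source": "supplier",     "text": "Late delivery of bracket P-200"},
]

def process(record):
    """Processing: normalize the raw record."""
    return {**record, "text": record["text"].lower().strip()}

def classify(record):
    """Classification: attach a coarse category based on simple keywords."""
    keywords = {"overheating": "quality", "vibration": "quality", "delivery": "supply"}
    record["category"] = next(
        (cat for word, cat in keywords.items() if word in record["text"]), "other")
    return record

def annotate(record):
    """Annotation: link the record to PLM items it mentions (hypothetical part numbers)."""
    record["related_parts"] = [t.upper() for t in record["text"].split() if t.startswith("p-")]
    return record

def index(records):
    """Indexing: build an inverted index so the data becomes searchable."""
    idx = defaultdict(list)
    for i, record in enumerate(records):
        for token in record["text"].split():
            idx[token].append(i)
    return idx

lifecycle = [annotate(classify(process(r))) for r in raw_records]
search_index = index(lifecycle)

print(lifecycle)
print(search_index["p-100"])   # records that mention part P-100
```

The value for engineers is in the last two steps – the indexed, annotated records linked back to parts and products – not in the raw data itself.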

What is my conclusion? The value behind big data is huge. It can improve decision making, quality of service, supplier bids and a lot of other things. However, it puts huge pressure on IT and the organization in terms of resources, data organization and data infrastructure. PLM systems won't be able to start with big data overnight. Whoever tells you "now we support big data" is probably too marketing-oriented. PLM will have to focus on the data lifecycle to bring realistic big data implementation plans to the organization. Just my thoughts…

Best, Oleg


PLM and The Art Of Simplicity

April 12, 2013

For many years, enterprise software was known as a place where the development of new features was one of the main priorities. Having a comprehensive list of features was considered an absolute necessity. An army of salespeople and advisers spent an enormous amount of time validating and comparing features of enterprise systems. From the very beginning, PLM was considered an extremely complex discipline. Product development methodology and engineering culture made PLM what it is now – overloaded with features, struggling with user experience and with the ability to achieve fast and broad PLM adoption in any company.

It looks like we are going to see some changes in this space. It is not just about small manufacturing shops anymore. Companies like Boeing are complaining about PLM software usability and looking for ways to solve that problem.

To confirm what I said above, take a look at the slide presented yesterday at the CIMdata PLM Forum in Ann Arbor. CIMdata defines the simplification of PLM as one of the key challenges PLM faces with customers.

However, bringing simplicity is not a simple thing to do. CIMdata confirms that making the complex simple is a significant development undertaking. CIMdata defines it as "art and science" at the same time.

What is my conclusion? We are going to see a big change in PLM development. PLM developers, focused just a minute ago on how to add one more feature to the product, will take a step back and think about user experience and simplicity. This is the result of many years of customer disappointment in the way PLM systems are implemented. Hiding the complexity of data models and processes makes total sense. People like "everything simple" these days. Vendors must take notes. Just my thoughts…

Best, Oleg


CIMdata PLM Forum: PLM Never Ends

April 10, 2013

I attended the North American PLM Market & Industry Forum organized by CIMdata earlier today. CIMdata runs these forums across different geographies. Navigate to the following link to learn more about future locations and forums. Here you can see the agenda. I've done some calculations. The pure presentation time was about 6 hours. CIMdata planned to present a total of about 369 slides. That means attendees were supposed to digest slides at an average speed of about 1.02 slides/min (369 slides over roughly 360 minutes). The top slide/min speed I captured was during the Big Data and PLM presentation – Ken Amman's performance was 1.86 slides/min.

The amount of information shared by CIMdata was huge. There is no sense in copying and pasting all the graphs and charts. I will take time to digest it and will probably come back later with some thoughts and ideas. Nevertheless, three topics stand out in the overall stream of information presented by CIMdata earlier today, and I wanted to share some of my thoughts about them. These topics are software and service revenues, the PLM evolution chart and collaboration.

PLM Revenues: Software vs. Services

An interesting piece of information was presented by Peter Bilello during his State of PLM presentation. The following slide shows the overall state of the 2012 PLM market. The data point that caught my attention was software vs. services revenue growth.

According to CIMdata, in the cPDM/PLM segment of the market, services grew more slowly than software. Traditionally, the service component of a PLM implementation has been significant. It is not unusual to see a 50/50 split of software and services revenues. What does this 2012 data point mean? Is it a local 2012 anomaly, or does it represent a trend towards different ways of implementing PDM/PLM solutions? An interesting question to ask. I hope CIMdata will follow up on this topic with additional research.

PLM technologies

It is always interesting to see how analysts present the history of PLM. I found the following slide showing the evolution of the PLM market quite interesting. Here is the main reason why. I'm sure you are familiar with theories presenting evolution as a spiral pattern.

Interestingly enough, the PLM evolution slide doesn't address the spiral of evolution. According to that slide, the evolution of PLM went from data and technology to processes and the "bottom line" of business solutions. However, we need to remember the massive, disruptive technological innovation happening around us now with web, mobile, big data, open source, etc. Many legacy PDM/PLM solutions were built using the technologies of the 1980s. Do you think the technology of the 1980s and 1990s is keeping up with the bottom line of processes and business solutions? I don't think so; therefore, I'm looking to see the next spiral of PLM technologies. New technology will drive change across the whole solution chain.

Re-think Collaboration?

Last, but not least – collaboration. The PDM/PLM industry has spent a significant amount of time over the last two decades working on solutions for collaboration. Nevertheless, the following research presented by John MacKrell demonstrates that collaboration is actually a weak link in the landscape of PLM systems engineering solutions.

 

People are expecting vendors to make improvements in all aspects of collaboration – people, data and processes. There are two main reasons for that, in my view. Traditionally, vendors have had difficulties with openness and data access. It leads customers to disappointment and anger about the unwillingness of vendors to change their strategy. On the other side, modern web and social networking tools provide good examples of collaboration – Skype, Facebook, Twitter, Google+, LinkedIn, to name only a short list of available products and technologies. In my view, it is time to re-think collaboration by reusing the social web paradigm and modern web technologies.

What is my conclusion? I took the title of this blog post from one of the final tweets from the CIMdata PLM forum: PLM never ends. PLM has deep connections to product development processes and innovation. You cannot stop the innovation process. If you don't do it, innovation will happen anyway, just in another place. I think PLM vendors need to remember that. Technology is being democratized these days, and the question of how to democratize technology becomes more and more relevant. Just my thoughts…

Best, Oleg

