PLM implementations: nuts and bolts of data silos

July 22, 2014

[Image: data-silos-architecture]

Data is an essential part of every PLM implementation. It all starts from data – design, engineering, manufacturing, supply chain, support, etc. Enterprise systems are fragmented, each representing an individual silo of the enterprise organization. Managing product data located in multiple enterprise data silos is a challenge for every PLM implementation.

To "demolish enterprise data silos" is a popular topic in PLM strategies and deployments. The idea of having one single point of truth is always in mind of PLM developers. Some of my latest notes about that here – PLM One Big Silo.

The MCADCafe article – Developing Better Products is a “Piece of Cake” by Scott Reedy – also speaks about how a PLM implementation can help aggregate product development information scattered across multiple places into a single PLM system. The picture from the article presents the problem:

[Image: product-data-silos]

The following passage is the most important, in my view:

Without a PLM system, companies often end up with disconnected silos of information. These silos inhibit the ability to control the entire product record and employees waste unnecessary time searching for the correct revision of the product design. As companies outsource design or manufacturing, it becomes even harder to ensure the right configuration of the product is leveraged by external partners.

Whether your company makes medical devices, industrial equipment, laptops, cell phones or other consumer products – PLM provides a secure, centralized database to manage the entire product record into a “Single Record of the Truth”… With a centralized product record, it is easy to propose and submit changes to the product design, track quality issues and collaborate with your internal teams and supply-chain partners.

The strategy of "single record of truth" is a centerpiece of each PLM implementation. However, here is the thing… if you look on the picture above you can certainly see some key enterprise systems – ERP, CRM, MES, Project and program management, etc. PLM system can contain scattered data about product design, CAD files, Part data, ECO records, Bill of Materials. However, some of the data will still remain in other systems. Some of the data gets duplicated. This is what happens in real world.

It made me think about 3 important data architecture aspects of every PLM implementation: data management, data reporting and data consistency.

The data management layer focuses on which system controls the data and provides the master source of information. Data cannot be mastered in multiple places. The implementation needs to organize a logical split of information as well as the ability to control the "data truth". This is the most fundamental part of the data architecture.

Data reporting focuses on how PLM can extract data from multiple sources and present it in a seamless way to the end user. Imagine you need to provide an "open ECO" report. The information can reside in PLM, ERP and perhaps other sources. Getting the right data at the right moment in time is another problem to resolve.
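To make the idea concrete, here is a minimal sketch (in Python, with made-up record names) of how such an "open ECO" report could be stitched together from two sources, assuming PLM is treated as the master when the same ECO appears in both systems:

from dataclasses import dataclass

@dataclass
class EcoRecord:
    eco_id: str
    status: str
    source: str  # system the record came from

def open_eco_report(plm_records, erp_records):
    """Merge ECO records from PLM and ERP, keeping only open ones.
    When both systems know the same ECO, the PLM record wins the merge."""
    merged = {}
    for rec in erp_records + plm_records:  # PLM processed last, so it overwrites ERP duplicates
        if rec.status.lower() == "open":
            merged[rec.eco_id] = rec
        else:
            merged.pop(rec.eco_id, None)  # an ECO closed in the master system drops out
    return sorted(merged.values(), key=lambda r: r.eco_id)

# usage with hypothetical data
plm = [EcoRecord("ECO-100", "Open", "PLM"), EcoRecord("ECO-101", "Closed", "PLM")]
erp = [EcoRecord("ECO-100", "Open", "ERP"), EcoRecord("ECO-102", "Open", "ERP")]
for rec in open_eco_report(plm, erp):
    print(rec.eco_id, rec.status, "from", rec.source)

Even this toy version shows where the real work is – deciding which system is the master and how conflicting records are reconciled.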

Last, but not least – data consistency. When data is located in multiple places, the systems rely on so-called "eventual consistency" of information. A system of events and related transactions keeps the data in sync. This is not a trivial process, but many systems operate this way. What is important is to have a coordinated data flow between the systems that supports eventual consistency, together with data management and reporting tools.
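Here is a toy illustration of what "eventual consistency" means in practice – a hypothetical event bus where PLM publishes a change and ERP catches up later. The system names and event fields are illustrative only:

import queue

class EventBus:
    """Toy event bus: one system publishes change events, others replay them later."""
    def __init__(self):
        self._events = queue.Queue()

    def publish(self, event):
        self._events.put(event)

    def drain(self, handler):
        # Consumers apply events whenever they run; until then the copies diverge.
        while not self._events.empty():
            handler(self._events.get())

bus = EventBus()
plm_parts = {"P-100": {"rev": "B"}}
erp_parts = {"P-100": {"rev": "A"}}  # stale copy

# PLM releases a new revision and publishes the change
plm_parts["P-100"]["rev"] = "C"
bus.publish({"type": "part.revised", "part": "P-100", "rev": "C"})

# ERP catches up asynchronously – eventual, not immediate, consistency
bus.drain(lambda e: erp_parts[e["part"]].update(rev=e["rev"]))
print(erp_parts["P-100"]["rev"])  # -> "C"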

What is my conclusion? Demolishing silos and managing a single point of truth is a very good and important strategic message. However, when it comes to the nuts and bolts of implementation, an appropriate data architecture must be in place to ensure you will have the right data at the right time. Many PLM implementations underestimate the complexity of data architecture. It leaves them with marketing slogans, burned budgets and wrong data. Just my thoughts…

Best, Oleg

Picture credit: MCADCafe article.


PLM Implementations Challenges and 3 Organizational Lenses

June 4, 2014

[Image: 3-lenses-organization]

It is not unusual to hear people speaking about PLM implementation and the changes that need to be made in the organization. Very often, PLM vendors or implementers call this process business transformation, which is supposed to change everything related to product design, engineering, manufacturing, support and services.

So, implementing PLM is hard. I can admire the power of some PLM technologies. At the same time, getting customers to implement and use them takes time and energy. PLM vendors understand that. PLM service companies aim to make implementations successful by understanding organizational specifics, adapting business processes and configuring PLM software.

It made me think about how PLM implementation challenges can be mapped onto some fundamental organizational behaviors. Organizations are often examined through 3 fundamental lenses: (1) strategic, (2) political, and (3) cultural.

PLM and Strategic Lens

The strategic lens is the most often applied perspective. Under this lens, managers look at how to optimize work and meet corporate goals. This lens covers processes and procedures, and it is where all the benefits of PLM can shine. However, very often, the application of PLM as a strategic transformation triggers significant organizational turbulence in everything related to processes and procedures. People dislike the change and tend to spend a lot of time discussing how to make an optimal strategic process alignment. It creates lots of confusion and can derail a PLM implementation.

Mapping existing organizational processes can be a good starting point for overcoming the challenges a PLM implementation faces when changing processes and procedures. A good approach can be to apply changes to specific processes without changing the whole organization in a single shot.

PLM and Political Lens

The political lens looks at the distribution of power and influence. This is one of the most complicated parts – it is where the organization distributes power and authority. The major challenge for PLM is the need to cross organizational silos. Organizational silos distribute power by separating data, people and responsibility. They also separate the IT stack and data ownership.

Bridging organizational silos can be a good way to optimize organizational behaviors and establish "political contracts". PLM can be a factor that consolidates people and helps them turn organizational silos into "cylinders of excellence".

PLM and Cultural Lens

This is probably the least clear of the three. It reflects underlying attitudes and beliefs. In many situations it reflects how the culture and history of the company affect its decisions. This is where you can expect lots of historical "PLM pitfalls" to happen. You need to understand the motives of the people in power in order to understand and predict their decisions. That will help you influence them and increase PLM adoption.

What is my conclusion? Don’t ignore fundamental organizational structures and mechanisms. The strategic, political and cultural lenses can give you a good model to survive a PLM implementation and make it successful for the organization in all aspects. Just my thoughts…

Best, Oleg


PLM Services, Department Stores and Digital Future

June 2, 2014

[Image: plm-department-stores-services-amazon]

Don’t be surprised if your most trusted CAD/PLM service provider is acquired tomorrow. According to Joe Barkai’s post – PLM Service Providers Ready To Deliver Greater Value – we have been witnessing a wave of mergers and acquisitions of PLM services companies (examples – Accenture / PRION Group, Accenture / PCO Innovation, KPIT-Tech / I-Cubed / Akoya; the Kalypso / Integware merger). The following passage gives you a feeling for the core reason behind it.

For years, PLM companies focused more on PLM /PDM implementation than on actually improving business processes. While the business benefits of PLM were well articulated and supported by rosy ROI models and complex colorful architecture slides, many manufacturing companies were unable to achieve the process changes and enterprise software integration that were need to reap the promised benefits, and ended up implementing a PDM system. Albeit critical for managing product data, this reality might explain why some manufacturers feel they might have overpaid for their PLM implementation efforts.

The status quo may be changing, and organizations that have gone through massive implementation projects are ready for more. They need to improve their capacity for more complex multidisciplinary decisions using product data, whether it’s stored in PLM/PDM, ERP or in other, less structured forms; they need to improve collaboration in elongated and fragmented design partner networks and supply chains; they need to leverage product and consumer insight garnered from social media, warranty claims, and channel activities.

The story makes sense to me. In my post a few weeks ago – Why PLM stuck in PDM? – I talked about exactly the same reasons behind the problem of deep and broad PLM adoption – (1) focus on CAD, (2) poor integration between PLM and ERP, (3) absence of process thinking, etc.

Joe’s article made me think about the role PLM service providers will play in future PLM implementation strategies. It reminded me of department stores. Think Macy’s, JCPenney, Bloomingdale’s, Nordstrom… Large manufacturing companies own a huge chunk of PLM software. Every PLM vendor has its own strong characteristics. One size doesn’t fit all. Customers’ existing investments are huge. I don’t see these manufacturing companies starting to jump between vendors. So, how to make the existing PLM system work and show bigger value becomes very important, which obviously raises the question of qualified service providers. Large teams and the ability to implement any PLM software will be the key to success and profit. Customers will come to the PLM service department store and be guided to the right brand(s) or configuration of brands depending on their preferences and constraints.

You may ask who will play the role of Amazon in the growing PLM service ecosystem. This is a very interesting question to ask. Will e-commerce come and disrupt B&M PLM services? Who will provide a new class of systems, which requires different service capabilities? Who will provide online PLM services in a lean way? Joe mentions Autodesk PLM360, GrabCAD and Aras in the list of potential candidates. Who knows…

What is my conclusion? History often repeats itself. Paying attention to existing trajectories of department stores and e-commerce is important. New e-commerce vendors are growing up, but existing B&M department stores are still selling lots of stuff. The same happens in PLM. Today, large vendors provide solutions for companies that are ready to implement existing PLM software. It sounds like a good strategy for large manufacturing companies with deep pockets – until the question about “lean and digital” comes up. Will online and lean PLM offerings compete with existing PLM vendors? This is a good question. There is a good chance for newcomers to play disruptive strategies. However, alternatives are possible as well. Either newborns in the cloud will outgrow the existing B&M players, or the existing vendors will develop the right digital skills and experience. Just my thoughts…

Best, Oleg


Why My PLM Won’t Work For You?

February 6, 2014

[Image: plm-customization-complexity]

Implementing PLM is a process and a change. Speak to anyone in the engineering and manufacturing community and they will bring you lots of stories about the complexity of PLM implementations and the associated cost. You can also hear lots of stories about the complexity of moving from one PLM implementation to another or switching from one PLM system to another. Companies spend tons of money aligning PLM systems to a specific set of requirements to fit the company's data management and process needs.

A couple of years ago, I posted "Is PLM customization a data management Titanic?". For many manufacturing companies, it is a reality these days. Implementations done 10 years ago can hardly be maintained. Updating a current implementation to a new PLM system version or to another PLM system is mission impossible. Companies hire advisers and consulting companies involved in the implementation and development of PLM systems to run migrations and adjust the PLM system to a new set of requirements.

Vendors have been trying to resolve the complexity of PLM systems by applying out-of-the-box configurations. However, the success of these ready-to-go systems has been somewhat mixed. Pre-configured templates worked well during marketing shows, presentations and evaluations. However, bringing the system to production mode (still) required customization. Very often, the customization ended up replacing the pre-configured template entirely. The problem is not unique to the PLM space. I posted the article How to de-customize PLM a few weeks ago, where I discussed the importance of decreasing the customization level and presented similar customization complexities coming from SharePoint implementations.

Today I want to provide some recommendations you can follow in order to stay away from costly PLM customizations. These recommendations will also help you avoid some typical PLM implementation pitfalls. Here are 4 steps to follow:

1- Ask yourself what problem you want to solve with PLM for the next 2 years. The term PLM is used by many people in a variety of forms and meanings. Going with a specific scope (e.g. change management, quality, bill of materials, etc.) will help you chart the functionality you expect the PLM system to support.

2- Outline the main data elements and structures the PLM system needs to support in order to solve the list of problems from the previous step. Do it with no connection to a specific PLM system or vendor (see the vendor-neutral data model sketch after this list). Reach an agreement across your extended team about it. It can take some time to get to the agreement, but in many situations this is one of the best investments you can make in order to eliminate extra customization steps.

3- Pick a few PLM systems and try to map your requirements to what these systems can provide without customization. Don’t be afraid to change your terminology along the way. However, ensure that whatever name the PLM system uses, it will do what you expect from a functional standpoint. It can be a good idea to hire a consultant during this stage. It is worth spending some dollars to avoid future budget waste.

4- Last, but very important: test that the selected system is flexible enough to apply changes on top of the pre-configured parameters/templates. It is not unusual for vendors to provide an out-of-the-box configuration that cannot practically be changed. "Practically" means, in this context, the ability to add to or modify the system and data model while (at the same time) keeping most of the existing functionality in place. Stay away from system configurations with scripts and customized behaviors hard-coded to particular data models and workflows.
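As a companion to step 2, here is a minimal, vendor-neutral sketch of the kind of data outline I have in mind. The entity and field names are illustrative assumptions, not a prescription for any particular PLM system:

from dataclasses import dataclass, field
from typing import List

# Hypothetical, vendor-neutral data elements for a change-management scope.
@dataclass
class Part:
    number: str
    revision: str
    description: str = ""

@dataclass
class BomLine:
    parent: str      # parent part number
    child: str       # child part number
    quantity: float

@dataclass
class ChangeOrder:
    eco_id: str
    status: str                      # e.g. "Open", "Released"
    affected_parts: List[str] = field(default_factory=list)

# Agreeing on a neutral outline like this first makes it easier to check, system by
# system, how much is covered out of the box and how much would need customization.
eco = ChangeOrder("ECO-2014-042", "Open", affected_parts=["P-100", "P-200"])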

What is my conclusion? The combination of flexibility and a preconfigured environment is the key to staying away from costly PLM customizations. However, these two characteristics are very often mutually exclusive. Vendors can show a ready-to-use PLM configuration that will be literally destroyed as soon as you have to change something. Testing how flexible the out-of-the-box PLM model is will save you from being punched in the face by future PLM system re-configuration and customization costs. Just my thoughts…

Best, Oleg


Why PLM should not think about ownership?

October 10, 2013

[Image: who-should-own-plm]

The enterprise software business is complicated. It often comes with questions of system responsibilities and ownership. IT usually holds primary responsibility for the cross-company software stack. At the same time, divisions and departments often try to influence IT and make their own decisions about what software to use independently. One of the terms that came out of this political and technological confrontation is shadow IT.

Ownership is a sensitive question. In enterprise organizations (manufacturing companies included) it often involves politics, hidden agendas and power plays. This is the reality of many enterprise organizations. By its nature, PLM gets deeply involved in these politics. If you think about PLM as a business strategy focused on organizational processes, it becomes clear why it is hard to avoid cross-department conflicts. Then, if you think about PLM as a technology, you can appreciate why ownership of a particular set of technologies and/or data can be involved in the same cross-department politics.

Earlier this week, I was reading the Aras blog – Who should take ownership of PLM? Navigate to the following link to have a read (). Draw your own opinion. I think it provides an interesting perspective on PLM ownership. This is how the Aras blog explained it:

With this business dynamic in mind, have you reexamined who should take ownership of PLM in your organization? Something to consider, there will be different driving forces behind PLM depending upon who has ownership. Here are some to keep in mind: Should Design select and own the PLM system? As you’ve likely seen, the main focus of PLM is in CAD file management, change management and BOM management when design is at the wheel. What about Operations? With cost reduction, inventory management and purchasing at the top of their priorities you can bet these will be the focus for Operations driven PLM as well. What about Quality? With the Quality organization leading, the driving forces for PLM are likely to be compliance, regulations and overall quality of products.

I agree, the story is really not black and white. However, a good organization will break this story down into pieces of business processes and technologies. There are clear benefits to divisions and departments being responsible for their own processes. This should be the ultimate goal. Nevertheless, I can still see a place for overall responsibility for product development processes (especially when it comes to cross-functional teams). So, having somebody responsible for the product can provide a lot of value. At the same time, it doesn’t mean the technology needs to be driven by the same department’s people. I see technological and software stack responsibility separately. Sometimes it can come as multiple pieces of products and technologies.

What is my conclusion? In my view, the question of "PLM ownership" is the wrong one. It must be split into business, process, communication and technology. It is obvious the Quality (or any other) department should be responsible for how its processes are organized. However, that doesn’t imply ownership of the technologies and software pieces. Just my thoughts…

Best, Oleg

Image is courtesy of Aras blog.


PLM real implementations: too long to be on-time?

July 7, 2013

One of the looongest US weekends I remember is coming to an end. It is time to get back from the relaxing holiday atmosphere to business reality. I’ve been skimming social channels and stumbled on a PLM statistics post published by my good friend and PLM blogging buddy Jos Voskuil. Navigate to the following link – PLM statistics – the result. Read it and form your own opinion. Jos’ main conclusion – PLM is more vision than tech. PLM implementation is a journey that takes time, effort and resources. Some interesting and funny things came out of the comparison of experience and PLM implementation time. Here is the passage I liked:

Here, it was interesting to see that more than 60 % of the respondents have over 8 years of experience. As mentioned related to the previous questions it is necessary to have a long term experience. Sometimes I meet a “Senior” PLM Consultant (business card) with two or three years of experience. I believe we should reserve the word “senior” for PLM with a minimum amount of 5 years experience. And it is also depending on the amount of projects you were involved in. Interesting thought came into my mind. Some vendors claim the provide extreme rapid implementations for PLM ( 2 weeks / 30 days / 3 months). If this is real PLM you could do 25, 12 or 4 PLM projects per year full time.

It made me think about PLM implementations as they exist today – journey-type, specialized implementations requiring time and effort. I certainly agree with Jos – changing the way companies work requires vision, time and effort. In some situations, PLM implementations come to change product development processes established over decades.

However, here is a different angle on the PLM problem. Business is very dynamic these days – the business environment, ecosystem, technology, human resources, business landscape. What if the current lifecycle of many PLM implementations is not very well aligned with business needs? It reminds me of one of the old PTC slides from COFES Israel – people just want to drink beer!

What is my conclusion? The new enterprise landscape and business reality will require a different approach in everything – IT, computing models, enterprise software and implementations. We’ve seen lots of changes in the consumer technology space, open source and other places over the past 10 years. People are looking at how to build new products faster and respond quickly to customer demands. So, my hunch is that some PLM journeys will be too late to deliver results. Just my thoughts…

Best, Oleg


Who should be the first PLM user in a company?

May 9, 2013

Enterprise software implementations are usually not a simple task. Compared to selecting your next mobile device or RSS reader, it is an organizational effort. Enterprise software gets really complicated at the point where implementation requires the involvement of people. Product lifecycle management (PLM) is one of these systems. Implementation of PLM is deeply connected to product development and manufacturing processes. The success or failure of a PLM implementation is directly impacted by how people are involved in PLM system adoption and use.

Companies take different approaches to implementing PLM. Fundamentally, however, I can see two different ways to implement. The first is a holistic approach usually called "business transformation". It implies significant process changes as a result of the PLM system implementation. Companies analyze their existing processes, optimizing and restructuring the way they do business. The second approach focuses on a specific process or problem. It usually comes as an improvement of a specific activity and/or process.

There are lots of debates about PLM implementations these days. The value of PLM system implementations is becoming clear to organizations at different levels. At the same time, it is obviously not easy for people to understand how to start using a PLM system that will have such a significant impact on everything they do.

I was reading a Minerva blog post – Should we pull PLM deployment? A new lean deployment strategy by Yoann Maingon. In this article Yoann shares his view on different approaches to implementing PLM. The idea of lean and "pulling data" resonated. Here is an interesting passage:

The lean concept is highly based on a pull flow. Most of the arguments I’ve had were about the fact that the main data is created in Engineering so we should start deployment in engineering. Well, what if you should provide a system to the first person who enter the system. The one who will pull the flow, the customer? the marketing? assistance & support?

It made me think about how to maximize the value of a PLM implementation within a short period of time. Here is the idea. Every company manufactures products for customers in some way. The biggest process loop in every manufacturing company starts with requirements and ends with the "release" of the product to the customer. Controlling the loop between requirements and results can be an interesting problem to handle first.

The idea of "pull" will be related to pulling of product requirements and documents representing released products and combined them together in a single system. In my view, it can provide an interesting insight on company operation. It is also very useful information source that every company can "re-use" for different purposes – new projects, customer support, etc.

What is the conclusion? It all starts with ROI. How to get it faster… This is a challenge most PLM implementations face these days. For most implementations, the process of getting to results can be slow. Providing a system that can capture the requirements-to-release loop can be an interesting option. Lots of valuable information is hidden in these requirement-result relationships. It can also drive management attention and focus in a company. Just my thoughts…

Best, Oleg


PLM adoption and CAD management valley of death

March 5, 2013

The issue of PLM adoption remains critical, in my view. Even though we can see more examples of PLM implementations, companies usually treat a "PLM project" as something that needs to be taken on with care, a significant amount of planning and justification. So, I wanted to ask "why does this happen?" The traditional answer, mostly coming from PLM vendors and PLM consultants, points to the complexity of business processes, the need for people to change and the technological challenges related to the implementation and customization of systems. Usually, vendors and consultants work with the CAD/IT managers that lead the "engineering part" of the implementation. In parallel, PLM consultants often dream of working with C-level people in a company to define better alignment of the PLM strategy.

I want to raise the question of balance between engineering IT / CAD managers and more strategic PLM business planning. Here is the thing – even if companies succeed in strategic planning for future business and product development changes, CAD management can still create too many complications along the way. CAD management is complicated, requires lengthy implementations and data imports and… more importantly – changes the way engineers and designers work by introducing "data management". The last one is hated by everybody because of its complexity.

In addition to the complexity, CAD data management introduces the issue of compatibility between different CAD systems and PDM components. A new trend to solve this problem is to use the PDM system from the same CAD vendor. It simplifies CAD/PDM integration, but introduces the problem of integrating multiple PDM/PLM systems. CAD management (PDM) projects, often positioned as an introductory step in PLM implementations, frequently end up as a long and painful journey.

What is my conclusion? In my view, CAD management is a valley of death for many PLM implementations. Many PLM consultants and even some PLM vendors try to avoid it and position PLM implementations "beyond CAD/PDM". That certainly gives some advantages, but (in my view) it just hides the real problem of bad engineering data management. Engineering data from CAD is an important element of change management and many other PLM-related processes. As we move to more agile and efficient product development process management, solving the problem of CAD data management and PDM becomes very important. Just my thoughts…

Best, Oleg


Dogfooding and PLM APIs random thoughts

November 14, 2012

If you have been in the software business long enough, you should be familiar with the term "dogfooding" (or eating your own dog food). The term describes the situation or scenario in which a company uses its own products to demonstrate their capabilities and quality. If you are not familiar with this practice, navigate to the following Wikipedia article to read more. I liked some examples there, specifically the Apple one, which I wasn’t aware of:

Apple Computer president Michael Scott in 1980 wrote a memo announcing that "EFFECTIVE IMMEDIATELY!! NO MORE TYPEWRITERS ARE TO BE PURCHASED, LEASED, etc., etc." by the computer company, with a goal to eliminate typewriters by 1 January 1981.[9]

The following passage brings a few more examples:

One perceived advantage beyond marketing is that dogfooding allows employees to test their company’s products in real-life scenarios,[3][5] and gives management a sense of how the product will be used, all before launch to consumers.[5] In software development, the practice of dogfooding with build branches, private (or buddy) builds, and private testing can allow several validation passes before the code is integrated with the normal daily builds. The practice leads to more stable builds[citation needed], and proactive resolution of potential inconsistency and dependency issues, especially when several developers or teams work on the same product. For example, Microsoft and Google emphasize the internal use of their own software products[citation needed]. For Microsoft, especially during the development stage, all employees across the corporation have access to daily Software builds of most products in development, including the Windows operating system.[citation needed]

Today, I want to speak about a specific kind of "dogfooding" related to PDM/PLM APIs (Application Programming Interfaces). In the world of PLM implementations, the role of an open API has become very important. Usually, when I work with customer requirements, I see notes like this – external programming or customization as a way to compensate for features or functions absent from the product. Yesterday, I had a chance to read the following TechCrunch article – 5 Rules for API Management. Even if you are not a programmer or software engineer, have a read and form your own opinion.

The article made me think about the complexity of API delivery in PDM/PLM as well as about "lifecycle". The latter is important – PDM/PLM products live for a very long period of time, and the development of stable APIs is a separate and almost "must have" prerequisite. The 5 rules – design, documentation, analytics, universal access and uptime – made perfect sense to me. I found an interesting note about the relationship between IT and the business group (which is also very typical for many PDM/PLM implementations):

Enterprise API Management must include the entire Enterprise, not just the techies in IT. The SOA solution, and the other gateways as well, is focused on the IT person and not the business owner of the API program. This is reflected in the UI that they present in their free version as well as their language that includes things like “policies”; too much of the business rules are codified in complex policies that require a technical expert to really use.

However, I found the notion of analytics the most interesting, since it can address the idea and requirements of API management through the lifecycle of the product. Here is the passage to think about:

[how to] think about the collection and processing of all the statistics associated with the use of the API, with an eye toward supporting and encouraging effective usage and discouraging/limiting usage that is counter to your business or technology goals.
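To make this less abstract, here is a toy sketch of usage analytics around a PLM-style API call – a simple decorator that counts calls and latency per endpoint. The endpoint names are hypothetical, and this is not how any specific PLM vendor implements analytics:

import time
from collections import defaultdict

# Toy usage-analytics layer: count calls and total latency per endpoint so heavy
# or counter-productive usage patterns become visible over the API lifecycle.
api_stats = defaultdict(lambda: {"calls": 0, "total_seconds": 0.0})

def track_usage(endpoint_name):
    def decorator(func):
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return func(*args, **kwargs)
            finally:
                stats = api_stats[endpoint_name]
                stats["calls"] += 1
                stats["total_seconds"] += time.perf_counter() - start
        return wrapper
    return decorator

@track_usage("get_bom")
def get_bom(part_number):
    # placeholder for a real PDM/PLM API call
    return {"part": part_number, "children": []}

get_bom("P-100")
get_bom("P-200")
print(dict(api_stats))  # e.g. {'get_bom': {'calls': 2, 'total_seconds': ...}}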

What is my conclusion? The days of single PLM platforms are almost gone. The future belongs to networks – data networks, product and cloud service networks. The ability to adapt a product to customer needs, to continue product development in a fast-changing customer environment, and the strategic push toward cloud deployment set new goals for PDM/PLM developers. Having an agile and flexible API that can sustain many product releases and development cycles was never as important as it is today. Just my thoughts…

Best, Oleg

Image is courtesy of TechCrunch article (Feature image courtesy of XPlane – under Creative Commons.)


How Do You Know Your PLM Project is in Trouble?

December 19, 2011

The blogosphere and other literature are full of remarks about companies that got stuck in different phases of the PLM process. You probably had a chance to read Aras’ Frustrated by a stuck PLM project? blog post last year. The recent Autodesk announcement of Nexus PLM again raised many publications about the complexity and sophistication of existing PLM implementations. While time will show whether Autodesk cloud PLM technologies can reduce the complexity of PLM implementations, I’ve been thinking about how you can identify today that your own PLM project (if you are already running one) is in trouble.

I had a chance to read an InfoWorld article earlier this week – Six lessons from lightning ERP rollout. Have a read and form your own opinion. I especially like the following passage from the beginning:

Here’s something you don’t hear everyday: "Our SAP implementation finished ahead of schedule. Sorry, let me rephrase that. Hearing about an SAP implementation that finished ahead of schedule is like hearing that someone captured the Loch Ness Monster and turned it into a kiddie ride. It’s as likely as Bigfoot singing "La Traviata" at Lincoln Center. It’s as if you called a software company’s tech support line and the voice on the other end didn’t insist you reboot your PC.

This article made me think (again and again) about how you need to plan your PLM implementations. Here are my top 5 symptoms you should watch out for. When you discover them, you had better check what is going on with your PLM project:

1. You cannot control your PLM project budget. In R&D, you know that "shit happens" all the time. However, be aware – the achievements of your PLM system will be significantly diminished when you overspend by 200-300%.

2. Engineers and other people in your company work around the PLM system. This should be a "red flag" for you. If people think the system doesn’t work (or is way too complex), fasten your seat belt and run fast to understand the core reason for it.

3. The infrastructure becomes more and more complicated. You need more databases, storage, CPU, etc. You are probably familiar with this – after the first pilot, the system requirements grow. Watch the first production data load carefully. Your bill of materials, check-in/out operations and some other elements are sensitive, and you can run out of budget fast.

4. The vendor is pushing you towards the next release of their flagship product. This is another "red flag". Normally, it means something was over-promised by the sales folks. Watch this moment as well.

5. You start hearing that you will be able to take full advantage of your PLM system only when you completely integrate it with your ERP (and other systems) and migrate to another CAD system. This is actually the right time to stop and re-think what you are doing. Best of all, talk with somebody who is not involved in the business of PLM vendors.

What is my conclusion? Looking back at what I wrote, these symptoms probably hold true not only for PLM, but for a broader range of enterprise software. However, as you probably know, in PLM and enterprise, one size doesn’t fit all. You need a diversity of knowledge and experience to make things work. Just my thoughts…

Best, Oleg

Picture courtesy of digitalart / FreeDigitalPhotos.net

