PLM and “The Whole Truth” Problem

March 31, 2012

It is hard to find somebody in the PDM/PLM business who is not familiar with the idea of a "single point of truth" (SPOT). The idea is not new. In my view, it was one of the most powerful models that convinced people to implement PLM during the last decade. Similar to the idea of technological singularity, the idea of a single point of truth assumes the ability of PLM as a business system to absorb all aspects of a product development lifecycle, from early idea generation and requirements building through manufacturing and disposal. Architects of PLM systems took ERP resource and accounting as a model to build a complete model of product development processes in an organization. In the past, I addressed this topic in my posts. Navigate to Back to basics: PLM and Single Point of Truth or PLM and a single point of disagreement posts to read more.

The practical application of the "single point of truth" model was far from ideal. A high level of diversity in engineering and manufacturing processes, combined with the significant cost needed to implement a singular model, ended up with very limited implementations. Lots of data elements and processes weren't covered by PLM implementations. Discussion of "holistic" or "total" PLM implementations is not an unusual topic during any PLM implementation. Earlier today I was listening to the tweet stream coming from the CIMdata PLM forum. I learned a new term coined by Stan Przybylinski of CIMdata – "The Whole Truth model". You can see the tweets about it below. What I learned is that the "whole truth model" is supposed to expand PLM to domains not well covered by PLM today – electronics, software, process – and not be limited to the traditional "mechanical orientation" of PLM.

This conversation made me think about possible trajectories of PLM model development into so-called "The Whole Truth". I decided to make a picture to illustrate that. On the right side of the picture, you can see a PLM model growing and absorbing various domains of product development. The extra data elements on the picture represent the supplemental aspects of product development that a PLM model needs to cover.

While I can see the clear advantages and logic behind this "whole truth" model, I have concerns about the overall scalability and feasibility of such a data model organization. The complexity and cost of this model will create difficulties in real-life implementations. Change management will be costly and complicated. Therefore, I decided to predict an alternative model. I call it the "web truth" model. The idea behind it isn't new: organize a scalable network to represent the multiple aspects of product development. This model can have all the advantages of a "single model", but at the same time it assumes some level of independence in data organization, which makes the overall data architecture more lean and agile.
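To make the contrast concrete, here is a minimal sketch of the "web truth" idea: each domain keeps its own independent data store, and the product is a lightweight network of cross-domain links rather than one monolithic record. All names and structures below are hypothetical illustrations, not an actual PLM API; on the web, the link keys would be URLs.

```python
# Each domain owns its own records independently (the "web" of truth).
mechanical = {"MECH-100": {"type": "bracket", "material": "aluminum"}}
electronic = {"PCB-7": {"type": "controller board", "layers": 4}}
software = {"FW-2.1": {"type": "firmware", "runs_on": "PCB-7"}}

# The product itself is just a set of links into the domain stores,
# not a single schema that must absorb every domain.
product_links = {
    "PRODUCT-1": ["MECH-100", "PCB-7", "FW-2.1"],
}

domains = [mechanical, electronic, software]

def resolve(key):
    """Follow a link to whichever domain owns the record."""
    for domain in domains:
        if key in domain:
            return domain[key]
    raise KeyError(key)

for ref in product_links["PRODUCT-1"]:
    print(ref, "->", resolve(ref)["type"])
```

The point of the sketch: adding a new domain (say, simulation data) means adding one more independent store and some links, without touching the existing schemas – which is what makes the network organization lean and agile compared with one singular model.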

What is my conclusion? PLM vendors need to learn more about the last decade of web development and the organization of large, scalable web systems. In my view, an attempt to build a "singular" system won't be successful and will create a complex system that is hard to maintain, change and scale. The future belongs to data networks and more flexible data organization. Just my thoughts…

Best, Oleg

Picture credit: Victor Habbick / FreeDigitalPhotos.net


Product and Process Models in PLM – What Should Come First?

December 3, 2009

A common definition of a process is "a set of activities leading to a desired outcome". Despite such a simple and straightforward definition, implementation of processes in PLM is often delayed and very often leads to significant complexity. I'd like to analyze why this happens and what other factors can influence process implementation in an organization.

Process Model.
Depending on the tools, technologies and environment a customer has, processes in an organization can be modeled and implemented differently. Normally, there is more than one enterprise system in the organization that is able to handle process modeling: starting from middleware and specialized BPM software, continuing with enterprise systems such as ERP and PLM, and ending with various Enterprise 2.0 collaboration tools. A process model, these days, can be developed with multiple tools. Over the last few years, BPMN has become something close to a standard for process definition. What is the main problem? Data. Various product and corporate data need to be injected into the process implementation to make it work.
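The data problem above can be sketched in a few lines: a process definition is just a sequence of activities, but each activity only becomes executable once product and corporate data is injected into its context. The activity names and data keys below are hypothetical, not taken from any specific BPM or PLM product.

```python
# A process is "a set of activities leading to a desired outcome" –
# but each activity declares the data it needs before it can run.
process = [
    {"activity": "review design", "needs": ["cad_model"]},
    {"activity": "approve change", "needs": ["affected_items"]},
    {"activity": "release", "needs": ["bill_of_materials"]},
]

def run(process, data):
    """Walk the activities; stall as soon as required data is missing."""
    for step in process:
        missing = [k for k in step["needs"] if k not in data]
        if missing:
            # Without the injected product data, the process stalls here.
            return f"blocked at '{step['activity']}': missing {missing}"
    return "completed"

print(run(process, {"cad_model": "bracket.prt", "affected_items": ["ITEM-1"]}))
# -> blocked at 'release': missing ['bill_of_materials']
```

However elegant the process model, the run only completes when every activity finds its data – which is why the product data question comes before the process question.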

Product Model.
Originally starting from CAD models, the product model developed into an extended set of information describing various aspects and dimensions of a product – model, bill of materials, requirements, items, information about customer requests, etc. As we learned from the process model definition, this product model information is needed to make the process definition work. Processes need to access data to trigger the tasks and events that drive them.

So, what should come first? Product or Process? My conclusion is that the lack of a rational product model leads to a very high level of complexity in process definition and implementation. The Product Model is the foundation of the product lifecycle. Without a well-defined product model that can cover the enterprise product definition scope and related disciplines, development of a process-oriented PLM environment becomes a complex and unachievable task. An organization implementing PLM as a process environment needs to invest first in the implementation or adoption of the product model that will be used as the process foundation.
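A hedged sketch of this conclusion, under assumed (hypothetical) structures: when the product model exists first, a process can derive its scope from the model instead of hard-coding data into each process definition.

```python
# A minimal product model: items with bill-of-materials links.
# All identifiers are illustrative, not from any real PLM system.
product_model = {
    "ITEM-A": {"bom": ["ITEM-B", "ITEM-C"], "requirements": ["REQ-1"]},
    "ITEM-B": {"bom": [], "requirements": []},
    "ITEM-C": {"bom": [], "requirements": ["REQ-2"]},
}

def change_scope(item, model):
    """Collect the items an engineering change on `item` must cover,
    by walking the BOM structure of the product model."""
    scope, stack = [], [item]
    while stack:
        current = stack.pop()
        if current not in scope:
            scope.append(current)
            stack.extend(model[current]["bom"])
    return scope

# A change process on ITEM-A derives its whole scope from the model.
print(change_scope("ITEM-A", product_model))
```

With the product model in place, the process definition stays simple; without it, every process would need to enumerate the affected data by hand – the complexity the conclusion warns about.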

What is your opinion? What was your experience in similar tasks and efforts?
Best, Oleg

