PDM 101: Engineering Document Management Fallacy

August 30, 2013

We love new technologies and trends. However, from time to time, I want to get back to the basic topics of engineering and manufacturing software. The topic I’d like to discuss today is Engineering Document Management (EDM). This post was triggered by the DM vs. EDM article by Scott Cleveland in the 2PLM newsletter. Here is the passage Scott uses to explain the main difference:

Document management can be as simple as saving a document to a protected directory. It could be any of the document management software packages like SharePoint. Engineering document management is a different beast. Engineering document management follows some basic engineering rules. The concept is that of a vault.

Later in the article, the engineering rules are explained as access control, version control, process states (create, change, release), and an audit trail.

I found myself a bit confused by this definition. There are many document management systems that comply with the rules described above. However, I would not recommend using these systems for engineering document management purposes. I took a look at Wikipedia, and here is what I found. Navigate to the Wikipedia article about Document Management Systems (DMS). The article is quite comprehensive. Here is a short passage that defines a DMS:

A document management system (DMS) is a computer system (or set of computer programs) used to track and store electronic documents. It is usually also capable of keeping track of the different versions modified by different users (history tracking). The term has some overlap with the concepts of content management systems. It is often viewed as a component of enterprise content management (ECM) systems and related to digital asset management, document imaging, workflow systems and records management systems. Document management systems commonly provide storage, versioning, metadata, security, as well as indexing and retrieval capabilities.

Later in the article, I found a very useful table describing the functions and components of document management. One of them (a very important one) is versioning:

Versioning is a process by which documents are checked in or out of the document management system, allowing users to retrieve previous versions and to continue work from a selected point. Versioning is useful for documents that change over time and require updating, but it may be necessary to go back to or reference a previous copy.
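The check-in/check-out model described in that passage can be captured in a small sketch. This is a toy illustration of the concept only; the `Vault` class and its methods are hypothetical names, not the API of any real document management system.

```python
# Toy sketch of DMS check-in / check-out versioning. All names here
# are hypothetical illustrations of the concept described above.
class Vault:
    def __init__(self):
        self.versions = {}      # doc_id -> list of content versions
        self.checked_out = {}   # doc_id -> user holding the lock

    def check_out(self, doc_id, user):
        # Lock the document so only one user edits it at a time,
        # returning the latest version to continue work from.
        if self.checked_out.get(doc_id):
            raise RuntimeError(f"{doc_id} is already checked out")
        self.checked_out[doc_id] = user
        history = self.versions.get(doc_id, [])
        return history[-1] if history else None

    def check_in(self, doc_id, user, content):
        # Release the lock and append a new immutable version.
        if self.checked_out.get(doc_id) != user:
            raise RuntimeError("check out the document first")
        del self.checked_out[doc_id]
        self.versions.setdefault(doc_id, []).append(content)
        return len(self.versions[doc_id])  # new version number

    def get_version(self, doc_id, n):
        # Retrieve a previous version (1-based index), as the quoted
        # definition says: "go back to or reference a previous copy".
        return self.versions[doc_id][n - 1]
```

Note how retrieval of old versions stays possible after every check-in: the vault never overwrites content, it only appends.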

Now, let’s move forward and see what Wikipedia states about Engineering Document Management (EDM). I didn’t find a separate EDM article. The most relevant one was Technical Data Management, derived from Document Management System (DMS). I captured the following important passage:

A Technical Data Management System (TDMS) is essentially a Document management system (DMS) pertaining to the management of technical and engineering drawings and documents. Often the data are contained in ‘records’ of various forms, such on paper, microfilms or on digital media. Hence technical data management is also concerned with record management involving purely technical or techno-commercial or techno-legal information or data.

The Wikipedia article compares TDMS and DMS in the following way:

TDMS functions are conceptually similar to that of conventional archive functions, except that the archived material in this case are essentially engineering drawings, survey maps, technical specifications, plant and equipment data sheets, feasibility reports, project reports, operation and maintenance manuals, standards, etc.

In my view, these days, most people associate Engineering Document Management directly with PDM. Navigate to the Wikipedia EDM page and you will find confirmation of that (Engineering Data Management, also known as Product Data Management).

So, what is so special and different about Engineering Document Management that confuses many people? In my view, it comes down to the type of data the system is managing. It is about CAD models, drawings, design, simulation, etc. This data is semantically rich and contains lots of connections and constraints. Managing versions of Excel files is easy; many document management systems can do it. However, managing versions of SolidWorks or Inventor assemblies is not so simple. You need to track dependencies between parts, drawings and other elements of interconnected data.
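The dependency tracking above is the heart of the difference. As a minimal sketch (file names and class names are invented for illustration, not taken from any real PDM product), a change to one part must propagate through a "where-used" index to every assembly and drawing that references it:

```python
# Sketch: why engineering data is harder than plain documents. An
# assembly references parts and drawings; changing one file means
# finding everything that depends on it. All names are illustrative.
from collections import defaultdict

class DependencyIndex:
    def __init__(self):
        self.uses = defaultdict(set)     # file -> files it references
        self.used_by = defaultdict(set)  # reverse ("where-used") index

    def add_reference(self, parent, child):
        self.uses[parent].add(child)
        self.used_by[child].add(parent)

    def where_used(self, item):
        # Transitively collect everything affected by a change to
        # `item`, e.g. all assemblies and drawings to revise.
        affected, stack = set(), [item]
        while stack:
            for parent in self.used_by[stack.pop()]:
                if parent not in affected:
                    affected.add(parent)
                    stack.append(parent)
        return affected

idx = DependencyIndex()
idx.add_reference("assembly.sldasm", "bracket.sldprt")
idx.add_reference("top.sldasm", "assembly.sldasm")
idx.add_reference("bracket.slddrw", "bracket.sldprt")
```

A plain DMS versions each file independently; an EDM/PDM system has to maintain and traverse this graph on every change.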

What is my conclusion? Semantic complexity makes engineering document management complicated. It is all about connections and data dependencies. This is the specialty of engineering document management software. Managing revisions of interconnected files is complicated. It cannot be done at the level of a single file and requires a different approach. Engineering Document Management (today mostly known as PDM) is a special class of data management solutions used for this purpose. Just my thoughts…

Best, Oleg


PLM and Oracle’s New Full Text Query

August 28, 2013

Are you familiar with the 1% internet rule? It is also called the 90-9-1 principle. According to this rule, only 1% of internet users are actively involved in content creation. Even though the 1% rule cannot be applied as-is to the enterprise, my hunch is that retrieval of data (or data access) is becoming more and more important these days. The amount of data in enterprise organizations is growing. Access to this data is a critical IT requirement.

Companies are thinking about different ways to access data. It comes in the form of search, reports, analysis and many others. All of the above applies to manufacturing companies and PLM systems. Traditionally, engineering users, who have represented the majority of CAD/PLM users from the beginning, are very focused on data creation – CAD design, simulation, drawings – all of these assume the creation of information. However, when it comes to user adoption, the problem of data consumption comes into play.

While the question of noSQL database adoption by PLM is still open, my guess is that the majority of PLM implementations in the world run on top of an RDBMS, probably with a heavy presence of Oracle databases. On that note, Oracle recently updated its database to include XQuery Full Text search. The following article on the AMS Tech blog provides lots of useful information about this full text query functionality, along with code examples.
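For readers unfamiliar with what a full text query does under the hood, here is a language-neutral sketch of the core idea: an inverted index mapping terms to the documents that contain them. This is a toy illustration only, not Oracle’s XQuery implementation, and the sample documents are invented:

```python
# Toy full-text search: build an inverted index (term -> doc ids)
# and answer AND-queries over it. Sample data is made up.
from collections import defaultdict

def build_index(docs):
    # docs: {doc_id: text}; returns term -> set of doc_ids
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, query):
    # AND semantics: return docs containing every query term.
    results = [index.get(t.lower(), set()) for t in query.split()]
    return set.intersection(*results) if results else set()

docs = {
    "ecr-101": "bracket tolerance change request",
    "ecr-102": "update bracket drawing",
    "ecr-103": "supplier tolerance review",
}
index = build_index(docs)
```

Production engines add stemming, ranking and phrase matching on top of this structure, but the index-then-intersect shape is the same.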

What is my conclusion? New technologies like noSQL and others are cool. However, when you have to support many existing customers in production, using existing mainstream database technologies can become an interesting option. I’m not sure it will become a lifesaver for Oracle in the race against other database options. Nevertheless, solving customer problems is a priority. A full text query can bring some cool search capabilities to existing implementations. Just my thoughts…

Best, Oleg


Cloud and Traditional PLM Industries Trajectories

August 27, 2013

The cloud trend is shifting direction. Only a few years ago, we were saying the cloud’s focus was on consumer applications and the public web. Later we said the cloud would clearly impact small and medium businesses first. However, it looks like we are going to see further shifts sooner rather than later. The PLM cloud switch has finally happened for all (I should say almost all) PLM vendors. It made me think about what will happen to traditional PLM industries such as automotive, aerospace and defense.

My hunch is that automotive will be the first of the traditional PLM industries "ready to go" with the cloud. The Automotive Engineering and Technology Forum by AIAG gives some confirmation to think about. Navigate to the following link to read about the upcoming forum in Southfield, MI in October. The name of the discussion panel is intriguing – Impact of Cloud Computing on Automotive Engineering. The name of the moderator (Mike Payne) is even more intriguing. Mike’s name is behind well-known companies such as PTC, SolidWorks, and SpaceClaim. The following passage outlines the panel:

Computing has evolved over the years from one platform to the next. New approaches are catalyzed by the next platform. As it was in previous generations, there are naturally the early adopters and the followers. While this new platform offers the promise of enterprise-wide interoperability, and even beyond the enterprise to the ecosystem, there remain the challenges of the huge base of installed software. In this session we will explore with some of the movers and shakers in the software industry the opportunities offered by this new platform. We also discuss the barriers to having the cloud technologies used by everyone in an organization, just like many people already use cloud-based e-mail today, instead of the installed Exchange clients on their own premises.

What is my conclusion? Adopting a new technology platform is a risky game. Many industries have played safe strategies, which means coming late into the game. It worked in the past for some of them. However, the 21st century is proving to be different. It brings a speed of success and failure we have never seen before. Think about Google, Facebook, Kodak, Nokia and RIM. These are good examples of fast successes and very fast failures. Speaking about the automotive industry, think about Tesla and the city of Detroit… It will be very interesting to see what trajectories automotive and other traditional PLM industries take with cloud PLM in the near future. Just my thoughts…

Best, Oleg


Will PLM manage enterprise BOM?

August 26, 2013

Bill of Materials is a huge topic. It is hard to overestimate the importance of BOM management in general, as well as specifically for PLM systems. I have blogged about BOM many times. Use this custom Google search to browse my previous Beyond PLM articles about BOM.

One of the topics I have been debating for a long time is the so-called "single BOM". I explained my early view on the problem of a single BOM back in 2009 in my post – Seven Rules Towards Single Bill of Materials. Although most of the people you talk to would agree with the idea, implementing a single BOM in a modern enterprise is complicated and challenging. Take a look at another post to get a glimpse of what I’m talking about – Why companies are not ready for single BOM? One of the biggest obstacles to the single BOM idea is integration. Today, BOM data resides in many systems across the enterprise. Each of these systems provides a specific functional view of BOM management – design, engineering, supply chain, manufacturing, finance, etc. However, integrating these systems is a very complicated task. In addition, vendors’ tendency to lock in data to improve the competitiveness of their systems adds further integration and communication difficulties.
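To make the integration problem concrete: even reconciling two flat views of the "same" product – say, the engineering BOM and the manufacturing BOM – requires a diff across systems. The sketch below uses invented part numbers and is an illustration of the idea, not a real integration:

```python
# Sketch of the multi-system BOM problem: the engineering view and
# the manufacturing view of the same product rarely match exactly.
# Part numbers and quantities are invented for illustration.
def bom_diff(design_bom, mfg_bom):
    """Each BOM maps part_number -> quantity. Returns the
    discrepancies a single-BOM strategy would have to resolve,
    as part -> (design_qty, mfg_qty)."""
    parts = set(design_bom) | set(mfg_bom)
    return {
        p: (design_bom.get(p, 0), mfg_bom.get(p, 0))
        for p in parts
        if design_bom.get(p, 0) != mfg_bom.get(p, 0)
    }

design = {"PN-100": 2, "PN-200": 1}            # engineering view
mfg = {"PN-100": 2, "PN-200": 1, "GLUE-5": 1}  # adds consumables
```

Real reconciliation also has to handle structure (multi-level BOMs), effectivity dates and unit-of-measure differences, which is where the real complexity lives.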

Manufacturing companies have many challenges these days. BOM management is one of them. Despite the complexity of single BOM management strategies, businesses need a holistic view of the bill of materials. I touched on it in my 3 modern BOM management challenges blog post – lean manufacturing, heavy customization, and supply and contract manufacturing.

In my view, the problem of BOM management is well understood by the industry. I stumbled upon an old Siemens PLM blog post written by Nick Pakvasa – BOM Challenges in Automotive Industry. In addition to the (obvious) recommendation to use Teamcenter for enterprise BOM management, the article nails down the problem of the single BOM. Here is my favorite passage:

The requirement to get the BOM under control at the enterprise level is reaching a critical stage. Exponential increase in software content is driving the requirement to eliminate legacy BOM applications that were structured purely to support older product architectures which were predominantly based on mechanical parts, to an environment that has to deliver integrated BOM management capability for all content in the product. And, to deliver this capability with lifecycle BOM management needs incorporated into the application, and to manage this BOM data starting in the Product Planning phase, through designing, building, selling, servicing and retiring the product. Without integrated enterprise level BOM management supporting all functional areas, geographical locations and external suppliers, partners and joint ventures, actions such as traceability of quality and other imperatives, reuse, BOM access for Service, and lifecycle enterprise change management are either cumbersome, with redundant or non-value added effort, and potential for errors, or in some cases, cannot be executed.

What is my conclusion? The recommendation of all PLM vendors is to use PLM as the central system to manage all BOMs. The recommendation is obvious. However, my hunch is that the problem is bigger than it is presented to be by PLM companies and other vendors developing systems that produce and consume BOM data. The complexity of integration, implementation legacy and the variety of requirements create a very complex application space to manage. In addition, the high level of competition, together with political influences inside every manufacturing company, makes the decision about BOM management one of the most complicated in modern enterprise systems. Just my thoughts…

Best, Oleg


Why graph analysis will rule PLM in the future?

August 23, 2013

PLM is all about data. It is about products, requirements, configurations, Bills of Materials, CAD models, manufacturing instructions and zillions of other documents. What is especially interesting about product lifecycle data is how interconnected it is. When you think about CAD models, assemblies and drawings, the relationships are mostly obvious. Even so, PDM systems work hard to maintain these relationships during the change process. However, let’s think beyond the design and engineering department. Think about the whole lifecycle of the product. Think about the usage of components on a global scale. Think about the supply chain and design suppliers. Think about product behavior in a socially connected world.

Data is complex. "Understanding" data and finding the right relationships is a complex task. Using these relationships and contextual data to drive better decision processes is even more complex. This is the right time to start thinking about graphs, and this is where graph models come into play. It is a good time to refresh your university notes about graph theory :). My recommendation is to add some practical sense to that and look at everyday use cases like the Facebook friends model and PageRank.
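Since PageRank comes up as the canonical everyday graph example, here is a minimal power-iteration sketch of it. The tiny three-node graph is made up for illustration:

```python
# Minimal PageRank by power iteration over an adjacency list.
# The graph data is invented for illustration.
def pagerank(graph, damping=0.85, iterations=50):
    # graph: {node: [nodes it links to]}
    nodes = list(graph)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for n, links in graph.items():
            if links:
                # Split this node's rank among its outgoing links.
                share = damping * rank[n] / len(links)
                for m in links:
                    new[m] += share
            else:
                # Dangling node: distribute its rank evenly.
                for m in nodes:
                    new[m] += damping * rank[n] / len(nodes)
        rank = new
    return rank

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
rank = pagerank(graph)
```

The same "importance flows along edges" intuition transfers to product data: a component referenced by many assemblies and suppliers is, in graph terms, a high-rank node.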

I’m getting lots of graph-related links from big data publications these days. If you feel uncomfortable with the term big data (it trails too much hype these days), just think about data complexity beyond the level we can handle today with relational databases and Excel spreadsheets. Earlier tonight, on my flight from San Francisco to Boston, I read the InfoWorld article – Graph analysis will make big data even bigger. Here is a passage I specifically liked:

Social networks transformed the Internet into a complex web of relationships; social graph analysis offers a way to understand those relationships. When it comes to social graph analysis, that task can be simple if you’re only interested in a few individuals, only investigating one type of connection among them, and only mining one static pool of behavioral data associated with them. On the other hand, if you’re trying to assess the shifting behavioral patterns of every possible relationship among every person, place, and thing on the planet, plus all the things they might be saying to each other, dynamically and in real time with perfect predictions about what they might do at every point in the future … you’re living in a science-fiction fantasy world.

The sci-fi fantasy is coming into our everyday lives in many places. You may not think about it, but it is around you in Facebook, Yelp, Twitter, LinkedIn and many other applications. However, that is not true when you come to your engineering office. In many situations you are surrounded by applications developed 10-15 years ago.

The enterprise software world is waking up to the potential of graph analysis in a wide range of applications. It looks like a promising segment. These days it is very often sold under the "noSQL" umbrella. Oracle’s noSQL announcement a few days ago is just one example.

What is my conclusion? Analyzing complexity is a big task. Nobody will disagree with the importance of such analysis. However, the biggest challenge is to draw simple conclusions out of this complexity. You can get that easily from Facebook and Yelp pages as a recommendation for which restaurant to go to. Now think about simplifying a design or supply chain process. In the future, PLM applications will need to handle more complexity and more data, and do more analysis. This is the way to make applications smarter. In my view, graph models will come to solve product lifecycle problems we cannot even think about today. Just my thoughts…

Best, Oleg


The role of PLM in mass customization

August 21, 2013

Product customization is one of the trends changing the manufacturing landscape these days. Mass production was the mainstream for many manufacturing companies and industries. The availability of configurable products and manufactured end items was limited. One of the most visible examples of customization in production was Dell. However, even Dell had a very limited ability to configure products. The cost advantage was clear, and customers were drawn by the affordable prices of predefined configurations.

However, business is different these days. Customer demand for customization is growing. Technological changes, the internet, cloud solutions, supply chains and many other factors are re-shaping the manufacturing landscape. Earlier today, I was reading the Mashable article – Why Large-Scale Product Customization Is Finally Viable for Business. The writeup presents an interesting perspective on why mass customization is finally becoming affordable and desirable. Here is my favorite quote explaining customer demand for mass customization:

Consumers’ expectations are being shaped by their lives online. Customization plays a large and growing role in digital experiences, from Facebook to Pandora Internet radio to mobile applications like location-aware Google Maps.

One of the key elements of a mass customization solution is a customer-facing product configurator. Here is another interesting passage from the article:

Today’s customer-facing technologies are cheaper and easier to deploy than ever. The price (and time requirement) for developing customer-facing configurators has dropped significantly in the past few years. It’s a fraction of the cost even compared to a few years ago (think $50,000, down from $1 million). And new uses –- like embedding configurators within Facebook — make configurators more accessible (and more social).

The mass customization and product configurator topic made me think about the role of PLM in the manufacturing of configurable products. Traditional PLM dealt with customization via heavyweight engineer-to-order processes. In most cases, PLM companies partnered with companies making product configurators, or used ERP-based products. The most complicated part was integrating multiple systems into a single solution, which required lots of hard-wired code and data transformations. An integrated software pipeline of product configurator – ERP – PLM – ERP and back to the shop floor and shipping was hardly achievable for most manufacturing companies and PLM software vendors.
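At its core, a customer-facing configurator is a set of options plus compatibility rules plus a price rollup. The sketch below is a toy illustration; the product, options and the constraint are all invented:

```python
# Toy customer-facing configurator: option groups with prices,
# compatibility rules, and a price rollup. All product data and
# the rule below are hypothetical.
OPTIONS = {
    "frame": {"steel": 200.0, "carbon": 900.0},
    "wheels": {"alloy": 150.0, "carbon": 600.0},
    "brakes": {"rim": 50.0, "disc": 120.0},
}

# Hypothetical rule: a carbon frame cannot be paired with rim brakes.
RULES = [
    lambda c: not (c.get("frame") == "carbon" and c.get("brakes") == "rim"),
]

def validate(config):
    # Every choice must exist, and every rule must hold.
    for group, choice in config.items():
        if choice not in OPTIONS.get(group, {}):
            return False
    return all(rule(config) for rule in RULES)

def price(config):
    if not validate(config):
        raise ValueError("invalid configuration")
    return sum(OPTIONS[g][c] for g, c in config.items())
```

The hard part the post describes is not this logic – it is wiring the validated configuration into ERP, PLM and the shop floor so that what the customer clicked is what actually gets built.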

What is my conclusion? I think customization is the future for many manufacturers. The percentage of "configurable" and flexible manufacturing facilities will grow. It will include online configurators, flexible configuration platforms and integration with production and the supply chain. In my view, PLM should play a role in this modernization. It is a huge opportunity and a way to re-shape the future of manufacturing. Just my thoughts…

Best, Oleg


Who will take on PLM legacy data?

August 20, 2013

Legacy data is painful. Speak to anybody in the business of PDM/PLM implementation and they will tell you that importing existing (aka legacy) data is a complicated, time-consuming and, after all, very expensive task. It can easily cut into your implementation profits and increase project time. In the past, I blogged about the legacy data problem, the different types of legacy data and the main options for solving the problem. Navigate to this link to refresh your memory.

However, the legacy data problem is much bigger than PDM/PLM alone. The ZDNet article – Business intelligence, tackling legacy systems top priorities for CIOs – speaks about the fact that the "legacy data problem" is climbing to the top of CIO priorities these days. For them, legacy data is the information stuffed into file cabinets and Excel spreadsheets, or buried in antiquated data management systems the company built and/or acquired over the last two decades. These systems sit in corporate data centers or (sometimes) under employees’ desks. I found the following passage interesting:

Business intelligence projects are the top priority for government CIOs this year, followed by plans to strip out legacy systems. According to analyst Gartner, government IT organizations are expecting their budgets to have a modest compound growth rate of 1.3 percent through to the end of 2017, with increased spending on IT services, software and datacentres likely to be fuelled cuts in internal technology services, devices and telecoms services.

Manufacturing companies are interesting outfits when it comes to the legacy data problem. Years of CAD/PDM/PLM implementations, heavy customization and product modernization have created a unique system zoo. Maintaining these systems is very expensive. Making changes to existing implementations is complicated. It slows the business and blows up IT budgets. This is where "stripping out legacy systems" becomes a priority for IT.

What is my conclusion? The traditional PDM/PLM (and not only PDM/PLM) business practice is to come in with a new solution and try to ignore the existence of legacy data. Very often it becomes a problem for customers and (at best) for implementers and service providers. My hunch is that companies and products with a value proposition around legacy data can get some traction in CIOs’ corner offices. Systems that are able to handle legacy data can have a competitive advantage these days. Just my thoughts…

Best, Oleg

