How can CAD collaboration avoid competition with OneDrive?

April 15, 2014

web-services-recombination-cad-plm

Collaboration is the name of the game today for many vendors. CAD and PLM vendors are included. Cloud technology is opening many new capabilities to collaborate, and it is being captured by existing players and newcomers. Sometimes it happens so fast that it even creates internal competition. The Image and Data Manager article Is OneDrive for Business the SharePoint alternative? speaks about an interesting transformation happening these days around file collaboration within the Microsoft family of products. We knew about SharePoint's capabilities to collaborate and share content (files). However, the newborn child – OneDrive – is growing fast and can potentially capture some of the space occupied by SharePoint today. I liked the following passage explaining how OneDrive takes on SharePoint:

OneDrive has a very simple interface (one that has been simplified further with recent updates). So it’s easy to upload your files and share them. You can also sync to all your devices, desktop, tablet, smartphone, giving you direct access to your content when you are online or offline. You even have mobile apps for iOS, Android, Windows 8 and Windows RT.

OneDrive even has this cool feature that allows you to grab a file from your PC even if you haven’t uploaded it to OneDrive. You have to turn that feature on, but it’s pretty nice to have.

SharePoint’s interface is OK, but it’s the subject of much debate. It’s not very intuitive to use and requires a fair amount of planning and organizing to get it set up in a way that’s easy for people to understand. Getting access to SharePoint on mobile devices has been spotty at best. Access via mobile (tablet or smartphone) has improved a lot with SharePoint 2013, but for those on SharePoint 2010, the story is not so good.

What I learned from this article is that the file sharing and collaboration space is getting busy and competitive. Which brings me back to the discussion about specialized CAD collaboration tools. It made me think about some strategies CAD collaboration tools can use in order to avoid frontal competition with OneDrive, Dropbox and other file sharing and sync tools.

The name of this game is "layers". Creating a layered architecture will allow CAD collaboration tools to store data using OneDrive (or another storage and sharing service) and, at the same time, enhance it with a data layer providing rich access to CAD-specific content, viewers and other CAD data relationships. Think about it in a similar way to how Google organizes information from the web for you. Google does not necessarily store the data – it lives on websites and other locations. Nevertheless, Google gives you easy access to this information via different services. The basic service is search. Enhanced services can provide specific vertical slices of information (think about Google Flights as an example).
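To make the idea more concrete, here is a minimal sketch of such layering in Python. The storage backend, the class names and the OneDrive-style URL are all hypothetical – the point is only that the CAD-specific layer keeps its own metadata and relationships while delegating raw file storage to a generic sync service:

```python
from abc import ABC, abstractmethod
from typing import Dict, List

class FileStorage(ABC):
    """Horizontal layer: generic file storage/sync (OneDrive, Dropbox, ...)."""
    @abstractmethod
    def upload(self, path: str, content: bytes) -> str:
        """Store the file and return a shareable URL."""

class OneDriveStorage(FileStorage):
    def upload(self, path: str, content: bytes) -> str:
        # A real implementation would delegate to the vendor's sync/share API.
        return f"https://onedrive.example.com/{path}"

class CADCollaborationLayer:
    """Vertical layer: CAD-specific data on top of whatever storage is used."""
    def __init__(self, storage: FileStorage):
        self.storage = storage
        self.index: Dict[str, dict] = {}  # CAD metadata lives here, not in the file store

    def check_in(self, path: str, content: bytes, used_in: List[str]) -> None:
        url = self.storage.upload(path, content)
        # Enrich the plain file with CAD-specific relationships (assemblies, revisions, ...)
        self.index[path] = {"url": url, "used_in": used_in, "revision": "A"}

layer = CADCollaborationLayer(OneDriveStorage())
layer.check_in("bracket.sldprt", b"...", used_in=["pump-assembly.sldasm"])
print(layer.index["bracket.sldprt"]["url"])
```

The storage service could be swapped for Dropbox or anything else without touching the CAD data layer – which is exactly the kind of recombination I'm talking about below.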

What is my conclusion? Separating vertical applications from horizontal services is getting more and more important. It was true in the past for building the right enterprise architecture, but it is getting even more important in the era of cloud services. To be successful, cloud vendors will have to learn how to recombine and reuse technologies provided by different players. File sharing and synchronization is a very good example to start with. For CAD vendors it means learning how to share data on OneDrive or Dropbox, but at the same time providing a vertical experience specific to CAD content. Just my thoughts…

Best, Oleg


PLM Technology vs Vertical Industries: Wrong balance?

April 14, 2014

plm-industries

Let’s talk about PLM technologies. Err… PLM is not a technology. Even more, PLM is not even a product. So, what is it? A business strategy? Product development politics? For the sake of this conversation let’s leave these debates out. I want to speak about the PLM technologies that allow you to manage product data, CAD files, bills of materials, a rich set of related information, as well as the processes around it. This technology came to us about 20-25 years ago, first as a very hard-coded set of tools. You had to build it literally differently for every customer. So, it served only large customers that were able to pay for software, infrastructure and implementation. Later on, PDM/PLM turned into a software toolkit. The next step in PDM/PLM technology evolution was called flexible data modeling. The first flexible (dynamic) PLM data modeling tools were released back in 1995-2000 and… not much has changed since then.

So, what has happened since that time? PLM vendors went on to develop out-of-the-box and vertical industry solutions in a massive way. David Linthicum’s article Salesforce.com officially is out of ideas reminded me about the old joke comparing the technology vs. industry play. Here is the passage:

When you run out of new ways to provide innovative technology, you go vertical. That was the running joke among CTOs back in the day. It usually meant the market had reached the saturation point and you could not find new growth

I found this message very relevant to what is happening in the PLM industry. PLM vendors are trying to compete by providing more comprehensive sets of data models, best practices and process templates. By doing so, vendors want to reduce the TCO of PLM implementations. It actually brings success, and many customers are using these solutions as a starting point for their PLM implementation.

So, where is the problem? In most situations, PLM is still a costly and expensive implementation. Services may take up to 50% of the cost. Here is the issue – core PLM data and process modeling technology hasn’t changed a lot in the last 10-15 years. Data models, CAD file management, product structure, process orchestration – all these things are evolving, but very little. The fundamental capabilities are the same. And it is very expensive to develop solutions using these technologies.

You may ask me about cloud technologies. Cloud is the answer, but only partially. It solves problems related to infrastructure, deployment and updates. Cloud provides clear benefits here. However, from the implementation technology standpoint, it is very similar to what non-cloud solutions can offer. Another interesting passage from the Infoworld cloud computing article explains the problem new SaaS/cloud products can experience when trying to displace existing vendors:

So many companies have tried this approach — many times — but most found limited success. I can’t help but think the same will occur here. Salesforce will soon discover that when you get into vertical industries, the existing foundation of industry-specific applications is difficult to displace. Although Salesforce can always play the SaaS card, most of those industry-specific providers have already moved to SaaS or are in the midst of such a move. That means SaaS won’t be the key differentiator it was when Salesforce first provided its powerful sales automation service more than a decade ago.

What is my conclusion? Efficiency and cost. These are the two most important things to make a PLM implementation successful. So, the technology must be improved. Data and model capturing tools, flexibility and ease of use – everything must be more efficient to support the future of manufacturing processes. How to do so? This is a good topic to discuss with technology leaders and strategists. I’m going to attend COFES 2014 in 10 days. I hope to find some answers there and share them with you.

Best, Oleg


How to forget ODBC and Rethink PLM Data Openness?

April 12, 2014

plm-link-data

Open vs. closed is one of the most popular topics in the engineering software ecosystem (but not only there). I’ve been discussing it many times – Open vs. Closed PLM Debates, PLM and New Openness, Closed Thoughts About PLM openness and a few more. There is a clear trend towards openness these days and, in my view, it is hard to find a PDM/PLM company that will defend a closed approach over openness.

However, the definition of openness can be quite different. What’s more, the implementation of openness can be different too. Speaking from the engineering standpoint, the devil is in the details. So, I wanted to speak about some aspects of "openness" and how it might be implemented in the PDM/PLM world. For a very long period of time, data in the PDM/PLM world was completely dependent on Relational Database Management Systems (RDBMS). The time of proprietary databases and data files is finally over. So, you could think data is peacefully located in an RDBMS where it can be easily accessed and exchanged. Not so fast… There are two main constraints preventing data openness in an RDBMS: data access technology and data schema. You need to support both in order to have access to the data. An alternative would be to use published APIs, which provide you an access layer. In most cases, APIs eliminate the need to know the data model, but in a nutshell they are not very different from data access technology.

For many years ODBC has remained one of the most widely adopted database access technologies. I’m using the name ODBC, but it can also refer to a variety of similar data access technologies – JDBC, OLE DB, ADO.NET, JDO, etc. This is where things went wrong with data access and openness. The power and success of ODBC came from the use of DSNs (Data Source Names) as the identification of data access. All ODBC-compliant applications leveraged the fact that other developers had implemented RDBMS-specific libraries – ODBC drivers. So, users don’t need to think about Oracle, SQL Server, MySQL, etc. A user just needs to connect to a DSN.

The distinct development and end-user models of ODBC ensured a massive ecosystem of ODBC-compliant applications and database connectivity drivers. Unfortunately, RDBMS vendors — the same ones that collectively created the SQL CLI and inspired its evolution into ODBC — also sought to undermine its inherent RDBMS agnosticism. The problem it created lies in the huge amount of data-driven applications relying on ODBC data access and claiming data openness as the ability to access, retrieve and (sometimes) update data in the RDBMS. Hidden behind a DSN, databases turned into data silos. Data extracted from a specific database was dead and lost without the context of that database. So-called "openness" became a simple "data sync pipe". What’s more, each DSN remains separate. So, if you have a few databases you are out of luck accessing the data in a logical way. Applications are pumping data from one database to another, mostly trying to synchronize data between different databases. The amount of duplicated and triplicated data is skyrocketing.
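For illustration, here is a minimal sketch of that classic DSN-based access pattern, assuming Python with the pyodbc library and a hypothetical DSN and table schema. Notice that the application is shielded from the specific RDBMS, but it still has to know the vendor's schema, and the rows it pulls out carry no context once they leave the database:

```python
import pyodbc  # assumes an installed ODBC driver and a configured DSN

# The application only needs the DSN name -- the driver hides whether this
# is Oracle, SQL Server, MySQL, etc.
conn = pyodbc.connect("DSN=PLM_DB;UID=reader;PWD=secret")
cursor = conn.cursor()

# But you still need to know this particular system's tables and columns,
# and the extracted rows lose their context once copied somewhere else.
cursor.execute(
    "SELECT item_id, revision, title FROM items WHERE item_type = ?", "Part"
)
for item_id, revision, title in cursor.fetchall():
    print(item_id, revision, title)

conn.close()
```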

So, what is the alternative? We need to stop "syncing data" and instead start "linking data". Think about a simple web analogy. If you want to reference my blog article, you don’t need to copy it to your blog. In most cases you can just create a link to my blog post and its URL address. Now, let’s bring some more specific technologies into this powerful analogy. Maybe you are familiar with the semantic web and linked data. If not, this is the time! Start here and here.

There are fundamental differences between the old ODBC world and the new way of linking data. You can get some fundamentals here and by exploring W3C data activity. I can summarize three main principles of linking data – 1/ use of hyperlinks to the source of data; 2/ separation of data abstraction from data access APIs; 3/ conceptual data modeling instead of application-level data modeling. So, instead of implementing ODBC drivers and APIs to access data, each data provider (think about a PLM system, for the moment) will implement a linked data web abstraction layer. This abstraction layer will allow other applications to discover data and run queries to get results, or interlink data with data located in other systems. Linked Data is a fast-developing ecosystem. You can learn more here.
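As a rough illustration of the difference, here is a small linked data sketch using Python and the rdflib library. The namespaces, item identifiers and schema terms are made up for the example – the point is that records are identified by URIs and linked across systems instead of being copied between them:

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

# Hypothetical namespaces: each system (PLM, ERP, ...) exposes its records
# as URIs instead of rows hidden behind a DSN.
PLM = Namespace("http://plm.example.com/items/")
ERP = Namespace("http://erp.example.com/parts/")
SCHEMA = Namespace("http://example.com/schema/")

g = Graph()
g.add((PLM["bracket-100"], RDF.type, SCHEMA.Part))
g.add((PLM["bracket-100"], SCHEMA.title, Literal("Mounting bracket")))
# Linking, not syncing: the PLM item simply points at the ERP record by URI.
g.add((PLM["bracket-100"], SCHEMA.sameItemAs, ERP["P-4711"]))

# Query across the linked data without knowing either system's internal schema.
q = """
SELECT ?item ?title ?erp WHERE {
    ?item a <http://example.com/schema/Part> ;
          <http://example.com/schema/title> ?title ;
          <http://example.com/schema/sameItemAs> ?erp .
}
"""
for row in g.query(q):
    print(row.item, row.title, row.erp)
```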

What is my conclusion? We are coming to the point where we need to rethink the way we are accessing data in business systems and start building a better abstraction level that will allow us to stitch data together via linkage as opposed to synchronization. The world wide web and the internet are the ultimate success stories of open standard adoption and implementation techniques. Applying that will simplify access to data and build the value of data connection for the enterprise. Just my thoughts…

Best, Oleg


Will PLM Vendors Jump into Microsoft Cloud Window in Europe?

April 11, 2014

european-plm-cloud

Cloud is raising lots of controversy in Europe. While manufacturing companies in the U.S. are generally more open towards new tech, their European counterparts are much more conservative. Many of my industry colleagues in Germany, France, Switzerland and other EU countries can probably confirm that. Europe is coming to cloud systems, but much more slowly. I’ve been posting about cloud implications and constraints in Europe. Catch up on my thoughts here – Will Europe adopt cloud PLM? and here – PLM cloud and European data protection reforms. The main cloud concerns raised by European customers are data, privacy and country-specific regulation. With companies located in different places across the EU, it can be a challenge.

Earlier today, I heard some good news about cloud proliferation in Europe coming from Microsoft. The TechCrunch article – Microsoft’s Enterprise Cloud Services Get A Privacy Thumbs Up From Europe’s Data Protection Authorities – speaks about the fact that Microsoft’s enterprise cloud services meet European data privacy standards. Here is a passage that sheds some light on the details and what it means:

But today comes a piece of good news for Redmond: the data protection authorities (DPAs) of all 28 European member states have decided that Microsoft’s enterprise cloud services meet its standards for privacy. This makes Microsoft Azure, Office 365, Microsoft Dynamics CRM and Windows Intune the first services to get such approval. The privacy decision was made by the “Article 29 Data Protection Working Party,” which notes that this will mean that Microsoft will not have to seek approval of individual DPAs on enterprise cloud contracts. In its letter to Microsoft (embedded below), chair Isabelle Falque-Pierrotin writes, “The MS Agreement, as it will be modified by Microsoft, will be in line with Standard Contractual Clause 2010/87/EU… In practice, this will reduce the number of national authorizations required to allow the international transfer of data (depending on the national legislation).”

The majority of PDM/PLM providers are friendly with the Microsoft tech stack. Some of them rely completely on MS SQL Server and other Microsoft technologies. Most of them support SharePoint. Now, these PLM vendors have an additional incentive to stay with Microsoft technologies for the cloud. It can also be good news for manufacturing companies that have already deployed PDM/PLM solutions on top of Microsoft technologies and developed custom solutions.

What is my conclusion? The technological landscape these days is very dynamic. The time when one platform worked for everybody is over. In light of technological disruption and future challenges, tech giants will be using different strategies in order to stay relevant for customers. Will European cloud regulation keep PDM/PLM players with MS Azure and other Microsoft technologies as opposed to alternative cloud technology stacks? How long will it take other players to reach the same level of compliance? These are good questions to ask vendors and service providers. Just my thoughts…

Best, Oleg


How can PLM join the semantic enterprise graph?

April 10, 2014

plm-and-enterprise-graph

Connectivity is key these days and graphs are playing a key role in the development of our connectivity. It doesn’t matter what you connect – people, information, devices. Graphs are fascinating things. Actually, I came to the conclusion that we live in an era of fast graph development. More and more things around us are getting “connected”.

It is almost two years since I first posted about Why PLM need to learn about Google Knowledge Graph. The story of the Knowledge Graph is gaining more power every day. GKG is growing. It represents “things” in a knowledge base describing lots of topics – music, books, media, films, locations, businesses and many others. Part of the Google Knowledge Graph is fueled by Freebase – a large collaborative database of structured data. Originally Freebase was developed by Metaweb and acquired by Google in 2010. It is still not completely clear how the Google Knowledge Graph is built. You can read some investigations here. Nevertheless, it is hard to undervalue the power of the Knowledge Graph.

Another well-known and publicly developed graph is the Facebook social graph. Last year I posted – Why PLM should pay attention to Facebook Graph Search. The Facebook graph represents structured information captured from Facebook accounts. It allows you to run quite interesting and powerful queries (also known as Facebook Graph Search).

In my opinion, we are just at the beginning of future graph discovery and expanded information connectivity. It won’t stop at social networks and the public web. I can see graphs proliferating into the enterprise and creating lots of valuable opportunities related to information connectivity and efficient query processing. The Semanticweb.com article Let Enterprise Graph Tell You A Story speaks about the enterprise as a set of Facebook pages. It explains how we can build a graph story of enterprise communication, collaboration, people’s activities, related data and other things. Here is my favorite passage from the article:

Wallace relies on Hadoop and graph database technology, with network data represented as a property graph. “Property graphs are utterly, totally extensible and flexible,” she said, and “the system gets smarter as you add more data into it.” The enterprise social network data generates triple sets (that John Smith created X Document that was downloaded by Jane Doe) that get pocketed into the graph, for example, as is metadata extracted from relational databases. A general set of algorithms can find a user within the graph and calculate his or her engagement level – activities, reactions, eminence and so on. “We now have a Big Data service with a set of APIs so people can query the enterprise graph,” she set, and then run analytics on those results that can drive applications.

I found this aspect of graph development very inspiring. To collect enterprise information into a graph database and run a diverse set of queries can be an interesting thing. If I think about PLM as a technological and business approach, the value of a graph connecting different parts of information about products and activities located in different enterprise systems can be huge. These days, PLM vendors and manufacturing companies are using diverse strategies to manage this information – centralized databases, master data management, enterprise search and others. The graph data approach can be an interesting option, which will make the enterprise look like the web we all know today.
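As a toy illustration (not any vendor's implementation), here is how such enterprise triples could be represented as a property graph in Python with networkx and queried for a simple activity trail. All names and relations are hypothetical:

```python
import networkx as nx

# A tiny "enterprise graph": people, documents and parts as nodes,
# activities as typed edges (the triples mentioned in the quote).
G = nx.MultiDiGraph()
G.add_node("john.smith", kind="person")
G.add_node("jane.doe", kind="person")
G.add_node("DOC-123", kind="document", title="Pump assembly spec")
G.add_node("PART-456", kind="part")

G.add_edge("john.smith", "DOC-123", relation="created")
G.add_edge("jane.doe", "DOC-123", relation="downloaded")
G.add_edge("DOC-123", "PART-456", relation="describes")

# A simple engagement-style query: everything Jane touched, and the
# parts those documents describe.
for _, doc, data in G.out_edges("jane.doe", data=True):
    parts = [t for _, t, d in G.out_edges(doc, data=True) if d["relation"] == "describes"]
    print(f"jane.doe {data['relation']} {doc}, which describes {parts}")
```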

What is my conclusion? The growing amount of information in enterprise organizations will change existing information approaches. It doesn’t mean all existing technologies will change overnight. However, new complementary techniques will be developed to discover and use information in new ways. Graph is clearly going to play a big role. PLM strategists, developers and managers should take note. Just my thoughts…

Best, Oleg

picture credit semanticweb.com


Why is it so hard to break PLM into components?

April 9, 2014

plm-componentizing

Product Lifecycle Management is not software. It is a business strategy and approach. One of my blog readers mentioned that in a discussion a few days ago. Nevertheless, manufacturing companies usually talk about PLM systems and platforms as something solid and unbreakable. You can see the same picture when looking at PLM online marketing materials and brochures. Despite recent changes in broad PLM acceptance and value proposition, companies still see PLM as software mostly for the engineering domain, or driven by engineering IT. One of the dreams many PLM vendors have developed over the last decade is how to reach C-level management such as the CIO and engineering executives. In other words, how to reach ERP’s level of acceptance and awareness.

Earlier today, my attention was caught by a Toolbox.com article about modern ERP trends. Navigate to read ERP Trends: Shifting from Big ERP Systems to Componentized ERP Environments. Cloud is changing the face of ERP. The technology is breaking ERP into pieces. One of the results – two-tier ERP configurations. Here is the explanation I captured from the article.

Because of the coinciding innovations in cloud technology, instead of deploying and implementing traditional ERP infrastructure, organizations started adopting a two-tier, or hybrid, ERP model. Two-tier ERP is a method of integrating multiple ERP systems simultaneously. For instance, an organization may run a legacy ERP system at the corporate level while running a separate ERP system or systems, such as cloud ERP, at a subsidiary or division level for back-office processes that have different requirements. To facilitate the adoption of the two-tier methodology, vendors increasingly opened core databases and application programming interfaces and provided customization tools, thus spurring the advent of self-contained, functional ERP components or modules.

So, what does it mean for existing and future PLM strategies and products? More specifically, it made me think about the possibility of breaking large and heavy PLM platforms into sets of reusable components. The ERP componentization example speaks about splitting an ERP system into modules such as supply chain, financials, management and human resources. So what could a potential PLM split look like? I can see two possible ways here – business process and lifecycle. The first one is something we can probably already see a lot of in existing PLM platforms: requirements management, design collaboration, change management, NPI, etc. I’ve been thinking about lifecycle as an alternative to the traditional business-process-oriented approach. The lifecycle approach means developing applications to serve people in their everyday tasks based on the maturity of the product in development or service. Think about a manufacturing assembly line. Different tools and operations are applied to the manufactured product to bring it to life. Now think about PLM and software tools. PLM components would be used to create the product (actually product data and related information). A rough sketch of these two ways to split is below.
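Purely as a thought experiment, here is a tiny Python sketch contrasting the two ways of splitting – process-oriented components vs. lifecycle-oriented apps – both plugged into a shared data foundation. All class and method names are hypothetical:

```python
from abc import ABC, abstractmethod

class ProductDataStore(ABC):
    """The shared data foundation every component plugs into."""
    @abstractmethod
    def get_item(self, item_id: str) -> dict: ...
    @abstractmethod
    def save_item(self, item: dict) -> None: ...

# Process-oriented split: one component per business process.
class ChangeManagement:
    def __init__(self, store: ProductDataStore):
        self.store = store

    def open_eco(self, item_id: str, reason: str) -> dict:
        item = self.store.get_item(item_id)
        return {"eco_for": item["id"], "reason": reason, "state": "open"}

# Lifecycle-oriented split: one app per maturity stage of the product.
class ConceptApp:
    def __init__(self, store: ProductDataStore):
        self.store = store

    def capture_idea(self, title: str) -> None:
        self.store.save_item({"id": title, "stage": "concept"})
```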

The Toolbox article also speaks about the difficulties of the componentized approach. The main one is a potential growth of TCO because of the need to integrate data coming from different modules. Here is the passage I especially liked:

The data from the second-tier cloud ERP or modules typically require normalization to integrate with the legacy ERP system at the corporate level. Although direct cost is associated with master data management to ensure consistency and no redundancy, by extending the life of the legacy system, the intention is to reduce the total cost of ownership (TCO) while meeting additional needs for flexibility and functionality. However, the shorter duration of implementing and deploying a two-tier ERP model can actually lead to increased TCO if the indirect costs, such as training, hiring staff, and vendor support, are not taken into to account, as well.

The same problem will arise if we try to break PLM into components. With no solid data foundation, the ability to bring in and integrate various PLM components will be questionable. The integration cost will skyrocket. Compatibility between PLM component versions will make it even harder. Nevertheless, I can see growing business requirements, customer demand and shorter lifecycles for software products as things that will drive future PLM technological changes. Componentization will be one of them.

What is my conclusion? Breaking large and heavy PLM suites into configurable and flexible components is an interesting opportunity to satisfy today’s dynamic business reality. However, two fundamental technologies are required to make it happen – a scalable open data platform and reliable integration technologies. Just my thoughts…

Best, Oleg


PLM Best Practices and Henry Ford Mass Production System

April 6, 2014

henry-ford-assembly-line

If you are in the PLM business, I’m sure you are familiar with the term "best practices". The term is widely used to explain how a PLM system can be deployed, how to manage data and how to organize and optimize product development processes. So, where are the roots of PLM best practices and why do PLM vendors like them so much? Remember, the original PLM (and even PDM) systems started as glorified data management toolkits with elements of CAD and ERP integration. To get such a system into production was very expensive and it required a lot of time and implementation services. The reason is simple – every manufacturing company is different. It takes time for a service provider to understand the company landscape, processes, data requirements and legacy systems and suggest a solution. Put a heavy price tag next to this activity. You can think about this process as something similar to organizing a mass production assembly line. It is costly and complicated. Once you’ve got it done, your objective is simple – run it at the largest possible quantity without re-configuration (which would cost you money, again). The same happened with the first large PLM implementations.

The invention of "best practices" helped to figure out how to move from a heavy and complicated PLM assembly line to more configurable and flexible mechanisms of PLM deployment. Technologically, the toolkit approach was the underlying product foundation. PLM companies, and especially service providers and PLM consultants, liked the approach. To create OOTB (out-of-the-box) pre-configured environments was relatively easy based on the practices gathered from existing large customers. However, to get it to the field and implement it wasn’t so simple. Marketing and sales used OOTB environments to demonstrate and make sales. However, implementations and fine-tuning failed to apply them after that. The implementation devil was in the details and service teams were required to bring it to production. Similar to a manufacturing mass production environment, customization and services were the straightforward answer to the problem of product and requirement diversity.

As we know from the history of manufacturing, mass customization won and left the mass production system in the dust. What was a clear innovation 100 years ago was replaced by new forms of manufacturing, customization and flexible manufacturing units. I believe this is still a very hot topic in the industry and at every manufacturing company. The diversity of product requirements is skyrocketing and product lifecycles are getting even shorter. To produce a PLM system that will fit this type of environment is probably one of the most important innovations that can happen in engineering and manufacturing software these days.

What is my conclusion? I think software companies can learn something from the history of manufacturing companies. The move from mass production to mass customization is one of those lessons. PLM software made a turn from complicated preconfigured assembly lines to expensive data management toolkits that require services. Manufacturing is getting different these days. The next step can hardly be achieved by pure technology or process organization. My hunch is it will be a hybrid of new data management technologies empowered by crowdsourcing and customer innovation. Just my thoughts…

Best, Oleg

Photo source.


Why is PLM stuck in PDM?

April 5, 2014

plm-stuck-pdm-round-square

I’ve been following the CIMdata PLM market industry forum earlier this week on twitter. If you are on twitter, navigate here or search for the #PLM4UM hashtag. The agenda of the PLM forum is here. The following session discussed one of my favorite topics – PDM vs. PLM: "PLM: Well Beyond Just PDM" by Peter Bilello. This passage explains what the session is about:

CIMdata’s research reveals that leading industrial companies are looking to expand beyond PDM functionality to truly enable a more complete PLM strategy. This becomes even more important in a circular economy. In this presentation, CIMdata will discuss which areas are most important, and what opportunities they create for PLM solution and service providers.

My attention was caught by the following tweets coming from this session:

According to CIMdata, leading Mfrs are now looking to move beyond PDM. #PLM4um
— ScottClemmons (@ScottClemmons) link to tweet.

Peter B / CIMdata explains that it’s hard to find a ‘real’ end-to-end #PLM implementation hat works #plm4um
— Marc Lind (@MarcL_) link to tweet.

It made me think about why, after so many years of PLM implementations, most vendors are still solving mostly PDM problems for customers, and why it is hard to move on into broad downstream and upstream adoption of PLM beyond CAD data management functions. Here are my four points explaining in a nutshell why I think "PLM is stuck in PDM".

1- Focus on design and CAD.

Most PLM vendors historically came from a CAD-related domain. Therefore, the PLM business for them was an expansion of the CAD, design and engineering business. As a result, use cases, business needs and customer focus were heavily influenced by the design domain. The result – PDM focus was a clear priority.

2- PLM is a glorified data management toolkit

The initial focus of many PLM systems was to provide a flexible data management system with an advanced set of integration and workflow capabilities. There are many reasons for that – functionality, competition, enterprise organization politics. Flexibility was considered one of the competitive advantages PLM can provide to satisfy the diversity of customer requirements. It resulted in complicated deployments, expensive services and a high rate of implementation failures.

3- Poor integration with ERP and other enterprise systems

PLM sits on the bridge between engineering and manufacturing. Therefore, in order to be successful, integration with ERP systems is mandatory. However, PLM-ERP integration is never easy (even these days), which puts up a barrier to deploying a PLM system beyond the engineering department.

4- CAD oriented business model

Because of CAD and design roots, PLM sales were always heavily influenced by CAD sales. Most PLM systems initially came to market as extensions of CAD/PDM packages. With an unclear business model and complicated VAR and service company support, mainstream PLM deployment always focused on how not to slow down CAD sales.

What is my conclusion? Heavy CAD roots and a traditional orientation on engineering requirements keep existing PLM systems from expanding beyond PDM for midsize manufacturing companies. The success rate of large enterprise PLM is higher. But it comes at a high price, including heavy customization and service offerings. Just my thoughts…

Best, Oleg


How to make PLM UI less terrible?

April 3, 2014

handwritten-BOM

I’m coming back to this topic again – user interface. These days you hear about it as user experience (UX). UX is a more complicated thing and includes lots of factors and aspects. So, I’d like to speak first about how the UI looks. Back in the days when I was developing and demonstrating PDM user interfaces, the worst thing was to get in line after somebody presenting CAD and visualization software. Their UIs always look good. It was obvious, since they can show all these cars, phones and airplanes… In contrast, a PDM user interface is all about tables, lists and values. The nature of a PDM system makes this type of UI boring and uninteresting. For example, take a look at the photo above. This is a handwritten BOM of a locomotive made almost 100 years ago (image credit). It doesn’t look nice, but it is an absolutely "must have" document in manufacturing.

To change a UX concept is a complex thing. It requires making a lot of changes in the way people perform their tasks. For an engineering, manufacturing and enterprise organization that is a big thing. However, what about making a change just in the way a PDM / PLM UI looks?

The following image by darkhorseanalytics caught my attention with its presentation of how to make a table look less terrible. Take a look at the power of "less is more". It comes as a sequence: remove colors, remove gridlines, remove fills, remove the border, remove bolding, left-align text, right-align numbers, align titles with data, resize columns to the data, put whitespace to work, use consistent precision, round the numbers, remove repetition, no more Calibri font, add back emphasis.

So, here is the table before:

table-nice-ui-before

… and here is the table with UI improvements.

table-nice-ui-after

Not sure about you, but I like the comparison and the result.

It made me think about how many places in a PDM UI actually require clean table presentation. Think about drawing reports, bills of materials and many other things. Making them look clean and fresh will improve the visual impression of a PDM product.
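Just to illustrate how a few of those steps could be applied programmatically, here is a small sketch using pandas (assuming a recent pandas version with the Styler API): hide the repeated index, right-align numbers, left-align text and use consistent precision. The data is made up:

```python
import pandas as pd

df = pd.DataFrame({
    "Item": ["Frame", "Wheel", "Chain"],
    "Qty": [1, 2, 1],
    "Cost share": [0.5652, 0.3104, 0.1244],
})

styled = (
    df.style
      .hide(axis="index")                                        # remove repetition
      .format({"Cost share": "{:.1%}"})                          # consistent precision / rounding
      .set_properties(subset=["Qty", "Cost share"],
                      **{"text-align": "right"})                 # right-align numbers
      .set_properties(subset=["Item"], **{"text-align": "left"}) # left-align text
      .set_table_styles([{"selector": "th",
                          "props": [("border", "none")]}])       # remove the border
)
html = styled.to_html()   # render the cleaned-up table
```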

What is my conclusion? It is very hard to design a nice and clean UI. Every company developing software applications these days must focus on how to make the UI less terrible. The ugly and annoying enterprise software UI is a thing of the past. The new UI will be designed with a different state of mind, thinking about modern web and mobile user interfaces and experience. Just my thoughts…

Best, Oleg


Bill of Materials (BOM) Management: Data, Lifecycle, Process

April 2, 2014

BOM-data-lifecycle-process

In my recent post about bills of materials – Bill of Materials (BOM): process or technology challenge? – I touched on a variety of topics related to BOM organization – multiple BOMs and the need to manage BOMs located in different systems. My main question in that post was how to make working with multiple BOMs easier. The problem is tough and the answer is not easy and straightforward. While I was googling the internet to find what others think about this problem, my attention was caught by a TeamCenter PLM blog post – Bill of Material Lifecycle. This post presents multiple BOMs as a result of changes in the product lifecycle – design, manufacturing, service. Here is a passage I captured:

It is interesting to discuss on BOM lifecycle and its evolution from conceptual stage to full fledge manufactured product to maintenance. In this blog I will explain through life cycle of BOM across the product life cycle done as in house development. The BOM lifecycle can varies based on overall process of company for example some company might only manufacture as order hence they As Build design BOM and they directly CREATE Manufacturing BOM from it.

Altogether, it made me think that the concepts of data, lifecycle and process can often create confusion and overlap. I want to clarify these concepts and present how they can be combined to manage a single BOM in the organization.

1- Data

Data is the most fundamental part of a bill of materials. It is combined from data about the product, assemblies, parts and the relationships between them. Fundamentally, assemblies and components are connected together to form the resulting data set representing a product. This data set can be presented in many forms – tabular, hierarchical and many others (e.g. a graph). Data about parts leads us to the place where information about products, assemblies, components, suppliers and manufacturers is managed. This information can reside in one of the following systems – CAD, PDM, PLM, ERP, SCM and others.
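Here is a minimal sketch of such a BOM data set in Python – hypothetical part numbers and fields – showing how the same data can be walked as a hierarchy or flattened into a table:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Part:
    number: str
    title: str
    source_system: str   # e.g. "CAD", "PDM", "ERP" -- where the master record lives

@dataclass
class BOMLine:
    part: Part
    quantity: float
    children: List["BOMLine"] = field(default_factory=list)   # hierarchical form

def flatten(line: BOMLine, level: int = 0):
    """Walk the hierarchy and emit a tabular view of the same data set."""
    yield level, line.part.number, line.quantity
    for child in line.children:
        yield from flatten(child, level + 1)

wheel = BOMLine(Part("W-10", "Wheel", "CAD"), 4)
frame = BOMLine(Part("F-20", "Frame", "PDM"), 1)
bike = BOMLine(Part("B-01", "Bicycle", "PLM"), 1, [frame, wheel])
for level, number, qty in flatten(bike):
    print("  " * level, number, qty)
```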

2- Lifecycle

Lifecycle defines the difference between bills of materials of the same product associated with different product development periods (stages). Here is an example of some typical stages – concept, design, manufacturing, service. However, these stages are not the same for all companies and can reflect industry, specific business practices, regulation and many other aspects. It is very important to capture the relationships between bills of materials of the same product (assembly) in different lifecycle stages. A missing lifecycle stage connection can cause a loss of very important product lifecycle information and product development traceability. In some regulated industries such traceability is a mandatory requirement.
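Continuing the sketch from above, lifecycle can be modeled as revisions of the BOM per stage, with an explicit link back to the previous stage so the stage-to-stage relationship (and traceability) is preserved. Names and stages are illustrative only:

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class BOMRevision:
    product: str
    stage: str                      # "concept", "design", "manufacturing", "service"
    lines: Dict[str, float]         # part number -> quantity (flat view for brevity)
    derived_from: Optional["BOMRevision"] = None   # link to the previous stage

design = BOMRevision("Bicycle", "design", {"F-20": 1, "W-10": 4})
# The manufacturing BOM restructures the same product but keeps the link back,
# so the lifecycle connection is never lost.
manufacturing = BOMRevision("Bicycle", "manufacturing",
                            {"F-20": 1, "W-10": 4, "PAINT-01": 0.2},
                            derived_from=design)
print(manufacturing.derived_from.stage)   # -> "design"
```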

3- Process

Process is a set of activities that defines bill of materials data as well as its changes across the lifecycle. Sometimes the process can be very informal – saving an assembly and parts from a design system. That will produce design BOM data. However, with the complexity of product development and the specifics of an organization, some processes include changes of data and lifecycle stages as well as the involvement of people. If you think about an ECO process, it might change a few bills of materials, lifecycle stages, as well as product/part information.
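And a last illustrative fragment for the process side – a toy ECO object that touches BOM data, triggers a lifecycle transition and involves people through approvals. The fields and the two-approval rule are assumptions made only for the example:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ECO:
    """A (very) simplified engineering change order touching data, lifecycle and people."""
    number: str
    affected_boms: List[str]           # which BOMs the change modifies (data)
    promote_to_stage: str              # lifecycle stage transition it triggers
    approvers: List[str] = field(default_factory=list)   # the people side of the process
    state: str = "draft"

    def approve(self, user: str) -> None:
        self.approvers.append(user)
        if len(self.approvers) >= 2:   # toy rule: two approvals release the change
            self.state = "released"

eco = ECO("ECO-042",
          affected_boms=["Bicycle/design", "Bicycle/manufacturing"],
          promote_to_stage="manufacturing")
eco.approve("alice")
eco.approve("bob")
print(eco.state)   # -> "released"
```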

What is my conclusion? The problem of bill of materials management must be separated into three distinct problems: 1/ how to create BOM data; 2/ how to control product development stages and differentiate the same BOM across the lifecycle; 3/ how to provide tools for managing the process and the people working with data and stages. Altogether, the problem is complicated. However, separated into these pieces, it can help you to build a strategy for your BOM management regardless of the tools you are going to use. Just my thoughts…

Best, Oleg

