How can you prevent PLM 2.0 silos?

April 30, 2009

The latest developments in Product Lifecycle Management have raised the level of PLM systems and their ability to support wider areas of product development, manufacturing and maintenance. Over the past few years we have seen many new capabilities, systems and technologies developed by leading PLM companies. So, where is my concern? The massive development of PLM technologies creates a new domain of enterprise software, which in turn creates a new island of data. With new and modern technologies, you get very reliable and centralized storage of product information that allows you to track every record of product data and related information. However, this development has led to another problem – product information has become siloed inside the enterprise.

So, how can you develop Product Lifecycle Management strategies that can prevent you from creating additional PLM silos?

1. Develop integration strategies for PLM, including the connection of various existing software (design, planning, manufacturing) to PLM. Without these integration strategies, you will quickly find yourself on an island that contains a huge amount of information you need to exchange with other systems and parties.

2. Plan Business Process Management initiatives that prevent the disintegration of information and processes. These BPM initiatives can connect processes on the organizational level and, as part of this connection, establish physical and logical links between PLM and other enterprise systems.

3. Invest in Business Intelligence and establish a way to access product information in PLM systems from outside systems. This business intelligence layer creates the business case for improving the availability of PLM information and for the general ‘de-siloing’ of data inside the organization.
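To make this third point a bit more concrete, here is a minimal sketch (in Python, not any vendor’s actual API) of how released product data pulled from a PLM system could be flattened into a simple, query-friendly table for BI and reporting tools. All field names and sample records are hypothetical.

```python
import csv
from typing import Dict, Iterable, List

# Hypothetical extract of released items pulled from a PLM system.
# Field names are illustrative only -- every PLM data model is different.
PLM_ITEMS = [
    {"item_id": "A-100", "rev": "B", "state": "Released",
     "attrs": {"material": "Aluminum", "mass_kg": 1.2}},
    {"item_id": "A-200", "rev": "A", "state": "In Work",
     "attrs": {"material": "Steel", "mass_kg": 3.4}},
]

def flatten_for_bi(items: Iterable[Dict]) -> List[Dict]:
    """Turn nested PLM records into flat rows a BI/reporting tool can consume."""
    rows = []
    for item in items:
        row = {"item_id": item["item_id"], "rev": item["rev"], "state": item["state"]}
        # Promote nested attributes to top-level columns.
        row.update(item.get("attrs", {}))
        rows.append(row)
    return rows

def export_csv(rows: List[Dict], path: str) -> None:
    """Write the flattened rows to CSV -- the lowest common denominator for BI tools."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=sorted({k for r in rows for k in r}))
        writer.writeheader()
        writer.writerows(rows)

if __name__ == "__main__":
    export_csv(flatten_for_bi(PLM_ITEMS), "plm_items_for_bi.csv")
```

In practice, the extraction would run against the PLM system’s own API or database views, but the de-siloing idea is the same: publish product data in a shape other tools can consume.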

So, what are my practical recommendations? Start by investing in the integration of your overall IT infrastructure with PLM. You need to figure out where all the pieces of product data live, how they are related and which processes they touch. Afterwards, you will need to see how to optimize those processes and align them with the available platforms and tools for business process management.


PLM Integration Gotchas

April 30, 2009

Today I would like to discuss PLM integrations. I see PLM as business software that relies heavily on integration. You need to integrate PLM with multiple design, manufacturing and business systems. Manufacturing companies use a huge set of systems as a result of their operational history, acquisitions and preferences for specific solutions.

In my opinion, as soon as you decide to get into the PLM story, you will find yourself in the business of integration services. Even after many years of implementations, I don’t think we have a consistent agreement about the PLM integration eco-system and the tools you need for integration. So, where do you start? How do you avoid typical mistakes in a topic as complex as PLM integration? Let’s define the areas of Product Lifecycle Management where integration is required today. In my view, there are two areas where a PLM product (or project) can face different integration use cases: (1) User Interactive Integrations; (2) Process Integrations.

User Interactive Integrations

These are integrations that span the scope of CAD (design) and CAE systems. Traditionally, these integrations rely mostly on the systems’ APIs. Additional usage of standard formats for data exchange (3DXML, JT, STEP, etc.) can simplify integration efforts. More advanced integrations in this space can use mashup technologies to mix data from multiple systems on the client, and component-based integration with portals and other tools. Capturing data from multiple desktop systems is a significant part of the integration effort, and usage of XML-based data mixing and transformation tools can reduce overall integration costs.
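As a small illustration of why neutral formats help here, the sketch below naively scans a STEP Part 21 payload for PRODUCT entities. It is a toy text scan, not a real STEP reader – production use would rely on a proper STEP toolkit – but it shows how an exchange format that is open and inspectable lowers the cost of desktop-level integrations.

```python
import re

# A tiny fragment of an ISO 10303-21 (STEP Part 21) file, just for illustration.
SAMPLE_STEP = """
#10=PRODUCT('BRACKET-001','Mounting bracket','',(#20));
#30=PRODUCT('BOLT-M6','Hex bolt M6','',(#20));
"""

# Naive pattern for PRODUCT entities: #id=PRODUCT('id','name',...);
PRODUCT_RE = re.compile(r"#\d+\s*=\s*PRODUCT\('([^']*)','([^']*)'")

def extract_products(step_text: str):
    """Pull (product id, name) pairs out of a STEP Part 21 payload.

    Deliberately naive: it only demonstrates that a neutral, text-based
    format can be inspected and exchanged without the authoring CAD system.
    """
    return PRODUCT_RE.findall(step_text)

if __name__ == "__main__":
    for pid, name in extract_products(SAMPLE_STEP):
        print(f"{pid}: {name}")
```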

Process Integrations

These are integration tools that focus on integrating multiple transactional systems. In addition, business process management is also part of this process integration game. In this area, you can find potential for heavy usage of integration middleware and ESBs (Enterprise Service Buses). Some of the integration techniques are based on proprietary tools developed inside PDM products. Additionally, you have the option to use an integration solution from large IT vendors (Microsoft SharePoint, BizTalk Server, IBM WebSphere, etc.). You will need to streamline your organization and system usage in order not to fall into complex integrations and expensive implementations in this area.
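The sketch below illustrates the core idea behind middleware and ESB-style process integration: a producer (PLM) publishes one canonical message, and each consuming system subscribes to it, instead of PLM calling every system directly. The bus, topics and payload fields are invented for illustration; real middleware adds persistence, transformation, routing and transactions on top of this pattern.

```python
from collections import defaultdict
from typing import Callable, Dict, List

class MiniBus:
    """A toy in-process 'service bus': producers publish canonical messages,
    and subscribers (ERP, MES, ...) register handlers by topic."""
    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._handlers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        # Deliver the same canonical message to every registered consumer.
        for handler in self._handlers[topic]:
            handler(message)

bus = MiniBus()
# Hypothetical consumers reacting to a PLM release event.
bus.subscribe("eco.released", lambda m: print("ERP: create item", m["item_id"]))
bus.subscribe("eco.released", lambda m: print("MES: update routing for", m["item_id"]))

# PLM publishes one canonical message instead of calling each system directly.
bus.publish("eco.released", {"eco": "ECO-1042", "item_id": "A-100", "rev": "C"})
```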

Additional Gotchas:

Peer-to-Peer vs. Middleware/SOA based integrations:

This is a well-known trade-off in the integration business. You can connect systems directly with custom-developed bridges, or you can use mediation software for the integration. I don’t think there is a one-size-fits-all decision in this case. On one hand, middleware solutions are naturally more expensive to introduce and support. On the other hand, a growing number of local bridges can result in a spaghetti-integration mess: with n systems, point-to-point integration can require on the order of n(n-1)/2 connections, while a hub needs only one adapter per system.

Promising Integration Technologies:

There are a few interesting integration technologies on the horizon that, in my view, are undervalued by PLM products. Mashup technologies can be used to mix information from different data sources and reuse this information for multiple purposes. There are different flavors of mashups: (1) client/browser mashups and (2) data (sometimes called enterprise) mashups. A few interesting products have been introduced in this space (i.e. JackBe, PopFly, Denodo and others).
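Here is a minimal sketch of what a data (enterprise) mashup does conceptually: merging records from two sources – say PDM and ERP – into one view keyed by part number. The sources and fields are hypothetical; commercial mashup products add connectors, caching, security and a presentation layer on top of this kind of join.

```python
# Two hypothetical data sources keyed by part number: one from PDM, one from ERP.
pdm_rows = {
    "A-100": {"description": "Mounting bracket", "rev": "B"},
    "A-200": {"description": "Cover plate", "rev": "A"},
}
erp_rows = {
    "A-100": {"on_hand": 320, "supplier": "Acme Metals"},
    "A-300": {"on_hand": 15, "supplier": "Delta Corp"},
}

def mashup(*sources):
    """Merge records from several sources into one view per part number."""
    merged = {}
    for source in sources:
        for part_no, fields in source.items():
            merged.setdefault(part_no, {}).update(fields)
    return merged

for part_no, view in sorted(mashup(pdm_rows, erp_rows).items()):
    print(part_no, view)
```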

Out-of-the-Box and Integration Services

You can achieve pretty good out-of-the-box desktop integration, especially if you can standardize on specific data formats and scenarios. Multi-CAD and other engineering systems are excellent examples. At the same time, process and more complex business integrations are hard to achieve with out-of-the-box solutions and require implementation in the context of specific customers and scenarios.

Open Source

Since the integration business depends heavily on services and customer-oriented implementation, I see potential for integration based on open source products. The advantage of open source products for customers is that they can modify them at any time, because they fully own the code. These integration solutions have a long life span, which can be a significant advantage in the total cost of the solution.

Integration as a Service

This class of new and emerging solutions allows you to use integration products available online. In my opinion, these solutions are mostly focused on the growing SaaS space. This is a very new zone that I think needs to be watched closely in the future.

The bottom line in this discussion is that PLM integrations have many faces and aspects. The optimization and future development of technologies in this space can bring significant competitive advantages and more integration solutions to customers.

What’s your opinion?

 


When BOM seeks the right enterprise nanny…

April 29, 2009

Yes, I have come to the conclusion that the Bill of Materials (BOM) child is quite alone… probably because BOM children were so popular that everybody wanted to take care of them. Just take a look at our closest environment. We have a Bill of Materials in CAD/design, Engineering, Manufacturing, Support and Maintenance… As I mentioned in my previous post, Search for the right BOM – I’m feeling lucky?, finding the right Bill of Materials in an enterprise environment is not simple.

So, the question I wanted to ask is how we can improve the Bill of Materials, which fundamentally represents everything in Product Lifecycle Management – from early requirements and design through production and disposal. My initial idea was a synchronized BOM (Is it time for a synchronized Bill of Materials?). The biggest challenge I found is that the Bill of Materials is actually separated across multiple systems in the organization. I’m sure you are very familiar with this problem. The Bill of Materials has many flavors, and each system tries to manage its own flavor. As a result, we have NO single Bill of Materials.

I have seen a few trends in Bill of Materials management as of today:

1. Master trend. This is probably the oldest one. The idea is quite straightforward and based on trying to build master relationships for the Bill of Materials. Supposedly, once you have a master, you don’t have a problem with multiple BOMs. But defining such “master behavior” is difficult, and the process of master definition spans time, products and technologies within the organization. It may work, in my view, in a quite synchronized and centralized environment. But if you take a more detailed look, you will find “another small BOM” somewhere around :)…

2. Multi-BOM trend. This one is quite established. As we said, there are many Bills of Materials – so here is the answer: we can manage many BOMs! With all the technologies we have today, we can manage as many as we want. Where is the Catch-22? The governance model for the Bill of Materials in this case becomes very problematic. Now we have too many BOMs, each trying to state its own single point of truth about what is going on with the product. Multiple tools around this problem can help you compare, change, and even find inconsistencies (a minimal comparison sketch follows this list), but the overall system becomes quite unstable, in my opinion…

3. Process trend. This is a new one I have discovered. We don’t manage the Bill of Materials anymore; we manage processes for the organization: Design, Engineering, Manufacturing. This sounds very reasonable. A process can formalize our activity around data and provide a reliable way to manage our ancient data life on a different level. In other words, BOM is stupid and the process is smart. So far so good… But process management is a big scope, and implementing it for an organization is not a simple task at all… I’m just afraid this is too much for “my small BOM child” :)…
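To illustrate the governance problem described in the multi-BOM trend, here is a minimal sketch of comparing two flavors of the same BOM and reporting inconsistencies. The part numbers and quantities are invented; real BOM comparison also has to deal with revisions, effectivity and substitute parts.

```python
# Two hypothetical flavors of the same BOM: quantity per child part number.
engineering_bom = {"A-100": 1, "A-200": 4, "A-300": 2}
manufacturing_bom = {"A-100": 1, "A-200": 4, "A-310": 2, "GLUE-01": 1}

def diff_boms(bom_a: dict, bom_b: dict) -> dict:
    """Report parts missing on either side and quantity mismatches."""
    return {
        "only_in_a": sorted(set(bom_a) - set(bom_b)),
        "only_in_b": sorted(set(bom_b) - set(bom_a)),
        "qty_mismatch": sorted(p for p in set(bom_a) & set(bom_b) if bom_a[p] != bom_b[p]),
    }

print(diff_boms(engineering_bom, manufacturing_bom))
# -> {'only_in_a': ['A-300'], 'only_in_b': ['A-310', 'GLUE-01'], 'qty_mismatch': []}
```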

So, what is my conclusion? I think we are still at a stage where we need to find better technologies for managing the Bill of Material(s). Do you know of any alternatives? I’m looking at how to resolve the BOM problem within an organization and beyond…


What is the future of search for PLM?

April 28, 2009

There is no doubt that search has changed the way we work on the Internet today. Remember how we looked for information in the pre-Internet and pre-Google age? It wasn’t as simple as it is today.

So, the change in users’ behavior regarding search has created a lot of opportunities on the Internet. I’ve tried to look at how PLM and search intersect and perhaps find new ways to improve the behavior of PLM systems today. I touched on this before in my posts about Enterprise Search and PLM. So, I’ll take a deeper look into this space.

What is available today?

1. Enterprise Search

This term is used to describe the application of search technologies inside an organization. In contrast to the two other kinds of search – web search and desktop search – it is probably the most relevant to what PLM does inside an organization. The ability to find the right information about products, documentation, changes, etc. is extremely important. I think this area will grow in the near future.

 Microsoft Futuristic View on Enterprise Search
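To make the enterprise search idea a bit more concrete, here is a minimal sketch of indexing product-related records (items, documents, change orders) into a tiny inverted index and querying it. It is deliberately simplified – real enterprise search adds security trimming, connectors to each source system, ranking and metadata facets. All records and identifiers here are hypothetical.

```python
from collections import defaultdict

# Hypothetical product records scattered across PLM, documents and change orders.
RECORDS = {
    "ECO-1042": "change bracket material to aluminum",
    "DOC-77":   "installation manual for mounting bracket",
    "A-100":    "mounting bracket, aluminum, rev B",
}

def build_index(records: dict) -> dict:
    """Build a tiny inverted index: term -> set of record ids containing it."""
    index = defaultdict(set)
    for rec_id, text in records.items():
        for term in text.lower().replace(",", " ").split():
            index[term].add(rec_id)
    return index

def search(index: dict, query: str) -> set:
    """Return records containing every query term (simple AND semantics)."""
    results = [index.get(term, set()) for term in query.lower().split()]
    return set.intersection(*results) if results else set()

index = build_index(RECORDS)
print(search(index, "aluminum bracket"))   # -> {'ECO-1042', 'A-100'}
```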

2. 3D Space Search

This is a very specific area of search applications that allows you to specify search criteria based on certain geometric characteristics of a product. Even though such capabilities already exist inside many PLM applications, this kind of search is interesting because it can connect to enterprise search. Actually, it’s important to integrate 3D space search into enterprise search and other applications.

Dassault Systèmes 3DLive User Experience (UX)
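Here is a minimal sketch of the 3D space search idea: filtering a parts catalog by simple geometric characteristics (a bounding-box envelope) rather than by text attributes. The parts, dimensions and tolerance rule are hypothetical; real shape search engines compare much richer geometric signatures.

```python
# Hypothetical catalog of parts with simple geometric characteristics (mm).
PARTS = [
    {"id": "A-100", "bbox": (120, 40, 10), "volume_cm3": 35.0},
    {"id": "A-200", "bbox": (60, 60, 60),  "volume_cm3": 170.0},
    {"id": "A-300", "bbox": (125, 38, 11), "volume_cm3": 33.0},
]

def fits_within(bbox, envelope, tolerance=0.1):
    """True if each bounding-box dimension is within +/- 10% of the envelope."""
    return all(abs(b - e) <= e * tolerance for b, e in zip(sorted(bbox), sorted(envelope)))

def search_by_shape(parts, envelope):
    """Find parts whose bounding box roughly matches a reference envelope."""
    return [p["id"] for p in parts if fits_within(p["bbox"], envelope)]

print(search_by_shape(PARTS, (120, 40, 10)))   # -> ['A-100', 'A-300']
```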

3. Visual Search / 3D Search

If 3D space search mostly uses 3D as a criterion for WHERE to search, visual search (or 3D search, as I call it) is mostly about being able to tell the search WHAT to search for. This can be interesting for decision-making and the design process, because it allows you to find information visually. Sometimes, however, the result can be misleading, since relevant content can be hidden behind wrong meta-information and additional data.

 Innovative solution from 3DPartBrowser 

The above are my top three perspectives on search in the PLM domain. I’m exploring the search topic in the context of PLM these days at the Infonortics Search Meeting in Boston. During this meeting, I had a casual talk with Stephen Arnold and got some interesting perspectives from his Beyond Search blog. I’m looking forward to speaking more about this topic in the future…



5 reasons why Wiki fails for PLM collaboration

April 27, 2009

In the last few posts, I wrote about how you can use Enterprise 2.0 tools to improve various processes related to product development. I think that Wiki is an easy collaborative platform. I also think that Wikis have great potential to be transformed into something bigger from which smaller enterprises can benefit. Over the past few weeks, I have been experimenting with Wiki platforms.

My initial proposition was to use a Wiki as a content space to keep information for collaborative usage. I don’t think this is a very new and fresh idea, but I wanted to see how it would be possible to use existing tools without significant modifications.

Below are my conclusions regarding the weak sides of the Wiki story for product collaboration:

1. Information Access. The biggest advantage of a Wiki is simplicity. But this means that a Wiki is completely flat. You don’t have any way of organizing access to information other than creating pages/sub-pages and links between them. When you deal with short, flat pages it works well. When you start to add more complex content, it becomes unusable. I found that the simplest way to search for information was the browser’s embedded search, and even though Chrome’s search is quite good, it wasn’t very simple to find what I needed. If you have multiple Wikis, you have the problem of information being separated and not accessible across Wikis.

2. Content Maintenance. It’s very simple to insert information and update a Wiki. But, at the same time, you have to take care of all the information there. This means that you can’t maintain your content with rules and logic, which is not a good way to organize it. You need to take care of the content or it turns into garbage very quickly.

3. Updates. I didn’t find any way to maintain automatic and dependent updates of content. This creates a cumbersome situation when I need to update information I have already put in the wiki while keeping the history of my updates. These are quite straightforward requirements in our space, but they can hardly be achieved out of the box.

4. Integration. This disappointed me very much. The only way to integrate Wikis is to put hyperlinks on the relevant information. But what if this information is located in other systems, storages and formats? I didn’t find any way to mash up information inside a Wiki page. Although Web Parts or similar functionality is available, it breaks the page into segments and is not as good as what I want.

5. Structural Information. This is a higher degree of content maintenance, but a very important one, in my opinion. Product information is highly structured by nature, and maintaining this structure only through URL/link mechanisms is not trivial. So, structured information is probably a very desirable feature.
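The sketch below shows what I mean by structural information: a flat wiki page can only hold untyped links in free text, while product data needs typed, quantified relationships that can answer questions such as “where is this part used?”. The data model here is invented purely for illustration.

```python
# A flat wiki page can only hold untyped hyperlinks in free text:
wiki_page = "The [[A-100 bracket]] uses the [[A-200 bolt]] and the [[A-300 washer]]."

# Structured product data needs typed, quantified relationships (a hypothetical model):
bom_links = [
    {"parent": "A-100", "child": "A-200", "qty": 4, "relation": "assembly"},
    {"parent": "A-100", "child": "A-300", "qty": 4, "relation": "assembly"},
    {"parent": "B-500", "child": "A-200", "qty": 2, "relation": "assembly"},
]

def where_used(links, part_no):
    """Answer a basic structural question that untyped wiki links cannot."""
    return sorted({link["parent"] for link in links if link["child"] == part_no})

print(where_used(bom_links, "A-200"))   # -> ['A-100', 'B-500']
```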

Therefore, what is my conclusion? The Wiki collaborative mechanism is very nice and simple. It provides a very affordable way of collaborating and co-editing information. At the same time, it seems hard to balance the benefits of a Wiki against the burden it puts on the user. I’d like to hear your opinions and experience of applying Wiki technologies.


How to create self-contained PLM persistent storage?

April 24, 2009

I’m getting back to the topic of persistent storage of PLM data. In my previous posts, I touched on this a few times.

PLM Persistent Content and Dynamic 3D PDF

PLM Dream Technologies for 2009

First of all, I will try to define the functional scope of product persistent storage. In my view, there are five groups of product information that need to be included in this storage: (1) product geometry and models; (2) drawings; (3) knowledge representation (how-to, reasoning, math…); (4) technical data such as suppliers, manufacturing, etc.; and finally (5) meta-data describing all the product data together.
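As a rough illustration of what “self-contained” could mean, here is a sketch of a package manifest that ties the five groups together in one small meta-data file. The file names, formats and fields are assumptions for illustration only, not a proposed standard.

```python
import json

# Hypothetical manifest for a self-contained product archive. One small,
# human-readable meta-data file (group 5) ties the other four groups together.
manifest = {
    "product": {"number": "A-100", "revision": "B", "released": "2009-04-20"},
    "geometry": [{"file": "a-100.stp", "format": "STEP AP214"}],
    "drawings": [{"file": "a-100-sheet1.tif", "format": "TIFF"},
                 {"file": "a-100.pdf", "format": "PDF/A-1b"}],
    "knowledge": [{"file": "design-rationale.pdf", "type": "how-to"}],
    "technical_data": [{"file": "suppliers.xml", "type": "approved-vendor-list"}],
}

def missing_groups(m: dict) -> list:
    """Check that all five groups of product information are present."""
    required = ["product", "geometry", "drawings", "knowledge", "technical_data"]
    return [group for group in required if not m.get(group)]

print("missing groups:", missing_groups(manifest) or "none")
print(json.dumps(manifest, indent=2))
```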

While I see growing interest and demand for a solution that could cover all five groups I mentioned above, I don’t see technologies mature enough to provide an answer and allow practical implementation of such a solution in the field. I will briefly cover what I found available and what I see as practically possible today.

I see STEP as a very solid and mature format that can probably consolidate geometry, models and sometimes additional product information. At the same time, formats such as IGES and TIFF, among others, can be used to represent drawings for the long term. The broad usage of PDF makes this format a reliable and stable option for long-term persistent storage as well. On top of PDF, I’ve seen developments such as PDF/A (for document archiving). Even if this format cannot be used as-is for product-related data, this is a direction that can be taken to develop extensions of PDF that support the representation of additional technical data.

I didn’t find any reliable technologies for storing meta-data and technical data other than existing database and XML technologies. Even though these database and XML technologies have proven themselves for transactional usage, I haven’t found a practical reference for using databases for data storage over long periods of time.

Apart from the above-mentioned technologies, I didn’t find mature technologies that can be used. There are a few research projects in this space – LOTAR and some developments done by OMG, OASIS, and W3C. All of these are mostly research in character and focus on developing models for persistent storage. I don’t see them ready for production deployment.

Meanwhile, storage has become less expensive and we continue to produce a massive amount of new data on a daily basis. I see persistent storage as a very interesting opportunity for development in the near future.


Will Master Data Management (MDM) work for PLM?

April 23, 2009

Lately, I’ve seen potential for growth in Master Data Management (MDM) initiatives. For those who don’t know what MDM technologies are, I recommend you start with Wikipedia and check out the links to the MDM offerings from IBM, Oracle and some other large IT vendors.

http://en.wikipedia.org/wiki/Master_Data_Management

http://www.dmoz.org/Computers/Software/Master_Data_Management/

Looking at the broad range of PLM capabilities and needs, I noticed that some of them deal with non-transactional data: released products and referenced data. This released data is needed by various departments in the organization. The need to find relevant product data related to a specific design, as well as to change things from the past, is growing. With the growth of regulations in today’s world, being able to track and discover designs in the vault is a real need that can provide fast ROI.


So, can you see the match between the need to access non-transactional product data and master data management technologies? At first glance, this is something that can be validated and tested.

The following are some pros and cons of using MDM in the context of PLM:

Pros:

· Reliable infrastructure to manage a single source of information

· Ability to redistribute information across the organization

· Global data availability in the organization

Cons:

· Expensive data transformation when moving data out of design systems

· Not flexible for scenarios where a back-reference to the original data is needed (see the sketch after this list)

· Data redundancy
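The sketch below makes the back-reference and transformation concerns concrete: a released PLM item is reduced to a slimmer master record, and a pointer back to the authoring system is kept so consumers can trace the record to its source. All field names and the record layout are hypothetical.

```python
# Hypothetical released item as it exists in PLM (rich, design-oriented record).
plm_item = {
    "item_id": "A-100", "rev": "B", "state": "Released",
    "cad_model": "vault://designs/a-100_revB.sldprt",
    "mass_kg": 1.2, "material": "Aluminum 6061",
}

def to_master_record(item: dict) -> dict:
    """Transform a released PLM item into a slimmer MDM 'golden record'.

    The transformation drops design detail (one of the cons: data is duplicated
    and reduced), so a back-reference to the source system is kept to let
    consumers trace the record to its origin."""
    return {
        "master_id": f"ITEM-{item['item_id']}",
        "revision": item["rev"],
        "attributes": {"material": item["material"], "mass_kg": item["mass_kg"]},
        # Back-reference to the authoring system; without it, the link to design data is lost.
        "source": {"system": "PLM", "key": item["item_id"], "uri": item["cad_model"]},
    }

print(to_master_record(plm_item))
```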

What will be the future of MDM in the context of Product Development and Product Lifecycle Management? Although there is some potential here, Master Data Management needs to adopt more flexible ways to handle scenarios related to data referencing and data retention; I can hardly see it being applied “as is” to product development today.

What is your opinion?


How to Improve Engineering Change Processes using Enterprise 2.0 Technologies?

April 22, 2009

I think that using Enterprise 2.0 technologies for Product Lifecycle Management can bring significant improvements in implementation and services. Today’s traditional approach is to use workflow-based tools to implement ECO processes. Is this good? Yes, it’s probably good, but at the same time, establishing such an implementation can be relatively complex. You need to rely on database management tools and process management infrastructure, and this is expensive. What alternative do we have today? I noticed that there is an emerging group of software which is starting to be referred to as Enterprise 2.0. Although there isn’t a consolidated agreement on the scope of Enterprise 2.0 software, there does seem to be a reference to a group of software tools and technologies that use Web tools for collaboration.

So, a typical, traditional ECO implementation includes the following components: Data, Process and Collaboration. Data allows you to keep information about the ECO and link to the relevant CAD files located in a vault. Process allows you to set up a workflow to pass ECO information and requests among people in an organization. Collaboration tools are dedicated tools that present connected information about the change and the design to users; they normally include data and visual tools. Most such implementations today are based on proprietary rich (Windows) clients and web tools. Workflow implementations in most cases rely on proprietary process tools, and sometimes on IT process middleware. Overall, such an implementation requires significant planning of everything – from ECO data down to processes and people communication.

How can we make this implementation simpler and cut implementation costs? First of all, managing all data natively in an RDBMS can be replaced by an implementation based on a Wiki. This would allow us to keep information about the ECO and reuse regular web wiki editing tools to put information there. Depending on the Wiki engine you choose, you will already have a web-like user interface and data capture capabilities. You can tag this information and make it easy to search using desktop and/or enterprise search engines. Approvals and the ECO process can be part of the associated Wiki page data and ad-hoc collaboration. To establish a more formal process you can use built-in workflow engines (e.g. Windows Workflow Foundation). And last but not least is collaboration: your environment can be Web-based and use all Web-based collaboration tools, including co-editing of web pages. An additional benefit is that all ECO information can easily be shared as regular URLs. Further value can be gained by using subscription models similar to RSS; these can easily be applied to native web data and give you the ability to use organizational information to discover the people to whom you should connect.
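Here is a minimal sketch of the idea: an ECO captured as a tagged wiki-style page, plus a deliberately simple state machine standing in for the approval flow. The page fields, tags and transitions are assumptions for illustration; a real workflow engine (or wiki platform) would add roles, notifications and an audit trail.

```python
# Hypothetical ECO captured as a wiki-style page: free text plus a few tags.
eco_page = {
    "title": "ECO-1042: Change bracket material",
    "body": "Switch A-100 bracket from steel to aluminum. Affected: [[A-100]], [[DOC-77]].",
    "tags": ["eco", "a-100", "material-change"],
    "state": "draft",
}

# A deliberately simple approval flow; allowed transitions per state.
TRANSITIONS = {
    "draft": ["submitted"],
    "submitted": ["approved", "rejected"],
    "rejected": ["draft"],
    "approved": [],
}

def advance(page: dict, new_state: str) -> dict:
    """Move the ECO page to a new state if the transition is allowed."""
    if new_state not in TRANSITIONS[page["state"]]:
        raise ValueError(f"cannot go from {page['state']} to {new_state}")
    page["state"] = new_state
    return page

advance(eco_page, "submitted")
advance(eco_page, "approved")
print(eco_page["title"], "->", eco_page["state"])   # -> ... -> approved
```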

I believe we are only at the beginning of the introduction of Enterprise 2.0 tools in organizations, but to me it looks very promising…


PLM Next Big Things

April 21, 2009

Last week I had the chance to attend the COFES 2009 forum in Scottsdale, AZ. It had a great atmosphere and there were excellent discussions about the present and future of design and engineering software. While I’m still digesting everything I’ve seen and talked about, I’d like to focus my thoughts on the potential next big things in PLM. I will try not to stick to particular buzzwords and TLAs (three-letter acronyms), but rather explain what should be achieved.

Technological Reuse (new technologies, computing, collaboration tools). This sounds too broad, so let me try to explain. From my perspective, most of today’s PDM/PLM products were developed on a pretty stable set of technological assets – database and file management technologies, the PC and Windows operating systems. Lately, significant achievements have appeared in new technologies such as cloud data services and rich internet application user experiences. Windows and other platform/technology providers have accumulated a large potential for the development of collaboration capabilities. I think that providers who reuse these technologies and capabilities will be able to gain a significant competitive advantage in today’s market.

Low-cost PLM solutions. I think our industry has already seen introductions of products that significantly changed the product landscape in terms of cost and availability. The first change came with the introduction of AutoCAD on the PC, followed by the introduction of SolidWorks and other 3D CAD systems on Windows. The next significant change came with the introduction of affordable PDM solutions in the mid-1990s (i.e. SmarTeam, Agile, etc.) based on Windows technologies. I think we are going to face the next step: the delivery of very affordable solutions that will allow us to collaborate and manage product data and processes in a way we have never seen before. These solutions will leverage all available new platform technologies and may focus on vertical market solutions.

Social Tools. To Tweet or Not To Tweet (kidding :)…). The next important question we need to answer is how PLM/PDM will leverage the huge potential of social networking, crowdsourcing and social computing. I don’t think there is a choice. The companies that do this fast and are able to connect PDM/PLM to the social environment will have a significant competitive advantage tomorrow. But I don’t think that means we should try to introduce “yet another social platform”. This is beyond people’s capacity, since there is a physical limit to how many social networks you can be connected to at the same time. The biggest power of a social environment is to be connected and consolidated with existing networks and social platform capabilities.

As I mentioned, I’m going to digest and analyze COFES 2009 over the next few days or weeks, and I’m sure I will come up with additional ideas to discuss. I’m interested in hearing your thoughts.


PLM Process Management – How many Workflows do we Need?

April 20, 2009

In one of my previous posts, I already discussed PLM process management: Should PLM develop its own process tools? In reality, I see that companies have many products with process management and workflow capabilities. Some of them are part of IT platforms (Microsoft SharePoint, IBM WebSphere, etc.), while others are part of PDM, PLM and ERP tools. With such a large number of capabilities, I noticed that companies often develop multiple solutions to manage their processes – and these solutions are tightly connected to existing products. From one standpoint, this lets customers maximize the reuse of product capabilities and organize a dedicated process management and workflow solution integrated with the data managed by a particular system (PDM, ERP, etc.). But one of the biggest drawbacks of this situation is that organizations end up with multiple silos of disconnected solutions and multiple process/workflow management implementations.

So my question is: how many “workflows” do we need in an organization? More precisely, I’d like to think about how to organize separated and disconnected workflow and business process management solutions. The following are the priorities needed to organize such a solution:

1. Establish a single process modeling environment

2. Multiple process deployment

3. Immersive access to process/workflow execution in the built-in user environment

A single process modeling environment would allow users to organize and maintain a single picture of the organizational processes. My preference in this case is IT platforms. Organizations normally choose one IT platform, so having that environment in which to model processes makes a lot of sense to me. Consolidation around popular notations such as BPMN can let you use 3rd-party tools, in some cases, if they provide additional benefits in managing a single process model.

Multiple process deployment addresses the procedure of deploying processes into the many existing systems. This depends on the specific systems deployed by the company, and can be done in different ways – but the goal here is to keep each process connected to a specific solution as much as possible (i.e. product data management and/or any vertical solution in the organization). This allows existing systems to maintain the connection with the data managed by that system/sub-system. Access to this data is very important, since most of the process logic, in many cases, depends on this information.
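As a small illustration of “model once, deploy many times”, the sketch below describes one neutral process model and maps its steps onto the activities of two hypothetical systems. The process, systems and mappings are invented; in practice the neutral model would be expressed in something like BPMN and the adapters would live in each system’s integration layer.

```python
# One hypothetical process model, described once in a neutral (BPMN-like) way.
ECO_PROCESS = {
    "name": "ECO approval",
    "steps": ["create", "engineering_review", "manufacturing_review", "release"],
}

# Per-system adapters map neutral step names onto each system's own activities.
DEPLOYMENT_MAP = {
    "PDM": {"create": "New ECO form", "engineering_review": "CAD check-in gate"},
    "ERP": {"manufacturing_review": "Routing update task", "release": "Item activation"},
}

def deploy(process: dict, system: str) -> list:
    """Return the subset of process steps this system executes, in model order."""
    mapping = DEPLOYMENT_MAP.get(system, {})
    return [(step, mapping[step]) for step in process["steps"] if step in mapping]

for system in DEPLOYMENT_MAP:
    print(system, deploy(ECO_PROCESS, system))
```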

Most processes require user involvement for control and data submission (e.g. document approval, ECO management). Immersive access to process/workflow execution and control from the user’s regular daily environment is critical, because this is what guarantees user acceptance. A process solution will live only when users actually use it rather than bypass it.

So, where do you start? (1) Analyze which system can be used to keep overall control of processes in the organization; (2) choose process modeling tools; (3) analyze how to connect the multiple workflow and process management solutions that already exist in the organization; (4) give priority to solutions that offer immersive integration in the user environment.

As usual, I’m open to discussing this and am interested to know what types of solutions you have and how you organize workflows and business processes within your organization.

