PLM Think Tank Top 5, September 2010

October 31, 2010

I postponed my monthly writing this month to say a few words about the PTC Creo launch this week. The CAD community has already started to discuss this event, and you cannot dismiss an event of this type in the CAD industry. My initial conclusions can be described as follows: 1/ consolidation; 2/ people orientation; 3/ industry knowledge. You can ask me about the next CAD revolution. This is something that, in my view, requires additional understanding. The technological revolution of PTC Creo is in the merging of the CoCreate and Pro/E platforms. For the moment, I don't have enough knowledge to state it. However, it absolutely represents the consolidation of PTC's CAD platforms. PTC demonstrated a deep understanding of important industry problems. At the same time, the first companies benefiting from the Creo solution will be large PTC accounts focusing primarily on the PTC platform. I want to say a few special words about people orientation. This is very important since, in my view, it follows the "consumerization trend" in enterprise IT. Remember, we had the "role-based" story before. Creo AnyApp, in my view, is a transcription of the "role-based" approach. The idea is a right one; however, the execution of this idea is the most important thing. The apps idea requires the existence of a platform. Well-known software and hardware platforms are playing big games now (Apple Mac, iOS, Android, Windows Azure, etc.). In the enterprise, Oracle is trying to position its offering as a platform. Will PTC Creo become a platform? This is a question we will be discussing during the next 5 years. Now, let's turn to the list of September's PLM Think Tank Top 5.

How To Choose PLM? (Visual Guide)

In the beginning, I was surprised by the popularity of this post. In it, I put a visual diagram that presents the process of PLM solution selection. The idea was to make it simple enough to digest in 5 minutes. The main PLM decision these days seems to me to be selecting one of three options: 1/ following the mindshare of a PLM vendor; 2/ identifying PLM as a part of ERP; and 3/ not seeing PLM as a solution to be used in the product development process. Take a look at this diagram and form your opinion. I hope you will find it valuable.

PLM and Open Source Licenses

This post started by referencing Google's acceptance of OSI licenses. However, the conversation quickly moved to a discussion about the value of Open Source and all the details related to the usage of Open Source in PLM. The Open Source licensing story is damn complicated. However, with growing interest and influence, understanding the available open source and free software licenses will be crucial. I think OSS will play an increased role in the future of software.

PLM Basics: Reference Designator and Find Numbers

This post was generated as an answer to one of my readers. The usage of Find Numbers and Reference Designators often overlaps and is not clear. In addition, there is a significant difference in usage across diverse industries. You can find a long thread of comments with readers sharing their experience and practices in how they use RD and FN. In addition, you can find very useful links with related information.
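To illustrate the distinction the post discusses, here is a minimal sketch (the part numbers and field names are my own, not a vendor schema): a Find Number identifies a line item on an assembly BOM and its balloon on the drawing, while Reference Designators identify the physical placements of that item.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class BomLine:
    """One line of an assembly BOM (illustrative model, not a vendor schema)."""
    find_number: int          # points to the balloon/line on the assembly drawing
    part_number: str          # the component being used on this line
    quantity: int
    ref_designators: List[str] = field(default_factory=list)  # physical placements

# A hypothetical electronics BOM line: one find number, several reference
# designators — the same resistor placed in three locations.
line = BomLine(find_number=10, part_number="RES-10K-0603", quantity=3,
               ref_designators=["R1", "R4", "R7"])

# A simple consistency check often seen in practice: the line quantity
# should match the number of reference designators.
assert line.quantity == len(line.ref_designators)
```

The sketch also shows why the two numbers get confused: both "identify" the part on the assembly, but at different levels of granularity.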

PLM Philosophies Collide

This post was raised by a few conversations in which I was asked what the PLM future looks like – BOM or Workflow? I think this question represents one of the biggest philosophical collisions in engineering and manufacturing software. What will be the winning behavior in the future? It is hard to say. In my view, the end-game solution will need to provide answers to both sides of the problem. BOM and Workflow need to be equally included in PLM solutions. Only together can they help an organization manage the product lifecycle efficiently. You can read a long list of people's opinions and shared thoughts on this topic.

PLM: Open Source vs. Free?

The last top 5 post this month is related to the comparison of the Free and Open Source models. Both "free" and "open source" can create an interesting innovation trend and change today's status quo. However, I don't think either is a silver bullet. Businesses have a lot of concerns about the "free model," since everybody understands that, in the end, TCO is what matters. On the other side, free can make PLM systems widely adopted and not limited to organizations that can pay for PLM licenses. Open Source is a separate story. Whether a broad community can be built is one of the main questions to be answered to understand the potential viability of PLM Open Source.

Best, Oleg


PTC Creo – AnyThing Possible?

October 29, 2010

So, it finally happened. After almost 6 months of official preparation, PTC launched a new product line – Creo. Creo will replace the existing Pro/E, CoCreate and ProductView products. I had a chance to attend the event today in Boston and took a few pictures. Take a look at the next picture. In my view, it gives you the most non-marketing view of what PTC presented:

To begin, I'd like to mention the following 3 topics that come to my mind in the context of PTC Creo: Applification, Common Data Model and Interoperability. Let me dig into each of these topics.

Applification
It seems to me PTC folks finally got the world of applications and app stores. Like Apple split a 10-song CD into 10 songs and started selling them by the piece, PTC is planning to deliver CAD and PLM applications using the same concept. It will be hard to understand and map what will belong to CAD and what to PLM. However, the strategy of app split and granularity seems logical to me. It will be interesting to see how this strategy will be sold to engineering software folks.

Common Data Model
CDM is an element of the PTC Creo architecture that is supposed to provide the unification of functions and serve as a root for integration. You can find such an element in almost every enterprise solution. The Common Data Model is a feature of Creo I want to understand better. Having hundreds of applications dependent on such a complicated element of infrastructure can make the overall solution less robust. However, I will dig into this later.

Interoperability
The notion of interoperability was in the air when PTC talked about multiple apps communicating with each other. Interoperability is an old and well-known topic. PTC is trying to leapfrog and close the interoperability gap. How will it happen? I'm not sure I understood it. The Creo box allows you to get any CAD model into Creo. However, nothing was said about how this information can be taken back out to the original CAD system.

What is my conclusion today? I think PTC demonstrated a very good understanding of the market and customers. They are presenting a plan to deliver solutions answering a set of well-known problems. Identifying the right problems is already more than 50% of the delivery process. Let's see how many PTC Creo Apps we'll see in the future…

Best, Oleg


How To Reset PLM Collaboration?

October 28, 2010

Almost 2 weeks ago, I wrote about De-confusing of PLM Collaboration. Today, I want to suggest a different angle and talk about how we can re-think the PLM collaboration concept. Why do I think it can be interesting? In my view, the enterprise software industry is going through a process of consumerization. It means that a lot of technologies well-established in the consumer space will be coming to the enterprise. PLM will not be excluded from this process. Collaboration could be a good starting point for this kind of re-thinking. To shed some light, I created the following diagram.

I am thinking about 3 fundamental activities: Communication, Collaboration and Process Management. I'd like to discuss them separately.

Communication

This is the most straightforward part. In my view, it represents the fundamental activity in an organization today. I have heard from people that their organizations are practically "driven by emails." New technologies and the web are bringing alternative ways to communicate (i.e. IM, blogs, forums…). However, email is very strong, and all the ambitious plans to replace email have failed.

Collaboration

I used the word "collaboration" to identify tools helping people to share data. I can see "synchronous" and "asynchronous" tools that people can use to collaborate. In the past, enterprise products developed a whole world of various "workspaces," "whiteboards" and other solutions to share information between users. New web tools (i.e. wikis) are coming to this space from the internet. In my view, they represent interesting perspectives on how to share data.

Process Management

Fundamentally, an organization is driven by processes. Various business processes can drive and formalize people's activities, define goals and measurement systems. The important aspect of process management is people's adoption. Very often, a company spends a significant amount of resources to formalize and establish a process management system. However, the next day, the system is bypassed by people running everything through email and, by doing so, voting against complicated process management procedures.

Moving from Spaces to Channels

In my view, enterprises are moving from a database world to networks. I touched on this point a few days ago in my post PLM Network Effect and Single Point of Truth. Slowly, companies are starting to understand that database centralization has limits and will not scale up. The Internet experience shows clearly that a "network organization" can be much more powerful. Thinking about this abstraction, I came to the conclusion that there is a significant movement from the concept of "spaces," which was dominant in PLM collaboration for the last 10-15 years, to the concept of "channels." The way channels are organized can present an innovation in streamlining communication and processes in enterprise organizations. I can see existing and new companies innovating in this space. Just to bring a few examples – Cisco Quad, Salesforce.com Chatter, Yammer, Vuuch. This is my short list of innovators in this domain.

What is my conclusion? Customer demand to re-think collaboration in the enterprise will have a significant impact on how PLM collaboration is developed over the next 3-5 years. The intersection of process management, communication and old-fashioned collaborative tools is a good starting point to reset everything we knew about PLM collaboration. Just my thoughts…

Best, Oleg


PLM and Complex Products

October 26, 2010

Earlier this week I had a chance to read the following blog post by the Accidental Product Manager: Really, Really Complex Products: Is PLM Software the Solution? I found it short and insightful. Have a read and form your opinion. The question asked by Dr. Jim Anderson opens an unusual angle on the conversation about PLM. Why do I see it this way? The product lifecycle management concept was born as a solution to support OEMs in the aerospace, defense and automotive industries. I can hear people discussing the mainstream usage of PLM for smaller companies. However, when talking with major PLM vendors (Jim's blog points to Siemens PLM), you may think large OEMs (Jim points to Toyota) take PLM as something they cannot live without.

There are three main problems that can pop up when you start using PLM software: software compatibility, missed errors and bad data. Jim mentioned some industry use cases to describe the problems (i.e. the A380 delay, Toyota virtual tests). In addition, he pointed to the complexity of the IT related to these solutions. I think these are excellent points to discuss.

Tools Compatibility and FFF

Everybody in manufacturing is familiar with the term FFF (Form, Fit, Function). When you design your products, you ultimately can use replaceable parts. The concept of FFF helps you define whether one part can be replaced by another. I believe the same practice can be applied to software when it comes to using multiple versions of tools. I can agree with Jim – compatibility tests are complicated and not always obvious. This is a place where CAD and PLM vendors can definitely innovate. The main point here is the ability to migrate from one software package version to another. By making this process smooth, PLM vendors can make the life of manufacturers easier.
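As a rough illustration of the FFF idea in code (the attribute names and parts below are my own, not an industry schema): two parts are treated as interchangeable only when their form, fit and function attributes all match, regardless of the part numbers.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Part:
    """Minimal part record carrying only the FFF-relevant attributes."""
    number: str
    form: str        # e.g. package or outline
    fit: str         # e.g. mounting or interface
    function: str    # e.g. electrical/mechanical role

def interchangeable(a: Part, b: Part) -> bool:
    """Parts are FFF-interchangeable when form, fit and function all match."""
    return (a.form, a.fit, a.function) == (b.form, b.fit, b.function)

# Two hypothetical resistors from different suppliers: different part
# numbers, identical form, fit and function — hence replaceable.
old = Part("RES-10K-A", form="0603", fit="SMT", function="10k resistor")
new = Part("RES-10K-B", form="0603", fit="SMT", function="10k resistor")
print(interchangeable(old, new))  # True
```

The same comparison idea carries over to software versions: instead of part attributes, you would compare the interfaces and file formats a tool version consumes and produces.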

Physical Products and Virtual Errors

The question of whether virtual tests can adequately represent physical products is not new. I think the manufacturing industry has made significant progress in the last 20 years by replacing physical tests with virtual tests. However, planning virtual tests is something that still requires people's experience. The interesting potential in this domain is to increase the ability of PLM software to monitor physical objects (cars, planes, ships, etc.) during their lifecycle. This monitoring will allow building more trustworthy virtual tests.

Bad Data and Linked Data

Enterprise companies operate with a huge amount of electronic data sets scattered among disparate data sources. A PLM system is an island in this huge amount of data. The integration problem is not a new one. For the last 15 years, I've seen multiple attempts to resolve the problem of data consolidation. EAI, federation, master data management and data warehousing are only a short list of the technologies and products in this space. In my view, there is potential to solve the problem of "bad data" by starting to link the islands of data in an enterprise organization. This is a big and challenging task.

What is my conclusion? PLM software was born to provide a solution for Product Lifecycle Management. However, I agree with Jim – it is not a silver bullet. You need to think about PLM software as a set of tools helping manufacturers solve their product management and product lifecycle problems. Finding the right set of tools is a challenge for every manufacturing shop (especially a big one). Just my thoughts…

Best, Oleg


Stuck PLM Project and Leo Tolstoy

October 25, 2010

I read a very short post by Jennifer McCullough of Aras: Frustrated by a Stuck PLM Project?. The whole purpose of this blog post is to point to the Stuck PLM page on Aras' website. The Aras website talks about license cost frustration, uncovered functionality and the problem of growing with a limited budget. At the same time, it promises results.

Does it mean Aras engineers invented a "Perpetuum Mobile"? No, I don't think so. Aras engineers are making software. Aras business wizards decided to delay the painful moment of license fees to a later time and wrap it differently. It seems smart. They probably did their homework in Chris Anderson's Sunday school about the power of "FREE." The Aras Stuck PLM Project story reminded me of the novel Anna Karenina by Leo Tolstoy: "All happy families are alike; each unhappy family is unhappy in its own way." PLM marketing wizards are telling stories about "happy PLM implementations." These stories are all the same. I think the real implementation stories are becoming more popular. The most interesting PLM implementation stories are about how to use a diverse set of tools to handle product development processes. When/if we come to these stories, we will have a chance to get back and talk about what tools we need to turn them into happy PLM stories.

What is my conclusion? The Aras case in PLM is interesting. The important point is the sequence of events. Aras business managers are trying to put the cart before the horse. Will it work? A good question to ask. When you hold something in your hands, you can decide to give it away for free. Can it rebuild the trust of potential customers frustrated by previous PLM experiences? I think it depends on what comes next. If the quality of the tools is good, and you have the right "set of expectations," you can have a decent free PLM ride. You need to make it sustainable. If I need a "commute car," I will never take advantage of the entertainment system in the back of my limousine. If my "commute car" is stuck, I don't need the entertainment either. How to have the right tools to get the job done – this is the right question to ask these days. Just my thoughts…

Best, Oleg


PLM Network Effect and Single Point of Truths

October 23, 2010

A few weeks ago, I had a chance to attend a webinar – Learn How PLM Propels Innovation at Mercury Marine. The webinar is available now on demand. Navigate your browser to the following link to see a recorded version of this webinar (requires an additional registration). This webinar provides a comprehensive study of the Siemens PLM implementation at Mercury Marine. I recommend spending your time watching it. If you are running a PLM implementation, you can find some interesting hints about the decisions made by the Mercury Marine people and the implementation team. However, I want to focus on a specific issue covered by this implementation – Single Source of Truth. The notion of a single source of truth is a popular one in the PLM world. In the past, I wrote about it in a few of my posts: PLM and A Single Point of Disagreement, Back to basics: PLM and Single Point of Truth and My slice of PLM Single Version of Truth. So, why did I decide to come back to this story again? I want to add a few perspectives in the context of some industry and technological trends that became clearer over the past year.

PLM and Single Model

The idea of a single point of truth is based on a few concepts that were developed by the PLM industry during the last 10-15 years: flexible data repository, common data model, data integration and federation. The fundamental belief behind these concepts relies on the ability to manage a central database that provides universal and scalable data storage for the product lifecycle. In addition, it assumes that company data will be integrated into this repository and all people in the company will have access to this data. In the case of data located in other systems, data federation can be used to connect external data sources to this repository, which assumes a central data model consolidating these parameters. All major PLM vendors are using these concepts. Implementations vary, and this differentiates one PLM system from another.
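The repository-plus-federation pattern described above can be sketched as follows (the class, keys and source names are hypothetical, not any vendor's API): a central store answers first, and external sources are consulted through a common lookup interface only when the central store has no record.

```python
from typing import Callable, Dict, List, Optional

class FederatedRepository:
    """Sketch of a central repository with federated external data sources."""

    def __init__(self) -> None:
        self.central: Dict[str, dict] = {}                  # the central PLM database
        self.sources: List[Callable[[str], Optional[dict]]] = []  # ERP, CAD vault, ...

    def register_source(self, lookup: Callable[[str], Optional[dict]]) -> None:
        # Each federated source is a callable mapping a key to a record (or None).
        self.sources.append(lookup)

    def get(self, part_number: str) -> Optional[dict]:
        # Prefer the central model; fall back to federated sources in order.
        if part_number in self.central:
            return self.central[part_number]
        for lookup in self.sources:
            record = lookup(part_number)
            if record is not None:
                return record
        return None

repo = FederatedRepository()
repo.central["P-100"] = {"desc": "bracket", "source": "plm"}
repo.register_source(
    lambda pn: {"desc": "bolt", "source": "erp"} if pn == "P-200" else None)

print(repo.get("P-100")["source"])  # plm – served by the central model
print(repo.get("P-200")["source"])  # erp – resolved via federation
```

The downsides listed later in the post show up directly in this sketch: every federated source must agree on the common keys and record shape, and any change to the central model ripples into each registered lookup.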

In the case of Mercury Marine, the following picture presented a clear view of how the product lifecycle is integrated around the central data model and data representation in Teamcenter.

This model represents a stable way to implement the data lifecycle with the ability to control global changes, access and process orchestration. However, this architecture has a few potential downsides: 1- it requires the definition of a single model and agreement about this model across the organization; 2- additional cost of integration; and 3- high sensitivity to change requests.

Networks vs. Databases

In my view, we have seen significant growth in network-based architectures over the last decade. The boost of networks was caused by the development of Internet technologies, wide adoption of web resources and mobile expansion. If you come today to any organizational IT department, you can easily find people who understand RDBMS. However, try to find people who understand networks. Not so easy… Remember client-server technologies 10-15 years ago? Everybody understood mainframes… Today, the same is happening with networks. Try to find people who understand networks of data, RSS, mobile data connectors, on-demand replication, complex network data architectures. In my view, network organization will become dominant in the next 10 years. Significant growth in data and a high demand to lower the cost of data management solutions will require finding a new, reliable way to manage the product data lifecycle. Think about the following "dream architecture":

Yes, it lacks details for the moment. However, the network effect can move the current product data lifecycle towards new ways of keeping data consistent and will provide an alternative way to organize data.

What is my conclusion? The biggest question enterprise IT will need to answer is the cost of IT servers. Data management and data lifecycle are among the strongest data consumers in the organization. Central PLM databases and consolidated data storage can be too expensive for the 2010s. Organizations need to learn about successful PLM implementations and understand how to make them more efficient in the future. Just my thoughts…

Best, Oleg


PLM SharePoint Thoughts

October 22, 2010

I attended the SPTech Boston Conference today. Navigate your browser to this link to get some information about it. Microsoft is rolling out SharePoint 2010, and I wanted to understand what traction it is getting in the developer community. To understand that, I spent a couple of hours at the SPTech exhibition.

SharePoint Collaboration

Collaboration remains one of the most popular words in the SharePoint field. The companies at the exhibition can be classified into two groups: 1- administrating SharePoint; 2- collaborating with SharePoint. Compared to my previous SharePoint conference, I can see a definite downward trend in companies helping people to install, configure and maintain SharePoint. Does it mean this stuff got easier over the last 1-2 years? No, I don't think so. I think people just learned a bit.

The second group of companies is focusing on how people can collaborate on data and documents. SharePoint remains a strong player in this domain. Nevertheless, multiple vendors are working to make it more usable and more attractive. In the context of engineering software, this relates to two fields – viewing solutions and document sharing. Viewing solutions were presented by a few companies – adlibsoftware, Atalasoft, BAInsight, Surfray and some others. Most of the companies in this space are working on support for multiple document formats and struggling with the complexity of engineering documents. I will spend more time in the coming weeks learning what these companies are doing.

The Google Wave Miracles

Very interesting things are happening in what I call the "after Google Wave" age. I can see companies coming up with nice ideas very similar to the original Google Wave ideas. Google was very ambitious in their plans. Most of the products I've seen are similar but try to provide a niche solution. However, the following product was kind of different and interesting. Atalasoft Inc. came with an early beta of Vizit Social eXchange (VSX). They are extending SharePoint with a very interesting feature – social discussions around SharePoint content. I found it kind of cool. You can mark any piece of content in a document (i.e. PowerPoint, Acrobat PDF, Word) and organize a discussion around it. The product is still in beta. You can contact the company using the following link.

The Cost of Free SharePoint
My last portion of SharePoint thoughts is indirectly related to the Open Source SharePoint PLM solution announced a few weeks ago. You can see more details in my previous blog post about it – PLM SharePoint: Silver Bullet or Fierce Criticism. I think one important point related to the SharePoint-based PLM solution was missed in that article – solution cost. I want to thank ArnoldIT for sharing the link to the SharePoint Price Calculator (the link was live when I published this). I think you can find it interesting. It shares some numbers and details related to SharePoint 2010 licensing. Here is the example I ran on this sample:

What is my conclusion today? SharePoint continues to be interesting as a product that drives people and organizations. The diversity of solutions on top of SharePoint is high, and Microsoft continues to leave big gaps in functionality, allowing partners to develop decent solutions. At the same time, the price of "Free SharePoint" is far from free. The complexity of a SharePoint solution is always beyond average. You should consider it before you move… Just my thoughts.
Best, Oleg

