The foundation for next PLM platforms

August 29, 2014


Platform. This is a sweet word in the lexicon of every developer. Every software vendor wants to become a platform that fuels the development of other products and serves the needs of customers. In my debates with Chad Jackson about granularity and integration earlier this month, I outlined what, in my view, differentiates tools, bundles and platforms. That discussion made me think even more about what PLM platforms are made of today. In my view, there are two major foundations for most of the PLM systems and tools developed today: 1- the 2D/3D design platform and 2- the object database modeling abstraction. Let me speak in more detail about each of these foundations.

2D/3D design platform

The geometric paradigm has provided a strong foundation for design and engineering since the early days of CAD/PLM, so CAD systems are deep in the roots of today's PLM vendors. Historically, all major PLM vendors developed their software and businesses from CAD and related engineering applications. As a result, 2D/3D geometry, design, modeling and related information is the foundation of their products. Geometry modeling combined with PDM (product data management) created the core of these platforms.

Object Database Modeling

The object data modeling paradigm is used by many CAD-agnostic PLM vendors. Many of these vendors started as PDM companies and expanded to support product development processes, so a flexible data management approach became the main foundation layer for their products. Most of these systems were developed on top of relational databases (RDBMS). The flexibility of these platforms to manage any product information and related processes is their key strength.
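
To make the idea of flexible object data modeling on top of an RDBMS more concrete, here is a minimal sketch of the generic pattern many such systems follow – object types, objects and attribute values stored as rows in generic tables, so new types can be defined at runtime without schema changes. The table and attribute names are invented for illustration; this is not the schema of any particular PDM/PLM product.

```python
# Minimal sketch of a "flexible data model" on top of an RDBMS:
# object types and attributes live as rows in generic tables,
# so new types can be added at runtime without a schema change.
# Table and attribute names are illustrative only.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE item_type (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE item      (id INTEGER PRIMARY KEY, type_id INTEGER REFERENCES item_type(id));
CREATE TABLE attr      (item_id INTEGER REFERENCES item(id), name TEXT, value TEXT);
""")

def define_type(name):
    return db.execute("INSERT INTO item_type(name) VALUES (?)", (name,)).lastrowid

def create_item(type_id, **attrs):
    item_id = db.execute("INSERT INTO item(type_id) VALUES (?)", (type_id,)).lastrowid
    db.executemany("INSERT INTO attr(item_id, name, value) VALUES (?, ?, ?)",
                   [(item_id, k, str(v)) for k, v in attrs.items()])
    return item_id

# A new "Document" type with custom attributes, defined with no schema change.
doc_type = define_type("Document")
create_item(doc_type, number="DOC-001", revision="A", state="In Work")

for name, value in db.execute("SELECT name, value FROM attr"):
    print(name, "=", value)
```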

Next PLM platform

What do you think will happen to PLM platforms in the future? Are we going to see new elements and technologies fueling future PLM development? In my view, the last decade of innovation in open source, data management, web and cloud technologies has created a new foundation for future PLM platforms. At the same time, the maturity of product lifecycle management implementations provides a better understanding of the functional architecture of PLM products. It made me think about what could become the foundation of future PLM platform development. Below are my four candidates for the role of next PLM platform foundation.

1. MBSE (Model-Based Systems Engineering)

As products get more and more complex, the approaches that help us support product development become more visible and important. A product goes far beyond 3D mechanical design and contains information about system architecture, requirements, and the functional decomposition of mechanical, electronic and software elements. From that standpoint, MBSE is a good foundation on which to create a platform, and I hear many voices these days about the future of MBSE approaches.
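
As a hedged illustration of what "system architecture, requirements and functional decomposition" might look like as data, here is a toy model in which requirements trace to functions and functions are allocated to mechanical, electronic or software components. The classes and element names are invented for this sketch; real MBSE tooling uses much richer languages such as SysML.

```python
# A toy system model in the MBSE spirit: requirements traced to functions,
# functions allocated to mechanical / electronic / software components.
# All class and element names are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    discipline: str          # "mechanical" | "electronic" | "software"

@dataclass
class Function:
    name: str
    allocated_to: list = field(default_factory=list)   # components realizing the function

@dataclass
class Requirement:
    text: str
    satisfied_by: list = field(default_factory=list)   # functions satisfying the requirement

battery   = Component("Battery pack", "electronic")
firmware  = Component("Charge controller firmware", "software")
enclosure = Component("Enclosure", "mechanical")

charge  = Function("Charge battery", allocated_to=[battery, firmware])
protect = Function("Protect electronics", allocated_to=[enclosure])

req = Requirement("The device shall recharge in under 2 hours", satisfied_by=[charge])

# Simple traceability query: which components does this requirement touch?
impacted = sorted(c.name for f in req.satisfied_by for c in f.allocated_to)
print(impacted)   # ['Battery pack', 'Charge controller firmware']
```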

2. Unbundled 3D service

3D was born as part of CAD design. Engineers need a 3D CAD system to create the actual product. However, there are many people in the manufacturing ecosystem who just need to consume 3D data or information in the context of 3D data. Think about a 3D service unbundled from the CAD system that provides the ability to visualize and re-use 3D information and combine it with other non-3D information. In my view, such an approach can create a good foundation for future PLM platforms, and I can see PLM vendors adopting some elements of it today.
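
To give a flavor of what an unbundled 3D service could look like, here is a hypothetical sketch of a client (using the Python requests library) that asks a standalone translation/viewing service for a lightweight web viewable of a CAD file and combines the result with non-3D product data. The service URL, endpoint and JSON fields are all invented for this example and do not refer to any specific vendor API.

```python
# Hypothetical client for an unbundled 3D service: upload a CAD file,
# get back a lightweight viewable that can be mixed with non-3D data.
# The URL, endpoint and JSON fields below are invented for illustration only.
import requests

SERVICE_URL = "https://viewer.example.com/api"   # placeholder, not a real service

def publish_viewable(cad_path, part_number):
    with open(cad_path, "rb") as f:
        resp = requests.post(f"{SERVICE_URL}/translations",
                             files={"file": f},
                             data={"target": "web-viewable"})
    resp.raise_for_status()
    job = resp.json()   # assumed response shape, e.g. {"viewable_url": "..."}
    # Combine the unbundled 3D output with non-3D product data.
    return {
        "part_number": part_number,
        "viewable_url": job.get("viewable_url"),
        "source_file": cad_path,
    }

if __name__ == "__main__":
    print(publish_viewable("bracket.step", "PN-1042"))
```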

3. Product development standards

The level of dependency in a modern manufacturing ecosystem is huge. You can hardly find a single manufacturing company solely responsible for the development of its products. Companies rely on development partners and hundreds of suppliers. Therefore, standards are getting more and more important. Some product development and vertical industry standards can provide a functional foundation for future PLM platforms too.

4. Database technologies, big data and web infrastructure

Data technology is a key element of any PLM system. We need to be able to manage a diverse set of information about a product – visual, structured and unstructured. The functional requirements range from the ability to create and maintain that information to the ability to analyze and re-use it in a very scalable way. The modern data management software stack can become a foundation for future PLM platforms.

What is my conclusion? Product and technological development go together. New platforms can arise as a result of product maturity and technological innovation. I see these four sources as core elements of platform innovation. This is, of course, not an exhaustive list, and I can also see a potential mix of these approaches. These are just my thoughts and I’m looking forward to your comments.

Best, Oleg


Traditional PLM has reached its limits

June 11, 2014


Selling PLM to small and medium enterprise (SME) companies is a challenging task. I guess many of my readers will agree with me. I expressed some of my thoughts here – Why PLM stuck to provide solutions for SME? In contrast, large manufacturing companies, especially in the aerospace, automotive and defense industries, were always a sweet spot for selling PLM solutions. Actually, not any more…

Earlier this week, my attention was caught by a CIMdata article – CIMdata Announces the Formation of an Aerospace & Defense PLM Action Group. Here is how CIMdata President Peter Bilello defines the objective of the group:

“PLM solution providers continually deliver new products, architectures, and solutions to market, while industrial customers must cope with previous product launches, attempting to realize the value from existing PLM investments. The Aerospace & Defense PLM Action Group will define and direct the research efforts on key areas needed to meet future challenges. Current experiences with PLM-related implementations, best practice research, and close examination of emerging technologies will help define what the PLM solution providers should be offering.”

Another article, by ConnectPress, speaks about the top three challenges of PLM in the A&D industry – global collaboration, integration and obsolescence management.

#CIMdata Aerospace & Defense #PLM Action Group Addresses The Big 3: http://goo.gl/flMUx4 via @ConnectPressLtd

Integration is a topic near and dear to my heart. In my view, the future of manufacturing will depend heavily on solving old integration problems. Multiple enterprise software systems are a reality in all large manufacturing companies, and I guess aerospace and defense companies are an absolutely extreme case of multiple systems integrated together. This is a place where existing PLM systems might have a challenge. Here is my favorite passage from the ConnectPress article:

According to Roche, most major aerospace companies make a major investment in PLM, either changing to a new system or upgrading their current system, roughly every five to 10 years. But in more recent iterations of this cycle aerospace companies have been spending more money and seeing smaller returns on their investments. The reason for this appears to be because the traditional PLM components have reached the limits of what they can offer.

The areas mentioned as expected to bring the maximum ROI from PLM investment are non-core PLM domains such as requirements management, configuration management, change management, service lifecycle management, etc.

It made me think that integrating these solutions with core PLM modules can introduce a significant problem. For most PLM systems, the architecture and technologies of core functions such as CAD data management and BOM management were designed 10-15 years ago. Connecting heavily customized core PLM modules with expanded PLM solutions can bring significant service and implementation expenses.

In my view, the following four major paradigms used by existing core PLM modules will require some sort of architectural upgrade to take them to the next level of integration in large global companies: 1. Global data management; 2. Concurrent design and related content access; 3. Management of multiple Bills of Materials; 4. Cross-system data federation and integration.
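
As a small, hedged illustration of the third paradigm, here is a sketch of multiple Bill of Materials views (engineering and manufacturing) defined as different structures over the same shared item master. The item numbers and structures are invented for the example; real systems add revisions, effectivity and much more.

```python
# Sketch of the "multiple Bill of Materials" idea: EBOM and MBOM are
# different structures built over the same shared item master.
# Item numbers and structures are invented for illustration.
items = {
    "ASM-100": "Pump assembly",
    "PRT-110": "Housing",
    "PRT-120": "Impeller",
    "PKG-900": "Packaging kit",   # appears only in the manufacturing view
}

# Each BOM view maps parent -> list of (child, quantity)
ebom = {"ASM-100": [("PRT-110", 1), ("PRT-120", 1)]}
mbom = {"ASM-100": [("PRT-110", 1), ("PRT-120", 1), ("PKG-900", 1)]}

def flatten(bom, root, qty=1, level=0):
    """Walk one BOM view; every view resolves against the same item master."""
    print("  " * level + f"{root} ({items[root]}) x{qty}")
    for child, child_qty in bom.get(root, []):
        flatten(bom, child, qty * child_qty, level + 1)

print("EBOM view:")
flatten(ebom, "ASM-100")
print("MBOM view:")
flatten(mbom, "ASM-100")
```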

What is my conclusion? Redesigning the core PLM functions can be an interesting challenge for major PLM suppliers. In my view, this is something that will require a significant revamp of existing platforms and data management paradigms. Cloud can help solve the global collaboration challenge. Will cloud help vendors solve the problem of multiple-system integration? It looks to me like a good topic to discuss. Just my thoughts…

Best, Oleg


PLM One Big Silo

June 9, 2014


Silos are an interesting topic in enterprise software, and a very important one for product lifecycle management. Why so? Because PLM relies heavily on the ability to work and communicate across the organization and the extended value chain. Accessing information in multiple departments, functional domains and applications is part of this story. Silos are clearly one of the biggest PLM challenges. At the same time, silos can also be a good thing: they are a reflection of the org structure and a sort of order we can use to navigate inside an organization.

Engineering.com posted a PLM/ERP article – "Demolish the silos in PLM": Why Dassault's Bernard Charles believes in the 3D Experience. Read the article and draw your own conclusions. My understanding is that Dassault Systèmes is combining multiple technologies and products belonging to different organizational domains to improve communication and information access across silos in the organization.

Dassault Systèmes is not alone in trying to crush silos. The article references other PLM companies' efforts to interconnect people and products. I liked the following passage:

The main idea behind DS’ 3DExperience is to provide the IT tools needed to break down the silos and connect the development work not only to software, electronics and manufacturing, but also to the end-customers. No doubt there are similarities and touch points between what this solution aims to do and Siemens PLM’s Industry 4.0 concept as well as PTC’s broader ALM, MES and SLM/IoT scope. The difference is that Siemens PLM places a higher priority on the engineering side of product realization, whereas PTC presently zooms in on the aftermarket and product-as-a-service concept.

Interestingly enough, the web has also become infected with the problem of silos. Large Web 2.0 platforms are very similar to enterprise software silos, which raises a lot of questions about the availability of information across the web. There are quite a lot of debates these days around the topic of web openness and information access. I've been reading Aral Balkan's article – How Web 2.0 killed the Internet. The article contains a lot of controversial opinions about the development of Web 2.0 and the way Open APIs were used to support the development of huge centralized systems such as Facebook, Google, LinkedIn, Twitter and some others.

The thing that made me think the most was the question of openness and Open APIs. Here is the passage from the article:

An Open API is simply a limited, revokable license to access and consume a certain subset of data and/or functionality that belongs to a closed silo. It is a visitor’s pass. The terms of the pass grant you limited access for a specific time. The terms can be altered — and routinely are altered — at the whim of the API owner. This is a painful lesson that many developers learned, for example, while working to add value to the Twitter platform by building Twitter clients. They were unceremoniously dumped after expending their energies to help Twitter build up its closed silo.

These two articles made me think about demolishing organizational silos, enterprise software, and the future trajectories of PLM development. The term "silos" is misleading: there are organizational silos and application silos. The first (organizational silos) are something that needs to be demolished to improve communication and process transparency. The second (application silos), however, are something that will be built up to connect people, applications and data. So there is a high probability of a PLM One Big Silo being built to solve the problem of communication and streamline product development.

The thing that raises my concern is related to Open APIs. Enterprise software companies might have different motivations around product data compared to Google, Facebook and Twitter. Fundamentally, however, these APIs are controlled by vendors that can turn them on and off depending on strategy, competition and many other reasons.

What is my conclusion? Building an open system is a very complicated task. I can see a shift towards creating huge monolithic vertical silos. So, PLM One Big Silo is a very possible future for customers looking for smoothly integrated systems and an aligned experience. However, my belief is that the future will belong to open systems, which will bring an additional level of innovation and flexibility. Vendors' long-term commitment to Open APIs is an important indication of a software trajectory. Just my thoughts…

Best, Oleg


PDM/PLM. Why The Cloud? Wrong question…

June 6, 2014


You may think engineers like everything new. It seems so obvious: engineers develop all the new technologies, gadgets and machines. All this new stuff… literally everything… was actually developed by engineers. Engineers are in love with everything new they develop.

But here is the problem. When it comes to deciding about the technology and software engineers use themselves to develop products, the opposite is true. Engineers are probably one of the most conservative groups of people when it comes to adopting new tech. It may take a manufacturing company months to decide on enterprise software, and when it comes to a PDM/PLM system, the evaluation can take even longer…

I've been reading a Manufacturing Business Technology Magazine article – Why The Cloud? Navigate here to read it. The article speaks about the benefits of cloud technologies such as low license cost, fast deployment and ease of data sharing, and makes some points about the advantages of cloud PLM tools. The last one is the most interesting, since it emphasizes the ability to turn manufacturing and the supply chain into a connected ecosystem. Here is my favorite passage:

As engineering and manufacturing data moves to the cloud, mid-sized manufacturers are finding that they can easily and automatically pull component and engineering data into their designs, transparently move those designs between different tools to ensure performance and manufacturability, and securely and directly publish data to suppliers worldwide for prototyping and production. Manufacturing data in the cloud is nimble: It can be connected into a larger ecosystem of cloud services and moved where you need it, when you need it.

This article made me think that the question "Why The Cloud?" is a little bit… outdated. Let me explain what that means to me. For the last 3-4 years, we've seen a massive shift of IT into the cloud. It is clear to every CIO and IT manager these days that they can benefit from the cloud. All PLM vendors are developing cloud strategies and provide some way to deploy their software in the cloud. However, this is exactly the place that requires validation. Not "why the cloud?" – that is the wrong question. The right one is: how to implement the cloud? I can see three main groups of cloud PDM/PLM tools applicable to engineers and manufacturers:

Mainstream cloud tools

IT people may disagree with me, but every day I see people using consumer and other mainstream cloud tools for business. If your Exchange cannot handle large emails, Gmail most probably can. You can share CAD and Excel files via Dropbox, Google Drive, iCloud, OneDrive and others. Most of these tools are free or very cheap, and it is very hard to prevent people from using them.

IaaS based PDM/PLM tools

Many PDM/PLM vendors are choosing IaaS as their cloud strategy. I've posted about it here. Nothing wrong with that. By leveraging elastic computing power and virtual servers, you can get a PLM system deployed on a private and/or public cloud. In most cases, we are talking about PLM solutions adapted to cloud IaaS infrastructure. While vendors can create different licensing schemas and get all the advantages of cloud infrastructure, in most cases these tools still replicate the same "PDM/PLM story". The main difference is that your server is in the cloud now, and some servers can be shared between multiple customers, so you get the cost advantage of shared resources, deployments and updates.

Specialized "born in the cloud" (PDM/PLM) tools

The main difference with these tools is that they were developed natively for the cloud. Tools in this category leverage not only the computing infrastructure, but also the social, functional and business aspects of the cloud ecosystem. Most of them implement the ability to support social interaction and communication. These tools also focus on how to share information beyond the boundary of a single organization.

What is my conclusion? The period of early adoption of cloud technologies is over. It is clear that the cloud is here to stay. However, the question of how to leverage cloud technologies and turn them into products that expand customers' ability to design and manufacture the best products is still in front of us. There is going to be a massive shift towards a different approach in the way the cloud helps to build new products. Just my thoughts…

Best, Oleg


PLM Technology vs Vertical Industries: Wrong balance?

April 14, 2014


Let's talk about PLM technologies. Err… PLM is not a technology. Even more, PLM is not even a product. So what is it? A business strategy? Product development politics? For the sake of this conversation, let's leave those debates out. I want to speak about the PLM technologies that allow you to manage product data, CAD files, bills of materials, a rich set of related information, as well as the processes around it. This technology came to us about 20-25 years ago, first as a very hard-coded set of tools. You had to build it literally differently for every customer, so it supported only large customers that were able to pay for software, infrastructure and implementation. Later on, PDM/PLM turned into software toolkits. The next step in PDM/PLM technology evolution was called flexible data modeling. The first flexible (dynamic) PLM data modeling tools were released back in 1995-2000 and… not much has changed since then.

So, what has happened since then? PLM vendors went on to develop out-of-the-box and vertical industry solutions in a massive way. David Linthicum's article Salesforce.com officially is out of ideas reminded me of the old joke about technology vs. industry plays. Here is the passage:

When you run out of new ways to provide innovative technology, you go vertical. That was the running joke among CTOs back in the day. It usually meant the market had reached the saturation point and you could not find new growth.

I found this message very relevant to what is happening in the PLM industry. PLM vendors are trying to compete by providing more comprehensive sets of data models, best practices and process templates. By doing so, vendors want to reduce the TCO of PLM implementations. It actually brings success, and many customers are using these solutions as a starting point for their PLM implementation.

So, where is the problem? In most situations, PLM is still a costly and expensive implementation; services may take up to 50% of the cost. Here is the issue – the core PLM data and process modeling technology hasn't changed much in the last 10-15 years. Data models, CAD file management, product structure, process orchestration – all these things are evolving, but very little. The fundamental capabilities are the same, and it is very expensive to develop solutions using these technologies.

You may ask me about cloud technologies. Cloud is the answer, but only partially. It solves problems related to infrastructure, deployments and updates, and it provides clear benefits there. However, from the implementation technology standpoint, it is very similar to what non-cloud solutions can offer. Another interesting passage from the InfoWorld cloud computing article explains the problem new SaaS/cloud products can experience when trying to displace existing vendors:

So many companies have tried this approach — many times — but most found limited success. I can’t help but think the same will occur here. Salesforce will soon discover that when you get into vertical industries, the existing foundation of industry-specific applications is difficult to displace. Although Salesforce can always play the SaaS card, most of those industry-specific providers have already moved to SaaS or are in the midst of such a move. That means SaaS won’t be the key differentiator it was when Salesforce first provided its powerful sales automation service more than a decade ago.

What is my conclusion? Efficiency and cost. These are the two most important things in making a PLM implementation successful. So, the technology must be improved. Data and model capturing tools, flexibility, ease of use – everything must become more efficient to support the future of manufacturing processes. How to do so? This is a good topic to discuss with technology leaders and strategists. I'm going to attend COFES 2014 in 10 days. I hope to find some answers there and share them with you.

Best, Oleg


PLM Software and Open Source Contribution

February 11, 2014


Open source is a topic that has raised much controversy in the last decade, especially when you speak about enterprise software. The trajectory of open source software has moved from absolute prohibition to a high level of popularity. In my view, the situation is interesting in the context of PLM software. A specific characteristic of PLM is the very long life span of the software: PLM systems developed and deployed 10-15 years ago continue to be in active production at many customers. However, the question of how PLM software can leverage the value of open source software remains open.

The ReadWrite article Open Source Should Thank These Five Companies puts an interesting perspective on the value of open source software for the enterprise community. In my view, the article brings examples of software components that can be re-used by enterprise vendors. However, technology is not everything. People and culture are another important element of open source contribution. Here is my favorite passage from the article:

"To encourage excellence, you need to do things out in the open,” Kreps [Jay Kreps of LinkedIn] said. “Engineers are like everyone else—if everybody’s watching, they want to look good. Otherwise we’re building a crappy internal tool that’s just good enough to meet our immediate needs.”

I have to admit, open source initiatives didn't start with web companies alone. During the past decade we've seen significant contributions to open source made by major software vendors like IBM and others, especially when you consider the massive investment in Linux projects. Another ReadWrite article tries to establish some balance in the open source contribution debate. However, the main conclusion remains the same – web companies these days lead the way in open and innovative technological development. I found the conclusion interesting:

"…old school" companies like IBM don’t get the credit they deserve. But it is the Web companies that are building data superstructure on the Internet".

I found the comment about data superstructure important in the context of our PLM software discussion. The replacement of Windows servers and other backend infrastructure by Linux and other open source software is just a matter of time, and I believe it is already an active process in many enterprise IT organizations. However, the question of the data management foundation for future PLM software remains wide open.

What is my conclusion? Open source can re-shape the landscape and future technological trajectories of PLM software. Technologies polished and contributed by web giants to the open source community can provide a solid foundation for existing PLM vendors and startup companies to build the future of scalable enterprise product data management solutions. An open culture combined with public quality acceptance can be another major shift differentiating future enterprise software developers. Just my thoughts…

Best, Oleg


PLM v BIM: Common or Different?

January 1, 2014


As a matter of fact, the PLM and BIM domains are quite independent. Nevertheless, I have heard more and more voices recently trying to create a marriage between the two. The latest one that caught my attention just before the New Year party was Jos Voskuil's blog – 2014 The year the construction industry will discover PLM? It is a bit long, but a very thoughtful article covering a variety of topics related to the history, development and perspective of PLM and BIM usage. Have a read. My favorite passage is the following one:

An intermediate conclusion might be that construction companies follow the same direction as early PLM. Standardizing the data (model) to have a common understanding between stakeholders. Construction companies might not want to implement a PLM system as ownership of data is unclear as compared to manufacturing companies every discipline or department in PLM might be another company in the construction industry.

Data collaboration between people who have different ownership of the data and different purposes for working with it is something that clearly applies to both the PLM and BIM perspectives. At the same time, as you get down to earth, you might discover many differences.

Thinking about technology and data delivery as we move from files on hard discs to data in the cloud, a lot of "application specifics" could disappear in the future if a single collaborative cloud data platform runs a variety of calculations and processes.

So, I can see why technologically driven people imagine combining PLM and BIM to provide a broader unified platform. At the same time, going down to the bits, you can discover lots of differences in data, terminology, processing and more. All together, it made me think about the top 3 common and different characteristics of PLM and BIM.

Top 3 commonalities:

1- Data sharing. Both PLM and BIM solutions need to share and access a combined set of 3D and 2D product and project data with different roles and access requirements. Data can belong to the same organization as well as to different organizations.

2- Project management. Whatever we do, we call it a project. There are some differences between discrete manufacturing projects and building projects, but in both cases we want to organize people and teams around deliverables and timelines.

3- Visualization. Both PLM and BIM have a strong tendency to visualize objects. It doesn't matter what – an airplane, a building, a car or just an office design – we want to see and experience it virtually before making it real. The commonality even extends to processes such as clash detection.

Top 3 differences:

1- Single model. You can hear both PLM and BIM people talking about a single model. It sounds similar, but I can see a big difference between handling a variety of Bills of Materials (EBOM, MBOM, etc.) and handling the different elements of information about a building (architecture, construction, equipment, etc.).

2- Processes and changes. Even though every definition of a process sounds similar, I can see a significant difference in the way changes and data integrity must be maintained between manufacturing product development and construction projects. The variety of specific models, data definitions, reporting, updates and many other specifics won't allow a single solution to support both the manufacturing (PLM) and construction (BIM) domains.

3- Tools, apps and terminology. At the end of the day, we are talking about people. PLM and BIM represent almost entirely different sets of programs for design, planning, etc., and these two tool sets come with their own specific language and terminology. Even if some techie people can see similarities between them, the similarity often goes down to the HEX code rather than anything practical. People are used to their apps and terminology, and making them change their behavior for the sake of PLM and BIM unification sounds like a crazy task and a mission impossible.

What is my conclusion? I can see some infrastructure commonality emerging in the future between PLM and BIM implementations. It will come first from the tech and computing infrastructure. As we move towards cloud-based solutions, we might see some re-use and sharing of multidisciplinary solutions for data management, project organization, visualization, mobile access, etc. However, the manufacturing (PLM) and construction (BIM) industries will keep their specific data organization, processes and terminological differences, which will drive diversity in solution delivery. Just my thoughts…

Best, Oleg

