What’s wrong with “analog PLM”?

July 6, 2015

[Image: ancient analog computing devices]

In the engineering world, the digital vs. analog distinction can be explained in a very precise manner – analog systems use continuous signals, while digital systems encode information in a discrete, binary format. Things get much more complicated when the term "digital" enters the marketing realm.

The Accenture article Faster, fitter, better: Why product innovation is going digital takes a marketing spin on this and coins the term "digital PLM" – a system or approach that can help companies improve their product development processes and innovation. I like the following passage explaining the difference between the digital and non-digital PLM worlds:

Such models, where the multiple processes and systems live in silos, inhibit the flow of relevant information needed to optimize product development. Engineers, for example, are often disconnected from the new-product introduction process. As a consequence, new-product launch teams don’t always hear about critical last-minute design changes. And because vital insights are not shared, the solutions that eventually emerge from this fragmented system just aren’t meeting customer expectations for innovation and relevance. Moreover, because of this linear approach, product launch is often delayed.

It made me think more about "digital vs. analog". An obvious picture is a company using paper trays to share information and manage processes. I can see it as an option, but chances are companies are at least using email to share information – which can still be a very complicated way to collaborate. I guess most companies are trying to step up from this analog way of sharing information and managing changes into the realm of PLM systems. But in many situations it doesn't go very well, or it gets very expensive. So, what is the source of inefficiency in "analog PLM" implementations?

One of the core elements of every PLM system is its "data model". Jos Voskuil provided a good explanation of what a "PLM data model" is in his recent article – Importance of PLM data models. According to Jos, the success of a PLM implementation depends on an efficient organization of the PLM data model. Here is the passage that explains it:

My conclusion for both situations is that it all leads to a correct (PLM) data model, allowing companies to store their data in an object-oriented manner, in this way reflecting the behavior the information objects have and the way they mature through their information lifecycle. If you are making compromises here, it has an effect on your implementation, the way processes are supported out-of-the-box by a PLM system or how information can be shared with other enterprise systems, in particular, ERP. PLM is written in parentheses as I believe in the future we will not talk about PLM or ERP separately anymore – we will talk business.

Jos is absolutely right in his assumption. The traditional world of PLM implementations requires translating everything into PLM data models. Take any PLM system and you will see this step as an essential element of every implementation. If I refer to the engineering definition of an "analog" system, it means translating the real world – organization, data, processes – into the "analog world" of data models. With all due respect to flexible PLM data models and architectures, this translation creates distortion, and the process itself is very complicated.
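To illustrate what this translation step looks like in practice, here is a minimal sketch (in Python, with class names, lifecycle states and attributes invented for the example) of the kind of object abstraction an organization has to map itself into – items, revisions, lifecycle states and BOM relationships:

```python
from dataclasses import dataclass, field

# A typical PLM object abstraction: every real-world thing must be
# squeezed into items, revisions, lifecycle states and relationships.

LIFECYCLE = ["In Work", "In Review", "Released", "Obsolete"]  # assumed states

@dataclass
class Item:
    number: str                      # e.g. a part number
    name: str
    item_type: str                   # "Part", "Document", "ECO", ...
    state: str = "In Work"
    revision: str = "A"
    children: list["Item"] = field(default_factory=list)  # BOM links

    def promote(self) -> None:
        """Move the item to the next lifecycle state."""
        i = LIFECYCLE.index(self.state)
        if i < len(LIFECYCLE) - 1:
            self.state = LIFECYCLE[i + 1]

# The "translation" step: a bicycle becomes a tree of Item objects.
frame = Item("P-100", "Frame", "Part")
wheel = Item("P-200", "Wheel", "Part")
bike = Item("P-001", "Bicycle", "Part", children=[frame, wheel])
bike.promote()  # "In Work" -> "In Review"
```

Every organization has to decide how its real parts, documents and change processes map onto abstractions like these – and that mapping is exactly the distortion-prone step described above.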

Is there a better way? I think digital PLM (or whatever other name our marketing geniuses bring) will have to abandon the old "analog PLM" practice of data modeling as a complex and outdated approach. Digital PLM will be able to use a native representation of product information and product development processes across silos, without translation into a variety of PLM data model abstractions. It doesn't mean data models will disappear tomorrow – software will have to use data models anyway. But customers and implementers will be taken out of the loop.

What is my conclusion? Think about the existing PLM data modeling approach as a way of translating native sound or video signals into electrical signals for transmission across the organization. Think about it as an "analog approach", similar to what we had in the past in audio and video recording. In PLM it created a whole layer of implementation complexity – the need to define data models and map organizational reality into these models. A future "digital PLM" will have to bring a better way and exclude people from the formal data modeling loop. It will make PLM systems simpler to understand and easier to implement. Just my thoughts…

Best, Oleg

Picture credit: Wikipedia article presenting an ancient analog computer

 



PLM, mass-customization and 3D printed high heels

June 3, 2015


What connects high heels and airplane landing gear? Some people can find the commonality, and special ones can build a business around it. My attention was caught by the following Business Insider article – A former SpaceX exec is reinventing the high heel with the help of an astronaut and a rocket scientist. Meet Dolly Singh and Thesis Couture. Dolly spent 6 years at SpaceX as a recruiter in southern California, matching people who could build rockets with the company that wanted to enter the next frontier, before deciding to redesign the high heel. The outdated design is just a metal plate, a metal shank and compressed cardboard.

The definition of the problem is key. This is the moment when high heels get a direct connection to airplanes and landing gear. The following passage explains it:

They took a cue from Musk and broke the problem down to the fundamental laws of physics acting on high heels, or chassis depending on your approach. When it comes to high heels, there’s three: how the shoe distributes weight, what happens when it hits the ground and the friction between your foot and the shoe.

Those are the only design constraints, Singh said. The basic shape of the high heel and its materials — a metal plate, a metal shank and compressed cardboard — haven’t changed in many years. “A skinny metal rod and cardboard is basically all you’re standing on when you’re wearing stilettos, so it doesn’t take a lot for scientists to see that it’s not a particularly sophisticated structure from an engineering standpoint,” Singh said. Instead of asking for help with high heels, she approached them with an engineering problem: how would they redesign a chassis to support a human’s weight and range of motion?

According to the article, Thesis Couture is planning to 3D print high heel shoes. You can learn more on their website.

[Image: Thesis Couture shoe]

The story about Thesis Couture made me think about data-driven mass customization. Mass customization is an interesting trend in manufacturing these days. The demand for customization is high: customers are looking for products tailored to their specific needs and requirements. Now, imagine for a moment that the design of a high-heel shoe is customized for your weight, your body and (just dream for a moment) the way you walk. This is an enormous opportunity to create a custom-built high-heel shoe.
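As a toy illustration of what "customized for your weight" could mean in engineering terms, here is a back-of-the-envelope sizing check for a stiletto shank treated as a simple column. The material, safety factor and static-load assumption are all mine, and a real design would have to handle much higher dynamic loads on impact:

```python
# Toy sizing check for a stiletto shank carrying the wearer's weight.
SAFETY_FACTOR = 3.0
YIELD_STRENGTH_STEEL = 250e6   # Pa, assumed mild steel

def min_shank_area(body_mass_kg: float) -> float:
    """Minimum cross-section area (m^2) so stress stays under yield."""
    load = body_mass_kg * 9.81          # static load in newtons
    return SAFETY_FACTOR * load / YIELD_STRENGTH_STEEL

area = min_shank_area(60.0)             # a 60 kg wearer
print(f"{area * 1e6:.2f} mm^2")         # ~7 mm^2 -> roughly a 3 mm steel rod
```

A per-wearer parameter like body mass flowing directly into geometry is precisely the kind of data-driven customization a 3D printed product could deliver.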

However, mass customization creates a large number of challenges for manufacturers. Product lifecycle management technologies stand right in the middle between engineering and manufacturing, and in many situations they are the centerpiece of mass customization challenges. Some of my previous articles can give you an idea about these challenges – The role of PLM in mass customization; PLM, mass customization and ugly truth about vertical BOM integration; Mass customization is the real reason for PLM to want MBOM.

What is my conclusion? Maybe I'm just dreaming, and a 3D printed, mass-customizable high-heel shoe is not what customers want. Maybe the biggest problems of mass customization come from the aircraft, automotive and some other industries. However, just think for a moment – one of the most custom-built products in the world is our body. Tailoring products to our bodies is a big challenge, and high heels are just one example. We are going to see more examples in the future. And PLM vendors should think about how to make their engineering and manufacturing technologies capable of handling human body configurations as well. Just my thoughts…

Best, Oleg

Pictures credit: BI article and Thesis Couture website


Why you should ask your cloud PLM vendor about devops and Kubernetes

October 23, 2014


I want to continue the theme of how we move to the cloud. While Amazon remains one of the major providers of elastic computing services, other options are emerging too. If you are considering moving your PLM initiatives to the cloud, you might want to analyze how cloud PLM can actually be built. A few weeks ago I was talking about large manufacturing and the public cloud. Public cloud is an interesting option. At the same time, regulated manufacturers and companies with significant security restrictions can question this path. One of the alternatives for these companies can be the just-announced Azure Cloud System from Microsoft/Dell. It will take time for PLM vendors to support it, but cloud PLM in an Azure box can become a reality soon.

Today I want to speak more about some trends in cloud computing and how they can relate to your future cloud PLM project. Remember my article What cloud PLM cannot do for you? The biggest achievement of cloud PLM today is the removal of IT hassle and complexity. With cloud PLM you don't need to think about servers, installations or even upgrades. However, here is the thing: the number of cloud applications is growing, and application lifecycle is getting more interesting these days. A large enough company can easily face the situation of managing multiple clouds – public and private at the same time. The complexity of a manufacturing organization, supply chain, security or other IT-related reasons can easily bring you to such a situation. These are not simple questions, and it is very important to create the right strategy for an IT organization managing cloud PLM and other application providers.

Devops

You can consider "devops" a new buzzword. It comes from the combination of "development" and "operations". Brick-and-mortar PLM software vendors did development only: they developed, tested and shipped CAD, PDM and PLM software on CDs, and you had to hire IT specialists to install, configure and run it. It is different with cloud software. By removing the IT hassle from the customer, the software vendor takes on the role of IT too. It creates a new paradigm of development and operations together. Think about engineering and manufacturing – they have to go together to make it work.

The InfoWorld article Devops has moved out of the cloud speaks more about the devops trend. I like the way it demystifies the cloud by explaining how the same practices can be used for both cloud and on-premises development and IT environments. It also helps you understand the importance of operations in achieving quality of cloud services. Here is my favorite passage:

Many people attribute the rise of devops directly to the growth of cloud computing. The connection: It’s easy to continuously update cloud applications and infrastructure. For example, a SaaS application typically requires 1,000 lines or more of changed or added code each time you use it. Its functionality is continuously updated, which makes the cloud-delivered application, platform, or infrastructure more valuable to the users. Gone are the days when you received CDs or DVDs in the mail and had to manually update the servers. Although the cloud is certainly a better place for devops, I don’t believe that devops should be used only in cloud deployments. Instead, you should use devops approaches and enabling tools such as Puppet or Chef in most of the development you do these days — both cloud and on-premises.
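Purely as an illustration of the idea behind tools like Puppet or Chef – you declare a desired state and the tool converges the machine to it, instead of scripting steps – here is a minimal Python sketch. The package names and the Debian-style commands are assumptions of mine; the real tools use their own DSLs:

```python
import subprocess

# Desired state, declared as data rather than as a script of steps.
DESIRED_PACKAGES = ["nginx", "postgresql"]   # assumed package names

def is_installed(pkg: str) -> bool:
    """Check package state via dpkg (Debian/Ubuntu assumed)."""
    result = subprocess.run(["dpkg", "-s", pkg], capture_output=True)
    return result.returncode == 0

def converge() -> None:
    """Idempotent: running it twice changes nothing the second time."""
    for pkg in DESIRED_PACKAGES:
        if not is_installed(pkg):
            subprocess.run(["apt-get", "install", "-y", pkg], check=True)

if __name__ == "__main__":
    converge()
```

Because the run is idempotent, the same definition can be applied continuously to cloud and on-premises machines alike – which is exactly the point the passage makes.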

Kubernetes

We need to thank Amazon EC2 and the other IaaS vendors for the incredible success of cloud computing we have today. However, technology doesn't stand still. Over the last decade, public web companies have learned many lessons about managing infrastructure and software development on demand and at scale.

Kubernetes is an example of how web companies can scale using cloud infrastructure. Navigate to the ComputerWeekly article – Demystifying Kubernetes: the tool to manage Google-scale workloads in the cloud – and spend some time with it, even if you consider it a bit technical. In a nutshell, it speaks about a new technology for cloud deployment – containers, which come to replace the well-known VMs (virtual machines). Here is the most important passage, in my view:

Kubernetes and Docker deliver the promise of PaaS through a simplified mechanism. Once the system administrators configure and deploy Kubernetes on a specific infrastructure, developers can start pushing the code into the clusters. This hides the complexity of dealing with the command line tools, APIs and dashboards of specific IaaS providers. Developers can define the application stack in a declarative form and Kubernetes will use that information to provision and manage the pods. If the code, the container or the VM experience disruption, Kubernetes will replace that entity with a healthy one.
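To make the "declarative form" concrete, here is a minimal sketch using the official Kubernetes Python client. The deployment name, container image and replica count are placeholders of my own:

```python
from kubernetes import client, config

# Declare the desired application stack; Kubernetes converges the
# cluster to this state and replaces unhealthy pods automatically.
deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="plm-web"),  # hypothetical app name
    spec=client.V1DeploymentSpec(
        replicas=3,  # desired number of pods
        selector=client.V1LabelSelector(match_labels={"app": "plm-web"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "plm-web"}),
            spec=client.V1PodSpec(containers=[
                client.V1Container(name="web", image="example/plm-web:1.0"),
            ]),
        ),
    ),
)

config.load_kube_config()  # use local kubeconfig credentials
client.AppsV1Api().create_namespaced_deployment(
    namespace="default", body=deployment)
```

Notice there is no step-by-step installation script here – only a description of the desired state, which the cluster maintains on its own.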

[Image: containers vs. virtual machines]

While it may sound too complex, the key issue here is the lifecycle of complex cloud PLM environments. At the end of the day, cloud PLM vendors will have to manage updates, introduce new features, maintain data and more. This technical example shows you the gap between the new type of cloud infrastructure and simply moving an existing PLM server from your server room to the cloud.

What is my conclusion? We should move beyond the "cloud PLM" buzzword. Enterprise software vendors are moving from shipping CDs towards selling software services. It simplifies the customer experience, but creates new layers of complexity in the vendor's organization. It moves software development towards devops and creates technologies capable of managing the application lifecycle more easily. In the end, it determines the quality of a PLM cloud service. Keep it in mind when you evaluate your future cloud PLM project. Just my thoughts…

Best, Oleg


The foundation for next PLM platforms

August 29, 2014


Platform. This is a sweet word in the lexicon of every developer. The desire of software vendors is to become a platform that fuels the development of other products and serves the needs of customers. In my debates with Chad Jackson about granularity and integration earlier this month, I outlined what, in my view, differentiates tools, bundles and platforms. That discussion made me think even more about what PLM platforms are made of today. In my view, there are two major foundations for most PLM systems and tools developed today: (1) the 2D/3D design platform and (2) the object database modeling abstraction. Let me speak in more detail about each of these foundations.

2D/3D design platform

The geometric paradigm has provided a strong foundation for design and engineering since the early days of CAD/PLM, so CAD systems are deep in the roots of today's PLM vendors. Historically, all major PLM vendors developed their software and businesses from CAD and related engineering applications. As a result, 2D/3D geometry, design, modeling and related information is the foundation of their products. Geometry modeling combined with PDM (product data management) created the core of these platforms.

Object Database Modeling

The object data modeling paradigm is used by many CAD-agnostic PLM vendors. Many of these vendors started as PDM companies and expanded to support product development processes. Therefore, a flexible data management approach became the main foundation layer for these products. Most of these systems were developed on top of relational databases (RDBMS). The flexibility of these platforms to manage any product information and related processes is a key strength.
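One common way such systems achieve this flexibility on top of an RDBMS is an entity-attribute-value style schema, where object types and attributes are rows rather than fixed table columns. Here is a minimal sketch; the table and attribute names are illustrative, not taken from any specific product:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Object types and attributes are data, not hard-coded columns,
    -- so new "classes" can be added without schema changes.
    CREATE TABLE object (id INTEGER PRIMARY KEY, type TEXT);
    CREATE TABLE attribute (
        object_id INTEGER REFERENCES object(id),
        name TEXT, value TEXT
    );
""")

def create_object(obj_type: str, **attrs: str) -> int:
    cur = conn.execute("INSERT INTO object (type) VALUES (?)", (obj_type,))
    oid = cur.lastrowid
    conn.executemany(
        "INSERT INTO attribute (object_id, name, value) VALUES (?, ?, ?)",
        [(oid, k, v) for k, v in attrs.items()],
    )
    return oid

# A "Part" and an "ECO" live in the same two tables.
part = create_object("Part", number="P-100", material="Aluminum")
eco = create_object("ECO", reason="Tolerance change", affects="P-100")
```

The trade-off is well known: maximum flexibility for administrators, at the cost of more complex queries and reporting.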

Next PLM platform

What will happen in the future of PLM platforms? Are we going to see new elements and technologies fueling future PLM development? In my view, the last decade of innovation in open source, data management, web and cloud technologies has created a new foundation for future PLM platforms. At the same time, the maturity of product lifecycle management implementations provides a better understanding of the functional architecture of PLM products. It made me think about what could become the foundation of future PLM platform development. Below are my four candidates to play the role of the next PLM platform foundation.

1. MBSE (Model-Based Systems Engineering)

As products get more and more complex, approaches that help support product development become more visible and important. A product goes far beyond 3D mechanical design and contains information about system architecture, requirements and the functional decomposition of mechanical, electronic and software elements. From that standpoint, MBSE is a good foundation for a platform, and I hear many voices these days about the future of MBSE approaches.
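As a small illustration of the kind of information MBSE manages beyond 3D geometry, here is a minimal sketch (all names invented for the example) of a requirement traced through a function to mixed-domain components:

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    text: str

@dataclass
class Component:
    name: str
    domain: str    # "mechanical" | "electronic" | "software"

@dataclass
class Function:
    name: str
    satisfies: list[Requirement] = field(default_factory=list)
    allocated_to: list[Component] = field(default_factory=list)

# One system function, traced from a requirement down to components
# that span mechanical, electronic and software domains.
req = Requirement("Vehicle shall stop from 100 km/h within 40 m")
braking = Function("Decelerate vehicle", satisfies=[req], allocated_to=[
    Component("Brake caliper", "mechanical"),
    Component("ABS controller", "electronic"),
    Component("ABS firmware", "software"),
])
```

This traceability across domains, not the 3D model, is what a MBSE-centric platform would put at its core.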

2. Unbundled 3D service

3D was born as part of CAD design. Engineers need a 3D CAD system to create the actual product. However, there are many people in the manufacturing ecosystem who just need to consume 3D data, or information in the context of 3D data. Think about a 3D service unbundled from the CAD system, providing the ability to visualize and re-use 3D information and combine it with other non-3D information. In my view, such an approach can create a good foundation for future PLM platforms. I can see PLM vendors taking some elements of this approach today.
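As a thought experiment, an unbundled 3D service could look like a plain web API that any application calls without a CAD seat. The endpoint, routes and payloads below are entirely hypothetical, just to show the shape of such a service:

```python
import requests

BASE = "https://3d.example.com/api"  # hypothetical service endpoint

# Upload a neutral-format model once...
with open("bracket.step", "rb") as f:
    model = requests.post(f"{BASE}/models", files={"file": f}).json()

# ...then any non-CAD consumer can request derived views of it:
thumb = requests.get(f"{BASE}/models/{model['id']}/thumbnail?size=256")
mesh = requests.get(f"{BASE}/models/{model['id']}/mesh?format=gltf")

# Combine 3D with non-3D information, e.g. attach it to an ECO record.
requests.post(f"{BASE}/models/{model['id']}/links",
              json={"type": "ECO", "ref": "ECO-1042"})
```

The point of the sketch: visualization and re-use become services consumed by everyone, decoupled from the authoring tool.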

3. Product Development Standards

The level of dependency in the modern manufacturing ecosystem is huge. You can hardly find a manufacturing company solely responsible for the development of its products. Companies rely on development partners and hundreds of suppliers. Therefore, standards are getting more and more important. Some product development and vertical industry standards can provide a functional foundation for future PLM platforms too.

4. Database technologies, big data and web infrastructure

Data technology is a key element of any PLM system. We need to be able to manage a diverse set of information about a product – visual, structured and unstructured. The functional requirements range from the ability to create and maintain that information to the ability to analyze and re-use it in a very scalable way. The modern data management software stack can become a foundation for future PLM platforms.

What is my conclusion? Product and technology development go together. New platforms can arise as a result of the maturity of product and technological innovation. I see these four sources as core elements of future platform innovation. This is, of course, not an exhaustive list, and I can see a potential mix of these approaches as well. These are just my thoughts and I'm looking forward to your comments.

Best, Oleg


Traditional PLM has reached its limits

June 11, 2014


Selling PLM to small and medium enterprise (SME) companies is a challenging task. I guess many of my readers will agree with me. I expressed some of my thoughts here – Why PLM stuck to provide solutions for SME? In contrast, large manufacturing companies, especially in the aerospace, automotive and defense industries, were always a sweet spot for selling PLM solutions. Actually, not any more…

Earlier this week, my attention was caught by a CIMdata article – CIMdata Announces the Formation of an Aerospace & Defense PLM Action Group. Here is how CIMdata President Peter Bilello defines the objective of the group:

“PLM solution providers continually deliver new products, architectures, and solutions to market, while industrial customers must cope with previous product launches, attempting to realize the value from existing PLM investments. The Aerospace & Defense PLM Action Group will define and direct the research efforts on key areas needed to meet future challenges. Current experiences with PLM-related implementations, best practice research, and close examination of emerging technologies will help define what the PLM solution providers should be offering.”

Another article, by ConnectPress, speaks about the top three challenges of PLM in the A&D industry – global collaboration, integration and obsolescence management.

#CIMdata Aerospace & Defense #PLM Action Group Addresses The Big 3: http://goo.gl/flMUx4 via @ConnectPressLtd

Integration is a topic near and dear to my heart. In my view, the future of manufacturing will depend heavily on solving old integration problems. Multiple enterprise software systems are a reality in all large manufacturing companies, and I guess aerospace and defense companies are an absolutely extreme case of multiple systems integrated together. This is a place where existing PLM systems might have a challenge. Here is my favorite passage from the ConnectPress article:

According to Roche, most major aerospace companies make a major investment in PLM, either changing to a new system or upgrading their current system, roughly every five to 10 years. But in more recent iterations of this cycle aerospace companies have been spending more money and seeing smaller returns on their investments. The reason for this appears to be because the traditional PLM components have reached the limits of what they can offer.

The areas mentioned as expected to bring the maximum ROI from PLM investment are non-core PLM domains such as requirements management, configuration management, change management, service lifecycle, etc.

It made me think that integrating these solutions with core PLM modules can introduce a significant problem. In most PLM systems, the architecture and technologies of core functions such as CAD data management and BOM management were designed 10-15 years ago. Connecting heavily customized core PLM modules with expanded PLM solutions can bring significant service and implementation expenses.

In my view, the following four major paradigms used by existing core PLM modules will require some sort of architectural upgrade to take them to the next level of integration in large global companies: (1) global data management; (2) concurrent design and related content access; (3) management of multiple Bills of Materials; (4) cross-system data federation and integration.

What is my conclusion? Redesigning the core PLM functions can be an interesting challenge for major PLM suppliers. In my view, it is something that will require a significant revamp of existing platforms and data management paradigms. The cloud can help solve the global collaboration challenge. Will the cloud help vendors solve the problem of multiple-system integration? It looks to me like a good topic to discuss. Just my thoughts…

Best, Oleg


PLM One Big Silo

June 9, 2014


Silos are an interesting topic in enterprise software, and a very important one for product lifecycle management. Why so? Because PLM relies heavily on the ability to work and communicate across the organization and the extended value chain. Accessing information in multiple departments, functional domains and applications is part of this story. Silos are clearly one of the biggest PLM challenges. At the same time, silos can also be a good thing: they are a reflection of the org structure and a sort of order we can use to navigate inside an organization.

Engineering.com posted a PLM/ERP article – "Demolish the silos in PLM": Why Dassault's Bernard Charles believes in the 3D Experience. Read the article and form your own opinion. My understanding: Dassault Systèmes is combining multiple technologies and products belonging to different organizational domains to improve communication and information access across silos in an organization.

Dassault Systèmes is not alone in trying to crush silos. The article references other PLM companies' efforts to interconnect people and products. I liked the following passage:

The main idea behind DS’ 3DExperience is to provide the IT tools needed to break down the silos and connect the development work not only to software, electronics and manufacturing, but also to the end-customers. No doubt there are similarities and touch points between what this solution aims to do and Siemens PLM’s Industry 4.0 concept as well as PTC’s broader ALM, MES and SLM/IoT scope. The difference is that Siemens PLM places a higher priority on the engineering side of product realization, whereas PTC presently zooms in on the aftermarket and product-as-a-service concept.

Interestingly enough, the web has also become infected with the problem of silos. Large Web 2.0 platforms are very similar to enterprise software silos, which raises a lot of questions about the availability of information across the web. There are quite a lot of debates these days around the topic of web openness and information access. I've been reading Aral Balkan's article – How Web 2.0 killed the Internet. The article contains a lot of controversial opinions about the development of Web 2.0 and the way Open APIs were used to support the development of huge centralized systems such as Facebook, Google, LinkedIn, Twitter and others.

The thing that made me think the most was the question about openness and Open APIs. Here is the passage from the article:

An Open API is simply a limited, revokable license to access and consume a certain subset of data and/or functionality that belongs to a closed silo. It is a visitor’s pass. The terms of the pass grant you limited access for a specific time. The terms can be altered — and routinely are altered — at the whim of the API owner. This is a painful lesson that many developers learned, for example, while working to add value to the Twitter platform by building Twitter clients. They were unceremoniously dumped after expending their energies to help Twitter build up its closed silo.

These two articles made me think about demolishing organizational silos, enterprise software, and the future trajectories of PLM development. The term "silo" is misleading: there are organizational silos and application silos. The first (organizational silos) need to be demolished to improve communication and process transparency. The second (application silos), however, will be built up to connect people, applications and data. So, there is a high probability that a PLM One Big Silo will be built to solve the problem of communication and streamline product development.

The thing that raises my concern is the open API. Enterprise software companies might have different attitudes to product data compared to Google, Facebook and Twitter. Fundamentally, however, these APIs are controlled by vendors that can turn them off and on depending on strategy, competition and many other reasons.

What is my conclusion? Building an open system is a very complicated task. I can see a shift towards creating huge monolithic vertical silos. So, a PLM One Big Silo is a very possible future for customers looking for smoothly integrated systems and an aligned experience. However, my belief is that the future will belong to open systems, which will bring an additional level of innovation and flexibility. A vendor's long-term commitment to an Open API is an important indication of its software trajectory. Just my thoughts…

Best, Oleg


PDM/PLM. Why The Cloud? Wrong question…

June 6, 2014


You may think engineers like everything new. It seems so obvious: engineers develop all new technologies, gadgets and machines. All this new stuff – literally everything – was developed by engineers. Engineers are in love with everything new they develop.

But here is the problem. When it comes to deciding about the technology and software engineers use themselves to develop products, the opposite is true. Engineers are probably one of the most conservative groups of people in adopting new tech. It may take a manufacturing company months to decide on the use of enterprise software. When it comes to a PDM/PLM system, the evaluation can take even longer…

I've been reading a Manufacturing Business Technology magazine article – Why The Cloud? Navigate here to read the article. It speaks about the benefits of cloud technologies such as low license cost, fast deployment and ease of data sharing, and makes some points about the advantages of cloud PLM tools. The last one is the most interesting, since it emphasizes the ability to turn manufacturing and the supply chain into a connected ecosystem. Here is my favorite passage:

As engineering and manufacturing data moves to the cloud, mid-sized manufacturers are finding that they can easily and automatically pull component and engineering data into their designs, transparently move those designs between different tools to ensure performance and manufacturability, and securely and directly publish data to suppliers worldwide for prototyping and production. Manufacturing data in the cloud is nimble: It can be connected into a larger ecosystem of cloud services and moved where you need it, when you need it.

This article made me think that the question "Why The Cloud?" is a little bit… outdated. I will try to explain what it means to me. For the last 3-4 years, we've seen a massive shift of IT into the cloud. It is clear to every CIO and IT manager these days that they can benefit from the cloud. All PLM vendors are developing cloud strategies and provide some way to deploy their software in the cloud. However, this is exactly the place that requires validation. Not "why the cloud?" – that is the wrong question. The right one is: how to implement the cloud? I can see 3 main groups of cloud PDM/PLM tools applicable to engineers and manufacturers:

Mainstream cloud tools

IT people may disagree with me, but every day I see people using consumer and other mainstream cloud tools for business. If your Exchange server cannot handle large emails, Gmail most probably can. You can share CAD and Excel files via Dropbox, Google Drive, iCloud, OneDrive and others. Most of these tools are free or very cheap, and it is very hard to prevent people from using them.

IaaS-based PDM/PLM tools

Many PDM/PLM vendors are choosing IaaS as a cloud strategy. I've posted about it here. Nothing wrong with that. By leveraging elastic computing power and virtual servers, you can get a PLM system deployed on a private and/or public cloud. In most cases, we are talking about PLM solutions adapted to cloud IaaS infrastructure. While vendors can create different licensing schemas and get all the advantages of cloud infrastructure, in most cases these tools still replicate the same "PDM/PLM story". The main difference: your server is in the cloud now. And some servers can be shared between multiple customers, so you can get the cost advantage of shared resources, deployments and updates.

Specialized "born in the cloud" (PDM/PLM) tools

The main difference with these tools is that they were natively developed for the cloud. Tools in this category leverage not only the computing infrastructure, but also the social, functional and business aspects of the cloud ecosystem. Most of them implement the ability to support social interaction and communication. These tools also focus on how to share information beyond a single organization.

What is my conclusion? The period of early adoption of cloud technologies is over. It is clear – the cloud is here to stay. However, the question of how to leverage cloud technologies and turn them into the best products, expanding customers' ability to design and manufacture the best products, is still in front of us. There is going to be a massive shift towards a different approach in the way the cloud helps to build new products. Just my thoughts…

Best, Oleg

