Rethinking PLM ROI in the cloud era

March 6, 2015

cloud-plm-roi

Measuring ROI is an important part of any PLM implementation decision. These days companies are less interested in 3-5 year implementation roadmaps, so the discussion about PLM ROI can be very painful, and it is better to come prepared. PLM ROI is not the simplest thing to get into. I shared some thoughts earlier on my blog about why PLM ROI is hard to sell. I think companies have trouble measuring ROI for PLM implementations. The maturity level at which companies measure product development efficiency is low, especially compared to how they measure financial activities.

Jim Brown’s guest post on the PTC Creo blog made me think again about PLM ROI. Navigate to Cloud PLM – A Big Return with a Smaller Investment to read more. It turns out cloud can change PLM ROI mathematics. Here is my favorite passage from the post:

Cloud Changes ROI Mathematics. The ROI for cloud solutions is fundamentally different. On-premise, licensed approaches take a lot of justification and validation before pulling the trigger. Why? Because you have to commit to spend a lot of money without any guarantee that you’ll get a return. Effectively, your “I” is fixed and your “R” is variable. That means you have to make a big bet on the value and you can end up upside-down on your investment. With cloud, your cost is aligned with value. As more people use the system the costs rise proportionally. That takes away the big financial hole that you had to dig at the beginning of a project and had to hope it would pay off. It also puts the vendor in the same camp as you. If you aren’t successful, neither are they. That’s a “win-win."

It made me think more deeply about the PLM ROI formula in the cloud era. Cloud (or SaaS) systems change two aspects of cost – (1) upfront cost and (2) IT investment and running cost. Together, these are converted into the cost of "PLM services". I agree with Jim – it makes PLM more affordable for many companies by lowering the initial barrier to entering the PLM world.

And it does change the PLM ROI math. Assuming companies can implement PLM faster (no time spent on installation, hardware allocation, etc.) and the upfront cost is lower, the return on PLM investment comes faster. However, here is the thing… Despite all the obvious advantages of the cloud model, cloud PLM cannot solve the fundamental problem of PLM implementations – most of them get stuck on people and organizational change. So, after a quick honeymoon, cloud PLM systems can fall back to the traditional PLM ROI math. And it all starts with the way we measure value – this is the core problem of PLM ROI calculations. Jim speaks about the alignment of cost and value. The value of cloud PLM will be reflected in usage and aligned with cost. However, I see that only as a starting point. PLM vendors should think beyond usage. Cloud PLM is an opportunity to provide a calculated, real-time ROI.
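To illustrate how the math shifts, here is a minimal payback sketch. All figures are hypothetical, chosen only to show the shape of the comparison: on-premise PLM front-loads the "I", while cloud spreads it over subscription months.

```python
# Illustrative payback comparison: on-premise vs. cloud PLM.
# All dollar figures below are invented for illustration only.

def months_to_break_even(upfront, monthly_cost, monthly_value):
    """Return the first month where cumulative value covers cumulative cost."""
    cumulative_cost = upfront
    cumulative_value = 0.0
    for month in range(1, 121):  # cap the horizon at 10 years
        cumulative_cost += monthly_cost
        cumulative_value += monthly_value
        if cumulative_value >= cumulative_cost:
            return month
    return None  # no payback within the horizon

# On-premise: large license + IT investment up front, smaller running cost.
on_prem = months_to_break_even(upfront=500_000, monthly_cost=5_000, monthly_value=25_000)

# Cloud: no upfront cost, subscription scales with usage.
cloud = months_to_break_even(upfront=0, monthly_cost=10_000, monthly_value=25_000)

print(on_prem, cloud)  # → 25 1
```

With the same assumed monthly value, the on-premise scenario digs a hole that takes two years to climb out of, while the cloud scenario is net-positive from the first month – exactly the "cost aligned with value" point Jim makes.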

Think about cloud software and PLM DevOps. The nature of cloud software makes it much easier to monitor and measure processes, so PLM vendors can help organizations measure their PLM ROI. Cloud software provides many more tools to produce KPIs and related parameters that can be used to calculate ROI. However, manufacturing companies will still have to do their part by bringing organizational information into PLM systems. Without that, ROI will remain a fantasy of PLM vendors.
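A cloud platform sees every user action, so a vendor could turn usage events into a rough, continuously updated value estimate. The sketch below is purely hypothetical – the event names, the minutes-saved model and the hourly rate are all assumptions a company would have to supply from its own organizational data, which is exactly the point of the paragraph above.

```python
# Hypothetical sketch: deriving simple ROI-related KPIs from cloud PLM
# usage events. Event names and the value model are invented assumptions.

from collections import Counter

def usage_kpis(events, minutes_saved_per_event, hourly_rate):
    """Aggregate usage events into an estimated value figure.

    events: list of (user, event_type) tuples as a cloud platform might log.
    minutes_saved_per_event: assumed time saved per event vs. the old process.
    hourly_rate: assumed fully loaded cost of an engineering hour.
    """
    counts = Counter(event_type for _, event_type in events)
    estimated_value = sum(
        counts[e] * minutes_saved_per_event.get(e, 0) / 60 * hourly_rate
        for e in counts
    )
    return {
        "events": dict(counts),
        "active_users": len({user for user, _ in events}),
        "estimated_value": estimated_value,
    }

events = [("alice", "eco_approved"), ("bob", "bom_shared"),
          ("alice", "bom_shared"), ("carol", "eco_approved")]
kpis = usage_kpis(events, {"eco_approved": 90, "bom_shared": 30}, hourly_rate=100)
print(kpis["active_users"], kpis["estimated_value"])  # → 3 400.0
```

The hard part is not the arithmetic – it is getting the company to agree on what an approved ECO or a shared BOM is actually worth, which only the organization itself can answer.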

What is my conclusion? Cloud PLM is not a silver bullet for the problem of slow PLM ROI. Cloud reduces upfront cost and can show the value of PLM faster. However, organizations should focus on how to measure the value of their PLM implementations. At the same time, PLM vendors have an opportunity to develop measurement tools that will help customers make the value of PLM clear. Overall, cloud gives both customers and vendors a new opportunity to streamline the PLM value proposition and show clear ROI. Just my thoughts…

Best, Oleg

Image courtesy of Stuart Miles at FreeDigitalPhotos.net


How cloud CAD can avoid “double PDM tax”

March 5, 2015

cloud-cad-API

My yesterday’s post – Will cloud CAD inherit the data interoperability problem? – raised a few interesting discussions about cloud data management in PDM/PLM implementations. How will cloud CAD/PDM make our life simpler? In my view, the most important part is to exclude files from the data management chain. By doing that, new cloud-based CAD systems can make data flow much more easily. The existing check-in/check-out behavior becomes redundant in cloud systems: cloud applications can save data instantly and redundantly and let you restore to any point in the change history. I also hope cloud CAD systems will have a lock function for cases where you want to prevent somebody else from changing your design. The same mechanism will also make it much easier to branch design options by leveraging direct access to all design data stored in cloud databases. This is my dream scenario.

However, data interoperability of new CAD/PDM bundles looks like a potential point of failure, and it can slow down adoption of cloud CAD systems in environments that require integration with existing desktop CAD, PDM and PLM systems. The following Engineering.com article gives some context to the problem – Dassault or Siemens PLM? The Contrasting Paths of Jaguar Land Rover and Volvo Cars. It speaks about the challenges large manufacturing companies face with CATIA V6 / ENOVIA PDM. Here are a few passages that caught my attention:

Volvo invested in Siemens PLM solution Teamcenter as a backbone, and kept CATIA V5. However, the automaker is reluctant to switch to CATIA V6 and the 3DEXPERIENCE/Enovia V6 platform. "We will not use the V6 version if it requires double PDM installations", says VCC’s Andreas Westholm, IT Director – Geely Liaison.

Volvo will not use CATIA V6 if it requires a second PDM implementation. All CATIA files are managed in Teamcenter. Since Volvo does not have any plans at this time to migrate to CATIA V6, they don’t need Dassault’s Enovia PDM as an intermediate step in the data management.

”It is not possible to work effectively with two PDM systems”, asserts the Volvo IT-director. ”And we will not use CATIA V6 if it requires double PDM installations. However, we will bring in a new V5-V6 release that facilitates the import of V6 information”.

Potentially, any cloud CAD system (with embedded PDM functionality) can create a situation similar to CATIA V6, which is a problem. Engineering and manufacturing companies adopt new software very slowly. So, to be successful, cloud CAD systems will have to co-exist and be used alongside existing desktop CAD systems. Even more important, new cloud CAD systems will have to be integrated with existing PLM products to become part of product development processes. How can future cloud CAD systems prevent the problem described by Volvo? How can they avoid a future "double PDM tax"?

I think the answer lies in a new cloud system architecture. It reminded me of one of my old posts – Why PLM needs to learn Web APIs? A potential solution to the double PDM integration problem is a combination of future cloud CAD platforms, web APIs and data openness. Think about the way most modern web platforms consume data. Seamless data streaming, no local temp file storage and standard REST-based APIs allow us to create better integrations between web systems. This is how new cloud CAD solutions can be seamlessly integrated into existing PLM solutions and eliminate the "double PDM tax".
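To make the idea concrete, here is a minimal sketch of the integration pattern: the PLM side consumes a part record directly from a cloud CAD web API (JSON over REST) and stores a link to the design, instead of exporting files and re-importing them through a second PDM. The endpoint shape, payload fields and mapping are all invented for illustration – no real vendor API is implied.

```python
# Hypothetical sketch of "no double PDM": a PLM system maps a JSON payload,
# as it might arrive from a cloud CAD REST endpoint, straight into a PLM
# item record. Field names and the endpoint URL are assumptions.

import json

def map_cad_item_to_plm(cad_json: str) -> dict:
    """Translate a cloud CAD item payload into a PLM-side item record."""
    item = json.loads(cad_json)
    return {
        "number": item["partNumber"],
        "revision": item["revision"],
        "title": item["name"],
        # Reference the design by URL instead of copying a file:
        "design_link": item["href"],
    }

# Payload as it might arrive from a hypothetical GET /api/v1/documents/{id}.
payload = json.dumps({
    "partNumber": "PN-1001",
    "revision": "B",
    "name": "Bracket",
    "href": "https://cad.example.com/api/v1/documents/42",
})

record = map_cad_item_to_plm(payload)
print(record["number"], record["revision"])  # → PN-1001 B
```

The design choice that matters here is the last field: the PLM record holds a stable URL to the authoritative design data in the cloud CAD system, so there is no duplicate vault and no second PDM to keep in sync.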

What is my conclusion? The future of cloud CAD/PDM bundles is promising and can provide many advantages to users – transparent data management, easy revision management and collaboration. However, it is very important to think about how new cloud solutions will be integrated with existing PLM platforms. Openness and web-based APIs are the two most critical elements to support integration and adoption of new systems. Just my thoughts…

Best, Oleg


Will cloud CAD inherit the data interoperability problem?

March 3, 2015

cloud-cad-pdm-interop

Cloud and CAD are probably getting to the point where they start to become a real thing – Autodesk Fusion 360, Onshape, SolidWorks Industrial Design. Cloud CAD is likely to absorb some PDM functionality to make collaboration, branching, revisions and other data management tasks easier. Cloud CAD means no files, so engineers have nothing to mess with… Life is getting more Googley, if you read John McEleney’s Onshape blog.

However, here is the thing… What if (for some crazy reason, which is easy to imagine when you deal with engineers :)) a customer decides to work with two cloud CAD systems? It is not unusual to see multiple desktop CAD systems in engineering organizations, so why would cloud CAD be any different?

In my yesterday’s blog – Cloud CAD infrastructure is getting more PDM-ish – I drew a picture of a cloud CAD/PDM bundle helping us to collaborate and manage revisions. Now, how will two cloud CAD systems work together? I tried to stretch my cloud imagination and thought about the co-existence of Google Docs and Office 365 services. Actually, it is not a very nice story – I can easily get my files distributed between my Dropbox, Google Drive and OneDrive accounts. So, what if my parts are stored on Google Drive and my assembly on Dropbox? Not sure I would like it…

A similar problem created many debates and issues in the PLM world. Do you remember the Dassault CATIA V6 story, where CATIA required the ENOVIA backend to run? It made a few customers unhappy when they discovered that they needed to run two PDM/PLM systems. I can see some similarity with the co-existence and interoperability of multiple cloud CAD/PDM bundles.

What is my conclusion? How will engineers collaborate using multiple cloud CAD systems? Cloud technology is great, but it looks like it cannot magically resolve some old fundamental problems of multiple systems, collaboration and interoperability. I wish cloud CAD/PDM vendors would think about it upfront, before customers find themselves in the middle of messy import/export/migrate data scenarios. Just my thoughts…

Best, Oleg


Cloud CAD infrastructure is getting more PDM-ish

March 2, 2015

fusion360-cloud-pdm-cad-branches

PDM was for a long time a stepchild of many CAD systems. Dealing with CAD files, their relationships and dependencies, including multiple revisions of documents, was complex and painful. So, many customers just gave up and stored files on shared drives. It was okay until our life was disrupted by a new way to get work done – online, connected, collaborative.

The initial intent of collaborative cloud systems was to solve the problem of collaboration and data sharing. The idea of providing value on top of existing desktop CAD file systems was interesting, so specialized cloud CAD file sharing and collaboration systems got focus and attention.

At the same time, CAD vendors started to think about longer-term solutions. CATIA was one of the first systems to announce the disconnect from traditional file systems. I posted about that a few years ago – The future of CAD without files.

These days, the race towards cloud CAD is accelerating the development of data management and CAD collaboration technologies for the cloud. A few weeks ago, I shared some of my thoughts about the importance of PDM technologies for cloud CAD. Cloud CAD vendors are clear about their intent to make PDM part of their core product technology.

The Autodesk Fusion 360 article – Fusion System Architecture Changes Coming in the Next Release – Why and What – caught my attention. Read it, because it contains some very interesting information about how files are going to disappear into the future cloud infrastructure. Here is the passage I captured:

With the coming release of Fusion, we will be introducing significant changes to Fusion Cloud Service architecture that lays a strong foundation on which we can build an environment that is rich in WIP DM. As part of this change, we are introducing the Autodesk Work In Progress Cloud Service which is designed to model and manage complex relationships that are associated with a design. The service is highly scalable, highly available and optimized for performance. Another important change in the February release is significant improvements to the Fusion Object Storage Service. Taken together, these changes will result in immediate benefits in the way of performance gains and high reliability in the Fusion upload and download data pipeline, and allow the Fusion team to deliver rich DM workflows in subsequent releases.

Another article from the Fusion 360 blog gives an excellent explanation of what these data services mean for the end user. These are functions that belong to the PDM system in a traditional file-based CAD/PDM setup.

…it solves so many common design problems that we’ve heard from the community, both in Fusion 360 and other programs, and improves workflows for both teams and single designers. Branching and merging lets you easily:

- Work in parallel with other members of your team.
- Explore changes or alternatives to a project and keep changes that make sense while leaving behind changes that don’t.
- Understand how your project evolved over time and what decisions were made (and why).
- Restore or reuse any design(s) in your project from any point in your project.
- Use any point in your project as a starting point for a different project.

What is my conclusion? It is hard to deliver the value of cloud design collaboration without re-thinking the way the CAD-PDM bundle operates. For a new type of cloud CAD system, it means embedding core collaborative PDM functions and making them part of the CAD system. It sounds like a very exciting time – many collaboration and data management problems are going to be solved this way. However, here is a question: what will happen when two cloud CAD systems have to collaborate together? Looks like a topic for another blog. Just my thoughts…

Best, Oleg


The route beyond PLM – m3 Manufacturing 4.0 meetup

March 1, 2015

m3berlinmeetup

I attended the m3 Manufacturing meetup in Berlin earlier this week. It was a very interesting gathering of makers, hardware geeks and manufacturers, with an absolutely crazy agenda organized by Lutz Villalba, founder of Makercloud.io. More information about the event is here. Photos from the event are here. I wanted to share my presentation – The route beyond PLM – and some of my thoughts.

The route beyond PLM – Manufacturing 4.0 meetup from Oleg Shilovitsky
Some of my thoughts after the event…

The impact of the new generation on manufacturing is much bigger than you might think. I’m speaking about the generation of people that grew up with the internet and uses the web as a platform for everything in their lives. Those people use cloud applications and mobile devices naturally. They send fewer emails and use WhatsApp chat. They use online tools like Google Docs and don’t know how to save Microsoft Word files. They don’t understand the word “processes”, but they know exactly how to manufacture cool new products.

What is my conclusion? A new generation of people is getting access to modern manufacturing environments and tools. It is a very interesting moment. Watch it carefully – many amazing things will come out of it. Just my thoughts…

Best, Oleg


Why should PLM leave its comfort zone?

February 27, 2015

Thoughts after PI Congress in Dusseldorf…

why-plm-stuck-in-comfort-zone1

Earlier this week, I attended the PI Congress in Dusseldorf. For me, it was an interesting experience – I had no chance to attend PLM events for almost two years. PI Congress is an event I’ve known from the very first time it was introduced in London back in 2011. The event became bigger and now has much richer content. Take a look at the 2015 program here. You can check back on Twitter for the stream of updates and slides. I had a great time discussing a variety of PLM-related topics with people I’ve known for many years. I met a bunch of new people and got introduced to new ideas. I will share some of them later over the weekend and next week.

The topic I want to discuss today is a bit controversial… But let’s call out the elephant in the room. The PLM industry is stuck in a comfort zone. We know that. We know the problems of the PLM industry. We know the problems of PLM systems and vendors. But we live with the status quo. Why? The simple answer is that it is a well-known comfort zone. We know how to live here. We know how to struggle with complex PLM projects. We know how to get executives on board with PLM projects and sell them the value propositions of PLM implementations. We know how to run PLM implementations, import data and bring in service organizations to complete PLM projects. We know how to upgrade PLM systems and get support from PLM vendors to do so. We know all that…

Today I want to give three main reasons why I think the PLM industry must leave its comfort zone.

1- Existing paradigms have slow ROI

Every large manufacturing firm has completed 2-3 cycles of PLM implementation over the last 15-20 years. Some of them did it with a single vendor; some jumped ship and moved between vendors. Technologies changed, versions changed, user interfaces changed. However, the fundamental ideas of PLM remained the same. PLM systems create a data model to manage information about the product and related product development processes. This “implementation” process is tedious and complex. It is about the existing manufacturing environment, the way people organize their work, and people’s egos. Most companies do it this way because they don’t know how to do things differently.

2- New generation of people

A new generation of people has grown up over the last decade. Those people use cloud applications and mobile devices naturally. They send fewer emails and use chats. They use Google Docs instead of Microsoft Office files. They think less about processes and more about design excellence. These people are getting access to modern manufacturing tools and environments, and they are creating amazing products. We see them around every day.

3- Connected life

Our life is getting more connected every day. Think Network with a capital “N”. It means nothing is located in a single place. Even more: it is less important where things are located and where people are working. What is important is to get access to the network. As soon as you are connected, you can do the job. People, devices, processes – all connected in a single live environment. Opposite to that, our PLM systems are about putting data into a single database. How can individual company databases be connected in a Network? There is a gap here we need to close.

What is my conclusion? One of the sessions at PI Congress was not PLM-related. It was a presentation by Alisee de Tonnac, co-founder of Seedstars World (check her profile on Twitter). I want to bring up just one statement from that presentation – “You’re so far behind, you think you’re first”. Think about it… Change is our biggest fear. In everything… We are afraid of change, so we keep existing systems and existing paradigms and do very little to introduce something new. However, time is running out. Fast ROI and a global, connected and mobile business environment – this is only a short list of what industry is demanding from the PLM environment. The young generation of people is less interested in old acronyms and more focused on how to get a job done efficiently. They are running a bunch of cloud tools that have nothing to do with PLM paradigms. Some of them politely ask “what is PLM?”, but they are trying to bring in systems they understand to design, engineer and manufacture products differently. So, kicking ourselves out of the comfort zone is hard. But we need to do it. Just my thoughts…

Best, Oleg


What will change existing PLM paradigms?

February 26, 2015

It is not uncommon to hear about “changing paradigms” in different domains these days. We watch Netflix and disconnect our cable TV, and use Uber instead of driving our own cars. Yesterday at PI Congress, I saw the following slide demonstrating examples of digital disruption in different industry domains.

digital-disruption-pressure

Which obviously made me think about disruption in PLM. This domain has some characteristics that make it hard to disrupt: 1/ It is dominated by a small number of very well-established vendors. 2/ The barrier to entering the space is high in terms of expertise and completeness of the solution. 3/ The decision lifecycle for customers buying software is long, and the usage lifecycle is even longer. Companies can use software for 10-15 years because of the product lifecycle (e.g. airplanes). As a result, one of the main drivers to change a PLM system is in fact that the existing PLM software will no longer be developed or supported by the vendor.

Over the last decade, we’ve seen very few examples of a fresh new paradigm starting in PLM. Aras Corp came with the enterprise open source Aras Innovator. It was a cool idea – think “Linux of PLM”. It will be interesting to see how much focus Aras puts on open source in the future.

Another fresh start was Autodesk PLM 360, which introduced a “cloud PLM alternative”. Even though the ideas of “cloud” or “hosting” weren’t new, and some vendors in the PLM space had done it before, the entrance of such a big vendor as Autodesk into this domain made a change in the industry. Three years later, we can see all PLM vendors have “cloud” in their portfolios.

There is one thing that didn’t change in PLM, and it is a very painful thing. You cannot just install PLM and start using it like email. In the PLM world it is called “implementation”. You need to figure out how PLM products will help the organization with its product development processes. And this is all about people. Technologies are easy, but people are really hard. Therefore, in my view, PLM got stuck with people. The current paradigm assumes a PLM implementation as the core, fundamental part of everything. It slows down adoption and requires extensive resources and effort from the organization. How to change that?

Have you heard about DevOps? If not, I recommend you put aside whatever you are doing and close this educational gap. It is well known in software development and is essentially a combination of two terms – “development” and “operations”. It became popular as a result of the massive introduction of new software development practices combined with cloud operations. A few months ago, I mentioned DevOps in my post – Why to ask your cloud PLM vendor about devops and kubernetes? The Business Insider article Today’s IT department is in a fight for its life helped me bring my thoughts to clarity. Here is my favorite passage:

“Devops is all about how do things faster,” Red Hat CEO Jim Whitehurst tells Business Insider. It’s the IT department’s version of Facebook’s famous mantra “go fast and break stuff.” IT departments say they had better figure out how to be faster, cheaper, and better. If they don’t, the company’s employees will no longer depend on them. They bring their own PCs, tablets and phones to work and they buy whatever cloud services they want to do their jobs. And the CIO will find his budget increasingly shifted to other manager’s pockets.

“Like the manufacturers were in the 1970s and 1980s were fighting for their lives, today’s IT departments are going to fight for their survival,” Whitehurst says. Traditional IT departments are slow and methodical. Rule no. 1 was to never bring the systems down. They would take months, even years, to roll out new software, testing everything carefully, often spending millions in the process. Devops eliminates that. Instead, IT departments tear their projects apart into teeny components that can be implemented in tiny changes every day.

The last phrase is a key one: how to tear projects apart into teeny components to be implemented in tiny changes. It made me think about the existing PLM implementation paradigm. It relies heavily on a long planning cycle and business department alignment. Once the planning is made, implementation takes a long time and puts ROI in absolutely the wrong place compared to where organizations demand it to be.

So, how can PLM adopt this new way of doing things? It requires 3 main changes: 1/ Change the state of mind. Don’t think “one big implementation”. Opposite to that, think about small steps that will make the business better, faster, more efficient. 2/ Bring new PLM business development tools that can help organizations plan in small steps. 3/ Make the PLM platform capable of functioning in DevOps mode. This requires a new type of data modeling, deployment and monitoring tools.

More to come, but I think DevOps ideas can inspire and educate PLM developers to think differently: how to develop PLM practices in a different way, how to bring a new feature in a day, and how to test changes within the next hour. These are the questions PLM developers and business consultants should ask.

how-to-change-plm-paradigm-with-devops

What is my conclusion? Changing paradigms is hard. For many years, the PLM industry’s fundamental paradigm was to rely on implementation as the adoption process for PLM technologies. It started with selling PLM toolkits that required long implementations. PLM vendors tried (and still do) an out-of-the-box approach, which mostly ended up as good marketing to demonstrate the capabilities of PLM technologies, but required implementation anyway. The cloud approach cut the need for expensive IT involvement, but still requires an implementation process. The PLM industry needs to find a way to make PLM implementation simpler and easier, so people will stop thinking about PLM implementations as a mess. Just my thoughts…

Best, Oleg

