Kenesto cloud PDM hybrid

December 18, 2014


A few months ago, I posted about the latest developments in Kenesto's cloud data management solution – Kenesto revamp: does it change cloud PLM game? I saw it as a sharp turn for Kenesto, away from collaboration and towards the engineering and product data management business. From earlier comments made here by Steve Bodnar of Kenesto, I learned that Kenesto is developing technology to synchronize CAD data between desktop and cloud locations. Here is the comment, made back in October:

…automatic synchronization maintains appropriate version control as well as permissions. This way, if you have “download only” permission, as an example, you can synchronize to one or more of your locations, and any updates will automatically be synchronized to those locations for you (in addition to notifications being sent).
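Bodnar's description can be sketched in a few lines of code. This is only an illustration of the idea – all class and method names below are hypothetical and not Kenesto's actual API: a cloud master copy versions each file and pushes every new version to subscribed locations, while only locations with edit permission can push changes back.

```python
from dataclasses import dataclass

@dataclass
class SyncedFile:
    name: str
    version: int = 1
    content: str = ""

class SyncLocation:
    """A desktop or cloud folder mirroring the master copy."""
    def __init__(self, permission: str):
        self.permission = permission          # "download_only" or "edit"
        self.files: dict[str, SyncedFile] = {}
        self.notifications: list[str] = []

    def receive(self, f: SyncedFile) -> None:
        # Store a private copy and record the notification the user would see.
        self.files[f.name] = SyncedFile(f.name, f.version, f.content)
        self.notifications.append(f"{f.name} updated to v{f.version}")

class MasterVault:
    """Cloud master copy; pushes every new version to subscribed locations."""
    def __init__(self):
        self.files: dict[str, SyncedFile] = {}
        self.subscribers: list[SyncLocation] = []

    def publish(self, name: str, content: str) -> None:
        current = self.files.get(name)
        version = current.version + 1 if current else 1
        f = SyncedFile(name, version, content)
        self.files[name] = f
        for loc in self.subscribers:
            loc.receive(f)  # even "download only" locations stay current

    def try_upload(self, loc: SyncLocation, name: str, content: str) -> bool:
        if loc.permission != "edit":
            return False    # download-only users cannot push changes back
        self.publish(name, content)
        return True
```

The point of the sketch is the asymmetry in the quote: synchronization is one-way for a "download only" location, yet that location still receives every update and notification automatically.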

CIMdata's recent publication about the Kenesto Collaboration Platform made me think again about what it does and how it might differ from other cloud PDM products available now or coming to market soon. What especially caught my attention in the CIMdata publication is the so-called “innovative intersection of cloud-based file management and data sharing with traditional PDM vaulting”. A massive amount of CAD data is stored on corporate networks and plain CAD desktops. It made me think Kenesto is trying to bring a solution to customers that already have traditional PDM systems and to extend those systems with a better collaborative option. The following passage from the CIMdata commentary provides more explanation:

The Kenesto solution is a secure, hybrid, cloud-desktop collaboration platform where product development and delivery teams can collaborate using discussion threads, or by co-authoring documents and design files, with anytime, anywhere access. Kenesto puts a broad range of capabilities at the fingertips of product delivery teams to organize and manage their programs, products, and projects. Teams can create their workspaces with people, workflow, forms, data, and reports—including bills of materials, change requests, and purchasing forms—and be kept on the same page with Kenesto’s proprietary intelligent synchronization approach. Each user is provided with a dashboard that can be customized to personal preferences. An important feature in Kenesto is that users are always in full control of their documents and designs. A user can permit their teammates to view, mark-up, or edit their documents and designs and can collaborate with them in real time or asynchronously.

Many of these features – projects, workspaces, workflow, forms, bills of materials, change requests, etc. – are not new in the PDM industry. However, the “cloud-desktop” hybrid sounds like a new buzzword. Does it mean Kenesto has found something unique about how to bring desktop CAD users to the cloud? It is hard to say based on a commentary alone, but it might go that way.

What is my conclusion? Market dynamics are bringing more engineering and manufacturing companies to the cloud. This creates more opportunities for cloud PDM/PLM vendors. At the same time, it raises more questions about how existing environments and data assets will be managed and how people will collaborate in a hybrid environment. Kenesto might be solving an interesting problem here and competing with other vendors in the same domain – Autodesk, SolidWorks, GrabCAD and others. Just my thoughts…

Best, Oleg

photo credit: ukCWCS via photopin cc

The photo is an illustration only and does not reflect Kenesto's architecture.


How to migrate into “future PLM platform”?

December 6, 2014


One of the topics I touched on in yesterday's post about future PLM platforms is platform migration. A customer's ability to make a move depends significantly on how the existing environment can be migrated. You can catch up on some of my earlier thoughts about PLM migration by reading the following posts – PLM upgrades, release cycle and legacy software; PLM migration and product data rock-n-roll; PLM cloud and future of upgrades.

Most large manufacturing companies (and even smaller ones) have already made some sort of investment in PLM products. What is the ROI of a move to a new platform? How do you calculate it? How do you avoid the trouble of supporting multiple versions of applications and environments? These are good questions. Customers and PLM vendors are equally interested in how to manage the move the right way.

My attention was caught by Dassault Systemes' 3Dperspective blog post – Top Three Considerations for Planning Your Move to the 3DEXPERIENCE Platform. It speaks about how customers can migrate to the new 3DEXPERIENCE platform. Here is an interesting passage:

The same data model and business process rules that power the 3DEXPERIENCE platform also powered the ENOVIA platform. In fact, the same basic approach also powered the MatrixOne platform. This is why so many of ENOVIA’s current customers have been able to successfully upgrade since their first implementation in the mid to late 1990’s.

The following picture shows the history of the 3DEXPERIENCE platform's evolution. It basically means that the same foundation platform is used by all MatrixOne and ENOVIA customers, and that migration is effortless. I'm not sure I'm happy to learn that the same data technology has been used by all generations of the system since the mid-1990s. However, it is a clear benefit for customers looking to migrate data between different versions of MatrixOne and ENOVIA V6.

3DEXPERIENCE platform evolution

Dassault Systemes' rival – Siemens PLM with its Teamcenter platform – also has a long history of transformations. I didn't find specific public references on compatibility between data models and applications across Teamcenter versions. However, the following article from the Tech-Clarity blog by Jim Brown presents an interesting diagram of Teamcenter's evolution – Siemens PLM vision 2014+.

Teamcenter platform evolution

More information about Teamcenter's evolution can be found in the following CIMdata document – Teamcenter “unified”. The following passage speaks about “migration” issues:

Siemens PLM will continue to support Teamcenter Engineering and Enterprise for those customers that have them in production. Importantly, with each release of these older products, they have updated the underlying architecture and technology so that when a customer decides to change, the transition to the unified Teamcenter solutions will be easier. They have also developed a robust suite of migration tools that can be used when moving from earlier versions of Teamcenter products to the unified platform.

What is my conclusion? Migration is a complex topic. It is probably one of the most important factors that will define the ability of large vendors to move into the bright future of next-generation PLM platforms. Regardless of which platform a customer moves to, migration will have a cost that must be calculated and validated. The idea of “federated platforms” brings some promise of minimizing migration cost. However, the mechanics of this process are not very clear. At the end of the day, data must be brutally dumped out and transferred. Application migration is even more complex. Users must be re-trained. Altogether, it is not a simple task. Just my thoughts…

Best, Oleg


The definition of cloud PLM

November 7, 2014


The PLM industry is moving towards broader adoption of cloud solutions. More people these days are asking how to implement the cloud. It is becoming more and more clear that the devil is in the details and that cloud environments can be very different. The differences come in a variety of aspects related to infrastructure, browser support, and the need to have elements of software installed on your desktop and mobile devices. I touched on some of them in my earlier post – PLM cloud options and 2014 SaaS survey.

As part of the overall education about cloud technology, it is not unusual to get a question about the definition of a cloud solution in general and, more specifically, of cloud PLM. I've been reading a CIMdata commentary – Next Generation Cloud-Based PLM Solutions. I found the following passage to be a good summary of the cloud PLM definition:

1- On-demand solutions with new cost models that have lower upfront costs for software licenses, subscriptions, or rights-to-use, allowing smaller companies to afford PLM

2- Hosted computing services and environments that do not require investments in infrastructure, providing access to information for anyone at any time while minimizing administrative overhead

3- The ability to add and increase scope of capability and the performance of the solution and processes without requiring additional investment in the underlying IT infrastructure

4- Global access to required application functions, information, and processes

CIMdata's cloud PLM definition blends technical aspects with the business and licensing characteristics of PLM solutions. In my view, it is a clear indication that the cloud PLM story is not purely about technology. Customers demand a solution that solves a multidisciplinary problem – technological, business, and licensing. It also shows that customers are dissatisfied with today's business practice of PLM software licensing.

What is my conclusion? Technology and business go together. Cloud PLM aims to solve customer problems in different aspects – improved business models, lower cost, and better experience. However, in my view, an interesting part of cloud PLM innovation relates to PLM system implementation. For many years, implementation has been one of the most complicated elements of PLM. It takes time to adjust the system, capture business processes, and set up tools to run and optimize product development. The first PLM vendor to crack how to leapfrog PLM implementation using a cloud business model and technology can gain a significant competitive advantage. Just my thoughts…

Best, Oleg

photo credit: JD Hancock via photopin cc


You need PLM project to fail… to start lifecycle

June 23, 2014


One day you discover that your PLM implementation project is not doing so well. It happens, and it is called failure. Scott Cleveland's blog took me back to the topic of PLM implementation failures. Unfortunately, I didn't find a link to the CIMdata research to read the paper mentioned by Scott. According to the post, wrong scoping and failure to get buy-in from users are at the top of the list. Not a surprise.

Failure is not such a rare situation in the IT world. Google "IT project failure" and you can quickly discover claims that 68% of all IT projects fail. A few months ago, I had a long discussion around my Why PLM stuck in PDM? article on LinkedIn. I cannot publish the comments from a closed discussion group, but the question of how to identify a PLM project failure was one of the dominant topics in the discussion. Guess what… there was no agreement on how to identify a "failed PLM project". A few other notable references on PLM failure: PLM Failure, you didn’t see anything; keynote presentation by Martin Eigner at the ProSTEP iViP Symposium 2009.

Unfortunately, most PLM events and publications paint a shining picture of success in their PLM references. The problem is that all these successes look the same. It is time to remember Leo Tolstoy's line from Anna Karenina – "Happy families are all alike; every unhappy family is unhappy in its own way." One interesting place to learn about failures is FailCon – the conference where startup founders study their own and others' failures. There is no PLM FailCon… yet. And I doubt companies would be ready for one.

Reading and discussing PLM failure articles made me think that you really want your first PLM project to fail. Why so? Here are a few thoughts…

Challenge the status quo. As people often say, PLM already exists in every manufacturing company. You do product lifecycle management through the way you manage product development processes, store data, and communicate within the organization and with outside contractors. On the first attempt, you will try to build a PLM system that mimics all existing practices. I've heard it many times – if you have a mess, don't bring in a computer. Otherwise, you will have a computerized mess. First fix the mess, then bring in computers.

Get rid of outdated stuff. Every manufacturing company trails lots of existing software and practices. It is hard to cut the cord, switch, and leave the outdated stuff behind. A PLM project failure can bring awareness to the problem and force the company to make a change. It is hard for companies and people to admit they are doing something wrong – especially if they have been doing it for many years.

Learn as you go. You have the best chance to learn when you actually do something. Regardless of your experience, every manufacturing company is different. How do you see whether a new system will fit? Put it into action! When it comes to people, the only way to see if it works is to try it. You fail, and only afterwards do you find the right way to do it.

Think about your PLM system the same way you think about product development processes. Your design doesn't fit the manufacturing plan, some requirements fail to be communicated, and some get misunderstood. Your first manufactured item fails, and you need to fix the issues. These are absolutely normal things for every manufacturing company. Your PLM is not much different.

What is my conclusion? "Failure is not an option" is probably the wrong PLM implementation strategy. On the contrary, you need to roll it out fast, engage with users, fail, fix it, and bring it back fixed. Lifecycle. This is the only way to make it right. Just my thoughts…

Best, Oleg


Traditional PLM has reached its limits

June 11, 2014


Selling PLM to small and medium enterprise (SME) companies is a challenging task. I guess many of my readers will agree with me. I expressed some of my thoughts here – Why PLM stuck to provide solutions for SME? In contrast, large manufacturing companies, especially in the aerospace, automotive, and defense industries, were always a sweet spot for selling PLM solutions. Actually, not any more…

Earlier this week, my attention was caught by a CIMdata article – CIMdata Announces the Formation of an Aerospace & Defense PLM Action Group. Here is how CIMdata President Peter Bilello defines the objective of the group:

“PLM solution providers continually deliver new products, architectures, and solutions to market, while industrial customers must cope with previous product launches, attempting to realize the value from existing PLM investments. The Aerospace & Defense PLM Action Group will define and direct the research efforts on key areas needed to meet future challenges. Current experiences with PLM-related implementations, best practice research, and close examination of emerging technologies will help define what the PLM solution providers should be offering.”

Another article, by ConnectPress, speaks about the top three challenges of PLM in the A&D industry – global collaboration, integration, and obsolescence management.

#CIMdata Aerospace & Defense #PLM Action Group Addresses The Big 3: http://goo.gl/flMUx4 via @ConnectPressLtd

Integration is a topic near and dear to my heart. In my view, the future of manufacturing will depend heavily on solving old integration problems. Multiple enterprise software systems are a reality for all large manufacturing companies. I guess aerospace and defense companies are the most extreme case of multiple systems integrated together. This is a place where existing PLM systems might face a challenge. Here is my favorite passage from the ConnectPress article:

According to Roche, most major aerospace companies make a major investment in PLM, either changing to a new system or upgrading their current system, roughly every five to 10 years. But in more recent iterations of this cycle aerospace companies have been spending more money and seeing smaller returns on their investments. The reason for this appears to be because the traditional PLM components have reached the limits of what they can offer.

The areas mentioned as expected to bring maximum ROI from PLM investment are non-core PLM domains such as requirements management, configuration management, change management, service lifecycle, etc.

It made me think that integrating these solutions with core PLM modules can introduce a significant problem. For most PLM systems, the architecture and technologies of core functions such as CAD data management and BOM management were designed 10-15 years ago. Connecting heavily customized core PLM modules with expanded PLM solutions can bring significant service and implementation expenses.

In my view, the following four major paradigms used by existing core PLM modules will require some sort of architectural upgrade to take them to the next level of integration in large global companies: 1. Global data management; 2. Concurrent design and related content access; 3. Management of multiple Bills of Materials; 4. Cross-system data federation and integration.
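To make the fourth paradigm concrete, here is a toy sketch of cross-system data federation – assembling a single view of one part from separate PLM and ERP records. The stores and field names are invented for the example; real federation also has to deal with identity mapping, latency, and conflicting values:

```python
# Toy illustration of cross-system data federation: the engineering (PLM)
# record and the manufacturing (ERP) record for the same part number live
# in different systems and are merged on demand into one federated view.
plm_store = {
    "P-100": {"description": "Bracket", "revision": "B",
              "cad_file": "bracket.sldprt"},
}
erp_store = {
    "P-100": {"cost": 12.40, "supplier": "Acme", "on_hand": 250},
}

def federated_part_view(part_number: str) -> dict:
    """Merge the PLM and ERP views of one part into a single record."""
    view = {"part_number": part_number}
    view.update(plm_store.get(part_number, {}))
    view.update(erp_store.get(part_number, {}))
    return view
```

The sketch shows why federation is architecturally hard for systems designed 10-15 years ago: neither store owns the whole record, so consistency and access control must be negotiated across system boundaries instead of inside one database.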

What is my conclusion? Redesigning the core PLM functions can be an interesting challenge for major PLM suppliers. In my view, it will require a significant revamp of existing platforms and data management paradigms. The cloud can help solve the global collaboration challenge. Will the cloud help vendors solve the problem of multiple-system integration? It looks to me like a good topic to discuss. Just my thoughts…

Best, Oleg


Why PLM stuck in PDM?

April 5, 2014


I was following the CIMdata PLM market industry forum on Twitter earlier this week. If you are on Twitter, navigate here or search for the #PLM4UM hashtag. The agenda of the PLM forum is here. One session discussed one of my favorite topics – PDM vs. PLM: “PLM: Well Beyond Just PDM” by Peter Bilello. This passage explains what the session is about:

CIMdata’s research reveals that leading industrial companies are looking to expand beyond PDM functionality to truly enable a more complete PLM strategy. This becomes even more important in a circular economy. In this presentation, CIMdata will discuss which areas are most important, and what opportunities they create for PLM solution and service providers.

My attention was caught by the following tweets coming from this session:

According to CIMdata, leading Mfrs are now looking to move beyond PDM. #PLM4um
— ScottClemmons (@ScottClemmons) link to tweet.

Peter B / CIMdata explains that it’s hard to find a ‘real’ end-to-end #PLM implementation hat works #plm4um
— Marc Lind (@MarcL_) link to tweet.

It made me wonder why, after so many years of PLM implementations, most vendors are still mostly solving PDM problems for customers, and why it is so hard to move on to broad downstream and upstream adoption of PLM beyond CAD data management functions. Here are four points explaining in a nutshell why I think "PLM is stuck in PDM".

1- Focus on design and CAD.

Most PLM vendors historically came from a CAD-related domain. For them, the PLM business was therefore an expansion of the CAD, design, and engineering business. As a result, use cases, business needs, and customer focus were heavily influenced by the design domain. The result – a PDM focus was the clear priority.

2- PLM is a glorified data management toolkit

The initial focus of many PLM systems was to provide a flexible data management system with an advanced set of integration and workflow capabilities. There are many reasons for that – functionality, competition, enterprise organizational politics. Flexibility was considered one of the competitive advantages PLM could offer to satisfy the diversity of customer requirements. It resulted in complicated deployments, expensive services, and a high rate of implementation failures.

3- Poor integration with ERP and other enterprise systems

PLM sits on the bridge between engineering and manufacturing. Therefore, to be successful, integration with ERP systems is mandatory. However, PLM-ERP integration is never easy (even these days), which puts up a barrier to deploying a PLM system beyond the engineering department.

4- CAD oriented business model

Because of their CAD and design roots, PLM sales were always heavily influenced by CAD sales. Most PLM systems initially came to market as extensions of CAD/PDM packages. With an unclear business model and complicated support from VARs and service companies, mainstream PLM deployment always focused on not slowing down CAD sales.

What is my conclusion? Heavy CAD roots and a traditional orientation toward engineering requirements keep existing PLM systems from expanding beyond PDM in midsize manufacturing companies. The success rate of large enterprise PLM is higher, but it comes at a high price, including heavy customization and service offerings. Just my thoughts…

Best, Oleg


PLM Is Challenged With Compliance Data

October 24, 2013


Manufacturing is going global. This is not about the future – it is the reality of all manufacturing companies today. In such a reality, supply chain solutions are getting more and more traction and interest from IT managers and other people in companies. To use the language of one of my sales exec friends, supply chain solutions are turning from a vitamin into a painkiller. Which means a sales opportunity.

At the same time, the supply chain is not a new topic for manufacturing companies. Very often, PLM companies focus on the supply chain with “CAD in mind”, which makes them look mostly at design supply solutions. Many supply chain management opportunities go directly to ERP vendors and other vendors specializing in specific vertical supply chain management solutions. In my view, data is becoming one of the most critical topics in every supply chain solution. In most situations, the data needed for supply chain solutions is “stuck in the middle” between ERP, PLM, and SCM. It combines design data, manufacturing data, and supplier data. It is not easy. It requires openness and transparency between multiple systems. Let's tell the truth – it doesn't work well in existing enterprise systems. These systems were never designed for openness and transparency.

The topic of supply chain compliance was discussed at the recent PI Congress in Chicago two weeks ago. I found a good summary of the PI Congress provided by CIMdata here. Here is an interesting passage from this write-up related to supply chain compliance:

The first [panel] focused on supply chain transparency and traceability. This issue occurs at the intersection of PLM and ERP, and is critically important to firms that must increasingly compete on a global basis. The panel agreed there was a need for a common dataset on compliance issues, a common problem when selling in many countries. They recognized that PLM solution providers are challenged to provide this information in a timely fashion, and challenged the audience to help find or create new alternatives.
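A “common dataset” for compliance could be as simple as a shared, machine-readable declaration per part and regulation. The sketch below is my own illustration of the idea, not a published schema – the record fields and the sellability rule are assumptions:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ComplianceRecord:
    """Hypothetical shared compliance declaration for one part/regulation."""
    part_number: str
    regulation: str        # e.g. "RoHS", "REACH"
    compliant: bool
    declared_by: str       # supplier making the declaration
    valid_in: tuple        # country codes where the declaration applies

def is_sellable(records, part_number: str, country: str) -> bool:
    """A part is sellable in a country only if at least one declaration
    covers that country and every covering declaration reports compliance."""
    applicable = [r for r in records
                  if r.part_number == part_number and country in r.valid_in]
    return bool(applicable) and all(r.compliant for r in applicable)
```

If such records were published as open data in one agreed format, the "common dataset on compliance issues" the panel asked for would stop being a per-vendor integration problem and become a simple lookup.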

A common dataset is an interesting topic. It made me think about the trend towards open data, which I see as part of the broader discussion about the BigData opportunity these days. In that context, I want to mention the open data initiatives led by organizations such as the Open Data Institute (ODI) and the Open Knowledge Foundation (OKF). The first one, the ODI, was co-founded by Tim Berners-Lee. The topic of open data is complex, and I hope to speak about it in future blog posts. You can find some information about open data on the ODI website here. Another thing derived from open data is the Open Definition initiative. These are all new initiatives that are mostly unknown in the enterprise software domain.

What is my conclusion? I think we are at the beginning of a data revolution. To provide better solutions these days, we need to have data available. That includes openness and data access. It also relates to company data stored on company servers and, even more important, open data outside organizations that must be available to everyone. In my view, a common compliance dataset is a perfect example of open data that must be available to enable future development of PLM supply chain solutions. Just my thoughts…

Best, Oleg

image courtesy of Open Knowledge Foundation.

