Open Cloud, PLM Backbone and Vendor Lock-In

November 30, 2010

Vendor lock-in is painful. I think customers in the CAD/PLM space are even more sensitive to this issue. Proprietary CAD formats have been used by vendors for many years, which allowed them to charge premium fees. Recently, we learned that data backbone lock-in can be even more dangerous. Customers spend millions on their data management infrastructures and proprietary platforms. It helps vendors later to navigate customers like Daimler to the "right" decisions.

I just learned about the IBM VISION Cloud initiative. You can read more about VISION Cloud in the IBM press release. Navigate your browser to the following link and read the interview with Dr. Yaron Wolfsthal, IBM's senior manager responsible for VISION Cloud.

The EU-funded VISION Cloud initiative, led by IBM, has been launched and is focused on creating a metadata format that will enable users’ data to be interoperable among Cloud service providers. This is potentially a huge development for all business, but especially small businesses, which run the risk of vendor lock-in and general unhappiness when they find that it’s not cost-effective to switch Cloud vendors should they encounter problems.

As far as I understand, IBM is investigating how to develop cloud storage that can be used for future internet data services. Here is a very interesting quote from Dr. Wolfsthal's interview:

In principle we are targeting an open specification, open APIs etc. The participation of the SNIA.Europe (this is the European chapter of the Storage Networking Industry Association) will help us promote the open Specifications and standards developed/extended by the project beyond the boundaries of the project.

In 10 days, I will be attending the first COFES-Israel forum. COFES is a unique event where Engineering, Manufacturing and Construction software industry executives can share their views and discuss innovative ideas. Part of the extended forum agenda is visiting Israeli companies as well as local branches and development offices of international companies located in Israel. We will be visiting the IBM Haifa Lab, and I hope to learn more.

Open Cloud
We need to come to the idea of an Open Cloud. A focus on open standards that facilitate data exchange and data openness can be an important factor in a customer's decision to move to cloud solutions. The potential winners will be companies that invest in open platforms rather than lock their customers into proprietary PLM backbones. Will the PLM-on-the-cloud initiative become the next mousetrap for customers, similar to what CAD formats were for the last 20 years? Time will show.

What is my conclusion? In my view, the question of openness on the cloud is an unread chapter. Cloud lock-in can be even more dangerous than file format or data backbone lock-in. This is important.

Best, Oleg

PLM, SharePoint and Migration Madness

November 29, 2010

I just learned that SharePoint migration projects require users' involvement to be really successful. Navigate your browser to the following link and read – "When and How to Include End Users in SharePoint Migration Planning". This story reminded me of all the stories I've heard about PLM migrations. The following passage is interesting:

The problem with this view is that your end users know their requirements, essential business processes, and data better than you do. Input from the staff and managers who are responsible for the artifacts managed within SharePoint is a critical factor for a successful migration.

The story of migration between different PLM systems is complicated. Last week we've seen lots of buzz and publications about Daimler's decision to switch the PLM system used worldwide. I believe the problem of migration is valid not only for big OEMs, but also for smaller companies. What caught my attention is the fact that SharePoint was positioned by Microsoft as the universal hammer to solve all possible and impossible data management and collaboration problems in manufacturing organizations of all sizes. Nope. From what I learned, SharePoint shares the same enterprise software and PLM problems related to software upgrades and migrations. I'd be interested to learn how customers handle migrations between different SharePoint versions, which include dependent PLM solutions.

What is my conclusion? SharePoint is Microsoft's heavy weapon to solve enterprise problems. However, I see more and more examples of SharePoint having the same weak points as PLM and other enterprise software – dependencies on services, complicated customization, and the need to maintain complicated migrations. Now is the time to check your PLM/SharePoint options…

Best, Oleg


PLM Wave In a Box?

November 29, 2010

Google Wave is dead. Long live Wave in a Box (WIAB). Navigate your browser to the following link and you will learn that despite the terrible things that happened to Google Wave and the Google team in charge of this product, it is still alive and even has some interesting turns ahead.

Google Wave and Apache Incubator

According to publications, a new Apache Incubator proposal would create an open source version of the Google Wave technology. Initially, the community expects to rely on the support of vendors like Novell, SAP and some others. If you read the following article in PCWorld, you can learn that Wave has many supporters, such as the US Navy and others. Take a look at the Wiki page of the Apache Incubator proposal.

"Apache Wave is the project where wave technology is developed at Apache. Wave in a Box (WIAB) is the name of the main product at the moment, which is a server that hosts and federates waves, supports extensive APIs, and provides a rich web client. This project also includes an implementation of the Wave Federation protocol, to enable federated collaboration systems (such as multiple interoperable Wave In a Box instances)."

Catch PLM Waves?

I can see a good chance for PLM companies to rely on WIAB to improve their collaborative capabilities. The technology is there and hopefully will be developed by a community. This can be an interesting turn, especially for companies with an Open Source mindset. Almost a year and a half ago, I wrote about 6 reasons why Google Wave can change PLM collaboration. I think the need is still here. If I'm thinking about companies like Vuuch, innovating around new forms of collaboration, or Aras with its Enterprise Open Source model, I can see some benefits for them in leveraging the Wave technology. It could be interesting to compare the possibility of Vuuch/Wave vs. PTC/SharePoint from the standpoint of functionality and cost.

What is my conclusion? Open Source projects are trending nowadays. Here and there, I can see examples of Open Source innovation. In my view, Google Wave can be a good platform to boost collaboration. Time will show whether companies in the PLM domain recognize the opportunity behind WIAB. Just my thoughts…

Best, Oleg

Daimler PLM Dilemma – PDM First

November 27, 2010

This week Siemens PLM announced – "Daimler AG has selected CAD Software from Siemens as their standard for their worldwide vehicle development". An event of such size is notable in the PLM space and generated a significant amount of buzz and publications. Despite the fact that most of the publications talked about the basis for Daimler AG's decision, my favorite quote was from a Graphic Speak article:

All the MCAD/PLM vendors want the major automotive manufacturers as their customers, for both the bragging rights and the additional sales to the supply chain. But automotive is not the big story in the next few years for PLM. Recently PTC disclosed to business analysts information on their current competitive campaigns. They listed the number of targeted customers by existing PLM platforms. “In-house or home-grown” was in second place, with Siemens PLM the only vendor with more installations in the PTC cross-hairs. Real market growth is not coming from a few large vendors who have been using PLM for years, but from the thousands of smaller manufacturers who will leap-frog from a “PLM system” based on AutoCAD, Excel, email, and Windows Explorer to state-of-the-art engineering IT. This larger market is wide-open.

CAD and PDM History

For many years, the CAD system was the leading software component in the overall strategy related to the design and engineering world. The decision about CAD was always the one that set the agenda for working with a vendor. At the same time, PDM was an appendix to the CAD kingdom. PDM integration with the CAD system was considered strategically important. The ability of PDM to connect to and work within the CAD environment was one of the key decision points for many companies. The importance of CAD (design) data was absolutely undoubted.

New Horizons of Product Data

In my view, the last 10 years introduced some changes in the priorities of engineering IT. The importance of "integrated solutions" rose significantly. The driver behind that is manufacturing companies' understanding of how to control cost. The importance of product data management beyond CAD and design became clear to large OEMs and even smaller companies. The amount of product data outside of the design environment outgrows the amount of CAD data. The introduction of lightweight data formats like JT, XVL and others decreased the dependency of people outside the design department on CAD data. Today, the PDM system is a platform used to support an expanded product data scope. Most of these systems are heavily customized. This includes the complication of CAD-PDM integration. However, the importance of global product data management is growing.

PLM Platforms and Cost of Change

CAD/PLM vendors noticed the importance of vertical integration at the beginning of the 2000s. This factor led them to decisions about their future platform strategies. All PLM vendors spent significant resources over the past decade to modernize and re-architect their platforms – Teamcenter Unified, Enovia V6, Windchill. Lots of money was spent to introduce modern backbones, expanded portfolios and integration strategies. However, the reality in the field is a heterogeneous software landscape. This leads to the question of "cost of change" as the most important element of future PLM decisions.

What is my conclusion? Cost is important. In the 2000s, the decision CAD vs. PDM was almost always CAD + any PDM integration. Which means – CAD first. We learned something new this week. PDM and data management are becoming more and more important. The cost of changing a global product data platform, including the potential IT disruption, is much bigger than the cost of switching to another CAD system. So, in the 2010s, the CAD <or> PDM math is different, and the answer is probably PDM change + IT cost. Which means the PDM and product data backbone come first. This is an important difference, which will have implications for engineering and manufacturing software decisions in the current decade. The PDM system position can give some advantages in the PLM giants' war for large automotive and aerospace OEM accounts. Questions about the cost of change and untapped PLM markets are more interesting, in my view.

Best, Oleg

PLM Backbones and Engineering Process Complexity

November 24, 2010

My blogging friend Deelip Menezes wrote a few days ago on Twitter – "Mapping engineering processes to PLM backbones is this most difficult part in implementing PLM in a company". This statement made me think about PLM implementations, flexibility and PLM adoption in organizations. Why does PLM face so many problems during implementation? What is so specific about engineering and product development processes?

PLM and Process Management

PDM is one of the predecessors of modern PLM. In the beginning of PDM, people didn't speak about processes. The initial PDM idea mostly focused on data. However, the overall development of enterprise software – and specifically the consolidation of enterprise applications into ERP suites – got CAD/PDM/PLM companies thinking about future development strategies related to processes. The development of ERP and independent Business Process Management suites pushed PLM toward including process management in the PLM story. Modern PLM backbones provide a wide spectrum of capabilities to model and execute engineering and product development processes.

Engineering Processes Mapping

The idea of mapping between business applications and organizational processes was born in ERP software. The components of ERP suites actually provided a one-to-one mapping of finance, procurement, manufacturing and other organizational processes to ERP. Combined with business data stored in the ERP backend, these applications are widely used to run a company. Now, what happens in product development and engineering? Every PLM implementation requires mapping of product development processes. So, what makes engineering processes different and complicated? I'd like to mention three aspects that make engineering and product development different from other company processes.

Diversity of requirements. Every manufacturing company runs its development shop differently. This is a reality that cannot be eliminated. There are some patterns and rules that make companies in the same industry segment do it similarly, but diversity will exist even in this case. When it comes to implementation, this creates a need for additional investment to adjust a system to the requirements of a specific customer.

Execution flexibility. In my view, engineering processes have a higher level of expected flexibility. Most product development and engineering processes are not linear. It is very hard to create this non-linear mapping with all possible exceptions and potential situations. People demand an additional level of flexibility at process run time. This creates more complexity.

Continuous change. Last, but definitely not least. Manufacturing companies these days are under high pressure to improve. Much of this pressure is related to product development, engineering and manufacturing. Taking that into account, engineering and product development process implementations will require changes more often. This creates a significant impact on process mappings that are under ongoing change.
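To make the execution flexibility point concrete, here is a minimal sketch of a workflow that allows recorded ad-hoc jumps alongside a nominal linear route. The class and state names are hypothetical illustrations, not any vendor's API:

```python
# Minimal sketch of a flexible engineering workflow.
# A strictly linear process forbids any transition outside the nominal route;
# a flexible one lets an authorized user jump back or skip, with a reason
# recorded in the history instead of being rejected as an error.
class Workflow:
    # nominal linear route, e.g. for an engineering change order
    NOMINAL = ["draft", "review", "approve", "release"]

    def __init__(self):
        self.state = "draft"
        self.history = []            # (from_state, to_state, reason) tuples

    def advance(self):
        """Follow the nominal route one step forward."""
        i = self.NOMINAL.index(self.state)
        if i + 1 < len(self.NOMINAL):
            self._move(self.NOMINAL[i + 1], reason="nominal")

    def ad_hoc(self, target, reason):
        """Non-linear jump (rework, exception) - recorded, not forbidden."""
        if target not in self.NOMINAL:
            raise ValueError(f"unknown state: {target}")
        self._move(target, reason=reason)

    def _move(self, target, reason):
        self.history.append((self.state, target, reason))
        self.state = target


wf = Workflow()
wf.advance()                         # draft -> review
wf.ad_hoc("draft", "missing CAD")    # exception: send back for rework
wf.advance()                         # draft -> review again
wf.advance()                         # review -> approve
```

The point of the sketch is that the exceptional transition is data (a history entry with a reason), so the process definition does not have to enumerate every possible deviation up front.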

Sustainable PLM Implementation

One thing that is important to mention in the context of the specific characteristics of manufacturing and engineering processes is how to establish a sustainable PLM implementation. My hunch is that "flexibility" is a key factor. However, I think it would be way too simple to say only that. Increased flexibility in process management creates a situation where the definition of a process changes. I'd like to specify three elements related to the definition of what I call sustainable PLM: mainstream organizational processes, data transparency, and informal communication. The first is the formal organizational processes that run an organization. Informal communication is a level that compensates for the need for execution flexibility. The data transparency mechanism creates a way for people to leverage organizational data assets in order to run processes more efficiently.

What is my conclusion? Three main characteristics make engineering processes problematic when thinking about PLM – diversity of requirements, execution flexibility and continuous change. Traditional process management methods are useless in this type of situation. This is where a new way to organize processes needs to come in. It includes new methods to work with data and social technologies to make communication more efficient.

Just my thoughts…

PLM+ERP: Outside of Equations?

November 23, 2010

I've been watching a recorded webinar with a fascinating name: PLM+ERP = Manufacturing Success. The tweet about this webinar caught my attention. Navigate your browser to the following link to get access to the recording. You need to register. I registered my blog as a media company. In addition, I failed to run the webinar video in both Safari and Chrome on my MacBook. Firefox was okay. Later, I saw that the video streaming uses a Silverlight plug-in, which made me think about Silverlight portability and Microsoft's future strategy with HTML5.

PTC+Microsoft: Webinar

This webinar was organized by PTC and Microsoft with the help of IBIS, a Microsoft partner. You will get an intensive and deep set of information about what PTC, Microsoft Dynamics and IBIS offer. There is a mix of sales, marketing and user content. The most valuable part, in my view, is the last one, where you can see a demo of a scenario involving both systems – Windchill and AX. I put a few slides below that caught my attention.


I have remembered this formula for the last 10 years. The combination of PLM and ERP systems working together has been a permanent challenge for mid-size and bigger companies thinking about how to streamline processes in the organization. If you follow the slides and the webinar video, you will see all the logical steps in organizing processes and data flow between PLM and ERP systems. The complexity of organizing this process is high. It would be interesting to understand the effort needed to organize such an integration. This is a key question, in my view.

Wrong Equation?

The webinar and demo made me think about how to streamline processes in the organization. The traditional and very often used way to solve the PLM+ERP equation is to make data flow between the systems. It requires a significant amount of work to organize these processes: hard-wiring of data, connecting events, sending messages, screening exchange logs. Each system works according to its own rules. Push/pull of data is complicated. The overall system is very sensitive to changes and requires baby-sitting. The system can function, but significant resources need to be spent to keep it up and running.

What is my conclusion? One of the fundamental principles behind the PLM+ERP equation is an event-driven process push. Many manufacturing companies and software vendors got into this over the last 10-15 years. It was the obvious way to solve the PLM+ERP equation. It makes systems dependent and costly to manage. One of the fundamental ideas that may improve it is to get out of this equation. The name of the idea is "Pull". Pull can make systems independent and much easier to manage. Just my thoughts…
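The push vs. pull distinction can be sketched in a few lines of code. The classes below are hypothetical in-memory stand-ins, not Windchill or AX APIs; the point is the direction of the dependency:

```python
# Push vs. pull between a PLM and an ERP system (illustrative sketch).

class PLM:
    def __init__(self):
        self.released = {}               # part number -> released revision

    def release(self, part, rev):
        self.released[part] = rev


class ERP:
    def __init__(self):
        self.items = {}

    def receive(self, part, rev):        # entry point the push style depends on
        self.items[part] = rev

    def pull(self, plm, parts):
        """Pull style: ERP asks for exactly what it needs, when it needs it."""
        for part in parts:
            if part in plm.released:
                self.items[part] = plm.released[part]


class PushPLM(PLM):
    """Event-driven push: PLM is hard-wired to ERP and calls it on every change."""
    def __init__(self, erp):
        super().__init__()
        self.erp = erp                   # tight coupling: a change on the ERP
                                         # side ripples back into PLM

    def release(self, part, rev):
        super().release(part, rev)
        self.erp.receive(part, rev)


# Push: every release immediately hits ERP, whether ERP needs it or not.
push_erp = ERP()
push_plm = PushPLM(push_erp)
push_plm.release("PANEL-7", "B")

# Pull: the systems stay independent; ERP syncs on its own schedule.
plm, erp = PLM(), ERP()
plm.release("BOLT-100", "A")
plm.release("PANEL-7", "B")
erp.pull(plm, ["BOLT-100"])              # ERP takes only what it cares about
```

In the push version, PLM cannot even be constructed without a reference to ERP; in the pull version, neither system knows about the other until the moment of the query, which is why pull keeps them independently manageable.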

Best, Oleg

PLM and Google TV: Not for Average People?

November 22, 2010

I was reading the NYT on my flight today. The following article made me feel bad: Google TV, Usability is Not Included. I didn't buy a Google TV yet. I'm still checking my options. Read the article and make your own conclusion. The idea of turning a TV screen into a full-scope computer screen is fascinating, but I'm thinking about the end user. Can I explain to a non-computer-savvy person how to switch browser screens? Mission impossible.

PLM Complexity Trends

The complexity of Google TV explained by the NYT made me think again about some PLM implementations. How many times have you faced multiple screens, options, connections? I think the PLM implementation problem lies in the fundamental interest in exposing the complexity of product development processes, dependencies and data connections. Even looking at new software in the enterprise space, I can see these complexity symptoms. I figured out 3 main PLM complexity trends.

Modeling complexity
This normally happens when engineers try to apply all possible and impossible combinations of data models to reflect the situation in an organization. However, in many cases, I see it as unnecessary. A lot of situations can be solved by applying much simpler models. When you build your data model, just ask engineers to simplify it. If you do this constantly, you will see that you end up with half the features.
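A hedged sketch of what this simplification can look like in practice, assuming (hypothetically) that a generic item with a classification attribute can replace a deep tree of specialized classes for most cases:

```python
from dataclasses import dataclass, field

# Over-modeled: one class per document kind, each adding almost nothing.
#   class DesignSpec: ...
#   class TestReport: ...
#   class SupplierDatasheet: ...   (and a dozen more)

# Simplified: one generic item plus a classification and free-form attributes.
@dataclass
class Item:
    number: str
    kind: str                                  # "spec", "report", "datasheet", ...
    attrs: dict = field(default_factory=dict)  # only what this kind really needs


spec = Item("DOC-001", "spec", {"discipline": "electrical"})
report = Item("DOC-002", "report", {"test_rig": "TR-4"})
```

The trade-off of this flattening is weaker static typing per kind, but the model stays small, and new document kinds need no schema change.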

Presentation complexity
In my view, PLM software is still keeping the old desktop paradigm. It means putting as much information as possible in front of the customer's eyes. This is a mistake. In order to fix it, send your people to learn from mobile applications. The limited screen real estate of mobile devices drove people to change the paradigm. In addition, ask them to move to an action-based presentation concept: provide only the information needed for the task decision and show a subset of the options.

Process complexity
Last, but not least. There is a need to map processes in the organization. However, when starting to do so, keep in mind that you don't want to replicate all the implementations you had in place before you started to transform your organization with a PLM system. You may discover processes that are simply not needed.

Simplicity Always Wins
If you think about modern trends in hardware, software and almost everything else, you can see a strong trend toward simplification. When I developed my first PDM/PLM products, the question of "documentation" was absolute. The need to have documentation was very critical. What was debatable was how much documentation you need and how fast you can deliver it. Nowadays, everybody understands that in order to stay alive, you need to create products that don't require user manuals.

What is my conclusion? My conclusion is simple. Simplicity wins! Understanding the true meaning of this is not simple. PLM software people need to understand it in order not to become dinosaurs with user manuals. Just my thoughts…

Best, Oleg

