PLM 2013: What is your 7-year plan?

December 31, 2012

I’ve been out of active blogging for the last week because of Autodesk’s week of rest. It is a perfect time to disconnect from day-to-day activities, stop, and think about what happens in PLM from different perspectives – customers, vendors, technology. Approaching the end of the year, we can see a huge number of blog posts with titles like "the most important things you need to pay attention to in 2013". I’m a bit tired of these "predictions". I decided to jump ahead and think about the longer-term perspective of PLM beyond 2013. One of the companies that has always impressed me with its long-term strategic thinking is Amazon. You can read a bit about Amazon’s long-term strategy in last year’s NYT article here. My favorite passage is this one –

“If everything you do needs to work on a three-year time horizon, then you’re competing against a lot of people,” Mr. Bezos told reporter Steve Levy last month in an interview in Wired. “But if you’re willing to invest on a seven-year time horizon, you’re now competing against a fraction of those people, because very few companies are willing to do that. Just by lengthening the time horizon, you can engage in endeavors that you could never otherwise pursue. At Amazon we like things to work in five to seven years. We’re willing to plant seeds, let them grow—and we’re very stubborn.”

It made me think about what I can see on the horizon of PLM products and technologies for the next seven years. Making long-term predictions is a risky thing. Nevertheless, I will take a chance and put some of my questions, ideas, thoughts and conclusions below.

What do we do after "cloud" goes mainstream?

We are in the middle of the biggest technological transformation since the PC era. Think about consumer technologies and products. Cloud technologies went mainstream in many areas – email, photo sharing, social networks, video streaming. The PLM cloud revolution is moving forward these days as well. Vendors are applying different strategies. The two main disputed topics – security and availability – will be resolved sooner or later. Today, you still have time to prepare for a new level of access, openness, mobility and ease of PLM cloud adoption.

PLM business models

Customers demand predictable business models that will allow PLM to grow in their organizations. Today, PLM business models cannot scale to the level of the enterprise and beyond. You can hear lots of discussion about that. Providing simple and affordable pricing models is the next challenge in front of CAD/PLM vendors. It won’t happen fast. Changing a revenue model is one of the most challenging tasks for any organization. At the same time, I don’t see any other way to make PLM available to everyone.

The value of small communities

The idea of communities is going mainstream in social networking and media. PLM companies and a few startups have tried to apply it to engineers, manufacturing companies and product development. There have been successes and failures. There are a few Facebook copycats developed by startups and established PLM/ERP vendors. I believe it was the right step for 2012. However, thinking about the longer term, the value of these communities is limited. PLM companies will have to explore the value of small communities focused on a specific vertical, industry, segment, etc. These small communities will have a lot of impact and potential in the future.

How to build PLM open data?

The era of data exchange is about to end. No industry can survive under such a huge amount of data translators and data modeling best practices. PLM vendors need to discover the growing world of shared vocabularies – Dublin Core, Freebase, GoodRelations, RDFS, SKOS, SIOC and others. Leveraging open vocabularies will be extremely important in order to build connected PLM services.
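To make the idea concrete, here is a minimal sketch of what shared vocabularies buy you. It uses a toy in-memory set of triples in Python rather than a real RDF library, and the part numbers, manufacturer ID and data values are hypothetical; only the SKOS and GoodRelations namespace URIs are real. Any consumer that knows the shared vocabulary can query the data without knowing the publisher’s internal schema:

```python
# A tiny in-memory triple store. A real implementation would use an RDF
# library and SPARQL; the part data below is made up for illustration.
SKOS = "http://www.w3.org/2004/02/skos/core#"
GR = "http://purl.org/goodrelations/v1#"

triples = {
    ("part:9921", SKOS + "prefLabel", "M8 hex bolt"),
    ("part:9921", GR + "hasManufacturer", "mfg:acme"),
    ("part:7710", SKOS + "prefLabel", "M8 washer"),
    ("part:7710", GR + "hasManufacturer", "mfg:acme"),
}

def query(subject=None, predicate=None, obj=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# "What are the labels of parts made by mfg:acme?" - answered purely
# through shared vocabulary terms, not a proprietary API.
acme_parts = [s for s, _, _ in query(predicate=GR + "hasManufacturer",
                                     obj="mfg:acme")]
labels = sorted(o for p in acme_parts
                for _, _, o in query(subject=p, predicate=SKOS + "prefLabel"))
print(labels)  # ['M8 hex bolt', 'M8 washer']
```

The point is not the toy store itself but the pattern: as long as two PLM services agree on the vocabulary terms, neither needs to know how the other stores its data.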

What is my conclusion? The world of manufacturing, engineering and product development is changing. It is not moving as fast as the consumer world and the web. In my view, many manufacturers are now holding back their plans for future PLM initiatives. My favorite quote is from Nathan Myhrvold (ex-CTO of Microsoft) – If you want to have a great future you have to think about it in the present, because when the future’s here you won’t have the time. Manufacturing and product development is at the very beginning of the adoption curve for technologies and methods developed by the web during the last two decades. It will take time to transform and adapt them to fit the needs of enterprise companies, but once it arrives, the "new PLM" will stay with us for a long time.

Best, Oleg

Image courtesy of [jscreationsz]

PLM Scale and Some Internet Factoids

December 22, 2012

The scalability of enterprise systems is an interesting topic. Enterprise IT usually keeps the story about system scalability close to their chest. It involves data centers, databases, channels, networks, latency, and many other aspects that allow you to tune your enterprise PLM. And as far as I know, this was absolutely true for existing enterprise PDM and PLM.

The situation is different nowadays. The last 10 years of web and internet development have established a new level of scale. The amount of data and user activity that the web and social networks can handle goes far beyond typical enterprise deployments. The following ArnoldIT factoid article captured my attention earlier this week. I don’t know if these numbers are accurate. But knowing that Gangnam Style just became the first video to hit 1B YouTube views, I can easily believe them.

Every minute more than 1,649,305 tweets get shared.
Every minute more than 3,472,225 photos get added to Facebook.
Every minute more than 2,060 brand new blogs are created.
Every minute more than 52,488 minutes of video are added to YouTube.
Every minute more than 31,510 new articles are created by an online newspaper.
Every minute more than 3,645,833,340 new spam emails are delivered online.

What is my conclusion? The consumer web and social media introduced a completely different perspective on scale, capacity and system performance. Enterprise PLM vendors and IT service companies need to start paying attention. The technological gap consumer systems are opening these days means they can easily outperform existing enterprise PDM and PLM deployments. This is important. Just my thoughts…

Best, Oleg

Image courtesy of [ddpavumba]

Who will clone existing PLMs to the cloud?

December 20, 2012

The discussion about PLM and cloud is moving to the level where "details become important". In my view, many customers today are moving from "why cloud?" to "how can we leverage cloud?" types of questions. Cloud has many faces. In my previous blog post – PLM and the diversity of cloud options – I discussed how multiple cloud deployment options can be used for PLM: IaaS, PaaS, SaaS. Different PLM vendors are choosing different strategies. Here are some examples – Teamcenter is choosing IaaS, Aras has opted for PaaS and Autodesk PLM 360 is SaaS.

The PLM industry is coming to the cloud with the heavy baggage of technologies and products developed over the last 15-20 years. Existing PLM products and the amount of customer investment in PLM programs raise valid questions about how to leverage these assets in the cloud. I’ve been reading a TechCrunch article about the CloudVelocity startup – CloudVelocity Launches With $5M From Mayfield To Bring The Hybrid Cloud To The Enterprise. Here is the interesting passage:

Users can discover, blueprint, clone, and migrate applications between data centers and public clouds. Currently, CloudVelocity supports full server, networking, security and storage integration with AWS but plans to integrate other public clouds, such as RackSpace in 2013. The beta trial of the Developer Edition cloud cloning software allows users to clone multi-tier app clusters and services, without modification into the Amazon Web Services (AWS) EC2 cloud. The Enterprise Edition enables users to clone, migrate and failover multi-tier apps and services into the AWS EC2 cloud.

The article made me think more about the hybrid cloud and the opportunity to expand existing PLM implementations to the cloud. Imagine you could clone your existing PLM implementation and move it to the AWS or RackSpace cloud. It would allow you to build a secured environment in which to expand your PLM deployment with additional services. Here is a possible example. Many companies have a BOM management implementation done as part of basic PDM/PLM programs. Future expansion of these services to NPI or service management requires additional resources and global availability. By "cloning" an existing BOM management implementation to the cloud, you can later expand to additional services.

However, I can see potential problems too. Many PDM/PLM environments have tight connections with desktop CAD applications. How do you clone these environments to the cloud? This is a good question to ask.

What is my conclusion? Customer interest in leveraging the cloud is growing. Still, many customers see the cloud as a potential way to implement something they cannot do today with traditional PLM programs. Sometimes it is infrastructure limitations such as global deployment, and sometimes it is the cost implied by a growing PLM deployment. I can see a growing opportunity to provide a technology that enables companies to "clone" an existing PLM program to the cloud with room for future growth. It can be an interesting option. So, dear PLM developers, somebody will clone you tomorrow in the cloud. What do you think about that? Just my thoughts…

Best, Oleg

Image courtesy of [Victor Habbick]

Why Companies are Not Ready for Single BOM?

December 19, 2012

The Bill of Materials is one of the fundamental things in engineering, manufacturing and product development. Whatever topic you start discussing, you end up discussing BOM. Wikipedia actually provides a decent definition of Bill of Materials. Here is the link and quote:

A BOM can define products as they are designed (engineering bill of materials), as they are ordered (sales bill of materials), as they are built (manufacturing bill of materials), or as they are maintained (service bill of materials). The different types of BOMs depend on the business need and use for which they are intended. In process industries, the BOM is also known as the formula, recipe, or ingredients list. In electronics, the BOM represents the list of components used on the printed wiring board or printed circuit board. Once the design of the circuit is completed, the BOM list is passed on to the PCB layout engineer as well as component engineer who will procure the components required for the design.

It sounds so simple and straightforward. If you just finished your “BOM 101”, you might think the topic is really easy to get. The complexity of Bill of Materials management comes as a result of the process that happens around the BOM during design, engineering, manufacturing and support. What is defined as “different types” of BOM in reality represents people, teams, departments and sometimes different tools and enterprise solutions.

Some time ago, I posted about how companies can have a single BOM – Seven rules towards single bill of materials – which raised many questions and comments. One of the ideas behind having a single Bill of Materials is to streamline processes across disparate teams and departments. A few weeks ago, I came across a white paper published by Arena – Beyond BOM 101: Next Generation Bill of Materials Management. Navigate to the following link to read the document. This white paper provides a very interesting picture, which demonstrates the reality of BOM management in any manufacturing company.


This white paper highlights a very important fact – during the design, engineering and manufacturing process, engineers need to update the BOM in many systems. Here is my favorite passage explaining the complexity of BOM management.

A modern BOM often includes a complex set of hundreds to thousands of structured items… Even after the first product is built, the BOM will continue to evolve—whether due to potential bug fixes, design improvements, part substitutes, or supplier switches—until the product reaches its end of life. The time spent to manually make changes and fix mistakes throughout the lifecycle of a product may amount to a substantial delay in its shipment. With multiple teams inputting frequent changes, manual revision control processes can easily become overwhelming and chaotic. It is difficult to track which changes have been made to which revisions. There is a lack of “a single version of truth” —the latest product information including BOM—that all project teams can consistently and confidently rely on throughout the lifecycle of a product.

The main challenge in this process is maintaining multiple BOMs in different systems. So, the idea of a single Bill of Materials can easily be materialized to solve the complexity of synchronization. So, why do companies often fail to establish this single BOM? I can identify 3 main reasons:

1- Companies use a variety of tools to design, build and support products. Single-platform PLM is probably a dream that is not going to materialize. In most companies, multiple design tools (including CAD), product data management and ERP systems create a complicated ecosystem with many rules and dependencies.

2- Because of specialization, people are not interested in switching from specialized and tailored tools to something less capable but common. The change is complex, and can lead to potential delays and the involvement of IT in system deployment and data integration. People prefer to pass the BOM between systems rather than use a single tool.

3- It is hard to agree on how to share a single structured set of information (a single BOM) among multiple teams, departments and organizations. Developing export/import functionality as well as multiple synchronization services is, unfortunately, the mainstream decision taken by many companies.
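One way to picture the alternative to reason #3 is a single structured item list that serves as the source of truth, with each discipline deriving a filtered view of it instead of maintaining its own copy. The Python sketch below is a simplification under my own assumptions (the items, view names and structure are invented for illustration, not taken from any PLM product):

```python
from dataclasses import dataclass, field

@dataclass
class BomLine:
    part: str
    qty: int
    views: set = field(default_factory=set)  # which BOM views include this line

# One shared, structured BOM - the single source of truth.
single_bom = [
    BomLine("PCB assembly", 1, {"engineering", "manufacturing"}),
    BomLine("Enclosure", 1, {"engineering", "manufacturing", "service"}),
    BomLine("Solder paste", 1, {"manufacturing"}),   # process material, not designed
    BomLine("Replacement fan", 1, {"service"}),      # spare part, not manufactured
]

def bom_view(view: str) -> list[str]:
    """Derive a discipline-specific BOM from the single shared structure."""
    return [line.part for line in single_bom if view in line.views]

print(bom_view("engineering"))    # ['PCB assembly', 'Enclosure']
print(bom_view("manufacturing"))  # ['PCB assembly', 'Enclosure', 'Solder paste']
print(bom_view("service"))        # ['Enclosure', 'Replacement fan']
```

A change made to one line is immediately visible in every view that includes it – which is exactly what the export/import and synchronization approach struggles to guarantee.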

What is my conclusion? I think companies need to have a single, sharable, structured BOM representation reflecting all aspects of product development. PLM vendors applaud the idea of a single database, but most of the integration and data synchronization tools and techniques are still very immature. In addition, PLM vendors usually try to lock customers into a single-platform solution, preventing independent and open bill of materials storage. Altogether, this blocks customers from migrating their infrastructure and systems towards a “single BOM” implementation. Just my thoughts…

Best, Oleg

3 things PLM can learn from UCO (Used Car Ontology)

December 18, 2012

Our digital and real lives are getting more and more connected. Think about our web experience these days. Mobile devices, map services, location-based queries, social networks. The list of examples goes on. One of the biggest challenges we have, as a result of the web’s exposure to our real life, is the need to integrate and interconnect many sources of information coming from different places. Think about the intertwining of your location information, photo posts and reviews. You have the Facebook Nearby service (an enhancement of this service was just announced yesterday). You can get an interesting perspective on the service here.

Moving to the main topic I wanted to discuss today, navigate your browser to the following SemanticWeb article – Introducing the used car ontology (UCO). The article speaks about the publication of an ontology (a formal description of concepts) that supports a precise description of used cars. The article references MakoLab, the company that did this work, and provides a link to the ontology itself. I found the following passage interesting.

“Publishing information about used cars… containing a description which refers to the Used Cars Ontology will allow for easier searching of cars for purchase, along with a more in-depth description of their state of exploitation. The UCO supplements the more general ontologies GoodRelations and Vehicle Sales Ontology created by Professor Martin Hepp. The GoodRelations ontology is now integrated with the famous dictionary created by Google, Yahoo and Bing, for the purpose of improvement of information searching on the Internet and better positioning of websites.”

Dig a bit into the UCO ontology document and you will find some examples of queries and operations built with the UCO ontology. Spend some time with the document and you will learn how to get a report of used cars, get a car description, update information about a car and more. The information about used cars can be located on multiple websites. It made me think about the possibility of improving interaction with multiple islands of information. Here are the top 3 things I came away with after analyzing the UCO document:

1. Ontology can be used to produce meaningful queries. Web technologies provide a reliable instrument to work with data located on multiple websites. The semantic web provides a set of technologies helping you to describe, query and manipulate the data.

2. Nobody cares how the data is stored. It is almost meaningless how the data is stored. A website with information about used cars can use any technology (from text files to Excel spreadsheets and databases) to store data. That information is not needed to process data at web scale.

3. Publishing semantic information can improve cross-system data access. When your website and/or service publishes information in a semantically accessible way, the information can be intertwined and used by other services for different purposes.
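The three points above can be sketched together in a few lines of Python. Note that the records, property names (`uco:model` and so on) and numbers below are illustrative stand-ins, not the actual UCO identifiers or real listings – the point is only that shared terms let a third service merge data it did not store:

```python
# Two hypothetical websites publish used-car records using the same
# shared vocabulary terms. How each site stores the data internally
# (text files, Excel, a database) is invisible and irrelevant here.
site_a = [
    {"uco:model": "Golf", "uco:mileage": 54000, "uco:price": 7900},
    {"uco:model": "Passat", "uco:mileage": 121000, "uco:price": 5400},
]
site_b = [
    {"uco:model": "Corolla", "uco:mileage": 38000, "uco:price": 8600},
]

# Because both sites share the vocabulary, a third service can merge
# their records and run a meaningful query across both of them.
merged = site_a + site_b
low_mileage = sorted(
    car["uco:model"] for car in merged if car["uco:mileage"] < 60000
)
print(low_mileage)  # ['Corolla', 'Golf']
```

Replace "used car" with "part", "supplier" or "assembly" and the same pattern describes what connected PLM services could look like.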

What is my conclusion? The web is a good example of a system that grew beyond the level of a single database. Web data processing mechanisms are interesting from the standpoint of sustainability and data scalability. The used car ontology provides a good example of organizing interoperability beyond the level of a single website. My hunch is that some of these technologies will change the way PLM systems operate today. Just my thoughts… What is your take?

Best, Oleg

How to prevent PLM cloud buying experience complexity?

December 17, 2012

Experience is one of the most popular words in tech these days. CAD and PLM vendors are part of this journey as well. I would like to talk about the "buying experience". For many customers, the enterprise software experience actually starts from the moment a company needs to decide what product (software) configuration to buy. PLM and other enterprise software price lists are well known for their complexity. Many products, options, configurations. In many cases, the final decision about the desired configuration of a software package can only be made after an evaluation and pilot implementation. Very often, a consulting and/or service company needs to be hired to help with the software configuration choice.

In contrast to on-premise enterprise software, cloud or SaaS software comes with a model that is supposed to simplify the decision process, especially when it comes to the buying decision. For many SaaS companies, the simplification of the buying process is one of the most critical elements of success. The configuration of a SaaS product is combined from just a few options. I’m sure you can come up with more examples like that.

Over the weekend, my attention was caught by the following article in ReadWriteWeb Enterprise – Cloud Complexity Clouts Enterprise Customers. The author speaks about the complexity of the Amazon Web Services cloud – experience gathered from the recent re:Invent conference. One of the conference sessions was completely dedicated to… billing. I found the following passage quite interesting:

The billing session came as a bit of a shock. One Apache OpenCloud committer who did not wish to be identified summed it up best: "If you have to have a session on billing, you’re doing it wrong."… It’s a valid argument, because while one should rightfully expect all levels of interest to be addressed at a trade show’s first run, it seems that something like billing for cloud services should be pretty simple and not in need of tutoring. This kind of complexity raises serious questions about the real value of using a public vs. private cloud in the enterprise.

I see the potential for the PLM cloud to get infected with the "complexity disease". It can easily be inherited from on-premise PLM portfolios. It can also come from examples of cloud complexity like the one above. This is a very dangerous trajectory. In my view, cloud software is all about an easy experience – buying, starting to use, expanding. Vendors need to focus on getting customers involved and making decisions fast. Renewal of the service is another key element of success. A complex buying experience and a complex paying experience (including billing) won’t help cloud solutions expand their positions in organizations.

What is my conclusion? Simplification is one of the strongest trends today. It goes everywhere. The buying experience is one of the most critical elements, since it is one of the first interactions a customer has with a vendor when starting to adopt a service. It is also critical from the standpoint of understanding software ROI. If you cannot understand your software bill, you will most probably have a problem calculating ROI too. Just my thoughts…

Best, Oleg

Image courtesy of [pakorn]

How can BoxCryptor solve CAD cloud concerns?

December 16, 2012

The simplicity of Dropbox and similar cloud-based file storage makes it very attractive to many people. Engineers are no exception. The discussion about "Dropbox and PLM" has been going on for a long time already. I posted about some interesting Dropbox usage patterns a few weeks ago here. As you can read in that article, engineers are placing files in Dropbox. The problem of security is clearly identified these days. So, seeing a startup trying to solve this problem should not surprise you much. That’s what I felt when my attention was caught by BoxCryptor for Dropbox. It is still an alpha version, but promising… Here is how BoxCryptor CEO Andrea Wittek explains what they do:

According to BoxCryptor CEO Andrea Wittek, the benefit of that would become apparent if you happen to want to download and decrypt something using someone else’s machine. I can also see the feature coming in handy for Chrome OS business users, down the line at least. “We call it an alpha version,” Wittek told me. “We’ve been testing it for a while. We definitely recommend people try it, though we wouldn’t recommend it for very sensitive files. It can crash – the worst thing that can happen is you think it’s encrypted a file and it hasn’t.”

Watch the following video demo of how BoxCryptor works.
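The principle behind a tool like this is worth spelling out: encrypt on the client, so the sync service only ever sees ciphertext, and the key never leaves the engineer’s machine. The sketch below illustrates that flow in Python. To keep it self-contained it uses a toy SHA-256-based XOR stream cipher as a stand-in for the real encryption such tools use (BoxCryptor’s actual implementation is not shown here, and this toy cipher should never be used for real security); the file content is made up:

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream from the key with SHA-256 in
    # counter mode - a toy stand-in for a real cipher such as AES.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream; applying it twice restores the original.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

key = secrets.token_bytes(32)             # stays on the engineer's machine
cad_file = b"<hypothetical CAD geometry>" # what the engineer actually edits
ciphertext = xor_cipher(key, cad_file)    # this is all Dropbox ever syncs

assert ciphertext != cad_file
assert xor_cipher(key, ciphertext) == cad_file
```

Dropbox keeps doing what it is good at (sync, sharing, versioning) while the sensitive CAD content remains unreadable to anyone without the key.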

What is my conclusion? Security is the #1 concern in adopting many cloud-based solutions. At the same time, you cannot stop new technologies. Access is one of the big advantages of the cloud. I expect more companies will try to crack the cloud security problem and find innovative ways to share files in a secure manner. The prize is huge – seamless access to files plus the simplicity of Dropbox. For engineers, BoxCryptor can be a very attractive solution for sharing files with suppliers and remotely located design partners. This is only a partial list. I’m sure you will come up with more use cases. I’d be glad to read more about your experience with Dropbox file sharing.

Best, Oleg

Image courtesy of [Renjith Krishnan]


