PLM vendors and multiple IoT platforms

September 15, 2015


How are you doing this morning? Are you thinking about joining the IoT world and creating (or buying) an IoT platform? You are not alone, by the way. Just two weeks ago, I mentioned the Autodesk/SeeControl acquisition in my article "How IoT can eclipse and outcompete PLM vendors".

Earlier this morning, big news came from the Dreamforce 2015 conference – Salesforce is jumping on the IoT bandwagon and announced Salesforce Thunder. I picked a few articles from the endless list of publications already available about Salesforce/IoT. Here is my favorite short explanation of Salesforce Thunder:

The platform will collect data from connected sensors, mobile devices, social networks, wearables and cloud technologies and will be powered by a new events processing engine called Thunder. Thunder is the massively scalable events processor while IoT Cloud is the user interface built on top of the information it provides. Salesforce plans to release the product in trial form in early 2016 and make it generally available later in the year.

A ComputerWorld article brings a few interesting examples of companies doing early testing of Salesforce Thunder. Read the following passage about two use cases from Emerson Climate and Hexagon Metrology:

Wanting to go one-up on Nest, Emerson Climate used the platform to connect thermostats for both commercial and domestic settings. Emerson used the sensor data to generate insights into problem identification, preventative maintenance, proactive alerting, and customer life cycle management.

Hexagon Metrology, a global manufacturing company headquartered in London, used the IoT platform to monitor real-time data feeds from laboratory and production line machinery. They used this data to identify catastrophic events and combined the insights with a real-time manager information system to send notifications of these events directly to the person responsible.

Both examples come from the "manufacturing" space. The use cases mentioned by Salesforce are very similar to those I’ve heard earlier in PLM IoT presentations – machine monitoring and predictive maintenance. This made me think that the competition for the customer's IoT mindshare has started.

Does it mean Autodesk, Siemens PLM, PTC and other PLM IoT visionaries will be competing with Salesforce Thunder? This is a good question to ask. The competition will have two trajectories – customer vision and data infrastructure. Another example from the same ComputerWorld article shows a potential competitive scenario based on a Microsoft use case. Microsoft is testing Salesforce Thunder to get information from Office product logs, alongside point-of-sale information and customer support data, in order to provide a complete picture of customer and product. It reminded me of the Dell use case presented at the Siemens PLM Analyst event last week.

What is my conclusion? IoT is getting crowded with large and small companies. In my view, the IoT buzz alone can no longer be used to sell connected PLM solutions. PLM vendors should think about how to develop differentiated solutions and technologies that will set PLM IoT initiatives apart from the broad range of products that will be coming to customers from multiple vendors. This is a note to PLM strategists. It is not too late, but you'd better move fast. Just my thoughts…

Best, Oleg

Image courtesy of Stuart Miles at

Why Big Data opportunity for product information is huge

September 11, 2015


Time. It is all that matters for business. The demand from customers to solve problems here and now is growing. Presenting a 5-year roadmap is important, but solving today’s problem is even more critical. If something kills our business today, the 5-year roadmap becomes irrelevant. Access to information is, in many ways, one of the keys to saving time.

Companies manufacturing products are drowning in an ocean of data. It starts with design, product, engineering and manufacturing data. But it doesn’t stop there. Products generate a lot of information during their lifecycle – new IoT trends will bring even more information to manufacturing companies. With the shift toward selling services (and not products), companies will face even bigger challenges in handling information.

I’m sure you’ve heard about “big data”. In my view, big data is a big challenge for PLM vendors. There are technological reasons for that, but it is also a significant cultural and vision change.

I’ve been following how PLM vendors are making progress with big data technologies and solutions for the last few years. Siemens PLM is one of them. Earlier this year at PLM Connection 2015 in Dallas, I learned about the Siemens PLM acquisition of Omneo and the future development of big data solutions as part of the Siemens PLM Cloud Services group. Navigate to my earlier post – Cloud Services and Big Data.

Yesterday at the Siemens PLM Analyst event in Boston, I had a chance to listen to a presentation by Michael Shepherd of Dell about how big data can improve customer experience. Below are a few snippets of the presentation that caught my attention.

It all starts from the fact that Dell is collecting an overwhelming amount of log information about what happens with customer devices in real life.


This creates a real problem of finding, slicing and dicing data, which can be solved using different tools. A team of data geeks can work on the data and find a problem, but here is the thing – time is the problem. In most situations you would like to find a problem pattern much faster than a team of data geeks can.



This is where big data solutions like Siemens PLM Omneo can help. By collecting product information and applying a variety of analytic methods, they can surface patterns in a much faster way.
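To make that more concrete, here is a minimal sketch of this kind of pattern hunting. The log fields (device_id, component_supplier, error_code) are invented for illustration and are not Dell's or Omneo's actual schema:

```python
# A minimal sketch of the kind of pattern detection described above.
# Field names are hypothetical, not any vendor's real schema.
import pandas as pd

# Imagine a large dump of device log records collected from the field.
logs = pd.DataFrame([
    {"device_id": "D1", "component_supplier": "SupplierA", "error_code": "E42"},
    {"device_id": "D2", "component_supplier": "SupplierA", "error_code": "E42"},
    {"device_id": "D3", "component_supplier": "SupplierB", "error_code": "OK"},
    {"device_id": "D4", "component_supplier": "SupplierA", "error_code": "E42"},
])

# Group failures by supplier to spot a suspicious concentration of defects.
failures = logs[logs["error_code"] != "OK"]
pattern = failures.groupby("component_supplier").size().sort_values(ascending=False)
print(pattern)  # SupplierA stands out with 3 failing devices
```

The point is not the few lines of code, but doing this automatically over billions of records instead of waiting for a team of analysts.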


One thing that is very important is user experience. A combination of search and data exploration can be an interesting way to solve the problem. The following picture gives you a few examples of how it is done with Omneo.


What is my conclusion? PLM and IoT solutions are coming to the intersection of their data platforms. Manufacturing companies have a problem processing, classifying, exploring and searching large volumes of data and relationships. PLM data platforms are hitting some limits here. This is where big data technologies and new data tools can come into place. Siemens PLM Omneo is a good example of such tools bundled into a solution to save time for manufacturing companies facing data problems these days. Just my thoughts…

Best, Oleg

How to close collaboration gaps in CAD and PDM

September 10, 2015


We don’t have to share the same room to work together these days. Businesses are discovering globalization options, following customers across the globe, skilled workforce, capital and supply chain options. In such an environment, developing reliable collaboration options is an imperative.

There are many technologies these days that can support global teams with a variety of collaborative options. However, with all respect to technologies and products, our ability to correlate the work we do on an everyday basis with other people on the team is not keeping up with the demand for collaboration.

How can we better connect our business output with the work teams and individuals are doing in different offices and locations? Conference calls, video conferences, webcasts, instant messages, social tools – all these tools are supposed to help, but in fact they are very often among the biggest productivity drains. After all, fancy collaborative tools are becoming expensive and inefficient “file share” servers, and most of our effort goes into keeping these complex shares in sync between teams and locations. When that fails, companies create new data silos to help people coordinate their work.

There are many examples of how CAD and PLM tools are solving collaboration problems in design, engineering and manufacturing. I just want to bring up two that came to me yesterday.

I captured my first example at the Siemens PLM Analyst Event in Boston. Siemens PLM and Teamcenter have big plans to develop a variety of tools to support supplier participation in high-value business processes. My human translation – Teamcenter will help organizations and people collaborate.


The following example of Design Data Exchange shows specifically how a portion of data can be retrieved and shared for collaboration with a supplier. Rules support automatic data retrieval from Teamcenter.


This is not a very unique process. The devil is in the details, and the way data can be extracted and shared in the context of work done by other people at the same time is critical. I hope to learn more about that later today.
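To illustrate the idea of rule-driven extraction – this is not Teamcenter's actual rule language or API, just a hypothetical sketch of the pattern – consider something like this:

```python
# A minimal, hypothetical sketch of rule-based data extraction for supplier sharing.
# Item fields and rules are invented purely to illustrate "rules drive what gets shared".
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Item:
    number: str
    category: str            # e.g. "design", "requirement", "internal-note"
    supplier: Optional[str]  # supplier responsible for the item, if any

# A share rule: which item categories a given supplier is allowed to receive.
SHARE_RULES = {
    "ACME Corp": {"design", "requirement"},
}

def extract_for_supplier(items: List[Item], supplier: str) -> List[Item]:
    """Return only the items the rules allow us to share with this supplier."""
    allowed = SHARE_RULES.get(supplier, set())
    return [i for i in items if i.supplier == supplier and i.category in allowed]

items = [
    Item("P-100", "design", "ACME Corp"),
    Item("P-101", "internal-note", "ACME Corp"),
    Item("P-200", "design", "Other Supplier"),
]
print(extract_for_supplier(items, "ACME Corp"))  # only P-100 passes the rules
```

The interesting part in a real PLM system is everything around this filter – keeping the extracted subset consistent with work other people are doing at the same time.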

My second example comes from a blog post about a new feature developed by Onshape (a new software outfit founded by Jon Hirschtick and a few other members of the early SolidWorks team). The fundamental premise of the Onshape cloud CAD tool is to support online collaboration between people working on the same design. Design teams spread across the room or across the world can collaborate on the same CAD model at the same time. You can learn more about Onshape collaboration functionality here. The latest Onshape functionality – Onshape Teams – allows you to share information with a group of people and simplifies the process of sharing.
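A tiny, purely illustrative sketch (not Onshape code; class and field names are invented) of the difference between working on the same data and exchanging copies of it:

```python
# Illustration only: same shared object vs. diverging exported copies.
import copy

class DesignModel:
    def __init__(self):
        self.features = []

# "Same piece of data at the same time": both users hold the same model object,
# so an edit by one is immediately visible to the other.
shared = DesignModel()
user_a, user_b = shared, shared
user_a.features.append("hole-1")
print(user_b.features)  # ['hole-1'] – the change is visible right away

# "Export/import" style: each user works on a copy, and the copies silently
# diverge until somebody merges them by hand.
copy_a = copy.deepcopy(shared)
copy_b = copy.deepcopy(shared)
copy_a.features.append("fillet-2")
copy_b.features.append("pocket-3")
print(copy_a.features, copy_b.features)  # two different histories to reconcile
```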



What is my conclusion? You can get by using different tools to collaborate by sharing information between users and groups. What seems important to me is the ability to manage the information boundary for collaboration. You can do it using a variety of technologies, on premise or in the cloud. However, the most important thing is to create a real-time collaborative context. It can be tricky. Data export/import and information exchange can be relatively easy, but it won't help you collaborate on the same piece of data at the same time. Supporting a real-time collaborative context can be a potential gap. By developing technologies to support it, we enable a greater level of collaboration efficiency in the future. Just my thoughts…

Best, Oleg

Image courtesy of renjith krishnan at

PLM, Upgrades and Competition

June 1, 2015


Traditional PLM wisdom says buying a PLM system is a complex process. Once you have decided on a specific platform or technology, it will be very hard to change or replace it with something else. I was skimming social network news this morning when the following tweet from @jonathanpscott caught my attention – “More details on the #ENOVIA #SmarTeam User Group meeting in OH next week #PLM“. I was involved in SmarTeam development many years ago, and I’m aware Dassault Systemes is still supporting the system. However, the following message from the agenda – “ENOVIA SmarTeam – a safe place to be” – increased my curiosity even more. Further down the agenda you can see topics related to SmarTeam migration and co-existence.

It made me think more about the lifecycle of PLM systems and implementations. What is the average cycle time for a PLM implementation? How often do companies replace PLM systems, and what does it mean for a company in terms of effort, planning, operation and support?

Earlier this month I came across Aras Corp. materials about “resilient PLM”. If you haven’t heard about this new PLM buzz, navigate to the following link to read more. The term was coined by Aras to explain how Aras Innovator’s technology can withstand multiple upgrades and changes. Peter Schroer, Aras CEO, explains resilient PLM in the following video. Pay attention to the part of the video explaining how Aras customers have successfully moved between different versions of Aras on different databases – Postgres, Oracle, Microsoft SQL Server – over the last 15 years.

Cloud technologies are another way to solve the problem of upgrades. Jim Brown of Tech-Clarity and Brian Roepke of Autodesk discuss the advantages of cloud PLM. Watch the following video from 6:05, when Brian speaks about upgrades and revision lock. An upgrade is sometimes even more expensive than the initial implementation. According to Brian Roepke, cloud PLM technologies solve the problem of upgrades and migrations in traditional PLM implementations.

Migration of PLM solutions can be a significant driver in the fundamental strategic decisions manufacturing companies are making. Last week Siemens PLM announced the successful completion of Daimler's PLM2015 project and the move from CATIA to NX. Daimler’s decision was heavily influenced by the preference not to move between two PLM systems (Teamcenter and ENOVIA). Read about it in the Schnitgercorp blog post Reaching that one customer in a PLMish landscape. Here is the passage which explains the reason:

Daimler‘s decision to move from CATIA to NX, huge as it was, was ultimately made by a team that weighed the benefits and risks in a more limited context than the overall Siemens portfolio. As I understand it, in the end it was simple: Daimler had based many business processes on its Teamcenter implementation; CATIA V6 requires ENOVIA, so Daimler would have had to build links between ENOVIA and Teamcenter to move forward with Dassault Systemes. That was more complicated, to Daimler, than migrating 235,000 “CAD objects” and retraining 6,000 people.

What is my conclusion? A PLM upgrade or migration is a sensitive and complicated process. It requires a lot of resources and can be very costly. In the current state of manufacturing and PLM technology, customers are looking at how to ensure many years of operation once they have implemented a system. However, business is changing, and the need to be flexible strikes back as a conflicting requirement. Combined together, this brings a very interesting flavor into PLM competition. The ability to implement a PLM system and upgrade an existing (often outdated) PLM implementation becomes a key feature in the future competitiveness of PLM systems. It is equally important for cloud and non-cloud implementations. I think a PLM vendor and technology capable of doing so can gain a lot of traction in the future. Just my thoughts…

Best, Oleg

Image courtesy of iosphere at


Siemens PLM: Cloud Services and Big Data

May 27, 2015


You can say that the buzz around big data is annoying. At the same time, organizations are struggling with a fundamental challenge – there is far more data than they can handle. Here are some interesting facts about data growth around us. Back in 2000, only 25% of all data stored in the world was digital. By 2007, 94% of all data was stored digitally. Some experts have estimated that 90% of all data in the world was produced in the last 2 years.

Manufacturing and engineering organizations have to deal with a growing amount of data. Old-fashioned methods of handling data are not good enough anymore. You may want to look at some of my previous posts – Will PLM vendors dig into big data? and Big data and importance of information lifecycle. Even more, the question of how to use data to improve product quality or design becomes important – PLM and big data driven product design. For many organizations, data can become a very disruptive force.

Last week at the PLM Connection 2015 conference in Dallas, I learned a few interesting facts about how Siemens PLM is developing big data cloud solutions to handle large volumes of complex information for manufacturers. Steve Bashada’s presentation was about the work Siemens PLM has done following the acquisition of Omneo, which came as part of the Siemens PLM acquisition of Camstar.

Getting back to Siemens PLM Omneo. The idea is to discover data patterns that can lead to optimal product performance. This may sound too generic, so let's translate it into more specific actions. Think about finding the reasons why the last batch of hardware devices, such as computer flash drives or wearable gadgets, was defective and tracking down the supplier of the faulty components. An Inside Big Data whitepaper gives you an interesting perspective on the Omneo solution. You can download the whitepaper in exchange for your email address here. Here is the passage from the article I especially liked:

For a compelling example that illustrates how big data is affecting the manufacturing sector, we can consider Omneo, a provider of supply chain management software for manufacturing companies. The business need was to enable global manufacturers to efficiently manage product quality/performance and customer experience. Consequently, Omneo needed to collect, manage, search and analyze vast amounts of diverse data types, and it sought the right software and hardware infrastructure to support this effort.

  • Enables global-brand owners to manage product performance and customer experience
  • Delivers a 360-degree view of supply chain data
  • Searches billions of data records in less than three seconds
  • Scales to support 300 million records every month
  • Allows customers to quickly search, analyze and mine all their data in a single place so that they can identify and resolve emerging supply chain issues

The following slide can give you generic yada-yada about the solution.


Siemens PLM is working on the solution with a few selected customers. Dell is one of them. The following slides give you an idea of how a specific customer problem can be solved.


The solution uses a “search based” user experience to search, filter and navigate between bits of data.


What is the technical foundation of the solution? Omneo uses elements of the existing big data stack you might be familiar with – HDFS, Hadoop and Cloudera – combined with open-source search technologies like Solr. Omneo brings metadata and a unified data model to handle product information and uses HBase to manage information. The following slide gives you some more information about the technical stack and how the product handles data.
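For readers who want to picture the pattern, here is a minimal sketch of the HBase-plus-Solr combination. It assumes a local HBase Thrift server and a Solr core named "products"; the table, columns and fields are invented and are not Omneo's actual schema or services:

```python
# A minimal, hypothetical sketch of the HBase + Solr pattern described above.
import happybase   # HBase client (pip install happybase)
import requests    # plain HTTP client for Solr's REST API

# Store a product record in HBase, keyed by serial number.
hbase = happybase.Connection("localhost")
table = hbase.table("product_records")
table.put(b"SN-0001", {
    b"info:model": b"FlashDrive-X",
    b"info:supplier": b"SupplierA",
    b"quality:error_code": b"E42",
})

# Query Solr (which would index the same records) for failing units from one supplier.
resp = requests.get(
    "http://localhost:8983/solr/products/select",
    params={"q": "supplier:SupplierA AND error_code:E42", "rows": 10, "wt": "json"},
)
for doc in resp.json()["response"]["docs"]:
    print(doc["id"], doc.get("model"))
```

The division of labor is the interesting part: HBase scales the storage of raw product records, while Solr provides the fast faceted search that drives the user experience.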


What is my conclusion? Big data is a hard problem to solve. But it brings very interesting business cases. Siemens PLM Omneo is an example of a specific data solution targeting big data problems in manufacturing organizations – so far, the most specific example I have been able to find reported by PLM vendors. My hunch is that other PLM vendors might be looking at solutions, but I haven't seen specific publications about that. I think big data can be applied in very interesting ways to handle different product development, customer and manufacturing issues. We are just not there yet. Manufacturing organizations and existing vendors are too slow to discover them. Just my thoughts…

Best, Oleg

Picture credit: Inside Big Data article

Active Workspace: The transformation of search user experience

May 21, 2015

Providing access to product information is one of the most important roles of a PLM system. Some time ago, organizing data in folders was a good enough solution. Not anymore. Google changed the way we think about accessing information. Search in engineering and manufacturing software is tricky. There are many things that influence the way you search for information – access rights, context, dependencies. The following picture summarizes the top obstacles in finding information according to the Enterprise Search and Findability Survey 2013 published by FINDWISE.


I wrote a few posts about search in PLM in the past. Navigate to one of them to read more – PLM search and findability. For the last few years, PLM vendors have put significant focus on improving user experience and search functions. There are examples of search-driven user interfaces in Aras PLM, Autodesk PLM360, Dassault / EXALEAD and others.

Earlier this week at Siemens PLM Connection in Dallas, I had a chance to watch the newest updates the Teamcenter team has made to Active Workspace. The product was first presented back in 2011. Here is my original post about it – Siemens Active Workspace: PLM next big thing. Since that time, the product has evolved into a rich user experience focused on providing a role-based information search and navigation client. Siemens PLM Active Workspace is available in a browser and provides a way to search, filter and visualize information. In the example below, you can see an interesting combination of search, filtering and bar-chart visualization.


Product information is usually intertwined with many dependencies. The following example shows the ability to navigate between interconnected pieces of information.


The same UI provides access to the viewer.


One of the new functions I found is access to information such as the Bill of Materials in a spreadsheet-like way, with the ability to dynamically select columns, filter and sort.
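If you want to picture the interaction, here is a small sketch using pandas with invented BOM columns. It is not Active Workspace code, just a way to visualize dynamic column selection, filtering and sorting on BOM data:

```python
# A small illustration of spreadsheet-like BOM interaction with invented columns.
import pandas as pd

bom = pd.DataFrame([
    {"item": "Bracket",  "part_number": "P-100", "qty": 4,  "material": "Steel"},
    {"item": "Housing",  "part_number": "P-200", "qty": 1,  "material": "Aluminum"},
    {"item": "Fastener", "part_number": "P-300", "qty": 12, "material": "Steel"},
])

# Select columns, filter by material and sort by quantity – the same operations
# a user would perform interactively in a spreadsheet-like BOM view.
view = (bom.loc[bom["material"] == "Steel", ["part_number", "item", "qty"]]
           .sort_values("qty", ascending=False))
print(view)
```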


And finally, you can annotate and comment on pieces of information together with other people. It gives you some sort of social experience.


Search has become the most important element of the UI when it comes to accessing complex information. Search plays a significant role in the transformation of PLM user experience towards a simpler and more intuitive UI. Many users will appreciate better search-driven navigation. PLM vendors are paying more attention to search, and you can find search functions in other PLM systems too. At the same time, Active Workspace is probably the latest example of a search-driven UI in PLM.

What is my conclusion? I found the evolution of Active Workspace over the 4 years since it was originally presented interesting. The UI became completely webish, and many new functions were added. However, the core function of a search-driven user experience is there, and it provides differentiation from traditional folder-based navigation and browsing interfaces. Just my thoughts…

Best, Oleg

Bill of Materials (BOM) and product lifecycle open loops

May 19, 2015


It is hard to overestimate the importance of the Bill of Materials for product development. In my keynote at the ProSTEP iViP symposium in Stuttgart earlier this month, I shared my thoughts on why developing a single BOM across multiple disciplines is critical for an organization. I wanted to bring a few examples that demonstrate why a single BOM strategy can bring benefits to product development and manufacturing organizations.

Earlier today, at the Siemens PLM Connection event in Dallas, I captured the following slide demonstrating an integrated approach to design, manufacturing, planning and production. What is really interesting is how the as-designed, as-planned and as-built views in PLM are integrated with design, manufacturing, planning and production.


A few days ago, the following 3D CAD World article caught my attention – Progress in closing the product lifecycle’s loops, by Peter Bilello, president of CIMdata. The article speaks about the importance of collaboration across diverse enterprise groups.

For many years, the PLM industry has greatly benefited from a steady stream of improvements in collaboration among ever more diverse enterprise groups—in data interoperability, for example, and in the transparency of workflows and processes. The development, manufacture and support of globally competitive new products are, however, still hamstrung by the remaining open loops new and old.

Later, the article comes to the topic I was looking for – Bill of Materials. According to the article, the BOM is the biggest remaining challenge to making integration run smoothly. Here is the passage which explains that.

Between engineering, manufacturing and finance, a big remaining challenge is the bill of materials (BOM) in its many forms—the as-designed BOM, the as-engineered BOM, the as-manufactured BOM, and so on. Generated and managed with PLM and often executed by enterprise resource planning (ERP) systems, BOMs themselves are loop closers. PLM-ERP connectivity and interoperability are steadily improving, but some open-loop issues are resolved only after time consuming face-to-face meetings.
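To make the open-loop point a bit more concrete, here is a minimal sketch with invented part data showing why one product carries several BOM forms and why reconciling them between PLM and ERP takes effort. It is illustrative only, not any vendor's data model:

```python
# Why one product carries several BOM forms, with invented part data.
as_designed = {            # engineering view: logical parts and design quantities
    "P-100 Bracket": 4,
    "P-200 Housing": 1,
}

# Manufacturing substitutes an approved alternate part and adds process material.
substitutions = {"P-100 Bracket": "P-100B Bracket (alt. supplier)"}
process_items = {"ADH-01 Adhesive": 1}

as_manufactured = {substitutions.get(part, part): qty
                   for part, qty in as_designed.items()}
as_manufactured.update(process_items)

# The "open loop": parts that appear in one view but not the other must be
# reconciled between PLM and ERP before the product record is consistent.
designed_only = set(as_designed) - set(substitutions) - set(as_manufactured)
manufactured_only = set(as_manufactured) - set(as_designed)
print("as-manufactured BOM:", as_manufactured)
print("needs reconciliation:", designed_only | manufactured_only)
```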

What is my conclusion? A single BOM could be a great thing if vendors figure out how to implement it. As you can learn from Bilello's article, PLM-ERP has an open-loop issue, and the BOM is a tool to close it. However, companies are wary of adopting a single BOM strategy, since it can raise a lot of organizational challenges for them. At the same time, the demand for better integration and collaboration can put companies in front of the decision to adopt a single BOM to close the open loops between engineering, manufacturing and production anyway. Just my thoughts…

Best, Oleg

Image courtesy of Stuart Miles at


