What is the future of PLM data analytics?

October 31, 2013


The amount of data we produce is skyrocketing these days. The social web, mobile devices, the internet of things – these are only a few examples of data sources that are massively changing our lives. The situation in the business space is not much different. Companies are increasingly involved in connected business. Design, supply chain, manufacturing – all these activities produce a significant amount of digital information every minute. In design, the complexity of products is growing significantly. Personalization, customer configurations and many other factors create a significant data flow. Simulation is another space that can potentially generate a significant amount of data. I watched the presentation "Workspace 2020" given by Dion Hinchcliffe last week at the forum "Transform the way work gets done". Navigate to the SlideShare slides and take a look. One of the numbers struck me – the speed of data growth. Now (in 2013) we double our data every 2 years. By 2020, we are going to double our data every 3 months.
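To put those two doubling rates in perspective, here is a quick back-of-the-envelope calculation (my own arithmetic, not from the presentation), converting a doubling period into an annual growth factor:

```python
# Compare the two doubling rates mentioned above as yearly growth factors.
# This is illustrative arithmetic, not data from the "Workspace 2020" talk.

def annual_growth_factor(doubling_period_months: float) -> float:
    """How many times data multiplies in one year, given a doubling period."""
    return 2 ** (12.0 / doubling_period_months)

# 2013: doubling every 2 years (24 months) -> roughly 1.41x per year
factor_2013 = annual_growth_factor(24)

# 2020 projection: doubling every 3 months -> 16x per year
factor_2020 = annual_growth_factor(3)

print(f"2013: ~{factor_2013:.2f}x per year")
print(f"2020: ~{factor_2020:.0f}x per year")
```

In other words, the projected 2020 rate is not a little faster – it multiplies data more than ten times faster per year than the 2013 rate.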


This massive amount of data raises the question of how engineering, design and manufacturing systems can handle it and produce meaningful insight and decision support for engineers and other people involved in the development process. The question of data analytics came to my mind. In the past, data analytics was usually associated with a long, complicated and expensive process involving IT, resources, time and programming. The cloud and the new computing ecosystem are changing this space. I was reading the announcement made by Tableau (an outfit focusing on analytic dashboards and tools) – Tableau Software partners with Google to Visualize Big Data at Gartner IT Symposium.

The partnership mixes Tableau’s analytics with the Google Cloud Platform. The technology was presented at the Gartner conference in Orlando recently. Here is an interesting passage explaining what Tableau did with Google:

“Tableau and Google created a series of dashboards to visualize enormous volumes of real-time sensory data gathered at Google I/O 2013, Google’s developers’ conference. Data measuring multiple environmental variables, such as room temperature and volume, was analyzed in Tableau and presented to attendees at the Gartner event. With Tableau’s visual analytics, Gartner attendees could see that from the data created, I/O conference managers could adjust the experience and gain insights in real time, like re-routing air-conditioning to optimize power and cooling when rooms got too warm.”

I found the following video interesting – it explains how easily you can build analytics with Tableau and Google BigQuery.
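To make the idea concrete, here is a minimal sketch of the kind of query behind such a dashboard, written for today's google-cloud-bigquery Python client. The table name and columns are hypothetical – the article does not disclose Google's actual I/O sensor schema:

```python
# Sketch: find conference rooms whose average temperature exceeds a threshold,
# the kind of question behind the Tableau/Google I/O dashboards described above.
# The table name and column names are hypothetical.

def build_room_temperature_query(table: str, threshold_c: float) -> str:
    """Return SQL that lists rooms with an average temperature above a threshold."""
    return (
        "SELECT room, AVG(temperature_c) AS avg_temp "
        f"FROM `{table}` "
        "GROUP BY room "
        f"HAVING avg_temp > {threshold_c} "
        "ORDER BY avg_temp DESC"
    )

sql = build_room_temperature_query("my_project.io2013.sensor_readings", 26.0)
print(sql)

# With Google Cloud credentials configured, the query could be run like this:
#   from google.cloud import bigquery
#   client = bigquery.Client()
#   for row in client.query(sql):
#       print(row["room"], row["avg_temp"])
```

The point is less the SQL itself than the workflow: sensor data streams into a cloud warehouse, and a visual tool like Tableau sits on top of queries like this one.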

It made me think about the future potential of analytics we could bring to the design and engineering process by analyzing huge amounts of data – simulation, customer, operational. Just think about combining data collected from products in the field with simulation analysis to improve design decisions. Does it sound crazy and futuristic? Yes, I think so. However, many other things we considered crazy just a few years ago have become reality now.

What is my conclusion? Data analytics is one of the fields with the potential to improve the design and engineering process by analyzing significant amounts of data. Think about leveraging cloud infrastructure, network and graph databases combined with visual and intuitive analytic tools. It is not where we are today. This is how the future will look. Just my thoughts…

Best, Oleg

Will PLM Highway Route to Cloud Corridor?

October 31, 2013

It is not unusual to hear about technological clusters these days. Cluster development emphasizes the importance of geography, or more correctly the economies and/or technologies of co-located companies and businesses. In the manufacturing world, it very often includes suppliers or companies specializing in a specific domain or industry. The Wikipedia article Business Cluster covers the history of and reasons behind different industrial, technical and financial clusters. One of the classic examples of a cluster is Silicon Valley. Here is how it is explained in Wikipedia:

In the mid- to late 1990s several successful computer technology related companies emerged in Silicon Valley in California. This led anyone who wished to create a startup company to do so in Silicon Valley. The surge in the number of Silicon Valley startups led to a number of venture capital firms relocating to or expanding their Valley offices. This in turn encouraged more entrepreneurs to locate their startups there. In other words venture capitalists (sellers of finance) and dot-com startups (buyers of finance) “clustered” in and around a geographical area. The cluster effect in the capital market also led to a cluster effect in the labor market. As an increasing number of companies started up in Silicon Valley, programmers, engineers etc. realized that they would find greater job opportunities by moving to Silicon Valley. This concentration of technically skilled people in the valley meant that startups around the country knew that their chances of finding job candidates with the proper skill-sets were higher in the valley, hence giving them added incentive to move there.

In the past, I wrote about the concentration of manufacturing companies and associated software companies along Rt. 128 in Greater Boston. I called it the “PLM highway”. Navigate to my historical post here. Following my post, the same topic was covered by a PTC publication and a Boston Globe article. Both stories emphasized the importance of the location of manufacturing companies on Mass Rt. 128.


A few days ago, I was reading the Buzzfeed article Tech Companies Are Trying To Rename Downtown San Francisco The “Cloud Corridor”. The map below looks fascinating. It clearly demonstrates the importance of technology concentration.


Cloud technologies are no different in that regard, and they lead to even greater specialization and greater interest among people in co-locating in the same place to enable better communication and collaboration. It made me think about PLM and other software companies focusing on engineering and manufacturing. The PLM cloud switch is going to happen. The only question is when. The ability to stay connected with other cloud tech companies can be an interesting competitive advantage.

What is my conclusion? Technology can hardly be disconnected from people. It is clear that in order to succeed in the cloud business, companies will have to find a way to stay connected with people and other companies focusing on business in the same domain. Will CAD / PLM companies follow the dynamic of cloud cluster development to get better connections with other cloud businesses? It is an interesting question and trend to follow. The answer is unclear to me. Just my thoughts…

Best, Oleg

Why PLM selection is about data access problem first?

October 29, 2013


How to select PLM? Manufacturing companies, industry pundits and vendors are trying to simplify this process. Nevertheless, after almost three decades, the problem is still on the table. PLM sales are value based and unfortunately require juggling too many people and events at once. I see this process as a combination of technological choices, company practices and vendor selection.

I recently came across a few notable PLM blog articles focusing on different aspects of PLM selection. My long-time industry colleague and blog buddy Jos Voskuil published an article, PLM Selection – Additional thoughts. Take the time to read Jos’ article – it contains many good recommendations and things to check when you are trying to select a PLM system, such as organizational requirements, implementation specifics, cost and even vendor FUD. The last one was interesting, and I especially liked this passage:

My recommendation to a company that gets involved in FUD during a PLM selection process, they should be worried about the company spreading the FUD. Apparently they have no stronger arguments to explain to you why they are the perfect solution; instead they tell you indirectly we are the less worst.

Another article I came across is a publication on the CL2M portal. A short writeup by Scott Cleveland – Why PLM? – is interesting. Scott also mentions multiple reasons to get involved in PLM. One of them, the effort of "looking for information", caught my special attention. It comes up as the first problem one of his clients was trying to solve by implementing PLM. Here is how Scott explains it:

Looking for Information: He told me about the time his engineers spend looking for ‘stuff’ [like drawings and files]. I said this is a problem everywhere. I told him there have been many studies performed analyzing the time it takes engineers to find ‘stuff’, and all of them say that, without document management software, engineers can spend as much as 25% of their time looking for ‘stuff’.

He said he couldn’t put a figure on it, but believes that could be true at his company. He also mentioned that at some point the engineer will stop looking and just recreate the missing information. He said this is a killer. First it wastes project time and second, it leads to duplicate part numbers and possible errors caused by the duplicate drawings.
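Scott's 25% figure is easy to translate into money. Here is a rough illustration; the headcount and loaded cost are hypothetical numbers of my own, and only the 25% fraction comes from the article:

```python
# Rough yearly cost of the "looking for stuff" problem described above.
# The engineer count and loaded cost are hypothetical; only the 25%
# search-time figure is taken from Scott Cleveland's writeup.

def annual_search_cost(engineers: int,
                       loaded_cost_per_year: float,
                       search_time_fraction: float = 0.25) -> float:
    """Estimated yearly spend on engineers searching for information."""
    return engineers * loaded_cost_per_year * search_time_fraction

cost = annual_search_cost(engineers=20, loaded_cost_per_year=120_000)
print(f"${cost:,.0f} per year")  # $600,000 per year for a 20-person team
```

Even with conservative inputs, the number gets large quickly, and it does not yet include the cost of recreated data and duplicate part numbers that Scott mentions.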

Altogether, it made me think about how to put information access at the center of the PLM selection process. Access to information from multiple devices and organizational locations, at any time, becomes an absolute must – a requirement any vendor should answer before getting into further PLM evaluation. Otherwise, you will be sinking into people's inefficiency every day.

What is my conclusion? We live in an era when access to information is becoming mission critical. Your design decisions can lead to much more expensive options. You can potentially select the wrong supplier. You can miss delivery dates. However, the most important thing to note is the time aspect. Engineers are spending lots of time looking for "stuff". This is a problem nobody can tolerate anymore. Just my thoughts…

Best, Oleg

How to stop comparing PLM and SharePoint?

October 29, 2013


SharePoint is a long-debated topic in the PLM ecosystem. Since the early success of SharePoint back in 2006-2007, many companies have been asking questions about comparing SharePoint with PLM systems. SharePoint took a significant market share using a successful licensing policy that embedded the introductory version of SharePoint into the Windows Server license. Another strong point of SharePoint was very good integration with Microsoft Office. At the same time, despite its significant mainstream success, SharePoint still raises many questions about how manufacturing companies can leverage it to manage CAD drawings, collaborate and share engineering data.

I have addressed SharePoint on my blog many times. If you never had a chance to read my SharePoint-related articles, you can start with Why PLM should care of SharePoint? Another post that will help you learn more about SharePoint and PLM is this one – PLM and SharePoint Technical Definition.

Microsoft and PLM vendors had long debates about how to partner around SharePoint. The results of those debates and partnerships were mixed. Some vendors (e.g. Siemens PLM) made a long-term commitment by integrating TeamCenter with SharePoint. The examples are TeamCenter Community and SolidEdge Insight (SolidEdge XT, SolidEdge SP). Other vendors, like PTC, tried to establish multiple products reusing the SharePoint infrastructure (Windchill SocialLink, Windchill ProductPoint, etc.). Some of them succeeded and some failed. You can read more here. Other vendors invested more in service contracts to integrate SharePoint with PLM products and infrastructure.

Microsoft's SharePoint business is big these days. At the same time, it has started to show some indications of weakness that might be typical for enterprise-type solutions. In a nutshell, it became complex, hard to deploy and costly (especially in situations where an organization needs to scale SharePoint up to a full-blown enterprise solution). Read my article SharePoint got infected with PLM disease. Microsoft is trying to react to the new SharePoint status. From what I see, Microsoft is transforming SharePoint from a product into infrastructure. You can find more information about this activity in my post – Will PLM Benefit From “SharePoint Death”?

Recently, another interesting piece of information about SharePoint deployments came to my attention. Navigate your browser to the ArnoldIT blog – SharePoint Not on the Radar. The writeup quotes research surveys conducted by AIIM. I found the following passage important.

AIIM conducted a survey and found that only 6% of its respondents found their deployments successful, while 43% are struggling with implementing SharePoint, and another 28% say that progress has stalled in their SharePoint projects. That only touches the shallow end of the SharePoint pool. Many companies are also running multiple versions of the software, which can only lead to compatibility issues.

Another interesting piece of research was published by a consulting outfit – Razorleaf. Navigate to the following link to access Razorleaf’s PLM and SharePoint white paper. Here is an interesting passage from the Razorleaf article:

SharePoint is poised to become an ubiquitous document and content management system with great capabilities for ad hoc collaboration, whereas Product Lifecycle Management (PLM) is a similar but more structured system focused on product development content. By integrating SharePoint and PLM (the Product Data Management (PDM) components of PLM), users can blend the benefits of both to enable ad hoc collaboration on top of rich, structured product data and information.

A similar conclusion comes from the Aras PLM blog – Why Can’t We Just Use SharePoint? The discussion in the article is about why SharePoint is not capable of replacing PLM software. The Aras blog mentions two significant capabilities missing in SharePoint – context / relationship management and workflow. Without agreeing on the specific elements, I have to admit that the general conclusion in the following passage seems right:

SharePoint is a powerful tool that has a very useful place in your PLM strategy, however, unless you are ready to invest significant time and resources into customization (i.e. building all the PLM functionality), it is not a replacement for choosing and deploying a real PLM system.

What is my conclusion? Back in 2006-2007, SharePoint provided an interesting and easy-to-deploy set of tools, perceived by everybody as something with the potential of a PLM system. Since that time, SharePoint has gravitated more towards being a "platform". At the same time, PLM is focusing more on cloud vertical solutions and business applications. Will their paths meet in the future? This is a very good and important question to ask. I doubt it. Meanwhile, the difference is obvious, in my view. So, stopping the comparison of PLM and SharePoint should be a clear goal. Complementary paths are possible indeed. Just my thoughts…



Tech Soft 3D TechTalk: PLM and Data Management in 21st Century

October 25, 2013


Boston is one of the rare places where you can meet many CAD and PLM people at the same time in the same place. You don't need to guess why. The MIT CAD Lab, as well as many companies in this domain, made Greater Boston a unique place for talent in the CAD and PLM space.

Tech Soft 3D is a well-known technology outfit helping many companies in the CAD and PLM domain develop successful products. Besides that, Tech Soft 3D sponsors Tech Talk, a gathering where technology people in the CAD/PLM domain come to network and share their experience. Yesterday was my first time attending Tech Talk in downtown Boston. I missed last year's because of a crazy travel schedule. This year I was honored to be invited to give a short talk. I shared my experience and thoughts about database and data management technology trends. As part of my presentation, I spoke about the so-called NoSQL trend, what it includes and how it can be useful for CAD and PDM/PLM. Below you can see the full slide deck of my presentation.

PLM and Data Management in 21st Century from Oleg Shilovitsky

On the following slide, you can see a simplified decision table that can help you determine which NoSQL databases can be useful for different types of solutions.
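The spirit of such a decision table can be captured as a simple lookup. The mapping below is my own illustrative simplification, not a copy of the slide:

```python
# An illustrative NoSQL decision table, in the spirit of the slide above.
# The categories, examples and use cases are my own simplification,
# not reproduced from the presentation.

DECISION_TABLE = {
    "document store": {
        "examples": ["MongoDB", "CouchDB"],
        "good_for": "flexible part attributes and semi-structured metadata",
    },
    "key-value store": {
        "examples": ["Redis", "Riak"],
        "good_for": "caching, sessions and fast lookups by ID",
    },
    "graph database": {
        "examples": ["Neo4j"],
        "good_for": "product structure, where-used and dependency queries",
    },
    "column family": {
        "examples": ["Cassandra", "HBase"],
        "good_for": "large-scale telemetry and log-style data",
    },
    "relational (RDBMS)": {
        "examples": ["MySQL", "PostgreSQL"],
        "good_for": "transactional records requiring strong consistency",
    },
}

def recommend(category: str) -> str:
    """Summarize one row of the decision table."""
    entry = DECISION_TABLE[category]
    return f"{category}: {', '.join(entry['examples'])} – {entry['good_for']}"

print(recommend("graph database"))
```

For PDM/PLM specifically, the graph row is the interesting one: product structures and where-used queries map naturally onto graph traversals rather than relational joins.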


What is my conclusion? Database and data management technology is going through a Cambrian explosion of different options and flavors. It is a result of the massive amount of development coming from open source, the web and other places. The database is moving from "solution" to "toolbox" status. A single database (mostly an RDBMS) is no longer a straightforward decision for all your development tasks. My hunch is that CAD/PLM developers need to ramp up their tools and knowledge to tackle future database decisions. Just my thoughts…
Best, Oleg

PLM Is Challenged With Compliance Data

October 24, 2013


Manufacturing is going global. This is not about the future. This is the reality of all manufacturing companies today. With such a reality, supply chain solutions are getting more and more traction and interest from IT managers and other people in companies. To use the language of one of my sales exec friends, supply chain solutions are turning from a vitamin into a painkiller. Which means a sales opportunity.

At the same time, supply chain is not a new topic for manufacturing companies. Very often, PLM companies focus on the supply chain with "CAD in mind", which makes them look mostly at design supply solutions. Many supply chain management opportunities go directly to ERP vendors and other vendors specializing in vertical supply chain management solutions. In my view, data becomes one of the most critical topics in every supply chain solution. In most situations, the data needed for supply chain solutions is "stuck in the middle" between ERP, PLM and SCM. It combines design data, manufacturing data and supplier data. It is not easy. It requires openness and transparency between multiple systems. Let's tell the truth – it doesn't work well in existing enterprise systems. These systems were never designed for openness and transparency.

The topic of supply chain compliance was discussed during the recent PI Congress in Chicago two weeks ago. I found a good summary of the PI Congress provided by CIMdata here. Here is an interesting passage from this writeup related to the supply chain compliance topic:

The first [panel] focused on supply chain transparency and traceability. This issue occurs at the intersection of PLM and ERP, and is critically important to firms that must increasingly compete on a global basis. The panel agreed there was a need for a common dataset on compliance issues, a common problem when selling in many countries. They recognized that PLM solution providers are challenged to provide this information in a timely fashion, and challenged the audience to help find or create new alternatives.

A common dataset is an interesting topic. It made me think about the trend towards open data, which I see as part of the broader discussion about the BigData opportunity these days. In that context, I want to mention the open data initiatives led by organizations such as the Open Data Institute (ODI) and the Open Knowledge Foundation (OKF). The first one, ODI, was founded by Tim Berners-Lee. The topic of open data is complex, and I hope to speak about it in future blog posts. You can find some information about open data on the ODI website here. Another thing derived from open data is the Open Definition initiative. These are all new initiatives that are mostly unknown in the enterprise software domain.

What is my conclusion? I think we are at the beginning of a data revolution. To provide better solutions these days, we need to have data available. That includes openness and data access. It also relates to company data stored on company servers and, even more importantly, open data outside of organizations that must be available to everyone. In my view, a common compliance dataset is a perfect example of open data that must be available to enable future development of PLM supply chain solutions. Just my thoughts…

Best, Oleg

image courtesy of Open Knowledge Foundation.

Why PLM should friend Chief Data Officer?

October 23, 2013


The technology space is good at inventing new jobs, titles and responsibilities. Until now, we knew about the CEO, COO, CIO… The last one was a very important person when it came to decisions about enterprise software, PLM included. The dream of every PLM vendor and implementer was to get closer to the CIO to influence company software strategies.

The explosive growth of data in companies has driven the creation of new jobs and new roles. Have you heard about the "data scientist" job? If not, you had better learn about it now. According to HBR, data scientist is becoming the sexiest job of the 21st century. I stumbled on an interesting writeup from the Guidewire blog – Chief Data Officer vs. Chief Scientist Officer. Read the article and draw your own conclusions. What became clear to me is that people are paying more attention to what happens with information in organizations these days. The following passage from the article explains it very well.

It’s good to finally see appropriate attention paid to the power of information across the insurance value chain, particularly in atypical areas and through non-traditional means. Though, there is still an opportunity for data capabilities to mature significantly across the industry.

You can tell me that "insurance" examples may not apply to manufacturing. Actually, I can see it being even more applicable to manufacturing companies. Think about outsourced design and supplier contracts. How many anomalies can you find there? What about strange usage of components, reducing the overall number of component parts, suppliers, ECO processing time and many others? The article brings an interesting list of the responsibilities of a Chief Data Officer. Some of them sound very related to product development processes, product data management and PLM. Here is the list –

Oversees Data Management Office (DMO) and related shared services
Accountable for Data Governance
Defines data standards and policies
Manages standard business taxonomy and data dictionary
Provides common tools and platforms
Responsible for data quality monitoring and management
Drives prioritization, provides budget, and oversees execution for related business and technology initiatives
Oversees data audits and largely supports regulatory compliance requirements

It made me think that people responsible for product lifecycle management and product development processes should actually be very interested in this role and the related opportunities. Since the CDO could be responsible for overall data management services, the role can be closely connected to how product data is managed by multiple applications (PLM included). Topics like data audits and regulation can be even more connected to product development processes.

What is my conclusion? Companies are getting more and more data driven these days. The new roles of data scientists, chief data officers, etc. are just a confirmation that people put a lot of focus on the value data can bring. The role of PLM is traditionally very focused on the integration of data and processes across the organization. This is where a good friendship between PLM and the DMO makes a lot of sense and has the potential to bring value. Just my thoughts…

Best, Oleg

