How cloud PLM will protect CAD engineers from shadow IT and data fragmentation

September 30, 2013


The life of IT departments is complicated these days. What was the IT kingdom 10 years ago is gone forever. IT is losing the battle for the ability to control what users are doing and where they store information. If you are not familiar with the term, you'd better learn it – shadow IT. The term describes IT systems and solutions built and used inside an organization, but without organizational approval. Sometimes shadow IT comes from individual user activities, and sometimes from the department-level activities of a large corporation. The latter has a chance to happen a lot within large engineering organizations.

I’ve been reading an ITProPortal article that examines the growing problem of data fragmentation – Shadow IT: The Struggle to Protect Corporate Information in the Face of Growing Data Fragmentation. The extensive interview with Mimecast’s chief strategy officer, Matthew Ravden, examines the issue, from defining the problem to offering advice on how to deal with it. Here is my favorite and the most important passage from the article:

“Ultimately, the employee is at the heart of this issue; using multiple applications and devices, often without the IT manager’s knowledge. You can understand why they do it; they want to be able to use the same applications and embrace the same ‘sharing’ culture at work that they do in their personal lives. They also sometimes feel forced to use consumer-grade tools because of the restrictions placed on them by IT, including the size of files that can be sent via the corporate email system. Of course, most employees are not conscious of the risk – they just want to use a fast and easy service which will help them get their job done. As well as identifying the potential third-party services used, IT managers need to educate users on the risks involved, in order to ensure corporate policies are respected.”

I find the problem described above very typical for engineering departments and the work manufacturing companies do with subcontractors. There is a growing need to share data. CAD files are big by nature, and it is hard to send them via email. These days, consumer IT services handle large files much better than corporate email servers. Google, Dropbox, and similar companies are taking the leading positions. Corporate portals and Exchange servers are clearly followers in this game.

The problem of data fragmentation makes the situation even more complicated. In addition to the fact that engineers are using consumer tools not approved by IT, the data itself is spreading around and moving into different locations – most of them outside the corporate firewall.

The ITProPortal article provides a few recommendations (admittedly affiliated with the author’s company, Mimecast). In my view, even though these recommendations are specific to a particular tool, I can see them as something that can be applied in a general way to different cloud solutions.

Cloud vendors have a critical role to play in solving the problems of Shadow IT since they are uniquely placed to meet the needs of both IT managers and end users. By creating a safe, intuitive to use and easy to manage environment, cloud vendors can be a major ally to IT managers who want to keep users from straying beyond their official corporate systems.

Cloud services can also provide secure access to all data sources from common applications or from any mobile device, making sure users can get their data where they need it without leaving the safety of the organization’s IT system.

What is my conclusion? The growing power of consumer cloud services is creating a new set of risks for manufacturing companies these days. By providing new and easy options for engineers and other people in an organization to share and exchange data, these services significantly increase the risk of data fragmentation and the establishment of "shadow IT" storage. However, I can see that cloud technologies also have an opportunity to solve these problems by providing consistent, secured tools as well as new monitoring capabilities to IT. Just my thoughts…

Best, Oleg

PLM, Google Knowledge Graph and Future Decision Support

September 27, 2013


Do you remember life without Google? It was only 15 years ago. You should agree, information was less available back in those days. Fast forward – good news! Google made a change to its Web search system. A ReadWrite article says – Now You Can Ask Google Search To Compare, Filter And Play. Another article from TheNextWeb – Google unveils search updates for mobile, new Page Rank algorithm, and Knowledge Graph comparisons.

You probably recall my post from last year – Why PLM needs to learn about Google Knowledge Graph? That was the first publication about GKG, and at the time it wasn’t clear what use Google would make of the Knowledge Graph. Now you can see some examples – type "wine vs beer" or "eiffel tower vs empire state building" into your browser and you can expect some meaningful results about the objects.

I especially liked the Eiffel Tower vs. Empire State Building example. It can give you a perspective on where data exploration and search can go. People are not interested in search results.


People are looking for meaningful information. In that context, GKG is the right way to go – it accumulates knowledge and can provide it in a consumable way.


So, how is this related to PLM data, you may ask? Here is my hunch. People are not interested in how to find assemblies and parts anymore. Data-driven decisions are getting more and more focus. The complexity of design decisions is skyrocketing, and engineers face it every day. Engineers will need decision support. Try to type Vishay DF10M-E3/45 vs. Vishay MB6S-E3/80. Don’t expect Google to help you. These are special types of rectifiers. Google doesn’t know what a Miniature Glass Passivated Single-Phase Bridge Rectifier is. Maybe in a few years? There are other systems that can produce better results, like the one below. However, we still have a long way to go.

Vishay MB6S-E3
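To make the decision-support idea concrete, here is a minimal sketch of the kind of side-by-side part comparison a GKG-style tool could produce. The part records and attribute values below are illustrative placeholders I made up for the example, not real datasheet data.

```python
def compare_parts(a, b):
    """Return a row-per-attribute comparison of two part records (dicts)."""
    attributes = sorted(set(a) | set(b))
    rows = []
    for attr in attributes:
        # "n/a" marks an attribute one of the parts doesn't declare
        rows.append((attr, a.get(attr, "n/a"), b.get(attr, "n/a")))
    return rows

# hypothetical attribute sets for two rectifier-like components
part_a = {"type": "bridge rectifier", "package": "DFM", "pins": 4}
part_b = {"type": "bridge rectifier", "package": "TO-269AA", "pins": 4}

for attr, va, vb in compare_parts(part_a, part_b):
    marker = "==" if va == vb else "!="
    print(f"{attr:10} {marker} {va} | {vb}")
```

The interesting part, of course, is not the table layout but where the attribute values come from – which is exactly the knowledge-accumulation problem GKG is solving for general-purpose data.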

What is my conclusion? It is getting more difficult to locate correct product information in an environment of growing complexity. Engineers are looking for ways to make the right decisions. In most cases, they make these decisions based on their past experience rather than on the right information. This is a place where we should expect some changes in the future. We are just at the beginning. Product data should be collected, analyzed and presented in a way that helps engineers make the right decision. Just my thoughts…

Best, Oleg

Will SaaS and Open API solve BOM Management problem?

September 27, 2013


I like the way BOM discussions usually spark. From my experience, the "BOM management" topic has the ability to ignite a conversation almost immediately. Therefore, I wasn’t too surprised when I got blog attention from Hardi Meybaum, CEO of GrabCAD – the most buzzed-about company in Boston, if you trust VentureFizz. Here is the article – Why single BOM is not the answer for you. Read it and draw your own conclusion. My favorite passage is the one that explains the "BOM problem":

Recent posts on bill of materials (BOM) like Oleg’s ‘Will PLM Manage Enterprise BOM,’ make me [Hardi] remember my first conversation about BOM management. It was when I was in charge of IT in a mid-size manufacturing company. Then, and now, Procurement Specialists said that BOM should live in ERP, Engineering Managers said it should live in PLM, Engineers said that it should live in spreadsheets, and PLM vendors said it should live in PLM. This is where today’s BOM problems start.

According to Hardi, the BOM management problem will disappear because of SaaS and API availability. He promotes open exchange of data between GrabCAD and other SaaS applications, and out-of-the-box integration. It will finally make the right information available to everyone in an organization. From what I can see, GrabCAD believes that companies will move from large integrated product suites to lean and granular applications. Actually, I like the idea very much. Granularity is a very good thing. Two years ago, Chad Jackson addressed granularity on his Lifecycle Insights blog – Point Solutions, Integrated Solutions and the Granularity Value Proposition. This is how Chad explains granularity:

The fundamental idea is that you layer on different solutions that each do something very specific and well. Basically it is the point solution approach but from an ecosystem perspective. It would include something like leaving your workgroup PDM software in place. Layer on top of that a workflow. Then add some social computing solution for collaboration. Then you can add in a project management solution. You get the idea. Leave what you have in place. Add in other point solutions where needed. And integrate them as lightly as you can.

Granularity and a best-of-breed application set usually come with two additional price tags – IT/setup and integration. I can see an interesting new perspective in the way Hardi presents the advantage of a granular solution. It comes together with SaaS, which makes a lot of sense to me. Installation, setup, updates – all these things go away with the SaaS approach. However, I have doubts about integration. Out-of-the-box integration is a tricky part; I have never seen it work well. If you want to create a custom integration, you eventually come down to nuts and bolts. APIs play an important role in the ability of systems to be integrated. I mentioned the importance of Web APIs in PLM last year – Why PLM Needs To Learn Web APIs? Even if APIs are available, integration is an expensive thing. From that standpoint, it doesn’t really matter whether your PLM and ERP systems run as SaaS applications or are installed in your data center. My hunch is that traditional RDBMS technologies allow more options for building hardwired integrations than multi-tenant cloud applications do. BOM implementations are complex and require lots of business rules to be implemented to synchronize multiple bills of materials. This is a pure integration question and can hardly be handled without custom-made work.
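Here is a toy illustration of why multi-BOM synchronization needs business rules rather than just an out-of-the-box connector. The part numbers and the substitution rule are made up for the example; a real integration would pull both BOMs through each system's API and encode many more rules than this one.

```python
def sync_boms(ebom, erp_bom, substitutions):
    """Diff an engineering BOM against an ERP BOM.

    Both BOMs are {part_number: quantity} dicts. `substitutions`
    maps engineering part numbers to their ERP equivalents - one of
    many business rules a real BOM integration has to encode.
    """
    changes = []
    for part, qty in ebom.items():
        erp_part = substitutions.get(part, part)
        erp_qty = erp_bom.get(erp_part)
        if erp_qty is None:
            changes.append(("add", erp_part, qty))   # missing in ERP
        elif erp_qty != qty:
            changes.append(("update", erp_part, qty))  # quantities diverged
    return changes

ebom = {"PN-100": 2, "PN-200": 4}
erp_bom = {"PN-100": 2, "M-200": 3}
print(sync_boms(ebom, erp_bom, {"PN-200": "M-200"}))
# one quantity update on the substituted part
```

Even this tiny sketch already encodes a company-specific rule (part-number substitution). Multiply it by effectivity dates, alternates, phantom assemblies and unit-of-measure conversions, and it becomes clear why such synchronization is custom work regardless of where the systems are hosted.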

What is my conclusion? SaaS is a game changer in terms of how systems can be deployed and maintained. It creates a completely different view of IT spending and TCO. It also creates a different perspective on openness and the importance of APIs. It was easy to hack an ERP or PLM RDBMS with a direct SQL call to solve your integration needs. It is completely different for SaaS apps. You cannot do integrations without custom code or flexible integration frameworks and middleware. In my view, the real BOM management problem is not related to whether your system runs in the cloud or on premise. It is about how to connect multiple BOMs into a single meaningful solution. This task is complex and costs money. The cloud is an excellent foundation for simplifying solution delivery and making information easily accessible. Will it solve BOM management problems across organizational silos? I’m not sure about that. Just my thoughts. What is your take?

Best, Oleg

PLM, BigData and Importance of Information Lifecycle

September 25, 2013

BigData is trending these days. It goes everywhere. Marketing people are in love with the name. It brings such a good taste of "big something". It might be big dollars, or the amount of problems it is supposed to solve. It can potentially be related to a big value proposition. Net-net, the number of people and articles around you referring to the big data opportunity is probably skyrocketing. If you want to read more about big data, navigate to the Wikipedia article – it is a good starting point.

CIMdata, a well-known PLM advisory outfit, recently published an interesting paper about PLM and BigData. Navigate to this link, download the research paper (it requires registration) and have a read. I’d say this is the best reference on the intersection of the PLM and Big Data worlds. Here is what the document is about:

This paper focuses on the intersection of PLM and what has come to be known as “Big Data.” The increasing volume and growth rate of data applicable to PLM is requiring companies to seek new methods to turn that data into actionable intelligence that can enhance business performance. The paper describes methods, including search-based techniques, that show promise to help address this problem.

Search and analytics is one of the ways to dig into the big data problem. Last year, I wrote about why PLM vendors need to dig into Big Data. Here is the link to my post – Will PLM vendors dig into Big Data? I believe BigData can provide huge value to an organization, and unlocking this value is extremely important. However, looking at the BigData hype these days, I get a feeling of wrong priorities and some gaps between the vision of BigData and the reality of PLM implementations.

I’ve been reading an ITBusinessEdge article – Three Reasons Why Life Cycle Management Matters More with Big Data. The main thing I learned from this article – even though big data is going to change a lot, it won’t change some fundamental data management laws. Data lifecycle is one of them. Here is my favorite passage:

With Big Data, which can be unpredictable and come in many different sizes and formats, the process isn’t so easy,” writes Mary Shacklett, president of technology research and market development firm Transworld Data. “Yet if we don’t start thinking about how we are going to manage this incoming mass of unstructured and semi-structured data in our data centers.

It means a lot in the context of PLM systems. This is where I can see the biggest gap between BigData and PLM. It is easy to collect data from multiple sources – that’s what everybody talks about. However, big data needs to be managed together with the other information managed by PLM. Big data goes through a lifecycle of processing, classification, indexing and annotation. How to connect the pieces and relate big data to the PLM system is a significant problem to think about. Engineers and other people in the company probably won’t be interested in accessing the data itself, but in analytics, insight and recommendations.
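The lifecycle steps above can be sketched as a tiny pipeline: incoming records are classified, annotated with their classification, and indexed so PLM-side queries can find them. The record fields and the classification rule here are hypothetical, just to show the shape of the problem.

```python
def classify(record):
    # a naive rule-based classifier; real systems would use richer rules
    return "field_data" if record.get("source") == "sensor" else "document"

def build_index(records):
    """Group records by classification, annotating each one on the way."""
    index = {}
    for rec in records:
        annotated = dict(rec, category=classify(rec))  # annotation step
        index.setdefault(annotated["category"], []).append(annotated)
    return index

incoming = [
    {"id": 1, "source": "sensor", "value": 42},
    {"id": 2, "source": "supplier_portal"},
]
index = build_index(incoming)
print(sorted(index))  # ['document', 'field_data']
```

The point of the sketch is that none of these steps is free: someone has to define the categories, the rules and the index structure before the data becomes consumable by a PLM system.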

What is my conclusion? The value behind big data is huge. It can improve decision making, quality of service, supplier bids and lots of other things. However, it creates huge pressure on IT and the organization in terms of resources, data organization and data infrastructure. PLM systems won’t be able to start with big data overnight. Whoever tells you "now we support big data" is probably too marketing oriented. PLM will have to focus on the data lifecycle to bring realistic big data implementation plans to the organization. Just my thoughts…

Best, Oleg

PLM, Cloud and European Data Protection Reforms

September 24, 2013

Have you had a chance to speak about cloud technology in Europe? If you have, I’m sure you experienced the mixed love-and-hate relationship between European companies and cloud solutions. There is a lot of interest in the cloud, but local laws, corporate policies and regulation make it really complex. Last year I wrote about some trends related to Europe and cloud business development – Will Europe Adopt Cloud PLM? I have seen some interest in cloud market development in Europe over the last 12 months. Nevertheless, the cloud atmosphere in Europe is still very cloudy :). One of the biggest concerns and showstoppers to making enterprise cloud solutions proliferate in Europe is law and regulation around data privacy. It has resulted in quite complicated restrictions on data location, data collection and storage.

Yesterday evening I stumbled on the following GigaOM article – Europe’s love-hate affair with cloud computing; the week in cloud. The short article suggests that the forecast for cloud solutions can change in the near future. European lawmakers are calling for data protection reforms. The key is to create a single data privacy law across European countries. Here is an interesting passage:

According to a statement outlining the proposal, the proposed regulation is: “[T]he Union’s response to fear of surveillance. By adopting the Data Protection Regulation, the Union will equip it itself with a set of rules fit for the 21st century. Rules that will empower the very people whose data fuels the digital economy. Rules that will ensure the digital economy’s growth can be sustained.”

A single data privacy law can change the market situation for cloud PLM solutions in Europe. Think about a typical European manufacturer. Every manufacturing company has lots of connections and relations inside Europe and outside. Questions of data storage, data access policies and other data-related regulation can easily stop any cloud initiative.

What is my conclusion? Europe is an important market for PLM companies. Changes in data privacy regulation can enable faster adoption of cloud PLM technologies in Europe. In my view, it will have different aspects – data centers and IT partnerships, implementation services and more. One of the most important aspects is solution cost. Cloud infrastructure is expensive. Enabling the sharing of this infrastructure across the EU will allow a different perspective on solution and service cost. Just my thoughts…

Best, Oleg

Will PLM Benefit From “SharePoint Death”?

September 23, 2013

PLM has a love-and-hate relationship with SharePoint. Over the last 5-8 years, SharePoint became a symbol of mainstream corporate portals, basic document and content management, and a set of collaboration tools well integrated with the Microsoft Office platform. Microsoft used a very sneaky freemium strategy of bundling the basic version of SharePoint into the Windows Server license, so everybody who bought Windows Server was able to use a basic level of SharePoint. The result – growing market share and an ecosystem of partners and service providers. The SharePoint business became popular.

SharePoint developed an interesting position in the CAD/PLM ecosystem. The original attempt to use SharePoint as a platform for PDM/PLM systems failed. After 10 years, only one company (Siemens PLM) is using SharePoint as a platform for its products – Solid Edge Insight XT (recently renamed Solid Edge SP to emphasize the tight connection with SharePoint). After a big marketing campaign, PTC discontinued Windchill ProductPoint. The common PLM vendor strategy with regard to SharePoint is to announce the availability of an integration with SharePoint tools. Usually, it is provided to confirm compliance with an IT/SharePoint strategy and to acknowledge the significance of the SharePoint install base.

However, the morning sun never lasts a day. SharePoint is lately going through its own lifecycle and changes. I’ve been reading a CMSWire article – Reports of SharePoint’s Death are Greatly Exaggerated. Read the article and form your own opinion. It speaks about the transformation of the SharePoint ecosystem, the growing dominance of the Office brand, and the evolution of SharePoint into an invisible collaborative service platform. Here is an interesting passage:

The concept of “SharePoint for end users” will go away, because end users will interface with SharePoint via Office (365 or no) or mobile apps as much as they do via browser. And speaking of the browser, what you see there can be heavily customized and made responsive. Microsoft itself has made this easier than ever in 2013, and things like device channels and variations barely scratch the surface of what’s possible. If SharePoint provides the services to all those devices … well, it’s basically a platform (again) for admins to maintain and developers to improve, but decidedly not a product aimed at end-user consumers.

Office is a well-known brand name, and Microsoft is clearly trying to fix SharePoint’s problems with usability and the amount of services needed to make SharePoint work for customers. Interestingly enough, I can see this problem as one of the fundamental reasons behind the failure of SharePoint–PLM partnership alliances. PLM resellers and service organizations were in competition with SharePoint and other IT service organizations.

Microsoft’s strategy is to fix SharePoint usability with the Office 365 user experience. In parallel, by shifting toward Office and away from "SharePoint for users", Microsoft demonstrates some weakness and a decreased SharePoint value proposition. The following passage can give you a glimpse of what it looks like. It speaks about the huge amount of SharePoint business built around usability problems and tailoring SharePoint to specific customer-oriented scenarios:

Look, it’s great that there’s so much help out there for end users struggling with SharePoint usability. Large and thriving websites, communities and consultancies have been built around this problem. But does anyone really believe that Microsoft enjoys or appreciates the fact that a billion-dollar business has inspired so much thought and activity around its weaknesses? If they can get people to happily use SharePoint (and more importantly, purchase licenses) without ever consuming it in its native environment, you’d better believe they will.

A similar problem exists in the PLM ecosystem as well. Problems with usability and huge amounts of customization and services – these are attributes of aged mainstream PLM platforms, and not only of PLM. PLM salespeople can confirm they are competing with SharePoint-based offerings and services in many situations.

My hunch is that changes in the SharePoint ecosystem can work well for the PLM business. The trend of converging SharePoint into an invisible collaborative platform behind an Office 365 facade can remove the SharePoint sales focus from enterprise content management and document management. Microsoft will become more focused on collaboration and information sharing for Office tools.

What is my conclusion? PLM struggled to compete with the broad and extensive SharePoint presence. It created significant competition between vendors and confusion among customers. Very often, SharePoint sales confused manufacturing companies about the functionality SharePoint can provide to manage engineering data. Switching the focus to Office 365 and using SharePoint as an Office collaboration platform will bring clarity and improve the potential competitiveness of PLM tools. Just my thoughts…

Best, Oleg

Future CAD Platforms and Google Chrome Native Client

September 20, 2013

Our life is getting more and more web-like. Think about the applications and tools we used in our everyday life 10 years ago and now – you can see how many of them have moved from Windows desktops to web browsers and mobile devices. However, if you are an engineer using a CAD application and/or simulation tool, you are most probably still anchored to your desktop machine. The same can probably be said about photo and video editing applications. The common thing between CAD and photo/video editing is the need for extensive computation and/or graphics resources.

Speaking about photo editing applications, Google is clearly making a leapfrog move in this space. The Google+ photo editing application is getting better every day. Many times in my personal life as a photo hobbyist, I have ended up editing photos in Google+ without reaching for my usual Photoshop tools.

I’ve been reading a TechCrunch article from earlier this week – Google’s Bet On Native Client Brings Chrome And Google+ Photos Closer Together. The article confirms my guess about the Google technologies behind the new Google+ photo editing tools, and it made me think about some potential opportunities in the CAD/PLM space. Here is an interesting passage from the article.

As you’ve probably heard a thousand times now, it’s virtually impossible to build great photo apps that can rival the likes of Photoshop in HTML5. That’s where Native Client comes in. This technology allows developers to execute native code in a sandbox in the browser. It can execute C and C++ code at native speeds and with the ability to, for example, render 2D and 3D graphics, run on multiple threads and access your computer’s memory directly. All of that gives it a massive speed bump over more traditional HTML5 apps.

If you want to learn more about Google Native Client, you can probably start here. The Google Developers website provides a good set of well-organized information with use cases, videos, documents and references. Navigate here to read more.

It is interesting to see the common use cases presented on the Google Developers website. Some of them are very relevant to the CAD/PLM domain – enterprise applications and legacy desktop applications. Another interesting use case is related to existing software components. You may think about geometric modeling kernels as one example of existing components that could run inside Google Native Client. Look at how Google phrases this use case on the developer website:

Existing software components: With its native language support (currently C and C++), Native Client enables you to reuse current software modules in a web app—you don’t need to spend time reinventing and debugging code that’s already proven to work well.

Compiling existing native code for your app helps protect the investment you’ve made in research and development. In addition to the protection offered by Native Client compile-time restrictions, users benefit from the security offered by its runtime validator. The validator decodes modules and limits the instructions that can run in the browser, and the sandboxed environment proxies system calls.

Let me speculate a bit here – the recent announcement by Siemens PLM about licensing Parasolid components to Belmont Technology, which is developing a cloud CAD system, can provide a potential use case. So, maybe the future cloud CAD of Jon Hirschtick will use Google Native Client… who knows?

The following video provides a short summary of how Google Native Client works.

What is my conclusion? The web is the future platform for everything. Engineering and manufacturing applications are no exception to this rule. However, it will not happen overnight. Companies have made significant investments in existing technologies and products. How do we move from today’s mostly desktop CAD to future cloud design platforms? This is a good question to ask CAD technologists, industry pundits and internet developers. Google Chrome Native Client provides an interesting technology set to consider. Today, the Chrome Native Client apps directory contains only games. But who knows what tomorrow will bring? Just my thoughts…

Best, Oleg

