PLM security: data and classification complexity

July 30, 2014


Security. It is hard to overestimate the importance of the topic. Information is one of the biggest assets a company has. Data and information are the lifeblood of every engineering and manufacturing organization and a key element of company IP. It combines 3D models, Bills of Materials, manufacturing instructions, supplier quotes, regulatory data and zillions of other pieces of information.

The Forrester TechRadar™: Data Security, Q2 2014 publication caught my attention. Navigate to the following link to download the publication. The number of data security points is huge and overwhelming, and there are many different aspects of security. One of the interesting facts I learned from the report is the growing focus on data security: data security accounted for 17% of security budgets in 2013, and Forrester predicts a 5% increase in 2014.


The report made me think about a specific characteristic of PLM solutions – data and information classification. A distinctive characteristic of every PLM system is a high level of data complexity, data richness and dependencies. Information about products, materials, BOMs, suppliers, etc. is significantly intertwined. We can speak a lot about PLM system security and data access layers. Simply put, it depends on many specifics of the product, the company, its business processes and vendor relationships. As company business becomes global, the security model and data access get very complicated. Here is an interesting passage from the report related to data classification:

Data classification tools parse structured and unstructured data, looking for sensitive data that matches predefined patterns or custom policies established by customers. Classifiers generally look for data that can be matched deterministically, such as credit card numbers or social security numbers. Some data classifiers also use fuzzy logic, syntactic analysis, and other techniques to classify less-structured information. Many data classification tools also support user-driven classification so that users can add, change, or confirm classification based on their knowledge and the context of a given activity. Automated classification works well when you’re trying to classify specific content such as credit card numbers but becomes more challenging for other types of content.
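The deterministic matching described above can be sketched in a few lines. The two regex patterns below are simplified assumptions for illustration, not rules any commercial classifier actually ships:

```python
import re

# Minimal sketch of a deterministic data classifier; the patterns are
# rough stand-ins for real detection rules (no fuzzy logic here).
PATTERNS = {
    # 16 digits with optional spaces or dashes, roughly card-shaped
    "credit_card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
    # US social security number in the common 3-2-4 format
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify(text):
    """Return the set of sensitive-data labels detected in a text blob."""
    return {label for label, rx in PATTERNS.items() if rx.search(text)}

labels = classify("Order note: card 4111 1111 1111 1111, SSN 078-05-1120")
# → {"credit_card", "ssn"}
```

The hard part the report points at is everything this sketch ignores: less-structured content, user-driven confirmation and context-dependent policies.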

In my view, PLM content is one of the best examples of data that can hardly be classified and secured. It takes a long time to specify which pieces of information should be protected and how. A complex role-based security model, sensitive IP, regulation, business relations and many other factors come into play in building a classification model to secure PLM data.

What is my conclusion? I can see a growing concern about securing data access in complex IT solutions, and PLM is one of them. Protecting complex content is not simple – in many situations out-of-the-box solutions won’t work. PLM architects and developers should consider how to provide easier ways to classify and secure product information while staying compliant with multiple business and technical requirements. An important topic for the coming years. Just my thoughts…

Best, Oleg

PLM implementations: nuts and bolts of data silos

July 22, 2014


Data is an essential part of every PLM implementation. It all starts from data – design, engineering, manufacturing, supply chain, support, etc. Enterprise systems are fragmented and represent individual silos of the enterprise organization. Managing product data located in multiple enterprise data silos is a challenge for every PLM implementation.

To "demolish enterprise data silos" is a popular topic in PLM strategies and deployments. The idea of having a single point of truth is always on the minds of PLM developers. Some of my latest notes about that are here – PLM One Big Silo.

The MCADCafe article – Developing Better Products is a “Piece of Cake” by Scott Reedy – also speaks about how a PLM implementation can help to aggregate product development information scattered across multiple places into a single PLM system. The picture from the article presents the problem:


The following passage is the most important, in my view:

Without a PLM system, companies often end up with disconnected silos of information. These silos inhibit the ability to control the entire product record and employees waste unnecessary time searching for the correct revision of the product design. As companies outsource design or manufacturing, it becomes even harder to ensure the right configuration of the product is leveraged by external partners.

Whether your company makes medical devices, industrial equipment, laptops, cell phones or other consumer products – PLM provides a secure, centralized database to manage the entire product record into a “Single Record of the Truth”… With a centralized product record, it is easy to propose and submit changes to the product design, track quality issues and collaborate with your internal teams and supply-chain partners.

The strategy of a "single record of truth" is a centerpiece of every PLM implementation. However, here is the thing… if you look at the picture above you can see some key enterprise systems – ERP, CRM, MES, project and program management, etc. A PLM system can contain scattered data about product design, CAD files, part data, ECO records and Bills of Materials. However, some of the data will still remain in other systems, and some of the data gets duplicated. This is what happens in the real world.

It made me think about three important data architecture aspects of every PLM implementation: data management, data reporting and data consistency.

The data management layer focuses on which system controls the data and provides the master source of information. Data cannot be mastered in multiple places. An implementation needs to organize a logical split of information as well as the ability to control the "data truth". This is the most fundamental part of the data architecture.

Data reporting focuses on how PLM can extract data from multiple sources and present it in a seamless way to the end user. Imagine you need to provide an "open ECO" report. The information can reside in PLM, ERP and maybe some other sources. Getting the right data at the right moment in time can be another problem to solve.
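A cross-system "open ECO" report can be sketched as a merge over per-system queries. The source functions and record fields below are entirely invented for the example:

```python
# Hypothetical per-system fetchers; in a real deployment these would be
# API calls into the PLM and ERP systems.
def fetch_plm_ecos():
    return [{"id": "ECO-100", "status": "open", "source": "PLM"},
            {"id": "ECO-101", "status": "closed", "source": "PLM"}]

def fetch_erp_ecos():
    return [{"id": "ECO-102", "status": "open", "source": "ERP"}]

def open_eco_report(*sources):
    """Merge ECO records from all sources and keep only the open ones."""
    report = []
    for fetch in sources:
        report.extend(r for r in fetch() if r["status"] == "open")
    return sorted(report, key=lambda r: r["id"])

report = open_eco_report(fetch_plm_ecos, fetch_erp_ecos)
# → two open ECOs, one from each system
```

The merge itself is trivial; the real difficulty is the point made above – guaranteeing each source returns data that is current at the moment of the query.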

Last but not least – data consistency. When data is located in multiple places, the system relies on so-called "eventual consistency" of information. A system of events and related transactions keeps data in sync. This is not a trivial process, but many systems operate this way. What is important is to have a coordinated data flow between systems supporting eventual consistency, together with data management and reporting tools.
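The event-driven sync behind eventual consistency can be shown with a toy master/replica pair. Everything here (the bus, the systems, the event shape) is a simplified assumption:

```python
from collections import deque

# Minimal sketch of eventual consistency: changes to the master system
# are published as events and later replayed into a replica.
class EventBus:
    def __init__(self):
        self.queue = deque()

    def publish(self, event):
        self.queue.append(event)

    def drain(self, handler):
        # Deliver pending events in order; until this runs, the replica
        # is stale -- that window is the "eventual" in the name.
        while self.queue:
            handler(self.queue.popleft())

master, replica = {}, {}
bus = EventBus()

def update_master(key, value):
    master[key] = value
    bus.publish(("set", key, value))  # record the change as an event

def apply_to_replica(event):
    op, key, value = event
    if op == "set":
        replica[key] = value

update_master("part-1", {"rev": "B"})
bus.drain(apply_to_replica)  # after draining, the systems converge
```

Real implementations add ordering guarantees, retries and conflict resolution, but the shape – change, event, replay – is the same.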

What is my conclusion? To demolish silos and manage a single point of truth is a very good and important strategic message. However, when it comes to the nuts and bolts of implementation, an appropriate data architecture must be in place to ensure you have the right data at the right time. Many PLM implementations underestimate the complexity of data architecture. It leaves them with marketing slogans, burned budgets and wrong data. Just my thoughts…

Best, Oleg

Picture credit: MCADCafe article.

Will PLM Vendors Jump into Microsoft Cloud Window in Europe?

April 11, 2014


Cloud is raising lots of controversy in Europe. While manufacturing companies in the U.S. are generally more open towards new tech, their European counterparts are much more conservative. Many of my industry colleagues in Germany, France, Switzerland and other EU countries can probably confirm that. Europe is coming to cloud systems, but much more slowly. I’ve been posting about cloud implications and constraints in Europe. Catch up on my thoughts here – Will Europe adopt cloud PLM? and here – PLM cloud and European data protection reforms. The main cloud concerns raised by European customers are data, privacy and country-specific regulation. With companies located in different places in the EU, it can be a challenge.

Earlier today, I heard some good news about cloud proliferation in Europe coming from Microsoft. The TechCrunch article – Microsoft’s Enterprise Cloud Services Get A Privacy Thumbs Up From Europe’s Data Protection Authorities – reports that Microsoft’s enterprise cloud services meet the data privacy standards of several European countries. Here is a passage that sheds some light on the details and what it means:

But today comes a piece of good news for Redmond: the data protection authorities (DPAs) of all 28 European member states have decided that Microsoft’s enterprise cloud services meet its standards for privacy. This makes Microsoft Azure, Office 365, Microsoft Dynamics CRM and Windows Intune the first services to get such approval. The privacy decision was made by the “Article 29 Data Protection Working Party,” which notes that this will mean that Microsoft will not have to seek approval of individual DPAs on enterprise cloud contracts. In its letter to Microsoft (embedded below), chair Isabelle Falque-Pierrotin writes, “The MS Agreement, as it will be modified by Microsoft, will be in line with Standard Contractual Clause 2010/87/EU… In practice, this will reduce the number of national authorizations required to allow the international transfer of data (depending on the national legislation).”

The majority of PDM / PLM providers are friendly with the Microsoft tech stack. Some of them rely completely on MS SQL Server and other Microsoft technologies, and most of them support SharePoint. Now these PLM vendors have an additional incentive to stay with Microsoft technologies for the cloud. It can also be good news for manufacturing companies that have already deployed PDM/PLM solutions on top of Microsoft technologies and developed custom solutions.

What is my conclusion? The technological landscape these days is very dynamic. The time when one platform worked for everybody is over. In light of technological disruption and future challenges, tech giants will use different strategies to stay relevant for customers. Will European cloud regulation keep PDM/PLM players with MS Azure and other Microsoft technologies rather than alternative cloud stacks? How long will it take other players to reach the same level of compliance? These are good questions to ask vendors and service providers. Just my thoughts…

Best, Oleg

How can cloud PLM reuse on-premise enterprise data?

April 7, 2014


The word "cloud" is becoming an almost redundant qualifier for every technology we develop. I can hardly imagine anything these days that we develop without the cloud in mind. This is absolutely true about PLM. Nowadays, it is all about how to make cloud technologies work for you and not against you.

For cloud PLM, the question of secure data usage is one of the most critical topics – especially if you think about your existing large enterprise customers. These large companies started PLM adoption many years ago and developed large data assets and custom applications. For them, data is one of the most important elements that can enable the use of cloud PLM.

The Networkworld article How Boeing is using the cloud caught my attention this morning. The writeup quotes Boeing chief cloud strategist David Nelson and speaks about a very interesting approach Boeing is using to deploy and use on-premise data on a public cloud. Here is the passage that outlines the approach:

Nelson first described an application the company has developed that tracks all of the flight paths that planes take around the world. Boeing’s sales staff uses it to help sell aircraft showing how a newer, faster one could improve operations. The app incorporates both historical and real-time data, which means there are some heavy workloads. “There’s lots of detail and analysis,” he says. It takes a “boatload” of processing power to collect the data, analyze it, render it and put it into a presentable fashion.

The application started years ago by running on five laptop computers that were synced together. They got so hot running the application that measures needed to be taken to keep them cool, Nelson said. Then Nelson helped migrate the application to the cloud, but doing so took approval from internal security, legal and technology teams.

In order to protect proprietary Boeing data the company uses a process called “shred and scatter.” Using software supported by a New Zealand firm, GreenButton, Boeing takes the data it plans to put in the cloud and breaks it up into the equivalent of what Nelson called puzzle pieces. Those pieces are then encrypted and sent to Microsoft Azure’s cloud. There it is stored and processed in the cloud, but for anything actionable to be gleaned from the data, it has to be reassembled behind Boeing’s firewall.
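The "shred and scatter" idea can be sketched in miniature: break the data into pieces, obfuscate each piece, scatter the pieces, and reassemble behind the firewall. This is my own toy illustration, not Boeing's or GreenButton's process; XOR with a random pad stands in for real encryption.

```python
import os

def xor(data, pad):
    """XOR two equal-length byte strings (placeholder for encryption)."""
    return bytes(a ^ b for a, b in zip(data, pad))

def shred(data, piece_size=4):
    """Split data into obfuscated pieces keyed by their offset."""
    pieces = {}
    for i in range(0, len(data), piece_size):
        chunk = data[i:i + piece_size]
        pad = os.urandom(len(chunk))        # per-piece key stays on-premise
        pieces[i] = (xor(chunk, pad), pad)  # obfuscated piece is scattered
    return pieces

def reassemble(pieces):
    """Rebuild the original data from the scattered pieces, in order."""
    return b"".join(xor(c, pad) for _, (c, pad) in sorted(pieces.items()))

restored = reassemble(shred(b"proprietary flight-path data"))
```

The key property is the one Nelson describes: no single scattered piece is actionable on its own; only reassembly behind the firewall recovers the data.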

It made me think about one of the most critical things that will define the future development and success of cloud PLM technologies and products – data connectivity and on-premise/cloud data sync. Here is my take on this challenge. It is easy to deploy and start using cloud PLM these days. However, a PLM system without customer data is not very helpful. Yes, you can manage processes and new projects. However, let’s state the truth – you need access to legacy data to fully operate your PLM software at the enterprise level. Manufacturing companies are very sensitive about their data assets. So developing "shred and scatter"-like data sync approaches can be an interesting path to unlock cloud PLM for large enterprise customers.

What is my conclusion? I see cloud data sync as one of the most important cloud PLM challenges these days. Retrieving data from an on-premise location in a meaningful way and bringing it to the cloud in a secure manner is a showstopper for broad large-enterprise adoption. By solving this problem, cloud PLM vendors will open the gate for large enterprises to leverage the public cloud. It is a challenge for the top enterprise PLM vendors today and clearly an entrance barrier for startup companies and newcomers in the PLM world. Just my thoughts…

Best, Oleg

What metadata means in modern PDM/PLM systems

February 26, 2014


Metadata is "data about data". If you have been in the PDM industry long enough, you probably remember how early EDM/PDM software defined its role as managing "data about CAD files" (metadata). However, that was a long time ago. The Wikipedia article defines two types of metadata – structural and descriptive. Here is a quote from the article:

The term is ambiguous, as it is used for two fundamentally different concepts (types). Structural metadata is about the design and specification of data structures and is more properly called "data about the containers of data"; descriptive metadata, on the other hand, is about individual instances of application data, the data content.

In my view, CAD/PDM/PLM uses both types. Since a design is very structured and contains lots of rich semantic relations, the metadata about CAD files stored in a PDM system is structural. At the same time, descriptive metadata such as file attributes and information about people, projects and organization can be applied to individual instances of CAD data (files) as well.
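To make the split concrete, here is an illustrative pair of records for a single CAD file; all field names and values are invented for the example:

```python
# Structural metadata: "data about the containers of data" -- how the
# design itself is organized (assembly structure, component relations).
structural_metadata = {
    "assembly": "bike.asm",
    "components": ["frame.prt", "wheel.prt", "wheel.prt"],
}

# Descriptive metadata: data about this individual instance of the file
# (who made it, for which project, at which revision).
descriptive_metadata = {
    "author": "olegs",
    "project": "City Bike 2014",
    "revision": "C",
}
```

A PDM system typically keeps both, which is why the single word "metadata" hides two quite different jobs.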

Since the early EDM/PDM days, a lot has changed in the definition and usage of the word metadata, and some of the changes are very confusing. The traditional use and definition of files (for example, in the context of CAD files) is changing. Sometimes we want to keep "file" as a well-known abstraction, but the underlying meaning is completely different and points more to "data" or "design" rather than actual files. Also, the introduction of web-based systems is changing the physical use of files. The usage of a file accessed via a mobile application and located in Dropbox is completely different. In many scenarios you will never get access to the physical files.

The DBMS2 article Confusion about Metadata speaks about some additional aspects of metadata management that are getting more relevant these days. It includes data about mobile device usage (telephone metadata) and document data. Document data is getting more structured these days and often cannot be distinguished from structured RDBMS data. Here is an interesting passage that describes the transformation of database and document-based data.

[data about data structure] has a counter-intuitive consequence — all common terminology notwithstanding, relational data is less structured than document data. Reasons include: Relational databases usually just hold strings — or maybe numbers — with structural information being held elsewhere. Some document databases store structural metadata right with the document data itself. Some document databases store data in the form of (name, value) pairs. In some cases additional structure is imposed by naming conventions. Actual text documents carry the structure imposed by grammar and syntax.

Modern data management systems, and especially NoSQL databases such as document and key-value stores, can introduce new types of metadata or data. IoT (the Internet of Things) brings another level of complexity to data management. I can see many others to come.

What is my conclusion? I think the term metadata is getting outdated, at least in the scope of design and engineering. The original usage in EDM/PDM systems – managing metadata about CAD files – is not relevant anymore. Design data originally stored in CAD files is becoming more transparent and connected to the world outside of design. The whole data paradigm is changing. Just my thoughts…

Best, Oleg

Why do PLM vendors need to hire data scientists?

December 4, 2013


The importance of data is growing tremendously. The web, social networks and mobile started this trend just a few years ago. However, these days companies are starting to see that without a deep understanding of the data about their activities, the future of the company business is uncertain. For manufacturing companies, this affects fundamental business processes and decisions related to product portfolios, manufacturing and supply chain.

It sounds like PLM vendors are potentially a best fit for this job. PLM portfolios are getting broader and cover lots of applications, modules and experience related to the optimization of business activities. In one of my earlier blogs this month, I talked about the new role of Chief Data Officer in companies. Navigate here to read it and draw your own opinion. However, making this job successful is mission impossible without a deep understanding of company data on both sides – company and vendors / implementers.

A few days ago, I was reading the InformationWeek article – Data Scientist: The Sexiest Job No One Has. The idea of the data scientist job is very interesting if you apply it beyond just storing data on file servers. Think about an advanced data analyst job focusing on how a company can leverage its data assets in the best way. The amount of data companies are generating is doubling every few months. Applying the right technology and combining it with human skills can be an interesting opportunity. Pay attention to the following passage:

The role of data scientist has changed dramatically. Data used to reside on the fringes of the operation. It was usually important but seldom vital — a dreary task reserved for the geekiest of the geeks. It supported every function but never seemed to lead them. Even the executives who respected it never quite absorbed it.

But not anymore. Today the role set aside for collating, sorting, and annotating data — a secondary responsibility in most environments — has moved to the forefront. In industries ranging from marketing to financial services to telecommunications, the data scientists of today don’t just crunch numbers. They view the universe as one large data set, and they decipher relationships in that mass of information. The analytics they develop are then used to guide decisions, predict outcomes, and develop a quantitative ROI.

So, who can become a data scientist in a manufacturing company? Actually, this major is still not defined in American colleges. Anybody with a good skillset in math, computer science and manufacturing domain knowledge can think about this work. I can even see it as an opportunity for retired CAD and PLM IT managers who spent their careers installing on-premise PLM software, once the software moves to the cloud environment.

What is my conclusion? In the past, installation and configuration skills were among the most important in the PDM/PLM business. The time vendors spent on system implementation was very significant. The PLM cloud switch is going to create a new trend – understanding of company data and business processes will become the #1 skill. So, PLM vendors had better start thinking about a new job description – people capable of understanding how to crunch manufacturing data to create value for customers. Just my thoughts…

Best, Oleg

How CAD/PLM can capture design and engineering intent

November 8, 2013


It was a big Twitter day. The Twitter IPO generated an overflow of news, articles and memorable stories. For me, Twitter became a part of my working ecosystem – the place I use to capture news, exchange information and communicate with people. If you are on Twitter, try Vizify to visualize your Twitter account. I did it here. The most insightful finding for me was the fact that I tweet 24 hours a day… (well, I don’t know how Vizify deals with my time zone changes). It made me think about what impact a Twitter-like ecosystem could have on engineers and designers. It came as a continuation of my thoughts about the failure of Social PLM – Why Social PLM 1.0 failed and What PLM can learn from public social data?

I’ve been reading an article and interview with Biz Stone, Twitter co-founder and entrepreneur – Be emotionally invested. It is a good story. Read it, especially if you are involved in startup activity. One of the interesting pieces that caught my attention was a story about the Google working environment. Here is the passage:

“I used to just walk around. I don’t know if I was supposed to, but I’d just open doors and see what people were doing.” One led to a guy surrounded by DVRs. Stone asked what he was doing. “I’m recording everything being transmitted on TV all over the world.” Another led to “a sea of people operating illuminated foot-pedal scanning devices. “We’re scanning every book ever published.”

Another interesting article that caught my attention was about a curious behavior – deleted tweets. Navigate to read – Why do people delete their tweets? University of Edinburgh researchers have been looking into the motives behind deleted Twitter missives. You can read more about this study here. The funny part of this mechanism is that it confirms the old idiom – a word spoken is past recalling. Here is a passage explaining the research and how it works.

Right now there’s no way to tell whether you’ll be proud of your rousing 140 character defense of James Franco in a few years, or deeply, deeply ashamed. But hiding the evidence isn’t hard. Deleting a tweet is not a complicated process. If you don’t like what you wrote, you can trash it in a few clicks. And there are services like Tweet Delete that help you mass-delete older tweets.

These two examples – capturing information streams from global and personal perspectives – made me think about how we could potentially capture engineering activities and discover the design intent behind decision making, similar to the techniques used to identify deleted tweets and other Twitter user behaviors. The challenge of the CAD/PLM environment compared to Twitter is obviously security and open APIs. It is hard to capture information from design and engineering systems. In most cases, the information is secured and access is restricted.
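As a hypothetical sketch of what such capture could look like, here is a minimal design-activity log that retains deletions rather than erasing them – the analogue of studying deleted tweets. The event fields and actions are invented:

```python
from datetime import datetime, timezone

# A running log of design-activity events; deletions stay in the log,
# so intent remains recoverable for later analysis.
log = []

def record(user, action, target):
    log.append({"time": datetime.now(timezone.utc),
                "user": user, "action": action, "target": target})

record("alice", "create", "bracket.prt")
record("alice", "modify", "bracket.prt")
record("alice", "delete", "bracket.prt")

# Even after the part is gone, the history tells us it existed,
# was reworked, and was abandoned -- raw material for intent mining.
deleted = [e["target"] for e in log if e["action"] == "delete"]
```

The interesting analysis starts where this sketch stops: correlating sequences of create/modify/delete events across users to infer why a design direction was dropped.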

What is my conclusion? There is huge potential in analyzing design and engineering activity by capturing information about people’s behavior. My hunch is that it can become one of the areas CAD/PLM companies and startups might crack to discover the future potential of design optimization and decision making. Just my thoughts…

Best, Oleg

What is the future of PLM data analytics?

October 31, 2013


The amount of data we produce is skyrocketing these days. The social web, mobile devices, the Internet of Things – these are only a few examples of data sources that are massively changing our lives. The situation in the business space is not much different. Companies are more and more involved in connected business. Design, supply chain, manufacturing – all activities produce a significant amount of digital information every minute. In design, the complexity of products is growing significantly. Personalization, customer configurations and many other factors create a significant data flow. Simulation is another space that can potentially bring a significant amount of data. I was watching the presentation "Workspace 2020" made by Dion Hinchcliffe last week at the forum "Transform the way work gets done". Navigate to the SlideShare slides and take a look. One of the numbers struck me – the speed of data growth. Now (in 2013) we double data every 2 years. By 2020, we are going to double the data every 3 months.


The massive amount of data raises the question of how engineering, design and manufacturing systems can handle this data and produce meaningful insight and decision support for engineers and other people involved in the development process. The question of data analytics came to my mind. In the past, data analytics was usually associated with a long, complicated and expensive process involving IT, resources, time and programming. The cloud and the new computing ecosystem are changing this space. I was reading the announcement made by Tableau (an outfit focused on analytic dashboards and tools) – Tableau Software partners with Google to Visualize Big Data at Gartner IT Symposium.

The partnership mixes Tableau’s analytics with the Google Cloud Platform. The technology was presented at the Gartner conference in Orlando recently. Here is an interesting passage explaining what Tableau did with Google:

“Tableau and Google created a series of dashboards to visualize enormous volumes of real-time sensory data gathered at Google I/O 2013, Google’s developers’ conference. Data measuring multiple environmental variables, such as room temperature and volume, was analyzed in Tableau and presented to attendees at the Gartner event. With Tableau’s visual analytics, Gartner attendees could see that from the data created, I/O conference managers could adjust the experience and gain insights in real time, like re-routing air-conditioning to optimize power and cooling when rooms got too warm.”

I found the following video interesting – it explains how easily you can build some analytics with Tableau and Google BigQuery.

It made me think about the future potential of analytics we can bring into the design and engineering process by analyzing huge amounts of data – simulation, customer, operational. Just think about combining data collected from products in the field with simulation analyses that can be used to improve design decisions. Does it sound crazy and futuristic? Yes, I think so. However, there are many other things we considered crazy just a few years ago that have become reality now.
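As a toy illustration of that mix, the sketch below flags components whose field telemetry diverges from what simulation predicted. The numbers, component names and 5% threshold are all invented for the example:

```python
from statistics import mean

# Invented field telemetry (observed temperatures per component) and
# invented simulated maximums from the design-stage analysis.
field_temps = {"bearing": [81.0, 84.5, 88.0], "motor": [60.2, 61.0]}
simulated_max = {"bearing": 80.0, "motor": 75.0}

def flag_divergence(field, simulated, tolerance=1.05):
    """Flag components whose mean observed value exceeds the simulated
    maximum by more than the tolerance factor (5% by default)."""
    return [c for c, values in field.items()
            if mean(values) > simulated[c] * tolerance]

suspects = flag_divergence(field_temps, simulated_max)
# → ["bearing"]: mean 84.5 exceeds 80.0 * 1.05 = 84.0
```

Trivial arithmetic, but the loop it implies – field data feeding back into design assumptions – is exactly the opportunity described above.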

What is my conclusion? Data analytics is one of the fields with the potential to improve the design and engineering process by analyzing significant amounts of data. Think about leveraging cloud infrastructure, network and graph databases combined with visual and intuitive analytic tools. It is not where we are today. This is how the future will look. Just my thoughts…

Best, Oleg

What PLM can learn from public social conversation?

October 16, 2013


Social PLM is not cool any more. The article Why is Social PLM DOA? from PLMJim popped up in my Twitter stream earlier today and drew my attention to the social PLM topic again. In my view, it is getting nasty this time. We loved the idea of social communication and how these tools could improve collaboration. However, it all went wrong. Here is the passage Jim uses to explain it:

There have been many articles on the value that social tools can bring to your business. However, the uptake of social tools within Engineering organizations in the guise of social PLM has been very low; possibly non-existent. Why is this the case? Is there no value in Engineering for social tools, or is it just hard to exploit these tools in the product development environment? There is clearly a need for more social collaboration during product design, so it would stand to reason that these social tools would have some value. As I have introduced many engineers to Social PLM in my PLM Certificate Education classes, I have often wondered about the lack of enthusiasm for these kinds of tools.

So, after all the hype, the solution is not there and demand is near zero. I’ve been thinking about the social topic and PLM for quite some time and I agree with Jim’s point – maybe not in such a disruptive form. I called it Why social PLM 1.0 failed? at the beginning of the year.

However, here is the deal. The more I think about social, the more I’m convinced that PLM vendors and startups in the “social PLM” domain took a wrong approach by trying to convince everybody that “social collaboration” would provide a silver bullet to improve communication between people. In my view, that is totally wrong. People are locked in silos and not interested in getting out of them. This is how organizations work, and the best communication tools cannot change this trend for the moment.

So, what is the potential future leverage of social tools in engineering and manufacturing? In my view, social data is a potential Klondike for manufacturing companies. It is a place where manufacturing companies can find ideas about customer demands, future product improvements and existing product failures.
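A first, naive pass at mining that Klondike is just a keyword scan over public posts. The posts and keyword list below are made up for the sketch; real signal extraction would need far more than keyword matching:

```python
# Hypothetical keyword scan for product-failure signals in public posts.
FAILURE_KEYWORDS = {"fire", "broken", "recall", "defect"}

posts = [
    "Love my new bike, best purchase this year",
    "My charger started a small fire yesterday, anyone else?",
    "Third broken spoke this month, is there a recall coming?",
]

def failure_signals(posts, keywords=FAILURE_KEYWORDS):
    """Return posts mentioning at least one failure-related keyword."""
    results = []
    for p in posts:
        tokens = set(p.lower().replace(",", " ").replace("?", " ").split())
        if keywords & tokens:
            results.append(p)
    return results

signals = failure_signals(posts)
# → the two posts mentioning "fire" and "broken"/"recall"
```

The value is not in the matching itself but in routing such signals to the people who own the product record – which is where PLM could come in.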

I’ve been reading the makeuseof article Facebook usage is changing, so which online activities are growing? The article brings an interesting perspective on what is happening with social tools. It speaks about the variety of tools people use, sometimes anonymously, to publish social information – tools like Tumblr, Instagram, WhatsApp and Twitter. Some of them are well known and some are not very popular in the professional social space.

Manufacturing companies are sensitive to social activities these days. Look at how Tesla CEO Elon Musk handled the Tesla Model S fire very recently. It gives you a sense of the potential value – and danger – manufacturing companies can experience by missing this type of sensitive social information.


What is my conclusion? I think Social PLM 1.0 was a nice try and… a failure. It was a good lesson in how dangerous it is to mimic something buzzy and hyped without focusing on the value for individuals and companies. I’m expecting to see a new Social PLM 2.0 coming soon with a new agenda, new ideas and lessons learned. Social data has huge potential. Failing to leverage this potential would be a huge mistake by PLM vendors. Just my thoughts…

Best, Oleg

Will PLM Data Size Reach Yottabytes?

October 14, 2013


Everybody speaks today about big data. It is probably one of the most overhyped and confusing terms. It goes everywhere and means different things depending on whom you are talking to. It can be data gathered from mobile devices, traffic data, social media and social networking activity data. The expectation is that the size of big data will go through the roof. Read the Forbes article Extreme Big Data: Beyond Zettabytes And Yottabytes. The main point of the article – we produce data faster than we can invent names for it. Here is the scale we are more or less familiar with – TB terabyte, PB petabyte, EB exabyte, ZB zettabyte, YB yottabyte…
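For concreteness, each step on that ladder is a factor of 1,000, which a few lines make tangible:

```python
# The prefix ladder in numbers: terabyte = 10**12 bytes, and every
# subsequent prefix multiplies by 1,000.
PREFIXES = ["TB", "PB", "EB", "ZB", "YB"]
BYTES = {p: 10 ** (12 + 3 * i) for i, p in enumerate(PREFIXES)}

# How many ordinary 1 TB USB drives would one yottabyte fill?
drives_per_yottabyte = BYTES["YB"] // BYTES["TB"]  # 10**12, a trillion
```

A trillion 1 TB drives per yottabyte – a useful mental yardstick for the comparisons that follow.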

However, the article also brings an interesting lingo of data sizes. Here are some examples: Hellabytes (a hell of a lot of bytes), Ninabytes, Tenabytes, etc. Wikipedia provides a different option to extend the prefix system – zetta, yotta, xona, weka, vunda, uda, treda, sorta, rinta, quexa, pepta, ocha, nena, minga, luma, … Another interesting comparison comes from an itknowledgeexchange article. Navigate here to read more. Here is my favorite passage – the last comparison to Facebook is the most impressive.

Beyond what-do-we-call-it, we also have the obligatory how-to-put-it-in-terms-we-puny-humans-can-understand discussion, aka the Flurry of Analogies that came up when IBM announced a 120-petabyte hard drive a year ago. Depending on where you read about it, that drive was: 2.4 million Blu-ray disks; 24 million HD movies; 24 billion MP3s; 6,000 Libraries of Congress (a standard unit of data measure); Almost as much data as Google processes every week; Or, four Facebooks.

The Forbes article made me think about the sizes of PLM data, engineering data and design data. It is not unusual to speak about CAD data and/or design data as something very big. Talk to any engineering IT manager and he will tell you about oversized CAD file libraries. Large enterprise companies (especially in regulated industries) are concerned about how to store data for 40-50 years: what format to use, how much space it will take and how it will remain accessible. At the same time, I’ve seen complete libraries of CAD components, together with all the design data, coming from mid-size companies backed up on a simple 1TB USB drive. I believe software like simulation can produce lots of data, but this data today is not controlled and simply gets lost on desktops. One of the most popular requirements from engineers about PDM was the ability to delete old revisions. PLM repositories for items and Bills of Materials can reach a certain size, but I can hardly see how they compete with Google and Facebook media libraries. At the same time, engineering is just beginning to explore the richness of online data and the Internet of Things. So, the size of engineering repositories will only grow.

What is my conclusion? Compared to Google, Twitter and Facebook scale, the majority of engineering repositories today are of modest size. After all, even very large CAD files can hardly compete with the photo and video streams uploaded by a billion people on social networks, and the tracking data captured from mobile devices outweighs every possible Engineering Change Order (ECO) record. However, engineering data has the potential to become big. Increased interest in simulation and analysis, as well as design options, can bump the size of engineering data significantly. Another potential source of information is the increased ability to capture customer interests and requirements, as well as product behavior, online. Just my thoughts. So, how fast will PLM grow to yottabytes? What is your take?

Best, Oleg

