PLM and Big Data Driven Product Design

September 25, 2014

[Image: PLM data-driven design and big data]

One of the most interesting trends to watch these days is big data. It started a few years ago, and I can see different dynamics in the usage and value proposition of big data. It certainly started as a technology that revolutionized the way we can capture and process large amounts of data. This is where everybody got into Hadoop. However, the ability to process data is just the beginning. How can we get value out of this data? How can we use it efficiently in decision processes? These are the most important questions on the table for data scientists, data architects, IT managers and software vendors.

I have touched on big data topics on my blog several times before. In one of my earlier posts from two years ago – Will PLM vendors dig into Big Data? – I discussed the big data opportunity and adoption rates in different industries. However, in many situations, I’ve seen vendors approach big data in too broad and abstract a way. It sounds like “just collect data and see the magic”. Take a look at my writeup for more details – Why PLM cannot adopt Big Data now?

I’ve been trying to come up with more specific examples of how companies can use data. My attention was caught a few days ago by the Smart Data Collective article – Data Driven Lingerie. It speaks about the company True&Co, which focuses on lingerie design and e-commerce based on an absolutely incredible data-driven approach. Here is a short passage from the article that explains it:

True & co is an interesting company that combines data and design to create an opportunity for consumers to share data with the company, thereby improving the appropriateness of the product to the customer. True & co claims to be the first company to fit women into their favourite bra with a fit quiz – no fitting rooms, no measuring tape, no photos. The data they collect allows them to match the customer to over 6000 body types on their database.

I recommend spending 12 minutes of your time to watch True&Co CEO Michelle Lam speak about the data-driven product experience.

I found it very interesting to see how True&Co uses data not only to power the e-commerce experience, but to actually design products to specific requirements. It is a fascinating example of how specific data collected from millions of customers can be used to classify product requirements, distinguish product configurations and optimize the supply chain.
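To make the classification idea more concrete, here is a minimal sketch of how fit-quiz answers could be clustered into body types. The features, data and choice of k-means are my assumptions for illustration – the post does not describe True&Co’s actual method.

```python
# Hypothetical sketch: grouping fit-quiz answers into "body types" with k-means.
# Feature names and data are invented for illustration; a real system would
# normalize features and use far more data.
import numpy as np
from sklearn.cluster import KMeans

# Each row is one customer's quiz answers, encoded numerically:
# (band_size, cup_volume, shoulder_slope, breast_shape_code)
quiz_answers = np.array([
    [34, 3.0, 0.2, 1],
    [36, 4.5, 0.1, 2],
    [32, 2.5, 0.3, 1],
    [38, 5.0, 0.2, 3],
    [34, 3.5, 0.1, 2],
    [36, 4.0, 0.3, 1],
])

# Cluster customers into k body types; the article mentions 6000 types,
# so a real k would be much larger than this toy value.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(quiz_answers)

# Match a new customer to the nearest body type / product configuration.
new_customer = np.array([[35, 3.8, 0.2, 2]])
body_type = kmeans.predict(new_customer)[0]
print(f"Matched body type cluster: {body_type}")
```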

[Image: True&Co product data-driven design]

Product configuration is a very complex field. Traditional PLM implementations typically point to the aerospace and automotive industries to describe the complexity of configurations. It is unusual and interesting to see 6000 product configurations of bras tailored to specific customer requirements. This is a very unique experience and a good example of a specific big data application.

What is my conclusion? With growing interest in product customization, we are going to see requirements to manage product configurations everywhere. Sometimes it is driven by personalization and sometimes by a diverse set of customer requirements. The True&Co example may be unique these days. However, I think the trend is toward empowering designers and manufacturing companies with data insight to develop better products. Big data can help companies create a unique product experience, design better products and optimize resources. This is the future as I see it today. Just my thoughts…

Best, Oleg


What is the potential of product lifecycle analytics?

September 15, 2014

[Image: PLM-ERP integration analytics]

The aim of PLM is to improve product development processes and the product lifecycle. One of the biggest challenges in making that happen is dealing with disparate enterprise applications and tools. The cost of system integration is high, and many companies are not ready to invest in integration projects with questionable ROI.

PLM-ERP integration is probably one of the best examples of an integration project with complex software, technologies, business dependencies and corporate politics. PLM and ERP vendors often lock customer data and control access. This prevents customers from making the right decisions about integrating PLM and ERP. A few years ago, I covered it in my blog – The ugly truth about PLM-ERP monkey volleyball. I don’t think the situation has improved since I posted it back in 2010. PLM-ERP integration projects are a messy combination of requirements, technologies, tools and specific customer-oriented services. It usually ends up with a number of Excel and XML files flying around between PLM, ERP and other systems (including email). No surprise, then, that companies are not ready to move fast and engage in PLM-ERP integration projects.

I’ve been thinking about how to change the trend of complex PLM-ERP implementations with slow ROI. Maybe focusing on transferring data between systems is not the first priority (even if it looks like the obvious one)? What can be done differently these days to arrive at a new definition of PLM-ERP integration value and maybe a faster ROI from these projects? Here is the idea I wanted to share – analytics.

I have to admit, talk about data analytics is everywhere these days. It is hard to overstate the importance and opportunity of big data. However, I want to take it beyond the traditional technological level. Think about PLM and ERP applications. With significant functional overlap, companies often see a high level of competition between them. What if we could bring new capabilities for capturing data and running analytics across two sources of information – product lifecycle data and ERP data? Can we turn the trend of data competition into a trend of analytics value?

Here are a couple of scenarios that, in my view, can produce useful analytics use cases by mixing data coming from PLM and ERP. Let me call it – lifecycle analytics. Think about NPD (new product development), though it can apply to existing products as well. Forecasting is an important step in every project, especially when it comes to new product design and development. I can see multiple aspects of forecasting. What if I could create a forecast of the entire lifecycle cost of a product? Another emerging need today is compliance forecasting. With a growing number of regulatory requirements, forecasting compliance cost for a new product can be a challenging task. Related to that is the need to forecast recycling cost.
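To make the lifecycle analytics idea more concrete, here is a minimal sketch that mashes up a BOM exported from a PLM system with per-part costs pulled from ERP to produce a rough lifecycle cost forecast. All tables, fields and numbers are hypothetical – a sketch of the idea, not any vendor’s actual API.

```python
# Minimal sketch of a "lifecycle analytics" mashup: joining a BOM exported
# from PLM with unit costs from ERP to forecast total lifecycle cost.
import pandas as pd

# BOM from the PLM system: part numbers and quantities per product.
plm_bom = pd.DataFrame({
    "part": ["P-100", "P-200", "P-300"],
    "qty":  [2, 4, 1],
})

# Costs from the ERP system: purchase, compliance and recycling cost per part.
erp_costs = pd.DataFrame({
    "part":            ["P-100", "P-200", "P-300"],
    "unit_cost":       [12.50, 3.20, 45.00],
    "compliance_cost": [0.80, 0.10, 2.50],
    "recycle_cost":    [0.40, 0.05, 5.00],
})

# Join the two sources and roll up a per-part and total lifecycle cost.
merged = plm_bom.merge(erp_costs, on="part")
merged["lifecycle_cost"] = merged["qty"] * (
    merged["unit_cost"] + merged["compliance_cost"] + merged["recycle_cost"]
)

print(merged[["part", "lifecycle_cost"]])
print("Forecasted product lifecycle cost:", merged["lifecycle_cost"].sum())
```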

My hunch is that the data for analytics and forecasting is available in both PLM and ERP systems. It sits at the crossroads between sales, manufacturing and product engineering. Getting data out of these systems and creating appropriate analytics can be an interesting but challenging process. I think the number of companies doing it as a mainstream activity is very low, but the demand should be huge.

What is my conclusion? Switching from data ownership debates to data analytics can be a way for both PLM and ERP vendors to move beyond clash and competition. Enterprise databases (PLM and ERP are good examples) hold very valuable data that can be used to support decision making and provide a way to optimize product development processes. The potential for lifecycle analytics using data from both PLM and ERP systems can be significant. The development of specific analytical applications can be an interesting task for existing vendors and new companies. Just my thoughts…

Best, Oleg


Will public clouds help enterprises to crunch engineering data?

August 6, 2014

[Image: Google data center crunching engineering data]

The scale and complexity of data are growing tremendously these days. If you go back 20 years, the challenge for PDM / PLM companies was how to manage revisions of CAD files. Now we have much more data coming into the engineering department: data about simulations and analysis, information about the supply chain, online catalog parts and lots of other things. Product requirements have transformed from a simple Word file into complex data with information about customers and their needs. Companies are starting to capture information about how customers use products. Sensors and other monitoring systems are everywhere. The ability to monitor products in real life creates additional opportunities to fix problems and optimize design and manufacturing.

Here is the problem… Despite the strong trend toward cheaper computing resources, applying brute computing force still doesn’t come for free. Services like Amazon S3 are relatively cheap. However, if you want to crunch, analyze and/or process large sets of data, you will need to pay. Another aspect is performance. People expect software to work at the speed of thought. Imagine you want to produce design alternatives for your future product. In many situations, waiting a few hours won’t be acceptable. It will frustrate users, and they won’t use such a system after all.
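To put some rough numbers on that trade-off, here is a back-of-the-envelope sketch. Parallel machines can buy back wall-clock time, but the compute bill stays the same. All runtimes and prices below are my assumptions for illustration, not actual cloud pricing.

```python
# Back-of-the-envelope sketch of why brute-force design exploration
# "doesn't come for free". All numbers are assumptions for illustration.
ALTERNATIVES = 10_000        # design alternatives to simulate
CPU_HOURS_EACH = 0.5         # assumed simulation time per alternative
PRICE_PER_CPU_HOUR = 0.10    # assumed on-demand price, USD

total_cpu_hours = ALTERNATIVES * CPU_HOURS_EACH
cost = total_cpu_hours * PRICE_PER_CPU_HOUR

# With 1,000 machines in parallel, wall-clock time drops but cost does not.
machines = 1_000
wall_clock_hours = total_cpu_hours / machines

print(f"Total compute: {total_cpu_hours:,.0f} CPU-hours, ~${cost:,.0f}")
print(f"Wall clock with {machines} machines: {wall_clock_hours:.1f} hours")
```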

The Manufacturing Leadership article Google’s Big Data IoT Play For Manufacturing speaks exactly about that. What if the power of web giants like Google could be used to process engineering and manufacturing data? I found the explanation provided by Tom Howe, Google’s senior enterprise consultant for manufacturing, quite interesting. Here is the passage explaining Google’s approach.

Google’s approach, said Howe, is to focus on three key enabling platforms for the future: 1/ Cloud networks that are global, scalable and pervasive; 2/ Analytics and collection tools that allow companies to get answers to big data questions in 10 minutes, not 10 days; 3/ And a team of experts that understands what questions to ask and how to extract meaningful results from a deluge of data. At Google, he explained, there are analytics teams assigned to every functional area of the company. “There’s no such thing as a gut decision at Google,” said Howe.

It sounds to me like a viable approach. However, it made me think about what would motivate Google and similar holders of computing power to sell it to enterprise companies. Google’s biggest business is not selling computing resources. Google’s business is selling ads… based on data. My hunch is that there are two potential reasons for Google to support manufacturing data initiatives – the potential to develop a Google platform for manufacturing apps, and the value of the data itself. The first one is straightforward – Google wants more companies in its eco-system. I find the second one more interesting. What if manufacturing companies and Google found a way to extract insight from engineering data that is useful for their businesses? Or, even more, for improving Google’s core business?

What is my conclusion? I’m sure that in the future data will become the next oil. The value of getting access to the data can be huge. The challenge of getting that access is significant. Companies won’t allow Google – or PLM companies, for that matter – to simply use their data. Companies are very concerned about IP protection and security. Balancing data access, the value proposition, and the insight and additional information gleaned from data can be an interesting play. For all parties involved… Just my thoughts…

Best, Oleg

Photo courtesy of Google Inc.


Will GE give birth to a new PLM company?

July 9, 2014

[Image: GE data management initiative]

Navigate back into the histories of CAD and PLM companies and you can find significant involvement of large aerospace, automotive and industrial companies. Here are a few examples – Dassault Systemes with Dassault Aviation, SDRC with US Steel, UGS with McDonnell Douglas. In addition, the involvement of large corporations as strategic customers has made a significant impact on the development of many CAD/PLM systems over the past two decades. Do you think we can see something similar in the future?

The Inc. article GE’s Grand Plan: Build the Next Generation of Data Startups made me think about potential strategic involvement of large industrial companies in the PLM software business. The following passage gives you an idea of how the startups will be organized.

A team from GE Software and GE Ventures has launched an incubator program in partnership with venture capital firm Frost Data Capital to build 30 in-house startups during the next three years that will advance the "Industrial Internet," a term GE coined. The companies will be housed in Frost’s incubator facility in Southern California.

By nurturing startups that build analytical software for machines from jet engines to wind turbines, the program, called Frost I3, aims to dramatically improve the performance of industrial products in sectors from aviation to healthcare to oil and gas. Unlike most incubator programs, GE and Frost Data are creating the companies from scratch, providing funding and access to GE’s network of 5,000 research assistants and 8,000 software professionals. The program has already launched five startups in the past 60 days.

This story connects very well to GE’s vision and strategy for the so-called Industrial Internet. The following picture provides some explanation of GE’s vision of the industrial cloud.

[Image: Industrial Internet applications]

What is my conclusion? Industrial companies are looking for new solutions and are probably ready to invest in ideas and innovative development. Money is not a problem for these companies, but time is very important. Startups are a good way to accelerate development and come up with fresh ideas for new PLM systems. A strategic partnership with a large company can provide the resources and data to make it happen. Just my thoughts…

Best, Oleg

Picture credit: GE report.


PLM: Manufacturing Big Data Ngram Dream?

February 3, 2014

[Image: PLM and big data]

My attention was caught this weekend by The Daily Beast article with the funny title – Why Big Data Doesn’t Live up to the Hype. I read the article, and during my long weekend travel I skimmed the book Uncharted: Big Data as a Lens on Human Culture by Erez Aiden and Jean-Baptiste Michel mentioned in the article. The authors were instrumental in creating the Google Ngram Viewer.

The Google Ngram Viewer is a phrase-usage graphing tool developed by Jon Orwant and Will Brockman of Google. It charts the yearly count of selected n-grams (letter combinations) or words and phrases, as found in over 5.2 million books digitized by Google Inc (up to 2008). The words or phrases (or ngrams) are matched by case-sensitive spelling, comparing exact uppercase letters, and plotted on the graph if found in 40 or more books during each year of the requested year range. The Ngram tool was released in mid-December 2010.

The word-search database was created by Google Labs, based originally on 5.2 million books, published between 1500 and 2008, containing 500 billion words in American English, British English, French, German, Spanish, Russian, Hebrew, and Chinese. Italian words are counted by their use in other languages. A user of the Ngram tool has the option to select among the source languages for the word-search operations.

Researchers have analysed the Google Ngram database of books written in American or British English, discovering interesting results. Amongst them, they found correlations between emotional output and significant events in the 20th century, such as World War II.

If you have never tried the Ngram Viewer, you should. Navigate here and try it out. You can find some interesting trends. Here is my funny example – “data” is eclipsing the “love” trend. Does it mean something? I’m not sure, but it is funny…
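For the curious, here is a toy sketch of the counting idea described in the passage above – count how many books mention a phrase in each year and keep only the years that clear a minimum-books threshold. The tiny corpus is invented; Google’s real pipeline obviously works at a completely different scale.

```python
# Toy sketch of the counting behind an Ngram-style tool: count how many
# books contain a phrase each year, keeping only years above a threshold
# (the quoted description mentions 40 books per year).
from collections import defaultdict

# (year, book_text) pairs standing in for a digitized book corpus.
corpus = [
    (2005, "big data is coming"),
    (2005, "we love data and love books"),
    (2006, "data data everywhere"),
    (2006, "a book about love"),
]

MIN_BOOKS = 1  # Google's threshold is 40; lowered for this toy corpus

def yearly_book_counts(phrase: str) -> dict:
    """Number of books per year containing the phrase (case-sensitive)."""
    counts = defaultdict(int)
    for year, text in corpus:
        if phrase in text:
            counts[year] += 1
    return {y: c for y, c in counts.items() if c >= MIN_BOOKS}

print(yearly_book_counts("data"))  # {2005: 2, 2006: 1}
print(yearly_book_counts("love"))  # {2005: 1, 2006: 1}
```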

[Image: Google Ngram Viewer chart showing “data” overtaking “love”]

Google certainly has the power to deal with such large projects. Everybody is trying to collect data these days, and you can see some very interesting examples. The ambitions of CAD and PLM companies do not go that far… yet. Here is an idea for somebody with budget and free time – collect product lifecycle information related to the manufacturing industry, suppliers, material trends and consumer behavior. More and more data is becoming publicly available on the web. Collecting and classifying this information can help us explore future demands and opportunities.

What is my conclusion? In data we trust. Data is a very powerful argument, and we use it frequently. With the globalization of the manufacturing industry and the ambition to discover future trends and opportunities in manufacturing and supply chains, I can see the collection of publicly available manufacturing data as a key toward unknown unknowns. Just a crazy idea and my thoughts… Happy Monday!

Best, Oleg


Why PLM cannot adopt Big Data now?

January 30, 2014

[Image: PLM big data reuse]

The buzz about Big Data is everywhere these days. From 2011 until today, we can clearly see skyrocketing interest in Big Data, as well as in what is behind the buzzword. Companies around the world are trying to figure out what Big Data means for them and how they can leverage it now. Engineering and manufacturing software vendors are doing the same. I speculated last year about the opportunity for PLM vendors to dig into Big Data. So far, I’ve heard lots of talk, but I have seen few practical results showing how Big Data can improve PLM products or influence product development processes.

I stumbled on the AllThingsD article – Big Data and the Soles of Your Shoes. Take 10 minutes and read it. It speaks very nicely about modern customer-manufacturer interaction driven by e-commerce. The overall flow of information is interesting – product configuration, ordering systems, materials supply, financial transactions, transportation and many other aspects. You can only imagine how many pieces of data have to move behind this scenario – product information, bills of materials, manufacturing orders, shipment tracking, manufacturing processes, delivery shipments. I especially liked the following passage, which comes as the conclusion of the article:

Big Data always comes across as “Big” first and “Data” second. What I urge you to do is think about the “small data.” This type of data is what happens every moment of every day. The humble pair of shoes represents small data. It’s a pair of shoes. It doesn’t pretend to be a space shuttle. But that pair of shoes has generated a massive quantity of data in its journey to you.

Small data represents the constant dripping faucet of information you generate every day. From ordering food at a restaurant to visiting a Web page to buying a pair of shoes, this faucet never stops. The amount of small data out there trumps the amount of Big Data.

The article made me think about an interesting term coined by social scientists – Ambient Awareness. It refers to the information that surrounds us online – social networks, e-commerce and other websites producing so-called activity streams. These streams create business-specific contextual information. The problem is that despite the wide adoption of social networks in the consumer space, organizations are still at a very premature stage of understanding how to use and leverage this information and how it might be relevant.
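To make the idea of ambient awareness a bit more tangible, here is a minimal sketch of filtering an activity stream down to the events touching an engineer’s current context. The event schema and stream contents are invented for illustration.

```python
# Hypothetical sketch: filtering an "ambient" activity stream down to the
# events relevant to an engineer's current context (a part under design).
activity_stream = [
    {"source": "erp",       "type": "price_change",  "part": "P-100", "delta": +0.75},
    {"source": "supplier",  "type": "lead_time",     "part": "P-900", "days": 14},
    {"source": "ecommerce", "type": "return_reason", "part": "P-100", "reason": "fit"},
    {"source": "erp",       "type": "price_change",  "part": "P-300", "delta": -1.10},
]

def relevant_events(stream, context_parts):
    """Keep only events touching parts in the engineer's current context."""
    return [e for e in stream if e.get("part") in context_parts]

for event in relevant_events(activity_stream, context_parts={"P-100"}):
    print(event["source"], event["type"], event)
```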

The challenge for most manufacturing organizations is how to use the right snippets of Big Data. Let’s take the product design and cost assessment process. In my view, the opportunity is to see how product configurations and a variety of design options impact product cost. The pieces of data needed for this analysis are in the flow between vendors, suppliers, shipments, material costs, etc. Now think about an engineer living in the ambient awareness of information, driven toward the right design-for-cost process. The main question that comes to my mind relates to the ‘relevance’ of every bit of big data coming from outside. What is relevant to cost? What impact does each piece of information have on overall cost? How do we calculate it, and how do we put it in front of the engineer at the right time?
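One crude way to approach the “what impacts cost” question is a one-at-a-time sensitivity check over cost drivers. Here is a minimal sketch; the cost model and all numbers are assumptions for illustration.

```python
# Minimal sketch of "what impact does each piece of information have on
# overall cost": a one-at-a-time sensitivity check over cost drivers.
baseline = {"material": 40.0, "logistics": 12.0, "compliance": 5.0, "labor": 18.0}

def total_cost(drivers):
    # A trivially additive cost model; real models would be nonlinear.
    return sum(drivers.values())

base = total_cost(baseline)

# Perturb each driver by 10% and rank by impact on total cost - a crude
# proxy for which incoming data snippets are most "relevant to cost".
impacts = {}
for name, value in baseline.items():
    perturbed = dict(baseline, **{name: value * 1.10})
    impacts[name] = total_cost(perturbed) - base

for name, impact in sorted(impacts.items(), key=lambda kv: -kv[1]):
    print(f"{name:>10}: +${impact:.2f} on total cost per 10% increase")
```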

What is my conclusion? Big data is a big opportunity for many companies. However, “big data” is too big and too abstract for companies to understand and use. Companies need to develop a way to use small bits of data coming from different sources to drive decision processes and choose among options. This is not simple and will not happen overnight for most manufacturing companies using PLM systems. PLM vendors need to come up with an approach for injecting small chunks of Big Data into the product development process. A task for PLM strategists and product managers. Just my thoughts…

Best, Oleg


Why PLM vendors need to hire data scientists?

December 4, 2013

[Image: PLM data and knowledge]

The importance of data is growing tremendously. The web, social networks and mobile started this trend just a few years ago. These days, however, companies are starting to see that without a deep understanding of the data about their activities, the future of their business is uncertain. For manufacturing companies, this speaks to fundamental business processes and decisions related to product portfolios, manufacturing and supply chains.

It sounds like PLM vendors are potentially the best fit for this job. PLM portfolios are getting broader and cover lots of applications, modules and experience related to optimizing business activities. In one of my earlier blogs this month, I talked about the new role of Chief Data Officer in companies. Navigate here to read it and draw your own opinion. However, making this job successful is mission impossible without a deep understanding of company data by both sides – the company and the vendors / implementers.

A few days ago, I was reading the InformationWeek article – Data Scientist: The Sexiest Job No One Has. The idea of the data scientist job is very interesting if you apply it beyond just storing data on file servers. Think about an advanced data analyst job focused on how a company can best leverage its data assets. The amount of data companies generate is doubling every few months. Applying the right technology and combining it with human skills can be an interesting opportunity. Pay attention to the following passage:

The role of data scientist has changed dramatically. Data used to reside on the fringes of the operation. It was usually important but seldom vital — a dreary task reserved for the geekiest of the geeks. It supported every function but never seemed to lead them. Even the executives who respected it never quite absorbed it.

But not anymore. Today the role set aside for collating, sorting, and annotating data — a secondary responsibility in most environments — has moved to the forefront. In industries ranging from marketing to financial services to telecommunications, the data scientists of today don’t just crunch numbers. They view the universe as one large data set, and they decipher relationships in that mass of information. The analytics they develop are then used to guide decisions, predict outcomes, and develop a quantitative ROI.

So, who can become a data scientist in a manufacturing company? Actually, this major is still not defined in American colleges. Anybody with a good skill set in math, computer science and manufacturing domain knowledge can consider this work. So, as software moves to the cloud, I can clearly see it as an opportunity for retired CAD and PLM IT managers who spent their careers installing on-premise PLM software.

What is my conclusion? In the past, installation and configuration skills were among the most important in the PDM/PLM business. The time vendors spent on system implementation was very significant. The PLM cloud switch is going to create a new trend – understanding company data and business processes will become the #1 skill. So, PLM vendors had better start thinking about a new job description – people capable of understanding how to crunch manufacturing data to create value for customers. Just my thoughts…

Best, Oleg

