The Complexity of Product Lifecycle and Google’s Blindspot

September 6, 2013

The power of search giants like Google is enormous these days. Think about the amount of information Google, Twitter and Facebook are processing and you will be knocked down by the numbers. Consumerization is a significant trend, and everyone is thinking about how to apply well-proven web and open source technologies in the enterprise field. Think about product designs, engineering documents, Bills of Materials – things we commonly consider product data. Eventually, the dream could be to see Google's engineers recommending the best parts to use or cracking Bills of Materials with hundreds of levels of data. Not so fast…

When it comes to product data, you can discover that this type of information processing is different from what we got to know on the web. It starts with the diminished importance of ranking mechanisms based on other people's discoveries. For example, if you happen to be searching for "Part CHI-93939-STD", it may not come up on the first pages of a search. But it may be found more directly via a connection to an existing assembly that references it. Data semantics, in this case, is more important than data ranking.
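To make the point concrete, here is a tiny, purely hypothetical sketch (made-up part and assembly numbers) of finding a part through its relationships rather than its popularity:

```python
# Hypothetical BOM data: each assembly lists the parts it references.
# A keyword-ranking search would never surface "CHI-93939-STD", but
# walking the assembly structure that references it finds it directly.
bom = {
    "ASM-PUMP-100": ["CHI-93939-STD", "SEAL-02"],
    "ASM-MOTOR-200": ["SEAL-02", "BOLT-M8"],
}

def assemblies_using(part_number):
    """Find a part via its semantic relationships, not its click ranking."""
    return [asm for asm, parts in bom.items() if part_number in parts]

print(assemblies_using("CHI-93939-STD"))  # ['ASM-PUMP-100']
```

Nobody has ever clicked on "CHI-93939-STD", but the where-used relationship answers the question immediately.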

I recently came across the following study – Top Google Result Gets 36.4% of Clicks. Have a look at the charts and you'll get my point quickly: if you are out of the first five (5) results, you essentially don't exist. So if you're "Lady Gaga", you are certain to appear, ranked on the top pages. These days, "social ranking" adds some additional flavor to the overall search results. Nevertheless, if you are "Part CHI-93939-STD", chances are you don't exist!

Another interesting blindspot of Google search is lifecycle data. A few days ago, I caught an interesting study – Filling a Search Engines Blindspots. Here is the passage describing the lifecycle blindspot:

Today, Christian von der Weth and Manfred Hauswirth at the National University of Ireland in Galway identify one blind spot in Google’s coverage and describe their vision for how to fill it. This information blackspot consists of location-specific information that is only useful for people for short periods of time. An example would be a question such as whether an advertised bargain is still available at a particular shop. Another is to ask whether parking spaces are available at a public event such as an air show, music concert or such like. There is no way that a search engine like Google can index that kind of information that is specific to a particular location for just a short period of time.

What is my conclusion? Product data is extremely complex. It contains lots of relationships, dependencies and semantics. However, that is not everything. The most important element of product data is lifecycle information. Since product data changes as a result of product development, use, maintenance, etc., systems need to be able to capture this product lifecycle data in real time to provide a correct data representation for people in manufacturing companies and the extended ecosystem. It is not a trivial task, but a very interesting problem to crack. PLM software architects and other techies – be aware of the complexity of product data lifecycle management. Just my thoughts…

Best, Oleg


Product Lifecycle Future from a 60-Year Point of View: BBC Film

September 5, 2013

Speak to people about PLM and they will tell you about CAD design, BOMs, processes, ECOs, collaboration and other similar topics. I want to change that and speak about the "lifecycle" part of PLM. The core function of the lifecycle is the ability of a PLM system to maintain changes and manage snapshots of product design and organization at a specific period of time. Some products have a very short lifecycle. However, many products, such as airplanes, power plants and even cars, have a very long one. The information about a specific aircraft can easily live for 50-60 years and more.

A few years ago, I discussed the problem of long-term product data retention. Navigate here to refresh your memory. The main part of that conversation was about how to create a logical model and physical data storage for data that needs to be preserved for a long period of time.

Earlier today, I was watching a very interesting video created by BBC. Take a look below. Here is what they did.

Sixty years ago, the BBC filmed a train journey from London to Brighton, squeezed into just four minutes. Thirty years ago, we did it again. Now we are bringing it up to date, to see how much has changed – and how much is still the same. Here’s all three journeys side by side

Here is the challenge I can see in front of design, simulation and PLM systems. How can we store and play with design and other information about products to help us recreate a product virtually in different periods of time? How can I create the product experience of a specific version of a car a company designed ten, twenty or even more years ago? How can we store all the information about a product's environment to recreate what we had on the street and in the city 50 years ago? Some of this information is available in GIS and other mapping services. Some of it is available in PDM/PLM product designs and other data sources. Combining all these data sources together is an interesting and real challenge for a PLM system.
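As a thought experiment, such a system could keep time-stamped snapshots of a product definition and answer "as-of" queries for any point in time. A minimal sketch, with a hypothetical model and made-up data:

```python
import bisect

class ProductHistory:
    """Keep every recorded snapshot; answer 'what did this product
    look like in year X?' queries against the full history."""
    def __init__(self):
        self.years = []        # sorted years of recorded snapshots
        self.definitions = []  # product definition captured at each year

    def record(self, year, definition):
        # Insert in sorted position so as_of() can binary-search.
        i = bisect.bisect_left(self.years, year)
        self.years.insert(i, year)
        self.definitions.insert(i, definition)

    def as_of(self, year):
        """Latest definition recorded at or before the given year."""
        i = bisect.bisect_right(self.years, year)
        return self.definitions[i - 1] if i else None

car = ProductHistory()
car.record(1970, {"engine": "carburetor V8"})
car.record(1995, {"engine": "fuel-injected V6"})
print(car.as_of(1980))  # {'engine': 'carburetor V8'}
```

The hard part in real PLM is, of course, the scale and semantics of the snapshots, not the time index, but the query pattern is the same.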

What is my conclusion? I think the PLM industry is underestimating the value of lifecycle data and the ability of this data to influence analytics and decision processes. The ability to "recreate" the product experience at different stages of the lifecycle and periods of time can bring tremendous changes to the way we design, manufacture and support products in the future. Just my thoughts…

Best, Oleg


PLM Data vs. Process: A Turn Towards Linked Data

September 4, 2013

Data vs. Process. The chicken or egg of the PLM industry. This topic is near and dear to many people in the PLM ecosystem. What comes first and why? My attention was caught by Jos Voskuil's blog post – Mixing past and future generations with a PLM sauce. Have a read and form your own opinion. I liked the following passage:

This culture change and a different business approach to my opinion are about modern PLM. For me, modern PLM focuses on connecting the data, instead of building automated processes with a lot of structured data. Modern PLM combines the structured and unstructured data and provides the user the right information in context.

Link is a powerful word. I appreciate the power of data connection. I was reminded of one of the write-ups I did on the Inforbix blog more than one year ago – Product Data: The Power is in the link. It goes back to Richard Wallis' presentation at Semantic Tech and Business 2013 in Berlin.

…the power of the links in Linked Data – of the globally unique identifiers of things and relationships described by URIs (Uniform Resource Identifier) in RDF – for more seamlessly interconnecting data within users’ own domains and with other data in other domains, too…
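A toy illustration of that idea (all URIs here are hypothetical): globally unique identifiers let facts asserted in different domains be joined on the same URI, without any prior agreement between the systems that produced them.

```python
# Triples (subject, predicate, object), in the spirit of RDF.
# The part URI links engineering data to supplier data to location data.
triples = [
    ("http://example.com/part/CHI-93939-STD", "usedIn",
     "http://example.com/asm/PUMP-100"),
    ("http://example.com/part/CHI-93939-STD", "supplier",
     "http://example.com/org/acme"),
    ("http://example.com/org/acme", "locatedIn",
     "http://example.com/place/chicago"),
]

def describe(subject):
    """Everything we know about one URI, regardless of which domain said it."""
    return {p: o for s, p, o in triples if s == subject}

print(describe("http://example.com/part/CHI-93939-STD"))
```

Real linked-data stacks (RDF stores, SPARQL) add typing, inference and federation on top, but the core value is exactly this joining on shared identifiers.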

Jos' commentary made me think about process vs. data again. I have addressed this topic a few times in my past blogging. My first attempt was PLM: Controversy about Process and Data Management. I wanted to emphasize my strong belief in the need to solve the problem of product data access in an organization.

…the failure to design data access in organizations, was a recipe for disaster for many PLM implementations. PLM programs were focused on “how to improve processes” and forgot about how to put a solid data foundation to support cross-departmental process implementations. So, I’d like to put a quote from Bell Helicopter’s presentation during DSCC 2011 as something PLM vendors and customers need to remember – “to get the core data right first”. Just my opinion, of course.

My next attempt to talk about data vs. process was earlier this year. The discussion was triggered by a Tech4PD dialog between Jim Brown and Chad Jackson, aptly named PLM's Chicken or Egg Scenario. In a vote that was a bit confusing to me, between "going beyond file control data" and "data beyond engineering has to be centralized, secure and accessible to PLM", I decided that process is more important. I explained myself in the post – PLM: Data vs. Process – Wrong Dilemma? My conclusion was to focus on the product lifecycle – a data set that combines information about the product (data) with information about the process (lifecycle).

The debate made me think about why Data vs. Process is probably the wrong dilemma in the context of PLM. In my view, the right focus should be on the "lifecycle" as the core value proposition of PLM and on the ability of PLM to support product development. In a nutshell, product development is about how to move the product definition (in a broad sense of the word) from initial requirements and design to engineering and manufacturing. If I go further, the next stages of the product definition will be related to maintenance and disposal. Defining what represents the product at every stage, together with what is required to move the product from one stage to another, is the core value of the product lifecycle and PLM.

What is my conclusion? I agree with Jos. Business is getting more data-sensitive these days. Some time ago, the value of data wasn't as predominant as it is today, in our Google era. It is clear to everybody that "data matters", and the best thing you can do to prove your point is to bring "data points". This is why the ability to bring "linked data points" about a product becomes so valuable. This is the next PLM turn. Just my thoughts…

Best, Oleg


How will PLM embed information in products?

July 24, 2013

Experience is the new modern hype. You can see it everywhere these days. User experience, selling experience, learning experience, total experience, etc. You can continue the list… I want to talk about "product experience" today. It is obvious and new at the same time. Manufacturers are interested in knowing more about their products. It relates to sales, usage, problem reports and defects, maintenance, etc. It becomes almost obvious – the more information you get about the usage of your product during the whole product lifecycle, the better you can do. I like the old but famous quote often attributed to W. Edwards Deming – "You can't manage what you can't measure."

An interesting article by SolidSmack caught my attention earlier today – InfraStructs: Embedded ID Tags in 3D Printed Objects Eliminate Need for RFID and Barcodes. The article speaks about how to embed information in 3D-printed objects:

…California event is the announcement from Microsoft Research that they are developing embedded ID tags for 3D printed objects. Titled InfraStructs, the internal tags are created from the same 3D printing process already used to create the intended, printed object; effectively generating an internal, invisible tag that can be read with a terahertz (THz) imaging scanner.

How will manufacturers use it, and why will PLM vendors benefit? The premise of existing RFID technology is to use a specific tag, attached to a product, that allows tagging manufacturing items and spare parts during the whole lifecycle. The idea was good, but the implementation is a bit complicated and still costly. By directly embedding the additional information, manufacturers can achieve the next level of efficiency. Here is another passage from the article.

Ultimately, the benefit of this approach for manufacturers is that they can embed unique information such as serial numbers or simple programs in coded tags by integrating the design into a pre-determined 3D printed design. In turn, this eliminates the potential need for other (and oftentimes more expensive) identification systems such a RFID tags and electronic chips that can add cost and complexity to the manufacturing, as well as the need for bar codes which can be cumbersome to work with and are vulnerable to tampering.

What is my conclusion? We are moving towards a connected world, where design and engineering parts will be more connected to their physical implementations. It will allow better measurement of the product experience and, as a result, better product lifecycle management. Just my thoughts…

Best, Oleg


Do We Need a Delete Button in PLM?

May 14, 2013

Delete is a special function. In systems dealing with live data, the meaning of delete is interesting. My first lesson about the <delete> function in PDM was 25 years ago. In one of the very first data management systems I implemented, we used a special flag to mark deleted parts. Later on, I was discussing delete functionality with engineering managers at one of the firms. Think about parts used in production. How can you delete them? They can be no longer effective for usage, out of stock, discontinued, etc. However, you cannot literally delete them. Back 20 years ago, the technology was different. We marked parts and revisions as "obsolete", but we didn't keep them forever.
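The "special flag" approach mentioned above can be sketched in a few lines (a hypothetical model, not any specific PDM system): parts are never physically removed, only moved to a terminal lifecycle state, so history and where-used data stay intact.

```python
class Part:
    """A part record whose lifecycle state replaces physical deletion."""
    def __init__(self, number):
        self.number = number
        self.state = "active"

    def delete(self):
        # Soft delete: mark obsolete instead of removing the record.
        self.state = "obsolete"

parts = {"CHI-93939-STD": Part("CHI-93939-STD")}
parts["CHI-93939-STD"].delete()

# The record is still there for history and where-used queries,
# but is filtered out of the views that show active parts.
active = [p.number for p in parts.values() if p.state == "active"]
print(active)  # []
```

The design choice is that "deleted" becomes just one more lifecycle state, queryable like any other.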

Yesterday, in the airport, the following CNET article caught my attention – Google’s Schmidt: The Internet needs a delete button. Schmidt is discussing the nature of internet to absorb data and information that cannot be deleted. Here is an interesting passage:

Actions someone takes when young can haunt the person forever, Schmidt said, because the information will always be on the Internet. He used the example of a young person who committed a crime that could be expunged from his record when he’s an adult. But information about that crime could remain online, preventing the person from finding a job. "In America, there’s a sense of fairness that’s culturally true for all of us," Schmidt said. "The lack of a delete button on the Internet is a significant issue. There is a time when erasure is a right thing."

Well, privacy has a different angle, of course. People are not part numbers. However, think about the technology behind the internet these days. Think about Gmail. You can keep doing email forever without deleting anything. I've heard rumors that the first version of Gmail had no delete functionality. Storage is cheap these days. You can literally keep all the information created by design, engineering and manufacturing without ever deleting it. Isn't it fascinating? It can change the way people design and manufacture things.

What is my conclusion? Delete is a very special functionality when it comes to systems dealing with a lifecycle. The internet is very much changing our horizons in understanding what "total lifecycle" management can potentially include. It also changes the perspective of how to manage the lifecycle for a particular ecosystem such as PLM. The increasing lifespan covered by PLM systems can improve decision making and provide additional insight in the areas of product development, quality management and others. Just my thoughts…

Best, Oleg


PLM, Lifecycle and Google Timelapse ideas

May 10, 2013

Manufacturing businesses are getting more dynamic these days. It is all about how to change, and change fast. The days when manufacturing companies allowed months and even years to respond to business changes are gone. Competition is getting more aggressive. Cost pressure is getting tighter. Companies need a way to analyze what they do from the perspective of time. This is actually one of the places where PLM technology can provide bigger value in the future. Think about design changes and the problems reported in your products over a time lapse of the last 10 years. Think about the quality and cost of suppliers over the last 5 years. How can PLM provide this type of insight and information?

Very often, when we speak about PLM, we want to emphasize the middle "L" of what PLM is accomplishing for our customers. It is about Lifecycle. This is where every PLM solution wants to excite and provide value. However, this part of PLM is not well developed.

I've been reading about the Google Timelapse project earlier this week. Navigate here to read the ABC News article – Google Timelapse: A Quarter Century of Earth's Change. Working with the U.S. Geological Survey, NASA and Time magazine, search giant Google has unveiled a project that shows how planet Earth has changed over the course of a quarter century.

Another article – Time's Timelapse story – provides a bit more detail about the project.

With the help of massive amounts of computer muscle, they have scrubbed away cloud cover, filled in missing pixels, digitally stitched puzzle-piece pictures together, until the growing, thriving, sometimes dying planet is revealed in all its dynamic churn. The images are striking not just because of their vast sweep of geography and time but also because of their staggering detail. Consider: a standard TV image uses about one-third of a million pixels per frame, while a high-definition image uses 2 million. The Landsat images, by contrast, weigh in at 1.8 trillion pixels per frame, the equivalent of 900,000 high-def TVs assembled into a single mosaic.
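As a quick sanity check, the numbers quoted in that passage are easy to verify:

```python
# Figures as quoted in the Time passage.
landsat_pixels = 1.8e12  # pixels per Landsat mosaic frame
hd_pixels = 2e6          # pixels per high-definition TV frame

# How many HD screens would one mosaic frame fill?
print(int(landsat_pixels / hd_pixels))  # 900000
```

The arithmetic matches the article's "900,000 high-def TVs assembled into a single mosaic."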

The technology is available online. You can try it by yourself. Read Google's blog and navigate to the following link to experiment with Google Earth Engine. Here is a capture I made of a search for changes in the Haifa district, Israel.

haifa-1984-2011.jpg

What is my conclusion? Thinking about time exploration in the context of manufacturing and engineering information is very inspiring. Data visualization can potentially be very cool and provide a lot of insight to manufacturing organizations about how to improve their businesses. What is your take? Do you have an idea for an engineering timelapse visualization? Speak up.

Best, Oleg


Product Lifecycle and Social Timeline

January 8, 2013

I want to continue the conversation about the intersection of social software and PLM. Yesterday's blog, Why Social PLM 1.0 failed?, made me think about how to find a single utility for users in that context. As Jim Brown mentioned in his comment earlier, the social hype calmed down and the PLM ecosystem didn't change overnight. Altogether, it brings me to the point of finding additional value-added services.

The following blog post caught my attention – 5 Fun Ways to Travel Back in Time with Your Social Data. The article speaks about tools that help us discover, analyze and go back in the history of our social channels – LinkedIn, Twitter, Facebook, etc. One of the tools – Vizify – sparked my special interest. The application, with a very easy user interface, helped me access and analyze my Twitter behavior, as well as connections to photos, LinkedIn and Facebook. Take a look at a few screenshots I've made. You can also see the results by navigating to this link.

I found an interesting intersection between social, timeline and big data. The PLM lifecycle function is something that can be improved by a similar experience. Having the ability to access product historical data, run analyses and merge information can be extremely interesting. Think about the ability to merge product releases with a customer social stream and a defect database. I can bring more examples. The utility was able to analyze my data fast, easily and without heavy IT involvement. I understand that getting access to public services like Twitter and Facebook is easier than getting access to your corporate ERP system. At the same time, the trend towards simplification and value-added cloud services is clear to me.
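The kind of merge described above is conceptually simple. Here is a toy sketch (made-up data) of combining product releases and defect reports into one time-ordered lifecycle timeline:

```python
from datetime import date

# Two hypothetical event sources for the same product.
releases = [(date(2012, 3, 1), "release", "v1.0"),
            (date(2012, 9, 1), "release", "v1.1")]
defects = [(date(2012, 5, 10), "defect", "DEF-101 overheating"),
           (date(2012, 10, 2), "defect", "DEF-102 seal leak")]

# Tuples sort by date first, giving one chronological timeline.
timeline = sorted(releases + defects)
for when, kind, info in timeline:
    print(when, kind, info)
```

Even this trivial view already answers a useful question: which defects arrived between which releases. The real challenge, as noted above, is getting that data out of corporate systems in the first place.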

What is my conclusion? In my view, Social PLM 1.0 was completely focused on how to create and collaborate. To me it was an obvious attempt – collaborative design and engineering was always a focus of CAD/PLM vendors. However, think about single-user utility and the value of additional services. A social timeline (or lifecycle timeline, in the context of PLM) can be an interesting feature and value-added utility. Just my thoughts…

Best, Oleg

