Top 5 PLM trends to watch in 2015

January 15, 2015


Holidays are over, and it was a good time to think about what you can expect from engineering and manufacturing software related to PLM in the coming year. You probably had a chance to listen to my 2015 PLM predictions podcast a few months ago. If you missed it, here is the link. Today I want to give a more expanded list of trends in product lifecycle management to watch in 2015.

1- Greater complexity of cloud PLM implementations

Cloud adoption in the enterprise has been growing for the last few years and is getting more mature. PLM vendors are making steps in the cloud direction too. Companies are moving from marketing and research to the "nuts and bolts" of implementations. The switch to the cloud is not as simple as some marketing pundits predicted. It is more than just moving servers from your data center to somebody else's place. The complexity of implementation, maintenance and operation will emerge and will drive the future difference between "born in the cloud" solutions and existing PLM platforms migrating to the cloud.

2- The demand to manage complex product information will grow

Products are getting more complex. You can see it all around you. A simple IoT gadget such as a door lock can combine mechanical, electrical, electronic and software parts. This introduces a new level of complexity for manufacturers and PLM vendors – how to manage all this information in a consistent way? Bringing together design and bills of materials for every discipline becomes a critical factor in manufacturing companies of every size.
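To make the point concrete, here is a minimal sketch of what a multi-discipline BOM item could look like. The structure and part numbers are invented for illustration; real PLM data models are far richer, but the core idea – one consistent structure spanning mechanical, electrical, electronic and software parts – is the same.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Dict, List

class Discipline(Enum):
    MECHANICAL = "mechanical"
    ELECTRICAL = "electrical"
    ELECTRONIC = "electronic"
    SOFTWARE = "software"

@dataclass
class BomItem:
    part_number: str
    description: str
    discipline: Discipline
    quantity: int = 1
    children: List["BomItem"] = field(default_factory=list)

    def rollup(self) -> Dict[Discipline, int]:
        """Count quantities per discipline across the whole structure."""
        counts = {self.discipline: self.quantity}
        for child in self.children:
            for disc, qty in child.rollup().items():
                counts[disc] = counts.get(disc, 0) + qty
        return counts

# A hypothetical smart door lock mixing all four disciplines in one BOM
lock = BomItem("LOCK-100", "Smart door lock", Discipline.MECHANICAL, children=[
    BomItem("HOUS-01", "Housing", Discipline.MECHANICAL),
    BomItem("MOT-01", "Lock motor", Discipline.ELECTRICAL),
    BomItem("PCB-01", "Controller board", Discipline.ELECTRONIC),
    BomItem("FW-01", "Firmware v1.0", Discipline.SOFTWARE),
])
print(lock.rollup())  # one consistent view across all disciplines
```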

3- A new type of manufacturing company will attract the focus of PLM vendors

The manufacturing landscape is changing. The internet and globalization are enabling a new type of manufacturing company – smaller, distributed, agile, crowdfunded. It requires a new type of thinking about collaboration, distributed work, digital manufacturing and more. These companies represent a new opportunity and will draw more attention from PLM vendors.

4- Growing interest in mobile enterprise PLM solutions

Mobile has gone mainstream in many domains. Until now, engineers in manufacturing companies mostly used mobile for email. In 2015 I see the potential for greater interest in mobile solutions from manufacturing companies. Distributed work and the need for collaboration will drive the demand to make existing enterprise systems more mobile.

5- The demand for big data and analytics in the product lifecycle

Data is drawing greater attention these days. I even heard data called "the new oil". Manufacturing companies will start to recognize the opportunity and think about how to use the piles of data in their enterprise engineering and manufacturing systems to drive analysis and use it for decision making.

What is my conclusion? I think 2015 will be a very interesting year in PLM. Broader adoption of cloud, mobile and big data analytics will drive future transformation in engineering and manufacturing software. The disconnect between old-fashioned enterprise software and new tech vendors will increase. Just my thoughts…

Best, Oleg


What stops manufacturing from entering the bright DaaS future?

January 7, 2015


There are lots of changes in the manufacturing ecosystem these days. You have probably heard about many of them. Changes are coming as a result of many factors – the physical production environment, IP ownership, cloud IT infrastructure, connected products, changes in the demand model and mass customization.

The last one is interesting. The time when manufacturing was pictured as a long conveyor making identical products is gone. Diversification and local markets have a significant impact. Today manufacturing companies are looking at how to discover and use a variety of data sources to get the right demand information and product requirements and to connect directly with customers. Data has power, and the ability to dig into data becomes very valuable.

As we go through the wave of end-of-year blog summaries, my attention was caught by a Design World publication – 7 Most Popular 3D CAD World Blog Posts of 2014. I found one of them very interesting. Navigate your browser to read Top Ten Tech Predictions for 2015. One of the predictions speaks about DaaS – Data-as-a-Service will drive a new big data supply chain. Here is the passage I captured:

Worldwide spending on big data-related software, hardware, and services will reach $125 billion. Rich media analytics (video, audio, and image) will emerge as an important driver of big data projects, tripling in size. 25% of top IT vendors will offer Data-as-a-Service (DaaS) as cloud platform and analytics vendors offer value-added information from commercial and open data sets. IoT will be the next critical focus for data/analytics services with 30% CAGR over the next five years, and in 2015 we will see a growing number of apps and competitors (e.g., Microsoft, Amazon, Baidu) providing cognitive/machine learning solutions.

The prediction is very exciting. Future data services can help manufacturing companies leverage data to optimize production, measure demand and manufacture a diverse set of products for a wide range of customers. However, here is the problem. I guess you are familiar with GIGO – garbage in, garbage out. When you deal with data, there is nothing more important than having access to accurate and relevant data sets. Big data analytics software can revolutionize everything, but it requires data. At the same time, data is located in corporate databases, spreadsheets, drawings, email systems and many other data sources. Getting this data up to the cloud, crunching it with modern big data platforms and making it actionable for decision processes is a big deal.
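To illustrate the GIGO point, here is a minimal sketch of the kind of cleanup that has to happen before any DaaS analytics can be trusted. The file layout, part numbers and fields are invented for illustration; the point is that normalization, gap rejection and de-duplication come before any clever algorithm.

```python
import csv
import io

# Hypothetical export from a corporate source, with inconsistent part
# numbers, a missing cost and a duplicate row - typical "garbage in"
raw_export = """part_number,description,unit_cost
 abc-100 ,Bracket,1.25
ABC-101,Hinge,
abc-100,Bracket,1.25
"""

def clean_records(raw_csv: str) -> list:
    """Normalize keys, drop rows with missing fields, de-duplicate."""
    seen = set()
    cleaned = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        part = row["part_number"].strip().upper()  # one canonical key format
        if not part or not row["unit_cost"]:       # GIGO guard: reject gaps
            continue
        if part in seen:                           # reject duplicates
            continue
        seen.add(part)
        cleaned.append({"part_number": part,
                        "description": row["description"].strip(),
                        "unit_cost": float(row["unit_cost"])})
    return cleaned

print(clean_records(raw_export))  # only one clean ABC-100 row survives
```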

What is my conclusion? Data availability is the #1 priority to make DaaS work for manufacturing in the coming years. The ability to collect the right data from a variety of corporate sources, then clean, classify, process and turn it into action – this is a big challenge and opportunity for a new type of manufacturing software. Just my thoughts…

Best, Oleg

photo credit: IvanWalsh.com via photopin cc


How PLM can “build itself” using artificial intelligence technologies

January 7, 2015


I had a chance to visit The Art of the Brick exhibition at Boston's Faneuil Hall Museum a few days ago. If you follow me on social media, there is a chance you noticed a few pictures. Afterwards, I read more about LEGO artist Nathan Sawaya. What impressed me is the power of the "simple LEGO brick". A simple plastic brick and a huge amount of imagination allow the creation of such incredible models.


You may ask me – how is that connected to engineering, manufacturing and product lifecycle management? Here is the thing… It made me think about the way PLM systems are implemented these days. I'm sure you are familiar with the "best practices" approach. The topic isn't new. I found my old post – PLM best practices torpedo. After five years, I still like my conclusion – PLM best practices are good for showing what PLM technology and software are capable of doing. However, for a real implementation, they are not very useful. You have to come back to the "simple bricks" of PLM technology – data models, documents, lifecycle statuses, bills of materials, processes.

I captured a slightly different perspective on PLM best practices. Navigate to the PLM cultural change blog – PLM design patterns. It took me back to thinking about best practices. How do you define implementation patterns and make a real PLM implementation much easier? The article speaks about the general way a PLM implementation can be done, organizational transformation and change. Read the article. I found the following passages interesting:

In general you can set up all required supporting procedures using the PLM design patterns. Even for specific supporting procedures of a business process pattern like Engineer to Order (ETO) you can derive patterns, which consist of a framework of general PLM design patterns and are adapted to the specific business needs. There is enough freedom to derive, based on these patterns, supporting procedures to fulfill specific business needs.

If some organizations had already implemented supporting procedures based on patterns, then consultants introducing PLM to an organization could refer to "state of the art" implementation examples from these organizations. The target is to convince an organization that the decision for a new practice requiring organizational change is necessary and works. Only then can the organization enable the full potential of the PLM methodology without remaining stuck in the current practice.

Instead of inventing a Ping-Pong table “from scratch” with a cabinetmaker we can make a clear decision based on all the options available, fulfilling and probably exceeding our originally perceived needs (with a safe and easy-to-use folding mechanism). And we can afford it, because a stock table is cheaper than a custom built one.

The time saved in avoiding the endless discussions and continual redesign of processes because of paradigm paralysis, based on current methods, could be better used in a well-planned, strategic deployment of the new processes leading to an improved business solution.


The idea and vision of configurable patterns and best practices is interesting. In my view, it was invented earlier as PLM toolkits, flexible data models and process templates. The key problem here is not related to technology – software does what it does. The problem is related to people and organizations. Remember, technology is simple, but people are really hard. What is called "convincing people" is actually a process of bringing an organization and its people to understand their business and product development patterns. Without that understanding, the chances of a successful PLM implementation are very low and the probability of PLM project failure is high.

So, what could be a 21st-century solution to that problem?

My attention today was caught by a new startup – The Grid. The tagline states – AI websites that design themselves. The vision of The Grid is to change the paradigm of website building. The idea of self-building websites driven by artificial intelligence and data analysis is something worth thinking about. Watch the video.

Now let me get back to manufacturing companies and PLM implementations. All manufacturing organizations are different. The approach most PLM vendors are taking these days is to classify companies by size (small, medium, large), industry (aero, auto, industrial equipment, etc.), manufacturing model (mass production, configure to order, engineer to order, etc.) and many other dimensions such as locations, supply chain and existing enterprise systems (ERP, SCM, etc.). The decision matrix is huge. Analyzing an existing manufacturing company – its processes, existing systems and requirements – is what takes time and money during a PLM implementation.
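A small sketch shows how quickly that decision matrix explodes. The dimension values below are examples only, not any vendor's actual taxonomy:

```python
from itertools import product

# Illustrative classification dimensions (values are examples only)
dimensions = {
    "size": ["small", "medium", "large"],
    "industry": ["aero", "auto", "industrial equipment", "consumer", "medical"],
    "mfg_model": ["mass production", "configure to order", "engineer to order"],
    "locations": ["single site", "multi-site", "global"],
    "erp": ["SAP", "Oracle", "none"],
}

profiles = list(product(*dimensions.values()))
print(len(profiles))  # 3 * 5 * 3 * 3 * 3 = 405 distinct company profiles

# Each profile potentially implies a different template, data model and
# process configuration - before any company-specific analysis even begins.
```

With five rough dimensions there are already hundreds of profiles; add supply chain structure and existing systems, and the matrix becomes unmanageable by hand – which is exactly where a data-driven, self-configuring approach could help.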

What is my conclusion? The opportunity we have today comes from new ways to process data. Call it cloud computing, big data, whatever. Facebook is reporting the capability to index a trillion posts. Would it be possible to capture data from an existing manufacturing company and ask a PLM system to build itself? Is it a dream or the future of PLM? Just my thoughts…

Best, Oleg

pictures credit to The Grid website and PLM cultural change blog


How PLM can ride the big data trend in 2015

December 22, 2014


A few months ago, I shared the story of True&Co – a company actively experimenting with and leveraging data science to improve design and customer experience. You can catch up by navigating to the following link – PLM and Big Data Driven Product Design. One of the most interesting pieces of the True&Co experience I learned about was the ability to gather a massive amount of data about their customers and turn it into information to improve the product design process.

Earlier this week the article What's next for big data prediction for 2015 caught my attention. I know… it is end-of-year "prediction madness". Nevertheless, I found the following passage interesting. It speaks about the emerging trend of Information-as-a-Service. Read this.

The popularity of “as-a-Service” delivery models is only going to increase in the years ahead. On the heels of the success of software as a service models, I believe Information-as-a-Service (IaaS) or Expertise-as-a-Service delivery models are likely the next step in the evolution. The tutoring industry provides a good blueprint for how this might look. Unlike traditional IT contractors, tutors are not necessarily hired to accomplish any one specific task, but are instead paid for short periods of time to share expertise and information.

Now imagine a similar model within the context of data analytics. The shortfall most often discussed with regard to analytics is not in tooling but in expertise. In that sense, it's not hard to imagine a world where companies express an interest in "renting" expertise from vendors. It could be in the form of human expertise, but it could also be in the form of algorithmic expertise, whereby analytics vendors develop delivery models through which companies rent algorithms for use and application within their own applications. Regardless of what form it takes in terms of its actual delivery, the notion of information or expertise as a service is an inevitability, and 2015 might just be the year IT vendors start to embrace it.

It made me think about how PLM can shift its role from only "documenting and managing data and processes" towards providing services that improve them by capturing and crunching large amounts of data in the organization. Let's speak about product configurations – one of the most complicated elements of engineering and manufacturing. The mass production model is a thing of the past. We are moving towards mass customization. How will manufacturing companies be able to get product costs down and keep up with the demand for mass customization? Intelligent PLM analytics as a service can help here.
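Here is a minimal sketch of what one such analytics service could do: mine actual sales orders to find which configurations customers really buy. The options and orders are invented for illustration:

```python
from collections import Counter

# Hypothetical sales orders: each records the options chosen for a
# configurable product (option names are invented for illustration)
orders = [
    {"motor": "24V", "finish": "black", "connectivity": "wifi"},
    {"motor": "24V", "finish": "silver", "connectivity": "wifi"},
    {"motor": "12V", "finish": "black", "connectivity": "zigbee"},
    {"motor": "24V", "finish": "black", "connectivity": "wifi"},
]

# Count exact configurations: which variants do customers actually buy?
config_demand = Counter(tuple(sorted(o.items())) for o in orders)
top_config, top_count = config_demand.most_common(1)[0]
print(f"{top_count} of {len(orders)} orders share one configuration:")
print(dict(top_config))

# An analytics service could feed this back to engineering: pre-validate
# and cost-optimize the high runners, keep the long tail configure-to-order.
```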

What is my conclusion? Data is the new oil. Whoever has access to the most accurate data will have the power to optimize processes, cut costs and deliver products faster. PLM companies should take note and think about how to move from "documenting" data about design and processes towards analytical applications and actionable data. Just my thoughts…

Best, Oleg


PLM and Big Data Driven Product Design

September 25, 2014


One of the most interesting trends to watch these days is big data. It started a few years ago, and I can see different dynamics in the usage and value proposition of big data. It certainly started as a technology that revolutionized the way we can capture and process large amounts of data. This is where everybody got into Hadoop. However, the ability to process data is just a beginning. What value can we get from this data, and how? How can we efficiently use this data in decision processes? These are the most important questions on the table for data scientists, data architects, IT managers and software vendors.

I have touched big data topics on my blog several times before. In one of my earlier posts – Will PLM vendors dig into Big Data? – two years ago, I discussed the big data opportunity and adoption rates in different industries. However, in many situations, I've seen vendors approach big data in too wide and abstract a way. It sounds like "just collect data and see the magic". Take a look at my writeup for more details – Why PLM cannot adopt Big Data now?

I've been trying to come up with more specific examples of how companies can use data. My attention was caught by the Smart Data Collective article – Data Driven Lingerie – a few days ago. It speaks about the company True&Co, which focuses on lingerie design and e-commerce based on an absolutely incredible data-driven approach. Here is a short passage from the article, which explains it:

True & co is an interesting company that combines data and design to create an opportunity for consumers to share data with the company, thereby improving the appropriateness of the product to the customer. True & co claims to be the first company to fit women into their favourite bra with a fit quiz – no fitting rooms, no measuring tape, no photos. The data they collect allows them to match the customer to over 6000 body types on their database.

I recommend you spend 12 minutes of your time and watch True&Co CEO Michelle Lam speak about the data-driven product experience.

I found it very interesting to see how True&Co uses data not only to power the e-commerce experience, but to actually design products to specific requirements. It is a fascinating example of how data collected from millions of customers can be used to classify product requirements, distinguish product configurations and optimize the supply chain.
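True&Co has not published its matching algorithm, so here is only a guess at its simplest possible form – a nearest-neighbor match from quiz answers to stored body-type profiles, with all numbers and names invented for illustration:

```python
# Stored profiles: quiz answers encoded as numbers -> a product configuration
profiles = {
    "type-0042": [3, 1, 2, 5],
    "type-0117": [4, 2, 2, 4],
    "type-3580": [2, 1, 1, 5],
}
recommended_config = {
    "type-0042": "style A, size 34B",
    "type-0117": "style C, size 36C",
    "type-3580": "style A, size 32B",
}

def match(quiz_answers):
    """Return the profile with the smallest squared distance to the quiz."""
    def distance(profile_id):
        stored = profiles[profile_id]
        return sum((a - b) ** 2 for a, b in zip(quiz_answers, stored))
    return min(profiles, key=distance)

best = match([3, 2, 2, 5])
print(best, "->", recommended_config[best])  # type-0042 -> style A, size 34B
```

Scale this from 3 profiles to 6000 and from one quiz to millions of customers, and the same data doubles as a demand map telling designers which configurations matter.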


Product configuration is a very complex field. Traditional PLM implementations typically point to the aerospace and automotive industries to describe the complexity of configurations. It is unusual and interesting to see 6000 product configurations of bras tailored to specific customer requirements. This is a unique experience and a good example of a specific big data application.

What is my conclusion? With growing interest in product customization, we are going to see requirements to manage product configurations everywhere. Sometimes it is driven by personalization and sometimes by a diverse set of customer requirements. The example of True&Co may be unique these days. However, I think the trend is towards empowering designers and manufacturing companies with data insight to develop better products. Big data can help companies create a unique product experience, design better products and optimize resources. This is the future as I see it today. Just my thoughts…

Best, Oleg


What is the potential of product lifecycle analytics?

September 15, 2014


The aim of PLM is to improve product development processes and the product lifecycle. One of the biggest challenges in making that happen is dealing with disparate enterprise applications and tools. The cost of system integration is high, and many companies are not ready to invest in integration projects with questionable ROI.

PLM-ERP integration is probably one of the best examples of an integration project with complex software, technologies, business dependencies and corporate politics. PLM and ERP vendors often lock customer data and control access. It prevents customers from making the right decisions about integrating PLM and ERP. A few years ago, I covered it in my blog – The ugly truth about PLM-ERP monkey volleyball. I don't think the situation has gotten better since I posted it back in 2010. PLM-ERP integration projects are a messy combination of requirements, technologies, tools and specific customer-oriented services. They usually end up with a number of Excel and XML files flying around between PLM, ERP and other systems (including email). No surprise, companies are not ready to move fast and engage in PLM-ERP integration projects.

I've been thinking about how to change the trend of complex PLM-ERP implementations with slow ROI. Maybe focusing on transferring data between systems is not the first priority (even if it looks like the obvious one)? What can be done differently these days to allow a new definition of PLM-ERP integration value and maybe faster ROI from these projects? Here is the idea I wanted to share – analytics.

I have to admit, talk about data analytics is everywhere these days. It is hard to overstate the importance and opportunity of big data. However, I want to take it beyond the traditional technological level. Think about PLM and ERP applications. With significant functional overlap, companies often see a high level of competitiveness between them. What if we could bring a new capability for capturing data and running analytics across the two sources of information – product lifecycle data and ERP? Can we turn the trend of data competitiveness into a trend of analytics value?

Here are a couple of scenarios that, in my view, can produce useful analytic use cases by mixing data coming from PLM and ERP. Let me call it lifecycle analytics. Think about NPD (new product development), but it can apply to existing products as well. Forecasting is an important step in every project, especially when it comes to new product design and development. I can see multiple aspects of forecasting. What if I could create a forecast for the entire lifecycle cost of a product? Another emerging need today is compliance forecasting. With a growing number of regulatory requirements, forecasting compliance cost for a new product can be a challenging task. Related to that comes the need to forecast recycling cost.
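Here is a rough sketch of such a lifecycle cost rollup. The records and rates are entirely hypothetical; in a real implementation the BOM would come from PLM, the unit costs from ERP, and the compliance and recycling factors from regulatory data:

```python
# PLM side: BOM with quantities and materials (material drives
# compliance and recycling cost in this toy model)
plm_bom = [
    {"part": "HOUS-01", "qty": 1, "material": "aluminum"},
    {"part": "PCB-01", "qty": 1, "material": "electronics"},
    {"part": "MOT-01", "qty": 2, "material": "electronics"},
]
# ERP side: unit purchase/manufacturing cost per part
erp_costs = {"HOUS-01": 4.20, "PCB-01": 11.50, "MOT-01": 6.80}

# Hypothetical cost factors per material, as a fraction of production cost
compliance_rate = {"aluminum": 0.05, "electronics": 0.15}  # e.g. testing, filings
recycling_rate = {"aluminum": 0.02, "electronics": 0.10}   # end-of-life handling

def lifecycle_cost(bom):
    """Roll up production, compliance and recycling cost across the BOM."""
    total = {"production": 0.0, "compliance": 0.0, "recycling": 0.0}
    for line in bom:
        base = erp_costs[line["part"]] * line["qty"]
        total["production"] += base
        total["compliance"] += base * compliance_rate[line["material"]]
        total["recycling"] += base * recycling_rate[line["material"]]
    return total

costs = lifecycle_cost(plm_bom)
print(costs, "=> total:", round(sum(costs.values()), 2))
```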

My hunch is that the data for analytics and forecasting is available in both PLM and ERP systems. It sits at the crossroads between sales, manufacturing and product engineering. Getting data out of these systems and creating appropriate analytics can be an interesting but challenging process. I think the number of companies doing it as a mainstream activity is very low, but demand should be huge.

What is my conclusion? Switching from data ownership debates to data analytics can be a way for both PLM and ERP vendors to move past the clash and competition. Enterprise databases (PLM and ERP are good examples) hold very valuable data that can be used to support decision making and optimize product development processes. The potential for lifecycle analytics using data from both PLM and ERP systems is significant. The development of specific analytical applications can be an interesting task for existing vendors and new companies. Just my thoughts…

Best, Oleg


Will public clouds help enterprises to crunch engineering data?

August 6, 2014


The scale and complexity of data are growing tremendously these days. If you go back 20 years, the challenge for PDM/PLM companies was how to manage revisions of CAD files. Now we have much more data coming into the engineering department: data about simulations and analysis, information about the supply chain, online catalog parts and lots of other things. Product requirements have been transformed from a simple Word file into complex data with information about customers and their needs. Companies are starting to capture information about how customers use products. Sensors and other monitoring systems are everywhere. The ability to monitor products in real life creates additional opportunities – to fix problems and optimize design and manufacturing.

Here is the problem… Despite the strong trend towards cheaper computing resources, when you need to apply brute computing force, it still doesn't come for free. Storage services like Amazon S3 are relatively cheap. However, if you want to crunch, analyze and/or process large sets of data, you will need to pay. Another aspect is performance. People expect software to work at the speed of the user's thinking process. Imagine you want to produce design alternatives for your future product. In many situations, waiting a few hours won't be acceptable. It will distract users, and they won't use such a system after all.

The Manufacturing Leadership article Google's Big Data IoT Play For Manufacturing speaks exactly about that. What if the power of web giants like Google could be used to process engineering and manufacturing data? I found the explanation provided by Tom Howe, Google's senior enterprise consultant for manufacturing, quite interesting. Here is the passage explaining Google's approach.

Google’s approach, said Howe, is to focus on three key enabling platforms for the future: 1/ Cloud networks that are global, scalable and pervasive; 2/ Analytics and collection tools that allow companies to get answers to big data questions in 10 minutes, not 10 days; 3/ And a team of experts that understands what questions to ask and how to extract meaningful results from a deluge of data. At Google, he explained, there are analytics teams assigned to every functional area of the company. “There’s no such thing as a gut decision at Google,” said Howe.

It sounds to me like a viable approach. However, it made me think about what would make Google and similar holders of computing power sell it to enterprise companies. Google's biggest value is not in selling computing resources. Google's business is selling ads… based on data. My hunch is there are two potential reasons for Google to support manufacturing data initiatives – the potential to develop a Google platform for manufacturing apps, and the value of data. The first one is straightforward – Google wants more companies in its ecosystem. I found the second one more interesting. What if manufacturing companies and Google found a way to glean insight from engineering data that is useful for their businesses? Or even more – improves their core business?

What is my conclusion? I'm sure that in the future data will become the next oil. The value of getting access to the data can be huge. The challenge of getting that access is significant. Companies won't allow Google – or PLM companies – to simply use the data. Companies are very concerned about IP protection and security. Balancing data access, the value proposition, and the insight and additional information gleaned from data can be an interesting play for all parties involved… Just my thoughts…

Best, Oleg

Photo courtesy of Google Inc.

