Why is now the right time to reinvent PDM?

August 15, 2014


Product Data Management (PDM) isn’t a new domain. The first PDM systems were invented 20-30 years ago with a simple objective – to manage product data. The scope of PDM was heavily debated and included design, engineering BOMs, ECOs and even the supply chain. However, the most widely accepted role of PDM is to manage CAD files and their revisions.

For a long time, PDM was recognized as something you only need to consider once your engineering department is large enough. Even though the starting price of a PDM implementation went down significantly over the last 20 years, my hunch is that the average starting cost of a PDM solution for an engineering organization of 10-15 people is still about $30-50K. Cost and implementation complexity limited the PDM business to larger companies, and implementations were mostly handled by resellers with special skills and knowledge, most of them associated with a specific CAD vendor channel.

CAD vendors recognized the need for PDM and its complexity. For most vendors, the answer to PDM demand was to develop (or acquire) a dedicated PDM system bundled with their CAD software. As a result, most independent PDM players were acquired. Most of the remaining PDM vendors either focus on a specific geographical niche or have developed additional solutions, usually branded with the "PLM" buzzword and strategy.

My hunch is that until last year, the PDM market was somewhat stalled, focused on replacing outdated versions of PDM software and supporting new CAD software releases. Then something happened… Over the last few months, I have seen increased interest in PDM software. I noticed a few focused research reports and articles in the field of PDM – Expert Guide to the Next Generation of PDM, the TechClarity Expert Guide for Basic CAD Management, and a few others.

I also want to mention a few activities by vendors focusing on basic PDM functionality. It started with the more traditional OOTB approach of PTC Windchill PDM Essentials, continued with SolidEdge SP leveraging the SharePoint platform, and GrabCAD Workbench using a "cloud platform" as a differentiation strategy.

Consilia Vector published a CAMScore report for GrabCAD Workbench, where CAMS stands for Cloud, Analytics, Mobile, Social. In my view, these major trends are driving a renaissance in the PDM space.

As I mentioned before, because of cost and complexity, PDM software was out of reach for many smaller companies and engineering departments. A DIY (do-it-yourself) PDM approach combining network file shares, Excel files and FTP is the solution for probably 60-70% of the market. For many years, sharing files using network and USB drives was a "good enough" solution. But the era of file sharing changed forever with the coming of social networks, mobile and cloud. So-called YAPSA (Yet Another Photo Sharing Apps) became widely available in our everyday lives. The question of why PDM is so complex, and why we cannot manage and access CAD data the same way we do photos and videos, brings PDM back to the innovation room.

What is my conclusion? Cloud, web and social technologies in the consumer space have reached maturity. It has come to the point where new tech and awareness of the cloud and social approach are going to challenge the traditional PDM space. In addition, it looks like the existing approach of using network drives and file sharing to manage CAD files is coming to its logical end. People will be looking at how to copy the YAPSA approach into the PDM space. So, it is time for PDM to change. Just my thoughts…

Best, Oleg


How to visualize future PLM data?

August 12, 2014


I have a special passion for data and data visualization. We deal with it every day in our lives. Simple data, complex data, fast data, contextual data… These days, we are surrounded by data as never before. Think about a typical engineer 50-60 years ago. Blueprints, some physical models… Not much information. Nowadays the situation is completely different. Multiple kinds of design and engineering data, historical data about product use, the history of design revisions, social information, data about how a product is performing coming in real time from sensors, etc. Our ability to discover and use data is becoming very important.

The way we present data for decision making matters a great deal and can change our ability to design in the context of the right data. Presenting data to engineers and designers these days is becoming as important as presenting the right information to airplane pilots. Five years ago, I posted about visual search engines on the 3D perspective blog. I found the article is still alive. Navigate your browser here to have a read. What I liked in the idea of visual search is presenting information in a way people can easily understand.

A few days ago, my attention was caught by a TechCrunch article about the Collective Experience of Empathetic Data Systems (CEEDS) project developed in Europe.

[The project ]… involves a consortium of 16 different research partners across nine European countries: Finland, France, Germany, Greece, Hungary, Italy, Spain, the Netherlands and the UK. The “immersive multi-modal environment” where the data sets are displayed, as pictured above — called an eXperience Induction Machine (XIM) — is located at Pompeu Fabra University, Barcelona.

Read the article, watch the video and draw your own conclusions. It made me think about the potential of data visualization for design. Here is my favorite passage from the article explaining the approach:

“We are integrating virtual reality and mixed reality platforms to allow us to screen information in an immersive way. We also have systems to help us extract information from these platforms. We use tracking systems to understand how a person moves within a given space. We also have various physiological sensors (heart rate, breathing etc.) that capture signals produced by the user – both conscious and subconscious. Our main challenge is how to integrate all this information coherently.”

Here is the thing. The challenge is how to integrate all the information coherently. Different data can be presented differently – 3D geometry, 2D schemas, 2D drawings, graphics, tables, graphs, lists. In many situations we can get this information presented separately using different design and visualization tools. However, the efficiency is questionable. A lot of data can be lost during visualization. And, as I learned from the CEEDS project materials, data can also be lost during the process of understanding. Blindspotting. Our brain will miss data even when we think we have presented it in the best way.

What is my conclusion? Visualization of data for better understanding will play an increasing role in the future. We are just at the beginning of the process of data collection. We understand the power of data and therefore collect more of it every day. However, how to process that data and visualize it for better design will be an interesting topic to work on in the coming years. Just my thoughts…

Best, Oleg


PLM: Tools, Bundles and Platforms

August 11, 2014


I like online debates. The opportunity to have good online debates is rare in our space. Therefore, I want to thank Chad Jackson for his openness to having one. I don’t think Chad Jackson needs any introduction – I’m sure you have had a chance to watch one of his Tech4PD video debates with Jim Brown of TechClarity.

Here is my post that ignited the debate – CAD: Engineering Bundles vs. Granular Applications. In a nutshell, I caught Chad by pointing out that his idea of bundling MCAD and ECAD into a single application might go against the idea of granular integrated applications he articulated before.


Here it starts! Chad tweeted that it was a blog fight… whatever. I saw it as a good opportunity to debate what the future engineering landscape might be. In a world where large CAD and PLM players are aggressively acquiring companies, products and technologies, the idea of combining MCAD and ECAD applications could be quite disruptive.

However, my intention is not to discuss who is buying whom in the CAD/PLM world. There is a relatively limited number of MCAD and ECAD vendors. You can see them by navigating to the following links – 3D CAD, ECAD.

Chad’s main point is that granularity and integration are not diametrically opposed. I agree with that statement. I also find the examples of 3DEXPERIENCE, PTC and Transmagic very relevant. I found it very important to clarify the difference between so-called "granular apps" and "data integration". Here is my favorite passage from Chad’s article:

Granular Apps offer a limited set of capabilities that are focused on a specific job. These apps are more accessible to different roles in the company because their limited set of functionality requires less training and retention in terms of how they work. They are valuable in the network of roles that participate because they are so accessible.

Data Integration means that multiple software applications work against a single set of data in a coordinated fashion. There can be value in this in propagating change and enabling collaboration across the network of roles that participate in overall product development.

The way the article presents the combination of integration and granularity made me think about some interesting trajectories in the future development of engineering software. I’d like to classify things into 3 distinct categories – Tools, Bundles and Platforms.

1- Tools.

The history of engineering applications goes back to the development of tools that helped engineers be more productive – drafting tools and calculation tools. You can find many examples – 2D CAD, 3D CAD, simulation and analysis tools. If you look at the current software landscape, you can see most of these tools are still here.

2- Bundles and/or Suites

One of the biggest challenges with tools is how customers can use them together. The topics of data integration and interoperability are very often discussed in the context of the ability to use multiple tools, especially when those tools are developed by different vendors. The problem of interoperability is well recognized by vendors. One of the answers is to provide so-called "suites" – application bundles with a special focus on how the tools are integrated together.

3- Platforms.

Platform is a lovely word in the lexicon of software developers. For most of them, this is the end game in the maturity of software tools. How do you become a platform that can be used by other developers? There are so many advantages you can unlock as a provider of a platform. Easy to say, but very hard to do. The critical characteristics of platforms are hard to achieve – openness, data integration, mature data standards, tools and APIs, and many others.

What is my conclusion? My guess is that Chad is speaking about the opportunity to provide a unified product development platform that combines MCAD and ECAD tools. His statement about data integration indicates that tools can still be granular while becoming part of an integrated platform. I don’t think everybody will see it the same way. I can hardly see a mechanical engineer using an ECAD-type environment for his work, and I can hardly imagine ECAD-related work being done in a 3D environment. A 3D view could be cumbersome and confusing for most electronic design. I believe IT and PLM architects might appreciate the platform idea, but engineers may disagree. Where is the middle ground? It made me think more about what future engineering and manufacturing platforms will look like. I guess Chad Jackson might have some ideas about that and would like to share them. I will work on my list to compare notes too. Just my thoughts…

Best, Oleg


PLM workflow dream

August 8, 2014


Process management is a very important part of any PLM software. You can find it in every PLM system. There are many ways to define and manage processes. A few years ago I captured some of them here – PLM Processes: Flowchart vs. Rule-based? While I believe we can agree on the importance of process management, I find it hard to find a simple and powerful implementation of PLM workflow. I believe this statement holds for every enterprise system. Some time ago I had a dream that PLM vendors would adopt best-in-class BPM (Business Process Management) tools and infrastructure. My dream didn’t come true. Instead, the reality is that every PLM system has some (not the best) workflow implementation.

As part of my thinking about un-bundling in PLM, I decided to come up with a description of what I call the PLM workflow dream – a list of features for an ideal PLM workflow system. A minimal sketch of how such a workflow definition might look in code follows the list.

1- Visual designer. The majority of people think visually when it comes to workflow. So, the visual designer should be a tool to draw a workflow in the easiest way – put boxes with activities and connect them together. It would be very interesting to have it done in a collaborative manner – typically, you need more than one person to define a good workflow.

2- Drag-n-drop activity planning. There should be a very clear way to define activities. In most PLM systems, activities should be connected to something that happens in the system (e.g. a part status change, a document release, etc.). Connecting these events to flow activities is key.

3- Visualize and test. The designer should provide a way to "line up" a workflow into a simple set of events (boxes) without a cumbersome mess of intersecting lines and nodes. No cyclic visualization, no unclear sub-graph connections, etc. The system should also provide a way to test the workflow with dummy or real data.

4- Program activities easily. Each activity node should support process notions such as failure, alert, delegation and user action (if needed). It would be really nice to have some predefined "processing rules", such as how to react to a person's absence or mistakes. The interface to set these values and actions should be user friendly, without additional complexity.

5- Failure programming. I need to be able to program what happens in case of a general workflow failure – who to call and what to do.

6- Programming scripts. The ability to attach programmable scripts to every activity/node. These days, JavaScript is probably the standard and should simply be adopted. Don’t invent yet another programming language. The testing facility should support debugging and data dumps for analysis.
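Here is the promised sketch – a declarative workflow definition a visual designer could generate and round-trip. Everything here is hypothetical and illustrative, not an existing PLM API:

```typescript
// Hypothetical sketch of a declarative workflow definition.
// All names (Activity, WorkflowContext, event strings) are illustrative.

interface WorkflowContext {
  data: Record<string, unknown>;   // payload passed between activities
  log: (msg: string) => void;      // debugging / data dump hook (item 6)
}

interface Activity {
  id: string;
  trigger: string;                 // system event, e.g. "document.released" (item 2)
  next: string[];                  // ids of downstream activities (item 1)
  onFailure?: {                    // failure programming (items 4 and 5)
    notify: string;
    action: "retry" | "escalate" | "abort";
  };
  script?: (ctx: WorkflowContext) => void; // script hook per node (item 6)
}

const ecoApproval: Activity[] = [
  {
    id: "submit-eco",
    trigger: "document.released",
    next: ["engineering-review"],
    script: (ctx) => ctx.log(`ECO submitted: ${ctx.data["ecoId"]}`),
  },
  {
    id: "engineering-review",
    trigger: "task.assigned",
    next: [],
    // Predefined processing rule: escalate if the reviewer is absent.
    onFailure: { notify: "eng-manager", action: "escalate" },
  },
];
```

A definition like this could also be run against dummy data by the test facility (item 3) before it ever touches real parts and documents.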

What is my conclusion? Well, this is my dream list. If I missed something, please don’t hesitate to add to it. I believe it is possible to build an easy-to-use workflow system that can be easily plugged into any PLM system. Just my thoughts…

Best, Oleg


CAD: Engineering Bundles vs. Granular Apps?

August 7, 2014


Packages, bundles, product suites, integrated environments. I’m sure you are familiar with these names. The debates about best-of-breed solutions vs. single-vendor integrated suites go a long way back in the history of CAD and PLM. Some companies are ready for a functional trade-off because they are afraid of additional integration cost. For other companies, performance and functionality are absolutely essential to innovate.

Service-oriented architecture and web technologies are bringing a sense of granularity into the application and product world. In my view, the Develop3D article – Why granularity is going to rock the future – brings a very important perspective on the future of products. Al Dean speaks about granularity in data management. The complexity of products and data is growing. More people need to work collaboratively on the same information (what was bundled before as a single CAD file). Here is my favorite quote:

When you’re working on a system that’s remotely located on a server, whether that’s over your internal network or across the wider web, you’ll need to manage and exchange finite packets of information, features, sketch entities and such. The rise of collaborative working systems, such as Catia V6, will mean that users are working on the same data, in parallel, at the same time. If not at the same time, there will be times when design changes, down to feature and maybe sub-feature level, will need to be managed and rationalised. To do that, you need to manage and keep track of those individual parcels of data and packets of change. That’s going to require a level of granularity that’s way beyond what most data management systems are currently capable of.
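To make the idea of granular "parcels of data" more concrete, here is a minimal sketch of what a feature-level change packet might look like. This is purely illustrative – all names are hypothetical, not taken from any existing system:

```typescript
// Hypothetical sketch: a feature-level change "packet" that a granular
// data management system could track and merge, instead of whole-file versions.

interface ChangePacket {
  id: string;                       // unique id of this parcel of change
  target: {
    model: string;                  // e.g. "bracket-assembly"
    feature: string;                // e.g. "hole-pattern-3"
    subFeature?: string;            // optional finer granularity
  };
  author: string;
  baseRevision: string;             // revision the change was made against
  delta: Record<string, unknown>;   // changed parameters, e.g. { diameter: 6.5 }
}

// Two users editing different features in parallel produce independent
// packets that merge cleanly; packets touching the same feature need to be
// rationalised at feature level, not file level.
function needsRationalisation(a: ChangePacket, b: ChangePacket): boolean {
  return (
    a.target.model === b.target.model &&
    a.target.feature === b.target.feature
  );
}
```

The point is exactly what Al Dean describes: version and merge the packets of change, not the monolithic CAD file.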

Last year I watched a Tech4PD video capturing a debate between well-known PLM pundits Jim Brown and Chad Jackson – CAD Granularity vs. Integrated Suites. Navigate here to watch the recording. I voted for granularity. It was well captured by Chad Jackson’s statement. Here is the passage:

Granular CAD applications enable many roles in the enterprise, expanding the use of the 3D asset company-wide. Granular apps are better at enabling individual roles.

The latest blog post on Lifecycle Insight (again by Chad Jackson) – No More Excuses: It’s Time to Merge MCAD and ECAD – caught my attention with something I see as opposite to the principles of granularity. Chad argues for the need to bundle and unite the functionality of MCAD and ECAD applications. I found his conclusion slightly at odds with the previously introduced concept of "granular CAD applications":

Why are there two separate toolsets at all? And that’s where, despite the lack of enthusiasm and interest in the topic, I think there is potential for disruption and innovation. There shouldn’t be two toolsets. You should be able to conduct mechanical and electrical design in a single CAD application…. Call it Hardware CAD (HCAD). Call it Electro-Mechanical CAD (EMCAD). I don’t care. But don’t tell me such an offering wouldn’t be intriguing. In my eyes, there is no reason that a combined MCAD-ECAD application shouldn’t be available. Large existing software providers have their reasons for inaction. But that means there is a ripe opportunity for disruption from smaller companies.

I want to elaborate more on the last point, related to disruption and innovation. I explained my point of view in a blog post last year – The Future Unbundling Strategies in CAD/PLM. I want to repeat some of my assertions from last year:

1. CAD and PLM is too big to sustain as one big aggregated solution provided by a single vendor. This is a diversified, many-sided space that needs to be covered by multiple solutions, features and vendors.

2. Vendors are never positioned well enough to see exactly what problem customers want to solve, especially when it comes to large manufacturing companies and complicated supply chain eco-systems. That’s why armies of consulting services, as well as diversified products, must be applied to provide a final solution.

3. Customers often don’t know what problem to solve. In most situations, product development is a complex problem that requires a team of people to work on it. In addition, large organizations are involved in politics and confrontation over the usage of different enterprise software and tools.

What is my conclusion? I see very strong potential in unbundling existing large product suites. Take a piece of functionality, re-invent it, and provide bigger value to the customer. Cloud technologies and a future focus on data will be imperative to making it successful. Vendors’ focus is shifting towards services. It is about how to satisfy customers every day. Selling expensive bundles may become a thing of the past. Just my thoughts…

Best, Oleg


Will public clouds help enterprises crunch engineering data?

August 6, 2014


The scale and complexity of data is growing tremendously these days. If you go back 20 years, the challenge for PDM/PLM companies was how to manage revisions of CAD files. Now we have much more data coming into the engineering department: data about simulations and analysis, information about the supply chain, online catalog parts and lots of other things. Product requirements have been transformed from a simple Word file into complex data with information about customers and their needs. Companies are starting to capture information about how customers are using products. Sensors and other monitoring systems are everywhere. The ability to monitor products in real life creates additional opportunities – to fix problems and optimize design and manufacturing.

Here is the problem… Despite the strong trend towards cheaper computing resources, when it comes to applying brute computing force, it still doesn’t come for free. Services like Amazon S3 are relatively cheap. However, if you want to crunch, analyze and/or process large sets of data, you will need to pay. Another aspect is performance. People expect software to work at the speed of their thinking. Imagine you want to produce design alternatives for your future product. In many situations, waiting a few hours won’t be acceptable. It will distract users, and in the end they won’t use such a system at all.
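To illustrate the performance side, here is a minimal sketch of the kind of fan-out that public cloud compute makes possible – evaluating many design alternatives in parallel instead of one after another. The endpoint and all names are hypothetical:

```typescript
// Hypothetical sketch: fan out the evaluation of design alternatives to
// stateless cloud workers. With enough workers, wall-clock time drops from
// N * t to roughly t – the difference between hours and minutes.

interface Alternative {
  id: string;
  parameters: Record<string, number>; // e.g. { wallThickness: 2.5 }
}

interface Result {
  id: string;
  score: number; // e.g. weight, cost, or stress margin
}

async function evaluateAll(alternatives: Alternative[]): Promise<Result[]> {
  const jobs = alternatives.map(async (alt): Promise<Result> => {
    // POST to a hypothetical compute endpoint, one worker per alternative.
    const resp = await fetch("https://compute.example.com/evaluate", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(alt),
    });
    return (await resp.json()) as Result;
  });
  return Promise.all(jobs); // all alternatives are crunched concurrently
}
```

The catch, of course, is that every one of those parallel workers is metered – which is exactly the cost question above.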

The Manufacturing Leadership article Google’s Big Data IoT Play For Manufacturing speaks exactly about that. What if the power of web giants like Google could be used to process engineering and manufacturing data? I found the explanation provided by Tom Howe, Google’s senior enterprise consultant for manufacturing, quite interesting. Here is the passage explaining Google’s approach:

Google’s approach, said Howe, is to focus on three key enabling platforms for the future: 1/ Cloud networks that are global, scalable and pervasive; 2/ Analytics and collection tools that allow companies to get answers to big data questions in 10 minutes, not 10 days; 3/ And a team of experts that understands what questions to ask and how to extract meaningful results from a deluge of data. At Google, he explained, there are analytics teams assigned to every functional area of the company. “There’s no such thing as a gut decision at Google,” said Howe.

It sounds to me like a viable approach. However, it made me think about what would make Google and similar holders of computing power sell it to enterprise companies. Google’s biggest value is not in selling computing resources. Google’s business is selling ads… based on data. My hunch is there are two potential reasons for Google to support manufacturing data initiatives – the potential to develop a Google platform for manufacturing apps, and the value of data. The first one is straightforward – Google wants more companies in its eco-system. I find the second one more interesting. What if manufacturing companies and Google found a way to glean insight from engineering data that is useful for their businesses? Or even more – improving their core business.

What is my conclusion? I’m sure that in the future, data will become the next oil. The value of getting access to the data can be huge. The challenge of getting that access is significant. Companies won’t simply allow Google, or PLM companies, to use their data. Companies are very concerned about IP protection and security. Balancing data access, value proposition and the gleaning of insight and additional information from data can be an interesting play. For all parties involved… Just my thoughts…

Best, Oleg

Photo courtesy of Google Inc.


The end of single PLM database architecture is coming

August 5, 2014


The complexity of PLM implementations is growing. We have more data to manage. We need to process information faster. In addition, cloud solutions are changing the underlying technological landscape. PLM vendors are not building software to be distributed on CD-ROMs and installed by IT on corporate servers anymore. Vendors are moving towards different types of cloud (private and public) and selling subscriptions (not perpetual licenses). For vendors, it means operating data centers and optimizing data flow, cost and maintenance.

How should future cloud architecture be implemented? This question is coming into focus and, obviously, raising lots of debate. The InfoWorld cloud computing article The right cloud for the job: multi-cloud database processing speaks about how cloud computing is influencing the core of every PDM and PLM system – database technology. The main message is the move towards distributed database architecture. What does it mean? I’m sure you are familiar with the MapReduce approach. Simply put, the ability of cloud infrastructure to bring up multiple servers and run parallel queries is real these days. The following passage speaks about how to optimize data processing workloads by leveraging cloud infrastructure:

In the emerging multicloud approach, the data-processing workloads run on the cloud services that best match the needs of the workload. That current push toward multicloud architectures provides the ability to place workloads on the public or private cloud services that best fit the needs of the workloads. This also provides the ability to run the workload on the cloud service that is most cost-efficient.

For example, when processing a query, the client that launches the database query may reside on a managed service provider. However, it may make the request to many server instances on the Amazon Web Services public cloud service. It could also manage a transactional database on the Microsoft Azure cloud. Moreover, it could store the results of the database request on a local OpenStack private cloud. You get the idea.
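As a rough illustration of the "multiple servers, parallel queries" idea, here is a minimal MapReduce-style sketch: the same query is scattered to several database shards (possibly on different clouds) and the partial results are gathered and merged. All endpoints and names are hypothetical:

```typescript
// Hypothetical sketch: scatter-gather over database shards that may live
// on different cloud services, then reduce the partial results locally.

interface ShardResult {
  partCount: number; // parts matching the query on one shard
}

const shards = [
  "https://aws.example.com/query",     // public cloud shard (hypothetical)
  "https://azure.example.com/query",   // transactional shard (hypothetical)
  "https://private.example.com/query", // private OpenStack shard (hypothetical)
];

async function countParts(where: string): Promise<number> {
  // Map: run the same query on every shard in parallel.
  const partials = await Promise.all(
    shards.map(async (url): Promise<ShardResult> => {
      const resp = await fetch(url, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ where }),
      });
      return (await resp.json()) as ShardResult;
    })
  );
  // Reduce: merge the partial results into the final answer.
  return partials.reduce((sum, r) => sum + r.partCount, 0);
}
```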

However, not so fast and not so simple. What works for web giants might not work for enterprise data management solutions. The absolute majority of PLM systems leverage a single-RDBMS architecture. This is a fundamental underlying architectural approach. Most of these solutions use a "scale up" architecture to achieve their data capacity and performance levels. Horizontal scaling of PLM solutions today is mostly limited to leveraging database replication technology. PLM implementations are mission critical for many companies. Changing that would not be simple.

So, why might PLM vendors consider making a change and thinking about new database architectures? I can see a few reasons – the amount of data is growing; companies are getting even more distributed; the design anywhere, build anywhere philosophy is coming into real life. The cost of infrastructure and data services is becoming very important. At the same time, for all companies performance is an absolute imperative – slow enterprise data management solutions are a thing of the past. Optimizing workloads and data processing is an opportunity for large PLM vendors as well as small startups.

What is my conclusion? Today, large PLM implementations are signaling that they are reaching technological and product limits. It means existing platforms are approaching a peak of complexity, scale and cost. To make the next leap, PLM vendors will have to re-think the underlying architecture, manage data differently and optimize the cost of infrastructure. Data management architecture is the first thing to be reconsidered. Which means the end of existing "single database" architectures. Just my thoughts…

Best, Oleg

