How to transform old CAD-PDM integration paradigms

January 23, 2015

cloud-pdm-paradigm

Integration of CAD and PDM is a field with a long history of battles, innovation and failures over the last 15-20 years. You can hardly overstate the importance of integration between CAD and data management tools. For some time in the past, CAD and PDM were separate systems. Engineers had to switch from CAD to PDM to perform specific tasks related to data management functions. Embedding PDM tools inside CAD was one of the great ideas of the 1990s, and it significantly improved what we call today "user experience".

However, the complexity of data management interfaces was always something that made engineers uncomfortable. Another innovative approach introduced in the field of CAD-PDM integration was to embed PDM tools into the File Explorer user interface. One of the companies that did it back in the 2000s was Conisio (later SolidWorks Enterprise PDM). The idea got a lot of traction and allowed engineers to work with a familiar file-based interface while in fact using PDM tools.

People are hard. Especially when it comes to adopting new paradigms. The Dassault Systèmes blog post SOLIDWORKS and PLM: No Fear Required brings an interesting perspective on the integration between ENOVIA PLM and SolidWorks.

3DEXPERIENCE platform offers a fresh approach to this problem. Recognizing that our existing Enterprise PDM solution has been greatly accepted by the design community, the same R&D group has designed a new product that offers the usability of EPDM but actually stores the data in a broader and more capable PLM solution. The result is the SOLIDWORKS Collaborative Innovation Connector, a product that works and acts much like a workgroup solution would but gives the designer just enough access to the PLM functionality to innovate their processes beyond what they can do today in a PDM environment.

The following video is one confirmation of that. You can see how the traditional ENOVIA PLM web interface is morphing to provide a File Explorer user experience for SolidWorks users. What I found especially interesting is that you can hardly distinguish between ENOVIA PLM and SolidWorks EPDM, which offer a very similar user experience in both the file explorer and the SolidWorks UI.

The video about the ENOVIA SolidWorks integration made me think about what the new PDM paradigm can be as we move forward into the cloud future. I'd like to bring a few references to new products and companies in that space: GrabCAD, Autodesk Fusion 360 and Onshape.

Fusion 360

At the recent Autodesk University in Las Vegas, Autodesk CEO Carl Bass presented the evolution of Fusion 360 and its connection with cloud services such as Autodesk A360. According to Carl Bass, you can think of Fusion as a GitHub for engineers. Combined with A360, Fusion is a full digital re-imagination of how designers and engineers will collaborate: online and social. What is important to understand is that A360 provides the data and collaboration backbone for Fusion 360, so engineers do not face file-based operations like in traditional desktop CAD tools.

carl-bass-fusion-360-au2014-2

Onshape

Onshape is a new company re-imagining CAD for the Google era. A large part of the Onshape founding team comes from SolidWorks. Last week, Onshape started to blog. One of the things I captured from the Onshape blog is their claim to rethink the role and appearance of PDM for cloud CAD. You can read some of my thoughts here: Future CAD won't require PDM. Here is a quote from the Onshape blog:

on-shape-world-changed

"We tried with traditional PDM, but fundamentally the architecture of copying files around, to and from servers and desktops, is just not a good basis for solving version control and collaboration problems. We think we have a better way to solve the problems, and no PDM system is needed." Mac, Windows, phone or tablet. No PDM system needed. The files stay in one place. A different UI look. Now those sound like interesting and wonderful things. We'll continue to anxiously anticipate what they have planned and what you have to say about it.

GrabCAD

GrabCAD Workbench is another system introducing a different experience by merging cloud and file-based data management operations. GrabCAD didn't develop a CAD system, as some CAD industry insiders predicted it would. However, GrabCAD Workbench is a cloud PDM system that can remind you of some elements of Dropbox, combined with a CAD viewer and the ability to control file revisions.

grabcad-workbench

What is my conclusion? Existing paradigms are hard to change. In my view, engineers are among the most innovative groups of people. However, when it comes to their own tools, engineers are very conservative. You can easily expect the following vision of data management from an engineer: "I want to work with my designs (files); please leave me alone and stop selling me PDM tools". However, here is the thing: collaboration can make a difference. The integration of data management and collaboration can provide a significant advantage to engineers in a modern mobile and distributed environment. This is a key thing, in my view. Cloud and mobile collaboration will change the CAD/PDM integration paradigm in the future. Just my thoughts…

Best, Oleg


Why should manual PLM data modeling be a thing of the past?

January 22, 2015

plm-data-modeling

One of the most complicated parts of any PLM implementation is data modeling. Depending on the PLM vendor, product and technology, the process of data modeling can be called differently. But fundamentally, you can see it in any PLM implementation. This is a process which creates an information model of the products and processes in a specific company. Getting it done is not simple, and it requires a lot of preparation work, which is usually part of implementation services. Even more, once created, the data model needs to be extended with new data elements and features.
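To make the idea concrete, here is a minimal sketch of the kind of flexible data model PLM tools let implementers configure: item types and their attributes are data, not hard-coded classes, so the model can be extended later without changing code. The type and attribute names below are invented for illustration, not taken from any specific PLM product.

```python
# A sketch of a dynamic data model: item types are defined at runtime.
class ItemType:
    def __init__(self, name, attributes):
        self.name = name
        self.attributes = set(attributes)

    def new_item(self, **values):
        # reject attributes the model does not know about
        unknown = set(values) - self.attributes
        if unknown:
            raise ValueError(f"unknown attributes: {unknown}")
        return {"type": self.name, **values}

# An implementer "models" a company-specific Part type at runtime...
part_type = ItemType("Part", ["number", "revision", "material"])
item = part_type.new_item(number="P-1001", revision="A", material="steel")

# ...and can later extend the model with new data elements.
part_type.attributes.add("weight_kg")
```

Every PLM platform has its own, far richer version of this mechanism (relationships, lifecycle states, access rules), but configuring it by hand is exactly the manual work discussed here.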

Is there a better way? How are other industries and products solving similar problems of data modeling and data curation? It made me think about the web and internet as a huge social and information system. How are data models managed on the web? How are large web companies solving these problems?

One example of creating a model for data on the web was Freebase. Google acquired Freebase and used it as one of the data sources for the Google Knowledge Graph. You can catch up on my post about why PLM vendors should learn about the Google Knowledge Graph. Another attempt to create a model for web data is Schema.org, which is very promising in my view. Here is my earlier post about Schema.org: The future of Part Numbers and Unique Identification. Both are examples of curated data models for web data. The interesting part of Schema.org is that several web search vendors have agreed on some elements of the data model, as well as on how to curate and manage Schema.org definitions.
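To show what such a shared data model looks like in practice, here is roughly how a part would be described using Schema.org terms in JSON-LD. The property names (name, mpn, manufacturer) are real Schema.org vocabulary; the product itself is invented for this sketch.

```python
import json

# A minimal, illustrative Schema.org "Product" description in JSON-LD.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Hydraulic Pump HP-200",
    "mpn": "HP-200-A",  # manufacturer part number
    "manufacturer": {
        "@type": "Organization",
        "name": "Acme Manufacturing",
    },
}

print(json.dumps(product, indent=2))
```

Because search vendors agree on this vocabulary, any site publishing such markup can be understood by all of them; that shared agreement is exactly what is missing between PLM data models today.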

However, it looks like the manual curation of the Google Knowledge Graph and Schema.org is not an approach that makes web companies happy or lets them leapfrog in the future. Manual work is expensive and time consuming. At least some people are thinking about that. The Dataversity article "Opinion: Nova Spivack on a New Era in Semantic Web History" speaks about some interesting opportunities that can open a new page in the way data is captured and modeled. He speaks about possible future trajectories of deep learning, data models and relationship detection. It can extend Schema.org, especially the part related to automatically generated data models and classifications. Here is my favorite passage:

At some point in the future, when Deep Learning not only matures but the cost of computing is far cheaper than it is today, it might make sense to apply Deep Learning to build classifiers that recognize all of the core concepts that make up human consensus reality. But discovering and classifying how these concepts relate will still be difficult, unless systems that can learn about relationships with the subtly of humans become possible.

Is it possible to apply Deep Learning to relationship detection and classification? Probably yes, but this will likely be a second phase after Deep Learning is first broadly applied to entity classification. But ultimately I don’t see any technical reason why a combination of the Knowledge Graph, Knowledge Vault, and new Deep Learning capabilities, couldn’t be applied to automatically generating and curating the world’s knowledge graph to a level of richness that will resemble the original vision of the Semantic Web. But this will probably take two or three decades.

This article made me think about the fact that manual data curation for Freebase and Schema.org is a very similar process to what many PLM implementers are doing when applying specific data and process models using PLM tools. Yes, PLM data modeling usually happens for a specific manufacturing company. At the same time, PLM service providers are re-using elements of these models. Also, companies are interconnected and working together. The problem of communication between companies is painful and still requires some level of agreement between manufacturing companies and suppliers.

What is my conclusion? Data modeling is an interesting problem. For years, PLM vendors put a significant focus on making flexible tools that help implementers create data and process models. Flexibility and dynamic data models are in high demand from all customers, and this is one of the most important technological elements of every PLM platform today. New forms of computing and technology can come and automate this process. It can help generate data models automatically by capturing data about what a company does and the processes inside it. Sounds like a dream? Maybe… But manual curation is not efficient data modeling. The last 30 years of PDM/PLM experience is a good confirmation of that. Finding a better way to apply automatic data capturing and configuration for PLM can be an interesting opportunity. Just my thoughts…

Best, Oleg

photo credit: tec_estromberg via photopin cc


Cloud PDM: stop controlling data and check shadow IT practices

January 21, 2015

cloudpdm-shadow

Customer interest in cloud PDM solutions is growing. I guess there are multiple factors here: awareness of cloud efficiency and transparency, less concern about cloud security, and the improved speed and stability of internet connections. If you are not following my blog, you can catch up on my older blog articles about cloud PDM: Cloud PDM ban lifted. What next?; Cloud PDM hack with Google Drive and other tools; Cloud can make file check-in and check-out obsolete. The confluence of new technologies around cloud, web, mobile and global manufacturing is creating demand for cloud (or web-based) solutions helping distributed design teams.

So, where is the challenge for cloud PDM? My hunch: the biggest one is how to sell cloud PDM to manufacturing companies. I can divide all customers into two groups: larger manufacturing companies that have already implemented PDM solutions, and smaller manufacturing firms that are still managing CAD design with folders, FTP and Dropbox accounts.

Analysts, researchers and PDM marketing pundits are trying to convince companies that cloud PDM can become a great enabler of collaboration and that leaving CAD data unmanaged can bring even greater risk to the organization. There is nothing wrong with that… PDM was built around the idea of taking control over data. However, the idea of "control" is not something engineers like. Ed Lopategui speaks about engineers and control in his latest blog post, The day the strength of PDM failed. Here is a passage I liked:

The second reason, which is not so legitimate, is a loss of control. The reason so many engineers pine about the days of paper-based PDM in document control departments (or instead nothing at all) is that world could be circumvented in a pinch. It was flawed because it was run by humans, and consequently also replete with errors. Replaced with immutable and uncaring software, engineers working in groups nonetheless become irritated because they can’t just do whatever they want. You see this very conflict happening with regard to source control in software development circles. The order needed to manage a complex product necessarily makes manipulating pieces of that engineering more cumbersome. It’s one thing to be creating some widget in a freelance environment, it’s another matter entirely when that end product needs traceable configuration for a serialized certification basis. And that will happen regardless of how the software operates.

Here is the thing… Maybe cloud PDM should stop worrying about controlling data, think more about how to bring comfort to engineers, and stop irritating users with complex lifecycle scenarios? It made me think about the practice known as "shadow IT". For the last few years, shadow IT and cloud services have had a lot in common. Don't think of shadow IT as a bad thing. Think about the innovation shadow IT can bring to organizations.

The Forbes article "Is shadow IT a runaway train or an innovation engine?" speaks about how shadow IT can inject innovative thinking into an organization. This is my favorite passage:

As we reported last month, one corporate employee survey found that 24% admit they have purchased and/or deployed a cloud application — such as Salesforce.com, Concur, Workday, DropBox, or DocuSign. One in five even use these services without the knowledge of their IT departments.

The rise of shadow IT may actually inject a healthy dose of innovative thinking into organizations, at a time they need it most. The ability to test new approaches to business problems, and to run with new ideas, is vital to employees at all levels. If they are encumbered by the need for permissions, or for budget approvals to get to the technology they need, things will get mired down. Plus, shadow IT applications are often far cheaper than attempting to build or purchase similar capabilities through IT.

What is my conclusion? Stop controlling data and bring the freedom of design work back to engineers. I understand it is easy to say but very hard to implement. Controlling data is a very fundamental PDM behavior. Re-imagining it requires some innovative thinking. It is also related to how to stop asking engineers to check in, check out and copy files between different locations. Maybe this is the innovation the folks at Onshape are coming with? I don't know. In my view, cloud PDM tools have the opportunity to change the way engineers work with CAD data. Many new services became successful by providing cloud applications that make existing working practices much easier than before. Just my thoughts…

Best, Oleg
photo credit: Dean Hochman via photopin cc


Can BOX become a platform for PLM?

January 20, 2015

box-collaboration

Platforms are a topic which comes up quite often in discussions about the future of PLM. CIMdata recently came up with the topic of "platformization" in PLM. You can catch up on the discussion here: A CIMdata dossier: PLM platformization. I can probably divide all existing PLM platforms into two groups: 2D/3D design platforms and object database modeling platforms. Last year, I charted some possible options for the foundation of a future PLM platform: systems engineering, 2D/3D services, product development standards, and new database technologies. From another standpoint, the debates about future PLM platforms often raise the question of a single vs. federated platform for PLM.

New technological trends and customer demands can bring new platforms into the PLM world. One of them is cloud storage. I touched on the cloud storage topic in my article CAD companies and cloud storage strategies. One of the points was related to the longevity of the "cloud storage" business. Cloud companies want to store your data. It gives them an opportunity to understand your business better. However, the prediction is that the cost of cloud storage will eventually come to zero. That leaves cloud companies with the need to develop solutions that elevate productivity and improve collaboration and document creation. This is where PLM comes in as a future platform for product innovation.

BOX is a company located at the intersection of cloud storage and the enterprise business. My attention was caught by the Business Insider article In One Slide, Box Explains What Everybody's Getting Wrong About The Company. Here is the slide:

box-enterprise-platform

Here is an interesting passage and conclusion from the article:

In an interview with Business Insider, Box CEO Aaron Levie said he knew storage business was going to turn into a commodity business back when he first started the company. Instead, he said he’s creating a platform business, where more value is added on top of things like storage, computing, and security. “It’s all about going into the top 8 to 10 industries and finding where are companies reimagining their business, where are they going digital, where are they transforming their business model, and how does Box act as a platform that could accelerate that push into the future,” he said. If the critics are right, Box is doomed. If Box is right, it has a chance at being a valuable enterprise company along the lines of Salesforce.

Looking at customers, partners and, especially, the BOX enterprise content collaboration platform made me think about an interesting intersection between product lifecycle and the BOX business. Of course, BOX is not in the business of design and engineering software. However, enterprise collaboration has a significant overlap with what most PLM platforms provide: metadata, security, workflow, collaboration, content search. These are topics that are always present in PLM. It seems to me the current focus of BOX is outside of manufacturing companies. However, maybe future BOX growth will take it towards manufacturing enterprises.

What is my conclusion? I don't think BOX is focusing on manufacturing companies today. However, elements of the BOX platform make perfect sense when you think about product lifecycle collaboration. What is especially interesting is content collaboration at enterprise scale. This is a topic most PLM companies are struggling with. Existing PLM platforms have good representation in the engineering domain, but lack broad enterprise adoption. This is where future competition between PLM vendors and BOX (or similar companies) can occur. On the other side, BOX can become a platform to take PLM collaboration forward in enterprise companies. Just my thoughts…

Best, Oleg

Picture credits box.com


Will search replace the engineer's brain in the future?

January 17, 2015

engineers-plm-brain

Computers are changing the way we work. That is probably too broad a statement. But since today is Friday afternoon, it should be fine :). I want to take a bit of a futuristic perspective today. Google, the internet and computing are good reasons why our everyday habits are different from what we had 10 years ago. Back in the beginning of the 2000s, we bought paper maps before going on vacation and kept paper books with the phone numbers of the people we needed. Look how different it is now. Maybe we still need to make a hotel reservation before a trip, but most of the things we do can be achieved online via the internet and mobile devices.

A month ago, I posted about connecting digital and physical entities. I was inspired by Jeff Kowalski's presentation at AU 2014. You can get a transcript and video by navigating to the following link. The idea of machine learning and "training" a computer brain to find an optimal design is inspiring. The following passage from Kowalski's presentation is key in my view:

…we’re working on ways to better understand and navigate existing solutions that might be relevant to your next design project. Using machine learning algorithms, we can now discover patterns inherent in huge collections of millions of 3D models. In short, we can now discover and expose the content and context of all the current designs, for all the next designs. Taxonomies are based on organizing things with shared characteristics. But they don’t really concern themselves with the relationships those things have with other types of things — something we could call context. Adding context reveals not only what things are, but also expresses what they’re for, what they do, and how they work.

Nature explores all of the solutions that optimize performance for a given environment — what we call evolution. We need to do the same thing with our designs. But first we have to stop "telling the computer what to do," and instead, start "telling the computer what we want to achieve." With Generative Design, by giving the computer a set of parameters that express your overall goals, the system will use algorithms to explore all of the best possible permutations of a solution through successive generations, until the best one is found.
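A toy sketch can show the loop the passage describes: instead of telling the computer what to do, we state a goal and let successive generations of candidate designs converge on it. Here the "design" is just two numbers (a rectangular cross-section whose area should hit a target); a real generative design system would explore geometry, materials and loads. Everything below is an illustrative simplification, not any vendor's algorithm.

```python
import random

random.seed(42)  # fixed seed so the run is repeatable

TARGET_AREA = 12.0  # the goal we express, instead of step-by-step commands

def fitness(design):
    width, height = design
    return -abs(width * height - TARGET_AREA)  # closer to target is better

# initial generation of random candidate designs
population = [(random.uniform(1, 10), random.uniform(1, 10)) for _ in range(20)]

for generation in range(50):
    population.sort(key=fitness, reverse=True)
    survivors = population[:5]  # keep the best candidates
    children = [
        (w + random.gauss(0, 0.2), h + random.gauss(0, 0.2))  # mutate them
        for w, h in survivors
        for _ in range(3)
    ]
    population = survivors + children  # next generation

best = max(population, key=fitness)
```

The loop is the essence of "telling the computer what we want to achieve": the goal lives in the fitness function, and the search does the rest.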

Another time, I’ve was recently thinking about artificial intelligence, machine learning and self-organized systems was my article – How PLM can build itself using AI technologies. The idea of The Grid that allows to self organize website based on a set of input parameters and content learning is interesting. It made me think about future PLM system that self-define system behaviors based on the capturing of information and processes from a manufacturing company.

The article Google search will be your brain puts another interesting perspective on the evolution of computers and information systems. Take some time over the weekend to read the article. The story of neural nets is fascinating, and if you think about the potential to train a net with design knowledge, it could help capture requirements and design commands in the future. Here is an interesting passage from the article explaining how neural nets work:

Neural nets are modeled on the way biological brains learn. When you attempt a new task, a certain set of neurons will fire. You observe the results, and in subsequent trials your brain uses feedback to adjust which neurons get activated. Over time, the connections between some pairs of neurons grow stronger and other links weaken, laying the foundation of a memory.

A neural net essentially replicates this process in code. But instead of duplicating the dazzlingly complex tangle of neurons in a human brain, a neural net, which is much smaller, has its neurons organized neatly into layers. In the first layer (or first few layers) are feature detectors, a computational version of the human senses. When a computer feeds input into a neural net—say, a database of images, sounds or text files—the system learns what those files are by detecting the presence or absence of what it determines as key features in them.
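To make the mechanics concrete, here is a minimal sketch of the feedback loop the passage describes: a single artificial neuron whose connection strengths (weights) are adjusted by trial feedback until it learns the logical AND function. This illustrates the learning principle only; the deep, layered nets the article discusses are vastly larger versions of the same idea.

```python
import random

random.seed(0)  # fixed seed so the run is repeatable

# random initial "connection strengths" for a neuron with two inputs
weights = [random.uniform(-1, 1) for _ in range(2)]
bias = random.uniform(-1, 1)
LEARNING_RATE = 0.1

def fire(inputs):
    # the neuron fires (outputs 1) if its weighted input is positive
    activation = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if activation > 0 else 0

# the task to learn: logical AND
training_data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]

for _ in range(100):  # repeated trials, as in the passage above
    for inputs, target in training_data:
        error = target - fire(inputs)  # feedback: observed vs. desired
        for i, x in enumerate(inputs):
            weights[i] += LEARNING_RATE * error * x  # strengthen or weaken links
        bias += LEARNING_RATE * error

# after training, the neuron reproduces AND
results = [fire(inputs) for inputs, _ in training_data]
```

This is the classic perceptron learning rule: connections that contribute to wrong answers are weakened, those that contribute to right ones are strengthened, exactly the feedback dynamic the quoted passage describes.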

So, who knows… maybe in the not-so-distant future, CAD and PLM systems will provide a search-based experience helping engineers design and manufacture in a completely different way.

What is my conclusion? While it still sounds like a dream, I can see some potential in making design work look like a search for an optimal solution under specific constraints and parameters. A well-trained algorithm can do the work in the future. Just thinking about it fires so many questions: how long will it take to train the net, what will be the role of engineers in future design, and many others. But these are just my thoughts… Maybe it will inspire you too. Have a great weekend!

Best, Oleg


Top 5 PLM trends to watch in 2015

January 15, 2015

plm-trends-2015

Holidays are over, and it was a good time to think about what you can expect in engineering and manufacturing software related to PLM in the coming year. You probably had a chance to listen to my 2015 PLM predictions podcast a few months ago. If you missed it, here is the link. Today I want to give a more expanded list of product lifecycle management trends to watch in 2015.

1- Greater complexity of cloud PLM implementations

Cloud adoption has been growing in the enterprise for the last few years, and it is getting more mature. PLM vendors are making steps in the cloud direction too. Companies are moving from marketing and research to the "nuts and bolts" of implementations. The switch to the cloud is not as simple as some marketing pundits predicted. It is more than just moving servers from your data center to somebody else's place. The complexity of implementation, maintenance and operation will emerge and will drive the future difference between "born in the cloud" solutions and existing PLM platforms migrating to the cloud.

2- The demand to manage complex product information will be growing

Products are getting more complex. You can see it around you. A simple IoT gadget such as a door lock can combine mechanical, electrical, electronic and software parts. It introduces a new level of complexity for manufacturers and PLM vendors: how to manage all this information in a consistent way? Bringing together design and bills of materials for every discipline becomes a critical factor in a manufacturing company of any size.

3- A new type of manufacturing company will attract the focus of PLM vendors

The manufacturing landscape is changing. The internet and globalization enable a new type of manufacturing company: smaller, distributed, agile, crowdfunded. It requires a new type of thinking about collaboration, distributed work, digital manufacturing and more. These companies represent a new opportunity and will draw more attention from PLM vendors.

4- Growing interest in mobile enterprise PLM solutions

Mobile went mainstream in many domains. Until now, engineers in manufacturing companies mostly used mobile for email. In 2015, I can see the potential for greater interest in mobile solutions from manufacturing companies. Distributed work and the need for collaboration will drive the demand to make existing enterprise systems more mobile.

5- The demand for big data and analytics in product lifecycle

Data is drawing greater attention these days. I have even heard "data is the new oil". Manufacturing companies will start to recognize the opportunity and think about how to use the piles of data from their enterprise engineering and manufacturing systems to drive analysis and use it for decision making.

What is my conclusion? I think 2015 will be a very interesting year in PLM. Broader adoption of cloud, mobile and big data analytics will drive future transformation in engineering and manufacturing software. The disconnect between old-fashioned enterprise software and new tech vendors will increase. Just my thoughts…

Best, Oleg


How many enterprise PLM systems will survive cloud migration?

January 14, 2015

plm-stairs-to-the-cloud

Cloud adoption is growing. For most existing PLM vendors, it means thinking about how to migrate existing platforms and applications to the cloud. I covered related activities of PLM vendors in my previous articles. Take a look here: PLM cloud options and 2014 SaaS survey. It can give you an entry point to a few more articles. Some vendors, such as Dassault Systèmes, are promising to deliver a full set of cloud options: private, public and hybrid. Aras is partnering with Microsoft Azure, and Siemens PLM is focusing on a diverse set of IaaS options. At the same time, moving an existing platform to the cloud won't be simple. Migrating customers' environments to the cloud will be even more complicated.

My attention was caught by the InfoWorld article Docker's tremendous upside could upset some enterprises. If you are not familiar with what Docker is, navigate to my earlier blog post: Why to ask your cloud PLM vendor about DevOps and Kubernetes. The InfoWorld article speaks about Docker's ability to support application portability and a potential clash between what Docker can provide and the cloud migration strategies developed by enterprises over the last few years. Here is an interesting passage.

With Google, Microsoft, and Amazon Web Services all supporting Docker, your management may feel compelled to take a hard look at it as the right enabling technology. If this means rebooting your existing application migration strategy, perhaps even redoing 50 applications, then so be it. After all, the technology is changing so quickly that enterprises should be allowed to change strategy when new developments arise.

How is this related to what PLM vendors are doing? In my view, it is an additional shakeout for PLM vendors as they learn more about cloud applications, services, and ways to migrate from existing PLM platforms into future "clouds". It is about "how" to make the cloud real, and it will require going down from marketing messages about moving to the cloud into the deep waters of DevOps and services. One of my PLM predictions for 2015 was that software vendors will discover the complexity of cloud PLM migrations. You can listen to my 3 predictions for PLM in 2015 by navigating to the following podcast by SPK and Associates.

What is my conclusion? PLM vendors and enterprise customers will soon discover the complexity of migration to the cloud. It will come through understanding the underlying architectures, complexity of operation, service level commitments and other business and technology topics. Most enterprises are heavily invested in customization of existing PLM platforms, which will add a level of complexity to migration. How many enterprise PLM apps will survive cloud migration? This is a good question to ask in the coming year. Just my thoughts…

Best, Oleg

photo credit: M J M via photopin cc

