PLM Workflow “California Roll” Style

June 29, 2015

plm-workflow-california-roll

Product Lifecycle Management software is obsessed with the idea of workflow. You can see it offered by all PLM vendors. To get a taste of traditional PLM workflows, navigate to my five-year-old blog post – PLM processes: flowcharts vs rule-based – it gives some ideas about how PLM products define processes. The PLM workflow business has also been targeted by newer products in the PLM domain. Autodesk PLM360 put a significant focus on process management. You can read more in Mike Watkins’ blog here. Kenesto, a software outfit founded by Mike Payne, was also relaunched last year with a specific focus on workflow process management – Kenesto PLM platform relaunched.

The idea of workflows didn’t die. It is actually a good idea. But here is the thing – PLM workflows are very hard to implement. Therefore, for many manufacturing companies, PLM workflow is still a dream you can experience in PowerPoint presentations only. At the same time, the idea of workflow brings a way to think about processes in a holistic way: all elements of a PLM workflow (or process) are important – activities, events, approvals, actions, statuses, etc. However, it doesn’t work, or at least doesn’t work in an easy way.

My attention was caught by a blog post by Nir Eyal, which has no obvious connection to PLM. It speaks about the intersection of psychology, technology and business – People Don’t Want Something Truly New, They Want the Familiar Done Differently. I like the California Roll example. Here is my favorite passage from the blog:

Then came the California Roll. While the origin of the famous maki is still contested, its impact is undeniable. The California Roll was made in the USA by combining familiar ingredients in a new way. Rice, avocado, cucumber, sesame seeds, and crab meat — the only ingredient unfamiliar to the average American palate was the barely visible sliver of nori seaweed holding it all together.

The California Roll provided a gateway to discover Japanese cuisine and demand exploded. Over the next few decades sushi restaurants, which were once confined to large coastal cities and almost exclusively served Japanese clientele, suddenly went mainstream. Today, sushi is served in small rural towns, airports, strip malls, and stocked in the deli section of local supermarkets. Americans now consume $2.25 billion of sushi annually.

It made me think about “PLM workflows” done differently. So, what if we take all the right elements of a PLM workflow and combine them in a way that doesn’t hurt companies to think about? A SolidSmack article earlier today speaks about IFTTT – a platform to automate web application tasks. Navigate to the following link – IFTTT introduces the maker channel for makers and hardware developers. I liked the following passage.

Of course, while the ability to automatically automate web application tasks is certainly a very powerful thought, one can only imagine what this might mean as we enter an age of more connected hardware devices in addition to our existing phones, tablets and laptops. Within the last year, the platform has started to integrate its service into a collection of connected home products including Wink devices, Nest Thermostat, SmartThings, WeMo switches and other off-the-shelf products.

The following video can give you an idea of how IFTTT works.

IFTTT is not the only way to automate your web tasks. Navigate here to see alternatives to IFTTT – Zapier, itDuzzit, Workauto, Jitterbit and others.
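To make the idea more concrete, here is a minimal sketch (in Python; the event names, payload fields and actions are hypothetical, not any vendor’s API) of what an IFTTT-style “if this then that” rule could look like when applied to PLM events – small, composable trigger/action pairs instead of a monolithic workflow definition.

# A minimal "if this then that" style rule set for PLM events.
# All event names, fields and actions below are hypothetical examples.

from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Rule:
    trigger: str                        # event name, e.g. "eco.approved"
    condition: Callable[[dict], bool]   # "this" - predicate on the event payload
    action: Callable[[dict], None]      # "that" - what to do when it matches

rules: List[Rule] = [
    Rule(
        trigger="eco.approved",
        condition=lambda e: e.get("severity") == "critical",
        action=lambda e: print(f"Notify manufacturing: ECO {e['id']} approved"),
    ),
    Rule(
        trigger="part.released",
        condition=lambda e: True,
        action=lambda e: print(f"Publish part {e['number']} to the supplier portal"),
    ),
]

def handle_event(name: str, payload: Dict) -> None:
    """Run every matching rule for an incoming PLM event."""
    for rule in rules:
        if rule.trigger == name and rule.condition(payload):
            rule.action(payload)

# Example: an ECO approval event coming from a (hypothetical) PLM webhook.
handle_event("eco.approved", {"id": "ECO-1042", "severity": "critical"})

Each rule is something an end user could compose in a few minutes, which is exactly the “California Roll” appeal of tools like IFTTT compared to a classic workflow designer.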

What is my conclusion? It is time to re-think the way workflow is done for PLM. A more traditional PLM architecture overemphasizes the planning step – organizing data and controlling information. Think about something different – a set of re-usable web services orchestrated by tools that every person can use. All components of a typical PLM workflow are there, but it is user friendly and does not require 3-6 months of planning activities. Is it a dream? Maybe… But to me it sounds like something PLM architects and technologists might think about. Just my thoughts…

Best, Oleg


Cloud CAD/PDM and mass customization future

June 25, 2015

mass-customization-cad-cloud-3d-local-motors

The era of mass production is nearing its end. The demand for mass customization is coming. We can see signs of customizable products everywhere – e-commerce configurators, personalization in the apparel industry, individual shoe design, personalization in medical devices, etc.

At the same time, the opportunity around mass customization is facing challenges in engineering and manufacturing environments. I shared some of my thoughts about PLM, mass customization and BOM vertical integration last year. The problem of managing bills of materials to support manufacturing integration is real. We can improve product customization by improving BOM management and providing a configurable BOM solution integrated with manufacturing systems. However, in many situations, product configuration capabilities are defined at the core of product design – the CAD system.
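To illustrate what a configurable BOM solution means in practice, here is a minimal sketch (in Python; the part numbers and option names are invented for illustration) of resolving a configurable engineering BOM into an order-specific manufacturing BOM based on the customer’s selections.

# Resolving a configurable BOM into an order-specific BOM.
# Part numbers and option names below are invented for illustration only.

# Each line: (part number, quantity, option that selects it; None = always used)
configurable_bom = [
    ("FRAME-001", 1, None),
    ("MOTOR-STD", 1, "motor:standard"),
    ("MOTOR-HP",  1, "motor:high_power"),
    ("SEAT-BLK",  1, "color:black"),
    ("SEAT-RED",  1, "color:red"),
    ("BOLT-M8",   12, None),
]

def resolve_bom(bom, selected_options):
    """Keep common items plus the items matching the customer's selections."""
    selected = set(selected_options)
    return [(pn, qty) for pn, qty, opt in bom if opt is None or opt in selected]

# A customer order with a specific configuration
order_options = ["motor:high_power", "color:red"]
manufacturing_bom = resolve_bom(configurable_bom, order_options)
print(manufacturing_bom)
# [('FRAME-001', 1), ('MOTOR-HP', 1), ('SEAT-RED', 1), ('BOLT-M8', 12)]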

You can also manage product configurations using a CAD system. Most 3D mechanical CAD systems support the ability to create some sort of product variation. But here is the thing – it is very hard to connect CAD product configurations to engineering and manufacturing systems.

My attention was caught by the article Building Adaptable CAD Databases—How and Why written by Chris Loughnane. In a nutshell, it speaks about how to make a traditional CAD design more data-driven. The idea is fascinating and goes far beyond discrete configuration parameters. Here is my favorite passage explaining it.

Adaptable databases. By implementing additional techniques on top of traditional best practices, design intent is able to be so thoroughly baked into an adaptable database that its flexibility is no longer limited to a few discrete parameters. Instead, it’s able to read user-specific scan data and adjust the height, length, width, and surface curvature such that the resulting database is now custom-fit to the user.

It made me think about the potentially mind-blowing future of adaptable CAD models. Imagine a CAD design that can be changed based on customer data scanned using your mobile phone. Whoa… that would be amazing, but I stopped dreaming for a moment… The data integration chain in engineering and manufacturing systems is broken in many ways. CAD design is hardly integrated with PDM databases. Engineering BOMs are not synchronized with manufacturing BOMs, the shop floor and production facilities. Connecting the dots is possible, but it is a very complicated and expensive process.
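To picture what such a scan-driven model could look like in code, here is a tiny sketch (pure illustration, not tied to any CAD API; the parameter names and formulas are invented) where user scan data drives the parameters of a parametric model.

# A toy "adaptable model": parameters are derived from user scan data
# instead of being picked from a few discrete configurations.
# The parameter names and formulas are invented for illustration.

def parameters_from_scan(scan):
    """Map raw scan measurements (mm) to model parameters (mm)."""
    return {
        "length": scan["foot_length"] + 8.0,    # add toe clearance
        "width": scan["foot_width"] * 1.05,     # 5% allowance
        "arch_height": scan["arch_height"],     # custom arch support
    }

def build_model(params):
    """Stand-in for regenerating a parametric CAD model with new values."""
    return (f"insole(length={params['length']:.1f}, "
            f"width={params['width']:.1f}, arch={params['arch_height']:.1f})")

# Scan data captured, for example, by a mobile phone app
scan = {"foot_length": 265.0, "foot_width": 98.0, "arch_height": 22.5}
print(build_model(parameters_from_scan(scan)))
# insole(length=273.0, width=102.9, arch=22.5)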

The industry is discovering cloud CAD systems these days. One of the significant advantages of cloud CAD is the fact that it includes data management functionality. These data management functions give us an option to control design at a very granular level. You can already see examples of how cloud CAD systems are capable of controlling versions and collaboration between people.

Cloud-based CAD systems can leverage data management capabilities to control more design parameters and product features. By doing that, they will enable better integration between design configurations and product features. Today, most of these parameters can hardly be captured. New cloud CAD systems can provide a data-driven environment to control important design parameters and to support data-driven design.

What is my conclusion? Mass customization and personal product development are the future. One of the problems to solve in order to make it happen is integrating engineering and manufacturing environments. The wall between design models and manufacturing product configurations should be removed. The first step in that direction is being taken by cloud CAD / PDM systems today. Just my thoughts…

Best, Oleg

Picture: Strati BAAM 3D-printed car. (c) Photo courtesy of Local Motors.




How Fusion360 and Onshape are solving fundamental CAD collaboration problem

June 24, 2015

3d-puzzle-design-collaboration

For many years, design collaboration and change management have been ultimate requirements for PDM tools. Managing revision history, sharing data within a team and applying changes made by different team members has been a dream for many users. I’ve seen many attempts by PDM developers to solve this problem, with questionable results. The challenge for PDM systems was to connect two islands of data – CAD files and the PDM database. The more successful implementations in this space belong to CAD/PDM bundles provided by a single vendor, where both the CAD file structure and the PDM data are controlled by a single tool.

Cloud CAD technologies are breaking the barrier of existing CAD/PDM bundles by introducing embedded PDM functionality as part of CAD tools. You probably remember my earlier post – Cloud CAD will have to solve PDM problem first. Autodesk Fusion360 and Onshape are two cloud CAD products today that are supposed to turn the design collaboration dream into reality. Earlier in my blog I explained why I think Autodesk and Onshape disagree about cloud technology and focus. There are differences in data management approaches, offline mode support and application technologies used by the two vendors. But, at the same time, it is very interesting to compare how both products solve similar problems.

The Autodesk Fusion360 blog – June product update review by keqingsong – speaks about functionality added to Fusion360 to support distributed design and allow collaboration in distributed teams.

fusion360-distributed-design

The following passage gives a good description of what distributed design means for Fusion360, including the usage of referenced geometry and specific versions inside a project. What is interesting is how Fusion360 holds top-down relationships between different elements of the project.

This release lays the foundation for distributed designs that will allow for future enhancements. In this update, you will able to insert referenced geometry that is part of the same project. Models outside of the project you are working must be moved or copied to your current project before they can be referenced. When a referenced model is inserted into another model, a reference image appears before the name identifying which components are being referenced.

A “component is out-of-date” notification will appear when a referenced part is updated. You will then have a choice to update and receive the change or keep the current version in your model. Simply right click on the referenced component and select “Get latest”. This intended workflow allows for designs that are in production to reference one version of a model while other versions are being created for a future design. If a component is inside a model that is referenced by another model you must update the sub model first, save it, and then go to the top level and update.
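Here is a minimal sketch (my own simplification in Python, not Fusion360 code or its data model) of the mechanics described above – a design keeps a reference pinned to a specific version of another component, and an “out-of-date” check compares it against the latest saved version.

# Simplified model of versioned references between designs.
# This illustrates the concept only; it is not the actual Fusion360 data model.

latest_versions = {"bracket": 5, "housing": 2}   # latest saved version per component

class Reference:
    def __init__(self, component, version):
        self.component = component
        self.version = version          # the version pinned inside the parent design

    def is_out_of_date(self):
        return self.version < latest_versions[self.component]

    def get_latest(self):
        """Equivalent of the user choosing 'Get latest' on the reference."""
        self.version = latest_versions[self.component]

ref = Reference("bracket", version=3)
if ref.is_out_of_date():
    print("Component is out-of-date - update or keep the current version")
    ref.get_latest()
print(ref.version)   # 5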

At the same time, my attention was caught by the Onshape blog – Under the Hood: How Collaboration Works in Onshape by Ilya Baran. It gives you deep insight into how Onshape manages changes by introducing a concept of “microversions”.

onshape-microversions

The following passage explains how the microversion technique applies to a distributed environment with multiple users.

For a given Part Studio, at each point in time, the definition is stored as an eternal, immutable object that we internally call a microversion. Whenever the user changes the Part Studio definition, (e.g., edits an extrude length, renames a part, or drags a sketch), we do not change an existing microversion, but create a new one to represent this new definition. The new microversion stores a reference to the previous (parent) microversion and the actual definition change. In this way, we store the entire evolution of the Document: this is accessible to the user as the Document history, allowing the user to reliably view and restore any prior state of an Onshape Document.

These definition changes are designed to be very robust: a change stored in a microversion is intended to apply to the parent microversion, but could be applied to a different one. For instance, if the change is “change the depth of Extrude 1 to 4 in,” as long as the original feature exists (identified using an internal id, so it can be renamed), this change can be applied. As a result, changes coming simultaneously from multiple collaborators can simply be applied to the latest microversion without interfering with each other. Traditional CAD systems based on saving an ever-changing memory state into files cannot do this, even if run on a remote server or with a PDM system attached: the data itself has to be collaborative.
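Here is a minimal sketch (in Python, my own simplification of the idea rather than Onshape’s implementation) of an append-only microversion chain, where each change references its parent and addresses features by a stable id, so concurrent changes from collaborators can simply be applied on top of the latest state.

# Simplified microversion chain: immutable entries, each pointing to its parent.
# Changes address features by a stable internal id, so they can be re-applied
# on top of a newer state. This is an illustration, not Onshape's actual code.

class Document:
    def __init__(self):
        self.history = [{"parent": None, "change": None}]   # microversion 0
        self.state = {}                                      # feature_id -> properties

    def apply(self, feature_id, prop, value):
        """Record a change as a new microversion and apply it to the state."""
        self.history.append({
            "parent": len(self.history) - 1,
            "change": (feature_id, prop, value),
        })
        self.state.setdefault(feature_id, {})[prop] = value

doc = Document()
# Two collaborators send changes at (almost) the same time;
# both are simply applied on top of the latest microversion.
doc.apply("extrude1", "depth", "4 in")      # user A
doc.apply("sketch2", "name", "Base plate")  # user B
print(doc.state)              # {'extrude1': {'depth': '4 in'}, 'sketch2': {'name': 'Base plate'}}
print(len(doc.history) - 1)   # 2 microversions after the initial one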

What is my conclusion? Fusion360 and Onshape are trying to solve the problem of design collaboration. Both systems leverage a cloud data management backend (Autodesk A360 and Onshape respectively) to create a robust mechanism to manage data, changes and relationships between design components and projects. The advantage of cloud architecture is that all "implementation mechanics" will be hidden from end users, which is absolutely great news. At the same time, it would be interesting to see how robust these approaches are for use cases where Fusion360 and Onshape will have to manage CAD data coming from other CAD systems. Avoiding a "double PDM tax" is a challenge both systems will have to deal with. Just my thoughts…

Best, Oleg

Image courtesy of Stuart Miles at FreeDigitalPhotos.net


“True SaaS” PLM – the devil is in the details

June 23, 2015

cloud-plm-the-devil-in-details

My earlier attempt to compare PLM vendors and cloud services raised many comments online and offline. I want to thank everybody who commented and shared their insight – it helps me build the next version of the comparison table. I can also see the importance of open discussion for the future of the cloud PLM industry.

One of the most debated topics is the definition of SaaS. The questions and opinions keep coming. Can we qualify a particular solution as SaaS? What are the characteristics of a SaaS product from a technological and business perspective? And finally… can we define what “true SaaS” is? I want to step back and talk about SaaS first. Below is the definition from Wikipedia. You can learn more here – Software as a Service.

Software as a service (SaaS; pronounced /sæs/ or /sɑːs/) is a software licensing and delivery model in which software is licensed on a subscription basis and is centrally hosted. It is sometimes referred to as “on-demand software”. SaaS is typically accessed by users using a thin client via a web browser.

This definition leaves a lot of flexibility and, of course, doesn’t reflect multiple aspects of product and technology – the core source of disagreement about what is “true SaaS”. I want to focus on some of them – product portfolio, subscription business model, IaaS and hosting, product versions and releases, upgrades, and thin/thick client access.

1- Product portfolio. This is a question about cloud and on-prem portfolios. Do you believe a company can provide both traditional PLM software and cloud (SaaS) software? For large companies it is mostly part of their strategy – a tricky balancing act of selling existing products and moving into the future. For smaller companies, it is a question about their focus.

2- Subscription business model. Most subscription-based products are tagged with a price per user, per month. Is it a model you want to follow? Do you expect to pay monthly? Is it just a way to advertise the price? What is the additional cost associated with product deployment, operation, support and services?

3- IaaS and hosting. There are multiple sources of infrastructure for cloud software these days. You can run it using services like AWS and Microsoft Azure. Alternatively, you can host it using a variety of hosting providers. If your business is large enough, the question of a company datacenter can come up.

4- Product versions and releases. An important question is the availability of multiple versions and configurations of your products. The option to keep a single version of your cloud product has a lot of advantages. But at the same time, it can raise concerns from IT folks thinking about how to make the cloud product compatible with other software run by the company.

5- Upgrades. The topic of software upgrades is painful. Who is responsible for upgrading your environment when the product moves to the next release? Cloud software vendors are traditionally responsible for infrastructure and upgrades. But specific customizations and configurations can complicate things.

6- Thin vs. thick clients. Do you think “cloud” equals “browser”? For some people, the answer is a clear yes. Do you think browser access is an ultimate characteristic of “true SaaS” software? You can decide what is important for you, but consider the implications of managing software installed on mobile devices, laptops and desktop computers.

What is my conclusion? The devil is in the details, and the SaaS definition brings many questions. I’m working on the next version of the PLM cloud services comparison between vendors. It is a competitive space, and vendors will have to work to explain their products and technology. To say “cloud” is not enough. SaaS has no simple definition. Understanding the multiple characteristics of SaaS is important to make the right decision about which solution is right for you. Just my thoughts…

Best, Oleg

PS. If you have specific questions, please reach out to me via email – oleg [@] beyondplm [.] com

Image courtesy of Suat Eman at FreeDigitalPhotos.net


PLM vendors are at risk of following ERP dinosaurs

June 22, 2015

old-flying-cars1

When it comes to PLM and ERP, you can always sense some feeling of rivalry in the air. The PLM and ERP domains are not competing. However, ERP and PLM vendors are clearly competing for customers’ strategic mindset. After all, it always comes down to the competition for strategic budget allocation.

For many years ERP vendors came first to the corner office of the CIO. These days some ERP dinosaurs are fighting for survival in a new era of cloud computing and SaaS software. The Toolbox.com article – 6 Reasons Why The Dinosaurs of ERP Are About To Go Extinct – provides an interesting perspective on what happens in the ERP software domain and industries today. The six reasons in the article explain why traditional on-prem software solutions provided by companies like PeopleSoft are being eaten by rival cloud newcomers.

The article made me think about the trajectory of some implementations and vendors in the PLM domain. I can clearly see some similarities. Do you think some 20-30 year old PLM vendors will follow the path of the dinosaurs of ERP? Here are some of my thoughts about the PLM domain and on-prem / cloud trends.

1- Old platforms need to retire

For the last 15-20 years, manufacturing companies have adopted 2-3 generations of PLM software. For some of them (especially very large companies), the process of adoption was long and expensive. It took time to align processes and systems together. At the same time, technology is moving forward. To move forward, many customers need to drop old products and move to support new operating systems, hardware, devices, internet browsers, etc. It obviously raises the question of how to retire old platforms. But this is a very painful question for many companies.

2- IT managers and upgrades

Upgrades are painful, and IT is the department that traditionally spends a lot of resources and cost upgrading all of a company’s systems. Cloud systems are going to change that. Since data centers and cloud infrastructure are owned by software vendors, they also take responsibility for upgrades. Some innovative PLM vendors such as Aras include upgrades in their subscription even for on-prem installations.

3- Mobile and site independence

Our working environment is changing. 10-15 years ago, our work was mostly on site. These days the workforce is distributed. People don’t need to be at their desk to do a job. Multiple locations are a reality even for small companies. Mobile devices are in everyone’s pocket. Having a system capable of working in such an environment is an imperative for every company.

4- How to get small customers on board

PLM vendors made multiple attempts to provide a solution for smaller companies. It never worked. I can list a number of products that were announced, retired and discontinued. However, the importance of smaller companies will only increase. New technologies and online market communities are making smaller manufacturers more competitive. It will bring an additional need for cloud PLM systems.

What is my conclusion? Manufacturing companies are slow in their adoption of new technologies. PLM has never been the first place for cloud companies to innovate. But the reality of the outside world and online business is bringing manufacturing companies to the point where they will need competitive software for product development and manufacturing. Old systems won’t survive and will have to retire. It is time for PLM vendors to think about innovation and new platforms. Otherwise, it might be too late to build and too expensive to buy. Just my thoughts…

Best, Oleg


PLM and a future of deep linking

June 19, 2015

deep-linking-plm

I like links. The thing about links is fascinating – they are about connections between products, people, companies and things. The nature of our life today is to stay connected. Therefore links are important. It is natural to see links appearing everywhere in engineering and manufacturing too. Think about a product and its dependencies. The information is naturally linked between assemblies, parts, documents, bills of materials, materials, suppliers, manufacturing, the shop floor, orders and many other things.

The nature of things in PLM is to be connected. At the same time, the reality of engineering software is, in fact, highly fragmented. The time when vendors and customers believed in a single system (or database) that can contain and manage all information is over. Customers are using multiple applications, and it is not unusual to see two or more PLM systems in the same company. When it comes to the supply chain, the situation is even more complex.

Application integration remains one of the most painful aspects of enterprise software, and PLM can clearly lead the wave of complexity involved in implementations. My article yesterday – How PLM can avoid cloud integration spaghetti – was a warning to all the folks who imagine that the cloud will be a silver bullet to kill application integration pain. It isn’t. Cloud integration can sometimes be even more complex than traditional integration hacks using SQL and ETL tools.

I want to continue discussing the topic of cloud integration. The topic I’m bringing up today is related to so-called "deep linking". If you’re not familiar with the topic, navigate to the following Wikipedia article – Deep Linking – to get some background information.

In the context of the World Wide Web, deep linking consists of using a hyperlink that links to a specific, generally searchable or indexed, piece of web content on a website (e.g. http://example.com/path/page), rather than the home page (e.g. http://example.com/). The technology behind the World Wide Web, the Hypertext Transfer Protocol (HTTP), does not actually make any distinction between "deep" links and any other links—all links are functionally equal. This is intentional; one of the design purposes of the Web is to allow authors to link to any published document on another site. The possibility of so-called "deep" linking is therefore built into the Web technology of HTTP and URLs by default—while a site can attempt to restrict deep links, to do so requires extra effort. According to the World Wide Web Consortium Technical Architecture Group, "any attempt to forbid the practice of deep linking is based on a misunderstanding of the technology, and threatens to undermine the functioning of the Web as a whole".[1]

The TechCrunch article – A brief history of deep linking – brings an interesting perspective on the trajectory of deep linking development on the web and in the app world. Below is my favorite passage. It is important since it gives a notion of how to treat standards in the internet and application world.

In order for me to write this article, and for you to be able to read it, we have to share a common language: modern English. The same holds true for directing users through deep links — in order to construct a deep link that an application will understand, we need to have some shared way of expressing information or addresses. In software engineering, a well-defined shared vernacular is defined by a “standard.”

The problem with standards, though, is that many of them do not actually become standard practice, and introduce as much fragmentation as they resolve. I could define the word “basilafacitarian” as “a person who likes basil a lot,” but unless it enters the common vernacular, it’s useless as a means of communication for me to tell you that I like basil.

The same is true for an app speaking to another app; unless the URL “myapp://show-item/id123” is mutually agreed upon, there’s no guarantee what the receiving app will do with it.

Coming back to the PLM world, we can see customers and vendors struggling with data and application fragmentation. It is not getting any better in the cloud PLM world – we just move to another set of APIs and technologies.
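To illustrate what a mutually agreed deep link convention could look like for PLM data, here is a small sketch in Python (the plm:// scheme, paths and identifiers are entirely hypothetical – nothing here is a real standard).

# A hypothetical deep link scheme for PLM resources, e.g.
#   plm://acme-plm/parts/PN-1042/revisions/B
# It only illustrates the idea of a shared, agreed way to address a specific
# item inside another application.

from urllib.parse import urlparse

def build_link(host, part_number, revision):
    return f"plm://{host}/parts/{part_number}/revisions/{revision}"

def parse_link(link):
    """Return (host, part_number, revision) if the link follows the convention."""
    url = urlparse(link)
    if url.scheme != "plm":
        raise ValueError("not a plm:// deep link")
    segments = url.path.strip("/").split("/")
    if len(segments) != 4 or segments[0] != "parts" or segments[2] != "revisions":
        raise ValueError("unrecognized plm:// path")
    return url.netloc, segments[1], segments[3]

link = build_link("acme-plm", "PN-1042", "B")
print(parse_link(link))   # ('acme-plm', 'PN-1042', 'B')

The point of the TechCrunch passage applies here directly: such a scheme is only useful if vendors and customers actually agree on it, otherwise it just becomes one more proprietary format.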

What is my conclusion? The idea of setting standards for deep linking is interesting. It can provide some level of solution to stop proprietary silos, data islands and the challenge of pumping data back and forth between applications. It is not simple and requires some level of synchronization between vendors and customers. We already have enough history on the web and in the app world to learn from and to correct the course – to control data and make it transparent at the same time. Just my thoughts…

Best, Oleg

Image courtesy of Stuart Miles at FreeDigitalPhotos.net


How PLM can avoid cloud integration spaghetti?

June 18, 2015

21-centure-spaghetti-integration

Enterprise integration is a messy space. It is always complex – applications, databases, new and legacy systems, complexity of requirements. People usually need to run at least a dozen applications to keep things running smoothly. It is never done out-of-the-box, and it always requires cycles of implementation and professional services.

I caught the following picture, tweeted yesterday by Stan Przybylinski of CIMdata. It provides an excellent view of integration complexity. Unfortunately, in many situations, integration is a major challenge in PLM adoption. To get the full value of PLM, a company has to spend a fortune integrating data and processes – CAD, bills of materials, ECOs, etc.

data-silos

The cloud is coming to the enterprise these days. In many ways it creates a new way to think about data, software, services and maybe integrations too. New technologies and an ecosystem of services can make a difference. It also creates a significant demand for openness and interoperability. This is a main reason why PLM needs to learn web APIs. Web services and REST APIs are changing the way integration can be done.
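As a simple example of what such integration code can look like, here is a short sketch in Python (the endpoints, fields and token are placeholders, not any specific vendor’s API) that reads a bill of materials from one cloud service over REST and pushes its lines to another.

# Minimal REST-based integration sketch: pull a BOM from a (hypothetical)
# cloud PLM service and push its lines to a (hypothetical) ERP service.
# Endpoints, fields and tokens below are placeholders, not a real vendor API.

import requests

PLM_API = "https://plm.example.com/api/v1"
ERP_API = "https://erp.example.com/api/v1"
HEADERS = {"Authorization": "Bearer <token>", "Content-Type": "application/json"}

def sync_bom(product_id: str) -> None:
    # 1. Read the engineering BOM from the PLM service
    resp = requests.get(f"{PLM_API}/products/{product_id}/bom", headers=HEADERS)
    resp.raise_for_status()
    bom = resp.json()

    # 2. Push each BOM line to the ERP service, keeping the semantics explicit
    for line in bom["lines"]:
        payload = {
            "item_number": line["part_number"],
            "quantity": line["quantity"],
            "unit": line.get("unit", "ea"),
        }
        r = requests.post(f"{ERP_API}/bom-lines", json=payload, headers=HEADERS)
        r.raise_for_status()

# sync_bom("PN-1042")   # example call

The code itself is simple; the hard part, as discussed below, is keeping the data semantics intact and avoiding yet another pair of silos.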

Technology can make a difference. However, integrations are still hard. A few months ago, I shared my thoughts on how to prevent cloud PLM integration mistakes. It comes down to three main things – 1/ lost data semantics; 2/ limitations of data transfers; 3/ transaction management in a distributed, cross-site environment.

Unfortunately, the cloud is not a silver bullet to solve integration challenges. The demand for a holistic integration continuum is still ahead of us. In practice, cloud applications today are replicating the bad siloed behaviors of on-premise applications. I captured the following picture earlier this week at the Boston New Technology meetup.

cloud-silos-apps

This picture is a great demonstration of how the bad aspects of siloed on-premise applications are moving to the cloud environment. Migration of applications to cloud infrastructure such as IaaS can simplify IT’s life. However, it won’t make users’ lives simpler. From the end user’s standpoint, applications will still run in silos.

What is my conclusion? The danger is to move established on-premise PLM paradigms to the cloud. Technologically, new cloud systems can give advantages in terms of integrations. REST APIs are one example – it is much easier to code integration scenarios using REST APIs and modern web-based tools. At the same time, closed data paradigms and data duplication between silos can bring the well-known data spaghetti from on-premise applications to the cloud. Just my thoughts…

Best, Oleg

