Integration with manufacturing robots might be the next PLM challenge

August 21, 2015


Integration is one of the most painful aspects of PLM deployments and implementations, especially when you need to integrate engineering, manufacturing planning and shopfloor systems. Usually it comes down to a large amount of data synchronization between each and every system in the loop. Integration failures can slow the production process and lead to mistakes. In one of my earlier posts I discussed why the future of manufacturing will depend on solving the old PLM / ERP integration problem. Earlier visibility of product information in the manufacturing process can reduce cost and optimize production schedules.
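To make the pain concrete, here is a minimal sketch of what a point-to-point item sync between a PLM and an ERP system often looks like. Everything here is an assumption for illustration – the endpoints, field names and payloads are hypothetical, and a real integration adds authentication, conflict resolution and error handling for each pair of systems in the loop.

```python
import requests

# Hypothetical REST endpoints - every real PLM / ERP pair exposes its own API.
PLM_ITEMS_URL = "https://plm.example.com/api/items?modified_since={since}"
ERP_ITEM_URL = "https://erp.example.com/api/materials/{number}"

# Field names differ between systems, so every sync starts with a mapping.
FIELD_MAP = {
    "item_number": "material_number",
    "revision": "version",
    "description": "short_text",
}

def sync_items(since: str) -> None:
    """Push PLM items changed since `since` into the ERP, one by one."""
    items = requests.get(PLM_ITEMS_URL.format(since=since), timeout=30).json()
    for item in items:
        payload = {erp_field: item[plm_field]
                   for plm_field, erp_field in FIELD_MAP.items()}
        resp = requests.put(ERP_ITEM_URL.format(number=item["item_number"]),
                            json=payload, timeout=30)
        resp.raise_for_status()  # one failed update can stall downstream planning

if __name__ == "__main__":
    sync_items("2015-08-20T00:00:00Z")
```

Multiply a script like this by every pair of systems in the loop – and by every change in the field mapping – and you get the synchronization burden described above.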

As the complexity of the product lifecycle grows, the role of integration becomes even more important. The following Forbes article caught my attention earlier today – In the Factories of the Future: A Conversation With Jabil’s John Dulchinos. It speaks about changes in manufacturing driven by factors such as mass customization, shrinking product lifecycles and offloading manufacturing cost to regions with low-cost labor. These trends are quite usual these days. But here is where it gets very interesting – robots. Jim Lawton of Rethink Robotics speaks about smart collaborative robots. The following passage is my favorite:

Think about it – every robot we deploy is a computer. That means, going back to what I said earlier about the role of data in production environments, is that these robots will become critical in that model. Robots will be information management systems that can collect and analyze data on the floor, in real-time and make it available for interpretation.

That represents a real break-through in manufacturing allowing us to not only see what is happening now, but able to apply predictive technologies to the information. Everything from when a machine needs to be serviced to when a process needs to be adjusted will become available to us.

With that ability, we’ll no longer be simply looking at the past, but able to see ahead – a significantly more powerful tool for increasing efficiency and productivity. More compelling though, may in fact be the contribution that it makes toward accelerating innovation and creativity.

The future of collaborative robots is really exciting. At the same time, it made me think about the complexity of product data integration between systems needed to support that. To support a predictive analytics model and many other aspects of robotics operation, information about a variety of product characteristics should be available to robot information management systems. It would be interesting to learn more about the potential information and process flow – I’m sure it will impose many challenges as soon as we demand that robots take decisions about building a specific configuration demanded by customers in real time. It might represent the next level of complexity compared to traditional configure-to-order models.
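As a thought experiment, here is a minimal sketch of the kind of rule a robot information management system could run on the shopfloor – flagging a machine for service from a stream of sensor readings. The sensor, baseline and 15% drift threshold are all invented for illustration; a real predictive model would combine many such signals with product data from PLM.

```python
from collections import deque
from statistics import mean

class SpindleMonitor:
    """Toy predictive-maintenance rule: request service when the rolling
    average motor current drifts well above its healthy baseline."""

    def __init__(self, baseline_amps: float, window: int = 50):
        self.baseline = baseline_amps
        self.readings = deque(maxlen=window)

    def add_reading(self, amps: float) -> bool:
        """Feed one sensor reading; return True when service is due."""
        self.readings.append(amps)
        if len(self.readings) < self.readings.maxlen:
            return False  # not enough history yet
        # 15% sustained drift over baseline - an arbitrary illustrative rule
        return mean(self.readings) > self.baseline * 1.15
```

A real system would feed rules like this from the robot controller in real time – which is exactly where product configuration data from PLM would have to meet shopfloor data.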

What is my conclusion? Product complexity, shortening lifecycles and cost pressure are forcing manufacturing companies to innovate in everything related to the optimization of product planning, manufacturing and shopfloor operations. I can see a new type of manufacturing product line equipped with collaborative robots capable of producing a specific product configuration on demand, driven by a customer order. It will create the next challenge for PLM systems integration. Do you think traditional PLM architectures and platforms are ready for future integration with robots? This is a good question to ask PLM architects. Just my thoughts…

Best, Oleg

Image credit Rethink Robotics


Visual Design Collaboration: Bundle vs Unbundle?

August 20, 2015


Collaboration is an interesting topic. Whether you are a 3-person team or a 10,000-person OEM manufacturing company, some of the same rules for successful collaboration apply. The more you share what you know, the more value it creates. Understanding a specific personal or use-case context is absolutely critical to successful collaboration, regardless of what technology you use.

The rules of collaboration are changing these days. New collaborative technologies coming from the web, mobile and social networking experience are starting to disrupt what was traditionally considered good collaboration practice. In the past, we used an old “mom test” for collaboration software – a collaborative application should be easy enough for my Mom to use (she is smart and well educated, but she didn’t grow up with computers). That was probably a good idea back in 1995, when desktop computers and Windows disrupted the office environment.

Fast forward to 2015: many rules established by desktop computers were broken by web and mobile products. The disruption in the workplace is coming from the younger generation. The Content Strategist Blog brings an interesting comparison of how differently Millennials, GenX and Boomers consume content. The same rules apply to collaboration. If the old “Mom test” spoke about simplicity that Mom can get, the new rules may trigger a question about what collaboration style a new generation of people can get. Maybe it is a question of how to bring Snapchat-style collaboration into the design process?

All together, it made me think about the optimal strategy for design collaboration – one that addresses the needs of users in engineering and manufacturing, scaling from individual makers to large OEM manufacturing shops.

The traditional design collaboration approach revolves around 3D design. Most of the tools developed by CAD and PLM vendors over the last 10-15 years basically created a way for users to access a 3D product representation with an additional set of functions like redlining, comments, etc. The core complexity of these tools is related to seamless access to a diverse set of information – 3D, 2D, specification documents and contextual information coming from data management tools. This bundling made them complex to develop and use.

Unbundling is an interesting business strategy used in many application domains these days. Read my earlier blog about that – The future unbundling strategies in CAD / PLM. The potential of unbundled services can be significant. Existing bundles are complex and inefficient. People don’t use all the functionality and look for something simple and easy to grasp. However, unbundling can be hard. Read my Why unbundle 3D is hard for PLM vendors? As much as people value simplicity and ease of use today, vertical integration remains a very important thing for many companies.

Last year I started a discussion about PLM tools, bundles and platforms. Since then, a few interesting new products came to the market that pursue the value of design collaboration – I should mention two cloud CAD products, Onshape and Autodesk Fusion 360. I also have to mention the engineering communication and collaboration tools provided by GrabCAD. However, I want to bring two examples today to show two distinct approaches to developing design collaboration products – bundle vs. unbundle.

The first one is the Visual Collaboration product Aras Corp. introduced in the latest version of Aras Innovator. Navigate to the following link to read more. In the following video you can get a full demo of visual collaboration fully integrated with the PLM product. The approach taken by Aras to bring visual collaboration to all users is absolutely valuable. Everyone in an organization can collaborate in 2D/3D and any other design context.

My second example comes from a new startup founded by ex-Facebook product designers – Wake.io. Read more about the Wake.io story on TechCrunch – Designers Ditch Perfectionism For Instant Feedback With Wake. The idea of solving the problem of collaboration in a community of product designers made them think about a product that captures and shares a very simple design collaboration process. The following video can give you an idea of what it is about:

What is my conclusion? Both bundling and unbundling approaches have pros and cons. Vertical integration is important, but simplicity and capturing a specific design workflow without overwhelming users with additional information can be valuable too. In my view, unbundling is trending. It is a way to create new products that solve painful problems. The same collaboration problems engineers and other people experience when designing products apply to other places as well. The example of Wake.io is a hint for CAD and PLM companies to think about where future disruption can come from. The same way Slack disrupted the existing collaborative approaches practiced by companies today, new products like Wake.io can disrupt the future of 3D and engineering collaboration. Just my thoughts…

Best, Oleg

Image credit GrabCAD



PLM and integration business

August 19, 2015


Integrations. Enterprise software implementations depend heavily on the ability to integrate different pieces of software. Each and every PLM implementation I’ve seen required some sort of integration. It might be an integration of CAD and PDM packages, which is relatively straightforward in many situations. But it can also be a very challenging one, such as an integration between PLM and ERP functionality, which can bring many organizational and technological difficulties.

Most PLM integrations are done by integration and service partners. It removes many problems with licensing competitive software from different vendors. The integration business is tricky. As an example of the turbulent character of the integration business, you can read the news about the Informatica buyout a few weeks ago – Microsoft And Salesforce Join In $5.3 Billion Buyout Of Informatica. Not directly related to the PLM world, but it gives some impression of the integration software business (related to both Informatica and Tibco):

But Informatica couldn’t ultimately find a better option for its $1 billion in annual revenue business, which grew just 10% on constant currencies in Q2 of 2015 on software revenue growth of 13% and subscription growth of 44% year-to-year. That rate of growth was essentially flat from the year before. Like competitor Tibco, Informatica had fallen into a low-growth, mature sales cycle after seeing its stock soar and then crater when the dotcom bubble burst. Both had eventually regrown into multi-billion valuations, but after years of sales growth to get back where they were. Tibco was taken private in December for about $4.3 billion, $1 billion less than Informatica.

After some thinking, it occurred to me that large enterprise PLM implementations are essentially integration projects. They combine a very typical set of integration steps – analysis of data processes in the organization, data modeling, defining data flows, reporting and monitoring tools. PLM platforms are essentially data integration toolkits that allow you to handle a very specific set of information. That connected me to one of my previous articles – How PLM can avoid cloud integration spaghetti. As the PLM industry moves to the cloud, it must find a better way to deal with PLM implementations and their essential part – integrations.

It made me think about a few possible ways PLM vendors can change the trajectory of traditional integrations and business strategies.

1- Open source PLM data toolkits. Open source software has a strong presence in the modern software ecosystem. For many software vendors today, open source is a natural way to develop products. I’ve been watching a few PLM open source initiatives, but most of them lacked product maturity. Turning part of an existing PLM platform into open source can trigger a change in the way PLM implementations are done. Aras Corp is the closest example of such an initiative. Although the Aras Innovator core module is not open source, most of the solutions developed on top of Aras are open source projects.

2- Automation platforms for trigger- and action-based integrations. You might be familiar with integration automation services such as Zapier and IFTTT. Both are extremely efficient at automating a variety of integration activities between cloud applications. These automation services provide a development platform for other companies to create specific integration connection points and services (see the sketch after this list). Jitterbit is probably the closest example of automation services in the PLM ecosystem.

3- Integration businesses as part of cloud hosting services. In a growing ecosystem of cloud PLM software, hosting providers can play the role of implementation and integration service providers too. In my view, it is a very dynamic space. All large manufacturing companies that implemented on-premise PLM will start looking at how to bring in cloud PLM solutions – integrations will become the most challenging part of making that transformation happen.
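To illustrate the trigger/action pattern behind item 2, here is a minimal Flask handler that reacts to a hypothetical “ECO released” webhook from a PLM system and forwards a simplified payload to an equally hypothetical downstream endpoint. The event shape and URLs are assumptions for illustration; they are not the API of any real product.

```python
import requests
from flask import Flask, request, jsonify

app = Flask(__name__)

# Hypothetical downstream action - could be ERP, chat, email, etc.
ACTION_URL = "https://erp.example.com/api/change-notices"

@app.route("/webhooks/plm", methods=["POST"])
def on_plm_event():
    event = request.get_json(force=True)
    # Trigger: react only to the event type this recipe cares about.
    if event.get("type") == "eco.released":
        # Action: forward a simplified payload downstream.
        requests.post(ACTION_URL, json={
            "eco_id": event.get("eco_id"),
            "title": event.get("title", ""),
        }, timeout=10)
    return jsonify(status="ok")

if __name__ == "__main__":
    app.run(port=5000)
```

Services like Zapier and IFTTT package exactly this kind of recipe so nobody has to host the glue code themselves.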

What is my conclusion? PLM implementations are complex. And "integration" is the most complicated part of them. The traditional PLM implementation approach is holding back the PLM business. How do we turn PLM implementations into an agile and lean process? Improving PLM integration can be a good step toward cleaning up the mess of PLM implementations. Just my thoughts…

Best, Oleg


3 ways to break PLM social inertia

August 18, 2015


You might think the "social trend" is over for enterprise product lifecycle management. PLM companies are not buzzing much about social PLM these days. In an earlier post back in 2012, I shared my thoughts on Why Social PLM 1.0 failed? One of my conclusions was that "social PLM" had low value for a single user and focused too much on enterprise-wide value propositions such as improved collaboration, streamlined processes, etc. Nothing wrong with that, but it failed.

At the same time, I think the fight for the social enterprise is not over yet. My attention was caught by the E(E) blog – Antisocial Enterprise V: the final Facebook by Ed Lopategui. The article brings a few very interesting points related to the failure or success of social systems, inspired by an analysis of the Google+ trajectory and his own experience of moving from G+ to Facebook. He summarized it in a three-point conclusion: first-mover inertia; network competition and user fatigue; technology is irrelevant. Ed is asking for a system for both – work and play. The following passage gives you the idea and the reasoning in a nutshell.

So it should be rather plain right now that Facebook is eating the world. People want to leave it, but they simply can’t. Lots of upstarts appear to dethrone them, they languish for a while, and are crushed. The only meaningful exodus is younger people who are leaving, not necessarily for a better experience elsewhere, but merely to escape a system that ties them uncomfortably close to their parents. What does that mean for enterprise networks? Competing in this environment may not be an option. It may be high time to change strategy – turn to the Trojan horse approach perhaps. Integration encapsulated within a robust security model to slowly build the inertia necessary to ween users off Facebook altogether, without having to directly give up Facebook. Until it’s too late. That would require a system designed for both work and play that understands and can transparently enforce the needed boundaries between both.

Here is the thing. I’m not sure I agree with the approach of building a social system for both play and work. It sounds unrealistic to me. But I captured one thing that is important to understand – inertia. This is a huge deal. Many manufacturing enterprises operate under a high level of inertia. It means they have no real reason to make a specific change decision. Business operates as usual until something really bad happens. Making a decision and introducing a change is risky, and people are afraid of making mistakes. This is what happened with social systems. Systems such as SharePoint took enterprises by storm – IT managers discovered one day that all employees in the company were using SharePoint, but central IT had no idea where the servers were located and who installed them.

So, how do you design a Trojan horse that will solve the problem of social inertia? I think it requires a deeper look at the Facebook story as well as experience with other systems. Here are some potential options for how to make it happen.

1- Ease of data capture. Facebook’s massive success came with the availability of a camera on every mobile device. We can think about similar ease of data capture for a social PLM system. If a "social Trojan horse" can capture data in the organization and help share it with other people, that can be a good way to prevent people from leaving the system.

2- Open – to prevent data lock-in. People are afraid of data locked in a specific system. Each time they suspect that a social system is another silo to lock data in, they will run away. Making it easy to get data in and out can be another way to eliminate initial inertia (see the sketch after this list).

3- Innovate in the business model to make it available for the whole organization. Social software brings value when it is used by many people. The more people you bring in, the better the system behaves and the more substantial the value proposition. The lucrative licensing and business models of PLM vendors are not very appealing when social PLM software needs to be used by all people in an organization.
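On the "open" point (item 2 above), here is a minimal sketch of what "easy to get out" could mean in practice – a plain JSON export of a discussion thread. The record structure is invented for illustration; the point is that exit is a feature, not an afterthought.

```python
import json
from dataclasses import dataclass, asdict
from typing import List

@dataclass
class Post:
    author: str
    timestamp: str          # ISO 8601
    text: str
    attachments: List[str]  # file names or URLs, not locked-in blobs

def export_discussion(posts: List[Post], path: str) -> None:
    """Dump a discussion thread to plain JSON so users can always leave."""
    with open(path, "w") as f:
        json.dump([asdict(p) for p in posts], f, indent=2)

export_discussion(
    [Post("maria", "2015-08-18T09:30:00Z", "BOM rev B looks wrong", ["bom-b.xlsx"])],
    "discussion-export.json",
)
```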

What is my conclusion? Social inertia is a big deal. A few tricks like real identities and pictures made the difference for Facebook and allowed it to spread across communities and people. Finding something similar for the enterprise can be the key to unlocking the future of social enterprise PLM systems. Whoever finds that key first will have it all. Just my thoughts…

Best, Oleg

Image courtesy of stockimages at FreeDigitalPhotos.net


IBM Watson won’t solve PLM platform problems

August 17, 2015


Last year, my attention was caught by a CIMdata article – IBM Forms New Watson Group to Meet Growing Demand for Cognitive Innovations. The interest in cognitive computing is growing these days, and you can get tons of interesting materials about it on the IBM Watson website.

Cognitive computing is the simulation of human thought processes in a computerized model. Cognitive computing involves self-learning systems that use data mining, pattern recognition and natural language processing to mimic the way the human brain works.

The following passage from the CIMdata article caught my attention:

IBM Watson Analytics allows users to explore Big Data insights through visual representations, without the need for advanced analytics training. The service removes common impediments in the data discovery process, enabling business users to quickly and independently uncover new insights in their data. Guided by sophisticated analytics and a natural language interface, Watson Analytics automatically prepares the data, surfaces the most important relationships and presents the results in an easy to interpret interactive visual format.

Data discovery is a tricky topic. As I mentioned in my blog last week – PLM cannot drain product data swamps – the problem of PLM is in fact related to limitations in data modeling and in the ability to capture organizational data at large scale. In the long run, it limits the ability to create an environment for product innovation. So, maybe IBM Watson is here to help?

Over the weekend, my attention was caught by The Platform article “The Real Trouble With Cognitive Computing” and the troubles IBM has had trying to figure out what they are going to do with the Watson supercomputer. The article explains that IBM folks came up with 8,000 potential experiments for Watson to do, but pursued only 20 percent of them.

The discussion about a single information model in Watson is something PLM folks can benefit from when thinking about the future of PLM platformization. Here is my favorite passage about Watson:

“The non-messy way to develop would be to create one big knowledge model, as with the semantic web, and have a neat way to query it,” Pesenti tells The Platform. “But that would not be flexible enough and not provide enough coverage. So we’re left with the messy way. Instead of taking data and structuring it in one place, it’s a matter of keeping data sources as they are—there is no silver bullet algorithm to use in this case either. All has to be combined, from natural language processing, machine learning, knowledge representation. And then meshed as some kind of distributed infrastructure.”

What is my conclusion? The odds are Watson won’t be a pragmatic technology for PLM vendors to rely on to build the future of PLM platform innovation. However, the giant knowledge model Watson failed to build can be a warning for PLM architects trying to create a holistic model of future PLM platforms. It might not work… The reality is much messier than you think. This is a note to the folks making strategic decisions and to PLM innovators. Just my thoughts…

Best, Oleg

picture credit IBM Watson



Future PLM competition in Electronic Design

August 14, 2015


The roots of traditional PLM systems are in mechanical CAD systems. As a result, these PLM systems always had some gaps in the electronics and high-tech industry. Looking back, electronic design automation (EDA) vendors developed their own ecosystem of tools to manage design, component lifecycle, project collaboration, bills of materials and manufacturing processes.

My attention was caught by two acquisition announcements made by Altium – a software outfit focusing on PCB design and manufacturing. The first one is about Octopart – a search engine for electronic and industrial parts. More details are here – Octopart is joining Altium. Octopart founder Sam Wurzel provided an interesting perspective on merging Octopart into the Altium environment:

We live in a time when electrical engineers, makers, and hackers have high expectations for their design tools and for component search. Octopart users expect rich content like CAD models and reference designs at their fingertips when doing component selection. Users of PCB design software expect supply chain intelligence at hand when they are designing new products. Bringing Octopart together with Altium will make this possible, and more. We envision a future where going from prototyping to production is a seamless experience and we’re going to work together to make that vision a reality.

The second Altium acquisition is Ciiva – a startup that developed tools to simplify electronic component and bill of materials management. You can read more on the Ciiva blog.

Ciiva was started with a vision to bring fast, easy access to electronic part data and provide a platform to help people overcome common BOM and component data management challenges. Our products and services will complement the advanced PCB design and development tools that Altium has had much success in serving the electronic design industry with for the last 30 years. This will help Altium to deliver unparalleled capabilities to the wider electronic design community, while providing us with a solid platform to continue to grow and expand our capabilities.

Both acquisitions clearly indicate that EDA vendors are thinking about the future of cloud tools and the ability to help engineers working in a very connected world. It is especially true for the electronics manufacturing industry.

Traditional PLM vendors are aware of the importance of electronic design and of managing both mechanical and electronic data. The work to provide better integration with electronic design tools is ongoing. You might take a look at what Dassault Systemes, Siemens PLM and PTC MKS did in mechatronics and integrated software design. Autodesk also provided some indication of a growing interest in accessing product data from electronic tools. You probably noticed that Autodesk recently acquired the ecad.io technology (previously known as intertiacad). The following article – Fusion 360 – "The ultimate CAD/CAM/CAE/PCB design tool set" – also gives you some hints about the integration of 123circuts.io with Autodesk Fusion 360.

What is my conclusion? Modern products present a growing need to integrate mechanical, electronic and software design. PLM tools are not perfect at this. I shared my thoughts earlier – Why PLM is failing to manage multidisciplinary bill of materials. EDA and MCAD tools were developed in their own silos. For many years, PLM played a leading role in product development, defining the shape and mechanical characteristics of products. But modern product design requires better integration of electronic design tools into a networked ecosystem of product development and manufacturing. It can raise the importance of electronic PLM development. Just my thoughts…

Best, Oleg


3 reasons why PLM cannot drain product data swamp

August 13, 2015


The amount of data around us is growing. The same applies to engineering and manufacturing companies. Our ability to collect data is astonishing. But we are failing to bring the right data, in the right form and at the right time, to people. Product lifecycle management software is often recognized as the glue that should bring information about a product and its lifecycle to everyone in a company.

Here is the thing – legacy data import is one of the most painful projects in PLM implementations. A few years ago, I wrote a blog post – Who will take on legacy data in PLM? Guess what? Not many candidates on the list… A typical PLM implementation project tries to allocate time to import existing data, but usually it is done as a customization project by a service organization and always requires additional resources.

The Zero Wait State blog – The PLM State: Drain Your Data Swamp – speaks exactly about the problem of existing data in an organization. I like the "swamp" metaphor, because it really does look like a swamp. Think about legacy databases, tons of Excel files, existing PDM and PLM systems. What else? Emails, SharePoint websites, wikis and many others…

The following passage from the article is a great value proposition for creating a system that will make existing data usable.

Data that isn’t used is data that isn’t profitable. All data should add value, whether by contributing to strategic decisions, marketing, process improvements, or regulations (data kept due to regulatory retention policies adds value by allowing your business to operate in a regulated environment). You paid for, and are paying for it already, so why wouldn’t you put that data to use?

But how do you drain a data swamp? The first step is to identify what data is likely to have been swamped. Identifying misplaced or forgotten data would seem like an insurmountable task, but look to those processes in your organization that generate data. I’m not talking about just test data or formal analyses, but all the data. That RoHS-compliant capacitor? That’s a data point. Device master records (DMR) and device history records (DHR) are obvious datasets. That prototype from CAD cost a lot to develop, so include that data where it can be leveraged for other designs rather than being ignored after its initial use.

Enterprise resource planning (ERP), project lifecycle management (PLM), laboratory information management system (LIMS), document management system (DMS) software and others can play a role in draining your data swamp, associating and linking disparate datasets and providing a means for their use.

Unfortunately, I see very little progress in the draining of data swamps by PLM systems. It made me think about the reasons why it is hard to do. Here are 3 reasons why existing PLM platforms cannot do it.

1- Limited data modeling. Although all PLM systems have some kind of "flexible data modeling" capability, it is a lot of work to get data modeling done for all legacy data. Just think about converting all existing data structures into some sort of data models in existing PLM systems. It can be a lifelong manual project.

2- Limited flexibility of relational databases. The majority of PLM architectures are built on top of relational databases, which provide almost no way to bring in unstructured or semi-structured data such as emails, PDF content, Excel spreadsheets, etc. (see the sketch after this list).

3- Absence of data transformation and classification tools. Existing PLM platforms have no tools that allow you to re-structure or re-model data after it is already imported into the system. Think about importing data first and "massaging" it afterwards.
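To see why points 1 and 2 bite (the sketch promised above), here is a contrast between a rigid relational schema and a schema-less document approach for the same legacy records. The sources and attributes are invented for illustration, and the pattern of keeping a few common columns relational while stashing the rest as documents for later "massaging" is one pragmatic workaround – not a description of any specific PLM product.

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")

# A rigid relational schema forces every legacy record into fixed columns...
conn.execute("CREATE TABLE items (item_number TEXT, revision TEXT, description TEXT)")

# ...while real legacy sources carry wildly different attributes per record.
legacy_records = [
    {"source": "excel", "item_number": "CAP-100", "rohs": True, "vendor": "Acme"},
    {"source": "email", "subject": "Approved rev C", "from": "qa@example.com"},
]

# Pragmatic workaround: one common column plus a JSON document per record.
conn.execute("CREATE TABLE swamp (source TEXT, doc TEXT)")
for rec in legacy_records:
    conn.execute("INSERT INTO swamp VALUES (?, ?)", (rec["source"], json.dumps(rec)))

# A later "massaging" pass can promote attributes into real columns once
# their meaning is understood - the tooling point 3 says is missing today.
for source, doc in conn.execute("SELECT source, doc FROM swamp"):
    print(source, json.loads(doc))
```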

What is my conclusion? Companies are failing to bring existing data into PLM systems because it is hard. Although the PLM vision is to provide the glue between a variety of product information sources, in practice it is very complex and expensive to achieve. A typical company operates with a number of siloed applications developed by different people and implemented to solve specific situational problems. That leaves a holistic approach to product data management outside the scope of most PLM implementations. Just my thoughts…

Best, Oleg

Image courtesy of jscreationzs at FreeDigitalPhotos.net


