How to prevent cloud PLM integration mistakes

February 2, 2015

connected-plm-erp

Cloud is a huge enabler for collaboration between people. It makes your data and processes accessible everywhere from a browser. It can help engineers and suppliers collaborate, and it can help integrate systems and people across the enterprise.

Let me speak about the last one. The integration topic is actually tricky. I shared some of my thoughts about cloud integration challenges a few months ago – Integration is holding back PLM cloud adoption. Last week, I had a chance to attend two webinars about PLM and integration.

Become a Connected Manufacturing Enterprise with Agile Integration by Jitterbit. The following picture gives you a perspective on the problem of “connected manufacturing” and how architecture solutions like Autodesk PLM360 and Jitterbit are solving it.

plm360-jitterbit-1

Here is the view that shows you the reality of mixed (cloud and on-premise) integrations.

plm360-jitterbit-2

Another webinar, by CIMdata – “PLM & ERP: What’s the Difference, and Why Should you Care?” – provides another perspective on integration challenges between engineering and manufacturing.

cimdata-plm-erp-1

cimdata-plm-erp-2

Companies are moving into mixed cloud and on-premise environments. This is a reality and we cannot avoid it. So, for the foreseeable future, we will have to deal with the integration of multiple systems – some will continue to run on premises and some will run elsewhere (public cloud). It made me think about potential mistakes you can run into while integrating systems.

1- Lost data semantics

Most integration scenarios are about how to send data back and forth between systems. It is hard to keep the semantics of data and not lose it when exchanging information. So, define what the data means and keep an overall integration data schema. Otherwise, the result can be messy.
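To make the point concrete, here is a minimal sketch (in Python, with hypothetical field names) of a shared integration schema that records what each field means, its unit, and how each system names it, so semantics are not lost in transfer:

```python
# A shared integration schema - every field carries its meaning and unit,
# plus the name each system uses for it. All field names are hypothetical.

SCHEMA = {
    "part_number": {"meaning": "company-wide part identifier", "unit": None,
                    "plm_field": "item_id", "erp_field": "material_no"},
    "unit_cost":   {"meaning": "standard cost per unit", "unit": "USD",
                    "plm_field": "cost", "erp_field": "std_price"},
    "mass":        {"meaning": "part mass", "unit": "kg",
                    "plm_field": "weight_kg", "erp_field": "gross_weight"},
}

def plm_to_erp(plm_record: dict) -> dict:
    """Translate a PLM record to ERP field names via the shared schema."""
    erp_record = {}
    for field, spec in SCHEMA.items():
        if spec["plm_field"] in plm_record:
            erp_record[spec["erp_field"]] = plm_record[spec["plm_field"]]
    return erp_record

print(plm_to_erp({"item_id": "P-100", "cost": 4.2, "weight_kg": 0.5}))
# {'material_no': 'P-100', 'std_price': 4.2, 'gross_weight': 0.5}
```

Because both directions of the exchange read the same schema, a renamed field or a changed unit has to be changed in exactly one place.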

2- Data transfer limitation

Although some integration infrastructure can allow you to implement data exchange quickly, you can underestimate the bandwidth requirements. Sending large packets of data can cause significant latency and create runtime errors and problems. Check what monitoring tools are available to handle such situations.
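One common way to stay inside bandwidth and timeout limits is to send data in bounded batches and retry transient failures with backoff. A minimal sketch, where `send_batch` is a hypothetical stand-in for your actual transport:

```python
import time

def send_in_batches(records, send_batch, batch_size=500, retries=3):
    """Send records in fixed-size batches; retry a failed batch with backoff.

    send_batch is any callable taking a list of records - a hypothetical
    stand-in for an HTTP POST, message-queue publish, etc.
    """
    for start in range(0, len(records), batch_size):
        batch = records[start:start + batch_size]
        for attempt in range(retries):
            try:
                send_batch(batch)
                break  # batch delivered, move on
            except IOError:
                if attempt == retries - 1:
                    raise  # give up after the last attempt
                time.sleep(2 ** attempt)  # simple exponential backoff
```

Keeping batches small bounds the latency of each call; the retry loop absorbs transient network errors instead of failing the whole transfer.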

3- Transaction management

Most manufacturing systems are sensitive to transactions. Managing distributed transactions can be tricky and requires some fine tuning. Pay attention to how you handle error processing when integrating transactional systems that manage ordering, lifecycle and bills of materials.
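When a true distributed transaction is not available across systems, a common pattern is to apply changes in order and run a compensating action if a later step fails. A minimal sketch, with hypothetical update functions:

```python
# Compensation-based error handling when one logical change spans two
# systems. The update/rollback callables are hypothetical stand-ins for
# real PLM and ERP API calls.

def update_both(plm_update, erp_update, plm_rollback):
    """Apply the PLM change first; if the ERP change fails, compensate."""
    plm_update()
    try:
        erp_update()
    except Exception:
        plm_rollback()  # undo the already-applied PLM change
        raise           # re-raise so the caller sees the failure

# Success path: both updates run, no rollback needed.
log = []
update_both(lambda: log.append("plm"), lambda: log.append("erp"),
            lambda: log.append("undo"))
print(log)  # ['plm', 'erp']
```

The important part is the error path: the failure is not swallowed, and the first system is returned to a consistent state before the error propagates.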

What is my conclusion? The complexity of integration is growing. Cloud systems bring many advantages, but they will create additional challenges for IT and professional services. Most integrations do not work out of the box. New tools running from the cloud can help you integrate faster, but it will require good coordination with IT and upfront planning to prevent potential mistakes. Data integration is hard and requires experience and familiarity with manufacturing systems. Just my thoughts…

Best, Oleg

photo credit: freefotouk via photopin cc


SAP has a magic technology to improve enterprise integration

December 4, 2014

digital-integration

Integration is a big deal. When it comes to enterprise organizations, and specifically to manufacturing companies of different kinds, enterprise integration is one of the major challenges that influence broad PLM adoption and ROI. Enterprise integration isn’t a simple problem to solve. It requires a diverse set of tools to support data exchange and process orchestration, and PLM vendors are providing a diverse set of solutions for that. On the strategic side, PLM companies are expanding their data reach by attempting to cover a larger scope of data and processes. You can read about it in my post about ECO and EBOM/MBOM synchronization complexity.

My attention was caught by a Betanews article – SAP launches new manufacturing solution to improve enterprise integration. It speaks about new technologies developed on top of the SAP HANA in-memory database to manage real-time integrations. Here is the passage that gives you a glimpse of what it is about:

SAP Manufacturing Execution includes the SAP Manufacturing Integration and Intelligence (SAP MII) application and SAP Plant Connectivity software. This allows it to provide machine-to-machine integration and orchestrate intelligent manufacturing. Using the existing SAP HANA platform it offers global visibility into operations by making manufacturing big data more accessible and enabling predictive analytics capabilities to be used in house or in the cloud. This gives businesses advanced problem solving ability and ease of access to manufacturing data so they can make improvements in cost, quality, asset utilization and performance.

I’ve touched on the SAP HANA topic before in my posts – Future PLM platforms and SAP / Oracle technological wars and Is SAP HANA the future of PLM databases.

I made a trip to the SAP MII website to dig for more information about the integration architecture. Here is an interesting document (with lots of technical details) worth looking at if you are interested in SAP MII HANA integration – SAP Process Integration Act as Adapter for SAP MII integrating with Enterprise Applications. The document is available here. I captured the following architecture slide, which gives you a detailed view. More information is available here. From that picture you can see SAP’s view on how PLM and other business apps are going to be integrated with manufacturing and shop-floor systems.

sap-mii-architecture

What is my conclusion? Modern manufacturing requires a high level of integration, from design and engineering to manufacturing and operations. Data integration and transparency will allow companies to optimize performance, save cost and streamline processes. However, making it happen is not a simple job, and it requires a lot of hard-wiring, data transformation and openness. PLM vendors demand control over the MBOM to make vertical integration easier. As you can see in the pictures and documents above, SAP is working to create a grand vision of data integration from the shop floor to business applications and services. How and where PLM and ERP vendors will meet to create an integrated manufacturing solution is an interesting question to ask.

Best, Oleg


Integration is still the major challenge in PLM adoption

November 30, 2014

data-silos

I want to continue the discussion about data ownership and synchronization between islands of information in a company. The EBOM and MBOM synchronization I mentioned in my previous post is probably the most typical one, but there are many others. Supply chain, contract manufacturing, regulation, customer support and services – this is only a very short list of places where data stays under the control of different systems. While the idea of pulling all data under the control of a single system was an option a decade ago, my hunch is that these days the idea of PLM as one big silo is getting less popular.

Control Engineering Asia published an article, Hitachi Sunway Talks PLM Opportunities and Developments. Thammaya Chuay-iam of Hitachi Sunway Information Systems in Thailand shared his thoughts about some of the major trends happening in the PLM sector in Asia, as well as the opportunities and challenges. One of the topics caught my attention – it was specifically related to the issue of integration. Here is my favorite passage:

[Q] What are the biggest challenges being faced by manufacturers when it comes to their PLM activities? [A] Even though PLM initiatives within global companies have developed significantly over the years, the core challenge of the manufacturing industry remains the ever-growing need for consistent integration between PLM solutions and other enterprise systems. Another challenge is the need for focused solutions that address the needs of targeted groups within the PLM environment.

It made me think again about the integration topic. The problem has been here for the last 20-25 years. In many situations, the solutions companies are using have remained unchanged for decades: brutal export/import/sync of data. It brings complexity and it costs money.

What is my conclusion? I guess the fundamental idea of “data pumping” between different systems should be replaced by something better. I’ve touched on it in my posts about data ownership and data sharing. Web technologies can give us a better way to share, link and intertwine data. I believe it can be a better way, and it will replace brutal data synchronization. Just my thoughts…

Best, Oleg


CAD: Engineering Bundles vs. Granular Apps?

August 7, 2014

cad-mcad-ecad-bundle-future

Packages, bundles, product suites, integrated environments. I’m sure you are familiar with these names. The debates about best-of-breed solutions vs. single-vendor integrated suites go a long way back in the history of CAD and PLM. Some companies are ready to accept functional trade-offs because they are afraid of additional integration cost. For other companies, performance and functionality are absolutely essential to innovate.

Service-oriented architecture and web technologies are bringing a sense of granularity into the application and product world. In my view, the Develop3D article – Why granularity is going to rock the future – brings a very important perspective on the future of products. Al Dean speaks about granularity in data management. The complexity of products and data is growing, and more people need to work collaboratively on the same information (what was previously bundled into a single CAD file). Here is my favorite quote:

When you’re working on a system that’s remotely located on a server, whether that’s over your internal network or across the wider web, you’ll need to manage and exchange finite packets of information, features, sketch entities and such. The rise of collaborative working systems, such as Catia V6, will mean that users are working on the same data, in parallel, at the same time. If not at the same time, there will be times when design changes, down to feature and maybe sub-feature level, will need to be managed and rationalised. To do that, you need to manage and keep track of those individual parcels of data and packets of change. That’s going to require a level of granularity that’s way beyond what most data management systems are currently capable of.

Last year I watched a Tech4PD video capturing a debate between well-known PLM pundits Jim Brown and Chad Jackson – CAD Granularity vs. Integrated Suites. Navigate here to watch the recording. I voted for granularity, which was well captured by Chad Jackson’s statement. Here is the passage:

Granular CAD applications enable many roles in the enterprise, expanding the use of the 3D asset company-wide. Granular apps are better at enabling individual roles.

The latest blog post on Lifecycle Insights (again by Chad Jackson) – No More Excuses: It’s Time to Merge MCAD and ECAD – caught my attention with something that I see as the opposite of the principles of granularity. Chad is debating the need to bundle and unite the functionality of MCAD and ECAD applications. I found his conclusion in slight contradiction with the previously introduced concept of "granular CAD applications":

Why are there two separate toolsets at all? And that’s where, despite the lack of enthusiasm and interest in the topic, I think there is potential for disruption and innovation. There shouldn’t be two toolsets. You should be able to conduct mechanical and electrical design in a single CAD application…. Call it Hardware CAD (HCAD). Call it Electro-Mechanical CAD (EMCAD). I don’t care. But don’t tell me such an offering wouldn’t be intriguing. In my eyes, there is no reason that a combined MCAD-ECAD application shouldn’t be available. Large existing software providers have their reasons for inaction. But that means there is a ripe opportunity for disruption from smaller companies.

I want to elaborate more on the last point related to disruption and innovation. I explained my point of view in a blog post last year – The Future Unbundling Strategies in CAD/PLM. I want to repeat some of my assertions from last year:

1. CAD and PLM is too big to sustain as one big aggregated solution provided by a single vendor. This is a diverse, multi-faceted space that needs to be covered by multiple solutions, features and vendors.

2. Vendors are never good enough at seeing what exact problem customers want to solve, especially when it comes to large manufacturing companies and complicated supply chain ecosystems. That’s why armies of consulting services as well as diversified products must be applied to provide a final solution.

3. Customers often don’t know what problem to solve. In most situations, product development is a complex problem that requires a team of people to work on. In addition, large organizations are involved in politics and confrontation related to the usage of different enterprise software and tools.

What is my conclusion? I see very strong potential in unbundling existing large product suites. Take a piece of functionality, re-invent it, and provide bigger value to the customer. Cloud technologies and a future focus on data will be imperative to making it successful. Vendors’ focus is shifting towards services – it is about how to satisfy customers every day. Selling expensive bundles can become a thing of the past. Just my thoughts…

Best, Oleg


Thoughts about PDM/PLM jumbos and PLM glue

December 10, 2013

plm-big-systems-integrations

PDM vs. PLM. This topic usually raises lots of questions. It still does… People get confused by names and functions. A few years ago, I wrote three posts comparing PDM and PLM from different aspects – data, process and integration. Recently, Chad Jackson made me think about the PLM and PDM topic again with his write-up of Enovia capabilities. You might read my PDMish, PLMish and other CADPDLM bundles following Chad’s post.

The Aras blog is bringing up the PDM vs. PLM topic again. Navigate to the PDM or PLM? Yes. story by Peter Schroer, CEO and President of Aras. Peter draws a clear functional line between PDM and PLM. The following passage puts all the "dots" in the comparison between D and L in product development:

PDM doesn’t provide product configuration management (effectivity) or enterprise process management. It doesn’t keep the design in synch with product workflows or requirements management, it doesn’t manage non-CAD and non-file based data very well, and it doesn’t track where that part or assembly fits in to the entire system lifecycle process. While PDM is useful, it doesn’t help make supply chains more efficient, it doesn’t improve quality or customer satisfaction, and it doesn’t help increase revenue.

The recipe I captured in Aras’ blog suggests PLM play the role of glue that connects PDM (engineering) with the extended enterprise (the rest of the company):

PLM, or product lifecycle management, is the glue between PDM and the extended enterprise. PLM takes product data and puts it in the correct context for each user. For some users the CAD file is the center of their universe, but for many others CAD-based data is just a small subset of the entire set of product information they work with.

The last point about "glue" made me think about future integration strategies in the PDM/PLM world. There was a time when everybody had a dream of a single PLM system, used by everybody in the company and providing a holistic set of functions. However, nowadays the number of "single PLM" believers is going down.

So, what comes next? A few weeks ago, I discussed the idea of Future unbundling strategies in PLM. Thinking more about it, I can see the future separation of giant systems into small services as something more feasible. I can see how small features and functions gain traction in a company to fulfill a specific need – change management, configurations, engineering BOM, etc.

What is my conclusion? I can see more tool and service diversity in the future. It is very hard to provide a ready-to-go, out-of-the-box set of functions. Instead, I can see a set of services making product development, collaboration, data management and communication more efficient. Some tools will be cloud-based and some on-premise. Social platforms will play the role of one-big-system glue. Just my thoughts…

Best, Oleg


Why cloud won’t solve the CAD/PDM/PLM integration problem tomorrow

October 17, 2013

cloud-cad-pdm-plm-integration

All roads lead to Rome. Sometimes I have a feeling that whatever discussion happens in the CAD, engineering and manufacturing world, it will lead to PDM and PLM. My earlier conversation about the pros and cons of special CAD file sharing tools started here, and ended up on the GrabCAD blog by Hardi Meybaum here, with a conversation about what system should be used in an organization to manage data, users and workflows, and how to integrate these systems. Here is the passage:

In addition to the cons Oleg brought up I actually see a bigger issue – integration. A good thing about a generic/horizontal file sharing tool is that everyone is using the same system so it’s easy to manage data, users and workflows. Conversely, a manufacturing company might have someone who only needs to access very specific CAD data once or twice a quarter. Does this person really need access to your manufacturing-specific tool to get it? I think overcoming this problem is going to be an important obstacle to address in the next 3-5 years.

Another interesting point raised by Hardi was the changing role of VARs in the world of cloud/SaaS software. I discussed the future of PLM VARs last year here – What is the future of PLM VARs? My main point was that technical experience will become one of the main differentiation factors for future SaaS VARs. The GrabCAD blog puts it practically the same way:

The benefit of VARs repositioning within the CAD space is that most of them have offered value implementing rather complicated engineering systems to customers already. They understand the customer well and have software knowledge within the company. I believe the new model for VARs in the next 5 years will be reinvention – some will become software companies providing specific integration like Zapier and others will focus on very specific workflow integrations between horizontal products.

However, let me get back to the CAD, PDM, PLM and integration topic. While CAD file sharing is in high demand by almost all organizations these days, customer requirements are moving very fast from the pure need to share information to questions and requirements about how to control CAD data and manage change processes.

In the beginning, PLM vendors and implementers were trying to keep deep associations with their CAD roots. This is not true anymore. One of the latest trends is to focus PLM on the business level of the company and on how to improve company business processes. In one of my early articles, CAD data and PLM, I described this position, talking about CAD-rootless PLMs and the importance of integration between CAD and PLM.

At the same time, the center of integration gravity is still positioned on the integration of CAD and PDM. The complication of this type of integration is related to the multi-CAD nature of CAD data. Companies are using different CAD systems. Some CAD systems can be dominant, but you can always find data from multiple CAD systems in every organization. The GrabCAD post mentioned that as well. In the early days, PDM vendors were focused on developing plug-ins for every CAD system. This is still true for many vendors. However, the next trend, in my view, will be the openness of CAD systems toward providing interfaces that integrate with many PDM systems. You can read more about it in my post – Multi-CAD integration: Yesterday, Today and Tomorrow.

Last, but not least, is the topic of cloud/SaaS solutions. This is where Hardi Meybaum of GrabCAD believes the most. Here is an interesting passage from the GrabCAD post:

It will only be solved once there is enough adoption in the general market for the next generation of SaaS CAD tools like GrabCAD Workbench, Autodesk PLM360, TeamPlatform. At that point the integration challenges will evolve.

I can see a clear movement along the cloud/SaaS trajectory. However, in my view, despite significant growth in the design domain, co-existence will probably be the right word to describe the reality of the coming few years. I want to stress this point – CAD will be the last application on the list of PDM, PLM and other business services to move to the cloud. What is interesting to me is how vendors are going to support this “cloud transition”. Companies clearly won’t be able to move everything in a single shot. You can take a look at how I see the transition in my blog post – PDM/PLM Evolution: Final Step and Cloud / On-Premises Integration.

What is my conclusion? Integration is a complicated and important topic. However, to think integration will disappear with the movement to the cloud is too naive a position. Cloud/SaaS can clearly simplify some technological aspects of integrations. However, the complexity of design, data management and processes in an organization will be very hard to overcome only by changing the technological foundation. The "transition phase" will be another reason why vendors and customers will have to deal with integration for the coming decade, at least. Just my thoughts…

Best, Oleg


3 steps to put PLM / ERP each in their place

March 14, 2013

PLM and ERP integration. An endless topic. My PLM blogging friends Chad Jackson and Jim Brown decided to discuss it again on their Tech4PD video blog. Even though it is presented as a debate, I found much more similarity than difference between their statements about PLM/ERP integration. They both agree that PLM and ERP systems take a big part of the engineering information environment, that both systems are important, and that PLM and ERP are complementary in terms of their business functions.

There are differences in their positions, of course. Jim advocates for a single source of product data integrated with ERP, CRM and other systems – PLM is the master of product data. From Jim’s standpoint, the data from the PLM system is integrated and represented as product master data to all other data management systems. Chad agrees the integration is needed, but thinks low-touch integration is more secure and less risky. His opinion is that the total integration proposed by Jim is too big a shot to take. Chad advocates for simple integration and a so-called "one-time information push". From Chad’s standpoint, keeping integration simple is an easier and less risky path to integrating data. If you have a few minutes, I’d encourage you to watch the video (especially if you want to see how Jim takes a swim with polar bears).

So, a simple conclusion after 7 minutes of video – data needs to be integrated between PLM and ERP. Well, it left me somewhat disappointed. Ask anybody in the industry and they will tell you about the need to integrate information between PLM and ERP. I like the way Jim presents the need – we have to have a comprehensive solution. At the same time, Chad puts it in a very pragmatic way – complex integration won’t work; we need to have something simple in place. The debate between Jim and Chad made me think about possible steps to resolve the complexity of integration between PLM and ERP.

1. Move from "data ownership" to "data publishing". This is an important shift. Enterprise system integration people have spent decades debating the "data ownership" topic. It is all about the "master" and about how to establish a logic of data ownership. The concept of data publishing turns data ownership upside down: you shift the focus to how data is used. Access can be provided in multiple ways (APIs and a variety of data formats). My personal preference is to use linked data and the RDF/RDFS model (format) for that purpose.

2. Move from "data mapping" to "data description and connection". In other words, link everything. In the original concept of data integration (Chad called it "data push"), the focus is on how to map data located in one system to another system. The whole purpose is to take data from one system, convert (map) it and put it in another system. After you adopt the "data publishing" concept, you don’t need to transfer data. You just find the right data using model qualifiers. It simplifies the business logic and helps establish more reliable data integration mechanisms.

3. Include links to data elements from other systems. Think about this option like links on web pages: if you need to extend data or navigate to related data, you just follow the link, which takes you to the right place.
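The three steps above can be sketched in a few lines of Python: each system publishes records at stable URIs, and consumers follow links instead of copying data. The URI schemes and in-memory stores below are hypothetical stand-ins for real published endpoints:

```python
# "Publish and link" instead of "map and copy". Each system publishes its
# records under stable URIs (hypothetical schemes here); a record in one
# system links to a record in the other rather than duplicating it.

PLM = {"plm://items/P-100": {"description": "bracket", "rev": "B",
                             "erp_link": "erp://materials/M-9001"}}
ERP = {"erp://materials/M-9001": {"std_price": 4.2, "plant": "0001"}}

def resolve(uri):
    """Dereference a published URI (a stand-in for an HTTP GET in real life)."""
    for store in (PLM, ERP):
        if uri in store:
            return store[uri]
    raise KeyError(uri)

# Follow the link from the PLM item to its ERP cost - no data was copied,
# so there is nothing to keep in sync.
item = resolve("plm://items/P-100")
cost = resolve(item["erp_link"])["std_price"]
print(cost)  # 4.2
```

Each system remains the publisher of its own data; the consumer composes a view by following links, which is exactly how the web itself scales.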

What is my conclusion? I can see PLM and ERP integrations fail in many companies because they rely on old and outdated integration principles. These old principles rely on data ownership and the business of protecting data within the boundaries of enterprise applications. This is going to change soon. The new approach doesn’t move the data; instead, it provides an open and reliable mechanism to build an integration system similar to the web. Just my thoughts…

Best, Oleg

