PLM and integration business

August 19, 2015

plm-integration-business

Integrations. Enterprise software implementations depend heavily on the ability to integrate different pieces of software. Every PLM implementation I’ve seen required some sort of integration. It might be an integration of CAD and PDM packages, which is relatively straightforward in many situations. But it can also be very challenging, such as an integration between PLM and ERP functionality, which can bring many organizational and technological difficulties.

Most PLM integrations are done by integration and service partners. This removes many licensing problems with competing software from different vendors. The integration business is tricky. As an example of its turbulent character, you can read the news about the Informatica buyout a few weeks ago – Microsoft And Salesforce Join In $5.3 Billion Buyout Of Informatica. It is not directly related to the PLM world, but it gives some impression of the business of integration software (related to both Informatica and Tibco):

But Informatica couldn’t ultimately find a better option for its $1 billion in annual revenue business, which grew just 10% on constant currencies in Q2 of 2015 on software revenue growth of 13% and subscription growth of 44% year-to-year. That rate of growth was essentially flat from the year before. Like competitor Tibco, Informatica had fallen into a low-growth, mature sales cycle after seeing its stock soar and then crater when the dotcom bubble burst. Both had eventually regrown into multi-billion valuations, but after years of sales growth to get back where they were. Tibco was taken private in December for about $4.3 billion, $1 billion less than Informatica.

After some thinking, it occurred to me that large enterprise PLM implementations are essentially integration projects. They combine a very typical set of integration steps – analysis of data processes in the organization, data modeling, definition of data flows, reporting and monitoring tools. PLM platforms are essentially data integration toolkits that handle a very specific set of information. This connects to one of my previous articles – How PLM can avoid cloud integration spaghetti. As the PLM industry moves to the cloud, it must find a better way to deal with PLM implementations and their essential part – integrations.

It made me think about a few possible ways PLM vendors can change the trajectory of traditional integration and business strategies.

1- Open source PLM data toolkits. Open source software has a strong presence in the modern software ecosystem. For many software vendors today, open source is a natural way to develop products. I’ve been watching a few PLM open source initiatives, but most of them lacked product maturity. Turning part of an existing PLM platform into open source can trigger a change in the way PLM implementations are done. Aras Corp is the closest example of such an initiative. Although the Aras Innovator core module is not open source, most of the solutions developed on top of Aras are open source projects.

2- Automation platforms for trigger- and action-based integrations. You might be familiar with integration automation services such as Zapier and IFTTT. Both are extremely efficient at automating a variety of integration activities between cloud applications. These automation services provide a development platform for other companies to create specific integration connection points and services. Jitterbit is probably the closest example of an automation service in the PLM ecosystem.
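To make the trigger/action model concrete, here is a minimal sketch of how such an automation rule could look in code. Everything in it – the event name, the payload fields, the "systems" involved – is hypothetical and only illustrates the pattern, not the API of Zapier, IFTTT or Jitterbit.

```python
# A minimal trigger -> action registry in the spirit of Zapier/IFTTT rules.
from typing import Callable

# Registry mapping event names to the actions they trigger
_rules: dict[str, list[Callable[[dict], None]]] = {}

def on(event: str):
    """Decorator: register an action for a trigger event."""
    def register(action: Callable[[dict], None]):
        _rules.setdefault(event, []).append(action)
        return action
    return register

def fire(event: str, payload: dict) -> int:
    """Deliver an event to every registered action; return how many ran."""
    actions = _rules.get(event, [])
    for action in actions:
        action(payload)
    return len(actions)

# Hypothetical rule: when PLM approves an ECO, push the change to ERP
@on("eco.approved")
def push_eco_to_erp(payload: dict) -> None:
    print(f"ERP: creating change order for {payload['eco_id']}")

fire("eco.approved", {"eco_id": "ECO-1042"})
```

The interesting part is that new integrations become new rules rather than new point-to-point code – which is exactly what makes these platforms attractive.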

3- Integration businesses as part of cloud hosting services. In a growing ecosystem of cloud PLM software, hosting providers can play the role of implementation and integration service providers too. In my view, it is a very dynamic space. Large manufacturing companies that have implemented on-premise PLM will start looking at how to bring in cloud PLM solutions – integration will become the most challenging part of making that transformation happen.

What is my conclusion? PLM implementations are complex. And "integration" is the most complicated part of them. The traditional PLM implementation approach is holding back the PLM business. How do we turn PLM implementations into an agile and lean process? Improving PLM integration can be a good step toward cleaning up the mess of PLM implementations. Just my thoughts…

Best, Oleg


How PLM can avoid cloud integration spaghetti?

June 18, 2015

21-centure-spaghetti-integration

Enterprise integration is a messy space. It is always complex – applications, databases, new and legacy systems, complexity of requirements. People usually need at least a dozen applications to run things smoothly. It is never done out-of-the-box, and it always requires cycles of implementation and professional services.

I caught the following picture, tweeted yesterday by Stan Przybylinski of CIMdata. It provides an excellent view of integration complexity. Unfortunately, in many situations, integration is a major challenge in PLM adoption. To get the full value of PLM, a company must spend a fortune integrating data and processes – CAD, bills of materials, ECOs, etc.

data-silos

The cloud is coming to the enterprise these days. In many ways it creates a new way to think about data, software, services and maybe integrations too. New technologies and ecosystems of services can make a difference. The cloud also creates a significant demand for openness and interoperability. This is a main reason why PLM needs to learn web APIs. Web services and REST APIs are changing the way integration can be done.
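To illustrate why REST changes the economics of integration, here is what a minimal integration call could look like. The endpoint path and payload fields are hypothetical; the request is only constructed, not sent, so the sketch stays self-contained.

```python
# Sketch of a REST-style PLM integration call, assuming a hypothetical
# endpoint /api/v1/items on a cloud PLM system. We build the request to
# show its shape; no network traffic happens here.
import json
import urllib.request

def build_item_request(base_url: str, item: dict) -> urllib.request.Request:
    """Build a POST request that would create an item in a cloud PLM system."""
    body = json.dumps(item).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/api/v1/items",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_item_request("https://plm.example.com", {"number": "PN-100", "rev": "A"})
print(req.full_url, req.method)
```

Compare this to a classic on-premise integration: no client libraries to install, no database links – just HTTP, JSON and an agreed schema.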

Technology can make a difference. However, integrations are still hard. A few months ago, I shared my thoughts on how to prevent cloud PLM integration mistakes. It comes down to three main things – 1/ lost data semantics; 2/ limitations of data transfers; 3/ transaction management in distributed, cross-site environments.

Unfortunately, the cloud is not a silver bullet for integration challenges. A holistic integration continuum is still a thing of the future. In practice, cloud applications today are replicating the bad siloed behaviors of on-premise applications. I captured the following picture earlier this week at a Boston New Technology meetup.

cloud-silos-apps

This picture is a great demonstration of how the bad aspects of siloed on-premise applications are moving to the cloud environment. Migrating applications to cloud infrastructure such as IaaS can simplify IT’s life. However, it won’t make users’ lives simpler. From the end user’s standpoint, applications will still run in silos.

What is my conclusion? The danger is moving established on-premise PLM paradigms to the cloud. Technologically, new cloud systems can give advantages in terms of integration. REST APIs are one example – it is much easier to code integration scenarios using REST APIs and modern web-based tools. At the same time, closed data paradigms and data duplication between silos can bring the well-known data spaghetti from on-premise applications to the cloud. Just my thoughts…

Best, Oleg


How to prevent cloud PLM integration mistakes

February 2, 2015

connected-plm-erp

The cloud is a huge enabler of collaboration between people. It makes your data and processes accessible everywhere from a browser. It can help engineers and suppliers collaborate. It can help you integrate systems and people across the enterprise.

Let me speak about the last one. The integration topic is actually tricky. I shared some of my thoughts about cloud integration challenges a few months ago – Integration is holding back PLM cloud adoption. Last week, I had a chance to attend two webinars about PLM and integration.

The first was Become a Connected Manufacturing Enterprise with Agile Integration by Jitterbit. The following picture gives you a perspective on the problem of “connected manufacturing” and how architecture solutions like Autodesk PLM360 and Jitterbit are solving it.

plm360-jitterbit-1

Here is the view that shows you the reality of mixed (cloud and on-premise) integrations.

plm360-jitterbit-2

Another webinar, by CIMdata – “PLM & ERP: What’s the Difference, and Why Should you Care?” – provides another perspective on the integration challenges between engineering and manufacturing.

cimdata-plm-erp-1

cimdata-plm-erp-2

Companies are moving into mixed cloud and on-premise environments. This is a reality and we cannot avoid it. So, for the foreseeable future, we will have to deal with the integration of multiple systems – some will continue to run on premises and some will be elsewhere (in the public cloud). It made me think about potential mistakes you can run into while integrating systems.

1- Lost data semantics

Most integration scenarios are about how to send data back and forth between systems. It is hard to keep the semantics of data and not lose them when exchanging information. So, define what the data means and keep an overall integration data schema. Otherwise, the result can be messy.
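One way to avoid losing semantics is to make the mapping explicit instead of burying it in ad-hoc copy code. Here is a rough sketch; the PLM and ERP field names and the unit conversion are illustrative assumptions, not a real system’s schema.

```python
# Keeping data semantics explicit: a declarative field map between a
# hypothetical PLM export and a hypothetical ERP import.
PLM_TO_ERP = {
    "part_number": "material_id",
    "revision":    "version",
    "mass_kg":     "weight_g",   # semantic difference: kg in PLM, g in ERP
}

def to_erp(plm_record: dict) -> dict:
    """Translate one PLM record into the ERP schema, units included."""
    erp = {}
    for src, dst in PLM_TO_ERP.items():
        value = plm_record[src]
        if src == "mass_kg":          # convert units, don't just copy the number
            value = value * 1000
        erp[dst] = value
    return erp

print(to_erp({"part_number": "PN-100", "revision": "B", "mass_kg": 1.25}))
```

The point is that the mapping table itself becomes the "overall integration data schema" – something you can review, version and test.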

2- Data transfer limitation

Although some integration infrastructure allows you to implement data exchange quickly, you can underestimate the bandwidth requirements. Sending large packets of data can cause significant latency and create runtime errors and problems. Check what monitoring tools are available to handle such situations.
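A simple mitigation is to never send everything at once: split a large payload into bounded batches, so each request stays small enough to monitor and retry. A sketch, with an illustrative batch size:

```python
# Split a large BOM transfer into bounded batches instead of one huge payload.
# The batch size of 500 is an illustrative assumption, not a recommendation.
def chunked(items: list, size: int):
    """Yield consecutive slices of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

bom_lines = [{"line": n} for n in range(2500)]
batches = list(chunked(bom_lines, size=500))
print(len(batches), "batches of at most 500 lines")
```

Each batch can then be sent, acknowledged and retried independently – which is exactly what makes monitoring such transfers feasible.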

3- Transaction management

Most manufacturing systems are sensitive to transactions. Managing distributed transactions can be tricky and requires some fine tuning. Pay attention to how you handle errors when integrating transactional systems that manage ordering, lifecycle and bills of materials.
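The classic failure mode is a change that succeeds in one system and then fails in the other. One common pattern is a compensating action that undoes the first step when the second fails. A toy sketch – the "PLM" and "ERP" here are stand-in functions, not real APIs:

```python
# Cross-system error handling with a compensating action: if the ERP step
# fails after the PLM step succeeded, roll the PLM step back rather than
# leaving the two systems inconsistent.
def release_in_plm(eco: str, log: list) -> None:
    log.append(f"PLM: released {eco}")

def post_to_erp(eco: str, log: list, fail: bool = False) -> None:
    if fail:
        raise RuntimeError("ERP rejected the change")
    log.append(f"ERP: posted {eco}")

def undo_release_in_plm(eco: str, log: list) -> None:
    log.append(f"PLM: rolled back {eco}")

def sync_change(eco: str, erp_fails: bool = False) -> list:
    log: list = []
    release_in_plm(eco, log)
    try:
        post_to_erp(eco, log, fail=erp_fails)
    except RuntimeError:
        undo_release_in_plm(eco, log)   # compensate, don't leave a half-done change
    return log

print(sync_change("ECO-7", erp_fails=True))
```

Real implementations add retries, idempotency keys and audit trails on top, but the shape of the problem – and why error handling deserves attention – is the same.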

What is my conclusion? The complexity of integration is growing. Cloud systems bring many advantages, but they will create additional challenges for IT and professional services. Most integrations do not work out of the box. New tools running from the cloud can help you integrate faster, but they require good coordination with IT and upfront planning to prevent potential mistakes. Data integration is hard and requires experience and familiarity with manufacturing systems. Just my thoughts…

Best, Oleg

photo credit: freefotouk via photopin cc


SAP has a magic technology to improve enterprise integration

December 4, 2014

digital-integration

Integration is a big deal. When it comes to enterprise organizations, and specifically to manufacturing companies of different kinds, enterprise integration is one of the major challenges that influence broad PLM adoption and ROI. Enterprise integration isn’t a simple problem to solve. It requires a diverse set of tools to support data exchange and process orchestration, and PLM vendors provide a diverse set of solutions for that. On the strategic side, PLM companies are expanding their data reach by attempting to cover a larger scope of data and processes. You can read about it in my post about ECO and EBOM/MBOM synchronization complexity.

My attention was caught by a Betanews article – SAP launches new manufacturing solution to improve enterprise integration. It speaks about new technologies developed on top of the SAP HANA in-memory database to manage real-time integrations. Here is the passage that gives you a glimpse of what it is about:

SAP Manufacturing Execution includes the SAP Manufacturing Integration and Intelligence (SAP MII) application and SAP Plant Connectivity software. This allows it to provide machine-to-machine integration and orchestrate intelligent manufacturing. Using the existing SAP HANA platform it offers global visibility into operations by making manufacturing big data more accessible and enabling predictive analytics capabilities to be used in house or in the cloud. This gives businesses advanced problem solving ability and ease of access to manufacturing data so they can make improvements in cost, quality, asset utilization and performance.

I’ve touched on the SAP HANA topic before in my posts – Future PLM platforms and SAP / Oracle technological wars and Is SAP HANA the future of PLM databases.

I made a trip to the SAP MII website to dig for more information about the integration architecture. Here is an interesting document (with lots of technical details) that is worth reading if you are interested in SAP MII HANA integration – SAP Process Integration Act as Adapter for SAP MII integrating with Enterprise Applications. The document is available here. I captured the following architecture slide, which gives you a detailed view. More information is available here. From that picture you can see SAP’s view of how PLM and other business apps are going to be integrated with manufacturing and shop floor systems.

sap-mii-architecture

What is my conclusion? Modern manufacturing requires a high level of integration. It goes from design and engineering to manufacturing and operations. Data integration and transparency will allow companies to optimize performance, save cost and streamline processes. However, making it happen is not a simple job, and it requires a lot of hard-wiring, data transformation and openness. PLM vendors demand control over the MBOM to make vertical integration easier. As you can see in the pictures and documents above, SAP is working to create a grand vision of data integration from the shop floor to business applications and services. How and where PLM and ERP vendors will meet to create an integrated manufacturing solution is an interesting question to ask.

Best, Oleg


Integration is still the major challenge in PLM adoption

November 30, 2014

data-silos

I want to continue the discussion about data ownership and synchronization between islands of information in a company. The EBOM and MBOM synchronization I mentioned in my previous post is probably the most typical example, but there are many others. Supply chain, contract manufacturing, regulation, customer support and services – this is only a very short list of areas where data stays under the control of different systems. Even if pulling data under the control of a single system was an option a decade ago, my hunch is that these days the idea of PLM as one big silo is getting less popular.

Control Engineering Asia published an article, Hitachi Sunway Talks PLM Opportunities and Developments. Thammaya Chuay-iam of Hitachi Sunway Information Systems in Thailand shared his thoughts about some of the major trends happening in the PLM sector in Asia, along with the opportunities and challenges. One of the topics caught my attention. It was specifically related to the issue of integration. Here is my favorite passage:

[Q] What are the biggest challenges being faced by manufacturers when it comes to their PLM activities? [A] Even though PLM initiatives within global companies have developed significantly over the years, the core challenge of the manufacturing industry remains the ever-growing need for consistent integration between PLM solutions and other enterprise systems. Another challenge is the need for focused solutions that address the needs of targeted groups within the PLM environment.

It made me think again about the integration topic. The problem has been here for the last 20-25 years. In many situations, the solutions companies are using have remained unchanged for decades: brutal export/import/sync of data. It brings complexity and it costs money.

What is my conclusion? I guess the fundamental idea of “data pumping” between different systems should be replaced by something better. I touched on it in my post about data ownership and data sharing. Web technologies can give us a better way to share, link and intertwine data. I believe it is a better way, and it will replace brutal data synchronization. Just my thoughts…

Best, Oleg


CAD: Engineering Bundles vs. Granular Apps?

August 7, 2014

cad-mcad-ecad-bundle-future

Packages, bundles, product suites, integrated environments. I’m sure you are familiar with these names. The debates about best-of-breed solutions vs. single-vendor integrated suites go a long way back in the history of CAD and PLM. Some companies are ready for functional trade-offs and afraid of additional integration cost. For other companies, performance and functionality are absolutely essential to innovate.

Service-oriented architecture and web technologies are bringing a sense of granularity into the application and product world. In my view, the Develop3D article – Why granularity is going to rock the future – brings a very important perspective on the future of products. Al Dean speaks about granularity in data management. The complexity of products and data is growing. More people need to work collaboratively on the same information (what was bundled before as a single CAD file). Here is my favorite quote:

When you’re working on a system that’s remotely located on a server, whether that’s over your internal network or across the wider web, you’ll need to manage and exchange finite packets of information, features, sketch entities and such. The rise of collaborative working systems, such as Catia V6, will mean that users are working on the same data, in parallel, at the same time. If not at the same time, there will be times when design changes, down to feature and maybe sub-feature level, will need to be managed and rationalised. To do that, you need to manage and keep track of those individual parcels of data and packets of change. That’s going to require a level of granularity that’s way beyond what most data management systems are currently capable of.
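Those "parcels of data and packets of change" can be imagined as small, ordered change records applied to a shared model instead of whole-file saves. A deliberately simplified sketch – the model and operations are toy stand-ins, nothing like a real CAD kernel:

```python
# A toy "packets of change" model: each edit is a small, ordered change
# record applied to a shared feature dictionary, not a full-file save.
def apply_change(model: dict, change: dict) -> dict:
    """Apply one change packet, returning a new model state."""
    updated = dict(model)
    if change["op"] == "set":
        updated[change["feature"]] = change["value"]
    elif change["op"] == "delete":
        updated.pop(change["feature"], None)
    return updated

model = {"hole_1": {"d": 5.0}}
history = [
    {"op": "set", "feature": "hole_1", "value": {"d": 6.0}},    # user A resizes
    {"op": "set", "feature": "fillet_2", "value": {"r": 1.5}},  # user B adds
]
for change in history:
    model = apply_change(model, change)
print(model)
```

Because each packet names a single feature, two users’ edits can be tracked, merged or rationalised individually – the granularity Dean is asking for.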

Last year I watched a Tech4PD video capturing a debate between well-known PLM pundits Jim Brown and Chad Jackson – CAD Granularity vs. Integrated Suites. Navigate here to watch the recording. I voted for granularity. It was well captured by Chad Jackson’s statement. Here is the passage:

Granular CAD applications enable many roles in the enterprise, expanding the use of the 3D asset company-wide. Granular apps are better at enabling individual roles.

The latest blog post on Lifecycle Insights (again by Chad Jackson) – No More Excuses: It’s Time to Merge MCAD and ECAD – caught my attention with something that I see as the opposite of the principles of granularity. Chad debates the need to bundle and unite the functionality of MCAD and ECAD applications. I found his conclusion in slight contradiction with the previously introduced concept of "granular CAD applications":

Why are there two separate toolsets at all? And that’s where, despite the lack of enthusiasm and interest in the topic, I think there is potential for disruption and innovation. There shouldn’t be two toolsets. You should be able to conduct mechanical and electrical design in a single CAD application…. Call it Hardware CAD (HCAD). Call it Electro-Mechanical CAD (EMCAD). I don’t care. But don’t tell me such an offering wouldn’t be intriguing. In my eyes, there is no reason that a combined MCAD-ECAD application shouldn’t be available. Large existing software providers have their reasons for inaction. But that means there is a ripe opportunity for disruption from smaller companies.

I want to elaborate on the last point, related to disruption and innovation. I explained my point of view in a blog post last year – The Future Unbundling Strategies in CAD/PLM. I want to repeat some of my assertions from last year:

1. CAD and PLM are too big to sustain as one big aggregated solution provided by a single vendor. This is a highly diversified space that needs to be covered by multiple solutions, features and vendors.

2. Vendors are never good enough at seeing what exact problem customers want to solve, especially when it comes to large manufacturing companies and complicated supply chain ecosystems. That’s why armies of consulting services, as well as diversified products, must be applied to provide a final solution.

3. Customers often don’t know what problem to solve. In most situations, product development is a complex problem. It requires a team of people to work on it. In addition, large organizations are involved in politics and confrontation related to the usage of different enterprise software and tools.

What is my conclusion? I see very strong potential in unbundling existing large product suites. Take a piece of functionality, re-invent it, and provide bigger value to the customer. Cloud technologies and a future focus on data will be imperative to making it successful. Vendors’ focus is shifting towards services – towards satisfying customers every day. Selling expensive bundles may become a thing of the past. Just my thoughts…

Best, Oleg


Thoughts about PDM/PLM jumbos and PLM glue

December 10, 2013

plm-big-systems-integrations

PDM vs. PLM. This topic usually raises lots of questions. It still does… People get confused by names and functions. A few years ago, I wrote three posts comparing PDM and PLM from different aspects – data, process and integration. Recently, Chad Jackson made me think about the PLM and PDM topic again with his write-up of Enovia capabilities. You might read my PDMish, PLMish and other CADPDLM bundles, which followed Chad’s post.

The Aras blog brings up the PDM vs. PLM topic again. Navigate to the story PDM or PLM? Yes. by Peter Schroer – CEO and President of Aras. Peter draws a clear functional line between PDM and PLM. The following passage puts all the "dots" in the comparison between the D and the L in product development.

PDM doesn’t provide product configuration management (effectivity) or enterprise process management. It doesn’t keep the design in synch with product workflows or requirements management, it doesn’t manage non-CAD and non-file based data very well, and it doesn’t track where that part or assembly fits in to the entire system lifecycle process. While PDM is useful, it doesn’t help make supply chains more efficient, it doesn’t improve quality or customer satisfaction, and it doesn’t help increase revenue.

The recipe I captured in Aras’ blog suggests that PLM play the role of glue connecting PDM (engineering) and the extended enterprise (the rest of the company).

PLM, or product lifecycle management, is the glue between PDM and the extended enterprise. PLM takes product data and puts it in the correct context for each user. For some users the CAD file is the center of their universe, but for many others CAD-based data is just a small subset of the entire set of product information they work with.

The last point about "glue" made me think about future integration strategies in the PDM/PLM world. There was a time when everybody dreamed of a single PLM system used by everybody in the company, providing a holistic set of functions. However, nowadays the number of "single PLM" believers is going down.

So, what comes next? A few weeks ago, I discussed the idea of future unbundling strategies in PLM. Thinking about it more, I can see the future separation of giant systems into small services as something more feasible. I can see how small features and functions get traction in a company to fulfill a specific need – change management, configurations, engineering BOM, etc.

What is my conclusion? I can see more tool and service diversity in the future. It is very hard to provide a ready-to-go, out-of-the-box set of functions. Instead, I can see a set of services making product development, collaboration, data management and communication more efficient. Some tools will be cloud-based and some on-premise. Social platforms will play the role of one-big-system glue. Just my thoughts…

Best, Oleg
