PLM Integrations: If-then, hammers and polymorphism

September 14, 2015

Many years ago, I did some work implementing PLM-ERP integration using Microsoft BizTalk Server. It was a fascinating experience. BizTalk was expanding its integration capabilities and tools, and it worked very well for many cases. However, after many years, I have to admit that for some complex PLM-ERP use cases it was a stretch. A BizTalk blog article reminded me of two elements of integration complexity – schema mapping and scripting tools.


In a practical sense, many integration technologies end up as complex scripting and coding projects. Integration is still one of the most challenging parts of PLM implementations.

Fast forward to 2015. With the recent development of web and cloud technologies, there is a real opportunity to rethink the way PLM integrations are done – PLM needs to learn Web APIs.

Two weeks ago at the Autodesk PLM360 conference in Boston, I started a conversation about PLM integrations, inspired by Autodesk's presentation of an "evented web" integration approach. Thanks to Monica Schnitger – she gave it an excellent name – the IFTTT-ization of PLM integration technologies. It took me some time to think about the IFTTT approach. Autodesk introduced its integration technology using Jitterbit – a very easy integration tool that allows you to hook processes and events between PLM360 and other applications. I left the event with the following question – how different is PLM360/Jitterbit from native IFTTT tools?

Over the weekend, I got a good laugh reading Ed Lopategui's article – Accelerating the PLM Cloud Integration Bus. I think Ed is spot on in the following passage:

The 64,000 dollar question is if Autodesk is implementing Jitterbit like it was IFTTT for PLM360 why not open the door to IFTTT itself? Referring back to the SOA/ESB discussion at the beginning of this post (if was in there for some reason, you know), it’s important to note that point to point solutions just don’t scale. Once integration gets complicated enough IFTTT is just not going to cut it. And while things might be simple enough at the outset, it gets messy quick. You don’t want the wheels to come off and the bomb to explode at the first bump in the road. Best to found the system on what you’re really going to need in the end: a lightweight, business-facing ESB. But there’s no need to open that door completely at the outset, because very early stage a point-to-point event driven system might be just right.

It made me think again about the IFTTT approach. I went back to my earlier post – PLM Workflow "California Roll" Style – about IFTTT technologies as a driver for web-based workflows. In a nutshell, it is an easy tool to hook simple events between web services. But what is the limit of such logic?
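To make the idea concrete, the whole IFTTT pattern can be reduced to a few lines of code. Here is a minimal Python sketch – the event names and actions in it are hypothetical, for illustration only:

```python
# A minimal sketch of IFTTT-style "if this then that" logic.
# Event names and actions are hypothetical, for illustration only.

rules = []

def on(event_name):
    """Register an action to run when a named event fires."""
    def register(action):
        rules.append((event_name, action))
        return action
    return register

def fire(event_name, payload):
    """Dispatch an event to every matching rule."""
    results = []
    for name, action in rules:
        if name == event_name:
            results.append(action(payload))
    return results

@on("plm.item.released")
def notify_erp(payload):
    # In a real integration this would call the ERP web API.
    return f"ERP notified about item {payload['item']}"

print(fire("plm.item.released", {"item": "PN-1001"}))
# → ['ERP notified about item PN-1001']
```

This is exactly the "easy to hook, simple events" model – and also a hint at its limit: every rule is a flat, point-to-point pair with no shared structure between them.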

If you had a chance to study programming, you are probably familiar with the debate about the "if" statement. In many situations, "if" is considered an evil keyword in programming languages. Conditional clauses sometimes result in code that is hard to maintain. In my view, "if" is as bad as a hammer if you misuse it – but in some situations a hammer can be very useful. Remember, if you wrap "if" or "switch" around a type code, the code can smell bad. By branching on a type code, you create the possibility of ending up with numerous checks scattered all over your code, making maintenance more complex. This is where a more powerful technique comes in. It is called polymorphism, and it lets you move the branching decision closer to the root of your program.

The essence of polymorphism is that it allows you to avoid writing an explicit conditional when you have objects whose behavior varies depending on their types. As a result you find that switch statements that switch on type codes or if-then-else statements that switch on type strings are much less common in an object-oriented program.

This approach works in many programming languages, but I don't think a similar paradigm is supported in integration middleware. I have never heard of applying polymorphism in PLM integration scenarios. I think it could apply well, especially if both PLM and ERP systems can rely on a consistent classification infrastructure. If the code of an Item Master integration can be inherited to implement the specific aspects of integration for electrical, mechanical and other components, it might scale better than a simple "if-then" clause.
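As a sketch of what that could look like, here is a minimal Python example. Everything in it – class names, fields, classification codes – is hypothetical; it only illustrates inheriting an Item Master mapping for specific component classes instead of branching on a type code:

```python
# Sketch: replacing an "if type code" chain with polymorphism.
# Class names, fields and classification codes are hypothetical.

class ItemIntegration:
    """Base Item Master mapping shared by all component classes."""
    def to_erp(self, item):
        return {"number": item["number"], "description": item["description"]}

class ElectricalItemIntegration(ItemIntegration):
    def to_erp(self, item):
        record = super().to_erp(item)
        record["manufacturer_part"] = item.get("mpn")  # electrical-specific field
        return record

class MechanicalItemIntegration(ItemIntegration):
    def to_erp(self, item):
        record = super().to_erp(item)
        record["material"] = item.get("material")  # mechanical-specific field
        return record

# The branching decision happens once, at the root,
# driven by the classification infrastructure.
INTEGRATIONS = {
    "electrical": ElectricalItemIntegration(),
    "mechanical": MechanicalItemIntegration(),
}

def export_item(item):
    handler = INTEGRATIONS.get(item["class"], ItemIntegration())
    return handler.to_erp(item)

print(export_item({"class": "electrical", "number": "PN-1",
                   "description": "Resistor", "mpn": "R-10K"}))
```

Adding a new component class means adding one subclass, not hunting down every "if" scattered across the integration code.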

What is my conclusion? The devil of integration techniques is in the details. This is where it becomes interesting and you can see how a specific integration technology can scale. The cost of integration maintenance is the second most important criterion after the scaling factor. What is the limit of IFTTT-like integration, and how does it apply to PLM? Maybe cloud-based PLM can simplify the overall integration flow with IFTTT, and the PLM360/Jitterbit combination has some interesting technologies to make it happen? Maybe Autodesk implemented some "polymorphism" into the PLM360/Jitterbit bundle to make it scale? It could be an interesting turn, in my view. Just my thoughts (and speculation) for the moment…

Best, Oleg

PLM360 is promising to integrate with everything

September 4, 2015


Just thinking about integrations and PLM can hurt your mind. Integration is a complex and expensive business. It is not unusual to hear from manufacturing companies that they do manual integration between products to avoid unexpected complexity and cost. Usually, integration is done as a service project and requires skilled people familiar with PLM products, APIs and a variety of other technologies and tools.

The cloud is introducing a new level of challenges for manufacturing companies. Modern integration scenarios are breaking through corporate walls and are exposed to the online world. In my earlier article – How PLM can avoid cloud integration spaghetti – I discussed some of these challenges and potential ways to overcome them.

One of them is the opportunity to leverage Web APIs and automation platforms like IFTTT and Zapier to manage integration between systems. Read more in my PLM and integration business article.

It looks like the idea of IFTTT-like integration has found a few supporters in the Autodesk PLM360 team. Earlier this week during the PLM360 conference in Boston, Autodesk previewed some of its plans for a new generation of PLM integration. In a presentation by Jared Sund, Sr. Product Manager for PLM360, Autodesk made a promise to integrate with everything.

The technology behind this promising statement is Jitterbit middleware and the so-called "evented Web". The latter is very similar to the approach taken by automation platforms IFTTT and Zapier and relates to Web APIs (it reminded me of an old article – why PLM vendors need to learn Web API).

The following few slides show how PLM360 is going to make integration part of its Web interface.



"Evented Web" is good news for PLM360 customers because it brings a much easier way to create integration scenarios. The following slides and video give you an idea of how Autodesk is thinking to make it happen.


The following video shows a simple scenario of event-driven integration between PLM360 and

The PLM360 event-driven integrations made me think again about PLM integration. I don't think this approach is a silver bullet that solves all PLM integration problems. In the past, different middleware technologies were created to manage integration (if you have been in the integration domain long enough, you might remember the Microsoft BizTalk and IBM WBI products). The challenging aspect of integration is maintaining integrations and applying changes when new business requirements come. In addition, many event handlers can significantly slow down a system, so integration monitoring will be a highly demanded tool. So, if you are considering event-driven web integrations, think about these questions and make an assessment of the amount of data to be transferred.
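To illustrate the monitoring concern, here is a small Python sketch of an event-driven dispatcher that records how much time each handler consumes. The event names and the handler are hypothetical; the point is only that an evented-web system needs this kind of bookkeeping from day one:

```python
# Sketch: event-driven integration with basic handler monitoring.
# Event names and the ERP handler are hypothetical.
import time
from collections import defaultdict

subscribers = defaultdict(list)
stats = defaultdict(lambda: {"calls": 0, "seconds": 0.0})

def subscribe(event, handler):
    subscribers[event].append(handler)

def publish(event, payload):
    """Call every subscriber and record how long each handler takes,
    since many slow handlers can drag the whole system down."""
    for handler in subscribers[event]:
        start = time.perf_counter()
        handler(payload)
        stats[handler.__name__]["calls"] += 1
        stats[handler.__name__]["seconds"] += time.perf_counter() - start

def push_to_erp(change_order):
    pass  # would POST the change order to the ERP web API

subscribe("eco.approved", push_to_erp)
publish("eco.approved", {"eco": "ECO-42"})
print(stats["push_to_erp"]["calls"])  # → 1
```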

The selection of tools is another question. Personally, I love the IFTTT and Zapier approach. My question to the Autodesk and Jitterbit people is about the differentiation between Jitterbit tools and IFTTT-like platforms. What are the advantages of using PLM360 and Jitterbit? If PLM360 supports a REST API, it can be used with IFTTT or Zapier too. If an organization is already using IFTTT to integrate cloud products, why would it bring in the Jitterbit platform in addition to IFTTT?

What is my conclusion? Integration problems are painful. The fact that PLM360 is focusing on solving integration challenges is very good news for customers. Web products can provide a new approach to implementing integration scenarios, and the "evented web" is one of them. The cost of implementation and maintenance will still be the key factor driving customer decisions to implement integration and move away from manual re-entry of information between applications. Just my thoughts…

Best, Oleg

PLM and integration business

August 19, 2015


Integrations. Enterprise software implementations heavily depend on the ability to integrate different pieces of software. Each and every PLM implementation I've seen required some sort of integration. It might be integration of CAD and PDM packages, which is relatively straightforward in many situations. But it can also be a very challenging one, such as integration between PLM and ERP functionality, which can bring many organizational and technological difficulties.

Most PLM integrations are done by integration and service partners. It removes many problems with licensing competitive software from different vendors. The integration business is tricky. As an example of the turbulent character of the integration business, you can read the news about the Informatica buyout a few weeks ago – Microsoft And Salesforce Join In $5.3 Billion Buyout Of Informatica. It is not directly related to the PLM world, but it gives some impression of the business of integration software (related to both Informatica and Tibco):

But Informatica couldn’t ultimately find a better option for its $1 billion in annual revenue business, which grew just 10% on constant currencies in Q2 of 2015 on software revenue growth of 13% and subscription growth of 44% year-to-year. That rate of growth was essentially flat from the year before. Like competitor Tibco, Informatica had fallen into a low-growth, mature sales cycle after seeing its stock soar and then crater when the dotcom bubble burst. Both had eventually regrown into multi-billion valuations, but after years of sales growth to get back where they were. Tibco was taken private in December for about $4.3 billion, $1 billion less than Informatica.

After some thinking, it occurred to me that large enterprise PLM implementations are essentially integration projects. They combine a very typical set of integration steps – analysis of data processes in the organization, data modeling, defining flows of data, reporting and monitoring tools. PLM platforms are essentially data integration toolkits that handle a very specific set of information. This connected me to one of my previous articles – How PLM can avoid cloud integration spaghetti. As the PLM industry moves to the cloud, it must find a better way to deal with PLM implementations and their essential part – integrations.

It made me think about a few possible ways PLM vendors can change the trajectory of traditional integration and business strategies.

1- Open source PLM data toolkits. Open source software has a strong presence in the modern software ecosystem. For many software vendors today, open source is a natural way to develop products. I've been watching a few PLM open source initiatives, but most of them lacked product maturity. Turning part of an existing PLM platform into open source can trigger a change in the way PLM implementations are done. Aras Corp is the closest example of such an initiative. Although the Aras Innovator core module is not open source, most of the solutions developed on top of Aras are open source projects.

2- Automation platforms for trigger- and action-based integrations. You might be familiar with integration automation services such as Zapier and IFTTT. Both are extremely efficient at automating a variety of integration activities between cloud applications. These automation services provide a development platform for other companies to create specific integration connection points and services. Jitterbit is probably the closest example of automation services in the PLM ecosystem.

3- Integration businesses as part of cloud hosting services. In a growing ecosystem of cloud PLM software, hosting providers can play the role of implementation and integration service providers too. In my view, it is a very dynamic space. All the large manufacturing companies that have implemented on-premise PLM as of today will start looking at how to bring in cloud PLM solutions – integrations will become the most challenging part of making that transformation happen.

What is my conclusion? PLM implementations are complex. And "integration" is the most complicated part of them. The traditional PLM implementation approach is holding back the PLM business. How can PLM implementations be turned into an agile and lean process? PLM integration improvement can be a good step toward cleaning up the mess of PLM implementations. Just my thoughts…

Best, Oleg

How PLM can avoid cloud integration spaghetti?

June 18, 2015


Enterprise integration is a messy space. It is always complex – applications, databases, new and legacy systems, complexity of requirements. People usually need to run at least a dozen applications to keep things running smoothly. It is never done out of the box, and it always requires cycles of implementation and professional services.

I caught the following picture, tweeted yesterday by Stan Przybylinski of CIMdata. It provides an excellent view of integration complexity. Unfortunately, in many situations, integration is a major challenge in PLM adoption. To get the full value of PLM, a company should spend a fortune integrating data and processes – CAD, bill of materials, ECO, etc.


The cloud is coming to the enterprise these days. In many ways it creates a new way to think about data, software, services and maybe integrations too. The new technologies and ecosystem of services can make a difference. It also creates a significant demand for openness and interoperability. This is the main reason why PLM needs to learn web APIs. Web services and REST APIs are changing the way integration can be done.
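As a small illustration of why REST lowers the integration barrier, here is a Python sketch that builds (but does not send) an HTTP request pushing a PLM item to another system as JSON. The endpoint URL and field names are my assumptions, not any vendor's actual API:

```python
# Sketch: what "integration via REST API" looks like in practice.
# The endpoint URL and field names are hypothetical.
import json
import urllib.request

def build_item_request(base_url, item):
    """Build (but don't send) an HTTP request that pushes a PLM item
    to another system's REST endpoint as a JSON document."""
    body = json.dumps(item).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/api/items",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_item_request("https://erp.example.com", {"number": "PN-1001", "rev": "B"})
print(req.get_method(), req.full_url)
# → POST https://erp.example.com/api/items
```

Compare this to the days of proprietary COM APIs or middleware adapters – the whole exchange is a URL, a verb and a JSON payload that any tool (including IFTTT-like platforms) can produce.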

Technology can make a difference. However, integrations are still hard. A few months ago, I shared my thoughts on how to prevent cloud PLM integration mistakes. It comes down to three main things – 1/ lost data semantics; 2/ limitations of data transfers; 3/ transaction management in a distributed, cross-site environment.

Unfortunately, the cloud is not a silver bullet that solves integration challenges. The demand for a holistic integration continuum is still in the future. In practice, cloud applications today are replicating the bad siloed behaviors of on-premise applications. I captured the following picture earlier this week at the Boston New Technology meetup.


This picture is a great demonstration of how the bad aspects of siloed on-premise applications are moving to the cloud environment. Migration of applications to cloud infrastructure such as IaaS can simplify IT's life. However, it won't make the life of users simpler. From the end user's standpoint, applications will still run in silos.

What is my conclusion? The danger is moving established on-premise PLM paradigms to the cloud. Technologically, new cloud systems can give advantages in terms of integrations. REST API is one example – it is much easier to code integration scenarios using REST APIs and modern web-based tools. At the same time, closed data paradigms and data duplication between silos can bring the well-known data spaghetti from on-premise applications to the cloud. Just my thoughts…

Best, Oleg

How to prevent cloud PLM integration mistakes

February 2, 2015


The cloud is a huge enabler of collaboration between people. It makes your data and processes accessible from anywhere in a browser. It can help you collaborate between engineers and suppliers. It can help you integrate systems and people across the enterprise.

Let me speak about the last one. The integration topic is actually tricky. I shared some of my thoughts about cloud integration challenges a few months ago – Integration is holding back PLM cloud adoption. Last week, I had a chance to attend two webinars about PLM and integration.

The first was Become a Connected Manufacturing Enterprise with Agile Integration by Jitterbit. The following picture gives you a perspective on the problem of "connected manufacturing" and how architecture solutions like Autodesk PLM360 and Jitterbit are solving it.


Here is a view that shows you the reality of mixed (cloud and on-premise) integrations.


Another webinar, by CIMdata – "PLM & ERP: What's the Difference, and Why Should you Care?" – provides another perspective on the integration challenges between engineering and manufacturing.



Companies are moving into mixed cloud and on-premise environments. This is a reality and we cannot avoid it. So, for the foreseeable future, we will have to deal with the integration of multiple systems – some of them will continue to run on premises and some of them will be elsewhere (in the public cloud). It made me think about the potential mistakes you can run into while integrating systems.

1- Lost data semantics

Most integration scenarios are about sending data back and forth between systems. It is hard to keep the semantics of the data and not lose it when exchanging information. So, define what the data means and keep an overall integration data schema. Otherwise, the result can be messy.
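One way to keep semantics from getting lost is to make the mapping explicit – fields, units and conversions declared in one place instead of buried in transfer code. A minimal Python sketch, where the field names and units are hypothetical:

```python
# Sketch: keeping data semantics explicit in an integration schema.
# Field names and units are hypothetical, for illustration only.

# An explicit mapping: source field -> (target field, unit, converter)
FIELD_MAP = {
    "mass_kg":   ("weight", "g", lambda v: v * 1000),  # PLM stores kg, ERP expects grams
    "item_no":   ("number", None, str),
    "lifecycle": ("status", None, str.upper),
}

def translate(record):
    """Translate a record through the declared schema, so units and
    meanings don't get silently lost between systems."""
    out = {}
    for src, (dst, _unit, convert) in FIELD_MAP.items():
        if src in record:
            out[dst] = convert(record[src])
    return out

print(translate({"mass_kg": 2, "item_no": 1001, "lifecycle": "released"}))
# → {'weight': 2000, 'number': '1001', 'status': 'RELEASED'}
```

The schema becomes a single document you can review with both the PLM and ERP teams, instead of semantics scattered across scripts.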

2- Data transfer limitation

Although some integration infrastructure allows you to implement data exchange quickly, you can underestimate the bandwidth requirements. Sending large packets of data can cause significant latency and create runtime errors and problems. Check what monitoring tools are available to handle such situations.
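A simple defensive technique is to chunk a large transfer into batches sized to the target API's limits. A Python sketch – the batch size of 100 is an assumption to tune per system:

```python
# Sketch: chunking a large transfer instead of sending one huge payload.
# The batch size is an assumption; tune it to the target API's limits.

def batches(items, size=100):
    """Yield successive chunks so each request stays small."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

parts = [f"PN-{n}" for n in range(250)]
print([len(b) for b in batches(parts)])  # → [100, 100, 50]
```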

3- Transaction management

Most manufacturing systems are sensitive to transactions. Managing distributed transactions can be tricky and requires some fine tuning. Pay attention to how you handle error processing when integrating transactional systems that manage ordering, lifecycle and bills of materials.
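For error processing, a common pattern is retry with exponential backoff, re-raising after the last attempt so the caller can roll back or queue the change for manual review. A Python sketch with a simulated flaky ERP call (the function and its failures are invented for illustration):

```python
# Sketch: retry with backoff around a transactional update.
# The flaky_erp_post function and its failures are simulated.
import time

def with_retries(operation, attempts=3, delay=0.01):
    """Run an operation, retrying on connection failures; re-raise after
    the last attempt so the caller can roll back or queue for review."""
    last_error = None
    for attempt in range(attempts):
        try:
            return operation()
        except ConnectionError as error:
            last_error = error
            time.sleep(delay * (2 ** attempt))  # exponential backoff
    raise last_error

calls = {"n": 0}
def flaky_erp_post():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("ERP temporarily unavailable")
    return "order committed"

print(with_retries(flaky_erp_post))  # → order committed
```

The important part is the final re-raise: a distributed integration must surface the failure to whatever compensating logic (rollback, dead-letter queue, manual review) the business process defines.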

What is my conclusion? The complexity of integration is growing. Cloud systems bring many advantages, but they will create additional challenges for IT and professional services. Most integrations don't work out of the box. New tools running from the cloud can help you integrate faster, but they will require good coordination with IT and upfront planning to prevent potential mistakes. Data integration is hard and requires experience and familiarity with manufacturing systems. Just my thoughts…

Best, Oleg

photo credit: freefotouk via photopin cc

SAP has a magic technology to improve enterprise integration

December 4, 2014


Integration is a big deal. When it comes to enterprise organizations, and specifically manufacturing companies of different kinds, enterprise integration is one of the major challenges influencing broad PLM adoption and ROI. Enterprise integration isn't a simple problem to solve. It requires a diverse set of tools to support data exchange and process orchestration, and PLM vendors provide a diverse set of solutions for that. On the strategic side, PLM companies are expanding their data reach by attempting to cover a larger scope of data and processes. You can read about it in my post about ECO and EBOM/MBOM synchronization complexity.

My attention was caught by a betanews article – SAP launches new manufacturing solution to improve enterprise integration. It speaks about new technologies developed on top of the SAP HANA in-memory database to manage real-time integrations. Here is a passage that gives you a glimpse of what it is about:

SAP Manufacturing Execution includes the SAP Manufacturing Integration and Intelligence (SAP MII) application and SAP Plant Connectivity software. This allows it to provide machine-to-machine integration and orchestrate intelligent manufacturing. Using the existing SAP HANA platform it offers global visibility into operations by making manufacturing big data more accessible and enabling predictive analytics capabilities to be used in house or in the cloud. This gives businesses advanced problem solving ability and ease of access to manufacturing data so they can make improvements in cost, quality, asset utilization and performance.

I've touched on the SAP HANA topic before in my posts – Future PLM platforms and SAP / Oracle technological wars and Is SAP HANA the future of PLM databases.

I made a trip to the SAP MII website to dig for more information about the integration architecture. Here is an interesting document (with lots of technical detail) worth looking at if you are interested in SAP MII HANA integration – SAP Process Integration Act as Adapter for SAP MII integrating with Enterprise Applications. The document is available here. I captured the following architecture slide to give you a detailed view. More information is available here. From that picture you can see SAP's view on how PLM and other business apps are going to be integrated with manufacturing and shopfloor systems.


What is my conclusion? Modern manufacturing requires a high level of integration. It goes from design and engineering to manufacturing and operations. Data integration and transparency will allow companies to optimize performance, save cost and streamline processes. However, making it happen is not a simple job, and it requires a lot of hard-wiring, data transformation and openness. PLM vendors demand control over the MBOM to make vertical integration easier. As you can see in the pictures and documents above, SAP is working to create a grand vision of data integration from the shopfloor to business applications and services. How and where PLM and ERP vendors will meet to create an integrated manufacturing solution is an interesting question to ask.

Best, Oleg

Integration is still the major challenge in PLM adoption

November 30, 2014


I want to continue the discussion about data ownership and synchronization between islands of information in a company. The EBOM and MBOM synchronization I mentioned in my previous post is probably the most typical one, but there are many others. Supply chain, contract manufacturing, regulation, customer support and services – this is only a very short list of areas where data stays under the control of different systems. Even if the idea of pulling data under the control of a single system was an option a decade ago, my hunch is that these days the idea of PLM as one big silo is getting less popular.

Control Engineering Asia published an article – Hitachi Sunway Talks PLM Opportunities and Developments. Thammaya Chuay-iam of Hitachi Sunway Information Systems in Thailand shared his thoughts about some of the major trends happening in the PLM sector in Asia, as well as the opportunities and challenges. One of the topics caught my attention. It was specifically related to the issue of integration. Here is my favorite passage:

[Q] What are the biggest challenges being faced by manufacturers when it comes to their PLM activities? [A] Even though PLM initiatives within global companies have developed significantly over the years, the core challenge of the manufacturing industry remains the ever-growing need for consistent integration between PLM solutions and other enterprise systems. Another challenge is the need for focused solutions that address the needs of targeted groups within the PLM environment.

It made me think again about the integration topic. The problem has been here for the last 20-25 years. In many situations, the solutions companies use have remained unchanged for decades. It is the brutal export/import/sync of data. It brings complexity and it costs money.

What is my conclusion? I guess the fundamental idea of "data pumping" between different systems should be replaced by something better. I've touched on it in my posts about data ownership and data sharing. Web technologies can give us a better way to share, link and intertwine data. I believe it can be a better way, and it will replace brutal data synchronization. Just my thoughts…

Best, Oleg

