Will “cloud” change the way we integrate PLM and ERP?

October 29, 2015


Integration plays a very important role in PLM implementations. PLM is intended to manage the design and engineering aspects of product development. ERP is intended to manage manufacturing resources, processes and inventory. Therefore PLM and ERP are complementary. Most PLM implementations I've seen required PLM-ERP integration.

At the same time, PLM-ERP integration is often the most complicated part. There is high diversity in the ways manufacturing companies manage data about products, bills of materials, parts, inventory and manufacturing processes. The outcome is multiple BOMs, product and item records, and the need to synchronize information between them.

Traditional PLM-ERP integration is complex and never done out-of-the-box. It requires detailed definitions, data mapping and a variety of data synchronization techniques. The latter is usually a coding effort done by a service provider or IT programmers. In some situations PLM and ERP vendors offer integration tools, but for different reasons, such as the cost and complexity of these tools, integration often ends up as SQL hacking into the PLM and ERP databases. Software vendors don't appreciate this approach, but they usually face the reality of large implementation complexity and just live with it. In most of these situations, vendors would not jeopardize a PLM deal by preventing customers from accessing databases directly. The result is a high cost of maintenance and problems during upgrades.
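To make the last point concrete, here is a minimal sketch of what this kind of direct SQL synchronization typically looks like. It is illustrative only: sqlite3 stands in for the PLM and ERP databases, and the table and column names (plm_items, erp_items, part_number, revision, description) are hypothetical, not any specific vendor's schema.

```python
# Illustrative only: sqlite3 stands in for the two enterprise databases,
# and all table/column names are hypothetical.
import sqlite3

plm = sqlite3.connect("plm.db")   # stand-in for the PLM database
erp = sqlite3.connect("erp.db")   # stand-in for the ERP database

# Demo setup so the sketch runs end to end.
plm.execute("CREATE TABLE IF NOT EXISTS plm_items "
            "(part_number TEXT PRIMARY KEY, revision TEXT, description TEXT, state TEXT)")
erp.execute("CREATE TABLE IF NOT EXISTS erp_items "
            "(part_number TEXT PRIMARY KEY, revision TEXT, description TEXT)")

# Read released item records straight out of the PLM schema...
items = plm.execute(
    "SELECT part_number, revision, description "
    "FROM plm_items WHERE state = 'Released'").fetchall()

# ...and write them straight into the ERP schema as a batch job.
# This is exactly the kind of code that breaks when either vendor
# changes its internal schema during an upgrade.
erp.executemany(
    "INSERT INTO erp_items (part_number, revision, description) VALUES (?, ?, ?) "
    "ON CONFLICT(part_number) DO UPDATE SET "
    "revision = excluded.revision, description = excluded.description",
    items)
erp.commit()
```

Nothing in this sketch goes through either vendor's API, which is exactly why this style of integration is cheap to start and expensive to maintain.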

Cloud technologies simplify IT and deployment. But, at the same time, cloud can create additional integration complexity. Traditional integration code, including direct SQL, is often not applicable without direct access to databases in a web environment. And the cloud environment is still very complex: it contains PLM, ERP and many other systems and services companies are using. A few months ago, I shared my thoughts about how to avoid cloud integration spaghetti. One of the biggest dangers is that closed data paradigms and data duplication between silos can bring the well-known data spaghetti from on-premise applications to the cloud.

For the last few months, I've been learning what cloud PLM companies are doing to simplify cloud PLM-ERP integration. I want to share some of my thoughts about it.

Autodesk PLM360

PLM Connect is a complete solution portfolio provided by PLM360 to integrate business systems. First of all, it applies to PLM-ERP integration, but not only to that. Earlier this year, at the PLM Accelerate 2015 conference in Boston, Autodesk promised to integrate PLM360 with everything. PLM360 uses Jitterbit middleware for integration.


In addition to that, Autodesk, seemingly inspired by IFTTT-like tools, announced "evented web" integration for PLM360. Read more here.


On the Jitterbit side, it looks similar to traditional middleware. The fact that it runs in the cloud doesn't change much. But it has a nice UI for integration and mapping. Also, the granularity of the REST API and the ease of coding can potentially make the PLM360 / Jitterbit environment more efficient. The evented-web integration style has advantages, but it is not clear to me how effectively it can be used to synchronize data between PLM and ERP environments.
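To illustrate the evented style, here is a minimal sketch of a webhook receiver that takes an item-change event pushed from a PLM system and forwards it to an ERP REST endpoint. Everything in it is assumed for illustration – the event payload, the field names and the erp.example.com endpoint – and it is not how PLM360 or Jitterbit actually implement their integration.

```python
# A hedged sketch of "evented web" integration: a tiny webhook receiver
# that listens for item-change events from a PLM system and forwards them
# to an ERP REST endpoint. Payload shape, field names and the URL are
# hypothetical.
import json
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

ERP_ITEMS_URL = "https://erp.example.com/erp/api/items"  # hypothetical endpoint

class PlmEventHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the event the PLM system pushed to us.
        length = int(self.headers.get("Content-Length", 0))
        event = json.loads(self.rfile.read(length))

        # Map the PLM event into the payload the ERP endpoint expects.
        erp_payload = json.dumps({
            "partNumber": event["itemNumber"],
            "revision": event["revision"],
            "description": event.get("description", ""),
        }).encode("utf-8")

        # Push the change to the ERP side with a granular REST call.
        req = urllib.request.Request(
            ERP_ITEMS_URL, data=erp_payload,
            headers={"Content-Type": "application/json"}, method="POST")
        urllib.request.urlopen(req)

        self.send_response(200)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), PlmEventHandler).serve_forever()
```

The appeal of this style is that changes flow as they happen instead of waiting for a nightly batch; the open question, as noted above, is how well it handles bulk synchronization between two large data sets.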

Arena – Kenandy integration

I've been learning about the Arena PLM integration with Kenandy ERP. My attention was caught by the following article and the Arena-Kenandy partnership press release. You can get some details about the integration by navigating to the following data sheet.

I spent some time looking into the specific ways the integration is done. Arena and Kenandy are not using middleware-style integration. At the same time, both support modern web-based APIs to code integration behavior, which allows the solutions to leverage service APIs on both sides for efficient and granular data integration. Arena and Kenandy synchronize data by transferring XML documents.
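As an illustration of the document-transfer style, here is a minimal sketch that serializes a BOM into an XML document and posts it to an ERP import endpoint. The XML schema, field names and URL are hypothetical and are not the actual Arena or Kenandy formats.

```python
# A minimal sketch of XML-document-based synchronization. The schema,
# the endpoint URL and the field names are hypothetical placeholders.
import urllib.request
import xml.etree.ElementTree as ET

def build_bom_document(assembly, lines):
    """Serialize one BOM into an XML document for transfer."""
    root = ET.Element("billOfMaterials", {"assembly": assembly})
    for line in lines:
        item = ET.SubElement(root, "item")
        ET.SubElement(item, "partNumber").text = line["part_number"]
        ET.SubElement(item, "revision").text = line["revision"]
        ET.SubElement(item, "quantity").text = str(line["quantity"])
    return ET.tostring(root, encoding="utf-8", xml_declaration=True)

def send_to_erp(document):
    """POST the BOM document to a (hypothetical) ERP import endpoint."""
    req = urllib.request.Request(
        "https://erp.example.com/api/bom-import",   # hypothetical URL
        data=document,
        headers={"Content-Type": "application/xml"},
        method="POST",
    )
    return urllib.request.urlopen(req).status

doc = build_bom_document("ASM-1000", [
    {"part_number": "P-100", "revision": "B", "quantity": 2},
    {"part_number": "P-200", "revision": "A", "quantity": 4},
])
print(doc.decode("utf-8"))
# send_to_erp(doc)  # would push the document to the ERP endpoint
```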


The administration console shows the status of data synchronization.


In my view, Arena-Kenandy is a modern variant of point-to-point integration realized using web services APIs. It makes the code easier, but still requires implementing synchronization logic between the systems.
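To show what that synchronization logic amounts to, here is a hedged sketch of one delta-based sync pass: ask the PLM side for items, push only those modified since the last run to the ERP side, and record a new watermark. The get_plm_items and push_to_erp callables are hypothetical placeholders for the web service APIs on each side.

```python
# A hedged sketch of point-to-point synchronization logic: detect which
# records changed since the last run and push only those deltas.
from datetime import datetime, timezone

def sync_changed_items(get_plm_items, push_to_erp, last_sync: datetime):
    """Push PLM items modified since last_sync to the ERP side."""
    pushed = 0
    for item in get_plm_items():              # call the PLM web API
        if item["modified_at"] > last_sync:   # delta detection
            push_to_erp(item)                 # call the ERP web API
            pushed += 1
    return pushed, datetime.now(timezone.utc)  # new sync watermark

# Example run with in-memory stand-ins for both web services:
now = datetime.now(timezone.utc)
fake_plm = lambda: [{"part_number": "P-100", "modified_at": now}]
count, watermark = sync_changed_items(fake_plm, lambda item: None,
                                      datetime(2015, 1, 1, tzinfo=timezone.utc))
print(count, "item(s) pushed; next watermark:", watermark)
```

Even in this simplified form you can see the pieces each implementation still has to own: change detection, field mapping, error handling and keeping track of the last successful run.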

Razorleaf – Clover Open Integration Platform

Companies providing PLM implementation services usually have a high sense of urgency to work on PLM-ERP integrations. It is part of their implementation schedules. My attention was caught by Razorleaf's announcement yesterday about the Clover Open Integration Platform. Read more here – Razorleaf Introduces Clover™, a New Open Integration Platform that Supports Any-to-Any Endpoint Integration for PLM Applications. The following passage provides a high-level explanation of what Clover does.

“The Clover platform is a result of our long-standing experience in creating CAD/CAM/PLM integration endpoints,” stated Eric Doubell, CEO of Razorleaf Corporation. “We now have created an industry standard application integration platform that has a flexible architecture and can scale easily based on its endpoint applications. This platform helps our customers retain the feature sets they have come to rely upon in their application investments and allow for a more controlled migration path forward when upgrading is a requirement. Making up-to-date data available across applications accelerates decision-making and process efficiency across the organization.”

Razorleaf provides services for different cloud and on-premise PLM environments. Learn more here. You can see on-premise and cloud systems including Autodesk PLM360 and Jitterbit. I'm still learning about the Clover technology and platform, so stay tuned for updates.

What is my conclusion? Cloud brings some limitations to integration techniques. Very often integration was done using direct SQL code and batch processing. You cannot do that anymore in a cloud-based / web environment. Web-based APIs can compensate for it, but they require products to support granular REST APIs for specific operations. This is something you want to be sensitive to when choosing a cloud PLM vendor. Web APIs can make cloud-based integration easy to code and implement. However, the cloud integration patterns are still the same – middleware or point-to-point integration. Cloud didn't bring anything new here, at least not from the standpoint of the systems I learned about. Integration remains complex and requires planning and resources during PLM implementations. A note for PLM architects and strategists. Just my thoughts…

Best, Oleg

PLM and Zero BOM errors: the devil is in the details

September 10, 2014


Managing a Bill of Materials (BOM) is not a simple job. Often you can hear a simple definition of a bill of materials as a "list of components needed to build a product". However, in reality, a BOM is much more complex and contains information about product structure and the ways a product is manufactured, maintained and even disposed of. The variety of requirements coming from multiple departments makes the BOM a complex information entity. Because of the diversity of disciplines, organizations and tools, the BOM is traditionally managed as separate structures related to design, engineering, manufacturing, support, supply chain, etc. Mistakes in bill of materials management are costly and painful to companies. They can lead to wrong material orders, shipment delays, regulation issues and many other problems.

My attention was caught by a few examples of PLM vendors emphasizing their ability to support "zero BOM errors" in their BOM management solutions.

The first example came from Dassault Systèmes ENOVIA. Navigate to my post – PLM, demolished silos and closed BOM loop. You can also get more information in a recent Razorleaf blog covering the ENOVIA conference here. According to ENOVIA, errors come from the synchronization of design and engineering BOMs. Therefore, the "zero file solution" strategy developed by Dassault Systèmes ENOVIA will lead to zero BOM problems. Here are passages from both articles:

“The zero error BOM (Bill of Materials) demands a zero file solution. 3DEXPERIENCE brings the zero file world into the engineering environment; what we do is to connect directly to product data, not to files.”

Dassault spent significant time at the event returning to the theme of the business benefits of ENOVIA, describing a “Power of Zero” mantra across ENOVIA’s capabilities (for example, “Target Zero BOM Errors”). ENOVIA CEO Andy Kalambi offered a nice overview of how these “Power of Zero” themes connect the direction of the ENOVIA product line with the business needs of ENOVIA’s customers.

The second example came from an Arena Solutions case study – How Nutanix Reduced BOM Errors to Absolute Zero. You can download the case study for free by registering on the website. Interestingly enough, the problem of "zero BOM errors" is completely different here. It speaks to collaboration and access to the BOM by multiple people in a team or even in different organizations, such as suppliers. Here is an interesting quote from the case study that outlines that:

“Our suppliers now access the same BOM and revision, and we have had zero wrong BOMs built since the system was implemented. Configuration integrity is assured… Change management was a nightmare,” said Sangster. “With several people making changes and suggestions to uncontrolled documents there were multiple revisions of the same BOM flying around the ether. No one had any trust in the data, so many local copies abounded based on the ‘mine is right’ premise.”

The devil is in the details. When you plan how to implement BOM management, you need to work on multiple use cases. A bill of materials has multiple points of failure. I mentioned two in my post today – 1/ synchronization between design and engineering/PLM tools; 2/ collaboration and change management scenarios. I can see many other use cases. When you plan a solution, it is important to focus on the specific problem you want to handle. At the same time, when a vendor speaks to you about "zero BOM errors", don't hesitate to ask questions. The same buzzwords sometimes mean different things.

What is my conclusion? BOM management is a complex domain. It is hard to overestimate the value of having a correct, error-free BOM. BOM errors are costly, and managing BOM consistency is one of the most important objectives of PLM solutions. At the same time, a BOM has multiple points of failure. This is a note to PLM implementers and IT people to focus on the important scenarios and not to take the "zero BOM errors" mantra as a silver bullet that solves all problems. Just my thoughts…

Best, Oleg

Thoughts about PLM whitepapers and PLM processes

March 28, 2013

Are you reading whitepapers these days? Hmm… Not much, you will probably say. I'm reading blogs and Twitter streams and use the Flipboard app. I think whitepapers are getting into a crisis similar to the one in the publishing industry. They are not as popular as IBM Redbooks were 10 years ago. PLM whitepapers are interesting in particular. Very often, whitepapers are sponsored by vendors and, as a result, lose their attractiveness. You can still learn from them, but the scope is limited by the commitment of the author to specific vendor(s). Another class of whitepapers presents research made by analyst or research firms. These whitepapers are interesting from the standpoint of data, but can also be limited.

I've been reading Razorleaf's whitepaper Achievable PLM by Jonathan Scott. The following link will give you access to the whitepaper in exchange for your email and phone number. Razorleaf is a business outfit focusing on PLM services without much focus on which vendor, product or technology to use. Take a few minutes this coming weekend and read the whitepaper. I found the following part of the whitepaper the most important to me. It helps customers identify the company and product development processes to be supported by a PLM system. It defines 3 main processes that you can find in almost every manufacturing company – NPD/NPDI, ECR/ECO/ECN and CM/BOM. Here is the passage from the whitepaper:

Do you have a process for dreaming up new products and turning them into something that can be made? Formally, some people call this New Product Development & Introduction (NPD or NPDI).

Do you have a process for changing the design of existing products to fix problems that you or your customers discover, and to improve your product so that more people will buy it? Many people call this Engineering Change Management (ECx) and there can be numerous subprocesses like Engineering Change Request (ECR), Engineering Change Order (ECO), and Engineering Change Notification (ECN).

Do you have a process for describing the “recipe” for your product, the list of ingredients/components that go together to make up your product? People in defense-related industries have been calling this Configuration Management (CM) for years, but a lot of other industries think of it as Bill-Of-Material (BOM) Management.

It made me think about a new type of PLM whitepaper focused on helping customers identify their product development processes. That would be a different set of whitepapers. Think about it as a collection of process recipes. One size doesn't fit all when it comes to product development processes in manufacturing companies. Having multiple options and viewpoints could be very beneficial if you are implementing PLM.

What is my conclusion? Content is a critical element these days. The whitepaper is an important type of content, in my view. You need good content to drive attention. It seems to me that good PLM whitepapers can play two important roles – drive traffic to the websites of PLM vendors and service providers, but most importantly, change the way customers perceive PLM implementations by providing practical information and guidance about PLM-related processes. Just my thoughts…

Best, Oleg

SolidWorks PDM Customers Focus on Office. Cloud? Meh…

August 2, 2011

August is a typical vacation time. At least, that's what I was thinking… Not any more. Today's news brought me at least two acquisition notices. One from Autodesk about its acquisition of Instructables – a popular online community where people can upload, discuss, rate and collaborate on a wide variety of do-it-yourself (DIY) projects. The second one, maybe less notable compared to Autodesk's, but very interesting in the context of PDM and PLM – Razorleaf acquired the Office2PDM product from Extensible CAD. Razorleaf is a small outfit in the PDM/PLM world providing services focused on the implementation of design automation, SharePoint, Product Data Management and PLM solutions.

So, what happened? Razorleaf acquired Office2PDM – a suite of products that better connects SolidWorks Enterprise PDM to Microsoft Office products. Read the full text of the Razorleaf announcement here. Here is the top quote:

Office2PDM offers a seamless integration between Microsoft Office products and SolidWorks Enterprise PDM (EPDM). EPDM Dashboard provides real-time access to Enterprise PDM vaults, right from Microsoft Outlook, for monitoring the status of documents and workflows.

In a nutshell, Office2PDM is for people who spend their lives in Microsoft Office products. The ability to seamlessly access EPDM data is important. Take a look at the following two videos.

Razorleaf published additional information on their blog about their intentions related to the future of Office2PDM:

Razorleaf plans to extend and enhance EPDM Dashboard and Office2PDM, as we strive to deliver on its promise of making PLM tools work better for our clients. Office2PDM and EPDM Dashboard are SolidWorks Partner Products, offering one of the highest levels of integration for access to the Enterprise PDM system. The transition from Extensible CAD to Razorleaf should be transparent to existing customers, and a net benefit given Razorleaf’s long history with Enterprise PDM and its predecessor, Conisio.

SolidWorks EPDM and Office

The Razorleaf acquisition made me think about some interesting aspects of PDM and PLM implementations related to Microsoft Office. The connection of EPDM and Office is a big deal, in my view. Email and Office are mainstream tools used by all manufacturing companies implementing SolidWorks and EPDM. Seamless Office integration is a key functionality that I expect to be highly demanded by everyone in the SolidWorks customer community.

What is my conclusion? There is one question I could not answer after reading the Razorleaf announcement. What happens with the V6 ENOVIA cloud tools and SolidWorks PDM? SolidWorks n!Fuze is the newest DS SolidWorks product on the cloud. Dassault released n!Fuze just a month ago. n!Fuze is supposed to provide a first step into the future of SolidWorks data management and collaboration tools on the cloud. How fast will SolidWorks EPDM be replaced by V6 on the cloud? In my view, Razorleaf is clearly saying – focus on Office. Cloud? Meh… Just my thoughts…

Best, Oleg

