PLM: configuration vs. customization. Let's sort it out.

October 6, 2015


Enterprise software customizations are painful. Remember my old post – Is PLM customization a data management Titanic? Nobody likes to customize PLM software, but every company does it to some degree during implementation. You can catch up on my previous articles about that – How to eliminate PLM customization problem and How to de-customize PLM. The demand is to eliminate the need to customize systems, but how feasible is that?

My earlier conclusion was that PLM vendors need to think about how to make implementations cost-effective and how to support the flexibility of PLM products and tools. It is especially important in the era of cloud computing and a growing number of cloud deployments. PLM vendors will have to invest in technologies and methods that simplify deployment and improve the flexibility and speed of implementations.

Jim Brown and Stan Przybylinski, both well-known analysts in the PLM industry, just released a funny video and a serious interview on PLM customization. Navigate here to read more. Watch the video:

It brings up the topic of the difference between customization and configuration, which can be confusing. Where is the border between customization and configuration? So I thought it would be useful to clarify things a bit and put them in the perspective of modern technological trends and development. Both configuration and customization aim to alter a software product's behavior. At the same time, the two approaches differ.


In the old days of enterprise software, customization meant altering software code. The customized product was then deployed by the customer. It took time and was expensive. In addition, future releases of the product could become incompatible with the customized version.

Over the last 10-15 years, enterprise software (PLM software included) developed ways to customize systems using APIs and data model changes. For most PLM products, the trick was to use only approved APIs and not to hack the data model with direct SQL commands. That last one was a grey area: many customers did it, but not everyone will admit the guilt.


The term configuration means that system behavior is altered using vendor-supplied configuration tools. Some systems provide a more user-friendly administration UI, which became important, especially for software integrators running PLM implementations for their clients. Because configuration tools are provided by the vendor, the vendor takes care of compatibility between future releases.

So, "configuration" assumes that you don’t need to write "code" to configure the system. But it can be a bit complicated. Especially when it comes to APIs. What if API is provided by vendor?

APIs – the devil is in details

Application Programming Interfaces (APIs) have become popular over the last two decades. The demand for openness, integration and broader platform development made vendors invest more in API development. Many of these APIs are used by vendors and partners for application development and… customization.

Here is the thing: APIs are getting more popular and easier to use. Over the last decade, scripting languages such as JavaScript made APIs a very effective way to configure and customize system behavior. Many of them are used for automation and integration.

Web APIs and cloud technologies

Cloud brings many challenges to enterprise software configuration and customization. Many well-known techniques (especially those related to SQL and database customization) cannot be used, since databases are hidden behind web and application servers. Multi-tenant cloud systems add even more complexity to supporting database-level customization.

As a result of the development of web and cloud technologies, there is increased demand for two things – 1/ robust configuration tools provided by vendors; 2/ web-based APIs. Together, APIs and configuration tools need to support the demand for PLM system flexibility.
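As a rough illustration of the second point, here is a minimal sketch of what web-API access to a cloud PLM system could look like. The endpoint, payload shape and token handling are assumptions for illustration, not any real product's API.

```typescript
// In a multi-tenant cloud system, the database is hidden behind the
// application server; all access goes through authenticated HTTP calls
// instead of direct SQL.
interface EcoSummary {
  id: string;
  title: string;
  state: string;
}

async function listOpenChangeOrders(
  baseUrl: string,
  token: string
): Promise<EcoSummary[]> {
  // Hypothetical REST endpoint; the path and query are invented.
  const response = await fetch(`${baseUrl}/api/v1/change-orders?state=open`, {
    headers: { Authorization: `Bearer ${token}` },
  });
  if (!response.ok) {
    throw new Error(`Request failed: ${response.status}`);
  }
  return (await response.json()) as EcoSummary[];
}
```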

What is my conclusion? It is important to understand what is behind the "configuration vs. customization" semantics. Even more important is to align customer requirements with the level of flexibility a PLM product and its technology can support. There is demand for open, flexible and configurable systems that can be configured both with tools and with a wide range of APIs, and all of these options should be supported by the vendor. The development of web APIs and cloud-based automation tools makes both configuration tools and APIs important for successful PLM implementations. Just my thoughts…

Best, Oleg

Image courtesy of blackzheep.

PLM and a future of deep linking

June 19, 2015


I like links. Links are fascinating: they are about the connections between products, people, companies and things. The nature of our life today is to stay connected; therefore, links are important. It is natural to see links appearing everywhere in engineering and manufacturing too. Think about a product and its dependencies. Information is naturally linked across assemblies, parts, documents, bills of materials, materials, suppliers, manufacturing, the shop floor, orders and many other things.

The nature of things in PLM is to be connected. At the same time, the reality of engineering software is highly fragmented. The time when vendors and customers believed in a single system (or database) that could contain and manage all information is over. Customers use multiple applications, and it is not unusual to see two or more PLM systems in the same company. When it comes to the supply chain, the situation is even more complex.

Application integration remains one of the most painful aspects of enterprise software, and PLM can clearly lead the wave of complexity involved in implementations. My article yesterday – How PLM can avoid cloud integration spaghetti – was a warning to everyone who imagines that the cloud will be a silver bullet that kills integration pain. It won't. Cloud integration can sometimes be even more complex than traditional integration hacks using SQL and ETL tools.

I want to continue discussing the topic of cloud integration. The topic I'm bringing up today is so-called "deep linking". If you're not familiar with it, navigate to the following Wikipedia article – Deep Linking – for some background information.

In the context of the World Wide Web, deep linking consists of using a hyperlink that links to a specific, generally searchable or indexed, piece of web content on a website (e.g. "http://example.com/path/page"), rather than the website's home page (e.g. "http://example.com"). The technology behind the World Wide Web, the Hypertext Transfer Protocol (HTTP), does not actually make any distinction between "deep" links and any other links—all links are functionally equal. This is intentional; one of the design purposes of the Web is to allow authors to link to any published document on another site. The possibility of so-called "deep" linking is therefore built into the Web technology of HTTP and URLs by default—while a site can attempt to restrict deep links, to do so requires extra effort. According to the World Wide Web Consortium Technical Architecture Group, "any attempt to forbid the practice of deep linking is based on a misunderstanding of the technology, and threatens to undermine the functioning of the Web as a whole".[1]

The TechCrunch article – A brief history of deep linking – brings an interesting perspective on the trajectory of deep linking development on the web and in the app world. Below is my favorite passage. It is important because it gives a notion of how to treat standards in the internet and application world.

In order for me to write this article, and for you to be able to read it, we have to share a common language: modern English. The same holds true for directing users through deep links — in order to construct a deep link that an application will understand, we need to have some shared way of expressing information or addresses. In software engineering, a well-defined shared vernacular is defined by a “standard.”

The problem with standards, though, is that many of them do not actually become standard practice, and introduce as much fragmentation as they resolve. I could define the word “basilafacitarian” as “a person who likes basil a lot,” but unless it enters the common vernacular, it’s useless as a means of communication for me to tell you that I like basil.

The same is true for an app speaking to another app; unless the URL “myapp://show-item/id123” is mutually agreed upon, there’s no guarantee what the receiving app will do with it.

Coming back to the PLM world, we can see customers and vendors struggling with data and application fragmentation. It is not getting any better in the cloud PLM world – we just move to another set of APIs and technologies.
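To put the TechCrunch point in PLM terms, here is a toy sketch of what a mutually agreed deep link scheme for PLM items could look like. The scheme name and path layout are invented for illustration; no vendor uses exactly this format.

```typescript
// A hypothetical agreed-upon link format: plm://<system>/<type>/<id>?rev=<rev>
interface PlmLink {
  system: string;   // which application should handle the link
  itemType: string; // e.g. "part", "eco", "document"
  itemId: string;
  revision?: string;
}

function buildLink(link: PlmLink): string {
  const base = `plm://${link.system}/${link.itemType}/${link.itemId}`;
  return link.revision ? `${base}?rev=${link.revision}` : base;
}

function parseLink(raw: string): PlmLink {
  // WHATWG URL parsing handles custom schemes with an authority part.
  const url = new URL(raw);
  const [itemType, itemId] = url.pathname.split("/").filter(Boolean);
  return {
    system: url.host,
    itemType,
    itemId,
    revision: url.searchParams.get("rev") ?? undefined,
  };
}

// buildLink({ system: "plm-a", itemType: "part", itemId: "id123", revision: "B" })
// -> "plm://plm-a/part/id123?rev=B"
```

Unless both the sending and the receiving applications agree on this format, the link is useless – which is exactly the standards problem the quote describes.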

What is my conclusion? The idea of setting standards for deep linking is interesting. It can provide some level of solution to proprietary silos, data islands and the challenge of pumping data back and forth between applications. It is not simple and requires some level of coordination between vendors and customers. We already have enough history on the web and in the app world to learn from it and correct the course – to control data and make it transparent at the same time. Just my thoughts…

Best, Oleg

Image courtesy of Stuart Miles.

Importance of PLM and PIM bridge

October 11, 2014


PIM. Product Information Management. Sorry for bringing yet another three-letter acronym into the discussion today. PIM stands for the discipline of managing data about products that is made available outside the company. Here is the Wikipedia description:

Product information management or PIM refers to processes and technologies focused on centrally managing information about products, with a focus on the data required to market and sell the products through one or more distribution channels. A central set of product data can be used to feed consistent, accurate and up-to-date information to multiple output media such as web sites, print catalogs, ERP systems, and electronic data feeds to trading partners. PIM systems generally need to support multiple geographic locations, multi-lingual data, and maintenance and modification of product information within a centralized catalog to provide consistently accurate information to multiple channels in a cost-effective manner.

The Kalypso Viewpoints on Innovation article – Is Your Data Holding You Back? Product Information Management for Retail – brings up the topic of PIM's importance for the retail industry. It explains the omnichannel business model, leveraging "big data" and making data available across multiple channels and business initiatives. The article recommends building a central data repository for product information as well as integrating and streamlining all processes related to product information. Here is a passage from the article:

Integrate and streamline all processes that relate to product information. For most retailers this means integrating all the processes that have to do with setting up items in a given system. There are three that are the most important – product development, merchandising, and eCommerce. Integrating and streamlining these processes will remove duplication of work, and improve communication and efficiency.

Build a centralized repository for all product information. Product information lives not only in product development, merchandising and eCommerce systems, but also in the warehouse management system, marketing systems, and even in ad hoc desktop databases such as Microsoft Access and Excel. Creating one centralized location for all product-related data ensures a single version of the truth that all functional groups can access.
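To make the "centralized repository" idea concrete, here is a toy sketch of a single product record being projected into two different channels. All field names and channel shapes are invented for illustration.

```typescript
// One central product record that all channels read from.
interface ProductRecord {
  partNumber: string;
  name: string;
  description: Record<string, string>; // keyed by locale, e.g. "en", "de"
  assemblyGuideUrl?: string;
  attributes: Record<string, string>;  // e.g. { screws: "8", material: "pine" }
}

// Each channel projects the same record into its own shape, so the
// "single version of the truth" lives in exactly one place.
function toEcommerceListing(p: ProductRecord, locale: string) {
  return {
    sku: p.partNumber,
    title: p.name,
    body: p.description[locale] ?? p.description["en"],
    manual: p.assemblyGuideUrl,
  };
}

function toPrintCatalogRow(p: ProductRecord, locale: string): string[] {
  return [p.partNumber, p.name, p.description[locale] ?? ""];
}
```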

The story about PIM's importance in retail reminded me of my latest purchase on Amazon: a sofa for the kids' playroom. Nothing special, but it came disassembled, like most furniture you buy on Amazon. The sofa had special plastic feet attached for transportation; the permanent feet were packed separately. The package was also supposed to contain screws. The picture below shows the product assembly guide. The screws were absent, and the number of screws specified in the picture was wrong. On top of that, the mounting positions and the construction of the feet clashed (the welding was wrong). While waiting for Amazon to send the missing screws, I tried to find more information about the product, the screws and the mounting online. It wasn't very successful. You can easily get a part number, but finding more specific information about mounting was not possible.


The example above is not unique, in my view. Information about products is often missing online and on e-commerce websites. It is hard to identify products and find the relevant information you are looking for. These days, that translates directly into poor channel performance and customer satisfaction. One possible step to improve the situation is to bridge product development systems with the information systems supplying product data to the outside world. Think about product documentation, manufacturing identification (like part numbers) and much more. Information about products is typically stuck in the engineering department, in a variety of databases and Excel spreadsheets. Bringing it online in a structured way can be an interesting opportunity.

What is my conclusion? Correct product information is a powerful lever to improve everything from sales to support and maintenance. With online sales growing in both B2C and B2B domains, maintaining correct online information becomes absolutely important. CAD/PLM/ERP systems are the primary sources of this information today, and it is still very hard to get the right information out of them and bring it to business users and customers. Just my thoughts…

Best, Oleg

Why now is the right time to reinvent PDM?

August 15, 2014


Product Data Management (PDM) isn't a new domain. The first PDM systems were invented 20-30 years ago with a simple objective – to manage product data. The scope of PDM was heavily debated and included design, engineering BOMs, ECOs and even the supply chain. However, the most widely accepted role of PDM is to manage CAD files and their revisions.

For a long time, PDM was seen as something you only need to consider once your engineering department is large enough. Even though the starting price of a PDM implementation has gone down significantly over the last 20 years, my hunch is that the average starting cost of a PDM solution for an engineering organization of 10-15 people is still about $30-50K. Cost and implementation complexity limited the PDM business to larger companies, and implementations were mostly handled by resellers with special skills and knowledge, most of them associated with a specific CAD vendor channel.

CAD vendors recognized both the need for PDM and its complexity. For most vendors, the answer to PDM demand was to develop (or acquire) a dedicated PDM system bundled with their CAD software. As a result, most independent PDM players were acquired. The remaining PDM vendors either focus on a specific geographical niche or have developed additional solutions, usually branded with the "PLM" buzzword and strategy.

My hunch is that until last year the PDM market was somewhat stalled, focused on replacing outdated versions of PDM software and supporting new CAD software releases. Then something happened… Over the last few months, I have seen increased interest in PDM software. I noticed several focused research reports and articles in the field of PDM – Expert Guide to the Next Generation of PDM, the TechClarity Expert Guide for Basic CAD management, and a few others.

I also want to mention a few vendor activities focused on basic PDM functionality: the more traditional OOTB approach of PTC Windchill PDM Essentials, SolidEdge SP leveraging the SharePoint platform, and GrabCAD Workbench using the "cloud platform" as a differentiation strategy.

Consilia Vector published a CAMScore report for GrabCAD Workbench, where CAMS stands for Cloud, Analytics, Mobile, Social. In my view, these major trends are driving a renaissance in the PDM space.

As I mentioned before, because of cost and complexity, PDM software was out of reach for many smaller companies and engineering departments. The DIY (do it yourself) PDM approach combining a network file share, Excel files and FTP is the solution for probably 60-70% of the market. For many years, sharing files using network and USB drives was a "good enough" solution. But the era of file sharing changed forever with the coming of social networks, mobile and cloud. So-called YAPSAs (Yet Another Photo Sharing Apps) became widely available in our everyday lives. The question of why PDM is so complex, and why we cannot manage and access CAD data the way we manage photos and videos, brings PDM back to the innovation room.

What is my conclusion? Cloud, web and social technologies in the consumer space have reached maturity. It has come to the point where new technology and awareness of the cloud and social approach are going to challenge the traditional PDM space. In addition, it looks like the existing approach of using network drives and file sharing to manage CAD files is coming to its logical end. People will be looking at how to copy the YAPSA approach into the PDM space. So, it is time for PDM to change. Just my thoughts…

Best, Oleg

Pinterest will teach CAD companies to search

April 29, 2014


Search is a difficult problem, especially when it comes to the enterprise. According to research, more than two thirds of searches do not return satisfactory results. The enterprise is messy and complicated, and it contains a lot of unstructured data these days. CAD and other 3D files are part of this messiness. For many years, we generally thought of web and enterprise search as a place where text begot text – you input some text, press search, and get a bunch of relevant results, also in text form. However, in many domains – and 3D and CAD is one of them – keyword search is not very efficient.

Companies have been trying to innovate in 3D or shape search for the last decade or so. Read my old blog post – 3D Shape Search in CAD and PLM. Despite the many available 3D search solutions, I see a low adoption rate. In connection with that, I saw an opportunity to rethink 3D search.

The following article caught my attention earlier this week – In Challenge To Google, Pinterest Launches Guided Search. Pinterest is a visual discovery tool that people use to collect ideas for their different projects and interests. People create and share collections (called "boards") of visual bookmarks (called "Pins") that they use to do things like plan trips and projects, organize events, or save articles and recipes. According to the article:

At an event at Pinterest headquarters Thursday evening, CEO Ben Silbermann announced Guided Search, a new visual way to explore Pinterest’s more than 30 billion pins—links or images chosen by users and assigned by them to topical collections. Unlike most search engines, where you must choose a precisely constructed string of keywords for what you want to find, Guided Search offers suggestions as you go, based on the associations Pinterest has learned to make between all the objects in its database.

You can learn more about how guided search works here and in the following video:

The idea of "discovery" is very compelling in the engineering discipline. In many situations it is very hard to formulate a specific keyword based query to find a result. Therefore the ability to classify, categorize, slice and dice data can be very powerful to search and navigate 3D data.

What is my conclusion? Guided search is not a completely new idea. You can see it in some old enterprise search systems. However, combining it with a visual data corpus provides a fresh experience of results. The combination of technology and experience is important. The web is slowly becoming a future platform for engineers. We are spending more time online and gaining additional web experience. Some web tools provide ideas, technologies and solutions that can be reused in engineering and product development. I think Pinterest's guided search is one of those examples. Just my thoughts…

Best, Oleg

CAD is half pregnant by cloud

April 1, 2014


Cloud usage is growing every day. What started as an option to simplify collaboration and data exchange is proliferating into spaces such as backup, computation and many others. CAD and design remain one of the most conservative zones of cloud and engineering software. The commonly agreed opinion is that the desktop is the best place to run a CAD system in terms of resources, performance and the tasks designers want to accomplish.

Within such a desktop CAD paradigm, you might think CAD users are completely independent of the cloud. Actually, that is not quite true. My attention was caught by the Cadalyst article – Advocate for Internet Access for CAD Tools by Robert Green. According to him, CAD is no longer an island; it depends significantly on cloud services and public internet infrastructure. Here is the passage that explains it:

Like it or not, our CAD users are becoming more and more dependent on tools that reside outside our company's internal network. The types of systems we use may vary, but they typically include the following: FTP access sites for file uploading and downloading; remote access of workstations for technical support; remote log-on sessions to run compute-intensive tasks, such as rendering or analysis, on powerful remote workstations; enterprise data management (EDM) and product data management (PDM) systems for CAD/BIM models and files; and cloud-based services on vendor-supplied servers. Whether you use one, several, or all of these Internet-based resources in your day-to-day CAD system, the fact remains that using CAD is no longer something that you alone can control. And when you must reach out over the Internet to complete your CAD tasks, IT becomes a crucial part of your workflow.

It made me think again about the future of the cloud-CAD relationship. In my view, the cloud won't be introduced to CAD users as a single lifetime event. Instead of switching designers and engineers to cloud CAD overnight, companies will introduce cloud-based services that maximize the cloud value proposition for existing CAD-based workflows. You can navigate to some of my earlier posts – The future of CAD without files?; A moment before CAD files cloud mess…; What "end of local storage" means for CAD?; CAD, PLM and Future Cloud File Systems.

I can identify three main zones of existing CAD system enhancement that will leverage the cloud ecosystem to provide additional benefits to engineers:

1. Cloud-based backup and file exchange. This is a mainstream scenario that requires very little from CAD and other engineering software vendors. Cloud services such as Dropbox, Google Drive and some others can provide it today without even disrupting current workflows.

2. Viewing and collaboration. This is a more complicated, but still very feasible, scenario. Think about services such as GrabCAD Workbench, TeamPlatform and some others. These services can solve basic revision management needs and provide collaborative viewing of files.

3. Computation and special engineering design services. This is the most interesting case, in my view. In this scenario, desktop CAD systems will use services running on the public cloud to solve simulation, analysis and more complicated design tasks. Some of them can leverage the elastic nature of the cloud, and some can be collaborative, allowing several engineers to work together. A minimal sketch of this pattern is shown below.
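Here is a minimal sketch of that hybrid desktop/cloud pattern, assuming a hypothetical cloud job API: a desktop CAD add-in submits a simulation job, polls for completion, and pulls the result back. The endpoint and job schema are invented for illustration; no real vendor API is implied.

```typescript
interface JobStatus {
  id: string;
  state: "queued" | "running" | "done" | "failed";
  resultUrl?: string;
}

async function runCloudSimulation(
  baseUrl: string,
  modelFile: Blob
): Promise<string> {
  // Upload the model and start the job on elastic cloud capacity.
  const submit = await fetch(`${baseUrl}/jobs`, {
    method: "POST",
    body: modelFile,
  });
  let job = (await submit.json()) as JobStatus;

  // Poll until the simulation finishes; the desktop session stays usable.
  while (job.state === "queued" || job.state === "running") {
    await new Promise((resolve) => setTimeout(resolve, 5000));
    const check = await fetch(`${baseUrl}/jobs/${job.id}`);
    job = (await check.json()) as JobStatus;
  }
  if (job.state === "failed" || !job.resultUrl) {
    throw new Error(`Simulation job ${job.id} failed`);
  }
  return job.resultUrl; // download results back into the desktop session
}
```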

What is my conclusion? Can you be "half pregnant"? Actually, you can, if you think about CAD and cloud services. In my view, existing and new design tools will leverage hybrid resources (from desktop and cloud) to support optimal workflows and implement the best user experience in the future. Just my thoughts…

Best, Oleg

Web and DIY Future of PLM Integrations

September 16, 2013

Application integration is a complicated topic, especially when it comes to the enterprise. I can recall decades of different attempts to simplify integration tools and create an easy way to build integrations. If you have been in the enterprise software domain long enough, you can probably remember the variety of buzzwords such as EAI (Enterprise Application Integration), ESB (Enterprise Service Bus) and many others.

There are three main components in every integration solution – data retrieval (often called connectors), integration infrastructure (middleware), and the specific business code that supports your integration scenario. It is complicated and can fail in many ways. Navigate to one of my historic posts to read more – PLM integration failures. A schematic sketch of these three components follows below.
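As a schematic illustration of those three components, here is a sketch with invented interfaces; real connectors and middleware are far more involved, so treat this as a shape, not an implementation.

```typescript
// Component 1: data retrieval from one system (a "connector").
interface Connector<T> {
  fetchChanged(since: Date): Promise<T[]>;
}

// Component 2: integration infrastructure (the "middleware" transport).
interface Middleware {
  publish(topic: string, payload: unknown): Promise<void>;
}

// Component 3: scenario-specific business code, e.g. pushing released
// parts from PDM into ERP. This is the part every project writes itself.
async function syncReleasedParts(
  pdm: Connector<{ partNumber: string; state: string }>,
  bus: Middleware,
  since: Date
): Promise<void> {
  const changed = await pdm.fetchChanged(since);
  for (const part of changed.filter((p) => p.state === "Released")) {
    await bus.publish("erp.part-release", part);
  }
}
```

Each layer can fail independently, which is why integration projects are hard to build and harder to maintain.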

There is a chance things are going to change. We are getting more web-like every day, which means the technology we use (also in the enterprise) is getting more similar to the technologies used to build regular web sites and applications. The amount of data on the web is skyrocketing, so technologies that help you deal with this data (also for integration purposes) are important. These technologies can be applied in the enterprise space as well and change the way we do integrations. I want to bring up a few examples of tools today to explain what I mean.

Import.IO. A few days ago, I learned about an interesting company, Import.IO. Navigate to the following link to read more – Turns Web Pages Into Spreadsheets For Getting Out The Data That Matters Most. Spreadsheets are a good thing. Since most enterprise organizations are overflowing with spreadsheets, the ability to convert your data into a spreadsheet is useful. However, the most interesting part of Import.IO is an easy way to scrape data out of web pages. Imagine being able to scrape data from your enterprise web applications. That would be a cool thing to do.
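As a toy illustration of the scraping idea (not how Import.IO actually works), here is a naive sketch that pulls table rows out of an HTML page and emits CSV. The URL and the assumption that the data sits in a simple HTML table are invented for illustration.

```typescript
// Naive extraction of <tr>/<td> content; enough for a sketch only.
// Real scraping tools handle pagination, authentication and markup
// variations that this deliberately ignores.
async function tableToCsv(pageUrl: string): Promise<string> {
  const html = await (await fetch(pageUrl)).text();
  const rows = [...html.matchAll(/<tr[^>]*>(.*?)<\/tr>/gs)].map((row) =>
    [...row[1].matchAll(/<t[dh][^>]*>(.*?)<\/t[dh]>/gs)]
      .map((cell) => cell[1].replace(/<[^>]+>/g, "").trim())
      .join(",")
  );
  return rows.join("\n");
}

// Hypothetical usage against an internal report page:
// tableToCsv("https://intranet.example.com/parts-report")
//   .then((csv) => console.log(csv));
```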

Import.IO is not alone in the game of scraping and re-purposing data on the web. Two other products came to my mind while I was listening and thinking about the problem Import.IO is trying to solve.

Yahoo Pipes. According to the Yahoo website, Pipes is a powerful composition tool to aggregate, manipulate, and mash up content from around the web. The idea of pipes comes from the Unix operating system. Yahoo developed quite an interesting infrastructure for creating pipes that scrape and integrate data.

Google Fusion Tables. Another interesting data re-purposing tool is Fusion Tables, an experimental tool created by Google Research. Navigate here to learn more. Fusion Tables provides another way to scrape, import, mix and re-shape data.

What is my conclusion? DIY integration tools are an interesting category. For the past decade, DIY integration efforts in enterprise and manufacturing have mostly failed. Very few manufacturing companies embarked on integration development; most used services and integration providers dedicated to developing integration solutions (with high $$ behind the effort). Cloud technologies and web applications open a new era in both the requirements and the needs for integration, and native web tools can gain an advantage here. There is a possibility of opening a new page in DIY integration. Just my thoughts…

Best, Oleg

