PLM customization and the role of SQL Data Schema

The business of PDM and PLM systems is tightly connected to data. In the early days of EDM (Engineering Data Management) and PDM, developers used a variety of data-management technologies – text files for metadata, proprietary databases and, later, relational databases (RDBMS). RDBMS became mainstream for enterprise software 15-20 years ago, and since then anybody developing data-centric systems (PDM and PLM clearly among them) has been tightly connected to RDBMS and SQL (Structured Query Language). SQL became mainstream in many applications – developers and application engineers are familiar with the language and use it in a variety of implementations: customizing a data schema, building reports, and optimizing applications for performance and workload.
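For illustration only – the schema below is hypothetical and not borrowed from any particular PDM system – this is the kind of reporting query application engineers typically run straight against the database:

```python
# A minimal sketch (hypothetical schema, not any specific PDM/PLM product):
# a typical "report" use of SQL - join item master data with revisions
# and count released revisions per item.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE item (item_id INTEGER PRIMARY KEY, item_number TEXT, name TEXT);
    CREATE TABLE item_revision (
        rev_id INTEGER PRIMARY KEY,
        item_id INTEGER REFERENCES item(item_id),
        revision TEXT,
        state TEXT);
    INSERT INTO item VALUES (1, 'P-1001', 'Bracket'), (2, 'P-1002', 'Housing');
    INSERT INTO item_revision VALUES
        (1, 1, 'A', 'Released'),
        (2, 1, 'B', 'In Work'),
        (3, 2, 'A', 'Released');
""")

# Report: how many released revisions does each item have?
report = conn.execute("""
    SELECT i.item_number, i.name, COUNT(r.rev_id) AS released_revs
    FROM item i
    LEFT JOIN item_revision r
           ON r.item_id = i.item_id AND r.state = 'Released'
    GROUP BY i.item_id
    ORDER BY i.item_number
""").fetchall()

for row in report:
    print(row)   # e.g. ('P-1001', 'Bracket', 1)
```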

Object Model Abstraction

The complexity of product and engineering applications such as PDM and PLM is related to the complexity of data. Engineering, product and lifecycle data is semantically rich, and as a result the development of solutions is complex and dependent on different factors related to customer needs. Therefore, PDM and PLM systems in the late 90s and early 2000s developed object models that created a logical mapping between SQL data tables and a conceptual (logical) model of the data. This type of model is used in almost every PDM/PLM system. Depending on the complexity of the system and the functional needs, vendors created object models with different levels of complexity and flexibility.
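As a simplified illustration of such a mapping – a minimal sketch with invented names, not the object modeler of any specific vendor – a logical item type defined in the application layer gets translated into a plain SQL table:

```python
# A minimal sketch of an object-model abstraction: a logical item type
# defined by the application (names here are invented for illustration)
# is mapped by the modeler onto a relational table.
import sqlite3

# Logical (conceptual) model: an item type and its attributes.
PART_TYPE = {
    "type_name": "Part",
    "attributes": {          # logical attribute -> logical data type
        "part_number": "string",
        "weight": "float",
        "lifecycle_state": "string",
    },
}

SQL_TYPES = {"string": "TEXT", "float": "REAL", "int": "INTEGER"}

def table_ddl(item_type):
    """Map a logical item type onto a relational table definition."""
    cols = ", ".join(
        f"{name} {SQL_TYPES[dtype]}"
        for name, dtype in item_type["attributes"].items()
    )
    return f"CREATE TABLE {item_type['type_name']} (id INTEGER PRIMARY KEY, {cols})"

conn = sqlite3.connect(":memory:")
conn.execute(table_ddl(PART_TYPE))   # the 'Part' object type becomes a 'Part' table
conn.execute("INSERT INTO Part (part_number, weight, lifecycle_state) VALUES (?, ?, ?)",
             ("P-1001", 2.5, "Released"))
print(conn.execute("SELECT part_number, weight FROM Part").fetchall())
```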

I was reading the Aras blog a few days ago on the topic of SQL data modeling. The name of the post has some marketing flavor – Get the Inside Scoop: CEO on Architecture Benefits. Despite that, the post itself is interesting and speaks about the flexibility of PLM solutions, data models and the mapping of object models to SQL tables. Here is the passage that speaks about the problem of object modeling and abstractions in enterprise and PDM/PLM systems:

Essentially, it’s been too difficult [and/or costly] to modify the software to fit your processes. It’s been even more difficult to modify that software again six months from now because your processes have evolved. And, assuming you’re able to modify the software and get it into production, with all that modification it’s now close to impossible to upgrade it. This is because legacy enterprise software technology allows customization through very "clever" abstractions of the object model vs. the relational database design. The more you customize and the larger the data set, the more these abstractions create scalability and performance issues. It’s a huge problem.

Certainly, creating an efficient abstraction level is important. I will leave the advantages of specific vendors out of this blog. I'm sure almost all PLM systems on the market today have made an effort to develop an efficient abstraction level. Some of them succeeded more, some less. However, let me speak more about what the future holds, in my view.
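To make the trade-off described in the quote a bit more concrete, here is a hedged sketch. The generic ITEM/ITEM_ATTRIBUTE layout below is a common illustration of a "very clever" abstraction, not the design of any particular vendor; the point is that every custom property adds another join to what should be a simple read:

```python
# A hedged sketch of why very generic abstractions can hurt at scale:
# in an attribute-value layout every custom property lives in the same
# item_attribute table, so reading N properties of an item means N joins,
# instead of one read of a flat row.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE item (item_id INTEGER PRIMARY KEY, item_number TEXT);
    CREATE TABLE item_attribute (
        item_id INTEGER REFERENCES item(item_id),
        attr_name TEXT,
        attr_value TEXT);
    INSERT INTO item VALUES (1, 'P-1001');
    INSERT INTO item_attribute VALUES
        (1, 'weight', '2.5'),
        (1, 'material', 'steel'),
        (1, 'coating', 'zinc');
""")

# "Give me item P-1001 with weight, material and coating" becomes one
# self-join per attribute - and the join list grows with every
# customization that adds another property.
row = conn.execute("""
    SELECT i.item_number, w.attr_value, m.attr_value, c.attr_value
    FROM item i
    JOIN item_attribute w ON w.item_id = i.item_id AND w.attr_name = 'weight'
    JOIN item_attribute m ON m.item_id = i.item_id AND m.attr_name = 'material'
    JOIN item_attribute c ON c.item_id = i.item_id AND c.attr_name = 'coating'
    WHERE i.item_number = 'P-1001'
""").fetchone()
print(row)   # ('P-1001', '2.5', 'steel', 'zinc')
```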

Data Web Applications and Service Abstraction

Development of object models and data abstraction is not something specifically unique to PDM and PLM systems. Fundamentally, any software creates data abstractions and uses them for application and implementation needs. As technology has developed over the last 10 years, we've seen many examples of data abstractions built for web applications. The main difference is that the architecture of these systems does not allow easy exposure of the database to end users and application developers. As a result, most of these systems developed a service-oriented model that is used to customize data as well as make changes. Because of the web nature of these applications, the requirement to run with high availability is a natural prerequisite. At the same time, it allows engineers to hide and optimize the data schema for many of these applications. This was done by many web systems, from web giants like Google, Amazon, Facebook and LinkedIn down to many smaller web apps.
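As a rough illustration of that service abstraction – a minimal sketch in which the ItemService class and its operations are invented for this post, not taken from any product – client code speaks a small domain API and never sees tables or SQL, so the schema underneath can change or be optimized without breaking callers:

```python
# A minimal sketch of the service-abstraction idea: the schema is hidden
# behind domain operations; only the service knows there is SQL at all.
import sqlite3

class ItemService:
    """Service facade that hides the SQL schema behind domain operations."""

    def __init__(self):
        self._db = sqlite3.connect(":memory:")
        self._db.execute(
            "CREATE TABLE item (item_number TEXT PRIMARY KEY, state TEXT)")

    def create_item(self, item_number):
        self._db.execute("INSERT INTO item VALUES (?, 'In Work')", (item_number,))

    def release(self, item_number):
        # Lifecycle rules live behind the service, not in the client.
        self._db.execute("UPDATE item SET state = 'Released' WHERE item_number = ?",
                         (item_number,))

    def get_item(self, item_number):
        row = self._db.execute(
            "SELECT item_number, state FROM item WHERE item_number = ?",
            (item_number,)).fetchone()
        return {"item_number": row[0], "state": row[1]} if row else None

# Client code speaks the domain language only - no tables, no joins.
svc = ItemService()
svc.create_item("P-1001")
svc.release("P-1001")
print(svc.get_item("P-1001"))   # {'item_number': 'P-1001', 'state': 'Released'}
```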

What is my conclusion? Data and data efficiency remain among the key topics on the table when it comes to the development of PDM and PLM applications. Making them simple and stable under updates and system customization is a priority task. The SQL data schema is a technique used by almost all PDM/PLM systems. Thinking about the future, I can see systems moving towards something more efficient – exposing less SQL to the outside and keeping web-oriented tools to maintain data customization.

Best, Oleg

Image: FreeDigitalPhotos.net



2 Responses to PLM customization and the role of SQL Data Schema

  1. Dijon says:

    The supposed “abstraction” that some PLM vendors claim is often a self-serving revenue generator (i.e., a ploy), as it often leads to consulting services, ITK/SDK product licensing, etc.

    Try drilling cautiously with SQL anyway; it often gets the job done, regardless of what others advise about not being able to find what you’re looking for using that officially unsupported technique.

  2. Oleg says:

    Dijon, yes, SQL modeling works in most cases. However, with an increased level of complexity, the cost of such implementations skyrockets. The last moment before you decide to re-think is when your PLM vendor decides to upgrade the data model… Huh? Best, Oleg
