The future of PLM vendor differentiation?

January 26, 2014


Differentiation. Competitive advantage. Value sales. I’m sure you’ve heard these buzzwords many times. Competition is part of everyday business life. Usually, I don’t write about competition. I searched across my blog and found only one competition-related post – PLM Competition Toolbox. But I want to look into my crystal ball today. Here is the article that made me do so. Over the weekend, I’ve been reading Joe Barkai’s blog post – How To Win Without Differentiation. The article speaks about how to develop differentiation strategies and what to do when differentiation does not come easy. I liked the following passage:

When value differentiation is too vague and difficult to demonstrate, price competitiveness does not work. Potential buyers seek other ways to drive their decisions, and, as Vermeulen points out, they rely on other factors, such as the seller’s brand, status in industry, and prior relationships. In other words, the buyer switches from assessing and comparing features and costs to differentiate based on the brand’s credibility and trustworthiness.

The article made me think about PLM vendor competition and differentiation. The PLM industry is dominated by a small number of large vendors (alphabetically – Autodesk, Dassault, Oracle, PTC, SAP and Siemens PLM). There are specific competitive niches each of these companies has developed over the last 10-20 years. However, looking at websites and public marketing materials about PLM solutions, I can see less and less visible difference. These companies are targeting similar businesses, and over time it is not simple to find value differentiation between the brands.

Enterprise software is an interesting business. One of the characteristics of software for engineering and manufacturing is lifetime customers and legacy software. The lifecycle of customers in this domain is relatively long. It ranges from extremely long in defense and aerospace programs to merely long in automotive and other industries. Getting familiar with engineering software (such as CAD and PLM) takes time and effort. You need to cross educational barriers. So, once you are already “in”, the entry barrier for competitors gets higher. Overall investment and a significant amount of customization play another role. This business is different from selling smartphones. After spending a few million dollars on a specific solution, it is very hard to justify replacing it with a competitor.

So, what will differentiate PLM vendors in the coming 10 years? What will become the future competitive advantage? Technology will obviously play some role, but I mostly agree with Joe – “Don’t oversell technical wizardry. Buyers of enterprise software and services consider your product roadmap and long-term commitment to the space as much as they do your product features and engineering skills.” So, it is very hard to create a sustainable technological advantage in this market. Very few companies have succeeded in doing it in the past and kept it for a long time.

However, there is one thing that is gaining more and more value points. I call it “vertical experience”. Sometimes vendors call it “industry practices”. However, it can go far beyond what vendors are doing today in this space. I can see specific vertical solutions focused on design patterns, bill of material management, change management, services and suppliers related to a particular segment or industry. The niche can be big enough to serve the business of service providers as well as make an impact on the vendor’s overall business. This is a place where PLM vendors will be able to show big value to customers and fast implementation ROI. It is not simple, and it takes time and dedication.

What is my conclusion? Vertical (or industry) specialization can become a future goldmine for PLM vendors and solution providers. Developing a deeply integrated solution, including specific behaviors in data and process management, is not a simple task. Customer experience is something that is very hard to gain. However, once achieved, it can be leveraged for a long time. Industry verticals can become a future differentiation factor for large vendors and startup PLM companies alike. Just my thoughts…

Best, Oleg


PLM and Oracle New Full Text Query

August 28, 2013

Are you familiar with the 1% internet rule? It is also called the 90-9-1 principle. According to this rule, only 1% of internet users are actively involved in content creation. Even if the 1% rule cannot be applied as-is to the enterprise, my hunch is that retrieval of data (or data access) is becoming more and more important these days. The amount of data in enterprise organizations is growing. Getting access to this data is a critical IT requirement.

Companies are thinking about different ways to access data. It comes in the form of search, reports, analysis and many others. All I said above applies to manufacturing companies and PLM systems. Traditionally, engineering users, who have represented the majority of CAD/PLM users from the beginning, are very focused on data creation – CAD design, simulation, drawings – all these activities assume the creation of information. However, when it comes to user adoption, the problem of data consumption comes into play.

While the question of NoSQL database adoption by PLM is still open, my guess is that the majority of PLM implementations in the world are running on top of an RDBMS, probably with a heavy presence of Oracle databases. On this note, Oracle recently updated its database to include XQuery Full Text Search. The following article in the AMS Tech blog provides lots of useful information about this full text search query functionality, along with code examples.
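To give a feel for what a full-text predicate over XML data does, here is a plain-Python sketch of the idea. This is only a conceptual illustration – it is not Oracle’s XQuery syntax, and the sample documents are made up:

```python
import xml.etree.ElementTree as ET

# A tiny stand-in for XML documents stored in a database column.
# Oracle's XQuery Full Text lets you run a similar "contains text"
# predicate inside the database itself; this sketch only mimics the idea.
docs = [
    '<part><name>Bracket</name><notes>steel bracket for frame assembly</notes></part>',
    '<part><name>Bolt</name><notes>standard M6 bolt, zinc coated</notes></part>',
]

def contains_text(xml_doc, phrase):
    """Naive full-text check: does any text node contain the phrase?"""
    root = ET.fromstring(xml_doc)
    return any(phrase.lower() in (elem.text or '').lower() for elem in root.iter())

# Select the documents matching a full-text condition.
matches = [d for d in docs if contains_text(d, 'bracket')]
print(len(matches))  # → 1
```

A real database would back such a predicate with a text index instead of scanning every document, which is exactly the value of having the capability inside the engine.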

What is my conclusion? New technologies like NoSQL and others are cool. However, when you have to support many existing customers in production, using existing mainstream database technologies can become an interesting option. I’m not sure it will become a life saver for Oracle databases in the race against other database options. Nevertheless, solving customer problems is a priority. Full text query can provide some cool search capabilities to existing implementations. Just my thoughts…

Best, Oleg


PLM, Information Discovery and Excel Provisioning

July 17, 2013

Excel is one of the most widely adopted PLM systems. PLM vendors and consultants may disagree with this strong statement. Despite that, I believe the amount of product data that lives today in Excel spreadsheets spread around organizational servers and cloud storage like Dropbox and Google is huge. The main reason companies get into the DIY Excel PLM business is the simplicity of data capture and Excel’s ease of use. A long time ago I blogged about Why do I like my Excel spreadsheets. In my view, that 4-year-old blog post is still relevant today.

One of the trends I can see today is the emergence of services that help people discover information. Widely spread product data can be very useful when it comes to decision making, design reuse and analysis.

Oracle acquired the search and data processing outfit Endeca last year. I had been following Endeca’s activity for some time. One of the articles caught my attention yesterday – David Sowerby at 3sixty-analytics outlines the setup in Endeca Provisioning – ‘Self Service’. The article describes how you can find a spreadsheet, load it into the system, experiment with the data and get results. Here is an interesting passage:

[The article shows] the future direction of Endeca by including a feature described as ‘Provisioning’. In practice, this is the facility to load a single spreadsheet, perform some simple but useful data cleansing, and then let Endeca generate an application. Although it currently only loads a single spreadsheet per application, it is a great way to see some of Endeca’s capability first hand, and also to act as a hands-on introduction to building an application from your data.

The following Oracle video shows how you can work with data and create applications.

The idea of creating applications out of Excel data made me think about my past experience and the technologies invented during my previous work at Inforbix. There, the Inforbix product data crawlers loaded lots of CAD and Excel spreadsheet data and provided a way for customers to explore this data using simple and easy-to-use interfaces – tables and charts. Queries could be saved for future use.
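The pattern behind this kind of tool – ingest spreadsheet data, then let users slice it through table and chart views – can be sketched in a few lines of Python. The BOM data, file layout and column names below are hypothetical, purely for illustration:

```python
import csv
import io
from collections import Counter

# Hypothetical bill-of-material data as it might be exported from Excel.
bom_csv = """part_number,description,material,qty
P-100,Bracket,Steel,4
P-101,Bolt M6,Steel,16
P-102,Cover,Plastic,1
"""

# Ingest: parse spreadsheet rows into dictionaries.
rows = list(csv.DictReader(io.StringIO(bom_csv)))

# "Table" view: filter rows the way a saved query would.
steel_parts = [r for r in rows if r["material"] == "Steel"]

# "Chart" view: aggregate counts of parts per material.
per_material = Counter(r["material"] for r in rows)

print(len(steel_parts))       # → 2
print(per_material["Steel"])  # → 2
```

The hard part in a real product is not this query step but crawling, cleansing and linking the thousands of spreadsheets scattered across an organization.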

What is my conclusion? The value of information is becoming more and more important these days. Lots of organizations have come to understand that data reuse and information discovery can provide significant value and help improve decision-making processes. The cost of data migration and cleansing is high. Therefore, applications that can make it possible are getting more and more focus. Just my thoughts…

Best, Oleg


What Do Oracle’s Results Mean for PLM Vendors?

March 29, 2013

For many years the business model of CAD, PDM and later PLM vendors was structured as a high upfront license fee combined with continuous maintenance payments. The same is true for many other enterprise software vendors. PLM vendors built their businesses around expensive license sales covering significant sales-cycle costs and even pilot implementations. The majority of PLM software runs on top of relational databases (RDBMS) licensed from the big 3 vendors – Oracle, Microsoft and IBM. Back in 2010, I posted Faltered licenses and future PLM business models. Three years ago it sounded like something that would never happen to enterprise software. However, things are going differently in 2013. Navigate to the ReadWrite Enterprise article Oracle big miss and the end of an enterprise era. The author, Matt Asay, VP of 10gen, the outfit behind the open source NoSQL database MongoDB, is clearly biased toward the disruptive ideas and business models coming to the enterprise space. Even so, some of his assumptions in the article resonated. One of his key points is a change in the way vendors get paid. He sees not a problem of bad sales execution, but a fundamental shift in the technology and product landscape. Here is my favorite passage:

This isn’t just a matter of improving legacy software products. It’s a matter of fundamentally changing how these legacy vendors deploy and charge for software. For example, Oracle’s entire cost structure is built around the premise of a hefty upfront license and high-margin maintenance (Over 20% of the license fee). We believe the primary issue is a fundamental shift in the technology landscape away from legacy systems towards a new breed of better products at a lower cost both in Apps and in Data Management. Virtually every emerging software trend is having a deflationary impact on spend.
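The “hefty upfront license plus high-margin maintenance” economics the quote describes are easy to put in numbers. The figures below are illustrative assumptions, not actual pricing from any vendor:

```python
# Illustrative comparison of a perpetual-license model vs. a subscription
# model over several years. All numbers are made-up assumptions.
license_fee = 100_000           # upfront perpetual license
maintenance_rate = 0.22         # ~20%+ of the license fee per year, as the quote notes
subscription_per_year = 30_000  # hypothetical SaaS subscription price

def perpetual_cost(years):
    """Total cost of ownership: upfront license plus yearly maintenance."""
    return license_fee + license_fee * maintenance_rate * years

def subscription_cost(years):
    """Total cost of ownership: pay-as-you-go subscription only."""
    return subscription_per_year * years

for years in (3, 5, 10):
    print(years, perpetual_cost(years), subscription_cost(years))
# With these assumptions, after 3 years the perpetual model costs
# 166,000 vs. 90,000 for the subscription; over 10 years the gap
# narrows to 320,000 vs. 300,000.
```

The point of the exercise is the shape of the curves, not the numbers: subscription pricing shifts revenue from a big year-one spike to a deflationary stream, which is exactly what pressures the legacy vendors’ cost structure.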

Another aspect of disruption is related to developer communities. Software developers and CIOs of enterprise companies are looking for technological platforms. They don’t like the idea of expensive licenses and waiting for enterprise vendors’ approval to develop software on top of their platforms. As a result, they are turning to open source as an option for their “platform of choice”. Here is an interesting quote:

With the rise of open source…developers could for the first time assemble an infrastructure from the same pieces that industry titans like Google used to build their businesses — only at no cost, without seeking permission from anyone. For the first time, developers could route around traditional procurement with ease. With usage thus effectively decoupled from commercial licensing, patterns of technology adoption began to shift…. Open source is increasingly the default mode of software development….In new market categories, open source is the rule, proprietary software the exception. The top-down approach, in other words, is losing its currency within the enterprise, as both open source and cloud enable developers (not to mention line of business executives) to get work done without getting permission.

So, getting back to future PLM business models and ways to disrupt PLM today, what does it mean for PLM vendors? I want to outline 3 main points:

1 – Alternative business models.

Customers are looking for alternatives to existing PLM licensing models. The biggest conflict here is between the high cost of lucrative PLM licenses and the interest of PLM vendors in taking PLM software upstream and downstream in the organization to increase usage. However, the adoption speed of PLM software is low. The high cost of additional licenses is one of the factors preventing customers from expanding their usage of PLM software. PLM vendors need to think about how to provide flexible portfolios with options allowing customers to spread PLM systems and technologies across the enterprise.

2 – Diversify revenues and activities

PLM vendors need to learn from the past of IBM and some other vendors. Years ago, IBM reshaped its business from software licenses to services and consulting. IBM was extremely successful in this change. We can see how existing enterprise software vendors (Oracle, Microsoft, SAP, etc.) are trying to diversify their activities by providing new services and solutions. PLM vendors might take a similar path in the future.

3 – Pay attention to open source disruption

Open source is disruptive. Period. It is very hard to compete with free software and good technologies that can be used to develop solutions. The mainstream web runs on top of an open source technology foundation. A new generation of developers is arriving with a significant baggage of knowledge and experience in this space. PLM vendors, system integrators and service providers need to take note before it is too late. Your future competitors are developing at the Starbucks next to your office.

What is my conclusion? I think Oracle’s miss is a big alert sign for PLM companies. Changes are coming. They will come from customers looking for alternative business models, from developers who cannot tolerate expensive infrastructure, and from technology vendors that will propose alternatives to expensive PLM infrastructure. All together, it will move the PLM industry towards new horizons. PLM vendors need to take note. Just my thoughts…

Best, Oleg


PLM Implementation Lifecycle Challenges

March 13, 2013

Implementation of enterprise information systems is a long and painful process. It requires a lot of effort, investment and planning. ERP and other enterprise backbone technologies (e.g. Oracle) are good examples. In my view, PLM belongs on the same list as Oracle. Yesterday, I had a chance to read the CW article – Oracle Blasts Forrester Report On Fusion Applications Adoption. Spend a few minutes and read it. Forrester research suggested that Oracle’s customers won’t be interested in updating to the new Fusion Applications. Forrester defines it as a strategic dilemma – Oracle makes a lot of money from old product lines while, at the same time, not many customers are adopting new technology like Fusion. Oracle criticized the Forrester research. Here is the key passage for me:

Oracle has taken a "co-existence" approach to pushing Fusion Applications, wherein an installed base customer would use some Fusion modules alongside their existing E-Business Suite or PeopleSoft implementation. While Forrester mentioned the co-existence concept in its report, in Oracle’s view the research firm gave it short shrift. "Customers can adopt modules of Oracle Fusion Applications incrementally and at their own pace to co-exist with their existing deployments. Never have we forced existing customers to move/migrate out of Oracle Applications Unlimited to adopt Oracle Fusion Applications," the response states. "Customers may not know what to do in the context of their specific implementation as each deployment is more or less specific, but Oracle’s strategy is very clear. We do not have a dilemma." Forrester also failed to adequately acknowledge the investments Oracle is making in its Applications Unlimited products, which include planned future releases and "long lists of enhancements," according to the response.

I found some similarity between the Oracle implementation lifecycle and PLM implementations. One of the critical elements of PLM technologies and products is PLM adoption. It takes time to implement and start using PLM in an organization. Earlier this year, I attended the PI Congress in Berlin, where I had a chance to watch many customer presentations. I call them "big mono-PLM" projects. Every implementation is a multi-year project with a significant budget and a long lifecycle. Once implemented in a company, it can be used for a long period of time. However, maintaining it and upgrading to newer system versions is very complicated. If you look over the landscape of existing PLM implementations, you can discover a variety of products, versions and technologies coming from PLM vendors over the last 15-20 years. Because of the long implementation cycle and high cost, once implemented, PLM systems live a very long lifecycle. Sometimes, it is also a result of a specific industry (the best example I can come up with is an airline program).

What is my conclusion? I think the slow lifecycle of PLM systems is a problem. It prevents companies from innovating and adopting new business practices. PLM vendors need to focus on how to make PLM implementations agile and lean initially, with the ability to update them later with more features and modifications. Today, there are too many companies using old technologies and products because they cannot afford the next step in PLM systems development. Just my thoughts…

Best, Oleg


IaaS, Cloud PLM and Disruptive Pricing

January 18, 2013

PLM vendors are continuing to adopt the cloud. I can clearly see a difference between people’s attitude toward cloud solutions now and 4 years ago. Here is my simplistic summary of the changes, as a typical sequence of reactions to "cloud PLM" over the last 4 years. 2009: What is cloud? 2010: Why do I need cloud? 2011: Why not use cloud? 2012: How do I use cloud?

The last question, actually, has multiple angles. It covers everything – technology, implementation, product licenses and, finally, the pricing model. The last one is obviously important, and I can see some interesting dynamics between cloud and on-premise software in the coming years. The following Infoworld article caught my attention – Oracle’s faux IaaS now gets faux on-demand cloud pricing by David Linthicum. Take a read; I found it interesting. Oracle, the king of the enterprise software market, has a lot to lose when it comes to cloud adoption. I found the following passage the most interesting:

Oracle’s "on-demand private cloud" isn’t merely an equipment lease either. It’s an odd hybrid created because Oracle finds itself stuck between the rock and the cloud, reluctant to devalue its hugely lucrative enterprise software products by folding into cloud-service pricing. The rise of cloud computing very much goes against Oracle’s highly profitable way of doing business: enterprise license agreements, maintenance contracts, and all the other trappings of big software.

2013 is perhaps the first year where Oracle will feel real pain from public cloud providers, such as Amazon Web Services, Rackspace, and Google, as well as emerging private cloud providers such as Eucalyptus and those based on OpenStack or CloudStack.

It made me think more about what happens in the PLM vendor ecosystem. Traditional PLM vendors (Siemens PLM, PTC and Dassault) are selling premium, lucrative, enterprise-oriented packages with a lot of functionality and value behind them. Autodesk, a newcomer to the PLM market, is playing the "cloud alternative game" with SaaS prices and less functionality out of the box. Aras Corp. is providing Aras Innovator using a disruptive enterprise open source model. I can see some similarities in the attempts of traditional PLM vendors to embrace cloud technology and delivery models. You can see how Aras positions its solution as "true cloud" with all the advantages of cloud and on-premise software. Aras leverages the Microsoft Azure platform. Navigate to this link to read more. Siemens PLM introduced a Teamcenter IaaS delivery option a few months ago. I wasn’t able to get information about IaaS and cloud prices for either Aras or Teamcenter. Both websites provided a contact option to request pricing, but no price.

What is my conclusion? Cloud plays a disruptive role these days in many markets. Enterprise software is one of them. We can see an interesting combination of plays by vendors, IT and infrastructure providers. IaaS vendors will keep existing technological platforms afloat by providing a seamless cloud infrastructure environment to support existing client-server and web technologies. Oracle is a good demonstration of how a vendor can reposition and tailor its technology model to new conditions. I think we will see lots of "cloud innovation" from traditional PLM providers in the near future. In the long run, cost matters. Just my thoughts…

Best, Oleg

Image courtesy of [Stuart Miles] / FreeDigitalPhotos.net


Will PLM follow a custom hardware path?

October 10, 2012

I want to talk about hardware today. You are probably surprised, but I hope not too much. During the last 10-15 years, the majority of the work PDM/PLM systems were doing ran on commodity low-end x86 servers. There is nothing wrong with that. Nevertheless, I can see some new trends coming in this space. They come with web development, large-scale data, mobile, data analytics and more. I can clearly see two patterns in how vendors are using hardware. One of them is to build proprietary data centers from commodity-level servers (e.g. Google, etc.). The other is to deliver solutions bundled with specific, highly profiled hardware platforms (IBM PureData, Oracle Exadata, Cisco, etc.). Data centers are an ideal place for such boxes.

I’ve been reading a GigaOM article earlier today – Does Big Data really need custom hardware? The article itself is not about PLM. At the same time, it made me think about some of the examples the author is using. Think about large computational tasks related to design, rendering, simulation and data analysis, or just a check-out of a very sizeable assembly for a configured order. All these use cases require data at scale. To get this information you need a very efficient data backend with a significant ability to scale in different dimensions. Here is an interesting passage from the article:

Where the generic server market has been commodified with low-end x86 servers companies like Teradata and EMC are doing their best to hold onto their hardware margins with specially designed systems. And it looks like IBM and Cisco have decided this is an opportunity not to be missed, and are taking it further. Cisco has released a unified computing system specifically designed to run SAP’s HANA database. Oracle is also heading down this path.

The question the author is asking is actually a good one. Do we need high-scale performance data boxes, or can we live with data centers built on top of commodity hardware? Here is another quote:

Instead of these two boxes representing a new hardware for big data these really represent that capitulation by the major hardware vendors to a services model. Technically these boxes may have different chips when compared with commodity servers, but what these guys are actually selling is the plug and play aspect. Sure a customer can buy cheaper boxes and download a Hadoop or other open source software (or pay a licensing fee and have someone like Cloudera manage it for them) but they want something that works with little or no effort.

So, what happens with CAD/PDM/PLM vendors? The expectations of companies are moving beyond simple engineering document management and check-in/check-out processes, towards data analytics, social software and more data-heavy tasks. I can hear the voices of "big data" discussions. However, there is not much clarity in these discussions yet. Vendors are going to re-think many data-driven paradigms. Which path will vendors follow? Some vendors will follow cloud data centers and commodity hardware. Another group (e.g. SAP HANA) is planning to develop proprietary server boxes.

What is my conclusion? The awareness of data-driven backend systems is growing in manufacturing and other enterprise companies. In my opinion, PLM vendors are not there yet. Delivering a scalable, performance-oriented backend is a nontrivial task, and this task can allow PLM software to scale. Cloud PLM opens new, untapped classes of applications that we have never seen before. Which path will PLM vendors take? Will PLM follow custom hardware, continue to use standard hardware and database software, or move to open source to develop separate bundles? Time will tell. What is your opinion?

Best, Oleg

