PLM Services, Department Stores and Digital Future

June 2, 2014


Don’t be surprised if your most trusted CAD/PLM service provider is acquired tomorrow. According to Joe Barkai’s post – PLM Service Providers Ready To Deliver Greater Value – we have been witnessing a wave of mergers and acquisitions of PLM services companies (examples: Accenture / PRION Group, Accenture / PCO Innovation, KPIT-Tech / I-Cubed / Akoya, and the Kalypso / Integware merger). The following passage gives you a feeling for the core reason behind that.

For years, PLM companies focused more on PLM/PDM implementation than on actually improving business processes. While the business benefits of PLM were well articulated and supported by rosy ROI models and complex colorful architecture slides, many manufacturing companies were unable to achieve the process changes and enterprise software integration that were needed to reap the promised benefits, and ended up implementing a PDM system, albeit one critical for managing product data. This reality might explain why some manufacturers feel they might have overpaid for their PLM implementation efforts.

The status quo may be changing, and organizations that have gone through massive implementation projects are ready for more. They need to improve their capacity for more complex multidisciplinary decisions using product data, whether it’s stored in PLM/PDM, ERP or in other, less structured forms; they need to improve collaboration in elongated and fragmented design partner networks and supply chains; they need to leverage product and consumer insight garnered from social media, warranty claims, and channel activities.

The story makes sense to me. In my post a few weeks ago – Why PLM stuck in PDM? – I talked about exactly the same reasons behind the problem of deep and broad PLM adoption: (1) focus on CAD, (2) poor integration between PLM and ERP, (3) absence of process thinking, etc.

Joe’s article made me think about the role PLM service providers will play in future PLM implementation strategies. It reminded me of department stores. Think Macy’s, JCPenney, Bloomingdale’s, Nordstrom… Large manufacturing companies own a huge chunk of PLM software. Every PLM vendor has its own strong characteristics. One size doesn’t fit all. Customers’ existing investments are huge. I don’t see these manufacturing companies jumping between vendors. So, how to make existing PLM systems work and show bigger value becomes very important, which obviously raises the question about qualified service providers. Large teams and the ability to implement any PLM software will be the key to success and profit. Customers will come to the PLM service department store and be guided to the right brand(s), or a configuration of brands, depending on their preferences and constraints.

You may ask who will play the role of Amazon in the growing PLM service eco-system? This is a very interesting question. Will e-commerce come and disrupt B&M PLM services? Who will provide a new class of systems, which requires different service capabilities? Who will provide online PLM services in a lean way? Joe mentions Autodesk PLM360, GrabCAD and Aras in the list of potential candidates. Who knows…

What is my conclusion? History often repeats itself. It is important to pay attention to the existing trajectories of department stores and e-commerce. New e-commerce vendors are growing up, but existing B&M department stores are still selling lots of stuff. The same is happening in PLM. Today, large vendors provide solutions for companies that are ready to implement existing PLM software. It sounds like a good strategy for large manufacturing companies with deep pockets – until the question about “lean and digital” comes up. Will online and lean PLM offerings compete with existing PLM vendors? This is a good question. There is a good chance for newcomers to play disruptive strategies. However, alternatives are possible as well. Either newborns in the cloud will outgrow existing B&M vendors, or existing vendors will develop the right digital skills and experience. Just my thoughts…

Best, Oleg

Why PLM vendors might decide to beat Amazon?

March 21, 2014


Amazon is the absolute market share leader in cloud computing. Because "cloud" is such a big and vague word these days, we must clarify and say "public cloud". For most of us, cloud is practically equal to Amazon. AWS EC2 allows us to spin up new servers quickly and provides great services to everybody interested in developing SaaS products.

Not so fast… Questions are coming too. I can see two major ones – cost and strategy. I posted about Cloud PLM and the battle for cost recently. The Amazon public cloud comes with a challenging price sticker for some of us. The strategy question is connected to many factors – the PLM PaaS opportunity, security and storage alternatives. Finally, with huge respect to Amazon, I’m not sure how many CAD/PLM companies are interested in a Catholic marriage between cloud PLM platforms and AWS. Providing a PLM solution independent of Amazon IaaS and controlling data storage is an interesting option for many vendors and partners. How to do so? I think this is part of the strategy for every PLM vendor these days looking to develop long-term relationships with manufacturing OEMs and suppliers.

My attention was caught by a Gigaom article – Want to beat Amazon in the cloud? Here are 5 tips. Read the article. It outlines some interesting ways to compete with AWS. It raises the point that in 2014 AWS became an elastic service commodity competing on cost. Here is an interesting quote explaining that:

But fast-forward to 2014: there are dozens of IaaS providers offering similar capabilities. The selling points — like self-service, zero CAPEX and elasticity — that once made the cloud look exciting are not as appealing anymore, and they are no longer the differentiating factors. In the current context, selling cloud for its self-service capabilities is similar to Microsoft trying to sell the latest version of Windows only for its graphical interface.

Cost is important. However, for the enterprise, value is often even more important. Therefore, speaking from the perspective of PLM players, my favorite passage relates to how to support scale-up and shared storage:

AWS’s philosophy of throwing more VMs at an application is not ideal in many scenarios. It might work wonders for marketing websites and gaming applications but not for enterprise workloads. Not every customer use case is designed to run on a fleet of servers in a scale-out mode. Provide a mechanism to add additional cores to the CPU, more RAM and storage to the VM involving minimal downtime. The other feature that’s been on the wish list of AWS customers for a long time is shared storage. It’s painful to setup a DB cluster with automatic failover without shared storage.

Here is my point. I think CAD and PLM vendors will have to discover how to provide a balanced and scalable cloud platform. This platform will have to answer the question of how to scale from solutions for small manufacturers and mid-size companies up to enterprise OEMs and Tier 1 suppliers. The border between these segments is vague. It is hard to develop two distinct PLM offerings and support two separate platforms. It was hard in the past with on-premise software, and it is even more complicated in the cloud.

What is my conclusion? PLM providers will have to discover how to grow beyond an AWS-based offering and develop scalable cloud PLM platforms. These must include diverse options for data storage as well as computing power. So, beating Amazon may not be such a distant dream for PLM vendors as it looks at first glance. Just my thoughts…

Best, Oleg

What cloud PLM vendors can learn about AWS speed

February 24, 2014


Amazon Web Services is one of the most popular cloud platforms these days. We can say that Amazon became the de-facto standard for public cloud. The Amazon cloud keeps growing and remains the leader, according to Synergy Research Group. A year ago, an article by Stackdriver published information about the popularity of different AWS services. According to the article, the top three most popular AWS services are EBS (Elastic Block Store), EC2 (Elastic Compute Cloud) and S3 (Simple Storage Service).

Lots of startup companies and established software vendors have discovered the power and elastic capabilities of Amazon. Cloud is not only about consumer web applications and social networks. The cloud era is coming to enterprise software companies too – Workday and Netsuite are only a short list of companies in the enterprise cloud domain. CAD and PLM companies are also taking advantage of the power and popularity of the Amazon public cloud. As PLM vendors’ cloud strategies get more mature, I can see the potential for CAD/PLM companies to switch their focus to platform development and not only applications. While there are still lots of unanswered questions about the future of PLM PaaS, many software vendors are asking what the right cloud platform is for them. A recent GigaOm article raised an interesting discussion about the cost of public cloud.

Over the weekend, my attention was caught by a VentureBeat article – Amazon Web Services speeds can vary by up to 200X depending on region. Thanks to the Startupmoon blog for pointing to this publication. Takipi, a company focused on server debugging, discovered huge differences in the speed of AWS applications across regions and shared this information. Here is my favorite passage:

What that ultimately means is that developers who don’t pay attention to cloud regions, as server debugging company Takipi discovered, can actually cause their apps to run 10 or even 200 times slower than necessary. And Amazon doesn’t disclose any of that data. “We noticed that in many cases there’s 10X (or even more) difference in the performance of services due to AWS/S3 regions and external APIs,” Takipi co-founder Iris Shoor told me via email. “It’s possible to make amazing optimizations just by changing the region,” says Shoor. “For example, choosing Oregon over CA for a company that serves the European market will cut the upload time by half.” One of the main problems when hunting down the cause of slow services, Takipi says, is API latency. And regionality has a huge impact there, even if you’re as geographically optimized as you can possibly be.

One of the advantages of cloud applications is global availability. It looks like optimization of cloud PLM platforms can become the next focus for software vendors. One of the challenges of previous PDM/PLM platforms was to ensure global availability and performance levels. Latency was an issue for PDM users trying to access CAD data located on European servers from China. It looks like the problem is just migrating from one software layer to another and requires new implementation approaches.
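The regional differences Takipi describes are easy to probe yourself. Below is a minimal sketch (my own, not from the VentureBeat article) that averages round-trip times to a few regional S3 endpoints; the endpoint list and repeat count are illustrative assumptions.

```python
import time
import urllib.error
import urllib.request

# Illustrative regional S3 endpoints -- adjust to the regions you care about.
ENDPOINTS = {
    "us-east-1": "https://s3.amazonaws.com",
    "us-west-2": "https://s3.us-west-2.amazonaws.com",
    "eu-west-1": "https://s3.eu-west-1.amazonaws.com",
}

def time_call(fn, repeats=3):
    """Return the average wall-clock seconds taken by fn() over `repeats` runs."""
    start = time.perf_counter()
    for _ in range(repeats):
        fn()
    return (time.perf_counter() - start) / repeats

def ping(url):
    # An unauthenticated GET to an S3 endpoint returns an error response,
    # but the round trip still reflects network latency to that region.
    try:
        urllib.request.urlopen(url, timeout=10).read()
    except urllib.error.HTTPError:
        pass  # an HTTP error from S3 is fine; we only care about elapsed time

def measure_all(repeats=3):
    """Time each regional endpoint; returns {region: avg_seconds}."""
    return {region: time_call(lambda u=url: ping(u), repeats)
            for region, url in ENDPOINTS.items()}
```

Running `measure_all()` from your own network will typically show a clear gap between your nearest and farthest region – the same effect Takipi measured at much larger scale.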

What is my conclusion? Cloud is not a silver bullet. Customers are moving from digesting PLM cloud marketing towards analysis and technical investigation of different PLM cloud platforms. PLM vendors should take note and focus on the technological differentiation of their platforms. In the coming PLM cloud competition, cost, performance and efficiency will become the most important factors influencing future market-dominant positions. Just my thoughts…

Best, Oleg

Picture courtesy of VentureBeat article.

How PLM can transmit data to the cloud?

November 24, 2013


Transmitting data is complicated and painful. Remember a few years ago, when the industry just started to switch to the cloud, the discussion about how to transfer large CAD parts and assemblies was very hot. Actually, not much has changed since that time. A huge amount of data lives on desktops and is produced outside of the cloud environment. Think about the variety of sensors that have started to appear almost everywhere. Also, think about the potential of a huge amount of simulations – many of these still run on desktop computers.

My attention was caught by Amazon’s newly announced Kinesis service. The description provided by Amazon is quite impressive.

Amazon Kinesis provides an easy way to process fast moving data in real-time. You simply provision the level of input and output required for your data stream, in blocks of 1 megabyte per second (MB/sec), using the AWS web management console, API, or SDKs. The size of your stream can be adjusted up or down at any time without restarting the stream and without any impact on the data sources pushing data to Amazon Kinesis. Once your stream has been created, you can immediately start loading your data, with simple HTTP PUTs. Amazon Kinesis automatically manages the infrastructure, storage, networking, and configuration to collect and process your data at the level of throughput your application needs. Within seconds, data PUT into an Amazon Kinesis stream is available for analysis. Stream data is stored across multiple availability zones in a region for 24 hours. During that window, data is available to be read, re-read, backfilled, and analyzed, or moved to long-term storage (e.g. Amazon S3 or Amazon Redshift).

If you don’t have time to read the product description, watch the following 2-minute video. The idea in a nutshell is quite simple – Amazon Kinesis will pump your data to the cloud. After that, you can decide what to do and how to proceed.
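To make those "simple HTTP PUTs" a bit more concrete, here is a minimal sketch of pushing one sensor reading into a Kinesis stream via `put_record` from the AWS SDK for Python. The stream name ("cad-sensor-stream"), the sensor payload, and the choice of partition key are my own hypothetical examples, not anything from Amazon’s description.

```python
import json
import time

def make_record(sensor_id, value):
    """Build the arguments for one Kinesis put_record call.

    Kinesis routes records to shards by PartitionKey; using the sensor id
    keeps each sensor's readings ordered within its shard.
    """
    return {
        "StreamName": "cad-sensor-stream",  # hypothetical stream name
        "Data": json.dumps({"sensor": sensor_id,
                            "value": value,
                            "ts": time.time()}).encode(),
        "PartitionKey": sensor_id,
    }

def send(client, sensor_id, value):
    # client = boto3.client("kinesis")  -- requires AWS credentials and boto3
    return client.put_record(**make_record(sensor_id, value))
```

Once records are flowing, a separate consumer application reads them back from the stream within the 24-hour retention window and decides what to store long-term, e.g. in S3 or Redshift.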

I can see a few potential applications of the Kinesis service within cloud infrastructure today. One of them is massive data processing from desktops and corporate network locations. It can enable analytic applications, search and many other services that cannot live without data. Another application area is the so-called "IoT" (Internet of Things), which I covered a few days ago here. By processing data from various sensors and mobile devices, engineering applications can gain new data input that can improve the design process.

What is my conclusion? The Kinesis service shows the potential of moving data from a variety of locations to the cloud. Bringing data together can enable a number of completely new ways to design, analyze and optimize. Capturing a huge amount of real-time data can help improve quality and provide insight into how customers are using products. Just my thoughts…

Best, Oleg

PDM: Bring Your Own Cloud Or Die?

November 16, 2013


Cloud is not a "new black" in the PLM industry these days. It is hard to find a vendor that is not associating itself with some sort of cloud solution. However, there is a new trend on the horizon – bring your own cloud. Dropbox, very successful by all means these days, is trying hard to get into the enterprise business. This makes it very interesting from the standpoint of engineering and manufacturing businesses, especially for companies focusing on design disciplines.

I was reading about how Dropbox is getting down to business. In a nutshell, the problem Dropbox is trying to solve is related to accounts and identity management. Navigate here and you will see how Dropbox is preparing to solve the problem of multiple accounts and security. The message is clear – for whatever it’s worth, you will be able to get a "piece of your cloud" secured for you and for your company if you need it. Dropbox claims a full redesign of its security system. Here is the passage:

“We didn’t just re-do Dropbox for Business,” Houston said. “We re-did the [whole] foundation of Dropbox.” The company redesigned the service across the desktop, mobile and the Web. The changes also include advanced security and access controls, which should help mollify company IT managers. Businesses can, among other things, manage (or block) sharing to outside users, prevent sensitive docs from going into personal accounts, monitor all activity around work files, and even remotely wipe files from the devices of former employees.


Another piece of news came a few days ago from Amazon. Navigate to the following TechCrunch article and read Amazon Launches WorkSpaces, A Virtual Desktop Service On AWS. Amazon WorkSpaces will allow you to have a virtual workspace from the cloud at a lower cost. Here is a quote from the article:

The news plays into the company’s effort to take more business from enterprise providers by providing customer-centric services with security that is sufficient for companies with significant operations at a lesser cost.


You may ask how these two pieces of news relate to PDM. Here is the deal. In my view, CAD will be the last engineering system to migrate to the cloud. Hardi Meybaum and I had an interesting discussion about it in one of the GrabCAD blog posts here and my post here. The solutions provided by Amazon and Dropbox made me think about a possibility to make current desktop CAD systems "cloud-enabled" without even changing their desktop nature or a significant rewrite. Of course, all these solutions won’t stop (and don’t need to stop) future cloud design product development. However, the combination of a secured cloud solution together with some virtualization technologies can make cloud-based collaboration possible today (not tomorrow).

What is my conclusion? Nowadays, we can see concurrent development of different cloud-related technologies – virtualization, file sharing and others. All together, combined with existing engineering and design software, they can be used to build future cloud collaborative platforms. Product Data Management (PDM) can leverage these platforms in order to build a secure, cost-effective and simple solution to manage files and provide collaborative access to them. Just my thoughts…

Best, Oleg

Siemens PLM Analyst Event and PLM Public Cloud Strategies

September 10, 2013

Social tools can make your professional life much more efficient these days. I’ve been following the Siemens PLM analyst event in Boston last week via Twitter. The event is over, but you can still reach the tail of the information today by searching for the #SPLM13 hashtag on Twitter. The Twitter search tool has a limited timespan, so run fast if you want to get the original tweet stream.

The cloud PLM switch is underway and, obviously, one of the topics of my interest was Siemens PLM software in the cloud. A few months ago, Siemens PLM announced Teamcenter cloud availability and an IaaS cloud strategy. I wanted to find some examples of Teamcenter cloud experience provided by customers. The cloud buzzword wasn’t at the top of the hype list for the event. It was very easy to catch the following tweet by Jim Brown:

Jim Brown @jim_techclarity. Sterling – Yes, Teamcenter is on #Cloud. 2 customers up. Working w/ partners vs setting up cloud #SPLM13 #PLM

One of the top differentiators in cloud strategies today is private vs. public cloud. The associated cost is one of the factors in the decision. The cost of data centers and services can easily go high, and it will influence other decisions – availability, packaging, price. I found a very interesting article about cloud cost differentiation published in Wired magazine. Navigate to the following link to read – Why Some Startups Say the Cloud Is a Waste of Money. The main point of the article is that public cloud and Amazon can be quite costly and inefficient in specific cloud configurations. The article brings multiple examples of companies that started with public cloud on Amazon and moved towards private cloud over time. Here is my favorite passage from the article.

“The public cloud is phenomenal if you really need its elasticity,” Frenkiel says. “But if you don’t — if you do a consistent amount of workload — it’s far, far better to go in-house.” Within IT departments, public clouds do tend to get more expensive over time, especially when you reach a certain scale.
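Frenkiel’s point about consistent workloads can be checked with back-of-the-envelope arithmetic. The prices below are invented for illustration only, not actual AWS or hardware quotes.

```python
def monthly_cloud_cost(hourly_rate, hours_used):
    """Pay-as-you-go: pay only for the hours you actually run."""
    return hourly_rate * hours_used

def monthly_inhouse_cost(server_price, amortization_months, monthly_opex):
    """In-house: hardware amortized over its lifetime, plus fixed monthly ops cost."""
    return server_price / amortization_months + monthly_opex

# Hypothetical numbers: a $0.80/hr cloud instance vs. a $9,000 server
# amortized over 36 months with $200/month for power, space and admin.
bursty = monthly_cloud_cost(0.80, 200)         # elastic workload: 200 hrs/month
steady = monthly_cloud_cost(0.80, 720)         # consistent workload: 24x7
inhouse = monthly_inhouse_cost(9000, 36, 200)  # same cost regardless of load

# bursty -> $160, steady -> $576, inhouse -> $450: elasticity favors the
# public cloud, while a consistent 24x7 workload favors going in-house.
```

A real comparison would also have to count data egress, staffing and redundancy, but the crossover effect is the same: the more consistent the workload, the sooner fixed in-house costs win.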

PLM vendors are following different strategies when it comes to public and private cloud these days. Arena Solutions (aka in early days) offers its software as a public cloud. Autodesk does the same with PLM360. Dassault Systemes didn’t provide any information about how much their new cloud offering costs. Meanwhile, Siemens PLM didn’t provide any information about Teamcenter on the cloud either. Here is the only relevant tweet from #SPLM13 I found about Siemens PLM cloud licensing and cost:

PJ Jakovljevic @pjtec4 #splm13 Interesting that in #SiemensPLM’s new go-to-market initiatives there are no mentions of #cloud & subscription licensing #JustSaying

What is my conclusion? We are getting into a period when PLM vendors will try to innovate by experimenting with different cloud strategies. My hunch is that there are two main reasons here – cost and market differentiation. Public vs. private cloud will be one of the key differentiators. The elastic capability of the public cloud is a huge advantage, proven by many internet and enterprise companies. At the same time, the specific characteristics of the PLM business can make private cloud and combined options attractive as well. The jury is still out. Just my thoughts…

Best, Oleg

Amazon, CIA and Future of PLM Private Clouds

June 24, 2013

Cloud. Public. Private. Dedicated. Secured. The security topic can detonate and destabilize any discussion about cloud deployment. Mention security, and the discussion will be derailed for the next 30 minutes… I’ve discussed cloud security on my blog many times. You can skim a few notable discussions by reading – Cloud PLM and Good Enough Security and Thoughts about cloud PLM security and iPhone 5.

Big companies and cloud providers are moving forward to improve cloud security and certification. Maybe you remember this one – Cloud PLM and Security Certification. Google App Engine is officially secured now. Large PLM providers are checking cloud IaaS options. Over the weekend, I stumbled on an interesting Quartz article – Amazon is staffing up for its $600 million cloud for spooks. Here is an interesting passage:

On Friday June 14, a US Government Accountability Office (GAO) report elaborated on previous reports that Amazon had won a $600 million contract to build a “private cloud” for the CIA. (The GAO report was generated when IBM, which had been competing for the contract, protested that it had lost unfairly.) More than half a billion dollars will buy you a lot of cloud computing, and now, according to postings on Amazon’s own jobs site, the company is staffing up to meet the demand the new contract will require. Specifically, Amazon is looking for engineers who already have a “Top Secret / Sensitive Compartmented Information” clearance, or are willing to go through the elaborate screening process required to get it. TS/SCI is the highest security clearance offered by the US government, and getting it requires having your background thoroughly vetted.

So, Amazon is staffing up with government-cleared systems engineers. Eventually, it means Amazon will be gathering top knowledge about security and certification. That can be good news for large PLM vendors looking for experience in heavy PLM implementations for large OEMs.

What is my conclusion? My hunch is that CIA security requirements should be in line with the requirements of big manufacturing firms. Security is not a simple act or feature. Ensuring the security of large OEMs requires time and experience. Leading PLM vendors are trying to figure out how to expand their platforms to cloud environments, and security experience can become a hot topic for them. Just my thoughts…

Best, Oleg

