The end of single PLM database architecture is coming

August 5, 2014


The complexity of PLM implementations is growing. We have more data to manage and we need to process information faster. In addition, cloud solutions are changing the underlying technological landscape. PLM vendors are no longer building software to be distributed on CD-ROMs and installed by IT on corporate servers. Vendors are moving towards different types of cloud (private and public) and selling subscriptions instead of perpetual licenses. For vendors it means operating data centers and optimizing data flow, cost and maintenance.

How to implement future cloud architecture? This question is coming into focus and, obviously, raising lots of debate. The Infoworld cloud computing article The right cloud for the job: multi-cloud database processing speaks about how cloud computing is influencing what is at the core of every PDM and PLM system – database technology. The main message is the move towards distributed database architecture. What does it mean? I'm sure you are familiar with the MapReduce approach. Simply put, the ability of cloud infrastructure to bring up multiple servers and run queries in parallel is real these days. The following passages speak about the idea of optimizing the data processing workload by leveraging cloud infrastructure (a small illustrative sketch follows the quotes below):

In the emerging multicloud approach, the data-processing workloads run on the cloud services that best match the needs of the workload. That current push toward multicloud architectures provides the ability to place workloads on the public or private cloud services that best fit the needs of the workloads. This also provides the ability to run the workload on the cloud service that is most cost-efficient.

For example, when processing a query, the client that launches the database query may reside on a managed service provider. However, it may make the request to many server instances on the Amazon Web Services public cloud service. It could also manage a transactional database on the Microsoft Azure cloud. Moreover, it could store the results of the database request on a local OpenStack private cloud. You get the idea.
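
To make the idea above a bit less abstract, here is a minimal sketch of distributed, MapReduce-style processing: a dataset is split into partitions, each partition is queried in parallel by a different worker (think of them as instances running on different cloud services), and the partial results are merged and stored elsewhere. Everything in it – the node names, the data and the helper functions – is an illustrative assumption, not a real PLM or cloud API.

from collections import Counter
from concurrent.futures import ThreadPoolExecutor

# Hypothetical placement of the workload across clouds (names are illustrative only).
COMPUTE_NODES = ["aws-worker-1", "aws-worker-2", "aws-worker-3"]  # public cloud query workers
RESULT_STORE = "openstack-private-object-store"                   # private cloud result storage

def map_partition(node, partition):
    # "Map" step: each worker counts part usage in its own slice of the data.
    print(f"running partition query on {node}")
    return Counter(partition)

def reduce_partials(partials):
    # "Reduce" step: merge the partial counts into a single result.
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

if __name__ == "__main__":
    data = ["part-a", "part-b", "part-a", "part-c", "part-b", "part-a"]
    partitions = [data[i::len(COMPUTE_NODES)] for i in range(len(COMPUTE_NODES))]
    with ThreadPoolExecutor(max_workers=len(COMPUTE_NODES)) as pool:
        partials = list(pool.map(map_partition, COMPUTE_NODES, partitions))
    print(f"storing {dict(reduce_partials(partials))} in {RESULT_STORE}")

Real implementations (Hadoop, Spark, distributed SQL engines) are obviously far more involved, but the split / run in parallel / merge pattern is the same.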

However, not so fast and not so simple. What works for web giants might not work for enterprise data management solutions. The absolute majority of PLM systems rely on a single-RDBMS architecture. This is a fundamental underlying architectural approach. Most of these solutions use a "scale up" architecture to achieve data capacity and performance. Horizontal scaling of PLM solutions today is mostly limited to leveraging database replication technology. PLM implementations are mission critical for many companies. Changing that would not be simple.

So why might PLM vendors consider making a change and thinking about new database architectures? I can see a few reasons: the amount of data is growing; companies are getting even more distributed; the "design anywhere, build anywhere" philosophy is becoming a reality. The cost of infrastructure and data services is becoming very important. At the same time, performance is an absolute imperative for all companies – slow enterprise data management solutions are a thing of the past. Optimizing workload and data processing is an opportunity for large PLM vendors as well as for small startups.

What is my conclusion? Today, large PLM implementations are signaling that they are reaching technological and product limits. It means existing platforms are approaching a peak of complexity, scale and cost. To make the next leap, PLM vendors will have to re-think the underlying architecture, manage data differently and optimize the cost of infrastructure. Data management architecture is the first thing to be reconsidered, which means the end of existing "single database" architectures. Just my thoughts…

Best, Oleg


Cloud PLM and Battle for Cost?

January 31, 2014


PLM companies are switching to the cloud. Software vendors are taking different paths and technical strategies – IaaS, PaaS, private clouds and public clouds – with a high diversity of options and marketing messages. Navigate to some of my previous posts to get up to speed with the topic – Cloud PLM and IaaS Options, PLM PaaS, PLM cloud strategies.

The public cloud, and specifically Amazon Web Services, is one of the options for exploring the potential of new PLM technologies, delivery and business models. Using the elastic infrastructure provided by Amazon is compelling for newcomers to the PLM industry as well as for established PLM vendors transforming their PLM portfolios. A potential disadvantage of Amazon is that it can get a little pricey. Many cloud companies have discovered the "cost issue", especially when they reach the point of scaling customers and data.

Earlier this week, I was reading an interesting article by Heap – “How We Estimated Our AWS Costs Before Shipping Any Code”. Heap is an iOS and web analytics tool that captures every user interaction. Interestingly enough, the Heap team estimated their AWS costs to decide whether their product / project / website had a sustainable business model. Here are a few interesting examples from the Heap article (I sketch a similar back-of-envelope estimate after them):

Cost reduction: CPU. Our queries involve a large amount of string processing and data decompression. Much to our surprise, this caused our queries to become CPU-bound. Instead of spending more money on RAM, we could achieve equivalent performance with SSDs (which are far cheaper). Though we also needed to shift our costs towards more CPU cores, the net effect was favorable.

Cost inflation: Data Redundancy. This is a necessary feature of any fault-tolerant, highly-available cluster. Each live data point needs to be duplicated, which increases costs across the board by 2x.
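
In the spirit of Heap's exercise, here is a back-of-envelope sketch that estimates monthly AWS spend for a hypothetical analytics-style workload from a handful of assumptions about data volume, redundancy and instance count. Every number and price below is an illustrative assumption, not a real AWS quote.

# Back-of-envelope AWS cost model; all inputs are illustrative assumptions.
MONTHLY_EVENTS = 500_000_000       # captured interactions per month (assumption)
BYTES_PER_EVENT = 200              # average compressed event size in bytes (assumption)
REPLICATION_FACTOR = 2             # each live data point is duplicated (per the quote above)
SSD_PRICE_PER_GB_MONTH = 0.10      # illustrative SSD-backed storage price
INSTANCE_PRICE_PER_HOUR = 0.50     # illustrative price of a CPU-heavy instance
INSTANCES = 4                      # number of query/ingest instances (assumption)
HOURS_PER_MONTH = 730

storage_gb = MONTHLY_EVENTS * BYTES_PER_EVENT * REPLICATION_FACTOR / 1e9
storage_cost = storage_gb * SSD_PRICE_PER_GB_MONTH
compute_cost = INSTANCES * INSTANCE_PRICE_PER_HOUR * HOURS_PER_MONTH

print(f"storage: {storage_gb:,.0f} GB -> ${storage_cost:,.2f}/month")
print(f"compute: ${compute_cost:,.2f}/month")
print(f"total:   ${storage_cost + compute_cost:,.2f}/month")

The numbers themselves are not the point; the discipline is. The 2x redundancy factor and the CPU vs. RAM vs. SSD trade-off show up in the model before a single line of product code is written.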

This article made me think about possible trajectories for cloud PLM options. PLM vendors thinking about transforming and adapting their existing PLM products for the cloud must be aggressively assessing their cloud costs on Amazon or alternative platforms. Startup companies developing a new generation of PLM products have a very good opportunity to check their costs and the viability of their future business models.

What is my conclusion? The battle over cloud viability is strongly related to cost. Software companies are moving from "CD shipments" to "service providing". This process will be painful for many of them, and the sooner they validate and build their future business models, the better. For PLM companies, the best analogy is cost modeling in manufacturing: the earlier in the product design process you can see the cost, the better the chance the product becomes successful. Just my thoughts…

Best, Oleg


PLM cost and future of public cloud

November 11, 2013


PLM and cost. The topic is important and heavily debated among vendors, customers and industry pundits. Major established PLM vendors are keeping the bar high, while cloud vendors and open source innovators are trying to find alternatives to existing licensing mechanisms and upfront PLM payments. For most people, cloud and/or web is usually associated with free or inexpensive software. In my view, this is a perception created by the web giants.

However, for manufacturing companies, and maybe for the enterprise in general, the situation might be different. I've been reading Joe Barkai's writeup stating that, according to EMC research, large enterprises are reducing their investment in the public cloud. Joe quotes the EMC Forum 2013 event held at the David Intercontinental hotel in Tel Aviv. The source article is in Hebrew; you can translate it using Google for free and get a decent-quality translation. Here is the passage I especially liked:

"Over the years, the public cloud was a major trend, since it bore promise of downloadable dramatic costs, increasing accessibility and ease of use. Meanwhile, research firms show the public cloud is not suitable organizations enterprise," said Adrian McDonald, president of EMC EMEA… According to McDonald, many of whom he met CIOs report public cloud is more expensive than the alternatives, taking into account the needs of security, compliance and business continuity. Said that a recent study by EMC found surprising data regarding the use of large enterprises public cloud. "We expected that half workloads will be on public cloud and a half on the company. However, the data show that the accumulated experience of the decision makers in these organizations were reduced percentages using public cloud. According to the study, by 2016, only 12% of the work load will be on the public cloud. Additional 12% will be virtual cloud – private, and 76% will be managed inside the organization. "

It made me think about PLM and cloud PLM specifically. Does it mean the public cloud can make PLM more expensive, and that installing servers and engaging IT is actually more cost effective? Looking at PLM companies announcing their support for cloud deployment, it sounds like a contradiction.

Here is my take. I believe a PLM system used to manage product development processes has low utilization. It is actually the opposite of what you can potentially see with CAD, PDM and other engineering systems. The EMC examples are probably focused on storage and other devices that have high utilization. In the cloud and server business, utilization matters.
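
Here is a minimal sketch of that utilization argument: compare paying for on-premise capacity around the clock with paying a public cloud only for the hours a service is actually busy. The prices and utilization figures are illustrative assumptions only.

# Illustrative comparison of on-premise vs. pay-per-use cost at different utilization levels.
HOURS_PER_MONTH = 730
ON_PREM_COST_PER_MONTH = 800.0   # amortized server + IT overhead (assumption)
CLOUD_PRICE_PER_HOUR = 2.50      # on-demand instance price (assumption)

def cloud_cost(utilization: float) -> float:
    """Monthly cost if instances run only while the workload is active."""
    return CLOUD_PRICE_PER_HOUR * HOURS_PER_MONTH * utilization

for utilization in (0.05, 0.25, 0.80):
    print(f"utilization {utilization:4.0%}: cloud ${cloud_cost(utilization):8.2f} "
          f"vs on-premise ${ON_PREM_COST_PER_MONTH:8.2f}")

Below some break-even utilization the pay-per-use model wins; above it, the EMC observation that the public cloud can be the more expensive option starts to hold.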

What is my conclusion? "PLM cloud services" is too broad a definition to decide what the cost-effective model to support it will be. Utilization matters. Therefore, depending on the type of service, the amount of data, availability, usage intensity and many other parameters, you can reach the point where a hosted service is more efficient. In the past, PLM vendors didn't differentiate between the resources used to store data and those used to manage the entire process lifecycle, which made everything equally expensive. To enable downstream PLM usage, the license model and PLM cost should be different. The public cloud is here to help. Just my thoughts….

Best, Oleg


Will Open Source Databases Make PLM Affordable?

November 7, 2013


Budget and cost. These are important elements of every IT solution, and PLM is no exception. There has been a lot of debate about the cost of PLM systems lately. A few days ago, I discussed one element of PLM total cost of ownership related to "up-front cost" – The Future battle of PLM upfront cost. The move to a services / subscription model is clearly one of the transformations in PLM TCO. Thinking more about PLM cost structure, I wanted to point out the cost of database licenses. My hunch is that the majority of PLM software today runs on enterprise RDBMS platforms such as Oracle and MS SQL Server.

My attention was caught by the following article on the IW Gov blog – Feds Move To Open Source Databases Pressures Oracle. The affordability of open source database solutions has government IT's attention. I especially liked the beginning of the article:

Under implacable pressure to slash spending, government agencies are increasingly embracing open source, object-relational database software at the expense of costly, proprietary database platforms. That’s putting new pressure on traditional enterprise software providers, including Oracle, to refine their product lineups as well as their licensing arrangements.

Further in the article, there is an assessment of how much companies can save as a result of moving to open source database solutions such as PostgreSQL.

Moving to open source software can help agencies slice database costs by as much as 80% because open source providers aren’t hamstrung by the conventional business and licensing practices employed by large database companies such as Oracle, IBM, Microsoft and Sybase, according to Boyajian. "The traditional, burdensome licensing practices of the big proprietary guys have really started to put new kinds of pressure on government agencies," he said. "Most of the licensing firms have come up with very inventive ways to make sure the price per year goes up and not down, and that’s in direct conflict with the way government agencies are trying to operate now."
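
To see what the "as much as 80%" claim could mean for a PLM deployment, here is a rough sketch comparing proprietary database licenses plus annual support against a commercial open source support subscription over five years. All figures are illustrative assumptions, not vendor prices.

# Illustrative 5-year database cost comparison for a hypothetical PLM deployment.
SERVERS = 4
YEARS = 5
PROPRIETARY_DB_LICENSE = 40_000.0        # per-server license (assumption)
PROPRIETARY_SUPPORT_RATE = 0.22          # annual support as a share of license price (assumption)
OPEN_SOURCE_SUPPORT_PER_YEAR = 5_000.0   # per-server commercial support subscription (assumption)

proprietary = SERVERS * (PROPRIETARY_DB_LICENSE
                         + PROPRIETARY_DB_LICENSE * PROPRIETARY_SUPPORT_RATE * YEARS)
open_source = SERVERS * OPEN_SOURCE_SUPPORT_PER_YEAR * YEARS

print(f"proprietary RDBMS over {YEARS} years: ${proprietary:,.0f}")
print(f"open source RDBMS over {YEARS} years: ${open_source:,.0f}")
print(f"saving: {1 - open_source / proprietary:.0%}")

With these made-up inputs the saving lands around 70%, in the same ballpark as the figure quoted above; the real number obviously depends on the deployment.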

What is my conclusion? I smell a change towards free and/or low-cost software. It comes with the broader use of open source and the expansion of service-based business models. The combination may work as well. For companies that feel comfortable with open source, it can provide a significant cut in IT expenses. It would be interesting to see whether existing PLM providers will roll out support for open source databases such as MySQL and PostgreSQL in the near future. Just my thoughts…

Best, Oleg


The future battle for PLM upfront cost

November 5, 2013


Transformation of business models is one of the most important trends happening in the industry today. Take a deep breath… it doesn't mean companies don't want to be compensated for the work they do. These days it is just about defining the right business model, one that reflects the relationships between all participants in the business.

The last tsunami related to software prices came after Apple's recent announcement making a lot of their software, including OS X, free. Over the weekend, I've been reading the Forbes article – The Upfront Software Price is Now Free. Read the article and draw your own conclusions. Even though most of the focus is on what happens between the major software giants – Microsoft, Apple and Google – in my view it will have an impact on enterprise software too. The following passage is important.

With software being distributed either as web applications or through models that required a connection to the internet, software gained the ability to be distributed on a subscription basis or subsidized through advertising. Over the past few years, application software has increasingly moved to an initial price point that edges closer to zero on a consistent basis.

It made me think that changes in the distribution model can be an additional driver transforming the way PLM (and other enterprise software) will be distributed and licensed.

The discussion about PLM software prices, licenses and total cost of ownership is not new. The first time I raised the question of the coming business model transformation on my blog was back in 2009 – Is Free the future of PLM? Since then, the discussion has gone in different directions. You probably remember the Future PLM business models panel discussion I moderated during PLM Innovation 2012 in Munich. Another post, which actually resonates with the idea of transformation driven by changes in the distribution model, is PLM Cloud and Software licensing transformation.

However, in addition to cloud, SaaS and the web, there is another driving force behind changing the PLM software business model: open source. Take a look here – PLM: Open Source vs. Free. In my view, open source is another powerful opportunity to change the PLM business model status quo. In the PLM business, Aras Corp. has been driving this change since 2007 by promoting a unique "Enterprise Open Source" model. Navigate to another article just published on the Aras website – PLM Licensing is so Old School. Here is an interesting passage:

If your PLM software deployment gains wide-spread adoption (a goal of every successful PLM implementation), you will need to purchase more licenses. PLM license expenses can be huge up-front, but broad, global roll-outs are where the costs absolutely skyrocket. Effectively, the PLM project becomes a victim of its own success.

The last point, about PLM upfront licenses preventing wider adoption of PLM software in the organization, is interesting. PLM vendors are looking at how to make PLM products available downstream in manufacturing organizations as well as in the supply chain. The high end-user cost is clearly not helping to make this dream come true.

What is my conclusion? In my view, the future trajectory of PLM cost is going towards "usage" and not "upfront cost". This is already true elsewhere, but for PLM it can be a significant change that energizes a future increase in PLM software adoption and… consumption. It will be driven by major factors such as internet distribution channels as well as open source and service engagements. I don't see a big difference here – both strategies basically remove the upfront cost and focus on usage. The changes are unavoidable. The business transformation train has left the station. It is just a question of time. In my view, all PLM vendors are actively researching what it means for their current businesses and how to act (or react). Just my thoughts…

Best, Oleg


IaaS, Cloud PLM and Disruptive Pricing

January 18, 2013

PLM vendors are continuing to adopt the cloud. I can clearly see a difference between people's attitudes towards cloud solutions now and 4 years ago. Here is my simplistic summary of the changes, as a typical sequence of reactions to "cloud PLM" over the last 4 years. 2009: What is cloud? 2010: Why do I need cloud? 2011: Why not use cloud? 2012: How to use cloud?

The last question actually has multiple angles. It covers everything – technology, implementation, product licenses and, finally, the pricing model. The last one is obviously important, and I can see some interesting dynamics between cloud and on-premise software in the coming years. The following Infoworld article caught my attention – Oracle’s faux IaaS now gets faux on-demand cloud pricing by David Linthicum. Take a read; I found it interesting. Oracle, the king of the enterprise software market, has a lot to lose when it comes to cloud adoption. I found the following passage the most interesting:

Oracle’s "on-demand private cloud" isn’t merely an equipment lease either. It’s an odd hybrid created because Oracle finds itself stuck between the rock and the cloud, reluctant to devalue its hugely lucrative enterprise software products by folding into cloud-service pricing. The rise of cloud computing very much goes against Oracle’s highly profitable way of doing business: enterprise license agreements, maintenance contracts, and all the other trappings of big software.

2013 is perhaps the first year where Oracle will feel real pain from public cloud providers, such as Amazon Web Services, Rackspace, and Google, as well as emerging private cloud providers such as Eucalyptus and those based on OpenStack or CloudStack.

It made me think more about what is happening in the PLM vendor ecosystem. Traditional PLM vendors (Siemens PLM, PTC and Dassault) are selling premium, lucrative enterprise-oriented packages with a lot of functionality and value behind them. Autodesk, a newcomer to the PLM market, is playing the "cloud alternative" game with SaaS prices and less functionality out of the box. Aras Corp. is providing Aras Innovator using a disruptive enterprise open source model. I can see some similarities in the attempts of traditional PLM vendors to embrace cloud technology and delivery models. You can see how Aras positions its solution as "true cloud" with all the advantages of cloud and on-premise software. Aras leverages the Microsoft Azure platform; navigate to this link to read more. Siemens PLM introduced a TeamCenter IaaS delivery option a few months ago. I wasn't able to get information about IaaS and cloud prices for either Aras or TeamCenter. Both websites provided a contact option to request the price, but no price.

What is my conclusion? The cloud plays a disruptive role these days in many markets, and enterprise software is one of them. We can see an interesting combination of plays by vendors, IT and infrastructure providers. IaaS vendors will keep existing technological platforms afloat by providing a seamless cloud infrastructure environment to support existing client-server and web technologies. Oracle is a good demonstration of how a vendor can reposition and tailor its technology model to new conditions. I think we will see lots of "cloud innovation" from traditional PLM providers in the near future. In the long run, cost matters. Just my thoughts…

Best, Oleg

Image courtesy of [Stuart Miles] / FreeDigitalPhotos.net


Cloud PLM and Cost of Data: Your Mileage May Vary?

September 6, 2012

Cost is important. Period. It drives the attention of IT managers and CFOs. With the massive changes cloud and mobile bring to the modern enterprise, the structure and definition of cost will be transforming. On my way from Tel Aviv to Boston yesterday, I had a chance to read the RWW Mobile article – The Rising Cost Of Mobile Data For Enterprises [Infographic]. The good news: the cost of mobile data is going down. The bad news: data traffic is growing.

I found the following passage very important:

Data management is often overlooked in the enterprise mobility conversation. When employers think of mobility in the workplace, the first thing they think about is security and locking down company information on devices that could easily be stolen, lost or hacked. Application deployment and management is the next task for the IT team to figure out, making sure that the right employees have the right apps to do their jobs. In the mobile world, the trend is typically toward more of everything. More security, more apps, more functionality. But more is also… more. It means that employees suck up more megabytes and hence more corporate resources.

If you are in the IT market shopping for enterprise systems, the question of cost (or total cost of ownership) is on the list of your top priorities. Cloud transformation and the introduction of cloud / SaaS packages are changing the way people think about TCO. Before the "cloud era", the traditional TCO formula of licenses, maintenance and implementation usually provided the basis you worked with. This is not true with cloud and SaaS. Software vendors stop selling licenses and switch to selling services. That opens an additional opportunity to optimize resources and bring down the cost of the software. At the same time, it may introduce new costs for enterprises.
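
Here is a minimal sketch of that TCO comparison: the traditional license + maintenance + implementation formula against a per-user subscription over the same period. All figures are illustrative assumptions.

# Illustrative 5-year TCO comparison: traditional on-premise vs. SaaS subscription.
USERS = 100
YEARS = 5

LICENSE_PER_USER = 2_000.0          # perpetual license price (assumption)
MAINTENANCE_RATE = 0.20             # yearly maintenance as a share of license price (assumption)
IMPLEMENTATION = 150_000.0          # one-time services, hardware and IT setup (assumption)
SUBSCRIPTION_PER_USER_MONTH = 75.0  # SaaS price (assumption)

on_prem = (USERS * LICENSE_PER_USER
           + USERS * LICENSE_PER_USER * MAINTENANCE_RATE * YEARS
           + IMPLEMENTATION)
saas = USERS * SUBSCRIPTION_PER_USER_MONTH * 12 * YEARS

print(f"on-premise TCO over {YEARS} years: ${on_prem:,.0f}")
print(f"SaaS TCO over {YEARS} years:       ${saas:,.0f}")

Which side wins depends entirely on the inputs – user count, subscription price, how long the system lives, and the new costs (such as data traffic) that the cloud introduces – which is exactly why the TCO conversation changes when vendors switch from selling licenses to selling services.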

What is my conclusion? Optimization. This is the right topic to discuss with your enterprise cloud software vendor. How well is the application optimized for the resources it pulls through cables and mobile networks? It is the same question you ask your car dealer about MPG. In the cloud world, you need to shop for lower traffic, which will fundamentally increase the efficiency of your IT stack and drive TCO down. Just my thoughts…

Best, Oleg

Infographic courtesy of ReadWriteWeb article


