PLM cloud options and 2014 SaaS survey

October 24, 2014

plm-cloud-options-2014-saas

The number of SaaS businesses is growing these days. You probably had a chance to read my CAD, PLM and Top 500 cloud app vendors list a few months ago. However, one size doesn’t fit all. This is certainly true of engineering software and PLM. As PLM companies move to the cloud, we want to learn more about the possible options and strategies for how to move to the cloud. Below you can find a list of my previous articles covering the diversity of cloud strategies from major CAD/PLM vendors – Autodesk, Dassault, PTC and Siemens PLM.

PLM vendors, large manufacturers and public cloud

Dassault is going to support all PLM cloud options by 2015+

Siemens PLM Analyst Event and PLM Public Cloud Strategies

The challenging face of dual PLM clouds

PLM Cloud Switch and PTC Final Click?

In my view, CAD and PLM companies are in a very active stage of figuring out how to build cloud technologies and products. We can make an initial comparison of the cloud PLM strategies of several CAD/PLM companies. Autodesk and Arena have fully embraced the cloud as the primary way to deliver PLM solutions to customers. Siemens PLM and PTC are following IaaS strategies. Dassault Systèmes’ strategy is to support all cloud options by 2015. Aras plans to leverage the cloud as both a technology and a business strategy.

PLM companies are joining the growing population of SaaS businesses. It means we can start gathering some statistics about these companies, their performance and the technical aspects of cloud delivery. If you are considering building your future PLM strategy around the cloud, this can be a very valuable data point for your research.

My attention was caught by the 2014 SaaS Survey published on the for Entrepreneurs blog by David Skok (@BostonVC). If you are in the SaaS business, it is a must-read article. The following two charts caught my special attention. They give you a perspective on how cloud (SaaS) applications will be delivered:

saas-2014-plm-delivery-options-2

Another one shows how the delivery method changes as a SaaS business grows.

saas-2014-plm-delivery-options

Today, cloud PLM is only part of the business for all major CAD/PLM companies. It is hard to extrapolate the statistics from the research above to these businesses. However, it clearly gives you some perspective on business performance and on how the cloud PLM business will develop in the next few years.

What is my conclusion? Cloud is here. There is no escape path. Manufacturing companies should look at how to optimize their IT infrastructure and application delivery methods. I’m pretty sure cloud PLM will become part of this optimization. It is probably a good idea to familiarize yourself with the performance characteristics of SaaS companies to create a foundation for future strategy meetings. A note for PLM IT managers and PLM architects. Just my thoughts…

Best, Oleg


Why ask your cloud PLM vendor about devops and Kubernetes

October 23, 2014

dockers-containers-cloud-plm

I want to continue the theme of how we move to the cloud. While Amazon remains one of the major providers of elastic computing services, other options are emerging too. If you are considering moving your PLM initiatives to the cloud, you might want to analyze how cloud PLM can actually be delivered. A few weeks ago I was talking about large manufacturing and the public cloud. Public cloud is an interesting option. At the same time, regulated manufacturers and companies with significant security restrictions can question this path. One alternative for these companies can be the just-announced Azure Cloud System from Microsoft/Dell. It will take time for PLM vendors to support it, but cloud PLM in an Azure box can become a reality soon.

Today I want to speak more about some trends in cloud computing and how they can relate to your future cloud PLM project. Remember my article What cloud PLM cannot do for you? The biggest achievement of cloud PLM today is the removal of IT hassle and complexity. With cloud PLM you don’t need to think about servers, installations or even upgrades. However, here is the thing. The number of cloud applications is growing. The application lifecycle is getting more interesting these days. A large enough company can easily face the situation of managing multiple clouds – public and private at the same time. The complexity of a manufacturing organization, supply chain, security or other IT-related reasons can easily bring you to such a situation. These are not simple questions, and it is very important to create the right strategy for your IT organization to manage cloud PLM and other application providers.

Devops

You can consider “devops” a new buzzword. It comes from a combination of “development” and “operations”. Brick-and-mortar PLM software vendors were doing development only. They developed, tested and shipped CAD, PDM and PLM software on CDs, and you had to hire IT specialists to install, configure and run it. Now, it is different with cloud software. By removing the IT hassle from the customer, the software vendor takes on the role of IT too. This created a new paradigm of development+operations together. Think about engineering and manufacturing. They have to go together to make it work.

The InfoWorld article Devops has moved out of the cloud speaks more about the devops trend. I like the way it demystifies the cloud by explaining how the same infrastructure can be used for both cloud and non-cloud development and IT environments. It also helps you understand the importance of operations in achieving quality of cloud services. Here is my favorite passage:

Many people attribute the rise of devops directly to the growth of cloud computing. The connection: It’s easy to continuously update cloud applications and infrastructure. For example, a SaaS application typically requires 1,000 lines or more of changed or added code each time you use it. Its functionality is continuously updated, which makes the cloud-delivered application, platform, or infrastructure more valuable to the users. Gone are the days when you received CDs or DVDs in the mail and had to manually update the servers. Although the cloud is certainly a better place for devops, I don’t believe that devops should be used only in cloud deployments. Instead, you should use devops approaches and enabling tools such as Puppet or Chef in most of the development you do these days — both cloud and on-premises.

Kubernetes

We need to thank Amazon EC2 and other IaaS vendors for the incredible success of cloud computing we have today. However, technology doesn’t stand still. Over the last decade, public web companies have learned many lessons about how to manage infrastructure and software development on demand and at scale.

Kubernetes is an example of how web companies can scale using cloud infrastructure. Navigate to the ComputerWeekly article – Demystifying Kubernetes: the tool to manage Google-scale workloads in the cloud – and spend some time with it, even if you find it a bit technical. In a nutshell, it speaks about a new cloud deployment technology – containers – which is coming to replace the well-known VMs (virtual machines). Here is the most important passage in my view:

Kubernetes and Docker deliver the promise of PaaS through a simplified mechanism. Once the system administrators configure and deploy Kubernetes on a specific infrastructure, developers can start pushing the code into the clusters. This hides the complexity of dealing with the command line tools, APIs and dashboards of specific IaaS providers. Developers can define the application stack in a declarative form and Kubernetes will use that information to provision and manage the pods. If the code, the container or the VM experience disruption, Kubernetes will replace that entity with a healthy one.

containers-vs-VMs

While it may sound complex, the key issue here is the lifecycle of complex cloud PLM environments. At the end of the day, cloud PLM vendors will have to manage updates, introduce new features, maintain data and more. This technical example shows you the gap between this new type of cloud infrastructure and simply moving an existing PLM server from your server room to the cloud.
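The reconciliation behavior the ComputerWeekly passage describes (declare a desired state; the system replaces unhealthy pods) can be sketched as a toy control loop. This is an illustration of the concept only, not the Kubernetes API:

```python
# Toy reconciliation loop: the operator declares desired state,
# and the controller converges actual state toward it.
desired = {"app": "plm-server", "replicas": 3}

def reconcile(pods):
    """Drop unhealthy pods and start new ones until the declared
    replica count is met."""
    healthy = [p for p in pods if p["healthy"]]
    while len(healthy) < desired["replicas"]:
        healthy.append({"app": desired["app"], "healthy": True})  # start a new pod
    return healthy[: desired["replicas"]]

pods = [{"app": "plm-server", "healthy": True},
        {"app": "plm-server", "healthy": False}]  # one pod crashed
pods = reconcile(pods)
assert len(pods) == 3 and all(p["healthy"] for p in pods)
```

The point for PLM buyers: a vendor running on this kind of infrastructure can roll out updates and recover from failures continuously, which is very different from hosting a traditional PLM server image in a VM.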

What is my conclusion? We should move beyond the “cloud PLM” buzzword. Enterprise software vendors are moving from shipping CDs to selling software services. It simplifies the customer experience, but creates new layers of complexity in the vendor’s organization. It moves software development toward devops and creates technologies capable of managing the application lifecycle more easily. It all comes down to the quality of the PLM cloud service. Keep it in mind when you evaluate your future cloud PLM project. Just my thoughts…

Best, Oleg


PLM and Microsoft Azure Cloud In A Box

October 22, 2014

ms-azure-cloud

How do you move to the cloud? This is one of the topics I’ve been discussing on my blog over the last year. Last time, I took a swing toward the public cloud. Navigate to my PLM vendors, large manufacturers and public cloud article for more information. However, not everybody will move to the public cloud. At least not very soon.

For those looking for alternatives, especially in the private cloud zone, the latest update from Microsoft can be very good news. Navigate to the Business Insider blog – Microsoft’s Satya Nadella Just Fired A Shot At HP And IBM. Microsoft has turned to Dell to create a new computer server. Here is the passage which provides more info:

The new computer is called the “Microsoft Cloud Platform System” and it will be a mini-version of Microsoft’s cloud, Azure, that enterprises can install in their own data centers. By using this server, enterprises can easily move applications from their own private data center to Microsoft’s cloud and back again. (In geek speak, this is called “hybrid computing”.)

Some more details came from the CMSWire blog earlier today – Take a Seat Google, Amazon: Microsoft’s Cloud Wins the Day. So what is the Microsoft Azure Cloud in a Box? Here is the definition of the “Box”:

...new Azure-like appliance that Enterprises can deploy in their own data centers. It has been designed specifically to handle big data workloads (32 cores, 450 gigabytes of RAM and 6.5 terabytes of local solid-state drive storage). Officially named the Microsoft Cloud Platform System (CPS), powered by Dell it is, in essence, an “Azure consistent cloud in a box” with pre-integrated hardware from Dell and software from Microsoft.

I captured the following architecture shot from WinITPro article:

plm-azure-in-a-box

It made me think about the potential impact and opportunity for PLM vendors. For most of them, alignment with Microsoft can be very beneficial. If Microsoft does the hard work of promoting its Cloud Platform System to the CIOs of large enterprise companies, PLM can be the icing on the cake. So, on the surface it all looks good, especially for PLM vendors fully aligned with the Microsoft software stack. I guess Microsoft partnership programs can provide some additional benefits too.

The issue I’d like to question relates to the data layer. Most large PLM deployments today run on top of an Oracle database. Oracle has its own cloud plans – Oracle cloud PaaS will provide a magic button for PLM. The availability of Oracle DB as part of the Azure Cloud Platform is questionable and could become an obstacle to moving PLM systems to Azure.

What is my conclusion? The devil is in the details. This is the best way to describe the status of cloud PLM software architecture today. PLM vendors are developing their own cloud strategies. Manufacturing companies are looking for the easiest path to the cloud. We will see some interesting moves from both sides. A good time for PLM architects and tech advisers. Just my thoughts…

Best, Oleg

Note: I’m migrating my blog to http://beyondplm.com – you might consider following and commenting there.


PLM Files Detox

October 21, 2014

zero-files-no-CAD-files

The digital life around us is changing. There was a time when everything we did revolved around the desktop computer. You do your job, Save As… and, yes(!), put it in a file that gives you control over the result of your work. That’s the reason engineers are in love with CAD files and Excel spreadsheets – it gives them full control over what they do. Excel files get messy over time, but we can always start a new file or open a new spreadsheet.

Rob Cohee of Autodesk reminded me how much engineers are in love with files in his LinkedIn article – My Name is Rob, and I’m Addicted to Files. I captured a few passages from Rob’s article below. He brilliantly explains the full engineering enjoyment of control over design and related information.

It started out small with a .DWG here, a .DOC, there with a sprinkle of .XLS files in between.

I had the freedom to create all this data, and the power is nothing short of addicting. Critical design requirements, tolerance, specification, and performance requirements, assembly instructions, a digital folder of file after file containing all of this critical information. I was the Michelangelo of AutoCAD R13 C4, the DWG was my canvas, safety was my muse.

The drawing file became everything. It was my design, requirements document, revision control, my parts list, my BOM, my supplier and procurement instructions, my cut list, my everything. All that data, all in one place locked away in my CAD file that only I had access to make modifications. The control was dizzying, euphoric at times. Any change to the drawing file had to go through me and me alone.

Rob’s article reminded me of some of my old posts – The future of CAD without files. I still very much like a diagram I placed there from the O’Reilly Radar article – Why files need to die. Here is my conclusion from back in 2011.

The fundamentals of CAD and design systems are files. We use them to store assemblies, parts, drawings. In addition to that, we use them as references in many places. Do you think the “file” paradigm will live with CAD and other design systems forever? The movement of CAD vendors seems to me an obvious application of modern web principles to the world of design and engineering. The initial signals are here. CATIA V6 pushed the limits and eliminated files by connecting the CATIA system directly to the Enovia back-end. Autodesk cloud experiments with systems like AutoCAD WS made the existence of files on disk obsolete. PTC introduced Creo Apps. It will be interesting to see if PTC will come around to the idea of eliminating files. I think computing and information paradigms are shifting from file-oriented to data (and web) oriented. The initial signs are here. The speed of this movement is questionable. Manufacturing is a slow-changing environment and engineers are very reluctant to change.

PDM (Product Data Management) was a solution to end the CAD file mess. PDM systems came to hunt for CAD and other files. The intent was to bring files into order, manage revisions, share data and… after some time, to eliminate files. We can see this starting to happen now in some high-end systems such as CATIA V6. So, why did PDM fail to detox engineers from files? Here is the thing… PDM was invented to help engineers manage and control data. It sounds like engineers should like PDM, since it helps them control files. But it didn’t go according to plan. PDM added “friction” to engineers’ freedom to create data the way they want. Name control, check-in/check-out, approvals, etc. As a result, PDM failed to become a friend and turned into engineers’ nightmare. Engineers don’t like PDM, and in many situations they were forced to use it.
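The check-in/check-out friction mentioned above is essentially a pessimistic lock on a file. A minimal sketch of the mechanism (hypothetical names, not any specific PDM product’s API) shows why engineers experience it as an obstacle:

```python
class Vault:
    """Minimal PDM-style vault: one user holds a file's check-out at a time."""
    def __init__(self):
        self.locks = {}  # file name -> user holding the check-out

    def check_out(self, name, user):
        holder = self.locks.get(name)
        if holder and holder != user:
            raise PermissionError(f"{name} is checked out by {holder}")
        self.locks[name] = user

    def check_in(self, name, user):
        if self.locks.get(name) != user:
            raise PermissionError(f"{user} does not hold {name}")
        del self.locks[name]  # new revision becomes visible to everyone

vault = Vault()
vault.check_out("bracket.dwg", "alice")
# Anyone else who tries to check out bracket.dwg now gets an error
# and has to wait -- exactly the friction engineers dislike.
```

The lock guarantees revision consistency, but every guarantee is a step the engineer did not have to take when the file simply lived on a local disk.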

The working environment is changing fast. We are getting disconnected from files in our digital life. Our everyday workflows are getting distributed, mobile, disconnected from desktops and… files. We want access to data, not to files. To make this process successful, we need to think about how to remove friction. When you go to engineering school, you learn about the importance of friction. But software is different, especially these days. Friction can slow down software adoption.

What is my conclusion? Engineering and manufacturing is a slow-changing environment. Engineers are conservative and design-minded. That is why many PLM tools failed to become favorite engineering data management and collaboration tools. Large teams accepted PDM tools because they had no choice. I believe the future won’t belong to files. We are going to see a more data-driven environment around us. Establishing such an environment is one of the main challenges for PLM companies today. To make it happen, PLM vendors must think about how to remove friction between users and PLM tools. Just my thoughts…

Best, Oleg


How to rethink PLM workflows?

October 20, 2014

plm-mobile-workflow

Workflows and processes. These are an important part of any company. Like blood flowing through your body, workflows run through a company and beyond. A couple of months ago, I posted my ultimate PLM workflow dream. It came as part of my thinking about “un-bundling services”. My idea was to publish a list of workflow (process management) features that could be used as an independent service.

Many businesses were created with the vision of improving processes and supporting business workflows. However, email is still one of the key elements of every workflow and business process management implementation. Because of its ubiquity, email remains one of the most widely used engines behind companies’ workflows. How to move from emails and messages to collaboration – in my view, this is one of the most critical elements that can help streamline PLM workflows. One of the ideas I discussed earlier was to connect emails and workflow – how to turn email into actionable workflows.
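Turning an email into an actionable workflow step can be as simple as mapping message fields onto a task record. A sketch of the idea, with assumed field names and subject-line conventions (nothing here reflects a real PLM product):

```python
def email_to_task(msg):
    """Map an inbound email to a workflow task.
    Assumed conventions: the subject carries the action behind an
    'ECO: ' prefix, and the 'to' address identifies the assignee."""
    return {
        "action": msg["subject"].removeprefix("ECO: ").strip(),
        "assignee": msg["to"],
        "context": msg["body"],
        "status": "open",
    }

task = email_to_task({
    "to": "engineer@example.com",
    "subject": "ECO: Approve bracket revision B",
    "body": "Tolerance updated per supplier feedback.",
})
assert task["action"] == "Approve bracket revision B"
```

Once the email is a structured task, it can be routed, tracked and closed by a workflow engine instead of sitting in an inbox.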

Over the weekend, I was skimming through the TechCrunch article – 10 Trends Transforming Enterprise IT. The trend that caught my attention was #6 – Reimagining enterprise workflows. Read the following passage:

Widespread adoption of mobile devices has led to more efficient enterprise workflows. Salesforce.com CEO Marc Benioff recently said he runs his whole business from his phone. This gets easier every day. Whether it is quickly turning around business documents via the e-signature application DocuSign or fine-tuning scheduling and route optimization for people in the field via ServiceMax, mobile applications are reinventing and automating processes across the enterprise, enabling work to get done faster and smarter.

Here is the article referencing Salesforce.com’s Marc Benioff statement earlier this year.

"I run my business entirely on my phone," CEO Marc Benioff said. "I believe this is the future." As companies store less and less data on site, more will no longer need desktops, he said.

It made me think about the changing roles of email and mobile. One of the biggest early successes of mobile computing was turning business email into a mobile service. I used one on my old BlackBerry and it was very reliable. Here is the thing. Mobile devices are more ubiquitous today than email. Mobile applications can be easier and more intuitive than the list of emails in my inbox. Mobile can be the service that helps rethink PLM workflows, bypassing email and existing complex business suites.

What is my conclusion? We need to learn how to break the things that are preventing us from moving forward. Email is one of them. In the past, we wanted to connect every PLM workflow to email. That was our desire for a single point of communication. Today, our mobile phone is our single point of communication, and it is more powerful than our desktop computer was 10 years ago. People don’t have to be tied to their desks. Therefore, disrupting PLM workflows by making them completely mobile can be an interesting option. Just my thoughts…

Best, Oleg


PLM: from sync to link

October 17, 2014

plm-data-link-sync

Data has an important place in our lives. Shopping lists, calendars, emails, websites, family photos, trip videos, documents, etc. We want our data to be well organized and easy to find. Marketing folks like to use the term “data at your fingertips”. However, the reality is just the opposite. Data is messy. We store it in multiple places, we forget the names of documents, and we can hardly control it.

Everything I said above applies to manufacturing companies too. But there it gets even more complicated. Departments, contractors, suppliers, multiple locations and multiple systems. So, data lives in silos – databases, network drives, multiple enterprise systems. In my article – PLM One Big Silo – I talked about organizational and application silos. The data landscape in every manufacturing company is very complex. Software vendors are trying to crush silos by introducing large platforms that can help integrate and connect information. It takes time and huge cost to implement such a system in a real-world organization, which makes it almost a dream for many companies.

In my view, openness will play a key role in the ability of systems to integrate and interconnect. It will help provide access to information across the silos, and it leads to one of the key problems: data sharing and identity. Managing data in silos is a complex task. It takes time to organize data, figure out how to interconnect it, organize data reporting and support data consistency. I covered this in more detail in my PLM implementations: nuts and bolts of data silos article.

Joe Barkai’s article Design Reuse: Reusing vs. Cloning and Owning speaks about the problem of data reuse. In my view, the data reuse problem is real and connected directly to the issue of data silos. I liked the following passage from Joe’s article:

If commonly used and shared parts and subsystems carry separate identities, then the ability to share lifecycle information across products and with suppliers is highly diminished, especially when products are in different phases of their lifecycle. In fact, the value of knowledge sharing can be greater when it’s done out of sync with lifecycle phase. Imagine, for example, the value of knowing the manufacturing ramp up experience of a subsystem and the engineering change orders (ECOs) that have been implemented to correct them before a new design is frozen. In an organization that practices “cloning and owning”, it’s highly likely that this kind of knowledge is common knowledge and is available outside that product line.

An effective design reuse strategy must be built upon a centralized repository of reusable objects. Each object—a part, a design, a best practice—should be associated with its lifecycle experience: quality reports, ECOs, supplier incoming inspections, reliability, warranty claims, and all other representations of organizational knowledge that is conducive and critical to making better design, manufacturing and service related decisions.

Unfortunately, the way most companies and software vendors solve this problem today is simply data sync. Yes, data is synced between multiple systems. Brutally. Without thinking twice. In the race to control information, software vendors and implementing companies are batch-syncing data between multiple databases and applications. Parts, bills of materials, documents, specifications, etc. Data moves back and forth between engineering applications and manufacturing databases. Specifications and design information are synced between OEM-controlled databases and suppliers’ systems. This data synchronization leads to a lot of inefficiency and complexity.

There must be a better way to handle information. To allow efficient data reuse, we need to think more about how to link data together, not synchronize it between applications and databases. This is not a simple task. An industry that for years took “sync” as the universal way to solve the data integration problem cannot shift overnight and work differently. But here is the good news. Over the last two decades, web companies have accumulated a lot of experience managing huge volumes of interconnected data. The move toward cloud services creates an opportunity to work with data differently. It will provide new technologies for data integration and data management. It can also open new ways to access data across silos. As a system that manages product data, PLM can introduce a new way of linking information and help reuse data between applications.
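The difference between “sync” and “link” can be shown with a toy example: syncing copies a record into each system, and the copies drift as soon as the source changes, while linking stores a reference that always resolves to the single source of truth. The part numbers and fields below are invented for illustration:

```python
# Single source of truth for part data (e.g., the engineering system)
parts = {"P-100": {"desc": "Bracket", "rev": "A"}}

# "Sync": a downstream system (say, ERP) gets its own copy...
erp_copy = dict(parts["P-100"])
parts["P-100"]["rev"] = "B"           # engineering releases revision B
assert erp_copy["rev"] == "A"         # ...and the copy is now stale

# "Link": the downstream system stores a reference and resolves it on demand
erp_link = "P-100"
assert parts[erp_link]["rev"] == "B"  # always the current revision
```

The copy has to be re-synced after every change, across every system that holds one; the link never goes stale. Scaling that difference across parts, BOMs and suppliers is the crux of the sync-vs-link argument.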

What is my conclusion? There is an opportunity to move from syncing data to linking data. It will simplify data management and help with data reuse. It requires a conceptual rethink of how data integration problems are solved between vendors. By providing a “link to data” instead of actually “syncing data”, we can help companies streamline processes and improve the quality of their products. Just my thoughts…

Best, Oleg


Kenesto revamp: does it change cloud PLM game?

October 17, 2014

kenesto-edm

It has been more than two years since I reviewed Kenesto – an outfit founded by Mike Payne with a strong vision to simplify process management. Navigate to the following article, PLM, Kenesto and process experience, to refresh your memory.

Steve Bodnar of Kenesto left comments on my blog about Google Drive and 3rd-party apps, with hints about Kenesto functionality around file synchronization and cloud data management. It was a good alert that Kenesto was preparing a refresh. The following Kenesto press release caught my attention yesterday – Kenesto Extends Engineering Collaboration with New Vaulting and State-of-the-art Desktop File Synchronization. I found it interesting, since it moves Kenesto from a process management cloud tool into something bigger – data management and vaulting. Back in 2012, I thought that the ability to handle engineering data was a big differentiator between traditional PLM systems and a cloud process management tool like Kenesto. The following passage from the Kenesto press release gives a short description of the shift Kenesto made – it moved into the data and file management space.

Kenesto today announced the full availability of its latest innovations – file vaulting and a pioneering file synchronization service – to enable mainstream design and engineering firms to more easily and effectively collaborate and manage their data. Kenesto’s latest capabilities also work well in conjunction with such design tools as Creo®, SolidEdge®, SolidWorks®, and Spaceclaim® for manufacturing customers and also Revit® for AEC customers, to enable file management and sharing across design workflows. This is all done while also ensuring proper handling of updates to component and assembly models connected to items and bills-of-material, for example.

I took a trip through the Kenesto website. It presents a broad range of solutions – engineering design management, change management, procurement and supplier collaboration, program and project management. These are the traditional PLM suspects. However, some of the solutions are clearly outside the typical PLM domain – management of marketing programs, PR and advertising, idea management.

Kenesto’s features cover a wide range of capabilities – projects, dashboards, reporting, document management, vaulting, web viewing, workflow and task management. Enterprise-class File Synchronization caught my special attention. This is an interesting feature, and it made me think about cloud PDM functionality and cloud file sharing. My blog post – Cloud PDM ban lifted. What next? – speaks about the growing interest of PLM and other vendors in applying cloud technologies to PDM – a space that traditionally tried to avoid the cloud’s touch. So, Kenesto has just joined the crowd of cloud PDM vendors, and I need to add Kenesto to the list of companies open for cloud PDM competition.

kenestoDesktopSync

What is my conclusion? It looks like Kenesto decided to change the trajectory of its technologies and moved from process and workflow management to a full scope of product data management and lifecycle solutions. I guess Kenesto prefers not to use the traditional PDM and PLM buzzwords. However, the Engineering Data Management (EDM) acronym made me feel a bit nostalgic… At the same time, cloud sync and in-browser office file editing tools can provide interesting differentiation in the cloud era. Just my thoughts…

Best, Oleg

Disclaimer: Kenesto didn’t sponsor and didn’t influence content of this blog post.

