Cloud PDM hack with Google Drive and other tools

November 6, 2014

google-drive-app-launch

Earlier this week I talked about the future of ubiquitous CAD cloud drives. My hunch is that CAD and other engineering software companies will try to use new cloud technologies to improve the way people collaborate on design. The question of what tool to use for CAD file collaboration is not a simple one. I discussed it last year – Top 3 pros and cons to have a special CAD file sharing tool.

Engineering software vendors are trying to bring value through features such as collaborative viewing, redlining and even project collaboration. At the same time, companies focused on generic file sharing and collaboration are in full swing improving their mainstream solutions as well.

Some interesting news came from Google yesterday. Read the Google blog post – Launch desktop applications from Google Drive in Chrome. The story is quite simple – Google is chasing Dropbox by making Google Drive even more transparent to work with from desktop tools.

But here’s the catch: when it comes to browsers and installed applications working well together, they aren’t quite on the same page. To change that, today we’re launching a new extension for Chrome that lets you open files from Google Drive directly into a compatible application installed on your computer. This includes apps like advanced image and video editing software, accounting and tax programs, or 3D animation and design tools. So, no matter what you keep in Drive, using the web to access and manage files doesn’t mean you’re limited to using applications that only work in your browser.

Unfortunately, CAD files are not on the list of supported file types. I guess that may change in the future. Transparent sync of files between cloud and local storage can open a new opportunity and hack a way to simplify future cloud PDM solutions. Still, the majority of tools used by engineers today are desktop tools.

One of the biggest challenges I can see here is the speed of synchronization and working with multiple dependent files. It creates an opportunity for cloud PDM vendors to innovate. Some of these problems can be solved by software technologies – cloud PDM and Dropbox Streaming Sync. CAD vendors are looking at how to innovate in cloud PDM as well. Example – Autodesk adding PDM functionality to PLM360. Alternatively, I can see some potential in hardware solutions that create a virtual cloud file system. Here is one possible example of such a solution – Panzura Global File System.
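To see why dependent files make sync hard, consider that a CAD assembly only opens correctly once every file it references (directly or transitively) is present locally. Here is a minimal sketch of resolving that dependency closure; the file names and graph are invented for illustration, and a real PDM system would read references out of the CAD files themselves:

```python
from collections import deque

def files_to_sync(root, deps):
    """Return the full set of files needed locally before 'root' can open.

    deps maps each file to the files it directly references
    (e.g. an assembly referencing its parts).
    """
    needed, queue = {root}, deque([root])
    while queue:
        current = queue.popleft()
        for ref in deps.get(current, []):
            if ref not in needed:
                needed.add(ref)
                queue.append(ref)
    return needed

# Hypothetical dependency graph: an assembly, a sub-assembly and shared parts.
deps = {
    "top.asm": ["frame.asm", "bolt.prt"],
    "frame.asm": ["rail.prt", "bolt.prt"],
}
print(sorted(files_to_sync("top.asm", deps)))
# -> ['bolt.prt', 'frame.asm', 'rail.prt', 'top.asm']
```

A generic per-file sync tool such as Google Drive or Dropbox has no notion of this closure, which is exactly the gap a cloud PDM tool could fill.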

What is my conclusion? Cloud-to-desktop transparency is a big deal. There is no magic. If you want to use a desktop tool, you need to sync files. However, technology that makes it transparent can simplify the user experience and make users unaware of the actual location of files and the way files are synchronized. It will allow engineers to keep using existing CAD tools while managing and collaborating in the cloud. Just my thoughts…

Best, Oleg


PLM collaboration – your inbox is a to-do list

November 3, 2014

plm-to-do-list

Collaboration is a fascinating place. So much has been done to improve collaboration over the last decade, but it still remains an area many people are trying to improve. The technology around us is one of the reasons companies keep focusing on collaboration. What was efficient and new 10 years ago is completely obsolete now.

Actually, not quite… Email is one of the things we still keep at the top of our lists when it comes to communication and collaboration. Despite all the innovations – social technologies, mobile, etc. – we keep email as one of the primary ways to collaborate. I touched on this topic a few months ago in my post – How engineers find path from emails and messages to collaboration? One of the main points there – stop using your inbox as a to-do list. From my experience, PLM software vendors, as well as other companies focusing on collaborative software, spend significant effort trying to take people away from their email inboxes.

So, I wanted to get back to this question again. How do you kill the email inbox? Here are two examples I wanted to bring into the discussion today. One of them is a very recent announcement from Google about a new app – Inbox. If you haven't heard about it, navigate here to learn more. The following video can give you an additional idea of what Google is up to.

In a nutshell, Inbox looks like an intelligent combination of everything we've been trying to accomplish with email – invitations, reminders, to-do lists, messages, etc. I haven't had a chance to try it yet, but I'm sure I will share more as soon as I do.

Another example of innovation in collaboration comes from Slack. I looked at Slack some time ago. It caught my attention again via the TechCrunch article – Slack confirms $120M fundraise led by Google Ventures and KPCB at $1.12B valuation. Slack is an interesting combination of Twitter-like streams, cloud file storage and a messaging service. I'm trying Slack with some of my personal projects and will share some thoughts about my experience soon.

What is my conclusion? The two applications I shared demonstrate slightly different approaches to disconnecting us from the email inbox. Google Inbox organizes your work by intelligently extracting data from other Google apps and from email itself. Slack integrates activities from many other services, but keeps you focused on collaborative channels. In my view, both services share one important characteristic – integration of information in a single place. This is one of the main reasons many of us stick with email. It is hard to follow multiple places and channels to get your work done. So, I wonder what will become the next place for engineers to communicate and collaborate. Just my thoughts…

Best, Oleg

photo credit: @superamit via photopin cc


Why you should ask your cloud PLM vendor about devops and Kubernetes

October 23, 2014

dockers-containers-cloud-plm

I want to continue the theme of how we move to the cloud. While Amazon remains one of the major providers of elastic computing services, other options are emerging too. If you are considering moving your PLM initiatives to the cloud, you might do some analysis of how cloud PLM can actually be built. A few weeks ago I was talking about large manufacturing and the public cloud. The public cloud is an interesting option. At the same time, regulated manufacturers and companies with significant security restrictions may question this path. One of the alternatives for these companies can be the just-announced Azure Cloud System from Microsoft/Dell. It will take time for PLM vendors to support it, but cloud PLM in an Azure box can become a reality soon.

Today I want to speak more about some trends in cloud computing and how they can relate to your future cloud PLM project. Remember my article What cloud PLM cannot do for you? The biggest achievement of cloud PLM today is the removal of IT hassle and complexity. With cloud PLM you don't need to think about servers, installations or even upgrades. However, here is the thing. The number of cloud applications is growing. Application lifecycle is getting more interesting these days. A large enough company can easily find itself managing multiple clouds – public and private at the same time. The complexity of a manufacturing organization, supply chain, security or other IT-related reasons can easily bring you to such a situation. These are not simple questions, and it is very important to create the right strategy for an IT organization managing cloud PLM and other application providers.

Devops

You can consider “devops” a new buzzword. It comes from a combination of “development” and “operations”. Brick-and-mortar PLM software vendors were doing development only. They developed, tested and shipped CAD, PDM and PLM software on CDs, and you had to hire IT specialists to install, configure and run it. Now it is different with cloud software. By removing the IT hassle from the customer, the software vendor takes on the role of IT too. It created a new paradigm of development and operations together. Think about engineering and manufacturing. They have to go together to make it work.

The InfoWorld article Devops has moved out of the cloud speaks more about the devops trend. I like the way it demystifies the cloud by explaining how the same infrastructure can be used for both cloud and non-cloud development and IT environments. It also helps you understand the importance of operations in achieving the quality of cloud services. Here is my favorite passage:

Many people attribute the rise of devops directly to the growth of cloud computing. The connection: It’s easy to continuously update cloud applications and infrastructure. For example, a SaaS application typically requires 1,000 lines or more of changed or added code each time you use it. Its functionality is continuously updated, which makes the cloud-delivered application, platform, or infrastructure more valuable to the users. Gone are the days when you received CDs or DVDs in the mail and had to manually update the servers. Although the cloud is certainly a better place for devops, I don’t believe that devops should be used only in cloud deployments. Instead, you should use devops approaches and enabling tools such as Puppet or Chef in most of the development you do these days — both cloud and on-premises.

Kubernetes

We need to thank Amazon EC2 and other IaaS vendors for the incredible success of cloud computing we have today. However, technology doesn't stand still. Over the last decade, public web companies learned many lessons about how to manage infrastructure and software development on demand and at scale.

Kubernetes is an example of how web companies can scale using cloud infrastructure. Navigate to the ComputerWeekly article – Demystifying Kubernetes: the tool to manage Google-scale workloads in the cloud – and spend some time with it, even if you consider it a bit technical. In a nutshell, it speaks about a new technology of cloud deployment – containers, which come to replace the well-known VMs (virtual machines). Here is the most important passage in my view:

Kubernetes and Docker deliver the promise of PaaS through a simplified mechanism. Once the system administrators configure and deploy Kubernetes on a specific infrastructure, developers can start pushing the code into the clusters. This hides the complexity of dealing with the command line tools, APIs and dashboards of specific IaaS providers. Developers can define the application stack in a declarative form and Kubernetes will use that information to provision and manage the pods. If the code, the container or the VM experience disruption, Kubernetes will replace that entity with a healthy one.
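The "declarative form" in the passage above is the key idea: you declare the desired state, and a controller continuously reconciles reality against it. Here is a toy sketch of that reconciliation loop – it is not the Kubernetes API, just an illustration of the principle under simplified assumptions (pods are plain strings, one reconcile pass):

```python
def reconcile(desired_replicas, running):
    """One pass of a toy 'desired state' controller.

    running is a list of pod states ('healthy' or 'failed'). Returns the
    actions a Kubernetes-like system would take to converge on the
    declared replica count.
    """
    # Failed pods get replaced, never repaired in place.
    actions = [f"replace {i}" for i, state in enumerate(running) if state == "failed"]
    healthy = sum(1 for s in running if s == "healthy")
    # If replacements still leave us short, start more pods.
    if healthy + len(actions) < desired_replicas:
        actions += ["start new pod"] * (desired_replicas - healthy - len(actions))
    return actions

# Declared 3 replicas; one pod crashed and one is missing entirely.
print(reconcile(3, ["healthy", "failed"]))
# -> ['replace 1', 'start new pod']
```

The point for cloud PLM: once the environment is declared rather than hand-configured, updates and failure recovery become routine operations instead of IT projects.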

containers-vs-VMs

While it may sound too complex, the key issue here relates to the lifecycle of complex cloud PLM environments. At the end of the day, cloud PLM vendors will have to manage updates, introduce new features, maintain data and more. This technical example shows the gap between a new type of cloud infrastructure and simply moving an existing PLM server from your server room to the cloud.

What is my conclusion? We should move beyond the “cloud PLM” buzzword. Enterprise software vendors are moving from shipping CDs towards selling software services. It simplifies the customer experience, but creates new layers of complexity in the vendor’s organization. It moves software development to devops and creates technologies capable of managing the application lifecycle more easily. It all ends up in the quality of the PLM cloud service. Keep it in mind when you evaluate your future cloud PLM project. Just my thoughts…

Best, Oleg


PLM vendors, large manufacturers and public cloud

October 14, 2014

google-data-center

Companies are moving to the cloud these days. The question vendors and customers are asking today is how we move to the cloud. I asked this question in my post a few months ago – PLM / PDM: Why the cloud? Wrong question… I discovered multiple options for customers to start their move to the cloud – mainstream cloud productivity tools to share data and collaborate, migrating existing PLM platforms to the cloud using IaaS strategies, as well as building new types of platforms and tools on new cloud platforms and infrastructure.

Today, I want to show the perspective on the public cloud from both sides – a large provider of public cloud infrastructure (Google) and a large manufacturing company (GE) – and to see where their strategies intersect.

Google – example of public cloud platform

My attention was caught by the Google presentation – The next generation of Cloud. Navigate your browser to the following link to watch it. Besides the fact that it was inspired by the exact same question – “How do you move to the cloud” – it provided very interesting insight into Google’s public cloud platform.

google-1

google-2

google-3

google-4

Hardware cost is declining and Google is adjusting public cloud pricing to match economic realities. Together with economies of scale and utilization, I can see a trajectory towards even lower public cloud cost in the future.

Large manufacturers move to the cloud

So, what are customers thinking about the public cloud? InfoWorld just published an article presenting GE’s strategy to go all-in with the public cloud. Presented as an interview with GE COO Chris Drumgoole, the article outlines his aggressive plans to migrate to public cloud services – and how they support GE’s organizational goals. Read the article and draw your own conclusions. Here is my favorite passage:

Drumgoole won’t talk specific numbers, but he claims that “north of 90 percent” of the apps deployed by GE this year have been in a public cloud environment. We’re big fans of the idea that everything ends up in the public cloud utility model eventually. “Eventually” is the big caveat, because some people within GE would argue that should be tomorrow, while others would tell you it’s 15 years from now. It’s a subject of good debate. But either way, the regulatory environment we live in right now prohibits it. In a lot of spaces, when we say technically that we think something should be public, and we’re comfortable with it being public, the regulatory environment and the regulators aren’t quite there yet and we end up having to do some sort of private or hybrid cloud. That’s probably one of the biggest barriers to us moving more public.

Drumgoole speaks about connected devices, big data and analytics as significant drivers to move data to the cloud. It reminded me of one of my previous posts – IoT data will blow up traditional PLM databases (http://beyondplm.com/2014/09/23/iot-data-will-blow-up-traditional-plm-databases/). The amount of data is huge and it will certainly require a new approach to data management. Here is an example of how much data a jet engine produces these days:

Take one of the jet engines we make, and if it’s fully instrumented. On a typical flight, it’s going to generate about two terabytes of data. Not everybody fully instruments them, but if you instrument it the way people would like in order to get predictive data, you’re talking about 500GB per engine per flight. A flight with a GE engine takes off or lands every three seconds. All of a sudden, the data gets very, very large very, very fast.
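A quick back-of-envelope calculation shows how fast those numbers compound. Assuming, as the quote suggests, 500GB per engine per flight at the "predictive" instrumentation level, and that the every-3-seconds figure counts takeoffs and landings (so each flight contributes two events):

```python
SECONDS_PER_DAY = 24 * 60 * 60

events_per_day = SECONDS_PER_DAY // 3   # a takeoff OR a landing every 3 seconds
flights_per_day = events_per_day // 2   # each flight is one takeoff plus one landing
gb_per_flight = 500                     # per engine, "predictive" instrumentation

daily_gb = flights_per_day * gb_per_flight
print(flights_per_day, daily_gb / 1_000_000)
# -> 14400 flights and roughly 7.2 petabytes per day
```

Even with these rough assumptions, a single-engine fleet instrumented this way generates petabytes daily – the scale at which traditional PLM databases indeed break down.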

PLM vendors and public cloud

As of today, I’m not aware of any PDM/PLM software using Google Cloud as a platform. The majority of cloud PLM software is built on top of infrastructure provided by colocated hosting services and a variety of Amazon cloud infrastructure. Dassault Systèmes and Siemens PLM made a few public statements about support of a diverse set of cloud options and IaaS infrastructure. It will be interesting to see the future evolution of PLM cloud platforms.

What is my conclusion? The technology and economics of the cloud are changing these days. My hunch is it will pull more vendors and companies to use the public cloud in the next few years. Software companies will try to balance leveraging technological platforms and cost. At the same time, customers will try to balance regulatory requirements against opportunities to make data accessible and scale without limits. An interesting time and a significant opportunity. Just my thoughts…

Best, Oleg


How will the cloud pricing war affect PLM?

October 3, 2014

plm-and-cloud-price-war

Large infrastructure cloud providers are slashing prices. The TechCrunch article Nobody Can Win The Cloud Pricing Wars provides some additional details about the situation. The same article speaks about the moment when CIOs won’t be able to ignore the pricing advantage:

Earlier this week, Google lowered prices 10 percent across the board on their Google Compute Engine cloud platform. The cost is getting so low, it’s almost trivial for anyone to absorb the costs of running infrastructure in the cloud, but you have to wonder as the cloud pricing wars continue, how low can they go and if it’s a war anyone can win.

In spite of the low prices, there are still plenty of companies talking about the cloud with disdain and fear, but the fact is how long can CIOs ignore pricing as it goes this low? It doesn’t make good business sense, and whatever risks a large enterprise believe they might face with cloud services, it has to be offset by the plunging costs.

Are you confused by comparisons of cloud infrastructure prices? You are not alone. A GigaOM article provides one easy chart that will help you demystify cloud prices.

RBC’s formula condenses cloud services into one unit price based on “total spend per GB of RAM,” which includes storage, compute, memory, I/O and other base features. That makes it easier to compare cloud pricing across vendors. Per a research note from RBC analyst Jonathan Atkin this week, the second half of 2014 saw less price cutting than the first half — which included a round robin of competitive cuts from Google, Amazon and Microsoft in March.
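The RBC formula above is simple enough to sketch. The vendor names and dollar figures below are invented for illustration (not real list prices); the point is that collapsing everything into spend per GB of RAM makes otherwise incomparable offers comparable:

```python
def spend_per_gb_ram(monthly_spend, total_ram_gb):
    """RBC-style unit price: total spend divided by GB of RAM provisioned."""
    return monthly_spend / total_ram_gb

# Hypothetical monthly bills for the same workload on three vendors.
offers = {
    "vendor_a": (8200, 512),   # ($ per month, GB of RAM)
    "vendor_b": (7400, 448),
    "vendor_c": (9100, 640),
}
normalized = {v: round(spend_per_gb_ram(spend, ram), 2)
              for v, (spend, ram) in offers.items()}
print(normalized)
# -> {'vendor_a': 16.02, 'vendor_b': 16.52, 'vendor_c': 14.22}
```

Note how vendor_c, despite the largest absolute bill, comes out cheapest per unit – exactly the kind of inversion the single-number comparison is designed to expose.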

RBC-cloud-price-per-GB-RAM1

The devil is in the details and I’m obviously interested to see how it will impact (or not) PLM vendors. When it comes to "cloud", not all PLM vendors are the same. While most of them have publicly announced a cloud strategy, the diversity of cloud solutions is pretty high – public cloud platforms, leveraging an IaaS cloud layer and developing colo-hosting solutions.

It is important to see the business aspects of cloud PLM. The Thomasnet article by Verdi Ogewall, PLM Market Faces Challenges, Hints at Possibilities, provides an interesting perspective on the PLM market and the impact cloud PLM has created. Read the following passage:

One problem in assessing PLM investments for 2013 and beyond has to do with the changing licensing models, a matter which to some extent is connected to merging technology platforms, like the cloud. Increasingly, vendors are moving from paid-up licensing models to subscription models. Paid-up models have annual maintenance fees in the range of 18 to 22 percent of the license purchase price. Subscription models demand payment each year that is in the range of 30 to 40 percent of today’s list software pricing.

Has the hype around PLM in the cloud resulted in customer investments? So far, the answer is no. In fact, it may be the other way around. The cloud has affected the pricing and results on the on-premise market negatively, plus, while many PLM vendors have offerings, most have yet to see any real returns on their investments. Meanwhile, the discussion of SaaS (software-as-a-service) has created expectations of at least more effective pricing models. This picture may change quickly if the new business models for delivery and support of PLM act as triggers for greater investments.
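The licensing percentages quoted above make the break-even easy to sketch. Taking the midpoints as working assumptions (20% annual maintenance on a paid-up license, a subscription at 35% of list price per year – both my assumptions, not figures from the article), cumulative cost as a fraction of list price looks like this:

```python
def paidup_cost(years, maintenance=0.20):
    """Cumulative paid-up cost as a fraction of list price:
    100% up front plus annual maintenance (18-22% range; 20% assumed)."""
    return 1.0 + maintenance * years

def subscription_cost(years, rate=0.35):
    """Cumulative subscription cost: 30-40% of list price per year (35% assumed)."""
    return rate * years

for y in (1, 3, 5, 7):
    print(y, round(paidup_cost(y), 2), round(subscription_cost(y), 2))
# year 1: 1.20 vs 0.35 - subscription far cheaper up front
# year 7: 2.40 vs 2.45 - subscription finally overtakes paid-up
```

Under these assumptions the subscription stays cheaper for roughly the first six to seven years – which helps explain both the customer appetite for subscriptions and the vendors' short-term revenue pain the article describes.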

So, what does the cloud infrastructure price drop mean for PLM vendors? My hunch, this is good news for PLM vendors hosting their solutions on IaaS infrastructure. This is a very costly option, especially with an existing "on-premise" single-tenant PLM architecture. Lower prices will allow PLM vendors to adjust their expenses. It can be even more beneficial for vendors building an optimized multi-tenant cloud PLM architecture. However, it probably won’t impact vendors focusing on private and hybrid cloud infrastructure. And since, regardless of PLM architecture, 50% of a PLM project is the cost of services provided by vendors and implementers, the overall impact of infrastructure cost will be smaller.

What is my conclusion? The cloud pricing war will impact customer mindset. It will increase customer demand to lower the cost of PLM solutions. It will shift CIOs’ perspective on how to leverage cloud infrastructure in their business. Low cloud infrastructure cost won’t make cloud PLM software free tomorrow. At the same time, it will help PLM vendors adjust the overall cost of PLM services and implementations. Better architecture of cloud PLM solutions will help vendors turn savings in infrastructure cost into more cost-effective PLM cloud services. Just my thoughts…

Best, Oleg


Will public clouds help enterprises to crunch engineering data?

August 6, 2014

google-data-center-crunches-engineering-data

The scale and complexity of data is growing tremendously these days. If you go back 20 years, the challenge for PDM / PLM companies was how to manage revisions of CAD files. Now we have much more data coming into the engineering department: data about simulations and analysis, information about supply chains, online catalog parts and lots of other things. Product requirements have been transformed from a simple Word file into complex data with information about customers and their needs. Companies are starting to capture information about how customers use products. Sensors and other monitoring systems are everywhere. The ability to monitor products in real life creates additional opportunities – to fix problems and optimize design and manufacturing.

Here is the problem… Despite a strong trend towards cheaper computing resources, applying brute computing force still doesn’t come for free. Services like Amazon S3 are relatively cheap. However, if you want to crunch, analyze and/or process large sets of data, you will need to pay. Another aspect is performance. People expect software to work at the speed of the user’s thinking process. Imagine you want to produce design alternatives for your future product. In many situations, waiting a few hours won’t be acceptable. It will be distracting to users, and they won’t use such a system after all.

The Manufacturing Leadership article Google’s Big Data IoT Play For Manufacturing speaks exactly about that. What if the power of web giants like Google could be used to process engineering and manufacturing data? I found the explanation provided by Tom Howe, Google’s senior enterprise consultant for manufacturing, quite interesting. Here is the passage explaining Google’s approach.

Google’s approach, said Howe, is to focus on three key enabling platforms for the future: 1/ Cloud networks that are global, scalable and pervasive; 2/ Analytics and collection tools that allow companies to get answers to big data questions in 10 minutes, not 10 days; 3/ And a team of experts that understands what questions to ask and how to extract meaningful results from a deluge of data. At Google, he explained, there are analytics teams assigned to every functional area of the company. “There’s no such thing as a gut decision at Google,” said Howe.

It sounds like a viable approach to me. However, it made me think about what would make Google and similar computing power holders sell it to enterprise companies. Google’s biggest value is not in selling computing resources. Google’s business is selling ads… based on data. My hunch is there are two potential reasons for Google to support manufacturing data initiatives – the potential to develop a Google platform for manufacturing apps and the value of data. The first one is straightforward – Google wants more companies in its eco-system. I find the second one more interesting. What if manufacturing companies and Google found a way to get insight from engineering data useful for their business? Or even more – improving their core business.

What is my conclusion? I’m sure that in the future data will become the next oil. The value of getting access to the data can be huge. The challenge of getting that access is significant. Companies won’t allow Google, or PLM companies, to simply use the data. Companies are very concerned about IP protection and security. Balancing data access, value proposition and the insight gleaned from data can be an interesting play. For all parties involved… Just my thoughts…

Best, Oleg

Photo courtesy of Google Inc.


Why PLM shouldn’t miss the next email move

July 18, 2014

plm-email

Email is the king of communication in every company. Many companies are literally run by email. People use it for different purposes – notification, collaboration and very often even records management. You can hear many discussions about how companies can replace or integrate email with enterprise and social collaboration tools. I captured some of them in my previous blogging: How engineers find path from emails and messages to collaboration?; PLM Workflows and Google Actionable Emails; DIY PLM and Zero Email Policy; PLM Messaging and WhatsApp Moment.

You may think email doesn’t change. I wanted to share with you two interesting examples of changes and innovation in email that caught my attention over the last few weeks. A Verge article speaks about the Gmail API announcement.

Google announced that any app could now talk to Gmail using today’s faster, more modern languages — languages that every web developer speaks. The Gmail API lets you ask Google for threads, messages, drafts, and labels three to ten times faster than with IMAP. What it can do is provide an interface for any app to interact on a small scale with your Gmail account without having to create an entire mail client. When that happens, Google won’t have replaced email — it will have actually extended it. Instead of killing email as some hoped it would, the Gmail API gives email new life.
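What "interact on a small scale with your Gmail account" could mean for PLM: mine threads for product references and attach conversations to item records. Here is a hedged sketch – the payload below only mimics the shape of a Gmail API users.threads.list response, and the thread contents and part-number convention (PN-xxxx) are invented for illustration:

```python
import re

# A response shaped roughly like the Gmail API thread-list payload;
# snippets are hypothetical.
response = {
    "threads": [
        {"id": "a1", "snippet": "ECO-1042 approved, part PN-7731 rev B released"},
        {"id": "a2", "snippet": "Lunch on Friday?"},
        {"id": "a3", "snippet": "Question about PN-7731 tolerance stack-up"},
    ]
}

def threads_mentioning_parts(resp, pattern=r"\bPN-\d+\b"):
    """Group email threads by the part numbers they mention, so a PLM tool
    could attach each conversation to the right item record."""
    index = {}
    for thread in resp.get("threads", []):
        for part_number in re.findall(pattern, thread["snippet"]):
            index.setdefault(part_number, []).append(thread["id"])
    return index

print(threads_mentioning_parts(response))
# -> {'PN-7731': ['a1', 'a3']}
```

The interesting part is what doesn't appear: no mail client, no IMAP plumbing – just structured data flowing from email into an engineering application, which is exactly the extension of email the article describes.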

The following video presents some additional details about Gmail API usage. Take 5 minutes to watch it, especially the places where the video speaks about integration between Gmail and enterprise systems.

Another example comes from TNW article – Inbox launches as an open-source email platform to replace legacy protocols.

A new startup, Inbox, is launching its “next-generation email platform” as an alternative to aging protocols like IMAP and SMTP. The core of Inbox’s efforts is an Inbox Sync Engine for developers that adds a modern API on top of mail providers, including Gmail, Yahoo and Outlook.com.

As stated in the article, Inbox is a platform play. The intent of the founders is to create a new generation of messaging platform. And it is an open-source play. The first step for Inbox is to create a sync engine that can expose existing email providers:

The core of Inbox is an open source sync engine that integrates with existing email services like Gmail, and exposes a beautiful, modern REST API. We’re pleased to announce that beginning today, you can download the Inbox engine, sync an account, and begin building on top of Inbox in your local development environment.

These articles made me think about a potential play PLM and engineering applications can make by building their collaboration applications tightly integrated with email services. It would allow better communication for people and ease of data integration between PLM solutions and communication platforms such as email. You may see it as a purely technical play. Who cares how to integrate email and data? However, in my view, this is a place where differentiation in user experience and seamless data integration can become critical to driving user adoption.

What is my conclusion? It is very hard to change people’s habits. Email is part of our everyday routine. Existing systems are integrated with email, but the way it’s done, as well as the level of data integration, is very sporadic. Lots of unstructured data about customers, engineering decisions, requirements and many other things is stuck in email and lost there forever. A new email approach may help create transparent and seamless integration between business applications and email. It can make a difference for users. Just my thoughts…

Best, Oleg

