PLM and Microsoft Azure Cloud In A Box

October 22, 2014

ms-azure-cloud

How do you move to the cloud? This is one of the topics I’ve been discussing on my blog for the last year. Last time, I took a swing at public cloud. Navigate to my PLM vendors, large manufacturers and public cloud article for more information. However, not everybody will move to public cloud. At least not very soon.

For those who are looking for alternatives, especially in the private cloud zone, the latest update from Microsoft can be very good news. Navigate to the Business Insider blog – Microsoft’s Satya Nadella Just Fired A Shot At HP And IBM. Microsoft has turned to Dell to create a new computer server. Here is the passage that provides more detail:

The new computer is called the “Microsoft Cloud Platform System” and it will be a mini-version of Microsoft’s cloud, Azure, that enterprises can install in their own data centers. By using this server, enterprises can easily move applications from their own private data center to Microsoft’s cloud and back again. (In geek speak, this is called “hybrid computing”.)

Some more details came from the CMSWire blog earlier today – Take a Seat Google, Amazon: Microsoft’s Cloud Wins the Day. So what is the Microsoft Azure cloud in a box? Here is the definition of the “Box”:

...new Azure-like appliance that Enterprises can deploy in their own data centers. It has been designed specifically to handle big data workloads (32 cores, 450 gigabytes of RAM and 6.5 terabytes of local solid-state drive storage). Officially named the Microsoft Cloud Platform System (CPS), powered by Dell it is, in essence, an “Azure consistent cloud in a box” with pre-integrated hardware from Dell and software from Microsoft.

I captured the following architecture shot from WinITPro article:

plm-azure-in-a-box

It made me think about the potential impact and opportunity for PLM vendors. For most of them, alignment with Microsoft can be very beneficial. If Microsoft does the hard work of promoting its Cloud Platform System to CIOs of large enterprise companies, PLM can be the icing on the cake. So, on the surface it all looks good, especially for PLM vendors fully aligned with the Microsoft software stack. I guess Microsoft partnership programs can provide some additional benefits too.

The issue I’d like to raise is related to the data layer. Most large PLM deployments today run on top of an Oracle database. Oracle has its own cloud plans – Oracle cloud PaaS will provide a magic button for PLM. The availability of Oracle DB as part of the Azure Cloud Platform is questionable and could become an obstacle to moving PLM systems to Azure.

What is my conclusion? The devil is in the details. This is the best way to describe the status of cloud PLM software architecture today. PLM vendors are developing their own cloud strategies. Manufacturing companies are looking for the easiest path to the cloud. We will see some interesting moves from both sides. A good time for PLM architects and tech advisers. Just my thoughts…

Best, Oleg

Note: I’m migrating my blog to http://beyondplm.com – you might consider following and commenting there.


PLM Files Detox

October 21, 2014

zero-files-no-CAD-files

The digital life around us is changing. There was a time when everything we did revolved around the desktop computer. You do your job, Save As… and, yes(!), put it in a file that gives you control over the result of your work. That’s the reason engineers are in love with CAD files and Excel spreadsheets – they give them full control over what they do. Spreadsheets get messy over time, but we can always start a new file or open a new spreadsheet.

Rob Cohee of Autodesk reminded me how much engineers are in love with files in his LinkedIn article – My Name is Rob, and I’m Addicted to Files. I captured a few passages from Rob’s article below. He brilliantly explains the sheer engineering enjoyment of control over design and related information.

It started out small with a .DWG here, a .DOC, there with a sprinkle of .XLS files in between.

I had the freedom to create all this data, and the power is nothing short of addicting. Critical design requirements, tolerance, specification, and performance requirements, assembly instructions, a digital folder of file after file containing all of this critical information. I was the Michelangelo of AutoCAD R13 C4, the DWG was my canvas, safety was my muse.

The drawing file became everything. It was my design, requirements document, revision control, my parts list, my BOM, my supplier and procurement instructions, my cut list, my everything. All that data, all in one place locked away in my CAD file that only I had access to make modifications. The control was dizzying, euphoric at times. Any change to the drawing file had to go through me and me alone.

Rob’s article reminded me of some of my old posts – The future of CAD without files. I still very much like a diagram I placed there from the O’Reilly Radar article – Why files need to die. Here is my conclusion from back in 2011.

The fundamentals of CAD and design systems are files. We use them to store assemblies, parts and drawings. In addition, we use them as references in many places. Do you think the “file” paradigm will live with CAD and other design systems forever? The movement of CAD vendors seems to me an obvious application of modern web principles to the world of design and engineering. The initial signals are here. CATIA V6 pushed the limits and eliminated files by connecting the CATIA system directly to the Enovia back-end. Autodesk cloud experiments with systems like AutoCAD WS made the existence of files on disk obsolete. PTC introduced Creo Apps. It will be interesting to see if PTC comes around to the idea of eliminating files. I think the computing and information paradigms are shifting from file-oriented to data (and web) oriented. The initial signs are here. The speed of this movement is questionable. Manufacturing is a slow-changing environment and engineers are very reluctant to change.

PDM (Product Data Management) was supposed to be the solution to the CAD file mess. PDM systems came to hunt for CAD and other files. The intent was to bring files into order, manage revisions, share data and… after some time, eliminate files. We can see this starting to happen now in some high-end systems such as CATIA V6. So why did PDM fail to detox engineers from files? Here is the thing… PDM was invented to help engineers manage and control data. It sounds like engineers should like PDM, since it helps them control files. But it didn’t go according to plan. PDM added "friction" to engineers’ freedom to create data the way they want: name control, check-in/check-out, approvals, etc. As a result, PDM failed to become a friend and turned into engineers’ nightmare. Engineers don’t like PDM, and in many situations engineers were forced to use it.
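The friction is easy to see in the check-in/check-out pattern itself. Here is a minimal sketch (hypothetical names, not any specific PDM system's API) of why exclusive file locks interrupt an engineer's flow:

```python
# Minimal sketch of PDM-style exclusive check-out (hypothetical, illustrative only).
class Vault:
    def __init__(self):
        self.locks = {}  # file name -> user currently holding the lock

    def check_out(self, name, user):
        holder = self.locks.get(name)
        if holder and holder != user:
            # This is the "friction": work stops until the lock is released.
            raise RuntimeError(f"{name} is locked by {holder}")
        self.locks[name] = user

    def check_in(self, name, user):
        if self.locks.get(name) == user:
            del self.locks[name]

vault = Vault()
vault.check_out("bracket.dwg", "alice")
try:
    vault.check_out("bracket.dwg", "bob")  # Bob is blocked until Alice checks in
except RuntimeError as e:
    print(e)  # bracket.dwg is locked by alice
```

The lock guarantees consistency, but from the engineer's point of view it is exactly the kind of interruption that made PDM feel like a nightmare rather than a friend.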

The working environment is changing fast. We are getting disconnected from files in our digital life. Our everyday workflows are becoming distributed, mobile, disconnected from desktops and… files. We want access to data, not to files. To make this transition successful, we need to think about how to remove friction. In engineering school, you learn about the importance of friction. But software is different, especially these days: friction slows down software adoption.

What is my conclusion? Engineering and manufacturing is a slow-changing environment. Engineers are conservative and design-minded. That is part of why many PLM tools failed to become favorite engineering data management and collaboration tools. Large teams accepted PDM tools because they had no choice. I believe the future won’t belong to files. We are going to see a more data-driven environment around us. Establishing such an environment is one of the main challenges for PLM companies today. To make it happen, PLM vendors must think about how to remove friction between users and PLM tools. Just my thoughts…

Best, Oleg


How to rethink PLM workflows?

October 20, 2014

plm-mobile-workflow

Workflows and processes. They are an important part of any company. Like blood flowing through your body, workflows run through a company and beyond. A couple of months ago, I posted my ultimate PLM workflow dream. It came as part of my thinking about "un-bundling services". My idea was to publish a list of workflow (process management) features that could be used as an independent service.

Many businesses were created with the vision of improving processes and supporting business workflows. However, email is still one of the key elements of every workflow and business process management implementation. How to move from emails and messages to collaboration – in my view, this is one of the most critical questions in streamlining PLM workflows. Because of its ubiquity, email remains one of the most widely used engines behind company workflows. One of the ideas I discussed earlier was to connect email and workflow – how to turn email into actionable workflows.
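To make "email into actionable workflows" concrete, here is a hedged sketch. The subject-line convention and function names are invented for illustration – the point is simply that a structured subject can be parsed into a workflow task instead of sitting buried in an inbox:

```python
import re

# Sketch: parse a subject line like "APPROVE ECO-1042: bracket redesign"
# into a structured workflow task. The convention is invented for illustration.
SUBJECT_RE = re.compile(r"^(APPROVE|REVIEW)\s+(\S+):\s*(.+)$")

def email_to_task(subject, sender):
    m = SUBJECT_RE.match(subject)
    if not m:
        return None  # ordinary mail stays in the inbox
    action, item, title = m.groups()
    return {"action": action.lower(), "item": item,
            "title": title, "assignee": sender}

task = email_to_task("APPROVE ECO-1042: bracket redesign", "alice@example.com")
print(task)
```

A real system would of course authenticate the sender and route the task through the process engine; the sketch only shows the email-to-task mapping step.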

Over the weekend, I was skimming through the article – 10 Trends Transforming Enterprise IT by TechCrunch. The trend that caught my attention was #6 – Reimagining enterprise workflows. Read the following passage:

Widespread adoption of mobile devices has led to more efficient enterprise workflows. Salesforce.com CEO Marc Benioff recently said he runs his whole business from his phone. This gets easier every day. Whether it is quickly turning around business documents via the e-signature application DocuSign or fine-tuning scheduling and route optimization for people in the field via ServiceMax, mobile applications are reinventing and automating processes across the enterprise, enabling work to get done faster and smarter.

Here is the article referencing Salesforce.com CEO Marc Benioff’s statement from earlier this year.

"I run my business entirely on my phone," CEO Marc Benioff said. "I believe this is the future." As companies store less and less data on site, more will no longer need desktops, he said.

It made me think about the changing roles of email and mobile. One of the biggest early successes of mobile computing was turning business email into a mobile service. I used it with my old BlackBerry and it was very reliable. Here is the thing: mobile devices are even more ubiquitous today than email. Mobile applications can be easier and more intuitive than the list of emails in my inbox. Mobile can be the service that helps rethink PLM workflows, bypassing email and the existing complex business suites.

What is my conclusion? We need to learn how to break the things that prevent us from moving forward. Email is one of them. In the past, we asked to connect every PLM workflow to email out of a desire for a single point of communication. Today, our mobile phone is our single point of communication, and it is more powerful than our desktop computer was 10 years ago. People no longer have to be tied to their desks. Therefore, disrupting PLM workflows by making them completely mobile can be an interesting option. Just my thoughts…

Best, Oleg


PLM: from sync to link

October 17, 2014

plm-data-link-sync

Data has an important place in our life. Shopping lists, calendars, emails, websites, family photos, trip videos, documents, etc. We want our data to be well organized and easy to find. Marketing folks like to use the phrase "data at your fingertips". However, the reality is just the opposite. Data is messy. We store it in multiple places, we forget the names of documents and we can hardly control it.

Everything I said above applies to manufacturing companies too. But there it gets even more complicated. Departments, contractors, suppliers, multiple locations and multiple systems. So, data lives in silos – databases, network drives, multiple enterprise systems. In my article – PLM One Big Silo – I talked about organizational and application silos. The data landscape in every manufacturing company is very complex. Software vendors are trying to crush the silos by introducing large platforms that can help integrate and connect information. It takes time and huge cost to implement such a system in a real-world organization, which makes it almost a dream for many companies.

In my view, openness will play a key role in the ability of systems to integrate and interconnect. It will help provide access to information across the silos, and it leads to one of the key problems of data sharing and identity. Managing data in silos is a complex task. It takes time to organize data, figure out how to interconnect it, set up reporting and maintain data consistency. I covered this in more detail in my PLM implementations: nuts and bolts of data silos article.

Joe Barkai’s article Design Reuse: Reusing vs. Cloning and Owning speaks about the problem of data reuse. In my view, the data reuse problem is real and connected directly to the issue of data silos. I liked the following passage from Joe’s article:

If commonly used and shared parts and subsystems carry separate identities, then the ability to share lifecycle information across products and with suppliers is highly diminished, especially when products are in different phases of their lifecycle. In fact, the value of knowledge sharing can be greater when it’s done out of sync with lifecycle phase. Imagine, for example, the value of knowing the manufacturing ramp up experience of a subsystem and the engineering change orders (ECOs) that have been implemented to correct them before a new design is frozen. In an organization that practices “cloning and owning”, it’s highly likely that this kind of knowledge is common knowledge and is available outside that product line.

An effective design reuse strategy must be built upon a centralized repository of reusable objects. Each object—a part, a design, a best practice—should be associated with its lifecycle experience: quality reports, ECOs, supplier incoming inspections, reliability, warranty claims, and all other representations of organizational knowledge that is conducive and critical to making better design, manufacturing and service related decisions.

Unfortunately, the way most companies and software vendors solve this problem today is plain data sync. Yes, data is synced between multiple systems. Brutally. Without a second thought. In the race to control information, software vendors and implementing companies batch-sync data between multiple databases and applications. Parts, bills of materials, documents, specifications, etc. Data moves back and forth between engineering applications and manufacturing databases. Specifications and design information are synced between OEM-controlled databases and suppliers’ systems. This data synchronization leads to a lot of inefficiency and complexity.

There must be a better way to handle information. To allow efficient data reuse, we need to think more about how to link data together rather than synchronize it between applications and databases. This is not a simple task. An industry that for years took "sync" as the universal way to solve the data integration problem cannot shift overnight and work differently. But here is the good news. Over the last two decades, web companies have accumulated a lot of experience managing huge volumes of interconnected data. The move toward cloud services creates an opportunity to work with data differently. It will provide new technologies for data integration and data management. It can also open new ways to access data across silos. As a system that manages product data, PLM can introduce a new way of linking information and help reuse data between applications.
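The difference between "sync" and "link" can be sketched in a few lines. This is a toy model, not any real integration API: syncing copies the record into every system, so each copy can silently drift; linking keeps one master record and hands out references to it.

```python
# Toy model of "sync" vs "link" for a part record (illustrative only).

# Sync: each system holds its own copy; an update must be re-synced everywhere.
erp = {"PN-100": {"desc": "Bracket", "rev": "A"}}
mes = {"PN-100": dict(erp["PN-100"])}     # synced (copied) into a second system
erp["PN-100"]["rev"] = "B"                # change made in one system...
print(mes["PN-100"]["rev"])               # ...the other copy is now stale: "A"

# Link: one master record; every system stores only a reference to it.
master = {"part/PN-100": {"desc": "Bracket", "rev": "A"}}
erp_link = "part/PN-100"
mes_link = "part/PN-100"
master["part/PN-100"]["rev"] = "B"
print(master[mes_link]["rev"])            # both systems see "B" immediately
```

In a real deployment the "link" would be a URI resolved over the network rather than a dictionary key, but the consistency argument is the same: one source of truth instead of N copies to keep in sync.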

What is my conclusion? There is an opportunity to move from syncing data to linking data. It would simplify data management and help with data reuse. It requires a conceptual rethink of how data integration problems are solved between vendors. By providing a "link to data" instead of actually "syncing data", we can help companies streamline processes and improve product quality. Just my thoughts…

Best, Oleg


Kenesto revamp: does it change cloud PLM game?

October 17, 2014

kenesto-edm

It has been more than two years since I reviewed Kenesto – an outfit founded by Mike Payne with a strong vision to simplify process management. Navigate to the following article, PLM, Kenesto and process experience, to refresh your memory.

Steve Bodnar of Kenesto commented on my blog post about Google Drive and 3rd-party apps, hinting at some Kenesto functionality around file synchronization and cloud data management. It was a good signal that Kenesto was preparing a refresh. The following Kenesto press release caught my attention yesterday – Kenesto Extends Engineering Collaboration with New Vaulting and State-of-the-art Desktop File Synchronization. I found it interesting, since it moves Kenesto from a process management cloud tool into something bigger – data management and vaulting. Back in 2012, I thought that the ability to handle engineering data was a big differentiator between a traditional PLM system and a cloud process management tool like Kenesto. The following passage from the Kenesto press release gives a short description of the shift Kenesto has made – it moved into the data and file management space.

Kenesto today announced the full availability of its latest innovations – file vaulting and a pioneering file synchronization service – to enable mainstream design and engineering firms to more easily and effectively collaborate and manage their data. Kenesto’s latest capabilities also work well in conjunction with such design tools as Creo®, SolidEdge®, SolidWorks®, and Spaceclaim® for manufacturing customers and also Revit® for AEC customers, to enable file management and sharing across design workflows. This is all done while also ensuring proper handling of updates to component and assembly models connected to items and bills-of-material, for example.

I took a trip to the Kenesto website. It presents a broad range of solutions – engineering design management, change management, procurement and supplier collaboration, program and project management. These are the traditional PLM suspects. However, some of the solutions are clearly outside the typical PLM domain – management of marketing programs, PR and advertising, and idea management.

Kenesto’s features cover a wide range of capabilities – projects, dashboards, reporting, document management, vaulting, web viewing, workflow and task management. My attention was especially caught by Enterprise-class File Synchronization. This is an interesting feature and it made me think about cloud PDM functionality and cloud file sharing. My blog post – Cloud PDM ban lifted. What next? – speaks about the growing interest of PLM and other vendors in applying cloud technologies to PDM – a space that traditionally tried to avoid the cloud’s touch. So, Kenesto has just joined the crowd of cloud PDM vendors, and I need to add Kenesto to the list of companies competing in cloud PDM.

kenestoDesktopSync

What is my conclusion? It looks like Kenesto has decided to change its technology trajectory, moving from process and workflow management to a full scope of product data management and lifecycle solutions. I guess Kenesto prefers not to use the traditional PDM and PLM buzzwords. However, the Engineering Data Management (EDM) acronym made me feel a bit nostalgic… At the same time, cloud sync and in-browser office file editing tools can provide interesting differentiation in the cloud era. Just my thoughts…

Best, Oleg

Disclaimer: Kenesto didn’t sponsor or influence the content of this blog post.


Multiple dimensions of BOM complexity

October 15, 2014

complex-bom-old-fashion

The Bill of Materials topic is getting more attention these days. No surprise. The BOM is the center of the universe in the manufacturing world (and not only there). People can disagree about the terminology applied to BOM management. Depending on the specific domain, people may call it a parts list, a specification or a formula. But at the end of the day, everybody is speaking about the same BOM. Actually, not always the same BOM. I guess you’ve heard about the variations of the Bill of Materials – eBOM, mBOM, xBOM, etc. The number of BOM abbreviations is growing and often causes confusion. So, I decided to shed some light on that in my post today.

The importance of BOM management is growing, and so is the tension around who owns the bill of materials. Historically, people in different departments have disagreed about the way they manage bills of materials. As a result, departments split and clone bills of materials to get control, managing them in different systems. This leads to the need to synchronize and copy BOMs together with changes. The tension around BOM management is growing. Last year, I posted some of my thoughts in the post – Will PLM manage enterprise BOM? The main point of that article was the complexity of BOM management and integration between different systems and disciplines.

It looks like the BOM will become the next place where some PLM vendors are going to innovate… and battle. My attention was caught by a provocative ENGINEERING.COM article – The Power of Zero – Dassault’s ENOVIA chief talks about the ”Zero Error BOM”. Read the article and draw your own conclusions. I captured the following passage:

The “war” has generally been about linking product development with shop floor IT and the BOM certainly plays a key role in this. Right now there are four primary participants on the battlefield: Siemens, SAP, GE/PTC and IBM.

The article emphasizes the complexity of a "universal BOM" solution and the potential advantages of winning the BOM battle:

It’s not a simple job to manage a BOM. What might appear as ”a list of parts needed to build a product” is today a complex reality of multiple levels, diversified disciplines and BOMs contains information about structures, electronics, integrated software, manufacturing methodology and the way products are maintained and even disposed of. There are many sources of error and mistakes can be very costly.

If Dassault’s “zero error BOM” can become a reality, it’s a huge step forward and would, according to analyst Marc Halpern of Gartner, ”have the potential to realize the ’dream’ of the universal BOM”. But as Kalambi says: ”This is about to embark on a journey; once on ’the road’ the benefits of 3DEXPERIENCE and V6 will increase productivity dramatically”.

I found myself thinking quite a bit about BOM complexity today and, as a result, came up with the following diagram showing three main dimensions of BOM complexity: disciplines, lifecycle and changes.

multiple-dimensions-of-bom-complexity

1- Multiple disciplines. The complexity of products is growing these days. Even for very simple products, it goes beyond mechanical and electromechanical design. It includes electronics and software, and extends into services and delivery. Engineers use multiple tools to design products in each discipline. Combining everything together is a very challenging task.

2- Lifecycle. Design represents only one phase of product development. A product must be manufactured, shipped, supported and (in the end) refurbished or disposed of. All these processes run in parallel and require a sophisticated interplay of data and activities. How do you connect requirements with design, plan and optimize manufacturing, and run support services? This is only a short list of the tasks that require BOM orchestration.

3- Changes (ECO/ECN…). Nothing is static in this world. People make mistakes. Communication failures happen. Suppliers go out of business. All these events generate changes that must be applied at different stages of product development – design, manufacturing, services.
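The three dimensions above can be sketched as a hypothetical data model (illustrative only – the names are invented, not any vendor's schema). One item carries discipline-specific designs, appears differently in lifecycle views, and accumulates changes over time:

```python
from dataclasses import dataclass, field

# Illustrative sketch of the three BOM dimensions; all names are invented.
@dataclass
class Item:
    number: str
    designs: dict = field(default_factory=dict)   # dimension 1: discipline -> design reference
    views: dict = field(default_factory=dict)     # dimension 2: lifecycle view -> child items
    changes: list = field(default_factory=list)   # dimension 3: applied ECOs/ECNs

radio = Item("PN-200")
radio.designs = {"mechanical": "housing.sldprt",
                 "electronic": "board.sch",
                 "software": "firmware-1.2"}
radio.views = {"eBOM": ["PN-201", "PN-202"],             # as designed
               "mBOM": ["PN-201", "PN-202", "PN-290"]}   # as built: adds process material
radio.changes.append({"eco": "ECO-7", "applies_to": "mBOM"})
print(len(radio.views["mBOM"]))  # 3
```

The complexity the article describes comes from the fact that these dimensions multiply: every change must be evaluated against every discipline and every lifecycle view it touches.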

What is my conclusion? Bill of Materials management is one of the most complex disciplines in product development and manufacturing these days. The time when companies managed BOMs on shop-floor corkboards is gone. Future BOM management systems will have to be much more sophisticated and integrated, and will have to support multiple dimensions of BOM complexity. Just my thoughts…

Best, Oleg


PLM vendors, large manufacturers and public cloud

October 14, 2014

google-data-center

Companies are moving to the cloud these days. The question vendors and customers are asking is how to move to the cloud. I asked this question in a post a few months ago – PLM / PDM: Why the cloud? Wrong question… I identified multiple options for customers to start their move to the cloud – using mainstream cloud productivity tools to share data and collaborate, migrating existing PLM platforms to the cloud using IaaS strategies, and building new types of platforms and tools on new cloud platforms and infrastructure.

Today, I want to look at the public cloud from both sides – a large provider of public cloud infrastructure (Google) and a large manufacturing company (GE) – and see where their strategies intersect.

Google – example of public cloud platform

My attention was caught by a Google presentation – The next generation of Cloud. Navigate your browser to the following link to watch it. Besides the fact that it was inspired by the exact same question – “How do you move to the cloud” – it provided very interesting insight into aspects of the Google public cloud platform.

google-1

google-2

google-3

google-4

Hardware cost is declining and Google is adjusting public cloud pricing to match economic realities. Together with economies of scale and utilization, I can see a trajectory toward public cloud costs decreasing even further in the future.

Large manufacturers move to the cloud

So, what are customers thinking about public cloud? InfoWorld just published an article presenting GE’s strategy to go all-in on public cloud. Presented as an interview with GE COO Chris Drumgoole, the article outlines his aggressive plans to migrate to public cloud services — and how they support GE’s organizational goals. Read the article and draw your own conclusions. Here is my favorite passage:

Drumgoole won’t talk specific numbers, but he claims that “north of 90 percent” of the apps deployed by GE this year have been in a public cloud environment. We’re big fans of the idea that everything ends up in the public cloud utility model eventually. “Eventually” is the big caveat, because some people within GE would argue that should be tomorrow, while others would tell you it’s 15 years from now. It’s a subject of good debate. But either way, the regulatory environment we live in right now prohibits it. In a lot of spaces, when we say technically that we think something should be public, and we’re comfortable with it being public, the regulatory environment and the regulators aren’t quite there yet and we end up having to do some sort of private or hybrid cloud. That’s probably one of the biggest barriers to us moving more public.

Drumgoole speaks about connected devices, big data and analytics as significant drivers for moving data to the cloud. It reminded me of one of my previous posts – IoT data will blow up traditional PLM databases (http://beyondplm.com/2014/09/23/iot-data-will-blow-up-traditional-plm-databases/). The amount of data is huge and it will certainly require a new approach to data management. Here is an example of how much data a jet engine produces these days:

Take one of the jet engines we make, and if it’s fully instrumented. On a typical flight, it’s going to generate about two terabytes of data. Not everybody fully instruments them, but if you instrument it the way people would like in order to get predictive data, you’re talking about 500GB per engine per flight. A flight with a GE engine takes off or lands every three seconds. All of a sudden, the data gets very, very large very, very fast.

PLM vendors and public cloud

As of today, I’m not aware of any PDM/PLM software using Google Cloud as a platform. The majority of cloud PLM software is built on infrastructure provided by colocated hosting services and a variety of Amazon cloud offerings. Dassault Systèmes and Siemens PLM have made a few public statements about supporting a diverse set of cloud options and IaaS infrastructure. It will be interesting to see the future evolution of PLM cloud platforms.

What is my conclusion? The technology and economics of the cloud are changing. My hunch is that this will pull more vendors and companies toward public cloud in the next few years. Software companies will try to balance technological platforms against cost. At the same time, customers will try to balance regulatory requirements against opportunities to make data accessible and scale without limits. Interesting times and a significant opportunity. Just my thoughts…

Best, Oleg

