Part Numbers are hard. How to think about data first?

July 28, 2014

part-numbers-madness

One of the topics that usually raises a lot of debate is Part Numbers. One of my first takes on the complexity of Part Numbers was here – PDM, Part Numbers and the Future of Identification. Ed Lopategui reminded me about that topic in his GrabCAD post – Intelligent Numbering: What’s the Great Part Number Debate? – a few days ago. He speaks about four aspects of handling Part Numbers – creation, readability, uniqueness and interpretation. The conclusion is as complex as the topic itself. Here is the passage that outlines Ed’s conclusion.

Balancing all these diverse factors is difficult, because no solution is optimal for every company. Here are some final tips to help you make prudent decisions: 1/ Understand your PDM/PLM system part number generation capabilities; 2/ Understand the limitations of any other systems that interact with your parts; 3/ Go through every activity that requires interpreting part numbers and understand what system access is available, and how the interfaces work. This will provide a good basis for your interpretation cost; 4/ Understand how easy/difficult it is for a new employee to interpret a part number.

These tips made me think again about Part Numbering, data, and the different data and process management tools involved in handling Part Numbers. Most approaches focus on systems and functionality to handle part identification and classification. What we do is try to align our need to identify and classify parts with what multiple systems can do. The hardest part is to find Part Numbers that make all systems involved in the process (CAD, PDM, PLM, ERP, SCM, etc.) work smoothly. Honestly, it is too complex and too costly.

So, how to manage that complexity? Is there a reasonable way to resolve the complexity of Part Numbering and make everybody happy? Thinking about that, I came to the conclusion that companies should start thinking about data first. From the longevity standpoint, data must have a much higher priority than any data management system. In some industries, companies are obliged to keep data for decades. With that in mind, I want to outline some principles that will help you do so and will allow you to create some standardization around part and data identification.

1- Disconnect Part Numbers and classification from specific applications. PN should not depend on the requirements and capabilities of data and process management systems. Data has a much longer lifespan than applications and systems. By defining PN independently, you will keep data and processes in your company clean and well organized.

2- Generate PN based on classification, business needs and processes. Develop an independent service to make it happen. This service should most probably be independent from existing data management systems and expose identifiers in some sort of URI-based notation (see the sketch after this list).

3- Use the independent service to convert the independent PN into system-specific identification. You can convert for any system you have at your disposal – PDM, PLM, ERP, SCM… What is important is to be able to control the conversion process and adapt it each time a data and/or process management system changes.
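
To make these principles a bit more concrete, here is a minimal sketch of what such an independent numbering service could look like. Everything here is an assumption for illustration only – the class name, the URI scheme and the system bindings are made up and do not describe any particular PDM/PLM/ERP product.

```python
import itertools

class PartNumberService:
    """Hypothetical service: issues application-neutral part identifiers
    and maps them to system-specific IDs (PDM, PLM, ERP, SCM, ...)."""

    def __init__(self, namespace="acme"):
        self.namespace = namespace
        self._seq = itertools.count(1)
        self._registry = {}  # neutral URI -> {system name: local id}

    def create(self, classification):
        """Issue a new neutral identifier, expressed as a URI."""
        uri = f"urn:{self.namespace}:part:{classification}:{next(self._seq):07d}"
        self._registry[uri] = {}
        return uri

    def bind(self, uri, system, local_id):
        """Record how a given system (PDM, ERP, ...) identifies this part."""
        self._registry[uri][system] = local_id

    def resolve(self, uri, system):
        """Convert the neutral identifier into a system-specific one."""
        return self._registry[uri].get(system)


# Usage: the neutral URI outlives any single system. If the ERP is replaced,
# only the binding changes - the part identity stays the same.
svc = PartNumberService()
pn = svc.create("fastener")          # e.g. urn:acme:part:fastener:0000001
svc.bind(pn, "erp", "ERP-889123")
svc.bind(pn, "pdm", "DOC-000451")
print(pn, "->", svc.resolve(pn, "erp"))
```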

What is my conclusion? Product data is one of the most expensive assets in manufacturing companies. It represents your company’s IP and it is the real foundation of every manufacturing business. Think about data first. It will help you develop a strategy that organizes data for a longer lifecycle and minimizes the cost of bringing in new systems and managing changes in existing systems. I think some services should be developed to make part numbering easier for manufacturing companies. As manufacturing goes global, maintaining part numbering systems becomes a huge problem. Just my thoughts…

Best, Oleg


PLM vertical PaaS strategies

July 25, 2014

PaaS-plm-large-manufacturing

SaaS, PaaS, IaaS, DMaaS, … I’m sure marketing folks are having lots of fun with new XaaS acronyms. The number of publications about various service strategies is skyrocketing. The EDACafe article – The Platform-as-a-Service Provides European Aerospace & Defence OEMs and Partners with Greater Collaboration Capabilities – brings the story of “AirDesign”, the European aerospace and defense industry’s collaboration platform by BoostAeroSpace and Dassault Systèmes. The article tastes a bit of marketing. Nevertheless, I found that the following passage captures the rationale behind what AirDesign is supposed to provide:

AirDesign drastically reduces operational costs for all partners through a single infrastructure, common exchange methods, open standards and easy access, all without adversely impacting existing information systems. All the primary European OEMs jointly requested and defined this platform in order to facilitate exchanges, support their suppliers’ ecosystems and generate new opportunities with services.

The MMT article provides more analysis of what Dassault and BoostAeroSpace are doing:

To manage the exchange of PLM data between all partners whatever their systems may be, AirDesign delivers three key capabilities through its digital exchange hub: 1/ The first allows an OEM to organize exchanges in the context of programs and projects, including the delegation of administrative roles; 2/The second is an integrated, secured and automatic technical data package exchange. The traceability needed for complex programs is a native capability, ensuring proper management of large technical files between OEMs and suppliers; 3/The third capability consists of access to a wide variety of services, including converters based on standards or approval services that a partner can use during an exchange.

The set of functionality above isn’t new. Large PLM implementations did it in the past. The problem of standardization and platform cost in supply chain ecosystems is critical. So, it is not unusual for partners to share a single enterprise software infrastructure for the design supply chain and more.

However, the notion of PaaS (Platform as a Service) is interesting. I’ve been blogging about it earlier – Will Cloud PLM develop PaaS option? and Cloud PLM and PaaS dilemma. It looks like PLM vendors are moving towards a more vertical platform architecture. Especially for large companies and business ecosystems, PaaS can provide an interesting combination of standardization and cost benefits. The information about AirDesign’s private cloud deployment confirms earlier news about Dassault developing all cloud PLM options.

What is my conclusion? I think we are going to see lots of PLM PaaS variations in the near future. Large manufacturing companies are looking at how to optimize cost and standardize infrastructure. This is an opportunity for PLM vendors to re-establish and sometimes re-develop their legacy systems in a new way. I’m sure lots of ENOVIA applications will be used in the announced AirDesign PaaS. Overall, it looks like PaaS is another way to sell a cloud PLM system to a large manufacturing ecosystem. Just my thoughts…

Best, Oleg

picture credit to MMT article


Who will make PLM sexier?

July 24, 2014

sexier-plm

The cool factor is trending in software these days. The time when software was ugly is probably in the past. Everyone wants to have a "cool app" – in the trend chart below you can clearly see it. Does it apply to enterprise software and PLM? It is a good question. Back in 2012, I asked it in my post – PLM: Ugly vs. Cool. While nobody has specifically focused on how to develop cool PLM software, I can see increased interest in improved user experience from PLM vendors.

cool-sexy-app-trend

The UX Magazine article Is there Room for Sexy in Enterprise Design? caught my attention a few days ago. I found the discussion about the emotional factor interesting and important. I especially liked the following passage:

The question enterprise technology companies need to ask themselves is “what does sexy mean to your enterprise customer?” Put another way, how do your customers want to feel when using your products? Every product, whether we realize it or not, produces an emotional reaction. As Donald Norman articulated in his seminal book Emotional Design, customers find aesthetically pleasing products more effective. Customers even “love” these products. Norman identified the commercial value in evoking some passion towards products, such as Gucci bags and Rolex watches. MailChimp’s Director of User Experience, Aarron Walter, took this one step further with his book, Designing for Emotion. He posits that the goal of emotional design is to connect with users and evoke positive emotions, which will make your users want to continue interacting with your product.

The article speaks about EchoUser’s research on emotions with enterprise customers. The following emotions make sense to the enterprise crowd – powerful, trust, flexible, calm, pride, accomplished. Cool and sexy are not on the list. So, is there a place for "cool and sexy" in PLM? For a long time PLM was associated with "complex" and "expensive". At the same time, most PLM commercial videos are cool and sexy: sports cars, luxury airplanes, fashion shows, mobile devices. You can rarely see a PLM video without that type of product example.

I think many PLM professionals these days are still trying to keep the association of PLM with complexity. My hunch is that they are trying to justify expenses. Customers might think a complex solution requires more budget and a longer consulting and services project. However, the other side of complexity is a feeling that reliability and trust are absent. This is not a simple decision for PLM consultants and software vendors.

What is my conclusion? People don’t like cumbersome software these days. There is no place for complex user experience, even in enterprise software. What emotions should drive CAD and PLM software? How should engineers feel about software? I’d like to connect the results of the engineering and manufacturing process with PLM tools. You cannot make good products with the wrong tools. So, something should happen with PLM software. Complex PLM software is the wrong tool for building future cool products. Just my thoughts…

Best, Oleg

photo credit MidoriShoes


How to re-think activity streams for enterprise?

July 23, 2014

controlled-collaboration

These days manufacturing businesses are more connected than ever before. Every manufacturing company (even the smallest startup) has a tremendous need for collaboration – helping multiple engineers get involved in the design process, communicating with suppliers, planning manufacturing processes, etc. Social networks and the open web inspired many companies to develop collaboration software that mimics consumer social software. One of the main attributes of every social application (Facebook, G+, Twitter and others) is the so-called "activity stream" or "news feed". The trend was strong and produced lots of copycats. The successful and lucky ones got acquired. Many of the less successful died.

The idea of an activity stream is very powerful. It allows you to easily share and consume information. However, here is the thing – it is not protected from the "noise vs. signal" problem. The more people you follow, the more information flows into your activity stream(s). You end up with a messy stream of information you cannot keep up with. It is probably okay for public news, or even for executives in a company who want to keep up with what is going on. However, it is probably not a good experience for engineers who need to work together on the same design or discuss the next engineering or manufacturing change request. It is also probably not very useful as a tool for communicating between departments and suppliers. And… it is absolutely the wrong model to use for process management.

All the problems I mentioned above actually make the adoption of social systems for collaboration questionable. I can see many confirmations of that. The CMSWire article The Problem With Yammer? People Don’t Use It speaks exactly about this problem. Here is the key passage:

But what if the problem is not about difficulty or learning curves but about culture? What if the problem with Yammer has nothing to do with the product itself and nothing to with usability, but rather with the fact that enterprise workers are holding onto email for dear life and are not prepared to give it up? Microsoft itself appears to be aware of this. The addition of complimentary Yammer for the new Office 365 plans appears to speak to that. However, if Microsoft’s updated offerings are a step in the right direction, they won’t solve the problem of social and collaboration in the enterprise.

Another interesting example is Facebook. The king of social networks recently introduced a simple and very effective feature to get out of the noise of your information stream – Save. It may quickly remind you of the old and well-known list of favorites. Navigate to the TNW article – Facebook introduces Save, a new bookmarking feature to help tame your News Feed. It sounds like a simple feature, but it allows you to keep specific posts out of a noisy channel and focus on them later in a more controlled way.

These and many other examples made me think about what is needed to provide a better way to collaborate. My hunch is that a "controlled list of topics" can serve the need of engineers and other people to work together better than an open stream. How to make it? That is probably a trickier question. I can see it as the next logical step from email, which is still one of the most favored communication tools. It also reminded me of my post Why PLM shouldn’t miss next email move earlier this week.
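
To make the idea of a "controlled list of topics" a bit more tangible, here is a minimal sketch, assuming a topic with an explicit member list instead of an open follow-anyone stream. The class and method names are made up for this illustration and do not describe any existing product.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Post:
    author: str
    text: str
    created: datetime = field(default_factory=datetime.utcnow)

@dataclass
class Topic:
    """A controlled discussion scope, e.g. one ECO or one design review."""
    name: str
    members: set = field(default_factory=set)
    posts: list = field(default_factory=list)

    def post(self, author, text):
        # Only explicit members can write - no open firehose of followers.
        if author not in self.members:
            raise PermissionError(f"{author} is not a member of '{self.name}'")
        self.posts.append(Post(author, text))

    def feed(self, member):
        # Each member sees only this topic's posts, not a global stream.
        if member not in self.members:
            raise PermissionError(f"{member} is not a member of '{self.name}'")
        return list(self.posts)


# Usage: a stream scoped to a single change request instead of a global feed.
eco = Topic("ECO-1042 gearbox noise", members={"alice", "bob"})
eco.post("alice", "Uploaded revision B of the housing.")
print([p.text for p in eco.feed("bob")])
```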

What is my conclusion? An activity stream is a good way to present a flow of information. However, the experience it creates is way too open and too easily affected by information noise. I believe engineering tools should provide a tighter way to communicate, exchange information and share data for collaboration purposes. This is the main reason people are holding onto email as the best tool. New ways to collaborate are not here… yet. Just my thoughts…

Best, Oleg


PLM implementations: nuts and bolts of data silos

July 22, 2014

data-silos-architecture

Data is an essential part of every PLM implementation. It all starts with data – design, engineering, manufacturing, supply chain, support, etc. Enterprise systems are fragmented and represent individual silos of the enterprise organization. Managing product data located in multiple enterprise data silos is a challenge for every PLM implementation.

To "demolish enterprise data silos" is a popular topic in PLM strategies and deployments. The idea of having a single point of truth is always on the minds of PLM developers. Some of my latest notes about that are here – PLM One Big Silo.

The MCADCafe article – Developing Better Products is a “Piece of Cake” by Scott Reedy – also speaks about how a PLM implementation can help aggregate product development information scattered across multiple places into a single PLM system. The picture from the article presents the problem:

product-data-silos

The following passage is the most important, in my view:

Without a PLM system, companies often end up with disconnected silos of information. These silos inhibit the ability to control the entire product record and employees waste unnecessary time searching for the correct revision of the product design. As companies outsource design or manufacturing, it becomes even harder to ensure the right configuration of the product is leveraged by external partners.

Whether your company makes medical devices, industrial equipment, laptops, cell phones or other consumer products – PLM provides a secure, centralized database to manage the entire product record into a “Single Record of the Truth”… With a centralized product record, it is easy to propose and submit changes to the product design, track quality issues and collaborate with your internal teams and supply-chain partners.

The strategy of a "single record of truth" is a centerpiece of every PLM implementation. However, here is the thing… if you look at the picture above you can certainly see some key enterprise systems – ERP, CRM, MES, project and program management, etc. A PLM system can contain data about product design, CAD files, part data, ECO records and Bills of Materials. However, some of the data will still remain in other systems. Some of the data gets duplicated. This is what happens in the real world.

It made me think about 3 important data architecture aspects of every PLM implementation: data management, data reporting and data consistency.

The data management layer focuses on which system controls the data and provides the master source of information. Data cannot be mastered in multiple places. An implementation needs to organize a logical split of information as well as the ability to control the "data truth". This is the most fundamental part of the data architecture.

Data reporting focuses on how PLM can extract data from multiple sources and present it to the end user in a seamless way. Imagine you need to provide an "open ECO" report. The information can reside in PLM, ERP and maybe some other sources. Getting the right data at the right moment can be another problem to resolve.
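
For illustration only, here is a minimal sketch of that reporting aspect, assuming each system can return its ECO records as plain dictionaries. The record fields, the stub data sources and the merge rule are assumptions for this example, not the behavior of any real PLM or ERP product.

```python
def open_eco_report(sources):
    """Merge ECO records from several systems and keep only the open ones.

    `sources` maps a system name to a callable returning a list of dicts
    with at least 'id' and 'status' fields (hypothetical schema).
    """
    report = {}
    for system, fetch in sources.items():
        for rec in fetch():
            if rec.get("status", "").lower() != "closed":
                # Last writer wins here; a real report needs explicit rules
                # about which system masters which field.
                report.setdefault(rec["id"], {}).update(rec, source=system)
    return list(report.values())


# Usage with stubbed data sources standing in for PLM and ERP queries.
plm = lambda: [{"id": "ECO-1042", "status": "In Review"}]
erp = lambda: [{"id": "ECO-1042", "status": "Open"}, {"id": "ECO-1050", "status": "Closed"}]
print(open_eco_report({"plm": plm, "erp": erp}))
```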

Last, but not least – data consistency. When data is located in multiple places, the system will rely on so-called "eventual consistency" of information. A system of events and related transactions keeps the data in sync. This is not a trivial process, but many systems operate this way. What is important is to have a coordinated data flow between the systems supporting eventual consistency and the data management and reporting tools.
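
A minimal sketch of the eventual consistency idea is below: the mastering system publishes change events and a sync worker applies them to the other system’s copy of the data. The event format and the in-memory "ERP" replica are assumptions made for this example; a real implementation would sit on a message bus with ordering, retries and transactional guarantees.

```python
import queue

events = queue.Queue()   # stands in for a message bus
erp_parts = {}           # stands in for the ERP copy of part data

def publish(event):
    """Called by the mastering system (e.g. PLM) whenever a part changes."""
    events.put(event)

def sync_worker():
    """Applies pending events to the ERP replica; data becomes consistent eventually."""
    while not events.empty():
        evt = events.get()
        if evt["type"] == "part.updated":
            erp_parts.setdefault(evt["part"], {}).update(evt["fields"])
        events.task_done()


# Usage: PLM masters the part description; ERP catches up when the worker runs.
publish({"type": "part.updated", "part": "PN-100", "fields": {"description": "Bracket, rev B"}})
sync_worker()
print(erp_parts["PN-100"])
```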

What is my conclusion? To demolish silos and manage a single point of truth is a very good and important strategic message. However, when it comes to the nuts and bolts of implementation, an appropriate data architecture must be in place to ensure you have the right data at the right time. Many PLM implementations underestimate the complexity of the data architecture. It leaves them with marketing slogans, burned budgets and wrong data. Just my thoughts…

Best, Oleg

picture credit MCADCafe article.


Why PLM shouldn’t miss next email move?

July 18, 2014

plm-email

Email is the king of communication in every company. Many companies are literally run by email. People use it for different purposes – notification, collaboration and very often even record management. You can hear many discussions about how companies can replace email or integrate it with enterprise and social collaboration tools. I captured some of them in my previous blogging: How engineers find path from emails and messages to collaboration?; PLM Workflows and Google Actionable Emails; DIY PLM and Zero Email Policy; PLM Messaging and WhatsApp Moment.

You may think email doesn’t change. I want to share two interesting examples of change and innovation in email that caught my attention over the last few weeks. The Verge article speaks about the Gmail API announcement.

Google announced that any app could now talk to Gmail using today’s faster, more modern languages — languages that every web developer speaks. The Gmail API lets you ask Google for threads, messages, drafts, and labels three to ten times faster than with IMAP. What it can do is provide an interface for any app to interact on a small scale with your Gmail account without having to create an entire mail client. When that happens, Google won’t have replaced email — it will have actually extended it. Instead of killing email as some hoped it would, the Gmail API gives email new life.

The following video presents some additional details about Gmail API usage. Take 5 minutes to watch it, especially the places where the video speaks about integration between Gmail and enterprise systems.

Another example comes from TNW article – Inbox launches as an open-source email platform to replace legacy protocols.

A new startup, Inbox, is launching its “next-generation email platform” as an alternative to aging protocols like IMAP and SMTP. The core of Inbox’s efforts is an Inbox Sync Engine for developers that adds a modern API on top of mail providers, including Gmail, Yahoo and Outlook.com.

As stated in the article, Inbox is a platform play. The intent of the founders is to create a new generation of messaging platform. And it is an open source play. The first step for Inbox is to create a sync engine that can expose existing email providers:

The core of Inbox is an open source sync engine that integrates with existing email services like Gmail, and exposes a beautiful, modern REST API. We’re pleased to announce that beginning today, you can download the Inbox engine, sync an account, and begin building on top of Inbox in your local development environment.

These articles made me think about a potential play PLM and engineering applications can make by building their collaboration applications tightly integrated with email services. It would allow better communication for people and easier data integration between PLM solutions and communication platforms such as email. You may see it as a purely technical play. Who cares how to integrate email and data? However, in my view, this is a place where differentiation in user experience and seamless data integration can become critical to driving user adoption.
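
To show the kind of integration I have in mind, here is a small sketch that pulls engineering-related threads out of a mailbox using the Gmail API (via the google-api-python-client library). The OAuth token file, the search query and what a PLM app would do with the result are my assumptions for this illustration, not a description of any vendor’s product.

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/gmail.readonly"]

def fetch_eco_messages(token_file="token.json", query='subject:"ECO"'):
    """Return (sender, subject) pairs for messages matching the query.

    Assumes an OAuth token has already been obtained and stored in
    token_file; the query string is just an illustrative filter.
    """
    creds = Credentials.from_authorized_user_file(token_file, SCOPES)
    gmail = build("gmail", "v1", credentials=creds)

    resp = gmail.users().messages().list(userId="me", q=query, maxResults=10).execute()
    results = []
    for ref in resp.get("messages", []):
        msg = gmail.users().messages().get(
            userId="me", id=ref["id"],
            format="metadata", metadataHeaders=["From", "Subject"]).execute()
        headers = {h["name"]: h["value"] for h in msg["payload"]["headers"]}
        results.append((headers.get("From"), headers.get("Subject")))
    return results

# A PLM-side collaboration app could attach these messages to the related
# change request instead of leaving them buried in personal inboxes.
```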

What is my conclusion? It is very hard to change people’s habits. Email is part of our everyday routine. Existing systems are integrated with email, but the way it is done, as well as the level of data integration, is very sporadic. Lots of unstructured data about customers, engineering decisions, requirements and many other things gets stuck in email and is lost there forever. A new email approach may help provide transparent and seamless integration between business applications and email. It can make a difference for users. Just my thoughts…

Best, Oleg


Will IBM and Apple open doors for mobile PLM future?

July 17, 2014

ibm-plm-apple-mobile

Enterprise software and Apple haven’t been much of a success story until now. Don’t get me wrong – you can see enterprise execs and even IT folks using iPhones and other Apple devices. In my view, they do it mostly for mobile email and other cool apps. However, until now, the traction of iOS in the enterprise was limited. I’ve been speculating about the future of the iPad for enterprise PLM in my previous writing: PLM Downstream – Sent from my iPad?; iPad and Enterprise PLM; 3D/PLM and iPad: Future or Baloney? At the same time, I haven’t seen many Apple devices in manufacturing companies, especially on the shop floor and in maintenance and service departments. In many situations, IT remained a strong gatekeeper.

Some good news for iOS mobile PLM developers came just yesterday. Apple and IBM announced a global partnership to transform enterprise mobility. Navigate here to read the IBM press news. The number of articles and reviews is skyrocketing. I picked a few of them. PC World article – Why the Apple-IBM deal matters. My favorite passage speaks about “unique cloud services” for iOS.

Apple and IBM announced an “exclusive” deal on Tuesday in which IBM will build a new line of enterprise-specific apps from the ground up for Apple’s iOS, aimed at companies in retail, health care, transportation and other industries. IBM will create “unique cloud services” for iOS, including tools for security, analytics and device management. It will also resell iPhones and iPads to its corporate customers, and Apple will roll out new support services for businesses. In other words, Apple and IBM are putting a full-court press on the mobile business market. And they’re doing so in a tightly wedded fashion: The companies used the word “exclusive” four times in a statement announcing the deal.

Another article, from Forbes – Apple – IBM Partnership: Enough To Solve Enterprise iOS Fears? – caught my attention, speaking about Apple relying on enterprise partners to do the heavy lifting needed to sell mobile solutions to the enterprise.

As enterprises increasingly look to make more use of business applications on mobile devices – for a competitive advantage in flexibility and productivity – manufacturers such as Apple will rely on enterprise partners, he notes, “to do the heavy lifting that will increasingly be required in areas such as mobile application development, lifecycle management and systems integration”. Apple is likely to seek other partners, similar to IBM, that can also provide enterprise capabilities and support.

Let’s go back to PLM vendors and mobile development. Until now, I had mixed feelings about the PLM mobile story. All PLM vendors did something for mobile and iOS. But, in my view, it was some sort of checkmark – "yes, we have it". One of the missing points was the absence of specific apps that solve productivity problems. Most mobile PLM apps did the same job as non-mobile software, but on the iPad. In addition to that, the 3D viewer app was very popular. Most of these applications came as an overlap with existing software. At the same time, the key advantage of mobile is to provide productivity apps for situations when users are away from their desks – on the road, in workshops, manufacturing and service facilities. Some of my thoughts about that are here – Mobile PLM gold rush: did vendors miss the point?

What is my conclusion? The Apple and IBM agreement could be a big deal. IBM has a very good track record in enterprise PLM deployments. Even though the manufacturing industry was not specifically mentioned in the press release, I’m sure it will influence the decisions of many IT managers. So, it sounds like an opportunity. iOS developers can start looking for jobs in PLM companies. It is also a good opportunity for startups. Just my thoughts…

Best, Oleg


Why Siemens PLM can develop PaaS option

July 16, 2014

cloud-paas

PaaS is a category of cloud computing services providing a platform and solution stack. This service model includes not only computing infrastructure (IaaS), but also application design, development, testing, team collaboration, integration features, database integration, scalability, security and more. In addition to that, it might provide service management capabilities such as monitoring, workflow management, etc.

As the cloud market matures, IT, customers and application developers are looking for complete solutions. All PLM vendors are in full swing on the IaaS cloud PLM option. The cloud PLM experience brings the need to dig deeper into the nuts and bolts of the cloud business. It includes understanding cost, scale, service maintenance, application development and partnerships. So, the fundamental question many vendors and customers have is: what is the best way to make cloud PLM efficient? Understanding the PaaS option is an important step. I shared some of my thoughts about PaaS and PLM before – Cloud PLM and PaaS dilemma, Will cloud PLM develop PaaS options?

My blogging buddy and well-known PLM analyst Chad Jackson tweetstormed what he learned at Siemens PLM about future Teamcenter platform development. While I’m still waiting for a full blog post on Chad’s Lifecycle Insights, the following filtered tweetstorm can give some idea about the Siemens PLM platform strategy:

chad-jackson-siemens-plm-tweetstorm

It resonated with my previous thoughts about cloud PLM and PaaS and made me think about why Siemens PLM, as well as any other PLM vendor, might consider PaaS the right option for a cloud PLM strategy. Here are my 3 reasons to develop PLM PaaS:

1- Agile development.

Vendors should be able to move fast in developing applications, customizing existing features and supporting new opportunities. Businesses are much more dynamic these days. Everyone wants to be agile. PLM vendors too. So, to keep up with the business, PLM vendors need a stable platform to build on. PLM PaaS can be one.

2- Better upgrade strategies

Let’s take the marketing gloves off. Regardless of deployment options (on-premise, private cloud, public cloud), you need to deal with upgrades. Databases, services, data model changes – this is only a very short list. PaaS can hide upgrades from customers and application developers by providing a stable platform layer. This layer requires less frequent upgrades.

3- Scaling factor and cost.

Cost is important. The cloud is not as cheap as many of us thought in the beginning. Customers are demanding new business models and optimized cost. Development and customization cost is another problem. Scaling while keeping cost low is also a huge challenge. Utilization of enterprise servers is still relatively low. PaaS can answer the question of how to share resources and scale at low cost.

What is my conclusion? Most PLM vendors took the IaaS option as a starting point to develop a cloud business. That is okay and will provide important experience from different perspectives – technology, business, user interface. However, IaaS won’t remove fundamental enterprise PLM issues – implementation complexity, upgrade challenges, and a high diversity of requirements and business changes. The PaaS option can become the next logical step in optimizing the platform and applications for agile delivery. It looks like Siemens is making steps in this direction. Just my thoughts…

Best, Oleg


PLM and Manufacturing Startups: Potential Mismatch?

July 14, 2014

hardware-mfg-startup

Selling PLM to SMEs has always been a very controversial topic among PLM vendors. There is no consensus here. I wrote about it a few months ago in my Why PLM stuck to provide solution for SME post and got interesting follow-up conversations with a few industry pundits.

Every PLM vendor has some special product offering ready for the SME market segment. But did it work well for anybody? My hunch is that most "successful SME PLM" implementations focus on basic CAD/PDM features. Very few SME organizations have successfully implemented a complete PLM system including BOM, change management, configurations, manufacturing integration, requirements management and more. If you get a chance to see one, it is typically the result of a huge effort by people in the organization itself committed to making it work.

One of the most typical obstacles for PLM vendors selling to SMEs was the high cost of implementation and sales, multiplied by the absence of IT people ready to handle a PLM implementation. In my view, PLM vendors have great hopes of making it easier with modern cloud-based PLM offerings, but the jury is still out on the results.

Meanwhile, the manufacturing landscape is getting even more interesting. Hardware is the new software. Nest, GoPro, Beats, Jawbone, Oculus… Welcome to the world of manufacturing startups. I touched on it in my earlier post – Why Kickstarter projects need PLM? Yesterday, my attention was caught by the TechCrunch article – Hardware Case Study: Why Lockitron Has Taken So Long To Ship. Read the article – I found it very interesting. The following passage basically explains that, starting from "limited assembly", manufacturing startups are moving towards the full manufacturing cycle:

In our initial RFQs (“request for quote”) we leaned heavily towards manufacturing entirely in the United States. Our impetus for this was largely around logistics; if we could make everything domestically, we wouldn’t have to travel far and wide to ensure the quality we expected. It quickly became apparent that manufacturing domestically would cost far beyond what we had budgeted for. Given the number of parts, required touch time (the amount of time it takes someone to assemble a product), various materials and processes used, building entirely in the U.S. wasn’t viable. Potential domestic suppliers still looked East for most of the components we needed, albeit with longer lead times.

However, an even more interesting quote is the following one, explaining the level of challenges during the development process.

We spent the next few months redesigning our gearbox to reduce noise while increasing power to deal with sticky or hard-to-close locks. While the choice was the right one to make, it cost us valuable time; a few parts had to be retooled and there were cascading effects on our electronics and supplier choices. We selected an ultra-efficient, powerful motor to place at the lock’s heart, but this also impacted our timeline. Most challenging, however, was the meshing of electronic and mechanical worlds. An initial circuit board design proved overly complex and underpowered.

As you may have noted, the complexity of a product that includes mechanical and electronic parts is very high. In addition to that, even though it wasn’t stated explicitly in the article, I can see a growing complexity of integration between electromechanical and software components.

What is my conclusion? The complexity of manufacturing startups is growing. Scaling product development and manufacturing is a very challenging job. And it all must be done on the craziest timeline – the reality of every startup. Manufacturing startups are an interesting niche that is clearly different from the typical SME organizations we’ve seen before. The challenge for PLM with a typical manufacturing SME is to compete with the status quo of existing processes and tools. Manufacturing startups are different – an absence of processes, a startup culture and an absolute need to get the job done in a very short timeframe. It would be interesting to watch the growing demand for PLM tools, as well as the growing complexity of product development and supply chains in these organizations. Which PLM tools will provide an answer? A good question for PLM strategists these days. Just my thoughts…

Best, Oleg


Dassault is going to support all PLM cloud options by 2015+

July 10, 2014

cloud-options

For the last few years, I have been following the cloud strategies of the main PLM vendors – Aras, Arena, Autodesk, Dassault, Siemens PLM, PTC and a few others. You can find some of my early notes here – PLM vendors, IT and cloud strategies. The variety of cloud options has made the statement "cloud PLM" practically useless. I had a chance to cover all four cloud PLM options here during the Siemens PLM analyst event. One of the most challenging decisions for PLM vendors remains the "duality" of PLM cloud options – public vs. private. After a few years of slow ramp-up, all PLM vendors today are placing "cloud options" on their roadmaps. So, the question "How to implement cloud PLM?" is the one you need to focus on when thinking about which PLM option is right for you.

I’ve been following Dassault’s #3DXforum for the last few days via Twitter. The following slide caught my attention, since it clearly presents the spectrum of PLM deployment options Dassault is going to support – public cloud, private cloud, on-premise cloud and on-premise. It also gives you some idea about the timeline: on-premise, public and private cloud by 2014, and on-premise cloud for 2015+.

ds-dfl-all-clouds

It looks like Dassault doesn’t want to miss the cloud movement and is making public and private cloud a priority. It would be interesting to see more about architecture specifics, data centers, and supported IaaS and PaaS options. The only information I can get from the Develop3D tweet is 6 global locations, which sounds like a very impressive achievement. It is not clear what is behind the on-premise cloud option. I can guess at some combination of data storage locations or a mix of applications deployed from multiple clouds. This is just a guess – not much you can see online.

What is my conclusion? The cloud is here to stay. However, cloud architecture and deployment options will evolve and morph actively over the next few years. It is hard to run on all options. Therefore, focusing on the right match between customer interests and operational maturity looks like the direction PLM companies are trying to follow. Just my thoughts…

Best, Oleg

Disclaimer: Dassault didn’t sponsor and didn’t influence the content of this post.

