PLM Sales Cheat Sheet

January 20, 2014


I have to admit – I don’t have a formal sales education. My childhood was mostly shaped by math and technology. Technology carries an air of precision and knowledge, while sales appeared to me manipulative. I could try to blame the Soviet regime, but it doesn’t matter now. I was wrong.

The understanding of how wrong I was came later in life. I learned to sell my software projects, applications and services. I still cannot say I’m good at sales. I must thank a few people in my career who helped me a lot to understand the nature of selling. I’m still learning.

None of the recommendations below came from formal books. Actually, I never had a chance to read formal sales books; probably I should. The list comes from my experience and involvement in PLM projects in a variety of roles – technical, implementation, advisory, strategy, competitive analysis and more. Following these rules helped me achieve my goals.

1- Make friends with enterprise executives and across other divisions in the company you are selling to. Don’t be an engineering/tech buddy only. A champion in the engineering systems department is important, but you need the whole picture of the company’s product problems and profit.

2- Learn to say "no" to engineers and R&D managers. Very often, PLM has engineering roots in the organization, and you can easily get spoiled by engineering ideas. With all respect to engineers, these ideas are not always at the top of the CIO’s priority list and can be far from business goals.

3- Prepare to visit your prospective customer at least seven times. Sometimes you will feel you are repeating yourself, but in PLM implementations this is often part of how customers learn and convince themselves about their decision. Also, be ready to answer the question "what is PLM and why is it needed if the company already spent a million dollars to implement ERP?" You need to have a good answer…

4- Become a source of knowledge about other PLM implementations – best practices, failures, successes, etc. Make yourself a bit "techie" – it will help you build credibility. Very often your prospective customer doesn’t know what to do and would love to learn from what you did for other companies.

5- Learn how to shut down implementations when things are going wrong. This one is tough. Sh*t happens. You need to learn how to fire a customer, even after you made the sale. Not everything depends on you. PLM touches different departments and often requires a company to change the way it does business. People try to manipulate, and the result is political influence, conflicts between management groups, implementation strategies, competitors, etc. If you see that you cannot show visible success in 3-4 months, push the stop button and ask to "rethink what needs to be done".

6- Learn a few typical PLM implementation failure modes – team disagreement about the product development process and data ownership, CAD integration failure, PLM/ERP integration failure. In my view, these three are responsible for more than 50% of PLM implementation failures. You need to learn how to smell them coming and go to rule #5 with a request to rebuild the process.

7- Deliver one feature from the engineers’ dream list. Pick one that will make engineers proud of what PLM does – something that is not on the top priority list. You will become an "engineering hero".

8- Your biggest win will come three years later from a customer you failed to sell to. Enterprise sales is a lengthy process. Companies need time to understand how they work and how they need to manage change. Management changes. Corporate conflicts get resolved. Remember rule #4 and be consistent in the PLM vision you sell. Prospects will come back.

What is my conclusion? Enterprise sales is a special discipline, and these days it is probably one of the most interesting parts of the overall process of technological disruption. Major disruption has come from the internet, cloud, mobile and other technologies, and the wave seems to be heading toward the enterprise continent. However, the enterprise is first about people and second about technologies. Technologies make sense only once people can understand and use them. Just my thoughts, and good luck!

Best, Oleg


Why is PLM selection about the data access problem first?

October 29, 2013


How do you select a PLM system? Manufacturing companies, industry pundits and vendors are all trying to simplify this process. Nevertheless, after almost three decades, the problem is still on the table. PLM sales is value-based and unfortunately requires juggling too many people and events at once. I see the process as a combination of technological choices, company practices and vendor selection.

I recently came across a few notable PLM blog articles focusing on different aspects of PLM selection. My long-time industry colleague and blog buddy Jos Voskuil published an article, PLM Selection – Additional thoughts. Take the time to read Jos’ article – it contains many good recommendations and things to check when you are trying to select a PLM system, such as organizational requirements, implementation specifics, cost and even vendor FUD. The last one was interesting, and I especially liked this passage:

My recommendation to a company that gets involved in FUD during a PLM selection process, they should be worried about the company spreading the FUD. Apparently they have no stronger arguments to explain to you why they are the perfect solution; instead they tell you indirectly we are the less worst.

Another article I came across is a publication on the CL2M portal – an interesting short write-up by Scott Cleveland, Why PLM? Scott also mentions multiple reasons to get involved with PLM. One of them, the effort of "looking for information", caught my special attention. It comes up as the first problem one of his clients was trying to solve by implementing PLM. Here is how Scott explains it:

Looking for Information: He told me about the time his engineers spend looking for ‘stuff’ [like drawings and files]. I said this is a problem everywhere. I told him there have been many studies performed analyzing the time it takes engineers to find ‘stuff’, and all of them say that, without document management software, engineers can spend as much as 25% of their time looking for ‘stuff’.

He said he couldn’t put a figure on it, but believes that could be true at his company. He also mentioned that at some point the engineer will stop looking and just recreate the missing information. He said this is a killer. First it wastes project time and second, it leads to duplicate part numbers and possible errors caused by the duplicate drawings.

Altogether, it made me think about how to put information access at the center of the PLM selection process. Access to information from multiple devices, from every organizational location and at any time is becoming an absolute must – a requirement any vendor should answer before getting into further PLM evaluation. Otherwise, you will be sinking into people’s inefficiency every day.

What is my conclusion? We live in an era when access to information is becoming mission-critical. Without it, your design can lead to much more expensive options, you can potentially select the wrong supplier, and you can miss delivery dates. However, the most important aspect is time. Engineers are spending lots of time looking for "stuff". This is a problem nobody can tolerate any more. Just my thoughts…

Best, Oleg


Is SAP HANA the future of PLM databases?

February 7, 2013

I’m on the road in Europe this week. Europe met me with snow in Zurich and a not very reliable internet connection later in Germany. On the plane, I was reading about SAP’s future investments in HANA, the new in-memory database that is supposed to revolutionize the enterprise software industry. Navigate to the following link and have a read – SAP’s HANA Deployment Leapfrogs Oracle, IBM And Microsoft. I found the following passage very interesting:

What SAP has done is to provide one database that can perform both business analysis and transactions, something its rivals are able to provide only by using two databases, according to Gartner analyst Donald Feinberg. “It’s the only in-memory DBMS (database management system) that can do data warehousing and transactions in the same database. That’s where it’s unique.”
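The claim in that quote – one engine serving both transactional writes and analytical queries, with no separate warehouse – can be illustrated with a toy sketch. Here SQLite is used purely as a stand-in database, and the table and data are invented for the example; HANA’s actual in-memory column-store mechanics are not modeled:

```python
import sqlite3

# A toy stand-in for the idea behind HANA: the same table serves
# transactional writes and analytical queries, with no separate warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    product  TEXT,
    qty      INTEGER,
    ordered  DATE)""")

# Transactional side: record a new order as it happens.
conn.execute("INSERT INTO orders (product, qty, ordered) VALUES (?, ?, ?)",
             ("bracket-100", 250, "2013-02-07"))
conn.commit()

# Analytical side: aggregate over the very same data, immediately.
for product, total in conn.execute(
        "SELECT product, SUM(qty) FROM orders GROUP BY product"):
    print(product, total)
```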

Databases are a fascinating topic. At the end of the day, the enterprise software industry (and not only it) relies on databases for most of its applications. The days when PDM apps ran on proprietary databases and filesystems are gone completely. The last one I knew of was PDM Workgroup – in my view, SolidWorks still has a bunch of customers using this solution, but nobody takes a database-less solution seriously these days. Most PDM and PLM applications run on MS SQL Server and Oracle databases. Despite IBM’s power in PLM, I haven’t seen any significant usage of DB2 for PDM/PLM. Another interesting quote I found about HANA relates to competition – according to the author, it will take a few more years until Microsoft and Oracle are able to catch up.

SAP has taken a big step ahead of rivals IBM, Microsoft and Oracle with the announcement on Thursday that its in-memory database called HANA is now ready to power the German software maker’s business applications. The pronouncement is sure to darken the mood of competitors, who one analyst says will need several years to match what SAP has accomplished.

I’ve written about HANA and its applications before on my blog – take a look here. You can also find lots of interesting resources online here. Applications of the HANA database are interesting, and when it comes to analyzing massive amounts of data, they make a lot of sense in the context of product development and manufacturing.

For SAP customers, HANA-powered applications can speed up the sales process dramatically. For example, today when salespeople for a large manufacturer take a large order from a customer, they may not be able to say on the spot exactly when the order will be fulfilled. That information often comes hours later after the numbers are run separately through forecasting applications.

What is my conclusion? Customers are interested in real solutions that can save them money. Technology is less relevant in that case; the ability to answer practical questions is more important. SAP has money and customers. For many years, SAP has been using database solutions from its main competitors – Oracle and Microsoft. Will SAP be able to pull off new technology that revolutionizes this market? Will Microsoft, Oracle and open-source databases be able to catch up in this game? An interesting question to ask these days… Just my thoughts.

Best, Oleg


Why will Cloud PLM dominate with a superior UI experience?

July 20, 2012

Let’s talk about user interface today – something we now call UX (user experience). The importance of user experience can hardly be overstated. The days when the user interface wasn’t important are gone, and customers are demanding that software vendors provide a different level of product usability.

What is the path to a good UX?

Well, we agree – user experience is a very important thing. Take a look at the picture below. I’m sure many of you reading this blog post and looking at the picture can go into your product development, PDM or PLM system and find some similarity with this UI.

However, how can we achieve these results in practice? One of my favorite online publications, UX Magazine, published an interesting article – Overhauling a UI Without Upsetting Current Users. I recommend you have a read. I found the concept of “redesign by evidence” interesting and valuable. However, my favorite passage is related to “usability testing”:

Usability testing is very different from beta testing. In beta testing, users will typically only report usability problems that make it very difficult for them to accomplish a task—in other words, things that are very clearly bugs. They typically won’t report that they found something challenging or unintuitive. People don’t always like to admit that they failed at something. Also, beta testers (or at least the ones who take time to report issues) are often fans of the product, and are therefore also power users. They may have already learned to work around or ignore usability issues.

This is key, and many PLM systems have failed to accomplish it. Testing systems with “mainstream users” and not only “product fans” is very important.

Cloud PLM and Usability Testing

In my view, cloud product architecture and development introduce a new opportunity to develop a better user experience. The cloud eliminates long development cycles and makes software available on demand. It creates the possibility to experiment with individuals as well as user groups in a very granular way, allowing a level of usability testing almost identical to what consumer web brands achieve these days.
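As a minimal sketch of what such granular experimentation could look like in a cloud-delivered PLM tool (the experiment name, variants and bucketing rule below are hypothetical, not any vendor’s mechanism):

```python
import hashlib

def bucket(user_id: str, experiment: str, variants=("current_ui", "new_ui")) -> str:
    """Deterministically assign a user to a UI variant.

    Because cloud software is served centrally, a vendor can roll a new
    screen out to a small, stable slice of users and measure task success
    before touching everyone else.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Example: only users bucketed into "new_ui" see the redesigned screen.
for uid in ("engineer-17", "buyer-42", "qa-08"):
    print(uid, "->", bucket(uid, "bom-editor-redesign-2012"))
```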

What is my conclusion? Cloud product development opens new horizons for PLM. Cloud development and testing provide new platforms and capabilities for usability testing and a variety of user experiments. As a result, PLM companies will have an opportunity to achieve a level of UI quality comparable with that of consumer and web products. Just my thoughts…

Best, Oleg


PLM Cloud and Software Licensing Transformation

July 18, 2012

The topic of software licensing is one of the most debated in the context of the industry’s transition to the cloud. PLM is not immune to this discussion, and I see it happening in many forms in blogs, on Twitter and in other social media. I definitely cannot bring all the quotes into this post. One discussion that caught my attention recently is in Ralph Grabowski’s WorldCAD Access blog. Ralph quotes the Griffin Securities analyst Jay Vleeschhouwer’s report from the Autodesk annual analyst meeting in New York:

For Autodesk, switching to the cloud is bigger than the ’90s switch to Windows.

Navigate to this link and have a read. Don’t miss the comments – they are probably as important as the article itself. In addition to the traditional discussion about “security” and “cloud” dangers, I found a few interesting notes related to the potential danger of usage metering and other “new forms” of licensing. Here is my favorite passage:

Autodesk showed a slide indicating a progression from today’s mix of perpetual and maintenance revenues (for blended desktop and cloud workflows) to a future of per-user subscriptions and usage metering.

Cloud and Software Licensing Shakeout

The statement above made me think about the real transformation that will happen to licensing models in the cloud era. Traditional software licensing and (especially) enterprise licensing models are not suitable for a cloud environment. Until now, the vast majority of enterprise software vendors have licensed servers, CPUs and databases. With the introduction of the cloud, most of these metrics became obsolete. How can you license a server when you are not really interested in knowing how many physical servers and/or virtual machines are running to support your environment?

Will “usage” become an ultimate licensing model?

Pay as you go. This is one of the famous and well-known slogans of the SaaS (Software as a Service) world. The wide adoption of this model on the consumer web raises the question of whether such a model will work well for the enterprise. Here are a few pros and cons I can see.

Pros:

- simple model focused on “resource consumption”

- creates a feeling of “fair pricing”

Cons:

- can be unpredictable in some situations

- can create a feeling of “license hostage” for customers.

What are the potential alternatives to “usage” in cloud licensing? I can see three additional options: capacity (size of data storage), timeshare and end-user (named-user) licenses. All these models imply “usage” in different forms. Picking the right one (or the right combination) will be an important step for vendors.
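As a rough illustration of how these models differ in practice, here is a small sketch comparing what one month could cost under three of them. All price points and numbers are invented for the example, not taken from any vendor:

```python
# Hypothetical price points, purely for illustration.
PER_USER_MONTH = 75.0        # named-user subscription
PER_GB_MONTH = 0.40          # capacity-based
PER_API_CALL = 0.0002        # usage-metered

def monthly_bill(named_users: int, storage_gb: float, api_calls: int) -> dict:
    """Compare what one month costs under three cloud licensing models."""
    return {
        "named_users": named_users * PER_USER_MONTH,
        "capacity": storage_gb * PER_GB_MONTH,
        "usage_metered": api_calls * PER_API_CALL,
    }

# A mid-size engineering team: 40 users, 2 TB of CAD data, 5M API calls.
print(monthly_bill(named_users=40, storage_gb=2000, api_calls=5_000_000))
# Named-user billing is predictable; metered usage can swing month to month,
# which is exactly the "license hostage" fear mentioned above.
```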

What is my conclusion? One of the most important questions every cloud vendor needs to ask these days is how to create “a predictable licensing model”. Customers are afraid of becoming license hostages. To prevent that, vendors need to focus on transparency of cost calculations and on offering alternatives. Just my thoughts…

Best, Oleg


The anatomy of MultiCAD-PDM Integrations

February 19, 2012

After posting my last blog, Multi CAD and PDM – dead lock?, I got quite a few emails and calls. It proved to me again that the topic is painful and requires clarification. I’ll be setting up a few follow-up conversations in the coming weeks. Today, I want to provide some background and clarify a few basic things related to multi-CAD and PDM.

The scope of Multi-CAD PDM

One size doesn’t fit all. Companies use multiple CAD systems – because of functional specialization, acquisitions, mergers and existing skill sets. All these factors lead to the question of how CAD systems exchange information, which is about formats and interoperability. Multi-CAD PDM is not about that. Since PDM is largely about control and data sharing, multi-CAD PDM is about how to organize an environment in which engineers (and other people in your organization) can use PDM with all CAD systems and files.

CAD-PDM: Immersive integration

Immersive integration is the trend in CAD/PDM that has formed over the last decade. It allows CAD users to interact with PDM from within the CAD environment. It simplifies design interaction a lot and helps PDM get better control over the design process.

CAD-PDM plug-ins

A plug-in is a piece of software, normally written using the CAD API, that exposes PDM functionality inside the CAD system. It includes functionality related to PDM control – open, check-in, check-out, release, etc. In addition, it supports multiple CAD functional areas (I’m trying to avoid the word ‘feature’ so as not to confuse it with CAD parametric design features).
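To make the structure concrete, here is a rough sketch of such a plug-in. Both the CAD add-in interface and the PDM vault client below are hypothetical placeholders, not any real vendor’s API:

```python
class PdmPlugin:
    """Skeleton of a CAD-embedded PDM plug-in.

    The plug-in sits between two APIs it does not own: the CAD system's
    add-in API (menus, events, access to the active document) and the
    PDM system's vault API (check-in/out, status). Both interfaces here
    are imaginary stand-ins.
    """

    def __init__(self, cad_api, pdm_client):
        self.cad = cad_api      # hypothetical CAD add-in object
        self.pdm = pdm_client   # hypothetical PDM vault client

    def register(self):
        # Hook PDM commands into the CAD user interface and events.
        self.cad.add_menu_item("Check In", self.check_in)
        self.cad.add_menu_item("Check Out", self.check_out)
        self.cad.on_document_open(self.warn_if_not_latest)

    def check_out(self):
        doc = self.cad.active_document()
        self.pdm.checkout(doc.path)          # lock the file in the vault

    def check_in(self):
        doc = self.cad.active_document()
        doc.save()
        self.pdm.checkin(doc.path, comment="Checked in from CAD session")

    def warn_if_not_latest(self, doc):
        if not self.pdm.is_latest(doc.path):
            self.cad.show_message("A newer revision exists in the vault.")
```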

CAD complexity and PDM plug-ins

The growing complexity of CAD systems has made PDM plug-in development quite complicated over the last 5-7 years. CAD systems have added lots of functional enhancements, which turned CAD-PDM development into a nightmare. Add CAD release cycles to that, and you understand why CAD vendors have an ultimate advantage in providing better CAD-PDM integration for their own CAD and PDM systems: they have internal knowledge, specific APIs and control over the release process.

Customer needs and multi-CAD PDM

A multi-CAD environment is a reality for many companies. In such a situation, customers follow one of two possible strategies: 1) follow a dominant CAD + PDM strategy; 2) form a multi-CAD PDM environment. The availability of a specific CAD-PDM integration is an ultimate deal/no-deal requirement in many situations.

Openness and API

In many situations, basic PDM and CAD integration can be achieved using open APIs and system customization capabilities. These are important characteristics on both the PDM and CAD sides. The availability of such instruments can decrease the pain of multi-CAD PDM plug-in development.

CAD and PDM software releases and upgrades

Last, but not least – this is another pain point of multi-CAD PDM plug-ins. Most CAD and PDM releases follow a one-year cycle. At the same time, customers do not always follow every upgrade. Supporting multiple CAD releases in a single PDM is another place where precise synchronization between the CAD and PDM development processes is required.

What is my conclusion? The topic of multi-CAD PDM isn’t new. Let’s face reality – because of its absolute importance, companies can easily get provoked into using it as a competitive advantage. In addition to "openness" in general, the ultimate way to solve this problem is open APIs. I believe open APIs are the most important strategic factor for companies to be competitive in the future. Just my thoughts… I’m looking forward to your comments and future discussions. Stay tuned for more posts about this.

Best, Oleg


Live from SolidWorks 2012 Media Event

August 31, 2011

I’ve been attending SolidWorks Media Day at the Dassault SolidWorks campus in Concord, MA. Actually, I noted this is probably the last time the event happens in the current SolidWorks building – SolidWorks is moving to the new Dassault campus in Waltham, MA. While the information about SolidWorks 2012 is under embargo, you can get a feeling for the event and some interesting information about the SolidWorks install base, revenue growth and ecosystem. I tweeted most of these photos during the event; nevertheless, I thought many of you could benefit from seeing them combined in a single blog post. Not everybody was able to come to Boston because of Hurricane Irene a day before. However, you can see an impressive crowd in the meeting room.

Bertrand Sicot kicking off the event. The important message: SolidWorks is about evolution and not revolution.

SolidWorks continues to develop the 3D professional market.

Nothing especially new, but this is how SolidWorks is presented as part of all Dassault Systèmes’ brands.

Key facts, revenues and install base.

Revenues

Install base

It was interesting to see the level of non-CAD product growth in the SolidWorks portfolio. Today the numbers here mostly represent simulation, data management and documentation products.

Fielder Hiss, VP of Product Management, presents a historical retrospective of SolidWorks releases over the last 20 years.

SolidWorks community numbers are impressive – 436 VARs and 750 partners.

The afternoon agenda included some very entertaining engineering experiments, such as magnets, a motor assembly and the hit of the day – a vibration-driven mouse robot.

Furthermore, the afternoon agenda included a SolidWorks 2012 beta customer panel and a deep dive into SolidWorks 2012 features. However, this is where the embargo starts.

In addition, SolidWorks innovated by organizing a new activity with SolidWorks execs ("speed dating"): a seven-minute round-table talk with an exec where you can ask questions. After seven minutes – rotation, and the execs move to the next table.

What is my take? It is hard to make a real comment without talking about the product. However, I think SolidWorks can play a significant role in the future transformation of Dassault Systèmes. How will it happen? Time will tell.

Best, Oleg


PLM and New Openness

July 21, 2011

The topic of openness in PLM software isn’t new. Over the past decade, I’ve heard lots of good and bad things about PLM and openness. Last year, I shared my thoughts on PLM and openness in my post – Closed Thoughts About PLM Openness. A few days ago, I had a chance to read Ralph Grabowski’s interview with Fabien Fedida of Dassault Systèmes. Among all the topics they discussed, one was "new openness". Navigate your browser to the following link to read the interview. Below are some of my thoughts and references on PLM openness from the last year.

New Openness in Dassault V6 R2012

According to Mr. Fedida, Dassault is thinking about how to improve the openness of its PLM software. A few examples: releasing new APIs, and adding V6-to-external-PDM integrations using web services and XML schemas. Here is my favorite passage:

Dassault is aggressively releasing API [application programming interface] calls "to the entire eco system." For example, ENOVIA V6 is now up to 3,000 API calls, 3DVIA Composer has 500, and CATIA has new ones in the area of composite manufacturing. There will be more APIs to come…

I see the release of new APIs as something very positive. However, I’m a bit wary of measuring openness by the number of API calls, which reminds me of how software used to be measured in LOC (lines of code) twenty years ago.
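To make the web-services-and-XML style of integration more concrete, here is a minimal sketch of pulling one item over such an interface. The endpoint, XML element names and the receiving-side call are invented for illustration; this is not the V6 API:

```python
import urllib.request
import xml.etree.ElementTree as ET

# Hypothetical endpoint exposing item data as XML over HTTP.
ITEM_URL = "https://plm.example.com/api/items/{number}"

def fetch_item(number: str) -> dict:
    """Fetch one item from a (made-up) PLM web service and map the XML
    payload into a neutral dictionary another PDM system could import."""
    with urllib.request.urlopen(ITEM_URL.format(number=number)) as resp:
        root = ET.fromstring(resp.read())
    return {
        "number": root.findtext("Number"),
        "revision": root.findtext("Revision"),
        "title": root.findtext("Title"),
        "state": root.findtext("LifecycleState"),
    }

# Example: pull a released part and hand it to the target system's importer.
# item = fetch_item("PRT-000451")
# target_pdm.import_item(item)   # hypothetical call on the receiving side
```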

Aras, PLM Data "Obfuscation" and Other PLM vendors

A different perspective on openness comes from the Aras blog, which talks about PLM vendors’ lock-in practices, interoperability, and the need to make databases and data models transparent and interoperable. Navigate your browser to read What is PLM Data "Obfuscation" and Why Should I Care?!? The following quote explains the "data obfuscation problem" and the core idea behind Aras’ openness in database and data modeling technologies:

This is the way the other major PLM / PDM systems were / are designed. The database table for the Part Master is not called / labelled “Part” – it’s labelled “0034543908543TG324” or something else confusing like that… the data are sometimes split into different tables so that access is non-intuitive. This is “obfuscation” and it’s done by design. PLM systems have traditionally been (and still are) very hard to get at the data and figure out, sometimes impossible…. Aras is an open and transparent data model that is designed in a very simple and straightforward manner. Parts are in a table called “Part”, Suppliers in a table called “Supplier”, etc. and you have complete access along with a published data dictionary. That’s very different from the other major PLM providers and one more way that we are helping companies take control of their own destiny.
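The difference the quote describes is easy to see in a query. Here is a toy comparison, with table and column names invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# "Obfuscated" style: the part master hides behind a generated table name
# and coded columns, so nothing is readable without the vendor's key.
conn.execute("CREATE TABLE T0034543908543 (C01 TEXT, C02 TEXT, C03 TEXT)")
conn.execute("INSERT INTO T0034543908543 VALUES ('PRT-001', 'A', 'Bracket')")

# Transparent style: the same data in a table and columns named for what they are.
conn.execute("CREATE TABLE Part (part_number TEXT, revision TEXT, name TEXT)")
conn.execute("INSERT INTO Part VALUES ('PRT-001', 'A', 'Bracket')")

# The second query can be written (and audited) by anyone who knows the domain.
print(conn.execute("SELECT C01, C03 FROM T0034543908543").fetchall())
print(conn.execute("SELECT part_number, name FROM Part").fetchall())
```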

Talking about other PLM vendors, I’d mention the Teamcenter PLM XML schema, which is a good example of openness, in my view. You can get more details here. It is interesting to see that Aras itself delivered a connector to Teamcenter using the PLM XML openness. Navigate to the following press release made by Aras back in 2009 – Aras announced connector technology for Siemens PLM Software Teamcenter.

What is my conclusion? Openness is a tricky thing, in my view. It depends on the environment, and many times it is part of a company’s fundamental strategy. I have to say that DS’s API numbers and connectors-to-everything really reminded me of the shelf in the Apple store with connectors for all possible sources of data. At the same time, I cannot completely agree that the use of the right naming in a database alone can solve the issue of data transparency and interoperability. The fact that vendors are thinking about openness is a good sign. We will watch the results… What is your take on PLM openness?

Best, Oleg


How To Stop Searching for PLM Killer App?

March 12, 2011

Are you familiar with the "killer app" syndrome? In my view, conversations about a "killer app" become very popular when some technological device or broad technological innovation needs to prove itself. A killer app becomes so popular that the return on the technology becomes obvious. I can give some examples of killer apps from the past: VisiCalc on the Apple II or Lotus 1-2-3 for the IBM PC. However, in my view, talk about a "killer app" is also a good indication of problems with a product or technology.

The following article caught my attention yesterday: What is the Killer Application for a Modern Engineer? I missed it when it was originally published in January. Chad Jackson, my colleague in the PLM blogosphere, talks about CAD, collaboration and mashups as examples of killer applications for engineers. While I disagree with the notion of a killer application in the context of engineers, I found the analysis Chad made in his post interesting.

Examples of Killer Apps?

CAD App
Personally, I think CAD is a mainstream technology, proven over many years. I don’t think anybody today designs a product without a CAD system. The history of CAD has passed through many waves of technological innovation that moved it between 2D, 3D and different computing platforms. I found it surprising that SolidWorks wasn’t mentioned in the list of CAD products, but the choice of CAD has always been somewhat "religious", and Chad’s selection didn’t surprise me.

Collaboration App
The history of various "collaborative applications" in the engineering space, in my view, started with the introduction of data management to a wider company audience and the subsequent attempt to expand into PDM and PLM. The discussion about what the killer app for collaboration is continues even today. My favorite collaboration tool for many years has been email. Since I moved to Google Apps, I have found it a good addition to my email experience. PDM and PLM applications are constantly trying to replace email, without visible success, in my view.

Mashups
The story of mashups is funny, in my view. The word itself came to us from the internet and web space, where applications (mostly running in the browser) "mashed up" web content to make it more valuable for end users. The most successful mashup application, in my eyes, is Google Maps. I have written about mashups on my blog before (Will Mashup Grow Up in PLM?). In my eyes, mashups are interesting but too vague and unclear from the standpoint of an end user who is trying to get a job done.

PLM as a Killer App

In the early 2000s, PLM was introduced as the next big thing for engineers and manufacturing. After almost a decade of debates and various technological and product development attempts, I see Product Lifecycle Management more as a "business and technological strategy" than an "application".

Product Development: One Size Doesn’t Fit All?

Now think about design, engineering and manufacturing. They are all so different from various perspectives: industry-specific needs, departments and roles all differ. Finally, every manufacturing shop develops its own strategy for how to compete in the modern world and what can make it unique. If you ask me what application can fit everything, my ultimate answer is simple – Excel. Yes, Excel rocks when it comes to flexibility and user adoption. But the cost of customizing Excel to fit your needs is huge, and the cost to support it even bigger (remember my post Do you need a chief Excel officer to manage BOM?).

What is my conclusion? PLM software vendors and analysts need to stop searching for the next "killer application". Flexibility and granularity are two important directions software vendors need to follow to reach the next level of PLM software adoption. Just my opinion, of course. YMMV.

Best, Oleg


PLM Platform Wars: Who is Right or Who is Left?

February 25, 2011

The CAD and PLM world is extremely competitive. The decision process in the manufacturing industry is not fast. Companies spend significant budgets evaluating tools, benchmarking and comparing, and at the end of this process they are supposed to make the right decision about what tools to use. During the last couple of months, I have observed a growing number of announcements from PLM companies about "yet another major company" making the right choice by selecting a CAD or PLM system from a specific vendor.

Just a few examples from Joe Barkai’s article, Industry on the Move — The Quest for Effective Global Product Lifecycle Management: PTC’s announcement that Hyundai Motor Company and Kia Motors Corporation (HKMC) selected PTC’s Windchill as its enterprise PLM software. Siemens PLM Software announced that Aston Martin is selecting Siemens’ NX CAD software for design and engineering, and Teamcenter PLM software to manage product and process information. Two months before, Daimler AG selected NX as its corporate CAD software standard. Earlier, Chrysler selected Teamcenter as its corporate-wide PLM software. Two months before that, Volvo Group adopted PTC’s Windchill. Here is my favorite passage from Joe’s article:

OEMs are realizing that the traditional heterogeneous and fragmented product lifecycle management environment, even when comprised of excellent tools, is unable to provide the level of visibility, manageability, and fidelity of decision-making required, and are taking steps to migrate to a design and manufacturing environment capable of supporting a global platform strategy. This environment must be standard-based and open in order to facilitate a single source of all data for design, manufacturing and supply chain processes across vehicle design and manufacturing programs.

Another interesting publication is Automotive Sector Ground Zero for PLM Battles by Beth Stackpole of Design News. Beth discusses the same announcements made by the PLM and major automotive vendors. However, I found the following passage interesting:

The automotive sector, which has a deep roots in evolving PLM software and practices, is shaping up as a fresh battleground for the major vendors in this category with all touting recent customer wins that play up their strengths and cement their positions as core development platforms for next-generation vehicles.

My best read about PLM movement in the automotive world is Al Dean’s article in Develop3D – All Change in the Automotive World. I recommend you have a read. Some of Al’s final thoughts struck me and made me think about V6 innovation:

There have been three pretty big moves away from Dassault or a decision to not take on Enovia during benchmarks. With a two year benchmark cycle being common, one has to wonder if there’s a link back to the launch of V6? A curious thing indeed.

In my view, V6 created disruption, innovation and challenge at the same time. The bundling of CATIA into ENOVIA V6 creates a lot of possibilities that have never been available before from the standpoint of collaborative design and data management. At the same time, the initial introduction of the system raised a lot of IT questions that need to be resolved.

What is my conclusion? I wanted to recall the wise and relevant words of Bertrand Russell: “War does not determine who is right – only who is left.” The war between PLM platforms can become a disaster for customers. Customers spend millions of dollars investing in "unbreakable closed platforms". Each of these systems contains lots of data, which has much greater value than the software, which will eventually be rewritten every 5-10 years. I think openness wins in the long run. In my view, PLM companies are only playing with openness. Who will take it seriously first? That might be the company that is left after the battle. Just my thoughts…

Best, Oleg

