PLM, Cupcakes and Blind Spotting

June 19, 2012

Let me ask you this… Is there a connection between PLM and cupcakes? I hope I’ve got your attention for a minute :). One of the questions the PLM industry has been struggling with for a long time is awareness. ERP vendors, PLM’s long-time industry opponents, are well positioned to get the attention of CIOs and other executives in companies. At the same time, speak to any PLM-related person about PLM awareness, and you will immediately be attacked with a long list of facts that are supposed to convince you PLM is the first thing you need to implement in order to improve your business. Getting back to the original question about PLM and cupcakes, one of my best blogging and tweeting buddies, Jim McKenney, posted on his PLM on my Brain blog a few months ago an interesting post called – Who needs Product Lifecycle Management? The following passage is my favorite:

People seemingly cannot agree on who really needs Product Lifecycle Management, or PLM.  What is my response? EVERY business needs PLM! Why, you ask, well, let me tell you. Every business has a large pile of information that supports their products. It may be in the form of paper, but thankfully today, most businesses have a large amount of digital information that supports their business. Managing that virtual information is what allows the company to continue delivering products and support to their customers. PLM is all about managing virtual information to support physical products.

Later in his post, Jim comes to the “cupcakes story”, explaining why every company needs PLM. You can replace “cupcakes” with anything else – cars, airplanes, computers, food, etc. – and it won’t change the story. It is interesting to observe how people have convinced themselves of the need for PLM. As a result, PLM solutions’ appetites are growing, and many additional solutions are becoming part of PLM portfolios.

Another thing that very often comes up in conversations about PLM is “change”. A company must change its processes, behaviors, organization, systems, etc. to successfully implement a PLM system. Another PLM blogger I admire – Jos Voskuil – often writes about the people aspect of PLM. In one of the latest posts on his virtualdutchman blog – The State of PLM after 4 years blogging – Jos says: I believe PLM requires a change in an organization not only from the IT perspective but more important from the way people will work in an organization and the new processes they require.

The aspect of change and the question of PLM awareness made me think about “blind spotting”. Everybody knows what a blind spot is when driving a car. However, this is not exactly what I want to talk about. I got my first exposure to the topic of blind spotting two years ago during COFES 2010, when I attended a lecture by Peter Marks on this very topic. I’m including a video recording of the session below. It is longer than the videos I usually put in my posts. Save it for the coming 4th of July week and watch it. I’m sure you will learn a lot, as I did.

Peter Marks lecture about Blind Spotting at COFES 2010

Just to capture the idea of what Peter Marks is talking about, I want to refer you to the article where Peter answers three questions about blind spotting. Navigate here and have a read. Here is the answer that caught my special attention. Peter is talking about which blind spot will make the biggest difference to us:

It’s probably the extension of our innate territoriality to territories of belief. This often leads to irrational escalation of conflict. As with many other animals, we’re wired to defend our territories.  Home territories are where we find sustenance and protect our kin.   Over the millennia, we’ve evolved many biases to give us a “home field advantage.” Today, the notion of defending a physical territory has extended to “territories” of belief and culture.  The functional silos in most medium size and larger organizations are a mild form of this territoriality.

I found it resonating with the topic of PLM territory :). I hope you get my point. It is about how the territory of an automotive manufacturing company, a high-tech company and… finally, a cupcake company fits the way PLM is pretending to shape their business. This is where a second blind spot mentioned by Peter caught my special attention –

Q: What is the biggest blind spot you overcome yourself? One thing I’ve become more aware of is how the “confirmation bias”  affects me.  Most of us, myself included, are confident in our own beliefs.   When challenged, we start looking (only) for evidence that supports our opinion.  Early in school and in my career, my knee-jerk reaction was to bury contrary opinions in an avalanche of facts.

What is my conclusion? I find “blind spotting” an interesting association for looking at what is happening around PLM these days. We have been long-time believers that we “know how” to make companies use PLM software. I have to say, we’ve achieved certain results in how we did it. However, PLM software hasn’t reached mainstream adoption comparable to accounting, CRM and some ERP functions. The market situation these days is very disruptive – cloud, social, various so-called “2.0 trends”. It is important to overcome the traditional PLM blind spot in order to see the shift the PLM industry needs to make to go beyond its current potential. Just my thoughts…

Best, Oleg

The Future of TLA in Engineering Software?

September 22, 2010

Yesterday, I attended the COFES Russia / isicad 2010 forum in Moscow. My presentation at the forum was about my view of the future of TLAs (Three Letter Acronyms) in Engineering Software. You can see the slides of my presentation below.

Later I ran a round table about the future of PLM technologies. Here are my key takeaways. The last decade was a decade of consolidation in Enterprise Software and PLM. Not much was done beyond that. Consumer software, the Internet and especially Web 2.0 applications will have a significant impact on the future of technologies and products in Engineering and Manufacturing Software. Just my thoughts…

Best, Oleg

Autodesk, Data Management and “Why PLM?” Question

August 26, 2010

I read one of the latest VEKTORRUM re-posts about Autodesk and PLM. Navigate your browser to the following link and read the original article from 2007. According to the article: “There are ‘more pragmatic, more digestible approaches’ to solving engineering data management issues than PLM, he [Carl Bass] said”. It made me think more about Autodesk, data management and PLM strategies.


Let’s start with the history. Autodesk has a long history of data management solutions spanning multiple products. Some of them were developed by Autodesk, and for some of them Autodesk partnered with other companies. The most notable Document and Workflow Management system in the early 1990s was Autodesk Workcenter (Google still tracks the following link on Autodesk Workcenter). I had a chance to work on a few Autodesk Workcenter implementations, so I have my own Workcenter implementation memories going back to 1994-1995. The next big Autodesk data management project was Motiva PDM. Autodesk made a significant investment in the Motiva project at the end of the 1990s. You can check the following KMWorld article – Autodesk and Motiva to Collaborate for PDM. Both Workcenter and Motiva development were discontinued.

In the beginning of the 2000s, Autodesk acquired a company called truEInnovations. The original product, truEVault, was the foundation of the existing Autodesk Vault. This is the Wikipedia quote:

Autodesk Vault was initially known as truEVault; part of an acquisition from a company called truEInnovations, Inc. based in Eagan, Minnesota. truEInnovations was started by two entrepreneurs, Brian Roepke and Dean Brisson in 1999. The company was founded on the basis of bringing a more affordable tool for managing engineering data to the market.

After the asset acquisition of truEInnovations by Autodesk in 2003, Autodesk began to further the integration of the product into the manufacturing product line, starting with Autodesk Inventor.

Autodesk’s Data Management Foundation

For the moment, Autodesk Vault is the foundation of all Autodesk Data Management products. After the latest re-branding, Autodesk Vault is a family of PDM products providing a wide range of capabilities, starting from file vaulting and expanding into the areas of Bill of Material management and Engineering Change management.

Autodesk is working intensively to provide additional data management features and functions. You can see a short video by Brian Roepke about Autodesk Vault 2011:

In the following video you can see a new Autodesk Vault 2011 integration with Inventor.

In my view, some of them are very similar to features presented by DS 3DLive and Siemens 3DHD products. See my post – 3DLive, 3DHD, 3D and UI efficiency.

Autodesk and PLM

Steve Wolf of Cyon Research recently published an article on the COFES Blog – Who Needs PLM? In this article, Steve discusses the latest Autodesk financial results.

The following quote represents Steve’s comparison between Autodesk and other PLM-associated companies.

What’s interesting about Autodesk’s success is that the company’s products consist almost entirely of single-user desktop tools that engineers use to do their jobs faster. Relatively little of Autodesk’s income comes from what its rivals call “product lifecycle management” (PLM) software that combines engineering applications with fiendishly complex enterprise-level software for managing engineering data.

A different opinion is presented by CIMData in their latest research paper, focusing on how Autodesk will evolve into a full-scope PLM provider. I had a chance to discuss this CIMData research before on my blog. This is the PLM Think Tank link. Take a look at this interesting quote from the CIMData website:

... perspective on the transition that Autodesk is executing to transform itself from a supplier of individual PLM-focused point solutions to a supplier of industry-focused solutions that can be the fundamental platform for a company’s overall PLM strategy.

What is my conclusion? I think Autodesk is walking a very narrow bridge, trying to meet customers’ demand for a rich scope of data management functions and integration with design tools like Autodesk Inventor. At the same time, Autodesk is trying to avoid positioning data management as a “PLM strategy”. The ugly truth, in my view, is that users are less interested in TLAs these days and think more about products, functions and usability. Just my thoughts…

Best, Oleg

PLM and Bottom Up Option

August 20, 2010

The following publication in VEKTORRUM got me to review again the book “3D Manufacturing Innovation” by Dr. Hiroshi Toriya. I had a chance to read this book last year, and it contains some very impressive examples every PLM software company needs to learn from. The book is pricey. As an alternative, navigate your browser to the Google Books link and you will get about 30% of the book for free. This book and Randal’s post – “3D Manufacturing Innovation – Explains the Japanese Quest for an Alternative to PLM” – made me think about the PLM strategies that were developed over the last decade and their potential improvements.

3D Master Top-Down
This is the dominant concept used today for Product Lifecycle Management. It is supported by the mindshare PLM leaders (Dassault, PTC and Siemens PLM). Their strategies as well as their portfolios are aligned straight from the CAD/3D products, and they have built infrastructure to manage and proliferate 3D information downstream. Dassault is the most committed to their vision of 3D for all; PTC and Siemens PLM, in my view, provide more balance between CAD/3D and process orientation. As a consequence of competition with major ERP vendors, the PLM mindshare companies are shifting towards better modeling of downstream data, engineering and manufacturing options as well as industry businesses.

Process is King
This model is supported by PLM companies that have their roots in the ERP domain. 3D is definitely not the strongest side of their portfolios. So, building their products with “a process in mind” makes a lot of sense to them. These companies can leverage a very strong enterprise architecture and infrastructure. By doing so, they can support development and manufacturing processes. The disadvantage of this approach is a very weak connection to design and 3D data. Therefore, we have had a chance to see ERP companies invest in acquisitions of 3D viewing technologies.

Bottom Up Approach
Both approaches – “3D Master” and “Process is King” – are very focused on top-down methodologies. This is, in my view, a significant weak point. What if we need to move from a top-down approach to bottom-up? Dr. Hiroshi Toriya mentioned in his book something that can be considered an alternative, bottom-up option: 3D data is accumulated by the company in a central database and accessed by everybody.

What is my conclusion? PLM is one of the strategies introduced by software companies to improve design and manufacturing. It was adopted by manufacturing companies. However, it faces significant criticism. There are two main criticisms – (1) the need to reform a company in one day and (2) the complex implementation that follows this decision. I can see multiple alternatives on the table. One of them is a bottom-up approach and a decline in the massive 3D CAD top-down dominance in PLM implementations. Just my thoughts…

Best, Oleg

PLM, Enterprise Social Software and Excel Litmus Test?

August 9, 2010

I want to take another round of thinking about enterprise and social software. My last post related to that followed the Enterprise 2.0 conference in Boston in June – PLM and Enterprise 2.0: No Fight… Yet.

A few days ago, I posted about PLM, BOM, Excel – How to Make it right? Chris Williams at Vuuch made an interesting comment that made me think again about existing PLM problems and the potential of social software for enterprise organizations.

Excel Litmus Test
I’m coming to the conclusion that enterprise software vendors can use MS Excel as a litmus test for potential problems. Depending on the amount of MS Excel usage, you can draw a conclusion about the quality of the solutions they provide. Users vote for Excel each time enterprise software doesn’t work or is too complex to use.
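To make the idea concrete, here is a minimal sketch of how such a test could look in practice – the folder names, file extensions and the listing itself are invented for illustration, not taken from any real system:

```python
import os
from collections import Counter

# Hypothetical "Excel litmus test": count spreadsheet files per
# top-level folder of a shared engineering drive as a rough proxy for
# how often users fall back to Excel instead of the enterprise system.
SPREADSHEET_EXTS = {".xls", ".xlsx", ".csv"}

def is_spreadsheet(filename):
    """True if the file looks like a spreadsheet users fell back to."""
    return os.path.splitext(filename)[1].lower() in SPREADSHEET_EXTS

def excel_litmus(paths):
    """Given an iterable of relative file paths, count spreadsheets per top folder."""
    counts = Counter()
    for path in paths:
        if is_spreadsheet(path):
            top = path.replace("\\", "/").split("/")[0]
            counts[top] += 1
    return counts

# Example: a file listing collected from a shared drive (invented data).
listing = [
    "boms/bom_v3.xlsx",
    "boms/bom_final_FINAL.xls",
    "designs/bracket.ipt",
    "quality/defects.csv",
]
print(excel_litmus(listing))
```

The more spreadsheets pile up next to the enterprise system, the more likely users are working around it.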

Enterprise Social Software
The term ESS (Enterprise Social Software) first appeared during the Enterprise 2.0 conference a couple of years ago. Here is the Wikipedia definition:

Enterprise social software (also known as or regarded as a major component of Enterprise 2.0), comprises social software as used in “enterprise” (business/commercial) contexts. It includes social and networked modifications to corporate intranets and other classic software platforms used by large companies to organize their communication. In contrast to traditional enterprise software, which imposes structure prior to use, enterprise social software tends to encourage use prior to providing structure.[citation needed] Carl Frappaolo and Dan Keldsen defined Enterprise 2.0 in a report written for Association for Information and Image Management (AIIM) as “a system of web-based technologies that provide rapid and agile collaboration, information sharing, emergence and integration capabilities in the extended enterprise”.[1]

Back in 2006, Social Software was defined as one component of Enterprise 2.0.  As of 2006, “Enterprise 2.0″ had become a catchier term, sometimes used to describe social and networked changes to enterprises, which often includes social software (but may transcend social software, social collaboration and software).

Another marketing buzzword was Enterprise Web 2.0. This term relates to software making intensive use of Web 2.0 technologies to create enterprise applications.

Enterprise Portal Renaissance?
Let me move you back ten years. Do you remember Enterprise Portals?

An enterprise portal, also known as an enterprise information portal (EIP) or corporate portal, is a framework for integrating information, people and processes across organizational boundaries. It provides a secure unified access point,[1] often in the form of a web-based user interface, and is designed to aggregate and personalize information through application-specific portlets. One hallmark of enterprise portals is the de-centralized content contribution and content management, which keeps the information always updated.

I find this definition very interesting. If you replace “organizational boundaries” with “product lifecycle”, you get almost the definition of PLM. I found an old report about Enterprise Portals published by the Delphi Group in Boston in 1999. Download it by navigating to the following link. Below you can see a diagram I took from this report.

I can see lots of correlations between the functional categories of the 1999 Enterprise Portal model and the 2010 Enterprise Social Software model. It looks like we may have a second wave of Enterprise Portals coming under a new name – Enterprise Social Software – built on Web 2.0 technology that has matured during the last ten years.

What is my conclusion? Complexity is hard. In my view, the Excel Litmus Test can help you identify it easily. PLM is in a deep complexity recession. Enterprise 2.0 and social software can provide some fresh air. However, as usually happens during a hype period, many companies will try to sell you old stuff under a new name. Then we will be in danger of a double-dip recession. Ask about functionality and technology. Try these things out and see whether you recognize the same old stuff under new names. The good thing about Enterprise Social Software is its attempt to bring modern Web technologies to the enterprise. That makes a lot of sense to me.

Just my thoughts…
Best, Oleg

PLM Downstream – Sent from my iPad?

August 6, 2010
I had a chance to read “Sent from my iPad” on VEKTORRUM last week. Dave Angelotti discussed the option of using the iPad as a field device. It seems interesting. It made me think about the many unrealized options for PLM innovation downstream. Last year, I wrote – PLM content downstream usage, Googlenomic and Futuristic Search. Looking at that post now, it seems much less futuristic to me. The iPad experience helps… I decided to put down a few “downstream scenarios” and hope to get more ideas from you.

PLM Downstream Scenarios

(1) Sales. I can see many scenarios where salespeople will be able to access a configured catalog of products to facilitate sales. This is a dead-simple scenario. However, the complexity lies in having it well integrated with other systems.

(2) Manufacturing Shopfloor. In my view, there is an opportunity to use it to replace printed documents on the shopfloor. Do you think it may work? I talked to a few people about this option last week. They told me the iPad screen size is probably too small. Or maybe we need to wait for the iBoard (take a look at the following joke about iPhone-iPad-iBoard-iMat)?

(3) Maintenance Operations. This situation is similar to the manufacturing shopfloor. The accessibility of the device may play a key role in getting access to the right information during maintenance procedures.

The Missing Part of the Puzzle?

Do you think the device is important? My bet is yes. Sometimes, innovative ideas have a very long path to people. To make them possible, a unique combination of events needs to happen. It might be the cost of components, the environment, or just the right device or available technology. The ultimate goal is to get rid of paper downstream. This is a truly innovative goal, in my view. It makes a lot of sense from various standpoints – information access, ecology, etc. Don’t you think the iPad is the missing part of the puzzle to make it happen?

What is my conclusion? From my experience, the following three iPad characteristics will ultimately help iPad proliferation downstream – (1) light weight; (2) connectivity; (3) power consumption. I think the iPad creates a significant opening for PLM (and BIM) innovation in downstream applications – sales, manufacturing facilities, field operations, etc. What I like about the iPad is the ability to create a platform for lots of powerful and focused applications. Not a big PLM show – focus matters!

Best, Oleg

Vertical PLM Medley

July 16, 2010

My new website and blog is BeyondPLM. The original post is here.

I read an article, “PLM’s Vertical Challenge”, in High Tech Views. The article presents the advantages of industry-oriented PLM implementations. Read it and form your own impression. I especially liked the following part:

“We needed something that had an apparel maker’s focus in mind. Otherwise we’d be spending too much time trying to tailor something else,” says Rich Zielinski, Under Armour’s vice president of technical services. “The more tailoring you do, the more a system doesn’t work.”  Companies in many industries are learning that lesson. Having lived through lengthy, expensive, and often fruitless enterprise software rollouts, especially in the areas of ERP and CRM, manufacturers are now reluctant to bankroll the deployment of generic PLM systems in need of major modifications for handling industry-specific requirements and company-specific business processes. They understand that, as the PLM market matures, much of the heavy lifting associated with mapping out critical business processes and industry requirements — for example, in the areas of regulatory compliance or quality management — has already been done by the vendors and early adopters in sectors such as automotive and aerospace and defense.

The problem is very interesting, in my view. Companies are suffering from long implementation cycles. Alternatives need to be found. At the very beginning of the PLM era, CAD/PLM companies were selling toolboxes with a significant portion of services and implementation consultancy. However, 1-2 year implementation cycles, significant budgets and unsatisfied customers made vendors think about possible alternatives.

PLM Industry Solutions
The idea of industry solutions (or, as companies call them, industry verticals) isn’t new. If you take a look at the portfolios of all major PLM (and not only PLM) vendors, you will see a standard portion of industry dog food. Industry solutions remind me of storytelling. In the beginning, industry sales or industry professionals prove that they can speak to a customer in the “industry language”. Then it turns into software. The typical option is a pre-configured software package including a full pack of documents and presentations explaining how to use it inside the company. Below you can see a top-level view of such industry-oriented packages.

Vertical Solution Problems

What is the biggest advantage of industry solutions? In my view, pre-packaged functionality. What is the biggest disadvantage? In my view, the same. Engineers have a tendency to work slightly differently. Organizations may have their own way of running development that doesn’t fit 100% with the pre-packaged functionality provided by vendors. So, the biggest danger is that the customer will need to customize the industry solution to fit organizational needs.

Granularity and Risk Management

How do you manage risks in implementing vertical PLM solutions? One possible way is to increase the granularity of the solution you are going to buy and implement. Having small pieces that can be stitched together can protect you from grandiose plans to implement everything in a single shot.

What is my conclusion? The best association that comes to my mind when I think about vertical PLM solutions is the CD medley. Do you remember those? You could buy a Bach medley, a Mozart medley, a Rock-n-Roll medley… You could buy a double-CD medley for a bargain price. That was 15-20 years ago… What can be better? Now we know – iTunes. The more granular approach can win over time. Just my thoughts…

Best, Oleg

Open PLM – A Climb For Losers?

July 8, 2010

Almost two months ago, I had a chance to read a blog article on Mobile Beat with the intriguing name “Open is for losers” by Dave McClure. Take a look and form your own opinion. Understanding that the overall intention was to provoke a good discussion around this topic, I decided to come back to this blog post after some time to read the discussion and see people’s opinions. It happened that, independently, some PLM Think Tank posts generated interesting discussions related to the topic of openness.

Closed Thoughts About PLM Openness

Open vs. Closed PLM Debates

Today, I want to get back and discuss how I see the future of openness in PLM and Engineering Software.

Standards, Formats, Data Models

The discussion about standards and formats is never-ending. Do you think the CAD and PLM industry will be able to produce a reasonable-quality standard answering the needs of the industry? Discussions in this field vary from requests to develop independent formats to requests to disclose existing data models, formats, etc. I don’t think these discussions will produce results in the near future. The development of common formats and models is too expensive and, in the end, will require a set of tools to work with them. The interesting potential is to borrow some of the emerging web technologies.
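To illustrate what “borrowing web technologies” could mean, here is a minimal sketch of a bill of materials expressed in plain JSON – the part numbers, field names and quantities are all invented for illustration:

```python
import json

# A small BOM in plain JSON: a lightweight web format any client can
# parse without a vendor toolkit. All identifiers here are hypothetical.
bom = {
    "part": "ASM-1000",
    "description": "Cupcake mixer assembly",
    "children": [
        {"part": "MTR-200", "description": "Motor", "qty": 1},
        {"part": "BWL-300", "description": "Bowl", "qty": 1},
        {"part": "SCR-010", "description": "Screw", "qty": 8},
    ],
}

def total_components(node):
    """Count leaf components, multiplying by quantity."""
    if not node.get("children"):
        return node.get("qty", 1)
    return sum(total_components(child) for child in node["children"])

serialized = json.dumps(bom)       # easy to exchange over HTTP
restored = json.loads(serialized)  # any web client can parse it back
print(total_components(restored))  # prints: 10
```

The design choice here is exactly what proprietary data models lack: the structure survives a round trip through a plain-text web format with no special tools on either end.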

APIs, Toolkits and Platforms

The API is the most referenced way to make software open. Combined with a set of patterns, technologies and buzzwords, this is probably the way to do it in real life. The biggest remaining problem in this space is the reliability and compatibility of these APIs over time. In addition, APIs and other internal mechanisms are under heavy licensing by vendors. I don’t see any silver bullet these days that can bring major improvements in this space.

Applications and Solutions

The reality of the current situation is that customers work with multiple applications, products and solutions. In many situations, I can see no interest from a particular vendor in enabling easy data migration from one tool to another. Therefore, we can see multiple software vendors and service providers that help customers solve this problem.

What is my conclusion today? How to climb to open PLM? This is the most important question that needs to be asked collectively by the industry. I think the industry’s movement toward PLM openness is an indicator of industry maturity. I can see ups and downs on this road. A critical mass of openness needs to be delivered by vendors to switch on the benefits of open PLM. Multiple established vendors and newcomers will need to invest enough resources to make it happen. The remaining players will be able to deliver better solutions. Some of them will probably die on the road. Just my thoughts…

Best, Oleg


How Big Is Product Lifecycle Data?

July 6, 2010

Product-related data is one of the most important aspects of any PLM implementation. When you talk about a PLM implementation, the topic of product-related data (or IP) very often becomes the center of the conversation. There are multiple sources of this type of data in an organization. In my view, one of PLM’s goals is to have control of this data and provide tools to manage the overall lifecycle. One of the PLM implementation challenges is to provide wide support for product-related data. The topic I want to discuss is the ability of PLM products to handle the full scope of product lifecycle data.

I read the article Oracle, SAP working on Exadata support. The core of this conversation is how to scale up and provide extensive support for big data handling in the organization. Have a read of this article and form your own opinion. Mine is simple – both Oracle and SAP understand the size of the potential problem (data size). They are working in multiple directions to find a solution for data sizing in transactional enterprise applications. Should PLM care? This is a very good question, in my view…

PLM and Product Lifecycle Data Problem
One of the challenges PLM has had for many years is getting control of product-related data. My observation is that product-related data is not completely controlled by PLM systems in the majority of PLM implementations. Even in a very successful PLM implementation, data is scattered between multiple data sources, and PLM is only one of them. In addition, product-related data can be located in the diverse set of applications used for product development.

Product Data, Size and PLM value
The full value of Product Lifecycle Management depends directly on what scope of product-related data is covered by PLM. A wider scope can maximize PLM value for organizations. With all the current developments – PLM looking at everything from design-to-manufacturing strategies to the development of social-oriented applications – sizing can easily become one of the potential bottlenecks in the ability to support a large scope of data.
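As a back-of-envelope illustration of how quickly sizing becomes a problem, here is a small sketch – every number in it is an invented assumption for illustration, not a measurement from any real PLM deployment:

```python
# Back-of-envelope estimate of product lifecycle data volume.
# All figures below are hypothetical assumptions.
parts = 50_000            # parts in a mid-size product portfolio
revisions_per_part = 8    # average revisions over the lifecycle
cad_mb_per_revision = 15  # average CAD file size per revision, MB
docs_mb_per_part = 40     # specs, ECOs, simulation results per part, MB

cad_total_gb = parts * revisions_per_part * cad_mb_per_revision / 1024
docs_total_gb = parts * docs_mb_per_part / 1024
print(round(cad_total_gb), round(docs_total_gb))  # prints: 5859 1953
```

Even with these modest assumptions, a mid-size portfolio lands in the multi-terabyte range before manufacturing and field data are counted.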

What is my conclusion? I think understanding the sizing of product lifecycle data is important in order to build the right operational and strategic plans for data management. Data is growing fast. Future PLM implementations can suffer from problems related to data sizing. How to scale up a PLM implementation in terms of size can become one of the most important questions in the future. Just my thoughts…

Best, Oleg


PLM and Open Source Big Games

July 1, 2010

I have been paying more attention to open source lately. What I wanted to analyze is how open source will influence the future enterprise software landscape and what it means for Product Lifecycle Management software. Open source has never been on a simple track in enterprise organizations. So, I’d like to put down some of my analysis to understand whether the future of PLM can gain a competitive advantage from open source projects.

Big Player and Changes

The original open source invention contradicted lots of established rules in enterprise software, and mixed usage of FOSS software and components faced significant legal challenges. In addition, the valuation and other business characteristics of open source companies puzzled lots of analysts and raised many discussions. However, I can see some significant changes over the last couple of years. Together with the growth of community-based development, FOSS companies started to leverage alternative sources of revenue. You can take a look at the following CNET article from last year, “Open-source M&A: The scorecard to date”, which mentions some interesting numbers related to OSS dividends.

Another article, “How many billions is open-source software worth?” by Computerworld, also from last year, contains some estimates of what the OSS industry is worth. What about $387B? Sounds big? This is the number Black Duck Software, a Boston-based firm, came up with. The following part is especially interesting:

That’s the number that Black Duck Software came up with. Black Duck isn’t an open-source ISV (independent software vendor). The Boston area company started as an IP (intellectual property) risk management and mitigation company, but has since grown into an open-source legal management firm. Since Black Duck was founded in 2002, the company has been tracking all known open source on the Internet. According to their research, there are over 200,000 open-source projects representing over 4.9 billion lines of code. To create that code from scratch, Black Duck estimates that “reproducing this OSS would cost $387 billion and would take 2.1 million people-years of development.”
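A quick sanity check of these figures (using only the numbers quoted above) shows they hang together:

```python
# Figures from the Black Duck estimate quoted above.
lines = 4.9e9          # lines of open source code
cost_usd = 387e9       # estimated cost to reproduce, USD
person_years = 2.1e6   # estimated effort, person-years

lines_per_person_year = lines / person_years
cost_per_person_year = cost_usd / person_years
print(round(lines_per_person_year))   # prints: 2333
print(round(cost_per_person_year))    # prints: 184286
```

Roughly 2,300 lines and $184K per person-year – both plausible figures for commercial software development, which suggests the headline number is at least internally consistent.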

Monetizing Open Source Trajectories

Another interesting article with a catchy name: “Has Oracle Been a Disaster for Sun’s Open Source?” Worth a read. I recommend you have a look. Oracle has become a provider of open source software. However, the open source trajectories inside Oracle are not very happy. It is still not clear how OSS software will be integrated, developed and promoted by a business like Oracle.

The problem is that Oracle is naturally trying to optimize its acquisition of Sun for its own shareholders, but seems to have forgotten that there are other stakeholders too: the larger open source communities that have formed around the code. That may make sense in the short term, but is undoubtedly fatal in the long term: free software cannot continue to grow and thrive without an engaged community.

In addition, I can see more attempts to monetize existing open source projects. I got an invitation to the “Lucene Revolution” conference by a company named Lucid Imagination; it will happen in Boston in October this year. Lucene is a well-known open source search platform. Lucid is trying to play a “Red Hat” game in search, which can be an interesting monetization strategy for established OSS brands.

PLM and Open Source

If we take a look at the PLM open source landscape, we can see Aras pushing OSS trends forward. I think it is getting more traction and interest. However, I see Aras’ activity as somewhat unbalanced in the following two areas: 1/ development of communities; 2/ leveraging the existing OSS software stack. Probably, Aras’ roots and relationship with Microsoft put some restrictions on their OSS-related positioning. Despite these two concerns, I can see significant interest from the customer side in what Aras is doing in open source PLM.

What is my conclusion? Customers in engineering and manufacturing organizations are looking for changes in enterprise software. FOSS is one of the potential directions to make it happen. Usage of OSS can save cost and, therefore, provide benefits to end users. With a lower cost base, enterprise vendors can be innovative in developing new business models which, in the end, also benefit end users. As I have had a chance to mention before, community development and cross-usage of existing OSS platforms and tools can be very interesting.

Best, Oleg

