PLM and Unknown Unknowns Use Cases

April 30, 2013

The recent tragic events in Boston raised again the question of the critical role of real-time information integration. You may think it has nothing to do with engineering and manufacturing software. Until recently, I saw it exactly the same way. However, with the latest trends in the development of data and information systems, I can see how big data and data analytics can be used by enterprise business software too. Going back to the events of 9/11, Donald Rumsfeld, US Secretary of Defense, stated at a briefing: ‘There are known knowns. There are things we know that we know. There are known unknowns. That is to say, there are things that we now know we don’t know. But there are also unknown unknowns. There are things we do not know we don’t know.’ Originally, the “unknown unknowns” statement was dismissed as nonsense. However, if we think twice, the concept of unknown unknowns is relevant to many companies in manufacturing.

One of the key roles of PLM these days is to help companies innovate. There are many definitions of “innovation”. You can think about innovative organizations or innovative processes. Here is the thing: most companies these days are afraid of being “surprised” by innovation coming from unknown innovators, competitors and other factors – new economic conditions, financial impacts, new product segments, cross-domain innovation, etc.

In my view, the key to preventing the impact of “unknown unknowns” is better analysis of the data inside and outside your company. Companies own a lot of business data, stored in databases and mainframes behind the firewall. This is the “known knowns” area, where business decisions are generally made based on historical data. This is where PLM/PDM operates today. There is also lots of data, mostly unstructured, that resides in emails, blogs, websites and elsewhere on the internet. This is the territory of “known unknowns”, which companies dealing with big data are trying to conquer today. The biggest danger comes from unknown unknowns, and we need a solution to address it.

What is my conclusion? There are many things that can influence manufacturing organizations. We live in a very dynamic world: market conditions change, new competitors enter markets in very disruptive ways, financial markets exert influence, employees turn over. These are the “unknown unknowns” of PLM, and a space for future innovative solutions that software vendors can bring to market. Just my thoughts…

Best, Oleg

PLM systems and web scale API trajectories

April 28, 2013

APIs. Just to eliminate any possible confusion, API stands for Application Programming Interface. Wikipedia provides a very straightforward definition of API here. For many years, APIs have been an important element of the PDM, PLM and wider enterprise software business. APIs enable a system to be expanded, customized and tailored to any customer’s needs. For many years, the availability of an API was considered a ‘must have’ during any PDM/PLM system evaluation. Thinking about API usage in PDM/PLM systems, I can segment it into two different aspects: 1) the ability to customize specific system behavior, such as the data model or user interface appearance; 2) the ability to integrate the system with the surrounding eco-system (e.g. desktop applications, other enterprise applications, etc.)

The second aspect – using APIs to integrate with the surrounding eco-system – is getting more and more interesting. Almost a year ago, I posted on my blog – Why PLM need to learn Web APIs? If you haven’t had a chance to read it, catch up now, please. For me, the growing web ecosystem provides an interesting preview of what is going to happen next in the enterprise software world. In parallel, let me remind you of another blog post – Mobile PLM and BaaS. The web provides a good picture of how systems can be expanded – through web and mobile apps. This is the way systems will expand, grow, and get customized and tailored in the future.
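To make the integration aspect a bit more concrete, here is a minimal sketch of what calling a web-style PLM API could look like. The endpoint path, query parameters and token scheme below are my own assumptions for illustration only – they are not any vendor’s actual API.

```python
from urllib.parse import urlencode

class PlmApiClient:
    """Hypothetical client for a web-style PLM API (illustrative only)."""

    def __init__(self, base_url, api_key):
        self.base_url = base_url.rstrip("/")
        self.api_key = api_key

    def build_request(self, resource, **params):
        """Compose the URL and headers for a GET request; actually sending
        it over the network is omitted from this sketch."""
        query = "?" + urlencode(params) if params else ""
        url = f"{self.base_url}/api/v1/{resource}{query}"
        headers = {
            "Authorization": f"Bearer {self.api_key}",
            "Accept": "application/json",
        }
        return url, headers

client = PlmApiClient("https://plm.example.com", "secret-token")
url, headers = client.build_request("items", type="Part", revision="A")
print(url)  # https://plm.example.com/api/v1/items?type=Part&revision=A
```

The point of the sketch is that an integration built on plain HTTP, JSON and tokens can be consumed by any surrounding application – desktop, web or mobile – without a proprietary SDK.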

The following article was published on TechCrunch just a few hours ago – Facebook And The Sudden Wake Up About The API Economy. Take a read – I found this article very solid. It made me think that the importance of Web API infrastructure and openness among PLM systems will grow faster than I thought last year. A few acquisitions and partnerships in the world of API makers – Intel / Mashery, MuleSoft / ProgrammableWeb and, very recently, Facebook / Parse – demonstrated, in my view, that interest in back-end application data combined with mobile app development is skyrocketing. APIs are the key to making this data available and turning it into profitable mobile (and other) apps. Here is an interesting passage from the TechCrunch article.

With that scaling in number of APIs comes a virtuous circle for the developers that build compelling apps and APIs. The APIs extend the apps’ reach as they become part of a distributed data network. As more people use the APIs, so the app developer generates more data. As the data increases in scope, often the service will become an API. Facebook needs new streams of data to keep rolling out new digital products. Back-end-as-a-service providers like Parse provide SDKs and APIs that give developers access to infrastructure for storing basic data types, locations and photos. How Facebook uses this data is a question mark. But regardless, Parse serves as a constantly replenishing source, nourished by the apps on the Parse platform that use APIs. Facebook now will decide how to package and segment that data to push more relevant advertising to its 1 billion users.

What is my conclusion? I would like to draw a parallel between web platforms scaling to a billion users and PLM platforms searching for a runway to grow within every manufacturing company and to link manufacturers with customers, suppliers and the business eco-system. A distributed data network is the key thing. Current PLM platforms, designed for an “inside-the-firewall” reality, have a hard time scaling to web reality. To future PLM customers and evaluators – check your PLM platform of choice for the ability to grow beyond servers located in your backyard. PLM vendors must take note of their future scale and ability to expand the ecosystem. Just my thoughts…

Best, Oleg

PLM, Bill of Materials and Silo Syndrome

April 26, 2013


Are you familiar with the term “silo”? When it comes to enterprises and large organizations, we can often hear about different silos of information. Here is the definition of an information silo as it appears on Wikipedia.

An information silo is a management system incapable of reciprocal operation with other, related information systems. For example, a bank’s management system is considered a silo if it cannot exchange information with other related systems within its own organization, or with the management systems of its customers, vendors, or business partners. “Information silo” is a pejorative expression that is useful for describing the absence of operational reciprocity. In information technology, the absence of operational reciprocity between disparate systems is also commonly referred to as disparate data systems. Derived variants are “silo thinking”, “silo vision”, and “silo mentality”.

Very often, you can hear about “information silos” in a very negative context. Here are typical reasons why silos are bad – they kill productivity, hurt information transparency, etc. A recently published PR Newswire article defines a new term: silo syndrome. Navigate to the following link to read Thousands of Companies Diagnosed with Dreaded ‘Silo Syndrome’. The article defines a list of so-called “silo syndrome symptoms”:

- An inability to immediately access business information
- Searching for answers but never really finding them
- Problems processing terms like “unstructured content”
- A penchant to unnecessarily flatten relational data
- Inability to join concepts together in real time
- Needlessly accessing multiple systems for “what” and “why” answers

PLM propaganda very often promotes the value of PLM in overcoming the problem of organizational silos. Here is one of many marketing examples of PLM value connected to org silos, coming from an Oracle Agile PLM article on IT Toolbox.

PLM by definition is concerned with tracking and controlling product-related business processes that span multiple departments across an extended period of time. Each of these departments may utilize differing systems. Tracking a product’s lifecycle will often present the need to gather and share information with ERP, CRM, inventory, manufacturing, supply chain, logistics and other systems. While some off-the-shelf integration may be available, current PLM users often find themselves faced with a frustrating level of manual re-entry or poor visibility of information and processes trapped in so-called silos. Overcoming these integration challenges can mean that an organization is liberated to find the true value of PLM: more innovative, market-responsive products, faster time-to-market, faster time-to-volume, more efficient change management, better customer care, and superior obsolescence strategies. These benefits can be achieved by both process and discrete manufacturers.

The reality of PLM and silos is difficult. The main place where PLM faces organizational silos is Bill of Materials (BOM) management. For manufacturing organizations, creating and maintaining multiple Bills of Materials is a straightforward way to split responsibilities and control. Requirements, Engineering, Manufacturing, Sales, Support, Supply… you name it. Every department and organization requests its own “my BOM”, which allows it to control and manage the information the way it wants. The real challenge comes later, when people demand that the PLM system take care of multiple BOMs and the information transformation between these BOM silos.
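To illustrate what that transformation between BOM silos involves, here is a tiny sketch of deriving a manufacturing BOM (MBOM) from an engineering BOM (EBOM). The part numbers and the rules (dropping design-only specs, adding process materials) are invented for illustration – real transformations are far richer, which is exactly where the complexity comes from.

```python
# Engineering view of a (made-up) product structure.
ebom = {
    "BIKE-001": ["FRAME-100", "WHEEL-200", "WHEEL-200", "PAINT-SPEC-1"],
}

# Manufacturing view: drop design-only items, add process materials.
design_only = {"PAINT-SPEC-1"}                # specs that don't ship as parts
process_items = ["PAINT-RED-5L", "GREASE-10G"]  # consumed on the shop floor

def ebom_to_mbom(ebom):
    """Derive a manufacturing BOM from an engineering BOM."""
    mbom = {}
    for assembly, parts in ebom.items():
        kept = [p for p in parts if p not in design_only]
        mbom[assembly] = kept + process_items
    return mbom

print(ebom_to_mbom(ebom))
```

Every department silo (sales, service, supply) adds another transformation like this one, each with its own rules to encode and keep in sync.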

What is my conclusion? Today, PLM has had limited success in eliminating organizational silos by introducing support for multiple Bills of Materials. In many situations, PLM does not eliminate the need to re-create information. What customers demand are sophisticated BOM management tools that allow them to maintain multiple BOM silos in the organization. In practice, manufacturing organizations are not interested in eliminating BOM silos. People want to keep information silos, but have a PLM system that helps them manage those silos. The result is skyrocketing complexity of PLM systems and implementations. So, do we need to preserve silos? It is a good question to ask before approaching your next PLM BOM implementation. What is your take? Speak up.

Best, Oleg

Product Lifecycle Exhaust

April 25, 2013

Do you think big data and NoSQL are the latest and coolest trends in the data world? No way. Software architects and geeks are sleepless, hunting for new and unknown trends and opportunities. Last week I attended COFES 2013 in sunny Arizona. The following buzzword caught my attention during one of the presentations: Data Exhaust.

I tried to find a good definition of what this term means. There is no consolidated view. The best explanation of Data Exhaust I found is on the IT Law Wiki. Navigate your browser here. It provides four different definitions. The following one resonated most with my way of thinking about data exhaust:

The "aggregation of [consumer] data through the digitization of processes and activities" in the commercial sector which generates metadata supporting corporate profit generation.

Here is a picture I captured during COFES 2013 presentation. It shows the idea of data exposed out as a result of mobile device usage.

Data exhaust is tightly connected to some notions of big data. Another interesting piece I captured was a publication from the O’Reilly Strata website. Navigate to the following link to read the article – Tertiary data: Big data’s hidden layer. The article is worth reading. We produce lots of data these days, and this data can be very valuable. Unfortunately, we are far behind in our ability to capture the data we produce and get value from it. Here is an interesting passage:

Back in the days of floppy disks, the lines of ownership were pretty clear. If you had the disk, the data was yours. If someone else had it, it was theirs. Things these days are much blurrier. That tertiary data — data that’s generated about us but not by us — doesn’t just build up on your mobile devices of course. Other people are building datasets about our patterns of movement, buying decisions, credit worthiness and other things. The ability to compile these sorts of datasets left the realm of major governments with the invention of the computer. We’re all aware of this, and there’s even a provocative buzzword to describe it: data exhaust. It’s the data we leave behind us, rather than carry with us.

I captured the following picture from the same article. It shows a visualization of iPhone location tracker.

The data exhaust conversation made me think about Product Lifecycle Exhaust. In everything PLM does today, we are very focused on how data is created during the engineering and manufacturing stages. PLM products pay little to no attention to the information products produce during their lifecycle. The situation is better for long-lifecycle products like airplanes and nuclear submarines. But that is where PLM attention to lifecycle information ends.
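As a thought experiment, here is a minimal sketch of what mining product lifecycle exhaust could look like: aggregate the usage events a product emits in the field and spot features nobody uses. The device IDs, feature names and event format are all hypothetical.

```python
from collections import Counter

# Hypothetical "lifecycle exhaust": usage events emitted by products
# in the field. Names are invented for illustration.
events = [
    {"device": "A", "feature": "report_export"},
    {"device": "A", "feature": "3d_view"},
    {"device": "B", "feature": "3d_view"},
    {"device": "B", "feature": "3d_view"},
]

all_features = {"report_export", "3d_view", "batch_import"}

usage = Counter(e["feature"] for e in events)   # how often each feature is used
unused = all_features - set(usage)              # features with zero field usage

print(usage.most_common(1))  # [('3d_view', 3)]
print(unused)                # {'batch_import'}
```

Even this trivial aggregation answers a question engineering rarely gets data on: which features earn their maintenance cost, and which could be simplified away.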

What is my conclusion? Cost and quality are the two top priorities of every manufacturer. In my view, data exhaust can be an interesting source of information about how to improve quality and reduce cost. We can learn about the usage experience of our products, discover which features are not used by customers, and learn how to optimize products to serve our customers better. Just my thoughts. Do you see it the same way? Speak up. I want to know your opinion. If it resonates, come with examples, please.

Best, Oleg

PDM: Rightsize, Wrongsize, Overkill?

April 23, 2013

I want to talk about PDM today. Product Data Management is not a new topic. Companies have been using PDM for many years. However, here is the deal – after many years of PDM deployments, customers are still trying to avoid implementing PDM. You may think it is mostly small and medium companies, but that is not true. I’ve seen many large companies that use file and folder structures to manage their design files and revisions.

Is PDM overkill? The issue of “sizing” PDM and PLM isn’t new either. Navigate to one of my previous posts about it – PLM: Rightsizing vs. Wrongsizing Debates. My conclusion there – size doesn’t matter. The two major aspects of PDM/PLM deployment are user experience and cost of ownership. PLM needs to focus on these sooner rather than later.

I wasn’t alone in the discussion about PDM for SMB and rightsizing. My industry friends Chad Jackson and Jim Brown covered this topic in their regular talk show – Tech4PD. Navigate to this link to read more – Is Product Data Management Overkill for Small Design Teams? If you have a few minutes, watch the video.

The discussion made me think about two significant issues that make PDM overkill for engineers. First and most important – nobody likes data management. If you are in a big company and your boss tells you to use PDM, you do. However, if you have even a chance not to use PDM, you won’t miss that opportunity. Design is cool, but data management is boring stuff. So, if it is an absolute need, you agree to use it. And here is where the second issue comes in – cost! The issue of cost comes up faster than you expect. Deploying PDM is not a cheap job, even for a small company with 5-10 users.

What is my conclusion? The last 15 years of PDM didn’t solve the problems of user experience and cost. We still see high-cost PDM systems and user experience from the last century. Customers are demanding a new type of tool, going beyond what mainstream PDMs (e.g. SolidWorks Enterprise PDM and Autodesk Vault) are capable of providing. The time for innovation is coming. Just my thoughts…

Best, Oleg

How can PLM adopt Sales 2.0?

April 22, 2013

The last decade was all about 2.0. To me, the 2.0 trend was about rethinking existing norms and behaviors, reinventing the well-known and challenging existing axioms. The internet changed a lot in our lives over the last decade. One place that remains very conservative is enterprise sales. If you are a social web enthusiast, tech geek or iOS developer, try to speak to people selling to big enterprises. You will find yourself in foreign territory after the first 5 minutes of conversation. However, I can see changes coming to this place too. In the beginning of this year, I tried to challenge a friend from an enterprise sales department. If you missed my previous blog, navigate here – PLM, Viral Sales and Enterprise Old Schoolers. One of my conclusions after that post was that even though enterprise sales has strong roots, the time is coming to challenge the status quo.

I’ve been catching up on email and social media during my long journey from Boston to Singapore this weekend. The following GigaOM article caught my attention – Enterprise 2.0: The science of inside sales. Take five minutes of your time and read the article. Freemium was the king of the road in the consumer market for the last 3-5 years. It seems to me everybody has read the “Free” bible by Chris Anderson. I remember my note back in 2009 – Is Free the future of PLM? What I found very interesting in the GigaOM article is the idea of merging two parallel models – freemium and direct sales.

At the end of the day, it is all about setting cost and price. If your cost of sale is high, you have no chance to scale up and sell to the mainstream market. I found the following passage important:

In contrast to traditional outside sales, which is done in-person and tends to involve extensive travel and time expenditures, inside sales is professional B2B sales done remotely via phone, email and chat. It is strategic selling that requires managing a deal through a multi-stage process, multiple touch points with the customer, establishing value and an ROI for the product and supporting complex purchasing methods, like procurement departments, but importantly without visiting the customer.

Another interesting snippet presents a cost vs. price model that can take you beyond the threshold of a free online business and allow you to have sales people.

So how do you know if you’re ready to build an inside sales team? Truthfully, if the product is shipping it’s never too soon. A key test is the price at which you are converting free users to paid. There are a lot of apps that only charge 99 cents or $4.99 a month for the premium version. That won’t cut it – your margins won’t support a sales force. You’ll need a price point of at least $25 to $50 per user per month to validate the value of your product and make enterprise sales work. At that price or above, a workgroup of 10 to 20 users can be sold within a customer account for $5,000 to $10,000 per year. Over time, you’ll be able to increase the deal sizes through premium features like administrative functionality.
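The deal-size arithmetic in the passage above is easy to sanity-check: monthly price per user times number of users times 12 months gives the annual deal size. The quoted $5,000–$10,000 range is approximate; the boundary combinations look like this:

```python
def annual_deal_size(price_per_user_per_month, users):
    """Annual contract value for a workgroup: price * users * 12 months."""
    return price_per_user_per_month * users * 12

# At the low and high ends of the quoted price range, for a 20-user workgroup:
print(annual_deal_size(25, 20))  # 6000
print(annual_deal_size(50, 20))  # 12000
```

At 99 cents a month, the same 20-user workgroup yields only about $238 a year – which is exactly why the passage says that price point can’t support a sales force.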

And finally, you can see an enterprise example, which can probably make sense for PLM sales too.

The typical inside rep will make $40,000 to $60,000 per year in base salary. Including bonus, their on-target earnings (OTE) will be between $100,000 and $120,000. Most Enterprise 2.0 startups are subscription businesses, so quotas should be tied to Annual Recurring Revenue (ARR) or Monthly Recurring Revenue (MRR) with accelerators for contract lengths greater than one year. A typical quota for your first rep is $500,000 of ARR. Over time, enterprise sales reps often settle around a $1 million quota.

The conversation about inside sales made me think about what PLM vendors can potentially do to step into the future of Sales 2.0. Here are three fundamental steps:

1. Delivery model. You should be able to deliver your software without CDs/DVDs and without people who need to come and install it. Call it cloud, online, distributed software – it doesn’t matter. You need to get rid of the traditional delivery mechanism driven by traditional software development methods and long-awaited releases.

2. Online configuration. After you learn to deliver software online, you need to switch your army of consultants, implementers and service providers to working online. Stop paying for airline tickets. All software configuration and tailoring must be done online.

3. Application granularity. The nature of the application is going to change. We should move away from the monolithic nature of enterprise software. In the past, it was important to sell “all in one box”. To support the “inside sales” model, business software needs the ability to be deployed in a granular way. Some portions of the application can be provided for free; then the configuration should allow turning on licensed features and voilà – you have converted your free customer into a paying one.
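The third step can be sketched with a simple feature-flag licensing model. The feature names and free/paid split below are assumptions made up for illustration – the point is only that an online configuration change, not a new installation, flips a user from free to paid.

```python
# Invented feature tiers for a hypothetical granular PLM application.
FREE_FEATURES = {"view", "search"}
PAID_FEATURES = {"workflow", "bom_management", "admin"}

class License:
    """Per-customer license that starts on the free tier."""

    def __init__(self):
        self.enabled = set(FREE_FEATURES)

    def upgrade(self, features):
        """Online conversion: switch on purchased paid features."""
        self.enabled |= set(features) & PAID_FEATURES

    def can_use(self, feature):
        return feature in self.enabled

lic = License()
print(lic.can_use("workflow"))   # False -- free tier
lic.upgrade({"workflow"})
print(lic.can_use("workflow"))   # True -- paid feature turned on online
```

Because the whole application already ships, the “sale” reduces to flipping flags in a configuration service – exactly the kind of transaction an inside sales rep can close over the phone.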

What is my conclusion? The new technology is ready for the enterprise. This has been proven by multiple startup companies and by the giants of the consumer software business. However, enterprise companies are tricky, and enterprise sales even more so. Both sides – sellers and buyers – are holding on to the existing enterprise sales model. It is their life jacket, keeping the existing enterprise sales model afloat. The time is coming to disrupt it. Just my thoughts…

Best, Oleg

The future of PLM Glassware?

April 19, 2013

Technological predictions are tough, and nobody wants to make them. Back in 2010, I came up with the following post – Who Can Generate 3D/PLM Content For Apple iPad? At that time, the value of the iPad was questioned by many people. Speaking about manufacturing companies, people were very skeptical about the ability of the iPad to deliver meaningful functionality.

Fast forward to 2013. I’m sure you’ve heard that Google Glass is coming. If you don’t know what that is about, navigate your browser here. Initial accounts of the Google Glass experience are starting to leak into the blogosphere. Navigate your browser to the following article – Google Glass is finally here: Tech specs released, first units shipped. The article lists some snippets of the Glass specification, such as display resolution (similar to a 25″ monitor), 16 GB of flash storage, a 5MP camera with the option to record 720p video, and sync with 12GB of Google cloud storage.

Google is paying a lot of attention to the developer community. Navigate to the following article – Google publishes Glass’ Mirror API preview for developers. It contains a link to the Google Glass Mirror API that you might find useful.

Here is another article that caught my attention – 10 things about Google Glass: Could this be Google’s iPad? Many Glass use cases are clearly individually oriented. At the same time, the author raises some initial questions and thoughts about business uses of Glass and how Glass can be connected to corporate accounts. Here is an interesting passage:

Consumer Google accounts can be connected to Google Glass. No corporate connections yet. The real interesting connection for enterprises would be service-oriented businesses and Google Glass. For now, Google Glass is all about individual accounts. Google Apps access will certainly follow at some point. The business implications for Google Glass will appear later. Google Glass could become a productivity tool. Presentations, location data, sales information and real-time information on the go could be handy. You could also picture a person on an oil rig giving a real-time, real-world view of a product to a manager in Dubai.

A few years ago, Microsoft and BMW released a video – Manufacturing Future Vision. Watch it below. You will find it funny, but many of the concepts related to the tablet computing world are actually reality now.

However, I want you to pay attention to a few examples below, very similar to what we can see in futuristic videos of the Google Glass interface.

What is my conclusion? The analogy between Glass and the iPad is very strong. Only a few years ago, the iPad introduced a completely new experience. Now, we see the tablet computing experience everywhere in our everyday lives. Business usage of tablet computers is skyrocketing. I can see the Glass experience changing some businesses as well. We are going to see many Glassware use cases that will change company processes. Just my thoughts…

Best, Oleg

How Amazon helps cloud PLM connect to enterprise data

April 16, 2013

Face it: even though the cloud is trending and growing fast, on-premise systems still represent a major part of engineering and manufacturing systems in organizations. This includes ERP, CRM, PDM and PLM systems, as well as zillions of Excel and CAD files. I’ve been thinking about how to optimize cloud/on-premise data co-existence. My attention was caught by the news about Amazon Storage Gateway. Amazon, in its push to draw more enterprise customers, had to make sure the Amazon Storage Gateway would run in Microsoft Hyper-V virtualized shops, which expands Amazon’s ability to synchronize data between cloud and on-premise environments.

For those of you not familiar with ASG (Amazon Storage Gateway), navigate to the following link to learn more. The AWS Storage Gateway supports two configurations:

1/ Gateway-Cached Volumes: You can store your primary data in Amazon S3, and retain your frequently accessed data locally. Gateway-Cached volumes provide substantial cost savings on primary storage, minimize the need to scale your storage on-premises, and retain low-latency access to your frequently accessed data.

2/ Gateway-Stored Volumes: In the event you need low-latency access to your entire data set, you can configure your on-premises gateway to store your primary data locally, and asynchronously back up point-in-time snapshots of this data to Amazon S3. Gateway-Stored volumes provide durable and inexpensive off-site backups that you can recover locally or from Amazon EC2 if, for example, you need replacement capacity for disaster recovery.
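The Gateway-Cached pattern described above can be modeled in a few lines. This is a conceptual sketch of the idea only – primary data in remote (S3-like) storage, hot data served from a local cache – and not the actual AWS Storage Gateway API.

```python
class CachedVolume:
    """Toy model of a gateway-cached volume: primary data lives remotely,
    frequently accessed data is kept in a local cache."""

    def __init__(self, remote_store):
        self.remote = remote_store  # e.g. a dict standing in for S3
        self.cache = {}             # local low-latency copy of hot data

    def write(self, key, data):
        self.remote[key] = data     # primary copy goes to the cloud
        self.cache[key] = data      # keep a local copy for fast reads

    def read(self, key):
        if key in self.cache:       # hot data: served locally
            return self.cache[key]
        data = self.remote[key]     # cold data: fetched from the cloud
        self.cache[key] = data
        return data

s3 = {}
vol = CachedVolume(s3)
vol.write("design.cad", b"geometry")
print(vol.read("design.cad"))  # b'geometry' -- served from the local cache
```

The Gateway-Stored configuration simply inverts the roles: the local store is primary and the remote side holds asynchronous point-in-time snapshots.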

The two configurations represent an interesting model for how enterprise data can co-exist across cloud and on-premise environments. I can see mid-size companies doing this to optimize their file storage. Larger companies can use it for extended value-chain communication.

What is my conclusion? As cloud systems expand in organizations, the demand for hybrid environments will grow as well. Companies won’t be able to migrate enterprise data assets outside their organizations quickly, so cloud PLM solutions that can communicate and co-exist in hybrid deployments will grow. The ability to connect existing enterprise data assets with cloud apps is key to future cloud expansion. Just my thoughts…

Best, Oleg

COFES 2013 – Notable discussions and moon photo shoots

April 15, 2013

I attended COFES 2013 last week. For those of you not familiar with the event, navigate your browser to the following link. COFES stands for The Congress on the Future of Engineering Software. COFES takes place in the same location every year in April. You can see the complete agenda of COFES 2013 here. The two key elements of COFES are people and discussions. You will want to navigate to this link to see who attended COFES this year.

There are a few sessions and discussion topics that I especially liked. One of them was a round-robin on the future of computing. It was fascinating to see a unique blend of presenters clearly representing three different generations of developers: Mike Riddle (the original author of AutoCAD), Jon Hirschtick (co-founder of SolidWorks) and Kai Backman (co-founder of Tinkercad and Airstone Labs).

I had a chance to speak with Simon Floyd of Microsoft about Windows 8, multi-device strategies and PLM applications. Microsoft was one of the dominant presences at COFES 2013, with technology suites presenting Windows 8 and CAD/PLM apps running on at least 10-12 different devices. In a world of multiple devices and multiple applications, it was very interesting to see what Microsoft is trying to accomplish with its Windows 8 strategy.

Another interesting discussion, led by Siemens PLM, was about the development of eCars. No, it wasn’t about how Siemens is going to compete with Mercedes, BMW and Audi. Siemens is introducing an approach that uses the energy of plug-in electric cars to optimize the electrical power network in a country like Germany. One of the most striking examples was a plan to update car software when a car moves from one European country to another, to comply with country-specific standards and regulations.

Esther Dyson’s keynote and Q&A were impressive. A surprise was learning that Esther speaks Russian and invests in Russian companies like Yandex and some others. My favorite topic of simplification and cost came up in the discussion of an early FedEx innovation – transport was cheaper than complexity.

Finally, the discussion with the intriguing name “EoL 4 Email” actually morphed into a conversation about the future of email. The conclusion – email is not going to die, but will transform into an online messaging system with rich content delivery and contextual actions. I will probably follow up with a separate post about this.

There is no way to participate in all the conversations and discussions at COFES. Discussions are everywhere – in hallways and walkways, around the tables and, in the evening, under the stars. Below are a few photos I took during the COFES Evening Under the Stars.

The following photo was taken with my new EOS-M and a telescope provided by COFES.

Best, Oleg

Will the new jazz product development model work for PLM?

April 13, 2013

The world around us is changing much faster these days. It happens in many places. Business environments are much more dynamic. New technologies are disrupting existing industries and eco-systems. PLM systems were developed to help companies manage and follow product development processes within companies and their extended eco-systems. As businesses go through these changes, we can see a need to change the way product development processes are organized. The question I want to ask is how this will impact and change PDM/PLM development.

Originally, PDM/PLM systems were built to bring order to a chaotic world of technical and engineering data. You probably remember the early acronyms used for these purposes – EDM/TDM. For many years, the fundamental proposition of PDM/PLM systems was to organize centralized data storage and to define rules to store and manage data. It certainly helped companies, in many senses, to move from chaos to order. However, with all the modern changes in the business ecosystem, it is getting harder to reach the final state of PDM/PLM implementations. These implementations are unsustainable in the face of frequent change.

I listened to Jon Hirschtick earlier this week at COFES 2013. Jon talked about changes in the culture of programming tools. Programming and product development methods and technologies are going through lots of changes these days.

One of the analogies Jon made resonated with me: the jazz product development model. I found it funny and true at the same time. We are going from the closed world of controlled development systems to open source, with a community of developers replacing manuals. This is a very interesting change – a community and development eco-system can provide a much higher level of agility.

It made me think about different approaches to developing PDM/PLM systems. What is the potential of using agile product development methods, an open-source eco-system and the power of communities?

What is my conclusion? The changes are coming. The internet and open source helped develop a different eco-system, completely different from what we have known for many years in PDM and PLM. I’m curious if and how the new model will influence the development of enterprise software. I think vendors need to take note. Openness, flexibility and agile methods are trending. These are important and irreversible trends, in my view. Some companies will lose touch with reality and disappear. However, the smart ones will evolve. These are just my thoughts. What is your take? Speak up…

Best, Oleg

