3 reasons why size won’t matter in the future of PLM

August 21, 2014


The debate about small vs. large PLM implementations is probably as old as PLM software itself. Joe Barkai recently came out with a rather provocative blog series – Is PLM Software Only for Big Guys? One of the posts – Do PLM Vendors Think SMBs are Just Like Large Enterprises, Only Smaller? – includes the following passage:

In my market research in PLM, PDM and related fields, and consulting work with engineering organizations, I often find that SMBs don’t think of themselves as being just like the “big guys”, only smaller. They believe they possess different culture, work habits and operational models, and perceive PLM as a tool ideally suited for large organizations with sizable engineering teams designing complex highly engineered products.

Another of Joe’s posts asks – Can PLM software benefit a small company?

Looking at the profile and size of engineering companies using PDM software, especially those showcased by mainstream PDM and PLM vendors, one might easily reach the conclusion that these systems are, indeed, designed with the “big guys” in mind. This perception may be reinforced by PLM and ERP vendors that have announced products designed for the SMB market and abandoned them a few years later, when rosy revenue expectations weren’t achieved. Remember, for example, PTC’s ProductPoint and SAP’s Business By Design? Small engineering teams have come to think of PLM software as unnecessarily complex and limiting operational flexibility, not to mention the high cost of the software, IT overhead, and the pain of keeping the software up to date.

It is true that, historically, CAD and PDM systems came out of the large defense and aerospace industries. Since then, much of the innovation in PDM, and later in PLM, has been about simplifying complex and expensive solutions – making them simpler, more usable and more affordable. 80% of the functionality for 20% of the price… It worked for some CAD vendors in the past. Is it possible in PLM? PLM systems have fallen into the simplification trap many times. As soon as a new affordable solution came out for SMEs, large enterprises demanded it as well. You can hear the opinion that price was the key factor and that PLM vendors never found a way to sell both enterprise and SME solutions with the right packaging and price differentiation. I’m not sure that is true, but shutting down an SME-focused PLM solution is not uncommon in the PLM industry.

I have shared some of my thoughts about why PLM vendors failed to provide solutions for SMEs. One of my conclusions was that cost and efficiency are the key elements that can help PLM vendors develop a solution for this challenging market segment.

However, Joe’s posts made me think one more time about the “small vs. large” PLM challenge. I want to share with you my 3 hypotheses about why size won’t matter for future PLM solutions.

1. Horizontal integration

Large monolithic businesses with strong vertical integration are being displaced by granular, and sometimes independent, business units with diverse sets of horizontal relationships. Businesses are looking to optimize cost in everything – development, supply chain, manufacturing, operations. I imagine these businesses will demand a new type of PLM solution that can be used by a network of suppliers and business partners rather than by a single vertically integrated organization.

2. Technological transformation

In the past, many PDM and PLM vendors treated an SME solution as something that didn’t need to scale much and could run on cheaper hardware, low-cost technology and a modest IT infrastructure. Cloud, web and open source trends have changed the landscape completely. While most existing PLM solutions still run on infrastructure developed 10-15 years ago, I can see vendors looking for new architectures and technologies that can, without question, scale to cover a diverse set of customers – small and large.

3. Business dynamics

The business environment is changing. Businesses are more dynamic. New requirements arrive often, and the expected time to deliver a new solution or a change has gone down from years to months. In such an environment, I can hardly imagine a monolithic PLM deployment surviving for a decade the way it did before. I would expect PLM vendors to think about a new type of platform and a set of agile applications serving a variety of business needs.

What is my conclusion? Business, technological and organizational changes will affect the future landscape of PLM platforms and applications. Small is the new big. New technological platforms will be able to scale to support a diverse set of customers. Vendors will move from shipping CDs to providing services out of public and private clouds. As a result, the difference between SME PLM and enterprise PLM will disappear. Future PLM solutions will come as platforms with a diverse set of agile applications. Just my thoughts…

Best, Oleg


PLM upgrades, release cycles and legacy software

August 20, 2014


Do you know what legacy software is? Earlier today, Marc Lind of Aras Corp. challenged me with his Twitter status about companies complaining about legacy PLM systems and upgrades. Here is the original passage from Twitter (here and here):

"a lot of people complains about legacy PLM and a lot of companies that have legacy PLM are throwing in the towel and switching these days".


The part of the statement about "legacy software" is really interesting. Last week, I wasn’t able to update a game on my son’s iPad. After a few minutes, I discovered that Apple no longer supports the original iPad hardware manufactured 4 years ago. Does that make the iOS software running on that iPad legacy? Good question. At the same time, what about properly functioning ERP software that a company has been running for the last 10 years with no plans to upgrade? Is that legacy software?

Wikipedia gives the following definition of a legacy system:

In computing, a legacy system is an old method, technology, computer system, or application program, "of, relating to, or being a previous or outdated computer system."[1] A more recent definition says that "a legacy system is any corporate computer system that isn’t Internet-dependent."[2]… The first use of the term legacy to describe computer systems probably occurred in the 1970s. By the 1980s it was commonly used to refer to existing computer systems to distinguish them from the design and implementation of new systems. Legacy was often heard during a conversion process, for example, when moving data from the legacy system to a new database.

Software upgrades are an important topic in engineering and manufacturing. Very often, systems stay in use for a very long time because of product lifecycles and the need to maintain existing data. It happens a lot in defense, aerospace and other "regulated" industries. Also, because of the significant investment involved, the ROI of an upgrade can be questionable, which leads companies to keep outdated systems in operation. I’ve posted about the problems of PLM customization and upgrades before – How to eliminate PLM customization problems and Cloud PLM and future of upgrades.

PLM vendors are aware of the issue of upgrades and the difficulties of software migrations. For a long time, the industry treated it as something unavoidable. However, in today’s dynamic business environment, the issue of software upgrades cannot be ignored. Customers demand flexible and agile software that can be deployed and updated fast. At the same time, the shift of business models towards services and subscriptions has pushed the problem of upgrades back to vendors.

Earlier this year, my attention was caught by a CIMdata publication – Aras Innovator: Redefining Customization & Upgrades. The Aras enterprise open source model is predominantly subscription oriented, which gives Aras engineers strong incentives to solve the issue of upgrades and new version deployment. Here is a passage from the article confirming that:

For several years, the Aras Corporation (Aras) has included no-cost version-to-version upgrades in their enterprise subscriptions, independent of how the solution has been customized and implemented. This is a rather bold guarantee given the historic challenges the industry has experienced with upgrading highly customized PLM deployments. With more than 300 upgrades behind it, CIMdata felt it appropriate to find out how Aras’ guarantee was playing out, and discovered that there was much more to the story than just a contractual guarantee. Fundamentally, Aras Innovator is engineered to be highly configurable—even customizable—without resulting in expensive and complex version-to-version upgrades and re-implementations.

One of the PLM software leaders, Siemens PLM, is also thinking about What is the best release cycle. The article discusses the Solid Edge release cycle.

A few years ago we moved from an irregular release cycle for Solid Edge, maybe 9 months in one cycle to 15 months in the next, to a regular cycle of annual releases (of course there are also maintenance packs delivered in the interim). I believe our customers much prefer this, they can plan ahead knowing that there will be a significant Solid Edge release available to them in August each year.

At the same time, the article confirms that CAD/PLM vendors are looking at how to solve the problem of upgrades. As I mentioned earlier, the cloud software model is one of the most promising technical ways to solve it. That is true, but it can be tricky when both desktop and cloud software are involved. Here is a passage from the same Siemens PLM blog:

Working in the PLM area we try really hard to provide our customers with a good upgrade experience. PLM software is itself dependent on both the operating system and database software, and it has to work with specific releases of CAD software (sometimes with more than one CAD solution for our multi-CAD customers) and with office software as well! Moving PLM software to the cloud could potentially take some of the upgrade issues away from the end user, but PLM software does not work in isolation from your data files, or your other software and systems so I believe there is much work still to be done before the cloud really impacts the upgrade situation for real-world customers.

What is my conclusion? From the customer perspective, the best option is to make the release cycle completely transparent. In my view, this is a really high bar for PLM vendors. Customer data migration, customization and the occasional absence of backward compatibility make release transparency questionable. However, as the industry moves towards cloud software and service business models, the demand for agile release management and the elimination of upgrades will keep growing. So, my hunch is that in the future we will not see "legacy software" anymore. A new type of enterprise software will manage upgrades and migrations without customers paying attention. Does it sound like a dream? I don’t think so. For most web and consumer software, it is already a reality today. Just my thoughts…

Best, Oleg


Apple iPhone 6 and cross-product-family BOM planning

August 18, 2014


Managing Parts and Bills of Materials is not a simple task. I shared some aspects of the complexity of Part Numbering last week in my post – Existing data prevents companies from improving Part Numbers. The discussion in the comments took me towards the complexity of Part Numbers in the supply chain. Here is the comment made by Joe Barkai:

…multiple BOMs with inconsistent numbering schema often hide a bigger problem: inconsistent attributes and metadata. I [Joe Barkai] worked with a global automotive OEM on issues surrounding architectural complexity reduction and global quality management. I discovered that each product line was using different part numbers. This was obviously difficult to manage from a supply chain perspective. But, not less importantly, other metadata and data attributes such as failure modes, labor operation codes and other important information were codified differently, rendering cross product line reporting and analysis difficult and potentially lacking, if not erroneous
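To make Joe’s observation concrete, here is a toy illustration in Python with entirely made-up part numbers: when each product line codifies the same physical component differently, a naive cross-line rollup treats one part as three, and reconciliation requires exactly the kind of master mapping that is usually missing.

```python
from collections import defaultdict

# Hypothetical data: the same physical fastener is codified differently
# by each product line -- Joe Barkai's scenario.
usage = [
    ("LINE-A", "FAST-0042", 12),   # product line A's number
    ("LINE-B", "SCR-M4-STL", 12),  # product line B's number, same part
    ("LINE-C", "100-4421", 8),     # product line C's number, same part
]

# A naive cross-line report sees three unrelated parts.
naive = defaultdict(int)
for line, pn, qty in usage:
    naive[pn] += qty
print(dict(naive))  # {'FAST-0042': 12, 'SCR-M4-STL': 12, '100-4421': 8}

# Reconciliation needs a master mapping from each local number to a single
# enterprise part -- the metadata alignment Joe says is usually absent.
master = {"FAST-0042": "EPN-9001", "SCR-M4-STL": "EPN-9001", "100-4421": "EPN-9001"}
reconciled = defaultdict(int)
for line, pn, qty in usage:
    reconciled[master[pn]] += qty
print(dict(reconciled))  # {'EPN-9001': 32} -- one part, full usage visible
```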

Product lines and multiple configurations are a reality of modern manufacturing. The level of customization is growing. At the same time, managing parts and BOMs globally is becoming one of the most important and challenging tasks. I found another example of that in today’s news – a potential impact on Apple from the management of bills of materials across multiple product lines and the supply chain. Navigate to the Seeking Alpha post – Apple iPhone 6 Will Pick Up iPad Sales Slack. Here is the passage I captured:

Apple still generates the majority of profits in mobile, despite the slight declines in market share. Last November, research firm IHS estimated $274 in bill of materials and manufacturing costs for the 16GB iPad Air with Wi-Fi connectivity that retails for $499. Going forward, Tim Cook, operations man, will likely leverage Apple’s immense buying power to further drive down costs for component parts shared between the iPhone 6 and eventual iPad upgrade.
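A quick back-of-the-envelope check on the quoted numbers shows why shared components matter so much to margins. This is a rough sketch only – it ignores software, logistics, warranty and other costs, and the $10 saving is a made-up figure for illustration:

```python
retail_price = 499.0  # 16GB iPad Air with Wi-Fi (quoted retail price)
bom_and_mfg = 274.0   # IHS estimate of BOM + manufacturing cost

hardware_margin = retail_price - bom_and_mfg
print(f"${hardware_margin:.0f} per unit, "
      f"{hardware_margin / retail_price * 100:.0f}% of retail")
# -> $225 per unit, 45% of retail

# Even a modest BOM reduction from components shared between the iPhone 6
# and the iPad moves the needle (hypothetical $10 saving):
savings = 10.0
print(f"{(hardware_margin + savings) / retail_price * 100:.1f}%")  # -> 47.1%
```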

I have no information about the PLM system Apple uses to manage bills of materials across product lines. However, I would guess that re-use of components across product lines is a very typical approach in many manufacturing companies.

What is my conclusion? The complexity of bill of materials management across product lines and the supply chain is skyrocketing these days. Managing part numbers, bills of materials, cost and multiple product lines can become a critical part of a PLM solution supporting manufacturing profitability. Just my thoughts…

Best, Oleg



How long will it take GrabCAD to develop a full-blown PLM solution?

August 18, 2014


Time runs fast. It has been two years since I posted GrabCAD: from Facebook for engineers to PLM. If you are in the engineering software community, the chances you will come to PLM are very high. Just as all roads once led to Rome, I guess all future development roads for a PDM solution lead to PLM – even if you don’t like to brand your solution as PLM… Nevertheless, if it looks like a duck, quacks like a duck and walks like a duck, it’s a duck.

Just a few months ago, GrabCAD moved into the "PDM" segment by introducing GrabCAD Workbench. Earlier today, GrabCAD made another "quack" towards PLM by adding BOM support. Navigate your browser to the following link to read – BOMs Away! Workbench Adds BOM Management. The following passage outlines what GrabCAD BOM can do today:

We’ve added an easy-to-use BOM export capability to Workbench, enabling any Workbench user to generate a Bill of Materials with just a few clicks. This means that your engineering team and manufacturing team will always be on the same page. Now your purchasing manager or supplier liaison doesn’t need to bother a CAD engineer to generate a BOM, and doesn’t need to enter items individually into Excel each time you change a revision. It’s as simple as two clicks to get the list of components into Excel!


Introducing BOM functionality is a very logical step that many PDM systems have taken. However, it doesn’t come easy – the complexity of the system grows. From what I can see, GrabCAD is at an early stage, just touching BOM functionality in order to balance customer demand against the complexity of a full-blown BOM management solution.
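For readers who haven’t dealt with BOM export before, here is a minimal conceptual sketch of what such a feature does under the hood – walk the assembly structure, roll up quantities per part, and write a flat list that opens in Excel. This is my illustration, not GrabCAD’s implementation; all names and the CSV format are hypothetical:

```python
import csv
from collections import Counter

# Hypothetical assembly structure: parent -> list of (child, quantity).
assembly = {
    "skateboard": [("deck", 1), ("truck", 2)],
    "truck": [("wheel", 2), ("axle", 1), ("bolt", 4)],
}

def rollup(item, qty=1, totals=None):
    """Recursively accumulate the total quantity of each leaf-level part."""
    totals = totals if totals is not None else Counter()
    children = assembly.get(item)
    if not children:            # leaf part: count it
        totals[item] += qty
        return totals
    for child, n in children:   # subassembly: multiply quantities downward
        rollup(child, qty * n, totals)
    return totals

# Write the flattened BOM as CSV -- the "two clicks to Excel" result.
with open("bom.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["part", "quantity"])
    for part, qty in sorted(rollup("skateboard").items()):
        writer.writerow([part, qty])
# -> axle 2, bolt 8, deck 1, wheel 4
```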

What is my conclusion? Extracting a BOM from CAD data is a very neat piece of functionality. My hunch is that many GrabCAD users requested it. However, introducing BOM functionality was a challenge for many PDM systems in the past. The complexity of part management and multiple BOMs – these are only two visible parts of the iceberg. GrabCAD’s strategy reminds me of what SolidWorks did in the past with PDM – adding functionality only when the majority of customers absolutely demanded it. I think it worked for SolidWorks… It will be interesting to see how it works for GrabCAD. Just my thoughts…

Best, Oleg


Existing data prevents companies from improving Part Numbers?

August 15, 2014


Part Numbers are a fascinating topic. I keep coming back to blog about the best approach to managing them. My last post on the subject – Part Numbers are hard. How to think about data first? – appeared just a few weeks ago. In that article, I outlined a few principles for keeping the PN separate from the surrounding data covering different aspects of a part – description, classification, configurations, suppliers, etc.

Yesterday, my attention was caught by a ThomasNet article – Are Part Numbers Too Smart for Their Own Good? The article nails a key reason why companies still have difficulties managing Part Numbers: nothing starts from scratch in an engineering company. The complexity of part characteristics and the history embedded in existing Part Numbers and products make it genuinely hard to adopt new PN management concepts. The following passage explains the problem:

Another problem with descriptive numbering is that the description can become out of date and irrelevant over time. Individual parts can have their own life cycles; if a part has been identified according to the product, what happens if that product is discontinued but the part continues to be used in a newer product? Or what if a manufacturer changes vendors and the part number contains the name of the vendor that originally provided the piece?

Gilhooley admits that some Ultra Consultants clients have decided that switching from descriptive to auto-generated numbering would require too much organizational change. Some companies stick with old systems, and some opt for hybrid systems that perhaps retain descriptive numbers for existing parts but use auto-generated numbers for new parts.
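The hybrid approach mentioned above is easy to picture in code. Here is a minimal sketch – my illustration, not Ultra Consultants’ recipe – in which legacy descriptive numbers are kept as-is while new parts get meaningless auto-generated identifiers, with the descriptive information moved into attributes where it can change without invalidating the number:

```python
import itertools

# Legacy descriptive numbers stay untouched; the "meaning" they encode
# (product, vendor, material...) is exactly what goes stale over time.
legacy_parts = {
    "ACME-PUMP-BRKT-01": {"description": "pump bracket", "vendor": "Acme"},
}

# New parts get opaque auto-generated numbers; descriptive data lives in
# attributes instead of being baked into the identifier.
_seq = itertools.count(100000)

def new_part(description, vendor):
    pn = f"P-{next(_seq)}"  # meaningless, collision-free identifier
    return pn, {"description": description, "vendor": vendor}

pn, attrs = new_part("pump bracket, rev B", "NewVendor Inc.")
print(pn, attrs)  # P-100000 {'description': 'pump bracket, rev B', ...}
# If the vendor changes, only attrs["vendor"] changes -- the PN survives.
```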

It looks like there is no single solution or best practice for this problem. The "traditional" engineering approach – keeping options open to handle a diverse set of company configurations – looks like the only viable way to address it in existing PLM/ERP systems.

What is my conclusion? History keeps customers from moving forward. There are two aspects of complexity in Part Numbers: 1/ the complexity of definition and data classification; 2/ the historical records of PNs in every company, including catalogs and existing products. Together, they block any change to an existing PN schema and prevent companies from migrating towards new approaches. New data modeling technologies must be invented to handle existing data as well as to support customers migrating to modern PLM and ERP solutions. Just my thoughts…

Best, Oleg


Why is now the right time to reinvent PDM?

August 15, 2014


Product Data Management (PDM) isn’t a new domain. The first PDM systems were invented 20-30 years ago with a simple objective – to manage product data. The scope of PDM has been heavily debated and has included design data, engineering BOMs, ECOs and even the supply chain. However, the most widely accepted role of PDM is to manage CAD files and their revisions.
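To make that role concrete, the core of CAD-file PDM boils down to a very small data model. The sketch below is my simplification, not any vendor’s schema: files, numbered revisions, and a check-out lock so two engineers don’t overwrite each other’s work.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Revision:
    number: int
    author: str
    created: datetime
    comment: str

@dataclass
class CadFile:
    name: str                     # e.g. "bracket.sldprt"
    revisions: list = field(default_factory=list)
    checked_out_by: str = ""      # empty string -> available for check-out

    def check_out(self, user):
        assert not self.checked_out_by, f"locked by {self.checked_out_by}"
        self.checked_out_by = user

    def check_in(self, author, comment):
        self.revisions.append(
            Revision(len(self.revisions) + 1, author, datetime.now(), comment))
        self.checked_out_by = ""  # release the lock on check-in

f = CadFile("bracket.sldprt")
f.check_out("oleg")
f.check_in("oleg", "initial design")
print(f.revisions[-1].number)  # 1
```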

For a long time, PDM was seen as something you only need to consider once your engineering department is large enough. Even though the starting price of a PDM implementation has gone down significantly over the last 20 years, my hunch is that the average starting cost of a PDM solution for an engineering organization of 10-15 people is still about $30-50K. Cost and implementation complexity kept the PDM business limited to larger companies, mostly handled by resellers with special skills and knowledge – most of them associated with a specific CAD vendor channel.

CAD vendors recognized both the need for PDM and its complexity. For most vendors, the answer to PDM demand was to develop (or acquire) a dedicated PDM system bundled with their CAD software. As a result, most independent PDM players were acquired. The remaining PDM vendors either focus on a specific geographical niche or have developed additional solutions, usually branded with a "PLM" buzzword and strategy.

My hunch is that until last year the PDM market was somewhat stalled, focused on replacing outdated versions of PDM software and supporting new CAD software releases. Then something happened… Over the last few months, I have seen increased interest in PDM software. I noticed a few focused reports and articles in the field of PDM – Expert Guide to the Next Generation of PDM; the TechClarity Expert Guide for Basic CAD management; and a few others.

I also want to mention a few vendor activities focused on basic PDM functionality. It started with the more traditional OOTB approach of PTC Windchill PDM Essentials, continued with Solid Edge SP leveraging the SharePoint platform, and with GrabCAD Workbench using a "cloud platform" as a differentiation strategy.

Consilia Vector published a CAMScore report for GrabCAD Workbench, where CAMS stands for Cloud, Analytics, Mobile, Social. In my view, these major trends are driving a renaissance in the PDM space.

As I mentioned before, because of cost and complexity, PDM software was out of reach for many smaller companies and engineering departments. A DIY (do-it-yourself) PDM approach combining network file shares, Excel files and FTP is the solution for probably 60-70% of the market. For many years, sharing files using network and USB drives was a "good enough" solution. But the era of file sharing changed forever with the rise of social networks, mobile and cloud. So-called YAPSAs (Yet Another Photo Sharing Apps) became widely available in our everyday lives. The question of why PDM is so complex – and why we cannot manage and access CAD data the way we do photos and videos – brings the PDM solution back to the innovation room.

What is my conclusion? Cloud, web and social technologies in the consumer space have reached maturity. We have come to the point where new technology and the awareness of cloud and social approaches are going to challenge the traditional PDM space. In addition, the existing approach of using network drives and file sharing to manage CAD files looks like it is coming to its logical end. People will be looking at how to copy the YAPSA approach into the PDM space. So, it is time for PDM to change. Just my thoughts…

Best, Oleg


How to visualize future PLM data?

August 12, 2014


I have a special passion for data and data visualization. We deal with it every day in our lives. Simple data, complex data, fast data, contextual data… These days, we are surrounded by data as never before. Think about a typical engineer 50-60 years ago: blueprints, some physical models… not much information. Nowadays the situation is completely different – multiple sources of design and engineering data, historical data about product use, histories of design revisions, social information, real-time data from sensors about how a product is performing, etc. Our ability to discover and use data is becoming very important.

The way we present data for decision making matters a lot – it changes our ability to design in the context of the right data. Presenting data to engineers and designers today can be as important as presenting the right information to airplane pilots has been in the past. Five years ago, I posted about Visual Search Engines on the 3D Perspective blog. The article is still alive – navigate your browser here to have a read. What I liked in the idea of visual search is presenting information in a way people can easily understand.

A few days ago, my attention was caught by a TechCrunch article about the Collective Experience of Empathetic Data Systems (CEEDS) project developed in Europe.

[The project]… involves a consortium of 16 different research partners across nine European countries: Finland, France, Germany, Greece, Hungary, Italy, Spain, the Netherlands and the UK. The “immersive multi-modal environment” where the data sets are displayed, as pictured above — called an eXperience Induction Machine (XIM) — is located at Pompeu Fabra University, Barcelona.

Read the article, watch the video and draw your own conclusions. It made me think about the potential of data visualization for design. Here is my favorite passage from the article, explaining the approach:

“We are integrating virtual reality and mixed reality platforms to allow us to screen information in an immersive way. We also have systems to help us extract information from these platforms. We use tracking systems to understand how a person moves within a given space. We also have various physiological sensors (heart rate, breathing etc.) that capture signals produced by the user – both conscious and subconscious. Our main challenge is how to integrate all this information coherently.”

Here is the thing: the challenge is how to integrate all the information coherently. Different data can be presented in different forms – 3D geometry, 2D schemas, 2D drawings, graphics, tables, graphs, lists. In many situations we can get this information presented separately, using different design and visualization tools. However, the efficiency is questionable – a lot of data can be lost during visualization. And, as I learned from the CEEDS project materials, data can also be lost during the process of understanding. Blindspotting: our brain will miss data even when we (think we) present it in the best way.

What is my conclusion? Visualization of data for better understanding will play an increasing role in the future. We are just at the beginning of the data collection process. We understand the power of data and therefore collect more of it every day. However, processing data and visualizing it for better design will be an interesting topic to work on in the coming years. Just my thoughts…

Best, Oleg

