How much does it cost to build PLM software?

August 21, 2014

plm-startup-cost

The new normal – we need less money to build software these days. The Andreessen Horowitz article The Happy Demise of the 10X Engineer caught my attention. In a nutshell, we live in an era when infrastructure costs are going down and the cost of software engineers is going up. The following passage is important:

As the leverage of the individual software engineer increases, the barriers to becoming a code creator are falling fast. The same software foundation (open source software, development tools like Github, infrastructure as a service provided by the likes of Digital Ocean, and more) that allowed Whatsapp and Imgur to scale, means that experience and skill writing software become less important. An individual can now scale a web app to millions of users with Digital Ocean, Heroku and AWS (perhaps coordinated by Mesosphere). It no longer requires a sophisticated understanding of MySQL parameters to scale a database on Google App Engine, just as it no longer requires a knowledge of the CPU chip it’s all chugging away on.

Nowadays, the open source software foundation, Amazon (AWS) and web distribution allow you to build software and ship it initially without significant upfront expense. Another article, by ReadWriteWeb – You Don’t Need To Be An Engineering Genius To Start A Billion-Dollar Company – compares the cost of hardware and storage with the cost of engineers between 1998 and 2013.

infra-vs-eng-cost-plm-software

In 1985, storage was a key expense, running $100,000 per gigabyte, while a developer could expect to be paid $28,000 per year. By 2013, things had changed considerably. Storage is now cheap, at $0.05 per GB. Developers, on the other hand, are expensive, at $90,000 per year.
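To make the shift concrete, here is a back-of-the-envelope calculation using the figures above (the 100 GB "startup-scale" storage amount is my own illustrative assumption):

```python
# Infrastructure vs. engineer cost, using the figures quoted above.
STORAGE_PER_GB_1985 = 100_000   # USD per GB
STORAGE_PER_GB_2013 = 0.05      # USD per GB
DEV_SALARY_1985 = 28_000        # USD per year
DEV_SALARY_2013 = 90_000        # USD per year

storage_gb = 100  # hypothetical startup-scale storage need (assumption)

cost_1985 = STORAGE_PER_GB_1985 * storage_gb  # $10,000,000
cost_2013 = STORAGE_PER_GB_2013 * storage_gb  # $5

# Storage cost expressed in developer-years:
print(cost_1985 / DEV_SALARY_1985)  # ~357 developer-years in 1985
print(cost_2013 / DEV_SALARY_2013)  # a tiny fraction of a developer-day in 2013
```

In 1985, a modest storage footprint cost the equivalent of hundreds of developer-years; today the same footprint costs almost nothing, which is exactly why engineers, not infrastructure, dominate startup budgets.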

Both articles made me think about the cost of building PDM and PLM software today. Do the new-normal rules for building web and mobile startups apply to the world of engineering and manufacturing software? The world of enterprise software is probably different from web and mobile. At the same time, the changes I mentioned above in the development ecosystem and infrastructure cost apply to the PLM world as well. So, the answer is far from a simple yes or no. Here is a more structured answer related to building PDM/PLM software.

1. Foundation and Development Infrastructure

The web and open source ecosystems have created a huge software foundation stack. As a new company, you have a huge opportunity for re-use. This stack wasn’t available 10 years ago. In the past, enterprise companies didn’t tolerate open source software. The situation is completely different today. From that standpoint, you can build new software with near-zero development infrastructure cost.

2. Private vs. public cloud

The public cloud is the best world for web startups. Most of them can run in production on AWS or similar public cloud hosting services and scale as usage increases. However, many manufacturing companies are still sitting on the fence of the private vs. public cloud decision. So, you need to choose: either cut your potential customer audience, or incur the additional cost of private cloud configurations, data centers and infrastructure.

3. Domain expertise

You need to get your hands dirty in the engineering and manufacturing business. It is different from web photo sharing, messaging and mobile games. There are fewer people available in this field, which will obviously bump your cost up compared to some other industries.

4. Distribution and sales

Going viral is one of the most desired ways to distribute web and mobile software. You go viral or die. The applicability of the "viral model" to PLM is questionable. Speak to enterprise sales people and they will explain to you the difference between software that needs to be sold and software that can be bought. Sales and marketing expenses in the enterprise space can be huge.

What is my conclusion? It is easy to build technology and product. However, it is very hard to build a business. Technology is getting cheaper. The best part of this trend is that it allows you to experiment without significant investment to find product-market fit. The PLM industry has its own domain ecosystem and specific rules. Engineers need to be familiar with use cases, existing software, tools and environment to succeed, which can push the engineering cost of building PLM software even higher than average. The last and most critical part is distribution and sales. Be prepared to pay a huge cost for that. The good news – you don’t need to do it upfront. The enterprise software space is changing dramatically these days. So, I’d agree with Excite founder Joe Kraus and his 2005 article: "There’s never been a better time to be an entrepreneur because it’s never been cheaper to be one". Just my thoughts…

Best, Oleg


3 reasons why size won’t matter in the PLM future

August 21, 2014

plm-small-big-future-1

The debates about small vs. large PLM implementations are probably as old as PLM software itself. Joe Barkai recently published a rather controversial blog series – Is PLM Software Only for Big Guys? One of these posts – Do PLM Vendors Think SMBs are Just Like Large Enterprises, Only Smaller? Note the following passage:

In my market research in PLM, PDM and related fields, and consulting work with engineering organizations, I often find that SMBs don’t think of themselves as being just like the “big guys”, only smaller. They believe they possess different culture, work habits and operational models, and perceive PLM as a tool ideally suited for large organizations with sizable engineering teams designing complex highly engineered products.

Another of Joe’s posts asks – Can PLM software benefit small company?

Looking at the profile and size of engineering companies using PDM software, especially those showcased by mainstream PDM and PLM vendors, one might easily reach the conclusion that these systems are, indeed, designed with the “big guys” in mind. This perception may be reinforced by PLM and ERP vendors that have announced products designed for the SMB market and abandoned them a few years later, when rosy revenue expectations weren’t achieved. Remember, for example, PTC’s ProductPoint and SAP’s Business By Design? Small engineering teams have come to think of PLM software as unnecessarily complex and limiting operational flexibility, not to mention the high cost of the software, IT overhead, and the pain of keeping the software up to date.

It is true that, historically, CAD and PDM systems came from the large defense and aerospace industries. Since then, a lot of innovation in PDM and later in PLM has been about taking complex and expensive solutions and making them simpler, more usable and affordable. 80% of the functionality for 20% of the price… It worked for some CAD players in the past. Is it possible in PLM? PLM systems have fallen into the simplification trap many times. As soon as a new affordable solution came out for SME companies, it was demanded by large enterprises as well. You can hear the opinion that price was a key factor and that PLM vendors didn’t find a way to sell both enterprise and SME solutions with the right packaging and price differentiation. I’m not sure that is true, but shutting down an SME-focused PLM solution is not uncommon in the PLM industry.

I have shared some of my thoughts about why PLM vendors failed to provide solutions for SMEs. One of my conclusions was that cost and efficiency are the key elements that can help PLM vendors develop a solution for this challenging market segment.

However, Joe’s posts made me think one more time about the “small vs. large” PLM challenge. I want to share with you my three hypotheses about why size won’t matter for future PLM solutions.

1. Horizontal integration

Large monolithic businesses with strong vertical integration are being displaced by granular and sometimes independent business units with diverse sets of horizontal relationships. Businesses are looking at how to optimize cost in everything – development, supply chain, manufacturing, operations. I imagine these businesses will demand a new type of PLM solution that can be used by a network of suppliers and business partners rather than by a single vertically integrated organization.

2. Technological transformation

In the past, many PDM and PLM vendors treated an SME solution as something that shouldn’t need to scale much and could run on cheaper hardware, low-cost technology and IT infrastructure. Cloud, web and open source technological trends have changed the landscape completely. While most existing PLM solutions are still running on infrastructure developed 10-15 years ago, I can see vendors looking for new architectures and technologies that can unquestionably scale to cover a diverse set of customers – small and large.

3. Business dynamics

The business environment is changing. Businesses are more dynamic. New requirements arrive often, and the expected time to deliver a new solution or change has gone down from years to months. In such an environment, I can hardly imagine a monolithic PLM deployment that can be sustained for a decade as it was before. I would expect PLM vendors to think about a new type of platform and a set of agile applications serving a variety of business needs.

What is my conclusion? Business, technological and organizational changes will affect the future landscape of PLM platforms and applications. Small is the new big. New technological platforms will be able to scale to support a diverse set of customers. Vendors will move from shipping CDs to providing services out of public and private clouds. As a result, the difference between PLM for SMEs and Enterprise PLM will disappear. Future PLM solutions will come as platforms with a diverse set of agile applications. Just my thoughts…

Best, Oleg


PLM upgrades, release cycles and legacy software

August 20, 2014

legacy-software

Do you know what legacy software is? Earlier today, Marc Lind of Aras Corp. challenged me with his Twitter status about companies complaining about legacy PLM systems and upgrading. Here is the original passage from Twitter, here and here.

"a lot of people complains about legacy PLM and a lot of companies that have legacy PLM are throwing in the towel and switching these days".

marc-lind-legacy-plm-tweet

The part of the statement about "legacy software" is really interesting. Last week, I wasn’t able to update a game on my son’s iPad. After a few minutes, I discovered that Apple no longer supports the original iPad hardware manufactured 4 years ago. Does that make the iOS software running on that iPad legacy? Good question. At the same time, what about properly functioning ERP software that a company has been running for the last 10 years without any plans to upgrade? Is that legacy software?

Wikipedia gives me the following definition of a legacy system:

In computing a legacy system is an old method, technology, computer system, or application program,"of, relating to, or being a previous or outdated computer system."[1] A more recent definition says that "a legacy system is any corporate computer system that isn’t Internet-dependent."[2]… The first use of the term legacy to describe computer systems probably occurred in the 1970s. By the 1980s it was commonly used to refer to existing computer systems to distinguish them from the design and implementation of new systems. Legacy was often heard during a conversion process, for example, when moving data from the legacy system to a new database.

Software upgrades are an important topic in engineering and manufacturing. Very often, systems stay in use for a very long time because of product lifecycles and the need to maintain existing data. This happens a lot in defense, aerospace and some other "regulated" industries. Also, because of the significant investment, the ROI of an upgrade can be questionable, which leads companies to keep existing outdated systems in operation. I’ve posted about the problems of PLM customization and upgrades before – How to eliminate PLM customization problems and Cloud PLM and future of upgrades.

PLM vendors are aware of the issue of upgrades and the difficulties of software migrations. For a long time, the industry accepted it as something unavoidable. However, in today’s dynamic business environment, the issue of software upgrades cannot be ignored. Customers demand flexible and agile software that can be deployed and updated fast. At the same time, the shift of business models towards services and subscriptions has pushed the problem of upgrades back to vendors.

Earlier this year, my attention was caught by a CIMdata publication – Aras Innovator: Redefining Customization & Upgrades. The Aras enterprise open source model is predominantly subscription oriented, which gives Aras engineers a strong incentive to solve the issue of upgrades and new version deployment. Here is a passage from the article confirming that:

For several years, the Aras Corporation (Aras) has included no-cost version-to-version upgrades in their enterprise subscriptions, independent of how the solution has been customized and implemented. This is a rather bold guarantee given the historic challenges the industry has experienced with upgrading highly customized PLM deployments. With more than 300 upgrades behind it, CIMdata felt it appropriate to find out how Aras’ guarantee was playing out, and discovered that there was much more to the story than just a contractual guarantee. Fundamentally, Aras Innovator is engineered to be highly configurable—even customizable—without resulting in expensive and complex version-to-version upgrades and re-implementations.

Siemens PLM, one of the PLM software leaders, is also thinking about What is the best release cycle. The article speaks about the Solid Edge release cycle.

A few years ago we moved from an irregular release cycle for Solid Edge, maybe 9 months in one cycle to 15 months in the next, to a regular cycle of annual releases (of course there are also maintenance packs delivered in the interim). I believe our customers much prefer this, they can plan ahead knowing that there will be a significant Solid Edge release available to them in August each year.

At the same time, the article confirms that CAD/PLM vendors are looking for ways to solve the problem of upgrades. As I mentioned earlier, the cloud software model is one of the most promising technical ways to solve the issue of upgrades. That is true, but it can be tricky when both desktop and cloud software are involved. Here is a passage from the same Siemens PLM blog:

Working in the PLM area we try really hard to provide our customers with a good upgrade experience. PLM software is itself dependent on both the operating system and database software, and it has to work with specific releases of CAD software (sometimes with more than one CAD solution for our multi-CAD customers) and with office software as well! Moving PLM software to the cloud could potentially take some of the upgrade issues away from the end user, but PLM software does not work in isolation from your data files, or your other software and systems so I believe there is much work still to be done before the cloud really impacts the upgrade situation for real-world customers.

What is my conclusion? From the customer’s perspective, the best option is to make the release cycle completely transparent. In my view, this is a really high bar for PLM vendors. Customer data migration, customization and the occasional absence of backward compatibility make release transparency questionable. However, as the industry moves towards cloud software and service business models, the demand for agile release management and the absence of upgrades will keep growing. So, my hunch is that in the future we will not see "legacy software" anymore. A new type of enterprise software will manage upgrades and migrations without customers paying attention. Sounds like a dream? I don’t think so. For most web and consumer software it is a reality already today. Just my thoughts…

Best, Oleg


Apple iPhone 6 and cross-product-family BOM planning

August 18, 2014

ipad-bom-assy

Managing Parts and Bills of Materials is not a simple task. I shared some aspects of the complexity of Part Numbering last week in my post – Existing data prevents companies from improving Part Numbers. The discussion in the comments took me towards the complexity of Part Numbers in the supply chain. Here is a passage (from the comments) by Joe Barkai:

…multiple BOMs with inconsistent numbering schema often hide a bigger problem: inconsistent attributes and metadata. I [Joe Barkai] worked with a global automotive OEM on issues surrounding architectural complexity reduction and global quality management. I discovered that each product line was using different part numbers. This was obviously difficult to manage from a supply chain perspective. But, not less importantly, other metadata and data attributes such as failure modes, labor operation codes and other important information were codified differently, rendering cross product line reporting and analysis difficult and potentially lacking, if not erroneous

Product lines and multiple configurations are a reality of modern manufacturing. The level of customization is growing. At the same time, managing parts and BOMs globally is becoming one of the most important and challenging tasks. I found another example of that in today’s news – an example of the potential impact on Apple of managing bills of materials across multiple product lines and the supply chain. Navigate to the Seeking Alpha post – Apple iPhone 6 Will Pick Up iPad Sales Slack. Here is the passage I captured:

Apple still generates the majority of profits in mobile, despite the slight declines in market share. Last November, research firm IHS estimated $274 in bill of materials and manufacturing costs for the 16GB iPad Air with Wi-Fi connectivity that retails for $499. Going forward, Tim Cook, operations man, will likely leverage Apple’s immense buying power to further drive down costs for component parts shared between the iPhone 6 and eventual iPad upgrade.

I have no information about the PLM system Apple uses to manage bills of materials across product lines. However, I guess re-use of components among different product lines is a very typical approach used by many manufacturing companies.
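Once BOMs are structured data, this kind of component re-use is easy to reason about. A minimal sketch (the part numbers and product names below are invented for illustration, not Apple’s actual data):

```python
# Hypothetical flattened BOMs per product line; part numbers are invented.
boms = {
    "phone_6":    {"A8-SOC", "NAND-16GB", "LCD-4.7", "BATT-1810"},
    "tablet_air": {"A8-SOC", "NAND-16GB", "LCD-9.7", "BATT-8827"},
}

# Parts shared across every product line are candidates for volume pricing,
# which is exactly the buying-power lever described in the quote above.
shared = set.intersection(*boms.values())
print(sorted(shared))  # ['A8-SOC', 'NAND-16GB']
```

The harder real-world problem, as Joe Barkai’s comment points out, is getting part numbers and metadata consistent enough across product lines that an intersection like this is even meaningful.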

What is my conclusion? The complexity of bill of materials management across product lines and supply chains is skyrocketing these days. Managing part numbers, bills of materials, cost and multiple product lines can become a critical part of a PLM solution supporting manufacturing profitability. Just my thoughts…

Best, Oleg



How long will it take GrabCAD to develop a full-blown PLM solution?

August 18, 2014

grabcad-plm-beyondplm

Time runs fast. It has been two years since I posted GrabCAD: from Facebook for engineers to PLM. If you are in the engineering community, the chances you will come to PLM are very high. Just as all roads once led to Rome, I guess all future development roads for PDM solutions lead to PLM. Even if you don’t like to brand your solution as PLM… Nevertheless, if it looks like a duck, quacks like a duck and walks like a duck, it’s a duck.

Just a few months ago, GrabCAD moved into the "PDM" segment by introducing GrabCAD Workbench. Earlier today, GrabCAD made another "quack" towards PLM by adding BOM support. Navigate your browser to the following link to read – BOMs Away! Workbench Adds BOM Management. The following passage outlines what GrabCAD BOM can do today:

We’ve added an easy-to-use BOM export capability to Workbench, enabling any Workbench user to generate a Bill of Materials with just a few clicks. This means that your engineering team and manufacturing team will always be on the same page. Now your purchasing manager or supplier liaison doesn’t need to bother a CAD engineer to generate a BOM, and doesn’t need to enter items individually into Excel each time you change a revision. It’s as simple as two clicks to get the list of components into Excel!

grabCAD-bom-plm

Introducing BOM functionality is a very logical step that many PDM systems have taken. However, it doesn’t come easy. The complexity of the system grows. From what I can see, GrabCAD is at an early stage, just touching BOM functionality to balance customer demand against the complexity of a full-blown BOM management solution.
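The "two clicks to Excel" experience hides a classic PDM operation: flattening an assembly tree into a parts list with quantity rollup. A minimal sketch of that core step (the assembly structure below is an invented example, not GrabCAD’s actual data model):

```python
from collections import Counter

# Invented example assembly: (child, qty) pairs per parent item.
assembly = {
    "bike":  [("frame", 1), ("wheel", 2)],
    "wheel": [("rim", 1), ("spoke", 32)],
    "frame": [],
    "rim":   [],
    "spoke": [],
}

def flatten_bom(root, structure, qty=1, totals=None):
    """Roll up total quantities of every component under `root`."""
    if totals is None:
        totals = Counter()
    for child, n in structure[root]:
        totals[child] += qty * n
        flatten_bom(child, structure, qty * n, totals)
    return totals

print(dict(flatten_bom("bike", assembly)))
# {'frame': 1, 'wheel': 2, 'rim': 2, 'spoke': 64}
```

The flattening itself is simple; the hard parts that full-blown BOM management adds later – revision-specific quantities, multiple BOM views, substitute parts – are where the complexity mentioned above comes from.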

What is my conclusion? Extracting a BOM from a CAD drawing is very neat functionality. My hunch is that it was requested by many GrabCAD users. However, introducing BOM functionality was a challenge for many PDM systems in the past. The complexity of part management and multiple BOMs – these are only two visible parts of the iceberg. The GrabCAD strategy reminded me of what SolidWorks did in the past with PDM – adding functionality when it was absolutely requested by the majority of customers. I think it worked for SolidWorks… It will be interesting to see how it works for GrabCAD. Just my thoughts…

Best, Oleg


Existing data prevents companies from improving Part Numbers?

August 15, 2014

historical-part-numbers

Part Numbers are a fascinating topic. I keep coming back to blog about the best approach to managing Part Numbers. My last post about it – Part Numbers are hard. How to think about data first? – was just a few weeks ago. In that article, I outlined a few principles for keeping the PN separate from the surrounding data focusing on different aspects of parts – description, classification, configurations, suppliers, etc.

Yesterday, my attention was caught by a ThomasNet article – Are Part Numbers Too Smart for Their Own Good? The article nails a key reason why companies still have difficulties with the management of Part Numbers: nothing works from scratch in engineering companies. The complexity of characteristics and the history of existing Part Numbers and products make it genuinely difficult to adopt new PN management concepts. The following passage explains the problem:

Another problem with descriptive numbering is that the description can become out of date and irrelevant over time. Individual parts can have their own life cycles; if a part has been identified according to the product, what happens if that product is discontinued but the part continues to be used in a newer product? Or what if a manufacturer changes vendors and the part number contains the name of the vendor that originally provided the piece?

Gilhooley admits that some Ultra Consultants clients have decided that switching from descriptive to auto-generated numbering would require too much organizational change. Some companies stick with old systems, and some opt for hybrid systems that perhaps retain descriptive numbers for existing parts but use auto-generated numbers for new parts.

It looks like there is no single solution or best practice that solves the problem. The "traditional" engineering approach – keeping options open to manage a diverse set of company configurations – looks like the only possible way to address this problem in existing PLM/ERP systems.
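The hybrid scheme Gilhooley describes – descriptive numbers preserved for existing parts, auto-generated numbers for new ones – can be expressed very simply. A minimal sketch (the "PN-" prefix and the numeric format are my own assumptions, not any consultant’s recommendation):

```python
import itertools

class HybridPartNumbers:
    """Keep legacy descriptive PNs as-is; issue opaque numbers for new parts.

    The "PN-" prefix and 6-digit sequence are illustrative assumptions.
    """
    def __init__(self, legacy):
        self.registry = dict(legacy)      # existing descriptive PN -> description
        self._seq = itertools.count(100000)

    def new_part(self, description):
        pn = f"PN-{next(self._seq)}"      # auto-generated, carries no meaning
        self.registry[pn] = description
        return pn

pns = HybridPartNumbers({"BRKT-AL-V2": "aluminum bracket, rev 2"})
print(pns.new_part("steel bracket"))  # PN-100000
print(pns.new_part("gasket"))         # PN-100001
```

The design choice mirrors the article’s point: no renumbering of historical parts is required, so catalogs and existing products stay valid while new parts avoid the descriptive-number traps.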

What is my conclusion? History keeps customers from moving forward. There are two aspects of complexity in Part Numbers: 1/ the complexity of definition and data classification; 2/ the historical records of PNs in every company, including catalogs and existing products. Together, they block any change to an existing PN schema and prevent companies from migrating towards new approaches. New data modeling technologies must be invented to handle existing data as well as to support customers migrating to modern PLM and ERP solutions. Just my thoughts…

Best, Oleg


Why is now the right time to reinvent PDM?

August 15, 2014

re0invent-pdm-now

Product Data Management (PDM) isn’t a new domain. The first PDM systems were invented 20-30 years ago with a simple objective – to manage product data. The scope of PDM was heavily debated and included design, engineering BOMs, ECOs and even the supply chain. However, the most widely accepted role of PDM is to manage CAD files and their revisions.
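At its core, "managing CAD files and their revisions" is a small data model: each file carries an ordered revision history, and a check-in appends a new revision. A minimal sketch of that idea (the field names are my own, not any specific PDM system’s schema):

```python
from dataclasses import dataclass, field

@dataclass
class CadFile:
    name: str
    revisions: list = field(default_factory=list)  # ordered revision log

    def check_in(self, author, note):
        """Append a new revision and return its number."""
        rev = {"rev": len(self.revisions) + 1, "author": author, "note": note}
        self.revisions.append(rev)
        return rev["rev"]

    def latest(self):
        return self.revisions[-1] if self.revisions else None

part = CadFile("bracket.sldprt")
part.check_in("alice", "initial geometry")
part.check_in("bob", "added mounting holes")
print(part.latest()["rev"])  # 2
```

Everything else PDM systems add – vaulting, locking, where-used queries, release states – is layered on top of this basic file-plus-history record.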

For a long time, PDM was recognized as something you only need to consider if your engineering department is large enough. Even though the starting price of implementing a PDM solution has gone down significantly over the last 20 years, my hunch is that the average starting cost of a PDM solution for an engineering organization of 10-15 people is still about $30-50K. Cost and implementation complexity limited the PDM business to larger companies, and it was mostly handled by resellers with special skills and knowledge, most of them associated with a specific CAD vendor channel.

CAD vendors recognized the need for and complexity of PDM. For most vendors, the answer to PDM demand was to develop (or acquire) a dedicated PDM system bundled with their CAD software. As a result, most PDM players were acquired. Most of the remaining independent PDM vendors either focus on a specific geographical niche or have developed additional solutions, usually branded with the "PLM" buzzword and strategy.

My hunch is that until last year, the PDM market was somewhat stalled, focused on replacing outdated versions of PDM software and supporting new CAD software releases. Then something happened… Over the last few months, I have seen increased interest in PDM software. I noticed a few focused research reports and articles in the field of PDM – Expert Guide to the Next Generation of PDM; TechClarity Expert Guide for Basic CAD management – and a few others.

I also want to mention a few activities by vendors focusing on basic PDM functionality. It started with the more traditional OOTB approach of PTC Windchill PDM Essentials, Solid Edge SP leveraging the SharePoint platform, and GrabCAD Workbench using a "cloud platform" as a differentiation strategy.

Consilia Vector published a CAMScore report for GrabCAD Workbench, where CAMS stands for Cloud, Analytics, Mobile, Social. In my view, these major trends are driving a renaissance in the PDM space.

As I mentioned before, because of cost and complexity, PDM software was out of reach for many smaller companies and engineering departments. The DIY (do it yourself) PDM approach – combining network file shares, Excel files and FTP – is the solution for probably 60-70% of the market. For many years, sharing files using network and USB drives was a "good enough" solution. But the era of file sharing changed forever with the rise of social networks, mobile and cloud. So-called YAPSA (Yet Another Photo Sharing Apps) became widely available in our everyday life. The question of why PDM is so complex – and why we cannot manage and access CAD data the way we manage photos and videos – brings the PDM solution back to the innovation room.

What is my conclusion? Cloud, web and social technologies in the consumer space have reached maturity. It has come to the point where new technology and awareness of the cloud and social approach are going to challenge the traditional PDM space. In addition, it looks like the existing approach of using network drives and file sharing to manage CAD files is coming to its logical end. People will be looking at how to copy the YAPSA approach into the PDM space. So, it is time for PDM to change. Just my thoughts…

Best, Oleg


How to visualize future PLM data?

August 12, 2014

collective experience of empathetic data systems

I have a special passion for data and data visualization. We work with data every day in our lives. Simple data, complex data, fast data, contextual data… These days, we are surrounded by data as never before. Think about a typical engineer 50-60 years ago: blueprints, some physical models… not much information. Nowadays the situation is completely different. Multiple sources of design and engineering data, historical data about product use, histories of design revisions, social information, real-time sensor data about how the product is performing, etc. Our ability to discover and use data is becoming very important.

The way we present data for decision making has a big influence on our ability to design in the context of the right data. Presenting data to engineers and designers these days can become as important as presenting the right information to airplane pilots was before. Five years ago, I posted about Visual Search Engines on the 3D perspective blog. The article is still alive – navigate your browser here to have a read. What I liked in the idea of visual search is presenting information in a way people can easily understand.

A few days ago, my attention was caught by a TechCrunch article about the Collective Experience of Empathetic Data Systems (CEEDS) project developed in Europe.

[The project ]… involves a consortium of 16 different research partners across nine European countries: Finland, France, Germany, Greece, Hungary, Italy, Spain, the Netherlands and the UK. The “immersive multi-modal environment” where the data sets are displayed, as pictured above — called an eXperience Induction Machine (XIM) — is located at Pompeu Fabra University, Barcelona.

Read the article, watch the video and draw your own conclusion. It made me think about the potential of data visualization for design. Here is my favorite passage from the article explaining the approach:

“We are integrating virtual reality and mixed reality platforms to allow us to screen information in an immersive way. We also have systems to help us extract information from these platforms. We use tracking systems to understand how a person moves within a given space. We also have various physiological sensors (heart rate, breathing etc.) that capture signals produced by the user – both conscious and subconscious. Our main challenge is how to integrate all this information coherently.”

Here is the thing: the challenge is how to integrate all the information coherently. Different data can be presented differently – 3D geometry, 2D schemas, 2D drawings, graphics, tables, graphs, lists. In many situations we can get this information presented separately using different design and visualization tools. However, the efficiency is questionable. A lot of data can be lost during visualization. And, as I learned from the CEEDS project materials, data can also be lost during the process of understanding – blindspotting. Our brain will miss data even when we think we have presented it in the best possible way.

What is my conclusion? Visualization of data for better understanding will play an increasing role in the future. We are just at the beginning of the process of data collection. We understand the power of data and therefore collect an increasing amount of data every day. However, processing that data and visualizing it for better design will be an interesting topic to work on in the coming years. Just my thoughts…

Best, Oleg


PLM: Tools, Bundles and Platforms

August 11, 2014

plm-tools-bundles-platforms

I like online debates. The opportunity to have good online debates is rare in our space. Therefore, I want to thank Chad Jackson for his openness to having one. I don’t think Chad Jackson needs any introduction – I’m sure you have had a chance to watch one of his Tech4PD video debates with Jim Brown of TechClarity.

Here is my post that ignited the debate – CAD: Engineering bundles vs. Granular Applications. In a nutshell, I caught Chad on the point that his idea of bundling MCAD and ECAD in a single application might go against the idea of granular integrated applications he had articulated before.

chad-jackson-beyondplm-blog-fight

Here it starts! Chad tweeted it as a blog fight… whatever. I saw it as a good opportunity to debate what the future engineering landscape might be. In a world where large CAD and PLM players are aggressively acquiring companies, products and technologies, the idea of combining MCAD and ECAD applications can be quite disruptive.

However, my intention is not to discuss who is buying whom in the CAD/PLM world. There is a relatively limited number of MCAD and ECAD vendors. You can see them by navigating to the following links – 3D CAD, ECAD.

Chad’s main point – granularity and integration are not diametrically opposite. I agree with the statement. I also find the examples of 3DEXPERIENCE, PTC and Transmagic very relevant. I found it very important to clarify the differences between so-called "granular apps" and "data integration". Here is my favorite passage from Chad’s article:

Granular Apps offer a limited set of capabilities that are focused on a specific job. These apps are more accessible to different roles in the company because their limited set of functionality requires less training and retention in terms of how they work. They are valuable in the network of roles that participate because they are so accessible. Data Integration means that multiple software applications work against a single set of data in a coordinated fashion. There can be value in this in propagating change and enabling collaboration across the network of roles that participate in overall product development.

The way the article presents the combination of integration and granularity made me think about some interesting trajectories in the future development of engineering software. I’d like to classify things into 3 distinct categories – Tools, Bundles and Platforms.

1- Tools.

The history of engineering applications goes back to the development of tools that helped engineers be more productive – drafting tools and calculation tools. You can find many examples from the past – 2D CAD, 3D CAD, simulation and analysis tools. If you look at the current software landscape, you can see that most of these tools are still here.

2- Bundles and/or Suites

One of the biggest challenges with tools is how customers can use them together. The topics of data integration and interoperability are very often discussed in the context of the ability to use multiple tools, especially when these tools are developed by different vendors. The problem of interoperability is well recognized by vendors. One of the answers is to provide so-called "suites" or application bundles with a special focus on how the tools are integrated together.

3- Platforms.

Platform is a lovely word in the lexicon of software developers. For most of them, it is the end game in the maturity of software tools: how to become a platform that can be used by other developers? There are so many advantages you can unlock as a provider of a platform. Easy to say, but very hard to do. The critical characteristics of platforms are hard to achieve – openness, data integration, maturity of data standards, tools and APIs, and many others.

What is my conclusion? My guess is that Chad is speaking about the opportunity to provide a unified product development platform that combines MCAD and ECAD tools. His statement about data integration indicates that tools can still be granular while becoming part of an integrated platform. I don’t think everybody will see it the same way. I want to see a mechanical engineer actually using an ECAD-type environment for his work. And I can hardly imagine ECAD-related work being done in a 3D environment – a 3D view can be cumbersome and confusing for most electronic design. I believe IT and PLM architects might appreciate the platform idea, but engineers may disagree. Where is the middle ground? It made me think more about what future engineering and manufacturing platforms will look like. I guess Chad Jackson might have some ideas about that and would like to share them. I will work on my list to compare notes too. Just my thoughts…

Best, Oleg


PLM workflow dream

August 8, 2014

plm-workflow-dream

Process management is a very important part of any PLM software. You can find it in every PLM system. There are so many ways to define and manage processes. A few years ago I captured some of them here – PLM Processes: Flowchart vs. Rule-based? While I believe we can agree on the importance of process management, I find it hard to find a simple and powerful implementation of PLM workflow. I believe this statement holds for every enterprise system. Some time ago I had a dream that PLM vendors would adopt best-in-class BPM (Business Process Management) tools and infrastructure. My dream didn’t come true. Instead, the reality is that every PLM system has some (not the best) workflow implementation.

As part of my thinking about un-bundling in PLM, I decided to come up with a description of what I call the PLM workflow dream – a list of features for an ideal PLM workflow system.

1- Visual designer. The majority of people think visually when it comes to workflows. So, the visual designer should be a tool to draw a workflow in the easiest possible way: put down boxes with activities and connect them together. It would be very interesting to have this done in a collaborative manner – typically, you need more than one person to define a good workflow.

2- Drag-n-drop activity planning. There should be a very clear way to define activities. In most PLM systems, activities should be connected to something that happens in the system (e.g. a part status change, a document release, etc.). Connecting them together with flow activities is key.

3- Visualize and test. The designer should provide a way to "line up" the workflow into a simple sequence of events (boxes) without a cumbersome mess of intersecting lines and nodes – no cyclic visualization, no unclear sub-graph connections, etc. The system should also provide a way to test the workflow with dummy or real data.

4- Program activities easily. Each activity node should support a notion of processing such as failure, alert, delegation and user action (if needed). It would be really nice to have some predefined "processing rules", such as how to react to a person’s absence or to mistakes. The interface for setting these values and actions should be user friendly, without additional complexity.

5- Failure programming. I need to be able to program what happens in case of a general workflow failure – in terms of who to call and what to do.

6- Programming scripts. The ability to attach programmable scripts to every activity/node. These days, JavaScript is probably the standard and should simply be adopted – don’t invent yet another programming language. The testing facility should support debugging and data dumps for analysis.
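To make the dream list a bit more concrete, here is a minimal sketch of how such a workflow could be expressed in plain JavaScript (since item 6 nominates it as the scripting standard). All names here are hypothetical illustrations, not any vendor’s API: activities are the "boxes" from item 1, `next` edges connect them, `script` is the per-node script from item 6, and `onFailure` covers item 5.

```javascript
// Hypothetical workflow definition: each activity is a box with a
// script and a pointer to the next box (a simple linear flow).
const workflow = {
  start: "draft",
  activities: {
    draft:   { next: "review",  script: (item) => ({ ...item, status: "in-review" }) },
    review:  { next: "release", script: (item) => ({ ...item, approved: true }) },
    release: { next: null,      script: (item) => ({ ...item, status: "released" }) },
  },
  // General failure handler (item 5): who to notify and what to do.
  onFailure: (activity, err) =>
    console.log(`notify admin: activity "${activity}" failed: ${err.message}`),
};

// A tiny runner that walks the flow – useful for testing the workflow
// against dummy or real data (the "visualize and test" idea, item 3).
function run(wf, item) {
  let current = wf.start;
  while (current) {
    const activity = wf.activities[current];
    try {
      item = activity.script(item);
    } catch (err) {
      wf.onFailure(current, err);
      return item; // stop the flow on failure
    }
    current = activity.next;
  }
  return item;
}

const result = run(workflow, { id: "PART-001", status: "draft" });
console.log(result.status);   // "released"
console.log(result.approved); // true
```

A visual designer could generate exactly this kind of structure behind the scenes, which is what would make the workflow easy to plug into different PLM systems.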

What is my conclusion? Well, this is my dream list. If I missed something, please don’t hesitate to add it. I believe it is possible to build an easy-to-use workflow system that can be easily plugged into any PLM system. Just my thoughts…

Best, Oleg

