What’s wrong with “analog PLM”?

July 6, 2015


In the engineering world, the difference between digital and analog can be explained in a very precise manner – analog systems use continuous signals, while digital systems represent information in discrete, binary form. Things are much more complicated when the term “digital” comes into the marketing realm.

The Accenture article Faster, fitter, better: Why product innovation is going digital takes a marketing spin and comes up with the term “digital PLM” – a system or approach that can help companies improve their product development processes and innovation. I like the following passage explaining the difference between the digital and non-digital PLM worlds:

Such models, where the multiple processes and systems live in silos, inhibit the flow of relevant information needed to optimize product development. Engineers, for example, are often disconnected from the new-product introduction process. As a consequence, new-product launch teams don’t always hear about critical last-minute design changes. And because vital insights are not shared, the solutions that eventually emerge from this fragmented system just aren’t meeting customer expectations for innovation and relevance. Moreover, because of this linear approach, product launch is often delayed.

It made me think more about “digital vs. analog”. An obvious example is companies using “paper” trays to share information and manage processes. I can see it as an option. But the chances are companies are using at least email to share information – which can still be a very complicated way to collaborate. I guess most companies are trying to step up from an analog way of sharing information and managing changes via email into the realm of PLM systems. But in many situations it doesn’t go very well, or it gets very expensive. So, what is the source of inefficiency in “analog PLM” implementations?

One of the core elements of every PLM system is its “data model”. Jos Voskuil provided a good explanation of what a “PLM data model” is in his latest article – Importance of PLM data models. According to Jos, the success of a PLM implementation depends on an efficient organization of the PLM data model. Here is the passage that explains it:

My conclusion for both situations is that it all leads to a correct (PLM) data model, allowing companies to store their data in an object-oriented manner. In this way reflecting the behavior the information objects have and the way they mature through their information lifecycle. If you making compromises here, it has an effect on your implementation, the way processes are supported out-of-the-box by a PLM system or how information can be shared with other enterprise systems, in particular, ERP. PLM is written between parenthesis as I believe in the future we do not talk PLM or ERP separate anymore – we will talk business.

Jos is absolutely right in his assumption. The traditional world of PLM implementations requires translating everything into PLM data models. Take any PLM system and you will see this step as an essential element of every implementation. If I refer to the engineering definition of an “analog” system, it means translating the real world – organization, data, processes – into the “analog world” of data models. With full respect to flexible PLM data models and architectures, this translation creates distortion, and the process itself is very complicated.
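Just to illustrate what that translation step produces, here is a minimal sketch in Python (all names here are hypothetical, not taken from any specific PLM system) of the kind of data model definition somebody has to design, review and maintain before the system can reflect organizational reality:

```python
from dataclasses import dataclass
from enum import Enum


class LifecycleState(Enum):
    """Maturity states an information object passes through."""
    IN_WORK = "In Work"
    IN_REVIEW = "In Review"
    RELEASED = "Released"
    OBSOLETE = "Obsolete"


@dataclass
class ItemType:
    """An abstraction of a real-world object (Part, Document, ECO...)."""
    name: str
    attributes: dict   # attribute name -> data type
    lifecycle: list    # ordered lifecycle states


@dataclass
class Relationship:
    """A typed link between two item types (e.g. a BOM line)."""
    name: str
    from_type: str
    to_type: str


# The "translation" step: mapping organizational reality into abstractions.
part = ItemType(
    name="Part",
    attributes={"number": str, "description": str, "cost": float},
    lifecycle=[LifecycleState.IN_WORK, LifecycleState.IN_REVIEW,
               LifecycleState.RELEASED, LifecycleState.OBSOLETE],
)
bom = Relationship(name="BOM", from_type="Part", to_type="Part")
```

Every item type, attribute and relationship in this sketch is a modeling decision somebody has to make. Multiply it by hundreds of types and you can see where implementation cost and distortion come from.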

Is there a better way? I think digital PLM (or whatever other name our marketing geniuses bring) will have to abandon the old “analog PLM” practice of data modeling as a complex and outdated approach. Digital PLM will be able to use a native representation of product information and product development processes across silos, without translation into a variety of PLM data model abstractions. It doesn’t mean data models will disappear tomorrow. Software will have to use data models anyway. But customers and implementers will be taken out of the loop.

What is my conclusion? Think about the existing PLM data modeling approach as a way to translate native sound or video signals into electric signals to transmit them across an organization. Think about it as an “analog approach”, similar to what we had in the past in audio and video recording. In PLM it created a whole level of implementation complexity – the need to define data models and map organizational reality into these models. Future “digital PLM” will have to bring a better way and exclude people from the formal data modeling loop. It will make PLM systems simpler to understand and easier to implement. Just my thoughts…

Best, Oleg

Picture credit: Wikipedia article presenting an ancient analog computer

What Aras “great fanfare” means for PLM industry

July 3, 2015


Aras is probably doing something right. At least, that is the immediate thought that comes to my mind when I see a line of press releases announcing significant deals and agreements. The latest one, about the Aras and Airbus partnership, came just a few days ago – Airbus and Aras sign Strategic Partner Agreement to use Aras Innovator for enterprise-wide Engineering Business Processes beyond 30,000 Users. The announcement lists 5 topics that potentially differentiated Aras from other PLM platforms. Here is the passage from the press release:

Several PLM Platforms were evaluated by Airbus for the ability to enable simple, agile solution delivery, and Aras was retained as the preferred platform based on: 1/ Significant coverage of expected scope; 2/ Open architecture with high-end data modeling “on the fly”, no development involved; 3/ Upgrade services for customizations included as part of subscription; 4/ Easy integration & handling; 5/ Long term viability and total cost of ownership

Four years ago, I attended the Aras event ACE 2011 in Detroit. You might remember my article – Aras PLM lines up against Windchill, Enovia and Teamcenter. I can see a connection between the topics from the Aras-Airbus announcement and my 2011 blog post – platform flexibility, customization, integration and low TCO.

The Aras announcement made me think about what has happened with Aras over the last 4 years. Here is the list of my thoughts.

1- Aras has probably arrived at the right combination of functional maturity and cost that allows it to be disruptive in larger deals against the top 3 leading PLM vendors (Dassault Systèmes, Siemens PLM and PTC).

2- Aras architecture can scale to the level demanded by the enterprise IT of large organizations thinking about tens of thousands of users. Aras sent a strong message last year (backed by Microsoft) with a 1-million-user PLM scalability test on Microsoft SQL Server, which potentially created confidence among IT organizations.

3- Aras removed one of the most complicated questions related to upgrades and continued support of new revisions. Aras is committed to providing “free upgrade services” for subscription customers. It allows IT managers not to be overly concerned about a potential “version lock”, where an organization is forced to pay a significant amount of money for services to migrate to the latest PLM software version.

So, how does it impact the PLM industry? For the last 10-15 years, nobody has created significant competition for the top 3 largest PLM software outfits – Enovia, Teamcenter, Windchill. But here is the thing – traditional PLM software has reached its limits, and many organizations need to consider what to do next. ROI is slow, and the upgrade to the next versions of PLM platforms is questionable.

What is my conclusion? It looks like Aras has the potential to change the status quo among the top tier of PLM providers. You can think about Aras as “pain relief” for companies thinking about how to grow their PLM deployment and concerned about the speed of ROI. Here is my formula for what happened: ARAS SUCCESS = MATURE PLM FUNCTIONS + STABLE ARCHITECTURE – HIGH LICENSE COST + ALL-INCLUSIVE UPGRADE SERVICES. So, what about the risks of switching to Aras? The potential risk for Aras comes from growing development and support effort that could force Aras to raise subscription costs. I see it as a challenge for the Aras team. But by that time, Aras can potentially disrupt a large enough number of manufacturing OEMs to become the 4th major PLM provider. It looks like over the next few years we will see growing competition between existing PLM vendors and Aras, which is on a mission to disrupt the PLM industry status quo. Just my thoughts.

Best, Oleg


Who’s ready to manage complexity of BOM for connected cars?

July 2, 2015


The complexity of modern manufacturing products is skyrocketing. It is hard to imagine a product without electronic and software components. But if you think the industry has reached the top of imaginable complexity, think again. It gets even more complicated.

The future of “connected cars” is coming fast. The modern automobile is already a very complex device with a lot of electronics and software. But it is getting even more complex. The Forbes article U.S. And European Automakers Will Need To Be More Aware Of The Chips They Put In Their Cars speaks about the complexity of car electronics and its connection to security-related issues. I found the following passage interesting:

With the modernization and electrification of vehicles, electronics as a percentage of the BOM (bill of materials) of the car has skyrocketed, and we haven’t seen anything yet. This will only become a higher percentage as piloted and self-driving vehicles start to become more commonplace. Up until this point, silicon brand and security hasn’t really mattered all that much as long as the functionality was there, and as a result, vendors simply implemented whatever met the utility, was more cost effective and passed regulatory rules.

As the percentage of the BOM that is electronic components increases and features are added that could increase potential security risk, I believe that this will change, and branding and security will become more important.

The complexity of BOM management is a well-known thing in the PLM industry – see my earlier blog post, Multiple dimensions of BOM complexity. The need to trace the manufacturers of electronic components in a car bill of materials will only increase the complexity of the data. Most PLM products today manage multiple views of engineering and as-built BOMs. The requirement for additional traceability and regulation in this space can potentially exceed the level of complexity PLM products are capable of handling.

In addition to branding, security or at least perceived security, will become an even more important factor in automobiles. Previously, people simply worried about people breaking into their cars with crowbars or wires, but now high-tech carjackers are breaking into cars remotely. Just think of all of the safety and security concerns with a vehicle that is fully in control of the driving experience at 65 Mph or more. Few really thinking that one through, yet.

Potential security concerns and government regulation will create a demand to expose more information about vehicle electronics. Making some of that information available will be another challenge for PLM systems in the automotive domain. Bill of materials data is siloed across multiple systems and often not available from a single place.
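To make it concrete, here is a simplified sketch (hypothetical names and fields, not a real PLM schema) of what a single BOM line starts to look like once electronic components must carry approved-manufacturer and security traceability data across engineering and as-built views – and now imagine this data scattered across several siloed systems:

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class ManufacturerPart:
    """A specific vendor's component approved for an internal part number."""
    manufacturer: str         # the silicon brand, which now matters
    mpn: str                  # manufacturer part number
    security_certified: bool  # e.g. passed a security/regulatory audit


@dataclass
class BOMLine:
    parent: str     # assembly part number
    component: str  # internal part number
    quantity: int
    view: str       # "engineering", "as-planned" or "as-built"
    approved_sources: list = field(default_factory=list)
    installed_source: Optional[ManufacturerPart] = None  # known only per vehicle


# As-built traceability: which exact chip went into which exact car.
ecu_line = BOMLine(
    parent="CAR-123", component="ECU-9", quantity=1, view="as-built",
    installed_source=ManufacturerPart("VendorX", "VX-552", security_certified=True),
)
print(ecu_line.installed_source.manufacturer)  # -> VendorX
```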

What is my conclusion? The complexity of automobiles, and specifically car electronics, will increase the demand for sophisticated data solutions to manage bills of materials (BOM) and related product data. Some existing PLM vendors might be unprepared for such a change, and for some systems it may be beyond what they can manage. This is an alarm call for PLM architects and technologists. Just my thoughts…

Best, Oleg

Images courtesy of Mashable article, Doug Chezem, Flickr, colinbrown


Growth hacking PLM sales

July 1, 2015


Enterprise sales is one of the most conservative things in the sales ecosystem. Despite the many changes in our lives over the last 10-15 years, this particular experience hasn’t changed much. You have probably heard the best recommendation for how to stop “PLM sales” from calling you – buy something from these guys. Selling a multimillion-dollar PLM deal to a large manufacturing OEM is an art performed by a group of diversely skilled sales people with heavy support from management, development and… marketing. Let’s speak about the last one – marketing. Do we really need it?

The Engineering.com article – The role of marketing in complex solution sales – brings a perspective on how modern digital marketing can help sell complex PLM solutions. In a nutshell, I can summarize it as the creation of a credible story that helps sales people make a sale. The passages below can give you a feeling for what that means.

Some prospects that the sales team has not reached may identify themselves by reading thought leadership stories and realizing a PLM system may be what they need. Marketing creates awareness among the decision makers who may not have heard of your solution. Marketing creates the content that helps prospects understand the value of a new solution. Thought leadership is a big part of the marketing mix for many engineering solution vendors. They routinely send speakers to conferences, for example, to demonstrate their command of technical challenges. These presentations translate very well to digital marketing, either as sponsored posts in trusted publications or as webinar presentations.

There is nothing bad about creating a credible story. For a long time, marketing was about amplifying messages from vendor to customer. So, you may think of new digital technologies as a set of new tools that help traditional marketing amplify its voice.

But here is the thing – this is the wrong, old approach. Using modern content marketing with a traditional sales approach is like putting lipstick on a pig. Guess what… it is still a pig. Back in 2012, Andrew Chen wrote in his blog – Growth Hacker is the new VP Marketing. If you have never heard about growth hacking, navigate here to read more. The following passage can give you some perspective:

This isn’t just a single role – the entire marketing team is being disrupted. Rather than a VP of Marketing with a bunch of non-technical marketers reporting to them, instead growth hackers are engineers leading teams of engineers. The process of integrating and optimizing your product to a big platform requires a blurring of lines between marketing, product, and engineering, so that they work together to make the product market itself. Projects like email deliverability, page-load times, and Facebook sign-in are no longer technical or design decisions – instead they are offensive weapons to win in the market.

It made me think that this new marketing approach can disrupt the existing paradigm of selling and implementing PLM products. Most PLM products today are first sold and then implemented by customers. This process requires a lot of effort from customers to grasp the PLM idea and think about how to apply it in their organization. Growth hacking can change that. A few years ago, I posted – How to ditch old PLM marketing and friend engineers. It could be part of growth hacking for PLM sales.

What is my conclusion? Growth hacking can be an important moment for PLM software. By disrupting traditional marketing and sales roles, growth hacking can change the core paradigm of PLM products – demanding that companies change the way they do business. Instead, the culture of growth hacking will introduce a practice of learning from customers and discovering opportunities to sell products that solve customer problems. Just my thoughts…

Best, Oleg

Image courtesy of sattva at FreeDigitalPhotos.net


How to escape “listing” paradigm and reinvent ECO

June 30, 2015


Do you remember what a "hard copy listing" is? If you had a chance to write software programs in the 1980s, you might remember "listings" – a printed list of computer code or digital data in human-readable format. Navigate to the following Wikipedia article to refresh your memory. We don’t do listings anymore. They are gone. Wikipedia explains it in dead simple language:

Today, hard copy listings are seldom used because display screens can present more lines than formerly, programs tend to be modular, storage in soft copy is considered preferable to hard copy, and digital material is easily transmitted via networks, or on disks or tapes. Furthermore, data sets tend to be too large to be conveniently put on paper, and they are more easily searched in soft-copy form.

My long-time blogging buddy Ed Lopategui discusses common practices and challenges manufacturing companies experience with ECOs (Engineering Change Orders). I recommend you take a look at the following two articles on the GrabCAD blog: ECOs Aren’t Dead, But They Are Slow and Stupid and ECOs are stupid II: The price of unincorporated change. ECO best practices are heavily influenced by the traditional approach of the paper-oriented world, following old-school configuration management standards. The following passage explains it very well:

Most engineering change processes are rooted in very formal and traditional frameworks. ECOs can be traced back to Configuration Management (CM) practices that literally come from a time well before CAD (much less PDM/PLM) where manual drawings ruled the earth. Engineering data was neither readily portable nor widely accessible. These effective but complex practices were established in the larger, older manufacturing companies that became the first natural customers to afford PDM/PLM.. As a result, these processes live on and are perceived as absolutes. They remain relatively intact, buoyed by large company process culture despite the opportunity to evolve.

Unincorporated change is another archaic practice, a reflection of the old way of making changes in drawings. The complexity of changing documentation created the change process practice itself. The so-called "was/now" practice arose because comparing two states of a design was very complicated:

The concept of an unincorporated change was a necessary compromise in the past because of limitations in changing design data, especially in the era of manual drawings and early CAD, when drawing views didn’t just update with a click or two. Understanding the difference from one version of a design to another chiefly involved intense staring for prolonged periods of time until the change was understood and/or blindness occurred. To minimize eye strain, ECOs often highlighted the changes specifically, sometimes with designation of WAS/NOW views side by side on the form.

Now, let’s move to the modern world of software. Imagine a software engineer writing a paper document explaining the code changes he is going to implement tomorrow. He prints a "listing", marks the "was/now" differences in the code with a yellow highlighter, then submits it for approval and documentation. Only after that is done does he perform the actual change. It probably sounds strange.
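Compare that to what software tools actually do today – the "was/now" view is computed from the data itself, not maintained by hand. Here is a minimal sketch using Python’s standard difflib module:

```python
import difflib

# Two revisions of a design artifact (lines of code here, but the idea
# applies to any data that exists in two versions).
was = ["bolt_count = 4", "material = 'steel'", "finish = 'raw'"]
now = ["bolt_count = 6", "material = 'steel'", "finish = 'anodized'"]

# The "was/now" comparison is generated automatically.
for line in difflib.unified_diff(was, now, fromfile="rev_A", tofile="rev_B",
                                 lineterm=""):
    print(line)
```

No forms, no highlighters, no approval of a paper diff before the change is made – the comparison is a byproduct of keeping the data in digital, versioned form.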

The traditional PLM paradigms are very much like old-school software listings – slow and complicated. I consistently hear that engineers avoid making formal changes and abandon lifecycle processes because performing the lifecycle is slow and complicated. Nobody wants to deal with complicated forms and processes. In many situations the process is just too complex for teams and people to deal with. I pointed out some of these problems in my earlier article – why PLM should revise NPI processes.

Agile software development has become very popular over the last decade. It introduces many concepts that can be adopted by manufacturing companies. One of the most interesting opportunities I can see is around making change management fast and easy. When it comes to change, speed is a very important thing.

What is my conclusion? Old habits die hard. ECO is one of them, and it goes back 20-30 years to best practices that were developed before CAD systems were capable of comparing two versions of a model and visualizing the differences. Check my earlier post about how to compare versions and changes. New technologies and new practices should come and displace old "ECO listings" with an agile, paperless and easy way. Just my thoughts…

Best, Oleg

Picture credit: Wikipedia article – Computer Listing.


PLM Workflow “California Roll” Style

June 29, 2015


Product Lifecycle Management software is obsessed with the idea of workflow. You can see it offered by all PLM vendors. To get a taste of traditional PLM workflows, navigate to my five-year-old blog post – PLM processes: flowcharts vs rule-based – it gives some ideas of how PLM products define processes. The PLM workflow business has also been targeted by newer products in the PLM domain. Autodesk PLM 360 puts a significant focus on process management; you can read more in Mike Watkins’ blog here. Kenesto – a software outfit founded by Mike Payne – was also relaunched last year with a specific focus on workflow process management – Kenesto PLM platform relaunched.

The idea of workflows didn’t die. It is actually a good idea. But here is the thing – PLM workflows are very hard to implement. For many manufacturing companies, PLM workflow is therefore still a dream you can experience only in PowerPoint presentations. At the same time, the idea of workflow brings a holistic way to think about processes – all the elements of a PLM workflow (or process) matter: activities, events, approvals, actions, statuses, etc. However, it doesn’t work, or at least doesn’t work in an easy way.

My attention was caught by a blog post by Nir Eyal, which has no obvious connection to PLM. It speaks about the intersection of psychology, technology and business – People Don’t Want Something Truly New, They Want the Familiar Done Differently. I like the California Roll example. Here is my favorite passage from the blog:

Then came the California Roll. While the origin of the famous maki is still contested, its impact is undeniable. The California Roll was made in the USA by combining familiar ingredients in a new way. Rice, avocado, cucumber, sesame seeds, and crab meat — the only ingredient unfamiliar to the average American palate was the barely visible sliver of nori seaweed holding it all together.

The California Roll provided a gateway to discover Japanese cuisine and demand exploded. Over the next few decades sushi restaurants, which were once confined to large coastal cities and almost exclusively served Japanese clientele, suddenly went mainstream. Today, sushi is served in small rural towns, airports, strip malls, and stocked in the deli section of local supermarkets. Americans now consume $2.25 billion of sushi annually.

It made me think about “PLM workflows” done differently. What if we could take all the right elements of PLM workflow and combine them in a way that won’t scare companies off? A SolidSmack article earlier today speaks about IFTTT – a platform to automate web application tasks. Navigate to the following link – IFTTT introduce the maker to maker channel for makers and hardware developers. I liked the following passage:

Of course, while the ability to automatically automate web application tasks is certainly a very powerful thought, one can only imagine what this might mean as we enter an age of more connected hardware devices in addition to our existing phones, tablets and laptops. Within the last year, the platform has started to integrate its service into a collection of connected home products including Wink devices, Nest Thermostat, SmartThings, WeMo switches and other off-the-shelf products.

The following video can give you an idea how IFTTT works.

IFTTT is not the only way to automate your web tasks. Navigate here to see alternatives to IFTTT – Zapier, itDuzzit, Workato, Jitterbit and others.
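Imagine the same trigger-action style applied to PLM processes. Here is a minimal sketch (all event names, statuses and services are hypothetical, for illustration only) of an “if this then that” rule an engineer could configure without months of workflow planning:

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class Rule:
    """An IFTTT-style recipe: one trigger, one condition, one action."""
    trigger: str                       # event name published by a service
    condition: Callable[[dict], bool]  # filter on the event payload
    action: Callable[[dict], None]     # what to do when it matches


def notify_manufacturing(event: dict) -> None:
    # Hypothetical integration point, e.g. post to a chat or task service.
    print(f"ECO {event['eco_id']} approved -> notify manufacturing team")


rules = [
    Rule(
        trigger="eco.status_changed",
        condition=lambda e: e["new_status"] == "Approved",
        action=notify_manufacturing,
    ),
]


def dispatch(event_name: str, payload: dict) -> None:
    """Run every matching rule - this is the entire 'workflow engine'."""
    for rule in rules:
        if rule.trigger == event_name and rule.condition(payload):
            rule.action(payload)


dispatch("eco.status_changed", {"eco_id": "ECO-042", "new_status": "Approved"})
```

The point is not the code – it is that each rule is small, self-explanatory and composable, instead of a monolithic flowchart designed upfront.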

What is my conclusion? It is time to rethink the way workflow is done for PLM. Traditional PLM architecture overemphasizes the planning step – organizing data and controlling information. Think about something different – a set of reusable web services orchestrated by tools that every person can use. All the components of a typical PLM workflow are still there, but it is user friendly and doesn’t require 3-6 months of planning activities. Is it a dream? Maybe… But to me it sounds like something PLM architects and technologists might think about. Just my thoughts…

Best, Oleg


Brutal reality of process management for hardware startup

June 26, 2015


Startup companies and process management – these are probably the two most conflicting notions you can think about. Everyone is familiar with Zuckerberg’s famous statement – move fast and break things. How can process management survive in such an environment? Although Facebook is more careful these days about "breaking things", startups still operate on the edge, moving as fast as possible. The outcome is questionable for many hardware startups: a large majority of young companies do not deliver on time. Rough statistics say 75-80% of hardware projects on Kickstarter do not ship products on time. That was the piece of information that made me write the following post last year – Why Kickstarter projects need PLM?

The blog post Speed Can Kill: the importance of process for hardware, written by Ben Einstein of Bolt, puts a great perspective on what process means for manufacturing startups – companies dealing with atoms (not bits, like software companies). Take a look at the article – I found it very interesting. The so-called "long shadow effect" is brutal if you think about the potential impact of decisions made during the product development lifecycle. The following passage explains how it can happen:

Early mistakes often don’t have a measurable impact until first shots are coming off tooling and the manufacturing process grinds to a halt. My partner Scott calls this the “long shadow effect.” An early decision about which microcontroller to use or the shape of a housing can appear correct until months or even years later during the first production run. Sometimes parts can have exceptionally long lead times, require odd financing terms, demand manual rework, or be entirely un-moldable. None of these problems can be uncovered by moving quickly to get to production.

The article speaks about four "trunks" as the way to organize processes. I found it an interesting contrast to the more traditional departmental organization you can see in PLM implementations at larger companies. It reminded me of my earlier blog about why PLM companies should revise NPI processes. The waterfall process is complicated and can introduce many artificial breakpoints that prevent a company from moving fast. At the same time, running without process organization will fail you in the late stages of product development and manufacturing. My favorite passage from Bolt’s blog is the following conclusion:

This is not to say that hardware startups can’t move quickly; in fact they can move faster than ever before. But the ability to go fast and build good products on time and under budget comes from process, not pure iteration. The fastest companies tend to be exceptionally organized about their product development and manufacturing process. Many people have asked us to cover this in much more detail and we’re working on a 4 part series exploring each of these trunks.

What is my conclusion? I’d change the original "breaking things" statement to fit the reality of manufacturing companies as follows – "move fast and respect process". It is easy to say and hard to implement. The complexity of products and processes is high. Products combine mechanical parts, electronics and software. Work and teams are distributed. All things are interconnected and can create "long shadow effects". Hardware startup companies are struggling to set up the basic elements of information and process organization. My hunch is that the companies able to organize the four trunks right will survive. This is a note for PLM architects and other people thinking about how to apply PLM technologies to agile and dynamic processes. Just my thoughts…

Best, Oleg

Image courtesy of hywards at FreeDigitalPhotos.net

