What Aras’ “great fanfare” means for the PLM industry

July 3, 2015


Aras is probably doing something right. At least, this is the immediate thought that comes to my mind when I see a line of press releases announcing significant deals and agreements. The latest one, about an Aras and Airbus partnership, came just a few days ago – Airbus and Aras sign Strategic Partner Agreement to use Aras Innovator for enterprise-wide Engineering Business Processes beyond 30,000 Users. The announcement lists 5 topics that potentially differentiated Aras from other PLM platforms. Here is the passage from the press release:

Several PLM Platforms were evaluated by Airbus for the ability to enable simple, agile solution delivery, and Aras was retained as the preferred platform based on: 1/ Significant coverage of expected scope; 2/ Open architecture with high-end data modeling “on the fly”, no development involved; 3/ Upgrade services for customizations included as part of subscription; 4/ Easy integration & handling; 5/ Long term viability and total cost of ownership

Four years ago, I attended the Aras event – ACE 2011 in Detroit. You might remember my article – Aras PLM lines up against Windchill, Enovia and Teamcenter. I can see a connection between the topics from the Aras-Airbus announcement and my 2011 blog post – platform flexibility, customization, integration and low TCO.

The Aras announcement made me think about what has happened with Aras over the last 4 years. Here is the list of my thoughts.

1- Aras is probably arriving at the right combination of functional maturity and cost that allows it to be disruptive in larger deals against the top 3 leading PLM vendors (Dassault Systèmes, Siemens PLM and PTC).

2- Aras architecture can scale to the level demanded by the enterprise IT of large organizations thinking about tens of thousands of users. Aras sent a strong message last year (backed up by Microsoft) about a 1 million PLM user scalability test on Microsoft SQL Server, which potentially created confidence for IT organizations.

3- Aras removed one of the most complicated questions, related to upgrades and continuous support of new revisions. Aras is committed to providing “free upgrade services” for subscription customers. It allows IT managers not to be overly concerned about a potential “version lock,” where an organization is forced to pay a significant amount of money for services to migrate to the latest PLM software version.

So, how does it impact the PLM industry? For the last 10-15 years, nobody has created significant competition for the top 3 largest PLM software outfits – Enovia, Teamcenter, Windchill. But here is the thing – traditional PLM software has reached its limits and many organizations need to consider what to do next. The ROI is slow and the upgrade to the next versions of PLM platforms is questionable.

What is my conclusion? It looks like Aras has the potential to change the status quo among the top tier of PLM providers. You can think about Aras as “pain relief” for companies thinking about how to grow their PLM implementations and concerned about ROI speed. Here is my formula for what happened: ARAS SUCCESS = MATURE PLM FUNCTIONS + STABLE ARCHITECTURE – HIGH LICENSE COST + ALL INCLUSIVE UPGRADE SERVICES. So, what about the risks of switching to Aras? The potential risk for Aras can come from growing development and support effort that will force Aras to raise subscription cost. I see it as a challenge for the Aras team. But by that time, Aras can potentially disrupt a large enough number of manufacturing OEMs to become the 4th major PLM provider. It looks like over the next few years we will see growing competition between existing PLM vendors and Aras, which is on a mission to disrupt the PLM industry status quo. Just my thoughts.

Best, Oleg


Who’s ready to manage the complexity of BOM for connected cars?

July 2, 2015


The complexity of modern manufacturing products is skyrocketing. It is hard to imagine a product without electronic and software components. But if you think the industry has reached the top of imaginable complexity, think again. It gets even more complicated.

The future of “connected cars” is coming fast. The modern automobile is already a very complex device with a lot of electronics and software. But it is getting even more complex. The Forbes article U.S. And European Automakers Will Need To Be More Aware Of The Chips They Put In Their Cars speaks about the complexity of car electronics and its connection to security-related issues. I found the following passage interesting:

With the modernization and electrification of vehicles, electronics as a percentage of the BOM (bill of materials) of the car has skyrocketed, and we haven’t seen anything yet. This will only become a higher percentage as piloted and self-driving vehicles start to become more commonplace. Up until this point, silicon brand and security hasn’t really mattered all that much as long as the functionality was there, and as a result, vendors simply implemented whatever met the utility, was more cost effective and passed regulatory rules.

As the percentage of the BOM that is electronic components increases and features are added that could increase potential security risk, I believe that this will change, and branding and security will become more important.

The complexity of BOM management is a well-known thing in the PLM industry – see my earlier blog post, Multiple dimensions of BOM complexity. The need to trace the manufacturers of electronic components in a car bill of materials will only increase the complexity of the data. Most PLM products today manage multiple views of the engineering and as-built BOM. The requirement for additional traceability and regulation in this space can potentially break the level of complexity PLM products are capable of handling.
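
To make the traceability requirement concrete, here is a minimal sketch of what tracing electronic component manufacturers through a multi-level BOM could look like. All part numbers, the BOM structure and the manufacturer names are hypothetical, invented only for illustration:

```python
# Hypothetical multi-level BOM with manufacturer traceability for
# electronic components. Part numbers and vendor names are invented.

bom = {
    "CAR-100": ["BODY-10", "ECU-20", "INFOTAINMENT-30"],
    "ECU-20": ["PCB-21", "MCU-22"],
    "INFOTAINMENT-30": ["DISPLAY-31", "SOC-32"],
}

# Electronic parts carry the manufacturer attribute that regulators
# (or security audits) might ask a PLM system to report on.
manufacturers = {
    "MCU-22": "ChipVendorA",
    "SOC-32": "ChipVendorB",
    "DISPLAY-31": "PanelVendorC",
}

def electronic_components(root):
    """Walk the BOM tree and collect every part with a known silicon/panel maker."""
    found = {}
    stack = [root]
    while stack:
        part = stack.pop()
        if part in manufacturers:
            found[part] = manufacturers[part]
        stack.extend(bom.get(part, []))
    return found

print(electronic_components("CAR-100"))
```

Even this toy version shows the issue: the manufacturer data lives in a separate structure from the BOM tree, which is exactly the kind of silo a real traceability requirement would force PLM systems to bridge.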

In addition to branding, security or at least perceived security, will become an even more important factor in automobiles. Previously, people simply worried about people breaking into their cars with crowbars or wires, but now high-tech carjackers are breaking into cars remotely. Just think of all of the safety and security concerns with a vehicle that is fully in control of the driving experience at 65 Mph or more. Few really thinking that one through, yet.

Potential security concerns and government regulation will create demand to expose more information about vehicle electronics. Making some of that information available will be another challenge for PLM systems in the automotive domain. Bill of materials data is siloed across multiple systems and often not available from a single place.

What is my conclusion? The complexity of automobiles, and specifically car electronics, will increase the demand for sophisticated data solutions to manage the bill of materials (BOM) and related product data. Some existing PLM vendors might be unprepared for such a change, and for some systems it can be beyond what can be managed. This is an alarm call for PLM architects and technologists. Just my thoughts…

Best, Oleg

Images courtesy of Mashable article, Doug Chezem, Flickr, colinbrown



Growth hacking PLM sales

July 1, 2015


Enterprise sales is one of the most conservative things in the sales ecosystem. Despite the many changes that have happened in our lives over the last 10-15 years, this particular experience hasn’t changed much. You have probably heard the best recommendation for how to stop “PLM sales” from calling you – buy something from these guys. Selling a multimillion-dollar PLM deal to a large manufacturing OEM is an art performance by a group of diversely skilled sales people with heavy support from management, development and… marketing. Let’s speak about the last one – marketing. Do we really need it?

The Engineering.com article – The role of marketing in complex solution sales – brings a perspective on how modern digital marketing can help sell complex PLM solutions. In a nutshell, I can summarize it as the creation of a credible story that helps sales people make a sale. The passages below can give you a feeling for what that is.

Some prospects that the sales team has not reached may identify themselves by reading thought leadership stories and realizing a PLM system may be what they need. Marketing creates awareness among the decision makers who may not have heard of your solution. Marketing creates the content that helps prospects understand the value of a new solution. Thought leadership is a big part of the marketing mix for many engineering solution vendors. They routinely send speakers to conferences, for example, to demonstrate their command of technical challenges. These presentations translate very well to digital marketing, either as sponsored posts in trusted publications or as webinar presentations.

There is nothing bad about creating a credible story. For a long time, marketing was about how to amplify messages from vendor to customer. So, you may think about new digital technologies as a set of new tools that came to help traditional marketing amplify its voice.

But here is the thing – this is the wrong, old approach. Using modern content marketing with a traditional sales approach is like putting lipstick on a pig. Guess what… it is still a pig. A few years ago, back in 2012, Andrew Chen wrote in his blog – Growth Hacker is the new VP Marketing. If you have never heard about growth hacking, navigate here to read more. The following passage can give you some perspective:

This isn’t just a single role – the entire marketing team is being disrupted. Rather than a VP of Marketing with a bunch of non-technical marketers reporting to them, instead growth hackers are engineers leading teams of engineers. The process of integrating and optimizing your product to a big platform requires a blurring of lines between marketing, product, and engineering, so that they work together to make the product market itself. Projects like email deliverability, page-load times, and Facebook sign-in are no longer technical or design decisions – instead they are offensive weapons to win in the market.

It made me think that a new marketing approach can disrupt the existing PLM paradigm of selling and implementing PLM products. Most PLM products today are first sold and then implemented by customers. This process requires a lot of effort from customers to grasp the PLM idea and think about how to apply it in an organization. Growth hacking can change that. A few years ago, I posted – How to ditch old PLM marketing and friend engineers. It could be part of growth hacking for PLM sales.

What is my conclusion? Growth hacking can be an important moment for PLM software. By disrupting traditional marketing and sales roles, growth hacking can change the core paradigm of PLM products – changing the way companies do business. Instead, the culture of growth hacking will introduce a practice of learning from customers and discovering opportunities to sell products that solve customer problems. Just my thoughts…

Best, Oleg

Image courtesy of sattva at FreeDigitalPhotos.net




PLM Workflow “California Roll” Style

June 29, 2015


Product Lifecycle Management software is obsessed with the idea of workflow. You can see it offered by all PLM vendors. To get a taste of traditional PLM workflows, navigate to my five-year-old blog post – PLM processes: flowcharts vs rule-based – it gives some ideas about how PLM products define processes. The PLM workflow business has also been targeted by newer products in the PLM domain. Autodesk PLM360 put a significant focus on process management. You can read more in Mike Watkins’ blog here. Kenesto – a software outfit founded by Mike Payne – was also relaunched last year with a specific focus on workflow process management – Kenesto PLM platform relaunched.

The idea of workflows didn’t die. It is actually a good idea. But here is the thing – PLM workflows are very hard to implement. Therefore, for many manufacturing companies, PLM workflow is still a dream that you can experience in PowerPoint presentations only. At the same time, the idea of workflow brings a way to think about processes in a holistic way. At the very least, all elements of a PLM workflow (or process) are important – activities, events, approvals, actions, statuses, etc. However, it doesn’t work, or doesn’t work in an easy way.

My attention was caught by a blog post by Nir Eyal, which has no obvious connection to PLM. It speaks about the intersection of psychology, technology and business – People Don’t Want Something Truly New, They Want the Familiar Done Differently. I like the California Roll example. Here is my favorite passage from the blog:

Then came the California Roll. While the origin of the famous maki is still contested, its impact is undeniable. The California Roll was made in the USA by combining familiar ingredients in a new way. Rice, avocado, cucumber, sesame seeds, and crab meat — the only ingredient unfamiliar to the average American palate was the barely visible sliver of nori seaweed holding it all together.

The California Roll provided a gateway to discover Japanese cuisine and demand exploded. Over the next few decades sushi restaurants, which were once confined to large coastal cities and almost exclusively served Japanese clientele, suddenly went mainstream. Today, sushi is served in small rural towns, airports, strip malls, and stocked in the deli section of local supermarkets. Americans now consume $2.25 billion of sushi annually.

It made me think about “PLM workflows” done differently. So, what if we could take all the right elements of PLM workflow and combine them in a way that won’t hurt companies to think about? A SolidSmack article earlier today speaks about IFTTT – a platform to automate web application tasks. Navigate to the following link – IFTTT introduce the maker to maker channel for makers and hardware developers. I liked the following passage.

Of course, while the ability to automatically automate web application tasks is certainly a very powerful thought, one can only imagine what this might mean as we enter an age of more connected hardware devices in addition to our existing phones, tablets and laptops. Within the last year, the platform has started to integrate its service into a collection of connected home products including Wink devices, Nest Thermostat, SmartThings, WeMo switches and other off-the-shelf products.

The following video can give you an idea how IFTTT works.

IFTTT is not the only way to automate your web tasks. Navigate here to see alternatives to IFTTT – Zapier, itDuzzit, Workato, Jitterbit and others.

What is my conclusion? It is time to re-think the way workflow is done for PLM. The more traditional PLM architecture overemphasizes the planning step – organizing data and controlling information. Think about something different – a set of re-usable web services orchestrated by tools that every person can use. So, all the components of a typical PLM workflow are there, but it is user-friendly and doesn’t require 3-6 months of planning activities. Is it a dream? Maybe… But to me it sounds like something PLM architects and technologists might think about. Just my thoughts…
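
To show what “PLM workflow, California Roll style” could mean in practice, here is a minimal sketch of an IFTTT-style trigger/action rule engine applied to PLM events. The event types (“eco_approved”, “part_released”) and actions are hypothetical, invented only to illustrate the pattern:

```python
# IFTTT-style "if this then that" rules for hypothetical PLM events.
# Each rule is just a (trigger predicate, action) pair -- no 3-6 month
# workflow modeling project required.

rules = []

def when(predicate):
    """Register the decorated function as the action for a trigger."""
    def register(action):
        rules.append((predicate, action))
        return action
    return register

@when(lambda e: e["type"] == "eco_approved")
def notify_suppliers(event):
    return f"email suppliers: {event['id']} approved"

@when(lambda e: e["type"] == "part_released")
def push_to_erp(event):
    return f"push part {event['id']} to ERP"

def handle(event):
    """Run every rule whose trigger matches the incoming event."""
    return [action(event) for predicate, action in rules if predicate(event)]

print(handle({"type": "eco_approved", "id": "ECO-042"}))
# -> ['email suppliers: ECO-042 approved']
```

The familiar ingredients – events, actions, approvals – are all there; what changes is that a user composes simple rules instead of modeling a monolithic flowchart up front.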

Best, Oleg


“True SaaS” PLM – the devil is in the details

June 23, 2015


My earlier attempt to compare PLM vendors and cloud services raised many comments, online and offline. I want to thank everybody who commented and shared their insight – it helps me build the next version of the comparison table. Also, I can see the importance of open discussion for the future of the cloud PLM industry.

One of the most debated topics is the definition of SaaS. The questions and opinions keep coming. Can we qualify a particular solution as SaaS? What are the characteristics of a SaaS product from a technological and business perspective? And finally… can we define what “true SaaS” is? I want to step back and talk about SaaS first. Below is the definition from Wikipedia. You can learn more here – Software as a Service.

Software as a service (SaaS; pronounced /sæs/ or /sɑːs/) is a software licensing and delivery model in which software is licensed on a subscription basis and is centrally hosted. It is sometimes referred to as “on-demand software”. SaaS is typically accessed by users using a thin client via a web browser.

This definition leaves a lot of flexibility and, of course, doesn’t reflect multiple aspects of product and technology – the core source of disagreement about what “true SaaS” is. I want to focus on some of them – product portfolio, subscription business model, IaaS and hosting, product versions and releases, upgrades, and thin/thick client access.

1- Product portfolio. This is a question about cloud and on-prem portfolios. Do you believe a company can provide both traditional PLM software and cloud (SaaS) software? For large companies, it is mostly part of their strategy – a tricky balancing act of selling existing products while moving into the future. For smaller companies, it is a question about their focus.

2- Subscription business model. Most subscription-based products are tagged with a price per user / per month. Is it a model you want to follow? Do you expect to pay monthly? Is it just a way to advertise the price? What is the additional cost associated with product deployment, operation, support and services?
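
As a back-of-the-envelope illustration of why the advertised per-user price is only part of the story, here is a small sketch. Every number below is hypothetical, chosen only to show the arithmetic:

```python
# Hypothetical back-of-the-envelope SaaS cost estimate. The advertised
# per-user/per-month price is only one line in the total cost of
# ownership; deployment, support and services add to it.

users = 50
price_per_user_month = 75          # advertised subscription price (USD)
one_time_deployment = 20_000       # data migration, configuration, training
yearly_support_services = 10_000   # admin, integrations, ongoing services
years = 3

subscription = users * price_per_user_month * 12 * years
total = subscription + one_time_deployment + yearly_support_services * years

print(f"subscription only: ${subscription:,}")  # $135,000
print(f"3-year total:      ${total:,}")         # $185,000
```

Even with invented numbers, the gap between the headline subscription figure and the real 3-year total shows why the question about additional costs matters.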

3- IaaS and hosting. There are multiple sources of infrastructure for cloud software these days. You can run it using services like AWS and Microsoft Azure. Alternatively, you can host it using a variety of hosting providers. If your business is large enough, the question of a company datacenter can come up.

4- Product versions and releases. This is an important question about the availability of multiple versions and configurations of your products. The option to keep a single version of truth for your cloud product has a lot of advantages. But at the same time, it can raise concerns from IT folks thinking about how to make a cloud product compatible with the other software run by the company.

5- Upgrades. The topic of software upgrades is painful. Who is responsible for upgrading your environment when the product moves to the next release? Cloud software vendors are traditionally responsible for infrastructure and upgrades. But specific customizations and configurations can complicate things.

6- Thin vs. thick clients. Do you think “cloud” equals “browser”? For some people, the answer is a clear yes. Do you think browser access is the ultimate characteristic of “true SaaS” software? You can decide what is important for you, but consider the implications of managing software installed on mobile devices, laptops and desktop computers.

What is my conclusion? The devil is in the details, and the SaaS definition raises many questions. I’m working on the next version of the PLM cloud services comparison between vendors. It is a competitive space and vendors will have to work to explain their products and technology. To say “cloud” is not enough. SaaS has no simple definition. Understanding the multiple characteristics of SaaS is important to making the right decision about what solution is right for you. Just my thoughts…

Best, Oleg

PS. If you have specific questions, please reach me out via email – oleg [@] beyondplm [.] com

Image courtesy of Suat Eman at FreeDigitalPhotos.net


PLM vendors are at risk of following the ERP dinosaurs

June 22, 2015


When it comes to PLM and ERP, you can always sense some rivalry in the air. The PLM and ERP domains are not competing. However, ERP and PLM vendors are clearly competing for customers’ strategic mindset. After all, it always comes down to the competition for strategic budget allocation.

For many years, ERP vendors came first to the CIO’s corner office. These days, some ERP dinosaurs are fighting for survival in a new era of cloud computing and SaaS software. The Toolbox.com article – 6 Reasons Why The Dinosaurs of ERP Are About To Go Extinct – provides an interesting perspective on what is happening in the ERP software domain today. The 6 reasons in the article answer why traditional on-prem software solutions provided by companies like PeopleSoft are being eaten by rival cloud newcomers.

The article made me think about the trajectory of some implementations and vendors in the PLM domain. I can clearly see some similarities. Do you think some 20-30 year old PLM vendors will follow the path of the dinosaurs of ERP? Here are some of my thoughts about the PLM domain and on-prem / cloud trends.

1- Old platforms need to retire

For the last 15-20 years, manufacturing companies adopted 2-3 generations of PLM software. For some of them (especially very large companies), the adoption process was long and expensive. It took time to align processes and systems together. At the same time, technology is moving forward. To move forward, many customers need to drop old products and move to support new operating systems, hardware, devices, internet browsers, etc. It obviously raises the question of how to retire old platforms – a very painful question for many companies.

2- IT managers and upgrades

Upgrades are painful, and IT is the department that traditionally spends a lot of resources and money upgrading all of a company’s systems. Cloud systems are going to change that. Since data centers and cloud infrastructure are owned by software vendors, the vendors also take responsibility for upgrades. Some innovative PLM vendors such as Aras are including upgrades in their subscription, even for on-prem installations.

3- Mobile and site independence

Our working environment is changing. 10-15 years ago, our work was mostly on site. These days, the workforce is distributed. People don’t need to be at their desks to do a job. Multiple locations are a reality even for small companies. Mobile devices are in everyone’s pocket. A system capable of working in such an environment is an imperative for every company.

4- How to get small customers on board

PLM vendors made multiple attempts to provide a solution for smaller companies. It never worked. I can list a number of products that were announced, retired and discontinued. However, the importance of smaller companies will only increase. New technologies and online market communities are making smaller manufacturers more competitive. That will bring an additional need for cloud PLM systems.

What is my conclusion? Manufacturing companies are slow in their adoption of new technologies. PLM has never been the first place for cloud companies to innovate. But the reality of the outside world and online business is bringing manufacturing companies to the point where they will need competitive software for product development and manufacturing. Old systems won’t survive and will have to retire. It is time for PLM vendors to think about innovation and new platforms. Otherwise, it might be too late to build and too expensive to buy. Just my thoughts…

Best, Oleg


PLM and the future of deep linking

June 19, 2015


I like links. Links are fascinating things. They are about connections between products, people, companies and things. The nature of our life today is to stay connected. Therefore, links are important. It is natural to see links appearing everywhere in engineering and manufacturing too. Think about a product and its dependencies. The information is naturally linked between assemblies, parts, documents, bills of materials, materials, suppliers, manufacturing, the shop floor, orders and many other things.

The nature of things in PLM is to be connected. At the same time, the reality of engineering software is in fact highly fragmented. The time when vendors and customers believed in a single system (or database) that could contain and manage all information is over. Customers are using multiple applications, and it is not unusual to see two or more PLM systems in the same company. When it comes to the supply chain, the situation is even more complex.

Application integration remains one of the most painful aspects of enterprise software, and PLM can clearly lead the wave of complexity involved in implementations. My article yesterday – How PLM can avoid cloud integration spaghetti – was a warning to all folks who imagine that the cloud will be a silver bullet to kill application integration pain. It isn’t. Cloud integration can sometimes be even more complex than traditional integration hacks using SQL and ETL tools.

I want to continue discussing the topic of cloud integration. The topic I’m bringing up today is related to so-called "deep linking". If you’re not familiar with the topic, navigate to the following Wikipedia article – Deep Linking – to get some background information.

In the context of the World Wide Web, deep linking consists of using a hyperlink that links to a specific, generally searchable or indexed, piece of web content on a website (e.g. http://example.com/path/page), rather than the home page (e.g. http://example.com/). The technology behind the World Wide Web, the Hypertext Transfer Protocol (HTTP), does not actually make any distinction between "deep" links and any other links—all links are functionally equal. This is intentional; one of the design purposes of the Web is to allow authors to link to any published document on another site. The possibility of so-called "deep" linking is therefore built into the Web technology of HTTP and URLs by default—while a site can attempt to restrict deep links, to do so requires extra effort. According to the World Wide Web Consortium Technical Architecture Group, "any attempt to forbid the practice of deep linking is based on a misunderstanding of the technology, and threatens to undermine the functioning of the Web as a whole".[1]

The TechCrunch article – A brief history of deep linking – brings an interesting perspective on the trajectory of deep linking development on the web and in the app world. Below is my favorite passage. It is important since it gives a notion of how to treat standards in the internet and application world.

In order for me to write this article, and for you to be able to read it, we have to share a common language: modern English. The same holds true for directing users through deep links — in order to construct a deep link that an application will understand, we need to have some shared way of expressing information or addresses. In software engineering, a well-defined shared vernacular is defined by a “standard.”

The problem with standards, though, is that many of them do not actually become standard practice, and introduce as much fragmentation as they resolve. I could define the word “basilafacitarian” as “a person who likes basil a lot,” but unless it enters the common vernacular, it’s useless as a means of communication for me to tell you that I like basil.

The same is true for an app speaking to another app; unless the URL “myapp://show-item/id123” is mutually agreed upon, there’s no guarantee what the receiving app will do with it.

Coming back to the PLM world, we can see customers and vendors struggling with data and application fragmentation. It is not getting any better in the cloud PLM world – we just move to another set of APIs and technologies.
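
To make the “mutually agreed upon” point concrete, here is a minimal sketch of parsing a deep link under a hypothetical PLM convention. The `plm://` scheme and the `<object-type>/<object-id>?rev=` layout are inventions for illustration – the whole point is that they only work if both applications agree on them:

```python
from urllib.parse import urlparse

# Hypothetical deep link convention two PLM applications could agree on:
#   plm://<object-type>/<object-id>?rev=<revision>
# Without that shared agreement, the receiving app has no guarantee
# about what to do with the URL.

def parse_plm_link(url):
    parts = urlparse(url)
    if parts.scheme != "plm":
        raise ValueError(f"not a PLM deep link: {url}")
    object_type = parts.netloc               # e.g. "part", "eco", "bom"
    object_id = parts.path.strip("/")        # e.g. "PN-12345"
    query = dict(p.split("=", 1) for p in parts.query.split("&") if p)
    return {"type": object_type, "id": object_id, **query}

print(parse_plm_link("plm://part/PN-12345?rev=B"))
# -> {'type': 'part', 'id': 'PN-12345', 'rev': 'B'}
```

The parsing itself is trivial; the hard part – and the one worth standardizing – is getting vendors to share the vocabulary of object types and identifiers.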

What is my conclusion? The idea of setting standards for deep linking is interesting. It can provide some level of solution to stop proprietary silos, data islands and the data-pumping challenges of sending data back and forth between applications. It is not simple and requires some level of synchronization between vendors and customers. We already have enough history on the web and in the app world to learn from and correct the course – to control data and make it transparent at the same time. Just my thoughts…

Best, Oleg

Image courtesy of Stuart Miles at FreeDigitalPhotos.net

