Why are Excel and multi-BOM killing collaboration?

April 22, 2014


Excel and Bill of Materials. What can be better to start a discussion? One of my favorite blogging buddies, Ed Lopategui, author of the eng-eng blog, hit the hot button of the BOM and Excel discussion in his GrabCAD blog post – It’s time to drop Excel BOM. I liked the following passage. It speaks about the huge cost involved in managing changes with Excel:

There’s one fundamental constant in all of engineering: change. Aligning with the capability to change quickly and often is crucial in fighting back ever-increasing schedule pressures. Excel BOMs provide no help here. A separate Excel BOM has to be manually synchronized with each design change. It’s usually in this confusion where some of the bigger and most expensive errors tend to happen. Conflicts are common and notoriously difficult to set straight. Recognize that the information in a BOM is every bit as vital as your CAD design, and should be managed accordingly. For the very same reasons you benefit from managing CAD, so should you do the same with a BOM.

Ed’s post took me back five years to my article Why do I like my PLM Excel spreadsheets? Excel is super flexible and powerful. However, that flexibility comes at a cost. I summarized the costs here – PLM Excel spreadsheets: from odes to woes. Very recently I outlined a possible recipe for how PLM can compete with and take over Excel spreadsheets. It has three important ingredients – (1) a flexible data model, (2) easy customization and (3) flawless user experience.

One of the topics in Ed’s blog took me far beyond the use of Excel to edit a BOM: how to manage bills of materials across the engineering and manufacturing space. Here is the passage:

So far we’ve been talking about BOMs strictly from a design perspective. But the expectation that there can be only one BOM to rule them all is unrealistic. There are different ways to slice BOMs, different disciplines may have a need for their own specific view or information. How manufacturing looks at a BOM in ERP will be fundamentally quite different from how engineering looks at a BOM.

The topic of multiple-BOM management isn’t new. The truth is that every enterprise system wants to manage its portion of the BOM. In the PLM space, BOM management often comes with a strategy of multiple BOMs or BOM views. Most PLM systems can support multiple BOMs. The idea of separating BOMs into different slices or views is the current answer to letting every department in the organization own its portion of the BOM. Most organizations do that because they couldn’t find an alternative way to agree on how to manage the BOM. So, data is split between CAD, PDM, PLM, ERP, MRP, CRM and other domains. Read more about it in my article Why companies are not ready for single BOM? One of the biggest problems with multiple bills of materials is collaboration between people in the organization. Multi-BOM leads to a huge data synchronization problem. The question “where is my BOM?” is usually the one that kills collaboration.
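To make the alternative concrete, here is a minimal sketch (all item numbers and class names are invented for illustration) of the idea behind BOM views: engineering and manufacturing each get their own slice, but both slices reference one shared item master, so a change does not have to be manually re-synchronized between spreadsheets.

```python
# Hypothetical sketch: multiple BOM views (EBOM, MBOM) referencing one
# shared item master, instead of duplicating part data per spreadsheet.
from dataclasses import dataclass, field


@dataclass
class Item:
    number: str          # single source of truth for part identity
    description: str
    revision: str = "A"


@dataclass
class BOMView:
    name: str                                  # e.g. "EBOM", "MBOM"
    lines: list = field(default_factory=list)  # (Item, quantity) pairs

    def add(self, item, qty):
        self.lines.append((item, qty))


# One item master shared by every view
screw = Item("STD-001", "M4 screw")
bracket = Item("ENG-100", "Mounting bracket")

ebom = BOMView("EBOM")   # engineering view
ebom.add(bracket, 1)
ebom.add(screw, 4)

mbom = BOMView("MBOM")   # manufacturing view: same items, different slice
mbom.add(bracket, 1)
mbom.add(screw, 4)

# A change in the item master is visible in every view at once --
# no manual re-sync between spreadsheets.
bracket.revision = "B"
for view in (ebom, mbom):
    for item, qty in view.lines:
        if item.number == "ENG-100":
            print(view.name, item.revision)  # both views print revision B
```

The design choice that matters here is that views hold references, not copies; the moment a view stores its own copy of the item, you are back in the Excel synchronization business.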

What is my conclusion? Managing a BOM in Excel is a nightmare. So, bringing in BOM management tools to replace Excel is a very good idea. However, most companies have a tough time deciding how to manage bills of materials across different systems and environments. In the real world, companies rely on a combination of Excel, PDM/PLM and ERP to manage multiple BOMs. Unfortunately, it kills collaboration, productivity and innovation. Just my thoughts…

Best, Oleg


Why is PLM stuck in providing solutions for SME?

April 21, 2014


PLM is in the focus of many companies these days. Questions of how to improve processes, optimize cost and improve quality are important, and PLM vendors are laser-focused on them. But… with one small clarification. It works for large manufacturing companies. Transforming business processes is how PLM succeeded in delivering ROI and demonstrating clear value. It is hard to find a large manufacturing company these days that is not implementing some kind of PLM. You can see multiple options – a complete homemade PLM system developed by the IT department (usually based on one of the available PLM toolkits), a combination of an older PDM/PLM system with some additional development, and complete solutions from leading PLM companies.

However, when it comes to small manufacturing companies, the situation is very different. It is not rare to face the question "what is PLM and why do we need it?" as well as to see customers confused about the difference between PDM and PLM. The latter is a big misleading factor in PLM marketing. A few weeks ago I posted Why PLM stuck in PDM? The article raised lots of comments and opinions. The question I want to ask today is why PLM software and strategies have failed to deliver value to small manufacturing companies, the so-called SME (small and medium enterprises).

Speak to software vendors about PLM and SME and you will learn about the top three PLM inhibitors – (1) limited financial resources, (2) lack of IT support and (3) a diverse set of requirements. While PLM competition for large OEMs is getting stronger, SME becomes a very attractive opportunity for PLM growth. It is an attractive and turbulent market with lots of innovative companies, together with a growing number of smaller suppliers. Winning this market is a very interesting opportunity with significant growth potential.

SME remains a very challenging place for PLM vendors. The question of how to serve SME with PLM solutions has been open for a long time. Large PLM vendors tried to serve these customers by scaling down their large PLM product suites and developing specially packaged solutions. Newcomers tried to provide special applications for SME. Open source, SaaS, out-of-the-box (OOTB) applications… After all that, the SME PLM market share remains very fragmented, with lots of opportunities and no mainstream solution.

It made me think about some problems in existing PLM strategies for SME. I can see some similarity with the mass customization trend in manufacturing. The time when a car was supposed to be a "Ford" and "black" is over. Automotive and other manufacturers explored new opportunities to customize their products to satisfy a turbulent market with a diverse set of requirements. So, focusing on niche markets and the individual customer is important. In the past, it was the strategy Japanese firms used to capture market share in the U.S. PLM vendors are trying to win the SME PLM market by focusing on the flexibility of their solutions and on OOTB applications. The problematic part of this strategy is cost. This is where flexible PLM failed. The cost of a PLM implementation is still very high. Marketing, sales, business development and implementation services do not allow PLM vendors to scale their operations for SME.

What is my conclusion? Low cost and efficiency. When it comes to customization and fulfilling diverse customer requirements, low cost and efficiency are "must have" components of your strategy. Flexible platforms and OOTB apps are not enough. Cloud solved some problems related to cost and IT support, but left the cost of implementation services open. PLM vendors need to think about how to deliver PLM services at low cost, or think about alternative strategies. So far, PLM vendors have failed to deliver to SME. The cost of delivery is too high. After more than a decade of "focus on flexibility", I think it is time for PLM vendors to find an alternative. Just my thoughts…

Best, Oleg


Why do I want a vacation resort to adopt PLM and other tech?

April 20, 2014


I took a few days of rest with the family on Paradise Island. The huge hotel and attraction facilities of Atlantis are impressive in size, location and weather (compared to snowy Boston). So, if you are all for pools, dolphins, coral reefs and multi-colored fish, this is the place to be. However, when it comes to customer-facing technologies, you can quickly find yourself back in 1999. The only thing that will remind you of the 2010s is the allowance to use 4 devices on one internet connection purchase to connect to the outside world. Inside the hotel, all you have is the room phone line and the guest services button. After a few days of such an experience, it made me think about some PLM and other technologies the hotel could adopt sooner rather than later.

3D Mockup and Virtual Navigation

The hotel guide provided a few-page paper map with low-resolution pictures and an index of POIs. You have little chance of understanding the location of an attraction. Don’t even try to figure out walking distance and time. Using this map for navigation and planning your activities upfront is impossible. Maybe you don’t need to do it. But when you want to take kids from point A to point B, it can be useful. While most hotel visitors keep mobile devices in their hands, providing digital tours (before the visit) as well as a 3D walking navigation application on site could make your life much easier.

Product Sales Configuration

Well, as you can imagine, a hotel is not just about rooms. It is also a large number of additional facilities, attractions and other points of entertainment you want to use. You need to spend a significant amount of time trying to understand what is available and when, and to decide if you want to use it. Very limited information is available upfront before your visit. It sounds like an opportunity for a customizable product configurator. This tool could be used during the booking process and would allow you to choose all entertainment and other options. Some of them can be purchased and some just added to a portfolio for a later purchase decision. Nothing here is rocket science for PLM and other tech today.
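The configurator idea above can be sketched in a few lines. This is purely illustrative – the option names and prices are invented – but it shows the split between options purchased at booking time and options kept in a "decide later" portfolio:

```python
# Illustrative sketch (option names and prices are invented): a tiny
# booking-time configurator separating purchased options from a
# "decide later" portfolio.
CATALOG = {
    "dolphin_encounter": 120.0,
    "aquaventure_pass": 60.0,
    "spa_day": 150.0,
}


class BookingConfig:
    def __init__(self):
        self.purchased = {}      # option -> price paid at booking time
        self.watchlist = set()   # options kept for a later decision

    def buy(self, option):
        self.purchased[option] = CATALOG[option]

    def consider(self, option):
        if option in CATALOG:
            self.watchlist.add(option)

    def total(self):
        return sum(self.purchased.values())


cfg = BookingConfig()
cfg.buy("dolphin_encounter")   # commit now
cfg.consider("spa_day")        # keep for later
print(cfg.total())             # 120.0
```

A real configurator would add availability calendars and constraint rules between options, but the core data model is this small.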

Process and activity planner

When you are in the hotel, you want to plan specific activities. Sometimes your plans and interests change. How can you manage all these things in an easy and painless way? The only tool at your disposal today is a phone with a button called "guest service", or the front desk with a waiting queue. Too bad. It sounds like a good application for one of the available project and process planning tools with calendar and scheduling options.

What is my conclusion? People are interested in services, not in technologies. The easiest way to sell technologies is to show services people can use and buy. Nice weather, beach, pools and exotic animals are nice attractions. However, services are getting more and more important these days. Available technological services will play a growing role in our decision making about future vacation options. PLM and other tech business development people looking for opportunities beyond aerospace and defense should take note. Also, a red alert to tech managers of hotels and travel industry companies. Just my thoughts…

Best, Oleg


How can CAD collaboration avoid competition with OneDrive?

April 15, 2014


Collaboration is the name of the game today for many vendors, CAD and PLM vendors included. Cloud technology is opening many new capabilities to collaborate, and it is being captured by existing players and newcomers. Sometimes it happens so fast that it even creates internal competition. The Image and Data Manager article Is OneDrive for Business the SharePoint alternative? speaks about an interesting transformation happening these days around file collaboration within the Microsoft family of products. We knew about SharePoint’s capabilities to collaborate and share content (files). However, the newborn child – OneDrive – is growing fast and can potentially capture some of the space occupied by SharePoint today. I liked the following passage explaining how OneDrive takes on SharePoint:

OneDrive has a very simple interface (one that has been simplified further with recent updates). So it’s easy to upload your files and share them. You can also sync to all your devices, desktop, tablet, smartphone, giving you direct access to your content when you are online or offline. You even have mobile apps for iOS, Android, Windows 8 and Windows RT.

OneDrive even has this cool feature that allows you to grab a file from your PC even if you haven’t uploaded it to OneDrive. You have to turn that feature on, but it’s pretty nice to have.

SharePoint’s interface is OK, but it’s the subject of much debate. It’s not very intuitive to use and requires a fair amount of planning and organizing to get it set up in a way that’s easy for people to understand. Getting access to SharePoint on mobile devices has been spotty at best. Access via mobile (tablet or smartphone) has improved a lot with SharePoint 2013, but for those on SharePoint 2010, the story is not so good.

What I learned from this article is that the file sharing and collaboration space is getting busy and competitive. Which brings me back to the discussion about specialized CAD collaboration tools. It made me think about some strategies CAD collaboration tools can use in order to avoid frontal competition with OneDrive, Dropbox and other file sharing and sync tools.

The name of this game is "layers". Creating a layered architecture will allow CAD collaboration tools to store data using OneDrive (or another storage and sharing service) and, at the same time, enhance it with a data layer providing rich access to CAD-specific content, viewers and other CAD data relationships. Think about it the same way Google organizes information from the web for you. You don’t necessarily store data on websites and other locations. Nevertheless, Google gives you easy access to this information via different services. The basic service is search. Enhanced services can provide specific vertical slices of information (think about Google Flights as an example).
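Here is a rough sketch of the layering. The storage class below is a stand-in, not the real OneDrive or Dropbox SDK: the point is only that the horizontal layer sees opaque files, while the vertical CAD layer owns the assembly-to-component relationships.

```python
# Sketch of the "layers" idea: files live in a generic share/sync service
# (stand-in for OneDrive/Dropbox), while a separate CAD layer holds the
# CAD-specific relationships on top of it.
class GenericFileStore:
    """Horizontal layer: knows only about opaque files."""
    def __init__(self):
        self._files = {}

    def upload(self, path, payload):
        self._files[path] = payload
        return path  # the storage layer's only handle is the path/URL

    def count(self):
        return len(self._files)


class CADLayer:
    """Vertical layer: CAD structure on top of generic storage."""
    def __init__(self, store):
        self.store = store
        self.assemblies = {}  # assembly ref -> list of component refs

    def publish(self, asm_path, asm_data, components):
        ref = self.store.upload(asm_path, asm_data)
        self.assemblies[ref] = [
            self.store.upload(path, data) for path, data in components
        ]
        return ref


store = GenericFileStore()
cad = CADLayer(store)
asm = cad.publish("gearbox.asm", b"...",
                  [("shaft.prt", b"..."), ("housing.prt", b"...")])

# The storage layer sees three opaque files; only the CAD layer knows
# that two of them are components of the assembly.
print(store.count())         # 3
print(cad.assemblies[asm])   # ['shaft.prt', 'housing.prt']
```

Swapping `GenericFileStore` for a real cloud storage API would not touch the CAD layer, which is exactly the decoupling that avoids frontal competition with the file-sync vendors.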

What is my conclusion? Separating vertical applications from horizontal services is getting more and more important. It was true in the past for building the right enterprise architecture, but it is getting even more important in the era of cloud services. To be successful, cloud vendors will have to learn how to recombine and reuse technologies provided by different players. File sharing and synchronization is a very good example to start with. For CAD vendors it means learning how to share data on OneDrive or Dropbox while, at the same time, providing a vertical experience specific to CAD content. Just my thoughts…

Best, Oleg


PLM Technology vs Vertical Industries: Wrong balance?

April 14, 2014


Let’s talk about PLM technologies. Err… PLM is not a technology. Even more, PLM is not even a product. So, what is it? A business strategy? Product development politics? For the sake of this conversation, let’s leave those debates out. I want to speak about PLM technologies that allow you to manage product data, CAD files, bills of materials, a rich set of related information, as well as the processes around it. This technology came to us about 20-25 years ago, first as a very hard-coded set of tools. You literally had to build it differently for every customer. So, it supported only large customers that were able to pay for software, infrastructure and implementation. Later on, PDM/PLM turned into software toolkits. The next step in PDM/PLM technology evolution was called flexible data modeling. The first flexible (dynamic) PLM data modeling tools were released back in 1995-2000 and… not much has changed since then.
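For readers who haven’t seen it from the inside, "flexible (dynamic) data modeling" roughly means that item types and their attributes are defined as data at runtime rather than hard-coded per customer. A deliberately simplified sketch (class and attribute names are invented):

```python
# Simplified sketch of "flexible data modeling": item types and their
# attributes are configuration data, not code compiled per customer.
class TypeDef:
    def __init__(self, name, attributes):
        self.name = name
        self.attributes = set(attributes)


class FlexibleItem:
    def __init__(self, type_def, **values):
        unknown = set(values) - type_def.attributes
        if unknown:
            raise ValueError(f"unknown attributes: {unknown}")
        self.type_def = type_def
        self.values = values


# A customer admin defines a type without writing new code:
part_type = TypeDef("Part", {"number", "material", "mass_kg"})
part = FlexibleItem(part_type, number="P-100", material="steel")

# ...and later extends the schema at runtime:
part_type.attributes.add("supplier")
part2 = FlexibleItem(part_type, number="P-101", supplier="ACME")
print(part2.values["supplier"])  # ACME
```

Real PLM systems persist these type definitions in the database and generate UI from them, but the mechanism – schema as data – is the same, and it is essentially what shipped in the late 1990s.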

So, what has happened since then? PLM vendors went on to develop out-of-the-box and vertical industry solutions in a massive way. David Linthicum’s article Salesforce.com officially is out of ideas reminded me of the old joke comparing the technology play vs. the industry play. Here is the passage:

When you run out of new ways to provide innovative technology, you go vertical. That was the running joke among CTOs back in the day. It usually meant the market had reached the saturation point and you could not find new growth

I found this message very relevant to what is happening in the PLM industry. PLM vendors are trying to compete by providing more comprehensive sets of data models, best practices and process templates. By doing so, vendors want to reduce the TCO of PLM implementations. It actually brings success, and many customers are using these solutions as a starting point for their PLM implementations.

So, where is the problem? In most situations, PLM is still a costly and expensive implementation. Services may take up to 50% of the cost. Here is the issue – core PLM data and process modeling technology hasn’t changed much in the last 10-15 years. Data models, CAD file management, product structure, process orchestration – all these things are evolving, but very little. The fundamental capabilities are the same. And it is very expensive to develop solutions using these technologies.

You may ask me about cloud technologies. Cloud is the answer, but only partially. It solves problems related to infrastructure, deployment and updates. Cloud provides clear benefits here. However, from the implementation technology standpoint, it is very similar to what non-cloud solutions can offer. Another interesting passage from the InfoWorld cloud computing article explains the problem new SaaS/cloud products can experience when trying to displace existing vendors:

So many companies have tried this approach — many times — but most found limited success. I can’t help but think the same will occur here. Salesforce will soon discover that when you get into vertical industries, the existing foundation of industry-specific applications is difficult to displace. Although Salesforce can always play the SaaS card, most of those industry-specific providers have already moved to SaaS or are in the midst of such a move. That means SaaS won’t be the key differentiator it was when Salesforce first provided its powerful sales automation service more than a decade ago.

What is my conclusion? Efficiency and cost. These are the two most important things for making a PLM implementation successful. So, the technology must be improved. Data and model capturing tools, flexibility and ease of use – everything must be more efficient to support future manufacturing processes. How to do so? This is a good topic to discuss with technology leaders and strategists. I’m going to attend COFES 2014 in 10 days. I hope to find some answers there and share them with you.

Best, Oleg


How to forget ODBC and Rethink PLM Data Openness?

April 12, 2014


This is one of the most popular topics in the engineering (but not only) software ecosystem: open vs. closed. I’ve discussed it many times – Open vs. Closed PLM Debates, PLM and New Openness, Closed Thoughts About PLM openness and a few more. There is a clear trend towards openness these days and, in my view, it is hard to find a PDM/PLM company that will defend a closed approach over openness.

However, the definition of openness can be quite different. What’s more, the implementation of openness can be different too. Speaking from an engineering standpoint, the devil is in the details. So, I wanted to speak about some aspects of "openness" and how it might be implemented in the PDM/PLM world. For a very long period of time, data in the PDM/PLM world was completely dependent on Relational Database Management Systems (RDBMS). The time of proprietary databases and data files is finally over. So, you might think data is peacefully located in an RDBMS where it can be easily accessed and exchanged. Not so fast… There are two main constraints preventing data openness in an RDBMS: the data access technology and the data schema. You need to support both in order to have access to the data. An alternative would be to use published APIs, which provide you an access layer. In most cases, APIs eliminate the need to know the data model, but in a nutshell they are not very different from the data access technology.

For many years, ODBC has remained one of the most widely adopted database access technologies. I’m using the name ODBC, but it can also stand for a variety of similar data access technologies – JDBC, OLE DB, ADO.NET, JDO, etc. This is where things went wrong with data access and openness. The power and success of ODBC came from the use of DSNs (Data Source Names) as the identification of data access. All ODBC-compliant applications leveraged the fact that other developers had implemented RDBMS-specific libraries – ODBC drivers. So, users don’t need to think about Oracle, SQL Server, MySQL, etc. A user just needs to connect to a DSN.

The distinct development and end-user models of ODBC ensured a massive ecosystem of ODBC-compliant applications and database connectivity drivers. Unfortunately, RDBMS vendors — the same ones that collectively created the SQL CLI and inspired its evolution into ODBC — also sought to undermine its inherent RDBMS agnosticism. The problem this created lies in the huge number of data-driven applications relying on ODBC data access and claiming data openness as the ability to access, retrieve and (sometimes) update data in the RDBMS. Hidden behind DSNs, databases turned into data silos. Data extracted from a specific database was dead and lost without the context of that database. So-called "openness" became a simple "data sync pipe". What’s more, each DSN remains separate. So, if you have a few databases, you are out of luck accessing the data in a logical way. Applications are pumping data from one database to another, mostly trying to synchronize data between different databases. The amount of duplicated and triplicated data is skyrocketing.

So, what is the alternative? We need to stop "syncing data" and instead start "linking data". Think about a simple web analogy. If you want to reference my blog article, you don’t need to copy it to your blog. In most cases you can create a link to my blog post and its URL. Now, let’s bring some more specific technologies into this powerful analogy. Maybe you are familiar with the semantic web and linked data. If not, this is the time! Start here and here.

There are fundamental differences between the old ODBC world and the new way of linking data. You can get some fundamentals here and by exploring the W3C data activity. I can summarize three main principles of linking data – 1/ use of hyperlinks to the source of data; 2/ separation of data abstraction from data access APIs; 3/ conceptual data modeling instead of application-level data modeling. So, instead of implementing ODBC drivers and APIs to access data, each data provider (think about a PLM system, for the moment) will implement a linked data web abstraction layer. This abstraction layer will allow other applications to discover data and run queries to get results, or to interlink their data with data located in other systems. Linked Data is a fast-developing ecosystem. You can learn more here.
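The three principles above can be illustrated with a tiny pure-Python triple store. Real implementations would use RDF and SPARQL; here the URIs are invented examples, and the point is only the pattern: the ERP system stores a link to the PLM system’s URI instead of copying the part record, so a query can always follow the link to the current data.

```python
# Minimal illustration of linked data: records are identified by URIs,
# systems link to each other's URIs instead of copying rows, and queries
# follow the links. (URIs below are invented; real systems use RDF/SPARQL.)
triples = set()


def add(subject, predicate, obj):
    triples.add((subject, predicate, obj))


# The PLM system publishes a part under its own URI...
add("http://plm.example.com/part/P-100", "hasName", "Mounting bracket")
add("http://plm.example.com/part/P-100", "hasRevision", "B")

# ...and the ERP system links to that URI instead of duplicating the part.
add("http://erp.example.com/order/SO-7", "orderedPart",
    "http://plm.example.com/part/P-100")


def query(subject=None, predicate=None, obj=None):
    """Triple pattern match; None acts as a wildcard."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]


# Follow the link from the ERP order to the part's current revision:
(_, _, part_uri), = query(predicate="orderedPart")
(_, _, rev), = query(subject=part_uri, predicate="hasRevision")
print(rev)  # B -- always current, because nothing was copied
```

When the PLM system revises the part, the ERP side needs no synchronization job at all; the next query simply sees the new value at the linked URI.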

What is my conclusion? We are coming to the point where we need to rethink the way we access data in business systems and start building a better abstraction level that will allow us to stitch data together via linking as opposed to synchronization. The world wide web and the internet are the ultimate success stories of open standard adoption and implementation. Applying the same approach will simplify access to data and build the value of data connections for the enterprise. Just my thoughts…

Best, Oleg


Will PLM Vendors Jump into Microsoft Cloud Window in Europe?

April 11, 2014


Cloud is raising lots of controversy in Europe. While manufacturing companies in the U.S. are generally more open towards new tech, their European rivals are much more conservative. Many of my industry colleagues in Germany, France, Switzerland and other EU countries can probably confirm that. Europe is coming to cloud systems, but much more slowly. I’ve been posting about cloud implications and constraints in Europe. Catch up on my thoughts here – Will Europe adopt cloud PLM? and here – PLM cloud and European data protection reforms. The main cloud concerns raised by European customers are data, privacy and country-specific regulation. With companies located in different places across the EU, it can be a challenge.

Earlier today, I heard some good news about cloud proliferation in Europe coming from Microsoft. The TechCrunch article – Microsoft’s Enterprise Cloud Services Get A Privacy Thumbs Up From Europe’s Data Protection Authorities – reports that Microsoft’s enterprise cloud services meet the data privacy standards of the European data protection authorities. Here is a passage that sheds some light on the details and what it means:

But today comes a piece of good news for Redmond: the data protection authorities (DPAs) of all 28 European member states have decided that Microsoft’s enterprise cloud services meet its standards for privacy. This makes Microsoft Azure, Office 365, Microsoft Dynamics CRM and Windows Intune the first services to get such approval. The privacy decision was made by the “Article 29 Data Protection Working Party,” which notes that this will mean that Microsoft will not have to seek approval of individual DPAs on enterprise cloud contracts. In its letter to Microsoft (embedded below), chair Isabelle Falque-Pierrotin writes, “The MS Agreement, as it will be modified by Microsoft, will be in line with Standard Contractual Clause 2010/87/EU… In practice, this will reduce the number of national authorizations required to allow the international transfer of data (depending on the national legislation).”

The majority of PDM/PLM providers are friendly with the Microsoft tech stack. Some of them rely completely on MS SQL Server and other Microsoft technologies. Most of them support SharePoint. Now, these PLM vendors have an additional incentive to stay with Microsoft technologies for the cloud. It can also be good news for manufacturing companies that have already deployed PDM/PLM solutions on top of Microsoft technologies and developed custom solutions.

What is my conclusion? The technological landscape these days is very dynamic. The time when one platform worked for everybody is over. In light of technological disruption and future challenges, tech giants will use different strategies in order to stay relevant for customers. Will European cloud regulation keep PDM/PLM players on MS Azure and other Microsoft technologies, as opposed to alternative cloud technology stacks? How long will it take other players to reach the same level of compliance? These are good questions to ask vendors and service providers. Just my thoughts…

Best, Oleg

