PLM, Excel Spreadsheets, Pain Killers and Vitamins

July 29, 2014

bom-plm-excel-painkiller-1

We like to compare stuff. Gadgets, cars, hotels, software. We can compare iPhone to Samsung, Canon to Nikon, Honda to Toyota. Software is a special category. When it comes to enterprise software, it gets even more complicated. However, marketing comparison is a fascinating type of writing. The Arena PLM blog posted a marketing piece – Using Excel for Bill of Materials (BOM) Management. The article compares BOM management with Excel spreadsheets to BOM management with PLM tools (Arena tools implied, which is okay). Read the article and draw your own conclusion.

I have a special passion for spreadsheets. In my view (and I know many PLM analysts and bloggers will agree here), Excel stands out as one of the most popular PLM software tools in the industry. I have my reasons to like PLM spreadsheets, as well as a list of my "hate statements" about Excel.

Arena’s article reminded me of the famous marketing story about vitamins and painkillers. The first is "nice to have" and the second is "must buy now". I think the value of PLM tools is obvious. But… here is my little "but". If I compare the lists of values, costs and features in that article, I cannot come to an absolute conclusion about the advantages of PLM tools. It leaves me with mixed feelings. First, there is no line that says "no" to any feature you can do with Excel. So, basically, I can do everything with Excel, just not in an optimal way (meaning I won't die :) tomorrow if I keep using Excel). Second, cost is emotionally on the side of Excel. It is very hard to compete with "free" that everybody can use. And, to switch to PLM tools, you need to change the way you work. Even though this is not in the list, it is implied when you compare "time to implement" between "immediate" and "days-weeks". So, when an organization is already managing BOMs in Excel, PLM is not really competing with Excel. It is another type of competition, the one salespeople often call "competing with the status quo".

What is my conclusion? A few weeks ago, I shared my recipe for how PLM can take over Excel spreadsheets. Here is the list of three recommendations – flexible data models, easy customization and excellent user experience. I'd like to add painkillers to the list. This is something PLM is still missing in its competition with Excel. The comparison should use "no/yes" notation. Today's "poor/excellent" still has a flavor of vitamins. PLM implementations are still hurting people and losing the comparison to initially glamorous Excel spreadsheets. Engineers are spending too much time managing Excel files, but the cost is hidden and not obvious enough for managers to step into longer implementations, higher cost and a slow learning curve. Just my thoughts…

Best, Oleg


Part Numbers are hard. How to think about data first?

July 28, 2014

part-numbers-madness

One of the topics that usually raises a lot of debate is Part Numbers. One of my first takes on the complexity of Part Numbers was here – PDM, Part Numbers and the Future of Identification. Ed Lopategui reminded me about the topic in his GrabCAD post – Intelligent Numbering: What’s the Great Part Number Debate? – a few days ago. He speaks about four aspects of handling Part Numbers – creation, readability, uniqueness and interpretation. The conclusion is as complex as the topic itself. Here is the passage that outlines the conclusion Ed made.

Balancing all these diverse factors is difficult, because no solution is optimal for every company. Here are some final tips to help you make prudent decisions: 1/ Understand your PDM/PLM system part number generation capabilities; 2/ Understand the limitations of any other systems that interact with your parts; 3/ Go through every activity that requires interpreting part numbers and understand what system access is available, and how the interfaces work. This will provide a good basis for your interpretation cost; 4/ Understand how easy/difficult it is for a new employee to interpret a part number.

These tips made me think again about Part Numbering, data and the different data and process management tools involved in handling Part Numbers. Most approaches focus on systems and functionality to handle part identification and classification. What we do is try to align our need to identify and classify parts with what multiple systems can do. The hardest part is to find Part Numbers that will make all the systems involved in the process (CAD, PDM, PLM, ERP, SCM, etc.) work smoothly. Honestly, it is too complex and too costly.

So, how to manage that complexity? Is there a reasonable way to resolve the complexity of Part Numbering and make everybody happy? Thinking about it, I came to the conclusion that companies should start thinking about data first. From the longevity standpoint, data must have a much higher priority than any data management system. In some industries, companies are obliged to keep data for decades. With that in mind, I want to outline some principles that will help you do so and allow you to create some standardization around part and data identification.

1- Disconnect Part Numbers and classification from specific applications. PN should not depend on the requirements and capabilities of data and process management systems. Data has a much longer lifespan than applications and systems. By defining PN independently, you will keep the data and processes in your company clean and well organized.

2- Generate PN based on classification, business needs and processes. Develop an independent service to make it happen. This service should most probably be independent of existing data management systems and expose part numbers in some sort of URI-based notation.

3- Use the independent service to convert the independent PN into system-specific identification. You can convert for any system you have at your disposal – PDM, PLM, ERP, SCM… What is important is to be able to control the conversion process and adapt it each time a data and/or process management system changes. A small sketch of such a service follows below.
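
To make this more concrete, here is a minimal sketch of such a part numbering service. The code is hypothetical – the class, namespace and identifiers are made up for illustration and are not taken from any PLM product. The idea is simply to issue neutral, URI-based part identifiers and keep the mapping to system-specific identifiers in one place:

```python
import itertools

class PartNumberService:
    """Hypothetical, system-independent part numbering service."""

    def __init__(self, namespace="urn:acme:part"):
        self.namespace = namespace
        self._seq = itertools.count(1)
        self._mappings = {}  # neutral URI -> {system name: local id}

    def create_part(self, classification):
        """Issue a neutral identifier, e.g. urn:acme:part:FASTENER:000001."""
        uri = f"{self.namespace}:{classification}:{next(self._seq):06d}"
        self._mappings[uri] = {}
        return uri

    def register_system_id(self, uri, system, local_id):
        """Record how a given enterprise system identifies this part."""
        self._mappings[uri][system] = local_id

    def resolve(self, uri, system):
        """Convert the neutral identifier into a system-specific one."""
        return self._mappings[uri].get(system)

# Example usage: the ERP and PDM keep their own numbering schemes,
# but the neutral URI stays stable even if those systems are replaced.
svc = PartNumberService()
pn = svc.create_part("FASTENER")
svc.register_system_id(pn, "ERP", "100-4711")
svc.register_system_id(pn, "PDM", "BOLT_M6x20")
print(pn, "->", svc.resolve(pn, "ERP"))
```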

What is my conclusion? Product data is one of the most expensive assets in a manufacturing company. It represents your company's IP and it is the real foundation of every manufacturing business. Think about data first. It will help you develop a strategy that organizes data for a longer lifecycle and minimizes the cost of bringing in new systems and managing changes in existing ones. I think some services should be developed to make the part numbering process easier for manufacturing companies. As manufacturing goes global, maintaining part numbering systems becomes a huge problem. Just my thoughts…

Best, Oleg


PLM vertical PaaS strategies

July 25, 2014

PaaS-plm-large-manufacturing

SaaS, PaaS, IaaS, DMaaS, … I'm sure marketing folks are having lots of fun with new xaaS acronyms. The number of publications about various service strategies is skyrocketing. The EDACafe article – The Platform-as-a-Service Provides European Aerospace & Defence OEMs and Partners with Greater Collaboration Capabilities – brings the story of "AirDesign", the European aerospace and defense industry's collaboration platform by BoostAeroSpace and Dassault Systèmes. The article tastes a bit of marketing. Nevertheless, I found that the following passage captures the rationale behind what AirDesign is supposed to provide:

AirDesign drastically reduces operational costs for all partners through a single infrastructure, common exchange methods, open standards and easy access, all without adversely impacting existing information systems. All the primary European OEMs jointly requested and defined this platform in order to facilitate exchanges, support their suppliers’ ecosystems and generate new opportunities with services.

The MMT article provides more analysis of what Dassault and BoostAeroSpace are doing:

To manage the exchange of PLM data between all partners whatever their systems may be, AirDesign delivers three key capabilities through its digital exchange hub: 1/ The first allows an OEM to organize exchanges in the context of programs and projects, including the delegation of administrative roles; 2/ The second is an integrated, secured and automatic technical data package exchange. The traceability needed for complex programs is a native capability, ensuring proper management of large technical files between OEMs and suppliers; 3/ The third capability consists of access to a wide variety of services, including converters based on standards or approval services that a partner can use during an exchange.

The set of functionality above isn't new. Large PLM implementations did it in the past. The problem of standardization and platform cost in supply chain ecosystems is critical. So, it is not unusual for partners to share a single enterprise software infrastructure for design, supply chain and more.

However, the notion of PaaS (Platform as a Service) is interesting. I've been blogging about it before – Will Cloud PLM develop PaaS option? and Cloud PLM and PaaS dilemma. It looks like PLM vendors are moving towards more vertical platform architectures. Especially for large companies and business ecosystems, PaaS can provide an interesting solution – standardization and cost benefits. The information about the private cloud deployment of AirDesign confirms earlier news about Dassault developing all cloud PLM options.

What is my conclusion? I think we are going to see lots of PLM PaaS variations in the near future. Large manufacturing companies are looking at how to optimize cost and standardize infrastructure. This is an opportunity for PLM vendors to re-establish and sometimes re-develop their legacy systems in a new way. I'm sure lots of ENOVIA applications will be used in the announced AirDesign PaaS. Overall, it looks like PaaS is another way to sell cloud PLM systems to large manufacturing ecosystems. Just my thoughts…

Best, Oleg

picture credit to MMT article


Who will make PLM sexier?

July 24, 2014

sexier-plm

The cool factor is trending in software these days. The time when software was ugly is probably in the past. Everyone wants to have a "cool app" – in the picture above you can clearly see the trend. Does it apply to enterprise software and PLM? It is a good question. Back in 2012, I asked it in my post – PLM: Ugly vs. Cool. While nobody is specifically focused on how to develop cool PLM software, I can see increased interest in improved user experience from PLM vendors.

cool-sexy-app-trend

UX magazine article Is there Room for Sexy in Enterprise Design? caught my attention few days ago. I found the discussion about emotional factor interesting and important. I especially liked the following passage:

The question enterprise technology companies need to ask themselves is “what does sexy mean to your enterprise customer?” Put another way, how do your customers want to feel when using your products? Every product, whether we realize it or not, produces an emotional reaction. As Donald Norman articulated in his seminal book Emotional Design, customers find aesthetically pleasing products more effective. Customers even “love” these products. Norman identified the commercial value in evoking some passion towards products, such as Gucci bags and Rolex watches. MailChimp’s Director of User Experience, Aarron Walter, took this one step further with his book, Designing for Emotion. He posits that the goal of emotional design is to connect with users and evoke positive emotions, which will make your users want to continue interacting with your product.

The article speaks about EchoUser's research on emotions with enterprise customers. The following emotions make sense to the enterprise crowd – powerful, trust, flexible, calm, pride, accomplished. Cool and sexy are not on the list. So, is there a place for "cool and sexy" in PLM? For a long time PLM was associated with "complex" and "expensive". At the same time, most PLM commercial videos are cool and sexy. Sports cars, luxury airplanes, fashion shows, mobile devices. You rarely see a PLM video without that type of product example.

I think many PLM professionals these days are still trying to keep the association of PLM with complexity. My hunch is they are trying to justify expenses. Customers might think a complex solution requires a bigger budget and a longer consultancy and services project. However, the other side of complexity is the feeling of missing reliability and trust. This is not a simple decision for PLM consultants and software vendors.

What is my conclusion? People don't like cumbersome software these days. There is no place for complex user experience, even in enterprise software. What emotions should drive CAD and PLM software? How should engineers feel about the software? I'd like to connect the results of engineering and manufacturing processes with PLM tools. You cannot make good products with the wrong tools. So, something has to change in PLM software. Complex PLM software is the wrong tool to build future cool products. Just my thoughts…

Best, Oleg

photo credit MidoriShoes


How to re-think activity streams for enterprise?

July 23, 2014

controlled-collaboration

These days manufacturing businesses are more connected than ever before. Every manufacturing company (even the smallest startup) has a tremendous need for collaboration – helping multiple engineers get involved in the design process, communicating with suppliers, planning manufacturing processes, etc. Social networks and the open web inspired many companies to develop collaboration software that mimics consumer social software. One of the main attributes of every social platform (Facebook, G+, Twitter and others) is the so-called "activity stream" or "news feed". The trend was strong and produced lots of copycats. The successful and lucky ones got acquired. Many of the less successful died.

The idea of an activity stream is very powerful. It allows you to share and consume information easily. However, here is the thing – it is not protected from the "noise vs. signal" problem. The more people you follow, the more information flows into your activity stream(s). You end up with a messy stream of information you cannot keep up with. That is probably okay for public news, or even for executives in a company who want to keep up with what is going on. However, it is probably not a good experience for engineers who need to work together on the same design or discuss the next engineering or manufacturing change request. It is also probably not very useful as a tool to communicate between departments and suppliers. And… it is absolutely the wrong model to use for process management.

All the problems I mentioned above actually make the adoption of social systems for collaboration questionable. I can see many confirmations of that. The CMSWire article The Problem With Yammer? People Don’t Use It speaks exactly about this problem. Here is the key passage:

But what if the problem is not about difficulty or learning curves but about culture? What if the problem with Yammer has nothing to do with the product itself and nothing to do with usability, but rather with the fact that enterprise workers are holding onto email for dear life and are not prepared to give it up? Microsoft itself appears to be aware of this. The addition of complimentary Yammer for the new Office 365 plans appears to speak to that. However, if Microsoft’s updated offerings are a step in the right direction, they won’t solve the problem of social and collaboration in the enterprise.

Another interesting example is Facebook. The king of social networks recently introduced a simple and very effective feature to get out of the noise of your information stream – Save. It can quickly remind you of the old and well-known list of favorites. Navigate to the TNW article – Facebook introduces Save, a new bookmarking feature to help tame your News Feed. It sounds like a simple feature, but it allows you to keep specific posts out of the noisy channel and focus on them later in a more controlled way.

These and many other examples made me think about what is needed to provide a better way to collaborate. My hunch is that a "controlled list of topics" can better serve the need of engineers and other people to work together. How to make it happen? That is probably a trickier question. I can see it as the next logical step from email, which is still one of the most favored tools to communicate. It also reminded me of my post Why PLM shouldn’t miss next email move from earlier this week. A small sketch of the idea is below.
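
To illustrate the difference, here is a hypothetical sketch of a "controlled topic" – a fixed subject with an explicit list of participants – as opposed to an open stream where every follower receives every event. The class and names below are made up for illustration only:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Message:
    author: str
    text: str

@dataclass
class Topic:
    """A controlled discussion: fixed subject, explicit participants."""
    subject: str
    participants: List[str]
    messages: List[Message] = field(default_factory=list)

    def post(self, author: str, text: str) -> None:
        if author not in self.participants:
            raise PermissionError(f"{author} is not part of this topic")
        self.messages.append(Message(author, text))

    def inbox(self, user: str) -> List[Message]:
        # A user only sees topics they were explicitly invited to, so the
        # "noise vs. signal" problem of an open stream does not appear.
        return self.messages if user in self.participants else []

# Example: an ECO discussion between two engineers and a supplier contact
eco = Topic("ECO-1042: bracket redesign", ["alice", "bob", "supplier-acme"])
eco.post("alice", "Draft geometry uploaded, please review wall thickness.")
print([m.text for m in eco.inbox("bob")])
```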

What is my conclusion? An activity stream is a good way to present a flow of information. However, the type of experience it creates is way too open and too easily affected by information noise. I believe engineering tools should provide a tighter way to communicate, exchange information and share data for collaboration purposes. This is the main reason people are holding onto email as the best tool. New ways to collaborate are not here… yet. Just my thoughts…

Best, Oleg


PLM implementations: nuts and bolts of data silos

July 22, 2014

data-silos-architecture

Data is an essential part of every PLM implementation. It all starts with data – design, engineering, manufacturing, supply chain, support, etc. Enterprise systems are fragmented and represent individual silos of the enterprise organization. Managing product data located in multiple enterprise data silos is a challenge for every PLM implementation.

To "demolish enterprise data silos" is a popular topic in PLM strategies and deployments. The idea of having one single point of truth is always on the minds of PLM developers. Some of my latest notes about that are here – PLM One Big Silo.

The MCADCafe article – Developing Better Products is a “Piece of Cake” by Scott Reedy – also speaks about how a PLM implementation can help aggregate all the product development information scattered in multiple places into a single PLM system. The picture from the article presents the problem:

product-data-silos

The following passage is the most important, in my view:

Without a PLM system, companies often end up with disconnected silos of information. These silos inhibit the ability to control the entire product record and employees waste unnecessary time searching for the correct revision of the product design. As companies outsource design or manufacturing, it becomes even harder to ensure the right configuration of the product is leveraged by external partners.

Whether your company makes medical devices, industrial equipment, laptops, cell phones or other consumer products – PLM provides a secure, centralized database to manage the entire product record into a “Single Record of the Truth”… With a centralized product record, it is easy to propose and submit changes to the product design, track quality issues and collaborate with your internal teams and supply-chain partners.

The strategy of a "single record of truth" is a centerpiece of every PLM implementation. However, here is the thing… if you look at the picture above, you can certainly see some key enterprise systems – ERP, CRM, MES, project and program management, etc. The PLM system can contain data about product design, CAD files, part data, ECO records and bills of materials. However, some of the data will still remain in other systems. Some of the data gets duplicated. This is what happens in the real world.

It made me think about 3 important data architecture aspects of every PLM implementation: data management, data reporting and data consistency.

The data management layer focuses on which system controls the data and provides the master source of information. Data cannot be mastered in multiple places. The implementation needs to organize a logical split of information as well as the ability to control the "data truth". This is the most fundamental part of the data architecture.
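
As an illustration only (the data types and system names below are assumptions, every company will have a different split), the mastering rules can be captured as an explicit map of which system owns which type of product data:

```python
# Hypothetical mastering rules: each data type has exactly one master system.
MASTER_SYSTEM = {
    "cad_file":          "PDM",
    "engineering_bom":   "PLM",
    "eco":               "PLM",
    "manufacturing_bom": "ERP",
    "item_cost":         "ERP",
    "customer_record":   "CRM",
}

def master_of(data_type: str) -> str:
    """Return the single system allowed to create or change this data type."""
    return MASTER_SYSTEM[data_type]

print(master_of("eco"))  # -> PLM
```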

Data reporting focuses on how PLM can extract data from multiple sources and present it seamlessly to the end user. Imagine you need to provide an "open ECO" report. The information can reside in PLM, ERP and maybe some other sources. Getting the right data at the right moment in time can be another problem to resolve.
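
Here is a minimal sketch of such an "open ECO" report, assuming the PLM and ERP data has already been pulled by some adapters. The stub data and field names are made up for illustration and do not correspond to any real product API:

```python
def open_eco_report(plm_open_ecos, erp_eco_status):
    """Merge ECO records mastered in PLM with implementation status from ERP."""
    report = []
    for eco in plm_open_ecos:
        report.append({
            "eco": eco["number"],
            "title": eco["title"],
            "plm_state": eco["state"],
            # ECOs not yet transferred to ERP show up explicitly in the report
            "erp_state": erp_eco_status.get(eco["number"], "not transferred"),
        })
    return report

# Stub data standing in for the two systems
plm_data = [{"number": "ECO-1042", "title": "Bracket redesign", "state": "In Review"}]
erp_data = {"ECO-1042": "Released to production"}
print(open_eco_report(plm_data, erp_data))
```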

Last, but not least – data consistency. When data is located in multiple places, the system has to rely on so-called "eventual consistency" of information. A system of events and related transactions keeps the data in sync. This is not a trivial process, but many systems operate this way. What is important is to have a coordinated data flow between the systems supporting eventual consistency and the data management and reporting tools.
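
A simplified sketch of how that coordination can work, assuming every change in the master system is published as an event and downstream systems apply it asynchronously. A toy in-memory queue stands in for a real message broker:

```python
import queue

events = queue.Queue()  # in a real deployment this would be a message broker

def publish(event_type, payload):
    """The master system publishes a change event instead of writing to other systems directly."""
    events.put({"type": event_type, "payload": payload})

def sync_worker(erp_items):
    """The downstream (ERP) copy catches up with the master asynchronously."""
    while not events.empty():
        event = events.get()
        if event["type"] == "item_released":
            item = event["payload"]
            erp_items[item["number"]] = item  # data becomes consistent eventually

erp_items = {}
publish("item_released", {"number": "100-4711", "rev": "B"})
sync_worker(erp_items)  # after the worker runs, PLM and ERP agree on the item
print(erp_items)
```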

What is my conclusion? To demolish silos and manage a single point of truth is a very good and important strategic message. However, when it comes to the nuts and bolts of implementation, an appropriate data architecture must be in place to ensure you have the right data at the right time. Many PLM implementations underestimate the complexity of data architecture. It leaves them with marketing slogans, burned budgets and wrong data. Just my thoughts…

Best, Oleg

picture credit MCADCafe article.


Cloud PDM can make file check-in and check-out obsolete

July 21, 2014

cloud-pdm-checkin-out-need-1

Management of CAD files (PDM) is heavily associated with desktop workflows. Lots of CAD files live on engineering desktops and shared company network drives. Originally, one of the main PDM functions was to vault CAD data and manage CAD file revisions. One of the most widely used scenarios to support this functionality is the so-called check-in / check-out process. CAD files are checked in from working folders (working space) into secured file vaults located on PDM servers. When an engineer wants to make a change, the file needs to be checked out. The same mechanism ensures released CAD files won't be changed without approval and a prior check-out. The implementation of the PDM check-in/check-out process is not simple because of the complexity of CAD data. File relationships and dependencies need to be taken into account if you want to update a CAD 3D design and its drawings.
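
To show why dependencies make this non-trivial, here is a toy check-out sketch (hypothetical code, not any vendor's API) that locks an assembly together with every file it references:

```python
class Vault:
    """Toy PDM vault: files, their references, and who holds a check-out lock."""

    def __init__(self, references):
        self.references = references  # file -> list of referenced files
        self.locks = {}               # file -> user holding the check-out

    def _closure(self, filename):
        """The file plus everything it references, recursively."""
        files, stack = set(), [filename]
        while stack:
            f = stack.pop()
            if f not in files:
                files.add(f)
                stack.extend(self.references.get(f, []))
        return files

    def check_out(self, filename, user):
        needed = self._closure(filename)
        conflicts = {f: u for f, u in self.locks.items() if f in needed and u != user}
        if conflicts:
            raise RuntimeError(f"Already checked out by others: {conflicts}")
        for f in needed:
            self.locks[f] = user  # lock the assembly and all its dependencies
        return needed

vault = Vault({"bracket_asm.sldasm": ["bracket.sldprt", "bolt_m6.sldprt"]})
print(vault.check_out("bracket_asm.sldasm", "alice"))
```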

The cloud is changing existing working habits. For a long time, engineers were tightly connected to their desks. CAD, engineering analysis, Excel spreadsheets… this is only a short list of the tools that live on engineering desks. Not anymore. These days our workflows are heavily influenced by cloud software. Web email, cloud file sharing, cloud and mobile applications. In many situations we don't need to be at our desk to do the job. The cloud is providing new complementary workflows. However, in some cases, we can see a total replacement of existing workflows.

I've been discussing how cloud technologies are changing CAD file sharing, CAD data management and PDM. Navigate to my previous post – What makes cloud a good alternative for PDM system?. One of the most widely debated questions relates to the ability of a cloud system to handle large CAD files. The capacity of public cloud systems to handle data at scale is well known. Cloud storage costs are going down. The speed of change is significant, and the numbers from my two-year-old post – Cloud PDM and 10GB emails – can make me smile today.

At the same time, a very important and critical aspect of cloud technologies is the synchronization of data between the cloud and desktops / local networks. Web giants like Google, Amazon, Microsoft and others are working to improve sync technologies. In a few of my posts, I covered specific examples of how companies like Box and Dropbox provide techniques to improve data and file sync. But CAD data is different. It is not like photos, office files or even videos. Solving the same problem for highly dependent and intertwined CAD data can be a big deal. Once it is done, it can be a significant leapfrog for any company in the cloud PDM market.

Future CAD file management trajectories can take us from the original idea of checking files in and out between a secured PDM vault and local working folders towards different workflows. Cloud file systems can support a new way to manage CAD files and provide access to them for design tools and other services. The long-term goal can be a future without CAD files. The potential file storage transformation raises lots of questions about how CAD systems will operate without local storage. All these questions are relevant for both private and public cloud solutions.

What is my conclusion? The cloud will change PDM. I can see a potential transformation in the fundamental CAD/PDM scenario – check-in/check-out. Modern cloud PDM can take the approach of seamless and transparent data synchronization and simplify PDM. New workflows can potentially free engineers from time-consuming and complicated file transfers between desktops and servers. The new way of working will be simpler and focus on design release and approval only. I can see this approach well aligned with future cloud design systems that eliminate local file storage completely. So, a future cloud PDM without check-in/check-out? What do you think? These are just my thoughts…

Best, Oleg

