PLM security: data and classification complexity

July 30, 2014


Security. It is hard to overestimate the importance of the topic. Information is one of the biggest assets companies have. Data and information are the lifeblood of every engineering and manufacturing organization and a key element of company IP. It combines 3D models, Bills of Materials, manufacturing instructions, supplier quotes, regulatory data and zillions of other pieces of information.

The Forrester TechRadar™: Data Security, Q2 2014 publication caught my attention. Navigate to the following link to download the publication. The number of data security points is huge and overwhelming. There are different aspects of security. One of the interesting facts I learned from the report is the growing focus on data security. Data security accounted for 17% of security budgets as of 2013, and Forrester predicts a 5% increase in 2014.


The report made me think about a specific characteristic of PLM solutions – data and information classification. The specific characteristic of every PLM system is a high level of data complexity, data richness and dependencies. The information about products, materials, BOMs, suppliers, etc. is significantly intertwined. We can speak a lot about PLM system security and data access layers. Simply put, it depends heavily on the specifics of the product, company, business processes and vendor relationships. As company business gets global, the security model and data access get very complicated. Here is an interesting passage from the report related to data classification:

Data classification tools parse structured and unstructured data, looking for sensitive data that matches predefined patterns or custom policies established by customers. Classifiers generally look for data that can be matched deterministically, such as credit card numbers or social security numbers. Some data classifiers also use fuzzy logic, syntactic analysis, and other techniques to classify less-structured information. Many data classification tools also support user-driven classification so that users can add, change, or confirm classification based on their knowledge and the context of a given activity. Automated classification works well when you’re trying to classify specific content such as credit card numbers but becomes more challenging for other types of content.
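The deterministic matching the report describes is easy to picture in code. Below is a minimal Python sketch; the two patterns and their labels are my own illustrative assumptions, and a real classifier would layer custom policies, validation and fuzzy techniques on top.

```python
import re

# Illustrative patterns only -- real classifiers combine many more rules,
# customer-defined policies, and fuzzy/syntactic analysis.
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify(text):
    """Return the set of sensitive-data labels found in a piece of text."""
    return {label for label, rx in PATTERNS.items() if rx.search(text)}

print(classify("Card 4111 1111 1111 1111, SSN 078-05-1120"))
```

This works for content with a fixed shape; it says nothing about a BOM row or a 3D model, which is exactly why PLM data is hard to classify automatically.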

In my view, PLM content is one of the best examples of data that is hard to classify and secure. It takes a long time to specify which pieces of information should be protected and how. A complex role-based security model, sensitive IP, regulation, business relations and many other factors come into play to provide a classification model to secure PLM data.

What is my conclusion? I can see a growing concern about securing data access in complex IT solutions. PLM is one of them. Protecting complex content is not simple – in many situations, out-of-the-box solutions won’t work. PLM architects and developers should consider how to provide easier ways to classify and secure product information and, at the same time, stay compliant with multiple business and technical requirements. This is an important topic for the coming years. Just my thoughts…

Best, Oleg

PLM, Excel Spreadsheets, Pain Killers and Vitamins

July 29, 2014


We like to compare stuff. Gadgets, cars, hotels, software. We can compare iPhone to Samsung, Canon to Nikon, Honda to Toyota. Software is a special category. When it comes to enterprise software, it gets even more complicated. However, marketing comparison is a fascinating type of writing. The Arena PLM blog posted a marketing piece – Using Excel for Bill of Materials (BOM) Management. The article compares BOM management using Excel spreadsheets with BOM management using PLM tools (Arena tools implied, which is okay). Read the article and draw your own conclusion.

I have a special passion for spreadsheets. In my view (and I know many PLM analysts and bloggers will agree here), Excel stands out as one of the most popular PLM software tools in the industry. I have my reasons to like PLM spreadsheets, as well as a list of my "hate statements" about Excel.

Arena’s article reminded me of the famous marketing story about vitamins and pain killers. The first is "nice to have" and the second is "must buy now". I think the value of PLM tools is obvious. But… here is my little "but". If I compare the lists of values, cost and features in that article, I cannot come to an absolute conclusion about the advantages of PLM tools. It creates a mixed feeling. First, there is no line that says "no" to any feature you can do with Excel. So, basically, I can do everything with Excel, just not in an optimal way (meaning I won’t die :) tomorrow if I keep using Excel). Second, cost is emotionally on the side of Excel. It is very hard to compete with "free" that everybody can use. And, to switch to PLM tools, you need to change the way you work. Even though this is not in the list, it is implied when you compare "time to implement" between "immediate" and "days-weeks". So, when an organization is using Excel to manage BOMs, PLM is not really competing with Excel. This is another type of competition, which salespeople often call "competing with the status quo".

What is my conclusion? A few weeks ago, I shared my recipe for how PLM can take over Excel spreadsheets. Here is the list of three recommendations – flexible data models, easy customization and excellent user experience. I’d like to add pain killers to the list. This is something PLM is still missing in the competition with Excel. The comparison should have a "no/yes" notation. Today’s "poor/excellent" still has a flavor of vitamins. PLM implementations are still hurting people and losing the comparison to seemingly glamorous Excel spreadsheets. Engineers spend too much time managing Excel files, but the cost is hidden and not obvious enough for managers to commit to longer implementations, higher cost and a slow learning curve. Just my thoughts…

Best, Oleg

Part Numbers are hard. How to think about data first?

July 28, 2014


One of the topics that usually raises a lot of debate is Part Numbers. One of my first takes on the complexity of Part Numbers was here – PDM, Part Numbers and the Future of Identification. Ed Lopategui reminded me about the topic in his GrabCAD post – Intelligent Numbering: What’s the Great Part Number Debate? a few days ago. He speaks about four aspects of handling Part Numbers – creation, readability, uniqueness and interpretation. The conclusion is as complex as the topic itself. Here is the passage that outlines the conclusion Ed made.

Balancing all these diverse factors is difficult, because no solution is optimal for every company. Here are some final tips to help you make prudent decisions: 1/ Understand your PDM/PLM system part number generation capabilities; 2/ Understand the limitations of any other systems that interact with your parts; 3/ Go through every activity that requires interpreting part numbers and understand what system access is available, and how the interfaces work. This will provide a good basis for your interpretation cost; 4/ Understand how easy/difficult it is for a new employee to interpret a part number.

These tips made me think again about Part Numbering, data and the different data and process management tools involved in handling Part Numbers. Most approaches focus on systems and functionality to handle part identification and classification. What we do is try to align our need to identify and classify parts with what multiple systems can do. The hardest part is to find Part Numbers that make all the systems involved in the process (CAD, PDM, PLM, ERP, SCM, etc.) work smoothly. Honestly, it is too complex and too costly.

So, how to manage that complexity? Is there a reasonable way to resolve the complexity of Part Numbering and make everybody happy? Thinking about that, I came to the conclusion that companies should start thinking about data first. From the longevity standpoint, data must have a much higher priority than any data management system. In some industries, companies are obliged to keep data for decades. With that in mind, I want to outline some principles that will help you do so and allow you to create some standardization around parts and data identification.

1- Disconnect Part Numbers and classification from specific applications. PNs should not depend on the requirements and capabilities of data and process management systems. Data has a much longer lifespan than applications and systems. By defining PNs independently, you will keep data and processes in your company clean and well organized.

2- Generate PNs based on classification, business needs and processes. Develop an independent service to make it happen. This service most probably should be independent of existing data management systems, with PNs expressed in some sort of URI-based notation.

3- Use an independent service to convert the independent PN into system-specific identification. You can convert it for any system you have at your disposal – PDM, PLM, ERP, SCM… What is important is to be able to control the conversion process and adapt it each time a data and/or process management system changes.
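To make the three principles concrete, here is a small Python sketch of such an independent part-number service. The URI-style notation, the class name and the alias mapping are hypothetical assumptions for illustration – not an existing product or standard.

```python
import itertools

class PartNumberService:
    """Sketch of an application-independent part number service.

    PNs are issued in a neutral URI-like notation, and each downstream
    system (PDM, PLM, ERP, ...) registers its own alias for the part.
    """

    def __init__(self):
        self._counters = {}  # classification -> sequence counter
        self._aliases = {}   # (system, pn) -> system-specific id

    def generate(self, classification):
        """Issue a neutral PN such as 'pn://fastener/bolt/000001'."""
        seq = self._counters.setdefault(classification, itertools.count(1))
        return "pn://%s/%06d" % (classification, next(seq))

    def register_alias(self, system, pn, system_id):
        """Record how a given system identifies this part."""
        self._aliases[(system, pn)] = system_id

    def resolve(self, system, pn):
        """Convert the neutral PN into a system-specific identifier."""
        return self._aliases.get((system, pn), pn)

svc = PartNumberService()
pn = svc.generate("fastener/bolt")
svc.register_alias("ERP", pn, "ERP-778812")
print(svc.resolve("ERP", pn))   # the ERP-specific identifier
print(svc.resolve("PDM", pn))   # no alias yet, falls back to the neutral PN
```

The point of the design is that any system can be swapped out later: only the alias table changes, while the neutral PN stays stable for decades.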

What is my conclusion? Product data is one of the most expensive assets in manufacturing companies. It represents your company IP, and it is the real foundation of every manufacturing business. Think about data first. It will help you develop a strategy that organizes data for a longer lifecycle and minimizes the cost of bringing in new systems and managing changes in existing ones. I think services should be developed to make the process of part numbering easier for manufacturing companies. As manufacturing goes global, maintaining part numbering systems becomes a huge problem. Just my thoughts…

Best, Oleg

PLM vertical PaaS strategies

July 25, 2014


SaaS, PaaS, IaaS, DMaaS, … I’m sure marketing folks are having lots of fun with new xaaS acronyms. The number of publications about various service strategies is skyrocketing. The EDACafe article – The Platform-as-a-Service Provides European Aerospace & Defence OEMs and Partners with Greater Collaboration Capabilities – brings the story of “AirDesign”, the European aerospace and defense industry’s collaboration platform by BoostAeroSpace and Dassault Systèmes. The article tastes a bit like marketing. Nevertheless, I found that the following passage captures the rationale behind what AirDesign is supposed to provide:

AirDesign drastically reduces operational costs for all partners through a single infrastructure, common exchange methods, open standards and easy access, all without adversely impacting existing information systems. All the primary European OEMs jointly requested and defined this platform in order to facilitate exchanges, support their suppliers’ ecosystems and generate new opportunities with services.

The MMT article provides more analysis of what Dassault and BoostAeroSpace are doing:

To manage the exchange of PLM data between all partners whatever their systems may be, AirDesign delivers three key capabilities through its digital exchange hub: 1/ The first allows an OEM to organize exchanges in the context of programs and projects, including the delegation of administrative roles; 2/ The second is an integrated, secured and automatic technical data package exchange. The traceability needed for complex programs is a native capability, ensuring proper management of large technical files between OEMs and suppliers; 3/ The third capability consists of access to a wide variety of services, including converters based on standards or approval services that a partner can use during an exchange.

The set of functionality above isn’t new. Large PLM implementations did it in the past. The problem of standardization and platform cost in supply chain eco-systems is critical. So, it is not unusual for partners to share a single enterprise software infrastructure for a design supply chain and more.

However, the notion of PaaS (Platform as a Service) is interesting. I’ve been blogging about it earlier – Will Cloud PLM develop PaaS option? and Cloud PLM and PaaS dilemma. It looks like PLM vendors are moving towards a more vertical platform architecture. Especially for large companies and business eco-systems, PaaS can provide an interesting solution – standardization and cost benefits. The information about private cloud deployment by AirDesign confirms earlier news about Dassault developing all cloud PLM options.

What is my conclusion? I think we are going to see lots of PLM PaaS variations in the near future. Large manufacturing companies are looking at how to optimize cost and standardize infrastructure. This is an opportunity for PLM vendors to re-establish and sometimes re-develop their legacy systems in a new way. I’m sure lots of ENOVIA applications will be used in the announced AirDesign PaaS. Overall, it looks like PaaS is another way to sell a cloud PLM system to a large manufacturing eco-system. Just my thoughts…

Best, Oleg

picture credit to MMT article



Who will make PLM sexier?

July 24, 2014


The cool factor is trending in software these days. The time when software was ugly is probably in the past. Everyone wants to have a "cool app" – in the picture above you can clearly see the trend. Does it apply to enterprise software and PLM? It is a good question. Back in 2012, I asked it in my post – PLM: Ugly vs. Cool. While nobody is specifically focused on how to develop cool PLM software, I can see increased interest in improved user experience from PLM vendors.


The UX Magazine article Is there Room for Sexy in Enterprise Design? caught my attention a few days ago. I found the discussion about the emotional factor interesting and important. I especially liked the following passage:

The question enterprise technology companies need to ask themselves is “what does sexy mean to your enterprise customer?” Put another way, how do your customers want to feel when using your products? Every product, whether we realize it or not, produces an emotional reaction. As Donald Norman articulated in his seminal book Emotional Design, customers find aesthetically pleasing products more effective. Customers even “love” these products. Norman identified the commercial value in evoking some passion towards products, such as Gucci bags and Rolex watches. MailChimp’s Director of User Experience, Aarron Walter, took this one step further with his book, Designing for Emotion. He posits that the goal of emotional design is to connect with users and evoke positive emotions, which will make your users want to continue interacting with your product.

The article speaks about EchoUser’s research on emotions with enterprise customers. The following emotions make sense to the enterprise crowd – powerful, trust, flexible, calm, pride, accomplished. Cool and sexy are not on the list. So, is there a place for "cool and sexy" in PLM? For a long time, PLM was associated with "complex" and "expensive". At the same time, most PLM commercial videos are cool and sexy. Sports cars, luxury airplanes, fashion shows, mobile devices. You can rarely see a PLM video without that type of product example.

I think many PLM professionals these days are still trying to keep the association of PLM with complexity. My hunch is that they are trying to justify expenses. Customers might think a complex solution requires a bigger budget and longer consulting and service projects. However, the other side of complexity is a perceived absence of reliability and trust. This is not a simple decision for PLM consultants and software vendors.

What is my conclusion? People don’t like cumbersome software these days. There is no place for complex user experience, even in enterprise software. What emotions should drive CAD and PLM software? How should engineers feel about their software? I’d like to connect the results of the engineering and manufacturing process with PLM tools. You cannot make good products with the wrong tools. So, something should happen with PLM software. Complex PLM software is the wrong tool to build future cool products. Just my thoughts…

Best, Oleg

photo credit MidoriShoes

How to re-think activity streams for enterprise?

July 23, 2014


These days, manufacturing businesses are more connected than ever before. Every manufacturing company (even the smallest startup) has a tremendous need for collaboration – helping multiple engineers get involved in the design process, communicating with suppliers, planning manufacturing processes, etc. Social networks and the open web inspired many companies to develop collaboration software that mimics consumer social software. One of the main attributes of every social platform (Facebook, G+, Twitter and others) is the so-called "activity stream" or "news feed". The trend was strong and produced lots of copycats. The successful and lucky ones got acquired. Many of the less successful ones died.

The idea of the activity stream is very powerful. It allows you to easily share and consume information. However, here is the thing – it is not protected from the "noise vs. signal" problem. The more people you follow, the more information flows into your activity stream(s). You end up with a messy stream of information you cannot keep up with. It is probably okay for public news, or even for executives in a company who want to keep up with what is going on. However, it is probably not a good experience for engineers who need to work together on the same design or discuss the next engineering or manufacturing change request. It is also probably not very useful as a tool to communicate between departments and suppliers. And… it is absolutely the wrong model to use for process management.

All the problems I mentioned above actually make the adoption of social systems for collaboration questionable. I can see many confirmations of that. The CMSWire article The Problem With Yammer? People Don’t Use It speaks exactly about this problem. Here is the key passage:

But what if the problem is not about difficulty or learning curves but about culture? What if the problem with Yammer has nothing to do with the product itself and nothing to do with usability, but rather with the fact that enterprise workers are holding onto email for dear life and are not prepared to give it up? Microsoft itself appears to be aware of this. The addition of complimentary Yammer for the new Office 365 plans appears to speak to that. However, if Microsoft’s updated offerings are a step in the right direction, they won’t solve the problem of social and collaboration in the enterprise.

Another interesting example is Facebook. The king of social networks recently introduced a simple and very effective feature to get out of the noise of your information stream – Save. It can quickly remind you of the old and well-known list of favorites. Navigate to the TNW article – Facebook introduces Save, a new bookmarking feature to help tame your News Feed. It sounds like a simple feature, but it allows you to keep specific posts out of the noisy channel and focus on them later in a more controlled way.

These and many other examples made me think about what is needed to provide a better way to collaborate. My hunch is that a "controlled list of topics" can better serve the need of engineers and other people to work together. How to make it happen? That is probably a trickier question. I can see it as the next logical step from email, which is still one of the most favored tools to communicate. It also reminded me of my post Why PLM shouldn’t miss next email move earlier this week.
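One way to picture the "controlled list of topics" idea is a stream where messages only reach people explicitly subscribed to a topic, such as a specific ECO. The Python sketch below is a thought experiment under my own assumptions – the topic names and API are invented, not a description of any existing tool.

```python
from collections import defaultdict

class TopicStream:
    """Messages are posted to explicit topics; a user's feed contains
    only subscribed topics, so the noise never reaches it."""

    def __init__(self):
        self._messages = defaultdict(list)       # topic -> messages
        self._subscriptions = defaultdict(set)   # user -> topics

    def subscribe(self, user, topic):
        self._subscriptions[user].add(topic)

    def post(self, topic, message):
        self._messages[topic].append(message)

    def feed(self, user):
        """Build the user's feed from subscribed topics only."""
        return [m for t in sorted(self._subscriptions[user])
                  for m in self._messages[t]]

stream = TopicStream()
stream.subscribe("engineer", "ECO-1042")
stream.post("ECO-1042", "Gearbox housing rev B approved")
stream.post("company-news", "New cafeteria menu")  # never reaches the engineer
print(stream.feed("engineer"))
```

Unlike an open activity stream, the signal-to-noise ratio here is controlled by the subscription list, not by how many people you happen to follow.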

What is my conclusion? The activity stream is a good way to present a flow of information. However, the type of experience it creates is way too open and subject to information noise. I believe engineering tools should provide a tighter way to communicate, exchange information and share data for collaboration purposes. This is the main reason people are holding onto email as the best tool. New ways to collaborate are not here… yet. Just my thoughts…

Best, Oleg

PLM implementations: nuts and bolts of data silos

July 22, 2014


Data is an essential part of every PLM implementation. It all starts with data – design, engineering, manufacturing, supply chain, support, etc. Enterprise systems are fragmented and represent individual silos of the enterprise organization. Managing product data located in multiple enterprise data silos is a challenge for every PLM implementation.

To "demolish enterprise data silos" is a popular topic in PLM strategies and deployments. The idea of having one single point of truth is always on the minds of PLM developers. Some of my latest notes about that are here – PLM One Big Silo.

The MCADCafe article – Developing Better Products is a “Piece of Cake” by Scott Reedy – also speaks about how a PLM implementation can help aggregate all the product development information scattered across multiple places into a single PLM system. The picture from the article presents the problem:


The following passage is the most important, in my view:

Without a PLM system, companies often end up with disconnected silos of information. These silos inhibit the ability to control the entire product record and employees waste unnecessary time searching for the correct revision of the product design. As companies outsource design or manufacturing, it becomes even harder to ensure the right configuration of the product is leveraged by external partners.

Whether your company makes medical devices, industrial equipment, laptops, cell phones or other consumer products – PLM provides a secure, centralized database to manage the entire product record into a “Single Record of the Truth”… With a centralized product record, it is easy to propose and submit changes to the product design, track quality issues and collaborate with your internal teams and supply-chain partners.

The strategy of a "single record of truth" is a centerpiece of every PLM implementation. However, here is the thing… if you look at the picture above, you can certainly see some key enterprise systems – ERP, CRM, MES, project and program management, etc. A PLM system can contain scattered data about product design, CAD files, part data, ECO records and Bills of Materials. However, some of the data will still remain in other systems. Some of the data gets duplicated. This is what happens in the real world.

It made me think about three important data architecture aspects of every PLM implementation: data management, data reporting and data consistency.

The data management layer focuses on which system controls the data and provides the master source of information. Data cannot be mastered in multiple places. An implementation needs to organize a logical split of information, as well as the ability to control the "data truth". This is the most fundamental part of the data architecture.

Data reporting focuses on how PLM can extract data from multiple sources and present it seamlessly to the end user. Imagine you need to provide an "open ECO" report. The information can reside in PLM, ERP and maybe some other sources. Getting the right data at the right moment in time can be another problem to resolve.

Last, but not least – data consistency. When data is located in multiple places, the system will rely on so-called "eventual consistency" of information. A system of events and related transactions keeps the data in sync. This is not a trivial process, but many systems operate this way. What is important is to have a coordinated data flow between systems, supporting eventual consistency alongside the data management and reporting tools.
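A minimal sketch of that event-driven sync might look like the Python below. The event shape and system names are invented for illustration; real integrations deal with transactions, ordering and failure handling on top of this idea.

```python
class System:
    """A system (PLM, ERP, ...) holding its own copy of part records."""
    def __init__(self, name):
        self.name = name
        self.records = {}

    def apply(self, event):
        part, field, value = event
        self.records.setdefault(part, {})[field] = value

class EventBus:
    """Queues change events and replays them to every subscribed system."""
    def __init__(self):
        self.subscribers = []
        self.pending = []

    def publish(self, event):
        self.pending.append(event)

    def flush(self):
        # Until flush runs, the systems may disagree -- that window is
        # exactly what "eventual consistency" means.
        for event in self.pending:
            for system in self.subscribers:
                system.apply(event)
        self.pending = []

plm, erp = System("PLM"), System("ERP")
bus = EventBus()
bus.subscribers = [plm, erp]
bus.publish(("PART-100", "revision", "B"))
bus.flush()
print(plm.records["PART-100"] == erp.records["PART-100"])
```

The design choice is that neither system writes directly into the other; both converge by consuming the same ordered event log.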

What is my conclusion? To demolish silos and manage a single point of truth is a very good and important strategic message. However, when it comes to the nuts and bolts of implementation, an appropriate data architecture must be in place to ensure you will have the right data at the right time. Many PLM implementations underestimate the complexity of data architecture. It leaves them with marketing slogans, burned budgets and wrong data. Just my thoughts…

Best, Oleg

picture credit MCADCafe article.

Cloud PDM can make file check-in and check-out obsolete

July 21, 2014


Management of CAD files (PDM) is heavily associated with desktop workflows. Lots of CAD files live on engineering desktops and shared company network drives. Originally, one of the main PDM functions was to vault CAD data and manage CAD file revisions. One of the most widely used scenarios to support this functionality is the so-called check-in/check-out process. CAD files are checked in from working folders (working space) into secured file vaults located on PDM servers. When engineers want to make a change, they need to check the file out. The same mechanism can ensure released CAD files won’t be changed without approval and a prior check-out. The implementation of the PDM check-in/check-out process is not simple because of CAD data complexity. File relationships and dependencies need to be taken into account if you want to update a CAD 3D design and drawings.
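The classic flow above can be modeled in a few lines of Python. This is a toy sketch under my own assumptions – a real PDM vault also tracks users, release states and the file dependencies mentioned above.

```python
class Vault:
    """Toy model of a PDM vault with check-out/check-in semantics."""

    def __init__(self):
        self._revisions = {}      # file -> list of revisions
        self._checked_out = set() # files currently locked for editing

    def check_out(self, filename):
        if filename in self._checked_out:
            raise RuntimeError("%s is already checked out" % filename)
        self._checked_out.add(filename)
        return self._revisions.get(filename, [])[-1:]  # latest revision

    def check_in(self, filename, content):
        # Changing an existing vaulted file requires a prior check-out.
        if filename not in self._checked_out and filename in self._revisions:
            raise RuntimeError("check out %s before changing it" % filename)
        self._revisions.setdefault(filename, []).append(content)
        self._checked_out.discard(filename)

vault = Vault()
vault.check_in("bracket.sldprt", "rev A geometry")   # first vaulting
vault.check_out("bracket.sldprt")
vault.check_in("bracket.sldprt", "rev B geometry")   # new revision
print(len(vault._revisions["bracket.sldprt"]))
```

The explicit lock is exactly the friction the post argues a transparent cloud sync could remove.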

Cloud is changing existing working habits. For a long time, engineers were tightly connected to their desks. CAD, engineering analysis, Excel spreadsheets… this is only a short list of the tools that live on engineering desks. Not anymore. These days, our workflows are heavily impacted by cloud software. Web email, cloud file sharing, cloud and mobile applications. In many situations, we don’t need to be at our desks to do the job. Cloud is providing new complementary workflows. However, in some cases, we can see a total replacement of existing workflows.

I’ve been discussing how cloud technologies are changing CAD file sharing, CAD data management and PDM. Navigate to my previous post – What makes cloud a good alternative for PDM system?. One of the most widely debated questions is the ability of cloud systems to handle large CAD files. The capacity of public cloud systems to handle data at scale is well known. Cloud storage cost is coming down. The speed of change is significant, and the numbers from my two-year-old post – Cloud PDM and 10GB emails – can make me smile today.

At the same time, a very important and critical aspect of cloud technologies is the synchronization of data between the cloud and desktops / local networks. Web giants like Google, Amazon, Microsoft and others are working to improve sync technologies. In a few of my posts, I covered some specific examples of how companies like Box and Dropbox provide specific techniques to improve data and file sync. But CAD data is different – not like photos, office files or even videos. Solving the same problem for highly dependent and intertwined CAD data can be a big deal. When it is done, it can be a significant leap forward for any company in the cloud PDM market.

Future CAD file management trajectories can take us from the original idea of checking files in and out between a secured PDM vault and local working folders towards different workflows. Cloud file systems can support a new way to manage CAD files and provide access to them for design tools and other services. The long-term goal can be a future without CAD files. The potential file storage transformation raises lots of questions about how CAD systems will operate without local storage. All these questions are relevant for both private and public cloud solutions.

What is my conclusion? Cloud will change PDM. I can see a potential transformation in the fundamental CAD/PDM scenario – check-in/check-out. Modern cloud PDM can take an approach of seamless and transparent data synchronization and simplify PDM. New workflows can potentially exclude engineers from time-consuming and complicated file retrieval between desktops and servers. The new way of working will be simpler and focus only on design release and approval. I can see this approach aligning well with future cloud design systems that eliminate local file storage completely. So, future cloud PDM without check-in/check-out? What do you think? These are just my thoughts…

Best, Oleg

Why PLM shouldn’t miss next email move?

July 18, 2014


Email is the king of communication in every company. Many companies are literally run by email. People use it for different purposes – notification, collaboration and, very often, even record management. You can hear many discussions about how companies can replace or integrate email with enterprise and social collaboration tools. I captured some of them in my previous blogging: How engineers find path from emails and messages to collaboration?; PLM Workflows and Google Actionable Emails; DIY PLM and Zero Email Policy; PLM Messaging and WhatsApp Moment.

You may think email doesn’t change. I wanted to share two interesting examples of change and innovation in email that caught my attention over the last few weeks. The Verge article speaks about the Gmail API announcement.

Google announced that any app could now talk to Gmail using today’s faster, more modern languages — languages that every web developer speaks. The Gmail API lets you ask Google for threads, messages, drafts, and labels three to ten times faster than with IMAP. What it can do is provide an interface for any app to interact on a small scale with your Gmail account without having to create an entire mail client. When that happens, Google won’t have replaced email — it will have actually extended it. Instead of killing email as some hoped it would, the Gmail API gives email new life.

The following video presents some additional details about Gmail API usage. Take 5 minutes to watch it, especially the parts where it speaks about integration between Gmail and enterprise systems.
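As a sketch of what "talking to Gmail" could look like for an enterprise app: the Gmail API accepts an RFC 2822 message, base64url-encoded, in the "raw" field of a send request. The addresses and subject below are placeholders, and the commented-out client call is shown only as the typical google-api-python-client shape, not code from the announcement.

```python
import base64
from email.mime.text import MIMEText

def build_gmail_payload(sender, to, subject, body):
    """Build the request body the Gmail API expects for sending a message:
    an RFC 2822 message, base64url-encoded, under the "raw" key."""
    msg = MIMEText(body)
    msg["From"], msg["To"], msg["Subject"] = sender, to, subject
    raw = base64.urlsafe_b64encode(msg.as_bytes()).decode("ascii")
    return {"raw": raw}

payload = build_gmail_payload("plm@example.com", "engineer@example.com",
                              "ECO-1042 approved", "Please review the change.")
# e.g. service.users().messages().send(userId="me", body=payload).execute()
```

No full mail client is needed – which is exactly the point the Verge article makes about the API extending email rather than replacing it.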

Another example comes from the TNW article – Inbox launches as an open-source email platform to replace legacy protocols.

A new startup, Inbox, is launching its “next-generation email platform” as an alternative to aging protocols like IMAP and SMTP. The core of Inbox’s efforts is an Inbox Sync Engine for developers that adds a modern API on top of mail providers, including Gmail, Yahoo and

As stated in the article, Inbox is a platform play. The founders’ intent is to create a new generation of messaging platform. And it is an open source play. The first step for Inbox is to create a sync engine that can expose existing email providers:

The core of Inbox is an open source sync engine that integrates with existing email services like Gmail, and exposes a beautiful, modern REST API. We’re pleased to announce that beginning today, you can download the Inbox engine, sync an account, and begin building on top of Inbox in your local development environment.

These articles made me think about the potential play PLM and engineering applications can make by building their collaboration applications tightly integrated with email services. It would allow better communication for people and easier data integration between PLM solutions and communication platforms such as email. You may see it as a purely technical play. Who cares how to integrate email and data? However, in my view, this is a place where differentiation in user experience and seamless data integration can become critical to driving user adoption.

What is my conclusion? It is very hard to change people’s habits. Email is part of our everyday routine. Existing systems are integrated with email, but the way it is done, as well as the level of data integration, is very sporadic. Lots of unstructured data about customers, engineering decisions, requirements and much more gets stuck in email and is lost there forever. A new email approach may help to achieve transparent and seamless integration between business applications and email. It can make a difference for users. Just my thoughts…

Best, Oleg

Will IBM and Apple open doors for mobile PLM future?

July 17, 2014


Enterprise software and Apple haven’t been much of a success story until now. Don’t get me wrong – you can see enterprise execs and even IT folks using iPhones and other Apple devices. In my view, they do it mostly for mobile email and other cool apps. However, until now, the traction of iOS in the enterprise was limited. I’ve been speculating about the future of the iPad for enterprise PLM in my previous writing: PLM Downstream – Sent from my iPad?; iPad and Enterprise PLM; 3D/PLM and iPad: Future or Baloney? At the same time, I haven’t seen many Apple devices in manufacturing companies, especially on shop floors and in maintenance and service departments. In many situations, IT remained a strong gatekeeper.

Some good news for iOS mobile PLM developers came yesterday. Apple and IBM announced a global partnership to transform enterprise mobility. Navigate here to read the IBM press release. The number of articles and reviews is skyrocketing. I picked a few of them. The PC World article – Why the Apple-IBM deal matters. My favorite passage speaks about "unique cloud services" for iOS.

Apple and IBM announced an “exclusive” deal on Tuesday in which IBM will build a new line of enterprise-specific apps from the ground up for Apple’s iOS, aimed at companies in retail, health care, transportation and other industries. IBM will create “unique cloud services” for iOS, including tools for security, analytics and device management. It will also resell iPhones and iPads to its corporate customers, and Apple will roll out new support services for businesses. In other words, Apple and IBM are putting a full-court press on the mobile business market. And they’re doing so in a tightly wedded fashion: The companies used the word “exclusive” four times in a statement announcing the deal.

Another article, from Forbes – Apple – IBM Partnership: Enough To Solve Enterprise iOS Fears? – caught my attention by speaking about Apple relying on enterprise partners to do the heavy lifting needed to sell mobile solutions to the enterprise.

As enterprises increasingly look to make more use of business applications on mobile devices – for a competitive advantage in flexibility and productivity – manufacturers such as Apple will rely on enterprise partners, he notes, “to do the heavy lifting that will increasingly be required in areas such as mobile application development, lifecycle management and systems integration”. Apple is likely to seek other partners, similar to IBM, that can also provide enterprise capabilities and support.

Let’s go back to PLM vendors and mobile development. Until now, I had mixed feelings about the PLM mobile story. All PLM vendors did something for mobile and iOS. But, in my view, it was some sort of checkmark – "yes, we have it". One of the missing points was the absence of specific apps that solve productivity problems. Most mobile PLM apps did the same job as non-mobile software, just on an iPad. In addition, the 3D viewer app was very popular. Most of these applications came as an overlap to existing software. At the same time, the key advantage of a mobile app is to provide productivity for situations when users are away from their desks – on the road, in workshops, manufacturing and service facilities. Some of my thoughts about that are here – Mobile PLM gold rush: did vendors miss the point?

What is my conclusion? The Apple and IBM agreement could be a big deal. IBM has a very good track record in enterprise PLM deployments. Even though the manufacturing industry was not specifically mentioned in the press release, I’m sure it will influence the decisions of many IT managers. So, it sounds like an opportunity. iOS developers can start looking for jobs in PLM companies. It is also a good opportunity for startups. Just my thoughts…

Best, Oleg

