Why Is PLM Stuck in PDM?

April 5, 2014


I’ve been following the CIMdata PLM Market & Industry Forum on Twitter earlier this week. If you’re on Twitter, navigate here or search for the #PLM4UM hashtag. The agenda of the forum is here. One of the sessions touched on one of my favorite topics – PDM vs. PLM: “PLM: Well Beyond Just PDM” by Peter Bilello. This passage explains what the session was about:

CIMdata’s research reveals that leading industrial companies are looking to expand beyond PDM functionality to truly enable a more complete PLM strategy. This becomes even more important in a circular economy. In this presentation, CIMdata will discuss which areas are most important, and what opportunities they create for PLM solution and service providers.

The following tweets coming from this session caught my attention:

According to CIMdata, leading Mfrs are now looking to move beyond PDM. #PLM4um
— ScottClemmons (@ScottClemmons) link to tweet.

Peter B / CIMdata explains that it’s hard to find a ‘real’ end-to-end #PLM implementation that works #plm4um
— Marc Lind (@MarcL_) link to tweet.

It made me think about why, after so many years of PLM implementations, most vendors are still solving mostly PDM problems for customers, and why it is so hard to move on to broad downstream and upstream adoption of PLM beyond CAD data management functions. Here are four points explaining, in a nutshell, why I think PLM is stuck in PDM.

1- Focus on design and CAD.

Most PLM vendors historically came from the CAD-related domain. For them, the PLM business was an expansion of the CAD, design and engineering business. As a result, use cases, business needs and customer focus were heavily influenced by the design domain. The result – a PDM focus was the clear priority.

2- PLM is a glorified data management toolkit

The initial focus of many PLM systems was to provide a flexible data management system with an advanced set of integration and workflow capabilities. There were many reasons for that – functionality, competition, enterprise organizational politics. Flexibility was considered one of the competitive advantages PLM could provide to satisfy the diversity of customer requirements. It resulted in complicated deployments, expensive services and a high rate of implementation failures.

3- Poor integration with ERP and other enterprise systems

PLM sits on the bridge between engineering and manufacturing. Therefore, to be successful, integration with ERP systems is mandatory. However, PLM-ERP integration is never easy (even these days), which puts up a barrier to deploying a PLM system beyond the engineering department.

4- CAD oriented business model

Because of their CAD and design roots, PLM sales were always heavily influenced by CAD sales. Most PLM systems initially came to market as extensions of CAD/PDM packages. With an unclear business model and complicated support from VARs and service companies, mainstream PLM deployment was always focused on how not to slow down CAD sales.

What is my conclusion? Heavy CAD roots and a traditional orientation toward engineering requirements hold existing PLM systems back from expanding beyond PDM for midsize manufacturing companies. The success rate of large enterprise PLM is higher, but it comes at a high price, including heavy customization and service offerings. Just my thoughts…

Best, Oleg


PLM Is Challenged With Compliance Data

October 24, 2013


Manufacturing is going global. This is not about the future; it is the reality of all manufacturing companies today. In such a reality, supply chain solutions are getting more and more traction and interest from IT managers and other people in companies. To use the language of one of my sales exec friends, the supply chain solution is turning from a vitamin into a painkiller. Which means a sales opportunity.

At the same time, supply chain is not a new topic for manufacturing companies. Very often PLM companies focus on the supply chain with “CAD in mind”, which makes them look mostly at design supply solutions. Many supply chain management opportunities go directly to ERP vendors and other vendors specializing in vertical-specific supply chain management solutions. In my view, data is becoming one of the most critical topics in every supply chain solution. In most situations, the data needed for supply chain solutions is “stuck in the middle” between ERP, PLM and SCM. It combines design data, manufacturing data and supplier data. It is not easy. It requires openness and transparency between multiple systems. Let’s tell the truth – it doesn’t work well in existing enterprise systems. These systems were never designed for openness and transparency.

The topic of supply chain compliance was discussed during the recent PI Congress in Chicago two weeks ago. I found a good summary of the PI Congress provided by CIMdata here. Here is an interesting passage from that write-up related to the supply chain compliance topic:

The first [panel] focused on supply chain transparency and traceability. This issue occurs at the intersection of PLM and ERP, and is critically important to firms that must increasingly compete on a global basis. The panel agreed there was a need for a common dataset on compliance issues, a common problem when selling in many countries. They recognized that PLM solution providers are challenged to provide this information in a timely fashion, and challenged the audience to help find or create new alternatives.

A common dataset is an interesting topic. It made me think about the trend towards open data, which I see as part of the broader discussion about the BigData opportunity these days. In that context, I want to mention the Open Data initiatives led by organizations such as the Open Data Institute (ODI) and the Open Knowledge Foundation (OKF). The first one, ODI, was founded by Tim Berners-Lee. The topic of open data is complex and I hope to speak about it in future blog posts. You can find some information about open data on the ODI website here. Another thing derived from open data is the Open Definition initiative. These are all new initiatives that are mostly unknown in the enterprise software domain.
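To make the idea of a common compliance dataset a bit more concrete, here is a minimal sketch of what an openly published compliance record for a single part could look like. The field names and regulation identifiers are purely my own illustrative assumptions – there is no agreed-upon schema today, which is exactly the problem:

```python
from dataclasses import dataclass, field

# Hypothetical shape of a shared compliance record for one part.
# Field names and regulation identifiers are illustrative only --
# no common schema exists today, which is exactly the gap described above.
@dataclass
class ComplianceRecord:
    part_number: str                       # supplier part number
    manufacturer: str                      # who produces the part
    substances: dict = field(default_factory=dict)    # substance -> concentration (ppm)
    declarations: dict = field(default_factory=dict)  # regulation -> compliant (True/False)

record = ComplianceRecord(
    part_number="ABC-1234",
    manufacturer="Acme Components",
    substances={"lead": 50, "cadmium": 0},
    declarations={"RoHS": True, "REACH": True},
)

# If suppliers published records like this in an open, machine-readable form,
# PLM, ERP and SCM systems could all consume the same compliance data.
print(record)
```

The point is not the specific format – JSON, XML or anything else would do – but that the same record would be readable by every system in the chain.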

What is my conclusion? I think we are at the beginning of a data revolution. To provide better solutions these days, we need to have data available. That includes openness and data access. It also relates to company data stored on company servers and, even more important, open data outside of organizations that must be available to everyone. In my view, a common compliance dataset is a perfect example of open data that must be available to enable the future development of PLM supply chain solutions. Just my thoughts…

Best, Oleg

image courtesy of Open Knowledge Foundation.


PLM, BigData and Importance of Information Lifecycle

September 25, 2013

BigData is trending these days. It goes everywhere. Marketing people are in love with the name – it carries such a nice flavor of “big something”. It might be big dollars, or the big number of problems it is supposed to solve, or potentially a big value proposition. Net-net, the number of people and articles around you referring to the big data opportunity is probably skyrocketing. If you want to read more about big data, navigate to the following Wikipedia article – it is a good starting point.

CIMdata, a well-known PLM advisory outfit, recently published an interesting paper about PLM and BigData. Navigate to this link, download the research paper (it requires registration) and have a read. I’d say this is the best reference on the intersection of the PLM and Big Data worlds. Here is what the document is about:

This paper focuses on the intersection of PLM and what has come to be known as “Big Data.” The increasing volume and growth rate of data applicable to PLM is requiring companies to seek new methods to turn that data into actionable intelligence that can enhance business performance. The paper describes methods, including search-based techniques, that show promise to help address this problem.

Search and analytics are one way to dig into the big data problem. Last year, I wrote about why PLM vendors need to dig into Big Data. Here is the link to my post – Will PLM vendors dig into Big Data?. I believe BigData can provide huge value to an organization, and unlocking this value is extremely important. However, looking at the BigData hype these days, I get a feeling of wrong priorities and of gaps between the vision of BigData and the reality of PLM implementations.

I’ve been reading an ITBusinessEdge article – Three Reasons Why Life Cycle Management Matters More with Big Data. The main thing I learned from this article: even though big data is going to change a lot, it won’t change some fundamental data management laws. Data lifecycle is one of them. Here is my favorite passage:

“With Big Data, which can be unpredictable and come in many different sizes and formats, the process isn’t so easy,” writes Mary Shacklett, president of technology research and market development firm Transworld Data. “Yet if we don’t start thinking about how we are going to manage this incoming mass of unstructured and semi-structured data in our data centers…

It means a lot in the context of PLM systems. This is where I can see the biggest gap between BigData and PLM. It is easy to collect data from multiple sources – that’s what everybody talks about. However, big data needs to be managed as well, together with the other information managed by PLM. Big data goes through a lifecycle of processing, classification, indexing and annotation. How to connect the pieces and relate big data back to the PLM system is a significant problem to think about. Engineers and other people in the company probably won’t be interested in accessing the data itself, but in analytics, insights and recommendations.
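As a thought experiment, here is a minimal sketch of that lifecycle in code. The stage names follow the processing–classification–indexing–annotation sequence above; the function bodies are placeholder assumptions of mine, not any vendor’s implementation:

```python
# Hypothetical sketch of a big-data lifecycle feeding a PLM system.
# Stage names follow the post; the rules inside each stage are placeholders.

def process(raw_records):
    """Normalize raw records arriving from multiple sources."""
    return [r.strip().lower() for r in raw_records if r.strip()]

def classify(records):
    """Attach a rough category to each record (toy rule)."""
    return [(r, "design" if "cad" in r else "other") for r in records]

def index(classified):
    """Build a simple index: category -> list of records."""
    idx = {}
    for record, category in classified:
        idx.setdefault(category, []).append(record)
    return idx

def annotate(idx, source="supplier-feed"):
    """Add provenance metadata so records can later be related to PLM items."""
    return {cat: {"source": source, "records": recs} for cat, recs in idx.items()}

raw = ["CAD model rev B ", "Supplier quote #123", ""]
print(annotate(index(classify(process(raw)))))
```

The interesting part for PLM is the last step – keeping enough metadata to relate the incoming data back to the items, BOMs and processes the PLM system already manages.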

What is my conclusion? The value behind big data is huge. It can improve decision making, quality of service, supplier bids and a lot of other things. However, it creates huge pressure on IT and the organization in terms of resources, data organization and data infrastructure. PLM systems won’t be able to start with big data overnight. Whoever tells you “now we support big data” is probably too marketing oriented. PLM will have to focus on the data lifecycle to bring realistic big data implementation plans to the organization. Just my thoughts…

Best, Oleg


PLM and The Art Of Simplicity

April 12, 2013

For many years, enterprise software was known as a place where the development of new features was one of the main priorities. Having a comprehensive list of features was considered an absolute necessity. Armies of salespeople and advisers spent enormous amounts of time validating and comparing features of enterprise systems. From the very beginning, PLM was considered an extremely complex discipline. Product development methodology and engineering culture made PLM what it is now – overloaded with features, struggling with user experience and with the ability to achieve fast and broad adoption in any company.

It looks like we are going to see some changes here. It is not just about small manufacturing shops anymore. Companies like Boeing are complaining about PLM software usability and looking for ways to solve that problem.

To confirm what I said above, take a look at the slide presented yesterday at the CIMdata PLM Forum in Ann Arbor. CIMdata identifies the simplification of PLM as one of the key challenges PLM is facing with customers.

However, the act of bringing simplicity is not a simple thing to do. CIMdata confirms that making the complex simple is a significant development undertaking and defines it as “art and science” at the same time.

What is my conclusion? We are going to see a big change in PLM development. PLM developers, who just a minute ago were focused on how to add one more feature to the product, will take a step back and think about user experience and simplicity. This is the result of many years of customer disappointment in the way PLM systems are implemented. Hiding the complexity of data models and processes makes total sense. People like “everything simple” these days. Vendors must take notes. Just my thoughts…

Best, Oleg


CIMdata PLM Forum: PLM Never Ends

April 10, 2013

I attended the North American PLM Market & Industry Forum organized by CIMdata earlier today. CIMdata runs these forums across different geographies. Navigate to the following link to learn more about future locations and forums. Here you can see the agenda. I did some calculations. The pure presentation time was about 6 hours, and CIMdata planned to present a total of about 369 slides. It means attendees were supposed to digest slides at an average speed of 1.02 slides/min. The top slides/min speed I captured was during the Big Data and PLM presentation – Ken Amman’s performance was 1.86 slides/min.
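For those who like to check the back-of-the-envelope math, the average comes straight from the numbers above:

```python
# Back-of-the-envelope slide pace from the forum agenda.
total_slides = 369
presentation_minutes = 6 * 60          # roughly 6 hours of pure presentation time
print(f"{total_slides / presentation_minutes:.2f} slides/min")  # ~1.02
```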

The amount of information shared by CIMdata was huge. There is no sense in copying and pasting all the graphs and charts. I will take time to digest it and will probably come back later with some thoughts and ideas. Nevertheless, three topics stand out in the overall stream of information CIMdata presented earlier today, and I wanted to share some of my thoughts about them: software vs. service revenues, the PLM evolution chart and collaboration.

PLM Revenues: Software vs. Services

An interesting piece of information was presented by Peter Bilello during his State of PLM presentation. The following slide shows the overall state of 2012 PLM Market. The data point that caught my attention was about software vs. services revenue growth.

According to CIMdata, in the cPDM/PLM segment of the market, services revenue grew more slowly than software revenue. Traditionally, the service component of a PLM implementation has been significant; it is not unusual to see a 50/50 split between software and services revenues. What does this 2012 data point mean? Is it a local 2012 anomaly, or does it represent a trend towards different ways of implementing PDM/PLM solutions? An interesting question to ask. I hope CIMdata will follow up on this topic with additional research.

PLM technologies

It is always interesting to see how analysts present the history of PLM. I found the following slide showing the evolution of the PLM market quite interesting, and here is the main reason why. I’m sure you are familiar with the theory that presents evolution as a spiral pattern.

Interestingly enough, the PLM evolution slide doesn’t address this spiral of evolution. According to the slide, the evolution of PLM went from data and technology to processes and the “bottom line” of business solutions. However, we need to remember the massive, disruptive technological innovation happening around us right now with web, mobile, big data, open source, etc. Many legacy PDM/PLM solutions were built using the technologies of the 1980s. Do you think the technology of the 1980s and 1990s can keep up with the bottom line of processes and business solutions? I don’t think so. Therefore, I’m looking forward to the next spiral of PLM technologies. New technology will drive change across the whole solution chain.

Re-think Collaboration?

Last but not least – collaboration. The PDM/PLM industry has spent a significant amount of time over the last two decades working on solutions for collaboration. Nevertheless, the research presented by John MacKrell demonstrates that collaboration is actually a weak link in the landscape of PLM systems engineering solutions.

 

People expect vendors to make improvements in all aspects of collaboration – people, data and processes. There are two main reasons for that, in my view. Traditionally, vendors have had difficulties with openness and data access, which leads customers to disappointment and anger about vendors’ unwillingness to change their strategy. On the other side, modern web and social networking tools provide good examples of collaboration – Skype, Facebook, Twitter, Google+, LinkedIn, to name only a short list of available products and technologies. In my view, it is time to re-think collaboration by reusing the social web paradigm and modern web technologies.

What is my conclusion? I took the title of this blog post from one of the final tweets from the CIMdata PLM forum. PLM never ends. PLM has deep connections to product development processes and innovation. You cannot stop the innovation process – if you don’t innovate, innovation will happen anyway, just somewhere else. I think PLM vendors need to remember that. Technology is democratizing these days, and the question of how to democratize technology becomes more and more relevant. Just my thoughts…

Best, Oleg


PLM: Business Transformation vs. Business Pain Solving?

March 8, 2013

I’d like to provoke a discussion about PLM implementations today. I assume most of you have had a chance to hear the term “business transformation”. That is how the majority of PLM vendors, consulting and service companies approached PLM implementation over the last decade. In a nutshell, it means transforming the business processes in a company to follow a PLM strategy. Implementation of PLM products and infrastructure was part of the “business transformation” process. I found a good description of what PLM transformation means in the following paper by Kalypso. Kalypso is a well-known service and consulting outfit – sort of a “department store” in the business of PLM implementation. Their viewpoint is unique and interesting since they partner with the majority of PLM companies in the world (according to the following list). I found the following quote interesting:

In the past several years we have watched companies try to implement PLM solutions on a tactical basis; focusing on a single department or system function, defining the business problem narrowly, or taking a technology-replacement approach to their programs. These projects inevitably fail to deliver any significant business impact, but they are safe, small, and can “fly below the radar” of the executives. Nobody gets promoted, but nobody loses their job… Companies that adopt a strategic, “vision-driven” approach to their PLM programs are significantly outperforming those that view PLM more tactically. Specifically, there is a strong correlation between best-in-class program performance and the following actions taken by leading companies: 1/ Developing a firm vision and strategy for PLM that identifies a future state to achieve from PLM, and tie that vision back to the overall business strategy; 2/ Adopting a PLM program approach to implementing PLM, addressing the implementation of PLM as a series of related projects. 3/ Approaching the PLM implementation as a business transformation as opposed to a technology installation, recognizing the need to change behavior and business processes in addition to providing new software.

It is a very solid statement. There is nothing wrong with the business of developing long-term strategic programs and investing in product development innovation. There is only one problem here – the dynamics of business are different these days compared to what we had 5-10 years ago. The cost demands are different too. These days businesses run much faster and require a speedy, flexible reaction from IT and all business systems (PLM included).

Speaking about flexibility in PLM implementation, I was reading a commentary published by CIMdata yesterday – Using PLM In the Cloud to Improve Business Flexibility. You can navigate to the publication via this link. CIMdata speaks about PLM delivery via the cloud, specifically about TeamCenter and the virtual cloud model. Siemens PLM recently announced the availability of TeamCenter via IaaS cloud infrastructure. I found the following passage from the CIMdata commentary interesting:

Today companies are not looking to buy “PLM.” They want solutions that solve specific business “pains” for their specific industry focus. Businesses must be able to more quickly acquire and deploy PLM functionality and solutions that give them operational flexibility and improve the efficiency and the pace of product development, production and service. They need to be able to take advantage of new capabilities without having to go through lengthy installation and tailoring processes – and they need to deploy and operate these new capabilities in a cost efficient manner. Reducing the time to deploy new PLM functionality with less (or no) IT support and infrastructure costs can significantly improve operational flexibility

What is my conclusion? The flexibility and agility of PLM implementation are getting more attention. The cloud is a perfect way to increase the flexibility of PLM deployment by providing the infrastructure and tools to deliver PLM systems. The fact that TeamCenter is moving towards cloud deployment is just another confirmation that customers are looking for alternatives to existing PLM deployment models. Cost is another aspect. I don’t know if TeamCenter in the cloud is cheaper than the traditionally licensed option. However, I can imagine how cloud PLM can provide a cost advantage compared to existing PLM implementations. Together with flexibility, it can become a decisive factor for many manufacturing companies. Just my thoughts…

Best, Oleg


Will IBM return to the PLM software business?

March 4, 2013

It is almost two years since Dassault Systèmes completed the transaction to acquire and integrate IBM PLM operations into DS. The historical IBM/DS press release is here. At the same time, IBM continues to focus on product innovation. Navigate here and you can learn about IBM product development innovation, systems engineering and lifecycle support. In parallel, the adoption of PLM software is growing. More companies in the world are implementing PLM, which requires more products, systems, solutions and services in PLM. For the last 3-5 years we have seen IBM acquiring infrastructure software companies… Does it mean we will again see IBM software and services solving manufacturing and product development problems? I’ve been reading Seeking Alpha’s article “So What Does IBM Mean When It Says It’s In The Solutions Business?”, which explains what type of solutions IBM will be providing in the future:

“It is not individual packaged products per se, but groups of related software products, services, and systems. And we know at very high level where IBM is going to focus its solutions efforts. IBM has always been about software, services, and systems – although in recent years the first two have taken front stage. The flip side is that some of these solutions areas are overly broad. Smarter Analytics is a catch-all covering the familiar areas of business intelligence and performance management, predictive analytics and analytical decision management, and analytic applications.”

I found the following video about IBM’s use of systems engineering to streamline smarter product development quite interesting.

During the last PI Congress in Berlin three weeks ago, CIMdata talked about the need for integration between configuration management, PLM and systems engineering. Peter Bilello of CIMdata described such integration as an absolutely necessary element of the future of product innovation.


What is my conclusion? Business intelligence, decision support, systems engineering and integration – these functions are much desired by manufacturing companies to solve product development problems. Large companies these days are looking at how to streamline product development processes. The enterprise PLM business seems to be impossible without system services and integration support. IBM is collecting significant software stacks that can be used for these purposes. Maybe we will see an IBM renaissance in PLM soon? Just a thought…

Best, Oleg


Thoughts about PLM Conferences

November 1, 2012

Last week I attended the PLM Innovation Americas 2012 conference in Atlanta. I already published a few posts inspired by the conference – PLM Innovation and 5 PLM Trends and PLM Innovation: Who will provide PLM to Boeing in 2015? A few facts about the conference itself: about 250 attendees, reasonably sized for large presentations and small roundtables, and an exhibition floor represented by all the PLM vendors. This is one of the very few vendor-independent PLM conferences. Actually, I know of only one more – PLM Road Map. I’ve been reading blogs and the Twitter stream from the last PLM Innovation. The following press release caught my attention – PLM Road Map To Be Presented with PLM Innovation Americas 2013.

CIMdata, Inc., the leading global PLM strategic management consulting and research firm, announces that it will be co-locating its PLM Road Map conference in conjunction with MarketKey’s PLM Innovation conference in 2013, the date to be announced. The name of the combined event will be Product Innovation Featuring the PLM Road Map. Both organizations will participate in planning and program development for these events. CIMdata brings its extensive PLM knowledge and 20 years of PLM Road Map experience to bear. MarketKey provides its marketing and event organization skills to the combined effort.

It made me think about why PLM events have become so rare and what can make a non-vendor event successful and popular. In the world of the web, blogs, YouTube and social media, you need to provide something very special to get people on board airplanes and travel across the country to attend a conference for a few days. I found only one reason to come – to listen to customer stories and speak to customers live. This unique opportunity is priceless and can justify the time and money you need to spend.

Actually, I found confirmation of my idea reading Michael Fauscette’s blog post Recap of Oracle Open World 2012. Michael compares Oracle Open World and Salesforce.com’s user conference Dreamforce by analyzing how customers were presented during the conferences. Here is an interesting passage:

Having just attended Salesforce.com’s Dreamforce conference two weeks before OpenWorld it’s hard not to compare the two mega-conferences. Both vendors put on a great show, but there were some differences. For me the thing that Salesforce did right, and I think is clearly a best practice in vendor conferences, was weaving powerful customer stories, told by executive from those customers, all through every keynote and discussion. In other words Salesforce lets its customers tell much of its story. This is simply not true of Oracle. Don’t get me wrong, there are plenty of Oracle customer stories involved, many in the form of videos, but the approach is just different. Oracle prefers to tightly control the message and have its executives present the information, backed up mostly by customer videos. It’s just not as compelling to me, and its a shame because there are some great customer stories to tell.

What is my conclusion? In our online world, there is a single reason to come and attend a conference – to connect and speak with customers. In this context, vendor-independent events are more appealing. During these events, customers can speak about real customer experience without an obligation to promote a specific vendor. These are the talks that help you learn a lot about products, implementations and industry practices. Unfortunately, there are not many vendor-independent PLM events. I’m looking forward to seeing more PLM events in 2013. Vendors should take note and promote customer presentations during their events. Just my thoughts…

Best, Oleg

Image courtesy of [fotographic1980] / FreeDigitalPhotos.net


PLM, Blurred IP and Practical Data

October 24, 2012

IP (Intellectual Property) is a term used in PLM very often. You have probably had a chance to hear about IP management, IP lifecycle, IP protection, etc. I don’t know about you, but to me it usually conveys a feeling of dealing with something important. While I agree that IP is important, it often comes across to people in a very blurred way. You can also see how people switch from speaking about IP to other terms (Bill of Materials, Parts, Drawings) as the conversation proliferates down the organization from top management to engineering and manufacturing.

A few days ago, my attention was caught by an article written by Peter Bilello of CIMdata – PLM View: Management Intellectual Property. CIMdata is a well-known analyst outfit specializing in research and consulting in the field of engineering and manufacturing software. PLM is one of their key specialties. The article is available via the ConnectPress community website. Navigate to the following link to access it (it requires registration on ConnectPress, which is free). Have a read and form your own opinion. Here is the definition of IP provided by Peter:

What is IP? The common-sense answer is information that defines the product and how it is to be manufactured, delivered, supported and recycled, and that may be required to support patent applications and to defend patents if challenged or infringed. These definitions also include new-product engineering data: requirements, conceptual and detailed designs, analyses and trade-off studies, simulations of production systems, and even ergonomic analyses.

To me it sounds like all information regarding the product in the company actually represents product IP. So you can probably ask: what information does NOT belong to IP? Here is the answer you can find in the same article:

And What is Not IP. Information that is probably not IP includes transactional data that doesn’t provide a company with any particular competitive advantage. Of course, the distinctions remain fuzzy. In its PLM consulting work, CIMdata encourages the use of two litmus tests. Does the information in question relate to basic enterprise or product capabilities that could become competitive issues, or legal issues, or touch on regulatory compliance? If yes, the information is IP. Secondly, is retention of the information mandatory? If yes, the information is IP.
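The two litmus tests above read almost like a decision rule. Here is a minimal sketch of how they could be written down in code – the function and field names are my own illustrative assumptions, not anything CIMdata publishes:

```python
# Hypothetical encoding of the two "litmus tests" for IP quoted above.
# Field names are illustrative assumptions only.

def is_ip(info: dict) -> bool:
    """Return True if a piece of information should be treated as IP."""
    # Test 1: does it touch competitive, legal or regulatory concerns?
    competitive_or_regulatory = info.get("competitive_or_regulatory", False)
    # Test 2: is retention of the information mandatory?
    retention_mandatory = info.get("retention_mandatory", False)
    return competitive_or_regulatory or retention_mandatory

print(is_ip({"competitive_or_regulatory": True}))     # True  -> treat as IP
print(is_ip({"competitive_or_regulatory": False,
             "retention_mandatory": False}))          # False -> probably not IP
```

Of course, the hard part in practice is deciding how to answer those two questions for every class of data – which is exactly why CIMdata calls the distinctions fuzzy.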

This definition made me think about the variety of information sources nowadays. We are living in a world where information is aggressively collected by companies and devices. I’m sure you are familiar with multiple incidents of information leaked from and collected by Google services, Apple iPhone tracking, Facebook activities and many other sources. E-commerce websites collect a lot of information about people purchasing different products and services. Thinking about an organization, it is very hard to predict what information will actually become relevant to legal, regulatory or retention requirements.

Speaking about clarifying what IP is and how PLM can help us deal with it, I found the following passage very important:

Ultimately, IP governance is about extracting real value from misunderstood assets. Amid the 21st century’s data tsunami and its constant disruptions to accustomed ways of thinking and working, the value of these assets keeps going up. Part of the new awareness of IP is the tremendous value as source of insights for solving problems and making decisions. Applying PLM strategies to IP helps ensure that decision-makers can get whatever data they need in a timely manner. To state this in another way, resources dedicated to reusing IP data are true investments and not just money spent digging up information.

What is my conclusion? I think companies need to move from mystical, blurred strategies to simple terms and definitions. PLM IP is one of them. Companies need to collect and retain the data that is important for their business and lifecycle. As a manufacturing company, I want to collect information about my customers, product usage, suppliers, etc. As an engineering organization, I want to collect information about how to develop and manufacture the product. There are many other fields that are becoming important, and we need to discover them. Getting people whatever data they need in a timely manner is the best IP management strategy I can think of. Just my thoughts…

Best, Oleg


New Collaboration and Data Hostage Game

August 17, 2012

Think about the most overused term in PDM/PLM software over the last decade (or even longer): collaboration. It has been developed and sold in different flavors and packages. Remember cPDM – collaborative PDM? Later it became collaborative PLM. And don’t forget e-Collaboration and many others. If you want to refresh your memory, navigate to the following link with the CIMdata article – Definition of cPDM.

Time moves fast. The last decade of the internet, consumer devices, mobile and web 2.0 changed how we share information and collaborate online. At the end of the day, I need to collaborate with my family, kids and friends, and I do it online in a very efficient way. So efficient that the question “how can I do the same in my company?” becomes almost obvious.

Earlier today, the following commentary from CIMdata came to me via Twitter (thanks, Chad Jackson, for the tweet). The article The Changing Face of Collaboration (Commentary) speaks about how collaboration is changing under the influence of technology, mobile and consumer-based software. Here is the first important passage I captured:

In many ways we are witnessing the convergence of a number of technology-driven themes that have the potential of significantly changing collaborative work processes within and outside of a company’s four walls. The first technology-driven theme can be categorized as the consumerization of information technology (IT). The second is the explosion in the availability, capability, and usability of mobile information delivery devices. And the third is the entrance of social media-savvy individuals, who’ve grown up using Facebook, Twitter, and the Internet, into the corporate workforce. This convergence is well underway and today’s companies need to prepare and implement the appropriate processes and technologies that support the new way of collaborating.

Later, the author concludes that there is an absolute need to develop new collaborative processes; otherwise, today’s PLM solutions will turn into dinosaurs. Here is another passage:

The need to define and enable new collaborative processes and enabling technologies are not optional, they are mandatory–not only for Generation Y but also for the rest of us who need to compete in this highly collaborative and connected world. Without providing the correct level of support, today’s PLM solutions will be tomorrow’s legacy systems.

Well, we have a bunch of new technologies and a new Gen-Y workforce. What next? What needs to be done in order to deliver a new kind of collaborative process? It made me think about openness again. Think about the web and social networking. The availability of information on the web was one of the most important prerequisites that allowed companies to develop websites and apps that deliver value (starting with Google search and ending with the latest social networks like Pinterest).

There is a problem that exists in all PDM/PLM systems: these systems take data hostage. Let me explain what I mean. Whatever they manage – files, processes, communication – stays in the system. Almost all of them claim openness, but in practice it doesn’t mean much. You can run a test by trying to share data out of these systems over some generic infrastructure without exporting it (for example, to an Excel file). How can I share a Bill of Materials from my PDM system in SharePoint without exporting it? How can I share a preview of my CAD model on my company’s supplier website without a “dance with a tambourine” and additional coding?
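To make the point more concrete, here is a minimal sketch of the kind of open, read-only interface I have in mind – a small web endpoint that serves a BOM as plain JSON so SharePoint, a supplier portal or any other tool can consume it without a manual Excel export. The data, route and use of Flask here are my own illustrative assumptions, not any vendor’s API:

```python
# Hypothetical illustration only: a tiny read-only endpoint exposing a BOM
# from a product data store as plain JSON. Not a real PDM/PLM vendor API.
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for data that would normally live inside the PDM/PLM database.
BOMS = {
    "ASSY-100": [
        {"part": "PRT-001", "description": "Bracket", "qty": 2},
        {"part": "PRT-002", "description": "Fastener", "qty": 8},
    ]
}

@app.route("/api/bom/<item_id>")
def get_bom(item_id):
    """Return the BOM for an item, or a 404 if we don't know it."""
    bom = BOMS.get(item_id)
    if bom is None:
        return jsonify({"error": "unknown item"}), 404
    return jsonify({"item": item_id, "bom": bom})

if __name__ == "__main__":
    app.run(port=5000)
```

The specific technology doesn’t matter; what matters is that the data is reachable from outside the PDM/PLM system without export gymnastics.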

What is my conclusion? In order to facilitate collaboration, PDM/PLM software products need to stop taking data hostage. Sharing information out of these systems needs to become a first priority for product data management software. An open infrastructure for data sharing will create a new ecosystem that helps people collaborate. After that stage, we can expect many other companies and products to come along with applications that help people collaborate using openly available information. Just my thoughts…

Best, Oleg

image credit sheelamohan / FreeDigitalPhotos.net

