Full Product Lifecycle, Cloud and Granular Security

November 27, 2013

granular-security-cloud

Security is one of the most fundamental aspects of enterprise systems. If you think of every enterprise system as a big data store with business rules applied on top, security is the first thing you need to evaluate in order to map the system to your organization. It has multiple dimensions: who gets access to information, and in what way. Additional business complexity can increase the complexity of the security model even more.

Cloud systems development is trending these days; new cloud applications and systems appear almost every day. We are reaching the point where collaborative use of multiple cloud systems and applications becomes real. I've been reading the Engineering.com article Can CAD-in-the-Cloud Handle the Entire Product Lifecycle? Read it and draw your own conclusions. The important intent I captured is about combining different disciplines and cloud tools under the single umbrella of product lifecycle development and business practice.

Decomposing the process into stages and using different cloud applications brings an interesting perspective on the required security model. I can see security becoming a hot issue in cloud development. The time when security was only about "file-share" options is gone. Today, cloud systems and users demand much more granular data organization and security management.

My attention was caught by an interesting acquisition made by Box. Navigate to the following article to read – Box Acquires dLoop To Enhance Security With Fine-Grained Data Analytics Technology. Box is a well-known outfit producing a variety of tools and specifically focusing on enterprise customers. My hunch is that the deal answers the increased demand for data security in the enterprise, and it comes as part of growing competition with Dropbox and other consumer file sharing tools. Despite huge success in the consumer space, security and enterprise deployment are probably still challenging points for most of them. Here is an interesting passage from the TechCrunch article:

We’ve been spending a lot of time improving the end user experience on Box, but we’re equally committed to creating better management tools for enterprise IT. This means unlocking greater visibility into the activities happening around information, and providing more granular controls where necessary. dLoop’s machine learning capabilities will ultimately allow Box to help enterprise customers identify and surface relevant content by tracking activity patterns… In larger enterprises, data classification is becoming a must-have in order to control what people can do with files. Companies want policy-based file sharing. Box has made considerable effort to enhance its security.

Policy-based security is the key thing here. Thinking about a full product lifecycle scenario involving multiple tools and people, policy-based security may be the only way to support a granular security model. The need for security is only one aspect. Ultimately, the goal of enterprise systems today is to improve user experience. In my opinion, without appropriate data granularity, this is mission impossible for most systems today. Cloud adds a further dimension of complexity, coming from scenarios that involve multiple cloud tools and shared content.
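To make this less abstract, here is a minimal sketch of what policy-based access could look like in code. This is purely illustrative – the roles, actions and lifecycle stages are hypothetical and not taken from any specific PLM product:

```python
from dataclasses import dataclass

@dataclass
class Policy:
    """A hypothetical policy: who (role) may do what (action) at which lifecycle stage."""
    role: str
    action: str   # e.g. "read", "edit", "share"
    stage: str    # e.g. "design", "manufacturing", "service"

def is_allowed(policies, role, action, stage):
    """Grant access only if an explicit policy matches; deny by default."""
    return any(p.role == role and p.action == action and p.stage == stage
               for p in policies)

policies = [
    Policy("engineer", "edit", "design"),
    Policy("supplier", "read", "manufacturing"),
]

print(is_allowed(policies, "engineer", "edit", "design"))   # True
print(is_allowed(policies, "supplier", "edit", "design"))   # False
```

The important design choice is "deny by default": access exists only where a policy explicitly grants it, which is what makes the model granular rather than file-share-style.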

What is my conclusion? New environments, old problems and even bigger challenges. This is how I see the combination of enterprise reality, new cloud systems and the demand for security. Granularity is key, in my view. Without policy-based granular access, cloud product development tools such as CAD, CAE, CAM, PLM and others will remain childish and fail to meet real enterprise customer scenarios. Just my thoughts…

Best, Oleg


Cloud PLM Scaling Factors

November 26, 2013

plm-cloud-scale

Scale is one of the fanciest words that comes up when different technologies are discussed or debated. So, speaking about cloud CAD, PDM and PLM, the discussion must address the "scaling factor" too. Since the PLM cloud switch finally happened, it became clear that vendors and customers will have to move down from buzzwordy statements about "cloud" to more specific discussions about the particular cloud technologies they are using. Very often, cloud deployment relies on so-called IaaS (Infrastructure as a Service) used by vendors to deploy solutions. PLM vendors are using IaaS as well. I've written about it a bit in my post – Cloud PLM and IaaS options. In my view, Siemens PLM made the boldest statements about following an IaaS strategy in delivering cloud PLM solutions. At the same time, I believe all other vendors, without exception, are using the variety of IaaS options available on the market from Amazon, Microsoft and IBM.

An interesting article caught my attention earlier today – Google nixes DNS load balancing to get its numbers up. The article describes Google demoing the scaling capabilities of its cloud platform. The Google blog post provides a lot of detail about the specific setup used for the tests and measurements:

This setup demonstrated a couple of features, including scaling of the Compute Engine Load Balancing, use of different machine types and rapid provisioning. For generating the load we used 64 n1-standard-4’s running curl_loader with 16 threads and 1000 connections. Each curl_loader ran the same config to generate roughly the same number of requests to the LB. The load was directed at a single IP address, which then fanned out to the web servers.
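Assuming each curl_loader instance simply runs a fixed number of worker threads hammering one address, the shape of such a load generator can be sketched in a few lines. The sketch below counts simulated requests instead of issuing real HTTP calls, so all numbers are illustrative:

```python
import threading

def run_loader(n_threads, requests_per_thread, counter, lock):
    """Simulate one curl_loader instance: n_threads workers each issuing requests."""
    def worker():
        # in a real load test this loop would issue an HTTP request
        # to the load balancer's single IP address
        for _ in range(requests_per_thread):
            with lock:
                counter[0] += 1

    threads = [threading.Thread(target=worker) for _ in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

counter, lock = [0], threading.Lock()
run_loader(n_threads=16, requests_per_thread=100, counter=counter, lock=lock)
print(counter[0])  # 1600 simulated requests from one loader's 16 threads
```

In Google's test, 64 such loaders ran in parallel against one IP, which fanned out to the web servers behind the load balancer.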

It is not surprising that Google made some competitive statements trying to differentiate itself from its major competitor – Amazon. Here is an interesting passage from the Gigaom writeup:

“Within 5 seconds after the setup and without any pre-warming, our load balancer was able to serve 1 million requests per second and sustain that level.”… this as a challenge to Amazon Web Service’s Elastic Load Balancing. ”ELBs must be pre-warmed or linearly scaled to that level while GCE’s ELBs come out of the box to handle it, supposedly,” he said via email. Given that Google wants to position GCE as a competitor to AWS for business workloads, I’d say that’s a pretty good summation.

The discussion about cloud platforms and scalability made me think about the specific requirements for cloud PLM to scale and how they relate to platform capabilities. Unfortunately, you cannot find much information about this from PLM vendors. Most of them limit the information to simple statements about compatibility with specific platform(s). However, the discussion about scaling can be interesting and important. Thinking about it, I came up with three main groups of scaling scenarios in the context of PLM: 1/ computational scale (e.g. when a PLM system is supposed to find design alternatives or resolve a product configuration); 2/ business process scale (e.g. supporting a specific process management scale in transactions or data integration scenarios); 3/ data processing scale (e.g. required to process significant data imports or analyses). Analyzing these scenarios would be interesting work, which of course goes beyond a short blog article.

What is my conclusion? The coming years will bring an increased number of platform-related questions and differentiation factors in the PLM space and the enterprise in general. It will come as a result of solution maturity, use cases and delivery scenarios. The cost of the platforms will matter too. Both customers and vendors will be learning about delivery priorities and how future technological deployment will match business terms and expectations on both sides. Just my thoughts…

Best, Oleg


What “social features” can PLM steal from SharePoint?

November 25, 2013

tag

Collaboration is still one of the hot topics in the PLM space. I was watching the PLM TV report about Dassault’s 3DEXPERIENCE Platform from Engineering.com. A bit long (for my taste), the video provides a comprehensive overview of what Dassault’s key execs said about the 3DEXPERIENCE platform at the last Dassault conference. Much of it relates to the complexity of data, connecting information sources, and providing a better user experience, especially for top-tier managers. However, "social" remained the key word in many conversations connecting the value of 3DEXPERIENCE to a "new way to collaborate". Social was one of the key points of new DS platform innovation. This is how engineering.com captured it:

Verdi asks about the platform and architecture and also how the changes will impact product development teams. He delves into how Dassault envisions that teams will use the social and collaborative tools as part of their engineering processes.

However, very often it is not simple to see which specific "social" features can enable a new way to collaborate. As I mentioned in my previous blog – Why Social PLM 1.0 failed – a single social utility is what most vendors missed in their plans to deliver a new way to collaborate socially. An interesting article caught my attention a few days ago – 3 Cool Things about SharePoint in Office 365 Enterprise. One of the features was specifically "social oriented". It was about simple and well-known tagging – hashtags (#) and at-tags (@). I captured the following passage:

Bring social inside the walls of your organization with SharePoint’s social features. With SharePoint 2010, you could follow sites and tag colleagues. In the 2013 flavor, you can have a newsfeed where you can use social features like hashtags (#) and at tags (@) to track ideas and topics and mention people in your posts. In a news feed for a particular team, you might put hashtags on customer names, industry publication names, or create a tag for a particular issue. Then someone can just click the active tag to see all posts relative to that topic.
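The mechanics behind this are simple enough to sketch. Assuming newsfeed posts are plain strings, extracting hashtags and at-tags and resolving a "click on an active tag" is just pattern matching. All names and posts below are made up for illustration:

```python
import re

def extract_tags(post):
    """Pull hashtags (#) and at-tags (@) out of a newsfeed post."""
    hashtags = re.findall(r"#(\w+)", post)
    mentions = re.findall(r"@(\w+)", post)
    return hashtags, mentions

def posts_for_tag(posts, tag):
    """Clicking an active tag: return all posts carrying that hashtag."""
    return [p for p in posts if tag in extract_tags(p)[0]]

posts = [
    "Issue #ACME_bracket resolved, thanks @maria",
    "New revision uploaded for #ACME_bracket",
    "Supplier meeting notes #procurement",
]
print(posts_for_tag(posts, "ACME_bracket"))  # first two posts
```

The value in a PLM context would come from what the tags point at – part numbers, change requests, supplier names – rather than from the tagging mechanism itself.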

These cool SharePoint features made me think again about the "single utility" that could give practical sense to the new way to collaborate presented by PLM companies. The tagging feature can be quite powerful. If you combine it with the ideas of data complexity, multiple data sources and user experience, you can get a completely new way to collaborate.

What is my conclusion? Social is a very powerful topic. However, most enterprise companies lost focus when giving specific meaning to social features by simply applying the same principles companies like Facebook and Twitter used to power their own solutions. In the context of engineering, manufacturing and, probably, any enterprise organization, Facebook-like functionality is clearly not enough to change fundamental collaboration principles. I think the SharePoint folks understood this quite well after all the ups and downs SharePoint had in the past. Therefore, I can see the value of some specific features in the overall social collaboration domain. Just my thoughts…

Best, Oleg


How can PLM transmit data to the cloud?

November 24, 2013

plm-cloud-data-transmiting

Transmitting data is complicated and painful. Remember a few years ago, when the industry just started to switch to the cloud, the discussion about how to transfer large CAD parts and assemblies was very hot. Actually, not much has changed since then. A huge amount of data lives on desktops and is produced outside of the cloud environment. Think about the variety of sensors that have started to appear almost everywhere. Also, think about the potential for a huge amount of simulation – many simulations still run on desktop computers.

My attention was caught by Amazon's newly announced Kinesis service. The description provided by Amazon is quite impressive.

Amazon Kinesis provides an easy way to process fast moving data in real-time. You simply provision the level of input and output required for your data stream, in blocks of 1 megabyte per second (MB/sec), using the AWS web management console, API, or SDKs. The size of your stream can be adjusted up or down at any time without restarting the stream and without any impact on the data sources pushing data to Amazon Kinesis. Once your stream has been created, you can immediately start loading your data, with simple HTTP PUTs. Amazon Kinesis automatically manages the infrastructure, storage, networking, and configuration to collect and process your data at the level of throughput your application needs. Within seconds, data PUT into an Amazon Kinesis stream is available for analysis. Stream data is stored across multiple availability zones in a region for 24 hours. During that window, data is available to be read, re-read, backfilled, and analyzed, or moved to long-term storage (e.g. Amazon S3 or Amazon Redshift).

If you don’t have time to read the product description, watch the following two-minute video. The idea in a nutshell is quite simple – Amazon Kinesis will pump your data to the cloud. After that you can decide what to do and how to proceed.
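Based on the description above – input capacity provisioned in blocks of 1 MB/sec – the provisioning arithmetic is easy to sketch. The stream name, payload rate and SDK-style calls below are illustrative assumptions on my part, not verified against a live AWS account:

```python
import math

SHARD_INPUT_MB_PER_SEC = 1.0  # Kinesis provisions input capacity in 1 MB/sec blocks

def shards_needed(peak_mb_per_sec):
    """How many 1 MB/sec input blocks (shards) to provision for a given peak rate."""
    return max(1, math.ceil(peak_mb_per_sec / SHARD_INPUT_MB_PER_SEC))

# e.g. desktop CAD clients pushing ~4.5 MB/sec of data at peak
print(shards_needed(4.5))  # 5

# With a boto-style SDK, stream creation and the HTTP PUT would look roughly like
# (hypothetical sketch):
#   kinesis.create_stream(StreamName="plm-telemetry", ShardCount=shards_needed(4.5))
#   kinesis.put_record(StreamName="plm-telemetry", Data=payload, PartitionKey=file_id)
```

The nice property, per Amazon's description, is that the stream size can be adjusted up or down later without restarting the stream or touching the data sources.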

I can see a few potential applications of the Kinesis service within cloud infrastructure today. One of them is massive data processing from desktops and corporate network locations. It can enable analytics applications, search and many other services that cannot live without data. Another application area is the so-called IoT (Internet of Things), which I covered a few days ago here. By processing data from various sensors and mobile devices, engineering applications can get new data input that improves the design process.

What is my conclusion? The Kinesis service shows the potential of moving data from a variety of locations to the cloud for processing. Bringing data together can enable a number of completely new ways to design, analyze and optimize. Capturing huge amounts of real-time data can help improve quality and provide insight into how customers use products. Just my thoughts…

Best, Oleg


CAD, PLM and Fast Data Processing

November 24, 2013

network-computing

We demand software to be fast. We don't like waiting while the computer performs its work and produces results. Depending on the situation and use case, speed can be merely annoying or, even more, can change the overall working process. Some time ago, simulation and analysis software embedded into CAD systems changed the way engineers design mechanical parts. Another, more PLM-ish example might be visualizing a complete product configuration and performing clash analysis.

Nowadays, the computing mindset is taking us towards new approaches to data processing using new processor technologies as well as network computing opportunities. The MIT Tech Review article Graphics Chips Help Process Big Data Sets in Milliseconds talks about how to visualize big data on cheap computers in milliseconds. The writeup mentions a massively parallel database that achieves a roughly 70x performance gain by storing data in the onboard memory of GPUs. You can use the following link to try the public interface of the product, tweetmap. Here is an interesting passage I captured:

The [tool] can be used to visualize 50 million geocoded tweets posted between September 28 and October 6. The tool allows users to explore different search terms, examine broad geographical trends, and zoom in on each tweet. For each of the 30 frames per second it generates when animating Twitter, Map-D scans all the tweets that have been loaded on the GPUs, constructing visualizations such as maps of how word usage—which could include mentions of a product name or news item—is propagating across a region or around the world in real time.
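To make the idea concrete, here is a toy version of one "frame" of such a scan in plain Python. The real system runs this over 50 million rows held in GPU memory; the data and field names here are invented for illustration:

```python
def tweets_in_region(tweets, lat_min, lat_max, lon_min, lon_max, term):
    """One tweetmap-style scan: filter geocoded tweets by bounding box and
    search term. A GPU database runs this kind of scan in parallel over all
    rows resident in onboard GPU memory, 30 times per second."""
    return [t for t in tweets
            if lat_min <= t["lat"] <= lat_max
            and lon_min <= t["lon"] <= lon_max
            and term in t["text"].lower()]

tweets = [
    {"lat": 42.36, "lon": -71.06, "text": "coffee in Boston"},
    {"lat": 48.85, "lon": 2.35,   "text": "coffee in Paris"},
    {"lat": 42.35, "lon": -71.07, "text": "rainy day"},
]
print(len(tweets_in_region(tweets, 41, 43, -72, -70, "coffee")))  # 1
```

The engineering analogy would be the same scan shape over design variants or field-failure records instead of tweets.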

Watch the following video, which is quite self explaining.

What is my conclusion? Fast data processing can be a key technology for changing the design and engineering process. Visualizing design options, analyzing variants and making a final assessment to catch potential mistakes – this is only a very short list of options. Moving forward, you can think about analyzing huge databases for potential failures, customer issues and potential supplier components. Technologies developed today for the web, social networks and the mobile domain can be leveraged tomorrow to change the future landscape of engineering software. Just my thoughts…

Best, Oleg


PDMish, PLMish and other CADPDLM bundles…

November 20, 2013

cad-model

Engineering and manufacturing software is full of interesting TLAs that companies continue to invent even these days. Only yesterday, GrabCAD introduced a Workbench enhancement under what can be considered a very traditional name – Collaborative Product Development (CPD) Platform.

Chad Jackson, my long-time blogging buddy, published an article with a fascinating title – ENOVIA Data Management: Less PDM-ish, More PLM-ish. Chad reviewed Enovia data management capabilities from a Dassault Systèmes briefing. Read the article with its multiple embedded tweet-quotes. Besides playing the PDM vs. PLM naming game, the article brings an important perspective on where Enovia data management architecture is going. Here is an interesting passage I captured:

CAD models have never been simple files. Lots of stuff have been, and continue to be, jammed into them. They contain far more than just geometry. They contain part numbers, so drawing BOMs populate automatically. They contain material properties, such that mass properties can be calculated. Over time, as more and more enterprise considerations needed to be taken into account, more non-geometry stuff has been jammed into CAD files. The problem with all this stuff in CAD files was that, unlike many other types of files, operating systems couldn’t understand the structure of information in CAD files.

So, what is Enovia doing differently? According to Chad’s comments:

Dassault Systèmes are trying to liberate all that stuff jammed into the CAD file. Here are some notes from my briefing with them. In short, they are taking non-geometric items in CAD files and turning them into meta-data that lives in ENOVIA. They are being turned into individual pieces of information that can be modified separately and independently from every other piece of information and the geometry. Of course, this meta-data is related and will live as a database item right alongside the file that contains the geometry.
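A toy sketch of that "liberation" step might look like this – pulling everything that isn't geometry out of a hypothetical CAD file representation and storing it as independent database records. This is my illustration of the concept, not how ENOVIA is actually implemented:

```python
import sqlite3

# Hypothetical non-geometric "stuff" jammed into a CAD file
cad_file = {
    "geometry": "<binary geometry blob>",
    "part_number": "PN-1042",
    "material": "aluminum 6061",
    "mass_kg": 0.45,
}

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE metadata (file_id TEXT, key TEXT, value TEXT)")

# "Liberate" each non-geometric item into its own record that can be
# modified independently, while the geometry stays with the file
for key, value in cad_file.items():
    if key != "geometry":
        db.execute("INSERT INTO metadata VALUES (?, ?, ?)",
                   ("bracket.CATPart", key, str(value)))

rows = db.execute("SELECT key, value FROM metadata WHERE file_id = ?",
                  ("bracket.CATPart",)).fetchall()
print(rows)
```

Once the part number, material and mass are rows rather than bytes inside a proprietary file, any enterprise process (BOM population, mass rollup, search) can read and update them without opening the CAD file.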

cad-pdm-bundle

It made me think about future trajectories of CAD/PDM/PLM integrations. I described possible options in one of my posts last year – Multi-CAD PDM integrations: yesterday, today and tomorrow. The idea of a CAD/PDM bundle I expressed there is, in my view, a reflection of what CATIA/Enovia is doing in data management. As I mentioned there, it solves the problem of version compatibility and provides significant advantages in terms of functional richness. This is exactly what Chad demonstrated with the CATIA/Enovia examples.

What is my conclusion? Integration remains a place where lots of innovation is happening. CAD, PDM and PLM integration is a very challenging space. Customers have a huge demand for vertically integrated products that provide a sufficient amount of features and, most importantly, a completely new level of user experience. It sounds like we will see more investment in this space from traditional vendors. CAD/PLM companies will try to integrate existing products into vertical suites, connecting data and providing support for integrated product scenarios. Just my thoughts…

Best, Oleg


GrabCAD CPD wants to disrupt CAD file management

November 19, 2013

grabcad-workbench

Three years ago I sat together with Hardi Meybaum of GrabCAD in one of the Starbucks coffee shops near Brookline, Mass. We talked about the GrabCAD social network, communities, openness, engineers and CAD libraries. You can find some of my thoughts after that conversation here – Marketplaces and Engineering Software. Since that time, I've been following GrabCAD's trajectory closely and have had the chance to meet with Hardi many times. You may remember some of my previous posts – Manufacturing Crowdsourcing and Cloud PLM opportunity, GrabCAD, Collaboration and Dropbox, GrabCAD and Open Engineering Source: Dream or Reality?

However, GrabCAD's trajectory recently started to change. From an open CAD library and "Facebook for engineers", GrabCAD is moving towards a perhaps more traditional space – design and collaboration. GrabCAD introduced Workbench – a product supposed to help engineers collaborate during the design phase. You can see my first GrabCAD Workbench experiments here. All around, GrabCAD Workbench started to look and smell like document and product data management – PDM: reinvent the wheel or innovate?

Lately, Hardi and I had a very active discussion about different aspects of CAD file sharing, collaboration and integration – CAD File Sharing and Integration Challenges, Top 3 pros and cons to have a special CAD file sharing tools.

And here it finally comes. Earlier today, GrabCAD published a press release announcing its future PDM solution – GrabCAD Workbench brings CAD file management into the 21st century. Navigate here to read the press release. So, the name of the game is CPD (Collaborative Product Development), and here is how it is explained in the GrabCAD press release:

Today GrabCAD announced that it has expanded the capabilities of the cloud-based Workbench solution to a complete platform for Collaborative Product Development (CPD), enabling users to manage, share and view CAD models with zero IT investment. More than ten thousand users have already signed up to use Workbench to share CAD models with suppliers, customers and partners. With the expansion of file management capabilities, these users will be able to automatically sync their desktop files to cloud projects, track file dependencies, visualize version differences and resolve conflicts. With these additions, Workbench is now the ideal tool to help small to mid-size companies manage CAD files.

It all comes down to the advantages of the cloud – getting a service without installation, configuration, maintenance and other related hassle. Here is another passage I captured.

As a cloud-based service, Workbench requires no dedicated server, no configuration, no maintenance, and no IT hardware or support. While legacy PDM/PLM solutions take weeks or months to install and configure, Workbench users are up and running in minutes.

A few screenshots below give some additional impression of what GrabCAD Workbench CPD is supposed to provide. According to GrabCAD, the service will first become available as a beta for everybody and later for a flat fee of $25 or $45 per month depending on the plan.

grabcad-pdm-1

One of the very interesting features of GrabCAD CPD and Workbench is the Viewer Diff functionality, which can visualize and show the difference between CAD file versions.
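GrabCAD hasn't published how Viewer Diff works internally, but the basic step of detecting which parts of an assembly changed between two versions can be sketched with content hashing. All names and data below are illustrative:

```python
import hashlib

def content_hash(data: bytes) -> str:
    return hashlib.sha1(data).hexdigest()

def changed_parts(version_a: dict, version_b: dict):
    """Compare two versions of an assembly (part name -> file bytes) and
    report which parts were added, removed, or modified. A viewer could
    then highlight only those parts instead of re-rendering everything."""
    added = sorted(set(version_b) - set(version_a))
    removed = sorted(set(version_a) - set(version_b))
    modified = sorted(p for p in set(version_a) & set(version_b)
                      if content_hash(version_a[p]) != content_hash(version_b[p]))
    return added, removed, modified

v1 = {"bracket": b"geometry-rev1", "bolt": b"m6x20"}
v2 = {"bracket": b"geometry-rev2", "bolt": b"m6x20", "washer": b"m6"}
print(changed_parts(v1, v2))  # (['washer'], [], ['bracket'])
```

The hard part in a real product is of course the geometric visualization of the modified parts, not the bookkeeping; this only shows the change-detection layer.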

grabcad-pdm-2

grabcad-pdm-3

What is my conclusion? Technology and people. These are the two components of future change in almost every place. I always say – technology is much easier than people. In the engineering and manufacturing business it is probably especially true. Engineers are the main consumers of CAD and PDM products, and it is very hard to change their behavior. PDM has probably one of the worst records in enterprise software, especially among small companies and individuals. Engineers see PDM as something that hurts them rather than helps. It will be interesting to see if new cloud services such as GrabCAD CPD can change that trend. The name (CPD) isn't new, so hopefully the product itself will deliver the difference. I'm looking forward to joining the service as a beta user and testing it. So, stay tuned for future posts.

Best, Oleg

Disclaimer: GrabCAD didn’t pay me to write this post and in no way influenced my opinion about what GrabCAD Workbench CPD can do.


How can PLM discover the “data opportunity”?

November 19, 2013

open-data-3danimatix-visualization

The amount of data around us is growing enormously. Over the last 10 years, the internet as well as other data sources such as mobile devices have created a huge stream of information. It comes from everywhere – our daily activities such as photos, videos and online GPS, as well as from the business world. Companies are discovering new ways to communicate and perform digital activities that nobody has seen before. Think about a manufacturing company 10-15 years ago. Most digital activity was about design and engineering. After throwing the BOM over to the manufacturing side and procurement planning, the rest was in metal – not digital. This is not true anymore. The amount of online information about product performance, regulation, customer tracking and other activities is skyrocketing.

The internet, one of the biggest drivers of change in the digital space, made a revolution in the way people communicate in the 21st century. However, even the internet is changing these days. Invented first as a "web of shared linked documents", aka Web 1.0, it later developed into so-called Web 2.0. Web 2.0 was born to answer the increased demand to collaborate. It came together with social networks and the growing dominance of mobile devices.

However, these days many people speak about the future development of the web. It comes in two major aspects – IoT and the Web of Data. I've written about PLM and IoT common trajectories recently on my blog here. The Web of Data (or, as some people call it, Web 3.0) is no less interesting. Sir Tim Berners-Lee, the original inventor of the web at the CERN lab in Switzerland, is one of the most prominent believers in a new web. A few days ago, I was reading the Telegraph article – Sir Tim Berners-Lee: data and the new web. Take a few minutes and read it. Sir Tim speaks about the opportunity of sharing data. I found the following passage very important:

His [Sir Tim] idea sounds simple: he wants companies, governments, organizations and even, to some extent, individuals to share their data. It’s not new, per se: bus companies have been publishing timetables since they began, while anyone selling anything must explain to prospective purchasers what’s on offer. But in the age of the web, there’s a lot more data about and the chance to connect it in novel, creative and enlightening ways. So Sir Tim wants ever more organizations – and especially countries and businesses – to share what they have and see what happens when it gets put together.

However, the problem with sharing data has been well known for a long time. The PLM, manufacturing and other enterprise spaces are very familiar with it. It is about business models and the open vs. closed business world. Here is how Sir Tim explains it:

“It’s a constant battle of mindsets – once people have got the open data bug they realise the benefit. They realise they’re performing a service to the country. With the original web, people could see the benefits. But with this you don’t immediately know the benefits or who is using it. It could be another company or a kid doing homework or somebody in the World Bank – nobody’s really able to be able to work out the investment.”

The Open Data Institute (ODI) is a new UK-based organization founded by Sir Tim Berners-Lee that focuses on data opportunities. The Open Data Institute is catalysing the evolution of open data culture to create economic, environmental, and social value. It helps unlock supply, generates demand, and creates and disseminates knowledge to address local and global issues. I found the following related work by the UK government interesting – UK data capability strategy: seizing the data opportunity. Navigate to the following link and take a look at the document – Seizing the data opportunity. Here is how the data opportunity is defined there.

Data has been likened to the “new oil” of the 21st century, but unlike oil, we are not going to run out of it. On the contrary, we will continue to amass more and more data. As set out in the Information Economy Strategy, business sectors across the economy are being transformed by data, analytics, and modelling. New and emerging technologies will fuel the growth of data: as access to computing and the internet becomes ever more mobile, data will be transmitted and analysed continuously; and the development of the Internet of Things could mean that by 2020 sensor data will be created by as many as 50 billion connected devices across the globe. From increased transparency and accountability through open data, to new scientific discoveries, and market-changing products and services which can be developed using modeling and simulation, big data analytics and data-driven science, the opportunities – and challenges – are significant.

What is my conclusion? Engineering and manufacturing firms can gain significant business advantage from smart usage of data. Thinking about the full product lifecycle, including design, engineering, manufacturing and maintenance, PLM vendors can discover interesting opportunities in the variety of data usages by specific companies and industry verticals. This is a new opportunity. Will it be discovered by PLM vendors or by new startups? That is a good question to ask. Just my thoughts…

Best, Oleg

Picture courtesy of 3danimatix blog http://3danimatix.blogspot.co


PDM: Bring Your Own Cloud Or Die?

November 16, 2013

my-pdm-cloud

Cloud is not a "new black" in the PLM industry these days. It is hard to find a vendor that is not associating itself with some "sort of cloud solution". However, there is a new trend on the horizon – bring your own cloud. Dropbox, very successful by all means these days, is trying hard to get into the enterprise business. That makes it very interesting from the standpoint of engineering and manufacturing businesses, especially for companies focusing on design disciplines.

I was reading Dropbox is getting down in the business on ReadWrite.com. In a nutshell, the problem Dropbox is trying to solve relates to accounts and identity management. Navigate here and you will see how Dropbox is preparing to solve the problem of multiple accounts and security. The message is clear – for whatever it is worth, you will be able to get a "piece of the cloud" secured for you and for your company if you need it. Dropbox claims a full redesign of its security system. Here is the passage:

“We didn’t just re-do Dropbox for Business,” Houston said. “We re-did the [whole] foundation of Dropbox.” The company redesigned the service across the desktop, mobile and the Web. The changes also include advanced security and access controls, which should help mollify company IT managers. Businesses can, among other things, manage (or block) sharing to outside users, prevent sensitive docs from going into personal accounts, monitor all activity around work files, and even remotely wipe files from the devices of former employees.

activity-log-sharing

Another piece of news came a few days ago from Amazon. Navigate to the following TechCrunch article and read Amazon Launches WorkSpaces, A Virtual Desktop Service On AWS. Amazon WorkSpaces will let you have a virtual working space in the cloud at lower cost. Here is a quote from the article:

The news plays into the company’s effort to take more business from enterprise providers by providing customer-centric services with security that is sufficient for companies with significant operations at a lesser cost.

amazon-workspaces

You may ask how these two pieces of news relate to PDM. Here is the deal. In my view, CAD will be the last engineering system to migrate to the cloud. Hardi Meybaum and I had an interesting discussion about it in one of the GrabCAD blog posts here and in my post here. The solutions provided by Amazon and Dropbox made me think about the possibility of making current desktop CAD systems "cloud-enabled" without changing their desktop nature or requiring a significant re-write. Of course, these solutions won't stop (and don't need to stop) the development of future cloud design products. However, the combination of a secured cloud solution with some virtualization technologies can make cloud-based collaboration possible today (not tomorrow).

What is my conclusion? Nowadays, we can see concurrent development of different cloud-related technologies – virtualization, file sharing and others. All together, combined with existing engineering and design software, they can be used to build future cloud collaborative platforms. Product Data Management (PDM) can leverage these platforms to build a secure, cost-effective and simple solution to manage files and provide collaborative access to them. Just my thoughts…

Best, Oleg


Social PLM and Structured Enterprise Communication

November 15, 2013

plm-social-structured-discussion

Social is one of the topics that keeps trending these days among enterprise companies. The last few years demonstrated lots of advantages of social technologies and their applications in different fields. Enterprise is obviously looking at social as one of the hot topics following the consumerization of IT and mobile. While software business models are moving from ownership to usage, the adoption of tools and the ability to get people to collaborate become one of the most important imperatives.

However, we need to face the truth – social PLM failed its first attempt. I’ve been blogging about it in my previous articles – Why Social PLM 1.0 failed? Many people in manufacturing organizations, and specifically engineers, didn’t get the point of ‘social’. The utilization of social applications went down – engineers didn’t see social as a utility they could rely on, similar to email and other communication tools. Navigate to one of my writeups about this – PLM and common social platform behaviors. In my view, the important takeaway from that discussion is how a social tool can provide a single utility for engineers and other people in manufacturing and engineering companies to deal with data. This function practically doesn’t exist in most current versions of social tools. Think about sharing CAD models or other engineering and/or manufacturing data and you will understand the distance from Facebook photo sharing. Rich enterprise content is different. Sharing CAD data and manufacturing information is very different.

However, there is another aspect. Social collaboration and communication are the main factors that drove the adoption of systems like Facebook, Twitter and Google+. However, thinking about these systems, the main user behavior was about following friends and their postings on the social site. Sharing a photo or link is probably the one big mainstream function today. But it is different in the enterprise and business. The noise factor is very high and it drives efficiency and usage down. In order to solve the "information noise" problem, social systems proposed grouping and activity streams. The idea is nice and I found some interesting examples to confirm its benefits. I call it "structured social conversation". Earlier this week, my attention was caught by a company called Hexigo that can be a good example of what I mean by "structured social conversation". In a nutshell, Hexigo provides the way to follow up on social activities to ensure nothing gets lost and, by doing that, helps people follow the decision process in a social way. Nice idea, in my view – you can see it in the following video:
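The grouping idea above can be sketched in a few lines: instead of one noisy flat activity stream, events are grouped into per-topic threads so a follow-up or decision never gets lost. This is purely my illustration of "structured social conversation", not Hexigo's actual data model:

```python
# Minimal sketch of "structured social conversation": group a flat,
# noisy activity stream into per-topic decision threads.
# Illustrative only; topic names and fields are assumptions.

from collections import defaultdict

def group_stream(events):
    """Group a flat activity stream into per-topic threads."""
    threads = defaultdict(list)
    for event in events:
        threads[event["topic"]].append(event["text"])
    return dict(threads)

stream = [
    {"topic": "ECO-101", "text": "Proposed material change"},
    {"topic": "ECO-102", "text": "New fastener supplier?"},
    {"topic": "ECO-101", "text": "Approved by engineering"},
]
threads = group_stream(stream)
print(threads["ECO-101"])
# ['Proposed material change', 'Approved by engineering']
```

An engineer following ECO-101 sees only the two relevant updates instead of the whole firehose – that is the noise reduction structured conversation is after.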

What is my conclusion? We have to re-think how social collaboration is coming to enterprise organizations. Unstructured social discussion, similar to what people are having on social networks, is good, but probably too noisy for busy enterprise people and engineers. In my view, there are two big things every social collaboration system should adopt in order to be successful in manufacturing, engineering and probably other enterprises as well – (1) rich data and (2) structured discussion. It is not clear how much of the original Twitter and Facebook ideas will remain afterwards. However, without them the adoption of "new ways to collaborate" will be pretty low among busy engineers and other decision makers in the enterprise. Just my thoughts…

Best, Oleg

