Full Product Lifecycle, Cloud and Granular Security

November 27, 2013


Security is one of the most fundamental aspects of enterprise systems. If you think of every enterprise system as a big data store with business rules applied on top, security is the first thing you need to evaluate in order to map the system to your organization. It has multiple dimensions: who gets access to information, and in what way. Additional business complexity can increase the complexity of the security model even more.

Cloud systems development is trending these days; new cloud applications and systems appear almost daily. We have reached the point where collaborative use of multiple cloud systems and applications becomes real. I’ve been reading the Engineering.com article Can CAD-in-the-Cloud Handle the Entire Product Lifecycle? Read it and draw your own conclusions. The important intent I captured is about combining different disciplines and cloud tools under the single umbrella of entire product lifecycle development and business practice.

Decomposing the process into stages and using different cloud applications brings an interesting perspective on the required security model. Actually, I can see security becoming a hot issue in cloud development. The time when security was only about "file-share" options is gone. Today, cloud systems and their users demand much more granular data organization and security management.

My attention was caught by an interesting acquisition made by Box. Navigate to the following article to read more – Box Acquires dLoop To Enhance Security With Fine-Grained Data Analytics Technology. Box is a well-known outfit producing a variety of tools, with a specific focus on enterprise customers. My hunch is that the deal answers increased demand for data security in the enterprise, and it comes as part of growing competition with Dropbox and other consumer file-sharing tools. Despite their huge success in the consumer space, security and enterprise deployment are probably still challenging points for most of them. Here is an interesting passage from the TechCrunch article:

We’ve been spending a lot of time improving the end user experience on Box, but we’re equally committed to creating better management tools for enterprise IT. This means unlocking greater visibility into the activities happening around information, and providing more granular controls where necessary. dLoop’s machine learning capabilities will ultimately allow Box to help enterprise customers identify and surface relevant content by tracking activity patterns… In larger enterprises, data classification is becoming a must-have in order to control what people can do with files. Companies want policy-based file sharing. Box has made considerable effort to enhance its security.

Policy-based security is the key thing here. Thinking about a full product lifecycle scenario involving multiple tools and people, policy-based security may be the only way to support a granular security model. The need for security is only one aspect. Ultimately, the goal of enterprise systems today is to improve user experience. In my opinion, without appropriate data granularity, this is mission impossible for most systems today. The cloud adds an additional dimension of complexity, as part of scenarios involving multiple cloud tools and shared content.
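To make "policy-based security" a bit more concrete, here is a minimal sketch (my own illustration with hypothetical names – the systems discussed here publish no such code) of the difference from file-share lists: access is granted by rules matching roles against data classifications, not by enumerating users per file.

```python
# Minimal policy-based access control sketch (hypothetical names).
# Each policy grants an action on items of a given classification to
# users holding a given role -- instead of per-file share lists.
from dataclasses import dataclass

@dataclass(frozen=True)
class Policy:
    role: str            # role the policy applies to
    classification: str  # data classification it covers
    action: str          # permitted action, e.g. "read" or "edit"

POLICIES = [
    Policy("engineer", "design", "edit"),
    Policy("supplier", "design", "read"),
    Policy("manager", "financial", "read"),
]

def is_allowed(user_roles, item_classification, action):
    """Return True if any policy grants `action` on this classification."""
    return any(
        p.role in user_roles
        and p.classification == item_classification
        and p.action == action
        for p in POLICIES
    )

print(is_allowed({"supplier"}, "design", "read"))  # suppliers may view designs
print(is_allowed({"supplier"}, "design", "edit"))  # but not modify them
```

The point of the sketch: adding a new file never requires touching the sharing rules – its classification alone decides who can do what with it.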

What is my conclusion? New environments, old problems and even bigger challenges. This is how I see the combination of enterprise reality, new cloud systems and the demand for security. Granularity is key, in my view. Without policy-based granular access, cloud product development tools such as CAD, CAE, CAM, PLM and others will remain immature and fail to meet real enterprise customer scenarios. Just my thoughts…

Best, Oleg

Cloud PLM Scaling Factors

November 26, 2013


Scale is one of the fanciest words that comes up when different technologies are discussed or debated. So, speaking about cloud CAD, PDM and PLM, the discussion must address the "scaling factor" too. Since the PLM cloud switch finally happened, it has become clear that vendors and customers will have to move from buzzwordy statements about "cloud" to more specific discussions about the particular cloud technologies they are using. Very often, cloud deployment relies on so-called IaaS (infrastructure as a service) used by vendors to deploy their solutions. PLM vendors use IaaS as well. I talked about it a bit in my post – Cloud PLM and IaaS options. In my view, Siemens PLM has made the boldest statements about following an IaaS strategy in delivering cloud PLM solutions. At the same time, I believe all other vendors, without exception, are using the variety of IaaS options available on the market from Amazon, Microsoft and IBM.

An interesting article caught my attention earlier today – Google nixes DNS load balancing to get its numbers up. The article describes Google demoing its cloud platform's scaling capabilities. The Google blog post provides a lot of detail about the specific setup used for the tests and measurements:

This setup demonstrated a couple of features, including scaling of the Compute Engine Load Balancing, use of different machine types and rapid provisioning. For generating the load we used 64 n1-standard-4’s running curl_loader with 16 threads and 1000 connections. Each curl_loader ran the same config to generate roughly the same number of requests to the LB. The load was directed at a single IP address, which then fanned out to the web servers.
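The fan-out pattern described above – one front-end address distributing traffic across many web servers – can be sketched in a few lines. This is a toy round-robin illustration with hypothetical names; real Compute Engine Load Balancing is far more sophisticated (health checks, connection draining, and so on):

```python
# Toy fan-out sketch: one load-balancer address distributing requests
# round-robin across back-end web servers. Hypothetical names; not
# Google's actual load-balancing implementation.
from itertools import cycle
from collections import Counter

class RoundRobinBalancer:
    def __init__(self, backends):
        self._backends = cycle(backends)

    def route(self, request):
        """Pick the next backend for an incoming request."""
        return next(self._backends)

backends = [f"web-{i}" for i in range(4)]
lb = RoundRobinBalancer(backends)

# Simulate 1000 requests hitting the single front-end address.
hits = Counter(lb.route(f"req-{n}") for n in range(1000))
print(hits)  # each backend receives exactly 250 requests
```

The "pre-warming" debate in the quote below is about how quickly such a front end can grow its own capacity when the request rate jumps, not about the distribution logic itself.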

It is not surprising that Google made some competitive statements trying to differentiate itself from its major competitor – Amazon. Here is an interesting passage from the Gigaom writeup:

“Within 5 seconds after the setup and without any pre-warming, our load balancer was able to serve 1 million requests per second and sustain that level.”… this as a challenge to Amazon Web Service’s Elastic Load Balancing. ”ELBs must be pre-warmed or linearly scaled to that level while GCE’s ELBs come out of the box to handle it, supposedly,” he said via email. Given that Google wants to position GCE as a competitor to AWS for business workloads, I’d say that’s a pretty good summation.

The discussion about cloud platforms and scalability made me think about the specific requirements for cloud PLM to scale, and how they relate to platform capabilities. Unfortunately, you cannot find much information about that from PLM vendors. Most of them limit the information to simple statements about compatibility with specific platform(s). However, the discussion about scaling can be interesting and important. Thinking about it, I came up with three main groups of scaling scenarios in the context of PLM: 1/ computational scale (e.g. when a PLM system is supposed to find design alternatives or resolve a product configuration); 2/ business processing scale (e.g. supporting process management at scale in transactions or data integration scenarios); 3/ data processing scale (e.g. required to process significant data imports or analyses). Analyzing these scenarios could be interesting work, which of course goes beyond a short blog article.

What is my conclusion? The coming years will bring an increasing number of platform-related questions and differentiation factors in the PLM space and the enterprise in general. This will come as a result of solution maturity, use cases and delivery scenarios. The cost of the platforms will matter too. Both customers and vendors will be learning about delivery priorities and how future technological deployments will match business terms and expectations on both sides. Just my thoughts…

Best, Oleg

What “social features” can PLM steal from SharePoint?

November 25, 2013


Collaboration is still one of the hot topics in the PLM space. I was watching a PLM TV report about Dassault’s 3DEXPERIENCE Platform from Engineering.com. The video, a bit long for my taste, provides a comprehensive overview of what Dassault’s key execs said about the 3DEXPERIENCE platform at the last Dassault conference. Many of their comments relate to data complexity, connecting information sources, and providing a better user experience, especially for top-tier managers. However, "social" remained the key word in many conversations about the value of 3DEXPERIENCE as a "new way to collaborate". Social was one of the key points of the new DS platform innovation. This is how Engineering.com captured it:

Verdi asks about the platform and architecture and also how the changes will impact product development teams. He delves into how Dassault envisions that teams will use the social and collaborative tools as part of their engineering processes.

However, very often it is not easy to see which specific "social" features can enable a new way to collaborate. As I mentioned in my previous blog – Why Social PLM 1.0 failed – a single social utility is what most vendors missed in their plans to deliver a new way to collaborate socially. An interesting article caught my attention a few days ago – 3 Cool Things about SharePoint in Office 365 Enterprise. One of the features was specifically "social oriented": simple, well-known tagging with hashtags (#) and at tags (@). I captured the following passage:

Bring social inside the walls of your organization with SharePoint’s social features. With SharePoint 2010, you could follow sites and tag colleagues. In the 2013 flavor, you can have a newsfeed where you can use social features like hashtags (#) and at tags (@) to track ideas and topics and mention people in your posts. In a news feed for a particular team, you might put hashtags on customer names, industry publication names, or create a tag for a particular issue. Then someone can just click the active tag to see all posts relative to that topic.
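Mechanically, this kind of tagging is simple, which is part of its power. Here is a rough sketch (my own illustration, not SharePoint's implementation) of how hashtags and mentions can be pulled out of a newsfeed post so they can be indexed and made clickable:

```python
# Sketch: extract hashtags (#) and at tags (@) from a newsfeed post
# so they can be indexed and rendered as clickable filters.
# Illustrative only -- not SharePoint's actual code.
import re

TAG_PATTERN = re.compile(r"([#@])(\w+)")

def extract_tags(post):
    """Return (hashtags, mentions) found in a post, in order of appearance."""
    hashtags, mentions = [], []
    for sigil, name in TAG_PATTERN.findall(post):
        (hashtags if sigil == "#" else mentions).append(name)
    return hashtags, mentions

post = "Great meeting with @jsmith on the #Acme account, see #pricing issue"
print(extract_tags(post))
# (['Acme', 'pricing'], ['jsmith'])
```

Once extracted, each tag becomes a query key: clicking it retrieves every post carrying the same tag, which is exactly the "see all posts relative to that topic" behavior the passage describes.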

These cool SharePoint features made me think again about a "single utility" that could bring practical sense to the new way to collaborate presented by PLM companies. The tagging feature can be quite powerful. Combined with the ideas of data complexity, multiple data sources and user experience, it could produce a completely new way to collaborate.

What is my conclusion? Social is a very powerful topic. However, most enterprise companies lost focus when trying to give specific meaning to social features, simply applying the same principles with which companies like Facebook and Twitter empowered their own solutions. In the context of engineering, manufacturing and probably any enterprise organization, Facebook-like functionality is clearly not enough to change fundamental collaboration principles. I think the SharePoint folks understood this quite well after all the ups and downs SharePoint has had in the past. Therefore, I can see the value of specific features in the overall social collaboration domain. Just my thoughts…

Best, Oleg

How can PLM transmit data to the cloud?

November 24, 2013


Transmitting data is complicated and painful. Remember a few years ago, when the industry had just started to switch to the cloud, the discussion about how to transfer large CAD parts and assemblies was very hot. Not much has actually changed since then. A huge amount of data lives on desktops and is produced outside the cloud environment. Think about the variety of sensors that have started to appear almost everywhere. Also, think about the potential for a huge number of simulations – many of these still run on desktop computers.

My attention was caught by Amazon’s newly announced Kinesis service. The description provided by Amazon is quite impressive:

Amazon Kinesis provides an easy way to process fast moving data in real-time. You simply provision the level of input and output required for your data stream, in blocks of 1 megabyte per second (MB/sec), using the AWS web management console, API, or SDKs. The size of your stream can be adjusted up or down at any time without restarting the stream and without any impact on the data sources pushing data to Amazon Kinesis. Once your stream has been created, you can immediately start loading your data, with simple HTTP PUTs. Amazon Kinesis automatically manages the infrastructure, storage, networking, and configuration to collect and process your data at the level of throughput your application needs. Within seconds, data PUT into an Amazon Kinesis stream is available for analysis. Stream data is stored across multiple availability zones in a region for 24 hours. During that window, data is available to be read, re-read, backfilled, and analyzed, or moved to long-term storage (e.g. Amazon S3 or Amazon Redshift).

If you don’t have time to read the product description, watch the following two-minute video. The idea in a nutshell is quite simple – Amazon Kinesis pumps your data to the cloud. After that, you decide what to do with it and how to proceed.
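The essentials of the stream model Amazon describes above can be modeled in a few lines. This is a toy in-memory sketch of just two ideas from the description – records are PUT with a partition key and remain readable for a 24-hour retention window. The classes are hypothetical; the real service is accessed through the AWS SDK or HTTP API:

```python
# Toy model of a Kinesis-style stream: PUT records with a partition key,
# read them back within a 24-hour retention window. Hypothetical classes;
# the real service is used via the AWS SDK / HTTP API.
import time

RETENTION_SECONDS = 24 * 60 * 60  # records are kept for 24 hours

class ToyStream:
    def __init__(self, now=time.time):
        self._records = []   # (timestamp, partition_key, data)
        self._now = now      # injectable clock, handy for testing

    def put_record(self, partition_key, data):
        self._records.append((self._now(), partition_key, data))

    def read(self, partition_key):
        """Return data for a key that is still within retention."""
        cutoff = self._now() - RETENTION_SECONDS
        return [d for ts, k, d in self._records
                if k == partition_key and ts >= cutoff]

stream = ToyStream()
stream.put_record("sensor-42", b"temp=21.5")
stream.put_record("sensor-42", b"temp=21.7")
stream.put_record("sensor-7", b"temp=19.0")
print(stream.read("sensor-42"))  # both sensor-42 readings are available
```

What the sketch leaves out is exactly what you pay Amazon for: sharding across availability zones, throughput provisioning in 1 MB/sec blocks, and hand-off to long-term storage such as S3 or Redshift.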

I can see a few potential applications of the Kinesis service within cloud infrastructure today. One of them is massive data processing from desktops and corporate network locations. It can enable analytics applications, search and many other services that cannot live without data. Another application area is the so-called "IoT" (Internet of Things), which I covered a few days ago here. By processing data from various sensors and mobile devices, engineering applications can gain new data inputs that improve the design process.

What is my conclusion? The Kinesis service shows the potential of moving data from a variety of locations to the cloud for processing. Bringing data together can enable a number of completely new ways to design, analyze and optimize. Capturing huge amounts of real-time data can help improve quality and provide insight into how customers are using products. Just my thoughts…

Best, Oleg

CAD, PLM and Fast Data Processing

November 24, 2013


We demand that software be fast. We don’t like to wait while the computer performs its work and produces results. Depending on the situation and use case, speed can be a major irritant or, even more, something that changes the overall working process. Some time ago, simulation and analysis software embedded into CAD systems changed the way engineers design mechanical parts. Another, more PLM-ish example is visualizing a complete product configuration and performing clash analysis.

Nowadays, the computing mindset is taking us toward new approaches to data processing using new processor technologies as well as network computing opportunities. The MIT Technology Review article Graphics Chips Help Process Big Data Sets in Milliseconds talks about visualizing big data on cheap computers in milliseconds. The writeup mentions a massively parallel database that achieves a 70x performance gain by storing data in the onboard memory of GPUs. You can use the following link to try the public interface to the product, a tweet map. Here is an interesting passage I captured:

The tool can be used to visualize 50 million geocoded tweets posted between September 28 and October 6. The tool allows users to explore different search terms, examine broad geographical trends, and zoom in on each tweet. For each of the 30 frames per second it generates when animating Twitter, Map-D scans all the tweets that have been loaded on the GPUs, constructing visualizations such as maps of how word usage—which could include mentions of a product name or news item—is propagating across a region or around the world in real time.
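Conceptually, each animation frame is a full scan of the loaded data – something like the following sketch, except executed across thousands of GPU cores instead of one CPU thread. This is illustrative Python with invented data, not Map-D's code:

```python
# Illustrative "one frame" of the scan described above: count tweets
# matching a search term per coarse geographic cell. Map-D performs
# this kind of full scan in GPU memory 30 times per second; this
# single-threaded sketch only shows the shape of the computation.
from collections import Counter

def scan_frame(tweets, term, cell_deg=10.0):
    """One frame: scan all tweets, bucket matches into lat/lon cells."""
    cells = Counter()
    for lat, lon, text in tweets:
        if term in text.lower():
            # Integer cell coordinates at cell_deg-degree resolution.
            cell = (int(lat // cell_deg), int(lon // cell_deg))
            cells[cell] += 1
    return cells

tweets = [
    (42.3, -71.0, "Loving the new phone launch"),
    (48.8, 2.3, "phone queues in Paris are huge"),
    (35.6, 139.7, "weather is great today"),
]
print(scan_frame(tweets, "phone"))
```

The interesting part is the brute-force design choice: rather than maintaining indexes, the system rescans everything per frame, which only becomes practical when the whole dataset fits in GPU memory.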

Watch the following video, which is quite self-explanatory.

What is my conclusion? Fast data processing can be a key technology for changing the design and engineering process. Visualizing design options, analyzing variants and making a final assessment to catch potential mistakes – this is only a very short list of possibilities. Moving forward, you can think about analyzing huge databases for potential failures, customer issues and candidate supplier components. Technologies developed today for the web, social networking and mobile domains can be leveraged tomorrow to change the future landscape of engineering software. Just my thoughts…

Best, Oleg

PDMish, PLMish and other CADPDLM bundles…

November 20, 2013


Engineering and manufacturing software is full of interesting TLAs, and companies continue to invent them even today. Only yesterday, GrabCAD introduced a Workbench enhancement under what can be considered a very traditional name – Collaborative Product Development (CPD) Platform.

Chad Jackson, my long-time blogging buddy, published an article with a fascinating title – ENOVIA Data Management: Less PDM-ish, More PLM-ish. Chad reviewed ENOVIA data management capabilities based on a Dassault Systèmes briefing. Read the article, which includes multiple embedded tweet-quotes. Beyond playing the PDM vs. PLM naming game, the article brings an important perspective on where the ENOVIA data management architecture is going. Here is an interesting passage I captured:

CAD models have never been simple files. Lots of stuff have been, and continue to be, jammed into them. They contain far more than just geometry. They contain part numbers, so drawing BOMs populate automatically. They contain material properties, such that mass properties can be calculated. Over time, as more and more enterprise considerations needed to be taken into account, more non-geometry stuff has been jammed into CAD files. The problem with all this stuff in CAD files was that, unlike many other types of files, operating systems couldn’t understand the structure of information in CAD files.

So, what is ENOVIA doing differently? According to Chad’s comments:

Dassault Systèmes are trying to liberate all that stuff jammed into the CAD file. Here are some notes from my briefing with them. In short, they are taking non-geometric items in CAD files and turning them into meta-data that lives in ENOVIA. They are being turned into individual pieces of information that can be modified separately and independently from every other piece of information and the geometry. Of course, this meta-data is related and will live as a database item right alongside the file that contains the geometry.
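The separation Chad describes can be sketched roughly as follows – splitting what used to live inside a monolithic CAD file into an opaque geometry blob plus independently editable metadata records. These are hypothetical structures of my own, not ENOVIA's actual data model:

```python
# Sketch of the idea above: pull non-geometric properties out of a
# monolithic CAD "file" into separate metadata records that can be
# edited independently of the geometry. Hypothetical structures --
# not ENOVIA's data model.

def split_cad_file(cad_file):
    """Separate a parsed CAD 'file' into geometry + metadata records."""
    geometry = cad_file["geometry"]          # stays an opaque blob
    metadata = [
        {"name": key, "value": value, "source": cad_file["name"]}
        for key, value in cad_file.items()
        if key not in ("geometry", "name")   # everything non-geometric
    ]
    return geometry, metadata

part = {
    "name": "bracket_v3",
    "geometry": b"<binary b-rep data>",
    "part_number": "PN-1042",
    "material": "aluminum 6061",
    "mass_kg": 0.37,
}
geometry, records = split_cad_file(part)
print([r["name"] for r in records])  # ['part_number', 'material', 'mass_kg']
```

Once the part number, material and mass live as separate records, an enterprise process can update any of them without opening, rewriting or even understanding the CAD file format – which is exactly the point of the architecture.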


It made me think about future trajectories of CAD/PDM/PLM integrations. I described possible options in one of my posts last year – Multi-CAD PDM integrations: yesterday, today and tomorrow. The idea of a CAD/PDM bundle I expressed there is, in my view, a reflection of what CATIA/ENOVIA is doing in data management. As I mentioned there, it solves the problem of version compatibility and provides significant advantages in terms of functional richness. This is exactly what Chad demonstrated with the CATIA/ENOVIA examples.

What is my conclusion? Integration remains a place where a lot of innovation is happening. CAD, PDM and PLM integration is a very challenging space. Customers have a huge demand for vertically integrated products that provide a sufficient set of features and, most importantly, a completely new level of user experience. It sounds like we will see more investment in this space from traditional vendors. CAD/PLM companies will try to integrate existing products into vertical suites, connecting data and supporting integrated product scenarios. Just my thoughts…

Best, Oleg

GrabCAD CPD wants to disrupt CAD file management

November 19, 2013


Three years ago, I sat down with Hardi Meybaum of GrabCAD in one of the Starbucks coffee shops near Brookline, Mass. We talked about the GrabCAD social network, communities, openness, engineers and CAD libraries. You can find some of my thoughts from that conversation here – Marketplaces and Engineering Software. Since then, I’ve been following GrabCAD’s trajectory closely and have had a chance to meet Hardi many times. You may remember some of my previous posts – Manufacturing Crowdsourcing and Cloud PLM opportunity, GrabCAD, Collaboration and Dropbox, GrabCAD and Open Engineering Source: Dream or Reality?

However, GrabCAD’s trajectory recently started to change. From an open CAD library and a "Facebook for engineers", GrabCAD is moving toward a perhaps more traditional space – design and collaboration. GrabCAD introduced Workbench, a product that is supposed to help engineers collaborate during the design phase. You can see my first GrabCAD Workbench experiments here. Everything around GrabCAD Workbench started to look and smell like document and product data management – PDM: reinvent the wheel or innovate?

Lately, Hardi and I have had a very active discussion about different aspects of CAD file sharing, collaboration and integration – CAD File Sharing and Integration Challenges, Top 3 pros and cons to have a special CAD file sharing tools.

And here it finally comes. Earlier today, GrabCAD published a press release announcing its future PDM solution – GrabCAD Workbench brings CAD file management into the 21st century. Navigate here to read the press release. So, the name of the game is CPD (Collaborative Product Development), and here is how it is explained in the GrabCAD press release:

Today GrabCAD announced that it has expanded the capabilities of the cloud-based Workbench solution to a complete platform for Collaborative Product Development (CPD), enabling users to manage, share and view CAD models with zero IT investment. More than ten thousand users have already signed up to use Workbench to share CAD models with suppliers, customers and partners. With the expansion of file management capabilities, these users will be able to automatically sync their desktop files to cloud projects, track file dependencies, visualize version differences and resolve conflicts. With these additions, Workbench is now the ideal tool to help small to mid-size companies manage CAD files.

It all comes down to the advantages of the cloud – getting the service without installation, configuration, maintenance and other related hassle. Here is another passage I captured:

As a cloud-based service, Workbench requires no dedicated server, no configuration, no maintenance, and no IT hardware or support. While legacy PDM/PLM solutions take weeks or months to install and configure, Workbench users are up and running in minutes.

A few screenshots below give some additional impression of what GrabCAD Workbench CPD is supposed to provide. According to GrabCAD, the service will first become available as a beta for everybody and will later be offered for a flat fee of $25 or $45 per month, depending on the plan.


One of the very interesting features of GrabCAD CPD and Workbench is the Viewer Diff functionality, which can visualize and show the difference between CAD file versions.
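At its core, such a version-diff feature compares two snapshots of a project and classifies each file as added, removed or modified before any visualization happens. Here is a minimal sketch of that comparison step using content fingerprints – my own illustration under stated assumptions, not GrabCAD Workbench's implementation:

```python
# Sketch of version diffing between two snapshots of a CAD project:
# compare per-file content fingerprints and classify the changes.
# My own illustration -- not GrabCAD Workbench's implementation.
import hashlib

def fingerprint(data):
    """Content hash used to detect modified files cheaply."""
    return hashlib.sha256(data).hexdigest()

def diff_versions(old, new):
    """old/new map file name -> file bytes; return the classified diff."""
    old_fp = {name: fingerprint(data) for name, data in old.items()}
    new_fp = {name: fingerprint(data) for name, data in new.items()}
    return {
        "added": sorted(set(new_fp) - set(old_fp)),
        "removed": sorted(set(old_fp) - set(new_fp)),
        "modified": sorted(
            n for n in set(old_fp) & set(new_fp) if old_fp[n] != new_fp[n]
        ),
    }

v1 = {"bracket": b"geom-a", "bolt": b"geom-b"}
v2 = {"bracket": b"geom-a2", "washer": b"geom-c"}
print(diff_versions(v1, v2))
# {'added': ['washer'], 'removed': ['bolt'], 'modified': ['bracket']}
```

The hard part a real Viewer Diff adds on top is geometric: showing *where* a modified model changed, not merely *that* its bytes differ.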



What is my conclusion? Technology and people – these are the two components of future change almost everywhere. I always say that technology is much easier than people. In the engineering and manufacturing business, this is probably especially true. Engineers are the main consumers of CAD and PDM products, and it is very hard to change their behavior. PDM probably has one of the worst records in enterprise software, especially among small companies and individuals. Engineers see PDM as something that hurts them rather than helps. It will be interesting to see whether new cloud services such as GrabCAD CPD can change that trend. The name (CPD) isn’t new, so the hope is that the product itself will deliver the difference. I’m looking forward to joining the service as a beta user and testing it. So, stay tuned for future posts.

Best, Oleg

Disclaimer: GrabCAD didn’t pay me to write this post and in no way influenced my opinion about what GrabCAD Workbench CPD can do.

