Dropbox can quietly become ready to manage CAD files

January 29, 2015

dropbox-pixelaps-cad

Cloud file sharing and collaboration tools are booming. Storing and sharing files with one of the cloud services like Dropbox, Google Drive or Apple iCloud is part of everyday workflows. The situation is different for engineers working with CAD tools. You can store CAD files in your Dropbox. However, because of cross-references between CAD files, you have to do a lot of manual work identifying dependencies and bringing files back to your desktop. I discussed CAD file sharing and integration challenges on my blog earlier.
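To show what that manual work amounts to, here is a minimal sketch of mine (not a Dropbox or CAD vendor feature): collecting the transitive set of files an assembly references. The reference table and all file names below are hypothetical; a real tool would have to parse the format-specific reference data of SolidWorks, Inventor, etc.

```python
# Toy cross-reference table: file -> files it references (names are made up)
external_refs = {
    "top_assembly.sldasm": ["frame.sldprt", "drive.sldasm"],
    "drive.sldasm": ["motor.sldprt", "bracket.sldprt"],
    "frame.sldprt": [],
    "motor.sldprt": [],
    "bracket.sldprt": [],
}

def collect_dependencies(root: str) -> set[str]:
    """Walk cross-references transitively so no referenced file is left behind."""
    seen: set[str] = set()
    stack = [root]
    while stack:
        f = stack.pop()
        if f in seen:
            continue
        seen.add(f)
        stack.extend(external_refs.get(f, []))
    return seen

# Everything in this set has to travel together -- to Dropbox and back --
# otherwise the assembly links break:
print(collect_dependencies("top_assembly.sldasm"))
```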

The problem of supporting specific file formats, as well as providing ready-to-use revision management and collaboration workflows, is recognized by cloud file sharing vendors. My attention was caught by the following Dropbox acquisition – Pixelapse. The VentureBeat article Dropbox acquires Pixelapse, a version control and collaboration tool for designers brings some more details. Here is a very interesting passage:

While GitHub has built an entire business from enabling version control and collaboration for developers — it even takes its name from a popular version control language — designers have not been as fortunate with their tools. CAD and design software have been slow to catch up, usually forcing designers to save multiple versions of what they’re working on in order to have access to slightly different versions. Adobe, Layervault, and even GitHub have also tried to alleviate this problem for designers, although no one, Pixelapse included, seems to have become an industry standard yet.

Some of the Pixelapse features are interesting if you think about supporting the work of engineering teams – real-time notifications, collaboration in context, granular permissions.

What is my conclusion? Mainstream cloud file sharing and collaboration tools are getting better. Of course, Dropbox is not targeting engineers working with CAD tools. However, by bringing new features and capabilities, Dropbox and other cloud file sharing tools can close the gap and push CAD file sharing and collaboration tool providers to search for more differentiation to justify the premium price of cloud PDM tools. Just my thoughts…

Best, Oleg


How is PTC delivering PLM in the cloud?

January 28, 2015

ptc-plm-cloud-1

Cloud is trending, and it is hard to find a company that is not thinking about how to leverage new cloud technologies and business models. However, just saying “cloud” these days probably means nothing. The right question is how to implement cloud. I guess many companies these days are coming to that question. It goes in parallel with the discussion about what is “cloud” and what is “not cloud”, which has some technical and some marketing aspects.

A long time ago, PTC introduced PLM “On Demand“. You may remember this marketing name, which was later replaced by SaaS and cloud. According to the PTC website, the solution is available and hosted by IBM. I noticed some indication of PTC’s move to the cloud back in 2013, after the acquisition of NetIDEAS. My writeup about that is here. According to the PTC press release, NetIDEAS allowed PTC to develop a better foundation for offering multiple deployment options.

Earlier today, my attention was caught by a PTC announcement – PTC Introduces PTC PLM Cloud: New PTC Windchill SaaS offerings for small and midsized companies. The following passage explains why PTC is coming out with a cloud product offering:

Recognizing that many SMB organizations may lack a dedicated IT staff but still want to adopt a proven PLM environment, PTC designed PTC PLM Cloud specifically to enable team collaboration and data management in the cloud. This flexible offering eliminates the typical, but risky, SMB practice of shared folders and file naming conventions which hamper product development. With more effective and reliable data sharing in the cloud, customers are able to improve product development across teams in different locations, teams working with varying CAD applications, and with external teams such as partners and suppliers who are a growing part of the product development process.

I tried to dig into the materials available online to see how PTC will provide cloud PLM and what options are available. Navigate here to learn more. It is available in 3 options – standard, premium and enterprise. While the names alone mean little, the following definition caught my attention – “instant access” for standard vs. “dedicated database” for the others. In addition to that, the differences between options come down to “workflow customization” in premium and “new business objects and UI customization” in enterprise. It looks like PTC recognized the importance of MCAD data management – all versions come with an integrated viewing solution and support for Creo, AutoCAD, Inventor and SolidWorks.

ptc-cloud-28-jan-2015

The questions that remain open for me at this moment are price and cloud (hosting) architecture. This is especially important for customers today, as I mentioned earlier in my post – Why you should ask your cloud PLM vendor about Devops and Kubernetes.

What is my conclusion? Manufacturing companies are not implementing PLM because of the high cost and the limited availability of IT resources. To many customers and vendors today, cloud seems like the right path to remove IT cost and make implementations less painful. From that standpoint, PTC is taking the right trajectory by delivering a Windchill-based PLM solution in the cloud. However, the devil is in the details. I’m looking forward to learning more about “how” PTC PLM in the cloud will be delivered and how it will be different from other PLM clouds from Autodesk, Aras, Dassault Systemes and Siemens PLM. Just my thoughts…

Best, Oleg


The importance of software BOM for hardware security

January 27, 2015

smart-products-bom

We live in the era of smart products. The modern smartphone is a good confirmation of that. The average person today keeps in their pocket a computer with computational capability equal to, or even greater than, the computers the aerospace and defense industry used for navigation. In addition to that, your smartphone has communication capabilities (Wi-Fi and Bluetooth) which make it even more powerful. If you think about the cost and availability of boards like Raspberry Pi and Arduino, you can understand why and how they are revolutionizing many products these days. However, the wide spread of these devices has drawbacks.

Smart products are bringing a new level of complexity everywhere. It starts in engineering and manufacturing, where you need to deal with complex multidisciplinary issues related to the combination of mechanical, electronic and software pieces. The last one is a critical addition to product information. The bill of materials has to cover not only mechanical and electronic parts, but also software elements.

Another aspect is related to the operation of all these smart products. Because of the connectivity aspects of products, operations have to deal with software, data and other elements that can easily turn your manufacturing company into a web operations facility with servers, databases, etc.

As soon as devices are exposed to software, the problem of software component traceability becomes critical. Configuration management and updates are a starting point. But it quickly comes down to security, which is very critical today.

The GCN article – How secure are your open-source based systems? – speaks about the problem of security in open-source software. Here is my favorite passage:

According to Gartner, 95 percent of all mainstream IT organizations will leverage some element of open source software – directly or indirectly – within their mission-critical IT systems in 2015. And in an analysis of more than 5,300 enterprise applications uploaded to its platform in the fall of 2014, Veracode, a security firm that runs a cloud-based vulnerability scanning service, found that third-party components introduce an average of 24 known vulnerabilities into each web application.

To address this escalating risk in the software supply chain, industry groups such as The Open Web Application Security Project, PCI Security Standards Council and Financial Services Information Sharing and Analysis Center now require explicit policies and controls to govern the use of components.

Smart products are also leveraging open-source software. The security of connected devices and smart products is a serious problem to handle. Which brings me to think about how hardware manufacturing companies can trace software elements and protect their products from potential vulnerabilities.
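Here is a rough sketch of what such traceability could look like – a software portion of the BOM checked against a feed of known vulnerabilities. This is my own illustration, not any vendor’s product: the components and versions are made up, and the vulnerability dictionary stands in for a real feed such as the NVD (the CVE shown is the well-known Heartbleed entry for OpenSSL 1.0.1f).

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SoftwareComponent:
    name: str
    version: str
    license: str

# Software portion of a smart product's BOM (illustrative entries)
sbom = [
    SoftwareComponent("openssl", "1.0.1f", "OpenSSL"),
    SoftwareComponent("busybox", "1.22.1", "GPL-2.0"),
]

# Stand-in for a real vulnerability feed; keys are (name, version)
known_vulnerabilities = {
    ("openssl", "1.0.1f"): ["CVE-2014-0160"],  # Heartbleed
}

def audit(components):
    """Report which software BOM entries carry known vulnerabilities."""
    for c in components:
        for cve in known_vulnerabilities.get((c.name, c.version), []):
            print(f"{c.name} {c.version}: {cve}")

audit(sbom)  # prints: openssl 1.0.1f: CVE-2014-0160
```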

What is my conclusion? Covering all aspects of product information, including software, becomes absolutely important. For many manufacturing companies, the information about mechanical, electronic and software components is siloed in different data management systems. In my 2015 PLM trends article, I mentioned the importance of new tools capable of managing multidisciplinary product information. Software BOM security is just one example of the trend. The demand for systems able to handle all aspects of the product BOM is increasing. Just my thoughts…

Best, Oleg

photo credit: JulianBleecker via photopin cc


The anatomy of PLM upgrades

January 26, 2015

plm-migration-upgrades

Software upgrades are a fascinating topic. They have been with us from the very beginning of software. Seriously, we hate upgrades. On the other side, very often this is the only way to make progress. The main problem of upgrades is related to existing dependencies – migration of data, file format and data incompatibilities, hardware incompatibilities, etc.

As software gets more complex, the complexity of upgrades increases. Enterprise software is a very good example. Talk to people about ERP, PLM and other enterprise software upgrades and you can learn a lot about the effort and cost of upgrades for an organization.

For a long time, enterprise software upgrades were considered inevitable, which led to many problems for customers. One of the extreme situations is when a specific configuration of a system becomes non-upgradable. It is known as "version lock-in". The most typical reasons are feature and customization incompatibilities between the new software version and the one the customer is still running. As customers discover the complexity of upgrades, we can see software vendors trying to leverage it to demonstrate their differentiation.

For the last few years, I have seen an increased focus from PLM vendors on "upgrade and migration". My hunch is that too many customers are stuck on previous versions of PLM software or on outdated PLM systems. A Random PLM (future) thoughts article by Jos Voskuil speaks about the complexity of PLM system upgrades. Read the following passage:

Not every upgrade is the same! Where consumer software will be used by millions and tested through long Alfa and beta cycles, PLM software often comes to the market in what you could consider a beta stage with limited testing. Most PLM vendors invest a lot of their revenue in providing new functionality and technology based on their high-end customer demands. They do not have the time and budget to invest in the details of the solution; for this reason PLM solutions will remain a kind of framework. In addition, when a solution is not 100 % complete there will be an adaptation from the customer, making upgrades later, not 100 percent guaranteed or compatible. More details on PLM Upgrades after the conference, let’s look into the near future.

I think the overall trend in the quality of enterprise software is positive. The consumer software mentioned by Jos is only one factor in why enterprise software vendors are investing more in quality. Jos’ article made me think more about how customers should approach the topic of PLM migrations and upgrades. In general, I think it is applicable not only to PLM systems. PLM vendors are trying to make migrations easy from both economic and technological standpoints. Here are some of my thoughts about the anatomy of PLM software migration.

Migration technologies

While some technologies can give you an advantage during migrations and upgrades, from a technical standpoint you cannot avoid upgrades. Very simple – from time to time you need to restructure the database to bring new features or to optimize for performance. Since PLM relies on OS and database technologies, you need upgrades to bring the PLM system into a compatible state with a new OS/RDBMS. If your PDM/PLM system is integrated with CAD and other systems, this is yet another aspect of migration.

From a technological perspective, migration is always a sort of extract, transform, load (ETL) process. It can be minor or major. It can happen in a single database or may require a separate set of application or database servers. A PLM system architecture designed with "upgrade in mind" can make it easier, but won’t eliminate it completely.
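To make the ETL framing concrete, below is a minimal sketch of one migration step, assuming hypothetical table and column names. Real PLM migrations also have to carry files, relationships and permissions, not just rows, so treat this as an illustration of the pattern only.

```python
import sqlite3

def migrate_items(old_db: str, new_db: str) -> None:
    src = sqlite3.connect(old_db)
    dst = sqlite3.connect(new_db)

    # Extract: read items from the legacy schema
    rows = src.execute("SELECT item_no, descr, rev FROM items_v1").fetchall()

    # Transform: the new version renames 'descr' and stores revision as an integer
    transformed = [
        (item_no, descr.strip(), int(rev))
        for item_no, descr, rev in rows
    ]

    # Load: write into the restructured table of the new version
    dst.executemany(
        "INSERT INTO items_v2 (item_number, description, revision) VALUES (?, ?, ?)",
        transformed,
    )
    dst.commit()
```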

PLM vendors and the economics of migration

PLM vendors are starting to pay attention to migrations and upgrades. While the status of PLM systems is far from ideal when it comes to migration, some vendors are proposing to cover upgrades and migrations as part of their PLM service and licensing offerings.

SaaS (cloud) provides another way to hide migrations and upgrades. Since the customer is not buying software to install in their own data center, the problem of migrations and upgrades eventually becomes part of the PLM vendor’s responsibility.

Technical elements of migration

There are 3 main elements that can increase a PLM system’s vulnerability to upgrades and migrations – 1/ custom data model; 2/ code customization and scripting; 3/ integration with other systems. The amount of specialization in each of them can increase the potential cost and complexity of migration.

What is my conclusion? You cannot avoid migrations and upgrades. So, planning ahead is a good idea. You should evaluate the vendor and product for "updatability". It is not simple, especially when it comes to on-premise software. Product architecture evaluation should be an important element of your system selection process. If you think about SaaS/cloud as a universal solution for upgrades and migrations, I recommend you approach it carefully as well. It certainly removes the pain from the customer. However, take into account that it won’t eliminate upgrades from a technological standpoint. Upgrades are an essential part of SaaS product development. Depending on the SaaS architecture and development methodology, the system can be in upgrade mode all the time. Which is a good thing, because upgrades become part of product delivery. Just my thoughts…

Best, Oleg


How to transform old CAD-PDM integration paradigms

January 23, 2015

cloud-pdm-paradigm

Integration of CAD and PDM is a field with a long history of battles, innovation and failures over the last 15-20 years. It is hard to overstate the importance of integration between CAD and data management tools. For some time in the past, CAD and PDM were separate systems. Engineers had to switch from CAD to PDM to perform specific tasks related to data management functions. Integrating PDM tools inside CAD was one of the great ideas of the 1990s, and it significantly improved what we today call "user experience".

However, the complexity of data management interfaces was always something that made engineers uncomfortable. Another innovative approach introduced in the field of CAD-PDM integration was to embed PDM tools into the File Explorer user interface. One of the companies that did it back in the 2000s was Conisio (later SolidWorks Enterprise PDM). The idea got a lot of traction and allowed engineers to work with a familiar file-based interface while in fact using PDM tools.

People are hard to change, especially when it comes to adopting new paradigms. The Dassault Systemes blog post SOLIDWORKS and PLM: No Fear Required brings an interesting perspective on the integration between ENOVIA PLM and SolidWorks.

3DEXPERIENCE platform offers a fresh approach to this problem. Recognizing that our existing Enterprise PDM solution has been greatly accepted by the design community, the same R&D group has designed a new product that offers the usability of EPDM but actually stores the data in a broader and more capable PLM solution. The result is the SOLIDWORKS Collaborative Innovation Connector, a product that works and acts much like a workgroup solution would but gives the designer just enough access to the PLM functionality to innovate their processes beyond what they can do today in a PDM environment.

The following video is one confirmation of that. You can see how the traditional ENOVIA PLM web interface is morphing to provide a File Explorer user experience for SolidWorks users. What I found specifically interesting is that you can hardly distinguish between ENOVIA PLM and SolidWorks EPDM, which have very similar user experiences in both the file explorer and the SolidWorks UI.

The video about the ENOVIA SolidWorks integration made me think about what a new PDM paradigm can be as we move forward into the cloud future. I’d like to bring a few references to new products and companies in that space – GrabCAD, Autodesk Fusion360 and Onshape.

Fusion360

At the recent Autodesk University in Las Vegas, Autodesk CEO Carl Bass presented the evolution of Fusion360 and its connection with cloud services such as Autodesk A360. According to Carl Bass, you can think about Fusion as a GitHub for engineers. Combined with A360, Fusion is a full digital re-imagination of how designers and engineers will collaborate – online and social. What is important to understand is that A360 provides the data and collaboration backbone for Fusion360, so engineers are not facing file-based operations as in traditional desktop CAD tools.

carl-bass-fusion-360-au2014-2

Onshape

Onshape is a new company re-imagining CAD for the Google era. A large part of the Onshape founding team comes from SolidWorks. Last week, Onshape started to blog. One of the things I captured from the Onshape blog is their claim to rethink the role and appearance of PDM for cloud CAD. You can read some of my thoughts here – Future CAD won’t require PDM. Here is a quote from the Onshape blog:

on-shape-world-changed

“We tried with traditional PDM, but fundamentally the architecture of copying files around, to and from servers and desktops, is just not a good basis for solving version control and collaboration problems. We think we have a better way to solve the problems, and no PDM system is needed.” Mac, Windows, phone or tablet. No PDM system needed. The files stay in one place. Different UI look. Now those sound like interesting and wonderful things. We’ll continue to anxiously anticipate what they have planned and what you have to say about it.

GrabCAD

GrabCAD Workbench is another system introducing a different experience by merging cloud and file-based data management operations. GrabCAD didn’t develop a CAD system, as some CAD industry insiders predicted it would. However, GrabCAD Workbench is a PDM system in the cloud that can remind you of elements of Dropbox combined with a CAD viewer and the ability to control file revisions.

grabcad-workbench

What is my conclusion? Existing paradigms are hard to change. In my view, engineers are one of the most innovative groups of people. However, when it comes to their own tools, engineers are very conservative. You can easily expect the following vision of data management from an engineer – “I want to work with my designs (files), please leave me alone and stop selling me PDM tools”. However, here is the thing – collaboration can make a difference. The integration of data management and collaboration can provide a significant advantage to engineers in a modern mobile and distributed environment. This is the key thing, in my view. Cloud and mobile collaboration will change the CAD/PDM integration paradigm in the future. Just my thoughts…

Best, Oleg


Why should manual PLM data modeling be a thing of the past?

January 22, 2015

plm-data-modeling

One of the most complicated parts of any PLM implementation is data modeling. Depending on the PLM vendor, product and technology, the process of data modeling can go by different names. But fundamentally, you can see it in any PLM implementation. This is the process that creates an information model of the products and processes of a specific company. Getting it done is not simple, and it requires a lot of preparation work, which is usually part of implementation services. Even more, once created, the data model needs to be extended with new data elements and features.
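To make it concrete, here is a sketch of the kind of definition an implementer produces, written as a plain Python structure. All type, attribute and lifecycle names are hypothetical; every PLM product has its own modeling syntax and tools.

```python
# Hypothetical data model definition for one item type in one company
part_type = {
    "name": "Part",
    "attributes": {
        "part_number": {"type": "string", "unique": True},
        "description": {"type": "string"},
        "material":    {"type": "string"},
        "weight_kg":   {"type": "float"},
    },
    "relationships": {
        "bom_children": {"target": "Part", "cardinality": "many"},
        "documents":    {"target": "Document", "cardinality": "many"},
    },
    "lifecycle": ["In Work", "In Review", "Released", "Obsolete"],
}
```

Multiply this by dozens of item types, company-specific attributes and process states, and you can see why the preparation work is significant and why every extension of the model is a manual effort.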

Is there a better way? How are other industries and products solving similar problems of data modeling and data curation? It made me think about the web and the internet as a huge social and information system. How are data models managed on the web? How are large web companies solving these problems?

One of the examples of creating a model for data on the web was Freebase. Google acquired Freebase and used it as one of the data sources for the Google Knowledge Graph. You can catch up on my post about why PLM vendors should learn about the Google Knowledge Graph. Another attempt to create a model for web data is Schema.org, which is very promising in my view. Here is my earlier post about Schema.org – The future of Part Numbers and Unique Identification. Both are examples of curated data models for web data. The interesting part of Schema.org is that several web search vendors agreed on some elements of the data model, as well as on how to curate and manage Schema.org definitions.
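As a small illustration of the curated-model idea applied to parts: Schema.org already defines a Product type with properties such as mpn (manufacturer part number). The snippet below builds such a description in Python; the property names are real Schema.org vocabulary, but the part and company values are made up.

```python
import json

part = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Hex bolt M8x40",
    "mpn": "HB-M8-40-A2",  # manufacturer part number (illustrative)
    "manufacturer": {
        "@type": "Organization",
        "name": "Example Fasteners Inc.",
    },
}

# JSON-LD that search engines can read from a product web page
print(json.dumps(part, indent=2))
```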

However, it looks like manual curation of the Google Knowledge Graph and Schema.org is not an approach web companies are happy about, nor one that will let them leapfrog in the future. Manual work is expensive and time consuming. At least some people are thinking about that. The Dataversity article “Opinion: Nova Spivack on a New Era in Semantic Web History” speaks about some interesting opportunities that can open a new page in the way data is captured and modeled. He speaks about possible future trajectories of deep learning, data models and relationship detection. It can extend Schema.org, especially in the part related to automatically generated data models and classifications. Here is my favorite passage:

At some point in the future, when Deep Learning not only matures but the cost of computing is far cheaper than it is today, it might make sense to apply Deep Learning to build classifiers that recognize all of the core concepts that make up human consensus reality. But discovering and classifying how these concepts relate will still be difficult, unless systems that can learn about relationships with the subtlety of humans become possible.

Is it possible to apply Deep Learning to relationship detection and classification? Probably yes, but this will likely be a second phase after Deep Learning is first broadly applied to entity classification. But ultimately I don’t see any technical reason why a combination of the Knowledge Graph, Knowledge Vault, and new Deep Learning capabilities, couldn’t be applied to automatically generating and curating the world’s knowledge graph to a level of richness that will resemble the original vision of the Semantic Web. But this will probably take two or three decades.

This article made me think about the fact that manual data curation for Freebase and Schema.org is a process very similar to what many PLM implementers do when applying specific data and process models using PLM tools. Yes, PLM data modeling usually happens for a specific manufacturing company. At the same time, PLM service providers are re-using elements of these models. Also, companies are interconnected and working together. The problem of communication between companies is painful and still requires some level of agreement between manufacturing companies and suppliers.

What is my conclusion? Data modeling is an interesting problem. For years, PLM vendors put a significant focus on how to make flexible tools that help implementers create data and process models. Flexibility and dynamic data models are highly demanded by all customers, and this is one of the most important technological elements of every PLM platform today. New forms of computing and technologies can come and automate this process. They can help generate data models automatically by capturing data about what a company does and the processes inside the company. Sounds like a dream? Maybe… But manual curation is not efficient data modeling. The last 30 years of PDM/PLM experience are a good confirmation of that. Finding a better way to apply automatic data capturing and configuration for PLM can be an interesting opportunity. Just my thoughts…

Best, Oleg

photo credit: tec_estromberg via photopin cc


Cloud PDM: stop controlling data and check shadow IT practices

January 21, 2015

cloudpdm-shadow

Customer interest in cloud PDM solutions is growing. I guess there are multiple factors here – awareness of cloud efficiency and transparency, less concern about cloud security, and the improved speed and stability of internet connections. If you are not following my blog, you can catch up on my older blog articles about cloud PDM – Cloud PDM ban lifted. What next?; Cloud PDM hack with Google Drive and other tools; Cloud can make file check-in and check-out obsolete. The confluence of new technologies around cloud, web, mobile and global manufacturing is creating demand for cloud (or web-based) solutions helping distributed design teams.

So, where is the challenge for cloud PDM? My hunch is that the biggest one is how to sell cloud PDM to manufacturing companies. I can divide all customers into two groups – larger manufacturing companies that have already implemented PDM solutions, and smaller manufacturing firms that are still managing CAD designs with folders, FTP and Dropbox accounts.

Analysts, researchers and PDM marketing pundits are trying to convince companies that cloud PDM can become a great enabler of collaboration, and that leaving CAD data “unmanaged” can bring even greater risk to the organization. There is nothing wrong with that… PDM was built around the idea of taking control over data. However, the idea of “control” is not something engineers like. Ed Lopategui speaks about engineers and control in his latest blog post – The day the strength of PDM failed. Here is a passage I liked:

The second reason, which is not so legitimate, is a loss of control. The reason so many engineers pine about the days of paper-based PDM in document control departments (or instead nothing at all) is that world could be circumvented in a pinch. It was flawed because it was run by humans, and consequently also replete with errors. Replaced with immutable and uncaring software, engineers working in groups nonetheless become irritated because they can’t just do whatever they want. You see this very conflict happening with regard to source control in software development circles. The order needed to manage a complex product necessarily makes manipulating pieces of that engineering more cumbersome. It’s one thing to be creating some widget in a freelance environment, it’s another matter entirely when that end product needs traceable configuration for a serialized certification basis. And that will happen regardless of how the software operates.

Here is the thing… Maybe cloud PDM should stop worrying about controlling data, think more about how to bring comfort to engineers, and stop irritating users with complex lifecycle scenarios? It made me think about the practice known as “shadow IT”. For the last few years, shadow IT and cloud services have had a lot in common. Don’t think about shadow IT as a bad thing. Think about the innovation shadow IT can bring to organizations.

The Forbes article “Is shadow IT a runaway train or an innovation engine?” speaks about how shadow IT can inject some innovative thinking into an organization. This is my favorite passage:

As we reported last month, one corporate employee survey found that 24% admit they have purchased and/or deployed a cloud application — such as Salesforce.com, Concur, Workday, DropBox, or DocuSign. One in five even use these services without the knowledge of their IT departments.

The rise of shadow IT may actually inject a healthy dose of innovative thinking into organizations, at a time they need it most. The ability to test new approaches to business problems, and to run with new ideas, is vital to employees at all levels. If they are encumbered by the need for permissions, or for budget approvals to get to the technology they need, things will get mired down. Plus, shadow IT applications are often far cheaper than attempting to build or purchase similar capabilities through IT.

What is my conclusion? Stop controlling data and bring the freedom of design work back to engineers. I understand it is easy to say, but very hard to implement. Controlling data is a very fundamental PDM behavior. Re-imagining it requires some innovative thinking. It is also related to how to stop asking engineers to check in, check out and copy files between different locations. Maybe this is the innovation the folks at Onshape are coming up with? I don’t know. In my view, cloud PDM tools have the opportunity to change the way engineers work with CAD data. Many new services became successful by providing cloud applications that make existing working practices much easier than before. Just my thoughts…

Best, Oleg
photo credit: Dean Hochman via photopin cc

