How is PTC delivering PLM in the cloud?

January 28, 2015


Cloud is trending and it is hard to find a company that is not thinking about how to leverage new cloud technologies and business models. However, just saying “cloud” these days probably means nothing. The right question is how to implement the cloud. I guess many companies are coming to that question these days. It goes in parallel with the discussion about what is “cloud” and what is “not cloud”, which has both technical and marketing aspects.

A long time ago, PTC introduced PLM “On Demand“. You may remember this marketing name, which was later replaced by SaaS and cloud. According to the PTC website, the solution is available and hosted by IBM. I noticed some indication of PTC’s move to the cloud back in 2013, after the acquisition of NetIDEAS. My writeup about that is here. According to the PTC press release, NetIDEAS allowed PTC to develop a better foundation for offering multiple deployment options.

Earlier today, my attention was caught by a PTC announcement – PTC Introduces PTC PLM Cloud – New PTC Windchill SaaS offerings for small and midsized companies. The following passage explains why PTC is coming out with a cloud product offering:

Recognizing that many SMB organizations may lack a dedicated IT staff but still want to adopt a proven PLM environment, PTC designed PTC PLM Cloud specifically to enable team collaboration and data management in the cloud. This flexible offering eliminates the typical, but risky, SMB practice of shared folders and file naming conventions which hamper product development. With more effective and reliable data sharing in the cloud, customers are able to improve product development across teams in different locations, teams working with varying CAD applications, and with external teams such as partners and suppliers who are a growing part of the product development process.

I tried to dig into the materials available online to see how PTC will provide cloud PLM and what options are available. Navigate here to learn more. It comes in three options – standard, premium and enterprise. While the names themselves mean little, the following distinction caught my attention – “instant access” for standard vs. “dedicated database” for the others. In addition, the differences between options come down to “workflow customization” in premium and “new business objects and UI customization” in enterprise. It looks like PTC recognized the importance of MCAD data management – all versions come with an integrated viewing solution and support for Creo, AutoCAD, Inventor and SolidWorks.


The questions that remain open for me at this moment are price and cloud (hosting) architecture. This is especially important for customers today, as I mentioned earlier in my post – Why you should ask your cloud PLM vendor about DevOps and Kubernetes.

What is my conclusion? Manufacturing companies are not implementing PLM because of the high cost and limited availability of IT resources. To many customers and vendors today, the cloud seems like the right path to remove IT cost and make implementations less painful. From that standpoint, PTC is taking the right trajectory by delivering a Windchill-based PLM solution in the cloud. However, the devil is in the details. I’m looking forward to learning more about “how” PTC PLM on the cloud will be delivered and how it will differ from the other PLM clouds from Autodesk, Aras, Dassault Systemes and Siemens PLM. Just my thoughts…

Best, Oleg


The importance of software BOM for hardware security

January 27, 2015


We live in the era of smart products. The modern smartphone is a good confirmation of that. The average person today carries in their pocket a computer with computational capability equal to or greater than the computers the aerospace and defense industry used for navigation. In addition, your smartphone has communication capabilities (Wi-Fi and Bluetooth) that make it even more powerful. If you think about the cost and availability of boards like Raspberry Pi and Arduino, you can understand why and how they are revolutionizing many products these days. However, the wide spread of these devices has drawbacks.

Smart products are bringing a new level of complexity everywhere. It starts in engineering and manufacturing, where you need to deal with complex multidisciplinary issues related to the combination of mechanical, electronic and software pieces. The last one is a critical addition to product information. The bill of materials has to cover not only mechanical and electronic parts, but also software elements.
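To illustrate, here is a minimal sketch in Python of what a multi-disciplinary BOM could look like, with mechanical, electronic and software items in one structure. All part numbers and names are invented.

```python
from dataclasses import dataclass, field

@dataclass
class BomItem:
    part_number: str
    description: str
    discipline: str             # "mechanical", "electronic" or "software"
    version: str = ""           # essential for software items (firmware, libraries)
    children: list = field(default_factory=list)

# A smart product BOM mixes all three disciplines in a single structure
smart_lock = BomItem("SL-100", "Smart door lock", "mechanical", "A", [
    BomItem("SL-110", "Lock housing", "mechanical", "B"),
    BomItem("SL-200", "Controller board", "electronic", "C", [
        BomItem("SL-210", "Bluetooth module", "electronic", "A"),
    ]),
    BomItem("FW-300", "Lock firmware", "software", "2.4.1", [
        BomItem("OSS-310", "Open source TLS library", "software", "1.0.1f"),
    ]),
])

def walk(item, level=0):
    """Print the full BOM tree, software components included."""
    print("  " * level + f"{item.part_number} {item.description} {item.version}")
    for child in item.children:
        walk(child, level + 1)

walk(smart_lock)
```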

Another aspect is related to the operation of all these smart products. Because products are connected, operating them requires dealing with software, data and other elements that can easily turn your manufacturing company into a web operations facility with servers, databases, etc.

As soon as devices run software, the problem of software component traceability becomes critical. Configuration management and updates are a starting point. But it quickly comes down to security, which is very critical today.

The GCN article – How secure are your open-source based systems? – speaks about the problem of security in open source software. Here is my favorite passage:

According to Gartner, 95 percent of all mainstream IT organizations will leverage some element of open source software – directly or indirectly – within their mission-critical IT systems in 2015. And in an analysis of more than 5,300 enterprise applications uploaded to its platform in the fall of 2014, Veracode, a security firm that runs a cloud-based vulnerability scanning service, found that third-party components introduce an average of 24 known vulnerabilities into each web application.

To address this escalating risk in the software supply chain, industry groups such as The Open Web Application Security Project, PCI Security Standards Council and Financial Services Information Sharing and Analysis Center now require explicit policies and controls to govern the use of components.

Smart products are also leveraging open source software. The security of connected devices and smart products is a serious problem to handle. It makes me think about how hardware manufacturing companies can trace software elements and protect their products from potential vulnerabilities.
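Here is a toy illustration of the point, assuming the software BOM is available as structured data. A real implementation would query a vulnerability feed such as NVD; the BOM entries below are invented, while the two CVEs are real examples from 2014.

```python
# Cross-check software BOM entries against known vulnerable component versions.
# A production system would query a vulnerability database (e.g. NVD) instead
# of a hardcoded dictionary.
KNOWN_VULNERABILITIES = {
    ("openssl", "1.0.1f"): "CVE-2014-0160 (Heartbleed)",
    ("bash", "4.2"): "CVE-2014-6271 (Shellshock)",
}

software_bom = [
    {"component": "openssl", "version": "1.0.1f", "product": "SL-100 Smart lock"},
    {"component": "busybox", "version": "1.21.0", "product": "SL-100 Smart lock"},
]

for entry in software_bom:
    issue = KNOWN_VULNERABILITIES.get((entry["component"], entry["version"]))
    if issue:
        print(f"{entry['product']}: {entry['component']} {entry['version']} -> {issue}")
```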

What is my conclusion? Covering all aspects of product information, including software, is becoming absolutely important. For many manufacturing companies, the information about mechanical, electronic and software components is siloed in different data management systems. In my 2015 PLM trends article, I mentioned the importance of new tools capable of managing multidisciplinary product information. Software BOM security is just one example of the trend. The demand for systems able to handle all aspects of the product BOM is increasing. Just my thoughts…

Best, Oleg

photo credit: JulianBleecker via photopin cc


The anatomy of PLM upgrades

January 26, 2015


Software upgrades are a fascinating topic. They have been with us since the very beginning of software. Seriously, we hate upgrades. On the other hand, very often this is the only way to make progress. The main problem of upgrades is existing dependencies – data migration, file format and data incompatibilities, hardware incompatibilities, etc.

As software gets more complex, the complexity of upgrades increases. Enterprise software is a very good example. Talk to people about ERP, PLM and other enterprise software upgrades and you can learn a lot about the effort and cost of upgrades for an organization.

For a long time, enterprise software upgrades were considered something inevitable, which led to many problems for customers. One of the extreme situations is when a specific configuration of a system becomes non-upgradable – known as "version lock-in". The most typical reasons are feature and customization incompatibilities between the new software version and the one the customer is still running. As customers discover the complexity of upgrades, we can see software vendors trying to leverage it to demonstrate their differentiation.

For the last few years, I have seen an increased focus among PLM vendors on "upgrade and migration". My hunch is that too many customers are stuck on previous versions of PLM software or outdated PLM systems. The Random PLM (future) thoughts article by Jos Voskuil speaks about the complexity of PLM system upgrades. Read the following passage:

Not every upgrade is the same! Where consumer software will be used by millions and tested through long Alfa and beta cycles, PLM software often comes to the market in what you could consider a beta stage with limited testing. Most PLM vendors invest a lot of their revenue in providing new functionality and technology based on their high-end customer demands. They do not have the time and budget to invest in the details of the solution; for this reason PLM solutions will remain a kind of framework. In addition, when a solution is not 100 % complete there will be an adaptation from the customer, making upgrades later, not 100 percent guaranteed or compatible. More details on PLM Upgrades after the conference, let’s look into the near future.

I think the overall trend in enterprise software quality is positive. The consumer software mentioned by Jos is only one factor in why enterprise software vendors are investing more in quality. Jos’ article made me think more about how customers should approach the topic of PLM migrations and upgrades. In general, I think it applies not only to PLM systems. PLM vendors are trying to make migrations easy from both economic and technological standpoints. Here are some of my thoughts about the anatomy of PLM software migration.

Migration technologies

While some technologies can give you an advantage during migration and upgrades, from a technical standpoint you cannot avoid upgrades. Very simple – from time to time you need to restructure the database to bring in new features or optimize for performance. Since PLM relies on OS and database technologies, you need upgrades to bring the PLM system into a compatible state with a new OS/RDBMS. If your PDM/PLM system is integrated with CAD or other systems, that is another aspect of migrations.

From a technological perspective, migration is always a sort of extract, transform, load (ETL) process. It can be minor or major. It can happen in a single database or may require a separate set of application or database servers. A PLM system architecture designed with "upgrade in mind" can make it easier, but won’t eliminate it completely.
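As an illustration, here is a minimal sketch in Python of a single ETL step for a hypothetical schema change. The table layout and naming convention are invented; a real PLM upgrade runs many such steps with validation and rollback around them.

```python
import sqlite3

def migrate_items(old_db, new_db):
    """One ETL step of a hypothetical PLM upgrade: the new schema splits
    the old single 'name' column into 'part_number' and 'title'."""
    src, dst = sqlite3.connect(old_db), sqlite3.connect(new_db)
    dst.execute("CREATE TABLE IF NOT EXISTS item (part_number TEXT, title TEXT, revision TEXT)")
    # Extract records from the old schema
    for name, revision in src.execute("SELECT name, revision FROM item"):
        # Transform: the old convention was 'PN-1234 Some descriptive title'
        part_number, _, title = name.partition(" ")
        # Load into the new schema
        dst.execute("INSERT INTO item VALUES (?, ?, ?)", (part_number, title, revision))
    dst.commit()
```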

PLM vendors and the economics of migration

PLM vendors are starting to pay attention to migration and upgrades. While the state of PLM systems is far from ideal when it comes to migration, some vendors are proposing to cover upgrades and migrations as part of their PLM service and licensing offerings.

SaaS (cloud) provides another way to hide migration and upgrades. Since the customer is not buying software to install in their own data center, the problem of migrations and upgrades effectively becomes part of the PLM vendor’s responsibility.

Technical elements of migration

There are three main elements that can increase a PLM system’s vulnerability to upgrades and migrations – 1/ custom data model; 2/ code customization and scripting; 3/ integration with other systems. The amount of specialization in each of them can increase the potential cost and complexity of migration.

What is my conclusion? You cannot avoid migrations and upgrades. So planning ahead is a good idea. You should evaluate vendors and products for "upgradability". It is not simple, especially when it comes to on-premise software. Product architecture evaluation should be an important element of your system selection process. If you think about SaaS/cloud as a universal solution for upgrades and migration, I recommend you approach it carefully as well. It certainly removes the pain from the customer. However, take into account that it won’t eliminate upgrades from a technological standpoint. Upgrades are an essential part of SaaS product development. Depending on the SaaS architecture and development methodology, the system can be in upgrade mode all the time. That is a good thing, because upgrades become part of product delivery. Just my thoughts…

Best, Oleg


Why should manual PLM data modeling be a thing of the past?

January 22, 2015


One of the most complicated parts of any PLM implementation is data modeling. Depending on the PLM vendor, product and technology, the process of data modeling can go by different names. But fundamentally, you can see it in any PLM implementation. It is the process that creates an information model of the products and processes of a specific company. Getting it done is not simple, and it requires a lot of preparation work, which is usually part of implementation services. Even more, once created, the data model needs to be extended with new data elements and features.
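To make it concrete, here is a minimal Python sketch of what such a hand-built model boils down to: object types, attributes, relationships and lifecycle states defined by an implementer for one company. All names are illustrative.

```python
# A hand-curated PLM data model for one (imaginary) company: the implementer
# defines object types, attributes, relationships and lifecycle states.
data_model = {
    "types": {
        "Part":     {"attributes": ["part_number", "title", "material", "mass"]},
        "Document": {"attributes": ["doc_number", "title", "format"]},
        "ECO":      {"attributes": ["eco_number", "reason", "status"]},
    },
    "relationships": [
        ("Document", "describes", "Part"),
        ("ECO", "affects", "Part"),
    ],
    "lifecycle": {"Part": ["In Work", "In Review", "Released", "Obsolete"]},
}
```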

Is there a better way? How are other industries and products solving similar problems of data modeling and data curation? It made me think about the web and the internet as a huge social and information system. How are data models managed on the web? How are large web companies solving these problems?

One example of creating a model for data on the web was Freebase. Google acquired Freebase and used it as one of the data sources for the Google Knowledge Graph. You can catch up on my post about why PLM vendors should learn about the Google Knowledge Graph. Another attempt to create a model for web data is Schema.org, which is very promising in my view. Here is my earlier post about Schema.org – The future of Part Numbers and Unique Identification. Both are examples of curated data models for web data. The interesting part of schema.org is that several web search vendors have agreed on some elements of the data model, as well as on how to curate and manage schema.org definitions.
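For example, schema.org defines a Product type with an agreed vocabulary; mpn (manufacturer part number), sku and brand are real schema.org property names, while the values below are invented. The markup is shown here as a Python dictionary holding JSON-LD.

```python
# schema.org "Product" markup (JSON-LD) that search vendors agreed to interpret.
# Property names are schema.org terms; the values are made up.
product_markup = {
    "@context": "http://schema.org",
    "@type": "Product",
    "name": "Smart door lock",
    "mpn": "SL-100",    # manufacturer part number
    "sku": "SL-100-US",
    "brand": {"@type": "Brand", "name": "Acme Locks"},
}
```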

However, it looks like manual curation of the Google Knowledge Graph and Schema.org is not an approach that makes web companies happy or lets them leapfrog into the future. Manual work is expensive and time consuming. At least some people are thinking about that. The Dataversity article “Opinion: Nova Spivack on a New Era in Semantic Web History” speaks about some interesting opportunities that could open a new page in the way data is captured and modeled. He speaks about possible future trajectories of deep learning, data models and relationship detection. It could extend Schema.org, especially the part related to automatically generated data models and classifications. Here is my favorite passage:

At some point in the future, when Deep Learning not only matures but the cost of computing is far cheaper than it is today, it might make sense to apply Deep Learning to build classifiers that recognize all of the core concepts that make up human consensus reality. But discovering and classifying how these concepts relate will still be difficult, unless systems that can learn about relationships with the subtly of humans become possible.

Is it possible to apply Deep Learning to relationship detection and classification? Probably yes, but this will likely be a second phase after Deep Learning is first broadly applied to entity classification. But ultimately I don’t see any technical reason why a combination of the Knowledge Graph, Knowledge Vault, and new Deep Learning capabilities, couldn’t be applied to automatically generating and curating the world’s knowledge graph to a level of richness that will resemble the original vision of the Semantic Web. But this will probably take two or three decades.

This article made me think about the fact that manual data curation for Freebase and Schema.org is a very similar process to what many PLM implementers are doing when applying specific data and process models using PLM tools. Yes, PLM data modeling usually happens for a specific manufacturing company. At the same time, PLM service providers re-use elements of these models. Also, companies are interconnected and work together. The problem of communication between companies is painful and still requires some level of agreement between manufacturers and suppliers.
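As a toy illustration of the automated alternative, the sketch below infers attribute names and value types from records a company already produces, instead of hand-authoring the model. Real systems would need far more (relationships, classifications, cleansing), but the direction is the same; the records are invented.

```python
# Infer a simple data model (attribute names and value types) from sample
# records instead of defining it manually.
sample_records = [
    {"part_number": "SL-100", "title": "Smart door lock", "mass": 0.4},
    {"part_number": "SL-110", "title": "Lock housing", "mass": 0.2, "material": "ABS"},
]

inferred = {}
for record in sample_records:
    for attr, value in record.items():
        inferred.setdefault(attr, set()).add(type(value).__name__)

print(inferred)
# {'part_number': {'str'}, 'title': {'str'}, 'mass': {'float'}, 'material': {'str'}}
```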

What is my conclusion? Data modeling is an interesting problem. For years, PLM vendors put significant focus on how to make flexible tools that help implementers create data and process models. Flexibility and dynamic data models are highly demanded by all customers, and this is one of the most important technological elements of every PLM platform today. New forms of computing and technology can come and automate this process. They can help generate data models automatically by capturing data about what a company does and the processes inside it. Sounds like a dream? Maybe… But manual curation is not an efficient way to do data modeling. The last 30 years of PDM/PLM experience are a good confirmation of that. Finding a better way to apply automatic data capture and configuration to PLM could be an interesting opportunity. Just my thoughts…

Best, Oleg

photo credit: tec_estromberg via photopin cc


Can BOX become a platform for PLM?

January 20, 2015


Platforms are a topic that comes up quite often in discussions about the future of PLM. CIMdata recently came out with the topic of “platformization” in PLM. You can catch up on the discussion – A CIMdata dossier: PLM platformization. I can probably divide all existing PLM platforms into two groups – 2D/3D design platforms and object database modeling platforms. Last year, I charted some possible options for the foundation of a future PLM platform – systems engineering, 2D/3D services, product development standards, new database technologies. From another standpoint, debates about future PLM platforms often raise the question of a single vs. federated platform for PLM.

New technological trends and customer demands can bring new platforms into the PLM world. One of them is cloud storage. I touched on the cloud storage topic in my article – CAD companies and cloud storage strategies. One of the points was related to the longevity of the “cloud storage” business. Cloud companies want to store your data. It gives them an opportunity to understand your business better. However, the prediction is that the cost of cloud storage will eventually come to zero, which leaves cloud companies with the need to develop solutions that elevate productivity and improve collaboration and document creation. This is where it comes to PLM as a future platform for product innovation.

BOX is a company located at the intersection of cloud storage and enterprise business. My attention was caught by the Business Insider article – In One Slide, Box Explains What Everybody’s Getting Wrong About The Company. Here is the slide:

[Slide: Box as an enterprise platform]

Here is an interesting passage and conclusion from the article:

In an interview with Business Insider, Box CEO Aaron Levie said he knew storage business was going to turn into a commodity business back when he first started the company. Instead, he said he’s creating a platform business, where more value is added on top of things like storage, computing, and security. “It’s all about going into the top 8 to 10 industries and finding where are companies reimagining their business, where are they going digital, where are they transforming their business model, and how does Box act as a platform that could accelerate that push into the future,” he said. If the critics are right, Box is doomed. If Box is right, it has a chance at being a valuable enterprise company along the lines of Salesforce.

Looking at customers, partners and, especially, the BOX enterprise content collaboration platform made me think about an interesting intersection between product lifecycle and the BOX business. Of course, BOX is not in the business of design and engineering software. However, enterprise collaboration has a significant overlap with what most PLM platforms provide – metadata, security, workflow, collaboration, content search. These are topics that have always been present in PLM. It seems to me the current focus of BOX is outside of manufacturing companies. However, maybe future BOX growth will take it towards manufacturing enterprises.

What is my conclusion? I don’t think BOX is focusing on manufacturing companies today. However, elements of the BOX platform make perfect sense when you think about product lifecycle collaboration. What is especially interesting is content collaboration on an enterprise scale. This is a topic most PLM companies are struggling with. Existing PLM platforms have good representation in the engineering domain, but lack broad enterprise adoption. This is a place where future competition between PLM vendors and BOX (or similar companies) can occur. On the other side, BOX could become a platform that takes PLM collaboration forward in enterprise companies. Just my thoughts…

Best, Oleg

Picture credit: box.com


Will search replace the engineer’s brain in the future?

January 17, 2015


Computers are changing the way we work. That is probably too broad a statement. But since today is Friday afternoon, it should be fine :). I want to take a bit of a futuristic perspective today. Google, the internet and computing are good reasons why our everyday habits are different from what they were 10 years ago. Back in the early 2000s, we bought paper maps before going on vacation and kept paper books with the phone numbers of the people we needed. Look how different it is now. Maybe we still need to make a hotel reservation before a trip, but most of the things we do are achievable online via the internet and mobile devices.

A month ago, I posted about connecting digital and physical entities. I was inspired by Jeff Kowalski’s presentation at AU 2014. You can get a transcript and video by navigating to the following link. The idea of machine learning and “training” a computer brain to find an optimal design is inspiring. The following passage from Kowalski’s presentation is key in my view:

…we’re working on ways to better understand and navigate existing solutions that might be relevant to your next design project. Using machine learning algorithms, we can now discover patterns inherent in huge collections of millions of 3D models. In short, we can now discover and expose the content and context of all the current designs, for all the next designs. Taxonomies are based on organizing things with shared characteristics. But they don’t really concern themselves with the relationships those things have with other types of things — something we could call context. Adding context reveals not only what things are, but also expresses what they’re for, what they do, and how they work.

Nature explores all of the solutions that optimize performance for a given environment — what we call evolution. We need to do the same thing with our designs. But first we have to stop "telling the computer what to do," and instead, start "telling the computer what we want to achieve." With Generative Design, by giving the computer a set of parameters that express your overall goals, the system will use algorithms to explore all of the best possible permutations of a solution through successive generations, until the best one is found.
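To make "telling the computer what we want to achieve" concrete, here is a toy Python sketch of exploring permutations of a solution through successive generations. It is nothing like a real generative design engine; the goal function and the two design parameters are invented for illustration.

```python
import random

def goal(design):
    """The stated goal, expressed as a score: high stiffness, low mass."""
    stiffness, mass = design
    return stiffness - 2.0 * mass

# Start from random candidate designs
population = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(50)]

for generation in range(100):
    # Keep the best permutations, then explore variations of them
    population.sort(key=goal, reverse=True)
    survivors = population[:10]
    population = [
        (s + random.gauss(0, 0.1), max(0.0, m + random.gauss(0, 0.1)))
        for s, m in survivors
        for _ in range(5)
    ]

print("best design found:", max(population, key=goal))
```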

Another time I was thinking about artificial intelligence, machine learning and self-organizing systems was in my article – How PLM can build itself using AI technologies. The idea of The Grid, which allows a website to organize itself based on a set of input parameters and content learning, is interesting. It made me think about a future PLM system that defines its own behaviors based on capturing information and processes from a manufacturing company.

The article Google search will be your brain puts another interesting perspective on the evolution of computers and information systems. Take some time over the weekend and read it. The story of neural nets is fascinating, and if you think about the potential to train a net with design knowledge, it could help capture requirements and design commands in the future. Here is an interesting passage from the article explaining how neural nets work:

Neural nets are modeled on the way biological brains learn. When you attempt a new task, a certain set of neurons will fire. You observe the results, and in subsequent trials your brain uses feedback to adjust which neurons get activated. Over time, the connections between some pairs of neurons grow stronger and other links weaken, laying the foundation of a memory.

A neural net essentially replicates this process in code. But instead of duplicating the dazzlingly complex tangle of neurons in a human brain, a neural net, which is much smaller, has its neurons organized neatly into layers. In the first layer (or first few layers) are feature detectors, a computational version of the human senses. When a computer feeds input into a neural net—say, a database of images, sounds or text files—the system learns what those files are by detecting the presence or absence of what it determines as key features in them.
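Here is a minimal numpy sketch of that idea: one layer of "feature detectors" whose connection weights are strengthened or weakened by feedback after every trial. The task (learning logical OR) is a toy chosen for brevity, not anything from the article.

```python
import numpy as np

np.random.seed(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([0, 1, 1, 1], dtype=float)                      # target: logical OR

weights = np.random.randn(2) * 0.1
bias = 0.0

def fire(x):
    """Sigmoid activation: how strongly the neuron fires for input x."""
    return 1.0 / (1.0 + np.exp(-(x @ weights + bias)))

for trial in range(5000):
    output = fire(X)
    feedback = y - output                       # observe the results
    adjust = feedback * output * (1 - output)   # strengthen or weaken connections
    weights += 0.5 * X.T @ adjust
    bias += 0.5 * adjust.sum()

print(np.round(fire(X)))  # -> [0. 1. 1. 1.]
```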

So, who knows… maybe in the not very distant future, CAD and PLM systems will provide a search-based experience that helps engineers design and manufacture in a completely different way.

What is my conclusion? While it still sounds like a dream, I can see some potential in making design work look like a search for an optimal solution with specific constraints and parameters. A well-trained algorithm could do the work in the future. Just thinking about that fires off so many questions – how long it will take to train the net, what the role of engineers will be in future design, and many others. But these are just my thoughts… Maybe it will inspire you too. Have a great weekend!

Best, Oleg


Top 5 PLM trends to watch in 2015

January 15, 2015


The holidays are over, and it was a good time to think about what you can expect from engineering and manufacturing software related to PLM in the coming year. You probably had a chance to listen to my 2015 PLM predictions podcast a few months ago. If you missed it, here is the link. Today I want to give a somewhat expanded list of trends in product lifecycle management to watch in 2015.

1- Greater complexity of cloud PLM implementations

Cloud adoption in the enterprise has been growing for the last few years and is getting more mature. PLM vendors are making steps in the cloud direction too. Companies are moving from marketing and research to the “nuts and bolts” of implementations. The switch to the cloud is not as simple as some marketing pundits predicted. It is more than just moving servers from your data center to somebody else’s place. The complexity of implementation, maintenance and operation will emerge and will drive the future difference between “born in the cloud” solutions and existing PLM platforms migrating to the cloud.

2- The demand to manage complex product information will be growing

Products are getting more complex. You can see it all around you. A simple IoT gadget such as a door lock can combine mechanical, electrical, electronic and software parts. This introduces a new level of complexity for manufacturers and PLM vendors – how do you manage all this information in a consistent way? Bringing together design and bills of materials for every discipline becomes a critical factor in manufacturing companies of every size.

3- New types of manufacturing companies will attract the focus of PLM vendors

The manufacturing landscape is changing. The internet and globalization are enabling a new type of manufacturing company – smaller, distributed, agile, crowdfunded. It requires a new type of thinking about collaboration, distributed work, digital manufacturing and more. These companies represent a new opportunity and will draw more attention from PLM vendors.

4- Growing interest in mobile enterprise PLM solutions

Mobile has gone mainstream in many domains. Until now, engineers in manufacturing companies mostly used mobile for email. In 2015, I can see the potential for greater interest in mobile solutions from manufacturing companies. Distributed work and the need for collaboration will drive the demand to make existing enterprise systems more mobile.

5- The demand for big data and analytics in the product lifecycle

Data is attracting greater attention these days. I have even heard “data is the new oil”. Manufacturing companies will start to recognize the opportunity and think about how to use the piles of data from their enterprise engineering and manufacturing systems to drive analysis and use it for decision making.
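As a small, hypothetical example of the kind of analysis (pandas assumed, column names invented): mining accumulated engineering change orders to see where changes take the longest and why.

```python
import pandas as pd

# Engineering change orders accumulated in PLM/ERP systems (invented data)
ecos = pd.DataFrame({
    "product_line": ["lock", "lock", "sensor", "sensor", "lock"],
    "reason":       ["design", "supplier", "design", "design", "supplier"],
    "cycle_days":   [12, 30, 9, 14, 25],
})

# Decision-making input: where do changes take the longest, and why?
print(ecos.groupby(["product_line", "reason"])["cycle_days"].mean())
```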

What is my conclusion? I think 2015 will be a very interesting year in PLM. Broader adoption of cloud, mobile and big data analytics will drive future transformation in engineering and manufacturing software. The disconnect between old-fashioned enterprise software and new tech vendors will increase. Just my thoughts…

Best, Oleg

