How PLM can “build itself” using artificial intelligence technologies

January 7, 2015

building-the-grid-of-data

I had a chance to visit The Art of the Brick exhibition at Boston’s Faneuil Hall a few days ago. If you follow me on social media websites, there is a chance you noticed a few pictures. Afterwards, I read more about LEGO artist Nathan Sawaya. What impressed me is the power of the “simple LEGO brick”. A simple plastic brick and a huge amount of imagination allow him to create such incredible models.

lego-bricks-sawaya

You can ask me – how is that connected to engineering, manufacturing and product lifecycle management? Here is the thing… It made me think about the way PLM systems are implemented these days. I’m sure you are familiar with the “best practices” approach. The topic isn’t new. I found my old post – PLM best practices torpedo. After five years, I still like my conclusion – PLM best practices are good for showing what PLM technology and software are capable of. However, for a real implementation, they are not very useful. You have to come back to the “simple bricks” of PLM technology – data models, documents, lifecycle statuses, bills of materials, processes.

I recently came across a different perspective on PLM best practices. Navigate to the PLM cultural change blog – PLM design patterns. It took me back into thinking about best practices. How do you define implementation patterns and make a real PLM implementation much easier? The article speaks about how PLM implementation can generally be done, along with organizational transformation and change. Read the article. I found the following few passages interesting:

In general you can setup all required supporting procedures in using the PLM design patterns. Even for specific supporting procedures of a business process pattern like Engineer to Order (ETO) you can derive patterns, which consists of a framework of general PLM design patterns and are adapted to the specific business needs. There is enough freedom to derive based on these patterns supporting procedures to fulfill specific business needs.

If some organizations would have implemented supporting procedures based on patterns already, then consultants in introducing PLM to an organization could refer to “state of the art” implementation examples of these organizations. The target is to convince an organization, that the decision for a new practice requesting organizational change is required and works. Only then the organization can enable the full potential of the PLM methodology without remaining stuck in the current practice.

Instead of inventing a Ping-Pong table “from scratch” with a cabinetmaker we can make a clear decision based on all the options available, fulfilling and probably exceeding our originally perceived needs (with a safe and easy-to-use folding mechanism). And we can afford it, because a stock table is cheaper than a custom built one.

The time saved in avoiding the endless discussions and continual redesign of processes because of paradigm paralysis, based on current methods, could be better used in a well-planned, strategic deployment of the new processes leading to an improved business solution.

plm-design-patterns

The idea and vision of configurable patterns and best practices is interesting. In my view, it was invented earlier as PLM toolkits, flexible data models and process templates. The key problem here is not related to technology – software does what it does. The problem is related to people and organizations. Remember, technology is simple, but people are really hard. What is called “convincing people” is actually a process that leads an organization and its people to understand their business and product development patterns. Without that understanding, the chances of a successful PLM implementation are very low and the probability of PLM project failure is high.

So, what could be a 21st-century solution to that problem?

My attention today was caught by a new startup – The Grid. The tagline states – AI websites that design themselves. The vision of The Grid is to change the paradigm of website building. The idea of self-building websites driven by artificial intelligence and data analysis is something worth thinking about. Watch the video.

Now let me get back to manufacturing companies and PLM implementations. All manufacturing organizations are different. The approach most PLM vendors are taking these days is to classify companies by size (small, medium, large), industry (aero, auto, industrial equipment, etc.), manufacturing model (mass production, configure to order, engineer to order, etc.) and many other dimensions such as locations, supply chain, and existing enterprise systems (ERP, SCM, etc.). The decision matrix is huge. Analyzing an existing manufacturing company – its processes, existing systems and requirements – is what takes time and money during a PLM implementation.

What is my conclusion? The opportunity we have today comes from a new way to process data. Call it cloud computing, big data, whatever. Facebook is reporting the capability to index a trillion posts. Would it be possible to capture data from an existing manufacturing company and ask a PLM system to build itself? Is it a dream or the future of PLM? Just my thoughts…

Best, Oleg

pictures credit to The Grid website and PLM cultural change blog


PLM and Entire System Common Data Model

January 5, 2015

complex-system-bom

Products are getting more complex these days. There are multiple reasons for that. The addition of connectivity and software is one of the most visible reasons why it happens. Think about any simple gadget these days that costs less than $99 in the US. It includes mechanical components, electrical parts and software. In addition to that, products often function together with cloud-based or Bluetooth-enabled software. The situation is even more complex when it comes to industrial equipment, transportation and other products.

Last year, I was blogging about the need to combine engineering and software BOMs. My hunch is that the problem is not solved yet. My attention was caught by a TEC (Technology Evaluation Centers) writeup – 9 Innovation and Product Development Software Market Trends and Predictions, made by Predrag Jakovljevic. One part of it speaks about the need to develop a holistic, common data model to manage an entire system. Here is the passage that speaks about that:

There is a need for a common data model for managing an entire system, i.e., both hardware and software data from ideation to end of life (EOL). Common repositories and software architectures must enable the reuse of design and components (and intellectual property, if you will). Thus the "innovation platform" has become a big concept for PLM-CAD-MOM players—Dassault Systèmes has the 3DEXPERIENCE Platform, Siemens PLM Software has its Smart Innovation Platform, and Autodesk’s A360, PLM 360, and Fusion 360 products all run on the same data model. Other vendors such as SAP and Oracle are not far behind. I question how many companies will be able to support an all-inclusive PLM to CAD to MES solution. My feeling is that “cloud rings or layers” with improved interoperability will emerge around PLM to reduce upgrade and total cost of ownership (TCO) costs.

The fact that vendors are working on new platforms hints that future modeling capabilities will help manage more complex data structures. Creating a "composite BOM model" with mechanical, electrical and software elements can be an interesting PLM challenge. PLM vendors have been working on a variety of BOM solutions for the last two decades. Product structure is a complex piece of data that represents a critical element of the PLM data management foundation. In earlier days, PLM complexity was related to product configurations and options. Today, electronics and software are the new challenge for PLM data management.
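To make the "composite BOM model" idea concrete, here is a minimal sketch in Python. All names here are hypothetical illustrations, not any vendor's actual data model: each BOM item carries a domain tag, so a single product structure can mix mechanical, electrical and software parts and still be queried per domain.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical "composite BOM" sketch: every item carries a domain tag,
# so one product structure can mix mechanical, electrical and software
# components and be filtered by domain.

@dataclass
class BomItem:
    part_number: str
    description: str
    domain: str              # "mechanical" | "electrical" | "software"
    quantity: int = 1
    children: List["BomItem"] = field(default_factory=list)

    def parts_by_domain(self, domain: str) -> List["BomItem"]:
        """Flatten the structure and return all items of one domain."""
        found = [self] if self.domain == domain else []
        for child in self.children:
            found.extend(child.parts_by_domain(domain))
        return found

# A sub-$99 connected gadget, as described above: enclosure, board,
# firmware, and a companion app all in one product structure.
gadget = BomItem("G-100", "Connected gadget", "mechanical", children=[
    BomItem("M-001", "Enclosure", "mechanical"),
    BomItem("E-001", "Main board", "electrical", children=[
        BomItem("S-001", "Firmware v1.2", "software"),
    ]),
    BomItem("S-002", "Companion mobile app", "software"),
])

print([p.part_number for p in gadget.parts_by_domain("software")])
# ['S-001', 'S-002']
```

The interesting (and hard) part for PLM vendors is not this structure itself, but keeping the software branch synchronized with build systems and release cycles that move much faster than mechanical engineering changes.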

What is my conclusion? What was a challenge for the aerospace and automotive industries 20 years ago now comes to electronic gadgets and IoT-connected devices. The ability to manage software, electronic and mechanical parts becomes a prerequisite for any PLM system deployment. Just my thoughts…

Best, Oleg

photo credit upverter


Oracle Cloud PaaS will provide a magic button for PLM

September 29, 2014

oracle-hq

Cloud PLM architecture and implementation is one of the topics I’ve been following for the last few years. It is interesting to watch the dynamics of this space move from initial ignorance to careful recognition and marketing buzz. I can see differences in how PLM vendors are approaching the cloud. In my view, nobody is asking the question “why cloud?” these days. At the same time, we can see a large variety of strategies in cloud PLM implementations. I guess PLM vendors want to answer the question – how to implement cloud?

The element of infrastructure is important. The strategy of Siemens PLM – one of the leaders of the PLM market – relies heavily on the IaaS option. I covered it in my post here. At the same time, Dassault promises to support all PLM cloud options by 2015+.

I’m following Oracle OpenWorld online these days. Earlier today, the following Gigaom article caught my attention – Oracle launches upgraded cloud platform with its database and Java available as a service. One of the key elements in Oracle’s cloud strategy is reliance on the Oracle database.

oracle-db-paas

This is my favorite passage from the article:

Oracle detailed on Sunday evening its upgraded cloud suite that includes the ability for customers to use its flagship database in the cloud as well as on-premise. Executive chairman and CTO Larry Ellison talked about the new platform, now available, during his keynote session at Oracle’s annual OpenWorld conference. Ellison (pictured above) attempted to persuade the audience that Oracle’s rejiggered cloud platform can be the all-in-one shop for users to run Oracle applications, house their data and even build out their own applications while choosing whether or not they want any or all of those items to run on the cloud. “This new Oracle in the cloud allows you to move any database from your datacenter to the cloud like pushing a button,” said Ellison. Oracle’s cloud platform consists of a software-as-a-service (SaaS), a platform-as-a-service (PaaS) and an infrastructure-as-a-service (IaaS) in which all three are needed by Oracle to better serve its customers who have been clamoring for the company to provide cloud services, explained Ellison.

oracle-cloud


multitenant-oracle-db

The point about moving any database from your data center to the cloud is fascinating. It made me think about the future path to the cloud for many PLM vendors. Most of them use the Oracle database for core database functions. The specific architecture of each PLM product can be different, but having Oracle responsible for running the database in a cloud environment can be an interesting opportunity to simplify cloud architecture. Instead of hosting databases on IaaS platforms, PLM products can use the multi-tenant Oracle PaaS.

What is my conclusion? Major PLM vendors are looking at how to “cloud-enable” their existing products and software architectures. The promise to move a database from the data center to the cloud “like pushing a button” might be a bit on the marketing side. This is an alert for PLM software architects. IT managers responsible for PLM implementations can take note and ask how to move Enovia or Teamcenter into Oracle PaaS. Having the Oracle multi-tenant database run by Oracle PaaS is an interesting option, for sure. Just my thoughts…

Best, Oleg


Security and permissions are showstoppers to adopt search

June 25, 2014

search-top-secret

Search and information discovery is a big deal these days. Inspired by Google and other web search giants, we want information at our fingertips at the right time. I’ve been following the topic of search for a long time. You can jump to a few of my previous articles about search – Oslo & Grap – new trajectories in discovery and search; Why do engineers need exploratory search? and Pinterest will teach CAD companies to search.

You may think cost and complexity are the top problems of search technologies. Crunching lots of data and connecting relevant information requires applying the right resources and skills. You may be surprised, but there is one more element that drives the low adoption of search in manufacturing companies – security.

The Information Age article Enterprise search adoption remains low – survey speaks about a survey of 300 enterprise IT professionals conducted by Varonis Systems. According to this survey, enterprises are afraid a good search solution will allow people to find information they have no permission to see. Here is the passage that explains that:

The respondents were surveyed at two major security-themed industry events, the RSA Conference in February and Infosecurity Europe in April. When asked to choose the biggest obstacle to enterprise search adoption, 68% cited the risk of employees locating and accessing files they should not have permission to view. Further, even if an enterprise search solution perfectly filters out results based on established permissions, the majority of respondents indicated they are not confident that their organisation’s existing permissions are accurate. Additional obstacles to enterprise search adoption most commonly cited were accuracy of the results (36%), end user adoption (29%) and the ability of solutions to scale enough to index all the data (24%).

It made me think about the complexity of manufacturing companies and enterprise organizations in general. Established permissions are only part of the story. Search result permissions are only as good as the data that enterprise systems supply to the search software – GIGO (garbage in, garbage out). For many IT organizations, the management of security and permissions is a big deal. Think about a typical manufacturing company. Tomorrow, a search system could find all the CAD files that were occasionally copied and pasted to different locations and shared between organizations outside of existing PDM/PLM tools. What’s more, multiple "publishing solutions" have created a variety of published copies in different formats. Add SharePoint and similar technologies, sometimes adopted by divisions against the approval of central IT. A good search solution can be a litmus test for many IT organizations.
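The permission problem described above is usually handled by "security trimming" – filtering every hit against a document's access control list before it reaches the user. Here is a simple Python sketch with hypothetical data; note that if the ACLs supplied by the source systems are wrong, the trimming is wrong too, which is exactly the GIGO point:

```python
# Hypothetical sketch of security-trimmed search: each document carries
# an ACL of required groups; a hit is shown only if the user satisfies
# the ACL (an empty ACL marks a public document).

documents = {
    "CAD-001": {"title": "Bracket assembly", "acl": {"engineering"}},
    "CAD-002": {"title": "Classified design", "acl": {"engineering", "defense"}},
    "DOC-003": {"title": "Public datasheet", "acl": set()},
}

def search(query: str, user_groups: set) -> list:
    """Return titles of matching documents the user is allowed to see."""
    hits = []
    for doc_id, doc in documents.items():
        if query.lower() in doc["title"].lower():
            acl = doc["acl"]
            # Trim: public documents, or documents whose required groups
            # the user fully satisfies, pass through to the result list.
            if not acl or acl <= user_groups:
                hits.append(doc["title"])
    return hits

print(search("design", {"engineering"}))             # []
print(search("design", {"engineering", "defense"}))  # ['Classified design']
```

The survey's point maps directly onto this sketch: even when the trimming logic is perfect, organizations doubt that the `acl` values themselves reflect who should actually see each file.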

What is my conclusion? Manufacturing enterprises are complex. As I described, this complexity runs along strategic, political and cultural lines. Search is a disruptive technology that can cross these lines and expose many elements of corporate IT problems. So, once more, we learn that only a mix of technological and people skills can solve the problem. Strategists and technologists at search vendors should take note. Just my thoughts…

Best, Oleg



Dropbox Webhooks and Cloud PDM Pivoting

May 27, 2014

cloud-pdm-pivoting

Regardless of what CAD and PDM vendors want, engineers are going to share files on Dropbox and similar file sharing services (Google Drive, OneDrive, etc.). Do you remember my PLM cloud concerns and Dropbox reality for engineers post from two years ago? 34% of people in engineering departments were using Dropbox to share data. I don’t know what the number is now, but my hunch is that it is not going down.

I’ve been reading about interesting functionality added to Dropbox – webhooks. Navigate to the following Computerworld article to read more – Dropbox plays more nicely with Web apps. The following passage explains what the service does:

Web developers can now configure apps to be notified immediately of changes that users make to their Dropbox files, taking some strain off Web servers and potentially giving end users a better experience. The functionality comes via a new "webhooks" API (application programming interface) for Dropbox, which lets developers set up real-time notifications for their Web apps whenever users modify a Dropbox file.

More explanations can be found in Dropbox blog – Announcing Dropbox webhooks:

In general, a webhook is a way for an app developer to specify a URI to receive notifications based on some trigger. In the case of Dropbox, notifications get sent to your webhook URI every time a user of your app makes a file change. The payload of the webhook is a list of user IDs who have changes. Your app can then use the standard delta method to see what changed and respond accordingly.
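The flow described above can be sketched on the receiving side roughly like this. This is a simplified Python illustration, not Dropbox's SDK: the payload shape follows the description quoted above (a list of user IDs), and `fetch_delta_and_sync` is a hypothetical placeholder for whatever your PDM app does with Dropbox's delta method.

```python
import json

# Simplified webhook-receiver sketch, following the flow quoted above:
# Dropbox POSTs a list of user IDs whose files changed, and the app
# then calls the delta method for each user to see what changed.

def fetch_delta_and_sync(user_id: str) -> str:
    """Hypothetical placeholder: call Dropbox's delta API for this user
    and update the PDM database with the changed files."""
    return "synced changes for user " + user_id

def handle_webhook(request_body: str) -> list:
    """Parse the webhook payload and process each notified user."""
    payload = json.loads(request_body)
    user_ids = payload.get("delta", {}).get("users", [])
    return [fetch_delta_and_sync(uid) for uid in user_ids]

# Example payload shaped like the notification described above.
body = json.dumps({"delta": {"users": ["12345", "67890"]}})
print(handle_webhook(body))
# ['synced changes for user 12345', 'synced changes for user 67890']
```

For a cloud PDM experiment, this push model means the PDM side never polls Dropbox – the file change itself drives check-in, versioning, or notification logic.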

You can ask me how this is related to cloud PDM. Good question. In my view, this particular piece of Dropbox technology can simplify the development of any cloud PDM system. Dropbox developed a reference application in the tutorial. Another product referenced in the announcement is Picturelife. Simplify doesn’t mean cloud PDM will ultimately rely on Dropbox. Many companies don’t want to put their data on Dropbox. Maybe you remember my blog post – How to evaluate PDM before it ruins your personal productivity? Here is the thing – for most cloud PDM developers, user experience is the biggest issue. Dropbox is an ideal environment to kick off cloud PDM development experiments. The majority of companies that are not using PDM these days are using Dropbox. For these users, cloud PDM on top of Dropbox can be a "no brainer". Later on, additional infrastructure can be built and used.

What is my conclusion? Getting user traction is priceless. It requires lots of UX pivoting. Finding the right experience is one of the most critical first steps. Future technologies can be improved and fixed. There is a lot of open web infrastructure these days that can be used to build enterprise products (including PDM). Startup companies can pivot and experiment with user experience through Dropbox-based cloud PDM… Actually, established vendors can do the same. Just my thoughts…

Best, Oleg

picture credit http://www.freedigitalphotos.net/


Social PLM and Mobile Dribbling

May 6, 2014

Once, “social” was a hot topic for PLM developers and analysts. In my view, the hype has died down. Ask PLM people about social applications and be prepared for a very neutral response. I asked myself – why so? Social applications can improve the way people communicate and should bring value. Nevertheless, the social revolution in PLM is kind of “postponed”. You might be interested – will social apps be back, and when and how?

My attention was caught by the WSJ.D article – Data Point: Social Networking Is Moving on From the Desktop. These numbers made me think about the potential synergy between “social” and “mobile” in PLM.

social-network-mobile-social

Similar to social apps, the popularity of mobile PLM applications is not skyrocketing either. Once excited about the ability to run “everything from an iPad”, users went back to their desktops, CAD workstations, BOM Excel files and browser applications. Did PLM vendors miss the point of mobile? I asked about that two years ago here. The confusion between “mobile web” and “native app” is probably only part of the problem. As the world becomes even more distributed than it is today, the efficiency of mobile PLM applications and the intuitiveness with which a mobile app can present data become absolutely critical. However, a mobile app will be used only if it is easier and brings additional value. The best example is taking a picture during a presentation and sharing it via Twitter and/or Facebook.

Now I want to get back to the social PLM option. I just read about a new feature – you can tweet to Amazon to put a specific item into your shopping cart. Navigate here to read more. There is nothing very special here. It is all about efficiency. Imagine you find something you want to buy while browsing your Twitter stream in the morning. Staying in the same environment and putting an item into the Amazon shopping cart is all about efficiency. So, here is my guess. Social PLM can reinvent itself via the mobile option.

What is my conclusion? Efficient interaction is very important when you are out of the office and not connected to your well-organized desktop. So, a specially designed “social PLM” function can be in high demand on mobile devices. However, the fine-tuning of functionality and mobile experience is key – efficient user interaction combined with a valuable scenario. This is the key for mobile PLM and social PLM to be successful. Without these two elements, customers will keep walking away from social and mobile links back to the desktop. Just my thoughts…

Best, Oleg

photo source – WSJ-D article


How can cloud PLM reuse on-premise enterprise data?

April 7, 2014

plm-on-premise-data-sync

“Cloud” is becoming more and more a redundant word to attach to every technology we develop. I can hardly imagine anything we develop these days without the “cloud in mind”. This is absolutely true for PLM. Nowadays, it is all about how to make cloud technologies work for you and not against you.

For cloud PLM, the question of secure data usage is one of the most critical topics – especially if you think about existing large enterprise customers. These large companies started PLM adoption many years ago and have developed large data assets and custom applications. For them, data is one of the most important elements that can enable the use of cloud PLM.

The Networkworld article How Boeing is using the cloud caught my attention this morning. The writeup quotes Boeing chief cloud strategist David Nelson and speaks about a very interesting approach Boeing is using to deploy and use on-premise data on a public cloud. Here is the passage that outlines the approach:

Nelson first described an application the company has developed that tracks all of the flight paths that planes take around the world. Boeing’s sales staff uses it to help sell aircraft showing how a newer, faster one could improve operations. The app incorporates both historical and real-time data, which means there are some heavy workloads. “There’s lots of detail and analysis,” he says. It takes a “boatload” of processing power to collect the data, analyze it, render it and put it into a presentable fashion.

The application started years ago by running on five laptop computers that were synced together. They got so hot running the application that measures needed to be taken to keep them cool, Nelson said. Then Nelson helped migrate the application to the cloud, but doing so took approval from internal security, legal and technology teams.

In order to protect proprietary Boeing data the company uses a process called “shred and scatter.” Using software supported by a New Zealand firm, GreenButton, Boeing takes the data it plans to put in the cloud and breaks it up into the equivalent of what Nelson called puzzle pieces. Those pieces are then encrypted and sent to Microsoft Azure’s cloud. There it is stored and processed in the cloud, but for anything actionable to be gleaned from the data, it has to be reassembled behind Boeing’s firewall.

It made me think about one of the most critical things that will define the future development and success of cloud PLM technologies and products – data connectivity and on-premise/cloud data sync. Here is my take on this challenge. It is easy to deploy and start using cloud PLM these days. However, a PLM system without customer data is not very helpful. Yes, you can manage processes and new projects. However, let’s state the truth – you need access to legacy data to fully operate your PLM software at the enterprise level. Manufacturing companies are very sensitive about their data assets. So, developing "shred and scatter" data sync approaches can be an interesting path to unlock cloud PLM for large enterprise customers.
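A toy version of such a "shred and scatter" scheme can be sketched in a few lines of Python. This is purely illustrative – not GreenButton's actual implementation – but it shows the essential property: the cloud side holds only encrypted, randomly identified pieces, while the manifest needed to reassemble them stays behind the firewall.

```python
import secrets

# Toy "shred and scatter" sketch: split data into pieces, encrypt each
# piece with its own one-time key, and store the pieces under random
# IDs "in the cloud". Reassembly requires the key/order manifest,
# which never leaves the on-premise side.

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ k for b, k in zip(data, key))

def shred_and_scatter(data: bytes, piece_size: int = 8):
    cloud_store = {}   # what the cloud provider sees: random IDs -> ciphertext
    manifest = []      # kept on-premise: (piece_id, key) in original order
    for i in range(0, len(data), piece_size):
        piece = data[i:i + piece_size]
        key = secrets.token_bytes(len(piece))   # one-time pad per piece
        piece_id = secrets.token_hex(8)         # random, order-hiding ID
        cloud_store[piece_id] = xor(piece, key)
        manifest.append((piece_id, key))
    return cloud_store, manifest

def reassemble(cloud_store: dict, manifest: list) -> bytes:
    """Only possible with the on-premise manifest of IDs and keys."""
    return b"".join(xor(cloud_store[pid], key) for pid, key in manifest)

cloud, manifest = shred_and_scatter(b"proprietary flight-path data")
print(reassemble(cloud, manifest))  # b'proprietary flight-path data'
```

The design choice worth noting is that processing can happen on the cloud side piece by piece, but nothing actionable can be gleaned there – exactly the property Nelson describes for Boeing's flight-path data.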

What is my conclusion? I see cloud data sync as one of the most important cloud PLM challenges these days. Retrieving data from an on-premise location in a meaningful way and bringing it to the cloud in a secure manner is the showstopper for broad large-enterprise adoption. By solving this problem, cloud PLM vendors will open the gate for large enterprises to leverage the public cloud. It is a challenge for top enterprise PLM vendors today and clearly an entrance barrier for startup companies and newcomers in the PLM world. Just my thoughts…

Best, Oleg

