Will PLM Vendors Jump into Microsoft Cloud Window in Europe?

April 11, 2014

european-plm-cloud

The cloud is raising lots of controversy in Europe. While manufacturing companies in the U.S. are generally more open to new technologies, their European counterparts are much more conservative. Many of my industry colleagues in Germany, France, Switzerland and other EU countries can probably confirm that. Europe is coming to cloud systems, but much more slowly. I’ve been posting about cloud implications and constraints in Europe. Catch up on my thoughts here – Will Europe adopt cloud PLM? and here – PLM cloud and European data protection reforms. The main cloud concerns raised by European customers are data, privacy and country-specific regulation. For companies with locations across the EU, this can be a challenge.

Earlier today, I heard some good news about cloud proliferation in Europe coming from Microsoft. The TechCrunch article – Microsoft’s Enterprise Cloud Services Get A Privacy Thumbs Up From Europe’s Data Protection Authorities – reports that Microsoft’s enterprise cloud services meet data privacy standards across Europe. Here is a passage that sheds some light on the details and what it means:

But today comes a piece of good news for Redmond: the data protection authorities (DPAs) of all 28 European member states have decided that Microsoft’s enterprise cloud services meet its standards for privacy. This makes Microsoft Azure, Office 365, Microsoft Dynamics CRM and Windows Intune the first services to get such approval. The privacy decision was made by the “Article 29 Data Protection Working Party,” which notes that this will mean that Microsoft will not have to seek approval of individual DPAs on enterprise cloud contracts. In its letter to Microsoft (embedded below), chair Isabelle Falque-Pierrotin writes, “The MS Agreement, as it will be modified by Microsoft, will be in line with Standard Contractual Clause 2010/87/EU… In practice, this will reduce the number of national authorizations required to allow the international transfer of data (depending on the national legislation).”

The majority of PDM/PLM providers are friendly with the Microsoft technology stack. Some of them rely completely on MS SQL Server and other Microsoft technologies. Most of them support SharePoint. Now these PLM vendors have an additional incentive to stay with Microsoft technologies for the cloud. It can also be good news for manufacturing companies that have already deployed PDM/PLM solutions on top of Microsoft technologies and developed custom solutions.

What is my conclusion? The technological landscape these days is very dynamic. The time when one platform worked for everybody is over. In light of technological disruption and future challenges, tech giants will use different strategies to stay relevant for customers. Will European cloud regulation keep PDM/PLM players on MS Azure and other Microsoft technologies rather than alternative cloud stacks? How fast will other players reach the same level of compliance? These are good questions to ask vendors and service providers. Just my thoughts…

Best, Oleg


How can cloud PLM reuse on-premise enterprise data?

April 7, 2014

plm-on-premise-data-sync

The word "cloud" is becoming almost redundant as a qualifier for every technology we develop. I can hardly imagine anything we build these days without the cloud in mind. This is absolutely true for PLM. Nowadays, it is all about how to make cloud technologies work for you and not against you.

For cloud PLM, the question of secure data usage is one of the most critical topics, especially if you think about existing large enterprise customers. These companies started PLM adoption many years ago and have developed large data assets and custom applications. For them, data is one of the most important elements that can enable the use of cloud PLM.

The Networkworld article How Boeing is using the cloud caught my attention this morning. The writeup quotes Boeing chief cloud strategist David Nelson and describes a very interesting approach Boeing is using to deploy and use on-premise data on a public cloud. Here is the passage that outlines the approach:

Nelson first described an application the company has developed that tracks all of the flight paths that planes take around the world. Boeing’s sales staff uses it to help sell aircraft showing how a newer, faster one could improve operations. The app incorporates both historical and real-time data, which means there are some heavy workloads. “There’s lots of detail and analysis,” he says. It takes a “boatload” of processing power to collect the data, analyze it, render it and put it into a presentable fashion.

The application started years ago by running on five laptop computers that were synced together. They got so hot running the application that measures needed to be taken to keep them cool, Nelson said. Then Nelson helped migrate the application to the cloud, but doing so took approval from internal security, legal and technology teams.

In order to protect proprietary Boeing data the company uses a process called “shred and scatter.” Using software supported by a New Zealand firm, GreenButton, Boeing takes the data it plans to put in the cloud and breaks it up into the equivalent of what Nelson called puzzle pieces. Those pieces are then encrypted and sent to Microsoft Azure’s cloud. There it is stored and processed in the cloud, but for anything actionable to be gleaned from the data, it has to be reassembled behind Boeing’s firewall.

It made me think about one of the most critical things that will define the future development and success of cloud PLM technologies and products – data connectivity and on-premise/cloud data sync. Here is my take on this challenge. It is easy to deploy and start using cloud PLM these days. However, a PLM system without customer data is not very helpful. Yes, you can manage processes and new projects. However, let’s state the truth – you need access to legacy data to fully operate your PLM software at the enterprise level. Manufacturing companies are very sensitive about their data assets. So developing "shred and scatter"-style data sync approaches can be an interesting path to unlock cloud PLM for large enterprise customers.
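To make the "shred and scatter" idea concrete, here is a toy sketch of the flow described in the Boeing quote: split a payload into puzzle pieces, encrypt each piece, ship only the ciphertext to cloud storage, and reassemble behind the firewall. This is my own illustration, not Boeing's or GreenButton's actual implementation – XOR with a random pad stands in for real encryption, and a plain dict stands in for the cloud store.

```python
import os

CHUNK = 16  # bytes per "puzzle piece"

def shred(data: bytes) -> list[tuple[bytes, bytes]]:
    """Split data into chunks and encrypt each with a one-time pad."""
    pieces = []
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        pad = os.urandom(len(chunk))             # key material stays on-premise
        cipher = bytes(a ^ b for a, b in zip(chunk, pad))
        pieces.append((cipher, pad))
    return pieces

def scatter(pieces):
    """Send only ciphertext to the 'cloud'; keep the pads behind the firewall."""
    cloud = {i: cipher for i, (cipher, _) in enumerate(pieces)}
    pads = {i: pad for i, (_, pad) in enumerate(pieces)}
    return cloud, pads

def reassemble(cloud, pads) -> bytes:
    """Only someone holding the pads (on-premise) can rebuild the data."""
    out = b""
    for i in sorted(cloud):
        out += bytes(a ^ b for a, b in zip(cloud[i], pads[i]))
    return out

data = b"proprietary flight-path records"
cloud, pads = scatter(shred(data))
assert reassemble(cloud, pads) == data  # round-trip works behind the firewall
```

The point of the sketch: the cloud side can store and process opaque pieces, but nothing actionable can be gleaned without the keys that never leave the company network.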

What is my conclusion? I see cloud data sync as one of the most important cloud PLM challenges these days. The inability to retrieve data from an on-premise location in a meaningful way and bring it to the cloud securely is a show stopper for broad enterprise adoption. By solving this problem, cloud PLM vendors will open the gate for large enterprises to leverage the public cloud. It is a challenge for top enterprise PLM vendors today and clearly an entrance barrier for startups and newcomers in the PLM world. Just my thoughts…

Best, Oleg


What metadata means in modern PDM/PLM systems

February 26, 2014

meta-data

Metadata is "data about data". If you have been in the PDM industry long enough, you probably remember how early EDM/PDM software defined its role as managing "data about CAD files" (metadata). However, that was a long time ago. The Wikipedia article defines two types of metadata – structural and descriptive. Here is a quote from the article:

The term is ambiguous, as it is used for two fundamentally different concepts (types). Structural metadata is about the design and specification of data structures and is more properly called "data about the containers of data"; descriptive metadata, on the other hand, is about individual instances of application data, the data content.

In my view, CAD/PDM/PLM uses both types. Since design is very structured and contains lots of rich semantic relations, the metadata about CAD files stored in a PDM system is structural. At the same time, descriptive metadata such as file attributes and information about people, projects and organizations can be applied to individual instances of CAD data (files) as well.
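The two types can be illustrated with a small sketch. The attribute names and values below are hypothetical, not taken from any particular PDM system – the point is only the split between the schema (structural) and one concrete instance (descriptive).

```python
structural_metadata = {
    # "data about the containers of data": the schema of a CAD file record
    "entity": "cad_file",
    "fields": {"part_number": "string", "revision": "string", "mass_kg": "float"},
    "relations": ["assembly_contains_part", "part_has_drawing"],
}

descriptive_metadata = {
    # "data about individual instances": one concrete file's attributes
    "file": "bracket_v3.sldprt",
    "part_number": "BRK-1042",
    "revision": "C",
    "author": "j.smith",
    "project": "Chassis-2014",
}

# A PDM system uses the structural side to validate the descriptive side:
missing = [f for f in structural_metadata["fields"] if f not in descriptive_metadata]
print(missing)  # fields declared in the schema but absent on this instance
```

Both halves live in the PDM database, which is why the "metadata" label covers such different things.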

Since the early EDM/PDM days, a lot has changed in the definition and usage of the word metadata. Some of the changes are very confusing. The traditional use and definition of files (for example, in the context of CAD files) is changing. Sometimes we want to keep "file" as a well-known abstraction, but the underlying meaning is completely different and points more to "data" or "design" than to actual files. Also, the introduction of web-based systems is changing the physical use of files. Accessing a file in Dropbox via a mobile application is a completely different kind of usage. In many scenarios you will never get access to the physical files at all.

The DBMS2 article Confusion about Metadata speaks about some additional aspects of metadata management that are getting more relevant these days, including data about mobile device usage (telephone metadata) and document data. Document data is getting more structured and often cannot be distinguished from structured RDBMS data. Here is an interesting passage that describes the transformation of database and document-based data:

[data about data structure] has a counter-intuitive consequence — all common terminology notwithstanding, relational data is less structured than document data. Reasons include: Relational databases usually just hold strings — or maybe numbers — with structural information being held elsewhere. Some document databases store structural metadata right with the document data itself. Some document databases store data in the form of (name, value) pairs. In some cases additional structure is imposed by naming conventions. Actual text documents carry the structure imposed by grammar and syntax.

Modern data management systems, and especially NoSQL databases such as document and key-value stores, can introduce new types of metadata. The IoT (Internet of Things) brings another level of complexity to data management. I can see many others coming.

What is my conclusion? I think the term metadata is getting outdated, at least in the scope of design and engineering. The original EDM/PDM notion of managing metadata about CAD files is not relevant anymore. Design data originally stored in CAD files is becoming more transparent and connected to the world outside of design. The whole data paradigm is changing. Just my thoughts…

Best, Oleg


Why do PLM vendors need to hire data scientists?

December 4, 2013

plm-data-knowledge

The importance of data is growing tremendously. Web, social networks and mobile started this trend just a few years ago. However, these days companies are starting to see that without a deep understanding of data about their activities, the future of their business is uncertain. For manufacturing companies, this touches fundamental business processes and decisions related to product portfolios, manufacturing and supply chain.

It sounds like PLM vendors are potentially the best fit to fulfill this job. PLM portfolios are getting broader and cover lots of applications, modules and experience related to the optimization of business activities. In one of my earlier blogs this month, I was talking about the new role of Chief Data Officer in companies. Navigate here to read it and draw your own opinion. However, making this job successful is mission impossible without a deep understanding of company data on both sides – company and vendors/implementers.

A few days ago, I was reading the InformationWeek article – Data Scientist: The Sexiest Job No One Has. The idea of the data scientist job is very interesting if you apply it beyond just storing data on file servers. Think about an advanced data analyst job focused on how a company can leverage its data assets in the best way. The amount of data companies generate is doubling every few months. Applying the right technology and combining it with human skills can be an interesting opportunity. Pay attention to the following passage:

The role of data scientist has changed dramatically. Data used to reside on the fringes of the operation. It was usually important but seldom vital — a dreary task reserved for the geekiest of the geeks. It supported every function but never seemed to lead them. Even the executives who respected it never quite absorbed it.

But not anymore. Today the role set aside for collating, sorting, and annotating data — a secondary responsibility in most environments — has moved to the forefront. In industries ranging from marketing to financial services to telecommunications, the data scientists of today don’t just crunch numbers. They view the universe as one large data set, and they decipher relationships in that mass of information. The analytics they develop are then used to guide decisions, predict outcomes, and develop a quantitative ROI.

So, who can become a data scientist in a manufacturing company? Actually, this major is still not defined in American colleges. Anybody with a good skillset in math, computer science and manufacturing domain knowledge can consider this work. I can clearly see it as an opportunity for CAD and PLM IT managers who have spent their careers installing on-premise PLM software and will be displaced once the software moves to the cloud environment.

What is my conclusion? In the past, installation and configuration skills were among the most important in the PDM/PLM business. The time vendors spent on system implementation was very significant. The PLM cloud switch is going to create a new trend – understanding of company data and business processes will become the #1 skill. So, PLM vendors had better start thinking about a new job description – people capable of understanding how to crunch manufacturing data to create value for customers. Just my thoughts…

Best, Oleg


How CAD/PLM can capture design and engineering intent

November 8, 2013

design-eng-intent

It was a big Twitter day. The Twitter IPO generated an overflow of news, articles and memorable stories. For me, Twitter has become a part of my working ecosystem – the place I use to capture news, exchange information and communicate with people. If you are on Twitter, try Vizify to visualize your Twitter account. I did it here. The most insightful finding for me was the fact that I tweet 24 hours a day… (well, I don’t know how Vizify deals with my time zone changes). It made me think about the impact a Twitter-like ecosystem could have on engineers and designers. It came as a continuation of my thoughts about the failure of Social PLM – Why Social PLM 1.0 failed and What PLM can learn from public social data?

I’ve been reading an article and interview with Biz Stone, Twitter co-founder and entrepreneur – Be emotionally invested. It is a good story. Read it, especially if you are involved in startup activity. One of the interesting pieces that caught my attention was a story about the Google working environment. Here is the passage:

“I used to just walk around. I don’t know if I was supposed to, but I’d just open doors and see what people were doing.” One led to a guy surrounded by DVRs. Stone asked what he was doing. “I’m recording everything being transmitted on TV all over the world.” Another led to “a sea of people operating illuminated foot-pedal scanning devices. “We’re scanning every book ever published.”

Another article that caught my attention was about an interesting behavior – deleted tweets. Navigate to read – Why do people delete their tweets? University of Edinburgh researchers have been looking into the motives behind deleted Twitter missives. You can read more about this study here. The funny part is that this mechanism implements the old idiom – a word spoken is past recalling. Here is a passage explaining the research and how it works:

Right now there’s no way to tell whether you’ll be proud of your rousing 140 character defense of James Franco in a few years, or deeply, deeply ashamed. But hiding the evidence isn’t hard. Deleting a tweet is not a complicated process. If you don’t like what you wrote, you can trash it in a few clicks. And there are services like Tweet Delete that help you mass-delete older tweets.

These two examples – capturing information streams from a global and a personal perspective – made me think about how we could potentially capture engineering activities and discover design intent and decision-making factors, similar to the techniques used to identify deleted tweets and other Twitter user behaviors. The challenge in the CAD/PLM environment, compared to Twitter, is obviously security and open APIs. It is hard to capture information from design and engineering systems. In most cases, the information is secured and access is restricted.

What is my conclusion? There is huge potential in analyzing design and engineering activity by capturing information about people’s behavior. My hunch is that it can become one of the areas CAD/PLM companies and startups might crack to unlock the future potential of design optimization and decision making. Just my thoughts…

Best, Oleg


What is the future of PLM data analytics?

October 31, 2013

analytics

The amount of data we produce is skyrocketing these days. The social web, mobile devices, the Internet of Things – these are only a few examples of the data sources that are massively changing our life. The situation in the business space is not much different. Companies are more and more involved in connected business. Design, supply chain, manufacturing – all activities produce a significant amount of digital information every minute. In design, the complexity of products is growing significantly. Personalization, customer configurations and many other factors create a significant data flow. Simulation is another space that can potentially bring a significant amount of data. I was watching the presentation "Workspace 2020" made by Dion Hinchcliffe last week at the forum "Transform the way work gets done". Navigate to the SlideShare slides and take a look. One of the numbers struck me – the speed of data growth. Now (in 2013) we double our data every 2 years. By 2020, we are going to double the data every 3 months.

era-of-change

The massive amount of data brings up the question of how engineering, design and manufacturing systems can handle this data and produce meaningful insight and decision support for engineers and other people involved in the development process. The question of data analytics came to my mind. In the past, data analytics was usually associated with a long, complicated and expensive process involving IT, resources, time and programming. The cloud and the new computing ecosystem are changing this space. I was reading the announcement made by Tableau (an outfit focused on analytic dashboards and tools) – Tableau Software partners with Google to Visualize Big Data at Gartner IT Symposium.

The partnership mixes Tableau’s analytics with the Google Cloud Platform. The technology was presented at the Gartner conference in Orlando recently. Here is an interesting passage explaining what Tableau did with Google:

“Tableau and Google created a series of dashboards to visualize enormous volumes of real-time sensory data gathered at Google I/O 2013, Google’s developers’ conference. Data measuring multiple environmental variables, such as room temperature and volume, was analyzed in Tableau and presented to attendees at the Gartner event. With Tableau’s visual analytics, Gartner attendees could see that from the data created, I/O conference managers could adjust the experience and gain insights in real time, like re-routing air-conditioning to optimize power and cooling when rooms got too warm.”
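The real-time roll-up described in the quote – aggregating environmental sensor readings and reacting when rooms get too warm – can be reduced to a very small sketch. The room names, readings and the 26°C threshold below are made up for illustration; a real deployment would stream this through BigQuery and a dashboard rather than a Python loop.

```python
from collections import defaultdict

readings = [  # (room, temperature_celsius), as they might stream in
    ("Hall A", 24.5), ("Hall B", 27.1), ("Hall A", 25.0),
    ("Hall B", 27.9), ("Hall C", 22.3),
]

THRESHOLD_C = 26.0

# Running sum and count per room, so averages update as readings arrive
sums = defaultdict(lambda: [0.0, 0])
for room, temp in readings:
    sums[room][0] += temp
    sums[room][1] += 1

averages = {room: total / n for room, (total, n) in sums.items()}
too_warm = sorted(room for room, avg in averages.items() if avg > THRESHOLD_C)
print(too_warm)  # rooms where cooling should be re-routed
```

The interesting part is not the arithmetic but the loop: insight ("Hall B runs warm") is available while the event is still happening, which is what makes it actionable.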

I found the following video interesting – it explains how easily you can build analytics with Tableau and Google BigQuery.

It made me think about the future potential of analytics in the design and engineering process – analyzing huge amounts of simulation, customer and operational data. Just think about combining data collected from products in the field with simulation analyses that can be used to improve design decisions. Does it sound crazy and futuristic? Yes, I think so. However, many other things we considered crazy just a few years ago have become reality now.

What is my conclusion? Data analytics is one of the fields with the potential to improve the design and engineering process by analyzing significant amounts of data. Think about leveraging cloud infrastructure and network and graph databases combined with visual and intuitive analytic tools. It is not where we are today, but this is how the future will look. Just my thoughts…

Best, Oleg


What can PLM learn from public social conversation?

October 16, 2013

social-data-plm-20

Social PLM is not cool anymore. The article Why is Social PLM DOA? from PLMJim popped up in my Twitter stream earlier today and drew my attention to the social PLM topic again. In my view, it is getting nasty this time. We loved the idea of social communication and how these tools could improve collaboration. However, it all went wrong. Here is the passage Jim uses to explain that:

There have been many articles on the value that social tools can bring to your business. However, the uptake of social tools within Engineering organizations in the guise of social PLM has been very low; possibly non-existent. Why is this the case? Is there no value in Engineering for social tools, or is it just hard to exploit these tools in the product development environment? There is clearly a need for more social collaboration during product design, so it would stand to reason that these social tools would have some value. As I have introduced many engineers to Social PLM in my PLM Certificate Education classes, I have often wondered about the lack of enthusiasm for these kinds of tools.

So, after all the hype, the solution is not there and the demand is near zero. I’ve been thinking about the social topic and PLM for quite some time, and I agree with Jim’s point – maybe not in such a disruptive form. I called it Why social PLM 1.0 failed? at the beginning of the year.

However, here is the deal. The more I think about social, the more I’m convinced that PLM vendors and startups in the “social PLM” domain took the wrong approach by trying to convince everyone that “social collaboration” would provide a silver bullet to improve communication between people. In my view, that is totally wrong. People are locked in silos and are not interested in getting out of them. This is how organizations work, and the best communication tools cannot change this trend for the moment.

So, what is the potential future leverage of social tools in engineering and manufacturing? In my view, social data is a potential Klondike for manufacturing companies. It is a place where they can get ideas about customer demands, future product improvements and existing product failures.

I’ve been reading the makeuseof article Facebook usage is changing, so which online activities are growing? The article brings an interesting perspective on what is happening with social tools. It speaks about the variety of tools people use, sometimes anonymously, to publish social information – tools like Tumblr, Instagram, WhatsApp and Twitter. Some of them are well known, and some are not very popular in the professional social space.

Manufacturing companies are sensitive to social activities these days. Look at how Tesla CEO Elon Musk handled the Tesla Model S fire very recently. It will give you a sense of the potential value, and danger, manufacturing companies can experience by missing this type of sensitive social information.

tesla-social-model-s-fire

What is my conclusion? I think Social PLM 1.0 was a nice try and… a failure. It was a good lesson in how dangerous it is to mimic something buzzy and hyped without focusing on value for individuals and companies. I’m expecting to see a new Social PLM 2.0 come soon with a new agenda, new ideas and lessons learned. Social data has huge potential. Failing to leverage this potential would be a huge mistake for PLM vendors. Just my thoughts…

Best, Oleg


Will PLM Data Size Reach Yottabytes?

October 14, 2013

bigdatasizing

Everybody speaks about big data today. It is probably one of the most overhyped and confusing terms. It goes everywhere and means different things depending on who you are talking to. It can be data gathered from mobile devices, traffic data, social media and social networking activity data. The expectation is that the size of big data will go through the roof. Read the Forbes article Extreme Big Data: Beyond Zettabytes And Yottabytes. The main point of the article – we produce data faster than we can invent names for it. Here is the scale we are more or less familiar with – TB terabyte, PB petabyte, EB exabyte, ZB zettabyte, YB yottabyte…

However, the article also brings an interesting lingo of data sizes. Here are some examples: Hellabytes (a hell of a lot of bytes), Ninabytes, Tenabytes, etc. Wikipedia provides a different option to extend the prefix system – zetta, yotta, xona, weka, vunda, uda, treda, sorta, rinta, quexa, pepta, ocha, nena, minga, luma, … Another interesting comparison came from an itknowledgeexchange article. Navigate here to read more. Here is my favorite passage. The last comparison, to Facebook, is the most impressive.

Beyond what-do-we-call-it, we also have the obligatory how-to-put-it-in-terms-we-puny-humans-can-understand discussion, aka the Flurry of Analogies that came up when IBM announced a 120-petabyte hard drive a year ago. Depending on where you read about it, that drive was: 2.4 million Blu-ray disks; 24 million HD movies; 24 billion MP3s; 6,000 Libraries of Congress (a standard unit of data measure); Almost as much data as Google processes every week; Or, four Facebooks.
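The analogies in the quote are easy to sanity-check with back-of-the-envelope arithmetic. The unit sizes below are my assumptions (dual-layer Blu-ray = 50 GB, HD movie = 5 GB, MP3 = 5 MB), not figures from the article, but they reproduce the quoted numbers:

```python
# Back-of-the-envelope check of the 120-petabyte drive analogies.
PB = 10**15
GB = 10**9
MB = 10**6

drive = 120 * PB

blu_rays = drive / (50 * GB)   # assuming 50 GB dual-layer disks
hd_movies = drive / (5 * GB)   # assuming 5 GB per HD movie
mp3s = drive / (5 * MB)        # assuming 5 MB per MP3

print(f"{blu_rays:,.0f} Blu-ray disks")   # 2,400,000
print(f"{hd_movies:,.0f} HD movies")      # 24,000,000
print(f"{mp3s:,.0f} MP3s")                # 24,000,000,000
```

So the "2.4 million Blu-ray disks" and "24 billion MP3s" claims are internally consistent with each other.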

The Forbes article made me think about the sizes of PLM data, engineering data and design data. It is not unusual to speak about CAD data and/or design data as something very big. Talk to any engineering IT manager and he will tell you about oversized CAD file libraries. Large enterprise companies (especially in regulated industries) are concerned about how to store data for 40-50 years: what format to use, how much space it will take and how it will remain accessible. At the same time, I’ve seen complete libraries of CAD components, together with all design data, from mid-size companies backed up on a simple 1TB USB drive. I believe software like simulation can produce lots of data, but today this data is not controlled and simply gets lost on desktops. One of the most popular PDM requirements from engineers was the ability to delete old revisions. PLM repositories for items and bills of materials can reach a certain size, but I can hardly see how they compete with the Google and Facebook media libraries. At the same time, engineering is just about to explore the richness of online data and the Internet of Things. So, the size of engineering repositories will only grow.

What is my conclusion? Compared to Google, Twitter and Facebook scale, the majority of engineering repositories today are modestly sized. After all, even very large CAD files can hardly compete with the photo and video streams uploaded by a billion people on social networks. Also, tracking data captured from mobile devices outsizes every possible Engineering Change Order (ECO) record. However, engineering data has the potential to become big. Increased interest in simulation and analysis, as well as design options, can bump up the size of engineering data significantly. Another potential source of information is the increased ability to capture customer interests and requirements, as well as product behavior, online. Just my thoughts. So, how fast will PLM grow to yottabytes? What is your take?

Best, Oleg


PLM, Google Knowledge Graph and Future Decision Support

September 27, 2013

GoogleKnowledgeGraph

Do you remember life without Google? It was only 15 years ago. You have to agree, information was less available back in those days. Fast forward – good news! Google made a change to its web search system. A ReadWrite article says – Now You Can Ask Google Search To Compare, Filter And Play. Another article from TheNextWeb – Google unveils search updates for mobile, new Page Rank algorithm, and Knowledge Graph comparisons.

You probably recall my post from last year – Why PLM needs to learn about Google Knowledge Graph? That was written around the first publication about GKG, when it wasn’t clear what use Google would make of the Knowledge Graph. Now you can see some examples – type "wine vs beer" or "eiffel tower vs empire state building" into your browser and you can expect some meaningful results about the objects.

I especially liked the Eiffel Tower vs. Empire State Building example. It gives you a perspective on where data exploration and search can go. People are not interested in search results as such.

wine-vs-beer

People are looking for meaningful information. In that context, GKG is the right way to go – it accumulates knowledge and can provide it in a consumable way.

empire-vs-eiffel

So, how is this related to PLM data, you may ask? Here is my hunch. People are not interested in merely finding assemblies and parts anymore. Data-driven decisions are getting more and more focus. The complexity of design decisions is skyrocketing, and engineers face it every day. Engineers will need decision support. Try typing Vishay DF10M-E3/45 vs. Vishay MB6S-E3/80. Don’t expect Google to help you. These are special types of rectifiers – Google doesn’t know what a Miniature Glass Passivated Single-Phase Bridge Rectifier is. Maybe in a few years? There are other systems that can produce better results, like the one below. However, we still have a long way to go.
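The "wine vs beer" trick over part numbers boils down to comparing structured attributes side by side. Here is a minimal sketch of what such a comparison could look like over engineering data – the attribute names and values below are illustrative placeholders, not taken from the real Vishay datasheets:

```python
parts = {
    "Vishay DF10M-E3/45": {"package": "DFS", "i_f_avg_A": 1.0, "v_rrm_V": 1000},
    "Vishay MB6S-E3/80":  {"package": "MBS", "i_f_avg_A": 0.5, "v_rrm_V": 600},
}

def compare(a: str, b: str) -> dict:
    """Return {attribute: (value_a, value_b)} for every shared attribute."""
    shared = parts[a].keys() & parts[b].keys()
    return {attr: (parts[a][attr], parts[b][attr]) for attr in sorted(shared)}

for attr, (va, vb) in compare("Vishay DF10M-E3/45", "Vishay MB6S-E3/80").items():
    print(f"{attr:12} {va!s:>8} vs {vb!s:>8}")
```

A knowledge graph does the hard part that this sketch skips: knowing that both objects are bridge rectifiers and which attributes are comparable in the first place.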

Vishay MB6S-E3

What is my conclusion? It is getting more difficult to locate correct product information in an environment of growing complexity. Engineers are looking for ways to make the right decisions. In most cases, they make these decisions based on past experience rather than the right information. This is a place where we should expect some changes in the future. We are just at the beginning. Product data should be collected, analyzed and presented in a way that helps engineers make the right decision. Just my thoughts…

Best, Oleg


PLM, Data and Automotive Manufacturing

June 7, 2013

ford-data-plm.jpg

CAD and PLM vendors have a long history of developing products for the automotive industry. Major OEMs and their suppliers were among the first customers in the history of computer systems for design and manufacturing. The days when car development was mostly about mechanical design with a small amount of electrical wiring are gone completely. Today car manufacturing is a complex process, and every car contains over 100M lines of code. How to improve the car design and manufacturing process is the question engineering and manufacturing software is supposed to solve, these days and in the future.

In the past, the main focus of CAD and PLM systems was automating the work of engineers, placing manufacturing orders and sometimes organizing the supply chain. It is much more complicated these days. Customers are getting involved in the design process. Customized designs and an increased number of configurations are not unusual anymore. Customer feedback is getting more important: what your customers are saying about the car experience, and what happens with the car after it leaves the manufacturing facility – dealerships, maintenance and many other things. The supply chain is getting more complex and requires more sophisticated optimization.

One of the many lessons companies are learning these days – data matters. Having data about what you do, and how to do your business better, becomes key, and in some situations can be a game changer for a business. So where does this leave automotive companies? I’ve been reading an interesting article by GigaOM – How data is changing the car game for Ford. It covers a few very interesting areas where data can provide a competitive advantage for a car manufacturer. The following passage was my favorite:

Mashing up data sources such as social and sales in order to find insights is a pretty easy sell, Cavaretta explained, but getting people to put sensors in everything and collect data every second or with every transaction can still be a bit challenging. In part, this is just a lingering effect of the constraints that legacy technologies imposed on the company. It wasn’t possible to store all this data, so people just got accustomed to the status quo of summarizing data hourly, for example.

Now, however, he’s pushing them to “dial it down” and collect data at the lowest level possible and as often as possible. In manufacturing alone, he explained, there are between 20,000 and 25,000 parts in any given vehicle, and there’s a supply chain that spans from parts suppliers all the way up to dealerships. Getting a complete view of this process could help drive serious efficiencies and, Cavaretta said, “We don’t see anything but big data technologies that can get us there.”

The obvious question to ask – what does it mean for CAD/PLM companies, and how can the enterprise software business leverage it? Manufacturing companies are under significant pressure to improve processes and cut costs. This is the new nightmare of modern manufacturing. It is not enough to apply the "next big thing" like agile, kanban or something else. Customers expect software companies to provide a connected experience that helps manufacturing companies focus on goals, not just on design, engineering and manufacturing. Providing design tools is not enough these days.

What is my conclusion? Decisions matter. Companies are looking for software that helps them make decisions. What is the right part to use? What is the right supplier to work with? What is the right price to pay? Improving existing processes based on actual data can be an interesting opportunity. Just my thoughts…

Best, Oleg

