Will public clouds help enterprises to crunch engineering data?

August 6, 2014

google-data-center-crunches-engineering-data

The scale and complexity of engineering data is growing tremendously these days. If you go back 20 years, the challenge for PDM / PLM companies was how to manage revisions of CAD files. Now much more data is coming into the engineering department: data about simulations and analysis, information about the supply chain, online catalog parts and lots of other things. Product requirements have been transformed from a simple Word file into complex data with information about customers and their needs. Companies are starting to capture information about how customers are using products. Sensors and other monitoring systems are everywhere. The ability to monitor products in real life creates additional opportunities – how to fix problems and optimize design and manufacturing.

Here is the problem… Despite the strong trend towards cheaper computing resources, applying brute computing force still doesn’t come for free. Services like Amazon S3 are relatively cheap. However, if you want to crunch, analyze and/or process large sets of data, you will need to pay. Another aspect is performance. People expect software to work at the speed of thought. Imagine you want to produce design alternatives for your future product. In many situations, waiting a few hours won’t be acceptable. It will distract users, and they won’t use such a system at all.

The Manufacturing Leadership article Google’s Big Data IoT Play For Manufacturing speaks exactly about that. What if the power of web giants like Google could be used to process engineering and manufacturing data? I found the explanation provided by Tom Howe, Google’s senior enterprise consultant for manufacturing, quite interesting. Here is the passage explaining Google’s approach.

Google’s approach, said Howe, is to focus on three key enabling platforms for the future: 1/ Cloud networks that are global, scalable and pervasive; 2/ Analytics and collection tools that allow companies to get answers to big data questions in 10 minutes, not 10 days; 3/ And a team of experts that understands what questions to ask and how to extract meaningful results from a deluge of data. At Google, he explained, there are analytics teams assigned to every functional area of the company. “There’s no such thing as a gut decision at Google,” said Howe.

It sounds to me like a viable approach. However, it made me think about what would make Google and similar holders of computing power sell it to enterprise companies. Google’s biggest business is not selling computing resources. Google’s business is selling ads… based on data. My hunch is that there are two potential reasons for Google to support manufacturing data initiatives – the potential to develop a Google platform for manufacturing apps, and the value of data. The first one is straightforward – Google wants more companies in its eco-system. I find the second one more interesting. What if manufacturing companies and Google found a way to get insight from engineering data that is useful for Google’s business? Or even more – that improves its core business?

What is my conclusion? I’m sure that in the future data will become the next oil. The value of getting access to the data can be huge. The challenge of getting that access is significant. Companies won’t simply allow Google – or PLM companies – to use the data. Companies are very concerned about IP protection and security. Balancing between accessing data, providing a value proposition and gleaning insight and additional information from data can be an interesting play. For all parties involved… Just my thoughts…

Best, Oleg

Photo courtesy of Google Inc.


Why PLM shouldn’t miss the next email move

July 18, 2014

plm-email

Email is the king of communication in every company. Many companies are literally run by email. People use it for different purposes – notification, collaboration and, very often, even record management. You can hear many discussions about how companies can replace or integrate email with enterprise and social collaboration tools. I captured some of them in my previous blog posts: How engineers find path from emails and messages to collaboration?; PLM Workflows and Google Actionable Emails; DIY PLM and Zero Email Policy; PLM Messaging and WhatsApp Moment.

You may think email doesn’t change. I wanted to share with you two interesting examples of changes and innovation in email that caught my attention over the last few weeks. The Verge article speaks about the Gmail API announcement.

Google announced that any app could now talk to Gmail using today’s faster, more modern languages — languages that every web developer speaks. The Gmail API lets you ask Google for threads, messages, drafts, and labels three to ten times faster than with IMAP. What it can do is provide an interface for any app to interact on a small scale with your Gmail account without having to create an entire mail client. When that happens, Google won’t have replaced email — it will have actually extended it. Instead of killing email as some hoped it would, the Gmail API gives email new life.

The following video presents some additional details about Gmail API usage. Take 5 minutes to watch it, especially the parts about integration between Gmail and enterprise systems.
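To make the idea above more concrete, here is a minimal Python sketch of working with the kind of message resource the Gmail API returns. The dictionary shape follows the `users.messages.get` response format (a `payload.headers` list of name/value pairs); actually fetching a message requires OAuth and the google-api-python-client, which is out of scope here. The sample message content (addresses, subject) is invented for illustration.

```python
def header(message, name):
    """Return the value of a named header from a Gmail-style message resource."""
    for h in message.get("payload", {}).get("headers", []):
        if h["name"].lower() == name.lower():
            return h["value"]
    return None

# Hand-made sample shaped like a users.messages.get response
sample = {
    "id": "14e9abc",
    "payload": {
        "headers": [
            {"name": "From", "value": "eng@example.com"},
            {"name": "Subject", "value": "ECO-1042 approved"},
        ]
    },
}

print(header(sample, "Subject"))  # -> ECO-1042 approved
```

The point for PLM integrators: a few lines like this are enough to pull structured facts out of email, without building an entire mail client – exactly the extension-not-replacement play the article describes.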

Another example comes from TNW article – Inbox launches as an open-source email platform to replace legacy protocols.

A new startup, Inbox, is launching its “next-generation email platform” as an alternative to aging protocols like IMAP and SMTP. The core of Inbox’s efforts is an Inbox Sync Engine for developers that adds a modern API on top of mail providers, including Gmail, Yahoo and Outlook.com.

As stated in the article, Inbox is a platform play. The founders’ intent is to create a new generation of messaging platform. And it is an open-source play. The first step for Inbox is to create a sync engine that can expose existing email providers:

The core of Inbox is an open source sync engine that integrates with existing email services like Gmail, and exposes a beautiful, modern REST API. We’re pleased to announce that beginning today, you can download the Inbox engine, sync an account, and begin building on top of Inbox in your local development environment.
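A REST API on top of a locally running sync engine could be consumed along these lines. This is a sketch only: the base URL, port, route shape and thread fields below are my assumptions for illustration, not the documented Inbox endpoints – check the engine’s own docs. URL construction and thread filtering are kept as pure functions so they can be exercised without a running engine.

```python
BASE = "http://localhost:5555"  # assumed local dev address of the sync engine

def threads_url(namespace_id, **params):
    """Build the URL for listing threads in a namespace (assumed route shape)."""
    query = "&".join(f"{k}={v}" for k, v in sorted(params.items()))
    url = f"{BASE}/n/{namespace_id}/threads"
    return f"{url}?{query}" if query else url

def unread(threads):
    """Filter thread dicts down to the ones still tagged 'unread'."""
    return [t for t in threads if "unread" in t.get("tags", [])]

# Invented sample of what thread resources might look like
threads = [
    {"id": "t1", "subject": "ECO review", "tags": ["inbox", "unread"]},
    {"id": "t2", "subject": "Weekly report", "tags": ["inbox"]},
]
print(threads_url("ns1", limit=10))        # -> http://localhost:5555/n/ns1/threads?limit=10
print([t["id"] for t in unread(threads)])  # -> ['t1']
```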

These articles made me think about a potential play PLM and engineering applications can make by building their collaboration applications tightly integrated with email services. It would allow better communication between people and ease of data integration between PLM solutions and communication platforms such as email. You may see it as a pure technical play. Who cares how email and data are integrated? However, in my view, this is a place where differentiation in user experience and seamless data integration can become critical to drive user adoption.

What is my conclusion? It is very hard to change people’s habits. Email is part of our everyday routine. Existing systems are integrated with email, but the way it is done, as well as the level of data integration, is very sporadic. Lots of unstructured data about customers, engineering decisions, requirements and many other things is stuck in email and lost there forever. A new email approach may help to create transparent and seamless integration between business applications and email. It can make a difference for users. Just my thoughts…

Best, Oleg


CAD companies and cloud storage strategy

July 7, 2014

cad-cloud-storage-strategy

Cloud storage is changing fast these days. From a relatively small portion of our life limited mostly to online email, cloud storage is growing into the space where the majority of our activities happen. Email, photo storage, online documents, calendars, shopping – this is only a short list. Changes are coming to the corporate world as well. The New York Times article Google, Microsoft and Others Delve Deeper Into Cloud Storage for Businesses speaks about the trajectories of cloud storage and business. Here is an interesting passage:

Soon, keeping your digital goods will also be the means for tech companies to understand who does what inside a business, just the way they understand consumers by watching what they do on the web. “Storage is where the stickiness is,” said Jeffrey Mann, vice president for research at Gartner. “It’s how they hold a customer. If they store your stuff, they get to know you better.”

So, you may think the strategy is to hold data and keep customers hostage to storage. It might sound like a short-term strategy. However, web giants don’t see storage as something that will hold companies strategically. The following passage can give you a feeling of the direction:

“Cloud storage is a temporary market,” said Scott Johnston, director for product management for Drive, Google’s online storage, document creation and collaboration business. “In the future it will be about elevating productivity: How do we look for patterns? What does it mean if a document is read by 10 percent of the company? What does it mean if you haven’t read it yet?" It’s a strategy that Microsoft is also pursuing with its OneDrive product. Dropbox, a storage site popular with consumers, and Box, a storage and collaboration site specifically for business, are both also working on ways to turn data storage into something that provides greater insight into how people are working. Dropbox started a business offering last year.

This point of view made me think about what a potential strategy of CAD companies could be with regard to cloud storage and the operation of CAD systems. The majority of the CAD business today is not in the cloud. CAD files and related information are stored on desktop computers and local area networks. How big is this data, and how easily and transparently can companies move it to the cloud (private and public) and make it available for collaboration? The demand for better collaboration is huge. CAD vendors are working on cloud CAD systems, but this work is just beginning. Cloud storage of CAD files with seamless access by existing desktop CAD systems can be a short-term CAD file management strategy. The most interesting part comes next. If I follow Google’s logic, companies could analyze massive amounts of CAD data and use it for future product design improvement and better work organization.
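As a toy illustration of the kind of analysis cloud-stored CAD data could enable, consider spotting duplicated geometry across projects by comparing content checksums – a crude proxy for design re-use (or the lack of it). The record fields (`project`, `checksum`) and the sample data are invented for this sketch; real CAD analytics would work on much richer metadata.

```python
from collections import Counter

def duplicate_ratio(files):
    """Fraction of files whose geometry checksum appears more than once."""
    counts = Counter(f["checksum"] for f in files)
    dupes = sum(1 for f in files if counts[f["checksum"]] > 1)
    return dupes / len(files) if files else 0.0

files = [
    {"project": "A", "checksum": "c1"},
    {"project": "B", "checksum": "c1"},  # same bracket re-modeled in project B
    {"project": "B", "checksum": "c2"},
    {"project": "C", "checksum": "c3"},
]
print(duplicate_ratio(files))  # -> 0.5
```

A high ratio would suggest engineers are re-modeling parts they could be re-using – exactly the kind of insight that only becomes visible once the data sits in one place.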

What is my conclusion? Data is the fuel for future growth. Whoever captures CAD data these days will have the ability to run analytics and make that data part of a future design strategy. In most cases today, companies have very limited capability to re-use designs, run analysis and predict future improvements. Cloud storage can be a first step towards future data-driven design. Just my thoughts…

Best, Oleg


Drawing Callouts and Future Google PLM Design

June 27, 2014

google-material-design

For me, Google is one of the symbols of simple software design. Therefore, a Google developers event is always a good place to get inspired by ideas and new technologies. Google I/O was this week in San Francisco. I had no chance to attend the event, but was able to watch the video streaming of the keynote and a few other sessions. One of the things Google introduced this year was Material Design – a new approach to rethinking user experience to make it more natural, consistent and clean. Watch the following video to learn more.

The Material Design presentation made me think again about how to develop a simple PLM experience. The topic isn’t new. I’ve been talking about it for a few years already. Remember my presentation almost 3 years ago at AU 2011?

future-plm-ux

The time of complex and cumbersome enterprise user experience is finally over. Simplicity is an obsessive motto of every enterprise software company these days. However, developing good UX is a big deal. It requires time and effort. “Don’t make me think” is my favorite quote by Steve Krug about how to develop good UI. How to do so? This is a tricky question. In my view, one of the key elements of this process is to capture elements of well-known customer behaviors. You need to learn how people work today. It is extremely hard to change existing user behaviors.

I’ve been reading the GrabCAD blog post BOM Find Numbers: Don’t Get Too Attached. Read the article – it is short and sweet. I’m sure you are familiar with “drawing callouts” and find numbers. They were an absolute must-have feature on paper drawings. Should we keep them in the future? This is the question Ed Lopategui asks in his post. His conclusion – not really. Here is a passage explaining that:

Find numbers will make little sense in the future, so it’s probably best for everyone if you don’t get too attached to them. There will come a day where the find number is finally retired, and we can move on to the next chapter of BOM management. How can you get to that future faster, you might ask? That’s the easy part: move away from all those outdated BOM authoring tools (like Excel for one), and adopt a modern, integrated BOM editing capability.

bom-find-numbers

I liked the BOM find numbers example. New technologies can clearly help us to interlink BOMs and 3D models. It is good to shift away from managing bills of materials in Excel spreadsheets. At the same time, maybe we need to think twice and not kill familiar user behavior and experience. Maybe we can re-use it in future clean user interface designs. Users are familiar with the existing experience, and it can help them to understand how to use the system.

What is my conclusion? Existing engineering behaviors have a long history going back to drawing boards, paper drawings and engineers collaborating live in one room. Digital technologies are ripping these behaviors away. We create digital models and collaborate using the internet and computer screens. New ideas and technologies are good. At the same time, it might be a good idea to learn from existing behaviors and preserve some of them in a way that will simplify digital collaboration and design. It will definitely be good for user adoption. What to keep, and how to combine existing behaviors with new technologies? This is a very good question. I’m sure it will inspire PLM innovators for years to come. Just my thoughts…

Best, Oleg


Can PLM turn notifications into a process and vice versa?

June 10, 2014

ios8-notification-center

Notifications are fascinating. We all love to get notified. Alarms, emails, meetings… Later came social notifications such as likes, discussion comments and others. Enterprise systems send notifications about process states and many other things.

The recent Apple WWDC presentation provided a snapshot of the next evolution point of iOS and OS X notifications. When iOS 8 hits, the notification center can easily become the center of the universe for your iPhone. The same happens on Android devices – the notification screen is pretty much the focus of everything.

Yosemite-notification-center

The Wired article Why Notifications Are About to Rule the Smartphone Interface provides interesting insight into how notification functionality will develop and what it means for end users. Here is my favorite passage:

Interactive notifications will spur all sorts of new behaviors. (And yes, Android already has interactive notifications, but the ones in iOS 8 look to go beyond what KitKat can do.) Some of these will be simple, like the ability to reply to an email or text message. But they’re powerful in that you can do this without quitting whatever you’re already doing. And this interactivity is not just limited to system apps. Third-party developers can take advantage of this new capability as well, so you could comment on something on Facebook, respond to a tweet, or even check in on Foursquare. But others are going to be radical, stuff we haven’t imagined yet. Once developers begin to really harness what interactive notifications can do in iOS 8—and they will—it’s going to cause one of the most radical changes since third-party apps. With the advent of iOS 8, notifications are the new interface frontier.

It made me think there is an opportunity for process management tools to leverage notification center ideas and use this spot for a better experience that PLM systems can provide. In most of the implementations I’ve been involved in, PLM systems use email to get people involved in the communication about changes and process notifications. Recently, instant messengers (IM) and live chats came to that place as well. A new type of notification system could combine all aspects of communication into a single interconnected experience. There is one more aspect – the ability of process management tools to capture existing notifications. It can be an interesting opportunity to simplify process planning for organizations.
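The “single interconnected experience” idea can be sketched in a few lines: notifications from different channels (email, IM, PLM workflow) are funneled into one stream and deduplicated by the business event they refer to, so a user sees each change once rather than three times. All class names, fields and event identifiers below are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Notification:
    channel: str   # "email", "im", "workflow"
    event_id: str  # the business event (e.g. an ECO) the notification refers to
    text: str

@dataclass
class UnifiedInbox:
    seen: set = field(default_factory=set)
    items: list = field(default_factory=list)

    def push(self, note):
        """Keep only the first notification per business event."""
        if note.event_id not in self.seen:
            self.seen.add(note.event_id)
            self.items.append(note)

inbox = UnifiedInbox()
inbox.push(Notification("email", "ECO-7", "ECO-7 moved to review"))
inbox.push(Notification("im", "ECO-7", "hey, ECO-7 needs your review"))
inbox.push(Notification("workflow", "ECO-8", "ECO-8 approved"))
print([n.event_id for n in inbox.items])  # -> ['ECO-7', 'ECO-8']
```

The interesting design question is the inverse direction too: the captured stream of events per user is raw material for reconstructing how a process actually runs.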

What is my conclusion? Communication brings a lot of noise and inefficiency. This is a problem we have everywhere. On the other side, process is all about how to get people to perform in the most efficient way possible. It seems to me the new notion of notifications can provide an alternative to the old email exchange and the buzzes of alerts. UX architects and PLM technologists should take note. Just my thoughts…

Best, Oleg


How will PLM IoT support RTF?

May 29, 2014

plm-rtf-implementation

IoT (Internet of Things) is trending these days. Connected houses, cars, sensors, thermostats, LED lamps, etc. At the recent COFES 2014 in Arizona, IoT was one of the most discussed topics. Clearly, there are some very interesting things PLM can do with IoT even tomorrow. However, the question of privacy and information transparency is one that can negatively hit PLM IoT solutions.

I was reading the Google Co-Founder Sergey Brin: I Wish I Could Forget The “Right To Be Forgotten” article in Search Engine Land earlier this morning. If you are not familiar with the new acronym – RTF (right to be forgotten) – you’d better get familiar with it now. The Wikipedia article is here. The current legal framework is mostly focused on private personal data and the potential exposure of this data on the internet.

The right to be forgotten ‘reflects the claim of an individual to have certain data deleted so that third persons can no longer trace them.’[8] It has been defined as ‘the right to silence on past events in life that are no longer occurring.’[9] The right to be forgotten manifests itself in allowing individuals to delete information, videos or photographs about themselves from internet records, and thus prevent them from showing up on search engines.[10] Currently there are little protections currently against the harm that incidents such as revenge porn sharing, or pictures uploaded that are born of equally poor choices, can do.[11]

The implementation of RTF is still far from mature. Here is a passage from the article which explains how it can be done.

Practical implementation of the RTF, which allows individuals to seek the de-indexing of personal information that has become “outdated” or “irrelevant” (even if legal), will create procedural challenges across the numerous jurisdictions throughout Europe. Each jurisdiction is free, in theory, to enact a different process. According to Bloomberg, Germany is first out of the gate with a potential RTF “takedown” procedure. The German government is considering setting up arbitration courts to weigh in on what information people can force Google Inc. and other search-engine providers to remove from results… the Interior Ministry in Berlin would seek to establish “dispute-settlement mechanisms” for consumers who file so-called take-down requests…The ministry suggested that the removal of information shouldn’t be left to company algorithms… The German ministry doesn’t currently plan to create a single mediating authority or to put mediators under state supervision, it said. Talks with Google and other providers will begin once the government has finalized its position.

The RTF implementation made me think about a potential connection between RTF and the future ability of manufacturing companies to collect data from different devices. Think about the ability of cars, thermostats and phones to collect product usage information. Maybe it is not very realistic today, but tomorrow the data collected from devices will be personalized. There is a high probability that PLM products focusing on service / maintenance, performance and requirement management will collect personalized information about customers (and people). This will immediately trigger the question of “de-personalization” of product information to comply with RTF rules.
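A minimal de-personalization sketch for this scenario: direct identifiers are dropped, and the user key is replaced by a salted one-way hash so usage records stay linkable for analytics but no longer name a person. The field names are invented for illustration, and real RTF compliance involves far more than this (consent, deletion requests, jurisdiction-specific process).

```python
import hashlib

DROP = {"owner_name", "owner_email"}  # hypothetical direct identifiers

def depersonalize(record, salt):
    """Drop direct identifiers and pseudonymize the user key."""
    out = {k: v for k, v in record.items() if k not in DROP}
    if "user_id" in out:
        out["user_id"] = hashlib.sha256(
            (salt + str(out["user_id"])).encode()).hexdigest()[:16]
    return out

reading = {"device": "thermostat-42", "user_id": "u123",
           "owner_email": "jane@example.com", "temp_c": 21.5}
clean = depersonalize(reading, salt="s3cret")
print("owner_email" in clean, clean["temp_c"])  # -> False 21.5
```

Note the trade-off a PLM vendor would face: the salted hash keeps per-user analytics possible, while true “forgetting” would require deleting the salt (or the records) on request.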

What is my conclusion? Every day we are collecting more data about ourselves and the products we use. It is good and bad at the same time. More data can help us to optimize products. At the same time, it raises a question about the potential exposure of private data. Similar to regulatory support, PLM companies will face the need to implement and support RTF rules and clean the data about individuals and companies collected from products and devices. Just my thoughts…

Best, Oleg


PLM thoughts about information access after reading Mr. Page’s letter

May 15, 2014

plm-info-context

I had a chance to read “Google ‘A Million Miles Away from Creating the Search Engine of my Dreams’, Says Larry Page” this morning. The article points to Mr. Page’s annual “Founder’s Letter”, published ahead of the company’s annual shareholder meeting.

One of the most interesting parts to me was how Mr. Page sees Google’s focus on answering a user’s question. Here is my favorite passage:

“Information is Google’s core,” Page said, noting that over 100 billion Google searches are conducted each month — 15 percent of which are never-before-asked new queries. The search engine is working on being able to provide direct answers to questions rather than just a list of results said Page, adding that Voice Search now works in 38 languages.

It made me think about engineers, product development, search and information retrieval. One thing I learned 5 years ago, when I started to work with engineers on how search can solve their information problems, is that engineers usually don’t know what question to ask. What was most important to engineers was the ability to provide additional information about the answer, which helps to get the right result. Project, supplier, date/time, customer name… this is a short list of examples.

Later in the letter, Mr. Page outlined the importance of “context”. Here is another passage:

Page explained: “Improved context will also help make search more natural, and not a series of keywords you artificially type into a computer. We’re getting closer: ask how tall the Eiffel Tower is, and then when ‘it’ was built. By understanding what ‘it’ means in different contexts, we can make search conversational.”

In many cases, “context” is an important piece of lifecycle data needed to retrieve the right information. Revision, configuration, manufacturer, serial number, project name – these are all examples of contextual information needed to make data retrieval easy and painless. A search for a part description can bring you thousands of results. You won’t be able to filter out the right one. Context may have a larger implication in the future. You can find more examples of the “contextual world” in the book The Age of Context by Robert Scoble and Shel Israel.
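The narrowing effect of context can be shown with a tiny sketch: a keyword match alone returns every part with a matching description, but applying the user’s current context (project, revision) as filters collapses the list to the one relevant item. The record fields and sample data are invented for illustration.

```python
def search(parts, keyword, context=None):
    """Keyword match first, then narrow by any context fields supplied."""
    hits = [p for p in parts if keyword.lower() in p["description"].lower()]
    for key, value in (context or {}).items():
        hits = [p for p in hits if p.get(key) == value]
    return hits

parts = [
    {"id": "P-1", "description": "steel bracket", "project": "X", "rev": "A"},
    {"id": "P-2", "description": "steel bracket", "project": "Y", "rev": "B"},
    {"id": "P-3", "description": "steel bracket", "project": "X", "rev": "B"},
]
print(len(search(parts, "bracket")))                    # -> 3
print(len(search(parts, "bracket", {"project": "X"})))  # -> 2
print([p["id"] for p in search(parts, "bracket",
                               {"project": "X", "rev": "B"})])  # -> ['P-3']
```

The hard part in real PLM data is not the filtering itself but inferring the context automatically from what the user is working on, rather than making them type it.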

What is my conclusion? Keyword search as the foundation of information retrieval is dying. Information overload. A laundry list of results. Not good. We are moving to the age of contextual information retrieval. Getting the right context from the data and the user is the key element of successful information retrieval. Product development and manufacturing is a very complicated environment. Data is intertwined and disconnected. It is essential to build the right contexts to get the right answer. Just my thoughts…

Best, Oleg

