Security and permissions are showstoppers for search adoption

June 25, 2014


Search and information discovery are a big deal these days. Inspired by Google and other web search giants, we want information at our fingertips at the right time. I’ve been following the topic of search for a long time. You can jump to a few of my previous articles about search – Oslo & Grap – new trajectories in discovery and search; Why engineers need exploratory search? and Pinterest will teach CAD companies to search.

You may think cost and complexity are the top problems of search technologies. Crunching lots of data and connecting relevant information requires the right resources and skills. You will be surprised, but there is one more element that drives low adoption of search in manufacturing companies – security.

The Information Age article Enterprise search adoption remains low – survey discusses a survey of 300 enterprise IT professionals conducted by Varonis Systems. According to this survey, enterprises are afraid a good search solution will allow people to find information they have no permission to see. Here is the passage which explains that:

The respondents were surveyed at two major security-themed industry events, the RSA Conference in February and Infosecurity Europe in April. When asked to choose the biggest obstacle to enterprise search adoption, 68% cited the risk of employees locating and accessing files they should not have permission to view. Further, even if an enterprise search solution perfectly filters out results based on established permissions, the majority of respondents indicated they are not confident that their organisation’s existing permissions are accurate. Additional obstacles to enterprise search adoption most commonly cited were accuracy of the results (36%), end user adoption (29%) and the ability of solutions to scale enough to index all the data (24%).

It made me think about the complexity of manufacturing companies and enterprise organizations in general. Established permissions are only part of the story. The search results permissions are only as good as the data that enterprise systems are supplying to search software. GIGO (Garbage in, garbage out). For many IT organizations, management of security and permissions is a big deal. Think about a typical manufacturing company. Tomorrow, a search system can find all CAD files that were occasionally copied and pasted into different locations and shared between organizations outside of existing PDM/PLM tools. What’s more, multiple "publishing solutions" created a variety of published copies in different formats. Add SharePoint and similar technologies, sometimes adopted by divisions against the approval of central IT. A good search solution can be a litmus test for many IT organizations.
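To make the GIGO point concrete, here is a minimal sketch of permission trimming in enterprise search. The data model, user names and groups are hypothetical, not taken from any specific search product: each hit is filtered against the ACL supplied by the source system, so if the supplied permissions are wrong, the trimmed results are wrong too.

```python
# Minimal sketch of permission trimming in enterprise search.
# Documents, users and ACLs below are hypothetical.

def search(index, query, user, user_groups):
    """Return matching documents the user is allowed to see."""
    hits = [doc for doc in index if query.lower() in doc["text"].lower()]
    # Post-filter: trim hits against the ACL supplied by the source system.
    # If the source permissions are inaccurate (garbage in), the trimmed
    # results are inaccurate too (garbage out).
    return [
        doc for doc in hits
        if user in doc["acl"] or user_groups & set(doc["acl"])
    ]

index = [
    {"text": "Gearbox CAD assembly rev B", "acl": ["engineering"]},
    {"text": "Gearbox supplier price list", "acl": ["purchasing"]},
]

# An engineer finds the CAD file but not the purchasing document.
results = search(index, "gearbox", "oleg", {"engineering"})
```

The sketch also shows why the survey respondents worry: the filter is only as trustworthy as the `acl` field the enterprise systems feed into the index.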

What is my conclusion? Manufacturing enterprises are complex. As I described, they are divided along strategic, political and cultural lines. Search is a disruptive technology that can cross these lines and expose many corporate IT problems. So, once more, we learn that only a mix of technology and people skills can solve the problem. Strategists and technologists at search vendors should take note. Just my thoughts…

Best, Oleg



PLM real implementations: too long to be on-time?

July 7, 2013

One of the looongest US weekends I remember is coming to an end. It is time to get back from the relaxing holiday atmosphere to business reality. I’ve been skimming social channels and stumbled on a PLM statistics post published by my good friend and PLM blogging buddy Jos Voskuil. Navigate to the following link – PLM statistics–the result. Read and form your own opinion. Jos’ main conclusion – PLM is more vision than tech. PLM implementation is a journey that takes time, effort and resources. Some interesting and funny things came out of the comparison of experience and PLM implementation time. Here is the passage I liked:

Here, it was interesting to see that more than 60 % of the respondents have over 8 years of experience. As mentioned related to the previous questions it is necessary to have a long term experience. Sometimes I meet a “Senior” PLM Consultant (business card) with two or three years of experience. I believe we should reserve the word “senior” for PLM with a minimum amount of 5 years experience. And it is also depending on the amount of projects you were involved in. Interesting thought came into my mind. Some vendors claim the provide extreme rapid implementations for PLM ( 2 weeks / 30 days / 3 months). If this is real PLM you could do 25, 12 or 4 PLM projects per year full time.

It made me think about PLM implementations as they exist today – a journey type of specialized implementation requiring time and effort. I certainly agree with Jos – changing the way companies work requires vision, time and effort. In some situations, PLM implementations are coming to change product development processes established over decades.

However, here is a different angle to look at the PLM problem. Business is very dynamic these days. Business environment, ecosystem, technology, human resources, business landscape. What if the current lifecycle of many PLM implementations is not very much in line with business needs? It reminds me of one of the old PTC slides from COFES Israel – people just want to drink beer!

What is my conclusion? The new enterprise landscape and business reality will require a different approach in everything – IT, computing models, enterprise software and implementations. We’ve seen lots of changes happen in the consumer technology space, open source and other places over the past 10 years. People are looking at how to build new products faster and respond quickly to customer demands. So, my hunch is some PLM journeys will be late to deliver results. Just my thoughts…

Best, Oleg


Why PLM needs to shift focus from buyers to users?

June 3, 2013

For a long time, enterprise systems have been well known as a place where IT plays the role of king of the road. ERP, CRM and many other enterprise systems and implementations proved that. If you want to sell to an enterprise organization, you need to focus on key IT people, preferably the CIO, Chief of Engineering, Manufacturing, etc. Earlier this year, I had a healthy debate about this topic following my blog post – PLM, Viral Sales and Enterprise Old Schoolers.

The disruption in Enterprise IT is one of the most interesting trends these days in the enterprise space. The following presentation caught my attention yesterday – The challenges and opportunities of business in the disruptive tech era. I recommend you take a look, even though the presentation is 56 slides. The following slide stands out and resonates with the point I wanted to make about IT.

Let’s get back to the PLM domain. In the existing ecosystems, there are two major ways to sell and implement PDM/PLM projects. One is mostly indirect, via CAD vendors’ channels. The complexity of these implementations is limited, and these implementations (with a small number of exceptions) rarely reach the level of enterprise IT. The other is a direct channel developed by PLM and ERP vendors selling PLM implementations to top-level management in IT organizations. The higher the level of IT people, the better.

I can see multiple reasons why existing IT is not getting excited about technological disruption in PLM and enterprise organizations in general. Disruption means changes, and changes usually come with the loss of control and existing status. For example, cloud means no servers need to be installed, implementations can be done remotely, and product development has a better chance to focus on user experience and business needs rather than on how to implement and run enterprise deployments.

What is my conclusion? The future of PLM implementation will shift focus from PLM buyers to PLM users. At the end of the day, people need to get the job done. PLM needs to focus on user needs, user experience and the ability of systems to help people in everyday business life. Just my thoughts…

Best, Oleg


PLM Cloud Concerns and Dropbox Reality for Engineers

December 4, 2012

Last week at AU, I attended the Innovation Forum – The Reality of the Cloud. The presentation made by Theresa Payton of Fortalice LLC caught my special attention. It was about security. Check later here. Security is a loaded and complicated topic. Physical security is one of the top 5 concerns of customers related to the decision to use cloud services. Even if consumption of online services is growing like crazy, companies are very careful about placing their mission-critical data assets in the cloud. Especially when it comes to IP (intellectual property). Navigate here to read what the SearchCIO blog is saying about that. You need to register to read the full article. The following passage is interesting:

To be sure, some cloud services are pretty lightweight, such as filling out a form to schedule an online meeting. But for mission-critical applications or storing data in the cloud, you need to ask tough questions: "What does their data center look like? Are they willing to show you a diagram? Backup plans? Security documents?" asked Jessica Carroll, managing director of IT for the United States Golf Association, which uses the cloud for business continuity, as well as for collaboration with 1,500 golfing associations nationwide.

Contact any CIO in the industry and his team will drown you in an endless list of questions about security. However, here is news for you, Mr. CIO. I don’t know if you are aware, but 34% of your engineering staff are placing data in the cloud in their Dropbox accounts. What is more surprising – half of them are aware they are doing it against company rules. Navigate to the following link to read more and see some diagrams – Guess what Mr. CIO? One in five of your employees uses Dropbox at work.

One out of five of 1,300 business users surveyed said they use the consumer file-sync-and-share system with work documents, according to new research by Nasuni, an enterprise storage management company. And, half of those Dropbox users do this even though they know it’s against the rules.

However, the fact that employees are putting files in Dropbox is just half of the problem. Since they are using private accounts, the information remains there even after an employee leaves the company.

“The sensitive data stored in Dropbox is not secure and just as importantly, not controlled by IT. This means that if an employee leaves the company, the information that [a] user has stored goes with them, creating a significant risk of data loss or exposure. Furthermore, as the amount of sensitive corporate data stored in Dropbox increases, the online file-sharing service will become a more attractive target for hackers and other malicious groups.”

What is my conclusion? Think about PLM and Excel. Who won the game? I think the answer is clear – Excel. Each time PDM/PLM software failed to provide a reliable solution, Microsoft Excel won the PLM competition. Now, guess what? If companies and corporate IT continue to ignore users’ demand for flexible and easy access to information, the information flow will go from proprietary data and file servers directly to Dropbox and similar "easy to use" cloud services. Companies need to pay attention. Just my thoughts…

Best, Oleg


PLM Implementations and PLM Egoism

November 13, 2012

PLM implementation requires change. I’m sure you have had a chance to hear about it more than once. The idea behind it is somewhat simple – a PLM implementation is eventually going to change the way you do business, your product development processes, the communication between people and systems and, of course, the way you make decisions. The theme of "change" during PLM implementation is reflected quite well online in the PLM blogosphere. Jos Voskuil, my long-time blogging buddy, is probably one of the most prominent supporters of "change" during PLM implementation. Navigate to the following blog – The state of PLM – after 4 years of blogging – to read more about how Jos sees PLM technologies and implementations these days. Here is my favorite passage about PLM implementation and change.

I believe PLM requires a change in an organization not only from the IT perspective but more important from the way people will work in an organization and the new processes they require. The change is in sharing information, making it visible and useful for others in order to be more efficient and better informed to make the right decisions much faster.

During my long flight from Boston to Europe yesterday, I read “Ending the Cults of Personality in Free Software.” This write-up resonated well with my thoughts about PLM implementation and change, because personality is reflected significantly in everything related to design, engineering and product development. If you have been in the CAD business long enough, you probably remember that very often the decision about which CAD system to use was almost religious among some engineers and designers. You can see lots of similarity these days in decisions about integrated vs. best-of-breed PLM. Another place where discussion is heating up is the conversation about open vs. closed PLM platforms. It takes literally years for some large organizations to decide which PLM platform to use. One of the best ways to observe it is to attend customer presentations during PLM vendor forums. You can learn many stories about organizations, the history of product development decisions and endless PLM roadmaps.

What is my conclusion? I find PLM implementation discussions very similar to some technological disputes. The potential danger is the ego factor. When it comes, the ego factor goes one way – up! Sometimes ego may lead to something very positive, and sometimes ego can be a significantly destructive factor. Time is a good validation for many egocentric decisions. This is why ERP and PLM implementations are often cyclic, with 5-7 years of people’s lifecycle in an organization. When a PLM implementation fails, ego might provide bad guidance. My recommendation to PLM people is to develop "ego-detectors" :). Another piece of technology to think about on a long transatlantic flight. Just my thoughts…

Best, Oleg


Cloud PLM debates about multitenant models

November 8, 2012

The discussion about cloud PLM is growing these days. Big players are entering the game. The latest announcement made by Siemens PLM about TeamCenter on the cloud just emphasized that PLM cannot avoid the "cloud" game. The list of cloud options for PLM today includes a long list of companies – Autodesk PLM 360, Aras, Arena Solutions, Dassault Enovia V6, Dexma PLM from Ascon, PTC Windchill / IBM, TeamCenter and more. One of the questions always raised by customers and analysts in this space is the so-called "multi-tenant model". Usually confusing and raising many debates about what a "true cloud" solution is, this topic is indeed very important and provides significant differentiation from both technological and business standpoints.

I’ve made a few write-ups earlier this year about cloud and multi-tenancy. One of them – Cloud PLM: what do you need to know about multitenancy – provides a deep analysis of all multi-tenant implementation options. Two additional posts – What Oracle multi-tenancy means for PLM providers and Cloud PLM and IaaS options – discuss various aspects of multi-tenant implementations and cloud infrastructure usage.

Cloud PLM arguments

I can identify two major groups of people arguing about what "cloud" means for PLM and enterprise in general. One group says that cloud PLM is just the ability to put the PLM server in "another place". I agree – this is one of the options. The second group defines cloud PLM as "a service" available from business applications located "somewhere" outside of customer infrastructure. ASP is not a new option and is used by many vendors in the cloud domain. The SaaS option assumes you provide services (only) and make the infrastructure (e.g. IaaS) transparent.

I suggest taking a deeper look. The following article came to my attention a couple of weeks ago – Ask the Experts: What’s the Difference between ASP and SaaS? In my view, it provides good arguments for both of these models: ASP and SaaS. The following two pictures show diagrams of the options. I liked the following passage:

The difference between ASP and SaaS providers lies mainly in the way they manage their respective computing resources… Most ASPs use a single environment for each customer, which means that they provide a specific application that is set up for the individual customer. Each customer uses the business software as a single tenant, and does not share it with anyone else. All application setup configuration, and sometimes even server and operation configuration, is unique for each client. On the other hand, with a SaaS provider, all customers share the same computing resources: servers, application, and database in a so-called multitenant model. So, while an ASP hosts the application environment in its own “building,” a SaaS provider uses the same application environment for all its customers, and they all share the same “building” (see figures 1 and 2)

Cloud ASP Model

Cloud SaaS Model
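The difference between the two models is easy to show in code. Below is a minimal sketch (hypothetical schema and tenant names, not any vendor’s implementation): in the SaaS multitenant model all customers share one database and every query must be scoped by a tenant id, while in the ASP model each customer would simply get its own separate database.

```python
import sqlite3

# SaaS multitenant sketch: all customers share the same database,
# and every row and every query is scoped by tenant_id.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE items (tenant_id TEXT, name TEXT)")
db.executemany("INSERT INTO items VALUES (?, ?)", [
    ("acme", "Bracket-100"),
    ("acme", "Bolt-M8"),
    ("globex", "Gear-22"),
])

def items_for(tenant_id):
    # The tenant filter is mandatory; forgetting it would leak one
    # customer's data to another - the core risk of the shared model.
    rows = db.execute(
        "SELECT name FROM items WHERE tenant_id = ?", (tenant_id,)
    )
    return [name for (name,) in rows]

# In the ASP model, by contrast, each customer would get its own
# database (its own "building"), with no tenant_id column at all.
acme_items = items_for("acme")
```

The trade-off visible even in this toy example: the SaaS provider maintains one schema for everyone, while the ASP provider can customize each customer’s environment separately.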

What is my conclusion? Most manufacturing companies are still not well educated about infrastructure and the different "cloud options". Understanding them is important for IT decision processes. I think throwing around buzzwords like "true cloud" or "false cloud" is a fundamentally wrong approach. Both ASP and SaaS models optimize resources and the cost of infrastructure. The ASP model is more "resource neutral" – you just pool resources by "moving servers outside of your company". At the same time, SaaS (or the service model) hides server resources from your company and provides "the resource consumption view". Such a view is generally more focused on business functions and less on IT compared to the ASP model. Understanding the implications of each of these models for your company’s operations can help you decide. Important. Just my thoughts.

Best, Oleg

Image courtesy of [nattavut] / FreeDigitalPhotos.net


PLM priorities and Gartner IT’s Top 10 Tech Trends for 2013

November 2, 2012

As we move towards the end of the year, we will see more posts reviewing trends for 2013. While it is really hard to predict the “next big things”, these posts usually provide a good perspective on what is going on. Earlier this week, my attention was caught by Gartner’s Top 10 strategic tech trends for 2013 publication. Navigate to the following link in PC Magazine’s Forward Thinking publication to review the details. You can see the picture with the top 10 techs below.

It made me think about some interesting intersections of these trends with the priorities of engineering applications and specifically PLM technologies and systems.

Mobile Technologies and HTML5

The question of what will become the preferred technology for mobile and web browsers will continue to be dominant in the PLM development ecosystem. The complexity of the systems is high. Therefore, vendors will be slow to adopt every change. I believe vendors will first try to concentrate on the supported versions of browsers. Old browsers are easier to reject. Native mobile apps are something more complicated. Most consumers prefer native mobile apps, not browsers. Therefore, the improvement of HTML5 technologies can be an advantage for PLM vendors considering mobile options.

Enterprise App Stores

The introduction of cloud and mobile apps raises a question about the future of application distribution. IT will need tool(s) to distribute the new type of applications in organizations. The app store is a fascinating idea, and the majority of people these days actually “get it”. In my view, the challenge will be to balance the chaotic nature of consumer app stores (Apple, Google) against the more structured way enterprise IT wants to distribute applications to users. PLM vendors have traditionally suffered from the problem of how to expand usage of PLM tools in organizations. So, an app store can be a good idea to fix it.

Internet of Things

This is a very fascinating topic. You might be interested in reading one of my earlier blog posts. We have more products (things) that are connected to the network and starting to communicate online. How can it be connected to PLM? In my opinion, the connection is in the ‘lifecycle’. Today PLM is heavily involved in design and engineering. Less in manufacturing. However, what about the lifecycle of products after they’re built in a factory, sold and delivered to actual customers? This is a space we are going to discover in the near future. iPhone-controlled bulbs are just the beginning.

Hybrid IT and Cloud computing

Traditionally, companies work with IT departments to get things done. PLM is no different in that sense. Servers need to be configured, routers connected, software installed and updated. This is how a traditional ecosystem looks. At the same time, the cloud is coming, which means no servers, no software, no updates. Even if cloud software becomes widely adopted, companies will be running a lot of applications and software on premise for a very long period of time. So, how will IT be re-organized around this environment? How will we consume our “granular PLM apps”?

Strategic Big Data

There is a lot of confusion around the Big Data topic. Here is the big data definition from Gartner’s report – Cearley continued to define big data as dealing not just with volume, but also with variety, velocity, and the complexity of data an organization is dealing with. He talked about managing both internal and external data, and about how technologies such as Hadoop may be a big part, but only a part, of it. Big Data certainly opens horizons to get more information about products and improve product quality from a standpoint that cannot be achieved now.

What is my conclusion? In my view, enterprise software technologies are going through a significant change now. Many software systems in the engineering and manufacturing sector are approaching the end of their lifecycle. It is time for vendors to provide new cost-effective solutions targeting a new user experience and a different IT ecosystem. Cost and user experience is one segment where lots of improvements can be made. Another one is data – customers will be actively looking for additional information about products and customers. Just my thoughts…

Best, Oleg


SharePoint got infected with PLM disease

September 19, 2012

It has been a long time since I wrote anything about SharePoint. I’ve been tracking SharePoint very closely for the last 5-7 years. These days I hear lots of talk about the coming SharePoint 2013. Many of the customers I know are using SharePoint. Back in 2006-2007, the success of SharePoint came from the ability to provide an easy starting solution to collaborate on files in folders. The technology was easy, came together with Windows Server and was free as soon as you paid for a Windows Server license. It was easy to start and get hands-on with something that gave you value immediately.

Fast-forward to 2012. The situation is different in my view. SharePoint has an established ecosystem of developers, system integrators and support. At the same time, I’m learning that SharePoint has become much more complicated. It is not easy to start using SharePoint, and it certainly requires time and effort to install and configure a SharePoint-based solution.

The following article came to my attention a few days ago – Why SharePoint 2013 Isn’t for You. Have a read and form your own opinion. I found the following passage important:

SharePoint is not an app that gets upgraded every month as part of an update cycle. It is a development platform for providing business solutions. Large clients who rolled out SharePoint 2010 in the past two years are going to find it hard to justify moving on to 2013 in the near future, unless they can find a business justification for spending the time and money it will take to make the transition.

With this state of mind, SharePoint can finally be diagnosed with the PLM disease you’re familiar with. It is hard to install, and it requires business strategy and money to configure and support. Bottom line – it is a perfect vehicle for service organizations to earn money. However, the question is how many users will continue to use it and what will happen to the SharePoint ecosystem as we move forward.

What is my conclusion? The demand of customers today is to simplify things. It is certainly true for consumer-oriented software, and it is coming fast to the enterprise as well. I can see many examples of companies in ERP, CRM and other fields. So, I can see how enterprise software companies are moving towards making things easy. I’d be concerned if the software I use got more complicated and required more effort to install and configure. PLM was and still is there. Most traditional PLM products are struggling with the PLM disease of complicated installation, long implementation cycles and the need for support and maintenance on site. Just my thoughts… Are you using SharePoint and PLM today? What is your take?

Best, Oleg

Image: FreeDigitalPhotos.net


How cloud can increase PLM flexibility

August 21, 2012

Despite hot weather and summer vacation time, I can see quite a few cloud discussions trending around. I was following the OSCON conference a few weeks ago. The blog that caught my attention in the context of the conference and cloud was CloudAve. If you have an interest in the cloud software business, you can read the following article – Hardware, cloud and cloud washing. I noticed an interesting passage:

Having said that I am against bundling hardware with software layer and call it a cloud for the same reason I articulated in the previous paragraph. If cloud is about abstracting away all the hardware complexities underneath, I shouldn’t be forced to buy a specific hardware to run the cloud software. Well, you can always make a point that these hardware+software solution will help organizations build clouds. Yes, they do but their offerings by itself is not cloud. Period.

The question about hiding hardware complexity made me think about the flexibility of cloud software. Can a bundle of software and hardware increase flexibility, and why can it be helpful? I found another article – Cloud – it’s about flexibility – that tries to answer precisely this question.

cloud computing is all about taking technology solutions and delivering them in a way that allows organizations to consume them “as a service.” I’m happy with the concept of private cloud, with cloud in a box or any other permutations of “cloudiness” so long as it means that organizations can give their users the ability to enjoy the sort of flexibility that we, as consumers, enjoy with services like Gmail.

Now let’s turn to PLM software. I’d like to disconnect the idea of PLM from the historical basement of PDM and keeping control of information. I can see many situations in a company when a specific business problem can be solved without going for total product development process coverage – design supplier exchange, engineering services, change processes.

Any of these (and many other) examples can raise a question for the engineering IT manager – how long will it take to establish a system to manage that? The answer “6 months” is the wrong one these days. The right answer is 6 weeks. This is a place where cloud PLM software can be very helpful. As an engineering IT person, I want to spin up my process implementation fast, without any hardware and software installation. The 6 weeks need to be spent exactly on how to leverage cloud flexibility to get things up and running.

What is my conclusion? Think about traditional IT and Amazon Elastic Compute Cloud. The latter can give you a server up and running in minutes. The best IT still needs days to make it happen. Nowadays, we want things to go fast, with the focus on business and not on how to install servers. In many situations, the only tool you can start using in minutes is Microsoft Excel. This is why Excel is still the most popular PLM software in the world. But cloud can change it. Just my thoughts…

Best, Oleg


PLM and Multi-Tier Strategies

June 9, 2012

PLM and Single Point of Truth. You have probably heard about it before. I tried to address this topic in the past. Navigate to a few of my old posts about it – PLM and Single Point of Truth and PLM and Single Point of Disagreement. Earlier this year, I came back to this topic in my write-up PLM and "The Whole Truth" Problem, which raised a few comments about the cost of integration and a "single PLM option" as a cost-effective alternative. Interestingly enough, the topic of multiple systems usage isn’t unique to the PLM space. I’ve been reading the CloudAve blog post The Rise of Two Tier ERP and Larry Ellison’s NetSuite Intentions. Here is an interesting passage:

Essentially it’s a further nod by NetSuite to the notion of two-tier ERP, the idea that organizations can continue to use their existing ERP systems at a corporate level, but enable individual business units to innovate with secondary solutions. It’s a smart idea and one which is a natural fit for NetSuite that had traditionally had a hard time selling into the largest corporates who were generally seen as invested in one or other of the large ERP vendors. At the event NetSuite was keen to tell attendees about the case studies of large corporates who have moved to NetSuite for individual business units, all tied to traditional ERP solutions at the corporate level.

Even if the ERP topic and local accounting are a bit different, I can see a clear trend to "optimize" the IT environment, as opposed to unifying everything under a single umbrella.

PDM/PLM multi-tier optimization

The idea of a multi-tier strategy for PLM implementation sounds like something we might see more of in the near future. The main reason is the same – optimization. With a large number of IT systems already implemented, companies can think about a combination of systems to re-use existing assets and implementations and optimize cost. Cloud can play an additional role in this future optimization by providing companies an easy path to missing functionality or coverage of remote and geographically separated divisions.
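As a thought experiment, the two-tier idea might look like the sketch below. The system names, part numbers and routing rule are all hypothetical: a division resolves items against its local (tier-2) system first and falls back to the corporate (tier-1) system of record.

```python
# Hypothetical sketch of a two-tier PLM lookup. A division resolves
# items against its local (tier-2) system first, then falls back to
# the corporate (tier-1) system. All names and data are illustrative.

corporate_plm = {"P-100": "Corporate master record for P-100"}
division_plm = {"P-200": "Divisional record for P-200"}

def resolve_item(part_number):
    # Tier 2 first: the division's own, cheaper-to-change system...
    if part_number in division_plm:
        return division_plm[part_number]
    # ...then tier 1: the corporate system of record.
    return corporate_plm.get(part_number, "not found")

local = resolve_item("P-200")    # served by the divisional system
master = resolve_item("P-100")   # falls back to corporate PLM
```

The design choice here is the same as in two-tier ERP: divisions keep the freedom to innovate locally, while the corporate system remains the authoritative record they fall back to.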

What is my conclusion? How to optimize IT assets? I believe we will be hearing about it more and more these days. With the cost of global deployment skyrocketing, companies will be looking at how to leverage multi-tier strategies to optimize the future of PLM deployment and implementation. Large PLM vendors should be aware, since it provides an opportunity for niche players and startups. Just my thoughts…

Best, Oleg

image courtesy of designbuzz post

