PLM and “The Whole Truth” Problem

March 31, 2012

It is hard to find somebody in the PDM/PLM business who is not familiar with the idea of a “single point of truth” (SPOT). The idea is not new. In my view, it was one of the most powerful models that convinced people to implement PLM during the last decade. Similar to the idea of technological singularity, the idea of a single point of truth assumes the ability of PLM as a business system to absorb all aspects of the product development lifecycle, from early idea generation and requirements building through manufacturing and disposal. Architects of PLM systems took ERP resource and accounting as a model to build a complete model of product development processes in an organization. In the past, I addressed this topic in my posts. Navigate to Back to basics: PLM and Single Point of Truth or PLM and a single point of disagreement to read more.

The practical application of the “single point of truth” model was far from ideal. A high level of diversity in engineering and manufacturing processes, combined with the significant cost of implementing a singular model, ended up with very limited implementations. Lots of data elements and processes weren’t covered by PLM implementations. Discussions around “holistic” or “total” PLM implementations come up during almost every PLM project. Earlier today I was listening to the tweet stream coming from the CIMdata PLM forum. I learned a new term coined by Stan Przybylinski of CIMdata – “The Whole Truth model”. You can see the tweets about it below. What I learned is that the “whole truth model” is supposed to expand PLM to domains not well covered by PLM today – electronics, software, process – and not be limited to the traditional “mechanical orientation” of PLM.

This conversation made me think about possible trajectories of PLM model development into the so-called “The Whole Truth”. I decided to make a picture to illustrate that. On the right side of the picture, you can see a PLM model growing and absorbing various domains of product development. The extra data elements represent the supplemental aspects of product development the PLM model needs to cover.

While I can see clear advantages and logic behind this “whole truth” model, I have concerns about the overall scalability and feasibility of such a data model organization. The complexity and cost of this model will create difficulties in real-life implementations. Change management will be costly and complicated. Therefore, I decided to propose an alternative model. I call it the “web truth” model. The idea behind it isn’t new and presents the organization of a scalable network to represent multiple aspects of product development. This model can have all the advantages of a “single model”, but at the same time assumes some level of independence in data organization, which will make the overall data architecture more lean and agile.
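To make the “web truth” idea a bit more concrete, here is a minimal sketch – my own illustration, not any vendor’s data model – of such a networked organization. Each domain owns its records, and the product structure is simply a set of links across them; the domains, URIs and fields are invented for the example.

```python
from dataclasses import dataclass, field

# A sketch of the "web truth" idea: each domain keeps its own records and the
# product is a web of links across them, not one monolithic schema. Domains,
# URIs and fields are invented for the example.

@dataclass
class Record:
    uri: str          # globally addressable identifier for the record
    domain: str       # owning domain, e.g. "mcad", "ecad", "software"
    data: dict        # domain-specific payload, managed by that domain's system
    links: list = field(default_factory=list)  # references to records elsewhere

# Independently owned records from different domains...
board = Record("plm://ecad/board-100", "ecad", {"layers": 6})
firmware = Record("plm://sw/fw-2.3", "software", {"repo": "internal-git"})
housing = Record("plm://mcad/housing-7", "mcad", {"material": "ABS"})

# ...tied into a product structure by lightweight links, not a single master table.
product = Record("plm://product/widget-A", "product", {"rev": "B"},
                 links=[board.uri, firmware.uri, housing.uri])

index = {r.uri: r for r in (board, firmware, housing, product)}

def resolve(uri: str) -> Record:
    """Follow a link to whichever system owns that record."""
    return index[uri]

print([resolve(u).domain for u in product.links])  # ['ecad', 'software', 'mcad']
```

The point of the sketch is only that each domain can keep its own model and evolve independently, while the product view is assembled by following links.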

What is my conclusion? PLM vendors need to learn more about the last decade of web development and the organization of large scalable web systems. In my view, an attempt to build a “singular” system won’t be successful and will create a complex system that is hard to maintain, change and scale. The future belongs to data networks and more flexible data organization. Just my thoughts…

Best, Oleg

Picture Victor Habbick / FreeDigitalPhotos.net


PLM, Innovation and Getting Things Done

March 30, 2012

This week was extremely busy. I was traveling to attend the Autodesk Media Summit in San Francisco. You can read about it in my previous post. However, I wanted to talk about Innovation before this weekend. Have you heard the buzz? It has become so popular to talk about how software helps you to innovate these days. PLM vendors are joining this buzz as well. Well, everybody wants to be innovative. Easy answer. Obviously, I want to be innovative. Because it applies so easily, PLM marketers took it into the portfolio of the PLM sales machine. The connection between PLM and Innovation has become stronger over the last couple of years. You can see it on many websites of PLM-related companies and in marketing materials. Almost two years ago, I posted – What are the metrics for PLM innovation? The idea was to discuss how you can measure innovation in the organization in order to prevent dilution of the term. Nevertheless, I believe the problem is still here.

I was reading the Virtualdutchman blog of Jos Voskuil earlier this week about PLM and Innovation. Jos takes the topic of Innovation in a very systematic way and drives towards a conclusion about which key company processes PLM can help you innovate in. Read Jos’ conclusion below:

PLM does not kill innovation and although the PLM Vendor marketing is not very explicit, there are three areas where PLM supports Innovation. In a (subjective) order of priority I would say: 1/ New Product Introduction – bringing the highest revenue advantages for a selected invention; 2/ Invention discovery – by providing R&D a 360 view of their customers and market landscape enable inventions to happen in your company; 3/ Portfolio Management – to assist in selecting the right opportunities to focus.

I’m actually in agreement with Jos. PLM can help people to innovate. However, the story of innovation reminded me of the story sales people tell about the difference between vitamins and pain-killers. Vitamins make your life better. Pain-killers help to remove your pain. It is very important to be able to innovate, make the right decision in NPD (new product development) or discover better ideas. However, in my view, these are all vitamins. Opposite to that, the ability to manage change processes, get access to the correct version of a BOM or calculate the cost of a product to be manufactured according to a specific order is something organizations need to do every day.

It made me think about what is on people’s minds on a daily basis – if PLM handles that, people will use it immediately. Not innovation. I want to get a job done. Every day. People are absolutely dependent on tools that help them to manage everyday activities – releasing drawings, running BOM changes, ordering components and shipping products. The following funny video from Kanban2go is not related to PLM, but it is so true.

What is my conclusion? People want to get a job done and go skydiving or skiing. I like the idea. However, the manufacturing planning process is frustrating. I cannot find my latest drawings, or I need to re-build a product prototype from a baseline nobody can remember or retrieve. There are many other frustrating tasks. In my view, helping people get a job done will be the next big thing in PLM. Not planning for 10-year future innovation (although it is important), but getting rid of the tasks standing between people and the weekend. I think PLM vendors need to take note. Just my thoughts…

Best, Oleg


Autodesk, PLM 360, Marketing and Zero Click Solutions

March 28, 2012

I’ve been attending the Autodesk press and media summit yesterday and today at the Autodesk San Francisco office. Autodesk invited about 100 journalists and bloggers to share information about the new Autodesk product releases. This is also an opportunity to speak to Autodesk executives – Carl Bass, Buzz Kross and others. Autodesk puts a lot of energy and focus behind their newborn PLM baby, and for me it was a chance to talk to the people responsible for the PLM 360 product – Brian Roepke and others.

Autodesk and Cloud

Autodesk is deeply committed to the cloud. It goes much beyond PLM and is reflected in everything Autodesk is doing these days. It starts with giving 3GB of free cloud space to AutoCAD users and continues through all other products, including mobile applications and more. Carl Bass has a huge belief in the cloud. As you can see below, he is emotional and he is on twitter…

PLM 360 – Marketing View

Buzz Kross presented PLM 360 during the main briefing session. After the early preview during AU 360, this was my first opportunity to see the PLM 360 marketing machine in action.

Below you can see the slides presented by Buzz Kross. Autodesk defines PLM 360 in three steps – 1/ bring ALL aspects of the lifecycle together, 2/ support business models and processes, 3/ ease of adoption, implementation and support. As marketing messages, they are not new in the PLM world. The difference Autodesk brings to the table is that PLM 360 will be delivered via the cloud, and this is what makes it different.

You can see how Autodesk presents the PLM 360 differentiators. The main characteristics – instant on, access, subscription based – clearly belong to the cloud. Some of them, like "insanely configurable", sound like too much marketing to me and can hardly be quantified. I believe "Highly Current User Experience" is some sort of typo. I have no idea what it means.

Integration story. The ability to bring the right data into PLM 360 is important. It means PLM 360 needs to find a way to access data coming from existing CAD files, PDM systems and other company data sources. Buzz presented the following slide in that context. In my view, Autodesk understands the importance of data integration with PLM 360. So, I hope Autodesk will bring more clarity to this space soon.

Prices and cost of ownership. Buzz Kross presented the following cost comparison based on a calculation for a PLM deployment of 200 seats and five basic modules. It looks cheaper upfront. However, as I learned during my PLM Innovation panel discussion last month, when it comes to pricing, the devil is in the details. Customers will drill down into functionality to understand if a comparison of PLM 360 to other PLM systems is valid and compares the right features, functions and system capabilities.

Customer Workshop and Q&A

During the afternoon workshops I had an opportunity to attend demonstrations made by PLM 360 "early adopters". It was a place where marketing messages came down to reality. Three Autodesk customers presented what they have done with PLM 360 so far. A few observations I made – 1/ PLM 360 as a tool is simple to use; 2/ the cloud helps customers to start fast and removes IT complexity; 3/ the discussions customers are having around PLM 360 are no different from the discussions you may have around Enovia, TeamCenter or Windchill. Below are a few pictures, so you can get a feeling of the conversation.

What is my conclusion? I learned a new marketing buzzword from Autodesk yesterday – ‘zero click solution’. It is a funny term. I’m not sure I want an application to make clicks and do something without my involvement. However, jokes aside, user experience is extremely important. From what I’ve seen from PLM 360 users – you can start easily and fast. The tool is user friendly. So, my conclusion – PLM 360 has potential. There are many other issues related to PLM 360 that need to be solved and require improvements. Here is my list of visible PLM 360 gaps today: 1/ integration with data in the company (files, PDM and other systems), 2/ dynamic workflows, 3/ usability of administration tools like the workflow designer and data modeling. The next step will be to analyze gaps identified by customers during first experiments and deployments. I believe Autodesk has enough money and resources to solve these problems. Time will show. Just my thoughts…

Best, Oleg


Product Lifecycle Management and Obsessive Taxonomies

March 26, 2012

I’ve been reading the Twitter stream during my short weekend at home. One of the tweets, from Randal Newton, caught my attention. This is the message:

This message made me think about PLM systems, taxonomies and folksonomies. If you’re new to these terms, here is a short intro. Taxonomy is what you most probably know as data classification. From the Wikipedia article:

Taxonomy (from Greek: τάξις taxis “arrangement” and Greek: νομία nomia “method”[1]) is the science of identifying and naming species, and arranging them into a classification.[2][3] The field of taxonomy, sometimes referred to as “biological taxonomy”, revolves around the description and use of taxonomic units, known as taxa (singular taxon). A resulting taxonomy is a particular classification (“the taxonomy of …”), arranged in a hierarchical structure or classification scheme.

A taxonomy is created by a single individual or a team, and it is clearly represented as a hierarchical structure. In contrast to taxonomies, a folksonomy presents a different way of organizing data.

A folksonomy is a system of classification derived from the practice and method of collaboratively creating and managing tags to annotate and categorize content;[1][2] this practice is also known as collaborative tagging,[3] social classification, social indexing, and social tagging. Folksonomy, a term coined by Thomas Vander Wal, is a portmanteau of folk and taxonomy. Folksonomies became popular on the Web around 2004[4] as part of social software applications such as social bookmarking and photograph annotation. Tagging, which is one of the defining characteristics of Web 2.0 services, allows users to collectively classify and find information. Some websites include tag clouds as a way to visualize tags in a folksonomy.[5] A good example of a social website that utilizes folksonomy is 43 Things.

Take a look at an interesting picture presenting the opposite worlds of taxonomies and folksonomies. It is about top-down versus bottom-up:

I’ve been thinking about taxonomies and folksonomies in the sense of system rigidity. Most PLM systems are built with a predefined set of rules and models. That creates a certain level of resistance when it comes to the usage of the systems. Customization of systems is complicated, and sometimes cumbersome. Opposite to that, a folksonomy is a model that can be “collaboratively created”. This element of collaborative creation is something that can be very appealing to most engineers, who like to think more flexibly.
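Here is a small sketch – my own illustration, not tied to any particular PLM product – of the structural difference: a taxonomy is a single predefined hierarchy, while a folksonomy is a flat pool of user-assigned tags that can be queried bottom-up. The part names and tags are invented.

```python
# Taxonomy: one hierarchy, defined up front by an administrator (top-down).
taxonomy = {
    "Fasteners": {
        "Bolts": ["M6x20 bolt", "M8x30 bolt"],
        "Nuts":  ["M6 nut"],
    },
    "Electronics": {
        "Connectors": ["USB-C receptacle"],
    },
}

# Folksonomy: a flat pool of tags attached by users as they work (bottom-up);
# there is no predefined structure to maintain.
folksonomy = {
    "M6x20 bolt":       {"fastener", "steel", "project-apollo"},
    "M8x30 bolt":       {"fastener", "long-lead-time"},
    "USB-C receptacle": {"connector", "project-apollo", "rohs"},
}

def items_with_tag(tag: str) -> list:
    """Retrieve items by any tag anyone has applied, regardless of hierarchy."""
    return [item for item, tags in folksonomy.items() if tag in tags]

print(items_with_tag("project-apollo"))  # ['M6x20 bolt', 'USB-C receptacle']
```

Note that changing the folksonomy requires no administrator: adding a new tag is the same operation as using an existing one, which is exactly the flexibility engineers tend to like.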

Social is another aspect. Social is trending, and some companies are trying to bring it as a differentiator into the PLM game these days. It would be interesting to see if social PLM and other systems pretending to be “social” use a folksonomy approach to help people organize data within the lifecycle.

What is my conclusion? PLM needs to learn new words and methods of work that have proven themselves over the last 10 years of the web. Folksonomy is one of them. The rigidity of existing systems (obsessive taxonomies) needs to be transformed into a more flexible and granular approach. Just my thoughts…

Best, Oleg


My first take on Autodesk PLM 360 system and technology

March 24, 2012

Autodesk PLM 360 is the widely announced and promoted “new cloud alternative” from Autodesk to disrupt the PLM market. After the initial announcement a few weeks ago, press, media and bloggers gave significant focus to the PLM 360 product. If you want to catch up a bit on articles, I’d recommend a few stories – Sharing our PLM 360 experiences by Jim Brown of Tech-Clarity, Autodesk announced pricing for PLM 360 offerings by Develop 3D, Live from AU. Autodesk and PLM. Strap your boots, it is coming by Al Dean of Develop 3D, The devil must be cold: Autodesk launches PLM product Nexus by Chad Jackson of Lifecycle Insights and Autodesk PLM 360: Insanely Configurable? by Monica Schnitger. I also had a chance to write a few articles after the PLM 360 release. Navigate to the following link to read my article – Autodesk, Cloud and PLM for $19.95. This was my conclusion a few weeks ago, just after the announcement:

Autodesk made a significant turnaround from rejecting PLM to claiming an Autodesk PLM revolution is coming to every manufacturing company. If I think Darwinian, it can be a confirmation of Autodesk’s ability to adapt to the reality of today’s world. One of the conclusions I made last week during the PLM Innovation conference in Munich – PLM is strategic now. Autodesk is claiming a PLM revolution and emphasizing “technology” as one of the enabling factors. It means the technologies behind Autodesk PLM 360 are what made Autodesk PLM possible. I’m looking forward to seeing a technological whitepaper about Autodesk PLM 360 with some details going beyond marketing buzzwords. Time will show what Autodesk is serving us in the PLM cloud box.

Autodesk PLM 360 Hands On

Autodesk provided me access to Autodesk PLM 360. My PLM 360 tenant runs on Autodesk QA servers. I discovered that later. It was good to know and helpful in putting my impressions about performance and availability into context. After doing some work during the past 3 weeks, this article is an attempt to summarize my initial take on PLM 360, how it is different from other PLM systems on the market, and as much as I can say about PLM 360 architecture and technology.

General Concept and Strategy

Let me talk about how I understood Autodesk’s concepts and strategy with regard to PLM 360. Autodesk stepped into the PLM game after a long time of ignoring PLM. The key point made by Carl Bass during AU was – now we have the right technology to solve the problems of manufacturers. The bold hint was that the technologies and PLM products available from competitors cannot do so. Autodesk’s conceptual differentiation is the cloud. At the same time, Autodesk has their own PDM product – Autodesk Vault PDM. So, Autodesk is creating a strategy of how to use PDM on premises and PLM on the cloud. In my view, this strategy is interesting and can hold up in real implementations beyond presentation slides. However, the weak point of this strategy will be the ability of Autodesk PLM to provide an effective integration solution and technology. Autodesk will have to balance between the existing PDM product and the development of PLM on the cloud, which will obviously bring overlaps and lots of questions. One of the examples is the Bill of Materials. My initial take on this problem is here – Cloud PLM and Bill of Material Question. Here is my short graphical explanation of the Autodesk PLM concept:

Data Architecture

The data model is one of the key elements of any PLM system. In the absence of technical information from Autodesk, here is my take on the PLM 360 data architecture. PLM 360 is not much different from other PLM systems. It is most probably using an SQL-compliant database as the foundation of its data architecture. On top of that, there is an object abstraction layer you can use. If you’re familiar with a few other PLM/PDM systems, it won’t take you much time to get to the PLM 360 model. The core data modeling abstraction concept is the “workspace”.

A workspace is defined as a set of attributes with related information and specific behavior. In that sense, it is not different from the Object / Class / Business Object abstractions used by other PLM systems developed back in the 1990-2000s. I discovered rigidity in some definitions, like the inability to extend the number of tabs in a workspace, and some other issues. This leads me to some assumptions about the RDBMS modeling behind it, but I don’t see it as something critical.
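Since Autodesk does not publish the PLM 360 schema, the following is only an educated guess at the general pattern such object abstraction layers tend to follow: workspace definitions, attribute definitions, items and attribute values stored in a handful of generic tables over an SQL database. All table and column names here are invented for illustration and do not come from Autodesk.

```python
import sqlite3

# A generic "workspace" abstraction over a relational database. This is a common
# pattern for configurable PLM/PDM data models, NOT Autodesk's actual schema;
# every table, column and value below is invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
  CREATE TABLE workspace     (id INTEGER PRIMARY KEY, name TEXT);
  CREATE TABLE attribute_def (id INTEGER PRIMARY KEY, workspace_id INTEGER,
                              name TEXT, data_type TEXT);
  CREATE TABLE item          (id INTEGER PRIMARY KEY, workspace_id INTEGER);
  CREATE TABLE attribute_val (item_id INTEGER, attribute_id INTEGER, value TEXT);
""")

# Define a "Problem Reports" workspace with two attributes...
conn.execute("INSERT INTO workspace VALUES (1, 'Problem Reports')")
conn.executemany("INSERT INTO attribute_def VALUES (?, 1, ?, ?)",
                 [(10, "Severity", "string"), (11, "DueDate", "date")])

# ...then create one item in that workspace and fill in its attribute values.
conn.execute("INSERT INTO item VALUES (100, 1)")
conn.executemany("INSERT INTO attribute_val VALUES (100, ?, ?)",
                 [(10, "High"), (11, "2012-04-15")])

# Reading an item back means joining values to their attribute definitions.
rows = conn.execute("""
  SELECT d.name, v.value
  FROM attribute_val v JOIN attribute_def d ON d.id = v.attribute_id
  WHERE v.item_id = 100 ORDER BY d.id
""").fetchall()
print(rows)  # [('Severity', 'High'), ('DueDate', '2012-04-15')]
```

If the implementation follows anything like this pattern, it would also explain the rigidity I noticed – things like the number of tabs end up baked into the schema or the application layer rather than being user-extensible.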

User Experience

The overall user experience of PLM 360 is nice. The web application has nicely designed UI elements. It looks a little outdated compared to modern UI experience – more circa 2010. You can take a look at the following two screenshots, which basically show you the majority of the UI experience.

In addition to that, I found a few elements of the user interface with original and interesting decisions. One I want to mention is the interface for defining the attributes of a workspace.

Administration

Overall, PLM 360 can be configured and administered via the web user interface. No local utilities are needed. Please see the screenshots above, which I used to explain the idea of a workspace. In addition to that, you have a quite extensive set of administration tools for scripting and additional configurations.

The only exception to this rule is the workflow designer – a rich application that requires Java to run. I’d say this somewhat reduces the flexibility of process management that can be done with PLM 360.

Architecture and Technology

Autodesk is quite secretive about everything related to sharing technological and architectural information about how the system is built. They are taking full responsibility for the hosted cloud servers. I don’t have confirmed information about which company is hosting PLM 360 for Autodesk. According to a Graphic Speak publication, PLM 360 is hosted on dedicated servers. You can try to draw some conclusions about multitenancy from that. In the picture below, I created a PLM 360 pseudo-architectural diagram. The picture I drew is based on my “educated guess” and “detective actions” :). PLM 360 has a near-traditional enterprise architecture containing an SQL-compliant database, server code and a web frontend.

What is my conclusion? I have a positive impression of PLM 360. It is stable, and I could perform my research experiments as well as some customization and development work. The overall maturity of the system is even higher than I would expect from a system developed from scratch, as it was mentioned by Randal Newton in his article – “Autodesk PLM 360 is the first PLM product written from scratch for contemporary cloud technology. Autodesk is betting it will be a hit with companies of all sizes.” At the same time, I didn’t find special novelty in the data-management paradigm. Also, I didn’t find any confirmation of flexibility and scalability of the system going beyond traditional PLM solutions (in the case systems like Enovia, Aras, etc. are hosted on the cloud). The concepts of integration for PLM 360 are not clear and raise a concern about how PLM 360 can be embedded into an overall company IT strategy. With all that, I found my user experience quite enjoyable, and I liked how PLM 360 performed tasks. These are just my thoughts… I’ll continue my experiments with PLM 360 and hope to come back with more articles.

Best, Oleg


How to re-invent “PLM collaboration” world?

March 24, 2012

What do you think about “PLM Collaboration”?… Yes, I can hear you – boring. However, what if I tell you that collaboration can be cool again? Over the past year, I was tracking a few vendors investing in and playing with the collaboration topic. Today I decided to give you my perspective on what I believe will re-invent “collaboration” to the degree that will allow regular people to work with and love these new collaborative tools.

Easy To Get

This one is simple. Nobody wants to deal with installations these days. If you want to provide a collaborative system, please make it available online or in a way that simplifies the path to start using the tool. This is absolutely critical.

Access and Mobile

Access is one of the very important things collaboration needs to adopt. Collaborative tools need to be available everywhere and not restricted to a particular environment, infrastructure or device. That is in the past. Nowadays, mobile and BYOD rock the field. All together it will create a significant challenge for IT, but it just must go this way.

Collaboration and Cool

Cool is important! Don’t make it boring. You may say cool is not important when it comes to business. I think the opposite. Even if enterprise companies are still driven by old-generation people, “new influencers” are becoming more and more visible. People want to keep the same cool devices and use the same cool systems as they use at home.

New Gen of Collaboration Tools

I’d like to share a few videos related to new collaboration tools I’m following. The first one is Vuuch. The company was established back in 2009. The idea of Vuuch is to improve collaboration through social interaction. The last version of Vuuch – V5 – introduced additional features that make a deeper connection to working processes. You can see a video here. My guess is usability is one of the biggest topics.

SolidWorks n!Fuze is a system that helps engineers to collaborate, mostly around SolidWorks files. It adopts the cloud and improves the connection between people and files outside of the organization. It was released last year (2011). It was a flagship product for SolidWorks and uses the Enovia V6 cloud platform. At the same time, SolidWorks admitted that many mistakes were made.

Autodesk Cloud. This product is kind of an n!Fuze remake, together with an attempt to bring easy collaboration with files. It looks a bit Googlish, but fundamentally lacks many functions. The user experience of Autodesk Cloud shows that customers are more in focus.

TeamPlatform. This is the newest addition to my “collaboration collection”. I found the application behavior slick and nice. This system is clearly taking the cool and best experience of the consumer web. You can navigate here to learn more. It supports a few scenarios around file collaboration, managing projects and other files. It is too early to say what TeamPlatform can do, but it looks interesting.

What is my conclusion? I think more companies will try to challenge large vendors with a new type of applications leveraging cloud, mobile and other best practices. It will be interesting to see how they manage to find a precise scenario that gets engineers and other people in organizations to start using them. We will need to get back to this topic in a few months to see the trajectories of collaboration.

Best, Oleg

picture credit David Castillo Dominici / FreeDigitalPhotos.net


PLM Think Tank March Top 5

March 23, 2012

I wanted to start by quoting one of my blogs earlier this month – PLM is a fun place to be again. After almost a decade of slow motion and acquisitions, we are coming again to a state where new companies are created, and you can see some interesting alternatives appear in the market. There are a few reasons for that. Among all the disruption factors I’d like to mention three – cloud, consumerization and cost. They are connected, but ultimately drive most of the changes these days. Autodesk is playing an interesting role in this disruption. By stepping into the game and committing to "cloud PLM", Autodesk clearly wants to become an innovator in this space. Later this year, we will have an indication of how successfully Autodesk PLM 360 plays the first round. Now, I’d like to move to my traditional Top 5 review.

PLM Perfect Storm 2012

It is an interesting time to be in the PLM market these days. As I wrote in my recent blog – SolidWorks community and opportunity for PLM, there is a significant opportunity to deliver a PLM solution to the white space market these days. Gartner’s PLM market dynamics slide highlights the same opportunity. It is clearly a perfect storm. Large PLM companies have a lot of money to play the future PLM game. They have a lot to win as well as to lose, in case something goes wrong. Who will take the best “stormy seat” in this game? An interesting question to ask.

Autodesk, Cloud and PLM for $19.95…

Autodesk made a significant turnaround from rejecting PLM to claiming an Autodesk PLM revolution is coming to every manufacturing company. If I think Darwinian, it can be a confirmation of Autodesk’s ability to adapt to the reality of today’s world. One of the conclusions I made last week during the PLM Innovation conference in Munich – PLM is strategic now. Autodesk is claiming a PLM revolution and emphasizing “technology” as one of the enabling factors. It means the technologies behind Autodesk PLM 360 are what made Autodesk PLM possible. I’m looking forward to seeing a technological whitepaper about Autodesk PLM 360 with some details going beyond marketing buzzwords. Time will show what Autodesk is serving us in the PLM cloud box.

PLM Innovation 2012: PLM is strategic, but what’s next?

PLM is definitely gaining more ground. It is about product development processes and business goals. Five years ago, the most typical question about PLM was “why?” These days, people are asking the question – “how?” Unfortunately, many questions are not answered yet.

Dassault V6, 3D Experience and "After PLM" Party

The story of 3D Experience is inspiring. I think Bernard got inspired by the Apple story. Experience is what is getting more important these days. I’m sure you know the Apple example. But anyway… MP3 players were around for quite a few years, but only the iPod/iTunes experience turned them into what we see now. It was about “music”, and Steve Jobs created a closed eco-system of “music experience”. So, Bernard Charles is probably thinking about a “future design experience eco-system”. It is cool! However, here is the problem I can see. The consumer audience is different from the business one. When it comes to companies’ business, IP assets, IT and many other things, companies get concerned about “closed eco-systems”. The last passage about V6 and data got me concerned about the future Dassault openness strategy too. In the past, V6 was criticized because it locked CATIA and Enovia behaviors together. The Dassault “after PLM” party seems to me like a story about how to create an ecosystem supporting the full cycle of innovation. Very visionary…

Cloud PLM and Bill of Material Question

I think the time has come to start asking simple questions like “where is my stuff?” Where is my CAD drawing? Where is my BOM? Where is my ECO? How can all these elements play together, since I need them to feed my manufacturing/ERP system and go to production? I believe we need to get a better understanding of how Autodesk Vault interplays with PLM 360. I hope Autodesk will demo it soon. The same question goes to all cloud PLM providers. How to integrate data between existing and new systems will become a key question in making cloud PLM successful.

Best, Oleg


PLM and Google Enterprise

March 22, 2012

Let’s talk about Google today. I’ve been writing about Google technologies and Google enterprise efforts quite frequently. One of the questions readers were asking me – when will Google develop PLM? The interest is obvious. Many people believe Google technologies can be quite powerful for the enterprise.

Read the following article – Google and the Enterprise: The Point? Money. Steve Arnold analyzes Google’s trajectories in enterprise software. The main conclusion – working with enterprise organizations is significantly different from working within the consumer internet. Here is my favorite passage:

In one memorable, yet still confidential interaction, Google allegedly informed a procurement manager that Google disagreed with a requirement. Now, if that were true, that is something one hears about a kindergarten teacher scolding a recalcitrant five year old. Well, that may have been a fantasy, but there were enough rumblings about a lack of customer support, a “fluid” approach to partners, and a belief that whatever Google professionals did was the “one true path.” I never confused Google and Buddha, but for some pundits, Google was going to revolutionize the enterprise. Search was just the pointy end of the spear. The problem, of course, is that organizations are not Googley. In fact, Googley-type actions make some top dogs uncomfortable.

Another interesting article related to Google Enterprise is on GigaOM. Notice the excellent explanation of the influence Google has had on Microsoft. In particular, we can think how Google forced Microsoft to come out with MS Office 365 in order to compete with the growing competition from Google Apps.

For all Google’s effort, the incumbent powers Microsoft Office and Exchange Server still lead the corporate applications and email market. Last fall, market researcher Gartner estimated that Google Apps for Business represented less than 1 percent of Google’s overall revenue and there is some doubt as to whether the enterprise apps business remains a priority for the company… Still, market share and revenue may never have been Google’s goal. By offering a lower-cost option to the Office/Exchange tandem, Google forced the market leader to respond, and that may have been the point all along.

I share Steve’s assumption about how much focus Google can put behind enterprise deals. The absence of key enterprise execs reporting to Larry Page confirms Google is not planning to make Enterprise PLM Googley in the near future.

What is my conclusion? Good news for PLM companies. If somebody thought Google would come out tomorrow with a PLM product, no worries. It probably won’t happen. No Google PLM in 2012. While I think Google technologies are fascinating, the absence of focus and experience with enterprise companies makes Google toothless in front of the large enterprise software dogs. Just my thoughts…

Best, Oleg


The Future of PLM consultancy

March 20, 2012

The internet has changed our lives over the last decade. I don’t think anybody will argue about that. Cloud is trending. Thinking about the different aspects of how the cloud is going to influence our future business, the topic of consultancy and service providers shouldn’t be missed. Some time ago, I posted – PLM, Cloud and Service Channel. Three months ago my conclusion was as follows:

I have no doubt the introduction of new cloud solutions won’t reduce the amount of services and implementations. So, first, it is good news for VARs and service providers. At the same time, cloud solutions will set a different “price” demand in front of vendors and partners. Re-organizing to a new pricing structure will be another challenge.

The following announcement a few days ago resonated with my thoughts about PLM cloud and service partners – General Atlantic And Sequoia Put $60M In Enterprise Cloud Consultancy Appirio. Here is the news, in a nutshell:

Cloud solution provider and consultancy Appirio is announcing $60 million in new funding led by private equity firm General Atlantic, with participation from existing investors Sequoia Capital and GGV Capital. This brings Appirio’s total investment to over $70 million. Previous investors include Salesforce.

Appirio shows some impressive growth.

In 2011, Appirio grew revenue by over 80 percent, expanded into Europe with the acquisition of Saaspoint and developed the Cloud Enablement Suite, an integrated set of applications that supports enterprise cloud development. Additionally, the company introduced CloudSpokes, a crowdsourcing community for cloud development.

So, what is my conclusion? All PLM service providers that are afraid of how the cloud will destroy their business need to calm down. There is plenty of work in deploying cloud solutions, and the money is there. You will probably need to justify your skills and think about how to adapt your practices and experience to the speed of the cloud. Just my thoughts…

Best, Oleg


Cloud PLM: what do you need to know about multitenancy

March 18, 2012

Multitenancy. You may ask me why I want to spend this Sunday morning talking about multitenancy. Well… two words – important and confusing. Now, with the announcement of Autodesk PLM 360, I expect conversations about multitenancy to happen more often and even create some turbulence during pre-sales cycles. So, let me step back and try the Wikipedia link. Here is the definition:

Multitenancy refers to a principle in software architecture where a single instance of the software runs on a server, serving multiple client organizations (tenants). Multitenancy is contrasted with a multi-instance architecture where separate software instances (or hardware systems) are set up for different client organizations. With a multitenant architecture, a software application is designed to virtually partition its data and configuration, and each client organization works with a customized virtual application instance. Multitenancy is also regarded as one of the essential attributes of Cloud Computing.[1]

History

The history of multitenancy goes back to a few computing paradigms we had in the past – time sharing and ASP (application service provider). Time sharing was very popular on mainframes (this is why many people consider mainframes one of the technological roots of cloud computing). The ASP concept, where another company “hosted” the product for customers, is another early example of a service that influenced the current state of multitenancy.

Economics and Multitenancy

I’d like to start with the economics of multitenancy. Thinking about cloud software, multitenancy creates the fundamentals for resource sharing. As a result, you can make the operational cost lower. It increases the ability to compete and provide a more attractive price point for services. Hosting without multitenancy won’t provide such an advantage, since you will have to host a server per customer. It obviously can bring an advantage of scale economy, but even so it won’t be on the same level as multitenancy.
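To put purely hypothetical numbers on it: if a dedicated hosted instance costs, say, $500 a month in server and administration overhead, then 100 single-tenant customers cost the vendor roughly 100 × $500 = $50,000 a month, while the same 100 tenants sharing one multitenant cluster might need only a handful of servers, so the per-customer infrastructure cost drops dramatically. The figures are invented for illustration; the point is only that with multitenancy the cost curve flattens as tenants are added.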

Technology, Complexity and Examples

The majority of enterprise applications developed during the last two decades were single tenant. The target was the “client-server” environment and the data center. Even applications developed with “web architecture in mind” assumed a database and application server dedicated to a customer. Multitenancy requires significant changes in architecture. Some of the enterprise software providers started to move their platforms towards multitenancy. The majority of knowledge about developing for multitenancy came from public web sites and SaaS application providers. The most famous example is salesforce.com. The architecture of salesforce.com assumes full multitenancy on the application and data level. If you want to deepen your knowledge about how the Salesforce platform is designed, navigate to a few available YouTube videos. The few slides below can give you a high-level view of the Salesforce vision of multitenancy.

Salesforce view on waste of multi-tenancy

Salesforce.com – multitenancy advantages

Salesforce high-level multitenant architecture

I can recommend another interesting article about multitenancy from Microsoft. Navigate to the following link – Multi-Tenant Data Architecture. The document is relatively old (2006), but provides some interesting illustrations of how multitenancy and data architecture can be designed. Take a look at the following picture illustrating the three approaches to managing multi-tenant data:

Multitenancy and data architecture (source: MSDN)

There are obvious pros and cons to the different solutions. Multitenancy has the obvious advantages mentioned above. However, the complexity and cost of developing a multitenant solution are higher than for a single-tenant alternative.
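As an illustration of the trade-off, here is a minimal sketch of the “shared schema” approach from the Microsoft paper: all tenants live in the same tables, separated by a tenant_id column that every query must filter on. This is purely illustrative; the table, tenants and items are invented, and it does not describe how PLM 360 or any other specific product is implemented.

```python
import sqlite3

# Shared-schema multitenancy in miniature: all tenants share the same table,
# separated by a tenant_id column. Purely illustrative; names are invented.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE items (
                tenant_id   TEXT NOT NULL,
                item_number TEXT NOT NULL,
                description TEXT)""")

db.executemany("INSERT INTO items VALUES (?, ?, ?)", [
    ("acme",    "100-001", "Bracket"),
    ("acme",    "100-002", "Housing"),
    ("initech", "100-001", "Sensor board"),   # same number, different tenant
])

def items_for(tenant: str):
    """Every query is scoped by tenant_id -- the rule that keeps tenants apart."""
    return db.execute(
        "SELECT item_number, description FROM items "
        "WHERE tenant_id = ? ORDER BY item_number",
        (tenant,)).fetchall()

print(items_for("acme"))     # [('100-001', 'Bracket'), ('100-002', 'Housing')]
print(items_for("initech"))  # [('100-001', 'Sensor board')]
```

The other two approaches described in the Microsoft paper (separate databases, or a shared database with separate schemas) isolate tenants more strongly, but give up part of the resource sharing that makes multitenancy economically attractive.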

What is my conclusion? In my view, we are going to hear more about multitenancy. Cloud is disruptive. As vendors move to own servers and provide services, multitenancy will become one of the factors to improve profitability and decrease operational cost. At the same time, marketing will continue to use “buzzwords” to win the social marketing and pre-sale game. If you are a customer shopping for your first cloud solution, you’d better get yourself a bit educated about the topic of multitenancy. Just my thoughts…

Best, Oleg

