Dogfooding and PLM APIs: Random Thoughts

November 14, 2012

If you have been in the software business long enough, you should be familiar with the term "dogfooding" (or "eating your own dog food"). The term describes a situation in which a company uses its own products to demonstrate their capabilities and quality. If you are not familiar with the practice, navigate to the following Wikipedia article to read more. I liked some of the examples there, specifically the Apple one, which I wasn’t aware of:

Apple Computer president Michael Scott in 1980 wrote a memo announcing that "EFFECTIVE IMMEDIATELY!! NO MORE TYPEWRITERS ARE TO BE PURCHASED, LEASED, etc., etc." by the computer company, with a goal to eliminate typewriters by 1 January 1981.[9]

The following passage brings a few more examples:

One perceived advantage beyond marketing is that dogfooding allows employees to test their company’s products in real-life scenarios,[3][5] and gives management a sense of how the product will be used, all before launch to consumers.[5] In software development, the practice of dogfooding with build branches, private (or buddy) builds, and private testing can allow several validation passes before the code is integrated with the normal daily builds. The practice leads to more stable builds[citation needed], and proactive resolution of potential inconsistency and dependency issues, especially when several developers or teams work on the same product. For example, Microsoft and Google emphasize the internal use of their own software products[citation needed]. For Microsoft, especially during the development stage, all employees across the corporation have access to daily Software builds of most products in development, including the Windows operating system.[citation needed]

Today, I want to speak about a specific kind of "dogfooding" related to PDM/PLM APIs (Application Programming Interfaces). In the world of PLM implementations, the role of an open API is becoming very important. When I work with customer requirements, I often see notes calling for external programming or customization as a way to compensate for features or functions missing from the product. Yesterday, I had a chance to read the following TechCrunch article – 5 Rules for API Management. Even if you are not a programmer or software engineer, have a read and form your own opinion.

The article made me think about the complexity of API delivery in PDM/PLM as well as about "lifecycle". The latter is important – PDM/PLM products live for a very long time, and the development of stable APIs is a separate and almost mandatory prerequisite. The 5 rules – design, documentation, analytics, universal access and uptime – made perfect sense to me. I found an interesting note about the relationship between IT and the business group (which is also very typical for many PDM/PLM implementations):

Enterprise API Management must include the entire Enterprise, not just the techies in IT. The SOA solution, and the other gateways as well, is focused on the IT person and not the business owner of the API program. This is reflected in the UI that they present in their free version as well as their language that includes things like “policies”; too much of the business rules are codified in complex policies that require a technical expert to really use.

However, I found the notion of analytics the most interesting, since it can address the idea and requirements of API management throughout the lifecycle of the product. Here is the passage to think about:

[how to] think about the collection and processing of all the statistics associated with the use of the API, with an eye toward supporting and encouraging effective usage and discouraging/limiting usage that is counter to your business or technology goals.
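To make this more concrete, here is a minimal sketch of the kind of collection the passage talks about – counting calls and time spent per API function. The function and endpoint names are hypothetical, and a real PLM platform would push these records into a proper analytics store rather than keep them in memory.

```python
import functools
import time
from collections import defaultdict

# Hypothetical in-memory usage log; a real PLM platform would push these
# records into an analytics store instead of keeping them in a dict.
usage_stats = defaultdict(lambda: {"calls": 0, "total_seconds": 0.0})

def track_usage(endpoint_name):
    """Decorator that records how often and how long an API function is used."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.time()
            try:
                return func(*args, **kwargs)
            finally:
                stats = usage_stats[endpoint_name]
                stats["calls"] += 1
                stats["total_seconds"] += time.time() - start
        return wrapper
    return decorator

@track_usage("get_item_revision")
def get_item_revision(item_id):
    # Placeholder for a real PLM lookup (database query, service call, etc.).
    return {"id": item_id, "revision": "A"}

if __name__ == "__main__":
    for i in range(3):
        get_item_revision(i)
    print(dict(usage_stats))
```

With statistics like these in hand, a vendor can see which parts of the API are actually used, encourage effective usage and throttle or deprecate the rest – exactly the lifecycle view the article suggests.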

What is my conclusion? The days of single PLM platforms are almost gone. The future belongs to networks – data networks, product networks and cloud service networks. The need to adapt a product to customer requirements, to continue product development in a fast-changing customer environment and to meet strategic cloud deployment goals sets new challenges for PDM/PLM developers. An agile and flexible API that can sustain many product releases and development cycles has never been as important as it is today. Just my thoughts…

Best, Oleg

Image is courtesy of TechCrunch article (Feature image courtesy of XPlane – under Creative Commons.)


PLM Implementations and Open APIs

May 8, 2012

Let’s talk nuts and bolts today: APIs. In any PDM/PLM implementation, the question of APIs is one of the most important. Why so? Because, as you know, it is nearly impossible to get everything done out of the box and via configuration. Even if marketing advertised it and sales promised it, something will have to be done behind the scenes using this magic word – API.

PLM Openness

The topic of openness comes up very often these days. I posted about openness about a year ago – PLM and New Openness. The notable recent news around PLM openness is the so-called “Codex of PLM Openness” introduced by ProSTEP iViP. Navigate to the following link and you will discover that the majority of PLM vendors, including the big three (Dassault, PTC and Siemens PLM), are committed to it. Yesterday, during the opening session of the annual Siemens PLM user conference – PLM World 2012 in Las Vegas – the topic of PLM openness came up in many conversations and was even captured by the Siemens PLM blog.

Enterprise Systems and APIs

Enterprise systems have a long history of API development. If you have spent enough time with databases and enterprise software, you probably remember the horror stories of proprietary databases, the move to SQL, the hope of XML, the belief in SOA / Web Services, and the latest dreams about REST APIs. Last week, I came across a very interesting blog trilogy from CloudAve about enterprise architecture, APIs and more – Simple Service Enterprise part 1, part 2, part 3. It is a bit long, but I recommend you have a read. The following picture resonated with my thoughts about PLM implementations and APIs:

Here is my favorite passage that I’d apply to product lifecycle management and many other enterprise implementations:

…the fundamentals of information interchange: exposing business functionality, currently encapsulated in the back-end, to the outside world via services. These services are a one-to-one translation to back-end functions, which are one-to-one translations to business process steps themselves: the smallest level of business transaction.

Implementations, API and Open Data

Here is how I see the future of open APIs. A PLM system holds data hostage and is responsible for a set of processes and transactions. Since a PLM system cannot live in a vacuum, its interaction with other systems in the enterprise (including various B2B and B2C services) is driven by processes. To have a productive API, you need to expose these processes at an appropriate level of granularity, including the semantics of the data (in this context, thinking in terms of resources seems to me an appropriate way). Such semantically resource-oriented APIs can provide an easy and open way to interact with a PLM system and build the most effective services.
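To illustrate what I mean by resource-oriented granularity, here is a minimal sketch, assuming Flask is available. The resource names (items, bom) and the sample data are hypothetical and not taken from any specific PLM product; the point is that each business entity gets its own URL and related entities hang off it.

```python
# A minimal sketch of a resource-oriented PLM API, assuming Flask is installed.
# The resource names (items, bom) and the sample data are hypothetical.
from flask import Flask, jsonify

app = Flask(__name__)

ITEMS = {"100": {"id": "100", "name": "Bracket", "revision": "B"}}
BOMS = {"100": [{"child": "200", "qty": 4}, {"child": "300", "qty": 1}]}

@app.route("/items/<item_id>")
def get_item(item_id):
    """Each item is an addressable resource with its own URL."""
    return jsonify(ITEMS.get(item_id, {}))

@app.route("/items/<item_id>/bom")
def get_bom(item_id):
    """The bill of material is exposed as a related resource of the item."""
    return jsonify(BOMS.get(item_id, []))

if __name__ == "__main__":
    app.run()
```

A consuming service only needs to know the resource URLs and the meaning of the data; it does not need to understand the internal data model of the PLM system behind them.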

What is my conclusion? Building a good API is a very complicated task. Making it an open API is even harder. I see potential in exposing both the semantics of the data and the related system functions in a way that lets me use them and accomplish processes automatically. I think the web and REST give us some promise here. The responsibility of vendors is to develop an appropriate level of granularity to make it usable. Just my thoughts…

Best, Oleg

Picture is courtesy of CloudAve.


Top 5 API Mistakes in PLM Software

October 13, 2010

With any enterprise software, there comes a point in time when we ask the following question – "what APIs are supported?" We anticipate the need for an API in order to configure, customize, tailor, change, adapt, etc. Use whatever word marketing prefers; in the end you will need an API to make the software finally work the way you want. The APIs provided by software define its level of flexibility, and this is one of the important characteristics you need to take into account.

The "10 common mistakes by API providers" article on ReadWriteCloud made me think about mistakes vendors are making when designing and providing APIs. I decided to put my top 5 lists of typical problems I’ve seen in APIs for CAD, PDM, PLM software. I believe, this list is not unique in CAD/PLM and may represent some general trends as well.

1. Complexity of data and API structures
The simplicity of an API is one of its most important characteristics. When it comes to complex products like CAD or PLM, a very typical problem is a high level of complexity in API functions, dependencies and data structures. The level of knowledge required to work with the API grows enormously, and the potential for bugs and mistakes grows with it.

2. Late availability compared to software release
This one represents a painful situation in which APIs become available only in the "next version" of the product. Normally, it reflects a lack of maturity in the product functions. It often happens when a vendor releases a new version while anticipating functional changes in the following one.

3. Inconsistency with core product functions
Another very painful situation is when the product, from a user standpoint, behaves differently from the API. It normally happens when the development organization is inconsistent with regard to planned functions and the API. The involvement of different teams of developers and the separation of user-function development from API development can cause these situations.

4. Unstable APIs and lack of backward compatibility
When you customize software for a customer or develop your own application, you want to minimize potential problems when a new version of the software comes to market. The ability of software to support a stable set of APIs between releases is a very important characteristic. An unstable API can jeopardize the deployment of a new version or a new release of your software. This is a very complicated issue that causes a lot of problems.

5. Lack of tests
Some software vendors consider developers a cheap workforce for debugging new software. I see a higher probability of hitting an API problem than a problem in a user function of the same product. In my view, it can be prevented by increased use of the vendor's own API for internal development. It often happens when the API is considered an "addition to the software" and developed separately from the core functions. A small sketch illustrating this point, together with the stability point above, follows below.
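To illustrate points 4 and 5, here is a minimal sketch; the function names and signatures are hypothetical and not taken from any real PLM product. The idea is to keep the old call alive as a thin, deprecated wrapper around the new one, and to exercise the public API with the same simple tests the vendor's own developers should be running ("dogfooding" the API).

```python
import warnings

def get_item(item_id, include_bom=False):
    """New-style API call: returns an item record, optionally with its BOM."""
    item = {"id": item_id, "revision": "A"}
    if include_bom:
        item["bom"] = []
    return item

def GetItemById(item_id):
    """Old-style call kept alive for backward compatibility; delegates to the new one."""
    warnings.warn("GetItemById is deprecated, use get_item()", DeprecationWarning)
    return get_item(item_id)

def test_old_and_new_api_agree():
    # The kind of test internal development should run against the public API,
    # not against private internals.
    assert GetItemById("100")["id"] == get_item("100")["id"]

if __name__ == "__main__":
    test_old_and_new_api_agree()
    print("API compatibility test passed")
```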

There is one more thing – the availability of APIs. I didn’t put it in the list for several reasons. Sometimes CAD/PLM vendors decide to protect some aspects of their software by restricting APIs. This practice is more related to business strategy and cannot be considered an "API mistake". Still, it is another problem that often happens and needs to be taken into account.

What is my conclusion? The amount of revenue in the engineering software market related to services and customization is very high. For some segments, it approaches 50% or even more. APIs are the core fuel behind these revenues. A consistent, stable and simple API is an important characteristic of every successful product in this market. Just my opinion.

Best, Oleg


PLM Content and Facebook Open Graph

April 28, 2010

The Facebook F8 conference this week brought some very interesting announcements that, in my view, may have an impact on PLM too. I recommend reading a good summary of the news introduced by Facebook here. In my view, it presents a very interesting dimension in the future development of Facebook. The questions I’m asking are – should PLM care? Did Facebook present something new that deserves the attention of current and future PLM development?

Web Content Creation

Facebook is taking the next steps in the development of content on the web. FB is introducing a new way to build social sites, where a site itself is converted into a meaningful content item able to accumulate links to other elements of web content. It presents another dimension in the way web content can be structured.

Social Graph

The notion of connections between social sites is presented in the Open Graph Protocol. The Facebook OGP API will provide a way to browse dependencies and connections between content elements along a social dimension. This is a very interesting approach to developing rich metadata about web content.
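For readers who want to see what this metadata looks like in practice, here is a minimal sketch that reads Open Graph properties from a page using only the Python standard library. The sample page is made up, but real pages embed the same kind of <meta property="og:..."> tags in their head section.

```python
# Reads Open Graph properties from HTML using only the Python standard library.
from html.parser import HTMLParser

class OpenGraphParser(HTMLParser):
    """Collects <meta property="og:*" content="..."> tags into a dict."""
    def __init__(self):
        super().__init__()
        self.properties = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        prop = attrs.get("property") or ""
        if prop.startswith("og:"):
            self.properties[prop] = attrs.get("content", "")

sample_html = """
<html><head>
  <meta property="og:title" content="Bracket assembly" />
  <meta property="og:type" content="product" />
  <meta property="og:url" content="http://example.com/items/100" />
</head><body></body></html>
"""

parser = OpenGraphParser()
parser.feed(sample_html)
print(parser.properties)
```

A few lines of metadata are enough to turn any page into a node in the graph – this is the simplicity that makes the approach interesting compared to heavyweight structured content.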

PLM Content

I see a parallel between the way Facebook is building social web content and PLM content. For the last few years, PLM has offered a very hardwired way to create structured content, which creates a lot of complexity in user interaction and tool integration. The fundamentals of this content lie in the ability to structure product information along various dimensions – design, bill of materials, projects, etc. The similarity between Facebook social content and PLM structured content is obvious to me; however, Facebook presented an interesting approach to building it. There are several developments on the PLM horizon that can fit such concepts – DS/BlueKiwi, PTC Windchill/SharePoint, Vuuch. It will be interesting to see the future development of these products.

What is my conclusion? Facebook is going to change the way web content is created. Current PDM/PLM software seems very cumbersome from the standpoint of how it engages users in content creation. PLM can learn a few lessons about how to create content in the organization and across the value chain. Some concepts and ideas can be replicated, in my view. What is your opinion?

Best, Oleg



Next Level of PLM Social Tools Development

January 8, 2010

I want to discuss what the next step in the development of the social aspects of PLM tools could be, as well as what could become a major driver for future social PLM. Last year, we saw multiple examples of mature PLM vendors and small companies moving towards establishing products in the social domain. Social Innovation, Social Product Development, Social Design… All these buzzwords were used, but I want to dig inside and discuss how, practically, these tools can get some level of social acceptance in the enterprise.

Social vs. Siloed?

This is one of the questions that comes to my mind when I think about the multi-vendor race toward social tools. The major barrier is user adoption. How many social networks can you be a member of? How can you track your participation in multiple forums, social groups, communities, etc.? What if tomorrow’s products come as social software from multiple vendors – SharePoint communities vs. Salesforce Chatter? How many other social networks and communities can practically exist in an organization? My conclusion is that the social experience cannot be siloed – a certain mechanism needs to exist to allow people to communicate across business application boundaries.

Social API

How do you organize a cross-application social experience? This problem is not new and exists today across the multiple social networks we have – Facebook, LinkedIn, MySpace, Ning, etc. One option to integrate communication across these networks is the development of a social API that allows communication in a uniform way. An example of such an API is OpenSocial:

OpenSocial helps these sites share their social data with the web. Applications that use the OpenSocial APIs can be embedded within a social network itself, or access a site’s social data from anywhere on the web.

I’d recommend you take a look at the Open Social for Enterprise white paper. It defines some interesting concepts about how an API can allow cross-application social tools to co-exist and not be siloed into specific application niches.

OpenSocial Architectural Concepts

Broadly speaking, OpenSocial defines two concepts. The first is gadgets, a component model based upon HTML, Cascading Style Sheets (CSS), and JavaScript™, that describes how content gets managed and rendered inside a Web browser. If you use sites like iGoogle, then you are already familiar with gadgets. The second is a set of APIs for accessing and working with social data. These APIs define how you access information about a person, their relationships, and their activities. These APIs are made available for use in gadgets, via a set of JavaScript APIs, as well as programmatically via REST. OpenSocial applications can take the form of gadgets that can be embedded into any container that supports the OpenSocial specification or traditional SOA services for integration. Taken together, the gadget component model, social APIs, and REST interfaces, provide a programming model that enables the creation of standards-based social applications.
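As a small illustration of the REST side, here is a sketch of fetching a person's profile following the /people/{id}/@self pattern described in the OpenSocial REST convention. The container base URL is hypothetical, and real containers require OAuth request signing, which is omitted here for brevity.

```python
# Fetches a person's profile following the OpenSocial REST pattern
# (/people/{id}/@self). The base URL is hypothetical; real containers
# require OAuth signing, which is omitted here.
import json
import urllib.request

BASE_URL = "https://social.example.com/rest"  # hypothetical container endpoint

def get_person(person_id):
    """Return the profile of a single person as a Python dict."""
    url = "{}/people/{}/@self".format(BASE_URL, person_id)
    with urllib.request.urlopen(url) as response:
        return json.loads(response.read().decode("utf-8"))

if __name__ == "__main__":
    profile = get_person("john.doe")
    print(profile.get("displayName"))
```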

You can see the adoption of social APIs, and of OpenSocial specifically, by existing social networks from MySpace, Ning, LinkedIn, Google and some others.

What is my conclusion today? PLM is interested in jumping on the social bandwagon. However, PLM will be able to do so only by adopting some “open behaviors” that are considered must-have attributes of the social world. This will be even more important in the enterprise than in the consumer world.

Just my thoughts.
Best, Oleg


