PLM 2013: What is your 7-year plan?

December 31, 2012

I’ve been out of active blogging for the last week because of the Autodesk week of rest. It is a perfect time to disconnect from day-to-day activities, stop and think about what happens in PLM from different perspectives – customers, vendors, technology. Approaching the end of the year, we can see a huge number of blog posts with titles like "the most important things you need to pay attention to in 2013". I’m a bit tired of these "predictions". I decided to jump over them and think about a longer-term perspective of PLM, beyond 2013. One of the companies that has always impressed me with its long-term strategy is Amazon. You can read a bit about Amazon's long-term strategy in last year's NYT article here. My favorite passage is this one -

“If everything you do needs to work on a three-year time horizon, then you’re competing against a lot of people,” Mr. Bezos told reporter Steve Levy last month in an interview in Wired. “But if you’re willing to invest on a seven-year time horizon, you’re now competing against a fraction of those people, because very few companies are willing to do that. Just by lengthening the time horizon, you can engage in endeavors that you could never otherwise pursue. At Amazon we like things to work in five to seven years. We’re willing to plant seeds, let them grow—and we’re very stubborn.”

It made me think about what I can see on the horizon of PLM products and technologies for the next seven years. Making long-term predictions is a risky thing. Nevertheless, I will take a chance and put some of my questions, ideas, thoughts and conclusions below.

What do we do after "cloud" goes mainstream?

We are in the middle of the biggest technological transformation since the PC era. Think about consumer technologies and products. Cloud technologies went mainstream in many areas – email, photo sharing, social networks, video streaming. The PLM cloud revolution is moving forward these days as well. Vendors are applying different strategies. The two main disputed topics – security and availability – will be resolved sooner or later. Today, you still have time to get prepared for a new level of access, openness, mobility and ease of PLM cloud adoption.

PLM business models

Customers demand predictable business models that will allow PLM to grow in organizations. Today, PLM business models cannot scale to the level of the enterprise and beyond. You can hear lots of discussion about that. Providing simple and affordable pricing models is the next challenge in front of CAD/PLM vendors. It won’t happen fast. Changing a revenue model is one of the most challenging things for any organization. At the same time, I don’t see any other way to make PLM available to everyone.

The value of small communities

The idea of communities is going mainstream in social networking and media. PLM companies and a few startups have tried to apply it to engineers, manufacturing companies and product development. There have been successes and failures. There are a few Facebook copycats developed by startups and established PLM/ERP vendors. I believe it was the right step for 2012. However, thinking about the longer term, the value of these communities is limited. PLM companies will have to explore the value of small communities focused on a specific vertical, industry, segment, etc. These small communities will have a lot of impact and potential in the future.

How to build PLM open data?

The era of data exchange is about to end. No industry can survive under such a huge amount of data translators and data modeling best practices. PLM vendors need to discover the growing world of shared vocabularies – Dublin Core, Freebase, GoodRelations, RDFS, SKOS, SIOC and others. Leveraging open vocabularies will be extremely important in order to build connected PLM services.
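To make the idea of open vocabularies a bit more concrete, here is a minimal sketch of publishing a part description in RDF while reusing GoodRelations terms. It assumes the Python rdflib library; the part number, manufacturer and example.com URIs are my own placeholders, not an existing PLM vocabulary.

```python
# A minimal sketch of describing a part with shared vocabularies (rdflib +
# GoodRelations). The EX namespace, part number and manufacturer are
# hypothetical placeholders.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, SKOS

GR = Namespace("http://purl.org/goodrelations/v1#")
EX = Namespace("http://example.com/plm/")

g = Graph()
g.bind("gr", GR)
g.bind("skos", SKOS)

part = EX["part/PN-10045"]
g.add((part, RDF.type, GR.ProductOrServiceModel))
g.add((part, GR.name, Literal("Hydraulic pump housing")))
g.add((part, SKOS.prefLabel, Literal("Pump housing, cast aluminum")))
g.add((part, GR.hasManufacturer, EX["org/acme-castings"]))

# Any service that understands GoodRelations can now consume this data.
print(g.serialize(format="turtle"))
```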

What is my conclusion? The world of manufacturing, engineering and product development is changing. It is not changing as fast as the consumer world and the web. In my view, many manufacturers are now holding back their plans for the future development of PLM initiatives. My favorite quote from Nathan Myhrvold (ex-CTO of Microsoft) - If you want to have a great future you have to think about it in the present, because when the future’s here you won’t have the time. Manufacturing and product development is at the very beginning of the adoption curve for technologies and methods developed by the web during the last two decades. It will take time to transform and adapt them to fit enterprise companies’ needs, but once it happens, the "new PLM" will stay with us for a long time.

Best, Oleg

Image courtesy of [jscreationsz] / FreeDigitalPhotos.net



PLM Scale and Some Internet Factoids

December 22, 2012

The scalability of enterprise systems is an interesting topic. Enterprise IT usually keeps the story about system scalability close to its chest. It involves data centers, databases, channels, networks, latency, and many other aspects that allow you to tune your enterprise PLM. And as far as I know, it was absolutely true for existing enterprise PDM and PLM.

The situation is different nowadays. The last 10 years of web and internet development established a new level of scale. The amount of data and user activity the web and social networks can handle goes far beyond typical enterprise deployments. The following ArnoldIT factoid article captured my attention earlier this week. I don’t know if these numbers are accurate. But knowing that Gangnam Style just became the first video to hit 1B YouTube views, I can easily believe them.

Every minute more than 1,649,305 tweets get shared.
Every minute more than 3,472,225 photos get added to Facebook.
Every minute more than 2,060 brand new blogs are created.
Every minute more than 52,488 minutes of video are added to YouTube.
Every minute more than 31,510 new articles are created by an online newspaper.
Every minute more than 3,645,833,340 new spam emails are delivered online.

What is my conclusion? The consumer web and social media introduced a completely different perspective on scale, capacity and system performance. Enterprise PLM vendors and IT service companies need to start paying attention. The technology consumer systems are developing these days can easily outperform existing enterprise PDM and PLM deployments. It is important. Just my thoughts…

Best, Oleg

Image courtesy of [ddpavumba] / FreeDigitalPhotos.net


Who will clone existing PLMs to the cloud?

December 20, 2012

The discussion about PLM and cloud is moving to the level where "details become important". In my view, many customers today are moving from "why cloud?" to "how can we leverage cloud?" type of questions. Cloud has many faces. In my previous blog post – PLM and the diversity of cloud options – I discussed how multiple cloud deployment options can be used for PLM – IaaS, PaaS, SaaS. Different PLM vendors are choosing different strategies. Here are some examples – TeamCenter is choosing IaaS, Aras decided on PaaS and Autodesk PLM360 is SaaS.

The PLM industry is coming to the cloud with heavy baggage of technologies and products developed over the last 15-20 years. Existing PLM products and the amount of customer investment in PLM programs raise valid questions about how to leverage these assets in the cloud. I’ve been reading a TechCrunch article about the CloudVelocity startup – CloudVelocity Launches With $5M From Mayfield To Bring The Hybrid Cloud To The Enterprise. Here is the interesting passage:

Users can discover, blueprint, clone, and migrate applications between data centers and public clouds. Currently, CloudVelocity supports full server, networking, security and storage integration with AWS but plans to integrate other public clouds, such as RackSpace in 2013. The beta trial of the Developer Edition cloud cloning software allows users to clone multi-tier app clusters and services, without modification into the Amazon Web Services (AWS) EC2 cloud. The Enterprise Edition enables users to clone, migrate and failover multi-tier apps and services into the AWS EC2 cloud.

The article made me think more about hybrid cloud and the opportunity to expand existing PLM implementations to the cloud. Imagine you could clone your existing PLM implementation and move it to the AWS or RackSpace cloud. It would allow you to build a secured environment and expand your PLM deployment with additional services. Here is a possible example. Many companies have a BOM management implementation done as part of basic PDM/PLM programs. Future expansion of these services to NPI or Service Management requires additional resources and global availability. By "cloning" the existing BOM management implementation to the cloud, you can further expand to additional services (a rough sketch of the idea follows below).
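To make the "cloning" idea tangible, here is a crude sketch of copying an on-premise PLM/BOM database to a cloud-hosted instance. This is not what CloudVelocity does; it simply assumes the PLM data sits in PostgreSQL, uses the standard pg_dump/pg_restore tools, and every hostname, database and user here is hypothetical. A real clone would also have to cover file vaults, licensing and CAD integrations.

```python
# A rough sketch of "cloning" a PLM/BOM database to the cloud, assuming
# PostgreSQL on both ends; hosts, databases and users are hypothetical,
# and authentication is left to the usual PostgreSQL mechanisms.
import subprocess

SOURCE = {"host": "plm-db.corp.local", "db": "plm_prod", "user": "plm"}
TARGET = {"host": "ec2-203-0-113-10.compute-1.amazonaws.com", "db": "plm_cloud", "user": "plm"}

# 1. Dump the on-premise database in PostgreSQL custom format.
subprocess.run([
    "pg_dump", "-h", SOURCE["host"], "-U", SOURCE["user"],
    "-Fc", "-f", "plm_prod.dump", SOURCE["db"],
], check=True)

# 2. Restore the dump into the cloud-hosted instance.
subprocess.run([
    "pg_restore", "-h", TARGET["host"], "-U", TARGET["user"],
    "-d", TARGET["db"], "plm_prod.dump",
], check=True)
```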

However, I can see potential problems too. Many PDM/PLM environments have tight connections with desktop CAD applications. How to clone these environments to the cloud? This is a good question to ask.

What is my conclusion? The interest of customers in leveraging the cloud is growing. Still, many customers see cloud as a potential way to implement something that they cannot do today with traditional PLM programs. Sometimes it is infrastructure limitations such as global deployment, and sometimes it is related to the cost implied by a growing PLM deployment. I can see a growing opportunity to provide a technology enabling companies "to clone" an existing PLM program to the cloud with room for future growth. It can be an interesting option. So, dear PLM developers, somebody will clone you tomorrow in the cloud. What do you think about that? Just my thoughts…

Best, Oleg

Image courtesy of [Victor Habbick] / FreeDigitalPhotos.net


Why Are Companies Not Ready for a Single BOM?

December 19, 2012

Bill of Materials is one of the fundamental things in engineering, manufacturing and product development. Whatever topic you start discussing, you end up with a discussion about BOM. Wikipedia actually provides a decent definition of Bill of Materials. Here is the link and a quote:

A BOM can define products as they are designed (engineering bill of materials), as they are ordered (sales bill of materials), as they are built (manufacturing bill of materials), or as they are maintained (service bill of materials). The different types of BOMs depend on the business need and use for which they are intended. In process industries, the BOM is also known as the formula, recipe, or ingredients list. In electronics, the BOM represents the list of components used on the printed wiring board or printed circuit board. Once the design of the circuit is completed, the BOM list is passed on to the PCB layout engineer as well as component engineer who will procure the components required for the design.

It sounds so simple and straightforward. If you just finished your "BOM 101", you might think the topic is really easy to get. The complexity of Bill of Materials management comes as a result of the processes happening around the BOM during design, engineering, manufacturing and support. What is defined as "different types" of BOM in reality represents people, teams, departments and sometimes different tools and enterprise solutions.

Some time ago, I posted about how companies can have a single BOM – Seven rules towards single bill of materials – which raised many questions and comments. One of the ideas behind having a single Bill of Materials is to streamline processes across disparate teams and departments. A few weeks ago, I came across a white paper published by Arena – Beyond BOM 101: Next Generation Bill of Materials Management. Navigate to the following link to read the document. This white paper provides a very interesting picture, which demonstrates the reality of BOM management in any manufacturing company.

(Image from the Arena white paper: multiple teams updating multiple BOMs across systems)

This white paper highlights a very important fact – during the design, engineering and manufacturing process, engineers need to update the BOM in many systems. Here is my favorite passage explaining the complexity of BOM management.

A modern BOM often includes a complex set of hundreds to thousands of structured items… Even after the first product is built, the BOM will continue to evolve—whether due to potential bug fixes, design improvements, part substitutes, or supplier switches—until the product reaches its end of life. The time spent to manually make changes and fix mistakes throughout the lifecycle of a product may amount to a substantial delay in its shipment. With multiple teams inputting frequent changes, manual revision control processes can easily become overwhelming and chaotic. It is difficult to track which changes have been made to which revisions. There is a lack of “a single version of truth” —the latest product information including BOM—that all project teams can consistently and confidently rely on throughout the lifecycle of a product.

The main challenge in this process is maintaining multiple BOMs in different systems. So the idea of a single Bill of Materials can easily materialize as a way to remove the complexity of synchronization. Why, then, do companies often fail to establish this single BOM? I can identify 3 main reasons why it happens:

1- Companies use a variety of tools to design, build and support products. Single-platform PLM is probably a dream that is not going to materialize. In most companies, multiple design tools (including CAD), product data management and ERP systems create a complicated ecosystem with many rules and dependencies.

2- Because of specialization, people are not interested in switching from specialized, tailored tools to something less functional but common. The change is complex and can lead to potential delays and the involvement of IT in system deployment and data integration. People prefer to bounce the BOM between systems rather than use a single tool.

3- It is hard to agree on how to share a single structured set of information (a single BOM) among multiple teams, departments and organizations. Developing export/import functionality as well as multiple synchronization services is, unfortunately, the mainstream decision taken by many companies.

What is my conclusion? I think companies need to have a single, sharable, structured BOM representation reflecting all aspects of product development. PLM vendors applaud the idea of a single database, but most of the integration and data synchronization tools and techniques are still very immature. In addition, PLM vendors usually try to lock customers into a single-platform solution, preventing independent and open bill of materials storage. All together, this blocks customers from migrating their infrastructure and systems towards a "single BOM" implementation. Just my thoughts…

Best, Oleg


3 things PLM can learn from UCO (Used Car Ontology)

December 18, 2012

Our digital and real lives are getting more and more connected. Think about our web experience these days: mobile devices, map services, location-based queries, social networks. The list of examples goes on. One of the biggest challenges we have, as a result of the web's exposure in our real life, is the need to integrate and interconnect many sources of information coming from different places. Think about the intertwining of your location information, photo posts and reviews. You have the Facebook Nearby service (an enhancement of this service was just announced yesterday). You can get some interesting perspective on the service here.

Moving to the main topic I wanted to discuss today, navigate your browser to the following SemanticWeb article - Introducing the used car ontology (UCO). The article speaks about the publication of an ontology (knowledge of concepts) that supports a precise description of used cars. The article references MakoLab, the company that did this work, and provides a link to the ontology itself. I found the following passage interesting.

“Publishing information about used cars… containing a description which refers to the Used Cars Ontology will allow for easier searching of cars for purchase, along with a more in-depth description of their state of exploitation. The UCO supplements the more general ontologies GoodRelations and Vehicle Sales Ontology created by Professor Martin Hepp. The GoodRelations ontology is now integrated with the famous dictionary schema.org created by Google, Yahoo and Bing, for the purpose of improvement of information searching on the Internet and better positioning of websites.”

Dig a bit into the UCO ontology document and you will find examples of queries and operations built with the UCO ontology. Spend some time with the document and you will learn how to get a report of used cars, get a car description, update information about a car and more. The information about used cars can be located on multiple websites. It made me think about the possibility of improving interaction with multiple islands of information. Here are the top 3 things I took away after analyzing the UCO doc:

1. Ontology can be used to produce meaningful queries. Web technologies provide a reliable instrument for working with data located on multiple websites. The semantic web provides a set of technologies helping you describe, query and manipulate the data (see the sketch after this list).

2. Nobody is interested in how data is stored. It is almost irrelevant how data is stored. A website with information about used cars can use any technology (from text files to Excel and databases) to store data. That knowledge is not needed to process data at web scale.

3. Publishing semantic information can improve cross-system data access. When your website and/or service publishes information in a semantically accessible way, the information can be intertwined and used by other services for different purposes.
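Here is the sketch referenced in point 1 – a tiny example of how one SPARQL query can run over car listings that could originate from different websites. It assumes the Python rdflib library; the namespace, property names and dealer URLs are simplified placeholders, not the actual UCO terms.

```python
# A small sketch of "meaningful queries": describe two used-car listings
# in RDF and query them with SPARQL, regardless of how each site stores
# its data internally. Namespace and property names are placeholders.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, XSD

UCO = Namespace("http://example.com/uco#")  # placeholder, not the published UCO namespace
g = Graph()

listings = [
    ("http://dealer-a.example/cars/1", 42000, 7900),
    ("http://dealer-b.example/cars/7", 125000, 3200),
]
for uri, mileage, price in listings:
    car = URIRef(uri)
    g.add((car, RDF.type, UCO.UsedCar))
    g.add((car, UCO.mileageFromOdometer, Literal(mileage, datatype=XSD.integer)))
    g.add((car, UCO.askingPrice, Literal(price, datatype=XSD.integer)))

# One query works across data coming from both "websites".
q = """
PREFIX uco: <http://example.com/uco#>
SELECT ?car ?price WHERE {
    ?car a uco:UsedCar ;
         uco:mileageFromOdometer ?km ;
         uco:askingPrice ?price .
    FILTER (?km < 100000)
}
"""
for row in g.query(q):
    print(row.car, row.price)
```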

What is my conclusion? The web is a good example of a system that grew beyond the level of a single database. Web data processing mechanisms are interesting from the standpoint of sustainability and data scalability. The used car ontology provides a good example of organizing interoperability beyond the level of a single website. My hunch is that some of these technologies are going to change the way PLM systems operate today. Just my thoughts… What is your take?

Best, Oleg


How to prevent PLM cloud from buying-experience complexity?

December 17, 2012

Experience is one of the most popular words in tech these days. CAD and PLM vendors are part of this journey as well. I would like to talk about the "buying experience". For many customers, the enterprise software experience actually starts from the moment the company needs to decide what product (software) configuration to buy. PLM and other enterprise software price lists are well known for their complexity: many products, options, configurations. In many cases, the final decision about the desired configuration of a software package can only be made after an evaluation and pilot implementation. Very often, a consulting and/or service company needs to be hired to help with the software configuration choice.

In contrast to enterprise on-premise software, cloud or SaaS software comes with a model that is supposed to simplify the decision process, especially when it comes to the buying decision. For many SaaS companies, the simplification of the buying process is one of the most critical elements of success. The configuration of a SaaS product is combined from a few options, like the following one for Salesforce.com. I’m sure you can come up with more examples like that.

Over the weekend, my attention was caught by the following article in ReadWriteWeb Enterprise – Cloud Complexity Clouts Enterprise Customers. The author speaks about the complexity of the Amazon Web Services cloud – experience gathered from the recent re:Invent conference. One of the sessions at the conference was completely dedicated to… billing. I found the following passage quite interesting:

The billing session came as a bit of a shock. One Apache OpenCloud committer who did not wish to be identified summed it up best: "If you have to have a session on billing, you’re doing it wrong."… It’s a valid argument, because while one should rightfully expect all levels of interest to be addressed at a trade show’s first run, it seems that something like billing for cloud services should be pretty simple and not in need of tutoring. This kind of complexity raises serious questions about the real value of using a public vs. private cloud in the enterprise.

I see the potential for PLM cloud to get infected with the "complexity disease". It can easily be inherited from on-premise PLM portfolios. It can also come from examples of cloud complexity like the one above. This is a very dangerous trajectory. In my view, cloud software is all about an easy experience – buying, starting to use, expanding. Vendors need to focus on getting customers involved and making decisions fast. Renewal of the service is another key element of success. A complex buying experience and a complex paying experience (including billing) won’t help cloud solutions expand their positions in organizations.

What is my conclusion? Simplification is one of the strongest trends today. It goes everywhere. Buying experience is one of the most critical elements, since it is one of the first interactions a customer has with the vendor and the software when starting to adopt a service. It is also critical from the standpoint of understanding software ROI. If you cannot understand your software bill, you will most probably have a problem calculating ROI too. Just my thoughts…

Best, Oleg

Image courtesy of [pakorn] / FreeDigitalPhotos.net


How can BoxCryptor solve CAD cloud concerns?

December 16, 2012

The simplicity of Dropbox and similar cloud-based file storage makes it very attractive to many people. Engineers are no exception. The discussion about "Dropbox and PLM" has been going on for a long time. I posted about some interesting Dropbox usage patterns a few weeks ago here. As you can read from the article, engineers are placing files in Dropbox. The problem of security is clearly identified these days. So, seeing a startup trying to solve this problem should not surprise you much. That’s what I felt when my attention was caught by BoxCryptor for Dropbox. Still an alpha version, but promising… Here is how BoxCryptor CEO Andrea Wittek explains what they do:

According to BoxCryptor CEO Andrea Wittek, the benefit of that would become apparent if you happen to want to download and decrypt something using someone else’s machine. I can also see the feature coming in handy for Chrome OS business users, down the line at least. “We call it an alpha version,” Wittek told me. “We’ve been testing it for a while. We definitely recommend people try it, though we wouldn’t recommend it for very sensitive files. It can crash – the worst thing that can happen is you think it’s encrypted a file and it hasn’t.”

Watch the following video demo of how BoxCryptor works.

http://www.youtube.com/watch?&v=iC4_zTjwYPk

What is my conclusion? Security is the #1 concern in adopting many cloud-based solutions. At the same time, you cannot stop new technologies. Access is one of the big advantages of the cloud. I expect more companies will try to crack the cloud-security problem and find innovative ways to share files in a secure manner. The prize is huge – seamless access to files plus the simplicity of Dropbox. For engineers, BoxCryptor can be a very attractive solution for sharing files with suppliers and remotely located design partners. This is only a partial list. I’m sure you will come up with more use cases. I’d be glad to read more about your experience with Dropbox file sharing.

Best, Oleg

Image courtesy of [Renjith Krishnan] / FreeDigitalPhotos.net




What do PLM vendors need to know about noSQL databases?

December 14, 2012

Relational databases are a very mature set of technologies. We use RDBMS (relational database management systems) practically everywhere these days. It is hard to imagine enterprise software and PDM/PLM systems without relational databases. At the same time, a new class of database management solutions is coming. It is called NoSQL (Not Only SQL). I posted about noSQL a few times. You can refresh your memory by navigating to the following link. The term first came into use back in 1998 (as "noREL" databases). Later, in 2009, the term noSQL was proposed "to label the emergence of a growing number of non-relational, distributed data stores that often did not attempt to provide atomicity, consistency, isolation and durability guarantees that are key attributes of classic relational database systems". NoSQL database solutions are widely used today in web and mobile applications. I can see growing usage of noSQL databases in business intelligence and master data management applications.

NoSQL is not a single database. It is a name for a broad set of data management or database technologies outside of the RDBMS world. The technologies and terminology behind this term are new. PDM/PLM vendors ignored noSQL database management solutions until very recently. It made me think to provide a quick summary of what stands behind this broad term and what PDM/PLM use cases it can support.

Key-value (KV) databases

The KV store is the simplest database model in the noSQL world. It stores "keys" and associated "values". Basically, your database is a storage of key-value pairs. Some databases support more complex structures behind values (lists, hashes), but it is not required. One interesting PDM/PLM use case is storing a list of files in a key-value database. In such a case, the file name (including the full path) is the key and the value is the content of the file. Examples of KV stores are Riak and Redis.
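A minimal sketch of this file-as-key-value use case, assuming a local Redis server and the redis-py client; the path and file content are hypothetical:

```python
# Files in a key-value store: the full path is the key, the file bytes are
# the value. Assumes a local Redis server and the redis-py client.
import redis

r = redis.Redis(host="localhost", port=6379)

path = "/vault/projects/pump-101/housing.step"                 # hypothetical path
content = b"ISO-10303-21;\nHEADER;\n...\nEND-ISO-10303-21;"    # stand-in for real file bytes

r.set(path, content)      # store the file content under its path
restored = r.get(path)    # fetch it back by path
print(len(restored), "bytes for", path)
```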

Column-oriented databases

This type of database is very close to an RDBMS. The main difference is that the columnar data model is designed to keep the data from every column in the table together – the opposite of an RDBMS, which keeps the data for a specific row together. It allows adding a column to a table in a very "inexpensive" way, and each row may have a different set of columns. This type of database is good for reporting and business intelligence solutions. The columnar data model has influenced a few PDM/PLM core modelers available on the market today by providing a higher level of flexibility in data modeling. An example of a column-oriented database is HBase.
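Here is a rough sketch of the "inexpensive new column" idea against HBase. It assumes a running HBase/Thrift server and the Python happybase client; the 'items' table with an 'attr' column family is purely illustrative:

```python
# Flexible columns per row in HBase. Assumes a running HBase Thrift server,
# the happybase client, and an 'items' table with an 'attr' column family.
import happybase

connection = happybase.Connection("localhost")
table = connection.table("items")

# Two item rows with different sets of columns in the same column family.
table.put(b"PN-10045", {b"attr:description": b"Pump housing", b"attr:material": b"AlSi9Cu3"})
table.put(b"PN-20017", {b"attr:description": b"Gasket", b"attr:thickness_mm": b"1.5"})

# Adding a brand-new attribute later requires no schema migration.
table.put(b"PN-10045", {b"attr:surface_finish": b"anodized"})

print(table.row(b"PN-10045"))
```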

Document-oriented databases

Document databases manage data in the form of documents. Documents can differ from each other and have different structures, which makes document-oriented databases very flexible. Some implementations, such as MongoDB, provide the ability to run queries against the document structure as well as to do map-reduce computations. Depending on your needs, you can consider different document-oriented databases. Examples are MongoDB and CouchDB. You can consider a document database in PDM/PLM in two cases – the need for a high-performance, scalable document store, and free-form data modeling.
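A minimal sketch of the free-form data modeling case, assuming a local MongoDB instance and the pymongo driver; the database, collection and fields are illustrative:

```python
# Free-form item records in a document database. Assumes a local MongoDB
# instance and the pymongo driver; names and fields are illustrative.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
items = client["plm_sketch"]["items"]

# Documents with different structures live side by side in one collection.
items.insert_many([
    {"number": "PN-10045", "type": "casting", "material": "AlSi9Cu3", "mass_kg": 2.4},
    {"number": "PN-30099", "type": "pcb", "layers": 6, "components": ["U1", "C3", "R17"]},
])

# Query across the flexible structures.
for doc in items.find({"type": "casting"}, {"_id": 0, "number": 1, "material": 1}):
    print(doc)
```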

Graph databases and triple stores

The graph data model deals with highly interconnected data. It contains nodes and relationships between nodes. Both nodes and relationships can have properties (key-value pairs). This data model becomes really important when you are traversing nodes along specific relationships. There are many situations in PDM/PLM applications when we need to traverse data efficiently. Graph databases (and their predecessors – object databases) have great potential to bring value here. An example of a graph database is Neo4j. A specific case of graph databases is the so-called triple store, which manages information using triples (subject-predicate-object). Examples of triple stores are OWLIM and AllegroGraph. Triple stores are also supported by Oracle and IBM DB2.
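A "where-used" traversal is a classic PDM/PLM example of this. The sketch below assumes a local Neo4j instance, the official neo4j Python driver, and a simple Part/USES graph model that I made up for illustration:

```python
# Where-used traversal over a BOM graph. Assumes a local Neo4j instance,
# the official neo4j Python driver, and an illustrative Part/USES model.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

where_used = """
MATCH (p:Part {number: $pn})<-[:USES*1..]-(asm:Part)
RETURN DISTINCT asm.number AS assembly
"""

with driver.session() as session:
    # Find every assembly that uses the part, directly or indirectly.
    for record in session.run(where_used, pn="PN-10045"):
        print("Used in:", record["assembly"])

driver.close()
```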

CAP Theorem and why PLM systems need to use more than one database?

In computer science, the CAP theorem states that it is impossible for a distributed computer system to simultaneously provide all three guarantees: Consistency (all nodes see the same data at the same time), Availability (a guarantee that every request receives a response about whether it was successful or failed) and Partition tolerance (the system continues to operate despite arbitrary message loss or failure of part of the system). Navigate here to read more. It is a question of priorities and a tradeoff between the requirements you need to satisfy in your system. PLM systems face significant challenges in the variety of data types, retrieval patterns and data scaling. Using different database management strategies can improve existing solutions.

What is my conclusion? PLM is a multidisciplinary approach. It handles a variety of data and connects to many places in the organization: design, engineering, manufacturing, supply chain, support, services. The specialty of a PLM environment is getting connected to all data suppliers and interplaying with different sources of data. From that standpoint, data behaves like oil – it is located in multiple places, but it needs to be extracted, and you need different tools to get it out. Think about the different databases as a tool-set for processing and accessing data in the most efficient way. Just my thoughts…

Best, Oleg


PLM and Engineering Task (Process) Management

December 13, 2012

PLM is all about process management. This statement comes into play when people explain the value of PLM in an organization. Usually, when you think about process management, your mind switches to some kind of "workflow thinking" mode, which assumes you need to follow the process from state to state by accomplishing tasks and activities. In every PLM implementation, there is a moment when people ask – how do we manage engineering processes? What toolset do we need to make it happen?

I can see that engineering people are badly organized. In many situations, running processes among engineers is similar to herding cats. Managing processes in an engineering organization is a challenge. This is a place where PLM vendors usually fail to provide a reliable and simple solution. Engineers ask for additional flexibility, and vendors have a tendency to provide complicated solutions. Many PLM tools provide some sort of workflow designer to create a process model. Later on, you can discover that engineers tend to abandon these processes. The main reason – these processes do not reflect reality. I wanted to come up with some ideas for how to fix that. I came up with three definitions – tasks, engagement and information context. Take a look at the picture below.

The overall engineering process is described as a list of tasks (above). This is the simplest way to present what needs to be done. It is easy to digest and follow up on. At the same time, the activity around this task list is not linear. In order to accomplish a task, an engineer needs to engage with additional people. This is a typical situation when the person who leads the process needs to communicate with other people and come back with a result. Often, it is ad-hoc communication that cannot be formalized and resides in people’s minds. Another situation happens when an engineer needs to bring in an additional set of information to accomplish the task or make a decision. Combining these activities together is not simple. Workflow is the wrong tool to solve this problem. A simplified task management tool with the ability to manage external engagement and connect to the information context can be a potential solution to the problem.
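To illustrate the three definitions, here is one possible way to express tasks, engagement and information context as a simple data model. The names and fields are my own sketch, not any specific PLM tool's API:

```python
# A sketch of the task / engagement / information-context idea; all names
# and fields are illustrative, not a real PLM tool's data model.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Engagement:
    person: str          # who the task owner needs to involve
    note: str = ""       # ad-hoc communication captured alongside the task

@dataclass
class Task:
    title: str
    owner: str
    done: bool = False
    engagements: List[Engagement] = field(default_factory=list)
    context: List[str] = field(default_factory=list)   # links to BOM items, ECOs, documents

process = [
    Task("Review supplier quote", "Maria",
         engagements=[Engagement("John (purchasing)", "confirm lead time")],
         context=["ECO-2041", "BOM item PN-10045"]),
    Task("Update drawing revision", "Oleg"),
]

for t in process:
    print(("[x]" if t.done else "[ ]"), t.title, "-", len(t.engagements), "engagement(s)")
```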

What is my conclusion? Simplification is the key word to summarize my thoughts. In many situations, engineers will prefer a simple task list to get things done. However, tools need to provide collaborative capabilities to connect the engineer’s activity to other people and additional sources of information. Just my thoughts. I’m interested to learn how you manage engineering tasks in your organization.

Best, Oleg



Will PLM Make a Redshift?

December 11, 2012

Scaling globally is one of the biggest challenges for manufacturing organizations. For many years, enterprise software has been used as the basic backbone to solve the problem of accessing information globally: global IT, data centers, database clusters, application integrations. Providing a single point of information access is a challenge for many of them. ERP and, later, PLM vendors made multiple attempts to implement global data warehousing and global data access.

Cloud is disrupting multiple areas these days. An interesting announcement caught my attention earlier this week – Amazon Redshift disrupts data warehousing economics. Here is what Amazon Redshift is about:

Amazon Redshift is a fast and powerful, fully managed, petabyte-scale data warehouse service in the cloud. Redshift offers you fast query performance when analyzing virtually any size data set using the same SQL-based tools and business intelligence applications you use today. With a few clicks in the AWS Management Console, you can launch a Redshift cluster, starting with a few hundred gigabytes of data and scaling to a petabyte or more, for under $1,000 per terabyte per year.

Navigate here to learn more details. I captured the following passage with some useful technical information:

Redshift is made available via MPP nodes of 2TB (XL) or 16TB (8XL), running Paraccel’s high-performance columnar, compressed DBMS, scaling to 100 8XL nodes, or 1.6PB of compressed data. XL nodes have 2 virtual cores, with 15GB of memory, while 8XL nodes have 16 virtual cores and 120 GB of memory and operate on 10Gigabit ethernet.
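The "same SQL-based tools" point means a Redshift cluster can be queried much like a regular PostgreSQL database. Here is a minimal sketch, assuming the psycopg2 driver; the cluster endpoint, credentials and table are hypothetical:

```python
# Querying a Redshift cluster over the PostgreSQL protocol. Assumes the
# psycopg2 driver; endpoint, credentials and the table are hypothetical.
import psycopg2

conn = psycopg2.connect(
    host="mycluster.abc123xyz.us-east-1.redshift.amazonaws.com",  # hypothetical endpoint
    port=5439,              # default Redshift port
    dbname="engineering",
    user="analyst",
    password="secret",
)

with conn.cursor() as cur:
    # Example: count released change orders per month across all global sites.
    cur.execute("""
        SELECT date_trunc('month', released_at) AS month, count(*)
        FROM change_orders
        WHERE status = 'released'
        GROUP BY 1
        ORDER BY 1;
    """)
    for month, total in cur.fetchall():
        print(month, total)

conn.close()
```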

Here is where this technology can be useful for manufacturing companies and manufacturing software. Today, even small manufacturing companies are global. It is not uncommon to see a 200-person manufacturing company distributed over 5 different locations and working with suppliers all over the world. Current data warehouse technologies are out of reach for them, as are complex and expensive on-premise PLM systems. Here is another quote, from an InformationWeek article, that can give you a hint of what cloud can do for them.

Announcing RedShift at last week’s AWS re: Invent conference, Amazon senior VP Andy Jassy described how "expensive and complicated" data warehousing is for large companies and how "out of reach" it is for smaller firms. Delivered as a service, Redshift will cost as little as $1,000 per terabyte, per year. That will be "game changing" versus the estimated $19,000 to $25,000 per terabyte, per year that companies are used to shelling out for on-premises deployments, Jassy said.

What is my conclusion? 1TB of data for $1K. I think this scale can easily cover the typical engineering archive in most small manufacturing firms. Combined with global availability and familiar SQL-based data access, it sounds like a perfect technological option. And this is the time for PLM vendors to think about the shift. Just my thoughts…

Best, Oleg

Image courtesy of [David Castilio Doninici] / FreeDigitalPhotos.net

