

Open Compute faces validation requests


You might expect testing to be uncontroversial, but you’d be wrong. It’s been a hurdle in the path of many a technology movement, and now it’s the Open Compute Project’s turn to face up to it.

Since it began five years ago, the Open Compute Project has been working to create and share designs for lower-cost hardware, without the gratuitous and pricey differentiation introduced by vendors.

It’s an open source hardware movement, based on shared specifications, and the idea is to produce lower-cost hardware. But how reliable will that equipment be? How sure can we be that it meets the specifications? Will it work as promised with other OCP equipment? And how much does any of this matter to the actual customer?

[Image: Open Compute Project logo. Source: OCP]

Testing times?

Testing and certification is not as simple as it sounds. If you don’t believe me, this month has also seen the Uptime Institute acting on the “deceptive” use of its Tier certificates for the reliability of facilities. I’ll examine that story another time.

OCP’s testing is supposed to check that products meet the specifications emerging from the Open Compute community. But this week an anonymous test engineer sounded off to a news site, saying the OCP’s test regime is a ‘joke’. Testing laboratories set up by OCP are defunct, said the source. And the organization has only one priority: cheap hardware and a low cost of ownership over the lifetime of a data center.

Former OCP director Cole Crawford responded, explaining that the OCP’s compliance and interoperability (C&I) project has matured. It’s now based around self-certification and acceptance testing.

White label vendors and enterprise vendors will make and sell “OCP-compliant” products. There will be an element of trust in just how compliant (and reliable) they are. 

The issue behind this is that testing and certification by third-party laboratories is expensive, but some people need it. Telcos, for example, demand that the equipment they buy is certified to meet the rigorous NEBS standards, which include stress testing of the equipment that Internet hubs and telephone exchanges rely on.

Reliable kit versus commodity price?

The core customers for OCP equipment are large cloud providers, who need a very reliable service. OCP is their way to get a reliable service using hardware that isn’t anything like NEBS-compliant.

High availability is abstracted away from the hardware, so Facebook, Rackspace or Google can provide a reliable service on cheap commodity hardware. If anything fails they can just replace it. Buying equipment in the volume they do, self-certification and acceptance tests are all they need.

But not everyone works that way. Enterprise data centers in particular aren’t built like that. They don’t have huge racks of servers carrying out a single identical job, and they can’t swap out hardware the way the cloud providers can.

Other OCP people have clarified things. Whatever the engineer who kicked this off may have meant, it seems there are some people in OCP who want validated equipment. Barbara Aichinger of FuturePlus Systems responded to the original story, talking about a “battle” to bring this about.

Here’s my suggestion. Testing and certification is downstream of the definition of the OCP equipment, and this is an open source process. So why can’t the people who want validated OCP kit go ahead and get it validated under the Open Compute Foundation umbrella?

The big players don’t need validated equipment for web scale applications, and don’t see the need to fund it. Their muscle is enough to ensure vendors do what they want. 

For most other people the way forward is to buy OCP kit from suppliers like HP, who do their own testing. That might be a bit of a step back from the all-white-label vision of the hardcore OCP users, but it could answer the need for trust while still delivering the improvements that OCP offers.

Fully validated OCP kit would be an even bigger change from the original ideas of the project, and an expense which is at the very least a contrast to the low-cost mantra which kicked OCP off. But just suppose there are enough people who like OCP kit and want fully validated versions of it. Why not let them see if they can arrange effective testing, and then deliver validated kit? If they can do this without breaking the bank, then it will fly.

Unless there is something in the Open Compute Foundation’s constitution which prevents this, why not set up a sub-group for validated OCP kit? If there really is a demand, and enough people support the call, then it could bring the benefits of OCP to a new market.

A version of this article appeared on Green Data Center News.


