We’ve become familiar with Netscape co-founder Marc Andreessen’s dictum that “software is eating the world”. It is true, but the reality is more nuanced than that.

We currently create, process and consume amounts of data that were unimaginable a generation ago, and we must produce hardware of sufficient quality, and in sufficient quantity, to keep pace. Software is indeed devouring hardware’s capacity to cope: the world’s IT systems currently process several zettabytes (ie, several million petabytes) of data annually, and research by Cisco projects growth of 26 to 28 percent per year, doubling the amount of data every two or three years.
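
A quick sanity check on that doubling claim, assuming a steady 27 percent annual rate (the midpoint of Cisco’s range): the doubling time works out to ln 2 / ln 1.27 ≈ 0.69 / 0.24, or roughly 2.9 years, which bears out the two-to-three-year figure.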

Invite the world

No single company can keep up with that and, because every part of the system is interconnected with the others, the only way to meet the demands of the cloud is to distribute the workload, invite the world to help, and keep the doors open. In other words, open source has become, quite simply, the only viable approach to webscale and cloud technology.

In the more than 20 years since it was created, the Linux OS has become the most widely used software in the world, with customized distributions cropping up in embedded processors everywhere, including networked devices.

Amazon doesn’t share much about its technology, but it is well known that Amazon Web Services (AWS) – the biggest public cloud service in the world – is running on millions of servers, each running Amazon’s own version of Linux. How else could it scale as it does for the prices it charges?

Outside of Amazon, OpenStack has become the leading public cloud platform, and that is an open-source project. Big Data is revolutionizing the handling of information, allowing insights to be extracted from giant data sets – and the leading offerings are based on the Hadoop open-source platform, which grew out of work at Google and Yahoo on handling floods of information on commodity hardware. Because Hadoop is open source, managed by the Apache Software Foundation, that power is now available to anyone.

Open source reached the hardware level in a big way in 2011, when Facebook shared its server designs and launched the Open Compute Project, which is now destroying vendor lock-in and sucking waste out of hardware, including racks and network switches.

Open technology is vital “above the rack and below the stack”. So it’s not software that’s eating the world: it’s open source.

Open-source technology is licensed in such a way that others can get involved, and each player adopts the open-source license best suited to its purposes: the GNU General Public License (GPL) version 2 is a favorite with “purists” because it requires developers to share their modifications with the community.

By contrast, the Apache license allows vendors the tempting opportunity to extend the software with code they can keep to themselves.

Code is currency

“In open source, code is currency, and the medium in which you earn respect,” says Cole Crawford, CEO of Vapor IO. He should know: he was there at the formation of OpenStack and led the Open Compute Project before founding Vapor IO to create an open-source means of managing data center hardware.

That company is just one example of a valuable open-source effort. All the associations and organizations mentioned here are doing the grinding, daily work that keeps the data flowing, and the biggest companies are now betting on open technology.

Whether you’re modeling a complex scientific problem, collecting and analyzing data from a new industrial manufacturing plant, or just watching some stuff on YouTube, you’ll be glad that these organizations and companies exist, and are focused on nothing but open.

This article appeared in the April issue of DatacenterDynamics magazine