Archived Content

The following content is from an older version of this website, and may not display correctly.

Data centers are increasingly moving to green energy as a way to reduce their costs and environmental impact. Singapore, the region's data center hub, has a long-term plan to become a dominant player in green IT, with S$100 million (US$80M) set aside to fund various R&D initiatives including research into green data center operations.

It was with this in mind that the Singapore government commissioned i3 Solutions Group to research ways to reduce energy consumption in data centers. Among other findings, the recently concluded 16-month study concluded that Singapore's dearth of renewable energy meant it would need to focus on energy efficiency measures, referred to as Green IT, in order to maintain its lead, i3 Solutions Group chairman Ed Ansett told us.

Software matters

The report outlined a number of suggestions to improve energy efficiency, with a key recommendation highlighting the need to optimize software – traditionally a black box in data centers – to play a bigger role in using less electricity. But how does one even start on the task of tweaking software so that it translates to greater energy efficiency within the data center?

Many would argue that the first big step towards greater energy efficiency started when virtualization was adopted en masse by enterprise businesses, allowing multiple “virtual machines” to be consolidated within a single x86 physical server.

Similarly, cloud computing in the form of IaaS (Infrastructure-as-a-Service) allows consolidation, with the implicit understanding that the cloud equates to greater efficiency.

However, it turned out that "virtual machine sprawl" - where poorly managed virtual machines are duplicated and left running idle - can erode the efficiency gains of virtualization. And efficiency in the cloud isn't necessarily something that can be taken for granted either.

With a large number of virtual servers to support, Amazon Web Services (AWS) has been optimizing its software stack on this front for some time, showing how the challenge can be addressed. The company announced two new services at its recent AWS re:Invent 2014 conference that promise to drive even greater energy efficiency in the cloud.

Driving energy efficiency in the cloud

AWS now offers a Trusted Advisor service that inspects a company's AWS environment against best practices for cloud infrastructure, helping customers "save money, improve performance and reliability". Specifically, AWS says that it has sent out some 2.6 million notifications about inefficient usage of customers' AWS deployments since 2012, saving them over US$350 million.

The recently announced AWS Lambda service appears to be another way to do more with less. In a nutshell, Lambda brings event-driven programming to the cloud, allowing developers to create back-end services that run code in response to a range of predefined events.

This is interesting not just for the range of development possibilities it opens up, but for how it saves organizations from having to run wasteful compute instances that sit idle waiting for things to happen. Events such as an image upload, in-app activity, a website click or external input can be configured ahead of time, and AWS will ensure that the corresponding functions trigger "within milliseconds" when they occur.
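To make the model concrete, here is a minimal sketch of what such an event-triggered function looks like. This is an illustrative example, not AWS's own code: the bucket name, object key and event shape below are hypothetical stand-ins for the payload Lambda would deliver when, say, an image is uploaded to S3.

```python
def handler(event, context):
    """Runs only when a predefined event (e.g. an S3 object upload) fires.

    No compute instance sits idle waiting; the function is invoked
    on demand with the event payload as its input.
    """
    results = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # In a real function this is where a thumbnail would be
        # generated, a database updated, etc.
        results.append(f"processing s3://{bucket}/{key}")
    return results

if __name__ == "__main__":
    # Simulate the kind of event payload delivered on an image upload
    # (hypothetical bucket and key names).
    sample_event = {"Records": [{"s3": {"bucket": {"name": "photos"},
                                        "object": {"key": "cat.jpg"}}}]}
    print(handler(sample_event, None))
```

The efficiency argument is visible in the shape of the code: there is no polling loop and no always-on server process, only a function that consumes resources for the milliseconds it actually runs.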

Finally, AWS also announced the Amazon EC2 Container Service with Docker support. The service can schedule and manage Docker containers at any scale, and integrates with various AWS services such as Virtual Private Cloud (VPC).

Granted, Docker isn't necessarily suitable for every kind of software, and there have been accusations that Docker's new direction will lead to bloat and broken security. Yet there is no doubt that container technology as a whole is one step ahead of traditional virtualization in terms of efficiency. After all, containers share a single operating system kernel on the host, versus having to run a full guest operating system for every virtual machine.
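The shared-kernel point is easiest to see in what a container image actually contains. A hedged sketch of a Dockerfile (the file name `app.py` and base image choice are hypothetical) shows that only the application and its userland dependencies are packaged - no kernel, no guest OS:

```dockerfile
# Hypothetical example: packaging a small Python service as a container.
# Note what is absent: no operating system kernel, no hypervisor layer -
# the container reuses the host's kernel at runtime.
FROM python:3-slim
WORKDIR /app
COPY app.py .
CMD ["python", "app.py"]
```

A virtual machine image for the same service would instead carry a complete guest operating system, which is exactly the memory and boot-time overhead the container approach avoids.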

Bringing it home

If anything, the various innovations at AWS show that there is ample headroom for improvements to be made within the software stack. As Singapore and other countries in the region look to green IT, it is a reminder that hardware efficiency ratings and PUE numbers must be evaluated in tandem with software within the data center.

As it is, container technologies such as Docker can be utilized by anyone, while OpenStack offers a great starting place to build a private cloud with only the parts your organization needs. (You may want to read: How OpenStack may offer Asia an edge in the cloud)

Of course, AWS is doing this to persuade you to adopt it as your go-to cloud provider. “Running IT infrastructure on the AWS Cloud is inherently more energy efficient than traditional computing that depends on small, inefficient, and over-provisioned datacenters,” said an AWS spokesperson to Datacenter Dynamics.

Pointing to the extensive experience AWS has in running energy efficient and highly utilized data centers, the spokesperson noted that: “With AWS, customers can be confident that they are reducing their overall consumption of environmental resources while also improving utilization.”