Microsoft donates server designs to Facebook's Open Compute Project

Microsoft has donated a cloud server design specification to the Open Compute Project, the Facebook-led open source community for data center and hardware design.

Microsoft is the second web-scale data center operator to contribute its own hardware designs to OCP. Facebook became the first in 2011 when it formally announced the non-profit initiative.

The designs Microsoft has contributed are powering the core of its business today, Bill Laing, the company's corporate VP of cloud and enterprise, wrote in a blog post announcing the news. They are “the designs for the most advanced server hardware in Microsoft data centers delivering global cloud services like Windows Azure, Office 365, Bing and others,” he wrote.

In addition to the design specs, Microsoft is open sourcing the software code its engineers have written for hardware operations management: tasks such as server diagnostics, power supply and fan control.
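
Laing's post does not reproduce the management code itself, so as a rough illustration of what one of these tasks involves, here is a minimal, hypothetical fan-control loop in Python. Every name, signature and threshold below is an assumption made for this sketch; none of it is taken from Microsoft's actual release.

    # Hypothetical sketch of one "hardware operations management" task: fan control.
    # All names and thresholds are illustrative assumptions, not Microsoft's code.
    import random
    import time


    def read_inlet_temp_c() -> float:
        """Stand-in for a sensor read; a real controller would query the BMC."""
        return 25.0 + random.uniform(-2.0, 10.0)


    def set_fan_duty(percent: float) -> None:
        """Stand-in for a fan PWM command; a real controller would write to hardware."""
        print(f"fan duty set to {percent:.0f}%")


    def fan_control_step(temp_c: float, target_c: float = 30.0, gain: float = 8.0) -> float:
        """Simple proportional control: spin the fans harder as temperature exceeds target."""
        error = temp_c - target_c
        duty = 20.0 + gain * max(error, 0.0)  # 20% floor keeps some airflow at idle
        return min(duty, 100.0)               # clamp to the fan's maximum

    if __name__ == "__main__":
        for _ in range(3):
            set_fan_duty(fan_control_step(read_inlet_temp_c()))
            time.sleep(1)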

The servers are optimized for Windows Server and can be up to 40% cheaper and 15% more power-efficient than comparable traditional enterprise servers. They can also be deployed and serviced in half the time required for industry-standard machines.
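
To make those relative figures concrete, the short calculation below applies them to an invented baseline; the price and power draw are assumptions for illustration only, since the post quotes only percentages.

    # Illustration of the relative claims ("up to 40% cheaper, 15% more power efficient").
    # The baseline price and power draw are invented for this example; only the
    # percentages come from the post. "15% more power efficient" is read loosely
    # here as 15% less power for the same work.
    BASELINE_COST_USD = 5_000   # assumed price of a comparable traditional server
    BASELINE_POWER_W = 300      # assumed average draw of that server

    cloud_cost = BASELINE_COST_USD * (1 - 0.40)
    cloud_power = BASELINE_POWER_W * (1 - 0.15)

    print(f"cost:  ${BASELINE_COST_USD:,} -> ${cloud_cost:,.0f}")  # $5,000 -> $3,000
    print(f"power: {BASELINE_POWER_W} W -> {cloud_power:.0f} W")   # 300 W -> 255 W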

There is a corporate sustainability angle for Microsoft as well. “We also expect this server design to contribute to our environmental sustainability efforts by reducing network cabling by 1,100 miles and metal by 10,000 tons across our base of 1 million servers,” Laing wrote.
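
For a sense of scale, those fleet-wide savings work out to small but tangible per-server numbers. The arithmetic below simply divides Laing's figures across the stated 1 million servers; statute miles and US short tons are assumed, since the post does not specify.

    # Back-of-the-envelope check of the per-server savings implied by Laing's figures.
    # Assumes statute miles (5,280 ft) and US short tons (2,000 lb).
    SERVERS = 1_000_000
    CABLE_MILES_SAVED = 1_100
    METAL_TONS_SAVED = 10_000

    cable_ft_per_server = CABLE_MILES_SAVED * 5_280 / SERVERS
    metal_lb_per_server = METAL_TONS_SAVED * 2_000 / SERVERS

    print(f"~{cable_ft_per_server:.1f} ft of cabling saved per server")  # ~5.8 ft
    print(f"~{metal_lb_per_server:.0f} lb of metal saved per server")    # ~20 lb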

The company started managing its own data centers in 1989, Laing wrote. As of today, Microsoft has invested more than US$15bn in the infrastructure that supports its services.

While best known as a software company, Microsoft is one of the world's leaders in data center and hardware design. Like other companies of its scale that do business primarily through online services, it designs both its facility infrastructure and its IT hardware to suit the applications they support.

One of the latest approaches to data center capacity expansion Microsoft has come up with is the ITPAC: container-like modules that are pre-manufactured and filled with servers, then shipped to the data center site for quick installation.

Laing will be speaking at the Open Compute Summit in Santa Clara, California, tomorrow. FOCUS will be covering the event, so come back to this site for more Open Compute news.

