Online-auction company eBay has unveiled Project Topaz, its new flagship data center near Salt Lake City, Utah. The facility will house the company's core business: the eBay.com Web site, which serves more than 90 million users in 32 countries, and PayPal.com, the online payment system with 81 million accounts.
The facility is part of eBay's ongoing effort to consolidate infrastructure currently housed in leased data centers in three US states. The company expects to lower costs and increase efficiency by moving its existing servers into the new data center.
"As you can see, we live and die by the performance of our datacenters," eBay's Senior Director of Global Data Center Strategy Dean Nelson wrote in a blog post, announcing the new facility. "Our buyers and sellers depend on its reliability. Project Topaz is a critical part of the eBay engine. It is the foundation for our business and must be solid, stable, and secure. In a nutshell, it needs to be bulletproof."
According to Nelson, the $287m data center is the largest infrastructure project eBay has ever undertaken. Construction took more than 1.2 million man-hours over 14 months, with the team completing the project on May 4. Skanska led construction; RTKL designed the facility.
The facility's infrastructure is concurrently maintainable and fault-tolerant. Nelson said its redundancy level was equivalent to Tier IV, while the facility is 30 percent "more efficient than the most efficient data center in our portfolio." It has a design PUE of 1.4.
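For context, PUE (power usage effectiveness) is the ratio of total facility power to the power consumed by IT equipment. The sketch below is an illustration of what a design PUE of 1.4 implies if the announced 7.2MW of phase-one IT capacity were fully loaded; it is not a figure published by eBay:

```python
# Illustration only: what a design PUE of 1.4 implies at full IT load.
# PUE = total facility power / IT equipment power.
it_load_mw = 7.2        # announced phase-one IT capacity (assumed fully used)
design_pue = 1.4

total_facility_mw = it_load_mw * design_pue
overhead_mw = total_facility_mw - it_load_mw  # cooling, conversion losses, lighting

print(f"Total facility power: {total_facility_mw:.2f} MW")  # 10.08 MW
print(f"Non-IT overhead:      {overhead_mw:.2f} MW")        # 2.88 MW
```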
The first of the data center's four planned phases consists of three 20,000 sq ft computer rooms, housed in a 240,000 sq ft two-story building. Total power for all three rooms is 7.2MW, although the on-site substation's capacity can reach up to 30MW. The building sits on a 60-acre property. Room one houses equipment for eBay Marketplace, room two is for PayPal.com, and room three is reserved for further consolidation.
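As a rough, illustrative check on those figures (derived arithmetic, not numbers from eBay), 7.2MW across three 20,000 sq ft rooms works out to 2.4MW per room and about 120W per square foot of computer-room space:

```python
# Rough, illustrative density check (not figures published by eBay).
total_power_w = 7.2e6          # 7.2 MW for the three phase-one rooms
rooms = 3
room_area_sqft = 20_000

print(f"{total_power_w / rooms / 1e6:.1f} MW per room")              # 2.4 MW
print(f"{total_power_w / (rooms * room_area_sqft):.0f} W per sq ft")  # 120 W/sq ft
```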
By running all electrical distribution at 400V, the design eliminates an entire level of transformers and delivers 230V directly to the servers. Power is distributed through a modular Starline busway system.
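The arithmetic behind that omission: in a three-phase system, 230V is the line-to-neutral voltage of a 400V line-to-line feed, so servers with 230V-capable power supplies can be fed without a step-down stage. A minimal sketch:

```python
import math

# Line-to-neutral voltage of a three-phase system = line-to-line voltage / sqrt(3).
line_to_line_v = 400
line_to_neutral_v = line_to_line_v / math.sqrt(3)

print(f"{line_to_neutral_v:.0f} V")  # ~231 V, within tolerance for 230V server PSUs
```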
The facility's primary cooling source will be a 400,000-gallon cistern that collects rainwater. The data center also has a waterside economizer, which the company expects to use for more than six months of the year. The computer rooms feature hot-aisle containment, supplemented by in-row cooling for additional capacity when necessary.
The cooling system can handle densities from less than 1kW to more than 30kW per rack.
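To put that range in perspective (assumed average densities for illustration, not eBay's actual deployment), here is how many racks the 7.2MW phase-one capacity could support at different averages:

```python
# Illustrative rack counts at assumed average densities (not eBay's deployment).
it_capacity_kw = 7_200  # 7.2 MW phase-one capacity

for avg_kw_per_rack in (1, 5, 10, 30):
    print(f"At {avg_kw_per_rack:>2} kW/rack: up to {it_capacity_kw // avg_kw_per_rack:,} racks")
```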
eBay users sold more than $60bn worth of products in 2009.