Chinese search giant Baidu will deploy Xilinx field programmable gate arrays (FPGAs) in its data centers for machine learning applications such as image and speech recognition.

The two companies will work together to expand volume deployment of the Baidu-optimized FPGA platforms.

FPGAs everywhere

“Acceleration is essential to keep up with the rapidly increasing data center workloads that support our growth,” Yang Liu, Executive Director at Baidu, said.

Junwei Bao, Director in Baidu’s Autonomous Driving Unit, added: “Xilinx FPGAs are assisting greatly with this critical task and can provide significant value in the design of autonomous vehicles.”

Victor Peng, EVP and GM of the Programmable Products Group at Xilinx, said: “The momentum for FPGA-based acceleration continues as shown by this significant implementation with Baidu.

“We celebrate Baidu’s innovation, expertise and creativity in bringing advanced applications to market.”

FPGAs have long been used in data centers, but hyperscale operators are now beginning to deploy them far more widely for acceleration workloads.

Today, Microsoft released an academic paper detailing how it uses FPGAs in its facilities to accelerate both applications and network functions.

The paper said: “Hyperscale data center providers have struggled to balance the growing need for specialized hardware (efficiency) with the economic benefits of homogeneity (manageability). In this paper we propose a new cloud architecture that uses reconfigurable logic to accelerate both network plane functions and applications.

“This Configurable Cloud architecture places a layer of reconfigurable logic (FPGAs) between the network switches and the servers, enabling network flows to be programmably transformed at line rate, enabling acceleration of local applications running on the server, and enabling the FPGAs to communicate directly, at data center scale, to harvest remote FPGAs unused by their local servers.”

The company claims that its 'Project Catapult' initiative, which uses FPGAs from Intel-owned Altera, has created an 'AI supercomputer in the cloud.'

Microsoft ‘Storyteller’ Allison Linn said in a blog post: “To make data flow faster, they’ve inserted an FPGA between the network and the servers. That can be used to manage traffic going back and forth between the network and server, to communicate directly to other FPGAs or servers or to speed up computation on the local server.”
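Purely as an illustration of the architecture described above, the sketch below (in Python) models the idea of an FPGA sitting between the network and each server: it can transform traffic in flight, accelerate work for its own host, or hand work to an idle FPGA elsewhere in the data center. The names FpgaNode, AccelRequest and dispatch are hypothetical and do not come from Microsoft's paper or code.

```python
# Conceptual sketch only, not Microsoft's implementation.
from dataclasses import dataclass


@dataclass
class AccelRequest:
    payload: bytes        # data to be processed
    prefer_local: bool    # whether the host's own FPGA should handle it


class FpgaNode:
    """An FPGA placed between the network switch and its host server."""

    def __init__(self, name: str, busy: bool = False):
        self.name = name
        self.busy = busy  # True if the local server is already using this FPGA

    def transform_packet(self, packet: bytes) -> bytes:
        # Stand-in for a line-rate network-plane transformation
        # (e.g. filtering or encryption) applied as traffic passes through.
        return packet[::-1]

    def accelerate(self, req: AccelRequest) -> bytes:
        # Stand-in for an application offload (e.g. a neural-network layer).
        return req.payload.upper()


def dispatch(req: AccelRequest, local: FpgaNode, pool: list) -> bytes:
    """Route work to the local FPGA, or 'harvest' an idle remote FPGA."""
    if req.prefer_local and not local.busy:
        return local.accelerate(req)
    for remote in pool:
        if not remote.busy:
            # Direct FPGA-to-FPGA communication, bypassing the remote host CPU.
            return remote.accelerate(req)
    return local.accelerate(req)  # fall back to the busy local FPGA


if __name__ == "__main__":
    local = FpgaNode("rack1-fpga0", busy=True)
    pool = [FpgaNode("rack2-fpga3"), FpgaNode("rack5-fpga1", busy=True)]
    print(dispatch(AccelRequest(b"speech frame", prefer_local=True), local, pool))
```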