Microsoft Puts FPGAs to Work for Azure and Bing
Both Microsoft's cloud and its search engine are using field-programmable gate array technology to speed up workloads.

Project Catapult, a field-programmable gate array (FPGA) project established in 2011 by Microsoft, is starting to pay off for users of the company's services, according to the Redmond, Wash., software giant. Microsoft today shared a behind-the-scenes look at how FPGA technology is transforming its Bing search engine and helping to accelerate Azure workloads.

FPGAs, as the term suggests, are chips that can be reconfigured, or "reprogrammed," through software after manufacture, enabling customers to accelerate select workloads without incurring the processing overhead of additional software.

Cloud providers, particularly those on the hyperscale end of the spectrum, have a keen interest in FPGAs. Beset by unpredictable and rapidly changing workloads, cloud companies can use FPGAs to adapt faster and more readily to the demands of the market.

In Microsoft's case, the company is using FPGAs in an effort to build an artificial intelligence (AI) supercomputer in the cloud. Users will soon be able to judge those efforts for themselves by seeking answers on Bing, Microsoft's search engine. Currently, Bing uses Catapult technology to generate faster, more accurate results.

"By the end of 2016, an artificial intelligence technique called deep neural networks will be deployed on Catapult to help Bing improve its search results," blogged Microsoft Research writer Allison Linn on Oct. 17. "This AI supercomputer in the cloud will increase the speed and efficiency of Microsoft's data centers—and anyone who uses Bing should notice the difference, too."
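To make the "reprogrammable chip" idea concrete, here is a minimal toy sketch (not Microsoft's actual tooling; real FPGA development uses hardware description languages such as Verilog or VHDL). FPGAs are built largely from lookup tables (LUTs), and "reconfiguring" the chip amounts to loading new truth tables into them. The class and variable names below are invented purely for illustration:

```python
# Illustrative sketch only: an FPGA's basic building block is the lookup
# table (LUT). Reprogramming the chip means loading new truth tables,
# not fabricating new silicon. This toy model shows the concept; it is
# not how Catapult or any real FPGA toolchain is driven.

class LUT2:
    """A 2-input lookup table: 4 stored bits define any 2-input boolean function."""

    def __init__(self, truth_table):
        self.program(truth_table)

    def program(self, truth_table):
        # "Reconfiguration" is just rewriting the stored truth table.
        assert len(truth_table) == 4
        self.table = list(truth_table)

    def evaluate(self, a, b):
        # Inputs select one of the four stored bits.
        return self.table[(a << 1) | b]

lut = LUT2([0, 0, 0, 1])      # configured to act as an AND gate
and_out = lut.evaluate(1, 1)  # -> 1
lut.program([0, 1, 1, 0])     # reprogrammed as XOR; same "hardware" block
xor_out = lut.evaluate(1, 1)  # -> 0
```

The same block computes different logic after reprogramming, which is why cloud operators can retarget deployed FPGAs as workloads change rather than installing new hardware.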