Microsoft Uses Intel FPGAs for Smarter Bing Searches

Microsoft is employing field programmable gate arrays from Intel to power Bing's latest intelligent search capabilities.

Machine Learning Boosts Bing

Microsoft credits its use of field programmable gate arrays (FPGAs) from chipmaker Intel, at least in part, for the performance of Bing's intelligent search capabilities.

The company's search engine is now capable of gathering information from multiple sources and presenting its findings as a fact-filled summary. This and other time-saving methods of fetching information from the web are part of a collection of intelligent search features based on Project Brainwave, a deep learning acceleration platform.

Project Brainwave describes a system of deep neural networks running on Intel Arria and Stratix 10 FPGAs, enabling AI workloads that deliver results in milliseconds, claims Microsoft. Arria FPGAs were produced by programmable logic device maker Altera, which Intel acquired in 2015 for an estimated $16.7 billion. Stratix 10 FPGAs combine a 14-nanometer manufacturing process and Intel's HyperFlex fabric architecture to accelerate large-scale workloads.

Nestled within the data centers that process Bing searches, the FPGAs are helping users get quick answers to their pressing queries.

"Intel's FPGA chips allow Bing to quickly read and analyze billions of documents across the entire web and provide the best answer to your question in less than a fraction of a second," wrote Microsoft representatives in a blog post.

"Intel's FPGA devices not only provide Bing the real-time performance needed to keep our search fast for our users, but also the agility to continuously and quickly innovate using more and more advanced technology to bring you additional intelligent answers and better search results."

Microsoft estimates that the FPGAs have delivered a 10-fold reduction in the latency produced by Bing's intelligent search models while enabling a 10-fold increase in the size of those models.

Although they may boost performance for today's toughest IT workloads, FPGAs may soon face competition from adaptive compute acceleration platform (ACAP) chips from Xilinx. The San Jose, Calif.-based FPGA maker announced on March 19 a proprietary chip that it says can deliver a 20-fold improvement in deep neural network performance over the company's own Virtex VU9P FPGA.

Bing Decodes Jargon, Helps with How-To Questions

Among Bing's new intelligent search capabilities is a lookup function for jargon, technical terms and other uncommon words. Bing now automatically recognizes and highlights those types of words, displaying a definition when the user hovers a cursor over a term.

For the do-it-yourself set, Microsoft is working on a new feature that generates multiple answers to how-to questions. This approach comes in handy when a query is not specific enough to yield a proper answer, or when users struggle to phrase the right type of question, the company claims. The feature will be enabled in the coming weeks.

Finally, for the fashion-conscious, Bing's intelligent image search capabilities are getting a big upgrade.

The search engine's object detection feature, which allows users to zero in on items that appear in an image, previously focused primarily on shirts, handbags and a few other items. Now it is capable of spotting a wider variety of apparel and accessories.

Pedro Hernandez

Pedro Hernandez is a contributor to eWEEK and the IT Business Edge Network, the network for technology professionals. Previously, he served as a managing editor for the network of...