Intel Unveils Upcoming Xeon Phi Chip Aimed at AI Workloads
Intel's Xeon Phi chips are also primary processors, meaning they can run on their own rather than serve only as co-processors to other CPUs. Nvidia's GPU accelerators, by contrast, must work alongside CPUs.

Machine learning is made up of training (where neural networks are taught such things as object identification) and inference (where they use this training to recognize and process unknown inputs). The neural networks used for training are large, and much of that training is done on Nvidia GPUs, which offer more processing cores than CPUs. Inference networks are smaller, and most of that work is done on CPUs from Intel.

Bryant and Chappell argued that CPUs have advantages over GPUs in processing machine learning tasks, particularly as environments scale out. That stance was backed up by Slater Victoroff, CEO of text and image analytics startup Indico, who is opting for CPUs over GPUs for that reason.

In a post on Nvidia's blog this week, company officials disputed recent benchmark numbers Intel has released to back up its arguments for CPUs, saying the chip maker used outdated and faulty data to reach its conclusions.

"It's great that Intel is now working on deep learning," they wrote in the blog post. "This is the most important computing revolution with the era of AI upon us and deep learning is too big to ignore. But they should get their facts straight."

In a research note, Charles King, principal analyst with Pund-IT, wrote that Intel's efforts in the nascent AI market, including buying Nervana, show the company's commitment to the space, which "represents a huge opportunity."

"But Intel is anything but alone in pursuing it," King wrote. "Other notable companies are in the AI hunt, including enterprise vendors like IBM, cloud players including Google and Amazon, and other silicon vendors, such as NVIDIA. To stay ahead of the curve, Intel is committing sizable financial and human capital to its commercial AI efforts."

However, he warned that "competitors may believe they can overtake and overcome the company. But time and again, Intel has demonstrated that it has what it takes to go for and win the gold."