Intel remains the dominant server chip vendor in the industry, but a rapidly changing data center environment, fueled by the rise of emerging technologies such as cloud computing, data analytics, artificial intelligence and machine learning, is opening the door to new and re-energized rivals looking to grab a piece of the roughly 95 percent market share that Intel currently holds.
Longtime competitor Advanced Micro Devices, which like Intel sells x86-based CPUs, several weeks ago came out with its high-performance and power-efficient Epyc server chips based on its Zen architecture, a move that gave AMD its best chance in more than a decade to chip away at Intel’s lead.
At the same time, the need for new compute capabilities to handle the demands of artificial intelligence (AI), big data and the cloud is giving GPU maker Nvidia a greater foothold in the data center, breathing renewed life into IBM and its OpenPower Foundation consortium, and giving ARM and chip-making partners like Qualcomm and Cavium the confidence to push ahead with their server strategies. Meanwhile, some hyperscale cloud players, such as Google, are developing their own processors.
But Intel is not sitting idly by while its challengers make their advances. At an event in a newly opened lab in New York City July 11, company officials unveiled their latest server chips, the Xeon Scalable processors, based on the Skylake-SP core. Officials said the chips represent the most significant improvement in a data center platform in a decade and come with the scalability, security and optimization capabilities needed to address a broad range of workloads, from traditional tasks to AI, data analytics and other emerging applications.
The chips also will help service providers prepare for the upcoming migration from 4G to 5G networks, which will rapidly increase the number and types of devices that will connect to broadband networks.
“5G is about more than faster networking,” Navin Shenoy, executive vice president and general manager of Intel’s Data Center Group, said during the event. “5G is about connecting things that previously hadn’t been connected.”
The Xeon Scalable platform has been in the works for five years and includes a completely remade microprocessor as well as new and enhanced features that improve performance, power efficiency, scalability and security, said Shenoy, a longtime Intel executive who has been in his current position for fewer than two months.
An array of server vendors, including Dell EMC, Hewlett Packard Enterprise (HPE), Lenovo, Cray and Cisco Systems, are rolling out the next generations of their respective system lineups that take advantage of Intel's latest Xeons while also adding their own technologies to address the evolving data center environment. Like Intel, many of these established OEMs are adapting to more cloud- and software-based data centers and the new workloads running in them.
For example, Dell EMC gave more details about its 14th generation PowerEdge systems that officials said are optimized for software-based data centers and offer improved security features. Such data centers demand greater flexibility and the ability to adapt to different workloads, Ravi Pendekanti, the company’s senior vice president of server solutions product management and marketing, told eWEEK.
New capabilities in the systems deliver faster database and storage performance, faster live virtual machine migration, and easier life cycle management through the Integrated Dell Remote Access Controller (iDRAC 9). Security improvements include cryptographically trusted booting, System Lockdown to protect systems from malicious or unwanted changes, and System Erase, which enables quick wiping of data from storage drives and other embedded non-volatile memory.
Lisa Spelman, vice president and general manager of Intel's Xeon products and data center marketing, said during the event that the new Xeon will form "the foundational layer and building blocks" for the new workloads. Officials touted significant gains, saying the new processors deliver 1.65 times the performance of their predecessors and 2.2 times the deep learning training and inference performance.