Artificial intelligence and graphics chip maker NVIDIA held its fall GPU Technology Conference (GTC) this week in an all-digital format. As with the live version, NVIDIA used the event to articulate its vision and launch new products. The October 2020 GTC was packed with announcements, partnerships, educational presentations and use cases.
The digital format let NVIDIA do things it could not do before. It was the first GTC to run across the world’s time zones, with sessions in Chinese, Korean, Japanese and Hebrew, all at local times. There were 1,000 sessions this year, 400 more than last year.
The artificial intelligence era is here
Without a doubt, the AI era has arrived. AI is being built into almost everything today, from corporate computing systems to home assistants to cars and anything else one can think of. I believe AI will be the most powerful technology force ever and will literally change the way we work, live and learn. Even with all the AI we have today, I think we are on the cusp of an AI explosion, because AI automates tasks in ways that improve our lives. In farming, AI can automate food production; for consumers, it lowers the cost of electricity by managing power grids better; and, at large scale, it leads to breakthroughs that make entirely new things possible.
During his opening keynote, NVIDIA CEO Jensen Huang made an interesting statement: “We are limited by our ability to write software, but now software can write software. AI is the automation of automation.”
He also described the combination of GPUs and the abundance of data as the “big bang” of modern AI. It’s arrived. Are you ready?
The introduction of the data processing unit (DPU)
The terms CPU and GPU are widely understood in the computing industry, but now there is a new one: the DPU. A data processing unit is a network interface card (NIC) loaded up with extra processing power to offload many of the processor-intensive tasks a modern server needs to perform, including networking, security, analytics and other tasks.
By offloading these tasks, the server can support more virtual machines and containers and do what it was designed to do. At the event, NVIDIA launched two of its BlueField DPUs, one without a GPU and one with. The main difference is that the GPU-enabled model is designed for servers that perform a lot of AI-intensive tasks. I expect the uptake of DPUs to be fairly quick, because a number of vendors have already announced support for them, including VMware, Red Hat, Canonical and Check Point, as well as many of the major server manufacturers.
Maxine brings video AI to the masses
Video in the workplace is here to stay. Even when people go back to the office, it’s expected that 95%+ of meetings will include video participants. However, video comes with many challenges: the experience is often subpar because of call quality, noise, lighting or other issues. At GTC, NVIDIA announced Maxine, a cloud-native set of AI services to improve video meetings. Meeting vendors can use Maxine services such as facial alignment, where AI algorithms make it appear that a person is looking at the camera even when they aren’t, or noise removal, which mutes background noise to make meetings better. Other features include bandwidth reduction, virtual avatars, virtual backgrounds and more. At the event, Avaya announced it will use Maxine to bring more AI features to its Spaces collaboration tool.
NVIDIA gets edgy with EGX
Edge computing is the “next big thing” in computing and an excellent complement to the cloud. Not all workloads are best run in the cloud; it’s often more effective to process data locally. NVIDIA announced that its EGX edge platform has been expanded to combine the NVIDIA Ampere GPU and the BlueField-2 DPU on a single PCIe card, providing a common platform to build, secure and accelerate edge initiatives.
NVIDIA also provided an update on the uptake of EGX: hundreds of companies are using it for things like AI, 5G, security and networking. Server manufacturers include Dell, Inspur, Lenovo and Supermicro, and a number of software companies have adopted EGX as well, such as Canonical, Red Hat, SUSE and VMware. One of the more interesting use cases comes from software provider Cloudera, which is using EGX and NVIDIA’s RAPIDS software and AI to “turbocharge” the Cloudera Data Platform. Cloudera provides a single view of all of an organization’s data across clouds and, now, edges; applying AI can significantly speed up analysis of that information to surface key business insights faster.
Omniverse is now in beta
As of Oct. 5, Day 1 of GTC, the NVIDIA Omniverse platform entered open beta, with downloads available this fall. Omniverse brings together NVIDIA RTX-based 3D graphical simulation with photorealistic detail, enabling developers to simulate different scenarios and collaborate.
For example, a car manufacturer could run simulations to see how different shades of paint look in shadow or in bright sunlight. Architects can use Omniverse to create realistic 3D visualizations of buildings under different types of lighting. To date, getting graphic artists and designers to work together has been difficult, if not impossible, given the size of the data sets and the time real-time ray tracing takes. NVIDIA began life as a graphics company, and it is using that experience, along with accelerated computing, to do things that were never possible before.
I certainly miss the live GTC event, but the digital version didn’t disappoint. Expect GTC to continue to grow and remain the major show for AI developers.
Zeus Kerravala is an eWEEK regular contributor and the founder and principal analyst with ZK Research. He spent 10 years at Yankee Group and prior to that held a number of corporate IT positions.