Nvidia Breaks Gaming Tradition With RTX, Turning GPUs Into AI Powerhouses
Nvidia's RTX GPUs have largely been known for gaming and graphics, but they are now being configured and repackaged for enthusiasts who want to try out AI on their desktops. The new GPUs are part of Nvidia's push to make GPUs available wherever and whenever customers need them.
The company announced new RTX GPUs that can be used for AI inference and training. The GPUs are based on the Ada Lovelace architecture, which is distinct from the Hopper architecture used in the red-hot H100 GPUs that remain in short supply.
Enthusiasts are already using GPUs on gaming laptops to run AI-powered applications, such as text-to-text or text-to-image models. At this week’s SIGGRAPH conference, Nvidia announced new desktop and workstation designs with RTX GPUs.
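For a sense of what that looks like in practice, the following is a minimal sketch of generating an image from a text prompt on a local RTX GPU. It assumes the open-source Hugging Face diffusers library, PyTorch with CUDA support, and a Stable Diffusion checkpoint, none of which are part of Nvidia's announcement:

# Minimal sketch: text-to-image on a local RTX GPU (assumes PyTorch + diffusers are installed)
import torch
from diffusers import StableDiffusionPipeline

# Load a Stable Diffusion checkpoint in half precision so it fits comfortably in GPU memory
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)
pipe = pipe.to("cuda")  # move the model onto the RTX GPU

# Generate one image from a text prompt and save it
image = pipe("a workstation rendering a 3D scene, digital art").images[0]
image.save("output.png")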
Computer makers including Dell, Lenovo and Boxx will announce workstations that can pack up to four RTX 6000 Ada Generation GPUs in a chassis. Nvidia said the suggested retail price for the GPU is $6,000, though vendors such as Dell are selling it for more than $9,000, including tax.
Each of the RTX 6000 GPUs, which are based on the Ada Lovelace design, has 48GB of GDDR6 memory and a 200Gbps network-interface card. The GPU draws 300 watts of power and is based on the older PCIe 4.0 interconnect standard.
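For readers sizing workloads against those specifications, the memory actually visible to software can be checked at runtime; the short sketch below assumes PyTorch with CUDA support is installed:

# Sketch: inspect the GPU that AI frameworks will use (assumes PyTorch with CUDA)
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"Device: {props.name}")
    print(f"Memory: {props.total_memory / 1024**3:.1f} GB")  # roughly 48 GB on an RTX 6000 Ada
    print(f"Multiprocessors: {props.multi_processor_count}")
else:
    print("No CUDA-capable GPU detected")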
Nvidia also announced the L40S Ada GPU, which is something of a poor man's H100, as it is faster than the previous-generation A100 GPUs in AI training and inference. The new product is a variant of the L40 server GPU announced a year ago.
The L40S also has 48GB of GDDR6 memory and will ship in systems based on the OVX reference server design for metaverse applications.
The L40S is up to four times faster than the previous-generation A40 GPU, which is based on the older Ampere architecture, for AI and graphics workloads. AI training is 1.7 times faster than on the A100 GPU, and inference is 1.5 times faster, as the L40S has higher clock speeds and more tensor and graphics-rendering performance.
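Speedup claims like these come down to throughput measurements. Purely as an illustrative sketch, and not Nvidia's benchmark, inference throughput on a single GPU can be timed along these lines (the ResNet-50 stand-in model and the batch size are arbitrary assumptions):

# Sketch: crude inference-throughput timing on one GPU (model and batch size are arbitrary)
import time
import torch
import torchvision.models as models

model = models.resnet50().half().cuda().eval()            # stand-in model in FP16
batch = torch.randn(64, 3, 224, 224, dtype=torch.float16, device="cuda")

with torch.no_grad():
    for _ in range(10):                                    # warm-up iterations
        model(batch)
    torch.cuda.synchronize()
    start = time.time()
    for _ in range(100):
        model(batch)
    torch.cuda.synchronize()                               # wait for all queued GPU work
    elapsed = time.time() - start

print(f"Throughput: {100 * batch.shape[0] / elapsed:.0f} images/sec")

Running the same script on two generations of GPU is the simplest way to reproduce a relative-speedup figure of this kind.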
Nvidia’s RTX systems for enterprises are built for the metaverse and AI markets, and the new hardware will include licenses for the Omniverse and AI Enterprise software. The company also announced AI Enterprise 4.0, which will include NeMo, its framework for building large language models.
Supplies of the L40S, which will ship later this year, should not be hard to come by.
"These will not be as constrained as we've been in some of our highest-end GPUs," said Bob Pette, vice president for Pro Visualization at Nvidia, during a press briefing.
Nvidia’s low-end RTX 4000 GPU will become available in September for $1,250. The RTX 4500 will be available for $2,250 starting in October.
AI is now as important to Nvidia as gaming. The company wants to make GPUs a commodity on which enthusiasts can create their own programs and then run them wherever the closest GPU is available. Nvidia’s H100 GPUs are hard to find and have become a prized asset for companies: a startup called CoreWeave has used its Nvidia GPUs as collateral to fund its growth, and cryptocurrency miners are repurposing their GPUs in data centers to run AI.