NVIDIA's GPUs powered the AI revolution. Its new Blackwell chips are up to 30 times faster

In less than two years, NVIDIA’s H100 chips, which nearly every AI company in the world uses to train the large language models behind services like ChatGPT, helped make it one of the world’s most valuable companies (https://www.engadget.com/nvidia-becomes-the-third-most-valuable-us-company-at-alphabets-expense-123503967.html). On Monday, NVIDIA announced a next-generation platform called Blackwell, whose chips are between seven and 30 times faster than the H100 and use 25 times less power.

“Blackwell GPUs are the engine to power this new Industrial Revolution,” said NVIDIA CEO Jensen Huang at the company’s annual GTC event in San Jose, which was attended by thousands of developers and which some compared (https://x.com/liveplexio/status/1769843150598930587) to a Taylor Swift concert. “Generative AI is the defining technology of our time. Working with the most dynamic companies in the world, we will realize the promise of AI for every industry,” Huang added in a press release (https://nvidianews.nvidia.com/news/nvidia-blackwell-platform-arrives-to-power-a-new-era-of-computing).

NVIDIA’s Blackwell chips are named in honor of David Harold Blackwell, a mathematician who specialized in game theory and statistics. NVIDIA claims Blackwell is the world’s most powerful chip, offering AI companies a significant performance upgrade: 20 petaflops of compute, versus the 4 petaflops the H100 provided. Much of that speed comes from the 208 billion transistors in Blackwell chips, compared to 80 billion in the H100. To achieve this, NVIDIA connected two large chip dies that can talk to each other at speeds of up to 10 terabytes per second.

In a sign of just how dependent the modern AI boom is on NVIDIA’s chips, the company’s press release includes testimonials from executives who collectively lead companies worth trillions of dollars. They include OpenAI CEO Sam Altman, Microsoft CEO Satya Nadella, Alphabet CEO Sundar Pichai, Meta CEO Mark Zuckerberg, Google DeepMind CEO Demis Hassabis, Oracle chairman Larry Ellison, Dell CEO Michael Dell, Amazon CEO Andy Jassy and Tesla CEO Elon Musk.

“There is currently nothing better than NVIDIA hardware for AI,” Musk says in the statement. “Blackwell offers massive performance leaps, and will accelerate our ability to deliver leading-edge models. We’re excited to continue working with NVIDIA to enhance AI compute,” Altman says.

NVIDIA did not disclose how much Blackwell chips will cost. Its H100 chips currently run between $25,000 and $40,000 apiece, according to CNBC (https://www.cnbc.com/2024/03/18/nvidia-announces-gb200-blackwell-ai-chip-launching-later-this-year.html), and entire systems powered by these chips can cost as much as $200,000.

Despite their cost, NVIDIA’s chips are in high demand. Last year, delivery wait times were as high as 11 months (https://www.tomshardware.com/tech-industry/artificial-intelligence/wait-times-for-nvidias-ai-gpus-eases-to-three-to-four-months-suggesting-peak-in-near-term-growth-the-wait-list-for-an-h100-was-previously-eleven-months-ubs). And access to NVIDIA’s AI chips is increasingly seen as a status symbol for tech companies looking to attract AI talent. Earlier this year, Zuckerberg touted the company’s efforts to build “a massive amount of infrastructure” to power Meta’s AI work (https://www.engadget.com/mark-zuckerberg-is-the-latest-billionaire-who-wants-to-create-artificial-general-intelligence-210820789.html). “At the end of this year,” Zuckerberg wrote, “we will have ~350k Nvidia H100s — and overall ~600k H100 equivalents of compute if you include other GPUs.”

This article originally appeared on Engadget at https://www.engadget.com/nvidias-gpus-powered-the-ai-revolution-its-new-blackwell-chips-are-up-to-30-times-faster-001059577.html