Nvidia, the current market leader in AI chips, has revealed its latest chip designed to run large-scale AI systems.
Nvidia, one of the world’s leading developers of semiconductor chips, revealed its latest chip on Aug. 7, designed to power high-level artificial intelligence (AI) systems.
The company said its next-generation GH200 Grace Hopper Superchip is one of the first to be equipped with HBM3e memory, and is designed to process “the world’s most complex generative AI workloads, spanning large language models, recommender systems and vector databases.”
Nvidia CEO Jensen Huang said in a keynote that the upgrade gives the processor a “boost,” adding:
“This processor is designed for the scale-out of the world’s data centers.”
While the GH200 has the same graphics processing unit (GPU) as the H100 — the company’s current flagship and one of the most powerful chips on the market — it pairs it with 141 gigabytes of advanced memory and a 72-core Arm central processor, making it at least three times more powerful than the previous chip.
Nvidia’s latest chip is designed for inference, one of the two primary stages of working with AI models, alongside training. Inference is the stage in which a trained model runs continuously, generating content and making predictions.
Huang said “pretty much any” large language model (LLM) can be run through this chip, and it will “inference like crazy.”
“The inference cost of large language models will drop significantly.”
The GH200 should be available for sampling by the end of the year and will go on sale in the second quarter of 2024, according to Huang.
Related: OpenAI CEO highlights South Korean chips sector for AI growth, investment
The development comes as rival chipmakers race to challenge Nvidia’s market dominance with increasingly powerful semiconductor products of their own.
Nvidia currently holds more than 80% of the market for AI chips, and its market value briefly topped $1 trillion.
On May 28, Nvidia introduced a new AI supercomputer intended to let developers create successors in the style of ChatGPT, with Big Tech companies such as Microsoft, Meta and Google parent Alphabet expected to be among its first users.
However, on June 14, Advanced Micro Devices (AMD) released details of a forthcoming AI chip with the capability to challenge Nvidia’s dominance. The AMD chip is expected to be available in the third quarter of 2023.
Most recently, on Aug. 3, the chip developer Tenstorrent received $100 million in a funding round led by Samsung and Hyundai in an effort to diversify the chip market.
Magazine: Experts want to give AI human ‘souls’ so they don’t kill us all