
Insider Scoop: Everything You Need to Know About NVIDIA’s Blackwell GPU

Nvidia CEO Jensen Huang has announced new AI chips and software for running next-generation AI models. The new AI graphics processors are named Blackwell and are expected to start shipping later this year.

The Insider

By bantinginc

The announcement came at Nvidia’s developer conference in San Jose as the chipmaker seeks to solidify its position as the leading hardware provider for AI applications. The first Blackwell chip, named the GB200, allows for much larger GPUs than Nvidia’s ‘Hopper’ H100 chips. The GB200 pairs two B200 graphics processors with one Arm-based central processor, delivering a massive performance upgrade for AI companies: 20 petaflops of AI performance versus 4 petaflops for the H100. This is a big step toward Nvidia continuing to earn its colossal valuation.
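
To put those petaflop figures in perspective, here is a rough back-of-envelope sketch comparing training time on the two chips. The 6 × parameters × tokens compute heuristic, the cluster size, model size, token count, and utilization are all illustrative assumptions, not figures from Nvidia’s announcement.

```python
# Back-of-envelope comparison using the chip figures quoted above.
# The 6 * params * tokens training-compute rule of thumb, the model size,
# token count, cluster size, and utilization are illustrative assumptions,
# not Nvidia-published numbers.

GB200_PFLOPS = 20   # quoted AI performance per GB200
H100_PFLOPS = 4     # quoted AI performance per H100

def training_days(params: float, tokens: float, chip_pflops: float,
                  num_chips: int = 1000, utilization: float = 0.4) -> float:
    """Rough days to train: total compute ~ 6 * params * tokens FLOPs."""
    total_flops = 6 * params * tokens
    cluster_flops_per_s = chip_pflops * 1e15 * num_chips * utilization
    return total_flops / cluster_flops_per_s / 86_400

# Hypothetical 1-trillion-parameter model trained on 10 trillion tokens.
for name, pflops in [("H100", H100_PFLOPS), ("GB200", GB200_PFLOPS)]:
    print(f"{name}: ~{training_days(1e12, 10e12, pflops):.0f} days")
```

Under these assumptions the gap scales directly with the quoted 20-versus-4 petaflop figures, which is the point Nvidia is selling.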

Available As An Entire Server

 
The Nvidia Blackwell GPU will also be available as a complete server called the GB200 NVLink 2. This configuration combines 72 Blackwell GPUs and other Nvidia parts to train AI models. An added bonus of this configuration is that Alphabet, Amazon, Microsoft, and Oracle will sell access to the GB200 through their cloud services. On the performance side, Nvidia said the Blackwell system can deploy a 27-trillion-parameter model, much larger than the largest model available on the market today, GPT-4 (1.7 trillion parameters).
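
For a sense of scale, a 27-trillion-parameter model is enormous even before training begins. The snippet below is a purely illustrative estimate of the memory needed just to hold the weights at common numeric precisions; it ignores activations, optimizer state, and KV caches.

```python
# Rough memory footprint for the 27-trillion-parameter figure quoted above,
# at common numeric precisions. Purely illustrative arithmetic; real
# deployments also need memory for activations and KV caches.

PARAMS = 27e12  # 27 trillion parameters

BYTES_PER_PARAM = {"fp16/bf16": 2, "fp8": 1, "int4": 0.5}

for precision, nbytes in BYTES_PER_PARAM.items():
    terabytes = PARAMS * nbytes / 1e12
    print(f"{precision}: ~{terabytes:.0f} TB just for the weights")
```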
 

Introducing NIM

 
Nvidia also announced ‘NIM’ at its developer conference, which stands for Nvidia Inference Microservice. NIM makes it easier to run AI software on older Nvidia GPUs, letting companies put the millions of Nvidia GPUs they already own to work. In effect, NIM allows companies to run their own AI models without necessarily purchasing new hardware, which should help bring more AI applications to market. The move also encourages Nvidia’s customers to stay on its platform, which should enhance revenue retention.
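
For readers curious how this looks in practice: NIM packages models as containers that expose an OpenAI-compatible HTTP API. The sketch below queries a locally hosted NIM endpoint with the openai Python client; the port, model name, and prompt are placeholders for illustration rather than details from Nvidia’s announcement.

```python
# Minimal sketch of querying a locally hosted NIM container.
# Assumes the container exposes an OpenAI-compatible API on port 8000;
# the model name and prompt are placeholders for illustration.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # local NIM endpoint (assumed port)
    api_key="not-used",                   # local deployments typically ignore this
)

response = client.chat.completions.create(
    model="meta/llama3-8b-instruct",      # placeholder model name
    messages=[{"role": "user", "content": "Summarize Nvidia's Blackwell launch."}],
    max_tokens=200,
)
print(response.choices[0].message.content)
```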

