Nvidia’s Groq deal underscores how the AI chip giant uses its massive balance sheet to ‘maintain dominance’

Nvidia’s (NVDA) licensing deal with chip startup Groq (GROQ.PVT) reveals how the tech giant is leveraging its massive cash pile to sustain its preeminence in the AI market.

Nvidia this week said it struck a non-exclusive deal with Groq to license its technology and hired the startup’s founder and CEO, Jonathan Ross, along with its president and other employees. CNBC reported the agreement to be worth $20 billion, marking Nvidia’s largest-ever deal. (The company declined a request for comment on the figure.)

Bernstein analyst Stacy Rasgon said in a note to clients Thursday that the Nvidia-Groq deal “appears strategic in nature for NVDA as they leverage their increasingly powerful balance sheet to maintain dominance in key areas.” Nvidia’s cash flow climbed more than 30% from the previous year to $22 billion in its most recent quarter.

“This transaction is … essentially an acquisition of Groq without being labeled one (to avoid the regulators’ scrutiny),” added Hedgeye Risk Management analysts in a note Friday.

The move is just the latest in a string of AI deals by Nvidia, the world’s first $5 trillion company. The chipmaker’s investments in AI firms span the entire market, ranging from large language model developers such as OpenAI (OPAI.PVT) and xAI (XAAI.PVT) to “neoclouds” like Lambda (LAMD.PVT) and CoreWeave (CRWV), which specialize in AI services and compete with its Big Tech customers.

Nvidia has also invested in chipmakers Intel (INTC) and Enfabrica. The company made a failed attempt around 2020 to acquire British chip architecture designer Arm (ARM).

Nvidia’s wide-ranging investments — many of them in its own customers — have led to accusations that it’s involved in circular financing schemes reminiscent of the dot-com bubble. The company has vehemently denied those claims.

Groq, meanwhile, was seeking to become one of Nvidia’s rivals.

Founded in 2016, Groq makes LPUs (language processing units) geared toward AI inferencing and marketed as alternatives to Nvidia’s GPUs (graphics processing units).

Training AI models involves teaching a model to learn patterns from large amounts of data, while “inferencing” refers to applying that trained model to generate outputs. Both processes demand massive computing power from AI chips.

While Nvidia easily dominates the chip market for AI training, some analysts argue that Nvidia could soon see greater competition in the inference space. That’s because custom chips like Google’s (GOOG) TPUs (tensor processing units) — and arguably Groq’s LPUs — may be better suited for certain tasks. LPUs, for instance, are quicker and more energy efficient when used for certain models, utilizing a type of memory technology called SRAM within the chips. On the other hand, Nvidia GPUs rely on off-chip HBM built by companies like Micron (MU) and Samsung (005930.KS).

Jonathan Ross, founder and CEO of Groq. (AP Photo/Jeff Chiu)

Ross, the Groq CEO, said in a recent interview that the upstart aimed to provide chips for half the world’s AI inference computing needs — and cheaply.

“What we want to do is we want to drive the cost of compute as close to zero as we can get it. Every year we want to make it cheaper,” he told Indian business outlet YourStory. Notably, Ross already helped create Nvidia’s greatest source of competition: The executive led the development of Google’s first-generation TPUs.

Cantor Fitzgerald analyst CJ Muse said Nvidia’s “acqui-hire” of Groq talent and licensing of its intellectual property revealed the chipmaker “is playing both offense and defense” in the AI space. Muse said the deal would allow Nvidia to take “even greater share of the inference market.”

Nvidia shares rose roughly 1% Friday.

Others on Wall Street were more confused by Nvidia’s move and its potential $20 billion price tag. Hedgeye Risk Management analysts argued that Groq’s chips are “still unproven” when it comes to large AI models, due to their low memory capacity.

“Groq’s current technology is largely limited to only a small subset of inference workloads,” added DA Davidson analyst Alex Platt.

Nvidia CEO Jensen Huang speaks during a press conference at the Asia-Pacific Economic Cooperation (APEC) CEO summit in South Korea. (AP Photo/Lee Jin-man)

Laura Bratton is a reporter for Yahoo Finance. Follow her on Bluesky @laurabratton.bsky.social. Email her at laura.bratton@yahooinc.com.
