Nvidia to License AI Chip Challenger Groq’s Tech and Hire Its CEO

Nvidia partners with AI chip startup Groq, licensing its technology and hiring founder Jonathan Ross, in a deal aimed at boosting future AI speed and efficiency.

A major move is happening in the world of AI chips. Nvidia, the company known for powering most of today’s AI systems, has made a deal with a fast-growing chip startup called Groq. This deal could shape how AI runs in the future, especially how fast and efficient it becomes.

Nvidia has agreed to license Groq’s chip technology and hire some of Groq’s top leaders, including its founder and CEO, Jonathan Ross. While many people first thought this was a full company takeover, Nvidia says it is not buying Groq as a company. Instead, it is licensing the technology and bringing key people into Nvidia.

This move shows how serious the race for AI computing power has become.

What Nvidia and Groq agreed to

The deal between Nvidia and Groq is a non-exclusive licensing agreement. This means Nvidia can use Groq’s technology, but Groq is still free to work with other companies too.

As part of the deal, Jonathan Ross, who founded Groq, will join Nvidia. Groq’s president Sunny Madra and several engineers are also moving to Nvidia. Groq will continue to exist as a company, with a new CEO named Simon Edwards.

Some reports say Nvidia is paying about $20 billion for Groq’s assets and technology. If true, this would be Nvidia’s largest deal ever. Nvidia has not confirmed the amount, but it also has not denied the reports.

What Nvidia has made clear is this: Groq is not being fully acquired.

Why Groq matters in the AI chip world

Groq is not just another chip startup. It builds a special kind of chip called an LPU, which stands for Language Processing Unit. These chips are designed mainly for inference.

Inference is the part of AI where a trained model answers questions, writes text, or responds to users. This is different from training, which is where models learn from data.

Groq claims its LPUs can run large language models much faster than traditional GPUs. The company says its chips can be up to ten times faster and use much less power. This matters a lot as AI apps grow and need to serve millions of users at the same time.
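To make the training-versus-inference distinction concrete, here is a minimal sketch in PyTorch. It is purely illustrative: the tiny model and random data are placeholders, not anything specific to Groq or Nvidia hardware.

```python
# Minimal sketch (PyTorch) of the training vs. inference distinction.
# Training updates the model's weights from data; inference only runs
# the already-trained model to produce an answer. The model and data
# here are toy placeholders, not Groq- or Nvidia-specific.
import torch
import torch.nn as nn

model = nn.Linear(16, 2)            # stand-in for a much larger language model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Training: compute a loss and adjust the weights (done while building the model).
x, y = torch.randn(32, 16), torch.randint(0, 2, (32,))
optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()

# Inference: just run the trained model on new input. In production this
# happens millions of times, which is why latency and efficiency matter.
with torch.no_grad():
    prediction = model(torch.randn(1, 16)).argmax(dim=-1)
```

Chips built mainly for the second half of that sketch, like Groq’s LPUs, can trade flexibility for speed and power efficiency.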

Jonathan Ross is also well known in the industry. Before Groq, he worked at Google, where he helped create the TPU, one of the first major AI accelerator chips.

Nvidia’s position in AI chips

Nvidia already dominates the AI chip market. Its GPUs are the standard choice for training large AI models. Almost every major AI company relies on Nvidia hardware in some way.

However, Nvidia faces more competition in inference. Companies like AMD, Cerebras, and Groq have been pushing new designs that focus on speed and efficiency.

By licensing Groq’s inference technology, Nvidia strengthens its position in this area. It can now combine its own GPU strength with Groq’s low latency chip ideas.

This helps Nvidia stay ahead as AI shifts from training models to running them at scale.

Why Nvidia did not fully buy Groq

Many people wonder why Nvidia did not just buy Groq outright. The answer likely involves regulation.

Big tech companies are under close scrutiny from regulators. Buying a fast-growing AI chip company could raise antitrust concerns. By licensing the technology and hiring key talent instead, Nvidia gets most of the benefits without triggering a full acquisition review.

This type of deal is often called an acqui-hire. It allows companies to gain people and ideas while keeping the appearance of competition alive.

Other tech giants have done similar deals in recent years.

What happens to Groq now

Groq will continue to operate as an independent company. Its cloud service, called GroqCloud, will keep running. The company says it will still serve developers and businesses as before.

Before this deal, Groq raised $750 million and was valued at $6.9 billion. It said more than two million developers were using its technology, a big jump from the year before.

Groq also uses a different memory approach than Nvidia. Its chips rely on on-chip memory instead of high-bandwidth external memory (HBM). This speeds up responses but limits the size of the models a single chip can run.
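A rough back-of-the-envelope calculation shows why memory capacity matters here. All of the numbers below are illustrative assumptions for the sake of the arithmetic, not Groq or Nvidia specifications.

```python
# Back-of-the-envelope sketch of why on-chip memory limits model size.
# Every number here is an illustrative assumption, not a hardware spec.
params_billion = 70          # assumed model size: 70 billion parameters
bytes_per_param = 2          # assumed 16-bit (2-byte) weights

weights_gb = params_billion * 1e9 * bytes_per_param / 1e9   # ~140 GB of weights

sram_per_chip_gb = 0.25      # assumed on-chip memory per accelerator
chips_needed = weights_gb / sram_per_chip_gb                # ~560 chips

hbm_per_gpu_gb = 80          # assumed HBM on a single high-end GPU
gpus_needed = weights_gb / hbm_per_gpu_gb                   # ~2 GPUs

print(f"Weights: ~{weights_gb:.0f} GB")
print(f"Chips needed with on-chip memory only: ~{chips_needed:.0f}")
print(f"GPUs needed with HBM: ~{gpus_needed:.0f}")
```

Under these assumptions, a large model fits on a handful of HBM-equipped GPUs but must be split across many on-chip-memory accelerators, which is the trade-off the paragraph above describes: faster responses per chip, but more chips per model.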

Even with leadership changes, Groq says it will keep growing.

What this deal means for the AI industry

This agreement shows how intense the AI race has become. AI companies need faster, cheaper, and more efficient chips to stay competitive.

For Nvidia, this deal helps protect its lead as AI workloads shift toward inference. For Groq, the deal brings validation and access to Nvidia’s massive reach.

It also shows a new pattern in tech deals. Instead of full buyouts, large companies are licensing technology and hiring teams to move faster and avoid regulation.

For developers and users, this could mean better AI performance, faster responses, and lower costs over time.

The Bottom Line

AI is no longer just about smart software. It is about hardware, energy use, and speed. Chips decide how powerful and affordable AI can be.

By teaming up with Groq, Nvidia is signalling that it plans to lead not just today, but for many years to come. It is betting that combining talent, new chip ideas, and its own platform will keep it ahead of rivals.

This deal may not be the last of its kind. As AI grows, expect more partnerships, licensing deals, and talent moves across the chip industry.

For now, one thing is clear. Nvidia is not slowing down, and the battle for the future of AI computing just became even more intense.
