
DeepSeek has rattled large AI players — but smaller chip firms see it as a force multiplier

DeepSeek has rattled the U.S.-led AI ecosystem with its latest model, shaving hundreds of billions of dollars off chip leader Nvidia’s market cap. While sector leaders grapple with the fallout, smaller AI companies see an opportunity to scale alongside the Chinese startup.

Several AI-related firms told CNBC that DeepSeek’s emergence is a “massive” opportunity for them, rather than a threat. 

“Developers are very keen to replace OpenAI’s expensive and closed models with open source models like DeepSeek R1…” said Andrew Feldman, CEO of artificial intelligence chip startup Cerebras Systems.

The company’s chips compete with Nvidia’s graphics processing units, and it offers cloud-based services through its own computing clusters. Feldman said the release of the R1 model generated one of Cerebras’ largest-ever spikes in demand for its services.

“R1 shows that [AI market] growth will not be dominated by a single company — hardware and software moats do not exist for open-source models,” Feldman added. 

Open source refers to software in which the source code is made freely available on the web for possible modification and redistribution. DeepSeek’s models are open source, unlike those of competitors such as OpenAI.

DeepSeek also claims its R1 reasoning model rivals the best American tech, despite running at lower costs and being trained without cutting-edge graphics processing units, though industry watchers and competitors have questioned these assertions.

“Like in the PC and internet markets, falling prices help fuel global adoption. The AI market is on a similar secular growth path,” Feldman said. 

Inference chips 

A number of AI chip startups told CNBC that they were seeing more demand for inference chips and computing as clients adopt and build on DeepSeek’s open source model. 

“[DeepSeek] has demonstrated that smaller open models can be trained to be as capable or more capable than larger proprietary models and this can be done at a fraction of the cost,” said Sid Sheth, CEO of AI chip startup d-Matrix.

“With the broad availability of small capable models, they have catalyzed the age of inference,” he told CNBC, adding that the company has recently seen a surge in interest from global customers looking to speed up their inference plans. 

Robert Wachen, co-founder and COO of AI chipmaker Etched, said dozens of companies have reached out to the startup since DeepSeek released its reasoning models.

“Companies are [now] shifting their spend from training clusters to inference clusters,” he said. 

“DeepSeek-R1 proved that inference-time compute is now the [state-of-the-art] approach for every major model vendor and thinking isn’t cheap – we’ll only need more and more compute capacity to scale these models for millions of users.”

Jevons Paradox


This pattern reflects Jevons Paradox, the theory that falling costs for a technology drive greater overall demand for it.

Financial services and investment firm Wedbush said in a research note last week that it continues to expect AI adoption by enterprises and retail consumers worldwide to drive demand.

Speaking to CNBC’s “Fast Money” last week, Sunny Madra, COO at Groq, which develops chips for AI inference, suggested that as the overall demand for AI grows, smaller players will have more room to grow.

“As the world is going to need more tokens [a unit of data that an AI model processes], Nvidia can’t supply enough chips to everyone, so it gives opportunities for us to sell into the market even more aggressively,” Madra said.
