A team of Silicon Valley veterans who previously worked at Apple and Google has raised $250mn to challenge Nvidia’s dominance of the software tools used to program artificial intelligence chips.
Palo Alto-based Modular is building an alternative to Nvidia’s Cuda software, which has been vital to keeping customers locked into the chipmaker’s AI products. The start-up announced on Wednesday that it had raised the new investment at a $1.6bn valuation.
The funding will accelerate its efforts to build an “Android for generative AI hardware”, said the start-up’s co-founder and chief executive Chris Lattner, likening its toolkit to Google’s mobile operating system, which spurred competition for Apple’s iPhone.
The new investment nearly triples Modular’s valuation since its last round in 2023. The $250mn financing was led by US Innovative Technology, the fund headed by former Legendary Entertainment owner Thomas Tull, which has previously backed US defence tech group Anduril and AI cloud start-up Lambda.
While Nvidia’s Cuda software works only with the chipmaker’s own hardware, Modular’s system lets developers create AI models and apps that run on the latest graphics processing units from Nvidia and AMD, as well as on Apple’s Macs, making it easier to switch between chips.
DFJ Growth, a major backer of SpaceX and xAI, is also joining the round alongside existing investors such as Alphabet’s venture unit GV and General Catalyst.
“Modular is addressing the most urgent challenge in AI: unifying the compute layer by enabling diversified processing hardware and software to operate cohesively,” said Sam Fort, partner at DFJ Growth.
But the start-up faces an uphill struggle. Nvidia is engaged in a whirlwind of deals, such as its latest pledge to invest $100bn in OpenAI, that are seen as an attempt to ensure its chips and Cuda platform remain at the centre of the AI industry.
Nvidia’s GPUs today command an estimated 80 per cent of the booming market for AI data centre systems — a dominance that has propelled the chipmaker to become the world’s most valuable company.
“The industry is locked into this Cuda world,” said Lattner, who previously led Apple’s developer tools team for several years and launched its Swift programming language. “This is a big challenge for developers, because people want choice. People want to be able to run and build AI, and put it on the hardware where it makes the most sense.”
Lattner co-founded Modular in 2022 alongside Tim Davis, after the pair worked together on Google’s custom AI chips, known as Tensor Processing Units, and other semiconductor projects at Google Brain, the Big Tech group’s AI lab.
Lattner argues that other industry attempts to build cross-compatible alternatives to Cuda — including open source projects backed by the likes of OpenAI, Meta, Google and Microsoft, as well as efforts from chipmakers Intel and AMD — have failed because each Big Tech company is trying to protect its own interests.
The business charges corporate customers based on usage when they run AI workloads through its software on cloud computing platforms such as Amazon Web Services and Oracle. Several new AI cloud computing start-ups are also using Modular to run their data centres with GPUs from multiple suppliers, Davis said.
Even though Modular presents a challenge to one of Nvidia’s key differentiators, the chipmaker is also working with the start-up as a partner.
“Nvidia has wonderful products, wonderful GPUs, but we want more competition to enable even better silicon,” said Davis. “You shouldn’t win because you have enormous market power, you should win because you have the best product.”