


24 Apr 2023

The dynamics of VC for AI – get the scoop on our latest report

Alex Harrowell, Principal Analyst, Omdia
To give us some insight into the Omdia Market Radar: Top AI Hardware Startups, Alex Harrowell, Principal Analyst at Omdia, discusses where the technology is headed, what is likely to happen with these companies in the next couple of years, and the dynamics of the venture capital funding ecosystem. He argues that there needs to be a shift from generic AI acceleration towards more scalable designs and custom products. Interview with Chuck Martin, Editorial Director, AI Business.

Can you tell us about the report?

We’ve been exploring the top AI startups in a recent report, the Top AI Hardware Startups Market Radar. This report starts off with a data screen of the top 25 best-funded startups in the sector, then drills down to give insight into where the technology in this space is heading, what's likely to happen with these companies in the next couple of years, and the dynamics of the venture capital funding ecosystem.

Can you reveal any insights that the report has uncovered?

The enthusiasm for funding chip startups that gripped the venture capital ecosystem has cooled somewhat. After the banner period of 2020-2021, when over $3 billion was invested, we haven't seen such a big round except for Moore Threads at the end of 2022, and 2022 overall was quieter. Logically, this implies there will be a wave of exits in the near future. Another issue we've seen is a considerable degree of herding in funding, with money racing not just into chips for AI but into chips using the same architecture and pursuing the same applications and strategies. The great majority of this funding (58%, approximately $6 billion) has gone into devices developed for generic AI acceleration rather than with any particular use case in mind.

Just under half of the total involves devices that use a so-called CGRA (coarse-grained reconfigurable architecture): essentially a large, flat matrix of very simple multiply-accumulate cores and blocks of memory. These devices rely on a special compiler for optimization, which is crucial for mapping the neural network onto the device. Typically these are dataflow chips, meaning they execute when they receive data rather than being driven by a clock.
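To make the idea concrete, here is a minimal toy sketch (not any vendor's actual design) of that dataflow pattern: a grid of multiply-accumulate (MAC) cells, each holding one weight in local memory and firing only when its input arrives, chained along rows to compute a matrix-vector product. The `MACCell` class and `matvec` function are illustrative names, not a real compiler or hardware API.

```python
class MACCell:
    """One toy multiply-accumulate core with a weight held in local memory."""

    def __init__(self, weight):
        self.weight = weight  # stored next to the core, like on-chip SRAM

    def fire(self, x, partial):
        # Dataflow semantics: the cell executes when its operands arrive,
        # rather than on a clock tick.
        return partial + self.weight * x


def matvec(weights, x):
    """'Map' a matrix-vector product onto rows of chained MAC cells."""
    rows = [[MACCell(w) for w in row] for row in weights]
    out = []
    for row in rows:
        partial = 0.0
        # Stream the input vector through the row; each cell adds its
        # weight * input contribution to the running partial sum.
        for cell, xi in zip(row, x):
            partial = cell.fire(xi, partial)
        out.append(partial)
    return out


print(matvec([[1, 2], [3, 4]], [10, 20]))  # [50.0, 110.0]
```

In real CGRA devices the equivalent of `matvec` is done ahead of time by the compiler, which decides how the network's operations and weights are laid out across the physical grid; that mapping step is why the compiler is so critical to these designs.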

The problem with this approach, however, is scalability. Many of these hardware developers initially aimed to store the entire AI model’s neural network in SRAM on the chip. This is ideal from an input-output perspective, as there is no faster way to access data and weights than having them directly on the accelerator. However, models keep getting bigger, and this presents challenges in terms of both scaling up to accommodate larger models and scaling down for use at the edge and in data centers.

Does this mean that AI on the chip is going to go away as a concept?

No, I think it's going to evolve into multi-chip devices: systems designed around small units of scalability. Very large-scale systems may find a place, possibly more in the high-performance computing (HPC) sector. But there is an obvious issue if the only way you can scale up is by adding another device the size of an entire wafer.

So, are we going to see big developments in 2023, or is this going to be gradual?

I think we will see more substantial developments. In the big-name sector we have seen major launches, including Intel's Sapphire Rapids CPU with its AMX accelerator in January, AMD a bit earlier with the Ryzen 7, and NVIDIA's L4, L40, and RTX 4090 GPUs at GTC23. There are also a couple of startups in stealth mode that I'm expecting to reveal something fairly soon.

One thing I do expect is that, with tighter funding and a greater degree of scepticism towards the claims, we will see some exits. I suspect the key route out won't be so much the IPO as a trade sale to one of the major silicon vendors. Given the recent situation at SVB, exits driven by financial tensions in the VC ecosystem are more likely: not only did a lot of startups bank there, so did VCs, and SVB lent heavily in that space (venture debt, straight business lending, and financing for VC limited partners and founders to meet capital calls). There's around $10 billion on the balance sheets of the major silicon vendors, around $23 billion at Apple, and about $35 billion at Amazon. They're interested in the fundamental technology, and they tend to have substantial AI workloads of their own as well as serving customer workloads. I wouldn't be surprised if quite a few exits end up being trade sales.

Read the report Omdia Market Radar: Top AI Hardware Startups. For more insights on the world of AI across sectors visit The London AI Summit, June 14-15 2023, where Omdia’s experts will be exploring everything from Bridging the Culture and Knowledge Gap Between Design Teams & Technical ML Skills to State of the Quantum Ecosystem.
