Revenues from the sale of chips for AI processing were US$1.3 billion in 2018 and will grow to $23 billion in 2023, according to market analysis firm ABI Research. This equates to a compound annual growth rate of 78 percent over the period.
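The 78 percent figure can be reproduced from the two revenue numbers. A quick sketch, treating 2018 to 2023 as five compounding periods (an assumption about how ABI counts the interval):

```python
# Sanity check on ABI Research's headline figures:
# US$1.3B in 2018 growing to US$23B in 2023.
start, end = 1.3, 23.0    # revenue, US$ billions
years = 2023 - 2018       # five compounding periods

cagr = (end / start) ** (1 / years) - 1
print(f"CAGR: {cagr:.0%}")  # prints "CAGR: 78%"
```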
Breaking the sector down further, ABI reckons revenue from the sale of AI chips and chipsets for edge inferencing will grow at a CAGR of 65 percent over the same period, while the smaller class of more powerful, higher-priced chips for inference-training will grow at a CAGR of 137 percent.
However, these market increases do not necessarily favour the incumbent market leaders Intel and Nvidia, ABI said.
The market is essentially a new one and will see intense competition between established chip companies and numerous startups, the research firm said.
"Companies are looking to the edge because it allows them to perform AI inference without transferring their data. The act of transferring data is inherently costly and in business-critical use cases where latency and accuracy are key, and constant connectivity is lacking, applications can’t be fulfilled," said Jack Vernon, an analyst at ABI Research, in a statement. "Locating AI inference processing at the edge also means that companies don’t have to share private or sensitive data with cloud providers, something that is problematic in the healthcare and consumer sectors," he added.
AI at the edge will have a significant impact on the semiconductor industry, and the biggest winners are likely to be vendors with intellectual property for AI-related ASICs.