Cloud Service Providers Investing in AI Equipment: TrendForce Survey

Since 2018, cloud service providers (CSPs) have been investing in equipment that supports artificial intelligence (AI) technologies in response to the emergence of new applications such as self-driving cars, the AI of Things (AIoT), and edge computing. According to TrendForce’s latest survey of the server market, AI servers equipped with general-purpose GPUs (GPGPUs) accounted for almost 1% of global server shipments in 2022. AI server shipments are expected to grow by 8% YoY in 2023 on the strength of demand for ChatBots and similar applications across AI-related fields, and to increase at a CAGR of 10.8% from 2022 to 2026.

The four major North American CSPs (Google, AWS, Meta, and Microsoft) together accounted for the largest share of global AI server procurement in 2022, at 66.2% of the annual total. In China, localization of manufacturing and self-sufficiency in critical technologies have accelerated the build-out of AI infrastructure. Among Chinese CSPs, ByteDance led AI server procurement in 2022 with 6.2% of the annual global total, followed by Tencent, Alibaba, and Baidu at around 2.3%, 1.5%, and 1.5%, respectively.

The optimization of search engines with AI is driving demand for high-bandwidth memory (HBM). Microsoft has invested heavily in OpenAI and launched an improved version of Bing that incorporates a large-scale language model named Prometheus alongside the technology underlying ChatGPT. Baidu launched ERNIE Bot, which originally adopted NVIDIA’s A100 but has since switched to the A800 because of the US Commerce Department’s export control restrictions. The mainstream server GPUs used in AI-related computing are NVIDIA’s H100, A100, and A800 and AMD’s MI250 and MI250X series; NVIDIA currently holds about 80% of this market, while AMD holds about 20%.

HBM has attracted market attention because AI workloads involve high-bandwidth computing and therefore require high-bandwidth memory. HBM currently represents about 1.5% of the entire DRAM market, and SK hynix is expected to become the dominant supplier of HBM3 solutions. Demand for HBM rose significantly in 2020-2021 but is expected to slow in 2023 due to inventory corrections. The market for HBM solutions is nonetheless expected to expand at a CAGR above 40% from 2022 to 2026.
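To make the cited growth rates concrete, the minimal sketch below compounds an illustrative 2022 baseline at the CAGRs quoted in this report. The base value of 100 is a hypothetical index rather than a figure from the survey; only the growth-rate arithmetic reflects the numbers above.

# Minimal sketch of the compound-growth arithmetic behind the cited CAGRs.
# The 2022 baseline of 100 is a hypothetical index, not a survey figure.

def project(base: float, cagr: float, years: int) -> float:
    """Compound a base value forward at a constant annual growth rate."""
    return base * (1 + cagr) ** years

ai_servers_2026 = project(100, 0.108, 4)  # 10.8% CAGR, 2022-2026
hbm_market_2026 = project(100, 0.40, 4)   # >40% CAGR, 2022-2026 (lower bound)

print(f"AI server shipment index in 2026: {ai_servers_2026:.1f}")  # ~150.7, i.e. roughly +51% vs. 2022
print(f"HBM market index in 2026: {hbm_market_2026:.1f}")          # ~384.2, i.e. nearly a fourfold expansion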

In conclusion, CSPs are expanding their investments in AI-capable equipment as applications such as self-driving cars, AIoT, and edge computing mature. The integration of AI into search engines is in turn fueling demand for HBM, a market expected to grow at a CAGR above 40% from 2022 to 2026.