Equities

Cloud Giants Challenge Nvidia in AI Chip Market Shift

Cloud Giants Develop Own AI Chips to Cut Costs, Challenging Nvidia's 90% Market Share Amid $178 Billion Capital Spending Surge

By Alex P. Chase

4/19, 02:33 EDT
Advanced Micro Devices, Inc.
Amazon.com, Inc.
Broadcom Inc.
Eaton Corporation plc
Alphabet Inc.
Intel Corporation
Meta Platforms, Inc.
Microsoft Corporation
NVIDIA Corporation
Taiwan Semiconductor Manufacturing Company Ltd.
Vertiv Holdings Co

Key Takeaway

  • Cloud giants Amazon, Meta, Microsoft, and Alphabet are developing their own AI chips to control costs and reduce reliance on Nvidia.
  • Semiconductor industry shows mixed performance; Nvidia remains strong while Intel and AMD decline, highlighting the importance of AI focus.
  • Investment strategies shift towards companies supporting or benefiting from AI growth, offering new opportunities beyond direct chip manufacturers.

Cloud Giants Challenge Nvidia's AI Dominance

The AI chip market, long dominated by Nvidia with over a 90% share, is witnessing a new challenge from some of its largest customers. Amazon, Meta, Microsoft, and Google-parent Alphabet, collectively known as the Cloud Four, are developing their own AI chips. This move is driven by the desire to control surging capital spending, which for fiscal 2024 is expected to reach $178 billion, marking a more than 26% increase from the previous year. By designing their own chips, these companies aim to manage costs more effectively, similar to Apple's integrated approach to hardware and software.

Amazon entered the chip market in 2015 with the acquisition of Annapurna Labs, leading to the development of three chips aimed at enhancing AI performance and cost-efficiency. Google has been working on its tensor processing units (TPUs) for nearly a decade, focusing on both performance and efficiency to support its services and large language models. Meta recently introduced its second-generation AI chip, the Meta Training and Inference Accelerator (MTIA), to power its social media and advertising algorithms. Microsoft, though later to the game, unveiled the Azure Maia 100 AI Accelerator, focusing initially on internal workloads with plans to expand access to customers.

Semiconductor Sector's Mixed Performance

The semiconductor industry is showing mixed results amid the surge in AI chip demand. While Nvidia continues to benefit from its strong position in the AI chip market, companies like Intel and AMD have seen their shares decline, reflecting their smaller footprint in AI chips. Broadcom, by contrast, has seen a 4% increase in its stock price, benefiting from AI chip demand even though AI accounts for only about a fifth of its business. This divergence highlights how much business mix matters in the current market.

Taiwan Semiconductor Manufacturing Company (TSMC), another key player, reported first-quarter results that exceeded expectations, with sales guidance for the second quarter also beating forecasts. However, concerns over non-AI chip demand led to a 4% drop in its stock price in midday trading.

Investment Strategies Amid AI Boom

The growing demand for AI chips presents new investment opportunities and challenges. Analysts and investors are reassessing their strategies to capitalize on the AI boom. Ocean Park Investments, for example, has found success by investing in companies like Eaton, Vertiv, and Super Micro Computer, which support or benefit from the AI industry's growth without directly competing with chip giants. This approach highlights the potential for investors to find value in sectors adjacent to core AI technologies.

Management Quotes

  • Gadi Hutt, Director of Business Development for Annapurna (Amazon):

    "There was no Gen AI, and there were no LLMs [or large language models], but machine learning was growing as a use case at Amazon and with our customers. They all came to us with basically the same product statement, which was that we want to do more, but it’s too expensive." "Compared to other solutions that are available in the cloud, mainly Nvidia, you will save money and you’ll have high performance, in some cases even higher."

  • Mark Lohmeyer, Vice President of Compute and Machine Learning Infrastructure at Google Cloud:

    "The company started the project over a decade ago when a few engineers floated a thought experiment—how much compute it would take if Google users interacted a few minutes a day with search through voice prompts? He says the answer was that the amount of required general purpose computing it would have required was astounding—spurring the work on processors or AI and machine learning."

  • Rani Borkar, Corporate Vice President for Microsoft Azure Hardware Systems and Infrastructure:

    "We are optimizing and integrating every layer of the stack. It’s not just about silicon. We are reimagining every layer from silicon to servers to system to data center infrastructure, to maximize performance and efficiency." "It’s not this or that; it's this and that... we need diversity of suppliers."