ChatGPT, the chatbot that helped spark the market’s AI enthusiasm, is an important innovation because it is trained on vast amounts of text (hence the term large language model), communicates in natural (human) language, and generates sophisticated responses. Based on the Transformer architecture, a technique first introduced by Google in 2017, ChatGPT demonstrates advances in AI that open the door to a wider set of business uses. But while natural language models only recently produced epiphanies among lay chief executives and investors regarding AI, some tech companies were already investing in such capabilities and are being rewarded for that foresight. NVIDIA has been the biggest beneficiary this year in terms of its stock run and projected revenue gains; however, our other holdings, such as Adobe, Microsoft, Salesforce, ServiceNow, Synopsys, and TSMC, also appear among the possible beneficiaries. More companies—including, perhaps, some not yet in existence—will certainly join the ranks over time.
While it is still early, it’s evident that these companies see generative AI as transformative to their businesses and something upon which they can build new revenue models. Additionally, they are turning to AI to boost internal productivity, enhance existing customer offerings, and improve the quality and efficiency of customer interactions.
Most notably, Microsoft was able to gain an immediate leadership position in generative AI by making a US$10 billion investment in OpenAI, the company behind ChatGPT, earlier this year. Microsoft’s Bing search engine has since integrated ChatGPT with its web index—a collection of data so large that it is rivaled by the dataset of only one other business in the world, Alphabet’s Google. Data are the feedstock of AI models, and an AI-enhanced search engine trained on so much data may attract more users to Bing, allowing Microsoft to sell more ads on the service. Microsoft is also adding generative AI to other products, including the Azure cloud service, enabling business customers who use Azure to easily integrate OpenAI models to glean more insights from their data and automate functions such as certain IT tasks. These added capabilities should motivate more businesses to migrate their data to the cloud and make Azure more competitive with Amazon.com’s AWS and Google Cloud.
The beneficiaries of demand for generative AI aren’t limited to traditional IT-sector companies. Data centers used to train AI models can draw up to ten times more power than typical data centers, requiring more-powerful equipment and backup power. Given the amount of heat they generate, new liquid-cooling solutions will be needed as well. This creates an opportunity for Schneider Electric, which has been developing innovative data-center equipment solutions for many years.
In the meantime, NVIDIA has emerged as the unrivaled global leader in providing the technologies at the center of the AI arms race. Due to an explosion of demand related to generative AI and LLMs from across its customer base, NVIDIA projects that data-center revenue for its fiscal second quarter ending in July will surge to US$11 billion. Not only is that more than double last quarter’s total, but the forecast also shattered the average analyst estimate that called for about US$7 billion.
Our investments in NVIDIA and Schneider reflect how we are thinking through the many unknowns and approaching portfolio structure in this environment. Through our fundamental framework, we can appreciate the broad excitement for AI, but we also remain conscious of valuations and thoughtful about diversification, recognizing that it’s unlikely anyone can predict today the biggest long-term winners.