Apple (AAPL) recently announced that it has used Alphabet's (GOOGL) tensor processing units (TPUs) to develop artificial intelligence (AI) models. To some commentators, this is a revelation that the industry is less reliant on Nvidia's (NVDA) graphics processing units (GPUs) than initially thought. However, since the two chip types don't provide the same value, it may be better to view Google's TPUs as part of its broader AI package offering rather than as a revenue driver in isolation.
Although Google's TPUs can't yet match Nvidia's technological edge, I am bullish on Alphabet (best known for its search engine, Google) due to its innovative AI products, relatively cheap valuation, and increasing autonomy. Wall Street shares this bullish view.
Does Apple's choice of TPU matter?
First, let’s understand why Apple’s adoption of Google’s TPU attracted so much attention and what difference its unique model makes compared to Nvidia’s.
In a recent technical paper, Apple engineers disclosed that the iPhone maker chose Google's TPUs, using 2,048 TPUv5p chips for its on-device AI models and 8,192 TPUv4 processors for its server-side AI models. Until now, companies working with AI and machine learning (ML) have typically used GPUs from Nvidia.
For some commentators, it was intriguing that there was no mention of Nvidia in the report, perhaps indicating that Google's TPUs could offer comparable performance and efficiency.
It's also interesting to note that Google's business model for TPUs is entirely different: Unlike Nvidia, which sells chipsets to hyperscalers and other markets, Google offers access to its TPUs through its cloud services, with customers essentially renting access to the tech giant's TPUs.
Google's TPUs are custom-designed application-specific integrated circuits (ASICs) optimized for deep learning tasks, providing high efficiency for large matrix operations. This approach requires customers to develop the necessary software within Google's ecosystem, highlighting a strategic difference between the two companies' products. This strategy ties Google's customers to its ecosystem more tightly than Nvidia's chip sales do, which can drive more consistent, recurring revenue.
Can Google's TPUs compete with GPUs?
Currently, Google's TPUs can't match Nvidia's GPUs in terms of cost-effectiveness and popularity, but this is just the beginning for Google's new chipsets and the future looks bright. Let's look at the main differences.
While Nvidia's GPUs were originally developed for graphics rendering, Google's TPUs, like other neural processing units (NPUs), are purpose-built for AI/ML workloads. GPUs' parallel processing capabilities have nonetheless made them the default choice for AI/ML work. Both TPUs and GPUs have distinct advantages and limitations in AI and ML.
TPUs are highly efficient for training large AI models, excel at tasks requiring large matrix calculations, and can be connected into large clusters for scalability. TPUs offer a cost-effective solution through Google Cloud, with a significant performance-per-dollar advantage. However, TPUs do not have the versatility of GPUs, nor do they have the same mature ecosystem.
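The "large matrix calculations" mentioned above are, at their core, big matrix multiplications, the central primitive of neural-network training. A minimal NumPy sketch of the kind of computation a TPU's matrix units are built to accelerate (the layer sizes here are illustrative, not from the article):

```python
import numpy as np

# A single dense-layer forward pass is one large matrix multiply;
# TPU matrix units are designed to run many of these per training step.
batch, d_in, d_out = 64, 1024, 1024                  # illustrative sizes
x = np.random.rand(batch, d_in).astype(np.float32)   # input activations
w = np.random.rand(d_in, d_out).astype(np.float32)   # layer weights

y = x @ w                  # the matrix multiply a TPU accelerates
print(y.shape)             # (64, 1024)
```

Training a large model repeats multiplications like this trillions of times, which is why hardware specialized for this one operation can outperform a general-purpose GPU on a per-dollar basis.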
GPUs, on the other hand, are quite cost-effective, at least according to some metrics like FLOPS (floating-point operations per second). In fact, Jensen Huang said in March that Nvidia's AI chips are “so good that even if our competitors' chips were free, they wouldn't be cheap enough.”
So while TPUs are useful for AI and ML tasks, and can be connected into large clusters to provide the computational power needed for advanced AI models, they don't yet have the same broad appeal or ecosystem maturity as GPUs.
Google's investment in TPUs
So TPUs don't have the same technical value as Nvidia's GPUs, but as we said before, this is all part of Alphabet's broader plans for the new AI era.
Google Cloud revenue grew 28% year-over-year to $9.6 billion in the first quarter of 2024, driven in part by demand for AI infrastructure, including TPUs. Google doesn't disclose TPU revenue separately, but underlying demand and data-center capacity growth suggest TPUs could be seeing wider adoption.
CEO Sundar Pichai suggested that Google's TPU-related revenue could grow significantly due to increased demand for AI services, noting that Google's AI hypercomputer, which combines TPUs with Nvidia's GPUs, has attracted a significant number of funded AI startups and unicorns as customers.
More than 60% of funded generative AI startups and nearly 90% of generative AI unicorns are Google Cloud customers. This trend positions Google well to capture a larger share of the rapidly expanding AI market and sets it up for significant revenue growth in the coming years.
However, it's worth noting that Google Cloud accounted for only about 12% of the company's revenue in the first quarter. While the company's AI cloud model may become more widespread, TPUs are just one part of a broader range of services. Moreover, TPUs will help Google reduce both its computing costs and its dependence on Nvidia.
GOOGL Stock: Cheapest of the Big Tech Companies
Another key factor to be bullish on GOOGL stock is its relatively cheap price. In fact, Google is the cheapest of the big tech stocks, trading at 21.8 times forward earnings and a price-to-earnings growth (PEG) ratio of 1.25. Its currently expected medium-term (next 3-5 years) annual earnings growth rate is 17.4%.
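The PEG ratio above follows directly from the other two figures cited: it is simply the forward P/E divided by the expected annual earnings growth rate (expressed in percent). A quick sanity check of the article's numbers:

```python
# Valuation figures cited in the article
forward_pe = 21.8        # forward price-to-earnings ratio
growth_rate_pct = 17.4   # expected annual earnings growth, next 3-5 years

# PEG = forward P/E / expected growth rate (in percent);
# a PEG near or below 1 is conventionally read as cheap relative to growth
peg = forward_pe / growth_rate_pct
print(f"PEG ratio: {peg:.2f}")  # → PEG ratio: 1.25
```

A PEG of 1.25 is modest by big-tech standards, which is the basis for calling GOOGL the cheapest of the group.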
Is GOOGL stock a buy according to analysts?
TipRanks rates GOOGL a Strong Buy, with 30 Buy ratings, 7 Hold ratings, and 0 Sell ratings from analysts in the past three months. Alphabet’s average price target is $204.62, suggesting an upside potential of 29.3%.
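The 29.3% upside figure is derived from the average price target relative to the share price at the time of writing; reversing the calculation recovers the implied share price (a sketch using only the two figures cited above):

```python
# Analyst figures cited in the article
avg_price_target = 204.62  # average 12-month price target, in dollars
upside_pct = 29.3          # implied upside to that target, in percent

# upside = (target / price - 1) * 100  =>  price = target / (1 + upside/100)
implied_price = avg_price_target / (1 + upside_pct / 100)
print(f"Implied share price at time of writing: ${implied_price:.2f}")
```

This backs out a share price of roughly $158, consistent with the stated upside to the $204.62 target.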
Conclusion
Google's TPUs are part of a broader strategy for the AI era. The company is taking a different approach than Nvidia, and Google Cloud's revenue growth suggests it is paying off. The valuation also helps: GOOGL is certainly not expensive for a big tech stock, and it enjoys growth tailwinds, with projected annual earnings growth of 17.4%. While TPUs are not the only driver of this growth, they are part of Google's broader AI package, and they seem to be gaining momentum.
Disclosure