While the stock market has seen some volatility in August, Nvidia (NVDA) shares are up more than 150% this year.
Driving the stock is Nvidia's role as a supplier of chips crucial to artificial intelligence, sold to tech giants including Apple (AAPL), Amazon (AMZN), and Microsoft (MSFT) as the generative AI boom spreads.
More than a third of the S&P 500's (^GSPC) market capitalization growth in the first half of 2024 came from Nvidia. To some investors, the concentration of gains in a handful of stocks like Nvidia looks risky. That point was underscored earlier this month when a market selloff briefly pushed the CBOE Volatility Index (^VIX) above 60 and sent Nvidia shares down as much as 10%.
Though the stock eventually recovered, the episode served as a reminder that investors can look for AI opportunities elsewhere.
Understanding AI's core technologies and terminology is essential for anyone looking to gain an advantage and diversify their technology assets.
Here are some terms you need to know to capitalize on the AI boom.
Inference
Inference is AI's defining moment: when an AI model like ChatGPT generates an answer to a prompt based on its prior training. The quality of an AI system's inference depends heavily on the underlying technology stack, including the powerful chips behind it.
NVIDIA President and CEO Jensen Huang speaks during the Computex 2024 trade show in Taipei, Taiwan, June 2, 2024. (AP Photo/Chiang Ying-ying)
Compute
Compute is what drives the success of an AI system, much as horsepower propels a car: the more computing power available, the faster and more efficient the inference process will be.
Processing power, memory, and storage all contribute to compute, and chipmakers tend to focus on boosting it with each new chip release because greater computing power per generation lets them charge higher prices, which bodes well for future profit margins.
GPU
Graphics processing units (GPUs) are the advanced, expensive chips that power AI, and their quality helps determine how fast AI systems can compute. Nvidia, which began developing GPUs in the 1990s, holds more than 80% of the GPU market and used the term 19 times on its first quarter earnings call. Nvidia's GPUs have improved AI inference performance 1,000-fold over the past decade.
Hyperscaler
Large tech companies such as Microsoft, Alphabet (GOOG, GOOGL), Meta (META), and Amazon are considered hyperscalers: companies with the data center capacity to scale AI rapidly. Products and services such as Microsoft's Copilot, Alphabet's Gemini, and Meta's Llama have made these companies significant consumers of AI chips as well as competitors to chipmakers.