4 AI terms Nvidia investors should know

Despite some volatility in the stock market so far in August, Nvidia stock (NVDA) remains up over 150% this year.

Driving the stock is Nvidia's role in supplying chips to major tech giants like Apple (AAPL), Amazon (AMZN), and Microsoft (MSFT) that are crucial for artificial intelligence technology amid the broader generative AI boom.

Over a third of the S&P 500's (^GSPC) market cap gains in the first half of 2024 could be attributed to Nvidia. For some investors, the concentration of gains in a few stocks such as Nvidia seems risky. That concern was underlined earlier this month by a stock rout that briefly pushed the Cboe Volatility Index (^VIX) above 60 and sent shares of Nvidia down by as much as 10%.

Stocks ultimately recovered, but the episode served as a reminder that investors can seek AI opportunities beyond a handful of names.

For those looking to gain an edge and diversify their tech holdings, understanding AI's core technology and terminology is essential.

Here's a breakdown of a few of the terms you need to know to invest in the AI boom:

Inference

Inference is AI's moment of truth. It's when an AI model like ChatGPT generates an answer to a prompt based on its prior training. The quality of an AI system's inference relies heavily on its underlying technology stack, including the powerful chips that drive it.

President and CEO of Nvidia Corporation Jensen Huang delivers a speech during the Computex 2024 exhibition in Taipei, Taiwan, on June 2, 2024. (AP Photo/Chiang Ying-ying) (ASSOCIATED PRESS)

Compute

Compute power is the driving force behind an AI system's success, similar to how horsepower propels a car. The greater the compute power, the more efficient and faster the inference process becomes.

Processing power, memory, and storage all fuel compute power. Chipmakers tend to emphasize compute gains with each new chip release because those generational improvements allow them to charge more, which typically bodes well for future profit margins.

GPUs

Graphics processing units, or GPUs, are advanced and expensive chips that power AI. Their quality can help determine the speed of AI computations. Nvidia, which began working on GPUs in the '90s, owns over 80% of the GPU market and mentioned the term 19 times in its first quarter earnings presentation. Nvidia’s GPUs have increased AI inference performance by a thousand times over the last decade.

Hyperscalers

Big Tech companies like Microsoft, Alphabet (GOOG, GOOGL), Meta (META), and Amazon are considered hyperscalers, or those capable of rapidly scaling AI. With products and services such as Microsoft's Copilot, Alphabet's Gemini, and Meta's Llama, these companies are significant both as consumers of AI chips and competitors to chipmakers.

