It’s no secret that Nvidia (NASDAQ: NVDA) dominates the artificial intelligence (AI) chip market, particularly in the large-scale data center sector. Think of the server farms built by companies like Amazon and Alphabet. Last quarter, the company brought in $26.5 billion in revenue from this high-margin segment. That’s 10 times the amount managed by its biggest competitor, AMD. And Nvidia isn’t just selling more; it’s selling more profitably. The company’s net income for the same period was a whopping 25 times AMD’s.
Media attention is almost entirely focused on this segment. It makes perfect sense. That number is extraordinary. Without a doubt, data centers are the heart of Nvidia’s explosive growth, and that’s not likely to change. But exciting opportunities in AI exist outside of these giant server farms. Your personal computer may play an important role in the near future.
That’s why Nvidia is hosting RTX AI PC Day on October 19th and 20th this year. The event will explore how the company is building AI and machine learning capabilities into consumer GPUs and the future it envisions for AI-powered PCs. What does this mean for investors?
Save time and money by pushing AI to the edge
The technology industry has been defined by the cloud for the past two decades. Advances in internet bandwidth have made it possible to move much of the actual work of computing and storage from local devices to vast server farms, mostly hidden in rural areas. Out of sight, out of mind, as they say. AI is generally similar. ChatGPT doesn’t run on your computer; it’s probably running on Microsoft servers thousands of miles away. Your computer is simply the terminal you use to communicate with those servers.
But that’s starting to change. In some cases, computing is moving back to what technologists call “the edge.” In other words, increasingly important parts of computing are once again being done locally on the device. Some think this could become a big trend in AI. But why? Great question.
Some AI applications are too demanding to run anywhere other than a data center packed with thousands of Nvidia’s H100s, for example. At $25,000 each, the cost is staggering. That’s why these data centers are being built by companies that can invest billions of dollars. However, not all AI applications require that much power. Smaller models, certain tools, and other limited AI applications can run locally on a sufficiently powerful GPU. But just because you can, does that mean you should? What benefits does edge computing offer here?
First, what edge computing lacks in power, it makes up for in speed. Cloud computing introduces latency that isn’t present when crunching numbers locally. Those few milliseconds can make a big difference in some applications of the technology. Second, considering the cost of building and operating large data centers, using them isn’t cheap. Customers who choose the edge approach may pay more upfront, but over time it could become the cheaper option. Another big one? If your internet connection is unreliable or limited, edge computing removes the need to depend on the cloud at all.
AI PCs will be used for everything from enhancing video game graphics and expanding content creation possibilities to bringing personal AI assistants to cars.
AI PCs could be a big market for Nvidia
This could be a huge advancement for the entire PC industry. Qualcomm’s CEO called it “as important as Windows 95.” No one wants to be left behind here. That means Nvidia is in fierce competition with a host of other chipmakers for a slice of a pie that research firm Canalys expects to grow at a compound annual growth rate of 44% over the next four years. The firm believes more than 200 million AI PCs will be shipped by 2028.
Given the competition, Nvidia is unlikely to capture as large a share of the AI PC market as it holds in data centers. Given the market’s scale, however, it doesn’t need to. Even a modest share would still significantly boost earnings.
The Nvidia event could help raise the profile of AI PCs, but I wouldn’t hold my breath for anything that moves the stock price. The event itself matters less than what it represents: yet another huge opportunity for a company firing on all cylinders.
Should you invest $1,000 in Nvidia right now?
Before buying Nvidia stock, consider the following:
The Motley Fool Stock Advisor team of analysts identified the 10 best stocks for investors to buy right now…and Nvidia wasn’t one of them. These 10 stocks have the potential to generate impressive returns over the next few years.
Consider when Nvidia made this list on April 15, 2005… If you had invested $1,000 at the time of our recommendation, you’d have $831,707!*
Stock Advisor provides investors with an easy-to-follow blueprint for success, including guidance on building a portfolio, regular updates from analysts, and two new stock picks each month. The Stock Advisor service has more than quadrupled the return of the S&P 500 since 2002.*
See 10 stocks »
*Stock Advisor returns as of October 14, 2024
John Mackey, former CEO of Amazon subsidiary Whole Foods Market, is a member of the Motley Fool’s board of directors. Suzanne Frey, an Alphabet executive, is a member of the Motley Fool’s board of directors. Johnny Rice has no position in any stocks mentioned. The Motley Fool has positions in and recommends Advanced Micro Devices, Alphabet, Amazon, Microsoft, Nvidia, and Qualcomm. The Motley Fool recommends the following options: A long January 2026 $395 call on Microsoft and a short January 2026 $405 call on Microsoft. The Motley Fool has a disclosure policy.
“October 19th is approaching. Could Nvidia’s AI PC Day be a game-changer for investors?” was originally published by The Motley Fool.