Jacob Falkencrone
Global Head of Investment Strategy
Nvidia’s next earnings report on 19 November has turned into a global stress test for the artificial intelligence story. The company is about 8% of the S&P 500 and recently became the world’s most valuable listed firm, so its numbers now move more than just chip stocks.
At the last close on 14 November 2025, Nvidia shares traded at 190.17 USD, up 1.8% on the day and roughly 42% year to date. Options markets are braced for about a 6% swing on results, which means the reaction could easily ripple across global indices.
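To make that implied swing concrete, here is a minimal sketch of the price range it suggests, using only the figures quoted above (190.17 USD last close, a ~6% expected move either way):

```python
# Hedged sketch: translating a ~6% options-implied move into a price range.
# Figures are from the article; the 6% move is an approximation, not a forecast.

last_close = 190.17   # USD, close on 14 November 2025
implied_move = 0.06   # ~6% swing priced by options markets

upside = last_close * (1 + implied_move)
downside = last_close * (1 - implied_move)

print(f"Implied range on results: {downside:.2f} to {upside:.2f} USD")
```

In other words, options traders are pricing a move of roughly 11 to 12 dollars per share in either direction, which for a company of Nvidia's index weight is enough to move broad benchmarks on its own.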
Nvidia has beaten expectations in most recent quarters, yet its share price has often slipped once the headlines fade. Investors are no longer impressed by big beats alone. They want proof that the AI build-out is moving from “infinite budget” stories to durable profits.
This time the backdrop is more fragile. Tech just suffered one of its shakiest weeks in years, rate-cut hopes in the United States have cooled, and data releases are distorted by a recent government shutdown. Asian markets are also on edge, with Japan and Taiwan sensitive to any wobble in AI demand and geopolitics.
In that setting, one company’s earnings call doubles as a referendum on a multi-trillion dollar AI capex cycle.
On average, analysts surveyed by Bloomberg expect around 55.2 billion USD of revenue for the quarter, up almost 60% year on year. Adjusted earnings per share (EPS, profit divided by the number of shares outstanding) are seen near 1.26 USD, roughly 55% higher than a year ago.
Three line items sit at the top of most models: data centre revenue, margins and guidance. Data centres are the core of the story, fuelled by Nvidia’s Blackwell chips and early spending for the Rubin generation that should ship in 2026.
Investors will also look at how much is coming from the “hyperscalers” (the big cloud platforms such as Microsoft, Amazon, Alphabet and Meta) versus newer customers like specialist AI clouds and governments. A broader customer mix would make the revenue base feel less dependent on a handful of cheque-writers.
Guidance for the next quarter may be even more important than this print. Some previews point to the possibility of guidance heading towards the mid-60 billion USD range. Anything that looks soft against that sort of whisper could reignite AI bubble worries.
Nvidia now sits at the centre of a web of huge AI infrastructure promises. OpenAI has talked about 1.4 trillion USD of data centre commitments over the next eight years, spanning cloud, hardware and dedicated AI campuses.
Within that, Nvidia has agreed to invest up to 100 billion USD to help OpenAI build at least 10 gigawatts of AI data centres, which will in turn be filled with Nvidia systems. On top of that, OpenAI has struck a 38 billion USD cloud deal with Amazon Web Services that again leans heavily on Nvidia hardware.
From one angle, this is a powerful flywheel. Capital flows into OpenAI, which buys Nvidia chips, which supports Nvidia’s share price and makes more capital available for the next round of projects. From another angle, some of that spending starts to look circular, with investors asking how much is true end-user demand and how much is AI companies funding each other.
On the call, watch how Nvidia talks about who is ultimately paying for these racks and what kinds of workloads are running on them. Concrete examples of paying customers and real-world use cases matter more now than grand capex totals.
A fresh worry has arrived just in time for this earnings season: depreciation. Michael Burry, of “The Big Short” fame, argues that major AI and cloud companies are overstating profits by stretching the useful life of their Nvidia-powered hardware. He estimates that between 2026 and 2028, earnings could be inflated by around 176 billion USD across the sector.
The basic idea is simple. Capital expenditure (capex) on GPUs and networking is spread over several years through depreciation. If companies claim that servers last five to six years, but the economic life of high-end AI chips is more like two to three years, then near-term profits look better than they should.
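The arithmetic behind that concern can be sketched in a few lines. The 10 billion USD of capex below is a purely illustrative figure, not one from the article; the point is the gap between a 5-year depreciation schedule and a 3-year economic life:

```python
# Hedged sketch: straight-line depreciation under two assumed useful lives.
# The capex amount is hypothetical; only the 5-6 year vs 2-3 year schedules
# come from the debate described in the article.

def annual_depreciation(capex: float, useful_life_years: int) -> float:
    """Straight-line depreciation expense booked per year."""
    return capex / useful_life_years

capex = 10_000_000_000  # hypothetical 10 bn USD spent on GPUs and networking

long_schedule = annual_depreciation(capex, 5)   # life claimed in the accounts
short_schedule = annual_depreciation(capex, 3)  # closer to the chips' economic life

gap = short_schedule - long_schedule
print(f"5-year charge: {long_schedule / 1e9:.2f} bn USD per year")
print(f"3-year charge: {short_schedule / 1e9:.2f} bn USD per year")
print(f"Near-term profit flattered by ~{gap / 1e9:.2f} bn USD per year")
```

On these illustrative numbers, stretching the schedule from three years to five understates the annual charge by about 1.3 billion USD per 10 billion USD of capex, which is the mechanism Burry's sector-wide estimate scales up.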
Nvidia is not the one booking that depreciation, but it sits at the heart of this cycle. Faster product launches, from Blackwell to Rubin and later Feynman, shorten the time before older chips feel obsolete. That makes it harder for customers to justify long depreciation schedules.
Investors listening to Nvidia's guidance will be asking a simple question: are customers spending at a pace that their own profit and loss statements can reasonably support, or are we front-loading an AI boom whose bill arrives later?
Nvidia’s supply chain is heavily concentrated in Asia. Foundry partners like TSMC and memory suppliers such as Samsung and SK hynix sit at the core of the product stack, and they themselves make up large chunks of the Taiwanese Taiex and Korean Kospi indices.
That means Nvidia’s guidance does not just move one ticker. It influences how investors view whole markets, from US tech to Asian hardware exporters. Recent global sell-offs have shown how quickly weakness in Nvidia can spill into indices from New York to Tokyo.
For investors, Nvidia has become a proxy for sentiment on AI as a whole. When it rallies, the market tends to reward “AI-linked” stories. When it stumbles, the same stories can deflate in a hurry.
Three risk signals are worth watching on results night and in the weeks after: a customer mix that remains concentrated in a handful of hyperscalers, data centre revenue or guidance that falls short of the whisper numbers, and any sign that customers are stretching depreciation schedules to flatter their own profits. If any of these show up alongside cautious guidance, talk of an AI bubble would likely come back quickly.
This is an earnings preview, not a to-do list, but there are a few practical checkpoints: data centre revenue against the roughly 55 billion USD consensus, next-quarter guidance against the mid-60 billion USD whispers, the breadth of the customer base beyond the big cloud platforms, and how management addresses the useful life of its chips.
Nvidia has become the scoreboard for the AI boom, from OpenAI’s trillion-dollar ambitions to the depreciation rows in Big Tech boardrooms. This week will not answer every question, but it will show how much of that future is already priced in, and how much room is left before gravity reasserts itself.