Katrin Wagner
Head of Investment Content Switzerland
Investment Strategist
Memory, not just graphics chips, sets the speed and cost of artificial intelligence, and tight supply can shift pricing power.
Micron’s outlook points to strong demand and heavy investment, but the memory cycle still matters.
Oracle’s data centre snag shows artificial intelligence growth needs financing and execution, not just excitement.
On 17 December 2025, markets give artificial intelligence (AI) a late-year reality check. The Standard & Poor’s 500 (S&P 500) closes at 6,721.43, down 1.2%, and the Nasdaq Composite ends at 22,693.32, down 1.8%. The move feels less like “AI is over” and more like “show me the bill”. After a year of big promises, investors lean into three practical questions: where the bottleneck sits, who has pricing power, and how quickly today’s spending turns into tomorrow’s cash.
A graphics processing unit (GPU) can only work as fast as it can fetch data. Modern AI models move huge amounts of information between the processor and memory. If that pipe is narrow, the GPU waits.
Two memory terms matter. Dynamic random-access memory (DRAM) is the server’s working memory. High-bandwidth memory (HBM) is a premium form of DRAM stacked and placed close to the GPU, so data moves faster and with less power.
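To see why the width of that pipe matters, here is a rough back-of-envelope sketch in Python. Every number in it is an illustrative assumption rather than the specification of any real chip; the point is simply that when moving the data takes longer than doing the arithmetic, the processor sits idle.

```python
# Back-of-envelope check: is a workload limited by compute or by memory bandwidth?
# All figures are illustrative assumptions, not real chip specifications.

PEAK_COMPUTE_FLOPS = 1_000e12   # assumed accelerator peak: 1,000 teraFLOPs per second
MEMORY_BANDWIDTH_BPS = 3e12     # assumed memory bandwidth: 3 terabytes per second

def bottleneck(flops: float, bytes_moved: float) -> str:
    """Compare time spent on arithmetic with time spent moving data."""
    compute_time = flops / PEAK_COMPUTE_FLOPS         # seconds of pure arithmetic
    memory_time = bytes_moved / MEMORY_BANDWIDTH_BPS  # seconds of pure data movement
    limit = "memory-bound" if memory_time > compute_time else "compute-bound"
    return f"compute {compute_time * 1e3:.1f} ms, memory {memory_time * 1e3:.1f} ms -> {limit}"

# Hypothetical inference step: 2 trillion floating-point operations that also
# stream 200 billion bytes of model weights and activations through memory.
print(bottleneck(flops=2e12, bytes_moved=2e11))
```

With these assumed figures, the step spends far more time waiting on data than computing, which is why faster memory placed closer to the processor, such as HBM, commands a premium.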
When HBM is scarce, memory makers gain negotiating power. That is the quiet reason the “AI trade” is also a memory story.
Micron reports results on 17 December 2025 and leans into a simple message: memory is no longer the quiet sidekick in the AI story. The company’s quarter lands ahead of expectations, but the bigger surprise sits in the outlook. Management points to a step-up in guidance that comes in well above what Bloomberg analysts expect, driven by data centre demand and the scramble for HBM, the premium “fast lane” memory used next to AI accelerators.
Compared with this time last year, the change is stark: Micron’s fiscal first-quarter revenue rises to 13.64 billion USD from 8.71 billion USD a year earlier, a jump of roughly 57%, and management talks about record performance and supply that still does not fully meet demand. In plain English, customers want more chips than the industry can comfortably ship, and that is when pricing tends to behave better. Management also points to a much higher 2026 capital expenditure plan (money spent on factories and equipment) and warns that tight supply can persist beyond 2026. That combination supports pricing power, but it raises execution risk because the industry is building expensive capacity fast.
Nvidia designs the GPUs that do the heavy lifting in AI training and inference, but the system is only as strong as everything around the GPU. More GPUs pull the whole chain forward: more HBM, more networking, more electricity, and more data centres.
Oracle sits in that “buildings and power” layer as it expands cloud capacity and chases large AI workloads, including projects linked to OpenAI. On 17 December 2025, Oracle shares close down 5.4% at 178.46 USD after reports that a planned Michigan data centre project hits turbulence when Blue Owl Capital steps back from funding talks. The market takeaway is not that demand disappears. It is that AI now looks like infrastructure, and infrastructure lives and dies by terms, financing, and payback periods.
Memory remains cyclical. A pause in cloud capital expenditure can cool demand and pressure pricing quickly. Watch for softer spending plans and rising inventories.
HBM is also hard to scale. Yield issues, packaging bottlenecks, or faster progress by rivals can squeeze margins. Watch for customer qualification updates and any shift in multi-year supply agreements.
Finally, watch the profit gap. AI can grow revenue and costs at the same time. If services do not price well enough to cover the infrastructure bill, valuations can reset across the theme.
Track AI as a supply chain: GPUs, memory, networking, and data centres, not one hero stock.
Prioritise cash flow direction and capital discipline over headline growth rates.
Use diversification and position sizing to respect volatility in cyclical semiconductors; a simple sizing sketch follows this list.
Watch concrete signals: capital expenditure guidance, HBM allocation comments, and data centre utilisation.
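As one way to make the position-sizing point concrete, the sketch below scales hypothetical weights by inverse volatility, so the most volatile holdings receive the smallest allocations. The names and volatility figures are invented for the example, and the method is a deliberately simple illustration rather than a recommendation or a complete risk model.

```python
# Simple inverse-volatility position sizing: more volatile holdings get smaller weights.
# The names and volatility figures below are hypothetical, for illustration only.

def inverse_volatility_weights(volatilities: dict[str, float]) -> dict[str, float]:
    """Weight each position in proportion to 1 / volatility, normalised to sum to 1."""
    inverse = {name: 1.0 / vol for name, vol in volatilities.items()}
    total = sum(inverse.values())
    return {name: value / total for name, value in inverse.items()}

# Assumed annualised volatilities for a toy three-position AI supply-chain basket.
assumed_vols = {
    "memory_maker": 0.55,      # cyclical semiconductor, high volatility
    "cloud_provider": 0.35,    # infrastructure layer, moderate volatility
    "broad_index_fund": 0.18,  # diversified core holding, lower volatility
}

for name, weight in inverse_volatility_weights(assumed_vols).items():
    print(f"{name}: {weight:.1%}")
```

The design choice is only that the allocation shrinks as volatility rises; any disciplined rule that caps exposure to the most cyclical names serves the same purpose.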
The tidy ending to 2025 is that the AI story grows up. In January, it feels like software magic. By December, it looks like concrete, cables, and contracts. Micron’s quarter shows why: even the best GPU cannot run if memory bandwidth is scarce. Oracle’s funding snag shows the other half: the buildings and power systems need willing financiers.
For a long-term investor, the lesson is not to chase every headline. It is to follow where the economics settle, and to remember that boring components often set the speed. Price the plumbing first, then enjoy the magic. That mindset helps when the next wobble arrives, because it usually does.