Why the AI Boom May Not Be a Bubble: Wall Street Veteran Rick Sherlund Sees the Next Tech Giants Emerging
As billions of dollars flow into artificial intelligence, investors and executives are increasingly asking whether the market is replaying the dynamics of past tech bubbles.
Rick Sherlund, a longtime Wall Street technology analyst and senior adviser at Wedbush, argues the comparison misses what is actually happening.
In his view, the current AI surge looks less like a short-lived frenzy and more like a platform shift that will reshape how software is built and how companies operate. Sherlund says the real story is the slow, structural adoption of AI across business systems, not just consumer-facing chatbots.
A platform shift, not a fad
Sherlund points to earlier platform transitions he has covered as an analyst, from mainframes to client-server computing and later to cloud and mobile. Those waves also attracted skepticism early on, but ultimately created new long-term winners and rewired corporate IT spending.
He argues AI is still at an early stage of enterprise integration, where companies are moving from experimentation to embedding models into products, workflows and internal tools. That shift, he says, is what can sustain demand beyond market cycles and valuation swings.
AI looks like an oligopoly
Unlike in some previous tech booms, Sherlund expects the AI market to remain heavily concentrated because the costs of building and training frontier-scale models are extremely high. The market already resembles an oligopoly, with a small set of firms controlling key layers of compute, models and platforms.
Nvidia remains central to the hardware stack, while major model and platform ecosystems are being built by companies such as OpenAI, Google and Microsoft, alongside rivals like Anthropic. At the same time, open-source models, including efforts coming out of China, could put pressure on pricing and margins over time.
Infrastructure is the real bottleneck
Sherlund emphasizes that the pace of AI expansion will depend on infrastructure as much as on algorithms. Data centers, energy supply, advanced memory and the capital needed to scale deployments are becoming the practical constraints that determine how fast AI can spread.
He argues that demand for compute will increasingly come from inference, the day-to-day running of AI systems inside products and enterprise environments. That workload, he says, is only beginning to ramp and could become a durable driver of spending even if parts of the stock market periodically overheat.
While Sherlund does not rule out pockets of overvaluation, particularly among heavily funded startups, he believes the broader trend is anchored in real adoption. In his assessment, the biggest AI winners may not be fully obvious yet, but the foundations for the next generation of tech giants are being laid now.
