DeepSeek Disrupts AI, Tanks Tech Stocks, and Creates Buy Opportunities In 3 Stocks
How DeepSeek Is Disrupting AI and Unlocking New Opportunities for Investors in Cloud Infrastructure Stocks.
The world of large language models (LLMs) is heating up, and DeepSeek’s recent R1 launch is throwing fuel on the fire. If you thought there were moats around generative AI, think again. With performance that rivals best-in-class models at a cost more than 90% lower than OpenAI’s latest reasoning model, R1 is making waves, and investors and industry giants are starting to feel the ripple effects.
DeepSeek’s approach challenges the idea that generative AI is only for those with billions of dollars in capital expenditures. Instead, it signals the beginning of what we might call the “commodification of complements”: a shift where, as the price of LLMs (the complementary good) drops, usage rises and demand for the primary good they run on, cloud infrastructure, climbs along with it.
Let’s dig deeper into what this means and why it could reshape the landscape of AI… and perhaps even your portfolio.
1. DeepSeek’s R1 Shows There Are No Moats in LLMs
Generative AI once seemed to be the exclusive domain of big-budget players like OpenAI and Google. These companies poured billions into developing cutting-edge models, creating a perception of insurmountable barriers to entry. But DeepSeek just flipped the script.
The R1 model boasts impressive reasoning capabilities while maintaining dramatically lower pricing. It proves that innovation in AI isn’t about spending the most—it’s about finding efficiencies that redefine the cost-performance equation. R1’s release raises the question: are massive expenditures on proprietary models even necessary anymore?
The democratization of generative AI is here. And it’s going to rattle the cages of the incumbents who thought they had cornered the market.
2. How Falling LLM Prices Will Drive Cloud Infrastructure Demand
Here’s where it gets interesting. While falling LLM prices might seem like bad news for cloud vendors, the opposite is true. As LLM costs plummet, demand for cloud infrastructure will likely soar.
Why? Because generative AI workloads don’t operate in isolation. Businesses adopting LLMs rely on scalable cloud platforms to run them efficiently. When the cost of deploying these models drops, more companies—from startups to enterprises—will jump on the AI bandwagon. This, in turn, increases the demand for the “primary good” of public cloud vendors: infrastructure.
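To make that mechanism concrete, here’s a rough back-of-envelope sketch. Every number below is a hypothetical assumption chosen for illustration (per-query cost, adoption multiplier, query volume), not a DeepSeek or cloud-vendor figure. The point it illustrates: even when the cost of serving an LLM query falls by 90%, total infrastructure spend can still rise if usage grows faster than prices fall.

```python
# Back-of-envelope illustration of the complements argument.
# All numbers are hypothetical assumptions, not reported figures.

old_cost_per_query = 0.010   # assumed cost to serve one LLM query before the price drop ($)
new_cost_per_query = 0.001   # assumed cost after a ~90% price drop ($)

old_queries = 1_000_000_000  # assumed query volume at the old price
adoption_multiplier = 20     # assumed growth in usage once queries get ~10x cheaper
new_queries = old_queries * adoption_multiplier

old_spend = old_cost_per_query * old_queries
new_spend = new_cost_per_query * new_queries

print(f"Spend before: ${old_spend:,.0f}")        # $10,000,000
print(f"Spend after:  ${new_spend:,.0f}")        # $20,000,000
print(f"Change: {new_spend / old_spend:.1f}x")   # 2.0x: cheaper per query, bigger total bill
```

The specific figures don’t matter; what matters is that cloud vendors get paid on total workload, not on the per-query price of the models running on top of their infrastructure.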
For Amazon ($AMZN), Microsoft ($MSFT), and Google ($GOOG), this is an enormous opportunity. Their massive datacenters and cloud ecosystems stand ready to support the flood of businesses eager to leverage AI for everything from customer service to content generation.
3. A Capital Expenditure Conundrum
The R1 launch raises a provocative question for Big Tech: have they been overspending on generative AI?
Public cloud vendors have been laser-focused on AI-driven capital expenditures, plowing billions into model training and inference capabilities. But DeepSeek’s efficient model shows there might be alternative paths that don’t require such eye-watering costs. If companies like Microsoft and Google can adopt similar cost-reduction techniques, we could see a shift in how these giants allocate their resources.
In the short term, expect capital expenditures to remain high as vendors scale infrastructure to meet AI demand. However, in the medium to long term, the techniques pioneered by DeepSeek could significantly lower training and inference costs. The result? A win-win scenario for both the companies investing in AI and the businesses leveraging it.
4. R1: The “Commodification of Complements” in Action
DeepSeek’s R1 isn’t just an innovation—it’s a statement about the future of the industry. The dramatic drop in LLM pricing aligns with a broader economic principle: when the price of a complementary good falls, demand for the primary good increases.
Let’s bring this home. If LLMs are the complementary good, cloud infrastructure is the primary good. The more affordable LLMs become, the more companies will rely on cloud infrastructure to deploy and scale these models. In this scenario, public cloud vendors like Amazon, Microsoft, and Google emerge as the long-term winners.
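For readers who like the textbook version, this is simply the sign condition on cross-price elasticity for complementary goods, written out below with Q standing for cloud infrastructure demand and P for LLM pricing (an illustrative framing of the argument, not a forecast):

```latex
% Cross-price elasticity of demand; complements carry a negative sign.
% Q_{cloud}: quantity of cloud infrastructure demanded, P_{LLM}: price of LLM usage.
E_{\text{cloud},\text{LLM}} \;=\; \frac{\%\,\Delta Q_{\text{cloud}}}{\%\,\Delta P_{\text{LLM}}} \;<\; 0
\qquad\Longrightarrow\qquad
\text{if } P_{\text{LLM}} \text{ falls, } Q_{\text{cloud}} \text{ rises.}
```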
5. Who Stands to Gain? Follow the Data Centers
If you’re wondering where to look for investment opportunities, consider this: public cloud vendors with sprawling datacenter networks are poised to reap the rewards of lower LLM prices. Here’s a quick breakdown: