Has Amazon revealed its true business?

Last week something happened that I had been waiting on for a long time. Andy Jassy, Amazon’s CEO, released concrete figures, for the first time in the company’s history, about how much money it makes from artificial intelligence. And that number honestly surprised me.

But first a bit of context, because that’s key to understanding why this is important news.

Why this is a historic moment

Amazon Web Services, Amazon’s cloud division, has existed since 2006. For nearly two decades AWS has been the company’s “quiet engine”: less flashy than Prime Video or Alexa, but generating the vast majority of the operating profit that funded everything else. Investors knew it, analysts knew it, but Amazon never broke the numbers down precisely enough to show exactly where its future lies.

Now Jassy did it for the first time. And the message is clear: Amazon is not a retail company. Amazon is increasingly an infrastructure company for the digital era, and AI is the new chapter of that story.

15 billion from AI, 20 billion from chips

Let’s start with a factual overview of what Jassy actually disclosed.

AWS currently generates annualized revenue exceeding $142 billion. Of that, roughly $15 billion comes directly from AI services: companies and developers using AWS to train models, run inference workloads, and build AI applications. That is about 10% of the overall AWS “pie,” and the number is growing quickly. Jassy himself admitted that the cloud business would grow even faster were it not for the capacity constraints troubling the entire tech industry today.

But even more interesting than AI revenues is the second number Jassy revealed.

Amazon’s own chip business doubled last quarter, from $10 billion to more than $20 billion in annualized revenue, and it is growing at a triple‑digit year‑over‑year rate. For comparison: Nvidia, whose GPUs have become the symbol of the whole AI revolution, reports chip revenues of roughly $115 billion per year. Amazon is still a fraction of that, but the trajectory is dizzying.

Then came a sentence that immediately caught my attention.

A strategic move that changes the game

In his letter to shareholders Jassy hinted at something Amazon has never done before: the possibility of directly selling its own chips to external companies. “Demand for our chips is so high that it is very possible we will sell entire racks to third parties in the future,” said the Amazon chief.

To be clear what this means for investors: Amazon has so far used its chips exclusively internally, either in AWS for its own needs or as an option for AWS customers. Opening sales to external players would mean entering a market segment today dominated by Nvidia and partly by AMD.

$GOOGL is already testing a similar strategy outside its own cloud: last October it struck a deal to supply Anthropic with millions of its own AI chips. For Amazon this would be an analogous step, only on an even larger scale.

Jassy illustrated it with a thought experiment: if the chip business were a standalone company and sold chips like other leading chipmakers, annual revenues would be roughly $50 billion. That’s a number that demands attention from anyone who holds or is considering Amazon in their portfolio.
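To make the thought experiment concrete, here is a minimal back‑of‑envelope sketch. The 2.5x multiple is not something Jassy disclosed; it simply falls out of the two numbers quoted above.

```python
# Back-of-envelope check of Jassy's thought experiment.
# Both inputs are the figures quoted above; the multiple is just their ratio.
internal_run_rate = 20e9    # annualized chip revenue at internal/AWS pricing
standalone_revenue = 50e9   # hypothetical revenue if sold like other chipmakers

implied_multiple = standalone_revenue / internal_run_rate
print(f"Implied external pricing multiple: {implied_multiple:.1f}x")  # 2.5x
```

In other words, Jassy’s scenario assumes the same chips would fetch roughly two and a half times their internal pricing on the open market.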

Trainium vs. $NVDA: A price war that’s just heating up

To understand why Amazon’s chip division is strategically important, you need to grasp the core problem of the entire industry.

Nvidia’s GPUs are expensive. Training large language models costs hundreds of thousands to millions of dollars, with a large portion of those costs going to hardware. Anyone who can offer an alternative with a better price‑to‑performance ratio enters the game with enormous leverage.

Amazon claims its Trainium chips deliver a 30–40% better price‑to‑performance ratio than the competing hardware available on AWS, and Trainium 2 costs approximately 40% less than comparable GPUs from Nvidia. These figures are not just a marketing slogan: they are backed up by firms like Anthropic, Databricks, and Deutsche Telekom, which are among the companies testing the chips.
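A quick sketch of what those claims mean in dollars, because “better price‑to‑performance” and “lower price” are not the same number. The $10 million baseline below is a purely hypothetical training budget, not a figure from the article.

```python
# Translate a price-to-performance advantage into cost savings for the
# same amount of training work. All inputs are illustrative assumptions.

def cost_for_same_work(price_perf_advantage: float) -> float:
    """If perf-per-dollar is (1 + advantage)x the baseline, the same
    workload costs 1 / (1 + advantage) of the baseline price."""
    return 1 / (1 + price_perf_advantage)

# "30-40% better price-to-performance" => roughly 23-29% lower cost
low_saving = 1 - cost_for_same_work(0.30)   # ~0.23
high_saving = 1 - cost_for_same_work(0.40)  # ~0.29

baseline_run = 10_000_000  # hypothetical $10M GPU training budget
print(f"Savings on a $10M run: "
      f"${baseline_run * low_saving:,.0f} - ${baseline_run * high_saving:,.0f}")
```

The point of the sketch: a 40% price‑to‑performance edge translates to a bill that is roughly 29% smaller for the same work, while the 40% discount on Trainium 2 versus comparable Nvidia GPUs is a separate, sticker‑price claim.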

Jassy himself stated in the shareholder letter that at full deployment Trainium could save Amazon tens of billions of dollars in capital expenditures annually and add several percentage points of operating margin compared to dependence on external chip suppliers. That is a concrete, measurable number.

$200 billion into capex: Fear, or opportunity?

When Amazon announced earlier this year that it plans to pour roughly $200 billion into capital expenditures this year, focused primarily on AI infrastructure, the market reaction was mixed. Some investors were horrified: that is a brutal number even by the standards of the world’s largest tech companies.

Jassy counters this skepticism with concrete arguments. A substantial portion of the planned AWS capacity that is meant to be monetized in 2027 and 2028 is already covered by customer commitments, among them a deal with OpenAI exceeding $100 billion. In other words: Amazon is not building capacity on forecasts, but on real contracts with clients.

The market logic is simple: demand for compute capacity for AI exceeds supply, and by a lot. Two large AWS customers reportedly asked to buy all available Graviton processor capacity for the entire year 2026. Amazon had to refuse because other customers would have been left without service.

A problem like that can be solved in only one way: build more.

Vojta’s thoughts

Amazon is one of my largest positions, and this disclosure only confirmed why I added it to my portfolio.

There is no “if” here: the numbers are real, the customer commitments are real, and the dynamics of the chip business honestly surprised me more than the AI revenues themselves. Doubling from $10 billion to $20 billion in a single quarter, with triple‑digit year‑over‑year growth, is not something you see every day in a business of this size.

What interested me most, however, is the strategic move to sell chips to external companies. Amazon has so far presented itself as a cloud platform. If it truly opens chip sales outside AWS, it will enter direct competition with Nvidia and do so with a 40% price advantage. This is not a small step.

Of course there are risks. $200 billion in capex is a huge bet that demand for AI infrastructure will continue. If the AI bubble bursts or it turns out AI revenues are insufficient to cover the costs of building infrastructure, Amazon will feel it. But with the customer commitments Jassy describes, I believe the risk is well calibrated.

I am bullish on $AMZN, and these numbers strengthen that conviction.

Amazon is transforming from an e‑commerce and cloud company into an AI infrastructure powerhouse. The combination of proprietary chips, a gigantic cloud platform, and customer commitments worth hundreds of billions of dollars creates a moat that will be hard for competitors to overcome over the next several years.

Do you believe Amazon can realistically threaten Nvidia’s dominance in the AI chip market, or will it remain exclusively an internal AWS tool? Where do you see the biggest risk in this $200 billion bet?

Do you trust $AMZN and hold their shares in your portfolio?


