As soon as online photo storage startup SmugMug Inc. heard about Amazon.com Inc.’s Simple Storage Service, an online data storage repository that debuted on March 14, 2006, “my eyes got all big,” co-founder and Chief Executive Don MacAskill said at the time.
Amazon’s S3, the pioneering service for what soon became known as cloud computing, quickly saved four-year-old SmugMug a half-million dollars a year since it didn’t have to buy its own storage hardware for backups. “Everything we can get Amazon to do, we will get Amazon to do,” Chris MacAskill, Don’s father and company co-founder, told me for a BusinessWeek cover story, the first major publication to introduce what soon was called Amazon Web Services.
Today, it’s the same story at SmugMug, only now with 1 million active users along with 100 million-plus accounts on the Flickr photo-sharing site it acquired in 2018. AWS remains SmugMug’s sole cloud provider, and it now uses dozens of its other services, including databases, compute and artificial intelligence agents that now help create all its software. “From a cloud perspective, we’re 100% AWS,” Don MacAskill told me this week. “I’m betting my entire company on AWS.”
Twenty years later, so are a lot of other companies. Founded on the idea of offering Amazon’s spare infrastructure capacity to other businesses, AWS is home to at least 4 million customers who spent almost $130 billion on its more than 240 services last year, spanning storage, compute, networking, databases, AI and machine learning, and more. It has long since made believers of initial skeptics, from analysts to people inside Amazon to fellow BusinessWeek editors – who insisted over my objections on headlining the article “Amazon’s Risky Bet” even though the “risk” of its using excess infrastructure was small.
In the process, AWS has defined an entirely new way to do computing that continues to sweep across the corporate landscape. Without AWS, there would be no Netflix, no Uber, no AI chatbots. As I wrote at the time, “This is nothing less than a bid to lead the next wave of the internet.”
Existential challenge
As AWS celebrates its 20th anniversary of doing just that, however, it’s struggling to demonstrate that it can lead an even bigger leap to the next wave of computing – one centered on artificial intelligence. And it has to do that while protecting its flank. It’s still leading in cloud, but rivals such as Microsoft Corp. and Google LLC are now closer behind. Even though AWS’ cloud revenue growth has accelerated in recent quarters, Microsoft Azure grew its revenue by 39% in its most recent quarter, while Google Cloud’s sales leaped 48%, more than double AWS’ growth, though on a much smaller base.
Even more challenging, Amazon was caught flat-footed by the sudden explosion of generative AI chatbots such as ChatGPT and Claude that can summarize reams of text in an instant, create detailed images from text prompts, and write full-fledged software applications in a flash. Amazon is straining to catch up not only to pure-play AI software companies such as OpenAI Group PBC and Anthropic PBC but also, on some fronts, to cloud rivals such as Google and Microsoft.
Amazon was unable to turn the early buzz for its Alexa AI assistant into something more. Its developer- and business-focused Amazon Q chatbot was seen as underwhelming. And so far its AI models have not caught on anything like OpenAI’s, Anthropic’s or Google’s. Its Nova models didn’t register on Menlo Ventures’ mid-2025 survey of foundation model use, in which models from Anthropic, OpenAI, Google and Meta Platforms Inc. were the only four used significantly by enterprises.
It’s a position that presents a bit of an existential challenge for Amazon and its culture. “There are lots of examples of close following being an excellent strategy,” Amazon founder and then-Chief Executive Jeff Bezos told me 20 years ago. “It just happens not to be us. In a fast-moving area like the internet, that kind of close following doesn’t work as well.” But with AI moving even faster than the internet did, it’s having to do precisely that on some fronts.
AWS understandably doesn’t admit to being behind in AI. Indeed, the perception of its lagging position versus flashier rivals lags reality in some cases. It can claim to have been deep into AI and machine learning for many years, with a credible array of cloud services, widespread usage inside the company, and increasing uptake by enterprise customers. Even on the consumer front, its Rufus shopping assistant on its site, despite poor early reviews, has been used by 250 million shoppers and should bring in $10 billion in additional annual sales, according to Amazon CEO Andy Jassy, formerly AWS’ founding CEO.
But Amazon is having to spend big just to stay in the game – even more, in fact, than its big-spending rivals. In the fourth quarter, Amazon said capital spending will jump to $200 billion this year, up from $131 billion a year ago, with roughly three-quarters of that spend going to AWS. That’s higher than Google parent Alphabet Inc., which plans $175 billion to $185 billion in capital spending, and Meta Platforms Inc., which guided between $115 billion and $135 billion. Amazon’s spending plans, coupled with investor uncertainty over when they will pay off, knocked its stock down 6% on Feb. 6 – and during one nine-day stretch last month, it lost $450 billion in market value.
AWS CEO Matt Garman: “Agentic AI has the potential to be the next multibillion-[dollar] business for AWS.” Photo: Cisco/livestream
Besides upping capital spending, current AWS CEO Matt Garman has also sought to reorient the organization to capitalize on the opportunity. In December, AWS combined its AI model, chip and quantum computing research teams into a single unit under longtime Amazon infrastructure exec Peter DeSantis as Rohit Prasad, head scientist of Artificial General Intelligence, departed. That followed another AI-related change early last year, when a new unit focused on AI agents was created, headed by veteran Amazon AI exec Swami Sivasubramanian.
The agentic AI mulligan
Most of all, AWS is racing to lead what looks to be the next big shift not only in AI but in computing and, arguably, the way we work: AI agents. Going beyond chatbots’ simple responses to queries, agents perform work on a user’s behalf, often working with other agents to get a chain of tasks done, such as booking an entire vacation or, say, when a customer places an order on a website, determining required distribution capacity, configuring the layout of products, and arranging picking, packing and shipping.
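The order-fulfillment example above can be sketched as a chain of cooperating agents, each handing its result to the next. This is a minimal illustrative sketch, not any AWS API: the class and function names are invented for the example, and a real system would back each step with an LLM plus tools on a platform such as Bedrock AgentCore.

```python
# Hypothetical sketch of an agent chain for the order example above.
# All names are illustrative; real agents would call models and tools.
from dataclasses import dataclass, field


@dataclass
class Order:
    sku: str
    quantity: int
    log: list = field(default_factory=list)  # record of each agent's action


def plan_capacity(order: Order) -> Order:
    # Agent 1: determine required distribution capacity.
    order.log.append(f"reserved capacity for {order.quantity} units")
    return order


def configure_layout(order: Order) -> Order:
    # Agent 2: decide where the product sits relative to the packing line.
    order.log.append(f"slotted {order.sku} near the packing line")
    return order


def arrange_fulfillment(order: Order) -> Order:
    # Agent 3: schedule picking, packing and shipping.
    order.log.append("scheduled pick, pack and ship")
    return order


def run_agent_chain(order: Order) -> Order:
    # Each agent's output becomes the next agent's input.
    for agent in (plan_capacity, configure_layout, arrange_fulfillment):
        order = agent(order)
    return order


result = run_agent_chain(Order(sku="B000123", quantity=40))
print(result.log)
```

The point of the sketch is the shape of the workflow: one task decomposed into steps that independent agents complete in sequence, with a shared record of what was done.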
Agentic AI has become all the rage thanks to the potential for agents to become a much more user-friendly interface to mainstream software applications. Taken to its logical conclusion, agentic AI could subsume decades-old enterprise applications such as customer relationship management, enterprise resource planning and human resources management made by the likes of Salesforce Inc., SAP SE, Oracle Corp., Workday Inc. and even Microsoft. “The next 80% to 90% of enterprise AI value will come from agents,” Garman told John Furrier in December.
AWS’ ability to stay on top of the cloud, and define the next era of computing, depends on whether it can seize the opportunity in agentic AI – and here, the AI universe may have granted it a mulligan. If it missed on mainstream chatbots and AI foundation models, agentic AI requires more than just the biggest or fastest model. It also requires the vast array of cloud services that a provider like AWS offers. That’s why Garman believes it’s the next big opportunity for the company. “Agentic AI has the potential to be the next multibillion-[dollar] business for AWS,” he said last year.
But only if AWS can stay atop its perch in cloud computing, which of course is where AI agents live. All things considered, its position seems solid. Its most recent quarter may not have impressed investors, but AWS’ 24% revenue growth was the fourth quarter in a row of accelerating growth – driven by AI spending, said Jassy. “The development of AI is quite clearly massively accelerating companies’ shift to the cloud,” Pivotal Research Group analyst Jeff Wlodarczak wrote in a note to clients. He expects AWS to grow from 17% of Amazon revenue today to 30% or more in five years.
And AWS, which accounts for 60% of Amazon’s overall profits, still lives at the cutting edge of cloud computing, at least when it comes to services that can scale up to massive size. That was apparent at its re:Invent conference in December, when it announced its usual array of new services – so many that Garman had to zip through 25 of them in the last 10 minutes of his keynote. It’s a still-unmatched breadth of cloud services.
Full-stack approach
Indeed, Garman views AWS’ full-computing-stack approach as a key differentiator for its AI ambitions versus its cloud rivals – though Google would quibble with that. In any case, it can offer customers everything they need for their AI factories, from chips to storage and compute to access to its own and rival AI models, along with services to create and deploy agents. “We’re really well-positioned to do any AI workloads,” Rahul Kulkarni, general manager of product management for AWS compute and AI infrastructure, told me. “We have fine-grained control of the entire stack.”
One pillar of that pitch is its Graviton and Trainium chips that power a range of cloud compute instances. For example, Anthropic’s Claude is powered by AWS’ Project Rainier, a massive cluster of 500,000 Trainium2 chips. Nvidia Corp.’s graphics processing units are and will remain dominant for training and running AI models by virtue of their performance and mature software stack, but most customers don’t care what chip is doing the AI. AWS, which pragmatically pitches its cloud as the best place to run Nvidia GPUs too, claims its chips do it a lot more cheaply than Nvidia’s – enabling it to offer lower prices for that computing power.
Perhaps most important of all to enterprises looking to leverage AI is a more flexible data platform that can handle the intensive real-time needs of AI, including swarms of agents customized to their workflows. That’s a moving target for the entire industry, as new data architectures and structures constantly emerge to deal with the new and unexpected demands of AI.
But AWS has one advantage from those 20 years of providing storage, databases, big-data services and more: Enterprises already have a lot of their data on AWS. They keep extracting more and more insight from that data, but they often have to use a wide variety of external tools that require slow and expensive transport of the data to the tools. “Data migration is very hard,” said Mai-Lan Tomsen Bukovec, AWS’ vice president of technology for data and analytics. Companies thus have an incentive to store as much of their data as possible in Amazon S3. “And that’s why customers have been so quick to move to AI,” she added.
Even more important for the AI era, that data is critical both to training new models and, now, to giving agents the context they need to do what humans want them to do – and keeping them from going rogue with actions such as shipping potentially buggy code or leaking company data. “Every AI application is a data application,” Bukovec told Furrier last year, and that’s even more true for agents.
AWS’ Swami Sivasubramanian: “A single model is not going to rule the world.” Photo: Robert Hof/SiliconANGLE
All of that forms the foundation for making agents the next big thing for AWS. On the agent front at re:Invent, AWS emphasized Amazon Bedrock AgentCore, a platform introduced last year to build, deploy and operate effective agents securely at large scale. Nova Forge debuted as a way to customize AWS’ Nova AI models with an enterprise’s own data for its unique tasks and workflows. And Kiro, its AI-powered coding agent, can direct agents to perform long-running tasks and collaborate as a swarm.
The aim, said Sivasubramanian, who’s now vice president of AWS agentic AI, is similar to that of his first series of projects at AWS starting 21 years ago, including the DynamoDB database and the CloudFront content delivery network: They wanted to compress the time from forming an idea to releasing a product from weeks or months down to minutes. “We’re in a very similar moment in agentic AI,” he told me.
Industrializing AI
Taken together, all those capabilities give AWS at least a fighting chance to lead the new era of agentic co-workers – and get out in front instead of following the other AI leaders. Bringing all this together for customers in a coherent way will be up to Garman, one of the keepers of the AWS flame since he started as a summer intern before becoming a product manager on a three-person AWS worldwide sales team in 2006. As CEO, Garman hasn’t yet displayed the excitement and even edge of Jassy, but at re:Invent he received some kudos for laying out a pragmatic vision for “industrializing” AI and agents for the enterprise.
That pragmatism was apparent in AWS’ recent deal with OpenAI: In late February, it inked a wide-ranging agreement under which Amazon will provide up to $50 billion in funding in return for a $100 billion commitment from OpenAI to use its infrastructure over the next eight years – adding to an existing commitment, made last November, to spend $38 billion in coming years.
Moreover, OpenAI gave AWS exclusive rights, outside OpenAI itself, to distribute its Frontier AI agent management tool. And OpenAI will use AWS Trainium chips to power a new offering called the Stateful Runtime Environment, running on AWS’ Amazon Bedrock managed AI service. It’s intended to power AI agents that can perform multistep tasks. In addition, Amazon can offer custom versions of OpenAI models directly to customers – essentially cracking open Microsoft’s lock on OpenAI.
Along with an existing deal in which it invested $8 billion in Anthropic, which is training and hosting Claude on AWS infrastructure, Amazon now looks to be in a much better position to shape the future of AI, at least among enterprises.
Worker-bee AGI
But it’s doing so with a different approach. Amazon isn’t chasing after what OpenAI, Anthropic, Google and others are doing: constantly releasing a wide variety of foundation models with an ultimate goal of creating artificial general intelligence, or AGI – AI that’s as smart and capable as humans. Instead, it looks to be more focused on providing customers with whatever tools they want, whether from Amazon or from other companies.
Every tech company that considers itself a platform likes to say it’s open, enabling customers to choose to use outside technologies on the platform. Sometimes that’s an admission that customers have made other choices they aren’t going to change, such as betting on OpenAI or Anthropic. Whatever the reason, in AWS’ case, it aims to make its platform the best place to run whatever AI models customers want, Amazon’s or any others. “We realized a single model is not going to rule the world,” said Sivasubramanian. “Model choice is paramount. The majority of our customers are [using] more than one model.”
That extends to agents as well. Bedrock AgentCore, for instance, can help customers create and run agents using any model, protocol or tool. “So many enterprises have already standardized on it because it gets them the freedom they want, but also helps them move from proof-of-concept to production faster,” he said.
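The model-agnostic design described above can be sketched with a small interface: the agent code depends only on a “model” contract, so any provider’s model can be swapped in without changing the agent. This is an illustrative sketch under stated assumptions, not the AgentCore API; the class and method names here are hypothetical.

```python
# Hypothetical sketch of a model-agnostic agent: the agent depends on a
# small interface, so any model back end can be plugged in unchanged.
from typing import Protocol


class Model(Protocol):
    """Minimal contract any model provider must satisfy (illustrative)."""
    def complete(self, prompt: str) -> str: ...


class EchoModel:
    # Stand-in for a hosted model (e.g. one served through a cloud API);
    # it just tags the prompt so we can see which back end responded.
    def __init__(self, name: str):
        self.name = name

    def complete(self, prompt: str) -> str:
        return f"[{self.name}] {prompt}"


class Agent:
    def __init__(self, model: Model):
        self.model = model  # any object satisfying the Model contract

    def run(self, task: str) -> str:
        return self.model.complete(f"Plan and execute: {task}")


# The same agent runs unchanged against different model back ends.
for backend in (EchoModel("model-a"), EchoModel("model-b")):
    print(Agent(backend).run("summarize quarterly sales"))
```

The design choice this illustrates is the one Sivasubramanian describes: keeping model choice open by coding agents against an interface rather than against a single vendor’s model.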
AWS’ Colleen Aubrey: “We’re centered on putting agentic teammates into the hands of customers.” Photo: Amazon
Another recent example of this pragmatic approach reveals how it’s targeting specific use cases with AI and agents: On March 5, it released Amazon Connect Health, with AI agents to reduce administrative burdens in healthcare. It seems likely to introduce more industry-specific agentic systems. “We’ve been in the jack-of-all-trades AI era for a little while,” Colleen Aubrey, senior vice president of applied AI solutions, told Furrier recently. “But it’s not [like] having an expert that you can work with that brings deep understanding of an area… and that gets better every day.”
Dave Vellante and George Gilbert at SiliconANGLE’s sister market research firm theCUBE Research call it “worker-bee AGI” – in contrast to the “messiah AGI” championed by the likes of OpenAI and Anthropic. “The differentiation will not come from shiny agent UIs or narrow features,” they wrote after re:Invent. “It will come from the depth of integration with data systems, workflow systems and the governance and control planes that bind agents to enterprise policy and process. That’s why this era will be defined not by the number of agents a vendor can showcase, but by the scaffolding that underpins them. And that scaffolding is where the real enterprise value will accrue in our view.”
Aubrey says AWS aims to extend that approach, via its Connect cloud contact center solution, to other industries ripe for agentic expertise. She told me this week that AWS soon plans to launch other agent-based Connect solutions that Amazon developed for internal use, for hiring and human resources and for supply chain work such as supply and demand planning. “We’re centered on putting agentic teammates into the hands of customers… not for individuals but whole teams and companies,” she said.
That will require a lot of cooperation with other companies, even competitors. One of AWS’ earliest partners, IBM Corp., still has its own cloud and other offerings that go up against AWS, not to mention selling computers for corporate data centers. But in the past five years, it has quickly built up joint offerings to enterprises.
IBM now has nearly 100 software-as-a-service applications available on AWS Marketplace, a curated catalog of digital services, and routinely works with AWS on helping enterprises move to the cloud. For instance, during the maelstrom of the Covid pandemic, IBM’s consulting organization worked with AWS to help Delta Airlines move its hundreds of widely distributed workloads and more than 500 applications to a hybrid cloud architecture using its Red Hat OpenShift Service on AWS, or ROSA, based partly on AWS’ EC2 compute service.
The AWS-IBM partnership is now extending to agents, according to Nick Otto, IBM’s global head of strategic partners. In December, the companies announced expanded joint programs to help enterprises get their agentic AI applications into wide-scale production. For example, they’re integrating IBM’s watsonx Orchestrate agent platform and Amazon’s Bedrock AgentCore to automate business processes faster and more securely. “We now look at IBM, AWS and Red Hat as the alliance around enterprise AI, hybrid cloud and open source,” Otto told me this week. “The level of trust with AWS is higher than it’s ever been.”
AWS admits agentic AI won’t be an instant win. “The reality is it’s not going to happen as fast as people want it to,” Jeff Hammond, a former Forrester analyst who is AWS’ head of independent software vendor product management transformation, told me a few months ago. Then again, it doesn’t need to be an instant win. Indeed, AI, and agents in particular, are one of those multiyear bets that Jeff Bezos has always liked best – and that Amazon has repeatedly demonstrated can pay off in the long run.
Ultimately the goal is not just a new computing architecture but a new way of working. “In 2006, AWS changed the world by abstracting servers and letting developers build without friction. Today’s shift is analogous — only larger,” Furrier wrote. “The cloud era abstracted infrastructure. The agent era abstracts work.” It likely won’t take another 20 years to determine whether AWS has succeeded at that.