Amazon’s AWS shows signs of weakness as competitors charge ahead

🕒︎ 2025-11-02

Copyright The Philadelphia Inquirer

It’s no exaggeration to say Amazon.com Inc. invented the cloud business. Amazon Web Services took the corporate data center apart and split it into pieces, building pay-as-you-go services delivered with remarkable speed and consistency. The effort brushed aside incumbents, transformed an internal start-up into Amazon’s profit engine, and gave executives in Seattle the power to dictate terms to much of the industry.

Now, suddenly, AWS is struggling. On Oct. 20, Amazon’s cloud unit suffered one of the worst outages in its history, taking down its most important cluster of data centers and disrupting the operations of hundreds of companies and consumer apps. Trading platforms, digital curriculums for students, and online utility payments in Seattle, Amazon’s hometown, all went down. The event dragged on for 15 hours before AWS managed to get all of its services back online.

Then on Oct. 23, confirming a Bloomberg report, Alphabet Inc.’s Google said it will supply up to 1 million of its specialized AI chips to Anthropic PBC. The deal deepens Google’s partnership with the fast-growing artificial intelligence start-up and represents a blow to Amazon, which has invested billions in Anthropic.

Three years into an AI boom spawned by OpenAI’s revolutionary ChatGPT, AWS is widely perceived as trailing its tech peers in AI. While AWS remains the market leader, Microsoft Corp. is now growing its backlog of corporate sales faster than Amazon, a reversal of a trend that until recently favored AWS. Last year, according to a Gartner Inc. estimate, Amazon’s cloud division captured 38% of corporate spending on cloud infrastructure services. That sounds hefty until you consider that the division held almost half of that market as recently as 2018, according to the firm.
To understand what ails AWS, Bloomberg interviewed cloud computing and financial analysts, businesses that use or resell Amazon’s cloud, and 23 current and former AWS employees spread across engineering, product management, marketing, sales, and support. They describe internal bureaucracy that has slowed AWS down at a time when it needs to be nimble, a lackluster start to the company’s AI efforts, and an operation that has become less attractive to start-ups. Even amid a rapidly changing marketplace, these people said, AWS remains wedded to its longtime playbook. They acknowledge that AWS retains considerable strengths and customer loyalty, but they worry it’s losing its cutting edge and chasing rivals it once blindsided.

Financial results posted late Thursday showed higher fiscal third-quarter profit and sales compared with a year ago, on strong demand for cloud services, easing the pressure from softer growth at Amazon’s e-commerce business. AWS reported a 20% rise in revenue for the quarter ended in September, ahead of analyst estimates of a 17.95% increase.

It doesn’t help that the cloud market has become more fiercely contested in the past couple of years. Oracle Corp., once dismissed as a cloud industry also-ran, has been booking multibillion-dollar deals to host cutting-edge AI development work. Google has become a much more serious threat. And a swarm of newcomers with little track record running massive data centers have ambitious plans to rent their infrastructure to AI companies.

Dave McCarthy, who advises corporate technology buyers at IDC, said clients once mostly wanted to understand the difference between Amazon’s and Microsoft’s cloud offerings. These days, they’re just as likely to ask about Google, Oracle, or an upstart like CoreWeave Inc. “There’s more choice out there,” McCarthy said. “It doesn’t bode well for Amazon.
It’s creating some new competitive pressure that they didn’t have before.”

As it seeks to retake the initiative, AWS has reorganized engineering and sales teams, swapped CEOs and marketing chiefs, and thrown some of its own product development rules out the window in the name of speeding products to market. At the same time, employees said, the company has tried to streamline the bureaucracy that took hold after AWS went on a pandemic-era hiring spree. Last month, AWS released an updated version of its main workplace AI tool, now called Quick Suite. In December, the company is expected to launch a flurry of new and restyled AI services.

“AWS remains the leader in cloud by a wide margin and we’re excited by the tremendous customer response we’re seeing to our AI services like Amazon Bedrock, SageMaker and Kiro, as well as the unique price and performance benefits they’re enjoying from our Trainium2 chips,” Amazon spokesperson Selena Shen said in an emailed statement. In the past few months, she said, Amazon has signed significant deals with a wide array of customers, including Delta Air Lines, Volkswagen, the U.S. General Services Administration, and State Farm. “If you look at any list of the world’s most innovative or fastest growing start-ups, you’ll find the vast majority are running significant workloads on AWS,” Shen added.

On Nov. 30, 2022 — the day ChatGPT debuted — AWS brass were in Las Vegas for their annual re:Invent conference. Adam Selipsky, then the cloud division’s CEO, gave a two-hour talk that amounted to an infomercial for AWS services and an exhortation to get with the times and start using the cloud. “With AWS, you don’t have to worry about having too much or too little capacity,” he said. AI barely merited a mention. That afternoon, OpenAI’s Sam Altman announced the release of ChatGPT — instantly turning the industry upside down.
Amazon, which has used algorithms to automate much of its operations, was hardly unaware that a new AI revolution was brewing. A few years earlier, AWS engineers had strung together about 6,000 Nvidia Corp. graphics processing units, creating an AI supercomputer before anyone would have called it that. At first, it went mostly unused. Even corporate giants didn’t need that kind of horsepower, and executives began worrying that the initiative was a wasteful research experiment, according to a person familiar with the matter.

Then a tiny start-up called Anthropic, founded by former OpenAI personnel, began using the cluster of chips. The start-up, already taking the first steps toward creating what would become known as generative artificial intelligence, chewed through whatever computing power AWS could throw at it. Even then, it was clear Anthropic was working on a potentially game-changing technology.

The obvious move was to invest in the start-up, and Amazon considered doing just that, according to the person, who requested anonymity to discuss confidential information. Anthropic was spending most of its cash on Amazon servers. If the start-up ever took off, so would its spending on AWS. But Amazon executives weren’t convinced that Anthropic would find a way to monetize the emerging technology, the person said, and didn’t pursue the idea. The resistance was partly cultural: AWS has historically been reluctant to pay for access to technology it believed could be readily developed in-house, two people familiar with the strategy said.

Anthropic, in its search for ever more computing power, subsequently started using Google’s cloud in addition to Amazon servers. When the start-up raised money from investors in early 2023, Google was among them. Amazon invested in Anthropic that September, making the first of two planned $4 billion infusions.
The deals committed Anthropic to using AWS computing power and in-house chips and to offering its Claude models to Amazon customers. The size of the check shocked Amazon veterans who knew the company loathed paying tech industry prices. To some, it looked like desperation.

Amazon has long prided itself on operating like a start-up, structured as a loose collection of independent teams that compete in a kind of Darwinian bake-off. That worked well when the goal was creating streamlined databases or durable file storage systems. It worked less well for deep research into models capable of displaying humanlike reasoning or generating video. When OpenAI debuted ChatGPT, science and engineering units nestled within AWS, Amazon’s retail organization, and the Alexa and devices group were all pursuing similar, sometimes duplicative work training their own AI models, current and former employees said.

Executives eventually tried to impose order. A few months before the Anthropic investment, they centralized most cutting-edge model development work under Rohit Prasad, who led the engineering organization that built the brains behind Alexa. Swami Sivasubramanian, the cloud unit’s longtime AI chief, was ordered to focus on ways to help businesses actually use AI tools.

The new sense of urgency was palpable at AWS’s re:Invent conference in November 2023. Suddenly artificial intelligence was very much on the agenda, with Selipsky and other executives mentioning AI almost 100 times in two hours. They also debuted software called Amazon Q that included a chatbot and a software coding assistant. The offering gave Amazon salespeople a counter to ChatGPT but didn’t introduce anything revolutionary to a market already awash with chatbots, according to analysts.

Seven months later, Selipsky was gone in what Amazon said was a planned transition. His replacement was Matt Garman, who had spent his career at AWS leading increasingly large engineering teams and, eventually, sales and marketing.
Colleagues describe Garman as a whip-smart, occasionally brusque sports fan with a dry sense of humor and an Amazonian distrust of conventional wisdom. For some of the company’s engineers, Garman brought a credibility that Selipsky never established, and he struck outside analysts as a better fit for the wartime conditions AWS found itself in.

On the surface, the AWS Garman inherited functions much as it did when he joined from business school in 2006. Product leaders still attend Wednesday meetings, where they’re expected to be brutally honest about missteps and lessons learned. Employees are required to present ideas in short, written pitches. Software developers work shifts handling support and incident calls for their own products in an effort to understand customer needs.

But AWS has slowed down, in part because management layers proliferated after a pandemic-era hiring binge. One sales engineer recalls that when he joined AWS before the pandemic, he was six managers down from Jeff Bezos, then Amazon’s CEO. He was later promoted. But earlier this year, he checked the org chart again to find he was 15 rungs from CEO Andy Jassy. (Amazon said that experience is an outlier. Jassy last year ordered a companywide push to reduce layers of management.)

In an increasingly bureaucratic culture, decision-making isn’t happening as quickly as it once did. Three AWS employees working on separate AI initiatives recall being asked to write and rewrite pitches for so long that the market moved on, rendering their ideas dated. When the AI revolution arrived, AWS was at war with bloat, cutting thousands of people in an effort to slash costs. Promotions and raises grew harder to secure, and some teams that previously had license to hire freely found themselves battling just to maintain their existing staffing, current and former employees said.
Amid an industrywide talent war for AI expertise, several senior AWS people left, including executives leading components of the AI push, start-up sales, chip design, and data center infrastructure. Shen, the spokesperson, said the company promotes those who “demonstrate performance at and readiness for the next level,” and that AWS’s culture, including its document-writing process, remains strong. AWS has recruited numerous new leaders since 2024, she said.

AWS employees have also raised the alarm that Amazon is losing its status as the default place for software start-ups to build their products, according to two people familiar with the company’s deliberations. AWS famously helped birth Netflix, one of the most successful start-ups of the past two decades. But in recent years, Amazon has at times prioritized big-spending corporations and wrestled with how much energy to devote to small companies that might never become long-term paying customers, employees said. Google, according to analysts and start-up advisers, has parlayed its name recognition with a generation of engineers and its chops in cutting-edge AI tools into relationships with many leading AI start-ups.

“If you are not winning the native AI start-up companies today that are going to be five, 10 times as large in next couple years, then that can be a real issue for the business,” said Josh Beck, an analyst with Raymond James. Shen said AWS was “the top choice for start-ups.”

In another era, Pete Schwab, who spent a decade working for Amazon, probably would have chosen AWS to host his early-stage start-up, Stronghold Labs, which is building AI tools to analyze troves of video. Instead, he opted to go with Google for its focus on smaller developers and the quality of its in-house AI models. AWS “used to do a much better job of attracting folks like us,” Schwab said. “That’s obviously an important currency for us, how well they support the little guys.
And Google’s just doing a better job of that.”

Jentic Technology Ltd., a start-up working to help companies plug AI tools into their existing software, has a different read. “We had a lot of credits from Amazon and from Google,” said Sean Blanchfield, Jentic’s CEO, referring to the in-kind services both companies offer to new start-up customers. “We ended up spending the Amazon credits. Easier to get going and to scale there,” he said. Blanchfield is hoping Jentic, among a cohort of promising start-ups Amazon flew to Seattle, can use AWS’s credibility with big businesses to sell its software.

Established companies that have invested heavily with a cloud provider like Amazon rarely uproot that infrastructure, partly because of the expense and time involved. But existing AWS customers are at least trying out the competition, especially in artificial intelligence. Grammarly Inc., the writing assistance outfit, has long run its business from AWS data centers, relying on Amazon to recommend revisions to documents and emails. For AI services, though, Grammarly looked elsewhere. Bedrock, the AWS model marketplace, doesn’t meet the company’s price and other needs, Mark Schaaf, Grammarly’s chief technology officer, said in an interview. Instead, the company taps into OpenAI models, including through Microsoft’s Azure cloud service, and uses Meta’s popular open-source Llama model. “Competition makes for better products and helps on cost for customers,” Schaaf said.

Druva, a data security and backup outfit that built its business primarily selling to AWS customers, earlier this year announced a partnership with Microsoft’s cloud to do the same work there. “We did see people adopting Azure in a larger degree,” Stephen Manley, Druva’s chief technology officer, said in an interview, adding that the company was open to partnerships with other cloud providers.

Employees describe a frenzied atmosphere as AWS gears up for its fall product launches.
One worker building AI tools said their team had set aside some of Amazon’s typical product development guidelines, skimping on documentation and regular reviews in the name of getting products out the door before rivals. In a recent all-hands meeting, Garman urged staffers to focus on launching the products they’ve already promised. (His remarks were previously reported by Reuters.)

Garman has also changed the approach to older products. Amazon’s cloud long operated with a determination to keep services running even if they weren’t hits, a practice that was part demonstration of loyalty to customers, part vote of confidence that engineers could build services capable of running themselves with minimal oversight. The practice began to fray as AWS’s tally of products exceeded 200, and big corporate customers complained that they needed help using the array of existing tools more than they needed new ones. So in the last two years, AWS has ended or stopped developing some three dozen services or major features. That, in turn, freed up engineers to join teams building AI tools.

AWS’s annual fusillade of new products began last month with Quick Suite, a chatbot and set of AI agents designed to analyze data, produce reports, or summarize web content. It replaces Q Business and renews Amazon’s pitch to sell software for use by office workers, an arena where the company hasn’t had much success. Executives privately recognize that while Google and Microsoft can show off their AI tools to a captive audience of billions of search customers or PC users, AWS’s natural reach is a much smaller community of developers.

For that audience, AWS continues to expand the capabilities of Bedrock, which lets businesses tap into AI models built by Amazon, Anthropic, and other companies.
It also builds on AWS’s strengths, offering streamlined access to AI models in the same way the company offered businesses an alternative to running their own data centers almost 20 years ago. Bedrock boasts tens of thousands of customers and is widely seen as the company’s most successful AI product. Sales teams have made adding to that tally a priority, customers and partners said.

Amazon could do well in the AI age simply by serving as the infrastructure provider for other companies and running that infrastructure cost-effectively. Most corporate AI work remains experimental, according to analysts. Should it gain wider traction, cost savings and durable infrastructure — two AWS selling points, the recent outage aside — will become more important, AWS executives have said.

Like its peers, Amazon is spending tens of billions of dollars building server farms around the globe. The company hopes to fill the facilities with Nvidia chips and its homebuilt Trainium2 AI semiconductor. In his pitch to investors, Jassy has touted Trainium2’s cost-effectiveness as a potential competitive edge. The chip handles inference and training, the two types of work that make AI models tick, some 30% to 40% more cheaply than rival hardware, the company has said.

In Indiana, Amazon is racing to finish an $11 billion data center complex built primarily for Anthropic, anchoring a cluster of hundreds of thousands of Trainium chips that AWS is stitching together for the start-up. If Anthropic can build cutting-edge models on the hardware, it would validate Amazon’s approach and could attract more customers looking to replicate its success. But with its recent deal, Anthropic now has greater access to Google’s chips, giving the start-up an alternative should AWS not meet its expectations and underscoring the reality that Amazon is now in a dogfight with rivals it once easily lapped.
