Copyright Reuters

Nov 6 (Reuters) - This was originally published in the Artificial Intelligencer newsletter, which is issued every Wednesday. Sign up here to learn about the latest breakthroughs in AI and tech.

Last week’s earnings bonanza and Tuesday’s stock sell-off show just how much tech giants’ enterprise AI businesses are holding the stock market together. But it’s their consumer products that are helping them fend off challengers.

It’s the latest episode demonstrating the power large corporations hold over upstarts. Perplexity isn’t the only one realizing it must rely on the products and websites that people have loved since long before the AI boom to make its new AI products useful.

Whereas Perplexity has sought to make its AI agents useful across the web, OpenAI, which is preparing for a blockbuster IPO, has taken a different tack. The ChatGPT maker has penned partnerships with the likes of Zillow, Spotify and Shopify so that its AI agents can navigate those sites. But Artificial Intelligencer subscribers will remember our chronicling of how it hit a wall trying to partner with Google. The search giant first turned down OpenAI’s request to integrate its search technology into ChatGPT, then limited the search results that could be scraped to the top 10. ChatGPT’s answers took a hit.

Whether an AI startup goes through the front door or the back door, all will have to contend with the power the legacy web still holds.

Our latest reporting in tech & AI: How Google regained its AI edge

A high-stakes bet on its once-overlooked cloud computing division is paying off for Alphabet, turning the former "also-ran" into a key growth engine that is boosting Wall Street's confidence in the search giant's future. “Cloud used to be the red-headed stepchild of Google,” one former employee told me.
But that’s changed: “If Google hadn’t invested in the cloud years ago, it would just be this ads and search company under attack.”

Indeed, that’s exactly the position Google was in three years ago this month, when OpenAI triggered an AI arms race with the launch of ChatGPT. Almost overnight, Google was seen as having fallen behind. It was particularly embarrassing for the self-proclaimed “AI-first” company, since the kernel of ChatGPT was born in a Google lab. The question from longtime Googlers who had seen the days of Larry Page and Eric Schmidt’s stewardship of the company: was Alphabet chief Sundar Pichai, the man behind that proclamation, fit to be a wartime CEO?

But Pichai has flipped the script. A choice he made when he became CEO in 2019, to make Google Cloud one of his two big bets (the other was YouTube) to move Alphabet beyond its core business, is paying off. Cloud boss Thomas Kurian has become one of Pichai’s most influential lieutenants by transforming the division’s structure and culture. Kurian’s rise was reflected at Google’s agenda-setting weekly “leads” meetings, where he has played a more prominent role, jostling for resources against other leaders.

Winning took breaking with tradition. When Cloud started selling Google’s self-developed AI chips, once reserved for internal use, to outside customers, it sparked a “healthy amount of tension” with other Google units, including DeepMind, a former Cloud executive told me. The move helped Google carve out a role in the highly competitive cloud business, where it has long been considered a distant third behind Amazon and Microsoft. Meanwhile, the Ads team used to laugh Cloud salespeople out of the room when the latter asked for help closing six-figure contracts, Josh Gwyther, a startup founder who worked at Google Cloud between 2016 and 2025, told me.
Now, Google Cloud is penning deals worth nine or 10 figures with both large enterprises and AI labs hungry for computing power, including its own rivals OpenAI and Anthropic. In some cases, Google’s ability to supply its AI chips, called TPUs, has won it deals. Internally, Google sees itself as a closer competitor to Nvidia than to AMD or Intel, the former executive told me.

While Google’s search and ads business remains its revenue driver, Wall Street confidence in the company, which enjoyed a 6% stock boost after reporting earnings last week, is being spurred by the cloud business. From backwater to growth engine, Google Cloud is now buying time for Google’s search business and DeepMind AI unit to fend off attacks from disruptors like Perplexity and OpenAI.

“Google Cloud is one of the most important priorities for Alphabet as a whole and I expect it to play an even more central role as the company moves forward,” Pichai told me in October. When I asked him how he saw Google Cloud’s positioning in the event of a market correction, Pichai said he anticipated “a lot of resilience” in the business. He added: “From our standpoint, we've been doing the AI thing for a decade now, and we're going to be doing it a decade from now. We take these long-term bets and stay relentlessly focused on it, be it AI, be it self-driving, be it quantum computing. That's how I view our approach.”

Chart of the week

By Krystal Hu, Technology Correspondent

Here's a head-scratcher: investors are throwing an astounding amount of money at robotics startups, nearly $20 billion in 2024 and even more in 2025, yet the actual payoffs are slow to show up. Here’s the catch: liquidity (the M&As and IPOs that actually return cash to investors) peaked back in 2019-2020 and has cratered since. We're talking near-zero so far in 2025. Bessemer thinks the near-term wins will come from the boring stuff: surgical robots, tightly geofenced self-driving, and warehouse workflows.
The firm believes those niches can clear today's bar on safety, ROI and data collection, while the big, open-ended humanoid bets keep absorbing capital and patience.

AI research to read

The big knock on transformers, the basis of today’s large language models, is the massive computing power they require, especially to analyze large amounts of text. Today’s LLMs work by looking at every word and figuring out how it relates to every word before it. Kimi-Linear combines one part traditional transformer attention with three parts of something called “linear attention,” which tries to summarize the most useful information so that the AI model doesn’t have to look back at every word over and over. It's a small tweak in how AI works. But it could reshape what’s economically viable in AI.

Reporting by Kenrick Cai; Additional reporting by Krystal Hu; Editing by Lisa Shumaker

Our Standards: The Thomson Reuters Trust Principles.

Kenrick Cai is a correspondent for Reuters based in San Francisco. He covers Google, its parent company Alphabet and artificial intelligence. Cai joined Reuters in 2024. He previously worked at Forbes magazine, where he was a staff writer covering venture capital and startups. He received a Best in Business award from the Society for Advancing Business Editing and Writing in 2023. He is a graduate of Duke University.
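For readers who want to see the attention trade-off described in "AI research to read" made concrete, here is a toy sketch. It is not code from the Kimi-Linear paper; the function names and the feature map are illustrative, and this shows only the generic idea: standard softmax attention builds a table comparing every word to every earlier word (cost grows with the square of the text length), while a linear-attention variant keeps a fixed-size running summary instead.

```python
import numpy as np

def softmax_attention(Q, K, V):
    """Standard attention: builds an (n x n) score matrix comparing
    every position to every other, so cost grows as n squared."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])            # (n, n) table
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)      # softmax rows
    return weights @ V

def linear_attention(Q, K, V):
    """Generic linear attention: a positive feature map replaces the
    softmax, so the (n x n) table is never built. The keys and values
    are compressed into a small (d x d) summary; cost grows only as n."""
    phi = lambda x: np.maximum(x, 0.0) + 1e-6           # illustrative feature map
    Qf, Kf = phi(Q), phi(K)
    S = Kf.T @ V                                        # (d, d) running summary
    z = Kf.sum(axis=0)                                  # normalizer
    return (Qf @ S) / (Qf @ z)[:, None]

rng = np.random.default_rng(0)
n, d = 8, 4                                             # 8 "words", 4-dim embeddings
Q, K, V = (rng.normal(size=(n, d)) for _ in range(3))
out_soft = softmax_attention(Q, K, V)
out_lin = linear_attention(Q, K, V)
print(out_soft.shape, out_lin.shape)                    # same output shape, different cost
```

The two functions return outputs of the same shape; the difference is the intermediate: softmax attention materializes an n-by-n matrix, while the linear variant only ever holds a d-by-d summary, which is why mixing in linear-attention layers, as Kimi-Linear does, can cut the cost of processing long texts.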