The head of Amazon’s cloud division has used recent turmoil at OpenAI to launch a thinly veiled attack on Microsoft, the artificial intelligence company’s biggest investor.

Adam Selipsky’s swipe at his biggest rival in cloud computing came as Amazon Web Services announced new custom chips for servers and AI — two weeks after Microsoft debuted its own bespoke processors — as well as a new workplace AI assistant that competes with Microsoft’s OpenAI-powered Copilot.

Satya Nadella, chief executive of Microsoft, which has an exclusive partnership with OpenAI, rushed to support the AI company’s co-founder Sam Altman when he was abruptly ousted as chief executive earlier this month.

“Things are moving so fast [in AI] and in that type of environment the ability to adapt is the most valuable capability that you can have,” Selipsky, AWS chief executive, said at its annual developer conference in Las Vegas on Tuesday. “You don’t want a cloud provider that’s beholden primarily to one model provider, you need a real choice . . . The events of the past 10 days have made that very clear.”

The comments were a tacit but obvious reference to the boardroom drama at OpenAI, one of the world’s most valuable private companies, which gripped Silicon Valley as Altman was fired then reinstated within days.

To counter Microsoft’s all-in bet on OpenAI, which has fuelled growth in its Azure cloud business, Amazon has sought to offer corporate customers access to a range of alternative AI models. It has also struck an investment deal worth up to $4bn with OpenAI’s biggest rival, Anthropic, which has pledged to use AWS’s Trainium, a chip for building large artificial intelligence systems that Amazon designed in-house.

Selipsky said the fourth generation of its Graviton processor, a general-purpose chip based on UK chip designer Arm’s architecture, was “the most powerful and energy-efficient chip we’ve ever built”. Amazon also previewed the second generation of Trainium, which Selipsky said would be “ideal for training foundation models with hundreds of billions or even trillions of parameters” when it becomes available next year.

The launch comes two weeks after Microsoft unveiled the first server chips that it has designed in-house: an AI accelerator called Maia and Cobalt, a central processing unit based on Arm technology.

At the same time as touting a rival AI chip, Amazon also deepened its partnership with chipmaker Nvidia, whose chief executive Jensen Huang joined Selipsky on stage in Las Vegas — two weeks after appearing alongside Nadella at a Microsoft event. AWS will offer customers access to Nvidia’s latest AI processors and host its “AI training as a service” platform, DGX Cloud, they said.

The three big cloud computing companies — Amazon, Microsoft and Google — are upping their investments in semiconductors as they try to optimise their own infrastructure and reduce operating costs. They are also trying to create alternatives to Nvidia, whose processors dominate the booming market for training giant AI models such as OpenAI’s GPT, even as they maintain alliances with the chipmaker.

AWS had for years been considered the market leader in cloud computing, but Microsoft has closed some of the gap this year thanks to its alliance with OpenAI. Microsoft gives its Azure customers access to the large language model behind the popular chatbot ChatGPT.

Tuesday also saw the introduction of Amazon Q, an AI assistant designed for businesses and their employees that in some areas competes with Microsoft’s Copilot, which can help write Word documents or computer code. Amazon Q can be tailored to a company’s own data and systems to generate content, answer questions and analyse data.
