On Monday, Apple CEO Tim Cook announced on stage a deal with OpenAI to include its powerful artificial intelligence model in the company’s voice assistant, Siri.
But in the fine print of a technical document Apple released after the event, the company makes it clear that Alphabet’s Google has emerged as another winner in the Cupertino, California-based company’s attempt to catch up with AI.
To build Apple’s core AI models, the company’s engineers used its own software framework with a range of hardware: its own on-premise graphics processing units (GPUs) and tensor processing units (TPUs), chips available only in Google’s cloud.
Google has been making TPUs for about 10 years and has publicly discussed two variants of its fifth-generation chips that can be used to train artificial intelligence; the performance version of the fifth generation rivals Nvidia’s H100 AI chips, Google has said.
Google announced at its annual developer conference that the sixth generation will be launched this year.
The processors are designed specifically to run AI applications and train models, and Google has built a cloud computing hardware and software platform around them.
Apple and Google did not immediately respond to requests for comment.
Apple did not discuss how much it relies on Google’s chips and software compared to hardware from Nvidia or other AI vendors.
But using Google’s chips typically requires a customer to buy access to them through its cloud division, much as customers buy compute time from Amazon.com’s AWS or Microsoft’s Azure.
© Thomson Reuters 2024