Amazon AI chips may help train Apple Intelligence in the future


Apple is using chips sourced from Amazon Web Services to handle searches, and it may also use them to pre-train Apple Intelligence AI models.

It is common knowledge that large companies like Apple rely on external service providers for certain services. While that is expected for tasks that don’t demand a high level of privacy and security, it turns out Apple is also doing so for some of its machine learning features.

During the annual AWS re:Invent conference on Tuesday, Apple confirmed that it is using Amazon’s custom artificial intelligence chips, CNBC reports.

Apple senior director of machine learning and AI Benoit Dupin appeared on stage to discuss how Apple uses Amazon’s cloud services. Apple has relied on AWS for more than a decade, including for Siri, Apple Maps, and Apple Music.

Dupin confirmed that Apple has also used Amazon’s Trainium and Graviton chips to handle consumer search queries. So far the move has paid off, with the chips making searches 40% more efficient.

Model benefits

While Apple’s use of AWS and Amazon’s chips for search is a boon for Amazon, Dupin indicated that more benefits for Apple could be on the way.

Speaking on stage, Dupin said Apple is evaluating Amazon’s Trainium2 chip for potential use in pretraining its models. This could include new models for Apple Intelligence features that add new capabilities or improve what’s already offered to consumers.

So far, Apple appears pleased with the chips. “In the early stages of evaluating Trainium2, we expect early numbers up to 50% improvement in efficiency with pretraining,” Dupin told the audience.

The sizable efficiency improvement offers a very real benefit to Apple, and the retailer said the chip is available to rent via AWS. Efficiency gains could lower the cost of adequately pretraining models, or allow more training to be done for the same cost.

Not a risk to users

The use of Amazon’s chips may alarm some users who are familiar with Apple’s privacy-focused approach. Apple typically performs processing on-device using its own chips, and it also uses its own silicon for cloud-based tasks via Private Cloud Compute.

However, the AWS announcement by Apple doesn’t actually affect Apple’s processing practices at all. It’s about training the models, not processing queries.

Before a model can be deployed for customers to use, it has to be trained. This is a processor-intensive and resource-hungry task that shapes how the model behaves and what results it is intended to produce.

This training can be performed in various ways, such as buying high-performance servers packed with GPUs that excel at AI number-crunching. However, that gets expensive, and for a process that doesn’t touch user data at all, it doesn’t necessarily have to be done in-house.
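To make that workload concrete, here is a minimal, purely illustrative sketch of a next-token pretraining loop in PyTorch. It is not Apple’s pipeline; the tiny model, random data, and hyperparameters are placeholder assumptions, but the same cycle of forward pass, loss calculation, and gradient update is what AI accelerators are built to run at enormous scale, whether in-house or rented from a cloud provider.

```python
# Illustrative only: a toy next-token "pretraining" loop, not Apple's pipeline.
# Model size, data, and hyperparameters are placeholder assumptions chosen to
# show why training is compute-intensive compared with serving a query.
import torch
import torch.nn as nn

vocab_size, embed_dim, seq_len, batch_size = 1000, 128, 32, 16

# A toy language model: token embedding -> one Transformer-style block -> logits.
model = nn.Sequential(
    nn.Embedding(vocab_size, embed_dim),
    nn.TransformerEncoderLayer(d_model=embed_dim, nhead=4, batch_first=True),
    nn.Linear(embed_dim, vocab_size),
)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(100):  # real pretraining runs for vastly more steps
    # Random token IDs stand in for a real text corpus.
    tokens = torch.randint(0, vocab_size, (batch_size, seq_len + 1))
    inputs, targets = tokens[:, :-1], tokens[:, 1:]

    logits = model(inputs)                            # shape: (batch, seq, vocab)
    loss = loss_fn(logits.reshape(-1, vocab_size), targets.reshape(-1))

    optimizer.zero_grad()
    loss.backward()   # the gradient computation is where most hardware cost goes
    optimizer.step()
```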

Where the announcement could matter, unexpectedly, is for Google.

In July, an AI research paper confirmed that Apple had used Google-designed hardware to build the Apple Foundation Model. While it is unclear whether Apple rented server time or bought Google hardware to run in its own data centers, either way Apple’s model was trained on Google’s hardware.

Apple’s interest in Amazon’s chips could lead to a similar situation of either server rental or hardware purchases, all in a bid for efficiency.

It ultimately doesn’t matter to consumers what Apple uses to train the model itself. What does matter is that Apple’s hardware is still being used to answer queries and to perform the actual processing required for Apple Intelligence to exist.


