An Apple research paper reveals how older iPhones would get on-device AI
January 1, 2024

Expectations are high for on-device generative artificial intelligence (AI) in the Android flagships slated for 2024, but Apple has been fairly quiet on the matter so far. A new Apple research paper, however, indicates that AI could run natively even on older iPhones, despite their limited RAM.

The Financial Times spotted an Apple research paper detailing a solution for running large language models (LLMs) on devices with limited RAM. Instead of loading the entire AI model into RAM, the paper explains, Apple can keep the model parameters in the phone's storage and transfer only the parts that are needed into RAM at any given moment.
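
To picture the general idea, here is a minimal sketch, not Apple's actual implementation: the weights live in a file on storage and are memory-mapped, so only the layer currently being computed is paged into RAM. The layer count and sizes are made-up toy values.

```python
import numpy as np

# Toy sketch of the general idea (not Apple's implementation):
# weights live in a file on flash storage, and memory-mapping lets
# the OS page in only the slice each layer actually reads.
N_LAYERS, HIDDEN = 4, 512  # made-up sizes; real models are far larger

def write_dummy_weights(path="weights.bin"):
    """Write random fp16 weights for all layers to storage once."""
    np.random.rand(N_LAYERS, HIDDEN, HIDDEN).astype(np.float16).tofile(path)

def run_inference(x, path="weights.bin"):
    """Stream one layer at a time from storage instead of loading everything."""
    weights = np.memmap(path, dtype=np.float16, mode="r",
                        shape=(N_LAYERS, HIDDEN, HIDDEN))
    for layer in range(N_LAYERS):
        w = np.asarray(weights[layer], dtype=np.float32)  # paged in on access
        x = np.maximum(w @ x, 0.0)  # toy layer: matmul + ReLU
    return x

write_dummy_weights()
print(run_inference(np.random.rand(HIDDEN).astype(np.float32)).shape)
```

The paper itself describes more sophisticated tricks for deciding which parameters to pull in and when, but the core shift is the same: storage holds the full model, and RAM holds only the working set.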

According to the paper, this method makes it possible to run models up to twice the size of the available DRAM, with a 4-5x increase in inference speed over naive loading approaches on the CPU and a 20-25x increase on the GPU.

On-device generative AI benefits from a large amount of RAM because RAM offers much faster read/write speeds than the storage used even in premium phones. Those speeds are key to on-device AI, since they allow much faster inference; users shouldn't have to wait tens of seconds (or more) for an answer.
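
A rough back-of-envelope calculation shows why bandwidth matters; the figures below are assumptions for illustration, since real bandwidths vary by device.

```python
# Illustrative, assumed figures; real bandwidths vary by device.
MODEL_GB = 3.5     # e.g. a 7B-parameter model quantized to 4 bits
RAM_GBPS = 50.0    # assumed LPDDR RAM bandwidth
FLASH_GBPS = 2.0   # assumed NAND flash read bandwidth

for name, bw in [("RAM", RAM_GBPS), ("flash", FLASH_GBPS)]:
    print(f"Reading {MODEL_GB} GB of weights from {name}: {MODEL_GB / bw:.2f} s")
# RAM: 0.07 s; flash: 1.75 s per full pass over the weights,
# which is why reading only the needed parameters from flash pays off.
```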

All of this points to an on-device AI assistant that could run at conversational speed, generate images and text more quickly, summarize articles faster, and more. In other words, Apple's solution means a phone doesn't necessarily need a lot of RAM for on-device AI that performs well enough.

Apple's research could thus bring on-device AI capabilities to as many new and older iPhones as possible, given that iPhones generally offer less RAM than many premium Android phones. The iPhone 11 series, for example, has only 4 GB of RAM, while even the latest base iPhone 15 model has just 6 GB.
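
Some quick arithmetic makes the constraint concrete, using a hypothetical 7-billion-parameter model as an example:

```python
# Hypothetical model sizes, to show why 4-6 GB of RAM is tight.
def model_size_gb(params_billions, bits_per_param):
    """Parameter count times bytes per parameter, in gigabytes."""
    return params_billions * 1e9 * bits_per_param / 8 / 1e9

for params in (3, 7):
    for bits in (16, 4):
        print(f"{params}B params at {bits}-bit: {model_size_gb(params, bits):.1f} GB")
# A 7B model needs ~14 GB at 16-bit and ~3.5 GB even at 4-bit:
# a tight squeeze next to iOS and apps in 4-6 GB of total RAM.
```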

If this solution proves viable, it would also put the Cupertino company's silence on AI in a different light. But Apple isn't the only mobile player working to optimize LLMs on the device.

Qualcomm's and MediaTek's recently released flagship chipsets both support so-called INT4 precision, a lower-precision format for optimizing AI and large language models. Either way, the industry will certainly keep finding new ways to reduce the system requirements for on-device AI, potentially allowing even low-end phones to offer these capabilities.
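
For readers curious what INT4 means in practice, here is a minimal, generic sketch of symmetric 4-bit quantization; it illustrates the technique itself, not any vendor's specific implementation.

```python
import numpy as np

# Generic sketch of symmetric 4-bit quantization; not any vendor's
# specific INT4 implementation.
def quantize_int4(w):
    """Map float weights to integers in [-8, 7] plus one float scale."""
    scale = np.abs(w).max() / 7.0          # one scale for the whole tensor
    q = np.clip(np.round(w / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the 4-bit integers."""
    return q.astype(np.float32) * scale

w = np.random.randn(4, 4).astype(np.float32)
q, scale = quantize_int4(w)
print("max rounding error:", np.abs(w - dequantize(q, scale)).max())
# Stored at 4 bits per weight, a tensor takes a quarter of the
# space of 16-bit floats, cutting both storage and memory traffic.
```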