Apple Just Open Sourced AI Models Designed To Run On iPhones

Apple has made new AI models available to the open source community that appear designed to run on devices like iPhones, rather than in the cloud.

Apple has long been expected to launch a new iOS 18 update at WWDC in June that will focus on new AI features, although details are currently hard to confirm.

A previous report had suggested that the AI features would run on-device rather than on servers in the cloud, and this latest news appears to back that up.

By running AI tools on the iPhone itself, Apple can make them faster: with no need to send data to and from a server, sometimes over slow cellular connections, features respond more readily. The privacy benefits are clear as well, since potentially sensitive data never has to leave the device.

In the case of this open source project, called OpenELM (Open-source Efficient Language Models), the LLMs are available via the Hugging Face Hub community.

According to the project's description, OpenELM is a state-of-the-art open language model. It uses a layer-wise scaling strategy to efficiently allocate parameters within each layer of the transformer model, leading to enhanced accuracy. For example, with a parameter budget of approximately one billion parameters, OpenELM exhibits a 2.36% improvement in accuracy compared to OLMo while requiring 2x fewer pre-training tokens.
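To get a feel for what layer-wise scaling means, here is a toy Python sketch of the idea: instead of giving every transformer layer the same width, a multiplier is interpolated across the depth of the network so early layers are narrower and later layers are wider, spending the fixed parameter budget unevenly. The function name, the linear interpolation, and the multiplier range are illustrative assumptions, not Apple's actual recipe, which scales attention heads and feed-forward dimensions with its own hyperparameters.

```python
def layerwise_scaling(num_layers, alpha_min=0.5, alpha_max=1.0):
    """Return one width multiplier per transformer layer.

    Multipliers grow linearly from alpha_min (first layer) to
    alpha_max (last layer), so early layers are narrower and later
    layers are wider. Illustrative sketch only -- not the exact
    OpenELM formula.
    """
    if num_layers == 1:
        return [alpha_max]
    return [
        alpha_min + (alpha_max - alpha_min) * i / (num_layers - 1)
        for i in range(num_layers)
    ]

# Example: a 4-layer model with a base feed-forward width of 1024.
multipliers = layerwise_scaling(4, alpha_min=0.5, alpha_max=1.0)
ffn_dims = [int(1024 * m) for m in multipliers]
# Early layers get narrower FFNs than later ones,
# while the total parameter count stays under the same budget
# as a uniform design with an intermediate width.
```

The payoff claimed for this kind of non-uniform allocation is better accuracy per parameter, which matters most when the whole model has to fit in a phone's memory.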

If Apple sticks to its usual pattern, the company will announce iOS 18 at WWDC and then make a developer beta available immediately afterward. From there, the beta program is likely to run for months, with the update expected to ship to the public in September.
