Apple releases a family of new open source AI models

Apple's LLM, called OpenELM (Open-source Efficient Language Models), is designed to run on the device instead of on a cloud server.

Apple doesn't share much about its generative AI plans. Now, with the release of a family of open source language models, the tech giant appears to be aiming to run AI locally on Apple devices. These OpenELM models are built for on-device use rather than a cloud server, and they are available on the Hugging Face Hub, a central platform for sharing AI code and datasets.
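Because the checkpoints are hosted on the Hugging Face Hub, they can be pulled down like any other Hub model. Below is a minimal sketch using the transformers library; the checkpoint name apple/OpenELM-270M and the Llama-2 tokenizer pairing are assumptions, so adjust them to the variant you actually download.

```python
# Minimal sketch: loading an OpenELM checkpoint from the Hugging Face Hub.
# "apple/OpenELM-270M" and the Llama-2 tokenizer repo are assumed names;
# the Llama-2 tokenizer is gated, so you may need Hub access approval.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "apple/OpenELM-270M"            # smallest pre-trained variant (assumed)
tokenizer_id = "meta-llama/Llama-2-7b-hf"  # tokenizer OpenELM is paired with (assumed)

tokenizer = AutoTokenizer.from_pretrained(tokenizer_id)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("Apple released OpenELM to", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```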

During testing, Apple found that OpenELM delivered performance comparable to other open language models while requiring less training data.


In total, there are eight OpenELM models: four pre-trained with the CoreNet library and four instruction-tuned variants. To improve overall accuracy and efficiency, Apple uses a layer-wise scaling strategy in these open source LLMs.

'To achieve this goal, we release OpenELM, a state-of-the-art open language model. OpenELM uses a layer-wise scaling strategy to efficiently allocate parameters within each layer of the transformer model, leading to improved accuracy. For example, with a budget of approximately one billion parameters, OpenELM shows a 2.36% improvement in accuracy over OLMo while requiring 2x fewer pre-training tokens,' according to Apple.
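To make the quoted idea concrete, here is an illustrative Python sketch of a layer-wise scaling schedule: rather than giving every transformer layer an identical width, the attention-head count and feed-forward multiplier grow from the first layer to the last. The layer count and minimum/maximum values below are placeholders for illustration, not Apple's published configuration.

```python
# Illustrative sketch of layer-wise scaling: per-layer widths are scaled
# rather than kept uniform. All numeric settings here are placeholders.

def layer_wise_scaling(num_layers: int,
                       min_heads: int, max_heads: int,
                       min_ffn_mult: float, max_ffn_mult: float):
    """Return a per-layer (attention heads, FFN multiplier) schedule."""
    schedule = []
    for i in range(num_layers):
        t = i / max(num_layers - 1, 1)   # 0.0 at the first layer, 1.0 at the last
        heads = round(min_heads + t * (max_heads - min_heads))
        ffn_mult = min_ffn_mult + t * (max_ffn_mult - min_ffn_mult)
        schedule.append((heads, round(ffn_mult, 2)))
    return schedule

# Example: a 12-layer model whose early layers are narrow and late layers wide.
for layer, (heads, ffn) in enumerate(layer_wise_scaling(12, 4, 16, 1.0, 4.0)):
    print(f"layer {layer:2d}: {heads:2d} heads, FFN multiplier {ffn}")
```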

Apple isn't just providing the final trained model; it is also releasing the code, training logs, and multiple versions of the model. The project's researchers are optimistic that this will accelerate progress and deliver 'reliable results' in the field of natural language AI.

'Diverging from prior practices that only provide model weights and inference code, and pre-train on private datasets, our release includes the complete framework for training and evaluating the language model on publicly available datasets, including training logs, multiple checkpoints, and pre-training configurations. We also release code to convert models to the MLX library for inference and fine-tuning on Apple devices. This comprehensive release aims to empower and strengthen the open research community, paving the way for future research efforts,' according to Apple.
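For the on-device side mentioned in the quote, a converted checkpoint can be run with Apple's MLX framework. The rough sketch below uses the mlx-lm helper package; the mlx-community/OpenELM-270M repo name is an assumption (substitute the output of the official conversion code if you run it yourself), and the mlx-lm API may differ between versions.

```python
# Rough sketch of on-device inference with Apple's MLX library via the
# mlx-lm helper package (pip install mlx-lm). The checkpoint name is an
# assumed community conversion, not a name confirmed by Apple.
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/OpenELM-270M")  # assumed repo name
text = generate(model, tokenizer,
                prompt="Apple released OpenELM to",
                max_tokens=40)
print(text)
```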

Apple added that the release of the OpenELM models will 'empower and enrich the open research community' with modern language models. Open source models allow researchers to explore the potential risks, data, and weaknesses inherent in the models. Developers can use these open source LLMs as-is or modify them as needed.

Back in February, Apple CEO Tim Cook revealed that generative AI features would be coming to Apple devices later this year. He has since reiterated that the company is working to deliver groundbreaking AI experiences.

Apple has previously released several other AI models, although the company has yet to bring those AI capabilities to its devices. The upcoming iOS 18, however, is expected to include a new set of AI features, and the OpenELM release could be the latest of Apple's behind-the-scenes preparations.

Recently, Mark Gurman also reported that iOS 18's AI features will be primarily powered by an on-device large language model, providing privacy and speed benefits. We'll know more when Apple announces iOS 18 and its other software upgrades at WWDC on June 10.

Update 26 April 2024