Qualcomm partners with Meta to bring Llama 2 to smartphones and PCs

After much speculation, Meta has officially announced Llama 2, the next-generation version of its large language model for general-purpose AI applications and services.

More notably, Meta and Qualcomm have announced a new partnership that will bring Llama 2 to mobile devices powered by Qualcomm Snapdragon processors, promising interesting AI-related innovations and features on smartphones.

In a press release, Qualcomm said the goal of the agreement with Meta is to let Snapdragon-powered devices run Llama 2-based applications and services without connecting to a cloud service, as most current AI products, including ChatGPT and Bing Chat, require. The company stated:

"The ability to run generative AI models like Llama 2 on devices such as smartphones, PCs, and VR/AR headsets allows developers to save on cloud costs while providing users with a private, more trusted, and personalized experience."


Llama (Large Language Model Meta AI) is an open-source large language model that researchers and organizations in government, civil society, and academia can use for free. Meta says large language models (LLMs) have proven highly effective at generating text, holding conversations, summarizing documents, and solving problems in math and science. They are trained on large amounts of text to summarize information and create content, and can, for example, produce written answers to questions that read as if a human wrote them. This capability is the biggest draw for companies and users alike.

Qualcomm says running large language models like Llama 2 on mobile devices offers several clear advantages. Most notably, it can be more cost-effective than using a cloud-based LLM. It can also deliver better responsiveness, since no connection to an online service is needed. In addition, an on-device LLM can provide more personalized AI services, and it can be more secure and private than sending data to a cloud server.

Qualcomm currently plans to begin supporting Llama 2-based AI services on Snapdragon-powered devices in 2024. There's no word yet on whether Llama 2 will require Qualcomm's latest-generation chips or will also be compatible with current Snapdragon processors.
