Qualcomm partners with Meta to bring Llama 2 to smartphones and PCs
After much speculation, Meta has officially announced Llama 2, the next generation of its large language model for general-purpose AI applications and services. More notably, Meta and Qualcomm have signed a new partnership that will allow Llama 2 to be used on mobile devices powered by Qualcomm Snapdragon processors, promising interesting AI features and improvements on smartphones.
In a press release, Qualcomm also said that the goal of the cooperation agreement with Meta is to allow Snapdragon-powered devices to run Llama 2-based applications and services without connecting to a cloud service, as most current AI products, including ChatGPT and Bing Chat, must. The company stated:
'The ability to run generative AI models like Llama 2 on devices such as smartphones, PCs, and VR/AR headsets allows developers to save on cloud costs while providing users with a more private, reliable, and personalized experience.'
Llama (Large Language Model Meta AI) is an openly available language model that researchers and organizations across government, civil society, and academia can use free of charge. Meta says large language models (LLMs) have proven very effective at generating text, holding conversations, summarizing documents, and solving math and science problems. Trained on vast amounts of text, they can distill information and create content, for example answering questions in writing much as a human would. This is also the biggest advantage that companies and users are after.
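To make this concrete, here is a minimal, hypothetical sketch of that kind of question answering using the Hugging Face transformers library. The gated meta-llama/Llama-2-7b-chat-hf checkpoint, the prompt, and the generation settings are illustrative assumptions, not details taken from the article.

```python
# Hypothetical sketch: plain text generation with a Llama 2 chat checkpoint.
# Assumes access to the gated "meta-llama/Llama-2-7b-chat-hf" repository on
# Hugging Face (Meta's license must be accepted first); any other causal LM
# identifier would work the same way.
from transformers import pipeline

generator = pipeline("text-generation", model="meta-llama/Llama-2-7b-chat-hf")

prompt = "Summarize in one sentence what a large language model is."
result = generator(prompt, max_new_tokens=60, do_sample=False)
print(result[0]["generated_text"])
```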
Qualcomm says the ability to run large language models like Llama 2 directly on mobile devices offers some clear advantages. Most notably, it can be more cost-effective than relying on a cloud-based LLM. It can also be more responsive, since requests don't have to travel to an online service. At the same time, an on-device LLM can provide more personalized AI services, and it can be more secure and private than sending data to a cloud server.
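As a rough illustration of what on-device inference can look like, the sketch below uses the community llama-cpp-python bindings with a locally stored, quantized Llama 2 model; the file name, thread count, and prompt are illustrative assumptions and do not reflect Qualcomm's actual Snapdragon tooling.

```python
# Hypothetical sketch of fully local (on-device) Llama 2 inference using the
# llama-cpp-python bindings. The quantized model file path below is
# illustrative; a real setup would point at a model downloaded in advance.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-2-7b-chat.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=2048,    # context window size
    n_threads=4,   # CPU threads; tune for the target device
)

# The prompt is processed entirely on the device; no request leaves it.
result = llm(
    "Q: Why can on-device AI improve privacy? A:",
    max_tokens=128,
    stop=["Q:"],
)
print(result["choices"][0]["text"].strip())
```

Because both the model weights and the prompt stay on the device, this pattern captures the cost, latency, and privacy benefits Qualcomm describes.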
Currently, Qualcomm plans to start supporting Llama 2-based AI services on Snapdragon-powered devices in 2024. There's no word yet on whether Llama 2 will require Qualcomm's latest-generation chips or will also be compatible with current Snapdragon chips.