Qualcomm partners with Meta to bring Llama 2 to smartphones and PCs
After much speculation, Meta has officially announced Llama 2, the next generation of its large language model for general-purpose AI applications and services. More notably, Meta and Qualcomm have signed a new partnership that will allow Llama 2 to run on mobile devices powered by Qualcomm Snapdragon processors, promising interesting AI innovations and additions on smartphones.
In a press release, Qualcomm said the goal of the cooperation agreement with Meta is to allow Snapdragon-powered devices to run Llama 2-based applications and services without connecting to a cloud service, as most current AI products, including ChatGPT and Bing Chat, must do. The company stated:
'The ability to run generative AI models like Llama 2 on devices such as smartphones, PCs, and VR/AR headsets allows developers to save on cloud costs while providing users with a more private, trusted, and personalized experience.'
Llama (Large Language Model Meta AI) is an open-source large language model that researchers and organizations in government, civil society, and academia can use for free. Meta says large language models (LLMs) have proven very effective at generating text, holding conversations, summarizing documents, and solving problems in math and science. They are trained on large amounts of text, which lets them summarize information and create content, for example answering written questions as if the response had been written by a human. This is also the biggest advantage that companies and users are after.
Qualcomm says the ability to run large language models like Llama 2 on mobile devices offers some clear advantages. Most notably, it can be more cost-effective than relying on a cloud-based LLM. It can also deliver better responsiveness, since no connection to an online service is required, and an on-device LLM can provide more personalized AI services. Finally, it can be more secure and private than connecting to a cloud server.
Qualcomm currently plans to begin supporting Llama 2-based AI services on Snapdragon-powered devices in 2024. There's no word yet on whether Llama 2 will require Qualcomm's latest-generation chips or will also be compatible with current Snapdragon chips.
Share by Isabella Humphrey