Qualcomm and Meta collaborate to bring Llama 2 AI models directly to devices, eliminating costly cloud reliance

Devesh Beri


Qualcomm has announced a collaboration with Meta to optimize Meta’s Llama 2 large language models to run directly on-device, removing the need to rely on cloud services. This makes it possible to run generative AI models such as Llama 2 on smartphones, PCs, VR/AR headsets, and vehicles, reducing cloud costs while offering private, reliable, and personalized user experiences.

Qualcomm Technologies plans to provide on-device Llama 2-based AI implementations, making it easier to build new AI applications. This opens opportunities for intelligent virtual assistants, productivity tools, content creation, entertainment, and more. Because inference happens locally, these Snapdragon-powered AI experiences can work in areas with no connectivity, or even with the device in airplane mode.
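To make the idea of "on-device" inference concrete, here is a minimal, hypothetical sketch that runs a quantized Llama 2 model locally using the open-source llama-cpp-python bindings. This is only an illustration of the general approach, not Qualcomm's Snapdragon implementation (which has not shipped at the time of writing); the model file path is an assumption.

```python
# Minimal sketch of on-device text generation with a quantized Llama 2 model.
# Uses the open-source llama-cpp-python bindings as a stand-in runtime;
# Qualcomm's own Snapdragon tooling is not covered here.
from llama_cpp import Llama

# Hypothetical path to a quantized Llama 2 7B chat model stored on the device.
llm = Llama(model_path="models/llama-2-7b-chat.Q4_K_M.gguf", n_ctx=2048)

# Everything below runs locally -- no network call, so it also works in airplane mode.
output = llm(
    "Q: Summarize today's calendar in one sentence. A:",
    max_tokens=64,
    stop=["Q:"],
    echo=False,
)
print(output["choices"][0]["text"].strip())
```

The point of the sketch is simply that the prompt never leaves the device: the model weights live in local storage and generation runs on local hardware, which is what removes the cloud dependency described above.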

Meta and Qualcomm's collaboration

Durga Malladi, senior vice president at Qualcomm Technologies, commended Meta’s approach to open and responsible AI. Qualcomm says it is committed to driving innovation and making on-device generative AI accessible to developers of all sizes. To scale generative AI into the mainstream, Malladi argues, it needs to run not only in the cloud but also on edge devices such as smartphones, laptops, vehicles, and IoT devices.

The two companies’ joint efforts to support the Llama ecosystem span both research and product engineering. Qualcomm Technologies’ leadership in on-device AI puts it in a unique position to support the Llama ecosystem, with billions of devices powered by its AI hardware and software.

Qualcomm Technologies will make Llama 2-based AI implementations available on Snapdragon-powered devices starting in 2024.

Why does this matter?

  • The most powerful LLMs (large language models), including the models behind Bard, ChatGPT, and others, rely heavily on expensive cloud computing resources. That dependence on the cloud limits how far generative AI can scale and holds back its full potential.
  • This is significant as the first major corporate partnership aimed at bringing LLMs to mobile devices. It marks a shift from experimentation toward concrete products, pointing to a new paradigm for the future of mobile devices.