In a surprising collaboration, Qualcomm and Meta have joined forces to bring Meta's new large language model, Llama 2, to Qualcomm chips in phones and PCs starting in 2024. The partnership marks a shift from the traditional approach of running LLMs on large server farms built around Nvidia graphics processors, an arrangement driven by the models' heavy computational demands. Demand for that server infrastructure has helped Nvidia's stock soar more than 220% this year. Qualcomm, a leading maker of processors for phones and PCs, has not seen the same lift: its stock has risen only about 10% in 2023, against the NASDAQ's 36% gain.
The announcement signals Qualcomm's intention to position its processors as AI-ready, with a focus on running AI models directly on devices rather than in the cloud. The shift could be game-changing: it has the potential to significantly reduce the cost of running AI models while improving the responsiveness of voice assistants and other applications.
To that end, Qualcomm plans to make Meta's open-source Llama 2 models available on its devices, paving the way for intelligent virtual assistants and similar applications. Although Llama 2 offers functionality similar to ChatGPT's, it comes in variants small enough to run on a phone.
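As a rough illustration of why model size matters here, consider a back-of-envelope estimate of the storage needed for model weights at different numeric precisions. The 7-billion-parameter figure for the smallest Llama 2 variant is public; the specific precisions compared below are illustrative assumptions, not details from the Qualcomm/Meta announcement:

```python
# Back-of-envelope estimate of weight storage for an LLM at different
# numeric precisions. The 7B parameter count for the smallest Llama 2
# variant is public; the choice of 16/8/4-bit precisions is an
# illustrative assumption, not a detail from the announcement.

def model_size_gb(num_params: float, bits_per_param: int) -> float:
    """Approximate weight size in gigabytes (decimal GB)."""
    return num_params * bits_per_param / 8 / 1e9

llama2_7b = 7e9  # parameters in the smallest Llama 2 variant

for bits in (16, 8, 4):
    print(f"{bits:>2}-bit weights: ~{model_size_gb(llama2_7b, bits):.1f} GB")
# 16-bit: ~14.0 GB, 8-bit: ~7.0 GB, 4-bit: ~3.5 GB
```

At 16-bit precision the weights alone would be around 14 GB, beyond most phones' memory, which is why on-device deployments typically lean on lower-precision formats that shrink the footprint several-fold.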
One key feature of Qualcomm's chips is a dedicated "tensor processor unit" (TPU) designed for the matrix-heavy calculations AI models require. Even so, the processing power available on a mobile device remains far below that of a data center packed with cutting-edge GPUs.
Meta's Llama stands out because Meta has openly published the "weights" of its AI model, enabling researchers and commercial enterprises to run the model on their own computers with relatively few restrictions. That stands in contrast to other prominent large language models such as OpenAI's GPT-4 and Google's Bard, which are closed-source with closely guarded weights.
Qualcomm has a history of collaboration with Meta, having previously worked on chips for Meta's Quest virtual reality devices. Qualcomm has also demonstrated AI models running on its chips, albeit more slowly than in the cloud, such as the open-source image generator Stable Diffusion.
The partnership between Qualcomm and Meta puts a new twist on AI models by harnessing them directly on mobile devices, potentially revolutionizing the AI landscape by making those services more accessible and affordable.
The whytry.ai article you just read is a brief synopsis; the original article can be found here: Read the Full Article…