“Our years of AI development, including the most powerful combination of CPU, NPU and GPU available in devices, and our support of all leading models running natively, means we can bring the benefits of generative AI to users worldwide and across multiple device categories,” Amon continued in the press release. “The partner support we have at Snapdragon Summit is a testament to our standing in the industry as an on-device AI leader.”

Elsewhere in its press kit, Qualcomm argued for the benefits of running AI apps on-device, with the biggest focus on privacy and security, and on how local processing could enable more personalized AI features. Other reasons given included energy use, immediacy (“Who wants to wait for a generative AI app to do its job?”), and cost. Qualcomm says the new chips support “a wide range of” existing generative AI models, including “multiple models” from Microsoft, Meta’s Llama 2, stability.ai’s ControlNet, the aforementioned Stable Diffusion, Zhipu AI’s ChatGLM2, and OpenAI’s Whisper.