
Robert Triggs / Android Authority

TL;DR

  • Qualcomm has announced the Qualcomm AI Hub for developers.
  • It allows developers to quickly implement AI models in their apps.
  • The company says that these optimized models should work on non-Snapdragon devices too, with some caveats.

AI has been a fixture on smartphones for years now, but generative AI is a relatively new phenomenon in this space. Now, Qualcomm has announced the Qualcomm AI Hub to help developers quickly implement AI and generative AI features in their apps.

The company says this is effectively a central location for app developers to access on-device AI models that are quantized (i.e., compressed to run efficiently on-device) and validated by Qualcomm. The chip designer adds that over 75 AI models are supported.

To get started, you’ll need to visit aihub.qualcomm.com, select an AI model, and then choose a target platform. Qualcomm adds that you can drill down further and choose a specific device.

From here, the Qualcomm AI Hub will guide you to the correct model. The company says developers can integrate these optimized models into their workflow with “a few lines of code.”
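Qualcomm’s actual integration code isn’t shown in this article, but the selection flow it describes (model, then target platform, then optionally a specific device) can be sketched with a small hypothetical catalog. All of the names, tasks, and device strings below are illustrative assumptions, not Qualcomm’s real API:

```python
from typing import Optional

# Hypothetical catalog mimicking the AI Hub's model -> platform -> device
# drill-down described above. Entries are invented for illustration.
CATALOG = {
    "image-classification": {
        "android": ["Snapdragon 8 Gen 3", "Snapdragon 8 Gen 2"],
        "windows": ["Snapdragon X Elite"],
    },
    "text-generation": {
        "android": ["Snapdragon 8 Gen 3"],
    },
}

def select_model(task: str, platform: str, device: Optional[str] = None) -> dict:
    """Return the catalog entry matching the chosen task, platform, and device.

    If no device is given, fall back to the first supported device for
    that task/platform combination.
    """
    devices = CATALOG[task][platform]
    if device is not None and device not in devices:
        raise ValueError(f"{device} not supported for {task} on {platform}")
    return {"task": task, "platform": platform, "device": device or devices[0]}

choice = select_model("image-classification", "android", "Snapdragon 8 Gen 2")
print(choice["device"])  # Snapdragon 8 Gen 2
```

The point of the sketch is the shape of the workflow, not the API surface: developers narrow from a task category down to a concrete device before pulling in an optimized model.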

Qualcomm notes these models cover image segmentation, image generation, image classification, object detection, super-resolution, low-light enhancements, text generation, and natural language understanding.

Will this work on non-Snapdragon devices?

What if you’d like to make an AI-enabled app that runs on a device with a non-Snapdragon chip? We posed this question to Qualcomm during a media briefing and it turns out there’s good news and bad news.

“These models are actually pre-optimized in terms of the model itself, providing acceleration on generic CPU and GPUs, so it will work on other branded chips and you can still enjoy some performance gains,” the company told us during a briefing.

“But in order to get the best out of these models, we optimized even a level deeper for the Qualcomm platforms (sic), with NPU acceleration and some more enhancements across the board, these models will work best in Qualcomm platforms.”

In other words, these optimized AI models will work on a phone powered by an Exynos chip, Google Tensor chip, or MediaTek SoC. However, they can’t take advantage of dedicated AI silicon for faster and/or more efficient performance. So developers will still need to put in some extra work on these devices for the best results.
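The behavior Qualcomm describes amounts to a backend-selection step: use NPU acceleration where the deepest Snapdragon-specific optimizations apply, and fall back to generic CPU/GPU acceleration everywhere else. A minimal illustrative sketch of that logic (the SoC detection and backend labels are assumptions, not Qualcomm code):

```python
def pick_backend(soc_name: str, has_gpu: bool = True) -> str:
    """Choose an execution backend for a given SoC, mirroring the
    fallback behavior described above.

    Snapdragon chips get the NPU-accelerated path; other SoCs fall
    back to generic GPU acceleration, or CPU if no GPU is usable.
    """
    if "snapdragon" in soc_name.lower():
        return "npu"  # deepest, Snapdragon-only optimization level
    return "gpu" if has_gpu else "cpu"  # generic acceleration path

print(pick_backend("Snapdragon 8 Gen 3"))          # npu
print(pick_backend("Google Tensor G3"))            # gpu
print(pick_backend("Exynos 2400", has_gpu=False))  # cpu
```

This is why the models still run, and still see some speedup, on Exynos, Tensor, or MediaTek hardware: the generic path works anywhere, while the NPU path is what developers would have to replicate themselves for best results on non-Snapdragon silicon.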
