Snapchat users will now see a ghostly image hovering over their AI-generated images, as Snapchat joins other Big Tech companies using watermarking tech to battle AI misinformation and deepfakes.

Images made using Snapchat’s AI tools, like the app’s extend tool and its recently launched Dreams feature, will be stamped with a transparent watermark (Snapchat’s ghost logo) once they’re exported or downloaded from the app. Users receiving AI-generated images may also see the ghost logo and the app’s “sparkle” AI icon.

Currently, Snapchat marks AI-generated content, including text conversations with its My AI chatbot, in various ways. Images created using Dreams are accompanied by a “context card” that explains the feature and generative AI. My AI conversations and the extend tool use “contextual” icons, like the sparkle symbol, to signal that AI was involved in creating the content.

“We also take great care to vet all political ads through a rigorous human review process, including a thorough check for any misleading use of content, including AI to create deceptive images or content,” the platform wrote. “The addition of these watermarks will help inform those viewing it that the image was made with AI on Snapchat.”

Along with the new transparency tool, Snapchat also committed to ongoing AI literacy efforts. So far, that just entails a generative AI FAQ available on its Support Site.

“While all of our AI tools, both text-based and visual, are designed to avoid producing incorrect, harmful, or misleading material, mistakes may still occur,” the company wrote. “Snapchatters are able to report content, and we appreciate this feedback.”

AI watchdogs say watermarking technology isn’t a surefire solution, however, even as tech’s biggest players stake their user-facing transparency efforts on it.

In February, OpenAI announced it would be adding metadata watermarks to images generated by DALL-E 3. Google launched SynthID, a tool that adds similarly invisible watermarks to AI images, in August. YouTube, meanwhile, is enforcing penalties against users who don’t use its labeling system for digitally altered content.
