It’s Valentine’s Day and digital romances are blossoming. Across the world, lonely hearts are opening up to virtual lovers. But their secrets aren’t as safe as they may seem.

According to a new analysis by Mozilla, AI girlfriends harvest reams of private and intimate data. This information can then be shared with marketers, advertisers, and data brokers. It’s also vulnerable to leaks.

The research team investigated 11 popular romantic chatbots, including Replika, Chai, and EVA AI. Around 100 million people have downloaded these apps on Google Play alone.

Using AI, the chatbots simulate conversations with virtual girlfriends, soulmates, or friends. To produce these exchanges, the systems ingest oodles of personal data.

Often, that information is extremely sensitive and explicit.

“AI romantic chatbots are collecting far beyond what we might consider ‘typical’ data points such as location and interests,” Misha Rykov, a researcher at Mozilla’s *Privacy Not Included project, told TNW.

“We found that some apps are highlighting their users’ health conditions, flagging when they are receiving medication or gender-affirming care.”

Mozilla described the safeguards as “inadequate.” Ten of the 11 chatbots failed to meet the company’s minimum security standards, such as requiring strong passwords.

Replika, for instance, records all the text, photos, and videos posted by users. According to Mozilla, the app “definitely” shared and “possibly” sold behavioural data to advertisers.

Because users can create accounts with weak passwords, such as “11111111,” they’re also highly vulnerable to hacking.

AI girlfriends are concealing secrets

Trackers are widespread on romantic chatbots. On the Romantic AI app, the researchers found at least 24,354 trackers within just a minute of use. 

These trackers can send data to advertisers without users’ explicit consent. Mozilla suspects the apps could be breaching GDPR.

Screenshots from EVA AI Chat Bot & Soulmate