"To be honest, AI friends are not your friends," says Misha Rykov, a Mozilla researcher from the project “*Privacy Not Included”.
"Although they are marketed as something that will boost your mental health and well-being, they specialize in delivering addiction, loneliness and toxicity, all while seeking as much data from you as possible."
Mozilla reviewed 11 different "romantic" AI chatbots, including the popular apps Replika, Chai, Romantic AI, EVA AI Chat Bot & Soulmate, and CrushOn.AI. Every one of them earned the *Privacy Not Included warning label, placing these chatbots among the worst product categories Mozilla has ever evaluated.
You've heard plenty of stories about data problems before, but according to Mozilla, these AI "friends" violate your privacy in "disturbing new ways." CrushOn.AI, for example, collects details about your sexual health, medication use, and gender-affirming care. About 90% of the apps may sell or share their users' data for targeted advertising and other purposes, and more than half won't let you delete the data they collect.
Security was also an issue. Only one app, Genesia AI Friend & Partner, met Mozilla's minimum security standards.
One of Mozilla's most striking findings came when it measured the trackers in these apps: small pieces of code that collect data and share it with other companies for advertising and other purposes.
Mozilla found that the romantic AI apps activated an average of 2,663 trackers per minute, though Romantic AI pulled that average up by calling 24,354 trackers in just one minute of use.
The privacy mess is all the more troubling because these apps actively encourage you to share details far more personal than anything you'd give a typical app. EVA AI Chat Bot & Soulmate prompts users to "share all your secrets and desires" and asks for photos and voice recordings. Notably, EVA is the only chatbot reviewed that doesn't disclose how it uses this data.