AI girlfriend site breached, user fantasies stolen [updated]

A hacker has stolen a massive database of users’ interactions with their sexual partner chatbots, according to 404 Media.

The breached service, Muah.ai, describes itself as a platform that lets people engage in AI-powered companion NSFW chat, exchange photos, and even have voice chats.

As you can imagine, data like this is very sensitive, so the site assures customers that communications are encrypted and says it doesn’t sell any data to third parties.

“Absolute privacy. Encrypted communication. Delete account with ease. We do not sell any data to any 3rd party.”

Absolute privacy promised

The stolen data, however, tells a different story. It includes chatbot prompts that reveal users’ sexual fantasies. These prompts are in turn linked to email addresses, many of which appear to be personal accounts with users’ real names.

Muah.ai says it believes in freedom of speech, and to uphold that right it states:

“AI technology should be for everyone, and its use case to be decided by each mature, individual adult. So that means we don’t actively censor or filter AI. So any topic can be discussed without running into a wall.”

Unfortunately, that means that filth is created to satisfy the needs of some sick users, and some of the data contains horrifying explicit references to children.

Presumably those users in particular don’t want their fantasies discovered, which is exactly what might happen when those fantasies are linked to their email addresses.

The hacker describes the platform as “a handful of open-source projects duct-taped together.” Apparently, it was no trouble at all to find a vulnerability that provided access to the platform’s database.

The administrator of Muah.ai says the hack was noticed a week ago and claims it must have been sponsored by competitors in the “uncensored AI industry.” Which, who knew, seems to be the next big thing.

The administrator also said that Muah.ai employs a team of moderators who suspend and delete ALL child-related chatbots on its card gallery (where users share their creations), Discord, Reddit, and so on. In reality, however, when two people posted about a reportedly underage AI character on the site’s Discord server, 404 Media claims a moderator told the users not to “post that shit” there, but to “DM each other or something” instead.

Muah.ai is just one example of a new breed of uncensored AI apps, some offering hundreds of role-play scenarios with chatbots and others designed to behave like a long-term romantic companion.

404 Media says it tried to contact dozens of people whose details were included in the data, among them users who wrote prompts discussing underage sex. Not surprisingly, none of them responded to a request for comment.

Update October 11

There are reports that this information is already being used in active extortion attempts. Whether these attempts are based on actual activity on the platform or solely on the leaked email addresses is not yet known.

Innovation before security

Emerging platforms like these are often rushed into existence because there is money to be made. Unfortunately, that usually happens at the expense of security and privacy, so here are some things to bear in mind:

  • Don’t trust AI platforms that promise privacy and encryption just because they say so
  • Don’t log in with your Google/Facebook/Microsoft credentials, and don’t use your regular email address or phone number
  • Remember that anything you put online, even on a service that promises privacy, risks being made public

Check your digital footprint

If you want to find out what personal data of yours has been exposed online, you can use our free Digital Footprint scan. Fill in the email address you’re curious about (it’s best to submit the one you most frequently use) and we’ll send you a free report.


