Meta has filed a lawsuit against a company behind AI apps that create fake nude images of people without their consent. The legal action follows a CBS News investigation that exposed hundreds of ads for such apps running across Meta's platforms, including Facebook, Instagram, Messenger, and Threads.
The lawsuit targets Joy Timeline, a Hong Kong-based company. According to Meta, Joy Timeline repeatedly tried to circumvent Meta's ad review systems to promote its CrushAI "nudify" apps, which let users digitally undress people in photos, most often targeting women, including celebrities.
Meta said the lawsuit shows how seriously it takes this abuse. “We are committed to protecting our community,” the company said. “We will take legal action when needed.”
The CBS News investigation found that many of these ads targeted men in the US, UK, and EU. Beyond violating Meta's platform rules, such apps fuel blackmail, sextortion schemes, and child exploitation.
Despite Meta's efforts to block the ads and remove the accounts behind them, CBS reported that some ads for AI deepfake tools remained live on Instagram even after its investigation was published.
This is not the first time AI deepfake apps have spread online. Previous reports have prompted app removals by Apple and Google, and even a lawsuit by the city of San Francisco.
Meta's lawsuit is one of the strongest moves yet against the growing problem of AI-powered image abuse.
Source: CBS News, Meta