Student used AI tool to create fake nude photos of underage classmate in NJ: lawsuit

The lawsuit states that a male classmate used an AI program called ClothOff to generate altered images that kept her face but digitally removed her clothing.

A New Jersey teenager has filed a lawsuit against a company behind an artificial intelligence tool that created fake nude images of her using photos she posted online when she was 14.

The lawsuit states that a male classmate used an AI program called ClothOff to generate altered images that kept her face but digitally removed her clothing, making it appear she was nude. The explicit fake images were then circulated through social media and group chats.

The 17-year-old plaintiff is suing AI/Robotics Venture Strategy 3 Ltd, the company that developed the tool. The lawsuit, filed on her behalf by a Yale Law School professor, several students, and a trial attorney, seeks to have the fake images permanently deleted and to stop the company from using them to train AI models. It also requests the removal of the tool from the internet and financial compensation for emotional distress.

ClothOff remains active online, advertising itself as an app that can “remove clothes from photos.” The company’s website includes a disclaimer stating, "Is it ethical to use AI generators to create images? Using AI to create 'deepnude' style images raises ethical considerations. We encourage users to approach this with an understanding of responsibility and respect for others' privacy, ensuring that the use of undress app is done with full awareness of ethical implications."

The lawsuit comes amid a growing national debate over AI regulation. According to Fox News, more than 45 states have proposed or passed laws criminalizing the creation or distribution of AI-generated sexual images without consent. Under New Jersey law, producing or sharing deceptive AI media can lead to prison time and fines.

At the federal level, the Take It Down Act, signed into law earlier this year, requires companies to remove nonconsensual explicit content within 48 hours of a verified request. Despite these growing legal measures, enforcement remains challenging, as many AI developers operate overseas or through diffuse online networks.