A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable for plainly and openly demonstrating one of the most severe harms of generative AI tools: how easily they can be used to create nonconsensual pornography of ordinary people.
This is only going to get easier. The djinn is out of the bottle.
That doesn’t mean distribution should be legal.
People are going to do what they’re going to do, and the existence of these tools isn’t an argument for putting spyware on everyone’s computer to catch it, or whatever other extreme you want to take it to.
But distributing nudes of someone without their consent, real or fake, should be treated as the sexual harassment it clearly is and result in meaningful criminal prosecution.
It almost always makes more sense to ban the action, not the tool, especially for tools with such generalized use cases.