A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10, if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nudes of celebrities, including images of minors in swimsuits, but is particularly notable for so openly demonstrating one of the most severe harms of generative AI tools: how easily they can be used to create nonconsensual pornography of ordinary people.

  • anticurrent@sh.itjust.works
    8 months ago

    We are acting as if throughout history we managed to orient technology so as to keep only the benefits and eliminate the negative effects, while in reality most of the technology we use still comes with both aspects. It is not going to be any different with AI.

  • SendMePhotos@lemmy.world
    8 months ago

    I’d like to share my initial opinion here. “Non-consensual AI-generated nudes” are technically a freedom, no? Like, we can bastardize our presidents, paste people’s photos onto devils or other characters, so why are AI nudes where the line is drawn? The internet made photos of Trump and Putin kissing shirtless.

    • abhibeckert@lemmy.world
      8 months ago

      The internet made photos of Trump and Putin kissing shirtless.

      And is that OK? I mean, I get it, free speech, but just because Congress can’t stop you from expressing something doesn’t mean you actually should. It’s basically bullying.

      Imagine you meet someone you really like at a party, they like you too and look you up on a social network… and find galleries of hardcore porn with you as the star. Only you’re not a porn star, those galleries were created by someone who specifically wanted to hurt you.

      AI porn without consent is clearly illegal in almost every country in the world, and in the ones where it’s not illegal yet, it will be soon. The First Amendment will be a stumbling block, but it’s not an impenetrable wall: Congress can pass laws that limit speech in certain edge cases, and this will be one of them.

      • WaxedWookie@lemmy.world
        8 months ago

        The internet made photos of Trump and Putin kissing shirtless.

        And is that OK?

        I’m going to jump in on this one and say yes - it’s mostly fine.

        I look at these things through the lens of the harm they do and the benefits they deliver - consequentialism and act utilitarianism.

        The benefits are artistic, comedic and political.

        The “harm” is that Putin and/or Trump might feel bad, maybe even enough that they’d kill themselves. All of that gets put back under benefits as far as I’m concerned: they’re both extremely powerful monsters who have done, and will continue to do, incredible harm.

        The real harm is that such works risk normalising this treatment of regular folk, which is genuinely harmful. I think that’s unlikely, but it’s impossible to rule out.

        Similarly, the dissemination of the kinds of AI fakes under discussion is a negative, because they do serious, measurable harm.

        • Mananasi@feddit.nl
          8 months ago

          I think that is okay because there was no intent to create pornography there. It is a political statement. As far as I am concerned that falls under free speech. It is completely different from creating nudes of random people/celebrities with the sole purpose of wanking off to it.

  • JackGreenEarth@lemm.ee
    8 months ago

    That’s a ripoff. It costs them at most $0.10 to do a simple Stable Diffusion img2img pass, and most people could do it themselves; they’re purposefully exploiting people who aren’t tech-savvy.

  • GrymEdm@lemmy.world
    8 months ago

    To people who aren’t sure if this should be illegal or what the big deal is: according to Harvard clinical psychiatrist and instructor Dr. Alok Kanojia (aka Dr. K from HealthyGamerGG), once non-consensual pornography (which deepfakes are classified as) is made public, over half of the people involved will have the urge to kill themselves. There’s also an extreme risk of depression, anger, anxiety, etc. The analogy given is that it’s like watching a video, the next day, of yourself undergoing sex without consent, as if you’d been drugged.

    I’ll admit I used to look at celeb deepfakes, but once I saw that video I stopped immediately and now avoid it as much as I possibly can. I believe porn can be done correctly, with participant protection and respect. Regarding deepfakes/revenge porn, though, that statistic about suicidal ideation puts it outside of anything healthy or ethical. Obviously I can’t make that decision for others or purge the internet, but the fact that there’s such regular and extreme harm for the (what I now know are) victims of non-consensual porn makes it personally immoral. Not because of religion or society, but because I want my entertainment to be at minimum consensual, and hopefully fun and exciting, not killing people or ruining their happiness.

    I get that people say this is the new normal, but it’s already resulted in trauma and will almost certainly continue to do so. It may even get worse as the deepfakes become more realistic.

    • lud@lemm.ee
      8 months ago

      once non-consensual pornography (which deepfakes are classified as) is made public over half of people involved will have the urge to kill themselves.

      Not saying that they are justified or anything, but wouldn’t people stop caring about them once they reach a critical mass? I mean, if everyone could make fakes like these, I think people would care less, since they could just dismiss them as fakes.

      • Drewelite@lemmynsfw.com
        8 months ago

        I think this is realistically the only way forward: to delegitimize any kind of nudes that might show up of a person. Which could be good. But I have no doubt that high schools will be flooded with bullies sending porn around of innocent victims. As much as we delegitimize it as a society, it’ll still have an effect. It’s like social media: even though it’s now normal for anyone to reach you at any time, it still makes cyberbullying more hurtful.

      • eatthecake@lemmy.world
        8 months ago

        The analogy given is it’s like watching video the next day of yourself undergoing sex without consent as if you’d been drugged.

        You want a world where people just desensitise themselves, through repeated exposure, to things that make them want to die. I think you’ll get a whole lot of complex PTSD instead.

        • stephen01king@lemmy.zip
          8 months ago

          People used to think their lives were over if they were caught alone with someone of the opposite sex they weren’t married to. That is no longer the case in Western countries, due to normalisation.

          The thing that makes them want to die is societal pressure, not the act itself. In this case, if the societal pressure from having fake nudes of yourself spread around is removed, most of the harm done to people should be neutralised.

          • too_much_too_soon@lemmy.world
            8 months ago

            Agreed.

            I’ve been in HR since ’95, so yeah, I’m old, lol. Noticed a shift in how we view old social media posts? Those wild nights you don’t remember but got posted? If they’re at least a decade old, they’re not as big a deal now. But if it was super illegal, immoral, or harmful, you’re still in trouble.

            As for nudes, they can be both the problem and the solution.

            To sum it up, like in the animated movie ‘The Incredibles’: ‘If everyone’s special, then no one is.’ If no image can be trusted, no excuse can be doubted. ‘It wasn’t me’ becomes the go-to, and nobody needs to feel ashamed or suicidal over something fake that happens to many people.

            Of course, this is oversimplifying things, but in the real world society will adjust. People won’t kill themselves over this. It might even be a good thing for those on the cusp of AI fakery and improper real-world behaviour: ‘It’s not me. It’s clearly AI; I would never behave so outrageously.’

    • conciselyverbose@sh.itjust.works
      8 months ago

      Doesn’t mean distribution should be legal.

      People are going to do what they’re going to do, and the existence of this isn’t an argument to put spyware on everyone’s computer to catch it, or whatever crazy extreme you can take it to.

      But distributing nudes of someone without their consent, real or fake, should be treated as the clear sexual harassment it is, and result in meaningful criminal prosecution.

      • treadful@lemmy.zip
        8 months ago

        Almost always it makes more sense to ban the action, not the tool. Especially for tools with such generalized use cases.