Someone used Shutterstock’s AI image generator to create them.

  • db2@lemmy.world · 9 months ago

    I’m curious what they call disturbing, but also don’t want to see in case they’re right.

      • Just_Pizza_Crust@lemmy.world · 9 months ago

        I think it really depends on what “young girl” means in this context. The title says “children”, but nowhere in the article does it say that. So I’m unsure whether this is another AI-boogeyman article or something else.

          • douglasg14b@lemmy.world · 9 months ago

            There’s a link below of a “young girl” on the toilet.

            It appears to be a young adult, clothed, using a toilet as a seat. Idk why it’s labeled the way it is; it’s really weird.

            However, that somewhat dilutes the notion that “young girl” means children on this site.

          • Just_Pizza_Crust@lemmy.world · 9 months ago

            That’s pure speculation on your part.

            Like another person said, the “young girl” on the toilet looks to be a woman well into her 20s.

  • andrew@lemmy.stuart.fun · 9 months ago

    A note on the page warns, “Shutterstock does not review AI-generated content for compliance with Shutterstock’s content compliance standards,” adding that users must not generate imagery that is “false, misleading, deceptive, harmful, or violent.”

    “Pls don’t be bad mmkay?”

    “We’ve done all we possibly can.”

  • Zak@lemmy.world · 9 months ago

    This may be controversial, but I don’t care what kind of AI-generated images people create as long as it’s obvious they’re not reality. What worries me is the creation of believable false narratives, from explicit deepfakes of real people to completely fictional newsworthy events.

    • StunningGoggles@sh.itjust.works · 9 months ago

      I’ve read that pedophiles are more likely to act on their urges if they have access to real images. I would guess that also applies to AI-generated images, even if they don’t look 100% real, but I could be wrong on that. Whatever stops them from abusing kids is what I’m for.

      • Zak@lemmy.world · 9 months ago

        I want to say research on the subject has been inconclusive overall. I’d certainly update my view given convincing evidence that fictional images lead to abuse of real children.

        Of course, none of that has anything to do with the non-explicit video linked elsewhere in this thread of an adult woman using the toilet.