A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable because it plainly and openly shows one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.

  • AquaTofana@lemmy.world
    8 months ago

    I don’t know why you’re being downvoted. Sure, it’s unfortunately been happening for a while, but are we just supposed to keep quiet about it and let it go?

    I’m sorry, someone putting my face on a naked body that’s not mine is one thing, but I really do fear for the people whose likeness gets used in degrading or depraved porn that’s actually believable because it’s AI-generated. That is SO much worse and more psychologically damaging if they find out about it.