• pokexpert30@lemmy.pussthecat.org · 8 months ago

    I kinda fail to see the problem. The GPU owner doesn’t see what workload they’re processing. The pr0n company is willing to pay for GPU power. The GPU owner wants to earn money with their hardware. There’s demand, there’s supply, and nobody is getting hurt (AI pr0n is not illegal, at least for now), so let people do what they want.

    • mavu@discuss.tchncs.de · 8 months ago

      The problem is that they are clearly targeting minors who don’t pay their own electricity bill, and don’t even necessarily realize that they are paying for their Fortnite skins with their parents’ money. Also: there is a good chance that the generated pictures are at some point present in the filesystem of the generating computer, and that alone is a giant can of worms that can even lead to legal trouble if the person lives in a country where some or all kinds of pornography are illegal.

      This is a shitty grift, abusing people who don’t understand the consequences of the software.

  • PaupersSerenade@sh.itjust.works · 8 months ago

    I’ll be a minority voice considering the other comments, but maybe just pay for OnlyFans or whatever you guys use. I’m a generally attractive woman (as far as I can surmise from interactions while trying to date), and I really don’t like the idea of my likeness being used for something like this. Get your jollies off, but try to be a bit consensual about it. Is that so much to ask?

    • ArbiterXero@lemmy.world · 8 months ago

      So I’m not disagreeing with you, but you’re assuming they’re making deepfake images, and the article doesn’t specify that. In fact, I’d bet that it’s just AI-generated “people” who don’t exist.

      What about AI porn of a person that doesn’t exist?

      • Arbiter@lemmy.world · 8 months ago

        However, one of Salad’s clients is CivitAi, a platform for sharing AI-generated images which has previously been investigated by 404 Media. That investigation found that the service hosts image-generating AI models of specific people, whose likeness can then be combined with pornographic AI models to generate non-consensual sexual images.