I just want to make funny pictures.

  • EldritchFeminity@lemmy.blahaj.zone · 2 days ago

    The issue has never been the tech itself. Image generators are basically just a more complicated Gaussian Blur tool.

    The issue is, and always has been, the ethics involved in the creation of the tools. The companies steal the work they use to train these models without paying the artists for their efforts (wage theft). They’ve outright said that they couldn’t afford to make these tools if they had to pay copyright fees for the images that they scrape from the internet. They replace jobs with AI tools that aren’t fit for the task because it’s cheaper to fire people. They train these models on the works of those employees. When you pay for a subscription to these things, you’re paying a corporation to do all the things we hate about late stage capitalism.

    • DegenerateSupreme@lemmy.zip · 2 days ago

      Agreed. The problem is that so many (including in this thread) argue that training AI models is no different from training humans: that a human brain inspired by what it sees is functionally the same thing.

      My response to why there is still an ethical difference revolves around two arguments: scale and profession.

      Scale: AI models’ sheer image output makes them a threat to artists where other human artists are not. One artist clearly profiting off another’s style can still be inspiration, and even part of the former’s path toward their own style; however, the functional equivalent of ten thousand artists doing the same is something else entirely. The art is produced at a scale that could drown out the original artist’s work, without which such image generation wouldn’t be possible in the first place.

      Profession: Those profiting from AI art, which relies on the unpaid scraping of artists' work for data sets, are not themselves artists. They are programmers, engineers, and the CEOs and stakeholders who can even afford the ridiculous capital necessary in the first place to utilize this technology at scale. The idea that this is just a "continuation of the chain of inspiration from which all artists benefit" is nonsense.

      As the popular adage goes nowadays, “AI models allow wealth to access skill while forbidding skill to access wealth.”

    • desktop_user@lemmy.blahaj.zone · 2 days ago

      I think that, in many ways, AI is just worsening the problems of excessive copyright terms. Copyright should last 20 years, maybe 40 if it can be proven that the work is actively in use.

      • EldritchFeminity@lemmy.blahaj.zone · 2 days ago

        Copyright is its own whole can of worms that could have entire essays just about how it and AI cause problems. But the issue at hand really comes down to one simple question:

        Is a man not entitled to the sweat of his brow?

        “No!” Says society. “It’s not worth anything.”

        “No!” Says the prompter. “It belongs to the people.”

        “No!” Says the corporation. “It belongs to me.”

        • ClamDrinker@lemmy.world · 1 day ago (edited)

          I think you're making the mistake of assuming that disagreement with your stance means someone would answer "no" to these questions. Simply put, it's a strawman.

          Most (yes, even corporations, albeit much less so the larger ones) would answer "Yes" to this question at face value, because they would want the same for their own "sweat of the brow." But certain uses after the work is created no longer have a definitive "Yes" as their answer, which is why your "simple question" is not an accurate representation: it draws no distinction between those uses. You cannot stop your publicly posted work from being analyzed, by human or computer; this is firmly established. As others have put it in this thread, restricting analysis would be detrimental to artists and everyone else alike. It would quite literally slow society's ability to advance, if not halt it completely, since most research requires analysis of existing data, and most of that analysis is computer-assisted.

          Artists have always been undervalued; I'll give you that. But to mitigate that, we should give artists better protections that don't rely on breaking down other freedoms, such as UBI. And I wish people who are against AI would focus on that, since it's something you could actually get most of society to agree on, and it would actually help artists. Fighting a technology that, besides its negatives, also provides great positives is a losing battle.

          • EldritchFeminity@lemmy.blahaj.zone · 1 day ago

            It's not about "analysis" but about for-profit use. Public domain still falls under Fair Use. I think you're being too optimistic about support for UBI, but I absolutely agree on that point. There are countries that believe UBI will be necessary within a decade's time, as more and more of the population becomes permanently unemployed by jobs being replaced. I'd say myself that nobody would really care if their livelihoods weren't at stake (except for dealing with the people who look down on artists and claim that writing prompts makes them just as good as artists, if not better). As it stands, artists are already forming walled-off communities to keep their work from being publicly available, and creating software to poison the models trained on it. So either art becomes largely inaccessible to the public, or some form of horrible copyright action is taken, because those are the only options available to artists.

            Ultimately, I'd like a licensing system put in place, like for open source software, where people can license their works and companies have to cite the sources of their training data. Academics have to cite their sources in research, and holding for-profit companies to the same standard seems like a step in the right direction. Simply require your data scraper to keep track of where it got its data, in a publicly available list. That way, if they've used work they legally shouldn't have, it can be proven.
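            The requirement above is cheap to satisfy on the engineering side. A minimal sketch of what that provenance tracking might look like (all names here are hypothetical, not any real scraper's API; assumes the scraper can see a declared license per item):

            ```python
            import csv
            import datetime


            class ProvenanceLog:
                """Records where each scraped training sample came from, so sources can be audited later."""

                def __init__(self):
                    self.entries = []

                def record(self, source_url, license_name, fetched_at=None):
                    # Every scraped item gets one row: source URL, declared license, UTC timestamp.
                    ts = fetched_at or datetime.datetime.now(datetime.timezone.utc)
                    self.entries.append({
                        "source_url": source_url,
                        "license": license_name,
                        "fetched_at": ts.isoformat(),
                    })

                def export(self, path):
                    # Publish the list as CSV so anyone can check whether their work was used.
                    with open(path, "w", newline="") as f:
                        writer = csv.DictWriter(f, fieldnames=["source_url", "license", "fetched_at"])
                        writer.writeheader()
                        writer.writerows(self.entries)


            log = ProvenanceLog()
            log.record("https://example.com/art/123", "CC-BY-4.0")
            log.export("training_sources.csv")
            ```

            The hard part isn't the code, it's compelling companies to run something like it and publish the output; but it shows the "publicly available list" demand is technically trivial.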

        • LainTrain@lemmy.dbzer0.com · 1 day ago

          Does it not belong to the people? That saying is a shitty analogy for this. You're entitled to the sweat of your brow, but not to more from society; and if you use the free infrastructure of the commons to share your work, it belongs to the commons.