• honey_im_meat_grinding@lemmy.blahaj.zone · 1 year ago

    I sympathize with artists who might lose their income if AI becomes big (as an artist, it’s something that worries me too), but I don’t think applying copyright to data sets is a good thing long term. Think about it: if copyright applies to AI data sets, all that does is one thing: kill open-source AI image generation. It’ll be a small thorn in the side of corporations that want to use AI, before eventually handing them monopolies over the largest, most useful AI data sets in the world, while no one else can afford to replicate them. They’ll pay us artists peanuts, if anything at all, and use large platforms like Twitter, Facebook, Instagram, Artstation, and others, which can change their terms of service to say that any artist who uploads art allows it to be used for AI training, with an opt-out hidden deep in the preferences if we’re lucky. And if you want access to those data sources and licenses, you’ll have to pay the platform more than average people can afford.

    • Phanatik@kbin.social · 1 year ago

      I completely disagree. The vast majority of people won’t be using the open source tools unless the more popular ones become open source (which I don’t think is likely). Also, a tool being open source doesn’t mean it’s allowed to trample over an artist’s rights to their work.

      > They’ll just pay us artists peanuts if anything at all, and use large platforms like Twitter, Facebook, Instagram, Artstation, and others who can change the terms of service to say any artist allows their uploaded art to be used for AI training - with an opt out hidden deep in the preferences if we’re lucky.

      This is going to happen anyway. Copyright law has to catch up and protect against this: just because they put it in their terms of service doesn’t mean it can’t be legislated against.

      This was the whole problem with OpenAI anyway. They decided to use the internet as their own personal dataset and are now charging for it.

      • honey_im_meat_grinding@lemmy.blahaj.zone · 1 year ago

        I get where you’re coming from, but I don’t think even more private property is the answer here. This is ultimately a question of economics: we don’t like that a) we’re being put out of jobs, and b) it’s being done without our consent or anything in return. These are problems we can address without throwing even more monopolisation power into the equation, which is what IP is all about: giving artists a monopoly over their own content, something that mostly benefits large media corporations, not independent artists.

        I’d much rather we tackled the problem of automation taking our jobs head-on, via something like UBI or negative income taxes, than with a one-off solution like even more copyright, which only really serves to slow this inevitability down. You can regulate AI in as many ways as you want, but that adds a ton of meaningless friction to getting stuff done (e.g. you’d have to prove your art wasn’t made by AI somehow), when the much easier and more effective solution is something like UBI.

        The consent question needs a more radical solution, like democratising work, something Finland has done with its grocery stores: the biggest chains are democratically owned and run by their members (consumer co-ops). We’ll probably get to something like that on a large scale… eventually, but I think it’s a bigger hurdle than UBI. Then you’d be able to vote on how an organisation operates, including whether and how it builds AI data sets.

        • archomrade [he/him]@midwest.social · 1 year ago

          I appreciate this take, especially when applying copyright in the manner being proposed extends the already ambiguous grey area of “fair use”, which is most often used against artists.

      • Pulp@lemmy.dbzer0.com · 1 year ago

        Who gives a shit about artists rights? We need to move on with the progress like we always have.

        • Phanatik@kbin.social · 1 year ago

          We should give a shit about everyone’s right to put food on the table. Compassion can be exhausting, but it’s important to recognise that someone else’s problem might be yours one day, and you’d wish someone were there to help you.

    • krnl386@lemmy.ca · 1 year ago

      I sympathize with artists too, but to a point. I predict that:

      1. AI art will eventually overtake human art; that is, human art jobs will mostly be replaced. Day-to-day art (e.g. ads, illustrations, decorations, billboards, etc.) will likely be AI-generated.
      2. Human art will become something akin to a home-cooked meal in a sea of fast-food art. This might actually make some artists famous and rich.
      3. Humans will continue to learn art, but more as a pastime/hobby/mental exercise.
      • ParsnipWitch@feddit.de · 1 year ago

        For points 2 and 3, art is too expensive and time-consuming to learn. I feel a lot of people vastly underestimate the time and cost it takes to become a decent artist.

    • CloverSi@lemmy.comfysnug.space · 1 year ago

      This was my thinking too. In principle I support restrictions on the data AI can be trained on, no question. But practically speaking, the only difference restricting it makes is giving whichever companies gobble up the most IP the sole ability to make legal AI art. If a decision like that were made, there would be no more Stable Diffusion, available to anyone and everyone for free; the only legal options would be e.g. Adobe Firefly.