Thousands of authors demand payment from AI companies for use of copyrighted works

Thousands of published authors are requesting payment from tech companies for the use of their copyrighted works in training artificial intelligence tools, marking the latest intellectual property critique to target AI development.

  • bouncing@partizle.com · 1 year ago

    If you say “AI read my book and output a similar story, you owe me money,” then how is that different from “Joe read my book and wrote a similar story, you owe me money”?

    You’re bounded by the limits of your flesh. AI is not. The $12 you spent buying a book at Barnes & Noble was based on the economy of scarcity that your human abilities constrain you to.

    It’s hard to say that the value proposition is the same for human vs AI.

    • jecxjo@midwest.social · 1 year ago

      We are making an assumption that humans do “human things”. If I wrote a derivative work of your $12 book, does it matter that the way I wrote it was to use pen and paper to do a statistical analysis of your work and find the “next best word” until I had a story? Sure, my book took 30 years to write, but if I followed the same math as an AI, would that matter?
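
      Purely as an illustration, that pen-and-paper “next best word” process could be sketched as a toy bigram model (the sample text and function names here are made up for the example, and real LLMs work very differently):

      ```python
      # Toy illustration of "find the next best word": count word pairs in a
      # source text, then repeatedly pick the most frequent follower.
      # This is NOT how a real LLM works; it only sketches the idea above.
      from collections import Counter, defaultdict

      def build_bigrams(text: str) -> dict[str, Counter]:
          words = text.split()
          counts: dict[str, Counter] = defaultdict(Counter)
          for current, nxt in zip(words, words[1:]):
              counts[current][nxt] += 1
          return counts

      def generate(counts: dict[str, Counter], start: str, length: int = 20) -> str:
          out = [start]
          for _ in range(length):
              followers = counts.get(out[-1])
              if not followers:
                  break
              out.append(followers.most_common(1)[0][0])  # the "next best word"
          return " ".join(out)

      source = "the quick brown fox jumps over the lazy dog and the quick fox runs"
      print(generate(build_bigrams(source), "the"))
      ```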

      • bouncing@partizle.com · 1 year ago

        It wouldn’t matter, because derivative works require permission. But I don’t think anyone’s really made a compelling case that OpenAI is actually making directly derivative work.

        The stronger argument is that LLMs are making transformative works, which is normally fair use, but which should still require some form of compensation given the scale of it.

        • jecxjo@midwest.social · 1 year ago

          But no one is complaining about publishing derivative works. The issue is the claim that “the robot brain has full copies of my text, so anything it creates ‘cannot be transformative.’” That doesn’t make sense to me, because my brain made a copy of your book too; it’s just really lossy.

          I think right now we have definitions for these types of works that only loosely fit human actions, mostly because we make poor assumptions about how the human brain works. We often use intent as a guide, which doesn’t always work in an AI scenario.

          • bouncing@partizle.com · 1 year ago

            Yeah, that’s basically it.

            But I think what’s getting overlooked in this conversation is that it probably doesn’t matter whether it’s AI or not. Either new content is derivative or it isn’t. That’s true whether you wrote it or an AI wrote it.

            • jecxjo@midwest.social · 1 year ago

              I agree with that, but do politicians and judges who know absolutely nothing about the subject?

              I had a professor in college who taught cybersecurity. He was renowned in his field and was asked by the RIAA to testify in some cases related to file sharing. I lost respect for him when he intentionally refrained from stating that it wasn’t possible for anyone outside the home network to know what, or who, was actually downloading things. The technology was being ignored, and an invalid view was presented to a judge who couldn’t ELI5 how the internet worked, let alone actual networking protocols.

      • BartsBigBugBag@lemmy.tf · 1 year ago

        It’s not even looking for the next best word. It’s looking for the next best token. It doesn’t know what words are. It reads tokens.
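
        To make that concrete (a hypothetical toy example, not any real tokenizer such as BPE or SentencePiece): the vocabulary is made of sub-word pieces, and text is turned into token IDs before the model ever sees it.

        ```python
        # Toy greedy longest-match tokenizer showing that "token" != "word".
        # The vocabulary below is invented for this example; real tokenizers
        # learn their vocabularies from data.
        TOY_VOCAB = {"un": 0, "believ": 1, "able": 2, "token": 3, "s": 4, " ": 5}

        def tokenize(text: str, vocab: dict[str, int]) -> list[int]:
            ids, i = [], 0
            while i < len(text):
                # take the longest vocabulary piece that matches at position i
                match = max(
                    (piece for piece in vocab if text.startswith(piece, i)),
                    key=len,
                    default=None,
                )
                if match is None:
                    raise ValueError(f"no token for {text[i]!r}")
                ids.append(vocab[match])
                i += len(match)
            return ids

        print(tokenize("unbelievable tokens", TOY_VOCAB))
        # -> [0, 1, 2, 5, 3, 4]: "unbelievable" becomes three tokens, not one word
        ```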

        • jecxjo@midwest.social · 1 year ago

          Good point.

          I could easily see laws created that blanket-outlaw computer-generated output derived from human-created data sets, and suddenly medical and technical advancements stop because the laws were written by people who don’t understand what is going on.