One prominent author responds to the revelation that his writing is being used to coach artificial intelligence.

By Stephen King

Non-paywalled link: https://archive.li/8QMmu

  • RyanHeffronPhoto@kbin.social · 1 year ago
    It’s baffling to me to see comments like this, as if the ‘AI’ were some natural intelligence just hanging out, reading books it’s interested in for the hell of it… No. These are software companies illegally using artists’ works (for which we require licensing for commercial use) to develop a commercial, profit-generating product. Whatever the AI’s potential outputs are is irrelevant when the sources used to train it were obtained illegally.

    • FaceDeer@kbin.social · 1 year ago
      These are software companies illegally using artists’ works

      There is nothing illegal about what they’re doing. You may want it to be illegal, but it’s not illegal until laws are actually passed to make it illegal. Things are not illegal by default.

      Copyright only prevents copying works, not analyzing them. The results of the analysis are not the same as the original work.

        • FaceDeer@kbin.social · 1 year ago
          No, it’s not. Something that is merely in the style of something else is not a derivative work. If that were the case, there’d be lawsuits everywhere.

          • anachronist@midwest.social · 1 year ago
            LLMs regurgitate their training set. This has been proven many times. In fact, from what I’ve seen, LLMs are either regurgitating or hallucinating.

            there’d be lawsuits everywhere

            Early days.

    • admiralteal@kbin.social · 1 year ago
      Yeah, and even if it WERE truly intelligent – which these SALAMIs are almost certainly not – it doesn’t even matter.

      A human and a robot are not the same. They have different needs and must be afforded different moral protections. Someone can buy a book, read it, learn from it, and incorporate things they learned from that experience into their own future work. They may transform it creatively, or they may plagiarize it, or it may rest in some grey area in between where it isn’t 100% clear whether it was novel or plagiarized. All this is also true for an LLM “AI”. But whether or not this process is fundamentally the same isn’t even a relevant question.

      Copyright law isn’t something that exists because it is a pure moral good to protect the creative output of a person from theft. It would be far more ethical to say that all the outputs of human intellect should be shared freely and widely for all people to use, unencumbered by such things. But if creativity is rewarded only with starvation, creativity will go away, so copyright exists as a compromise to try to ensure there is food in the bellies of artists. And with it, we have an understanding that there is a LOT of unclear border space where one artist may feed on the output of another, hopefully growing the pot for everyone.

      The only way to fit generative bots into the philosophical framework of copyright is to demand that the generative bots keep food in the bellies of the artists. Currently, they threaten it. It’s just that simple. People act like it’s somehow an important question whether they “learn” the same way people do, but the question doesn’t matter at all. Robots don’t get the same leeway and protection afforded to humans because robots do not need to eat.

      • Storksforlegs@beehaw.org · 1 year ago
        Robots don’t get the same leeway and protection afforded to humans because robots do not need to eat.

        Well said.