• Angry_Autist (he/him)@lemmy.world
      11 days ago

      Not really. LLMs can’t keep a plot coherent for more than a few paragraphs, and they constantly context-switch.

      Add on top of that, we’re currently at roughly 90% of what’s possible with LLM technology. The brightest minds in AI have realized there is no singularity curve shooting off to infinity, just a bottleneck in cross-indexing tokens: every token in the context has to be related to every other one, so the number of connections grows quadratically with context length.

      Every additional token means greater overhead, and while our hardware improves roughly linearly, the cost of relating each new token to all the existing ones keeps climbing. That’s an unsustainable curve, bounded by total global processing capacity, and we’d need to triple that before the next generation of LLMs can handle enough tokens to make a real difference in quality.
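To make the scaling point concrete, here’s a back-of-the-envelope sketch (my own illustration, not from the comment): standard transformer self-attention compares every token in the context with every other token, so the comparison count grows with the square of the context length even as hardware improves linearly.

```python
# Hypothetical illustration of quadratic attention cost: in a standard
# transformer, each of n context tokens attends to all n tokens, giving
# n * n token-to-token comparisons per attention pass.
def attention_pairs(context_len: int) -> int:
    """Number of token-to-token comparisons in one self-attention pass."""
    return context_len * context_len

# Growing the context 10x grows the comparison count 100x.
for n in (1_000, 10_000, 100_000):
    print(f"{n:>7} tokens -> {attention_pairs(n):>18,} comparisons")
```

This is why a 10x longer context costs roughly 100x the attention work, which is the bottleneck the comment is gesturing at (real systems mitigate this with tricks like sparse or windowed attention, but the baseline scaling is quadratic).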

      On top of all that, I’ve got 30 years left to live at most, and I’m pretty confident we won’t see writers lose out in that time.

        • Angry_Autist (he/him)@lemmy.world
          10 days ago

          I’m old

          And if you’ve seen long-form LLM output, you know why publishers aren’t ditching human writers for the foreseeable future.

          • drunkpostdisaster@lemmy.world
            10 days ago

            I doubt they care, as long as it saves them money. Maybe they’ll use editors to clean things up, but sooner or later they’ll pull the trigger.