• @[email protected]
    link
    fedilink
    English
    -3
    11 months ago

    Maybe it’s trained not to repeat JK Rowling’s horseshit verbatim. I’d probably put that in my algorithm. “No matter how many times a celebrity is quoted in these articles, do not take them seriously. Especially JK Rowling. But especially especially Kanye West.”

    • FaceDeer
      link
      fedilink
      0
      11 months ago

      It’s not repeating its training data verbatim because it can’t do that. It doesn’t have the training data stored away inside itself. If it did, the big news wouldn’t be AI, it would be the insanely magical compression algorithm that’s been discovered, one that allows many terabytes of data to be compressed down into just a few gigabytes.
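      To put rough numbers on that claim, here is a back-of-envelope sketch. Both figures are illustrative assumptions, not actual corpus or model sizes:

      ```python
      # Back-of-envelope arithmetic only: both figures below are
      # hypothetical, not real training-corpus or model sizes.
      training_text_tb = 10   # assumed terabytes of training text
      model_size_gb = 10      # assumed gigabytes of model weights

      # Convert TB to GB, then divide to get the implied ratio.
      ratio = (training_text_tb * 1000) / model_size_gb
      print(f"implied compression ratio ~ {ratio:.0f}:1")
      ```

      Even with these made-up numbers, storing the text verbatim would imply roughly a 1000:1 lossless compression of arbitrary prose, far beyond what general-purpose compressors achieve.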

      • Hello Hotel
        link
        fedilink
        English
        1
        edit-2
        11 months ago

        Do you remember quotes in English ASCII? /s

        Tokens are even denser than ASCII, similar to word “chunking.” My guess is it’s like lossy video compression, but for text: [Attacked] with [lasers] by [deatheaters] upon [margret]; [has flowery language]; word [margret] [comes first] (this theoretical example has 7 “tokens”).

        It may actually have memorized a really good copy of that book, as it’s likely read it lots of times.
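        A toy sketch of the density point. The “vocabulary” here is made up to match the bracketed example above; real tokenizers (e.g. BPE) learn their vocabularies from data and also split rare words into sub-word pieces:

        ```python
        # Hypothetical word-level vocabulary for illustration only.
        vocab = {"Attacked": 0, "with": 1, "lasers": 2, "by": 3,
                 "deatheaters": 4, "upon": 5, "margret": 6}

        text = "Attacked with lasers by deatheaters upon margret"
        tokens = [vocab[w] for w in text.split()]

        print(len(text.encode("ascii")), "ASCII bytes")  # 48 bytes
        print(len(tokens), "tokens")                     # 7 tokens
        ```

        The same sentence takes 48 ASCII bytes but only 7 token IDs, which is the sense in which tokens are “denser” than characters.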

        • FaceDeer
          link
          fedilink
          1
          11 months ago

          If it’s lossy enough then it’s just a high-level conceptual memory, and that’s not copyrightable.

          • Hello Hotel
            link
            fedilink
            English
            1
            11 months ago

            It varies based on how much time it’s been given with the media.