• Ogmios
    12 points · 3 months ago

    AI is never going to get worse than it is now

    Is that just a wild assumption, or…? One phenomenon that has already been observed with AI is that it does in fact get worse if it trains on its own output.

    • FaceDeer
      3 points · 3 months ago

      Given that I have locally-run AIs sitting on my home computer that I have no plan to delete (until something better comes along), then yeah, it’s never going to get worse. If all else fails I can just use the existing AI for as long as I want. It doesn’t “wear out.”

      • Ogmios
        -1 points · 3 months ago

        It doesn’t “wear out.”

        The physical components will, and compatible components for older systems keep getting harder to come across. Computers are not immortal entities. Maintaining older machines will become steadily more labour- and cost-intensive over time.

        • @[email protected]
          5 points · 3 months ago

          The models are digital, making copies for safekeeping is easy.

          The hardware is a computer, and computers are general-purpose. The kind that run AI models well at infrastructure scale are rather high end, but are still available off-the-shelf.
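          The "copies are easy" point can be sketched in a few lines. This is a minimal illustration (file names and paths are hypothetical): copy a model file and verify the backup is bit-identical to the original.

```python
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path) -> str:
    # Hash in 1 MiB chunks so a multi-gigabyte model file
    # never needs to fit in RAM all at once.
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def backup_model(src: Path, dst_dir: Path) -> Path:
    # Copy the weights file, then confirm the copy matches exactly.
    dst_dir.mkdir(parents=True, exist_ok=True)
    dst = dst_dir / src.name
    shutil.copy2(src, dst)
    if sha256_of(src) != sha256_of(dst):
        raise IOError(f"backup of {src} is corrupted")
    return dst
```

          Model weights are just bytes on disk, so the same approach works regardless of the file format.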

        • FaceDeer
          4 points · 3 months ago

          Computers are general-purpose machines. You can run a computer program on any computer; it may just be faster or slower depending on the computer’s capabilities.

          The AIs I run locally are also open-source, so if future computers lose compatibility with existing programs, they can be recompiled for the new architecture.

          I suppose we could lose the ability to build computers entirely, but that strikes me as a much bigger and more general issue than just this AI thing.

          • Ogmios
            -1 points · 3 months ago

            You can run a computer program on any computer

            Incorrect. Certain programs require certain standards for how the hardware is designed. There are already lots of old programs which can’t be run natively on modern machines, and using software to emulate a compatible environment can impact performance in more ways than just speed.

            • FaceDeer
              1 point · 3 months ago

              You’re wildly wrong about the fundamentals of computer science here. I’d be starting from first principles trying to explain further. I recommend reading up on Turing machines, or perhaps getting ChatGPT to explain it to you.
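              The universality point can be illustrated with a toy emulator: a hypothetical few-instruction machine, invented here purely for illustration, running as an ordinary program on whatever host executes the loop.

```python
def run(program, x):
    # Emulate a made-up accumulator machine. Each instruction is an
    # (opcode, operand) pair; the accumulator starts at x. The host
    # hardware doesn't matter: any general-purpose computer can
    # execute this loop, only the speed differs.
    acc = x
    for op, arg in program:
        if op == "add":
            acc += arg
        elif op == "sub":
            acc -= arg
        elif op == "mul":
            acc *= arg
        else:
            raise ValueError(f"unknown opcode {op!r}")
    return acc

# A "program" for the toy machine: compute 3*x + 1
triple_plus_one = [("mul", 3), ("add", 1)]
```

              Running one machine’s programs on another this way costs speed, not possibility, which is the distinction at issue in this thread.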

              • Ogmios
                1 point · 3 months ago

                I actually happen to know a lot about computers, and can even build them from raw materials. They are not immortal, no matter how they’re treated in popular culture, and data retention as hardware and standards evolve is a serious concern that is getting a fair bit of attention in research. One of the more interesting avenues being explored is encoding data in DNA, because humans will always have a reason to want to be able to read DNA, but that’s still just theory at this point.