• qyron@sopuli.xyz · 5 days ago

    Somebody please explain to me, in very simple words, why I need an AI-capable chip in my personal computer. And under Linux, for the most part.

    • TheGrandNagus@lemmy.world · 5 days ago

      Offline translation is pretty great. Some image editing tools are pretty great. Games may utilise them in the future. Offline image recognition for searching for images (e.g. “show me pics of grandma”), etc.

      It’s not particularly widely used now, but the same was true for hardware video encode/decode, hardware accelerated encryption/decryption, etc.
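
      As a rough illustration of the image-search use case, here's a minimal sketch of offline "show me pics of grandma" search. The sentence-transformers package, the clip-ViT-B-32 model, and the ~/Pictures path are my assumptions; on an NPU-equipped machine the embedding step is what a vendor runtime would offload, but this just runs wherever PyTorch puts it:

      ```python
      # Offline image search sketch: embed images once, then match text queries
      # locally. Assumes sentence-transformers and a locally cached CLIP model.
      from pathlib import Path

      from PIL import Image
      from sentence_transformers import SentenceTransformer, util

      model = SentenceTransformer("clip-ViT-B-32")  # downloaded once, then fully offline

      # Embed the whole photo library once; in practice you'd cache these vectors.
      paths = sorted(Path("~/Pictures").expanduser().glob("*.jpg"))
      image_emb = model.encode([Image.open(p) for p in paths], convert_to_tensor=True)

      # Embed the text query and rank images by cosine similarity.
      query_emb = model.encode("a photo of grandma", convert_to_tensor=True)
      scores = util.cos_sim(query_emb, image_emb)[0]
      for score, path in sorted(zip(scores.tolist(), paths), reverse=True)[:5]:
          print(f"{score:.3f}  {path}")
      ```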

      • Justin@lemmy.jlh.name · 5 days ago

        Image processing is pretty intense and would likely be handled by the GPU. Efficient embedded NN accelerators like this are meant to be used for more passive things, like noise cancellation or, like you mentioned, translation.
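
        For what it's worth, here's roughly how an app could prefer an NPU and fall back to the CPU with ONNX Runtime. "VitisAIExecutionProvider" is the provider AMD's Ryzen AI stack registers, but whether it shows up depends on the installed runtime, and the model file name is made up:

        ```python
        # Sketch: pick an NPU execution provider if the runtime exposes one,
        # otherwise fall back to the CPU. The model path is hypothetical.
        import onnxruntime as ort

        preferred = ["VitisAIExecutionProvider", "CPUExecutionProvider"]
        available = ort.get_available_providers()
        providers = [p for p in preferred if p in available] or ["CPUExecutionProvider"]

        session = ort.InferenceSession("noise_suppression.onnx", providers=providers)
        print("running on:", session.get_providers()[0])
        ```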

        • KingRandomGuy@lemmy.world · 5 days ago

          I don’t know the architecture of the AI accelerator in Ryzen processors, but I do know a fair amount of image deblurring and denoising tools run on the neural engine on Apple Silicon. The neural engine is good enough for a lot of tasks, provided that your model only uses relatively simple operators and doesn’t need full precision.
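
          To make that concrete, this is roughly what targeting the neural engine looks like with coremltools. The tiny convolutional model is just a stand-in for a real denoiser, and the FP16 precision plus CPU_AND_NE compute unit are the settings behind the "simple operators, not full precision" point:

          ```python
          # Sketch: convert a toy PyTorch "denoiser" to Core ML targeting the
          # Apple neural engine. FP16 and simple conv/ReLU ops are ANE-friendly.
          import coremltools as ct
          import torch

          model = torch.nn.Sequential(
              torch.nn.Conv2d(3, 16, 3, padding=1),
              torch.nn.ReLU(),
              torch.nn.Conv2d(16, 3, 3, padding=1),
          ).eval()

          example = torch.rand(1, 3, 256, 256)
          traced = torch.jit.trace(model, example)

          mlmodel = ct.convert(
              traced,
              inputs=[ct.TensorType(name="image", shape=example.shape)],
              compute_precision=ct.precision.FLOAT16,   # ANE doesn't do full FP32
              compute_units=ct.ComputeUnit.CPU_AND_NE,  # prefer the neural engine
          )
          mlmodel.save("denoiser.mlpackage")
          ```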

    • N.E.P.T.R@lemmy.blahaj.zone · 5 days ago

      This isn’t for you, nor for me. I don’t need an AI-capable chip; I could just use my GPU if for some reason I wanted to run a local transformer model.
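
      In case it helps, running a local transformer on the GPU is pretty short with the transformers library; the small Qwen model here is just an example of something that fits on a consumer card:

      ```python
      # Sketch: run a small local LLM on the GPU if one is available, else the CPU.
      # The model name is only an example of a small instruct model.
      import torch
      from transformers import pipeline

      generate = pipeline(
          "text-generation",
          model="Qwen/Qwen2.5-0.5B-Instruct",
          device=0 if torch.cuda.is_available() else -1,
      )
      print(generate("Why would anyone need an NPU?", max_new_tokens=50)[0]["generated_text"])
      ```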

    • Mwa@lemm.ee · 5 days ago

      IKR, I’m fine with using the CPU and GPU to run LLMs locally (even though I’m trying to avoid using LLMs), but NPUs? Seriously.