• Deiskos@lemmy.world · 1 year ago

    If it’s all in the cloud - yes. If some or most of it runs locally - still yes, with a caveat: hardware acceleration can make it run both faster and at a lower battery cost, the same way every computer has a dedicated graphics chip, whether as a separate expansion card or a module integrated into the CPU. Yes, you technically can do all that math on the CPU - it’s called software rendering - but rendering done with a purpose-built chip is so much better. The same logic applies to “AI”.

    • 👁️👄👁️@lemm.ee · 1 year ago

      I get what you mean, but there’s currently nothing planned or in the works to run local AI on phones, and the models are still far too demanding. Phones can’t even handle a 7B model if we’re talking about RAM usage alone.
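      A rough back-of-the-envelope sketch of the RAM claim above (an assumption-laden estimate: it counts only the model weights and ignores KV cache and runtime overhead):

```python
def model_ram_gb(params_billions: float, bytes_per_param: float) -> float:
    """Estimate RAM needed just to hold a model's weights, in GiB."""
    return params_billions * 1e9 * bytes_per_param / 1024**3

# A 7B model at fp16 (2 bytes per parameter) needs ~13 GiB for weights
# alone, well beyond a typical phone's total RAM. Even aggressive 4-bit
# quantization (~0.5 byte per parameter) still needs ~3.3 GiB.
print(round(model_ram_gb(7, 2.0), 1))  # fp16
print(round(model_ram_gb(7, 0.5), 1))  # 4-bit quantized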

      • newIdentity · 1 year ago

        Actually, speech-to-text is done locally on newer Pixel devices. So are audio recognition, camera processing, and a lot of other AI features.

        AI isn’t just ChatGPT, and basically every device from the last four years has a dedicated AI chip.

      • Deiskos@lemmy.world · 1 year ago

        Futureproofing, I guess, but I’m also sure there are things other than language models that can benefit from a dedicated processing unit: photo processing, smaller models, Lens, and so on.