I’m sure it depends on the AI tools and features being used, but with all the “magic” obfuscation companies put around them, it’s not exactly clear how much of the processing happens locally versus remotely.

With some of the text stuff, I’m relatively sure most of it involves a data exchange to work, but some of the image/video editing and audio processing? That’s where things get much murkier, at least to me, and where this question is largely stemming from.

I’m aware more processors are being made specifically to support these features, so it seems like there are efforts to make more of this happen locally, on one’s own devices, but… what does the present situation look like?

  • Lung
    13 · 5 months ago

    Basically: even a decent model installed locally runs to gigabytes. If you didn’t download gigs, then it’s remote
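    The gigabytes point can be made concrete with back-of-the-envelope arithmetic — a sketch; the 7-billion-parameter figure is just an illustrative model size, not something from the thread:

    ```python
    # Rough model size: parameter count times bytes per parameter.
    # 1e9 params * N bytes / 1e9 bytes-per-GB cancels out, so:
    def model_size_gb(params_billions: float, bytes_per_param: float) -> float:
        return params_billions * bytes_per_param

    # A 7B-parameter model at common precisions:
    print(model_size_gb(7, 2.0))  # fp16 weights -> 14.0 GB
    print(model_size_gb(7, 0.5))  # 4-bit quantized -> 3.5 GB
    ```

    Either way, that’s far more than any app quietly downloads behind your back, which is why download size is a decent tell.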

  • @sun_is_ra
    7 · 5 months ago

    Most AI providers do the processing remotely. As a rule of thumb, if you can’t use a specific AI service without internet, then it’s done remotely
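    The offline rule of thumb can be automated with a quick reachability probe — a minimal sketch; the host (a public DNS resolver), port, and timeout are my own illustrative choices:

    ```python
    import socket

    def internet_available(host: str = "1.1.1.1", port: int = 53,
                           timeout: float = 2.0) -> bool:
        """Try a short TCP connection to a public DNS resolver.

        Returns True if the connection succeeds, False on any socket error
        (including timeouts), i.e. when the machine looks offline.
        """
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False
    ```

    Disable networking, confirm `internet_available()` returns False, then see whether the AI feature still works — if it does, the processing is local.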

  • @[email protected]
    4 · 5 months ago

    It varies some…

    Most of it is remote; however, “Siri” actually does a lot locally, and I assume Google Assistant does too.

    Those are likely the only two that do much locally; everyone else does it all remotely.

    • Atemu
      2 · 5 months ago

      “Siri” actually does a lot locally, and I assume Google Assistant does too.

      On what basis? It’s Google, so I would assume any and all data that you could possibly input into their apps and services to be used against you.

      • @[email protected]
        1 · 5 months ago

        Mostly a “cost” basis, but it’s an assumption for Google Assistant, for sure. Siri I’ve tested.

  • @[email protected]
    3 · 5 months ago

    If the download size is in the gigabytes and you need a good graphics card to run it, you’re doing it locally. Otherwise, it’s remote.

  • Yer Ma
    2 · 5 months ago

    It’s all remote and they keep everything you give them

  • @[email protected]
    2 · 5 months ago

    It depends.

    Are you using ChatGPT in a browser window? Then yeah, that’s remote.

    Are you running Stable Diffusion through Automatic1111 locally? Are you running Mistral or LLaMA models locally?

    Then you’re running locally.