• evo · 1 year ago

    No, that might be accurate for what they’re talking about. The absolute smallest generative AI models (that are generally useful) are starting to shrink, but they’re still several GB in size. Running them on-device is actually new.
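    For rough scale (a back-of-the-envelope sketch: the parameter counts below are the publicly reported Gemini Nano sizes, and the quantization levels are my own assumptions, not figures from the article), on-disk size is roughly parameters × bytes per weight:

    ```kotlin
    // Rough footprint: size ≈ parameterCount * bytesPerWeight.
    // Parameter counts are the reported Gemini Nano sizes; the precisions are illustrative assumptions.
    fun main() {
        val models = mapOf("Nano-1 (1.8B params)" to 1.8e9, "Nano-2 (3.25B params)" to 3.25e9)
        val precisions = mapOf("fp16" to 2.0, "int8" to 1.0, "int4" to 0.5) // bytes per weight
        for ((name, params) in models) {
            for ((prec, bpw) in precisions) {
                println("$name @ $prec: ~%.1f GB".format(params * bpw / 1e9))
            }
        }
    }
    ```

    Even at 4-bit weights that’s on the order of a gigabyte or more before you count the runtime and working memory, which is why a generally useful model on the phone itself is only now becoming practical.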

    • sciencesebi@feddit.ro · 1 year ago

      It says AI, not genAI. Anyway, autocomplete is genAI, even though it may be simple GloVe embeddings and a Markov chain underneath.

      You don’t know what the fuck you’re talking about.

      • evo · 1 year ago (edited)

        Do you know how to read?

        “Gemini Nano now powers on-device generative AI features for Pixel 8 Pro”

        Technically, autocomplete can be considered gen AI, but it obviously lacks the creativity we all associate with gen AI today. You don’t need a generally useful model to do autocomplete.
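        To illustrate that point, here’s a minimal sketch of next-word autocomplete as a plain bigram Markov chain (toy code of my own, nothing like GloVe or an LLM):

        ```kotlin
        // Toy bigram Markov-chain "autocomplete": suggest the word that most often
        // followed the previous word in the training text. Purely illustrative.
        fun train(corpus: String): Map<String, Map<String, Int>> {
            val words = corpus.lowercase().split(" ").filter { it.isNotBlank() }
            val model = mutableMapOf<String, MutableMap<String, Int>>()
            for ((prev, next) in words.zipWithNext()) {
                val counts = model.getOrPut(prev) { mutableMapOf() }
                counts[next] = (counts[next] ?: 0) + 1
            }
            return model
        }

        fun suggest(model: Map<String, Map<String, Int>>, prevWord: String): String? =
            model[prevWord.lowercase()]?.maxByOrNull { it.value }?.key

        fun main() {
            val model = train("see you later . see you soon . talk to you later")
            println(suggest(model, "you")) // -> "later" (most frequent follower)
        }
        ```

        That’s “generative” in the loosest sense, but nowhere near a generally useful model.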

        The point is that it didn’t take a generally useful gen AI model to do autocomplete before, but Google is now shipping features (beyond autocomplete) that do use such a model. Gen AI on-device is novel.

        • sciencesebi@feddit.ro · 1 year ago (edited)

          I was talking about the title, not the 10th paragraph way down. Use your reading skills and tell me where the fuck “generative” is in the title.

          No. Autocomplete is a feature. The model behind it can be gen AI, and has been for a number of years. IDGAF if it’s not general purpose.

          The point is you have no fucking clue what you’re defending. LLMs and diffusion models have been in apps for months. You can say that general-purpose LLMs embedded into mobile OS functions are novel; the rest of it is bullshit.

          • TJA! · 1 year ago

            It’s directly in the first paragraph…

          • evo · 1 year ago

            “where the fuck ‘generative’ is in the title”

            “LLMs and diffusion models have been in apps for months.”

            Show me a single example of an app that runs an LLM on-device. Find a single one that isn’t making an API call to a powerful server running the LLM. Show me the app update that adds a multi-gigabyte LLM onto the device. I’ll wait…

            Feel free to not respond when you realize you are wrong and you have no clue what everyone else is talking about.