• evo · 4 points · 1 year ago

    At a glance I was confused/angry about why this would only be for the Pixel 8 Pro and not the standard Pixel 8, considering they both have the same Tensor G3.

    However (from my own testing), it seems very likely that the full 12 GB of RAM the Pro has (vs. the 8 GB in the Pixel 8) is needed for some of these tasks, like summarization.
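
    Rough math makes the RAM gap plausible. The only published figures here are the Gemini Nano parameter counts (reportedly 1.8B and 3.25B); everything else is my own back-of-envelope sketch, not anything Google has said about Pixel memory budgets:

      // Back-of-envelope: weight memory alone = parameter count * bytes per weight.
      fun weightMemoryGb(params: Double, bitsPerWeight: Int): Double =
          params * bitsPerWeight / 8.0 / 1e9

      fun main() {
          // Gemini Nano reportedly ships in 1.8B and 3.25B parameter variants.
          println(weightMemoryGb(1.8e9, 4))   // ~0.9 GB at 4-bit quantization
          println(weightMemoryGb(3.25e9, 16)) // ~6.5 GB if left at fp16
          // Add the KV cache, the OS, and foreground apps on top of that,
          // and an 8 GB phone runs out of headroom fast.
      }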

  • sciencesebi@feddit.ro · +8/−7 · 1 year ago

    “The first phone with AI built in.”

    LOL Google are delirious

    What about autocomplete? Face detection? Virtual assistants?

    • lolcatnip@reddthat.com · 6 points · 1 year ago

      “AI” is a pretty meaningless term. It’s impossible to say objectively whether any of the things you mentioned should be considered AI.

    • butter@midwest.social · 4 points · 1 year ago

      AI is broad enough that it does include those features.

      But it’s probably referring to machine learning.

      • sciencesebi@feddit.ro · +4/−2 · 1 year ago

        That’s my point. AI includes features that were added years ago. Even ML is too broad. Autocomplete uses small ML models. Spam filters as well.

        I think they mean LLMs, and specifically distilled Bard models. So a subset of a subset of a subset of AI.

        Neckbeard marketing

    • evo · +5/−1 · 1 year ago

      No, that might be accurate for what they are talking about. The absolute smallest generative AI models (that are generally useful) are starting to shrink, but they are still several GB in size. Doing this on device is actually new.

      • sciencesebi@feddit.ro · +2/−3 · 1 year ago

        It says AI, not genAI. Anyway, autocomplete is genAI, even though it may be simple GloVe embeddings and Markov chains.

        You don’t know what the fuck you’re talking about.
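
        To be concrete, here is the kind of tiny "generative" model I mean: a first-order Markov chain next-word suggester. A toy sketch, obviously, not anyone's actual keyboard code:

          // Build a next-word table from a corpus: for each word, the words
          // that followed it.
          fun train(corpus: String): Map<String, List<String>> =
              corpus.lowercase().split(Regex("\\s+"))
                  .zipWithNext()
                  .groupBy({ it.first }, { it.second })

          // Suggest the most frequent follower of the given word.
          fun suggest(model: Map<String, List<String>>, word: String): String? =
              model[word.lowercase()]
                  ?.groupingBy { it }
                  ?.eachCount()
                  ?.maxByOrNull { it.value }
                  ?.key

          fun main() {
              val model = train("the cat sat on the mat and the cat ran")
              println(suggest(model, "the")) // "cat": generated, no LLM required
          }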

        • evo · 2 points · edited · 1 year ago

          Do you know how to read?

          “Gemini Nano now powers on-device generative AI features for Pixel 8 Pro”

          Technically autocomplete can be considered gen AI, but it obviously lacks the creativity that we all associate with gen AI today. You don’t need a generally useful model to do autocomplete.

          The point is that autocomplete never required a generally useful gen AI model, but Google is now shipping features (beyond autocomplete) that use such a model. Gen AI on device is novel.
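
          To make the distinction concrete, this is roughly what such a feature looks like from the app side. Hypothetical names only, not the actual AICore API:

            // Illustrative interface, not the real AICore/Gemini Nano API.
            // The key property: the prompt and the output never leave the phone.
            interface OnDeviceLlm {
                suspend fun generate(prompt: String): String
            }

            // Sketch of a summarization feature built against that interface.
            suspend fun summarize(llm: OnDeviceLlm, transcript: String): String =
                llm.generate("Summarize this recording:\n$transcript")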

          • sciencesebi@feddit.ro · +2/−1 · edited · 1 year ago

            I was talking about the title, not the 10th paragraph way down. Use your reading skills and tell me where the fuck “generative” is in the title.

            No. Autocomplete is a feature. The model behind it can be gen AI, and has been for a number of years. IDGAF if it’s not general purpose.

            The point is you have no fucking clue what you’re defending. LLMs and diffusion models have been in apps for months. You can say that general-purpose LLMs embedded into mobile OS functions are novel; the rest of it is bullshit.

            • TJA! · +2/−1 · 1 year ago

              It’s directly in the first paragraph…

            • evo · +1/−1 · 1 year ago

              “where the fuck ‘generative’ is in the title”

              “LLMs and diffusion models have been in apps for months.”

              Show me a single example of an app that has an LLM on device. Find a single one that isn’t making an API call to a powerful server running the LLM. Show me the app update that adds a multi-gigabyte LLM onto the device. I’ll wait…

              Feel free to not respond when you realize you are wrong and you have no clue what everyone else is talking about.

                • evo · 1 point · 1 year ago

                  You didn’t list a single production app in that post…

    • quirzle@kbin.social · 2 points · 1 year ago

      “What about autocomplete? Face detection? Virtual assistants?”

      How much of that is really built-in vs. offloaded to their cloud then cached locally (or just not usable offline, like Assistant)?

        • evo · +4/−1 · 1 year ago

          That’s the entire point. Running the LLM on device is what’s new here…

            • evo · +2/−1 · 1 year ago

              I can’t find a single production app that uses MLC LLM, for the reasons I listed earlier (like multi-GB models that aren’t garbage).

              Qualcomm’s announcement is a tech demo, and they promised to actually ship it next year…

              • sciencesebi@feddit.ro · +1/−1 · 1 year ago

                Who said anything about production and non-garbage? We’re not talking about quality of responses or spread. You can use distilled RoBERTa for all I give a fuck. We’re talking about whether they’re the first. They’re not.

                Are they the first to embed an LLM in an OS? Yes. A model with over x Bn params? Maybe, probably.

                But they ARE NOT the first to deploy gen AI on mobile.

                • evo · 1 point · 1 year ago

                  You’re just moving the goalposts. I ran an LLM on device in an Android app I built a month ago. Does that make me the first to do it? No. They are the first to reach production with an actual product.

        • quirzle@kbin.social · 2 points · 1 year ago

          Services running in GCP aren’t built into the phone, which is kinda the main point of the statement you took issue with.

          • sciencesebi@feddit.ro · +1/−2 · 1 year ago

            What does that have to do with CACHING? That’s client-server.

            No clue what you’re talking about.