• AA5B@lemmy.world
    3 months ago

    I know we’re supposed to hate Apple here, but this is a big reason I’m excited about the upcoming event. I really like their path of on-device AI. I’ve been reading some of their case studies on making models work in limited-memory situations, and they’re already shipping their own SoCs with multiple specialized processing units that you can imagine being extended to support on-device AI. Now let’s find out what they can deliver.

    • Aurenkin
      3 months ago

      Yes, Google has also moved in this direction with Tensor and Gemini Nano. I expect to see a lot more movement here over the next few years, since there’s a big financial incentive to offload all that compute cost onto users’ devices as well.