Apple is creating its own AI-powered chatbot that some engineers are calling “Apple GPT,” according to a report from Bloomberg. The company reportedly doesn’t have any solid plans to release the technology to the public yet.

As noted by Bloomberg, the chatbot is built on Apple’s own large language model (LLM) framework, called “Ajax,” which runs on Google Cloud and was built with Google JAX, a framework created to accelerate machine learning research. Sources close to the situation tell the outlet that Apple has multiple teams working on the project, including one addressing potential privacy implications.

  • michikade@lemm.ee · 45 points · 1 year ago

    If they could integrate that development into making Siri better, that’d be great.

    • tunetardis@lemmy.ca · 20 points · edited · 1 year ago

      Hey Siri, what’s the weather today?

      Expect sunny conditions and a high of 27.

      Hey Siri, so it’s not supposed to rain right?

      It is raining right now.

    • shodgdon@lemmy.world · 18 points · 1 year ago

      Even the most limited GPT, one that gave a lot of “I can’t help you with that” responses just to be on the safe side, would be light years ahead of Siri at this point.

    • deong@lemmy.world · 6 points · 1 year ago

      I think there are probably some ways to cross over a bit, but really, LLMs aren’t necessarily aimed at the kind of things we want a virtual assistant to do today. Siri falls down mostly on its ability to do things correctly, quickly, and reliably. Generating 5000 words of convincingly human-sounding explanation isn’t what I want from a thing I quickly trigger on my phone. What I want is a very short reply, or no reply at all, accompanying the action I wanted to take. Call this person. Start navigation to an address. Turn on the lights. Play the version of a song I like from this specific live album. Some of those are things Siri really sucks at today, and none of them are likely to get a lot better with an LLM in place. Maybe playing music benefits from a more robust understanding of the language of my query, but for the rest, the suckage is more that Siri takes 8 seconds waiting on a server response, or just inexplicably decides that today it doesn’t know how to turn on a light.

      At this point it feels like a great LLM would just let Siri fail to respond to a much more varied set of ways of phrasing my question in English, and that’s not really the target we should be shooting for here.

      • tunetardis@lemmy.ca · 8 points · 1 year ago

        I agree with you to an extent, in that I would not want Siri producing a thesis every time I ask a simple question. But I think one thing that would help is if she remembered the last few things you requested and built some sort of context around them. That’s what impressed me most about ChatGPT: if it doesn’t quite give me what I’m looking for, I can clarify, and we eventually get there. Siri is like a person with severe short-term memory loss, and much of my frustration comes from that.

    • ForgetReddit@lemmy.world · 4 points · 1 year ago

      They do indeed need to make it more conversational. I think this is a big thing Jobs would harp on if he were still alive. It should feel like always having a friend/assistant in the room who knows everything.

      It sucks for privacy, but if you trust Apple enough, it’d be nice to have an always-on microphone for Siri, so you could say “hey Siri, that tour we talked about at breakfast, can you bring up directions to that?” Stuff like that.

      • StarManta@lemmy.world · 3 points · edited · 1 year ago

        Holy shit that sounds like an absolute nightmare.

        Let’s ignore for the moment all the mega-corporation and cloud data security implications of that (and there are MANY), and let’s pretend it does all processing and storage locally and never needs to transmit any of those conversations offsite.

        That STILL sounds like an absolute nightmare. I could spy on the people who live with me in an extraordinarily efficient way. “Hey Siri, what did my wife talk about in the phone call over breakfast?” “Hey Siri, is my daughter gay?” “Hey Siri, summarize all the conversations you heard at this dinner party.”