• applebusch@lemmy.world
    1 year ago

    Doubt. These large language models can’t produce anything outside their dataset. Everything they do is derivative, pretty much by definition. Maybe they can mix and match things they were trained on but at the end of the day they are stupid text predictors, like an advanced version of the autocomplete on your phone. If the information they need to solve your problem isn’t in their dataset they can’t help, just like all those cheap Indian call centers operating off a script. It’s just a bigger script. They’ll still need people to help with outlier problems. All this does is add another layer of annoying unhelpful bullshit between a person with a problem and the person who can actually help them. Which just makes people more pissed and abusive. At best it’s an upgrade for their shit automated call systems.

    • RogueBanana@lemmy.zip
      1 year ago

      Most call centers have multiple levels of teams, where the lower ones are just reading off a script and make up the majority. You don’t have to replace every single one to implement AI. It’s gonna be the same for a lot of other jobs as well, and many will lose jobs.

      • Ann Archy@lemmy.world
        1 year ago

        It isn’t going to completely replace whole business departments, only 90% of them, right now.

        In five years it’s going to be 100%.

    • thetreesaysbark
      1 year ago

      I’d say at best it’s an upgrade to scripted customer service. A lot of the scripted ones are slower than AI, and the agents often have strong accents that make it more difficult for the customer to understand the script entry being read back to them, leading to more frustration.

      If your problem falls outside the realm of the script, I just hope it recognises that the script isn’t solving the issue and redirects you to a human. Oftentimes I’ve noticed ChatGPT not learning from the current conversation (though if you ask it about this, it will say that it doesn’t do this). In that situation it just regurgitates the same 3 scripts back to me when I tell it it’s wrong. For me this isn’t so bad, since I can just turn to a search engine, but in a customer service scenario it would be extremely frustrating.

    • guacupado@lemmy.world
      1 year ago

      Your description of AI limitations sounds a lot like the human limitations of the reps we deal with every day. Sure, if some outlier situation comes up then it has to go to a human, but let’s be honest - those calls usually go to a manager anyway, so I’m not seeing your argument. An escalation is an escalation. The article itself even says it’s not a literal 100% replacement of humans.

    • Ann Archy@lemmy.world
      1 year ago

      You can doubt it all you want; the fact of the matter is that AI is provably more than capable of taking over the roles of humans in many work areas, and it already does.