• Telorand@reddthat.com
    8 days ago

    Imagine sitting down with an AI model for a spoken two-hour interview. A friendly voice guides you through a conversation that ranges from your childhood, your formative memories, and your career to your thoughts on immigration policy. Not long after, a virtual replica of you is able to embody your values and preferences with stunning accuracy.

    Okay, but can it embody my traumas?

    • conciselyverbose
      7 days ago

      lol because people always behave in ways consistent with how they tell an interviewer they will.

      • Echo Dot@feddit.uk
        7 days ago

        If I can make a version of me that likes its job, that would be a deviation from the template worth having. Assuming this technology actually worked, an exact digital replica of me isn’t particularly useful; it would just automate the things I was going to do anyway, and if I was going to do them anyway, they aren’t really a hassle worth automating.

        What I want is a version that has all of my knowledge but infinitely more patience, and preferably one that actually understands tax law. I need an AI to do the things I hate doing, though I can see the advantage of customizing it with my values to a certain extent.