Currently, talking to a face is the ultimate guarantee that you are communicating with a human (and on a subconscious level it makes you try to relate, empathise, etc.). If humanoid robot technology ever crosses the Uncanny Valley, then discovering that I'm actually talking to a humanoid running an LLM, and that my intuitions have been betrayed, would undermine the instinctive trust I extend to the other party whenever I see a human face. That would degrade my social interactions across the board, because I'd live in constant suspicion that the humans I was talking to weren't actually human.

It is for this reason that I think the law should require humanoid robots to be clearly differentiated from humans, or at least that people should have the right to opt out of encountering realistic-looking humanoids.

  • Num10ck@lemmy.world · 2 months ago

    Asimov explained why androids should be human-shaped: so that our world continues to be designed around our shape, rather than leaving us behind. That doesn't require an uncanny face, though.

    • perestroika@slrpnk.net · 1 month ago (edited)

      It would not exclude clear differentiation, however. :)

      Just like a chatbot posting on social media can add a message footer “this content was posted by a robot” to a fluent and human-like message, a humanoid robot, while having human form, can clearly identify itself as a robot.

      Personally, I think such a design requirement is highly reasonable on social media (as a barrier or action threshold against automated mass manipulation), and probably also in real life, if a day comes when human-like robots are abundant.

  • TootSweet@lemmy.world · 2 months ago

    This makes me think of the commandment “thou shalt not make a machine in the likeness of a human mind” from the Dune series.

    Seriously, though, I suspect a lot of technologies we currently experience only as instruments for oppressing average people and widening the income gap could be put to better use. Not even necessarily because we'd have rules in place, so much as because people wouldn't be baking their selfish asshole agendas into the tech they build.

    That all kind of assumes that humanoid robots would be “tools” for humans to “use”. If, of course, they (or at least some of them) are more like sentient creatures with hopes and dreams and emotions, that makes for a much different conversation. And that feels like the kind of conversation that'd be hard to even comment on today.

    • Andy@slrpnk.net · 2 months ago (edited)

      I’ve spent a lot of time thinking about this, because over the last year I was writing the world guide for a solarpunk setting to be used with a tabletop RPG or as a writing guide. And while I was working on this, OpenAI came along and put the Turing test out to pasture.

      Several existential crises later, the result looked remarkably like I hadn’t thought about it at all: in the game setting, there are robots and they are treated like people. Like Bender on Futurama.

      I think @[email protected] (love the username, btw!) is absolutely right that our concerns are all largely shaped by the presumption that, today, everything someone builds is built to benefit the creator and manipulate the end user. If that isn't the case, then a convincing android could just be… your neighbor Hassan.

      Most machines probably wouldn’t have a reason to pretend to be human. But if one wanted to, that’s basically transorganicism. No disrespect to OP, but if a machine is sentient, trying to restrict it from presenting as organic seems pretty similar to restrictions on trans people using the restroom that matches their presentation.

      And if they are trying to deceive you maliciously, well… I currently know everyone I meet is organic, and I already know not to trust all of them.

  • Kwiila@slrpnk.net · 2 months ago

    Bots online impersonating humans are already causing so many problems at every level.

  • Sterile_Technique@lemmy.world · 2 months ago

    I have no idea what the actual origin story for Fallout’s Brotherhood of Steel is, but I’d at least place OP’s post as a strong candidate.

    In all seriousness, if it ever gets out of the uncanny valley, then yeah, that's a major transparency issue. The problem at that point becomes: even with laws in place to prevent it, who's to say those will actually be followed? It'll cause the same issues that deepfakes are causing now, but off-screen and in real time. That would have crazy-bad implications for politics, security, social engineering, market manipulation…

    • SubArcticTundra@lemmy.ml (OP) · 2 months ago (edited)

      Yep. But then I don’t get why there are efforts to make a realistic robotic human face.

      (Edit: ok I do understand one reason – just as a challenge and to prove it’s possible, but I’m not sure that justifies doing it given the consequences)

  • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net · 2 months ago (edited)

    🙋‍♂️

    Do anthropomorphic animal looking robots count? If so, I disagree.

    On a serious note, I do think it’s pointless to make them indistinguishable from a human; it’s a minor comfort that comes with huge risks and problems. Having a humanoid shape is good, because it allows them to do a full range of things instead of a singular task they were completely designed around. I mean, our bodies are pretty cool for doing shit, so a robot with the same kind of body would be even better. But they don’t need a face. Or skin. Unless it’s a sex bot. Then you might want that.

  • Chozo@fedia.io · 2 months ago

    They’ll remember that you said this, once they start protesting for their rights.