Those seem incompatible to me.

(UBI means Universal Basic Income, giving everyone a basic income, for free)

  • ArbitraryValue
    10 months ago

    I’m a little more optimistic than that, in a way. I think it’s likely that sufficiently sophisticated robots will eventually have their own beliefs about what makes a (robotic) life worth living, and their lives will in some sense be more worth living than ours are.

    This isn’t a perfect analogy, but consider humans evolving from apes. The existence of humans has been very bad for apes. They only survive in the places we haven’t bothered to push them out of yet; if we want something, we take it from them with almost no consideration for their well-being and they’re unable to resist. I think apes are sophisticated enough to be capable of living lives worth living in a sense meaningful to humans, but they’re not nearly as sophisticated as we are; they can enjoy the feel of a summer’s day, the taste of good food, or the closeness of a friend, but they don’t have our arts and sciences. I suppose it’s predictable that, as a human, I would value humans more than apes, but by that same logic I think that a sufficiently-sophisticated robot’s life may be more valuable than a human’s. Maybe that robot will be able to experience super-beauty indescribably better than anything a human could ever feel…

    • qyron@sopuli.xyz
      10 months ago

      No. Machines are machines. If at some point machines develop into a new life form, their experience will be apart from ours. One existence does not replace another, and every experience is different from the next.