• TheFutureIsDelaware · 1 year ago (edited)

    Appealing to authority is useful. We all do it every day. And like I said, all it should do is make you question whether you’ve really thought about it enough.

    Every single thing you’re saying has no bearing on how AI will turn out. None.
    If a 0 is “we figured it out” and 1 is “we go extinct”, here is what all possible histories look like in terms of “how things that could have made us go extinct actually turned out”:

    1
    01
    001
    0001
    00001
    000001
    0000001
    00000001
    etc.

    You are looking at 00000000 and assuming there can’t be a 1 next, because of how many zeroes there have been. But every extinction event is preceded by a long run of non-extinction events.

    But again, it is strange that you can label an appeal to authority, but not realize how much worse an “appeal to the past” is.

    • Nougat@kbin.social · 1 year ago

      You don’t seem to have actually read anything I’ve written, and just want to argue with someone.

      • TheFutureIsDelaware · 1 year ago

        Nope, I certainly have. They’re the same arguments I’ve been hearing from people dismissing AI alignment concerns for ten years. There’s nothing new there, and it all maps onto exactly the wishful thinking I’m talking about.

        • Nougat@kbin.social · 1 year ago

          You don’t seem to have actually read anything I’ve written, and just want to argue with someone.

          Based on the fact that I have not anywhere “[dismissed] AI alignment concerns,” I stand by the above statement.