Sorry about that ridiculous watermark.

  • blady_blah@lemmy.world

    > I dunno, even if there is no "you" in a metaphysical sense, the deconstruction method still ends your personal subjective experience of being you, which sucks. Sure, the next you might be just as much you as the first one, but you don't get to be around to enjoy it.

    But it doesn't, and that's the point. You are not the collection of atoms that make up your body; YOU are the software program that is running on your brain-computer. That program can be transferred (or copied) and you are still you. There is no "you" outside of that software.

    • starman2112

      Your idea of what constitutes "you" is wrong. Your subjective experience ends when you get dismantled. We can say this definitively, because when the transporter fails to dismantle the original, they don't get to see through their copy's eyes. If they don't get to see what the transporter clone sees when both are alive, then it stands to reason that if they do get dismantled, they still don't get to see what their clone sees. Their subjective experience ends.

      • blady_blah@lemmy.world

        I disagree with you, but I don't know that I can explain it any more clearly than I already have. There is no metaphysical "you" that exists outside of the software running in your head. You would experience perfect continuity if your body was dismantled and reconstructed. There is no real "you" except the software program that is running on your meat CPU.

        Like I said, this is a hard thing to wrap your head around.

        • starman2112

          > There is no metaphysical "you" that exists outside of the software running in your head.

          100% agreed.

          > You would experience perfect continuity if your body was dismantled and reconstructed.

          I’m going to explain it a different way.

          This is Bill.

          🕺⬜⬜⬜⬜⬜

          I’m going to transport Bill over here.

          ☁️⬜⬜⬜⬜🕺

          That’s still the same Bill, right? There’s continuity?

          Now I’m going to do a Tom Riker, and unsuccessfully transport Bill.

          🕺⬜⬜⬜⬜🕺

          Which one is the real Bill?

          If I’m understanding your argument right, you seem to think both of these are Bill. Which they are, but they’re not the same Bill. Despite both of them subjectively feeling a sense of continuity, only Left Bill has existed for more than a few seconds. If I correct my mistake by shooting Left Bill in the head, his subjective experience of being Bill is over. If I never made the mistake, and successfully dismantled him, the same would occur. For him, continuity is not maintained through the transporter.

          I was never concerned with whether the me that steps out of the transporter experiences continuity. I’m only concerned with whether the me that exists right now does.

          • blady_blah@lemmy.world

            You understand me correctly, and you correctly predicted my response. Your last paragraph is the interesting part, however.

            Imagine you have an AI. It's a fully functional, self-aware AI. Let's call this software "Bob". From one instant to the next, this software is just memory and processing inside a computer. It is aware of its place in the universe to the same extent we are. Let's say you pause the CPU. Did you just kill the AI? Of course not. Now let's say you make a perfect copy of the AI on two separate computers in two separate locations. The AI asks me, "Which one is the 'real' me?" My answer is that they're both the "real you," but the moment they start processing independently, they're two different individuals that deviate from the moment of the copy.
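
            Here's a rough Python sketch of that thought experiment, if it helps. The Bob class, its fields, and the events are all made up for illustration, not a claim about how an AI would actually work: copy the state, and the two instances only start to differ once they process independently.

            ```python
            import copy

            class Bob:
                """Stand-in for the self-aware program: nothing but state plus processing."""
                def __init__(self):
                    self.memories = ["came online"]

                def process(self, event):
                    self.memories.append(event)

            original = Bob()
            clone = copy.deepcopy(original)  # at this instant, both are the "real" Bob

            # The moment they process independently, they become two individuals
            # that deviate from the moment of the copy.
            original.process("kept running on machine A")
            clone.process("woke up on machine B")

            print(original.memories)  # ['came online', 'kept running on machine A']
            print(clone.memories)     # ['came online', 'woke up on machine B']
            ```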

            Now let's say you change a stick of memory in the original AI: is that the same entity? If you unplug the memory cards and fly them to another location and plug them back in, is that the same entity? If you FTP the entity from California to Germany and install it on another machine, is that the same entity? It's all the same answer as making a copy.
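
            The FTP case can be sketched the same way; pickle here just stands in for whatever moves the bytes, and the dictionary is a made-up stand-in for the entity's state. Serializing, shipping, and rebuilding gives you something equal to the original but not the same object, which is exactly the copy question again.

            ```python
            import pickle

            # Minimal stand-in for the entity's state: just data sitting in memory.
            state = {"memories": ["came online", "running in California"]}

            # "FTP to Germany": scan the state into bytes, move them, reassemble them.
            payload = pickle.dumps(state)    # dismantle / scan
            rebuilt = pickle.loads(payload)  # reconstruct on the other machine

            print(rebuilt == state)  # True  -- indistinguishable state
            print(rebuilt is state)  # False -- nevertheless a second object
            ```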

            We humans are only the sum of the software in our heads. There is no real us, only the code executing line by line in our biological processor. That’s why there is no “real you” in this discussion, only software, and the person on the other side of the transporter is just as much the real you as the copy that’s destroyed. You are just a self-aware program.