• @[email protected]
    link
    fedilink
    3
    11 months ago

    What’s more likely, in practice, is an expansion of self into hardware.

    Basically, if you bolted a computer to your brain, you would still be you, just with better memory etc. If, at this point, your brain died, then you die. However, if you kept adding to the computer side, more and more of “you” would be software-based. What happens at 90% (10x capacity) or 99%? If your brain were to die, “you” would only lose 10% of your capabilities. So long as everything critical is duplicated in software, the pure software version of “you” can go on thinking.

    Critically, there is no neuron duplication here. It relies on both the plasticity of our brains to adapt, and the fact our self is continually updated.

    Interestingly, the first cases might not even be truly planned. What happens when your “neural AI assistant” can continue to function after your death? If it is self-aware, and believes it is you, at what point is it still you?

    • wanderingmagus
      link
      fedilink
      2
      11 months ago

      In that case I think it would heavily rely on the whole “everything critical is duplicated in software” part. It’s the same Ship of Theseus, just one level up - instead of neurons and pathways, it’s entire sections of the brain. If my biological short-term memory withers away from dementia, boom, the chip already has that capability backed up and running. If my long-term memory starts going because of Alzheimer’s, boom, also have those uploaded to the cloud on demand. Got ALS? Motor functions already rerouted through the chip.

      At no point is the entire biological brain destroyed at once and “transferred”; there is a continuity of consciousness throughout the process, a continuity of self and pattern. That’s the part most critical to me.

      • @[email protected]
        link
        fedilink
        2
        edit-2
        11 months ago

        Yeah, this. It doesn’t particularly matter if it’s neuron by neuron or larger-scale repairs. So long as not too much is replaced at once, and everything is backed up in software before the switch, then I’d still be mostly me. I can’t imagine the changes wouldn’t affect my personality, capabilities, etc., but I feel like I’d still be me so long as nothing fucks up in the process. Much better than whole-brain backup/cloning, even with neuron-by-neuron copy+destruction.

        EDIT: where it gets sketchy is handling the conscious sections (especially internal monologue) and the nearly conscious ones. Those would need to be replaced at a slow rate IMO.