I know current learning models work a little like neurons, but why not just make a sim that works exactly like how we understand neurons to work?

  • xmunk · 7 months ago

    I just want to make sure one of your words there is emphasized: “possible” (Edit: it’s also wrong, as I explained below)

    The number of possible connections in the human brain is literally greater than the number of atoms in the universe.

    Yes - the value of 86 billion choose two is insanely huge… one might even say mind-bogglingly huge! However, in actuality, we’ve got about 100 trillion neural connections given our best estimates right now. That’s about a thousand connections per neuron.

    It’s a big number, but one we could theoretically simulate. It also must be said that simulating a brain can’t be fundamentally impossible: we’ve each got one, and there are billions of us built out of an insignificant portion of the mass and energy available terrestrially. Eventually (unless we extinct ourselves first) we’ll start approaching neurological information storage density - we’re pretty fucking clever, so we might even exceed it!
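
    A quick back-of-the-envelope check on those figures (the 86 billion and 100 trillion values are just the estimates quoted above, not precise counts):

    ```python
    NEURONS = 86e9      # ~86 billion neurons, current best estimate
    SYNAPSES = 100e12   # ~100 trillion synaptic connections, rough estimate

    # Average connections per neuron works out to roughly a thousand
    print(SYNAPSES / NEURONS)  # ~1163
    ```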

    Edit for math:

    So I did a thunk, and 86 billion choose 2 actually isn’t that big. I was thinking of 86 billion factorial, but choose 2 is n(n-1)/2, which is only about half of 86 billion squared (if you count directed connections and allow self-referential synapses, it’s a full 86 billion squared).

    Apparently this “greater than the number of atoms in the universe” line came from famously incorrect shame of Canada Jordan Peterson… and, uh, he’s just fucking wrong (so math can be added to the list of things he’s bad at - and that’s already a long list).

    Yeah, so: 86 billion squared ≈ 7.4 × 10^21 - an impressively large number… but not even close to 10^80 impressively large.
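
    If anyone wants to check the edit’s math, here’s a rough Python sketch (the 10^80 atoms figure is the usual order-of-magnitude estimate, not a precise count):

    ```python
    import math

    NEURONS = 86e9            # ~86 billion neurons
    ATOMS_IN_UNIVERSE = 1e80  # common order-of-magnitude estimate

    # Unordered pairs, no self-connections: n*(n-1)/2
    choose_two = NEURONS * (NEURONS - 1) / 2
    print(f"{choose_two:.2e}")   # ~3.7e21

    # Directed connections, self-referential synapses allowed: n^2
    squared = NEURONS ** 2
    print(f"{squared:.2e}")      # ~7.4e21

    # Both fall roughly 58 orders of magnitude short of the atom count
    print(math.log10(ATOMS_IN_UNIVERSE / squared))  # ~58.1

    # 86 billion factorial, by contrast, really would dwarf 10^80:
    # log10(n!) via lgamma gives its number of digits
    factorial_digits = math.lgamma(NEURONS + 1) / math.log(10)
    print(f"{factorial_digits:.2e}")  # ~9.0e11 digits
    ```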