• cynar@lemmy.world
    9 months ago

    Interestingly, an inner monologue isn’t required for conscious thought. E.g. I’ve got several “inner thought streams”, but only one uses language. It just happens that a lot of our early learning is language-based, which trains our brains to go from language to knowledge. Hijacking that circuit for self-learning is a useful method, and that could create our inner monologue as a side effect.

    Also, a looping LLM is more akin to an epileptic fit than an inane inner monologue. It effectively talks gibberish at itself.
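    To make the “looping” concrete: the setup being described is an autoregressive feedback loop, where the model’s output is appended to its own context and fed straight back in with no outside input. A minimal sketch of that wiring, using a hypothetical toy stand-in for the model (a fixed next-word table, invented here purely to show the loop structure — a real LLM samples from a learned distribution, but the plumbing is the same):

    ```python
    def toy_model(context: list[str]) -> str:
        """Hypothetical stand-in for an LLM: picks the next word from the last one."""
        next_word = {"the": "cat", "cat": "sat", "sat": "on", "on": "the"}
        return next_word.get(context[-1], "the")

    def self_loop(seed: str, steps: int) -> list[str]:
        """Feed the model's own output back in as input, `steps` times."""
        context = [seed]
        for _ in range(steps):
            context.append(toy_model(context))  # output re-enters as input
        return context

    print(self_loop("the", 8))
    # → ['the', 'cat', 'sat', 'on', 'the', 'cat', 'sat', 'on', 'the']
    ```

    With nothing external perturbing the loop, the toy version collapses into a fixed cycle — a crude analogue of the degenerate, repetitive output a real LLM tends to drift into when looped on itself.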

    Conversely, Google’s DeepDream does produce dream-like images, and it does so in a similar way (we think) to how human dreams work. Stable Diffusion takes this to its (current) limit.

    Basically, an AI won’t need an inner monologue to think. And any inner monologue it did have would be the product of interactions between subsystems and the LLM, not something arising purely within the LLM itself.