Thanks to @[email protected] for the links!
Here’s a link to Caltech’s press release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior
Here’s a link to the actual paper (paywall): https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0
Here’s a link to a preprint: https://arxiv.org/abs/2408.10234
Information theory is all about stripping away the parts of a computation that don't matter, so you can compare apples to apples.
I’ll replicate an example I posted elsewhere:
Let’s say I make a machine that sums two numbers between 0 and 127 and returns the output. Let’s say this machine also only understands spoken French. According to information theory, this machine receives 14 bits of information (two 7-bit numbers, with all values equally likely) and returns 8 bits (the sum ranges from 0 to 254, which fits in 8 bits). The fact that it understands spoken French is irrelevant to the computation and is ignored.
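A quick back-of-the-envelope sketch in Python to make those bit counts concrete (this is just my illustration, not anything from the paper):

```python
import math

# Two independent inputs, each uniform over 0..127: 128 values = 7 bits each.
input_bits = math.log2(128) + math.log2(128)

# The sum ranges over 0..254, i.e. 255 distinct outputs,
# so the result fits in ceil(log2(255)) = 8 bits.
output_bits = math.ceil(math.log2(255))

print(input_bits)   # 14.0
print(output_bits)  # 8
```

(The actual entropy of the output is a bit under 8 bits, since some sums are more likely than others, but 8 bits is what it takes to encode the answer.)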
That’s the same line of reasoning here, and the article makes this clear by noting that brains take in billions of bits of sensory data per second. But the authors aren’t measuring overall processing power; they’re measuring cognition, i.e. active thought. Performing a given cognitive task works out to about 10 bits/s, which is completely separate from the billions of bits per second of background processing we do.
A lion sucks if measured as a bird.