• sugar_in_your_tea · 2 days ago

    You’re misunderstanding the terminology being used, then.

    In information theory, “bit” doesn’t mean “bitrate” in the networking sense, but something closer to “compressed bitrate”: the minimum number of binary digits needed to distinguish the possible messages.

    For example, let’s say I build a computer that only computes small sums, where the input is two whole numbers from 0-127. However, this computer only understands spoken French, and it ignores anything that isn’t a French number in that range. Information theory would say this machine receives 14 bits of information (two 7-bit numbers) and returns 8 bits (the sum can be anywhere from 0 to 254). The extra processing needed to understand French is overhead and is ignored for the purposes of calculating entropy.
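    To make the arithmetic concrete, here’s a minimal sketch in Python (my own illustration, not from the article) that counts information as log2 of the number of distinguishable values:

    ```python
    import math

    def bits_needed(num_values: int) -> int:
        """Bits required to distinguish num_values equally likely values."""
        return math.ceil(math.log2(num_values))

    # Two inputs, each a whole number from 0-127: 128 possible values apiece.
    input_bits = 2 * bits_needed(128)   # 2 * 7 = 14 bits in

    # The sum ranges from 0 to 254: 255 possible values.
    output_bits = bits_needed(255)      # 8 bits out

    print(input_bits, output_bits)      # 14 8
    ```

    Note that whether the machine parses French, English, or raw binary doesn’t change these numbers; the parsing is overhead, not information.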

    The article also mentions that our brains take in billions of bits of sensory data, but that’s ignored for the calculation because we only care about the thought process (the useful computation), not all of the overhead of the system.

    • Buffalox@lemmy.world · 2 days ago

      I think I was pretty clear about an understanding or comprehension part, which is not merely input/output.