• Aatube@kbin.melroy.org · 3 days ago

    From a cursory glance it seems at least quite close to the definition of a bit in relation to entropy, also known as a shannon.

    Nevertheless, the term bits of information or simply bits is more often heard, even in the fields of information and communication theory, rather than shannons; just saying bits can therefore be ambiguous. Using the unit shannon is an explicit reference to a quantity of information content, information entropy or channel capacity, and is not restricted to binary data, whereas bits can as well refer to the number of binary symbols involved, as is the term used in fields such as data processing. —Wikipedia, “Shannon (unit)”
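    To make the distinction concrete (a minimal sketch in plain Python; the 0.9/0.1 split is just an illustrative number): a biased coin still takes one binary symbol to store, but it carries less than one shannon of information.

    ```python
    import math

    def entropy_bits(probs):
        """Shannon entropy H = -sum(p * log2(p)), in shannons."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy_bits([0.5, 0.5]))  # fair coin: exactly 1 shannon
    print(entropy_bits([0.9, 0.1]))  # biased coin: ~0.47 shannons, yet still 1 binary bit to store
    ```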

    • conciselyverbose · 3 days ago

      If it’s not re-defining the term, then I’m using it the way the paper defines it.

      Because just understanding words in order to respond to them, ignoring all the sub-processes that are also part of “thought” and that directly impact both your internal narration and your actual behavior, takes more than 10 bits of information to manage. (And yes, I do understand that each word isn’t actually equally likely, which is the assumption I used to get a number in my rough version, but words also require your brain to handle far more additional context than just the information-theory “information” of the word itself.)
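      A rough back-of-the-envelope sketch of that parenthetical, assuming a made-up 50,000-word vocabulary and a Zipf-like distribution as a crude stand-in for real word frequencies (both are assumptions of mine, not figures from the paper):

      ```python
      import math

      VOCAB = 50_000  # assumed vocabulary size, purely illustrative

      # Upper bound: every word equally likely -> log2(VOCAB) shannons per word.
      uniform_bits = math.log2(VOCAB)  # ~15.6

      # Zipf-like weights as a crude stand-in for real word frequencies.
      weights = [1 / r for r in range(1, VOCAB + 1)]
      total = sum(weights)
      zipf_bits = -sum((w / total) * math.log2(w / total) for w in weights)

      print(uniform_bits, zipf_bits)  # ~15.6 vs roughly 11 shannons per word
      ```

      Even with unequal word likelihoods baked in, that crude estimate stays above 10 shannons per word, before counting any of the surrounding context the brain has to track.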