from pkTools blog, 2009
Information scientists conceived of information as inversely related to predictability. That was in the 1940s, and, typically, the scientists worked for the phone company. Norbert Wiener claimed that there was more information in a sequence of numbers we don’t understand — 9, 37, 243 … — than in a sequence we think we understand — 2, 4, 6, 8 … Wiener further said that there was more information in a good sonnet (fourteen lines, ten syllables per line) than in the Manhattan White Pages. Claude Shannon defined information as “the inverse of the probability of the signal.” Mathematically that’s expressed as H = −∑ pᵢ logₑ pᵢ.
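Shannon’s formula rewards the unpredictable: a signal whose outcomes are all equally likely carries the most information, a certain outcome carries none. A minimal sketch in Python (my own illustration, not the phone company’s; the function name is mine):

```python
import math

def shannon_entropy(probs, base=math.e):
    """Shannon entropy H = -sum(p_i * log(p_i)); natural log by default."""
    # Terms with p = 0 contribute nothing (the p log p limit is 0), so skip them.
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin toss, measured in bits (base 2):
print(shannon_entropy([0.5, 0.5], base=2))  # prints 1.0
```

A predictable source — say a coin weighted 9-to-1 — scores well under one bit per toss, which is Wiener’s point about 2, 4, 6, 8 in miniature.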
Very good, very deep: information, 1940s style. Then the jack-of-many-sciences and teacher nonpareil Gregory Bateson absorbed what the phone-company men said and went deeper, more simply. Bateson defined information as “any difference that makes a difference.”
My own emphasis is on complex information. Emphatically I deny any (one-tier) equation between data and information. Data may be muddled; information may be ambiguous. The information in a paradox may be nigh infinite.