Conversation 1: Information Is Not Uncertainty

1999 Jul 15
Please help me understand the relationship between source uncertainty and the reduction of uncertainty at the receiver. As far as I can understand Shannon's equations, they state that the more random the source generator (the larger the value of H(x)), the greater the reduction of uncertainty can be for the receiver:
Right.
If all characters are equally likely, p(i) = 1/n, then H(x) is at its maximum value, log2(n). This is equivalent to transmitting a random string.
Right.
If the characters are weighted, with some more likely than others, then H(x) is less than that potential maximum value (for the same n).
Right.
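(Aside: here is a quick numerical sketch of both points. The Python code, the 4-symbol alphabet, and the 0.7/0.1/0.1/0.1 weighting are illustrative choices, not part of the original exchange.)

    import math

    def entropy(probs):
        # Shannon uncertainty H(x) = -sum p(i) log2 p(i), in bits per symbol
        return -sum(p * math.log2(p) for p in probs if p > 0)

    n = 4
    uniform  = [1.0 / n] * n          # all characters equally likely
    weighted = [0.7, 0.1, 0.1, 0.1]   # some characters more likely (illustrative)

    print(entropy(uniform))    # 2.000 bits: the maximum, log2(n)
    print(entropy(weighted))   # 1.357 bits: below the maximum for the same n

Any deviation from equal probabilities pulls H(x) below log2(n).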
Since Information is the reduction of uncertainty:
R = H(x) - Hy(x)
Shannon usually wrote Hy(x) for the latter (the equivocation, "the average ambiguity of the received signal").
it seems to me that transmitting the random string would produce a greater reduction of uncertainty than transmitting the weighted string, and thus more information would be transmitted.
That's right. There is one more thing to add: the more random the initial string is, the more information can be transmitted. But random is a matter of perspective in this case. It means that the perfect transmission code looks random to an outsider, because it has no correlations or unequal weightings. To someone who knows how to read the code, it is a wonderfully clear message because it can have most of the noise removed by proper decoding. Shannon's amazing channel capacity theorem says that as long as the rate R is less than or equal to the channel capacity C, the error rate can be made as low as desired! This has rather interesting implications for biology: it allows the evolution of 'near perfection' in communication. (See CCMM for details of applying the theorem to molecular biology.)
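(Aside: to make R = H(x) - Hy(x) concrete, here is a small sketch using the simplest textbook case, a binary symmetric channel that flips each transmitted bit with probability q. The code and the choice of q values are illustrative, not part of the original exchange. With equally likely 0s and 1s at the source, H(x) = 1 bit and the equivocation Hy(x) equals the binary entropy of q.)

    import math

    def h2(q):
        # binary entropy function, in bits
        if q in (0.0, 1.0):
            return 0.0
        return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

    # Binary symmetric channel: each transmitted bit is flipped with
    # probability q.  With equally likely source symbols, H(x) = 1 bit
    # and the equivocation Hy(x) = h2(q), so R = H(x) - Hy(x) = 1 - h2(q).
    for q in (0.0, 0.01, 0.11, 0.5):
        H_x  = 1.0       # source uncertainty, bits per symbol
        Hy_x = h2(q)     # uncertainty remaining about x after seeing y
        print("q = %.2f:  R = %.3f bits per symbol" % (q, H_x - Hy_x))

At q = 0.5 the received symbol says nothing about what was sent (Hy(x) = H(x), so R = 0). For this channel the R computed above is also the capacity C, and the theorem guarantees codes that approach it with as few errors as you like.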
I am no expert on information theory; this is simply the understanding I have gleaned from reading through Shannon's paper "A Mathematical Theory of Communication" and several commentaries (including yours) on the topic of Information Theory. I am very interested in this topic, and I would greatly appreciate any clarification of my understanding that you can provide.
I hope that helps.
Thank you very much for your time.
You're welcome.
Jeff Abramsohn
jabramsohn@jhancock.com
(with permission to put on the web)
Tom
Dr. Thomas D. Schneider
toms@alum.mit.edu
https://alum.mit.edu/www/toms/


origin: 1999 July 15
updated: 1999 July 15