In article <E8udnR4LarelIizTnZ2dnUVZ_ridnZ2d@earthlink.com>, VWWall <email@example.com> wrote:
> I was at Bell Labs in 1949, and Betty Moore, Claude Shannon's wife,
> worked in the same department. She complained that their new carpet
> didn't have deep enough nap to hide the solder droppings from the
> computer Claude was building. It could perform math functions with its
> input/output being in Roman numerals.
>
> > It's sort of funny, he called it communication
> > theory, it was his disciples who popularized
> > 'information theory'.
>
> Shannon took a very practical outlook on "information". I recall how
> he would stop people in the halls and ask them to guess the next letter
> in a sentence he'd show them. I remember one as: "A motorcycle has no
> reverse; it can not back up." He used this method to determine the
> redundancy in the English language.
>
> Most of the time, he was riding his unicycle down the hall!
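The guessing game above was Shannon's way of bounding the entropy of English. A much cruder first-order version of the same idea can be sketched in a few lines of Python (this is only an illustration, not Shannon's actual method: it uses single-letter frequencies, so it understates the redundancy his predictors revealed):

```python
# Rough first-order sketch of "redundancy": compare the empirical
# per-letter entropy of a text sample against the log2(27) bits a
# uniform source over 26 letters plus space would need.
# Shannon's guessing game captured higher-order structure too, so
# his estimates of redundancy were considerably larger.
import math
from collections import Counter

def letter_entropy(text: str) -> float:
    """Per-symbol entropy in bits over letters and space."""
    symbols = [c for c in text.lower() if c.isalpha() or c == " "]
    counts = Counter(symbols)
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

sample = "a motorcycle has no reverse it can not back up"
h = letter_entropy(sample)
h_max = math.log2(27)            # uniform over 26 letters + space
redundancy = 1 - h / h_max
print(f"H = {h:.2f} bits/symbol, redundancy = {redundancy:.0%}")
```

Even this toy estimate comes out well below the 4.75 bits/symbol maximum; Shannon's predictor experiments pushed the estimate down to roughly one bit per letter.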
The seminal introductory work for me was the undergraduate text "Coding and Information Theory" by the late Richard W. Hamming, 1980, ISBN 0-13-139139-9 (perhaps one of the first books to carry an ISBN). It cost me $26.35 (CDN), hardcover, and I devoured every morsel of it.
His later work "The Art of Probability" is also worth a read for anyone who still believes in randomness.
-- Random : An infinitesimal, yet omnidimensional, god of science. Random often appears in the guise of a trickster named Error.