Claude Shannon - Father of the Information Age
David Neuhoff, University of Michigan: Digitization was an offshoot of Shannon’s ideas
David Neuhoff:
His discoveries were very much like Einstein’s, in the sense that Einstein was thinking about some things that no one else was, and suddenly came out with not just the question but the answer, like E=mc^2. [That] initiated all the research in atomic energy, you know, the fact that there is so much energy and this is how much there is. Well, Shannon discovered these formulas about information transmission, how fast, how many bits per second, you should be able to transmit over various media, and other people weren’t asking the question. And he came out with this answer and it was just so beautiful that it inspired the people who ended up designing your cell phone, and the communication links that make up the Internet.
Shannon’s master’s thesis (1938, when he was only 22) applied Boole’s two-valued algebra and symbolic logic to the on and off positions of switching circuits, and described the idea of using such circuits to build a “logic machine”.
Robert McEliece, CalTech:
It sounds simple to say now, but at the time logic was a philosophical field, focused on arriving at correct conclusions from sound principles and a set of premises. He showed that AND, OR, and NOT, the simple connectives of Boolean algebra, could be used to analyze and design switching circuits.
People have said it’s the most influential master’s thesis in history, which is certainly true, but it understates the point. If he had done nothing else, he’d still be famous for inventing digital logic.
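(Not from the film: a minimal sketch of the thesis’s idea, with my own naming. Switches in series behave like AND, switches in parallel like OR, and a normally-closed contact like NOT, so any relay circuit can be written and simplified as a Boolean expression.)

```python
def series(a: bool, b: bool) -> bool:
    """Two switches in series conduct only if both are closed: AND."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Two switches in parallel conduct if either is closed: OR."""
    return a or b

def normally_closed(a: bool) -> bool:
    """A normally-closed contact conducts when its control is off: NOT."""
    return not a

# Example: a circuit that conducts iff exactly one switch is on (XOR),
# built purely from the series/parallel/normally-closed primitives.
def xor_circuit(a: bool, b: bool) -> bool:
    return parallel(series(a, normally_closed(b)),
                    series(normally_closed(a), b))

for a in (False, True):
    for b in (False, True):
        print(a, b, xor_circuit(a, b))
```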
[6:00]
In 1941 he joined Bell Telephone Laboratories; at the time, the Blitz over Britain was underway, so his early work focused on anti-aircraft fire-control devices that could calculate aim for counterfire against German rockets and airplanes. Bell also put Shannon onto cryptography.
In “Communication Theory of Secrecy Systems” (1949) he spelled out the fundamentals of modern cryptography.
Ian Blake, University of Toronto: recalls a conversation with Horst Feistel indicating that DES was heavily inspired by Shannon’s principles of confusion and diffusion.
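(A toy illustration, mine rather than Shannon’s or Feistel’s: one substitution-permutation round, where the S-box lookup supplies confusion and the bit rotation supplies diffusion. Real ciphers like DES iterate far stronger versions of both.)

```python
# Toy substitution-permutation round on an 8-bit block (illustrative only).
SBOX = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
        0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]   # confusion

def permute8(byte: int) -> int:
    """Diffusion: rotate the 8-bit block so nibble boundaries mix."""
    return ((byte << 3) | (byte >> 5)) & 0xFF

def round8(byte: int, key: int) -> int:
    byte ^= key                         # key mixing
    hi, lo = byte >> 4, byte & 0xF
    byte = (SBOX[hi] << 4) | SBOX[lo]   # confusion via S-box substitution
    return permute8(byte)               # diffusion via bit permutation

# Two plaintexts differing in one bit diverge widely after a few rounds.
x, y, key = 0b00000001, 0b00000011, 0x5A
for _ in range(4):
    x, y = round8(x, key), round8(y, key)
print(f"{x:08b} vs {y:08b}")
```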
After the war, Shannon focused on how to apply binary values to telephone switching circuits.
Robert Lucky, Telcordia Technologies: back in the ’40s, building and improving the world’s telephone network was almost a religious undertaking.
Jack Keil Wolf, UCSD Jacobs School of Engineering:
One of the first things they figured out as a result of this is that you can’t just make a telephone line longer and longer and put amplifiers in the middle of the line. As the line gets longer the signal starts getting smaller, so eventually you have to amplify it, make it bigger again, so that it can travel the next stretch. But every time you amplify the signal, you amplify all the imperfections that have gotten into the signal, which we call noise. And pretty soon, all you have is noise.
That led Shannon to fundamentally rethink how telephone conversations and other messages (information, as he called it) were transmitted.
Jack Keil Wolf:
In Shannon’s case, what happened was, and it was a tremendous surprise, that there was a common commodity associated with all kinds of information, whether it be radio or television or satellites et cetera. And this common commodity was information itself, and information could be represented in binary form, as a sequence of bits.
Shannon’s 1948 “A Mathematical Theory of Communication” contains the first published use of the word bit (binary digit), though he credits the term to John Tukey.
Solomon Golomb, University of Southern California:
The Tukey bit and the Shannon bit really aren’t the same thing. The popular notion of a bit is simply a unit of storage that you store either a 0 or a 1 and that’s a bit. Tukey coined this as simply a contraction of binary digit. The Shannon bit is a unit of information. And a storage bit contains at most one bit of information, but quite often a lot less.
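(A worked example of Golomb’s distinction, with my own numbers: the information content of a stored binary value is its entropy, which reaches one full bit only when 0 and 1 are equally likely.)

```python
import math

def entropy_bits(p: float) -> float:
    """Shannon entropy of a binary source with P(1) = p, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(entropy_bits(0.5))   # 1.0    -> a fair bit carries a full bit of information
print(entropy_bits(0.9))   # ~0.469 -> a biased bit carries less than half a bit
print(entropy_bits(1.0))   # 0.0    -> a fully predictable bit carries nothing
```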
Shannon’s 1948 paper in the Bell System Technical Journal made him a star overnight.
Toby Berger, Cornell University:
It’s one of the few times in history where somebody founded a field, stated all the major results, and proved most of them, all pretty much at once.
Jack Keil Wolf: Bell came up with a “regenerative repeater”. So now instead of transmitting analog signals (voices), we transmit bits. As long as the signal carrying each bit hasn’t degraded too badly, the bits can be reconstructed perfectly at each repeater, which then forwards a clean reconstruction of the original.
In this way you can counteract noise: whatever fidelity is lost when encoding to a bit sequence is decided upfront, according to tolerances, rather than accruing in transit. This is better than analog transmission, where noise accrues as a function of distance.
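(A small simulation of that contrast, with assumed noise parameters: an analog chain re-amplifies its accumulated noise at every hop, while a regenerative repeater thresholds back to clean bits at each hop.)

```python
import random

random.seed(0)
HOPS, NOISE = 20, 0.15          # per-hop additive noise level (assumed)
bits = [random.randint(0, 1) for _ in range(1000)]

# Analog chain: noise is added each hop and carried (re-amplified) forward.
analog = [float(b) for b in bits]
for _ in range(HOPS):
    analog = [s + random.gauss(0, NOISE) for s in analog]

# Digital chain: each repeater thresholds back to 0/1 before forwarding.
digital = bits[:]
for _ in range(HOPS):
    noisy = [b + random.gauss(0, NOISE) for b in digital]
    digital = [1 if s > 0.5 else 0 for s in noisy]

analog_errs = sum(b != (1 if s > 0.5 else 0) for b, s in zip(bits, analog))
digital_errs = sum(b != d for b, d in zip(bits, digital))
print(analog_errs, digital_errs)  # analog errors pile up; digital stays small
```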
Ramesh Rao, CA Institute for Telecommunications & Information Technology: Shannon is known for discovering a fundamental limit (now called the Shannon limit or Shannon capacity) on the maximum rate at which information can be transmitted over a given medium, determined by its bandwidth and noise level.
Robert McEliece:
Since it was a prediction, it’s like the speed of light: he said you can travel no faster than the speed of light, but he didn’t say how to build rocket ships to do that. And so we’ve been trying since that time (when I say we, I mean the engineers, the mathematicians, the computer scientists who followed, who accepted the challenge that Shannon laid out, so to speak) to find practical ways to get all the way up to channel capacity.
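(For a band-limited channel with Gaussian noise, the limit they are describing takes the Shannon-Hartley form C = B·log2(1 + S/N). A quick worked example with standard textbook numbers, not from the film:)

```python
import math

def capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley capacity of a band-limited channel with Gaussian noise."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# A voice-grade phone line: roughly 3 kHz of bandwidth at ~30 dB SNR.
print(capacity_bps(3000, 30))   # ~29,900 bit/s, roughly what late dial-up modems achieved
```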
[14:00]
G. David Forney, who studied under Shannon at MIT, went on to design the first computer modems for Motorola, along with codes and algorithms for data transmission.
Shannon’s 1948 paper also laid the foundation for data compression, decades before it could be used. He saw the need and formulated the theory.
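(A minimal illustration of that source-coding idea, example text mine: the empirical entropy of a message lower-bounds the average bits per symbol of any lossless code, often far below a fixed 8 bits per character.)

```python
import math
from collections import Counter

def empirical_entropy(text: str) -> float:
    """Average bits per symbol needed by an optimal lossless code."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

msg = "abracadabra alakazam " * 50
h = empirical_entropy(msg)
print(f"{h:.2f} bits/char vs 8 bits/char fixed-width")
print(f"compression limit: {h / 8:.0%} of original size")
```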
He also came up with the concept of adding redundant bits to a message to detect or correct errors incurred during transmission or storage.
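(A compact sketch of that idea using the standard Hamming(7,4) code; the implementation is mine. Three parity bits per four data bits let the receiver locate and flip any single corrupted bit.)

```python
# Hamming(7,4): 4 data bits + 3 parity bits; corrects any single bit error.

def encode(d: list[int]) -> list[int]:
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    # Positions 1..7: p1 p2 d1 p3 d2 d3 d4 (parity bits at powers of two).
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c: list[int]) -> list[int]:
    p1, p2, d1, p3, d2, d3, d4 = c
    s1 = p1 ^ d1 ^ d2 ^ d4
    s2 = p2 ^ d1 ^ d3 ^ d4
    s3 = p3 ^ d2 ^ d3 ^ d4
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based position of the flipped bit
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1          # correct the single error
    return [c[2], c[4], c[5], c[6]]   # recover d1..d4

word = [1, 0, 1, 1]
sent = encode(word)
sent[4] ^= 1                          # corrupt one bit in transit
assert decode(sent) == word
print("corrected:", decode(sent))
```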
Andrew Viterbi, who designed coding algorithms essential to mobile telephone networks, notes that it took years for technology to catch up with the digital future that Shannon foresaw.
Andrew Viterbi, Co-Founder Qualcomm:
Where we used to have one transistor per chip, we now have somewhere between 10 million and a hundred million. All of that was necessary in order to make the Shannon theories practical. And it’s quite interesting to note that Shannon’s papers came out in 1948, and late in 1947 the transistor was invented in the same Bell Laboratories environment, Murray Hill, New Jersey, and the two went hand-in-hand.
According to Viterbi, the first practical testing ground for Shannon’s ideas was JPL, during the Space Race. Solomon Golomb also worked at JPL during this time.
Solomon Golomb:
When I started talking in the late ’50s at JPL about digital communications, this was considered a contradiction in terms by the traditional communication theorists; they were so wedded to the notion that communication involved continuous waveforms and continuous modulation. So the whole idea that communications could be, and were, moving in the direction of going digital was a new idea heavily influenced by Shannon. And the systems we started designing for deep space communication were, from that point of view, very much influenced by this whole new concept. The basic notion Shannon contributed to communications was that you could compare how well you were doing against an underlying model that told you the limit, the capacity of your channel, in a theoretical sense.
In 1949 he built a rudimentary chess-playing machine, and wrote a paper about how to program computers for chess, including strategies that computer chess algorithms still use today.
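(The core strategy in that paper is minimax search over an evaluation function. A runnable sketch on a toy game of my own choosing, not Shannon’s chess program:)

```python
# Minimal minimax sketch on a Nim-like toy game (my choice, not chess):
# take 1 or 2 stones per turn; whoever takes the last stone wins.

def minimax(stones: int, maximizing: bool) -> int:
    """Return +1 if the maximizing player wins with perfect play, else -1."""
    if stones == 0:
        # The previous player took the last stone and won.
        return -1 if maximizing else 1
    results = (minimax(stones - take, not maximizing)
               for take in (1, 2) if take <= stones)
    return max(results) if maximizing else min(results)

for n in range(1, 8):
    winner = "first" if minimax(n, True) == 1 else "second"
    print(n, "stones:", winner, "player wins")
```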
[18:45]
He also worked on rudimentary AI, such as Theseus, his maze-solving mechanical mouse.
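(Theseus itself solved mazes by trial and error with relay-based memory; as a modern stand-in for the same task, a breadth-first search over a small grid maze, formulation mine:)

```python
from collections import deque

# Breadth-first search through a small grid maze (1 = wall, 0 = open).
MAZE = [
    [0, 1, 0, 0],
    [0, 1, 0, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
]

def solve(start, goal):
    queue, came_from = deque([start]), {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:                 # walk parent pointers back to start
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r+1, c), (r-1, c), (r, c+1), (r, c-1)):
            if (0 <= nr < 4 and 0 <= nc < 4 and MAZE[nr][nc] == 0
                    and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

print(solve((0, 0), (3, 3)))
```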
In 1956 he left Bell Labs for a teaching position at MIT, and his work shifted mostly to inventions he referred to as useless.
Thomas Cover, Stanford University, won the Shannon Award in 1990 for extending information theory into investment analysis, a subject Shannon never wrote about but had delivered two lectures on. Privately, Shannon and his wife made a killing in the stock market by investing in technology startups owned by friends, such as Teledyne and Hewlett-Packard.
Ed Thorp, who wrote “Beat the Dealer” (about how to play blackjack optimally and count cards), had approached Shannon for help submitting papers to journals, and got Shannon interested in games, such as roulette prediction.
In 1973, 25 years after his groundbreaking paper, the Information Theory Society within the IEEE established an annual Shannon lecture that evolved into the Shannon Award.
Shannon retired from MIT in 1978, but could never really escape his fame.
Shannon never won a Nobel prize for his work, more or less on a technicality: there is no Nobel prize for mathematics or engineering. But in the 1980s the Inamori Foundation in Japan established the Kyoto Prize, intended as a counterpart to the Nobel prize, and Shannon won the first Kyoto Prize (1985).
Shannon’s last major interview was in 1987, for Omni magazine. By the late 1980s, friends noticed he was suffering from Alzheimer’s. When the Bell Labs buildings in Florham Park, New Jersey were renamed AT&T Shannon Laboratories in 1998, Shannon himself was in a nursing home and unable to attend. He passed away in 2001.
David Neuhoff:
My guess is we’re 10, 20 years ahead of where we would’ve been if Shannon hadn’t been there to make these discoveries. It would’ve just been a lot of small discoveries. And he presented us with this big clear picture, all at once.
Andrew Viterbi:
What he did for communications and information theory in general was startling and momentous. And if he hadn’t come along, it probably would’ve taken us 30 or 40 years to come up with, probably, only a subset of his inventions.