Information Theory

of Claude Shannon & Warren Weaver

(From the Third Edition of A First Look at Communication Theory by Em Griffin, © 1997, McGraw-Hill, Inc. This text-only version of the article appears on the World Wide Web site www.afirstlook.com. The text version does not contain any figures. A facsimile of the original article, which includes all figures, is also available in PDF format.)

In the late 1940s, a Bell Telephone Company research scientist by the name of Claude Shannon developed a mathematical theory of signal transmission. As you might expect from a telephone engineer, his goal was to get maximum line capacity with minimum distortion.

Shannon showed little interest in the semantic meaning of a message or its pragmatic effect on the listener. Like today’s manufacturers of state-of-the-art compact disc players, he wasn’t concerned whether the channel carried Beethoven, the Beatles, or The Boss. He didn’t care whether the listener preferred the beat of rock or the counterpoint of Bach. His theory merely aimed at solving the technical problems of high-fidelity transfer of sound.


TECHNICAL SOLUTIONS TO SOCIAL PROBLEMS

In the wake of scientific discoveries spawned by World War II, Americans were optimistic that all social problems could be recast into mechanical terms susceptible to engineering solutions. Shannon was somewhat wary about the wholesale application of his mathematical equations to the semantic and pragmatic issues of interpersonal communication. But his hesitation was not shared by Warren Weaver, an executive with the Rockefeller Foundation and the Sloan-Kettering Institute for Cancer Research, and a consultant to a number of private scientific foundations. Shannon’s published theory was paired with an interpretive essay by Weaver that presented information theory as "exceedingly general in its scope, fundamental in the problems it treats, and of classic simplicity and power in the results it reaches."1 The essay suggested that whatever the communication problem, reducing information loss was the solution.

Most people working in the field of human communication had trouble following the mathematics of Shannon’s theory, but Weaver’s translation and commentary were easy to understand. Since the discipline was ripe for a model of communication and information theory was there to fill the need, its source-channel-receiver diagram quickly became the standard description of what happens when one person talks to another. Many of the terms we use today originated with Shannon and Weaver—message fidelity, multiple channels, information loss, source credibility, and feedback.

Because Shannon’s theory explores the electronic transmission of messages, it might seem appropriate to discuss it in the context of mass media theories. But his twenty-three theorems focus on syntax, the relationship between words. Research since the theory’s introduction has contributed mainly to the field of applied linguistics. For these reasons, I include information theory in the section on messages.


A LINEAR MODEL OF COMMUNICATION

Since Bell Laboratories paid the bill for Shannon’s research, it seems only fair to use a telephone example to explain his model, which is shown in Figure 4.1. Imagine you have a summer job at a camp located far from civilization. A few weeks’ absence from a romantic partner has given you a strong desire to "reach out and touch someone." Finances, work schedule, and a line of others wanting to use the only pay phone available limit you to a three-minute long-distance call.

Shannon would see you as the information source. You speak your message into the telephone mouthpiece, which transmits a signal through the telephone-wire channel. The received signal picks up static noise along the way, and this altered signal is reconverted to sound by the receiver in the earpiece at the destination end of the line. Information loss occurs every step of the way so that the message received differs from the one you sent.

Weaver applied the model to the interpersonal features of conversation. Your brain is the information source, your voice the transmitter. Noise could include a hoarse throat from yelling at the campers, background chatter of those waiting to use the phone, or the distraction of mosquitoes drawing blood. The received signal may be diminished by an ear that’s been overexposed to hard rock, and your friend is quite capable of altering the message as it moves from ear to brain.

Shannon concentrated on the technical center of his model. (Will the phone system work well enough for you to get your message across?) Weaver focused on the source-destination relationship. (What’s going on between the two of you?) But all information theorists share a common goal of maximizing the amount of information the system can carry.


INFORMATION: THE REDUCTION OF UNCERTAINTY

Most of us are comfortable with Wilbur Schramm’s notion that information is simply stuff that matters or anything that makes a difference.2 Shannon, however, has a technical definition for the word that doesn’t equate information with the idea of meaning. He emphasizes that "the semantic aspects of communication are irrelevant to the engineering aspects."3 For Shannon, information refers to the opportunity to reduce uncertainty. It gives us a chance to reduce entropy.

Shannon borrowed the idea of entropy from the second law of thermodynamics, which states that the universe is winding down from an organized state to chaos, moving from predictability to uncertainty. Entropy is randomness. How much information a message contains is measured by the extent to which it combats entropy. The less predictable the message, the more information it carries.
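Shannon quantified this idea: if a source chooses among alternatives with probabilities p1, p2, . . . , its entropy is H = -(p1 log2 p1 + p2 log2 p2 + . . .), measured in bits (a unit the next paragraph defines). The chapter doesn’t reproduce the formula, so here is a minimal Python sketch, my illustration rather than anything from Shannon’s paper, showing that an unpredictable message carries more information than a predictable one:

    import math

    def entropy(probabilities):
        """Shannon entropy in bits: H = -sum(p * log2 p)."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # A fair coin is maximally unpredictable: one full bit per toss.
    print(entropy([0.5, 0.5]))    # 1.0

    # A coin that lands heads 99 percent of the time is nearly
    # predictable, so a report of its outcome carries little information.
    print(entropy([0.99, 0.01]))  # about 0.08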

Picture yourself making that long-distance call, but this time in response to a blistering letter from your friend, who has heard that you’re having a summer fling with a co-worker. The letter is clear: "Call me and just say yes it’s true, or no it’s not—nothing more!" That either/or demand means you won’t require the three-minute channel capacity of the telephone line. But since your wavering friend has only an even chance of predicting your answer, that one bit of information will reduce his or her uncertainty by 50 percent. As a matter of fact, that’s how the theory defines a bit (taken from binary digit) of information. It’s communication that can cut entropy in half. Let’s play out the scene a few bits further.


REDUCING ENTROPY BIT BY BIT

At the beginning of the telephone conversation, you truthfully acknowledge romantic feelings for someone else. Your former friend breaks the rule stated in the letter and demands to know which one of the sixteen possible staff workers is the object of your affection. The conversation could literally narrow down the alternatives bit by bit.

friend: Is this special someone on the sports staff or the kitchen crew?

you: Sports staff. [Cut in half to eight.]

friend: Which cabin does this new friend live in, Sequoia or Cherokee?

you: Sequoia. [Cut in half to four.]

friend: First-year staff or an old-timer?

you: First year. [Cut in half to two.]

friend: The redhead or the blond?

you: The blond. [All uncertainty gone.]

Removing all uncertainty took four bits of information. Of course, it would have been less cumbersome simply to say the name in the first place, but either way four bits of entropy were eliminated.
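The arithmetic behind the exchange is worth making explicit: each either/or answer halves the field, so identifying one person among N equally likely candidates takes log2 N bits, and log2 16 = 4. A short Python sketch, offered as my own illustration rather than Griffin’s, confirms the count:

    import math

    candidates = 16        # the sixteen staff workers
    questions = 0
    while candidates > 1:
        candidates /= 2    # each either/or answer cuts the field in half
        questions += 1
    print(questions)       # 4 questions were needed
    print(math.log2(16))   # 4.0 bits, computed directly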

If this conversation really happened, and if you truly cared about the person on the other end of the line, you would try to squeeze every bit of innovative explanation for your conduct into the three-minute period. That’s what Shannon and Weaver mean by information. As they use the term, it "relates not so much to what you do say, as to what you could say."4 Their focus on message possibilities inspired a touch of doggerel from University of Colorado professor Don Darnell:

What one does is only one

Of several things he might have done.

One must know the things rejected

To appreciate the one selected.5

A good connection for three minutes provides lots of opportunity to draw on a wide repertoire of messages. If lack of imagination or situational constraints limit you to a few predictable clichés such as "only good friends" or "doesn’t mean a thing to me," Shannon, Weaver, and probably your ex-friend will regard your efforts as uninformative and redundant.

There are many fine things that can be said over a communication channel that don’t qualify as information. Perhaps your phone call wasn’t crisis motivated, but was merely a way to announce "I just called to say I love you." If the person on the other end had no doubt that you cared in the first place, the call is a warm ritual rather than information. If the destination party already knows what’s coming, or the source isn’t free to choose the message sent, information is zero.


NOISE VS. INFORMATION

Noise is the enemy of information. For Shannon and Weaver, noise is more than an irritating sound or static on the line. It is anything added to the signal that’s not intended by the source. Usually that kind of interference is an unintended by-product of the situation. In nonelectrical channels, noise can be smudged newsprint, ah-um-er vocal filler, or visual movement that distracts the listener. There is a ground-floor seminar room at my college that overlooks a grassy knoll. The first warm day in May brings out a flock of sunbathers to soak up the rays. No teacher can begin to compete with the view; the room is too noisy.

Noise may be intentional. For many years, the government of the former Soviet Union jammed the Voice of America broadcasts so that its citizens wouldn’t hear news from the West. Hecklers try to drown out the words of a speaker in order to prevent the audience from considering an opposing viewpoint. We can even generate white noise to mask more disruptive sounds. That’s the purpose of Muzak. Yet whether accidental or planned, noise cuts the information-carrying capacity of the channel between the transmitter and receiver. Shannon describes the relationship with a simple equation:

Channel Capacity = Information + Noise 6

Every channel has a fixed upper limit on the information it can carry. Even if you resort to a fast-talking monologue in a no-noise environment, your three-minute telephone call restricts you to a maximum of about 600 words, since rapid speech runs roughly 200 words a minute. But conditions are far from ideal. The noise on the line and the static in the mind of your jealous listener guarantee that many of your words won’t be heard. You will need to devote a portion of the channel capacity to repeating key ideas that might otherwise be lost.

The way to offset noise is through increased redundancy. Shannon and Weaver regard communication as the applied science of maintaining an optimal balance between predictability and uncertainty. Without a great amount of repetition, reiteration, and restatement, a noisy channel is quickly overloaded. Yet too much redundancy is inefficient. Needless duplication diminishes our chance to make novel statements, and our initially avid audience may become bored and inattentive.
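One crude but concrete way to watch redundancy offset noise is a repetition scheme: transmit every bit three times and let the receiver take a majority vote. The Python sketch below is my own toy illustration of the trade-off this section describes, not a procedure from Shannon and Weaver; note that the protected message consumes three times the channel capacity of the unprotected one, which is exactly the inefficiency the previous paragraph warns about.

    import random

    def transmit(bits, noise=0.1):
        # A toy noisy channel: flip each bit with probability `noise`.
        return [b ^ (random.random() < noise) for b in bits]

    def encode(bits):
        # Pure redundancy: say everything three times.
        return [b for b in bits for _ in range(3)]

    def decode(bits):
        # Majority vote over each group of three received bits.
        return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

    message = [1, 0, 1, 1, 0, 0, 1, 0]
    plain = transmit(message)                      # unprotected transmission
    protected = decode(transmit(encode(message)))  # most errors voted away
    print(message, plain, protected)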


LEARNING THROUGH FEEDBACK

Shannon and Weaver’s model is deficient in that it represents communication as a one-way flow of information. While the recent spread of voice mail and telephone answering machines may make unidirectional communication seem like the wave of the future, you would be wise to seek a response early in your three-minute phone call. On the basis of the feedback you receive, you can then encode the kind of audience-adapted message that speech teachers regard as the mark of effective communication.

Working independently of the Bell Labs program, MIT scientist Norbert Wiener conceived of human attempts to control entropy through feedback as exactly parallel to what happens in communication machines. During World War II he developed an antiaircraft firing system that adjusted future trajectory by feeding back the results of past performance. Feedback is a way to introduce learning into the system, something ignored by Shannon and Weaver.

Wiener didn’t fit the traditional role of a detached scientist. He considered confusion a personal affront and was fond of quoting Einstein’s comment that "God may be subtle, but not plain mean."7 Wiener was convinced that humans could use thinking machines (we call them computers) to combat chaos. To designate the field of artificial intelligence, he coined the term cybernetics, a transliteration of the Greek word for "steersman" or "governor." He was one of the first to see computers as offering great promise to the human race, but he also feared they would be used by those in power to control people rather than things. His brief book, The Human Use of Human Beings, presents the essential concepts of information theory while adding thoughts on feedback and ethics.

Wiener noted that feedback systems need to be dampened slightly so that they aren’t overly sensitive. One interpretation of the psychotic’s plight is that of a hypervigilant person constantly trying to adjust to the conflicting expectations of everyone else. The implication for verbal feedback is that we should monitor the effect of our words, but not be tyrannized if the response we get falls short of our expectation.
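Wiener’s point about dampening shows up clearly in a toy negative-feedback loop: correct only a fraction of the measured error each cycle and the system settles smoothly; correct too aggressively and it overshoots and oscillates, a mechanical cousin of the hypervigilant patient. The sketch below is my own illustration, not Wiener’s antiaircraft mathematics, and the gain of 0.5 is simply an assumed value chosen to show damped convergence.

    target = 100.0  # where the system is trying to end up
    aim = 0.0       # its current setting
    gain = 0.5      # damped feedback: correct only half the error each cycle

    for step in range(8):
        error = target - aim  # feedback: compare the outcome with the goal
        aim += gain * error   # adjust the next attempt from past results
        print(f"step {step}: aim = {aim:.1f}")
    # aim closes in on 100.0; a gain near 2.0 would overshoot and oscillate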


NARROW APPLICATIONS OF INFORMATION THEORY

Although Shannon’s mathematical model of signal transmission helped Bell Labs solve technological problems, the theory has limited application in the field of speech communication. Information theory did, however, foster modest advances in the study of the redundancy inherent in language, an issue of syntax.

Journalist Wilson Taylor developed a cloze procedure that deletes every nth word from the written text. Try filling in the blanks that replace every seventh word in the following passage from a Nero Wolfe mystery novel:

I had time to get a ___________ of orchid-germination records entered into ___________ PC before Fred came back to ___________ brownstone at four-fifteen. The timing ___________ he wouldn’t run into Wolfe, who ___________ was well into his playtime in ___________ plant rooms. Fred looked almost as ___________ as he had earlier. "What does ___________ think, Archie?" the accused asked as ___________ dropped into one of the yellow ___________.8

Since everyday English is about 50 percent redundant, you were probably able to predict about five of the correct words. Here are the answers so that you can check: batch, the, the, ensured, already, the, frazzled, he, he, chairs. This is a highly readable passage. It would be much harder to supply the missing words from the context if the text were a portion of Shannon’s technical treatise. There are times, however, when we will gladly trade readability for concentrated information. Classified ads omit filler words and many vowels from the message in order to convey more data for the dollar. The advertiser assumes the reader will have sufficient knowledge and motivation to wade through a highly concentrated stream of information.
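Taylor’s procedure is mechanical enough to automate: split the text into words and replace every nth one with a blank. The Python sketch below is my own minimal rendering of the deletion step as the chapter describes it (Taylor’s scoring rules are more involved), and since hyphenated words and punctuation make word counts fuzzy, its blanks won’t necessarily land exactly where the printed passage put them.

    def cloze(text, n=7, blank="__________"):
        # Replace every nth word with a blank (cloze deletion).
        words = text.split()
        for i in range(n - 1, len(words), n):
            words[i] = blank
        return " ".join(words)

    passage = ("I had time to get a batch of orchid-germination records "
               "entered into the PC before Fred came back to the brownstone")
    print(cloze(passage))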

Darnell used the same type of missing word test, but his "clozentropy" technique analyzed individuals rather than language in general. He found the fill-in-the-blanks procedure a reliable exam for competency in English as a second language, and he also used it as a way to spot potential group nonconformists by identifying those whose responses differed from everyone else’s.

As interesting as the syntactical application of information theory may be, it’s a far cry from the communication cure-all that Weaver proclaimed over forty years ago. A few applied researchers have tried to build on Shannon and Weaver’s concept of reducing entropy. For example, Charles Berger’s uncertainty reduction theory is a rare attempt to extend Shannon’s ideas to face-to-face interaction (see Chapter 12). Berger believes that the desire to reduce uncertainty explains much of what goes on when people are in the initial getting-to-know-you phase of a relationship. But for those who applaud Weaver’s attempt to frame information theory as an umbrella to cover syntactics, semantics, and pragmatics, the overall results must seem disappointing. For those who regard Shannon’s equations as technical models of signal transmission, Weaver’s extension into questions of meaning and effectiveness must seem distasteful.


CRITIQUE: IS TRANSMISSION OF INFORMATION OVERRATED?

Shannon and Weaver’s theory has great historical significance. Their model touched off an ongoing search for other physical representations of communication. Had they not conducted their groundbreaking work, this book might never have been written.

The theory’s diagram of information transmission appears in almost every communication textbook. Over the years, millions of students have been exposed to the one-way flowchart that makes information seem like a commodity that is packaged, picked up by UPS, then carried through noisy city streets, delivered to its destination, and finally unwrapped relatively intact. Communication philosopher Walter Ong thinks that’s unfortunate. In his book Orality and Literacy: The Technologizing of the Word he writes: "This model obviously has something to do with human communication, but, on close inspection, very little, and it distorts the act of communication beyond recognition."9 Almost all the other theories I’ll present in the book work to correct this linear conception of communication. Equating information transmission with communication, however, is an idea that dies hard.

Psychologist Janet Beavin Bavelas questions whether reducing uncertainty is always an appropriate communication goal. Along with three of her students at the University of Victoria, she examined numerous cases of equivocal communication that comes from being put on the spot in no-win situations. All of us have found ourselves forced to comment on a recommended movie, book, play, or concert we thought was rotten. Bavelas thinks that the strategic ambiguity of a remark like "Interesting!" is superior to a straightforward response. My favorite example of equivocal communication is the schizophrenic patient who sent his mother a Mother’s Day card that read, "For someone who has been like a mother to me."10 Bavelas writes:

Equivocation is not the deliberately deceitful "dirty old man" of communication. It is subtle, often commendable, and entirely understandable, if only the observer will expand his or her analysis to include the communication situation. When seen in context, not making sense does make sense.11

Information theory appears to ignore the human factor in human communication. When applied to interpersonal communication, Shannon and Weaver’s model reduces people at the destination end to unfeeling bowling pins who have no say in whether they stand or fall.

Social science literature on romantic jealousy also suggests the marginal usefulness of Shannon and Weaver’s concept of information.12 When a couple manages to repair a damaged relationship, the result is usually due to third-party counseling, building self-esteem or encouraging assertiveness in the jealous partner, or a joint celebration and reconstruction of past times together. New interpretations are much more important than new information.


QUESTIONS TO SHARPEN YOUR FOCUS

  1. Shannon and Weaver use the term information in a highly specialized way. How do they define information?
  2. There are 512 pages in a book. If I tell you I am reading page 317, I have communicated 9 bits of information. Can you explain why?
  3. What are some examples of noise that you experienced as you read this chapter?
  4. Can you think of a recent phone call where your communication goal wasn’t the reduction of uncertainty?


A SECOND LOOK

Recommended resource: Norbert Wiener, The Human Use of Human Beings, Avon, New York, 1967.

Comprehensive statement: Claude Shannon and Warren Weaver, The Mathematical Theory of Communication, University of Illinois, Urbana, 1949.

Introduction to concepts: Donald Darnell, "Information Theory: An Approach to Human Communication," in Approaches to Human Communication, Richard Budd and Brent Ruben (eds.), Spartan, New York, 1972, pp. 156-169.

Overview: Allan Broadhurst and Donald Darnell, "An Introduction to Cybernetics and Information Theory," Quarterly Journal of Speech, Vol. 51, 1965, pp. 442-453.

Advocates of broad theory: Seth Finn and Donald Roberts, "Source, Destination, and Entropy: Reassessing the Role of Information Theory in Communication Research," Communication Research, Vol. 11, 1984, pp. 453-476.

Advocate of narrow theory: David Ritchie, "Shannon and Weaver: Unraveling the Paradox of Information," Communication Research, Vol. 13, 1986, pp. 278-298.

Meaningful information: Robert Wright, "Information in Formation," in Three Scientists and Their Gods: Looking for Meaning in an Age of Information, Harper & Row, New York, 1988, pp. 79-110.

Cloze research: Wilson Taylor, "Cloze Procedure: A New Tool for Measuring Readability," Journalism Quarterly, Vol. 30, 1953, pp. 415-433.

Clozentropy: Donald Darnell, "Clozentropy: A Procedure for Testing English Language Proficiency of Foreign Students," Speech Monographs, Vol. 37, 1970, pp. 36-46.

Equivocal communication: Janet Beavin Bavelas, Alex Black, Nicole Chovil, and Jennifer Mullett, Equivocal Communication, Sage, Newbury Park, Calif., 1990.