The Psychology of Communication


2.4 Transportation Theory of Communication

The theory of communication associated with the behaviorist concept of the person is the Shannon-Weaver model of communication [SHANNON & WEAVER]. Information is transmitted by a source over a channel to a destination. For example, right now I am the source, you are the destination, and we are communicating over the visual channel. The information transmitted by the source is not necessarily the information received by the destination. You may receive information which I did not transmit (noise) and I may transmit information which you do not receive (equivocation). The criterion of success is the proportion of the transmitted information that is received - that is, the overlap between the information transmitted by the source and the information received by the destination (see Figure 2-3).
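The model can be caricatured in a few lines of code - a toy sketch, not Shannon and Weaver's own formalism, with the information items invented purely for illustration. The message is treated as a set of items; the channel drops some (equivocation) and injects others (noise), and success is the overlap:

    # Toy model: the message as a set of information items.
    transmitted = {"last=GARDINER", "first=WILLIAM", "middle=LAMBERT"}
    received    = {"last=GARDINER", "first=WILLIAM", "static hiss"}

    equivocation = transmitted - received   # transmitted but not received
    noise        = received - transmitted   # received but not transmitted
    overlap      = transmitted & received   # the criterion of success

    print(f"success: {len(overlap) / len(transmitted):.0%}")   # 67%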

Let us say that you know my last name is GARDINER but you do not know my first and middle names. If I now tell you that my last name is GARDINER, I provide you with no information. You already knew this. Information from the source (in this case, me) to the destination (in this case, you) is thus a function of uncertainty at the destination. Let us now say that I tell you my first name is WILLIAM. I provide you with information, since you did not already know this. Let us now say that I tell you my middle name is LAMBERT. Once again, I provide you with information because you did not already know this. However, I provided you with more information when I told you that my middle name was LAMBERT than when I told you my first name was WILLIAM, because there was more uncertainty at the destination. That is, you were more likely to guess that my first name was WILLIAM (every Tom, Dick, and Harry is called WILLIAM) than to guess that my middle name was LAMBERT.
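This intuition has a standard quantitative form, left implicit in the passage above: the information conveyed by a message is log2(1/p) bits, where p is the probability the destination assigned to that message beforehand. A minimal sketch, with prior probabilities invented purely for illustration:

    from math import log2

    def surprisal(p):
        """Information, in bits, conveyed by an event of probability p."""
        return log2(1 / p)

    # Hypothetical prior probabilities at the destination:
    p_william = 0.05    # a common first name - easy to guess
    p_lambert = 0.001   # an uncommon middle name - hard to guess

    print(surprisal(p_william))   # ~4.3 bits
    print(surprisal(p_lambert))   # ~10.0 bits - more uncertainty, more information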

The amount of information transmitted from the source can thus be measured as a function of the amount of uncertainty at the destination. Information theorists define the bit (binary digit) as the amount of information which cuts uncertainty in half. Thus, if I toss a coin and tell you that it came down HEADS, I transmit 1 bit of information, because there were 2 equally likely alternatives - HEADS and TAILS. With 4 equally likely alternatives, then, I transmit 2 bits of information; with 8, 3 bits; with 16, 4 bits; and so on. The amount of information in the result of tossing a die is between 2 and 3 bits; in the choice of a letter from the alphabet, between 4 and 5 bits; in the choice of a card from a pack, between 5 and 6 bits (see Figure 2-4).
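In general, N equally likely alternatives carry log2(N) bits - the number of times the set of alternatives can be cut in half. A quick check of the figures above:

    from math import log2

    for name, n in [("coin", 2), ("die", 6), ("letter", 26), ("card", 52)]:
        print(f"{name}: log2({n}) = {log2(n):.2f} bits")

    # coin: log2(2) = 1.00 bits
    # die: log2(6) = 2.58 bits
    # letter: log2(26) = 4.70 bits
    # card: log2(52) = 5.70 bits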

In real-life situations, however, letters of the alphabet are seldom equally likely. The likelihood of each letter is a function of the context in which it is found. Claude Shannon illustrated this with the Shannon Guessing Game. I am thinking of a four-letter word - guess the first letter. After some time, let us say you guess correctly that it is a Q. Now guess the second letter. You immediately guess correctly that it is a U. The second guess was easier because you had a context - in English, Q is almost always followed by U. Now guess the third letter. Once again, this is easier than guessing the first letter but harder than guessing the second letter, because the context reduces the options to the vowels. Let us say you guess correctly that it is an I. Now guessing the fourth letter is easy, because there are only a few letters which, added to QUI, create an English word. The word, by the way, was QUIZ.
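The same point can be restated in bits: the information carried by each successive letter shrinks as context accumulates. A toy sketch, with made-up probabilities standing in for real English letter statistics:

    from math import log2

    # Hypothetical probability of the correct letter at each position,
    # given the letters already revealed (illustrative values only):
    guesses = [
        ("Q, with no context", 1 / 26),   # any letter is possible
        ("U, after Q",         0.99),     # Q is almost always followed by U
        ("I, after QU",        1 / 5),    # context narrows the field to the vowels
        ("Z, after QUI",       1 / 4),    # few letters complete an English word
    ]

    for event, p in guesses:
        print(f"{event}: {log2(1 / p):.2f} bits")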

Shannon was illustrating the fact that a language does not consist of a random series of equally likely letters but provides a context in which certain letters are more likely than others. This feature of language is called redundancy, and it explains why we are able to understand one another even though some of the information transmitted by the source is not received at the destination. We can fill in the gaps. As we gain more and more competence in a language, we can fill in bigger and bigger gaps.
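Redundancy can be estimated as the gap between the maximum possible information per letter and the actual information per letter. The rough sketch below uses single-letter frequencies only, which greatly understates the effect; Shannon's estimates for English, which also take longer-range context into account, put its redundancy at roughly 50 percent.

    from math import log2
    from collections import Counter

    def entropy_per_letter(text):
        """Bits per letter, estimated from single-letter frequencies alone."""
        letters = [c for c in text.lower() if c.isalpha()]
        counts = Counter(letters)
        total = len(letters)
        return -sum((n / total) * log2(n / total) for n in counts.values())

    sample = "we can fill in the gaps because language is redundant"
    h = entropy_per_letter(sample)
    h_max = log2(26)   # if all 26 letters were equally likely

    print(f"redundancy ~ {1 - h / h_max:.0%}")   # ~19% on this tiny sample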