
9.22: From Behavior to Experience

In Section 2.11, a history of informatics was presented, somewhat whimsically, in terms of Rob the Robot searching for his roots. Rob could, of course, never exhibit any interest in his ancestry. Rob, being a mechanism, no matter how "intelligent", is capable only of behaviour and not of experience. Since that distinction is very important, let me elaborate on it.

As I sit here word-processing, you could observe my behaviour - that is, the manifestation of the functioning of my nervous system as seen from the outside. I, in my exclusive ringside seat "within" this nervous system, am observing my experience - that is, the manifestation of the functioning of my nervous system as seen from the inside. Rob can behave but he cannot experience. He is not aware of himself and is thus not interested in where that self came from.

Cassandra argues that there is nothing left for people to do when microprocessors "invade" our society. This is a familiar argument. As machines invaded the agricultural society, the percentage of people required to feed us all shrank. The "surplus" people moved into the industrial sector of society. Then, as machines invaded the industrial society, the percentage of people required to produce our material goods shrank. The "surplus" people moved into the service sector of the economy. Now, as machines invade the service sector, there is nowhere for the "surplus" people to go. The generation being born just now could be a surplus population, many of whom will never have jobs. Robotics replaced human brawn; now bureautics is replacing human brain. What is there left? The assumption that there is nothing left betrays a very limited view of the person - that is, the behaviouristic view described in Chapter 6. It assumes that we are capable of behaviour but not of experience.

Victor Borge says "I am now going to play the minute waltz - I will use both hands and get it over with in 30 seconds". We laugh.

An efficiency expert, after hearing a performance of Schubert's Unfinished Symphony, says "There were huge gaps between notes. There was incredible redundancy - the same phrases were repeated, over and over again, and by various different instruments. If Schubert had been more efficient, he would have finished his symphony." We laugh.

A husband rushing home at noon, says to his wife "Roger Bannister ran a mile in four minutes. Everyone was very impressed. That was nothing. I'm now going to perform the four-minute - er, love session." We laugh.

We laugh, in each case, because we recognize that there are certain activities for which the efficiency criterion does not apply. They are activities in which human experience rather than human behaviour is most important.

So what remains, even if all human behaviour could be simulated by machines? Human experience remains. I would rather spend an afternoon drinking and chatting in the corner bar with you than with my computer, because we share human experience. We have empathy. I recognize you as a fellow member of my species on the same planet in essentially the same predicament.

A machine may indeed pass the Turing Test. That is, perhaps I could not distinguish its behaviour from that of a person if both were hidden. However, it cannot cross the empathy barrier. That is, I can never recognize it as a fellow member of my species.

Arthur Cordell, of the Science Council of Canada, talks of the impact of electronic technology on employment in terms of the Boeing Effect.3 Electronic technology is driving a wedge between high-skill jobs which cannot be automated for technical reasons and low-skill jobs which will not be automated for economic reasons. This is illustrated by the crew of the Boeing plane, divided between high-skill pilots in the cockpit and low-skill stewardesses in the body. The low-skill people cannot easily move into the high-skill occupations - that is, a stewardess would have to undergo extensive retraining to become a pilot.

Paradoxically, it is the high-skill job rather than the low-skill job which is easiest to automate. Modern passenger planes are essentially "flown" by a computer. The pilot is there merely to override the automatic pilot should anything go wrong and, perhaps more important, to reassure the passengers that there is someone rather than something in charge. (That is at least one of the reasons why pilots invariably look competent and have deep male voices.) It is the low-skill job of the stewardess which is difficult to automate. Robotized trolleys could indeed go down the aisle dispensing food and drink to either side. This would be possible - but difficult and costly. However, it would be impossible to program the reassurance given to an old lady on her first flight that things will be okay, since some other experiencing being, much like her daughter, is voluntarily along in the same predicament.

I have purposely retained the implied sexism that the low-level jobs are female and the high-level jobs are male, to point out that the empathy jobs have essentially been assigned to females. They have low status. However, they should have high status since they are most essentially human.

As automation moves from the factory to the office, as machines are introduced to simulate the function of the mind rather than of the muscle, as machines are introduced which threaten or assist in (depending on your point of view) the work of the white-collar worker rather than that of the blue-collar worker, there is a danger that we will repeat our mistakes of the first wave of automation.

The emphasis will be on behaviour; the criterion will be efficiency. Secretaries will be herded into word-processing pools or smaller word-processing puddles; they will be assessed in terms of their output, and this output will be monitored by electronic feedback.

With such a scenario, I predict disaster. Resistance from the secretaries, who feel like cogs in a huge machine; resistance from middle managers, who see their live secretaries replaced by electronic data terminals; resistance from unions and from feminist groups - a decline in morale, and a decline in productivity.

The positive scenario is based on learning the lesson of the Hawthorne experiment - that one must consider experience as well as behaviour or, in other words, treat people as people and not as machines.4 We hear this over and over again and are in danger of forgetting it out of sheer boredom. I am merely rephrasing it in what may be a novel way - the essential difference between the person and the machine is that the machine behaves but does not experience.



3   Arthur Cordell, "Computers and employment: Future prospects" in Kay Elgie (Editor), The Social Impacts of Computerization. Proceedings of the Forum held at the University of Waterloo on 14-16 January 1982. Waterloo, Ontario: Waterloo Public Interest Research Group, 1982.

4   The Hawthorne experiment is described in F. J. Roethlisberger & W. J. Dickson, Management and the Worker. Cambridge, Massachusetts: Harvard University Press, 1939.