The Psychology of Communication


9.2 History Of Approach

We must imagine something before we create it. Every invention appears first in someone's subjective map before it appears in our common objective world. The machine, like the person, must be conceived before it is born. The automation of thought by means of computers emerged from imagining a simulation of ourselves. This section of our story thus begins with two teenage girls.1

Mary Wollstonecraft Godwin (1797-1851), while still a teenager, had run off to Europe with two poets - Percy Bysshe Shelley and George Gordon, the notorious Lord Byron. They were holed up in a villa in Switzerland during a very rainy summer. To entertain themselves, they decided to write horror stories. The two poets, being experienced writers, started writing right away and taunted Mary, who could not even get started. One night, however, she had a dream, and the next morning she began writing a story which quickly evolved into the famous tale of Frankenstein [SHELLEY].

Meanwhile, back in England, Augusta Ada Byron (1815-1852), the daughter of Lord Byron, upset that her father had deserted her when she was only one month old, turned from art to science. She attended a demonstration by an eccentric inventor called Charles Babbage (1791-1871) of his Difference Engine. Although only 17, she recognized its importance right away. Babbage had invented the computer. Ada Byron (later, Lady Lovelace) worked with Charles Babbage for the rest of her life [RHEINGOLD 1985].

He worked on the hardware and she worked on the software. Alas, Babbage never could get his machine to work. However, if he had, the software written by Ada would have worked. She had discovered the basic principles of programming. A computer language - Ada - is now named in honor of the world's first programmer. The two major cyberpunk authors - William Gibson and Bruce Sterling - surprised their fans by getting together to write a Victorian novel. The Difference Engine is an alternative history which explores the repercussions on society if Babbage had been able to get his machine to work over 100 years before someone else finally managed to build one that did [GIBSON W & STERLING B].

Another eccentric Englishman contributed to our conceptualization of the computer. Alan Turing (1912-1954) presented us with the Turing Machine and the Turing Test. He imagined a device which reads an infinite tape divided into squares [TURING]. Each square contains either a 1 or a 0. The machine can change one symbol to the other or leave it the same, and then move one square to the left or right. He argued that any problem which can be expressed clearly in terms of logical statements could be solved by this machine. The Turing Test involved a person at a terminal linked to two other terminals out of sight, one operated by a person and the other by a machine. If the questioner could not tell which was which, the machine had passed the Turing Test.2
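Turing's idea can be sketched in a few lines of code. The rule table below is invented purely for illustration (it flips every symbol and moves right); the point is that reading a square, writing a square, and moving the head is all the machinery there is.

```python
def run_turing_machine(rules, tape, state="start"):
    """Run a tiny Turing machine on a finite tape of 0s and 1s.

    `rules` maps (state, symbol) to (symbol_to_write, move, next_state).
    The machine stops when it reaches the state "halt" or runs off
    the end of the tape (a stand-in for Turing's infinite tape).
    """
    head = 0
    while state != "halt" and 0 <= head < len(tape):
        symbol = tape[head]                          # read the square
        write, move, state = rules[(state, symbol)]  # look up the rule
        tape[head] = write                           # change it or leave it
        head += 1 if move == "R" else -1             # step left or right
    return tape

# A toy rule table: flip every 0 to a 1 and every 1 to a 0,
# moving right until the machine runs off the end of the tape.
flip_rules = {
    ("start", 0): (1, "R", "start"),
    ("start", 1): (0, "R", "start"),
}

print(run_turing_machine(flip_rules, [1, 0, 1, 1]))  # [0, 1, 0, 0]
```

Adding more states and rules to a machine of exactly this shape is, in principle, enough to compute anything a modern computer can - which is Turing's argument.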

Alan Turing put his theory into practice during World War 2. He was invited to join a group which was attempting to break the code of a machine used by the Nazis to transmit coded messages. Turing reverse-engineered the Enigma machine which created the code, from a description given him by a disgruntled former employee of the Nazis who had been dismissed because he was Jewish. As a result, the Allied Forces were able to read captured Nazi messages throughout most of the war. Many have argued that Alan Turing did more to win that war than anyone else, including Winston Churchill, who considered the breaking of the code so valuable that he did not warn people about a bombing raid for fear of alerting the Nazis that their code had been broken.

One of Turing's eccentricities was that he was an outed homosexual at a time and place where this was dangerous. (This homophobic society had jailed one of our greatest playwrights, Oscar Wilde, essentially for homosexuality at the height of his fame only a few decades earlier.) Dismissed from his job and abandoned by those who could have saved him because his contribution was still classified information, he committed suicide by biting into an apple laced with cyanide.3

As Turing realized, the best code for a computer consists of two letters - 0 and 1.4 Thus computer technology - no matter how esoteric it may seem - is simpler than 1, 2, 3. It's 0, 1. A computer can thus be created using any device which can toggle between two stable states, representing 0 and 1. In the first generation, the device was a vacuum tube; in the second, it was a transistor; in the third, it was an integrated circuit; in the fourth, it was a microprocessor (many integrated circuits on a single chip). Each of those devices supplanted the previous device, in contrast to the generations of media, in which each generation simply supplemented the previous generations.
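The point can be made concrete with a short sketch (the function names are my own, not standard): eight two-state devices in a row - tubes, transistors, or here simply Python integers toggling between 0 and 1 - suffice to represent any number from 0 to 255.

```python
def to_bits(n, width=8):
    """Encode an integer as a row of eight two-state 'devices' (0s and 1s)."""
    return [(n >> i) & 1 for i in reversed(range(width))]

def from_bits(bits):
    """Read the number back: each device doubles the total, then adds itself."""
    value = 0
    for b in bits:
        value = value * 2 + b
    return value

print(to_bits(9))                           # [0, 0, 0, 0, 1, 0, 0, 1]
print(from_bits([0, 0, 0, 0, 1, 0, 0, 1]))  # 9
```

Whether the two-state device is a vacuum tube or a gate on a microprocessor changes the speed and the size, not the principle.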

We are currently awaiting the fifth generation. Perhaps it will be a chip made of gallium arsenide rather than silicon, which gets too hot at current speeds of operation. Perhaps it will be a biochip, made of organic rather than inorganic material. Whatever it turns out to be, the saga of the incredible shrinking chip will continue. Computers are getting smaller and smaller, faster and faster, cheaper and cheaper, smarter and smarter, and friendlier and friendlier. Moore's Law - that the speed of a chip will double and its cost will halve every 18 months - held over the last few decades of the 20th century and continues into the 21st.

The computer had a long gestation period. Though conceived by Charles Babbage in the early 19th century, it was not born until over a century later. However, once born, it grew (or shrank?) rapidly. Indeed, so rapidly that its history can be described within a single lifetime - for example, mine.

1955   My first steady job after getting off the boat from Scotland was as a clerk in the Canadian Pacific Railway's offices at Windsor Station. CPR had just got its first computer [SPANNER]. It filled a huge, air-conditioned room, cost millions of dollars, had a score of priest-engineers attending to it, and all we clerks were terrified that it was going to take over our jobs. This was, of course, only nine years after ENIAC, the first working computer, was built at the University of Pennsylvania in 1946.5

1969   A group of West Indian students, protesting what they perceived as racism, took over my classroom H-110 in the Hall Building of Sir George Williams University and then moved, along with many of my students, to occupy the Computer Centre upstairs. This site was chosen because the computer was viewed as the tool of the oppressive "System" (the military-industrial-academic complex) which was violating human rights. The computer was still a massive device which had to be fed a stack of cards to be processed by the engineers, who returned results days later. "Do not fold, spindle, or mutilate" was one of the slogans of the protesters, who threw the cards out of the windows when the police stormed the Computer Centre.

1984   Fifteen years later, I buy my first computer - the 128K Macintosh. Though its power and price ($3000) are laughable now, it had more power than that first computer at CPR and sat on my desktop.6 The big step, however, was not the next installment in the saga of the incredible shrinking chip but the improvement in the interface. It was like improving your axe by getting a new handle rather than a new head. The essential difference was not inside the computer (I opened it up to check and found that it was mostly empty) but in the relationship between the computer and myself - the user interface. This WIMP (Windows-Icons-Mouse-Pulldown menus) interface replaced the previous MACHO interface (see Figure 9-1) and reduced our dependence on the high priests attending to the computer. It was user-friendly, so that we could use it (our view), or idiot-proof, so that we could not misuse it (the engineers' view).

2004   My computer is now powerful enough to enable me to make a silicon clone (siliclone) of myself, which has since grown to include 6 books (including this one), 4 chapters in books, 5 articles, 7 reviews, and 9 talks. This reduction in the size and cost of computing raises the hopes of the artificial intelligentsia once again. The prospect of the singularity when our creations will outsmart us and perhaps condescend to keep us as pets is being presented anew [KURZWEIL].

1   Up till now, I have been focusing, to my embarrassment, on the work of dead white males.

2   This means, of course, that the computer would sometimes have to act stupid in order to appear human. If given two 6-digit numbers to multiply, it would have to "pretend" to take some time. An instant answer, which it could provide, would be suspect.

3   This apple deserves a place alongside Newton's in the annals of famous apples in science. Turing's work was declassified only in the 1970s. Since then, his story has been told in a number of media - for example, a cyberpunk novel, Cryptonomicon, by the current star of the genre [STEPHENSON 1999]; a television drama, Breaking the Code, starring Derek Jacobi as Alan Turing; and a film, U-571, about the capture of an Enigma machine from a German submarine (with Americans replacing the British who "starred" in the real story, since it's an American movie). Robert Harris has written a novel based on the work of Turing and his colleagues. The book, entitled Enigma, is in turn the basis for a movie of the same title.

4   It may initially appear surprising that such a powerful language can be created using only two letters. Speech - the language of the first generation - consists of 44 units (the phonemes), and print - the language of the second generation - consists of 26 units (the graphemes - i.e. the letters of the alphabet). However, the "language" in which you and I are written - the genetic code - consists of only 4 letters - A (Adenine), G (Guanine), C (Cytosine), and T (Thymine). The complexity of a system is thus not directly correlated with the number of elements it contains.
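The trade-off in this footnote can be checked with a few lines of arithmetic: a smaller alphabet is not less powerful, it just needs longer "words". This sketch (my own illustration, using base-k logarithms) counts how many letters each alphabet needs to give a million possibilities their own distinct strings.

```python
import math

def letters_needed(possibilities, alphabet_size):
    """Letters required to give each possibility its own distinct string."""
    return math.ceil(math.log(possibilities, alphabet_size))

# Binary (2), the genetic code (4), the printed alphabet (26), phonemes (44):
for size in (2, 4, 26, 44):
    print(size, letters_needed(1_000_000, size))
# Binary needs 20 letters, DNA 10, print 5, speech 4 - fewer letters
# in the alphabet just means longer messages, not less power.
```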

5   Actually, the release of the classified information about the work of Alan Turing and his colleagues in deciphering Nazi codes during World War 2 reveals that this group had built a working computer (the Colossus) earlier.

Thanks to my neighbor, Ron Ritchie, of the Canadian Pacific Railway, for finding those sources for me.

6   On a recent visit to a Macintosh computer store, I noticed that they were using two of them as bookends.