
CHAPTER 8: FOURTH GENERATION - MULTIMEDIA AND INTERNET

No medium is inherently better than any other. --- It's all in what you do with it.

Michael Chabon, The Amazing Adventures of Kavalier & Clay, Page 363


8.1 A Short History of the Computer

The third generation of media involved the invention of extrasomatic tools for transmitting information - the telegraph and telephone for transmitting it over wires, and radio and television for transmitting it through the air. The fourth generation of media is largely the story of the invention of one extrasomatic tool for storing information - the computer. This automation of thought (computers) then converges with the previous automation of language (tele-communications). This convergence requires the digitization of our various extrasomatic tools for the transmission of information so that they can "talk" to the computer for the storage of information.

We must imagine something before we create it. Every invention appears first in the subjective map of someone before it appears in our common objective world. The machine, like the person, must be conceived before it is born. The automation of thought by means of computers emerged from imagining a simulation of ourselves. This section of our story begins, then, with two teenage girls.

Mary Wollstonecraft Godwin (1797-1851), while still a teenager, had run off to Europe with two poets - Percy Bysshe Shelley and George Gordon, the notorious Lord Byron. They were holed up in a chateau in Switzerland during a very rainy summer. To entertain themselves, they decided to write horror stories. The two poets, being experienced writers, started writing right away and taunted Mary, who could not even get started. One night, however, she had a dream and started writing a story the next morning, which quickly evolved into the famous story of Frankenstein [SHELLEY].

Back in England, Ada Augusta Byron (1815-1852), the daughter of Lord Byron, upset that her father had deserted her when she was only one month old, turned from art to science. She attended a demonstration of the Difference Engine by an eccentric inventor named Charles Babbage (1791-1871). Although only 17, she recognized its importance right away. Babbage had invented the computer. Ada Byron (later, Lady Lovelace) worked with Charles Babbage for the rest of her life [RHEINGOLD 1985].

He worked on the hardware and she worked on the software. Alas, Babbage never could get his machine to work. However, if he had, the software written by Ada would have worked. She had discovered the basic principles of programming, and the computer language Ada is named in her honor as the world's first programmer. The two major cyberpunk authors - William Gibson and Bruce Sterling - surprised their fans by getting together to write a Victorian novel. The Difference Engine is an alternative history which explores the repercussions on society if Babbage had been able to get his machine to work over 100 years before someone else finally managed to build one that did [GIBSON W & STERLING B].

Another eccentric Englishman contributed to our conceptualization of the computer. Alan Turing (1912-1954) presented us with the Turing Machine and the Turing Test. He imagined a device through which one could thread an infinite tape divided into squares [TURING]. Each square contains either a 1 or a 0. The machine can change one symbol to the other or leave it the same, move one square to the left or right, and shift into a new internal state, all according to a fixed table of rules. He argued that any problem which can be expressed clearly in terms of logical statements could be solved by this machine. The Turing Test involved a person at a terminal linked to two other terminals which were out of sight, operated respectively by a person and by a machine. If the first person could not tell which was which, the machine had passed the Turing Test.77
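
To make this concrete, here is a minimal sketch in Python of such a machine. The tape, the rule names, and the example are illustrative assumptions rather than Turing's own; the point is simply that a table of (state, symbol) rules, applied to one square at a time, is all the machinery required.

    # A toy Turing machine: it flips every bit on the tape and halts at the
    # first blank square. Each rule maps (state, symbol) to
    # (symbol to write, direction to move, next state).
    def run_turing_machine(tape, rules, state="scan", blank=" "):
        tape = list(tape)          # the tape of squares
        head = 0                   # the square currently being read
        while state != "halt":
            symbol = tape[head] if head < len(tape) else blank
            write, move, state = rules[(state, symbol)]
            if head < len(tape):
                tape[head] = write
            else:
                tape.append(write)
            head += 1 if move == "R" else -1   # this toy machine only moves right
        return "".join(tape)

    rules = {
        ("scan", "0"): ("1", "R", "scan"),
        ("scan", "1"): ("0", "R", "scan"),
        ("scan", " "): (" ", "R", "halt"),
    }

    print(run_turing_machine("10110", rules))   # prints "01001 " - every bit flipped, then it halts

However trivial the example, nothing more elaborate is needed in principle: any procedure that can be spelled out step by step can be cast as such a table of rules.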

Alan Turing put his theory into practice during World War 2. He was invited to join a group which was attempting to break the code of a machine used by the Nazis to transmit coded messages. Turing reverse-engineered the Enigma machine which created the code from a description given to him by a disgruntled former employee of the Nazis, who had been dismissed because he was Jewish. As a result, the Allied Forces were able to read captured Nazi messages throughout most of the war. Many have argued that Alan Turing did more to win that war than anyone else, including Winston Churchill, who considered the breaking of the code so valuable that he did not warn people about a bombing raid for fear of alerting the Nazis that their code had been broken.

One of Turing's eccentricities was that he was an outed homosexual at a time and place where this was dangerous. (This homophobic society had jailed one of our greatest playwrights, Oscar Wilde, essentially for homosexuality at the height of his fame only a half-century earlier.) Dismissed from his job and abandoned by those who could have saved him, because his contribution was still classified information, he committed suicide by biting into an apple laced with cyanide.78

As Turing realized, the best code for a computer consists of two letters - 0 and 1.79 Thus computer technology - no matter how esoteric it may seem - is simpler than 1, 2, 3. It's 0, 1. A computer can thus be created using any device which can toggle between two stable states, representing 0 and 1. In the first generation, the device was a vacuum tube; in the second, it was a transistor; in the third, it was an integrated circuit; in the fourth, it was a microprocessor (an entire processor on a single chip). Each of those devices supplanted the previous device, in contrast to the generations of tele-communications, in which each generation simply supplemented the previous generations.
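
As a small illustration - a sketch only, using standard binary and ASCII conventions and nothing specific to any particular device - a few lines of Python show how strings of those two states are enough to represent both numbers and letters:

    # Numbers and letters as sequences of the two states 0 and 1.
    for n in range(6):
        print(n, format(n, "03b"))                 # 0 -> 000, 1 -> 001, 2 -> 010, ...

    for letter in "ADA":
        print(letter, format(ord(letter), "08b"))  # each letter as eight two-state switches (its ASCII code)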

We are currently awaiting the fifth generation. Perhaps it will be a chip made of gallium arsenide rather than silicon, since silicon gets too hot at current speeds of operation. Perhaps it will be a biochip, made of organic rather than mechanical material. Whatever it will be, the saga of the incredible shrinking chip will continue. Computers are getting smaller and smaller, faster and faster, cheaper and cheaper, smarter and smarter, and friendlier and friendlier. Moore's Law, that the speed of a chip will double and its cost will halve every 18 months, held throughout the last few decades of the 20th century and promises to continue into the 21st.
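
A back-of-the-envelope sketch in Python shows what that doubling claim implies; the 18-month period and the starting value of 1 are taken from the statement above, not from any measurement:

    # Rough illustration of doubling every 18 months (1.5 years).
    def growth_factor(years, doubling_period_years=1.5):
        return 2 ** (years / doubling_period_years)

    for years in (3, 15, 30):
        print(f"after {years:2d} years: x{growth_factor(years):,.0f}")
    # after  3 years: x4
    # after 15 years: x1,024
    # after 30 years: x1,048,576

Thirty years of such doubling multiplies capacity about a million-fold, which helps explain how the room-sized machine of the 1950s became the desktop machine described below.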

The computer had a long gestation period. Though conceived by Charles Babbage in the early 19th century, it was not born till over a century later. However, once born, it grew (or shrank?) rapidly - indeed, so rapidly that its history can be described within a single life-time - for example, mine.

1955     My first steady job after getting off the boat from Scotland was as a clerk in the Canadian Pacific Railway's offices at Windsor Station. CPR had just got its first computer [CANADIAN TRANSPORTATION, SPANNER]. It filled a huge, air-conditioned room, cost millions of dollars, had a score of priest-engineers attending to it, and all we clerks were terrified that it was going to take over our jobs. This was, of course, only nine years after ENIAC, the first general-purpose electronic computer, was built at the University of Pennsylvania in 1946.80

1969     A group of West Indian students, protesting what they perceived as racism, took over my classroom H-110 in the Hall Building of Sir George Williams University and then moved, along with many of my students, to occupy the Computer Center upstairs. This site was chosen because the computer was viewed as the tool of the oppressive "System" (the military-industrial-academic complex) which was violating human rights. The computer was still a massive device which had to be fed a stack of punched cards by the engineers, who returned the results days later. "Do not bend, staple, or mutilate" was one of the slogans of the protesters, who threw the cards out of the windows when the police stormed the Computer Center.

1984     Fifteen years later, I bought my first computer - the 128K Macintosh. Though its power and price ($3000) seem laughable now, it had more power than that first computer at CPR and it sat on my desktop. The big step, however, was not the next installment in the saga of the incredible shrinking chip but the improvement in the interface. It was like improving your ax by getting a new handle rather than a new head. The essential difference was not inside the computer (I opened it up to check and found that it was mostly empty) but in the relationship between the computer and myself. This WIMP (Window-Icon-Mouse-Pulldown menu) interface replaced the previous MACHO interface (Figure 8-1) and reduced our dependence on the high priests attending to the computer.

1997     My computer is now powerful enough to enable me to make a silicon clone (siliclone) of myself and download it onto a CD-ROM. The big breakthrough during the intervening decade is a shift from an emphasis on artificial intelligence (AI) to an emphasis on intelligence amplification (IA). There is no need to simulate natural intelligence (we already have that) but rather to supplement it with artificial intelligence. The siliclone is an extension of Scot. Thus it fits within the framework of the Toronto School of Media Studies, in which media are viewed as extensions of the person.



77   This means, of course, that the computer would sometimes have to act stupid in order to appear human. If given two 6-digit numbers to multiply, it would have to "pretend" to take some time. An instant answer, which it could provide, would be suspect.

78   This apple deserves a place alongside Newton's in the annals of famous apples in science. Turing's work was declassified only in the 1970s. Since then, his story has been told in a number of media - for example, a cyberpunk novel, Cryptonomicon, by the current star within this genre [STEPHENSON]; a television drama, Breaking the Code, starring Derek Jacobi as Alan Turing; and a film, U-571, about the capture of Enigma from a German submarine (with Americans replacing the British who "starred" in the real story, since it's an American movie). Mick Jagger of the Rolling Stones has taken out an option to make the movie, Enigma, based on a book about the work of Turing and his colleagues.

79   It may initially appear surprising that such a powerful language can be created using only two letters. Speech - the language of the first generation - consists of roughly 44 units in English (the phonemes), and print - the language of the second generation - consists of 26 units (the graphemes, i.e. the letters of the alphabet). However, the "language" in which you and I are written - the genetic code - consists of only 4 letters - A (Adenine), G (Guanine), C (Cytosine), and T (Thymine). The complexity of a system is thus not directly correlated with the number of elements it contains.

80   Actually, the release of the classified information about the work of Alan Turing and his colleagues in deciphering Nazi codes during World War 2 revealed that this group had built a computer earlier. Thanks to my neighbour, Ron Ritchie, of the Canadian Pacific Railway, for finding those sources for me.