by Howard Rheingold

The idea that people could use computers to amplify thought and communication, as tools for intellectual work and social activity, was not an invention of the mainstream computer industry or orthodox computer science, nor even homebrew computerists; their work was rooted in older, equally eccentric, equally visionary, work. You can't really guess where mind-amplifying technology is going unless you understand where it came from.

- HLR

Chapter One: The Computer Revolution Hasn't Happened Yet
Chapter Two: The First Programmer Was a Lady
Chapter Three: The First Hacker and his Imaginary Machine
Chapter Four: Johnny Builds Bombs and Johnny Builds Brains
Chapter Five: Ex-Prodigies and Antiaircraft Guns
Chapter Six: Inside Information
Chapter Seven: Machines to Think With
Chapter Eight: Witness to History: The Mascot of Project Mac
Chapter Nine: The Loneliness of a Long-Distance Thinker
Chapter Ten: The New Old Boys from the ARPAnet
Chapter Eleven: The Birth of the Fantasy Amplifier
Chapter Twelve: Brenda and the Future Squad
Chapter Thirteen: Knowledge Engineers and Epistemological Entrepreneurs
Chapter Fourteen: Xanadu, Network Culture, and Beyond
Footnotes


Chapter One:
The Computer Revolution Hasn't Happened Yet

South of San Francisco and north of Silicon Valley, near the place where the pines on the horizon give way to the live oaks and radiotelescopes, an unlikely subculture has been creating a new medium for human thought. When mass-production models of present prototypes reach our homes, offices, and schools, our lives are going to change dramatically.

The first of these mind-amplifying machines will be descendants of the devices now known as personal computers, but they will resemble today's information processing technology no more than a television resembles a fifteenth-century printing press. They aren't available yet, but they will be here soon. Before today's first-graders graduate from high school, hundreds of millions of people around the world will join together to create new kinds of human communities, making use of a tool that a small number of thinkers and tinkerers dreamed into being over the past century.

Nobody knows whether this will turn out to be the best or the worst thing the human race has done for itself, because the outcome of this empowerment will depend in large part on how we react to it and what we choose to do with it. The human mind is not going to be replaced by a machine, at least not in the foreseeable future, but there is little doubt that the worldwide availability of fantasy amplifiers, intellectual toolkits, and interactive electronic communities will change the way people think, learn, and communicate.

It looks as if this latest technology-triggered transformation of society could have even more intense impact than the last time human thought was augmented, five hundred years ago, when the Western world learned to read. Less than a century after the invention of movable type, the literate community in Europe had grown from a privileged minority to a substantial portion of the population. People's lives changed radically and rapidly, not because of printing machinery, but because of what that invention made it possible for people to know. Books were just the vehicles by which the ideas escaped from the private libraries of the elite and circulated among the population.

The true value of books emerged from the community they made possible, an intellectual community that is still alive all over the world. The printed page has been a medium for the propagation of ideas about chemistry and poetry, evolution and revolution, democracy and psychology, technology and industry, and many other notions beyond the ken of the people who invented movable type and started cranking out Bibles.

Because mass production of sophisticated electronic devices can lag ten years or more behind the state of the art in research prototypes, the first effects of the astonishing achievements in computer science since 1960 have only begun to enter our lives. Word processors, video games, educational software, and computer graphics were unknown terms to most people only ten years ago, but today they are the names for billion-dollar industries. And the experts agree that the most startling developments are yet to come.

A few of the pioneers of personal computing who still work in the computer industry can remember the birth and the dream, when the notion of personal computing was an obscure heresy in the ranks of the computing priesthood. Thirty years ago, the overwhelming majority of the people who designed, manufactured, programmed, and used computers subscribed to a single idea about the proper (and possible) place of computers in society: "computers are mysterious devices meant to be used in mathematical calculations." Period. Computer technology was believed to be too fragile, valuable, and complicated for nonspecialists.

In 1950 you could count the people who took exception to this dogma on the fingers of one hand. Those few dissenters shared a different way of thinking about how computers might be used: a vision of personal computing in which computers would enhance the most creative aspects of human intelligence--for everybody, not just the technognoscenti.

Those who questioned the dogma of data processing agreed that computers can help us calculate, but they also suspected that if the devices could be made more interactive, these tools might help us to speculate, build and study models, choose between alternatives, and search for meaningful patterns in collections of information. They wondered whether this newborn device might become a communication medium as well as a calculating machine.

These heretical computer theorists proposed that if human knowledge is indeed power, then a device that can help us transform information into knowledge should be the basis for a very powerful technology. While most scientists and engineers remained in awe of the giant adding machines, this minority insisted on thinking about how computers might be used to assist the operation of human minds in nonmathematical ways.

Tools for Thought focuses on the ideas of a few of the people who have been instrumental in creating yesterday's, today's, and tomorrow's human-computer technology. Several key figures in the history of computation lived and died centuries or decades ago. I call these people, renowned in scientific circles but less known to the public, the patriarchs. Other cocreators of personal computer technology are still at work today, continuing to explore the frontiers of mind-machine interaction. I call them the pioneers.

The youngest generation, the ones who are exploring the cognitive domains we will all soon experience, I call the infonauts. It is too early to tell what history will think of the newer ideas, but we're going to take a look at some of the things the latest inner-space explorers are thinking, in hopes of catching some clues to what (and how) everybody will be thinking in the near future.

As we shall see, the future limits of this technology are not in the hardware but in our minds. The digital computer is based upon a theoretical discovery known as "the universal machine," which is not actually a tangible device but a mathematical description of a machine capable of simulating the actions of any other machine. Once you have created a general-purpose machine that can imitate any other machine, the future development of the tool depends only on what tasks you can think to do with it. For the immediate future, the issue of whether machines can become intelligent is less important than learning to deal with a device that can become whatever we clearly imagine it to be.
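The universal machine idea can be made concrete with a short sketch (my own illustration, not part of the original text): one fixed program that, handed a description of any other simple machine, imitates that machine's behavior. The rule table and the "flipper" machine below are invented for the example.

```python
def run_machine(rules, state, tape, steps=100):
    """Simulate a one-tape, Turing-style machine described by `rules`.

    `rules` maps (state, symbol) -> (new_state, new_symbol, move),
    where move is +1 (right) or -1 (left); the state "halt" stops it.
    This one simulator can imitate any machine expressible as a table.
    """
    tape = dict(enumerate(tape))  # sparse tape; unwritten cells are blank "_"
    pos = 0
    for _ in range(steps):
        if state == "halt":
            break
        symbol = tape.get(pos, "_")
        state, tape[pos], move = rules[(state, symbol)]
        pos += move
    # Read back the contiguous region of tape that was touched
    cells = range(min(tape), max(tape) + 1)
    return "".join(tape.get(i, "_") for i in cells)

# A description of one specific machine: flip every bit, halt at a blank.
flipper = {
    ("scan", "0"): ("scan", "1", +1),
    ("scan", "1"): ("scan", "0", +1),
    ("scan", "_"): ("halt", "_", +1),
}

print(run_machine(flipper, "scan", "1011"))  # -> 0100_
```

The point of the sketch is that `run_machine` itself never changes; only the table describing the imitated machine does--which is exactly why the tool's future depends on what tasks we can think to describe.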

The pivotal difference between today's personal computers and tomorrow's intelligent devices will have less to do with their hardware than their software--the instructions people create to control the operations of the computing machinery. A program is what tells the general-purpose machine to imitate a specific kind of machine. Just as the hardware basis for computing has evolved from relays to vacuum tubes to transistors to integrated circuits, the programs have evolved as well. When information processing grows into knowledge processing, the true personal computer will reach beyond hardware and connect with a vaster source of power than that of electronic microcircuitry--the power of human minds working in concert.

The nature of the world we create in the closing years of the twentieth century will be determined to a significant degree by our attitudes toward this new category of tool. Many of us who were educated in the precomputer era shall be learning new skills. The college class of 1999 is already on its way. It is important that we realize today that those skills of tomorrow will have little to do with how to operate computers and a great deal to do with how to use augmented intellects, enhanced communications, and amplified imaginations.

Forget about "computer literacy" or obfuscating technical jargon, for these aberrations will disappear when the machines and their programs grow more intelligent. The reason for building a personal computer in the first place was to enable people to do what people do best by using machines to do what machines do best. Many people are afraid of today's computers because they have been told that these machines are smarter than they are--a deception that is reinforced by the rituals that novices have been forced to undergo in order to use computers. In fact, the burden of communication should be on the machine. A computer that is difficult to use is a computer that's too dumb to understand what you want.

If the predictions of some of the people in this book continue to be accurate, our whole environment will suddenly take on a kind of intelligence of its own sometime between now and the turn of the century. Fifteen years from now there will be a microchip in your telephone receiver with more computing power than all the technology the Defense Department can buy today. All the written knowledge in the world will be one of the items to be found in every schoolchild's pocket.

The computer of the twenty-first century will be everywhere, for better or for worse, and a more appropriate prophet than Orwell for this eventuality might well be Marshall McLuhan. If McLuhan was right about the medium being the message, what will it mean when the entire environment becomes the medium? If such a development does occur as predicted, it will probably turn out differently from even the wildest "computerized household" scenarios of the recent past.

The possibility of accurately predicting the social impact of any new technology is questionable, to say the least. At the beginning of the twentieth century, it was impossible for average people or even the most knowledgeable scientists to envision what life would be like for their grandchildren, who we now know would sit down in front of little boxes and watch events happening at that moment on the other side of the world.

Today, only a few people are thinking seriously about what to do with a living room wall that can tell you anything you want to know, simulate anything you want to see, connect you with any person or group of people you want to communicate with, and even help you find out what it is when you aren't entirely sure. In the 1990's it might be possible for people to "think as no human being has ever thought" and for computers to "process data in a way not approached by the information-handling machines we know today," as J.C.R. Licklider, one of the most influential pioneers, predicted in 1960, a quarter of a century before the hardware would begin to catch up with his ideas.

The earliest predictions about the impact of computing machinery occurred quite a bit earlier than 1960. The first electronic computers were invented by a few individuals, who often worked alone, during World War II. Before the actual inventors of the 1940s were the software patriarchs of the 1840s. And before them, thousands of years ago, the efforts of thinkers from many different cultures to find better ways to use symbols as tools led to the invention of mathematics and logic. It was these formal systems for manipulating symbols that eventually led to computation. Links in what we can now see as a continuous chain of thought were created by a series of Greek philosophers, British logicians, Hungarian mathematicians, and American inventors.

Most of the patriarchs had little in common with each other, socially or intellectually, but in some ways they were very much alike. It isn't surprising that they were exceptionally intelligent, but what is unusual is that they all seem to have been preoccupied with the power of their own minds. For sheer intellectual adventure, many intelligent people pursue the secrets of the stars, the mysteries of life, the myriad ways to use knowledge to accomplish practical goals. But what the software ancestors sought to create were tools to amplify the power of their own brains--machines to take over what they saw as the more mechanical aspects of thought.

Perhaps as an occupational hazard of this dangerously self-reflective enterprise, or as a result of being extraordinary people in restrictive social environments, the personalities of these patriarchs (and matriarchs) of computation reveal a common streak of eccentricity, ranging from the mildly unorthodox to the downright strange.

The software patriarchs came from wildly different backgrounds. Then as now, computer geniuses were often regarded as "odd" by those around them, and their reasons for wanting to invent computing devices seem to have been as varied as their personalities. Something about the notion of a universal machine enticed mathematicians and philosophers, logicians and code-breakers, whiz kids and bomb-builders. Even today, the worlds of computer research and the software business bring together an unlikely mixture of entrepreneurs and evangelists, futurians and utopians, cultists, obsessives, geniuses, pranksters, and fast-buck artists.

Despite their outward diversity, the computer patriarchs of a hundred years ago and the cyberneticians of the World War II era appear to have shared at least one characteristic with each other and with software pioneers and infonauts of more recent vintage. In recent years, the public has become more aware of a subculture that sprouted in Cambridge and Palo Alto and quietly spread through a national network of fluorescent-lit campus computer centers for the past two decades--the mostly young, mostly male, often brilliant, sometimes bizarre "hackers," or self-confessed compulsive programmers. Sociologists and psychologists of the 1980's are only beginning to speculate about the deeper motivation for this obsession, but any latter-day hacker will admit that the most fascinating thing in his own life is his own mind, and tell you that he regards intense, prolonged interaction with a computer program as a particularly satisfying kind of dialogue with his own thoughts.

A little touch of the hacker mentality seems to have affected all of the major players in this story. From what we know today about the patriarchs and pioneers, they all appear to have pursued a vision of a new way to use their minds. Each of them was trying to create a mental lever. Each of them contributed indispensable components of the device that was eventually assembled. But none of them encompassed it all.

The history of computation became increasingly complex as it progressed from the patriarchs to the pioneers. At the beginning, many of the earliest computer scientists didn't know that their ideas would end up in a kind of machine. Almost all of them worked in isolation. Because of their isolation from one another, the common intellectual ancestors of the modern computer are relatively easy to discern in retrospect. But since the 1950s, with the proliferation of researchers and teams of researchers in academic, industrial, and military institutions, the branches of the history have become tangled and too numerous to describe exhaustively. Since the 1950s, it has become increasingly difficult to assign credit for computer breakthroughs to individual inventors.

Although individual contributors to the past two or three decades of computer research development have been abundant, the people who have been able to see some kind of overall direction to the fast, fragmented progress of recent years have been sparse. Just as the earliest logicians and mathematicians didn't know their thoughts would end up as a part of a machine, the vast majority of the engineers and programmers of the 1960s were unaware that their machines had anything to do with human thought. The latter-day computer pioneers in the middle chapters of this book were among the few who played central roles in the development of personal computing. Like their predecessors, these people tried to create a kind of mental lever. Unlike most of their predecessors, they were also trying to design a tool that the entire population might use.

Where the original software patriarchs solved various problems in the creation of the first computers, the personal computer pioneers struggled with equally vexing problems involved in using computers to create leverage for human intellect, the way wheels and dynamos create leverage for human muscles. Where the patriarchs were out to create computation, the pioneers sought to transform it:

Licklider, Engelbart, Taylor, and Kay are still at work, confident that many more of us will experience the same thrill that has kept them going all these years--what Licklider, still at MIT, calls the "religious conversion" to interactive computing. Engelbart works for Tymshare Corporation, marketing his "Augment" system to information workers. Taylor is setting up another computer systems research center, this time under the auspices of the Digital Equipment Corporation, and is collecting people once again, this time for a research effort that will bring computing into the twenty-first century. Kay, at Atari, continued to steer toward the fantasy amplifier, despite the fact that its mother company was often described in the news media as "seriously troubled." It is fair to assume that he will continue to work toward the same goal in his new association with Steve Jobs, chairman of Apple and a computer visionary of a more entrepreneurial bent.

The pioneers, although they are still at work, are not the final characters in the story of the computer quest. The next generations of innovators are already at work, and some of them are surprisingly young. Computer trailblazers in the past tended to make their marks early in life--a trend that seems to be continuing in the present. Kay, the former quiz kid, is now in his early forties. Taylor is in his early fifties, Engelbart in his late fifties, and Licklider in his sixties. Today, younger men and, increasingly, younger women, have begun to take over the field professionally, while even younger generations are now living in their own versions of the future for fun, profit, and thrills.

The ones I call the "infonauts" are the older brothers and sisters of the adolescent hackers you read about in the papers. Most of them are in their twenties and thirties. They work for themselves or for some research institution or software house, and represent the first members of the McLuhan generation to use the technology invented by the von Neumann generation as tools to extend their imagination. From the science of designing what they call the "user interface"--where mind meets machine--to the art of building educational microworlds, the infonauts have been using their new medium to create the mass-media version we will use fifteen years from now.

Despite their differences in background and personality, the computer patriarchs, software pioneers, and the newest breed of infonauts seem to share a distant focus on a future that they are certain the rest of us will see as clearly as they do--as soon as they turn what they see in their mind's eye into something we can hold in our hands. What did they see? What will happen when their visions materialize in our homes? And what do contemporary visionaries see in store for us next?
read on to
Chapter Two:
The First Programmer Was a Lady


©1996, 1997 electric minds, all rights reserved worldwide.