Perhaps the most lucid, least humorless, and most systematic look at the power of technology over human life is Autonomous Technology: Technics-Out-of-Control as a Theme in Political Thought, by Langdon Winner. Here are a couple of short quotes:
- While it is widely admitted that the structure and processes of technology now constitute an important part of the human world, the request that this be opened up for political discussion is still somehow seen as an attempt to foul the nest.
- Technical systems become severed from the ends originally set for them and, in effect, reprogram themselves and their environments to suit the special conditions of their own operation. The artificial slave gradually subverts the role of its master.
Bringing the Internet into every classroom is a bad idea if all that means is pulling wires and uncrating computers. Adding Internet access to overburdened educational systems will push them closer to collapse if there is no plan to pay for ongoing access, training, and maintenance, and especially if there is no solid plan for how to use this tool. A handful of pioneers have been experimenting for years. I hope these experienced practitioners will be noticed in the rush to a technological fix. For those educators who think the Internet might be useful, but want to know more about how to use it, I recommend starting with the work of Amy Bruckman, and if you really want detail, look at Wired Together: The Online Classroom in K-12. A detailed description of this four-volume set of case histories is one of many useful things you will find at The Moderators Homepage.
Intelligent agents are what Seymour Papert or Sherry Turkle would call "objects to think with." We can think about what a tool might do if we use it, and how we might try to design the tool to minimize negative effects, but we also need to think about what kind of world tools like this will be used to create. Perhaps new technologies ought to have societal impact reports, not as an attempt at political regulation, but as a way of thinking systemically instead of just instrumentally. Do we know where we are going? Do we want to go there? Is there anything we can do about it? I responded to an interview with Pattie Maes with a few questions about the implications of intelligent agents.
Centuries ago, philosophical subversives conspired to infect Europe and America with a radical idea: If we could invent better methods of discovering knowledge, we could better our lives, even govern ourselves. Modern science, the industrial revolution, and the US Constitution were results of this conspiracy, known as "the enlightenment project." A search for "enlightenment project" yielded, among too many irrelevant others:
This great quote: "The project of modernity had essentially been one of arms and media technology...all the better that it was shrouded in a petty phraseology of democracy and the communication of consensus."
Democracy on Trial (which reminded me of last month's scariest think-piece, Was Democracy Just a Moment?)
The weather has been screwy lately. Like the last week, the last 15,000 years, the last 2.5 million years. The Atlantic Monthly's January cover story, The Great Climate Flip-Flop, makes the connection between screwy weather and human evolution. The editor's introductory notes explain: "... [Calvin says] the evolution of the human mind is intimately linked to abrupt climate change: our brains seem to have begun their transformation from apelike to fully human just when temperatures on earth began their current trend of jumping rapidly -- often within a single lifetime -- between warm and cold. Calvin argues that in the context of brief environmental opportunities (periods of warmth) and hazards (sudden icy temperatures), survival for our ancestors became dependent on having highly agile, 'jack-of-all-trades' minds. The flip-flop of climates, in other words, led to the evolution of brains that could themselves flip-flop abruptly between strategies for survival."
I'm still thinking about technology, but Calvin's ideas are helping me understand that the urge to alter things may be hardwired. Last week, I pointed to some rather bleak quotes by Jacques Ellul. Here's a slightly more optimistic statement by Andrew Feenberg.
I've been thinking about how to think about technology for some time. Not a day goes by without a reminder. Today, the headlines are about the Unabomber trial and a guy named Dick Seed who claims he's going to clone humans. The direction and impact of technology are critically important and problematic influences on our future. Why is it so hard to find broad-based, sophisticated discourse about technology that is neither polarized, simplistic, nor incomprehensible?
I've found most contemporary technology criticism to be shallow, but I've learned that far deeper thinking about technology was being done four and even five decades ago by Lewis Mumford, Jacques Ellul, and others. So right now, I'm going back to basics and reading about how industrial technological capitalist civilization came to be. Here are a few Ellul quotes.
I've added links to my declaration and testimony in support of free expression in the ACLU challenge to the Communications Decency Act.