Reprinted from C-Arts Magazine (September 2010).
I believe that the horrifying deterioration in the ethical conduct of people today stems from the mechanization and dehumanization of our lives—the disastrous by-product of the scientific and technical mentality. Nostra culpa. Man grows cold faster than the planet he inhabits.
— Albert Einstein
One of the things I missed while living in Bali was the presence of playgrounds, so when we moved to Vancouver I started taking my four-year-old son to playgrounds all over the city. There are some award-winning playgrounds here that make an adult wish he were young enough to climb. And sometimes I do. But the most remarkable thing about these modern playgrounds, beyond the giant swinging plates, spaceship ropes, and inside-out slides, is the fact that it’s possible to pause the children. Dozens of children running, climbing, playing tag—and if someone suddenly yells “Pause!” all of them freeze. Everything stops. Including my son, who had never played this game.
When I first saw this, I found it disturbing. A bit too much like bad science fiction. Someone had implanted a pause function into my son while I wasn’t looking.
I figured it out a couple of days later, when he was watching Winnie the Pooh and needed to pee. He couldn’t find the remote control, so he started yelling, “Pause! Pause!” We don’t have a TV, all his movies are on DVD, and they can all be paused. So can electronic games.
This pause function scares me. I don’t worry about the Terminator, smart machines, genetic engineering or even the self-replicating gray nano goo that Bill Joy, co-founder of Sun Microsystems, famously worried about in a Wired article ten years ago titled “Why the Future Doesn’t Need Us.” My fear stems from a different William: the psychologist William James, who described the “gray chaotic indiscriminateness” of people who are unable to focus, to pay attention. I’m afraid of the way machines interact with the human brain because, of the two, it is always the human brain that will adapt.
Have enough games with a pause function, and it starts to work on kids. Have enough corporate managers using telepresencing—like video conferencing, but with the screen mounted on machines that walk around the office to check on everyone—and people stop holding doors open for each other (since the five-foot-tall robots, made by companies like MobileRobots Inc., don’t have arms). Have enough safety built into GPS-enabled phones, and you get idiots…uh, unprepared hikers climbing mountains far above their skill level and calling in search-and-rescue helicopters to complain that the water is salty. Technology never simply satisfies a need. It always also changes human behavior.
I’m a member of the California bar. Before the bar exam, I took a ten-week course on how to stop thinking. Former exam graders explained that every essay answer needed to contain certain keywords. We should underline the keywords. If we used a synonym, the grader would miss the keyword and we would lose points. The text between the keywords didn’t really matter. Just something to fill the space. These human graders, with the power to decide who could become a lawyer, were imitating a search engine.
Invented to help us retrieve information, Google has completely changed the nature of information. Just ask any marketing agency that has moved to more “scientific” approaches—striving to ensure that every press release, and increasingly every newspaper headline, includes the words ‘Green,’ ‘Sex,’ ‘Cancer,’ ‘Secret,’ ‘Fat,’ ‘Toxic’ and either ‘Taylor Momsen’ or ‘Lady Gaga.’
And while it has always been amazing that any creativity survived a formal education, the pressure is growing. Computers now grade elementary school essays, and the background technological pressure has reinforced the existing human impulse to avoid thinking—to replace real situational thinking with categories, checklists, and rules. At this rate, the only thing our kids will be good for is becoming border patrol agents. We are teaching them to think in ways that are very quick and very shallow. Like computers.
This pressure comes not just from our tendency to adapt to our tools, but from the sheer quantity of information these tools enable. Lord Chesterfield once wrote to his son that “Steady and undissipated attention to one object is a sure mark of a superior genius; as hurry, bustle, and agitation are the never-failing symptoms of a weak and frivolous mind.” Modern research has proven him right: a 2005 Hewlett-Packard study found that “workers distracted by e-mail and phone calls suffer a fall in IQ more than twice that found in marijuana smokers.”
Russell Poldrack has shown that multitasking interferes with learning by giving the work to the striatum (a part of the brain that handles novelty-related decision making) rather than the hippocampus, which controls the storage and recall of information and is engaged only in undistracted learning. René Marois has shown that multitasking triggers the release of glucocorticoid and adrenaline hormones, which not only interfere with learning but can cause long-term health problems. Loren Frank has shown that when rats explore, their brains show new patterns of activity, but these only turn into persistent memory if the rats have a chance to take a break from their explorations. Similarly, if you take a walk in nature after learning something, you’ll remember your lesson far better than if you take a walk in a dense urban environment. “Downtime lets the brain go over experiences it’s had, solidify them and turn them into permanent long-term memories,” Frank said. With constant stimulation, “you prevent this learning process.”
In 1998, I brought Heidegger’s Being and Time on a year-long trip through Africa. It was my reading, my pillow, my weapon for fighting off giant beetles. And it took about six months to read properly. One of my favorite sections is the one where Heidegger talks about Neugier, the German word for curiosity, which translates literally, and unflatteringly, as “greed for the new.” In 2010, I can’t go to the toilet without taking my BlackBerry—otherwise I feel like I’m wasting time doing only one thing—and can’t imagine anyone reading Heidegger at all. Not while Playfish strives to “reinvent the [mobile video] game experience to fit into micro-moments” of under two minutes.
Don’t get me wrong. Having never lived in one place for more than four years, I have my own greed for novelty. And I like gadgets as much as the next male. Give me one of the new Martin Jetpacks and I’ll be sky-high happy. (This is a good year for jetpacks, with three different companies finally bringing commercial jetpacks onto the market. No pilot’s license required.)
If your jetpack crashes, the company And Vinyly can bake your ashes into vinyl records, to haunt any descendants who continue to hold onto vinyl-playing technology. You can record everything in between with a Life Recorder; new research will make video searchable; HP’s new memristor-based chips will have one petabit of memory per cm³, the ability to remember voltages when turned off, and the potential for emulating human-synapse-like neural systems; and the face-tracking software already running on the Nokia N900 smartphone “knows” where you’re looking and how you’re feeling. It’s a Brave New World with possibilities ranging from thought police to my BlackBerry telling me, as one anonymous wit put it, “I noticed you’ve been watching that blonde over there, and you appear to be sad. Would you like a list of local escort services?”
Even the perennial limitation of battery power is about to fall: this August, researchers engineered two viruses—one a tobacco pathogen, the other a virus that infects bacteria—to create the cathode and anode of a lithium-ion battery that can be sprayed onto clothing. “Typical soldiers have to carry several pounds of batteries. But if you could turn their clothing into a battery pack, they could drop a lot of weight,” said Mark Allen, one of the MIT researchers.
I love the idea of power coming from tobacco and disease—straight from the Devil, so to speak—but I can’t help remembering those Dell laptop batteries a few years ago that kept bursting into flames at unpredictable moments. I won’t be doing my own laundry.
In short, we have a million different players lifting the baseline level of technology in a giant tide, each thinking he’s just making micro-improvements in a browser, or a medical nanobot, or (as in the case of research at Georgia Tech) algorithms teaching robots when and how to lie to humans.
I’m no futurist; I don’t know if in ten years there will be a thousand different people on the cusp of creating the nano-goo, or Skynet, or the Singularity. What I do see is that James’s description of the child’s mind, its “extreme mobility of attention” that “makes the child seem to belong less to himself than to every object which happens to catch his notice,” is increasingly true of adults today. And I can predict that if we reach the point where 80% of the population prefers to nonthink like a computer, corporations will apply their beloved 80-20 rule and decide it’s more efficient to drop the complicated 20% than it is to cater to them, especially since those people would be holding on to opaque, inefficient and hard-to-aggregate mechanisms of social interaction like quality. These are the people who would resist having a pause function installed.
I wonder, though, who are these people? Artists, maybe, if the word is broadly defined to include the scientists creating the technology in the first place. To quote Einstein again (his name is, after all, a keyword), “After a certain high level of technical skill is achieved, science and art tend to coalesce in aesthetics, plasticity, and form. The greatest scientists are always artists as well.” And it is this very plasticity of thought that is threatened by technology.
More directly, artists like Peter Vogel, James Seawright and Mark Madel have been integrating technology in their art for decades. I like Madel’s work because it remains aware of how the technology itself shapes perception and social spaces, instead of merely showing off his own “technological virtuosity.” One of Madel’s pieces, titled Jewelry Box, is designed to mirror a long-term relationship, with the lifespan of the relationship dependent on how often the owner opens the box. Too often at the beginning, and it will get bored of you. Ignore it for too long, and, similarly, it will cease to work. But open it at just the right frequency in each stage of the relationship, and it can last sixty years.
“You cannot love a car the way you love a horse,” said Einstein.
Other artworks by Madel include microcomputer implants that take over normal electronic appliances—your DVD player, radio, TV—and give them “personalities” that let them control their own actions rather than blindly obeying their operator when he presses, say, “Pause” (and you thought your DVD player was annoying now!). There are also a number of pieces with built-in self-destruct mechanisms, triggered by things like how many times they’ve been viewed.
Despite these sorts of rear-guard actions, however, one can feel the impact of the fast-and-shallow technological mind even in the most creative fields. In detective fiction, for example, every chapter must have between 15 and 17 pages, and the reversal (where the detective realizes he’s been pursuing the wrong suspect) must happen in chapter 13. Break the formula and you don’t get published—because 80% of the time this formula works. And I’ve known too many artists whose career paths, and often individual works, were as calculated as any other management optimization. As for me, finding a way to include the phrase “love a horse” in this article will shoot it through the SEO roof. Especially, according to Google Trends, in the Philippines…
The current issue of Make magazine has 23 gadgets you can build on your own. The most interesting one is simply a gadget that turns itself off. The editors of the magazine described it as “creepy.”
But maybe there’s our hope. If we can’t hit pause on our technological tide, then maybe we can make machines that suicide, that lie and cheat and get addicted to tobacco or other drugs, that are built on human-like memristor neural nets. If we’re lucky, maybe we can make machines human before they turn us into machines. Call it a race.