By Jaron Lanier
Alfred A. Knopf, 2010
Notice anything? Did you feel any difference between the stuffy atmosphere surrounding your computer and the outdoor breeze? Was it raining? Did your skin start to heat in the sun, or was the world around you cooled by the grayish tint of clouds?
The human brain is more than a machine made for computational processes. It’s able to quickly parse a variety of quantitative information, such as prices or schedules. But it’s also able to perceive that information on a variety of different qualitative levels. This becomes extremely important when precise factual information is hard to come by. You’ll never be able to tell exactly how many water droplets are suspended midair within a defined geographical location, but you can approximate the threshold that will require an umbrella.
Whatever the case, human beings do not operate as predictable algorithms. For the mathematically illiterate, and this includes myself, an algorithm is a finite, step-by-step procedure for completing some mathematical problem or quantifiable process. In a very basic sense, a recipe could be considered an algorithm. You have your ingredients, you do certain things to them, and voilà: apple pie.
Humans use this basic idea of the algorithm all the time. When you wake up in the morning on a workday, you have some sort of ritual that prepares you for the day ahead. This generally includes eating breakfast, taking a shower, putting on the proper clothes, and then getting to work. That order I gave you in the first paragraph? That's an algorithm: stand up, walk to the door, open it, look outside, and then return. Anyone can follow that process. However, the possibilities of that small event are infinite; it's impossible to create an algorithm that adequately accounts for every stimulus and response you could experience while you're outside. Let's say it was raining. Maybe you like rainy days, but if someone who hates rain designed the algorithm that you used for that outdoor experience, you would be at that program designer's mercy.
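The point is easy to see in code. Here is a toy version of that "step outside" algorithm; the step names and weather reactions are invented for illustration, and they make the reviewer's argument concrete: every reaction is fixed in advance by whoever wrote the program.

```python
# A toy version of the "step outside" algorithm described above.
# Every step, and every permitted reaction to the weather, is
# decided ahead of time by the program's designer.

STEPS = ["stand up", "walk to the door", "open it", "look outside", "return"]

def run_outdoor_algorithm(weather):
    # The designer's tastes are baked in: someone who hates rain
    # has already decided what "rainy" means for every future user.
    reactions = {
        "sunny": "linger in the warmth",
        "rainy": "hurry back inside",
    }
    # Any stimulus the designer failed to anticipate falls through
    # to a default -- the algorithm cannot improvise.
    reaction = reactions.get(weather, "no response defined")
    return STEPS + [reaction]

print(run_outdoor_algorithm("rainy"))
```

Note that a weather condition the designer never imagined ("snowy", say) produces only the shrug of a default value, which is exactly the gap between an algorithm and a person standing in the doorway.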
Where computers are concerned, algorithms are ubiquitous – and absolute. Every computational process that requires some form of user input is based upon an algorithm that converts that input into a previously defined output. Double-clicking a desktop icon triggers the algorithm that starts the program. Issuing a close command, by clicking a box in the upper right corner of the program's window or through some other means, invokes the algorithm that shuts the program down. And unlike humans, who can adapt to frustrating changes, a snafu in a digital algorithmic process can crash the whole program.
None of these processes could occur without the physical presence of a human being operating the computer. According to software expert Jaron Lanier, this fact has become increasingly ignored by computer scientists, and as a result technology has been evolving with some disastrous implications.
You Are Not a Gadget: A Manifesto, Lanier’s first full-length book, is a meandering journey through the variety of philosophical and humanistic issues arising from technology, both in terms of software designs that undervalue human beings and the speed at which these technologies are expanding. Lanier fears that the prevailing digital culture prizes the gifts of technology to a degree that endangers both humanity and individuality.
Lanier watched a large portion of the digital technology revolution unfold, having worked in Silicon Valley since the 1980s. Along with a group of fellow computer scientists, he personally provided much of the foundational research and development of "virtual reality" and is sometimes credited with coining that term. He contributed to the first applications of virtual reality in surgical simulation, vehicle interior prototyping, and other areas. He even shared an apartment with Richard Stallman during the period when Stallman was creating GNU, a free operating system modeled on UNIX that has attained an almost mythic status in some digital circles.
What makes Lanier an oddity in the computer science world is his disdain for Web 2.0 online business models and so-called “open culture,” which Lanier argues is far more constricting than its moniker makes it seem. Facebook and Wikipedia appear to be fairly innocent, if time consuming. But Lanier finds that both devalue individual experience in terms of social expression and intellectual discourse. “I know quite a few people … who are proud to say that they have accumulated thousands of friends on Facebook,” he writes. “Obviously, this statement can only be true if the idea of friendship is reduced.”
You Are Not a Gadget is part stern admonition and part hopeful daydream regarding both the current and future use of digital information technologies in our society. Although Lanier uses language and anecdotes that are geared towards those with knowledge of computer science, his message is pertinent to anyone who’s ever turned on a computer.
Better than most, Lanier understands that any piece of software is a digital expression reflecting a designer's concept of what is and is not a successful program. "Technologies are extensions of ourselves," writes Lanier, noting that even small variations in programming can influence behavioral patterns in users. "It is impossible to work with information technology without also engaging in social engineering":
For instance, Stanford University researcher Jeremy Bailenson has demonstrated that changing the height of one’s avatar in immersive virtual reality transforms self-esteem and social self-perception. …We inventors of digital technologies are like stand-up comedians or neurosurgeons, in that our work resonates with deep philosophical questions; unfortunately, we’ve proven to be poor philosophers lately.
What makes this social engineering problematic are the limitations imposed by the mind-boggling phenomenon of software lock-in. In the halcyon days of computer programming, you could create a small program – it may not have had a pretty user interface, but then, it didn't have to be designed to coexist with millions of other programs. Today, basic programs must be written in a Byzantine fashion simply to interact with operating systems without crashing.
“The fateful, unnerving aspect of information technology,” Lanier tells us, “is that a particular design will occasionally happen to fill a niche and, once implemented, turns out to be unalterable.” Google may not bring up the most relevant search links, but it’s a supergiant when it comes to Internet indexing. Facebook owns a similar niche in the social networking world. New players in the online world need to bow to both of these and other strongly entrenched programs, or else risk relegation to obscurity.
“Next to the many problems the world faces today, debates about online culture may not seem that pressing,” writes Lanier early on in his missive. He notes that other orders of business, such as global climate and economic concerns, have a much higher priority. “But digital culture and related topics … concern the society we’ll have if we can survive these challenges.”
This perspective is hard to maintain, especially during some harrowing passages about the noosphere, trolls, and the Singularity, terms which aren’t discussed much outside of computer science circles but have massive implications for all of humanity.
The Singularity, as Lanier explains it, is simply the vague idea that at some point, a technology will be developed that surpasses human computational ability. In more radical versions, the Singularity becomes a kind of technological Rapture that includes "people dying in the flesh and being uploaded into a computer and remaining conscious, or people simply being annihilated in an imperceptible instant before a new super-consciousness takes over the Earth." One would hope that participation isn't mandatory; I personally enjoy being human.
While the Singularity is not an imminent reality, the noosphere, a term describing the collective brain formed by those connected via the Internet, has already contributed to the degradation of individuality, according to our author. Lanier also refers to the noosphere as a "hive-mind mentality," a cognitive interpretation of digital culture with clear Marxist collectivist leanings. Lanier doesn't find the ideology malevolent, but he provides evidence of its dangers.
When online users are presented with what Lanier calls "transient anonymity" (such as when a new e-mail address – something obtainable in seconds – is the only prerequisite to creating an account that allows a user to spew all the hatred in the multiverse on some poor message board), ornery feelings can snowball into malicious attacks. At least, that's the most generous explanation for behavior such as the digital taunting of the parents and friends of Mitchell Henderson, a seventh-grader who committed suicide in 2006. The first sentence of the first entry pulled up in a Google search for "Mitchell Henderson" reads: "Mitchell Henderson killed himself over losing an iPod, listening to Morrissey and getting bullied for being an [sic] wimpy white kid." The grammatical error is another backhand, an allusion to the apparently comic misspellings of classmates who were audacious enough to take up Web bandwidth in publishing condolences on a memorial MySpace page. Apparently, the first link, which comes from Encyclopedia Dramatica, a Wikipedia-style catalogue of wide-ranging topics with entries marked by parody and satire, was more relevant (at least according to the at times obscure priority hierarchies of Google) than an August 2008 New York Times article on the same subject.
In a chronology of what went wrong in the digital world, Lanier would probably begin with UNIX. UNIX is manipulated by users through a feature called the command line interface; instead of a mouse and cursor, you type a command that can be understood by UNIX and press Enter. UNIX doesn't care whether a human user hits Enter or whether the command comes from another computer program. The computational speed of UNIX, much faster than any human can type, implicitly disparages the sloth-like experience of being human. "As a result, UNIX is based on discrete events that don't have to happen at a precise moment in time," writes Lanier. "The human organism, meanwhile, is based on continuous sensory, cognitive, and motor processes that have to be synchronized precisely in time." People argue that the speed of technology aids human experience. Lanier believes technology's speed confuses our experience, causing us to conform to what the computer asks of us instead of computers obeying human users, which is how the arrangement ideally ought to run.
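The discrete-event model Lanier describes can be sketched in a few lines. This is an illustrative toy, not actual UNIX code, and the command names are invented: the loop simply drains a queue of commands as fast as the processor allows, with no notion of when, or whether, a human typed them.

```python
import queue

# A minimal sketch of a discrete-event command processor.
# Events sit in a queue until processed; nothing in the loop
# depends on wall-clock time or on the rhythm of a human typist.

events = queue.Queue()
for command in ["open_file", "edit_line", "save_file"]:
    events.put(command)

log = []
while not events.empty():
    # The loop handles each event the instant it is dequeued,
    # indifferent to the continuous, precisely timed experience
    # of whoever issued the commands.
    log.append(f"handled {events.get()}")

print(log)
```

The contrast Lanier draws is with the human side of the exchange, where perception and action are continuous and tightly synchronized in time, something a queue of timestamp-free events simply has no way to represent.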
“People degrade themselves in order to make machines seem smart all the time,” writes Lanier. “Did that search engine really know what you want, or are you just playing along, lowering your standards to make it seem clever?” It’s not that search engines are unhelpful. But digital technologies should be ancillary, Lanier argues; the boast shouldn’t be of “smart” technology but of the people who contribute to the meaningful depth of these technologies, from the website developer down to the casual forum wanderer who posts insightful gardening or pet-care tips.
The essential point here is that technology is dumb. Smart phones are not smart and the World Wide Web is not a large brain. A search engine can index more information in ten seconds than a human can in a lifetime, but it can’t cook a meal. A smart phone with voice recognition can dial a friend without the need to push one button, but it can’t tell you the right words to say. Our willingness to accept the prevailing advertising behind “intelligent” technologies is indicative of what Lanier calls the spiritual failure of digital culture, “redirecting the leap of faith we call ‘hope’ away from people and towards gadgets.”
"It seems ridiculous to have to say this, but … let me affirm that I am not turning against the internet. I love the internet," Lanier asserts. In many respects, the Internet and technology in general have been a great boon to human experience, and Lanier understands this, even if his comments at times seem unreasonably harsh. For instance, he mentions an online forum for musicians who play the oud, a string instrument from the Middle East. Without the Internet allowing access to the forum, information on the instrument would be very hard to come by, and those who actually play it would be at a total loss unless they lived in the Middle East.
Lanier is trying to remind us of the great responsibility we all have to keep the evolution of technology honest and subservient to the human race instead of the other way around. He waxes poetic about songles, physical objects (within which song files are implanted) that can interact with media players in a local area, or Second Life, a 3D virtual reality world with a social interface that discourages trolling. He even proposes a compelling idea: paying for Internet access by the number of bits accessed instead of a flat monthly fee. In Lanier's proposal, you would also be paid according to the number of people accessing your information.
Lanier's manifesto lacks direction at points, and the jargon gets even worse than some terms I've included here (Bachelardian neoteny? Holy crap…) But even if it can't tell the common person what to change, You Are Not a Gadget gives you a clear idea of what to resist. Lanier even gives a short list of ideas early in the book that can help us remain individuals on the Web, including (ironically?) creating a website for personal expression or posting a video that takes one hundred times longer to create than it takes to view. Unless we actively take a stand to assert our individuality and create a culture of understanding instead of violation, we take the heavy risk of being reduced to fragments of data, with all the global social unrest that invites.
So turn your computer off and go for a walk. Soak up all the little things programs can't even recognize, much less enjoy. Maybe even read You Are Not a Gadget, which is very lively company. And when you turn your computer back on (as you must, because that world wants – and deserves, let's not forget – your attention too), think about it differently.
Steve Brachmann is a freelance writer and actor from Buffalo, NY. He has had work published in Dissolver Magazine, Image Icon Entertainment, Northeastern's Times New Roman and The Buffalo News. His personal blog can be found at http://scubasteve519.livejournal.com/.