Jaron Lanier’s Book “You Are Not A Gadget: A Manifesto”

This is the kind of book I like: one by a knowledgeable person in a scientific field, a creative, even quirky, person, in fact, and one taking a contrarian point of view. Lanier was certainly raised to be different; his entry in Curious Minds: How a Child Becomes a Scientist reveals the perfect if somewhat lonely childhood for a creator, right down to getting to design his family home. He apparently went to public school, but his peers seem to have been scientists working at Los Alamos.

But back to that point of view. Lanier is no Luddite; he’s a computer scientist, after all, and most Luddites are not especially tech-savvy. Still, he’s not so sure that the internet has made us better or happier people. Valuing the development of the individual person, Lanier is suspicious of at least two important ideas creeping in (he says) through the ways computer scientists set up interfaces, software, and the internet. First, humans are induced to think of their minds as computers, at least to some extent. Second, the “hive mind” is seemingly validated as having a “legitimate point of view” that overrides those of unique individuals (4).

Lanier’s assertions and questions, though, while it is reassuring that someone prominent in computer science cares about philosophy and the human experience, largely supply more reasons to be concerned about technology’s effect on humankind. Because this technology forms “extensions to your being” and influences how people “connect to the world and other people,” it can “change how you conceive of yourself and the world” (5-6). As Lanier says so well, “We tinker with your philosophy by direct manipulation of your cognitive experience, not indirectly, through argument” (6).

The big, obvious problem with this is that we don’t know when our experience is being manipulated. Doesn’t that sound Orwellian to you? It reminds me of the time George Orwell sent an article about WWII England to the USA, and the editor happily reported that it had arrived with nothing censored out; in fact, the censor had retyped the offending page so that it was not apparent anything had been removed, and Orwell only figured this out because the government (possibly by accident) sent him a letter telling him his article had been censored. According to The Filter Bubble, when I Google a topic, I’ll get different answers than someone else googling the same words. In both cases, we don’t know when or how we’re being censored or directed.

We get what we want, or at least what we’ve wanted in the past. And that’s one of the problems that Lanier touches upon: “Being a person is not a pat formula, but a quest, a mystery, a leap of faith.” Will Google let me become the person I’m capable of becoming if it only shows me stuff I already know and/or like?

Of course, I live in a world beyond the internet. But that real world is much less cooperative than the one in front of me on the screen. I don’t notice it much in myself, but I see it in my son: he’s in control on the computer, where pushing buttons makes him jump, climb, and shoot. It’s all fast and easy. Once his mother lets him on the computer (and there’s plenty of friction over that topic), the friction stops. Lanier discusses this himself: a creative mind can communicate more easily at a young age through technology than through speech. In other words, a smart kid can take a lot in at an early age but sometimes has difficulty expressing himself, and technology can help him. As technology becomes more ubiquitous, though, the child may never wean himself from this too-easy medium of not-quite-full self-expression.

I’ve touched on only a few of the topics that struck me as most interesting, but Lanier’s book raises many issues that engineers would do well to consider and try to address.