Erik Larson

Dec 18, 2013

Continuation of Things Past

The prior post is rough and this one promises to be choppy. Some notes towards an article, that’s all.

Deconstructing the Web

(1) The Web paradox is something like this: once you start treating people like information processing systems—and I’ll explain how this works with the cognitive-social model on the Web—“deeper” and core creative intellectual acts lie outside your scope. So the paradox is that all the information at your fingertips leads, in the end, to having less knowledge. It’s something like a law of human thinking, comparable at least metaphorically to a law of thermodynamics: you can’t get something for free. You want lots of information? You have the Web. You want, as Carr puts it, concentration and contemplation? You have to get off the Web.

(2) None of this really matters—even if you accept the thesis here—if you have an instrumentalist view of technology; you won’t see the danger or the problem. But part of my argument is that there is no such thing as instrumentalism; the Web is paradigmatically non-instrumentalist. In fact, you can go “realist” about the non-instrumentalism of the Web and point to actual brain science: our brains are literally changing. So it’s not a philosophical debate. It’s true.

(3) Getting all the positives of endless information without succumbing to the underlying cognitive-social information processing model is the Big Question. There are two ways to approach this.

(a) Introduce a distinction between Web use and “full” or “natural” human thought and action. A good example here is the distinction between using a network to discover a physical book (say, on Amazon), and actually reading and absorbing what the book says (say, by buying it and then reading it in the physical world).

(b) Change the Web. This is an intriguing possibility, and I think there are a number of promising routes here. Most of the thoughts I have on this matter involve a principle I “noticed” a few years ago about expertise. Call it the “natural world” principle (I’ll think of a better title), but here are some examples to motivate it:

(1) Someone writes a blog about driving Highway 101, which he does every summer.

(2) Someone writes a review on Yelp about the French cafe in the Mission District in San Francisco, and the reviewer spent the afternoon at the cafe just last week.

(3) Someone writes an article on Heisenberg’s Uncertainty Principle or Sartre’s Being and Nothingness on Wikipedia, and the person has a degree in mathematics or physics or just took a course on French Existentialists at the University of Kentucky (or wherever).

Revolution Cometh

In all of these examples, there’s a principle of knowledge at work, and underlying this principle there’s another of, say, effort. Someone did some actual work in every example. For instance, the fellow with the travel blog actually drove the highway (it’s long, it takes time). Or the customer at the cafe actually went there, and sat down, and ordered an espresso and a croissant. The effort principle underlies the knowledge principle because, well, it takes effort to know things about the world. And whenever people know things about the world and translate this knowledge into bits of information online, then—as with all communication—we can learn (if not experientially, at least cognitively) from those bits, by reading them. In this guise nothing is really that different from fifty years ago; it’s like looking at microfiche, say. Doing research. Learning.

But the effort principle is inextricably tied to the knowledge principle, and this is where this model departs from the current Web model. For instance, something like “Web 2.0”, or what Lanier pejoratively calls the “hive mind”, pulls the effort and knowledge principles apart. Here, a bunch of anonymous “Web resources” (people online) all chip in little bits of effort to make a finished product. Like, say, a Wikipedia entry. The big fallacy here is that you get something from nothing—no single contributor ever really needs to know much about quantum mechanics, or atheistic existentialism. The focus here is not on what an individual might know (an “expert”) but rather on what many anonymous non-experts might collectively “know.” And this is where all the trouble starts; for the information processing model that gives rise to the negative conclusions of a Carr or a Lanier (or a New Yorker article about Facebook) is ideally suited to the cognitive-social model that ignores physical-world expertise and the effort it takes, in favor of anonymous Web resources. If information is processed, hive-like, by so many resources, then—like any information processing device—the process is what ultimately matters, not the knowledge of experts. Expertise emerges, somehow, out of the process of information processing itself. Indeed, that what we call “expertise” is actually structural, and exploitable by algorithms, is precisely the idea driving the mega-search company Google. We’ll get to Google later.

So to conclude these thoughts for now, what’s driving the negative conclusions of Lanier-Carr (to put their conclusion memorably: “the Web is making us stupid”) is our participation in an information processing model that is more suited for computers than for people. As this becomes our cognitive-social model, of course we’re getting stupider—precisely to the extent that computation or information processing is not a complete account of human cognitive-social practices. This point is why someone like Lanier—a computer scientist at Berkeley—can ask “Can you imagine an Einstein doing any interesting thinking in this [Web] environment?” He’s pointing out, simply, that innovation or true creativity or let’s say “deep” things like what Einstein did have little in common with much of what passes for “thinking” on the Web today. It’s not just that lots of people are online and many people aren’t Einsteins; it’s that lots of people are online and they’re all doing something shallow with their heads without even realizing it. As Carr puts it so well in The Shallows, they’re surfing instead of digging into ideas; skimming longish text for “bullet points”, jumping from titillating idea to idea without ever engaging anything. And, echoing Heidegger again: since the Web isn’t simply an instrument we’re using but is in fact changing us, the question before us is whether the change is really good, and whether the cognitive-social model we’re embracing is really helpful.

All the way back to the beginning of this, then, I want to suggest that far from steering us away from the Web (though this simple idea actually has legs, too, I think), the really suggestive question is how to encourage the knowledge-effort principle in the sorts of technologies we design, implement, and deploy online. I use Yelp, for instance. I use it because someone who actually visits a restaurant is a real-world “expert” for my purposes in choosing to spend an hour there. It all lines up for an online experience, in this case. They did the work, got the knowledge, and even if they’re no Einstein, they’re an expert about that place in the physical world (that cafe in San Francisco, with the great espresso).

And likewise with other successes. Wikipedia doesn’t “work” as a rival to a traditional encyclopedia like Britannica because the “hive mind” pieced together little bits of mindless factoids about quantum theory and arrived at a decent exposition of Heisenberg’s Uncertainty Principle (magic!). It works because, among all those little busy bees online, one of them had actual knowledge of physics (or was journalistic enough to properly translate the knowledge of someone who did).

But again, the problem here is that the Web isn’t really set up to capture this—in fact much of the Web implicitly squelches (or hides) real-world categories like knowledge and effort in favor of algorithms and processing. When Google shows you the top stories for your keywords “health care crisis”, you get a virtual editorial page constructed by the Google algorithm. And when you key in “debt crisis” instead (you’re all about crises this morning, turns out), you get another virtual editorial page, with different Web sites. Everything is shallow and virtual, constructed on the fly by computation, and gone the moment you move on to the next thing. You’re doomed, eventually, to start browsing and scanning and acting like an information processor with no deeper thoughts yourself. So it’s a hard problem to get “effort” and “knowledge” actually built into the technology model of the Web. It takes a revolution, in other words. And it starts with search.

Search is the Alpha and Omega