
Lab v. Library

Proust Was a Neuroscientist

By Jonah Lehrer
Houghton Mifflin, 2007

Charles Percy Snow is perhaps best remembered for his 1959 Rede Lecture at the University of Cambridge, in which he identified two distinct cultures into which his (Western, male) academic acquaintances seemed to fall. Both cultures possessed comparable levels of intelligence, both came from similar backgrounds, both were paid about the same amount for their work—and yet, neither had much, intellectually speaking, to say to the other. These cultures often regarded one another with open hostility. Scientists and non-scientists—or, if you prefer, scientists and “literary intellectuals” charged with maintaining a “traditional culture”—did not, apparently, mix well. Indeed, any reconciliation between the sciences and the humanities was doomed by what Snow called a “mutual incomprehension” so great as to have created, from what should have been simply two fields of knowledge, two entire cultures, each bound to its particular world view, and each bent on disavowing the other.

 
Snow illustrates this polarity with an anecdote, in which he, at a dinner party attended by otherwise very bright humanities scholars, asks his fellow guests to describe the Second Law of Thermodynamics:

The response was cold: it was also negative. Yet I was asking something which is the scientific equivalent of Have you read a work of Shakespeare’s?

I now believe that if I had asked an even simpler question—such as, What do you mean by mass, or acceleration, which is the scientific equivalent of saying, Can you read?—not more than one in ten of the highly educated would have felt that I was speaking the same language.

Few would disagree that little progress has been made to bridge the “communications gap” in the nearly half-century since Snow’s lecture. Certainly scientists read, and read Shakespeare, and it is likely that more than one humanities scholar has worked out that the Second Law of Thermodynamics states that the entropy, or disorder, of an isolated system tends to increase over time (and that mass is the amount of matter an object contains, independent of the gravitational force exerted on it; and that acceleration is the change in velocity per unit of time). Yet these remain two distinct cultures with two distinct methods of discerning truth. Will they ever be able to speak the same language?

Jonah Lehrer believes they can, and argues as much in his first book Proust Was a Neuroscientist. In fact, Lehrer believes that even a third culture, posited by Snow and put into practice by various popular science writers who “translat[e] their truths for the masses,” is currently failing to reconcile science and art. Those popularizers working in this third field—Lehrer lists Richard Dawkins, Brian Greene, Steven Pinker, and E.O. Wilson—explain science in laymen’s terms to demonstrate the role science plays in our everyday lives. According to Lehrer, though, these translators (saving perhaps Wilson) fail at creating the kind of third culture for which Snow laid the ground; by granting preeminence to the world of science, they make few attempts to connect the two cultures on a fundamental level. More problematically, these third-fielders often misunderstand the humanities, or are even openly hostile to them. In a rebuttal to a dismissive work of criticism Pinker wrote about Virginia Woolf, Lehrer points out that “[p]ostmodernists have ignorantly written off science as nothing but another text, and many scientists have written off the humanities as hopelessly false.”

To extricate ourselves from this quagmire, Lehrer claims we need a fourth culture. This is the work he proposes to do in his book, to show “how art and science might be reintegrated into an expansive critical sphere.” It is a noble and necessary cause, and one which Lehrer undertakes with a good deal of enthusiasm. He gathers a handful of nineteenth-century artists—the nineteenth century being both an “age of anxiety” and a “thrilling time to be studying science”—and explains how their work expressed, predicted, anticipated, paralleled, or intuited truths about the brain that modern science has only just proven to be true. Under Lehrer’s rubric, Walt Whitman helped bridge the Cartesian split, George Eliot predicted adult neurogenesis, the chef Auguste Escoffier linked smell and taste to anticipate the sense of deliciousness, Paul Cézanne understood that higher processing centers of our brain force the data our eyes receive into recognizable patterns, Stravinsky forced us to realize that we had to pay attention to music, Gertrude Stein tore down the behaviorist view of language, and Virginia Woolf theorized the emergence of the conscious self from the brain.

And Proust? Proust was a neuroscientist, like the others, not because he worked in a laboratory (as Lehrer himself did, a fact he mentions in the book’s very first sentence) and not because he was intimately acquainted with contemporary work on the brain (in which hypotheses were just beginning to be made about the existence of the synapse). No, Proust was a neuroscientist because he could imagine the processes by which memories are created and stored. Proust, “by sheer force of adjectives and loneliness…intuited some of modern neuroscience’s most basic tenets,” Lehrer claims. “As scientists dissect our remembrances into a list of molecules and brain regions, they fail to realize that they are channeling a reclusive French novelist.”

If the premises sound at times slightly implausible, it is not for lack of trying on Lehrer’s part. Each of his chapters follows roughly the same format: he begins with a bit of bio-historical background on his artist, introduces the contemporary neuroscientific milieu in which the artist worked, then demonstrates how modern science now knows the artist’s intuitions to hold true. In his chapter on Cézanne, for example, Lehrer begins with Woolf’s famous assertion that “on or about December 1910 human character changed.” (This is, in fact, exactly what Steven Pinker objected to, rather pedantically arguing that no such thing happened in any biologically demonstrable way.) A 1910 exhibition of post-impressionist paintings introduced Cézanne’s abstracted art to the world—and he was immediately derided as someone “literally insane,” who produced “nothing more than an ugly untruth, a deliberate distortion of nature.” His critics adhered to the belief that the eye, camera-like, faithfully recorded everything it saw and transmitted it to the brain; thus perfect art would represent reality with a similar faithfulness. Cézanne already knew how wrong they were; modern science would discover it years later.

The eyes, Lehrer explains, are actually complicated mechanisms for translating photons of light into electrical impulses that then travel to parts of the visual cortex. A series of seminal experiments by David Hubel and Torsten Wiesel, beginning in the late 1950s, confirmed as much. Other regions of the cortex respond to these impulses, interpreting them into shapes and colors as they are transmitted to the part of the brain responsible for conscious thought: “Form is imposed onto the formless rubble of the [visual cortex]; the outside world is forced to conform to our expectations.” How does Cézanne enter into this picture, then?

Cézanne’s art [Lehrer writes] exposes the process of seeing. Although his paintings were criticized for being unnecessarily abstract…they actually show us the world as it first appears to the brain. A Cézanne picture has no boundaries or stark black lines separating one thing from the next. Instead, there are only strokes of paint, and places on the canvas where one color, knotted on the surface, seems to change into another color. This is the start of vision: it is what reality looks like before it has been resolved by the brain…. Because he gives the brain just enough information, viewers are able to decipher his paintings and rescue the picture from the edge of obscurity.

So Cézanne has translated the brain’s visual processes onto the canvas, which is why we must work so hard to understand his paintings. This argument is certainly intriguing, but it seems to disregard the fact that all paintings are only strokes of paint and knots of color. Various artists will make their viewers work harder or less hard to make the strokes and knots cohere, according to their talents or the dictates of their imaginations, but the electrical impulses will travel along the same neuronal pathways. Lehrer’s argument here, and throughout, is less a merging of art and science than an explanation of art using science—which is not necessarily a bad thing, but it does appear to run counter to his desire for a fourth culture capable of the former rather than the latter. What he manages is to invert the postmodern tendency to read science as “nothing but another text” by reading art as just another kind of science.

 
Again, this is not an uninteresting direction to take. Lehrer’s chapters on Escoffier and Stein are his strongest because they are most convincing from a neurological perspective. It is easy to believe that Stein, by playing around with the limits of language and grammar, could sense its failings, its attempts at rigidity, its necessities, its slipperiness. (It helps Lehrer’s case that Stein herself studied neuroanatomy at Johns Hopkins.) She could define language “not in terms of its expressive content—her writing rarely makes sense—but in terms of its hidden structure. When meaning was stripped away, that is what remained.” And that “hidden structure” translates relatively smoothly into Noam Chomsky’s deep structure and generative grammar of language. Linguistics is less a branch of neuroscience than a science in its own right, and language is the currency both of linguistics and literature, so Stein and Chomsky seem well matched.

The chapter on Escoffier also works, largely because of Lehrer’s vivid and avid descriptions. Escoffier codified the process of deglazing (when a small amount of wine is poured into the pan in which meat has been cooked in order to make a sauce or gravy), which transforms denatured proteins into something delicious: L-glutamate, the amino acid responsible for umami, a taste whose existence would eventually deconstruct the dominant theory of the four primary tastes (salty, bitter, sweet, and sour). Escoffier collected, popularized, and systematized traditional French cooking methods into haute cuisine and, in doing so, laid the foundations for the modern kitchen—a far cry from the chaotic and disorganized workspaces of his peers. At the same time, Escoffier served his creations thoughtfully, paying attention to how their aromas and flavors combined, as opposed to the arcane, ornate, and generally inedible delicacies that were standard fare at the time. As innovative as Escoffier’s methods were, any cook will tell you that creating food is a kind of alchemy, a science we can all practice without fully comprehending the chemical processes responsible for the tasty outcome. Escoffier’s insistence on “a leisurely culinary narrative” makes intuitive sense when we think about the fact that our olfactory system is intertwined with our gustatory system.

And yet, this book proves problematic in many areas, chief among which is Lehrer’s treatment of the history of science. Lehrer often characterizes neuroscience as a number of brilliant ideas lying dormant—or, more generally, intentionally ignored—among vast arrays of incorrect or ill-conceived theories. The brilliant ideas, because they are correct, eventually find their way to the laboratory, are carefully tested, and finally hailed as they should rightly have been ages ago. In his chapter on Escoffier, for example, Lehrer relates the story of Kikunae Ikeda, a Japanese chemist working in the early twentieth century. Ikeda (correctly) theorized and discovered the existence of umami, or deliciousness, a flavor distinct from the classical Aristotelian four tastes (sweet, sour, salty, and bitter) and caused by the presence of glutamic acid. Yet, as Lehrer puts it, “Ikeda’s research, although a seminal finding in the physiology of taste, was completely ignored. Science thought it had the tongue figured out…[S]cience persisted in its naive and unscientific belief in four, and only four, tastes.” Ikeda’s work managed to gain “a cult following” which eventually produced monosodium glutamate, or MSG, “[d]espite the willful ignorance of science.”

Elsewhere, Lehrer writes of scientists’ work being “marginalized” or “attacked” by the prevailing scientific community, despite the fact that these brave rogue scientists would eventually be proven to have been right all along. Such characterizations of science are frequent within the historical community (a popular book about Renaissance anatomy, for example, devotes most of its time to explaining just how wrong people’s theories about the body were in the 1600s). Indeed, it makes for a more compelling read to think of the lone scientist solving a problem against all odds, in the face of derision from his colleagues and under the threat of losing funding from his university.

But does science work this way? It moves forward slowly and is self-correcting (as Lehrer takes pains to point out), but it is not the process of finding a needle in a haystack. To think properly of a given historical period requires us to pay far more attention to contextual details. Ikeda’s work didn’t remain hidden because of “the willful ignorance of science”; certain scientific theories dominated at that time because there was not enough evidence to the contrary. To claim at the same time that science is both self-correcting and willfully ignorant is paradoxical, and worse: it compromises the integrity of an historical project by suggesting that hindsight is all that is necessary to find the truth. Neuroscientists would understand, better than most, the necessity of balancing adherence to accepted theory and self-correction, for no part of the body is as little understood as the brain, none in which uncertainty still plays such an enormous role.

Also distressing, not just in Lehrer’s work but in the scientific culture in general, is the assumption that humanities criticism needs no special translation of its truth to the masses. Lehrer’s use of literature is simply that: use of literature, the pulling of quotes to illustrate a scientific point, a selectivity that makes the words fall flat. Though he is adept at explaining scientific work clearly, concisely, and without sounding condescending (certainly an important talent), he does not devote the same energy to dealing with the artistic work:

Since soul is body and body is soul, to lose a part of one’s body is to lose a part of one’s soul. As Whitman wrote in “Song of Myself,” “Lack one lacks both.” The mind cannot be extricated from its matter, for mind and matter, these two seemingly opposite substances, are impossibly intertwined. Whitman makes our unity clear on the very first page of Leaves of Grass, as he describes his poetic subject:


        Of physiology from top to toe I sing,
        Not physiognomy alone nor brain alone is worthy for the
        Muse, I say the Form complete is worthier far.

And then Lehrer proceeds to discuss something else. He relies on the phrase “As X wrote…” far too often in his writing, and to use such a phrase marks a simple juxtaposition of two things rather than a concern for their underlying connections. It is fine to place neuroscience next to art, but how can we make them talk to each other? What exactly is Whitman saying here? Why is the “form complete” worthier to sing of than any single part of the body? Why use these lines when you yourself are trying to prove that the brain is the thing “worthier far”? Lehrer’s use of the words “anticipated,” “foretold,” “predicted,” and so on, also participates in such juxtaposition. To say that a work of literature anticipated science is to relegate the literature to the service of science. Of course, this is not a work of literary criticism, but to call for a fourth culture that integrates science and art requires more attention to the art, right?

Charles Percy Snow’s anecdote bears a little deeper thought here, for it actually points to a kind of supposed inequality skewing the one culture’s view of the other. “Have you read a work of Shakespeare’s?” and “Can you read?” are not, in fact, scientific equivalents to “What is the Second Law of Thermodynamics?” and “What is mass, or acceleration?” (They certainly ought to be equivalents, but this is a problem for our own educational system to resolve.) Most people, regardless of their affiliation with the first or second cultures, can read, and many can read Shakespeare; if reading were all that was necessary to be a “literary intellectual,” there would be a great many more English scholars than there are today. A more correct way to phrase the question would have been to ask, “Do you understand the culture that produced Shakespeare, or how he related to his contemporary authors?” or “Do you know how to close-read?” The point is that Lehrer’s work purports to merge science and art but winds up explaining art using science. What artistic criticism there is does not move beyond what is necessary to support Lehrer’s scientific claims. Perhaps this is the direction in which this fourth culture will move, but if so the future seems a little bleak for the arts.

I add to Snow’s my own anecdote: Several years ago, I was interviewing for a graduate position in a neuroscience program, and I had a chat with one of the professors in whose lab I could potentially have worked. I mentioned that I’d studied literature alongside neuroscience as an undergraduate, and that I was also applying to English graduate programs. “Oh,” she said, “you don’t need to go to graduate school to study literature! And if you did, you certainly couldn’t keep up with neuroscience. But if you came here, you could keep up with literature. You could join my book club; I’ll put in a good word for you.”

What we seem to need, alongside Lehrer’s fourth culture, is a fifth culture, one to unify current humanities criticism in answer to science—and one through which science realizes that it is itself a layman to the humanities.

___
Lianne Habinek is a PhD candidate in English literature at Columbia University. She is working on a dissertation about literary metaphor and 17th-century neuroscience.