Like most people currently attending this college, I knew a fair amount about quite a few things when I was in high school. I could tell you who the twenty-eighth president of the United States was or ramble coherently about central place theory. I was able to integrate things. If someone handed me DNA helicase, I would have known what to do with it. I even knew what the word “reticent” meant. Now I’m not so sure. This doesn’t keep me from using it in sentences. “Have you tried the butternut squash soup?” I ask my friends. “I hear it’s really reticent today.”
Recently, during a study at the business school, someone asked me a relatively simple question about the number of elves required to make X widgets in four hours if one elf could make Z widgets in two hours. I was instantly bewildered. “I think this is a real commentary on elf working conditions,” I wrote next to the question, before bursting into tears and muttering something about post-feminism.
Of course, we should know better. The moment the class of 2010 arrived at Harvard, we were treated to a talk by Dean Jeremy Knowles, who informed us that the main benefit of a Harvard education was that it would enable us to “talk rot” intelligently and—more importantly—to identify when others were talking rot. He was certainly right about the first part. I have described more things as “post-structuralist critiques of modern mores” than you can shake a stick at. I have no idea what any of those words mean, but when strung together in that order they seem to have quite a startling effect on people. It has gotten to the point where I work this phrase into as many sentences as possible. “Want to come back to my place?” I ask dates. “I’ll show you my post-structuralist critique of modern mores.” So far it hasn’t worked, but I have great hopes.
But the problem lies in the second part. Sure, I’m able to talk rot. But four years of college have taken away my ability to tell if others are talking rot as well. Beyond what I’ve learned in my concentration, I’m amazed how little knowledge I’ve managed to accrue since arriving at Harvard. As a starry-eyed freshman, I signed up for a proof-based linear algebra class instantly after arriving. “I will not regret this!” I informed my somewhat bemused parents. “Rigorous proof-based linear algebra is something I see playing a pivotal role in my future.” Now I’m an English concentrator. Math 23 spelled the end of my relationship with math—doomed, like many modern relationships, by trying to go too far too fast. I filled the rest of my “hard science” requirement by taking a course called “Nanothings.” I’m not saying that this course wasn’t a rigorous exploration of new science, but one of the “Key Lessons” of the entire semester was “Small things are different.” Instead of a final paper, I wrote a play about a ray of light trying to decide if he were a particle or a wave. I enjoyed myself tremendously, but, as a consequence of this class, you can tell me that someone has built a tiny robot capable of reading all our minds and I will believe you. “Sounds about right,” I will say, nodding sagely. “Small things are different.”
Since coming to college, I have learned extremely little about a bizarre range of oddly specific subjects. And it’s not strictly my fault. The Core—or now, Gen Ed—curriculum caters to our basest impulses. If you can get away with studying dinosaurs instead of taking a rigorous survey of new developments in science, why wouldn’t you? Who doesn’t love a good dinosaur? Besides, it only meets on Tuesdays and Thursdays, so it won’t conflict with all the rigorous drinking you have scheduled. And this problem isn’t just for science. To receive credit for Historical Study, I can take courses as broad as American Constitutional History from the Framing to the Present or as narrow as Frontiers of Europe: Ukraine Since 1500. I’m not saying that Ukraine isn’t deeply significant, but if I can graduate from college without knowing anything about the history of something that isn’t Ukraine, that seems like it’ll make me irritating at cocktail parties. Sure, they say those who don’t know history are doomed to repeat it. But as long as the part I’m repeating is the part where America prospers and makes strides in civil rights instead of the part with slavery and no flush toilets, this ought to be fine.
Whenever I suggest that I’ve gotten dumber since going to college, my friends nod sympathetically. Part of this is because of the apparently inevitable narrowing of interest that follows a decision to concentrate in one subject rather than another, to read Milton instead of Machiavelli. Part of this stems from the distribution requirements that allow us to graduate with smatterings of disconnected, specific bits of knowledge rather than any sort of larger picture. But it also stems from the way information itself has evolved. Since we were in high school, we’ve moved into a model where knowledge is increasingly stored “off-site.” Instead of memorizing trivia or reading books, we google things. If someone somewhere out there has the answers, and we can obtain them at the press of a button, why bother actually learning? We’re quickly turning into a generation of intellectual Blanche DuBoises—depending on the kindness of strangers for everything from facts about plants to driving directions. But something gets lost in this process. Because we could hypothetically know everything, no one winds up knowing anything. And if the benevolent person out there who has the particular fact we want ceases to be benevolent—if Google censors its search results, or if, in the perpetual shouting match of the internet, the facts appear to conflict—it becomes impossible to sort out the rot from the not. Little knowledge is a dangerous thing.
Then again, Socrates said that the beginning of wisdom is the knowledge of your own ignorance. Maybe that’s what this is.
Alexandra A. Petri ’10 is an English concentrator in Eliot House. Her column appears on alternate Fridays.