Karthinking About Partying

Immortality is just a keg stand away


I used to party hard. Then sophomore year ended, I filled out my Q Guide evaluations, looked up my transcript, and stopped. I’ve done well by my renewed focus, but as I look ahead to May of my “senior spring,” I—like many other 2012ers—see a rare break in my life of labor: With no exams, no classes, and no job for 20+ days, it will be an opportunity to rage like no other. On the cusp of adulthood with childhood firmly in the rearview, we’ll find ourselves for the last time surrounded by each other and without any real responsibilities. Bottoms up.

But there’s a problem: I don’t know if I still have it in me. The hangovers are worse. My tolerance isn’t what it used to be. Hair floating in a solo cup of lukewarm Keystone doesn’t taste quite as good as it used to. Sedate Harvard Square bars and mediocre Boston clubs are more attractive diversions these days than sticky, black-lit dance floors and Rubinoff. Can I still party like it’s 2009? I’ve found that these pressing questions are by no means unique to me. In prepping myself not to squander a single minute of this last-ever opportunity to be super social for three straight weeks, I turned to science! This week—with my last column ever—I’d like to Karthink about my morally hazardous justification for the benefit of my similarly hesitant peers: Partying all May will make you live longer.

First up: telomeres. Those of you who suffered through LS1b may remember the word. Telomeres are repetitive sequences of non-coding DNA at the very end of each of the chromosomes that make up the human genome. The DNA copying process is pretty good, but it can’t get all the way to the end of a strand, so a bit of telomere is lost each time a cell copies its DNA during division. Telomeres are great because they stop you from losing real DNA during replication; they can also stop cancer by limiting the number of times a cell divides. Unfortunately, shorter telomeres also mean you’re an old fart who’s about to die.

It’s well known that stress—something Harvard students have plenty of—makes telomeres degrade faster. That’s no good for those of us who want to stay alive. But some recent studies have shown that managing the way you react to that stress can not only stop telomere shortening, but even reverse it. Is there any better way to balance out the stress of four years of grueling problem sets and essays than with three weeks of outdoor debauchery in the May sunshine? Unlikely.

Next up: epigenetics. This relatively new area of biology aims to understand the way that environmental factors contribute to relative levels of gene expression. In a bit of neo-Lamarckism, some studies even indicate that epigenetic patterns are inheritable from one generation to the next. Specifically, environmental—and even social—factors may dictate an ever so slight chemical addition to a gene’s DNA—known as methylation—that regulates whether and to what extent that particular gene is expressed. Methylation is usually paired with some evolutionary benefit specific to the environmental context that causes it.

Happily for my cause, a new study covered in last week’s Economist found a strong correlation between gene expression and social rank in a group (they only used girl monkeys, but whatever). The correlation was so strong that looking at cell samples alone let researchers correctly pinpoint an individual’s social rank 80 percent of the time. What were the genes in question? Low-status monkeys made more of the proteins needed for immune response because they were more likely to get sick. This, however, resulted in greater chronic inflammation, the kind of thing that can lead to heart disease or Alzheimer’s over time. The relevant thing about epigenetics is that as one’s social environment changes, so, too, does one’s gene expression. We antisocial monkeys still have a chance to change our methylation patterns and live longer if we make a few new friends by partying away this May.

But you might be wondering: Karthik, won’t substance abuse give me liver cancer and alcohol poisoning? To detractors I say, ignore the years of personal experience and medical knowledge you may have. Just pay attention to another study I found on Bing last night instead. Yes, Bing. It found that loneliness can make you sick. Loneliness—as with being a beta—tilts your gene expression to better protect you from antisocial dirty-keyboard-to-person bacterial diseases instead of social person-to-person viral diseases. That all sounds well and good, but—again, as with being a beta—protecting too heavily against bacterial disease also can lead to chronic inflammation and its negative consequences on longevity. In fact, the study found that loneliness’s “effect on mortality is comparable with that of smoking and drinking.” Over seven and a half years, a gregarious person has a 50 percent better chance of surviving than a lonely one. So don’t worry while you’re out there partying: All the damage will be canceled out.

I recently found myself explaining the “going-out/staying-in” dichotomy to a friend who goes to school in Arizona. He literally didn’t even know what the terms meant. I welcome May as a time to put memories like that one behind me and do college like it’s Blue Mountain State. Given the preponderance of evidence, making May a month of epically stress-free, alpha, and gregarious partying will only make me live a longer, fuller life. But what’s the real message here? Science is cool! It can be used to justify anything.

Karthik R. Kasaraneni ’12, a former Crimson associate editorial editor, is a chemistry concentrator in Lowell House. His column appears on alternate Thursdays.