In less than a month, I will be graduating from Harvard University. I will spend the rest of my life hiding the fact that I bleed Crimson—while subconsciously finding any opportunity to show off the alma mater I worked so hard to gain acceptance to—and making jokes about Harvard Time to people who don’t care or understand (i.e., everyone). Since the end of my time as a Harvard student carries somewhat more gravitas than the end of my time as a Crimson columnist, I will use this last piece as a final goodbye to an institution that has given me so much over the past four years.
Reflecting on my time here, there is nothing I would rather do than thank Harvard for all that it has done, and what better way to thank an academic institution than to list all that it has taught me? So Harvard, a sincere thank you for teaching me the following:
Next month will be the 20th anniversary of the day my father permanently lost vision in his left eye due to a grave surgical error. Currently, this eye is slightly shriveled, glossed over, and completely blind — a physical reminder of the frustration my father felt as a victim of medical malpractice. Exacerbating this already tragic situation, my father never received a clear answer as to why this complication occurred, because his surgeon avoided any inquiries about the subject. Almost all medical malpractice testimonies share this common theme of unanswered questions and helpless frustration. If anything, my father was “lucky” to realize his predicament days after his surgery. Dr. Frederick S. Southwick, for example, shares how a surgical mistake 17 years ago eventually resulted in the amputation of his left leg.
Before embarking on an arguably justifiable tirade against the quality of America’s modern medical malpractice system, let’s stop to consider what compelled these surgeons to avoid admitting their mistakes. From the doctor’s perspective, there are several compelling reasons not to admit preventable errors. These include today’s negative stigma associated with making mistakes in medicine and a lack of meticulously accurate quality inspections, which makes it easier for doctors to get away with their mistakes if they so choose. It’s easy to see why doctors would want to keep their mistakes secret, given the consequences they face if a mistake proves fatal: losing a patient’s trust, millions of dollars in a malpractice case, and even a medical license. In order to persuade doctors to admit their mistakes, we need to create an environment in which the reasons to own up to errors are more compelling than the reasons to keep them hidden.
Overcoming history is never an easy task. Yet recent efforts by student organizations and the administration here on campus are attempting to reverse a century’s worth of prejudice against mental illness by opening new discussions on how it should be addressed. These efforts are commendable and clearly much needed, given the recent slew of meetings and rallies and the anonymous Crimson piece on the subject; as a product of the bring-back-hot-breakfast era, I am glad to see Harvard unified on a genuinely substantial and often-neglected issue.
What does concern me, however, is whether this sudden urgency to talk about mental illness will last. This is not the first time Harvard has been questioned on its mental health capabilities, and its prior solution of hiring psychiatrists is concerningly similar to what it hopes to do now to address the issue. Though the current student-led efforts to increase awareness are at a much larger scale than those of the past, more needs to be done to ensure mental illness stays at the forefront of the campus consciousness. Harvard is not the only place that needs to reconsider how mental health is viewed; this is something that should be addressed on a national scale.
Last year, more than 160,000 Americans died from lung cancer. It was the leading cause of cancer deaths, and it is expected only to increase in both prevalence and incidence over the coming years. Yet ask the average American what he thinks the most common type of cancer is, and lung cancer does not immediately cross his mind. Especially in recent years, publicity surrounding lung cancer awareness has been scarce; I only learned that November is Lung Cancer Awareness Month—or that lung cancer even had an awareness month—this past year, after a friend’s recommendation to read a New York Times blog on the subject. What’s most ironic is that as lung cancer surpassed breast cancer as the leading cancer killer among women, awareness of the disease began to steadily decrease.
Why do we forget such a dangerously persistent killer? Or perhaps a better question would be, is our ignorance by choice? These questions are difficult to answer because they force us to question our moral standing. There’s no denying that, given its startling statistics, lung cancer should be recognized first and foremost as an extremely deadly disease. But instead of objectively seeing lung cancer for what it is, we form prejudices against its patients and blame them for their circumstances. Thus, we do not sympathize with them as much as we do with victims of the more “attractive” cancers, which in turn affects how much we are willing to do to raise awareness and support the search for a cure.
There was a time when Alzheimer’s was considered a genetic disease, one that mainly affected families with a history of Alzheimer’s and only rarely appeared outside this genetic chain. If someone was diagnosed, his family was forced to accept the unfortunate fate as a happenstance outside their control, similar to an unexpected cancer or a severe, traumatic accident. However, like many other diseases rampant in today’s society, Alzheimer’s has recently been linked to the unhealthy American diet and lumped in with the obesity epidemic, labeled “type 3 diabetes” by Brown University neuropathologist Suzanne De La Monte. Her research indicates that the main cause of Alzheimer’s is the brain’s inability to use insulin properly for normal brain activity; hence, the diabetes label. Both type 1 and type 2 diabetes share an analogous inability. The negative effect high-sugar foods have on our physical health is well understood, but according to this research, these same foods could also affect our future cognitive function.
Though the research is clearly compelling, it leaves us with more questions than solutions. Several government policies have already surfaced to curb high sugar consumption, only to be met with immediate public disapproval. New York Mayor Michael Bloomberg’s highly debated “soda ban,” which barred restaurants from serving soft drinks above a certain size, was accused of being discriminatory and proto-Big Brother. Similarly, Michelle Obama’s 2010 federal school lunch initiative barred certain foods, rationed others, and imposed an 850-calorie limit. Like many reforms before them, these regulations have drawn more evidence supporting their repeal than their continuation.