More information is not necessarily better

By Alexander N. Li, Crimson Staff Writer

When I logged on to my Facebook account early Tuesday morning, I was pretty sure I was hallucinating. My home screen was filled with odd factoids about my friends’ comings and goings. Mary had just broken up with her boyfriend. Bob was no longer single. Evan had written on Jill’s wall, “I hate you and your dog.”

When, after a few seconds, it became evident that I was not in fact hallucinating, I clicked through my email, hoping desperately that someone had figured a way out. Instead I found sixteen replies to the subject line "Facebook on CRACK!"

Like me, Facebook founder Mark E. Zuckerberg, formerly of the Class of 2006, was probably a little shell-shocked when, over the course of the next two days, his company received half a million complaints about the new features. He quickly backtracked, issuing a mass apology and promising to implement better privacy controls. Facebook "really messed this one up," he wrote, when it ignored part of its mission: "helping people share information with the people they want to share it with."

So now Facebook lets you select what information it puts into the news feeds, which has largely quelled the uprising. But I’m still uneasy.

Privacy doesn’t seem to quite capture the problem; as a Facebook representative noted, the feeds “do not give out any information that wasn’t already visible.” What is so disturbing about the new Facebook is not that the information it shares is private—no one imagines that anything they post is really private—but that information itself has become so ubiquitous. To put it another way, I care less about other people seeing what I’m doing than being forced to see what others do.

The new Facebook bothers me because it implements all too well Zuckerberg’s guiding principle, that the free flow of information is in everyone’s best interest. “Information flow is an important issue,” Zuckerberg writes in a Facebook group devoted to the subject, “because our ability to solve other problems is generally limited by our ability to communicate with other people.” More information helps us make better choices. If I am looking for a new roommate, every detail about Bill helps me make a fairer assessment of his compatibility. I am also free, it should be noted, to ignore irrelevant information altogether.

So the free flow of information, checked by individual privacy controls, grants us a wider array of choices and makes us better off. But at least in Zuckerberg’s current implementation, it also means we lose a choice. We can’t say “I’d rather not have this information.” We can’t say stop.

And for me, this lost choice is critical. When we are in positions of power, when we must make decisions concerning others, our responsibility is not merely to use all available information—it is to use all available information well. When we have the foresight to know we can’t do so, responsibility compels us to ignore some information. Recruiters don’t ask for race or age or sex. Doctors don’t demand that patients undergo criminal background checks. In such situations, we recognize that more information can give us more choices and at the same time, make us poorer decision-makers. And so we decide that the best choice is to make the information flow stop.

The Facebook, always an experiment in open communication, attempted this week to expand its position. The venture was unsettling, but not wholly unproductive. Millions asked how information impacts choice. Zuckerberg grappled publicly with what the “free flow of information” really means. Having been deluged in overwhelmingly negative input, it would be a sad irony if the Facebook did not now scale back. They have the information—but will they make the right choice?

Alexander N. Li ’08, a Crimson editorial editor, is a philosophy concentrator in Leverett House.