
Securing the Internet

Companies and individuals ignore security at their peril

By Andrew G. Brody

Imagine that you’re sitting in Lamont Library, quietly studying. The student next to you shouts, “Hey Facebook, give me all the photos you have of that cute guy from section!” Someone in a blue Facebook jacket runs over and displays a big poster with embarrassing party photos. All around the library, students can be heard yelling at the top of their lungs about Facebook friends, Google searches, or the latest gossip. It sounds ridiculous, but this is how public our interactions are when we use wireless networks. We’re ordinarily unaware of it because our laptops politely cover their ears when they overhear messages meant for someone else.

But what happens when someone decides to eavesdrop or—worse still—to actively pretend to be someone else? Firesheep is an extension to the Firefox browser that allows for exactly that. It exploits the fact that many prominent websites (including Facebook, Twitter, and Google search) don’t encrypt normal page requests. Once you’ve logged in, your browser sends a cookie to the server every time it connects so that the server knows who you are. If the connection is unencrypted, an eavesdropper can steal the cookie and pretend to be you; this is known as session hijacking.
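
To make the mechanics concrete, here is a minimal sketch in Python. It is illustrative only, not Firesheep’s code; the URL, cookie name, and cookie value are hypothetical.

    # Illustrative sketch: why a copied session cookie is all an attacker needs.
    # The URL, cookie name, and value below are hypothetical.
    import requests

    # Over plain HTTP on an open wireless network, the Cookie header travels
    # in cleartext, so an eavesdropper can simply read and copy it.
    stolen_cookie = {"session_id": "abc123"}

    # Attaching the copied cookie to a fresh request makes the server treat
    # the eavesdropper as the logged-in victim; no password is ever needed.
    response = requests.get("http://example.com/account", cookies=stolen_cookie)
    print(response.status_code)

The only real defense is to keep the cookie off the air in the first place, which is exactly what HTTPS does.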

This sort of attack is nothing new. The lack of encryption on popular websites has been a source of dismay in the network security community since 2004, if not earlier. Wireless networks make eavesdropping much easier to execute and much harder to detect. Unsecured networks like the Harvard wireless network are particularly vulnerable, since any passer-by with a laptop can listen in. Financial institutions and others that place a premium on security have generally protected their websites with encryption, but most others lag behind. Still, Google offered an option to always use HTTPS encryption in Gmail as early as 2008, and this past January it made that option opt-out rather than opt-in, a move I hope others will follow.

What makes Firesheep different is the ease with which it can be used. There have been other tools, such as Driftnet, which displays all images downloaded by people nearby, and Hamster, a program that makes session hijacking almost as easy as Firesheep does. But they all require at least some rudimentary technical knowledge. A large part of what makes Firesheep so alarming is seeing how quickly your roommate can learn to use it. This is precisely the goal of Eric Butler, Firesheep’s creator: to demonstrate to the general public how easy it is to eavesdrop on Internet conversations and to force companies to protect their websites against it.

Condemnation of Butler for releasing Firesheep is misplaced. There is a tradeoff between minimizing short-term exposure and encouraging websites to adopt long-term solutions, and this problem of responsible disclosure is hotly debated in the security field. Some argue that researchers like Butler should instead quietly tell companies about their vulnerabilities. This approach is sometimes disparagingly called “security by obscurity”: the idea that if no one knows about an issue, no one can exploit it. The difficulty in the case of Firesheep is that the vulnerability was already well known among researchers and hobbyists. For companies like Facebook, the model was closer to “security by willful ignorance”: the idea that if you pretend an issue is unimportant, users won’t care. By and large, security problems on the Internet are not fixed unless they are well known, because so many people at so many companies must be aware of them. No one will be convinced by a single programmer claiming to have found a flaw unless people can try it out for themselves. For this reason, security researchers routinely provide sample exploits when they publish their findings.

What can we do about all this? The Crimson’s blog recently published a good list of precautions to take, starting with the all-important use of HTTPS wherever possible. Internet firms have a long history of ignoring security and privacy unless users demand it. Facebook and others do the public a disservice by not giving users the option to use HTTPS; the practice is lazy and irresponsible. Nor is it clear that regulation would help. Something akin to the Health Insurance Portability and Accountability Act, which governs online medical records, could end up being an overly complicated and questionably effective mess of regulation that only protects against yesterday’s security problems. We can do better than this. The solution will come only when we demand a higher standard from the websites we use.
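
That higher standard is not technically exotic. As a rough sketch (assuming a generic Python/Flask application, not any particular company’s code), two server-side measures blunt Firesheep-style hijacking: redirect every plain-HTTP request to HTTPS, and mark session cookies so browsers never send them unencrypted.

    # Rough sketch of the server-side fixes, assuming a hypothetical Flask app.
    from flask import Flask, request, redirect, make_response

    app = Flask(__name__)

    @app.before_request
    def force_https():
        # Send any plain-HTTP request to its HTTPS equivalent.
        if not request.is_secure:
            return redirect(request.url.replace("http://", "https://", 1), code=301)

    @app.route("/login", methods=["POST"])
    def login():
        resp = make_response("Logged in")
        # Secure keeps the browser from ever sending this cookie over an
        # unencrypted connection; HttpOnly hides it from page scripts.
        resp.set_cookie("session_id", "abc123", secure=True, httponly=True)
        return resp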

Andrew G. Brody ’11, a Crimson IT editor, is a computer science concentrator in Cabot House. He is the president of the Harvard Computer Society.
