Widescreen to Flatscreen: Televising the Oscars

Film and TV industries team up to keep audiences captivated

By Molly O. Fitzpatrick, Crimson Staff Writer

Warren Beatty, who co-hosted the Oscars in 1976, summed up the event fittingly: “We want to thank all of you for watching us congratulate ourselves tonight.” On March 7, the occasion of the 82nd Academy Awards, America will do it once again.

The Oscars is the oldest media awards ceremony and the prototype for most of its successors. But it's an improbable television institution, in terms of both its origins and—to put it simply—just how little it has to do with television.

Emanuel Levy’s book “Oscar Fever” traces the awards back to their inception. The Academy of Motion Picture Arts and Sciences, founded in 1927, was the brainchild of MGM’s Louis B. Mayer. Its first awards ceremony took place in 1929—the operative logic being that the best way to legitimize the fledgling industry might be to host a highly publicized event in its honor.

The Oscar ceremonies were traditionally funded by contributions from the major studios. But before the 25th Academy Awards, several of the ceremony’s primary financial supporters unexpectedly backed out. The Academy needed to secure another sponsor or cancel the extravaganza it had planned. Just in time, RCA purchased the rights to broadcast the ceremony, and it was watched on NBC by the largest audience in television history to that point.

Yet in 1953 the relationship between the film and television industries was far from friendly. Still very much a new medium, TV had conquered the country in the first few years of the decade: it was a tremendous improvement on radio, and watching “I Love Lucy” cost no ticket price. Not surprisingly, box office revenue dropped sharply in tandem. Hollywood responded with the jealous petulance you’d expect from any first-born child. Many studios forbade their contracted stars from appearing on television, and the networks—devoid of their own celebrities—were considered little more than a dumping ground (and a source of licensing fees) for stale feature films.

Televising the Oscars (the ceremonies had been broadcast on radio for some time) represented a convenient symbiosis. But the merger of film and television presented producers with a formidable challenge: how to create a program that would appeal to both the cinephile—deigning for one night to watch, shame of shames, television—and the devout TV viewer whose remote control happened to lead him there.

It is rare that anything, with the possible exception of sleeping, can hold one’s interest for four hours. But to a significant extent, the Academy Awards manages to do so, in a way that reflects the status shift in media that its broadcast entails. For the Oscars, celebrities are quite literally brought down to size—transported from a fifty-foot-wide movie screen to a thirty-two-inch TV screen. The real genius of the Academy Awards broadcast is what it invites us to do: fancy ourselves among the elite, if not somewhere slightly above them.

Year after year, the award categories are populated with consistent archetypes: the underdog, the obvious filler, the perhaps-not-particularly-deserving-this-year-but-boy-is-it-about-time-she-won-already. The winner selection process, intrinsically tainted by Academy politics, is anything but quantitative—statistically, even the most impulsive civilian guesser is likely to make at least one correct prediction. This lends a satisfying, authoritative feel to one’s preferences regarding, for instance, Meryl Streep—who should be given an Oscar every year, by default, just to thank her for being Meryl Streep—versus Sandra Bullock—who, incredibly, is somehow still allowed to make films after appearing in “All About Steve.” The cult popularity of the Golden Raspberry Awards (the “Razzies”), which honor the year’s worst films, thrives on the same instinct for superiority.

Which Oscar moments do people remember? Keep in mind that, as a multi-hour ceremony that has aired on television more than fifty times, the Academy Awards offers a lot of moments to choose from. In general, it seems, we most enjoy those that make famous people look uncomfortable, stupid, or silly. Take any one of the inevitable bloated, overemotional acceptance speeches—technically constrained by a forty-five-second time limit, yet regularly responsible for millions of dollars of wasted airtime.

In 1998, James Cameron reached new heights of tacky by shouting “I’m the king of the world!” after winning Best Director for “Titanic.” Here’s hoping we won’t find out how he’d react to a win for “Avatar.”

Professionally shrill red carpet hosts encourage us to cackle at the fashion foibles of celebrities, and the same nebulous schadenfreude motivates the compiling of countless Worst Dressed Lists. TV cameras are masterfully positioned to capture deliciously revealing reaction shots; there’s nothing quite like the strained smile of someone who just lost an Oscar.

—Columnist Molly O. Fitzpatrick can be reached at fitzpat@fas.harvard.edu.
