Project Implicitly Racist

If you see something here, you might be racist.

In what could almost be construed as a follow-up to this summer's run-in with the Cambridge police, Harvard has now taken to developing its own meter to determine whether people are, in fact, implicitly racist or sexist. Enter "Project Implicit."

The system uses Pavlovian conditioning to have the user sort cropped faces into left and right categories, initially labeled "European American" and "African American," with a red 'X' flashing whenever the user makes a selection the program does not approve of. The second round replaces these labels with "Bad" and "Good," now flashing words to be sorted by connotation. Later rounds of the race test pair "Good" and "Bad" with "European American" and "African American" to measure changes in reaction time when, for example, "Good" is paired with "European American" or "Bad" with "African American." The results of this purport to say how racist the user is.
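For the curious, the arithmetic behind tests of this kind boils down to comparing reaction times across the two pairing blocks. Below is a minimal sketch of that idea in Python; the reaction times are hypothetical, and the real Project Implicit scoring also filters out too-fast or too-slow trials and penalizes errors, none of which is shown here.

```python
from statistics import mean, stdev

def iat_style_score(compatible_rts, incompatible_rts):
    """Rough IAT-style effect size: the difference in mean reaction time
    between the two pairing blocks, scaled by the pooled standard
    deviation of all trials. Simplified for illustration only."""
    pooled_sd = stdev(compatible_rts + incompatible_rts)
    return (mean(incompatible_rts) - mean(compatible_rts)) / pooled_sd

# Hypothetical reaction times (milliseconds) for one test-taker.
compatible = [612, 580, 655, 590, 640, 600]    # e.g. "Good" paired with "European American"
incompatible = [720, 690, 760, 705, 680, 730]  # e.g. "Good" paired with "African American"

print(f"Score: {iat_style_score(compatible, incompatible):.2f}")
```

A larger positive number is read as a stronger "implicit preference" for one category over the other, which is exactly the leap FlyBy finds dubious.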

But the Implicit Association Test seems flawed from the start, presumably yielding only the results it is intended to: that, in fact, you are a racist. Or a sexist. Or that, based on the given test, the results were too inconclusive to conclusively say that you are a racist, but you probably are.

More info on the project after the jump.

Project Implicit offers a whole range of implicit association tests, from other races and various skin tones to a sexuality IAT. Again, FlyBy wonders what these tests seek to achieve. Such a program builds itself only on the ability to highlight racism where none may actually exist. This likely exacerbates the issue of racism rather than combating it, trivializing it to the scale of what could be a Facebook app. Tired of the same old liberal versus conservative grids? Instead, check out my bigot meter... telling you that when I see faces of black people, I think to press "Bad" instead of "Good." FlyBy actually wonders how much the design of this program forces these unexpected results. In any case, you should probably expect to learn that you are a racist.

FlyBy does remember this test circulating as an email forward a couple of years back, but only recently has it won the appendage of being a "Harvard"-branded product, the most startling part of this whole affair. Glad to know that Harvard is now on top of things when it comes to combating racism on campus... by reminding people that they are, in fact, racists.

(Image courtesy Wikimedia Commons)
