Harvard Visualization Scientist Helps Translate Space Images Into Music

A new project out of the Harvard-Smithsonian Center for Astrophysics visualizes cosmic images. By Kathryn S. Kuhar
By Vivi E. Lu and Mayesha R. Soshi, Contributing Writers

Visualization scientist Kimberly Kowal Arcand from the Harvard-Smithsonian Center for Astrophysics and collaborators from science-art outreach project SYSTEM Sounds created a new sonification technique that translates cosmic images into music.

The team’s goal was to develop a new strategy that would make space imaging more accessible to blind and visually impaired people. Through a process called data sonification, SYSTEM Sounds co-founders Matt P. Russo, Andrew Santaguida, and Dan Tamayo worked with NASA’s Chandra X-ray Observatory to convert space images into sound based on each system’s distinct features.

Prior to the COVID-19 pandemic, Arcand built 3D models that enabled the visually impaired to experience space images through their sense of touch. With all in-person programs halted this year, Arcand decided to collaborate with SYSTEM Sounds to create a new type of digitally accessible space image.

“I had some datasets that I was hoping to work on, and Matt and Andrew from SYSTEM Sounds are both complete pros about it,” Arcand said. “It was very, very easy to work with them, to communicate the type of information.”

Arcand added data sonification layers to the 2D images of space to create a multimodal experience that incorporated both visual and auditory fields.

Arcand and SYSTEM Sounds created the music using custom Python scripts and Logic Pro to translate the image data into corresponding notes and rhythms. They first generated an intensity map of the 2D image, in which the widest features were the most intense and the narrowest features were the least intense.

“It’s all about what is the essential feature of that system and what’s the most clear and interesting way to communicate that through sound,” Russo said.

The scientists then assigned a specific musical instrument to each intensity level on the image and arranged the output into cohesive harmonies and rhythms designed to appeal to the ear.
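The basic idea behind this kind of mapping can be illustrated with a short sketch. This is not the team’s actual code; it is a simplified, hypothetical example of data sonification in which a small 2D brightness grid is scanned column by column, with a pixel’s vertical position mapped to pitch and its brightness mapped to loudness.

```python
# A simplified, illustrative data-sonification sketch (not the actual
# Chandra/SYSTEM Sounds pipeline): sweep across a 2D "image" left to
# right, turning each bright pixel into a (pitch, loudness) event.

def sonify(image, base_midi_note=48):
    """Convert a 2D brightness grid (rows x cols, values 0-255) into a
    list of (midi_note, velocity) events, one sweep per column."""
    events = []
    n_rows = len(image)
    for col in range(len(image[0])):
        for row in range(n_rows):
            brightness = image[row][col]
            if brightness == 0:
                continue  # empty sky stays silent
            # Higher rows in the image -> higher pitch (one semitone per row).
            note = base_midi_note + (n_rows - 1 - row)
            # Brightness 0-255 scaled to MIDI velocity 1-127.
            velocity = max(1, round(brightness * 127 / 255))
            events.append((note, velocity))
    return events

# A tiny 3x4 "image" containing one bright diagonal feature.
image = [
    [0,   0,  0, 200],
    [0, 128,  0,   0],
    [255, 0, 64,   0],
]
events = sonify(image)  # four events, one per lit pixel
```

A real pipeline would additionally assign different instruments to different intensity bands, as the article describes, and render the events as audio rather than as a plain event list.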

“We wanted something that not only makes sense for the science but also sounds good,” Arcand said.

The translation of the cosmic images into music has made them accessible to people of all visual abilities. Robyn L. Rennie, an artist with low vision, said she was inspired when she first heard the music for Saturn’s rings after being unable to see the night sky for years.

“It really helped me to conceptualize in my mind what those images were,” Rennie said. “The show resonated with me because I do the same thing in my own art, using the alternative media to create an accessible show.”

Arcand and Russo said they were surprised by the positive response to the music they developed. SYSTEM Sounds plans to continue its collaboration with NASA to create more cosmic music.

“I was quite shocked, in the best way, at the sort of emotional response that I received from the data sonification,” Arcand said. “That was a really special moment for me, that it had an immediate impact.”
