Building on my own work in nonfiction sound, I have developed a specialty in producing narrative audio with teams across diverse media and contexts.

On commission from the Albany Symphony, I led a team that created a radiophonic sound design, incorporated into a new orchestral work evoking the history of the Erie Canal. In this clip (one of three segments we produced), an actor performs the actual speech given at an 1817 groundbreaking ceremony for work on the canal, while spectators shuffle and applaud. Emerging out of the speech, and tightly timed to the orchestral score (by Dan Scholsberg), are the sounds that our archival research suggested would have followed: shovels, and then celebratory cannons. In consultation with the symphony and the composer, we balanced a sense of period accuracy against an international, multilingual cast and a sound that acknowledges New York’s current realities and riffs on the temporal displacements invoked by Hamilton. In addition to triggering the fixed-media audio track live onstage at the premiere, I performed with the symphony as a guest on guitar.

I collaborated with the internationally known performance artist Marina Abramovic and the Hugo Award-winning science fiction novelist Kim Stanley Robinson to create a number of audio works for gallery, podcast, and literary contexts. During an “Abramovic Method” workshop, I shepherded a cast of 20 through a studio voice-recording process, then produced, edited, mixed, and sound designed several versions of the piece, which blogger Cory Doctorow reviewed: “The audioscape is rich and haunting, and has moments of goosepimple-raising eeriness.”

In addition to a gallery sound-installation version of the project, credited to Abramovic as “3015,” and a performance track that Robinson took on tour to read with at literary festivals, the project found its final form as Episode 3 of the podcast Into the Impossible, from the Arthur C. Clarke Center for the Human Imagination.

I created the sound design for the 360-degree VR film How to Tell a True Immigrant Story (dir: Aggie Bazaz) using the emerging Ambisonics format, in which sound is encoded as a virtual sphere: a naturalistic listening environment where sounds may reach the listener from any direction. When the viewer wears a headset and headphones, the positions of their two ears are tracked within that sphere, which is then “decoded” into a dynamically changing headphone mix, creating the illusion that the sounds are accurately positioned in space. Practically, this lets a sound designer position a changing array of sounds in coordination with spherical VR moving images, either (conventionally) to produce a naturalistic, immersive sense of virtual realism, or (as on this project) to defy that convention and draw attention to the artifice of VR. Among my challenges on this project was to emphasize the film’s potent artistic sensibilities in ways that focused attention on its politically urgent content, not just the whiz-bang of its innovative technology and format. The film debuted at the prestigious Locarno International Film Festival (the first VR film ever to compete there) and subsequently had a successful festival run across several countries, winning a Special Jury Mention at the Encounters Festival (Bristol, UK).
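For the technically curious: the spherical encoding described above can be sketched as a first-order (“B-format”) ambisonic panner. This is a minimal illustration of the general technique, assuming the traditional FuMa channel convention, and not the actual pipeline used on the film, which was built with dedicated spatial-audio tools.

```python
import numpy as np

def encode_first_order(mono, azimuth, elevation):
    """Pan a mono signal into first-order ambisonic B-format (FuMa W, X, Y, Z).

    azimuth and elevation are in radians; azimuth is measured
    counterclockwise from straight ahead.
    """
    w = mono / np.sqrt(2.0)                          # omnidirectional component
    x = mono * np.cos(azimuth) * np.cos(elevation)   # front-back axis
    y = mono * np.sin(azimuth) * np.cos(elevation)   # left-right axis
    z = mono * np.sin(elevation)                     # up-down axis
    return np.stack([w, x, y, z])

# A one-second 1 kHz test tone placed 90 degrees to the listener's left,
# at ear level (elevation zero).
sr = 48000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 1000 * t)
bformat = encode_first_order(tone, azimuth=np.pi / 2, elevation=0.0)
```

For a source hard left, the X (front-back) channel is essentially zero and the Y (left-right) channel carries the full signal; a decoder then mixes these four channels down to the listener’s headphones, re-weighting them continuously as head tracking reports each new orientation.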

Though the ideal viewing and listening experience is on a VR headset, you can get a sense of the ambisonic sound design by listening on headphones while clicking and dragging the video to explore the sphere.
