We build an immersive audio environment using a platform capable of
mapping sound to physical space at fine granularity.
The NousSonic platform allows for the composition of audio objects into
immersive sound sculptures defined by a combination of digital sound files
and code. Sound sculptures are managed by system-level events, allowing
the room's active audio objects to be reprogrammed according to choices
made by the visitors.
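The event-driven reprogramming described above can be sketched roughly as follows. This is a hypothetical illustration: the NousSonic API is not documented here, so `Room`, `AudioObject`, and the event names are assumptions, not the platform's actual interface.

```python
# Hypothetical sketch of event-driven reprogramming of a room's active
# audio objects. All names here are illustrative assumptions.

class AudioObject:
    def __init__(self, name, position, sound_file):
        self.name = name
        self.position = position      # (x, y) in room coordinates
        self.sound_file = sound_file

class Room:
    """Holds the active set of audio objects and swaps it on events."""
    def __init__(self):
        self.active = {}          # name -> AudioObject
        self.handlers = {}        # event name -> callback

    def on(self, event, handler):
        self.handlers[event] = handler

    def dispatch(self, event, **kwargs):
        if event in self.handlers:
            self.handlers[event](self, **kwargs)

    def load(self, objects):
        self.active = {obj.name: obj for obj in objects}

# A visitor's choice triggers a system-level event that reprograms
# the room with a different sculpture.
room = Room()
sculpture_a = [AudioObject("drone", (0, 0), "drone.wav")]
sculpture_b = [AudioObject("bells", (3, 4), "bells.wav")]

room.on("visitor_choice", lambda r, choice:
        r.load(sculpture_a if choice == "a" else sculpture_b))

room.dispatch("visitor_choice", choice="b")
print(sorted(room.active))   # -> ['bells']
```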
Visitors wear a headset that tracks their physical location while they
listen, enabling them to interactively explore the audio environment of
each sound sculpture with their ears.
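One way the tracked position could drive what a visitor hears is distance-based attenuation: each audio object's gain falls off as the headset moves away from it. The linear falloff model and the radius below are assumptions for illustration, not the platform's actual algorithm.

```python
import math

# Hypothetical sketch: gain for an audio object as a function of the
# tracked headset position. Linear falloff and radius are assumptions.

def gain(listener, source, radius=5.0):
    """Full volume at the source, silent beyond `radius` meters."""
    d = math.dist(listener, source)
    return max(0.0, 1.0 - d / radius)

# A visitor at (1, 1) standing near an object at (1, 2):
print(round(gain((1, 1), (1, 2)), 2))   # -> 0.8
```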
See below for technical details and first-person videos of the visitors'
immersion in the sound sculptures.
-- Two compositions, each an individual sound sculpture
-- Three compositions, each an individual sound sculpture
-- Two compositions in a single sound sculpture
In the artist directories above, a Diagram documents the shape and
location of audio objects in the Room. Each Walk video records an actual
exploration of the sound sculpture, either from the perspective of the
Diagram or from the first-person perspective of the visitor in the
physical space of the Room.
The room contains two compositions: 1) a small, circular set in the
top room; 2) a longer, linear set that fills the remainder of the room.
Each is intended to provide a well-defined narrative for the visitor
to explore and/or improvise on according to their movement.
The following tracks contain original sound material, presented
roughly in the order of its original conception, with some overlap
between sounds to simulate visitor movement.
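The overlap between adjacent sounds can be thought of as a crossfade: as a visitor walks from one audio object toward the next, the first fades out while the second fades in. A minimal sketch with plain sample lists (the linear fade is an assumption; real material would come from the compositions' sound files):

```python
# Hypothetical sketch: overlapping two tracks with a linear crossfade
# to simulate a visitor walking from one audio object to the next.

def crossfade(a, b, overlap):
    """Mix the last `overlap` samples of `a` with the first of `b`."""
    out = list(a[:-overlap])
    for i in range(overlap):
        t = (i + 1) / overlap                       # fade position 0..1
        out.append(a[len(a) - overlap + i] * (1 - t) + b[i] * t)
    out.extend(b[overlap:])
    return out

track_a = [1.0] * 4       # a loud, sustained sound
track_b = [0.0] * 4       # silence, for clarity of the example
print(crossfade(track_a, track_b, 2))   # -> [1.0, 1.0, 0.5, 0.0, 0.0, 0.0]
```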
Each composition is driven by modsynth sets and complemented with found
sound. The room also contains easter eggs intended to stand alone,
but which can easily combine with other audio objects. A few stand-alone
compositions were created along the way but went unused in the final
work. See the following directories for some of these: