The AlloMixer is an interactive audio mixing console built for the Allosphere at UC Santa Barbara. Object-based audio has gained considerable attention in the industry, and its prevailing metaphor is sounds in space rather than sounds in channels. This project carries the familiar channel-strip controls of a mixing console (solo, mute, and volume) into that metaphor, providing a per-object interface for each control.
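The per-object solo/mute/gain behavior can be sketched as follows. This is a minimal illustration, not the project's actual code; the `SourceStrip` struct and `effectiveGain` function are hypothetical names, and the logic assumes the usual mixing-console convention that soloing any source silences all non-soloed sources and that mute always wins.

```cpp
#include <vector>

// Per-object channel-strip state; names are illustrative, not from the project.
struct SourceStrip {
    float gain = 1.0f;
    bool mute = false;
    bool solo = false;
};

// Effective gain of one source given the whole scene: solo on any source
// silences every non-soloed source, and mute always wins.
float effectiveGain(const SourceStrip& s, const std::vector<SourceStrip>& scene) {
    bool anySolo = false;
    for (const auto& other : scene) anySolo = anySolo || other.solo;
    if (s.mute) return 0.0f;
    if (anySolo && !s.solo) return 0.0f;
    return s.gain;
}
```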
The system uses the Allosphere's full-surround, VR-like projection system to visualize the sound scene, along with its 54.1-channel audio system to render it.
The projection system provides visual feedback on where the sources sit in the scene, and the audio system plays back each sound at its corresponding location using First-order Ambisonics.
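First-order Ambisonics represents each source as a four-channel B-format signal that is later decoded to the speaker array. The following is a minimal sketch of the standard FOA encoding equations, not the AlloSystem API; `foaEncode` is a hypothetical helper, and it assumes the traditional W-channel weighting of 1/sqrt(2).

```cpp
#include <array>
#include <cmath>

// Encode one mono sample at a given azimuth/elevation (radians) into
// first-order Ambisonics B-format {W, X, Y, Z}. Standard textbook
// equations; a hypothetical helper, not the project's actual code.
std::array<float, 4> foaEncode(float s, float azimuth, float elevation) {
    const float w = s * 0.70710678f;                              // W: omnidirectional
    const float x = s * std::cos(azimuth) * std::cos(elevation);  // X: front-back
    const float y = s * std::sin(azimuth) * std::cos(elevation);  // Y: left-right
    const float z = s * std::sin(elevation);                      // Z: up-down
    return {w, x, y, z};
}
```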
The interface is controlled by hand gestures: a motion-tracked glove is used to point at and modify sound sources in the scene.
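Pointing at a source with a tracked glove amounts to testing whether a ray from the hand passes near a source position. A minimal sketch of such a picking test is below; the `Vec3` type, `pointsAt` function, and selection radius are all illustrative assumptions, not the project's implementation.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }

// True if a ray from the glove (origin, unit direction) passes within
// `radius` of a source position -- a hypothetical picking test.
bool pointsAt(Vec3 origin, Vec3 dir, Vec3 source, float radius) {
    Vec3 oc = sub(source, origin);
    float t = dot(oc, dir);             // closest approach along the ray
    if (t < 0.0f) return false;         // source is behind the glove
    Vec3 closest = {origin.x + t*dir.x, origin.y + t*dir.y, origin.z + t*dir.z};
    Vec3 d = sub(source, closest);
    return dot(d, d) <= radius * radius;
}
```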
The following video illustrates the gain, mute, and solo functions. Note that capturing video in the Allosphere is always tricky due to the architecture of the space.
The system is written in C++ on top of AlloSystem, using First-order Ambisonics for spatial audio over the 54.1-channel speaker system and PhaseSpace for motion tracking.
This was created as part of Matt Wright's MAT201B class at UC Santa Barbara in Fall 2014.