Growing computational power allowed musicians to start considering instrumentation in live computer music. Typical computer interfaces proved poor for this new “instrumental” quality, prompting the development of new physical interfaces. These interfaces tend to be fully customized, limited only by the designer’s creativity.
Some of these controllers are based on gesture mapping: music controllers that respond to body articulations performed “in the air”, without any physical contact between the player and the instrument’s body. AirStick is played “in the air”, in a Theremin style. It is composed of an array of infrared proximity sensors, which allows the position of any interfering obstacle to be mapped within a two-dimensional zone. The controller sends both x and y control data to various real-time synthesis algorithms.
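As a rough illustration of how a row of proximity sensors can yield a two-dimensional position, the sketch below estimates x as the intensity-weighted centroid of the sensor readings and y from the proximity of the strongest sensor. The sensor layout, value range, and estimation method are assumptions for illustration, not the AirStick’s actual implementation.

```python
# Hypothetical sketch: deriving (x, y) from a row of IR proximity sensors.
# Assumes each sensor returns a larger raw value when the obstacle (e.g. a
# hand) is closer; max_reading is an assumed full-scale ADC value.

def estimate_position(readings, max_reading=1023.0):
    """Estimate a normalized (x, y) in [0, 1] x [0, 1] from raw readings.

    Returns None when no obstacle is detected within sensor range.
    """
    total = sum(readings)
    if total == 0:
        return None  # nothing in front of the sensor array
    n = len(readings)
    # x: intensity-weighted centroid of sensor indices along the array
    x = sum(i * r for i, r in enumerate(readings)) / (total * max(n - 1, 1))
    # y: proximity of the strongest sensor (closer hand -> larger y)
    y = max(readings) / max_reading
    return x, y
```

The resulting (x, y) pair could then be mapped to synthesis parameters, for instance pitch and amplitude, in a Theremin-like fashion.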
The AirStick was first presented at the NIME 2005 conference [pdf].
The new version (manufactured by YDreams):