As an ensemble paradigm, the laptop orchestra is still young. Much of what we are doing on a regular basis is asking questions and searching for answers. Is this an egalitarian ensemble, or do players have distinct roles like in traditional ensembles? Do we want to write new software for each piece, or create reusable systems? Is this an instrument or a controller? Sometimes our answers to these questions change from week to week. This is the story of a composition that has morphed into an instrument.
One piece that is part of the LOLs' pre-history is Gua, which was written as a group assignment for a graduate computer music class in the spring of 2009. The team members who wrote the piece were J. Corey Knoll, Lindsey Jacob, Wennan Wang, and Jeff Albert (me). We developed the musical conception and wrote (or adapted) all of the code, which was written in ChucK. In its original form, the piece used four laptops and one live audio input. It had eight composed melodies, one of which would be selected at random and played every minute. There was a player responsible for triggering these melodies at other times, a player improvising on an acoustic instrument, a player manipulating live effects on the acoustic input and controlling parameters of the other players' output, and a player sampling the live input and manipulating those samples.
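The once-a-minute random melody selection can be sketched in ChucK roughly like this (playMelody is a hypothetical stand-in for the composed melody code; the actual assignment code was structured differently):

```chuck
// hypothetical placeholder for one of the eight composed melodies
fun void playMelody(int which)
{
    // ... synthesize melody number `which` here ...
    <<< "playing melody", which >>>;
}

// once per minute, pick one of the eight melodies at random
while (true)
{
    Math.random2(0, 7) => int which;   // inclusive range: 0..7
    spork ~ playMelody(which);         // run the melody in its own shred
    1::minute => now;                  // advance time by one minute
}
```

Sporking the melody as its own shred lets the scheduler keep strict one-minute timing even if a melody runs long.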
A good bit of the structure of this original version, both musically and technically, was a result of the specific parameters of the assignment it was written to fulfill. When the LOLs started in earnest in the fall of 2009, Gua was one of the few somewhat complete pieces we had. Now that we no longer had to meet the specific technical requirements of an assignment, Gua quickly began to morph. Nick Hwang wrote code that allowed the live input effects and sampling manipulation to be run by a single person and controlled with an M-Audio TriggerFinger, instead of the onscreen MAUI controls we had built in ChucK. It soon struck me that the compelling part of the piece was the way it allowed for interaction between the improvising acoustic instrumentalist and the musician playing the laptop. As we continued to trim the fat, we let go of the melody player part of the piece and were left with what we called the Gua Engine. The composition Gua was no longer being performed, but the bulk of the software written for it had become the Gua Engine, which records and manipulates live audio input. The software allows for real-time pitch and delay manipulation of the live input, and also provides a five-second audio buffer that can be filled with sound from the live input and then manipulated in terms of speed, direction, pitch, and delay.
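A minimal sketch of the five-second buffer idea, using ChucK's built-in LiSa live-sampling unit generator (an assumption on my part; the actual Gua Engine code may be organized quite differently):

```chuck
// sketch only, not the actual Gua Engine code:
// record five seconds of live input, then loop it back at half speed
adc => LiSa buf => dac;
5::second => buf.duration;   // size of the capture buffer

buf.record(1);               // start filling the buffer from the live input
5::second => now;
buf.record(0);               // stop recording

1 => buf.loop;               // loop the captured audio
buf.play(1);                 // start playback
0.5 => buf.rate;             // half speed (an octave down); negative rates reverse direction
10::second => now;           // let it play
```

Delay and pitch manipulation of the direct signal would hang additional unit generators (e.g. Delay, PitShift) off the same chain.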
The Gua Engine became an instrument, in my mind, when we wrote another piece that would use it. We had an idea to do a vocal piece, and as we discussed ways to structure it, we realized that we already had a tool that would do what we wanted: the Gua instrument. At the LOLs' official debut concert, two pieces used the Gua instrument. Improvisations and Manipulations was performed on alto saxophone, flute, trombone, and three Gua instruments. Colonnes was performed with two voices and two Gua instruments. Each piece has its own sound and structure, but the technology is the same. Gua has become an instrument because we can write pieces for it, teach musicians to play it, and those musicians can then play all of the pieces written for it. In the hands of the good musicians in the LOLs, the Gua instrument introduced strong musical content into improvised musical situations, and its players interacted as equals with the players of traditional instruments.
The Gua instrument will continue to evolve. It is presently being ported to Max/MSP and expanded to include more buffers. The control interface will likely move away from the TriggerFinger to an iPad/TouchOSC-based solution. And of course, we have to write more pieces for it.