For many of us, when we look at artifacts made by others, we enjoy the finished product for what it is. But when we look at our own creations, we instead see the shadows of what could have been. We see the messy process, and sometimes we see the compromise. I remember watching a YouTube video in which an earnest Creationist asked (rhetorically) how a creature as complex and wondrous as a horse could come from anything but a perfect god. He had the vantage point of not being able to see the modern horse as the product of a slow, churning, frankly uncaring process that necessitates mountains of horse and pre-horse corpses.
Perhaps his god really had something else in mind and had to settle for a horse.
Music of the Spheres Postmortem
This past weekend was the debut of a recent project I was involved in, at MIT’s first “Hacking Arts” festival.
“Music of the Spheres” is the second collaboration between me, my associate, and Slide20XL, a musician from Augusta, Georgia who is now based in Boston. Our first collaboration was a similar project, built for his performance in Madrid at the Red Bull Music Academy in 2011.
Both projects were experiments in “crowd noise-input games”, where the crowd is prompted by the game interface to be part of the music. The first project used 16-bit music and short bursts of callbacks to iconic game moments, with the crowd’s noise triggering events in-game. We wanted to play with that idea further, and came up with a game where the crowd is given more responsibility for the musical composition and the play experience.
In MotS, the crowd is organized and directed to sing at different pitches and intensities to build and adorn a planet. The crowd’s live singing layers over recordings of the crowd singing earlier, so that as the planet rotates the song changes and becomes more complex. The crowd can then be split in two, and the two teams compete to shape the planet to their own respective likings, probably making a mess of it in the process.
Overview: The Setup
-Hardware: microphones, projector, one computer (running two applications simultaneously: the Sound system and the Game system).
-Sound Layer: Written in Python. Plays the composed music, filters crowd noise, and plays back the crowd layers. Writes two CSV files for the Game Layer: the music composition, and the identified pitches/times of the crowd noise.
-Game Layer: Scripted in GML, within GameMaker Studio. Reads the Sound Layer’s files and populates a rotating planet. Zooms to the surface to guide the crowd through suggested on-screen melodies, and zooms out to show the total effect of the crowd’s singing (what the whole planet looks like). From the files, the Game Layer creates little circles of “matter” that a UFO’s beam of light collects when the crowd’s singing is timed to it. The more matter a beam collects, the larger the artifact that the material can form.
-Presentation: Slide20XL presents our product in context at the event. He also acts as composer, directing the crowd through melodies that complement the composition the game presents (which itself incorporates earlier crowd sounds).
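The Sound Layer’s hand-off to the Game Layer can be sketched roughly like this. This is a minimal, hypothetical reconstruction, not our production code: the frame size, the two-column (time, pitch) CSV layout, and the FFT-peak `dominant_pitch` estimator are all assumptions, and the real system also filters the composed music and crowd playback out of the signal before analysis.

```python
import csv
import numpy as np

SAMPLE_RATE = 44100

def dominant_pitch(frame, sample_rate=SAMPLE_RATE):
    """Estimate the strongest frequency (Hz) in one audio frame via an FFT peak."""
    windowed = frame * np.hanning(len(frame))      # taper to reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

def write_pitch_csv(frames, frame_duration, path):
    """Write (time, pitch) rows for the Game Layer to read."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for i, frame in enumerate(frames):
            writer.writerow([round(i * frame_duration, 3),
                             round(dominant_pitch(frame), 1)])

# Simulate one second of a crowd humming A4 (440 Hz), split into 10 frames.
t = np.linspace(0, 1, SAMPLE_RATE, endpoint=False)
signal = np.sin(2 * np.pi * 440 * t)
frames = np.array_split(signal, 10)
write_pitch_csv(frames, 0.1, "crowd_pitches.csv")
```

Polling a CSV is a crude transport, but it kept the Python and GML processes fully decoupled: each side only had to agree on the file format.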
Overview: Project Management
We used Google+ Communities to manage our team of four. We were in Pittsburgh, New York City, Boston, and San Francisco, so in-person meetings were impossible until the weekend of the presentation. Ephemeral discussions happened in Google chat and we had a weekly video conference on Hangouts. Posts in the Community were for Very Important Topics, and discussion on them happened in their comments.
I and some of my associates have been using a framework we call “Boundary Conditions” (or “BoundCons”) in all of our small projects. The idea is that the project has distinct tracks (game interface #gam, musical composition #mus, sound management software #snd, and art #art), each with a person clearly responsible for it. Tagging helped filter the stream down to only the information relevant to whoever was looking. Every week, goals for each track are outlined. If any track misses its goals twice, over any span of time, the project is scrapped for demonstrated lack of interest. BoundCons are a harsh but necessary bottom line: they have guided our recent small projects and have occasionally forced us to put a project on hiatus for a couple of weeks. This was especially useful since we often found ourselves working on several projects simultaneously. Effort was a measure of interest.
Finally, the weekend of the presentation, we met in Boston to integrate and set up the project.
Contingency Plans:
I knew that I would not be present for the actual presentation.
The last time we all worked together, only the composer was physically able to be at the presentation (in Madrid). We realized, very close to the deadline, that we needed a backup plan in case the setup in Europe proved too difficult (note: it did). We scrambled, borrowing recording equipment from the college TV station and rounding up people to use as guinea pigs. As our backup presentation, we recorded a demo of the game being played, to serve as a visual for a non-interactive version of the concert. At that Madrid event (the Red Bull Music Academy), the music was the focus and the game just drove home the themes in a novel way. Reducing the game to video felt like a profanity to me and the other developers, but it didn’t diminish anything for the audience, especially since they didn’t know what Plan A was. Our Plan B could well have been our Plan A.
The Hacking Arts Festival in Boston, though, was different. The interaction was part of the kernel they were interested in, so a recording wouldn’t be an acceptable backup. The audience needed to understand how the game *should* work, even if it didn’t. We ultimately decided that, if the microphone setup failed or the software couldn’t filter sounds well in the new environment, the alternative show would be to Wizard-of-Oz the interactions: that is, to enter the pitch/time information ourselves on behalf of the microphones, feeding the game directly through hotkeys on a keyboard. This way, the idea of the presentation would be preserved.
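The Wizard-of-Oz fallback works because the Game Layer only ever sees (time, pitch) rows; it never knows whether a microphone or a human produced them. A minimal sketch under that assumption follows. The key-to-pitch map and file name are hypothetical, and a real version would need a non-blocking key listener rather than direct function calls.

```python
import csv
import time

# Hypothetical hotkey layout: each home-row key stands in for a pitch
# the microphones would normally have detected.
KEY_TO_PITCH = {
    "a": 261.6,  # C4
    "s": 293.7,  # D4
    "d": 329.6,  # E4
    "f": 349.2,  # F4
    "g": 392.0,  # G4
}

def feed_fake_pitch(key, start_time, path="crowd_pitches.csv"):
    """Append a (time, pitch) row, as if the Sound Layer had heard the crowd."""
    pitch = KEY_TO_PITCH.get(key)
    if pitch is None:
        return False
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([round(time.time() - start_time, 3), pitch])
    return True

start = time.time()
feed_fake_pitch("a", start)   # operator "sings" a C4
feed_fake_pitch("g", start)   # then a G4
```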
Contingency planning isn’t very fun. It often means scoping ambitions down, and it directs time and energy away from adding that one final tweak that might fix everything. But when your product is on a deadline, and especially when you’re presenting, you have to spend some time in your audience’s shoes and work out how to present what you have. Are they going to be disproportionately wow’d by the emitter you’ve been playing with? Would they appreciate the difference between a simple implementation and an elegant, scalable one? (Sometimes the answer is yes!)
I imagine most judged arts face the same kind of dilemma. A gymnast might calculate whether they should do an extremely difficult maneuver pretty well or a merely “pretty difficult” maneuver nearly perfectly. Choosing to scope down certainly doesn’t feel great but sometimes it’s appropriate for the occasion.
Verdict
Ultimately, integration was rough, and the time pressure of the deadline did us few favors. We stayed up later than I had since college. At one point I was guiding folks through code over the phone after I had left for work. The scrambling lasted up until the very hour of the presentation.
I had to receive a call while I was at the airport to find out how we did.
Apparently, things went pretty smoothly: our composer had the crowd engaged, our other partner was behind the computer, and the game was running. People were getting the gist of it!
-And then the computer died. M’kay. When making your Grand Plans A, B, and C, don’t forget to bring a charger.
The team was able to smooth things over with the crowd; nobody died but the computer.