Aanii, my name is Feather Metsch and I am a junior at IAIA in the Digital Arts program. I am also part of the work study program as Digital Dome Interactivity Research Assistant. This performance was the culmination of my Advanced Digital Arts Project course with Professor Craig Tompkins in Fall 2015. Access to the digital dome to produce the content and conduct research and development was provided by the IAIA Digital Dome Manager, Mats Reiniusson. Technical support, specialized software, and guidance were provided by Charles Veasey, who oversees my work study as Digital Dome Interactivity Research Assistant. Additional support and guidance came from my mentor, David Beining at ArtsLab UNM, who also produced the 360 documentation for this performance.

The Digital Arts course was organized so that we had considerable freedom in what we could do within digital art, using any skills we had accumulated thus far, as long as it culminated in a tangible final project. I presented a formal proposal and presentation on using immersive and interactive environments for telecommunicative digital storytelling and cultural exchange. A large part of the project (the telecommunicative aspect) had to be cut rather early on, as I realized that achieving it would take far more work than the semester allowed.
I reached out to the Death Convention Singers to collaborate on this project, as I was very impressed with the large installation and performance they contributed to An Evening Redness in the West, a recent exhibit at the Museum of Contemporary Native Arts organized by then-chief curator Candice Hopkins.
The Death Convention Singers are a large collective whose lineup changes with each performance. I had the pleasure of working with the following members on this project:
Performers:
- Musicians: John Dieterich (Death Convention Singers, Deerhoof), Raven Chacon (Death Convention Singers, Postcommodity, Mesa Ritual, etc.), Rene Aguilera, Marisa Demarco (Death Convention Singers, Bigawatt, The 5 Star Motelles, Chicharra, etc.)
- Sound Spatialization (custom software/MaxMSP): Tahnee Udero
- Depth Sensor/Kinect VJ (ZVector software): Autumn Chacon
- Producer, Overall Composition & VJ (custom software/TouchDesigner): Feather Metsch
To accomplish what I was trying to achieve for this performance, I had to learn at least five new software applications. The workflow ended up being as follows:
- I created a pre-rendered dome master, the length of the intended performance, using After Effects and the Navegar plug-in (see the projection sketch after this list). My assets consisted of spherical panoramas (taken with a Gigapan), public simulation data from NASA, and found videos.
- I then trained Autumn and Tahnee to run their respective software for the live performance.
- During the live performance I shot the musicians with a depth sensor (Kinect) and ran their point cloud data through sound-reactive software (ZVector) that also allows for visual manipulation of the data. I created some standard video files to be dropped into this data, to add variety beyond the software's visual limitations. This depth sensor footage was then chroma-keyed on top of the dome master (see the compositing sketch after this list) and manipulated live by both Autumn and me.
- Tahnee spatialized the sound in real time on a 15.1 surround sound system, through custom software designed specifically for our space (see the panning sketch after this list).
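For anyone curious what building a dome master involves: each frame is a square 180-degree fisheye image, so preparing spherical panoramas means remapping them from equirectangular coordinates onto the dome. The sketch below shows that remap in Python with OpenCV; it is a rough stand-in for what the Navegar plug-in handles inside After Effects, and the filenames and resolution are placeholders.

```python
# Minimal sketch: warp an equirectangular panorama into a square
# "domemaster" (180-degree fisheye) frame. Filenames are placeholders.
import cv2
import numpy as np

def equirect_to_domemaster(pano, size=2048):
    # Pixel grid of the output fisheye image, centered at (0, 0),
    # normalized so the dome rim sits at radius 1.
    xs = np.linspace(-1.0, 1.0, size, dtype=np.float32)
    x, y = np.meshgrid(xs, xs)
    r = np.sqrt(x * x + y * y)            # 0 at zenith, 1 at horizon
    theta = np.arctan2(y, x)              # azimuth around the dome
    phi = r * (np.pi / 2.0)               # polar angle: 0..90 degrees

    h, w = pano.shape[:2]
    # Map (azimuth, polar angle) back to equirectangular coordinates.
    map_x = ((theta / (2.0 * np.pi)) % 1.0) * (w - 1)
    map_y = (phi / np.pi) * (h - 1)

    out = cv2.remap(pano, map_x.astype(np.float32),
                    map_y.astype(np.float32), cv2.INTER_LINEAR)
    out[r > 1.0] = 0                      # black outside the dome circle
    return out

pano = cv2.imread("gigapan_pano.jpg")     # hypothetical input panorama
cv2.imwrite("domemaster.png", equirect_to_domemaster(pano))
```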
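The live chroma-key step, reduced to a single frame: pull a soft matte from the ZVector layer and blend it over the dome master. In the show this happened in real time; the version below is a minimal offline sketch with OpenCV/numpy, assuming the point-cloud layer renders on a near-black background (the actual key color and threshold in the performance may have differed).

```python
import cv2
import numpy as np

dome = cv2.imread("dome_master_frame.png")   # hypothetical frame grabs
cloud = cv2.imread("zvector_frame.png")

# Luma key: treat near-black pixels in the point-cloud layer as
# transparent, brighter pixels as increasingly opaque.
luma = cv2.cvtColor(cloud, cv2.COLOR_BGR2GRAY).astype(np.float32)
alpha = np.clip(luma / 80.0, 0.0, 1.0)[..., None]   # 80 is a guessed threshold

composite = (alpha * cloud + (1.0 - alpha) * dome).astype(np.uint8)
cv2.imwrite("composite_frame.png", composite)
```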
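And the core idea behind the spatialization, in miniature: a mono source is panned around a ring of speakers by giving each speaker a gain based on its angular distance from the virtual source position. The real system was custom MaxMSP software tuned to our dome and speaker layout; the sketch below assumes an idealized, evenly spaced ring of 15 speakers.

```python
import numpy as np

# Assumed layout: 15 speakers evenly spaced around a ring.
SPEAKERS = np.linspace(0.0, 2.0 * np.pi, 15, endpoint=False)

def speaker_gains(source_angle, spread=np.pi / 4):
    # Cosine-shaped window: full gain at the source angle,
    # fading to zero for speakers more than `spread` away.
    diff = np.angle(np.exp(1j * (SPEAKERS - source_angle)))  # wrapped difference
    gains = np.cos(np.clip(np.abs(diff) / spread, 0, 1) * np.pi / 2)
    return gains / np.linalg.norm(gains)   # keep total power constant

# Sweep the source once around the room over ten seconds.
for t in np.linspace(0, 10, 5):
    g = speaker_gains(2 * np.pi * t / 10)
    print(f"t={t:4.1f}s gains={np.round(g, 2)}")
```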
Post-show, I feel really pleased. Once we did the rehearsal and everything went rather close to how I'd originally imagined the piece, I felt very relieved and excited for the performance. The rehearsal was the first time all of us had ever been in the same room, so as you can imagine I had a lot of trepidation about whether everything would work well together. I had several levels of "disaster plans" in case we encountered any problems in the pipeline, and I was quite happy that we did not have to resort to them.
Aside from not having enough time in one semester to set up the telepresence aspect, I am VERY satisfied with how everything came out. It was a fluid collaboration, and everyone did a wonderful job with their individual and collaborative parts. Even after only one rehearsal, I felt the collaborative aspect quite strongly: we were all reacting to each other's decisions in real time and found a comfortable space in which to do so. Part of this comfort may be due to the Death Convention Singers being very familiar with each other from past collaborations, and I feel honored to have participated in that magic.
I placed the performers out of sight, as the original intent was for them to be offsite and I wanted to simulate that vision as closely as possible. If I were to replicate this performance, I might place them in the center so the audience could see them and the performers could see their data on the dome.
I had to put so much time into the research and development of this project that I didn't get to devote as much time to the assets as I would have liked, but I am content with how they turned out despite the very short deadline (I spent about four ten-hour days during the last week before the show editing the pre-rendered content).
I am looking forward to taking this project further and setting up the telecommunication capabilities, which would allow me to develop collaborative relationships across much greater physical distances. We recorded the performance in 360 degrees using a Freedom 360 mount: a 3D-printed rig that holds six GoPros in a configuration that lets them shoot outward in all directions. The footage from all six cameras is then stitched together (I prefer Kolor's software) to create content that can be used for the dome or as a 360 interactive. The sound for this footage was recorded straight out of the board as a binaural recording, with simulated 3D spatialization when heard through headphones. Documenting this sort of performance accurately was a challenge, but I am pleased with the results, which you can experience here on YouTube, where it is navigable via your mouse, the arrows on the screen, or the gyroscope or touchscreen on your smartphone.
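For readers unfamiliar with binaural audio: the 3D effect over headphones comes from feeding each ear a slightly different version of the signal, chiefly a small time offset (ITD) and level difference (ILD) that vary with the source's direction. Real binaural rendering uses measured head-related transfer functions; the toy sketch below, with assumed constants, captures only those two simplest cues.

```python
import numpy as np

SR = 48000                      # sample rate, Hz
HEAD_RADIUS = 0.0875            # meters, an average head
SPEED_OF_SOUND = 343.0          # m/s

def binauralize(mono, azimuth):
    """azimuth in radians: 0 = straight ahead, positive = to the right."""
    # Interaural time difference: the far ear hears the sound later.
    itd = HEAD_RADIUS / SPEED_OF_SOUND * np.sin(abs(azimuth))
    shift = int(round(itd * SR))
    near = mono
    # Far ear: delayed by `shift` samples and attenuated (crude ILD).
    far = np.pad(mono, (shift, 0))[: len(mono)] * (1.0 - 0.4 * np.sin(abs(azimuth)))
    return (far, near) if azimuth > 0 else (near, far)   # (left, right)

t = np.arange(SR) / SR
tone = 0.2 * np.sin(2 * np.pi * 440 * t)     # one-second test tone
left, right = binauralize(tone, np.pi / 3)   # source 60 degrees to the right
```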