CelloCosmos “Alien Concerto: Close Approach” at the IAIA Digital Dome

IAIA and Lumenscapes present CelloCosmos’ “Alien Concerto: Close Approach,” with the electronic cello as a creature on an interstellar space mission. CelloCosmos is the duo of electronic producer Dwight Loop (Dream Jungle, Third Ear) and Michael Kott (Wedan DePansai); Dome Lighting Artist Joe Dean (lumenscapes.com) has prepared the Digital Dome at IAIA for 50 participants to imagine a 200-year journey through space. “Alien Concerto: Close Approach” is a visual experience with electro-ambient sound, a live concert performed throughout the flight. The audience sits within a simulation of the first close-up views of a new home planet.

POSTER

More info @ soundcloud.com/cellocosmos

IAIA Dome Performance ft. Death Convention Singers

Aanii, my name is Feather Metsch and I am a junior at IAIA in the Digital Arts program. I am also part of the work study program as Digital Dome Interactivity Research Assistant. This performance was the culmination of my Advanced Digital Arts Project course with Professor Craig Tompkins in Fall 2015. Access to the Digital Dome to produce the content and conduct research and development was provided by the IAIA Digital Dome Manager, Mats Reiniusson. Technical support, specialized software, and guidance were provided by Charles Veasey, who oversees my work study as Digital Dome Interactivity Research Assistant. Additional support and guidance came from my mentor, David Beining at ArtsLab UNM, who also produced the 360 documentation for this performance.

The Digital Arts course gave us considerable freedom to work within digital art using any skills we had accumulated thus far, as long as the semester culminated in a tangible final project. I presented a formal proposal and presentation on using immersive and interactive environments for telecommunicative digital storytelling and cultural exchange. A large part of the project (the telecommunicative aspect) had to be compromised rather early on, as I realized that achieving it would take far more work than the semester allowed.

I reached out to the Death Convention Singers to collaborate on this project, as I was very impressed with the large installation and performance they contributed to An Evening Redness in the West, the recent exhibit at the Museum of Contemporary Native Arts curated by then-chief curator Candice Hopkins.

The Death Convention Singers are a large collective whose lineup changes with each performance. I had the pleasure of working with the following members on this project:

Performers:

Musicians:
John Dieterich (Death Convention Singers, Deerhoof)
Raven Chacon (Death Convention Singers, Postcommodity, Mesa Ritual, etc.)
Rene Aguilera
Marisa Demarco (Death Convention Singers, Bigawatt, The 5 Star Motelles, Chicharra, etc.)

Sound Spatialization (custom software/MaxMSP):
Tahnee Udero

Depth Sensor/Kinect VJ (Zvector software):
Autumn Chacon

Producer & Overall Composition VJ (custom software/TouchDesigner):
Feather Metsch

To accomplish what I was trying to achieve for this performance I had to learn at least five new software packages. The workflow ended up being as follows:

  1. I created a pre-rendered dome master the length of the intended performance using After Effects and the Navegar plug-in. My assets consisted of spherical panoramas (taken with a GigaPan), public simulation data from NASA, and found videos.
  2. I then trained Autumn and Tahnee to operate their respective software for the live performance.
  3. During the live performance I shot the musicians with a depth sensor (Kinect) and ran their point cloud data through a sound-reactive software (Zvector) that also allows for visual manipulation of the data. I created some standard video files to drop into this data as a variance on the software’s visual limitations. This depth sensor footage was then chroma-keyed on top of the dome master and manipulated live by both Autumn and me (see the compositing sketch after this list).
  4. Tahnee spatialized the sound in real time on a 15.1 surround sound system through customized software specifically designed for our space.
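As a rough sketch of the chroma-key compositing in step 3, here is a minimal Python/OpenCV example that keys a green-screened depth-sensor render over a dome master frame. The file names, key color range, and the use of OpenCV are assumptions for illustration only; the actual performance did this live in Zvector and TouchDesigner rather than in code like this.

```python
import cv2
import numpy as np

# Hypothetical inputs: one frame of the pre-rendered dome master and one
# frame rendered from the Kinect point cloud on a flat green background.
# Both frames are assumed to be the same resolution.
dome_master = cv2.imread("dome_master_frame.png")
kinect_layer = cv2.imread("kinect_frame.png")

# Build a mask of the green key color (threshold values are illustrative).
hsv = cv2.cvtColor(kinect_layer, cv2.COLOR_BGR2HSV)
lower_green = np.array([40, 80, 80])
upper_green = np.array([80, 255, 255])
key_mask = cv2.inRange(hsv, lower_green, upper_green)

# The foreground is everything that is NOT the key color.
fg_mask = cv2.bitwise_not(key_mask)

# Composite: keep the dome master where the key color was,
# and the Kinect layer everywhere else.
background = cv2.bitwise_and(dome_master, dome_master, mask=key_mask)
foreground = cv2.bitwise_and(kinect_layer, kinect_layer, mask=fg_mask)
composite = cv2.add(background, foreground)

cv2.imwrite("composite_frame.png", composite)
```

In the live setting this operation happens per frame, with the mask and blend parameters adjusted in real time by the VJs.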

I feel really pleased post-show. Once we did the rehearsal and everything went fairly close to how I had originally imagined the piece, I felt very relieved and excited for the performance. The rehearsal was the first time all of us had ever been in the same room, so as you can imagine I had a lot of trepidation about whether everything would work well together. I had several levels of “disaster plans” in case we encountered any problems in the pipeline, and I was quite happy that we did not have to resort to them.

Aside from not having enough time in one semester to set up the telepresence aspect, I am VERY satisfied with how everything came out. It was a fluid collaboration and everyone did a wonderful job with their individual and collaborative parts. After only one rehearsal I felt the collaborative aspect quite strongly: we were all reacting to each other’s decisions in real time and found a comfortable space in which to do so. Part of that comfort may be due to the Death Convention Singers being very familiar with each other from their past collaborations, and I feel honored to have participated in that magic.

I set the performers out of sight because the original intent was that they would be offsite, and I was trying to simulate that vision as closely as possible. If I were to replicate this performance, I might place them in the center so that people could see them and the performers could see their data on the dome.

I had to put so much time into the research and development of this project that I could not devote as much time to the assets as I would have liked, but I am content with how they turned out despite the very short deadline (I spent about four ten-hour days during the last week before the show editing the pre-rendered content).

I am looking forward to taking this project further and setting up the telecommunication capabilities, which would allow me to develop collaborative relationships across greater physical distances. We recorded the performance in 360 degrees using a Freedom360 rig, a 3D-printed mount that holds six GoPros in a configuration that shoots outward in all directions. The footage from all six cameras is then stitched together (I prefer Kolor software) to create content that can be used for the dome or as a 360 interactive. The sound for this footage was recorded straight out of the board and is a binaural recording with simulated 3D spatialization when heard through headphones. Documenting this sort of performance accurately was a challenge, but I am pleased with the results, which you can experience here on YouTube, where it is navigable via your mouse, the arrows on the screen, or the gyroscope or touchscreen on your smartphone.

Interactive Dome Work at the Rio de Janeiro Planetarium

Recently Rafael Cordeiro and Rafael Drelich of the Rio de Janeiro Planetarium and Gamesquare, a platform for games in public spaces, produced Rio ShowDome, an event that explored interactivity within the fulldome. The event showcased interactive dome pieces from artists of Brazil and New Mexico, including faculty and former students of IAIA. The artists who attended were pleased to try this new interactive medium (most for the first time). The event, assisted by the United VJs, took place on November 7, 2015 at the Rio de Janeiro Planetarium in Brazil.


Three of the interactive pieces were developed through an IAIA digital arts course and workshop entitled 3D Animation for the Dome. The Fall 2014 course investigated animation, modeling, gaming, and other immersive production techniques within the fulldome. It was taught by Assistant Professor Craig Tompkins and included a workshop on the Unity game engine taught by Adjunct Professor Charles Veasey. The following pieces were showcased.

Microscape 1
Artist: Craig Tompkins

This piece explores an imaginative landscape based on photogrammetry and panoramic images from the Bisti/De-Na-Zin Wilderness in New Mexico.

Microscape 1

Robot Love
Artist: Felicia Nez

This piece is an interactive version of an animated short called Robot Love. The character Robot Love wanders the streets avoiding the evil robots that chase him around the city.

Robot Love
Terra Firma

This piece explores the different landscapes of northern New Mexico. Each landscape was created through onsite photogrammetry, panoramic photography, and audio recordings.

Terra Firma

New Courses Offered at IAIA Spring 2016: Immersive Art and Interactivity

Starting in Spring 2016, six new courses will be offered at IAIA over a three-year period: Computer Programming for the Arts, Interactive Programming and Development, 3D Graphics and Simulation, 3D Sound Spatialization, Immersive Game Development, and Project Development in Immersive Environments. These courses will deliver knowledge and practice in the technical aspects of art, science, and immersive production, and will augment current courses in filmmaking, 3D animation, and fulldome production.

Students will learn computer programming, interactivity, projection mapping, game development, electronics, and other topics. They will develop a fundamental understanding of computational thinking and project development for immersive displays. These courses are designed to investigate new forms of art and storytelling that arise from the exploration of new technologies, particularly in the convergence of traditional forms of cinema and the non-linear, user-driven nature of games and interactives.


Computer Programming for the Arts (Spring 2016)
An introduction to the use of computing technology within the arts, this course provides a theoretical and technical understanding of computer-mediated production and algorithmic art. Through the use of creative toolkits such as Processing and openFrameworks, students will learn the fundamentals of video, audio, 3D graphics, virtual reality, and web technologies from the programmer’s perspective. This level of understanding facilitates learning new software and using computers in general. A hands-on approach allows students to implement this knowledge in their relevant work and practice.
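As a small illustration of what “algorithmic art from the programmer’s perspective” can mean, here is a hedged sketch in plain Python with the Pillow library (the course itself uses Processing and openFrameworks; every parameter below is an arbitrary choice, not course material).

```python
from PIL import Image, ImageDraw
import math

# Draw a simple algorithmic pattern: a ring of circles whose radius, size,
# and color are driven by sine functions. All values are arbitrary choices.
WIDTH, HEIGHT = 800, 800
img = Image.new("RGB", (WIDTH, HEIGHT), "black")
draw = ImageDraw.Draw(img)

cx, cy = WIDTH / 2, HEIGHT / 2
for i in range(120):
    angle = i * (2 * math.pi / 120)          # position around the ring
    radius = 250 + 60 * math.sin(6 * angle)  # modulate the ring radius
    size = 8 + 6 * math.sin(3 * angle)       # modulate each circle's size
    x = cx + radius * math.cos(angle)
    y = cy + radius * math.sin(angle)
    shade = int(128 + 127 * math.sin(angle)) # vary brightness around the ring
    draw.ellipse((x - size, y - size, x + size, y + size),
                 fill=(shade, 80, 255 - shade))

img.save("algorithmic_pattern.png")
```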

Interactive Programming and Development (Fall 2016)
This course introduces the fundamentals of interactivity, exploring the human-computer relationship and interactive art. Students will learn how to program for input devices such as touchscreens and motion capture devices including the LEAP Motion and Microsoft Kinect. Students will also learn how to build custom input/output devices utilizing the Arduino microcontroller. An introduction to circuits and electronics will be covered along with creative coding through the Processing and openFrameworks toolkits. As a studio course, it will have students create interactive art emphasizing non-linear storytelling and user-driven work.
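As a hedged sketch of the kind of custom input work described above, the snippet below reads sensor values from an Arduino over a serial connection using the pyserial library and maps them to a normalized control value. The port name, baud rate, and message format are assumptions for illustration; an actual exercise would define its own protocol.

```python
import serial  # pyserial: pip install pyserial

# Hypothetical setup: an Arduino sketch prints one integer (0-1023) per line,
# e.g. an analogRead() from a potentiometer or light sensor.
PORT = "/dev/ttyACM0"   # assumed port name; on Windows this might be "COM3"
BAUD = 9600

with serial.Serial(PORT, BAUD, timeout=1) as arduino:
    while True:
        line = arduino.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue  # timeout or empty line
        try:
            raw = int(line)
        except ValueError:
            continue  # ignore malformed messages
        # Map the 10-bit ADC reading to a 0.0-1.0 control value,
        # e.g. to drive brightness, volume, or a visual parameter.
        control = raw / 1023.0
        print(f"raw={raw:4d}  control={control:.3f}")
```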

3D Graphics and Simulation (Spring 2017)
This course outlines concepts of 3D graphics, simulation, and reproduction emphasizing the technical and creative aspects of immersive production. It focuses on an understanding of graphics processing including the programming of geometry, shaders, particles, and simulation effects. The students will learn how to render immersive environments targeting fulldome, cylindrical screens, VR headsets, and other arbitrary 3D surfaces. Students will learn projection mapping techniques by printing a 3D model in the fabrication lab and utilizing it as a projection surface.
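As a hedged example of the kind of math behind rendering for the fulldome, the function below converts a 3D view direction into domemaster (angular fisheye) image coordinates, the square fisheye format commonly used for dome masters. It is a simplified sketch (180-degree dome, zenith at the image center) and not any particular renderer’s implementation.

```python
import math

def direction_to_domemaster(x, y, z, aperture_deg=180.0):
    """Map a unit view direction to (u, v) in [0, 1] domemaster coordinates.

    Convention (an assumption for this sketch): +z points toward the dome
    zenith, and the x/y plane is the dome's springline. Directions outside
    the aperture return None.
    """
    # Angle away from the zenith and azimuth around it.
    theta = math.acos(max(-1.0, min(1.0, z)))   # 0 at zenith, pi/2 at horizon
    phi = math.atan2(y, x)

    # Normalized radius in the fisheye image: 0 at center, 1 at the rim.
    r = theta / math.radians(aperture_deg / 2.0)
    if r > 1.0:
        return None  # direction falls outside the dome

    u = 0.5 + 0.5 * r * math.cos(phi)
    v = 0.5 + 0.5 * r * math.sin(phi)
    return (u, v)

# Example: a direction 45 degrees up from the horizon, facing +x.
print(direction_to_domemaster(math.cos(math.radians(45)), 0.0,
                              math.sin(math.radians(45))))
```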

3D Sound Spatialization (Fall 2017)
An introduction to 3D sound in recording, spatialization, and reproduction, students will use visual programming languages such as Max and PD to learn about the science of acoustics, psychoacoustics, and digital signal processing in relation to spatialization. Students will learn Ambisonics and Vector Based Amplitude Panning (VBAP) surround techniques utilizing IAIA’s 15.2 channel surround system. They will also learn about binaural surround and head tracking within VR headsets. Studio projects will emphasize spatial listening, the study of soundscapes, sound design, and electroacoustic composition.
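As a simplified, hedged illustration of the Vector Based Amplitude Panning technique mentioned above, the sketch below computes 2D VBAP gains for a source direction between two loudspeakers using numpy. The speaker angles and source angle are arbitrary example values, and a real multichannel system (such as a 15.2 ring) would first select the active speaker pair.

```python
import numpy as np

def vbap_2d_gains(source_deg, spk1_deg, spk2_deg):
    """Return power-normalized gains for a speaker pair (2D VBAP).

    Solves g1*l1 + g2*l2 = p, where l1 and l2 are unit vectors toward the
    speakers and p is the unit vector toward the source.
    """
    p = np.array([np.cos(np.radians(source_deg)), np.sin(np.radians(source_deg))])
    L = np.column_stack([
        [np.cos(np.radians(spk1_deg)), np.sin(np.radians(spk1_deg))],
        [np.cos(np.radians(spk2_deg)), np.sin(np.radians(spk2_deg))],
    ])
    gains = np.linalg.solve(L, p)          # raw gains for the pair
    gains = np.clip(gains, 0.0, None)      # negative gains mean a wrong pair
    gains /= np.linalg.norm(gains)         # normalize for constant power
    return gains

# Example: speakers at +30 and -30 degrees, source panned to +10 degrees.
g = vbap_2d_gains(10.0, 30.0, -30.0)
print(f"speaker gains: {g[0]:.3f}, {g[1]:.3f}")
```

The same idea extends to three speakers at a time for full 3D panning, which is how sources can be moved smoothly around and above the audience in the dome.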

Immersive Game Development (Spring 2018)
An introduction to game theory and development, students will utilize the Unity 3D engine to create immersive games, interactive films, and other works. A history of games and relevant techniques within art production will be covered. Students will learn about scene construction, animation, artificial intelligence, and scripting within 2D and 3D environments. Projects will be developed for the desktop computer, mobile devices, and game consoles, with an emphasis placed on rendering to immersive displays such as fulldome, cylindrical screens, and VR headsets.

Project Development in Immersive Environments (Spring 2018)
This course will focus on executing an immersive work of art incorporating several aspects of production in order to create a large-scale user experience. Topics include budgeting, design, development, and the dissemination of creative projects. With an emphasis on team development and innovative storytelling, students will work together to utilize their individual art practices within the field of immersive art. At the conclusion of the course students will be comfortable entering a production firm or managing their own team-based projects. Students’ work will be publicly exhibited and submitted to professional conferences and festivals.

Currents 2016

The November 12th deadline for Currents 2016 International New Media is fast approaching, and IAIA is excited to be partnering with them again for the fulldome portion of their program. Submissions are accepted for all genres of fulldome video, interactives, and performances.


Bio-Inspire
Yusuf Emre Kucur
Currents 2015 @ Institute of American Indian Arts

The full list of festival categories includes:
New Media Installations, Outdoor New Media Installations and Architectural Mapping, Single Channel Video and Animation, Multimedia Performance, Fulldome, Experimental or Interactive Documentary, Web-Art/Art-Gaming/Mobile Device Apps, Oculus Rift, Robotics, 3D Printing and Interactive Installations for Children.


SIMULACRA
Karina Smigla-Bobinski

New Addition to the IAIA Family

Charles Veasey, Software Developer, has rejoined the IAIA Family in the Digital Dome, teaching new media courses and developing software for fulldome technology. Charles has been developing immersive, interactive environments and applications for a variety of computing platforms. He served as lead programmer and technician for the Digital Dome at IAIA for the last three years, as well as senior software developer and project manager at Ideum, where he created digital museum exhibits for institutions such as NASA, Chicago MSI, and SFMOMA. He also managed Ideum’s NSF-sponsored projects: Open Exhibits, an open source multi-touch framework designed for museum exhibits, and Creating Museum Media for Everyone (CMME), an initiative founded to ensure emerging digital museum exhibits are accessible and universally designed. Charles has also served as the principal researcher and software developer of 3D surround sound technology for New Media and Emerging Technologies. He has an MFA in Electronic Arts from Rensselaer Polytechnic Institute and a Bachelor of Science from Bowling Green State University. Over the next three years he will be presenting a variety of new courses designed to investigate new forms of art and storytelling that arise from the exploration of new technologies, particularly in the convergence of traditional forms of cinema and the non-linear, user-driven nature of games and interactives.

Reflections from iX Symposium 2015

This was an amazing symposium on interactivity and VR. We got to see some very interesting talks and lectures, equipment and project presentations, lots of dome projects, and a magnificent dome performance: Entropia, by Eric Raynaud (FR), Aurélien Lafargue (FR), and LP St-Arnault (CA). (Link to artist interview in French.)

https://vimeo.com/118748780

I had the pleasure of giving a presentation on the IAIA College and the work we have done here in the IAIA Digital Dome during the last year, including student projects in interactivity and gaming in the fulldome, the Telepresence project during iX 2014, and immersive interactive projects during Currents 2014 by visiting artist Zack Settel and UNM student Chris Clavio. It has been an amazing year, with good student work in Interactivity and Dome Production I in Fall 2014, and good work from students in the Summer 2014 internship in collaboration with UNM students and ArtsLab.

Currents 2015 in the IAIA Dome: June 13–14 and 20–21, 2:30–6 pm

This year’s Currents International Media Festival in the IAIA Digital Dome will have two parts: a student screening (2:30–2:50 pm; 4:15–4:40 pm) and a main screening of selected artists (3–4 pm; 5–6 pm).

In Currents 2015 @ the IAIA Dome we have the unique possibility and pleasure to screen works from Electronic Arts and Experimental Video pioneers Steina and Woody Vasulka.

Participating selected artists: Eric Freeman with “Cycles”, Chris Henschke with “Edge of the Observable”, Michael Wyshock with “Deluge”, Audri Phillips with “Relentless Beauty”, Kortney Luby with “I know what I am”, Lance Ford Jones and Claudia Cumbie-Jones with “Gulfstream”, Tony Abeyta with “Aqueous/Acquiesce”, Yusuf Emre Kucur with “Bio-inspire”, Fernanda D’Agostino with “The New Math”, Steina Vasulka with “Qaqortoq”, and Woody Vasulka with “From the Lucifer’s commission”.

The student screening will show works from IAIA, UNM ArtsLab, and Ringling College of Art and Design.
Participating students:
Ringling College of Art and Design: Andrew Halley, Peggy Blount, Corey Allen, Nick Lennon
University of New Mexico / ArtsLab: Mia Casesa, Rubin Olgin, Adam Davis
IAIA College of Contemporary Native Arts: Deepak Maharjan, Erica Moore, Dwayne Joe, Delfino Castillo, Felicia Nez

Currents International Media Festival runs from June 12th to June 28th at various locations in Santa Fe with the main exhibition in El Museo in the Santa Fe Railyard.

Vasulka Dome Works
Image from Steina and Woody Vasulka Dome Project