Team: Madi Heath, Andrew Barlow, Phu Le, and Sofi Ozambela
Duration: February 2021 - December 2021
Tools Used: Blender, TouchDesigner, Processing, Laser Cutting, and more
For my final CTD Capstone project, I wanted to bring together the skills and experiences I had gained throughout my time in the program. My teammates and I created Emitting, an interactive installation that uses projection mapping onto walls and sculptures, pairing visual design with immersive sound to represent the feelings of isolation that many of us first experienced during the pandemic.
The final product can be viewed below.
This project idea was born in early spring 2021, while we were all still learning via Zoom. We wanted to create an immersive experience that was not only intriguing and cool to look at, but one that would make the audience reflect. We began with extensive feasibility testing as we iterated on our ideas, then developed an implementation plan and project management timeline. We proposed our idea to faculty and were approved to begin creating our project.
When the fall semester came, there was a lot to do to make this project possible. All of our project documentation can be found on our process blog.
My teammate Sofi and I began by focusing on the large-form sculptures. Our original plan was to build each shape of the sculpture, designed in Blender, out of chicken wire and then cover it in papier-mâché. That approach ended up not working for our needs, so we reworked the idea and decided to laser-cut each piece of the geometric shapes and assemble them together like a puzzle.
Once the sculptures were assembled, we painted and plastered them to add texture. We then built large stands out of plywood and PVC for the sculptures to sit on, giving them a floating effect in the Black Box Experimental Studio at CU Boulder.
In the meantime, we iterated on the visuals that would be displayed on the background as well as on the sculptures themselves. Phu focused on the background visuals, while Andrew worked on the interaction between two Kinects and the projection. We ended up using a skeleton-tracking library in Processing, paired with TouchDesigner, to track how many people were in front of the Kinects and how close they were. That data drove the animations: the projection faded away as an audience member moved closer, and grew warmer and more erratic in color and speed as more people were detected.
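To give a sense of how this kind of pipeline fits together, here is a simplified Processing sketch of the tracking side. It assumes the KinectPV2 skeleton-tracking library and oscP5 for sending data to TouchDesigner; the OSC address, port numbers, and joint choice are illustrative placeholders rather than the exact values from our installation.

```java
// Simplified tracking sketch (assumes KinectPV2 and oscP5;
// OSC address and ports are placeholders).
import KinectPV2.*;
import oscP5.*;
import netP5.*;

KinectPV2 kinect;
OscP5 osc;
NetAddress touchDesigner;

void setup() {
  size(512, 424);
  kinect = new KinectPV2(this);
  kinect.enableSkeleton3DMap(true);   // skeleton joints in camera space (meters)
  kinect.init();

  osc = new OscP5(this, 12000);                       // local listening port
  touchDesigner = new NetAddress("127.0.0.1", 7000);  // TouchDesigner's OSC In port
}

void draw() {
  background(0);

  ArrayList<KSkeleton> skeletons = kinect.getSkeleton3d();
  int peopleCount = 0;
  float nearestDistance = Float.MAX_VALUE;

  for (int i = 0; i < skeletons.size(); i++) {
    KSkeleton skeleton = skeletons.get(i);
    if (!skeleton.isTracked()) continue;
    peopleCount++;

    // Use the spine-mid joint's Z value as that person's distance from the sensor.
    KJoint[] joints = skeleton.getJoints();
    float z = joints[KinectPV2.JointType_SpineMid].getZ();
    if (z > 0 && z < nearestDistance) nearestDistance = z;
  }
  if (peopleCount == 0) nearestDistance = 0;

  // Send person count and nearest distance to TouchDesigner, which can map
  // them onto the animations' opacity, color temperature, and speed.
  OscMessage msg = new OscMessage("/emitting/presence");
  msg.add(peopleCount);
  msg.add(nearestDistance);
  osc.send(msg, touchDesigner);
}
```

On the TouchDesigner side, an OSC In CHOP listening on the matching port can receive these two channels and drive the parameters of the projected visuals.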