The Forest was an inter-arts research project, supported by a Canada Council for the Arts grant, exploring the contradictory themes of hyper-connectivity and isolation in today's world. Given the unique challenges that aerial arts face because of their dynamic nature, the project sought to leverage accessible and affordable technologies to integrate real-time media into performances by small, local aerial circus troupes.
We achieved affordable real-time tracking with a single cellphone camera and the MediaPipe-to-TouchDesigner plugin developed by Torin Blankensmith and Dom Scott, which brings Google's MediaPipe pose estimation into TouchDesigner. This allowed us to analyze skeletal data without placing any tracking equipment on the aerialists themselves, giving them the freedom to mold the projected environment as they performed.
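To illustrate the idea, here is a minimal Python sketch of single-camera skeletal tracking with MediaPipe, analogous to what the plugin provides inside TouchDesigner; the camera index and the choice of wrist landmarks are illustrative assumptions, not our production setup.

```python
# Minimal sketch: single-camera pose tracking with MediaPipe (Python),
# analogous to what the MediaPipe-to-TouchDesigner plugin provides natively.
# Camera index and landmark choices are illustrative assumptions.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

cap = cv2.VideoCapture(0)  # single phone/webcam feed
with mp_pose.Pose(model_complexity=1, min_detection_confidence=0.5) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV delivers BGR
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            # Normalized (0..1) skeletal coordinates, e.g. the wrists,
            # which could drive parameters of the projected environment
            lm = results.pose_landmarks.landmark
            left_wrist = lm[mp_pose.PoseLandmark.LEFT_WRIST]
            right_wrist = lm[mp_pose.PoseLandmark.RIGHT_WRIST]
            print(f"L wrist ({left_wrist.x:.2f}, {left_wrist.y:.2f})  "
                  f"R wrist ({right_wrist.x:.2f}, {right_wrist.y:.2f})")
cap.release()
```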
No AI was used to generate content. Instead, audience members, on-site or remote, whispered into their devices, creating a virtual "wind" that shaped how the environment looked and reacted to the performance. ChatGPT analyzed the speech patterns, focusing on context, tone, and complexity, and this information was mapped to generative visuals in TouchDesigner, adjusting the forest rendering in real time.
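The mapping step might look something like the sketch below, assuming the speech analysis has already returned normalized scores and that TouchDesigner is listening with an OSC In CHOP on port 7000; the score fields, OSC addresses, and parameter names are hypothetical, not our actual mapping.

```python
# Hypothetical sketch: mapping speech-analysis scores to TouchDesigner via OSC.
# Score fields, OSC addresses, and value ranges are illustrative assumptions.
from pythonosc.udp_client import SimpleUDPClient

# TouchDesigner assumed to be listening with an OSC In CHOP on this port
td = SimpleUDPClient("127.0.0.1", 7000)

def apply_whisper(scores: dict) -> None:
    """scores: e.g. {"tone": 0.7, "complexity": 0.3, "intensity": 0.5},
    each normalized to 0..1 by the upstream speech analysis."""
    # Louder, more intense whispers strengthen the virtual wind;
    # more complex speech adds turbulence; tone shifts the palette.
    td.send_message("/forest/wind/strength", scores.get("intensity", 0.0))
    td.send_message("/forest/wind/turbulence", scores.get("complexity", 0.0))
    td.send_message("/forest/color/warmth", scores.get("tone", 0.5))

apply_whisper({"tone": 0.7, "complexity": 0.3, "intensity": 0.5})
```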
Combined, these techniques resulted in an ambient proof-of-concept performance that ran over several hours, allowing audiences to come and go for durations of their choice as the piece evolved.