For Chain, our creative collective In-grid took over IKLECTIK to test the boundaries between live event and installation.
Using computer vision and speech recognition, we built a feedback loop of devices and data that the audience could become part of. Performers including movement practitioner Shai Rapoport and singer Christina Karpodini interacted with the devices and shaped the visual and sonic landscape of the performance space.
Besides being the In-grid lead for this project, I co-built a speech-to-text system in Python that used Natural Language Processing to set the colour of the LEDs based on what people said to the system.
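The sketch below shows one way such a pipeline might be wired together in Python, assuming the speech_recognition library for transcription and NLTK's VADER sentiment analyser as the NLP step. The sentiment-to-colour mapping and the send_colour callback are hypothetical stand-ins, since the actual LED hardware interface isn't described here.

```python
import speech_recognition as sr
from nltk.sentiment import SentimentIntensityAnalyzer

# nltk.download('vader_lexicon')  # required once before first run


def sentiment_to_rgb(compound: float) -> tuple[int, int, int]:
    """Map VADER's compound score (-1..1) to an RGB colour:
    negative speech shifts towards blue, positive towards warm red.
    (Illustrative mapping, not the one used in the piece.)"""
    warm = int((compound + 1) / 2 * 255)  # 0 = fully negative, 255 = fully positive
    return (warm, 40, 255 - warm)


def listen_and_colour(send_colour):
    """Continuously transcribe speech and push a colour to the LEDs.
    `send_colour` is a hypothetical callback wrapping the LED hardware."""
    recogniser = sr.Recognizer()
    analyser = SentimentIntensityAnalyzer()
    with sr.Microphone() as source:
        recogniser.adjust_for_ambient_noise(source)
        while True:
            audio = recogniser.listen(source, phrase_time_limit=5)
            try:
                text = recogniser.recognize_google(audio)
            except sr.UnknownValueError:
                continue  # nothing intelligible; keep listening
            compound = analyser.polarity_scores(text)["compound"]
            send_colour(sentiment_to_rgb(compound))
```

In a setup like this, the transcription and NLP step run in a loop while the LEDs respond in near real time, so the audience's conversation continuously feeds back into the visual state of the room.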