Interactive performance rethought: A research project by ZHdK and HKBU is developing an interactive performance in which up to 20 people influence the dramaturgy in real time through gaze, gestures and movement. The state-of-the-art nVis system combines immersive technology, spatial computing and AI to create a new audience experience.
This interdisciplinary project brings together two internationally recognized labs working at the intersection of the arts and sciences: the Immersive Arts Space (IAS) at the Zurich University of the Arts (ZHdK) and the Visualization Research Center (VRC) at the Hong Kong Baptist University (HKBU). The project will develop a pilot prototype for a new kind of multi-user interactive performance in which an audience of up to 20 people can individually and collectively influence the dramaturgical evolution of the event. Through a major Hong Kong government innovation grant, the VRC, under the leadership of Jeffrey Shaw, a pioneer in the development of immersive technological environments, has built a world-leading interactive multimedia visualization system called nVis. nVis consists of four core components: (1) a 360-degree wrap-around screen built from 400 LED panels, which, unlike standard projection-based technologies, lets visitors see each other in a brightly illuminated space with no shadows; (2) a tracking system that allows up to 20 visitors to interact with on-screen content in real time; (3) a multidirectional sound system (32.1 channels) that can position sound based on where visitors are standing or looking; and (4) iPhone interfaces that communicate with the system via text prompts (running over a generative AI engine) and deliver audio feeds over individual visitor headphones. This unique system will enable IAS researchers to experiment with how multiple users can become performers in a dramatic scenario, interacting with and influencing one another and collectively shaping an evolving narrative through eye, gesture and body interaction. The proposed research innovatively leverages ongoing technological advancements in human-computer interaction and virtual production within the nVis system.
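To make the idea of collective dramaturgical influence concrete, one plausible mechanism is a weighted tally: each tracked visitor's gaze, gesture or movement counts as a vote toward a narrative branch, with deliberate gestures weighted more heavily than passing glances. The sketch below is purely illustrative; the class names, weights and function are assumptions, not part of the actual nVis system.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical sketch -- none of these names come from the real nVis API.

@dataclass
class Interaction:
    visitor_id: int   # one of up to 20 tracked visitors
    kind: str         # "gaze", "gesture" or "movement"
    target: str       # the narrative branch the interaction points at

# Assumed weighting: deliberate gestures count more than passing gazes.
WEIGHTS = {"gesture": 2.0, "movement": 1.5, "gaze": 1.0}

def choose_branch(interactions, branches):
    """Tally weighted interactions and return the best-supported branch."""
    votes = Counter()
    for i in interactions:
        if i.target in branches:
            votes[i.target] += WEIGHTS.get(i.kind, 1.0)
    # Fall back to the first branch if nobody interacted.
    return votes.most_common(1)[0][0] if votes else branches[0]

events = [
    Interaction(0, "gaze", "storm"),
    Interaction(1, "gesture", "calm"),
    Interaction(2, "gaze", "storm"),
    Interaction(3, "gaze", "storm"),
]
print(choose_branch(events, ["calm", "storm"]))  # prints "storm" (3.0 vs 2.0)
```

A real system would of course run this continuously over streaming tracking data rather than on a static list, but the weighted-vote pattern captures how individual and collective input can steer one shared narrative.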
At the same time, nVis will enable a new paradigm in spatial computing, in which interaction can be seamlessly woven throughout the physical environment, taking into account the context of users' actions and needs. The proposed project thus innovatively combines physical immersion with embodied computing, creating a dynamic environment where digital elements coexist with the audience and are manipulated collaboratively through natural interactions such as group movement and gesture.
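One simple way to picture "manipulation through group movement" is to map a property of the tracked crowd, such as how tightly clustered the visitors are, onto a continuous scene parameter. The mapping below is a minimal sketch under assumed names and units; it is not the nVis tracking interface.

```python
import math

# Hypothetical sketch -- illustrative only, not the actual nVis tracking API.

def group_spread(positions):
    """Mean distance of visitors from their centroid, in metres."""
    n = len(positions)
    cx = sum(x for x, _ in positions) / n
    cy = sum(y for _, y in positions) / n
    return sum(math.hypot(x - cx, y - cy) for x, y in positions) / n

def ambience_intensity(positions, max_spread=5.0):
    """Map group dispersion to a 0..1 scene parameter: a tight cluster
    could yield an intimate scene, a scattered group an expansive one."""
    return min(group_spread(positions) / max_spread, 1.0)

# Four visitors clustered near the centre of the 360-degree screen.
print(round(ambience_intensity([(0, 0), (1, 0), (0, 1), (1, 1)]), 2))
```

The design choice here is that no single visitor controls the parameter: only the collective configuration of bodies in the space does, which matches the project's emphasis on shared, embodied interaction.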