DANCE^2 (Dance Squared)
Role: Director, Narrative Designer, Producer
DANCE^2 (Dance Squared) is a multidisciplinary research and immersive performance project in which audience members answer questions via their smartphones to interact with the performance. Their answers determine the choreography of human and robotic dancers as well as the media, lights, and sound design environment.
Through these choices, the project explores the audience's relationship to emerging technology and how much agency they feel they have over its future impact on society.
DANCE^2 began as an investigation into the choreographic possibilities of a small robot named "Calico." We started with a series of workshop sessions in a dance studio, simply observing how Calico moved on a body and the themes and questions it elicited: how might we use wearable technology and dance to make both emerging technology and the performing arts more accessible and compelling to an audience? How could we use dance and interactive art installation to discover novel uses for wearable technology and spark a public conversation about its utility and ethical implications?
More than 20 artists, designers, and engineers from dance, computer science, information science, theater, and immersive media have contributed to the performance and its accompanying research. Over the course of more than two years of development at the University of Maryland (UMD), the project has gone through three radically different iterations, with a fourth on the way.
Version 1: February 2022-April 2023
In the initial version of DANCE^2, the performance focused on a duet between a solo dancer and a single robot. The performance began by instructing the audience to scan a QR code that led them to a browser-based interface. When cued through sound and media, the audience was prompted to vote on how the robot should respond to the next section of choreography: should it follow the dancer's lead, or make a different choice, e.g., "move to shoulder"? When the robot moved in ways that ran counter to the dancer's expectations, the dancer was forced to follow the robot. The resulting choreographic narrative was one in which the audience mediated the degree and nature of the struggle for control between human and machine. As one audience member shared during the post-show feedback session, "Sometimes it felt like the robot was controlling the dancer, then I realized that we (the audience) are the robot."
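At its core, this interaction reduces to a simple loop: collect the audience's smartphone votes for a cue, pick the winning action, and send it to the robot. The sketch below illustrates that loop in Python; the action names and the majority-vote rule are illustrative assumptions, not the production codebase.

```python
# Hypothetical sketch of the core interaction loop: tally audience votes
# for the robot's next action and pick the majority choice.
from collections import Counter

ACTIONS = ["follow_dancer", "move_to_shoulder"]  # options offered for one cue

def winning_action(votes: list[str]) -> str:
    """Return the action with the most valid votes; with no valid votes, follow the dancer."""
    tally = Counter(v for v in votes if v in ACTIONS)
    if not tally:
        return "follow_dancer"
    return tally.most_common(1)[0][0]

if __name__ == "__main__":
    print(winning_action(["move_to_shoulder", "follow_dancer", "move_to_shoulder"]))
    # -> move_to_shoulder
```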
Version 2: April 2023-September 2023
For a second performance in September 2023 as part of the UMD campus-wide NextNOW Arts Festival, we pushed the project toward an installation by taking it outside the theater space. We mounted this new version in the UMD Art Building atrium, a two-story open space that allowed audiences to come and go as they pleased. The narrative structure played as a loop: the questions posed to the audience via their smartphones repeated, giving audience members who stayed for multiple iterations the opportunity to replay the performance with different answers each time.
Version 3: October 2023-April 2024
A third version transformed the project into an immersive performance in which the audience was invited to participate in a theatrical HCI research experiment that was, at the same time, a real HCI research experiment. Audiences answered questions before and after the performance, along with 25 questions posed during the performance itself. The questions probed their feelings about technology in general, their feelings about wearable technology in particular, and what they imagine the future of technology will look and feel like.
A new show control system allowed media, lights, and sound cues to be triggered flexibly and automatically based on the audience's votes. This let us develop a deeply branching narrative structure and increase the number of possible permutations of the production into the millions: if each of the 25 in-show questions offered just two options, the answer paths alone would number 2^25, over 33 million. An improved database system enabled the collection of both qualitative and quantitative data from the audience via their smartphone inputs. A post-show installation shared information on the development process and invited the audience to leave their own questions for other audience members to answer.
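One minimal way to picture such a branching structure is a cue graph keyed on (current scene, vote outcome): each audience vote selects the next scene and fires that branch's media, light, and sound cues. The sketch below is a simplified illustration under that assumption; the scene names, cue labels, and the print stand-in for show-control messaging are all hypothetical.

```python
# Illustrative sketch of a branching show-control graph (not the production
# system): each (scene, vote) pair maps to the next scene plus the cues to fire.
SHOW_GRAPH = {
    ("intro", "yes"): ("scene_trust", ["media:warm_wash", "lights:soft_blue"]),
    ("intro", "no"):  ("scene_doubt", ["media:glitch", "lights:hard_white"]),
    # ...one entry per (scene, vote) branch in the full graph
}

def advance(scene: str, vote: str) -> str:
    """Fire the cues for this branch and return the next scene."""
    next_scene, cues = SHOW_GRAPH[(scene, vote)]
    for cue in cues:
        # Stand-in for a real cue transport (e.g., OSC or DMX messages).
        print(f"firing {cue}")
    return next_scene

scene = advance("intro", "yes")  # -> "scene_trust"
```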
Version 4 and Beyond:
Building on the structure of Version 3, Version 4 will incorporate generative AI through real-time manipulation of a video feed using an open-source Stable Diffusion model.
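As a rough illustration of the technique, the sketch below restyles a single captured video frame with an open-source Stable Diffusion img2img pipeline via the Hugging Face diffusers library. This is an assumption about the approach, not the project's actual pipeline, and true real-time use would require far more aggressive optimization (distilled models, very low step counts, frame-to-frame consistency tricks).

```python
# Hypothetical single-frame sketch: restyle one video frame with Stable
# Diffusion img2img (Hugging Face diffusers). Not the production pipeline.
import torch
from diffusers import StableDiffusionImg2ImgPipeline
from PIL import Image

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

frame = Image.open("frame.png").convert("RGB").resize((512, 512))
styled = pipe(
    prompt="a dancer dissolving into circuitry under stage light",
    image=frame,
    strength=0.5,            # how far the output departs from the source frame
    num_inference_steps=20,  # fewer steps trades quality for speed
).images[0]
styled.save("frame_styled.png")
```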
The post-show installation will recreate a fictional version of the DANCE^2 lab, where audiences will be invited to interact with the data collected, learn more about the development process, and build their own robot.
A new DANCE^2 website will translate the project's themes into an interactive media project where online participants can learn about the project, explore the research data collected, and participate in the research themselves.
PRODUCTION HISTORY
- January 2025 - PRAx Performing Arts Center - Oregon State Univ.
- April 2024 - Moving With Screens and Machines Symposium
- Sept 2023 - Univ. of Maryland NextNOW Festival
- March 2023 - Univ. of Maryland Spring Faculty Dance Concert
CREDITS
Director: Jonathan David Martin
Choreographer: Adriane Fang
UI Designer: Bill Kules
Technical Lead: Huaishu Peng
Sound Designer: Samuel Crawford
Media and Systems Designer: Mark Williams
Media Designer: Tim Kelly
Costume Designer: Tori Niemic
Lighting Designer: Scott Monin
Scenic Designer: Margarita Syrocheva
Dancers: Patricia Mullaney-Loss, Amber Chabus, Zoe Caldwell, Angela Smith
Production & Stage Manager: Tori Schuchmann
Robot & UI Engineer: Anup Sathya Kumar
Robot Engineer: Jiasheng Li
UI Engineer: Christopher Maxey
Choreo & Robotics Assistant: Julie Zalalutdinov
Created through the generous support of an Arts for All Arts AMP Collaborative Faculty Grant, the UMD Immersive Media Design program, the UMD School of Theatre, Dance, and Performance Studies, the UMD College of Information Studies, the UMD Department of Art, and the UMD College of Computer, Mathematical, and Natural Sciences.