
TheARter – Augmented Reality in Staged Entertainment

Starting Date – September 1, 2022

Introduction

Live performances have long made use of technology: audio, video, and light. With the development of new, innovative technologies we can expand this use of technology and provide an additional layer of reality, blended with the physical performance. In the TheARter project we set out to explore some of the opportunities offered by augmenting a 10-minute contemporary dance performance with non-interactive 3D visualizations.

Story

This Research and Development project is aimed at (a) learning how to create an Augmented Reality enriched (live and recorded) dance performance, and (b) understanding how audiences in a venue and at home experience this AR enriched contemporary dance performance.

A 10-minute dance was created by two professional dancers, and an AR platform was built to enable the projection of 3D computer-generated animations synchronously through a set of 20 AR glasses. Together with the dancers, a visual artist then developed AR content to visually support the narrative of the dance performance. The live performance was experienced on stage (at the Effenaar in Eindhoven) with and without AR.

In addition, a recording of the performance was experienced in a home-like setting, again with and without AR. Experience measurements, amongst which physiological measurements, indicate that the live performance is experienced more positively than the at-home performance.

AR can improve the at-home experience, but to enhance the live experience, improvements ranging from technological and design optimization to increased congruency between the AR content and the live performance seem necessary.

Follow-up research is planned for 2023 to better understand why this is the case, using different forms of AR content and creative input.

Process

Cradle’s main responsibility was the design and development of the digital content. While the dancers created the choreography, we worked on 3 core challenges:

Developing Content – We wanted our content to provide a sense of immersion and context. Together with 4DR studios and the dancers we designed 3 environments for the different dance segments: a background environment for the beamer projection showing the world, and foreground content consisting of 3D models and visual effects. Visual effects ranged from falling leaves and materializing objects to abstract figures guiding the dancers in their motion.

Syncing Content – Playing all this content simultaneously required a custom platform solution to connect and synchronize the content between a server and the audience. We used FishNet to create a network layer that could communicate and send event requests to the background environment as well as to the AR HMDs. This gave us the ability to connect, control, and monitor all users simultaneously from any device with a basic browser; a sketch of this event flow follows below.
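As an illustration, here is a minimal, hypothetical sketch of how such a server-to-clients event broadcast can look using FishNet's NetworkBehaviour and RPC attributes. The class and method names (PerformanceCueSync, TriggerCue, PlayCueOnObservers) are ours for illustration, not the project's actual code.

```csharp
using FishNet.Object;
using UnityEngine;

// Hypothetical sketch: the server triggers a timeline cue, and FishNet
// replicates it to every observing client (the beamer background client
// and each AR HMD), so all devices fire the same event together.
public class PerformanceCueSync : NetworkBehaviour
{
    // Called on the server, e.g. from a browser-driven control panel.
    [Server]
    public void TriggerCue(int cueIndex)
    {
        PlayCueOnObservers(cueIndex);
    }

    // ObserversRpc runs on all clients observing this networked object.
    [ObserversRpc]
    private void PlayCueOnObservers(int cueIndex)
    {
        Debug.Log($"Playing cue {cueIndex}");
        // e.g. start the matching animation or VFX segment here.
    }
}
```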

Spatial Localization – The last major challenge was the spatial placement of the AR content. We decided to model the 3D space where the content would be shown, and to design all content specifically around that space. On-site we used image- and marker-based position tracking, allowing us to place the content inside the physical space by offsetting the center of the room in the digital environment. Although low-light conditions led to some drifting of the content, the results were relatively stable and manageable within the duration of the performance; a sketch of this anchoring step follows below.
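A minimal sketch of that anchoring idea, assuming the tracking SDK reports the marker's pose in Unity world space: all AR content sits under one scene root, authored relative to the modeled room's center, and is shifted by a measured marker-to-center offset. The names (RoomAnchor, sceneRoot, markerToRoomCenter) are illustrative, not the project's actual code.

```csharp
using UnityEngine;

// Hypothetical sketch of marker-based room anchoring.
public class RoomAnchor : MonoBehaviour
{
    [SerializeField] private Transform sceneRoot;        // parent of all AR content
    [SerializeField] private Vector3 markerToRoomCenter; // offset measured on-site

    // Call this whenever the image/marker tracker reports a (new) pose.
    public void OnMarkerTracked(Pose markerPose)
    {
        // Place the digital room center at the physical marker position,
        // shifted by the measured marker-to-center offset.
        sceneRoot.SetPositionAndRotation(
            markerPose.position + markerPose.rotation * markerToRoomCenter,
            markerPose.rotation);
    }
}
```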

Augmented Reality – We integrated support for the Nreal Light AR HMDs. These glasses are connected to a smartphone and allow the user to see their physical surroundings through transparent lenses, with “holograms” projected on top of the real world. The glasses come with their own SDK (NRSDK), allowing us to use the built-in image tracking, spatial tracking, and app controls to speed up the development process.

Results

The project resulted in the successful projection of AR content in 4 situations: 2 live events and 2 at-home settings.

Live Events were organized in collaboration with the Effenaar, a pop music venue in Eindhoven. During these sessions, the participants traveled to the venue to see the performance in a true-to-life situation. The application was installed on 20 devices, allowing all participants to see the same content appear in the same location.

At Home Events were then organized on campus, where a lab was set up with some chairs. Groups of up to 5 students, familiar with each other, were invited to watch the performance on a television.

The intended result, a platform that can play timeline-based content on multiple devices at the same time, was successfully achieved, showing that this concept is feasible with current technology. While the quality was lower than expected, this proved to be a matter of time and budget, and we expect that more extensive projects can elevate the content from a research to a professional level.

Technical Details

This project has been developed using the Nreal Light augmented reality headsets.

Background content was projected on a beamer screen, while stage- and hall-filling content was displayed directly on the head-mounted displays.

Unity3D 2022.1 was used to develop the full project. 

We used the FishNet multiplayer solution for our network communication.
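To give an impression of how timeline playback can be kept aligned across devices, here is a small, hypothetical sketch. It assumes each client (the beamer PC and each AR HMD) drives its own Unity PlayableDirector and periodically resynchronizes it against a server-provided performance clock; GetNetworkPerformanceTime is an assumed placeholder, not a FishNet or project API.

```csharp
using UnityEngine;
using UnityEngine.Playables;

// Hypothetical sketch: nudge the local timeline toward a shared clock,
// snapping only when drift exceeds a tolerance to avoid visible stutter.
public class TimelineSync : MonoBehaviour
{
    [SerializeField] private PlayableDirector director;
    [SerializeField] private float maxDrift = 0.1f; // seconds of drift before resync

    private void Update()
    {
        double target = GetNetworkPerformanceTime();
        if (System.Math.Abs(director.time - target) > maxDrift)
        {
            director.time = target; // hard resync when drifted too far
        }
    }

    private double GetNetworkPerformanceTime()
    {
        // Placeholder: in the real platform this would come from the server,
        // e.g. derived from the networking layer's synchronized time.
        return Time.timeAsDouble;
    }
}
```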

Research Output

The research consisted of 4 separate conditions: an audience watching the performance live, both with and without augmented reality, and small groups in an at-home setting watching the pre-recorded performance on a television, again with and without AR.

Key results of the research showed that the specific AR content and technology used did not increase the willingness to recommend the performance. We did, however, see a high similarity in the impact the performance had on the emotional intensity levels of the live audience.

We also showed the recorded performance in a home environment. Here we did see an increase in willingness to recommend, and similar emotional intensity levels between the versions with and without AR.

We note that the short time-frame, certain technical limitations, and the experimental design have likely impacted the quality of the final performance. Future projects aim to improve on these factors, enabling more extensive research with broader results.

Partners

Project Manager

Wilco Boode
https://pure.buas.nl/en/persons/wilco-boode

Would you like more news on this project? Please contact: