Live performances have long made use of technology: audio, video, and light. With new innovative technologies we can expand this use and add a layer of digital reality, blended with the physical performance. In the TheARter project we set out to explore some of the opportunities offered by augmenting a 10-minute contemporary dance performance with non-interactive 3D visualizations.
Immersive Live Performance
October 2022 – December 2022
Cradle’s main responsibility was the design and development of the digital content. While the dancers created the choreography, we worked on three core challenges:
Developing Content – We wanted our content to provide a sense of immersion and context. Together with 4DR Studios and the dancers we designed three environments for the different dance segments: a background environment for the beamer projection showing the world, and foreground content consisting of 3D models and visual effects. The visual effects ranged from falling leaves and materializing objects to abstract figures guiding the dancers in their motion.
Syncing Content – Playing all this content simultaneously required a custom platform solution to connect and synchronize the content between a server and the audience's devices. We used FishNet to create a network layer that could communicate with, and send event requests to, the background environment as well as the AR HMDs. This gave us the opportunity to connect, control, and monitor all users simultaneously from any device with a basic browser.
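FishNet itself is a C#/Unity networking library; the underlying pattern we relied on is simpler than the tooling suggests: the server holds a cue timeline keyed to a shared start time, and every connected device fires the same cue when its trigger time passes. A minimal Python sketch of that pattern (all names here are illustrative, not taken from our codebase):

```python
import time
from dataclasses import dataclass, field

@dataclass(order=True)
class Cue:
    at: float                       # seconds from performance start
    name: str = field(compare=False)  # effect to trigger, e.g. "leaves"

class Timeline:
    """Server-side cue list: each cue is broadcast exactly once, so all
    connected devices trigger the same effect at (roughly) the same moment."""
    def __init__(self, cues):
        self.cues = sorted(cues)    # ordered by trigger time
        self.start = None

    def begin(self, now=None):
        """Mark the shared start of the performance."""
        self.start = time.monotonic() if now is None else now

    def due(self, now=None):
        """Return cues whose trigger time has passed, removing them
        from the pending list so each fires only once."""
        now = time.monotonic() if now is None else now
        elapsed = now - self.start
        fired = [c for c in self.cues if c.at <= elapsed]
        self.cues = [c for c in self.cues if c.at > elapsed]
        return fired
```

In the real system the server polls `due()` each tick and forwards the fired cue names over the network, where each client maps them to visual effects.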
Spatial Localization – The last major challenge was the spatial placement of the AR content. We decided to model the 3D space where the content would be shown, and to design all content specifically around that space. On-site we used image- and marker-based position tracking, which allowed us to place the content inside the physical space by offsetting the center of the room in the digital environment. Although low-light conditions caused some drift of the content, the results were relatively stable and manageable within the duration of the performance.
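The placement logic boils down to a translation: content is authored relative to the modeled room's origin, and the detected marker tells each headset where that origin sits in its own tracking space. A minimal, translation-only Python sketch (our Unity implementation also aligned rotation; these function names are hypothetical):

```python
def room_offset(marker_world, marker_model):
    """Offset mapping model space into the headset's world space, given
    the marker's detected world position and its known position in the
    3D model of the venue (translation only)."""
    return tuple(w - m for w, m in zip(marker_world, marker_model))

def place(content_model_pos, offset):
    """Final world position for a piece of content authored in model space."""
    return tuple(c + o for c, o in zip(content_model_pos, offset))
```

For example, if the marker sits at the model origin but is detected two meters to the right and three forward, every piece of content is shifted by that same offset, so all headsets agree on where it appears.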
Augmented Reality – We integrated support for the Nreal Light AR HMDs. These glasses are connected to a smartphone and allow the user to see their physical surroundings through transparent lenses, with “holograms” projected on top of the real world. The glasses come with their own SDK (NRSDK), allowing us to use built-in image tracking, spatial tracking, and app controls to speed up the development process.
The project resulted in the successful projection of AR content in five settings: two live events and three at-home sessions.
Live Events were organized in collaboration with the Effenaar, a pop venue in Eindhoven. During these sessions, the participants traveled to the venue to see the performance in a true-to-life situation. The application was installed on 20 devices, allowing all participants to see the same content appear in the same location.
At-Home Events were then simulated on campus, where a lab was set up with some chairs. Groups of up to five students, familiar with each other, were invited to watch the recorded performance on a television.
The goal, a platform that can play timeline-based content on multiple devices at the same time, was successfully achieved, showing that this concept is feasible with current technology. While the content quality was lower than expected, this proved to be a matter of time and budget, and we expect that more extensive projects can elevate the content from a research to a professional level.
The research consisted of four separate conditions: an audience watching the performance live, both with and without augmented reality, and small groups in an at-home setting watching the pre-recorded version, with and without AR, on a television.
Key results of the research showed that the specific AR content and technology used did not increase the willingness to recommend the performance. We did, however, see a high similarity between the two live conditions in the emotional intensity the performance evoked in the audience.
For the recorded performance shown in a home environment, we did see an increase in recommendation, along with similar emotional intensity levels between the versions with and without AR.
We concluded that the short time frame, certain technical limitations, and the experimental design have likely impacted the quality of the final performance. Future projects aim to improve on these factors, enabling more extensive research with broader results.
This project has been developed using the Nreal Light augmented reality headsets.
Background content was projected on a beamer screen, while stage- and hall-filling content was rendered directly on the head-mounted display.
Unity3D 2022.1 was used to develop the full project.
We used the FishNet multiplayer solution for our network communication.
Breda University of Applied Sciences
Effenaar (Smart Venues)
Would you like more information on the work Cradle did for this project, or about the other activities of the Cradle Research Lab? Please contact:
Would you like more information on this project in general? Please contact: Wilco Boode – Boode.W@buas.nl