Recording and motion capturing an African djembe trio and a dancer
The HoloDeck computing network aims to prototype the digital collaborative interactions of the future. By setting up an audiovisual network of data exchange between calibrated rooms, we can do today what will be possible 10 years from now with augmented and mixed reality technologies. This lets us study the social impact of remote digital immersive interaction across a vast number of applications.
One of the many applications of interest to us is music. How will remote music performance improve in the future? How can VR and AR be used to give a realistic impression to musicians and audiences alike? Will we ever achieve an immersive digital experience comparable to real-life collaborative performance? In other words… can we get people to create music together remotely and believe it is real? These are the questions we are approaching by putting together this first iteration of our VR demo experience.
In order to collect material for our experiments, we recorded the sound and motion capture of an African percussion trio plus a dancer (all members of the NYU percussion program). This material can later be used to build digital avatars that represent these musicians… sort of like video-game characters. These avatars will eventually provide a canvas in VR for testing the response of additional musicians, dancers, and audience members. In these videos, we show the recording setup for 2 of the 3 drummers, as well as the dancer capture.
We used OptiTrack's Motive software for the motion capture and Pro Tools for recording the sound. We tried to capture the different sound areas of the djembe as isolated as possible using spot mics, in order to create audio objects that are easy to manipulate in a VR implementation. The motion-tracking suits did require a little posture adaptation for the performers' drumming technique, but the results were good nonetheless. For the dancer, we simply played back the lead-drum recording and mocapped the movements; in a future version, we will also try to capture the sound of the footsteps.
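To make the idea of an "audio object" concrete: each isolated spot-mic signal can be paired with a 3D position (say, the djembe's skin versus its bass port) so a VR engine can re-spatialize it per listener. Here is a minimal Python sketch of that data structure with simple inverse-distance attenuation; the class, names, and attenuation model are illustrative assumptions, not our actual pipeline.

```python
import math

class AudioObject:
    """A mono spot-mic signal anchored at a 3D position in the scene."""
    def __init__(self, name, position, samples):
        self.name = name
        self.position = position  # (x, y, z) in meters
        self.samples = samples    # mono samples, floats in [-1, 1]

def distance_gain(obj, listener_pos, ref_dist=1.0):
    """Inverse-distance attenuation relative to a reference distance."""
    d = math.dist(obj.position, listener_pos)
    return ref_dist / max(d, ref_dist)

# Example: the djembe's slap zone, placed 2 m in front of the listener,
# so its signal is attenuated by a factor of 0.5 at the listener.
slap = AudioObject("djembe_slap", (0.0, 0.0, 2.0), [0.5, -0.5, 0.25])
g = distance_gain(slap, (0.0, 0.0, 0.0))
attenuated = [s * g for s in slap.samples]
```

In a real implementation the gain (and panning, reverb send, etc.) would be recomputed every frame as the listener moves, which is exactly why keeping the djembe's sound areas isolated at recording time matters.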
Click here for the 2nd part
Thanks also to:
Marta Gospodarek & Sripathi Sridhar
Christopher Allen O’Leary, Max Meyer & Jared Shaw