The theatrical situation we face confronts 3D digital space with the physical stage:
- Space A: physical stage
- Space B: 3D digital scenery, projected on a 2D screen
- Space C: mocaptor space
- Space D: space where the digital artist and operator work on the 3D avatars and scenery
- Space E: audience and stage director space
Multiple configurations of interrelation between the physical and 3D digital stages are possible. Mocaptors may act in corridors beside the stage, or may also act as performers in front of the screen.
The avatar is controlled in real time by two performers:
- the mocaptor uses a low-cost motion capture device to control the body parts
- the manipulactor uses a keyboard, MIDI or gamepad controller to partially control some movements
To address positioning issues, the manipulactor can change the location and the rotation of the avatar, helping the performer adjust his relationships with the audience, or with his physical and virtual partners.
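The division of labour above can be sketched in code: the mocaptor's device drives the avatar's skeleton, while the manipulactor applies a root offset (translation and rotation) on top of it. The following minimal Python sketch shows one way to combine the two inputs; the `Transform` type and the blending function are illustrative assumptions, not part of the AvatarStaging implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class Transform:
    """A simplified 2D root transform: position on the stage plane plus a yaw angle."""
    x: float = 0.0
    y: float = 0.0
    yaw: float = 0.0  # degrees

def apply_manipulactor_offset(mocap_root: Transform, offset: Transform) -> Transform:
    """Combine the mocaptor's captured root transform with the manipulactor's
    offset: rotate the captured position by the offset yaw, then translate.
    This lets the operator reposition or reorient the avatar without
    disturbing the captured body movement."""
    rad = math.radians(offset.yaw)
    cos_a, sin_a = math.cos(rad), math.sin(rad)
    return Transform(
        x=offset.x + mocap_root.x * cos_a - mocap_root.y * sin_a,
        y=offset.y + mocap_root.x * sin_a + mocap_root.y * cos_a,
        yaw=(mocap_root.yaw + offset.yaw) % 360.0,
    )
```

In practice such an offset would be updated every frame from the keyboard, MIDI or gamepad input, while the skeleton pose keeps streaming from the motion capture device.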
For the moment, AvatarStaging focuses on mixed reality issues concerning the relationships between physical and digital actors and spaces. Nevertheless, all the results on achieving the best presence quality in the 3D world will be reused in future immersive developments.
We focus on low-cost motion capture systems such as Perception Neuron by Noitom, Smartsuit Pro by Rokoko, or Kinect by Microsoft. Combined with the Unreal Engine 4 game engine, they allow performances to be created in 3D space in real time.