Tuned collaboration to direct the avatar

The avatar interacts with a performer through a 3D image projected at the rear of space A, in front of the stage director and the audience in space E. Real-time transformations of the avatar's position are performed both by the mocaptor, who wears the mocap suit, and by the manipulactor, who operates handheld controller devices. Scenographic constraints often require a tuned collaboration between the two actors to follow the director's indications concerning the avatar.
The manipulactor moves the avatar forward and backward, laterally to the right or left, and up and down. She rotates the yaw to control the movement direction, and the pitch to set the avatar in a horizontal position. She focuses on two main functions:

  • adjusting the scenic address of the avatar towards the performer in space A, from the point of view of the audience, while respecting perspectival constraints;
  • accompanying the mocaptor in augmented movements that are usually not accessible to a performer, such as floating in the air.
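The manipulactor's transformations can be pictured as a small pose model. The sketch below is illustrative, not our production code: the `AvatarPose` class, field names, and units are assumptions. The key point it shows is that forward/lateral displacement follows the current yaw, so the avatar moves in the direction it faces.

```python
import math
from dataclasses import dataclass

@dataclass
class AvatarPose:
    # Hypothetical pose model: position in scene units, yaw/pitch in radians.
    x: float = 0.0
    y: float = 0.0  # vertical axis (up/down)
    z: float = 0.0
    yaw: float = 0.0
    pitch: float = 0.0

    def translate(self, forward: float, lateral: float, vertical: float) -> None:
        # Project the forward/lateral command onto the horizontal plane,
        # rotated by the current yaw, so "forward" tracks the facing direction.
        self.x += forward * math.cos(self.yaw) - lateral * math.sin(self.yaw)
        self.z += forward * math.sin(self.yaw) + lateral * math.cos(self.yaw)
        self.y += vertical
```

With `pitch` driven independently of the vertical translation, the avatar can be laid flat while still "floating" upward, which is how augmented movements such as floating in the air become reachable.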

Why do we use controller devices?

Instead of a keyboard and mouse, we explore richer interaction design to give the manipulactor the best conditions of control. The gamepad is used to combine complex actions on the avatar, such as moving forward, backward, left, right, up, or down, or turning with yaw or pitch rotations.
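A minimal sketch of how stick axes could be turned into avatar commands follows. The axis layout (left stick for translation, right stick for rotation) and the deadzone value are assumptions for illustration; the real mapping depends on the gamepad model and on our show-control software.

```python
# Hypothetical axis layout; a real mapping depends on the gamepad in use.
DEADZONE = 0.15  # ignore small stick drift around the neutral position

def axes_to_commands(axes):
    """Translate raw stick axes (-1.0..1.0) into named avatar commands.

    axes: (left_x, left_y, right_x, right_y) -- left stick drives
    lateral/forward motion, right stick drives yaw/pitch.
    """
    def dz(v):
        return 0.0 if abs(v) < DEADZONE else v

    lx, ly, rx, ry = (dz(v) for v in axes)
    return {
        "lateral": lx,    # move right (+) / left (-)
        "forward": -ly,   # stick pushed up reads negative, so invert
        "yaw": rx,        # turn the movement direction
        "pitch": -ry,     # tilt towards the horizontal position
    }
```

Because one stick gesture produces several simultaneous command values, the manipulactor can combine, say, a forward glide with a slow yaw in a single hand movement, which is hard to do with discrete key presses.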




The NanoKontrol2 (by Korg), usually used for music, makes it possible to combine multiple triggers.

The NanoKontrol2 is used to navigate quickly through the CueSheet, jumping from one cue to another or resetting the current cue. We can also program other functions, such as setting the 3D lights, changing the size of objects and avatars, etc.
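The NanoKontrol2 sends MIDI control-change (CC) messages, so cue navigation amounts to dispatching on CC numbers. The sketch below is a toy dispatcher under assumptions: the CC numbers, the `CueSheet` class, and the cue names are all hypothetical, since the actual assignments are configured in Korg's editor and in the show-control software.

```python
# Hypothetical CC assignments; real ones are set in the controller editor.
CC_NEXT_CUE, CC_PREV_CUE, CC_RESET_CUE = 43, 44, 45

class CueSheet:
    """Toy cue sheet driven by MIDI control-change messages."""

    def __init__(self, cues):
        self.cues = list(cues)
        self.index = 0

    def on_control_change(self, cc, value):
        # Buttons send value 127 on press and 0 on release; ignore releases.
        if value == 0:
            return None
        if cc == CC_NEXT_CUE:
            self.index = min(self.index + 1, len(self.cues) - 1)
        elif cc == CC_PREV_CUE:
            self.index = max(self.index - 1, 0)
        elif cc == CC_RESET_CUE:
            pass  # re-fire the current cue without moving
        return self.cues[self.index]  # the cue to (re)fire
```

In practice a MIDI library would feed `on_control_change` from the controller's message stream; the same dispatch pattern extends to faders setting 3D lights or scaling objects and avatars.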