Graphical gesture analysis: A behavioral tool for virtual environment design

D. Fass

Augmented Human project, ICN Business School and LORIA/INRIA Lorraine, Nancy, France

Because they are multimodal and aesthetic, virtual reality and augmented reality technologies are natural tools for the design and development of assisted-action and multimodal knowledge-based artefactual environments. Knowledge is gathered from the interactions and dynamics of the individual-environment complex and from motivations. It is an evolutionary, adaptive and integrative physiological process, fundamentally linked to emotions, mnesic processes, perception and action. Designing an artefactual or virtual environment, that is, a sensori-motor knowledge-based environment, therefore consists of making the biological individual and the artefactual physical system consistent with each other. This requires an 'eco-ethological' approach, both for knowledge modelling and for interaction system design.

Humans use multimodal sensori-motor stimuli to interact with their environment, be it natural or artificial (vision, vestibular input, proprioception, hearing, touch, olfaction, taste). When a subject is in a situation of immersive interaction, wearing a head-mounted display and looking at a three-dimensional computer-generated environment, his sensory system is exposed to an unusual pattern of stimuli. This dynamic pattern may strongly influence the subject's balance, postural control, spatial cognition and spatial motor control. Moreover, coherence between artificial information and natural perceptual inputs is essential for the perception of space and for action within it.

If this coherence is absent, perceptual and motor disturbances appear, as well as illusions. These illusions are solutions built by the brain in response to the inconsistency between sensory stimuli and internal processes. The subject's cognitive and sensory-motor abilities may therefore be disturbed if the design of the virtual environment does not take into account the constraints imposed by human sensory and motor integrative physiology.

In this paper, we describe a gesture-based method for evaluating the physiological effects of both the design and the integration of virtual environment structures and functions. By analysing three-dimensional hand movements (the drawing of ellipses), we compare dynamic sensory-motor integration and motor performance in real and virtual environments. We present results of experiments performed in the laboratory, and in weightlessness during parabolic flights, using a virtual and augmented reality system for assisted action.
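The abstract does not detail which kinematic measures are extracted from the ellipse-drawing gestures. As a purely hypothetical illustration of this kind of analysis, the sketch below computes two standard descriptors of a sampled three-dimensional hand trajectory, tangential velocity and path curvature, which can then be compared between real and virtual conditions. NumPy is assumed; the function name and the synthetic gesture are ours, not the authors' protocol.

```python
import numpy as np

def kinematics(points, dt):
    """Tangential speed and path curvature of a sampled 3-D hand trajectory.

    points: (N, 3) array of hand positions sampled at a fixed interval dt.
    Returns two length-N arrays: speed and curvature.
    """
    v = np.gradient(points, dt, axis=0)   # velocity by finite differences
    a = np.gradient(v, dt, axis=0)        # acceleration
    speed = np.linalg.norm(v, axis=1)
    # Curvature of a space curve: |v x a| / |v|^3 (guard against zero speed).
    curvature = np.linalg.norm(np.cross(v, a), axis=1) / np.maximum(speed, 1e-9) ** 3
    return speed, curvature

# Synthetic "ellipse drawing" gesture: a 15 cm x 10 cm ellipse traced in a plane.
t = np.linspace(0.0, 2.0 * np.pi, 500)
pts = np.stack([0.15 * np.cos(t), 0.10 * np.sin(t), np.zeros_like(t)], axis=1)
speed, curv = kinematics(pts, dt=t[1] - t[0])
```

On such descriptors, one would expect curvature to peak at the ends of the major axis and dip at the ends of the minor axis; comparing their distributions across real and virtual conditions is one way to quantify a change in sensory-motor performance.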


Paper presented at Measuring Behavior 2005, 5th International Conference on Methods and Techniques in Behavioral Research, 30 August - 2 September 2005, Wageningen, The Netherlands.

© 2005 Noldus Information Technology bv