SPECIAL INTEREST GROUP
Tools and techniques for the study of multimodal communication:
speech, gesture and facial expression
Organizers: Els den Os (Max Planck Institute for Psycholinguistics, Nijmegen,
The Netherlands) and Niels Cadée (Noldus Information Technology bv, Wageningen, The Netherlands)
The analysis and evaluation of multimodal communication is gaining importance
in language research and in related industrial sectors. This SIG focused on
tools and techniques for annotating and analyzing multimodal communicative
behavior, both between humans and between humans and machines (systems).
Multimodal communication combines speech with, for example, gestures,
facial expression, body posture, or gaze. A wide range of tools was
demonstrated. We expect that a broad range of industrial sectors
can benefit from understanding how humans interact with each other and
with machines: spoken dialogue systems and multimodal human-computer
interfaces, animation, communication technologies, and language documentation.
Aim of the meeting
- Show the state of the art by means of a series of short talks and software demonstrations. All speakers are involved in the
development of tools for the study of multimodal communication.
- Allow conference participants to discuss the tools with their developers. Feedback from potential users (language and
communication researchers, but also, for example, psychologists interested in non-verbal behavior) is especially welcome.
Program
- Els den Os / Niels Cadée. Introduction.
- Jan Peter de Ruiter, Louis Vuurpijl & Willem Levelt (Max Planck
Institute for Psycholinguistics, Nijmegen, The Netherlands). SLOT
(Spatial Logistics Planning Task), an experimental platform for studying
multimodal communication.
- Mykola Kolodnytsky, Laila Dybkjaer & Niels Ole Bernsen (Natural
Interactive Systems Laboratory, Odense University, Denmark). The
visual interface for the NITE Workbench, a tool for annotation of
natural interactivity and multimodal data.
- Niels Cadée (Noldus Information Technology bv, Wageningen, The Netherlands).
Using The Observer as a tool for annotation
of multimodal behavior.
- Peter Wittenburg (Max Planck Institute for Psycholinguistics, Nijmegen,
The Netherlands). EUDICO: a general
tool set for annotating and exploiting multimedia signals.
- Jean-Claude Martin (LIMSI, Orsay, France). Measuring
cooperation between modalities in human multimodal behavior.
- Ulrich Heid & Holger Voormann (Institute for Natural Language Processing
(IMS), Stuttgart University, Germany). Querying
multimodal corpora represented in XML.
Last updated: 30 December 2002