Prof. Irene Mittelberg, Ph.D.
Building 6070 | Room 405
Phone: +49 241 80 25486
Building 6070 | Room 404
Phone: +49 241 80 25485
Fax: +49 241 80 22493
In 2010, the Natural Media group established one of the few motion capture labs specifically configured for gesture and multimodal communication research. It houses a state-of-the-art optical Vicon Motion Capture (MoCap) system combined with high-speed and standard video cameras. MoCap overcomes the limits of traditional video recording by visualizing movement trajectories and measuring the positions of articulators in 3D space. Group members have developed hand model prototypes suitable for gesture research and are developing automated annotation tools (ELAN Plugin for Automated Annotation, EPAA) for specific hand configurations, motion patterns, velocities, and distances between gesturing hands or between the gesturer’s torso and a given freely moving articulator. Development of the first 3D models of gesture space, to be shared with the gesture and sign language research communities, is also underway.
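The kinds of measures the annotation tools target, per-frame velocities and distances between articulators, can be computed directly from 3D marker trajectories. The following is a minimal sketch of that idea, not the EPAA implementation itself; the function names, frame rate, and toy trajectories are illustrative assumptions.

```python
import numpy as np

def marker_velocity(positions, fps):
    """Per-frame speed (mm/s) of one marker from an (n_frames, 3) trajectory."""
    # Finite differences between consecutive frames, scaled by the frame rate.
    deltas = np.diff(positions, axis=0)          # (n_frames - 1, 3)
    return np.linalg.norm(deltas, axis=1) * fps  # mm per second

def inter_marker_distance(a, b):
    """Frame-wise Euclidean distance between two marker trajectories."""
    return np.linalg.norm(a - b, axis=1)

# Toy example (assumed values): a hand marker moving 10 mm per frame
# along x, recorded at 100 fps, measured against a static torso marker.
hand = np.array([[i * 10.0, 0.0, 0.0] for i in range(5)])
torso = np.zeros((5, 3))

speeds = marker_velocity(hand, fps=100)     # constant 1000 mm/s
dists = inter_marker_distance(hand, torso)  # 0, 10, 20, 30, 40 mm
```

Thresholding such velocity and distance series is one straightforward way an automated tool can propose annotation boundaries, e.g. marking spans where hand speed exceeds a rest-position baseline.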
The Natural Media Lab houses a Vicon MX series marker-based infrared motion capture system, consisting of:
Based on data from semi-structured interviews conducted in the Natural Media Lab, a prototype of a new method for mapping and visualizing speakers' three-dimensional gesture spaces was developed. Taking a pragmatic approach, the locations in which the gestures were produced were correlated with their respective functions (referential, modal, interactive, etc.).
The method developed in this thesis subdivides the 3D space available for gestural communication into minimal segments and measures the gesture density for each segment over the course of a conversation. The resulting data are visualized in two or three dimensions (see illustrations), which allows us to study the spatial and temporal structure of gesture spaces with unprecedented precision. The results offer new insights into natural gesture production in terms of both the influence of communicative function on gesture location and individual communication styles.
Three abstract paintings by Paul Klee were presented to participants, who were subsequently asked to describe the paintings from memory and to talk about the meaning and message of the images. The illustration shows Klee’s “Tightrope Walker” and MoCap traces of a participant drawing the picture’s structure into the gesture space in front of her during her description of the image.
In winter term 2011/12, the Natural Media Lab hosted an Architecture seminar by the Lehrstuhl für Bildnerische Gestaltung (Prof. Thomas Schmitz, Hannah Groninger) on visualizing motion. The students pursued their own artistic projects using the MoCap system, e.g. creating three-dimensional sketches through gestures. The illustration shows a sketch of Aachen Cathedral, created by Paula Basilio using a pencil with a MoCap marker attached.
This study followed up on ethnographic work done by Simon Harrison in factory surroundings, which indicated an influence of environmental conditions (e.g. obstacles, noise) on gesture production. In the lab, a production-line situation was simulated on a computer screen, which participants had to monitor. Upon observing deviations from the norm, participants had to communicate them using predefined gestures. The experimental conditions included the presence vs. absence of factory noise. The MoCap data are currently being analyzed to determine the influence of noise on gesture size.
The Natural Media Team is grateful to HumTec and to Prof. Jarke (Informatik 5) for the generous support of the lab, and to Prof. Binkofski (University Hospital Aachen) for lending us four infrared cameras.