Tracklab and PeopleTracker: a solution for accurate tracking and movement analysis

Schedule: Wednesday 27 August, 16:40-17:00, Kleine Veerzaal.

Presenters: Ben Loke and Nico van der Aa, Noldus InnovationWorks

Whether you study consumer traffic in stores, visitor behavior, or spatial cognition, or you develop interactive systems, you need an end-to-end solution for real-time and offline movement analysis. In this demonstration we show how our movement analysis tool TrackLab™ uses our position tracking tool PeopleTracker™ to provide such a solution.

TrackLab™ is our software tool for the recognition and analysis of spatial behavior and for the design of interactive systems. It allows you to work with any number of subjects, in any spatial context, tracked by any type of positioning system. Although TrackLab supports a wide variety of indoor tracking solutions, including Ubisense™ ultra-wideband sensors and tags, EagleEye™ stereo cameras, the EthoVision® video tracking system, and WiFi tracking, in this demonstration we use our new vision-based PeopleTracker™ system as the tracking solution. The collected data can be visualized, processed, and analyzed, and real-time interactive systems can be built on top of it: events related to zone visits or user-defined movement classes are sent out in real time, so stimuli can be presented or actions triggered based on the location and movement of your test subjects.
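
To give an impression of the kind of zone-based event logic described above, the following is a minimal sketch in Python, not TrackLab's actual API. The Zone class, the coordinates, and the callback are hypothetical and chosen only for illustration.

```python
# Minimal sketch (not TrackLab's actual API): deriving zone enter/leave events
# from a stream of tracked positions. Zone names, coordinates, and the callback
# are hypothetical.

from dataclasses import dataclass

@dataclass
class Zone:
    name: str
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

class ZoneEventDetector:
    """Emits 'enter'/'leave' events when a subject crosses a zone boundary."""

    def __init__(self, zones, on_event):
        self.zones = zones
        self.on_event = on_event          # callback: (subject_id, zone_name, event)
        self.state = {}                   # (subject_id, zone_name) -> currently inside?

    def update(self, subject_id, x, y):
        for zone in self.zones:
            inside = zone.contains(x, y)
            was_inside = self.state.get((subject_id, zone.name), False)
            if inside and not was_inside:
                self.on_event(subject_id, zone.name, "enter")
            elif was_inside and not inside:
                self.on_event(subject_id, zone.name, "leave")
            self.state[(subject_id, zone.name)] = inside

# Example: trigger a stimulus when a visitor enters a 'product display' zone.
detector = ZoneEventDetector(
    zones=[Zone("product display", 2.0, 1.0, 4.0, 3.0)],
    on_event=lambda sid, zone, ev: print(f"subject {sid}: {ev} zone '{zone}'"),
)
detector.update("visitor-1", 1.0, 1.0)   # outside the zone -> no event
detector.update("visitor-1", 3.0, 2.0)   # prints: subject visitor-1: enter zone 'product display'
```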

PeopleTracker™ is our video tracking system that tracks people in a specific area in a non-intrusive way. The tool analyses multiple video streams captured by fixed color cameras with overlapping fields of view that observe approximately the same scene, and extracts people's positions on the ground plane. The main challenge in tracking multiple persons is handling occlusion: even when the scene is not crowded, a person may be only partly visible at times because another person is standing in front of him. Our solution uses multiple calibrated cameras to combine the 2D video streams into 3D representations of the people in the real world, which form the basis for tracking the subjects' positions on the ground plane.
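
To illustrate the general idea of combining calibrated camera views on a common ground plane (not the PeopleTracker algorithm itself), here is a minimal Python sketch. The homographies, detections, and merge radius are made-up stand-ins; a real calibration would supply per-camera image-to-ground mappings.

```python
# Minimal sketch (not the PeopleTracker algorithm): projecting person detections
# from several calibrated cameras onto a common ground plane and fusing them.
# The homographies and detections below are illustrative stand-ins.

import numpy as np

def to_ground_plane(homography: np.ndarray, image_point: tuple) -> np.ndarray:
    """Map an image point (e.g. a person's foot point) to ground-plane coordinates
    using a 3x3 image-to-ground homography obtained from camera calibration."""
    u, v = image_point
    p = homography @ np.array([u, v, 1.0])
    return p[:2] / p[2]                     # normalize homogeneous coordinates

def fuse_detections(ground_points, merge_radius=0.5):
    """Greedily merge ground-plane points that lie within merge_radius of each
    other (in ground-plane units); the same person seen by several cameras
    collapses into one fused estimate."""
    fused = []
    for pt in ground_points:
        for i, existing in enumerate(fused):
            if np.linalg.norm(existing - pt) < merge_radius:
                fused[i] = (existing + pt) / 2.0   # average the two estimates
                break
        else:
            fused.append(pt)
    return fused

# Hypothetical calibration: one homography per camera (identity as a stand-in).
homographies = {"cam_a": np.eye(3), "cam_b": np.eye(3)}

# Hypothetical foot-point detections per camera, in image coordinates.
detections = {"cam_a": [(320.0, 460.0), (510.0, 470.0)], "cam_b": [(320.2, 460.1)]}

ground_points = [
    to_ground_plane(homographies[cam], pt)
    for cam, pts in detections.items()
    for pt in pts
]
print(fuse_detections(ground_points))       # two fused persons on the ground plane
```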

The demonstration includes a full system setup with multiple calibrated cameras for the tracking and a display to show the feedback. The demo is presented in the form of an interactive game, but the applications cover any study of human movement behavior that needs to be carried out non-intrusively. In consumer behavior research, for example, you might want to find the hotspots in a shop or see whether customers react to the presentation of a new product; a minimal sketch of such a hotspot analysis is given below. In the demonstration, we show the main functionalities of TrackLab™ and PeopleTracker™.
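
The sketch below shows one way such a hotspot analysis could be done on exported positions; the data, floor dimensions, and bin size are hypothetical and do not reflect TrackLab's output format.

```python
# Minimal sketch (hypothetical data, not TrackLab's output format): binning
# tracked ground-plane positions into a 2D histogram to reveal "hotspots".

import numpy as np

# Hypothetical track: (x, y) positions in meters collected over a session.
rng = np.random.default_rng(0)
positions = rng.normal(loc=(3.0, 2.0), scale=0.8, size=(1000, 2))

# Divide the floor (here 8 m x 5 m) into 0.5 m bins and count visits per bin.
heatmap, x_edges, y_edges = np.histogram2d(
    positions[:, 0], positions[:, 1],
    bins=[16, 10], range=[[0, 8], [0, 5]],
)

# The bin with the highest count is the busiest spot on the floor.
ix, iy = np.unravel_index(np.argmax(heatmap), heatmap.shape)
print(f"hotspot around x={x_edges[ix]:.1f}-{x_edges[ix+1]:.1f} m, "
      f"y={y_edges[iy]:.1f}-{y_edges[iy+1]:.1f} m")
```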
