Symposium: Automatic behavior recognition in rodents: how new technology moves the field forward
Organizers: Prof. Lucas Noldus, Department of Biophysics, Radboud University Nijmegen and Noldus Information Technology, and Elsbeth A. van Dam, Department of Artificial Intelligence, Donders Institute for Brain, Cognition and Behavior, Radboud University, Nijmegen and Noldus Information Technology.
Schedule: Wednesday 18th May, 12:35-15:15 CET, Virtual Room 2
12:35-12:45 Introduction.
Lucas Noldus. Radboud University Nijmegen and Noldus Information Technology, The Netherlands.
12:45-13:00 uBAM: Unsupervised Behavior Analysis and Magnification using Deep Learning.
Björn Ommer. University of Munich, Germany.
We present uBAM, a novel, automatic deep learning algorithm for behavior analysis by discovering and magnifying deviations. Our unsupervised learning of posture & behavior representations enables an objective behavior comparison even across subjects.
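uBAM itself is not reproduced here; as a loose, hedged stand-in for the idea of comparing subjects in a shared learned posture space, the sketch below replaces the learned encoder with PCA and scores each frame of a test subject by its deviation from a reference repertoire. All data, dimensions and names are simulated assumptions.

```python
# Illustrative sketch only: uBAM's learned encoder is replaced by PCA
# as a stand-in, so the cross-subject comparison idea is runnable.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Hypothetical data: per-frame posture vectors (e.g., flattened keypoints)
# for a reference subject and a test subject.
reference = rng.normal(0.0, 1.0, size=(5000, 24))
test = rng.normal(0.3, 1.2, size=(5000, 24))

# Fit a shared posture embedding on the reference repertoire.
embed = PCA(n_components=8).fit(reference)

# Project both subjects into the same space and score each test frame by
# its z-scored distance from the reference distribution.
ref_z = embed.transform(reference)
test_z = embed.transform(test)
mu, sigma = ref_z.mean(axis=0), ref_z.std(axis=0)
deviation = np.linalg.norm((test_z - mu) / sigma, axis=1)

# Frames with the largest deviation scores are candidates for "magnification".
print("mean deviation:", deviation.mean())
print("most deviant frames:", np.argsort(deviation)[-5:])
```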
13:00-13:15 Self-supervised learning as a gateway to reveal underlying dynamics in animal behavior.
Kevin Luxem. Leibniz Institute for Neurobiology, Magdeburg, Germany.
The fields of neuroscience and computational ethology need robust methods to quantify animal actions. By leveraging recent advances of self-supervised learning, we propose a model to learn spatiotemporal behavioral embedding from video recordings.
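The presented model is not reproduced here; as a generic illustration of learning a spatiotemporal behavioral embedding from video-derived sequences, the sketch below trains a small recurrent autoencoder (PyTorch) on simulated pose snippets. Shapes, names and hyperparameters are assumptions.

```python
# A minimal sketch (not the presented model): a recurrent autoencoder that
# learns a spatiotemporal embedding by reconstructing short behavior snippets.
import torch
import torch.nn as nn

class SnippetAutoencoder(nn.Module):
    def __init__(self, n_features=16, latent_dim=8, hidden=64):
        super().__init__()
        self.encoder = nn.GRU(n_features, hidden, batch_first=True)
        self.to_latent = nn.Linear(hidden, latent_dim)
        self.from_latent = nn.Linear(latent_dim, hidden)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_features)

    def forward(self, x):                      # x: (batch, time, features)
        _, h = self.encoder(x)                 # h: (1, batch, hidden)
        z = self.to_latent(h[-1])              # behavioral embedding per snippet
        seed = self.from_latent(z).unsqueeze(1).repeat(1, x.size(1), 1)
        y, _ = self.decoder(seed)
        return self.out(y), z

model = SnippetAutoencoder()
snippets = torch.randn(32, 30, 16)             # 32 snippets, 30 frames, 16 pose features
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(5):                             # toy training loop
    recon, z = model(snippets)
    loss = nn.functional.mse_loss(recon, snippets)
    opt.zero_grad(); loss.backward(); opt.step()
print("embedding shape:", z.shape)             # (32, 8): one vector per snippet
```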
13:15-13:30 Deep learning systems for automated rodent behavior recognition suffer from observer bias: Time to raise the bar.
Elsbeth van Dam & Lucas Noldus, Radboud University Nijmegen and Noldus Information Technology, The Netherlands.
M.A.J. van Gerven, Radboud University Nijmegen, The Netherlands.
In order to be useful in behavioral research, automated systems that recognize high-level behavioral activities must do so independent of animal genetic background, drug treatment or laboratory setup. However, just like human observers, deep learning systems suffer from observer bias. Deep learning models are very good at finding informative cues, but this also makes them prone to exploiting biased cues that only apply within the training dataset. Behavior recognition beyond body-point tracking and pose estimation needs to deal with three sources of variation: appearance differences, behavior style differences and behavior sequence differences.
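As a hedged illustration of how such bias can be exposed (our sketch, not the authors' protocol), one can evaluate a behavior classifier with leave-one-laboratory-out splits, so that test data always come from a setup the model never saw during training. All data below are simulated.

```python
# Leave-one-lab-out evaluation: a large gap between these scores and
# ordinary shuffled-split accuracy suggests the model relies on
# lab-specific cues rather than on the behavior itself.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(600, 20))          # hypothetical per-frame features
y = rng.integers(0, 4, size=600)        # behavior labels (e.g., groom, rear, ...)
labs = np.repeat([0, 1, 2], 200)        # which laboratory recorded each frame

scores = cross_val_score(RandomForestClassifier(n_estimators=100),
                         X, y, groups=labs, cv=LeaveOneGroupOut())
print("per-held-out-lab accuracy:", scores)
```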
13:30-13:45 Learning to embed lifetime social behavior from interaction dynamics.
Benjamin Wild. Freie Universität Berlin, Germany.
We propose a new temporal matrix factorization model that jointly learns the average developmental path and structured variations of individuals in the social networks of multiple generations of individually-marked honey bees.
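The presented model is not reproduced here; the sketch below only illustrates the general idea of temporal matrix factorization with a shared developmental path plus structured individual offsets, fitted by plain gradient descent on simulated interaction counts. All names and numbers are assumptions.

```python
# Loose sketch of the idea (not the presented model): factorize per-day
# interaction counts so each bee's factor is a shared age-dependent
# developmental path plus a small individual offset.
import numpy as np

rng = np.random.default_rng(2)
n_bees, n_days, k = 50, 30, 3
counts = rng.poisson(2.0, size=(n_days, n_bees, n_bees)).astype(float)
counts = (counts + counts.transpose(0, 2, 1)) / 2   # symmetrize interactions

path = rng.normal(size=(n_days, k)) * 0.1    # shared developmental path per day
offset = rng.normal(size=(n_bees, k)) * 0.1  # structured individual variation
lr = 1e-3
for _ in range(200):                         # plain gradient descent on squared error
    F = path[:, None, :] + offset[None, :, :]       # (days, bees, k) factors
    pred = np.einsum('dik,djk->dij', F, F)          # symmetric low-rank reconstruction
    err = pred - counts
    grad_F = 4 * np.einsum('dij,djk->dik', err, F)  # symmetry doubles the gradient
    path -= lr * grad_F.mean(axis=1)                # averaged (rescaled) updates
    offset -= lr * grad_F.mean(axis=0)
print("reconstruction MSE:", float((err ** 2).mean()))
```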
13:45-14:00 Live Mouse Tracker 2022: new animals, new features, new limits.
Fabrice de Chaumont. Institut Pasteur, Paris, France.
Live Mouse Tracker (LMT) has been created by a group of behaviorists, engineers and researchers to study the social interactions of mice. We present here the new capabilities of Live Mouse Tracker for 2022.
14:00-14:15 Break
14:15-14:30 Multi-animal pose estimation, identification, tracking and action segmentation with DeepLabCut.
Alexander Mathis. EPFL, Switzerland.
I will discuss the problems of pose estimation and action segmentation.
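For readers new to the tool, a typical multi-animal DeepLabCut workflow looks roughly like the sketch below. Project names, paths and videos are placeholders, and the DeepLabCut documentation remains the authoritative reference for the exact API.

```python
# Sketch of a typical multi-animal DeepLabCut workflow; all paths and
# names are placeholders.
import deeplabcut

config = deeplabcut.create_new_project(
    "social-mice", "lab", ["/data/videos/cage1.mp4"],
    multianimal=True)                              # enable multi-animal mode

deeplabcut.extract_frames(config)                  # select frames to annotate
deeplabcut.label_frames(config)                    # GUI: label body points per animal
deeplabcut.create_multianimaltraining_dataset(config)
deeplabcut.train_network(config)
deeplabcut.analyze_videos(config, ["/data/videos/cage2.mp4"])  # pose + tracking
```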
14:30-14:45 Measuring Social Behavior from Video and Trajectory Data of Interacting Mice.
Jennifer Sun. Caltech, Pasadena, USA.
We present our methods and a dataset (CalMS21) on measuring social behavior, with the goal of improving the efficiency and reproducibility of behavior analysis.
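The CalMS21 pipeline is not reproduced here; as a minimal illustration of deriving social features from trajectory data, the sketch below computes inter-animal distance, speed and approach events from two simulated tracks. Units and frame rate are assumptions.

```python
# Illustrative feature extraction (not the CalMS21 pipeline): simple
# social features from two tracked centroid trajectories.
import numpy as np

rng = np.random.default_rng(3)
fps = 30
resident = np.cumsum(rng.normal(0, 0.5, size=(900, 2)), axis=0)  # (frames, xy)
intruder = np.cumsum(rng.normal(0, 0.5, size=(900, 2)), axis=0)

distance = np.linalg.norm(resident - intruder, axis=1)           # inter-animal distance
speed_res = np.linalg.norm(np.diff(resident, axis=0), axis=1) * fps
approach = np.diff(distance) < 0                                 # closing in on the other

print(f"mean distance: {distance.mean():.1f} px")
print(f"fraction of frames approaching: {approach.mean():.2f}")
```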
14:45-15:00 Deep learning approaches to study rodent behavior.
Heike Schauerte and Florian Montel. Boehringer Ingelheim Pharma, Germany.
A new open innovation funding opportunity for data scientists: how would you propose to detect subtle patterns and changes in rodent behavior in standardized video recordings with innovative machine learning and computer vision approaches? Learn more from Boehringer Ingelheim’s opnMe team and apply!
15:00-15:15 Discussion
Description:
Animal models can provide unique insights into the functioning of the brain and the underlying biology of conditions that affect humans. Over the years, behavioral protocols have been developed to parameterize a wide range of behavior patterns in rodents. Many of these metrics do not capture discrete animal behaviors, but are generalized parameters derived from simple locomotor measurements, including location, distance travelled and velocity of movement. While insightful, these measurements do not represent the broad range of behaviors an animal may demonstrate during a behavioral test. Experimental end-points can be enriched by incorporating the ethologically relevant behavioral repertoire that rodents express. For individually housed animals, software tools exist that can infer high-level behaviors such as grooming, rearing, sniffing and digging. These behaviors have typically been recognized by classifying carefully designed features, resulting in fast and robust solutions. In recent years, segmenting temporal sequences into behavioral ‘syllables’ has supported machine learning-based prediction of treatments from behavior in an open field. Recent advances in deep learning for image processing have boosted the field to new heights, especially with the publication of powerful tools for the estimation of body points. These developments help shift the focus towards analyzing discrete animal behaviors with distinct relationships to biological variables of interest.
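For concreteness, the simple locomotor measurements mentioned above (distance travelled, velocity) reduce to a few lines of array arithmetic over a tracked centroid; the track below is simulated and the units are assumed.

```python
# Minimal sketch of the classic locomotor metrics, computed from a tracked
# centroid; positions are hypothetical and assumed to be in cm.
import numpy as np

fps = 25
track = np.cumsum(np.random.default_rng(4).normal(0, 0.2, size=(1500, 2)), axis=0)

step = np.linalg.norm(np.diff(track, axis=0), axis=1)  # per-frame displacement
distance_travelled = step.sum()                        # total path length
velocity = step * fps                                  # instantaneous speed (cm/s)

print(f"distance travelled: {distance_travelled:.1f} cm")
print(f"mean velocity: {velocity.mean():.2f} cm/s")
```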
While increasingly sophisticated deep learning models promise a bright future for automated behavior recognition, two major challenges remain: generalization of the tools, and their ability to recognize relevant behaviors. To be useful in behavioral research, automated systems that recognize high-level behavioral activities must do so independent of animal genetic background, treatment or laboratory setup. However, just like human observers, deep learning systems suffer from observer bias. Ideally, automated systems should recognize behaviors without fine-tuning by the operator and be able to deal with differences in appearance (animal fur color, image resolution, etc.) and behavioral style (temporal dynamics, sequential structure, speed of movement, amplitude of limb movements, etc.). Understanding the composition of behavior by identifying the syllables and grammar of which it consists can be a useful intermediate step, as sketched below. If out-of-the-box solutions are not feasible, we need efficient ways to browse and annotate behavioral recordings. Progress is facilitated by the availability of open deep learning frameworks, open source libraries, and open challenge datasets in the animal domain.
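As a hedged illustration of the ‘syllables and grammar’ idea (a generic sketch, not any specific published method), one can cluster short windows of pose features into discrete syllables and tabulate their transition statistics; all data below are simulated.

```python
# Approximate behavioral "syllables" by clustering short windows of pose
# features, then count syllable-to-syllable transitions (the "grammar").
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
features = rng.normal(size=(3000, 12))                # per-frame pose features
win = 15                                              # 0.5 s windows at 30 fps
windows = np.stack([features[i:i + win].ravel()
                    for i in range(0, len(features) - win, win)])

syllables = KMeans(n_clusters=8, n_init=10).fit_predict(windows)

# Empirical transition counts between consecutive syllables.
grammar = np.zeros((8, 8), dtype=int)
for a, b in zip(syllables[:-1], syllables[1:]):
    grammar[a, b] += 1
print(grammar)
```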