Machine learning methods for motion tracking have transformed a wide range of scientific disciplines—from neuroscience and biomechanics to conservation and ethology. Tools such as DeepLabCut (Mathis et al. 2018) and SLEAP (Pereira et al. 2022) enable researchers to track animal movements in video recordings with impressive accuracy, without the need for physical markers.
However, the variety of available tools can be overwhelming (Luxem et al. 2023), and it's often unclear which tool is best suited to a given application, or how to get started. We'll provide an overview of the approaches used for quantifying animal behaviour, then narrow down to Computer Vision (CV) methods for detecting and tracking animals in videos.
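To make the markerless-tracking workflow concrete, below is a rough sketch of how a single-animal video might be analysed with DeepLabCut's Python API. The project name, experimenter, and video path are placeholders, and the exact steps (how many frames to label, how long to train) depend on your data, so treat this as an illustration rather than a recipe.

```python
import deeplabcut

# Placeholder video path: replace with your own recordings.
videos = ["/data/mouse_openfield.mp4"]

# 1. Create a project; this returns the path to its config.yaml.
config_path = deeplabcut.create_new_project(
    "openfield", "my_lab", videos, copy_videos=True
)

# 2. Extract and manually label a subset of frames (opens a GUI),
#    then build a training dataset and train the network.
deeplabcut.extract_frames(config_path)
deeplabcut.label_frames(config_path)
deeplabcut.create_training_dataset(config_path)
deeplabcut.train_network(config_path)

# 3. Run the trained network on videos: keypoint coordinates and
#    confidence scores are saved alongside each video file.
deeplabcut.analyze_videos(config_path, videos)
deeplabcut.create_labeled_video(config_path, videos)
```

SLEAP follows a similar label-train-predict loop, with additional support for tracking multiple interacting animals.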
Luxem, Kevin, Jennifer J. Sun, Sean P. Bradley, Keerthi Krishnan, Eric Yttri, Jan Zimmermann, Talmo D. Pereira, and Mark Laubach. 2023. “Open-Source Tools for Behavioral Video Analysis: Setup, Methods, and Best Practices.” Edited by Denise J. Cai and Laura L. Colgin. eLife 12 (March): e79305. https://doi.org/10.7554/eLife.79305.

Mathis, Alexander, Pranav Mamidanna, Kevin M. Cury, Taiga Abe, Venkatesh N. Murthy, Mackenzie Weygandt Mathis, and Matthias Bethge. 2018. “DeepLabCut: Markerless Pose Estimation of User-Defined Body Parts with Deep Learning.” Nature Neuroscience 21 (9): 1281–89. https://doi.org/10.1038/s41593-018-0209-y.

Pereira, Talmo D., Nathaniel Tabris, Arie Matsliah, David M. Turner, Junyu Li, Shruthi Ravindranath, Eleni S. Papadoyannis, et al. 2022. “SLEAP: A Deep Learning System for Multi-Animal Pose Tracking.” Nature Methods 19 (4): 486–95. https://doi.org/10.1038/s41592-022-01426-1.