University of Pennsylvania
Location: Barchi Library (140 John Morgan Building) and via Zoom
Multi-view pose estimation and tracking of cowbirds in an outdoor aviary
Automated understanding of animal activity is poised to transform many fields in biology. Of particular interest to ecology, biomechanics, behavior, and neuroscience are studies of animals moving through naturalistic 3D environments and interacting in complex social groups. In both settings, we aim to capture movement dynamics and social cues encoded in pose trajectories and shape changes, while retaining identities and gracefully handling the frequent occlusions that occur when animals interact with each other and with objects in the environment. Working with socially gregarious cowbirds as a model species, we develop a method for single-view pose and shape estimation using an articulated 3D mesh model, allowing us to reconstruct a flock of birds interacting in a large aviary with relatively few cameras. Data for many other species of interest, however, exist only as collections of unrelated images. To adapt mesh models to applications where multi-view setups or 3D scans are not available, we introduce a method that disentangles pose and shape to learn a shape basis for a novel species directly from an image collection. By applying our method to a diverse collection of bird species, we form a multi-species shape space and show that the captured shapes reflect the phylogenetic relationships among birds better than learned perceptual features do. Finally, I will discuss our approaches to the challenge of multi-view, multi-animal tracking and re-identification (ReID) when only limited annotations are available.
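To make the "shape basis" idea concrete, here is a minimal, hypothetical sketch of a linear shape space in the style of articulated mesh models (e.g., SMPL-like formulations). This is an illustration, not the speakers' actual model: an instance mesh is expressed as a species template plus a weighted combination of learned basis deformations, with the weights (shape coefficients) disentangled from articulation/pose.

```python
import numpy as np

# Illustrative sketch only: all names and dimensions below are assumptions,
# not the talk's implementation.
rng = np.random.default_rng(0)

n_vertices, n_basis = 500, 8
template = rng.normal(size=(n_vertices, 3))              # mean (template) mesh
shape_basis = rng.normal(size=(n_basis, n_vertices, 3))  # learned deformation directions

def instance_mesh(betas):
    """Deform the template by shape coefficients `betas` (one weight per basis vector)."""
    # Contract the basis axis: (K,) x (K, N, 3) -> (N, 3) per-vertex offsets.
    return template + np.tensordot(betas, shape_basis, axes=1)

# Setting all coefficients to zero recovers the template exactly.
mesh = instance_mesh(np.zeros(n_basis))
```

In a learned multi-species shape space of this form, each species occupies a region of the coefficient space, which is what allows shape similarity to be compared against phylogenetic relationships.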