Recognising the behaviour of chimpanzees: let artificial intelligence do the work!
To study the behaviour of chimpanzees in a non-intrusive fashion, a team of researchers from the University of Neuchâtel and the Idiap Research Institute in Martigny, members of the NCCR Evolving Language, has proposed a new approach based on artificial intelligence (AI) to automatically analyse primate videos. It is a promising step forward for research and species protection, the first results of which have been published in the International Journal of Computer Vision.

Walking chimpanzee. © Adrian Soldati.
A new database
Training AI systems to automate the identification of behaviour in film footage requires large amounts of high-quality data. In the case of primate behaviour, only a few video datasets suitable for this purpose are publicly available. And even when data is accessible, AI models often struggle when they encounter footage from new environments, different from the ones they were trained on – for instance, a model trained on videos recorded in a zoo may perform poorly when tested on videos recorded in the forest. “To tackle this, we created a new public video dataset of chimpanzees filmed at Basel Zoo, called ChimpBehave,” says first author Michael Fuchs.
The ChimpBehave dataset comprises 1362 video segments of chimpanzees, all manually annotated by expert primatologist Emilie Genty. “ChimpBehave is designed not only to teach AI to recognize key chimpanzee behaviours – like walking, climbing, hanging, or swinging – in a zoo setting, but also to test how well that knowledge transfers beyond it,” Fuchs explains.
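To make this concrete, here is a minimal sketch of how a ChimpBehave-style annotation file could be loaded for training. The file name, column names, and exact schema are illustrative assumptions, not the dataset's published format; only the four behaviour labels are taken from the article:

```python
# Hypothetical loader for ChimpBehave-style annotations.
# File name and column names are assumptions for illustration.
import csv
from collections import Counter

# The four behaviours named in the article.
BEHAVIOURS = {"walking", "climbing", "hanging", "swinging"}

def load_annotations(path: str) -> list[dict]:
    """Read one row per annotated video segment: clip file, frame range, label."""
    segments = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["behaviour"] in BEHAVIOURS:
                segments.append({
                    "clip": row["clip"],
                    "start_frame": int(row["start_frame"]),
                    "end_frame": int(row["end_frame"]),
                    "behaviour": row["behaviour"],
                })
    return segments

if __name__ == "__main__":
    segments = load_annotations("chimpbehave_annotations.csv")  # assumed file name
    print(f"{len(segments)} annotated segments")
    print(Counter(s["behaviour"] for s in segments))
```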

Annotated segments of videos from the ChimpBehave dataset. The algorithm is able to decompose movement into a skeletal view (an anatomical representation of the positions of the joints) and accurately determine the behaviour of the ape. © Michael Fuchs and Emilie Genty.
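As an illustration of the skeletal view mentioned in the caption, the following is a minimal sketch of how per-frame joint coordinates might be normalized into a camera-independent skeleton sequence. The joint count, clip length, and normalization scheme are assumptions chosen for clarity; the study's actual pose-estimation pipeline is not reproduced here:

```python
# Sketch: turning per-frame keypoints into a normalized skeletal sequence.
# N_JOINTS, SEQ_LEN, and the normalization are illustrative assumptions.
import numpy as np

N_JOINTS = 17   # assumed number of tracked body landmarks
SEQ_LEN = 64    # assumed fixed clip length in frames

def normalize_skeleton(keypoints: np.ndarray) -> np.ndarray:
    """keypoints: (frames, joints, 2) pixel coordinates -> centred, scale-free."""
    centre = keypoints.mean(axis=1, keepdims=True)   # per-frame body centre
    centred = keypoints - centre                     # remove position in the frame
    scale = np.abs(centred).max() or 1.0             # remove distance/zoom effects
    return centred / scale

# Example with random stand-in data (a real pipeline would use a pose estimator):
raw = np.random.rand(SEQ_LEN, N_JOINTS, 2) * 640.0
print(normalize_skeleton(raw).shape)  # (64, 17, 2)
```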
Training the algorithms
To put their new database to the test, the researchers used ChimpBehave to train five different AI models: two based on the video footage itself, and three based on skeletal movement, i.e. an anatomical representation of the movement happening in the video. The video-based models proved the most accurate at learning the behaviours and transferring this knowledge to more challenging, unseen videos. “Interestingly, we also found that smaller, skeletal-based models – while slightly less accurate – may be more practical in low-resource settings, such as remote field stations, where electricity and computing power are limited,” states Fuchs. “From a practical point of view, they could offer a more ecological and therefore more sustainable solution for large-scale analysis in the field.”
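To illustrate why skeleton-based models can be so much lighter than pixel-based video models, here is a hedged sketch of a small skeleton classifier in PyTorch. The architecture, layer sizes, and class count are assumptions chosen for clarity, not the five models evaluated in the paper:

```python
# Illustrative lightweight skeleton-based action classifier (not the paper's model).
import torch
import torch.nn as nn

N_JOINTS, SEQ_LEN, N_CLASSES = 17, 64, 4  # e.g. walking/climbing/hanging/swinging

class SkeletonClassifier(nn.Module):
    def __init__(self, hidden: int = 64):
        super().__init__()
        # Input per frame: the flattened (x, y) coordinates of every joint.
        self.lstm = nn.LSTM(N_JOINTS * 2, hidden, batch_first=True)
        self.head = nn.Linear(hidden, N_CLASSES)

    def forward(self, skel: torch.Tensor) -> torch.Tensor:
        # skel: (batch, frames, joints, 2) -> (batch, frames, joints * 2)
        x = skel.flatten(2)
        _, (h, _) = self.lstm(x)   # keep the final hidden state
        return self.head(h[-1])    # class logits

model = SkeletonClassifier()
# A model like this has tens of thousands of parameters, versus the millions
# typical of video (pixel-based) action-recognition networks.
print(sum(p.numel() for p in model.parameters()), "parameters")
logits = model(torch.randn(1, SEQ_LEN, N_JOINTS, 2))
print(logits.shape)  # torch.Size([1, 4])
```

Because such a network processes a few dozen coordinates per frame rather than full video frames, it can plausibly run on the modest, battery-powered hardware available at remote field stations, which is the practical advantage Fuchs describes.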
This work takes a step toward building AI tools that could support ecological, non-invasive, and large-scale monitoring of endangered species in the wild. In the long run, tools like ChimpBehave could be used to monitor the safety and health of many wild animals, helping scientists and caretakers make faster and smarter decisions – even in remote forests where resources are limited. “By making this technology and data available to others, we are also helping the wider scientific community and conservation organisations build more effective tools for protecting wildlife,” concludes the researcher.