Translational and Regulatory Sciences
Online ISSN : 2434-4974
Development and application of animal behavior analysis system using video images
Naoaki SAKAMOTO, Yusuke MIYAZAKI, Koji KOBAYASHI, Takahisa MURATA

2022 Volume 4 Issue 1 Pages 25-29

Abstract

There is an urgent need to develop therapeutic drugs for central nervous system (CNS) diseases, such as depression, dementia, and pain. Behavioral analysis of animal disease models is indispensable for the pathological investigation and drug evaluation of these diseases. However, we do not know whether laboratory animals, such as mice and rats, experience depression, anxiety, and related disorders in the same way humans do, nor how to evaluate their detailed expression. In the evaluation of various disease models, including those of CNS diseases, the reproducibility and objectivity of animal experiments and their extrapolation to humans are recurring issues. To address these questions and issues, a new evaluation system for experimental animals is needed. Over the last decade, technologies for image acquisition and analysis have advanced dramatically, accompanied by increases in computer processing speed. Various technologies have been developed to analyze human behavior and emotions, and these can also be applied to the analysis of animal behavior and emotions. We developed a system to assess the behavior of laboratory animals using video analysis technologies and artificial intelligence. In this review, we introduce technologies that enable the identification of body feature points, behaviors, and individuals of mice and rats in their home cages. In addition, we discuss the possibility of developing and applying new technologies that could lead to breakthroughs in drug development.

Highlights

We developed a system to assess the behavior of laboratory animals using video analysis technology and artificial intelligence. In this review, we briefly introduce the following techniques used in mouse and rat experiments: 1. evaluation of spontaneous locomotor activity; 2. identification of behaviors, such as scratching and grooming, in moving images (video); 3. identification of feature points, such as the nose and paws, in still images; and 4. markerless tracking of multiple animals. These technologies are useful for non-clinical studies in drug discovery and basic biological research.

Introduction

Worldwide, many people suffer from central nervous system (CNS) disorders, such as depression and Alzheimer’s disease [1, 2]. However, in most cases, few effective medications exist, and the detailed pathophysiological mechanisms remain unknown. Patients with CNS disorders often share characteristic symptoms, including anxiety and social deficits [3, 4]. Researchers have established rodent disease models and observed abnormal behavior similar to that of human patients [5,6,7]. Assessing animal behavior is therefore indispensable for elucidating mechanisms and developing medications for CNS disorders.

In addition to the evaluation of CNS disease models, animal behavioral analysis has been performed in various pharmacological and toxicological studies. To evaluate allergic disease models, the scratching behavior of experimental animals is assessed [8]. To evaluate locomotor disease models, gait and spontaneous locomotor activity are assessed. The detection of unpredictable abnormal behavior is one of the major challenges, especially in toxicological studies.

Conventionally, rodent behavior is assessed for short periods in specialized environments. For example, the elevated plus maze test is a prevalent method for estimating anxiety tendencies [9], and the three-chamber social test is widely used to investigate sociability and social novelty in rodents [10]. However, considering that human patients often suffer from symptoms in normal daily life, 24-h behavioral evaluation in the breeding environment is also important. Because such long observation by humans is practically impossible, an automated evaluation system is required.

Recent advances in computer technology are noteworthy. Videos and images can now be processed simply and rapidly. Furthermore, deep learning methods have become popular, and numerous research fields have benefited from them. Here, we focus on the development and application of image-based animal behavior analysis systems. The following contents are briefly introduced in this review (Fig. 1): 1. evaluation of spontaneous locomotor activity; 2. identification of behaviors, such as scratching and grooming, in moving images (video); 3. identification of feature points, such as the nose and paws, in still images; and 4. markerless tracking of multiple animals. We expect that these new technologies will accelerate the development of medications and the elucidation of disease mechanisms.

Fig. 1.

Schematic images of rodent behavioral analysis using artificial intelligence. SLA; spontaneous locomotor activity.

Analysis of Locomotor Activity in Rodents

Spontaneous locomotor activity (SLA), the movement of animals in a familiar environment, is a basic index of their status. Because SLA reflects an animal’s condition, including pain, stress, and chemical stimulation, its assessment is widely used in various studies.

Several SLA measurement methods, such as running wheels and infrared sensors, have been established [11]. These methods count the number of wheel rotations or the number of infrared-beam interruptions caused by mouse movement. Although widely used, they require specific equipment and a long habituation period. They also place animals in unfamiliar cages, which may affect their activity. Video tracking in familiar cages is expected to avoid these problems. First, we established an automated SLA measurement method for black C57BL/6 mice [12]. We recorded the movement of mice in their home cages using an infrared video camera controlled by a single-board computer, the Raspberry Pi. Using the OpenCV library in Python, we calculated the geometric center of the mouse in each frame and defined its movement per second as SLA. Continuous 24-h analysis showed that mice moved actively during the dark period, while they spent most of the light period resting. We also showed that this system successfully detected changes in SLA after administration of a CNS stimulant (caffeine) or a sedative (chlorpromazine) [12].
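The core of this measurement, the geometric center of the detected mouse and its summed per-second displacement, can be sketched in plain Python. This is a simplified illustration of the logic only; the actual system uses OpenCV for frame processing, and the function names here are our own.

```python
import math

def centroid(mask):
    """Geometric center of the foreground (mouse) pixels in a binary mask.

    `mask` is a 2-D list of 0/1 values, as might result from background
    subtraction and thresholding of one video frame.
    """
    xs, ys, n = 0, 0, 0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                xs += x
                ys += y
                n += 1
    if n == 0:
        return None  # no animal detected in this frame
    return (xs / n, ys / n)

def sla_per_second(centroids, fps):
    """Distance moved per second: the sum of frame-to-frame centroid
    displacements over each block of `fps` frames."""
    out = []
    for s in range(0, len(centroids) - 1, fps):
        block = centroids[s:s + fps + 1]
        out.append(sum(math.dist(a, b) for a, b in zip(block, block[1:])))
    return out
```

Summing these per-second values over a whole recording gives the total distance moved, the quantity compared between the dark and light periods above.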

Next, we tuned the above system and applied it to the analysis of the movement of white BALB/c mice [13]. Black cages and bedding were used to distinguish the white-furred BALB/c mice from the background. We assessed the effects of period, habituation, age, and sex on the SLA of BALB/c mice and found that 1) BALB/c mice, like C57BL/6 mice, moved more actively during the dark period than during the light period, 2) BALB/c mice became habituated to a new home-cage environment within at most three days, and 3) 16- and 32-week-old mice moved less than 4-week-old mice.

In addition, we compared the SLA of C57BL/6 and BALB/c mice. Although there was no significant difference in the total distance moved during either the dark or light period, the timing of the active phase differed. C57BL/6 mice were active throughout the dark period, whereas BALB/c mice moved more actively during the first half of the dark period than during the second half. These basic SLA data are useful for future research and indicate that we must be careful about the time of day at which behavioral tests are performed.

These studies showed that video-tracking-based SLA measurement is a simple, easy-to-use method applicable to various situations, regardless of mouse strain or stimulant. Furthermore, because unconstrained and familiar environments can attenuate fear and stress during experiments, these systems are also preferable in terms of animal welfare.

Automated Classification of Behavior Using Deep Learning Methods

Rodent behaviors, such as scratching and self-grooming, are strongly connected with mental and physical condition. For example, the itching sensation elicited by allergen exposure induces scratching behavior. Excessive self-grooming is considered a repetitive behavior often observed in CNS disorders, such as autism spectrum disorder and obsessive-compulsive disorder [6, 7]. Because we cannot directly ask animals about their mental and physical condition, behavioral evaluation is important.

Deep learning is a machine learning method based on “neural networks” that model the connections between neurons in the brain. A noteworthy characteristic is its ability to automatically extract features from the provided information, such as images and time-series data. Several types of neural networks have been proposed. Because convolutional neural networks (CNNs) have outperformed conventional methods in image classification tasks [14], they have become the de facto standard for image recognition. In addition, recurrent neural networks (RNNs) are used to extract features from time-series data. Because we assess animal behavior from visual and temporal information, using CNNs and/or RNNs is a promising way to develop automated behavior recognition tools.
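To illustrate the kind of feature extraction a CNN layer performs, the following sketch implements a single “valid” 2-D cross-correlation in plain Python. This is a minimal illustration only; real CNNs stack many such learned kernels together with nonlinearities and pooling.

```python
def conv2d(image, kernel):
    """'Valid' 2-D cross-correlation: slide `kernel` over `image` and
    take the sum of elementwise products at each position. This is the
    core operation of a convolutional layer."""
    kh, kw = len(kernel), len(kernel[0])
    oh = len(image) - kh + 1
    ow = len(image[0]) - kw + 1
    out = [[0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            out[i][j] = sum(image[i + di][j + dj] * kernel[di][dj]
                            for di in range(kh) for dj in range(kw))
    return out
```

With a hand-crafted edge kernel such as `[[-1, 1]]`, the output responds only where pixel intensity changes horizontally; in a trained CNN, the kernels that extract such features are learned from the labeled data rather than specified by hand.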

During the last decade, several studies have shown that CNNs and RNNs are useful for detecting animal behaviors [15,16,17,18]. For example, we and others have developed automated identification methods for mouse scratching [16], grooming [15, 17], and daily behaviors in cows [18]. Here, we introduce our systems that automatically detect scratching [16] or facial and body grooming [17] in mice. First, mouse behavior in an open-field arena was recorded using a standard video camera. The videos were divided into frame images, and each image was preprocessed (see [16, 17] for details). Second, we labeled the images as target behaviors (i.e., scratching or facial/body grooming) or other behaviors. Third, the preprocessed images and labels were fed to neural networks, which learned their features. After these steps, our system detected target behaviors in previously unseen videos with accuracy comparable to human observation. Because the system requires only a simple recording environment, it is easy to introduce into other laboratories.
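A common post-processing step after per-frame classification, not specific to the cited papers, is to convert the per-frame labels into behavior bouts while discarding isolated, likely misclassified frames. A hedged sketch (the function and threshold names are our own):

```python
def frames_to_bouts(labels, fps, min_frames=3):
    """Convert a per-frame label sequence into (behavior, start_s, end_s)
    bouts, merging runs shorter than `min_frames` into the preceding bout
    to suppress single-frame classification noise."""
    # Run-length encode the label sequence.
    runs = []
    for i, lab in enumerate(labels):
        if runs and runs[-1][0] == lab:
            runs[-1][2] = i + 1
        else:
            runs.append([lab, i, i + 1])
    # Absorb too-short runs into the previous run.
    merged = []
    for lab, s, e in runs:
        if e - s < min_frames and merged:
            merged[-1][2] = e
        else:
            merged.append([lab, s, e])
    # Coalesce adjacent runs that now share a label, convert to seconds.
    out = []
    for lab, s, e in merged:
        if out and out[-1][0] == lab:
            out[-1][2] = e
        else:
            out.append([lab, s, e])
    return [(lab, s / fps, e / fps) for lab, s, e in out]
```

The resulting bout list directly yields the quantities typically reported, such as the number of scratching bouts and the total time spent scratching.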

The automated identification of animal behaviors has enabled the collection of large amounts of fundamental data, such as strain and sex differences. For instance, Geuther et al. comprehensively analyzed self-grooming in 2,457 mice across 62 strains [15]. They also investigated the underlying genetic factors and their links to humans by performing a genome-wide association study and a human phenome-wide association study. Such large-scale analyses provide useful information for extrapolating animal experiments to humans. Automated tools will be indispensable in future animal experiments.

Markerless Identification of Feature Points

The movement of body feature points, such as the eyes, nose, paws, and tail base of rodents, provides rich information. Body orientation is easily estimated from the positions of the nose and tail base, and facial expressions can be assessed from facial feature points. Conventionally, the body parts of interest are painted with colored fluorescent markers, and the specific color information is used for image analysis [19]. This conventional method is simple; however, researchers must always consider the effects of the odor or the painting itself on animal behavior. Therefore, markerless identification of feature points is desirable.
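For example, once nose and tail-base coordinates are available, body orientation reduces to a short calculation. This is a generic sketch under standard mathematical axes (x right, y up), not code from any cited tool; with image coordinates, where y points down, the sign of the angle is mirrored.

```python
import math

def body_orientation(nose, tail_base):
    """Heading angle in degrees of the tail-base-to-nose vector,
    measured counterclockwise from the positive x-axis."""
    dx = nose[0] - tail_base[0]
    dy = nose[1] - tail_base[1]
    return math.degrees(math.atan2(dy, dx))
```

Tracking this angle over frames gives, for instance, turning rates, and the difference between the heading and the direction of motion distinguishes forward locomotion from sideways drift.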

As deep learning methods have rapidly advanced, several markerless identification methods have been developed. Mathis et al. developed DeepLabCut for markerless pose estimation with deep learning in 2018 [20]. DeepLabCut requires users only to label user-defined points on the screen in dozens to hundreds of images. Owing to its simplicity and convenience, this software has rapidly become popular and is widely used in many studies. In addition, many other markerless pose estimation tools, such as LEAP and DeepPoseKit, have been developed to improve inference speed and/or accuracy [21,22,23].

Some studies have proposed analysis methods based on the feature points of the animal body. Hsu et al. showed that B-SOiD can automatically classify animal behavior using user-defined feature points [24]. This method uses unsupervised clustering (HDBSCAN) and random forests, which do not require human annotation of specific behaviors. Sheppard et al. also reported gait analysis of mice in an open-field arena using deep-learning-based pose estimation [25]. Considering that current gait evaluation of rodent models requires special equipment and software, such as CatWalk XT (Noldus), this study provided a method for gait analysis in more familiar environments.
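Methods of this kind build per-frame feature vectors from the estimated points before clustering. As a simplified, hypothetical example of such featurization, one can compute all pairwise distances between labeled body parts (B-SOiD itself also incorporates angles and speeds):

```python
import math

def pose_features(points):
    """Per-frame pose features: all pairwise Euclidean distances between
    labeled body points. `points` maps a body-part name to its (x, y)
    position in the frame."""
    names = sorted(points)
    feats = {}
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            feats[f"{a}-{b}"] = math.dist(points[a], points[b])
    return feats
```

Because distances between body parts are invariant to where the animal is in the arena and which way it faces, such features let a clustering algorithm group frames by posture rather than by position.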

Multi-animal Tracking

Because deficits in social interaction are a major symptom of CNS disorders, such as ASD, depression, and schizophrenia, the evaluation of sociability in laboratory animals is important. The most prevalent method for investigating sociability in rodent models is the three-chamber social test. This test is used worldwide and is reliable, but natural social interactions between rodents, such as fighting and mating, are beyond its scope. Although natural social interactions within the same cage can be a better indicator of sociability, it is labor-intensive and practically impossible for humans to observe the behaviors of several mice over a long period. Therefore, automated tracking methods for multiple animals are required.

Pérez-Escudero et al. reported idTracker, a multi-animal tracking method, in 2014 [26]. idTracker identifies individuals by calculating the image features of each mouse. The same group developed a deep learning-based successor, idtracker.ai, in 2019 [27]. This software improved tracking accuracy by combining crossing detection and individual identification using CNNs. Currently, deep learning methods for object detection and instance segmentation have been applied to single- or multi-animal tracking systems [28,29,30], some as preprints. We are also developing a tracking system for multiple mice using these methods.
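The simplest form of frame-to-frame identity maintenance, which tools such as idtracker.ai extend with crossing detection and CNN-based re-identification, is nearest-neighbor matching of detections between consecutive frames. A toy sketch (the names are illustrative, not from the cited software):

```python
import math

def match_identities(prev, curr):
    """Greedily assign each known identity to its nearest current
    detection, never reusing a detection. `prev` maps identity names to
    last-known (x, y) centroids; `curr` is a list of centroids detected
    in the new frame. Returns {identity: index into curr}. Real trackers
    must additionally handle occlusions and animals crossing paths,
    which is where appearance-based identification becomes essential."""
    pairs = sorted(
        (math.dist(p, c), ident, j)
        for ident, p in prev.items()
        for j, c in enumerate(curr)
    )
    assigned, used = {}, set()
    for _, ident, j in pairs:
        if ident not in assigned and j not in used:
            assigned[ident] = j
            used.add(j)
    return assigned
```

Pure proximity matching fails precisely when two animals meet and separate, which is why crossing detection plus CNN-based individual identification, as in idtracker.ai, is needed to keep identities correct over long recordings.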

Future Development and Application of These Technologies

To date, we have reviewed various automated tools for single and multiple laboratory animals. These tools will enable the collection of large amounts of behavioral data without excessive effort in the coming decades. However, data interpretation still depends on the experience of the researchers. Because data analysis becomes more complicated as the amount of data increases, supportive tools for data interpretation are necessary. In 2020, Maekawa et al. developed software called DeepHL, which helps researchers interpret differences in trajectories between two groups using attention mechanisms, a deep learning technique [31]. This software automatically identifies differences and visualizes them on the screen. Furthermore, in 2021, the same group developed another tool that uses attention mechanisms to discover cross-species features in locomotor activity; it revealed that humans, mice, and C. elegans share common behavioral features under dopamine deficiency [32]. Interpretation supported by deep learning may, in the future, elucidate underlying facts beyond human intuition.

Conclusion

Because rodent disease models exhibit characteristic behaviors similar to those of human patients, behavioral changes in these models have been a focus of the development of medications for CNS disorders. However, human observation is labor-intensive and low-throughput. In this review, we focused on automated tools for rodent behavioral analysis. As computer technologies, including deep learning, have rapidly evolved, automated tools have achieved accuracy comparable to human observation. These tools enable long-term evaluation of rodent behavior in natural environments. In addition, some studies have shown that deep learning-based algorithms can support the human interpretation of behavioral data. We expect these tools to promote the elucidation of the mechanisms underlying CNS disorders and other diseases and the development of new medications.

Ethics Statement

All the experiments were approved by the Institutional Animal Care and Use Committee of the University of Tokyo (P18-067, P19-031, and P19-079). Animal care and treatment were performed in accordance with the guidelines outlined in the Guide to Animal Use and Care of the University of Tokyo.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as potential conflicts of interest.

Acknowledgements

This work was supported by a Grant-in-Aid for Scientific Research from the Japan Society for the Promotion of Science (19K15975 to Koji Kobayashi, 20H05678 and 17H06252 to Takahisa Murata), the University of Tokyo Gap Fund Program (to Takahisa Murata), and the Kobayashi Foundation and Shimadzu Science Foundation (to Takahisa Murata).

References
 
© 2022 Catalyst Unit

This article is licensed under a Creative Commons [Attribution-NonCommercial-NoDerivatives 4.0 International] license.
https://creativecommons.org/licenses/by-nc-nd/4.0/