Abstract |
---|
This paper introduces the concept of first-person animal activity recognition, the problem of recognizing activities from the viewpoint of an animal (e.g., a dog). Similar to first-person activity recognition scenarios where humans wear cameras, our approach estimates activities performed by an animal wearing a camera. This enables monitoring and understanding of natural animal behaviors even when no people are around them. Its applications include automated logging of animal behaviors for medical/biology experiments, monitoring of pets, and investigation of wildlife patterns. In this paper, we construct a new dataset composed of first-person animal videos obtained by mounting a camera on each of four pet dogs. Our new dataset covers 10 activities involving a heavy or fair amount of ego-motion. We implemented multiple baseline approaches to recognize activities from such videos, utilizing several types of global and local motion features. Animal ego-actions as well as human-animal interactions are recognized with the baseline approaches, and we discuss the experimental results. |
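The abstract does not specify the exact descriptors, but the general pipeline it describes (a clip-level global motion feature followed by classification) can be sketched as below. This is a minimal illustration, not the paper's method: the frame-difference histogram feature, the 1-NN classifier, and all names (`global_motion_feature`, `nearest_neighbor_label`) are assumptions for demonstration.

```python
import numpy as np

def global_motion_feature(frames, bins=8):
    """Toy global motion descriptor: a histogram of absolute
    frame-to-frame pixel differences, averaged over the clip.
    `frames` is a (T, H, W) grayscale array."""
    diffs = np.abs(np.diff(frames.astype(np.float32), axis=0))
    hists = [np.histogram(d, bins=bins, range=(0, 255), density=True)[0]
             for d in diffs]
    return np.mean(hists, axis=0)

def nearest_neighbor_label(query_feat, gallery_feats, gallery_labels):
    """1-NN activity classification over clip-level features."""
    dists = [np.linalg.norm(query_feat - g) for g in gallery_feats]
    return gallery_labels[int(np.argmin(dists))]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic clips: heavy ego-motion vs. almost no motion.
    moving = rng.integers(0, 255, size=(16, 32, 32)).astype(np.uint8)
    still = np.full((16, 32, 32), 100, dtype=np.uint8)
    feats = [global_motion_feature(moving), global_motion_feature(still)]
    labels = ["walking", "resting"]
    query = rng.integers(0, 255, size=(16, 32, 32)).astype(np.uint8)
    print(nearest_neighbor_label(global_motion_feature(query), feats, labels))
```

A real implementation would replace the frame-difference histogram with dense optical-flow statistics or local spatio-temporal features, and the 1-NN step with a trained classifier.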
Year | DOI | Venue |
---|---|---|
2014 | 10.1109/ICPR.2014.739 | International Conference on Pattern Recognition (ICPR) |
Keywords | DocType | ISSN |
---|---|---|
image motion analysis, image recognition, video signal processing, animal viewpoint, automated logging, baseline approaches, biology experiments, egocentric videos, first-person animal activity recognition, global motion features, human-animal interactions, local motion features, medical experiments, natural animal behaviors, pet monitoring | Conference | 1051-4651 |
Citations | PageRank | References |
---|---|---|
21 | 0.87 | 6 |
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Yumi Iwashita | 1 | 212 | 23.59 |
Asamichi Takamine | 2 | 21 | 0.87 |
Ryo Kurazume | 3 | 622 | 74.18 |
M. S. Ryoo | 4 | 1429 | 49.18 |