Robot House Human Activity Recognition Dataset

Title: Robot House Human Activity Recognition Dataset
Authors: Mohammad Abadi, Mohammad Alashti, Patrick Holthaus, Catherine Menon, Farshid Amirabdollahian (School of Physics, Engineering, and Computer Science, University of Hertfordshire)
Year: 2021
Citation: Abadi, M., Alashti, M., Holthaus, P., Menon, C., & Amirabdollahian, F. (2021). Robot House Human Activity Recognition Dataset. UKRAS21 Conference: “Robotics at home” Proceedings, 19-20. doi: 10.31256/Bw7Kt2N

Abstract:

Human activity recognition is one of the most challenging tasks in computer vision. State-of-the-art approaches such as deep learning techniques therefore often rely on large labelled datasets of human activities. However, currently available datasets are suboptimal for learning human activities in companion robotics scenarios at home, for example because they lack crucial perspectives. With this in mind, we present the University of Hertfordshire Robot House Human Activity Recognition Dataset (RH-HAR-1). It contains RGB videos of a human engaging in daily activities, taken from four different cameras. Importantly, this dataset contains two non-standard perspectives: a ceiling-mounted fisheye camera and a mobile robot's view. In this first instance, RH-HAR-1 covers five daily activities with a total of more than 10,000 videos.
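The abstract does not specify how RH-HAR-1 is organised on disk, but a short sketch can illustrate how such a multi-camera activity dataset might be indexed for training. Everything below is a hypothetical assumption for illustration, not the dataset's documented structure: the directory layout <root>/<activity>/<camera>/<clip>.mp4, the camera names, and the file extension.

    # Minimal sketch for indexing RH-HAR-1-style data.
    # Assumed (not documented) layout: <root>/<activity>/<camera>/<clip>.mp4
    from pathlib import Path

    # Four views per the abstract; these names are hypothetical.
    CAMERAS = ["static_1", "static_2", "ceiling_fisheye", "robot"]

    def index_dataset(root: str) -> list[dict]:
        """Collect one record per clip: file path, activity label, camera view."""
        records = []
        for video in sorted(Path(root).glob("*/*/*.mp4")):
            activity, camera = video.parts[-3], video.parts[-2]
            records.append({"path": str(video),
                            "activity": activity,
                            "camera": camera})
        return records

    if __name__ == "__main__":
        samples = index_dataset("RH-HAR-1")
        activities = {s["activity"] for s in samples}
        print(f"{len(samples)} clips covering {len(activities)} activities")

Keeping the camera view in each record would make it straightforward to hold out the two non-standard perspectives (ceiling fisheye, robot view) as separate evaluation splits, which is one natural use of a dataset collected from multiple viewpoints.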
