Action Recognition Using Single-Pixel Time-of-Flight Detection
Access
info:eu-repo/semantics/openAccess
Date
2019-04
Author
Anbarjafari, Gholamreza
Ofodile, Ikechukwu
Helmi, Ahmed
Clapes, Albert
Avots, Egils
Peensoo, Kerttu Maria
Valdma, Sandhra-Mirella
Valdmann, Andreas
Valtna-Lukner, Heli
Omelkov, Sergey
Escalera, Sergio
Ozcinar, Cagri
Citation
Ofodile, I., Helmi, A., Clapés, A., Avots, E., Peensoo, K. M., Valdma, S.-M., Valdmann, A., ... Anbarjafari, G. (April 18, 2019). Action Recognition Using Single-Pixel Time-of-Flight Detection. Entropy, 21(4), 414.
Abstract
Action recognition is a challenging task that plays an important role in many robotic systems, which depend heavily on visual input feeds. However, due to privacy concerns, it is important to find a method that can recognise actions without using a visual feed. In this paper, we propose a concept for detecting actions while preserving the test subject's privacy. Our proposed method relies only on recording the temporal evolution of light pulses scattered back from the scene. The data trace recorded for one action contains a sequence of one-dimensional arrays of voltage values acquired by a single-pixel detector at a 1 GHz repetition rate. Information about both the distance to the object and its shape is embedded in the traces. We apply machine learning in the form of recurrent neural networks for data analysis and demonstrate successful action recognition. The experimental results show that our proposed method achieves an average accuracy of 96.47% on the actions walking forward, walking backwards, sitting down, standing up and hand waving, using a recurrent neural network.
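The record does not include the authors' code, but as an illustration of the kind of recurrent classifier the abstract describes, the minimal PyTorch sketch below treats each recording as a sequence of one-dimensional voltage traces and maps it to one of the five actions. The trace length, hidden size, and all names (TraceRNN, TRACE_LEN, HIDDEN) are assumptions chosen for the example, not values taken from the paper.

# Illustrative sketch only: shapes and hyperparameters are assumptions.
import torch
import torch.nn as nn

NUM_CLASSES = 5     # walking forward, walking backwards, sitting down, standing up, hand waving
TRACE_LEN = 1000    # assumed number of voltage samples per one-dimensional trace
HIDDEN = 128        # assumed hidden size of the recurrent layer

class TraceRNN(nn.Module):
    """LSTM classifier over a temporal sequence of single-pixel voltage traces."""
    def __init__(self):
        super().__init__()
        self.rnn = nn.LSTM(input_size=TRACE_LEN, hidden_size=HIDDEN, batch_first=True)
        self.head = nn.Linear(HIDDEN, NUM_CLASSES)

    def forward(self, x):
        # x: (batch, num_traces, TRACE_LEN); each step is one back-scattered pulse trace
        _, (h_n, _) = self.rnn(x)
        return self.head(h_n[-1])       # logits over the five action classes

# Dummy batch: 8 recordings, each a sequence of 50 traces of TRACE_LEN voltage values.
model = TraceRNN()
logits = model(torch.randn(8, 50, TRACE_LEN))
print(logits.shape)                     # torch.Size([8, 5])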