A starting point for real-time human action detection
Authors: Yu Liu, Fan Yang, Dominique Ginhac

Subject: [SPI.SIGNAL] Engineering Sciences [physics]/Signal and Image processing
Analyzing videos of human actions involves understanding the spatial and temporal context of the scenes. State-of-the-art approaches have demonstrated impressive results using Convolutional Neural Networks (CNNs). However, most of them operate in a non-real-time, offline fashion and are not well suited to many emerging real-world scenarios, such as autonomous driving and public surveillance. In addition, they are too computationally demanding to be deployed on devices with limited power resources (e.g., embedded systems). This paper reviews state-of-the-art CNN-based methods for human action detection and related topics. Following that, we propose an initial framework that addresses action detection efficiently using flow-guided appearance features. We validate its performance on the UCF-101-24 dataset and show that the method achieves real-time action detection at a processing speed of 40 fps.
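To give a rough sense of the flow-guided idea mentioned in the abstract (a minimal sketch, not the paper's actual pipeline), the snippet below warps appearance features computed on a key frame to a nearby frame using an optical-flow field, so the expensive backbone does not have to be re-run on every frame. The function name `warp_features`, the tensor shapes, and the use of PyTorch's `grid_sample` are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def warp_features(key_feat: torch.Tensor, flow: torch.Tensor) -> torch.Tensor:
    """Warp key-frame appearance features to the current frame with optical flow.

    key_feat: (N, C, H, W) features computed on a key frame.
    flow:     (N, 2, H, W) flow from the current frame back to the key frame,
              expressed in feature-map pixel units (dx, dy).
    """
    n, _, h, w = key_feat.shape
    # Base sampling grid of pixel coordinates on the feature map.
    ys, xs = torch.meshgrid(
        torch.arange(h, dtype=key_feat.dtype, device=key_feat.device),
        torch.arange(w, dtype=key_feat.dtype, device=key_feat.device),
        indexing="ij",
    )
    base = torch.stack((xs, ys), dim=0).unsqueeze(0)   # (1, 2, H, W)
    coords = base + flow                               # displaced sampling locations
    # Normalise coordinates to [-1, 1], as required by grid_sample.
    coords_x = 2.0 * coords[:, 0] / max(w - 1, 1) - 1.0
    coords_y = 2.0 * coords[:, 1] / max(h - 1, 1) - 1.0
    grid = torch.stack((coords_x, coords_y), dim=-1)   # (N, H, W, 2)
    return F.grid_sample(key_feat, grid, mode="bilinear",
                         padding_mode="border", align_corners=True)

# Toy usage: zero flow should reproduce the key-frame features exactly.
feat = torch.randn(1, 256, 32, 32)   # backbone features of the key frame
flow = torch.zeros(1, 2, 32, 32)
assert torch.allclose(warp_features(feat, flow), feat, atol=1e-5)
```

In a real-time detector built on this idea, the backbone would run only on sparse key frames, while intermediate frames would reuse warped features followed by a lightweight detection head; the exact design in the paper may differ.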
| year | journal | country | edition | language |
|---|---|---|---|---|
| 2019-01-01 | | | | |