Spatial-temporal neural networks for action recognition

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Scopus citations

Abstract

Action recognition is an important yet challenging problem in many applications. Recently, neural network and deep learning approaches have been widely applied to action recognition and have yielded impressive results. In this paper, we present a spatial-temporal neural network model that recognizes human actions in videos. The network is composed of two connected structures: a two-stream-based network extracts appearance and optical flow features from video frames, characterizing the spatial information of human actions, and a group of LSTM units following the spatial network describes their temporal dynamics. We test our model on two public datasets, and the experimental results show that our method improves action recognition accuracy over the baseline methods.
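The pipeline the abstract describes can be sketched in a few lines: per-frame appearance and optical-flow feature vectors (assumed already extracted by the two-stream network) are fused by concatenation, fed through an LSTM, and the final hidden state is classified. The following pure-Python sketch is illustrative only; all weights, dimensions, and function names are toy assumptions, not the authors' implementation.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def matvec(W, v):
    # Multiply a matrix (list of rows) by a vector.
    return [sum(w * x for w, x in zip(row, v)) for row in W]

def lstm_step(x, h, c, Wx, Wh, b):
    # One LSTM step. Wx, Wh, b stack four gate blocks in the order
    # input, forget, cell candidate, output (each of size len(h)).
    z = [a + b_ + c_ for a, b_, c_ in zip(matvec(Wx, x), matvec(Wh, h), b)]
    H = len(h)
    i = [sigmoid(v) for v in z[0:H]]
    f = [sigmoid(v) for v in z[H:2 * H]]
    g = [math.tanh(v) for v in z[2 * H:3 * H]]
    o = [sigmoid(v) for v in z[3 * H:4 * H]]
    c_new = [fk * ck + ik * gk for fk, ck, ik, gk in zip(f, c, i, g)]
    h_new = [ok * math.tanh(ck) for ok, ck in zip(o, c_new)]
    return h_new, c_new

def recognize(frames_appearance, frames_flow, Wx, Wh, b, Wcls):
    # Concatenate per-frame two-stream features, run the LSTM over time,
    # then classify the final hidden state (argmax over class scores).
    H = len(Wh[0])
    h, c = [0.0] * H, [0.0] * H
    for app, flow in zip(frames_appearance, frames_flow):
        x = app + flow  # two-stream fusion by concatenation (an assumption)
        h, c = lstm_step(x, h, c, Wx, Wh, b)
    scores = matvec(Wcls, h)
    return max(range(len(scores)), key=lambda k: scores[k])

# Toy example: 3 frames, 2-dim appearance + 2-dim flow features, 2 classes.
H, D, C = 2, 4, 2
Wx = [[0.1] * D for _ in range(4 * H)]
Wh = [[0.05] * H for _ in range(4 * H)]
b = [0.0] * (4 * H)
Wcls = [[0.5, 0.3], [0.1, -0.2]]
apps = [[1.0, 0.5], [0.8, 0.2], [0.6, 0.9]]
flows = [[0.1, 0.0], [0.3, 0.2], [0.0, 0.1]]
label = recognize(apps, flows, Wx, Wh, b, Wcls)
```

In a real system the per-frame features would come from trained appearance and flow CNNs; here the LSTM and classifier weights are fixed toy values so the sketch stays self-contained.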

Original language: English
Title of host publication: Artificial Intelligence Applications and Innovations - 14th IFIP WG 12.5 International Conference, AIAI 2018, Proceedings
Editors: Lazaros Iliadis, Vassilis Plagianakos, Ilias Maglogiannis
Publisher: Springer New York LLC
Pages: 619-627
Number of pages: 9
ISBN (Print): 9783319920061
DOIs
State: Published - 2018
Event: 14th IFIP WG 12.5 International Conference on Artificial Intelligence Applications and Innovations, AIAI 2018 - Rhodes, Greece
Duration: 25 May 2018 – 27 May 2018

Publication series

Name: IFIP Advances in Information and Communication Technology
Volume: 519
ISSN (Print): 1868-4238

Conference

Conference: 14th IFIP WG 12.5 International Conference on Artificial Intelligence Applications and Innovations, AIAI 2018
Country/Territory: Greece
City: Rhodes
Period: 25/05/18 – 27/05/18

Keywords

  • Action recognition
  • LSTM
  • Spatial-temporal structure
