Memory-Boosting RNN with Dynamic Graph for Event-based Action Recognition
Affiliation: 1. Zhejiang University of Technology; 2. College of Computer Science and Technology, Zhejiang University of Technology

Fund Project: The National Key Technologies R&D Program of China

Abstract:

Owing to their highly dynamic nature, event cameras are well suited to capturing the subtle temporal changes involved in action recognition. However, existing event-based action recognition methods do not fully exploit the advantages of event cameras: compressing event streams into frames for subsequent computation, for instance, sacrifices much of the stream's temporal information. Meanwhile, conventional point-cloud-based methods suffer from high computational complexity when processing event data, which makes it difficult to handle long-term actions. To tackle these problems, we propose a Recurrent Neural Network with Memory Boosting and a Dynamic Graph (DG-MBRNN). DG-MBRNN splits the event stream into a sequence of graphs to preserve structural information, then uses an RNN with boosted spatiotemporal memory to handle long sequences of actions. In addition, the method introduces a dynamic reorganization mechanism that rebuilds the graph according to feature distances, which effectively improves its ability to extract local features. Moreover, because existing datasets contain overly simple actions and too few categories, we present a new event-based dataset of 36 complex actions, which should greatly promote research on event-based action recognition. Experimental results demonstrate the effectiveness of the proposed method on the event-based action recognition task.
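To make the described pipeline concrete, below is a minimal, self-contained PyTorch sketch of the three ingredients the abstract names: splitting the event stream into sequential graphs, dynamically regrouping the graph by feature distance, and carrying long-term memory across snapshots with a recurrent cell. Every name here (events_to_graph, dynamic_regraph, GraphConv, DGMBRNN) and every design detail (kNN graph construction, mean-aggregation convolution, a GRU standing in for the paper's memory-boosting cell, mean pooling) is an illustrative assumption, not the authors' actual implementation.

import torch
import torch.nn as nn


def events_to_graph(events, k=8):
    # Build one graph snapshot from a chunk of the event stream.
    # events: (N, 4) tensor of (x, y, t, polarity); nodes are events and
    # edges link each event to its k nearest neighbours in (x, y, t),
    # preserving the stream's spatiotemporal structure.
    coords = events[:, :3]
    dist = torch.cdist(coords, coords)
    knn = dist.topk(k + 1, largest=False).indices[:, 1:]  # drop self-loop
    return events, knn


def dynamic_regraph(feats, k=8):
    # Dynamic reorganization: rebuild the edges by *feature* distance, so
    # events with similar features are regrouped as representations evolve.
    dist = torch.cdist(feats, feats)
    return dist.topk(k + 1, largest=False).indices[:, 1:]


class GraphConv(nn.Module):
    # Simple mean-aggregation graph convolution over a kNN neighbour index.
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(2 * in_dim, out_dim)

    def forward(self, x, knn):
        neigh = x[knn].mean(dim=1)            # aggregate neighbour features
        return torch.relu(self.lin(torch.cat([x, neigh], dim=-1)))


class DGMBRNN(nn.Module):
    # Sequence model over graph snapshots; the GRU hidden state plays the
    # role of the boosted spatiotemporal memory over long action sequences.
    def __init__(self, feat_dim=4, hid=64, n_classes=36, k=8):
        super().__init__()
        self.k = k
        self.conv1 = GraphConv(feat_dim, hid)
        self.conv2 = GraphConv(hid, hid)      # runs on the regrouped graph
        self.rnn = nn.GRUCell(hid, hid)
        self.head = nn.Linear(hid, n_classes)

    def forward(self, event_chunks):
        h = None
        for chunk in event_chunks:            # sequential graph snapshots
            x, knn = events_to_graph(chunk, self.k)
            x = self.conv1(x, knn)
            knn = dynamic_regraph(x, self.k)  # feature-distance regraphing
            x = self.conv2(x, knn)
            g = x.mean(dim=0, keepdim=True)   # pooled snapshot embedding
            h = self.rnn(g, h)                # update long-term memory
        return self.head(h)


# Toy usage: five snapshots of 256 synthetic events, 36-way classification.
chunks = [torch.rand(256, 4) for _ in range(5)]
print(DGMBRNN()(chunks).shape)  # torch.Size([1, 36])

Recomputing the neighbour index from features rather than raw coordinates is what lets the graph "reorganize": two events far apart in space-time can become neighbours once their learned features agree, which is one plausible reading of the abstract's distance-based reorganization mechanism.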

History
  • Received: February 20, 2023
  • Revised: April 02, 2023
  • Accepted: April 17, 2023