E-MobileNeXt: face expression recognition model based on improved MobileNeXt
Author: ZHANG Xiang, YAN Chunman

Affiliation: School of Physics and Electronic Engineering, Northwest Normal University, Lanzhou 730070, China

    Abstract:

    In response to the high complexity and low accuracy of current facial expression recognition networks, this paper proposes E-MobileNeXt, a facial expression recognition network built on our proposed E-SandGlass block. The overall performance of the network is further improved through RepConv and the SGE attention mechanism. Experimental results show that the model improves expression recognition accuracy by 6.5% and 7.15% on the RAF-DB and CK+ datasets, respectively, while the parameter count and floating-point operations decrease by 0.79 M and 4.2 M compared with MobileNeXt.
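    The abstract names RepConv as one of the improvements but this page gives no architectural details. As a generic illustration of the structural re-parameterization idea behind RepConv (not the paper's exact design), the sketch below folds a training-time 3x3 convolution, 1x1 convolution, and identity shortcut into a single inference-time 3x3 kernel; the helper names `conv2d` and `fuse_rep_branches` are hypothetical, and the identity branch assumes equal input/output channels and stride 1.

```python
import numpy as np

def conv2d(x, k, pad):
    """Naive stride-1 2D convolution. x: (C_in, H, W), k: (C_out, C_in, kh, kw)."""
    c_in, h, w = x.shape
    c_out, _, kh, kw = k.shape
    xp = np.pad(x, ((0, 0), (pad, pad), (pad, pad)))
    oh, ow = h + 2 * pad - kh + 1, w + 2 * pad - kw + 1
    out = np.zeros((c_out, oh, ow))
    for co in range(c_out):
        for i in range(oh):
            for j in range(ow):
                out[co, i, j] = np.sum(xp[:, i:i + kh, j:j + kw] * k[co])
    return out

def fuse_rep_branches(k3, k1):
    """Fold a 1x1 conv and an identity shortcut into one 3x3 kernel (linearity of conv)."""
    c_out, c_in = k3.shape[:2]
    kf = k3.copy()
    kf[:, :, 1, 1] += k1[:, :, 0, 0]      # 1x1 branch becomes the centre tap
    for c in range(min(c_out, c_in)):
        kf[c, c, 1, 1] += 1.0             # identity branch as a per-channel delta kernel
    return kf

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 6, 6))
k3 = rng.standard_normal((4, 4, 3, 3))
k1 = rng.standard_normal((4, 4, 1, 1))

y_train = conv2d(x, k3, pad=1) + conv2d(x, k1, pad=0) + x   # three training branches
y_fused = conv2d(x, fuse_rep_branches(k3, k1), pad=1)       # one fused 3x3 conv
assert np.allclose(y_train, y_fused)
```

    Because convolution is linear, the fused kernel reproduces the multi-branch output exactly, so the extra branches cost nothing at inference time.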

Get Citation

ZHANG Xiang, YAN Chunman. E-MobileNeXt: face expression recognition model based on improved MobileNeXt[J]. Optoelectronics Letters, 2024, 20(2): 122-128.

Article Metrics
  • Abstract views: 236
  • PDF downloads: 642
  • HTML views: 0
  • Cited by: 0
History
  • Received: May 17, 2023
  • Revised: August 06, 2023
  • Online: January 05, 2024