doi: 10.17586/2226-1494-2022-22-1-120-126


Recognition of the emotional state based on a convolutional neural network

G. Soma, A. M. Kadnova


Article in Russian

For citation:
Soma G.M., Kadnova A.M. Recognition of the emotional state based on a convolutional neural network. Scientific and Technical Journal of Information Technologies, Mechanics and Optics, 2022, vol. 22, no. 1, pp. 120–126 (in Russian). doi: 10.17586/2226-1494-2022-22-1-120-126


Abstract
The paper proposes a new solution for recognizing a person's emotional state (joy, surprise, sadness, anger, disgust, fear, and a neutral state) from facial expression. Along with traditional verbal communication, emotions play a significant role in revealing true intentions during a communicative act in various domains. Many models and algorithms exist for classifying human emotions and applying them to support a communicative act, but the known models show low accuracy in recognizing emotional states. To classify facial expressions, two classifiers available in the Keras library (ResNet50, MobileNet) were built and trained, and a new convolutional neural network classifier architecture was proposed. The classifiers were trained on the FER 2013 dataset. A comparison of the chosen classifiers showed that the proposed model achieves the best validation accuracy (60.13 %) and the smallest size (15.49 MB), with a training loss of 0.079 and a validation loss of 2.80. The results can be used to recognize signs of stress and aggressive human behavior in public service systems and in areas that require communicating with large numbers of people.
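The paper's own architecture is not described in this abstract, so the following is only a minimal sketch of the general setup it mentions: a Keras backbone (MobileNet is used here; ResNet50 would be analogous) adapted to the seven FER 2013 emotion classes. The input shape, the classification head, and the channel replication of the 48×48 grayscale FER images are all assumptions, not the authors' method.

```python
# Hedged sketch of a Keras emotion classifier over the 7 FER 2013
# classes; the head layers and input handling are assumptions.
import numpy as np
from tensorflow.keras import layers, models
from tensorflow.keras.applications import MobileNet

NUM_CLASSES = 7  # joy, surprise, sadness, anger, disgust, fear, neutral

# FER 2013 images are 48x48 grayscale; assume they are replicated to
# 3 channels so the stock backbone accepts them. weights=None trains
# from scratch (no pretrained download).
backbone = MobileNet(input_shape=(48, 48, 3), include_top=False, weights=None)

model = models.Sequential([
    backbone,
    layers.GlobalAveragePooling2D(),          # pool spatial features
    layers.Dense(NUM_CLASSES, activation="softmax"),  # 7-way distribution
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])

# A single dummy batch confirms the output is a 7-way distribution.
probs = model.predict(np.zeros((1, 48, 48, 3), dtype="float32"), verbose=0)
```

Training such a model on FER 2013 would then be a standard `model.fit` call over the dataset's train/validation split.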

Keywords: ResNet50, MobileNet, facial expressions, deep neural network, emotion, classification

Acknowledgements. This research is financially supported by the Russian Science Foundation, Agreement No. 22-21-00604.

References
  1. Varma S., Shinde M., Chavan S.S. Analysis of PCA and LDA features for facial expression recognition using SVM and HMM classifiers. Techno-Societal 2018. Proc. of the 2nd International Conference on Advanced Technologies for Societal Applications, Vol. 1, 2020, pp. 109–119. https://doi.org/10.1007/978-3-030-16848-3_11
  2. Ryumina E.V., Karpov A.A. Analytical review of methods for emotion recognition by human face expressions. Scientific and Technical Journal of Information Technologies, Mechanics and Optics, 2020, vol. 20, no. 2, pp. 163–176. (in Russian). https://doi.org/10.17586/2226-1494-2020-20-2-163-176
  3. Akhmetshin R.I., Kirpichnikov A.P., Shleimovich M.P. Human emotion recognition in images. Vestnik Technologicheskogo Universiteta, 2015, vol. 18, no. 11, pp. 160–163. (in Russian)
  4. Erman E.A., Mamdouh Mokhammed Gomaa Mokhammed. Face detection in image by using a combination of Viola-Jones method and skin detection algorithms. Vestnik of Astrakhan State Technical University. Series: Management, Computer Sciences and Informatics, 2015, no. 1, pp. 49–55. (in Russian)
  5. Mukhamadieva K.B. Comparative analysis of facial recognition algorithms. Modern Materials, Equipment, and Technology, 2017, no. 7(15), pp. 58–62. (in Russian)
  6. Amelkin S.A., Zakharov A.V., Khachumov V.M. Generalized Mahalanobis Euclidean distance and its properties. Information Technologies and Computer Systems, 2006, no. 4, pp. 40–44. (in Russian)
  7. Karamizadeh S., Abdullah S.M., Manaf A.A., Zamani M., Hooman A. An overview of principal component analysis. Journal of Signal and Information Processing, 2013, vol. 4, pp. 173–175. https://doi.org/10.4236/jsip.2013.43B031
  8. Dino H.I., Abdulrazzaq M.B. Facial expression classification based on SVM, KNN and MLP classifiers. Proc. of the International Conference on Advanced Science and Engineering (ICOASE 2019), 2019, pp. 70–75. https://doi.org/10.1109/ICOASE.2019.8723728
  9. Greche L., Es-Sbai N., Lavendelis E. Histogram of oriented gradient and multi layer feed forward neural network for facial expression identification. Proc. of the International Conference on Control, Automation and Diagnosis (ICCAD 2017), 2017, pp. 333–337. https://doi.org/10.1109/CADIAG.2017.8075680
  10. Greche L., Akil M., Kachouri R., Es-Sbai N. A new pipeline for the recognition of universal expressions of multiple faces in a video sequence. Journal of Real-Time Image Processing, 2020, vol. 17, no. 5, pp. 1389–1402. https://doi.org/10.1007/s11554-019-00896-5
  11. Kulke L., Brümmer L., Pooresmaeili A., Schacht A. Overt and covert attention shifts to emotional faces: Combining EEG, eye tracking, and a go/no-go paradigm. Psychophysiology, 2021, vol. 58, no. 8, pp. e13838. https://doi.org/10.1111/psyp.13838
  12. Aleksic P.S., Katsaggelos A.K. Automatic facial expression recognition using facial animation parameters and multistream HMMs. IEEE Transactions on Information Forensics and Security, 2006, vol. 1, no. 1, pp. 3–11. https://doi.org/10.1109/TIFS.2005.863510
  13. Gardner M.W., Dorling S.R. Artificial neural networks (the multilayer perceptron) – a review of applications in the atmospheric sciences. Atmospheric Environment, 1998, vol. 32, no. 14-15, pp. 2627–2636. https://doi.org/10.1016/S1352-2310(97)00447-0
  14. Cao H., Cooper D.G., Keutmann M.K., Gur R.C., Nenkova A., Verma R. CREMA-D: Crowd-sourced emotional multimodal actors dataset. IEEE Transactions on Affective Computing, 2014, vol. 5, no. 4, pp. 377–390. https://doi.org/10.1109/TAFFC.2014.2336244
  15. Challenges in Representation Learning Facial Expression Recognition Challenge. Available at: https://www.kaggle.com/c/challenges-in-representation-learning-facial-expression-recognition-challenge (accessed: 13.11.2021).



This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License
