Identifying human emotions is among the most challenging problems in artificial intelligence. Emotions can be recognized from several modalities, including speech, facial expressions, EEG, physiological signals, and body movement; recognition from body movement, however, remains comparatively underexplored and increasingly important. This review focuses on identifying emotions using two families of approaches: full-body movement models and body parts-based models, with the aim of surveying recent work by researchers in both directions. Although relatively little research has so far addressed emotion recognition from body movements, existing studies have achieved reasonable success on this difficult task. The survey finds that popular machine learning algorithms such as Support Vector Machines, neural networks, and convolutional neural networks are most commonly used to identify basic emotions.
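As a rough illustration of the classification setups commonly adopted in the surveyed work, the sketch below trains a Support Vector Machine on flattened skeleton-joint features to predict a basic emotion label. The joint count, frame window, label set, and synthetic data are assumptions made for demonstration only and are not taken from any specific paper covered by this review.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data (assumed): 200 movement samples, each flattened to
# 25 joints x 3 coordinates x 10 frames = 750 features, with one of
# 4 basic emotion labels (0=happy, 1=sad, 2=angry, 3=fear).
X = rng.normal(size=(200, 25 * 3 * 10))
y = rng.integers(0, 4, size=200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Standardize the features, then fit an RBF-kernel SVM, a common baseline
# classifier in the body-movement emotion recognition literature.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))

A convolutional neural network variant of this pipeline would typically keep the joint-by-frame structure as a 2D or 3D input tensor instead of flattening it, but the train/evaluate workflow is otherwise the same.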
