In an era dominated by video conferencing and distance learning, technologies that can increase the quality of education have become exceptionally important. Moreover, these unprecedented times can be used to introduce new paradigms of classroom teaching and to improve traditional ones. In this regard, it would be highly beneficial to develop methods for the efficient estimation and classification of student attention and engagement based on readily available visual information. Today, webcams and other optical sensors are ubiquitous and are used in online lecturing on a regular basis. It has already been demonstrated that the visual signal they provide is of sufficient quality to predict students' learning curves. In our previous work, a software application for recognizing the orientation and position of the head, arms, and upper body was developed and initially tested on a limited dataset, achieving a mean accuracy of 96.88%. In this research, a catalogue of upper-body posture positions characteristic of online lectures is identified and described. Based on this catalogue, a set of quantitative measures for the classification of upper-body posture positions is proposed. The measures are divided into spatial, temporal, and functional groups. Subsequent studies should verify the validity of the proposed measures.
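To make the spatial group of measures concrete, the sketch below shows how one such measure might be computed from 2D keypoints produced by an off-the-shelf pose estimator. This is a minimal illustration under stated assumptions, not the paper's actual catalogue: the keypoint names, angle definitions, and thresholds are hypothetical and introduced here only for demonstration.

import math

# Hypothetical 2D keypoints (pixel coordinates) as produced by an
# off-the-shelf pose estimator such as OpenPose; the names and values
# here are illustrative assumptions, not data from the paper.
keypoints = {
    "nose": (322.0, 180.0),
    "neck": (320.0, 240.0),
    "mid_hip": (318.0, 420.0),
}

def angle_from_vertical(top, bottom):
    """Angle in degrees between the segment bottom->top and the image's
    vertical axis; 0 degrees corresponds to a perfectly upright segment."""
    dx = top[0] - bottom[0]
    dy = bottom[1] - top[1]  # image y-coordinates grow downward
    return math.degrees(math.atan2(abs(dx), dy))

# Two example spatial measures: head tilt and torso lean.
head_tilt = angle_from_vertical(keypoints["nose"], keypoints["neck"])
torso_lean = angle_from_vertical(keypoints["neck"], keypoints["mid_hip"])

# A toy threshold rule mapping the measures to coarse posture labels;
# the threshold values are arbitrary placeholders.
if torso_lean < 10.0 and head_tilt < 15.0:
    posture = "upright"
elif torso_lean >= 10.0:
    posture = "leaning"
else:
    posture = "head tilted"

print(f"head tilt {head_tilt:.1f} deg, torso lean {torso_lean:.1f} deg -> {posture}")

Temporal measures could then be derived by tracking such angles across video frames (for example, their variance over a sliding window), and functional measures by relating them to lecture events; both extensions are assumptions sketched here, not the measures defined in the paper.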
