Human Activity Recognition (HAR) approaches are predominantly based on supervised deep learning and benefit from large amounts of labeled data, an expensive resource. Data augmentation enriches labeled datasets with substantially cheaper synthetic data and often improves model performance, yet it is rarely applied to sensor data. This work explores data augmentation for inertial-sensor-based HAR by transforming the data through physically interpretable operations. The main studies were conducted on the Opportunity and the Overhead Car Assembly (OCA) datasets. For these experiments, only 20% of the available training data were used, and the experiments followed an 8-fold cross-validation procedure over different subsets of the training data.
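A physically interpretable transformation of inertial data can be illustrated with a small random-rotation sketch: rotating a tri-axial IMU window mimics a slightly different sensor mounting orientation. This is an illustrative example only; the function name and parameters are assumptions, not the paper's implementation.

```python
import numpy as np

def rotate_imu(window, max_angle_deg=15.0, rng=None):
    """Apply a random small 3-D rotation to a window of tri-axial IMU
    samples (shape: [timesteps, 3]), simulating a perturbed sensor
    mounting. Hypothetical helper for illustration."""
    rng = np.random.default_rng() if rng is None else rng
    # Draw a random rotation axis and a bounded random angle.
    axis = rng.normal(size=3)
    axis /= np.linalg.norm(axis)
    angle = np.deg2rad(rng.uniform(-max_angle_deg, max_angle_deg))
    # Rodrigues' rotation formula: R = I + sin(a) K + (1 - cos(a)) K^2.
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    R = np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)
    # Rotate every sample; the signal's magnitude per timestep is preserved.
    return window @ R.T
```

Because the rotation matrix is orthogonal, each sample keeps its vector norm, so the augmentation changes orientation without altering the physical magnitude of the motion.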

The results show that simple geometric augmentations can be beneficial in many cases. Timewarping proved to be the most reliable single augmentation, improving the average F1 score on Opportunity from 0.570 to 0.597 and on OCA Mixed from 0.884 to 0.906. Combining augmentations improved performance in almost all scenarios, but only to a degree comparable to timewarping alone. Applying augmentations to all the available training data also improved the F1 score over the base case without augmentations, although this effect is more pronounced for datasets whose training and test data are more similar: for the OCA Mixed variant, the average F1 score improved from 0.917 to 0.933, while for the OCA Leave-One-Out (LOT) variant, it did not change significantly. For Opportunity, which, like OCA LOT, uses a participant-based training-test split, the F1 score improved from 0.684 to 0.697.
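Timewarping, the most reliable single augmentation above, can be sketched as a smooth random monotonic remapping of the time axis followed by resampling back to the original length. This is a minimal sketch of the general technique, not the paper's code; the knot count, noise scale, and function name are assumptions.

```python
import numpy as np

def time_warp(window, n_knots=4, sigma=0.2, rng=None):
    """Warp the time axis of a sensor window (shape [T, C]) with a
    smooth, random, monotonic mapping, then linearly resample back
    to T steps. Illustrative sketch of timewarping."""
    rng = np.random.default_rng() if rng is None else rng
    T = window.shape[0]
    # Random positive "playback speed" factors at a few knots,
    # interpolated over the window to give a smooth warp.
    knot_pos = np.linspace(0, T - 1, n_knots)
    speeds = np.clip(rng.normal(1.0, sigma, n_knots), 0.3, None)
    speed = np.interp(np.arange(T), knot_pos, speeds)
    # Integrate the speed to obtain strictly increasing warped times,
    # then rescale them to span exactly [0, T - 1].
    warped_t = np.cumsum(speed)
    warped_t = (warped_t - warped_t[0]) / (warped_t[-1] - warped_t[0]) * (T - 1)
    # Resample each channel at the warped time points.
    return np.stack([np.interp(warped_t, np.arange(T), window[:, c])
                     for c in range(window.shape[1])], axis=1)
```

The rescaling step pins the first and last samples, so the warped window stays aligned with the original segment boundaries while local phases are stretched or compressed.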
