People working under water often face significant difficulties and risks to their lives. Developing a collaborative (assistive) underwater robot for joint operations with a person would increase the efficiency and safety of underwater work. This article considers the problem of detecting and localizing a person under water in tasks of close interaction between an underwater robot and a person (at distances of up to 3-5 meters). The authors of this article, students of Bauman Moscow State Technical University and members of the Hydronautics Educational and Scientific Youth Center, developed the design of an underwater module that includes a stereoscopic camera capable of building an image depth map, together with an algorithm for detecting and localizing one or more persons. The stereoscopic camera is installed in a hermetic pressure hull with a swivel mechanism. The person-localization software includes a convolutional neural network that identifies objects in the image and runs on an Nvidia Jetson TX2 single-board computer. The distance to a person detected by the convolutional neural network is then determined by averaging the values in the corresponding region of the depth map. The work resulted in a module capable of determining the presence of a person under water and the distance to that person. The accuracy of these parameters was tested in the Bauman Moscow State Technical University swimming pool.
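The distance-estimation step described above (averaging over the depth-map region corresponding to a detection) can be illustrated with a minimal sketch. This is not the authors' implementation: the function name, bounding-box format, and validity thresholds are assumptions for illustration, and the depth map is taken to be a 2D array of per-pixel distances in meters such as a stereo camera produces.

```python
import numpy as np

def distance_to_person(depth_map, bbox, min_valid=0.1, max_valid=10.0):
    """Estimate the distance (in meters) to a detected person by averaging
    valid depth values inside the detector's bounding box.

    depth_map : 2D array of per-pixel distances in meters (e.g. obtained
                from a stereo camera's disparity-to-depth conversion).
    bbox      : (x1, y1, x2, y2) pixel coordinates of the detection
                (format assumed here for illustration).
    """
    x1, y1, x2, y2 = bbox
    region = depth_map[y1:y2, x1:x2]
    # Stereo depth maps contain invalid pixels (occlusions, low-texture
    # areas); keep only plausible values before averaging.
    valid = region[(region > min_valid) & (region < max_valid)]
    if valid.size == 0:
        return None  # no reliable depth measurements in this region
    return float(valid.mean())
```

In practice one might prefer the median over the mean to suppress outliers at the person's silhouette boundary, where background pixels leak into the bounding box.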
