Institutional Repository
Technical University of Crete
Vision-based autonomous navigation for the BlueROV2 underwater vehicle

Siolis Panagiotis


Year 2023
Type of Item Diploma Work
Bibliographic Citation Panagiotis Siolis, "Vision-based autonomous navigation for the BlueROV2 underwater vehicle", Diploma Work, School of Electrical and Computer Engineering, Technical University of Crete, Chania, Greece, 2023

The use of Simultaneous Localization And Mapping (SLAM) algorithms is widespread in robotics, especially for ground robotic vehicles. SLAM algorithms that rely on visual information from a camera require a procedure known as calibration, i.e., the acquisition or estimation of all camera parameters needed for the SLAM algorithm to work properly. In this diploma thesis, we focus on the use of visual SLAM algorithms not on ground robotic vehicles but on Remotely Operated underwater Vehicles (ROVs). In particular, a visual SLAM approach has been developed for the BlueROV2, a small-sized underwater robot used for ocean research and exploration missions down to a depth of 100 m. The proposed approach relies on the ORB-SLAM3 algorithm and is adapted for onboard execution on the ROV using the Robot Operating System (ROS) framework. Its successful deployment required two hardware modifications to the BlueROV2: replacing the pre-installed Raspberry Pi 3 embedded computer with the more powerful Raspberry Pi 4, and replacing the pre-installed monocular camera with an Intel RealSense T265 stereo camera to exploit the capabilities of ORB-SLAM3. In addition, a control algorithm is proposed for the motion of the ROV, capable of executing various motion patterns, such as moving along a line, a rectangle, a circle, or a spiral, passing through points provided by the user. The combination of the proposed SLAM and motion-control approaches enables the vehicle to move in an unknown, obstacle-free environment with only minimal user intervention. Results were obtained through extensive simulations in water environments, as well as in a real indoor environment, albeit outside the water, since the modified BlueROV2 is not yet fully waterproof.
In any case, the proposed approach enables successful navigation, as long as a sufficient number of visual features are identified in the environment.
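To illustrate the kind of waypoint generation that the motion patterns described above imply, the following is a minimal sketch of generating (x, y) waypoints for a line, a circle, and a spiral. The function names and parameters are illustrative assumptions, not the thesis's actual implementation:

```python
import math

# Illustrative waypoint generators for the motion patterns mentioned in
# the abstract (line, circle, spiral). All names and parameters are
# hypothetical; the thesis's own control algorithm is not reproduced here.

def line_waypoints(start, end, n):
    """Return n evenly spaced (x, y) points from start to end."""
    (x0, y0), (x1, y1) = start, end
    return [(x0 + (x1 - x0) * k / (n - 1),
             y0 + (y1 - y0) * k / (n - 1)) for k in range(n)]

def circle_waypoints(center, radius, n):
    """Return n points evenly distributed on a circle around center."""
    cx, cy = center
    return [(cx + radius * math.cos(2 * math.pi * k / n),
             cy + radius * math.sin(2 * math.pi * k / n)) for k in range(n)]

def spiral_waypoints(center, growth, turns, n):
    """Return n points on an Archimedean spiral r = growth * theta."""
    cx, cy = center
    points = []
    for k in range(n):
        theta = 2 * math.pi * turns * k / (n - 1)
        r = growth * theta
        points.append((cx + r * math.cos(theta), cy + r * math.sin(theta)))
    return points
```

In a setup like the one described, such waypoints would be fed one by one to the vehicle's motion controller, which drives the ROV toward each point using the pose estimate provided by the SLAM system.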
