Title: Underwater SLAM with sidescan sonar
DNr: Berzelius-2022-256
Project Type: LiU Berzelius
Principal Investigator: Yiping Xie <yipingx@kth.se>
Affiliation: Kungliga Tekniska högskolan
Duration: 2023-01-01 – 2023-07-01
Classification: 10207
Homepage: https://www.vincentsitzmann.com/siren/


Differentiable rendering (including neural rendering) has recently been applied in computer graphics, combining classical rendering with deep learning, in particular neural scene representations. Its main advantage is that expensive 3D annotation is not required: with only 2D annotations on images, one can compute a loss directly on the rendered 2D images and back-propagate the gradients to whatever parameters need to be optimized. A similar situation arises in underwater perception, especially with imaging sonars such as sidescan sonar. Registering sidescan sonar data to a 3D seafloor map is expensive, but this step becomes unnecessary if we exploit the idea of differentiable rendering. In particular, with SIREN (https://www.vincentsitzmann.com/siren/), a differentiable neural representation, it is possible to perform SLAM on sidescan sonar data using a neural-rendering-based approach.

The training and optimization require substantial GPU resources, so the SNIC computing resources would be of great help to this project. The project would also help increase the autonomy of autonomous underwater vehicles. It is a continuation of the previous project (Berzelius-2022-128), in which the mapping part was completed; the next step is to add the localization part, so that we can perform Simultaneous Localization and Mapping (SLAM).
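The pipeline described above can be sketched in a 1-D toy setting (a hypothetical illustration, not the project's actual code): a SIREN-style network with sine activations is first fitted to seafloor observations (the mapping stage, as in the previous project), and a sensor-pose offset is then recovered by back-propagating a rendering-style loss through the frozen map (the localization stage to be added). All names, sizes, and learning rates here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
w0 = 6.0   # SIREN frequency-scale hyperparameter (assumed value)
H = 64     # hidden units

# Map network: y = sin(w0 * (x @ W1 + b1)) @ W2 + b2
W1 = rng.uniform(-1.0, 1.0, (1, H))
b1 = rng.uniform(-1.0, 1.0, H)
W2 = rng.uniform(-1.0, 1.0, (H, 1)) * np.sqrt(6.0 / H) / w0
b2 = np.zeros(1)

def forward(x):
    z = x @ W1 + b1                 # pre-activations, shape (N, H)
    h = np.sin(w0 * z)              # SIREN-style sine activation
    return z, h, h @ W2 + b2

def seafloor(x):                    # synthetic 1-D bathymetry profile
    return np.sin(3.0 * x) + 0.5 * np.cos(5.0 * x)

# Stage 1 (mapping): fit the neural map with plain gradient descent.
x = np.linspace(-1.0, 1.0, 256).reshape(-1, 1)
t = seafloor(x)
for _ in range(5000):
    z, h, y = forward(x)
    g = 2.0 * (y - t) / len(x)               # dL/dy for an MSE loss
    dW2, db2 = h.T @ g, g.sum(0)
    dz = (g @ W2.T) * np.cos(w0 * z) * w0    # back through the sine
    dW1, db1 = x.T @ dz, dz.sum(0)
    W1 -= 1e-2 * dW1; b1 -= 1e-2 * db1
    W2 -= 1e-2 * dW2; b2 -= 1e-2 * db2

# Stage 2 (localization): observations come from a shifted sensor pose;
# freeze the map and descend on the pose offset alone.
s_true = 0.1                        # unknown along-track offset
obs = seafloor(x + s_true)          # "pings" seen at the shifted positions
s = 0.0                             # initial pose estimate
for _ in range(500):
    z, h, y = forward(x + s)
    g = 2.0 * (y - obs) / len(x)
    dz = (g @ W2.T) * np.cos(w0 * z) * w0
    s -= 2e-3 * float((dz @ W1.T).sum())     # gradient w.r.t. the pose only

print(f"recovered offset: {s:.3f} (true {s_true})")
```

The same gradient that updates the map weights in stage 1 flows to the pose variable in stage 2; in the real project the scalar offset would be a full 6-DoF vehicle pose and the forward model a differentiable sidescan rendering, but the optimization structure is the same.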