Rapid consistent reef surveys with DeepReefMap
In light of the critical threat that human activity poses to coral reefs worldwide, innovative monitoring strategies are needed that are efficient, standardized, scalable, and economical. This paper presents the results of the first large-scale transnational coral reef survey campaign in the Red Sea using DeepReefMap, which automatically analyzes video transects with neural networks for 3D semantic mapping. DeepReefMap is trained on imagery from low-cost underwater cameras, allowing surveys to be conducted and analyzed in just a few minutes. The campaign was carried out in Djibouti, Jordan, and Israel, with over 184 hours of video footage collected to train the neural network for 3D reconstruction. We created a semantic segmentation dataset of video frames with over 200,000 annotated polygons spanning 39 benthic classes, down to the resolution of prominent, visually identifiable genera found in the Red Sea. We analyzed 365 video transects from 45 sites with the deep-learning-based mapping system, demonstrating the method's robustness across environmental conditions and input video quality. We show that the surveys characterize benthic composition consistently, demonstrating the potential of DeepReefMap for monitoring. This research pioneers deep learning for practical 3D underwater mapping and semantic segmentation, paving the way for affordable, widespread deployment in reef conservation and ecology with tangible impact.
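At its core, characterizing benthic composition from segmented video transects reduces to aggregating per-class areas across frames. The following minimal sketch is not DeepReefMap's code; it assumes segmentation output in the form of integer label maps, and the function name `benthic_cover` and the class names used are hypothetical, standing in for a subset of the 39 benthic classes.

```python
# Minimal sketch (not the authors' implementation): estimate benthic cover
# fractions from per-frame semantic segmentation masks, where each mask is an
# integer label map and class IDs/names are hypothetical placeholders.
from collections import Counter

import numpy as np


def benthic_cover(masks: list[np.ndarray], class_names: dict[int, str]) -> dict[str, float]:
    """Aggregate pixel counts per class over all frames and normalize to cover fractions."""
    counts: Counter = Counter()
    for mask in masks:
        labels, pixels = np.unique(mask, return_counts=True)
        counts.update(dict(zip(labels.tolist(), pixels.tolist())))
    total = sum(counts.values())
    return {class_names.get(k, f"class_{k}"): v / total for k, v in counts.items()}


# Toy usage: random label maps stand in for real segmentation output.
rng = np.random.default_rng(0)
frames = [rng.integers(0, 3, size=(480, 640)) for _ in range(10)]
names = {0: "sand", 1: "hard_coral", 2: "soft_coral"}  # hypothetical class subset
print(benthic_cover(frames, names))
```

In practice, such pixel counts would be derived from the 3D semantic map rather than raw frames, so that overlapping video frames do not double-count the same surface; the sketch only illustrates the aggregation step.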