%0 Journal Article
%9 ACL: Articles in peer-reviewed journals indexed by AERES
%A Villon, Sébastien
%A Iovan, Corina
%A Mangeas, Morgan
%A Vigliola, Laurent
%T Toward an artificial intelligence-assisted counting of sharks on baited video
%D 2024
%L fdi:010089628
%G ENG
%J Ecological Informatics
%@ 1574-9541
%K Deep learning ; Neural network ; Coral reef ; Marine ecology ; Shark conservation
%M ISI:001174108700001
%P 102499 [9 ]
%R 10.1016/j.ecoinf.2024.102499
%U https://www.documentation.ird.fr/hor/fdi:010089628
%> https://horizon.documentation.ird.fr/exl-doc/pleins_textes/2024-04/010089628.pdf
%V 80
%W Horizon (IRD)
%X Given the global biodiversity crisis, there is an urgent need for new tools to monitor populations of endangered marine megafauna, such as sharks. To this end, Baited Remote Underwater Video Stations (BRUVS) stand as the most effective tools for estimating shark abundance, measured using the MaxN metric. However, a bottleneck exists in manually computing MaxN from extensive BRUVS video data. Although artificial intelligence methods are capable of solving this problem, their effectiveness is tested using AI metrics such as the F-measure, rather than ecologically informative metrics employed by ecologists, such as MaxN. In this study, we present both an automated and a semi-automated deep learning approach designed to produce the MaxN abundance metric for three distinct reef shark species: the grey reef shark (Carcharhinus amblyrhynchos), the blacktip reef shark (C. melanopterus), and the whitetip reef shark (Triaenodon obesus). Our approach was applied to one-hour baited underwater videos recorded in New Caledonia (South Pacific). Our fully automated model achieved F-measures of 0.85, 0.43, and 0.72 for the three species, respectively. It also generated MaxN abundance values that showed a high correlation with manually derived data for C. amblyrhynchos (R = 0.88). For the two other species, correlations were significant but weak (R = 0.35-0.44). Our semi-automated method significantly enhanced F-measures to 0.97, 0.86, and 0.82, resulting in high-quality MaxN abundance estimations while drastically reducing the video processing time. To our knowledge, we are the first to estimate MaxN with a deep learning approach. In our discussion, we explore the implications of this novel tool and underscore its potential to produce innovative metrics for estimating fish abundance in videos, thereby addressing current limitations and paving the way for comprehensive ecological assessments.
%$ 036 ; 122 ; 034