Dear Colleagues,

I am happy to share our new study on the use of Convolutional Neural
Networks for Passive Acoustic Monitoring. This study aimed to develop a
sound classifier to acoustically monitor the critically endangered humpback
dolphin in South African waters. The article is open access, and the PDF
is available at this link
<https://www.sciencedirect.com/science/article/pii/S1574954123003205>.
The new tool can be downloaded and tested with the demo available on
GitHub <https://github.com/Gui-Frainer/CetusID>.

All the best,

Gui

Abstract:
A novel framework for acoustic detection and species identification is
proposed to aid passive acoustic monitoring studies on the endangered
Indian Ocean humpback dolphin (*Sousa plumbea*) in South African waters.
Convolutional Neural Networks (CNNs) were used for both the detection and
species identification of dolphin vocalisations, and performance was
evaluated using custom and pre-trained architectures (transfer learning).
In total, 723 min of acoustic data were annotated for the presence of
whistles, burst pulses and echolocation clicks produced by *Delphinus
delphis* (~45.6%), *Tursiops aduncus* (~39%), *Sousa plumbea* (~14.4%),
and *Orcinus orca* (~1%). The best-performing models for detecting dolphin
presence and identifying species used two-second segments (spectral
windows) and were trained on images at 70 and 90 dpi, respectively. The
best detection model was built using a customised architecture and achieved
an accuracy of 84.4% for all dolphin vocalisations on the test set, and
89.5% for vocalisations with a high signal-to-noise ratio. The best
identification model was also built using the customised architecture and
correctly identified *S. plumbea* (96.9%), *T. aduncus* (100%), and *D.
delphis* (78%) encounters in the testing dataset. The framework was
designed using knowledge of complex dolphin sounds, and it may assist in
finding suitable CNN hyper-parameters for other species or populations.
Our study contributes towards the development of an open-source tool to
support long-term passive acoustic monitoring studies of endangered
species living in highly diverse habitats.
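For readers curious about the front end of such a pipeline, the sketch
below shows one common way to cut a recording into two-second segments and
turn each into a spectrogram image of the kind a CNN classifies. This is a
minimal illustration, not the pipeline from the paper: the sample rate,
FFT size, and hop length are assumed values chosen for the example only.

```python
import numpy as np

def spectrogram(x, n_fft=512, hop=256):
    """Magnitude spectrogram via a short-time FFT with a Hann window."""
    win = np.hanning(n_fft)
    frames = [x[i:i + n_fft] * win
              for i in range(0, len(x) - n_fft + 1, hop)]
    # shape: (frequency bins, time frames)
    return np.abs(np.fft.rfft(np.asarray(frames), axis=1)).T

def two_second_windows(x, sr):
    """Split a waveform into non-overlapping 2-s segments and return one
    spectrogram "image" per segment, ready for a CNN input layer."""
    seg_len = 2 * sr
    n_segments = len(x) // seg_len
    return [spectrogram(x[i * seg_len:(i + 1) * seg_len])
            for i in range(n_segments)]

# Example: 10 s of synthetic audio at an assumed 48 kHz sample rate
# yields five 2-s spectrogram images.
sr = 48000
audio = np.random.randn(10 * sr).astype(np.float32)
images = two_second_windows(audio, sr)
print(len(images), images[0].shape)  # 5 images, (257 freq bins, 374 frames)
```

In a real detector these spectrogram arrays would then be resized or
rendered at a fixed resolution (the study compares 70 and 90 dpi) before
being fed to the network.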

--
*Guilherme Frainer*
<https://gui-frainer.github.io/guifrainer.github.io/index.html>, PhD
Postdoctoral Fellow

Centre for Statistics in Ecology, Environment and Conservation
<http://www.seec.uct.ac.za/>, Department of Statistical Sciences,
University of Cape Town
Sea Search Research and Conservation <http://www.seasearch.co.za>
+27 068 3330404 (RSA)
Lattes <http://lattes.cnpq.br/6792744964578279> - ORCID
<https://orcid.org/0000-0002-5527-9219> - Scholar
<https://scholar.google.com.br/citations?user=qbVW_uUAAAAJ&hl=pt-BR>
_______________________________________________
MARMAM mailing list
[email protected]
https://lists.uvic.ca/mailman/listinfo/marmam
