gibbonNetR: an R Package for the Use of Convolutional Neural Networks for Automated Detection of Acoustic Data

This is a Preprint and has not been peer reviewed. This is version 2 of this Preprint.



Authors

Dena Jane Clink, Abdul Hamid Ahmad

Abstract

Automated detection of acoustic signals is crucial for effective monitoring of vocal animals and their habitats across large spatial and temporal scales. Recent advances in deep learning have made high-performance automated detection approaches accessible to more practitioners. However, few deep learning approaches can be implemented natively in R. The 'torch for R' ecosystem has made the use of convolutional neural networks (CNNs) accessible for R users. Here, we provide an R package and workflow to use CNNs for automated detection and classification of acoustic signals from passive acoustic monitoring data. We provide examples using data collected in Sabah, Malaysia. The package provides functions to create spectrogram images from labeled data, compare the performance of different CNN architectures, deploy trained models over directories of sound files, and extract embeddings from trained models. The R programming language remains one of the most commonly used languages among ecologists, and we hope that this package makes deep learning approaches more accessible to this audience. In addition, these models can serve as important benchmarks for future automated detection work.
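The workflow described in the abstract (spectrogram creation, model training, deployment over sound files, and embedding extraction) might look roughly like the sketch below. This is an illustrative outline only: the function names, arguments, and directory layout are assumptions, not the verified gibbonNetR API, so consult the package documentation on GitHub before use.

```r
# Hypothetical sketch of a gibbonNetR-style workflow.
# All function and argument names below are assumed for illustration.
library(gibbonNetR)

# 1. Convert labeled sound clips into spectrogram images,
#    split into training/validation/test sets (assumed arguments).
spectrogram_images(
  trainingBasePath = "data/labeled_clips/",
  outputBasePath   = "data/spectrograms/",
  splits           = c(0.7, 0.2, 0.1)
)

# 2. Train and compare CNN architectures on the spectrograms
#    (architecture names and arguments assumed).
train_CNN_binary(
  input.data.path = "data/spectrograms/",
  architecture    = "resnet18",
  epoch.iterations = c(5),
  output.base.path = "models/"
)

# 3. Deploy a trained model over a directory of raw sound files.
deploy_CNN_binary(
  trained_model_dir = "models/",
  sound_file_path   = "data/raw_recordings/",
  output_folder     = "detections/"
)

# 4. Extract feature embeddings from a trained model for
#    downstream analyses (e.g., clustering or visualization).
extract_embeddings(
  trained_model_dir = "models/",
  image_path        = "data/spectrograms/test/"
)
```

Step 2 is where different architectures and training durations can be compared, which the abstract highlights as a core feature of the package.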

DOI

https://doi.org/10.32942/X2G61D

Subjects

Life Sciences

Keywords

Deep learning, Passive acoustic monitoring, gibbon

Dates

Published: 2024-07-14 16:22

Last Updated: 2025-02-17 07:58

License

CC BY Attribution 4.0 International

Additional Metadata

Language:
English

Conflict of interest statement:
None.

Data and Code Availability Statement:
https://github.com/DenaJGibbon/gibbonNetR