Deciphering complex coral reef soundscapes with spatial audio and 360° video

Marc S. Dantzker*, Matthew T. Duggan, Erika Berlik, Symeon Delikaris-Manias, Vasileios Bountourakis, Ville Pulkki, Aaron N. Rice*

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review


Abstract

Coral reef soundscapes hold an untapped wealth of biodiversity information. While they are easy to record and filled with snapping shrimp and fish sounds, they are difficult to decipher because we know little about which sounds are made by which species. With identified fish sounds, acoustic monitoring can directly inform biodiversity metrics, detect indicator or invasive species, identify behavioural events and estimate abundance at temporal and spatial scales that are impossible with methods like eDNA or visual surveys. The missing link, knowing which sounds come from which species, is exceedingly difficult to establish with fish, especially on a species-rich coral reef. Using a novel method to visualize in situ underwater sound, we have developed a technique that combines visualizations of spatial audio with concentric 360° video recordings, a combination not previously accomplished underwater. We have identified and assigned to species the most extensive collection of natural fish sounds. Further, we demonstrate that these species identifications can be used to decipher a passive acoustic monitor recording. We have collected our identified recordings into a growing open-access resource to catalyse passive acoustic monitoring research, enabling species-specific resolution of coral reef soundscape dynamics and providing critical validated information for developing machine-learning models to analyse an ever-expanding collection of long-term recordings.
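The abstract describes localizing reef sounds with spatial audio so they can be matched to species seen on video. As background only (this is not the authors' pipeline, and the sample rate, hydrophone spacing and sound speed below are assumed values), a minimal sketch of the underlying idea is time-difference-of-arrival estimation between two hydrophones via cross-correlation:

```python
import numpy as np

C_WATER = 1500.0   # assumed nominal speed of sound in seawater, m/s
FS = 48_000        # assumed sample rate, Hz
SPACING = 0.5      # assumed hydrophone spacing, m

def estimate_bearing(sig_a, sig_b, fs=FS, spacing=SPACING, c=C_WATER):
    """Estimate angle of arrival (radians from broadside) from the
    time-difference-of-arrival between two hydrophone channels."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)  # delay in samples
    tau = lag / fs                            # delay in seconds
    # Clamp to the physically valid range before taking arcsin.
    s = np.clip(tau * c / spacing, -1.0, 1.0)
    return np.arcsin(s)

# Synthetic check: the same click on both channels, offset by a known delay.
rng = np.random.default_rng(0)
click = rng.standard_normal(64)
delay = 8  # samples
a = np.zeros(4096)
b = np.zeros(4096)
a[100:164] = click
b[100 + delay:164 + delay] = click
theta = estimate_bearing(a, b)  # arcsin(-8/48000 * 1500 / 0.5) = -pi/6
```

Real spatial-audio localization on a reef uses many more channels and beamforming rather than a single correlation pair, but the geometry (delay mapped to angle through the sound speed and sensor spacing) is the same.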

Original language: English
Pages (from-to): 2622-2637
Number of pages: 16
Journal: Methods in Ecology and Evolution
Volume: 16
Issue number: 11
DOIs
Publication status: Published - Nov 2025
MoE publication type: A1 Journal article-refereed

Funding

The authors wish to thank Cooper Nichols, who designed the bracket; Robb Nichols of Aquarian Audio, who made prototype hydrophones available for our use; Charlie Dantzker, who provided design and engineering support; and Brayden Zee, who developed analysis software. In the field, we were given invaluable support by Adriaan ‘Dutch’ Schrier and the staff of the Curaçao Sea Aquarium and Substation Curaçao, including Tafari Bakmeijer, Tico Cristiaan, Manuel Jove, Laureen Schenk, Jordy Stolk and Joel Tjong-A-Tjoe. By arrangement with the Sea Aquarium, Aldemar Rodriguez served as our field technician; his experience on the local reef proved invaluable. We would also like to thank Carole Baldwin and Matthew Girard from the National Museum of Natural History at the Smithsonian Institution for their involvement in our field operations. Heather Dantzker gave significant editorial feedback on the manuscript. Thanks to Aaron Adams, Brent Miller and Will Palmer, our volunteer web team, who wrote, designed and developed the online library. Xavier Mouy, Rodney Rountree and two anonymous reviewers offered helpful feedback on an earlier draft. This work was supported by Schmidt Marine Technology Partners, Schmidt Family Foundation (MSD, ANR, VP); Oceankind, LLC (ANR, MSD); a National Science Foundation Graduate Research Fellowship under Grant No. 2139899 (MTD); the Cornell Lab of Ornithology Athena Fund (MTD); and a Smithsonian Tropical Research Institution D. Ross Robertson Research Award Fellowship for Field Studies on Neotropical Deep-Reef Fishes (MTD).

Keywords

  • animal communication
  • bioacoustics
  • conservation
  • passive acoustic monitoring
