Coral Reefs Generate a Hidden Sound Underwater, And It Could Help Us to Save Them

We tend to equate healthy coral reefs with their aesthetic beauty: the brilliant array of colors and forms that inhabit these stunning underwater ecosystems.

They may, however, be rather loud. If you've ever snorkeled over a coral reef, you'll be familiar with the peculiar clicking and popping noises generated underwater by marine organisms such as snapping shrimp and feeding fish.

That buzzy din of background noise – almost like the chattering hiss of radio static – is such a distinct component of the coral reef soundscape that it might aid in monitoring the health of these critically endangered marine environments.

A recent study uses machine learning to train an algorithm to distinguish the small acoustic variations between a healthy, active reef and a damaged coral site – an audio contrast so subtle that it may be hard for humans to detect.

The new technology might provide considerable advantages over current labor-intensive and time-consuming techniques for monitoring reef health, such as having divers visit reefs to visually assess coral cover or manually listening to reef recordings, according to the study. Furthermore, many reef animals hide or are only visible at night, confounding any visual surveys.

"Our findings show that a computer can pick up patterns that are undetectable to the human ear," says marine scientist Ben Williams of the University of Exeter in the United Kingdom.

"It can tell us faster, and more accurately, how the reef is doing."

Williams and colleagues recorded coral acoustics at seven distinct locations in the Spermonde Archipelago, which is located off the southwest coast of Sulawesi in Indonesia and is home to the Mars Coral Reef Restoration project.

The recordings covered four distinct types of reef habitat: healthy, degraded, mature restored, and recently restored – each with a different amount of coral cover and, as a result, a distinctive profile of noise from the aquatic species living and foraging in the area.

"Previously we relied on manual listening and annotation of these recordings to make reliable comparisons," Williams adds in a Twitter thread.

"However, this is a very slow process and the size of marine soundscape databases is skyrocketing given the advent of low-cost recorders."

To automate the procedure, the researchers created a machine learning system to distinguish between several types of coral recordings. Following tests, the AI program was able to assess reef health from audio recordings with 92 percent accuracy.
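The paper does not spell out its pipeline here, but the general idea – extract acoustic features from hydrophone clips, then train a supervised classifier to separate healthy from degraded soundscapes – can be illustrated with a toy sketch. Everything below is hypothetical: the synthetic "snap" clips, the hand-picked features (RMS loudness, spectral centroid, low-band energy share), and the simple nearest-centroid classifier are stand-ins for whatever recordings, acoustic indices, and model the study actually used.

```python
import numpy as np

rng = np.random.default_rng(0)
SR = 8000  # hydrophone sample rate in Hz (assumed for this sketch)

def synth_clip(snap_rate, n=SR):
    """Simulated 1-second hydrophone clip: low-level background noise
    plus random broadband 'snaps' - denser snapping stands in for a
    livelier, healthier reef."""
    clip = rng.normal(0.0, 0.05, n)
    for _ in range(rng.poisson(snap_rate)):
        i = rng.integers(0, n - 50)
        clip[i:i + 50] += np.hanning(50) * rng.uniform(0.5, 1.0)
    return clip

def features(clip):
    """Crude soundscape descriptors: RMS loudness, spectral centroid,
    and the share of energy below 1 kHz."""
    spec = np.abs(np.fft.rfft(clip))
    freqs = np.fft.rfftfreq(len(clip), 1.0 / SR)
    rms = np.sqrt(np.mean(clip ** 2))
    centroid = float(np.sum(freqs * spec) / np.sum(spec))
    low_share = spec[freqs < 1000].sum() / spec.sum()
    return np.array([rms, centroid, low_share])

# Training data: label 0 = degraded (sparse snaps), 1 = healthy (dense snaps)
train_X = np.array([features(synth_clip(r)) for r in [3] * 50 + [40] * 50])
train_y = np.array([0] * 50 + [1] * 50)

# Nearest-centroid classifier on standardized features - about the
# simplest supervised learner that could do this job
mu, sd = train_X.mean(axis=0), train_X.std(axis=0)
train_z = (train_X - mu) / sd
centroids = [train_z[train_y == c].mean(axis=0) for c in (0, 1)]

def predict(clip):
    z = (features(clip) - mu) / sd
    return int(np.argmin([np.linalg.norm(z - c) for c in centroids]))

# Held-out clips the classifier never saw during training
test_clips = [synth_clip(3) for _ in range(20)] + [synth_clip(40) for _ in range(20)]
test_y = [0] * 20 + [1] * 20
accuracy = np.mean([predict(c) == y for c, y in zip(test_clips, test_y)])
```

On this deliberately easy synthetic data the two classes separate cleanly; the hard part of the real study is that the acoustic differences between reef states are far subtler, which is why a learned model outperforms manual listening.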

"This is a really exciting development," says co-author Timothy Lamont of Lancaster University in the United Kingdom.

"In many cases it's easier and cheaper to deploy an underwater hydrophone on a reef and leave it there than to have expert divers visiting the reef repeatedly to survey it – especially in remote locations." 

The algorithm's verdicts, the researchers say, rest on a combination of underwater soundscape factors: the abundance and diversity of fish vocalizations, sounds made by invertebrates, and possibly faint noises thought to be made by algae. Abiotic sources also contribute, such as subtle differences in how waves and wind sound across different kinds of coral habitat.

While the human ear may not easily detect such faint and hidden sounds, machines appear able to pick out the differences reliably. The researchers acknowledge the method can still be refined, with greater sound sampling in the future expected to deliver "a more nuanced approach to classifying ecostate".

Unfortunately, time is a commodity that the world's corals are rapidly running out of. We must act quickly if we want to save them.