Artificial Intelligence can listen to the chorus of the fish

New research from Oregon State University shows how machine learning, a branch of Artificial Intelligence, can eavesdrop on fish making sounds and figure out how their environment affects them.

The research team developed an automated method that can accurately identify the calls of a family of fish.

The method takes advantage of data collected by underwater microphones known as hydrophones and provides an efficient and inexpensive way to understand changes in the marine environment.

The research analyzed 18,000 hours of sound recordings from a 12-station hydrophone array maintained by NOAA and the National Park Service in American Samoa.

The team used software that renders sound visually, which readily identified a range of sounds, such as whale calls.
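The "visual reading" of sound refers to spectrograms: images of frequency content over time. A minimal sketch of the idea in Python, using only NumPy and a synthetic tone standing in for a hydrophone clip (the signal, sample rate, and frame size are illustrative, not from the study):

```python
import numpy as np

# Synthetic one-second "recording": a 440 Hz tone plus noise,
# standing in for a hydrophone clip (illustrative only).
fs = 8000  # sample rate in Hz
rng = np.random.default_rng(0)
t = np.arange(fs) / fs
clip = np.sin(2 * np.pi * 440 * t) + 0.1 * rng.standard_normal(fs)

# Slice the clip into short frames and FFT each one: the resulting
# matrix of magnitudes is a spectrogram (time vs. frequency), the
# kind of picture an analyst or a model can read visually.
frame = 256
n_frames = len(clip) // frame
frames = clip[: n_frames * frame].reshape(n_frames, frame)
spec = np.abs(np.fft.rfft(frames, axis=1))  # shape: (time, frequency)

# The tone shows up as a bright band near 440 Hz.
freqs = np.fft.rfftfreq(frame, d=1 / fs)
peak_freq = freqs[spec.mean(axis=0).argmax()]
```

A whale call or fish chorus appears in such an image as a characteristic shape, which is what pattern-recognition software looks for.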

Underwater Choir

But it also recorded stranger sounds: mooing noises, and more mysterious ones reminiscent of tubas or jet skis.

The computer program was especially good at distinguishing the peculiar sounds produced when fish gather to form a kind of underwater chorus.


The model focused on 400-500 damselfish calls and correctly identified 94% of the calls the fish made on the seabed.

Researchers observed that these fish grind their teeth to create clicking and chirping sounds associated with aggressive behavior and nest defense. They generate a kind of purring similar to that of cats.

Hydrophone deployed in a tropical reef region within Samoa National Park. Tim Clark/National Park Service.

Ecosystem health

The system applied in this research can not only identify sounds from fish colonies, but also offer clues about the health of an ecosystem, the researchers believe.

Since fish songs change with environmental conditions, such as wind speed, water temperature, tidal range, and sound pressure level, fish sounds can be an indicator of how an ecosystem is working, especially in oceans that are rapidly experiencing climate change.

These machine learning approaches have been used to analyze humpback whale calls before, but this is the first time they have been applied to the world of fish.

Hydrophones are increasingly being used in the world's oceans. They offer advantages over other types of tracking because they work at night, in low visibility conditions, and for long periods of time. But techniques for efficiently analyzing data from hydrophones are not well developed.

Machine Learning

Machine learning algorithms build a model based on sample data, known as training data, to make predictions or decisions without being explicitly programmed to do so.
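The train-then-predict pattern described above can be sketched with a toy nearest-centroid classifier. The feature values and class names below are invented purely for illustration (e.g., a call's dominant frequency and duration); the study's actual model and features are not shown here:

```python
import numpy as np

# Toy "training data": feature vectors for two hypothetical call
# types, with labels. Columns stand in for features such as
# dominant frequency (Hz) and duration (s) -- illustrative only.
rng = np.random.default_rng(0)
chirps = rng.normal(loc=[450.0, 0.2], scale=[20.0, 0.05], size=(50, 2))
pops = rng.normal(loc=[150.0, 0.8], scale=[20.0, 0.05], size=(50, 2))

X_train = np.vstack([chirps, pops])
y_train = np.array([0] * 50 + [1] * 50)  # 0 = chirp, 1 = pop

# "Training" a nearest-centroid model: remember the mean feature
# vector of each class. No explicit rules are programmed in.
centroids = np.array([X_train[y_train == c].mean(axis=0) for c in (0, 1)])

def predict(x):
    # Assign a new sound to the class whose centroid is closest.
    return int(np.linalg.norm(centroids - np.asarray(x), axis=1).argmin())

label = predict([440.0, 0.25])  # falls near the chirp cluster
```

The model's behavior comes entirely from the labeled examples it was trained on, which is the defining trait of machine learning.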

Machine learning techniques have been used to automate the processing of large amounts of data from passive acoustic tracking devices that collect sound data from birds, bats, and marine mammals.

“We built a machine learning model on a relatively small set of training data and then applied it to a huge data set,” explains the lead author of this research, Jill Munger, in a statement.

“The implications for monitoring the environment are enormous,” she adds. “The benefit of observing fish calls over a long period of time is that we can begin to understand how they relate to changing ocean conditions, which influence living marine resources: for example, the abundance of damselfish calls can be an indicator of the health of coral reefs.”
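The workflow Munger describes, training on a small labeled set and then sweeping a detector across thousands of hours of recordings, can be sketched in miniature. Everything here (the synthetic recording, the burst times, the energy threshold) is made up to illustrate the pattern, not taken from the study:

```python
import numpy as np

# A long synthetic "deployment recording": one minute of background
# noise with a few loud call-like bursts inserted (illustrative only).
fs = 1000
rng = np.random.default_rng(1)
recording = 0.05 * rng.standard_normal(fs * 60)
for start in (5, 20, 41):  # seconds at which bursts occur
    recording[start * fs : start * fs + fs // 10] += 1.0

# Apply one cheap, pre-tuned rule across the whole stream: flag any
# one-second window whose mean energy exceeds a threshold. A real
# pipeline would apply a trained classifier in the same sliding way.
windows = recording.reshape(60, fs)
energy = (windows ** 2).mean(axis=1)
detections = np.flatnonzero(energy > 0.01)  # window indices, in seconds
```

Because the per-window check is cheap, the same idea scales from this one-minute toy to archives of thousands of hours.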

Reference

Machine learning analysis reveals relationship between pomacentrid calls and environmental cues. Jill E. Munger et al. Marine Ecology Progress Series, 681:197-210 (2022). DOI: https://doi.org/10.3354/meps13912