AI Being Tapped to Understand What Whales Say to Each Other 

By AI Trends Staff 

AI is being applied to whale research, especially to understand what whales are trying to communicate with the audible sounds they make to each other in the ocean.

For example, marine biologist Shane Gero has worked to match the clicks whales make around the Caribbean island nation of Dominica to behavior, in the hope of revealing the meanings of the sounds. Gero is a behavioral ecologist affiliated with the Marine Bioacoustics Lab at Aarhus University in Denmark and the Department of Biology at Dalhousie University in Halifax, Nova Scotia.

Shane Gero, founder, Dominica Sperm Whale Project

Gero works with a team from Project CETI, a nonprofit that aims to apply advanced machine learning and state-of-the-art robotics to listen to and translate the communication of whales. Project CETI recently announced a five-year effort to build on Gero’s work with a research project that will try to decipher what sperm whales are saying to each other, according to a recent account in National Geographic.

The team includes experts in linguistics, robotics, machine learning, and camera engineering. They will lean on advances in AI that can now translate one human language to another, in what is believed to be the largest interspecies communication effort in history.

The team has been building specialized video and audio recording devices, which aim to capture millions of whale sounds and analyze them. They hope to gain insight into the underlying architecture of whale chatter.  

“The question comes up: What are you going to say to them? That kind of misses the point,” Gero stated. “It assumes they have a language to talk about us and boats or the weather or whatever we might want to ask them about.”  

The scientists wonder whether whales have grammar, syntax or anything analogous to words and sentences. They plan to track how whales behave when making or hearing clicks. Using advances in natural language processing, researchers will try to interpret this information.   
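
The article doesn’t say which NLP methods will be used, but a first structural question is whether whale click patterns (codas) follow non-random sequences, the kind of regularity a language model could exploit. Here is a minimal sketch, assuming codas have already been transcribed into discrete type labels; the sequence below is invented for illustration, not real data:

```python
from collections import Counter

# Hypothetical transcription: each sperm whale coda reduced to a discrete
# type label. This sequence is made up for illustration, not real data.
codas = ["1+1+3", "1+1+3", "5R", "1+1+3", "4R", "5R", "1+1+3", "5R"]

# Count how often each coda type follows another (bigram statistics).
# Strongly non-uniform transition counts would hint at sequential
# structure, one prerequisite for anything like syntax.
bigrams = Counter(zip(codas, codas[1:]))
for (prev, nxt), n in bigrams.most_common():
    print(f"{prev} -> {nxt}: {n}")
```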

The Project CETI team includes David Gruber, a professor of biology and environmental science at City University of New York. He became interested in sperm whales while a fellow at Harvard University’s Radcliffe Institute. He wondered whether sperm whales could have a communication system that could be called language, something linguists have heretofore thought non-human animals lack. After learning of Gero’s work, the two joined forces.

Gruber’s machine learning colleagues applied AI techniques to some of Gero’s audio to identify individual sperm whales from their sounds. The system was right more than 94% of the time. The fact that sperm whales rely almost exclusively on acoustic information narrowed the task.
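
The account doesn’t describe the model, but identifying individuals from recordings is commonly set up as supervised classification over summary acoustic features. A minimal sketch along those lines, assuming a handful of labeled clips per whale; the MFCC features, random-forest model, and file names are illustrative assumptions, not CETI’s method:

```python
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def clip_features(path):
    """Load a click recording and summarize it as mean MFCCs."""
    y, sr = librosa.load(path, sr=None)          # keep native sample rate
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
    return mfcc.mean(axis=1)                     # one feature vector per clip

# Hypothetical training data: clips labeled with the individual whale's ID.
paths = ["whale_a_01.wav", "whale_a_02.wav", "whale_b_01.wav"]
labels = ["whale_a", "whale_a", "whale_b"]

X = np.array([clip_features(p) for p in paths])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X, labels)

# Predict which individual produced a new recording.
print(clf.predict([clip_features("unknown_clip.wav")]))
```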

The CETI researchers have spent a year developing an array of high-resolution underwater sensors that will record sound 24 hours a day across a vast portion of Gero’s whale study area off Dominica. Three of these listening systems, each attached to a buoy at the surface, will drop straight down thousands of feet to the bottom, with hydrophones every few hundred meters. 

“We want to know as much as we can,” Gruber stated to National Geographic. “What’s the weather doing? Who’s talking to who? What’s happening 10 kilometers away? Is the whale hungry, sick, pregnant, mating? But we want to be as invisible as possible as we do it.”

Scientists Studying Sounds of Endangered Beluga Whales in Alaska

Similar whale research is going on in Alaska, where scientists are using a machine learning application to collect information essential to protect and recover the endangered Cook Inlet beluga whale population, according to a post from NOAA Fisheries. (NOAA is the National Oceanic and Atmospheric Administration, an agency within the US Department of Commerce.)  

In 1979, the Cook Inlet beluga population began a rapid decline. Despite being protected as an endangered species since 2008, the population still shows no sign of recovery and continues to decline.  

Beluga whales live in the Arctic or sub-Arctic. They are vulnerable to many threats such as pollution, extreme weather, and interactions with fishing activity. Underwater noise pollution, which interferes with the whales’ ability to communicate, is a special concern. The Cook Inlet, in Alaska’s most densely populated region, supports heavy vessel traffic, oil and gas exploration, construction, and other noisy human activities. 

The scientists working in Cook Inlet are using passive acoustic monitoring to provide information on beluga movement and habitat use. It also helps scientists identify when noise may be affecting beluga behavior, and ultimately, survival. 

Scientists listen for belugas using a network of moored underwater recorders, which collect enormous volumes of audio data including noise from the natural ocean environment, human activities, and other animals, as well as beluga calls. 

To detect potential beluga signals in these sometimes noisy recordings, scientists have traditionally used a series of basic algorithms. However, the algorithms do not work as well in noisy areas. It is hard to distinguish faint beluga calls from signals such as creaking ice, ship propellers, and the calls of other cetaceans like killer and humpback whales. Until now, removing the false detections and correctly classifying beluga calls required months of labor-intensive analysis by scientists.
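
The post doesn’t name the algorithms, but a typical baseline detector of this kind simply flags stretches of audio whose band-limited energy rises above the background, which is exactly what breaks down in noise. A minimal sketch of that idea; the frequency band, window length, and threshold are illustrative assumptions, not NOAA’s settings:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def energy_detections(y, sr, band=(1000, 8000), win_s=0.5, thresh_db=10.0):
    """Flag windows whose in-band energy rises thresh_db above the median.

    Band, window length, and threshold are illustrative assumptions.
    """
    sos = butter(4, band, btype="bandpass", fs=sr, output="sos")
    filtered = sosfiltfilt(sos, y)
    win = int(win_s * sr)                      # samples per analysis window
    n = len(filtered) // win
    energy = (filtered[: n * win] ** 2).reshape(n, win).mean(axis=1)
    energy_db = 10 * np.log10(energy + 1e-12)
    floor = np.median(energy_db)               # crude noise-floor estimate
    hits = np.nonzero(energy_db > floor + thresh_db)[0]
    return hits * win_s                        # detection start times, seconds
```

Any loud broadband source, such as a ship propeller or creaking ice, trips the same threshold, which is why so much manual review was needed downstream.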

This year, the NOAA scientists are working with Microsoft AI experts to train AI programs using deep learning techniques. The programs will perform the most tedious, expensive, and time-consuming part of analyzing acoustic data: classifying detections as beluga calls or false signals.   
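
Details of the model aren’t given in the post; a common design for this step is a small convolutional network that labels a spectrogram patch around each detection as a beluga call or a false signal. A minimal PyTorch sketch under that assumption, with an illustrative architecture and input size rather than the NOAA/Microsoft system:

```python
import torch
import torch.nn as nn

class DetectionClassifier(nn.Module):
    """Binary classifier: spectrogram patch -> beluga call vs. false signal.

    Illustrative architecture, not the NOAA/Microsoft model.
    """
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(32 * 32 * 32, 2),        # assumes 1x128x128 input patches
        )

    def forward(self, x):
        return self.net(x)

model = DetectionClassifier()
patch = torch.randn(1, 1, 128, 128)            # one fake spectrogram patch
probs = torch.softmax(model(patch), dim=1)
print(probs)                                   # [P(false signal), P(beluga)]
```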

Manuel Castellote, Bioacoustician, NOAA Alaska Fisheries Science Center

“Deep learning is as close as we can get to how the human brain works,” stated Manuel Castellote, NOAA Fisheries affiliate with the University of Washington, Joint Institute for the Study of the Atmosphere and Ocean, who is leading the study. “And so far the results have been beyond expectation. Machine learning is achieving more than 96% accuracy in classifying detections compared to a scientist doing the classification. It is even picking up things human analysts missed. We didn’t expect it to work as well as humans. Instead, it works better.” 

The machine learning model is highly accurate and can process an enormous amount of data very quickly. “A single mooring dataset, with 6-8 months of sound recordings, would take 10-15 days to manually classify all the detections,” Castellote stated. “With machine learning tools, it is done overnight. Unsupervised.”   

A network of 15 moorings in Cook Inlet is deployed and retrieved twice a year. “Remote sensors, like acoustic moorings, have revolutionized our ability to monitor wildlife populations, but have also created a backlog of raw data,” stated Dan Morris, principal scientist on the Microsoft AI for Earth team. AI is used to automate this data analysis, making it more efficient. This way, scientists “can get back to doing science instead of labeling data,” he stated.

Simon Fraser University Studying Killer Whale Calls 

In another effort, researchers with Simon Fraser University, a public research university in British Columbia, Canada, are using AI and machine learning on a project to classify whale calls. The goal is to create a warning system to help protect endangered killer whales from potentially fatal ship strikes. 

The project is supported with $568,179 in funding from Fisheries and Oceans Canada under the Oceans Protection Plan–Whale Detection and Collision Avoidance Initiative. 

Ruth Joy, statistical ecologist and lecturer, Simon Fraser University

“Southern resident killer whales are an endangered species and people are very fond of these animals,” stated Ruth Joy, a statistical ecologist and lecturer in SFU’s School of Environmental Science, in a press release from the university. “They want to see that these marine mammals are protected and that we are doing everything that we can to make sure that the Salish Sea is a good home for them.”

The team is working with citizen scientists and the Orcasound project, which provide several terabytes of whale call datasets; the data is being collected by Steven Bergner, a computing science research associate at SFU’s Big Data Hub.

The acoustic data will be used to ‘teach’ the computer to recognize which call belongs to each type of cetacean, according to Bergner. The project brings together experts from fields such as biology, statistics and machine learning. “In the end, we are developing a system that will be a collaboration between human experts and algorithms,” Bergner stated. 
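
The press release doesn’t detail how that collaboration will work; one common pattern is to have the model auto-accept only confident predictions and route uncertain ones to human experts. A minimal sketch, in which the confidence threshold, call IDs, and label set are illustrative assumptions:

```python
# Hypothetical human-in-the-loop triage: auto-label confident predictions,
# queue uncertain ones for expert review.
CONFIDENCE_THRESHOLD = 0.9   # illustrative cutoff; would be tuned in practice

def triage(call_id, class_probs):
    """class_probs: dict mapping cetacean type -> model probability."""
    label, p = max(class_probs.items(), key=lambda kv: kv[1])
    if p >= CONFIDENCE_THRESHOLD:
        return ("auto", call_id, label)
    return ("expert_review", call_id, label)   # human makes the final call

print(triage("call_0042", {"southern_resident": 0.97, "transient": 0.03}))
print(triage("call_0043", {"southern_resident": 0.55, "transient": 0.45}))
```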

Orcas, or killer whales, seen along the West Coast are divided into four distinct populations: the salmon-eating southern and northern residents; the transients, which prey on seals or other whales; and the offshores, which mostly prey on sharks. Each orca population is further categorized into families called pods. Each pod has its own dialect, and each population of orca has calls that differ from those of the other populations.

Read the source articles and information in National Geographic, from NOAA Fisheries and in a press release from Simon Fraser University. 

Source: https://www.aitrends.com/ai-in-science/ai-being-tapped-to-understand-what-whales-say-to-each-other/
