Artificial intelligence helps us understand the language of animals.
The technology can analyze hours of animal audio in a fraction of the time the same work would take a human researcher.
“If you try to manually isolate these calls from audio files, it takes a very long time,” said Kevin Coffey, a professor at the University of Washington.
Coffey is also one of the creators of DeepSqueak, an AI program designed to pick up high-pitched rat calls that human ears often miss.
“In rats, these calls are often related to a positive or negative affect,” Coffey said. “They make certain calls in positive situations and others in negative situations.”
DeepSqueak works on visual representations of audio: it converts recordings into spectrogram images, then scans those images for the distinctive patterns of calls.
“It certainly works better with some things, like rodent calls or a whistle, than others,” Coffey said. “Not all vocalizations are this beautiful.”
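The spectrogram-scanning idea can be sketched in a few lines. This is an illustrative toy detector only, not DeepSqueak's actual pipeline (which relies on a trained neural network): it computes a spectrogram, then flags time slices whose peak power in the ultrasonic band crosses a threshold. The band limits and threshold here are invented for the example.

```python
import numpy as np
from scipy import signal

def detect_calls(audio, sample_rate, band=(20_000, 100_000), threshold_db=-40):
    """Flag time slices whose peak power in the ultrasonic band exceeds a threshold."""
    freqs, times, sxx = signal.spectrogram(audio, fs=sample_rate, nperseg=512)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    # Peak power per time slice within the ultrasonic band, in decibels.
    power_db = 10 * np.log10(sxx[in_band].max(axis=0) + 1e-12)
    return times[power_db > threshold_db]

# Synthetic example: a 50 kHz tone burst buried in quiet noise,
# standing in for a rat call that a human listener would miss.
rate = 250_000
t = np.arange(rate) / rate  # one second of audio
audio = 0.001 * np.random.default_rng(0).normal(size=rate)
burst = (t > 0.4) & (t < 0.5)
audio[burst] += np.sin(2 * np.pi * 50_000 * t[burst])
hits = detect_calls(audio, rate)
print(len(hits) > 0)  # the burst is found in the flagged time slices
```

A simple threshold like this works for clean recordings of tonal calls; the appeal of a learned model is that it copes with the messier vocalizations Coffey mentions.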
Similar machine learning technologies are being deployed at the Woods Hole Oceanographic Institution on the Massachusetts coast.
Researchers use underwater microphones to catalog different species on coral reefs.
A single reef can be home to over a hundred species.
“Sound travels very well in the ocean,” said Aran Mooney, associate scientist at WHOI. “It’s a really good indicator of those different species. We can put in one sensor and cover most of the reef.”
The work is urgent, as climate change threatens to wipe out some species.
“It’s a way of cataloging the animals that are more obvious, but we can also detect some of the more cryptic animals,” Mooney said. “That’s one of the goals of the AI, to get some of those rarer species. Humans are pretty good at taking out the obvious. We want to take out the rare things.”
The early success of this technology does not mean that an ‘animal-to-English’ translator is on the horizon, and any product that claims to translate your dog’s barks is likely fake.
Pets have a smaller range of vocalizations, and much of their communication happens through body language.
To fully understand a dog or cat, researchers think we will need a hybrid AI that analyzes both audio and video.
Some research is already underway.
“Things like DeepLabCut are an estimate, where they automatically try to score animal behavior based on individual video frames,” Coffey said. “I want to put those two very, very powerful techniques together so that we have behavior and communication in one model and we can get a better idea of what the sounds mean.”
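The fusion Coffey describes can be illustrated with a minimal sketch: align per-frame pose estimates (the kind a tool like DeepLabCut produces) with audio call features on a shared timeline, then hand the combined features to a single model. Every shape and feature name below is invented for the illustration; real pose and call data would come from the respective tools.

```python
import numpy as np

rng = np.random.default_rng(0)
n_frames = 100

# Stand-ins for real data: 4 tracked keypoints (x, y) per video frame,
# and the number of calls detected in the audio during each frame.
pose = rng.normal(size=(n_frames, 8))
call_rate = rng.poisson(2, size=n_frames)

# Simple fusion: concatenate behavior (pose) and communication (calls)
# into one feature vector per frame, ready for a single downstream model.
features = np.hstack([pose, call_rate[:, None]])
print(features.shape)  # (100, 9)
```

Concatenation is the simplest possible fusion; the point is only that once behavior and vocalization live in one representation, a single model can learn what the sounds mean in context.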