MONTREAL (AP) -- Peeps trickle out of a soundproof chamber as its door opens. Female zebra finches are chattering away inside the microphone-lined box. The laboratory room sounds like a chorus of squeaky toys.

"They're probably talking about us a little bit," says McGill University postdoctoral fellow Logan James.

It's unclear, of course, what they are saying. But James believes he is getting closer to deciphering their vocalizations through a partnership with the Earth Species Project. The nonprofit laboratory has drawn some of the technology industry's wealthiest philanthropists -- and they want to see more than just scientific progress. On top of breakthroughs in animal language, they expect improved interspecies understanding will foster greater appreciation for the planet in the face of climate change.

The Earth Species Project hopes to decode other creatures' communications with its pioneering artificial intelligence tools. The goal is not to build a "translator that will allow us to speak to other species," Director of Impact Jane Lawton said. However, she added, "rudimentary dictionaries" for other animals are not only possible but could help craft better conservation strategies and reconnect humanity with often forgotten ecosystems.

"We believe that by reminding people of the beauty, the sophistication, the intelligence that is resident in other species and in nature as a whole, we can start to, kind of, almost repair that relationship," Lawton said.

At McGill University, the technology generates specific calls during simulated conversations with live finches that help researchers isolate each unique noise. The computer processes calls in real time and responds with one of its own. Those recordings are then used to train the Berkeley, California-based research group's audio language model for animal sounds.
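To make the shape of that loop concrete, here is a minimal sketch of a record, detect and respond cycle. The libraries (sounddevice, soundfile), the energy threshold and the file names are illustrative assumptions, not details of the McGill or ESP setup.

```python
# A hypothetical record-detect-respond loop: listen in short windows, treat a
# loud window as "the bird called," answer with a pre-recorded call, and keep
# everything for later model training. All thresholds and files are made up.
import numpy as np
import sounddevice as sd
import soundfile as sf

SAMPLE_RATE = 44_100        # Hz
WINDOW_SECONDS = 0.5        # length of each listening window
ENERGY_THRESHOLD = 0.01     # RMS level treated as "a call happened" (hypothetical)

# A pre-recorded finch call used as the computer's reply (hypothetical file).
response_call, response_sr = sf.read("response_call.wav", dtype="float32")

collected = []              # raw audio kept for later training data

def listen_once() -> np.ndarray:
    """Record one short mono window from the microphone."""
    audio = sd.rec(int(WINDOW_SECONDS * SAMPLE_RATE),
                   samplerate=SAMPLE_RATE, channels=1)
    sd.wait()
    return audio[:, 0]

for _ in range(100):                         # roughly 50 seconds of interaction
    window = listen_once()
    collected.append(window)                 # log everything, call or not
    rms = float(np.sqrt(np.mean(window ** 2)))
    if rms > ENERGY_THRESHOLD:               # crude "the bird called" check
        sd.play(response_call, response_sr)  # answer with a call of our own
        sd.wait()

# The session can later be annotated and fed to an audio language model.
sf.write("session_recording.wav", np.concatenate(collected), SAMPLE_RATE)
```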

New insights into how animals communicate

This ad hoc collaboration is only a glimpse into what ESP says will come. By 2030, Lawton said, it expects "really interesting insights into how other animals communicate." Artificial intelligence advancements are expediting the research. New grants totaling $17 million will help hire engineers and at least double the size of the research team, which currently has roughly seven members. Over the next two years, Lawton said, the nonprofit's researchers will select species that "might actually shift something" in people's relationship with nature.

Standing to benefit are animal groups threatened by habitat loss or human activity that could be better protected if their languages were better understood. Existing collaborations aim to document the vocal repertoires -- the distinct calls and their different contexts -- of the Hawaiian crow and St. Lawrence River beluga whales.

After spending more than two decades extinct in the wild, the crows are now being reintroduced. But some conservationists fear that critical vocabulary has faded in captivity. Lawton said the birds might need to relearn some "words" before they reenter their natural habitat in droves.

In Canada's St. Lawrence River, where shipping traffic imperils the marine mammals who feed there, the group's scientists are exploring whether machine learning can categorize unlabeled calls from the remaining belugas. Perhaps, Lawton suggested, authorities could alert nearby vessels if they understood that certain sounds signaled the whales were about to surface.
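As a rough illustration of what categorizing unlabeled calls can mean in practice, here is a minimal clustering sketch. The feature choice (MFCC averages), the cluster count and the folder name are assumptions for illustration, not a description of ESP's beluga pipeline.

```python
# A hypothetical first pass at grouping unlabeled whale calls by acoustic
# similarity: summarize each clip with MFCC features, then cluster them.
from pathlib import Path
import numpy as np
import librosa
from sklearn.cluster import KMeans

def call_features(path: Path) -> np.ndarray:
    """Summarize one call clip as the mean of its MFCC frames."""
    audio, sr = librosa.load(path, sr=None)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)
    return mfcc.mean(axis=1)

clips = sorted(Path("beluga_calls").glob("*.wav"))  # unlabeled recordings (hypothetical folder)
features = np.stack([call_features(p) for p in clips])

kmeans = KMeans(n_clusters=8, random_state=0, n_init=10).fit(features)

# Each cluster is a candidate "call type" for a biologist to review; if one
# cluster reliably preceded surfacing, it could feed the kind of vessel alert
# Lawton describes.
for clip, label in zip(clips, kmeans.labels_):
    print(f"{clip.name}\tcluster {label}")
```

The design choice here is deliberately simple: unsupervised clustering needs no labels up front, which matters when no one yet knows what the calls mean.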

Big donors include LinkedIn co-founder Reid Hoffman, the family charity founded by the late Microsoft co-founder Paul G. Allen, and Laurene Powell Jobs' Waverley Street Foundation. The latter aims to support "bottom-up" solutions to the "climate emergency." At the root of that crisis, according to Waverley Street Foundation President Jared Blumenfeld, is the idea that humans deserve "dominion" over the world.

Blumenfeld sees ESP's work as an important reminder that we are instead stewards of the planet.

"This is not a silver bullet,鈥� he said. 鈥淏ut it鈥檚 certainly part of a suite of things that can help transform how we view ourselves in relation to nature.鈥�

'Exponential takeoff' in processing calls

Gail Patricelli -- an animal behavior professor at the University of California, Davis, who is not affiliated with ESP -- remembers when such tools were just "pie in the sky." Researchers previously spent months laboring to manually comb through terabytes of recordings and annotate calls.
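The simplest version of an automated first pass over such recordings -- before any learned model is involved -- looks something like the sketch below. The frame length, threshold and file name are illustrative assumptions.

```python
# A hypothetical energy-based detector: flag loud segments of a long field
# recording as candidate calls for a human (or a classifier) to review.
import numpy as np
import soundfile as sf

audio, sr = sf.read("field_recording.wav", dtype="float32")  # hypothetical file
if audio.ndim > 1:
    audio = audio.mean(axis=1)          # mix down to mono

frame_len = int(0.1 * sr)               # 100 ms analysis frames
n_frames = len(audio) // frame_len
frames = audio[: n_frames * frame_len].reshape(n_frames, frame_len)
rms = np.sqrt((frames ** 2).mean(axis=1))

threshold = rms.mean() + 2 * rms.std()  # "louder than usual" frames
candidates = np.flatnonzero(rms > threshold)

# Emit a rough annotation table: start time (in seconds) of each candidate call.
for idx in candidates:
    print(f"candidate call at {idx * frame_len / sr:.2f} s")
```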

She said she's seen an "exponential takeoff" over the past few years in bioacoustics' use of machine learning to accelerate that process. While she believes ESP shows promise for drawing finer distinctions in existing "dictionaries," especially for harder-to-reach species, she cautioned observers against attributing human characteristics to these animals.

Considering this research's high equipment and labor costs, Patricelli said she's happy to see big philanthropists backing it. But she said the field shouldn't rely too much on one funding source. Government support is still necessary, she noted, because ecosystem protection also requires that conservationists examine "unsexy" species that she expects get less attention than more charismatic ones. She also encouraged funders to consult scientists.

"There's a lot to learn and it's very expensive," she said. "That might not be a big deal to some of these donors but it's very hard to come up with the money to do this."

The current work largely involves developing baseline technologies to do all this. A separate initiative has recently described the vocal system of a single species in detail. But ESP is trying to be "species agnostic," AI Research Director Olivier Pietquin said, to provide tools that can sort out many animals' speech patterns.

ESP introduced NatureLM-audio this fall, touting the system as the first large audio-language model fit for animals. The tool can identify species and distinguish characteristics such as sex or stage of life. When applied to a population -- zebra finches -- it had not been trained on, NatureLM-audio accurately counted the number of birds at a rate higher than random chance, according to ESP. The results were a positive sign for Pietquin that NatureLM might be able to scale across species.
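The "better than random chance" comparison can be pictured with a small evaluation sketch like the one below, which treats the model's counts as given. The numbers and the uniform-guessing baseline are made-up assumptions, not ESP's published protocol.

```python
# A hypothetical evaluation: compare exact-count accuracy against a baseline
# that guesses uniformly among the count values that actually occur.
import numpy as np

rng = np.random.default_rng(0)

true_counts = np.array([2, 3, 5, 4, 2, 6, 3, 4, 5, 2])    # hypothetical clips
model_counts = np.array([2, 3, 4, 4, 2, 6, 3, 5, 5, 2])   # hypothetical model output

model_accuracy = (model_counts == true_counts).mean()

# Chance baseline: 10,000 rounds of uniform random guessing over observed counts.
possible = np.unique(true_counts)
random_guesses = rng.choice(possible, size=(10_000, len(true_counts)))
chance_accuracy = (random_guesses == true_counts).mean()

print(f"model accuracy:  {model_accuracy:.2f}")
print(f"chance accuracy: {chance_accuracy:.2f}")
```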

"That is only possible with a lot of computing, a lot of data and many, many collaborations with ecologists and biologists," he said. "That, I think, makes us, makes it, quite serious."

AI lets scientists see far more

ESP acknowledges that it isn't sure what will be discovered about animal communications and won't know when its model gets it absolutely right. But the team likens AI to the microscope: an advancement that allowed scientists to see far more than was previously considered possible.

Zebra finches are highly social animals with large call repertoires. Whether congregating in pairs or by the hundreds, they produce hours of data -- a help to the nonprofit's AI scientists given that animal sounds aren't as abundant as the pages of internet text scraped to train chatbots.

James, an affiliated researcher with the Earth Species Project, struggles with the concept of decoding animal communications. Sure, he can clearly distinguish when a chick is screaming for food. But he doesn't expect to ever translate that call or any others into a human word.

Still, he wonders if he can gather more hints about their interactions from aspects of the call such as its pitch or duration.

"So can we find a link between a form and function is sort of our way of maybe thinking about decoding," James said. "As she elongates her call, is that because she's trying harder to elicit a response?"
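One way to picture that form-versus-function question is a small analysis sketch like the following. The clip folder, the response labels and the choice of a point-biserial correlation are illustrative assumptions rather than James' actual method.

```python
# A hypothetical check of whether call duration or pitch relates to whether a
# partner responded. Folder, labels and test choice are all placeholders.
from pathlib import Path
import numpy as np
import librosa
from scipy.stats import pointbiserialr

def duration_and_pitch(path: Path) -> tuple[float, float]:
    """Return (duration in seconds, median fundamental frequency in Hz) for one clip."""
    audio, sr = librosa.load(path, sr=None)
    f0, _, _ = librosa.pyin(audio, sr=sr,
                            fmin=librosa.note_to_hz("C4"),
                            fmax=librosa.note_to_hz("C8"))
    return len(audio) / sr, float(np.nanmedian(f0))

clips = sorted(Path("finch_calls").glob("*.wav"))   # hypothetical call clips

# Placeholder labels: in practice, 1/0 would record whether a partner answered.
responded = np.random.default_rng(0).integers(0, 2, size=len(clips))

durations, pitches = zip(*(duration_and_pitch(p) for p in clips))

# Point-biserial correlation: does a longer or higher call predict a response?
print("duration vs response:", pointbiserialr(responded, durations))
print("pitch vs response:   ", pointbiserialr(responded, pitches))
```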

___

Associated Press coverage of philanthropy and nonprofits receives support through the AP's collaboration with The Conversation US, with funding from Lilly Endowment Inc. The AP is solely responsible for this content.

The Associated Press. All rights reserved.
