How tech is helping us talk to animals

The world around us is vibrating with sounds we cannot hear. Bats chitter and babble in ultrasound; elephants rumble infrasonic secrets to each other; coral reefs are aquatic clubs, hopping with the cracks and hisses and clicks of marine life.

For centuries, we didn’t even know those sounds existed. But as technology has advanced, so has our capacity to listen. Today, tools like drones, digital recorders, and artificial intelligence are helping us listen to the sounds of nature in unprecedented ways, transforming the world of scientific research and raising a tantalizing prospect: Someday soon, computers might allow us to talk to animals.

In some ways, that has already begun.

“Digital technologies, so often associated with our alienation from nature, are offering us an opportunity to listen to nonhumans in powerful ways, reviving our connection to the natural world,” writes Karen Bakker in her new book, The Sounds of Life: How Digital Technology Is Bringing Us Closer to the Worlds of Animals and Plants.

Automated listening posts have been set up in ecosystems around the planet, from rainforests to the depths of the ocean, and miniaturization has allowed scientists to stick microphones onto animals as small as honeybees.

“Combined, these digital devices function like a planetary-scale hearing aid: enabling humans to observe and study nature’s sounds beyond the limits of our sensory capabilities,” Bakker writes.

All those devices create a ton of data, which would be impossible to go through manually. So researchers in the fields of bioacoustics (which studies sounds made by living organisms) and ecoacoustics (which studies the sounds made by entire ecosystems) are turning to artificial intelligence to sift through the piles of recordings, finding patterns that might help us understand what animals are saying to each other. There are now databases of whale songs and honeybee dances, among others, that Bakker writes could one day turn into “a zoological version of Google Translate.”
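To give a concrete sense of what that sifting can look like, here is a minimal, deliberately simplified sketch in Python: it extracts spectral features from a folder of field recordings and groups similar clips into candidate call types, with no labels at all. It is not any particular lab’s pipeline; the directory name and the number of clusters are placeholders, and it assumes the librosa and scikit-learn libraries are available.

import glob

import librosa
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical folder of short field-recording clips (one call per file).
clips = sorted(glob.glob("recordings/*.wav"))

features = []
for path in clips:
    audio, sr = librosa.load(path, sr=None)              # keep the native sample rate
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)
    features.append(mfcc.mean(axis=1))                   # one feature vector per clip

# Standardize the features and group clips into candidate "call types" without labels.
X = StandardScaler().fit_transform(np.array(features))
labels = KMeans(n_clusters=8, random_state=0).fit_predict(X)

for path, label in zip(clips, labels):
    print(f"{path} -> candidate call type {label}")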

But it’s important to remember that we aren’t necessarily discovering these sounds for the first time. As Bakker points out in her book, Indigenous communities around the world have long been aware that animals have their own forms of communication, while the Western scientific establishment has historically dismissed the idea of animal communication outright. Many of the researchers Bakker highlights in her book faced intense pushback from the scientific community when they suggested whales, elephants, turtles, bats, and even plants made sounds and even might have languages of their own. They spent nearly as much time pushing back against the pushback as they did conducting research.

While that seems to be changing with our increased understanding of animals, Bakker cautions that the ability to communicate with animals stands to be either a blessing or a curse, and we must think carefully about how we will use our technological advancements to interact with the natural world. We can use our understanding of our world’s sonic richness to gain a sense of kinship with nature and even potentially heal some of the damage we have wrought, but we also run the risk of using our newfound powers to assert our domination over animals and plants.

We are on the edge of a revolution in how we interact with the world around us, Bakker told Recode. Now, we must decide which path we will follow in the years ahead. This interview has been edited for length and clarity.

Neel Dhanesha

Let’s start with the big idea that you lay out in your introduction: We’re using technologies like AI to talk to animals. What does that look like?

Karen Bakker

We can use artificial intelligence-enabled robots to speak animal languages and essentially breach the barrier of interspecies communication. Researchers are doing this in a very rudimentary way with honeybees and dolphins, and to some extent with elephants. Now, this raises a very serious ethical question, because the ability to speak to other species sounds intriguing and fascinating, but it could be used either to create a deeper sense of kinship, or a sense of dominion and a manipulative ability to domesticate wild species that humans have never previously been able to control.

Neel Dhanesha

How would that work?

Karen Bakker

I’ll give you one example. A research team in Germany encoded honeybee signals into a robot that they sent into a hive. That robot is able to use the honeybees’ waggle dance communication to tell the honeybees to stop moving, and it’s able to tell them where to fly for a specific nectar source. The next stage in this research is to implant these robots into honeybee hives so the hives accept them as members of their community from birth. And then we would have an unprecedented degree of control over the hive; we would have essentially domesticated that hive in a way we never have before. This creates the possibility of exploitative use of animals. And there’s a long history of the military use of animals, so that’s one path that I think raises a lot of alarm bells.

So these are the sorts of ethical questions that researchers are now starting to engage with. But the hope is that with these ethics in place, in the future, we — you and I, ordinary people — will have a lot more ability to tune into the sounds of nature, and to understand what we’re hearing. And I think what that does is create a real sense of awe and wonder and also a feeling of profound kinship. That’s where I hope we’ll take these technologies.
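To make the phrase “encoded honeybee signals” a bit more concrete: the control code for that robot isn’t reproduced in the book, but the geometry a waggle dance encodes is well studied, and a purely hypothetical sketch can show the kind of information such a robot has to convey. On the vertical comb, “up” stands in for the sun’s direction, so the angle of the waggle run mirrors the food source’s bearing relative to the sun, and the duration of each run grows with distance; the one-second-per-kilometer rate below is a rough, species-dependent approximation.

from dataclasses import dataclass


@dataclass
class WaggleRun:
    angle_from_vertical_deg: float  # direction of the waggle run on the vertical comb
    duration_s: float               # length of each waggle run


def encode_waggle(bearing_to_food_deg: float,
                  sun_azimuth_deg: float,
                  distance_m: float,
                  seconds_per_km: float = 1.0) -> WaggleRun:
    """Translate a food source's bearing and distance into rough dance parameters."""
    # "Up" on the comb stands in for the sun, so the dance angle is the bearing
    # to the food measured relative to the sun's azimuth.
    angle = (bearing_to_food_deg - sun_azimuth_deg) % 360
    # Farther food sources get longer waggle runs (the rate is species-dependent).
    return WaggleRun(angle_from_vertical_deg=angle,
                     duration_s=(distance_m / 1000.0) * seconds_per_km)


# Example: nectar 1.5 km away, 40 degrees clockwise of the sun's current azimuth.
print(encode_waggle(bearing_to_food_deg=130.0, sun_azimuth_deg=90.0, distance_m=1500.0))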

Neel Dhanesha

How did we first realize that animals — and even the Earth — were making all of these sounds outside of our hearing range?

Karen Bakker

It’s funny, humans as a species tend to believe that what we cannot observe does not exist. So a lot of these sounds were literally right in front of our ears. But because of a tendency, especially in Western science, to privilege sight over sound, we simply hadn’t listened for them.

The game changer, and the reason I wrote this book, is that digital technology now enables us to listen very easily and very cheaply to species all over the planet. And what we’re discovering is that a huge range of species that we never even suspected could make sound or respond to sound are indeed sort of participating in nature’s symphony. And that’s a discovery that is as significant as the microscope was a few hundred years ago: It opens up an entirely new sonic world, and is now ushering in many discoveries about complex communication in animals, language, and behavior that are really overturning many of our assumptions about animals and even plants.

Neel Dhanesha

Elephants seem to be a particularly good example of that inability to listen.

Karen Bakker

One story I tell in my book is that of Katy Payne, who’s one of the heroes of 20th-century bioacoustics. She was actually a classically trained musician. After doing some amazing work on whale sounds, she was the first to discover that elephants make sounds below our human hearing range, in infrasound. And this explains some of the uncanny ability of elephants to know where other elephants are over long distances. They can coordinate their movements and almost communicate telepathically. They’re pretty amazing animals, using this infrasound that can travel long distances through soil, through stone, or even walls. But the way that was discovered was simply by sitting and attentively listening.

Katy Payne described that feeling of elephant infrasound as a strange throbbing in her chest, a strange feeling of unease. And that’s often how we can, as humans, sense infrasound. But until the advent of digital technology, the only way we could find out about these sounds was kind of haphazard: we might go out, record something, and painstakingly listen to it in the lab.

Neel Dhanesha

I’m curious about how animals experience these sounds themselves. You said we experience infrasound as a sort of throbbing in our chest — is there any way to tell how the elephants themselves are experiencing these sounds? Are they also hearing a low throbbing sound? Or are they hearing something so complex that we don’t quite understand it?

Karen Bakker

We are limited because these digital technologies are, at the end of the day, only a simulacrum. When we want to listen to those sounds, which are often much higher or lower than the human hearing range, those sounds have to be altered. So we can’t ever really know what a bat sounds like to a bat.

The term that scientists use for this is the umwelt, the embodied experience of an animal that’s listening, that’s sensing its environment in its own skin. And we can only guess at that. But as we try to do so, I think it’s really important to put aside some of our human-centered ideas about what language is and what communication is. In the book, Mirjam Knörnschild — who’s an amazing German researcher who works on bats — makes a really great point: It’s actually not that interesting to ask what we can understand about language or how it sounds to us. What’s much more interesting is to try to understand what bats are saying to one another or to other species. So if we have a more biocentric approach to understanding animal communication, I think that’s when some of the most exciting and interesting insights arise.

Neel Dhanesha

Early in the book, you mention the idea of a zoological version of Google Translate. This idea that you’re talking about points to something else, though. Translation in the past has always been about what one group can do to interact with the other, but you’re talking about an idea that involves actively choosing not to interact with a group but instead sort of just observing. That’s very different from how we usually might think of these kinds of applications.

Karen Bakker

So many of the attempts to teach primates human language or sign language in the 20th century were underpinned by an assumption that language is unique to humans, and that if we were to prove animals possess language we would have to prove that they could learn human language. And in retrospect, that’s a very human-centered view.

The research today takes a very different approach. It begins by recording the sounds that animals and even plants make. It then uses essentially machine learning to parse through mountains of data to detect patterns and associate those with behaviors to attempt to determine whether there’s complex information being conveyed by the sounds. What [these researchers] are doing is not trying to teach those species human language, but rather compiling, essentially, dictionaries of signals and then attempting to understand what those signals mean within those species.

They’re finding some amazing things. For example, elephants have a distinct signal for honeybees, which are a threat, and a different signal for humans. Moreover, they distinguish between threatening and nonthreatening humans. Honeybees themselves have hundreds of sounds. And now we know their language is vibrational and positional as well as auditory.
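As a toy illustration of what such a “dictionary of signals” amounts to, the sketch below pairs call types (for instance, clusters produced by a pipeline like the earlier one) with the behavioral context in which each call was recorded, then tallies which context each call type most often co-occurs with. The call names and counts are invented purely for illustration.

from collections import Counter, defaultdict

# (call_type, observed_context) pairs; purely hypothetical field annotations.
observations = [
    ("rumble_A", "greeting"), ("rumble_A", "greeting"), ("rumble_A", "travel"),
    ("rumble_B", "bee_alarm"), ("rumble_B", "bee_alarm"),
    ("rumble_C", "human_threat"), ("rumble_C", "human_nonthreat"),
]

contexts_by_call = defaultdict(Counter)
for call, context in observations:
    contexts_by_call[call][context] += 1

# For each call type, report the context it is most strongly associated with.
for call, counts in contexts_by_call.items():
    context, n = counts.most_common(1)[0]
    print(f"{call}: most often heard during '{context}' ({n}/{sum(counts.values())} observations)")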

Neel Dhanesha

I was absolutely fascinated by your chapter on coral and the way coral reefs not only make sounds of their own but also attract baby coral, which seem able to hear them despite not having any ears. I’m curious: What does a healthy coral reef sound like?

Karen Bakker

A healthy coral reef sounds a little bit like an underwater symphony. There are cracks and burbles and hisses and clicks from the reef and its inhabitants and even whales dozens of miles away. If you could hear in the ultrasonic, you might hear the coral itself.

Even coral larvae have demonstrated the ability to hear the sounds of a healthy reef. These creatures are microscopic; they have no arms or legs, no apparent means of hearing, and no central nervous system. But somehow they hear the sounds of a healthy reef and can swim toward it. So that’s astounding. If even these little creatures can hear in a manner that’s much more precise and attuned than humans can, who knows what else nature is listening to?

Neel Dhanesha

There’s a point that you bring up about how digital listening is new but deep listening is not. What do you mean by that?

Karen Bakker

The way Blackfoot philosopher Leroy Little Bear puts it is, “The human brain is like a station on the radio dial; parked in one spot, it is deaf to all the other stations … the animals, rocks, trees, simultaneously broadcasting across the whole spectrum of sentience.”

The Indigenous writers John Borrows and Robin Wall Kimmerer have described deep listening as a venerable and ancient art. Before the advent of digital technologies, humans had lots of practices whereby they listened to nature. Animals’ complex communication abilities were well known to Indigenous peoples, who had various strategies and tactics for interpreting those sounds and engaging in cross-species communication. So deep listening provides us with another window into the soundscapes of the nonhuman, and it does so with a sense of rootedness in place, a sort of sacred responsibility to place, and a set of ethical safeguards that digital listening lacks.

Neel Dhanesha

It seems every person you write about who has studied these animal sounds received significant pushback from the scientific establishment, and they spent half their time pushing back against the pushback until finally they were proven right. I can’t help but think that acknowledging these forms of communication requires us to confront our ideas of sentience and intelligence in ways that make us uncomfortable.

Karen Bakker

Yes, the scientists whose stories are told in the book often encountered very stiff resistance. They had their funding revoked. They had their lapels shaken at conferences. They were laughed at. They were sworn at. They were dismissed frequently. And yet they persisted, because the empirical evidence was there.

We have a residual sort of human exceptionalism in science and in our public discourse, where we want to believe that humans are unique at something. We used to say humans were unique at toolmaking. Now we know that not to be the case. Wouldn’t it be nice if humans were uniquely gifted at language? Well, maybe that’s not the case, either. Maybe as we refine our understanding of nonhuman language, we’ll have a much more inclusive definition or understanding of language as a continuum across the tree of life.

This is pretty profoundly destabilizing. And it’s also destabilizing to realize that we were essentially deaf to all of these sounds going on all around us. We were the ones who were hard of hearing. And there’s a feeling of, I think, chagrin, and maybe mild embarrassment, that all of these sounds were there all the time, and we just never realized. So the feelings associated with this research are complicated. The philosophical debates are intense. And yet the sheer weight of the empirical evidence brings us to a point where we do need to start having these conversations.

Neel Dhanesha

You write that climate change is directly impacting the Earth’s soundscapes in physical ways. How does that work?

Karen Bakker

If you think of the planet as being like a symphony or a jazz band with lots of seasonal rhythms, the noises that we’re hearing ebb and flow according to life’s rhythms. And climate change disrupts those rhythms.

In some cases, climate change could even inhibit the ability of species to communicate. So for example, the dawn and dusk chorus of birds and many other species in the African savanna happens at those times because dawn and dusk are moments when you have higher humidity in the air. So sound travels faster and farther at dawn and dusk. It’s a great moment to communicate with your far-off relatives, right?

But now, as climate change affects the temperature and humidity of the atmosphere, we are going to be affecting the dawn chorus in ways we cannot yet fully understand. We may make it harder for species to communicate in drier and hotter environments. If they can’t communicate as well, they’re less safe, they can’t warn each other of threats, it’s harder to find mates. And this will also affect their ability to survive and thrive.
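For the “faster” part of that claim, a simplified physical sketch: the speed of sound in air scales with the square root of temperature divided by the air’s mean molar mass, and water vapor lowers that molar mass, so at a given temperature humid air carries sound slightly faster. The toy calculation below ignores atmospheric absorption and refraction, which matter more for how far sound carries, and the vapor fractions are illustrative only.

import math

R = 8.314        # gas constant, J/(mol*K)
GAMMA = 1.4      # adiabatic index of air, treated as constant for this rough estimate
M_DRY = 0.02897  # molar mass of dry air, kg/mol
M_H2O = 0.01802  # molar mass of water vapor, kg/mol


def speed_of_sound(temp_c: float, vapor_mole_fraction: float = 0.0) -> float:
    """Approximate speed of sound (m/s) in moist air at a given temperature."""
    molar_mass = (1 - vapor_mole_fraction) * M_DRY + vapor_mole_fraction * M_H2O
    return math.sqrt(GAMMA * R * (temp_c + 273.15) / molar_mass)


# Same temperature, different humidity (illustrative values only):
print(f"dry air at 25 C:         {speed_of_sound(25.0, 0.00):.1f} m/s")
print(f"humid air at 25 C (~3%): {speed_of_sound(25.0, 0.03):.1f} m/s")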

Neel Dhanesha

You write that these digital technologies might help undo some of that damage too, though. Is there any project or application of these digital technologies that you’re particularly excited about?

Karen Bakker

One project that really excites me is the use of bioacoustics to create a form of music therapy for the environment. It turns out that some species, like fish and coral, will respond to the sounds of healthy reefs. And this could help us regenerate degraded ecosystems. That research is in its infancy. We don’t know how many species that could apply to, but it could be fantastic if we could actually begin using bioacoustics-based music therapy as a way to help with ecosystem regeneration.

Neel Dhanesha

That’s such a fascinating idea to me, to undo our sonic damage with healthy sounds.

Karen Bakker

Yeah, or in a world with so many environmental crises, to have this be a tool in our toolkit as we try to triage saving species amidst the onslaught.

Source: vox.com
