What You Can’t Hear, Yet

Everything is talking. And we are trying to listen. The conversations that animals, plants, and even the cells inside our bodies are having are about to be recorded and understood. Tune in for the symphony.

Understanding the Language of Nature

Any dog owner can tell you the meaning of their dog's different barks. The bark to go outside is different from the bark when a stranger approaches the house.

Ask any beekeeper and they can tell you about the different sounds bees make when they are foraging, as opposed to making honey. 

Moths emit unique ultrasonic clicks—above the range of human hearing—to jam a bat's sonar and conceal their location.

Whales sing, dolphins chatter, fish gurgle, all in languages we don’t yet know.

Nor are the recordings we can make limited to what humans hear: both high- and low-frequency signals can be captured. A skillful gardener can tell you how their plants respond to watering. Trees at the edge of a forest signal to trees deep within when a fire closes in or a blight takes hold. Fungi run miles-long underground mycelium networks that respond to environmental stimuli.

With the new power of AI language models, each of these signals can be recorded thousands of times. From that data, a dictionary of an insect's, animal's, or even a plant's signals can be compiled, and the language reliably decoded. Simple examples exist today: an AI-enabled dog collar can record your dog's various barks and text you if you are out of earshot. That same dog, visiting an AI-enabled veterinarian, may soon be able to communicate its medical symptoms.
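For readers curious how such a "dictionary" might work under the hood, here is a deliberately tiny sketch. The bark types, feature values, and the nearest-centroid lookup are all invented for illustration; a real system would use learned audio embeddings and far more data. The idea is simply: record many labeled examples, average them into a dictionary entry per signal type, then match new recordings to the closest entry.

```python
import random

random.seed(0)

# Hypothetical sketch: pretend each recording has been reduced to a small
# feature vector (say, pitch, duration, loudness). All values are invented.
def make_samples(center, n=50, noise=0.3):
    return [[c + random.gauss(0, noise) for c in center] for _ in range(n)]

# Two made-up "bark types" with made-up feature centers.
training = {
    "wants-out": make_samples([2.0, 0.5, 1.0]),
    "stranger":  make_samples([5.0, 1.5, 3.0]),
}

# Build the "dictionary": the average feature vector per bark type.
dictionary = {
    label: [sum(col) / len(col) for col in zip(*samples)]
    for label, samples in training.items()
}

def classify(features):
    """Match a new recording to the nearest dictionary entry."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(dictionary, key=lambda lbl: dist(dictionary[lbl], features))

print(classify([4.8, 1.4, 2.9]))  # a loud, long, high bark → "stranger"
```

The same pattern—collect labeled signals, summarize them, match new ones—scales from a toy centroid lookup to the large AI models described above.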

A wide range of insects and animals announce danger with changes in their vocalizations. That danger may soon be broadcast with tools far different from the ones created by evolution. Imagine a siren blaring when a blue jay approaches a beehive to steal its contents—or park rangers being alerted when a herd of African elephants encounters armed poachers.

What this means is that humans will one day be able to understand the language of all living things. But it is not just interspecies communication that will change.

Why, for example, should two human brains be the only minds in the exam room? Soon, every visit to a doctor will include an AI-enabled voice agent (think Siri on steroids) listening to your complaints, reminding you of other symptoms, and arming your care provider with the world's collected knowledge about your possible illness.

Our world is about to explode with information, communication, and noise in a fantastic way.

And if your partner thinks you are not listening when he or she is trying to tell you something, imagine if all the living creatures around us were vying for our attention. Selective hearing never sounded so good.

Medically authored by
Kevin R. Stone, MD
Orthopaedic surgeon, clinician, scientist, inventor, and founder of multiple companies. Dr. Stone was trained at Harvard University in internal medicine and orthopaedic surgery and at Stanford University in general surgery.