Scientists Use Artificial Intelligence to Decode Whale Communication, Revealing a Hidden Language Beneath the Ocean

For centuries, the world’s oceans have concealed forms of intelligence that humans could hear but never truly understand.

Beneath the surface, massive marine mammals have exchanged rapid sequences of clicks, whistles, and rhythmic patterns—sounds long admired for their beauty but dismissed as instinctive or primitive.

Today, that assumption is being overturned.

Using advanced artificial intelligence, scientists are uncovering evidence that whales communicate through a complex, structured system that challenges humanity’s understanding of intelligence, language, and even its place among other thinking beings on Earth.

This breakthrough is the result of Project CETI—the Cetacean Translation Initiative—an ambitious scientific effort that combines marine biology, artificial intelligence, linguistics, and robotics.

What researchers are now discovering suggests that whales, particularly sperm whales, use a sophisticated communication system with features resembling elements of human language.

The implications extend far beyond marine science, offering new insights into cognition, culture, and the possibility of communicating with non-human intelligences.


Early Encounters With Whale Voices

Human fascination with whale sounds dates back centuries.

Sailors once described eerie calls echoing across open water, fueling legends of sea monsters and mythical creatures.

Yet for much of history, whales were not regarded as intelligent beings.

They were hunted extensively for oil, bone, and meat, reduced to resources rather than recognized as sentient animals.

This perception began to shift in the 1970s, during a period of growing environmental awareness.

Marine researchers studying humpback whales made a discovery that would reshape public opinion.

Recordings of whale vocalizations revealed long, evolving sequences of sounds—later known as whale songs—that could last for hours and change gradually over time.

These recordings were compiled into the album Songs of the Humpback Whale, which unexpectedly became a cultural phenomenon.

The album captured global attention and played a role in the emerging movement to protect whales from extinction.

However, while the recordings demonstrated complexity, scientists still did not know what the sounds meant.

Were they mating calls, navigation signals, or expressions of social bonding? The tools of the time allowed researchers to listen, but not to decode.

A Radical New Idea

Decades later, marine biologist Dr. David Gruber proposed a radical shift in perspective.

Rather than treating whale sounds as biological signals to be categorized manually, he suggested approaching them as a form of unknown language—one that could be studied using the same techniques designed to search for extraterrestrial intelligence.

Inspired by SETI, the Search for Extraterrestrial Intelligence, Gruber posed a provocative question: what if humanity’s first encounter with non-human intelligence was not in outer space, but already living in Earth’s oceans? From this idea, Project CETI was born.

The project set out with an unprecedented goal: to translate whale communication and ultimately establish two-way interaction.

To do this, Gruber assembled a diverse team of experts, including marine biologists, AI researchers, computational linguists, and robotic engineers.

No single discipline could solve the problem alone; understanding whale communication required a fusion of science and technology.


The Importance of Long-Term Observation

One of the project’s most valuable contributors is marine biologist Shane Gero, who has spent more than a decade studying sperm whales near the island of Dominica in the Caribbean.

Gero’s work involved detailed, long-term observation of specific whale families, documenting their social behavior, movements, and vocal patterns over generations.

This extensive field knowledge provided essential context.

Rather than treating whales as anonymous sound sources, Project CETI could associate vocalizations with specific individuals, families, and social situations.

This distinction proved crucial, as understanding language requires more than sound—it requires knowing who is speaking, to whom, and why.

Dominica’s geography made it an ideal research location.

Deep ocean waters lie close to shore, allowing sperm whales to be studied without long offshore expeditions.

Many of the same whale families return year after year, creating a stable population for long-term analysis.

Building an Underwater Listening Network

With funding support of approximately $33 million from the TED Audacious Project, Project CETI officially launched in 2020.

One of its first major undertakings was the construction of a large-scale underwater listening system off the coast of Dominica.

This system consists of arrays of hydrophones—underwater microphones—placed across the ocean floor in a grid pattern covering roughly 20 by 20 kilometers.

These hydrophones continuously record sounds, allowing researchers to pinpoint the location of individual vocalizations and identify which whale produced them.
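Pinpointing a vocalization from arrival times at several hydrophones is a classic time-difference-of-arrival (TDOA) problem. The sketch below is illustrative only, not Project CETI's actual pipeline: the grid coordinates and source position are invented, and the linearized least-squares solve is the textbook 2-D version of the technique.

```python
import numpy as np

SOUND_SPEED = 1500.0  # rough speed of sound in seawater, m/s

def locate_source(positions, arrival_times, c=SOUND_SPEED):
    """Estimate a 2-D source position from arrival times at 4+ hydrophones.

    Subtracting the range equation at hydrophone 0 from the others yields
    a system that is linear in the source coordinates plus r0, the
    (unknown) range from the source to hydrophone 0.
    """
    x = np.asarray(positions, dtype=float)
    t = np.asarray(arrival_times, dtype=float)
    d = c * (t[1:] - t[0])                        # range differences r_i - r_0
    A = np.hstack([2 * (x[1:] - x[0]), 2 * d[:, None]])
    b = (x[1:] ** 2).sum(axis=1) - (x[0] ** 2).sum() - d ** 2
    solution, *_ = np.linalg.lstsq(A, b, rcond=None)
    return solution[:2]                           # (x, y); solution[2] is r0

# Synthetic check: four hydrophones on a 1 km grid, a click at (400, 300) m.
hydrophones = [(0, 0), (1000, 0), (0, 1000), (1000, 1000)]
source = np.array([400.0, 300.0])
times = [np.linalg.norm(source - h) / SOUND_SPEED for h in hydrophones]
estimate = locate_source(hydrophones, times)
```

With noisy real-world arrival times one would add more hydrophones; the same least-squares solve then averages the timing errors out.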

In addition to passive listening, the project employs non-invasive robotic systems.

Autonomous underwater vehicles and suction-cup tags are used to temporarily attach sensors to whales.

These tags collect data on movement, depth, orientation, heart rate, and proximity to other whales, providing vital behavioral context for each vocalization.

Together, these technologies generate an unprecedented volume of data.

Unlike earlier studies that analyzed hours or days of recordings, Project CETI collects millions of whale sounds over extended periods, creating the largest whale vocalization database ever assembled.

Teaching Machines to Listen


Raw audio data alone is insufficient to reveal meaning.

To make sense of this information, researchers convert sounds into spectrograms—visual representations of audio frequencies over time.

This transformation allows AI systems to “see” patterns that human ears cannot detect.
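The spectrogram step is standard signal processing: slice the audio into short overlapping windows and take the magnitude of each window's Fourier transform. A minimal NumPy-only sketch, where the window length, hop size, and the synthetic "click train" are arbitrary choices for illustration:

```python
import numpy as np

def spectrogram(signal, frame_len=256, hop=128):
    """Short-time Fourier transform magnitude: one row per time frame,
    one column per frequency bin up to the Nyquist frequency."""
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    frames = np.stack([
        signal[i * hop : i * hop + frame_len] * window
        for i in range(n_frames)
    ])
    return np.abs(np.fft.rfft(frames, axis=1))

# Synthetic "click train": three short broadband pulses in 1 s of audio.
fs = 8000
rng = np.random.default_rng(0)
audio = np.zeros(fs)
for click_time in (0.1, 0.35, 0.6):              # seconds
    i = int(click_time * fs)
    audio[i : i + 40] = rng.standard_normal(40)

spec = spectrogram(audio)
energy = spec.sum(axis=1)                        # per-frame acoustic energy
print(energy.argmax() * 128 / fs)                # start time of loudest frame
```

In the resulting array, frames containing a click carry far more energy than silent ones, which is exactly the kind of time-frequency structure a learning system can pick up.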

Machine learning models are then trained to identify specific sequences of clicks known as codas, which are the primary communicative units used by sperm whales.

Instead of being programmed with predefined rules, the AI learns by analyzing vast numbers of examples, gradually identifying recurring structures and relationships.

Using deep learning and neural networks, the system begins to recognize subtle variations in rhythm, tempo, spacing, and ornamentation—extra clicks added to the beginning or end of a sequence.

These variations, once invisible to human researchers, appear to function similarly to phonetic elements in human language.
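A toy version of this kind of unsupervised pattern discovery can be sketched briefly. Project CETI's actual models are deep neural networks; the sketch below instead uses a minimal k-means on normalized inter-click intervals, and the two "rhythm types" are hypothetical examples, not real coda categories.

```python
import numpy as np

def normalized_icis(click_times):
    """Represent a coda by its inter-click intervals (ICIs), scaled to sum
    to 1 so that overall tempo is factored out and only rhythm remains."""
    icis = np.diff(np.sort(click_times))
    return icis / icis.sum()

def kmeans(vectors, k, iters=50):
    """Minimal k-means: returns a cluster label for each vector.
    Initializes from the first k vectors, which differ in this example."""
    centroids = vectors[:k]
    for _ in range(iters):
        dists = ((vectors[:, None, :] - centroids[None]) ** 2).sum(-1)
        labels = dists.argmin(axis=1)
        centroids = np.stack([vectors[labels == j].mean(0) for j in range(k)])
    return labels

# Two hypothetical rhythm types, each produced at three different tempos:
# "regular" (evenly spaced clicks) vs "1+3" (a pause after the first click).
codas = []
for tempo in (0.8, 1.0, 1.2):
    codas.append(normalized_icis(np.array([0, 1, 2, 3, 4]) * tempo))
    codas.append(normalized_icis(np.array([0, 2, 3, 4, 5]) * tempo))
codas = np.stack(codas)

labels = kmeans(codas, k=2)
```

Because tempo is normalized away, all six codas collapse onto two rhythm prototypes and the clustering separates them perfectly; real recordings are far noisier, which is why large datasets and more powerful models are needed.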

A Structured Communication System

The AI-driven analysis revealed that sperm whale communication is not random.

Instead, it follows a structured system that allows for a wide range of meanings.

Changes in timing, rhythm, and emphasis can significantly alter the message conveyed by a coda.

Researchers found evidence that whales use a combinatorial system, where a limited set of sound elements can be rearranged to create many distinct messages.

This discovery challenges the long-held belief that complex language systems are unique to humans.
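The power of a combinatorial system comes from multiplication: a small inventory of independent features yields many distinct signals. The category names and inventory sizes below are purely hypothetical, chosen only to show the arithmetic.

```python
from itertools import product

# Hypothetical feature inventories (for illustration only).
rhythms = ["regular", "1+3", "2+3", "long-short"]   # 4 rhythm templates
tempos = ["slow", "medium", "fast"]                 # 3 tempo classes
ornaments = ["plain", "extra-click"]                # 2 ornamentation states

codas = list(product(rhythms, tempos, ornaments))
print(len(codas))  # 4 * 3 * 2 = 24 distinct combinations
```

Adding one more value to any feature multiplies, rather than adds to, the number of expressible codas.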

Even more strikingly, the analysis showed that whale communication is deeply social.

Different whale clans use distinct dialects, identifiable through subtle variations in their codas.

These dialects function much like accents or regional speech patterns in human societies and are passed down through generations.

From Listening to Responding

With a growing understanding of whale communication structure, Project CETI has begun exploring the possibility of two-way interaction.

The goal is not to teach whales human language, but to communicate using patterns derived from their own vocal system.

In 2023, researchers conducted a controlled experiment using an underwater speaker to play a specific sperm whale coda.

A nearby whale responded by producing the same coda, marking the first documented instance of intentional human-initiated whale communication using decoded patterns.

Although limited in scope, this interaction suggested that whales can recognize and respond to structured sound sequences generated by humans.

It represents a significant step toward real-time interspecies communication.

Ethical and Scientific Implications

The possibility of communicating with whales raises important ethical questions.

Scientists involved in Project CETI emphasize caution, recognizing the potential risks of disrupting whale social structures or altering natural behaviors.

Any interaction, they argue, must occur on the whales’ terms and prioritize their well-being.

Beyond marine biology, the implications of this research are profound.

The methods developed by Project CETI could inform the search for extraterrestrial intelligence by providing tools to detect and interpret unknown communication systems.

If humanity encounters intelligent life beyond Earth, the principles used to decode whale language may prove invaluable.

At a more immediate level, understanding whale communication fosters a deeper sense of empathy and responsibility.

Recognizing whales as intelligent beings with culture, social bonds, and shared histories strengthens the moral case for ocean conservation.

A New Chapter in Understanding Intelligence

Project CETI’s work is ongoing, with plans to expand research to other species such as orcas and to refine AI models further.

Scientists believe this approach may reveal complex communication systems throughout the animal kingdom, reshaping humanity’s understanding of intelligence itself.

What began as an effort to listen more carefully has evolved into a transformative scientific endeavor.

By decoding the language of whales, researchers are not only uncovering hidden conversations beneath the ocean but also redefining what it means to share a planet with other intelligent life.

The ocean, long seen as silent and unknowable, is finally beginning to speak—and humanity is learning how to listen.