Real-time Translation of a Dolphin Whistle

Software has performed the first real-time translation of a dolphin whistle – and better data tools are giving fresh insights into primate communication too

It was late August 2013 and Denise Herzing was swimming in the Caribbean. The dolphin pod she had been tracking for the past 25 years was playing around her boat. Suddenly, she heard one of them say, “Sargassum”.

“I was like whoa! We have a match. I was stunned,” says Herzing, who is the director of the Wild Dolphin Project. She was wearing a prototype dolphin translator called Cetacean Hearing and Telemetry (CHAT) and it had just translated a live dolphin whistle for the first time.

It detected a whistle for sargassum, or seaweed, which she and her team had invented to use when playing with the dolphin pod. They hoped the dolphins would adopt the whistles, which are easy to distinguish from their own natural whistles – and they were not disappointed. When the computer picked up the sargassum whistle, Herzing heard her own recorded voice saying the word into her ear.

As well as boosting our understanding of animal behaviour, the moment hints at the potential for using algorithms to analyse any activity where information is transmitted – including our daily activities (see “Scripts for life”).

“It sounds like a fabulous observation, one you almost have to resist speculating on. It’s provocative,” says Michael Coen, a biostatistician at the University of Wisconsin-Madison.

Herzing is quick to acknowledge potential problems with the sargassum whistle. It is just one instance and has not yet been repeated. Its audio profile also differs from that of the whistle they taught the dolphins – it has the same shape but came in at a higher frequency. Brenda McCowan of the University of California, Davis, says her experience with dolphin vocalisations matches that observation.

Thad Starner, a researcher at the Georgia Institute of Technology and technical lead on the wearable computer Google Glass, built CHAT for Herzing with a team of graduate students. Starner and Herzing are using pattern-discovery algorithms designed to analyse dolphin whistles and extract meaningful features that a person might miss or not think to look for. As well as listening out for invented whistles, the team hopes to start working out what the dolphins’ natural communication means, too.
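Neither CHAT’s hardware nor its algorithms are detailed here, so the Python sketch below is only a guess at the general shape of such a pipeline: trace each whistle’s frequency contour from a spectrogram, then cluster the contours so that recurring whistle types emerge without any labels. Every function name, parameter value and the synthetic test data are illustrative assumptions, not CHAT’s actual code.

```python
import numpy as np
from scipy.signal import spectrogram
from sklearn.cluster import KMeans

def whistle_contour(audio, sr=48000):
    """Trace the dominant-frequency contour of a whistle.

    Dolphin whistles are frequency-modulated tones, so the shape of the
    peak frequency over time is a natural feature to compare.
    """
    freqs, times, sxx = spectrogram(audio, fs=sr, nperseg=1024, noverlap=768)
    return freqs[np.argmax(sxx, axis=0)]  # peak frequency in each time slice

def discover_whistle_types(contours, n_types=8):
    """Group whistle contours into recurring types with k-means.

    Contours are resampled to a fixed length so whistles of different
    durations can be compared point for point.
    """
    fixed = np.array([
        np.interp(np.linspace(0, 1, 64), np.linspace(0, 1, len(c)), c)
        for c in contours
    ])
    km = KMeans(n_clusters=n_types, n_init=10, random_state=0).fit(fixed)
    return km.labels_, km.cluster_centers_

# Demo on synthetic frequency sweeps standing in for whistles:
# two rising sweeps and two falling ones.
sr = 48000
t = np.linspace(0, 0.5, sr // 2)
rng = np.random.default_rng(0)
calls = [
    np.sin(2 * np.pi * (8000 + slope * t) * t)
    + 0.05 * rng.standard_normal(len(t))
    for slope in (4000, 4000, -4000, -4000)
]
labels, _ = discover_whistle_types(
    [whistle_contour(c, sr) for c in calls], n_types=2)
print(labels)  # e.g. [0 0 1 1]: rising sweeps in one cluster, falling in the other
```

A real system would need denoising, whistle detection and far more data, but contour extraction followed by clustering is a common starting point when looking for repeated call types in bioacoustics.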

McCowan says it’s an exciting time for the whole field of animal communication. With better information-processing tools, researchers can analyse huge data sets of animal behaviour for patterns.

Coen is already doing something like this with white-cheeked gibbons. Using machine-learning techniques similar to those used by Starner and McCowan, he has found 27 distinct fundamental units in gibbon calls.

McCowan, meanwhile, has recently modelled the behaviour of rhesus macaques at the California National Primate Research Center. The idea is to predict when the macaques will descend into the violent social unrest known as “cage war”, which often leads to the death of the alpha family.

Her team started collecting data, making 37,000 observations of key signs of dominance, subordination and affiliation over three years. Among other things, their analysis showed that cage stability improved if new young adult males were introduced now and again, since they seemed to grow into “policing” roles. “You had to look at the data,” McCowan says. “It wasn’t something a human could see.”

Terrence Deacon, an anthropologist and neuroscientist at the University of California, Berkeley, explains that some pattern of repetition is a basic requirement whenever information is transmitted. In other words, if Herzing’s dolphins or McCowan’s macaques are exchanging information – if their behaviour is not just random, meaningless noise – then there must be discoverable patterns. Information theory can find out what those patterns are and which parts of a whistle are important, helping behaviourists figure out what animals are communicating.
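Deacon’s point can be made concrete with a toy measurement. The conditional entropy of a symbol sequence says how predictable each symbol is from the one before it; a sequence with repetition structure scores well below a random shuffle of the same symbols. This is a minimal sketch of the idea, not any of the teams’ actual analyses.

```python
import math
import random
from collections import Counter

def conditional_entropy(seq):
    """Entropy (in bits) of the next symbol given the current one.

    Structured sequences, where context constrains what follows,
    score lower than a random shuffle of the same symbols.
    """
    pairs = Counter(zip(seq, seq[1:]))  # bigram counts
    singles = Counter(seq[:-1])         # counts of the conditioning symbol
    total = len(seq) - 1
    return -sum((n / total) * math.log2(n / singles[a])
                for (a, b), n in pairs.items())

structured = list("ABCABCABCABCABCABCABCABC")  # strong repetition pattern
shuffled = structured[:]
random.seed(0)
random.shuffle(shuffled)
print(conditional_entropy(structured))  # 0.0 bits: next symbol fully determined
print(conditional_entropy(shuffled))    # much higher: little predictive structure
```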

The first results from Starner and Herzing’s work on dolphin communication-processing are due to be presented at the IEEE International Conference on Acoustics, Speech and Signal Processing in Florence, Italy, in May. Last summer’s fieldwork was cut short because the team lost track of the dolphin pod, but they did make some progress: Starner’s algorithms discovered eight distinct components in a sample of 73 whistles. The results are still preliminary, but the team was able to match certain strings of those components with mother-calf interactions, for instance. The work has let them plan for the coming summer, when they hope to confirm two-way communication between humans and dolphins.
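The article doesn’t describe how component strings were matched to behaviours, but a minimal version of the idea is to tally short runs (n-grams) of component labels separately for each behavioural context and see which runs dominate where. All the data and context labels below are hypothetical.

```python
from collections import Counter

def ngrams(seq, n=2):
    """All length-n runs of component labels in one whistle."""
    return list(zip(*(seq[i:] for i in range(n))))

def ngrams_by_context(sequences, contexts, n=2):
    """Tally component n-grams separately for each behavioural context."""
    counts = {}
    for seq, ctx in zip(sequences, contexts):
        counts.setdefault(ctx, Counter()).update(ngrams(seq, n))
    return counts

# Hypothetical data: whistles as strings of discovered component labels,
# each tagged with the behaviour observed when it was recorded.
seqs = [[3, 1, 4], [3, 1, 5], [2, 6, 6], [2, 6, 1]]
tags = ["mother-calf", "mother-calf", "foraging", "foraging"]
for ctx, counter in ngrams_by_context(seqs, tags).items():
    print(ctx, counter.most_common(1))
# mother-calf [((3, 1), 2)]  -> the (3, 1) pair tracks mother-calf exchanges
# foraging [((2, 6), 2)]
```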

Deacon is excited to see if such work can lead to a better understanding of animal cultures. He suspects much animal communication will turn out to be basic pointing or signposting rather than more complex language. But humans often communicate on a basic level too. “I don’t see a fundamental white line that distinguishes us from other animals,” he says.
