THE songs of the humpback whale are more complex than anyone realised. So claim scientists in Massachusetts who are analysing the "grammar" of whale calls with information theory. They hope this approach will tell them more about the purpose of the animals' clicks and squeals.
Information theory was born in the late 1940s, thanks to an influential series of papers by Claude Shannon of Bell Laboratories in New Jersey. Shannon described a way to measure the amount of information in a stream of symbols via the stream's "entropy", or unpredictability. The symbols could be anything from binary code to letters in words. Crucially, the less predictable the stream of symbols, the more information it is likely to contain, just as the repetitive sequence "eat eat eat" conveys less information than the non-repetitive sequence "eat this pie".
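The entropy calculation itself is simple to sketch. The short Python example below is purely illustrative and is not the researchers' code; it treats each word as a symbol and computes the first-order entropy, in bits per symbol, of the two sequences above.

    # Illustrative sketch: first-order Shannon entropy of a symbol stream.
    # Here the symbols are words; the whale study used symbols derived
    # from units of the songs themselves.
    from collections import Counter
    from math import log2

    def entropy(symbols):
        """Entropy in bits per symbol: H = -sum(p * log2(p))."""
        counts = Counter(symbols)
        total = len(symbols)
        return -sum((c / total) * log2(c / total) for c in counts.values())

    print(entropy("eat eat eat".split()))   # 0.0 bits: fully predictable
    print(entropy("eat this pie".split()))  # ~1.58 bits: every word is new

The less predictable stream scores higher: "eat eat eat" carries zero bits per word, while "eat this pie" carries about 1.58.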
The same tools can be applied to whale calls. "The humpback whale song slowly changes over the season," explains John Buck of the University of Massachusetts at Dartmouth. "They have different themes that are repeated in a regular order: some die off, some are modified, and some are added."
These themes are made up of predictable sequences of sounds, or phrases. Buck and his colleagues took recordings of humpback whales and measured variations in the songs, such as changes in frequency. They then turned these measurements into a sequence of symbols representing the different themes and measured the entropy of the sequences.
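The article does not describe how the measurements were turned into symbols or how the entropy was estimated, but the general approach can be sketched roughly as follows. The bin width, the made-up frequency values and the block-entropy estimator below are all assumptions for illustration, not the group's actual method.

    # Hypothetical sketch: quantise a series of frequency measurements into
    # discrete symbols, then estimate entropy from statistics of symbol blocks.
    from collections import Counter
    from math import log2

    def symbolise(freqs_hz, bin_width_hz=100.0):
        """Map each measured frequency to a coarse bin label (assumed scheme)."""
        return [int(f // bin_width_hz) for f in freqs_hz]

    def block_entropy(symbols, n):
        """Entropy in bits of length-n blocks of the symbol sequence."""
        blocks = [tuple(symbols[i:i + n]) for i in range(len(symbols) - n + 1)]
        counts = Counter(blocks)
        total = len(blocks)
        return -sum((c / total) * log2(c / total) for c in counts.values())

    freqs = [220, 230, 410, 400, 225, 415, 230, 405]   # made-up measurements
    syms = symbolise(freqs)
    # If the sequence has structure, the per-symbol entropy of longer blocks
    # falls below the single-symbol entropy.
    print(block_entropy(syms, 1), block_entropy(syms, 2) / 2)

Comparing the per-symbol entropy at different block lengths is one way to see how much of a sequence is predictable from its context, which is the kind of structure the whale-song analysis is probing.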
At a joint meeting of European and American acoustical societies in Berlin next month, Buck's student Ryuji Suzuki will announce the preliminary findings. Buck says the entropy of the whales' calls hints that they may have a hierarchical grammar, in which one sound is varied to agree with a sound from much earlier in the sequence. This resembles human languages, where a word at the end of a sentence can be grammatically linked to one at the beginning.
"I'm intrigued,
'"/>
Contact: Claire Bowles
claire.bowles@rbi.co.uk
44-171-331-2751
New Scientist
24-Feb-1999