The animal movement analysis system is part of the BioTracking Project, an effort conducted by Georgia Institute of Technology robotics researchers led by Tucker Balch, an assistant professor of computing.
"We believe the language of behavior is common between robots and animals," Balch said. "That means, potentially, that we could videotape ants for a long period of time, learn their 'program' and run it on a robot."
Social insects, such as ants and bees, demonstrate that successful large-scale, robust behavior can emerge from the interactions of many simple individuals, Balch explained. Such behavior can offer ideas for organizing a cooperating colony of robots capable of complex operations.
To expedite the understanding of such behavior, Balch's team developed a computer vision system that automates analysis of animal movement — once an arduous and time-consuming task. Researchers are using the system to analyze data on the sequential movements that encode information — for example in bees, the location of distant food sources, Balch said. He will present the research at the Second International Workshop on the Mathematics and Algorithms of Social Insects on Dec. 16-17 at Georgia Tech.
With an 81.5 percent accuracy rate, the system can automatically analyze bee movements and label them based on examples provided by human experts. This level of labeling accuracy is high enough to allow researchers to build a subsequent system to accurately determine the behavior of a bee from its sequence of motions, Balch explained.
For example, one sequence of motions bees commonly perform is the waggle dance, which consists of arcing to the right, waggling (walking roughly straight ahead while shaking from side to side) and arcing back to the left.
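The labeling step described above can be pictured as a supervised classification problem: each segment of a bee's trajectory is reduced to a few motion features, and the system assigns it the label of the most similar hand-labeled example. The sketch below is purely illustrative and is not the BioTracking Project's code; the feature set (mean turn rate, body oscillation) and the example values are hypothetical, and a simple 1-nearest-neighbor rule stands in for whatever classifier the researchers actually used.

```python
# Illustrative sketch (not the BioTracking code): labeling motion
# segments by 1-nearest-neighbor against hand-labeled examples.
import math

# Hypothetical features per segment: (mean turn rate, rad/s;
# mean side-to-side oscillation, rad).  Labels follow the motion
# vocabulary mentioned in the article.
LABELED_EXAMPLES = [
    ((+1.2, 0.10), "arc_right"),
    ((-1.2, 0.10), "arc_left"),
    ((0.0, 0.80), "waggle"),
    ((0.0, 0.05), "straight"),
]

def label_segment(features):
    """Return the label of the nearest hand-labeled example."""
    return min(LABELED_EXAMPLES,
               key=lambda ex: math.dist(ex[0], features))[1]

# A waggle-dance-like sequence of observed segments:
dance = [(1.1, 0.12), (0.05, 0.75), (-1.3, 0.08)]
print([label_segment(f) for f in dance])
# → ['arc_right', 'waggle', 'arc_left']
```

Once each segment carries a label, a second stage can look for characteristic label sequences (arc right, waggle, arc left) to recognize whole behaviors, which is the layered approach the article describes.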
Contact: Jane Sanders
Georgia Institute of Technology Research News