Biologists have long marveled over the complex social behaviors seen among insects such as bees and ants, where different groups of individuals specialize in different tasks. Now, a team of roboticists has managed to emulate the cooperation strategy of leafcutter ants with computer simulations of small, four-wheeled robots. The result could lead to swarms of robots that team up and organize with minimal human intervention and could shed light on how cooperation evolved in animals.
“This is one of the few studies in which swarm robotics is used to test a biological hypothesis,” says computer scientist Elio Tuci of Aberystwyth University in the United Kingdom, who was not involved in the study. “It’s a nice piece of work.”
The robots mimic leafcutter ants, which are masters of organization and cooperation. The ants (genus Atta) use cut-up leaf fragments to cultivate fungus, which they farm for food. When foraging in forests, some of the ants—the so-called droppers—climb into the treetops and cut the foliage, which they drop down to the forest floor. Other ants—the collectors—gather the harvest and carry it back to the nest. Drawing inspiration from the leafcutters, Eliseo Ferrante, a roboticist at the University of Leuven in Belgium, and colleagues designed robots whose objective was to gather cylindrical blocks—the analog of leaves—at the top of a ramp and return them to a “nest” area at the bottom. The blocks would also roll down the ramp if the robots dropped them from the top, mimicking the way a leaf would fall from the canopy to the forest floor. The ramp could also be flattened to prevent rolling.
The robots used in this study could exhibit only very simple behaviors: They could move to and from a light source placed above the blocks, move about randomly, or pick up or drop an object. Within this framework, each robot could be programmed to mimic the roles of the leafcutters: Droppers followed the light to the top of the ramp, picked up blocks, and dropped them back down the slope; collectors stayed at the bottom of the ramp to pick up the blocks and carry them back to the nest; and generalists both collected blocks from the top of the ramp and carried them all the way back to the nest.
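The three roles can be read as simple per-tick controllers built from those primitives. The sketch below makes that concrete in a toy, self-contained model; the 1-D ramp (nest at position 0, light and blocks at the top), the `Robot` class, and all behavior details are illustrative assumptions, not the authors' actual controllers.

```python
# Toy sketch of the three role programs, built from the article's primitives
# (follow the light, move away, pick up, drop). The 1-D ramp model and all
# names here are illustrative assumptions, not the paper's code.

RAMP_TOP = 5   # position of the light and blocks; 0 is the nest, 1 the ramp base

class Robot:
    def __init__(self, role):
        self.role = role          # "dropper", "collector", or "generalist"
        self.pos = 0
        self.carrying = False

def step(robot, fallen_blocks, nest):
    """Advance one robot by one primitive action per tick."""
    if robot.role == "dropper":
        if not robot.carrying:
            if robot.pos < RAMP_TOP:
                robot.pos += 1                # follow the light up the ramp
            else:
                robot.carrying = True         # pick up a block at the top
        else:
            robot.carrying = False            # drop it; it rolls to the base
            fallen_blocks.append("block")
    elif robot.role == "collector":
        if not robot.carrying:
            if robot.pos != 1:
                robot.pos += 1 if robot.pos < 1 else -1   # wait at the ramp base
            elif fallen_blocks:
                fallen_blocks.pop()
                robot.carrying = True         # pick up a fallen block
        else:
            if robot.pos > 0:
                robot.pos -= 1                # carry it toward the nest
            else:
                robot.carrying = False
                nest.append("block")
    else:  # generalist: climb, pick up, and deliver all the way alone
        if not robot.carrying:
            if robot.pos < RAMP_TOP:
                robot.pos += 1
            else:
                robot.carrying = True
        else:
            if robot.pos > 0:
                robot.pos -= 1                # move away from the light
            else:
                robot.carrying = False
                nest.append("block")
```

Running a dropper and a collector together in this toy world moves blocks to the nest faster than one generalist's full round trips, which is exactly the advantage the sloped environment rewards.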
Real robots are slow and expensive to use en masse, however, so the researchers chose to simulate the physical robots with digital versions. This allowed the team to carry out many generations of evolution rapidly. Ferrante and Tuci both say they’re sure that the physical bots would behave just the same if the in silico environment were replicated in the real world.
First, the researchers simply preprogrammed the robots to be droppers, collectors, or generalists. The trial was designed to test whether populations of robots would evolve to find the optimal mixture of the three behavior patterns, as the researchers report this month in PLOS Computational Biology. In this experiment, the evolutionary process was really just an exhaustive search of all possible ratios of droppers to collectors to generalists. After each new combination of the three was tried, the computer tabulated how efficient the foraging was, and the best combinations were selected. As expected, in the sloped environment, evolution favored a mixture of mostly droppers and collectors; but in the flatlands, where there is no advantage to dropping a block, the generalist strategy prevailed.
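In outline, that first experiment is a brute-force search over swarm compositions. The sketch below makes the idea concrete with a toy fitness function; the scoring model is a guess at the qualitative trade-off, not the authors' physics simulation: on a slope, dropper–collector pipelines beat generalists because dropped blocks roll down for free, while on flat ground only generalists can finish the job.

```python
# Exhaustive search over role ratios, as in the first experiment. The
# fitness model below is an illustrative stand-in for the full simulation.

def toy_fitness(droppers, collectors, generalists, sloped):
    """Crude foraging-rate model: a dropper/collector pair moves blocks
    faster than a generalist's full round trip, but only on a slope."""
    if sloped:
        pipeline_pairs = min(droppers, collectors)
        return 3.0 * pipeline_pairs + 1.0 * generalists
    # On flat ground a dropped block goes nowhere, so only generalists score.
    return 1.0 * generalists

def best_mix(swarm_size, sloped):
    """Try every (droppers, collectors, generalists) split and keep the best."""
    best, best_score = None, float("-inf")
    for d in range(swarm_size + 1):
        for c in range(swarm_size + 1 - d):
            g = swarm_size - d - c
            score = toy_fitness(d, c, g, sloped)
            if score > best_score:
                best, best_score = (d, c, g), score
    return best
```

For a swarm of 10, this toy model picks an even dropper/collector split on the slope and an all-generalist swarm on flat ground, mirroring the qualitative result the researchers report.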
One biological theory for how task specialization evolved in nature posits that the building blocks for cooperation were adapted from existing behaviors. Sibling care might have evolved from parental care, for instance. This experiment showed that if the building blocks are already in place, evolution can optimize the mixture of roles to fit the environment.
In a second study, however, the robots were given no such preset roles, only the objective of filling the nest. In the early generations, most of the machines bumbled about randomly, but gradually, as “mutations” altered the robots’ behavior, some of them stumbled onto productive sequences of actions. At the end of each simulated generation, the robots with the most productive behaviors were more likely to “survive” and go on to take part in future generations. Fitness was defined as the number of blocks the group returned to the nest over 5000 seconds. The researchers acknowledge that their methodology—specifically, the fact that all robots in each swarm are “genetically” identical—deviates from evolution in biology, in which fitness is measured on an individual basis. But, they say, the approach still allowed them to study whether cooperation could evolve from scratch.
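That generational loop of clonal swarms, random mutation, and selection on swarm-level fitness can be sketched in a few lines. Everything below (the bit-string genome, mutation rate, population size, and the fitness proxy) is an illustrative assumption standing in for the authors' full behavioral simulation.

```python
import random

# Evolutionary loop in the style of the second experiment: each genome
# programs an entire (genetically identical) swarm, and swarm-level fitness
# drives selection. The bit-string genome and toy fitness are assumptions.

GENOME_LEN = 16
TARGET = [1] * GENOME_LEN   # stands in for a fully productive behavior sequence

def swarm_fitness(genome):
    """Toy proxy for 'blocks returned to the nest in 5000 seconds'."""
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.05):
    """Flip each behavior 'gene' with a small probability."""
    return [1 - g if random.random() < rate else g for g in genome]

def evolve(pop_size=20, generations=200, seed=0):
    random.seed(seed)
    population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=swarm_fitness, reverse=True)
        elite = ranked[: pop_size // 4]            # fittest swarms "survive"
        population = elite + [mutate(random.choice(elite))
                              for _ in range(pop_size - len(elite))]
    return max(population, key=swarm_fitness)
```

Because every robot in a swarm shares one genome, selection here acts on the group's total haul rather than on individuals, which is the deviation from biological evolution the authors acknowledge.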
As seen in the video above, after 500 generations the researchers saw the generalist strategy begin to emerge: navigate toward the light, pick up a block, navigate away from the light, drop the block, repeat. After 1000 generations, task partitioning began to appear, and the cooperative behavior was optimized and refined over an additional 1000 generations.
The results demonstrate that division of labor could have evolved even without preset roles, the authors say. “There are a bunch of key evolutions,” Ferrante says. “The first thing they learn is the ability to do foraging. They would need to learn first to pick up objects and then they need to go [back to the nest]. To achieve that you need three or four mutations.”
Tuci agrees that the model works “quite well” to show how these behaviors could have evolved, but worries that the methods are a bit too basic to draw conclusions about how the process works in the biological world. The robots’ simple design and limited variety of behaviors may actually force the robots down specific evolutionary paths, he cautions: “You [can] somehow constrain the evolutionary process too much and you get [out] what you put into the system.”
(Video credit: Eliseo Ferrante)