James Charlick looks to Bristol research to investigate the effect of 'gut instinct' on animal behaviour and decision-making
“The best is the enemy of the good” is the Italian proverb serving as the epigraph of a new theoretical paper, which argues that simple internal mechanisms are more useful in situations like foraging than a computational system that integrates all prior information to generate an accurate representation of an environment over time (Bayesian learning).
Selection in nature does not favour complexity or ‘perfection’; rather, simplicity is often energetically desirable, and so prevails throughout the animal kingdom - at the heart of the heartbeat, the wave of a hand, the appetite for food and the motivation to find it.
Professor Andrew Higginson at the University of Exeter, in collaboration with the University of Bristol, provides new mathematical evidence that even if gut instinct is just a roundabout description of a feeling, it is also a mechanism highly conserved throughout evolution, favoured over a more sophisticated but costlier system.
The study, published in Proceedings of the Royal Society B: Biological Sciences, investigates the way in which organisms integrate information about past events to develop a picture of food availability with enough accuracy that it can be used to determine future foraging behaviour.
Traditionally, research has dealt with the uncertainty of environmental information, such as the amount of pollen available, by incorporating a Bayesian system into models that describe animal foraging. This is likely to be impossible in nature: it would require a large brain, and would waste energy that should be conserved for actual foraging. Higginson therefore proposes a new model in which animals have evolved a much simpler mechanism for judging and acting within their environment; one that is heuristic and which seems, on the surface, inferior to the more sophisticated Bayesian system.
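To make the contrast concrete, here is a minimal sketch of the kind of bookkeeping Bayesian learning demands - a Beta-Bernoulli update on the probability of finding food in a patch. This is an illustrative toy, not the model from the study; the function name and numbers are invented for the example.

```python
def bayes_update(successes, failures, a=1.0, b=1.0):
    """Beta-Bernoulli posterior over the probability of finding food.

    A toy illustration of the costly statistical bookkeeping the paper
    argues real animals avoid -- not the equations from the study.
    Starts from a flat prior Beta(a=1, b=1) and folds in every past
    foraging outcome to estimate how food-rich the environment is.
    """
    a += successes          # each success shifts belief towards "food-rich"
    b += failures           # each failure shifts belief towards "food-poor"
    mean = a / (a + b)      # posterior mean estimate of food probability
    return a, b, mean

# After 7 successful and 3 failed foraging attempts:
a, b, mean = bayes_update(7, 3)   # posterior Beta(8, 4), mean 8/12 ≈ 0.667
```

The point of the example is the memory requirement: a true Bayesian forager must carry forward its entire history of outcomes (here compressed into `a` and `b`), whereas the heuristic the paper favours needs only the animal's current energy reserves.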
Higginson’s model predicts that an animal’s energy reserves track food availability: reserves tend to be high when conditions are good and to fall when food is scarce, so low reserves prepare the animal for bad conditions. The model also demonstrates how similarly a hypothetical animal with perfect knowledge of food availability forages compared with one guided only by its energy reserves. Under fluctuating conditions, a reserve-based foraging strategy performs almost as well as sophisticated Bayesian learning; and since it requires far less energy, evolution has conserved it over the more sophisticated strategy of integrating environmental information.
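The reserve-based rule of thumb can be sketched as a short simulation. This is a hedged toy model, not Higginson’s published equations: the parameter values, switching probability and cost function below are all invented for illustration.

```python
import random

def reserve_based_intensity(reserves, max_reserves=100.0):
    """Toy rule of thumb: forage harder as reserves fall.

    Illustrative only -- the animal consults a single internal state
    (its energy reserves) rather than any record of past food levels.
    """
    return 1.0 - reserves / max_reserves   # intensity in [0, 1]

def simulate(days=1000, seed=0):
    """Simulate a forager in an environment that occasionally switches
    between a food-rich and a food-poor state (parameters hypothetical)."""
    rng = random.Random(seed)
    max_reserves = 100.0
    reserves = 50.0
    good = True
    history = []
    for _ in range(days):
        if rng.random() < 0.01:            # rare environmental flip
            good = not good
        p_food = 0.8 if good else 0.3      # chance of food per unit effort
        intensity = reserve_based_intensity(reserves, max_reserves)
        gain = 10.0 if rng.random() < p_food * intensity else 0.0
        cost = 2.0 + 3.0 * intensity       # metabolic + foraging cost
        reserves = min(max_reserves, max(0.0, reserves + gain - cost))
        history.append(reserves)
    return history

history = simulate()
```

Note the feedback loop the article describes: when the environment flips to the food-poor state, reserves drain, intensity rises, and the animal automatically “works harder to find food” without ever estimating food availability directly.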
Becoming frustrated and possibly angry when hungry is good for foraging animals; they work harder to find food. But like so many evolutionarily conserved traits, this response is now disproportionate in humans, since the effort our ‘foraging’ demands has dropped considerably. Still, how animals respond to environmental change remains relevant to us when considering the impact of human activity on habitats.
And at the extremes, there seems to be some room for Bayesian learning. Higginson’s model suggests that its extra cost is repaid, relative to a reserve-based strategy, when changes in food availability are either large and abrupt or subtle and infrequent. Does this mean that rapidly changing environments favour better learning? It is certainly an interesting challenge to the suggestion that learning pays off most in a steadily changing environment.
Despite its reliability in stable conditions, gut instinct for foraging does not adapt well to extreme changes. It is, however, an effective way of keeping track of current food levels; so if the environment changed atypically over a long period of time, the forager would modify its decisions in line with that change. There’s no need to calculate the intensity with which you should go hunting, forage around a brush or plunge beak-first into the ocean - your belly is empty, and it is telling you to find a place to eat.
The model proposed by Higginson is convincing because we already know that hormones govern these internal states: ghrelin and leptin control appetite, while cortisol and adrenaline control levels of stress. These physiological - and associated psychological - changes make us behave in certain ways. The study also looks to the future, proposing applications of the model in practice: might we compare hormone-driven foraging with the success of cognitive information processing in, for example, avoiding a predator?
Featured image: Unsplash / Andrea Reiman