Debunking common neuromyths

By William Watts, PhD candidate, Translational Health Sciences

SciTech dispels the most popular 'neuromyths' and explores their effect on education and policy.

The oldest recorded surgical procedure, known as trepanning, involved drilling a hole into the human skull, supposedly to release trapped evil spirits. Since then, our understanding of the brain has fortunately advanced, yet public misconceptions persist. These 'neuromyths' are often harmless, raising an eyebrow only from a pedant, but certain myths hold significant sway over education and policy interventions. In this article, we delve into the most pernicious of these brain myths:

We only use 10 per cent of our brain. 

The origin of this myth is unclear, but it has led to several misguided attempts at 'brain training' and at least two questionable Hollywood blockbusters. 

First, evolution would not permit such inefficiency, especially considering that the brain is energetically expensive. Despite constituting only 2 per cent of the body’s weight, the brain consumes around 20 per cent of its metabolic energy.   

Second, brain scans consistently reveal activity across all brain regions, even at rest. While some areas are more active during specific tasks, no area is truly silent unless impaired by serious injury.

That said, using 100 per cent of the brain simultaneously would be a recipe for disaster. Indeed, in the film Lucy, the main character would more likely have triggered an epileptic seizure than mastered time travel.

Students have preferred learning styles. 

It is commonly accepted that we are either visual, auditory or tactile learners. However, this claim suffers from a total lack of evidence. Multiple studies have reported no significant difference in learning between those who learnt with their 'preferred style' and those who did not. Instead, the tenacity of this myth points to our proclivity to place people into discrete categories. 

This myth is harmful to both teacher and pupil. For the teacher, it wastes precious time and resources in planning lessons that cover each style. For the pupil, it can create a self-fulfilling prophecy where they perceive that they are not suited to certain learning techniques, so forgo them even if these techniques are more effective.  

Right-brained people are creative. 

During my first A-level Chemistry lesson, I remember my teacher announcing that most of us were left-brained, for to choose science was to reveal a penchant for analytical and logical thinking. Presumably, the people who chose the humanities were right-brained, those of an irrational and intuitive bent. Sadly, my teacher was guilty of an oversimplification.

The left/right brain myth originates from experiments performed by the Nobel laureate Roger Sperry. Sperry studied patients who had undergone surgery to sever the corpus callosum, the nerve tract that connects the two cerebral hemispheres, as a treatment for severe epilepsy. In these patients, the hemispheres acted independently of each other, a so-called 'split brain'.

Since information from the left visual field is processed by the right hemisphere and vice versa, Sperry controlled which hemisphere received visual information by changing what appeared in each half of a patient’s field of view. In this way, he observed that the left hemisphere performed better at understanding and articulating language: the patient could remember and speak a word shown in the right visual field. Conversely, the right hemisphere could understand rudimentary language but not articulate it: the patient could not speak any word shown in the left visual field, though they could draw pictures of some of them.

While Sperry’s work is undoubtedly interesting, mapping it onto ideas of left/right brain dominance has several issues. Most glaringly, healthy people do not have a severed corpus callosum, so their hemispheres operate together. In fact, all cognitive skills engage both hemispheres, and there is no correlation between supposed left/right brain traits and hemispheric activity.

This misconception can lead to problems similar to those of the previous myth, where people 'pigeonhole' themselves into certain traits, for example by believing that a logical 'left-brained' individual cannot be creative.

The adult brain does not grow new neurons. 

You may have encountered variants of this myth, such as the notion that each of us is born with a fixed number of neurons or that brain development ceases at 25. But, as ever, the maxim holds true: the brain is always more complicated than one might assume. Neurogenesis, the process of generating new brain cells, continues throughout one’s life.

This is especially pronounced in the hippocampus, where adult neurogenesis is believed to play a role in learning and memory. Additionally, brain maturity is quite a murky concept. It is difficult to pinpoint a definitive end to brain development given the variability between individual brains and the fact that structural changes continue well beyond one’s 20s.

Mistaken beliefs about brain maturity can lead adults to despair when acquiring new skills. While sensitive periods exist for learning certain skills, such as language acquisition, learning continues throughout life, accompanied by the ability to form rich neural representations of new environmental stimuli. For instance, London taxi drivers show marked structural changes in the hippocampus, a region implicated in spatial memory. So, in fact, you can teach an old dog new tricks.

Featured image: Freepik / macrovector_official


How have these 'neuromyths' affected your education?