Nice piece from The Guardian Teacher Network: "Four neuro-myths still prevalent in schools, debunked," by Bradley Bush (@Inner_Drive). Now granted, The Guardian is not your average refereed, first-line journal, but the sources and research cited in this readable piece are credible. Just in case you need a little more information to help a colleague finally abandon any of them, check it out. The four myths are:
(Image: Haptic Wolverine, 2016)
- Learning styles are important in teaching and instruction.
- We use just 10% of our brains.
- The right-brain vs. left-brain distinction is relevant to understanding learning and designing instruction.
- Playing "brain" games makes you smarter and should have a more prominent place in instruction.
So, if those popular "teacher cognitions" are lacking in empirical support, especially the first and third, how should that affect the design of instruction? (The fact that the second and fourth can just seem so "right" at times in the classroom notwithstanding!)
One helpful framework, cited by Bush (and by this blog earlier), is Goswami (2008), which argues that learners generally learn best when taught using a multi-sensory, multiple-modality approach. From that perspective, when teaching a sound, process or vocabulary word, for example, as many senses as possible should be brought to the party, either simultaneously or in close succession:
- Auditory (sound)
- Visual (imagery)
- Kinesthetic (muscle movement and memory)
- Tactile/cutaneous (surface skin touch)
- General (somatic) sensation of vocal resonance throughout the head and upper body
- In addition, the potential impact of all that is conditioned by the degree of meta-cognitive engagement: the learner's conscious awareness of that sensory input, plus existing schemas such as rules, experience and connections to related sounds, language bits and processes.
How best to do that consistently is the question. The concept of "haptic cognition" (Gentaz and Rossetti, in press) suggests how haptic awareness can function to bring together all those modalities in learning. From the conclusion:
"Taken together, this suggests that the links between perception and cognition may depend on the perceptual modality: visual perception is discontinuous with cognition
whereas haptic perception is continuous with cognition." (Emphasis mine.)
In other words, visual schemas, such as charts, colors and even text itself, may actually work against the integration of sound, resonance, movement and meaning in pronunciation teaching. Research from a number of fields has established how problematic it can be when the visual modality overrides the auditory, in effect disconnecting sound from meaning. The haptic modality, by contrast, generally serves to unite sensory input, connecting more readily with cognition grounded in sound, resonance and meaning.
Another myth, then, needs serious reexamination: that visual explanatory schemas (images and text) in textbooks and media are a good approach to pronunciation teaching, as opposed to active experience of sound, movement and awareness of resonance, with some visual support. What Gentaz and Rossetti are asserting (or confirming) is that visual imagery may not always contribute effectively to conscious, critical, cognitive integration and awareness in learning (the ultimate goal of all media advertising!).
In other words, pronunciation instruction should be centered more on comprehensive
haptic cognition. If you are not sure just how that happens . . . ask your local haptician!
(Coincidentally, the name of our company is Acton Multiple-Modality Pronunciation Instruction Systems: AMPISys, Inc.!)