Saturday, October 1, 2022

What comes next in pronunciation teaching? (Why being in touch is so important!)

An intriguing new study by researchers at the University of East Anglia, Aix-Marseille University and Maastricht University, summarized by Neurosciencenews.com as "How the Sounds We Hear Help Us Predict How Things Feel" (title and actual empirical findings to be revealed later; the summary includes no link to the study itself, other than a note that it will appear in Cerebral Cortex).

I am, nonetheless, delighted to take their word for it, since I LOVE the conclusions and find them "touching"! Apparently they have uncovered yet another "new" type of connection between sound and touch (tactile processing). The key finding from the summary:

“ . . . research shows that parts of our brains, which were thought to only respond when we touch objects, are also involved when we listen to specific sounds associated with touching objects. (Italics, mine.) This supports the idea that a key role of these brain areas is to predict what we might experience next, from whatever sensory stream is currently available.”

Across this unique, recently discovered circuit, for example, when we hear a sound, such as a single consonant, the brain in principle simultaneously connects it with the physical sensations associated with articulating it (touch, vocal resonance, the micro-movements involved in producing it). If the focus is a word, on the other hand, we assume that multiple other, analogous circuits come into play that link to other dimensions. But the "touch" circuit has those unique properties.

So what might that mean in the classroom, especially for pronunciation teaching and its effectiveness? (I'll get to haptic pronunciation later, of course!) For one thing (NO SURPRISE HERE!), a sound may be associated with the somatic (body) sensations in the vocal tract but not necessarily with the concept, the phoneme, the phonological complex/nexus, or the graphemic representation itself. It is as if the sound points at the body, not the "brain" as a whole.
 
On the other "hand," any number of other words could have virtually identical "points of impact" on the body, associated with the same vowel "sound." The same may apply to a word articulated simultaneously with a gesture, or to any experience associated with a sound, whether heard or self-generated. That circuit connects the auditory image to at least the "body," but not necessarily to one concept.

Then what is the "workaround" for bringing together the multisensory event termed a "word"? Assume, for example, that it has been learned truly "multi-sensorially," that is, with as many senses as possible, or at least a "quorum" of them, as vividly or intensely engaged as possible.

In a sense, the "answer" is in the question: consistent, rich multisensory engagement. There are an almost infinite number of ways to accomplish that, of course, but haptic pronunciation teaching, based on touch-anchored, speech-synchronized gesture, attempts to do that systematically. In principle, any sound, word or sound process can be experienced as a nexus involving:
  • the physical sensation of articulating the sound/process
  • the auditory features of the sound (acoustic)
  • a concept (in the case of a word or, in some cases, patterns of pitch movement)
  • a gesture that involves hands touching each other or the body, in some manner that mimics either the nature of the sensations involved in articulation or the "shape" of the concept itself, such as hands rising on a rising pitch or intonation, or hands positioned high in the visual field to represent a "high" vowel.
According to the study, the use of haptic, touch-anchored gesture should strengthen considerably the connection between the concept associated with the gesture and the sound by "pointing" to the body-sensations involved in articulating the sound.

And, of course, from our perspective, KINETIK (method) is what is coming next!

Source: https://neurosciencenews.com/auditory-tactile-processing-21279/

