Sunday, November 13, 2016

(New) Haptic cognition-based pronunciation teaching workshop at 2016 TESL Ontario Conference

If you are coming to the 2016 TESL Ontario Conference later this month (November 24 and 25 in Toronto), please join us for the Haptic Pronunciation Teaching Workshop on Thursday, 3:45 to 4:45. It will introduce the new "haptic cognition" framework for (amazingly) more efficient and integrated pronunciation modeling and correction that we have been developing for the last year or so. (See the previous post on the applicability of a haptic cognition-based model to pronunciation teaching in general.)
HaPT-E, v4.0

Haptic cognition defined: 
  • The felt sense of pronunciation change (Gendlin, 1996) – somatic (body) awareness and conscious, meta-cognitive processing 
  • Change activated consciously and initially through body movement pattern use (Lessac, 1967) 
  • Haptic (movement+touch) uniting, integrating and “prioritizing” of modalities in anchoring and recall (Minogue, 2006)
Modalities of the model:
  • Meta-cognitive (rules, schemas, explanations, conscious association of sound or form to other sounds or forms)
  • Auditory (sound patterns presented or recalled) 
  • Haptic
    • Kinesthetic (movement patterns experienced/performed or mirrored by the body, gesture, motion patterns)
    • Cutaneous (differential skin touch: pressure, texture, temperature)
  • Vocal resonance (vibrations throughout upper body, neck and head)
  • Visual (visual schema presented or recalled: graphemes, charts, colors, modeling, demonstrations) 
General instructional principles:
  • Get to "haptic" as soon as possible in modeling and correcting.
  • Use precise pedagogical movement patterns (PMPs), including tracking and speed in the visual field.
  • Ensure as much cutaneous anchoring as possible.
  • Go “light” on visual; avoid overly “gripping” visual schema during haptic engagement.
  • Use as much vocal resonance as possible.
  • Repeat as few times as possible.
  • Ensure that homework/follow-up is feasible, clear, and done (including post hoc reporting of work, results, and incidental/related learnings).
  • Use haptic PMPs first in correction/recall prompting, before providing an oral/spoken model.
The elaborated, audio-embedded PowerPoint from the workshop will be available later this month.

KIT

Tuesday, November 8, 2016

The "myth-ing" link in (pronunciation) teaching: Haptic cognition

Nice piece from The Guardian Teacher Network, Four neuro-myths still prevalent in schools, debunked, by Bradley Busch (@Inner_Drive). Now granted, The Guardian is not your average refereed, first-tier journal, but the sources and research cited in the readable piece are credible. Just in case you need a little more information to help a colleague finally abandon any of these myths, check it out. The four myths are:
Haptic Wolverine, 2016
  • Learning styles are important in teaching and instruction.
  • We use just 10% of our brains.
  • Right-brain vs. left-brain is a relevant distinction in understanding learning and designing instruction.
  • Playing "brain" games makes you smarter and should have a more prominent place in instruction.
So, if those popular "teacher cognitions" are lacking in empirical support, especially the first and third, how should that affect the design of instruction? (Notwithstanding the fact that the second and fourth just seem so "right" at times in the classroom!)

One helpful framework, cited by Busch (and by this blog earlier), is Goswami (2008), which argues that learners learn best, in general, when taught using a multi-sensory, multiple-modality approach. From that perspective, for example, when teaching a sound, process, or vocabulary word, as many senses as possible must be brought to the party, either simultaneously or in close proximity:
  • Auditory (sound)
  • Visual (imagery)
  • Kinesthetic (muscle movement and memory)
  • Tactile/cutaneous (surface skin touch)
  • General (somatic) sensation of vocal resonance throughout the head and upper body. 
  • In addition, the potential impact of all that is conditioned by the degree of meta-cognitive engagement (conscious awareness, on the part of the learner, of all that sensory input, plus existing schemas such as rules, experience, and connections to related sounds, language bits, and processes).
How best to do that consistently is the question. The concept of "haptic cognition" (Gentaz and Rossetti, in press) suggests why haptic awareness can function to bring together all those modalities in learning. From the conclusion:

"Taken together, this suggests that the links between perception and cognition may depend on the perceptual modality: visual perception is discontinuous with cognition whereas haptic perception is continuous with cognition." (Emphasis, mine.)

In other words, visual schemas, such as charts, colors, and even text itself, may actually work against the integration of sound, resonance, movement, and meaning in pronunciation teaching. Research from a number of fields has established the potentially problematic nature of the visual modality overriding the auditory, in effect disconnecting sound from meaning. By contrast, the haptic modality generally serves to unite sensory input, connecting more readily with cognition based in sound, resonance, and meaning.

Another myth, then, needs serious reexamination: that visual explanatory schemas (images and text) in textbooks and media are a good approach to pronunciation teaching, as opposed to active experience of sound, movement, and awareness of resonance, with some visual support. What Gentaz and Rossetti are asserting (or confirming) is that visual imagery may not always effectively contribute to conscious, critical, cognitive integration and awareness in learning (the ultimate goal of all media advertising!).

In other words, pronunciation instruction should be centered more on comprehensive haptic cognition. If you are not sure just how that happens . . . ask your local haptician!

(Coincidentally, the name of our company is Acton Multiple-Modality Pronunciation Instruction Systems: AMPISys, Inc.!)