Thursday, November 28, 2013

Giving aural comprehension "a hand"--in haptic pronunciation training

A common question we get is something to the effect of "How do the pedagogical gestures (PMPs - movements across the visual field terminating in touch on a stressed element of a word) work?" 2012 research by Turkeltaub and colleagues at Georgetown University, reported by Science Daily, suggests how that happens. In that study it was demonstrated that what you are doing with your hands may affect what you hear, or at least how quickly you hear it.

In essence, subjects were instructed to respond by touching a button, with either the right or left hand, when they detected a sound heavily embedded in background noise. The right hand was better at detecting fast-changing sounds; the left, better at slow-changing sounds. According to Turkeltaub, " . . . the left hemisphere likes rapidly changing sounds, such as consonants, and the right hemisphere likes slowly changing sounds, such as syllables or intonation . . . " Well, maybe . . .

The study at least further establishes the potential connection between haptic work and L2 sound change. In this case, when the learner performs a PMP, mirroring the model and listening to the model of the target sound--without overt speaking--anchoring should be enhanced and more efficient. Part of the reason for that, as reported in several previous posts, is that "fast" sounds tend to be in the right visual field (attached to the left hemisphere) and "slower" sounds, in the left.

Credit: AMPISys, Inc.
In the EHIEP protocol for intonation, for example, the intonation contour or tone group begins in the left visual field, with the left hand moving to the right until it touches the right hand on the stressed syllable or focus word. (See the intonation PMP demonstration linked off an earlier post.) In the vowel protocols, similar PMPs are involved, and the visual display reflects the "fast and slow" phonaesthetic quality of the vowels. (See an earlier post on that as well.)

Keep in touch! (v2.0 will be released next week!)

Wednesday, November 27, 2013

When is AH-EPS haptic pronunciation teaching best?

Quick answer: Most of the time, haptic video-based AH-EPS (Acton Haptic English Pronunciation System) is better than EHIEP live, at least in initial instruction! In all modesty, for what AH-EPS does, it is pretty much unbeatable, too. This is a follow-up to the earlier post on when EHIEP is best.

How is it possible that students learning PMPs (pedagogical movement patterns--synchronized with speaking the vowels, consonants, rhythm groups, stress patterns, intonation contours and tone groups) could do better using a video (of me!) than learning "live" from an instructor?



Clip art: Clker
There is actually a considerable amount of research, and decades of experience in several fields, identifying when video may be more appropriate and effective. I'd be happy to unpack that later in comments to this post, in fact. Here are the ETPs (elevator talking points) for when/why AH-EPS is better.

  • AH-EPS can do a substantial amount of the initial, basic pronunciation instruction for the inexperienced teacher, which can then be followed up in regular classroom instruction with modelling and correcting. 
  • Haptic pronunciation teaching, and haptic work in general, is highly susceptible to visual and auditory distraction. The haptic video framework (movement and touch performed along with the video modelling) maintains attention well. 
  • For many instructors--myself included--leading the class in initial PMP training can be quite "cognitively and affectively" complex. Trying to do the precise movements while leading the class and visually monitoring student performance at the same time is challenging, at best. When you are really tired, it is nearly impossible, especially if you are even slightly ambidextrous. (See earlier posts on that!) 
  • Most importantly, it is essential that the PMPs be performed with precision by the model, so that hand and arm placement is done consistently across the visual field. If not, some highly visual students will not be able to "nail down" or anchor where the touch occurs at or near the end of the gesture. 
  • And finally, we have about eight years of experience and field testing using the PMPs in many different instructional settings. 

AH-EPS v2.0 will be launched shortly.

For more info: actonhaptic@gmail.com

Keep in touch.


Sunday, November 24, 2013

When is EHIEP haptic pronunciation teaching best?

I got that question yesterday at the conference after our Tai Chi and linking workshop. (Shine, Olya and I will do a blogpost on the specifics of that next week.)

Quick (modest) answer: In many contexts.

Here are your basic EHIEP "elevator talking points":
Clip art: Clker

When . . .
A. Integrating new or changed pronunciation into spontaneous speech is a prime concern.
B. Learners' immediate need is anchoring new vocabulary or basic intonation contours.
C. Presenting new vocabulary, especially terms that are not easily contextualized.
D. Doing on-the-spot correction of mispronunciation, especially in class.
E. Holding learners' attention during pronunciation work is problematic (due to environmental distraction or other "internal" factors).
F. Doing focused peer correction of basic prosodics (intonation, rhythm and stress) using oral reading of conversational texts.
G. Learner pronunciation homework is critical to success.

Those are EHIEP-based (Essential Haptic-integrated English Pronunciation), the basic model we have been developing here and elsewhere for some time. (For more info and a free copy of the draft v2.0 AH-EPS Instructor's Guide, email: actonhaptic@gmail.com.) For demos of what the basic pedagogical movement patterns look like, see this earlier blogpost. (Do that soon; the links are only live until 11/30!)

Tomorrow's post will focus on when AH-EPS, the haptic video system for doing EHIEP, is best.

Keep in touch!





Sunday, November 17, 2013

Teaching linking in speaking with touch and Tai Chi

Clip art: Clker
This one will be fun. If you are in Vancouver next Saturday, join us: 

Workshop to be presented at the BCTEAL Lower Mainland Regional Conference in Vancouver, BC, at Columbia College, November 23, 2013, 1:30-2:30. 

(Hapticians: JaeHwa Hong, Olya Kliuyeva and myself)

Pay attention to pronunciation!

As reported in earlier posts, no matter how terrific our attempt at pronunciation teaching is, if a learner isn't paying attention or is distracted, chances are not much uptake will happen--especially when haptic anchoring is involved. No surprise there. A new study by Lavie and colleagues of the UCL Institute of Cognitive Neuroscience, focusing on "inattentional blindness" and entitled "How Memory Load Leaves Us 'Blind' to New Visual Information," just reported at Science Daily, sheds new "light" on exactly how visual attention serves learning.

In essence, when subjects were required to momentarily attend to an event or object in the visual field and remember it, their ability to respond to new events or distractions occurring immediately afterward was curtailed significantly. (The basic stuff of hypnosis, stage magicians and texting while driving, of course!)

What is of particular interest here is that, whereas a visual image one is attempting to focus on can strongly exclude other competing distractions, in haptic-integrated pronunciation instruction that exclusionary effect works in our favor. It helps explain the potential effectiveness of the pedagogical movement patterns of EHIEP and AH-EPS:

  • Carefully designed gestures across the visual field 
  • Performed while saying a word, sound or phrase 
  • With a highly resonant voice, and
  • Terminating in some kind of touch on a stressed vowel, what we term "haptic anchoring." 
It also explains why insightful and potentially priceless comments from instructors, when they come in too close proximity to vivid and striking pronunciation-related "visual events," . . . may not stick or get "uptaken!" 

See what we mean? 



Wednesday, November 13, 2013

Embodied cognitive complexity--with haptic-integrated pronunciation!

I'm doing a plenary at the BCTEAL regional conference next week. Here is the abstract:
Credit: Villanova.edu


"This interactional presentation focuses on three of the most influential ideas in research in the field today: e-learning, embodiment and cognitive complexity. Taken together, the three help us address the question: How can students effectively acquire a second language--and especially pronunciation and high level cognitive functions--when more and more of their learning experience is mediated through computers?"

The point of my talk will be the power of haptic anchoring (as a form of embodiment), both in developing technologies such as the iPhone and in representing and teaching very complex concepts--even pronunciation! Those two perspectives are converging rapidly today, especially when it comes to dealing with today's media-immersed and media-integrated learners. Ironically, embodied methodologies--involving explicit training and control of the body and management of its immediate physical milieu--provide both great promise and great cause for "a sober second look," as Canadians often remark. 

I'll spend more time on the former but will return to the latter here in a later post. If you'd like to initiate that discussion now, feel free! (Note: Unfortunately, I have had to switch to moderating all comments on this blog. If you do submit a comment, I'll review it quickly. Promise!) 

Sunday, November 10, 2013

Announcing new AH-EPS v2.0 packages and Demonstration videos!

Along with the release of v2.0 of the Acton Haptic English Pronunciation System will come a new set of four 2-module packages: vowels and word stress, rhythm and linking, intonation and expressiveness, and fluency and integration. Any one of those packages can be used as a set. Each also includes some basic introductory AH-EPS material for students. 

Also, for a limited time, links to Vimeo-streaming demonstrations of the haptic pedagogical movement patterns (PMPs--see the Teaman and Acton paper) are included below. Each video will give you an idea of the basic haptic (movement + touch) gesture that is used in presenting, practicing and correcting pronunciation in that module. (If you cannot access Vimeo, email actonhaptic@gmail.com for a demonstration DVD or further information.)

NOTE: Some of the demo links below are now password protected but will be available shortly as part of AH-EPS v2.0, either on the AH-EPS DVDs or streaming off Vimeo.com. If you would like to view some of the demos, please email me at actonhaptic@gmail.com for a temporary password!

Credit: AMPISys, Inc. 
A module typically includes instructional and student materials, plus a set of videos, including:

(a) A warm up DEMONSTRATION
(b) Demonstration of new PMP
(c) Review of PMP from previous module(s)
(d) Training in new PMP
(e) Practice of new PMP
(f) Practice of new PMP in conversational dialogues

Package 1. Vowels and word stress
(Module 2) Short vowels (lax vowels) DEMONSTRATION
(Module 3) Long vowels (tense vowels, and tense vowels + off glides) DEMONSTRATION

Package 2. Rhythm, phrasal stress and linking 
(Module 4) Syllable grouping DEMONSTRATION
(Module 6) Rhythm training and linking DEMONSTRATION (rhythm training only)

Package 3. Intonation and expressiveness
(Module 5) Basic Intonation DEMONSTRATION
(Module 8) Expressiveness (discourse intonation) DEMONSTRATION

Package 4. Fluency and integration
(Module 7) Conversational fluency DEMONSTRATION
(Module 9) Integrating pronunciation change DEMONSTRATION

Each package includes:
Instructor materials: complete Instructor's Guide download and Vimeo video streaming of 2 modules (hardcopy and DVDs available)
Student Workbook materials from 2 modules: Workbook download and Vimeo video streaming of 2 modules (hardcopy and DVDs available)
Cost will be about $35 for the download/streaming version, or $90 plus shipping for the hardcopy/DVD version.
-----------------------------------------------------------------------
Cost of other packages will range from $50 (consonants) to $400 (including student practice DVD/videos for a class of 12).

In addition to complete AH-EPS packages of videos and materials (in download or streaming versions), the 2-module packages will be available later this month.

Keep in touch!

Thursday, November 7, 2013

Pronunciation anxiety? Don't worry, be "haptic!"

Have done several previous posts that "touch" on the effects of interpersonal touch, such as "healing touch." In our kind of haptic pronunciation teaching, for a number of reasons, we use only "intra-personal" touch: typically the hands touching each other, or the hands touching the arms, shoulders or the outside of the hips. Generally, that's it. A new study by Koole and colleagues at the University of Amsterdam, reported in Science Daily in a summary entitled "Touch may alleviate existential fears in people with low self-esteem," re-opens that intriguing area of research and development for me.

Credit: AMPISys, Inc. 
I earlier explored interpersonal touch in private work, for example where a couple, or two female learners, practiced the EHIEP pedagogical movement patterns together, one touching the hand of the other on stressed syllables in anchoring new pronunciation. (Have also had reports from instructors who work with child L2 learners that various group-based hand-clapping or "give-me-five" gestures seem to work well, too.) The reports from the students were quite positive. Have always wanted to get back to figuring out culturally and interpersonally appropriate use of interpersonal touch.

There are certainly good reasons for that. Koole's work suggests that even our "intra-personal" touch and gesture work may "work" better than we thought! Although this is close to being filed in our "Well . . . duh!" file (a study that empirically validates common sense), in essence, interpersonal touch--even touching inanimate objects--lowered anxiety for some people, and anxiety can easily cancel out any kind of instruction, let alone haptic engagement. What caught my eye was the last sentence: "The researchers are currently exploring the possibilities of simulated interpersonal touch through the use of a 'haptic jacket,' which can electronically give people the feeling that they are being hugged."

Hug your local haptician . . . and bring your teddy bear to class today. 

Sunday, November 3, 2013

Minding your P's and Q's: Pronunciation Change Mindfulness at work! Quiet!

Clip art: Clker
As unpacked in earlier posts, "mindfulness" theory is often a good point of departure for understanding and managing pronunciation change, both as it is initiated in the classroom and "worked at" outside of class. A 2013 piece entitled "Mindfulness-based emotional intelligence: Research and training," by Ciarrochi and Godsell of the University of Wollongong, presents an interesting and useful set of parameters for optimal functioning of emotional intelligence, based on mindfulness theory and mindfulness training:

  • Identifying personal emotional states
  • Managing "incoming" emotion, recognizing intent of emotion expressed by others and appropriate responses to it
  • Countering fusion (counterproductive influences of emotion in ways that undermine concentration, analysis, logic, learning or self concept)
  • Expressing emotion
How does that apply to our work? It is a good set of guidelines for learners to review as they practice, being mindful at all times of the state of their "mindset." Especially in haptic-integrated pronunciation practice, some degree of mindfulness is essential to ensure that targeted sounds get their basic 3~8 seconds of undivided attention:

  • Focus intensely on the present moment and task at hand, with controlled, emotional engagement,
  • Work at anchoring the new or changed sounds quickly, speaking out loud in an expressive and resonant voice (accompanied by a haptic, pedagogically-designed gesture, of course!)   
Students can be trained to do that. Should be. At the very least something to be mindful of . . .

Saturday, November 2, 2013

Introduction to some haptic gadgets - II

Kudos to CNN Tech Trends for this nice 14-slide piece by Arion McNicoll on haptics and new haptic gadgets. If you are just getting "in touch" with haptics, you'll like this. See especially slide #8 on Tesla Touch. I have done some research on that technology recently, an approach that may have promise for our AH-EPS haptic pronunciation work. (See also the recent blogpost linking the TED talk on haptics.)
Credit: CNN.com