Several posts on this blog, including a recent one, have dealt with the basic question of the extent to which visual stimuli can undermine learning of sound, movement and touch (the basic stuff of the haptic approach to pronunciation teaching). I went back to Doeller and Burgess (2008), "Distinct error-correcting and incidental learning of location relative to landmarks and boundaries" (full citation below), one of the key pieces of research and theory that our haptic work has been based on.
In essence, that study demonstrated that we have two parallel systems for learning locations, in two different parts of the brain: one keyed to landmarks in the visual (or experiential) field, the other to the boundaries of that field. Furthermore, boundaries tend to override landmarks in navigation. (For instance, when finding your way in the dark, your first instinct is to move along the wall, touching what is there if possible, not to steer by landmarks or objects in the field in front of you, whose relative locations may be much less fixed in your experience.)
Most importantly for us, boundaries tend to be learned incidentally; landmarks, associatively. In other words, location relative to boundaries works more like a map, where a point is identified first by where it sits relative to the boundary, not by the other points within the map itself. Landmarks, conversely, tend to be learned relative to each other, not in relation to the boundary of the field, which may be irrelevant anyway, or not conceptually present at all.
So what does that imply for teaching English vowels?
- What learners access in memory, while still actively working on improving pronunciation, is generally a picture or image of a matrix with the vowels placed in it. (Having asked learners for decades how they "get to" vowels, the consistent answer is something like: "I look at the vowel chart in my mind.")
- The relative position of those vowels, especially adjacent vowels, is almost certainly tied more to the boundaries of the matrix (the sides and intersecting lines) than to the relative auditory and articulatory qualities of the sounds themselves.
- The dominance of visual schema and processing over auditory and haptic processing is such that, at least for many learners, the chart does little to facilitate access to the articulatory and somatic features of the phonemes themselves. (I realize that is an empirical question that cries out for a controlled study!)
- The phonemic system of a language is based fundamentally on relative distances between phonemes. The brain generally perceives phonemic differences as binary, e.g., a sound is either 'u' or 'U', or 'p' or 'b', even though the actual sound produced may be exceedingly close to the conceptual "boundary" separating them.
- Haptic work basically backgrounds visual schema and visual prominence, attempting to promote a stronger association between the sounds themselves and the "distance" between them, in part by locating them in the visual field immediately in front of the learner, using gesture, movement and touch, so that the learner experiences the relative phonemic "differences" as distinctly as possible.
- We still do some initial orientation to the vowel system using a clock image with the vowels imposed on it, to establish the technique of using vowel numbers for correction and feedback, but we try to get away from that as soon as possible. That visual schema, too, gives the impression that the vowels are somehow "equidistant" from each other, and, according to Doeller and Burgess (2008), the vowels are probably more readily associated with the boundary of the clock than with each other.
Doeller, C. F. and Burgess, N. (2008). "Distinct error-correcting and incidental learning of location relative to landmarks and boundaries." Proceedings of the National Academy of Sciences. Retrieved December 19, 2015, from http://www.pnas.org/content/105/15/5909.long