Clip art: Clker
Ever since seeing the 1982 Clint Eastwood film,
Firefox, I have been intrigued by the idea of using eye tracking for teaching pronunciation. In the film, Eastwood flies a stolen, high-tech Russian fighter--controlled by his eyes and a few spoken Russian commands--back to the US. As noted in the right column, the work of Bradshaw and Cook in developing
"Observed Experiential Integration,", which involves extensive, therapeutic use of eye tracking, was fundamental to much of the early development of the HICP/EHIEP framework. Here is, basically, a marketing piece for a company that has developed (from my perspective)
an amazing range of eye tracking-based software applications. Reading through the product list, it appears that only two or three of those applications would be needed for a learner to carry out almost all of the protocols or procedures we have developed, hands-free.
In fact, the "optic anchoring" created simply by tracking the eyes across the visual field in roughly the same patterns that we do with arms and hands would be at least as effective, if not more so. Although we do use some eye tracking techniques in working with
accent reduction, in general EHIEP work, no explicit eye tracking is used, in part because of the inherent potency of eye tracking procedures and the absolute necessity of being formally trained in working with it. This technology is certainly worth "taking a look at" now. It is clearly integral to the future of virtual reality-based language instruction.