Haptic-integrated Clinical Pronunciation Research and Teaching
Friday, June 8, 2012
A news release describes a robot designed to teach English in Taiwan which " . . . has a "large doll head" and arms and a body that can make movements based on the dialogues being taught in an English class." Furthermore, " . . . the robot allows young students to learn to speak English in a "stressless" environment and in a fun way." Wow. On the one hand, "robotic," mechanical pronunciation teaching does not sound all that "fun," but from the EHIEP perspective, being able to provide learners with clear, consistent models of pedagogical movement patterns (see previous post) and accurate acoustic models is appealing. I have written earlier about the potential use of virtual reality avatars in our work. Our "EHIEP-bot" logo could use a little spiffing up, of course, but it does embody the spirit of what Professor Wu's baby is about--a balance of cognitive and affective anchoring, with 3-second moments of concentrated focus and attention on sound processes. At times, assuming the EHIEP-bot persona of "goofy, haptic precision"--whether presenting, practicing, or providing corrective feedback--can be extraordinarily effective. Sport your "hapticobot" this week; support your local "Haptician!"