Thursday, July 21, 2011

V-Braille meets Kinect in Hapticland

Image credit: www.pmos.org.uk
To get an idea of how a haptic-based, computer-mediated, near-virtual-reality system for pronunciation teaching might work and look, imagine a Kinect interface combined with a V-Braille setup, in which you wear gloves with sensors covering the tips of your middle fingers (like a thimble) and quarter-sized sensors on both the palm and back of each hand. As explored in earlier posts, using the HICP/EHIEP framework, learning new pronunciation should be so easy and efficient that you could learn it "with your eyes closed"--and probably should!
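Purely as an illustration (the post does not specify any actual hardware or API), here is a minimal Python sketch of how the glove layout described above might be modeled in software: each hand carries a thimble-style fingertip sensor plus quarter-sized pads on the palm and back of the hand, and a hypothetical rule maps a hand-height gesture, as a Kinect-style skeleton tracker might report it, onto vibration intensities. All names, values, and the feedback rule are assumptions made for the sake of the example.

```python
# Hypothetical sketch only: models the glove layout described in the post and
# one simple rule for turning a tracked pitch gesture into haptic feedback.
# No real Kinect or V-Braille API is used; every name here is illustrative.
from dataclasses import dataclass, field
from enum import Enum


class SensorSite(Enum):
    MIDDLE_FINGERTIP = "middle_fingertip"   # thimble-style sensor
    PALM = "palm"                            # quarter-sized pad
    BACK_OF_HAND = "back_of_hand"            # quarter-sized pad


@dataclass
class HapticGlove:
    hand: str                                # "left" or "right"
    intensities: dict = field(default_factory=dict)

    def __post_init__(self):
        # Start with all sensors silent (0.0 = off, 1.0 = full vibration).
        self.intensities = {site: 0.0 for site in SensorSite}

    def apply_pitch_gesture(self, hand_height: float) -> None:
        """Map a normalized hand height (0.0 low .. 1.0 high) onto sensors.

        Assumed rule: a raised hand (rising intonation) buzzes the fingertip;
        a lowered hand (falling intonation) buzzes the palm.
        """
        height = max(0.0, min(1.0, hand_height))
        self.intensities[SensorSite.MIDDLE_FINGERTIP] = height
        self.intensities[SensorSite.PALM] = 1.0 - height
        self.intensities[SensorSite.BACK_OF_HAND] = 0.0


if __name__ == "__main__":
    glove = HapticGlove(hand="right")
    glove.apply_pitch_gesture(0.8)           # hand raised: rising pitch contour
    for site, level in glove.intensities.items():
        print(f"{site.value}: {level:.1f}")
```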

Why the Mikado image? I'm listening to it right now. What better model of integration (of fun and politics)?
