One of the basic assumptions of research in haptics has been that, in most settings, visual stimuli override auditory, kinaesthetic, and tactile ones. For our work, the import of studies reported in earlier posts had been that haptic procedures (such as hand movements across the visual field to anchor intonation) were "seen" to be highly susceptible to interference from visual distraction in the immediate environment. What the new research reveals is that, in a very real sense, visual and haptic information are processed in the same channel, or the same areas of the brain. Which modality dominates at any point in time now appears to have as much to do with the quality of the stimuli themselves as with any intrinsic difference in the potency of the modality type. The point is this: from a theoretical perspective, it means that haptic work done right can be enormously powerful in anchoring sound, much more so than I had thought possible earlier. In retrospect, I have consistently seen evidence of the strength of haptic anchoring, regardless of the scene or potential distractions in the classroom, but I was hesitant to interpret that as evidence of a more balanced relationship between visual and haptic processing. A welcome touch of uncommon (unconventional) sense . . .