A comment I wrote on Ethan Hein's account of color-coding sequencer tracks:
When you say “color coding improved your ears”, the first thing that comes to mind is that you are doing some kind of machine-assisted synesthesia. Very cool, that.
Another enticing piece from Oblong Industries:
It's true that one of the elements is a blue ball that bounces in time to the music and touches down on the notes that are playing just then. Also true: a ball like that can never be entirely serious, but it can be entirely effective. What works is assisted synaesthesia, making sound seem like sight and looking seem like hearing. The time in which music happens is turned into the space of the animated score.
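That turning of time into space is easy to make concrete. Here is a minimal sketch of the mapping an animated score like that might use — the function name and scale factors are my own illustrations, not taken from Oblong's player:

```python
def note_to_screen(onset_beats: float, midi_pitch: int,
                   px_per_beat: float = 40.0,
                   px_per_semitone: float = 6.0) -> tuple[float, float]:
    """Map a note's 'when' and 'what' to a 'where' on an animated score.

    This is an illustrative sketch, not any real player's code: the
    parameter names and pixel scales are assumptions.
    """
    x = onset_beats * px_per_beat             # time flows left to right
    y = (127 - midi_pitch) * px_per_semitone  # higher pitches sit higher up
    return (x, y)

# A middle C (MIDI 60) one beat in lands to the right of, and below,
# the highest possible note at beat zero.
print(note_to_screen(1.0, 60))
```

The bouncing ball then just follows the x coordinate of "now" and touches down at each note's (x, y).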
And then pretty quickly we hit this video of John Underkoffler at TED on the future of the user interface. Very lovely in its own Minority Report style, but not the perspective I was after.
This has an interesting perspective from a synaesthete, but I won't quote it, given the site's copyright notice:
PLEASE DO NOT LIFT CONTENT FROM THESE PAGES WITHOUT FIRST OBTAINING PERMISSION. ONCE PERMISSION HAS BEEN OBTAINED, YOU ARE OBLIGED TO INCLUDE A LINK FROM THE CONTENT BACK TO THIS SITE. You are welcome to post content on this site, provided you include a link back to its hosting origin. Many thanks.
And, finally, aha, Wikipedia yields a reference: Plouznikoff, N., Plouznikoff, A. & Robert, J.-M. (2005), "Artificial Grapheme-Color Synesthesia for Wearable Task Support", Ninth IEEE International Symposium on Wearable Computers, pp. 108–113:
This paper presents the benefits of generating an artificial visual synesthesia through a wearable computer. Following a short introduction to remind the need for seamless human-wearable computer interactions, this paper makes the case for drawing upon synesthesia, a combination of the senses naturally occurring in a small portion of the population, to augment everyday entities and more precisely to enrich written graphemes. We present the rationale behind our research and summarize the functionality, architecture and implementation of our current prototype. Preliminary results suggest that this kind of artificial synesthesia improves short term memory recall and visual information search times.
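The core of that idea — every grapheme always rendered in the same color, the consistency natural synesthetes report — can be sketched in a few lines. The hash-based color assignment below is my assumption; the paper does not say how it picks colors:

```python
import colorsys
import hashlib

def grapheme_color(ch: str) -> tuple[int, int, int]:
    """Assign a stable RGB color to a grapheme, in the spirit of the
    paper's artificial grapheme-color synesthesia.

    Hashing is an illustrative choice: it guarantees the same character
    always gets the same color, which is what supports recall and
    visual search.
    """
    # Hash the grapheme down to a hue in [0, 1); fixed saturation and
    # value keep every color vivid and distinguishable.
    hue = (int(hashlib.md5(ch.encode()).hexdigest(), 16) % 360) / 360.0
    r, g, b = colorsys.hsv_to_rgb(hue, 0.8, 0.9)
    return (int(r * 255), int(g * 255), int(b * 255))

# 'a' is always the same color; 'a' and 'b' are (almost certainly) not.
print(grapheme_color("a"), grapheme_color("b"))
```

A wearable display would then overlay these colors on text in the wearer's view, which is exactly the bridge to augmented reality the follow-up paper makes.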
And the follow-up paper, crucially, makes the link to augmented reality:
This paper studies a novel approach advocating the virtual alteration of real-world interfaces through a form of augmented reality. Following an introduction reminding the need for easy to use and more consistent interfaces across our many day to day devices, this paper makes the case for using wearable computers to enhance the interactions between humans and conventional appliances. We present the rationale behind our research and summarize our current prototype's functionalities, architecture and implementation. Preliminary results suggest that virtually altering the interface of real world devices improves execution times for simple tasks using these devices.