Synthetic synesthesia
Synesthesia
is a condition in which the senses blend together: stimulation of
one sense involuntarily evokes an experience in another, so that a
sound may be seen, or a shape tasted.
Presently, computers are very limited in the sensory data they can provide.
Suppose I am eating a slice of pizza right now. I can say "click here to
hear the sound of my chewing" or "click here to see the pizza",
but how about "lick here to taste this pizza"?
Any smell coming
from a computer is probably a bad sign (e.g. a burning smell).
Artificially induced synesthesia would allow us to map taste
or smell onto another medium, such as vision or sound, which can be
transmitted over a computer system using current technology.
Seeing music
A more traditional
example of artificially induced synesthesia is the conversion
of sound to vision, which is traditionally done with a bank
of filters, the output of which drives a light (usually each light
is of a different color so that the sound can be "seen" without looking
directly at the apparatus). Such a collection of audio filters and light
sources is known as a color organ.
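The filter bank of a color organ can be approximated digitally: split
each audio frame's spectrum into a few bands and drive one light per
band. A minimal sketch in Python, assuming numpy is available; the band
edges, frame size, and normalization are arbitrary illustrative choices:

```python
import numpy as np

def color_organ(frame, sample_rate=8000,
                bands=((20, 300), (300, 2000), (2000, 4000))):
    """Map the energy in each frequency band of an audio frame to a
    light intensity in [0, 1] -- one light per band, e.g. red for
    bass, green for midrange, blue for treble."""
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    levels = []
    for lo, hi in bands:
        # Total spectral energy falling inside this band.
        levels.append(np.sum(spectrum[(freqs >= lo) & (freqs < hi)]))
    total = sum(levels) or 1.0
    return [e / total for e in levels]  # normalized brightness per light

# A pure 1 kHz tone lights mainly the midrange "light":
t = np.arange(1024) / 8000.0
levels = color_organ(np.sin(2 * np.pi * 1000 * t))
```

A hardware color organ would do this with analog band-pass filters;
the digital version simply replaces them with an FFT.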
A device as simple as a light that flashes when someone
rings the doorbell
may be used to enable a deaf person to "hear" the bell.
Hearing video
Consider the reverse, a device that would allow a person
to "hear shapes". Such a device would consist of one or more
cameras sending live video to a computer, with the computer
sending audio back to the user.
It is easy to imagine how a blind person could be guided by
someone at a remote viewing station talking into a microphone.
This system requires a full-duplex radio communications link,
so that video can be transmitted while audio is received at
the same time.
Now, instead of having a person at a remote site, imagine having
a computer either at a remote site, or worn locally by the user.
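How might the computer "auralize" the video? One simple scheme (the
idea behind sensory-substitution systems such as The vOICe) scans each
frame left to right, assigning each image row a pitch, top rows high
and bottom rows low, with pixel brightness setting loudness. A minimal
sketch in Python, assuming numpy; the frequency range, scan duration,
and image size are illustrative assumptions:

```python
import numpy as np

def sonify_image(image, duration=1.0, sample_rate=8000,
                 f_low=200.0, f_high=3000.0):
    """Turn a grayscale image (2-D array, values in [0, 1]) into
    audio: the image is scanned left to right over `duration`
    seconds; each row gets a pitch (top = high, bottom = low), and
    bright pixels in the current column add that pitch to the sound."""
    rows, cols = image.shape
    freqs = np.linspace(f_high, f_low, rows)  # one pitch per row
    samples_per_col = int(duration * sample_rate / cols)
    t = np.arange(samples_per_col) / sample_rate
    audio = []
    for c in range(cols):
        col_audio = np.zeros(samples_per_col)
        for r in range(rows):
            if image[r, c] > 0:
                col_audio += image[r, c] * np.sin(2 * np.pi * freqs[r] * t)
        audio.append(col_audio)
    return np.concatenate(audio)

# A single bright dot sounds as a short pure tone: its pitch tells
# you the dot's height, and its timing tells you how far left or
# right it is.
img = np.zeros((8, 8))
img[0, 0] = 1.0   # top-left corner: high pitch, early in the scan
sound = sonify_image(img)
```

With practice, a listener can learn to decode such sweeps back into
rough shapes, which is exactly the "hearing shapes" idea above.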
Synthetic synesthesia of 6th and 7th senses
Imagine if we had extrasensory perception.
Let me invent a 6th or 7th sense, say, radar.
We cannot perceive radar directly, but we can wear an instrument
that does, and we can map the output of this instrument to another
sense.
I found that Doppler auralization
allowed me to walk down a corridor in total darkness,
so it appears that radar can provide enough information to
give us some simple navigational ability.
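Doppler auralization works because, for a microwave carrier, the
Doppler shift produced by walking-pace motion falls right in the
audible range: f_d = 2 v f0 / c. A minimal sketch of the arithmetic;
the 10.35 GHz carrier is my assumption (a common X-band motion-sensor
frequency), not necessarily that of the actual apparatus:

```python
def doppler_shift(radial_velocity_mps, carrier_hz=10.35e9, c=3.0e8):
    """Doppler shift in Hz of a radar return from a target closing
    at the given radial speed: f_d = 2 * v * f0 / c.  The factor of
    2 arises because the wave is shifted on the way out and again
    on the way back."""
    return 2.0 * radial_velocity_mps * carrier_hz / c

# Walking pace (~1.5 m/s) against a 10.35 GHz radar gives a shift of
# roughly 100 Hz -- comfortably audible, so the mixed-down radar
# output can simply be fed to a speaker or headphones.
walking_shift = doppler_shift(1.5)
```

Faster approach means higher pitch, so oncoming objects are heard as
rising tones without any further processing.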
However, in presenting my findings to the
Canadian National Institute for the Blind, it became apparent
that blind people rely heavily on the sense of hearing, and that
any device that uses headphones or even produces sound is
unacceptable. Thus the next phase of the project was to develop
vibrotactile radar systems, so that I could feel objects at a distance
pressing against my body. As the objects get closer they
press "harder", resulting in a Reality Metaphor User Interface
(RMUI).
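The "closer presses harder" mapping can be sketched as a simple
monotone function from radar range to vibration intensity. The linear
ramp and the 5 m cutoff below are hypothetical illustrative choices,
not the actual mapping used in the apparatus:

```python
def vibration_level(distance_m, max_range_m=5.0):
    """Map a radar range reading to a tactile intensity in [0, 1]:
    objects at or beyond max_range_m are not felt at all, and
    nearer objects 'press harder' against the body."""
    if distance_m >= max_range_m:
        return 0.0
    return 1.0 - distance_m / max_range_m
```

Any monotone decreasing curve would preserve the reality metaphor; a
steeper-than-linear ramp near zero would make imminent collisions more
salient.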
Radar gives us different information about the world than
our regular five senses, so its use may extend beyond the
visually impaired. It may be used to augment the existing senses,
and provide us with a better understanding of the world around us.
In particular, I find that while riding my bicycle, I am able
to attain a better awareness of the world around me by using radar
as an extra sense. I am aware of traffic that is coming up behind
me, even if it is out of sight (either off in the distance, or
out of the field of view of
my electronic mirror).