Presentation I gave for the paper:
Cassinelli Alvaro, Reynolds Carson and Ishikawa Masatoshi : Augmenting spatial awareness with Haptic Radar, Tenth International Symposium on Wearable Computers(ISWC) (Montreux, 2006.10.11-14).
For more: http://www.k2.t.u-tokyo.ac.jp/perception/HapticRadar/index-e.html
Haptic Radar at ISWC 2006
1. Augmenting spatial
awareness with the
haptic radar
Ishikawa Namiki Komuro Laboratory
Parallel Processing for Sensory information
Álvaro Cassinelli, Carson Reynolds &
Masatoshi Ishikawa
The University of Tokyo
ISWC 2006
Montreux, 2006.10.11-14
2. Concept & Motivation
• Antennae, hairs and cilia precede eyes in
evolutionary development
• Efficient for short-range spatial awareness
(unambiguous, computationally inexpensive)
• Robust (insensitive to illumination & background)
• Easily configurable (hairs at strategic locations)
and potentially omni-directional
+
• Today’s MOEMS technology enables mass
produced, tiny opto-mechanical devices...
Time to (re)grow antennae on people (and machines)?
3. An opto-mechanical hair?
• Hair shaft is a steerable beam of light (a laser-radar).
• Local, range-to-tactile stimulation
• Active scanning of the surroundings:
– Proprioception-based
– Automatic sweeping of the surroundings to extract
important features (inspired by animal whiskers’
motion, the two-point touch technique, etc.)
• Modular, but interconnected structure
(artificial skin)
4. … but also Human-Machine interface technology:
– “Hairy electronics”: versatile human-computer interface
– Sensitive spaces: human-aware “hairy” architecture
– Display: laser-based vectorial graphics, laser annotation on
surrounding (augmented reality, attentional cues)
Possible applications
Augmented spatial awareness & sensing
– Electronic Travel Aid for the visually impaired.
– Augmented spatial awareness for motorcycle drivers
and workers in hazardous environments.
– Collision avoidance (robotic limbs, vehicles, etc).
– Augmented sensing & tele-sensing (texture, speed).
Input…
… and
output!
5. (*) “The Smart Laser Scanner”, SIGCHI 2005
Laser-based module: feasibility*
• MEMS galvano-mirrors
• Smart Laser Tracking (*)
principle (can work as an
antenna sweeping mode)
Concept (“hairy
electronics”)…
… opto-mechanical
implementation
6. The haptic radar as a travel aid
A few fundamental questions:
– New sensorial modality: how easy is it to appropriate?
(Would it be like re-exercising an atrophied one?)
– Reflex reaction to the range-to-tactile translation?
– Is the brain capable of intuitively integrating data
from eyes on the back, the front, the sides…?
Prototype characteristics and limitations:
– Configurations studied: headband & single module.
– Limitation: non-mobile beam
– Three prototypes built: (a) one without range-finders
(simulated maze exploration), (b) another with range-
finders (but short range), and a (c) single autonomous
range/actuator module.
[Photos of prototypes (a), (b) and (c)]
7. (a) Haptic Radar Simulator
• Six actuators & LED indicators
• No range-sensors (controlled virtual space)
• Adjustable horizon of visibility
• Perception modalities:
• proximity
• open-space
Q: How do participants deal with 360°
of spatial awareness without
previous training?
Simulator features
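The two perception modalities can be sketched as a pair of inverse range-to-intensity mappings. This is a minimal illustration only: the function names, the 0–1 intensity scale and the exact formulas are assumptions, not the simulator's actual code.

```c
#define N_ACTUATORS 6

/* Sketch of the "proximity" mode: closer obstacles vibrate harder.
 * ranges[i] is the measured distance (cm) in direction i; horizon is
 * the adjustable horizon of visibility; intensity values are in [0,1]. */
static void proximity_mode(const double ranges[], double horizon,
                           double intensity[]) {
    for (int i = 0; i < N_ACTUATORS; i++)
        intensity[i] = ranges[i] >= horizon ? 0.0
                                            : 1.0 - ranges[i] / horizon;
}

/* Sketch of the "open-space" mode: the inverted mapping, where
 * vibration signals free directions instead of obstacles. */
static void open_space_mode(const double ranges[], double horizon,
                            double intensity[]) {
    for (int i = 0; i < N_ACTUATORS; i++)
        intensity[i] = ranges[i] >= horizon ? 1.0
                                            : ranges[i] / horizon;
}
```

The inversion makes concrete why open-space mode needs training: the same sensor reading produces the opposite tactile sensation.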
9. (a) Simulator discussion
• Sense of orientation rapidly lost => add tactile compass "beacon"
• Interactive horizon of visibility is a necessary feature
• “proximity feel” mode disturbing if many actuators vibrate at the
same time => compute center of gravity
• “open-space” perception mode interesting, but counterintuitive
(needs training!).
• continuous range-to-vibration function not easy to interpret =>
discretize levels (max. 3 or 4 levels).
• Too few actuators/sensors (annoying "jumping" effect)
• vibrators need to be calibrated to produce the same perceived effect
(motor characteristics differ, as does sensitivity at each site)
10. Prototype features:
• Six sensors & vibrators
• Non-steerable “hairs” (infrared sensors)
• Max range 80 cm (arm’s reach)
(b) Prototype with sensors
and actuators
Q: Can participants avoid an unseen
object approaching from behind?
12. Immediate problems & possible improvements
– Range detection too short (1 meter max) [⇒ next prototype will use
ultrasound sensors (up to 6 meters), then laser rangefinders]
– Simultaneous stimulus confusing [⇒ only one actuator active at any time,
perhaps in the opposite direction (showing direction of clear path)]
– Low spatial resolution of actuators [⇒ more vibrators / different actuators]
– Variable motor characteristics [⇒ individual calibration]
– Range-to-tactile linear function too simplistic [⇒ log scale / discrete]
– Effect of rotation is confusing in the simulator [⇒ head tracking]
– Sense of direction is rapidly lost when there is no “reference background”
[⇒use “interactive horizon” technique & add compass cue]
(b) Results discussion
13. • AVR microcontroller based
(reprogrammable)
• Discretized vibration levels
(four levels). Easy to
calibrate each module
separately.
• Li-Ion polymer battery
(approx. 4 hours autonomy)
• IR-based near-neighbor
communication channel
(enabling non-local haptic
feedback like spreading
haptic waves)
• Attachable with Velcro to
precise locations on the
body surface
(c) Haptic Radar Module: first steps
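As a rough illustration of how the near-neighbor IR channel could produce "spreading haptic waves", the following toy update rule passes an attenuated pulse around a ring of modules on each tick. This is entirely hypothetical — the slides do not specify a propagation rule.

```c
#define N_MODULES 6

/* Toy "spreading haptic wave": each tick a module keeps the larger of
 * its own level and an attenuated copy of its strongest ring neighbor,
 * then decays. level[] is the current state; next[] receives the new
 * state. The ring topology and decay rule are illustrative assumptions. */
static void spread_wave(const double level[], double next[], double decay) {
    for (int i = 0; i < N_MODULES; i++) {
        int left  = (i + N_MODULES - 1) % N_MODULES;
        int right = (i + 1) % N_MODULES;
        double in = level[left] > level[right] ? level[left] : level[right];
        double v  = level[i] > in * decay ? level[i] : in * decay;
        next[i] = v * decay;
    }
}
```

Starting from a single stimulated module, successive ticks send a weakening pulse around the body in both directions — non-local feedback from purely local (neighbor-to-neighbor) communication.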
15. Future Research Directions
• Ultrasound sensors (more range – up to 6 meters)
• MEMS based steerable laser beams (automatic sweeping)
• Evaluate other tactile actuators (skin stretch?) and tactile signals (e.g., tactons).
• More compact MEMS device modules for more density
• Grid network of interconnected modules
• More comprehensive experiments:
– Can participants navigate through crowds?
– Can participants predict if an object will hit them?
– In the long term, is there habituation to vibration stimulus?
• Other interesting research directions:
– Study haptic radar for rehabilitation of hemi-negligent patients.
– Use the optical hair to write/annotate objects in the surroundings.
Editor’s notes
Develop insect-like artificial antennae: antennae precede eyes in evolutionary development, and they provide a simple and efficient method for perceiving space. Antennae present unique advantages as sensors: direct and computationally inexpensive range perception, insensitive to illumination conditions. Refigure hair and antennae as a useful sensorial modality for people and machines. Conclusion: a wearable, augmenting sensing device, a double skin. For the people who were here yesterday and heard Professor Inami’s talk: this clearly goes in the direction of “X-men computing”.
Infrared or ultrasound rangefinder sensors can be used too, but they won’t be mobile.
Thanks to its reconfigurability, the proposed concept (a module formed by coupling a range detector and a tactile stimulator) can target very different areas.
I am going to show here a proof of principle of the modular range-to-tactile translation system. Simultaneous, “full-horizon” spatial awareness becomes possible: to my knowledge, this has never been explored before.

Can we learn to integrate information from “eyes on the back” (as well as on the sides)? Is it possible to form and visualize a coherent 3D model of the world (with some reasonable training), or will 360 degrees of awareness always impose a cognitive overload (i.e., we have to selectively pay attention to the “back”, as if we were looking there using the rear-view mirrors of a car)? My guess is that it is possible, because we already integrate this kind of information about objects behind us (for instance, through the skin or the hairs). That sense is very short range, but this is good news: the training is not targeted at the creation of an entirely new modality, but rather at the exercising of an existing (perhaps atrophied?) one.

No mobile beam: the user relies on body proprioception to give meaning to the tactile cues.
- Importance of pre-processing information in order to REDUCE THE AMOUNT OF INFORMATION. See the works by Leslie Kay (Sonicguide), Tony Heyes (Sonic Pathfinder) and Allan Dodds (http://www.seeingwithsound.com/sensub.htm): “Heyes' approach is rather different from Kay's in that the Sonic Pathfinder deliberately supplies only the minimum but most relevant information for travel needed by the user, whereas Kay strives for more information-rich sonar-based displays.” Both systems present information as sound signals…
- fMRI tests on this device to see how the brain learns to decode this information (using pneumatic actuators, as in “Evaluation of a pneumatically driven tactile stimulator device for vision substitution during fMRI studies”, Zappe AC, Maucher T, Meier K, Scheiber C, Magn Reson Med. 2004 Apr;51(4):828-34, http://www.hubmed.org/display.cgi?uids=15065257).