4. The Incredible Disappearing Computer
• 1960–70s: Room
• 1970–80s: Desk
• 1980–90s: Lap
• 1990–2000s: Hand
• 2010–: Head
5. Graphical User Interfaces
• Separation between real and digital worlds
• WIMP (Windows, Icons, Menus, Pointer) metaphor
6. Rekimoto, J. and Nagao, K. 1995. The world through the computer: computer augmented interaction with real world environments.
Making Interfaces Invisible
7. Internet of Things (IoT)
• Embed computing and sensing in the real world
• Smart objects, sensors, etc.
8. Virtual Reality (VR)
• Users immersed in Computer Generated environment
• HMD, gloves, 3D graphics, body tracking
9. Augmented Reality (AR)
• Virtual Images blended with the real world
• See-through HMD, handheld display, viewpoint tracking, etc.
10. Milgram’s Mixed Reality (MR) Continuum
• Continuum from the Real World, through Augmented Reality and Mixed Reality, to the fully Virtual World (Virtual Reality)
"...anywhere between the extrema of the virtuality continuum."
P. Milgram and F. Kishino (1994), A Taxonomy of Mixed Reality Visual Displays
15. Creating the Illusion of Reality
• Fooling human perception by using
technology to generate artificial sensations
• Computer-generated sights, sounds, smells, etc.
16. Reality vs. Virtual Reality
• In a VR system there are input and output devices
between human perception and action
17. Example: Birdly - http://www.somniacs.co/
• Create illusion of flying like a bird
• Multisensory VR experience
• Visual, audio, wind, haptic
20. A Human Information Processing Model
• High-level staged model from Wickens and Carswell (1997)
• Relates perception, cognition, and physical ergonomics
• Stages: Perception → Cognition → Ergonomics
21. 1. Design for Perception
• Need to understand perception to design AR/VR systems
• Visual perception
• Many types of visual cues (stereo, oculomotor, etc.)
• Auditory system
• Binaural cues, vestibular cues
• Somatosensory
• Haptic, tactile, kinesthetic, proprioceptive cues
• Chemical Sensing System
• Taste and smell
26. 2. Design for Cognition
• Design for Working and Long-term memory
• Working memory
• Short-term storage with limited capacity (~5–9 items)
• Long term memory
• Memory recall triggered by associative cues
• Situational Awareness
• Model of current state of user’s environment
• Used for wayfinding, object interaction, spatial awareness, etc.
• Provide cognitive cues to help with situational awareness
• Landmarks, procedural cues, map knowledge
• Support both ego-centric and exo-centric views
27. Design for Micro Interactions in AR
▪ Design interaction for less than a few seconds
• Tiny bursts of interaction
• One task per interaction
• One input per interaction
▪ Benefits
• Use limited input
• Minimize interruptions
• Reduce attention fragmentation
28. Make it Glanceable
• Seek to rigorously reduce information density. Successful designs afford
recognition, not reading.
29. Reduce Information Chunks
You are designing for recognition, not reading. Reducing the total number of information
chunks will greatly increase the glanceability of your design.
Eye-movement comparison (~230 ms per fixation):
• 3-chunk design: chunk 1: 1–2 fixations (460 ms); chunks 2–3: 1 fixation each (230 ms) → ~920 ms total
• 5 (6)-chunk design: chunks 1–3: 1 fixation each (230 ms); chunk 4: 3 fixations (690 ms); chunk 5: 2 fixations (460 ms) → ~1,840 ms total
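The fixation arithmetic behind these totals can be sketched as a small estimate. The ~230 ms per fixation figure is taken from the timings above; treating it as a constant per-fixation cost is a simplifying assumption:

```python
# Rough glanceability estimate: total scan time as a function of the
# fixations each information chunk requires, assuming ~230 ms per
# eye fixation (the per-fixation figure used in the slide's timings).

FIXATION_MS = 230  # assumed average time per fixation

def scan_time_ms(fixations_per_chunk):
    """Estimated total scan time for a layout, given the number of
    fixations each information chunk requires."""
    return FIXATION_MS * sum(fixations_per_chunk)

# Three-chunk design: first chunk needs ~2 fixations, the rest 1 each.
print(scan_time_ms([2, 1, 1]))        # 920 ms
# Five-chunk design: denser chunks need several fixations each.
print(scan_time_ms([1, 1, 1, 3, 2]))  # 1840 ms
```

Fewer chunks means fewer fixations, which is why the 3-chunk layout halves the scan time.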
30. Navigation
• How we move from place to place within an environment
• The combination of travel with wayfinding
• Wayfinding: cognitive component of navigation
• Travel: motor component of navigation
31. Wayfinding – Making Cognitive Maps
• Goal of Wayfinding is to build Mental Model (Cognitive Map)
• Types of spatial knowledge in a mental model
• landmark knowledge, procedural knowledge, map-like knowledge
• Creating a mental model
• study a map, explore the space, explore a copy of the space
• Problem: Sometimes perceptual judgments are incorrect within VR
33. Support Wayfinding in VR
• Provide Landmarks
• An obvious, distinct and non-mobile object
• Seen from several locations (e.g. tall)
• Audio beacons can also serve as landmarks
• Use Maps
• Copy real world maps
• Ego-centric vs. exo-centric map cues
• World in Miniature
• Map based navigation
34. Situation Awareness: Ego-centric and Exo-centric views
• Combining ego-centric and exo-centric cues for better situational awareness
36. 3. Design for Ergonomics
• Design for the human motion range
• Consider human comfort and natural posture
• Design for physical input
• Coarse and fine scale motions, gripping and grasping
• Avoid “Gorilla arm syndrome” from holding arm pose
37. Gorilla Arm in AR/VR
• Design interface to reduce mid-air gestures
39. XRgonomics
• Uses physiological model to calculate ergonomic interaction cost
• Difficulty of reaching points around the user
• Customizable for different users
• Programmable API, Hololens demonstrator
• GitHub Repository
• https://github.com/joaobelo92/xrgonomics
Evangelista Belo, J. M., Feit, A. M., Feuchtner, T., & Grønbæk, K. (2021, May). XRgonomics: Facilitating the Creation of
Ergonomic 3D Interfaces. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-11).
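The idea of an ergonomic interaction cost can be sketched as a toy heuristic. XRgonomics itself uses a physiological arm model; the resting-shoulder position, reach limit, and elevation penalty below are illustrative assumptions, not the tool's actual model:

```python
import math

# Toy ergonomic cost for points around the user, in the spirit of
# XRgonomics: penalise distance from the shoulder and raised-arm
# postures (gorilla arm). All constants are illustrative assumptions.

SHOULDER = (0.2, 1.4, 0.0)   # assumed shoulder position (x, y, z), metres
ARM_LENGTH = 0.7             # assumed maximum comfortable reach, metres

def reach_cost(point):
    """Higher cost = harder to reach; infinite if out of arm's reach."""
    dx, dy, dz = (p - s for p, s in zip(point, SHOULDER))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dist > ARM_LENGTH:
        return float("inf")           # beyond comfortable reach
    elevation_penalty = max(0.0, dy)  # holding the arm above the shoulder tires it
    return dist / ARM_LENGTH + elevation_penalty

# A target near the waist costs less than one at head height.
print(reach_cost((0.3, 1.1, 0.3)) < reach_cost((0.3, 1.7, 0.3)))  # True
```

A real system would replace this heuristic with a biomechanical model and make the parameters user-specific, as the paper describes.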
42. New Tools for Human Factors
• New types of sensors
• EEG, ECG, GSR, etc.
• Sensors integrated in HMD
• Integrated into HMD faceplate, straps
• Data processing and capture tools
• iMotions, etc.
• AR/VR analytics tools
• Cognitive3D, etc.
43. Project Galea: Multiple Physiological Sensors in HMD
• Incorporate range of sensors in HMD faceplate and over head
• EMG – muscle movement
• EOG – Eye movement
• EEG – Brain activity
• EDA – Skin conductance
• PPG – Heart rate
45. Cognitive3D
• Data capture and analytics for VR
• Multiple sensory inputs (eye tracking, HR, EEG, body movement, etc.)
47. Example: Adaptive VR based on Workload
• VR training systems adapt in
real-time based on cognitive load
• Goal to induce the best level of
performance gain
Dey, A., Chatburn, A., & Billinghurst, M. (2019, March). Exploration of an EEG-
based cognitively adaptive training system in virtual reality. In 2019 IEEE
Conference on Virtual Reality and 3d User Interfaces (VR) (pp. 220-226). IEEE.
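The adaptation loop can be sketched as a simple controller: nudge difficulty up when the user is under-loaded and down when overloaded. The normalised workload index and the threshold band are assumptions for illustration, not the values used in the cited study:

```python
# Illustrative sketch of a cognitively adaptive VR loop. `workload` is
# a hypothetical normalised cognitive-load index in [0, 1] (e.g. derived
# from EEG band power); LOW/HIGH are assumed thresholds.

LOW, HIGH = 0.3, 0.7  # assumed target workload band

def adapt_difficulty(level, workload, min_level=0, max_level=20):
    """Raise difficulty when under-loaded, lower it when overloaded,
    otherwise hold it; clamp to the task's 0-20 level range."""
    if workload < LOW:
        level += 1
    elif workload > HIGH:
        level -= 1
    return max(min_level, min(max_level, level))

level = 10
for w in (0.2, 0.2, 0.5, 0.9):  # simulated workload readings
    level = adapt_difficulty(level, w)
print(level)  # 10 -> 11 -> 12 -> 12 -> 11
```

Keeping the user inside a target workload band is one way to "induce the best level of performance gain" that the slide describes.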
49. Experimental Task
● Search task
○ Search multiple times in 5 minutes
● Target selection increasingly difficult
○ number of objects, different colors
○ shapes, and movement
Increasing levels (0 - 20)
53. Results – Response Time
● Response time (sec.) across increasing levels of difficulty
● No difference between the easiest and hardest levels
54. Results – Time Frequency Representation
● Task Load
○ Significant alpha synchronisation in the hardest difficulty levels
of the task when compared to the easiest difficulty levels
55. Lessons Learned
● Similar reaction time but increased brain activity showing
increased cognitive effort at higher levels to sustain performance
● Adaptive VR training can increase the user’s cognitive load
without affecting task performance
● First demo of the use of real-time EEG signals to adapt the
complexity of the training stimuli in VR
56. Conclusions
• AR/VR makes the computer invisible
• Altering human perception
• Using the Human Information Processing Model for design
• Consider Perception, Cognition, and Ergonomic elements
• New tools becoming available
• Physiological sensors, sensor-enhanced HMDs
• Data collection, analytics software
• Directions for research
• New models, application/validation studies, novel interfaces