LECTURE 3: VR SYSTEMS
COMP 4010 – Virtual Reality
Semester 5 - 2018
Bruce Thomas, Mark Billinghurst, Gun Lee
University of South Australia
August 7th 2018
TRACKING
Tracking in Virtual Reality
• Registration
• Positioning virtual objects w.r.t. real/virtual objects
• Tracking
• Continually locating the user's viewpoint/input
• Position (x,y,z), Orientation (r,p,y)
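Concretely, a tracker sample is just a 6-DOF pose. A minimal sketch in Python (type and field names are illustrative, not from any particular SDK):

```python
from dataclasses import dataclass

@dataclass
class Pose6DOF:
    x: float; y: float; z: float           # position in metres
    roll: float; pitch: float; yaw: float  # orientation in degrees

# A standing user, head 1.7 m up, looking 45 deg to the right:
head = Pose6DOF(0.0, 1.7, 0.0, 0.0, 0.0, 45.0)
print(head)
```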
Mechanical Tracker
• Idea: mechanical arms with joint sensors
• ++: high accuracy, haptic feedback
• --: cumbersome, expensive
Microscribe Sutherland
Magnetic Tracker
• Idea: measure the magnetic field between a transmitter and a receiver
• ++: 6DOF, robust
• --: wired, sensitive to metal, noisy, expensive
• --: error increases with distance
Flock of Birds (Ascension)
Example: Razer Hydra
• Developed by Sixense
• Magnetic source + 2 wired controllers
• Short range (1-2 m)
• Precision of 1 mm and 1°
• $600 USD
Razer Hydra Demo
• https://www.youtube.com/watch?v=jnqFdSa5p7w
Inertial Tracker
• Idea: measure linear acceleration and angular rates
(accelerometer/gyroscope)
• ++: no transmitter, cheap, small, high frequency, wireless
• --: drift, hysteresis, only 3DOF
IS300 (Intersense)
Wii Remote
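The drift problem is easy to demonstrate: integrating a rate signal turns even a tiny constant sensor bias into unbounded orientation error. A toy dead-reckoning sketch, with all sensor numbers invented:

```python
import random

dt = 0.001      # 1 kHz IMU sample period (s)
bias = 0.01     # constant gyro bias (deg/s), plausible for a cheap MEMS part
yaw = 0.0       # estimated yaw (deg); the head is actually stationary
for _ in range(60_000):                       # 60 s of samples
    measured = 0.0 + bias + random.gauss(0, 0.05)
    yaw += measured * dt                      # dead-reckoning integration
print(f"drift after 60 s: {yaw:.2f} deg")     # ~0.6 deg from bias alone
```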
Optical Tracker
• Idea: Image Processing and Computer Vision
• Specialized
• Infrared, Retro-Reflective, Stereoscopic
• Monocular-Based Vision Tracking
ART Hi-Ball
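For the stereoscopic case, depth comes from triangulation: a marker seen by two cameras a known baseline apart shifts by a disparity inversely proportional to its distance. A toy pinhole-camera sketch with invented numbers:

```python
def depth_from_disparity(baseline_m: float, focal_px: float,
                         disparity_px: float) -> float:
    """Classic stereo relation: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

# Cameras 20 cm apart, 800 px focal length, marker disparity of 32 px:
print(depth_from_disparity(0.20, 800.0, 32.0))  # 5.0 m
```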
Outside-In vs. Inside-Out Tracking
Example: Vive Lighthouse Tracking
• Outside-in tracking system
• 2 base stations
• Each with 2 laser scanners, LED array
• Headworn/handheld sensors
• 37 photo-sensors in HMD, 17 in hand
• Additional IMU sensors (500 Hz)
• Performance
• Tracking server fuses sensor samples
• Sampling rate 250 Hz, 4 ms latency
• See http://doc-ok.org/?p=1478
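The geometry behind Lighthouse is simple: each laser plane sweeps at a known angular rate, so the time between the sync flash and a photodiode hit encodes that sensor's angle from the base station. A sketch assuming a 60 Hz sweep as described in the doc-ok.org analysis, not the exact HTC firmware:

```python
SWEEP_HZ = 60.0  # assumed: one full laser rotation per 1/60 s

def hit_time_to_angle(t_since_sync: float) -> float:
    """Convert seconds elapsed since the sync flash to sweep angle (deg)."""
    return 360.0 * SWEEP_HZ * t_since_sync

# A photodiode lit 4.17 ms after the sync flash sits ~90 deg into the sweep;
# two such angles (horizontal + vertical sweeps) fix a ray to the sensor.
print(hit_time_to_angle(0.00417))  # ~90.1
```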
Lighthouse Components
Base station
- IR LED array
- 2 x scanned lasers
Head Mounted Display
- 37 photo sensors
- 9 axis IMU
Lighthouse Setup
Lighthouse Tracking
Base station scanning
https://www.youtube.com/watch?v=avBt_P0wg_Y
https://www.youtube.com/watch?v=oqPaaMR4kY4
Room tracking
Example: Windows Mixed Reality
• Inside-out tracking
• Two cameras on HMD
• Low-res, black and white
• Combined with IMU data
• Uses environment mapping
• Tracking from visual features
https://www.youtube.com/watch?v=67yLuiSfMWM
Occipital Bridge Engine/Structure Core
• Inside out tracking
• Uses structured light
• Better than room-scale tracking
• Integrated into the Bridge HMD
• https://structure.io/
https://www.youtube.com/watch?v=qbkwew3bfWU
VR INPUT DEVICES
VR Input Devices
• Physical devices that convey information into the application
and support interaction in the Virtual Environment
Mapping Between Input and Output
[Diagram: user input devices mapped to system output devices]
Motivation
• Mouse and keyboard are good for desktop UI tasks
• Text entry, selection, drag and drop, scrolling, rubber banding, …
• 2D mouse for 2D windows
• What devices are best for 3D input in VR?
• Use multiple 2D input devices?
• Use new types of devices?
Input Device Characteristics
• Size and shape, encumbrance
• Degrees of Freedom
• Integrated (mouse) vs. separable (Etch-A-Sketch)
• Direct vs. indirect manipulation
• Relative vs. Absolute input
• Relative: measure difference between current and last input (mouse)
• Absolute: measure input relative to a constant point of reference (tablet)
• Rate control vs. position control
• Isometric vs. Isotonic
• Isometric: measure pressure or force with no actual movement
• Isotonic: measure displacement as the device moves freely (e.g. mouse)
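The relative vs. absolute distinction above is the one that matters most in code: a relative device must integrate state, an absolute device is stateless. A minimal 1-D sketch:

```python
class RelativeDevice:
    """e.g. a mouse: reports deltas, so cursor state must be kept."""
    def __init__(self):
        self.cursor = 0.0
    def on_motion(self, delta: float) -> float:
        self.cursor += delta          # integrate differences over time
        return self.cursor

class AbsoluteDevice:
    """e.g. a tablet: reports positions against a fixed reference."""
    def on_motion(self, position: float) -> float:
        return position               # no history needed

rel, tab = RelativeDevice(), AbsoluteDevice()
print(rel.on_motion(+5), rel.on_motion(-2))    # 5.0 3.0 -- history matters
print(tab.on_motion(7.0), tab.on_motion(7.0))  # 7.0 7.0 -- stateless
```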
Hand Input Devices
• Devices that integrate hand input into VR
• World-Grounded input devices
• Devices fixed in real world (e.g. joystick)
• Non-Tracked handheld controllers
• Devices held in hand, but not tracked in 3D (e.g. Xbox controller)
• Tracked handheld controllers
• Physical device with 6 DOF tracking inside (e.g. Vive controllers)
• Hand-Worn Devices
• Gloves, EMG bands, rings, or devices worn on hand/arm
• Bare Hand Input
• Using technology to recognize natural hand input
World Grounded Devices
• Devices constrained or fixed in real world
• Not ideal for VR
• Constrains user motion
• Good for VR vehicle metaphor
• Used in location-based entertainment (e.g. Disney Aladdin ride)
Disney Aladdin Magic Carpet VR Ride
Non-Tracked Handheld Controllers
• Devices held in hand
• Buttons, joysticks, game controllers, etc.
• Traditional video game controllers
• Xbox controller
Tracked Handheld Controllers
• Handheld controller with 6 DOF tracking
• Combines button/joystick input plus tracking
• One of the best options for VR applications
• Physical prop enhancing VR presence
• Providing proprioceptive, passive haptic touch cues
• Direct mapping to real hand motion
HTC Vive Controllers
Oculus Touch Controllers
Example: Sixense STEM
• Wireless motion tracking + button input
• Electromagnetic tracking, 8 foot range, 5 tracked receivers
• http://sixense.com/wireless
Sixense Demo Video
• https://www.youtube.com/watch?v=2lY3XI0zDWw
Example: WMR Handheld Controllers
• Windows Mixed Reality Controllers
• Left and right hand
• Combine computer vision + IMU tracking
• Track both in and out of view
• Button input, Vibration feedback
https://www.youtube.com/watch?v=rkDpRllbLII
Cubic Mouse
• Plastic box
• Polhemus Fastrak inside (magnetic 6 DOF tracking)
• 3 translating rods, 6 buttons
• Two handed interface
• Supports object rotation, zooming, cutting plane, etc.
Fröhlich, B., & Plate, J. (2000). The cubic mouse: A new device for three-dimensional input. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 526-531). ACM.
Cubic Mouse Video
• https://www.youtube.com/watch?v=1WuH7ezv_Gs
Hand Worn Devices
• Devices worn on hands/arms
• Glove, EMG sensors, rings, etc.
• Advantages
• Natural input with potentially rich gesture interaction
• Hands can be held in comfortable positions – no line of sight issues
• Hands and fingers can fully interact with real objects
Myo Arm Band
• https://www.youtube.com/watch?v=1f_bAXHckUY
Data Gloves
• Bend sensing gloves
• Passive input device
• Detecting hand posture and gestures
• Continuous raw data from bend sensors
• Fiber optic, resistive ink, strain-gauge
• Large DOF output, natural hand output
• Pinch gloves
• Conductive material at fingertips
• Determine if fingertips touching
• Used for discrete input
• Object selection, mode switching, etc.
How Pinch Gloves Work
• Contact between conductive fabric completes a circuit
• Each finger receives voltage in turn (T3 – T7)
• Look for output voltage at different times
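A sketch of that time-multiplexed scan (the contact set stands in for the real circuitry, and the pad names are ours):

```python
FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

def scan(contacts: set) -> list:
    """contacts: set of frozensets naming finger pads that are touching."""
    detected = []
    for driven in FINGERS:            # drive one pad at a time (T3 - T7)
        for sensed in FINGERS:        # check every other pad for voltage
            if sensed != driven and frozenset({driven, sensed}) in contacts:
                detected.append((driven, sensed))   # circuit completed
    return detected

# A thumb-index pinch shows up whichever of the two pads is driven:
print(scan({frozenset({"thumb", "index"})}))
```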
Example: Cyberglove
• Invented to support sign language
• Technology
• Thin electrical strain gauges over fingers
• Bending a sensor changes its resistance
• 18-22 sensors per glove, 120 Hz sampling
• Sensor resolution 0.5°
• Very expensive
• >$10,000/glove
• http://www.cyberglovesystems.com
How CyberGlove Works
• Strain gauge at joints
• Connected to A/D converter
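Turning a raw A/D reading into a joint angle requires per-user calibration; a minimal two-pose linear fit, with invented ADC values:

```python
def adc_to_angle(raw: int, raw_flat: int, raw_bent: int,
                 angle_bent: float = 90.0) -> float:
    """Map a raw strain-gauge ADC reading to degrees, given readings
    captured with the joint flat (0 deg) and fully bent (angle_bent)."""
    gain = angle_bent / (raw_bent - raw_flat)   # deg per ADC count
    return (raw - raw_flat) * gain

# Calibrate with the hand flat (raw 610) and a fist (raw 2890), then read:
print(round(adc_to_angle(1750, 610, 2890), 1))  # 45.0 deg
```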
Demo Video
• https://www.youtube.com/watch?v=IUNx4FgQmas
StretchSense
• Wearable motion capture sensors
• Capacitive sensors
• Measure stretch, pressure, bend, shear
• Many applications
• Garments, gloves, etc.
• http://stretchsense.com/
StretchSense Glove Demo
• https://www.youtube.com/watch?v=wYsZS0p5uu8
Comparison of Glove Performance
From Burdea, Virtual Reality Technology, 2003
Bare Hands
• Using computer vision to track bare hand input
• Creates compelling sense of Presence, natural interaction
• Challenges need to be solved
• Not having sense of touch
• Line of sight required to sensor
• Fatigue from holding hands in front of sensor
Leap Motion
• IR based sensor for hand tracking ($50 USD)
• HMD + Leap Motion = Hand input in VR
• Technology
• 3 IR LEDs and 2 wide-angle cameras
• The LEDs generate patternless IR light
• IR reflections picked up by cameras
• Software performs hand tracking
• Performance
• 1m range, 0.7 mm accuracy, 200Hz
• https://www.leapmotion.com/
Example: Leap Motion
• https://www.youtube.com/watch?v=QD4qQBL0X80
Non-Hand Input Devices
• Capturing input from other parts of the body
• Head Tracking
• Use head motion for input
• Eye Tracking
• Largely unexplored for VR
• Microphones
• Audio input, speech
• Full-Body tracking
• Motion capture, body movement
Eye Tracking
• Technology
• Shine IR light into eye and look for reflections
• Advantages
• Provides natural hands-free input
• Gaze provides cues as to user attention
• Can be combined with other input technologies
Example: FOVE VR Headset
• Eye tracker integrated into VR HMD
• Gaze driven user interface, foveated rendering
• https://www.youtube.com/watch?v=8dwdzPaqsDY
Pupil Labs VIVE/Oculus Add-ons
• Adds eye-tracking to HTC Vive/Oculus Rift HMDs
• Mono or stereo eye-tracking
• 120 Hz eye tracking, gaze accuracy of 0.6° with precision of 0.08°
• Open source software for eye-tracking
• https://pupil-labs.com/pupil/
Full Body Tracking
• Adding full-body input into VR
• Creates illusion of self-embodiment
• Significantly enhances sense of Presence
• Technologies
• Motion capture suit, camera based systems
• Can track large number of significant feature points
Camera Based Motion Capture
• Use multiple cameras
• Reflective markers on body
• E.g. OptiTrack (www.optitrack.com)
• 120 – 360 fps, < 10ms latency, < 1mm accuracy
Optitrack Demo
• https://www.youtube.com/watch?v=tBAvjU0ScuI
Wearable Motion Capture: PrioVR
• Wearable motion capture system
• 8 – 17 inertial sensors + wireless data transmission
• 30 – 40 m range, 7.5 ms latency, 0.09° precision
• Supports full range of motion, no occlusion
• www.priovr.com
PrioVR Demo
• https://www.youtube.com/watch?v=q72iErtvhNc
Pedestrian Devices
• Pedestrian input in VR
• Walking/running in VR
• Virtuix Omni
• Special shoes
• http://www.virtuix.com
• Cyberith Virtualizer
• Socks + slippery surface
• http://cyberith.com
Cyberith Virtualizer Demo
• https://www.youtube.com/watch?v=R8lmf3OFrms
Virtusphere
• Fully immersive sphere
• Support walking, running in VR
• Person inside trackball
• http://www.virtusphere.com
Virtusphere Demo
• https://www.youtube.com/watch?v=5PSFCnrk0GI
Omnidirectional Treadmills
• Infinadeck
• 2-axis treadmill, flexible material
• Tracks user to keep them in centre
• Limitless walking input in VR
• www.infinadeck.com
Infinadeck Demo
• https://www.youtube.com/watch?v=seML5CQBzP8
Comparison Between Devices
From Jerald (2015)
Comparing hand and non-hand input
Input Device Taxonomies
• Helps to determine:
• Which devices can be substituted for each other
• What devices to use for particular tasks
• Many different approaches
• Separate the input device from interaction technique (Foley 1974)
• Mapping basic interactive tasks to devices (Foley 1984)
• Basic tasks – select, position, orient, etc.
• Devices – mouse, joystick, touch panel, etc.
• Consider Degrees of Freedom and properties sensed (Buxton 1983)
• motion, position, pressure
• Distinguish between absolute/relative input, individual axes (Mackinlay 1990)
• separate translation, rotation axes instead of using DOF
Foley and Wallace Taxonomy (1974)
Separates the device from the interaction technique
Buxton Input Device Taxonomy (Buxton 1983)
• Classified according to degrees of freedom and property sensed
• M = device uses an intermediary between hand and sensing system
• T = touch sensitive
VR SYSTEMS
Basic VR System
• High-level overview
• User engaged in task
• User provides input
• VR engine provides output
• VR engine connected to external databases
Simple System Architecture
[Block diagram: a Head Tracker sends head position/orientation to the Host Processor, which holds the database model of the Virtual World and connects to the network; a Rendering Engine draws into a Frame Buffer, and the Display Driver feeds a non-see-through image source and optics]
From Content to User
[Pipeline diagram: Content (3D models, textures; CAD data via translation) is authored with a modelling program; Software (application programming, dynamics generator, 3D and sound renderers) drives Output Devices (HMD, audio, haptics), while Input Devices (gloves, microphone, trackers) capture User Actions such as speaking and grabbing (User I/O)]
Case Study: Multimodal VR System
• US Army project
• Simulate control of an unmanned vehicle
• Sensors (input)
• Head/hand tracking
• Gesture, Speech (Multimodal)
• Displays (output)
• HMD, Audio
• Processing
• Graphics: Virtual vehicles on battlefield
• Speech processing/understanding
Neely, H. E., Belvin, R. S., Fox, J. R., & Daily, M. J. (2004, March). Multimodal interaction
techniques for situational awareness and command of robotic combat entities. In Aerospace
Conference, 2004. Proceedings. 2004 IEEE (Vol. 5, pp. 3297-3305). IEEE.
System Diagram
VR Graphics Architecture
• Application Layer
• User interface libraries
• Simulation/behaviour code
• User interaction specification
• Graphics Layer (CPU acceleration)
• Scene graph specification
• Object physics engine
• Specifying graphics objects
• Rendering Layer (GPU acceleration)
• Low level graphics code
• Rendering pixels/polygons
• Interface with graphics card/frame buffer
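To illustrate the graphics layer, a toy scene graph in which each node composes its transform with its parent's during traversal (1-D offsets for brevity; real scene graphs use 4x4 matrices):

```python
class Node:
    def __init__(self, name, offset=0.0, children=()):
        self.name, self.offset, self.children = name, offset, children
    def render(self, parent_pos=0.0):
        pos = parent_pos + self.offset     # compose with parent transform
        print(f"{self.name} at {pos:.1f}")
        for child in self.children:
            child.render(pos)              # children inherit our transform

# A hand attached to an arm attached to the body moves with both:
Node("body", 1.0, (Node("arm", 0.5, (Node("hand", 0.4),)),)).render()
```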
Typical Graphics Architecture
Typical VR Simulation Loop
• User moves head, scene updates, displayed graphics change
System Delays
• Need to synchronize system to reduce delays
System Delay
[Figure: typical delay from tracking to rendering]
Typical System Delays
• Total Delay = 50 + 2 + 33 + 17 = 102 ms
• 1 ms delay = 1/3 mm error for an object drawn at arm's length
• So ~34 mm total error from when the user begins moving to when the object is drawn
[Pipeline: Tracking (20 Hz = 50 ms) supplies x,y,z / r,p,y → Calculate Viewpoint (500 Hz = 2 ms) → Render Scene (30 Hz = 33 ms) → Draw to Display (60 Hz = 17 ms), with the Simulation running inside the application loop]
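The arithmetic behind these numbers, as a quick check:

```python
stages_ms = {"tracking (20 Hz)": 50, "viewpoint calc (500 Hz)": 2,
             "render (30 Hz)": 33, "display scan-out (60 Hz)": 17}
total = sum(stages_ms.values())
print(f"total delay: {total} ms")                  # 102 ms
# Rule of thumb: 1 ms of delay ~ 1/3 mm of error at arm's length
print(f"registration error: ~{total / 3:.0f} mm")  # ~34 mm
```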
Living with High Latency (1/3 sec – 3 sec)
• https://www.youtube.com/watch?v=_fNp37zFn9Q
Effects of System Latency
• Degraded Visual Acuity
• Scene still moving when head stops = motion blur
• Degraded Performance
• As latency increases it’s difficult to select objects etc.
• If latency > 120 ms, training doesn’t improve performance
• Breaks-in-Presence
• If system delay high user doesn’t believe they are in VR
• Negative Training Effects
• Users train to operate in a world with delay
• Simulator Sickness
• Latency is greatest cause of simulator sickness
Simulator Sickness
• Visual input conflicting with vestibular system
Many Causes of Simulator Sickness
• 25-40% of VR users get Simulator Sickness, due to:
• Latency
• Major cause of simulator sickness
• Tracking accuracy/precision
• Seeing world from incorrect position, viewpoint drift
• Field of View
• Wide field of view creates more peripheral vection = sickness
• Refresh Rate/Flicker
• Flicker/low refresh rate creates eye fatigue
• Vergence/Accommodation Conflict
• Creates eye strain over time
• Eye separation
• If IPD does not match the inter-image distance, discomfort results
Motion Sickness
• https://www.youtube.com/watch?v=BznbIlW8iqE
How to Reduce System Delays
• Use faster components
• Faster CPU, display, etc.
• Reduce the apparent lag
• Take tracking measurement just before rendering
• Remove tracker from the loop
• Use predictive tracking
• Use fast inertial sensors to predict where user will be looking
• Difficult due to erratic head movements
Jerald, J. (2004). Latency compensation for head-mounted virtual reality. UNC
Computer Science Technical Report.
Reducing System Lag
[Pipeline as above (Tracking → Calculate Viewpoint → Render Scene → Draw to Display), with a faster component at each stage: Faster Tracker, Faster CPU, Faster GPU, Faster Display]
Reducing Apparent Lag
[Diagram: the scene is rendered into a larger virtual display (1280 x 960) using the last known position (x,y,z / r,p,y); just before scan-out the tracker is read again and the physical display window (640 x 480) is cropped from the virtual display at the latest position, instead of waiting for the full Tracking → Calculate Viewpoint → Render Scene → Draw to Display loop]
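A sketch of this "render big, crop late" idea: draw into the larger virtual framebuffer from the last known pose, then pick the physical window using the newest tracker sample just before scan-out. The pixels-per-degree scale is an assumed display constant:

```python
VIRTUAL_W, PHYS_W = 1280, 640     # virtual vs. physical display width (px)
PIXELS_PER_DEGREE = 12.0          # assumed horizontal display scale

def crop_origin(yaw_at_render: float, yaw_latest: float) -> int:
    """Shift the crop window by how far the head turned since rendering."""
    dx = (yaw_latest - yaw_at_render) * PIXELS_PER_DEGREE
    cx = (VIRTUAL_W - PHYS_W) / 2 + dx         # start centred, then offset
    return int(max(0, min(VIRTUAL_W - PHYS_W, cx)))

# Head turned 2.5 deg between render and scan-out -> window shifts 30 px:
print(crop_origin(30.0, 32.5))   # 350 (from the centred origin of 320)
```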
Predictive Tracking
[Graph: position over time, extrapolating from past measurements at "Now" into the future]
• Use additional sensors (e.g. inertial) to predict future position
• Can reliably predict up to 80 ms into the future (Holloway)
• Use Kalman filters or similar to smooth the prediction
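Before any Kalman smoothing, the core of such a predictor is constant-velocity extrapolation. A sketch using the ~80 ms practical limit quoted above:

```python
def predict(yaw_prev: float, yaw_now: float, dt: float,
            lookahead: float = 0.080) -> float:
    """Extrapolate yaw 'lookahead' seconds ahead at the current velocity."""
    velocity = (yaw_now - yaw_prev) / dt   # deg/s from finite difference
    return yaw_now + velocity * lookahead

# Samples 10 ms apart with the head turning at 100 deg/s:
print(predict(20.0, 21.0, 0.010))  # 21 + 100 * 0.08 = 29.0 deg
```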
Predictive Tracking Reduces Error (Azuma 94)
System Design Guidelines - I
• Hardware
• Choose HMDs with fast pixel response time, no flicker
• Choose trackers with high update rates, accurate, no drift
• Choose HMDs that are lightweight, comfortable to wear
• Use hand controllers with no line of sight requirements
• System Calibration
• Have virtual FOV match actual FOV of HMD
• Measure and set the user's IPD
• Latency Reduction
• Minimize overall end-to-end system delay
• Use displays with fast response time and low persistence
• Use latency compensation to reduce perceived latency
Jason Jerald, The VR Book, 2016
System Design Guidelines - II
• General Design
• Design for short user experiences
• Minimize visual stimuli closer to eye (vergence/accommodation)
• For binocular displays, do not use 2D overlays/HUDs
• Design for sitting, or provide physical barriers
• Show virtual warning when user reaches end of tracking area
• Motion Design
• Move virtual viewpoint with actual motion of the user
• If latency high, no tasks requiring fast head motion
• Interface Design
• Design input/interaction for user’s hands at their sides
• Design interactions to be non-repetitive to reduce strain injuries
Jason Jerald, The VR Book, 2016
VR LANDSCAPE
Landscape of VR
VR Studios
VR Capture
VR Processes and Engines
VR Distribution
VR Displays
VR Input/Output
www.empathiccomputing.org
@marknb00
mark.billinghurst@unisa.edu.au