Spatial Gestures using a Tactile-Proprioceptive Display
              Eelke Folmer & Tony Morelli - TEI’12, Kingston
                                Player-Game Interaction Lab
                                 University of Nevada, Reno
Spatial Gestures in NUI’s




                   Player-Game Interaction Research
                         University of Nevada, Reno
No Display / Unable to see
Non-Visual NUI’s

[Figure: an audio list interface reading out “item B” from items A–C (visual impairment); screenless gesture input (mobile contexts)]

Limitations:
» no spatial gestures
» rely on visuospatial memory
Tactile-Proprioceptive display

Turn the human body into a display
Proprioception
 » human ability to sense the orientation of limbs
 » augment haptic feedback with proprioceptive information
Example

[Figure: vibration frequency plotted as a function of pointing error (error = 0 at the target)]
Study 1: 2D target acquisition

linear:
 » Yerror: frequency
 » Xerror: band
multilinear:
 » Yerror: frequency
 » Xerror: pulse delay
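The two encodings above can be sketched as follows; the parameter ranges, band count, and linear scalings are assumptions for illustration, not the paper's actual device parameters:

```python
def encode_linear(x_err, y_err, n_bands=4, max_err=400.0, max_hz=250.0):
    """Linear scheme as labeled on the slide: Y error drives vibration
    frequency, X error selects a discrete band. Ranges are illustrative."""
    freq = max_hz * min(abs(y_err), max_err) / max_err
    band = min(int(abs(x_err) / max_err * n_bands), n_bands - 1)
    return freq, band

def encode_multilinear(x_err, y_err, max_err=400.0, max_hz=250.0,
                       max_delay_ms=500.0):
    """Multilinear scheme: Y error drives vibration frequency, X error
    drives the delay between vibration pulses."""
    freq = max_hz * min(abs(y_err), max_err) / max_err
    delay = max_delay_ms * min(abs(x_err), max_err) / max_err
    return freq, delay
```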
Study 1: procedure & results

Space Invaders-like game
Between-subjects study with 16 subjects
Corrected search time:
 » linear 51.7 ms/pixel
 » multilinear 40.3 ms/pixel   (significant difference)
No significant difference in error
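One way to read the ms/pixel figures: raw search time normalized by the distance to the target, so trials with near and far targets become comparable. A minimal sketch of that normalization (the paper's exact definition may differ):

```python
def corrected_search_time(search_time_ms, target_distance_px):
    """Normalize raw search time by target distance (ms/pixel),
    assuming that is what 'corrected' means here."""
    return search_time_ms / target_distance_px
```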
Study 2: Spatial Gesture

Multilinear scanning
Correct gesture if:
 » within 150 pixels of target
 » Z axis decrease of 20 cm
 » less than 5% error in each axis of rotation
8 subjects (did not participate in Study 1)
Corrected search time: 45.9 ms/pixel
Aiming accuracy: 21.4°
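The three criteria above can be checked mechanically. This sketch assumes the XY error is a 2D offset in pixels, the Z decrease is a minimum threshold, and rotation error is given as a fraction per axis; the signature and units are assumptions, not the study's instrumentation:

```python
import math

def is_correct_gesture(xy_err_px, z_drop_cm, rot_err_fracs,
                       max_xy_err=150.0, min_z_drop=20.0, max_rot_err=0.05):
    """Apply the slide's three criteria for a correct gesture:
    end point within 150 px of the target, a Z-axis decrease of at
    least 20 cm (a push toward the target), and under 5% error on
    each axis of rotation."""
    within_target = math.hypot(*xy_err_px) <= max_xy_err
    pushed_forward = z_drop_cm >= min_z_drop
    aimed = all(abs(e) <= max_rot_err for e in rot_err_fracs)
    return within_target and pushed_forward and aimed
```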
Potential Applications

» navigation
» low cost motor rehabilitation
» exergames for users who are blind
Current/Future Work

[Figures: 3D scanning (direction of error found along X and Y); extension of Fitts’s law]

3D target selection
two handed scanning
Model for non-visual pointing
props & questions

This research was supported by NSF Grant IIS-1118074. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.


Editor's Notes

  1. Hi, My name is Eelke Folmer and I'm here to present the work I did with my grad student Tony Morelli on using a tactile proprioceptive display to perform spatial gestures. \n\n
  2. Spatial gestures are an essential feature of natural user interfaces where a touch or gesture activates or manipulates an on screen object. For example, dragging a file into a folder. \nSpatial interaction relies upon being able to visually acquire the position of an object.\n
  8. So you can imagine this is pretty difficult if you are unable to see or if you don't have a display.
  9. In recent years, several non-visual NUIs have been developed. For example, audio-based interfaces allow blind users to scroll lists and select items on touch-screen devices, but those don't use spatial gestures. To increase input spaces on mobile devices, screenless mobile interfaces have been developed: users interact with imaginary objects or shortcuts defined in a plane in front of them. Though some of these techniques may allow spatial gestures, no spatial feedback is provided, and the user must keep track of the locations of objects, which may be hard if there is a large number of them. To address the shortcomings of existing non-visual NUIs, we present a novel ear- and eye-free display technique that lets users acquire the location of an object in a 2D display defined in front of them and then manipulate it using a spatial gesture.
  20. Along the lines of recent work that turns the body into an input space, we explore turning the body into a display. But instead of using your body to communicate information to someone else, we communicate information to the user through their own body. To do that, we use a largely unexplored output modality called proprioception: the human ability to sense the orientation of one's limbs, which allows you, for example, to touch your nose with your eyes closed. Recent work by my lab and others shows that you can augment haptic feedback with proprioceptive information to create a significantly larger information space, one that can be accessed in an ear- and eye-free manner and used to point out targets around the user. Let me illustrate this with an example.
  21. In previous work we created a bowling game for blind users. Users can find the location of the bowling pins by scanning their environment with a handheld, orientation-aware device capable of providing haptic feedback. Directional vibrotactile feedback, for example frequency, guides the user to point the device at the pins: the higher the frequency, the closer you are to the target. The target direction is then conveyed to the user through their own arm, which allows them to perform a gesture towards the target. So some preliminary research on tactile-proprioceptive displays has been conducted, but it has only explored 1D target acquisition.
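  The frequency-based guidance described above can be sketched in a few lines. This is an illustrative mapping only; the frequency range, field of view, and linear ramp are assumptions, not the parameters used in the bowling game.

```python
def vibration_frequency(device_angle, target_angle,
                        f_min=10.0, f_max=250.0, fov=90.0):
    """Map angular pointing error to a vibration frequency (Hz).

    The closer the device points at the target, the higher the
    frequency. All constants here are illustrative assumptions.
    """
    error = abs(device_angle - target_angle)
    if error >= fov:
        return 0.0  # target outside the scanning range: no feedback
    # Linear ramp: zero error -> f_max, edge of the range -> f_min.
    return f_min + (f_max - f_min) * (1.0 - error / fov)
```

  Pointing straight at the target (`error == 0`) yields the maximum frequency, and the vibration fades to nothing as the device sweeps away from it.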
  33. In the first study we explore 2D target acquisition. We implemented a tactile-proprioceptive display using the Sony Move controller, which can provide two different types of directional vibrotactile feedback and can be tracked by an external camera with very high accuracy. The size of the display is constrained by the reach of your arm. Two different scanning techniques were defined. In linear scanning, users first find a band defined around the target's X coordinate in which directional vibrotactile feedback is provided, and then find the target's Y coordinate using frequency. In multilinear scanning, directional vibrotactile feedback is provided on both axes simultaneously, using frequency and pulse delay. Preliminary experiences showed multilinear scanning was much harder to perform, so it was of interest which one would yield the best performance.
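  The two scanning techniques can be contrasted with a minimal sketch. The display dimensions, band width, frequency range, and pulse-delay range below are all hypothetical; only the structure (band-then-frequency for linear, frequency-plus-delay for multilinear) follows the description above.

```python
def linear_feedback(pos, target, band=50.0, height=480.0,
                    f_min=10.0, f_max=250.0):
    """Linear scanning: vibrate only inside a band around the
    target's X; inside it, frequency encodes distance in Y."""
    if abs(pos[0] - target[0]) > band:
        return None  # outside the band: no vibration at all
    dy = abs(pos[1] - target[1]) / height
    return f_min + (f_max - f_min) * (1.0 - dy)

def multilinear_feedback(pos, target, width=640.0, height=480.0,
                         f_min=10.0, f_max=250.0,
                         delay_min=0.05, delay_max=1.0):
    """Multilinear scanning: both axes conveyed at once.
    X distance -> frequency, Y distance -> pulse delay (s)."""
    dx = abs(pos[0] - target[0]) / width
    dy = abs(pos[1] - target[1]) / height
    freq = f_min + (f_max - f_min) * (1.0 - dx)       # closer in X: higher pitch
    delay = delay_min + (delay_max - delay_min) * dy  # closer in Y: faster pulses
    return freq, delay
```

  The sketch makes the usability trade-off visible: linear scanning is silent outside the band, so the user gets a clear two-stage search, while multilinear scanning always emits feedback but requires decoding two signals at once.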
  40. We conducted a between-subjects study with 16 CS students. Subjects played an augmented-reality-like Space Invaders game in which they had to shoot 40 aliens. Results showed a significant difference in search time corrected for distance, with multilinear scanning being significantly faster. No difference in error was found.
  42. The second study explored spatial interaction. Subjects had to scan to the location of a balloon and pop it using a thrust gesture. A gesture was correct if it was within 150 pixels of the target and there was less than 5% error in rotation along each axis of the controller. A user study with 8 users found an aiming error of 21 degrees.
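  The correctness criterion for the thrust gesture can be written as a simple predicate. The function name, argument layout, and the representation of rotation error as a per-axis fraction are illustrative assumptions; only the 150-pixel and 5% thresholds come from the study description.

```python
import math

def gesture_correct(hit, target, rotation_error,
                    max_dist=150.0, max_rot_error=0.05):
    """Return True if a thrust gesture counts as correct: it must
    land within 150 pixels of the target and have less than 5%
    rotation error along each controller axis (per the criterion
    described in the talk; structure here is a sketch)."""
    dist = math.hypot(hit[0] - target[0], hit[1] - target[1])
    return (dist <= max_dist and
            all(abs(e) < max_rot_error for e in rotation_error))
```

  For example, a gesture that lands 100 pixels away with small per-axis rotation errors passes, while one 200 pixels away, or with any axis off by 6%, does not.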
  45. Potential applications of our technique include: navigation, where the Y coordinate can be used to indicate the distance to a target; low-cost motor rehabilitation; and exercise games for users who are blind.
  46. Current and future work focuses on extending this to 3D target selection. We are further interested in seeing whether this non-visual pointing task can somehow be modeled.