1. Measuring Perceptible
Affordances with Eye Tracking:
An iGoogle Case Study
Jacques Chueke
London, UK, May 2011
George Buchanan
(1st Supervisor)
Lecturer, Centre for HCI Design
Stephanie Wilson
(2nd Supervisor)
Lecturer, Centre for HCI Design
Master in Design, PUC-Rio, RJ, Brazil
PhD Researcher at the Centre for HCI Design
School of Informatics, City University London
2. New Modes of Interaction
Microsoft Surface, 2007
Xbox 360 (Kinect) Dashboard, 2011
Tobii Lenovo, Jun 2011
CES 2009: Hitachi's Gesture Remote Control TV Prototype
Front-facing webcam to track head movements for cursor control, 2011
4. Problem Statement
• Traditional control modes of interaction (e.g. mouse and keyboard) are being replaced by NUIs (physical interactions, e.g. touch, voice, gestures, eye gaze).
• Emerging post-WIMP command vocabularies.
• Hybrid solutions combining control modes (interactions) with visual solutions (interfaces).
• The ‘hidden gestures’ issue.
• The ‘invisible control’ issue, or ‘hidden interactions’.
5. Problem Statement
• What am I supposed to do? And if I did, what’s going to happen?
Carroll, Lewis (2009). Alice's Adventures in Wonderland and Through the Looking-Glass.
6. The Hidden Interactions issue
• The new Windows 8 shares features with those used in Windows Phone and the Xbox 360 Dashboard.
7. The Hidden Interactions issue
• The iPad’s hidden interactions/gestures issue, and some interface solutions.
Editor's notes
The introduction of novel hardware for computing and gaming over the last decade is changing the way we control everyday devices. It provides NUI control methods, such as haptic (e.g. iPhone, iPad, MS Surface), gesture-based and voice (e.g. Nintendo Wii, Microsoft Xbox 360 console gaming with the Kinect sensor), and eye-tracking interactions (e.g. Tobii P-10). One specific impact this has had is on the user's control of such devices. New ways to control, and new gesture vocabularies (e.g. Kinect and iPad), have emerged, and users do not know how to access or activate them. New visual metaphors have also appeared (e.g. the Metro dashboard). How do people make sense of them? How do people learn them? How do we design better perceptible affordances (PAs) to improve UX and learning? How do we teach PAs at first, yet hide them once the user becomes an expert? Regarding the new gesture vocabularies, what you are going to see is not what I mean…