A short course on rapid prototyping for head-mounted wearable computers, taught at the AWE 2015 conference on June 8th, 2015 by Mark Billinghurst. The course presents interface design guidelines for head-mounted wearable interfaces, along with prototyping tools that can be used to build interactive versions of those interfaces.
The Glass Class at AWE 2015
1. The Glass Class
Rapid Prototyping for Wearables
June 8th, AWE 2015
Mark Billinghurst
HIT Lab NZ
University of Canterbury
mark.billinghurst@hitlabnz.org
3. Mark Billinghurst
▪ Ex-Director of HIT Lab NZ,
University of Canterbury
▪ PhD Univ. Washington
▪ Research on AR, mobile HCI,
Collaborative Interfaces
▪ More than 300 papers in AR, VR,
interface design
▪ Sabbatical in Glass team at
Google [x] in 2013
4. Goals and Schedule
Goals
Learn simple interface design guidelines
Learn useful prototyping tools
Learn where further resources are
Schedule
10:00 Introduction
10:05 Wearable Interface Design (15 minutes)
10:20 Prototyping Tools (25 minutes)
10:45 Finish
8. Wearable Computing
▪ Computer on the body that is:
▪ Always on
▪ Always accessible
▪ Always connected
▪ Other attributes
▪ Augmenting user actions
▪ Aware of user and surroundings
9. History of Wearables
▪ 1960-90: Early Exploration
▪ Custom-built devices
▪ 1990 - 2000: Academic, Military Research
▪ MIT, CMU, Georgia Tech, EPFL, etc
▪ 1997: ISWC conference starts
▪ 1995 – 2005+: First Commercial Uses
▪ Niche industry applications, Military
▪ 2010 - : Second Wave of Wearables
▪ Consumer applications, Head Worn
10. ● Second Gen. Systems
▪ Recon (2010 - )
▪ Head worn displays for sports
▪ Ski goggle display
▪ Investment from Intel (2013)
▪ Google (2011 - )
▪ Google Glass
▪ Consumer focus
18. ● Designing for Intimacy
▪ Interface Design for wearables means
designing for the body:
▪ Designing for Attention
▪ Designing for Social Interaction
▪ User Experience Design
23. ● Time Looking at Screen
Oulasvirta, A. (2005). The fragmentation of attention in mobile
interaction, and what to do with it. interactions, 12(6), 16-18.
25. ● Design for Microinteractions
▪ Design interactions less than a few seconds
▪ Tiny bursts of interaction
▪ One task per interaction
▪ One input per interaction
▪ Benefits
▪ Use limited input
▪ Minimize interruptions
▪ Reduce attention fragmentation
26. NHTSA Guidelines - www.nhtsa.gov
For technology in cars:
• Any task by a driver should be interruptible at any time.
• The driver should control the pace of task interactions.
• Tasks should be completed with glances away from the
roadway of 2 seconds or less.
• Cumulative time glancing away from the road should be
no more than 12 seconds.
27. Rule of Thumb: The interface
should not take more than 4
seconds to complete a given
step in the interaction
28. ● Designing for Interruptions
▪ Assume user is engaged in critical real world task
▪ Use context to filter interruptions (is it necessary?)
▪ Interrupt in way that consumes least attention
▪ Allow user to dismiss interruption with minimal effort
▪ Progressively disclose information and increase interaction
29. ● Example: Interruptions on Glass
▪ Gradually increase engagement and attention load
▪ Respond to user engagement
Receiving an SMS on Glass:
Glass: "Bing" sound → Show Message → Start Reply
User: Look Up → Tap / Swipe → Say "Reply"
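The escalating flow above can be sketched as a small state machine: each step needs an explicit user action before more attention is demanded, and a swipe dismisses cheaply from any state. This is an illustrative plain-Java sketch of the pattern, not Glass code; the state and event names are assumptions.

```java
// Progressive disclosure for a Glass-style SMS notification:
// each transition costs a little more attention, and the user
// can dismiss at any point with minimal effort.
public class NotificationFlow {
    enum State { IDLE, CHIMED, MESSAGE_SHOWN, REPLYING }
    enum Event { SMS_ARRIVES, LOOK_UP, TAP, SAY_REPLY, SWIPE_DISMISS }

    private State state = State.IDLE;

    public State handle(Event e) {
        switch (state) {
            case IDLE:
                if (e == Event.SMS_ARRIVES) state = State.CHIMED;       // play "Bing"
                break;
            case CHIMED:
                if (e == Event.LOOK_UP) state = State.MESSAGE_SHOWN;    // show message
                else if (e == Event.SWIPE_DISMISS) state = State.IDLE;  // cheap dismissal
                break;
            case MESSAGE_SHOWN:
                if (e == Event.TAP || e == Event.SAY_REPLY) state = State.REPLYING;
                else if (e == Event.SWIPE_DISMISS) state = State.IDLE;
                break;
            case REPLYING:
                if (e == Event.SWIPE_DISMISS) state = State.IDLE;
                break;
        }
        return state;
    }

    public static void main(String[] args) {
        NotificationFlow flow = new NotificationFlow();
        flow.handle(Event.SMS_ARRIVES);
        flow.handle(Event.LOOK_UP);
        System.out.println(flow.handle(Event.SAY_REPLY)); // REPLYING
    }
}
```

Note the design point: the notification never escalates on its own; every increase in attention load is triggered by a user action.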
30. ● Consider Cognitive Load (Input)
▪ Consider the Cognitive Load required for input
▪ Little user input = low cognitive load (e.g. Sensor)
▪ Constant user input = high cognitive load (e.g. touch)
Continuum of Cognitive Load for User Input
31. ● Cognitive Load (Output)
▪ The Cognitive Resource consumed by system output
▪ Agents = low cognitive load (e.g. Web shopping agent)
▪ Multimedia = high cognitive load (e.g. VR system)
Continuum of Cognitive Load for Output
33. ● Social Acceptance
▪ People don’t want to look silly
▪ Only 12% of 4,600 adults would be willing to wear AR glasses
▪ 20% of mobile AR browser users experience social issues
▪ Acceptance more due to Social than Technical issues
▪ Needs further study (ethnographic, field tests, longitudinal)
34. Rule of Thumb: Fashion First
- It DOES NOT MATTER what
the device does unless the user is
willing to put it on the first time
40. The Now Machine
Focus on location-based, contextual and timely information, and communication.
(Timeline shown: forever … last year … last week … now)
42. ● Consider Your User
▪ Wearable User
▪ Probably Mobile
▪ One/no hand interaction
▪ Short application use
▪ Need to be able to multitask
▪ Use in outdoor or indoor environment
▪ Want to enhance interaction with real world
47. Rule of Thumb: Provide
multiple ways of accessing
functionality.
48. ● Example: Glass Pictures
▪ On Glass there are three ways to take a picture
1/ Voice commands – “Ok Glass, Take a Picture”
2/ Touch navigation through menu
3/ Winking with right eye
▪ Which you use depends on context
▪ Riding a bike outdoors – voice commands
▪ During a meeting – winking
50. It's like a rear-view mirror: don't overload the user. Stick to the absolutely essential; avoid long interactions. Be explicit.
51. Make it glanceable
Seek to rigorously reduce information density.
Successful designs afford recognition, not reading.
(Bad vs. good example designs shown)
52. ✓Reduce the number of info chunks
You are designing for recognition, not reading. Reducing the total # of
information chunks will greatly increase the glanceability of your design.
(Watch-face comparison: a design with 5–6 information chunks vs. a redesign with 3.)
Test done by Morten Just using a watch
53. Design single interactions to be faster than 4 s
Design with 5–6 chunks, eye movements:
Chunk 1: 1 fixation, 230 ms
Chunk 2: 1 fixation, 230 ms
Chunk 3: 1 fixation, 230 ms
Chunk 4: 3 fixations, 690 ms
Chunk 5: 2 fixations, 460 ms
Total: ~1,840 ms
Design with 3 chunks, eye movements:
Chunk 1: 1–2 fixations, 460 ms
Chunk 2: 1 fixation, 230 ms
Chunk 3: 1 fixation, 230 ms
Total: ~920 ms
Test done by Morten Just using a watch
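The fixation counts above can be turned into a rough glance-time budget, assuming ~230 ms per eye fixation as in Morten Just's test. A minimal sketch (the helper name and the 2-second NHTSA-style check are illustrative, not from the slides):

```java
// Rough glance-time estimate for the two watch-face designs:
// total time = fixations per chunk x ~230 ms per fixation.
public class GlanceBudget {
    static final int MS_PER_FIXATION = 230; // assumption drawn from the slide's numbers

    // Total glance time given how many fixations each info chunk needs.
    static int glanceTimeMs(int[] fixationsPerChunk) {
        int total = 0;
        for (int f : fixationsPerChunk) total += f * MS_PER_FIXATION;
        return total;
    }

    public static void main(String[] args) {
        int[] busyDesign = {1, 1, 1, 3, 2}; // 5-6 info chunks
        int[] glanceable = {2, 1, 1};       // 3 info chunks
        System.out.println(glanceTimeMs(busyDesign)); // 1840
        System.out.println(glanceTimeMs(glanceable)); // 920
        // Only the 3-chunk design fits a 2-second glance limit.
        System.out.println(glanceTimeMs(glanceable) <= 2000); // true
    }
}
```

Halving the chunk count roughly halves the glance time, which is why reducing information density matters more than visual styling.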
60. Remember, people have an ever-growing
ecosystem of wearables
Each device should be used when it’s most relevant and when it’s
the easiest interaction available.
62. Interface Guidelines
▪ Design for device
▪ Use micro-interactions
▪ Make it glanceable
▪ Do one thing at a time
▪ Reduce number of information chunks
▪ Design for indoor and outdoor use
64. Summary
Attention: least visual-manual attention
necessary, 4-second checkpoints, < 2 second
access time
Social: graceful interfaces, multiple ways of
accessing functionality
User Experience: glanceable interface,
design for device, multiple ways of accessing
information, keep it relevant
69. Important Note
▪ Most current wearables run the Android OS
▪ e.g. Glass, Vuzix, Atheer, Epson, etc.
▪ So many tools for prototyping on Android
mobile devices will also work for wearables
▪ If you want to learn to code, learn
▪ Java, Android, JavaScript/PHP
70. Typical Development Steps
▪ Sketching
▪ Storyboards
▪ UI Mockups
▪ Interaction Flows
▪ Video Prototypes
▪ Interactive Prototypes
▪ Final Native Application
Increased
Fidelity &
Interactivity
82. ● Viewing Design on Device
▪ Android Design Preview
▪ https://github.com/romannurik/AndroidDesignPreview
▪ View a portion of your desktop on Android device
▪ Select region of screen
▪ Mirror it on Android Device
▪ Use to view mock-ups on target device
▪ e.g. PowerPoint for Glass mockups
83. Limitations
▪ Positives
▪ Good for documenting screens
▪ Can show application flow
▪ Negatives
▪ No interactivity/transitions
▪ Can’t be used for testing
▪ Can’t deploy on wearable
▪ Can be time consuming to create
85. Video Sketching
▪ Series of still photos in a movie format
▪ Demonstrates the experience of the product
▪ Discover where the concept needs fleshing out
▪ Communicate experience and interface
▪ You can use whatever tools you like, from Flash to iMovie
92. UXpin - www.uxpin.com
▪ Web based wireframing tool
▪ Mobile/Desktop applications
▪ Glass templates, run in browser
https://www.youtube.com/watch?v=0XtS5YP8HcM
93. Proto.io - http://www.proto.io/
▪ Web based mobile prototyping tool
▪ Features
▪ Prototype for multiple devices
▪ Gesture input, touch events, animations
▪ Share with collaborators
▪ Test on device
100. Wireframe Limitations
▪ Can’t deploy on Device
▪ No access to sensor data
▪ Camera, orientation sensor
▪ No multimedia playback
▪ Audio, video
▪ Simple transitions
▪ No conditional logic
▪ No networking
102. Processing
▪ Programming tool for Artists/Designers
▪ http://processing.org
▪ Easy to code, Free, Open source, Java based
▪ 2D, 3D, audio/video support
▪ Processing For Android
▪ http://wiki.processing.org/w/Android
▪ Strong Android support
▪ Generates Android .apk file
103. Processing - Motivation
▪ Language of Interaction
▪ Sketching with code
▪ Support for rich interaction
▪ Large developer community
▪ Active help forums
▪ Dozens of plug-in libraries
▪ Strong Android support
▪ Easy to run on wearables
107. Basic Parts of a Processing Sketch
/* Notes comment */
//set up global variables
float moveX = 50;
//initialize the sketch
void setup() {
}
//draw every frame
void draw() {
}
108. Importing Libraries
▪ Can add functionality by Importing
Libraries
▪ java archives - .jar files
▪ Include import code
import processing.opengl.*;
▪ Popular Libraries
▪ Minim - audio library
▪ OCD - 3D camera views
▪ Physics - physics engine
▪ bluetoothDesktop - bluetooth networking
110. Processing and Glass
▪ One of the easiest ways to build rich
interactive wearable applications
▪ focus on interactivity, not coding
▪ Collects all sensor input
▪ camera, accelerometer, touch
▪ Can build native Android .apk files
▪ Side load onto Glass
111. Example: Hello World
//called initially at the start of the Processing sketch
void setup() {
size(640, 360);
background(0);
}
//called every frame to draw output
void draw() {
background(0);
//draw a white text string showing Hello World
fill(255);
text("Hello World", 50, 50);
}
113. Hello World Image
PImage img; // Create an image variable
void setup() {
size(640, 360);
//load the ok glass home screen image
img = loadImage("okGlass.jpg"); // Load the image into the program
}
void draw() {
// Display the image at its actual size at point (0,0)
image(img, 0, 0);
}
115. Touch Pad Input
▪ Tap is delivered as a DPAD key event
void keyPressed() {
if (key == CODED) {
if (keyCode == DPAD) {
// Do something ..
}
}
}
▪ Java code to capture richer motion events
▪ import android.view.MotionEvent;
116. Motion Event
//Glass Touch Events - reads from the touch pad
public boolean dispatchGenericMotionEvent(MotionEvent event) {
float x = event.getX(); // get x/y coords
float y = event.getY();
int action = event.getActionMasked(); // get code for action
switch (action) { // respond to the action code
case MotionEvent.ACTION_DOWN:
touchEvent = "DOWN";
fingerTouch = 1;
break;
case MotionEvent.ACTION_MOVE:
touchEvent = "MOVE";
xpos = myScreenWidth - x*touchPadScaleX;
ypos = y*touchPadScaleY;
break;
//further cases (e.g. ACTION_UP) handled similarly
}
return super.dispatchGenericMotionEvent(event);
}
118. Sensors
▪ Ketai Library for Processing
▪ https://code.google.com/p/ketai/
▪ Support all phone sensors
▪ GPS, Compass, Light, Camera, etc
▪ Include Ketai Library
▪ import ketai.sensors.*;
▪ KetaiSensor sensor;
119. Using Sensors
▪ Set up in the setup() function
▪ sensor = new KetaiSensor(this);
▪ sensor.start();
▪ sensor.list();
▪ Event-based sensor reading
void onAccelerometerEvent(float x, float y, float z)
{
accelerometer.set(x, y, z);
}
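Ketai's event-based reading can be mimicked in plain Java to show the pattern: the sketch registers a listener and the sensor pushes values to it, rather than the sketch polling every frame. The class and method names below are illustrative, not Ketai's API (only the callback shape mirrors it).

```java
import java.util.ArrayList;
import java.util.List;

// Plain-Java sketch of Ketai-style event-based sensor reading.
public class SensorEvents {
    interface AccelerometerListener {
        void onAccelerometerEvent(float x, float y, float z);
    }

    // Stand-in for a hardware sensor that pushes readings to listeners.
    static class FakeSensor {
        private final List<AccelerometerListener> listeners = new ArrayList<>();
        void register(AccelerometerListener l) { listeners.add(l); }
        // A real sensor would call this when new hardware data arrives.
        void emit(float x, float y, float z) {
            for (AccelerometerListener l : listeners) l.onAccelerometerEvent(x, y, z);
        }
    }

    public static void main(String[] args) {
        FakeSensor sensor = new FakeSensor();
        float[] latest = new float[3];
        // The listener only runs when a reading arrives - no polling loop.
        sensor.register((x, y, z) -> { latest[0] = x; latest[1] = y; latest[2] = z; });
        sensor.emit(0.1f, 9.8f, 0.0f);
        System.out.println(latest[1]); // 9.8
    }
}
```

This is why event-based reading keeps input cognitive load low: the application reacts only when the world changes, instead of demanding constant attention to a polling loop.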
121. Using the Camera
▪ Import the camera library
▪ import ketai.camera.*;
▪ KetaiCamera cam;
▪ Set up in the setup() function
▪ cam = new KetaiCamera(this, 640, 480, 15);
▪ cam.start();
▪ Read each new frame, then draw the camera image
void onCameraPreviewEvent() {
cam.read(); // read the latest camera frame
}
void draw() {
//draw the camera image
image(cam, width/2, height/2);
}
126. Raspberry Pi Glasses
▪ Modify video glasses and connect to a Raspberry Pi
▪ $200 - $300 in parts, simple assembly
▪ https://learn.adafruit.com/diy-wearable-pi-near-eye-kopin-video-glasses
127. Physical Input Devices
▪ Can we develop unobtrusive input devices ?
▪ Reduce need for speech, touch pad input
▪ Socially more acceptable
▪ Examples
▪ Rings, pendants, bracelets, gloves, etc.
132. ● Light Blue Bean - punchthrough.com/bean/
▪ Low energy Bluetooth Arduino microcontroller
▪ Programmed wirelessly (Bluetooth 4.0)
▪ Runs off coin battery
▪ On-board sensors (accelerometer, temperature)
▪ Ideal for wearable sensor/input projects
133. ● LittleBits - http://littlebits.cc/
▪ Quick and dirty prototyping
▪ Snap together electronics
▪ Dozens of input and output modules
▪ Arduino module to connect to wearable
135. Other Tools
▪ Wireframing
▪ pidoco
▪ FluidUI
▪ Rapid Development
▪ Phone Gap
▪ AppMachine
▪ Interactive
▪ App Inventor
▪ WearScript
136. ● WearScript
▪ Combines the power of Android development on Glass
with the learning curve of a website
▪ Use a cloud IDE to write JavaScript that runs instantly on
Glass inside a WebView
▪ Get started at http://wearscript.com
137. ● WearScript Features
▪ Community of Developers
▪ Easy development of Glass Applications
▪ GDK card format
▪ Support for all sensor input
▪ Support for advanced features
▪ Augmented Reality
▪ Eye tracking
▪ Arduino input
141. ● Nested Cards
var select = function () {WS.say('select')};
var tap = function () {WS.say('tap')};
var log = function () {WS.log('log')};
var tree = new WS.Cards();
var subtree = new WS.Cards();
subtree.add('No Listeners', 's0');
subtree.add('Select', 's1', select);
subtree.add('Select + Tap', 's2', select, tap);
subtree.add('Menu', 's3', 'Tap', tap, 'Log', log);
tree.add('Subtree', '0', subtree);
WS.cardTree(tree);
WS.displayCardTree();
142. ● Sensor Input
▪ Listen and respond to sensor data
▪ iBeacon
▪ Gyroscope
▪ GPS
▪ Accelerometer
▪ Magnetic field
▪ Orientation
▪ Light
▪ Gravity
▪ Linear acceleration
▪ Rotation vector
WS.sensorOn(WS.sensor('gps'), 5, function (data) {
var latitude = data['values'][0];
var longitude = data['values'][1];
//do something with the coordinates
});
146. Summary
▪ Prototyping for wearables is similar to mobiles
▪ Tools for UI design, storyboarding, wireframing
▪ Android tools to create interactive prototypes
▪ Processing, WearScript, etc
▪ Arduino can be used for hardware prototypes
▪ Once prototyped, native apps can be built
▪ Android + SDK for each platform