2. OVERVIEW
- Conducted human factors (HF) research to fill possible gaps
in the current understanding of information density
with regard to augmented reality and wearable
technology.
- Consulted 40+ scholarly journal articles.
- Key Insights:
- Posture/Walking
- Plasticity of AR
- Immersion versus Presence
- Multimodality
3. INFORMATION DENSITY
- Focus: Overload vs. Over-Reliance
- How can the system be informed of changes in the user's objectives and environment?
- How do we determine the correct amount of immersion and presence?
- Should the user be able to request specific tools, or should the system be context-aware?
7. PLASTICITY
- Plasticity is based on recognition of the context of use: the
system computes how an interactive system should evolve
as the context changes. [10]
- The system uses cues from the user to recognize a
change in the context or environment and provides the
user with the correct tools.
- The factors considered in what follows are scene
lighting, the position of the user (estimated by distance
to the target), and the ambient noise.
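The cue-driven adaptation above can be sketched as a simple rule table over the three factors; the function name, thresholds, and setting names below are hypothetical illustrations, not values from [10].

```python
def adapt_presentation(scene_lux: float, distance_m: float, noise_db: float) -> dict:
    """Map the three context cues to presentation settings.

    All thresholds are illustrative placeholders.
    """
    return {
        # A bright scene washes out the optical overlay, so boost contrast.
        "high_contrast": scene_lux > 10_000,
        # Far from the target, fine annotations are unreadable: go coarse.
        "detail_level": "coarse" if distance_m > 5.0 else "fine",
        # In loud environments, audio cues are unreliable; stay visual-only.
        "audio_cues": noise_db < 70.0,
    }
```

For a sunny scene, far from the target, with heavy ambient noise, this yields high contrast, coarse detail, and no audio cues.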
9. MOVEMENT
- When users attempted to control their walking while
simultaneously performing a task shown on the HMD,
their ability to perform both tasks deteriorated. [7]
- Both visual and motor aspects of walking
contributed to the decrease in performance. [7]
10. POSTURE
- Considerations: HMD must have equal
weight distribution.
- Posture can be used to measure fatigue,
discomfort, activation or valence.
12. INTERACTION
- Interaction includes determining the selection, annotation, and direct manipulation of physical objects; adapting to changes in the environment; and recognizing the presence of other people in the environment.
- Challenge: What tools to give users?
13. ENGAGEMENT
- Allow users to control their level of reality and
their position on the mixed reality continuum by
combining transitional mixed reality interfaces
and focused-casual control. [5]
- Should the users be the choosers? This is often
debated among researchers.
14. IMMERSION
What kind of functionality
provides the user with a
heightened awareness of
surroundings or sensory
capability?
15. PRESENCE
- Presence is co-constructed through
the experience of real and virtual
elements; as such, understanding
this relationship becomes critical. [8]
16. PRESENCE
- How does this experience
compare to our everyday
reality?
- Presence can be heightened by
the effective usage of hybrid
UI.
17. HOW TO IMPROVE PERFORMANCE?
- Multimodal environments are associated
with faster mental processing times of
discrete stimulus events, potentially
because they provide the user with more
complete information about the
environment. [10]
18. MULTI-MODALITY (HYBRID UI)
- Synchronicity of Sensory
Functionality
- How can we use these to consume
information and make decisions
more effectively?
20. READING INPUT
- Use maximum contrast when reading time is important, and
use colors to reduce errors.
- The study examines a colored billboard with transparent
text, where colors have a specific meaning.
- Only bright objects can overlap the background: in
practice, dark colors appear semi-transparent and
mix in with the background.
- The brightness of the background overcomes the
brightness of the display.
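These overlap rules follow from the additive blending of optical see-through displays: the display can only add light to the scene, never subtract it. A minimal sketch under an idealized 8-bit additive model (real displays are not perfectly additive):

```python
def perceived(display_rgb, background_rgb):
    """Idealized optical see-through blend: emitted light adds to the scene."""
    return tuple(min(d + b, 255) for d, b in zip(display_rgb, background_rgb))

# Dark text over a bright background nearly vanishes into the scene...
print(perceived((30, 30, 30), (200, 200, 200)))    # (230, 230, 230)
# ...while a bright overlay still stands out against it.
print(perceived((255, 255, 0), (200, 200, 200)))   # (255, 255, 200)
```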
21. VISION (OCULAR VARIANTS)
- Refractive Error (20%)
  - Contrasts in color may vary
- Presbyopia (Aging Eye, 45+)
  - Displays designed for use within 20" may be hard to see
- Accommodative/Convergence Disorders (5-10%)
  - Prolonged work forcing the user to concentrate at near sight can cause eye strain or double vision
23. TYPING INPUT
- The system uses immersive 3D stereo displays to render a virtual environment and a virtual keyboard, a data glove to track finger motions, and micro-speakers to create low-frequency (50 Hz) vibrations for realistic tactile haptic feedback for each finger. When the user presses a virtual key, realistic tactile feedback can be provided. [9]
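A per-finger 50 Hz burst like the one described in [9] could be generated along these lines; the function names, burst length, and sample rate are assumptions for illustration, not details from the paper.

```python
import math

def vibration_burst(freq_hz: float = 50.0, duration_s: float = 0.05,
                    sample_rate: int = 8000) -> list:
    """Build a short low-frequency sine burst to drive one micro-speaker."""
    n = int(duration_s * sample_rate)
    return [math.sin(2 * math.pi * freq_hz * i / sample_rate) for i in range(n)]

def on_virtual_key_press(finger: int, speaker_queues: dict) -> None:
    # Route the tactile burst to the micro-speaker on the pressing finger.
    speaker_queues.setdefault(finger, []).extend(vibration_burst())
```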
25. MEDICAL
- Now: Like maintenance personnel, roaming nurses and doctors could benefit from important information being delivered directly to their glasses. [8]
- Future:
  - Laparoscopic (minimally invasive) surgery
  - Projected X-Rays [2]
26. ROBOTIC TELEOPERATION
- Mostly applicable to drone operations.
- Instead of controlling the robot directly, it may be preferable to control a virtual version of the robot.
- The user plans and specifies the robot's actions by manipulating the local virtual version in real time.
- The results are displayed directly on the real world. Once the plan is tested and confirmed, the user tells the real robot to execute the specified plan. [2]
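The plan-then-execute workflow can be sketched with two objects; the class and method names here are illustrative, not taken from [2].

```python
from dataclasses import dataclass, field

@dataclass
class VirtualRobot:
    """Local stand-in the user manipulates in real time to build a plan."""
    plan: list = field(default_factory=list)

    def move_to(self, x: float, y: float) -> None:
        self.plan.append(("move_to", x, y))

@dataclass
class RealRobot:
    """Only a tested, approved plan ever reaches the real hardware."""
    executed: list = field(default_factory=list)

    def execute(self, plan: list) -> None:
        self.executed.extend(plan)

virtual = VirtualRobot()
virtual.move_to(1.0, 2.0)   # rehearse actions on the virtual version
virtual.move_to(3.0, 4.0)
real = RealRobot()
real.execute(virtual.plan)  # commit the approved plan to the real robot
```

Keeping planning and execution on separate objects mirrors the decoupling the slide describes: mistakes happen on the virtual robot, not in the world.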
27. FIELD WORK
- Allow virtual training for new hires.
- Scavenger-hunt-like workflow with tutorials and remote troubleshooting functionality.
- Collaboration
- Repair
- Troubleshooting
29. AFFECTIVE COMPUTING
Affective computing identifies some challenges in making computers more aware of the emotional state of their users and able to adapt accordingly. [8]
30. SENSORY AUGMENTATION
Augmented Mirror Box: The mirror box enabled the researchers to project an image of the healthy limb moving, thus giving the visual appearance of two limbs (bimanual coupling) in action.
- It was hypothesized that this form of virtual movement could be used to engage the brain’s processes associated with the other (impaired) limb, thereby reducing spatial and motor impairments. [11]
31. HAPTIC AUDIO
- Haptic audio refers to sound that is felt rather than heard. It is already applied in consumer devices such as Turtle Beach’s Ear Force5 headphones to increase the sense of realism and impact, but also to enhance user interfaces of, e.g., mobile phones. [11]
32. FUTURE ACCESSORIES ECOSYSTEM
- Wearable device in the Fitbit/Fuel Band form factor
- Navigation (Gestures or Cursor)
- Voice
- Camera | Scan
- Olfactory Receptor (?)
- Haptic Audio
33. SUMMARY
- Big Picture: To drive productivity, the HUD user experience must adapt to changes in the user's environment by altering the user's sense of presence and leveraging the multimodal affordances of the device to react.
34. REFERENCES
[1] Adaptive Augmented Reality: Plasticity of Augmentations (Ghouaiel et al., 2014)
[2] Augmented Reality (Pandya Aniket)
[3] Augmented Reality Text Style Readability with See-Through Head Mounted Displays in Industrial Settings (Fiorentino et al., 2013)
[4] An Information Presentation Method for Head Mounted Display Considering Surrounding Environments (Nakao et al., 2014)
[5] A Dose of Reality: Overcoming Usability Challenges (McGill et al.)
35. REFERENCES
[6] The Nexus of Human Factors in Cyber-Physical Systems: Ergonomics of Eyewear for Industrial Applications (Theis & Wille, 2014)
[7] Visual Task Performance Using a Monocular See-Through Head-Mounted Display (HMD) While Walking