Assistive technologies: experiences from
AAL for the blind and visually impaired
within the ALICE project
Andrei BURSUC, Prof. Titus ZAHARIA
Institut Mines-Télécom; Télécom SudParis
firstname.lastname@telecom-sudparis.eu
Invited talk hosted by the DemaCare FP7 Project
• Context and objectives
• The ALICE project and AAL
• State-of-the-art
• User requirements
• System prototype
• Obstacle detection
• Navigation assistant
• Human-Machine interface
• Conclusion and perspectives
2
Outline
Experiences from the ALICE project
• VI persons face many problems every day:
– overall contextual understanding of space semantics
– interaction with surrounding objects
– planning, orientation, communication, navigation
• 285M registered visually impaired people: 39M blind, 246M
with low vision (WHO report)
• The prevalence of visual impairment is increasing with an
ageing population
3
Context and objectives
Nowadays
Experiences from the ALICE project
• Provide a navigational assistive device with cognitive
capabilities for elderly blind users:
– Positioning
– Obstacle detection/alerting
– Landmark/object recognition
• Offer VI users a cognitive description based on a fusion of
perceptions gathered from multiple sensors
• Personal benefits:
– Enable the independence of blind and partially sighted people
– Reduce stress and save time for end-users
– Improve individual self-esteem
4
Context and objectives
Objectives
Experiences from the ALICE project
• 7 partners (academics, SMEs, VI persons associations)
• 4 European countries (ES, FR, SI, UK)
• Duration: June 2012 – November 2014
• Final product: a device consisting of a smartphone with
additional sensors, wirelessly connected to a local processing
unit
The project
ZVEZA
SLEPIH
5
Experiences from the ALICE project
• Ambient Assisted Living - a funding activity that aims:
– to create better living conditions for older adults
– to strengthen industrial opportunities in Europe through the
use of ICT
• Funds cross-national projects involving SMEs, research
bodies and users' organizations
• Time-to-market perspective of max 2-3 years after the end
of the project
• Project total budget: 1-7 M€ (funding 3 M€ at most)
AAL Joint Programme
6
Experiences from the ALICE project
Experiences from the ALICE project 7
What’s possible?
State of the art
How do VI persons orient themselves?
• With the help of a guide (another person)
• Using a white cane, guide dog
• Using electronic devices, GPS
• By listening for familiar sounds
• By looking for something familiar (edges of pavements,
curves, crossroads, very large inscriptions)
• Underfoot textures, different surfaces
• Sun, wind directions, smell
• Road signs
8
State of the Art
Experiences from the ALICE project
Experiences from the ALICE project
How do VI persons orient themselves?
• Current techniques are still not very advanced
9
State of the Art
Experiences from the ALICE project
How do VI persons orient themselves?
• Canes and dogs are still kings!
10
State of the Art
How could VI persons orient themselves?
• Navigation systems:
– GPS + computer vision (clear path, landmark recognition)
• Object recognition systems:
– Grocery shopping assistant
– RFID tags on objects
– OCR (Optical Character Recognition)
– Detectors: crosswalks, walk lights, staircases, street signs, pedestrians
• Obstacle avoidance systems:
– Integrating depth information
– Step and curb detection
11
State of the Art
Experiences from the ALICE project
• Conclusions:
– Few systems work in real time
– Many approaches require the use of heavy equipment
– Some systems need tags
– The research field should get a new boost with the advent of
Google Glass
How could VI persons orient themselves?
12
State of the Art
[Lee, 2012]
[Manduchi, 2012] [Pradeep, 2010]
Experiences from the ALICE project
• Limited computational resources: lightweight, low-power
wearable devices
• Real-time responsiveness
• Reliability and no false positives
• Adequate and non-overwhelming communication with the
user (alerts, indications)
13
State of the Art
Challenges
Experiences from the ALICE project
24 July 2013 14
Setting up the path
User feedback and requirements
Experiences from the ALICE project
• Participants’ profile:
– Age: 55-75
– Countries: Slovenia, UK
– Degree of visual impairment: blind and partially sighted
– Total: 40 participants (20 from each country)
Questionnaire for end-users
15
User requirements
Questionnaire conclusions
• 50% of participants use only familiar routes
• Most participants need someone to guide them to certain
places.
• Some of them need a guide every time – they often depend
on the time and willingness of others.
• It is important for them to know where they are, how far
the destination is, and what lies in the vicinity of the route
16
User requirements
Experiences from the ALICE project
Questionnaire conclusions - Device
• Little confidence is placed in electronic navigation systems
(trust comes only after several successful tests)
• Training and information about electronic devices are
necessary.
• Half of the users use speech synthesis
• Willingness to use headphones, but hearing shouldn't
be obstructed.
• “Turn by turn” functionality should not give too much
info
17
User requirements
Experiences from the ALICE project
Questionnaire conclusions - Indoor
• 85% of respondents have difficulties with orientation
inside indoor public institutions.
• Difficulties users face in indoor environments:
– the size of the room
– glittering surfaces
– room darkness
– no orientation points to navigate by with the white cane
– difficulty recognizing landmarks
– background music.
18
User requirements
Experiences from the ALICE project
Questionnaire conclusions - Obstacles
• Obstacles that users want to be warned about:
– pillars
– curves
– overhanging branches
– edge of pavements
– street furniture
– steps
– down slopes
– ramps
– holes
– bumps
19
User requirements
Experiences from the ALICE project
Experiences from the ALICE project
User expectations
• The device should be accurate:
– Exact info about the obstacles
– Find safe corridors for walking
– Warn the user when it is safe to cross the road, whether the green light is on,
and whether traffic is coming (especially bikes, electric cars)
• The device should be small, portable, phone sized.
20
User requirements
User expectations
• Other features:
– Give the distance to the building
– Find the right bus stop, post box.
– Text-to-speech for: letters, journey instructions, street
inscriptions, shop names
– Tell the weather, temperature, local taxi availability.
– Recognize faces and tell the person's name.
21
User requirements
Experiences from the ALICE project
Experiences from the ALICE project 22
First tests and experiments
System prototype
Sensor evaluation
• Evaluation of multiple sensors: cameras (ToF, stereo, web),
compass, gyroscope, ultrasonic ranger, GPS, pedometer
• Samsung Galaxy S3 used as baseline
23
System prototype
[Figure: smartphone capabilities considered – image, communication, sound commands, tactile communication, orientation, positioning, light sensor, inclination]
Experiences from the ALICE project
Sensor evaluation
• Sensors have different sampling speeds
24
System prototype
Experiences from the ALICE project
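To make the sampling-rate mismatch concrete, here is a minimal Python sketch (not part of the ALICE prototype) that aligns two hypothetical sensor logs recorded at different rates onto a common timeline with pandas; the file names, column names and rates are assumptions for illustration.

```python
# Hypothetical sketch: aligning sensor streams recorded at different sampling speeds.
# File names, column names and rates are illustrative assumptions.
import pandas as pd

def load_stream(path, value_cols):
    """Read a CSV log with a 'timestamp' column (seconds) into a time-indexed frame."""
    df = pd.read_csv(path)
    df.index = pd.to_datetime(df["timestamp"], unit="s")
    return df[value_cols]

# e.g. accelerometer sampled at ~100 Hz, GPS fixes at ~1 Hz
accel = load_stream("accelerometer.csv", ["ax", "ay", "az"])
gps = load_stream("gps.csv", ["lat", "lon"])

# Resample both streams onto a common 10 Hz timeline:
# average the fast sensor over each bin, forward-fill the slow one.
common = pd.concat(
    [accel.resample("100ms").mean(), gps.resample("100ms").ffill()],
    axis=1,
).dropna()
print(common.head())
```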
Sensor evaluation - Conclusions
• All sensors in the Samsung S3 are superior to the external
ones tested (except GPS).
• The external GPS has better reception due to its antenna – but in
areas with strong multipath effects, the advantage is reduced
• Accuracy of GPS: 10 – 40 meters in urban areas
• An ultrasonic ranger would be useful for detecting obstacles in front of
the user
25
System prototype
Experiences from the ALICE project
Possible camera positions
26
System prototype
Experiences from the ALICE project
Possible camera positions
27
System prototype
Experiences from the ALICE project
Possible camera positions
• Setting used for video recording
28
System prototype
Experiences from the ALICE project
Headphones
• Bone conduction headphones:
– Effective even in very loud environments (city traffic)
– Do not obscure sounds from the environment
– Very high frequencies are not reproduced as well as with normal headphones
29
System prototype
Experiences from the ALICE project
30
Platform configuration
System prototype
Experiences from the ALICE project
24 July 2013 31
Parsing the visual domain
Obstacle detector
32
Input video stream
Method overview
Obstacle detection
Experiences from the ALICE project
33
Input video stream
Interest points extraction
Grid of points regularly spread in a frame
Method overview
Obstacle detection
Experiences from the ALICE project
34
Input video stream
Interest points extraction
Grid of points regularly spread in a frame
Interest points matching and
tracking
Multiscale Lucas-Kanade algorithm
Method overview
Obstacle detection
Experiences from the ALICE project
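As an illustration of these first two stages, the hedged OpenCV sketch below seeds a regular grid of interest points in each frame and tracks them with the pyramidal (multiscale) Lucas-Kanade algorithm; the grid step, window size, pyramid depth and the input file name are assumptions, not the project's tuned values.

```python
# Hedged sketch of the first two stages: a regular grid of interest points,
# tracked with the pyramidal (multiscale) Lucas-Kanade algorithm.
# Grid step, window size, pyramid depth and the input file are illustrative assumptions.
import cv2
import numpy as np

def grid_points(frame_shape, step=16):
    """Regular grid of interest points spread over the frame."""
    h, w = frame_shape[:2]
    xs, ys = np.meshgrid(np.arange(step // 2, w, step),
                         np.arange(step // 2, h, step))
    pts = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(np.float32)
    return pts.reshape(-1, 1, 2)

def track_points(prev_gray, curr_gray, pts):
    """Match and track grid points between consecutive frames (multiscale LK)."""
    nxt, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, curr_gray, pts, None,
        winSize=(21, 21), maxLevel=3)       # 3 pyramid levels -> multiscale tracking
    ok = status.ravel() == 1
    return pts[ok], nxt[ok]

cap = cv2.VideoCapture("walk.mp4")          # hypothetical input video stream
ret, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
while True:
    ret, frame = cap.read()
    if not ret:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    p0, p1 = track_points(prev_gray, gray, grid_points(gray.shape))
    prev_gray = gray                        # p0/p1 feed the motion-estimation stages
```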
35
Input video stream
Interest points extraction
Interest points matching and
tracking
Multiscale Lucas-Kanade algorithm
Background / Camera motion
estimation
Global geometric transform – RANSAC
algorithm
Method overview
Obstacle detection
Experiences from the ALICE project
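One way to realize the background / camera motion stage is OpenCV's RANSAC homography estimator: tracked points consistent with the global geometric transform are treated as background, while the outliers become candidate obstacle points. A minimal sketch, with an assumed reprojection threshold:

```python
# Sketch: estimate the global (camera / background) motion with RANSAC
# and keep the outliers as candidate obstacle points.
# The reprojection threshold is an illustrative assumption.
import cv2

def split_background_candidates(p0, p1, ransac_thresh=3.0):
    """p0, p1: (N, 1, 2) matched point arrays from the tracking stage."""
    H, inlier_mask = cv2.findHomography(p0, p1, cv2.RANSAC, ransac_thresh)
    inliers = inlier_mask.ravel().astype(bool)
    background = p1[inliers]       # consistent with the global geometric transform
    candidates = p1[~inliers]      # residual motion -> potential obstacles
    return H, background, candidates
```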
36
Input video stream
Interest points extraction
Interest points matching and
tracking
Background / Camera motion
estimation
Global geometric transform – RANSAC
algorithm
Static / Dynamic obstacle
motion estimation
Agglomerative clustering based on
proximity computation
Method overview
Obstacle detection
Experiences from the ALICE project
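The grouping of the remaining (non-background) points into obstacle hypotheses can be illustrated with SciPy's agglomerative (hierarchical) clustering on point proximity; the distance cut-off below is an assumed value, not the project's.

```python
# Sketch: agglomerative clustering of candidate obstacle points by spatial proximity.
# The 40-pixel distance cut-off is an illustrative assumption.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

def cluster_obstacle_points(points, max_dist=40.0):
    """points: (N, 2) array of candidate point positions; returns one label per point."""
    if len(points) < 2:
        return np.zeros(len(points), dtype=int)
    Z = linkage(points, method="single")         # merge the closest groups first
    return fcluster(Z, t=max_dist, criterion="distance")
```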
37
Input video stream
Interest points extraction
Interest points matching and
tracking
Background / Camera motion
estimation
Static / Dynamic obstacle
motion estimation
Agglomerative clustering based on
proximity computation
Interest points refinement
K-NN algorithm and small clusters removal
Method overview
Obstacle detection
Experiences from the ALICE project
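One plausible reading of this refinement step, a K-NN majority vote to smooth point labels followed by removal of clusters that remain too small, is sketched below; k and the minimum cluster size are assumptions.

```python
# Sketch: interest point refinement via a K-NN majority vote and small-cluster removal.
# k and min_size are illustrative assumptions.
import numpy as np
from collections import Counter
from sklearn.neighbors import KNeighborsClassifier

def refine_clusters(points, labels, k=3, min_size=5):
    """points: (N, 2) positions; labels: cluster label per point (e.g. from fcluster)."""
    labels = np.asarray(labels)
    if len(points) > k:
        knn = KNeighborsClassifier(n_neighbors=k)
        knn.fit(points, labels)
        labels = knn.predict(points)      # majority label among the k nearest points
    counts = Counter(labels)
    keep = np.array([counts[l] >= min_size for l in labels], dtype=bool)
    return points[keep], labels[keep]
```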
38
Input video stream
Interest points extraction
Interest points matching and
tracking
Background / Camera motion
estimation
Static / Dynamic obstacle
motion estimation
Interest points refinement
Obstacle classification
K-NN algorithm and small clusters removal
Method overview
Obstacle detection
Experiences from the ALICE project
Experiences from the ALICE project 39
Input video stream
Interest points extraction
Interest points matching and
tracking
Background / Camera motion
estimation
Static / Dynamic obstacle
motion estimation
Interest points refinement
Obstacle classification
Obstacle classification based on position
and direction relative to the video camera
Experimental results
Method overview
Obstacle detection
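A hedged sketch of how such a classification could be computed from each cluster's average position and motion relative to the camera's field of view; the centre band and speed threshold are illustrative assumptions, not the project's values.

```python
# Sketch: classify each obstacle cluster from its position and motion
# relative to the camera's field of view.
# The centre band and speed threshold are illustrative assumptions.
import numpy as np

def classify_cluster(pts_prev, pts_curr, frame_width, speed_thresh=4.0):
    """pts_prev, pts_curr: (N, 2) positions of one cluster in consecutive frames."""
    motion = (pts_curr - pts_prev).mean(axis=0)        # average optical-flow vector
    pos = pts_curr.mean(axis=0)
    centre_x = frame_width / 2.0
    # Approaching if the cluster drifts towards the image centre column (walking direction).
    approaching = abs(pos[0] + motion[0] - centre_x) < abs(pos[0] - centre_x)
    speed = float(np.linalg.norm(motion))
    urgent = (approaching and speed > speed_thresh
              and abs(pos[0] - centre_x) < 0.25 * frame_width)
    return ("approaching" if approaching else "departing",
            "urgent" if urgent else "normal")
```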
40
Experimental results
Obstacle detection
Experiences from the ALICE project
41
The algorithms were run on an Intel Xeon machine (3.6 GHz, 16 GB RAM) with an NVIDIA Quadro 4000 video board (256 CUDA cores, 256-bit external memory
interface, 9945 MB graphics memory), under a Windows 7 platform (desktop).
Preprocessing step | Time without GPU (msec) | Time with GPU (msec)
Interest points detection (image grid) | 0.05 – 0.5 | 0.05 – 0.5
Interest points matching and tracking (unidirectional Lucas–Kanade optical flow) | 22 – 23 | 10 – 11
Background / camera motion estimation (unidirectional homographic motion model, RANSAC) | 6.5 – 8.0 | 6.5 – 8.0
Object / obstacle motion estimation (agglomerative clustering) | 0.05 – 0.15 | 0.05 – 0.15
Interest points refinement (K-NN algorithm) | 0.05 – 0.1 | 0.05 – 0.1
Obstacle classification (approaching / departing and urgent / normal) | 0.05 – 0.1 | 0.05 – 0.1
Saving results (video) | 1.5 – 2.05 | 1.5 – 2.05
TOTAL TIME / FRAME (average) | 31 | 20
Computational time
Obstacle detection
Experiences from the ALICE project
Taking the path
Navigation assistant
Accessible Maps
• Crowd-sourced application for map annotation
• Routes are entered, edited and shared with Google Maps
• OpenStreetMap used as a repository and for online access to
information about points of interest.
43
Navigation assistant
Experiences from the ALICE project
Accessible Maps
• Waypoint annotations:
– WHAT: presence of crosswalk, traffic lights in an intersection, type
of intersection, walk buttons, Stop signs, median strips.
– WHERE: information given as absolute geographic coordinates (Lat, Long)
44
Navigation assistant
Experiences from the ALICE project
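As a rough illustration of the WHAT/WHERE idea, the sketch below models a waypoint annotation as a small data structure and looks up the nearest annotated point with a haversine distance; the field names and example data are assumptions, not the project's actual schema.

```python
# Sketch: WHAT / WHERE waypoint annotations and a nearest-waypoint lookup.
# Field names and the example data are illustrative assumptions.
from dataclasses import dataclass, field
from math import radians, sin, cos, asin, sqrt

@dataclass
class Waypoint:
    lat: float                                      # WHERE: absolute geographic position
    lon: float
    features: dict = field(default_factory=dict)    # WHAT: crosswalk, traffic lights, ...

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def nearest_waypoint(lat, lon, waypoints):
    return min(waypoints, key=lambda w: haversine_m(lat, lon, w.lat, w.lon))

# Hypothetical example:
wps = [Waypoint(46.0569, 14.5058, {"crosswalk": True, "traffic_lights": True,
                                   "walk_button": True})]
print(nearest_waypoint(46.0570, 14.5060, wps).features)
```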
Experiences from the ALICE project
Assistance
• Crossing ahead:
• Turn left and then cross:
45
Navigation assistant
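A hedged sketch of how such spoken hints could be selected, assuming the user's heading and the bearing to the next annotated crossing are known (e.g. from the compass and the accessible map); the angular sectors are illustrative assumptions.

```python
# Sketch: choose a spoken hint from the user's heading and the bearing
# to the next annotated crossing. Angular sectors are illustrative assumptions.
def crossing_instruction(user_heading_deg, bearing_to_crossing_deg):
    """Both angles in degrees, clockwise from north."""
    rel = (bearing_to_crossing_deg - user_heading_deg + 360.0) % 360.0
    if rel < 30.0 or rel > 330.0:
        return "Crossing ahead"
    if rel <= 180.0:
        return "Turn right and then cross"
    return "Turn left and then cross"

print(crossing_instruction(0.0, 300.0))   # -> "Turn left and then cross"
```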
Assistance
• Demo:
46
Navigation assistant
Experiences from the ALICE project
Making the connection
Human-Machine interface
Objectives
Human-Machine interface
• Create a communication/presentation system:
– Highly adapted to user needs
– Enable the VI to perceive and interact with the surrounding
environment
• Instructions for navigation will have to acknowledge that
user perception is similar to moving blindfolded in a maze:
– Verbalization: for description of surrounding objects
– Enactive methods: for presenting orientation, distance, motion
and position of moving objects
48
Experiences from the ALICE project
Methods
Human-Machine interface
• 2 separate groups of users according to:
– Level of visual impairment
– Other criteria (age, education, etc.)
• Interface modalities:
– Audio semantics using sound, music and synthesized voice
– Text-to-speech synthesis using headphones
• Input modalities: screen, tapping, gestures, voice
• Output modalities: audio, haptic, tactile
49
Experiences from the ALICE project
Enactive methods
Human-Machine interface
• Communication with the user: what, when, how
– Not just how to transfer information between the system and the
user, but what information and when.
– The timely delivery of the right information avoids information
overload.
– Translate the sensory impressions about the surroundings into
tactile or sound information (faster and easier to comprehend
than verbalization).
50
Experiences from the ALICE project
User warning
• Directional warnings: earcons
• Positional warnings:
– alerting the user must give them enough time to prepare (2-3 sec for
a voice message)
– acoustic signal (sequence of beeps) with varying frequencies
– vibrations in the bone conduction headphones
51
Human-Machine interface
Experiences from the ALICE project
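To illustrate the acoustic side of a positional warning, the sketch below synthesizes a short beep sequence whose pitch rises as the obstacle gets closer and writes it to a WAV file; the frequency range, beep timing and output file name are assumptions, not the project's actual signal design.

```python
# Sketch: positional warning as a beep sequence whose pitch rises as the obstacle gets closer.
# Frequency range, beep timing and the output file name are illustrative assumptions.
import wave
import numpy as np

def warning_beeps(distance_m, max_dist=5.0, sr=16000, n_beeps=3, beep_s=0.12, gap_s=0.08):
    """Closer obstacle -> higher pitch (600 Hz when far, up to 1400 Hz when near)."""
    closeness = 1.0 - min(max(distance_m / max_dist, 0.0), 1.0)
    freq = 600.0 + 800.0 * closeness
    t = np.arange(int(sr * beep_s)) / sr
    beep = 0.4 * np.sin(2 * np.pi * freq * t)
    gap = np.zeros(int(sr * gap_s))
    signal = np.concatenate([np.concatenate([beep, gap]) for _ in range(n_beeps)])
    return (signal * 32767).astype(np.int16), sr

samples, sr = warning_beeps(distance_m=1.5)
with wave.open("warning.wav", "wb") as f:   # would be played back through the headphones
    f.setnchannels(1)
    f.setsampwidth(2)
    f.setframerate(sr)
    f.writeframes(samples.tobytes())
```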
Menu
• Hierarchical menu
52
Human-Machine interface
Experiences from the ALICE project
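A hierarchical, eyes-free menu can be modelled as a tree navigated with a handful of gestures (next, enter, back), each announced label being sent to text-to-speech; the sketch below is an assumed structure for illustration, not the actual ALICE menu.

```python
# Sketch: hierarchical menu navigable with next / enter / back gestures,
# where each announced label would be sent to text-to-speech.
# The menu contents are illustrative assumptions.
class MenuNode:
    def __init__(self, label, children=None):
        self.label = label
        self.children = children or []
        self.parent = None
        for child in self.children:
            child.parent = self

root = MenuNode("Main menu", [
    MenuNode("Navigation", [MenuNode("New destination"), MenuNode("Repeat instruction")]),
    MenuNode("Obstacle alerts", [MenuNode("Alerts on"), MenuNode("Alerts off")]),
    MenuNode("Settings", [MenuNode("Speech rate"), MenuNode("Volume")]),
])

class MenuCursor:
    def __init__(self, root):
        self.node, self.index = root, 0

    def current(self):
        return self.node.children[self.index].label   # label to be spoken

    def next(self):
        self.index = (self.index + 1) % len(self.node.children)
        return self.current()

    def enter(self):
        child = self.node.children[self.index]
        if child.children:
            self.node, self.index = child, 0
        return self.current()

    def back(self):
        if self.node.parent is not None:
            self.node, self.index = self.node.parent, 0
        return self.current()

cursor = MenuCursor(root)
print(cursor.current())   # "Navigation"
print(cursor.enter())     # "New destination"
```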
Georgie prototype
• Sample user-interface
53
Human-Machine interface
Experiences from the ALICE project
24 July 2013 54
Next steps
Conclusion and Perspectives
Conclusion
• Encouraging first achievements within the ALICE project
• Human-Machine interfacing is a difficult challenge
• User feedback is essential
• Still plenty of things left to improve
55
Conclusion and perspectives
Experiences from the ALICE project
Perspectives
• Learning and recognizing user-defined landmarks and
objects of interest
• Obstacle classification according to degree of risk to the
user and generation of adequate alerts
• Improve navigation and recognition at key points of the trip
(start and finish)
• Navigation and obstacle recognition modules integrated
into a single application
56
Conclusion and perspectives
Experiences from the ALICE project
ALICE benefits in day-to-day life?
• Jean:
– is partially sighted
– works at UBPS
– travels the same route to his office every day
57
Conclusion and perspectives
Experiences from the ALICE project
ALICE benefits in day-to-day life?
• Jean:
– knows the route
– with his white cane he manages to travel safely from the bus stop
to the building.
58
Conclusion and perspectives
Experiences from the ALICE project
ALICE benefits in day-to-day life?
• Paul:
– is blind
– goes to the UBPS once a week
– uses a different route (he doesn't feel safe enough)
59
Conclusion and perspectives
Experiences from the ALICE project
ALICE benefits in day-to-day life?
• Paul:
– Paul’s route
60
Conclusion and perspectives
Experiences from the ALICE project
Experiences from the ALICE project
ALICE benefits in day-to-day life?
• Paul and some other blind people usually need to take
longer routes (more than 400 m)
61
Conclusion and perspectives
Paul's route – Jean's route
How can ALICE bring benefits?
24 July 2013 62
Conclusion and perspectives
Find out more at
www.alice-project.eu
Thank you!
Experiences from the ALICE project
• Slide 2: http://www.flickr.com/photos/gullevek/3240421172/
• Slide 7: http://www.flickr.com/photos/pointshoot/3590816656/
• Slide 10: http://blog.grdodge.org/wp-content/uploads/2011/08/Morris-and-Buddy-1.jpg
http://www.iowablindhistory.org/sites/default/files/image/History%20Site%20Images%20and%20Audio%20/Pic%20o
f%20Jernigan.jpg
http://www.flickr.com/photos/library_of_congress/8190452507/
http://www.globalride-sf.org/images/0608/images/2_PedInfra_TactileWarnings.jpg
http://images.ookaboo.com/photo/m/Geleidehond_testparcours_m.jpg
http://www.robertschroeder.com/wordpress/wp-content/uploads/2011/01/GuidedWalkSchroeder.jpg
http://abramsonscorner.files.wordpress.com/2011/06/img_9072-13-of-54-version-2-1-of-1.jpg
• Slide 14: http://farm4.staticflickr.com/3459/3188288778_3d44b943b4_b.jpg
• Slide 15: http://blockingfortheblind.org/wp-content/uploads/2013/02/peoplewithcanes.jpg
• Slide 20: http://i.huffpost.com/gen/819993/thumbs/r-BLIND-MAN-TASERED-large570.jpg
• Slide 31: http://www.flickr.com/photos/swiiffer/4593608484/
• Slide 42:
http://upload.wikimedia.org/wikipedia/commons/thumb/a/af/Blind_Leading_the_Blind_by_Lee_Mclaughlin.jpg/1024px-
Blind_Leading_the_Blind_by_Lee_Mclaughlin.jpg
• Slide 47: http://i.imgur.com/f3fqnEY.jpg
• Slide 54: http://www.flickr.com/photos/84681882@N00/5467879589
• Slide 62: http://www.austindowntownlions.org/Resources/Pictures/Gucci%20looking%20forward%20and%20canes.jpg
63
Photo credits
