ON SUFFICIENCY OF VERBAL INSTRUCTIONS FOR INDEPENDENT BLIND
SHOPPING


John Nicholson
Computer Science Assistive Technology Laboratory
Department of Computer Science
Utah State University
Logan, UT 84322-4205
john.nicholson@aggiemail.usu.edu


Vladimir Kulyukin
Computer Science Assistive Technology Laboratory
Department of Computer Science
Utah State University
Logan, UT 84322-4205
Vladimir.Kulyukin@usu.edu


Daniel Coster
Department of Mathematics and Statistics
Utah State University
Logan, UT 84322-3900
daniel.coster@usu.edu

Abstract
A field study of independent blind shopping in a modern supermarket suggests that
verbal instructions may be sufficient for retrieving shelved products.


Introduction
Shopping complexes top the list of the most functionally challenging environments for individuals with visual impairments [5]. The Food Marketing Institute [1] reports that a typical modern supermarket stocks an average of 45,000 products and has a median store size of 47,500 square feet. Finding individual products, especially products
that are not purchased regularly, can take time, even for sighted shoppers, due to the size
of stores and the number of available products.
Many people with visual impairments do not shop independently and instead receive
assistance from a friend, a relative, an agency volunteer, or a store employee [3]. To
enable independent shopping, we have developed ShopTalk (Figure 1), a prototype of a
system designed to assist visually impaired shoppers with finding shelved products in
grocery stores. ShopTalk does not require any additional hardware or sensors to be
installed in the store. Instead, it takes advantage of research evidence suggesting that
people with visual impairments share route descriptions and guide each other over cell
phones [2].
ShopTalk assumes that visually impaired people who are independent navigators have the
skills needed to understand and follow verbal directions describing routes through large
areas of a store – store entrance to aisle, aisle to aisle, and aisle to cashier lane. In these areas, which we call the locomotor space, ShopTalk issues verbal route directions based on
a topological map of the store. Since there are no additional sensors installed in the store,
ShopTalk does not know where the shopper is actually located and instead assumes the
shopper is capable of following the directions and correcting navigational errors
independently. In short, the verbal directions guide the shopper to the general area of the
target product.
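To make the idea of a topological map concrete, the sketch below shows one way such a map and its route directions could be represented. It is an illustration only; the node names, connections, and instruction phrasings are invented for the example and are not taken from ShopTalk's implementation.

```python
# A minimal sketch (not the authors' implementation) of a topological map of the
# locomotor space and the verbal route directions it yields.
# Every landmark name, connection, and direction phrase here is hypothetical.

# The map is a graph: each node is a decision point (entrance, aisle head, cashier
# lane), and each directed edge carries the verbal instruction for that segment.
LOCOMOTOR_MAP = {
    ("entrance", "aisle_9"): "Walk forward about 30 feet, then turn left at aisle 9.",
    ("aisle_9", "aisle_10"): "Exit aisle 9, turn right, and walk one aisle over to aisle 10.",
    ("aisle_10", "aisle_11"): "Exit aisle 10, turn right, and walk one aisle over to aisle 11.",
    ("aisle_11", "cashier"): "Exit aisle 11, turn left, and walk to the first open cashier lane.",
}

def route_directions(stops):
    """Return the verbal instructions for a route given as an ordered list of nodes."""
    return [LOCOMOTOR_MAP[(a, b)] for a, b in zip(stops, stops[1:])]

if __name__ == "__main__":
    for step in route_directions(["entrance", "aisle_9", "aisle_10", "aisle_11", "cashier"]):
        print(step)
```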
Many grocery stores use inventory systems which place barcodes on the front edge of
shelves directly below products. ShopTalk uses these shelf barcodes to create a barcode
connectivity matrix (BCM). Whenever the shopper scans a shelf barcode, the system immediately knows the shopper's location in the store. ShopTalk can then instruct the shopper how to reach the target product. If the shopper is in the correct aisle and on the correct side of the aisle, an instruction can, for example, be “move two shelves up and scan ten barcodes to the left.” If the shopper is on the wrong side of the correct aisle or in the wrong aisle, instructions are issued to guide the shopper to the correct location.
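As an illustration of how barcode locations can drive such relative instructions, here is a small hedged sketch. The barcode values, the (aisle, side, shelf, slot) coordinate scheme, and the direction conventions are assumptions made for the example, not details of ShopTalk itself.

```python
# A minimal sketch (assumptions, not ShopTalk's code) of storing shelf-barcode locations
# and phrasing an instruction relative to the last scanned barcode.
# The barcode values and the (aisle, side, shelf, slot) coordinates are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class ShelfLocation:
    aisle: int   # aisle number
    side: str    # "left" or "right" side of the aisle
    shelf: int   # shelf counted from the bottom
    slot: int    # barcode position counted from the aisle entrance

# Toy connectivity data: barcode -> location on the shelves.
BCM = {
    "0001112223": ShelfLocation(aisle=9, side="left", shelf=2, slot=14),
    "0004445556": ShelfLocation(aisle=9, side="left", shelf=4, slot=4),
}

def instruction(scanned: str, target: str) -> str:
    """Phrase a relative instruction from the last scanned barcode to the target product."""
    here, goal = BCM[scanned], BCM[target]
    if here.aisle != goal.aisle:
        return f"Go to aisle {goal.aisle}."
    if here.side != goal.side:
        return f"Cross to the {goal.side} side of aisle {goal.aisle}."
    vertical = goal.shelf - here.shelf
    lateral = goal.slot - here.slot
    parts = []
    if vertical:
        parts.append(f"move {abs(vertical)} shelves {'up' if vertical > 0 else 'down'}")
    if lateral:
        parts.append(f"scan {abs(lateral)} barcodes to the {'right' if lateral > 0 else 'left'}")
    return " and ".join(parts) + "." if parts else "You are at the target product."

if __name__ == "__main__":
    # Prints: "move 2 shelves up and scan 10 barcodes to the left."
    print(instruction("0001112223", "0004445556"))
```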
Experiment
We conducted a study involving 10 participants with visual impairments from the Logan,
UT area over multiple weeks. The locations of shelf barcodes were recorded for three
aisles (9, 10, and 11) in Lee’s MarketPlace, a local supermarket, resulting in 4,297 items
being scanned to build the BCM. One product was randomly chosen from each aisle for
a total of three products.
The experiment was performed during the grocery store's normal business hours, starting
at 9:00 PM and ending between 10:30 PM and 11:30 PM, depending on a participant's
performance. After a one hour training session, each participant was led to the store
entrance and was given a shopping basket to carry. The participant was then asked to
perform five runs of the experiment's shopping route. The 384-foot route began at the entrance of the store, went to each of the three products, and ended at an entrance to a cashier lane. Participants were not told beforehand which products they would be shopping for.
The experiment was designed to test five hypotheses. Hypothesis 1 (H1) is that using
only verbal route directions, a person with a visual impairment can successfully navigate
the locomotor space in a grocery store. Hypothesis 2 (H2) is that verbal instructions
based on barcode scans and the BCM are sufficient to guide shoppers with visual
impairments to target products. Hypotheses 3, 4, and 5 state that as participants
repeatedly perform a shopping task, (H3) the total distance they travel approaches the
distance traveled by a blind shopper being guided by a sighted person, (H4) the total time
taken to find products approaches the time needed by a blind shopper being guided by a
sighted person, and (H5) the number of barcode scans needed to find a target product
decreases.
Results
The overall success rate of product retrieval was 100%. All ten participants were able to
find all three products in every run. This includes the three participants who had complete
vision loss. Verbal route instructions and barcode scans appeared to be sufficient for
navigating the store and retrieving target products in grocery aisles. Thus, the null
hypotheses associated with research hypotheses H1 and H2 were rejected for our sample,
and experimental evidence indicates that both H1 and H2 hold.
To test hypotheses H3 and H4, a sighted guide led a visually impaired shopper to the same products in the same order as in the experiment. During this baseline run, the shopper's guide dog followed the sighted guide. The baseline run was performed once, took 133 seconds to complete, and covered a distance of 384 feet. Both the average total distance traveled and the average time needed to complete a run (see Figure 2) fell over repeated runs, and the decreases were statistically significant, providing sufficient evidence in our data to reject the null hypotheses associated with H3 and H4. The average number of products scanned per run also fell over repeated runs, and this decrease was likewise significant, allowing the null hypothesis associated with H5 to be rejected. A complete data analysis is given in [4].
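The full analysis is in [4]; as a purely illustrative aside, a decline over repeated runs like the one described above is often checked with a repeated-measures model. The sketch below assumes a hypothetical per-run data file and column names and is not the analysis the authors performed.

```python
# A hedged sketch of one standard way to test whether run time declines across runs:
# a linear mixed-effects model of run time against run number, with a random
# intercept per participant. This is NOT the analysis from [4]; the file name and
# column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# Expected columns: participant (id), run (1..5), time_sec (seconds to complete the run).
data = pd.read_csv("shoptalk_runs.csv")  # hypothetical file

model = smf.mixedlm("time_sec ~ run", data, groups=data["participant"])
result = model.fit()
print(result.summary())  # a significantly negative 'run' coefficient means times fall over runs
```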


Summary
ShopTalk provides two major contributions. First, it shows that visually impaired
individuals who are independent navigators do not necessarily need a special sensor to
navigate a new route in an unknown structured environment. Provided they are given an adequate verbal description of the route they need to follow, they can use their everyday navigation skills to navigate the route successfully. Second, a device such as
ShopTalk can assist shoppers with visual impairments and allow them to independently
shop in a modern grocery store for shelved products.


Acknowledgments
The second author would like to acknowledge that this research has been supported, in part, by an NSF grant (IIS-0346880), the National Eye Institute of the National
Institutes of Health under Grant (1 R41 EY017516-01A1), and three Community
University Research Initiative (CURI) grants (CURI-04, URI-05, and CURI-06) from the
State of Utah.


References
[1] Food Marketing Institute Research. 2007. The Food Retailing Industry Speaks 2007.
Food Marketing Institute.
[2] Gaunet, F. and Briffault, X. 2005. Exploring the functional specifications of a
localized wayfinding verbal aid for blind pedestrians: Simple and structured urban areas.
Human-Computer Interaction 20, 3, 267–314.
[3] Kulyukin, V., Gharpure, C., and Nicholson, J. 2005. RoboCart: Toward robot-assisted navigation of grocery stores by the visually impaired. In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2005), 2845–2850.
[4] Kulyukin, V., Nicholson, J., and Coster, D. 2008. ShopTalk: Toward Independent
Shopping by People with Visual Impairments. Technical Report USU-CSATL-1-04-08,
Computer Science Assistive Technology Laboratory, Department of Computer Science,
Utah State University. April 15, 2008.
[5] Passini, R. and Proulx, G. 1988. Wayfinding without vision: An experiment with
congenitally totally blind people. Environment and Behavior 20(2), 227–252.


Figures




Figure 1. A view of ShopTalk's hardware.
Image description: A user is shown wearing ShopTalk. He is holding a barcode scanner
and carrying a shopping basket. He is wearing a backpack. A computer is attached to the
backpack's back, and a numeric keypad is attached to the backpack's shoulder strap.




Figure 2. A graph displaying the average time needed to complete each run. Similar
patterns were seen for the average total distance and average number of products
scanned.
Image description: The graph shows a downward trend, with the initial run having a 765
second average for all participants and the final run having a 272 second average. As a
group, low vision participants performed slightly faster than the group as a whole, and
participants with complete vision loss performed slightly slower. All groups trend toward the baseline time of 133 seconds.
