Eyes-Free Barcode Localization and Decoding for Visually Impaired Mobile Phone Users

Vladimir Kulyukin, Aliasgar Kutiyanawala∗
Computer Science Assistive Technology Laboratory
Department of Computer Science
Utah State University
Logan, UT 84322
vladimir.kulyukin@aggiemail.usu.edu, aliasgar.k@aggiemail.usu.edu
∗ Contact Author


Abstract

An eyes-free barcode localization and decoding method is presented that enables visually impaired (VI) mobile phone users to decode MSI (Modified Plessey) barcodes on shelves and UPC barcodes on individual boxes, cans, and bottles. Simple and efficient barcode localization and decoding techniques are augmented with an interactive haptic feedback loop that allows the VI user to align the phone's camera with a fixed surface in the pitch and yaw planes. The method is implemented on a Google Nexus One smart phone running Android 2.1. A laboratory study is presented in which the method was evaluated by one VI and four blindfolded sighted participants.

Keywords: Accessible Shopping, Eyes-Free Barcode Localization & Decoding, Assistive Technology¹

¹ Submitted to IPCV 2010

1 Introduction

Supermarkets are one of the most functionally challenging environments for VI individuals. A modern supermarket has a median area of 4,529 square meters and stocks an average of 45,000 products [1]. VI people typically rely on family members or friends who take them grocery shopping. When these individuals are unavailable, VI shoppers reschedule their shopping trips or rely on store staffers assigned to them when they come to the store. Some staffers are unfamiliar with the store layout, others become irritated with long searches, and still others do not have adequate English skills to read the products' ingredients [2]. These difficulties cause VI shoppers to abandon searching for desirable products or to settle for distant substitutes. PeaPod and similar home delivery services provide grocery shopping alternatives. However, such services are not universally available and, when available, require shoppers to schedule and wait for deliveries, thereby reducing personal independence and making spontaneous shopping impossible. To overcome these access barriers, accessible shopping systems are needed that increase personal independence, enable spontaneous shopping, and do not require supermarkets to undergo extensive technological adjustments. Such systems will make VI individuals independent of external assistance and fundamentally improve their quality of life.

In 2006, we began our work on ShopTalk [3], a wearable system for independent blind supermarket shopping. ShopTalk consists of a small computer device (OQO model 01), a Belkin numeric keypad, a Hand Held Products IT4600 SR wireless barcode scanner and its base station, and a USB hub that connects all components. To help carry the equipment, the user wears a small CamelBak backpack. The numeric keypad is attached by velcro to one of the backpack's shoulder straps. To ensure adequate air circulation, the OQO is placed in a wire frame attached to the outside of the backpack. The remaining components - the barcode scanner's base station, the USB hub, and all connecting cables - are placed inside the backpack's pouch. Since the system gives speech instructions to the user, the user has a small headphone. The system was our first attempt (and the first attempt reported in the accessible shopping literature) to use shelf barcodes (MSI, not UPC) as topological points for locating products through verbal directions.

In 2007 - 2008, after two single subject studies at Lee's Market Place, an independently owned supermarket in Logan,
UT, and one single subject study at Sweet Peas, a small independent natural foods store in Logan, UT, ten VI participants were recruited for a longitudinal formal study. The product database was extended to include 4,297 products. The experiment, performed during regular business hours, had each participant shop for the same set of three randomly chosen products five times. The product retrieval rate was 100%: all ten participants found all three products in every run (see [2] for the detailed ANOVA analysis and experiment design). A key finding of ShopTalk is that all participants in the sample could execute verbal route directions in supermarkets and product search instructions triggered by barcode scans with 100% accuracy.

In 2008 - 2009, ShopTalk was ported to the Nokia E70 smartphone running Symbian OS 9.1. The smartphone is connected to a small Bluetooth barcode scanner (Baracoda Pencil2) [4]. The port, called ShopMobile, was tested in a laboratory study with six sighted blindfolded participants. The experiment had eighty shelf barcodes obtained from the USU bookstore placed on shelves assembled into three aisles, where aisles 1 and 3 had only one side while aisle 2 had two sides. Each participant successfully retrieved a set of four products.

We are currently developing ShopMobile II, the next version of ShopMobile. Like its direct predecessors, ShopTalk [5] and ShopMobile [4], ShopMobile II uses two data structures to represent the shopping environment. The first data structure is a topological map, which is a directed graph whose nodes are decision points: the store entrance, aisle entrances, and cashier lane entrances. The edges are labeled with directions. Due to the regularity of modern supermarkets, we found it sufficient to have three directional labels: left, right, and forward. The topological map is built at installation time by walking through the store, noting decision points of interest, and then representing them in the map. The second data structure, called the Barcode Connectivity Matrix (BCM), is designed to take advantage of the inventory control systems used by many grocery stores. These inventory systems place barcodes on the shelves immediately beneath every product type. Shelf barcodes assist the store personnel in managing the product inventory. Once the locations of all shelf barcodes in the store are known, this information is used to guide the shopper to the target shelf section, because products are located directly above their corresponding shelf barcodes. More information on how the BCM can be computed from the inventory control databases is available in [2].
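To make the two data structures concrete, the sketch below shows one plausible Java representation. The class and field names (DecisionPoint, ShelfLocation, BarcodeConnectivityMatrix) and the way directions are derived from shelf and position indices are illustrative assumptions, not the actual ShopMobile II code.

import java.util.EnumMap;
import java.util.HashMap;
import java.util.Map;

/** Illustrative sketch of the two data structures described above. */
public class StoreModel {

    /** Edge labels used in the topological map. */
    enum Direction { LEFT, RIGHT, FORWARD }

    /** A decision point: store entrance, aisle entrance, or cashier lane entrance. */
    static final class DecisionPoint {
        final String name;  // e.g., "aisle 5 entrance"
        final Map<Direction, DecisionPoint> out = new EnumMap<>(Direction.class);
        DecisionPoint(String name) { this.name = name; }
        void addEdge(Direction d, DecisionPoint to) { out.put(d, to); }
    }

    /** Hypothetical shelf location: aisle, shelf number, and position along the shelf. */
    static final class ShelfLocation {
        final int aisle, shelf, positionOnShelf;
        ShelfLocation(int aisle, int shelf, int positionOnShelf) {
            this.aisle = aisle; this.shelf = shelf; this.positionOnShelf = positionOnShelf;
        }
    }

    /** Barcode Connectivity Matrix: maps each shelf (MSI) barcode to its location. */
    static final class BarcodeConnectivityMatrix {
        private final Map<String, ShelfLocation> locations = new HashMap<>();
        void put(String msiBarcode, ShelfLocation loc) { locations.put(msiBarcode, loc); }

        /** Relative directions from a scanned shelf barcode to the target shelf barcode. */
        String directionsTo(String scanned, String target) {
            ShelfLocation s = locations.get(scanned), t = locations.get(target);
            if (s == null || t == null) return "unknown barcode";
            if (s.aisle != t.aisle) return "target product is in aisle " + t.aisle;
            int shelves = t.shelf - s.shelf;          // assumes shelf numbers grow upward
            int steps = t.positionOnShelf - s.positionOnShelf;
            return Math.abs(shelves) + (shelves >= 0 ? " shelves up, " : " shelves down, ")
                 + Math.abs(steps) + (steps >= 0 ? " barcodes right" : " barcodes left");
        }
    }
}

Under these assumptions, the verbal directions produced after a shelf barcode scan (e.g., "two shelves up, five barcodes right") reduce to differences of shelf and position indices within an aisle.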
Unlike its predecessors, ShopMobile II no longer requires a barcode scanner, because it relies exclusively on computer vision techniques to recognize MSI barcodes on shelves and UPC barcodes on individual products. Vision-based barcode localization and decoding is a well-known research problem. Algorithms have been proposed that look for line sets with mono-oriented gradients to locate the barcode region [6, 7]. Other algorithms extract connected regions with mono-oriented texture to perform local barcode searches in cluttered 3-D scenes [8, 9, 10]. Hough transform-based methods extract barcode lines robustly with respect to orientation and size [11]. Various morphological filters and self-learning networks have also been tried [12]. One common disadvantage of these algorithms is that they are very time-consuming and require external servers for image processing. Recently, several barcode decoding solutions for camera phones have been proposed [13, 14]. However, these solutions target sighted users in that they require the phone camera to be carefully aligned and centered on a barcode for the decoding to work. An accessible barcode localization and decoding method is proposed in [15]. However, unlike the method presented in this paper, it has not yet been implemented on a mobile phone. We wholeheartedly endorse and commend these R&D efforts. Our approach is based on the hypothesis that sophisticated vision techniques may not be needed to make barcode localization and decoding accessible to VI mobile phone users, because simple and efficient vision techniques can be augmented with interactive user interfaces that ensure that images have certain properties.

In this paper, we present an interactive haptic feedback loop that allows the VI mobile phone user to align the camera with a fixed surface in the pitch and yaw planes, which ensures that the captured image is taken with the camera parallel to the surface. The remainder of our paper is organized as follows. In Section 2, we describe our barcode localization and decoding method. In Section 3, we present an interactive haptic camera alignment loop that enables the VI mobile phone user to align the phone's camera with fixed surfaces in the pitch and yaw planes. In Section 4, we give brief descriptions of the target user and the target use case toward which we are working. We also present a laboratory study with one VI individual and four sighted blindfolded participants. In Section 5, we offer our conclusions.

2 Eyes-Free Barcode Localization and Decoding

A barcode can be viewed as a homogeneous region consisting of alternating black and white lines condensed in a small region. In ShopMobile II, barcodes are characterized as
image regions that have the following two properties. We call the first property alternating frequency and define it as the number of black-to-white and white-to-black transitions along the x-axis. We call the second property vertical continuity and define it as the continuity of black and white lines along the y-axis. As an example, consider two one-pixel-wide lines A and B placed on the barcode as shown in the left part of Figure 1. These lines can be encoded as bit strings where black pixels map to zeros and white pixels map to ones. The alternating frequency is the number of 0-1 and 1-0 transitions in the bit string. Vertical continuity is measured as the longest common subsequence of the two bit strings. Other string matching techniques can certainly be used, provided that there is no impact on the overall efficiency.

Figure 1: One Pixel Wide Lines on the Barcode.
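The following is a minimal sketch of how these two properties can be computed for a pair of one-pixel-wide rows taken from a binarized image (0 = black, 1 = white). The method names and the use of a plain longest-common-subsequence routine are illustrative, not the exact ShopMobile II implementation.

/** Sketch: the two barcode properties computed over binarized pixel rows. */
public final class BarcodeProperties {

    /** Alternating frequency: number of 0-1 and 1-0 transitions along a row. */
    static int alternatingFrequency(int[] row) {
        int transitions = 0;
        for (int x = 1; x < row.length; x++) {
            if (row[x] != row[x - 1]) transitions++;
        }
        return transitions;
    }

    /** Vertical continuity: length of the longest common subsequence of two rows. */
    static int verticalContinuity(int[] rowA, int[] rowB) {
        int[][] lcs = new int[rowA.length + 1][rowB.length + 1];
        for (int i = 1; i <= rowA.length; i++) {
            for (int j = 1; j <= rowB.length; j++) {
                lcs[i][j] = (rowA[i - 1] == rowB[j - 1])
                        ? lcs[i - 1][j - 1] + 1
                        : Math.max(lcs[i - 1][j], lcs[i][j - 1]);
            }
        }
        return lcs[rowA.length][rowB.length];
    }
}

A pair of rows whose alternating frequency and vertical continuity both exceed experimentally chosen thresholds is treated as evidence of a barcode region; the thresholds themselves are implementation parameters.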
                                                                     alternating frequency vertical continuity. After a small set
                                                                     of candidate regions is obtained, each candidate region is
                                                                     mapped to the original image, because, while the reduced
                                                                     resolution is sufficient for fast and adequate barcode local-
                                                                     ization, it turned out not to be adequate for ZXing [13],
                                                                     which we used for barcode decoding. ZXing is an open
                                                                     source library for decoding several 1D and 2D barcode
                                                                     types, including UPC, but does not support the MSI bar-
                                                                     code type. We extended ZXing to decode MSI barcodes and
                                                                     extended the API to accept a set of barcode regions for de-
                                                                     coding. Thus, the localized candidate regions are mapped
                                                                     back to the original image scale (1024 by 768 pixels) and
                                                                     cropped from it so that ZXing can process them. Future
                                                                     references to ZXing in the paper will refer to this extended
                                                                     version of ZXing.
                                                                     We designed an experiment to measure the contribution of
                                                                     our barcode localization algorithm to ZXing. A total of 187
          Figure 2: Line Detection Filter Kernel.
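A direct transcription of the formula is short enough to double as a sanity check: for m = 3 and n = 3 it produces a row-weighted variant of the 3 × 3 vertical line detection kernel from [17], and for m = 13 and n = 3 it gives the kernel of Figure 2. The class and method names are illustrative, and m and n are assumed to be odd.

/** Sketch: generates the m x n line detection kernel from the formula above (m, n odd). */
public final class LineDetectionFilter {

    static int[][] lineDetectionKernel(int m, int n) {
        int[][] kernel = new int[m][n];
        for (int i = -m / 2; i <= m / 2; i++) {
            for (int j = -n / 2; j <= n / 2; j++) {
                int a = (m + 1) / 2 - Math.abs(i);          // decreases away from the central row
                int value = (j == 0)
                        ? a * (n * n - 1) / 4               // positive central column
                        : -a * ((n + 1) / 2 - Math.abs(j)); // negative side columns
                kernel[i + m / 2][j + n / 2] = value;
            }
        }
        return kernel;
    }
}

Each row of the kernel sums to zero, so uniform image regions produce no response, while vertical black/white transitions produce strong responses.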
Since barcode lines pass through this filter along with some other vertical lines generated by graphics and text that may be present in the image, the filtered image is searched in a rasterized pattern with two one-pixel-wide lines in order to isolate areas with high alternating frequency and high vertical continuity. Fast histogram analysis is performed on all pairs of lines with sufficiently high coefficients for the alternating frequency and vertical continuity. After a small set of candidate regions is obtained, each candidate region is mapped to the original image because, while the reduced resolution is sufficient for fast and adequate barcode localization, it turned out not to be adequate for ZXing [13], which we used for barcode decoding. ZXing is an open source library for decoding several 1D and 2D barcode types, including UPC, but it does not support the MSI barcode type. We extended ZXing to decode MSI barcodes and extended its API to accept a set of barcode regions for decoding. Thus, the localized candidate regions are mapped back to the original image scale (1024 by 768 pixels) and cropped from it so that ZXing can process them. Future references to ZXing in this paper refer to this extended version of ZXing.
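To make the hand-off from localization to decoding concrete, the sketch below crops one candidate region from the full-resolution image and passes it to ZXing through its standard Java SE classes. Stock ZXing does not decode MSI, so the snippet restricts itself to UPC/EAN formats the library supports out of the box; the RegionDecoder class and the particular choice of hints are illustrative assumptions.

import java.awt.Rectangle;
import java.awt.image.BufferedImage;
import java.util.EnumMap;
import java.util.EnumSet;
import java.util.Map;

import com.google.zxing.BarcodeFormat;
import com.google.zxing.BinaryBitmap;
import com.google.zxing.DecodeHintType;
import com.google.zxing.MultiFormatReader;
import com.google.zxing.NotFoundException;
import com.google.zxing.Result;
import com.google.zxing.client.j2se.BufferedImageLuminanceSource;
import com.google.zxing.common.HybridBinarizer;

public final class RegionDecoder {

    /** Tries to decode one localized candidate region cropped from the full 1024x768 image. */
    static String decodeRegion(BufferedImage fullImage, Rectangle region) {
        BufferedImage cropped = fullImage.getSubimage(
                region.x, region.y, region.width, region.height);
        BinaryBitmap bitmap = new BinaryBitmap(
                new HybridBinarizer(new BufferedImageLuminanceSource(cropped)));

        Map<DecodeHintType, Object> hints = new EnumMap<>(DecodeHintType.class);
        hints.put(DecodeHintType.TRY_HARDER, Boolean.TRUE);
        hints.put(DecodeHintType.POSSIBLE_FORMATS,
                EnumSet.of(BarcodeFormat.UPC_A, BarcodeFormat.UPC_E, BarcodeFormat.EAN_13));
        try {
            Result result = new MultiFormatReader().decode(bitmap, hints);
            return result.getText();
        } catch (NotFoundException e) {
            return null; // no barcode found in this candidate region
        }
    }
}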
We designed an experiment to measure the contribution of our barcode localization algorithm to ZXing. A total of 187 images of real products were captured with an HTC Touch Pro smartphone. In the first run, ZXing alone decoded 91 images. In the second run, the images were first processed with our barcode localization algorithm, and ZXing was then run on the cropped candidate regions. This time barcodes were correctly decoded in 133 images, an increase of 42 images, or 46.15%, over the number of images decoded by ZXing alone. The software for this experiment was implemented in Java SE with NetBeans and ran on a Dell Optiplex 960 Core2Duo machine running at 3 GHz.

Figure 4: Accessible Barcode Localization and Decoding Method (downscale image → apply line detection filter → rasterized search → extract barcode from original image).

3 Interactive Camera Alignment

The shopper always takes a picture of a fixed surface after the phone's camera is aligned with the surface in the pitch and yaw planes. The alignment is carried out through an iterative haptic control loop. If the product is a box, the user is instructed to find the top of the box, i.e., the side with an opening tab, by touch and then align the camera with the bottom, i.e., the opposite side, because on most boxes sold in the U.S. the UPC barcode is on the bottom. Once the bottom is found, the shopper aligns the camera with an edge of that side and clicks a button on the touch screen to start the alignment loop. The current readings of the phone's orientation sensor are taken to define the absolute pitch and yaw planes. The shopper then slowly moves the camera away from the surface. If the camera is misaligned in the pitch or yaw planes, specific haptic signals (vibrations) are issued to re-align it. The signals persist until the camera is aligned. The user is instructed to stop moving the camera away from the surface when she thinks that the camera is approximately 10 cm from the surface. Then the picture is taken by a click of a button on the touch screen. If the product is a can, the shopper aligns the camera with the edge of the top or bottom circle. If it is a bottle, the camera is aligned with the edge of the bottom circle.
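A condensed sketch of what such a loop can look like on Android 2.1 is shown below, built on the platform's orientation sensor and vibrator. The tolerance value, class name, and simple thresholding are illustrative assumptions rather than the actual ShopMobile II implementation.

import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Vibrator;

/** Sketch of the haptic alignment loop: vibrate while the camera drifts out of the
 *  pitch/yaw reference planes captured when the user starts the loop. */
public class AlignmentLoop implements SensorEventListener {

    private static final float TOLERANCE_DEG = 5.0f; // illustrative misalignment threshold

    private final SensorManager sensors;
    private final Vibrator vibrator;
    private float refPitch, refYaw;
    private boolean referenceSet = false;

    public AlignmentLoop(Activity activity) {
        sensors = (SensorManager) activity.getSystemService(Context.SENSOR_SERVICE);
        vibrator = (Vibrator) activity.getSystemService(Context.VIBRATOR_SERVICE);
    }

    /** Called when the user clicks the touch screen with the camera held against the surface. */
    public void start() {
        referenceSet = false;
        sensors.registerListener(this,
                sensors.getDefaultSensor(Sensor.TYPE_ORIENTATION),
                SensorManager.SENSOR_DELAY_UI);
    }

    public void stop() { sensors.unregisterListener(this); }

    @Override public void onSensorChanged(SensorEvent event) {
        float yaw = event.values[0];   // azimuth
        float pitch = event.values[1];
        if (!referenceSet) {           // first reading defines the absolute pitch/yaw planes
            refYaw = yaw; refPitch = pitch; referenceSet = true;
            return;
        }
        boolean misaligned = Math.abs(pitch - refPitch) > TOLERANCE_DEG
                          || Math.abs(yaw - refYaw) > TOLERANCE_DEG;
        if (misaligned) {
            vibrator.vibrate(100);     // short pulse; repeats while the misalignment persists
        }
    }

    @Override public void onAccuracyChanged(Sensor sensor, int accuracy) { /* not used */ }
}

Note that yaw readings wrap around at 360 degrees; a production implementation would normalize the angular difference before thresholding.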
When the shopper reaches an aisle, she aligns the phone camera with a shelf and captures an image. If a barcode is decoded successfully and happens to be a shelf barcode, the system checks if it is the target barcode. If it is, the system verbally informs the shopper about it. If it is not, the system generates verbal directions to the target product (e.g., "Two shelves up, three barcodes left."). If the barcode cannot be decoded in the image, the system divides the image into sub-images that are likely to contain barcodes. Each sub-image is processed with a barcode decoder. If the barcode is decoded, the system goes into the verbal instruction mode. If the barcode is not decoded, the system assumes that the barcode is rotated by 90 degrees. This assumption is enforced by the interactive haptic alignment loop. The image is rotated by 90 degrees, and the entire process is repeated. If part of a barcode is detected, the system gives the user haptic feedback to move the camera slightly left, right, up, or down, depending on where the partial barcode is detected. When the system is unable to find the barcode in the image, it notifies the user through audio feedback (currently a ping) to take another image. The overall flow of control of the barcode localization and decoding method is given in Figure 3. If necessary, user feedback messages can be delivered in a variety of formats: haptic, audio, speech, or a combination thereof.
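The control flow just described can be condensed into the following sketch. All of the abstract helper methods are hypothetical stand-ins for the localization, decoding, and feedback components described above, not actual ShopMobile II APIs.

import android.graphics.Bitmap;
import java.util.Collections;
import java.util.List;

/** Sketch of the per-image control flow summarized in Figure 3. */
abstract class ScanController {

    abstract List<Bitmap> localizeBarcode(Bitmap image);   // Section 2 localization
    abstract String decodeRegions(List<Bitmap> regions);   // extended ZXing; null if nothing decoded
    abstract Bitmap rotate90(Bitmap image);
    abstract boolean isShelfBarcode(String barcode);       // MSI shelf barcode vs. product UPC
    abstract boolean correspondsToTargetProduct(String barcode);
    abstract String directionsToTarget(String scanned, String target);
    abstract void speak(String message);                   // verbal feedback
    abstract void playPing();                              // audio feedback: retake the image

    String processImage(Bitmap image, String targetShelfBarcode) {
        // 1. Try the decoder on the whole image first.
        String barcode = decodeRegions(Collections.singletonList(image));

        // 2. If that fails, localize candidate sub-images and decode them.
        if (barcode == null) {
            barcode = decodeRegions(localizeBarcode(image));
        }
        // 3. Still nothing: assume the barcode is rotated by 90 degrees and retry.
        if (barcode == null) {
            barcode = decodeRegions(localizeBarcode(rotate90(image)));
        }

        if (barcode == null) {
            playPing();                                    // ask the user to take another image
        } else if (isShelfBarcode(barcode)) {
            if (barcode.equals(targetShelfBarcode)) {
                speak("This is the target product.");
            } else {
                speak(directionsToTarget(barcode, targetShelfBarcode));
            }
        } else if (correspondsToTargetProduct(barcode)) {  // UPC on a box, can, or bottle
            speak("This is the target product.");
        } else {
            speak("The selected product is not the target product.");
        }
        return barcode;
    }
}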


4 A Laboratory Study

Before we describe our laboratory study, we briefly describe the target user and the target use case toward which we are working. We will call our target user Alice (name changed to protect her identity); she was a participant in one of our longitudinal accessible shopping experiments at Lee's Market Place. She is in her mid-twenties, a mother of two children, and a seasoned mobile phone user. Both she and her husband are completely blind and cannot help each other with grocery shopping. Alice has excellent orientation and mobility (O&M) skills, takes independent walks around her neighborhood, uses the Cache Valley transit system, and has a full-time job. Alice is a resident of Logan, Utah, and goes to Lee's Market Place several times a week. ShopMobile II is a mobile solution for individuals like Alice.

We are working toward the following use case. When Alice enters the supermarket, she picks up a shopping basket; she also has adequate cane skills to pull a shopping cart if she has to. Alice takes out her mobile phone and, if her shopping list is prepared, selects a product by menu browsing. The system tells Alice where the product is located and, if requested, generates a template-based route description. When Alice reaches the aisle, she aligns her phone camera with a shelf and takes a picture. If the camera is not vertically aligned with the shelf, the system uses the phone's orientation sensor to give Alice haptic (vibratory) feedback to align the camera. Alice then takes an image. If part of a barcode is detected, another vibratory signal requests Alice to move the phone to the left or right.

Figure 3: Barcode Localization and Decoding (flow of control: get image from camera and decode; if not decoded, localize and decode; if still not decoded, rotate the image and repeat; on success, branch on shelf vs. UPC barcode and on whether it corresponds to the target product).


When the shelf barcode is recognized, the system uses its database to give Alice verbal directions to the target product, e.g., "two shelves up, five barcodes right" or "this is aisle 5, the target product is in aisle 2." When the target shelf barcode is recognized, Alice reaches above the barcode and takes a product from the shelf. If the product's identity cannot be determined manually, e.g., by touch, smell, or shaking, the system allows Alice to interactively recognize its UPC barcode to verify the product's identity.

We approximated the target use case with a laboratory study. We ported all software to a Google Nexus One smart phone equipped with a five-megapixel camera. The phone runs Android 2.1 on a 1 GHz processor with 512 MB of RAM. A grocery store environment was simulated by assembling a shelf section in our laboratory that consisted of five shelves. Twelve empty boxes of real products were placed on three shelves.

Figure 5: Mean Values for Number of Scans to Retrieve and Verify Products for Each Participant.

We recruited one VI individual and four sighted individuals to test the system. The sighted individuals were blindfolded. Each participant was given a ten-minute training session in which the system was shown to him/her and was then asked to retrieve and verify ten randomly selected products. For each participant, the system logged the image associated with each scan, the result of the scan, and the method by which the barcode was decoded.
Figure 6: Median Values for Number of Scans to Retrieve and Verify Products for Each Participant.
All participants were able to retrieve all products successfully. Figure 5 shows the average number of scans for product retrieval and verification. It took a participant an average of 2.72 scans to retrieve a product and an average of 3.32 scans to verify a product. The standard deviations are 4.4 for product retrieval and 5.6 for product verification. The median values for product retrieval and product verification are shown in Figure 6.

In general, the median values for product retrieval are lower than the median values for product verification. For participant 5 (a sighted individual), the median value for product retrieval is 0. The system is designed so that the shopper has the option of picking up a product from the shelf without having to scan the MSI shelf barcode below it, because some shoppers can locate the target product by counting shelf barcodes by touch.

Figure 7 shows the distribution of scan results for all participants. ZXing alone decoded about 13% of all barcode scans. When used in conjunction with our barcode localization algorithm, it decoded another 16% of all barcode scans. Rotating the image before localizing and decoding the barcode yielded about 4% more decoded scans. Overall, barcode localization caused the number of decoded barcodes to increase from 38 to 97, an increase of over 155%.

Figure 7: Results of Barcode Scans (ZXing alone: 13%; localization: 16%; rotation and localization: 4%; not decoded: 68%).

In the informal post-experiment interviews, all participants said that our touch screen user interface was very simple to use but too basic. The interface consisted of a single button encompassing the entire touch screen of the Google Nexus One phone. The barcode decoding procedure was initiated either by finger tapping on any part of the screen or by tapping on the phone's trackball, a small joystick-like hardware component below the touch screen. All participants preferred tapping the screen to tapping the trackball, but screen tapping resulted in many accidental images with no barcodes present in them, which negatively affected the barcode localization and decoding performance. In our future work, we will design a more sophisticated approach (e.g., finger gestures) to eliminate accidental images.

5 Conclusion

We presented an eyes-free vision-based barcode localization and decoding method that enables visually impaired (VI) mobile phone users to decode MSI (Modified Plessey) barcodes on shelves and UPC barcodes on individual boxes, cans, and bottles. We showed how simple and efficient barcode localization and decoding techniques were augmented with an interactive haptic feedback loop that allows the VI user to align the phone's camera with a fixed surface in the pitch and yaw planes. Our method is implemented on a Google Nexus One smart phone running Android 2.1. We described a laboratory study in which the method was evaluated by one VI and four blindfolded sighted participants. All participants were able to retrieve and verify all the products successfully. Our touch screen user interface should be improved to eliminate accidental images. Experiments with VI individuals in a real grocery store are being planned.

Acknowledgments

This research has been supported, in part, through NSF grant IIS-0346880.
References

[1] Food Marketing Institute Research. The Food Retailing Industry Speaks 2006. Food Marketing Institute, 2006.

[2] Nicholson J., Kulyukin V., and Coster D. ShopTalk: Independent Blind Shopping Through Verbal Route Directions and Barcode Scans. The Open Rehabilitation Journal, ISSN 1874-9437, Vol. 2, 2009, pp. 11-23, DOI 10.2174/1874943700902010011.

[3] Nicholson J. and Kulyukin V. ShopTalk: Independent Blind Shopping = Verbal Route Directions + Barcode Scans. In Proceedings of the 30th Annual Conference of the Rehabilitation Engineering and Assistive Technology Society of North America (RESNA 2007), Phoenix, Arizona, June 2007. Available on-line and on CD-ROM.

[4] Janaswami K. ShopMobile: A Mobile Shopping Aid for Visually Impaired Individuals. M.S. Report, Department of Computer Science, Utah State University, Logan, UT, 2010 (to appear).

[5] Kulyukin V., Nicholson J., and Coster D. ShopTalk: Toward Independent Shopping by People with Visual Impairments. In Proceedings of the 8th ACM Conference on Computers and Accessibility (ASSETS 2008), Halifax, Canada, 2008.

[6] Normand N. and Viard-Gaudin C. A Two-Dimensional Bar Code Reader. Pattern Recognition, Vol. 3, pp. 201-203, 1994.

[7] Viard-Gaudin C., Normand N., and Barba D. Algorithm Using a Two-Dimensional Approach. In Proceedings of the Second International Conference on Document Analysis and Recognition, No. 20-22, pp. 45-48, October 1993.

[8] Ando S. and Hontanj H. Automatic Visual Searching and Reading of Barcodes in 3-D Scene. In Proceedings of the IEEE International Conference on Vehicle Electronics, No. 25-28, pp. 49-54, September 2001.

[9] Ando S. Image Field Categorization and Edge/Corner Detection from Gradient Covariance. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 22, No. 2, pp. 179-190, February 2000.

[10] Ando S. Consistent Gradient Operators. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 22, No. 3, pp. 252-265, March 2000.

[11] Muniz R., Junco L., and Otero A. A Robust Software Barcode Reader Using the Hough Transform. In Proceedings of the International Conference on Information Intelligence and Systems, No. 31, pp. 313-319, November 1999.

[12] Arnould S., Awcock G. J., and Thomas R. Remote Bar-code Localization Using Mathematical Morphology. Image Processing and Its Applications, Vol. 2, No. 465, pp. 642-646, 1999.

[13] The Zebra Crossing (ZXing) Barcode Decoding Library. http://code.google.com/p/zxing/

[14] Occipital, LLC. RedLaser. http://redlaser.com/

[15] Tekin E. and Coughlan J. M. An Algorithm Enabling Blind Users to Find and Read Barcodes. In WACV 2009, 2009.

[16] Kutiyanawala A., Qi X., and Tian J. A Simple and Efficient Approach to Barcode Localization. In Proceedings of the 7th International Conference on Information, Communication and Signal Processing, Macau, 2009.

[17] Fisher R., Perkins S., Walker A., and Wolfart E. Line Detection. http://homepages.inf.ed.ac.uk/rbf/HIPR2/linedet.htm

Skip Trie Matching for Real Time OCR Output Error Correction on Android Smart...
 
Vision-Based Localization & Text Chunking of Nutrition Fact Tables on Android...
Vision-Based Localization & Text Chunking of Nutrition Fact Tables on Android...Vision-Based Localization & Text Chunking of Nutrition Fact Tables on Android...
Vision-Based Localization & Text Chunking of Nutrition Fact Tables on Android...
 
Toward Blind Travel Support through Verbal Route Directions: A Path Inference...
Toward Blind Travel Support through Verbal Route Directions: A Path Inference...Toward Blind Travel Support through Verbal Route Directions: A Path Inference...
Toward Blind Travel Support through Verbal Route Directions: A Path Inference...
 
Wireless Indoor Localization with Dempster-Shafer Simple Support Functions
Wireless Indoor Localization with Dempster-Shafer Simple Support FunctionsWireless Indoor Localization with Dempster-Shafer Simple Support Functions
Wireless Indoor Localization with Dempster-Shafer Simple Support Functions
 
RFID in Robot-Assisted Indoor Navigation for the Visually Impaired
RFID in Robot-Assisted Indoor Navigation for the Visually ImpairedRFID in Robot-Assisted Indoor Navigation for the Visually Impaired
RFID in Robot-Assisted Indoor Navigation for the Visually Impaired
 

Último

Traditional Agroforestry System in India- Shifting Cultivation, Taungya, Home...
Traditional Agroforestry System in India- Shifting Cultivation, Taungya, Home...Traditional Agroforestry System in India- Shifting Cultivation, Taungya, Home...
Traditional Agroforestry System in India- Shifting Cultivation, Taungya, Home...jana861314
 
Nightside clouds and disequilibrium chemistry on the hot Jupiter WASP-43b
Nightside clouds and disequilibrium chemistry on the hot Jupiter WASP-43bNightside clouds and disequilibrium chemistry on the hot Jupiter WASP-43b
Nightside clouds and disequilibrium chemistry on the hot Jupiter WASP-43bSérgio Sacani
 
Stunning ➥8448380779▻ Call Girls In Panchshil Enclave Delhi NCR
Stunning ➥8448380779▻ Call Girls In Panchshil Enclave Delhi NCRStunning ➥8448380779▻ Call Girls In Panchshil Enclave Delhi NCR
Stunning ➥8448380779▻ Call Girls In Panchshil Enclave Delhi NCRDelhi Call girls
 
Discovery of an Accretion Streamer and a Slow Wide-angle Outflow around FUOri...
Discovery of an Accretion Streamer and a Slow Wide-angle Outflow around FUOri...Discovery of an Accretion Streamer and a Slow Wide-angle Outflow around FUOri...
Discovery of an Accretion Streamer and a Slow Wide-angle Outflow around FUOri...Sérgio Sacani
 
Call Us ≽ 9953322196 ≼ Call Girls In Mukherjee Nagar(Delhi) |
Call Us ≽ 9953322196 ≼ Call Girls In Mukherjee Nagar(Delhi) |Call Us ≽ 9953322196 ≼ Call Girls In Mukherjee Nagar(Delhi) |
Call Us ≽ 9953322196 ≼ Call Girls In Mukherjee Nagar(Delhi) |aasikanpl
 
G9 Science Q4- Week 1-2 Projectile Motion.ppt
G9 Science Q4- Week 1-2 Projectile Motion.pptG9 Science Q4- Week 1-2 Projectile Motion.ppt
G9 Science Q4- Week 1-2 Projectile Motion.pptMAESTRELLAMesa2
 
Physiochemical properties of nanomaterials and its nanotoxicity.pptx
Physiochemical properties of nanomaterials and its nanotoxicity.pptxPhysiochemical properties of nanomaterials and its nanotoxicity.pptx
Physiochemical properties of nanomaterials and its nanotoxicity.pptxAArockiyaNisha
 
Recombination DNA Technology (Nucleic Acid Hybridization )
Recombination DNA Technology (Nucleic Acid Hybridization )Recombination DNA Technology (Nucleic Acid Hybridization )
Recombination DNA Technology (Nucleic Acid Hybridization )aarthirajkumar25
 
Call Girls in Mayapuri Delhi 💯Call Us 🔝9953322196🔝 💯Escort.
Call Girls in Mayapuri Delhi 💯Call Us 🔝9953322196🔝 💯Escort.Call Girls in Mayapuri Delhi 💯Call Us 🔝9953322196🔝 💯Escort.
Call Girls in Mayapuri Delhi 💯Call Us 🔝9953322196🔝 💯Escort.aasikanpl
 
Raman spectroscopy.pptx M Pharm, M Sc, Advanced Spectral Analysis
Raman spectroscopy.pptx M Pharm, M Sc, Advanced Spectral AnalysisRaman spectroscopy.pptx M Pharm, M Sc, Advanced Spectral Analysis
Raman spectroscopy.pptx M Pharm, M Sc, Advanced Spectral AnalysisDiwakar Mishra
 
Spermiogenesis or Spermateleosis or metamorphosis of spermatid
Spermiogenesis or Spermateleosis or metamorphosis of spermatidSpermiogenesis or Spermateleosis or metamorphosis of spermatid
Spermiogenesis or Spermateleosis or metamorphosis of spermatidSarthak Sekhar Mondal
 
Types of different blotting techniques.pptx
Types of different blotting techniques.pptxTypes of different blotting techniques.pptx
Types of different blotting techniques.pptxkhadijarafiq2012
 
Presentation Vikram Lander by Vedansh Gupta.pptx
Presentation Vikram Lander by Vedansh Gupta.pptxPresentation Vikram Lander by Vedansh Gupta.pptx
Presentation Vikram Lander by Vedansh Gupta.pptxgindu3009
 
Orientation, design and principles of polyhouse
Orientation, design and principles of polyhouseOrientation, design and principles of polyhouse
Orientation, design and principles of polyhousejana861314
 
PossibleEoarcheanRecordsoftheGeomagneticFieldPreservedintheIsuaSupracrustalBe...
PossibleEoarcheanRecordsoftheGeomagneticFieldPreservedintheIsuaSupracrustalBe...PossibleEoarcheanRecordsoftheGeomagneticFieldPreservedintheIsuaSupracrustalBe...
PossibleEoarcheanRecordsoftheGeomagneticFieldPreservedintheIsuaSupracrustalBe...Sérgio Sacani
 
Work, Energy and Power for class 10 ICSE Physics
Work, Energy and Power for class 10 ICSE PhysicsWork, Energy and Power for class 10 ICSE Physics
Work, Energy and Power for class 10 ICSE Physicsvishikhakeshava1
 
GFP in rDNA Technology (Biotechnology).pptx
GFP in rDNA Technology (Biotechnology).pptxGFP in rDNA Technology (Biotechnology).pptx
GFP in rDNA Technology (Biotechnology).pptxAleenaTreesaSaji
 
Natural Polymer Based Nanomaterials
Natural Polymer Based NanomaterialsNatural Polymer Based Nanomaterials
Natural Polymer Based NanomaterialsAArockiyaNisha
 

Último (20)

Traditional Agroforestry System in India- Shifting Cultivation, Taungya, Home...
Traditional Agroforestry System in India- Shifting Cultivation, Taungya, Home...Traditional Agroforestry System in India- Shifting Cultivation, Taungya, Home...
Traditional Agroforestry System in India- Shifting Cultivation, Taungya, Home...
 
Nightside clouds and disequilibrium chemistry on the hot Jupiter WASP-43b
Nightside clouds and disequilibrium chemistry on the hot Jupiter WASP-43bNightside clouds and disequilibrium chemistry on the hot Jupiter WASP-43b
Nightside clouds and disequilibrium chemistry on the hot Jupiter WASP-43b
 
Stunning ➥8448380779▻ Call Girls In Panchshil Enclave Delhi NCR
Stunning ➥8448380779▻ Call Girls In Panchshil Enclave Delhi NCRStunning ➥8448380779▻ Call Girls In Panchshil Enclave Delhi NCR
Stunning ➥8448380779▻ Call Girls In Panchshil Enclave Delhi NCR
 
Discovery of an Accretion Streamer and a Slow Wide-angle Outflow around FUOri...
Discovery of an Accretion Streamer and a Slow Wide-angle Outflow around FUOri...Discovery of an Accretion Streamer and a Slow Wide-angle Outflow around FUOri...
Discovery of an Accretion Streamer and a Slow Wide-angle Outflow around FUOri...
 
Engler and Prantl system of classification in plant taxonomy
Engler and Prantl system of classification in plant taxonomyEngler and Prantl system of classification in plant taxonomy
Engler and Prantl system of classification in plant taxonomy
 
Call Us ≽ 9953322196 ≼ Call Girls In Mukherjee Nagar(Delhi) |
Call Us ≽ 9953322196 ≼ Call Girls In Mukherjee Nagar(Delhi) |Call Us ≽ 9953322196 ≼ Call Girls In Mukherjee Nagar(Delhi) |
Call Us ≽ 9953322196 ≼ Call Girls In Mukherjee Nagar(Delhi) |
 
G9 Science Q4- Week 1-2 Projectile Motion.ppt
G9 Science Q4- Week 1-2 Projectile Motion.pptG9 Science Q4- Week 1-2 Projectile Motion.ppt
G9 Science Q4- Week 1-2 Projectile Motion.ppt
 
Physiochemical properties of nanomaterials and its nanotoxicity.pptx
Physiochemical properties of nanomaterials and its nanotoxicity.pptxPhysiochemical properties of nanomaterials and its nanotoxicity.pptx
Physiochemical properties of nanomaterials and its nanotoxicity.pptx
 
Recombination DNA Technology (Nucleic Acid Hybridization )
Recombination DNA Technology (Nucleic Acid Hybridization )Recombination DNA Technology (Nucleic Acid Hybridization )
Recombination DNA Technology (Nucleic Acid Hybridization )
 
Call Girls in Mayapuri Delhi 💯Call Us 🔝9953322196🔝 💯Escort.
Call Girls in Mayapuri Delhi 💯Call Us 🔝9953322196🔝 💯Escort.Call Girls in Mayapuri Delhi 💯Call Us 🔝9953322196🔝 💯Escort.
Call Girls in Mayapuri Delhi 💯Call Us 🔝9953322196🔝 💯Escort.
 
Raman spectroscopy.pptx M Pharm, M Sc, Advanced Spectral Analysis
Raman spectroscopy.pptx M Pharm, M Sc, Advanced Spectral AnalysisRaman spectroscopy.pptx M Pharm, M Sc, Advanced Spectral Analysis
Raman spectroscopy.pptx M Pharm, M Sc, Advanced Spectral Analysis
 
Spermiogenesis or Spermateleosis or metamorphosis of spermatid
Spermiogenesis or Spermateleosis or metamorphosis of spermatidSpermiogenesis or Spermateleosis or metamorphosis of spermatid
Spermiogenesis or Spermateleosis or metamorphosis of spermatid
 
Types of different blotting techniques.pptx
Types of different blotting techniques.pptxTypes of different blotting techniques.pptx
Types of different blotting techniques.pptx
 
Presentation Vikram Lander by Vedansh Gupta.pptx
Presentation Vikram Lander by Vedansh Gupta.pptxPresentation Vikram Lander by Vedansh Gupta.pptx
Presentation Vikram Lander by Vedansh Gupta.pptx
 
9953056974 Young Call Girls In Mahavir enclave Indian Quality Escort service
9953056974 Young Call Girls In Mahavir enclave Indian Quality Escort service9953056974 Young Call Girls In Mahavir enclave Indian Quality Escort service
9953056974 Young Call Girls In Mahavir enclave Indian Quality Escort service
 
Orientation, design and principles of polyhouse
Orientation, design and principles of polyhouseOrientation, design and principles of polyhouse
Orientation, design and principles of polyhouse
 
PossibleEoarcheanRecordsoftheGeomagneticFieldPreservedintheIsuaSupracrustalBe...
PossibleEoarcheanRecordsoftheGeomagneticFieldPreservedintheIsuaSupracrustalBe...PossibleEoarcheanRecordsoftheGeomagneticFieldPreservedintheIsuaSupracrustalBe...
PossibleEoarcheanRecordsoftheGeomagneticFieldPreservedintheIsuaSupracrustalBe...
 
Work, Energy and Power for class 10 ICSE Physics
Work, Energy and Power for class 10 ICSE PhysicsWork, Energy and Power for class 10 ICSE Physics
Work, Energy and Power for class 10 ICSE Physics
 
GFP in rDNA Technology (Biotechnology).pptx
GFP in rDNA Technology (Biotechnology).pptxGFP in rDNA Technology (Biotechnology).pptx
GFP in rDNA Technology (Biotechnology).pptx
 
Natural Polymer Based Nanomaterials
Natural Polymer Based NanomaterialsNatural Polymer Based Nanomaterials
Natural Polymer Based Nanomaterials
 

Eyes-Free Barcode Localization and Decoding for Visually Impaired Mobile Phone Users

In 2007 - 2008, after two single subject studies at Lee's Market Place, an independently owned supermarket in Logan, UT, and one single subject study at Sweet Peas, a small independent natural foods store in Logan, UT, ten VI participants were recruited for a longitudinal formal study. The product database was extended to include 4,297 products. The experiment, performed during regular business hours, had each participant shop for the same set of three randomly chosen products five times. The product retrieval rate was 100%. All ten participants found all three products in every run (see [2] for detailed ANOVA analysis and experiment design). A key finding of ShopTalk is that all participants in the sample could execute verbal route directions in supermarkets and product search instructions triggered by barcode scans with 100% accuracy.

In 2008 - 2009, ShopTalk was ported onto the Nokia E70 smartphone running Symbian OS 9.1. The smartphone is connected to a small Bluetooth barcode scanner (Baracoda Pencil2) [4]. The port, called ShopMobile, was tested in a laboratory study with six sighted blindfolded participants. The experiment had eighty shelf barcodes obtained from the USU bookstore placed on shelves assembled into 3 aisles where aisles 1 and 3 had only one side while aisle 2 had two sides. Each participant successfully retrieved a set of four products.
We are currently developing ShopMobile II, the next version of ShopMobile. Like its direct predecessors, ShopTalk [5] and ShopMobile [4], ShopMobile II uses two data structures to represent the shopping environment. The first data structure is a topological map, which is a directed graph whose nodes are decision points: the store entrance, aisle entrances, and cashier lane entrances. The edges are labeled with directions. Due to the regularity of modern supermarkets, we found it sufficient to have three directional labels: left, right, and forward. The topological map is built at installation time by walking through the store, noting decision points of interest, and then representing them in the map. The second data structure, called the Barcode Connectivity Matrix (BCM), is designed to take advantage of the inventory control systems used by many grocery stores. These inventory systems place barcodes on the shelves immediately beneath every product type. Shelf barcodes assist the store personnel in managing the product inventory. Once the locations of all shelf barcodes in the store are known, this information is used to guide the shopper through to the target shelf section, because products are located directly above their corresponding shelf barcodes. More information on how the BCM can be computed from the inventory control databases is available in [2].
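To make the two data structures concrete, the sketch below shows one plausible in-memory representation of a topological map with left/right/forward edges and of a barcode-to-location table standing in for the BCM. It is an illustration only: the class and field names (DecisionPoint, ShelfLocation, BarcodeConnectivityMatrix) and the location fields are assumptions, not the actual ShopMobile II classes or the BCM schema derived from the inventory control databases.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch only; not the ShopMobile II source.
// A topological map is a directed graph whose nodes are decision points and
// whose edges carry one of three directional labels.
enum Turn { LEFT, RIGHT, FORWARD }

class DecisionPoint {
    final String name;  // e.g. "store entrance", "aisle 5 entrance"
    final Map<Turn, DecisionPoint> edges = new HashMap<Turn, DecisionPoint>();

    DecisionPoint(String name) { this.name = name; }

    void connect(Turn label, DecisionPoint next) { edges.put(label, next); }
}

// A minimal stand-in for the Barcode Connectivity Matrix: each MSI shelf barcode
// is mapped to an assumed set of location fields (aisle, aisle side, shelf, and
// position on the shelf) so that relative directions between barcodes can be computed.
class ShelfLocation {
    final int aisle, side, shelf, position;

    ShelfLocation(int aisle, int side, int shelf, int position) {
        this.aisle = aisle; this.side = side; this.shelf = shelf; this.position = position;
    }
}

class BarcodeConnectivityMatrix {
    private final Map<String, ShelfLocation> locations = new HashMap<String, ShelfLocation>();

    void put(String msiBarcode, ShelfLocation location) { locations.put(msiBarcode, location); }

    ShelfLocation lookup(String msiBarcode) { return locations.get(msiBarcode); }
}
```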
Unlike its predecessors, ShopMobile II no longer requires a barcode scanner, because it relies exclusively on computer vision techniques to recognize MSI barcodes on shelves and UPC barcodes on individual products. Vision-based barcode localization and decoding is a well-known research problem. Algorithms have been proposed that look for line sets with mono-oriented gradients to locate the barcode region [6, 7]. Other algorithms extract connected regions with mono-oriented texture to perform local barcode searches in cluttered 3-D scenes [8, 9, 10]. Hough transform-based methods extract barcode lines by assuming robustness against orientation and size [11]. Various morphological filters and self-learning networks have also been tried [12]. One common disadvantage of these algorithms is that they are very time-consuming and require external servers for image processing. Recently, several barcode decoding solutions for camera phones have been proposed [13, 14]. However, these solutions target sighted users in that they require that the phone camera be carefully aligned and centered on a barcode for the decoding to work. An accessible barcode localization and decoding method is proposed in [15]. However, unlike the method presented in this paper, it has not yet been implemented on a mobile phone. We wholeheartedly endorse and commend these R&D efforts. Our approach is based on the hypothesis that sophisticated vision techniques may not be needed to make barcode localization and decoding accessible to VI mobile phone users, because simple and efficient vision techniques can be augmented with interactive user interfaces that ensure that images have certain properties.

In this paper, we present an interactive haptic feedback loop that allows the VI mobile phone user to align the camera with a fixed surface in the pitch and yaw planes, which guarantees that the captured image is taken with the mobile phone parallel to the surface. The remainder of the paper is organized as follows. In Section 2, we describe our barcode localization and decoding method. In Section 3, we present an interactive haptic camera alignment loop that enables the VI mobile phone user to align the phone's camera with fixed surfaces in the pitch and yaw planes. In Section 4, we give brief descriptions of the target user and the target use case toward which we are working. We also present a laboratory study with one VI individual and four sighted blindfolded participants. In Section 5, we offer our conclusions.

2 Eyes-Free Barcode Localization and Decoding

A barcode can be viewed as a homogeneous region consisting of alternating black and white lines condensed in a small area. In ShopMobile II, barcodes are characterized as image regions that have the following two properties.
We call the first property alternating frequency and define it as the number of black to white and white to black transitions along the x-axis. We call the second property vertical continuity and define it as the continuity of black and white lines along the y-axis. As an example, consider two one-pixel wide lines A and B placed on the barcode as shown in the left part of Figure 1. These lines can be encoded as bit strings where black pixels map to zeros and white pixels map to ones. The alternating frequency is the number of 0-1 and 1-0 transitions in the bit string. Vertical continuity is measured as the longest common subsequence of the two bit strings. Other string matching techniques can certainly be used provided that there is no impact on the overall efficiency.

Figure 1: One Pixel Wide Lines on the Barcode.

For efficiency, the 1024 by 768 image is reduced to 320 by 240 pixels. Each pixel, a packed integer containing the pixel's alpha and RGB components, is mapped to a grayscale value Y between 0 and 255, where Y = 0.3 × R + 0.59 × G + 0.11 × B. Since the Google Nexus One's camera has a provision to take a grayscale image, this step reduces to setting Y = G for efficiency. In our previous work [16], we used a Gabor filter to localize barcodes in images. In the current implementation, the Gabor filter is replaced with a much more efficient line detection filter that convolves the kernel shown in Figure 2 with the entire image. This kernel is inspired by the vertical line detection kernel based on a 3 × 3 matrix discussed in [17]. We experimentally extended the kernel to a 13 × 3 matrix with decreasing coefficients along the y-axis to make it less sensitive to noise in the image. Given an m × n matrix, our filter can be generated as follows:

    f[i][j] = ((m + 1)/2 - |i|) × (n² - 1)/4             if j = 0
    f[i][j] = -((m + 1)/2 - |i|) × ((n + 1)/2 - |j|)     if j ≠ 0

where -m/2 ≤ i ≤ m/2 and -n/2 ≤ j ≤ n/2.

Figure 2: Line Detection Filter Kernel.
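The kernel can be generated directly from the two cases of the formula above. The sketch below is ours, not the ShopMobile II source; for m = 13 and n = 3 it produces the 13 × 3 kernel described in the text, where each row sums to zero and the outermost rows reduce to the -1, 2, -1 pattern of the kernel in [17].

```java
// Sketch: generate the m x n line detection kernel from the piecewise formula above.
// Indices run i = -m/2 .. m/2 (rows) and j = -n/2 .. n/2 (columns); m and n are odd.
public class LineDetectionKernel {

    public static double[][] generate(int m, int n) {
        double[][] f = new double[m][n];
        for (int i = -m / 2; i <= m / 2; i++) {
            for (int j = -n / 2; j <= n / 2; j++) {
                double rowWeight = (m + 1) / 2.0 - Math.abs(i);  // decreases away from the center row
                double value;
                if (j == 0) {
                    value = rowWeight * (n * n - 1) / 4.0;                // center column is positive
                } else {
                    value = -rowWeight * ((n + 1) / 2.0 - Math.abs(j));   // off-center columns are negative
                }
                f[i + m / 2][j + n / 2] = value;
            }
        }
        return f;
    }

    public static void main(String[] args) {
        // Print the 13 x 3 kernel described in the text (three columns assumed here).
        double[][] kernel = generate(13, 3);
        for (double[] row : kernel) {
            System.out.printf("%6.1f %6.1f %6.1f%n", row[0], row[1], row[2]);
        }
    }
}
```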
Since the Google Nexus One’s crease of 42 images or 46.15% over the number of images camera has a provision to take a greyscale image, the above decoded by ZXing alone. The software for this experiment step reduced to setting Y = G for efficiency. In our previ- was implemented using Java SE and Netbeans and ran on a ous work [16], we used a Gabor filter to localize barcodes Dell Optiplex 960 Core2Duo machine running at 3GHz. 3
3 Interactive Camera Alignment

The shopper always takes a picture of a fixed surface after the phone's camera is aligned with the surface in the pitch and yaw planes. The alignment is carried out through an iterative haptic control loop. If the product is a box, the user is instructed to find the top of the box, i.e. the side with an opening tab, by touch, and then align the camera with the bottom, i.e. the opposite side. This is because on most boxes sold in the U.S. the UPC barcode is on the bottom. Once the bottom is found, the shopper aligns the camera with an edge of that side and clicks a button on the touch screen to start the alignment loop. The current readings of the phone's orientation sensor are taken to define the absolute pitch and yaw planes. The shopper then slowly moves the camera away from the surface. If the camera is misaligned in the pitch or yaw planes, specific haptic signals (vibrations) are issued to re-align it. The signals persist until the camera is aligned. The user is instructed to stop moving the camera away from the surface when she thinks that the camera is approximately 10 cm from the surface. Then the picture is taken by a click of a button on the touch screen. If the product is a can, the shopper aligns the camera with the edge of the top or bottom circle. If it is a bottle, the camera is aligned with the edge of the bottom circle.
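A minimal Android sketch of the kind of alignment loop described above is given below, assuming the orientation sensor and vibrator APIs available on Android 2.1: the first orientation reading after the user starts the loop defines the reference pitch and yaw planes, and the phone vibrates whenever subsequent readings drift outside a tolerance. The class name, the 5-degree tolerance, and the pulse length are assumptions, not values taken from ShopMobile II.

```java
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Vibrator;

// Illustrative sketch only (not the ShopMobile II source). Requires the
// android.permission.VIBRATE permission; azimuth wrap-around at 0/360 degrees
// is ignored for brevity.
public class AlignmentLoop implements SensorEventListener {
    private static final float TOLERANCE_DEGREES = 5f;  // assumed tolerance, not from the paper

    private final SensorManager sensorManager;
    private final Vibrator vibrator;
    private float referencePitch, referenceYaw;
    private boolean referenceCaptured = false;

    public AlignmentLoop(Context context) {
        sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
    }

    public void start() {
        Sensor orientation = sensorManager.getDefaultSensor(Sensor.TYPE_ORIENTATION);
        sensorManager.registerListener(this, orientation, SensorManager.SENSOR_DELAY_UI);
    }

    public void stop() {
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float yaw = event.values[0];    // azimuth
        float pitch = event.values[1];
        if (!referenceCaptured) {
            // The first reading after the user presses the button defines the absolute planes.
            referenceYaw = yaw;
            referencePitch = pitch;
            referenceCaptured = true;
            return;
        }
        boolean misaligned = Math.abs(pitch - referencePitch) > TOLERANCE_DEGREES
                          || Math.abs(yaw - referenceYaw) > TOLERANCE_DEGREES;
        if (misaligned) {
            vibrator.vibrate(100);  // short pulse; repeats on each reading until re-aligned
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { /* not used */ }
}
```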
When the shopper reaches an aisle, she aligns the phone camera with a shelf and captures an image. If a barcode is decoded successfully and happens to be a shelf barcode, the system checks whether it is the target barcode. If it is, the system verbally informs the shopper about it. If it is not, the system generates verbal directions to the target product (e.g. "Two shelves up, three barcodes left."). If the barcode cannot be decoded in the image, the system divides the image into sub-images that are likely to contain barcodes. Each sub-image is processed with a barcode decoder. If the barcode is decoded, the system goes into the verbal instruction mode. If the barcode is not decoded, the system assumes that the barcode is rotated by 90 degrees. This assumption is enforced by the interactive haptic alignment loop. The image is rotated by 90 degrees, and the entire process is repeated. If part of a barcode is detected, the system gives the user haptic feedback to move the camera slightly left, right, up, or down, depending on where the partial barcode is detected. When the system is unable to find the barcode in the image, it notifies the user through audio feedback (currently a ping) to take another image. The overall flow of control of the barcode localization and decoding method is given in Figure 3. If necessary, user feedback messages can be delivered in a variety of formats: haptic, audio, speech, or a combination thereof.

Figure 3: Barcode Localization and Decoding. (Flowchart: get an image from the camera; localize and decode the barcode; if decoding fails, rotate the image and retry; if a barcode is decoded, branch on whether it is a UPC or a shelf barcode and on whether it corresponds to the target product; then confirm the product, provide directions to the target product, or ask the user to try again.)

Figure 4: Accessible Barcode Localization and Decoding Method. (Flowchart: downscale the image; apply the line detection filter; perform the rasterized search; extract the barcode from the original image.)
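The decision logic summarized in Figure 3 can be sketched as a simple controller. Every type and helper below (Image, Barcode, Decoder, Feedback, ScanController) is a hypothetical placeholder for the components described in this paper, not the actual ShopMobile II API.

```java
// Illustrative control loop following the flow in Figure 3; all types are placeholders.
interface Image { Image rotate90(); }

interface Barcode {
    boolean isShelf();   // MSI shelf barcode vs. UPC product barcode
    String value();
}

interface Decoder { Barcode decode(Image image); }   // localization + decoding; null if not decoded

interface Feedback {
    void speak(String message);
    void ping();                                     // audio cue: take another picture
}

final class ScanController {
    private final Decoder decoder;
    private final Feedback feedback;
    private final String targetShelfBarcode;
    private final String targetUpc;

    ScanController(Decoder decoder, Feedback feedback, String targetShelfBarcode, String targetUpc) {
        this.decoder = decoder; this.feedback = feedback;
        this.targetShelfBarcode = targetShelfBarcode; this.targetUpc = targetUpc;
    }

    void onImageCaptured(Image image) {
        Barcode barcode = decoder.decode(image);
        if (barcode == null) {
            barcode = decoder.decode(image.rotate90());  // assume a 90-degree rotation and retry
        }
        if (barcode == null) {
            feedback.ping();                             // ask the user to take another image
        } else if (barcode.isShelf()) {
            if (barcode.value().equals(targetShelfBarcode)) {
                feedback.speak("Target shelf barcode found. The product is directly above.");
            } else {
                feedback.speak(directionsToTarget(barcode));  // e.g. "Two shelves up, three barcodes left."
            }
        } else if (barcode.value().equals(targetUpc)) {
            feedback.speak("This is the selected product.");
        } else {
            feedback.speak("This is not the target product. Please try again.");
        }
    }

    private String directionsToTarget(Barcode current) {
        // Would be computed from the Barcode Connectivity Matrix; omitted here.
        return "Directions to the target product.";
    }
}
```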
4 A Laboratory Study

Before we describe our laboratory study, we will briefly describe the target user and the target use case toward which we are working. We will call our target user Alice (name changed to protect identity); she was a participant in one of our longitudinal accessible shopping experiments in Lee's Market Place. She is in her mid-twenties, a mother of two children, and a seasoned mobile phone user. Both she and her husband are completely blind and cannot help each other with grocery shopping. Alice has excellent orientation and mobility (O&M) skills, takes independent walks around her neighborhood, uses the Cache Valley transit system, and has a full-time job. Alice is a resident of Logan, Utah, and goes to Lee's Market Place several times a week. ShopMobile II is a mobile solution for individuals like Alice.

We are working toward the following use case. When Alice enters the supermarket, she picks up a shopping basket. Alice also has adequate cane skills to pull a shopping cart if she has to. Alice takes out her mobile phone and, if her shopping list is prepared, selects a product by menu browsing. The system tells Alice where the product is located and, if requested, generates a template-based route description. When Alice reaches the aisle, she aligns her phone camera with a shelf and takes a picture. If the camera is not vertically aligned to the shelf, the system uses the phone's orientation sensor to give Alice haptic (vibratory) feedback to align the camera. Alice then takes an image. If part of a barcode is detected, another vibratory signal requests Alice to move the phone to the left (right). When the shelf barcode is recognized, the system uses its database to give Alice verbal directions to the target product, e.g. "two shelves up, five barcodes right" or "this is aisle 5, target product is in aisle 2". When the target shelf barcode is recognized, Alice reaches above the barcode and takes a product from the shelf. If the product's identity cannot be determined manually, e.g. by touch, smell, or shake, the system allows Alice to interactively recognize its UPC barcode to verify the product's identity.
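The template-based verbal directions in the use case above (e.g. "two shelves up, five barcodes right") can in principle be generated from the offset between the scanned shelf barcode and the target barcode as recorded in the BCM. The sketch below is a guess at such a template generator; the method name, parameters, and exact phrasing are illustrative, not the actual ShopMobile II direction generator.

```java
// Illustrative template-based direction generator; the offsets would come from the
// Barcode Connectivity Matrix (positive shelfDelta = up, positive positionDelta = right).
final class DirectionTemplates {

    static String directions(int currentAisle, int targetAisle, int shelfDelta, int positionDelta) {
        if (currentAisle != targetAisle) {
            return "This is aisle " + currentAisle + ", target product is in aisle " + targetAisle + ".";
        }
        if (shelfDelta == 0 && positionDelta == 0) {
            return "This is the target shelf barcode.";
        }
        StringBuilder sb = new StringBuilder();
        if (shelfDelta != 0) {
            sb.append(Math.abs(shelfDelta)).append(shelfDelta > 0 ? " shelves up" : " shelves down");
        }
        if (positionDelta != 0) {
            if (sb.length() > 0) sb.append(", ");
            sb.append(Math.abs(positionDelta)).append(positionDelta > 0 ? " barcodes right" : " barcodes left");
        }
        return sb.append(".").toString();
    }
}
```

For example, directions(5, 5, 2, -3) would produce "2 shelves up, 3 barcodes left."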
We approximated the target use case with a laboratory study. We ported all software to a Google Nexus One smart phone equipped with a five megapixel camera. The phone runs Android 2.1 on a 1 GHz processor with 512 MB of RAM. A grocery store environment was simulated by assembling a shelf section in our laboratory that consisted of five shelves. Twelve empty boxes of real products were placed on three shelves.

We recruited one VI individual and four sighted individuals to test the system. The sighted individuals were blindfolded. Each participant was given a ten minute training session in which the system was shown to him or her; each was then asked to retrieve and verify ten randomly selected products. For each participant, the system logged the image associated with each scan, the result of the scan, and the method by which the barcode was decoded.

All participants were able to retrieve all products successfully. Figure 5 shows the average number of scans for product retrieval and verification. It took an average of 2.72 scans for a participant to retrieve a product and an average of 3.32 scans to verify a product. The STD values are 4.4 for product retrieval and 5.6 for product verification. The median values for product retrieval and product verification are shown in Figure 6.

Figure 5: Mean Values for Number of Scans to Retrieve and Verify Products for Each Participant. (Bar chart: number of scans, 0 to 5, for finding and verifying a product, Participants 1 through 5.)

Figure 6: Median Values for Number of Scans to Retrieve and Verify Products for Each Participant. (Bar chart: number of scans, 0 to 3.5, for finding and verifying a product, Participants 1 through 5.)

In general, the median values for product retrieval are lower than the median values for product verification. For participant 5 (a sighted individual), the median value for product retrieval is 0. The system is designed so that the shopper has the option of picking up a product from the shelf without having to scan the MSI shelf barcode below it, because some shoppers can locate the target product by counting shelf barcodes by touch.

Figure 7 shows the distribution of the results of the scans for all participants. ZXing alone decoded about 13% of all barcode scans. When used in conjunction with our barcode localization algorithm, it decoded another 16% of all barcode scans. Rotating the image before localizing and decoding the barcode resulted in about 4% more barcode scans being decoded. Barcode localization caused the number of barcodes decoded to increase from 38 to 97, an increase of over 155%.

Figure 7: Results of Barcode Scans. (Pie chart: ZXing alone 13%, localization 16%, rotation and localization 4%, not decoded 68%.)

In the informal post-experiment interviews, all participants said that our touch screen user interface was very simple to use but too basic. The interface consisted of a single button encompassing the entire touch screen of the Google Nexus One phone.
The barcode decoding procedure was initiated either by finger tapping on any part of the screen or by tapping on the phone's trackball, a small joystick-like hardware component below the touch screen. All participants preferred tapping the screen to tapping the trackball, which resulted in many accidental images with no barcodes present in them and negatively affected the barcode localization and decoding performance. In our future work, we will design a more sophisticated approach (e.g. finger gestures) to eliminate accidental images.

5 Conclusion

We presented an eyes-free vision-based barcode localization and decoding method that enables visually impaired (VI) mobile phone users to decode MSI (Modified Plessy) barcodes on shelves and UPC barcodes on individual boxes, cans, and bottles. We showed how simple and efficient barcode localization and decoding techniques were augmented with an interactive haptic feedback loop that allows the VI user to align the phone's camera with a fixed surface in the pitch and yaw planes. Our method is implemented on a Google Nexus One smart phone running Android 2.1. We described a laboratory study in which the method was evaluated by one VI and four blindfolded sighted participants. All participants were able to retrieve and verify all the products successfully. Our touch screen user interface should be improved to eliminate accidental images. Experiments are being planned with VI individuals in a real grocery store.

Acknowledgments

This research has been supported, in part, through NSF grant IIS-0346880.
References

[1] Food Marketing Institute Research. The Food Retailing Industry Speaks 2006. Food Marketing Institute, 2006.

[2] Nicholson, J., Kulyukin, V., and Coster, D. ShopTalk: Independent Blind Shopping Through Verbal Route Directions and Barcode Scans. The Open Rehabilitation Journal, ISSN: 1874-9437, Volume 2, 2009, pp. 11-23, DOI 10.2174/1874943700902010011.

[3] Nicholson, J. and Kulyukin, V. ShopTalk: Independent Blind Shopping = Verbal Route Directions + Barcode Scans. In Proceedings of the 30th Annual Conference of the Rehabilitation Engineering and Assistive Technology Society of North America (RESNA 2007), June 2007, Phoenix, Arizona. Available online and on CD-ROM.

[4] Janaswami, K. ShopMobile: A Mobile Shopping Aid for Visually Impaired Individuals. M.S. Report, Department of Computer Science, Utah State University, Logan, UT, 2010, to appear.

[5] Kulyukin, V., Nicholson, J., and Coster, D. ShopTalk: Toward Independent Shopping by People with Visual Impairments. In Proceedings of the 8th ACM Conference on Computers and Accessibility (ASSETS 2008), Halifax, Canada.

[6] Normand, N. and Viard-Gaudin, C. A Two-Dimensional Bar Code Reader. Pattern Recognition, Vol. 3, pp. 201-203, 1994.

[7] Viard-Gaudin, C., Normand, N., and Barba, D. Algorithm Using a Two-Dimensional Approach. In Proceedings of the Second International Conference on Document Analysis and Recognition, No. 20-22, pp. 45-48, October 1993.

[8] Ando, S. and Hontani, H. Automatic Visual Searching and Reading of Barcodes in 3-D Scene. In Proceedings of the IEEE International Conference on Vehicle Electronics, No. 25-28, pp. 49-54, September 2001.

[9] Ando, S. Image Field Categorization and Edge/Corner Detection from Gradient Covariance. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 22, No. 2, pp. 179-190, February 2000.

[10] Ando, S. Consistent Gradient Operators. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 22, No. 3, pp. 252-265, March 2000.

[11] Muniz, R., Junco, L., and Otero, A. A Robust Software Barcode Reader Using the Hough Transform. In Proceedings of the International Conference on Information Intelligence and Systems, No. 31, pp. 313-319, November 1999.

[12] Arnould, S., Awcock, G. J., and Thomas, R. Remote Bar-code Localization Using Mathematical Morphology. Image Processing and its Applications, Vol. 2, No. 465, pp. 642-646, 1999.

[13] The Zebra Crossing (ZXing) Barcode Decoding Library. http://code.google.com/p/zxing/

[14] Occipital, LLC. RedLaser. http://redlaser.com/

[15] Tekin, E. and Coughlan, J. M. An Algorithm Enabling Blind Users to Find and Read Barcodes. WACV09, 2009.

[16] Kutiyanawala, A., Qi, X., and Tian, J. A Simple and Efficient Approach to Barcode Localization. 7th International Conference on Information, Communication and Signal Processing, Macau, 2009.

[17] Fisher, R., Perkins, S., Walker, A., and Wolfart, E. Line Detection. http://homepages.inf.ed.ac.uk/rbf/HIPR2/linedet.htm