RASPBERRY PI BASED SMART WALKING STICK FOR VISUALLY IMPAIRED PERSON

Ravikumar S1, Krishnakumar M2, Karthikeyan S3, Balamuguran N4
1Assistant Professor (Sel.G), Department of IT, SRM Valliammai Engineering College, Tamilnadu, India
2,3,4UG Students, Department of IT, SRM Valliammai Engineering College, Tamilnadu, India
---------------------------------------------------------------------***----------------------------------------------------------------
Abstract - To provide smart electronic assistance for blind individuals, a smart system concept has been designed. People who are blind or visually challenged have difficulty finding their way around. The Raspberry Pi-based system is designed to give artificial vision and object identification, as well as real-time support through GPS. In this project, we use the Raspberry Pi to create a smart system for blind people that includes a camera module, a switch, and a GPS module. If the user is in distress, the system determines the location using GPS and sends a message to that person's WhatsApp. The system includes a GPS module whose feedback is delivered as audio, with the voice output generated by TTS (text to speech). The proposed system identifies objects in the user's environment and gives feedback in the form of speech, warning messages delivered through an earphone, and GPS navigation to a specified area. The overall goal of the system is to deliver a low-cost, high-efficiency navigation and text-to-voice aid for the blind that provides a sense of artificial vision by supplying information about the static and dynamic objects in the environment.
Key Words: Artificial vision and object identification, TTS (text to speech), GPS, low-cost, high-efficiency navigation and text-to-voice.
1. INTRODUCTION
Individuals with visual impairments are those whose eyesight, even in the better eye, is too poor to make out fine detail. A person whose visual acuity is 6/60 or worse, or whose visual field is restricted to 20 degrees or less in the better eye, is considered visually impaired, and such people often live in a world where they are largely reliant on others. Individuals with this sensory disability find it difficult to discern even the smallest details that sighted people perceive easily. Those whose acuity remains at 6/60 or worse, or whose field of view is limited to 20 degrees or less even with the best possible correction, are referred to as blind.
According to the World Health Organization's global data on visual impairment, an estimated 285 million people of all ages are visually impaired, 39 million of whom are blind. About 80% of people who are blind are over the age of 50. Uncorrected refractive errors (43 percent) and cataract (33 percent) are the leading causes of visual impairment, and cataract is the leading cause of blindness (51 percent). Because humans receive about 83 percent of the information about their surroundings through sight, vision is the most crucial of the senses. People with these conditions face great hardship in trying to live a normal life. They often become socially isolated and allow themselves to be neglected. The impairment has a significant impact on their ability to make a living and can leave them feeling hopeless.
As a result, we devised a system for those living with these conditions. In recent years, a number of inventors have created devices to assist visually impaired persons. Traditional accessibility devices for visually handicapped people have their own disadvantages. Some devices come with a separate power supply or navigator that the user must carry in a pocket when travelling outdoors, and the profusion of designs is likely to confuse the user. Several attempts have been made to construct blind-guidance or obstacle-measuring systems using only a small number of components. It is therefore worthwhile to design and build a walking stick that combines all of these functions, such as real-time object identification, voice direction and navigation, at a cost affordable to all persons who require assistance.
2. LITERATURE REVIEW
1. A Survey of Voice-Assisted Electronic Sticks for the Visually Impaired. Young Ho and Sung Jae Kang are the authors.
Description - From this article we learned about technologies such as the Global Positioning System (GPS) and the Global System for Mobile Communications (GSM), which help track a person's location and can be utilised in the development of
a smart stick module for visually impaired persons, as well as in delivering voice messages sent from an Android phone to a blind person. [14]
2. Image Processing and Embedded System for Blind Navigation. Sacinah Jamaludin and Zul Azizi Hailani are the authors. Description - This paper inspired us to create a navigation system that aids the mobility of blind people. The study captures a live video feed of the scene in front of the blind person, and this live video can be viewed by the administrator. [4]
3. Smart Cane: Visually Impaired People's Assistive Cane. Mohd Helmy Wahab, Amirul A. Talib and others are the authors. Description - We got the concept of voice messages and vibration from this paper. When the smart stick detects an impediment, the blind person becomes aware of it by interpreting the vibration alert and voice message that arrive from the smartphone. [3]
4. Electronic Path Guidance for People with Visual Impairment. Iwan Ulrich and Johann Borenstein are the authors. Description - From this paper, we learned about the range required for identifying an obstacle or object from the location of the smart stick. We need to define a threshold value; if the obstacle falls within that range it can be detected successfully, otherwise it cannot. [4]
5. Development of an Electronic Travel Aid Using Ultrasonic Sensors. Alex Harold and Chris Gearhart are the authors. Description - From this article, we learned that some processing is required in order to capture video images. Using various algorithms and methods, we process the captured image, and live monitoring of the person can also be viewed on the admin side. All processed data is saved in serialised form on the server. [5]
6. Automated Mobility and Orientation System for People Who Are Blind or Partially Blind. Abdel Ilah Nour Alshbatat is the author. Description - From this study we learned about GSM, GPS, and sensors such as integrated ultrasonic sensors and accelerometers. [14]
7. A Guide to Navigation for the Blind - For the sight-impaired, taking each step without encountering an impediment is a significant struggle in today's fast-paced world. The visually impaired can use a smart assistant such as 3S (Sensory Supervision Spectacles) to read images, avoid obstacles, and track routes. This study therefore proposes a framework that can assist visually impaired people in their daily lives. Some of the most significant difficulties include moving without assistance, reading text or visuals, and detecting impediments. Text recognition using Optical Character Recognition (OCR), speech synthesis using Text-To-Speech (TTS), and obstacle detection using ultrasonic sensors (HC-SR04) and a GPS tracker are all possible with the proposed 3S. The device is a voice-activated framework that would assist visually impaired people in their daily tasks.
3. PROPOSED WORK
The proposed system offers a camera-based assisted reading capability to help blind people read text labels and product packaging on everyday items. The text that the user needs to read is captured as an image by a small camera and sent to the image processing platform. Tesseract OCR is used to recognise the text in the acquired image, and the eSpeak engine converts the detected text into vocal output. The system is lightweight and portable thanks to a built-in battery backup. It provides the user with a safer environment as well as a sense of independence, allowing them to enjoy a more regular life. The Raspberry Pi 3B, GPS module, voice command module, moisture sensor unit, and ultrasonic sensor unit are all used in this smart navigation stick. The voice command module is the highlight, since it gives users dependable voice command support. The GPS module is also important, since it offers users safe and precise navigation.
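The paper does not include code for the GPS module, but a minimal sketch of how the stick might read its position is given below. It assumes a serial NMEA GPS receiver on /dev/serial0 at 9600 baud and the pyserial and pynmea2 packages; the port name, baud rate and function name are illustrative, not taken from the paper.

```python
# Hedged sketch: reading latitude/longitude from a serial GPS module.
# Assumptions (not specified in the paper): the module streams NMEA
# sentences on /dev/serial0 at 9600 baud; pyserial and pynmea2 are installed.
import serial
import pynmea2

def read_location(port="/dev/serial0", baudrate=9600):
    """Return (latitude, longitude) from the first valid GGA sentence."""
    with serial.Serial(port, baudrate, timeout=1) as gps:
        while True:
            line = gps.readline().decode("ascii", errors="ignore").strip()
            if line.startswith("$GPGGA"):
                msg = pynmea2.parse(line)
                if msg.latitude and msg.longitude:
                    return msg.latitude, msg.longitude

if __name__ == "__main__":
    lat, lon = read_location()
    print(f"Current position: {lat:.6f}, {lon:.6f}")
```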
Block Diagram of the Proposed System
3.1. ADVANTAGES:
The system is efficient and inexpensive. By receiving messages, we can keep track of the blind person's whereabouts. Character recognition is performed automatically. The prototype assists blind people by reading printed text on hand-held objects. We proposed a motion-based method that observes the object for a few seconds in order to detect it.
3.2. PROBLEM IDENTIFICATION:
Understanding the precise context of the symbolic expressions used by deaf and mute people is a difficult task in real life unless it is adequately described. Webcams can occasionally struggle to determine the desired skin colour due to lighting and contrast. Because the background colour of the tracking environment and the skin colour are so similar, the SLR (sign language recognition) module receives unexpected pixels.
4. ARCHITECTURE DIAGRAM
METHODOLOGY
Image to text is performed with the Tesseract algorithm and text to audio with the eSpeak algorithm. An ultrasonic sensor is connected for detecting obstacles, and the blind person is updated through a headset. The camera takes the picture and the location of the area is detected.
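The methodology above can be summarised in a short sketch. This is an illustrative outline only, assuming OpenCV for camera capture, pytesseract for OCR, and the espeak command-line synthesizer for audio; the function names and the 100 cm trigger distance are assumptions rather than values given in the paper.

```python
# Hedged sketch of the overall flow: when the ultrasonic sensor reports a
# nearby obstacle, capture a frame, run OCR, and speak the result through
# the headset. capture_frame() and on_obstacle() are illustrative names.
import subprocess
import cv2
import pytesseract

OBSTACLE_THRESHOLD_CM = 100  # assumed trigger distance

def capture_frame(device=0):
    """Grab a single frame from the USB/Pi camera via OpenCV."""
    cam = cv2.VideoCapture(device)
    ok, frame = cam.read()
    cam.release()
    if not ok:
        raise RuntimeError("camera capture failed")
    return frame

def speak(text):
    """Send text to the eSpeak synthesizer (audio goes to the headset)."""
    subprocess.run(["espeak", text], check=False)

def on_obstacle(distance_cm):
    if distance_cm < OBSTACLE_THRESHOLD_CM:
        speak(f"Obstacle ahead at {int(distance_cm)} centimetres")
        frame = capture_frame()
        text = pytesseract.image_to_string(frame).strip()
        if text:
            speak(text)
```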
5.1 TESSERACT ALGORITHM:
Tesseract is an open-source optical character recognition engine and one of the most popular and accurate OCR libraries. Tesseract finds patterns in pixels, letters, words and sentences, and is able to accurately decipher and extract text from a variety of sources. As its name suggests, our module uses an updated version of the open-source Tesseract OCR tool. We also automatically binarize and preprocess images before recognition, so that Tesseract has an easier time deciphering them.
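As a rough illustration of the binarization step mentioned above, the sketch below applies Otsu thresholding with OpenCV before passing the image to pytesseract. The paper does not name a specific binarization method, so this is only one plausible choice, and the file name is illustrative.

```python
# Hedged sketch: binarize an image before handing it to Tesseract.
import cv2
import pytesseract

def ocr_with_binarization(image_path):
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    # Otsu's method picks the threshold automatically, giving Tesseract a
    # clean black-and-white image to decipher.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return pytesseract.image_to_string(binary)

print(ocr_with_binarization("label.jpg"))  # "label.jpg" is an example file name
```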
5.2 E-SPEAK ALGORITHM:
eSpeak is an open-source text-to-speech synthesizer that can be invoked from the Linux command line. It is a compact speech synthesizer that supports English and many other languages. The OCR engine can be used with its existing layout analysis to recognize text within a large document, or in conjunction with an external text detector to recognize text from an image of a single text line. The eSpeak package provides a few useful variations on the default voice used to speak out the text.
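A minimal sketch of invoking eSpeak from Python is given below. The -v (voice/language), -s (speed) and -p (pitch) flags are standard eSpeak command-line options; the wrapper function itself is illustrative.

```python
# Hedged sketch: calling the espeak command-line synthesizer from Python.
import subprocess

def speak(text, voice="en", speed=150, pitch=50):
    # -v selects the voice/language, -s the speed in words per minute,
    # -p the pitch (0-99); audio plays through the default output device.
    subprocess.run(
        ["espeak", "-v", voice, "-s", str(speed), "-p", str(pitch), text],
        check=False,
    )

speak("Obstacle detected two metres ahead")
speak("Bienvenue", voice="fr")  # eSpeak also supports many other languages
```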
5. HARDWARE AND SOFTWARE COMPONENTS
Hardware Requirements: Raspberry Pi, web camera, speaker, etc.
Software Requirements: Raspbian OS, Python and OpenCV.
Raspberry Pi board
A. It pairs easily with an Arduino, and a great deal can be done with it. The Raspberry Pi is available in two models: Model A and Model B. The two are quite similar, with Model B having a few more advanced features than Model A. Model B features 512 MB of RAM and two USB ports, while Model A has only 256 MB of RAM and one USB port. Furthermore, Model B features an Ethernet interface, whereas Model A does not.
B. SD Card Slot: The Raspberry Pi does not have a real hard drive as in a laptop or desktop computer; an SD card acts as the solid-state drive (SSD) used to install the operating system and all other software, and to store everything. The card must be inserted into the slot in order to use the Raspberry Pi. The SD card may be 2 GB, 4 GB or 16 GB.
C. Micro USB Power: The power port is a 5 V micro-USB input, and the supply should be exactly 5 V because the board has no onboard power regulator. The supply voltage therefore should not exceed 5 V.
D. HDMI Out: This output port is used to connect the Raspberry Pi to a monitor via HDMI (High Definition Multimedia Interface). Any screen or TV with an HDMI connector can therefore be attached to it.
E. GPIO Headers (Pins): A GPIO pin is a general-purpose input/output pin. These pins are used to connect the Raspberry Pi to a variety of physical expansions. Pre-installed libraries on the Raspberry Pi allow the pins to be accessed from programming languages such as C, C++ or Python.
F. Ultrasonic Sensor: An ultrasonic sensor is a type of sensor that detects an object using sound waves. It works on the same principle as radar or sonar, generating and receiving high-frequency sound waves. The sensor determines the distance of an object by measuring the time between transmitting the signal and receiving its echo.
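A hedged sketch of this time-of-flight measurement for an HC-SR04-style sensor is shown below; the BCM pin numbers are assumed for illustration and are not specified in the paper.

```python
# Hedged sketch: HC-SR04 style distance measurement on the Raspberry Pi.
import time
import RPi.GPIO as GPIO

TRIG, ECHO = 23, 24  # assumed wiring, not from the paper

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def measure_distance_cm():
    # A 10 microsecond trigger pulse starts an ultrasonic burst.
    GPIO.output(TRIG, True)
    time.sleep(0.00001)
    GPIO.output(TRIG, False)
    pulse_start = pulse_end = time.time()
    # Time how long the echo pin stays high, then convert to distance
    # (speed of sound ~343 m/s, divided by 2 for the round trip).
    while GPIO.input(ECHO) == 0:
        pulse_start = time.time()
    while GPIO.input(ECHO) == 1:
        pulse_end = time.time()
    return (pulse_end - pulse_start) * 34300 / 2

print(f"Distance: {measure_distance_cm():.1f} cm")
GPIO.cleanup()
```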
6. SMART FLOW CHART
The flowchart shows the details of the process that takes place in the smart walking stick. A simplified algorithm for the smart stick is given below.
7. FUNCTIONAL DESCRIPTION
An ultrasonic sensor is a device that detects the presence of ultrasonic waves. The ultrasonic sensor is used to calculate the distance to an object or to detect potholes. It produces elastic waves with a frequency above 20,000 Hz, which can propagate in solids, liquids and gases.
The ultrasonic sensors, which send ultrasound pulses and estimate distance, are fitted on three sides of the walking stick for object detection.
Another sensor is mounted beneath the smart walking stick to identify potholes. Its alert threshold is chosen to be greater than the normal stick-to-ground distance; as a result, when a pothole appears, the measured distance becomes greater than normal and the user is alerted.
The pulse reflection method, which counts the reflected pulses returned to the sensor, is used to calculate distance in the block diagram [5].
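The pothole check described above can be expressed as a simple threshold comparison; the baseline and margin values below are illustrative assumptions, not figures from the paper.

```python
# Hedged sketch: flag a pothole when the downward-facing sensor reads a
# distance noticeably larger than the normal stick-to-ground baseline.
GROUND_BASELINE_CM = 80   # assumed normal stick-to-ground reading
POTHOLE_MARGIN_CM = 15    # assumed extra depth treated as a pothole

def is_pothole(distance_cm: float) -> bool:
    """True when the downward reading exceeds the ground baseline by the margin."""
    return distance_cm > GROUND_BASELINE_CM + POTHOLE_MARGIN_CM

# Example readings from the downward-facing sensor:
print(is_pothole(102.0))  # True: ground is further away than expected
print(is_pothole(81.0))   # False: within the normal range
```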
A. Raspberry Pi
The brain of the Raspberry Pi board is the Central Processing Unit, which carries out the computer's instructions through logical and mathematical operations. The Raspberry Pi's Ethernet connection serves as the primary interface for communicating with other devices; the Ethernet port is used to connect the Raspberry Pi to the home network and the internet [9].
B. ARM11 processor
The Raspberry Pi uses an ARM11-series processor. ARM11 is a family of older 32-bit RISC ARM processors. Its operating frequency is 335 MHz and its power consumption is about 0.4 mW/MHz. It provides ARM TrustZone technology for on-chip security, low power consumption and a high-performance integer core. The GPU is a specialized chip on the Raspberry Pi board designed to speed up image-processing calculations.
C. Object identification
The process of object identification is carried out using digital image processing technology with the OpenCV (cv2) computer vision library. We feed the fundamental structures of items such as stones, vehicles and persons into this process; if any variation is detected by the ultrasonic sensors, the camera turns on to take an image, which is compared to the pre-fed photographs to identify the object [11].
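The paper only states that the captured image is compared with pre-fed photographs; one plausible way to realize this with OpenCV is ORB feature matching, sketched below. This is not necessarily the authors' method, and the match-distance cut-off is a guess.

```python
# Hedged sketch: compare a captured frame against pre-fed reference images
# using ORB features and brute-force matching in OpenCV.
import cv2

orb = cv2.ORB_create()
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def best_match(frame_gray, references):
    """references: dict mapping a label (e.g. "stone") to a grayscale image."""
    _, frame_desc = orb.detectAndCompute(frame_gray, None)
    best_label, best_score = None, 0
    for label, ref in references.items():
        _, ref_desc = orb.detectAndCompute(ref, None)
        if frame_desc is None or ref_desc is None:
            continue
        matches = matcher.match(frame_desc, ref_desc)
        good = [m for m in matches if m.distance < 40]  # cut-off is a guess
        if len(good) > best_score:
            best_label, best_score = label, len(good)
    return best_label  # e.g. "stone", "vehicle", "person", or None
```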
D. Optical character recognition
Tesseract, supporting software for the Raspberry Pi, is used to perform OCR (optical character recognition). The image acquired with the camera is first converted to black and white, and then an edge detection procedure is used to find the edges in the image in order to locate the distinct letters. To determine the correct characters in the image, these letters are compared against pre-fetched characters from the supported languages [7].
E. Switch
A toggle switch is used to enable the ultrasonic sensor for obstacle detection, pothole detection, or text reading. To get the smart walking stick's required output, the user must operate the switch.
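A minimal sketch of reading such a toggle switch with the RPi.GPIO library is shown below; the pin number and the mapping of switch states to modes are illustrative assumptions.

```python
# Hedged sketch: select the stick's operating mode from a toggle switch.
import RPi.GPIO as GPIO

MODE_PIN = 17  # assumed pin for the toggle switch

GPIO.setmode(GPIO.BCM)
GPIO.setup(MODE_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)

def current_mode():
    """Text reading when the switch pulls the pin low, obstacle detection otherwise."""
    return "text_reading" if GPIO.input(MODE_PIN) == GPIO.LOW else "obstacle_detection"

print(current_mode())
```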
8. CONCLUSION AND DISCUSSION:
In this work, we have described a prototype system to read printed text and hand-held objects in order to assist blind people. To extract text regions from complex backgrounds, we have proposed a novel text localization algorithm based on models of stroke orientation and edge distributions. The corresponding feature maps estimate the global structural feature of text at every pixel. Block patterns project the feature maps of an image patch into a feature vector. Adjacent character grouping is performed to calculate candidate text patches prepared for text classification. An AdaBoost learning model is used to localize text in camera-captured images. OCR is used to perform word recognition on the localized text regions, and the result is turned into audio output for blind users. Throughout this work, the camera acts as the input to the system. Once the Raspberry Pi board is powered, the camera starts streaming, and the streaming data is displayed on the screen using the interface application. When the item to be read is placed in front of the camera, the capture button is clicked to supply the image to the board. Using the Tesseract library, the image is converted into text data; the text detected in the image is shown on the status bar, and the obtained text is then pronounced through the audio output.
9. REFERENCES:
[1] X. Chen and A. L. Yuille, “Detecting and reading text in natural scenes,” in Proc. Comput. Vision Pattern Recognit., 2004, vol. 2, pp. II-366–II-373.
[2] S. Kumar, R. Gupta, N. Khanna, S. Chaudhury, and S. D. Joshi, “Text extraction and document image segmentation using matched wavelets and MRF model,” IEEE Trans. Image Process., vol. 16, no. 8, pp. 2117–2128, Aug. 2007.
[3] K. Kim, K. Jung, and J. Kim, “Texture-based approach for text detection in images using support vector machines and continuously adaptive mean shift algorithm,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 25, no. 12, pp. 1631–1639, Dec. 2003.
[4] N. Giudice and G. Legge, “Blind navigation and the role of technology,” in The Engineering Handbook of Smart Technology for Aging, Disability, and Independence, A. A. Helal, M. Mokhtari, and B. Abdulrazak, Eds. Hoboken, NJ, USA: Wiley, 2008.
[5] World Health Organization, “10 facts about blindness and visual impairment,” 2009.
[6] Advance Data Reports from the National Health Interview Survey, 2008.
[7] International Workshop on Camera-Based Document Analysis and Recognition (CBDAR 2005, 2007, 2009, 2011).
[8] X. Chen, J. Yang, J. Zhang, and A. Waibel, “Automatic detection and recognition of signs from natural scenes,” IEEE Trans. Image Process., vol. 13, no. 1, pp. 87–99, Jan. 2004.
[9] D. Dakopoulos and N. G. Bourbakis, “Wearable obstacle avoidance electronic travel aids for blind: A survey,” IEEE Trans. Syst., Man, Cybern., vol. 40, no. 1, pp. 25–35, Jan. 2010.
[10] B. Epshtein, E. Ofek, and Y. Wexler, “Detecting text in natural scenes with stroke width transform,” in Proc. Comput. Vision Pattern Recognit., 2010, pp. 2963–2970.
[11] Y. Freund and R. Schapire, “Experiments with a new boosting algorithm,” in Proc. Int. Conf. Machine Learning, 1996, pp. 148–156.
[12] “An overview of the Tesseract OCR (optical character recognition) engine, and its possible enhancement for use in Wales in a pre-competitive research stage,” Language Technologies Unit (Canolfan Bedwyr), Bangor University, April 2008.
[13] A. Shahab, F. Shafait, and A. Dengel, “ICDAR 2011 robust reading competition Challenge 2: Reading text in scene images,” in Proc. Int. Conf. Document Anal. Recognit., 2011, pp. 1491–1496.
[14] KReader Mobile User Guide, knfb Reading Technology Inc., 2008.
[15] S. M. Lucas, “ICDAR 2005 text locating competition results,” in Proc. Int. Conf. Document Anal. Recognit., 2005, vol. 1, pp. 80–84.

Recomendados

IRJET- A Survey on Indoor Navigation for Blind People por
IRJET- A Survey on Indoor Navigation for Blind PeopleIRJET- A Survey on Indoor Navigation for Blind People
IRJET- A Survey on Indoor Navigation for Blind PeopleIRJET Journal
58 vistas3 diapositivas
ALTERNATE EYES FOR BLIND advanced wearable for visually impaired people por
ALTERNATE EYES FOR BLIND advanced wearable for visually impaired peopleALTERNATE EYES FOR BLIND advanced wearable for visually impaired people
ALTERNATE EYES FOR BLIND advanced wearable for visually impaired peopleIRJET Journal
5 vistas6 diapositivas
IRJET- Healthy Beat por
IRJET- Healthy BeatIRJET- Healthy Beat
IRJET- Healthy BeatIRJET Journal
30 vistas6 diapositivas
SMART BLIND STICK USING VOICE MODULE por
SMART BLIND STICK USING VOICE MODULESMART BLIND STICK USING VOICE MODULE
SMART BLIND STICK USING VOICE MODULEIRJET Journal
28 vistas7 diapositivas
IRJET- Assistant Systems for the Visually Impaired por
IRJET- Assistant Systems for the Visually ImpairedIRJET- Assistant Systems for the Visually Impaired
IRJET- Assistant Systems for the Visually ImpairedIRJET Journal
44 vistas3 diapositivas
IoT Based Assistive Glove for Visually Impaired People por
IoT Based Assistive Glove for Visually Impaired PeopleIoT Based Assistive Glove for Visually Impaired People
IoT Based Assistive Glove for Visually Impaired PeopleIRJET Journal
23 vistas7 diapositivas

Más contenido relacionado

Similar a RASPBERRY PI BASED SMART WALKING STICK FOR VISUALLY IMPAIRED PERSON

IRJET - Smart E – Cane for the Visually Challenged and Blind using ML Con... por
IRJET -  	  Smart E – Cane for the Visually Challenged and Blind using ML Con...IRJET -  	  Smart E – Cane for the Visually Challenged and Blind using ML Con...
IRJET - Smart E – Cane for the Visually Challenged and Blind using ML Con...IRJET Journal
11 vistas4 diapositivas
IRJET - For(E)Sight :A Perceptive Device to Assist Blind People por
IRJET -  	  For(E)Sight :A Perceptive Device to Assist Blind PeopleIRJET -  	  For(E)Sight :A Perceptive Device to Assist Blind People
IRJET - For(E)Sight :A Perceptive Device to Assist Blind PeopleIRJET Journal
25 vistas6 diapositivas
IRJET- Wide Angle View for Visually Impaired por
IRJET- Wide Angle View for Visually ImpairedIRJET- Wide Angle View for Visually Impaired
IRJET- Wide Angle View for Visually ImpairedIRJET Journal
22 vistas4 diapositivas
IRJET- A Leading Hand for the Blind –A Review por
IRJET- A Leading Hand for the Blind –A ReviewIRJET- A Leading Hand for the Blind –A Review
IRJET- A Leading Hand for the Blind –A ReviewIRJET Journal
24 vistas4 diapositivas
Design and Development of Smart Wheelchair for Physically Disable people por
Design and Development of Smart Wheelchair for Physically Disable peopleDesign and Development of Smart Wheelchair for Physically Disable people
Design and Development of Smart Wheelchair for Physically Disable peopleIRJET Journal
6 vistas6 diapositivas
An Assistive System for Visually Impaired People por
An Assistive System for Visually Impaired PeopleAn Assistive System for Visually Impaired People
An Assistive System for Visually Impaired PeopleIRJET Journal
15 vistas3 diapositivas

Similar a RASPBERRY PI BASED SMART WALKING STICK FOR VISUALLY IMPAIRED PERSON(20)

IRJET - Smart E – Cane for the Visually Challenged and Blind using ML Con... por IRJET Journal
IRJET -  	  Smart E – Cane for the Visually Challenged and Blind using ML Con...IRJET -  	  Smart E – Cane for the Visually Challenged and Blind using ML Con...
IRJET - Smart E – Cane for the Visually Challenged and Blind using ML Con...
IRJET Journal11 vistas
IRJET - For(E)Sight :A Perceptive Device to Assist Blind People por IRJET Journal
IRJET -  	  For(E)Sight :A Perceptive Device to Assist Blind PeopleIRJET -  	  For(E)Sight :A Perceptive Device to Assist Blind People
IRJET - For(E)Sight :A Perceptive Device to Assist Blind People
IRJET Journal25 vistas
IRJET- Wide Angle View for Visually Impaired por IRJET Journal
IRJET- Wide Angle View for Visually ImpairedIRJET- Wide Angle View for Visually Impaired
IRJET- Wide Angle View for Visually Impaired
IRJET Journal22 vistas
IRJET- A Leading Hand for the Blind –A Review por IRJET Journal
IRJET- A Leading Hand for the Blind –A ReviewIRJET- A Leading Hand for the Blind –A Review
IRJET- A Leading Hand for the Blind –A Review
IRJET Journal24 vistas
Design and Development of Smart Wheelchair for Physically Disable people por IRJET Journal
Design and Development of Smart Wheelchair for Physically Disable peopleDesign and Development of Smart Wheelchair for Physically Disable people
Design and Development of Smart Wheelchair for Physically Disable people
IRJET Journal6 vistas
An Assistive System for Visually Impaired People por IRJET Journal
An Assistive System for Visually Impaired PeopleAn Assistive System for Visually Impaired People
An Assistive System for Visually Impaired People
IRJET Journal15 vistas
IRJET- Indoor Shopping System for Visually Impaired People por IRJET Journal
IRJET- Indoor Shopping System for Visually Impaired PeopleIRJET- Indoor Shopping System for Visually Impaired People
IRJET- Indoor Shopping System for Visually Impaired People
IRJET Journal41 vistas
DRISHTI – A PORTABLE PROTOTYPE FOR VISUALLY IMPAIRED por IRJET Journal
DRISHTI – A PORTABLE PROTOTYPE FOR VISUALLY IMPAIREDDRISHTI – A PORTABLE PROTOTYPE FOR VISUALLY IMPAIRED
DRISHTI – A PORTABLE PROTOTYPE FOR VISUALLY IMPAIRED
IRJET Journal15 vistas
Ieeepro techno solutions ieee 2014 embedded project talking assistance ab... por srinivasanece7
Ieeepro techno solutions    ieee  2014 embedded project talking assistance ab...Ieeepro techno solutions    ieee  2014 embedded project talking assistance ab...
Ieeepro techno solutions ieee 2014 embedded project talking assistance ab...
srinivasanece7496 vistas
Paper_39-SRAVIP_Smart_Robot_Assistant_for_Visually_Impaired.pdf por SathishRadcliffe
Paper_39-SRAVIP_Smart_Robot_Assistant_for_Visually_Impaired.pdfPaper_39-SRAVIP_Smart_Robot_Assistant_for_Visually_Impaired.pdf
Paper_39-SRAVIP_Smart_Robot_Assistant_for_Visually_Impaired.pdf
SathishRadcliffe7 vistas
Smart Gloves for Blind por IRJET Journal
Smart Gloves for BlindSmart Gloves for Blind
Smart Gloves for Blind
IRJET Journal1.2K vistas
Design and implementation of smart guided glass for visually impaired people por IJECEIAES
Design and implementation of smart guided glass for visually  impaired peopleDesign and implementation of smart guided glass for visually  impaired people
Design and implementation of smart guided glass for visually impaired people
IJECEIAES15 vistas
Blind Stick Using Ultrasonic Sensor with Voice announcement and GPS tracking por vivatechijri
Blind Stick Using Ultrasonic Sensor with Voice announcement and GPS trackingBlind Stick Using Ultrasonic Sensor with Voice announcement and GPS tracking
Blind Stick Using Ultrasonic Sensor with Voice announcement and GPS tracking
vivatechijri788 vistas
Design and Development of a Prototype Assistive Mobility Solution for the Vis... por IRJET Journal
Design and Development of a Prototype Assistive Mobility Solution for the Vis...Design and Development of a Prototype Assistive Mobility Solution for the Vis...
Design and Development of a Prototype Assistive Mobility Solution for the Vis...
IRJET Journal39 vistas
Smart Navigation Assistance System for Blind People por IRJET Journal
Smart Navigation Assistance System for Blind PeopleSmart Navigation Assistance System for Blind People
Smart Navigation Assistance System for Blind People
IRJET Journal21 vistas
Smart Stick for Blind People with Live Video Feed por IRJET Journal
Smart Stick for Blind People with Live Video FeedSmart Stick for Blind People with Live Video Feed
Smart Stick for Blind People with Live Video Feed
IRJET Journal234 vistas
IRJET- Smart Assistive Device for Visually Impaired por IRJET Journal
IRJET- Smart Assistive Device for Visually ImpairedIRJET- Smart Assistive Device for Visually Impaired
IRJET- Smart Assistive Device for Visually Impaired
IRJET Journal30 vistas
Smart Cane for Blind Person Assisted with Android Application and Save Our So... por Dr. Amarjeet Singh
Smart Cane for Blind Person Assisted with Android Application and Save Our So...Smart Cane for Blind Person Assisted with Android Application and Save Our So...
Smart Cane for Blind Person Assisted with Android Application and Save Our So...
Dr. Amarjeet Singh92 vistas

Más de IRJET Journal

SOIL STABILIZATION USING WASTE FIBER MATERIAL por
SOIL STABILIZATION USING WASTE FIBER MATERIALSOIL STABILIZATION USING WASTE FIBER MATERIAL
SOIL STABILIZATION USING WASTE FIBER MATERIALIRJET Journal
25 vistas7 diapositivas
Sol-gel auto-combustion produced gamma irradiated Ni1-xCdxFe2O4 nanoparticles... por
Sol-gel auto-combustion produced gamma irradiated Ni1-xCdxFe2O4 nanoparticles...Sol-gel auto-combustion produced gamma irradiated Ni1-xCdxFe2O4 nanoparticles...
Sol-gel auto-combustion produced gamma irradiated Ni1-xCdxFe2O4 nanoparticles...IRJET Journal
8 vistas7 diapositivas
Identification, Discrimination and Classification of Cotton Crop by Using Mul... por
Identification, Discrimination and Classification of Cotton Crop by Using Mul...Identification, Discrimination and Classification of Cotton Crop by Using Mul...
Identification, Discrimination and Classification of Cotton Crop by Using Mul...IRJET Journal
8 vistas5 diapositivas
“Analysis of GDP, Unemployment and Inflation rates using mathematical formula... por
“Analysis of GDP, Unemployment and Inflation rates using mathematical formula...“Analysis of GDP, Unemployment and Inflation rates using mathematical formula...
“Analysis of GDP, Unemployment and Inflation rates using mathematical formula...IRJET Journal
13 vistas11 diapositivas
MAXIMUM POWER POINT TRACKING BASED PHOTO VOLTAIC SYSTEM FOR SMART GRID INTEGR... por
MAXIMUM POWER POINT TRACKING BASED PHOTO VOLTAIC SYSTEM FOR SMART GRID INTEGR...MAXIMUM POWER POINT TRACKING BASED PHOTO VOLTAIC SYSTEM FOR SMART GRID INTEGR...
MAXIMUM POWER POINT TRACKING BASED PHOTO VOLTAIC SYSTEM FOR SMART GRID INTEGR...IRJET Journal
14 vistas6 diapositivas
Performance Analysis of Aerodynamic Design for Wind Turbine Blade por
Performance Analysis of Aerodynamic Design for Wind Turbine BladePerformance Analysis of Aerodynamic Design for Wind Turbine Blade
Performance Analysis of Aerodynamic Design for Wind Turbine BladeIRJET Journal
7 vistas5 diapositivas

Más de IRJET Journal(20)

SOIL STABILIZATION USING WASTE FIBER MATERIAL por IRJET Journal
SOIL STABILIZATION USING WASTE FIBER MATERIALSOIL STABILIZATION USING WASTE FIBER MATERIAL
SOIL STABILIZATION USING WASTE FIBER MATERIAL
IRJET Journal25 vistas
Sol-gel auto-combustion produced gamma irradiated Ni1-xCdxFe2O4 nanoparticles... por IRJET Journal
Sol-gel auto-combustion produced gamma irradiated Ni1-xCdxFe2O4 nanoparticles...Sol-gel auto-combustion produced gamma irradiated Ni1-xCdxFe2O4 nanoparticles...
Sol-gel auto-combustion produced gamma irradiated Ni1-xCdxFe2O4 nanoparticles...
IRJET Journal8 vistas
Identification, Discrimination and Classification of Cotton Crop by Using Mul... por IRJET Journal
Identification, Discrimination and Classification of Cotton Crop by Using Mul...Identification, Discrimination and Classification of Cotton Crop by Using Mul...
Identification, Discrimination and Classification of Cotton Crop by Using Mul...
IRJET Journal8 vistas
“Analysis of GDP, Unemployment and Inflation rates using mathematical formula... por IRJET Journal
“Analysis of GDP, Unemployment and Inflation rates using mathematical formula...“Analysis of GDP, Unemployment and Inflation rates using mathematical formula...
“Analysis of GDP, Unemployment and Inflation rates using mathematical formula...
IRJET Journal13 vistas
MAXIMUM POWER POINT TRACKING BASED PHOTO VOLTAIC SYSTEM FOR SMART GRID INTEGR... por IRJET Journal
MAXIMUM POWER POINT TRACKING BASED PHOTO VOLTAIC SYSTEM FOR SMART GRID INTEGR...MAXIMUM POWER POINT TRACKING BASED PHOTO VOLTAIC SYSTEM FOR SMART GRID INTEGR...
MAXIMUM POWER POINT TRACKING BASED PHOTO VOLTAIC SYSTEM FOR SMART GRID INTEGR...
IRJET Journal14 vistas
Performance Analysis of Aerodynamic Design for Wind Turbine Blade por IRJET Journal
Performance Analysis of Aerodynamic Design for Wind Turbine BladePerformance Analysis of Aerodynamic Design for Wind Turbine Blade
Performance Analysis of Aerodynamic Design for Wind Turbine Blade
IRJET Journal7 vistas
Heart Failure Prediction using Different Machine Learning Techniques por IRJET Journal
Heart Failure Prediction using Different Machine Learning TechniquesHeart Failure Prediction using Different Machine Learning Techniques
Heart Failure Prediction using Different Machine Learning Techniques
IRJET Journal7 vistas
Experimental Investigation of Solar Hot Case Based on Photovoltaic Panel por IRJET Journal
Experimental Investigation of Solar Hot Case Based on Photovoltaic PanelExperimental Investigation of Solar Hot Case Based on Photovoltaic Panel
Experimental Investigation of Solar Hot Case Based on Photovoltaic Panel
IRJET Journal3 vistas
Metro Development and Pedestrian Concerns por IRJET Journal
Metro Development and Pedestrian ConcernsMetro Development and Pedestrian Concerns
Metro Development and Pedestrian Concerns
IRJET Journal2 vistas
Mapping the Crashworthiness Domains: Investigations Based on Scientometric An... por IRJET Journal
Mapping the Crashworthiness Domains: Investigations Based on Scientometric An...Mapping the Crashworthiness Domains: Investigations Based on Scientometric An...
Mapping the Crashworthiness Domains: Investigations Based on Scientometric An...
IRJET Journal3 vistas
Data Analytics and Artificial Intelligence in Healthcare Industry por IRJET Journal
Data Analytics and Artificial Intelligence in Healthcare IndustryData Analytics and Artificial Intelligence in Healthcare Industry
Data Analytics and Artificial Intelligence in Healthcare Industry
IRJET Journal3 vistas
DESIGN AND SIMULATION OF SOLAR BASED FAST CHARGING STATION FOR ELECTRIC VEHIC... por IRJET Journal
DESIGN AND SIMULATION OF SOLAR BASED FAST CHARGING STATION FOR ELECTRIC VEHIC...DESIGN AND SIMULATION OF SOLAR BASED FAST CHARGING STATION FOR ELECTRIC VEHIC...
DESIGN AND SIMULATION OF SOLAR BASED FAST CHARGING STATION FOR ELECTRIC VEHIC...
IRJET Journal62 vistas
Efficient Design for Multi-story Building Using Pre-Fabricated Steel Structur... por IRJET Journal
Efficient Design for Multi-story Building Using Pre-Fabricated Steel Structur...Efficient Design for Multi-story Building Using Pre-Fabricated Steel Structur...
Efficient Design for Multi-story Building Using Pre-Fabricated Steel Structur...
IRJET Journal12 vistas
Development of Effective Tomato Package for Post-Harvest Preservation por IRJET Journal
Development of Effective Tomato Package for Post-Harvest PreservationDevelopment of Effective Tomato Package for Post-Harvest Preservation
Development of Effective Tomato Package for Post-Harvest Preservation
IRJET Journal4 vistas
“DYNAMIC ANALYSIS OF GRAVITY RETAINING WALL WITH SOIL STRUCTURE INTERACTION” por IRJET Journal
“DYNAMIC ANALYSIS OF GRAVITY RETAINING WALL WITH SOIL STRUCTURE INTERACTION”“DYNAMIC ANALYSIS OF GRAVITY RETAINING WALL WITH SOIL STRUCTURE INTERACTION”
“DYNAMIC ANALYSIS OF GRAVITY RETAINING WALL WITH SOIL STRUCTURE INTERACTION”
IRJET Journal5 vistas
Understanding the Nature of Consciousness with AI por IRJET Journal
Understanding the Nature of Consciousness with AIUnderstanding the Nature of Consciousness with AI
Understanding the Nature of Consciousness with AI
IRJET Journal12 vistas
Augmented Reality App for Location based Exploration at JNTUK Kakinada por IRJET Journal
Augmented Reality App for Location based Exploration at JNTUK KakinadaAugmented Reality App for Location based Exploration at JNTUK Kakinada
Augmented Reality App for Location based Exploration at JNTUK Kakinada
IRJET Journal6 vistas
Smart Traffic Congestion Control System: Leveraging Machine Learning for Urba... por IRJET Journal
Smart Traffic Congestion Control System: Leveraging Machine Learning for Urba...Smart Traffic Congestion Control System: Leveraging Machine Learning for Urba...
Smart Traffic Congestion Control System: Leveraging Machine Learning for Urba...
IRJET Journal18 vistas
Enhancing Real Time Communication and Efficiency With Websocket por IRJET Journal
Enhancing Real Time Communication and Efficiency With WebsocketEnhancing Real Time Communication and Efficiency With Websocket
Enhancing Real Time Communication and Efficiency With Websocket
IRJET Journal5 vistas
Textile Industrial Wastewater Treatability Studies by Soil Aquifer Treatment ... por IRJET Journal
Textile Industrial Wastewater Treatability Studies by Soil Aquifer Treatment ...Textile Industrial Wastewater Treatability Studies by Soil Aquifer Treatment ...
Textile Industrial Wastewater Treatability Studies by Soil Aquifer Treatment ...
IRJET Journal4 vistas

Último

Ansari: Practical experiences with an LLM-based Islamic Assistant por
Ansari: Practical experiences with an LLM-based Islamic AssistantAnsari: Practical experiences with an LLM-based Islamic Assistant
Ansari: Practical experiences with an LLM-based Islamic AssistantM Waleed Kadous
11 vistas29 diapositivas
GPS Survery Presentation/ Slides por
GPS Survery Presentation/ SlidesGPS Survery Presentation/ Slides
GPS Survery Presentation/ SlidesOmarFarukEmon1
7 vistas13 diapositivas
MongoDB.pdf por
MongoDB.pdfMongoDB.pdf
MongoDB.pdfArthyR3
51 vistas6 diapositivas
Module-1, Chapter-2 Data Types, Variables, and Arrays por
Module-1, Chapter-2 Data Types, Variables, and ArraysModule-1, Chapter-2 Data Types, Variables, and Arrays
Module-1, Chapter-2 Data Types, Variables, and ArraysDemian Antony D'Mello
6 vistas44 diapositivas
Integrating Sustainable Development Goals (SDGs) in School Education por
Integrating Sustainable Development Goals (SDGs) in School EducationIntegrating Sustainable Development Goals (SDGs) in School Education
Integrating Sustainable Development Goals (SDGs) in School EducationSheetalTank1
11 vistas29 diapositivas
BCIC - Manufacturing Conclave - Technology-Driven Manufacturing for Growth por
BCIC - Manufacturing Conclave -  Technology-Driven Manufacturing for GrowthBCIC - Manufacturing Conclave -  Technology-Driven Manufacturing for Growth
BCIC - Manufacturing Conclave - Technology-Driven Manufacturing for GrowthInnomantra
20 vistas4 diapositivas

Último(20)

Ansari: Practical experiences with an LLM-based Islamic Assistant por M Waleed Kadous
Ansari: Practical experiences with an LLM-based Islamic AssistantAnsari: Practical experiences with an LLM-based Islamic Assistant
Ansari: Practical experiences with an LLM-based Islamic Assistant
M Waleed Kadous11 vistas
MongoDB.pdf por ArthyR3
MongoDB.pdfMongoDB.pdf
MongoDB.pdf
ArthyR351 vistas
Integrating Sustainable Development Goals (SDGs) in School Education por SheetalTank1
Integrating Sustainable Development Goals (SDGs) in School EducationIntegrating Sustainable Development Goals (SDGs) in School Education
Integrating Sustainable Development Goals (SDGs) in School Education
SheetalTank111 vistas
BCIC - Manufacturing Conclave - Technology-Driven Manufacturing for Growth por Innomantra
BCIC - Manufacturing Conclave -  Technology-Driven Manufacturing for GrowthBCIC - Manufacturing Conclave -  Technology-Driven Manufacturing for Growth
BCIC - Manufacturing Conclave - Technology-Driven Manufacturing for Growth
Innomantra 20 vistas
Design of Structures and Foundations for Vibrating Machines, Arya-ONeill-Pinc... por csegroupvn
Design of Structures and Foundations for Vibrating Machines, Arya-ONeill-Pinc...Design of Structures and Foundations for Vibrating Machines, Arya-ONeill-Pinc...
Design of Structures and Foundations for Vibrating Machines, Arya-ONeill-Pinc...
csegroupvn13 vistas
ASSIGNMENTS ON FUZZY LOGIC IN TRAFFIC FLOW.pdf por AlhamduKure
ASSIGNMENTS ON FUZZY LOGIC IN TRAFFIC FLOW.pdfASSIGNMENTS ON FUZZY LOGIC IN TRAFFIC FLOW.pdf
ASSIGNMENTS ON FUZZY LOGIC IN TRAFFIC FLOW.pdf
AlhamduKure10 vistas
Design_Discover_Develop_Campaign.pptx por ShivanshSeth6
Design_Discover_Develop_Campaign.pptxDesign_Discover_Develop_Campaign.pptx
Design_Discover_Develop_Campaign.pptx
ShivanshSeth655 vistas
Design of machine elements-UNIT 3.pptx por gopinathcreddy
Design of machine elements-UNIT 3.pptxDesign of machine elements-UNIT 3.pptx
Design of machine elements-UNIT 3.pptx
gopinathcreddy38 vistas
Proposal Presentation.pptx por keytonallamon
Proposal Presentation.pptxProposal Presentation.pptx
Proposal Presentation.pptx
keytonallamon76 vistas
Web Dev Session 1.pptx por VedVekhande
Web Dev Session 1.pptxWeb Dev Session 1.pptx
Web Dev Session 1.pptx
VedVekhande20 vistas
GDSC Mikroskil Members Onboarding 2023.pdf por gdscmikroskil
GDSC Mikroskil Members Onboarding 2023.pdfGDSC Mikroskil Members Onboarding 2023.pdf
GDSC Mikroskil Members Onboarding 2023.pdf
gdscmikroskil68 vistas

RASPBERRY PI BASED SMART WALKING STICK FOR VISUALLY IMPAIRED PERSON

  • 1. © 2022, IRJET | Impact Factor value: 7.529 | ISO 9001:2008 Certified Journal | Page 1271 Ravikumar S1, Krishnakumar M2, Karthikeyan S3, Balamuguran N4 1Assistant professor(Sel.G), Department of IT, SRM Valliammai Engineering College, Tamilnadu, India 2,3,4UG Students, Department of IT, SRM Valliammai Engineering College, Tamilnadu, India ---------------------------------------------------------------------***---------------------------------------------------------------- Abstract - To provide a smart electronic help for blind individuals, a smart system concept has been designed. People who are blind or visually challenged have difficulty finding their way around. The Raspberry Pi-based system is designed to give artificial vision and object identification, as well as real-time support through GPS. In this project, we'll use the Raspberry Pi to create a smart system for blind people that includes a camera module, a switch, and a GPSmodule. If someone is in distress, the Pi Camera recognizes the location using GPS and sends a message to that person's Whatsapp. The system is made up of a GPS module that receives feedback via audio, and the voice output is controlled by TTS (text to speech). The suggested system identifies an object in their environment and gives feedbackin the form of speech, warning messages sent by earphone,and GPS navigation to a specific area. The overall goal of thesystem is to deliver a low-cost, high-efficiency navigation andtext-to-voice aid for the blind that provides a sense of artificial vision by supplying information about the environment's static and dynamic objects. Key Words: Artifical vision and object identifiaction, TTS(text to speech), GPS, Low-cost, high-efficiency navigation and text-to-voice. 1.INTRODUCTION Individuals with visual impairments are those whose good eyes make it impossible to understand eventhe smallest detail. Many people with 6/60 or the optical range have a longitudinal range of less than or equal to 20 degrees, or have both eyes wide open. These people are known to be deafeningly deafening visually handicapped persons live in a world where they are completely reliant on others. Individuals with sensory disabilities find it impossible to discern even the tiniest information from that of healthy people. Those with a 6/60 or optical range have a lateral scope of less than or equal to 20 degrees, orhave both eyes wide open. They are referred as as blind. According to the World Health Organization's global data on visual impairment, there are an estimated 285 million visually impaired persons of all ages, with 39 million of them blind. Blindness affects 80% of adults over the age of 50. Uncorrected refractive errors (43 percent) and cataract (33 percent) are the leading causes of visual impairment; cataract is the leading cause of blindness (51 percent ). Because humans receive 83 percent of their information from their surroundings through sight, visionis the most crucial element of their physiology. People with those illnesses face a lot of agony in order to live a normal life. They are mentally secluded and allow themselves to be neglected. This dysfunction with them has a significant impact on their ability to make a living. It manifests as a mental illness that causes patients to lose hope in themselves. As a result, we devised a system for those suffering from those conditions. In recent years, a number of inventors have created devices to assist visually impaired persons. 
Traditional and oldest accessibility devices for visually handicapped people have their own set of disadvantages. A unique supply or navigator is frequently included with some innovations, allowing theuser to carry it in their pocket when travelling al fresco. The consumer is likely to be confused by the profuse patterns. Several attempts to construct blind guard or obstacle-measuring systems utilizing a small number of applications of elements have been made. As a result, it is recommended to design and create a walking stick with all of these functions, such as real-time object identification, voice direction, navigation, and so on, that is affordable to all persons who require assistance. 2. LITERATURE REVIEW 1. A Survey of Voice-Assisted Electronic Sticks for the Visually Impaired. Young Ho and Sung Jae Kang are the authors. Description- We learned about the latest technology in this article, such as the Graphics Positioning System (GPS) and Graphics System Messaging (GSM). Which will aid in the tracking of a person's location and beutilised in the development of RASPBERRY PI BASED SMART WALKING STICK FOR VISUALLY IMPAIRED PERSON International Research Journal of Engineering and Technology (IRJET) e-ISSN: 2395-0056 Volume: 09 Issue: 05 | May 2022 www.irjet.net p-ISSN: 2395-0072
  • 2. International Research Journal of Engineering and Technology (IRJET) e-ISSN: 2395-0056 Volume: 09 Issue: 05 | May 2022 www.irjet.net p-ISSN: 2395-0072 © 2022, IRJET | Impact Factor value: 7.529 | ISO 9001:2008 Certified Journal | Page 1272 a smart stick module for visually impaired persons, as well as providing information on voice messages sent from an Android phone to a blindperson. [14]. 2. Image Processing and Embedded System for Blind Navigation Sacinah Jamaludin and Zul Azizi Hailani are the authors. Description: This paper inspired us to create a navigation system that aids in the mobility of blindpeople. This study suggests that we capture live video of that individual and seize video feed in front of the blind person, and that this live video may be viewed by theadministrator. [4] 3. Smart Cane: Visually Impaired People's Assistive Cane MohdHelmy Wahab, AmirulATalib, AmirulATalib, AmirulATalib, AmirulATalib, AmirulATalib Description- We got the concept for Voice message & Vibration from this paper. When a person detects an impediment with theuse of a smart stick, the blind person becomes aware of itby interpreting the Vibration alert & Voice message that arrives from the smart phone. [3] 4. Electronic Path Guidance for People with Visual Impairment Iwan Ulrich and Johann Borenstein are the authors. Description- From this paper, we learned about the range required for identifying an obstacle or object from the location of a smart stick. We need to define a threshold value, and if the obstacle falls within that range, it may be detected successfully; otherwise, it cannot. [4] 5. Development of an Electronic Travel Aid Using Ultrasonic Sensors Alex Harold and Chris Gearhart are the authors. Description- From this article, we learned that inorder to capture video images, some processing is required. Using some algorithms and methods, we perform some procedures on the image in order to capture it, and we can also view live monitoring of that person on the admin side. All processing data is saved in a serialised manner on the server. [5] 6. Automated Mobility and Orientation System for People Who Are Blind or Partially Blind Abdel Ilah Nour Alshbatat is the author. Description- We learned about GSM, GPS, and sensors such as integrated ultrasonic sensors and accelerometers from this study. [14] 7. A Guide to Navigation for the Blind - For the sight impaired, taking each step without encountering any impediments is a significant struggle in today's fast-paced world. The visually impaired can use a smart assistant like 3S (Sensory Supervision Spectacles) to read images, avoid obstacles, and track routes. As a result, this study proposes the creation of a clever framework that can assist visually impaired people in their daily lives. Some of the most significant difficulties include difficulty moving without assistance, difficulty reading text or visuals, detecting impediments, and so on. Text recognition using Optical Character Recognition (OCR), speech synthesis using Text-To-Speech (TTS), and obstacle detection using ultrasonicsensors (HC-SR04) and a GPS tracker are all possible with the proposed 3S (Sensory Supervision Spectacles). The device is a voice-activated frameworkthatwouldassistvisuallyimpairedpeoplein their daily tasks. 3.PROPOSED WORK It also offers a camera-based assisted reading system to assist blind people in reading text labels and product packaging from everyday things. 
The text that the user needs to read is collected as an image and sent to the image processing platform using a small camera. OCR tesseract is used to recognise the text on the acquired image.CThe e- speak algorithm converts the detected text into vocal output. The system is lightweight and portable, thanks to abuilt-in battery backup. It will provide them with a safer atmosphere as well as a sense of independence, allowing them to enjoy a more regular life. The Raspberry Pi 3b, GPS module, voice command module, moisture sensor unit, and ultrasonic sensor unit are all used in this smart navigation system stick. The voice command module is the highlight since it gives customers with dependable voice commandsupport. The GPS module is also important since it offers users with safe and precise navigation.
  • 3. International Research Journal of Engineering and Technology (IRJET) e-ISSN: 2395-0056 Volume: 09 Issue: 05 | May 2022 www.irjet.net p-ISSN: 2395-0072 © 2022, IRJET | Impact Factor value: 7.529 | ISO 9001:2008 Certified Journal | Page 1273 Block Diagram of Proposed system 3.1. ADVANTAGES: It is efficient. It is inexpensive. By receiving communications, we can keep track of the blind people's whereabouts. Character recognition that is done automatically. A prototype system for assisting blind people by reading printed text on hand-held devices. Weproposed a motion-based method for detecting the object for a few seconds to detect it. 3.2. PROBLEM IDENTIFICATION: Understanding the precise context of deaf and dumb people's symbolic expressions is a difficult task in real life unless it is adequately described. Webcams can occasionally struggle to determine the desired skin colour due to light and contrast. Because the tracking environment background colour and skin colour are so similar, the SLR receives unexpected pixels. 4.ARCHITECTURE DIAGRAM METHODOLOGY Image to text - Tesaract algorithm and text to audio - Espeak algorithm connecting ultrasonic sensor for detecting the obstacle and updated to blind person through headset. Camera will take the pic and detects the location of the area.
  • 4. International Research Journal of Engineering and Technology (IRJET) e-ISSN: 2395-0056 Volume: 09 Issue: 05 | May 2022 www.irjet.net p-ISSN: 2395-0072 © 2022, IRJET | Impact Factor value: 7.529 | ISO 9001:2008 Certified Journal | Page 1274 5.1 TESSERACT ALGORITHM: Tesseract Algorithm is an optical character recognition engine with open source code. This is the most popular and qualitative OCR library. Tesseract is finding templates in pixels ,letters, words and sentences. Thisalgorithm is able to accurately decypher and extract text from a variety of sources As per it's namesake it uses an updated version of the tesseract open source OCR tool. We also automatically binarize and preprocess. Images using the binarization, so tesseract has an easier time decyphering images. And qualitative OCR library. Tesseract is finding templates in pixels , letters, words and sentences. 5.2 E-SPEAK ALGORITHM: eSpeak is an open source text-to-speech synthesizer that can be invoked from the Linux command line. This is a compact speech synthesizer that provides support to English and many other languages. It can be used with the existing layout analysis to recognize text within a large document, or it can be used in conjunction with an external text detector to recognize text from an image ofa single text line. The eSpeak package provides a few good variations to the default voice used to speak outthe text. 5. HARDWARE AND SOFTWARE COMPONENTS Hardware Requirements: We are usingthe RaspberryPi,WebCamera,Speaker etc., Software Requirements: Raspian OS,Python andOpen CV. Raspberry pi board A. A. It's a lot of fun to bond with Arduino and you can do alot with it. The Raspberry Pi is available in two models: model A and model B. These two are quite similar, with Model B having a few more advanced features than Model A. Model B features 512 MB of RAM and two USB ports, while Model A only has 256 MB of RAM and one USB port. Furthermore, the Model B features an Ethernet interface,whereas the Model A does not. B. SD Card Slot : Raspberry Pi doesn’t have the real harddrive as in laptop and computer, SD card is taken as solid state drive (SSD) which is used to install operating system and all others software and store everything. This card is needed to insert into the slot for using the Raspberry Pi. SD card may be 2GB, 4GB or 16GB C. Micro USB Power : The power port is a 5V micro-USB input and supply should be exactly 5v as it doesn’t have onboard power regulator. So, power supply shouldn’t exceed than 5V. D. HDMI Out: To connect the Raspberry Pi to a monitor via HDMI, use this output port (High Definition Multimedia Interface). As a result, any screen or TV with an HDMI connector can be attached to it.
  • 5. International Research Journal of Engineering and Technology (IRJET) e-ISSN: 2395-0056 Volume: 09 Issue: 05 | May 2022 www.irjet.net p-ISSN: 2395-0072 © 2022, IRJET | Impact Factor value: 7.529 | ISO 9001:2008 Certified Journal | Page 1275 E. GPIO Headers (Pins): A GPIO pin is a general-purpose input-output pin. These pins are used to connect the Raspberry Pi to a variety of physical expansions. Pre installed libraries on the Raspberry Pi allow us to access the pins using programming languages such as C, C++, orPython. F. Ultrasonic Sensor: A form of sensor that detects an item using sound waves is known as an ultrasonic sensor. It works on the same principle as radar or sonar, which generates and receives high-frequency sound waves. Sensors detect the distance of an object by measuring the time it takes for the echo signal to be received after transmitting the signals and receiving the echo signals back. 6.SMART FLOW CHART The above flowchart shows the process details that takes place in the smart walking stick. And much easier algorithm of the smart stick is given below 7. FUNCTIONAL DESCRIPTION An ultrasonic sensor is a device that detectsthe presence of ultrasonic waves The ultrasonic sensor is used to calculate the object's distance or to detect potholes. Elastic waves of a frequency more than 20,000 Hz are produced by these,which can be found in solids, liquids, and gases.
The ultrasonic sensor, which sends out ultrasound and estimates distance, is fitted on three sides of the walking stick for object detection. A further sensor is mounted beneath the smart walking stick to identify potholes: a reference distance to the ground is fixed, so when a pothole appears the measured distance becomes greater than normal and the user is alerted. Distance in the block diagram is calculated using the pulse reflection method, which counts the reflected pulses received back [5].

A. Raspberry Pi: The brain of the Raspberry Pi board is the Central Processing Unit, which carries out the computer's instructions through logical and mathematical operations. The Raspberry Pi's Ethernet connection serves as the primary interface for communicating with other devices, and the Ethernet port is used to connect the board to a home network and the internet [9].

B. ARM11 processor: The Raspberry Pi uses an ARM11-series processor. ARM11 is a family of older 32-bit RISC ARM processors. Its operating frequency is 335 MHz and its power consumption is 0.4 mW/MHz. It provides ARM TrustZone technology for on-chip security, low power consumption and a high-performance integer core. The GPU is a specialized chip on the Raspberry Pi board designed to speed up image calculations.

C. Object identification: Object identification is carried out using digital image processing and computer-vision software (OpenCV). The fundamental structures of items such as stones, vehicles and persons are fed into this process; if the ultrasonic sensors detect any variation, the camera turns on and captures an image, which is compared with the pre-fed photographs to identify the object [11]. A small illustrative sketch of this step is given after this list.

D. Optical character recognition: OCR (optical character recognition) is performed using Tesseract, supporting software for the Raspberry Pi. The image acquired with the camera is first converted to black and white, and an edge-detection procedure is then used to find the edges in the image and thereby locate the distinct letters. To determine the correct character, these letters are compared against pre-fetched characters from other languages [7].

E. Switch: A toggle switch is used to enable the ultrasonic sensor for obstacle detection, pothole detection or text reading. The user operates the switch to obtain the required output from the smart walking stick.
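The object-identification step in item C can be sketched as follows. The paper does not specify the comparison technique, so this illustration assumes OpenCV template matching against a small set of pre-fed reference images; the file names and the similarity threshold are hypothetical.

# Minimal sketch of the object-identification step: when the ultrasonic sensor
# reports a variation, capture a frame and compare it with pre-fed reference
# images. Template matching is an assumed comparison method; the reference
# image names ("stone.jpg", "vehicle.jpg", "person.jpg") are hypothetical.
import cv2

REFERENCES = {"stone": "stone.jpg", "vehicle": "vehicle.jpg", "person": "person.jpg"}
MATCH_THRESHOLD = 0.7        # assumed similarity threshold

def identify_object(frame_gray):
    best_label, best_score = None, 0.0
    for label, path in REFERENCES.items():
        template = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        if template is None:
            continue
        # Resize the reference to the frame size so the two can be compared directly.
        template = cv2.resize(template, (frame_gray.shape[1], frame_gray.shape[0]))
        score = cv2.matchTemplate(frame_gray, template, cv2.TM_CCOEFF_NORMED).max()
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= MATCH_THRESHOLD else None

# Capture one frame from the camera once the sensor has reported a variation.
camera = cv2.VideoCapture(0)
ok, frame = camera.read()
camera.release()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    label = identify_object(gray)
    print("Detected:", label or "unknown object")

A real deployment would likely use a more robust matcher (for example, feature-based matching), but the flow of sensor trigger, capture and comparison is the same.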
8. CONCLUSION AND DISCUSSION:
In this work, we have described a prototype system that reads text and hand-held objects to assist blind people. To extract text regions from complex backgrounds, we have proposed a distinctive text localization algorithm based on models of stroke orientation and edge distributions. The corresponding feature maps estimate the global structural features of text at every pixel. Block patterns project the feature maps of an image patch into a feature vector. Adjacent character grouping is performed to compute candidate text patches prepared for text classification. An AdaBoost learning model is used to localize text in camera-based images. OCR is then used to perform word recognition on the localized text regions, and the result is transformed into audio output for blind users.

In this system, the camera acts as the input. As soon as the Raspberry Pi board is powered, the camera starts streaming, and the streaming data are displayed on the screen using an interface application. Once the item to be read is placed in front of the camera, the capture button is clicked to supply the image to the board. Using the Tesseract library, the image is converted into text data, and the data detected from the image are shown on the status bar. The obtained data are then pronounced aloud through the audio output.
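The capture-OCR-speech flow summarized above can be sketched in a few lines of Python. This is a minimal illustration, assuming the pytesseract wrapper for the Tesseract engine and the espeak command-line tool are installed; the camera index and the Otsu thresholding step are illustrative choices rather than details taken from the paper.

# Minimal sketch of the text-reading pipeline: capture an image, binarize it,
# run Tesseract OCR, and speak the recognized text with eSpeak.
# Assumes the pytesseract wrapper and the espeak command-line tool are installed.
import subprocess
import cv2
import pytesseract

# 1. Capture a frame from the camera (index 0 is an assumed default).
camera = cv2.VideoCapture(0)
ok, frame = camera.read()
camera.release()
if not ok:
    raise RuntimeError("Could not read a frame from the camera")

# 2. Convert to black and white so Tesseract has an easier time.
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
_, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# 3. Extract text with the Tesseract OCR engine.
text = pytesseract.image_to_string(binary).strip()
print("Recognized text:", text)

# 4. Speak the text aloud through the audio output using eSpeak.
if text:
    subprocess.run(["espeak", text])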
9. REFERENCES:
[1] X. Chen and A. L. Yuille, "Detecting and reading text in natural scenes," in Proc. IEEE Conf. Computer Vision and Pattern Recognition, 2004, vol. 2, pp. II-366–II-373.
[2] S. Kumar, R. Gupta, N. Khanna, S. Chaudhury, and S. D. Joshi, "Text extraction and document image segmentation using matched wavelets and MRF model," IEEE Trans. Image Process., vol. 16, no. 8, pp. 2117–2128, Aug. 2007.
[3] K. Kim, K. Jung, and J. Kim, "Texture-based approach for text detection in images using support vector machines and continuously adaptive mean shift algorithm," IEEE Trans. Pattern Anal. Mach. Intell., vol. 25, no. 12, pp. 1631–1639, Dec. 2003.
[4] N. Giudice and G. Legge, "Blind navigation and the role of technology," in The Engineering Handbook of Smart Technology for Aging, Disability, and Independence, A. A. Helal, M. Mokhtari, and B. Abdulrazak, Eds. Hoboken, NJ, USA: Wiley, 2008.
[5] World Health Organization, "10 facts about blindness and visual impairment," 2009.
[6] Advance Data Reports from the National Health Interview Survey, 2008.
[7] International Workshop on Camera-Based Document Analysis and Recognition (CBDAR), 2005, 2007, 2009, 2011.
[8] X. Chen, J. Yang, J. Zhang, and A. Waibel, "Automatic detection and recognition of signs from natural scenes," IEEE Trans. Image Process., vol. 13, no. 1, pp. 87–99, Jan. 2004.
[9] D. Dakopoulos and N. G. Bourbakis, "Wearable obstacle avoidance electronic travel aids for blind: A survey," IEEE Trans. Syst., Man, Cybern., vol. 40, no. 1, pp. 25–35, Jan. 2010.
[10] B. Epshtein, E. Ofek, and Y. Wexler, "Detecting text in natural scenes with stroke width transform," in Proc. IEEE Conf. Computer Vision and Pattern Recognition, 2010, pp. 2963–2970.
[11] Y. Freund and R. Schapire, "Experiments with a new boosting algorithm," in Proc. Int. Conf. Machine Learning, 1996, pp. 148–156.
[12] "An overview of the Tesseract OCR (optical character recognition) engine, and its possible enhancement for use in Wales in a pre-competitive research stage," Language Technologies Unit (Canolfan Bedwyr), Bangor University, Apr. 2008.
[13] A. Shahab, F. Shafait, and A. Dengel, "ICDAR 2011 robust reading competition Challenge 2: Reading text in scene images," in Proc. Int. Conf. Document Anal. Recognit., 2011, pp. 1491–1496.
[14] KReader Mobile User Guide, knfb Reading Technology Inc., 2008.
[15] S. M. Lucas, "ICDAR 2005 text locating competition results," in Proc. Int. Conf. Document Anal. Recognit., 2005, vol. 1, pp. 80–84.