International Research Journal of Engineering and Technology (IRJET)
e-ISSN: 2395-0056 | p-ISSN: 2395-0072 | Volume: 10, Issue: 03 | Mar 2023 | www.irjet.net

Obstacle Detection for Visually Impaired Using Computer Vision

Pratik Kharat1, Tushar Kumar2, Rutuja Sirsikar3, Rushil Sawant4, Prof. Vedika Avhad5
1,2,3,4 B.E. Student, Dept. of Information Technology, VPPCOE&VA, Mumbai, Maharashtra, India
5 Assistant Professor, Dept. of Information Technology, VPPCOE&VA, Mumbai, Maharashtra, India

Abstract - Blindness refers to the loss of visual perception, which causes mobility and self-reliance issues for people who are visually impaired or blind. "Visioner" is a tool for blind individuals that uses computer vision and obstacle detection technology to improve their quality of life. This intelligent aid is equipped with different sensors and is designed to help the visually impaired and blind navigate both indoor and outdoor environments without limitations. Visioner is a cost-effective, highly efficient, and reliable device that is lightweight, fast-responding, and energy-efficient. The device's advanced algorithm and computer vision technology detect objects in the surrounding environment and relay this information to the user through voice messages. To enhance usability, support for multiple languages will be included, allowing users to select the language of their preference.

Key Words: LiDAR Sensor, Arduino Uno, ESP32-CAM, YOLO, Smart Blind Stick.

1. INTRODUCTION

In 2014, the World Health Organization (WHO) released a report stating that approximately 285 million individuals were visually impaired. Of this total, 39 million people were classified as blind, while 246 million were categorized as having low vision. The advent of modern technologies has led to the development of Electronic Travel Aids (ETAs), which are designed to enhance mobility for visually impaired individuals. These devices are equipped with sensors that can detect potential hazards and alert the user with sounds or vibrations, providing advance warning and enabling users to avoid obstacles and navigate their surroundings with greater confidence. The availability of ETAs has significantly improved safety and independence for visually impaired individuals, enhancing their quality of life and enabling greater participation in society.

The Visioner smart stick is a new device designed to aid visually impaired individuals in navigating their surroundings. It is equipped with a LiDAR sensor, an ESP32-CAM camera module, and a buzzer to help identify objects and obstacles in the environment. The purpose of the smart stick is to provide a more reliable and efficient way for the visually impaired to move around and navigate their surroundings. Visioner is a cost-effective, efficient, and user-friendly ETA that uses Internet of Things (IoT) technology.

2. LITERATURE REVIEW

This section summarizes previous and existing systems. Although many technologies for visually challenged people have emerged in the last few years, those inventions still carry many limitations and restrictions.

Kim S. Y. and Cho K. [1] proposed that design guidelines for a smart cane with obstacle detection and indication can be generated by analyzing users' needs and requirements. Their study found that a smart cane provides more advantages to visually impaired people than a traditional white cane, as it is more effective in avoiding obstacles.

Z. Saquib, V. Murari and S. N. Bhargav [2] proposed BlinDar - An Invisible Eye for the Blind People, a highly efficient and cost-effective device for the blind, which uses ultrasonic sensors, an MQ2 gas sensor, an ESP8266 Wi-Fi module, a GPS module, a wristband key button, a buzzer, a vibrator module, a speech synthesizer, and an RF Tx/Rx module to aid navigation. One limitation of the device is its long scanning time, which results in a longer waiting time for the user.

R. O'Keeffe et al. [3] proposed an obstacle detection system for the visually impaired and blind that utilizes a long-range LiDAR sensor mounted on a white cane. The system operates both indoors and outdoors, has a range of up to 5 m, and reports the characterization results of the Gen 1 long-range LiDAR sensor. The sensor is integrated with short-range LiDAR, UWB radar, and ultrasonic range sensors, and each range sensor must meet specific requirements, including a small power budget, size, and weight, while maintaining the necessary detection range under various environmental conditions.

N. Krishnan [4] proposed a low-cost proximity sensing system embedded with LiDAR for visually impaired individuals. While the system is precise for average to large distances, it requires compensation for loss of linearity at short distances. The circuit diagram was established based on available component resources, and the DRV2605L device was placed in PWM interface mode to control vibration strength. The system detects obstacles efficiently, but the challenge remains in conveying the information to the user.

M. Maragatharajan, G. Jegadeeshwaran, R. Askash, K. Aniruth and A. Sarath [5] introduced the Third Eye for the Blind project, a wearable band designed to aid visually impaired individuals in navigating their environment. The device utilizes ultrasonic waves to detect nearby obstacles and alerts the user through either vibrations or buzzing sounds. The obstacle detector is based on the Arduino platform and is portable, cost-effective, and user-friendly. By enabling the detection of obstacles in all directions, the device empowers the visually impaired to move independently and with greater confidence, reducing their reliance on external assistance.
3. PROPOSED SYSTEM

3.1 PROBLEM STATEMENT

The visually impaired face significant challenges in mobility and safety, particularly in unfamiliar environments. Existing assistive technologies such as canes and guide dogs have limitations in detecting obstacles and providing real-time information about the environment. The lack of access to visual cues makes it difficult for the visually impaired to navigate and avoid obstacles, leading to accidents and injuries. There is therefore a need for an assistive device that can provide accurate, real-time information about the environment to enhance mobility and safety for the visually impaired. The Visioner smart stick aims to address this problem by utilizing advanced sensors and machine learning algorithms to detect and identify obstacles and to provide speech output to the user in multiple languages, thereby improving their independence and quality of life.

3.2 PROPOSED METHODOLOGY

This project is an IoT-based solution utilizing an Arduino Uno, a LiDAR sensor, an ESP32-CAM board, and the YOLO algorithm for obstacle detection. The LiDAR sensor identifies obstacles and records their precise distance, which is then transmitted to the ESP32-CAM board. The ESP32-CAM board captures an image of the obstacle and transmits both the image and the distance value to a server. A machine learning algorithm, YOLO, is employed for real-time object classification and precise identification. Speech output is provided to the user in their preferred language, using either the pyttsx3 library or web speech APIs. By ensuring seamless data transfer from the sensors to the web server and ultimately to the user's device, the overall experience is optimized so that the visually impaired user can navigate the surrounding area with ease.
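To make the announcement step concrete, the following minimal Python sketch uses the pyttsx3 library mentioned above to speak a detection summary offline. The message wording and the property values (rate, volume, voice selection) are illustrative placeholders rather than the project's actual configuration; language support depends on the voices installed on the host system.

```python
# Minimal speech-output sketch using pyttsx3 (offline text-to-speech).
# The summary string and property values are illustrative placeholders.
import pyttsx3

def announce(num_objects, names, nearest_cm):
    engine = pyttsx3.init()              # use the platform's default TTS driver
    engine.setProperty("rate", 150)      # speaking rate in words per minute
    engine.setProperty("volume", 1.0)    # volume in the range 0.0 to 1.0

    # Selecting a different voice (language/gender) depends on the voices
    # installed on the host system.
    voices = engine.getProperty("voices")
    if voices:
        engine.setProperty("voice", voices[0].id)

    message = (f"Total number of detected objects in the area: {num_objects}. "
               f"Objects: {', '.join(names)}. "
               f"Nearest obstacle at {nearest_cm} centimeters.")
    engine.say(message)
    engine.runAndWait()                  # block until speech finishes

if __name__ == "__main__":
    announce(2, ["person", "chair"], 85)
```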
3.3 SYSTEM ARCHITECTURE

The system architecture consists of a blind stick that houses an Arduino Uno board connected to a battery source. The Arduino board has a TF-Luna micro LiDAR sensor connected to pins 2 and 3 for RX and TX, respectively. The main purpose of this sensor is to capture the distance of obstacles. The Arduino board calculates the distance in centimeters and sends this value to an ESP32-CAM board connected to it on pins 0 and 1 for RX and TX, respectively. The ESP32-CAM board also captures an image of the obstacle and sends both the LiDAR distance value (in cm) and the image to a web server.

On the web server, the image received from the ESP32-CAM board is analyzed using the YOLOv4 algorithm. The objects in the image are classified, and the total number of obstacles detected, along with their names and the distance of the nearest obstacle, is sent to the user's device as speech output. This output is customizable in terms of language, volume, gender, and pitch, and can be configured using the web server's speech-aid functionality. In addition, when the distance of the nearest obstacle is less than 30 cm, a buzzer sound is added to the speech output, warning the visually impaired user of an impending collision.

Overall, the smart blind stick provides a comprehensive solution to assist visually impaired individuals with navigation, and the system architecture ensures the reliable and efficient operation of the device.

Fig-1. Architecture of Smart Stick
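The paper does not specify how the web server is implemented; as one possible realization, the sketch below assumes a small Flask application that accepts the ESP32-CAM upload (a JPEG frame plus the LiDAR distance in centimeters) and applies the 30 cm collision threshold described above. The route, the form field names, and the detect_objects placeholder are hypothetical.

```python
# Hypothetical server endpoint for the ESP32-CAM upload. Flask is assumed
# (the paper only says "web server"); route and field names are illustrative.
import cv2
import numpy as np
from flask import Flask, request, jsonify

app = Flask(__name__)

WARNING_DISTANCE_CM = 30   # collision-warning threshold from Section 3.3


def detect_objects(frame):
    """Placeholder: return detected class names for the frame.
    Replace with the YOLOv4 detection step (Section 3.4)."""
    return []


@app.route("/upload", methods=["POST"])
def upload():
    # LiDAR distance (cm) measured by the TF-Luna and forwarded via the Arduino.
    distance_cm = float(request.form["distance"])

    # JPEG frame captured by the ESP32-CAM, decoded into an OpenCV image.
    data = np.frombuffer(request.files["image"].read(), dtype=np.uint8)
    frame = cv2.imdecode(data, cv2.IMREAD_COLOR)

    labels = detect_objects(frame)
    return jsonify({
        "count": len(labels),
        "objects": labels,
        "nearest_cm": distance_cm,
        "collision_warning": distance_cm < WARNING_DISTANCE_CM,
    })


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

In this sketch the collision check sits on the server so that it can be folded into the spoken warning, mirroring the architecture described above.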
3.4 YOLO ALGORITHM

Object detection is a complex computer vision task that involves both localizing one or more objects within an image and classifying each object present in the image. The task is challenging because it requires both successful localization and correct classification of each object.

YOLOv4 is an advanced real-time object detection algorithm that achieves state-of-the-art accuracy on multiple benchmarks. It builds upon the previous versions of the YOLO (You Only Look Once) family, which were already recognized for their speed and accuracy. YOLOv4 utilizes a deep neural network to classify objects in real-time images. It is a single-stage object detection model that detects objects in one pass through the network, making it faster than many other object detection algorithms.

In this project, YOLOv4 analyzes the images captured by the ESP32-CAM board. The algorithm detects the objects present in the image and classifies them into different categories. This information is then used to provide the user with a speech output that includes the total number of objects detected, their names, and the distance of the nearest object.

The YOLOv4 algorithm works by dividing the input image into a grid and predicting the objects within each grid cell. It assigns a confidence score to each prediction and filters out low-confidence detections. Finally, non-maximum suppression is applied to remove redundant detections.

YOLOv4 is trained on large datasets, enabling it to detect objects accurately in a wide range of real-world scenarios. The algorithm can also be fine-tuned for specific use cases, making it highly adaptable to different applications. It is a highly accurate and fast object detection algorithm, used in this project to classify objects in real-time images captured by the ESP32-CAM board. The algorithm provides visually impaired individuals with a comprehensive understanding of their surroundings, allowing them to navigate safely.
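One common way to run pretrained YOLOv4 weights on a server is through OpenCV's DNN module, sketched below as an illustration of the confidence filtering and non-maximum suppression described above. The file names (yolov4.cfg, yolov4.weights, coco.names), the 416x416 input size, and the thresholds are typical defaults, not the project's actual configuration.

```python
# Hedged sketch of the server-side detection step using OpenCV's DNN module
# with pretrained Darknet YOLOv4 files; paths and thresholds are typical values.
import cv2
import numpy as np

CONF_THRESHOLD = 0.5   # discard low-confidence predictions
NMS_THRESHOLD = 0.4    # non-maximum suppression overlap threshold

net = cv2.dnn.readNetFromDarknet("yolov4.cfg", "yolov4.weights")
output_layers = net.getUnconnectedOutLayersNames()
with open("coco.names") as f:
    classes = [line.strip() for line in f]

def detect_objects(frame):
    h, w = frame.shape[:2]
    # YOLO expects a square, normalized blob; 416x416 is a common input size.
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416),
                                 swapRB=True, crop=False)
    net.setInput(blob)
    outputs = net.forward(output_layers)

    boxes, confidences, class_ids = [], [], []
    for output in outputs:
        for det in output:           # det = [cx, cy, bw, bh, objectness, class scores...]
            scores = det[5:]
            class_id = int(np.argmax(scores))
            confidence = float(scores[class_id])
            if confidence > CONF_THRESHOLD:
                cx, cy, bw, bh = det[:4] * np.array([w, h, w, h])
                boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
                confidences.append(confidence)
                class_ids.append(class_id)

    # Remove redundant overlapping boxes, as described above.
    keep = cv2.dnn.NMSBoxes(boxes, confidences, CONF_THRESHOLD, NMS_THRESHOLD)
    return [classes[class_ids[i]] for i in np.array(keep).flatten()]
```

In the full system, the returned class names and the LiDAR distance would feed the speech step sketched in Section 3.2.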
4. RESULTS AND DISCUSSION

The sensors and modules depicted in Figure 1 underwent individual testing, and their respective outputs were observed. First, the TF-Luna micro LiDAR was tested. The sensor was programmed in the Arduino IDE environment and the code was uploaded to the Arduino Uno microcontroller board. The LiDAR sensor emits a light beam toward the object, and the time taken to receive the reflected light is used to calculate the distance in centimeters.
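As a side note on the TF-Luna readout: the sensor streams its distance measurement over UART, and a frame can also be decoded on a PC for bench testing, as in the sketch below. This assumes the vendor's standard 9-byte serial frame (0x59 0x59 header at 115200 baud) and the pyserial package; the port name is a placeholder, and in the actual system the Arduino performs this step.

```python
# Bench-test sketch: read distance frames from a TF-Luna over UART with pyserial.
# Assumes the vendor's standard 9-byte frame: 0x59 0x59, Dist_L, Dist_H,
# Strength_L, Strength_H, Temp_L, Temp_H, Checksum. Port name is a placeholder.
import serial

def read_distance_cm(port="/dev/ttyUSB0", baud=115200):
    with serial.Serial(port, baud, timeout=1) as ser:
        while True:
            # Hunt for the two-byte frame header.
            if ser.read(1) != b"\x59" or ser.read(1) != b"\x59":
                continue
            payload = ser.read(7)
            if len(payload) < 7:
                continue
            checksum = (0x59 + 0x59 + sum(payload[:6])) & 0xFF
            if checksum != payload[6]:
                continue                          # corrupted frame, keep scanning
            return payload[0] + (payload[1] << 8)  # distance in centimeters

if __name__ == "__main__":
    print("Distance:", read_distance_cm(), "cm")
```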
Subsequently, the ESP32 microcontroller, which has an OV2640 camera module attached to it, was tested. Its purpose is to capture an image and send it to the server, where object detection is applied, as shown in Figure 2. The figure displays the total number of objects detected, their names, and the confidence of each detection.

Fig-2. Screenshot of YOLO Algorithm Output 1

The bounding boxes drawn around detected objects are presented in Figure 3. The output of the YOLO algorithm is part of the backend process; the user receives audio output from the server, in which the total number of detected objects in the area, the object names, and their distance in centimeters are announced through the speech synthesis library.

Fig-3. Screenshot of YOLO Algorithm Output 2

Following the individual testing of the sensors, the integration process was carried out, whereby all components were connected to the blind stick. The battery was used as the primary power source, and a switch was incorporated to provide the functionality of turning the stick on and off.

5. CONCLUSION AND FUTURE SCOPE

5.1 CONCLUSION

In conclusion, the development of technologies such as IoT has created new possibilities for improving the quality of life of visually impaired individuals. This project is an example of how IoT technology can be harnessed to provide innovative solutions that help the visually impaired navigate their surroundings with greater ease and safety. The smart blind stick developed in this project utilizes a LiDAR sensor, computer vision algorithms, and speech output to detect and classify obstacles in the environment and provide real-time feedback to the user. The use of the YOLOv4 algorithm for object detection and classification makes the system highly accurate and reliable, and the ability to customize the speech output to suit individual preferences further enhances its usability.

In summary, the smart blind stick developed in this project is an IoT-based solution that leverages LiDAR sensing, computer vision, and speech output to give visually impaired individuals a comprehensive understanding of their surroundings and help them navigate safely. The salient features of this project include accurate object detection and classification, real-time feedback, customizable speech output, cost-effectiveness, and easy replication.
5.2 FUTURE SCOPE

This project has a promising future scope, offering opportunities for further research and development. Possible directions include improving the accuracy and precision of the sensors used for distance measurement and object detection, incorporating additional machine learning algorithms, and integrating the device with other technologies such as GPS or haptic feedback systems. The device's accessibility could be enhanced by exploring alternative components or optimizing the manufacturing process. This project lays the groundwork for future innovation in assistive technologies for individuals with visual impairments, with the potential to greatly improve their quality of life and independence.

ACKNOWLEDGEMENT

We wish to express our sincere gratitude to Prof. Vedika Avhad, Assistant Professor, Department of Information Technology, for her guidance and encouragement in carrying out this project. We also thank the faculty of our department for their help and support during the completion of our project. We sincerely thank the Principal of Vasantdada Patil Pratishthan's College of Engineering for providing us the opportunity of doing this project.

REFERENCES

[1] Kim S. Y. and Cho K., "Usability and design guidelines of smart canes for users with visual impairments," International Journal of Design, vol. 7, no. 1, pp. 99-110, 2013.

[2] Z. Saquib, V. Murari and S. N. Bhargav, "BlinDar: An invisible eye for the blind people making life easy for the blind with Internet of Things (IoT)," 2017 2nd IEEE International Conference on Recent Trends in Electronics, Information & Communication Technology (RTEICT), Bangalore, India, 2017, pp. 71-75, doi: 10.1109/RTEICT.2017.8256560.

[3] R. O'Keeffe et al., "Long Range LiDAR Characterisation for Obstacle Detection for use by the Visually Impaired and Blind," 2018 IEEE 68th Electronic Components and Technology Conference (ECTC), San Diego, CA, USA, 2018, pp. 533-538, doi: 10.1109/ECTC.2018.00084.

[4] N. Krishnan, "A LiDAR based proximity sensing system for the visually impaired spectrum," 2019 IEEE 62nd International Midwest Symposium on Circuits and Systems (MWSCAS), Dallas, TX, USA, 2019, pp. 1211-1214, doi: 10.1109/MWSCAS.2019.8884887.

[5] M. Maragatharajan, G. Jegadeeshwaran, R. Askash, K. Aniruth and A. Sarath, "Obstacle Detector for Blind Peoples," International Journal of Engineering and Advanced Technology (IJEAT), pp. 61-64, Dec. 2019.

[6] Y.-C. Huang and C.-H. Tsai, "Speech-Based Interface for Visually Impaired Users," 2018 IEEE 20th International Conference on High Performance Computing and Communications; IEEE 16th International Conference on Smart City; IEEE 4th International Conference on Data Science and Systems (HPCC/SmartCity/DSS), Exeter, UK, 2018, pp. 1223-1228, doi: 10.1109/HPCC/SmartCity/DSS.2018.00206.

[7] J. Redmon, S. Divvala, R. Girshick and A. Farhadi, "You Only Look Once: Unified, Real-Time Object Detection," 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 2016, pp. 779-788, doi: 10.1109/CVPR.2016.91.