IOSR Journal of Engineering (IOSRJEN) www.iosrjen.org
ISSN (e): 2250-3021, ISSN (p): 2278-8719
Vol. 04, Issue 04 (April 2014), V3, PP 25-28
International Organization of Scientific Research | Page 25
Laser Distance Measurement by Using a Web Camera
Ahmed A. Hamad, Dhafir A. Dhahir, Bushra R. Mhdi, Wadah H. Salim
Ministry of Science and Technology, Laser and Optoelectronic Center, Baghdad, Iraq
Abstract: - In this work a near-distance measurement system is presented. The system consists of a
transmitter and a receiver. The transmitter is a semiconductor laser with a wavelength of 526 nm and a power of
5 mW, and the receiver consists of an optics section and a web camera. The principle of operation is to transmit a
laser pulse and receive the pulse reflected from the object. A software program was written in C#
to calculate the object distance. The accuracy of the distance measurement is high; the error
reaches ±3 cm.
Keywords: - distance measurement, laser distance measurement
I. INTRODUCTION
Any self-respecting mobile robot requires some sort of distance-measurement sensor or range-finding
mechanism. Among the most accessible are ultrasonic range finders, which time the propagation and reflection of
a sonic pulse. Ultrasonic rangers, especially when placed in a ring configuration, are plagued by fundamental
difficulties that make them unattractive for serious localization.
At the other end of the spectrum are high-end laser rangefinders, which use light instead of sound to determine
distance from time-of-flight. These are often bundled with precision optics and mechanisms that allow one to
infer 3D point clouds over large and small "human" distances. Their precision and reliability make them ideal
for industrial and research applications, but their cost is prohibitive to the hobbyist or researcher on a low
budget.
To balance user cost against sensor precision and reliability, we consider a different
sensor design. Here, the only materials needed are a laser line generator and a camera (e.g., a cheap webcam).
The principle of operation is simple: one points the line generator and the camera in the same direction, but
offsets them by some known distance d (~10 cm). The camera sees the laser line, filtered from the rest of the
environment based on brightness and color, which is offset whenever it strikes an object. By calculating the
offset distance from the expected position of the laser line, the distance to the object can be inferred.
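The brightness-and-color filtering step can be sketched as follows. This is an illustrative example, not the paper's C# code: the frame here is a synthetic nested list of (r, g, b) tuples, and `locate_laser_pfc` is a hypothetical helper; a real implementation would read webcam frames and threshold more robustly.

```python
# Sketch: locate a green laser dot in a frame and compute pfc
# (pixels from the image center). Assumed names and synthetic data.

def locate_laser_pfc(frame):
    """Return pixels-from-center of the brightest green-dominant pixel."""
    best, best_score = None, -1
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            # Score favors bright pixels whose green channel dominates,
            # filtering the laser dot from the rest of the environment.
            score = g - max(r, b)
            if score > best_score:
                best_score, best = score, (x, y)
    width = len(frame[0])
    return abs(best[0] - width // 2)

# Usage: a 9x5 dark frame with a green dot placed at column 7.
frame = [[(10, 10, 10)] * 9 for _ in range(5)]
frame[2][7] = (30, 250, 40)
print(locate_laser_pfc(frame))  # offset of the dot from the center column
```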
II. RANGE SENSOR DESIGN
Figure 1 shows the arrangement of the webcam and laser pointer. This configuration allows us to
keep both components parallel, thereby facilitating a quick calculation of real distances from
a simple model.
Fig. 1. Photograph of the laser distance measurement setup.
III. GEOMETRY OF THE MODEL
Figure 2 shows the scheme used for the calculation of distance. The distance between the
webcam and the laser pointer is defined as H (in centimeters). The distance between the scanner and the object is
defined as D (in centimeters).
Also note that the representation of the scene in the image formed by the webcam lens has size A'B',
where AB represents the same scene at real size. On the other hand, the distance from the center of the
image to the location of the beam projected in the image can be defined as pfc (pixels from center), where Ɵ
represents the angle of vision (in radians) formed by the projection of the laser beam onto the image.
Fig. 2. Scheme of the arrangement of the camera-laser.
The distance D is obtained from the projection of the laser dot on the surface of the object.
With this scheme we can establish the following relation:
D = H / tan(Ɵ) (1)
The relationship can be rearranged to obtain the number of radians per pixel of incoming light, here called rpc
(radians per pixel pitch), thus converting each pixel in the image to its corresponding value in centimeters.
The following equation shows rpc in relation to Ɵ:
Ɵ= pfc * rpc (2)
Due to the existence of lens deformations in the webcam, we add to Equation 2 a parameter that
allows the correction of alignment, here called ro (radian offset):
Ɵ = pfc * rpc + ro (3)
To obtain the values of rpc and ro we use a linear regression model to find the relation between a
dependent variable Y and k explanatory variables xk, which generate a hyperplane of parameters β. Usually
this linear model is defined as:
Y = β0 + β1x1 + … + βkxk + Ɛ (4)
where Ɛ is a random variable that deals with those uncontrollable factors. The model satisfies the following
general equation:
Y = m X + b (5)
With the least-squares estimates

m = (n Σ xᵢyᵢ − Σ xᵢ Σ yᵢ) / (n Σ xᵢ² − (Σ xᵢ)²),   b = (Σ yᵢ − m Σ xᵢ) / n
where n is the number of samples; m is the slope of the regression line (rpc); b is the value originated
by uncontrollable factors (ro); and Y represents Ɵ.
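The regression above can be sketched in a few lines. This is our illustrative implementation, not the paper's code; the synthetic check generates noiseless data from known rpc and ro values and confirms the fit recovers them.

```python
# Sketch of the least-squares fit: given paired samples (pfc_i, theta_i),
# recover the slope m (rpc) and intercept b (ro) of theta = m*pfc + b.

def fit_line(xs, ys):
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope (rpc)
    b = (sy - m * sx) / n                          # intercept (ro)
    return m, b

# Synthetic check with assumed true parameters (illustrative values only).
rpc_true, ro_true = 0.0024, -0.056
pfc = [35, 45, 55, 65, 81, 103]
theta = [rpc_true * p + ro_true for p in pfc]
m, b = fit_line(pfc, theta)
print(m, b)
```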
In order to obtain the distance to the object, we substitute Equation 3 into Equation 1, each
in centimeters:

D = H / tan(pfc * rpc + ro) (6)
We call this formulation the equation of distance calculation.
We define the laser beam as the light projected by the laser pointer onto the object and captured by the webcam.
For our work, we use a laser pointer that emits green light, and its localization in the images is treated as a
way to introduce depth information into the image. The process of localization is affected by (i) the
ambient illumination in the environment; (ii) the texture of the object struck by the laser beam; and (iii)
the size of the laser beam projected onto the image (the size of the projected beam depends on the proximity
between the scanner and the object).
IV. RESULTS AND CALCULATIONS
To calibrate the system, we collect a series of measurements in which the range to the target is known.
The number of pixels the dot lies from the centre of the image is also recorded each time. Example data are set
out below:
Table 1: Calibration Data
Pixels from Centre Actual D (cm)
103 29
81 45
65 58
55 71
49 90
45 109
41 127
39 159
37 189
35 218
Using the following equation, we can now calculate the actual angle for each data point based on the value of h
and the actual distance:
Ɵ actual = arctan(h / D actual)
where:
Ɵ actual = actual angle
D actual = actual distance to the measured target
Once we have a Ɵ actual for each value, we can discover the relationship that lets us calculate Ɵ
from the number of pixels from the image centre. To use a linear relationship, the gain and offset are required.
Though this works well, it does not take into account that the focal plane is a 'plane', rather than curved at a
constant radius around the centre of the camera lens [4].
From the calibration data, we can calculate both the offset and the gain:
Offset (ro) = −0.056514344 radians
Gain (rpc) = 0.0024259348 radians/pixel
Using:
D = h / tan(pfc × rpc + ro)
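This final range equation can be exercised directly with the published gain and offset. Note a caveat: the baseline h is not stated in the paper, so the value below is back-solved from Table 2's calculated distances and should be treated purely as an assumption for illustration.

```python
import math

# Sketch of D = h / tan(pfc * rpc + ro) using the paper's published
# constants. H_CM is NOT from the paper: it is back-solved (assumed)
# so that the formula approximately reproduces Table 2.
RPC = 0.0024259348   # gain, radians per pixel
RO = -0.056514344    # offset, radians
H_CM = 5.843         # assumed camera-laser baseline in cm (back-solved)

def distance_cm(pfc):
    """Range to target for a dot pfc pixels from the image center."""
    return H_CM / math.tan(pfc * RPC + RO)

for pfc in (103, 65, 35):
    print(pfc, round(distance_cm(pfc), 2))
```

With this assumed baseline, the output tracks the "Calc D" column of Table 2 to within a few hundredths of a centimeter.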
We can calculate distances, including an error value relative to the actual distance, using the calibration data
above (the data listing appears after the references):
Table 2: Actual and Calculated Range Data
V. CONCLUSIONS
One possible improvement that can be made to this camera based Laser range finder is to use a laser
that projects a horizontal line rather than a dot onto a target. With this type of Laser, we could calculate the
range for each column of pixels in the image, rather than just one column as done here. This type of set-up
could be used to locate areas of maximum range as places that a vehicle could steer towards. Likewise, areas
with readings of minimum range would be identified as obstacles to be avoided.
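The line-laser extension described above can be sketched as follows. Everything here is illustrative: the constants reuse the published gain and offset with an assumed baseline (not stated in the paper), and the per-column pfc values are invented sample data.

```python
import math

# Sketch: with a horizontal laser line, each image column yields its own
# pfc (vertical offset of the line), hence its own range. Constants are
# the published gain/offset plus an ASSUMED baseline.
RPC = 0.0024259348
RO = -0.056514344
H_CM = 5.843  # assumed baseline, cm

def ranges_per_column(pfc_by_column):
    """Map each column's pfc to a range in centimeters."""
    return [H_CM / math.tan(p * RPC + RO) for p in pfc_by_column]

def steering_column(pfc_by_column):
    """Index of the column with maximum range (a clear direction)."""
    d = ranges_per_column(pfc_by_column)
    return max(range(len(d)), key=d.__getitem__)

# Usage: smaller pfc means a farther surface, so the vehicle should
# steer toward the column where the laser line sits closest to center.
pfc = [103, 81, 65, 55, 49, 45, 41]
print(steering_column(pfc))
```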
From our initial experiments, the green laser's dot can be hard to distinguish at distances greater than 2 meters,
and the dot tends to blend into light backgrounds in well-lit situations. A brighter laser may go a long
way toward solving these two issues.
REFERENCES
[1] Isaac Cohen and Gerard Medioni. Detecting and tracking objects in video surveillance. In Proceedings of
the IEEE Computer Vision and Pattern Recognition '99, Fort Collins, June 1999.
[2] Brian Gerkey, Kasper Stoy, and Richard T. Vaughan. Player robot server. Institute for Robotics and
Intelligent Systems Technical Report IRIS-00-391, University of Southern California, 2000.
[3] Stephen S. Intille, James W. Davis, and Aaron F. Bobick. Real-time closed world tracking. In
Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 928-934, June
1997.
[4] Andrew Ng. EM algorithm. CS229 Class Notes, 2008.
[5] Hai Nguyen. ROS wiki: Camera calibration, 2008.
[6] Sebastian Thrun, Wolfram Burgard, and Dieter Fox. Probabilistic Robotics, chapter 6.3, pages
124-139. MIT Press, 2005.
Pixels from Centre   Calc D (cm)   Actual D (cm)   % Error
103                  29.84         29              2.88
81                   41.46         45              -7.87
65                   57.55         58              -0.78
55                   75.81         71              6.77
49                   93.57         90              3.96
45                   110.85        109             1.70
41                   135.94        127             7.04
39                   153.27        159             -3.60
37                   175.66        189             -7.06
35                   205.70        218             -5.64
