ACEEE Int. J. on Information Technology, Vol. 01, No. 02, Sep 2011



Image Fusion of Video Images and Geo-localization for UAV Applications

K. Senthil Kumar1, Kavitha G.2, Subramanian R.3 and Marwan4
1, 3, 4 Division of Avionics, 2 Department of Electronics and Communication Engineering,
Madras Institute of Technology, Anna University Chennai, India
1 ksk_mit@annauniv.edu, 2 kavithag_mit@annauniv.edu, 3 subramanian.r.88@gmail.com, 4 munna860@gmail.com


Abstract—We present in this paper a method for accurately determining the location of a ground-based target when viewed from an Unmanned Aerial Vehicle (UAV). By determining the pixel coordinates of the target on the video frame and by using a range finder, the target's geo-location is determined in the North-East-Down (NED) frame. The contribution of this method is that the target can be localized to within 9 m when viewed from an altitude of 2500 m, and to within 1 m from an altitude of 100 m. The method offers a highly versatile tracking and geo-localisation technique that has a number of advantages over previously suggested methods. Some of the key factors that differentiate our method from its predecessors are:
    1) Day and night time operation
    2) All weather operation
    3) Highly accurate positioning of the target in terms of latitude-longitude (GPS) and altitude
    4) Automatic gimbaled operation of the camera once the target is locked
    5) Tracking is possible even when the target stops moving
    6) Independent of target type (moving or stationary)
    7) No terrain database is required
    8) Instantaneous target geolocalisation is possible

Index Terms—first term, second term, third term, fourth term, fifth term, sixth term

I. INTRODUCTION

    Unmanned Aerial Vehicles (UAVs) exist because they have the ability to perform dangerous tasks, with the advantage of not putting human lives at risk. Tasks such as tracking and reconnaissance require the UAV to determine the location of the target for military actions such as dropping GPS-guided bombs. This paper presents a method for determining the GPS coordinates of the target independent of whether the target is moving or stationary and independent of the type of target (vehicle, human or animal). The method uses different approaches to track the target using thermal imaging, and determines the target's position using a UAV fitted with a gimbal assembly carrying an RGB vision camera, a thermal camera and a range finder. In this paper we have assumed that the target is identified by a human at the ground control station (GCS). The UAV requires two persons for operation: one person operates the UAV (UAV pilot) while the other operates the gimbal assembly (UAV operator). Once the UAV operator identifies the target, the UAV system locks onto the target and the gimbal assembly automatically adjusts its azimuth and elevation to keep the target in the Field of View (FOV). The method uses three steps:

1.1. Thermal and visual image fusion of airborne video
    The use of thermal imaging technology and its capabilities have increased dramatically in this technological era. Thermal imaging used to be a very expensive technology reserved for military users; today many more applications make use of it. Sensor information fusion, which combines various sensing modalities, gives a realistic view; a subset of the sensor information does not collectively reveal the necessary data. With the development of new imaging sensors arises the need for a meaningful combination of all employed imaging sources. Image fusion of visual and thermal sensing outputs adds a new dimension in making the target tracking application of the UAV more reliable, and target tracking under smoke, fog and cloudy conditions is improved. Target identification, localisation, filtering and data association form an important application of the fusion process. Thus an effective surveillance and reconnaissance system can be formed.

1.2. Tracking of relevant target
    The fused thermal video gives a clear distinction of the target from its environment. The tracking module uses the Horn-Schunck method of optical flow to determine motion. Previously used methods require the operator to control the gimbal assembly to keep the target in the Field of View (FOV). Using the current method the target is tracked automatically, and there is no need to re-define the target to the system once tracking commences.
1.3. Target geo-localization
    Once the target is tracked in the pixel coordinate frame, data from the range finder is combined with it and converted to the NED frame. Hence the target can be tracked automatically using a feedback control mechanism connected to the gimbal assembly, and its instantaneous GPS coordinates can be determined in real time. Compared to previous methods, which had an accuracy of up to 8 m, this method is capable of determining the position of the target to within 1 m on the horizontal plane and with around 2 m error in altitude. Hence the complete 3D coordinates of the target can be determined.

II. PROPOSED METHOD

    The proposed method can be split into the following subdivisions:

A. Thermal and visual image fusion process
    Thermal images have a valuable advantage over visual images: they do not depend on illumination, since the output is the projection onto the thermal sensor of the heat emitted by the objects. This unique merit allows effective segmentation of objects, and ultimately surveillance using a UAV is improved. Considering a visual and a thermal video received from the UAV at the ground control station, the two videos are split into image frames. The visual and thermal images of the same frame are fused by applying the Haar wavelet transform to the fourth level; an inverse wavelet transform then gives the fused image. The requirements to be met here are images of the same resolution and the same field of view.
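    As an illustration of this fusion step, the sketch below decomposes two registered grayscale frames with a four-level Haar wavelet transform and reconstructs a fused frame with the inverse transform, roughly as described above. The fusion rule (averaging the approximation band, keeping the larger-magnitude detail coefficients) and the use of the PyWavelets library are our assumptions; the paper worked in MATLAB and does not state its coefficient-selection rule.

```python
import numpy as np
import pywt


def fuse_haar(visual, thermal, levels=4):
    """Fuse two co-registered, equal-size grayscale frames with a 4-level Haar
    decomposition. Fusion rule is an assumption, not taken from the paper."""
    cv = pywt.wavedec2(visual.astype(np.float32), 'haar', level=levels)
    ct = pywt.wavedec2(thermal.astype(np.float32), 'haar', level=levels)

    fused = [(cv[0] + ct[0]) / 2.0]                      # average the approximation band
    for (vh, vv, vd), (th, tv, td) in zip(cv[1:], ct[1:]):
        # pick, coefficient by coefficient, whichever detail value is stronger
        fused.append(tuple(np.where(np.abs(a) >= np.abs(b), a, b)
                           for a, b in ((vh, th), (vv, tv), (vd, td))))

    return pywt.waverec2(fused, 'haar')                  # inverse transform -> fused frame
```

Averaging the approximation band preserves the overall scene radiometry, while the max-abs rule on the detail bands keeps the sharper edge response of whichever sensor sees the target better; other rules would fit the section's description equally well.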
B. Gimbal assembly
    The gimbal assembly consists of a thermal camera, a video camera having the same intrinsic parameters as the thermal camera, and a range finder, all fitted onto the same gimbal so that they rotate together in the same direction about any axis. The gimbal assembly is positioned such that the thermal camera, the video camera and the range finder all point in the same direction; in other words, their lines of sight are always parallel. The gimbal assembly is also fitted with accelerometers to measure the elevation and azimuth angles of the optical sensors. The elevation and azimuth values are measured with respect to the body of the UAV.
    The gimbal assembly has two functions:
1. It keeps the thermal and optic sensors aligned on the same platform.
2. It measures the azimuth (az) and elevation (el) angles and sends the values to the processor.
    It should be noted that the gimbal attitude parameters are independent of the attitude of the UAV.
C. Tracking Process
    The tracking process uses the fused video as its input. The tracking algorithm uses the Horn-Schunck method to compute optical flow in the video. The Horn-Schunck method is superior to the Lucas-Kanade method here because it is resistant to the 'aperture problem' and is very suitable for UAV and airborne applications where the camera is mounted on a moving platform. Once the system starts tracking the target on the instructions given by the operator from the GCS, the centroid of the thresholded area is computed for every frame. This centroid is always maintained at the center of the image plane by the servos controlling the gimbal assembly: the difference between the center of the image plane and the centroid obtained while tracking is computed for the azimuth and elevation axes, and this error is fed back to the gimbal assembly so that it adjusts itself, keeping the error in both axes to a minimum and the target near the center of the image plane. This ensures that the operator at the GCS is not required to control the gimbal assembly to keep it focused on the target; the feedback mechanism takes care of this.
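    As a rough illustration of this tracking loop, the sketch below computes Horn-Schunck optical flow between two consecutive fused frames, thresholds the motion magnitude, takes the centroid of the moving region, and turns its offset from the image centre into azimuth and elevation corrections for the gimbal. The smoothness weight, iteration count, threshold rule and pixels-per-degree scale are illustrative assumptions, not values taken from the paper.

```python
import numpy as np
from scipy.ndimage import convolve

# neighbourhood-averaging kernel from the classic Horn-Schunck formulation
AVG_KERNEL = np.array([[1/12, 1/6, 1/12],
                       [1/6,  0.0, 1/6 ],
                       [1/12, 1/6, 1/12]])


def horn_schunck(prev, curr, alpha=15.0, n_iter=100):
    """Dense optical flow (u, v) between two grayscale frames."""
    prev = prev.astype(np.float32)
    curr = curr.astype(np.float32)
    Iy, Ix = np.gradient(prev)                 # spatial derivatives
    It = curr - prev                           # temporal derivative
    u = np.zeros_like(prev)
    v = np.zeros_like(prev)
    for _ in range(n_iter):
        u_avg = convolve(u, AVG_KERNEL)
        v_avg = convolve(v, AVG_KERNEL)
        common = (Ix * u_avg + Iy * v_avg + It) / (alpha**2 + Ix**2 + Iy**2)
        u = u_avg - Ix * common                # iterative Horn-Schunck update
        v = v_avg - Iy * common
    return u, v


def gimbal_correction(prev, curr, px_per_deg=20.0):
    """Centroid of the moving region -> pixel error from image centre -> angle corrections."""
    u, v = horn_schunck(prev, curr)
    mag = np.hypot(u, v)
    mask = mag > mag.mean() + 2.0 * mag.std()  # assumed motion threshold
    ys, xs = np.nonzero(mask)
    if xs.size == 0:                           # no motion detected this frame
        return None
    cx, cy = xs.mean(), ys.mean()              # target centroid in pixels
    err_x = cx - curr.shape[1] / 2.0           # positive: target right of centre
    err_y = cy - curr.shape[0] / 2.0           # positive: target below centre
    d_azimuth = err_x / px_per_deg             # degrees to slew in azimuth
    d_elevation = -err_y / px_per_deg          # degrees to slew in elevation
    return (cx, cy), (d_azimuth, d_elevation)
```

In the actual system these per-frame corrections would drive the gimbal servos, closing the feedback loop described above.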
D. Geolocalization
    Geolocalization uses the gimbal assembly described in Section B: the thermal camera, the video camera with identical intrinsic parameters and the range finder are mounted on one gimbal with parallel lines of sight, and the accelerometers provide the elevation and azimuth angles of the optical sensors with respect to the body of the UAV.

Fig. 1. A two dimensional representation of localizing the target

    As illustrated in Fig. 1, by knowing the height of the UAV 'H' and the elevation angle of the gimbal assembly 'θ', the range R of the target on the horizontal plane from the UAV can be determined with ease. It should be observed that in this case the target is assumed to be at zero altitude. However, if the target is on different terrain, such as in a trench or on a hill, then the above determination of the target's coordinates will be wrong. As shown in Fig. 2, the actual target is at position T(x2, y2), but using the above method we would obtain another, incorrect set of coordinates T(x2', y2').

Fig. 2. A two dimensional representation of localizing the target on a hilly terrain
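    For the level-ground case of Fig. 1 this is plain trigonometry: taking θ as the depression angle of the gimbal line of sight below the horizontal (the paper does not state its sign convention), tan θ = H / R, so R = H / tan θ.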
    Similar work can be done in 3D, the only difference being that the angular value θ is replaced by the elevation and azimuth angles 'el' and 'az'. The range finder serves to overcome the terrain problem described above: the range finder carried with the gimbal assembly provides the slant range 'F', and this value can be used to determine the ground distance 'R' between the UAV and the target. The three dimensional representation of the schematics is shown in Fig. 3 and Fig. 4.
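    A minimal sketch of this slant-range geolocalization, under a flat-earth, small-offset approximation that we assume (the paper does not spell out the conversion): the gimbal azimuth 'az' (measured from north) and depression angle 'el', together with the range-finder reading 'F', give the target's NED offset from the UAV, which is then converted to latitude, longitude and altitude. The function name and the Earth-radius constant are ours.

```python
import numpy as np

EARTH_RADIUS = 6378137.0  # WGS-84 equatorial radius in metres (assumed constant)


def target_geolocation(uav_lat, uav_lon, uav_alt, az, el, slant_range):
    """az: gimbal azimuth from north (rad); el: depression angle below the
    horizontal (rad); slant_range: laser range finder reading 'F' in metres."""
    ground_range = slant_range * np.cos(el)        # 'R', ground distance to the target
    drop = slant_range * np.sin(el)                # height of the UAV above the target
    north = ground_range * np.cos(az)              # NED offset of the target from the UAV
    east = ground_range * np.sin(az)
    # small-offset conversion of the NED displacement to geodetic coordinates
    lat = uav_lat + np.degrees(north / EARTH_RADIUS)
    lon = uav_lon + np.degrees(east / (EARTH_RADIUS * np.cos(np.radians(uav_lat))))
    alt = uav_alt - drop                           # target altitude, valid on hilly terrain too
    return lat, lon, alt
```

On level ground the drop equals the UAV height H, recovering the 2D relation of Fig. 1; on a hill or in a trench the range finder makes the altitude difference explicit, which is exactly the situation illustrated in Fig. 2 and Fig. 4.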




Fig. 3. A three dimensional representation of localizing the target

Fig. 4. A three dimensional representation of localizing the target on a hilly terrain

Fig. 5. Visual image from the database collection

Fig. 6. Thermal image from the database collection

III. RESULTS

    The fusion, tracking and localization results are given below.
A. Thermal and Visual Image Fusion
    Image fusion results based on the wavelet transform are discussed here. Two images (visual and thermal) are taken as shown in Fig. 5 and Fig. 6, and the fused image obtained as output is shown in Fig. 7. The visual image gives a realistic human-sensing view. The thermal image identifies the target through the temperature difference, with objects possessing different emissivity values. The fused image is obtained as the result of the four-level wavelet transform and combines the complementary information of the two sources. The image fusion results are with respect to images taken from the imagefusion.org image database collection.

Fig. 7. Thermal and visual fused image

B. Geo-Localization
    The videos used for simulating tracking are self-animated videos, which help in understanding the problems involved by introducing unwanted components such as gradient noise into the animation, so that the best and most robust method can be adopted. For the airborne video the Horn-Schunck method proves effective; the following figures show the tracking results obtained with the algorithm.






               Fig. 8. Sample target tracking video



Fig. 9. Sample target tracking video

    The centroid of the bounding box gives the coordinates (x, y) of the detected target on the image plane. This can be used only while the car is in motion: if the car stops, the centroid value returns to (0, 0). This is shown graphically in Fig. 10.

Fig. 10. Two dimensional representation of the car's movement on a plane

    However, by storing the previous value of the target in a buffer it is possible to hold the position of the target when it is not moving in the image plane. This is done using a user-customized position hold block (Fig. 11), sketched in code below; Fig. 12 shows the car's movement without it.

Fig. 11. IF loop for the Position Hold function block

Fig. 12. Two dimensional representation of the car's movement on a plane without the Position Hold block
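    A minimal sketch of the position-hold behaviour as we read Fig. 11: valid centroids pass straight through, and when the motion detector returns (0, 0) because the target has stopped, the last valid centroid is re-emitted from a one-element buffer. The class and method names are ours; the paper implements this logic as a user-customized block in MATLAB.

```python
class PositionHold:
    """Hold the last valid target centroid when the detector reports no motion."""

    def __init__(self):
        self.last = None          # one-element buffer for the previous centroid

    def update(self, centroid):
        # (0, 0) is the 'target stopped / no motion' output described in the text
        if centroid != (0, 0):
            self.last = centroid  # target moving: refresh the buffer
        return self.last          # target stopped: re-emit the held position
```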




    Results from MATLAB showing accuracy ranges after the introduction of errors: the results obtained using the proposed method were found satisfactory even after introducing the errors discussed previously. Table I shows the accuracies tested for different variations in GPS altitude, gimbal assembly angle and range finder reading. The results in Table I show that the best and worst accuracies of the method are 0.95 m and 21.8 m respectively.

TABLE I. GEOLOCALISATION ACCURACY RESULTS

    Fig. 13 depicts the accuracy of the target in graphical form. The green dot depicts the true position of the target, and the red and blue dots depict the accuracy obtained by introducing positive and negative errors respectively. These accuracy values were computed for different error values at varying altitude 'H', LRF range 'F' and gimbal angle 'θ'.

Fig. 13. Accuracy distribution with respect to the true position
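    The following sketch shows, under our own assumptions, how such an accuracy spread can be produced: the range-finder reading 'F' and the gimbal angle are perturbed by plus and minus assumed error bounds and the horizontal displacement of the computed target position from the unperturbed one is recorded. The perturbation magnitudes below are placeholders, not the values behind Table I; an altitude error would propagate to the target altitude in the same way.

```python
import itertools
import numpy as np


def horizontal_position(F, el, az):
    """Ground-plane (north, east) offset of the target from the UAV."""
    R = F * np.cos(el)                       # ground range from the slant range
    return np.array([R * np.cos(az), R * np.sin(az)])


def accuracy_spread(F, el, az, dF, d_el):
    """Worst-case horizontal deviation when F and the gimbal angle are perturbed."""
    true_pos = horizontal_position(F, el, az)
    worst = 0.0
    for sF, s_el in itertools.product((-1.0, 1.0), repeat=2):
        pos = horizontal_position(F + sF * dF, el + s_el * d_el, az)
        worst = max(worst, float(np.linalg.norm(pos - true_pos)))
    return worst


# placeholder numbers: 500 m slant range, 30 deg depression, 1 m LRF error, 0.5 deg gimbal error
print(accuracy_spread(500.0, np.radians(30.0), 0.0, 1.0, np.radians(0.5)))
```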
CONCLUSIONS

    The airborne images obtained from a UAV are analyzed at the ground control station. By using thermal images, all-weather and night operation become possible. Visual and thermal image fusion is performed and the fused image is given to the target tracking stage. The system has the benefit of an enhanced target tracking application where visual-only or thermal-only tracking would not provide sufficient efficiency; the image fusion process thus augments the information and improves the system as a whole. The overall system incorporates segmentation, fusion and target tracking principles. By using the tracked coordinates on the image frame, the coordinates can be shifted to the NED frame, which gives the GPS coordinates of the target. The method provides a robust application for the military and proves efficient, with an accuracy of 0.9 m from an altitude of 100 m.

ACKNOWLEDGMENT

    The authors wish to thank A, B, C. This work was supported in part by a grant from XYZ.

REFERENCES

[1] Y. Chen and C. Han, "Night-time Pedestrian Detection by Visual-Infrared Video Fusion," Proceedings of the 7th World Congress on Intelligent Control and Automation, China, 2008.
[2] Alex Leykin, Yang Ran and Riad Hammoud, "Thermal-Visible Video Fusion for Moving Target Tracking and Pedestrian Classification," IEEE Conference on Computer Vision and Pattern Recognition, Minneapolis, 2007.
[3] Zhang Jin-Yu, Chen Yan and Huang Xian-Xiang, "IR Thermal Image Segmentation Based on Enhanced Genetic Algorithms and Two-Dimensional Classes Square Error," Second International Conference on Information and Computing Science, 2009.
[4] Daniel Olmeda, Arturo de la Escalera and José M. Armingol, "Detection and Tracking of Pedestrians in Infrared Images," International Conference on Signals, Circuits and Systems, 2009.
[5] Wai Kit Wong, Poi Ngee Tan, Chu Kiong Loo and Way Soong Lim, "An Effective Surveillance System Using Thermal Camera," International Conference on Signal Acquisition and Processing, 2009.
[6] www.imagefusion.org
[7] H. B. Mitchell, Image Fusion: Theories, Techniques and Applications, Springer, 2010.
[8] K. Han and G. N. DeSouza, "Instantaneous Geo-Location of Multiple Targets from Monocular Airborne Video," Proceedings of the 2002 IEEE International Conference on Robotics and Automation, Washington DC, USA, May 2002.
[9] K. Han and G. N. DeSouza, "Multiple Targets Geolocation using SIFT and Stereo Vision on Airborne Video Sequences," 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, St. Louis, USA, October 11-15, 2009.
[10] D. B. Barber, J. Redding, T. W. McLain, R. W. Beard, and C. Taylor, "Vision-based target geo-location using a fixed-wing miniature air vehicle," Journal of Intelligent Robotics Systems, vol. 47, pp. 361-382.
[11] J. D. Redding, "Vision based target localization from a small fixed-wing unmanned air vehicle," Master's thesis, Brigham Young University, Provo, Utah 84602.




How Accurate are Carbon Emissions Projections?IES VE
 
Basic Building Blocks of Internet of Things.
Basic Building Blocks of Internet of Things.Basic Building Blocks of Internet of Things.
Basic Building Blocks of Internet of Things.YounusS2
 
20230202 - Introduction to tis-py
20230202 - Introduction to tis-py20230202 - Introduction to tis-py
20230202 - Introduction to tis-pyJamie (Taka) Wang
 
AI Fame Rush Review – Virtual Influencer Creation In Just Minutes
AI Fame Rush Review – Virtual Influencer Creation In Just MinutesAI Fame Rush Review – Virtual Influencer Creation In Just Minutes
AI Fame Rush Review – Virtual Influencer Creation In Just MinutesMd Hossain Ali
 
Meet the new FSP 3000 M-Flex800™
Meet the new FSP 3000 M-Flex800™Meet the new FSP 3000 M-Flex800™
Meet the new FSP 3000 M-Flex800™Adtran
 
UiPath Community: AI for UiPath Automation Developers
UiPath Community: AI for UiPath Automation DevelopersUiPath Community: AI for UiPath Automation Developers
UiPath Community: AI for UiPath Automation DevelopersUiPathCommunity
 
Igniting Next Level Productivity with AI-Infused Data Integration Workflows
Igniting Next Level Productivity with AI-Infused Data Integration WorkflowsIgniting Next Level Productivity with AI-Infused Data Integration Workflows
Igniting Next Level Productivity with AI-Infused Data Integration WorkflowsSafe Software
 
UiPath Studio Web workshop series - Day 6
UiPath Studio Web workshop series - Day 6UiPath Studio Web workshop series - Day 6
UiPath Studio Web workshop series - Day 6DianaGray10
 

Último (20)

201610817 - edge part1
201610817 - edge part1201610817 - edge part1
201610817 - edge part1
 
Secure your environment with UiPath and CyberArk technologies - Session 1
Secure your environment with UiPath and CyberArk technologies - Session 1Secure your environment with UiPath and CyberArk technologies - Session 1
Secure your environment with UiPath and CyberArk technologies - Session 1
 
Nanopower In Semiconductor Industry.pdf
Nanopower  In Semiconductor Industry.pdfNanopower  In Semiconductor Industry.pdf
Nanopower In Semiconductor Industry.pdf
 
OpenShift Commons Paris - Choose Your Own Observability Adventure
OpenShift Commons Paris - Choose Your Own Observability AdventureOpenShift Commons Paris - Choose Your Own Observability Adventure
OpenShift Commons Paris - Choose Your Own Observability Adventure
 
Cybersecurity Workshop #1.pptx
Cybersecurity Workshop #1.pptxCybersecurity Workshop #1.pptx
Cybersecurity Workshop #1.pptx
 
Artificial Intelligence & SEO Trends for 2024
Artificial Intelligence & SEO Trends for 2024Artificial Intelligence & SEO Trends for 2024
Artificial Intelligence & SEO Trends for 2024
 
KubeConEU24-Monitoring Kubernetes and Cloud Spend with OpenCost
KubeConEU24-Monitoring Kubernetes and Cloud Spend with OpenCostKubeConEU24-Monitoring Kubernetes and Cloud Spend with OpenCost
KubeConEU24-Monitoring Kubernetes and Cloud Spend with OpenCost
 
Apres-Cyber - The Data Dilemma: Bridging Offensive Operations and Machine Lea...
Apres-Cyber - The Data Dilemma: Bridging Offensive Operations and Machine Lea...Apres-Cyber - The Data Dilemma: Bridging Offensive Operations and Machine Lea...
Apres-Cyber - The Data Dilemma: Bridging Offensive Operations and Machine Lea...
 
Comparing Sidecar-less Service Mesh from Cilium and Istio
Comparing Sidecar-less Service Mesh from Cilium and IstioComparing Sidecar-less Service Mesh from Cilium and Istio
Comparing Sidecar-less Service Mesh from Cilium and Istio
 
Bird eye's view on Camunda open source ecosystem
Bird eye's view on Camunda open source ecosystemBird eye's view on Camunda open source ecosystem
Bird eye's view on Camunda open source ecosystem
 
ADOPTING WEB 3 FOR YOUR BUSINESS: A STEP-BY-STEP GUIDE
ADOPTING WEB 3 FOR YOUR BUSINESS: A STEP-BY-STEP GUIDEADOPTING WEB 3 FOR YOUR BUSINESS: A STEP-BY-STEP GUIDE
ADOPTING WEB 3 FOR YOUR BUSINESS: A STEP-BY-STEP GUIDE
 
How Accurate are Carbon Emissions Projections?
How Accurate are Carbon Emissions Projections?How Accurate are Carbon Emissions Projections?
How Accurate are Carbon Emissions Projections?
 
Basic Building Blocks of Internet of Things.
Basic Building Blocks of Internet of Things.Basic Building Blocks of Internet of Things.
Basic Building Blocks of Internet of Things.
 
20150722 - AGV
20150722 - AGV20150722 - AGV
20150722 - AGV
 
20230202 - Introduction to tis-py
20230202 - Introduction to tis-py20230202 - Introduction to tis-py
20230202 - Introduction to tis-py
 
AI Fame Rush Review – Virtual Influencer Creation In Just Minutes
AI Fame Rush Review – Virtual Influencer Creation In Just MinutesAI Fame Rush Review – Virtual Influencer Creation In Just Minutes
AI Fame Rush Review – Virtual Influencer Creation In Just Minutes
 
Meet the new FSP 3000 M-Flex800™
Meet the new FSP 3000 M-Flex800™Meet the new FSP 3000 M-Flex800™
Meet the new FSP 3000 M-Flex800™
 
UiPath Community: AI for UiPath Automation Developers
UiPath Community: AI for UiPath Automation DevelopersUiPath Community: AI for UiPath Automation Developers
UiPath Community: AI for UiPath Automation Developers
 
Igniting Next Level Productivity with AI-Infused Data Integration Workflows
Igniting Next Level Productivity with AI-Infused Data Integration WorkflowsIgniting Next Level Productivity with AI-Infused Data Integration Workflows
Igniting Next Level Productivity with AI-Infused Data Integration Workflows
 
UiPath Studio Web workshop series - Day 6
UiPath Studio Web workshop series - Day 6UiPath Studio Web workshop series - Day 6
UiPath Studio Web workshop series - Day 6
 

Image Fusion of Video Images and Geo-localization for UAV Applications

5) Tracking is possible even when the target stops moving
6) Independent of whether the target is moving or stationary
7) No terrain database is required
8) Instantaneous target geo-localisation is possible

Index Terms—first term, second term, third term, fourth term, fifth term, sixth term

I. INTRODUCTION

Unmanned Aerial Vehicles (UAVs) are used because they can perform dangerous tasks without putting human lives at risk. Tasks such as tracking and reconnaissance require the UAV to determine the location of the target for military actions such as dropping GPS-guided bombs. This paper presents a method for determining the GPS coordinates of the target that is independent of whether the target is moving or stationary and of the type of target (vehicle, human or animal), and it uses different approaches to track the target.

1.1. Thermal and visual image fusion of airborne video

Fusing the visual and thermal sensing modalities gives a realistic view; no subset of the sensor information reveals all of the necessary data on its own. With the development of new imaging sensors arises the need for a meaningful combination of all employed imaging sources. Image fusion of the visual and thermal outputs therefore adds a new dimension that makes the target-tracking application of the UAV more reliable: tracking in smoke, fog and cloudy conditions is improved, and target identification, localisation, filtering and data association become practical applications of the fusion process. Thus an effective surveillance and reconnaissance system can be formed.
1.2. Tracking of relevant target

The fused thermal video gives a clear distinction between the target and its environment. The tracking module uses the Horn-Schunck method of optical flow to determine motion. Previously reported methods require the operator to steer the gimbal assembly to keep the target in the Field of View (FOV); with the current method the target is tracked automatically, and there is no need to redefine the target to the system once tracking commences.
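For readers unfamiliar with the Horn-Schunck formulation, a minimal sketch of the standard iterative update on two grayscale frames is given below. This is a generic NumPy/SciPy illustration, not the implementation used in this work; the derivative kernels, smoothness weight alpha and iteration count are assumptions.

```python
# Minimal Horn-Schunck optical flow between two grayscale frames.
# Generic textbook formulation; alpha and n_iter are illustrative values only.
import numpy as np
from scipy.ndimage import convolve  # assumed available

def horn_schunck(frame1, frame2, alpha=1.0, n_iter=100):
    f1 = frame1.astype(float)
    f2 = frame2.astype(float)

    # Spatial and temporal derivatives from simple difference kernels.
    kx = np.array([[-1.0, 1.0], [-1.0, 1.0]]) * 0.25
    ky = np.array([[-1.0, -1.0], [1.0, 1.0]]) * 0.25
    kt = np.ones((2, 2)) * 0.25
    fx = convolve(f1, kx) + convolve(f2, kx)
    fy = convolve(f1, ky) + convolve(f2, ky)
    ft = convolve(f2, kt) - convolve(f1, kt)

    # Kernel that averages the flow over a pixel's neighbourhood.
    avg_k = np.array([[1/12, 1/6, 1/12],
                      [1/6,  0.0, 1/6],
                      [1/12, 1/6, 1/12]])

    u = np.zeros_like(f1)
    v = np.zeros_like(f1)
    for _ in range(n_iter):
        u_avg = convolve(u, avg_k)
        v_avg = convolve(v, avg_k)
        # Horn-Schunck update: penalise brightness-constancy violations
        # while keeping the flow field globally smooth.
        num = fx * u_avg + fy * v_avg + ft
        den = alpha**2 + fx**2 + fy**2
        u = u_avg - fx * num / den
        v = v_avg - fy * num / den
    return u, v
```

The resulting dense flow field (u, v) can then be thresholded to segment the moving target in the fused video.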
1.3. Target geo-localization

Once the target is tracked in the pixel coordinate frame, the data from the range finder is combined with it and converted to the NED frame. The target can therefore be tracked automatically using a feedback control mechanism connected to the gimbal assembly, and its instantaneous GPS coordinates can be determined in real time. Compared to previous methods, which had an accuracy of up to 8 m, this method is capable of determining the position of the target to within 1 m on the horizontal plane and with around 2 m of error in altitude. Hence the complete 3D coordinates of the target can be determined.

II. PROPOSED METHOD

The proposed method can be split into the following subdivisions.

A. Thermal and visual image fusion process

Thermal images have a valuable advantage over visual images: they do not depend on illumination, since the output is the projection of the heat emitted by the objects onto the thermal sensor. This unique merit allows effective segmentation of objects and ultimately improves surveillance from a UAV. Considering a visual and a thermal video obtained from the UAV at the ground control station, the two videos are split into frames. The visual and thermal images of the same frame are fused by applying the Haar wavelet transform to the fourth level, and an inverse wavelet transform yields the fused image. The considerations to be met here are images of the same resolution and the same field of view.
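As an illustration of this step, the sketch below performs a four-level Haar wavelet fusion of two registered grayscale frames using PyWavelets. The fusion rule (averaging the approximation band and keeping the larger-magnitude detail coefficients), the use of OpenCV for I/O and the file names are assumptions for the example; the paper does not state which fusion rule it applies.

```python
# Sketch of four-level Haar wavelet image fusion, assuming the common
# "average approximation, maximum-absolute detail" rule (illustrative only).
import numpy as np
import pywt
import cv2  # assumed available; any image loader works

def fuse_haar(visual_gray, thermal_gray, levels=4):
    """Fuse two registered, equal-size grayscale images with a 4-level Haar DWT."""
    cv_ = pywt.wavedec2(visual_gray.astype(float), 'haar', level=levels)
    ct_ = pywt.wavedec2(thermal_gray.astype(float), 'haar', level=levels)

    # Approximation band: average the two sources.
    fused = [(cv_[0] + ct_[0]) / 2.0]

    # Detail bands (LH, HL, HH at each level): keep the coefficient with
    # the larger magnitude, preserving the stronger edge response.
    for v_bands, t_bands in zip(cv_[1:], ct_[1:]):
        fused.append(tuple(np.where(np.abs(v) >= np.abs(t), v, t)
                           for v, t in zip(v_bands, t_bands)))

    return pywt.waverec2(fused, 'haar')

if __name__ == "__main__":
    # Hypothetical file names; the frames must be registered and equal-sized.
    vis = cv2.imread("visual_frame.png", cv2.IMREAD_GRAYSCALE)
    ir = cv2.imread("thermal_frame.png", cv2.IMREAD_GRAYSCALE)
    out = fuse_haar(vis, ir)
    cv2.imwrite("fused_frame.png", np.clip(out, 0, 255).astype(np.uint8))
```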
B. Gimbal assembly

The gimbal assembly consists of a thermal camera, a video camera having the same intrinsic parameters as the thermal camera, and a range finder, all fitted onto the same gimbal so that they rotate together in the same direction about any axis. The assembly is positioned such that the thermal camera, video camera and range finder all point in the same direction; in other words, their lines of sight are always parallel. The gimbal assembly is also fitted with accelerometers to measure the elevation and azimuth angles of the optical sensors, which are measured with respect to the body of the UAV. The gimbal assembly has two functions:
1. It keeps the thermal and optical sensors aligned on the same platform.
2. It measures the azimuth (θaz) and elevation (θel) angles and sends the values to the processor.
It should be noted that the gimbal attitude parameters are independent of the attitude of the UAV.

C. Tracking Process

The tracking process uses the fused video. The tracking algorithm uses the Horn-Schunck method to compute optical flow; it is superior to the Lucas-Kanade method for this application because it is resistant to the aperture problem and is well suited to UAV and airborne applications where the camera is mounted on a moving platform. Once the system starts tracking the target on the instructions given by the operator from the GCS, the centroid of the thresholded area is computed for every frame. This centroid is kept at the centre of the image plane by the servos controlling the gimbal assembly: the difference between the image centre and the tracked centroid is computed for the azimuth and elevation axes, and corrections are applied to keep the error in both axes to a minimum, i.e., feedback is provided so that the gimbal adjusts itself and the target stays near the centre of the image plane. This removes any requirement for the operator at the GCS to steer the gimbal assembly onto the target; the feedback mechanism takes care of it. A sketch of this correction step is given below.
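The sketch maps the pixel offset of the tracked centroid from the image centre to azimuth and elevation corrections for the gimbal servos. The proportional gain, field-of-view values and sign conventions are assumptions for illustration and are not taken from the paper.

```python
# Centroid-centering feedback: pixel error between the tracked centroid and
# the image centre is converted to azimuth/elevation corrections.
# Gain and pixels-per-degree scale are assumed values, not from the paper.
def gimbal_correction(centroid_xy, frame_wh, fov_deg=(40.0, 30.0), gain=0.5):
    cx, cy = centroid_xy          # tracked target centroid (pixels)
    w, h = frame_wh               # frame width, height (pixels)
    fov_x, fov_y = fov_deg        # horizontal / vertical field of view (degrees)

    # Pixel error relative to the image centre.
    err_x = cx - w / 2.0
    err_y = cy - h / 2.0

    # Convert the pixel error to an angular error using the field of view,
    # then apply a proportional gain so the gimbal converges smoothly.
    d_azimuth = gain * err_x * (fov_x / w)
    d_elevation = -gain * err_y * (fov_y / h)   # image y grows downwards
    return d_azimuth, d_elevation

# Example: a centroid at (400, 300) in a 640x480 frame commands a slew to the
# right and slightly up (sign conventions assumed).
print(gimbal_correction((400, 300), (640, 480)))
```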
D. Geolocalization

Geo-localization uses the same gimbal arrangement described in subsection B above: the thermal camera, video camera and range finder share one gimbal, their lines of sight are kept parallel, and the on-board accelerometers report the elevation and azimuth of the optical sensors with respect to the body of the UAV.

Fig. 1. A two-dimensional representation of localizing the target

As illustrated in Fig. 1, by knowing the height of the UAV 'H' and the elevation angle of the gimbal assembly 'θ', the range R of the target on the horizontal plane from the UAV can be determined with ease. It should be observed that in this case the target is assumed to be at zero altitude. However, if the target is on different terrain, such as in a trench or on a hill, this determination of the target's coordinates will be wrong: as shown in Fig. 2, the actual target is at position T(x2, y2), but the above method yields another, incorrect set of coordinates T(x2', y2').

Fig. 2. A two-dimensional representation of localizing the target on a hilly terrain

Similar work can be done in 3D, the only difference being that the single angle θ is replaced by θel and θaz. The range finder overcomes the terrain problem: mounted with the gimbal assembly, it provides the slant range 'F', and this value can be used to determine the ground distance 'R' between the UAV and the target. The three-dimensional representation of the schematics is shown in Fig. 3 and Fig. 4.

Fig. 3. A three-dimensional representation of localizing the target
Fig. 4. A three-dimensional representation of localizing the target on a hilly terrain
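The following sketch shows how a measured slant range and the gimbal angles can be turned into a NED offset and then into approximate GPS coordinates. The angle conventions (elevation measured downward from the horizontal, azimuth clockwise from north) and the flat-earth latitude/longitude conversion are assumptions made for illustration; they are not specified in the paper.

```python
# Sketch of the geo-localization step: slant range F and gimbal angles are
# converted to a North-East-Down offset and then to approximate GPS
# coordinates. Conventions and the flat-earth conversion are assumptions.
import math

EARTH_RADIUS_M = 6378137.0  # WGS-84 equatorial radius

def target_geolocation(uav_lat_deg, uav_lon_deg, uav_alt_m,
                       slant_range_m, elevation_deg, azimuth_deg):
    el = math.radians(elevation_deg)
    az = math.radians(azimuth_deg)

    # Horizontal ground range and height drop along the line of sight.
    ground_range = slant_range_m * math.cos(el)
    down = slant_range_m * math.sin(el)          # positive downwards (NED)

    north = ground_range * math.cos(az)
    east = ground_range * math.sin(az)

    # Small-offset flat-earth conversion from the NED offset to lat/lon.
    lat = uav_lat_deg + math.degrees(north / EARTH_RADIUS_M)
    lon = uav_lon_deg + math.degrees(east / (EARTH_RADIUS_M *
                                             math.cos(math.radians(uav_lat_deg))))
    target_alt = uav_alt_m - down
    return lat, lon, target_alt

# Example: a target 1500 m away (slant range), 30 degrees below the horizon,
# at a bearing of 045 degrees from a UAV flying at 2500 m altitude.
print(target_geolocation(13.0827, 80.2707, 2500.0, 1500.0, 30.0, 45.0))
```

Because the slant range F is measured directly rather than inferred from a zero-altitude assumption, the same computation remains valid over trenches and hills.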
III. RESULTS

The fusion, tracking and localization results are given below.

A. Thermal and Visual Image Fusion

Image fusion results based on the wavelet transform are discussed first. Two images, one visual and one thermal, are taken as shown in Fig. 5 and Fig. 6, and the fused image obtained as output is shown in Fig. 7. The visual image gives a realistic human-sensing view, while the thermal image identifies the target through the temperature differences of objects possessing different emissivity values. The fused image, obtained from the four-level wavelet transform, combines the complementary information of the two sources. The image fusion results shown are for images taken from the imagefusion.org image database collection.

Fig. 5. Visual image from the database collection
Fig. 6. Thermal image from the database collection
Fig. 7. Thermal and visual fused image

B. Geo-Localization

The videos used for simulating tracking are self-animated videos; they help in understanding the problems involved by introducing unwanted components, such as gradient noise, into the animation so that the best-performing method can be adopted. For the airborne video the Horn-Schunck method proves effective; the following figures show the results obtained in the tracking section.

Fig. 8. Sample target tracking video
Fig. 9. Sample target tracking video

The centroid of the bounding box gives the coordinates (x, y) of the detected target on the image plane. This can be used only while the car is in motion: if the car stops, the centroid value returns to (0, 0), as shown graphically in Fig. 10. However, by storing the previous value of the target in a buffer it is possible to hold the position of the target when it is not moving in the image plane. This is done using a user-customized position hold block (Fig. 11); a sketch of the logic is given below.

Fig. 10. Two-dimensional representation of the car's movement on a plane
Fig. 11. IF loop for the Position Hold function block
Fig. 12. Two-dimensional representation of the car's movement on a plane without the Position Hold block
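Since Fig. 11 itself is not reproduced here, the sketch below illustrates the buffering logic described in the text: when the motion-based detector reports (0, 0) because the target has stopped, the last valid centroid is re-used. The class and variable names are hypothetical.

```python
# Sketch of the position-hold logic (cf. Fig. 11): when the motion-based
# detector returns (0, 0) because the target has stopped moving, the last
# valid centroid is re-used so tracking and geo-localization can continue.
class PositionHold:
    def __init__(self):
        self.last_valid = None

    def update(self, centroid):
        x, y = centroid
        if (x, y) != (0, 0):          # target is moving: accept the new centroid
            self.last_valid = (x, y)
            return (x, y)
        # Target stopped: hold the previously buffered position (if any).
        return self.last_valid if self.last_valid is not None else (0, 0)

# Example: the detector loses the stationary car in frames 3 and 4, but the
# held centroid keeps pointing at its last known position.
hold = PositionHold()
for frame_centroid in [(120, 88), (124, 90), (0, 0), (0, 0), (130, 92)]:
    print(hold.update(frame_centroid))
```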
Results from MATLAB showing accuracy ranges after the introduction of errors: the results obtained using the proposed method were found to be satisfactory even after introducing the errors discussed previously. Table I shows the accuracies tested under different variations in GPS altitude, gimbal assembly angles and range-finder readings. The results in Table I show that the best and worst accuracies of the method are 0.95 m and 21.8 m respectively.

TABLE I. GEOLOCALISATION ACCURACY RESULTS

Fig. 13 depicts the accuracy of the target localization in graphical form. The green dot depicts the true position of the target, and the red and blue dots depict the positions obtained by introducing positive and negative errors respectively. These accuracy values were computed for different error values at varying altitude 'H', LRF range 'F' and gimbal angle 'θ'.

Fig. 13. Accuracy distribution with respect to the true position

CONCLUSIONS

The airborne images obtained from a UAV are analyzed at the ground control station. By using thermal images, all-weather and night operation are possible. Visual and thermal image fusion is performed, and the fused image is given to the target tracker. The system has the benefit of an enhanced target-tracking application where visual-only or thermal-only tracking would not provide sufficient performance; the image fusion process thus augments the information, leading to an improved system as a whole. The overall system incorporates segmentation, fusion and target-tracking principles. By using the tracked coordinates in the image frame, the coordinates can be shifted to the NED frame, which gives the GPS coordinates of the target. The method is robust for military applications and can prove efficient, since it has an accuracy of 0.9 m from an altitude of 100 m.

ACKNOWLEDGMENT

The authors wish to thank A, B, C. This work was supported in part by a grant from XYZ.

REFERENCES

[1] Y. Chena and C. Han, "Night-time Pedestrian Detection by Visual-Infrared Video Fusion," Proceedings of the 7th World Congress on Intelligent Control and Automation, China, 2008.
[2] Alex Leykin, Yang Ran and Riad Hammoud, "Thermal-Visible Video Fusion for Moving Target Tracking and Pedestrian Classification," IEEE Conference on Computer Vision and Pattern Recognition, Minneapolis, 2007.
[3] Zhang Jin-Yu, Chen Yan and Huang Xian-Xiang, "IR Thermal Image Segmentation Based on Enhanced Genetic Algorithms and Two-Dimensional Classes Square Error," Second International Conference on Information and Computing Science, 2009.
[4] Daniel Olmeda, Arturo de la Escalera and Jose M. Armingol, "Detection and Tracking of Pedestrians in Infrared Images," International Conference on Signals, Circuits and Systems, 2009.
[5] Wai Kit Wong, Poi Ngee Tan, Chu Kiong Loo and Way Soong Lim, "An Effective Surveillance System Using Thermal Camera," International Conference on Signal Acquisition and Processing, 2009.
[6] www.imagefusion.org
[7] H. B. Mitchell, Image Fusion: Theories, Techniques and Applications, Springer, 2010.
[8] K. Han and G. N. DeSouza, "Instantaneous Geo-Location of Multiple Targets from Monocular Airborne Video," Proceedings of the 2002 IEEE International Conference on Robotics and Automation, Washington DC, USA, May 2002.
[9] K. Han and G. N. DeSouza, "Multiple Targets Geolocation using SIFT and Stereo Vision on Airborne Video Sequences," 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, St. Louis, USA, October 11-15, 2009.
[10] D. B. Barber, J. Redding, T. W. McLain, R. W. Beard, and C. Taylor, "Vision-based target geo-location using a fixed-wing miniature air vehicle," Journal of Intelligent and Robotic Systems, vol. 47, pp. 361-382.
[11] J. D. Redding, "Vision-based target localization from a small fixed-wing unmanned air vehicle," Master's thesis, Brigham Young University, Provo, Utah 84602.