Localization of MAV in GPS-
denied Environment Using
Embedded Stereo Camera
Syaril Azrad, Farid Kendoul, Fadhil
Mohamad, Kenzo Nonami
Department of Mechanical
Engineering, Nonami Lab, Chiba
University
Research Background
• Vision based Autonomous Q-Rotor
Research in Chiba University
– Started in 2008; participated in the
1st US-Asian Demonstration and Assessment of Micro Aerial and
Unmanned Ground Vehicle Technology (MAV’08), Agra, India
– Single camera approach
• Color-based object tracking algorithm (visual servoing)
• Feature-based/Optical Flow (OF)
– Stereo camera approach
• Ground stereo camera based hovering
• Fully-embedded based object tracking
Current Research Background
• Localization of MAV using embedded
camera
– Research has been conducted by Kendoul et al.
using a single embedded camera with Optical
Flow to localize the rotorcraft position
– Height estimation improvement
• Fusion with Pressure Sensor
• Using a stereo camera for higher precision
Altitude Estimation for MAVs?
• GPS
• Pressure sensor
• Laser Range Finder
• Radar Sensors
• Vision System
– Single Camera
– Stereo Camera
For altitude estimation on small UAVs (MAVs),
we propose…
• Embedded Lightweight Stereo Camera
• Fusion of Optical Flow (SFM algorithm) with
Image Homography
• Fusion of Optical Flow (SFM algorithm) with
Scale Invariant Feature Transform (SIFT)-
based feature matching
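The stereo route to altitude rests on plain triangulation: for a rectified pair, depth is z = f·B/d, with f the focal length in pixels, B the baseline, and d the disparity. A minimal sketch (the focal length and baseline below are illustrative numbers, not our camera's calibration):

```python
def stereo_depth(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Depth of a point from a rectified stereo pair: z = f * B / d.

    disparity_px : horizontal pixel offset of the point between the views
    focal_px     : focal length in pixels (illustrative value below)
    baseline_m   : distance between the two camera centers in meters
    """
    return focal_px * baseline_m / disparity_px

# e.g. a 20 px disparity with f = 400 px and a 10 cm baseline
altitude = stereo_depth(20.0, 400.0, 0.10)  # 2.0 m
```

Depth precision degrades quadratically with range (Δz ∝ z²/(fB)), which is why a lightweight short-baseline rig favors low-altitude flight.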
Our Platform Introduction
• Quadrotor Air-frame
• Micro-controller
• Sensors
• Vision System
18th Aug. 2010, MOVIC 2010, Tokyo, Japan
• Size: 54 cm × 54 cm × 20 cm
• Total weight: < 700 grams
• Flight time: 10 min
• Range: 1–2 km
• Total cost: < 4000 USD
Proposed Vision-based MAV
localization algorithm
• Horizontal Position using Optical Flow-
based algorithm
• Altitude Position using Optical Flow fused
with Homography/SIFT-based matching
approach
Computing Optical Flow
Implementation
We use Lucas-Kanade Method
The feature point (x, y) in the next image I2 will be at (x + dx, y + dy),
and the same process is repeated at the next step (t + Δt), providing the
optical flow (dx/dt, dy/dt) and a simple feature-tracking algorithm.
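As a concrete sketch of the Lucas-Kanade step described above: solve the brightness-constancy system over a small window around the feature. This is a single-iteration, single-scale NumPy illustration, not the pyramidal tracker used on the MAV:

```python
import numpy as np

def lucas_kanade_step(I1, I2, x, y, win=7):
    """Estimate the displacement (dx, dy) of the feature at (x, y) between
    images I1 and I2 by solving the least-squares system
    [Ix Iy] [dx dy]^T = -It over a small window (one LK iteration)."""
    h = win // 2
    # Spatial gradients (central differences) and temporal difference.
    Iy_g, Ix_g = np.gradient(I1)
    It = I2 - I1
    sl = (slice(y - h, y + h + 1), slice(x - h, x + h + 1))
    A = np.stack([Ix_g[sl].ravel(), Iy_g[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    # Least squares; well-posed when the window contains texture
    # (A^T A well-conditioned -- the "good feature to track" test).
    d, *_ = np.linalg.lstsq(A, b, rcond=None)
    return d  # (dx, dy)

# Synthetic check: a smooth blob translated by 1 pixel in x.
yy, xx = np.mgrid[0:64, 0:64].astype(float)
I1 = np.exp(-((xx - 32) ** 2 + (yy - 32) ** 2) / 50.0)
I2 = np.exp(-((xx - 33) ** 2 + (yy - 32) ** 2) / 50.0)
dx, dy = lucas_kanade_step(I1, I2, 30, 32)  # dx ≈ 1, dy ≈ 0
```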
Horizontal Position Estimation (OF)
Imaging Model
Or it can be expressed as below.
Then we can express the OF equation as below.
What does the equation above mean?
A perspective-central camera model maps a 3D point to its image projection
Pi(xi, yi), and the velocity of Pi(xi, yi) can be expressed from this model.
Because the OF calculated from images contains both rotational and
translational parts, if we have data from the IMU about the rotational
velocity of the camera on the body of our MAV, we can extract the purely
translational part, meaning velocity and position.
Kendoul et al. (2007)
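To make the de-rotation concrete: for a calibrated camera (normalized coordinates, unit focal length), the rotation-induced flow at an image point depends only on that point and the gyro rates, never on depth, so it can be subtracted exactly. A sketch under those assumptions; the sign convention is the common one for this flow field and may differ from the paper's camera frame:

```python
import numpy as np

def derotate_flow(flow, pts, omega):
    """Remove the rotation-induced component of optical flow using IMU
    angular rates omega = (wx, wy, wz) [rad/s].

    pts  : Nx2 normalized image coordinates (unit focal length)
    flow : Nx2 measured optical flow at those points
    The rotational flow field of a perspective camera is independent of
    scene depth, so the gyro rates remove it exactly, leaving the
    translational part that carries the velocity/depth information."""
    wx, wy, wz = omega
    x, y = pts[:, 0], pts[:, 1]
    u_rot = x * y * wx - (1.0 + x ** 2) * wy + y * wz
    v_rot = (1.0 + y ** 2) * wx - x * y * wy - x * wz
    return flow - np.stack([u_rot, v_rot], axis=1)

# Numerical check: project a fixed 3-D point before/after a tiny camera
# rotation; the finite-difference flow should be cancelled by derotation.
omega = np.array([0.02, -0.05, 0.01])
P = np.array([0.3, -0.4, 2.0])           # point in the camera frame
dt = 1e-6
P2 = P - dt * np.cross(omega, P)         # camera rotation moves the point
p1, p2 = P[:2] / P[2], P2[:2] / P2[2]
flow = (p2 - p1)[None, :] / dt           # observed (purely rotational) flow
residual = derotate_flow(flow, p1[None, :], omega)
```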
The strategy is an estimation problem with the state vector shown below.
For the KF dynamics model, we can write the measurement equation plus noise,
with H as below, from our optical flow expression in equation (5).
Now, after estimating OFtrans, we can use it as the measurement vector for
the (MAV) translational velocity and structure parameters.
Both our cameras are mounted on the MAV; assuming the camera motion is smooth,
we can write the dynamic model as below, with γ the camera acceleration.
We use the model proposed by Kendoul et al. for depth (altitude), as below.
We can write the discrete dynamic system using the state vector X.
Since it is nonlinear, we implement an Extended Kalman Filter as proposed by
Kendoul et al. and estimate the translational velocity and depth (altitude).
The observation of the discrete model is as below.
1. OF raw data from the camera, attitude data from the IMU
2. Estimate the translational part of the OF; separate out the rotational part
3. Estimate the camera velocity and depth (altitude)
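A minimal sketch of this estimator: a 1-D EKF with state X = [v, z], the IMU-measured camera acceleration γ as process input, and the nonlinear measurement h(X) = v/z (the translational flow of a downward camera over a surface at depth z). This illustrates the structure only, not the paper's full multi-state filter; all noise values are made up for the demo:

```python
import numpy as np

def ekf_step(X, P, of_meas, gamma, dt, Q, R):
    """One EKF cycle for X = [v, z]: predict with the IMU acceleration,
    then update with the translational optical-flow measurement v/z."""
    # --- prediction: v integrates the known acceleration, z stays put ---
    v, z = X
    X = np.array([v + gamma * dt, z])
    P = P + Q                                 # dynamics Jacobian F = I
    # --- update with the nonlinear measurement h(X) = v/z ---
    v, z = X
    H = np.array([[1.0 / z, -v / z ** 2]])    # Jacobian of h at the estimate
    S = float(H @ P @ H.T) + R
    K = (P @ H.T / S).ravel()
    X = X + K * (of_meas - v / z)
    P = (np.eye(2) - np.outer(K, H)) @ P
    return X, P

# Simulated flight at z = 2 m with a sinusoidal acceleration; the filter
# starts with the wrong depth (z = 1 m) and should recover it, because the
# known acceleration breaks the v/z scale ambiguity.
rng = np.random.default_rng(0)
dt, z_true, v_true = 0.02, 2.0, 0.0
X = np.array([0.0, 1.0])                      # wrong initial depth guess
P = np.diag([1e-4, 1.0])                      # v known well, z uncertain
Q, R = np.diag([1e-6, 1e-4]), 1e-4
for k in range(500):
    gamma = 0.5 * np.sin(0.05 * k)            # IMU-measured acceleration
    v_true += gamma * dt
    of = v_true / z_true + rng.normal(0.0, 0.01)
    X, P = ekf_step(X, P, of, gamma, dt, Q, R)
```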
Verification Experiment of the Image Algorithm Fused with IMU Data
• Move along the x-axis with various attitudes
[Plot: X and Y distance [m] over 0–30 s, staying within about ±0.4 m]
Localization of MAV using Optical
Flow and SIFT-matching technique
• SIFT?
– Detects & describes features in a local image
• In our computation we use SiftGPU (Wu, 2009)
to speed up the matching
• The matching result is filtered by the RANSAC
algorithm to separate inliers from outliers
• Applying threads in the computations, we can run the
Optical Flow-based algorithm at 50 fps and SIFT
matching at 7–8 fps, including triangulation
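The RANSAC filtering step can be sketched independently of SIFT itself: fit a model from a minimal random sample, count inliers, and keep the best hypothesis. For brevity this sketch uses a pure-translation model between frames; the real pipeline fits homography / stereo geometry, and the keypoints would come from SiftGPU:

```python
import numpy as np

def ransac_translation(src, dst, thresh=2.0, iters=100, rng=None):
    """Separate inliers from outliers among putative feature matches by
    RANSAC, with a simple 2-D translation model for illustration.
    Returns the boolean inlier mask and the estimated translation."""
    if rng is None:
        rng = np.random.default_rng(0)
    best_mask = None
    for _ in range(iters):
        i = rng.integers(len(src))            # minimal sample: 1 match
        t = dst[i] - src[i]                   # hypothesized translation
        err = np.linalg.norm(dst - (src + t), axis=1)
        mask = err < thresh
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask = mask
    # Refit on all inliers for the final estimate.
    t = (dst[best_mask] - src[best_mask]).mean(axis=0)
    return best_mask, t

# Synthetic matches: 40 correct (shifted by [5, -3]) + 10 gross outliers.
rng = np.random.default_rng(1)
src = rng.uniform(0, 100, size=(50, 2))
dst = src + np.array([5.0, -3.0])
dst[40:] += rng.uniform(20, 50, size=(10, 2))   # corrupt the last 10
mask, t = ransac_translation(src, dst, rng=rng)
```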
Manual flying test with embedded
stereo camera
[Plots: estimated (x, y) and (z) results against ground truth]
Implementation Strategy
1. Vicon Data
2. Control Program
3. Receive Image Data
4. Process Image Data
Image Processing
1. Optical Flow-based
2. Stereo Feature Matching
GPU: Graphics Processing Unit
Socket
One computer with two separate processes, or
two computers
Implementation
• Implemented on one computer due to its high
capability, but the two processes share the GPU
• Core-i7 (4 Cores 8 Threads)
– Separate Core for Image Processing
– Separate Thread for Control
– Separate Core for Receiving Image Data
– Separate Thread for Vicon Data
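The per-task split above can be sketched with rate-limited worker loops, one per task. This is a toy Python version; the real system runs native threads pinned to cores, and only the task names and rates below follow the talk:

```python
import threading
import time

def rate_loop(name, hz, counters, stop):
    """Run one task loop at a fixed rate until `stop` is set; mirrors the
    one-thread-per-task layout (control, Vicon, image RX, vision)."""
    period = 1.0 / hz
    while not stop.is_set():
        t0 = time.monotonic()
        counters[name] += 1          # placeholder for the task's real work
        # Sleep out the remainder of the period to hold the loop rate.
        time.sleep(max(0.0, period - (time.monotonic() - t0)))

stop = threading.Event()
counters = {"control": 0, "vicon": 0, "sift": 0, "of": 0}
rates = {"control": 30, "vicon": 30, "sift": 8, "of": 15}   # Hz, as in the talk
threads = [threading.Thread(target=rate_loop, args=(n, hz, counters, stop))
           for n, hz in rates.items()]
for t in threads:
    t.start()
time.sleep(1.0)                      # let the loops run for about a second
stop.set()
for t in threads:
    t.join()
```

In CPython the GIL serializes these threads, so the sketch only shows the scheduling structure; the CPU-bound vision work would sit in separate processes or native code, as in the actual two-process / shared-GPU setup.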
Results
• Processes that have been implemented with
stable frequencies
– PID Control (30Hz)
– Vicon Data Acquisition (30Hz)
– Image Processing SIFT (8Hz) and Optical
Flow (15Hz)
Improvement of Result Strategy
[Plots: result before and after the improvement]
Future Work & Suggestions
• Apply SiftGPU for horizontal odometry
estimation and fuse it with OF-based
horizontal odometry
– Successive-frame feature matching
• On-board camera / image-processing
development, sharing technology
Thank you…
Editor's notes

1. Standard GPS vertical precision is between 25 m and 50 m. A pressure sensor, depending on the environment, has an error between 6% and 7%; moreover, a pressure sensor only measures relative height. A laser altimeter is good, but for our type of MAV it would not be suitable due to its energy consumption and surface requirements. A radar sensor would not be suitable due to its high cost and energy consumption. Thus we still think the vision system is one of the best alternatives. However, due to its high computing cost, depending on the algorithm there is an issue of whether we can implement it on-board or host-based.
2. Our platform is an off-the-shelf platform from Ascending Technologies, Germany. The size of our platform is 54 cm in diameter and 20 cm in height, with a total weight of less than 700 grams and a flight time of up to 12 minutes; however, we state 10 minutes here for safety purposes. The flight range is 1 to 2 kilometers, and the total cost is less than 4000 USD, which is around 350 thousand yen. The details of the price can be seen in the table in the right corner. In this platform we incorporated the GUMSTIX flight computer, which has a computing speed of 400 MHz and is equipped with Wi-Fi; the GUMSTIX is Linux-based. We also use the Crossbow MNAV IMU, pressure, and gyro sensors for avionics, and mount a small analog camera, which operates in the 1.3 GHz band, on the bottom of the platform.
3. The Optical Flow algorithm can be summarized by the picture here. When we select the object of interest, shown as the blue shape here, the Shi-Tomasi algorithm automatically specifies good features to track. We can limit the good features to track to a certain number; in this example five points are selected. Even though it is not exactly the center of the OOI, we output the center of the five features as the object center and relay it to the controller as an input. Based on the Optical Flow algorithm, the same five features will be tracked in the next frame. A reselection of features is executed when the feature of interest is lost in detection. We summarize the Optical Flow algorithm in the next slide.
4. As shown in the formula above, feature point (x, y) will be at (x+dx, y+dy) in the next image frame I2; dx and dy are defined as the argument of the minimum of this equation. The same process is then repeated in the next step, providing an optical flow and a simple feature-tracking algorithm. However, there are problems with feature detection. First, it selects only a limited number of points or features; next, it loses the object profile when the object leaves the screen due to movement or simply image blur from bad video-streaming reception. The feature-based algorithm does not "remember" the object as first selected. Thus we introduce a hypothetical algorithm to solve the problem, combining the color-based and feature-based algorithms.
5. The study has been done by Kendoul et al. using a single camera; however, we want to exploit the potential of the stereo camera, especially to correctly predict the depth (Zi) used in the equation.