
.remote sensing.Ece 402 unit-2

Physics of remote sensing, ideal remote sensing, swath, platforms, sensors, orbits and their characteristics, electromagnetic radiation (EMR), solar radiation and its applications, shortwave and longwave radiation, spectral reflectance curves, resolution and the multi concept, FCC.


  1. 1. Geoprocessing Sciences (Geoinformatics)
     • Remote Sensing
     • Geographic Information Systems (GIS)
     • Global Positioning Systems (GPS)
     • Geodesy (from Greek geodaisia, "division of the Earth")
     • Surveying
     • Automated Cartography
     • and many others
  2. 2. Remote Sensing Process
  3. 3. Energy Source or Illumination (A) - the first requirement for remote sensing is an energy source which illuminates or provides electromagnetic energy to the target of interest.
     Radiation and the Atmosphere (B) - as the energy travels from its source to the target, it comes into contact with and interacts with the atmosphere it passes through. This interaction may take place a second time as the energy travels from the target to the sensor.
     Interaction with the Target (C) - once the energy makes its way to the target through the atmosphere, it interacts with the target depending on the properties of both the target and the radiation.
     Recording of Energy by the Sensor (D) - after the energy has been scattered by, or emitted from, the target, a sensor (remote - not in contact with the target) is required to collect and record the electromagnetic radiation.
     Transmission, Reception, and Processing (E) - the energy recorded by the sensor has to be transmitted, often in electronic form, to a receiving and processing station where the data are processed into an image (hardcopy and/or digital).
     Interpretation and Analysis (F) - the processed image is interpreted, visually and/or digitally, to extract information about the target which was illuminated.
     Application (G) - the final element of the remote sensing process is achieved when we apply the information extracted from the imagery about the target in order to better understand it, reveal new information, or assist in solving a particular problem.
  4. 4. Ideal Remote Sensing System (diagram: EM energy source, sensor, data recorder, required information, user requirements; reflected, absorbed and transmitted energy at the target)
     1. A uniform energy source
     2. A non-interfering atmosphere
     3. A series of unique energy interactions at the earth's surface
     4. A super sensor
     5. A real-time data handling system
     6. Multiple data users
  5. 5. Remote Sensing System With respect to the type of energy source, remote sensing systems can be classified in two ways: 1. Passive systems detect radiation reflected or emitted from a natural source (the Sun's energy). 2. Active systems emit radiation which is directed towards the target to be investigated; the radiation reflected from that target is detected and measured by the sensor. Advantages include the ability to obtain measurements at any time.
  6. 6. Platforms and Types Sensor: an instrument that collects and records energy reflected or emitted from a target or surface. The sensor must reside on board a stable platform. Platform types: platforms for remote sensors may be situated on the ground, on an aircraft or balloon (or some other platform within the Earth's atmosphere), or on a spacecraft or satellite outside the Earth's atmosphere.
  7. 7. Satellite Characteristics Orbit selection can vary in terms of altitude (height above the Earth's surface) and orientation and rotation relative to the Earth. We consider two types of orbits: geostationary and near-polar. Geostationary orbits: • high altitude (about 36,000 km) • revolve at speeds which match the rotation of the Earth • west-to-east orientation • observe and collect information continuously over specific areas • weather and communications satellites commonly use these types of orbits.
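The "matching the Earth's rotation" point can be checked with Kepler's third law: a circular orbit at roughly 36,000 km altitude has a period of about one day. A minimal sketch follows; the Earth constants and the use of Kepler's law are assumptions added here, not part of the slide.

```python
import math

# Sketch (not from the slides): verify that a ~36,000 km altitude circular
# orbit completes one revolution in roughly one day, i.e. is geostationary.
GM_EARTH = 3.986e14   # Earth's gravitational parameter, m^3/s^2 (assumed constant)
R_EARTH = 6.371e6     # mean Earth radius, m (assumed constant)

altitude = 36_000e3                       # m, altitude quoted on the slide
a = R_EARTH + altitude                    # orbital radius for a circular orbit
period = 2 * math.pi * math.sqrt(a**3 / GM_EARTH)
print(f"Orbital period at 36,000 km altitude: {period / 3600:.1f} h")  # roughly 24 h
```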
  8. 8. Near-polar orbits: a north-south orbit which, in conjunction with the Earth's rotation (west-east), allows the satellite to cover most of the Earth's surface over a certain period of time. The inclination of the orbit is measured relative to a line running between the North and South poles. These satellite orbits are sun-synchronous, so that they cover each area of the world at a constant local time of day, called local sun time. This ensures consistent illumination conditions when acquiring images in a specific season over successive years, or over a particular area over a series of days, which is an important factor for monitoring changes between images or for mosaicking adjacent images together.
  9. 9. Swath The sensor "sees" a certain portion of the Earth's surface. The area imaged on the surface is referred to as the swath. The satellite's orbit and the rotation of the Earth work together to allow complete coverage of the Earth's surface (a cycle of orbits). In near-polar orbits, areas at high latitudes are imaged more frequently than the equatorial zone due to the increasing overlap of adjacent swaths as the orbit paths come closer together near the poles.
  10. 10. Electromagnetic Radiation (EMR) Wien's Displacement Law shows how the spectrum of black-body radiation at any temperature is related to the spectrum at any other temperature: if we know the shape of the spectrum at one temperature, we can calculate the shape at any other temperature. Spectral intensity can be expressed as a function of wavelength or of frequency. The wavelength of peak emission satisfies λ_peak · T = 2.898 × 10^6 nm·K.
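A short worked example of the displacement law from the slide; the Sun and Earth temperatures used below are illustrative assumptions, not values from the slide.

```python
# Minimal sketch of Wien's displacement law: lambda_peak * T = 2.898e6 nm·K
WIEN_CONSTANT = 2.898e6  # nm·K, as given on the slide

def peak_wavelength_nm(temperature_k: float) -> float:
    """Wavelength of maximum black-body emission, in nanometres."""
    return WIEN_CONSTANT / temperature_k

# Example temperatures (assumed, not taken from the slide):
print(peak_wavelength_nm(5800))  # Sun   -> ~500 nm, in the visible band
print(peak_wavelength_nm(300))   # Earth -> ~9660 nm (~9.7 µm), thermal IR
```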
  11. 11. According to James Clerk Maxwell, electromagnetic radiation (EM radiation or EMR) is a form of energy emitted and absorbed by charged particles that travels through space at the speed of light. It consists of two fluctuating fields: 1. an electric field (E) and 2. a magnetic field (M). Both fields are at right angles to each other and perpendicular to the direction of propagation.
  12. 12. Electromagnetic Radiation of Black-body (Light Intensity) Black body radiation has a characteristic, continuous frequency spectrum that depends only on the body's temperature
  13. 13. Solar Radiation Gamma ray: < 0.03 nm; X-ray: 0.03 nm – 3 nm; Ultraviolet: 3 nm – 0.4 µm; Visible: 0.4 µm – 0.7 µm; Infrared: 0.7 µm – 0.1 cm; Microwave: 0.1 cm – 30 cm; Radio wave: > 30 cm (ranging from short waves to long waves).
  14. 14. Electromagnetic Spectrum The range of all possible frequencies/wavelengths of electromagnetic radiation. Longest wavelength: on the order of the size of the universe; shortest wavelength: in the vicinity of atomic distances. From short wave to long wave: Gamma ray: < 0.03 nm; X-ray: 0.03 nm – 3 nm; Ultraviolet: 3 nm – 0.4 µm; Visible: 0.4 µm – 0.7 µm; Infrared: 0.7 µm – 0.1 cm; Microwave: 0.1 cm – 30 cm; Radio wave: > 30 cm.
  15. 15. Electromagnetic Spectral Regions
     Gamma ray (< 0.03 nm): incoming radiation is completely absorbed by the upper atmosphere and is not available for remote sensing.
     X-ray (0.03 nm – 3 nm): completely absorbed by the atmosphere; not employed in remote sensing.
     Ultraviolet (3 nm – 0.4 µm): incoming wavelengths less than 0.3 µm are completely absorbed by ozone in the upper atmosphere.
     Photographic UV band (0.3 µm – 0.4 µm): transmitted through the atmosphere; detectable with film and photo-detectors, but atmospheric scattering is severe.
     Visible (0.4 µm – 0.7 µm): imaged with film and photo-detectors; includes the reflected energy peak of the earth at 0.5 µm.
     Infrared (0.7 µm – 100 µm): interaction with matter varies with wavelength; atmospheric transmission windows are separated by absorption bands.
     Reflected IR (0.7 µm – 3.0 µm): reflected solar radiation that contains no information about the thermal properties of materials; the band from 0.7 to 0.9 µm is detectable with film and is called the photographic IR band.
     Thermal IR (3 µm – 5 µm and 8 µm – 14 µm): principal atmospheric windows in the thermal region; images at these wavelengths are acquired by optical-mechanical scanners and special vidicon systems, but not by film.
     Microwave (0.1 cm – 30 cm): longer wavelengths can penetrate clouds, fog, and rain; images may be acquired in the active or passive mode.
     Radar (0.1 cm – 30 cm): active form of microwave remote sensing; radar images are acquired at various wavelength bands.
     Radio (> 30 cm): longest wavelength portion of the electromagnetic spectrum; some classified radars with very long wavelength operate in this region.
  16. 16. Only specific bands of wavelengths are used for remote sensing purposes.
  17. 17. Visible Spectrum Violet: 0.400 – 0.446 µm; Blue: 0.446 – 0.500 µm; Green: 0.500 – 0.578 µm; Yellow: 0.578 – 0.592 µm; Orange: 0.592 – 0.620 µm; Red: 0.620 – 0.700 µm. Spectral bands in remote sensing: Band 1: blue band (0.4 – 0.5 µm); Band 2: green band (0.5 – 0.6 µm); Band 3: red band (0.6 – 0.7 µm).
  18. 18. Infrared Spectrum Near infrared (NIR): 0.78 – 3 µm, can be recorded on film. Mid infrared (MIR): 3 µm – 50 µm, can be detected using electro-optical sensors. Far infrared (FIR): 50 µm – 1000 µm, or thermal IR, can only be detected using electro-optical sensors.
  19. 19. Remote Sensing Electromagnetic Spectrum Only some parts of the EM spectrum are useful for remote sensing purposes. The wavelength range of EM energy used by a remote sensor is called its spectral range or spectral band. Wavelength ranges: visible region: 0.4 µm – 0.7 µm; infrared region: 0.7 µm – 1 cm; microwave region: 1 cm – 30 cm. Based on the wavelength region, remote sensing is of three types: visible and reflective IR remote sensing, thermal IR remote sensing, and microwave remote sensing.
  20. 20. Energy, Atmosphere, and Surface Interactions
  21. 21. Attenuation of Solar Radiation / EM Energy Causes: 1. Energy interaction with the atmosphere: a) scattering, b) refraction, c) atmospheric absorption. 2. Energy interaction with the surface/target: a) reflection, b) absorption at the surface, c) transmission. The attenuation depends upon: a) differences in path length, b) the magnitude of the energy signal being sensed, c) the atmospheric conditions present, d) the wavelengths involved.
  22. 22. Energy Interaction with Atmosphere Scattering: the redirection of light by particles; it can be in any direction (unpredictable). (Figure: scattering of EM radiation by a cloud.)
  23. 23. General Effects of Scattering • causes skylight (allows us to see in shadow) • forces the image to record the brightness of the atmosphere in addition to the target • directs reflected light away from the sensor, decreasing spatial detail (fuzzy images) • tends to make dark objects lighter and light objects darker (reduces contrast)
  24. 24. Types of Scattering Depending on the relationship between the diameter of the scattering particle (a) and the wavelength of the radiation (λ), scattering can be of three types: 1. Rayleigh scattering, 2. Mie scattering, 3. Non-selective scattering. (Figure: scattering of EM radiation by a cloud.)
  25. 25. Rayleigh Scattering (or molecular scattering): a << λ • Rayleigh scatter is common when radiation interacts with atmospheric molecules (gas molecules) and other tiny particles (aerosols) that are much smaller in diameter than the wavelength of the interacting radiation (mainly oxygen and nitrogen molecules). • Scattering intensity is proportional to λ^-4; as a result, short wavelengths are scattered much more strongly than long wavelengths. • It is common high in the atmosphere. • Rayleigh scatter is one of the principal causes of haze in imagery; visually, haze diminishes the crispness or contrast of an image. • It is responsible for the blue sky.
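The λ^-4 dependence can be made concrete with a quick comparison of blue and red light. This is a minimal sketch; the band-centre wavelengths below are illustrative assumptions, not values from the slide.

```python
# Sketch of the lambda^-4 dependence of Rayleigh scattering from the slide.
def rayleigh_relative_intensity(wavelength_um: float) -> float:
    """Relative Rayleigh scattering intensity (proportional to lambda^-4)."""
    return wavelength_um ** -4

blue, red = 0.45, 0.65   # µm, approximate band centres (assumed)
ratio = rayleigh_relative_intensity(blue) / rayleigh_relative_intensity(red)
print(f"Blue light is scattered ~{ratio:.1f}x more strongly than red")  # ~4.4x
```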
  26. 26. Mie Scattering (mean particle diameter 0.1 to 10 times λ) • Mie scatter exists when the atmospheric particle diameter is essentially equal to the wavelength of the energy being sensed. • Examples: water vapour, smoke particles, fine dust particles. • Scattering intensity is proportional to λ^d, where d ranges from -4 to 0 depending upon particle diameter. • This type of scatter tends to influence longer wavelengths than Rayleigh scatter does. • A clear atmosphere has both Rayleigh and Mie scattering; their combined wavelength exponent lies between about -0.7 and -2.
  27. 27. Non-selective Scattering: a >> λ • Non-selective scatter is more of a problem and occurs when the diameter of the particles causing scatter is much larger than the wavelengths being sensed. • Water droplets, which commonly have diameters between 5 and 100 µm, can cause such scatter and can affect all visible and near- to mid-IR wavelengths equally. • Consequently, this scattering is "non-selective" with respect to wavelength; in the visible wavelengths, equal quantities of blue, green and red light are scattered.
  28. 28. Refraction • the bending of light when it passes from one medium to another • degrades spectral signatures on hot, humid days
  29. 29. Atmospheric Absorption Mostly caused by atmospheric constituents: • ozone • carbon dioxide • water vapour • oxygen • dust particles
  30. 30. Energy Interaction with Surface / Target Reflection • the bouncing of electromagnetic energy from a surface • the type of reflection depends on the size of the surface irregularities relative to the incident wavelength. Types of reflection: 1. Specular reflectance 2. Diffuse/Lambertian reflectance
  31. 31. Specular Reflectance • light is reflected in a single direction ('mirror' reflection) • specular reflectance both helps and hinders remote sensing. Diffuse/Lambertian Reflectance • energy is reflected equally in all directions • many natural surfaces act as diffuse reflectors to some extent.
  32. 32. ENERGY INTERACTIONS WITH EARTH SURFACE MATERIALS EMR incident on any earth surface will interact in three fundamental ways: various fractions of the energy will be reflected, absorbed and/or transmitted. E_I(λ) = E_R(λ) + E_A(λ) + E_T(λ), where E_I = incident energy, E_R = reflected energy, E_A = absorbed energy, E_T = transmitted energy, and (λ) denotes that all components are a function of wavelength. Hence E_R(λ) = E_I(λ) - [E_A(λ) + E_T(λ)], i.e. reflection + transmission + absorption = 100% of the incident energy.
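The rearranged relation E_R = E_I - (E_A + E_T) is simple enough to express directly. A minimal sketch follows; the numeric values are made-up illustrative inputs, not data from the slide.

```python
# Sketch of the energy-balance relation on the slide:
# E_I(lambda) = E_R(lambda) + E_A(lambda) + E_T(lambda)
def reflected_energy(incident: float, absorbed: float, transmitted: float) -> float:
    """Reflected energy as the residual of the incident energy budget."""
    return incident - (absorbed + transmitted)

# Illustrative numbers (assumed): 100 units incident, 30 absorbed, 20 transmitted
print(reflected_energy(100.0, 30.0, 20.0))  # 50.0 -> a reflectance of 50%
```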
  33. 33. Resolution means the ability and quality of a sensor to view/detect objects. For satellite sensors it is of four types: spatial, spectral, temporal and radiometric. It is generally believed that improvements in resolution increase the probability that phenomena may be remotely sensed more accurately. The trade-off is that any improvement in resolution usually requires additional data-processing capability for either human- or computer-assisted analysis.
  34. 34. Temporal Resolution: the revisit time of the sensor to the same area, i.e. how often we observe. Temporal resolution is how often the sensor acquires data, e.g. every 30 days.
  35. 35. Temporal resolution represents the frequency with which a satellite can revisit an area of interest and acquire a new image. It depends on the instrument's field of view and the satellite's orbit. Example of remote sensor data acquisition with a 16-day revisit: June 1, 2006; June 17, 2006; July 3, 2006.
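A small sketch reproducing the 16-day acquisition sequence quoted on the slide, assuming a fixed revisit interval with no missed acquisitions.

```python
from datetime import date, timedelta

# Sketch of a 16-day revisit cycle matching the dates on the slide.
revisit = timedelta(days=16)
first = date(2006, 6, 1)
print(first, first + revisit, first + 2 * revisit)  # 2006-06-01 2006-06-17 2006-07-03
```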
  36. 36. Radiometric Resolution: the degree of detail observed. Radiometric resolution is the sensitivity of the detectors to small differences in electromagnetic energy.
  37. 37. Radiometric resolution, or radiometric sensitivity, refers to the number of digital levels used to express the data collected by the sensor: 7-bit (0 – 127), 8-bit (0 – 255), 9-bit (0 – 511), 10-bit (0 – 1023). In general, the greater the number of levels, the greater the detail of information.
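The level counts on the slide all follow from 2 raised to the bit depth. A minimal sketch of that relationship:

```python
# Number of grey levels for a given radiometric resolution (bit depth),
# matching the slide: 7-bit -> 0-127, 8-bit -> 0-255, and so on.
for bits in (6, 7, 8, 9, 10, 11):
    levels = 2 ** bits
    print(f"{bits:>2}-bit: 0 - {levels - 1}")
```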
  38. 38. Spatial Resolution: what size we can resolve. Spatial resolution is the size of the field of view, e.g. 10 x 10 m or 5 x 5 m, i.e. the smallest unit that can be resolved. • It represents the ability of the sensor to detect and distinguish small objects and fine detail in larger objects. • It depends on the instrument's sensitivity and distance from the object, and defines the pixel size of a digital image. • It is the fineness of detail visible in an image: low (coarse) resolution - the smallest features are not discernible; high (fine) resolution - small objects are discernible. • Factors affecting spatial resolution: atmosphere, haze, smoke, low light, particles, or blurred sensor systems.
  39. 39. Better spatial resolution (comparison of two images: resolution ≈ 200 m vs. resolution ≈ 10 m).
  40. 40. Spatial Resolution Imagery of residential housing in Mechanicsville, New York, obtained on June 1, 1998, at a nominal spatial resolution of 0.3 x 0.3 m (approximately 1 x 1 ft.) using a digital camera.
  41. 41. Example spatial and radiometric resolutions: 30 m Landsat TM (8 bit); 10 m SPOT PAN (8 bit); 5.6 m IRS-1C PAN (6 bit); 1 m IKONOS (11 bit).
  42. 42. Spectral Resolution: what wavelengths we use. SPECTRAL: sensitive to specific wavelength intervals. Spectral resolution is the number and size of the spectral regions the sensor records data in, e.g. blue, green, red, near-infrared, thermal infrared, microwave (radar).
  43. 43. The term spectral resolution refers to the width of the spectral bands that a satellite imaging system can detect. Often satellite imaging systems are multi-spectral, meaning that they can detect in several discrete bands; it is the width of these bands that spectral resolution refers to. The narrower the bands, the greater the spectral resolution.
  44. 44. SPECTRAL RESOLUTION Landsat TM
  45. 45. Spatial resolution: the spacing of my data points. Data points per hectare: 80 m = 1.5; 30 m = 11; 23.5 m = 18; 10 m = 100; 5 m = 400; 1 m = 10,000. Spectral resolution: the "various colors" of my data points. Radiometric resolution: how well I can distinguish my data points. Temporal resolution: the number of times that I can see my data points (T1, T2, T3, ...).
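The "data points per hectare" figures follow from dividing a hectare (100 m x 100 m) by the pixel area. A minimal sketch that reproduces the numbers on the slide:

```python
# Sketch reproducing the "data points per hectare" figures on the slide.
# One hectare is a 100 m x 100 m square (10,000 m^2).
def pixels_per_hectare(pixel_size_m: float) -> float:
    return 10_000 / (pixel_size_m ** 2)

for size in (80, 30, 23.5, 10, 5, 1):
    print(f"{size:>5} m pixels -> {pixels_per_hectare(size):.1f} per ha")
# 80 m -> ~1.6, 30 m -> ~11, 23.5 m -> ~18, 10 m -> 100, 5 m -> 400, 1 m -> 10,000
```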
  46. 46. Spectral Reflectance Curve The energy reaching the surface is called irradiance and the energy reflected by the surface is called radiance. All matter is composed of atoms and molecules with particular compositions; therefore, matter reflects each wavelength according to its internal state. A graph of the spectral reflectance of an object as a function of wavelength is called a spectral reflectance curve, also known as its color readability.
  47. 47. Spectral Reflectance Curve for VEGETATION Generally, a leaf is built up of layers of structural fibrous organic matter, within which are pigmented, water-filled cells and air spaces. Therefore, vegetation is considered under the following features: 1. pigmentation (i.e. colour), 2. physiological structure (i.e. leaf structure), 3. water content (i.e. moisture). Each feature has an effect on the reflectance, absorbance and transmittance properties of a green leaf. Wavelengths absorbed by pigments: chlorophyll a: 0.43 to 0.66 microns; chlorophyll b: 0.45 to 0.65 microns; β-carotene: blue to green; xanthophyll: blue to green. Physiological structure: the discontinuities in the refractive indices within a leaf determine its near-infrared reflectance; these discontinuities occur between membranes and cytoplasm within the upper half of the leaf, and more importantly between individual cells and air spaces of the spongy mesophyll within the lower half of the leaf. Water content: detected in the near and middle infrared.
  48. 48. Spectral Reflectance Curve for SOIL The reflectance curves of most soil types are similar, with an increase in reflectance with wavelength. The dominant factors for reflectance are: 1. moisture content, 2. organic content, 3. soil texture, 4. soil structure, 5. iron oxide content.
  49. 49. Spectral Reflectance Curve for soils at different levels of moisture
  50. 50. Standard Product Information received from the satellite + corrections = standard product. After correction, the raw data are converted from the image coordinate system to a ground coordinate system. Standard products are generated from the information received from the satellite after applying two corrections: 1. geometric correction, 2. radiometric correction. Causes requiring geometric correction: 1. distortion due to the relative motion of the satellite with respect to the earth, 2. distortion due to earth curvature, 3. panoramic distortion arising out of the tilt angle. Correction steps: 1. establish a mapping between the image space and the output space as defined by the user, 2. transform the input data into this defined output space (a small sketch of such a mapping follows below).
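As a hedged illustration of the mapping step only (not the actual correction chain used for any real product), the sketch below applies a simple affine transform from image coordinates to ground coordinates; the coefficients and the 30 m pixel size are made-up assumptions.

```python
import numpy as np

# Minimal sketch of the mapping step in geometric correction: an affine
# transform from image coordinates (row, col) to ground coordinates (E, N).
def image_to_ground(row: float, col: float, transform: np.ndarray) -> np.ndarray:
    """Apply a 2x3 affine transform [[a, b, c], [d, e, f]] to (col, row, 1)."""
    return transform @ np.array([col, row, 1.0])

affine = np.array([
    [30.0,   0.0,   500_000.0],  # Easting  = 30*col + 500,000 (30 m pixels, assumed)
    [ 0.0, -30.0, 4_200_000.0],  # Northing = -30*row + 4,200,000 (assumed origin)
])
print(image_to_ground(0, 0, affine))      # upper-left corner -> [500000, 4200000]
print(image_to_ground(100, 250, affine))  # an interior pixel -> [507500, 4197000]
```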
  51. 51. Causes requiring radiometric correction: 1. non-uniform response of the detector and detector elements, 2. detector element failure, 3. data loss during communication, 4. narrow dynamic range, 5. image-to-image variation. Correction steps: 1. detector normalization, 2. framing of the received scene, 3. failed/degraded detector correction, 4. line loss correction.
  52. 52. Data reception, transmission and processing
  53. 53. Remote Sensing and Image Interpretation Photographic/image interpretation is defined as the act of examining photographic images for the purpose of identifying objects and judging their significance. Image interpretation is a complex task and requires several steps to be conducted in a well-defined routine consisting of the following processes: 1. Classification: determination of the presence or absence of objects. 2. Enumeration: listing and counting of objects which are visible in the image. 3. Mensuration: measurement of objects in terms of length, area, volume, height. 4. Delineation: outlining regions of homogeneous objects or areas. The most important task for spatial interpretation is to establish interpretation keys, i.e. identifying the typical spatial and spectral patterns of known geographical features.
  54. 54. Elements of Image Interpretation: tone/colour, pattern, texture, size, shape, shadow, location, association, scale, resolution. Interpretation keys: selective keys and elimination keys.
  55. 55. Elements of Image Interpretation
     Shape - refers to the general outline of objects. Regular geometric shapes are usually indicators of human presence and use. Some objects can be identified almost solely on the basis of their shapes: for example, the Pentagon Building, (American) football fields, cloverleaf highway interchanges.
     Size - the size of objects must be considered in the context of the scale of a photograph. The scale will help you determine if an object is a stock pond or Lake Minnetonka.
     Tone - refers to the relative brightness or color of elements on a photograph. It is perhaps the most basic of the interpretive elements, because without tonal differences none of the other elements could be discerned.
     Texture - the impression of "smoothness" or "roughness" of image features, caused by the frequency of change of tone in photographs. It is produced by a set of features too small to identify individually. Grass, cement, and water generally appear "smooth", while a forest canopy may appear "rough".
     Site - refers to topographic or geographic location. This characteristic of photographs is especially important in identifying vegetation types and landforms. For example, large circular depressions in the ground are readily identified as sinkholes in central Florida, where the bedrock consists of limestone.
     Association - some objects are always found in association with other objects. The context of an object can provide insight into what it is. For instance, a nuclear power plant is not (generally) going to be found in the midst of single-family housing.
     Pattern (spatial arrangement) - the patterns formed by objects in a photo can be diagnostic. Consider the difference between (1) the random pattern formed by an unmanaged area of trees and (2) the evenly spaced rows formed by an orchard.
  56. 56. Elements of Image Interpretation Shape (depends on the object outline); Size (relative to one another); Tone (brightness, hue, color); Texture (smooth or coarse); Site (location helps recognition); Pattern; Association (features that are normally found near the object); Shadow (helps to determine height).
  57. 57. Identify the following features in the image and explain how you were able to do so based on the elements of visual interpretation: 1. race track, 2. river, 3. roads, 4. bridges, 5. residential area, 6. dam. Answers: 1. Race track: its characteristic shape. 2. River: contrasting tone and shape. 3. Roads: bright tone and linear features. 4. Bridges: association with the river; they cross it. 5. Residential area: the pattern it makes in conjunction with roads. 6. Dam: bright tone against the dark river, its shape, and its association with the river - where else would a dam be?
  58. 58. Standard false color composite (FCC): three MSS band images are joined (mosaicked) and projected through red, green and blue filters, with a common light source, to produce the final composite image. Bands shown: MSS Band 4: Blue (0.4 – 0.5 µm), MSS Band 5: Green (0.5 – 0.6 µm), MSS Band 6: Red (0.6 – 0.7 µm).
  59. 59. False color composite
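A minimal digital sketch of composing an FCC, using the common convention of displaying the near-IR band through the red channel, red through green, and green through blue; the band arrays are random stand-ins (assumed shapes and values), not real sensor data, and the channel assignment is the general convention rather than the exact combination shown on the slide.

```python
import numpy as np

# Sketch of building a false color composite by stacking three bands as RGB.
rng = np.random.default_rng(0)
green, red, nir = (rng.integers(0, 256, (100, 100), dtype=np.uint8) for _ in range(3))

fcc = np.dstack([nir, red, green])   # RGB image: R <- NIR, G <- red, B <- green
print(fcc.shape)                     # (100, 100, 3)
```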
  60. 60. True/Natural Color Image False Color IR Image
  61. 61. Multispectral Concept of Remote Sensing Format of a multispectral image: brightness values are organized by lines or rows (i), columns (j), and bands (k). Each picture element (pixel) stores a brightness value, typically 8-bit (0 – 255, from black through gray to white on the associated gray scale). For example, the pixel at line 4, column 4, in band 1 has a brightness value of 24, i.e. BV(4,4,1) = 24.
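The line/column/band indexing described above maps naturally onto a three-dimensional array. A minimal sketch, assuming a tiny made-up image in which only the BV(4,4,1) = 24 value from the slide is set:

```python
import numpy as np

# Sketch of the multispectral image format: brightness values indexed by
# line (i), column (j), and band (k). Slide indices are 1-based; arrays are 0-based.
lines, columns, bands = 5, 5, 4
bv = np.zeros((lines, columns, bands), dtype=np.uint8)   # 8-bit values: 0-255
bv[3, 3, 0] = 24    # line 4, column 4, band 1 on the slide

print(bv[3, 3, 0])  # 24
```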
  62. 62. Remote Sensing Satellites To date, many satellites have been launched for observation of the Earth for different purposes (weather, meteorology, monitoring) since the 1960s. Worldwide, three major land satellite missions have been launched: 1. the LANDSAT series by the USA, 2. the SPOT series by France, 3. the IRS series by India. These are low-altitude satellites, with altitudes of less than 1000 km above the Earth's surface and sensors having low spatial resolution and spectral resolution of less than 0.1 µm. High spatial resolution satellites: the stereo ability of SPOT and IRS-1C/1D PAN data provides a new dimension to cartographic mapping applications up to 1:10,000 scale. Intense research has led to significant improvements in spatial resolution, close to 1 m. Some of the sensors capable of acquiring such high spatial resolution are: 1. IKONOS, 2. QuickBird, 3. OrbView. Both low and high spatial resolution satellites use different combinations of spectral bands in their sensors; therefore, on the basis of the spectral band(s) used, we have different types of remote sensors.
  63. 63. Remote Sensing Data Products The National Remote Sensing Centre (NRSC) is the focal point for distribution of remote sensing satellite data products in India and its neighbouring countries. Its ground station at Shadnagar (Hyderabad) receives data from almost all contemporary remote sensing satellites, such as IRS-P5, IRS-P6, IRS-P4, IRS-1D, IRS-1C, IRS-P3, ERS-1/2, the NOAA series, AQUA, etc. Products are classified by: 1. level of processing, 2. output media/scale, 3. area of coverage. By level of processing: 1. Standard products: a) path/row based, b) shift along track, c) quadrant product, d) geo-referenced product, e) basic stereo-pairs, f) area-of-interest based product. 2. Value added products: a) geo-coded product, b) merged product, c) ortho product. 3. Derived products. By output media: 1. Photographic data products: a) B/W, b) FCC. 2. Digital products: a) GeoTIFF, b) Fast format, c) LGSOWG (Landsat Ground Station Operators Working Group), d) HDF (Hierarchical Data Format), e) CCT.
  64. 64. Types of Remote Sensors The most common types of sensors used in remote sensing satellites are: 1. Return Beam Vidicon (RBV), 2. Multi Spectral Scanner (MSS), 3. Thematic Mapper (TM), 4. Enhanced Thematic Mapper (ETM), 5. Enhanced Thematic Mapper Plus (ETM+). Each of these sensors collected data over a swath width of 185 km, with a full scene being defined as 185 km x 185 km. The most popular instrument in the early days of Landsat was the Multi Spectral Scanner (MSS) and later the Thematic Mapper (TM).
     Return Beam Vidicon (RBV): a camera-like instrument that took "snapshot" images of the earth's surface along the ground track of the satellite. Image frames of 185 km x 185 km were acquired with each shot, repeated at 25-second intervals to give contiguous frames in the along-track direction at the equivalent ground speed of the satellite.
     Multi Spectral Scanner (MSS): senses the electromagnetic radiation from the Earth's surface in four spectral bands. Each band has a spatial resolution of approximately 60 x 80 metres and a radiometric resolution of 64 digital numbers (6 bits). Sensing is accomplished with a line-scanning device using an oscillating mirror; six scan lines are collected simultaneously with each west-to-east sweep of the scanning mirror.
     Thematic Mapper (TM): provides several improvements over the MSS sensor, including higher spatial and radiometric resolution, finer spectral bands, seven as opposed to four spectral bands, and an increase in the number of detectors per band (16 for the non-thermal channels versus six for MSS). Sixteen scan lines are captured simultaneously for each non-thermal spectral band (four for the thermal band), using an oscillating mirror which scans during both the forward (west-to-east) and reverse (east-to-west) sweeps. All channels are recorded over a range of 256 digital numbers (8 bits).
  65. 65. Types of Sensors and Spectral Bands
     TM bands (spectral range, µm): Band 1: 0.45 – 0.52 (blue); Band 2: 0.52 – 0.60 (green); Band 3: 0.63 – 0.69 (red); Band 4: 0.76 – 0.90 (near IR); Band 5: 1.55 – 1.75 (shortwave IR); Band 6: 10.4 – 12.5 (thermal IR); Band 7: 2.08 – 2.35 (shortwave IR).
     ETM: all TM bands plus a PAN band (0.5 – 0.9 µm).
     ETM+: same as ETM, with a thermal band resolution of 60 m.
  66. 66. Explain why data from the Landsat TM sensor might be considered more useful than data from the original MSS sensor. Think about their spatial, spectral, and radiometric resolutions. Explanation: there are several reasons why TM data may be considered more useful than MSS data. 1. Although the area coverage of a TM scene is virtually the same as that of an MSS scene, TM offers higher spatial, spectral, and radiometric resolution. 2. The spatial resolution is 30 m compared to 80 m (except for the TM thermal channels, which are 120 m to 240 m). 3. The level of spatial detail detectable in TM data is better, and TM has more spectral channels which are narrower and better placed in the spectrum for certain applications, particularly vegetation discrimination. 4. The increase from 6 bits to 8 bits for data recording represents a four-fold increase in the radiometric resolution of the data (remember, 6 bits = 2^6 = 64 levels and 8 bits = 2^8 = 256 levels; therefore 256/64 = 4). However, this does not mean that TM data are "better" than MSS data. Indeed, MSS data are still used to this day and provide an excellent data source for many applications. If the desired information cannot be extracted from MSS data, then perhaps the higher spatial, spectral, and radiometric resolution of TM data may be more useful.
  67. 67. Other Sensors The Indian remote sensing satellite IRS-P6 carries three solid-state cameras: 1. a medium resolution multispectral sensor, LISS-III; 2. a high resolution multispectral sensor, LISS-IV; 3. an Advanced Wide Field Sensor, AWiFS.
     LISS-III camera (Linear Imaging Self Scanning): a multi-spectral camera operating in four spectral bands, three in the visible and near-infrared and one in the SWIR region, as in the case of IRS-1C/1D. The new feature in the LISS-III camera is the SWIR band (1.55 to 1.7 microns), which provides data with a spatial resolution of 23.5 m, unlike IRS-1C/1D (where the spatial resolution is 70 m).
     LISS-IV camera (Linear Imaging Self Scanning): a multispectral high resolution camera with a spatial resolution of 5.8 m. The sensor consists of three linear odd-even pairs of CCD arrays, each with 12000 pixels. The odd and even pixel rows are separated by 35 microns, which corresponds to five scan lines. The placement of the three CCDs in the focal plane is such that their imaging strips on the ground are separated by 14.25 km in the along-track direction.
     AWiFS sensor: provides enhanced capabilities compared to the WiFS camera on board IRS-1C/1D in terms of spatial resolution (56 m vs. 188 m), radiometric resolution (10 bits vs. 7 bits) and spectral bands (4 vs. 2), with the additional feature of on-board detector calibration using LEDs. The spectral bands of AWiFS are the same as those of LISS-III.
  68. 68. Multi-Spectral Scanner (MSS) MSS Band 4: 0.5 – 0.6 µm (Green); MSS Band 5: 0.6 – 0.7 µm (Red); MSS Band 6: 0.7 – 0.8 µm (Photo IR); MSS Band 7: 0.8 – 0.9 µm (Near IR).
  69. 69. Satellite Series
  70. 70. High-Resolution Satellite Systems IKONOS (1999): 680 km orbit; revisit is typically 1 – 3 days; 1 metre panchromatic; 4 metre visible colour/infrared; scan width of 13 km.
  71. 71. QuickBird (2000): 61 cm (2 foot) panchromatic; 2.44 metre multispectral; image swath of 16.5 km; 1 – 3 day revisit.
  72. 72. Geostationary Satellites Geostationary satellites view the earth from a fixed vantage point above it. As they move in synchrony with the earth's rotation, they can provide regular coverage of a region and help in forecasting.
  73. 73. Polar Orbiting Satellites Polar orbits allow for lower altitudes and higher image resolution. These satellites make multiple passes over the earth before returning to the same location.
  74. 74. Satellite Orbit Characteristics • Geosynchronous Orbit (GEO): about 36,000 km above the Earth; includes commercial and military communications satellites and satellites providing early warning of ballistic missile launches. • Medium Earth Orbit (MEO): includes navigation satellites (GPS, Galileo, Glonass). • Low Earth Orbit (LEO): from 80 to 2000 km above the Earth; includes military intelligence satellites and weather satellites.
  75. 75. Unit Completed …
  76. 76. Assignment !!!!
  77. 77. Submission date:
     1. What are the characteristics of a real remote sensing system? How do they differ from an ideal remote sensing system?
     2. Explain the general process involved in electromagnetic remote sensing. Differentiate between active and passive remote sensing systems; under what conditions is each preferable?
     3. What are the essential differences between raw, standard and geocoded imagery? Which is most suitable in terms of geometric quality?
     4. Draw a neat diagram of the electromagnetic spectrum showing the various regions of particular interest in remote sensing.
     5. Explain the multi concept in remote sensing.
     6. Write the characteristics of any one satellite, with its sensor, bands, swath, resolution, altitude, and repetivity.
     7. Explain the types of resolution.
