What is Lidar?
A Resource Curated by AirTopo Group
Contents
Articles
Lidar
Remote sensing
Light
Radar
Geomatics
Archaeology
Geography
Geology
Geomorphology
Seismology
Forestry
Atmospheric physics
Contour line
Laser
National Center for Atmospheric Research
Ultraviolet
Interferometric visibility
Infrared
Aerosol
Cloud
Meteorology
Aircraft
Satellite
Surveying
Micrometre
Computer
Beam splitter
Laser scanning
Azimuth
Optics
Attenuation
Global Positioning System
Inertial measurement unit
National lidar dataset
Agricultural Research Service
Canopy (biology)
Orienteering
Soil survey
LIDAR speed gun
Speed limit enforcement
Wind farm
Structure from motion
CLidar
Lidar detector
Satellite laser ranging
Optical time-domain reflectometer
Optech
TopoFlight
Time-domain reflectometry
References
Article Sources and Contributors
Image Sources, Licenses and Contributors
Article Licenses
License
Lidar
Image: A FASOR used at the Starfire Optical Range for lidar and laser guide star experiments is tuned to the sodium D2a line and used to excite sodium atoms in the upper atmosphere.
Image: This lidar may be used to scan buildings, rock formations, etc., to produce a 3D model. The lidar can aim its laser beam in a wide range: its head rotates horizontally and a mirror tilts vertically. The laser beam is used to measure the distance to the first object in its path.
Lidar (also written LIDAR or LiDAR) is a remote sensing technology that measures distance by illuminating a target with a laser and analyzing the reflected light. The term lidar comes from combining the words light and radar.
Lidar is popularly known as a technology used to make high-resolution maps, with applications in geomatics, archaeology, geography, geology, geomorphology, seismology, forestry, remote sensing, atmospheric physics,[1] airborne laser swath mapping (ALSM), laser altimetry, and contour mapping.
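Because lidar ranging is based on the round-trip travel time of a laser pulse, the core distance calculation is a one-liner. The sketch below is illustrative only, not taken from any particular instrument; it converts a measured echo delay into a range.

    # Illustrative time-of-flight ranging: the pulse travels to the
    # target and back, so distance = c * t / 2.
    C = 299_792_458.0  # speed of light in vacuum, m/s

    def range_from_delay(round_trip_seconds):
        """One-way distance in metres for a measured echo delay."""
        return C * round_trip_seconds / 2.0

    print(range_from_delay(1e-6))  # a 1 microsecond round trip -> ~150 m target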
History and etymology of lidar/LIDAR
Lidar was developed in the early 1960s, shortly after the invention of the laser, and combined laser-focused imaging with radar's ability to calculate distances by measuring the time for a signal to return. Its first applications were in meteorology, where the National Center for Atmospheric Research used it to measure clouds.
Although commonly considered to be an acronym, the term lidar is actually a portmanteau of "light" and "radar". The first published mention of lidar makes this clear: "Eventually the laser may provide an extremely sensitive detector of particular wavelengths from distant objects. Meanwhile, it is being used to study the moon by 'lidar' (light radar) and it promises a means of communications, not only all over the solar system, but also with planets of nearby stars";[2] the Oxford English Dictionary supports this etymology.
The assumption that lidar was an acronym (LIDAR) came later, beginning in 1970,[3] and was based on the reasoning that since the base term "radar" originally started as an acronym for "RAdio Detection And Ranging", LIDAR must stand for "LIght Detection And Ranging"[4] or "Laser Imaging, Detection and Ranging".[5] Although "radar" is no longer treated as an acronym and is universally uncapitalized, the word "lidar" became capitalized as LIDAR in some publications beginning in the 1980s.[6] Today there is no consensus on capitalization, reflecting uncertainty about whether or not it is an acronym and, if it is, whether it should be lowercase like "radar". The spelling also varies by publication: "LIDAR", "LiDAR", "LIDaR", or "Lidar". The USGS uses both LIDAR and lidar, sometimes in the same document,[7] and the New York Times uses both "lidar" and "Lidar".[8]
General description
Lidar uses ultraviolet, visible, or near-infrared light to image objects and can be used with a wide range of targets, including non-metallic objects, rocks, rain, chemical compounds, aerosols, clouds, and even single molecules.[1] A narrow laser beam can be used to map physical features with very high resolution.
Lidar has been used extensively for atmospheric research and meteorology. Downward-looking lidar instruments fitted to aircraft and satellites are used for surveying and mapping; a recent example is the NASA Experimental Advanced Research Lidar.[9] In addition, NASA has identified lidar as a key technology for enabling autonomous precision safe landing of future robotic and crewed lunar landing vehicles.[10]
Wavelengths from about 10 micrometers down to the UV (ca. 250 nm) are used to suit the target. Typically, light is reflected via backscattering. Different types of scattering are used for different lidar applications; the most common are Rayleigh scattering, Mie scattering, Raman scattering, and fluorescence. Based on the kind of backscattering used, the lidar is accordingly called a Rayleigh lidar, Mie lidar, Raman lidar, Na/Fe/K fluorescence lidar, and so on.[1] Suitable combinations of wavelengths can allow remote mapping of atmospheric contents by looking for wavelength-dependent changes in the intensity of the returned signal.
Design
In general there are two kinds of lidar detection schemes: "incoherent" or direct energy detection (principally an amplitude measurement) and coherent detection (best for Doppler or phase-sensitive measurements). Coherent systems generally use optical heterodyne detection, which, being more sensitive than direct detection, allows them to operate at much lower power, but at the expense of more complex transceiver requirements.
In both coherent and incoherent lidar, there are two types of pulse models: micropulse lidar systems and high-energy systems. Micropulse systems developed as a result of ever-increasing computer power combined with advances in laser technology. They use considerably less energy in the laser, typically on the order of one microjoule, and are often "eye-safe", meaning they can be used without safety precautions. High-power systems are common in atmospheric research, where they are widely used for measuring many atmospheric parameters: the height, layering, and densities of clouds; cloud particle properties (extinction coefficient, backscatter coefficient, depolarization); temperature; pressure; wind; humidity; and trace gas concentrations (ozone, methane, nitrous oxide, etc.).[1]
There are several major components to a lidar system:
1. Laser — 600–1000 nm lasers are most common for non-scientific applications. They are inexpensive, but since they can be focused and easily absorbed by the eye, the maximum power is limited by the need to make them eye-safe. Eye-safety is often a requirement for most applications. A common alternative, 1550 nm lasers, are eye-safe at much higher power levels since this wavelength is not focused by the eye, but the detector technology is less advanced, so these wavelengths are generally used at longer ranges with lower accuracies. They are also used for military applications, as 1550 nm is not visible in night vision goggles, unlike the shorter 1000 nm infrared laser. Airborne topographic mapping lidars generally use 1064 nm diode-pumped YAG lasers, while bathymetric systems generally use 532 nm frequency-doubled diode-pumped YAG lasers, because 532 nm penetrates water with much less attenuation than 1064 nm. Laser settings include the laser repetition rate (which controls the data collection speed). Pulse length is generally an attribute of the laser cavity length, the number of passes required through the gain material (YAG, YLF, etc.), and Q-switch speed. Better target resolution is achieved with shorter pulses, provided the lidar receiver detectors and electronics have sufficient bandwidth[1] (see the sketch after this list for the relationship between pulse length and range resolution).
2. Scanner and optics — How fast images can be developed is also affected by the speed at which they are scanned. There are several options for scanning the azimuth and elevation, including dual oscillating plane mirrors, a combination with a polygon mirror, and a dual-axis scanner (see Laser scanning). Optic choices affect the angular resolution and the range that can be detected; the sketch after this list also shows how a range/azimuth/elevation measurement becomes a 3D point. A hole mirror or a beam splitter are options for collecting a return signal.
3. Photodetector and receiver electronics — Two main photodetector technologies are used in lidar: solid-state photodetectors, such as silicon avalanche photodiodes, and photomultipliers. The sensitivity of the receiver is another parameter that has to be balanced in a lidar design.
4. Position and navigation systems — Lidar sensors mounted on mobile platforms such as airplanes or satellites require instrumentation to determine the absolute position and orientation of the sensor. Such devices generally include a Global Positioning System receiver and an Inertial Measurement Unit (IMU).
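Two of the relationships above lend themselves to a short numerical sketch: the range resolution set by the pulse length (item 1) and the conversion of a scanner's range/azimuth/elevation measurement into a 3D point (item 2). The values and names below are assumptions for illustration, not from any particular system.

    import math

    C = 299_792_458.0  # speed of light, m/s

    def range_resolution(pulse_seconds):
        """Shorter pulses resolve more closely spaced targets: dR = c * tau / 2."""
        return C * pulse_seconds / 2.0

    def point_from_scan(r, azimuth_deg, elevation_deg):
        """Convert a (range, azimuth, elevation) measurement into local
        Cartesian coordinates (x east, y north, z up)."""
        az = math.radians(azimuth_deg)
        el = math.radians(elevation_deg)
        x = r * math.cos(el) * math.sin(az)
        y = r * math.cos(el) * math.cos(az)
        z = r * math.sin(el)
        return x, y, z

    print(range_resolution(5e-9))              # a 5 ns pulse -> ~0.75 m
    print(point_from_scan(100.0, 45.0, 10.0))  # one point of a scan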
3D imaging can be achieved using both scanning and non-scanning systems. "3D gated viewing laser radar" is a non-scanning laser ranging system that applies a pulsed laser and a fast gated camera.
Imaging lidar can also be performed using arrays of high-speed detectors and modulation-sensitive detector arrays, typically built on single chips using CMOS and hybrid CMOS/CCD fabrication techniques. In these devices each pixel performs some local processing, such as demodulation or gating at high speed, down-converting the signals to video rate so that the array may be read like a camera. Using this technique, many thousands of pixels/channels may be acquired simultaneously.[11] High-resolution 3D lidar cameras use homodyne detection with an electronic CCD or CMOS shutter.
A coherent imaging lidar uses synthetic array heterodyne detection to enable a staring single-element receiver to act as though it were an imaging array.[12]
Applications
Image: This lidar-equipped mobile robot uses its lidar to construct a map and avoid obstacles.
Other than the applications listed above, there is a wide variety of applications of lidar, as often mentioned in National Lidar Dataset programs.
Agriculture
Image: Agricultural Research Service scientists have developed a way to incorporate lidar with yield rates on agricultural fields. This technology will help farmers improve their yields by directing their resources toward the high-yield sections of their land.
Lidar can also be used to help farmers determine which areas of their fields to apply costly fertilizer. Lidar can create a topographical map of the fields, revealing the slopes and sun exposure of the farmland. Researchers at the Agricultural Research Service blended this topographical information with the farmland's yield results from previous years and, from this information, categorized the farmland into high-, medium-, or low-yield zones.[13] This technology is valuable to farmers because it indicates which areas to apply expensive fertilizers to achieve the highest crop yield.
Archaeology
Lidar has many applications in the field of archaeology, including aiding in the planning of field campaigns, mapping features beneath forest canopy, and providing an overview of broad, continuous features that may be indistinguishable on the ground.[14] Lidar can also provide archaeologists with the ability to create high-resolution digital elevation models (DEMs) of archaeological sites that can reveal micro-topography otherwise hidden by vegetation. Lidar-derived products can be easily integrated into a Geographic Information System (GIS) for analysis and interpretation. For example, at Fort Beausejour – Fort Cumberland National Historic Site, Canada, previously undiscovered archaeological features below the forest canopy, related to the siege of the fort in 1755, have been mapped. Features that could not be distinguished on the ground or through aerial photography were identified by overlaying hillshades of the DEM created with artificial illumination from various angles. With lidar, the ability to produce high-resolution datasets quickly and relatively cheaply can be an advantage. Beyond efficiency, its ability to penetrate forest canopy has led to the discovery of features that were not distinguishable through traditional geospatial methods and are difficult to reach through field surveys, as in work at Caracol by Arlen Chase and his wife Diane Zaino Chase.[15] The intensity of the returned signal can be used to detect features buried under flat vegetated surfaces such as fields, especially when mapping in the infrared spectrum. The presence of these features affects plant growth and thus the amount of infrared light reflected back.[16] In 2012, lidar was used by a team attempting to find the legendary city of La Ciudad Blanca in the Honduran jungle. During a seven-day mapping period, they found evidence of extensive man-made structures that had eluded ground searches for hundreds of years.[17]
Autonomous vehicles
Image: A 3D SICK lidar.
Autonomous vehicles use lidar for obstacle detection and avoidance, navigating safely through environments.[18]
Biology and conservation
Lidar has also found many applications in forestry. Canopy heights, biomass measurements, and leaf area can all be studied using airborne lidar systems. Similarly, lidar is used by many industries, including energy and railroad companies and departments of transportation, as a faster way of surveying. Topographic maps can also be generated readily from lidar, including for recreational uses such as the production of orienteering maps.[19]
In addition, the Save-the-Redwoods League is undertaking a project to map the tall redwoods of California's northern coast. Lidar allows research scientists not only to measure the height of previously unmapped trees but also to determine the biodiversity of the redwood forest. Stephen Sillett, who is working with the League on the North Coast Lidar project, claims this technology will be useful in directing future efforts to preserve and protect ancient redwood trees.[20]
Geology and soil science
High-resolution digital elevation maps generated by airborne and stationary lidar have led to significant advances in geomorphology (the branch of geoscience concerned with the origin and evolution of Earth's surface topography). Lidar's abilities to detect subtle topographic features such as river terraces and river channel banks, to measure the land-surface elevation beneath the vegetation canopy, to better resolve spatial derivatives of elevation, and to detect elevation changes between repeat surveys have enabled many novel studies of the physical and chemical processes that shape landscapes.[citation needed]
In geophysics and tectonics, a combination of aircraft-based lidar and GPS has evolved into an important tool for detecting faults and for measuring uplift. The output of the two technologies can produce extremely accurate elevation models for terrain, models that can even measure ground elevation through trees. This combination was used most famously to find the location of the Seattle Fault in Washington, USA,[21] and to measure uplift at Mount St. Helens using data from before and after the 2004 uplift.[22] Airborne lidar systems monitor glaciers and can detect subtle amounts of growth or decline. A satellite-based system, NASA's ICESat, includes a lidar subsystem for this purpose. NASA's Airborne Topographic Mapper[23] is also used extensively to monitor glaciers and perform coastal change analysis. The combination is also used by soil scientists when creating a soil survey. The detailed terrain modeling allows soil scientists to see slope changes and landform breaks, which indicate patterns in soil spatial relationships.
Meteorology and atmospheric environment
The first lidar systems were used for studies of atmospheric composition, structure, clouds, and aerosols. Initially based on ruby lasers, lidar for meteorological applications was constructed shortly after the invention of the laser and represents one of the first applications of laser technology.
Differential absorption lidar (DIAL) is used for range-resolved measurements of a particular gas in the atmosphere, such as ozone, carbon dioxide, or water vapor. The lidar transmits two wavelengths: an "on-line" wavelength that is absorbed by the gas of interest and an "off-line" wavelength that is not. The differential absorption between the two wavelengths is a measure of the concentration of the gas as a function of range. DIAL lidars are essentially dual-wavelength backscatter lidars.[citation needed]
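The DIAL retrieval reduces to comparing how the on-line and off-line returns decay with range. The sketch below is an idealized illustration of the standard DIAL ratio (it ignores scattering corrections, and the absorption cross-section and signal values are placeholders, not measured data):

    import math

    def dial_concentration(p_on_r1, p_on_r2, p_off_r1, p_off_r2,
                           delta_sigma, delta_r):
        """Mean gas number density (molecules/m^3) in the range cell
        [r1, r2] from on-line and off-line returns:
        N = ln[(P_off(r2)/P_off(r1)) / (P_on(r2)/P_on(r1))] / (2*dsigma*dr)
        """
        ratio = (p_off_r2 / p_off_r1) / (p_on_r2 / p_on_r1)
        return math.log(ratio) / (2.0 * delta_sigma * delta_r)

    # Placeholder numbers: a 150 m range cell and an assumed differential
    # absorption cross-section of 5e-27 m^2.
    print(dial_concentration(1.0, 0.70, 1.0, 0.80, 5e-27, 150.0))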
Doppler lidar and Rayleigh Doppler lidar are used to measure temperature and/or wind speed along the beam by measuring the frequency of the backscattered light. The Doppler broadening of gases in motion allows the determination of properties via the resulting frequency shift.[24][25] Scanning lidars, such as NASA's HARLIE lidar, have been used to measure atmospheric wind velocity in a large three-dimensional cone.[26] ESA's wind mission ADM-Aeolus will be equipped with a Doppler lidar system in order to provide global measurements of vertical wind profiles.[27] A Doppler lidar system was used at the 2008 Summer Olympics to measure wind fields during the yacht competition.[28] Doppler lidar systems are also now beginning to be applied successfully in the renewable energy sector to acquire wind speed, turbulence, wind veer, and wind shear data. Both pulsed[29] and continuous-wave systems[30] are being used. Pulsed systems use signal timing to obtain vertical distance resolution, whereas continuous-wave systems rely on detector focusing.
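A Doppler lidar recovers the wind component along the beam from the frequency shift of the backscattered light. The following minimal sketch shows the conversion; the shift and wavelength values are illustrative only:

    C = 299_792_458.0  # speed of light, m/s

    def radial_wind_speed(freq_shift_hz, wavelength_m):
        """Line-of-sight velocity from the Doppler shift of backscattered
        light: v = shift * wavelength / 2. The factor of 2 accounts for
        the round trip to the moving scatterers and back."""
        return freq_shift_hz * wavelength_m / 2.0

    # A 10 MHz shift at a 1550 nm operating wavelength corresponds to
    # a ~7.75 m/s line-of-sight wind component.
    print(radial_wind_speed(10e6, 1550e-9))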
Synthetic array lidar allows imaging lidar without the need for an array detector. It can be used for imaging Doppler velocimetry, ultra-fast frame rate (MHz) imaging, and speckle reduction in coherent lidar.[12] An extensive lidar bibliography for atmospheric and hydrospheric applications is given by Grant.[31]
Law enforcement
LIDAR speed guns are used by police to measure the speed of vehicles for speed limit enforcement purposes.[citation needed]
Military
Few military applications are known to be in place, and those are classified, but a considerable amount of research is underway on the use of lidar for imaging. Higher-resolution systems collect enough detail to identify targets such as tanks. Examples of military applications of lidar include the Airborne Laser Mine Detection System (ALMDS) for counter-mine warfare by Areté Associates.[32]
A NATO report (RTO-TR-SET-098) evaluated potential technologies for stand-off detection and discrimination of biological warfare agents. The technologies evaluated were Long-Wave Infrared (LWIR), Differential Scattering (DISC), and Ultraviolet Laser-Induced Fluorescence (UV-LIF). The report concluded: "Based upon the results of the lidar systems tested and discussed above, the Task Group recommends that the best option for the near-term (2008–2010) application of stand-off detection systems is UV LIF."[33] However, in the long term, other techniques such as stand-off Raman spectroscopy may prove useful for identification of biological warfare agents.
Short-range compact spectrometric lidar based on Laser-Induced Fluorescence (LIF) would address the presence of bio-threats in aerosol form over critical indoor, semi-enclosed, and outdoor venues such as stadiums, subways, and airports. This near-real-time capability would enable rapid detection of a bioaerosol release and allow for timely implementation of measures to protect occupants and minimize the extent of contamination.[34]
The Long-Range Biological Standoff Detection System (LR-BSDS) was developed for the US Army to provide the earliest possible standoff warning of a biological attack. It is an airborne system carried by a helicopter to detect man-made aerosol clouds containing biological and chemical agents at long range. The LR-BSDS, with a detection range of 30 km or more, was fielded in June 1997.[35]
Five lidar units produced by the German company Sick AG were used for short-range detection on Stanley, the autonomous car that won the 2005 DARPA Grand Challenge.
A robotic Boeing AH-6 performed a fully autonomous flight in June 2010, including avoiding obstacles using lidar.[36][37]
Mining
Lidar is used in the mining industry for various tasks. The calculation of ore volumes is accomplished by periodic (e.g., monthly) scanning in areas of ore removal and comparing the surface data to the previous scan (see the sketch below).[citation needed]
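The volume calculation described here amounts to differencing two gridded surface models of the same area. A minimal sketch, assuming both scans have already been gridded to a common DEM raster (the arrays and cell size below are illustrative):

    import numpy as np

    def removed_volume(dem_before, dem_after, cell_size_m):
        """Volume of material removed between two co-registered DEMs:
        sum of per-cell elevation drops times the cell footprint area."""
        drop = np.clip(dem_before - dem_after, 0.0, None)  # ignore fill
        return float(drop.sum() * cell_size_m ** 2)

    # Illustrative 2x2 grids with 5 m cells; 1 m removed from two cells.
    before = np.array([[10.0, 10.0], [10.0, 10.0]])
    after = np.array([[9.0, 10.0], [9.0, 10.0]])
    print(removed_volume(before, after, 5.0))  # 2 cells * 1 m * 25 m^2 = 50 m^3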
Physics and astronomy
A worldwide network of observatories uses lidar to measure the distance to reflectors placed on the Moon, allowing the Moon's position to be measured with millimetre precision and tests of general relativity to be performed. MOLA, the Mars Orbiting Laser Altimeter, used a lidar instrument on a Mars-orbiting satellite (NASA's Mars Global Surveyor) to produce a spectacularly precise global topographic survey of the red planet.
In September 2008, NASA's Phoenix lander used lidar to detect snow in the atmosphere of Mars.[38]
In atmospheric physics, lidar is used as a remote detection instrument to measure densities of certain constituents of the middle and upper atmosphere, such as potassium, sodium, or molecular nitrogen and oxygen. These measurements can be used to calculate temperatures. Lidar can also be used to measure wind speed and to provide information about the vertical distribution of aerosol particles.[citation needed]
At the JET nuclear fusion research facility in the UK, near Abingdon, Oxfordshire, lidar Thomson scattering is used to determine electron density and temperature profiles of the plasma.[39]
Robotics
Lidar technology is being used in robotics for the perception of the environment as well as object classification.[40] The ability of lidar technology to provide three-dimensional elevation maps of the terrain, high-precision distance to the ground, and approach velocity can enable safe landing of robotic and manned vehicles with a high degree of precision.[41]
Refer to the Military section above for further examples.
Spaceflight
Lidar is increasingly being used for rangefinding and orbital element calculation of relative velocity in proximity operations and stationkeeping of spacecraft. Lidar has also been used for atmospheric studies from space. Using short pulses of laser light beamed from a spacecraft, some of that "light reflects off of tiny particles in the atmosphere and back to a telescope aligned with the laser. By precisely timing the lidar 'echo,' and by measuring how much laser light is received by the telescope, scientists can accurately determine the location, distribution and nature of the particles. The result is a revolutionary new tool for studying constituents in the atmosphere, from cloud droplets to industrial pollutants, that are difficult to detect by other means."[42][43]
Surveying
Image: This TomTom mapping van is fitted with five lidars on its roof rack.
Airborne lidar sensors are used by companies in the remote sensing field to create DTMs (Digital Terrain Models) and DEMs (Digital Elevation Models); this is quite a common practice for larger areas, as a plane can capture a 1 km wide swath in one flyover. Greater vertical accuracy, below 50 mm, can be achieved with a lower flyover and a narrower 200 m swath, even in forest, where lidar gives the height of the canopy as well as the ground elevation. A reference point is needed to link the data to the WGS (World Geodetic System).[citation needed] In fact, lidar works much like ordinary radar, except that these systems send out narrow pulses or beams of light rather than broad radio waves.
Transportation
Lidar has been used in adaptive cruise control (ACC) systems for automobiles. Systems such as those by Siemens and Hella use a lidar device mounted on the front of the vehicle, such as on the bumper, to monitor the distance between the vehicle and any vehicle in front of it.[44] In the event the vehicle in front slows down or is too close, the ACC applies the brakes to slow the vehicle. When the road ahead is clear, the ACC allows the vehicle to accelerate to a speed preset by the driver. Refer to the Military section above for further examples.
Wind farm optimization
Lidar can be used to increase the energy output from wind farms by accurately measuring wind speeds and wind turbulence. An experimental lidar has been mounted on a wind turbine rotor to measure oncoming horizontal winds and proactively adjust the blades to protect components and increase power.[45]
Solar photovoltaic deployment optimization
Lidar can also be used to assist planners and developers in optimizing solar photovoltaic systems at the city level by determining appropriate rooftops[46] and shading losses.[47]
Other uses
The video for the song "House of Cards" by Radiohead was believed to be the first use of real-time 3D laser scanning to record a music video. The range data in the video is not entirely from a lidar, as structured-light scanning was also used.[48][49]
Alternative technologies
Recent development of structure-from-motion (SFM) technologies allows the delivery of 3D images and maps based on data extracted from visible and IR photography. The elevation or 3D data are extracted using multiple parallel passes over the mapped area, yielding both a visible-light image and 3D structure from the same sensor, which is often a specially chosen and calibrated digital camera.
References
[2] James Ring, "The Laser in Astronomy", pp. 672–673, New Scientist, 20 June 1963.
[3] "New Artillery Against Smog: TV and Lidar", Popular Mechanics, April 1970, p. 104.
[4] NOAA, http://www.ngs.noaa.gov/RESEARCH/RSD/main/lidar/lidar.shtml
[5] LIDAR patent on file, http://www.google.com/patents/US20090273770
[6] Google Books search for "lidar", sorted by date of publication, http://books.google.com/
[7] USGS Center for LIDAR Information Coordination and Knowledge, http://lidar.cr.usgs.gov/
[8] New York Times, search for "lidar", http://query.nytimes.com/search/sitesearch/#/lidar
[9] "Experimental Advanced Research Lidar", NASA (http://inst.wff.nasa.gov/eaarl/). Retrieved 8 August 2007.
[12] Strauss, C. E. M., "Synthetic-array heterodyne detection: a single-element detector acts as an array" (http://www.opticsinfobase.org/ol/abstract.cfm?id=12612), Opt. Lett. 19, 1609–1611 (1994).
[19] http://www.lidarbasemaps.org
[20] Councillor Quarterly, Summer 2007, Volume 6, Issue 3.
[21] Tom Paulson, "LIDAR shows where earthquake risks are highest", Seattle Post-Intelligencer, 18 April 2001 (http://www.seattlepi.com/local/19144_quake18.shtml).
[22] "Mount Saint Helens LIDAR Data", Washington State Geospatial Data Archive, 13 September 2006 (http://wagda.lib.washington.edu/data/type/elevation/lidar/st_helens/). Retrieved 8 August 2007.
[23] "Airborne Topographic Mapper", NASA (http://atm.wff.nasa.gov/). Retrieved 8 August 2007.
[24] http://superlidar.colorado.edu/Classes/Lidar2011/LidarLecture14.pdf
[26] Thomas D. Wilkerson, Geary K. Schwemmer, and Bruce M. Gentry, "LIDAR Profiling of Aerosols, Clouds, and Winds by Doppler and Non-Doppler Methods", NASA International H2O Project (2002) (http://harlie.gsfc.nasa.gov/IHOP2002/Pub&Pats/AMOS 2002 final.pdf).
[27] "Earth Explorers: ADM-Aeolus", European Space Agency, 6 June 2007 (http://www.esa.int/esaLP/ESAES62VMOC_LPadmaeolus_0.html). Retrieved 8 August 2007.
[28] "Doppler lidar gives Olympic sailors the edge", Optics.org, 3 July 2008 (http://optics.org/cws/article/research/34878). Retrieved 8 July 2008.
[29] http://www.lidarwindtechnologies.com/
[30] http://www.naturalpower.com/zephir
[31] Grant, W. B., "Lidar for atmospheric and hydrospheric studies", in Tunable Laser Applications, 1st Edition, Duarte, F. J., Ed. (Marcel Dekker, New York, 1995), Chapter 7.
[32] http://www.arete.com/index.php?view=stil_mcm
[35] http://articles.janes.com/articles/Janes-Nuclear,-Biological-and-Chemical-Defence/LR-BSDS--Long-Range-Biological-Standoff-Detection-System-United-States.html
[36] Spice, Byron, "Researchers Help Develop Full-Size Autonomous Helicopter" (http://www.cmu.edu/news/blog/2010/Summer/unprecedented-robochopper.shtml), Carnegie Mellon, 6 July 2010. Retrieved 19 July 2010.
[37] Koski, Olivia, "In a First, Full-Sized Robo-Copter Flies With No Human Help" (http://www.wired.com/dangerroom/2010/07/in-a-first-full-sized-robo-copter-flies-with-no-human-help/), Wired, 14 July 2010. Retrieved 19 July 2010.
[38] NASA, "NASA Mars Lander Sees Falling Snow, Soil Data Suggest Liquid Past", 29 September 2008 (http://www.nasa.gov/mission_pages/phoenix/news/phoenix-20080929.html). Retrieved 9 November 2008.
[39] CW Gowers, "Focus On: Lidar-Thomson Scattering Diagnostic on JET", JET.EFDA.org (http://www.jet.efda.org/pages/focus/lidar/index.html). Retrieved 8 August 2007.
[45] Mikkelsen, Torben; Hansen, Kasper Hjorth; et al., "Lidar wind speed measurements from a rotating spinner" (http://orbit.dtu.dk/getResource?recordId=259451&objectId=1&versionId=1), Danish Research Database & Danish Technical University, 20 April 2010. Retrieved 25 April 2010.
[46] Ha T. Nguyen, Joshua M. Pearce, Rob Harrap, and Gerald Barber, "The Application of LiDAR to Assessment of Rooftop Solar Photovoltaic Deployment Potential on a Municipal District Unit" (http://www.mdpi.com/1424-8220/12/4/4534/pdf), Sensors, 12, pp. 4534–4558 (2012).
[49] Retrieved 2 May 2011 (http://www.velodyne.com/lidar/lidar.aspx)
Wang, J., Zhang, J., Roncat, A., Kuenzer, C., Wagner, W., 2009, "Regularizing method for the determination of the backscatter cross section in lidar data", J. Opt. Soc. Am. A, Vol. 26, No. 5, May 2009, pp. 1071–1079.
External links
• The USGS Center for LIDAR Information Coordination and Knowledge (CLICK) (http://lidar.cr.usgs.gov/) – a website intended to "facilitate data access, user coordination and education of lidar remote sensing for scientific needs."
• How Lidar Works (http://airborne1.com/how_lidar.html)
• LiDAR Research Group (LRG), University of Heidelberg (http://lrg.uni-hd.de)
• Forecast 3D Lidar Scanner manufactured by Autonomous Solutions, Inc. (http://autonomoussolutions.com/forecast-3d-laser-system/): 3D point cloud and obstacle detection
• Free online lidar data viewer (http://www.lidarview.com/)
Remote sensing
Image: Synthetic aperture radar image of Death Valley colored using polarimetry.
Remote sensing is the acquisition of information about an object or phenomenon without making physical contact with the object. In modern usage, the term generally refers to the use of aerial sensor technologies to detect and classify objects on Earth (both on the surface and in the atmosphere and oceans) by means of propagated signals (e.g., electromagnetic radiation emitted from aircraft or satellites).[1][2]
Overview
Video: How Landsat was used to identify areas of conservation in the Democratic Republic of the Congo, and how it was used to help map an area called MLW in the north.
There are two main types of remote sensing: passive remote sensing and active remote sensing.[3] Passive sensors detect natural radiation that is emitted or reflected by the object or surrounding areas. Reflected sunlight is the most common source of radiation measured by passive sensors. Examples of passive remote sensors include film photography, infrared sensors, charge-coupled devices, and radiometers. Active collection, on the other hand, emits energy in order to scan objects and areas, whereupon a sensor detects and measures the radiation that is reflected or backscattered from the target. RADAR and LiDAR are examples of active remote sensing, where the time delay between emission and return is measured to establish the location, speed, and direction of an object.
Remote sensing makes it possible to collect data on dangerous or inaccessible areas. Remote sensing applications
include monitoring deforestation in areas such as the Amazon Basin, glacial features in Arctic and Antarctic regions,
and depth sounding of coastal and ocean depths. Military collection during the Cold War made use of stand-off
collection of data about dangerous border areas. Remote sensing also replaces costly and slow data collection on the
ground, ensuring in the process that areas or objects are not disturbed.
Orbital platforms collect and transmit data from different parts of the electromagnetic spectrum, which, in conjunction with larger-scale aerial or ground-based sensing and analysis, provides researchers with enough information to monitor trends such as El Niño and other natural long- and short-term phenomena. Other uses include different areas of the earth sciences such as natural resource management, agricultural fields such as land usage and conservation, and national security and overhead, ground-based and stand-off collection on border areas.[4]
Data acquisition techniques
The basis for multispectral collection and analysis is that examined areas or objects reflect or emit radiation that stands out from surrounding areas.
Applications of remote sensing data
• Conventional radar is mostly associated with aerial traffic control, early warning, and certain large-scale meteorological data. Doppler radar is used by local law enforcement for monitoring speed limits and in enhanced meteorological collection such as wind speed and direction within weather systems. Other types of active collection include plasmas in the ionosphere. Interferometric synthetic aperture radar is used to produce precise digital elevation models of large-scale terrain (see RADARSAT, TerraSAR-X, Magellan).
• Laser and radar altimeters on satellites have provided a wide range of data. By measuring the bulges of water caused by gravity, they map features on the seafloor to a resolution of a mile or so. By measuring the height and wavelength of ocean waves, the altimeters measure wind speeds and direction, and surface ocean currents and directions.
• Light detection and ranging (LIDAR) is well known from examples such as weapon ranging and laser-illuminated homing of projectiles. LIDAR is used to detect and measure the concentration of various chemicals in the atmosphere, while airborne LIDAR can be used to measure the heights of objects and features on the ground more accurately than radar technology. Vegetation remote sensing is a principal application of LIDAR.
• Radiometers and photometers are the most common instruments in use, collecting reflected and emitted radiation in a wide range of frequencies. The most common are visible and infrared sensors, followed by microwave, gamma-ray, and, rarely, ultraviolet sensors. They may also be used to detect the emission spectra of various chemicals, providing data on chemical concentrations in the atmosphere.
• Stereographic pairs of aerial photographs have often been used to make topographic maps by imagery and terrain analysts in trafficability and highway departments for potential routes.
• Simultaneous multispectral platforms such as Landsat have been in use since the 1970s. These thematic mappers take images in multiple wavelengths of electromagnetic radiation (multispectral) and are usually found on Earth observation satellites, including (for example) the Landsat program and the IKONOS satellite. Maps of land cover and land use from thematic mapping can be used to prospect for minerals, detect or monitor land usage and deforestation, and examine the health of indigenous plants and crops, including entire farming regions or forests.
• Hyperspectral imaging produces an image where each pixel has full spectral information, imaging narrow spectral bands over a contiguous spectral range. Hyperspectral imagers are used in various applications including mineralogy, biology, defence, and environmental measurements.
• Within the scope of combating desertification, remote sensing allows researchers to follow up on and monitor risk areas in the long term, to determine desertification factors, to support decision-makers in defining relevant measures of environmental management, and to assess their impacts.[5]
Geodetic
• Overhead geodetic collection was first used in aerial submarine detection, with gravitational data used in military maps. This data revealed minute perturbations in the Earth's gravitational field (geodesy) that may be used to determine changes in the mass distribution of the Earth, which in turn may be used for geological studies.
Acoustic and near-acoustic
• Sonar: passive sonar, listening for the sound made by another object (a vessel, a whale, etc.); active sonar, emitting pulses of sound and listening for echoes, used for detecting, ranging, and measuring underwater objects and terrain.
• Seismograms taken at different locations can locate and measure earthquakes (after they occur) by comparing the relative intensity and precise timing.
To coordinate a series of large-scale observations, most sensing systems depend on the following: platform location, the time of acquisition, and the rotation and orientation of the sensor. High-end instruments now often use positional information from satellite navigation systems. The rotation and orientation are often provided to within a degree or two by electronic compasses. Compasses can measure not just azimuth (i.e., degrees to magnetic north) but also altitude (degrees above the horizon), since the magnetic field curves into the Earth at different angles at different latitudes. More exact orientations require gyroscopically aided orientation, periodically realigned by different methods including navigation from stars or known benchmarks.
Data processing
Generally speaking, remote sensing works on the principle of the inverse problem. While the object or phenomenon of interest (the state) may not be directly measured, there exists some other variable that can be detected and measured (the observation), which may be related to the object of interest through the use of a data-derived computer model. The common analogy given to describe this is trying to determine the type of animal from its footprints. For example, while it is impossible to directly measure temperatures in the upper atmosphere, it is possible to measure the spectral emissions from a known chemical species (such as carbon dioxide) in that region. The frequency of the emissions may then be related to the temperature in that region via various thermodynamic relations.
The quality of remote sensing data consists of its spatial, spectral, radiometric and temporal resolutions.
Spatial resolution
The size of a pixel that is recorded in a raster image – typically pixels may correspond to square areas ranging in side length from 1 to 1,000 metres (3.3 to 3,300 ft).
Spectral resolution
The wavelength width of the different frequency bands recorded – usually, this is related to the number of frequency bands recorded by the platform. Current Landsat collection is that of seven bands, including several in the infrared spectrum, ranging in spectral resolution from 0.07 to 2.1 μm. The Hyperion sensor on Earth Observing-1 resolves 220 bands from 0.4 to 2.5 μm, with a spectral resolution of 0.010 to 0.011 μm per band.
Radiometric resolution
The number of different intensities of radiation the sensor is able to distinguish. Typically, this ranges from 8 to 14 bits, corresponding to 256 levels of the gray scale and up to 16,384 intensities or "shades" of colour in each band. It also depends on the instrument noise. (See the sketch after these definitions.)
Temporal resolution
The frequency of flyovers by the satellite or plane; relevant only in time-series studies or those requiring an averaged or mosaic image, as in deforestation monitoring. This was first used by the intelligence community, where repeated coverage revealed changes in infrastructure, the deployment of units, or the modification/introduction of equipment. Cloud cover over a given area or object makes it necessary to repeat the collection of that location.
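As a quick illustration of radiometric resolution, the number of distinguishable intensity levels grows as a power of two with bit depth; this tiny sketch (illustrative only) reproduces the figures quoted above:

    # An n-bit sensor can encode 2**n discrete intensity levels.
    for bits in (8, 10, 12, 14):
        print(f"{bits}-bit sensor: {2**bits} intensity levels")
    # 8-bit -> 256 levels; 14-bit -> 16384 levels, matching the text.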
In order to create sensor-based maps, most remote sensing systems expect to extrapolate sensor data in relation to a reference point, including distances between known points on the ground. This depends on the type of sensor used. For example, in conventional photographs, distances are accurate in the center of the image, with the distortion of measurements increasing the farther you get from the center. Another factor is that the platen against which the film is pressed can cause severe errors when photographs are used to measure ground distances. The step in which this problem is resolved is called georeferencing and involves computer-aided matching of points in the image (typically 30 or more points per image) against points from an established benchmark, "warping" the image to produce accurate spatial data. As of the early 1990s, most satellite images are sold fully georeferenced.
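One simple form of the "warping" step is a least-squares affine transform fitted to the matched control points. The sketch below is a simplification of real georeferencing workflows (which typically use higher-order or rubber-sheet models); the control-point values are invented for illustration.

    import numpy as np

    def fit_affine(pixel_pts, ground_pts):
        """Least-squares affine transform mapping image pixel coordinates
        to ground coordinates from matched control points. Solves
        [x_g, y_g] = A @ [x_p, y_p, 1] for the 2x3 matrix A."""
        pix = np.asarray(pixel_pts, dtype=float)
        gnd = np.asarray(ground_pts, dtype=float)
        design = np.hstack([pix, np.ones((len(pix), 1))])  # (n, 3)
        sol, *_ = np.linalg.lstsq(design, gnd, rcond=None)  # (3, 2)
        return sol.T                                        # (2, 3)

    # Three illustrative control points (pixel -> ground metres).
    pixels = [(0, 0), (1000, 0), (0, 1000)]
    ground = [(500000.0, 4000000.0), (500030.0, 4000001.0), (499999.0, 4000030.0)]
    A = fit_affine(pixels, ground)
    print(A @ np.array([500.0, 500.0, 1.0]))  # georeference an interior pixel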
In addition, images may need to be radiometrically and atmospherically corrected.
Radiometric correction
Gives a scale to the pixel values; e.g., a monochromatic scale of 0 to 255 will be converted to actual radiance values.
Topographic correction
In rugged mountains, the effective illumination received by each pixel varies considerably with the terrain. In a remote sensing image, a pixel on a shady slope receives weak illumination and has a low radiance value; in contrast, a pixel on a sunny slope receives strong illumination and has a high radiance value. For the same object, the pixel radiance value on a shady slope can be very different from that on a sunny slope, and different objects may have similar radiance values. These spectral changes seriously degrade the accuracy of information extraction from remote sensing images of mountainous areas and are a major obstacle to their further application. The purpose of topographic correction is to eliminate this effect and recover the true reflectivity or radiance of objects as they would appear under horizontal conditions. It is a prerequisite for quantitative remote sensing applications.
Atmospheric correction
Eliminates atmospheric haze by rescaling each frequency band so that its minimum value (usually realised in water bodies) corresponds to a pixel value of 0. Digitizing the data also makes it possible to manipulate the data by changing gray-scale values.
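The haze-removal step described above is often implemented as a simple dark-object subtraction; a minimal sketch, assuming the image is a NumPy array of shape bands x rows x cols (the sample values are invented):

    import numpy as np

    def dark_object_subtraction(image):
        """Rescale each band so its minimum value (the 'dark object',
        often open water) maps to 0, removing a uniform haze offset."""
        mins = image.min(axis=(1, 2), keepdims=True)  # per-band minima
        return image - mins

    # Illustrative 2-band, 2x2 image with haze offsets of 12 and 30.
    img = np.array([[[12, 50], [80, 12]],
                    [[30, 95], [30, 60]]], dtype=float)
    print(dark_object_subtraction(img))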
Interpretation is the critical process of making sense of the data. The first application was aerial photographic collection, which used the following process: spatial measurement through the use of a light table in both conventional single and stereographic coverage; added skills such as the use of photogrammetry; the use of photomosaics and repeat coverage; and making use of objects' known dimensions in order to detect modifications. Image analysis is the more recently developed, automated, computer-aided application that is in increasing use.
Object-Based Image Analysis (OBIA) is a sub-discipline of GIScience devoted to partitioning remote sensing (RS) imagery into meaningful image-objects and assessing their characteristics through spatial, spectral, and temporal scale.
Old data from remote sensing is often valuable because it may provide the only long-term record for a large extent of geography. At the same time, the data is often complex to interpret and bulky to store. Modern systems tend to store the data digitally, often with lossless compression. The difficulty with this approach is that the data is fragile, the format may be archaic, and the data may be easy to falsify. One of the best systems for archiving data series is as computer-generated machine-readable ultrafiche, usually in typefonts such as OCR-B, or as digitized half-tone images. Ultrafiches survive well in standard libraries, with lifetimes of several centuries. They can be created, copied, filed, and retrieved by automated systems. They are about as compact as archival magnetic media, and yet can be read by human beings with minimal, standardized equipment.
Data processing levels
To facilitate the discussion of data processing in practice, several processing "levels" were first defined in 1986 by NASA as part of its Earth Observing System[6] and have been steadily adopted since then, both internally at NASA (e.g.,[7]) and elsewhere (e.g.,[8]); these definitions are:
Level 0: Reconstructed, unprocessed instrument and payload data at full resolution, with any and all communications artifacts (e.g., synchronization frames, communications headers, duplicate data) removed.
Level 1a: Reconstructed, unprocessed instrument data at full resolution, time-referenced, and annotated with ancillary information, including radiometric and geometric calibration coefficients and georeferencing parameters (e.g., platform ephemeris) computed and appended but not applied to the Level 0 data (or, if applied, in a manner such that Level 0 is fully recoverable from Level 1a data).
Level 1b: Level 1a data that have been processed to sensor units (e.g., radar backscatter cross section, brightness temperature, etc.); not all instruments have Level 1b data; Level 0 data is not recoverable from Level 1b data.
Level 2: Derived geophysical variables (e.g., ocean wave height, soil moisture, ice concentration) at the same resolution and location as the Level 1 source data.
Level 3: Variables mapped on uniform space-time grid scales, usually with some completeness and consistency (e.g., missing points interpolated, complete regions mosaicked together from multiple orbits, etc.).
Level 4: Model output or results from analyses of lower-level data (i.e., variables that were not measured by the instruments but are instead derived from those measurements).
A Level 1 data record is the most fundamental (i.e., highest reversible level) data record that has significant scientific utility, and is the foundation upon which all subsequent data sets are produced. Level 2 is the first level that is directly usable for most scientific applications; its value is much greater than that of the lower levels. Level 2 data sets tend to be less voluminous than Level 1 data because they have been reduced temporally, spatially, or spectrally. Level 3 data sets are generally smaller than lower-level data sets and thus can be dealt with without incurring a great deal of data-handling overhead. These data tend to be generally more useful for many applications. The regular spatial and temporal organization of Level 3 datasets makes it feasible to readily combine data from different sources.
History
Image: The TR-1 reconnaissance/surveillance aircraft.
Image: The 2001 Mars Odyssey used spectrometers and imagers to hunt for evidence of past or present water and volcanic activity on Mars.
The modern discipline of remote sensing arose with the development of flight. The balloonist G. Tournachon (alias Nadar) made photographs of Paris from his balloon in 1858. Messenger pigeons, kites, rockets, and unmanned balloons were also used for early images. With the exception of balloons, these first individual images were not particularly useful for map making or for scientific purposes.[citation needed]
Systematic aerial photography was developed for military surveillance and reconnaissance purposes beginning in World War I and reaching a climax during the Cold War with the use of modified combat aircraft such as the P-51, P-38, RB-66, and F-4C, or specifically designed collection platforms such as the U2/TR-1, SR-71, A-5, and OV-1 series, in both overhead and stand-off collection. A more recent development is that of increasingly smaller sensor pods, such as those used by law enforcement and the military, in both manned and unmanned platforms. The advantage of this approach is that it requires minimal modification to a given airframe. Later imaging technologies would include infrared, conventional, Doppler, and synthetic aperture radar.[citation needed]
The development of artificial satellites in the latter half of the 20th century allowed remote sensing to progress to a global scale by the end of the Cold War. Instrumentation aboard various Earth-observing and weather satellites such as Landsat, the Nimbus series, and more recent missions such as RADARSAT and UARS provided global measurements of various data for civil, research, and military purposes. Space probes to other planets have also provided the opportunity to conduct remote sensing studies in extraterrestrial environments: synthetic aperture radar aboard the Magellan spacecraft provided detailed topographic maps of Venus, while instruments aboard SOHO allowed studies to be performed on the Sun and the solar wind, to name just a few examples.[citation needed]
More recent developments began in the 1960s and 1970s with the advance of image processing of satellite imagery. Several research groups in Silicon Valley, including NASA Ames Research Center, GTE, and ESL Inc., developed Fourier transform techniques leading to the first notable enhancement of imagery data.[citation needed]
Training and education
Remote sensing has a growing relevance in the modern information society. It represents a key technology within the aerospace industry and bears increasing economic relevance – new sensors (e.g., TerraSAR-X and RapidEye) are developed constantly, and the demand for skilled labour is increasing steadily. Furthermore, remote sensing increasingly influences everyday life, ranging from weather forecasts to reports on climate change and natural disasters. As an example, 80% of German students use the services of Google Earth; in 2006 alone the software was downloaded 100 million times. But studies have shown that only a fraction of them know more about the data they are working with.[10] There exists a huge knowledge gap between the application and the understanding of satellite images. Remote sensing plays only a tangential role in schools, regardless of political claims to strengthen support for teaching the subject.[11] Much of the computer software explicitly developed for school lessons has not yet been implemented due to its complexity. Thereby, the subject is either not integrated into the curriculum at all or does not pass the step of interpretation of analogue images. In fact, the subject of remote sensing requires a consolidation of physics and mathematics, as well as competences in the fields of media and methods, beyond the mere visual interpretation of satellite images.
Many teachers have great interest in the subject of remote sensing and are motivated to integrate the topic into teaching, provided that the curriculum allows it. In many cases, this encouragement fails because of confusing information.[12] In order to integrate remote sensing in a sustainable manner, organizations such as the EGU and Digital Earth[13] encourage the development of learning modules and learning portals (e.g., FIS – Remote Sensing in School Lessons[14] or Landmap – Spatial Discovery[15]), promoting media and method qualifications as well as independent working.
Remote sensing internships
One effective way to teach students the many applications of remote sensing is through an internship opportunity. NASA DEVELOP is one such opportunity, where students work in teams with science advisors and/or partners to meet some practical need in the community. Working through NASA, this program gives students experience in real-world remote sensing applications, as well as valuable training. (More information can be found on the NASA DEVELOP website.[16])
Another such program is SERVIR. Supported by the US Agency for International Development (USAID) and NASA, SERVIR provides students with valuable hands-on experience with remote sensing, while providing end-users with the resources to respond better to a whole host of issues. (More information can be found on the SERVIR website.[17])
Remote sensing software
Remote sensing data is processed and analyzed with computer software known as a remote sensing application. A large number of proprietary and open-source applications exist to process remote sensing data. Remote sensing software packages include:
• TNTmips from MicroImages,
• PCI Geomatica made by PCI Geomatics, the leading remote sensing software package in Canada,
• IDRISI from Clark Labs,
• Image Analyst from Intergraph,
• RemoteView made by Overwatch Textron Systems,
• and Dragon/ips, one of the oldest remote sensing packages still available, and in some cases free.
Open-source remote sensing software includes:
• OSSIM,
• Opticks,
• Orfeo Toolbox,
• and others mixing remote sensing and GIS capabilities: GRASS GIS, ILWIS, QGIS, and TerraLook.
According to an NOAA-sponsored study by Global Marketing Insights, Inc., the most used applications among Asian academic groups involved in remote sensing are as follows: ERDAS 36% (ERDAS IMAGINE 25% and ERMapper 11%); ESRI 30%; ITT Visual Information Solutions ENVI 17%; MapInfo 17%. Among Western academic respondents: ESRI 39%, ERDAS IMAGINE 27%, MapInfo 9%, AutoDesk 7%, ITT Visual Information Solutions ENVI 17%.
References
[4] http://hurricanes.nasa.gov/earth-sun/technology/remote_sensing.html
[5] Begni Gérard, Escadafal Richard, Fontannaz Delphine, and Hong-Nga Nguyen Anne-Thérèse, 2005. Remote sensing: a tool to monitor and assess desertification. Les dossiers thématiques du CSFD. Issue 2. 44 pp. (http://www.csf-desertification.org/index.php/bibliotheque/publications-csfd/doc_details/28-begni-gerard-et-al-2005-remote-sensing)
[6] NASA (1986), Report of the EOS data panel, Earth Observing System, Data and Information System, Data Panel Report, Vol. IIa, NASA Technical Memorandum 87777, June 1986, 62 pp. Available at http://hdl.handle.net/2060/19860021622
[7] C. L. Parkinson, A. Ward, M. D. King (Eds.), Earth Science Reference Handbook – A Guide to NASA's Earth Science Program and Earth Observing Satellite Missions, National Aeronautics and Space Administration, Washington, D.C. Available at http://eospso.gsfc.nasa.gov/ftp_docs/2006ReferenceHandbook.pdf
[8] GRAS-SAF (2009), Product User Manual, GRAS Satellite Application Facility, Version 1.2.1, 31 March 2009. Available at http://www.grassaf.org/general-documents/products/grassaf_pum_v121.pdf
[10] Ditter, R., Haspel, M., Jahn, M., Kollar, I., Siegmund, A., Viehrig, K., Volz, D., Siegmund, A. (2012), "Geospatial technologies in school – theoretical concept and practical implementation in K-12 schools", International Journal of Data Mining, Modelling and Management (IJDMMM): FutureGIS: Riding the Wave of a Growing Geospatial Technology Literate Society, Vol. X.
[11] Stork, E.J., Sakamoto, S.O., and Cowan, R.M. (1999), "The integration of science explorations through the use of earth images in middle school curriculum", Proc. IEEE Trans. Geosci. Remote Sensing 37, 1801–1817.
[12] Bednarz, S.W. and Whisenant, S.E. (2000), "Mission geography: linking national geography standards, innovative technologies and NASA", Proc. IGARSS, Honolulu, USA, 2780–2782.
[13] http://www.digital-earth.eu/
[14] http://www.fis.uni-bonn.de/node/92
[15] http://www.landmap.ac.uk
[16] http://develop.larc.nasa.gov/
[17] http://www.nasa.gov/mission_pages/servir/index.html
Further reading
• Campbell, J. B. (2002). Introduction to Remote Sensing (3rd ed.). The Guilford Press. ISBN 1-57230-640-8.
• Jensen, J. R. (2007). Remote Sensing of the Environment: an Earth Resource Perspective (2nd ed.). Prentice Hall. ISBN 0-13-188950-8.
• Jensen, J. R. (2005). Digital Image Processing: a Remote Sensing Perspective (3rd ed.). Prentice Hall.
• Lentile, Leigh B.; Holden, Zachary A.; Smith, Alistair M. S.; Falkowski, Michael J.; Hudak, Andrew T.; Morgan, Penelope; Lewis, Sarah A.; Gessler, Paul E.; Benson, Nate C. "Remote sensing techniques to assess active fire characteristics and post-fire effects" (http://www.treesearch.fs.fed.us/pubs/24613). International Journal of Wildland Fire. 2006;3(15):319–345.
• Lillesand, T. M.; R. W. Kiefer; and J. W. Chipman (2003). Remote Sensing and Image Interpretation (5th ed.). Wiley. ISBN 0-471-15227-7.
• Richards, J. A.; and X. Jia (2006). Remote Sensing Digital Image Analysis: an Introduction (4th ed.). Springer. ISBN 3-540-25128-6.
• US Army FM series.
• US Army military intelligence museum, FT Huachuca, AZ.
• Datla, R.U.; Rice, J.P.; Lykke, K.R.; Johnson, B.C.; Butler, J.J.; Xiong, X. "Best practice guidelines for pre-launch characterization and calibration of instruments for passive optical remote sensing" (http://nvlpubs.nist.gov/nistpubs/jres/116/2/V116.N02.A05.pdf). Journal of Research of the National Institute of Standards and Technology. 2011 (March–April);116(2):612–646.
External links
• Remote Sensing (http://www.dmoz.org/Science/Earth_Sciences/Geomatics/Remote_Sensing/) at the Open Directory Project
• Free space images (mosaics) (http://www.terraexploro.com/terralibrary/index.php/space-images)
• International Journal of Advanced Remote Sensing and GIS (http://www.cloud-journals.com/journal-of-remote-sensing-n-gis-open-access.html)
Light
The Sun is Earth's primary source of light. About 44% of the sun's
electromagnetic radiation that reaches the ground is in the visible
light range.
Visible light (commonly referred to simply as light) is electromagnetic radiation that is visible to the human
eye, and is responsible for the sense of sight.[1] Visible light has a wavelength in the range of about 380
nanometres (nm), or 380×10⁻⁹ m, to about 740 nanometres, between the invisible infrared, with longer
wavelengths, and the invisible ultraviolet, with shorter wavelengths.
Primary properties of visible light are intensity,
propagation direction, frequency or wavelength
spectrum, and polarisation, while its speed in a
vacuum, 299,792,458 meters per second, is one of the
fundamental constants of nature. Visible light, as with
all types of electromagnetic radiation (EMR), is
experimentally found to always move at this speed in
vacuum.
In common with all types of EMR, visible light is emitted and absorbed in tiny "packets" called photons, and
exhibits properties of both waves and particles. This property is referred to as the wave–particle duality. The study of
light, known as optics, is an important research area in modern physics.
In physics, the term light sometimes refers to electromagnetic radiation of any wavelength, whether visible or
not.[2][3] This article focuses on visible light. See the electromagnetic radiation article for the general term.
Speed of visible light
The speed of light in a vacuum is defined to be exactly 299,792,458 m/s (approximately 186,282 miles per second).
The fixed value of the speed of light in SI units results from the fact that the metre is now defined in terms of the
speed of light. All forms of electromagnetic radiation are believed to move at exactly this same speed in vacuum.
Different physicists have attempted to measure the speed of light throughout history. Galileo attempted to measure
it in the seventeenth century. An early experiment to measure the speed of light was conducted by
Ole Rømer, a Danish physicist, in 1676. Using a telescope, Rømer observed the motions of Jupiter and one of its
moons, Io. Noting discrepancies in the apparent period of Io's orbit, he calculated that light takes about 22 minutes to
traverse the diameter of Earth's orbit.[4] However, the size of that orbit was not known at the time. If Rømer had known the
diameter of the Earth's orbit, he would have calculated a speed of 227,000,000 m/s.
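Rømer's implied figure can be replayed directly: 22 minutes for light to cross the diameter of Earth's orbit, about two astronomical units. A minimal sketch, assuming the modern value of the astronomical unit (not a figure from Rømer's time):

```python
# Romer's implied estimate: light crosses the diameter of Earth's
# orbit (~2 AU) in about 22 minutes.
AU = 1.496e11              # astronomical unit in metres (modern value, assumed)
crossing_time = 22 * 60    # 22 minutes, in seconds

speed = 2 * AU / crossing_time
print(f"{speed:.3e} m/s")  # ~2.27e8 m/s, matching the 227,000,000 m/s above
```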
Another, more accurate, measurement of the speed of light was performed in Europe by Hippolyte Fizeau in 1849.
Fizeau directed a beam of light at a mirror several kilometres away. A rotating cog wheel was placed in the path of
the light beam as it travelled from the source to the mirror and back to its origin. Fizeau found that at a
certain rate of rotation, the beam would pass through one gap in the wheel on the way out and the next gap on the
way back. Knowing the distance to the mirror, the number of teeth on the wheel, and the rate of rotation, Fizeau was
able to calculate the speed of light as 313,000,000 m/s.
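In Fizeau's geometry, the round trip takes exactly the time the wheel needs to advance from one gap to the next, so 2d/c = 1/(Nf) and c = 2dNf. The sketch below reproduces the arithmetic; the mirror distance, tooth count, and rotation rate are illustrative values close to Fizeau's reported setup, not figures taken from this text:

```python
# Fizeau's cog-wheel estimate of the speed of light (illustrative values).
# During the round trip 2*d, the wheel advances one full tooth-plus-gap
# period, i.e. 1/N of a revolution, so 2*d/c = 1/(N*f) and c = 2*d*N*f.

d = 8633.0   # distance from wheel to mirror, metres (assumed)
N = 720      # number of teeth on the wheel (assumed)
f = 25.2     # rotation rate, revolutions per second (assumed)

c_estimate = 2 * d * N * f
print(f"Estimated speed of light: {c_estimate:.3e} m/s")  # ~3.13e8 m/s
```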
Léon Foucault used an experiment which used rotating mirrors to obtain a value of 298,000,000 m/s in 1862. Albert
A. Michelson conducted experiments on the speed of light from 1877 until his death in 1931. He refined Foucault's
methods in 1926 using improved rotating mirrors to measure the time it took light to make a round trip from Mt.
Wilson to Mt. San Antonio in California. The precise measurements yielded a speed of 299,796,000 m/s.
The effective velocity of light in various transparent substances containing ordinary matter is less than in vacuum.
For example, the speed of light in water is about 3/4 of that in vacuum. However, the slowing process in matter is
thought to result not from actual slowing of particles of light, but rather from their absorption and re-emission by
charged particles in matter.
As an extreme example of the nature of light-slowing in matter, two independent teams of physicists were able to
bring light to a "complete standstill" by passing it through a Bose–Einstein condensate of the element rubidium, one
team at Harvard University and the Rowland Institute for Science in Cambridge, Massachusetts, and the other at the
Harvard–Smithsonian Center for Astrophysics, also in Cambridge.[5] However, the popular description of light being
"stopped" in these experiments refers only to light being stored in the excited states of atoms, then re-emitted at an
arbitrary later time, as stimulated by a second laser pulse. During the time it had "stopped" it had ceased to be light.
Electromagnetic spectrum and visible light
Electromagnetic spectrum with light highlighted
Generally, EM radiation, or EMR (the designation 'radiation' excludes static electric, magnetic, and near fields),
is classified by wavelength into radio, microwave, infrared, the visible region that we perceive as light, ultraviolet,
X-rays and gamma rays. The behaviour of EMR depends on its wavelength. Higher frequencies have shorter
wavelengths, and lower frequencies have longer wavelengths. When EMR interacts with single atoms and
molecules, its behaviour depends on the amount of energy per quantum it carries.
EMR in the visible light region consists of quanta (called photons) that are at the lower end of the energies
capable of causing electronic excitation within molecules, which can lead to changes in the bonding or chemistry of the
molecule. At the lower end of the visible light spectrum, EMR becomes invisible to humans (infrared) because its
photons no longer have enough individual energy to cause a lasting molecular change (a change in conformation) in
the visual molecule retinal in the human retina; it is that conformational change which triggers the sensation of vision.
There exist animals that are sensitive to various types of infrared, but not by means of quantum absorption. Infrared
sensing in snakes depends on a kind of natural thermal imaging, in which tiny packets of cellular water are raised in
temperature by the infrared radiation. EMR in this range causes molecular vibration and heating effects, and this is
how these animals detect it.
Above the range of visible light, ultraviolet light becomes invisible to humans, mostly because it is absorbed by the
tissues of the eye and in particular the lens. Furthermore, the rods and cones located at the back of the human eye
cannot detect the short ultraviolet wavelengths, and are in fact damaged by ultraviolet rays, a condition known as
snow blindness.[6] Many animals with eyes that do not require lenses (such as insects and shrimp) are able to
detect ultraviolet directly, by quantum photon-absorption mechanisms, in much the same chemical way that
humans detect visible light.
Optics
The study of light and the interaction of light and matter is termed optics. The observation and study of optical
phenomena such as rainbows and the aurora borealis offer many clues as to the nature of light.
Refraction
An example of refraction of light. The straw appears bent, because of
refraction of light as it enters liquid from air.
A cloud illuminated by sunlight
Refraction is the bending of light rays when passing
through a surface between one transparent material and
another. It is described by Snell's law:

n₁ sin θ₁ = n₂ sin θ₂,

where θ₁ is the angle between the ray and the surface
normal in the first medium, θ₂ is the angle between the
ray and the surface normal in the second medium, and
n₁ and n₂ are the indices of refraction, with n = 1 in a
vacuum and n > 1 in a transparent substance.
When a beam of light crosses the boundary between a
vacuum and another medium, or between two different
media, the wavelength of the light changes, but the
frequency remains constant. If the beam of light is not
orthogonal (or rather normal) to the boundary, the
change in wavelength results in a change in the
direction of the beam. This change of direction is
known as refraction.
The refractive quality of lenses is frequently used to
manipulate light in order to change the apparent size of
images. Magnifying glasses, spectacles, contact lenses,
microscopes and refracting telescopes are all examples
of this manipulation.
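To make Snell's law concrete, the minimal Python sketch below computes the refraction angle for light entering water from air; the function name and the index values are illustrative, not taken from the text:

```python
import math

def refraction_angle(theta1_deg, n1, n2):
    """Refraction angle in degrees from Snell's law,
    n1*sin(theta1) = n2*sin(theta2); None means total internal reflection."""
    s = n1 * math.sin(math.radians(theta1_deg)) / n2
    if abs(s) > 1.0:
        return None  # no transmitted ray
    return math.degrees(math.asin(s))

# Air (n ~ 1.00) into water (n ~ 1.33) at 30 degrees from the normal:
print(refraction_angle(30.0, 1.00, 1.33))  # ~22.1 degrees
```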
Light sources
There are many sources of light. The most common light sources are thermal: a body at a given temperature
emits a characteristic spectrum of black-body radiation. A simple thermal source is sunlight, the radiation
emitted by the photosphere of the Sun at around 6,000 K; it peaks in the visible region of the
electromagnetic spectrum when plotted in wavelength units,[7] and roughly 44% of the sunlight energy that
reaches the ground is visible.[8] Another example is
incandescent light bulbs, which emit only around 10% of their energy as visible light and the remainder as infrared.
A common thermal light source in history is the glowing solid particles in flames, but these also emit most of their
radiation in the infrared, and only a fraction in the visible spectrum. The peak of the blackbody spectrum is in the
deep infrared, at about 10 micrometre wavelength, for relatively cool objects like human beings. As the temperature
increases, the peak shifts to shorter wavelengths, producing first a red glow, then a white one, and finally a
blue-white colour as the peak moves out of the visible part of the spectrum and into the ultraviolet. These colours can
be seen when metal is heated to "red hot" or "white hot". Blue-white thermal emission is not often seen, except in
stars (the commonly seen pure-blue colour in a gas flame or a welder's torch is in fact due to molecular emission,
notably by CH radicals emitting a wavelength band around 425 nm, and is not seen in stars or pure thermal
radiation).
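The temperature dependence of this peak follows Wien's displacement law, λ_peak = b/T with b ≈ 2.898×10⁻³ m·K. A short sketch of the shift described above, with illustrative temperatures:

```python
# Wien's displacement law: peak wavelength of black-body emission.
WIEN_B = 2.898e-3  # Wien's displacement constant, m*K

def peak_wavelength_nm(temperature_k):
    """Peak emission wavelength in nanometres for a black body at T kelvin."""
    return WIEN_B / temperature_k * 1e9

print(peak_wavelength_nm(310))    # human body: ~9,350 nm (deep infrared)
print(peak_wavelength_nm(5800))   # Sun-like surface: ~500 nm (visible)
print(peak_wavelength_nm(25000))  # hot blue star: ~116 nm (ultraviolet)
```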
Atoms emit and absorb light at characteristic energies. This produces "emission lines" in the spectrum of each atom.
Emission can be spontaneous, as in light-emitting diodes, gas discharge lamps (such as neon lamps and neon signs,
mercury-vapor lamps, etc.), and flames (light from the hot gas itself; for example, sodium in a gas flame emits
characteristic yellow light). Emission can also be stimulated, as in a laser or a microwave maser.
Deceleration of a free charged particle, such as an electron, can produce visible radiation: cyclotron radiation,
synchrotron radiation, and bremsstrahlung radiation are all examples of this. Particles moving through a medium
faster than the speed of light in that medium can produce visible Cherenkov radiation.
Certain chemicals produce visible radiation by chemiluminescence. In living things, this process is called
bioluminescence. For example, fireflies produce light by this means, and boats moving through water can disturb
plankton which produce a glowing wake.
Certain substances produce light when they are illuminated by more energetic radiation, a process known as
fluorescence. Some substances emit light slowly after excitation by more energetic radiation. This is known as
phosphorescence.
Phosphorescent materials can also be excited by bombarding them with subatomic particles. Cathodoluminescence is
one example. This mechanism is used in cathode ray tube television sets and computer monitors.
A city illuminated by artificial lighting
Certain other mechanisms can produce light:
• Bioluminescence
• Cherenkov radiation
• Electroluminescence
• Scintillation
• Sonoluminescence
• Triboluminescence
When the concept of light is intended to include very-high-energy photons (gamma rays), additional
generation mechanisms include:
• Particle–antiparticle annihilation
• Radioactive decay
Units and measures
Light is measured with two main alternative sets of units: radiometry consists of measurements of light power at all
wavelengths, while photometry measures light with wavelength weighted with respect to a standardised model of
human brightness perception. Photometry is useful, for example, to quantify illumination (lighting) intended for
human use. The SI units for both systems are summarised in the following tables.
SI radiometry units

Quantity (symbol[9]) | Unit (symbol) | Dimension | Notes
Radiant energy (Qe[10]) | joule (J) | M⋅L²⋅T⁻² | energy
Radiant flux (Φe[10]) | watt (W) | M⋅L²⋅T⁻³ | radiant energy per unit time, also called radiant power
Spectral power (Φeλ[10][11]) | watt per metre (W⋅m⁻¹) | M⋅L⋅T⁻³ | radiant power per wavelength
Radiant intensity (Ie) | watt per steradian (W⋅sr⁻¹) | M⋅L²⋅T⁻³ | power per unit solid angle
Spectral intensity (Ieλ[11]) | watt per steradian per metre (W⋅sr⁻¹⋅m⁻¹) | M⋅L⋅T⁻³ | radiant intensity per wavelength
Radiance (Le) | watt per steradian per square metre (W⋅sr⁻¹⋅m⁻²) | M⋅T⁻³ | power per unit solid angle per unit projected source area; confusingly called "intensity" in some other fields of study
Spectral radiance (Leλ[11] or Leν[12]) | watt per steradian per metre³ (W⋅sr⁻¹⋅m⁻³) or watt per steradian per square metre per hertz (W⋅sr⁻¹⋅m⁻²⋅Hz⁻¹) | M⋅L⁻¹⋅T⁻³ or M⋅T⁻² | commonly measured in W⋅sr⁻¹⋅m⁻²⋅nm⁻¹ with surface area and either wavelength or frequency
Irradiance (Ee[10]) | watt per square metre (W⋅m⁻²) | M⋅T⁻³ | power incident on a surface, also called radiant flux density; sometimes confusingly called "intensity" as well
Spectral irradiance (Eeλ[11] or Eeν[12]) | watt per metre³ (W⋅m⁻³) or watt per square metre per hertz (W⋅m⁻²⋅Hz⁻¹) | M⋅L⁻¹⋅T⁻³ or M⋅T⁻² | commonly measured in W⋅m⁻²⋅nm⁻¹ or 10⁻²² W⋅m⁻²⋅Hz⁻¹, known as the solar flux unit[13]
Radiant exitance / radiant emittance (Me[10]) | watt per square metre (W⋅m⁻²) | M⋅T⁻³ | power emitted from a surface
Spectral radiant exitance / spectral radiant emittance (Meλ[11] or Meν[12]) | watt per metre³ (W⋅m⁻³) or watt per square metre per hertz (W⋅m⁻²⋅Hz⁻¹) | M⋅L⁻¹⋅T⁻³ or M⋅T⁻² | power emitted from a surface per wavelength or frequency
Radiosity (Je or Jeλ[11]) | watt per square metre (W⋅m⁻²) | M⋅T⁻³ | emitted plus reflected power leaving a surface
Radiant exposure (He) | joule per square metre (J⋅m⁻²) | M⋅T⁻² |
Radiant energy density (ωe) | joule per metre³ (J⋅m⁻³) | M⋅L⁻¹⋅T⁻² |
See also: SI · Radiometry · Photometry · (Compare)
SI photometry units

Quantity (symbol[14]) | Unit (symbol) | Dimension | Notes
Luminous energy (Qv[15]) | lumen second (lm⋅s) | T⋅J[16] | units are sometimes called talbots
Luminous flux (Φv[15]) | lumen (= cd⋅sr) (lm) | J[16] | also called luminous power
Luminous intensity (Iv) | candela (= lm/sr) (cd) | J[16] | an SI base unit; luminous flux per unit solid angle
Luminance (Lv) | candela per square metre (cd/m²) | L⁻²⋅J | units are sometimes called nits
Illuminance (Ev) | lux (= lm/m²) (lx) | L⁻²⋅J | used for light incident on a surface
Luminous emittance (Mv) | lux (= lm/m²) (lx) | L⁻²⋅J | used for light emitted from a surface
Luminous exposure (Hv) | lux second (lx⋅s) | L⁻²⋅T⋅J |
Luminous energy density (ωv) | lumen second per metre³ (lm⋅s⋅m⁻³) | L⁻³⋅T⋅J |
Luminous efficacy (η[15]) | lumen per watt (lm/W) | M⁻¹⋅L⁻²⋅T³⋅J | ratio of luminous flux to radiant flux
Luminous efficiency (V) | 1 (dimensionless) | | also called luminous coefficient
See also: SI · Photometry · Radiometry · (Compare)
The photometry units differ from most systems of physical units in that they take into account how the human
eye responds to light. The cone cells in the human eye are of three types which respond differently across the visible
spectrum, and the cumulative response peaks at a wavelength of around 555 nm. Therefore, two sources of light
which produce the same intensity (W/m²) of visible light do not necessarily appear equally bright. The photometry
units are designed to take this into account, and are therefore a better representation of how "bright" a light appears
than raw intensity. They relate to raw power by a quantity called luminous efficacy, and are used for purposes
such as determining how best to achieve sufficient illumination for various tasks in indoor and outdoor settings. The
illumination measured by a photocell sensor does not necessarily correspond to what is perceived by the human eye;
without filters, which may be costly, photocells and charge-coupled devices (CCDs) tend to respond to some
infrared, ultraviolet or both.
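For a monochromatic source, the two unit systems are linked through the luminous efficiency function V(λ): luminous flux = 683 lm/W × V(λ) × radiant flux. A minimal sketch of that conversion; the two hard-coded V(λ) samples stand in for the full CIE table and are purely illustrative:

```python
# Convert radiant flux to luminous flux for monochromatic light:
# Phi_v = 683 lm/W * V(lambda) * Phi_e, with V the photopic luminous
# efficiency function. Only two sample V values are included here;
# a real conversion would use the full CIE table.
V_SAMPLES = {555: 1.0, 650: 0.107}  # V(lambda) at 555 nm and 650 nm

def luminous_flux_lm(radiant_flux_w, wavelength_nm):
    return 683.0 * V_SAMPLES[wavelength_nm] * radiant_flux_w

print(luminous_flux_lm(0.001, 555))  # 1 mW green laser: ~0.68 lm
print(luminous_flux_lm(0.001, 650))  # 1 mW red laser:   ~0.07 lm
```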
Light pressure
Light exerts physical pressure on objects in its path, a phenomenon which can be deduced from Maxwell's equations,
but can be more easily explained by the particle nature of light: photons strike and transfer their momentum. The
force exerted on an absorbing surface is equal to the power of the light beam divided by c, the speed of light. Due to
the magnitude of c, the effect of light pressure is negligible for everyday objects. For example, a one-milliwatt laser
pointer exerts a force of about 3.3 piconewtons on the object being illuminated; thus, one could lift a U.S. penny with
laser pointers, but doing so would require about 30 billion 1-mW laser pointers.[17] However, in nanometre-scale
applications such as NEMS, the effect of light pressure is more significant, and exploiting light pressure to drive
NEMS mechanisms and to flip nanometre-scale physical switches in integrated circuits is an active area of
research.[18]
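As a quick check on the piconewton figure above, a sketch of the arithmetic (F = P/c for a fully absorbing target):

```python
# Radiation force on a fully absorbing surface: F = P / c.
c = 299_792_458.0        # speed of light, m/s
P = 1e-3                 # laser pointer power, W
force = P / c
print(f"{force:.2e} N")  # ~3.3e-12 N, i.e. about 3.3 piconewtons
```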
At larger scales, light pressure can cause asteroids to spin faster,[19] acting on their irregular shapes as on the vanes
of a windmill. The possibility of making solar sails that would accelerate spaceships in space is also under
investigation.[20][21]
Although the motion of the Crookes radiometer was originally attributed to light pressure, this interpretation is
incorrect; the characteristic Crookes rotation is the result of a partial vacuum.[22] This should not be confused with
the Nichols radiometer, in which the (slight) motion caused by torque (though not enough for full rotation against
friction) is directly caused by light pressure.[23]
Historical theories about light, in chronological order
Classical Greece and Hellenism
In the fifth century BC, Empedocles postulated that everything was composed of four elements: fire, air, earth and
water. He believed that Aphrodite made the human eye out of the four elements and that she lit the fire in the eye
which shone out from the eye, making sight possible. If this were true, then one could see during the night just as
well as during the day, so Empedocles postulated an interaction between rays from the eyes and rays from a source
such as the sun.
In about 300 BC, Euclid wrote Optica, in which he studied the properties of light. Euclid postulated that light
travelled in straight lines, and he described the laws of reflection and studied them mathematically. He questioned
whether sight is the result of a beam from the eye, asking how one sees the stars immediately upon opening one's
eyes at night. Of course, if the beam from the eye travels infinitely fast this is not a problem.
In 55 BC, Lucretius, a Roman who carried on the ideas of earlier Greek atomists, wrote:
"The light & heat of the sun; these are composed of minute atoms which, when they are shoved off, lose no time in
shooting right across the interspace of air in the direction imparted by the shove." – On the Nature of the Universe
Despite being similar to later particle theories, Lucretius's views were not generally accepted.
Ptolemy (c. 2nd century) wrote about the refraction of light in his book Optics.[24]
Classical India
In ancient India, the Hindu schools of Samkhya and Vaisheshika, from around the early centuries CE, developed
theories on light. According to the Samkhya school, light is one of the five fundamental "subtle" elements (tanmatra)
out of which emerge the gross elements. The atomicity of these elements is not specifically mentioned, and it appears
that they were actually taken to be continuous.
On the other hand, the Vaisheshika school gives an atomic theory of the physical world on the non-atomic ground of
ether, space and time. (See Indian atomism.) The basic atoms are those of earth (prthivi), water (pani), fire (agni),
and air (vayu). Light rays are taken to be a stream of high-velocity tejas (fire) atoms. The particles of light can
exhibit different characteristics depending on the speed and the arrangements of the tejas atoms.[citation needed] The
Vishnu Purana refers to sunlight as "the seven rays of the sun".[citation needed]
The Indian Buddhists, such as Dignāga in the 5th century and Dharmakirti in the 7th century, developed a type of
atomism that is a philosophy about reality being composed of atomic entities that are momentary flashes of light or
energy. They viewed light as being an atomic entity equivalent to energy.[citation needed]
Descartes
René Descartes (1596–1650) held that light was a mechanical property of the luminous body, rejecting the "forms"
of Ibn al-Haytham and Witelo as well as the "species" of Bacon, Grosseteste, and Kepler.[25] In 1637 he published a
theory of the refraction of light that assumed, incorrectly, that light travelled faster in a denser medium than in a less
dense medium. Descartes arrived at this conclusion by analogy with the behaviour of sound waves.[citation needed]
Although Descartes was incorrect about the relative speeds, he was correct in assuming that light behaved like a
wave and in concluding that refraction could be explained by the speed of light in different media.
Descartes was not the first to use the mechanical analogies, but because he clearly asserts that light is only a mechanical
property of the luminous body and the transmitting medium, Descartes' theory of light is regarded as the start of
modern physical optics.[25]
Particle theory
Pierre Gassendi.
Pierre Gassendi (1592–1655), an atomist, proposed a particle theory of light which was published posthumously in
the 1660s. Isaac Newton studied Gassendi's work at an early age, and preferred his view to Descartes' theory of the
plenum. He stated in his Hypothesis of Light of 1675 that light was composed of corpuscles (particles of matter)
which were emitted in all directions from a source. One of Newton's arguments against the wave nature of light was
that waves were known to bend around obstacles, while light travelled only in straight lines. He did, however,
explain the phenomenon of the diffraction of light (which had been observed by Francesco Grimaldi) by allowing
that a light particle could create a localised wave in the aether.
Newton's theory could be used to predict the reflection of light, but could only explain refraction by incorrectly
assuming that light accelerated upon entering a denser medium because the gravitational pull was greater. Newton
published the final version of his theory in his Opticks of 1704. His reputation helped the particle theory of light to
hold sway during the 18th century. The particle theory of light led Laplace to argue that a body could be so massive
that light could not escape from it; in other words, it would become what is now called a black hole. Laplace
withdrew his suggestion later, after a wave theory of light became firmly established as the model for light (as has
been explained, neither a particle nor a wave theory is fully correct). A translation of Newton's essay on light
appears in The Large Scale Structure of Space-Time, by Stephen Hawking and George F. R. Ellis.
Wave theory
To explain the origin of colours, Robert Hooke (1635–1703) developed a "pulse theory" and compared the spreading
of light to that of waves in water in his 1665 Micrographia ("Observation XI"). In 1672 Hooke suggested that light's
vibrations could be perpendicular to the direction of propagation. Christiaan Huygens (1629–1695) worked out a
mathematical wave theory of light in 1678, and published it in his Treatise on Light in 1690. He proposed that light
was emitted in all directions as a series of waves in a medium called the luminiferous ether. As waves are not
affected by gravity, it was assumed that they slowed down upon entering a denser medium.[26]
Thomas Young's sketch of the two-slit experiment
showing the diffraction of light. Young's experiments
supported the theory that light consists of waves.
The wave theory predicted that light waves could interfere with
each other like sound waves (as noted around 1800 by Thomas
Young), and that light could be polarised, if it were a transverse
wave. Young showed by means of a diffraction experiment that
light behaved as waves. He also proposed that different colours
were caused by different wavelengths of light, and explained
colour vision in terms of three-coloured receptors in the eye.
Another supporter of the wave theory was Leonhard Euler. He
argued in Nova theoria lucis et colorum (1746) that diffraction
could more easily be explained by a wave theory.
Later, Augustin-Jean Fresnel independently worked out his own wave theory of light, and presented it to the
Académie des Sciences in 1817. Siméon Denis Poisson added to Fresnel's mathematical work to produce a
convincing argument in favour of the wave theory, helping to overturn Newton's corpuscular theory. By the year
1821, Fresnel was able to show via mathematical methods that polarisation could be explained only by the wave
theory of light and only if light was entirely transverse, with no longitudinal vibration whatsoever.
The weakness of the wave theory was that light waves, like sound waves, would need a medium for transmission.
The existence of the hypothetical substance luminiferous aether proposed by Huygens in 1678 was cast into strong
doubt in the late nineteenth century by the Michelson–Morley experiment.
Newton's corpuscular theory implied that light would travel faster in a denser medium, while the wave theory of
Huygens and others implied the opposite. At that time, the speed of light could not be measured accurately enough to
decide which theory was correct. The first to make a sufficiently accurate measurement was Léon Foucault, in
1850.[27] His result supported the wave theory, and the classical particle theory was finally abandoned, only to partly
re-emerge in the 20th century.
Quantum theory
In 1900 Max Planck, attempting to explain black-body radiation, suggested that although light was a wave, these
waves could gain or lose energy only in finite amounts related to their frequency. Planck called these "lumps" of
light energy "quanta" (from a Latin word for "how much"). In 1905, Albert Einstein used the idea of light quanta to
explain the photoelectric effect, and suggested that these light quanta had a "real" existence. In 1923 Arthur Holly
Compton showed that the wavelength shift seen when low-intensity X-rays scattered from electrons (so-called
Compton scattering) could be explained by a particle theory of X-rays, but not a wave theory. In 1926 Gilbert N.
Lewis named these light quanta particles photons.
Eventually the modern theory of quantum mechanics came to picture light as (in some sense) both a particle and a
wave, and (in another sense) as a phenomenon which is neither a particle nor a wave (which actually are
macroscopic phenomena, such as baseballs or ocean waves). Instead, modern physics sees light as something that
can be described sometimes with mathematics appropriate to one type of macroscopic metaphor (particles), and
sometimes another macroscopic metaphor (water waves), but is actually something that cannot be fully imagined. As
in the case of radio waves and the X-rays involved in Compton scattering, physicists have noted that
electromagnetic radiation tends to behave more like a classical wave at lower frequencies, but more like a classical
particle at higher frequencies; it never completely loses all qualities of one or the other. Visible light, which
occupies a middle ground in frequency, can easily be shown in experiments to be describable using either a wave or
particle model, or sometimes both.
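Planck's relation makes the quantum picture quantitative: each photon carries energy E = hf = hc/λ. A small sketch, with wavelengths chosen for illustration from the visible range quoted earlier:

```python
# Photon energy from the Planck relation E = h*c / lambda.
H = 6.626e-34          # Planck constant, J*s
C = 299_792_458.0      # speed of light, m/s
EV = 1.602e-19         # joules per electronvolt

def photon_energy_ev(wavelength_nm):
    return H * C / (wavelength_nm * 1e-9) / EV

print(photon_energy_ev(555))  # green light: ~2.2 eV
print(photon_energy_ev(380))  # violet edge: ~3.3 eV
print(photon_energy_ev(740))  # red edge:    ~1.7 eV
```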
Electromagnetic theory as explanation for all types of visible light and all EM radiation
A linearly polarised light wave frozen in time and showing the two oscillating
components of light; an electric field and a magnetic field perpendicular to each other and
to the direction of motion (a transverse wave).
In 1845, Michael Faraday discovered that the plane of polarisation of linearly polarised light is rotated when the
light rays travel along the magnetic field direction in the presence of a transparent dielectric, an effect now known as
Faraday rotation. This was the first evidence that light was related to electromagnetism. In 1846 he speculated that
light might be some form of disturbance propagating along magnetic field lines. Faraday proposed in 1847 that light
was a high-frequency electromagnetic vibration, which could propagate even in the absence of a medium such as the
ether.
Faraday's work inspired James Clerk Maxwell to study electromagnetic radiation and light. Maxwell discovered that
self-propagating electromagnetic waves would travel through space at a constant speed, which happened to be equal
to the previously measured speed of light. From this, Maxwell concluded that light was a form of electromagnetic
radiation: he first stated this result in 1862 in On Physical Lines of Force. In 1873, he published A Treatise on
Electricity and Magnetism, which contained a full mathematical description of the behaviour of electric and
magnetic fields, still known as Maxwell's equations. Soon after, Heinrich Hertz confirmed Maxwell's theory
experimentally by generating and detecting radio waves in the laboratory, and demonstrating that these waves
behaved exactly like visible light, exhibiting properties such as reflection, refraction, diffraction, and interference.
Maxwell's theory and Hertz's experiments led directly to the development of modern radio, radar, television,
electromagnetic imaging, and wireless communications.
In the quantum theory, photons are seen as wave packets of the waves described in the classical theory of Maxwell.
The quantum theory was needed to explain effects even with visible light that Maxwell's classical theory could not
(such as spectral lines).
Notes
[1] CIE (1987). International Lighting Vocabulary (http://www.cie.co.at/publ/abst/17-4-89.html). Number 17.4. CIE, 4th edition. ISBN 978-3-900734-07-7. By the International Lighting Vocabulary, the definition of light is: “Any radiation capable of causing a visual sensation directly.”
[6] http://www.yorku.ca/eye/lambdas.htm
[7] http://thulescientific.com/LYNCH%20&%20Soffer%20OPN%201999.pdf
[9] Standards organizations recommend that radiometric quantities be denoted with a suffix "e" (for "energetic") to avoid confusion with photometric or photon quantities.
[10] Alternative symbols sometimes seen: W or E for radiant energy, P or F for radiant flux, I for irradiance, W for radiant emittance.
[11] Spectral quantities given per unit wavelength are denoted with suffix "λ" (Greek) to indicate a spectral concentration. Spectral functions of wavelength are indicated by "(λ)" in parentheses instead, for example in spectral transmittance, reflectance and responsivity.
[12] Spectral quantities given per unit frequency are denoted with suffix "ν" (Greek), not to be confused with the suffix "v" (for "visual") indicating a photometric quantity.
[13] NOAA / Space Weather Prediction Center (http://www.swpc.noaa.gov/forecast_verification/F10.html) includes a definition of the solar flux unit (SFU).
[14] Standards organizations recommend that photometric quantities be denoted with a suffix "v" (for "visual") to avoid confusion with radiometric or photon quantities.
[15] Alternative symbols sometimes seen: W for luminous energy, P or F for luminous flux, and ρ or K for luminous efficacy.
[16] "J" here is the symbol for the dimension of luminous intensity, not the symbol for the unit joules.
[18] See, for example, nano-opto-mechanical systems research at Yale University (http://www.eng.yale.edu/tanglab/research.htm).
[22] P. Lebedev, Untersuchungen über die Druckkräfte des Lichtes, Ann. Phys. 6, 433 (1901).
[25] Theories of Light, from Descartes to Newton, A. I. Sabra, CUP Archive, 1981, pg 48. ISBN 0-521-28436-8, ISBN 978-0-521-28436-3.
[26] Fokko Jan Dijksterhuis, Lenses and Waves: Christiaan Huygens and the Mathematical Science of Optics in the 17th Century (http://books.google.com/books?id=cPFevyomPUIC), Kluwer Academic Publishers, 2004. ISBN 1-4020-2697-8.
Radar
A long-range radar antenna, known as ALTAIR,
used to detect and track space objects in
conjunction with ABM testing at the Ronald
Reagan Test Site on Kwajalein Atoll.
Israeli military radar is typical of the type of radar
used for air traffic control. The antenna rotates at
a steady rate, sweeping the local airspace with a
narrow vertical fan-shaped beam, to detect
aircraft at all altitudes.
Radar is an object detection system which uses radio waves to
determine the range, altitude, direction, or speed of objects. It can be
used to detect aircraft, ships, spacecraft, guided missiles, motor
vehicles, weather formations, and terrain. The radar dish or antenna
transmits pulses of radio waves or microwaves which bounce off any
object in their path. The object returns a tiny part of the wave's energy
to a dish or antenna which is usually located at the same site as the
transmitter.
Radar was secretly developed by several nations before and during
World War II. The term RADAR was coined in 1940 by the United
States Navy as an acronym for RAdio Detection And Ranging.[1] The
term radar has since entered English and other languages as the
common noun radar, losing all capitalization.
The modern uses of radar are highly diverse, including air traffic
control, radar astronomy, air-defense systems, antimissile systems;
marine radars to locate landmarks and other ships; aircraft anticollision
systems; ocean surveillance systems; outer space surveillance and
rendezvous systems; meteorological precipitation monitoring; altimetry
and flight control systems; guided missile target locating systems; and
ground-penetrating radar for geological observations. High-tech radar
systems are associated with digital signal processing and are capable of
extracting useful information from very high noise levels.
Other systems similar to radar make use of other parts of the
electromagnetic spectrum. One example is "lidar", which uses visible
light from lasers rather than radio waves.
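Both radar and lidar recover range in the same way, from the round-trip time of the echo: R = c·t/2. A minimal sketch of that calculation; the pulse timing value is illustrative:

```python
# Round-trip (time-of-flight) ranging, common to radar and lidar:
# the pulse travels out and back, so range R = c * t / 2.
C = 299_792_458.0  # propagation speed in vacuum, m/s

def echo_range_m(round_trip_seconds):
    return C * round_trip_seconds / 2.0

print(echo_range_m(66.7e-6))  # ~10 km target: echo returns after ~66.7 us
```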
History
As early as 1886, German physicist Heinrich Hertz showed that radio
waves could be reflected from solid objects. In 1895, Alexander
Popov, a physics instructor at the Imperial Russian Navy school in
Kronstadt, developed an apparatus using a coherer tube for detecting
distant lightning strikes. The next year, he added a spark-gap
transmitter. In 1897, while testing this equipment for communicating
between two ships in the Baltic Sea, he took note of an interference beat caused by the passage of a third vessel. In
his report, Popov wrote that this phenomenon might be used for detecting objects, but he did nothing more with this
observation.[2]
The German inventor Christian Hülsmeyer was the first to use radio waves to detect "the presence of distant metallic
objects". In 1904 he demonstrated the feasibility of detecting a ship in dense fog, but not its distance from the
transmitter.[3] He obtained a patent[4] for his detection device in April 1904 and later a patent[5] for a related
amendment for estimating the distance to the ship. He also got a British patent on September 23, 1904[6] for a full
system that he called a telemobiloscope.
A Chain Home tower in Great
Baddow, United Kingdom
In August 1917 Nikola Tesla outlined a concept for primitive radar-like units.[7] He stated: "...by their [standing
electromagnetic waves] use we may produce at will, from a sending station, an electrical effect in any particular
region of the globe; [with which] we may determine the relative position or course of a moving object, such as a
vessel at sea, the distance traversed by the same, or its speed."
In 1922 A. Hoyt Taylor and Leo C. Young, researchers working with the U.S. Navy, had a transmitter and a receiver
on opposite sides of the Potomac River and discovered that a ship passing through the beam path caused the received
signal to fade in and out. Taylor submitted a report, suggesting that this might be used to detect the presence of ships
in low visibility, but the Navy did not immediately continue the work. Eight years later, Lawrence A. Hyland at the
Naval Research Laboratory observed similar fading effects from a passing aircraft; this led to a patent application[8]
as well as a proposal for serious work at the NRL (Taylor and Young were then at this laboratory) on radio-echo
signals from moving targets.[9]
Before the Second World War, researchers in France, Germany, Italy, Japan, the
Netherlands, the Soviet Union, the United Kingdom, and the United States, independently and in great secrecy,
developed technologies that led to the modern version of radar. Australia, Canada, New Zealand, and South Africa
followed prewar Great Britain, and Hungary had similar developments during the war.[10]
In 1934 the Frenchman Émile Girardeau stated he was building an obstacle-locating radio apparatus "conceived
according to the principles stated by Tesla" and obtained a patent for a working system,[11][12][13] a part of which
was installed on the Normandie liner in 1935.[14]
During the same time, the Soviet military engineer P. K. Oschepkov, in collaboration with the Leningrad
Electrophysical Institute, produced an experimental apparatus, RAPID, capable of detecting an aircraft within
3 km of a receiver.[15] The French and Soviet systems, however, had continuous-wave operation and could not give
the full performance that was ultimately at the center of modern radar.
Full radar evolved as a pulsed system, and the first such elementary apparatus was demonstrated in December 1934
by the American Robert M. Page, working at the Naval Research Laboratory.[16] The following year, the United
States Army successfully tested a primitive surface-to-surface radar to aim coastal battery searchlights at night.[17]
This was followed by a pulsed system demonstrated in May 1935 by Rudolf Kühnhold and the firm GEMA in
Germany and then one in June 1935 by an Air Ministry team led by Robert A. Watson Watt in Great Britain.
Development of radar greatly expanded on 1 September 1936 when Watson-Watt became Superintendent of a new
establishment under the British Air Ministry, Bawdsey Research Station located in Bawdsey Manor, near
Felixstowe, Suffolk. Work there resulted in the design and installation of aircraft detection and tracking stations
called Chain Home along the East and South coasts of England in time for the outbreak of World War II in 1939.
This system provided the vital advance information that helped the Royal Air Force win the Battle of Britain.
The British were the first to fully exploit radar as a defence against aircraft attack. This was spurred on by fears that
the Germans were developing death rays. The Air Ministry asked British scientists in 1934 to investigate the
possibility of propagating electromagnetic energy and the likely effect. Following a study, they concluded that a
death ray was impractical but that detection of aircraft appeared feasible. Robert Watson Watt's team demonstrated
to his superiors the capabilities of a working prototype and then patented the device.[13][18][19] It served as the basis
for the Chain Home network of radars to defend Great Britain, which detected approaching German aircraft in the
Battle of Britain in 1940.
Más contenido relacionado

Destacado

Chapter 1Into the Internet
Chapter 1Into the InternetChapter 1Into the Internet
Chapter 1Into the Internet
Patty Ramsey
 
Ch4(saving state with cookies and query strings)
Ch4(saving state with cookies and query strings)Ch4(saving state with cookies and query strings)
Ch4(saving state with cookies and query strings)
Chhom Karath
 
5 Accessing Information Resources
5 Accessing Information Resources5 Accessing Information Resources
5 Accessing Information Resources
Patty Ramsey
 
Appendex f
Appendex fAppendex f
Appendex f
swavicky
 
Appendex e
Appendex eAppendex e
Appendex e
swavicky
 
Drought: Looking Back and Planning Ahead, Todd Votteler
Drought: Looking Back and Planning Ahead, Todd VottelerDrought: Looking Back and Planning Ahead, Todd Votteler
Drought: Looking Back and Planning Ahead, Todd Votteler
TXGroundwaterSummit
 
Appendex g
Appendex gAppendex g
Appendex g
swavicky
 
Chapter 4 Form Factors & Power Supplies
Chapter 4 Form Factors & Power SuppliesChapter 4 Form Factors & Power Supplies
Chapter 4 Form Factors & Power Supplies
Patty Ramsey
 

Destacado (19)

LiDAR Aided Decision Making
LiDAR Aided Decision MakingLiDAR Aided Decision Making
LiDAR Aided Decision Making
 
Offshore pipelines
Offshore pipelinesOffshore pipelines
Offshore pipelines
 
Preparing LiDAR for Use in ArcGIS 10.1 with the Data Interoperability Extension
Preparing LiDAR for Use in ArcGIS 10.1 with the Data Interoperability ExtensionPreparing LiDAR for Use in ArcGIS 10.1 with the Data Interoperability Extension
Preparing LiDAR for Use in ArcGIS 10.1 with the Data Interoperability Extension
 
Chapter 1Into the Internet
Chapter 1Into the InternetChapter 1Into the Internet
Chapter 1Into the Internet
 
Ch4(saving state with cookies and query strings)
Ch4(saving state with cookies and query strings)Ch4(saving state with cookies and query strings)
Ch4(saving state with cookies and query strings)
 
Guidelines for Modelling Groundwater Surface Water Interaction in eWater Source
Guidelines for Modelling Groundwater Surface Water Interaction in eWater SourceGuidelines for Modelling Groundwater Surface Water Interaction in eWater Source
Guidelines for Modelling Groundwater Surface Water Interaction in eWater Source
 
Ch3(working with file)
Ch3(working with file)Ch3(working with file)
Ch3(working with file)
 
Survey Grade LiDAR Technologies for Transportation Engineering
Survey Grade LiDAR Technologies for Transportation EngineeringSurvey Grade LiDAR Technologies for Transportation Engineering
Survey Grade LiDAR Technologies for Transportation Engineering
 
5 Accessing Information Resources
5 Accessing Information Resources5 Accessing Information Resources
5 Accessing Information Resources
 
Ch5(ms access with php)
Ch5(ms access with php)Ch5(ms access with php)
Ch5(ms access with php)
 
Appendex f
Appendex fAppendex f
Appendex f
 
Appendex e
Appendex eAppendex e
Appendex e
 
Ch07
Ch07Ch07
Ch07
 
Introduction to PHP
Introduction to PHPIntroduction to PHP
Introduction to PHP
 
Application of dual output LiDAR scanning system for power transmission line ...
Application of dual output LiDAR scanning system for power transmission line ...Application of dual output LiDAR scanning system for power transmission line ...
Application of dual output LiDAR scanning system for power transmission line ...
 
Drought: Looking Back and Planning Ahead, Todd Votteler
Drought: Looking Back and Planning Ahead, Todd VottelerDrought: Looking Back and Planning Ahead, Todd Votteler
Drought: Looking Back and Planning Ahead, Todd Votteler
 
Unix Master
Unix MasterUnix Master
Unix Master
 
Appendex g
Appendex gAppendex g
Appendex g
 
Chapter 4 Form Factors & Power Supplies
Chapter 4 Form Factors & Power SuppliesChapter 4 Form Factors & Power Supplies
Chapter 4 Form Factors & Power Supplies
 

Más de Justin Farrow

Rankin LiDAR presentation
Rankin LiDAR presentationRankin LiDAR presentation
Rankin LiDAR presentation
Justin Farrow
 

Más de Justin Farrow (7)

Nomadic Theatre and Going Paperless in the Age of Climate Chaos
Nomadic Theatre and Going Paperless in the Age of Climate ChaosNomadic Theatre and Going Paperless in the Age of Climate Chaos
Nomadic Theatre and Going Paperless in the Age of Climate Chaos
 
PCGS Students Party for the Planet
PCGS Students Party for the PlanetPCGS Students Party for the Planet
PCGS Students Party for the Planet
 
PCGS Student Competes in World’s Largest Crowd Funding Event
PCGS Student Competes in World’s Largest Crowd Funding EventPCGS Student Competes in World’s Largest Crowd Funding Event
PCGS Student Competes in World’s Largest Crowd Funding Event
 
Rankin LiDAR presentation
Rankin LiDAR presentationRankin LiDAR presentation
Rankin LiDAR presentation
 
Trimble Pro XT User Guide
Trimble Pro XT User GuideTrimble Pro XT User Guide
Trimble Pro XT User Guide
 
Survey earth in a day 2.0
Survey earth in a day 2.0Survey earth in a day 2.0
Survey earth in a day 2.0
 
How good was he?
How good was he?How good was he?
How good was he?
 

Último

The basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxThe basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptx
heathfieldcps1
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdf
ciinovamais
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
QucHHunhnh
 
Making and Justifying Mathematical Decisions.pdf
Making and Justifying Mathematical Decisions.pdfMaking and Justifying Mathematical Decisions.pdf
Making and Justifying Mathematical Decisions.pdf
Chris Hunter
 
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in DelhiRussian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
kauryashika82
 
Beyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactBeyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global Impact
PECB
 

Último (20)

Asian American Pacific Islander Month DDSD 2024.pptx
Asian American Pacific Islander Month DDSD 2024.pptxAsian American Pacific Islander Month DDSD 2024.pptx
Asian American Pacific Islander Month DDSD 2024.pptx
 
The basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxThe basics of sentences session 2pptx copy.pptx
uncapitalized, the word "lidar" became capitalized as LIDAR in some publications beginning in the 1980s.[6] Today there is no consensus on capitalization, reflecting uncertainty about whether the term is an acronym and, if it is, whether it should be lowercased like "radar". Lidar is also variously spelled "LIDAR", "LiDAR", "LIDaR", or "Lidar", depending on the publication; the USGS uses both "LIDAR" and "lidar", sometimes in the same document,[7] and the New York Times uses both "lidar" and "Lidar".[8]
General description

Lidar uses ultraviolet, visible, or near-infrared light to image objects and can be used with a wide range of targets, including non-metallic objects, rocks, rain, chemical compounds, aerosols, clouds, and even single molecules.[1] A narrow laser beam can map physical features with very high resolution.

Lidar has been used extensively for atmospheric research and meteorology. Downward-looking lidar instruments fitted to aircraft and satellites are used for surveying and mapping, a recent example being the NASA Experimental Advanced Research Lidar.[9] In addition, lidar has been identified by NASA as a key technology for enabling autonomous, precise, safe landing of future robotic and crewed lunar landing vehicles.[10]

Wavelengths from about 10 micrometers down to the UV (ca. 250 nm) are used to suit the target. Typically, light is reflected via backscattering. Different types of scattering are used for different lidar applications; the most common are Rayleigh scattering, Mie scattering, Raman scattering, and fluorescence. Based on the kind of backscattering exploited, a lidar is accordingly called a Rayleigh lidar, Mie lidar, Raman lidar, Na/Fe/K fluorescence lidar, and so on.[1] Suitable combinations of wavelengths allow remote mapping of atmospheric contents by looking for wavelength-dependent changes in the intensity of the returned signal.
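At its core, a pulsed lidar converts the round-trip travel time of a laser pulse into a distance using the speed of light. A minimal sketch of that conversion (the function and variable names are illustrative, not from any particular lidar library):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_echo(round_trip_time_s: float) -> float:
    """Convert the round-trip time of a laser pulse into a one-way range.

    The pulse travels to the target and back, so the one-way
    distance is half the total path length.
    """
    return C * round_trip_time_s / 2.0

# Example: an echo received 6.67 microseconds after emission
# corresponds to a target roughly 1 km away.
print(f"{range_from_echo(6.67e-6):.1f} m")  # ~999.8 m
```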
Design

In general there are two kinds of lidar detection schemes: "incoherent" or direct energy detection (principally an amplitude measurement) and coherent detection (best for Doppler or phase-sensitive measurements). Coherent systems generally use optical heterodyne detection, which, being more sensitive than direct detection, allows them to operate at much lower power, at the expense of more complex transceiver requirements.

In both coherent and incoherent lidar, there are two types of pulse models: micropulse lidar systems and high-energy systems. Micropulse systems developed as a result of ever-increasing computer power combined with advances in laser technology. They use considerably less energy in the laser, typically on the order of one microjoule, and are often "eye-safe", meaning they can be used without safety precautions. High-power systems are common in atmospheric research, where they are widely used for measuring many atmospheric parameters: the height, layering and densities of clouds, cloud particle properties (extinction coefficient, backscatter coefficient, depolarization), temperature, pressure, wind, humidity, and trace gas concentrations (ozone, methane, nitrous oxide, etc.).[1]

There are several major components to a lidar system:
1. Laser — 600–1000 nm lasers are most common for non-scientific applications. They are inexpensive, but since their light can be focused and is easily absorbed by the eye, the maximum power is limited by the need to make them eye-safe, which is a requirement for most applications. A common alternative, 1550 nm lasers, are eye-safe at much higher power levels since this wavelength is not focused by the eye, but the detector technology is less advanced, so these wavelengths are generally used at longer ranges and lower accuracies. They are also used for military applications, as 1550 nm is not visible in night vision goggles, unlike the shorter 1000 nm infrared laser. Airborne topographic mapping lidars generally use 1064 nm diode-pumped YAG lasers, while bathymetric systems generally use 532 nm frequency-doubled diode-pumped YAG lasers, because 532 nm penetrates water with much less attenuation than 1064 nm. Laser settings include the laser repetition rate (which controls the data collection speed). Pulse length is generally an attribute of the laser cavity length, the number of passes required through the gain material (YAG, YLF, etc.), and Q-switch speed. Better target resolution is achieved with shorter pulses, provided the lidar receiver detectors and electronics have sufficient bandwidth.[1]
2. Scanner and optics — How fast images can be developed is also affected by the speed at which they are scanned. There are several options for scanning the azimuth and elevation, including dual oscillating plane mirrors, a combination with a polygon mirror, and a dual-axis scanner (see Laser scanning); a sketch converting scan angles and range into 3D coordinates follows this list. Optic choices affect the angular resolution and the range that can be detected. A hole mirror or a beam splitter are options for collecting the return signal.
3. Photodetector and receiver electronics — Two main photodetector technologies are used in lidars: solid-state photodetectors, such as silicon avalanche photodiodes, and photomultipliers. The sensitivity of the receiver is another parameter that has to be balanced in a lidar design.
4. Position and navigation systems — Lidar sensors mounted on mobile platforms such as airplanes or satellites require instrumentation to determine the absolute position and orientation of the sensor. Such devices generally include a Global Positioning System receiver and an Inertial Measurement Unit (IMU).
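As promised above, here is a minimal sketch of how a scanner's pointing angles combine with the measured range to form one point of a point cloud (the function name and the east/north/up axis convention are assumptions of this illustration, not a specific vendor's format):

```python
import math

def polar_to_xyz(range_m: float, azimuth_deg: float, elevation_deg: float):
    """Convert one lidar return to Cartesian coordinates.

    Assumes azimuth is measured clockwise from north (the y axis) in
    the horizontal plane and elevation upward from the horizontal;
    the sensor sits at the origin.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    horizontal = range_m * math.cos(el)   # projection onto the ground plane
    x = horizontal * math.sin(az)         # east
    y = horizontal * math.cos(az)         # north
    z = range_m * math.sin(el)            # up
    return x, y, z

# A return at 100 m range, 45 degrees azimuth, 10 degrees elevation:
print(polar_to_xyz(100.0, 45.0, 10.0))  # ~(69.6, 69.6, 17.4)
```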
3D imaging can be achieved using both scanning and non-scanning systems. "3D gated viewing laser radar" is a non-scanning laser ranging system that applies a pulsed laser and a fast gated camera. Imaging lidar can also be performed using arrays of high-speed detectors and modulation-sensitive detector arrays, typically built on single chips using CMOS and hybrid CMOS/CCD fabrication techniques. In these devices each pixel performs some local processing, such as demodulation or gating at high speed, down-converting the signals to video rate so that the array may be read like a camera. Using this technique many thousands of pixels/channels may be acquired simultaneously.[11] High-resolution 3D lidar cameras use homodyne detection with an electronic CCD or CMOS shutter.[] A coherent imaging lidar uses synthetic array heterodyne detection to enable a staring single-element receiver to act as though it were an imaging array.[12]

Applications

This lidar-equipped mobile robot uses its lidar to construct a map and avoid obstacles.

Other than those applications listed above, there are a wide variety of applications of lidar, as often mentioned in National LIDAR Dataset programs.
Agriculture

Agricultural Research Service scientists have developed a way to combine lidar with yield rates on agricultural fields. This technology helps farmers improve their yields by directing resources toward the high-yield sections of their land. Lidar can also be used to help farmers determine which areas of their fields to apply costly fertilizer to. Lidar can create a topographical map of the fields, revealing the slopes and sun exposure of the farmland. Researchers at the Agricultural Research Service blended this topographical information with the farmland's yield results from previous years and, from this information, categorized the land into high-, medium-, or low-yield zones.[13] This is valuable to farmers because it indicates where to apply expensive fertilizer to achieve the highest crop yield.

Archaeology

Lidar has many applications in the field of archaeology, including aiding in the planning of field campaigns, mapping features beneath forest canopy, and providing an overview of broad, continuous features that may be indistinguishable on the ground.[14] Lidar can also provide archaeologists with the ability to create high-resolution digital elevation models (DEMs) of archaeological sites that reveal micro-topography otherwise hidden by vegetation. LiDAR-derived products can be easily integrated into a Geographic Information System (GIS) for analysis and interpretation. For example, at Fort Beausejour - Fort Cumberland National Historic Site, Canada, previously undiscovered archaeological features below the forest canopy, related to the siege of the Fort in 1755, have been mapped. Features that could not be distinguished on the ground or through aerial photography were identified by overlaying hillshades of the DEM created with artificial illumination from various angles.

With lidar, the ability to produce high-resolution datasets quickly and relatively cheaply can be an advantage. Beyond efficiency, its ability to penetrate forest canopy has led to the discovery of features that were not distinguishable through traditional geospatial methods and are difficult to reach through field surveys, as in work at Caracol by Arlen Chase and his wife Diane Zaino Chase.[15] The intensity of the returned signal can be used to detect features buried under flat vegetated surfaces such as fields, especially when mapping using the infrared spectrum; the presence of these features affects plant growth and thus the amount of infrared light reflected back.[16] In 2012, lidar was used by a team attempting to find the legendary city of La Ciudad Blanca in the Honduran jungle. During a seven-day mapping period, they found evidence of extensive man-made structures that had eluded ground searches for hundreds of years.[17]
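The hillshading technique mentioned above is straightforward to compute from a DEM. A minimal NumPy sketch of a simplified variant of the standard hillshade formula (the finite-difference gradient and the toy input surface are assumptions of this illustration, not how any particular GIS package works):

```python
import numpy as np

def hillshade(dem: np.ndarray, azimuth_deg: float, altitude_deg: float,
              cell_size: float = 1.0) -> np.ndarray:
    """Shade a DEM as if lit from the given sun azimuth and altitude.

    Re-rendering the same DEM with several different azimuths is the
    trick used to make subtle archaeological earthworks stand out.
    """
    az = np.radians(360.0 - azimuth_deg + 90.0)  # compass -> math convention
    alt = np.radians(altitude_deg)
    dz_dy, dz_dx = np.gradient(dem, cell_size)
    slope = np.arctan(np.hypot(dz_dx, dz_dy))
    aspect = np.arctan2(dz_dy, -dz_dx)
    shaded = (np.sin(alt) * np.cos(slope) +
              np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0.0, 1.0)

# Illuminate a toy 100x100 elevation surface from four different angles:
dem = np.random.rand(100, 100).cumsum(axis=0)  # stand-in elevation data
views = [hillshade(dem, az, 45.0) for az in (315, 0, 45, 90)]
```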
Autonomous vehicles

3D SICK lidar

Autonomous vehicles use lidar for obstacle detection and avoidance to navigate safely through environments.[18]

Biology and conservation

Lidar has also found many applications in forestry. Canopy heights, biomass measurements, and leaf area can all be studied using airborne lidar systems. Similarly, lidar is used by many industries, including energy and railroad companies and the Department of Transportation, as a faster way of surveying. Topographic maps can also be generated readily from lidar, including for recreational uses such as the production of orienteering maps.[19]

In addition, the Save-the-Redwoods League is undertaking a project to map the tall redwoods on California's northern coast. Lidar allows research scientists not only to measure the height of previously unmapped trees but also to determine the biodiversity of the redwood forest. Stephen Sillett, who is working with the League on the North Coast Lidar project, claims this technology will be useful in directing future efforts to preserve and protect ancient redwood trees.[20]
Geology and soil science

High-resolution digital elevation maps generated by airborne and stationary lidar have led to significant advances in geomorphology (the branch of geoscience concerned with the origin and evolution of Earth's surface topography). Lidar's abilities to detect subtle topographic features such as river terraces and river channel banks, to measure the land-surface elevation beneath the vegetation canopy, to better resolve spatial derivatives of elevation, and to detect elevation changes between repeat surveys have enabled many novel studies of the physical and chemical processes that shape landscapes.[citation needed]

In geophysics and tectonics, a combination of aircraft-based lidar and GPS has evolved into an important tool for detecting faults and measuring uplift. The output of the two technologies can produce extremely accurate elevation models for terrain, models that can even measure ground elevation through trees. This combination was used most famously to find the location of the Seattle Fault in Washington, USA,[21] and to measure uplift at Mount St. Helens using data from before and after the 2004 uplift.[22] Airborne lidar systems monitor glaciers and have the ability to detect subtle amounts of growth or decline. A satellite-based system, NASA's ICESat, includes a lidar sub-system for this purpose. NASA's Airborne Topographic Mapper[23] is also used extensively to monitor glaciers and perform coastal change analysis.

The combination is also used by soil scientists when creating a soil survey. The detailed terrain modeling allows soil scientists to see slope changes and landform breaks which indicate patterns in soil spatial relationships.

Meteorology and atmospheric environment

The first lidar systems were used for studies of atmospheric composition, structure, clouds, and aerosols. Initially based on ruby lasers, lidar for meteorological applications was constructed shortly after the invention of the laser and represents one of the first applications of laser technology.

Differential absorption lidar (DIAL) is used for range-resolved measurements of a particular gas in the atmosphere, such as ozone, carbon dioxide, or water vapor. The lidar transmits two wavelengths: an "on-line" wavelength that is absorbed by the gas of interest and an "off-line" wavelength that is not. The differential absorption between the two wavelengths is a measure of the concentration of the gas as a function of range. DIAL lidars are essentially dual-wavelength backscatter lidars.[citation needed]
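The DIAL retrieval itself reduces to a ratio of the two return profiles: the gas number density over a range interval is proportional to the range derivative of ln(P_off/P_on), scaled by the differential absorption cross section. A minimal sketch of that computation (the profile arrays and cross-section value are illustrative assumptions):

```python
import numpy as np

def dial_number_density(p_on: np.ndarray, p_off: np.ndarray,
                        dr_m: float, delta_sigma_m2: float) -> np.ndarray:
    """Estimate gas number density (molecules/m^3) per range bin.

    Standard DIAL relation: N(R) = 1/(2*delta_sigma) * d/dR ln(P_off/P_on),
    where delta_sigma is the on-line minus off-line absorption cross
    section and dr_m is the range-bin spacing of the return profiles.
    """
    log_ratio = np.log(p_off / p_on)
    return np.gradient(log_ratio, dr_m) / (2.0 * delta_sigma_m2)
```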
Doppler lidar and Rayleigh Doppler lidar are used to measure temperature and/or wind speed along the beam by measuring the frequency of the backscattered light. The Doppler broadening of gases in motion allows the determination of properties via the resulting frequency shift.[24][25] Scanning lidars, such as NASA's HARLIE LIDAR, have been used to measure atmospheric wind velocity in a large three-dimensional cone.[26] ESA's wind mission ADM-Aeolus will be equipped with a Doppler lidar system in order to provide global measurements of vertical wind profiles.[27] A Doppler lidar system was used in the 2008 Summer Olympics to measure wind fields during the yacht competition.[28]

Doppler lidar systems are also now beginning to be successfully applied in the renewable energy sector to acquire wind speed, turbulence, wind veer, and wind shear data. Both pulsed[29] and continuous-wave[30] systems are being used: pulsed systems use signal timing to obtain vertical distance resolution, whereas continuous-wave systems rely on detector focusing.
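The line-of-sight wind speed follows directly from the measured Doppler shift: v = λ·Δf/2, with the factor of two arising because the shift is applied on both the outbound and return paths. A small sketch (the numeric values are illustrative):

```python
def los_wind_speed(wavelength_m: float, doppler_shift_hz: float) -> float:
    """Line-of-sight velocity from a lidar Doppler shift.

    v = lambda * df / 2; the factor 2 accounts for the round trip
    (the scatterers see a shifted frequency and re-shift it on return).
    A positive shift means the scatterers move toward the lidar.
    """
    return wavelength_m * doppler_shift_hz / 2.0

# A 1550 nm coherent lidar observing a 12.9 MHz shift:
print(f"{los_wind_speed(1550e-9, 12.9e6):.1f} m/s")  # ~10.0 m/s
```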
Synthetic array lidar allows imaging lidar without the need for an array detector. It can be used for imaging Doppler velocimetry, ultra-fast frame rate (MHz) imaging, and speckle reduction in coherent lidar.[12] An extensive lidar bibliography for atmospheric and hydrospheric applications is given by Grant.[31]

Law enforcement

LIDAR speed guns are used by the police to measure the speed of vehicles for speed limit enforcement purposes.[citation needed]

Military

Few military applications are known to be in place, and those are classified, but a considerable amount of research is underway on the use of lidar for imaging. Higher-resolution systems collect enough detail to identify targets, such as tanks. Examples of military applications of lidar include the Airborne Laser Mine Detection System (ALMDS) for counter-mine warfare by Areté Associates.[32]

A NATO report (RTO-TR-SET-098) evaluated potential technologies for stand-off detection and discrimination of biological warfare agents. The technologies evaluated were Long-Wave Infrared (LWIR), Differential Scattering (DISC), and Ultraviolet Laser-Induced Fluorescence (UV-LIF). The report concluded that, based upon the results of the lidar systems tested, the best option for the near-term (2008–2010) application of stand-off detection systems is UV-LIF.[33] However, in the long term, other techniques such as stand-off Raman spectroscopy may prove useful for identification of biological warfare agents. Short-range compact spectrometric lidar based on Laser-Induced Fluorescence (LIF) would address the presence of bio-threats in aerosol form over critical indoor, semi-enclosed, and outdoor venues like stadiums, subways, and airports. This near real-time capability would enable rapid detection of a bioaerosol release and allow for timely implementation of measures to protect occupants and minimize the extent of contamination.[34]

The Long-Range Biological Standoff Detection System (LR-BSDS) was developed for the US Army to provide the earliest possible standoff warning of a biological attack. It is an airborne system carried by a helicopter to detect man-made aerosol clouds containing biological and chemical agents at long range. The LR-BSDS, with a detection range of 30 km or more, was fielded in June 1997.[35]

Five lidar units produced by the German company Sick AG were used for short-range detection on Stanley, the autonomous car that won the 2005 DARPA Grand Challenge. A robotic Boeing AH-6 performed a fully autonomous flight in June 2010, including avoiding obstacles using lidar.[36][37]

Mining

Lidar is used in the mining industry for various tasks. The calculation of ore volumes is accomplished by periodic (monthly) scanning in areas of ore removal, then comparing surface data to the previous scan.[citation needed]
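Conceptually, that volume comparison is a cell-by-cell difference of two gridded surface scans multiplied by the cell area. A minimal sketch (the grids, cell size, and sign convention are assumptions of the illustration):

```python
import numpy as np

def removed_volume(prev_surface: np.ndarray, new_surface: np.ndarray,
                   cell_size_m: float) -> float:
    """Volume of material removed between two gridded surface scans.

    Each array holds elevations on the same grid; where the new
    surface is lower than the old one, material has been removed.
    """
    drop = np.clip(prev_surface - new_surface, 0.0, None)  # ignore fill
    return float(drop.sum() * cell_size_m ** 2)

# Two toy 3x3 surfaces on a 1 m grid; 0.5 m removed from one cell:
before = np.full((3, 3), 10.0)
after = before.copy()
after[1, 1] -= 0.5
print(removed_volume(before, after, 1.0))  # 0.5 cubic metres
```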
Physics and astronomy

A worldwide network of observatories uses lidars to measure the distance to reflectors placed on the Moon, allowing the Moon's position to be measured with millimetre precision and tests of general relativity to be performed. MOLA, the Mars Orbiting Laser Altimeter, used a lidar instrument on a Mars-orbiting satellite (the NASA Mars Global Surveyor) to produce a spectacularly precise global topographic survey of the red planet. In September 2008, NASA's Phoenix lander used lidar to detect snow in the atmosphere of Mars.[38]

In atmospheric physics, lidar is used as a remote detection instrument to measure densities of certain constituents of the middle and upper atmosphere, such as potassium, sodium, or molecular nitrogen and oxygen. These measurements can be used to calculate temperatures. Lidar can also be used to measure wind speed and to provide information about the vertical distribution of aerosol particles.[citation needed]

At the JET nuclear fusion research facility in the UK, near Abingdon, Oxfordshire, lidar Thomson scattering is used to determine electron density and temperature profiles of the plasma.[39]
Robotics

Lidar technology is being used in robotics for perception of the environment as well as object classification.[40] The ability of lidar technology to provide three-dimensional elevation maps of the terrain, high-precision distance to the ground, and approach velocity can enable safe landing of robotic and manned vehicles with a high degree of precision.[41] Refer to the Military section above for further examples.

Spaceflight

Lidar is increasingly being utilized for rangefinding and orbital element calculation of relative velocity in proximity operations and stationkeeping of spacecraft. Lidar has also been used for atmospheric studies from space. Using short pulses of laser light beamed from a spacecraft, some of that "light reflects off of tiny particles in the atmosphere and back to a telescope aligned with the laser. By precisely timing the lidar 'echo,' and by measuring how much laser light is received by the telescope, scientists can accurately determine the location, distribution and nature of the particles. The result is a revolutionary new tool for studying constituents in the atmosphere, from cloud droplets to industrial pollutants, that are difficult to detect by other means."[42][43]

Surveying

This TomTom mapping van is fitted with five lidars on its roof rack.

Airborne lidar sensors are used by companies in the remote sensing field to create digital terrain models (DTMs) and digital elevation models (DEMs); this is quite a common practice for larger areas, as a plane can take in a 1 km wide swath in one flyover. Greater vertical accuracy, below 50 mm, can be achieved with a lower flyover and a slimmer 200 m swath, even in forest, where the system can give the height of the canopy as well as the ground elevation. A reference point is needed to link the data in with the WGS (World Geodetic System).[citation needed] In this respect it works a lot like ordinary radar, except that these systems send out narrow pulses or beams of light rather than broad radio waves.
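A useful back-of-the-envelope check when planning such a survey is the expected point density on the ground, which follows from the pulse rate, flying speed, and swath width. A small sketch (the parameter values are illustrative, not from any particular sensor, and it assumes roughly one return per pulse with no overlap between flight lines):

```python
def point_density(pulse_rate_hz: float, ground_speed_ms: float,
                  swath_width_m: float) -> float:
    """Average lidar returns per square metre for an airborne survey.

    The aircraft sweeps out swath_width_m * ground_speed_ms square
    metres of new ground per second while emitting pulse_rate_hz pulses.
    """
    return pulse_rate_hz / (ground_speed_ms * swath_width_m)

# 100 kHz pulse rate, 60 m/s ground speed, 1 km swath:
print(f"{point_density(100_000, 60.0, 1000.0):.2f} pts/m^2")  # ~1.67
```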
Transportation

Lidar has been used in adaptive cruise control (ACC) systems for automobiles. Systems such as those by Siemens and Hella use a lidar device mounted on the front of the vehicle, such as on the bumper, to monitor the distance between the vehicle and any vehicle in front of it.[44] In the event the vehicle in front slows down or is too close, the ACC applies the brakes to slow the vehicle. When the road ahead is clear, the ACC allows the vehicle to accelerate to a speed preset by the driver. Refer to the Military section above for further examples.

Wind farm optimization

Lidar can be used to increase the energy output from wind farms by accurately measuring wind speeds and wind turbulence.[] An experimental[] lidar is mounted on a wind turbine rotor to measure oncoming horizontal winds, and proactively adjust blades to protect components and increase power.[45]

Solar photovoltaic deployment optimization

Lidar can also be used to assist planners and developers in optimizing solar photovoltaic systems at the city level by determining appropriate rooftops[46] and estimating shading losses.[47]

Other uses

The video for the song "House of Cards" by Radiohead was believed to be the first use of real-time 3D laser scanning to record a music video. The range data in the video is not completely from a lidar, as structured-light scanning is also used.[48][49]

Alternative technologies

Recent development of structure from motion (SFM) technologies allows delivering 3D images and maps based on data extracted from visual and IR photography. The elevation or 3D data is extracted using multiple parallel passes over the mapped area, yielding both a visible-light image and 3D structure from the same sensor, which is often a specially chosen and calibrated digital camera.

References

[2] James Ring, "The Laser in Astronomy," pp. 672–673, New Scientist, 20 June 1963.
[3] "New Artillery Against Smog: TV and Lidar," Popular Mechanics, April 1970, p. 104.
[4] NOAA, http://www.ngs.noaa.gov/RESEARCH/RSD/main/lidar/lidar.shtml
[5] LIDAR patent on file, http://www.google.com/patents/US20090273770
[6] Google Books search for "lidar", sorted by date of publication, http://books.google.com/
[7] USGS Center for LIDAR Information Coordination and Knowledge, http://lidar.cr.usgs.gov/
[8] New York Times, search for "lidar", http://query.nytimes.com/search/sitesearch/#/lidar
[9] 'Experimental Advanced Research Lidar', NASA.org (http://inst.wff.nasa.gov/eaarl/). Retrieved 8 August 2007.
[12] Strauss, C. E. M., "Synthetic-array heterodyne detection: a single-element detector acts as an array" (http://www.opticsinfobase.org/ol/abstract.cfm?id=12612), Opt. Lett. 19, 1609–1611 (1994).
[19] http://www.lidarbasemaps.org
[20] Councillor Quarterly, Summer 2007, Volume 6, Issue 3.
[21] Tom Paulson, 'LIDAR shows where earthquake risks are highest', Seattle Post-Intelligencer (Wednesday, April 18, 2001) (http://www.seattlepi.com/local/19144_quake18.shtml).
[22] 'Mount Saint Helens LIDAR Data', Washington State Geospatial Data Archive (September 13, 2006) (http://wagda.lib.washington.edu/data/type/elevation/lidar/st_helens/). Retrieved 8 August 2007.
[23] 'Airborne Topographic Mapper', NASA.gov (http://atm.wff.nasa.gov/). Retrieved 8 August 2007.
[24] http://superlidar.colorado.edu/Classes/Lidar2011/LidarLecture14.pdf
[26] Thomas D. Wilkerson, Geary K. Schwemmer, and Bruce M. Gentry, LIDAR Profiling of Aerosols, Clouds, and Winds by Doppler and Non-Doppler Methods, NASA International H2O Project (2002) (http://harlie.gsfc.nasa.gov/IHOP2002/Pub&Pats/AMOS 2002 final.pdf).
[27] 'Earth Explorers: ADM-Aeolus', ESA.org (European Space Agency, 6 June 2007) (http://www.esa.int/esaLP/ESAES62VMOC_LPadmaeolus_0.html). Retrieved 8 August 2007.
[28] 'Doppler lidar gives Olympic sailors the edge', Optics.org (3 July 2008) (http://optics.org/cws/article/research/34878). Retrieved 8 July 2008.
[29] http://www.lidarwindtechnologies.com/
[30] http://www.naturalpower.com/zephir
[31] Grant, W. B., "Lidar for atmospheric and hydrospheric studies," in Tunable Laser Applications, 1st Edition, Duarte, F. J., Ed. (Marcel Dekker, New York, 1995), Chapter 7.
[32] http://www.arete.com/index.php?view=stil_mcm
[35] http://articles.janes.com/articles/Janes-Nuclear,-Biological-and-Chemical-Defence/LR-BSDS--Long-Range-Biological-Standoff-Detection-System-United-States.html
[36] Spice, Byron, Researchers Help Develop Full-Size Autonomous Helicopter (http://www.cmu.edu/news/blog/2010/Summer/unprecedented-robochopper.shtml), Carnegie Mellon, 6 July 2010. Retrieved: 19 July 2010.
[37] Koski, Olivia, In a First, Full-Sized Robo-Copter Flies With No Human Help (http://www.wired.com/dangerroom/2010/07/in-a-first-full-sized-robo-copter-flies-with-no-human-help/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed:+wired/index+(Wired:+Index+3+(Top+Stories+2))#ixzz0tk2hAfAQ), Wired, 14 July 2010. Retrieved: 19 July 2010.
[38] NASA, 'NASA Mars Lander Sees Falling Snow, Soil Data Suggest Liquid Past', NASA.gov (29 September 2008) (http://www.nasa.gov/mission_pages/phoenix/news/phoenix-20080929.html). Retrieved 9 November 2008.
[39] C. W. Gowers, 'Focus On: Lidar-Thomson Scattering Diagnostic on JET', JET.EFDA.org (undated) (http://www.jet.efda.org/pages/focus/lidar/index.html). Retrieved 8 August 2007.
[45] Mikkelsen, Torben; Hansen, Kasper Hjorth; et al., Lidar wind speed measurements from a rotating spinner (http://orbit.dtu.dk/getResource?recordId=259451&objectId=1&versionId=1), Danish Research Database & Danish Technical University, 20 April 2010. Retrieved: 25 April 2010.
[46] Ha T. Nguyen, Joshua M. Pearce, Rob Harrap, and Gerald Barber, "The Application of LiDAR to Assessment of Rooftop Solar Photovoltaic Deployment Potential on a Municipal District Unit" (http://www.mdpi.com/1424-8220/12/4/4534/pdf), Sensors, 12, pp. 4534–4558 (2012).
[49] Retrieved 2 May 2011 (http://www.velodyne.com/lidar/lidar.aspx)
Wang, J.; Zhang, J.; Roncat, A.; Kuenzer, C.; Wagner, W., 2009: "Regularizing method for the determination of the backscatter cross section in lidar data," J. Opt. Soc. Am. A, Vol. 26, No. 5 (May 2009), pp. 1071–1079.

External links

• The USGS Center for LIDAR Information Coordination and Knowledge (CLICK) (http://lidar.cr.usgs.gov/) - A website intended to "facilitate data access, user coordination and education of lidar remote sensing for scientific needs."
• How Lidar Works (http://airborne1.com/how_lidar.html)
• LiDAR Research Group (LRG), University of Heidelberg (http://lrg.uni-hd.de)
• Forecast 3D Lidar Scanner manufactured by Autonomous Solutions, Inc (http://autonomoussolutions.com/forecast-3d-laser-system/): 3D Point Cloud and Obstacle Detection
• Free online lidar data viewer (http://www.lidarview.com/)

Remote sensing

Synthetic aperture radar image of Death Valley colored using polarimetry.

Remote sensing is the acquisition of information about an object or phenomenon without making physical contact with the object. In modern usage, the term generally refers to the use of aerial sensor technologies to detect and classify objects on Earth (both on the surface, and in the atmosphere and oceans) by means of propagated signals (e.g. electromagnetic radiation emitted from aircraft or satellites).[1][2]
Overview

This video is about how Landsat was used to identify areas of conservation in the Democratic Republic of the Congo, and how it was used to help map an area called MLW in the north.

There are two main types of remote sensing: passive remote sensing and active remote sensing.[3] Passive sensors detect natural radiation that is emitted or reflected by the object or surrounding areas. Reflected sunlight is the most common source of radiation measured by passive sensors. Examples of passive remote sensors include film photography, infrared, charge-coupled devices, and radiometers. Active collection, on the other hand, emits energy in order to scan objects and areas, whereupon a sensor detects and measures the radiation that is reflected or backscattered from the target. RADAR and LiDAR are examples of active remote sensing, where the time delay between emission and return is measured, establishing the location, speed, and direction of an object.

Remote sensing makes it possible to collect data on dangerous or inaccessible areas. Remote sensing applications include monitoring deforestation in areas such as the Amazon Basin, glacial features in Arctic and Antarctic regions, and depth sounding of coastal and ocean depths. Military collection during the Cold War made use of stand-off collection of data about dangerous border areas. Remote sensing also replaces costly and slow data collection on the ground, ensuring in the process that areas or objects are not disturbed.

Orbital platforms collect and transmit data from different parts of the electromagnetic spectrum, which, in conjunction with larger-scale aerial or ground-based sensing and analysis, provides researchers with enough information to monitor trends such as El Niño and other natural long- and short-term phenomena. Other uses include different areas of the earth sciences such as natural resource management, agricultural fields such as land usage and conservation, and national security and overhead, ground-based, and stand-off collection on border areas.[4]

Data acquisition techniques

The basis for multispectral collection and analysis is that of examined areas or objects that reflect or emit radiation that stands out from surrounding areas.

Applications of remote sensing data

• Conventional radar is mostly associated with aerial traffic control, early warning, and certain large-scale meteorological data. Doppler radar is used by local law enforcement's monitoring of speed limits and in enhanced meteorological collection such as wind speed and direction within weather systems. Other types of active collection include plasmas in the ionosphere. Interferometric synthetic aperture radar is used to produce precise digital elevation models of large-scale terrain (see RADARSAT, TerraSAR-X, Magellan).
• Laser and radar altimeters on satellites have provided a wide range of data. By measuring the bulges of water caused by gravity, they map features on the seafloor to a resolution of a mile or so. By measuring the height and wavelength of ocean waves, the altimeters measure wind speeds and direction, and surface ocean currents and directions.
• Light detection and ranging (LIDAR) is well known from examples of weapon ranging and laser-illuminated homing of projectiles. LIDAR is used to detect and measure the concentration of various chemicals in the atmosphere, while airborne LIDAR can be used to measure the heights of objects and features on the ground more accurately than with radar technology. Vegetation remote sensing is a principal application of LIDAR.
• Radiometers and photometers are the most common instruments in use, collecting reflected and emitted radiation in a wide range of frequencies. The most common are visible and infrared sensors, followed by microwave, gamma-ray, and, rarely, ultraviolet sensors. They may also be used to detect the emission spectra of various chemicals, providing data on chemical concentrations in the atmosphere.
• Stereographic pairs of aerial photographs have often been used to make topographic maps by imagery and terrain analysts in trafficability and highway departments for potential routes.
• Simultaneous multi-spectral platforms such as Landsat have been in use since the 1970s. These thematic mappers take images in multiple wavelengths of electromagnetic radiation (multi-spectral) and are usually found on Earth observation satellites, including (for example) the Landsat program or the IKONOS satellite. Maps of land cover and land use from thematic mapping can be used to prospect for minerals, detect or monitor land usage and deforestation, and examine the health of indigenous plants and crops, including entire farming regions or forests.
• Hyperspectral imaging produces an image where each pixel has full spectral information, imaging narrow spectral bands over a contiguous spectral range. Hyperspectral imagers are used in various applications including mineralogy, biology, defence, and environmental measurements.
• Within the scope of the combat against desertification, remote sensing allows risk areas to be followed up and monitored in the long term, desertification factors to be determined, decision-makers to be supported in defining relevant measures of environmental management, and the impacts of those measures to be assessed.[5]

Geodetic

• Overhead geodetic collection was first used in aerial submarine detection and gravitational data used in military maps. This data revealed minute perturbations in the Earth's gravitational field (geodesy) that may be used to determine changes in the mass distribution of the Earth, which in turn may be used for geological studies.

Acoustic and near-acoustic

• Sonar: passive sonar, listening for the sound made by another object (a vessel, a whale, etc.); active sonar, emitting pulses of sound and listening for echoes, used for detecting, ranging, and measuring underwater objects and terrain.
• Seismograms taken at different locations can locate and measure earthquakes (after they occur) by comparing the relative intensity and precise timing.

To coordinate a series of large-scale observations, most sensing systems depend on the following: platform location, time, and the rotation and orientation of the sensor. High-end instruments now often use positional information from satellite navigation systems. Rotation and orientation are often provided within a degree or two by electronic compasses. Compasses can measure not just azimuth (i.e. degrees to magnetic north) but also altitude (degrees above the horizon), since the magnetic field curves into the Earth at different angles at different latitudes. More exact orientations require gyroscopic-aided orientation, periodically realigned by different methods including navigation from stars or known benchmarks.

Data processing

Generally speaking, remote sensing works on the principle of the inverse problem.
While the object or phenomenon of interest (the state) may not be directly measured, there exists some other variable that can be detected and measured (the observation) which may be related to the object of interest through the use of a data-derived computer model. The common analogy given to describe this is trying to determine the type of animal from its footprints. For example, while it is impossible to directly measure temperatures in the upper atmosphere, it is possible to measure the spectral emissions from a known chemical species (such as carbon dioxide) in that region. The frequency of the emissions may then be related to the temperature in that region via various thermodynamic relations.
The quality of remote sensing data consists of its spatial, spectral, radiometric, and temporal resolutions.

Spatial resolution
The size of a pixel that is recorded in a raster image – typically pixels may correspond to square areas ranging in side length from 1 to 1,000 metres (3.3 to 3,300 ft).

Spectral resolution
The wavelength width of the different frequency bands recorded – usually, this is related to the number of frequency bands recorded by the platform. Current Landsat collection is that of seven bands, including several in the infrared spectrum, ranging from a spectral resolution of 0.07 to 2.1 μm. The Hyperion sensor on Earth Observing-1 resolves 220 bands from 0.4 to 2.5 μm, with a spectral resolution of 0.10 to 0.11 μm per band.

Radiometric resolution
The number of different intensities of radiation the sensor is able to distinguish. Typically, this ranges from 8 to 14 bits, corresponding to 256 levels of the gray scale and up to 16,384 intensities or "shades" of colour in each band. It also depends on the instrument noise.

Temporal resolution
The frequency of flyovers by the satellite or plane, relevant only in time-series studies or those requiring an averaged or mosaic image, as in deforestation monitoring. This was first used by the intelligence community, where repeated coverage revealed changes in infrastructure, the deployment of units, or the modification/introduction of equipment. Cloud cover over a given area or object makes it necessary to repeat the collection of said location.

In order to create sensor-based maps, most remote sensing systems expect to extrapolate sensor data in relation to a reference point, including distances between known points on the ground. This depends on the type of sensor used. For example, in conventional photographs, distances are accurate in the center of the image, with the distortion of measurements increasing the farther you get from the center. Another factor is that the platen against which the film is pressed can cause severe errors when photographs are used to measure ground distances. The step in which this problem is resolved is called georeferencing and involves computer-aided matching of points in the image (typically 30 or more points per image) which is extrapolated with the use of an established benchmark, "warping" the image to produce accurate spatial data. As of the early 1990s, most satellite images are sold fully georeferenced.

In addition, images may need to be radiometrically and atmospherically corrected.

Radiometric correction
Gives a scale to the pixel values, e.g. the monochromatic scale of 0 to 255 will be converted to actual radiance values.
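In practice this rescaling is a linear transform applied per band, using gain and offset values supplied with the scene. A minimal sketch (the gain and offset numbers here are placeholders, not values from a real sensor):

```python
import numpy as np

def dn_to_radiance(dn: np.ndarray, gain: float, offset: float) -> np.ndarray:
    """Convert raw digital numbers (DN) to at-sensor spectral radiance.

    L = gain * DN + offset, the linear calibration typically supplied
    in a scene's metadata for each band (units, e.g., W/(m^2 sr um)).
    """
    return gain * dn.astype(np.float64) + offset

# An 8-bit band rescaled with placeholder calibration coefficients:
band = np.array([[0, 128, 255]], dtype=np.uint8)
print(dn_to_radiance(band, gain=0.8, offset=-2.0))  # [[-2., 100.4, 202.]]
```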
Topographic correction
In rugged mountains, the effective illumination received by each pixel varies considerably with the terrain. In a remote sensing image, a pixel on a shady slope receives weak illumination and has a low radiance value; in contrast, a pixel on a sunny slope receives strong illumination and has a high radiance value. For the same object, the pixel radiance value on a shady slope will therefore be very different from that on a sunny slope, while different objects may have similar radiance values. These spectral changes seriously reduce the accuracy of information extraction from remote sensing imagery in mountainous areas and are a major obstacle to the further application of such images. The purpose of topographic correction is to eliminate this effect and recover the true reflectivity or radiance that objects would have under horizontal conditions. It is a prerequisite for quantitative remote sensing applications.
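One widely cited simple approach is the cosine correction, which rescales each pixel by the ratio of the cosine of the solar zenith angle to the cosine of the local solar incidence angle derived from the pixel's slope and aspect. A minimal per-pixel sketch (a simplified illustration; operational methods often add empirical terms to avoid overcorrection):

```python
import math

def cosine_correction(radiance: float, solar_zenith_deg: float,
                      incidence_deg: float) -> float:
    """Rescale a pixel's radiance to what a horizontal surface would see.

    L_corrected = L * cos(solar_zenith) / cos(local_incidence), where
    the local incidence angle comes from the pixel's slope/aspect and
    the sun position.
    """
    return radiance * (math.cos(math.radians(solar_zenith_deg)) /
                       math.cos(math.radians(incidence_deg)))

# A shaded pixel (incidence 70 deg) under a 30 deg solar zenith:
print(f"{cosine_correction(50.0, 30.0, 70.0):.1f}")  # ~126.6
```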
Atmospheric correction
Eliminates atmospheric haze by rescaling each frequency band so that its minimum value (usually realised in water bodies) corresponds to a pixel value of 0. The digitizing of data also makes it possible to manipulate the data by changing gray-scale values.
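This is the classic dark-object subtraction: treat each band's darkest pixels as haze signal and subtract it everywhere. A minimal sketch (using a low percentile rather than the strict minimum is a robustness assumption of this illustration):

```python
import numpy as np

def dark_object_subtraction(band: np.ndarray,
                            percentile: float = 0.1) -> np.ndarray:
    """Shift a band so its darkest pixels (assumed haze) sit at zero.

    The darkest targets in a scene (e.g., deep water) should reflect
    almost nothing; any signal there is attributed to atmospheric
    scattering and removed from every pixel in the band.
    """
    haze = np.percentile(band, percentile)
    return np.clip(band - haze, 0, None)
```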
Interpretation is the critical process of making sense of the data. The first application was that of aerial photographic collection, which used the following process: spatial measurement through the use of a light table in both conventional single and stereographic coverage; added skills such as the use of photogrammetry; the use of photomosaics and repeat coverage; and making use of objects' known dimensions in order to detect modifications. Image analysis is the more recently developed automated computer-aided application which is in increasing use. Object-Based Image Analysis (OBIA) is a sub-discipline of GIScience devoted to partitioning remote sensing (RS) imagery into meaningful image-objects and assessing their characteristics through spatial, spectral, and temporal scales.

Old data from remote sensing is often valuable because it may provide the only long-term data for a large extent of geography. At the same time, the data is often complex to interpret and bulky to store. Modern systems tend to store the data digitally, often with lossless compression. The difficulty with this approach is that the data is fragile, the format may be archaic, and the data may be easy to falsify. One of the best systems for archiving data series is as computer-generated machine-readable ultrafiche, usually in typefonts such as OCR-B, or as digitized half-tone images. Ultrafiches survive well in standard libraries, with lifetimes of several centuries. They can be created, copied, filed, and retrieved by automated systems. They are about as compact as archival magnetic media, and yet can be read by human beings with minimal, standardized equipment.

Data processing levels

To facilitate the discussion of data processing in practice, several processing "levels" were first defined in 1986 by NASA as part of its Earth Observing System[6] and steadily adopted since then, both internally at NASA (e.g.,[7]) and elsewhere (e.g.,[8]); these definitions are:

Level 0: Reconstructed, unprocessed instrument and payload data at full resolution, with any and all communications artifacts (e.g., synchronization frames, communications headers, duplicate data) removed.
Level 1a: Reconstructed, unprocessed instrument data at full resolution, time-referenced, and annotated with ancillary information, including radiometric and geometric calibration coefficients and georeferencing parameters (e.g., platform ephemeris) computed and appended but not applied to the Level 0 data (or, if applied, in a manner such that Level 0 is fully recoverable from Level 1a data).
Level 1b: Level 1a data that have been processed to sensor units (e.g., radar backscatter cross section, brightness temperature, etc.); not all instruments have Level 1b data; Level 0 data is not recoverable from Level 1b data.
Level 2: Derived geophysical variables (e.g., ocean wave height, soil moisture, ice concentration) at the same resolution and location as the Level 1 source data.
Level 3: Variables mapped on uniform space-time grid scales, usually with some completeness and consistency (e.g., missing points interpolated, complete regions mosaicked together from multiple orbits, etc.).
Level 4: Model output or results from analyses of lower-level data (i.e., variables that were not measured by the instruments but are instead derived from those measurements).

A Level 1 data record is the most fundamental (i.e., highest reversible level) data record that has significant scientific utility, and is the foundation upon which all subsequent data sets are produced. Level 2 is the first level that is directly usable for most scientific applications; its value is much greater than that of the lower levels. Level 2 data sets tend to be less voluminous than Level 1 data because they have been reduced temporally, spatially, or spectrally. Level 3 data sets are generally smaller than lower-level data sets and thus can be dealt with without incurring a great deal of data-handling overhead. These data tend to be generally more useful for many applications. The regular spatial and temporal organization of Level 3 datasets makes it feasible to readily combine data from different sources.
History

The TR-1 reconnaissance/surveillance aircraft.

The 2001 Mars Odyssey used spectrometers and imagers to hunt for evidence of past or present water and volcanic activity on Mars.

The modern discipline of remote sensing arose with the development of flight. The balloonist G. Tournachon (alias Nadar) made photographs of Paris from his balloon in 1858. Messenger pigeons, kites, rockets, and unmanned balloons were also used for early images. With the exception of balloons, these first, individual images were not particularly useful for map making or for scientific purposes.[citation needed]

Systematic aerial photography was developed for military surveillance and reconnaissance purposes beginning in World War I and reaching a climax during the Cold War with the use of modified combat aircraft such as the P-51, P-38, RB-66, and F-4C, or specifically designed collection platforms such as the U2/TR-1, SR-71, A-5, and the OV-1 series, in both overhead and stand-off collection. A more recent development is that of increasingly smaller sensor pods, such as those used by law enforcement and the military, on both manned and unmanned platforms. The advantage of this approach is that it requires minimal modification to a given airframe. Later imaging technologies would include infrared, conventional, Doppler, and synthetic aperture radar.[citation needed]

The development of artificial satellites in the latter half of the 20th century allowed remote sensing to progress to a global scale as of the end of the Cold War. Instrumentation aboard various Earth observing and weather satellites such as Landsat, the Nimbus, and more recent missions such as RADARSAT and UARS provided global measurements of various data for civil, research, and military purposes. Space probes to other planets have also provided the opportunity to conduct remote sensing studies in extraterrestrial environments: synthetic aperture radar aboard the Magellan spacecraft provided detailed topographic maps of Venus, while instruments aboard SOHO allowed studies to be performed on the Sun and the solar wind, just to name a few examples.[citation needed]

More recent developments began in the 1960s and 1970s with the development of image processing of satellite imagery. Several research groups in Silicon Valley, including NASA Ames Research Center, GTE, and ESL Inc., developed Fourier transform techniques leading to the first notable enhancement of imagery data.[citation needed]

Training and Education

Remote sensing has a growing relevance in the modern information society. It represents a key technology within the aerospace industry and bears increasing economic relevance: new sensors such as TerraSAR-X and RapidEye are developed constantly, and the demand for skilled labour is increasing steadily. Furthermore, remote sensing increasingly influences everyday life, ranging from weather forecasts to reports on climate change or natural disasters. As an example, 80% of German students use the services of Google Earth; in 2006 alone the software was downloaded 100 million times. But studies have shown that only a fraction of them know much about the data they are working with.[10] There exists a huge knowledge gap between the application and the understanding of satellite images. Remote sensing plays only a tangential role in schools, regardless of political claims to strengthen support for teaching on the subject.[11]
Much of the computer software explicitly developed for school lessons has not yet been implemented due to its complexity. As a result, the subject is either not integrated into the curriculum at all or does not pass beyond the step of interpreting analogue images. In fact, the subject of remote sensing requires a consolidation of physics and mathematics, as well as competences in the fields of media and methods, beyond the mere visual interpretation of satellite images. Many teachers have great interest in the subject of remote sensing and are motivated to integrate the topic into teaching, provided that the curriculum allows it. In many cases, this encouragement fails because of confusing information.[12] In order to integrate remote sensing in a sustainable manner, organizations like the EGU or Digital Earth[13] encourage the development of learning modules and learning portals (e.g. FIS – Remote Sensing in School Lessons,[14] or Landmap – Spatial Discovery[15]) promoting media and method qualifications as well as independent working.

Remote Sensing Internships

One effective way to teach students the many applications of remote sensing is through an internship opportunity. NASA DEVELOP is one such opportunity, where students work in teams with science advisors and/or partners to meet some practical need in the community. Working through NASA, this program gives students experience in real-world remote sensing applications, as well as providing valuable training. (More information can be found on the NASA DEVELOP website.[16]) Another such program is SERVIR. Supported by the US Agency for International Development (USAID) and NASA, SERVIR provides students with valuable hands-on experience with remote sensing, while providing end-users with the resources to respond better to a whole host of issues. (More information can be found on the SERVIR website.[17])

Remote Sensing software

Remote sensing data is processed and analyzed with computer software, known as a remote sensing application. A large number of proprietary and open-source applications exist to process remote sensing data. Remote sensing software packages include:
• TNTmips from MicroImages,
• PCI Geomatica made by PCI Geomatics, the leading remote sensing software package in Canada,
• IDRISI from Clark Labs,
• Image Analyst from Intergraph,
• RemoteView made by Overwatch Textron Systems, and
• Dragon/ips, one of the oldest remote sensing packages still available, and in some cases free.

Open source remote sensing software includes:
• OSSIM,
• Opticks (software),
• Orfeo toolbox, and
• others mixing remote sensing and GIS capabilities: GRASS GIS, ILWIS, QGIS, and TerraLook.

According to NOAA-sponsored research by Global Marketing Insights, Inc., the most used applications among Asian academic groups involved in remote sensing are as follows: ERDAS 36% (ERDAS IMAGINE 25% & ERMapper 11%); ESRI 30%; ITT Visual Information Solutions ENVI 17%; MapInfo 17%. Among Western academic respondents: ESRI 39%, ERDAS IMAGINE 27%, MapInfo 9%, AutoDesk 7%, ITT Visual Information Solutions ENVI 17%.
References

[4] http://hurricanes.nasa.gov/earth-sun/technology/remote_sensing.html
[5] Begni Gérard, Escadafal Richard, Fontannaz Delphine, and Hong-Nga Nguyen Anne-Thérèse, 2005. Remote sensing: a tool to monitor and assess desertification. Les dossiers thématiques du CSFD. Issue 2. 44 pp. (http://www.csf-desertification.org/index.php/bibliotheque/publications-csfd/doc_details/28-begni-gerard-et-al-2005-remote-sensing)
[6] NASA (1986), Report of the EOS data panel, Earth Observing System, Data and Information System, Data Panel Report, Vol. IIa, NASA Technical Memorandum 87777, June 1986, 62 pp. Available at (http://hdl.handle.net/2060/19860021622)
[7] C. L. Parkinson, A. Ward, M. D. King (Eds.), Earth Science Reference Handbook – A Guide to NASA's Earth Science Program and Earth Observing Satellite Missions, National Aeronautics and Space Administration, Washington, D.C. Available at (http://eospso.gsfc.nasa.gov/ftp_docs/2006ReferenceHandbook.pdf)
[8] GRAS-SAF (2009), Product User Manual, GRAS Satellite Application Facility, Version 1.2.1, 31 March 2009. Available at (http://www.grassaf.org/general-documents/products/grassaf_pum_v121.pdf)
[10] Ditter, R., Haspel, M., Jahn, M., Kollar, I., Siegmund, A., Viehrig, K., Volz, D., Siegmund, A. (2012) Geospatial technologies in school – theoretical concept and practical implementation in K-12 schools. In: International Journal of Data Mining, Modelling and Management (IJDMMM): FutureGIS: Riding the Wave of a Growing Geospatial Technology Literate Society; Vol. X
[11] Stork, E.J., Sakamoto, S.O., and Cowan, R.M. (1999) "The integration of science explorations through the use of earth images in middle school curriculum", Proc. IEEE Trans. Geosci. Remote Sensing 37, 1801–1817
[12] Bednarz, S.W. and Whisenant, S.E. (2000) "Mission geography: linking national geography standards, innovative technologies and NASA", Proc. IGARSS, Honolulu, USA, 2780–2782
[13] http://www.digital-earth.eu/
[14] http://www.fis.uni-bonn.de/node/92
[15] http://www.landmap.ac.uk
[16] http://develop.larc.nasa.gov/
[17] http://www.nasa.gov/mission_pages/servir/index.html

Further reading

• Campbell, J. B. (2002). Introduction to remote sensing (3rd ed.). The Guilford Press. ISBN 1-57230-640-8.
• Jensen, J. R. (2007). Remote sensing of the environment: an Earth resource perspective (2nd ed.). Prentice Hall. ISBN 0-13-188950-8.
• Jensen, J. R. (2005). Digital Image Processing: a Remote Sensing Perspective (3rd ed.). Prentice Hall.
• Lentile, Leigh B.; Holden, Zachary A.; Smith, Alistair M. S.; Falkowski, Michael J.; Hudak, Andrew T.; Morgan, Penelope; Lewis, Sarah A.; Gessler, Paul E.; Benson, Nate C. Remote sensing techniques to assess active fire characteristics and post-fire effects (http://www.treesearch.fs.fed.us/pubs/24613). International Journal of Wildland Fire. 2006;3(15):319–345.
• Lillesand, T. M.; R. W. Kiefer, and J. W. Chipman (2003). Remote sensing and image interpretation (5th ed.). Wiley. ISBN 0-471-15227-7.
• Richards, J. A.; and X. Jia (2006). Remote sensing digital image analysis: an introduction (4th ed.). Springer. ISBN 3-540-25128-6.
• US Army FM series.
• US Army military intelligence museum, FT Huachuca, AZ
Best practice guidelines for pre-launch characterization and calibration of instruments for passive optical remote sensing (http://nvlpubs.nist.gov/nistpubs/jres/116/2/V116.N02.A05.pdf). Journal of Research of the National Institute of Standards and Technology. 2011 (March–April);116(2):612–646.
External links
• Remote Sensing (http://www.dmoz.org/Science/Earth_Sciences/Geomatics/Remote_Sensing/) at the Open Directory Project
• Free space images (mosaics) (http://www.terraexploro.com/terralibrary/index.php/space-images)
• International Journal of Advanced Remote Sensing and GIS (http://www.cloud-journals.com/journal-of-remote-sensing-n-gis-open-access.html)

Light
The Sun is Earth's primary source of light. About 44% of the Sun's electromagnetic radiation that reaches the ground is in the visible light range.
Visible light (commonly referred to simply as light) is electromagnetic radiation that is visible to the human eye, and is responsible for the sense of sight. [1] Visible light has a wavelength in the range of about 380 nanometres (nm), or 380×10⁻⁹ m, to about 740 nanometres – between the invisible infrared, with longer wavelengths, and the invisible ultraviolet, with shorter wavelengths. Primary properties of visible light are intensity, propagation direction, frequency or wavelength spectrum, and polarisation, while its speed in a vacuum, 299,792,458 metres per second, is one of the fundamental constants of nature. Visible light, as with all types of electromagnetic radiation (EMR), is experimentally found to always move at this speed in a vacuum.

In common with all types of EMR, visible light is emitted and absorbed in tiny "packets" called photons, and exhibits properties of both waves and particles. This property is referred to as wave–particle duality. The study of light, known as optics, is an important research area in modern physics. In physics, the term light sometimes refers to electromagnetic radiation of any wavelength, whether visible or not. [2][3] This article focuses on visible light. See the electromagnetic radiation article for the general term.

Speed of visible light
The speed of light in a vacuum is defined to be exactly 299,792,458 m/s (approximately 186,282 miles per second). The fixed value of the speed of light in SI units results from the fact that the metre is now defined in terms of the speed of light. All forms of electromagnetic radiation are believed to move at exactly this same speed in vacuum.

Physicists have attempted to measure the speed of light throughout history; Galileo attempted a measurement in the seventeenth century. An early experiment to measure the speed of light was conducted by Ole Rømer, a Danish physicist, in 1676. Using a telescope, Rømer observed the motions of Jupiter and one of its moons, Io. Noting discrepancies in the apparent period of Io's orbit, he calculated that light takes about 22 minutes to traverse the diameter of Earth's orbit. [4] However, the size of that orbit was not known at the time. If Rømer had known the diameter of Earth's orbit, he would have obtained a speed of 227,000,000 m/s.

Another, more accurate, measurement of the speed of light was performed in Europe by Hippolyte Fizeau in 1849. Fizeau directed a beam of light at a mirror several kilometres away. A rotating cog wheel was placed in the path of the light beam as it travelled from the source to the mirror and then returned to its origin. Fizeau found that at a certain rate of rotation, the beam would pass through one gap in the wheel on the way out and the next gap on the way back. Knowing the distance to the mirror, the number of teeth on the wheel, and the rate of rotation, Fizeau was able to calculate the speed of light as 313,000,000 m/s. Léon Foucault used an experiment with rotating mirrors to obtain a value of 298,000,000 m/s in 1862. Albert A. Michelson conducted experiments on the speed of light from 1877 until his death in 1931. In 1926 he refined Foucault's methods, using improved rotating mirrors to measure the time it took light to make a round trip from Mt. Wilson to Mt. San Antonio in California. The precise measurements yielded a speed of 299,796,000 m/s.
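Fizeau's calculation is easy to reconstruct from the description above. The sketch below does so; the distance, tooth count and rotation rate are not given in this article, so the values used here are the commonly quoted figures for his 1849 setup, supplied as assumptions.

    # The text gives Fizeau's result (313,000,000 m/s) but not his numbers.
    # Geometry, as described above: the beam passes one gap going out and the
    # next gap coming back, so during the round trip the wheel advances one
    # tooth-plus-gap period, i.e. 1/N of a revolution. All values are assumed
    # (commonly quoted figures for the experiment), not taken from this article.
    D = 8633.0   # one-way distance to the mirror, metres (assumed)
    N = 720      # number of teeth on the wheel (assumed)
    f = 25.2     # revolutions per second when the beam reappears in the next gap (assumed)

    t = 1.0 / (N * f)   # round-trip travel time: wheel advances 1/N revolution
    c = 2.0 * D / t     # round-trip distance divided by travel time
    print(f"c ≈ {c:.3e} m/s")   # ≈ 3.13e8 m/s, matching the value quoted above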
The effective velocity of light in various transparent substances containing ordinary matter is less than in vacuum. For example, the speed of light in water is about 3/4 of that in vacuum. However, the slowing process in matter is thought to result not from actual slowing of particles of light, but rather from their absorption and re-emission by charged particles in matter. As an extreme example of light-slowing in matter, two independent teams of physicists were able to bring light to a "complete standstill" by passing it through a Bose–Einstein condensate of the element rubidium, one team at Harvard University and the Rowland Institute for Science in Cambridge, Massachusetts, and the other at the Harvard-Smithsonian Center for Astrophysics, also in Cambridge. [5] However, the popular description of light being "stopped" in these experiments refers only to light being stored in the excited states of atoms, then re-emitted at an arbitrary later time, as stimulated by a second laser pulse. During the time it had "stopped", it had ceased to be light.

Electromagnetic spectrum and visible light
Electromagnetic spectrum with light highlighted
Generally, EM radiation, or EMR (the designation 'radiation' excludes static electric, magnetic and near fields), is classified by wavelength into radio, microwave, infrared, the visible region that we perceive as light, ultraviolet, X-rays and gamma rays. The behaviour of EMR depends on its wavelength: higher frequencies have shorter wavelengths, and lower frequencies have longer wavelengths. When EMR interacts with single atoms and molecules, its behaviour depends on the amount of energy per quantum it carries.

EMR in the visible light region consists of quanta (called photons) that are at the lower end of the energies capable of causing electronic excitation within molecules, which leads to changes in the bonding or chemistry of the molecule. At the lower end of the visible light spectrum, EMR becomes invisible to humans (infrared) because its photons no longer have enough individual energy to cause a lasting molecular change (a change in conformation) in the visual molecule retinal in the human retina; it is this change that triggers the sensation of vision. There exist animals that are sensitive to various types of infrared, but not by means of quantum absorption. Infrared sensing in snakes depends on a kind of natural thermal imaging, in which tiny packets of cellular water are raised in temperature by the infrared radiation. EMR in this range causes molecular vibration and heating effects, and this is how living animals detect it.

Above the range of visible light, ultraviolet light becomes invisible to humans, mostly because it is absorbed by the tissues of the eye and in particular the lens. Furthermore, the rods and cones located at the back of the human eye cannot detect the short ultraviolet wavelengths, and are in fact damaged by ultraviolet rays, a condition known as
snow blindness. [6] Many animals with eyes that do not require lenses (such as insects and shrimp) are able to detect ultraviolet directly, by quantum photon-absorption mechanisms, in much the same chemical way that humans detect visible light.

Optics
The study of light and the interaction of light and matter is termed optics. The observation and study of optical phenomena such as rainbows and the aurora borealis offer many clues as to the nature of light.

Refraction
An example of refraction of light: the straw appears bent, because of refraction of light as it enters liquid from air.
A cloud illuminated by sunlight
Refraction is the bending of light rays when passing through a surface between one transparent material and another. It is described by Snell's law:

n₁ sin θ₁ = n₂ sin θ₂,

where θ₁ is the angle between the ray and the surface normal in the first medium, θ₂ is the angle between the ray and the surface normal in the second medium, and n₁ and n₂ are the indices of refraction (n = 1 in a vacuum and n > 1 in a transparent substance).

When a beam of light crosses the boundary between a vacuum and another medium, or between two different media, the wavelength of the light changes, but the frequency remains constant. If the beam of light is not orthogonal (or rather normal) to the boundary, the change in wavelength results in a change in the direction of the beam. This change of direction is known as refraction. The refractive quality of lenses is frequently used to manipulate light in order to change the apparent size of images. Magnifying glasses, spectacles, contact lenses, microscopes and refracting telescopes are all examples of this manipulation.
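As a quick numeric illustration of Snell's law as just stated, the sketch below computes the refraction angle for light entering water from air. The indices and incidence angle are illustrative values, not taken from the text.

    # Solving Snell's law n1*sin(theta1) = n2*sin(theta2) for the refracted angle.
    # Illustrative values: air (n ≈ 1.00) into water (n ≈ 1.33), 45° incidence.
    import math

    n1, n2 = 1.00, 1.33
    theta1 = math.radians(45.0)                      # angle of incidence
    theta2 = math.asin(n1 * math.sin(theta1) / n2)   # refracted angle from Snell's law
    print(f"refracted angle ≈ {math.degrees(theta2):.1f}°")   # ≈ 32.1°

The ray bends toward the normal on entering the denser medium, which is why the straw in the caption above appears bent at the water's surface.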
Light sources
There are many sources of light. The most common light sources are thermal: a body at a given temperature emits a characteristic spectrum of black-body radiation. A simple thermal source is sunlight: the radiation emitted by the photosphere of the Sun at around 6,000 kelvin peaks in the visible region of the electromagnetic spectrum when plotted in wavelength units, [7] and roughly 44% of the sunlight energy that reaches the ground is visible. [8] Another example is incandescent light bulbs, which emit only around 10% of their energy as visible light and the remainder as infrared. A common thermal light source in history is the glowing solid particles in flames, but these also emit most of their radiation in the infrared, and only a fraction in the visible spectrum. The peak of the black-body spectrum is in the deep infrared, at about 10 micrometre wavelength, for relatively cool objects like human beings. As the temperature increases, the peak shifts to shorter wavelengths, producing first a red glow, then a white one, and finally a blue-white colour as the peak moves out of the visible part of the spectrum and into the ultraviolet. These colours can be seen when metal is heated to "red hot" or "white hot". Blue-white thermal emission is not often seen, except in stars (the commonly seen pure-blue colour in a gas flame or a welder's torch is in fact due to molecular emission, notably by CH radicals emitting a wavelength band around 425 nm, and is not seen in stars or pure thermal radiation).

Atoms emit and absorb light at characteristic energies. This produces "emission lines" in the spectrum of each atom. Emission can be spontaneous, as in light-emitting diodes, gas discharge lamps (such as neon lamps and neon signs, mercury-vapor lamps, etc.), and flames (light from the hot gas itself; so, for example, sodium in a gas flame emits characteristic yellow light). Emission can also be stimulated, as in a laser or a microwave maser.

Deceleration of a free charged particle, such as an electron, can produce visible radiation: cyclotron radiation, synchrotron radiation, and bremsstrahlung radiation are all examples of this. Particles moving through a medium faster than the speed of light in that medium can produce visible Cherenkov radiation. Certain chemicals produce visible radiation by chemiluminescence; in living things, this process is called bioluminescence. For example, fireflies produce light by this means, and boats moving through water can disturb plankton which produce a glowing wake.

Certain substances produce light when they are illuminated by more energetic radiation, a process known as fluorescence. Some substances emit light slowly after excitation by more energetic radiation; this is known as phosphorescence. Phosphorescent materials can also be excited by bombarding them with subatomic particles. Cathodoluminescence is one example; this mechanism is used in cathode ray tube television sets and computer monitors.

A city illuminated by artificial lighting
Certain other mechanisms can produce light:
• bioluminescence
• Cherenkov radiation
• electroluminescence
• scintillation
• sonoluminescence
• triboluminescence
When the concept of light is intended to include very-high-energy photons (gamma rays), additional generation mechanisms include:
• particle–antiparticle annihilation
• radioactive decay

Units and measures
Light is measured with two main alternative sets of units: radiometry consists of measurements of light power at all wavelengths, while photometry measures light with wavelength weighted with respect to a standardised model of human brightness perception. Photometry is useful, for example, to quantify illumination (lighting) intended for human use. The SI units for both systems are summarised in the following tables.
SI radiometry units
Quantity, symbol [9] · Unit (symbol) · Dimension · Notes
• Radiant energy, Qe [10] · joule (J) · M⋅L²⋅T⁻² · energy.
• Radiant flux, Φe [10] · watt (W) · M⋅L²⋅T⁻³ · radiant energy per unit time, also called radiant power.
• Spectral power, Φeλ [10][11] · watt per metre (W⋅m⁻¹) · M⋅L⋅T⁻³ · radiant power per wavelength.
• Radiant intensity, Ie · watt per steradian (W⋅sr⁻¹) · M⋅L²⋅T⁻³ · power per unit solid angle.
• Spectral intensity, Ieλ [11] · watt per steradian per metre (W⋅sr⁻¹⋅m⁻¹) · M⋅L⋅T⁻³ · radiant intensity per wavelength.
• Radiance, Le · watt per steradian per square metre (W⋅sr⁻¹⋅m⁻²) · M⋅T⁻³ · power per unit solid angle per unit projected source area; confusingly called "intensity" in some other fields of study.
• Spectral radiance, Leλ [11] or Leν [12] · watt per steradian per cubic metre (W⋅sr⁻¹⋅m⁻³) or watt per steradian per square metre per hertz (W⋅sr⁻¹⋅m⁻²⋅Hz⁻¹) · M⋅L⁻¹⋅T⁻³ or M⋅T⁻² · commonly measured in W⋅sr⁻¹⋅m⁻²⋅nm⁻¹, with surface area and either wavelength or frequency.
• Irradiance, Ee [10] · watt per square metre (W⋅m⁻²) · M⋅T⁻³ · power incident on a surface, also called radiant flux density; sometimes confusingly called "intensity" as well.
• Spectral irradiance, Eeλ [11] or Eeν [12] · watt per cubic metre (W⋅m⁻³) or watt per square metre per hertz (W⋅m⁻²⋅Hz⁻¹) · M⋅L⁻¹⋅T⁻³ or M⋅T⁻² · commonly measured in W⋅m⁻²⋅nm⁻¹, or in 10⁻²² W⋅m⁻²⋅Hz⁻¹, known as the solar flux unit. [13]
• Radiant exitance / radiant emittance, Me [10] · watt per square metre (W⋅m⁻²) · M⋅T⁻³ · power emitted from a surface.
• Spectral radiant exitance / spectral radiant emittance, Meλ [11] or Meν [12] · watt per cubic metre (W⋅m⁻³) or watt per square metre per hertz (W⋅m⁻²⋅Hz⁻¹) · M⋅L⁻¹⋅T⁻³ or M⋅T⁻² · power emitted from a surface per wavelength or frequency.
• Radiosity, Je or Jeλ [11] · watt per square metre (W⋅m⁻²) · M⋅T⁻³ · emitted plus reflected power leaving a surface.
• Radiant exposure, He · joule per square metre (J⋅m⁻²) · M⋅T⁻².
• Radiant energy density, ωe · joule per cubic metre (J⋅m⁻³) · M⋅L⁻¹⋅T⁻².
See also: SI · Radiometry · Photometry
SI photometry units
Quantity, symbol [14] · Unit (symbol) · Dimension · Notes
• Luminous energy, Qv [15] · lumen second (lm⋅s) · T⋅J [16] · units are sometimes called talbots.
• Luminous flux, Φv [15] · lumen (= cd⋅sr) (lm) · J [16] · also called luminous power.
• Luminous intensity, Iv · candela (= lm/sr) (cd) · J [16] · an SI base unit; luminous flux per unit solid angle.
• Luminance, Lv · candela per square metre (cd/m²) · L⁻²⋅J · units are sometimes called nits.
• Illuminance, Ev · lux (= lm/m²) (lx) · L⁻²⋅J · used for light incident on a surface.
• Luminous emittance, Mv · lux (= lm/m²) (lx) · L⁻²⋅J · used for light emitted from a surface.
• Luminous exposure, Hv · lux second (lx⋅s) · L⁻²⋅T⋅J.
• Luminous energy density, ωv · lumen second per cubic metre (lm⋅s⋅m⁻³) · L⁻³⋅T⋅J.
• Luminous efficacy, η [15] · lumen per watt (lm/W) · M⁻¹⋅L⁻²⋅T³⋅J · ratio of luminous flux to radiant flux.
• Luminous efficiency, V · dimensionless (1) · also called luminous coefficient.
See also: SI · Photometry · Radiometry

The photometry units are different from most systems of physical units in that they take into account how the human eye responds to light. The cone cells in the human eye are of three types which respond differently across the visible spectrum, and the cumulative response peaks at a wavelength of around 555 nm. Therefore, two sources of light which produce the same intensity (W/m²) of visible light do not necessarily appear equally bright. The photometry units are designed to take this into account, and therefore are a better representation of how "bright" a light appears than raw intensity. They relate to raw power by a quantity called luminous efficacy, and are used for purposes such as determining how best to achieve sufficient illumination for various tasks in indoor and outdoor settings. The illumination measured by a photocell sensor does not necessarily correspond to what is perceived by the human eye, and without filters, which may be costly, photocells and charge-coupled devices (CCD) tend to respond to some infrared, ultraviolet, or both.
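A small sketch can make the radiometry-to-photometry weighting concrete. For monochromatic light, luminous flux equals 683 lm/W times the eye's photopic sensitivity V(λ) times the radiant flux; the V(λ) values below are rounded tabulated figures, supplied here as assumptions.

    # Converting radiant flux (watts) to luminous flux (lumens) for
    # monochromatic light: luminous_flux = 683 lm/W * V(lambda) * radiant_flux,
    # where V is the CIE photopic luminosity function (1.0 at the 555 nm peak).
    V = {555: 1.000, 510: 0.503, 650: 0.107}   # approximate photopic sensitivities

    radiant_flux_W = 1.0   # one watt of monochromatic radiant power
    for wavelength_nm, v in V.items():
        luminous_flux_lm = 683.0 * v * radiant_flux_W
        print(f"{wavelength_nm} nm: {luminous_flux_lm:.0f} lm")
    # One watt at 650 nm yields only ~73 lm versus 683 lm at 555 nm: equal
    # radiant power, very different apparent brightness, which is exactly
    # what photometric units are designed to capture.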
Light pressure
Light exerts physical pressure on objects in its path, a phenomenon which can be deduced from Maxwell's equations, but can be more easily explained by the particle nature of light: photons strike and transfer their momentum. Light pressure is equal to the power of the light beam divided by c, the speed of light. Due to the magnitude of c, the effect of light pressure is negligible for everyday objects. For example, a one-milliwatt laser pointer exerts a force of about 3.3 piconewtons on the object being illuminated; thus, one could lift a U.S. penny with laser pointers, but doing so would require about 30 billion 1-mW laser pointers. [17] However, in nanometre-scale applications such as NEMS, the effect of light pressure is more significant, and exploiting light pressure to drive NEMS mechanisms and to flip nanometre-scale physical switches in integrated circuits is an active area of research. [18] At larger scales, light pressure can cause asteroids to spin faster, [19] acting on their irregular shapes as on the vanes of a windmill. The possibility of making solar sails that would accelerate spaceships in space is also under investigation. [20][21]

Although the motion of the Crookes radiometer was originally attributed to light pressure, this interpretation is incorrect; the characteristic Crookes rotation is the result of a partial vacuum. [22] This should not be confused with the Nichols radiometer, in which the (slight) motion caused by torque (though not enough for full rotation against friction) is directly caused by light pressure. [23]
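The laser-pointer figure quoted above is easy to verify from the stated rule that radiation force on a fully absorbed beam equals beam power divided by the speed of light; a minimal check:

    # Radiation force on an absorbing target: F = P / c.
    P = 1e-3            # a 1 mW laser pointer, in watts
    c = 299_792_458.0   # speed of light in vacuum, m/s

    F = P / c
    print(f"F ≈ {F * 1e12:.2f} pN")   # ≈ 3.34 pN, the ~3.3 piconewtons cited above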
Historical theories about light, in chronological order

Classical Greece and Hellenism
In the fifth century BC, Empedocles postulated that everything was composed of four elements: fire, air, earth and water. He believed that Aphrodite made the human eye out of the four elements and that she lit the fire in the eye, which shone out from the eye, making sight possible. If this were true, then one could see during the night just as well as during the day, so Empedocles postulated an interaction between rays from the eyes and rays from a source such as the sun.

In about 300 BC, Euclid wrote Optica, in which he studied the properties of light. Euclid postulated that light travelled in straight lines, and he described the laws of reflection and studied them mathematically. He questioned the idea that sight is the result of a beam from the eye, asking how one sees the stars immediately upon opening one's eyes at night. Of course, if the beam from the eye travels infinitely fast, this is not a problem.

In 55 BC, Lucretius, a Roman who carried on the ideas of earlier Greek atomists, wrote: "The light & heat of the sun; these are composed of minute atoms which, when they are shoved off, lose no time in shooting right across the interspace of air in the direction imparted by the shove." – On the Nature of the Universe. Despite being similar to later particle theories, Lucretius's views were not generally accepted. Ptolemy (c. 2nd century) wrote about the refraction of light in his book Optics. [24]

Classical India
In ancient India, the Hindu schools of Samkhya and Vaisheshika, from around the early centuries CE, developed theories on light. According to the Samkhya school, light is one of the five fundamental "subtle" elements (tanmatra) out of which emerge the gross elements. The atomicity of these elements is not specifically mentioned, and it appears that they were actually taken to be continuous. The Vaisheshika school, on the other hand, gives an atomic theory of the physical world on the non-atomic ground of ether, space and time (see Indian atomism). The basic atoms are those of earth (prthivi), water (pani), fire (agni), and air (vayu). Light rays are taken to be a stream of high-velocity tejas (fire) atoms, and the particles of light can exhibit different characteristics depending on the speed and the arrangements of the tejas atoms. [citation needed] The Vishnu Purana refers to sunlight as "the seven rays of the sun". [citation needed]

The Indian Buddhists, such as Dignāga in the 5th century and Dharmakirti in the 7th century, developed a type of atomism holding that reality is composed of atomic entities that are momentary flashes of light or energy. They viewed light as an atomic entity equivalent to energy. [citation needed]
Descartes was not the first to use mechanical analogies, but because he clearly asserted that light is only a mechanical property of the luminous body and the transmitting medium, his theory of light is regarded as the start of modern physical optics. [25]
Particle theory
Pierre Gassendi (1592–1655), an atomist, proposed a particle theory of light which was published posthumously in the 1660s. Isaac Newton studied Gassendi's work at an early age and preferred his view to Descartes' theory of the plenum. Newton stated in his Hypothesis of Light of 1675 that light was composed of corpuscles (particles of matter) which were emitted in all directions from a source. One of Newton's arguments against the wave nature of light was that waves were known to bend around obstacles, while light travelled only in straight lines. He did, however, explain the phenomenon of the diffraction of light (which had been observed by Francesco Grimaldi) by allowing that a light particle could create a localised wave in the aether.

Newton's theory could be used to predict the reflection of light, but could only explain refraction by incorrectly assuming that light accelerated upon entering a denser medium because the gravitational pull was greater. Newton published the final version of his theory in his Opticks of 1704. His reputation helped the particle theory of light to hold sway during the 18th century. The particle theory of light led Laplace to argue that a body could be so massive that light could not escape from it; in other words, it would become what is now called a black hole. Laplace withdrew his suggestion later, after a wave theory of light became firmly established as the model for light (as has been explained, neither a particle theory nor a wave theory is fully correct). A translation of Newton's essay on light appears in The Large Scale Structure of Space-Time, by Stephen Hawking and George F. R. Ellis.

Wave theory
To explain the origin of colours, Robert Hooke (1635–1703) developed a "pulse theory" and compared the spreading of light to that of waves in water in his 1665 Micrographia ("Observation XI"). In 1672 Hooke suggested that light's vibrations could be perpendicular to the direction of propagation. Christiaan Huygens (1629–1695) worked out a mathematical wave theory of light in 1678, and published it in his Treatise on Light in 1690. He proposed that light was emitted in all directions as a series of waves in a medium called the luminiferous ether. As waves are not affected by gravity, it was assumed that they slowed down upon entering a denser medium. [26]

Thomas Young's sketch of the two-slit experiment showing the diffraction of light. Young's experiments supported the theory that light consists of waves.
The wave theory predicted that light waves could interfere with each other like sound waves (as noted around 1800 by Thomas Young), and that light could be polarised, if it were a transverse wave. Young showed by means of a diffraction experiment that light behaved as waves. He also proposed that different colours were caused by different wavelengths of light, and explained colour vision in terms of three-coloured receptors in the eye. Another supporter of the wave theory was Leonhard Euler. He argued in Nova theoria lucis et colorum (1746) that diffraction could more easily be explained by a wave theory.

Later, Augustin-Jean Fresnel independently worked out his own wave theory of light, and presented it to the Académie des Sciences in 1817. Siméon Denis Poisson added to Fresnel's mathematical work to produce a convincing argument in favour of the wave theory, helping to overturn Newton's corpuscular theory.
By the year 1821, Fresnel was able to show via mathematical methods that polarisation could be explained only by the wave
theory of light, and only if light was entirely transverse, with no longitudinal vibration whatsoever. The weakness of the wave theory was that light waves, like sound waves, would need a medium for transmission. The existence of the hypothetical substance luminiferous aether proposed by Huygens in 1678 was cast into strong doubt in the late nineteenth century by the Michelson–Morley experiment. Newton's corpuscular theory implied that light would travel faster in a denser medium, while the wave theory of Huygens and others implied the opposite. At that time, the speed of light could not be measured accurately enough to decide which theory was correct. The first to make a sufficiently accurate measurement was Léon Foucault, in 1850. [27] His result supported the wave theory, and the classical particle theory was finally abandoned, only to partly re-emerge in the 20th century.

Quantum theory
In 1900 Max Planck, attempting to explain black-body radiation, suggested that although light was a wave, these waves could gain or lose energy only in finite amounts related to their frequency. Planck called these "lumps" of light energy "quanta" (from a Latin word for "how much"). In 1905, Albert Einstein used the idea of light quanta to explain the photoelectric effect, and suggested that these light quanta had a "real" existence. In 1923 Arthur Holly Compton showed that the wavelength shift seen when low-intensity X-rays scattered from electrons (so-called Compton scattering) could be explained by a particle theory of X-rays, but not a wave theory. In 1926 Gilbert N. Lewis named these light-quantum particles photons.

Eventually the modern theory of quantum mechanics came to picture light as (in some sense) both a particle and a wave, and (in another sense) as a phenomenon which is neither a particle nor a wave (which actually are macroscopic phenomena, such as baseballs or ocean waves). Instead, modern physics sees light as something that can be described sometimes with mathematics appropriate to one type of macroscopic metaphor (particles), and sometimes another macroscopic metaphor (water waves), but is actually something that cannot be fully imagined. As in the case for radio waves and the X-rays involved in Compton scattering, physicists have noted that electromagnetic radiation tends to behave more like a classical wave at lower frequencies, but more like a classical particle at higher frequencies, never completely losing all qualities of one or the other. Visible light, which occupies a middle ground in frequency, can easily be shown in experiments to be describable using either a wave or particle model, or sometimes both.

Electromagnetic theory as explanation for all types of visible light and all EM radiation
A linearly polarised light wave frozen in time, showing the two oscillating components of light: an electric field and a magnetic field perpendicular to each other and to the direction of motion (a transverse wave).
In 1845, Michael Faraday discovered that the plane of polarisation of linearly polarised light is rotated when the light rays travel along the magnetic field direction in the presence of a transparent dielectric, an effect now known as Faraday rotation. [] This was the first evidence that light was related to electromagnetism. In 1846 he speculated that light might be some form of disturbance propagating along magnetic field lines.
[] Faraday proposed in 1847 that light was a high-frequency electromagnetic vibration, which could propagate even in the absence of a medium such as the ether.
Faraday's work inspired James Clerk Maxwell to study electromagnetic radiation and light. Maxwell discovered that self-propagating electromagnetic waves would travel through space at a constant speed, which happened to be equal to the previously measured speed of light. From this, Maxwell concluded that light was a form of electromagnetic radiation: he first stated this result in 1862 in On Physical Lines of Force. In 1873, he published A Treatise on Electricity and Magnetism, which contained a full mathematical description of the behaviour of electric and magnetic fields, still known as Maxwell's equations. Soon after, Heinrich Hertz confirmed Maxwell's theory experimentally by generating and detecting radio waves in the laboratory, and demonstrating that these waves behaved exactly like visible light, exhibiting properties such as reflection, refraction, diffraction, and interference. Maxwell's theory and Hertz's experiments led directly to the development of modern radio, radar, television, electromagnetic imaging, and wireless communications.

In the quantum theory, photons are seen as wave packets of the waves described in the classical theory of Maxwell. The quantum theory was needed to explain effects even with visual light that Maxwell's classical theory could not (such as spectral lines).

Notes
[1] CIE (1987). International Lighting Vocabulary (http://www.cie.co.at/publ/abst/17-4-89.html). Number 17.4. CIE, 4th edition. ISBN 978-3-900734-07-7. By the International Lighting Vocabulary, the definition of light is: "Any radiation capable of causing a visual sensation directly."
[6] http://www.yorku.ca/eye/lambdas.htm
[7] http://thulescientific.com/LYNCH%20&%20Soffer%20OPN%201999.pdf
[9] Standards organizations recommend that radiometric quantities should be denoted with a suffix "e" (for "energetic") to avoid confusion with photometric or photon quantities.
[10] Alternative symbols sometimes seen: W or E for radiant energy, P or F for radiant flux, I for irradiance, W for radiant emittance.
[11] Spectral quantities given per unit wavelength are denoted with suffix "λ" (Greek) to indicate a spectral concentration. Spectral functions of wavelength are indicated by "(λ)" in parentheses instead, for example in spectral transmittance, reflectance and responsivity.
[12] Spectral quantities given per unit frequency are denoted with suffix "ν" (Greek), not to be confused with the suffix "v" (for "visual") indicating a photometric quantity.
[13] NOAA / Space Weather Prediction Center (http://www.swpc.noaa.gov/forecast_verification/F10.html) includes a definition of the solar flux unit (SFU).
[14] Standards organizations recommend that photometric quantities be denoted with a suffix "v" (for "visual") to avoid confusion with radiometric or photon quantities.
[15] Alternative symbols sometimes seen: W for luminous energy, P or F for luminous flux, and ρ or K for luminous efficacy.
[16] "J" here is the symbol for the dimension of luminous intensity, not the symbol for the unit joules.
[18] See, for example, nano-opto-mechanical systems research at Yale University (http://www.eng.yale.edu/tanglab/research.htm).
[22] P. Lebedev, Untersuchungen über die Druckkräfte des Lichtes, Ann. Phys. 6, 433 (1901).
[25] Theories of Light, from Descartes to Newton, A. I. Sabra, CUP Archive, 1981, pg 48. ISBN 0-521-28436-8, ISBN 978-0-521-28436-3
[26] Fokko Jan Dijksterhuis, Lenses and Waves: Christiaan Huygens and the Mathematical Science of Optics in the 17th Century (http://books.google.com/books?id=cPFevyomPUIC), Kluwer Academic Publishers, 2004, ISBN 1-4020-2697-8

References
  • 30. Radar 27 Radar A long-range radar antenna, known as ALTAIR, used to detect and track space objects in conjunction with ABM testing at the Ronald Reagan Test Site on Kwajalein Atoll. Israeli military radar is typical of the type of radar used for air traffic control. The antenna rotates at a steady rate, sweeping the local airspace with a narrow vertical fan-shaped beam, to detect aircraft at all altitudes. Radar is an object detection system which uses radio waves to determine the range, altitude, direction, or speed of objects. It can be used to detect aircraft, ships, spacecraft, guided missiles, motor vehicles, weather formations, and terrain. The radar dish or antenna transmits pulses of radio waves or microwaves which bounce off any object in their path. The object returns a tiny part of the wave's energy to a dish or antenna which is usually located at the same site as the transmitter. Radar was secretly developed by several nations before and during World War II. The term RADAR was coined in 1940 by the United States Navy as an acronym for RAdio Detection And Ranging. [1] The term radar has since entered English and other languages as the common noun radar, losing all capitalization. The modern uses of radar are highly diverse, including air traffic control, radar astronomy, air-defense systems, antimissile systems; marine radars to locate landmarks and other ships; aircraft anticollision systems; ocean surveillance systems, outer space surveillance and rendezvous systems; meteorological precipitation monitoring; altimetry and flight control systems; guided missile target locating systems; and ground-penetrating radar for geological observations. High tech radar systems are associated with digital signal processing and are capable of extracting useful information from very high noise levels. Other systems similar to radar make use of other parts of the electromagnetic spectrum. One example is "lidar", which uses visible light from lasers rather than radio waves. History As early as 1886, German physicist Heinrich Hertz showed that radio waves could be reflected from solid objects. In 1895, Alexander Popov, a physics instructor at the Imperial Russian Navy school in Kronstadt, developed an apparatus using a coherer tube for detecting distant lightning strikes. The next year, he added a spark-gap transmitter. In 1897, while testing this equipment for communicating between two ships in the Baltic Sea, he took note of an interference beat caused by the passage of a third vessel. In his report, Popov wrote that this phenomenon might be used for detecting objects, but he did nothing more with this observation. [2] The German inventor Christian Hülsmeyer was the first to use radio waves to detect "the presence of distant metallic objects". In 1904 he demonstrated the feasibility of detecting a ship in dense fog, but not its distance from the transmitter. [3] He obtained a patent [4] for his detection device in April 1904 and later a patent [5] for a related amendment for estimating the distance to the ship. He also got a British patent on September 23, 1904 [6] for a full system, that he called a telemobiloscope.
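Hülsmeyer's telemobiloscope could detect a ship but not yet measure its distance; modern pulsed radar obtains range directly from the echo delay described in the introduction. The sketch below is a minimal illustration of that pulse-echo principle; the delay value is invented for the example.

    # Pulse-echo ranging: the pulse travels out and back, so range = c * delay / 2.
    c = 299_792_458.0   # propagation speed of radio waves, m/s (vacuum value)

    def echo_range_m(delay_s):
        """Range to the first reflecting object, from the round-trip delay."""
        return c * delay_s / 2.0

    print(f"{echo_range_m(1e-3):,.0f} m")   # a 1 ms echo puts the target ~150 km away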
  • 31. Radar 28 A Chain Home tower in Great Baddow, United Kingdom In August 1917 Nikola Tesla outlined a concept for primitive radar-like units. [7] He stated, ...by their [standing electromagnetic waves] use we may produce at will, from a sending station, an electrical effect in any particular region of the globe; [with which] we may determine the relative position or course of a moving object, such as a vessel at sea, the distance traversed by the same, or its speed. In 1922 A. Hoyt Taylor and Leo C. Young, researchers working with the U.S. Navy, had a transmitter and a receiver on opposite sides of the Potomac River and discovered that a ship passing through the beam path caused the received signal to fade in and out. Taylor submitted a report, suggesting that this might be used to detect the presence of ships in low visibility, but the Navy did not immediately continue the work. Eight years later, Lawrence A. Hyland at the Naval Research Laboratory observed similar fading effects from a passing aircraft; this led to a patent application [8] as well as a proposal for serious work at the NRL (Taylor and Young were then at this laboratory) on radio-echo signals from moving targets. [9] Before the Second World War, researchers in France, Germany, Italy, Japan, the Netherlands, the Soviet Union, the United Kingdom, and the United States, independently and in great secrecy, developed technologies that led to the modern version of radar. Australia, Canada, New Zealand, and South Africa followed prewar Great Britain, and Hungary had similar developments during the war. [10] In 1934 the Frenchman Émile Girardeau stated he was building an obstacle-locating radio apparatus "conceived according to the principles stated by Tesla" and obtained a patent for a working system, [11][12][13] a part of which was installed on the Normandie liner in 1935. [14] During the same time, the Soviet military engineer P.K.Oschepkov, in collaboration with Leningrad Electrophysical Institute, produced an experimental apparatus, RAPID, capable of detecting an aircraft within 3 km of a receiver. [15] The French and Soviet systems, however, had continuous-wave operation and could not give the full performance that was ultimately at the center of modern radar. Full radar evolved as a pulsed system, and the first such elementary apparatus was demonstrated in December 1934 by the American Robert M. Page, working at the Naval Research Laboratory. [16] The following year, the United States Army successfully tested a primitive surface-to-surface radar to aim coastal battery search lights at night. [17] This was followed by a pulsed system demonstrated in May 1935 by Rudolf Kühnhold and the firm GEMA in Germany and then one in June 1935 by an Air Ministry team led by Robert A. Watson Watt in Great Britain. Development of radar greatly expanded on 1 September 1936 when Watson-Watt became Superintendent of a new establishment under the British Air Ministry, Bawdsey Research Station located in Bawdsey Manor, near Felixstowe, Suffolk. Work there resulted in the design and installation of aircraft detection and tracking stations called Chain Home along the East and South coasts of England in time for the outbreak of World War II in 1939. This system provided the vital advance information that helped the Royal Air Force win the Battle of Britain. The British were the first to fully exploit radar as a defence against aircraft attack. This was spurred on by fears that the Germans were developing death rays. 
[] The Air Ministry asked British scientists in 1934 to investigate the possibility of propagating electromagnetic energy and the likely effect. Following a study, they concluded that a death ray was impractical but that detection of aircraft appeared feasible. [] Robert Watson Watt's team demonstrated to his superiors the capabilities of a working prototype and then patented the device. [13][18][19] It served as the basis for the Chain Home network of radars to defend Great Britain, which detected approaching German aircraft in the Battle of Britain in 1940.