
Introduction to Remote Sensing

page 1

Remote Sensing of Environment (RSE)
with TNTmips® and TNTview®

page 2

Before Getting Started

You can print or read this booklet in color from MicroImages’ Web site. The Web site is also your source for the newest tutorial booklets on other topics. You can download an installation guide, sample data, and the latest version of TNTmips.

http://www.microimages.com

Imagery acquired by airborne or satellite sensors provides an important source of information for mapping and monitoring the natural and manmade features on the land surface. Interpretation and analysis of remotely sensed imagery requires an understanding of the processes that determine the relationships between the property the sensor actually measures and the surface properties we are interested in identifying and studying. Knowledge of these relationships is a prerequisite for appropriate processing and interpretation. This booklet presents a brief overview of the major fundamental concepts related to remote sensing of environmental features on the land surface.

Sample Data  The illustrations in this booklet show many examples of remote sensing imagery. You can find many additional examples of imagery in the sample data that is distributed with the TNT products. If you do not have access to a TNT products CD, you can download the data from MicroImages’ Web site. In particular, the CB_DATA, SF_DATA, BEREA, and COMBRAST data collections include sample files with remote sensing imagery that you can view and study.

More Documentation  This booklet is intended only as an introduction to basic concepts governing the acquisition, processing, and interpretation of remote sensing imagery. You can view all types of imagery in TNTmips using the standard Display process, which is introduced in the tutorial booklet entitled Displaying Geospatial Data. Many other processes in TNTmips can be used to process, enhance, or analyze imagery. Some of the most important ones are mentioned on the appropriate pages in this booklet, along with a reference to an accompanying tutorial booklet.

TNTmips® Pro and TNTmips Free  TNTmips (the Map and Image Processing System) comes in three versions: the professional version of TNTmips (TNTmips Pro), the low-cost TNTmips Basic version, and the TNTmips Free version. All versions run exactly the same code from the TNT products DVD and have nearly the same features. If you did not purchase the professional version (which requires a software license key) or TNTmips Basic, then TNTmips operates in TNTmips Free mode.

Randall B. Smith, Ph.D., 4 January 2012
© MicroImages, Inc., 2001–2012

page 3

Introduction to Remote Sensing

Remote sensing is the science of obtaining and interpreting information from a distance, using sensors that are not in physical contact with the object being observed. Though you may not realize it, you are familiar with many examples. Biological evolution has exploited many natural phenomena and forms of energy to enable animals (including people) to sense their environment. Your eyes detect electromagnetic energy in the form of visible light. Your ears detect acoustic (sound) energy, while your nose contains sensitive chemical receptors that respond to minute amounts of airborne chemicals given off by the materials in our surroundings. Some research suggests that migrating birds can sense variations in Earth’s magnetic field, which helps explain their remarkable navigational ability.

The science of remote sensing in its broadest sense includes aerial, satellite, and spacecraft observations of the surfaces and atmospheres of the planets in our solar system, though the Earth is obviously the most frequent target of study. The term is customarily restricted to methods that detect and measure electromagnetic energy, including visible light, that has interacted with surface materials and the atmosphere. Remote sensing of the Earth has many purposes, including making and updating planimetric maps, weather forecasting, and gathering military intelligence. Our focus in this booklet will be on remote sensing of the environment and resources of Earth’s surface. We will explore the physical concepts that underlie the acquisition and interpretation of remotely sensed images, the important characteristics of images from different types of sensors, and some common methods of processing images to enhance their information content.

Fundamental concepts of electromagnetic radiation and its interactions with surface materials and the atmosphere are introduced on pages 4-9. Image acquisition and various concepts of image resolution are discussed on pages 10-16. Pages 17-23 focus on images acquired in the spectral range from visible to middle infrared radiation, including visual image interpretation and common processes used to correct or enhance the information content of multispectral images. Pages 23-24 discuss images acquired on multiple dates and their spatial registration and normalization. You can learn some basic concepts of thermal infrared imagery on pages 26-27, and radar imagery on pages 28-29. Page 30 presents an example of combining images from different sensors. Sources of additional information on remote sensing are listed on page 31.

[Figure: Artist’s depiction of the Landsat 7 satellite in orbit, courtesy of NASA. Launched in late 1999, this satellite acquires multispectral images using reflected visible and infrared radiation.]

page 4

The Electromagnetic Spectrum

The field of remote sensing began with aerial photography, using visible light from the sun as the energy source. But visible light makes up only a small part of the electromagnetic spectrum, a continuum that ranges from high energy, short wavelength gamma rays, to lower energy, long wavelength radio waves. Illustrated below is the portion of the electromagnetic spectrum that is useful in remote sensing of the Earth’s surface.

The Earth is naturally illuminated by electromagnetic radiation from the Sun. The peak solar energy is in the wavelength range of visible light (between 0.4 and 0.7 µm). It’s no wonder that the visual systems of most animals are sensitive to these wavelengths! Although visible light includes the entire range of colors seen in a rainbow, a cruder subdivision into blue, green, and red wavelength regions is sufficient in many remote sensing studies. Other substantial fractions of incoming solar energy are in the form of invisible ultraviolet and infrared radiation. Only tiny amounts of solar radiation extend into the microwave region of the spectrum. Imaging radar systems used in remote sensing generate and broadcast microwaves, then measure the portion of the signal that has returned to the sensor from the Earth’s surface.

Electromagnetic radiation behaves in part as wavelike energy fluctuations traveling at the speed of light. The wave is actually composite, involving electric and magnetic fields fluctuating at right angles to each other and to the direction of travel.

A fundamental descriptive feature of a waveform is its wavelength, or distance between succeeding peaks or troughs. In remote sensing, wavelength is most often measured in micrometers, each of which equals one millionth of a meter. The variation in wavelength of electromagnetic radiation is so vast that it is usually shown on a logarithmic scale.

UNITS
1 micrometer (µm) = 1 × 10⁻⁶ meters
1 millimeter (mm) = 1 × 10⁻³ meters
1 centimeter (cm) = 1 × 10⁻² meters
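These unit relations are easy to encode; a minimal Python sketch (the constant and function names are ours, purely illustrative):

```python
# Conversion factors from the UNITS table: exact powers of ten.
METERS_PER_MICROMETER = 1e-6
METERS_PER_MILLIMETER = 1e-3
METERS_PER_CENTIMETER = 1e-2

def micrometers_to_meters(wavelength_um):
    """Convert a wavelength expressed in micrometers to meters."""
    return wavelength_um * METERS_PER_MICROMETER

# Green light, near the solar peak at about 0.5 µm:
print(micrometers_to_meters(0.5))  # 5e-07
```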

[Figure: The electromagnetic spectrum on a logarithmic wavelength scale from 0.1 µm to 1 m, spanning the ultraviolet, visible (blue, green, and red, 0.4-0.7 µm), infrared, and microwave (radar) regions. Energy incoming from the Sun peaks in the visible range; energy emitted by the Earth lies in the infrared.]

page 5

Interaction Processes

Remote sensors measure electromagnetic (EM) radiation that has interacted with the Earth’s surface. Interactions with matter can change the direction, intensity, wavelength content, and polarization of EM radiation. The nature of these changes is dependent on the chemical make-up and physical structure of the material exposed to the EM radiation. Changes in EM radiation resulting from its interactions with the Earth’s surface therefore provide major clues to the characteristics of the surface materials.

The fundamental interactions between EM radiation and matter are diagrammed to the right. Electromagnetic radiation that is transmitted passes through a material (or through the boundary between two materials) with little change in intensity. Materials can also absorb EM radiation. Usually absorption is wavelength-specific: that is, more energy is absorbed at some wavelengths than at others. EM radiation that is absorbed is transformed into heat energy, which raises the material’s temperature. Some of that heat energy may then be emitted as EM radiation at a wavelength dependent on the material’s temperature. The lower the temperature, the longer the wavelength of the emitted radiation. As a result of solar heating, the Earth’s surface emits energy in the form of longer-wavelength infrared radiation (see illustration on the preceding page). For this reason the portion of the infrared spectrum with wavelengths greater than 3 µm is commonly called the thermal infrared region.
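The inverse relation between temperature and emitted wavelength is quantified by Wien's displacement law, a standard blackbody result the booklet does not state explicitly; a minimal sketch:

```python
# Wien's displacement law: the wavelength of peak blackbody emission is
# inversely proportional to absolute temperature. The constant is the
# standard value in micrometer-kelvins.
WIEN_CONSTANT_UM_K = 2898.0

def peak_emission_wavelength_um(temperature_k):
    """Peak emission wavelength (µm) for a blackbody at temperature_k."""
    return WIEN_CONSTANT_UM_K / temperature_k

# The Sun (~5800 K) peaks in visible light; Earth's surface (~300 K)
# peaks near 10 µm, in the thermal infrared region named in the text.
print(round(peak_emission_wavelength_um(5800), 2))  # 0.5
print(round(peak_emission_wavelength_um(300), 1))   # 9.7
```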

Electromagnetic radiation encountering a boundary such as the Earth’s surface can also be reflected. If the surface is smooth at a scale comparable to the wavelength of the incident energy, specular reflection occurs: most of the energy is reflected in a single direction, at an angle equal to the angle of incidence. Rougher surfaces cause scattering, or diffuse reflection in all directions.

[Figure: Matter - EM energy interaction processes. The horizontal line in each diagram represents a boundary between two materials: specular reflection, scattering (diffuse reflection), absorption, emission, and transmission.]

page 6

Interaction Processes in Remote Sensing

[Figure: Typical EMR interactions in the atmosphere and at the Earth’s surface.]

To understand how different interaction processes impact the acquisition of aerial and satellite images, let’s analyze the reflected solar radiation that is measured at a satellite sensor. As sunlight initially enters the atmosphere, it encounters gas molecules, suspended dust particles, and aerosols. These materials tend to scatter a portion of the incoming radiation in all directions, with shorter wavelengths experiencing the strongest effect. (The preferential scattering of blue light in comparison to green and red light accounts for the blue color of the daytime sky. Clouds appear opaque because of intense scattering of visible light by tiny water droplets.) Although most of the remaining light is transmitted to the surface, some atmospheric gases are very effective at absorbing particular wavelengths. (The absorption of dangerous ultraviolet radiation by ozone is a well-known example.) As a result of these effects, the illumination reaching the surface is a combination of highly filtered solar radiation transmitted directly to the ground and more diffuse light scattered from all parts of the sky, which helps illuminate shadowed areas.
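The wavelength dependence described here follows the classic Rayleigh law for molecular scattering (scattering strength proportional to 1/λ⁴; standard physics, not stated in the booklet). A sketch with illustrative wavelengths:

```python
# Rayleigh scattering by gas molecules varies as 1/wavelength**4, so
# short (blue) wavelengths scatter far more strongly than long (red) ones.
def relative_rayleigh_scattering(wavelength_um, reference_um=0.45):
    """Scattering strength relative to that at a reference wavelength."""
    return (reference_um / wavelength_um) ** 4

# Red light (~0.65 µm) scatters only about a quarter as strongly as
# blue light (~0.45 µm) -- hence the blue daytime sky.
print(round(relative_rayleigh_scattering(0.65), 2))  # 0.23
```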

As this modified solar radiation reaches the ground, it may encounter soil, rock surfaces, vegetation, or other materials that absorb a portion of the radiation. The amount of energy absorbed varies with wavelength for each material in a characteristic way, creating a sort of spectral signature. (The selective absorption of different wavelengths of visible light determines what we perceive as a material’s color.) Most of the radiation not absorbed is diffusely reflected (scattered) back up into the atmosphere, some of it in the direction of the satellite. This upwelling radiation undergoes a further round of scattering and absorption as it passes through the atmosphere before finally being detected and measured by the sensor. If the sensor is capable of detecting thermal infrared radiation, it will also pick up radiation emitted by surface objects as a result of solar heating.


page 7

Atmospheric Effects

Scattering and absorption of EM radiation by the atmosphere have significant effects that impact sensor design as well as the processing and interpretation of images. When the concentration of scattering agents is high, scattering produces the visual effect we call haze. Haze increases the overall brightness of a scene and reduces the contrast between different ground materials. A hazy atmosphere scatters some light upward, so a portion of the radiation recorded by a remote sensor, called path radiance, is the result of this scattering process. Since the amount of scattering varies with wavelength, so does the contribution of path radiance to remotely sensed images. As shown by the figure to the right, the path radiance effect is greatest for the shortest wavelengths, falling off rapidly with increasing wavelength. When images are captured over several wavelength ranges, the differential path radiance effect complicates comparison of brightness values at the different wavelengths. Simple methods for correcting for path radiance are discussed later in this booklet.
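One common simple correction is dark-object (haze) subtraction, sketched below under the assumption that each band's darkest pixels (deep shadow or clear water) would be near zero without path radiance; the specific procedures in TNTmips may differ.

```python
# Dark-object subtraction: treat the minimum value in a band as the
# path-radiance offset and subtract it from every cell, so band-to-band
# brightness comparisons are less biased by haze.
def dark_object_subtract(band_values):
    """Subtract the band minimum (assumed haze offset) from each value."""
    offset = min(band_values)
    return [value - offset for value in band_values]

hazy_blue_band = [38, 41, 52, 90, 120]        # hypothetical 8-bit values
print(dark_object_subtract(hazy_blue_band))   # [0, 3, 14, 52, 82]
```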

The atmospheric components that are effective absorbers of solar radiation are water vapor, carbon dioxide, and ozone. Each of these gases tends to absorb energy in specific wavelength ranges. Some wavelengths are almost completely absorbed. Consequently, most broad-band remote sensors have been designed to detect radiation in the “atmospheric windows”, those wavelength ranges for which absorption is minimal, and, conversely, transmission is high.

[Figure: Relative scattering versus wavelength (0.4-1.0 µm). The colored area shows the range of scattering for typical atmospheric conditions. Scattering increases with increasing humidity and particulate load but decreases with increasing wavelength. In most cases the path radiance produced by scattering is negligible at wavelengths longer than the near infrared.]

[Figure: Atmospheric transmission (%) versus wavelength from 0.3 µm to 1 m, spanning the ultraviolet, visible, near infrared, middle infrared, thermal infrared, and microwave regions. Variation in atmospheric transmission with wavelength of EM radiation is due to wavelength-selective absorption by atmospheric gases. Only wavelength ranges with moderate to high transmission values are suitable for use in remote sensing.]

page 8

All remote sensing systems designed to monitor the Earth’s surface rely on energy that is either diffusely reflected by or emitted from surface features. Current remote sensing systems fall into three categories on the basis of the source of the electromagnetic radiation and the relevant interactions of that energy with the surface.

Reflected solar radiation sensors  These sensor systems detect solar radiation that has been diffusely reflected (scattered) upward from surface features. The wavelength ranges that provide useful information include the ultraviolet, visible, near infrared and middle infrared ranges. Reflected solar sensing systems discriminate materials that have differing patterns of wavelength-specific absorption, which relate to the chemical make-up and physical structure of the material. Because they depend on sunlight as a source, these systems can only provide useful images during daylight hours, and changing atmospheric conditions and changes in illumination with time of day and season can pose interpretive problems. Reflected solar remote sensing systems are the most common type used to monitor Earth resources, and are the primary focus of this booklet.

Thermal infrared sensors  Sensors that can detect the thermal infrared radiation emitted by surface features can reveal information about the thermal properties of these materials. Like reflected solar sensors, these are passive systems that rely on solar radiation as the ultimate energy source. Because the temperature of surface features changes during the day, thermal infrared sensing systems are sensitive to the time of day at which the images are acquired.

Imaging radar sensors  Rather than relying on a natural source, these “active” systems “illuminate” the surface with broadcast microwave radiation, then measure the energy that is diffusely reflected back to the sensor. The returning energy provides information about the surface roughness and water content of surface materials and the shape of the land surface. Long-wavelength microwaves suffer little scattering in the atmosphere, even penetrating thick cloud cover. Imaging radar is therefore particularly useful in cloud-prone tropical regions.

EMR Sources, Interactions, and Sensors

[Figures: a reflected red image, a thermal infrared image, and a radar image.]

page 9

Spectral Signatures

The spectral signatures produced by wavelength-dependent absorption provide the key to discriminating different materials in images of reflected solar energy. The property used to quantify these spectral signatures is called spectral reflectance: the ratio of reflected energy to incident energy as a function of wavelength. The spectral reflectance of different materials can be measured in the laboratory or in the field, providing reference data that can be used to interpret images. As an example, the illustration below shows contrasting spectral reflectance curves for three very common natural materials: dry soil, green vegetation, and water.

The reflectance of dry soil rises uniformly through the visible and near infrared wavelength ranges, peaking in the middle infrared range. It shows only minor dips in the middle infrared range due to absorption by clay minerals. Green vegetation has a very different spectrum. Reflectance is relatively low in the visible range, but is higher for green light than for red or blue, producing the green color we see. The reflectance pattern of green vegetation in the visible wavelengths is due to selective absorption by chlorophyll, the primary photosynthetic pigment in green plants. The most noticeable feature of the vegetation spectrum is the dramatic rise in reflectance across the visible-near infrared boundary, and the high near infrared reflectance. Infrared radiation penetrates plant leaves, and is intensely scattered by the leaves’ complex internal structure, resulting in high reflectance. The dips in the middle infrared portion of the plant spectrum are due to absorption by water. Deep clear water bodies effectively absorb all wavelengths longer than the visible range, which results in very low reflectivity for infrared radiation.
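The reflectance contrasts described above are what make simple spectral discrimination possible. A toy sketch (the reflectance numbers and threshold rules are illustrative assumptions, not measured values):

```python
# Spectral reflectance is reflected energy divided by incident energy.
def reflectance(reflected_energy, incident_energy):
    return reflected_energy / incident_energy

# Crude rules built from the curve shapes in the text: water is very dark
# in the near infrared, vegetation is far brighter in NIR than in red.
def classify(red, nir):
    if nir < 0.05:
        return "clear water"
    if nir > 2 * red:
        return "green vegetation"
    return "dry soil"

print(reflectance(50.0, 100.0))      # 0.5
print(classify(red=0.05, nir=0.50))  # green vegetation
print(classify(red=0.03, nir=0.01))  # clear water
print(classify(red=0.25, nir=0.30))  # dry soil
```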

[Figure: Spectral reflectance curves from 0.4 to 2.6 µm for a clear water body, green vegetation, and dry bare soil, spanning the blue, green, and red visible bands and the reflected infrared (near infrared and middle infrared) ranges.]

page 10

Image Acquisition

[Figure: A small portion of an image raster, shown as a grid of numerical brightness values (e.g. 52, 71, 74, 102, ...).]

We have seen that the radiant energy that is measured by an aerial or satellite sensor is influenced by the radiation source, interaction of the energy with surface materials, and the passage of the energy through the atmosphere. In addition, the illumination geometry (source position, surface slope, slope direction, and shadowing) can also affect the brightness of the upwelling energy. Together these effects produce a composite “signal” that varies spatially and with the time of day or season. In order to produce an image which we can interpret, the remote sensing system must first detect and measure this energy.

The electromagnetic energy returned from the Earth’s surface can be detected by a light-sensitive film, as in aerial photography, or by an array of electronic sensors. Light striking photographic film causes a chemical reaction, with the rate of the reaction varying with the amount of energy received by each point on the film. Developing the film converts the pattern of energy variations into a pattern of lighter and darker areas that can be interpreted visually.

Electronic sensors generate an electrical signal with a strength proportional to the amount of energy received. The signal from each detector in an array can be recorded and transmitted electronically in digital form (as a series of numbers). Today’s digital still and video cameras are examples of imaging systems that use electronic sensors. All modern satellite imaging systems also use some form of electronic detectors.

An image from an electronic sensor array (or a digitally scanned photograph) consists of a two-dimensional rectangular grid of numerical values that represent differing brightness levels. Each value represents the average brightness for a portion of the surface, represented by the square unit areas in the image. In computer terms the grid is commonly known as a raster, and the square units are cells or pixels. When displayed on your computer, the brightness values in the image raster are translated into display brightness on the screen.
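The raster idea can be shown with a plain list of lists (values taken from the brightness-grid illustration; real image processing would use an array library):

```python
# A raster: a 2-D grid of brightness values, one per square ground cell.
raster = [
    [52, 71, 74, 102],
    [64, 89, 125, 90],
    [66, 87, 87, 80],
]

rows = len(raster)
columns = len(raster[0])
print(rows, columns)   # 3 4
print(raster[1][2])    # brightness of the cell in row 1, column 2: 125
```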

page 11

Spatial Resolution

The spatial, spectral, and temporal components of an image or set of images all provide information that we can use to form interpretations about surface materials and conditions. For each of these properties we can define the resolution of the images produced by the sensor system. These image resolution factors place limits on what information we can derive from remotely sensed images.

Spatial resolution is a measure of the spatial detail in an image, which is a function of the design of the sensor and its operating altitude above the surface. Each of the detectors in a remote sensor measures energy received from a finite patch of the ground surface. The smaller these individual patches are, the more detailed will be the spatial information that we can interpret from the image. For digital images, spatial resolution is most commonly expressed as the ground dimensions of an image cell.

Shape is one visual factor that we can use to recognize and identify objects in an image. Shape is usually discernible only if the object dimensions are several times larger than the cell dimensions. On the other hand, objects smaller than the image cell size may be detectable in an image. If such an object is sufficiently brighter or darker than its surroundings, it will dominate the averaged brightness of the image cell it falls within, and that cell will contrast in brightness with the adjacent cells. We may not be able to identify what the object is, but we can see that something is present that is different from its surroundings, especially if the “background” area is relatively uniform. Spatial context may also allow us to recognize linear features that are narrower than the cell dimensions, such as roads or bridges over water. Evidently there is no clear dimensional boundary between detectability and recognizability in digital images.
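The sub-cell detectability argument can be made concrete with an area-weighted average (the numbers are ours, purely illustrative):

```python
# A cell's recorded value is the average brightness over its ground area,
# so a small, very bright object raises the whole cell's value.
def cell_brightness(background, object_brightness, object_fraction):
    """Area-weighted average brightness of one image cell."""
    return (1 - object_fraction) * background + object_fraction * object_brightness

# A bright object covering 20% of an otherwise dark (level-30) cell:
print(round(cell_brightness(30, 230, 0.2)))  # 70 -- contrasts with neighbors at 30
```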

[Figure: A portion of a Landsat Thematic Mapper scene showing part of San Francisco, California, with a cell size of 28.5 meters; only larger buildings and roads are clearly recognizable. The boxed area is shown below left in an IKONOS image with a cell size of 4 meters, in which trees, smaller buildings, and narrower streets are recognizable. The bottom image shows the boxed area of the Thematic Mapper scene enlarged to the same scale as the IKONOS image, revealing the larger cells in the Landsat image.]

page 12

Spectral Resolution

The spectral resolution of a remote sensing system can be described as its ability to distinguish different parts of the range of measured wavelengths. In essence, this amounts to the number of wavelength intervals (“bands”) that are measured, and how narrow each interval is. An “image” produced by a sensor system can consist of one very broad wavelength band, a few broad bands, or many narrow wavelength bands. The names usually used for these three image categories are panchromatic, multispectral, and hyperspectral, respectively.

Aerial photographs taken using black and white film record an average response over the entire visible wavelength range (blue, green, and red). Because this film is sensitive to all visible colors, it is called panchromatic film. A panchromatic image reveals spatial variations in the gross visual properties of surface materials, but does not allow spectral discrimination. Some satellite remote sensing systems record a single very broad band to provide a synoptic overview of the scene, commonly at a higher spatial resolution than other sensors on board. Despite varying wavelength ranges, such bands are also commonly referred to as panchromatic bands. For example, the sensors on the first three SPOT satellites included a panchromatic band with a spectral range of 0.51 to 0.73 micrometers (green and red wavelength ranges). This band has a spatial resolution of 10 meters, in contrast to the 20-meter resolution of the multispectral sensor bands. The panchromatic band of the Enhanced Thematic Mapper Plus sensor aboard NASA’s Landsat 7 satellite covers a wider spectral range of 0.52 to 0.90 micrometers (green, red, and near infrared), with a spatial resolution of 15 meters (versus 30 meters for the sensor’s multispectral bands).

[Figure: SPOT panchromatic image of part of Seattle, Washington. This image band spans the green and red wavelength ranges. Water and vegetation appear dark, while the brightest objects are building roofs and a large circular tank.]

page 13

Multispectral Images

In order to provide increased spectral discrimination, remote sensing systems designed to monitor the surface environment employ a multispectral design: parallel sensor arrays detecting radiation in a small number of broad wavelength bands. Most satellite systems use from three to six spectral bands in the visible to middle infrared wavelength region. Some systems also employ one or more thermal infrared bands. Bands in the infrared range are limited in width to avoid atmospheric water vapor absorption effects that significantly degrade the signal in certain wavelength intervals (see the previous page Atmospheric Effects). These broad-band multispectral systems allow discrimination of different types of vegetation, rocks and soils, clear and turbid water, and some man-made materials.

A three-band sensor with green, red, and near infrared bands is effective at discriminating vegetated and nonvegetated areas. The HRV sensor aboard the French SPOT (Système Probatoire d’Observation de la Terre) 1, 2, and 3 satellites (20-meter spatial resolution) has this design. Color-infrared film used in some aerial photography provides similar spectral coverage, with the red emulsion recording near infrared, the green emulsion recording red light, and the blue emulsion recording green light. The IKONOS satellite from Space Imaging (4-meter resolution) and the LISS II sensor on the Indian Research Satellites IRS-1A and 1B (36-meter resolution) add a blue band to provide complete coverage of the visible light range, and allow natural-color band composite images to be created. The Landsat Thematic Mapper (Landsat 4 and 5) and Enhanced Thematic Mapper Plus (Landsat 7) sensors add two bands in the middle infrared (MIR). Landsat TM band 5 (1.55 to 1.75 µm) and band 7 (2.08 to 2.35 µm) are sensitive to variations in the moisture content of vegetation and soils. Band 7 also covers a range that includes spectral absorption features found in several important types of minerals. An additional TM band (band 6) records part of the thermal infrared wavelength range (10.4 to 12.5 µm). (Bands 6 and 7 are not in wavelength order because band 7 was added late in the sensor design process.) Current multispectral satellite sensor systems with spatial resolution better than 200 meters are compared on the following pages.

To provide even greater spectral resolution, so-called hyperspectral sensors make measurements in dozens to hundreds of adjacent, narrow wavelength bands (as little as 0.1 µm in width). For more information on these systems, see the booklet Introduction to Hyperspectral Imaging.
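The color-infrared band-to-emulsion assignment described above maps directly to display channels; a sketch for a single pixel (the function name and values are ours, purely illustrative):

```python
# Color-infrared rendering: near infrared displayed as red, red as green,
# and green as blue, mirroring the CIR film emulsions described in the text.
def color_infrared_composite(green, red, nir):
    """Return the (R, G, B) display triple for a color-infrared rendering."""
    return (nir, red, green)

# Healthy vegetation (dark in red, very bright in NIR) renders bright red:
print(color_infrared_composite(green=40, red=30, nir=200))  # (200, 30, 40)
```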

page 14

Multispectral Satellite Sensors

Platform / Sensor / Launch Yr. | Image Cell Size | Image Size (Cross x Along-Track) | Spec. Bands | Visible Bands (µm) | Near IR Bands (µm)
Ikonos-2 / VNIR / 1999 | 4 m | 11 x 11 km | 4 | B 0.45-0.52, G 0.52-0.60, R 0.63-0.69 | 0.76-0.90
Terra (EOS-AM-1) / ASTER / 1999 | 15 m (Vis, NIR), 30 m (MIR), 90 m (TIR) | 60 x 60 km | 14 | G 0.52-0.60, R 0.63-0.69 | 0.76-0.86
SPOT 4 / HRVIR (XS) / 1999 | 20 m | 60 x 60 km | 4 | G 0.50-0.59, R 0.61-0.68 | 0.79-0.89
Landsat 7 / ETM+ / 1999 | 30 m | 185 x 170 km | 7 | B 0.45-0.515, G 0.525-0.605, R 0.63-0.69 | 0.75-0.90
Landsat 4, 5 / TM / 1982 | 30 m | 185 x 170 km | 7 | B 0.45-0.52, G 0.52-0.60, R 0.63-0.69 | 0.76-0.90
SPOT 5 / HRG / 2002 | 10 m (Vis, NIR), 20 m (MIR) | 60 x 60 km | 4 | G 0.50-0.59, R 0.61-0.68 | 0.79-0.89
QuickBird / 2001 | 2.4 or 2.8 m | 16.5 x 16.5 km | 4 | B 0.45-0.52, G 0.52-0.60, R 0.63-0.69 | 0.76-0.90
RapidEye / 2008 | 6.5 m | 77 km | 5 | B 0.44-0.51, G 0.52-0.59, R 0.63-0.685 | 0.69-0.73, 0.76-0.85
GeoEye-1 / 2008 | 1.65 m | 15 x 15 km | 4 | B 0.45-0.51, G 0.51-0.58, R 0.655-0.69 | 0.78-0.92
ResourceSAT-2 / 2011 (LISS-4) | 5.8 m | 70 km | 3 | G 0.52-0.59, R 0.62-0.68 | 0.77-0.86
ResourceSAT-2 / 2011 (LISS-3) | 23.5 m | — | 3 | G 0.52-0.59, R 0.62-0.68 | 0.77-0.86
WorldView-2 / 2009 | 1.8 m | 16.4 km | 8 | 0.40-0.45, B 0.45-0.51, G 0.51-0.58, Y 0.585-0.625, R 0.655-0.69 | 0.705-0.745, 0.860-1.04

Ikonos-2: Space Imaging, Inc., USA. Terra, Landsat: NASA, USA. SPOT: Centre National d’Etudes Spatiales (CNES), France. ResourceSAT-2: Indian Space Research Org. QuickBird, WorldView: DigitalGlobe, Inc., USA.

page 15

Satellite Sensors Table (Continued)

Platform / Sensor / Launch Yr. | Mid. IR Bands (µm) | Thermal IR Bands (µm) | Panchrom. Band Range (µm) | Pan Cell Size | Nominal Revisit Interval*
Ikonos-2 / VNIR / 1999 | None | None | 0.45-0.90 (B, G, R, NIR) | 1 m | 11 days (2.9 days†)
Terra (EOS-AM-1) / ASTER / 1999 | 1.60-1.70, 2.145-2.185, 2.185-2.225, 2.235-2.285, 2.295-2.365, 2.36-2.43 | 8.125-8.475, 8.475-8.825, 8.925-9.275, 10.25-10.95, 10.95-11.65 | None | — | 16 days
SPOT 4 / HRVIR (XS) / 1999 | 1.58-1.75 | None | 0.61-0.68 (R) | 10 m | 26 days (5 days†)
Landsat 7 / ETM+ / 1999 | 1.55-1.75, 2.09-2.35 | 10.40-12.50 | 0.52-0.90 (G, R, NIR) | 15 m | 16 days
Landsat 4, 5 / TM / 1982 | 1.55-1.75, 2.08-2.35 | 10.40-12.50 | None | — | 16 days
SPOT 5 / HRG / 2002 | 1.58-1.75 | None | 0.51-0.73 (G, R) | 5 m | 26 days (3 days†)
QuickBird / 2001 | None | None | 0.45-0.90 (B, G, R, NIR) | 0.6 or 0.7 m | (3.5 days†)
RapidEye / 2008 | None | None | None | — | 5.5 days (1 day†)
GeoEye-1 / 2008 | None | None | 0.45-0.80 (B, G, R, NIR) | 0.41 m | 5.5 days (1 day†)
ResourceSAT-2 / 2011 (LISS-4) | None | None | None | — | 24 days (5 days†)
ResourceSAT-2 / 2011 (LISS-3) | 1.55-1.70 | None | None | — | —
WorldView-2 / 2009 | None | None | 0.45-0.80 (B, G, R, NIR) | 0.41 m | 3.7 days (1.1 day†)

* Single satellite, nadir view at equator. † With off-nadir pointing.

You can import imagery from any of these sensors into the TNTmips Project File format using the Import / Export process. Each image band is stored as a raster object.

page 16

In order to digitally record the energy received by an individual detector in a sensor, the continuous range of incoming energy must be quantized, or subdivided into a number of discrete levels that are recorded as integer values. Many current satellite systems quantize data into 256 levels (8 bits of data in a binary encoding system). The thermal infrared bands of the ASTER sensor are quantized into 4096 levels (12 bits). The more levels that can be recorded, the greater is the radiometric resolution of the sensor system.
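Quantization can be sketched as a mapping from a continuous measurement onto 2ⁿ integer levels (assuming, for illustration, a known linear sensor range):

```python
# Map a continuous energy value in [low, high] onto 2**bits integer levels.
def quantize(value, low, high, bits=8):
    levels = 2 ** bits
    fraction = (value - low) / (high - low)
    return min(int(fraction * levels), levels - 1)

print(2 ** 8, 2 ** 12)              # 256 4096 -- 8-bit vs 12-bit level counts
print(quantize(0.5, 0.0, 1.0))      # 128
print(quantize(1.0, 0.0, 1.0))      # 255 (top of range clamps to the last level)
```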

High radiometric resolution is advantageous when you use a computer to process and analyze the numerical values in the bands of a multispectral image. (Several of the most common analysis procedures, band ratio analysis and spectral classification, will be described subsequently.) Visual analysis of multispectral images also benefits from high radiometric resolution because a selection of wavelength bands can be combined to form a color display or print. One band is assigned to each of the three color channels used by the computer monitor: red, green, and blue. Using the additive color model, differing levels of these three primary colors combine to form millions of subtly different colors. For each cell in the multispectral image, the brightness values in the selected bands determine the red, green, and blue values used to create the displayed color. Using 256 levels for each color channel, a computer display can create over 16 million colors. Experiments indicate that the human visual system can distinguish close to seven million colors, and it is also highly attuned to spatial relationships. So despite the power of computer analysis, visual analysis of color displays of multispectral imagery can still be an effective tool in their interpretation.
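The band-to-channel assignment described above can be sketched with NumPy. The arrays here are random stand-ins for real image bands, and the NIR/red/green assignment is just one illustrative choice:

```python
import numpy as np

# Stand-ins for three co-registered 8-bit image bands.
rng = np.random.default_rng(0)
nir, red, green = (rng.integers(0, 256, (100, 100), dtype=np.uint8)
                   for _ in range(3))

# Assign one band to each display channel (NIR -> red channel,
# red -> green channel, green -> blue channel) and stack the three
# 2D bands into a single RGB image array for display.
rgb = np.dstack([nir, red, green])
print(rgb.shape)  # (100, 100, 3)
```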

Individual band images in the visible to middle infrared range from the Landsat Thematic Mapper are illustrated for two sample areas on the next page. The left image is a mountainous terrane with forest (lower left), bare granitic rock, small clear lakes, and snow patches. The right image is an agricultural area with both bare and vegetated fields, with a town in the upper left and yellowed grass in the upper right. The captions for each image pair discuss some of the diagnostic uses of each band. Many color combinations are also possible with these six image bands. Three of the most widely-used color combinations are illustrated on a later page.

Radiometric Resolution

(Figure: additive color model, with overlapping red, green, and blue primaries forming yellow, cyan, and magenta.)

Page 17: Intro Rse


Visible to Middle Infrared Image Bands

Blue (TM 1): Provides maximum penetration of shallow water bodies, though the mountain lakes in the left image are deep and thus appear dark, as does the forested area. In the right image, the town and yellowed grassy areas are brighter than the bare and cultivated agricultural fields. The brightness of the bare fields varies widely with moisture content.

Green (TM 2): Includes the peak visible light reflectance of green vegetation, thus helps assess plant vigor and differentiate green and yellowed vegetation. But note that forest is still darker than bare rocks and soil. Snow is very bright, as it is throughout the visible and near-infrared range.

Red (TM 3): Due to strong absorption by chlorophyll, green vegetation appears darker than in the other visible light bands. The strength of this absorption can be used to differentiate different plant types. The red band is also important in determining soil color, and for identifying reddish, iron-stained rocks that are often associated with ore deposits.

Near Infrared (TM 4): Green vegetation is much brighter than in any of the visible bands. In the agricultural image, the few very bright fields indicate the maximum crop canopy cover. An irrigation canal is also very evident due to strong absorption by water and contrast with the brighter vegetated fields.

Middle Infrared, 1.55 to 1.75 µm (TM 5): Strongly absorbed by water, ice, and snow, so the lakes and snow patches in the mountain image appear dark. Reflected by clouds, so is useful for differentiating clouds and snow. Sensitive to the moisture content of soils: recently irrigated fields in the agricultural image appear in darker tones.

Middle Infrared, 2.08 to 2.35 µm (TM 7): Similar to TM band 5, but includes an absorption feature found in clay minerals; materials with abundant clay appear darker than in TM band 5. Useful for identifying clayey soils and alteration zones rich in clay that are commonly associated with economic mineral deposits.

Page 18: Intro Rse


Much useful information can be obtained by visual examination of individual image bands. Here our visual abilities to rapidly assess the shape and size of ground features and their spatial patterns (texture) play important roles in interpretation. We also have the ability to quickly assess patterns of topographic shading and shadows and interpret from them the shape of the land surface and the direction of illumination.

One of the most important characteristics of an image band is its distribution of brightness levels, which is most commonly represented as a histogram. (You can view an image histogram using the Histogram tool in the TNTmips Spatial Data Display process.) A sample image and its histogram are shown below. The horizontal axis of the histogram shows the range of possible brightness levels (usually 0 to 255), and the vertical axis represents the number of image cells that have a particular brightness. The sample image has some very dark areas, and some very bright areas, but the majority of cells are only moderately bright. The shape of the histogram reflects this, forming a broad peak that is highest near the middle of the brightness range. The breadth of this histogram peak indicates the significant brightness variability in the scene. An image with more uniform surface cover, with less brightness variation, would show a much narrower histogram peak. If the scene includes extensive areas of different surface materials with distinctly different brightness, the histogram will show multiple peaks.
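The histogram just described is straightforward to compute. A minimal NumPy sketch (the band here is a synthetic stand-in for a real image):

```python
import numpy as np

# Stand-in for an 8-bit image band: mostly moderate brightness with
# dark and bright tails, like the sample image described above.
rng = np.random.default_rng(0)
band = np.clip(rng.normal(128, 40, (200, 200)), 0, 255).astype(np.uint8)

# One histogram bin per possible brightness level (0-255);
# counts[i] is the number of image cells with brightness i.
counts, _ = np.histogram(band, bins=256, range=(0, 256))
print(counts.sum() == band.size)  # True: every cell falls in some bin
```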

In contrast to our phenomenal color vision, we are only able to distinguish 20 to 30 distinct brightness levels in a grayscale image, so contrast (the relative brightness difference between features) is an important image attribute. Because of its wide range in brightness, the sample image above has relatively good contrast. But it is common for the majority of cells in an image band to be clustered in a relatively narrow brightness range, producing poor contrast. You can increase the interpretability of grayscale (and color) images by using the Contrast Enhancement procedure in the TNTmips Spatial Data Display process to spread the brightness values over more of the display brightness range. (See the tutorial booklet entitled Getting Good Color for more information.)
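One common way to spread brightness values over the display range is a linear percentile stretch. This is an illustrative sketch of the general technique, not the TNTmips implementation:

```python
import numpy as np

def linear_stretch(band, low_pct=2.0, high_pct=98.0):
    """Spread a band's central brightness range over the full 0-255
    display range, saturating the darkest and brightest tails."""
    lo, hi = np.percentile(band, [low_pct, high_pct])
    out = (band.astype(float) - lo) / (hi - lo) * 255.0
    return np.clip(out, 0, 255).astype(np.uint8)

# A band clustered in a narrow brightness range gains contrast:
narrow = np.array([[100, 105], [110, 120]], dtype=np.uint8)
print(linear_stretch(narrow))
```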

Interpreting Single Image Bands

Page 19: Intro Rse


Color Combinations of Visible-MIR Bands

Four image areas are shown below to illustrate useful color combinations of bands in the visible to middle infrared range. The two left image sets are shown as separate bands and described on a preceding page. The third image set shows a desert valley with a central riparian zone and a few irrigated fields, and a dark basaltic cinder cone in the lower left. The fourth image set shows another desert area with varied rock types and an area of irrigated fields in the upper right.

Middle infrared (TM 7) = R, Near infrared (TM 4) = G, Green (TM 2) = B: Healthy green vegetation appears bright green. Yellowed grass and typical agricultural soils appear pink to magenta. Snow is pale cyan, and deeper water is black. Rock materials typically appear in shades of brown, gray, pink, and red.

Red (TM 3) = R, Green (TM 2) = G, Blue (TM 1) = B: Simulates “natural” color. Note the small lake in the upper left corner of the third image, which appears blue-green due to suspended sediment or algae.

Near infrared (TM 4) = R, Red (TM 3) = G, Green (TM 2) = B: Simulates the colors of a color-infrared photo. Healthy green vegetation appears red, yellowed grass appears blue-green, and typical agricultural soils appear blue-green to brown. Snow is white, and deeper water is black. Rock materials typically appear in shades of gray to brown.

Page 20: Intro Rse


Band Ratios

Aerial images commonly exhibit illumination differences produced by shadows and by differing surface slope angles and slope directions. Because of these effects, the brightness of each surface material can vary from place to place in the image. Although these variations help us to visualize the three-dimensional shape of the landscape, they hamper our ability to recognize materials with similar spectral properties. We can remove these effects, and accentuate the spectral differences between materials, by computing a ratio image using two spectral bands. For each cell in the scene, the ratio value is computed by dividing the brightness value in one band by the value in the second band. Because the contribution of shading and shadowing is approximately constant for all image bands, dividing the two band values effectively cancels them out. Band ratios can be computed in TNTmips using the Predefined Raster Combination process, which is discussed in the tutorial booklet entitled Combining Rasters.

Band ratios have been used extensively in mineral exploration and to map vegetation condition. Bands are chosen to accentuate the occurrence of a particular material. The analyst chooses one wavelength band in which the material is highly reflective (appears bright), and another in which the material is strongly absorbing (appears dark). Usually the more reflective band is used as the numerator of the ratio, so that occurrences of the target material yield higher ratio values (greater than 1.0) and appear bright in the ratio image.
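The per-cell division just described is easy to sketch, and it also shows why shading cancels: a multiplicative illumination factor applied to both bands drops out of the ratio. Illustrative NumPy code (not from the booklet):

```python
import numpy as np

def band_ratio(numerator, denominator):
    """Per-cell ratio of two image bands, computed in floating point;
    a small floor on the denominator guards against division by zero."""
    return numerator.astype(float) / np.maximum(denominator.astype(float), 1e-6)

nir = np.array([[120.0, 40.0]])
red = np.array([[60.0, 80.0]])
print(band_ratio(nir, red))              # ratio values 2.0 and 0.5
# A shading factor multiplies both bands equally, so the ratio is unchanged.
print(band_ratio(0.4 * nir, 0.4 * red))  # still 2.0 and 0.5
```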

A ratio of near infrared (NIR) and red bands (TM4 / TM3) is useful in mapping vegetation and vegetation condition. The ratio is high for healthy vegetation, but lower for stressed or yellowed vegetation (lower near infrared and higher red values) and for nonvegetated areas. Exploration geologists use several ratios of Landsat Thematic Mapper bands to help map alteration zones that commonly host ore deposits. A band ratio of red (TM3) to blue (TM1) highlights reddish-colored iron oxide minerals found in many alteration zones. Nearly all minerals are highly reflective in the shorter-wavelength middle infrared band (TM5), but the clay minerals such as kaolinite that are abundant in alteration zones have an absorption feature within the longer-wavelength middle infrared band (TM7). A ratio of TM5 to TM7 thus highlights these clay minerals, along with the carbonate minerals that make up limestone and dolomite. Compare the ratio images shown at left to the color composites of the third image set on the preceding page.

Ratio NIR / RED

Ratio TM3 / TM1

Page 21: Intro Rse


Simple band ratio images, while very useful, have some disadvantages. First, any sensor noise that is localized in a particular band is amplified by the ratio calculation. (Ideally, the image bands you receive should have been processed to remove such sensor artifacts.) Another difficulty lies in the range and distribution of the calculated values, which we can illustrate using the NIR / RED ratio. Ratio values can range from decimal values less than 1.0 (for NIR less than RED) to values much greater than 1.0 (for NIR greater than RED). This range of values posed some difficulties in interpretation, scaling, and contrast enhancement for older image processing systems that operated primarily with 8-bit integer data values. (TNTmips allows you to work directly with the fractional ratio values in a floating-point raster format, with full access to different contrast enhancement methods.)

A normalized difference index is a variant of the simple ratio calculation that avoids these problems. Corresponding cell values in the two bands are first subtracted, and this difference is then “normalized” by dividing by the sum of the two brightness values. (You can compute normalized difference indices automatically in TNTmips using the Predefined Raster Combination process.) The normalization tends to reduce artifacts related to sensor noise, and most illumination effects are still removed. The most widely used example is the Normalized Difference Vegetation Index (NDVI), which is (NIR - RED) / (NIR + RED). Raw index values range from -1 to +1, and the data range is symmetrical around 0 (NIR = RED), making interpretation and scaling easy. Compare the NDVI image of the mountain scene to the right with the color composite images shown on a previous page. The forested area in the lower left is very bright, and clearly differentiated from the darker nonvegetated areas.
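The NDVI formula above translates directly into code. A minimal sketch with illustrative values:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - RED) / (NIR + RED).
    Values fall between -1 and +1, symmetrical around 0 (NIR = RED)."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / np.maximum(nir + red, 1e-6)

# Illustrative cells: healthy vegetation, yellowed grass, water.
nir = np.array([150.0, 60.0, 50.0])
red = np.array([50.0, 60.0, 70.0])
print(ndvi(nir, red))  # 0.5, 0.0, and about -0.167
```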

Different ratio or normalized difference images can be combined to form color composite images for visual interpretation. The color image to the left incorporates three ratio images with R = TM3 / TM1, G = TM4 / TM3, and B = TM7 / TM5. Vegetated areas appear bright blue-green, iron-stained areas appear in shades of pink to orange, and other rock and soil materials are shown in a variety of hues that portray subtle variations in their spectral characteristics.

Normalized Difference Vegetation Index

Page 22: Intro Rse


Removing Haze (Path Radiance)

Before you compute band ratios or normalized difference images, you should adjust the brightness values in the bands to remove the effects of atmospheric path radiance. Recall that scattering by a hazy atmosphere adds a component of brightness to each cell in an image band. If atmospheric conditions were uniform across the scene (not always a safe assumption!), then we can assume that the brightness of each cell in a particular band has been increased by the same amount, shifting the entire band histogram uniformly toward higher values. This additive effect decreases with increasing wavelength, so calculating ratios with raw brightness values (especially ratios involving blue and green bands) can produce spurious results, including incomplete removal of topographic shading.

The adjustment of band values for path radiance effects is mathematically simple: subtract the appropriate value from each cell. (This operation can be performed in TNTmips in the Predefined Raster Combinations process, using the arithmetic operation Scale/Offset; use a scale factor of 1.0 and set the path radiance value as a negative offset.) But how do you know what value to subtract?

Fortunately there are several simple ways to estimate path radiance values from the image itself. If the image includes areas that are completely shadowed, such as parts of the canyon walls in the image to the right, the brightness of the shadowed cells should be entirely due to path radiance. You can use DataTips or the Examine Raster tool in the TNTmips Spatial Data Display process to determine the value for the shadowed areas. In the absence of complete shadows, deep clear water bodies can provide suitably dark areas. The danger in this method is that the selected cell may actually have a component of brightness from the surface (such as a partial shadow or turbid water), in which case the subtracted value is too high. A more reliable estimate can be found for Landsat TM bands by using the Raster Correlation tool to display a scatterplot of brightness values for the selected band and the longer-wavelength middle infrared band (TM7), for which path radiance should be essentially 0. Because of path radiance, the best-fit line through the point distribution (computed automatically using the Regression Line option) does not pass through the origin of the plot. Instead its intersection with the axis for the shorter-wavelength band approximates the band’s path radiance value (illustration at left).

(Illustration: scatterplot of TM Band 2 against TM Band 7 brightness values, with axes from 0 to 127; the regression line intersects the TM Band 2 axis at 21, the estimated path radiance for TM 2.)
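The regression method can be sketched numerically: fit a line to the scatter of a band against TM7, take the intercept as the path radiance estimate, then subtract it. This is illustrative code with synthetic data (the haze value of 21 simply echoes the TM 2 example above):

```python
import numpy as np

def estimate_path_radiance(band, reference):
    """Intercept of the best-fit line through a scatterplot of a band
    against a reference band (e.g. TM7) whose path radiance is ~0."""
    slope, intercept = np.polyfit(reference.ravel().astype(float),
                                  band.ravel().astype(float), 1)
    return intercept

# Synthetic example: a TM2-like band = surface signal plus constant haze of 21.
tm7 = np.linspace(0, 127, 500)
tm2 = 0.5 * tm7 + 21.0
offset = estimate_path_radiance(tm2, tm7)
corrected = np.clip(tm2 - offset, 0, None)  # subtract (a negative offset)
print(round(offset))  # 21
```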

Page 23: Intro Rse


Spectral Classification

Color composite Landsat Thematic Mapper image with Red = TM7, Green = TM4, and Blue = TM2. Scene shows farmland flanked by an urban area (upper right) and grassy hills (lower left).

Result of unsupervised classification of six nonthermal Landsat TM bands for the above scene. Each arbitrary color indicates a separate class.

Spectral classification is another popular method of computer image analysis. In a multispectral image the brightness values in the different wavelength bands encode the spectral information for each image cell, and can be regarded as a spectral pattern. Spectral classification methods seek to categorize the image cells on the basis of these spectral patterns, without regard to spatial relationships or associations.

The spectral pattern of a cell in a multispectral image can be quantified by plotting the brightness value from each wavelength band on a separate coordinate axis to locate a point in a hypothetical “spectral space”. This spectral space has one dimension for each image band that is used in the classification. Most classification methods assess the similarity of spectral patterns by using some measure of the distance between points in this spectral space. Cells whose spectral patterns are close together in spectral space have similar spectral characteristics and have a high likelihood of representing the same surface materials.

In supervised classification the analyst designates a set of “training areas” in the image, each of which is a known surface material that represents a desired spectral class. The classification algorithm computes the average spectral pattern for each training class, then assigns the remaining image cells to the most similar class. In unsupervised classification the algorithm derives its own set of spectral classes from an arbitrary sample of the image cells before making class assignments. You can perform both types of classification in TNTmips using the Automatic Classification process, which is described in the tutorial booklet entitled Image Classification.
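As a concrete sketch of the distance-in-spectral-space idea, here is a minimal minimum-distance-to-means classifier. This is one common distance rule among several; it is illustrative code, not the TNTmips implementation:

```python
import numpy as np

def min_distance_classify(image, class_means):
    """Assign each cell to the class whose mean spectral pattern is nearest
    in spectral space (Euclidean distance).
    image: (rows, cols, bands); class_means: (n_classes, bands)."""
    pixels = image.reshape(-1, image.shape[-1]).astype(float)
    # Distance from every cell to every class mean.
    d = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=2)
    return d.argmin(axis=1).reshape(image.shape[:2])

# Two bands, two classes: "dark" (mean [20, 20]) and "bright" (mean [200, 210]).
image = np.array([[[25, 18], [190, 205]]])
means = np.array([[20.0, 20.0], [200.0, 210.0]])
print(min_distance_classify(image, means))  # [[0 1]]
```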

Page 24: Intro Rse


Temporal Resolution

The surface environment of the Earth is dynamic, with change occurring on time scales ranging from seconds to decades or longer. The seasonal cycle of plant growth that affects both natural ecosystems and crops is an important example. Repeat imagery of the same area through the growing season adds to our ability to recognize and distinguish plant or crop types. A time-series of images can also be used to monitor changes in surface features due to other natural processes or human activity. The time interval separating successive images in such a series can be considered to define the temporal resolution of the image sequence.

This sequence of Landsat TM images of an agricultural area in central California was acquired during a single growing season: 27 April (left), 30 June (center), and 20 October (right). In this 4-3-2 band combination vegetation appears red and bare soil in shades of blue-green. Some fields show an increase in crop canopy cover from April to June, and some were harvested prior to October.

Most surface-monitoring satellites are in low-Earth orbits (between 650 and 850 kilometers above the surface) that pass close to the Earth’s poles. The satellites complete many orbits in a day as the Earth rotates beneath them, and the orbital parameters and swath width determine the time interval between repeat passes over the same point on the surface. For example, the repeat interval of the individual Landsat satellites is 16 days. Placing duplicate satellites in offset orbits (as in the SPOT series) is one strategy for reducing the repeat interval. Satellites such as SPOT and IKONOS also have sensors that can be pointed off to the side of the orbital track, so they can image the same areas within a few days, well below the orbital repeat interval. Such frequent repeat times may soon allow farmers to utilize weekly satellite imagery to provide information on the condition of their crops during the growing season.

Growth in the urban area of Tracy, California recorded by Landsat TM images from 1985 (left) and 1999 (right).

Page 25: Intro Rse


Spatial Registration and Normalization

You can make qualitative interpretations from an image time-sequence (or images from different sensors) by simple visual comparison. If you wish to combine information from the different dates in a color composite display, or to perform a quantitative analysis such as spectral classification, first you need to ensure that the images are spatially registered and spectrally normalized.

Spatial registration means that corresponding cells in the different images are correctly identified, matched in size, and sample the same areas on the ground. Registering a set of images requires several steps. The first step is usually georeferencing the images: identifying in each image a set of control points with known map coordinates. The control point coordinates can come from another georeferenced image or map, or from a set of positions collected in the field using a Global Positioning System (GPS) receiver. Control points are assigned in TNTmips in the Georeference process (Edit / Georeference). You can find step-by-step instructions on using the Georeference process in the tutorial booklet entitled Georeferencing. After all of the images have been georeferenced, you can use the Automatic Resampling process (Process / Raster / Resample / Automatic) to reproject each image to a common map coordinate system and cell size. For more information about this process, consult the tutorial booklet entitled Rectifying Images.

Images of the same area acquired on different dates may have different brightness values for the same ground location and surface material because of differences in sensor calibration, atmospheric conditions, and illumination. The path radiance correction described previously removes most of the between-date variance due to atmospheric conditions and sensor offset. To correct for remaining differences in sensor gain and illumination, the values in the image bands must be rescaled by some multiplicative factor. If spectral measurements have been made of ground materials in the scene, the images can be rescaled to represent actual reflectance values (spectral calibration). In the absence of field spectra, you can pick one image as the “standard”, and rescale the others to match its conditions (image normalization). One normalization procedure requires that the scene includes identifiable features whose spectral properties have not varied through time (called pseudoinvariant features). Good candidates include manmade materials such as asphalt and concrete, or natural materials such as deep water bodies or dry bare soil areas. Normalization procedures using this method are outlined in the Combining Rasters booklet.
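The pseudoinvariant-feature idea can be sketched as a linear fit: pair up PIF brightness values from the two dates, fit gain and offset, and rescale the whole band. This is an illustrative sketch of the general approach, not the booklet's exact procedure:

```python
import numpy as np

def normalize_to_standard(band, pif_band, pif_standard):
    """Rescale 'band' so that pseudoinvariant features (PIFs) match their
    brightness in the chosen 'standard' image: fit gain and offset from
    paired PIF values, then apply them to every cell of the band."""
    gain, offset = np.polyfit(pif_band.astype(float),
                              pif_standard.astype(float), 1)
    return gain * band.astype(float) + offset

# PIFs (e.g. asphalt, deep water, dry bare soil) sampled in both images;
# here the standard image happens to be brighter by gain 2, offset 5.
pif_target = np.array([10.0, 60.0, 110.0])
pif_standard = np.array([25.0, 125.0, 225.0])
band = np.array([[10.0, 60.0]])
print(normalize_to_standard(band, pif_target, pif_standard))  # 25.0 and 125.0
```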

Classification result for the area shown in the images on the preceding page, using six Landsat TM bands for each date.

Page 26: Intro Rse


Thermal Infrared Images

Thermal infrared images add another dimension to passive remote sensing techniques. They provide information about surface temperatures and the thermal properties of surface materials. Many applications of thermal infrared images are possible, including mapping rock types, soils, and soil moisture variations, and monitoring vegetation condition, sea ice, and ocean current patterns. Thermal images also can be used in more dramatic circumstances to monitor unusual heat sources such as wildfires, volcanic activity, or hot water plumes released into rivers or lakes by power plants.

The Earth’s surface emits EM radiation in the thermal infrared wavelength range as determined by typical surface temperatures. Most thermal infrared images are acquired at wavelengths between 8 and 14 µm, a range that includes the peak emissions. Nearly all incoming solar radiation at these wavelengths is absorbed by the surface, so there is little interference from reflected radiation, and this range also is a good “atmospheric window” (see pages 6 and 7). The natural sources that heat the Earth’s surface are solar energy and geothermal energy (heat produced by decay of radioactive elements in rocks). Geothermal heating is much smaller in magnitude and is nearly uniform over large areas, so solar heating is the dominant source of temperature variation for most images. The daily solar heating of the surface is influenced by the physical and thermal properties of the surface materials, by topography (slopes facing the sun absorb more solar energy), and by clouds and wind.

The brightness values in a thermal image measure the amount of energy emitted by different parts of the surface, which depends not only on the material’s temperature, but also on a property called emissivity. Emissivity describes how efficiently a material radiates energy compared to a hypothetical ideal emitter and absorber, called a blackbody. Emissivity is defined as the ratio of the amount of radiant energy emitted by a real material at a given temperature to the amount emitted by a blackbody at the same temperature. Emissivity is wavelength dependent, so materials can be characterized by an emissivity spectrum just as they are by a reflectance spectrum. Most natural materials are relatively strong emitters. Average emissivity values for the wavelength range from 8 to 12 µm vary from 0.815 for granite rock to 0.993 for pure water.
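To see what such an emissivity contrast implies, one can compare emitted energy at equal temperature using the Stefan-Boltzmann relation. This is a broadband approximation (an assumption on my part; the booklet's 0.815 and 0.993 figures are 8-12 µm band averages):

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiant_exitance(emissivity, temp_k):
    """Energy emitted per unit area, approximated as emissivity times the
    blackbody exitance at the same temperature (broadband approximation)."""
    return emissivity * SIGMA * temp_k ** 4

# At the same 300 K surface temperature, water (0.993) emits roughly 22%
# more energy than granite (0.815), so it would look warmer to the sensor.
ratio = radiant_exitance(0.993, 300.0) / radiant_exitance(0.815, 300.0)
print(round(ratio, 3))  # 1.218
```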

The cool river surface, its flanking wooded strips, and agricultural fields with full crop cover appear dark in this summer mid-morning thermal image of an area in Kansas (USA). Brighter fields are bare soil. From Landsat 7 ETM+, band 6, with 60-meter ground resolution.

Page 27: Intro Rse


Thermal Processes and Properties

Some analogies can be drawn between thermal infrared images and the more familiar images created with reflected solar radiation. Both types of images reveal spatial variations in a material property that governs an instantaneous interaction process between radiation and matter: emissivity for thermal images and reflectance for reflected solar images. Topographic effects can be present in both types of images as well, as temperature variations in thermal images and as illumination differences in reflected images. But the interpretation of thermal images is more complex because surface temperature also varies spatially as a result of other material properties. These temperature variations also involve processes that extend below the visible surface and that are not instantaneous in nature.

The temperatures of all surface materials change in a daily cycle of solar heating and subsequent nighttime cooling, but different materials respond to this daily cycle in different ways. Darker materials tend to absorb more incoming radiation and so are heated more than brighter materials, which reflect much of the solar radiation. Even if two materials do absorb the same amount of radiation, one may achieve a higher maximum temperature than the other. In part this may be due to the fact that the materials have different thermal capacities: different amounts of heat are required to induce a given rise in their temperatures. But as the surface warms, heat is transferred by conduction to cooler levels below the surface, and the reverse process occurs during nocturnal cooling. Temperature changes during the daily solar cycle may extend as deep as 30 centimeters below the surface. Because rates of heat transfer vary between materials due to differences in density and thermal conductivity, this vertical heat exchange also gives rise to spatial variations in temperature. These effects can be expressed as a property called thermal inertia, the resistance to change in temperature, which is a function of density, thermal capacity, and thermal conductivity.
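The standard definition of thermal inertia combines the three properties named above as P = sqrt(k * rho * c), with thermal conductivity k, density rho, and thermal (heat) capacity c. A minimal sketch with made-up numbers:

```python
import math

def thermal_inertia(conductivity, density, heat_capacity):
    """Thermal inertia P = sqrt(k * rho * c); materials with higher P
    resist temperature change and show smaller day-night swings."""
    return math.sqrt(conductivity * density * heat_capacity)

# Purely illustrative values: doubling conductivity raises P by sqrt(2).
p1 = thermal_inertia(1.0, 2000.0, 800.0)
p2 = thermal_inertia(2.0, 2000.0, 800.0)
print(round(p2 / p1, 3))  # 1.414
```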

Surface materials constantly emit infrared radiation, so thermal images can be acquired day or night. Materials that warm slowly during the day, and thus are cooler than their surroundings, also cool slowly at night, and so become warmer than their surroundings in nighttime images. Successful interpretation of thermal images requires a knowledge of the time of acquisition of the image, the topography of the area, and the thermal properties of the materials likely to be present in the scene.

In this nighttime thermal infrared image of northern Eritrea and the Red Sea, the water is warmer than the land surface, and thus appears in brighter tones at the upper right. The brightness variations on land relate to variations in thermal properties and, to a smaller degree, topography.

Page 28: Intro Rse


Radar Images


An imaging radar system directs a radar beam down and toward the side, building up image data line by line.

Hilly terrain dominates the right half and flatter surfaces the left half of this radar image. A broad braided stream channel on the left edge is flanked by agricultural fields. Residential areas with bright vegetation and structures and dark streets are found in the lower left and upper right portions.

Imaging radar systems are versatile sources of remotely sensed images, providing day-night, all-weather imaging capability. Radar images are used to map landforms and geologic structure, soil types, vegetation and crops, and ice and oil slicks on the ocean surface. Aircraft-mounted commercial and research systems have been in use for decades, and two satellite systems are currently operational (the Canadian Radarsat and the European Space Agency’s ERS-1).

Radar images have a unique, unfamiliar appearance compared to other forms of images. They appear grainy and can also include significant spatial distortions of ground features. As with any remote sensing system, an understanding of the nature of the relevant EMR interactions and the acquisition geometry is important for successful interpretation of radar images.

An imaging radar system uses an antenna to transmit microwave energy downward and toward the side of the flight path (see illustration below). As the radar beam strikes ground features, energy is scattered in various directions, and the radar antenna receives and measures the strength of the energy that is scattered back toward the sensor platform. A surface that is smooth and flat (such as a lake or road) will reflect nearly all of the incident energy away from the sensor, so flat surfaces appear dark in a radar image. A surface that is rough, with “bumps” comparable in height to the wavelength of the microwaves, will scatter more energy back to the sensor, and so will appear bright. (The range of wavelengths commonly used in imaging radar systems is between 0.8 cm and 1 meter.) Slopes that face the sensor will also appear brighter than surfaces that slope away from it, and steep backslopes may be completely shadowed. Terrane shape and surface roughness are thus the dominant controls on radar brightness variations.

Page 29: Intro Rse


Imaging radar systems broadcast very short (10 to 50 microsecond) pulses of microwave energy and, in the pauses between them, receive the fluctuating return signal of backscattered energy. Each broadcast pulse is directed across a narrow strip perpendicular to the flight direction. This pulsing mode is necessary because the system measures not only the strength of the returning signal, but also its round-trip travel time. Because the microwaves travel at the speed of light, the travel time for each part of the return signal can be converted directly to straight-line distance to the reflecting object, known as the slant range (see illustration below). In the initial image produced by most radar systems, the positions of radar returns in the range (across-track) direction are based on their slant range distances. Because the angle of the radar reflections varies in the range direction, the horizontal scale of a slant range image is not constant. Features in the part of the image close to the flight line (the near range) appear compressed in the range direction compared to those in the far range. Using the sensor height and the assumption that the terrain is flat, a slant range image can be processed to produce an approximation of the true horizontal positions of the radar returns. The result is a ground range image. TNTmips offers the slant range to ground range transformation as one of its raster resampling options (Process / Raster / Resample / Radar Slant to Ground).
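Under the flat-terrain assumption, the slant-to-ground conversion is just the Pythagorean relation between sensor height, slant range, and horizontal distance. A minimal sketch:

```python
import math

def ground_range(slant_range, sensor_height):
    """Horizontal (ground) distance to a reflector, given its slant-range
    distance and the sensor height, assuming flat terrain."""
    return math.sqrt(slant_range ** 2 - sensor_height ** 2)

# A return at 5 km slant range from a sensor 3 km above flat terrain
# lies 4 km away horizontally.
print(ground_range(5.0, 3.0))  # 4.0
```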

Radar Image Geometry

(Figure: paired slant range and ground range images of the same scene, with the near range and far range directions labeled across the swath.)

The side-looking geometry of radar systems also creates internal image distortions related to topography. Slopes that face the sensor are narrowed relative to slopes facing away from it; as a result, hills and ridges appear to lean toward the flight path. This foreshortening is illustrated in the image to the right by the brighter slopes on the left side, which face toward the sensor. If the foreslope is too steep, returns from the top may arrive before any others, and the front slope disappears completely (an effect called layover). An accurate elevation model of the surface is required to remove these distortions.
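These geometric effects can be summarized as simple threshold rules comparing terrain slope to the radar look angle: a foreslope steeper than the look angle produces layover, and a backslope steeper than the look angle's complement falls in radar shadow. The sketch below encodes those rules; it is a simplification that ignores the variation of look angle across the swath.

```python
def radar_distortion(slope_deg, look_angle_deg, facing_sensor):
    """Classify terrain-induced radar geometry effects (simplified rules):
    a foreslope steeper than the look angle lays over; a backslope steeper
    than (90 - look angle) is shadowed; gentler foreslopes are merely
    foreshortened."""
    if facing_sensor:
        if slope_deg >= look_angle_deg:
            return "layover"
        return "foreshortening"
    if slope_deg >= 90.0 - look_angle_deg:
        return "shadow"
    return "normal (darker backslope)"

# With a 35-degree look angle: a 40-degree foreslope lays over,
# a 20-degree foreslope is foreshortened, a 60-degree backslope is shadowed.
print(radar_distortion(40, 35, True))
print(radar_distortion(20, 35, True))
print(radar_distortion(60, 35, False))
```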

(Figure: side-looking radar geometry, showing the sensor height and the slant range to a ground feature.)


Fusing Data from Different Sensors

Materials commonly found at the Earth's surface, such as soil, rocks, water, vegetation, and man-made features, possess many distinct physical properties that control their interactions with electromagnetic radiation. In the preceding pages we have discussed remote sensing systems that use three separate parts of the radiation spectrum: reflected solar radiation (visible and infrared), emitted thermal infrared, and imaging radar. Because the interactions of EM radiation with surface features in these spectral regions are different, each of the corresponding sensor systems measures a different set of physical properties. Although each type of system by itself can reveal a wealth of information about the identity and condition of surface materials, we can learn even more by combining image data from different sensors. Interpretation of the merged data set can employ rigorous quantitative analysis or more qualitative visual analysis. The illustrations below show an example of the latter approach.

These images show a small area (about 1.5 by 1.5 km) of cropland in the Salinas Valley, California. Data used in the color image to the right was acquired 8 October 1998 by NASA's AVIRIS sensor. This hyperspectral sensor acquires images in numerous narrow spectral bands in the visible to middle infrared range. The band combination to the right uses bands from the near infrared, green, and blue wavelength regions to simulate a color infrared image; red indicates vegetated areas, in this case fields with full crop canopy.

Data for the center radar image was acquired by NASA's AIRSAR imaging radar system on 24 October 1998, about two weeks after the AVIRIS image. Acquired using a 24-cm radar wavelength, the image has been transformed to ground range. The brightest radar returns come from crops with a tall, bushy structure. The brightest field in the center is a broccoli field, and a vineyard with vines trained to a vertical trellis is at bottom center. (Both the AIRSAR and AVIRIS data were georeferenced and resampled to the same cell size and geographic extents.)

The image to the right combines the AVIRIS and AIRSAR data in a single color image using the RGBI raster display procedure. This process converts the AVIRIS color band combination to the Hue-Intensity-Saturation color space, substitutes the AIRSAR image for grayscale intensity, and converts back to the RGB color space to create the final image (see the Getting Good Color tutorial booklet for more information). Colors in the combined image differentiate fields by degree of plant cover (red hue) and plant structure (intensity).
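The intensity-substitution step can be sketched per pixel with Python's standard colorsys module. Note that colorsys provides the HSV color space rather than the Hue-Intensity-Saturation space TNTmips uses, so this is only an approximation of the RGBI procedure, and the pixel values below are hypothetical.

```python
import colorsys

def rgbi_fuse(rgb_pixel, intensity):
    """Fuse a color-composite pixel with a grayscale intensity value:
    convert RGB to HSV, keep hue and saturation, substitute the radar
    brightness for the value channel, and convert back to RGB.
    All values are in [0, 1]."""
    h, s, _ = colorsys.rgb_to_hsv(*rgb_pixel)
    return colorsys.hsv_to_rgb(h, s, intensity)

# A reddish AVIRIS pixel (full crop canopy) brightened by a strong radar return:
fused = rgbi_fuse((0.8, 0.2, 0.1), 0.95)
```

Because hue and saturation are preserved, the fused pixel keeps the color assigned by the AVIRIS composite while its brightness now encodes the radar backscatter, which is the visual effect described above.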


Other Sources of Information

This booklet has provided a brief overview of the rich and complex field of remote sensing of environmental resources. If you are interested in exploring further, you may wish to begin with one of the traditional printed texts listed below, or take advantage of the variety of online resources.

Introductory Textbooks

Campbell, James B. and Wynne, Randolph H. (2011). Introduction to Remote Sensing (5th ed.). The Guilford Press. 667 p.

Drury, S. A. (1993). Image Interpretation in Geology (3rd ed.). London: Taylor & Francis. 290 p.

Lillesand, Thomas M., Kiefer, Ralph W., and Chipman, Jonathan (2007). Remote Sensing and Image Interpretation (6th ed.). New York: John Wiley and Sons. 750 p.

Sabins, Floyd F. (1997). Remote Sensing: Principles and Interpretation (3rd ed.). Waveland Press Inc. 512 p.

More Advanced Texts

Jensen, John R. (1996). Introductory Digital Image Processing: a Remote Sensing Perspective (2nd ed.). Upper Saddle River, NJ: Prentice-Hall. 316 p.

Rencz, Andrew N., ed. (1999). Remote Sensing for the Earth Sciences. Manual of Remote Sensing (3rd ed.), Volume 3. New York: John Wiley and Sons. 707 p.

Schowengerdt, Robert A. (1997). Remote Sensing: Models and Methods for Image Processing (2nd ed.). New York: Academic Press. 522 p.

Internet Resources

Remote Sensing Tutorial created by the Goddard Space Flight Center
http://rst.gsfc.nasa.gov or http://www.fas.org/irp/imint/docs/rst/

An application-oriented on-line tutorial covering all aspects of remote sensing, including thermal images and radar, with many sample images.

Remote Sensing Tutorials created by the Canada Centre for Remote Sensing
http://www.nrcan.gc.ca/earth-sciences/geography-boundary/remote-sensing/1599#tutor

On-line tutorials in remote sensing fundamentals, radar and stereoscopy, and digital image analysis.


MicroImages, Inc.
11th Floor - Sharp Tower
206 South 13th Street
Lincoln, Nebraska 68508-2010 USA

Voice: (402) 477-9554    FAX: (402) 477-9559
email: [email protected]    internet: www.microimages.com

MicroImages, Inc. publishes a complete line of professional software for advanced geospatial data visualization, analysis, and publishing. Contact us or visit our web site for detailed product information.

TNTmips Pro TNTmips Pro is a professional system for fully integrated GIS, image analysis, CAD, TIN, desktop cartography, and geospatial database management.

TNTmips Basic TNTmips Basic is a low-cost version of TNTmips for small projects.

TNTmips Free TNTmips Free is a free version of TNTmips for students and professionals with small projects. You can download TNTmips Free from MicroImages' web site.

TNTedit TNTedit provides interactive tools to create, georeference, and edit vector, image, CAD, TIN, and relational database project materials in a wide variety of formats.

TNTview TNTview has the same powerful display features as TNTmips and is perfect for those who do not need the technical processing and preparation features of TNTmips.

TNTatlas TNTatlas lets you publish and distribute your spatial project materials on CD or DVD at low cost. TNTatlas CDs/DVDs can be used on any popular computing platform.

Index

absorption.....................................5-9,13,26
aerial photography.............................4,10,12,13
atmosphere
    absorption by..............................6,7,13
    scattering by..............................6,7,22
atmospheric windows............................7,26
color display..................................16,19,21
electromagnetic spectrum.......................4
emission.......................................5,8,26,27
haze...........................................7,22
illumination effects...........................8,10,20,21
interaction processes..........................5-8
multispectral images...........................12,13,19,23
multispectral sensor table.....................14-15
normalization, spectral........................25
normalized difference index....................21
panchromatic band..............................12
path radiance..................................7,22
quantization...................................16
radar, imaging.................................4,8,28,29
ratio images, band.............................20,21
reflectance, spectral..........................9
reflected solar radiation......................6,8,27
Reflection
    diffuse....................................See scattering
    specular...................................5
Resolution
    radiometric................................16
    spatial....................................11,12
    spectral...................................11,12
    temporal...................................11,24
roughness (surface)............................8,28,29
scattering.....................................5-9,22
shadows........................................6,20,22,28
spatial registration...........................25
spectral classification........................23,25
spectral signatures............................9
thermal infrared...............................5,8,13,16,26,27
topographic shading............................20,22
visual analysis & interpretation...............16,17,18

Advanced Software for Geospatial Analysis

INTRO TO RSE