CHAPTER II

BASIC THEORY

2.1 Remote Sensing

2.1.1 Definitions

There are many possible definitions of what remote sensing actually is; the following are some of them according to several scientists. F.F. Sabins (1978), in his book "Remote Sensing: Principles and Interpretation", defines it as follows: "Remote Sensing is the science of acquiring, processing and interpreting images that record the interaction between electromagnetic energy and matter". Lillesand and Kiefer (2007), in their book "Remote Sensing and Image Interpretation", even define it as an art: "Remote Sensing is the science and art of obtaining information about an object, area, or phenomenon through the analysis of data acquired by a device that is not in contact with the object, area, or phenomenon under investigation". Probably the broadest definition is given by Charles Elachi (2006) in "Introduction to the Physics and Techniques of Remote Sensing": "Remote Sensing is defined as the acquisition of information about an object without being in physical contact with it". Finally, the GIS Dictionary (2015) gives the following definition: "Remote sensing is collecting and interpreting information about the environment and the surface of the earth from a distance, primarily by sensing radiation that is naturally emitted or reflected by the earth's surface or from the atmosphere, or by sensing signals transmitted from a device and reflected back to it". Examples of remote-sensing methods include aerial photography, radar, and satellite imaging.


2.1.2 Types of Remote Sensing

Based on its platform, remote sensing is divided into three types, namely airborne remote sensing, shuttle-borne remote sensing and space-borne remote sensing. Whenever the remote sensing sensor is carried by an airplane, drone or UAV, the system is categorized as airborne remote sensing; examples of such systems are AIRSAR by NASA/JPL and Pi-SAR by NICT/JAXA. The platform of shuttle-borne remote sensing is a shuttle craft, for example SRTM in 2000 by NASA. The last type is space-borne remote sensing, in which the sensors are carried by satellites. The first space-borne remote sensing was initiated by the USA military through the Corona programs begun in 1959 (Baumann, 2009).

Based on its sensors, there are two types of remote sensing, namely passive remote sensing and active remote sensing. Passive remote sensing uses passive sensors, which only receive and measure energy that is naturally available. The sun provides a very convenient source of energy for remote sensing. The sun's energy is either reflected, as it is for visible wavelengths, or absorbed and then reemitted, as it is for thermal infrared wavelengths. Many remote sensing systems of this type are space-borne, for example ALOS-AVNIR2, ALOS-PRISM, SPOT, the LANDSAT family, ASTER, etc.

On the other hand, active remote sensing uses active sensors, which provide their own energy source for illumination. The sensor emits radiation directed toward the target to be investigated, and the radiation reflected from that target is detected and measured by the sensor. Advantages of active sensors include the ability to obtain measurements anytime, regardless of the time of day or season. Active sensors can be used to examine wavelengths that are not sufficiently provided by the sun, such as microwaves, or to better control the way a target is illuminated. However, active systems require the generation of a fairly large amount of energy to adequately illuminate targets. Some examples of active sensors are the laser fluoro-sensor and the synthetic aperture radar (SAR) (CCRS, 2014).

2.2 Radar Remote Sensing

Radar remote sensing is a form of active remote sensing that uses microwave radiation with wavelengths from about one centimeter to a few tens of centimeters, enabling observation in all weather conditions, day and night. This is an advantage that is not possible with visible and/or infrared remote sensing. However, the need for sophisticated data analysis is the disadvantage of using microwave remote sensing. Radar bands and their designations are presented in Table 2.1; the bands most commonly used in radar remote sensing are marked by (*) in Table 2.1 (Lusch, 1999).

Table 2.1
Frequency and wavelength for microwave bands (Lusch, 1999)

Band Designation    Wavelength range (cm)    Frequency range (GHz)
Ka                  0.75 – 1.10              40.0 – 26.5
K                   1.10 – 1.67              26.5 – 18.0
Ku                  1.67 – 2.40              18.0 – 12.5
X*                  2.40 – 3.75              12.5 – 8.0
C*                  3.75 – 7.50              8.0 – 4.0
S                   7.50 – 15.0              4.0 – 2.0
L*                  15.0 – 30.0              2.0 – 1.0
P                   30.0 – 130.0             1.0 – 0.23
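
As a small illustration of Table 2.1, the following Python sketch classifies a radar carrier frequency into its band designation. The band boundaries are taken directly from the table (Lusch, 1999); the function name and structure are illustrative only.

    # Minimal sketch: classify a radar frequency into its band per Table 2.1.
    BANDS = [  # (designation, f_min_GHz, f_max_GHz), from Table 2.1
        ("Ka", 26.5, 40.0), ("K", 18.0, 26.5), ("Ku", 12.5, 18.0),
        ("X", 8.0, 12.5), ("C", 4.0, 8.0), ("S", 2.0, 4.0),
        ("L", 1.0, 2.0), ("P", 0.23, 1.0),
    ]

    def band_of(frequency_ghz: float) -> str:
        """Return the microwave band designation for a frequency in GHz."""
        for name, f_min, f_max in BANDS:
            if f_min <= frequency_ghz <= f_max:
                return name
        raise ValueError(f"{frequency_ghz} GHz is outside the tabulated bands")

    print(band_of(1.27))  # an L-band carrier near 1.27 GHz -> 'L'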


2.2.1 Radar Remote Sensing Satellites

The first civilian space-borne SAR was SEASAT (USA) in 1978, followed by Almaz (USSR/Russia), ERS-1 (Europe), J-ERS-1 (Japan), ERS-2 (Europe) and RADARSAT-1 (Canada). Nowadays there are many satellites orbiting the Earth carrying radar sensors on board, providing incredible amounts of data for studying the Earth. Figure 2.1 illustrates the family of satellites carrying SAR sensors for commercial applications from 1992 onward.

Figure 2.1.
Satellite radar systems available now and into the future

SAR sensors carried by satellites in polar orbits generally look to the right of the satellite, except for ALOS-2, which can look to both the right and the left by carrying two antennas. Given the orbital inclination, these side-looking sensors can image the North Pole (actually an area within a few square kilometers of it), but not the South Pole (unless the satellite is overturned) (Henri, 2008).


All satellites equipped with SAR sensors orbit the earth in a near-polar orbit at an altitude ranging from 500 to 800 km above the earth's surface, depending on the satellite platform hosting the SAR sensor. The angle between true north-south and the satellite orbit varies slightly, depending on the satellite, but in general lies in the range of 10 degrees.

2.2.2 ALOS-PALSAR System Overview

This section briefly describes the Advanced Land Observing Satellite (ALOS) and focuses on the PALSAR sensor, as summarized from the ALOS User Handbook by JAXA. ALOS, nicknamed "Daichi", is a Japanese satellite launched on January 24, 2006. Its observation sensors consist of a high-resolution stereo mapping sensor (PRISM), a visible and near infrared radiometer (AVNIR-2), and an L-band synthetic aperture radar (PALSAR), all of which are high performance systems.

The Phased Array type L-band Synthetic Aperture Radar (PALSAR) is an active microwave sensor using the L-band frequency to achieve cloud-free, day-and-night land observation. The definitions of PALSAR data products by processing level are shown in Table 2.2. The processing levels of the observational modes are given in Table 2.3.


Figure 2.2.
Picture of the ALOS satellite with its parts (ALOS User's Handbook)

Table 2.2
Processing Levels and Their Definitions (ALOS User's Handbook)

Processing Level    Definition
1.0                 The data of one scene area is extracted from the received
                    data. The data type is 8 bit. The number of SAR data files
                    is the same as the number of polarizations in the case of
                    the dual polarization and polarimetry modes. The data in
                    ScanSAR mode is not divided into individual scans.
1.1                 Range compression and 1-look azimuth compression are
                    performed. The data is complex data in slant range
                    coordinates. The phase history is included.
1.5                 After range and multi-look azimuth compression are
                    performed, radiometric and geometric corrections are
                    applied according to the map projection. Pixel spacing can
                    be selected for the Fine mode.

PALSAR product formats are based on the CEOS (Committee on Earth Observation Satellites) revised standardized formats. One image volume consists of four kinds of files. The file names and their contents are shown in Table 2.4.


Table 2.3
Processing Levels of Observational Modes (ALOS User's Handbook)

Observation Mode                      Processing Level       Remarks
                                      1.0    1.1    1.5
Fine mode     Single polarization     O      O      O        18 beams
              Dual polarization       O      O      O        18 beams
ScanSAR mode  Burst mode 1            O      -      O        3 scans, 4 scans, 5 scans
              Burst mode 2            O      -      O        3 scans, 4 scans, 5 scans
Direct Downlink mode                  O      O      O        18 beams
Polarimetry mode                      O      O      O        12 beams

Table 2.4.
File names and their contents in the ALOS-PALSAR product format (ALOS User's Handbook)

File Name              Definition of File Name      Contents
Volume Directory File  VOL-Scene ID-Product ID      This file is located at the beginning of the
                                                    image volume and stores the volume and file
                                                    management information.
Leader File            LED-Scene ID-Product ID      This file is located before the image file
                                                    and stores annotation data, ancillary data
                                                    and other types of data related to the image
                                                    data in the succeeding image file.
Image File             IMG-XX-Scene ID-Product ID   This file is located after the leader file
                                                    and stores the image data.
Trailer File           TRL-Scene ID-Product ID      This file is located after the image file and
                                                    stores the final information related to the
                                                    image data.


2.3 Basic Concept of Synthetic Aperture Radar (SAR)

"SAR" is the acronym for Synthetic Aperture Radar; the words stand for "synthesis", "aperture" (opening) and "radar", while "radar" is itself an acronym for Radio Detection And Ranging. The radar technique was developed in the 20th century for its ability to determine physical parameters of an illuminated object (size, roughness or displacement) from the backscatter intensity and the range derived from the two-way travel time of the electromagnetic pulse. Radar imagery is displayed as a grayscale image.

Synthetic Aperture Radar (SAR) is a coherent radar system that generates high resolution remote sensing imagery and can work day and night since it is an active system (Agustan, 2010). Furthermore, most remote sensing SAR systems operate in the upper L band, in the C band or in the X band (i.e. within well-defined frequency bands comprised roughly between 1.2 and 10.9 GHz). At such frequencies the electromagnetic radiation penetrates cloud cover, so SAR sensors can acquire data in all weather conditions (Raucoules, 2007). Because of these characteristics, SAR has been used in various research fields, as listed in Table 2.5.


Table 2.5
Selected fields of SAR application examples. Note that not all applications are in practical use; many applications are still at developing stages (Ouchi, 2013)

Fields         Objects
Geology        topography, DEM & DSM production, crust movement, faults, GIS,
               soil structure, lithology, underground resources
Agriculture    crop classification, plantation acreage, growth, harvest &
               disaster, soil moisture
Forestry       tree biomass, height, species, plantation & deforestation,
               forest fire monitoring
Hydrology      soil moisture, wetland, drainage pattern, river flow, water
               equivalent of snow & ice, water cycle, water resources in desert
Urban          urban structure & density, change detection, subsidence,
               urbanization, skyscraper height estimation, traffic monitoring
Disaster       prediction, lifeline search, monitoring of damage & recovery,
               tsunami & high tide, landslide & subsidence by earthquake,
               volcano & groundwater extraction
Oceanography   ocean waves, internal waves, wind, ship detection,
               identification & navigation, currents, front, circulation, oil
               slick, offshore oil field, bottom topography
Cryosphere     classification, distribution & changes of ice & snow on land,
               sea & lake, ice age, equivalent water, glacier flow, iceberg
               tracking, ship navigation in sea ice
Archeology     exploration of aboveground and underground remains, survey,
               management

The SAR system was invented in 1951 by Carl Wiley and was subsequently developed for fine resolution mapping and other remote sensing applications. Table 2.6 shows the highlights of SAR history with a space emphasis. The remainder of this section describes the basic concepts of SAR.


Table 2.6
Highlights of SAR History with space emphasis (modified from USA SAR Marine User's Manual)

YEAR          DEVELOPMENT
1951          Carl Wiley of Goodyear postulates the Doppler beam-sharpening concept.
1952          University of Illinois demonstrates the beam-sharpening concept.
1957          University of Michigan produces the first SAR imagery using an optical correlator.
1964          Analog electronic SAR correlation demonstrated in non-real time (University of Michigan).
1969          Digital electronic SAR demonstrated in non-real time (Hughes, Goodyear, Westinghouse).
1972          Real-time digital SAR demonstrated with motion compensation (for aircraft systems).
1978          First space-borne SAR: the NASA/JPL SEASAT satellite. Analog downlink; optical and non-real-time digital processing.
1981          Shuttle Imaging Radar series starts with SIR-A. Non-real-time optical processing on the ground.
1984          SIR-B digital downlink; non-real-time digital processing on the ground.
1986          Space-borne SAR real-time processing demonstration using the JPL Advanced Digital SAR Processor (ADSP).
1987          Soviet 1870 SAR is placed in Earth orbit.
1990          Magellan SAR images Venus.
1990–present  Evolution of SAR in space begins (excluding military reconnaissance satellites): Soviet ALMAZ (1991), European ERS-1 (1991), Japanese JERS-1 (1992), SIR-C (1994), ERS-2 (1995), Canadian RADARSAT-1 (1995), SRTM (2000), ENVISAT (2002), Japanese ALOS (2006), Chinese Yaogan-1 (2006), Italian COSMO-SkyMed (2007), German TerraSAR-X (2007), Indian RISAT-1 (2009), South Korean KOMPSAT-5 (2013), Japanese ALOS-2 (2014).


2.3.1 Observation Geometry and Principles of SAR Imaging

Figure 2.3 depicts the configuration of a side-looking radar. The antenna is mounted on a platform (usually an aircraft or satellite) moving with a velocity (V) with respect to the Earth at a constant altitude; the flight direction is generally called azimuth. The radar illuminates along the direction perpendicular to the flight path, the slant range, with an inclination (look angle) with respect to the vertical.

Figure 2.3.
SAR geometry. (1): Off-nadir angle (2): Depression angle (3): Range beam width (4): Incidence angle (5): Azimuth beam width (source: Restec/JAXA)

For a radar to make an image from the echoes it receives, it needs two things: where each echo came from on the ground, and how bright each echo should be in the image. Figure 2.4 shows the SAR principle of radar imaging. The antenna emits microwave energy toward the target, and the backscatter (echo) from the target is received. Since the antenna is side looking, there is a difference in acquisition time between the far range and the near range. The backscattered energy received by the antenna varies depending on the properties of the target: high backscattered energy yields a bright pixel in the image, and vice versa.

Figure 2.4.
Illustration of how SAR works to make a radar image (adopted and modified from Restec/JAXA)

2.3.2 SAR Geometric Resolution

Simply speaking, geometric resolution is the ability of the system to localize nearby objects. More precisely, the resolution length is the minimum spacing between two objects that are detected as separate entities, and are therefore resolved (Franceschetti, 1999). In a SAR system the terms "range resolution" and "azimuth resolution" are introduced.

Range resolution of a SAR is determined by built-in radar and processor constraints which act in the slant range domain. The radar emits a short pulse that reflects off the surface of the earth and returns to the antenna. The amplitude versus time of the return pulse is a recording of the reflectivity of the surface. If adjacent reflectors appear as two distinct peaks in the return waveform, then they are resolved in range (see Figure 2.5). When the distance between two objects is less than cτ/2, these objects cannot be distinguished in the image.

The slant range resolution ΔR = cτ/2 is related to the ground range resolution Δx by

Δx = cτ / (2 sin θ)

where θ is the incidence angle, τ the pulse length, and c the speed of light. The factor of two accounts for the two-way travel time of the pulse. Figure 2.5 shows how the ground range resolution is geometrically related to the slant range resolution.

Figure 2.5
The relationship between the ground range (Δx) and slant range (ΔR). The distance (height) of the spacecraft from the ground surface is represented as H (adopted and modified from Restec/JAXA)


From Figure 2.5 we see that the range resolution is independent of the height of the spacecraft H. Note that the ground range resolution is infinite for a vertical look angle and improves as the look angle is increased. The range resolution can be improved by increasing the bandwidth of the radar. Usually the radar bandwidth is a small fraction of the carrier frequency, so a shorter wavelength radar does not necessarily enable higher range resolution. In many cases the bandwidth of the radar is limited by the speed at which the data can be transmitted from the satellite to a ground station (Sandwell et al., 2011).
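
To make this geometry concrete, the short Python sketch below evaluates the ground range resolution Δx = cτ/(2 sin θ) for a few incidence angles; the (compressed) pulse length used here is purely illustrative.

    import math

    C = 3.0e8  # speed of light (m/s)

    def ground_range_resolution(tau_s: float, theta_deg: float) -> float:
        """Ground range resolution Δx = cτ / (2 sin θ); θ is the incidence angle."""
        return C * tau_s / (2.0 * math.sin(math.radians(theta_deg)))

    # Illustrative compressed pulse length of 0.1 microsecond:
    for theta in (20.0, 35.0, 50.0):
        print(theta, round(ground_range_resolution(1e-7, theta), 1), "m")
    # The resolution improves (gets smaller) as the incidence angle increases,
    # consistent with the discussion above.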

Figure 2.6.
Top view of a SAR antenna imaging a point reflector (P). The reflector remains within the illumination pattern over the real aperture length W (Sandwell et al., 2011)

For a real aperture radar, azimuth resolution is determined by the angular beam width of the terrain strip illuminated by the radar beam. For two objects to be resolved, they must be separated in the azimuth direction by a distance greater than the beam width on the ground. SAR gets its name from the azimuth processing and can achieve an azimuth resolution which may be hundreds of times smaller than the transmitted antenna beam width. To understand the azimuth resolution, consider a single point reflector (P) on the ground that is illuminated as the radar passes overhead (Figure 2.6).

Consider an antenna of length L. The beam width of the real aperture is given by

β = λ / L     (1)

where β is the beam width and λ is the wavelength. The width illuminated by the real aperture (W) is therefore

W = β · R = λR / L     (2)

The length of the synthetic aperture is Ls = W, and the beam width of the synthetic aperture is

βs = λ / (2Ls)     (3)

where the factor of two arises because the phase difference is counted twice, on the way to and from the satellite. The spatial resolution in the azimuth direction (Ra) can then be derived as

Ra = βs · R     (4)

and, substituting Eq. (3) into Eq. (4),

Ra = (λ / 2Ls) · R = L / 2     (5)

The maximum spatial resolution in the azimuth direction is thus L/2, independent of distance and wavelength.
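
As a quick numerical check of Eqs. (1)–(5), the following sketch computes the synthetic-aperture quantities step by step; the wavelength, antenna length and slant range are illustrative values only, not the parameters of a specific sensor.

    # Numerical check of Eqs. (1)-(5): the azimuth resolution of a SAR is L/2.
    wavelength = 0.236   # L-band wavelength (m), illustrative
    L = 8.9              # antenna length (m), illustrative
    R = 850e3            # slant range (m), illustrative

    beta = wavelength / L           # real-aperture beam width, Eq. (1)
    W = beta * R                    # illuminated width, Eq. (2)
    Ls = W                          # synthetic aperture length
    beta_s = wavelength / (2 * Ls)  # synthetic-aperture beam width, Eq. (3)
    Ra = beta_s * R                 # azimuth resolution, Eq. (4)

    print(Ra, L / 2)  # both print 4.45: Ra equals L/2, as in Eq. (5)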

The final output of SAR data processing is a SAR image that can be seen as a mosaic of small picture elements (pixels). Each pixel corresponds to a small area of the Earth's surface that can be defined as a resolution cell, and contains a complex number that carries amplitude and phase information about the microwave field backscattered by all objects in the corresponding resolution cell projected on the ground. This information is stored in complex format using the IQ (in-phase and quadrature) data format. Therefore, the SAR image is also known as a single look complex (SLC) image, composed of a regular grid of complex values or phasors (Hanssen, 2001), which can be decomposed into amplitude (A) and phase (φ), or real and imaginary, components as expressed in the following equation:

y = A · e^(jφ)     (6)

where y is the SLC data that represents the electric field of a plane electromagnetic wave, A is the amplitude of the electromagnetic pulse, and φ is the phase angle. The amplitude represents the quantity of the electromagnetic field scattered back within each SAR image sampling cell or pixel, whereas the phase represents an ambiguous measure of the distance between the sensor and each area on the ground corresponding to an image pixel (Raucoules et al., 2007).
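
A minimal numpy sketch of Eq. (6) is given below: it decomposes SLC pixels into amplitude and phase and reassembles them. The tiny array is merely a stand-in for real SLC data.

    import numpy as np

    # A toy 2x2 "SLC" patch of complex (IQ) samples standing in for real data.
    slc = np.array([[1 + 1j, 0.5 - 2j],
                    [-1.5 + 0.2j, 2 + 0j]])

    amplitude = np.abs(slc)    # A in Eq. (6)
    phase = np.angle(slc)      # φ in Eq. (6), wrapped to (-π, π]

    # Reassembling y = A · exp(jφ) recovers the original complex pixels.
    reconstructed = amplitude * np.exp(1j * phase)
    assert np.allclose(reconstructed, slc)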

2.3.3 Geometrical Effects Introduced by SAR

Since it is side looking, SAR is affected by geometric distortions in the range imaging mode (Franceschetti, 1999). These effects are demonstrated in Figure 2.7, which shows SAR imaging along the range direction.


Figure 2.7
Schematic illustration showing how mountainous terrain can create noise through layover and shadow effects (Ferretti et al., 2007)

When the terrain slope exceeds the radar local incidence angle, the scatterers are imaged in reverse order and superimposed on the contributions coming from other areas. In this case, the top of the feature is displaced, or "laid over", relative to its base when it is processed into an image. In Figure 2.7, cells number 2, 3 and 4 show examples of layover: the reflections coming from the ground at B, F and G are all superimposed on top of each other in cell number 2, and cells number 3 and 4 show the same pattern. In general, layover is more prevalent for viewing geometries with small incidence angles, such as from satellites.

When an object in the scene blocks the radar wave from reaching other portions of the scene, shadow occurs in the SAR imagery, as shown in cells number 5, 6, 7 and 8 in Figure 2.7. Radar shadows in imagery indicate those areas on the ground surface not illuminated by the radar. Since no return signal is received, radar shadows appear very dark in tone in the imagery. Radar shadows occur in the down-range direction behind tall objects, and are a good indicator of the radar illumination direction if annotation is missing or incomplete. Since the incidence angle increases from near to far range, terrain illumination becomes more oblique and shadowing becomes more prominent toward the far range. Information about the scene, such as an object's height, can also be obtained from radar shadows. Shadowing in radar imagery is an important key for terrain relief interpretation (CCRS).

The last effect is foreshortening, which occurs as long as the slope of the terrain is smaller than the local incidence angle. Foreshortening in a radar image is the apparent compression of those features in the scene which are tilted toward the radar. This effect is illustrated in Figure 2.8. Foreshortening leads to a relatively brighter appearance of these slopes, and must be accounted for by the interpreter. Foreshortening is at a maximum when a steep slope is orthogonal to the radar beam. In this case, the local incidence angle is zero, and as a result the base, slope and top of a hill are imaged simultaneously and therefore occupy the same position in the image.

For a given slope or hillside, foreshortening effects are reduced with increasing incidence angles. At the grazing angle, where incidence angles approach 90°, foreshortening effects are eliminated, but severe shadowing may occur. In selecting the incidence angle, there is always a trade-off between the occurrence of foreshortening and the occurrence of shadowing in the image.


Figure 2.8
Illustration of the foreshortening effect in a radar imaging system. The foreshortening effect occurs as long as the slope of the terrain is smaller than the local incidence angle (modified from Franceschetti & Lanari, 2000)

To better understand the geometric effects in SAR, consider Figure 2.9, which shows an example SAR image acquired by the ALOS-PALSAR satellite.


Figure 2.9
An example SAR image of Mount Fuji, Japan. The yellow circled area in the image is severely affected by shadow; the dark color represents no energy backscattered from that area. In contrast, the red circled area looks very bright, indicating very strong backscattered energy due to the foreshortening or layover effect. Layover and foreshortening effects look very similar in a SAR image, which makes them difficult to distinguish visually. (Image source: http://gds.palsar.ersdac.jspacesystems.or.jp/e/collection/2009/fuji-palsar_gc.png)


2.4 Interferometric SAR (InSAR) Basics

Interferometric Synthetic Aperture Radar (InSAR), also referred to as SAR interferometry, is the measurement of signal phase change, or interference, over time. A satellite SAR can observe the same area from slightly different look angles. This can be done either simultaneously (with two radars mounted on the same platform) or at different times by exploiting repeated orbits of the same satellite. Since InSAR involves two different image acquisitions, the term "baseline" is introduced: the baseline length is the distance between the SAR satellite orbits of the first and second observations.

Based on the relative position of the two antennas/sensors when taking the data, two kinds of InSAR are introduced, namely along-track InSAR (AT-InSAR) and repeat-pass cross-track InSAR (CT-InSAR). AT-InSAR consists of two (or more) antennas placed along the body of an aircraft platform taking data in a single pass, while repeat-pass CT-InSAR acquires data over the same area using a single antenna on two or more passes. AT-InSAR and CT-InSAR are illustrated in Figure 2.10. This study focuses on the application of CT-InSAR, mainly D-InSAR.

Figure 2.10
Illustration of the geometry of repeat-pass CT-InSAR (left) and AT-InSAR (right) (modified from Ouchi, 2013)


In this technique an interferogram is obtained by cross-multiplying, pixel by pixel, two SAR (SLC) images. These two images must be coherent to be able to generate an interferogram. The first image is called the master and the second one the slave. Because of the different acquisition times of the two images, decorrelation will occur. Apart from decorrelation effects, to be discussed in section 2.4.3, SAR interferometry can only be applied in the following circumstances:

- the images have to be acquired by the same satellite using the same acquisition mode and properties (beam, polarization, off-nadir angle, etc.);
- the images have to be acquired with the satellite in the same nominal orbit;
- the baseline separation between the master scene and any of the slave scenes must be no more than the "critical baseline".

2.4.1 Geometrical Equations of CT-InSAR

This section describes the geometry and the basic geometric and interferometric phase relations of CT-InSAR.

Figure 2.11
InSAR geometry: B is the baseline; Br the radial baseline; Bn the normal baseline (Lazarov, 2010)


In Figure 2.11, consider two positions S1 and S2 of the SAR satellite which observe two point scatterers P0 and P1. Taking P0 as the reference point, the variation of the travel path difference (ΔR) that results in passing from the reference resolution cell to another is given by a simple expression that depends on a few geometric parameters, such as the perpendicular baseline (Bn), the radar-target distance (R0) and the displacement between the resolution cells along the perpendicular to the slant range (Np) (Ferretti et al., 2007). Because the distance Bn, measured along the normal to the reference line between the two SAR sensors, is much smaller than the radar-target distance R0, the following approximated expression of ΔR holds:

ΔR = Bn · Np / R0     (7)

From triangles RpPN and NP1P2 and the geometric relation Np = PN + NP1, the following equation can be written:

Np = Rp / tan θ + q / sin θ     (8)

Substituting Np in equation (7) by equation (8), ΔR can be expressed as

ΔR = (Bn / R0) · (Rp / tan θ + q / sin θ)     (9)

The phase difference corresponding to the distance variation ΔR is proportional to the travel path difference 2ΔR (the factor 2 accounts for the two-way travel path from S1 and S2 to P1) multiplied by the wave number k = 2π/λ, so that the phase difference (Φ) can be expressed as

Φ = (4π/λ) · (Bn / R0) · (Rp / tan θ + q / sin θ)     (10)

Equation (10) shows that the interferometric phase variation is proportional to two components. The first one is proportional to the slant range displacement Rp between the point targets P1 and P0:

Φ1 = (4π/λ) · (Bn / R0) · (Rp / tan θ)     (11)

The second one is proportional to the altitude difference q between targets P1 and P0, referred to a horizontal reference plane (see Figure 2.11):

Φ2 = (4π/λ) · (Bn / R0) · (q / sin θ)     (12)

Multiplication of the complex interferogram by the complex conjugate phase term e^(−jΦ1) is called interferogram flattening (Lazarov, 2010). It generates a phase map proportional to the relative terrain altitude. The change of the phase with the elevation of the target point is given by the derivative

dΦ/dq = 4π · Bn / (λ R0 sin θ)     (13)

This relation describes the height sensitivity of interferometric measurements, which may also be described by the height or altitude of ambiguity. The altitude of ambiguity Ha is defined as the altitude difference that generates an interferometric phase change of 2π after interferogram flattening. It is expressed by the following equation:

Ha = λ R0 sin θ / (2 Bn)     (14)
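
As a worked example of Eq. (14), the following sketch evaluates the altitude of ambiguity; all numerical values are illustrative assumptions, not the parameters of a specific mission.

    import math

    def altitude_of_ambiguity(wavelength, R0, theta_deg, Bn):
        """Ha = λ·R0·sin(θ) / (2·Bn), Eq. (14): the altitude difference that
        produces a 2π phase change after interferogram flattening."""
        return wavelength * R0 * math.sin(math.radians(theta_deg)) / (2.0 * Bn)

    # Illustrative values: L-band wavelength 0.236 m, slant range 850 km,
    # incidence angle 38.7 degrees, perpendicular baseline 400 m.
    print(round(altitude_of_ambiguity(0.236, 850e3, 38.7, 400.0), 1), "m")
    # ~157 m of relative terrain altitude per 2π fringe for these values.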

2.4.2 Contributors to Signal Phase

The interferogram phase contains many components arising from several contributions: topographic distortion due to the slightly different look angles of the two satellite passes, atmospheric effects, range displacement of the radar target, and noise. These components are described by the following equation (Sandwell et al., 2011):


phase = earth curvature (almost a plane, known)
      + topographic phase (broad spectrum)
      + surface deformation (broad spectrum, unknown)
      + orbit error (almost a plane, largely known)
      + ionosphere delay (a plane, or 40-km wavelength waves)
      + troposphere delay (power law, unknown)
      + phase noise (white spectrum, unknown)     (15)

2.4.3 Coherence

Interferometric fringes can only be observed when there is good coherence between the master and slave images. When an area on the ground appears to have the same surface characterization in all images under analysis, the images are said to be coherent; if the opposite happens, they are said to be decorrelated, or to have lost coherence. Loss of coherence results in noise and in no information being obtainable.

The coherence of two SAR images can be estimated with the following equation (Ferretti et al., 2007):

γ = E[u1 · u2*] / ( √E[|u1|²] · √E[|u2|²] )     (16)

where u1 and u2 are the complex pixel values of the master and slave images respectively, u2* denotes the complex conjugate, and E[·] denotes the expectation, in practice an average over a small estimation window. For a detailed explanation, refer to Ferretti et al. (2007).
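
The following numpy sketch estimates Eq. (16) on two co-registered SLC arrays, with a small boxcar average standing in for the expectation operator E[·]; the window size and the use of scipy's uniform_filter are implementation assumptions for illustration.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def coherence(master: np.ndarray, slave: np.ndarray, win: int = 5) -> np.ndarray:
        """Estimate |γ| per Eq. (16); E[·] is approximated by a win x win
        local average over the co-registered SLC arrays."""
        cross = master * np.conj(slave)
        # Filter real and imaginary parts separately (uniform_filter is real-valued).
        num = uniform_filter(cross.real, win) + 1j * uniform_filter(cross.imag, win)
        den = np.sqrt(uniform_filter(np.abs(master) ** 2, win)
                      * uniform_filter(np.abs(slave) ** 2, win))
        return np.abs(num) / np.maximum(den, 1e-12)  # guard against division by zero

    # Identical images give coherence ~1 everywhere:
    rng = np.random.default_rng(0)
    slc = rng.normal(size=(64, 64)) + 1j * rng.normal(size=(64, 64))
    print(coherence(slc, slc).mean())  # ≈ 1.0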

The coherence of an interferogram is affected by several factors, including: the topographic slope angle and orientation (steep slopes lead to low coherence), the terrain properties, the time between image acquisitions (longer time intervals lead to lower coherence), and the distance between the satellite tracks during the first and second acquisitions, also referred to as the baseline (larger baselines lead to lower coherence).

There are some typical sources of decorrelation, such as vegetation, construction activity, erosion and rapid movement. Leaves grow, die and move; from one scene to the next, these changes are sufficient to change the appearance of the surface. This is a particular problem for X-band and C-band sensors. L-band sensors can overcome this limitation in many situations, because their significantly longer wavelength is able to 'see' through foliage, reflect off objects beneath the vegetation and travel back through the foliage. At a construction site, the appearance of the land surface changes constantly; this is a problem common to X-band, C-band and L-band sensors. Rapid movement followed by destruction also changes the surface characteristics and is another source of decorrelation: if the total movement occurring between successive image acquisitions exceeds one-half of the signal's wavelength, decorrelation is likely to occur.

2.4.4 Applications

The two main fields of application of InSAR data are the reconstruction of digital elevation models (DEM generation) of large areas and the detection and monitoring of surface deformation phenomena, in general the measurement of displacement rates of objects on the ground, known as D-InSAR. Especially for displacement monitoring, the operational principles of satellite SAR have also been adopted for ground-based SAR, known as Terrestrial SAR Interferometry (TInSAR) (Mazzanti, 2011).


2.5 Differential InSAR (D-InSAR)

Suppose that some of the point scatterers on the ground slightly change their relative position in the time interval between two SAR observations (as, for example, in the event of subsidence, landslide, earthquake, etc.). In such cases the following additive phase term, independent of the baseline, appears in the interferometric phase:

Φ3 = −(4π/λ) · d     (17)

This means that, after interferogram flattening, the interferometric phase contains both altitude and motion contributions:

Φ̂ = Φ2 + Φ3 = (4π/λ) · (Bn / R0) · (q / sin θ) − (4π/λ) · d     (18)

Moreover, if a digital elevation model (DEM) is available, the altitude contribution can be subtracted from the interferometric phase (generating a so-called differential interferogram) and the terrain motion component can be measured (Ferretti et al., 2007). D-InSAR is thus a powerful technique for monitoring ground deformation/displacement.
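
The core of Eqs. (17) and (18) can be sketched in a few lines: remove the topographic phase from a (here synthetic) interferogram and convert the remaining, already unwrapped, differential phase to line-of-sight displacement d = −λ·Φ3/(4π). All arrays and values below are illustrative stand-ins, not real data.

    import numpy as np

    wavelength = 0.236  # L-band wavelength (m), illustrative

    # Synthetic stand-ins for a flattened interferogram phase Φ̂ and the
    # topographic phase Φ2 predicted from a DEM (both in radians).
    interferogram_phase = np.array([[1.8, 2.1], [2.5, 2.9]])
    topographic_phase = np.array([[1.8, 2.0], [2.3, 2.6]])

    # Differential phase Φ3 = Φ̂ - Φ2, per Eq. (18), assumed already unwrapped.
    phi3 = interferogram_phase - topographic_phase

    # Invert Eq. (17): LOS displacement d = -λ·Φ3 / (4π), in meters.
    d_los = -wavelength * phi3 / (4.0 * np.pi)
    print(d_los * 1000.0)  # millimeters; sign follows the Eq. (17) convention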

2.5.1 D-InSAR Processing for Land Deformation Monitoring

As mentioned before, the main application of D-InSAR is land deformation monitoring, and several strategies have been developed for it, such as a single interferometric pair with a near-zero baseline, a single interferometric pair with a non-zero baseline, three interferometric images and no motion, and two image pairs with no motion in one of them (Agustan, 2010). In general, however, the steps to produce an interferogram are the same. When two raw SAR data sets are available, the following steps are carried out to create an interferogram and isolate the phase due to surface deformation. Each step is explained below:

1. The first step is focusing; in this step the raw data of the master and slave images are focused through azimuth and range compression to make a Single Look Complex (SLC) image.

Figure 2.12
Flow chart of the focusing process to form a Single Look Complex (SLC) image from a raw SAR image (Franceschetti & Lanari, 2000)

2. The second step is co-registration, a fundamental step in interferogram generation. Its purpose is to ensure that each (range, azimuth) pixel in both the master image and the slave image contributes to the same target on the ground. This step also adjusts the Doppler centers of the SLC images in the azimuth direction; the Doppler centers of the master and slave images should be the same to produce an interferogram with high coherence.


3. Generation of synthetic fringes: from the two co-registered images, a complex interferogram is generated by multiplying the images pixel by pixel. There are two types of information in the interferogram, namely the phase (InSAR phase) and the coherence (InSAR coherence) (Ouchi, 2013).

4. Generation of synthetic fringes of the topographic phase; this step requires external DEM data, which is then converted to topographic phase.

5. The next step is subtracting the topographic phase from the complex interferogram. By removing the topographic phase component from the complex interferogram, the differential phase is obtained.

6. Differential phase filtering; this step is required to reduce phase noise and make phase unwrapping (the next step) more efficient and simpler. There are three options for filtering the interferogram, namely a band-pass filter, a filter based on the local phase gradient, and an adaptive filter based on the local fringe spectrum (Agustan, 2010).

7. Unwrapping; the differential phase is still modulo 2π, therefore it is necessary to determine the multiple of 2π to add to the measured phase at each pixel to obtain an estimate of the actual phase (see the short sketch after Figure 2.13).

8. Deformation generation; the unwrapped phase should be converted to length units to make displacement analysis easier.

9. Geocoding; all the images in the previous stages are in radar coordinates, and the conversion to geographical coordinates is done in this stage.

All of these processes are illustrated in Figure 2.13.


Figure 2.13
Illustration of the processing stages to obtain the Line of Sight (LOS) displacement by the D-InSAR technique (source: Wang et al., 2013)
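
To illustrate the 2π ambiguity mentioned in step 7, the sketch below wraps a known phase ramp and recovers it with numpy's one-dimensional unwrap, a simple stand-in for the two-dimensional algorithms (e.g. SNAPHU) used in practice.

    import numpy as np

    true_phase = np.linspace(0.0, 20.0, 200)    # a known phase ramp (radians)
    wrapped = np.angle(np.exp(1j * true_phase))  # wrapped to (-π, π]: the 2π modulo
    unwrapped = np.unwrap(wrapped)               # add multiples of 2π where needed

    # Unwrapping recovers the ramp up to a constant 2π·k offset (zero here,
    # since the ramp starts inside (-π, π]).
    assert np.allclose(unwrapped, true_phase, atol=1e-9)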

2.5.2 Derivation of Land Subsidence from LOS Displacement

Satellites observe land displacement in the range direction, whereas we need information about subsidence, i.e. the vertical displacement (Z). The information about land subsidence can be derived from the LOS displacement as shown in Figure 2.14, which illustrates a satellite at altitude (H) observing one point on the earth in the ascending and descending directions with off-nadir angle (θ) and slant range (R). The effect of the Earth's curvature is neglected in this assumption, since the width of the observation area is very small compared to the radius of the Earth.


Figure 2.14
Illustration of the relation between the LOS displacement and the subsidence or vertical displacement (Z) observed in the ascending and descending directions (A). Ascending and descending mean the satellite moves from south to north and from north to south respectively, as illustrated in panel (B).

Before this calculation is carried out, the image is first normalized by subtracting the pixel value of a reference point from all pixel values. Then, assuming the displacement in the horizontal direction is zero (no horizontal displacement), the vertical displacement is obtained by dividing the LOS displacement by cos θ, where θ is the incidence angle. This calculation is expressed by the following equation:

Z = LOS / cos θ     (19)
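
Equation (19), together with the reference-point normalization described above, can be sketched as follows; the incidence angle, reference pixel and LOS values are illustrative assumptions.

    import math
    import numpy as np

    def los_to_vertical(los: np.ndarray, theta_deg: float, ref: tuple) -> np.ndarray:
        """Normalize a LOS displacement map to a reference pixel, then apply
        Eq. (19), Z = LOS / cos(θ), assuming zero horizontal displacement."""
        normalized = los - los[ref]
        return normalized / math.cos(math.radians(theta_deg))

    # Illustrative 2x2 LOS map (meters), incidence angle 38.7°, reference pixel (0, 0).
    los = np.array([[0.010, 0.004], [-0.012, 0.000]])
    print(los_to_vertical(los, 38.7, (0, 0)))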

2.6 D-InSAR Processor: GMTSAR

Generic Mapping Tools Synthetic Aperture Radar (GMTSAR) is an open source (GNU General Public License) InSAR processing system developed by David Sandwell et al. at the Scripps Institution of Oceanography, University of California (Sandwell et al., 2011). The SAR processor code was originally derived from Stanford/JPL FORTRAN and rewritten in the C programming language, ensuring compilation on many platforms using the gcc compiler. GMTSAR is command line interface (CLI) software and is able to perform three kinds of InSAR processing, namely 2-pass processing, stacking for time series analysis, and ScanSAR to strip-mode processing.

This study uses 2-pass processing, in which two SAR images form an interferogram. A flow diagram of a script called p2p_SAT.csh is shown in Figure 2.15 (Sandwell et al., 2011). A brief procedure for D-InSAR generation in GMTSAR using p2p_SAT.csh follows; the detailed procedure can be found on the GMTSAR website, http://topex.ucsd.edu/gmtsar/ .

Figure 2.15
Flow diagram of two-pass processing in GMTSAR (Sandwell et al., 2011)


According to Sandwell et al. (2011), seven steps must be carried out in 2-pass processing (a typical invocation is sketched after this list):

1. Preprocess. The raw SAR data and orbital information are usually in L1.0 CEOS format; preprocessing is run to create an ASCII parameter file and a raw data file. This preprocessing involves specialized code to extract orbital position and velocity information from the leader files, align the raw radar echoes on a common near range, and estimate the Doppler centroid of the raw data.

2. Focus. The second step focuses each image to create two single look complex (SLC) images.

3. Align the repeat image to the reference image. This is accomplished by first using the orbital information to estimate the shift in range and azimuth needed to align the upper left corner of the images. The repeat image is refocused using these parameters, resulting in sub-pixel alignment between the reference and repeat images. Smaller-scale pixel shifts due to large amplitude surface topography are corrected at the interfere step.

4. dem2topo_ra. The fourth step transforms the digital elevation model from longitude, latitude, and topography into range, azimuth, and topography. This is done using the precise orbital information of the reference image.

5. The fifth step is to interfere the reference and repeat SLCs, using topo_shift.grd both to refine the image alignment of the repeat image for topographic parallax and to remove the baseline-dependent topographic phase from the repeat image prior to cross multiplication (phasediff). Thus all the position and phase corrections are applied at the full resolution of the SLCs.

6. The sixth step, filter/snaphu, is to low-pass filter (conv) and decimate the real and imaginary components of the interferogram and compute the standard products of amplitude, phase, and coherence.

7. The final step is to geocode all the products by transforming from the range/azimuth coordinate system of the master image to longitude and latitude.
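
The seven steps above are driven by a single script call. A hypothetical invocation, wrapped in Python here only for reproducibility, is sketched below; the positional argument pattern (master scene, slave scene, configuration file) follows the description in the GMTSAR documentation, but the file names are placeholders, not real data.

    import subprocess

    # Hypothetical example: run GMTSAR's two-pass script on an ALOS-PALSAR pair.
    # The scene and configuration file names below are placeholders.
    master = "IMG-HH-ALPSRP000000000-H1.0__A"  # placeholder master scene
    slave = "IMG-HH-ALPSRP111111111-H1.0__A"   # placeholder slave scene
    config = "config.alos.txt"                  # placeholder configuration file

    # p2p_SAT.csh is the generic name used by Sandwell et al. (2011); the
    # installed script is satellite-specific (e.g. an ALOS variant).
    subprocess.run(["p2p_SAT.csh", master, slave, config], check=True)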

2.7 Review of Applications of D-InSAR to Land Displacement Monitoring

According to Hanssen (2003), satellite repeat-pass radar interferometric measurement can be used for monitoring subsidence phenomena with high accuracy. Deformation monitoring by remote sensing techniques, and in particular InSAR, can complement or, in certain cases, replace ground-based techniques. One of these measurement techniques, known as D-InSAR, enables the analysis of very small ground movements over continuous, large areas and has the advantages of high resolution, all-weather adaptability, low cost and coverage of inaccessible areas (Wang et al., 2013). Crosetto et al. (2003) classify the D-InSAR techniques as follows:

1. Coherence-based D-InSAR with a single image pair.
2. Coherence-based D-InSAR with multiple images.
3. D-InSAR based on Interest Points (IP) selected on multiple images.

The third type of D-InSAR is also known as permanent scatterer (PS) InSAR, which was developed by Ferretti et al. in 2001.

Much research on land deformation has been done by many scientists employing D-InSAR techniques over the last two decades. D-InSAR results have been compared to the results of other methods, such as GPS, and show good agreement, although improvements are still needed for better accuracy.


For a feasibility analysis, Hanssen (2003) defined two main groups of interferometric parameters, namely design parameters and environment parameters. The design parameters include wavelength, baseline, temporal baseline, number of images, and incidence angle or inclination, while the environment parameters include atmosphere, surface and deformation. All of these parameters influence the feasibility of deformation monitoring using satellite InSAR.

Like other methods, D-InSAR also has limitations in land displacement monitoring. As mentioned by Raucoules et al. (2007), the main limitations in the detection of motion by means of the radar interferometry technique are linked to the loss of coherence with time, to the influence of atmospheric artifacts, to the presence of uncompensated topography, and to instrumental limitations such as the orbital cycle or the pixel size. Among these, the main limitations are caused by the so-called temporal and geometric decorrelations as well as atmospheric artifacts. Moreover, the precision of D-InSAR results depends not only on the quality of the SAR images but also on the data processing methods used (Chen et al., 2013).

Development of interferometric techniques for land displacement monitoring is not yet complete, and improvements to enhance the accuracy of D-InSAR continue to be developed. A new algorithm for surface deformation monitoring based on small baseline D-InSAR interferograms was introduced by Berardino et al. (2002). This technique is based on an appropriate combination of differential interferograms produced from data pairs characterized by a small orbital separation (baseline), in order to limit the spatial decorrelation phenomena. The small baseline approach was further developed by Lanari et al. in 2004 for full-resolution differential SAR interferograms.
