CHAPTER II
BASIC THEORY
2.1 Remote Sensing
2.1.1 Definitions
There are many possible definitions of what remote sensing actually is.
The following are some of its definitions according to several scientists. F.F.
Sabins (1978), in his book "Remote Sensing: Principles and Interpretation",
defines it as follows: "Remote Sensing is the science of acquiring, processing
and interpreting images that record the interaction between electromagnetic
energy and matter". Lillesand and Kiefer (2007), in their book "Remote Sensing
and Image Interpretation", even define it as an art: "Remote Sensing is the
science and art of obtaining information about an object, area, or phenomenon
through the analysis of data acquired by a device that is not in contact with
the object, area, or phenomenon under investigation". Probably the broadest
definition is given by Charles Elachi (2006) in "Introduction to the Physics
and Techniques of Remote Sensing": "Remote Sensing is defined as the
acquisition of information about an object without being in physical contact
with it". Finally, the GIS Dictionary (2015) defines it as follows: "Remote
sensing is collecting and interpreting information about the environment and
the surface of the earth from a distance, primarily by sensing radiation that
is naturally emitted or reflected by the earth's surface or from the
atmosphere, or by sensing signals transmitted from a device and reflected back
to it". Examples of remote-sensing methods include aerial photography, radar,
and satellite imaging.
2.1.2 Types of Remote Sensing
Based on its platform, remote sensing is divided into three types, namely
airborne remote sensing, shuttle-borne remote sensing and space-borne remote
sensing. Whenever the remote sensing sensor is carried by an airplane, drone or
UAV, the system is categorized as airborne remote sensing; examples of such
systems are AIRSAR by NASA/JPL and Pi-SAR by NICT/JAXA. In shuttle-borne remote
sensing the platform is a shuttle craft, for example SRTM, flown in 2000 by
NASA. The last type is space-borne remote sensing, in which the sensors are
carried by a satellite. The first space-borne remote sensing was initiated by
the US military through the Corona programs begun in 1959 (Baumann, 2009).
Based on its sensors, there are two types of remote sensing, namely
passive remote sensing and active remote sensing. Passive remote sensing uses
passive sensors, which only receive and measure energy that is naturally
available. The sun provides a very convenient source of energy for remote
sensing. The sun's energy is either reflected, as it is for visible
wavelengths, or absorbed and then re-emitted, as it is for thermal infrared
wavelengths. Many remote sensing systems are of this type, mainly space-borne
ones, for example ALOS-AVNIR2, ALOS-PRISM, SPOT, the LANDSAT family, ASTER, etc.
On the other hand, active remote sensing uses active sensors, which
provide their own energy source for illumination. The sensor emits radiation
that is directed toward the target to be investigated, and the radiation
reflected from that target is detected and measured by the sensor. Advantages
of active sensors include the ability to obtain measurements anytime,
regardless of the time of day or season. Active sensors can be used for
examining wavelengths that are not sufficiently provided by the sun, such as
microwaves, or to better control the way a target is illuminated. However,
active systems require the generation of a fairly large amount of energy to
adequately illuminate targets. Some examples of active sensors are the laser
fluorosensor and the synthetic aperture radar (SAR) (CCRS, 2014).
2.2 Radar Remote Sensing
Radar remote sensing is a type of active remote sensing that uses
microwave radiation with wavelengths from about one centimeter to a few tens
of centimeters, which enables observation in all weather conditions, day and
night. This is an advantage that is not possible with visible and/or infrared
remote sensing. However, the need for sophisticated data analysis is a
disadvantage of microwave remote sensing. Radar bands and their designations
are presented in Table 2.1; the bands most commonly used in radar remote
sensing are marked with (*) in Table 2.1 (Lusch, 1999).
Table 2.1
Frequency and wavelength for microwave bands (Lusch, 1999)

Band Designation    Wavelength range (cm)    Frequency range (GHz)
Ka                  0.75 – 1.10              40.0 – 26.5
K                   1.10 – 1.67              26.5 – 18.0
Ku                  1.67 – 2.40              18.0 – 12.5
X*                  2.40 – 3.75              12.5 – 8.0
C*                  3.75 – 7.50              8.0 – 4.0
S                   7.50 – 15.0              4.0 – 2.0
L*                  15.0 – 30.0              2.0 – 1.0
P                   30.0 – 130.0             1.0 – 0.23
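The wavelength and frequency columns of Table 2.1 are linked by the relation f = c / λ, where c is the speed of light. As a minimal sketch, the conversion can be checked for the starred imaging-radar bands (the helper function name is my own, not from any standard library):

```python
# Sketch: converting the wavelength limits of Table 2.1 to frequencies
# using f = c / lambda. Band limits are taken from Table 2.1 (Lusch, 1999).

C = 299_792_458.0  # speed of light in vacuum, m/s

def wavelength_cm_to_frequency_ghz(wavelength_cm: float) -> float:
    """Convert a radar wavelength in centimetres to a frequency in GHz."""
    wavelength_m = wavelength_cm / 100.0
    return C / wavelength_m / 1e9

# The most commonly used imaging-radar bands (marked * in Table 2.1)
for band, (lo_cm, hi_cm) in {"X": (2.40, 3.75),
                             "C": (3.75, 7.50),
                             "L": (15.0, 30.0)}.items():
    f_hi = wavelength_cm_to_frequency_ghz(lo_cm)  # shorter wavelength -> higher frequency
    f_lo = wavelength_cm_to_frequency_ghz(hi_cm)
    print(f"{band}-band: {lo_cm}-{hi_cm} cm  ~  {f_lo:.1f}-{f_hi:.1f} GHz")
```

Running this reproduces the frequency column of the table, e.g. the L-band range of 15.0–30.0 cm corresponds to roughly 1.0–2.0 GHz.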
2.2.1 Radar Remote Sensing Satellites
The first civilian space-borne SAR was SEASAT (USA) in 1978, followed
by Almaz (USSR/Russia), ERS-1 (Europe), J-ERS-1 (Japan), ERS-2 (Europe) and
RADARSAT-1 (Canada). Nowadays many satellites orbit the Earth carrying radar
sensors on board, providing enormous amounts of data for studying the Earth.
Figure 2.1 illustrates the family of satellites carrying SAR sensors for
commercial applications from 1992 onward.
Figure 2.1.
Satellite radar system available now and into the future
SAR sensors carried by satellites in polar orbits generally look to the
right of the satellite, except for ALOS-2, which can look both right and left
by carrying two antennas. Given the orbital inclination, these side-looking
sensors can image the North Pole (actually an area within a few square
kilometers of it), but not the South Pole (unless the satellite is turned
around) (Henri, 2008).
All satellites equipped with SAR sensors orbit the Earth in a near-polar
orbit at an altitude ranging from 500 to 800 km above the Earth's surface,
depending on the satellite platform hosting the SAR sensor. The angle between
the true north-south direction and the satellite orbit varies slightly from
satellite to satellite but, in general, lies within about 10 degrees.
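The 500–800 km altitude range quoted above fixes the orbital period of these platforms. As a minimal sketch (using standard values for the Earth's radius and gravitational parameter; the function name is my own), Kepler's third law for a circular orbit, T = 2π√(a³/μ), gives:

```python
import math

# Sketch: orbital period of a SAR satellite from its altitude, using
# Kepler's third law T = 2*pi*sqrt(a^3 / mu) for a circular orbit.
# The 500-800 km altitude range is taken from the text above.

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0       # mean Earth radius, m

def orbital_period_minutes(altitude_km: float) -> float:
    """Period of a circular orbit at the given altitude above the surface."""
    a = R_EARTH + altitude_km * 1000.0   # semi-major axis, m
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 60.0

for h in (500, 650, 800):
    print(f"{h} km altitude -> period ~ {orbital_period_minutes(h):.1f} min")
```

This yields periods of roughly 95–101 minutes, i.e. such a satellite completes about 14–15 orbits per day.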
2.2.2 ALOS-PALSAR System Overview
This section briefly describes the Advanced Land Observing Satellite
(ALOS), with a focus on the PALSAR sensor, as summarized from the ALOS User
Handbook by JAXA. ALOS, nicknamed "Daichi", is a Japanese satellite launched
on January 24, 2006. Its observation sensors consist of a high-resolution
stereo mapping sensor (PRISM), a visible and near-infrared radiometer
(AVNIR-2), and an L-band synthetic aperture radar (PALSAR), all of which are
high-performance systems.
The Phased Array type L-band Synthetic Aperture Radar (PALSAR) is an
active microwave sensor using the L-band frequency to achieve cloud-free,
day-and-night land observation. The definitions of the PALSAR data product
processing levels are shown in Table 2.2, and the processing levels available
for each observation mode are given in Table 2.3.
Figure 2.2.
Picture of the ALOS satellite and its parts (ALOS User's Handbook)
Table 2.2
Processing Levels and Their Definitions (ALOS User's Handbook)

Processing Level    Definition
1.0                 The data of one scene area is extracted from the received
                    data. The data type is 8 bit. The number of SAR data files
                    equals the number of polarizations in the dual-polarization
                    and polarimetry modes. Data in ScanSAR mode is not divided
                    into individual scans.
1.1                 Range compression and single-look azimuth compression are
                    performed. The data is complex data in slant range
                    coordinates. The phase history is included.
1.5                 After range and multi-look azimuth compression are
                    performed, radiometric and geometric corrections are
                    applied according to the map projection. Pixel spacing can
                    be selected for the Fine mode.
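The "complex data" of processing level 1.1 means that each pixel of a single-look complex (SLC) product is a complex number whose magnitude gives the backscatter amplitude and whose argument gives the phase. A minimal sketch, using a tiny synthetic array rather than real PALSAR samples:

```python
import numpy as np

# Sketch: separating a level 1.1 (single-look complex) image into
# amplitude and phase. The 2x2 array below is synthetic stand-in data,
# not real PALSAR samples.

slc = np.array([[3 + 4j, 1 + 0j],
                [0 + 2j, -1 - 1j]])   # hypothetical SLC pixels

amplitude = np.abs(slc)     # |z|   -> basis of the grayscale radar image
phase = np.angle(slc)       # arg z -> the "phase history" kept at level 1.1

print(amplitude)
print(phase)
```

The phase component is discarded in the detected level 1.5 product, which is why interferometric applications must start from level 1.1 data.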
PALSAR product formats are based on the revised standardized formats of
CEOS (the Committee on Earth Observation Satellites). One image volume
consists of four kinds of files; the file names and their contents are shown
in Table 2.4.
Table 2.3
Processing Levels of Observational Modes (ALOS User's Handbook)

Observation Mode                      Processing Level    Remarks
                                      1.0   1.1   1.5
Fine mode     Single polarization      O     O     O      18 beams
              Dual polarization        O     O     O      18 beams
ScanSAR mode  Burst mode 1             O     -     O      3 scans, 4 scans, 5 scans
              Burst mode 2             O     -     O      3 scans, 4 scans, 5 scans
Direct Downlink mode                   O     O     O      18 beams
Polarimetry mode                       O     O     O      12 beams
Table 2.4
File names and their contents in the ALOS-PALSAR product format (ALOS User's
Handbook)

File Name               Definition of File Name      Contents
Volume Directory File   VOL-Scene ID-Product ID      This file is located at
                                                     the beginning of the image
                                                     volume and stores the
                                                     volume and file management
                                                     information.
Leader File             LED-Scene ID-Product ID      This file is located
                                                     before the image file and
                                                     stores annotation data,
                                                     ancillary data and other
                                                     types of data related to
                                                     the image data in the
                                                     succeeding image file.
Image File              IMG-XX-Scene ID-Product ID   This file is located after
                                                     the leader file and stores
                                                     the image data.
Trailer File            TRL-Scene ID-Product ID      This file is located after
                                                     the image file and stores
                                                     the final information
                                                     related to the image data.
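The fixed prefixes in Table 2.4 make it straightforward to identify each file's role within a product directory. A minimal sketch (the file names and the helper function below are hypothetical illustrations of the "PREFIX-Scene ID-Product ID" pattern, not real scene identifiers):

```python
# Sketch: classifying the four CEOS files of a PALSAR product (Table 2.4)
# by their fixed name prefixes.

PREFIXES = {
    "VOL": "volume directory file",
    "LED": "leader file",
    "IMG": "image file",
    "TRL": "trailer file",
}

def classify_ceos_file(filename: str) -> str:
    """Return the file's role from its CEOS name prefix, per Table 2.4."""
    prefix = filename.split("-", 1)[0]
    return PREFIXES.get(prefix, "unknown file")

# Hypothetical file names for a single scene
for name in ("VOL-SCENEID-PRODUCTID",
             "LED-SCENEID-PRODUCTID",
             "IMG-HH-SCENEID-PRODUCTID",   # XX encodes the polarization, e.g. HH
             "TRL-SCENEID-PRODUCTID"):
    print(f"{name}: {classify_ceos_file(name)}")
```

Note that the image file carries an extra "XX" field (the polarization), which is why a dual-polarization or polarimetry product contains several IMG files but only one of each of the other three.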
2.3 Basic concept of Synthetic Aperture Radar (SAR)
"SAR" is the acronym for Synthetic Aperture Radar, combining the words
"synthetic" (synthesized), "aperture" (opening) and "radar"; "radar" is itself
the acronym for RAdio Detection And Ranging. The radar technique was developed
in the 20th century for its ability to determine physical parameters of an
illuminated object (size, roughness or displacement) using the backscatter
intensity and the range, derived from the two-way travel time of the
electromagnetic pulse. Radar imagery is displayed as a grayscale image.
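The ranging principle mentioned above is simply R = c·t/2: the pulse travels to the target and back, so half the round-trip time times the speed of light gives the slant range. A minimal sketch (the function name and the example travel time are illustrative):

```python
# Sketch: slant range from the two-way travel time of a radar pulse,
# R = c * t / 2 (the pulse covers the sensor-target distance twice).

C = 299_792_458.0  # speed of light in vacuum, m/s

def slant_range_m(two_way_time_s: float) -> float:
    """Distance to the target given the echo's round-trip time."""
    return C * two_way_time_s / 2.0

# A target at roughly 700 km slant range returns an echo after about 4.67 ms
print(f"{slant_range_m(4.67e-3) / 1000:.0f} km")
```

At spaceborne distances these travel times are a few milliseconds, which sets the timing precision a radar system must achieve.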
Synthetic Aperture Radar (SAR) is a coherent radar system that generates
high-resolution remote sensing imagery and can operate day and night, since it
is an active system (Agustan, 2010). Furthermore, most remote sensing SAR
systems operate in the upper L band, in the C band or in the X band, i.e.
within well-defined frequency bands lying roughly between 1.2 and 10.9 GHz. At
such frequencies the electromagnetic radiation penetrates cloud cover, so SAR
sensors can acquire data in all weather conditions (Raucoules, 2007). Because
of these characteristics, SAR has been used in various research fields, as
listed in Table 2.5.
Table 2.5
Selected fields of SAR application examples. Note that not all applications are
in practical use; many are still at the development stage (Ouchi, 2013)
Fields Objects
Geology
Agriculture
Forestry
Hydrology
Urban
Disaster
Oceanography
Cryosphere
Archeology
topography, DEM & DSM production, crust movement, faults,