Applications

Introduction

As we learned in the section on sensors, each sensor is designed with a specific purpose. With optical sensors, the design focuses on the spectral bands to be collected. With radar imaging, the incidence angle and the microwave band used play an important role in defining which applications the sensor is best suited for.

Each application has its own demands for spectral resolution, spatial resolution, and temporal resolution.

To review, spectral resolution refers to the width or range of each spectral band being recorded. As an example, panchromatic imagery (sensing a single broad band spanning the visible wavelengths) will not be as sensitive to vegetation stress as a narrow band in the red wavelengths, where chlorophyll strongly absorbs electromagnetic energy.
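
To make this concrete, here is a minimal numpy sketch of the idea. The spectra are invented for illustration (not measured values), with stressed vegetation reflecting more in the 640-690 nm region where chlorophyll absorption weakens:

```python
import numpy as np

# Illustrative reflectance spectra, sampled every 10 nm from 450 to 740 nm.
wavelengths = np.arange(450, 750, 10)
healthy  = np.where((wavelengths >= 640) & (wavelengths <= 690), 0.04, 0.10)
stressed = np.where((wavelengths >= 640) & (wavelengths <= 690), 0.09, 0.10)

def band_average(spectrum, lo, hi):
    """Mean reflectance a sensor would record over the band [lo, hi] nm."""
    mask = (wavelengths >= lo) & (wavelengths <= hi)
    return spectrum[mask].mean()

for name, spec in [("healthy", healthy), ("stressed", stressed)]:
    pan = band_average(spec, 450, 740)   # broad panchromatic band
    red = band_average(spec, 660, 680)   # narrow red band
    print(f"{name}: panchromatic={pan:.3f}  narrow red={red:.3f}")

# The narrow red band more than doubles (0.040 -> 0.090), while the
# broad panchromatic average barely moves (0.088 -> 0.098): the stress
# signal is diluted by the unchanged wavelengths.
```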

Spatial resolution refers to the discernible detail in the image. Detailed mapping of wetlands requires far finer spatial resolution than does the regional mapping of physiographic areas.
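
As a quick illustration (with a made-up 6 x 6 scene), block-averaging simulates how a sensor with coarser pixels loses a narrow feature:

```python
import numpy as np

# A hypothetical fine-resolution scene: a one-pixel-wide "channel" of
# water (0) crossing vegetation (1).
fine = np.ones((6, 6), dtype=float)
fine[:, 2] = 0.0

# Simulate a sensor with 3x coarser pixels by averaging 3 x 3 blocks.
coarse = fine.reshape(2, 3, 2, 3).mean(axis=(1, 3))

print(fine)
print(coarse)   # the channel is smeared into mixed pixels (~0.67)
# At the coarse resolution the narrow feature can no longer be
# delineated; it only lowers the average brightness of the pixels
# it happens to fall in.
```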

Temporal resolution refers to the time interval between images. There are applications requiring data repeatedly and often, such as oil spill, forest fire, and sea ice motion monitoring. Some applications only require seasonal imaging (crop identification, forest insect infestation, and wetland monitoring), and some need imaging only once (geology structural mapping). Obviously, the most time-critical applications also demand fast turnaround for image processing and delivery - getting useful imagery quickly into the user's hands.

Where repeated imaging is required, both the revisit frequency of a sensor (how long before it can image the same spot on the Earth again) and the reliability of successful data acquisition are important. Optical sensors have limitations in cloudy environments, where the targets may be obscured from view; in some areas of the world, particularly the tropics, this is a virtually permanent condition. Polar areas also suffer from inadequate solar illumination for months at a time. Radar provides reliable data because the sensor supplies its own illumination, and its long wavelengths penetrate cloud, smoke, and fog, ensuring that the target is neither obscured by weather conditions nor poorly illuminated.

Often it takes more than a single sensor to adequately address all of the requirements of a given application. The combined use of multiple sources of information is called integration. Additional data that can aid in the analysis or interpretation of the imagery is termed "ancillary" data.

The applications of remote sensing described in this chapter are representative, but not exhaustive. We do not touch, for instance, on the wide area of research and practical application in weather and climate analysis, but focus on applications tied to the surface of the Earth. The reader should also note that there are a number of other applications that are practiced but are very specialized in nature, and not covered here (e.g. terrain trafficability analysis, archeological investigations, route and utility corridor planning, etc.).

Multiple sources of information

Each band of information collected from a sensor contains important and unique data. We know that different wavelengths of incident energy are affected differently by each target - they are absorbed, reflected or transmitted in different proportions. The appearance of targets can easily change over time, sometimes within seconds. In many applications, using information from several different sources ensures that target identification or information extraction is as accurate as possible. The following sections describe ways of obtaining far more information about a target or area than is possible with a single band from a single sensor.

Multispectral

The use of multiple bands of spectral information attempts to exploit different and independent "views" of the targets so as to make their identification as confident as possible. Studies have been conducted to determine the optimum spectral bands for analyzing specific targets, such as insect-damaged trees.
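
As a simple illustration of combining bands, the sketch below computes the widely used Normalized Difference Vegetation Index (NDVI) from red and near-infrared reflectances; the pixel values are invented for illustration:

```python
import numpy as np

# NDVI exploits the fact that healthy vegetation absorbs red light but
# strongly reflects near-infrared; neither band alone separates
# vegetation from bright soil as reliably as their combination.
red = np.array([[0.05, 0.30],     # illustrative reflectances:
                [0.06, 0.28]])    # left column ~ vegetation, right ~ soil
nir = np.array([[0.45, 0.35],
                [0.50, 0.33]])

ndvi = (nir - red) / (nir + red + 1e-9)   # small epsilon avoids 0/0
print(ndvi)   # vegetation ~0.8, soil ~0.1
```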

Multisensor

Different sensors often provide complementary information, and when integrated, they can facilitate interpretation and classification of imagery. Examples include combining high resolution panchromatic imagery with coarse resolution multispectral imagery, or merging actively and passively sensed data. A specific example is the integration of SAR imagery with multispectral imagery. SAR data adds the expression of surficial topography and relief to an otherwise flat image, while the multispectral image contributes meaningful colour information about the composition or cover of the land surface. This type of merged image is often used in geology, where lithology or mineral composition is represented by the spectral component and structure is represented by the radar component.
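
One common intensity-substitution approach to this kind of merge is the Brovey transform, which rescales the colour bands so that pixel brightness follows the high-resolution band. The sketch below is a minimal version, assuming the SAR (or panchromatic) band and the multispectral image are already co-registered and scaled to [0, 1]:

```python
import numpy as np

def brovey_fuse(rgb, intensity, eps=1e-9):
    """Brovey-style fusion: rescale each multispectral band so the
    pixel intensity matches the high-resolution band, preserving the
    colour (band ratios) of the multispectral image.

    rgb       : (rows, cols, 3) multispectral image, values in [0, 1]
    intensity : (rows, cols) co-registered SAR or panchromatic band
    """
    current = rgb.sum(axis=2, keepdims=True)            # per-pixel intensity
    ratio = intensity[..., np.newaxis] * 3.0 / (current + eps)
    return np.clip(rgb * ratio, 0.0, 1.0)

# Illustrative 2 x 2 scene: flat multispectral colour modulated by
# SAR brightness, so relief shows through while colour is preserved.
rgb = np.full((2, 2, 3), 0.4)
sar = np.array([[0.2, 0.8],
                [0.5, 0.5]])
print(brovey_fuse(rgb, sar))
```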

Multitemporal

Information from multiple images taken over a period of time is referred to as multitemporal information. Multitemporal may refer to images taken days, weeks, or even years apart. Monitoring land cover change or growth in urban areas requires images from different time periods. Calibrated data, with careful controls on the quantitative aspect of the spectral or backscatter response, is required for proper monitoring activities. With uncalibrated data, a classification of the older image is compared to a classification of the more recent image, and changes in the class boundaries are delineated. Another valuable multitemporal tool is the observation of vegetation phenology (how the vegetation changes throughout the growing season), which requires data at frequent intervals throughout the growing season.
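
Here is a minimal sketch of the uncalibrated, post-classification comparison just described, with hypothetical class codes and two invented classification grids:

```python
import numpy as np

# Post-classification change detection: compare two independently
# classified images of the same area.
# Hypothetical class codes: 0 = water, 1 = forest, 2 = urban.
class_1995 = np.array([[1, 1, 0],
                       [1, 1, 0],
                       [1, 1, 1]])
class_2005 = np.array([[1, 2, 0],
                       [2, 2, 0],
                       [1, 1, 1]])

changed = class_1995 != class_2005        # boolean change mask
print("changed pixels:", changed.sum())   # 3

# A from-to matrix shows *what* changed, not just where:
n_classes = 3
transitions = np.zeros((n_classes, n_classes), dtype=int)
np.add.at(transitions, (class_1995.ravel(), class_2005.ravel()), 1)
print(transitions)   # row = old class, column = new class
```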

"Multitemporal information" is acquired from the interpretation of images taken over the same area, but at different times. The time difference between the images is chosen so as to be able to monitor some dynamic event. Some catastrophic events (landslides, floods, fires, etc.) would need a time difference counted in days, while much slower-paced events (glacier melt, forest regrowth, etc.) would require years. This type of application also requires consistency in illumination conditions (solar angle or radar imaging geometry) to provide consistent and comparable classification results.

The ultimate in critical (and quantitative) multitemporal analysis depends on calibrated data. Only by relating the brightnesses seen in the image to physical units can the images be precisely compared, and thus the nature and magnitude of the observed changes determined.
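
A minimal sketch of what calibration buys: converting digital numbers to at-sensor radiance with a linear calibration before differencing two dates. The gains and offsets below are illustrative; real ones come from the sensor's calibration documentation:

```python
import numpy as np

# Linear calibration: radiance = gain * DN + offset.
def dn_to_radiance(dn, gain, offset):
    return gain * dn.astype(float) + offset

dn_t1 = np.array([[120, 130],
                  [125, 128]])
dn_t2 = np.array([[151, 190],
                  [158, 185]])   # acquired with different sensor settings

rad_t1 = dn_to_radiance(dn_t1, gain=0.76, offset=-1.5)
rad_t2 = dn_to_radiance(dn_t2, gain=0.60, offset=-1.0)

print(rad_t2 - rad_t1)
# Left column: ~0 radiance change, even though the raw DN jumped
# (120 -> 151) purely because the gain changed between dates.
# Right column: ~+15, a genuine increase in brightness.
```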
