1. Introduction
2. Types of Remote Sensing
   - Satellite Remote Sensing
   - Optical & IR Remote Sensing
   - Microwave Remote Sensing
3. Platforms for Remote Sensing
4. Sources & Sensors
   - Energy Sources
   - Sensors
5. Process of Remote Sensing
   - Interaction Process
   - Interaction of EM Radiation with the Earth's Surface
   - Effect of Atmosphere
6. Electromagnetic Radiation
   - Electromagnetic Waves
7. Electromagnetic Spectrum
   - Characteristics
8. Reflectance
9. Color Composite Images
   - True Color Images
   - False Color Images
   - Natural Color Images
10. Synthetic Aperture Radar
11. Limitations of Remote Sensing
12. Applications of Remote Sensing
13. Bibliography
INTRODUCTION
Remote sensors measure electromagnetic (EM) radiation that has interacted with the Earth's surface. Interactions with matter can change the direction, intensity, wavelength content, and polarization of EM radiation. The nature of these changes depends on the chemical make-up and physical structure of the material exposed to the EM radiation. Changes in EM radiation resulting from its interactions with the Earth's surface therefore provide major clues to the characteristics of the surface materials.
Basic Principle: Electromagnetic radiation that is transmitted passes through a material (or through the boundary between two materials) with little change in intensity. Materials can also absorb EM radiation. Usually absorption is wavelength-specific: that is, more energy is absorbed at some wavelengths than at others. EM radiation that is absorbed is transformed into heat energy, which raises the material's temperature. Some of that heat energy may then be emitted as EM radiation at a wavelength dependent on the material's temperature. The lower the temperature, the longer the wavelength of the emitted radiation. As a result of solar heating, the Earth's surface emits energy in the form of longer-wavelength infrared radiation. For this reason the portion of the infrared spectrum with wavelengths greater than 3 μm is commonly called the thermal infrared region. Electromagnetic radiation encountering a boundary such as the Earth's surface can also be reflected. If the surface is smooth at a scale comparable to the wavelength of the incident energy, specular reflection occurs: most of the energy is reflected in a single direction, at an angle equal to the angle of incidence. Rougher surfaces cause scattering, or diffuse reflection, in all directions.
We perceive the surrounding world through our five senses. Some senses (touch and taste) require contact of our sensing organs with the objects. However, we acquire much information about our surroundings through the senses of sight and hearing, which do not require close contact between the sensing organs and the external objects. In other words, we are performing remote sensing all the time.
Generally, Remote sensing refers to the activities of recording/observing/perceiving (sensing) objects or events at far away (remote) places. In remote sensing, the sensors are not in direct contact with the objects or events being observed. The information needs a physical carrier to travel from the objects/events to the sensors through an intervening medium. The electromagnetic radiation is normally used as an information carrier in remote sensing. The output of a remote sensing system is usually an image representing the scene being observed. A further step of image analysis and interpretation is required in order to extract useful information from the image. The human visual system is an example of a remote sensing system in this general sense. In a more restricted sense, remote sensing usually refers to the technology of acquiring information about the earth's surface (land and ocean) and atmosphere using sensors onboard airborne (aircraft, balloons) or spaceborne (satellites, space shuttles) platforms.
Ultraviolet light is scattered even more in the atmosphere than blue light. Furthermore, ultraviolet light will expose photographic film. On a bright day this scattered ultraviolet light will fog a photograph of distant objects. To avoid this, we use a filter which passes visible light but not ultraviolet light (called a UV filter). The use of almost every remote sensing system requires some consideration of the transmission and scattering properties of the atmosphere at a particular wavelength. These problems will be discussed where appropriate. Interaction with the Earth's Surface: A major aspect of interpretation of remotely sensed data is the nature of the interaction of radiation with the earth's surface. Each kind of surface material has its own signature. For instance, a water surface absorbs the near infrared and reflects a fair amount of green light. Snow reflects both. While it is possible for the observer to catalogue these signatures, occasionally he will encounter an object whose signature is puzzling. In those cases it may be necessary to play "detective" and consider the aspects of the surface which may be producing the signature observed. For
instance, the unusual occurrence of a rainstorm upon snow-covered sea ice may create an area with unusual absorption in the near infrared. It is not likely that this signature would be listed in any reference manual. The nature of the interaction of radiation with the earth's surface can be quite different for active and passive systems. Passive systems depend on illumination from a natural source, usually the sun or radiation emitted from the object. In this case, the angle of illumination is different from the "look" angle. However, usually there is sufficient illumination that there are few total shadows. Actually, we are quite used to this situation since we experience it daily. Most active systems depend on radiation emitted and reflected directly back to the source. This can create effects we do not experience on a daily basis. Consider how things look to you when using a flashlight on a dark night; shadows are troublesome. Yet, this is how the earth looks on airborne imaging radar.
In terms of spatial resolution, satellite imaging systems can be classified into:
- Low resolution systems (approx. 1 km or more)
- Medium resolution systems (approx. 100 m to 1 km)
- High resolution systems (approx. 5 m to 100 m)
- Very high resolution systems (approx. 5 m or less)
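The four resolution classes above can be sketched as a simple lookup; the function name and exact boundary handling are illustrative assumptions, since the text only gives approximate ranges:

```python
def resolution_class(ground_resolution_m):
    """Classify a satellite imaging system by its ground resolution in metres,
    following the four classes listed above (boundaries are approximate)."""
    if ground_resolution_m >= 1000:
        return "low"         # approx. 1 km or more
    elif ground_resolution_m >= 100:
        return "medium"      # approx. 100 m to 1 km
    elif ground_resolution_m > 5:
        return "high"        # approx. 5 m to 100 m
    else:
        return "very high"   # approx. 5 m or less

# A 30 m sensor (e.g. Landsat TM) falls in the high resolution class:
print(resolution_class(30))
```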
In terms of the spectral regions used in data acquisition, the satellite imaging systems can be classified into:
- Optical imaging systems (including visible, near infrared, and shortwave infrared systems)
- Thermal imaging systems
- Microwave (synthetic aperture radar) imaging systems
Optical/thermal imaging systems can be classified according to the number of spectral bands used:
- Monospectral or panchromatic (single wavelength band, "black-and-white", grey-scale image) systems
- Multispectral (several spectral bands) systems
- Superspectral (tens of spectral bands) systems
- Hyperspectral (hundreds of spectral bands) systems
Synthetic aperture radar imaging systems can be classified according to the combination of frequency bands and polarization modes used in data acquisition, e.g.:
- Single frequency (L-band, C-band, or X-band)
- Multiple frequency (combination of two or more frequency bands)
- Single polarization (VV, HH, or HV)
- Multiple polarization (combination of two or more polarization modes)
In Optical Remote Sensing, optical sensors detect solar radiation reflected or scattered from the earth, forming images resembling photographs taken by a camera high up in space. The wavelength region usually extends from the visible and near infrared (commonly abbreviated as VNIR) to the short-wave infrared (SWIR).
Different materials such as water, soil, vegetation, buildings and roads reflect visible and infrared light in different ways. They have different colors and brightness when seen under the sun. The interpretation of optical images requires knowledge of the spectral reflectance signatures of the various materials (natural or man-made) covering the surface of the earth.
There are also infrared sensors measuring the thermal infrared radiation emitted from the earth, from which the land or sea surface temperature can be derived.
A microwave imaging system which can produce high resolution images of the Earth is the synthetic aperture radar (SAR). The intensity in a SAR image depends on the amount of microwave energy backscattered by the target and received by the SAR antenna. Since the physical mechanisms responsible for this backscatter are different for microwaves than for visible/infrared radiation, the interpretation of SAR images requires knowledge of how microwaves interact with the targets.
Electromagnetic radiation in the microwave wavelength region is used in remote sensing to provide useful information about the Earth's atmosphere, land and ocean. A microwave radiometer is a passive device which records the natural microwave emission from the earth. It can be used to measure the total water content of the atmosphere within its field of view. A radar altimeter sends out pulses of microwave signals and records the signals scattered back from the earth surface. The height of the surface can be measured from the time delay of the return signals. A wind scatterometer can be used to measure wind speed and direction over the ocean surface. It sends out pulses of microwaves along several directions and records the magnitude of the signals backscattered from the ocean surface. The magnitude of the backscattered signal is related to the ocean surface roughness, which in turn depends on the sea surface wind condition, from which the wind speed and direction can be derived. Synthetic aperture radars are carried on airborne and spaceborne platforms to generate high resolution images of the earth surface using microwave energy.
Black Body Radiation: All objects with a temperature above absolute zero emit electromagnetic radiation. The amount of radiation in each wavelength region depends on the temperature of the object in a complicated way, but the total radiation is proportional to the object's Kelvin temperature taken to the 4th power (T^4). Hence an object at 373 K (boiling water) emits about 3.5 times as much radiant energy as water at 273 K (0 °C), although its absolute temperature is only about 37% greater. The exact relationship between temperature and radiated energy per wavelength for a perfect radiator is called the black body curve. The figure shows this relationship for objects at 2500, 2750 and 3000 K. The Sun and Earth as Black Body Radiators: The sun's black body curve peaks at a wavelength of 0.5 μm, or 0.0005 millimeters, the wavelength of blue-green light. Therefore, the highest radiation level available for remote sensing detectors is at this wavelength. This is also close to the center of the wavelength range of human eyesight. Hence, human eyes are adapted to making the most of the available radiant energy from the sun. However, the sun's black body curve extends from the visible wavelengths to the infrared and even to the microwave region and beyond.
The earth's absolute temperature is around 290 K (17 °C). The black body curve for this temperature peaks around 9.7 μm. This wavelength is well within the thermal infrared region of the spectrum. However, the earth radiates less energy at all wavelengths than the sun, even at this peak of the earth's black body curve. For this reason, daytime thermal infrared measurements can be highly distorted by reflected or backscattered solar energy.
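The two relationships above can be checked numerically. This sketch uses the T^4 proportionality (Stefan-Boltzmann law) and Wien's displacement law (peak wavelength = b / T, with b ≈ 2898 μm·K); the example temperatures for the sun and earth are the approximate values used in the text:

```python
# Wien's displacement constant in micrometre-kelvin (standard value ~2898)
WIEN_B_UM_K = 2898.0

def radiant_power_ratio(t1_kelvin, t2_kelvin):
    """Ratio of total radiant energy emitted at t1 versus t2 (T^4 law)."""
    return (t1_kelvin / t2_kelvin) ** 4

def peak_wavelength_um(t_kelvin):
    """Wavelength of peak black-body emission, in micrometres (Wien's law)."""
    return WIEN_B_UM_K / t_kelvin

print(round(radiant_power_ratio(373, 273), 2))  # boiling vs freezing water: ~3.48
print(round(peak_wavelength_um(5800), 2))       # the sun: ~0.5 um (blue-green light)
print(round(peak_wavelength_um(290), 1))        # the earth: ~10 um (thermal infrared)
```

The computed earth peak (~10 μm) agrees with the ~9.7 μm quoted above to within the precision of the assumed surface temperature.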
SENSORS:
Reflected solar radiation sensors: These sensor systems detect solar radiation that has been diffusely reflected (scattered) upward from surface features. The wavelength ranges that provide useful information include the ultraviolet, visible, near infrared and middle infrared ranges. Reflected solar sensing systems discriminate materials that have differing patterns of wavelength-specific absorption, which relate to the chemical make-up and physical structure of the material. Because they depend on sunlight as a source, these systems can only provide useful images during daylight hours, and changing atmospheric conditions and changes in illumination with time of day and season can pose interpretive problems. Reflected solar remote sensing systems are the most common type used to monitor Earth resources.

Thermal infrared sensors: Sensors that can detect the thermal infrared radiation emitted by surface features can reveal information about the thermal properties of these materials. Like reflected solar sensors, these are passive systems that rely on solar radiation as the ultimate energy source. Because the temperature of surface features changes during the day, thermal infrared sensing systems are sensitive to the time of day at which the images are acquired.

Imaging radar sensors: Rather than relying on a natural source, these active systems illuminate the surface with broadcast microwave radiation, then measure the energy that is diffusely reflected back to the sensor. The returning energy provides information about the surface roughness and water content of surface materials and the shape of the land surface. Long wavelength microwaves suffer little scattering in the atmosphere, even penetrating thick cloud cover. Imaging radar is therefore particularly useful in cloud-prone tropical regions.

Sensor types:
- Optical: Visible, Near Infrared, Thermal Infrared
- Microwave: Passive (radiometer); Active (scatterometer, SAR, altimeter)
- Laser: Active
FUSING DATA FROM DIFFERENT SENSORS: Materials commonly found at the Earth's surface, such as soil, rocks, water, vegetation, and man-made features, possess many distinct physical properties that control their interactions with electromagnetic radiation. In the preceding pages we have discussed remote sensing systems that use three separate parts of the radiation spectrum: reflected solar radiation (visible and infrared), emitted thermal infrared, and imaging radar. Because the interactions of EM radiation with surface features in these spectral regions are different, each of the corresponding sensor systems measures a different set of physical properties. Although each type of system by itself can reveal a wealth of information about the identity and condition of surface materials, we can learn even more by combining image data from different sensors. Interpretation of the merged data set can employ rigorous quantitative analysis, or more qualitative visual analysis.
Energy Source or Illumination (A) - the first requirement for remote sensing is to have an energy source which illuminates or provides electromagnetic energy to the target of interest.

Radiation and the Atmosphere (B) - as the energy travels from its source to the target, it will come in contact with and interact with the atmosphere it passes through. This interaction may take place a second time as the energy travels from the target to the sensor.

Interaction with the Target (C) - once the energy makes its way to the target through the atmosphere, it interacts with the target depending on the properties of both the target and the radiation.

Recording of Energy by the Sensor (D) - after the energy has been scattered by, or emitted from the target, we require a sensor (remote - not in contact with the target) to collect and record the electromagnetic radiation.

Transmission, Reception, and Processing (E) - the energy recorded by the sensor has to be transmitted, often in electronic form, to a receiving and processing station where the data are processed into an image (hardcopy and/or digital).

Interpretation and Analysis (F) - the processed image is interpreted, visually and/or digitally or electronically, to extract information about the target which was illuminated.

Application (G) - the final element of the remote sensing process is achieved when we apply the information that we have been able to extract from the imagery about the target, in order to better understand it, reveal some new information, or assist in solving a particular problem.
Interaction Process In Remote Sensing:As sunlight initially enters the atmosphere, it encounters gas molecules, suspended dust particles, and aerosols. These materials tend to scatter a portion of the incoming radiation in all directions, with shorter wavelengths experiencing the strongest effect. (The preferential scattering of blue light in comparison to green and red light accounts for the blue color of the daytime sky. Clouds appear opaque because of intense scattering of visible light by tiny water droplets.) Although most of the remaining light is transmitted to the surface, some atmospheric gases are very effective at absorbing particular wavelengths. (The absorption of dangerous ultraviolet radiation by ozone is a well-known example).
Absorption: Except for unusual cases, some of an incident electromagnetic signal is absorbed by the material of the surface it strikes or the medium it passes through. In the case of both active and passive systems, absorption of the electromagnetic signal by the atmosphere plays a major role in determining which wavelengths are used. Water vapor in the atmosphere absorbs many of the microwave wavelengths, leaving a few "windows" through which we can transmit and receive this information. Water strongly absorbs radiation in the near infrared portion of the spectrum used by Landsat band 7 images and NOAA series spacecraft (near IR band images). This absorption is so strong that wet snow and ice can be interpreted as water unless data from other wavelengths are available. Microwave radiation is also strongly absorbed by a thin film of water.
Effects of Atmosphere
In satellite remote sensing of the earth, the sensors are looking through a layer of atmosphere separating the sensors from the Earth's surface being observed. Hence, it is essential to understand the effects of the atmosphere on the electromagnetic radiation travelling from the Earth to the sensor through the atmosphere. The atmospheric constituents cause wavelength-dependent absorption and scattering of radiation. These effects degrade the quality of images. Some of the atmospheric effects can be corrected before the images are subjected to further analysis and interpretation. A consequence of atmospheric absorption is that certain wavelength bands in the electromagnetic spectrum are strongly absorbed and effectively blocked by the atmosphere. The wavelength regions in the electromagnetic spectrum usable for remote sensing are determined by their ability to penetrate the atmosphere. These regions are known as the atmospheric transmission windows. Remote sensing systems are often designed to operate within one or more of the atmospheric windows. These windows exist in the microwave region, some wavelength bands in the infrared, the entire visible region and part of the near ultraviolet region. Although the atmosphere is practically transparent to x-rays and gamma rays, these radiations are not normally used in remote sensing of the earth.
Electromagnetic Radiation
Experiments with electricity and magnetism in the 1800's developed a body of knowledge which led James Clerk Maxwell to predict in the 1860s, on purely theoretical grounds, that it might be possible for electric and magnetic fields to combine, forming self-sustaining waves which could travel great distances. These waves would have many of the behavior characteristics of waves on water (reflection, refraction, diffraction, etc.) and would travel at the speed of light. These properties gave rise to the possibility that light was an electromagnetic wave, but at that time, there was no proof that electromagnetic waves really existed. In 1888, Heinrich Hertz built an apparatus to send and receive Maxwell's waves. In this case the waves were around 5 meters long. The apparatus worked and, in addition, proved that the waves could be polarized, which turns out to be an important property from a remote sensing point of view. After this, it was learned that light, x-rays, infrared, ultraviolet, radio, microwaves, and gamma rays were all electromagnetic waves. The only property dividing them was their wavelength ranges. The names for these divisions arise from the interaction properties each wavelength range exhibits. (For instance, we see light, radio waves are useful for communication, x-rays pass through objects, etc.) Long before the wave description of light was developed by Maxwell, Sir Isaac Newton had also considered it and discarded the idea. The waves Newton considered were not electromagnetic, but compressional waves in the space-filling ether. Newton's reasoning was sound and based on the fact that the ether waves could not have some of the properties of light which had been observed. (Electromagnetic waves can have these properties.) Instead, he reasoned that light was composed of a vast flow of very tiny particles, or corpuscles (what we would now call photons). He was able to show that the flood of photons could have the known properties of light.
The discovery of electromagnetic waves created the suspicion that the photon concept was incorrect. However, in 1905, Albert Einstein was able to show that no matter how light travels from place to place, it is emitted and absorbed in small packets of energy (photons again). As a result, electromagnetic waves are dichotomous: they are emitted and absorbed as particles, but travel as waves. Scientific thought concerning this paradox continues to the present. Each representation has been found to have its particular utility.

Use of Electromagnetic Radiation for Remote Sensing: Aerial photography was the earliest form of remote sensing other than the telescope. For a long time, this technique relied on the portion of electromagnetic radiation used by our eyes (the visible spectrum). Early aerial photography was usually obtained on black and white film which responded to light over a broad range of visible light. Later, it was learned that by placing a filter in front of the lens which would pass only a particular color of light, a black and white record could be made of the objects reflecting light in that range. For instance, an aerial photograph of a developed area with a red-passing filter would show bare ground and many man-made surfaces which reflect a significant amount of red light. Hence, this photograph would be useful for identifying man-made features. This technique is used by the Landsat series of satellites today. Later, as color photography became available, color film was used in aerial photography. Again, filters could be used to enhance particular features.

Near Infrared Aerial Photography: During the Second World War there was a need to detect camouflaged objects. Although a great deal of aerial photography was obtained, it was often difficult to detect objects which had been painted green or had been covered with cut tree
branches. Some experimental film was developed which responded to light in the near infrared portion of the spectrum, light just a little more red than the red light detected by the human eye. One of the anticipated uses for this film involved the monitoring of healthy vegetation whose chlorophyll reflects the near infrared extremely well. This film was simply a black and white film with extended sensitivity which would record the near infrared if the visible light was filtered out. Later, a color film was developed which responded to the near infrared as well as visible colors (except blue). This was called color infrared film. Growth of Remote Sensing: Encouraged by these results, efforts were made to utilize other electromagnetic wavelengths such as heat infrared, microwave, and radar for remote sensing purposes. Here the topic becomes complex because the radiation does not behave exactly as light does and it is not quite as simple to understand as the near infrared.
Imaging Satellite Systems: Another important factor in the development of remote sensing, particularly for ice surveillance, was the development of satellite systems which routinely return images to earth. The first of these systems operated in the visible portion of the spectrum because existing television technology was most easily applied there. Quickly, however, systems were developed to make use of other portions of the electromagnetic spectrum. At this time, satellite remote sensing systems based on radar were being developed.
Electromagnetic Waves
Electromagnetic waves are energy transported through space in the form of periodic disturbances of electric and magnetic fields. All electromagnetic waves travel through space at the same speed, c = 2.99792458 × 10^8 m/s, commonly known as the speed of light. An electromagnetic wave is characterized by a frequency and a wavelength. These two quantities are related to the speed of light by the equation:

speed of light = frequency × wavelength

The frequency (and hence, the wavelength) of an electromagnetic wave depends on its source. There is a wide range of frequencies encountered in our physical world, ranging from the low frequency of the electric waves generated by power transmission lines to the very high frequency of the gamma rays originating from atomic nuclei. This wide frequency range of electromagnetic waves constitutes the Electromagnetic Spectrum.
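The relation c = frequency × wavelength can be sketched as a pair of conversion helpers (the function names and the 5.3 GHz example frequency are illustrative; 5.3 GHz is a typical C-band radar frequency):

```python
C = 2.99792458e8  # speed of light in m/s

def wavelength_m(frequency_hz):
    """Wavelength in metres for a given frequency in hertz (lambda = c / f)."""
    return C / frequency_hz

def frequency_hz(wavelength_in_m):
    """Frequency in hertz for a given wavelength in metres (f = c / lambda)."""
    return C / wavelength_in_m

# A 5.3 GHz C-band radar signal has a wavelength of about 5.7 cm:
print(round(wavelength_m(5.3e9) * 100, 1))
```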
The electromagnetic spectrum can be divided into several wavelength (frequency) regions, among which only a narrow band from about 400 to 700 nm is visible to the human eye. Note that there is no sharp boundary between these regions. The boundaries shown in the above figures are approximate and there are overlaps between adjacent regions. Wavelength units: 1 mm = 1000 μm; 1 μm = 1000 nm.
Radio Waves: 10 cm to 10 km wavelength.

Microwaves: 1 mm to 1 m wavelength. The microwaves are further divided into different frequency (wavelength) bands (1 GHz = 10^9 Hz):
o P band: 0.3 - 1 GHz (30 - 100 cm)
o L band: 1 - 2 GHz (15 - 30 cm)
o S band: 2 - 4 GHz (7.5 - 15 cm)
o C band: 4 - 8 GHz (3.8 - 7.5 cm)
o X band: 8 - 12.5 GHz (2.4 - 3.8 cm)
o Ku band: 12.5 - 18 GHz (1.7 - 2.4 cm)
o K band: 18 - 26.5 GHz (1.1 - 1.7 cm)
o Ka band: 26.5 - 40 GHz (0.75 - 1.1 cm)

Infrared: 0.7 to 300 μm wavelength. This region is further divided into the following bands:
o Near Infrared (NIR): 0.7 to 1.5 μm
o Short Wavelength Infrared (SWIR): 1.5 to 3 μm
o Mid Wavelength Infrared (MWIR): 3 to 8 μm
o Long Wavelength Infrared (LWIR): 8 to 15 μm
o Far Infrared (FIR): longer than 15 μm

The NIR and SWIR are also known as the Reflected Infrared, referring to the main infrared component of the solar radiation reflected from the earth's surface. The MWIR and LWIR are the Thermal Infrared.
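The microwave band table above amounts to a frequency-to-band lookup, sketched here as a hypothetical helper (the band boundaries follow the list above; which edge a boundary frequency belongs to is an assumption):

```python
# Radar band letters and their frequency ranges in GHz, per the list above.
RADAR_BANDS = [
    ("P", 0.3, 1.0), ("L", 1.0, 2.0), ("S", 2.0, 4.0), ("C", 4.0, 8.0),
    ("X", 8.0, 12.5), ("Ku", 12.5, 18.0), ("K", 18.0, 26.5), ("Ka", 26.5, 40.0),
]

def radar_band(frequency_ghz):
    """Return the band letter for a microwave frequency in GHz, or None
    if the frequency falls outside the tabulated ranges."""
    for name, lo, hi in RADAR_BANDS:
        if lo <= frequency_ghz < hi:
            return name
    return None

print(radar_band(5.3))   # a typical C-band SAR frequency
print(radar_band(1.27))  # a typical L-band SAR frequency
```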
Visible Light: This narrow band of electromagnetic radiation extends from about 400 nm (violet) to about 700 nm (red). The various color components of the visible spectrum fall roughly within the following wavelength regions:
o Red: 610 - 700 nm
o Orange: 590 - 610 nm
o Yellow: 570 - 590 nm
o Green: 500 - 570 nm
o Blue: 450 - 500 nm
o Indigo: 430 - 450 nm
o Violet: 400 - 430 nm

Ultraviolet: 3 to 400 nm

X-Rays and Gamma Rays
CHARACTERISTICS OF THE ELECTROMAGNETIC SPECTRUM

Ultraviolet (UV): This wavelength region has not been used to monitor sea ice and it is not likely to be used in the future. Because of the high degree of atmospheric scattering in this wavelength region, there is a tendency for imagery to appear "fuzzy". The radiation source is the sun and the systems used are, therefore, passive.

Visible Light: This wavelength region, principally the green and red portion, is used by Landsat and NOAA weather satellites to produce map-like imagery. The green portion is particularly sensitive to ice regardless of whether it is newly formed and thin or old and flooded. This portion of the spectrum is cloud-limited. As with the UV, the radiation source is the sun.

Near Infrared: This wavelength region is often detected along with visible light. Landsat and NOAA weather satellites produce images in this wavelength region. The imagery is of great utility to remote sensing of sea ice because it is highly sensitive to water/ice boundaries and water upon ice. It generally presents greater contrast between ice types and ice and water than do the visible wavelengths.

Thermal Infrared: This wavelength region is truly representative of heat. However, interpretation of thermal infrared imagery can be somewhat difficult. For many purposes, the best imagery is obtained just before dawn so that solar heating effects are at a minimum. Since the thermal infrared is absorbed by clouds and fog, it is useful to have a visual image as well as a thermal image to help identify them and the areas modified by their influences.

Microwave: Data is obtained in this region by both active and passive methods. The earth's surface does emit microwave radiation in very small amounts as a manifestation of its temperature. It is, therefore, necessary to use very sensitive microwave receivers (radiometers) to measure this radiation.
This wavelength region is not affected by ordinary cloudiness, but the shorter wavelengths can be absorbed by the raindrops in severe storms. The active systems using this wavelength region come under the heading of radar. Side-scanning radar systems are operated routinely aboard Canadian ice surveillance aircraft. Imaging radar has also been used experimentally aboard spacecraft and it is likely that data from an operational satellite imaging radar system will be available relatively soon. The active systems send out and receive back a much stronger signal than the passive microwave systems. Hence, the "background" radiation of the earth does not confuse the signal received.
Reflectance
When solar radiation hits a target surface, it may be transmitted, absorbed or reflected. Different materials reflect and absorb differently at different wavelengths. The reflectance spectrum of a material is a plot of the fraction of radiation reflected as a function of the incident wavelength and serves as a unique signature for the material. In principle, a material can be identified from its spectral reflectance signature if the sensing system has sufficient spectral resolution to distinguish its spectrum from those of other materials. This premise provides the basis for multispectral remote sensing. The following graph shows the typical reflectance spectra of five materials: clear water, turbid water, bare soil and two types of vegetation.
The reflectance of clear water is generally low. However, the reflectance is maximum at the blue end of the spectrum and decreases as wavelength increases. Hence, clear water appears dark bluish. Turbid water has some sediment suspension which increases the reflectance in the red end of the spectrum, accounting for its brownish appearance. The reflectance of bare soil generally depends on its composition. In the example shown, the reflectance increases monotonically with increasing wavelength. Hence, it should appear yellowish-red to the eye. Vegetation has a unique spectral signature which enables it to be distinguished readily from other types of land cover in an optical/near-infrared image. The reflectance is low in both the blue and red regions of the spectrum, due to absorption by chlorophyll for photosynthesis. It has a peak in the green region which gives rise to the green color of vegetation. In the near infrared (NIR) region, the reflectance is much higher than that in the visible band due to the cellular structure in the leaves. Hence, vegetation can be identified by the high NIR but generally low visible reflectance. This property was used in early wartime reconnaissance missions for "camouflage detection".
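The high-NIR, low-red signature of vegetation described above is commonly exploited through band ratios. The Normalized Difference Vegetation Index (NDVI) shown here is a standard index, not one defined in this document, and the reflectance values are illustrative assumptions:

```python
def ndvi(nir_reflectance, red_reflectance):
    """Normalized Difference Vegetation Index: contrasts the high NIR and
    low red reflectance of healthy vegetation."""
    return (nir_reflectance - red_reflectance) / (nir_reflectance + red_reflectance)

# Illustrative reflectance values (assumed, not measured):
print(round(ndvi(0.50, 0.08), 2))  # vegetation-like pixel: strongly positive
print(round(ndvi(0.05, 0.10), 2))  # clear-water-like pixel: negative
```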
Typical Reflectance Spectrum of Vegetation. The labeled arrows indicate the common wavelength bands used in optical remote sensing of vegetation: A: blue band; B: green band; C: red band; D: near IR band; E: short-wave IR band.
The shape of the reflectance spectrum can be used for identification of vegetation type. For example, the reflectance spectra of vegetation 1 and 2 in the above figures can be distinguished although they exhibit the general characteristics of high NIR but low visible reflectance. Vegetation 1 has higher reflectance in the visible region but lower reflectance in the NIR region. For the same vegetation type, the reflectance spectrum also depends on other factors such as the leaf moisture content and health of the plants. The reflectance of vegetation in the SWIR region (e.g. band 5 of Landsat TM and band 4 of SPOT 4 sensors) is more varied, depending on the type of plant and its water content. Water has strong absorption bands around 1.45, 1.95 and 2.50 μm. Outside these absorption bands in the SWIR region, the reflectance of leaves generally increases when leaf liquid water content decreases. This property can be used for identifying tree types and plant conditions from remote sensing images. The SWIR band can be used for detecting plant drought stress and delineating burnt areas and fire-affected vegetation. The SWIR band is also sensitive to the thermal radiation emitted by intense fires, and hence can be used to detect active fires, especially during night-time when the background interference from SWIR in reflected sunlight is absent.
Many colors can be formed by combining the three primary colors (Red, Green, Blue) in various proportions.
False color composite multispectral SPOT image: Red: XS3; Green: XS2; Blue: XS1
Another common false color composite scheme for displaying an optical image with a short-wave infrared (SWIR) band is: R = SWIR band (SPOT 4 band 4, Landsat TM band 5); G = NIR band (SPOT 4 band 3, Landsat TM band 4); B = red band (SPOT 4 band 2, Landsat TM band 3).
An example of this false color composite display is shown below for a SPOT 4 image.
False color composite of a SPOT 4 multispectral image including the SWIR band: Red: SWIR band; Green: NIR band; Blue: Red band. In this display scheme, vegetation appears in shades of green. Bare soils and clearcut areas appear purplish or magenta. The patch of bright red on the left is the location of active fires. A smoke plume originating from the active fire site appears faintly bluish.
False color composite of a SPOT 4 multispectral image without displaying the SWIR band: Red: NIR band; Green: Red band; Blue: Green band. Vegetation appears in shades of red. The smoke plume appears bright bluish white.
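A false color composite such as those above amounts to stacking three chosen spectral bands into the R, G and B channels of one display image, usually with a contrast stretch per channel. A sketch with NumPy, assuming the bands are already co-registered 2-D reflectance arrays; the array names and values are placeholders.

```python
import numpy as np

def false_color_composite(r_band, g_band, b_band):
    """Stack three spectral bands into an RGB display image.

    For the SWIR scheme described above: r_band = SWIR, g_band = NIR,
    b_band = red. Each channel is stretched independently to 0-255.
    """
    rgb = np.dstack([r_band, g_band, b_band]).astype(float)
    for c in range(3):
        ch = rgb[:, :, c]
        span = ch.max() - ch.min()
        if span > 0:
            rgb[:, :, c] = 255 * (ch - ch.min()) / span
        else:
            rgb[:, :, c] = 0
    return np.rint(rgb).astype(np.uint8)

# Toy 2x2 bands standing in for SWIR, NIR and red reflectance
swir = np.array([[0.1, 0.4], [0.2, 0.3]])
nir = np.array([[0.5, 0.2], [0.4, 0.1]])
red = np.array([[0.05, 0.3], [0.1, 0.2]])
composite = false_color_composite(swir, nir, red)
```

Swapping which band feeds which channel is all that distinguishes the two display schemes shown in the captions above.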
Natural color composite multispectral SPOT image: Red: XS2; Green: 0.75 XS2 + 0.25 XS3; Blue: 0.75 XS2 - 0.25 XS3
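SPOT XS sensors have no blue band, so the "natural color" display above synthesizes the green and blue channels from the green (XS1 region is not used here) and NIR bands using the weights in the caption. A sketch of that band arithmetic in NumPy; the input arrays are illustrative.

```python
import numpy as np

def natural_color_spot(xs2, xs3):
    """Simulate a natural color composite from SPOT XS bands.

    XS2 = red band, XS3 = NIR band. Weights follow the caption:
      R = XS2, G = 0.75*XS2 + 0.25*XS3, B = 0.75*XS2 - 0.25*XS3
    """
    r = xs2
    g = 0.75 * xs2 + 0.25 * xs3
    b = 0.75 * xs2 - 0.25 * xs3
    # Clip negative blue values (possible for very NIR-bright pixels)
    return np.dstack([r, g, np.clip(b, 0, None)])

xs2 = np.array([[0.4]])   # toy red-band reflectance
xs3 = np.array([[0.8]])   # toy NIR-band reflectance
out = natural_color_spot(xs2, xs3)
```

For this pixel the simulated green channel is 0.5 and the simulated blue channel is 0.1, dimmer than green as expected for vegetation.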
The radar pulse is scattered by the ground targets back to the antenna.
In real aperture radar imaging, the ground resolution is limited by the width of the microwave beam sent out from the antenna. Finer details on the ground can be resolved by using a narrower beam. The beam width is inversely proportional to the size of the antenna: the longer the antenna, the narrower the beam. It is not feasible for a spacecraft to carry the very long antenna required for high-resolution imaging of the Earth's surface. To overcome this limitation, SAR capitalizes on the motion of the spacecraft to emulate a large antenna (about 4 km for the ERS SAR) from the small antenna (10 m on the ERS satellite) it actually carries on board.
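These numbers can be checked with the standard beamwidth relations: a real aperture's along-track ground resolution is roughly λR/D (wavelength times range divided by antenna length), while a focused SAR improves this to about D/2, independent of range. The sketch below uses approximate ERS values (C-band λ ≈ 5.7 cm, slant range ≈ 850 km, 10 m antenna); these figures are assumptions for illustration, not exact mission parameters.

```python
# Approximate ERS-1/2 parameters (assumed for illustration)
wavelength_m = 0.057      # C-band radar wavelength
slant_range_m = 850e3     # satellite-to-ground slant range
antenna_len_m = 10.0      # physical antenna length

# Real aperture: azimuth resolution limited by the beam footprint ~ lambda*R/D
real_aperture_res = wavelength_m * slant_range_m / antenna_len_m

# Focused SAR: the spacecraft's motion synthesizes an aperture the size of
# that footprint, giving azimuth resolution ~ D/2, independent of range
sar_res = antenna_len_m / 2

print(f"Real aperture resolution: {real_aperture_res / 1000:.1f} km")
print(f"SAR azimuth resolution:   {sar_res:.1f} m")
```

The beam footprint of roughly 5 km is also the length of the synthetic aperture, consistent with the "about 4 km" figure quoted for the ERS SAR.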
Applications
Natural resource management is a broad field covering many different application areas, ranging from monitoring fish stocks to assessing the effects of natural disasters (hazard assessment). Remote sensing can be used for applications in several different areas, including:
geology and mineral exploration, hazard assessment, oceanography, agriculture and forestry, land degradation, and environmental monitoring.

Each sensor is designed with a specific purpose. With optical sensors, the design focuses on the spectral bands to be collected. With radar imaging, the incidence angle and the microwave band used play an important role in defining which applications the sensor is best suited for. Each application has its own demands for spectral, spatial, and temporal resolution. Briefly, spectral resolution refers to the width or range of each spectral band recorded. For example, panchromatic imagery (sensing a broad range of all visible wavelengths) is not as sensitive to vegetation stress as a narrow band in the red wavelengths, where chlorophyll strongly absorbs electromagnetic energy. Spatial resolution refers to the discernible detail in the image: detailed mapping of wetlands requires far finer spatial resolution than regional mapping of physiographic areas. Temporal resolution refers to the time interval between images. Some applications require data repeatedly and often, such as oil spill, forest fire, and sea ice motion monitoring; some require only seasonal imaging (crop identification, forest insect infestation, and wetland monitoring); and some need imaging only once (geological structural mapping). The most time-critical applications also demand fast turnaround for image processing and delivery, getting useful imagery quickly into the user's hands.

As a concrete example, consider the use of remote sensing in forest inventory. Forest inventory is a broad application area covering the gathering of information on species distribution, age, height, density, and site quality. For species identification, imaging systems or aerial photos can be used.
For the age and height of the trees, radar can be used in combination with the species information assessed at the first stage. Density is assessed mainly by optical interpretation of aerial photos and/or high-resolution panchromatic images. Site quality is one of the more difficult attributes to assess: it is based on topographic position, soil type, and the drainage and moisture regime. The topographic position can be estimated using laser or radar, but the soil type and the drainage and moisture regime are more profitably collected as ground data.

The use of remote sensing in crop monitoring (a real case): the countries of the European Communities (EC) are using remote sensing to help fulfill the requirements and mandate of the EC Agricultural Policy, which is common to all members. The requirements are to delineate, identify, and measure the extent of important crops throughout Europe, and to provide a forecast of production early in the season. Standardized procedures for collecting these data are based on remote sensing technology developed and defined through the MARS project (Monitoring Agriculture by Remote Sensing). The project uses many types of remotely sensed data, from low-resolution NOAA-AVHRR to high-resolution radar, and numerous sources of ancillary data. These data are used to classify crop type on a regional scale to conduct regional inventories, assess vegetation condition, estimate potential yield, and finally to predict similar statistics for other areas and compare results. Multisource data such as VIR and radar were introduced into the project to increase classification accuracy. Radar provides very different information than the VIR sensors, particularly on vegetation structure, which proves valuable when attempting to differentiate between crop types. One of the key applications within this project is the operational use of high-resolution optical and radar data to confirm the conditions claimed by a farmer requesting aid or compensation.
The use of remote sensing identifies potential areas of non-compliance or suspicious circumstances, which can then be investigated by other, more direct methods. As part of the Integrated Administration and Control System (IACS), remote sensing data support the development and management of databases that include cadastral information, declared land use, and parcel measurements. This information is considered when applications for area subsidies are received. This is an example of a truly successful, operational crop identification and monitoring application of remote sensing.
Bibliography