u(r, t) = a(r) cos[2πνt + ϕ(r)],     (2.1-2)
where
a(r) = amplitude
ϕ(r) = phase
ν = frequency (cycles/s or Hz)
ω = 2πν = angular frequency (radians/s)
T = 1/ν = 2π/ω = period (s).
The wavefunction is a harmonic function of time with frequency ν at all positions, and
the amplitude and the phase are generally position dependent.
It is convenient to represent the real wavefunction u(r,t) in (2.1-2) in terms of a
complex wavefunction
U(r, t) = U(r) exp(jωt),     (2.1-3)
so that the real wavefunction is recovered as u(r, t) = Re{U(r, t)} = ½[U(r, t) + U∗(r, t)],
where the symbol ∗ signifies complex conjugation. At a given position r, the complex
amplitude U (r) is a complex variable [depicted in Fig. 2.1-2(a)] whose magnitude
|U (r)| = a(r) is the amplitude of the wave and whose argument arg{U (r)} = ϕ(r)
is the phase. The complex wavefunction U (r, t) is represented graphically in Fig. 2.1-
2(b) by a phasor that rotates with angular velocity ω (radians/s). Its initial value at t = 0
is the complex amplitude U (r).
Figure 2.1-2 Representations of a harmonic wave at a fixed position r: (a) the complex amplitude U = a exp(jϕ) is a fixed phasor; (b) the complex wavefunction U(t) = U exp(jωt) is a phasor rotating with angular velocity ω radians/s.
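The phasor picture can be checked numerically. A minimal sketch (with assumed illustrative values for a, ϕ, and ν) verifying that the real wavefunction u(t) = a cos(ωt + ϕ) is the real part of the complex wavefunction U exp(jωt):

```python
import numpy as np

# Assumed illustrative values at a fixed position r.
a, phi = 2.0, np.pi / 3          # amplitude a(r) and phase ϕ(r)
nu = 5.0                         # frequency ν (Hz)
omega = 2 * np.pi * nu           # angular frequency ω = 2πν (rad/s)

t = np.linspace(0.0, 1.0 / nu, 200)   # one period T = 1/ν

# Real wavefunction u(t) = a cos(ωt + ϕ).
u_real = a * np.cos(omega * t + phi)

# Complex amplitude U = a exp(jϕ) and complex wavefunction U(t) = U exp(jωt).
U = a * np.exp(1j * phi)
U_t = U * np.exp(1j * omega * t)

# u(t) = Re{U(t)} = (1/2)[U(t) + U*(t)].
assert np.allclose(u_real, U_t.real)
assert np.allclose(u_real, 0.5 * (U_t + U_t.conj()).real)
```

The magnitude and argument of U give back the amplitude and phase, matching the fixed phasor of Fig. 2.1-2(a).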
[Figure: (a) waveforms showing the magnitude of a wave's disturbance versus location along the direction of propagation at times t0, t0 + Δt, and t0 + 2Δt; (b) the magnitude at a fixed reference point versus time, showing the period P.]
[Figure: frequency scale for acoustic waves (1 Hz–1 GHz), spanning infrasound, audio sound (including speech), ultrasound, and hypersound, with the corresponding wavelength in air.]
FREQUENCIES OF ELECTROMAGNETIC WAVES
Figure 2.1-8 [Figure: frequency scale (1 kHz–1 ZHz) and vacuum wavelength (1 km–1 pm) of electromagnetic waves: radiowaves (VLF, LF, MF, HF, VHF, UHF), microwaves (SHF, MMW), THz waves, infrared (FIR, MIR, NIR), the visible band (390–760 nm: violet 390–455 nm, blue 455–492 nm, green 492–577 nm, yellow 577–597 nm, orange 597–622 nm, red 622–760 nm), ultraviolet (NUV, MUV, FUV, EUV), soft and hard X-rays, and γ-rays.]
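The frequency and vacuum-wavelength scales above are linked by λ = c/ν. A small conversion sketch (the example frequencies are illustrative):

```python
# Vacuum speed of light (m/s).
C = 299_792_458.0

def vacuum_wavelength(frequency_hz: float) -> float:
    """Vacuum wavelength λ = c/ν in meters."""
    return C / frequency_hz

# 1 GHz (UHF radiowave) -> ~0.3 m; 1 THz -> ~0.3 mm; 500 THz (visible) -> ~600 nm.
for nu in (1e9, 1e12, 5e14):
    print(f"{nu:.0e} Hz -> {vacuum_wavelength(nu):.3e} m")
```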
PENETRATION DEPTH IN SSI
[Figure: (a), (b) trade-off between resolution and penetration depth in subsurface imaging (SSI).]
Other factors affecting results of SSI
[Figure: a source and sensor above a rough surface, with clutter surrounding the buried target.]
Localized and Tomographic Imaging
In 3D imaging the sensed property is represented by a position-
dependent function α = α(r), where r = (x, y, z).
There are two distinct approaches for addressing the localization
challenge of 3D imaging:
1) Localized Imaging. In this approach, the probe and sensor spots
are configured such that their intersection, which we call the
scanned spot, is a single point (or tiny volume). This ensures that,
for each measurement, the sensor is responsive to approximately a
single point in the object. The full 3D distribution is, of course,
constructed by scanning over all points of the object.
Figure 1.3-1 (a) In confocal imaging, the probe and sensor spots are co-focused onto the same
point. (b) The probe beam illuminates a 2D slice of the 3D object and each slice is viewed by the
sensor.
2) Tomographic Imaging. In this approach, each measurement is sensitive
to an extended region of the object (e.g., a line or a slice), and the 3D
distribution α(r) is reconstructed computationally from many such
measurements taken at different scan positions or angles.
[Figure: tomographic geometries: a source and sensor scanning axial slices of the object; an array of sources and sensors surrounding the object.]
Electrical Impedance Tomography
The conductivity and permittivity of the object are inferred from electrical
measurements on the outer surface. A small alternating current is injected
through each conducting electrode and the resulting electrical potential is
measured at all other electrodes.
Seismic tomography with active sources
Dynamic, Multispectral, Multisensor, and Multiwave
Imaging
• Dynamic imaging systems deal with objects that are time varying so that α =
α(r,t) is a function of space and time, and the imaging system is four-
dimensional (4D). The time dependence may be i) independent of the probe,
e.g., when imaging moving, pulsating, or fluctuating targets, or ii) initiated by
the probe, e.g., a molecular change or chemical reaction lasting for some time
after the application of an optical or electromagnetic pulse.
• Spectral imaging systems observe a wavelength-dependent property
represented by the 4D function α = α(r, λ), which is a function of the
wavelength λ. The human visual system observes color objects that are
position, time, and wavelength dependent; this is a five-dimensional system.
• Multisensor imaging instruments combine the images acquired
independently by different imaging modalities, such as X-rays, radiowaves,
and ultrasonic waves, to perform tasks such as the mapping of an underlying
physical or biological property to which all sensors are sensitive, or the
detection/classification of a target. Human and animal sensory systems
combine visual (optical) and auditory (acoustic) data to achieve tasks such as
target location and identification.
Multiwave imaging systems use multiple waves, e.g.,
electromagnetic and acoustic, that interact inside the
medium, e.g., one wave generating or modulating
another in such a way that the information is acquired
better, or new information is delivered.
Dynamic Imaging. If the object varies slowly relative to
the response time of the image acquisition system,
which is determined by factors such as the width of the
probe pulse and the response time of the sensor, then
capturing a 4D dynamic image α(r, t) may be
accomplished by simply measuring a sequence of
independent static 3D images α(r, t1), α(r, t2), … and
forming a video image.
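In array terms, this frame-by-frame acquisition amounts to stacking static 3D volumes along a time axis; a minimal sketch with made-up voxel data:

```python
import numpy as np

# Hypothetical static 3D snapshots α(r, t_k) on a small voxel grid,
# acquired at successive times t1, t2, ...
nx = ny = nz = 4
frames = [np.random.default_rng(k).random((nx, ny, nz)) for k in range(5)]

# The 4D dynamic image: axis 0 indexes time, axes 1-3 index space.
alpha_4d = np.stack(frames, axis=0)
assert alpha_4d.shape == (5, nx, ny, nz)

# The history of a single voxel is a 1D time series.
time_series = alpha_4d[:, 2, 2, 2]
assert time_series.shape == (5,)
```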
Multispectral Imaging: Since the sensed property α(r)
represents the response of the medium to the incoming probe
wave, it is generally frequency dependent:
α = α(r, ω), where ω is the angular frequency. In certain
applications it is useful to use, as probes, a set of waves/fields
of different frequencies/wavelengths that measure the
parameters α1, α2, …, αN of the object.
The combined measurements may be used to determine an
underlying property β, or several such properties β1, β2, …, βM.
For example, a measurement of the distribution of optical
absorption or fluorescence α(r) at N wavelengths may be used
to determine the concentrations β1 (r), β2 (r), . . . , βM (r) of M
different materials, molecules, or species of known spectral
absorption or fluorescence profiles.
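When the measured absorption is a linear mix of the known per-species spectra, recovering the concentrations is a least-squares solve. A sketch with made-up spectral profiles and concentrations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative sizes: N wavelengths, M species (N >= M).
N, M = 6, 3

# Known spectral profiles: S[i, j] = absorption of species j at wavelength i.
S = rng.uniform(0.1, 1.0, size=(N, M))

# True (unknown) concentrations β1, ..., βM at one position r.
beta_true = np.array([0.5, 1.2, 0.8])

# Measured absorption α(λ1), ..., α(λN): a linear mix of the known profiles.
alpha = S @ beta_true

# Recover the concentrations by least squares.
beta_est, *_ = np.linalg.lstsq(S, alpha, rcond=None)
assert np.allclose(beta_est, beta_true)
```

In practice the solve is repeated at every position r, and measurement noise makes the recovery approximate rather than exact.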
Multispectral imaging and thematic mapping allow researchers to collect reflection
and absorption properties of soils, rock, and vegetation. These data can be
utilized by trained photogeologists to interpret surface lithologies and identify clays,
oxides, and soil types from satellite imagery.
ASTER SATELLITE
http://www.satmagingcorp.com
Launch Date: 18 December 1999 at Vandenberg Air Force Base, California, USA
The ASTER instrument consists of three separate instrument subsystems:
visible and near-infrared (VNIR), shortwave infrared (SWIR), and thermal infrared (TIR).
• The left image displays visible and near infrared bands 3, 2, and 1 in red, green, and blue (RGB).
Vegetation appears red, snow and dry salt lakes are white, and exposed rocks are brown, gray, yellow
and blue. Rock colors may reflect the presence of iron minerals, and variations in albedo.
• The middle image displays short wavelength infrared bands 4, 6, and 8 as RGB. In this wavelength
region, clay, carbonate, and sulfate minerals have diagnostic absorption features, resulting in distinct
colors on the image. For example, limestones are yellow-green, and purple areas are kaolinite-rich.
• The right image displays thermal infrared bands 13, 12 and 10 as RGB. In this wavelength region,
variations in quartz content appear as more or less red; carbonate rocks are green, and mafic volcanic
rocks are purple.
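Displaying three spectral bands as the red, green, and blue channels of one image, as in the composites described above, is a simple array operation. A sketch with hypothetical co-registered band data (the 64×64 grid and random values are placeholders):

```python
import numpy as np

# Hypothetical co-registered band images (e.g., bands 3, 2, 1),
# each a 2D array of radiance values on the same pixel grid.
rng = np.random.default_rng(1)
band3, band2, band1 = (rng.random((64, 64)) for _ in range(3))

def to_rgb(r, g, b):
    """Stack three band images into an RGB composite scaled to [0, 1]."""
    rgb = np.stack([r, g, b], axis=-1)
    lo, hi = rgb.min(), rgb.max()
    return (rgb - lo) / (hi - lo)

composite = to_rgb(band3, band2, band1)  # bands 3, 2, 1 shown as R, G, B
assert composite.shape == (64, 64, 3)
```

Real workflows would also apply per-band contrast stretching so that each channel uses its full dynamic range.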
Multisensor Imaging: In multisensor (or multimodality)
imaging, probes of different physical nature are used to
measure different properties α1, α2, ….
These independent measurements may be used together to
better reveal a single underlying property β, or to make a
better decision on the existence of some anomaly.
Multisensor imagers may also be used to extract different
information about the object.
EXAMPLE: combining optical and sonic probes.
In geophysics this is one of the directions of development.