
LECTURE II SUBSURFACE IMAGING

TGS571 GEOFISIKA PERTAMBANGAN

Program Studi Teknik Geofisika
Jurusan Teknik Kebumian
Fakultas Teknik
Universitas Syiah Kuala
2018

Lecturer: Dian Darisma, M.T.
Lecture 2. Physical background

1)  Basics of subsurface imaging
2)  Waves used in subsurface imaging
3)  Different types of subsurface imaging

Subsurface imaging
Subsurface imaging (SSI) methods infer internal structure from complex and distorted signals received outside the obscuring volume. They often employ similar physical and mathematical models and use similar computational and algorithmic methods.

Subsurface imaging (SSI) problems arise in a wide range of areas on different scales:
•  geophysical exploration,
•  environmental remediation under the earth or the ocean,
•  medical examination and diagnosis inside the body,
•  basic studies of the biological processes inside the cell,
•  security inspections for explosives or contraband concealed in luggage.
Imaging is the measurement of the spatial distribution of some physical property of an object by use of an instrument such as
•  a camera,
•  an optical or ultrasonic scanner,
•  a microscope,
•  a telescope,
•  a radar system,
•  an X-ray machine.
The spatial scale of the object may range from subnanometer to light years.

Imaging applications at various spatial scales
Subsurface imaging (SSI) is the imaging of an object buried below the surface of a medium, such as soil, rock, water, atmosphere, or tissue.

The imaging process is mediated by
•  some field,
•  wave,
•  or stream of particles
that probes the medium; this is generally called the probe or the probing wave. The probe interacts with the object ("reads" the object) and communicates with the sensor.
Examples of probes:
•  An electrostatic field can be used to monitor the spatial distribution of the electric conductivity of an object.
•  A magnetic field can be used to probe the presence of metal.
•  Electromagnetic waves at various bands of the spectrum (low-frequency, radiowave, microwave, millimeter waves, terahertz, optical, X-rays, γ-rays) travel in the form of waves (which may be approximated by rays when the wavelength is short).
•  Mechanical and acoustic waves (e.g., seismic and ultrasonic) have widespread imaging applications.
•  Beams of accelerated particles may also be used for imaging. For example, an electron beam is used in the scanning electron microscope (SEM), and nuclear particles are used in medical imaging.
Imaging Configurations
Imaging systems can take several configurations.
A self-luminous object generates its own signal (field, wave, or particles), which may be observed and used to construct the image without the need for an external probe (passive imaging).
In active imaging, the image is formed by use of an external probe (field, wave, or particles) that interacts with the object. Such a probe may be transmitted through or reflected from the object.
Imaged Physical Property

The physical property that is measured by the imaging system is called the alpha property, and is denoted by the symbol α.

In most cases, α is a scalar function of position r = (x, y, z), i.e., a three-dimensional (3D) distribution (a map). The actual physical nature of α depends on the probe used.
Mapping Other Underlying Object Properties

The purpose of subsurface imaging may be the mapping of structural, mechanical, chemical, environmental, biological, physiological, or other functional properties that are not directly sensed by the probe. Such parameters, which are of interest to the user, may be denoted as the beta parameters β, and may be scalar or vector functions of position r and time t.
Examples of user property β:
Density, pressure, temperature
Young’s modulus of elasticity, bulk modulus and fluid elasticity, viscosity
Humidity, moisture content, porosity, pH number, thermal resistivity
Molecular or ion concentration, chemical composition
Crystallographic atomic structure
Biological and physiological properties such as blood flow, tissue oxygenation, hemoglobin
concentration, metabolic rates, and membrane integrity; in medical imaging, the term
functional imaging, as opposed to structural imaging, is used for such measurements
Concentration of extrinsic markers such as dyes, chemical tags, chromophores and fluorophores, and fluorescent protein markers
Gene expression, cellular differentiation, morphogenesis.
STUDY OF THE SUBSURFACE BY GEOPHYSICAL METHODS IS ALSO AN SSI PROBLEM USING DIFFERENT TYPES OF PROBES
VIBROSEIS SEISMIC SURVEY: SEISMIC WAVES ARE USED AS PROBE
MICROSEISMIC MONITORING INFORMATION CAN BE USED FOR PASSIVE IMAGING
Waves
A wave is a physical property that exhibits oscillatory variation in both time and space, corresponding to propagation with some velocity. Mechanical waves, such as sound or seismic waves, propagate in a medium, while electromagnetic waves, such as radiowaves and light, may also propagate in vacuum.
•  A wave transfers energy as it travels from one location to another. It changes direction at planar boundaries between different media, leading to reflection and refraction, and it acquires curvature at curved boundaries so that it may be focused or defocused.
•  A wave is scattered into many directions when it falls onto a small object or inhomogeneity, and is diffracted as it travels through narrow apertures.
•  Since waves are sensitive to the spatial distribution of the medium through which they propagate (and to objects located within), they are suitable for use as probes in subsurface imaging systems.
[Figure: incident, reflected, refracted, diffracted, and scattered waves.]

A probe wave is reflected and refracted at a planar surface. The reflected wave is diffracted from an aperture. The refracted wave is scattered from a target.
PROPAGATION OF SEISMIC WAVES

(Dentith and Mudge, 2014)


Wave Equation
A wave is described mathematically by a function of
1)  position r = (x, y, z) and
2)  time t.
The function is denoted u(r, t) and is known as the wavefunction. The physical meaning of this function depends on the nature of the wave (mechanical, electromagnetic, etc.).
In a linear homogeneous medium, the wavefunction u(r, t) satisfies a partial differential equation called the wave equation:

∇²u − (1/v²) ∂²u/∂t² = 0,

where v is the wave velocity, which is characteristic of the medium, and ∇² is the Laplacian operator, ∇² = ∂²/∂x² + ∂²/∂y² + ∂²/∂z² in Cartesian coordinates.
Because the wave equation is linear, the principle of superposition applies, i.e., if u1(r, t) and u2(r, t) represent possible waves, then u(r, t) = u1(r, t) + u2(r, t) also represents a possible wave.
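The superposition principle can be checked numerically. The sketch below (an illustrative check, not part of the lecture text; the pulse shapes and the velocity are arbitrary choices) verifies by finite differences that a sum of two travelling-wave solutions still satisfies the 1D wave equation:

```python
import numpy as np

# 1D wave equation: d2u/dt2 = v**2 * d2u/dx2.
# Any u(x, t) = f(x - v*t) or g(x + v*t) satisfies it; so does their sum.
v = 2.0                               # wave velocity (arbitrary units)
f = lambda s: np.exp(-s**2)           # right-travelling Gaussian pulse
g = lambda s: np.sin(0.5 * s)         # left-travelling sinusoid

u1 = lambda x, t: f(x - v * t)
u2 = lambda x, t: g(x + v * t)
u = lambda x, t: u1(x, t) + u2(x, t)  # superposition

# Check the wave equation at one point with central finite differences.
x0, t0, h = 0.3, 1.1, 1e-3
d2u_dt2 = (u(x0, t0 + h) - 2 * u(x0, t0) + u(x0, t0 - h)) / h**2
d2u_dx2 = (u(x0 + h, t0) - 2 * u(x0, t0) + u(x0 - h, t0)) / h**2
residual = d2u_dt2 - v**2 * d2u_dx2
print(abs(residual))                  # close to zero (finite-difference error only)
```

The residual is limited only by the truncation error of the finite differences, confirming that the superposition is itself a valid wave.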
Harmonic Waves
A harmonic wave has a wavefunction with harmonic time dependence,

u(r, t) = a(r) cos[ωt + ϕ(r)], (2.1-2)

where
a(r) = amplitude
ϕ(r) = phase
ν = frequency (cycles/s or Hz)
ω = 2πν = angular frequency (radians/s)
T = 1/ν = 2π/ω = period (s).

The wavefunction is a harmonic function of time with frequency ν at all positions, and the amplitude and the phase are generally position dependent.
It is convenient to represent the real wavefunction u(r, t) in (2.1-2) in terms of a complex wavefunction

U(r, t) = U(r) exp(jωt), (2.1-3)

where the time-independent factor U(r) = a(r) exp[jϕ(r)] is referred to as the complex amplitude. The wavefunction u(r, t) is therefore related to the complex amplitude by

u(r, t) = Re{U(r) exp(jωt)} = (1/2)[U(r) exp(jωt) + U∗(r) exp(−jωt)], (2.1-4)

where the symbol ∗ signifies complex conjugation. At a given position r, the complex amplitude U(r) is a complex variable [depicted in Fig. 2.1-2(a)] whose magnitude |U(r)| = a(r) is the amplitude of the wave and whose argument arg{U(r)} = ϕ(r) is the phase. The complex wavefunction U(r, t) is represented graphically in Fig. 2.1-2(b) by a phasor that rotates with angular velocity ω (radians/s). Its initial value at t = 0 is the complex amplitude U(r).
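The equivalence of the real form (2.1-2) and the complex representation (2.1-3)–(2.1-4) can be verified numerically; in this minimal sketch the amplitude, phase, and frequency values are arbitrary illustrative choices:

```python
import numpy as np

# Harmonic wave at a fixed position r: real form vs. complex representation.
a, phi, omega = 2.5, 0.7, 2 * np.pi * 50.0   # amplitude, phase, angular frequency
U = a * np.exp(1j * phi)                     # complex amplitude U = a exp(j*phi)

t = np.linspace(0.0, 0.1, 1000)
u_real = a * np.cos(omega * t + phi)         # (2.1-2)
u_complex = np.real(U * np.exp(1j * omega * t))  # (2.1-4)

print(np.max(np.abs(u_real - u_complex)))    # agree to machine precision
```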
Figure 2.1-2 Representations of a harmonic wave at a fixed position r: (a) the complex amplitude U = a exp(jϕ) is a fixed phasor; (b) the complex wavefunction U(t) = U exp(jωt) is a phasor rotating with angular velocity ω radians/s.
[Figure: (a) waveforms showing the magnitude of a wave's disturbance in the region of a reference point in a medium at different times (separated by time interval Δt) as a wavelet passes through the point, with the wavelength (λ) marked in the space domain; (b) the disturbance at the reference point shown as a waveform varying in time, with the period (P) marked in the time domain.]
SUMMARY:

•  A harmonic wave repeats itself. The repeated section of the wave is known as a cycle. This is the section between two consecutive equivalent points on the waveform experiencing identical disturbance. For example, the section between two adjacent peaks or troughs, or two zero cross-overs of the same slope, represents one cycle of the wave.
•  In the space domain, the distance that one cycle of the sine wave occupies is known as the wavelength (λ) of the wave. The time-domain equivalent of wavelength is the period (P) of the wave, which is the time taken for one cycle of the wave to occur.
•  In the time domain, the number of repetitions or cycles per unit of time, i.e. cycles per second or hertz (Hz), is the frequency (f) of the wave.
•  In the space domain, the number of cycles per unit of distance, e.g. cycles per metre, is the spatial frequency or wave number (σ) of the wave. It is common for spatial frequency to be (incorrectly) referred to as frequency.
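These quantities are related by v = fλ, P = 1/f, and σ = 1/λ, which can be illustrated with a short computation; the velocity and frequency below are assumed, illustrative seismic values:

```python
# Relations among wave quantities: v = f * lam, P = 1/f, wave number sigma = 1/lam.
v = 3000.0          # assumed seismic P-wave velocity in rock, m/s (illustrative)
f = 50.0            # frequency, Hz

lam = v / f         # wavelength, m
P = 1.0 / f         # period, s
sigma = 1.0 / lam   # spatial frequency (wave number), cycles/m

print(lam, P, sigma)   # 60 m wavelength, 0.02 s period, ~0.017 cycles/m
```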
(Stein & Wysession, 2003, An Introduction to Seismology, Earthquakes, and Earth Structure. Blackwell Publishing.)
In microseismic monitoring, seismic waves with frequencies of several thousand Hz are detected and used.
The acoustic spectrum

The frequency of an acoustic wave extends over the infrasonic, audio, ultrasonic, and hypersonic bands, as defined in Fig. 2.1-10.

[Figure 2.1-10 The acoustic spectrum: a frequency axis from 1 Hz to beyond 1 GHz spanning the infrasound, audio, ultrasound, and hypersound bands, with the corresponding wavelength in air.]


FREQUENCIES OF ELECTROMAGNETIC WAVES

[Figure 2.1-8 The electromagnetic spectrum: radiowaves (VLF, LF, MF, HF, VHF, UHF), microwaves (SHF, MMW), terahertz waves, infrared (FIR, MIR, NIR), the visible band (about 390–760 nm, violet through red), ultraviolet (NUV, MUV, FUV, EUV), soft and hard X-rays, and γ-rays, with frequency from 1 kHz upward and the corresponding wavelength in vacuum.]
PENETRATION DEPTH IN SSI

•  A medium under investigation absorbs or scatters the incoming probe wave, preventing it from reaching the target.
•  Likewise, emission from a self-luminous object may be absorbed or scattered by the medium and never reach the sensor.
•  A necessary condition for successful subsurface imaging is that the wave reaching the sensor must retain sufficient power so that it is detectable with sufficient accuracy.
•  The penetration depth (the distance at which the power of the probe is reduced by a certain factor) depends on the properties of the medium and the nature of the wave, including its wavelength.
•  For example, a medium may be totally opaque to light, but penetrable by sound or X-rays.
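As an illustration, penetration depth is often modelled with a simple exponential attenuation law, P(z) = P0 exp(−αz); the attenuation coefficient below is an arbitrary assumed value, not a property of any particular medium:

```python
import numpy as np

# Exponential attenuation of probe power with depth: P(z) = P0 * exp(-alpha * z).
# The depth at which power drops by a factor of e is z_e = 1 / alpha.
alpha = 0.5                  # assumed attenuation coefficient, 1/m (illustrative)
z_e = 1.0 / alpha            # e-folding penetration depth, m

z = np.linspace(0.0, 10.0, 6)
power = np.exp(-alpha * z)   # normalized received power P(z)/P0
print(z_e, power)            # power decays from 1 toward ~0 with depth
```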
Resolution

The resolution of an imaging system is the dimension of the finest (smallest) spatial detail in the object that can be discerned in the acquired image.
Tradeoff between Penetration and Resolution: Waves of higher frequencies (shorter wavelengths) can resolve finer (smaller) details in the image, but they are generally attenuated more strongly and therefore penetrate less deeply.
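A common rule of thumb (an approximation; the exact limit depends on the method, the medium, and noise) takes the finest resolvable detail to be on the order of a quarter wavelength, λ/4 = v/(4f). A sketch with an assumed, illustrative radar-wave velocity:

```python
# Quarter-wavelength rule of thumb for resolvable detail size (approximate).
def resolution(v, f):
    """Approximate finest resolvable detail: lambda/4 = v / (4 * f)."""
    return v / (4.0 * f)

v_gpr = 1.0e8   # assumed radar-wave velocity in dry ground, m/s (illustrative)
print(resolution(v_gpr, 500e6))  # high frequency: finer detail, shallower reach
print(resolution(v_gpr, 50e6))   # low frequency: coarser detail, deeper reach
```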
EXAMPLE: Ground Penetrating Radar imaging at high frequency (a) and low frequency (b)

[Figure: GPR sections comparing penetration depth and resolution at (a) high frequency and (b) low frequency.]
Other factors affecting results of SSI

1)  Contrast of the physical property
2)  Noise arising from random fluctuation of the property. In the presence of high noise, meaningful variations of the sensed property of the object must have greater contrast in order to be distinguished from the background random fluctuations.

[Figure: a source and sensor above a rough surface, with clutter surrounding a buried target.]
Localized and Tomographic Imaging
In 3D imaging the sensed property is represented by a position-dependent function α = α(r), where r = (x, y, z).
There are two distinct approaches for addressing the localization challenge of 3D imaging:
1) Localized Imaging. In this approach, the probe and sensor spots are configured such that their intersection, which we call the scanned spot, is a single point (or tiny volume). This ensures that, for each measurement, the sensor is responsive to approximately a single point in the object. The full 3D distribution is, of course, constructed by scanning over all points of the object.

Figure 1.3-1 (a) In confocal imaging, the probe and sensor spots are co-focused onto the same point. (b) The probe beam illuminates a 2D slice of the 3D object and each slice is viewed by the sensor.
2) Tomographic Imaging.

•  In this approach, physical localization is replaced with computational localization.
•  The scanned spot is an extended region, so that a single measurement by the instrument is responsive to the sum of contributions from many points of the object within that spot.
•  The measurement is repeated from multiple views such that these spots intersect, enabling each point of the object to contribute to multiple measurements.
•  Taken together, these measurements are used to compute the individual contributions of all points.
Ray Tomography
The simplest example of tomography is ray tomography, which is the basis of X-ray computed tomography (CT).

Figure 1.3-3 Imaging by axial slicing and ray tomography within each slice.
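The idea of recovering individual cell values from summed ray measurements can be sketched as a small linear inverse problem; the 2×2 grid and ray geometry below are a toy illustration, not a description of a real CT system:

```python
import numpy as np

# Ray tomography as a linear inverse problem: each measurement is the sum
# of the property alpha over the cells a ray crosses.  Toy 2x2 grid with
# cells [a, b; c, d] flattened to [a, b, c, d].
alpha_true = np.array([1.0, 2.0, 3.0, 4.0])

A = np.array([
    [1, 1, 0, 0],   # ray through top row:    a + b
    [0, 0, 1, 1],   # ray through bottom row: c + d
    [1, 0, 1, 0],   # ray through left col:   a + c
    [0, 1, 0, 1],   # ray through right col:  b + d
    [1, 0, 0, 1],   # diagonal ray:           a + d
], dtype=float)
m = A @ alpha_true                              # simulated measurements

# Combine the views computationally: least-squares reconstruction.
alpha_est, *_ = np.linalg.lstsq(A, m, rcond=None)
print(alpha_est)                                # recovers [1, 2, 3, 4]
```

The diagonal ray makes the system full rank, so the four cell values are uniquely determined; with rows and columns alone, one degree of freedom would remain undetermined.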
Wave Tomography

At wavelengths not much shorter than the spatial scale of the details of the object, the ray model is inadequate and the probe must be treated as a wave that spreads as it travels and is modified by the spatially varying properties of the object.

Figure 1.3-5 Imaging of three scatterers by use of probe waves from multiple directions. Two views are shown.
EXAMPLE: CROSS-WELL TOMOGRAPHY
Different types of waves can be used (electromagnetic, seismic).

[Figure: sources in one borehole and sensors in another.]
Electrical Impedance Tomography

The conductivity and permittivity of the object are inferred from electrical measurements on the outer surface. A small alternating current is injected through each conducting electrode and the resulting electrical potential is measured at all other electrodes.
Seismic tomography with active sources
Dynamic, Multispectral, Multisensor, and Multiwave Imaging
•  Dynamic imaging systems deal with objects that are time varying, so that α = α(r, t) is a function of space and time, and the imaging system is four-dimensional (4D). The time dependence may be i) independent of the probe, e.g., when imaging moving, pulsating, or fluctuating targets, or ii) initiated by the probe, e.g., a molecular change or chemical reaction lasting for some time after the application of an optical or electromagnetic pulse.
•  Spectral imaging systems observe a wavelength-dependent property represented by the 4D function α = α(r, λ), which is a function of the wavelength λ. The human visual system observes color objects that are position, time, and wavelength dependent; this is a five-dimensional system.
•  Multisensor imaging instruments combine the images acquired independently by different imaging modalities, such as X-rays, radiowaves, and ultrasonic waves, to perform tasks such as the mapping of an underlying physical or biological property to which all sensors are sensitive, or the detection/classification of a target. Human and animal sensory systems combine visual (optical) and auditory (acoustic) data to achieve tasks such as target location and identification.
Multiwave imaging systems use multiple waves, e.g., electromagnetic and acoustic, that interact inside the medium, e.g., one wave generating or modulating another in such a way that the information is acquired better, or new information is delivered.
Dynamic Imaging. If the object varies slowly relative to the response time of the image acquisition system, which is determined by factors such as the width of the probe pulse and the response time of the sensor, then capturing a 4D dynamic image α(r, t) may be accomplished by simply measuring a sequence of independent static 3D images α(r, t1), α(r, t2), ... and forming a video image.
Multispectral Imaging: Since the sensed property α(r) represents the response of the medium to the incoming probe wave, it is generally frequency dependent: α = α(r, ω), where ω is the angular frequency. In certain applications it is useful to use, as probes, a set of waves/fields of different frequencies/wavelengths that measure alpha parameters α1, α2, . . . , αN of the object.
The combined measurements may be used to determine an underlying beta property β, or several such properties β1, β2, . . . , βM.
For example, a measurement of the distribution of optical absorption or fluorescence α(r) at N wavelengths may be used to determine the concentrations β1(r), β2(r), . . . , βM(r) of M different materials, molecules, or species of known spectral absorption or fluorescence profiles.
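This spectral-unmixing idea can be sketched as a least-squares problem; the material spectra below are made-up illustrative numbers, not real absorption profiles:

```python
import numpy as np

# Spectral unmixing: measured alpha at N wavelengths is modelled as a
# mixture of M known material spectra; solve for the concentrations beta.
S = np.array([          # columns: absorption profiles of materials 1 and 2
    [0.9, 0.1],
    [0.5, 0.4],
    [0.2, 0.8],
])                      # N = 3 wavelengths, M = 2 materials

beta_true = np.array([2.0, 3.0])   # unknown concentrations beta1, beta2
alpha = S @ beta_true              # measured alpha at the 3 wavelengths

# Recover the concentrations from the combined measurements.
beta_est, *_ = np.linalg.lstsq(S, alpha, rcond=None)
print(beta_est)                    # recovers the concentrations [2, 3]
```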
Multispectral imaging and thematic mapping allow researchers to collect reflection data and absorption properties of soils, rock, and vegetation. These data can be utilized by trained photogeologists to interpret surface lithologies and identify clays, oxides, and soil types from satellite imagery.

ASTER SATELLITE
http://www.satmagingcorp.com
Launch Date: 18 December 1999 at Vandenberg Air Force Base, California, USA
The ASTER instrument consists of three separate instrument subsystems:

•  VNIR (Visible Near Infrared), a backward-looking telescope which is only used to acquire a stereo pair image
•  SWIR (Short Wave Infrared), a single fixed aspheric refracting telescope
•  TIR (Thermal Infrared)
•  ASTER has 14 bands of information
(Image Copyright © NASA/Japanese Space Team)
Mine in Saline, California, ASTER Satellite Images:
http://www.satmagingcorp.com/applicatons/energy/mining/

•  The left image displays visible and near infrared bands 3, 2, and 1 in red, green, and blue (RGB). Vegetation appears red, snow and dry salt lakes are white, and exposed rocks are brown, gray, yellow, and blue. Rock colors may reflect the presence of iron minerals and variations in albedo.
•  The middle image displays short wavelength infrared bands 4, 6, and 8 as RGB. In this wavelength
region, clay, carbonate, and sulfate minerals have diagnostic absorption features, resulting in distinct
colors on the image. For example, limestones are yellow-green, and purple areas are kaolinite-rich.
•  The right image displays thermal infrared bands 13, 12, and 10 as RGB. In this wavelength region, variations in quartz content appear as more or less red; carbonate rocks are green, and mafic volcanic rocks are purple.
Multisensor Imaging: In multisensor (or multimodality) imaging, probes of different physical nature are used to measure different properties α1, α2, . . . .
These independent measurements may be used together to better reveal a single underlying property β, or to make a better decision on the existence of some anomaly. Multisensor imagers may also be used to extract different information about the object.
EXAMPLE: combining optical and sonic probes. In geophysics this is one of the directions of development.
Figure 7.1-10 Multisensor undersea imaging. (a) An autonomous remote-controlled underwater


vehicle equipped with a pair of stereoscopic cameras and an acoustic (sonar) depth sensor surveys
an underwater archaeological site of an ancient Roman shipwreck using two types of imagery of the
seafloor — acoustic and visual. The bottle-shaped objects lying on the sea floor are amphorae used
by ancient Romans to store household materials. (b) Data from the acoustic (sonar) depth sensor.
(c) A second depth image estimated from the stereoscopic cameras. (d) Results of sensor integration
combining the brightness information from the cameras and depth information from the sonar device,
aided by depth cues from the stereoscopic depth estimation algorithm. Sensor integration has served
to provide a realistic 3D virtual view of the shipwreck.
Multiwave Imaging. In multiwave imaging, waves of different physical nature are designed to interact within the medium and generate a single image with superior characteristics. Such interaction may also enable the acquisition of new physical information about the object that is not obtainable by any of the waves individually.
While multisensor imaging benefits from the post-processing of images obtained independently by two modalities, each possibly lacking in some aspect (e.g., resolution or contrast), a multiwave imaging system generates a single image designed to combine the best of each (e.g., both contrast and resolution).
Applications: laser altimetry, GPS technologies.