This part of my web pages lists important issues
that are specific to interpreting gravity and magnetic data for petroleum exploration,
defines them, and describes my philosophy or approach for dealing with them.
General Properties of Gravity and Magnetic Data
As with most geophysical data, anomaly wavelength
is directly proportional to the distance (or depth) from the source that produces it. Long
wavelengths are produced by deep sources and short wavelengths are produced by shallow
sources. Anomaly amplitudes are generally proportional to contrasts in density,
magnetization, acoustic impedance, electrical resistivity, etc. High amplitudes are
produced by large contrasts and low amplitudes are produced by small contrasts. Although a
source's size and geometry also affect the shape of the anomaly it produces, the simple
relationships for amplitude and wavelength described above are good rules of thumb to keep
in mind.
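These rules of thumb are easy to verify numerically. The sketch below uses the standard attraction formula for a buried sphere; the depths, radii, and density contrasts are made up purely for illustration. Doubling the source depth lowers the peak amplitude and broadens the anomaly, while doubling the density contrast simply doubles the amplitude:

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def sphere_gz(x, depth, radius, drho):
    """Vertical gravity anomaly (mGal) of a buried sphere along a profile.
    x: horizontal distance from the point above the center (m);
    depth, radius in m; drho: density contrast in kg/m^3."""
    mass = (4.0 / 3.0) * np.pi * radius**3 * drho
    gz = G * mass * depth / (x**2 + depth**2) ** 1.5  # m/s^2
    return gz * 1e5  # convert to mGal

x = np.linspace(-5000.0, 5000.0, 1001)
shallow = sphere_gz(x, depth=500.0, radius=200.0, drho=300.0)
deep = sphere_gz(x, depth=2000.0, radius=200.0, drho=300.0)

def half_width(x, gz):
    """Full horizontal width over which the anomaly exceeds half its peak."""
    half = gz.max() / 2.0
    return 2.0 * np.abs(x[gz >= half]).max()

# Deeper source: lower amplitude and longer wavelength (wider anomaly).
print(shallow.max(), deep.max())
print(half_width(x, shallow), half_width(x, deep))
```

The same function also shows the amplitude-contrast rule: calling it with the density contrast doubled returns exactly twice the anomaly.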
Another aspect of gravity and magnetic data to
keep in mind is that anomalies can only be produced by horizontal changes in rock
properties, i.e., density and/or magnetization. Multiple horizontal layers, with only
vertical changes in density and magnetization, will not produce anomalies regardless of
the magnitudes of these contrasts. Only at the edges of such features, where horizontal
density and magnetization contrasts exist, can potential field anomalies be produced.
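This point can be illustrated with the standard Bouguer-slab formula and a thin, semi-infinite (faulted) sheet approximation; the layer thickness, depth, and contrast below are made-up values. A flat, uniform layer shifts the field by the same constant everywhere, so it produces no mappable anomaly; all of the field variation is concentrated at the truncated edge, where a horizontal density contrast exists:

```python
import numpy as np

G = 6.674e-11  # m^3 kg^-1 s^-2
mGal = 1e5     # m/s^2 to mGal

def slab_gz(thickness, drho):
    """Gravity effect of an infinite horizontal slab (Bouguer slab).
    Independent of position and of slab depth: a flat layer adds a
    constant but produces no anomaly."""
    return 2.0 * np.pi * G * drho * thickness * mGal

def faulted_slab_gz(x, depth, thickness, drho):
    """Thin semi-infinite sheet truncated at x = 0: the field varies
    only near the edge, where the horizontal contrast exists."""
    return 2.0 * G * drho * thickness * (np.pi / 2.0 + np.arctan(x / depth)) * mGal

x = np.linspace(-10000.0, 10000.0, 201)
edge = faulted_slab_gz(x, depth=1000.0, thickness=200.0, drho=300.0)

# Far from the edge the field flattens toward 0 (slab absent) and toward
# the full Bouguer value (slab present); the gradient sits over the edge.
print(slab_gz(200.0, 300.0), edge[0], edge[-1])
```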
This property is important, especially when
compared to exploration with reflection seismic data. Reflection seismic data shows, for
the most part, vertical changes in rock properties, i.e., acoustic impedance. Acoustic
impedance is the product of a rock layer's density and its velocity. Vertical
changes in acoustic impedance cause seismic waveforms to reflect all or part of their
energy as the waveforms travel down into the Earth. Hence, the use of potential fields
data with seismic data is naturally complementary.
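As a small worked example of the impedance relationship (the rock-property values below are generic textbook-style numbers, not measurements), the normal-incidence reflection coefficient is the fractional impedance change across the interface:

```python
def reflection_coefficient(rho1, v1, rho2, v2):
    """Normal-incidence reflection coefficient between two rock layers.
    Acoustic impedance Z is density times velocity; the vertical change
    in Z controls how much seismic energy is reflected."""
    z1, z2 = rho1 * v1, rho2 * v2
    return (z2 - z1) / (z2 + z1)

# Shale (2400 kg/m^3, 2700 m/s) over limestone (2600 kg/m^3, 4500 m/s):
rc = reflection_coefficient(2400.0, 2700.0, 2600.0, 4500.0)
print(round(rc, 3))
```

With no vertical impedance change there is no reflection at all, just as flat layers with no horizontal contrast produce no gravity or magnetic anomaly.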
For an explanation of my background image, which
shows elements of both gravity and magnetic data, go to Background
With the Global Positioning System (GPS), all
sorts of geographic information and data are improved including gravity and magnetic data.
GPS-located data make use of a network of satellites that allows very high positional
accuracy. Smaller and more powerful computers, and improved data acquisition and
processing algorithms, also contribute to our ability to collect better data than ever.
Terms such as data (or instrument) resolution,
accuracy, and sensitivity are frequently used interchangeably, but we typically think of
the smallest amplitudes, and shortest wavelengths larger than "background
noise", as the limits that define data quality. I think in terms of the
smallest amplitudes and wavelengths that I can interpret confidently.
Data Filtering and Anomaly Enhancement
It is often useful to filter anomalies in order
to enhance some wavelengths at the expense of others. Many techniques are available, but
their pitfalls should be kept in mind. These pitfalls are usually related to artifacts
produced by the enhancement method used. Therefore, it is best to continually reference
filtered maps back to the original data maps.
Artifacts related to Fourier-type filtering such
as "ringing" along the edges of maps are undesirable and sometimes unavoidable.
For this reason, analyses that use wavelet transforms may be useful. Check out the
following books if you're interested:
Strang, G., and Nguyen, T., 1996, Wavelets and Filter Banks: Wellesley-Cambridge Press.
Axler, S., Bourdon, P., and Ramey, W., 1992, Harmonic Function Theory: Springer-Verlag.
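One common wavelength filter is upward continuation, which attenuates each wavelength by exp(-|k|h) in the Fourier domain. The sketch below uses synthetic sine-wave "anomalies" with made-up wavelengths: a 500 m continuation suppresses a short-wavelength (shallow-source) component while largely preserving a long-wavelength one. On real maps, this same Fourier machinery is where edge "ringing" artifacts creep in:

```python
import numpy as np

def upward_continue(profile, dx, height):
    """Upward-continue a 1-D anomaly profile by `height` meters using
    the Fourier-domain operator exp(-|k| h), which attenuates short
    wavelengths (shallow sources) and preserves long ones."""
    k = 2.0 * np.pi * np.fft.fftfreq(profile.size, d=dx)
    return np.fft.ifft(np.fft.fft(profile) * np.exp(-np.abs(k) * height)).real

x = np.arange(0.0, 20000.0, 50.0)
long_wl = np.sin(2.0 * np.pi * x / 10000.0)   # 10 km wavelength ("deep" source)
short_wl = np.sin(2.0 * np.pi * x / 1000.0)   # 1 km wavelength ("shallow" source)

# The short wavelength is suppressed far more strongly than the long one.
filtered = upward_continue(long_wl + short_wl, dx=50.0, height=500.0)
```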
Depth-to-Magnetic Source Analyses
Estimating depths to potential fields anomaly
sources is typically restricted to magnetic data because of the distribution of
density and magnetization beneath the Earth's surface. That is, magnetization
contrasts between crystalline basement rocks and overlying sedimentary rocks are large.
Volcanic rocks within the sedimentary section also have large magnetization contrasts.
Density, on the other hand, does not vary so dramatically. So magnetic anomalies are
generally thought to be produced by distinct source bodies, while gravity anomalies may be
produced by the cumulative effect of several density variations.
There are several techniques for estimating the
depths to magnetic sources, both manual (or graphical) and automated. All methods estimate
the depth to the source based on the horizontal distance spanned by the slope of an
anomaly over a source body. Computer-based automated techniques are simply
"brute-force" algorithms that replicate the manual techniques. Hence, automated
methods generate many more depth solutions than necessary, and the
interpreter's job is to select the correct estimates.
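As a sketch of the graphical flavor of these methods, the half-width rule for a compact, sphere-like source estimates depth from the horizontal distance over which the anomaly falls to half its peak; the 1.305 factor is the standard sphere half-width coefficient, and the test profile below is synthetic:

```python
import numpy as np

G = 6.674e-11  # m^3 kg^-1 s^-2

def sphere_gz(x, depth, radius, drho):
    """Vertical gravity anomaly of a buried sphere (m/s^2)."""
    mass = (4.0 / 3.0) * np.pi * radius**3 * drho
    return G * mass * depth / (x**2 + depth**2) ** 1.5

def depth_from_half_width(x, gz):
    """Half-width rule: for a sphere-like source, depth to center is
    about 1.305 times the horizontal distance from the peak to the
    point where the anomaly falls to half its maximum."""
    peak = gz.argmax()
    half = gz.max() / 2.0
    x_half = np.abs(x[gz >= half] - x[peak]).max()
    return 1.305 * x_half

x = np.linspace(-5000.0, 5000.0, 2001)
gz = sphere_gz(x, depth=1200.0, radius=300.0, drho=400.0)
print(depth_from_half_width(x, gz))  # close to the true 1200 m depth
```

Automated schemes essentially apply rules like this at every point along a profile, which is why they produce clouds of solutions for the interpreter to cull.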
For more, see depth-to-magnetic
Modeling and Inversion
Models are useful for testing ideas and for
providing quantitative support for an interpretation, so interpretation confidence can
be enhanced with supporting models. There are two basic types: forward and inverse.
A forward model is constructed by assigning
appropriate density and magnetization properties to defined polygonal Earth layers.
Gravity and magnetic anomalies can then be calculated for the hypothetical model and
compared with observed data. Rock properties, and Earth layer shapes, can then be modified
and the anomalies they produce can be re-calculated for comparison with original data. The
process continues until a reasonable "fit" between calculated and observed data is achieved.
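The forward-modeling loop can be caricatured in a few lines. In this sketch the "observed" data are synthetic, computed from a known sphere model, and only the density contrast is varied; in practice, layer geometries are adjusted as well:

```python
import numpy as np

G = 6.674e-11  # m^3 kg^-1 s^-2

def sphere_gz_mgal(x, depth, radius, drho):
    """Vertical gravity anomaly (mGal) of a buried sphere."""
    mass = (4.0 / 3.0) * np.pi * radius**3 * drho
    return G * mass * depth / (x**2 + depth**2) ** 1.5 * 1e5

x = np.linspace(-4000.0, 4000.0, 161)
observed = sphere_gz_mgal(x, depth=1500.0, radius=400.0, drho=350.0)

# Trial-and-error forward modeling: adjust the density contrast of a
# hypothetical model until the calculated anomaly fits the observed data.
best = None
for trial_drho in np.arange(100.0, 600.0, 50.0):
    calc = sphere_gz_mgal(x, depth=1500.0, radius=400.0, drho=trial_drho)
    misfit = np.sqrt(np.mean((calc - observed) ** 2))
    if best is None or misfit < best[1]:
        best = (trial_drho, misfit)
print(best)  # the 350 kg/m^3 trial gives the smallest RMS misfit
```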
Inverse modeling assumes an initial Earth model,
then uses original data to modify either the model's geometries or rock properties.
Inversion to modify model geometries is a non-linear operation while inversion to modify
model rock properties is a linear operation. Ironically, the quality or success of inverse
modeling is largely dependent upon the accuracy of the starting model. If the input model
is well constrained, then inverse modeling results may be very good.
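The linear case is easy to demonstrate: if the geometry is fixed, the anomaly is a linear combination of per-unit-density kernels, and least squares recovers the rock properties. Everything below, including the two spheres, their depths, and the "observed" profile, is a made-up example:

```python
import numpy as np

G = 6.674e-11  # m^3 kg^-1 s^-2

def sphere_kernel(x, depth, radius):
    """Anomaly (mGal) per unit density contrast for a fixed sphere geometry."""
    vol = (4.0 / 3.0) * np.pi * radius**3
    return G * vol * depth / (x**2 + depth**2) ** 1.5 * 1e5

x = np.linspace(-6000.0, 6000.0, 121)

# Fixed geometry: two spheres at known positions and depths; the unknowns
# are their density contrasts, so the forward problem is linear: d = A @ rho.
A = np.column_stack([
    sphere_kernel(x + 2000.0, depth=1000.0, radius=300.0),
    sphere_kernel(x - 2500.0, depth=1800.0, radius=500.0),
])
true_rho = np.array([250.0, -150.0])
observed = A @ true_rho

# Linear least-squares inversion recovers the density contrasts.
est_rho, *_ = np.linalg.lstsq(A, observed, rcond=None)
print(est_rho)
```

Changing the geometries themselves makes the kernels depend on the unknowns, which is why geometric inversion is non-linear and leans so heavily on the starting model.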
Both modeling exercises can be useful while
interpreting; however, one must always understand that models are generalizations and can
never approach the true complexity of the Earth. Still, they can be enlightening if data
quality is good and the caveats associated with modeling are kept in mind.
Depth-to-source analyses and modeling are
quantitative methods, however qualitative techniques are equally important for
interpreting potential fields data. Unfortunately, these methods can appear nebulous.
They involve identifying linear features formed by anomaly shapes and gradients, inferring
structures analogous to important structures mapped from other data, suggesting regions
that may be depocenters or carbonate buildups, or identifying geologic boundaries that may
play important roles for exploration.
Quantitative analyses are somewhat mechanical,
but qualitative analyses can be the glue that holds an interpretation together and give it
strength. They require geologic knowledge, creativity, and vision. Note that all four
figures on this page are parts of the same geologic model and the lower part, directly
above left, is an example of filtered data (derived from the upper part of the figure).
The term "non-uniqueness" refers to the
fact that any gravity or magnetic anomaly can be produced by an infinite number of
different sources from an infinite number of depths. Although this is mathematically true,
it is not geologically true.
For example: the Richter scale, used to measure
earthquake magnitudes, is open-ended. This means that mathematically an earthquake that
measures 20, 30, 40, or even 500 is possible. Of course this is ridiculous, because the
limiting factors are the Earth's properties; that is, there are physical limits to
how much deformation or strain Earth materials can support before they rupture and cause
earthquakes.
The same is true for gravity and magnetic data
interpretation. Even with cursory geologic knowledge of a region, the "infinite"
number of possible anomaly sources can be limited. A study area may be a rift basin, or
along a passive margin, or over a foreland basin, etc. So we know the regional geologic
setting, the principal directions of compression and extension, and the probable geologic
structures that might be inferred from just this first-pass information. Therefore,
by simply considering an area's geology, the data are interpretable and are no longer
non-unique.
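The mathematical side of non-uniqueness is easy to demonstrate: outside a sphere, its attraction depends only on its total anomalous mass, so a small dense body and a large weak one at the same depth are indistinguishable from surface data alone. The parameters below are arbitrary:

```python
import numpy as np

G = 6.674e-11  # m^3 kg^-1 s^-2

def sphere_gz(x, depth, radius, drho):
    """Vertical gravity anomaly of a buried sphere (m/s^2)."""
    mass = (4.0 / 3.0) * np.pi * radius**3 * drho
    return G * mass * depth / (x**2 + depth**2) ** 1.5

x = np.linspace(-5000.0, 5000.0, 401)

# A small dense sphere and a larger, weaker sphere with the same total
# anomalous mass (doubling the radius multiplies the volume by 8, so the
# contrast is divided by 8) produce identical anomalies at the surface:
a = sphere_gz(x, depth=1500.0, radius=200.0, drho=800.0)
b = sphere_gz(x, depth=1500.0, radius=400.0, drho=100.0)
print(np.allclose(a, b))  # True
```

It is geologic knowledge, not the data alone, that decides between such mathematically equivalent sources.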
Interpretation of Magnetic Anomalies
Produced within the Sedimentary Section, sometimes called "Micro-Magnetics"
Micro-Magnetics has been a controversial topic
for over 20 years but has yet to be "ground-truthed". The idea is that
hydrocarbon micro-seepage creates a reducing environment over a reservoir, which causes
minerals like hematite to be converted to magnetite. Hence, magnetic anomalies produced by this
authigenic (or diagenetic) magnetite should be direct indicators of oil and gas
accumulations.
It is well documented that non-magnetic minerals
can be reduced by hydrocarbon micro-seepage into magnetic minerals, but to my knowledge no
one has ever shown that they occur in quantities that produce measurable magnetic
anomalies.
Since data quality today is so much better than
that of the past, we are seeing coherent magnetic anomalies that can only be produced
within the sedimentary section. Unfortunately the database needed for identifying the
rocks and/or minerals that produce these anomalies does not exist.
My approach is that this new generation of subtle
anomalies is produced by vertical offsets of magnetized horizons, or by precipitation of
magnetic (or non-magnetic) minerals along a fault plane; and that they are mappable,
geologic, and interpretable.
For more see micro-magnetic
The importance of integrating geological and
geophysical information cannot be overstated. If our information were infinite, then
all exploration wells would be successful. Since this is not the case, it is crucial to
access all information for a study area. Each bit of data helps to constrain the possible
interpretations.
Confidence in gravity and magnetics
interpretations depends on data quality, the goals of the study, and the amount and quality of
supporting information.