Geostatistical tools combined with non-conventional seismic (inversion cubes, AVO cubes, 4D seismic etc.) are increasingly being used for more comprehensive reservoir characterization and uncertainty measurement. The construction of the underlying "framework model" essential for this RC work can often be a daunting task, especially when sculpted from densely sampled 3D data, multiple reservoir levels and numerous faults.
To the working engineer, timeliness in the construction of this gross geological framework can be as important as the estimation of system uncertainty at reservoir scale. To this end, recent independent developments in both software and hardware technology are coalescing to empower new workflow approaches that greatly speed up the model building process. These innovations include automated model building, visualization, rapid log/seismic correlation, rapid depth conversion and integration breakthroughs. These new developments demand equally new workflows, which often lag in development. Like fluid flow, workflow is a dynamic entity that must be understood and controlled with technology.
This article introduces the "Decouple-Recouple"
workflow, and shows its role in both reservoir characterization and organizational
structure. It is an approach that improves integration and description
at both the macro and reservoir level by "decoupling" workflows according
to discipline. There are distinct advantages to this approach: it reduces
bottlenecks in the RC process, encourages creativity, leverages new "recoupling"
software, improves asset team coordination through a model/visualization
paradigm, encourages a more iterative approach to reservoir simulation
and helps establish a robust continuum from exploration through production.
Challenges facing practitioners of reservoir characterization are multifold. One challenge is to populate a rock volume with reservoir properties derived from sparsely sampled log and core data, and to estimate the uncertainty of the process. At the other end of the RC spectrum is the challenge of assimilating very densely sampled 3D seismic data into the reservoir model, a daunting task for different reasons and one requiring a very different set of tools. The first step in integrating the "macro" and the "micro" begins with the construction of a geological earth or framework model that can be used to guide in-depth studies at the reservoir level, such as geostatistical inversion.
Reflect for a moment on the world of "old-timer" paper-oriented geophysicists. Decades of experience have forged them into masters of the contour map, able to intuitively create realistic three-dimensional visualizations from sparse two-dimensional G&G data. Adept at applying the "contour option" (a term used to describe uncertainty in the placement of contour lines between widely spaced data points), these experts developed exceptional mapping skills that are still in demand in frontier, "2D" areas. Seven such interpreters could yield seven realizations of the same geology, all similar yet different. Each one would honor the sparsely sampled seismic and log data: final geological models would be shaped by quasi-random variations in contour options, a very human version of stochastic simulation. Comparison of results (e.g. at partner meetings) has historically been a good qualitative measure of uncertainty in the model building process.
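The "contour option" analogy to stochastic simulation can be sketched in a few lines. This is a toy, one-dimensional illustration — the function name, the random-walk perturbation and the `amp` uncertainty parameter are all invented for the example — in which each "interpreter" honors the well picks exactly but contours differently between them:

```python
import numpy as np

def contour_option_realizations(x_wells, z_wells, x_grid, n=7, amp=20.0, seed=0):
    """Toy analogue of the interpreter's 'contour option': every
    realization honors the sparse control points exactly but wanders
    randomly between them, like seven interpreters contouring the same
    data. 1D for brevity; amp (m) is an assumed between-well uncertainty.
    """
    rng = np.random.default_rng(seed)
    base = np.interp(x_grid, x_wells, z_wells)       # deterministic trend
    reals = np.empty((n, x_grid.size))
    for k in range(n):
        walk = np.cumsum(rng.standard_normal(x_grid.size))  # random wiggle
        # subtract the walk's trend through the wells so the
        # perturbation vanishes exactly at each control point
        walk = walk - np.interp(x_grid, x_wells, np.interp(x_wells, x_grid, walk))
        reals[k] = base + amp * walk / (np.abs(walk).max() + 1e-12)
    return reals
```

Comparing the spread between realizations away from the wells plays the role of the partner-meeting comparison described above.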
Yet with the advent of 3D data, contour option has shrunk from hundreds or thousands of meters to the size of a 3D bin, usually 12.5 or 25m. In the process, many of the "2D" skills acquired over decades have been trivialized by 3D data sets, as the realm of the contour option and its inherent uncertainty has shrunk to oblivion. In its place is an overwhelming density of data, which demands an entirely different skill profile and workflow to conquer. This has all occurred in one decade: it should be no surprise that workflows are stale, and generally not up to the task at hand.
The above historical anecdote is presented to emphasize the point that for workflow design, the uncertainty associated with the construction of the "macro" geological framework is often a secondary issue. A primary concern to operational staff is the timely construction of the gross geologic framework from densely sampled 3D data and well data, and the smooth integration of that model into stochastic simulations.
The time-sensitivity of model construction is often of special concern to working engineers: delay in full reservoir characterization can and does result in poor business decisions. Examples of losses due to incomplete or delayed framework models are rushed well and sidetrack locations, premature gas contract negotiations, incomplete or inaccurate reserves certifications, poorly negotiated long term rig contracts due to thin prospect portfolios, lack of iteration on reservoir simulations, etc. Reservoir characterization adds value to corporations only when it is performed in a timely manner. Like fluid flow, workflow is a dynamic entity that is constrained by barriers to flow, and stimulated through the application of proven or new technology.
This paper addresses that part of the reservoir characterization workflow
that deals with the timely construction of the gross structural and stratigraphic
framework model, and the integration of that model. Current practices are
examined in historical context, and shown to be a special case of a more
general "decouple-recouple" approach. This new family of workflows is shown
to have several important advantages. First, it reduces bottlenecks often
found in "coupled" work environments. Second, it arguably frees individuals
to work more creatively. Third, it leverages new "recoupling" software,
in the process automating tedious tasks. Fourth, it enhances asset team
management through its dependence on a highly integrated model-centric
workflow that is ideally suited for new "visualization
chambers". Finally, it helps establish a continuum from exploration through
production by the replacement of awkward GG&E coordination with more
integrated SS&E teamwork.
The Decouple-Recouple Workflow
Most classical G&G workflows involve the simultaneous mapping of faults and stratigraphy. This practice derives from the classical earth model whereby faults are defined by the abrupt termination of otherwise continuous stratigraphy. Typical 3D seismic mapping procedures involve autopicking along continuous reflections and placing fault contacts for each mapped horizon on seismic lines. These contacts are then projected to map view, and serve as the basis for fault polygons or boundaries. Repeating this process at different levels results in the delineation of both faults and structural surfaces in three dimensions. This workflow couples together stratigraphy and structure: the geometry of the faults is described by the extent to which stratigraphic levels are displaced.
This approach demands that stratigraphic correlations be error-free. When they are not, the fault framework will be in error. Similarly, bottlenecks in stratigraphic work retard the development of the fault framework.
Because of this inherent coupling, workflow problems increase along with general system uncertainty, such as in bad data areas or for extremely large projects. This is especially true when the classical layered earth model does not apply, such as in faulted, non-marine settings where (aside from sparse flooding surfaces) the only continuous features are the discontinuities. Stratigraphic correlations in this type of geology are notoriously error-prone, as both log and seismic correlations can be difficult to ascertain. The flow of work often slows to a trickle in this environment, as significant amounts of time can be spent on edits and dead ends. A different workflow is called for, one that also has advantages in more "friendly" basins.
An alternative to conventional "coupled" interpretation is to decouple
structure from stratigraphy during the interpretation/ correlation phase.
One way to accomplish this is to create artificial
horizons, such as time slices, onto which fault plane intersections
can be recorded. Because these "horizons" are independent of stratigraphy,
they are not susceptible to change as correlations evolve. This effectively
decouples structural model building from stratigraphic, and is an important
part of the decouple-recouple workflow. Importantly, it lays the groundwork
for "structuralists" to work independently from "stratigraphers", which
in many ways is a multi-disciplinary improvement over "geologists" and
"geophysicists". This redefinition of roles is only now possible with new
highly integrated software modules, which swiftly bring seismic time data
onto geological and engineering depth data and vice versa.
Timely Developments - Coherency and Correlation Tools
Mapping faults from 3D data through the use of time slices as proxy horizons is made possible through the calculation of coherency or correlation along that plane. Coherency and correlation software such as CTC’s Coherency Cube™ or GeoQuest’s Correlation Map transform flat time slices into canvases exhibiting structural information. Fault information can be digitized onto these arbitrary surfaces, which can be quickly synthesized into fault models, entirely in a "decoupled" sense. In areas of considerable dip, time slice horizons can be tilted and warped to roughly mimic dip. Known as "form slices", these smooth unfaulted surfaces offer a powerful visualized alternative to old-fashioned fault "picking" on two-dimensional seismic cross sections.
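The idea behind these tools can be illustrated with a minimal numpy sketch. This is not CTC’s or GeoQuest’s algorithm — commercial implementations use semblance or eigenstructure measures over sliding analysis windows — just the simplest correlation-based version: each trace window on a time slice is correlated with its neighbors, and low values flag discontinuities such as faults.

```python
import numpy as np

def coherency_slice(volume, t_index, half_window=5):
    """Correlation-based coherency on one time slice of a 3D volume.

    volume has shape (inline, xline, time). For every trace, a short
    vertical window centered on t_index is correlated with its inline
    and xline neighbors; the minimum of the two correlations is kept.
    Values near 1 mean continuous reflections; low values flag
    discontinuities such as faults.
    """
    ni, nx, nt = volume.shape
    t0 = max(0, t_index - half_window)
    t1 = min(nt, t_index + half_window + 1)
    w = volume[:, :, t0:t1].astype(float)
    w = w - w.mean(axis=2, keepdims=True)               # zero-mean each window
    norm = np.sqrt((w ** 2).sum(axis=2))
    w = w / np.where(norm == 0, 1.0, norm)[:, :, None]  # unit-norm each window
    c_il = (w[:-1, :] * w[1:, :]).sum(axis=2)           # inline-neighbor corr.
    c_xl = (w[:, :-1] * w[:, 1:]).sum(axis=2)           # xline-neighbor corr.
    coh = np.ones((ni, nx))
    coh[:-1, :] = c_il
    coh[:, :-1] = np.minimum(coh[:, :-1], c_xl)
    return coh
```

On a synthetic cube with a vertical time-shift across one crossline (mimicking a fault), the map is near 1 everywhere except along the fault trace, which is exactly the "canvas" onto which fault sticks are digitized.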
Though correlation and coherency tools have been primarily perceived
as stratigraphic analysis tools by industry, they perhaps reach their greatest
value as structural analysis "facilitators" within a decoupled workflow.
Animation of 3D Volumes
To many, visualization tools have been largely viewed as presentation
tools. In fact, it is now possible to greatly accelerate all stratigraphic
and structural analysis on a shared 3D canvas, viewing and manipulating
the geological framework as it develops. After constructing, viewing in
3D and interpreting the form slices described above, seismic volumes can be animated in real time through the developing framework. This permits the eye to pick up subtle structural and
stratigraphic breaks and features that would be lost on a static cross-sectional
display. Animations are "frozen"
to permit the hand-drawing of faults and facies information,
and then re-animated in an iterative manner. Substituting or combining different types of seismic volumes in the animation software is an especially powerful tool for the integration of non-conventional seismic data into the developing framework model.
The Stratigrapher’s New Environment
Decoupling structure from stratigraphy is a powerful workflow stimulant. For example, the structuralist can laboriously complete three structural maps from seismic and well data, and the stratigrapher can then instruct framework modeling software to interpolate several dozen stratigraphic surfaces in between. Not only will the software correctly determine the fault/marker intersections and fault displacement for each case, but it will also shape each map to closely resemble the hand-edited maps above and below.
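The interpolation step can be sketched as proportional slicing between the hand-edited maps. Real framework modelers also honor fault displacements and marker picks at every well; this toy (function name invented for the example) only shows how each interpolated surface inherits the shape of the bounding horizons:

```python
import numpy as np

def proportional_surfaces(top, base, fractions):
    """Interpolate stratigraphic surfaces between two hand-mapped horizons.

    top and base are 2D depth grids for the structuralist's horizons;
    fractions are each marker's relative position between them (0 = top,
    1 = base), e.g. read from a type well. Every interpolated surface
    inherits the shape of both bounding maps -- proportional slicing, a
    toy stand-in for the conformal interpolation described in the text.
    """
    top = np.asarray(top, dtype=float)
    base = np.asarray(base, dtype=float)
    f = np.asarray(fractions, dtype=float)[:, None, None]
    return (1.0 - f) * top[None, :, :] + f * base[None, :, :]
```

Given two mapped horizons, several dozen intermediate surfaces fall out in one vectorized call, which is the workflow stimulant the text describes: the stratigrapher supplies the fractions, the structuralist the bounding maps.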
Highly integrated software now allows the stratigrapher to easily import
3D seismic data "on the fly" to fill
the white space between composite logs. This enables the stratigrapher
to simultaneously correlate seismic and log data, greatly accelerating
the workflow. Similarly, well log cross sections can be brought into the shared 3D canvas, allowing the stratigrapher to view
composite logs and animated 3D seismic data simultaneously.
Because structural modeling can often be performed in parallel with stratigraphic,
it is possible to bring seismic fault cuts onto "clean" logs, a workflow
catalyst in areas where missing section is difficult to recognize.
Chronostratigraphy vs. Lithostratigraphy
Production geologists and engineers often live in a world of lithostratigraphy, whereby reservoirs are defined by lithology (i.e. the "L17a" sand) rather than age. Contrast this to chronostratigraphy, where "flooding surfaces" represent snapshots in geological time that form the basis for an age-based "sequence stratigraphic" classification of the earth. A model-centric, visualization based workflow helps bring these two doctrines together. When the stratigrapher vertically shifts the structuralist’s form slices to coincide with chronostratigraphic "snapshots" tied to well control and 3D seismic, a chronostratigraphic model can be created. Many sub-regional chronostratigraphic or flooding surfaces over an area can often be tracked in this way, effectively permitting the stratigrapher to "hang his hat" on age-dated reference surfaces.
When the computer is then instructed to excavate 10, 20, 30… meters below and/or above these "geological snapshots", lithofacies can be visualized as they should be, presented in an intuitively satisfying geological context. For each mapped chronostratigraphic surface, a stack of parallel "snapshots" can be excavated in this way. This helps form a solid framework with which to characterize facies directly from the seismic, especially when calibrated with core, dip, FMS and other non-seismic data. A systematically expanding collection of layered mini-volumes (really three-dimensional books onto whose pages lithofacies are described) slowly grows to replace the earth volume, aided where necessary by object-oriented geostatistical software. Framework construction in this manner helps unravel difficult well-to-well correlations as well as provide guidance during parameterization for later geostatistical simulations. Most importantly, it speeds up the RC process.
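The "excavation" of parallel snapshots amounts to slicing a cube along a reference surface plus a constant offset. A minimal nearest-sample sketch follows; the real conformal gridding described in the text additionally propagates correlations through the fault framework, which this toy does not attempt:

```python
import numpy as np

def excavate_snapshots(volume, horizon, offsets, dz=1.0):
    """Extract slices parallel to a chronostratigraphic reference surface.

    volume: (ni, nx, nz) amplitude/property cube sampled every dz.
    horizon: (ni, nx) two-way time or depth of the reference surface.
    offsets: vertical offsets in the same units (positive = deeper).
    Returns an (n_offsets, ni, nx) stack of 'snapshots' by
    nearest-sample extraction.
    """
    ni, nx, nz = volume.shape
    ii, jj = np.meshgrid(np.arange(ni), np.arange(nx), indexing="ij")
    out = np.empty((len(offsets), ni, nx))
    for k, off in enumerate(offsets):
        idx = np.rint((horizon + off) / dz).astype(int)
        out[k] = volume[ii, jj, np.clip(idx, 0, nz - 1)]
    return out
```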
With new visualization and geophysical tools available, the stratigrapher
can no longer afford to be a pure geologist. The need to integrate advanced
data such as AVO and inversion cubes is simply too important. For example,
log-derived cross plots of acoustic impedance versus reservoir properties
can sometimes predict where particular lithologies or gas saturation values
lie on the seismic AI spectrum. When this is the case, the stratigrapher
can then "make invisible" lithologies or gas saturation ranges that are not of interest. Furthermore, rendering "snapshots"
slightly transparent permits "optical stacking" of several snapshots, revealing
thick facies that can be otherwise
difficult to see. By using a full repertoire of G&G tools, the
stratigrapher can visualize areal patterns embedded in the 3D earth volume
on the basis of thickness, fluid content and/or lithology. This approach
can significantly speed up the reservoir characterization by empowering
the stratigrapher to quickly integrate well and seismic control, geological
intuition, and reservoir properties.
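The "make invisible" and "optical stacking" operations reduce to an opacity transfer function applied to the impedance cube. A hedged sketch, assuming the acoustic-impedance band of interest has already been read off a log crossplot (function names and the simple binary opacity are illustrative; commercial renderers use continuous transfer functions):

```python
import numpy as np

def opacity_from_impedance(ai_cube, ai_min, ai_max):
    """Toy opacity transfer function. [ai_min, ai_max] is the acoustic-
    impedance band that a log crossplot suggests corresponds to the
    lithology or gas-saturation range of interest. Voxels outside the
    band get opacity 0 ('invisible')."""
    ai = np.asarray(ai_cube, dtype=float)
    return np.where((ai >= ai_min) & (ai <= ai_max), 1.0, 0.0)

def optical_stack(snapshots, opacity):
    """'Optically stack' several semi-transparent snapshots: average only
    the visible voxels along the stacking axis, so facies visible on
    several adjacent snapshots reinforce each other."""
    w = opacity.sum(axis=0)
    denom = np.where(w > 0, w, 1.0)
    return np.where(w > 0, (snapshots * opacity).sum(axis=0) / denom, 0.0)
```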
The above scenario sounds straightforward, but until recently has been difficult to apply in practice. The reason is that chronostratigraphic flooding surfaces are often faulted, and properly "excavating" parallel beds above and below a complexly faulted surface has been a tedious and time consuming manual task. This has changed with the release of automated framework modeling software and "conformal" gridding techniques, which combine to "excavate" through reservoirs by propagating stratigraphic correlations through the fault framework in an intelligent manner. The software is quite sophisticated, and properly determines structural/stratigraphic intersections. This also renders anachronistic such non-physical artifacts as fault "polygons", "contacts" and other concessions that 2D geoscientists have always made to the 3D world.
This is recoupling at its finest: computer automation using hardwired
rules to recombine work done by focused and creative individuals, in this
case structuralists and stratigraphers. Complementary new technologies
recouple the work of velocity experts, petrophysicists and engineers.
Workflow Stimulant - Automated Depth Conversion
When 3D seismic data is available to aid in reservoir characterization, arguably the most important and difficult step is to carefully "tie" the seismic data (in time) to the well data (in depth). This can be a tedious process that involves expert log editing and conditioning, "wavelet extraction" and the integration of various types of geological and geophysical data pairs. The final product is only as good as the registration of "two way seismic time" data to "depth" data.
Many non-geophysicists are surprised to learn just how many ways there are to convert reservoir models constructed in seismic time to depth. Methods range from a simple "layercake" approach to rigorous and expensive pre-stack depth imaging. All have their place, especially when the need for accuracy is balanced against the demand for timeliness. For areas not plagued by severe imaging problems such as those found in sub-salt settings, it is often preferable to accept greater uncertainty in exchange for a significant reduction in cycle time.
One new approach involves the en-masse, one step conversion of all reservoir data from time-to-depth and back. This is accomplished by evolving a "domain conversion" velocity cube over the life of the project, constructed in a decoupled manner by a velocity specialist. Incorporating sonic logs, marker/seismic pairs, check shot surveys and seismic velocities, the cube greatly reduces a historical bottleneck by replacing tedious layer-by-layer depth conversion with hardwired, automated rules. Where this approach is applicable, workflow stimulation results from tight integration, not new technology.
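The en-masse conversion reduces, per trace, to integrating interval velocity over two-way time. A minimal sketch — a real domain-conversion cube interpolates in 3D, while the trace-by-trace version shown here (names illustrative) is the simplest case:

```python
import numpy as np

def twt_to_depth(vint, dt, twt):
    """Convert two-way-time picks to depth with one interval-velocity
    trace, e.g. pulled from a project's domain-conversion velocity cube.

    vint: 1D interval velocities (m/s), one value per dt seconds of TWT.
    Depth accumulated per sample is vint * dt / 2 (the factor 2 because
    two-way time traverses each interval twice); twt values are linearly
    interpolated between the accumulated samples.
    """
    vint = np.asarray(vint, dtype=float)
    times = np.arange(vint.size + 1) * dt
    depths = np.concatenate([[0.0], np.cumsum(vint * dt / 2.0)])
    return np.interp(twt, times, depths)
```

Because the rule is hardwired, every marker, fault stick and surface in the model can be pushed through the same function in one pass, and back again, which is what replaces the layer-by-layer conversion bottleneck.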
The uncertainty associated with depth conversion can be investigated
by supplementing the velocity cube with a stochastic velocity model that
incorporates the uncertainty in the velocity measurements. This approach
can yield multiple depth realizations, in so doing increasing an understanding of total system uncertainty.
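A hedged sketch of the idea: perturb the velocity model, convert each perturbed model to depth, and inspect the spread. The single scaling factor per realization is an assumption made purely for brevity; a genuine stochastic velocity model would perturb velocities spatially, conditioned to the well data.

```python
import numpy as np

def depth_realizations(vavg_map, twt_map, sigma_frac=0.02, n=100, seed=0):
    """Monte Carlo sketch of depth-conversion uncertainty.

    vavg_map: average-velocity map (m/s) down to a horizon; twt_map: the
    horizon's two-way time (s). Each realization scales velocity by one
    Gaussian factor (std = sigma_frac) -- deliberately crude next to a
    full stochastic velocity model, but enough to propagate velocity
    uncertainty into a spread of depth maps.
    """
    rng = np.random.default_rng(seed)
    factors = 1.0 + sigma_frac * rng.standard_normal(n)
    base_depth = np.asarray(vavg_map, float) * np.asarray(twt_map, float) / 2.0
    return factors[:, None, None] * base_depth[None, :, :]
```

The per-node standard deviation across realizations is a first-order map of depth uncertainty, which can then feed the stochastic simulations discussed later.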
3D Log/Seismic Property Modeling
The correlation of reservoir properties and seismic attributes on a reservoir by reservoir basis is an essential and primary step in the formulation of a 3D seismic-based stochastic model (Schultz et al, 1994; Trappe and Hellmich, 1998). By channeling all SS&E work in a model-centric direction, the "macro" geological model can now be superimposed directly onto the well, core and seismic databases. This eases data analysis between the macro framework level and the "micro" reservoir level, possibly the most difficult and least robust step in the reservoir characterization chain.
The comparison of log/core versus seismic response at well locations
results in a series of rules by which "hard" or "conditional" well data
is interpolated and extrapolated into undrilled rock, using one of a number
of approaches. These rules (determined by multivariate data analyses) need
to be re-run whenever one of the decoupled workflows evolves. Thus, changes
in stratigraphic correlation, fault framework, Rw calculation, velocity
control, etc. need to be incorporated in updated log/core/seismic rules
so that reservoir properties can be re-mapped in an iterative manner. Similarly,
variograms should be periodically recalculated so that the measurement
of uncertainty improves as a project develops, in a prompt and "evergreen"
manner. Reducing the cycle time between the decoupled
and recoupled portions of workflow in this manner greatly speeds up
the entire reservoir characterization process.
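The rule-fitting and re-mapping loop can be sketched with a single-attribute linear rule. Real projects use multivariate analyses, as the text notes, and the function names here are illustrative:

```python
import numpy as np

def fit_property_rule(ai_at_wells, phi_at_wells):
    """Fit a linear 'rule' relating a seismic attribute (here acoustic
    impedance) to a reservoir property (here porosity) at the wells.
    Returns (slope, intercept) from least squares. This is the step to
    re-run whenever any decoupled workflow updates its inputs."""
    a, b = np.polyfit(np.asarray(ai_at_wells, dtype=float),
                      np.asarray(phi_at_wells, dtype=float), 1)
    return a, b

def apply_property_rule(ai_cube, rule):
    """Extrapolate the well-derived rule into undrilled rock: map the
    attribute cube to a property cube."""
    a, b = rule
    return a * np.asarray(ai_cube, dtype=float) + b
```

When stratigraphic correlations, the fault framework, Rw calculations or velocity control change, the well extractions change, the rule is refit in seconds, and the property cube is re-mapped — the "evergreen" iteration described above.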
Decouple-Recouple, Creativity and Asset Teams
Until this point the article has discussed the role of framework characterization in the RC workflow, and how new, highly integrated software allows a decouple-recouple approach to accelerate this process. It was also argued that decoupling renders anachronistic the historical roles of the geologist and geophysicist, delivering instead the true stratigrapher and structuralist into the asset team. What is perhaps not immediately obvious is the role that decoupling has on creativity and innovation in the RC process.
Asset teams are the culmination of a logical strategy that evolved to tackle the reservoir characterization challenge. They are a way by which to mobilize and coordinate several disciplines for a particular project. What may not be fully appreciated is that this is one of several alternative, contrasting strategies.
One alternative approach arose in the exploration community in the 1980s, when the "explorationist" emerged from the ranks of geologists and geophysicists. Here the goal of integration was to enhance the individual, not the team. Though this approach became quite popular, it was and is limited by the individual: not many can master geology, geophysics AND engineering. Hence there appeared no "developmentist" or "productionist": the asset team performed that role.
But asset teams have drawbacks. Group dynamics can often stifle creativity, inhibiting innovation. Business units that do not innovate do not grow, and unfortunately many of the industry’s best oil and gas "finders" can be a touch eccentric and difficult to work with in a team setting. Until recently there has been no way to retain the coordination that asset teams provide without also accepting the drag they often place on individual productivity.
Much of creativity lies with individual effort. An apt metaphor is the stroll through the park: while the construction and upkeep of the park is certainly a "team" effort, the statues found there commemorate exceptional contributions from individuals: soldiers, statesmen, scientists, etc. It is rare to find statues of committees or asset teams in parks, as essential as they are for progress. Societies and production departments are not all that different: the innovations that help companies grow often originate as individual effort.
A workflow that allows asset teams and individuals to coexist adds value.
The challenge then is to allow individual stratigraphers, structuralists,
engineers, petrophysicists and velocity experts to work towards proper
reservoir characterization in isolation as need be, and yet to smoothly
recouple their efforts for synergy and timely decision making. A decouple-recouple
workflow approach, only now possible with emerging technologies, would
seem to be one solution.
Building the Continuum: RC from Exploration through Production
It is difficult enough for individuals to consistently cooperate with individuals in a corporate setting. But when individuals such as explorationists need to interface with teams, the problem becomes even more complex. Perhaps it is no wonder that exploration and production departments often fail to create a continuum of workflow from initial discovery through field development. This continuum however can be more naturally created with a decoupled approach to workflow, whereby stratigraphers, structuralists, petrophysicists, engineers and velocity experts work the reservoir characterization process from exploration through to production, individually and as part of different asset teams during field development.
There needs to be a continuum all through the life of the field: the decoupled workloads do not fundamentally change as the reservoir becomes better known through time. What does change is the focus, timeliness and final deliverables required of individuals during each phase.
During the exploration phase, the focus is regional, extending from
the earth’s surface down to the deep crust: final deliverables reflect
this breadth. During development and production, the focus converges to
bracket the shallowest and deepest reservoir of interest, and most or all
final deliverables need to be of immediate use to engineers.
The Role of "Visualization Chambers"
In the decouple-recouple workflow, the asset team as such is ephemeral: it assembles itself to recouple and coordinate the individual workflows that comprise the reservoir characterization process, and then disassembles to continue in a decoupled mode. While assembled, individual work is coordinated and timely business decisions are made. This is possible because all work is oriented towards the population of a reservoir model, and that model can and should be visualized in a team setting.
Very recent commercial releases of visualization technologies allow
asset teams to view the reservoir model in a "visualization chamber". This
provides a good venue for asset team coordination and communication. Combined
with recoupling software, these new visualization technologies permit asset
teams to work together for a highly integrated approach to reservoir characterization.
Importantly, the "chamber" should be the venue for making timely business
decisions and coordinating team efforts: the lion’s share of the work usually
needs to be done in a decoupled mode, away from the assembled team. Usually,
but not always: one attraction of a decouple-recouple approach is that a balanced workflow can be tailored according to team chemistry and dynamics.
Implementation: As Important as the Design
For too many years, oil companies have been savaged by economic downturns, only to be stressed in the upturns by understaffed and overworked asset teams. Well-intended TQM and optimization exercises often derail these overworked groups through distracting re-tooling experiments. The workflow cures are often worse than the sickness.
The design and implementation of the decouple-recouple workflow is best
performed in a "non-invasive" mode. Special workflow teams (in-house or
from outside) can quickly evaluate the workflow needs of a "target" asset
team, and then recreate the asset team’s work off-site while the asset
team gets on with their business. This offsite work reassembles the project
within a decouple-recouple process, which is followed by on-site implementation
of the updated project data. This method of implementation imbues the asset
team with both new technologies AND momentum. Continual improvement ensures
success. This four-fold process of evaluation, process, implementation
and continuity is known by the acronym EPIC™, and is a "decoupled" implementation strategy well suited for this family of workflows.
The purpose of this paper was to discuss the role of reservoir framework characterization in the overall reservoir characterization workflow. It was shown that this workflow deals with both sparsely and very densely sampled data, and that the timeliness of the work can be as important as the need for accuracy, precision and the measurement of uncertainty.
As occasionally happens in industry, several independent developments have coalesced to make the "decouple-recouple" workflow a reality. The use of "artificial" seismic horizons is only possible because of developments in correlation and coherency processing, which convert these surfaces to interpretable canvases. Tight integration only now allows geologists to easily bring seismic into their daily workflow, enabling them to become stratigraphers in counterpoint to the new "structuralists". New recoupling software allows asset teams to harvest individual work from the various disciplines. Visualization software now permits the active manipulation and interpretation of data to quickly construct models. New visualization hardware permits asset teams to view these models in a timely, efficient manner.
The decouple-recouple approach divides the workflow into two phases. The first is an individual-oriented decoupled phase, whereby the important "G&G" component is replaced by "S&S", structure and stratigraphy. The SS&E team members work alongside velocity specialists and petrophysicists in a manner designed to maximize both creativity and productivity. The second "recoupling" phase uses powerful new software to automate usually tedious processes such as framework building, depth conversion and log/seismic property modeling. This important phase permits asset teams to be quickly assembled in order to coordinate their individual work onto one model, and to apply that model to business decisions. Visualization software and even "viewing chambers" play an important role in this coordination through flexible viewing of the asset team’s model. For similar reasons, the dynamic coexistence of the "macro" geological model and the "micro" well, core and log database greatly speeds up the iteration of simulations, both stochastic and fluid flow.
Lastly, the parallel development over the last decade of the "explorationist" and asset teams has been shown to be a contributory factor to the historic
gulf that exists between exploration, development and production. It is
shown that the decouple-recouple workflow helps to create a continuum between
these end points, by creating a generic individual/team structure that
should be universally applicable. In this context, the differences between
the needs of exploration and production become ones of focus and deliverables,
not of organizational structure. As fields develop from first discovery
through to production, individual and team focus narrows to bracket the
reservoir, and deliverables take on a decidedly engineering-support flavor.
The author would like to thank Schlumberger GeoQuest for access to their
advanced SS&E software, and for continued and enlightened cooperation
between GeoQuest and The Energy Outpost in advancing Development and Production
workflows to the oil and gas industry.
Dubrule, O., Thibaut, M., Lamy, P., and Haas, A 1998 Geostatistical Reservoir Characterization constrained by 3D Seismic Data; Petroleum Geoscience, v.4, p121-128.
Trappe, H. and Hellmich, C. 1998 Seismic Characterization of Rotliegend Reservoirs: from Bright Spots to Stochastic Simulation; First Break 16, 79-87.
Schultz, P.S., Ronen, S., Hattori, M., and Corbett, C. 1994 Seismic Guided Estimation of Log Properties; Leading Edge 13, 5-7.
Tobias, S., Ahmed, U, and Brown, D., 1998, Integrated Workflow Methodologies for Asset Teams; Presented at the Asia Pacific Conference of Integrated Modelling for Asset Management, SPE, 23-24 March 1998, Kuala Lumpur, Malaysia
"Coherency Cube" is a trademark of Coherency Technologies Corporation. All rights reserved.
"EPIC" workflow design and "Decouple-Recouple" workflows are © 1998 The Energy Outpost Company. All rights reserved.