Framing risk for environmental science and environmental scientists

‘Risk: The potential for adverse consequences for human or ecological systems, recognising the diversity of values and objectives associated with such systems.’ This widely used definition both clarifies and clouds how environmental scientists can discern and address risk. While it provides a broad basis of understanding, it also raises numerous questions: what is risk composed of; where can it be found; is it stand-alone, systemic, or both; and how can we deal with it when we encounter it?

Mitigating the gradient artefacts of Migration Velocity Analysis by Gauss-Newton update

Full waveform inversion (FWI) is a velocity-estimation method that minimizes the misfit between recorded and modeled data, the parameters of the minimization being the velocity model. If the velocity model is smooth, only the refracted waves are modeled and used; if the velocity model contains discontinuities, the reflected waves created by two-way modeling can also be matched against the recorded reflections. This highly non-linear algorithm can provide impressive results, in particular high-resolution velocities from low-frequency data, but it is very sensitive to its starting point. Migration velocity analysis (MVA) uses a migration to estimate a reflectivity from the data, then applies a criterion on that reflectivity to find the best velocity model. This method is not as non-linear as FWI: it is less sensitive to the starting model, but has less potential to recover a very detailed velocity from low-frequency data. In recent years, FWI has made a lot of progress in solving its inherent problems, while MVA techniques have lagged behind. The main reason is that the gradient of the MVA cost functions exhibits artefacts that perturb the convergence.
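The misfit minimization at the heart of FWI can be illustrated on a toy linearized problem. The sketch below is an illustrative assumption rather than the authors' algorithm: a random matrix G stands in for a real wave-equation modeling operator, and a least-squares data misfit is minimized by steepest descent.

```python
import numpy as np

# Hypothetical linearized forward operator: a random matrix G stands in for
# the wave-equation modeling operator mapping a model m to data d.
rng = np.random.default_rng(0)
G = rng.normal(size=(40, 10))
m_true = rng.normal(size=10)
d_obs = G @ m_true                      # noise-free "recorded" data

def misfit(m):
    """Least-squares data misfit J(m) = 0.5 * ||G m - d_obs||^2."""
    r = G @ m - d_obs
    return 0.5 * float(r @ r)

def gradient(m):
    """Gradient of J with respect to the model parameters: G^T (G m - d_obs)."""
    return G.T @ (G @ m - d_obs)

# Steepest descent from a zero starting model; the step 1/||G||_2^2 is the
# inverse Lipschitz constant of the gradient, which guarantees monotone descent.
m = np.zeros(10)
step = 1.0 / np.linalg.norm(G, 2) ** 2
history = [misfit(m)]
for _ in range(500):
    m = m - step * gradient(m)
    history.append(misfit(m))
```

On a real problem the gradient is computed with the adjoint-state method rather than an explicit matrix, and the strong non-linearity in the model makes the choice of starting point critical, as the abstract notes.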

A 4D Seismic Processing Case Study in a Difficult Shallow Offshore Complex Carbonate Field

Time-lapse seismic processing in carbonate fields with complex geology and difficult seismic contexts requires highly specialized teams for success. Our field case has a flat structure and a poorly imaged but highly reflective sea bottom, and is covered by towed-streamer data in about 60 m of water depth. Multiple contamination is severe, and the multiples are coherent with primaries. We tested several 4D seismic processing routes using a base and two monitor surveys and have summed up our experience in four learning points. First, a simplified, targeted demultiple flow avoiding adaptive methods improves 4D metrics more than a complicated one. Second, a guided co-denoise technique using base and monitor vintages attenuates non-repeated noise while preserving 4D timeshifts and 4D amplitude changes. Third, optimizing the mute design prior to stack attenuates residual multiples that degrade the 4D signal. Finally, seismic acquisition parameters have a strong impact on computed 4D seismic attributes, even if this may not be the case in 3D. These learning points, coupled with multidisciplinary interactions and an iterative processing QC strategy, ensure the delivery of data with a more interpretable 4D signal that permits the delineation of depleted zones, flushed zones and by-passed oil for future infill-well drilling and optimal reservoir management.

Integration of Broadband Seismic Data into Reservoir Characterization Workflows: A Case Study from the Campos Basin, Brazil

In this work we propose to revisit some of the main steps of a seismic reservoir characterization workflow, using an MCNV Campos Basin broadband seismic dataset. The objective is to illustrate the differences relative to conventional seismic data, identify potential pitfalls and suggest best practices in the use of broadband data for reservoir characterization. The resulting study showcases the benefits that broadband data can bring to reservoir uncertainty management, in this case at the exploration stage. (excerpt from the introduction)

Least-squares Q-Kirchhoff migration: implementation and application

The absorption caused by the anelastic nature of the earth leads to attenuation of amplitudes and distortion of phases for seismic waves. The so-called Q factor has to be compensated for correct imaging. We propose a least-squares Q-Kirchhoff migration (LSQPSDM) in which absorption is incorporated into the Kirchhoff modeling operator and Q compensation is achieved naturally via inversion with appropriate sparse constraints. With better illumination and Q compensation, fault imaging is naturally enhanced through the proposed least-squares Q-Kirchhoff migration. The proposed LSQPSDM approach has been applied to a synthetic dataset and a field dataset from NWS Australia. Better fault imaging and SNR are obtained compared to conventional Q migration.
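The "inversion with sparse constraints" step can be sketched generically. The snippet below is a minimal sketch, not the authors' implementation: a random matrix stands in for the Q-Kirchhoff modeling operator, and a sparsity-constrained least-squares problem is solved with the iterative soft-thresholding algorithm (ISTA).

```python
import numpy as np

def soft_threshold(x, t):
    """Soft-thresholding, the proximal operator of the L1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, d, lam, n_iter=1000):
    """Minimize 0.5*||A r - d||^2 + lam*||r||_1 over the reflectivity r
    with the iterative soft-thresholding algorithm (ISTA)."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # inverse Lipschitz constant
    r = np.zeros(A.shape[1])
    for _ in range(n_iter):
        r = soft_threshold(r - step * (A.T @ (A @ r - d)), lam * step)
    return r

# Toy example: recover a 3-spike reflectivity from noisy linear data; the
# random matrix A is a hypothetical stand-in for a Q-Kirchhoff modeling
# operator that, in the real case, also encodes absorption.
rng = np.random.default_rng(1)
A = rng.normal(size=(60, 100))
r_true = np.zeros(100)
r_true[[10, 40, 75]] = [1.0, -0.8, 0.5]
d = A @ r_true + 0.01 * rng.normal(size=60)
r_est = ista(A, d, lam=0.5)
```

The sparsity penalty is what allows the inversion to boost amplitudes lost to absorption without amplifying noise, which is the sense in which Q compensation is "achieved naturally via inversion".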

Gippsland Basin, Australia: New data provides compelling insights into unexplored areas

CGG has undertaken a multi-phase, multi-year data enhancement and acquisition project, commencing with a major basin-scale reprocessing initiative (ReGeneration) and culminating in a new 3D acquisition and imaging project completed in 2021. The new survey has expanded data coverage from the inboard shallow-water areas, through the Central Deep, and into the previously unexplored deepwater areas. Preliminary interpretation of the final data, processed with CGG’s latest proprietary imaging technology, has yielded a number of key insights which further enhance understanding of the paleo-depositional environments and prospectivity of the deepwater areas. The new data is already unlocking previously unseen depositional elements, with strong implications for petroleum system understanding.

Angolan Kwanza Basin - Expanding Proven Opportunities

Seismic imaging in the Kwanza Basin has historically proven challenging owing to its complex geology and the presence of deep pre-salt targets. CGG has recently re-imaged its Kwanza Basin multi-client data portfolio to benefit from new insights made possible by advanced proprietary imaging techniques already proven in other pre-salt basins. This newly reprocessed data will enable interpreters to produce meaningful interpretations of a largely under-explored basin.

New Insights into Wellbore Stability Analysis with Integration of Petrophysics, Rock Physics, Geology, Geomechanics and Drilling

CCED is the operator of onshore blocks in the Sultanate of Oman. The blocks are located on the eastern flank of the Oman Salt Basin. Within the area of interest, the Barik, Al Bashair, Buah and Khufai Formations form the main oil and gas reservoirs. While trying to reach these targets, CCED faced significant issues when drilling wells through the sandstone-rich claystones of the Barakat and Mabrouk Formations. 1D pore pressure and geomechanical models were built for ten wells chosen for their representativeness of the field, data availability, spatial coverage and the issues faced during drilling. The main goal of the study was to understand the geomechanical behavior of the different formations and identify measures to optimize future drilling decisions, especially when drilling deviated wells through the Mabrouk Formation to land horizontal laterals in the Barik Formation. The study concluded that most of the hold-up and stuck-pipe issues were due to wellbore breakouts (WBOs) caused by stress and strength anisotropy arising from weak bedding planes. Heavier muds were recommended for subsequent drilling operations, during which no major borehole issues were detected. Strength anisotropy caused by weak bedding planes (WBP) was detrimental only when well deviation exceeded 30 degrees.
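The link between mud weight and breakout risk can be illustrated with the classical Kirsch solution for a vertical borehole. The sketch below is a textbook simplification with hypothetical stress values (no pore-pressure, thermal or anisotropy terms), not the study's calibrated model:

```python
import math

def hoop_stress(sig_H, sig_h, p_mud, theta_deg):
    """Kirsch hoop stress (MPa) at the wall of a vertical borehole.

    theta_deg is measured from the direction of the maximum horizontal
    stress sig_H; the compressive peak occurs at theta = 90 degrees.
    """
    th = math.radians(theta_deg)
    return sig_H + sig_h - 2.0 * (sig_H - sig_h) * math.cos(2.0 * th) - p_mud

def min_mud_pressure(sig_H, sig_h, ucs):
    """Mud pressure needed so the peak hoop stress (3*sig_H - sig_h - p_mud)
    stays below the rock's unconfined compressive strength (UCS)."""
    return 3.0 * sig_H - sig_h - ucs

# Hypothetical stresses (MPa): raising the mud pressure lowers the hoop
# stress, which is why heavier muds suppress breakouts.
p_min = min_mud_pressure(sig_H=30.0, sig_h=20.0, ucs=55.0)
```

In the study's setting, anisotropic strength along weak bedding planes would replace the single isotropic UCS, making the safe mud-weight window dependent on well deviation.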

Machine Learning for Table Cell Classification

Tables are ubiquitous in the geoscience industry, appearing in numerous documents and spreadsheets. They contain a wealth of data in a structured format which can help us understand the subsurface. However, the number of tables created over the years is huge, and it would require an enormous manual effort for domain experts to read each table to understand what kind of data it contains. It would therefore be more efficient to develop an automated approach, but tables vary greatly in style and layout, which makes them difficult for a machine to understand. For this reason, a first step towards automatic extraction of data from tables and spreadsheets is the identification of the role each cell plays, a task called table cell classification. In this work, we explore machine learning techniques for performing this task.
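A minimal version of this task can be sketched with hand-crafted features and a nearest-centroid classifier, a deliberately simple stand-in for the machine learning techniques explored in the paper; the table contents below are hypothetical:

```python
import numpy as np

def cell_features(text, row, col):
    """Hand-crafted features for one cell: position and content type."""
    is_numeric = text.replace(".", "", 1).replace("-", "", 1).isdigit()
    return np.array([
        float(row == 0),          # first row often holds headers
        float(col == 0),          # first column often holds row labels
        float(is_numeric),        # data cells tend to be numeric
        float(text.isupper()),    # headers are sometimes upper-case
    ])

# Tiny hypothetical labelled table: 0 = header cell, 1 = data cell.
table = [["WELL", "DEPTH", "POROSITY"],
         ["A-1", "1520.5", "0.21"],
         ["A-2", "1611.0", "0.18"]]
X, y = [], []
for r, row in enumerate(table):
    for c, text in enumerate(row):
        X.append(cell_features(text, r, c))
        y.append(0 if r == 0 else 1)
X, y = np.array(X), np.array(y)

# Nearest-centroid classifier: one mean feature vector per class.
centroids = {k: X[y == k].mean(axis=0) for k in (0, 1)}

def classify(text, row, col):
    """Assign a cell to the class with the nearest feature centroid."""
    f = cell_features(text, row, col)
    return min(centroids, key=lambda k: np.linalg.norm(f - centroids[k]))
```

Real tables need far richer features (formatting, merged cells, neighbourhood context) and stronger models, which is precisely where the machine learning techniques of the paper come in.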
