Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
9 result(s) for "Azzolina, Nicholas A"
Progress of Gas Injection EOR Surveillance in the Bakken Unconventional Play—Technical Review and Machine Learning Study
2024
Although considerable laboratory and modeling work has been performed to investigate enhanced oil recovery (EOR) mechanisms and potential in unconventional reservoirs, only limited research has been reported on actual EOR implementations and their surveillance in the field. Eleven EOR pilot tests using CO2, rich gas, surfactant, water, and other agents have been conducted in the Bakken unconventional play since 2008. Gas injection was involved in eight of these pilots through huff ‘n’ puff, flooding, and injectivity operations. Surveillance data, including daily production/injection rates, bottomhole injection pressure, gas composition, well logs, and tracer testing, were collected from these tests to generate time-series plots and analytics that can inform operators of downhole conditions. A technical review showed that pressure buildup, conformance issues, and timely gas breakthrough detection were among the main challenges because of interconnected fractures between injection and offset wells. The latest operation, co-injecting gas, water, and surfactant through the same injection well, showed that these challenges can be mitigated by careful EOR design and continuous reservoir monitoring. Reservoir simulation and machine learning were then applied so that operators can rapidly predict EOR performance and take control actions to improve EOR outcomes in unconventional reservoirs.
Journal Article
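The surveillance analytics described in this abstract reduce, at their simplest, to threshold tests on daily time-series data. A minimal sketch of a breakthrough alarm as a rolling z-score on a synthetic gas-oil-ratio series (the window, threshold, and all numbers are illustrative assumptions, not the operators' actual surveillance method):

```python
from statistics import mean, pstdev
import random

def flag_breakthrough(gor, window=30, z_thresh=3.0):
    """Flag days where the gas-oil ratio exceeds the trailing-window
    mean by more than z_thresh standard deviations (a crude
    breakthrough alarm for illustration only)."""
    flags = [False] * len(gor)
    for t in range(window, len(gor)):
        hist = gor[t - window:t]
        mu, sigma = mean(hist), pstdev(hist)
        if sigma > 0 and (gor[t] - mu) / sigma > z_thresh:
            flags[t] = True
    return flags

# Synthetic series: stable GOR for 60 days, then a step jump at day 60
random.seed(0)
gor = [random.gauss(1.0, 0.02) for _ in range(60)]
gor += [random.gauss(2.0, 0.02) for _ in range(30)]
flags = flag_breakthrough(gor)
```

A real surveillance system would also fold in injection pressure and gas composition, but the alarm logic on each channel follows this same shape.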
Applying Reservoir Simulation and Artificial Intelligence Algorithms to Optimize Fracture Characterization and CO2 Enhanced Oil Recovery in Unconventional Reservoirs: A Case Study in the Wolfcamp Formation
by Jin, Lu; Azzolina, Nicholas A.; Wan, Xincheng
in Accuracy; Algorithms; Artificial intelligence
2022
Reservoir simulation for unconventional reservoirs requires proper history matching (HM) to quantify the uncertainties of fracture properties and proper modeling methods to address complex fracture geometry. An integrated method, namely embedded discrete fracture model–artificial intelligence–automatic HM (EDFM–AI–AHM), was used to automatically generate HM solutions for a multistage hydraulic fracturing well in the Wolfcamp Formation. Thirteen scenarios with different combinations of matrix and fracture parameters as variables or fixed inputs were designed to generate 1300 reservoir simulations via EDFM–AI–AHM, from which 358 HM solutions were retained to reproduce production history and quantify the uncertainties of matrix and hydraulic fracture properties. The best HM solution was used for production forecasting and carbon dioxide (CO2)-enhanced oil recovery (EOR) strategy optimization. The results of the production forecast for primary recovery indicated that the drainage area for oil production was difficult to extend further into the low-permeability reservoir matrix. However, CO2 EOR simulations showed that increasing the gas injection rate during the injection cycle promoted incremental oil production from the reservoir matrix, regardless of minimum miscibility pressure. A gas injection rate of 25 million standard cubic feet per day (MMscfd) resulted in a 14% incremental oil production improvement compared to the baseline scenario with no EOR. This paper demonstrates the utility of coupling reservoir simulation with artificial intelligence algorithms to generate ensembles of simulation cases that provide insights into the relationships between fracture network properties and production.
Journal Article
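Retaining 358 history-matched solutions out of 1300 runs amounts to screening an ensemble of simulations by misfit against production history. A hedged sketch of that screening step with a toy "history" and a made-up tolerance (the paper's EDFM–AI–AHM workflow itself is far richer than this filter):

```python
from math import sqrt
from statistics import pstdev

def retain_hm_solutions(ensemble_preds, observed, tol=0.1):
    """Keep the indices of ensemble members whose RMS misfit against
    the observed history, normalized by the history's standard
    deviation, falls below `tol`."""
    scale = pstdev(observed) or 1.0
    kept = []
    for i, pred in enumerate(ensemble_preds):
        rms = sqrt(sum((p - o) ** 2 for p, o in zip(pred, observed))
                   / len(observed))
        if rms / scale < tol:
            kept.append(i)
    return kept

# Toy example: three candidate runs against a short decline "history"
history = [100, 90, 80, 72]
runs = [[101, 89, 81, 71], [150, 150, 150, 150], [99, 91, 79, 73]]
kept = retain_hm_solutions(runs, history, tol=0.2)  # runs 0 and 2 pass
```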
Trends in Burdens of Disease by Transmission Source (USA, 2005–2020) and Hazard Identification for Foods: Focus on Milkborne Disease
by Stephenson, Michele M; Azzolina, Nicholas A; Coleman, Margaret E
in Bacterial diseases; Collaboration; Datasets
2024
Background: Robust solutions to global, national, and regional burdens of communicable and non-communicable diseases, particularly related to diet, demand interdisciplinary or transdisciplinary collaborations to effectively inform risk analysis and policy decisions. Objective: U.S. outbreak data for 2005–2020 from all transmission sources were analyzed for trends in the burden of infectious disease and foodborne outbreaks. Methods: Outbreak data from 58 Microsoft Access® data tables were structured using systematic queries and pivot tables for analysis by transmission source, pathogen, and date. Trends were examined using graphical representations, smoothing splines, Spearman’s rho rank correlations, and non-parametric testing for trend. Hazard identification was conducted based on the number and severity of illnesses. Results: The evidence does not support increasing trends in the burden of infectious foodborne disease, though strongly increasing trends were observed for other transmission sources. Morbidity and mortality were dominated by person-to-person transmission; foodborne and other transmission sources accounted for small portions of the disease burden. Foods representing the greatest hazards associated with the four major foodborne bacterial diseases were identified. Fatal foodborne disease was dominated by fruits, vegetables, peanut butter, and pasteurized dairy. Conclusion: The available evidence conflicts with assumptions of zero risk for pasteurized milk and increasing trends in the burden of illness for raw milk. For future evidence-based risk management, transdisciplinary risk analysis methodologies are essential to balance both communicable and non-communicable diseases and both food safety and food security, considering scientific, sustainable, economic, cultural, social, and political factors to support health and wellness for humans and ecosystems.
Journal Article
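The trend testing named in this abstract uses Spearman’s rho rank correlation. A self-contained sketch of the no-ties formula, rho = 1 − 6·Σd² / (n(n² − 1)), applied to hypothetical outbreak counts (the counts are illustrative, not the study’s data):

```python
def spearman_rho(x, y):
    """Spearman rank correlation via the no-ties formula:
    rho = 1 - 6*sum(d^2) / (n*(n^2 - 1))."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical annual counts, 2005-2020: a strictly monotone increase
# gives rho = 1.0, a monotone decrease gives rho = -1.0
years = list(range(2005, 2021))
cases = [10, 12, 15, 16, 18, 21, 22, 25, 27, 30, 31, 33, 36, 38, 40, 44]
rho = spearman_rho(years, cases)
```

With ties present, a rank-averaging correction would be needed; for real analyses a statistics library handles that automatically.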
Effectiveness of subsurface pressure monitoring for brine leakage detection in an uncertain CO2 sequestration system
by Small, Mitchell J; Azzolina, Nicholas A; Nakles, David V
in Aquatic Pollution; Aquifers; carbon dioxide
2014
This work evaluates the detection sensitivity of deep subsurface pressure monitoring within an uncertain carbon dioxide sequestration system by linking the output of an analytical reduced-order model and first-order uncertainty analysis. A baseline (non-leaky) modeling run was compared against 10 different leakage scenarios, where the cap rock permeability was increased by factors of 2–100 (cap rock permeability from 10⁻³ to 10⁻¹ millidarcy). The uncertainty variance outputs were used to develop percentile estimates and detection sensitivity for pressure throughout the deep subsurface as a function of space (lateral distance from the injection wells and vertical orientation within the reservoir) and time (years since injection), or P(x, z, t). Conditional probabilities were computed for combinations of x, z, and t, which were then used to generate power curves for detecting leakage scenarios. The results suggest that measurements of the absolute change in pressure within the target injection aquifer would not be able to distinguish small leakage rates (i.e., less than 50 × baseline) from baseline conditions, and that only large leakage rates (i.e., >100 × baseline) would be discriminated with sufficient statistical power (>99%). Combining measurements, for example by taking the ratio of formation pressure in Aquifer 2/Aquifer 1, provides better statistical power for distinguishing smaller leakage rates at earlier times in the injection program. Detection sensitivity for pressure is a function of space and time. Therefore, design of an adequate monitoring network for subsurface pressure should account for this space–time variability to ensure that the monitoring system performs to the necessary design criteria, e.g., specific false-negative and false-positive rates.
Journal Article
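The power curves described here reduce, in the simplest view, to a one-sided hypothesis test on pressure: set an alarm threshold from the baseline distribution at a chosen false-positive rate, then ask how often a leaky system exceeds it. A minimal sketch under an assumed Gaussian approximation (the paper uses a reduced-order reservoir model; the means, sigmas, and alpha below are hypothetical):

```python
from statistics import NormalDist

def detection_power(baseline, leak, alpha=0.01):
    """One-sided pressure test: the alarm threshold is the (1 - alpha)
    quantile of the baseline distribution (false-positive rate alpha);
    power is the probability a leaky system exceeds that threshold."""
    threshold = baseline.inv_cdf(1 - alpha)
    return 1 - leak.cdf(threshold)

# Hypothetical standardized pressure-change distributions: a large leak
# shifts the mean well past baseline variability, a small leak barely
baseline = NormalDist(mu=0.0, sigma=1.0)
small_leak = NormalDist(mu=0.5, sigma=1.0)
large_leak = NormalDist(mu=5.0, sigma=1.0)
p_small = detection_power(baseline, small_leak)   # weak power
p_large = detection_power(baseline, large_leak)   # near-certain detection
```

Sweeping the leak mean over a range of leakage rates and plotting power against rate would trace out a power curve of the kind the abstract describes.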
Applying Reservoir Simulation and Artificial Intelligence Algorithms to Optimize Fracture Characterization and CO2 Enhanced Oil Recovery in Unconventional Reservoirs: A Case Study in the Wolfcamp Formation
2022
Reservoir simulation for unconventional reservoirs requires proper history matching (HM) to quantify the uncertainties of fracture properties and proper modeling methods to address complex fracture geometry. An integrated method, namely embedded discrete fracture model–artificial intelligence–automatic HM (EDFM–AI–AHM), was used to automatically generate HM solutions for a multistage hydraulic fracturing well in the Wolfcamp Formation. Thirteen scenarios with different combinations of matrix and fracture parameters as variables or fixed inputs were designed to generate 1300 reservoir simulations via EDFM–AI–AHM, from which 358 HM solutions were retained to reproduce production history and quantify the uncertainties of matrix and hydraulic fracture properties. The best HM solution was used for production forecasting and carbon dioxide (CO2)-enhanced oil recovery (EOR) strategy optimization. The results of the production forecast for primary recovery indicated that the drainage area for oil production was difficult to extend further into the low-permeability reservoir matrix. However, CO2 EOR simulations showed that increasing the gas injection rate during the injection cycle promoted incremental oil production from the reservoir matrix, regardless of minimum miscibility pressure. A gas injection rate of 25 million standard cubic feet per day (MMscfd) resulted in a 14% incremental oil production improvement compared to the baseline scenario with no EOR. This paper demonstrates the utility of coupling reservoir simulation with artificial intelligence algorithms to generate ensembles of simulation cases that provide insights into the relationships between fracture network properties and production.
Journal Article
Can the HGM Classification of Small, Non-Peat Forming Wetlands Distinguish Wetlands from Surface Water Geochemistry?
by Azzolina, Nicholas A.; Siegel, Donald I.; Samson, Scott D.
in Calcium ions; Catskill/Delaware watershed; Chemical composition
2007
We report the results of a detailed 12-month study of 23 freshwater wetlands and one larger synoptic characterization of 55 freshwater wetlands to test whether a hydrogeomorphic (HGM) classification of the wetlands into lotic (attached to streams) and terrene (groundwater-fed) classes meaningfully discriminated wetland surface water chemical composition in the mountainous Catskill-Delaware watersheds of southeastern New York State. Most of these hillslope wetlands are underlain by thin, largely siliceous mineral soils and have minimal peat cover. Nonparametric one-way ANOVA (Kruskal-Wallis) tests based on measurements of SC, Ca2+, Mg2+, Na+, DOC, TDN, TDS, Si, SO42−, pH, DO, K+, Cl−, NH4+, NO3−, TDP, and HCO3− failed to reject the null hypothesis that the surface water chemistry of lotic and terrene wetlands was identical. The only significantly different chemical species in surface waters from the two HGM landscape classifications were SC, Na+, and Cl−, a difference clearly related to the proximity of individual wetlands to road salt additions. Isotopic analyses of 2H and 18O for 30 synoptic wetland surface waters also failed to demonstrate significant differences for any of the HGM wetland classes. Based on these results, we caution that landscape position, landform, water flow path, and water body type may not be reliable criteria for HGM wetland classification in all locations. Underlying geology should be considered before assuming that water chemistry will differ by landscape position, and wetland functions dependent on water chemistry should be evaluated accordingly.
Journal Article
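The Kruskal-Wallis test used in this study compares rank sums across groups: H = 12/(N(N+1)) · Σ(R_j²/n_j) − 3(N+1), referred to a chi-square distribution. A stdlib sketch of the H statistic without the tie correction, on hypothetical Na+ concentrations for the two wetland classes (the values are illustrative, not the study's measurements):

```python
def kruskal_h(*groups):
    """Kruskal-Wallis H statistic (no tie correction):
    H = 12/(N(N+1)) * sum(R_j^2 / n_j) - 3(N+1),
    where R_j is the rank sum of group j over the pooled sample."""
    pooled = [(v, g) for g, grp in enumerate(groups) for v in grp]
    pooled.sort()
    rank_sums = [0.0] * len(groups)
    for rank, (_, g) in enumerate(pooled, start=1):
        rank_sums[g] += rank
    n = len(pooled)
    return 12.0 / (n * (n + 1)) * sum(
        r ** 2 / len(grp) for r, grp in zip(rank_sums, groups)
    ) - 3 * (n + 1)

# Hypothetical Na+ concentrations (mg/L): lotic wetlands near salted
# roads vs. terrene wetlands away from them
lotic = [8.1, 9.4, 7.7, 10.2]
terrene = [3.2, 2.9, 4.1, 3.6]
h = kruskal_h(lotic, terrene)  # 16/3 ~ 5.33, above the 3.84 cutoff
```

With one degree of freedom, H above the chi-square critical value 3.84 rejects identical distributions at the 0.05 level, which is the pattern the study found only for SC, Na+, and Cl−.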
Effectiveness of subsurface pressure monitoring for brine leakage detection in an uncertain CO2 sequestration system
2014
This work evaluates the detection sensitivity of deep subsurface pressure monitoring within an uncertain carbon dioxide sequestration system by linking the output of an analytical reduced-order model and first-order uncertainty analysis. A baseline (non-leaky) modeling run was compared against 10 different leakage scenarios, where the cap rock permeability was increased by factors of 2–100 (cap rock permeability from 10⁻³ to 10⁻¹ millidarcy). The uncertainty variance outputs were used to develop percentile estimates and detection sensitivity for pressure throughout the deep subsurface as a function of space (lateral distance from the injection wells and vertical orientation within the reservoir) and time (years since injection), or P(x, z, t). Conditional probabilities were computed for combinations of x, z, and t, which were then used to generate power curves for detecting leakage scenarios. The results suggest that measurements of the absolute change in pressure within the target injection aquifer would not be able to distinguish small leakage rates (i.e., less than 50 × baseline) from baseline conditions, and that only large leakage rates (i.e., >100 × baseline) would be discriminated with sufficient statistical power (>99%). Combining measurements, for example by taking the ratio of formation pressure in Aquifer 2/Aquifer 1, provides better statistical power for distinguishing smaller leakage rates at earlier times in the injection program. Detection sensitivity for pressure is a function of space and time. Therefore, design of an adequate monitoring network for subsurface pressure should account for this space–time variability to ensure that the monitoring system performs to the necessary design criteria, e.g., specific false-negative and false-positive rates.
Journal Article
Deep Learning-Accelerated 3D Carbon Storage Reservoir Pressure Forecasting Based on Data Assimilation Using Surface Displacement from InSAR
by Sherman, Christopher S; Jiang, Su; Tang, Hewei
in Carbon sequestration; Data assimilation; Deep learning
2022
Fast forecasting of reservoir pressure distribution in geologic carbon storage (GCS) by assimilating monitoring data is a challenging problem. Due to high drilling cost, GCS projects usually have spatially sparse measurements from wells, leading to high uncertainties in reservoir pressure prediction. To address this challenge, we propose to use low-cost Interferometric Synthetic-Aperture Radar (InSAR) data as monitoring data to infer reservoir pressure buildup. We develop a deep learning-accelerated workflow to assimilate surface displacement maps interpreted from InSAR and to forecast dynamic reservoir pressure. Employing an Ensemble Smoother Multiple Data Assimilation (ES-MDA) framework, the workflow updates three-dimensional (3D) geologic properties and predicts reservoir pressure with quantified uncertainties. We use a synthetic commercial-scale GCS model with bimodally distributed permeability and porosity to demonstrate the efficacy of the workflow. A two-step CNN-PCA approach is employed to parameterize the bimodal fields. The computational efficiency of the workflow is boosted by two residual U-Net-based surrogate models for surface displacement and reservoir pressure predictions, respectively. The workflow can complete data assimilation and reservoir pressure forecasting in half an hour on a personal computer.
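The ES-MDA update at the core of this workflow can be illustrated in the scalar case: each member is shifted by a Kalman-like gain times its mismatch against perturbed observations, repeated over several inflated steps. A toy sketch with a linear forward model standing in for the reservoir simulator (all numbers are hypothetical; the real workflow updates 3D property fields through U-Net surrogates):

```python
import random
from statistics import mean, pvariance

def esmda_update(members, forward, d_obs, obs_sd, alpha):
    """One ES-MDA iteration for a scalar parameter and scalar datum:
    m_i <- m_i + C_md/(C_dd + alpha*sd^2) * (d_obs + e_i - g(m_i)),
    with perturbation e_i ~ N(0, alpha*sd^2)."""
    preds = [forward(m) for m in members]
    m_bar, d_bar = mean(members), mean(preds)
    c_md = mean((m - m_bar) * (d - d_bar) for m, d in zip(members, preds))
    c_dd = mean((d - d_bar) ** 2 for d in preds)
    gain = c_md / (c_dd + alpha * obs_sd ** 2)
    return [m + gain * (d_obs + random.gauss(0, alpha ** 0.5 * obs_sd) - d)
            for m, d in zip(members, preds)]

def forward(k):
    """Stand-in 'simulator': pressure response is 2*k."""
    return 2.0 * k

# Toy inversion: true parameter k = 3, so the observed datum is 6
random.seed(1)
ensemble = [random.gauss(1.0, 0.5) for _ in range(200)]
for _ in range(4):  # 4 assimilation steps; alpha = 4 so sum(1/alpha) = 1
    ensemble = esmda_update(ensemble, forward, d_obs=6.0, obs_sd=0.1,
                            alpha=4.0)
```

With the inflation coefficients summing (as reciprocals) to one, the ensemble mean converges near the linear-Gaussian posterior, and the ensemble spread contracts to the posterior uncertainty.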
A Deep Learning-Accelerated Data Assimilation and Forecasting Workflow for Commercial-Scale Geologic Carbon Storage
by Sherman, Christopher S; Burton-Kelly, Matthew; Zhang, Jize
in Artificial neural networks; Carbon dioxide; Carbon sequestration
2022
Fast assimilation of monitoring data to update forecasts of pressure buildup and carbon dioxide (CO2) plume migration under geologic uncertainties is a challenging problem in geologic carbon storage. The high computational cost of data assimilation with a high-dimensional parameter space impedes fast decision-making for commercial-scale reservoir management. We propose to leverage physical understandings of porous medium flow behavior with deep learning techniques to develop a fast history-matching and reservoir-response forecasting workflow. Applying an Ensemble Smoother Multiple Data Assimilation framework, the workflow updates geologic properties and predicts reservoir performance with quantified uncertainty from pressure history and CO2 plumes interpreted through seismic inversion. As the most computationally expensive component in such a workflow is reservoir simulation, we developed surrogate models to predict dynamic pressure and CO2 plume extents under multi-well injection. The surrogate models employ deep convolutional neural networks, specifically, a wide residual network and a residual U-Net. The workflow is validated against a flat three-dimensional reservoir model representative of a clastic shelf depositional environment. Intelligent treatments are applied to bridge between quantities in a true-3D reservoir model and those in a single-layer reservoir model. The workflow can complete history matching and reservoir forecasting with uncertainty quantification in less than one hour on a mainstream personal workstation.