Similar Articles
20 similar articles found
1.
This paper reports the results of an investigation on the use of a deterministic analysis scheme combined with the ensemble smoother with multiple data assimilation (ES-MDA) for the problem of assimilating a large number of correlated data points. This is the typical case when history-matching time-lapse seismic data in petroleum reservoir models. The motivation for the use of the deterministic analysis is twofold. First, it tends to result in a smaller underestimation of the ensemble variance after data assimilation. This is particularly important for problems with a large number of measurements. Second, the deterministic analysis avoids the factorization of a large covariance matrix required in the standard implementation of ES-MDA with the perturbed-observations scheme. The deterministic analysis is tested in a synthetic history-matching problem to assimilate production and seismic data.
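As context for the comparison above, the standard perturbed-observations ES-MDA update that the paper's deterministic analysis replaces can be sketched as follows. The function names and the toy usage are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def es_mda(ensemble, forward, d_obs, C_d, alphas):
    """Perturbed-observations ES-MDA: repeat the smoother update with
    inflated error covariances alpha_i * C_d, where sum(1/alpha_i) = 1."""
    assert np.isclose(sum(1.0 / a for a in alphas), 1.0)
    M = ensemble.copy()                                   # (n_param, n_ens)
    n_d, n_e = len(d_obs), M.shape[1]
    for a in alphas:
        D = np.column_stack([forward(m) for m in M.T])    # predicted data
        Mc = M - M.mean(axis=1, keepdims=True)            # ensemble anomalies
        Dc = D - D.mean(axis=1, keepdims=True)
        C_md = Mc @ Dc.T / (n_e - 1)                      # cross-covariance
        C_dd = Dc @ Dc.T / (n_e - 1)                      # data auto-covariance
        # perturb the observations with inflated noise; the deterministic
        # analysis studied in the paper replaces exactly this step
        noise = np.random.multivariate_normal(np.zeros(n_d), C_d, size=n_e).T
        obs = d_obs[:, None] + np.sqrt(a) * noise
        M = M + C_md @ np.linalg.inv(C_dd + a * C_d) @ (obs - D)
    return M
```

With many correlated measurements, the factorization of `C_dd + a * C_d` in this perturbed-observations form is exactly the cost the deterministic analysis avoids.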

2.
The ensemble Kalman filter has been successfully applied for data assimilation in very large models, including those in reservoir simulation and weather. Two problems become critical in a standard implementation of the ensemble Kalman filter, however, when the ensemble size is small. The first is that the ensemble approximation to cross-covariances of model and state variables to data can indicate the presence of correlations that are not real. These spurious correlations give rise to model or state variable updates in regions that should not be updated. The second problem is that the number of degrees of freedom in the ensemble is only as large as the size of the ensemble, so the assimilation of large amounts of precise, independent data is impossible. Localization of the Kalman gain is almost universal in the weather community, but applications of localization for the ensemble Kalman filter in porous media flow have been somewhat rare. It has been shown, however, that localization of updates to regions of non-zero sensitivity or regions of non-zero cross-covariance improves the performance of the EnKF when the ensemble size is small. Localization is necessary for assimilation of large amounts of independent data. The problem is to define appropriate localization functions for different types of data and different types of variables. We show that the knowledge of sensitivity alone is not sufficient for determination of the region of localization. The region depends also on the prior covariance for model variables and on the past history of data assimilation. Although the goal is to choose localization functions that are large enough to include the true region of non-zero cross-covariance, for EnKF applications, the choice of localization function needs to balance the harm done by spurious covariance resulting from small ensembles and the harm done by excluding real correlations. 
In this paper, we focus on distance-based localization and provide insights for choosing suitable localization functions for data assimilation in multiphase flow problems. In practice, we conclude that it is reasonable to choose localization functions based on well patterns, that the localization function should be larger than the regions of non-zero sensitivity, and that it should extend beyond a single well pattern.
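Distance-based localization of the kind discussed above is usually implemented by tapering the Kalman gain with a compactly supported correlation function. A common choice, shown here purely as an illustration (the paper does not prescribe this particular taper), is the fifth-order Gaspari–Cohn polynomial:

```python
import numpy as np

def gaspari_cohn(r):
    """Fifth-order piecewise-rational taper of Gaspari & Cohn (1999).
    r is distance divided by the localization half-width; support is [0, 2)."""
    r = np.atleast_1d(np.abs(np.asarray(r, dtype=float)))
    f = np.zeros_like(r)
    a = r <= 1.0
    b = (r > 1.0) & (r < 2.0)
    f[a] = (((-0.25 * r[a] + 0.5) * r[a] + 0.625) * r[a] - 5.0 / 3.0) * r[a] ** 2 + 1.0
    f[b] = ((((r[b] / 12.0 - 0.5) * r[b] + 0.625) * r[b] + 5.0 / 3.0) * r[b] - 5.0) * r[b] \
           + 4.0 - 2.0 / (3.0 * r[b])
    return f
```

In an EnKF update the taper is applied elementwise (a Schur product), `K_loc = rho * K`, where `rho[i, j] = gaspari_cohn(dist(i, j) / L)` for state location `i`, observation `j`, and half-width `L`; choosing `L` is exactly the balancing act the abstract describes.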

3.
Application of groundwater levels in data assimilation of unsaturated flow
To understand the data worth of groundwater level observations in data assimilation of unsaturated flow, a one-dimensional saturated–unsaturated flow ensemble Kalman filter driven by transient groundwater level observations was built, and synthetic numerical experiments were used to examine the potential value of groundwater level observations for estimating unsaturated hydraulic parameters and correcting soil moisture. The results show that, when groundwater levels are the only observations, jointly updating parameters and heads corrects the head distribution of the soil profile better than updating heads alone; when a single hydraulic parameter in each of several layers is unknown, groundwater level observations provide effective information for parameter estimation; when multiple parameters in multiple layers are unknown, the complex relationship between groundwater levels and the parameters makes it difficult to recover optimal (unique) parameter values; and groundwater levels can serve as auxiliary information, used jointly with water content observations and other data to improve parameter estimation and the accuracy of water content predictions.
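The joint parameter-and-head update examined in this study is typically realized as an augmented-state EnKF analysis step, in which a water-table observation corrects the parameters through their ensemble cross-covariance with the predicted head. The sketch below is a generic illustration under that reading, with the names and the perturbed-observation variant chosen by us:

```python
import numpy as np

def enkf_update(states, params, H, y_obs, obs_var, rng):
    """One EnKF analysis with the augmented state [states; params]: an
    observation of the head field updates the parameters through their
    ensemble cross-covariance with the predicted observation."""
    Z = np.vstack([states, params])                 # (n_state + n_param, n_ens)
    n_e = Z.shape[1]
    Hz = H @ states                                 # predicted observations
    A = Z - Z.mean(axis=1, keepdims=True)
    HA = Hz - Hz.mean(axis=1, keepdims=True)
    P_zy = A @ HA.T / (n_e - 1)
    P_yy = HA @ HA.T / (n_e - 1) + obs_var * np.eye(Hz.shape[0])
    K = P_zy @ np.linalg.inv(P_yy)                  # Kalman gain
    y_pert = y_obs[:, None] + rng.normal(0.0, np.sqrt(obs_var), (len(y_obs), n_e))
    Z = Z + K @ (y_pert - Hz)                       # joint update
    return Z[: states.shape[0]], Z[states.shape[0]:]
```

Updating only the heads corresponds to dropping the `params` rows from `Z`; the study's finding is that keeping them in the augmented state yields better head profiles.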

4.
Parameter calibration is one of the most problematic phases of numerical modeling, since the choice of parameters affects the model's reliability as far as the physical problems being studied are concerned. In some cases, laboratory tests or physical models evaluating model parameters cannot be completed and other strategies must be adopted; numerical models reproducing debris flow propagation are one of these. Since scale problems affect the reproduction of real debris flows in the laboratory and the specific tests used to determine rheological parameters, calibration is usually carried out by comparing, in a subjective way, only a few quantities, such as the heights of soil deposits calculated for some sections of the debris flows or the distance traveled by the debris flows, against the values detected in situ after an event has occurred. Since no automatic or objective procedure has as yet been produced, this paper presents a numerical procedure based on the application of a statistical algorithm, which makes it possible to define, without ambiguity, the best parameter set. The procedure has been applied to a case study for which digital elevation models from both before and after an important event exist, meaning that a good database for applying the method was available. Its application has uncovered insights that help to better understand debris flows and related phenomena.

5.
Ensemble-based methods are becoming popular assisted history matching techniques, with a growing number of field applications. These methods use an ensemble of model realizations, typically constructed by means of geostatistics, to represent the prior uncertainty. The performance of the history matching is very dependent on the quality of the initial ensemble. However, there is a significant level of uncertainty in the parameters used to define the geostatistical model. From a Bayesian viewpoint, the uncertainty in the geostatistical modeling can be represented by a hyper-prior in a hierarchical formulation. This paper presents the first steps towards a general parametrization to address the problem of uncertainty in the prior modeling. The proposed parametrization is inspired by Gaussian mixtures, where the uncertainty in the prior mean and prior covariance is accounted for by defining weights for combining multiple Gaussian ensembles, which are estimated during the data assimilation. The parametrization was successfully tested on a simple reservoir problem in which the orientation of the major anisotropic direction of the permeability field was unknown.

6.
Published strength profiles predict strength discontinuities within and/or at the base of continental crust during compression. We use finite element models to investigate the effect of strength discontinuities on continental collision dynamics. The style of deformation in model crust during continued subduction of underlying mantle lithosphere is controlled by: (1) experimental flow-law data; (2) the crustal geotherm; (3) strain localization by erosion; (4) strain-softening and other localization effects. In the absence of erosion and other factors causing strain localization, numerical models with typical geothermal gradients and frictional/ductile rheologies predict diffuse crustal deformation with whole-scale detachment of crust from mantle lithosphere. This prediction is at odds with earlier model studies that only considered frictional crustal rheologies and showed asymmetric, focused crustal deformation. Without localization, model deformation is not consistent with that observed in small collisional orogens such as the Swiss Alps. This suggests that strain localization by a combination of erosion and rheological effects such as strain softening must play a major role in focusing deformation, and that strength profiles derived under constant strain rates and uniform material properties cannot be used to infer crustal strength during collision dynamics.

7.
8.
A 3-compartment model of phytoplankton growth dynamics has been coupled with a primitive-equation circulation model to better understand and quantify physical and biological processes in the Adriatic Sea. This paper presents the development and application of a data assimilation procedure based on optimal control theory. The aim of the procedure is to identify a set of model coefficient values that ensures the best fit between data and model results by minimizing a function that measures model–data discrepancies. In this sense, twin experiments have been successfully implemented in order to obtain better estimates of the biological model parameters and biological initial conditions.

9.
The objective of this study is to investigate the impact of a surface data assimilation (SDA) technique, together with the traditional four-dimensional data assimilation (FDDA), on the simulation of a monsoon depression that formed over India during the field phase of the 1999 Bay of Bengal Monsoon Experiment (BOBMEX). The SDA uses the analyzed surface data to continuously assimilate the surface layer temperature as well as the water vapor mixing ratio in the mesoscale model. The depression for the greater part of this study was offshore and since successful application of the SDA would require surface information, a method of estimating surface temperature and surface humidity using NOAA-TOVS satellites was used. Three sets of numerical experiments were performed using a coupled mesoscale model. The first set, called CONTROL, uses the NCEP (National Center for Environmental Prediction) reanalysis for the initial and lateral boundary conditions in the MM5 simulation. The second and the third sets implemented the SDA of temperature and moisture together with the traditional FDDA scheme available in the MM5 model. The second set of MM5 simulation implemented the SDA scheme only over the land areas, and the third set extended the SDA technique over land as well as sea. Both the second and third sets of the MM5 simulation used the NOAA-TOVS and QuikSCAT satellite and conventional upper air and surface meteorological data to provide an improved analysis. The results of the three sets of MM5 simulations are compared with one another and with the analysis and the BOBMEX 1999 buoy, ship, and radiosonde observations. The predicted sea level pressure of both the model runs with assimilation resembles the analysis closely and also captures the large-scale structure of the monsoon depression well. The central sea level pressures of the depression for both the model runs with assimilation were 2–4 hPa lower than the CONTROL. 
The results of both the model runs with assimilation indicate a larger spatial area as well as increased rainfall amounts over the coastal regions after landfall compared with the CONTROL. The impact of FDDA and SDA, the latter over land, resulted in reduced errors of the following: 1.45 K in temperature, 0.39 m s−1 in wind speed, and 14° in wind direction compared with the BOBMEX buoy observation, and 1.43 m s−1 in wind speed, 43° in wind direction, and 0.75% in relative humidity compared with the CONTROL. The impact of SDA over land and sea compared with SDA over land only showed a further marginal reduction of errors: 0.23 K in air temperature (BOBMEX buoy) and 1.33 m s−1 in wind speed simulations.

10.
11.
12.
To study the influence of soil texture and organic matter on soil moisture assimilation when observations are sparse, a soil moisture assimilation scheme based on the ensemble Kalman smoother (EnKS) was developed. Using observations from the A'rou freeze–thaw observation station in the upper Heihe River basin from 1 June to 29 October 2008, surface soil moisture observations were assimilated into the Simple Biosphere Model 2 (SiB2) with the EnKS algorithm; the effect of the different schemes on soil moisture estimation was analyzed and compared with results from the ensemble Kalman filter (EnKF) algorithm. The results show that soil texture and organic matter have the largest influence on simulated surface soil moisture and a relatively small influence at depth; assimilating surface soil moisture observations with either EnKF or EnKS significantly improves the accuracy of surface and root-zone soil moisture estimates, with EnKS slightly more accurate than EnKF and less affected by soil texture and organic matter; and when observations are sparse, EnKS still yields soil moisture estimates of comparatively high accuracy.

13.
This paper proposes a new ensemble-based algorithm that assimilates the vertical rain structure retrieved from microwave radiometer and radar measurements into a regional weather forecast model, employing a Bayesian framework. The goal of the study is to evaluate the capability of the proposed technique to improve track prediction of tropical cyclones that originate in the North Indian Ocean. For this purpose, the tropical cyclone Jal has been analyzed with the community mesoscale weather model, the weather research and forecasting (WRF) model. The ensembles of prognostic variables such as perturbation potential temperature (θ), perturbation geopotential (φ, m2/s2), zonal (U) and meridional (V) velocities, and water vapor mixing ratio (qv, kg/kg) are generated by the empirical orthogonal function technique. An overpass of the tropical rainfall measuring mission (TRMM) satellite occurred at 0730 UTC on 6 November over the system, and the observations from the radiometer and radar on board the satellite (1B11 data products) are inverted using a combined in-house radiometer–radar retrieval technique to estimate the vertical rain structure, namely the cloud liquid water, cloud ice, precipitation water, and precipitation ice. Each ensemble member is input as a possible set of initial conditions to the WRF model from 00 UTC, which was marched in time till 0730 UTC on 6 November. The above-mentioned hydrometeors are then estimated from the cloud water and rain water mixing ratios for all the ensemble members. The Bayesian filter framework is then used to determine the conditional probabilities of all the candidates in the ensemble by comparing the hydrometeors retrieved from the measured TRMM radiances with the model-simulated hydrometeors. Based on the posterior probability density function, the initial conditions at 00 UTC on 6 November are then corrected using a linear weighted average of the initial ensembles for all the prognostic variables.
With these weighted-average initial conditions, the WRF model has been run up to 06 UTC on 8 November and the predictions are then compared with observations and the control run. An ensemble-independence study was conducted, on the basis of which an optimum size of 25 ensemble members was arrived at. With the optimum ensemble size, the sensitivity of the prognostic variables was also analyzed. The track simulated with the corrected set of initial conditions, when compared with observations, gives better results than the control run. The algorithm can improve track prediction by up to 35 % for a 24 h forecast and up to 12 % for a 54 h forecast.
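The correction step described above — weighting ensemble members by the agreement of their simulated hydrometeors with the TRMM-retrieved ones, then forming a weighted-average initial condition — can be sketched in a particle-filter style. The Gaussian likelihood and all names below are our assumptions, not the paper's exact formulation:

```python
import numpy as np

def weighted_initial_condition(ensemble, h_model, h_obs, sigma):
    """Assign each ensemble member a Gaussian likelihood of its simulated
    hydrometeor profile against the retrieved one, normalize to posterior
    weights, and return the weighted-average initial condition."""
    ll = -0.5 * np.sum((h_model - h_obs[:, None]) ** 2, axis=0) / sigma**2
    w = np.exp(ll - ll.max())                 # stabilize before normalizing
    w /= w.sum()                              # posterior probabilities
    return ensemble @ w, w                    # weighted-average IC, weights
```

Here `ensemble` holds one flattened initial-condition vector per column and `h_model` the corresponding simulated hydrometeor profiles, so members whose hydrometeors match the retrieval dominate the corrected initial condition.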

14.
15.
In this paper, a stochastic collocation-based Kalman filter (SCKF) is developed to estimate the hydraulic conductivity from direct and indirect measurements. It combines the advantages of the ensemble Kalman filter (EnKF) for dynamic data assimilation and the polynomial chaos expansion (PCE) for efficient uncertainty quantification. In this approach, the random log hydraulic conductivity field is first parameterized by the Karhunen–Loève (KL) expansion and the hydraulic pressure is expressed by the PCE. The coefficients of the PCE are solved with a collocation technique: realizations are constructed by choosing collocation point sets in the random space. The stochastic collocation method is non-intrusive in that such realizations are solved forward in time via an existing deterministic solver, independently, as in the Monte Carlo method. The needed entries of the state covariance matrix are approximated with the coefficients of the PCE, which can be recovered from the collocation results. The system states are updated by updating the PCE coefficients. A 2D heterogeneous flow example is used to demonstrate the applicability of the SCKF with respect to different factors, such as the initial guess, variance, correlation length, and the number of observations. The results are compared with those from the EnKF method. It is shown that the SCKF is computationally more efficient than the EnKF under certain conditions. Each approach has its own advantages and limitations. The performance of the SCKF decreases with larger variance, smaller correlation ratio, and fewer observations. Hence, the choice between the two methods is problem dependent. As a non-intrusive method, the SCKF can be easily extended to multiphase flow problems.
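As an illustration of the parameterization step described above, a truncated KL expansion of a log-conductivity field can be built numerically from the eigendecomposition of a discretized covariance matrix. The exponential covariance model and all names here are illustrative assumptions (the paper may use analytic KL modes):

```python
import numpy as np

def kl_expansion(x, variance, corr_len, n_terms, xi):
    """Truncated KL expansion of a zero-mean Gaussian field with an
    exponential covariance, built from the eigendecomposition of the
    discretized covariance matrix; xi are the n_terms KL coefficients."""
    C = variance * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    vals, vecs = np.linalg.eigh(C)            # eigh returns ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_terms]    # keep the dominant modes
    return vecs[:, idx] @ (np.sqrt(np.maximum(vals[idx], 0.0)) * xi)
```

A log-conductivity realization is then, for example, `Y = kl_expansion(x, 1.0, 0.2, 20, rng.standard_normal(20))`; the point of the SCKF is that the filter updates the low-dimensional coefficients `xi` (and the PCE coefficients of the pressure) rather than the full field.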

16.
陈冲  张伟  邢庆辉  豆沂宣 《冰川冻土》2022,44(6):1912-1924
The groundwater system of the middle and lower reaches of the Heihe River basin is recharged by cryospheric meltwater and rainfall from the upper reaches, and cryosphere shrinkage driven by climate warming exposes the stability of this groundwater system to growing risk. Groundwater models are an effective tool for assessing the stability of groundwater systems, but their parameters often carry large uncertainty. This paper therefore proposes an uncertainty analysis method based on a data assimilation algorithm, which reduces model uncertainty by incorporating observational information. The proposed method was used to analyze the uncertainty of 13 parameters of a groundwater model (built with MODFLOW) of the middle reaches of the Heihe River basin; the influence of the algorithm's hyperparameters and their optimal values were discussed, and the uncertainty of the groundwater model parameters was analyzed. The experimental results show that the data assimilation algorithm effectively reduces the uncertainty of the groundwater model parameters, and that the type and number of observations play an important role in this reduction; different parameters have different uncertainties, with larger parameter uncertainty in regions of frequent surface water–groundwater interaction; and aquifer hydraulic conductivity, aquifer specific yield, and the irrigation return-flow coefficient significantly affect the simulated groundwater levels, while riverbed hydraulic conductance has a larger effect on the simulated streamflow. This study provides a more reliable modeling approach for groundwater research and important support for studying the stability and sustainability of the oasis ecosystems nourished by groundwater in the endorheic basins of northwest China.

17.
Extreme weather events such as cloudbursts and thunderstorms are a great threat to life and property, and it is a great challenge for forecasters to nowcast such hazardous extreme weather events. A mesoscale model (ARPS) with real-time assimilation of DWR data has been operationally implemented in the India Meteorological Department (IMD) for real-time nowcasting of weather over the Indian region. A three-dimensional variational (ARPS3DVAR) technique and a cloud analysis procedure are utilized for real-time data assimilation in the model. The assimilation is performed as a sequence of intermittent cycles, and the complete process (starting from reception, processing, and assimilation of DWR data, running of the ARPS model, and website updating) takes less than 20 minutes. Thus, a real-time nowcast for the next 3 h from the ARPS model is available within 20 minutes of the corresponding hour. The cloudburst event of September 15, 2011, and the thunderstorm event of October 22, 2010, are considered to demonstrate the capability of the ARPS model to nowcast extreme weather events in real time over the Indian region. Results show that in both cases, the ARPS3DVAR and cloud analysis techniques are able to extract hydrometeors from the radar data, which are transported to upper levels by the strong upward motion, resulting in the distribution of hydrometeors at various isobaric levels. The dynamic and thermodynamic structures of the cloudburst and thunderstorm are also well simulated. Thus, significant improvement in the initial condition is noticed. In the case of the cloudburst event, the model is able to capture the sudden collisions of two or more clouds during 09–10 UTC. Rainfall predicted by the model during the cloudburst event is over 100 mm, which is very close to the observed rainfall (117 mm). The model is able to predict the cloudburst with slight errors in time and space.
The real-time nowcast of the thunderstorm shows that the movement, horizontal extension, and north–south orientation of the thunderstorm are well captured during the first hour and deteriorate thereafter. The amount of rainfall predicted by the model during the thunderstorm closely matches the observations, with slight errors in the location of the rainfall area. The temporal and spatial information predicted by the ARPS model about the sudden collision/merger and break-up of convective cells, and the intensification, weakening, and maintenance of intensity of convective cells, has added value to a human forecast.

18.
A dynamical downscaling approach based on scale-selective data assimilation (SSDA) is applied to tropical cyclone (TC) track forecasts. The results from a case study of super typhoon Megi (2010) show that the SSDA approach is very effective in improving the TC track forecasts by fitting the large-scale wind field of the regional model to that of the global forecast system (GFS) forecasts while allowing the small-scale circulation to develop freely in the regional model. A comparison with conventional spectral-nudging four-dimensional data assimilation (FDDA) indicates that the SSDA approach outperforms the FDDA in TC track forecasts because the former allows the small-scale features in a regional model to develop more freely than the latter, owing to the different techniques used. In addition, a number of numerical experiments are performed to investigate the sensitivity of the SSDA's effect on TC track forecasts to some of its parameters, including the cutoff wave number, the vertical layers of the atmosphere being adjusted, and the interval of SSDA implementation. The results show that the improvements are sensitive to varying extents to these three parameters.

19.
The variational technique of data assimilation using adjoint equations has been illustrated using a nonlinear oceanographic shallow water model. The technique consists of minimizing a cost function representing the misfit between the model and the data subject to the model equations acting as constraints. The problem has been transformed into an unconstrained one by the use of Lagrange multipliers. Particular emphasis has been laid on finite difference formulation of the algorithm. Several numerical experiments have been conducted using simulated data obtained from a control run of the model. Implications of this technique for assimilating asynoptic satellite altimeter data into ocean models have been discussed.
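The minimization described above can be illustrated on a toy linear model: the cost is the model–data misfit, the model equations act as constraints, and the gradient with respect to the initial condition is obtained by one backward (adjoint) sweep per descent iteration. The sketch below uses plain gradient descent and invented names; the paper's shallow water model is nonlinear and far richer:

```python
import numpy as np

def var_assimilate(M, obs, u0, lr, n_iter):
    """Minimize J(u0) = 0.5 * sum_t ||u_t - d_t||^2 for the linear model
    u_{t+1} = M u_t by gradient descent; each gradient is accumulated by
    a single backward sweep with the adjoint operator M.T."""
    u0 = u0.copy()
    for _ in range(n_iter):
        u, resid = u0, []
        for d in obs:                      # forward sweep: store misfits
            resid.append(u - d)
            u = M @ u
        lam = np.zeros_like(u0)
        for r in reversed(resid):          # adjoint (backward) sweep
            lam = M.T @ lam + r            # lam accumulates (M^T)^t r_t
        u0 = u0 - lr * lam                 # lam is dJ/du0
    return u0
```

The backward recursion is the discrete Lagrange-multiplier (adjoint) equation: one forward and one backward model integration yield the full gradient, regardless of the number of observation times.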

20.
The Bayesian framework is the standard approach for data assimilation in reservoir modeling. This framework involves characterizing the posterior distribution of geological parameters in terms of a given prior distribution and data from the reservoir dynamics, together with a forward model connecting the space of geological parameters to the data space. Since the posterior distribution quantifies the uncertainty in the geologic parameters of the reservoir, the characterization of the posterior is fundamental for the optimal management of reservoirs. Unfortunately, due to the large-scale, highly nonlinear properties of standard reservoir models, characterizing the posterior is computationally prohibitive. Instead, more affordable ad hoc techniques, based on Gaussian approximations, are often used for characterizing the posterior distribution. Evaluating the performance of those Gaussian approximations is typically conducted by assessing their ability to reproduce the truth within the confidence interval provided by the ad hoc technique under consideration. This has the disadvantage of mixing up the approximation properties of the history matching algorithm employed with the information content of the particular observations used, making it hard to evaluate the effect of the ad hoc approximations alone. In this paper, we avoid this disadvantage by comparing the ad hoc techniques with a fully resolved, state-of-the-art probing of the Bayesian posterior distribution. The ad hoc techniques whose performance we assess are based on (1) linearization around the maximum a posteriori estimate, (2) randomized maximum likelihood, and (3) ensemble Kalman filter-type methods. In order to fully resolve the posterior distribution, we implement a state-of-the-art Markov chain Monte Carlo (MCMC) method that scales well with respect to the dimension of the parameter space, enabling us to study realistic forward models, in two space dimensions, at a high level of grid refinement.
Our implementation of the MCMC method provides the gold standard against which the aforementioned Gaussian approximations are assessed. We present synthetic numerical experiments in which we quantify the capability of each of the ad hoc Gaussian approximations to reproduce the mean and the variance of the posterior distribution (characterized via MCMC) associated with a data assimilation problem. Both single-phase and two-phase (oil–water) reservoir models are considered so that fundamental differences in the resulting forward operators are highlighted. The main objective of our controlled experiments is to exhibit the substantial discrepancies in the approximation properties of standard ad hoc Gaussian approximations. Numerical investigations of the type we present here will lead to a greater understanding of the cost-efficient, but ad hoc, Bayesian techniques used for data assimilation in petroleum reservoirs and hence, ultimately, to improved techniques with more accurate uncertainty quantification.
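For readers unfamiliar with the MCMC baseline, a minimal random-walk Metropolis sampler conveys the idea, although the paper uses a more sophisticated, dimension-robust variant; everything below is a generic sketch, not the authors' algorithm:

```python
import numpy as np

def random_walk_metropolis(log_post, m0, step, n_iter, rng):
    """Random-walk Metropolis chain targeting exp(log_post): accepted
    proposals replace the current state, rejections repeat it."""
    chain, lp = [m0], log_post(m0)
    for _ in range(n_iter):
        prop = chain[-1] + step * rng.standard_normal(m0.shape)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:    # Metropolis acceptance
            chain.append(prop)
            lp = lp_prop
        else:
            chain.append(chain[-1])
    return np.array(chain)
```

In the paper's setting, `log_post` would be the (expensive) misfit of the reservoir forward model plus the log-prior; the chain's empirical mean and variance are the quantities against which the Gaussian approximations are scored.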

