Similar Documents
20 similar documents found (search time: 15 ms)
1.
In geophysical inverse problems, the posterior model can be assessed analytically only in the case of linear forward operators; Gaussian, Gaussian-mixture or generalized-Gaussian prior models; continuous model properties; and Gaussian-distributed noise contaminating the observed data. For this reason, one of the major challenges of seismic inversion is to derive reliable uncertainty appraisals in cases of complex prior models, non-linear forward operators and mixed discrete–continuous model parameters. We present two amplitude-versus-angle inversion strategies for the joint estimation of elastic properties and litho-fluid facies from pre-stack seismic data in the case of non-parametric mixture prior distributions and non-linear forward models. The first strategy is a two-dimensional target-oriented inversion that inverts the amplitude-versus-angle responses of the target reflections by adopting the single-interface full Zoeppritz equations. The second is an interval-oriented approach that inverts the pre-stack seismic responses along a given time interval using a one-dimensional convolutional forward model still based on the Zoeppritz equations. In both approaches, the model vector includes the facies sequence and the elastic properties of P-wave velocity, S-wave velocity and density. The distribution of the elastic properties at each common-midpoint location (for the target-oriented approach) or at each time-sample position (for the interval-oriented approach) is assumed to be multimodal, with as many modes as the number of litho-fluid facies considered. In this context, an analytical expression of the posterior model is no longer available. For this reason, we adopt a Markov chain Monte Carlo algorithm to numerically evaluate the posterior uncertainties.
To speed up the convergence of the probabilistic sampling, we adopt a specific recipe that includes multiple chains, a parallel tempering strategy and a delayed-rejection updating scheme, and that hybridizes the standard Metropolis–Hastings algorithm with the more advanced differential evolution Markov chain method. Owing to the lack of available field seismic data, we validate the two implemented algorithms by inverting synthetic seismic data derived from realistic subsurface models and actual well-log data. The two approaches are also benchmarked against two analytical inversion approaches that assume Gaussian-mixture-distributed elastic parameters. The final predictions and the convergence analysis of the two implemented methods show that our approaches retrieve reliable estimates and accurate uncertainty quantifications with a reasonable computational effort.
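As a minimal illustration of the Metropolis–Hastings sampling underlying such approaches, the sketch below draws from a bimodal Gaussian-mixture target, mimicking a posterior with one mode per litho-fluid facies. The target density, proposal scale and chain length are illustrative assumptions, not the authors' settings, and the tempering/delayed-rejection machinery is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Bimodal Gaussian-mixture density with modes at -2 and +2
    # (illustrative stand-in for a multimodal facies posterior).
    return np.log(0.5 * np.exp(-0.5 * (x - 2.0) ** 2)
                  + 0.5 * np.exp(-0.5 * (x + 2.0) ** 2))

def metropolis_hastings(n_steps, x0=0.0, step=2.5):
    samples, x, accepted = [], x0, 0
    for _ in range(n_steps):
        proposal = x + step * rng.standard_normal()
        # Accept with probability min(1, target(proposal)/target(x)).
        if np.log(rng.random()) < log_target(proposal) - log_target(x):
            x, accepted = proposal, accepted + 1
        samples.append(x)
    return np.array(samples), accepted / n_steps

samples, acc_rate = metropolis_hastings(20000)
```

With a large enough proposal step the chain hops between both modes; in the paper's setting, parallel tempering and differential-evolution updates serve exactly this mode-jumping purpose in higher dimensions.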

2.
Summary: The paper outlines some new approaches to detecting refraction anomalies using available meteorological data that characterize the vertical state of the atmosphere in the neighbourhood of an observatory. Owing to the limited amount of data and the methods of processing, the resultant refraction anomalies refer to only a part of the atmosphere, approximately 10 km in height.

3.
ABSTRACT

A new physics-based rainfall–runoff method for the Soil and Water Assessment Tool (SWAT) was developed, which integrates a water balance (WB) approach with the variable source area concept (WB-VSA). This approach was compared with four other methods—soil-water-dependent curve number (CN-Soil), evaporation-dependent curve number (CN-ET), the Green–Ampt equation (G&A) and WB—in a monsoonal watershed in Eastern China. The regional sensitivity analysis shows that the volumetric efficiency coefficient (VE) for river discharge is sensitive to most of the parameters of all approaches. The results of model calibration against VE demonstrate that WB-VSA is the most accurate, owing to its reflection of the spatial variation of runoff generation as affected by topography and soil properties. The other methods can also mimic baseflow well, but G&A and CN-ET simulate floods much worse than the saturation-excess runoff approaches (WB-VSA, WB and CN-Soil). Meanwhile, CN-Soil, as an empirical method, fails to simulate groundwater levels, whereas WB-VSA captures them best.
Editor M.C. Acreman; Associate Editor S. Kanae
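For context on the curve-number family of methods compared above, a sketch of the standard SCS curve-number relation on which CN-based runoff options are built (the CN value and storm depth below are illustrative, not values from the study):

```python
def scs_runoff(p_mm, cn):
    """SCS curve-number direct runoff (mm) for a storm of p_mm.

    S is the potential maximum retention; the initial abstraction
    Ia is taken as the conventional 0.2 * S.
    """
    s = 25400.0 / cn - 254.0   # retention parameter (mm)
    ia = 0.2 * s               # initial abstraction (mm)
    if p_mm <= ia:
        return 0.0             # all rainfall abstracted, no runoff
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

q = scs_runoff(100.0, 80.0)    # ~50.5 mm of direct runoff
```

The empirical CN lumps soil moisture, land use and topography into one number; the WB-VSA method in the abstract replaces this lumping with an explicit water balance over variable source areas.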

4.
Following a general trend in palaeo-environmental research, a considerable and still growing number of luminescence dating studies have focused on sediment archives providing information on environmental conditions beyond the last glacial–interglacial cycle. This trend has caused a revival of IRSL-based dating approaches using feldspar minerals. As a consequence, the long-known but still poorly understood problem of anomalous fading, and the various correction methods, are gaining importance. In order to cope with the challenge of fading, several new measurement protocols aiming at reducing or completely avoiding fading have been proposed. However, these approaches are either still experimental (e.g., infrared radiofluorescence), have only been applied to a limited number of natural samples (e.g., infrared photoluminescence) or are the subject of ongoing scientific discussion (e.g., post-IR IRSL protocols). Anomalous fading therefore remains a severe problem for feldspar-based luminescence measurements, and fading correction will thus be of crucial importance for reliable age calculations. Some of the proposed correction methods require fully constructed dose-response curves (DRCs) to accurately constrain D0 and saturation values, which is indispensable for mathematically accurate corrections. Recording such DRCs requires the measurement of high-dose points for a large number of aliquots, corresponding to long measurement times that strain resources in routine dating applications. The concept of standardized growth curves (SGCs) might provide a promising solution to this problem. Here, we present results from a comprehensive study assessing the potential of SGC-based approaches for improving the applicability and performance of fading-correction procedures. In particular, our study focuses on the fading-correction model proposed by Kars et al. (2008), which is fundamentally based on the findings published by Huntley (2006).
The performance test comprises various natural samples representing a range of fading rates and covering different locations as well as sedimentary environments.
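The core idea of an SGC can be sketched as fitting one shared dose-response curve and inverting it for equivalent doses, so that individual aliquots no longer need full high-dose DRCs. The saturating-exponential form and the parameter values below are a common convention used here purely for illustration, not the study's fitted curve:

```python
import numpy as np

def sgc_signal(dose, a, d0):
    # Saturating-exponential standardized growth curve:
    # normalized luminescence L(D) = A * (1 - exp(-D / D0)).
    return a * (1.0 - np.exp(-dose / d0))

def invert_sgc(signal, a, d0):
    # Equivalent dose from a measured normalized signal; only
    # defined below saturation (signal < A).
    if signal >= a:
        raise ValueError("signal at or above saturation")
    return -d0 * np.log(1.0 - signal / a)

a, d0 = 1.0, 500.0                     # illustrative SGC parameters (Gy)
de = invert_sgc(sgc_signal(120.0, a, d0), a, d0)   # round-trips to 120 Gy
```

A well-constrained D0 in such a curve is exactly what the fading-correction model of Kars et al. (2008) needs, which is why the SGC approach is attractive for routine corrections.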

5.
This paper presents a review of methods for the stochastic representation of non-Gaussian random fields. One category of such methods works through transformation from Gaussian random fields; the other works through direct simulation. The paper also reflects on the simulation of non-Gaussian random fields, focusing on its primary application, uncertainty quantification, which is usually associated with a large number of simulations. Dimension reduction is critical in the representation of non-Gaussian random fields if uncertainty quantification is to be efficient. Besides introducing the methods for simulating non-Gaussian random fields, the paper stresses the components of a stochastic approach that are critical for efficient uncertainty quantification. Numerical examples of stochastic groundwater flow are presented to investigate the applicability and efficiency of the methods for simulating non-Gaussian random fields for the purpose of uncertainty quantification.
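The first category (transformation from Gaussian fields) can be sketched with the translation method: simulate a correlated Gaussian field and map it pointwise through a nonlinear transform. The lognormal target, exponential covariance and parameter values below are illustrative assumptions, chosen because lognormal conductivity fields are common in groundwater applications:

```python
import numpy as np

rng = np.random.default_rng(42)

def lognormal_field(n, corr_len, mean_ln, sigma_ln):
    """Simulate a 1-D lognormal random field by transforming a
    correlated Gaussian field (translation method)."""
    x = np.arange(n)
    # Exponential covariance of the underlying Gaussian field.
    cov = sigma_ln**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    chol = np.linalg.cholesky(cov + 1e-10 * np.eye(n))  # jitter for stability
    g = mean_ln + chol @ rng.standard_normal(n)         # correlated Gaussian
    return np.exp(g)                                    # pointwise transform

k = lognormal_field(200, corr_len=10.0, mean_ln=0.0, sigma_ln=0.5)
```

For uncertainty quantification, many such realizations would be generated and propagated through a flow model; the Cholesky factorization here scales as O(n^3), which is exactly why the dimension-reduction techniques stressed in the paper matter for large grids.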

6.
The dynamics of the microbial community and the biodegradation of polycyclic aromatic hydrocarbons (PAHs) in polluted marine sediments, artificially spiked with a mixture of PAHs (fluorene, phenanthrene, fluoranthene and pyrene), were examined over a period of 60 days. Microbial communities were characterised by bacterial counts, ester-linked fatty acid methyl ester (EL-FAME) analysis and denaturing gradient gel electrophoresis (DGGE). A marked reduction in species diversity occurred only in the high-PAH treatment at the onset. Both EL-FAME and DGGE subsequently demonstrated a marked shift in the microbial community in all PAH treatments, with increases in the number of fatty acid degraders and in the relative abundance of fatty acid biomarkers for gram-negative bacteria, and a decrease in species diversity. The shift was also accompanied by a significant decrease in PAH concentrations. By the end of the experiment, diversity indices based on both approaches had recovered as PAH concentrations declined to their background levels, except in the high-PAH treatment.

7.
Realistic environmental models used for decision making typically require a highly parameterized approach. Calibration of such models is computationally intensive because widely used parameter-estimation approaches require an individual forward run for each parameter adjusted. These runs construct a parameter-to-observation sensitivity, or Jacobian, matrix used to develop candidate parameter upgrades. Parameter-estimation algorithms are also commonly affected adversely by numerical noise in the calculated sensitivities within the Jacobian matrix, which can result in unnecessary parameter-estimation iterations and a poorer model-to-measurement fit. Ideally, approaches to reduce the computational burden of parameter estimation will also increase the signal-to-noise ratio of the observations influential to the parameter estimation even as the number of forward runs decreases. In this work a simultaneous-increments approach, an iterative ensemble smoother (IES) and a randomized Jacobian approach were compared with a traditional approach that uses a full Jacobian matrix. All approaches were applied to the same model developed for decision making in the Mississippi Alluvial Plain, USA. Both the IES and the randomized Jacobian approach achieved a desirable fit and similar parameter fields in many fewer forward runs than the traditional approach; in both cases the fit was obtained in fewer runs than the number of adjustable parameters. The simultaneous-increments approach did not perform as well as the other methods, owing to its inability to overcome the suboptimal dropping of parameter sensitivities. This work indicates that highly efficient algorithms can greatly speed up parameter estimation, which in turn increases the scope for calibration vetting and the utility of realistic models used for decision making.
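The cost structure the abstract describes can be made concrete with a finite-difference Jacobian: each column requires one extra forward run, so the run count scales with the number of adjustable parameters. The toy linear "forward model" below is an illustrative stand-in for a groundwater model, not the study's model:

```python
import numpy as np

def forward(p):
    # Toy forward model: "observations" as a fixed linear map of
    # the parameters (stand-in for an expensive simulation).
    a = np.array([[2.0, 0.0, 1.0],
                  [0.0, 3.0, 0.5]])
    return a @ p

def fd_jacobian(model, p, eps=1e-6):
    """Full finite-difference Jacobian: one perturbed forward run
    per adjustable parameter - the per-parameter cost that ensemble
    and randomized approaches avoid."""
    base = model(p)
    jac = np.empty((base.size, p.size))
    for j in range(p.size):
        dp = p.copy()
        dp[j] += eps            # perturb one parameter at a time
        jac[:, j] = (model(dp) - base) / eps
    return jac

j = fd_jacobian(forward, np.array([1.0, 2.0, 3.0]))
```

With thousands of parameters this loop dominates calibration time, which is why obtaining a fit in fewer runs than the number of adjustable parameters, as the IES and randomized Jacobian approaches did, is notable.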

8.
Marine Pollution Bulletin, 2012, 64(5-12): 424-430
The dynamics of the microbial community and the biodegradation of polycyclic aromatic hydrocarbons (PAHs) in polluted marine sediments, artificially spiked with a mixture of PAHs (fluorene, phenanthrene, fluoranthene and pyrene), were examined over a period of 60 days. Microbial communities were characterised by bacterial counts, ester-linked fatty acid methyl ester (EL-FAME) analysis and denaturing gradient gel electrophoresis (DGGE). A marked reduction in species diversity occurred only in the high-PAH treatment at the onset. Both EL-FAME and DGGE subsequently demonstrated a marked shift in the microbial community in all PAH treatments, with increases in the number of fatty acid degraders and in the relative abundance of fatty acid biomarkers for gram-negative bacteria, and a decrease in species diversity. The shift was also accompanied by a significant decrease in PAH concentrations. By the end of the experiment, diversity indices based on both approaches had recovered as PAH concentrations declined to their background levels, except in the high-PAH treatment.

9.
This foreword introduces the Special Issue of the Bulletin of Earthquake Engineering devoted to the new Italian strong-motion database ITACA (ITalian ACcelerometric Archive). An overview of the papers published in the issue is presented, giving an idea of the number of problems encountered in compiling a database as rich in information as ITACA, of the solutions adopted, and of the possible research and practical applications. Most of the contents, though specifically addressed to ITACA and its accelerograms, can usefully be read as an exemplification of approaches and methods that can be used for, and extended to, similar databases in other countries.

10.
The forecasting of large aftershocks is a preliminary and critical step in seismic hazard analysis and seismic risk management. From a statistical point of view, it relies entirely on estimating the properties of aftershock sequences using a set of laws with well-defined parameters. Since the frequentist and Bayesian approaches are the common tools for assessing these parameter values, we compare the two approaches for the Modified Omori law and a selection of mainshock–aftershock sequences in the Iranian Plateau. There is general agreement between the two methods, but the Bayesian approach appears to be more efficient as the number of recorded aftershocks decreases. Taking into account temporal variations of the b-value (the slope of the frequency–size distribution), the probability of occurrence of a strong aftershock, or of a larger mainshock, has been calculated in a finite time window using the parameters of the Modified Omori law observed in the Iranian Plateau.
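The probability calculation the abstract describes can be sketched from the Modified Omori law n(t) = K / (t + c)^p: integrate the rate over the forecast window and apply a Poisson occurrence model. The parameter values below are illustrative, not the values estimated for the Iranian Plateau:

```python
import math

def omori_expected_count(k, c, p, t1, t2):
    """Expected number of aftershocks in (t1, t2) days after the
    mainshock under the Modified Omori law n(t) = K / (t + c)^p."""
    if p == 1.0:
        return k * math.log((t2 + c) / (t1 + c))
    return k * ((t1 + c) ** (1.0 - p) - (t2 + c) ** (1.0 - p)) / (p - 1.0)

def prob_at_least_one(k, c, p, t1, t2):
    # Poisson probability of at least one event in the window.
    return 1.0 - math.exp(-omori_expected_count(k, c, p, t1, t2))

# Illustrative parameters: K = 0.5, c = 0.05 d, p = 1.1,
# window from day 1 to day 30 after the mainshock.
prob = prob_at_least_one(k=0.5, c=0.05, p=1.1, t1=1.0, t2=30.0)
```

In a full forecast the (K, c, p) values carry uncertainty, and the Bayesian route the authors favour for sparse sequences propagates that parameter uncertainty into the probability rather than plugging in point estimates.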

11.
Glacier retreat and thinning are occurring in many regions of the world, leading to significant changes in river discharge, water quality and aquatic ecosystems. Therefore, it is increasingly necessary for environmental scientists and managers to develop and adopt methods that link changes in meltwater dynamics and biotic response to better understand and predict future ecosystem change in glacially influenced rivers. To this end, two approaches have been developed recently: (1) the ARISE quantitative meltwater contribution approach of Brown et al. (Freshw Biol 54:1357–1369, 2009b) and (2) the glaciality index (GI) of Ilg and Castella (Freshw Biol 51:840–853, 2006), which provides a measure of glacial runoff influence on stream ecosystems based on four environmental variables (water temperature, channel stability, electrical conductivity and suspended sediment concentration). However, the relative performance and potential complementarities of these two approaches have yet to be evaluated. We conducted a methodological comparison using detailed hydrological, water quality and biological datasets collected over two summers in the French Pyrénées. Analyses revealed strong and significant correlation between ARISE meltwater contributions and the GI. However, the ARISE approach performed better when differentiating glacial influence between streams and over time. Both approaches were significant predictors of macroinvertebrate taxonomic richness, beta diversity, the number of EPT genera and total abundance, although regression models were typically stronger for the ARISE meltwater contribution approach. At the species level, ARISE performed better for predicting the abundance of 13 of the 20 most common taxa. 
We propose that both approaches are valuable for assessing the effects of decreasing meltwater contributions on river ecosystems at the community level, but this case study suggests that ARISE was better able to identify subtle differences in hydrological change, community response and the abundance of individual taxa. Comparative studies from other catchments are required to evaluate the two methods further.

12.
13.
Freshwater and marine ecosystems are exposed to various multi-component mixtures of pollutants. Nevertheless, most ecotoxicological research and chemicals regulation focus on the hazard and exposure assessment of individual substances only; the problem of chemical mixtures in the environment is largely ignored. In contrast, the assessment of combination effects has a long tradition in pharmacology, where mixtures of chemicals are specifically designed in developing new products, e.g. human and veterinary drugs or agricultural and non-agricultural pesticides. In this area, two concepts are frequently used and are thought to describe fundamental relationships between single-substance and mixture effects: Independent Action (Response Addition) and Concentration Addition. The question of the extent to which these concepts may also be applied in an ecotoxicological and regulatory context may be considered a research topic of major importance, as the concepts would allow existing single-substance toxicity data to be used for the predictive assessment of mixture toxicities. Two critical knowledge gaps are identified: (a) there is a lack of environmental realism, as a large part of our current knowledge about the applicability of the concepts is restricted to artificial situations with respect to mixture composition or biological effect assessment; and (b) the knowledge of what exactly is needed to use the concepts as tools for predictive mixture-toxicity assessment is insufficient. Both gaps seriously hamper the necessary, scientifically sound consideration of mixture toxicities in a regulatory context. In this paper, the two concepts are briefly introduced, the necessity of considering the toxicities of chemical mixtures in the environment is demonstrated, and the applicability of Independent Action and Concentration Addition as tools for the prediction and assessment of mixture toxicities is discussed.
An overview of the specific aims and approaches of the BEAM project to fill the identified knowledge gaps is given, and first results are outlined.
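The two concepts introduced above reduce to simple formulas, sketched below. Concentration Addition predicts a mixture effect concentration from the components' individual effect concentrations and their fractions in the mixture; Independent Action multiplies the probabilities of non-response. The component ECx values and effects used here are made-up illustrative numbers:

```python
def ca_effect_concentration(fractions, ecx_values):
    """Concentration Addition: ECx of the mixture from the
    components' ECx values and their fractions p_i in the mixture,
    ECx_mix = 1 / sum(p_i / ECx_i)."""
    return 1.0 / sum(p / e for p, e in zip(fractions, ecx_values))

def ia_mixture_effect(single_effects):
    """Independent Action (Response Addition):
    E_mix = 1 - prod(1 - E_i), with effects as fractions of the
    maximum response."""
    survival = 1.0
    for e in single_effects:
        survival *= (1.0 - e)
    return 1.0 - survival

# Illustrative: 50/50 binary mixture with component EC50s of 10 and 40.
ec50_mix = ca_effect_concentration([0.5, 0.5], [10.0, 40.0])   # 16.0
# Illustrative: two components individually causing 20 % and 30 % effect.
e_mix = ia_mixture_effect([0.2, 0.3])                          # 0.44
```

Note the asymmetry that drives the regulatory debate: CA needs only ECx values, whereas IA needs the full concentration-response relationship of every component at the concentrations present.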

14.
With the recent development of distributed hydrological models, the use of multi-site observed data to evaluate model performance is becoming more common. Distributed hydrological models have many advantages, but at the same time they face the challenge of calibrating a large number of parameters. As a typical distributed hydrological model, the Soil and Water Assessment Tool (SWAT) shares these parameter-calibration problems. In this paper, four different uncertainty approaches—Particle Swarm Optimization (PSO) techniques, Generalized Likelihood Uncertainty Estimation (GLUE), the Sequential Uncertainty Fitting algorithm (SUFI-2) and Parameter Solution (PARASOL)—are compared with the SWAT model applied to the Peace River Basin, central Florida. The observed river-discharge data used in SWAT model calibration were collected from three gauging stations on the main tributary of the Peace River. Behind these approaches lies a shared philosophy: all methods seek out many parameter sets to account for the uncertainty caused by non-uniqueness in model parameter evaluation. On the basis of the statistical results of the four uncertainty methods, the difficulty level of each method, the number of runs and the theoretical basis, the reasons affecting the accuracy of simulation were analysed and compared. Furthermore, for the four uncertainty methods with the SWAT model in the study area, the pairwise correlations between parameters and the distributions of model-fit summary statistics, computed from sampling over the behavioural-parameter and entire feasible parameter spaces, were identified and examined. This provided additional insight into the relative identifiability of the four uncertainty methods. Copyright © 2014 John Wiley & Sons, Ltd.
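The shared philosophy of retaining many parameter sets is most explicit in GLUE, which can be sketched as Monte Carlo sampling of the parameter space and keeping all "behavioural" sets above a likelihood threshold. The toy two-parameter response, noise level and threshold below are illustrative assumptions, not the study's SWAT setup:

```python
import numpy as np

rng = np.random.default_rng(1)

def nse(obs, sim):
    # Nash-Sutcliffe efficiency, a common behavioural criterion.
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def model(t, a, b):
    # Toy two-parameter recession-like response (illustrative).
    return a * np.exp(-b * t)

t = np.linspace(0.0, 10.0, 50)
obs = model(t, 2.0, 0.3) + 0.05 * rng.standard_normal(t.size)  # synthetic data

# GLUE: uniform Monte Carlo sampling, then retain every parameter
# set whose likelihood measure exceeds the behavioural threshold,
# rather than a single optimum.
a_s = rng.uniform(0.5, 4.0, 5000)
b_s = rng.uniform(0.05, 1.0, 5000)
scores = np.array([nse(obs, model(t, a, b)) for a, b in zip(a_s, b_s)])
behavioural = scores > 0.8
```

The spread of the behavioural sets, not a single best fit, is what GLUE reports as parameter uncertainty; PSO, SUFI-2 and PARASOL differ mainly in how they search the space and weight the retained sets.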

15.
The use of stochastic models in subsurface hydrology is growing at a logistic pace. To tie together a number of different stochastic methodologies for deriving subsurface transport equations, we have put together a brief review of some of the more common techniques. Our attention is confined to a few select methodologies so that we can delve in detail into the assumptions required by the various approaches and their strengths and weaknesses. The methods reviewed include: Martingale, stochastic-convective, stochastic-relativist, spectral-integral, perturbative, statistical-mechanical and generalized hydrodynamics. Within this list, we have also included a few stochastic methodologies which have been used solely to develop expressions for the dispersion tensor.

16.
In this paper, we discuss the local discontinuous Galerkin (LDG) method applied to elliptic flow problems and give details of its implementation, focusing specifically on the case of piecewise-linear approximating functions. The LDG method is one of a family of discontinuous Galerkin (DG) methods proposed for diffusion models. These DG methods allow very general hp finite element meshes and produce locally conservative fluxes that can be used in coupling flow with transport. The drawback of DG methods, compared with their continuous counterparts, is the number of degrees of freedom required to compute the solution. This motivates a coupled approach, discussed herein, in which the solution is allowed to be continuous or discontinuous on a node-by-node basis. This coupled approximation is locally conservative in regions where the numerical solution is discontinuous. Numerical results for fully discontinuous, continuous and coupled discontinuous/continuous solutions are given, in which we compare solution accuracy, matrix condition numbers and mass-balance errors for the various approaches.

17.
One of the important factors influencing the accuracy of the numerical solution of the 1D unsaturated flow equation (Richards' equation) is the averaging method used to compute the hydraulic conductivity between two adjacent nodes of the computational grid. A number of averaging schemes have been proposed in the literature for homogeneous soil, including arithmetic, geometric, upstream and integrated means, as well as more sophisticated approaches based on the local solution of steady-state flow between the neighboring nodes (Darcian means). Another group of methods has been developed for the case when a material interface is present between the nodes. These range from simple arithmetic averaging to more complex schemes using pressure- and flux-continuity conditions at the interface. In this paper we compare several averaging schemes for a number of steady and unsteady flow problems in layered soils. The first group of methods is applied in the framework of the vertex-centered approach to spatial discretization, where the nodes are placed at the material interfaces, while the second group is used with the cell-centered approach, where the material interfaces are located between computational nodes. The resulting numerical schemes are evaluated in terms of accuracy and computational time. It is shown that the averaging schemes based on the Darcian-mean principle [19], used in the framework of either the vertex-centered or the cell-centered approach, compare favorably with the other methods over a range of test cases.
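The simplest of the averaging schemes named above can be sketched directly; the harmonic mean is included for contrast because it is exact for steady, gravity-free flow through two layers in series, and the upstream mean picks the donor node's conductivity. The conductivity and head values used are illustrative:

```python
import math

def k_arithmetic(k1, k2):
    # Arithmetic mean of the two nodal conductivities.
    return 0.5 * (k1 + k2)

def k_geometric(k1, k2):
    # Geometric mean; common for strongly varying conductivities.
    return math.sqrt(k1 * k2)

def k_harmonic(k1, k2):
    # Harmonic mean; exact for steady gravity-free flow through
    # two layers in series.
    return 2.0 * k1 * k2 / (k1 + k2)

def k_upstream(k1, k2, head1, head2):
    # Upstream mean: take the conductivity of the donor node.
    return k1 if head1 >= head2 else k2

ks = (1e-4, 1e-6)   # illustrative conductivities of two adjacent nodes
```

For any positive pair, harmonic <= geometric <= arithmetic; with conductivities spanning orders of magnitude near a wetting front, the choice among these (or a Darcian mean) can change the computed flux by orders of magnitude, which is the accuracy issue the paper examines.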

18.
Aroclor 1254, a technical PCB (polychlorinated biphenyl) mixture, and TBT (tributyltin chloride) are environmental pollutants that cause a broad spectrum of acute toxic and chronic effects in aquatic animals. In this paper, the sensitivity of Daphnia magna to chronic exposure to mixed xenobiotics was evaluated under laboratory conditions. The results show that the xenobiotic mixture (50 % each of the single compounds) was more toxic than the individual xenobiotics alone. Measurements of behavioral parameters make it evident that exposure to the single xenobiotics significantly affects daphnids: exposure ultimately led to a rapid decrease in mean swimming activity and also caused changes in preferred swimming depth, with daphnids preferring the upper layers of the aquaria. The mixture altered the swimming behavior even more strongly than the single chemicals did. Finally, all daphnids sank to the bottom of the aquaria, still alive but inactive, by the end of the exposure period. In addition, we investigated the reproductive capacity (number of newborn per female per day). PCB did not affect the number of newborn significantly; TBT stress led to a markedly decreased number of young daphnids, and the xenobiotic mixture decreased reproduction even more. In conclusion, we found significant effects of the single compounds, as well as approximately additive (swimming behavior) and synergistic (reproduction) effects of the chemical mixture on daphnids, indicating the possibility of dramatic ecological consequences of mixed xenobiotic substances occurring in the aquatic environment.

19.
Three methods—Shuffled Complex Evolution (SCE), the Simple Genetic Algorithm (SGA) and the Micro-Genetic Algorithm (µGA)—are applied to parameter calibration of a grid-based distributed rainfall–runoff model (GBDM) and compared by their performance. Ten and four historical storm events in the Yan-Shui Creek catchment, Taiwan, provide the database for model calibration and verification, respectively. The study reveals that SCE, SGA and µGA produce similar calibration results, and none of them is superior with respect to all the performance measures, i.e. the errors in time to peak, peak discharge, total runoff volume, etc. The performance of the GBDM for the verification events is slightly worse than for the calibration events, but still quite satisfactory. Among the three methods, SCE seems to be more robust than the other two approaches, because different initial random-number seeds have the smallest influence on its calibrated model parameters, and it achieves the best verification performance with a relatively small number of calibration events. Copyright © 2009 John Wiley & Sons, Ltd.

20.
To evaluate natural attenuation in contaminated aquifers, it has recently been recognized that a multidisciplinary approach incorporating microbial and molecular methods is required. Observed decreases in contaminant mass and identified footprints of biogeochemical reactions are often used as evidence of intrinsic bioremediation, but characterization of the structure and function of the microbial populations at contaminated sites is also needed. In this paper, we review the experimental approaches and microbial methods that are available as tools to evaluate the controls on microbially mediated degradation processes in contaminated aquifers. We discuss the emerging technologies used in biogeochemical studies and present a synthesis of recent studies that serve as models for integrating microbiological approaches with more traditional geochemical and hydrogeologic approaches in order to address important biogeochemical questions about contaminant fate.


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号