Similar articles
20 similar articles found.
1.
The use of a physiologically based toxicokinetic (PBTK) model to reconstruct chemical exposure from human biomonitoring data, urinary metabolites in particular, has not been fully explored. In this paper, the trichloroethylene (TCE) exposure dataset of Fisher et al. (Toxicol Appl Pharm 152:339–359, 1998) was reanalyzed to investigate this new approach. Treating the external chemical exposure as an unknown model parameter, a PBTK model was used to estimate the exposure and the model parameters from the cumulative amount of trichloroethanol glucuronide (TCOG), a metabolite of TCE, in voided urine and a single blood sample of the study subjects, using Markov chain Monte Carlo (MCMC) simulations. An estimated external exposure of 0.532 mg/l, with a 95% CI of (0.441–0.645) mg/l, successfully reconstructed the true inhalation concentration of 0.538 mg/l. Based on the simulation results, a feasible urine sample collection period would be 12–16 h after TCE exposure, with blood sampling at the end of the exposure period. Given a known metabolic pathway and exposure duration, the proposed computational procedure provides a simple and reliable method for estimating environmental (occupational) exposure and PBTK model parameters, and is more feasible than repeated blood sampling.
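The reconstruction idea can be sketched with a toy model (not the paper's PBTK model): treat the unknown exposure concentration as a parameter and sample its posterior with random-walk Metropolis. The transfer coefficient `K`, the linear kinetics, the noise level and the sampling times below are all hypothetical stand-ins.

```python
import math
import random

random.seed(42)

# Toy forward model (hypothetical): cumulative urinary metabolite after t hours
# is K * C * t, where C is the unknown exposure concentration (mg/L).
K = 3.2            # assumed transfer coefficient
TRUE_C = 0.538     # "true" inhalation concentration to be reconstructed
SIGMA = 0.05       # measurement noise (same units as the metabolite amount)

times = [4.0, 8.0, 12.0, 16.0]
obs = [K * TRUE_C * t + random.gauss(0.0, SIGMA) for t in times]

def log_post(c):
    """Log-posterior for C under a flat positive prior."""
    if c <= 0.0:
        return float("-inf")
    return sum(-0.5 * ((y - K * c * t) / SIGMA) ** 2 for t, y in zip(times, obs))

# Random-walk Metropolis over the single unknown parameter C.
chain, c = [], 1.0
for _ in range(20000):
    proposal = c + random.gauss(0.0, 0.02)
    if math.log(random.random()) < log_post(proposal) - log_post(c):
        c = proposal
    chain.append(c)

posterior = chain[5000:]                # discard burn-in
c_hat = sum(posterior) / len(posterior)
print(round(c_hat, 3))                  # posterior mean, close to TRUE_C
```

The same scheme extends to a full PBTK model by replacing the one-line forward model with the model's ODE solution.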

2.
We present a Bayesian isochron approach to interpret measurements of multiple cosmogenic nuclides from glacially modified bedrock surfaces with complex exposure histories. An isochron approach explicitly incorporating glacial erosion is ideally suited for this problem; such erosion must be accounted for but has traditionally been ignored. Previous methods required treating each sample individually (to account for glacial erosion) and subsequently averaging results for the entire dataset. Geological considerations, however, suggest that a more robust approach is to treat all samples in a dataset simultaneously. The Bayesian isochron method is applied to a previously published set of in situ 14C and 10Be measurements from a set of samples spanning the forefield of the Rhone Glacier, Switzerland. Results indicate 6.4 ± 0.5 kyr of integrated exposure and 4.7 ± 0.5 kyr of cumulative burial, similar to previous estimates, but with much smaller uncertainties. The reduced uncertainties result from fitting the exposure and burial duration to the entire dataset while explicitly accounting for glacial erosion. The method presented here should be applicable with minor modifications in a number of geologic settings, and further demonstrates the utility of paired in situ 10Be and 14C measurements for unraveling complex exposure histories during the Holocene and late Pleistocene.

3.
In this paper we combine the multiscale data integration technique introduced in [Lee SH, Malallah A, Datta-Gupta A, Higdon D. Multiscale data integration using Markov random fields. SPE Reservoir Evaluat Eng 2002;5(1):68–78] with upscaling techniques for spatial modeling of permeability. The main goal of this paper is to find fine-scale permeability fields based on coarse-scale permeability measurements. The approach introduced here is hierarchical, and the conditional information from different length scales is incorporated into the posterior distribution using a Bayesian framework. Because of the complicated structure of the posterior distribution, Markov chain Monte Carlo (MCMC) based approaches are used to draw samples of the fine-scale permeability field.

4.
Radon and its short-lived progeny account for the largest share of human exposure to natural sources of radiation. Many studies have suggested that radon-contaminated water contributes to some of the most common cancers, including lung, colon, and stomach cancer. In this study, water was collected from different groundwater sources in the Hafr Al Batin area, Saudi Arabia, and the radon concentration was measured using an electronic portable radon detector. From that concentration, the annual effective dose of radon exposure via ingestion and inhalation of water was calculated for different age groups to assess the risk of radon exposure. The calculated annual effective doses were then compared with the international risk limits set by international organizations. The radon concentration in the groundwater samples was found to range between 0.03 and 3.20 Bq/L, with an average of 1.16 Bq/L. These values are below the safety limits set by the USEPA and EAEC and far below those recommended by the UNSCEAR, EC, and WHO standards. The calculated annual effective dose for the different age groups, ranging from infants to adults, was found to be in the range of 0.05 to 16.24 μSv/year, with a mean of 5.89 μSv/year, which is within the safe limit recommended by the EC and WHO. The results of this study will support the authorities and regulators responsible for controlling radon and for strategies to ensure public safety against radon exposure.
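The dose arithmetic behind such estimates can be reproduced with standard UNSCEAR-style formulas. The intake rate, occupancy and dose coefficients below are typical adult values, given here as assumptions rather than the study's exact parameters.

```python
def annual_effective_dose_uSv(c_bq_per_l, intake_l_per_day=2.0):
    """Annual effective dose (µSv/yr) from radon in drinking water.

    Assumed coefficients (typical adult values, not the study's exact ones):
      ingestion dose factor   1e-8 Sv/Bq
      water-to-air transfer   1e-4, equilibrium factor 0.4,
      indoor occupancy 7000 h/yr, inhalation factor 9e-9 Sv per (Bq h / m^3)
    """
    ingestion_sv = c_bq_per_l * intake_l_per_day * 365.0 * 1e-8
    inhalation_sv = c_bq_per_l * 1e-4 * 0.4 * 7000.0 * 9e-9
    return (ingestion_sv + inhalation_sv) * 1e6  # Sv -> µSv

# Mean groundwater concentration reported in the abstract: 1.16 Bq/L
mean_dose = annual_effective_dose_uSv(1.16)
print(round(mean_dose, 2))  # a few µSv/yr, same order as the reported 5.89 µSv/yr
```

Age-group differences enter through the intake rate and the ingestion dose factor, which is why the reported range spans infants to adults.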

5.
An operational strategy for the maintenance of reservoirs is currently an important issue because sedimentation reduces reservoir storage. Relatively few studies, however, have addressed reliability analysis that includes the uncertainty in this loss of storage. Because natural uncertainty is embedded in the sedimentation process, the reduction of reservoir storage should be assessed from a probabilistic viewpoint. The objective of this study is to advance maintenance procedures, especially the assessment of future reservoir storage, using time-dependent reliability analysis within a Bayesian approach. A stochastic gamma process is applied to estimate the storage loss of the Soyang dam reservoir in South Korea. To estimate the parameters of the stochastic gamma process, a Bayesian Markov chain Monte Carlo (MCMC) scheme with an informative prior distribution obtained through the empirical Bayes method is applied. The Metropolis–Hastings algorithm is constructed and its convergence is checked by various diagnostics. The expected lifetime of the Soyang dam reservoir estimated by Bayesian MCMC ranges from 111 to 172 years at the 5% significance level. Finally, it is suggested that the improved assessment strategy in this study can provide valuable information to the decision makers in charge of the maintenance of a reservoir or dam.
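A stationary gamma degradation process is straightforward to simulate: independent gamma-distributed increments accumulate until a capacity threshold is crossed, and the first-passage times across many paths give a lifetime distribution. The threshold, shape and scale below are invented for illustration, not the Soyang calibration.

```python
import random

random.seed(1)

def lifetime_years(threshold=500.0, shape=0.5, scale=8.0, dt=1.0):
    """First-passage time of a stationary gamma degradation process.

    The increment over dt years ~ Gamma(shape * dt, scale), so the mean
    loss rate is shape * scale = 4 units/year (hypothetical units).
    """
    loss, t = 0.0, 0.0
    while loss < threshold:
        loss += random.gammavariate(shape * dt, scale)
        t += dt
    return t

lifetimes = sorted(lifetime_years() for _ in range(2000))
median = lifetimes[len(lifetimes) // 2]
lo, hi = lifetimes[round(0.025 * 2000)], lifetimes[round(0.975 * 2000) - 1]
print(median, (lo, hi))   # median near threshold / mean rate = 125 years
```

In the Bayesian version, the shape and scale would themselves be MCMC draws, and the lifetime interval would integrate over that parameter uncertainty.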

6.
Kil Seong Lee, Sang Ug Kim. Hydrological Processes, 2008, 22(12):1949–1964
This study employs the Bayesian Markov chain Monte Carlo (MCMC) method with the Metropolis–Hastings algorithm, alongside maximum likelihood estimation (MLE) using a quadratic approximation of the likelihood function, to evaluate uncertainties in low flow frequency analysis with a two-parameter Weibull distribution. Two types of prior distribution, a non-data-based prior and a data-based prior using regional information collected from neighbouring stations, are used to establish the posterior distribution. Eight case studies using synthetic data with a sample size of 100, generated from a two-parameter Weibull distribution, are performed to compare the results of MLE and Bayesian MCMC. Bayesian MCMC and MLE are also applied to 36 years of gauged data to validate the efficiency of the developed scheme. These examples illustrate the advantages of Bayesian MCMC and the limitations of MLE based on a quadratic approximation. From the point of view of uncertainty analysis, Bayesian MCMC is more effective than MLE using a quadratic approximation when the sample size is small. The Bayesian MCMC method is particularly attractive because the low flow record at a site of interest is usually too short for reliable low flow frequency analysis. Copyright © 2007 John Wiley & Sons, Ltd.
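A minimal version of the Bayesian side of this comparison: draw a synthetic record from a two-parameter Weibull, then sample the scale-shape posterior (flat priors) with Metropolis–Hastings. The sample size, starting values and step sizes are arbitrary choices for the sketch.

```python
import math
import random

random.seed(7)

SCALE, SHAPE = 10.0, 2.0   # "true" Weibull parameters for the synthetic record
data = [random.weibullvariate(SCALE, SHAPE) for _ in range(100)]

def log_lik(scale, shape):
    """Two-parameter Weibull log-likelihood."""
    if scale <= 0.0 or shape <= 0.0:
        return float("-inf")
    n = len(data)
    return (n * math.log(shape) - n * shape * math.log(scale)
            + (shape - 1.0) * sum(math.log(x) for x in data)
            - sum((x / scale) ** shape for x in data))

cur, chain = (8.0, 1.5), []
for _ in range(20000):
    prop = (cur[0] + random.gauss(0.0, 0.3), cur[1] + random.gauss(0.0, 0.08))
    if math.log(random.random()) < log_lik(*prop) - log_lik(*cur):
        cur = prop
    chain.append(cur)

post = chain[5000:]                      # discard burn-in
scale_hat = sum(s for _, (s, k) in enumerate(post)) / len(post)
shape_hat = sum(k for _, (s, k) in enumerate(post)) / len(post)
print(round(scale_hat, 2), round(shape_hat, 2))
```

The spread of the retained chain gives the credible intervals that a quadratic (Laplace) approximation around the MLE can misstate when n is small.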

7.
This paper proposes a non-parametric method for classifying maps (i.e., variable fields such as wave energy maps for the Western Mediterranean Sea) into a set of D typical regimes (calm, E-, SW- or N/NW-wind dominated storms, the four synoptic situations occurring most often in this region). Each map in the training set is described by its values at P measurement points and one of these regime classes. A map is thus identified as a labelled point in a P-dimensional feature space, and the problem is to find a discrimination rule that can be used to attach a classification probability to future unlabelled maps. The discriminant model proposed assumes that some log-contrasts of these classification probabilities form a Gaussian random field on the feature space. Available data (labelled maps of the training set) are then linked to these latent probabilities through a multinomial model. This model is quite common in model-based geostatistics and the Gaussian process classification literature. Inference is approximated numerically using likelihood-based techniques. The multinomial likelihood of labelled features is combined in a Bayesian updating with the Gaussian random field, which plays the role of the prior distribution. The posterior corresponds to an Aitchison distribution. Its maximum posterior estimates are obtained in two steps, exploiting several properties of this family. The first step obtains the mode of this distribution for labelled features by solving a mildly non-linear system of equations. The second step propagates these estimates to unlabelled features with simple kriging of log-contrasts. These inference steps can be extended via Markov chain Monte Carlo (MCMC) sampling to a hierarchical Bayesian problem; this sampling can be improved by further exploiting the properties of the Aitchison distribution, though that is only outlined here.
Results for the application case study suggest that E- and N/NW-dominated storms can be successfully discriminated from calm situations, but are not so easily distinguished from each other.

8.
Global climate models predict the intensification of extreme events, and this intensification is already being observed. For disaster management and adaptation to extreme events, it is essential to improve the accuracy of extreme value statistical models. In this study, Bayes' theorem is used to estimate the parameters of the generalized Pareto distribution (GPD), which is then applied to model the distribution of minimum monthly runoff during dry periods in the mountain areas of the Ürümqi River, Northwest China. Bayes' theorem treats parameters as random variables and provides a robust way to convert the prior distribution of the parameters into a posterior distribution. Statistical inference based on the posterior distribution provides a more comprehensive representation of the parameters. An improved Markov chain Monte Carlo (MCMC) method, which can handle the high-dimensional integrals in Bayes' equation, is used to draw parameter samples from the posterior distribution. Model diagnostic plots are made to ensure that the fitted GPD is appropriate. Based on the GPD with Bayesian parameter estimates, the monthly runoff minima corresponding to different return periods are then calculated. The results show that the improved MCMC method makes the Markov chains converge faster. The monthly runoff minima corresponding to the 10a, 25a, 50a and 100a return periods are 0.60 m3/s, 0.44 m3/s, 0.32 m3/s and 0.20 m3/s, respectively. The lower boundary of the 95% confidence interval of the 100a return level is below zero, which implies that the Ürümqi River is likely to cease flowing when the 100a return level occurs in a dry period. Copyright © 2015 John Wiley & Sons, Ltd.
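Once GPD parameters are in hand, return levels follow from a closed-form expression. The sketch below applies it to negated flows so that "minimum" return levels can be read off; the threshold, scale, shape and exceedance rate are invented, not the fitted Ürümqi values.

```python
import math

def gpd_return_level(u, sigma, xi, rate, T):
    """Level exceeded on average once every T periods.

    u: threshold, sigma: GPD scale, xi: GPD shape,
    rate: mean number of threshold exceedances per period.
    """
    m = T * rate
    if abs(xi) < 1e-12:                       # exponential-tail limit
        return u + sigma * math.log(m)
    return u + (sigma / xi) * (m ** xi - 1.0)

# Minima are handled by negating the series, so return levels come back negated.
params = dict(u=-0.9, sigma=0.25, xi=-0.1, rate=2.0)   # hypothetical fit
minima = {T: -gpd_return_level(T=T, **params) for T in (10, 25, 50, 100)}
print({T: round(v, 3) for T, v in minima.items()})     # decreasing with T
```

In the Bayesian treatment, this function is evaluated over the MCMC parameter draws, and the spread of the resulting levels gives the confidence band whose lower bound drops below zero at the 100a period.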

9.
Age-depth modeling using Bayesian statistics requires well-informed prior information about the behavior of sediment accumulation. Here we present average sediment accumulation rates (represented as deposition times, DT, in yr/cm) for lakes in an Arctic setting, and we examine the variability across space (intra- and inter-lake) and time (late Holocene). The dataset includes over 100 radiocarbon dates, primarily on bulk sediment, from 22 sediment cores obtained from 18 lakes spanning the boreal to tundra ecotone gradients in subarctic Canada. There are four to twenty-five radiocarbon dates per core, depending on the length and character of the sediment records. Deposition times were calculated at 100-year intervals from age-depth models constructed using the ‘classical’ age-depth modeling software Clam. Lakes in boreal settings have the most rapid accumulation (mean DT 20 ± 10 yr/cm), whereas lakes in tundra settings accumulate at moderate (mean DT 70 ± 10 yr/cm) to very slow rates (>100 yr/cm). Many of the age-depth models demonstrate fluctuations in accumulation that coincide with lake evolution and post-glacial climate change. Ten of our sediment cores yielded sediments as old as c. 9000 cal BP (BP = years before AD 1950). Between c. 9000 cal BP and c. 6000 cal BP, sediment accumulation was relatively rapid (DT of 20–60 yr/cm). Accumulation slowed between c. 5500 and c. 4000 cal BP as vegetation expanded northward in response to warming. A short period of rapid accumulation occurred near 1200 cal BP at three lakes. Our research will help inform priors in Bayesian age modeling.
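Deposition time at regular age intervals is simply the reciprocal of the local accumulation rate read off an age-depth model. A linear-interpolation sketch with made-up depth/age pairs:

```python
# Hypothetical age-depth model: depth (cm) vs calibrated age (cal BP)
depths = [0.0, 50.0, 120.0, 200.0]
ages = [0.0, 1500.0, 4200.0, 9000.0]

def depth_at(age):
    """Linearly interpolate depth at a given age."""
    for (a0, a1), (d0, d1) in zip(zip(ages, ages[1:]), zip(depths, depths[1:])):
        if a0 <= age <= a1:
            return d0 + (d1 - d0) * (age - a0) / (a1 - a0)
    raise ValueError(f"age {age} outside the model")

def deposition_time(age, step=100.0):
    """DT in yr/cm over a 100-year window starting at `age`."""
    return step / (depth_at(age + step) - depth_at(age))

for a in (500.0, 3000.0, 6000.0):
    print(a, round(deposition_time(a), 1))   # DT grows as accumulation slows
```

Software such as Clam performs the same computation on calibrated, uncertainty-propagated age-depth curves rather than on straight lines.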

10.
A statistical study was made of the temporal trend in extreme rainfall in the region of Extremadura (Spain) during the period 1961–2009. A hierarchical spatio-temporal Bayesian model with a GEV parameterization of the extreme data was employed. The Bayesian model was implemented in a Markov chain Monte Carlo framework that allows the posterior distribution of the model parameters to be estimated. The results show a decrease in extreme rainfall in winter and spring and a slight increase in autumn. The uncertainty in the trend parameters obtained with the hierarchical approach is much smaller than that obtained from the GEV model applied locally. A negative relationship was also found between the NAO index and extreme rainfall in Extremadura during winter. An increase was observed in the intensity of the NAO index in winter and spring, and a slight decrease in autumn.

11.
Using data from two very large watersheds and five smaller ones, this paper explores the use of Bayesian methods for fitting rating curves. Posterior distributions of rating-curve parameters were calculated using Markov chain Monte Carlo (MCMC) methods, and 95% credible intervals were calculated for predicted discharges, given stage. Expected discharge was related to stage using a link function. For the five smaller watersheds, the assumptions were (a) that the distribution of discharge Q, given stage h, is Normal, with variance proportional to h; and (b) that a log link function relates μQ, the mean of Q given h, to a function of stage of the form μQ = β(h + α)^λ. For the two large watersheds, however, a better fit was obtained by taking the distribution of Q to be log-Normal, and the link function as ln μQ = β0 + β1h. For the two large watersheds, the priors for all three parameters were taken as uninformative; for the five smaller, the prior for the parameter λ was taken as Normally distributed, N(2, 0.5). Acceptable ratings were obtained for all seven sites. It is argued that distributions of derived variables (such as annual maximum discharge) can be obtained directly from (a) the posterior distribution of the rating-curve parameters and (b) the stage record, without recourse to additional assumptions. Estimates thus obtained for the T-year event will incorporate rating-curve uncertainty. It is argued that Bayesian methods are appropriate for rating-curve calculation because their inherent flexibility (a) allows the incorporation of prior information about the nature of a rating curve; (b) yields credible intervals for predicted discharges and quantities derived from them; and (c) can be extended to allow for uncertainty in stage measurements.
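The small-watershed link function μQ = β(h + α)^λ becomes linear in log space once α is fixed, so a quick sketch is to grid-search α and solve the inner log-linear fit by least squares. This is a stand-in for the full MCMC treatment, and all data and parameter values are synthetic.

```python
import math
import random

random.seed(3)

BETA, ALPHA, LAM = 4.0, 0.3, 2.0   # "true" rating-curve parameters
stages = [0.5 + 0.25 * i for i in range(20)]
flows = [BETA * (h + ALPHA) ** LAM * math.exp(random.gauss(0.0, 0.05))
         for h in stages]

def fit_given_alpha(alpha):
    """Least-squares fit of ln Q = ln(beta) + lam * ln(h + alpha)."""
    xs = [math.log(h + alpha) for h in stages]
    ys = [math.log(q) for q in flows]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    lam = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
    b0 = ybar - lam * xbar
    sse = sum((y - (b0 + lam * x)) ** 2 for x, y in zip(xs, ys))
    return sse, math.exp(b0), lam

# Grid search over alpha, keeping the inner fit with the smallest SSE.
sse, beta_hat, lam_hat, alpha_hat = min(
    fit_given_alpha(a / 100.0) + (a / 100.0,) for a in range(0, 101)
)
print(round(beta_hat, 2), round(alpha_hat, 2), round(lam_hat, 2))
```

The Bayesian version replaces the point fit with posterior sampling, which is what yields the credible intervals for predicted discharges mentioned in the abstract.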

12.
Accurate and precise estimation of return levels is often a key goal of any extreme value analysis. For example, in the UK the British Standards Institution (BSI) incorporate estimates of ‘once-in-50-year wind gust speeds’—or 50-year return levels—into their design codes for new structures; similarly, the Dutch Delta Commission use estimates of the 10,000-year return level for sea-surge to aid the construction of flood defence systems. In this paper, we briefly highlight the shortcomings of standard methods for estimating return levels, including the commonly adopted block maxima and peaks-over-threshold approaches, before presenting an estimation framework which we show can substantially increase the precision of return level estimates. Our work allows explicit quantification of seasonal effects, as well as exploiting recent developments in the estimation of the extremal index for handling extremal clustering. From frequentist ideas, we turn to the Bayesian paradigm as a natural approach for building complex hierarchical or spatial models for extremes. Through simulations we show that the return level posterior mean does not have an exceedance probability in line with the intended encounter risk; we also argue that the Bayesian posterior predictive value gives the most satisfactory representation of a return level for use in practice, accounting for uncertainty in parameter estimation and future observations. Thus, where feasible, we propose a Bayesian estimation strategy for optimal return level inference.
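The point about plug-in versus predictive return levels can be checked numerically: fix the return level computed at the posterior-mean parameters, then average the exceedance probability over the parameter draws. With convex tail behaviour the predictive exceedance exceeds the nominal 1/T. The Gumbel margin and the hypothetical posterior draws below are simplifications of the paper's framework.

```python
import math
import random

random.seed(5)

def gumbel_cdf(x, mu, beta):
    return math.exp(-math.exp(-(x - mu) / beta))

# Hypothetical posterior draws: location known, scale uncertain
draws = [(10.0, b) for b in (random.gauss(2.0, 0.4) for _ in range(4000))
         if b > 0.5]

T = 100
y = -math.log(-math.log(1.0 - 1.0 / T))     # Gumbel reduced variate
beta_bar = sum(b for _, b in draws) / len(draws)
z_plugin = 10.0 + beta_bar * y              # level at posterior-mean parameters

# Predictive exceedance probability of the plug-in return level
p_pred = sum(1.0 - gumbel_cdf(z_plugin, m, b) for m, b in draws) / len(draws)
print(round(p_pred, 4), 1.0 / T)            # predictive exceedance > 1/T
```

The predictive return level would instead be chosen so that this averaged exceedance equals exactly 1/T, which is the representation the authors recommend.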

13.
Large observed datasets are often non-stationary and/or depend on covariates, especially in the case of extreme hydrometeorological variables, which complicates estimation using classical hydrological frequency analysis. A number of non-stationary models have been developed that use linear or quadratic polynomial functions or B-splines to estimate the relationship between parameters and covariates. In this article, we propose regularized generalized extreme value models with B-splines (GEV-B-splines models) in a Bayesian framework to estimate quantiles. Regularization is based on penalties and aims to favour parsimonious models, especially in large-dimensional spaces. The penalties are introduced in a Bayesian framework and the corresponding priors are detailed. Five penalties are considered and the corresponding priors developed for comparison purposes: the least absolute shrinkage and selection operator (Lasso), Ridge, and smoothly clipped absolute deviation (SCAD) penalties (SCAD1, SCAD2 and SCAD3). Markov chain Monte Carlo (MCMC) algorithms have been developed for each model to estimate quantiles and their posterior distributions. These approaches are tested and illustrated using simulated data with different sample sizes. A first simulation was made with polynomial B-splines functions in order to choose the most efficient model in terms of the relative mean bias (RMB) and relative mean error (RME) criteria. A second simulation was performed with the SCAD1 penalty for sinusoidal dependence to illustrate the flexibility of the proposed approach. The results show clearly that the regularized approaches lead to a significant reduction of the bias and the mean square error, especially for small sample sizes (n < 100). A case study models annual peak flows at the Fort-Kent catchment with total annual precipitation as covariate. Conditional quantile curves are given for the regularized and maximum likelihood methods.
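The effect of an L1 penalty is easy to demonstrate with plain coordinate descent: the soft-threshold update zeroes out weak coefficients, which is exactly the parsimony the regularized GEV-B-splines models aim for. The toy regression below (one relevant and one irrelevant predictor) is a stand-in, not the GEV machinery.

```python
import math
import random

random.seed(9)

n = 200
X = [[random.gauss(0.0, 1.0) for _ in range(2)] for _ in range(n)]
y = [2.0 * row[0] + random.gauss(0.0, 0.5) for row in X]   # feature 1 irrelevant

def lasso_cd(X, y, lam, sweeps=100):
    """Lasso via cyclic coordinate descent with soft-thresholding."""
    p = len(X[0])
    beta = [0.0] * p
    col_sq = [sum(row[j] ** 2 for row in X) for j in range(p)]
    for _ in range(sweeps):
        for j in range(p):
            # correlation of feature j with the partial residual (beta_j left out)
            rho = sum(row[j] * (yi - sum(b * x for k, (b, x)
                                         in enumerate(zip(beta, row)) if k != j))
                      for row, yi in zip(X, y))
            beta[j] = math.copysign(max(abs(rho) - lam, 0.0), rho) / col_sq[j]
    return beta

beta = lasso_cd(X, y, lam=50.0)
print([round(b, 3) for b in beta])   # the irrelevant coefficient is exactly zero
```

In the Bayesian formulation of the paper, the same shrinkage is obtained by placing the corresponding prior (e.g. Laplace for Lasso) on the B-spline coefficients and sampling with MCMC.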

14.
Frequent ash fallout from long-lived eruptions (with active phases greater than 5 years) may lead to local populations experiencing unacceptably high cumulative exposures to respirable particulate matter. Ash from Montserrat has been shown to contain significant levels of cristobalite and other reactive agents that are associated with an increased risk of developing pneumoconiosis (including silicosis) and other long-term health problems. There are a number of difficulties associated with estimating risks in populations due to uncertain and wide-ranging individual exposures, changes in behaviour with time and the natural variation in individual response. Present estimates of risk in workers and other population groups are simplifications based on a limited number of exposure measurements taken on Montserrat (1996–1999), and exposure-response curves from epidemiological studies of coal workers exposed to siliceous dust. In this paper we present a method for calculating the long-term cumulative exposure to cristobalite from volcanic ash by Monte Carlo simulation. Code has been written to generate synthetic time series for volcanic activity, rainfall, ash deposition and erosion to give daily ash deposit values and cristobalite fraction at a range of locations. The daily mean personal exposure for PM10 and cristobalite is obtained by sampling from a probability distribution, with distribution parameters dependent on occupation, ground deposit depth and daily weather conditions. Output from multiple runs is processed to calculate the exceedance probability for cumulative exposure over a range of occupation types, locations and exposure periods. Results are interpreted in terms of current occupational standards, and epidemiological exposure-response functions for silicosis are applied to quantify the long-term health risk.
Assuming continuing volcanic activity, the median risk of silicosis (profusion 1/0 or higher) for an average adult after 20 years of continuous exposure is estimated to be approximately 0.5% in northern Montserrat and 1.6% in Cork Hill. The occupational group with the highest exposure to ash is gardeners, with a corresponding 2% to 4% risk of silicosis. In situations where opportunities for in-depth exposure studies are limited, computer simulations provide a good indication of risk based on current expert knowledge. By running the code for a range of input scenarios, the cost-benefit of mitigation measures (such as a programme of active ash clearance) can be estimated. Results may also be used to identify situations where full exposure studies or fieldwork would be beneficial. Editorial responsibility: J Stix
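The core of such a simulation is a daily loop: sample a personal dust exposure, convert it to a cristobalite dose, accumulate, and estimate an exceedance probability across many simulated years. The distribution parameters, the fixed 15% cristobalite fraction and the exposure limit below are all illustrative, not the Montserrat calibration.

```python
import math
import random

random.seed(11)

CRISTOBALITE_FRACTION = 0.15   # assumed mass fraction of respirable ash

def mean_daily_exposure(working_days=250):
    """Mean daily cristobalite exposure (µg/m^3) over one simulated year."""
    total = 0.0
    for _ in range(working_days):
        pm10 = random.lognormvariate(math.log(50.0), 0.8)  # hypothetical µg/m^3
        total += CRISTOBALITE_FRACTION * pm10
    return total / working_days

sims = sorted(mean_daily_exposure() for _ in range(500))
limit = 12.0                                   # illustrative limit, µg/m^3
p_exceed = sum(s > limit for s in sims) / len(sims)
print(round(sims[len(sims) // 2], 1), p_exceed)
```

In the full model the lognormal parameters would depend on occupation, deposit depth and weather, and the cumulative exposure would feed an exposure-response function for silicosis risk.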

15.
The Cumulative and Aggregate Simulation of Exposure (CASE) framework is an innovative simulation tool for exploring non-dietary exposures to environmental contaminants. Built upon the Dermal Exposure Reduction Model (DERM) and established methods for collecting detailed human activity patterns, the CASE framework improves upon its predecessor. Although prompted in part by the Food Quality Protection Act of 1996 and the need to assess aggregate exposure to pesticides, the framework was designed to be flexible enough to assess exposure to other contaminants. This paper examines features of the CASE framework in an illustrative application estimating children’s dermal and non-dietary ingestion exposure to lead in the residential environment. Concentration values in various media are taken from a nationwide study and exposure factors are obtained from the literature. Activity pattern input includes sequential micro-level activities collected for 20 children (ages 1 through 6). Modeled results are explored via classification trees and sensitivity analysis. Results of each exposure route are also compared to independent data. Median dermal exposure estimates were 589 and 558 μg/m3 for the right and left hands, respectively, with the resulting output most sensitive to exposure factor terms. The simulation estimated a median non-dietary ingestion rate of 9.5 μg of lead per day with estimates most sensitive to the surface area of mouthed objects.  相似文献   

16.
The Usutu virus is an arbovirus transmitted by mosquitoes that causes disease in birds. The virus was first detected in Austria in 2001, and a major outbreak occurred in 2003. Rubel et al. (2008) developed a nine-compartment deterministic SEIR model to explain the spread of the disease. We extended this to a hierarchical Bayes model assuming random variation in temperature data, in the reproduction data of birds, and in the number of birds found to be infected. The model was implemented in R, combined with the FORTRAN subroutine of the original deterministic model. The analysis was carried out by MCMC using a random-walk Metropolis scheme. Posterior means, medians, and credible intervals were calculated for the parameters. The hierarchical Bayes approach proved fruitful in extending the deterministic model into a stochastic one, allowing Bayesian point and interval estimation and quantification of the uncertainty of predictions. The analysis revealed that some model parameters were not identifiable; we therefore held some of them constant and analyzed the others conditional on them. Identifiability problems are common in models that aim to mirror the mechanism of a process, since parameters with a natural interpretation are likely to exhibit interrelationships. This study illustrates that Bayesian modeling combined with conditional analysis may help in those cases. Its application to the Usutu model improved the model fit and revealed the structure of interdependencies between model parameters: it demonstrated that determining some of them experimentally would enable estimation of the others, except one, from available data.
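The deterministic backbone of such a model is a set of SEIR ODEs. A collapsed four-compartment sketch (the actual model has nine compartments and temperature-dependent rates) integrated with a simple forward-Euler scheme:

```python
def seir(beta=0.5, sigma=0.2, gamma=0.1, days=300, dt=0.1):
    """Forward-Euler integration of a basic SEIR model (population fractions).

    beta: transmission rate, sigma: incubation rate, gamma: recovery rate
    (all hypothetical values).
    """
    s, e, i, r = 0.999, 0.0, 0.001, 0.0
    for _ in range(int(days / dt)):
        ds = -beta * s * i
        de = beta * s * i - sigma * e
        di = sigma * e - gamma * i
        dr = gamma * i
        s, e, i, r = s + ds * dt, e + de * dt, i + di * dt, r + dr * dt
    return s, e, i, r

s, e, i, r = seir()
print(round(r, 3))   # most of the population ends up recovered (R0 = beta/gamma = 5)
```

The hierarchical Bayes layer wraps such a solver: MCMC proposes rate parameters, the ODEs are integrated, and the simulated infection counts are compared with observed data through the likelihood.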

17.
We present a sensitivity analysis of the isochron approach of Goehring et al. (2013) for paired measurements of in situ 14C/10Be from glacially sculpted bedrock surfaces. This analysis tests how sensitive the resulting exposure durations are to both the number of samples analyzed and their locations along a glacial trough transect, using a dataset from Goehring et al. (2011) as a test case. A simple, equally weighted combinatorial approach was employed to (1) generate non-repetitive combinations of n samples drawn from the ten available samples (where n < 10), and (2) estimate the exposure duration and uncertainty for each set of simulations. Results from the Goehring et al. (2011) data indicate that five samples evenly distributed along a transect parallel to the ice margin are the minimum required for this method, while eight or more samples provide an optimal combination of accuracy and precision at the 1σ level. These findings should be applicable to paired in situ 14C/10Be measurements from other polished bedrock troughs at glacial margins, but need further experimental confirmation.
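The combinatorial experiment is easy to reproduce with `itertools.combinations`: take every size-n subset of per-sample estimates and look at how the spread of subset means shrinks as n grows. The ten "apparent exposure durations" below are invented placeholders.

```python
import statistics
from itertools import combinations

# Hypothetical per-sample apparent exposure durations (kyr) for ten samples
ages = [6.1, 6.6, 6.3, 6.9, 6.2, 6.5, 6.8, 6.0, 6.4, 6.7]

def spread_of_subset_means(n):
    """Std. dev. of the means of all C(10, n) subsets of size n."""
    means = [statistics.mean(c) for c in combinations(ages, n)]
    return statistics.pstdev(means)

spreads = {n: round(spread_of_subset_means(n), 3) for n in (3, 5, 8)}
print(spreads)   # precision improves (spread shrinks) as n grows
```

In the actual analysis each subset would be fed through the full isochron fit rather than a simple mean, but the shrinking-spread behaviour is the same argument for using five or more samples.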

18.
Prestack seismic inversion based on the MCMC method (total citations: 6; self-citations: 5; other citations: 1)
The Markov chain Monte Carlo (MCMC) method is a heuristic global optimization algorithm [1]. Working in a Bayesian framework and constrained by available data, it makes the optimal solution satisfy the statistical properties of the parameters, while the incorporated prior information improves the accuracy of the solution; the search process can escape local optima and reach the global optimum. With the MCMC method, a large number of samples from the posterior probability distribution can be obtained, yielding not only an estimate of each unknown parameter but also...

19.
International Journal of Sediment Research, 2019, 34(6):577–590
Bayesian and discriminant function analysis (DFA) models have recently been used as tools to estimate sediment source contributions. Unlike existing multivariate mixing models, the accuracy of these two models remains unclear. In the current study, four well-distinguished source samples were used to create artificial mixtures to test the performance of the Bayesian and DFA models. These models were tested against the Walling-Collins model, a credible model for estimating sediment source contributions, as a reference. The artificial mixtures were divided into five groups, each consisting of five samples with known source percentages. The relative contributions of the sediment sources to the individual and grouped samples were calculated using each of the models. The mean absolute error (MAE) and the standard error (SE) of the MAE were used to test the accuracy of each model and the robustness of the optimized solutions. For the individual sediment samples, the source contributions calculated with the Bayesian (MAE = 7.4%, SE = 0.6%) and Walling-Collins (MAE = 7.5%, SE = 0.7%) models were closest to the actual percentages of the source contributions to the sediment mixtures. The DFA model produced the worst estimates (MAE = 18.4%, SE = 1.4%). For the grouped sediment samples, the Walling-Collins model (MAE = 5.4%) was the best predictor, closely followed by the Bayesian model (MAE = 5.9%). The results obtained with the DFA model were similar to those for the individual sediment samples, its source contribution estimates being the least accurate of any of the models (MAE = 18.5%). An increase in sample size improved the accuracy of the Walling-Collins and Bayesian models, but the DFA model produced similarly inaccurate results for both the individual and grouped sediment samples.
Generally, the accuracy of the Walling-Collins and Bayesian models was similar (p > 0.01), while there were significant differences (p < 0.01) between the DFA model and the other models. This study demonstrates that the Bayesian model can provide a credible estimation of sediment source contributions and has great practical potential, while the accuracy of the DFA model still requires considerable improvement.
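The artificial-mixture test can be mimicked with a tiny tracer mixing model: given a tracer signature for each source, search the proportion simplex for the best least-squares fit to a mixture. The two tracers, three sources and their concentrations below are fabricated for illustration.

```python
# Hypothetical tracer concentrations: sources[s][t] = tracer t in source s
sources = [(12.0, 3.0), (5.0, 9.0), (20.0, 1.0)]
TRUE_P = (0.5, 0.3, 0.2)

def mix(p):
    """Tracer signature of a mixture with source proportions p."""
    return [sum(pi * s[t] for pi, s in zip(p, sources)) for t in range(2)]

target = mix(TRUE_P)   # the "measured" artificial mixture (noise-free)

# Exhaustive search over the proportion simplex at 1% resolution
best = min(
    ((p1 / 100.0, p2 / 100.0, (100 - p1 - p2) / 100.0)
     for p1 in range(101) for p2 in range(101 - p1)),
    key=lambda p: sum((m - t) ** 2 for m, t in zip(mix(p), target)),
)
print(best)   # recovers the true proportions
```

The Bayesian and Walling-Collins models replace this exhaustive search with posterior sampling or weighted optimization over many tracers, which is what the MAE/SE comparison in the abstract evaluates.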

20.
A methodological approach for modelling the occurrence patterns of species for the purpose of fisheries management is proposed here. The presence/absence of the species is modelled with a hierarchical Bayesian spatial model using the geographical and environmental characteristics of each fishing location. Maps of predicted probabilities of presence are generated using Bayesian kriging. Bayesian inference on the parameters and prediction of presence/absence in new locations (Bayesian kriging) are made by treating the model as a latent Gaussian model, which allows the use of the integrated nested Laplace approximation (INLA) software (considerably faster than the well-known MCMC methods). In particular, the spatial effect has been implemented with the stochastic partial differential equation (SPDE) approach. The methodology is evaluated on Mediterranean horse mackerel (Trachurus mediterraneus) in the Western Mediterranean. The analysis shows that environmental and geographical factors can play an important role in driving the local distribution and variability of species occurrence. Although this approach is used here to characterize the habitat of horse mackerel, it could also be applied to other species and life stages to improve knowledge of fish populations and communities.
