Found 20 similar records; search took 31 ms
1.
Investigation of topographic reductions and aliasing effects on gravity and the geoid over Greece based on various digital terrain models
The reduction of gravity-field related quantities (e.g., gravity anomalies, geoid heights) due to the topography plays a crucial
role in both geodetic and geophysical applications, since in the former it is an intermediate step towards geoid prediction
and in the latter it reveals lateral as well as radial density contrasts and infers the geology of the area under study. The
computations are usually carried out by employing a DTM and/or a DBM, which describe the topography and bathymetry, respectively.
Errors in these DTMs/DBMs will introduce errors in the computed topographic effects, while poor spatial resolution of the
topography and bathymetry models will result in aliasing effects in both gravity anomalies and geoid heights, both influencing
the accuracy of the estimated solutions. The scope of this work is twofold. First, a validation and accuracy assessment of
the SRTM 3″ (90 m) DTM over Greece is performed through comparisons with existing global models as well as with the Greek
450 m national DTMs. Whenever a misrepresentation of the topography is identified in the SRTM data, it is “corrected” using
the local 450 m DTM. This process resulted in an improved SRTM DTM called SRTMGr, which was then used to determine terrain
effects to gravity field quantities. From the fine-resolution SRTMGr DTMs, coarser models of 15″, 30″, 1′, 2′ and 5′ have
been determined in order to investigate aliasing effects on both gravity anomalies and geoid heights by computing terrain
effects at variable spatial resolutions. From the results acquired in two test areas, it was concluded that SRTMGr provides
similar results to the local DTM, making the use of other, older global DTMs obsolete. The study of terrain aliasing effects
proved that when high-resolution, high-accuracy gravity and geoid models are needed, the highest possible resolution DTM
should be employed to compute the respective terrain effects. Based on the results acquired from the two test areas, a corrected
SRTMGr DTM has been compiled for the entire Greek territory towards the development of a new gravimetric geoid model. Results
from that analysis are presented based on the well-known remove-compute-restore method, employing land and marine gravity
data, EGM08 as a reference geopotential model, and the SRTMGr DTM for the computation of the RTM effects.
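The aliasing idea described above (coarser DTMs discard short-wavelength topography that the terrain-effect computation then misses) can be illustrated with a block-mean resampling sketch on an invented surface; the synthetic terrain and the resampling factors are assumptions, not the SRTMGr data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic fine-resolution "DTM": smooth ridges plus rough small-scale relief
n = 240
x = np.linspace(0, 6 * np.pi, n)
dtm = 500 * np.sin(x)[None, :] + 500 * np.sin(x)[:, None] \
      + rng.normal(0, 50, (n, n))

def coarsen(z, f):
    """Block-mean resample a square grid to a resolution f times coarser."""
    m = z.shape[0] // f
    return z[:m * f, :m * f].reshape(m, f, m, f).mean(axis=(1, 3))

# Coarsening removes small-scale relief; the variance that is lost is the
# part that can alias into terrain-effect computations.
for f in (2, 8, 24):
    lost = dtm.var() - coarsen(dtm, f).var()
    print(f, round(float(lost), 1))
```

By the law of total variance, the block means can never carry more variance than the full grid, so the "lost" term grows as the resample factor increases.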
2.
A mixed model is proposed to fit earthquake interevent time distribution. In this model, the whole distribution is constructed
by mixing the distribution of clustered seismicity, with a suitable distribution of background seismicity. Namely, the fit
is tested assuming a clustered seismicity component modeled by a non-homogeneous Poisson process and a background component
modeled using different hypothetical models (exponential, gamma and Weibull). For southern California, Japan, and Turkey,
the best fit is found when a Weibull distribution is implemented as a model for background seismicity. Our study uses an earthquake
random-sampling method that we introduced recently. It is performed here to account for space–time clustering of earthquakes at
different distances from a given source and to increase the number of samples used to estimate earthquake interevent time
distribution and its power law scaling. For Japan, the contribution of clustered pairs of events to the whole distribution
is analyzed for different magnitude cutoffs, m_c, and different time periods. The results show that power laws are mainly produced by the dominance of correlated pairs at
small and long time ranges. In particular, both power laws, observed at short and long time ranges, can be attributed to time–space
clustering revealed by the standard Gardner–Knopoff declustering windows.
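The background-component fitting step can be sketched as follows. This is an illustrative reconstruction on synthetic data, not the authors' code: the exponential stand-in for the clustered component, the mixture proportions, and the 1-unit tail threshold are all assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic interevent times: short clustered waiting times (exponential,
# a stand-in for the non-homogeneous Poisson component) mixed with longer
# background waiting times drawn from a Weibull distribution.
clustered = rng.exponential(scale=0.1, size=2000)
background = stats.weibull_min.rvs(c=0.8, scale=30.0, size=1000, random_state=1)
times = np.concatenate([clustered, background])

# Fit the hypothesised background model to the long-waiting-time tail only
tail = times[times > 1.0]
shape, loc, scale = stats.weibull_min.fit(tail, floc=0)

# Goodness of fit of the Weibull background model (Kolmogorov-Smirnov)
ks_weibull = stats.kstest(tail, "weibull_min", args=(shape, loc, scale)).statistic
print(shape, scale, ks_weibull)
```

In the paper the same comparison is repeated with exponential and gamma candidates, and the Weibull fit wins on the data sets considered.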
3.
Karl Thomas Hjelmervik Jan Kristian Jensen Petter Østenstad Atle Ommundsen 《Ocean Dynamics》2012,62(2):253-264
Sonar performance modeling is crucial for submarine and anti-submarine operations. The validity of sonar performance models
is generally limited by environmental uncertainty, and particularly uncertainty in the vertical sound speed profile (SSP).
Rapid environmental assessment (REA) products, such as oceanographic surveys and ocean models may be used to reduce this uncertainty
prior to sonar operations. Empirical orthogonal functions (EOF) applied on the SSPs inherently take into account the vertical
gradients and therefore the acoustic properties. We present a method that employs EOFs and a grouping algorithm to divide
a large group of SSPs from an ocean model simulation into smaller groups with similar SSP characteristics. Such groups are
henceforth called acoustically stable groups. Each group represents a subset in space and time within the ocean model domain.
Regions with low acoustic variability contain large and geographically contiguous acoustically stable groups. In contrast,
small or fragmented acoustically stable groups are found in regions with high acoustic variability. The main output is a map
of the group distribution. This is a REA product in itself, but the map may also be used as a planning aid for REA survey
missions.
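The EOF step can be illustrated with a minimal sketch: synthetic profiles from two hypothetical water-mass regimes are decomposed by SVD and split on the leading EOF coefficient. The sign-based split is a crude stand-in for the paper's grouping algorithm, and all profile shapes are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic sound speed profiles (SSPs): n_profiles x n_depths,
# built from two hypothetical water-mass regimes plus noise.
depths = np.linspace(0, 200, 50)
warm = 1500 + 10 * np.exp(-depths / 50)      # assumed regime 1
cold = 1490 + 5 * np.exp(-depths / 100)      # assumed regime 2
ssps = np.vstack([warm + rng.normal(0, 0.5, (40, 50)),
                  cold + rng.normal(0, 0.5, (60, 50))])

# EOF decomposition: SVD of the mean-removed profile matrix.
anomalies = ssps - ssps.mean(axis=0)
_, s, vt = np.linalg.svd(anomalies, full_matrices=False)
coeffs = anomalies @ vt[0]   # projection onto the leading EOF

# Crude grouping stand-in: split profiles by the sign of the
# leading EOF coefficient (the paper uses a dedicated algorithm).
group = coeffs > 0
print(group.sum(), (~group).sum())
```

Because the leading EOF captures the vertical gradients that matter acoustically, profiles with similar coefficients form the "acoustically stable groups" of the abstract.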
4.
A novel hybrid approach to earthquake location is proposed which uses a combined coarse global search and fine local inversion
with a minimum search routine. The method exploits the advantages of network ray tracing and robust formulation of the Fréchet
derivatives to simultaneously update all sampled initial source parameters in the solution space to determine the best solution.
Synthetic examples, involving a three-dimensional (3-D) complex velocity model and a challenging source–receiver layout, are
used to demonstrate the advantages over direct grid search algorithms in terms of solution accuracy, computational efficiency,
and sensitivity to noise. Therefore, this is a promising scheme for earthquake early warning, tsunami early warning, rapid
hazard assessment, and emergency response after a strong earthquake.
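The coarse-global-plus-fine-local idea can be sketched on a toy 2-D problem. The paper uses 3-D network ray tracing and Fréchet derivatives; the uniform-velocity travel-time misfit, the station layout, and the Nelder-Mead local step below are simplifying assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Toy 2-D location problem standing in for the hybrid scheme: travel-time
# misfit minimised by a coarse grid scan followed by a local inversion.
stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
true_src = np.array([3.2, 7.7])
v = 5.0                                   # assumed uniform velocity, km/s
obs = np.linalg.norm(stations - true_src, axis=1) / v

def misfit(src):
    pred = np.linalg.norm(stations - src, axis=1) / v
    return np.sum((pred - obs) ** 2)

# Stage 1: coarse global search on a 1-km grid
grid = np.mgrid[0:10:11j, 0:10:11j].reshape(2, -1).T
coarse = grid[np.argmin([misfit(g) for g in grid])]

# Stage 2: fine local inversion started from the best grid node
fine = minimize(misfit, coarse, method="Nelder-Mead").x
print(coarse, fine)
```

The two-stage split is what buys the efficiency claimed over a direct fine grid search: the expensive fine sampling happens only around the best coarse node.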
5.
Lawrence D. Lemke Andrew S. Bahrou 《Stochastic Environmental Research and Risk Assessment (SERRA)》2009,23(1):27-39
Quantifying human cancer risk arising from exposure to contaminated groundwater is complicated by the many hydrogeological,
environmental, and toxicological uncertainties involved. In this study, we used Monte Carlo simulation to estimate cancer
risk associated with tetrachloroethene (PCE) dissolved in groundwater by linking three separate models for: (1) reactive contaminant
transport; (2) human exposure pathways; and (3) the PCE cancer potency factor. The hydrogeologic model incorporates an analytical
solution for a one-dimensional advective–dispersive–reactive transport equation to determine the PCE concentration in a water
supply well located at a fixed distance from a continuous source. The pathway model incorporates PCE exposure through ingestion,
inhalation, and dermal contact. The toxicological model combines epidemiological data from eight rodent bioassays of PCE exposure
in the form of a composite cumulative distribution frequency curve for the human PCE cancer potency factor. We assessed the
relative importance of individual model variables through their correlation with expected cancer risk calculated in an ensemble
of Monte Carlo simulations with 20,000 trials. For the scenarios evaluated, three factors were most highly correlated with
cancer risk: (1) the microbiological decay constant for PCE in groundwater, (2) the linear groundwater pore velocity, and
(3) the cancer potency factor. We then extended our analysis beyond conventional expected value risk assessment using the
partitioned multiobjective risk method (PMRM) to generate expected-value functions conditional to a 1 in 100,000 increased
cancer risk threshold. This approach accounts for low probability/high impact outcomes separately from the conventional unconditional
expected values. Thus, information on potential worst-case outcomes can be quantified for decision makers. Using PMRM, we
evaluated the cost-benefit relationship of implementing several postulated risk management alternatives intended to mitigate
the expected and conditional cancer risk. Our results emphasize the importance of hydrogeologic models in risk assessment,
but also illustrate the importance of integrating environmental and toxicological uncertainty. When coupled with the PMRM,
models integrating uncertainty in transport, exposure, and potency constitute an effective risk assessment tool for use within
a risk-based corrective action (RBCA) framework.
6.
7.
Anna Zacharioudaki Shunqi Pan Dave Simmonds Vanesa Magar Dominic E. Reeve 《Ocean Dynamics》2011,61(6):807-827
In this paper, we investigate changes in the wave climate of the west-European shelf seas under global warming scenarios.
In particular, climate change wind fields corresponding to the present (control) time-slice 1961–2000 and the future (scenario)
time-slice 2061–2100 are used to drive a wave generation model to produce equivalent control and scenario wave climate. Yearly
and seasonal statistics of the scenario wave climates are compared individually to the corresponding control wave climate
to identify relative changes of statistical significance between present and future extreme and prevailing wave heights. Using
global, regional and linked global–regional wind forcing over a set of nested computational domains, this paper further demonstrates
the sensitivity of the results to the resolution and coverage of the forcing. It suggests that the use of combined forcing
from linked global and regional climate models of typical resolution and coverage is a good option for the investigation of
relative wave changes in the region of interest of this study. Coarse resolution global forcing alone leads to very similar
results over regions that are highly exposed to the Atlantic Ocean. In contrast, fine resolution regional forcing alone is
shown to be insufficient for exploring wave climate changes over the western European waters because of its limited coverage.
Results obtained with the combined global–regional wind forcing showed some consistency between scenarios. In general, it
was shown that mean and extreme wave heights will increase in the future only in winter and only in the southwest of UK and
west of France, north of about 44–45° N. Otherwise, wave heights are projected to decrease, especially in summer. Nevertheless,
this decrease is dominated by local wind waves whilst swell is found to increase. Only in spring do both swell and local wind
waves decrease in average height.
8.
Dynamic characteristics of monthly rainfall in the Korean Peninsula under climate change
Min Soo Kyoung Hung Soo Kim Bellie Sivakumar Vijay P. Singh Kyung Soo Ahn 《Stochastic Environmental Research and Risk Assessment (SERRA)》2011,25(4):613-625
Global climate change is one of the most serious issues we are facing today. While its exact impacts on our water resources
are hard to predict, there is a general consensus among scientists that it will result in more frequent and more severe hydrologic
extremes (e.g. floods, droughts). Since rainfall is the primary input for hydrologic and water resource studies, assessment
of the effects of climate change on rainfall is essential for devising proper short-term emergency measures as well as long-term
management strategies. This is particularly the case for a region like the Korean Peninsula, which is susceptible to both
floods (because of its mountainous terrain and frequent intense rainfalls during the short rainy season) and droughts (because
of its smaller area, long non-rainy season, and lack of storage facilities). In view of this, an attempt is made in the present
study to investigate the potential impacts of climate change on rainfall in the Korean Peninsula. More specifically, the dynamics
of ‘present rainfall’ and ‘future rainfall’ at the Seoul meteorological station in the Han River basin are examined and compared;
monthly scale is considered in both cases. As for ‘present rainfall,’ two different data sets are used: (1) observed rainfall
for the period 1971–1999; and (2) rainfall for the period 1951–1999 obtained through downscaling of coarse-scale climate outputs
produced by the Bjerknes Center for Climate Research-Bergen Climate Model Version 2 (BCCR-BCM2.0) climate model with the Intergovernmental
Panel on Climate Change Special Report on Emission Scenarios (IPCC SRES) 20th Century Climate in Coupled Models (20C3M) scenario.
The ‘future rainfall’ (2000–2099) is obtained through downscaling of climate outputs projected by the BCCR-BCM2.0 with the
A2 emission scenario. For downscaling of coarse-scale climate outputs to basin-scale rainfall, a K-nearest neighbor (K-NN) technique is used. Examination of the nature of rainfall dynamics is made through application of four methods: autocorrelation
function, phase space reconstruction, correlation dimension, and close returns plot. The results are somewhat mixed, depending
upon the method, as to whether the rainfall dynamics are chaotic or stochastic; however, the dynamics of the future rainfall
seem more on the chaotic side than on the stochastic side, and more so when compared to those of the present rainfall.
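The K-NN downscaling step can be sketched as follows, assuming an invented paired record of coarse predictors and station rainfall; the rank-based neighbour weights are one common choice in K-NN resampling, not necessarily the one used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical paired record: coarse-scale climate predictor values
# and the local (station) monthly rainfall observed alongside them.
coarse_hist = rng.normal(size=(300, 3))          # 300 months, 3 predictors
local_hist = 50 + 30 * coarse_hist[:, 0] + rng.normal(0, 5, 300)
local_hist = np.clip(local_hist, 0, None)        # rainfall is non-negative

def knn_downscale(coarse_new, k=5, rng=rng):
    """Resample local rainfall from the k nearest historical analogues."""
    d = np.linalg.norm(coarse_hist - coarse_new, axis=1)
    nearest = np.argsort(d)[:k]
    # Weight neighbours inversely by rank, a common K-NN resampling choice
    w = 1.0 / np.arange(1, k + 1)
    return rng.choice(local_hist[nearest], p=w / w.sum())

sample = knn_downscale(np.array([1.0, 0.0, 0.0]))
print(sample)
```

Applied month by month to the BCCR-BCM2.0 outputs, this kind of resampler yields a basin-scale rainfall series whose values are drawn from the observed record.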
9.
C. Onof R. E. Chandler A. Kakou P. Northrop H. S. Wheater V. Isham 《Stochastic Environmental Research and Risk Assessment (SERRA)》2000,14(6):384-411
Over a decade ago, point rainfall models based upon Poisson cluster processes were developed by Rodriguez-Iturbe, Cox and
Isham. Two types of point process models were envisaged: the Bartlett–Lewis and the Neyman–Scott rectangular pulse models.
Recent developments are reviewed here, including a number of empirical studies. The parameter estimation problem is addressed
for both types of Poisson-cluster based models. The multiplicity of parameters which can be obtained for a given data set
using the method of moments is illustrated, and two approaches to finding a best parameter set are presented. The use of
a proper fitting method allows the problems encountered in regionalisation to be dealt with adequately. Applications
of the point process model to flood design are discussed and finally, results for a model with dependent cell depth and duration
are given. Taking into account the spatial features of rainfall, three multi-site models are presented and compared. They
are all governed by a master Poisson process of storm origins and have a number of cell origins associated with each storm
origin. The three models differ as to the type of dependence structure between the cell characteristics at different sites.
Analytical properties are presented for these models and their ability to represent the spatial structure of a set of raingauge
data in the South-West of England is examined. Continuous spatial-temporal models are currently being developed and results
are presented for a model in which storm centres arrive in a homogeneous Poisson process in space-time, and cells follow them
in time according to a Bartlett–Lewis type cluster. Examples of simulations using this model are shown and compared with radar
data from the South-West of England. The paper concludes with a summary of the main areas in which further research is required.
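A Neyman–Scott rectangular pulse simulation of the kind reviewed here can be sketched in a few lines; the parameter values below are illustrative, not fitted to the English raingauge data discussed in the abstract.

```python
import numpy as np

rng = np.random.default_rng(42)

# Minimal Neyman-Scott rectangular pulse simulation (illustrative
# parameter values, not fitted to any data set).
T = 1000.0            # hours simulated
lam = 0.02            # storm origin rate (per hour)
mean_cells = 5        # mean cells per storm (Poisson)
beta = 0.5            # cell displacement rate (per hour, exponential)
eta = 1.0             # cell duration rate (per hour, exponential)
mu_x = 2.0            # mean cell intensity (mm/h, exponential)

storms = rng.uniform(0, T, rng.poisson(lam * T))
pulses = []           # (start, end, intensity)
for s in storms:
    for _ in range(rng.poisson(mean_cells)):
        start = s + rng.exponential(1 / beta)
        pulses.append((start, start + rng.exponential(1 / eta),
                       rng.exponential(mu_x)))

# Sample the aggregate intensity at hourly time points
t = np.arange(0, T, 1.0)
rain = np.zeros_like(t)
for start, end, x in pulses:
    rain[(t >= start) & (t < end)] += x
print(rain.mean())
```

The moment-matching fitting problem discussed in the abstract amounts to choosing lam, mean_cells, beta, eta and mu_x so that simulated statistics like this mean match the observed record at several aggregation levels.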
10.
Quantitative microbial risk assessment: uncertainty and measures of central tendency for skewed distributions
K. K. Benke A. J. Hamilton 《Stochastic Environmental Research and Risk Assessment (SERRA)》2008,22(4):533-539
In the past, arithmetic and geometric means have both been used to characterise pathogen densities in samples used for microbial
risk assessment models. The calculation of total (annual) risk is based on cumulative independent (daily) exposures and the
use of an exponential dose–response model, such as that used for exposure to Giardia or Cryptosporidium. Mathematical analysis suggests that the arithmetic mean is the appropriate measure of central tendency for microbial concentration
with respect to repeated samples of daily exposure in risk assessment. This is despite frequent characterisation of microbial
density by the geometric mean, since the microbial distributions may be log-normal or otherwise skewed. Mathematical derivation
supporting the use of the arithmetic mean has been based on deterministic analysis, prior assumptions and definitions, the
use of point-estimates of probability, and has not included from the outset the influence of an actual distribution for microbial
densities. We address these issues by experiments using two real-world pathogen datasets, together with Monte Carlo simulation,
and it is revealed that the arithmetic mean also holds in the case of a daily dose with a finite distribution in microbial
density, even when the distribution is very highly skewed, as often occurs in environmental samples. Further, for simplicity,
in many risk assessment models the daily infection risk is assumed to be the same for each day of the year and is represented
by a single value, which is then used in the calculation of p_Σ, a numerical estimate of the annual risk P_Σ. We highlight
the fact that p_Σ is simply a function of the geometric mean of the daily complementary risk probabilities (although it is
sometimes approximated by the arithmetic mean of daily risk in the low-dose case). Finally, the risk estimate is an imprecise probability with no
indication of error and we investigate and clarify the distinction between risk and uncertainty assessment with respect to
the predictive model used for total risk assessment.
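The relation between the daily risks, the geometric mean of the complementary probabilities, and the arithmetic-mean low-dose approximation can be made concrete with a short sketch; the lognormal daily-risk values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Daily infection risks over a year (illustrative skewed values)
p_daily = rng.lognormal(mean=-9, sigma=1.0, size=365)

# Exact annual risk: complement of surviving all 365 daily exposures
p_annual = 1 - np.prod(1 - p_daily)

# Equivalent form via the geometric mean g of the daily
# complementary probabilities: P = 1 - g**365
g = np.prod(1 - p_daily) ** (1 / 365)
p_annual_gm = 1 - g ** 365

# Low-dose approximation via the arithmetic mean of daily risks
p_annual_am = 365 * p_daily.mean()

print(p_annual, p_annual_gm, p_annual_am)
```

The union bound guarantees the arithmetic-mean approximation always overstates the exact annual risk slightly, with the two agreeing closely when daily risks are small.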
11.
A significant number of volcano-tectonic (VT) earthquake swarms, some of which are accompanied by ground deformation and/or
volcanic gas emissions, do not culminate in an eruption. These swarms are often thought to represent stalled intrusions of
magma into the mid- or shallow-level crust. Real-time assessment of the likelihood that a VT swarm will culminate in an eruption
is one of the key challenges of volcano monitoring, and retrospective analysis of non-eruptive swarms provides an important
framework for future assessments. Here we explore models for a non-eruptive VT earthquake swarm located beneath Iliamna Volcano,
Alaska, in May 1996–June 1997 through calculation and inversion of fault-plane solutions for swarm and background periods,
and through Coulomb stress modeling of faulting types and hypocenter locations observed during the swarm. Through a comparison
of models of deep and shallow intrusions to swarm observations, we aim to test the hypothesis that the 1996–97 swarm represented
a shallow intrusion, or “failed” eruption. Observations of the 1996–97 swarm are found to be consistent with several scenarios
including both shallow and deep intrusion, most likely involving a relatively small volume of intruded magma and/or a low
degree of magma pressurization corresponding to a relatively low likelihood of eruption.
12.
Shengpan Lin Changwei Jing Neil A. Coles Vincent Chaplot Nathan J. Moore Jiaping Wu 《Stochastic Environmental Research and Risk Assessment (SERRA)》2013,27(1):209-221
DEMs, as important input parameters of environmental risk assessment models, are notable sources of uncertainty. To illustrate the effect of DEM grid size and source on model outputs, a widely used watershed management model, the Soil and Water Assessment Tool (SWAT), was applied with two newly available DEMs as inputs (i.e. ASTER GDEM Version 1 and SRTM Version 4.1). A DEM derived from a 1:10,000 high-resolution digital line graph (DLG) was used as a baseline for comparisons. Eleven resample resolutions, from 5 to 140 m, were considered to evaluate the impact of DEM resolution on SWAT outputs. Results from a case study in south-eastern China indicate that the SWAT predictions of total phosphorus and total nitrogen decreased substantially with coarser resample resolution. A slightly decreasing trend was found in the SWAT-predicted sediment when DEMs were resampled to coarser resolutions. The SWAT-predicted runoff was not sensitive to resample resolution. For the different data sources, ASTER GDEM did not perform better than SRTM in SWAT simulations even though it has a smaller grid size and higher vertical accuracy. The predicted outputs based on ASTER GDEM and SRTM were similar, and much lower than those based on the DLG. This study presents potential uncertainties introduced by DEM resolutions and data sources, and recommends strategies for choosing DEMs based on research objectives and maximum acceptable errors.
13.
Selection and ranking of ground motion models for seismic hazard analysis in the Pyrenees
Stéphane Drouet Frank Scherbaum Fabrice Cotton Annie Souriau 《Journal of Seismology》2007,11(1):87-100
The issue addressed in this paper is the objective selection of appropriate ground motion models for seismic hazard assessment
in the Pyrenees. The method of Scherbaum et al. (2004a) is applied in order to rank eight published ground motion models relevant to intraplate or to low deformation rate contexts.
This method is based on a transparent and data-driven process which quantifies the model fit and also measures how well the
underlying model assumptions are met. The method is applied to 15 accelerometric records obtained in the Pyrenees for events
of local magnitude between 4.8 and 5.1, corresponding to moment magnitudes ranging from 3.7 to 3.9. Only stations at rock
sites are considered. A total of 720 spectral amplitudes are used to rank the selected ground motion models. Some control
parameters of these models, such as magnitude and distance definitions, may vary from one model to the other. It is thus important
to correct the selected models for their difference with respect to the magnitude and distance definitions used for the Pyrenean
data. Our analysis shows that, with these corrections, some of the ground motion models successfully fit the data. These are
the Lussou et al. (2001) and the Berge-Thierry et al. (2003) models. According to the selected ground motion models, a possible scenario of a magnitude 6 event is proposed; it predicts
response spectral accelerations of 0.08–0.1 g at 1 Hz at a hypocentral distance of 10 km.
14.
This study assessed the utility of EUDEM, a recently released digital elevation model, to support flood inundation modelling. To this end, a comparison with other topographic data sources was performed (i.e. LIDAR, light detection and ranging; SRTM, Shuttle Radar Topography Mission) on a 98-km reach of the River Po, between Cremona and Borgoforte (Italy). This comparison was implemented using different model structures while explicitly accounting for uncertainty in model parameters and upstream boundary conditions. This approach facilitated a comprehensive assessment of the uncertainty associated with hydraulic modelling of floods. For this test site, our results showed that flood inundation models built on coarse-resolution data (EUDEM and SRTM) and a simple one-dimensional model structure performed well during model evaluation.
Editor Z.W. Kundzewicz; Associate editor S. Weijs
15.
H. S. Kim D. S. Kang J. H. Kim 《Stochastic Environmental Research and Risk Assessment (SERRA)》2003,17(1-2):104-115
The conventional nonparametric tests have been widely used in many fields for the residual analysis of models fitted to
observations. Recently, a technique called the BDS (Brock–Dechert–Scheinkman) statistic has been shown to be a powerful
tool for residual analysis, especially for nonlinear systems. The purpose of this study is to compare the power of the
nonparametric tests and the BDS statistic in the residual analysis of fitted models. This study evaluates stochastic models
for four monthly rainfall series in Korea through residual analysis using the conventional nonparametric tests and the BDS
statistic. We use SARIMA and AR error models to fit each rainfall series and perform the residual analysis using these test
techniques. As a result, we find that the BDS statistic is more suitable than the conventional nonparametric tests for the
residual analysis, and that the AR error model may be more appropriate than the SARIMA model for modeling monthly rainfalls.
This work was supported by grant No. R01-2001-000-00474-0 from the Basic Research Program of the Korea Science & Engineering
Foundation.
16.
D. Oettl R. A. Almbauer P. J. Sturm G. Pretterhofer 《Stochastic Environmental Research and Risk Assessment (SERRA)》2003,17(1-2):58-75
Although the strict legislation regarding vehicle emissions in Europe (EURO 4, EURO 5) will lead to a remarkable reduction
of emissions in the near future, traffic related air pollution still can be problematic due to a large increase of traffic
in certain areas. Many dispersion models for line-sources have been developed to assess the impact of traffic on the air pollution
levels near roads, which are in most cases based on the Gaussian equation. Previous studies gave evidence that such models
tend to overestimate concentrations in low wind speed conditions or when the wind direction is almost parallel to
the street orientation. This is of particular interest, since such conditions lead generally to the highest observed concentrations
in the vicinity of streets. As many air quality directives impose limits on high percentiles of concentrations, it is important
to have good estimates of these quantities in environmental assessment studies. The objective of this study is to evaluate
a methodology for the computation of especially those high percentiles required by e.g. the EU daughter directive 99/30/EC
(for instance the 99.8 percentile for NO2). The model used in this investigation is a Markov Chain – Monte Carlo model to predict pollutant concentrations, which performs
well in low wind conditions as is shown here. While usual Lagrangian models use deterministic time steps for the calculation
of the turbulent velocities, the model presented here uses random time steps from a Monte Carlo simulation and a Markov Chain
simulation for the sequence of the turbulent velocities. This results in a physically better approach when modelling the dispersion
in low wind speed conditions. When Lagrangian dispersion models are used for regulatory purposes, a meteorological pre-processor
is necessary to obtain required input quantities like Monin-Obukhov length and friction velocity from routinely observed data.
The model and the meteorological pre-processor applied here, were tested against field data taken near a major motorway south
of Vienna. The methodology used is based on input parameters, which are also available in usual environmental assessment studies.
Results reveal that the approach examined is useful and leads to reasonable concentration levels near motorways compared to
observations.
We wish to thank Andreas Schopper (Styrian Government) for providing air quality values, M. Kalina for providing the raw
data of the air quality stations near the motorway and J. Kukkonen for providing the road site data set from the Finnish Meteorological
Institute (FMI). The study was partly funded by the Austrian Science Fund under project P14075-TEC.
17.
R. L. Maddalena T. E. McKone D. P. H. Hsieh S. Geng 《Stochastic Environmental Research and Risk Assessment (SERRA)》2001,15(1):1-17
Monte Carlo analysis is a statistical simulation method that is often used to assess and quantify the outcome variance in
complex environmental fate and effects models. Total outcome variance of these models is a function of (1) the variance (uncertainty
and/or variability) associated with each model input and (2) the sensitivity of the model outcome to changes in the inputs.
To propagate variance through a model using Monte Carlo techniques, each variable must be assigned a probability distribution.
The validity of these distributions directly influences the accuracy and reliability of the model outcome. To efficiently
allocate resources for constructing distributions one should first identify the most influential set of variables in the model.
Although existing sensitivity and uncertainty analysis methods can provide a relative ranking of the importance of model inputs,
they fail to identify the minimum set of stochastic inputs necessary to sufficiently characterize the outcome variance. In
this paper, we describe and demonstrate a novel sensitivity/uncertainty analysis method for assessing the importance of each
variable in a multimedia environmental fate model. Our analyses show that for a given scenario, a relatively small number
of input variables influence the central tendency of the model and an even smaller set determines the spread of the outcome
distribution. For each input, the level of influence depends on the scenario under consideration. This information is useful
for developing site specific models and improving our understanding of the processes that have the greatest influence on the
variance in outcomes from multimedia models.
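A correlation-based importance screen of the kind described can be sketched as follows; the three-input stand-in model is invented, and Spearman rank correlation is used as one simple choice of importance measure, not necessarily the authors' method.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 20000

# Hypothetical multimedia-model stand-in: the outcome depends strongly on
# x1, weakly on x2, and hardly at all on x3.
x1 = rng.lognormal(0.0, 0.5, n)
x2 = rng.normal(1.0, 0.2, n)
x3 = rng.uniform(0.0, 1.0, n)
y = x1 ** 2 * x2 + 0.01 * x3

# Rank correlation of each input with the outcome as an importance screen
importance = {name: abs(stats.spearmanr(x, y)[0])
              for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]}
ranking = sorted(importance, key=importance.get, reverse=True)
print(ranking)
```

A screen like this identifies the small set of inputs worth careful distribution construction, which is the resource-allocation point the abstract makes.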
18.
Global elevation vibration and seasonal changes derived by the analysis of GPS height
Nearly 10 years have elapsed since the GPS observation technique was applied in geophysical studies. The International GPS Service for Geodynamics (IGS), which came into operation in 1994, distributes precise GPS satellite ephemerides, Earth rotation parameters, international terrestrial refe…
19.
Global sensitivity analysis for a numerical model of radionuclide migration from the RRC “Kurchatov Institute” radwaste disposal site
E. Volkova B. Iooss F. Van Dorpe 《Stochastic Environmental Research and Risk Assessment (SERRA)》2008,22(1):17-31
Today, in different countries, there exist sites with contaminated groundwater formed as a result of inappropriate handling
or disposal of hazardous materials or wastes. Numerical modeling of such sites is an important tool for a correct prediction
of contamination plume spreading and an assessment of environmental risks associated with the site. Many uncertainties are
associated with a part of the parameters and the initial conditions of such environmental numerical models. Statistical techniques
are useful to deal with these uncertainties. This paper describes the methods of uncertainty propagation and global sensitivity
analysis that are applied to a numerical model of radionuclide migration in a sandy aquifer in the area of the RRC “Kurchatov
Institute” radwaste disposal site in Moscow, Russia. We consider 20 uncertain input parameters of the model and 20 output
variables (contaminant concentration in the observation wells predicted by the model for the end of 2010). Monte Carlo simulations
allow calculating uncertainty in the output values and analyzing the linearity and the monotony of the relations between input
and output variables. For the non-monotonic relations, sensitivity analyses are classically done with the Sobol sensitivity
indices. The originality of this study is the use of modern surrogate models (called response surfaces), the boosting regression
trees, constructed for each output variable, to calculate the Sobol indices by the Monte Carlo method. It is thus shown that
the most influential parameters of the model are distribution coefficients and infiltration rate in the zone of strong pipe
leaks on the site. Improvement of these parameters would considerably reduce the model prediction uncertainty.
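The surrogate-based Sobol computation can be sketched on a toy stand-in for the migration model; the three-parameter function is invented, sklearn's GradientBoostingRegressor stands in for the paper's boosting regression trees, and the Jansen pick-freeze estimator is one standard way to get first-order indices.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Toy stand-in for the migration model: parameter 0 (a "distribution
# coefficient") and parameter 1 (an "infiltration rate") dominate the
# output; parameter 2 is nearly inert.
def model(x):
    return np.exp(-x[:, 0]) * x[:, 1] + 0.05 * x[:, 2]

X = rng.uniform(0.0, 1.0, (2000, 3))
surrogate = GradientBoostingRegressor(random_state=0).fit(X, model(X))

# First-order Sobol index of input i via the Jansen pick-freeze estimator,
# evaluated on the cheap surrogate instead of the full model.
def sobol_first(i, n=20000):
    a, b = rng.uniform(0.0, 1.0, (2, n, 3))
    ab = a.copy()
    ab[:, i] = b[:, i]                      # A with column i taken from B
    yb, yab = surrogate.predict(b), surrogate.predict(ab)
    var = np.var(np.concatenate([yb, yab]))
    return 1.0 - 0.5 * np.mean((yb - yab) ** 2) / var

s = [sobol_first(i) for i in range(3)]
print([round(v, 2) for v in s])
```

Training once and then sampling the surrogate millions of times is what makes Monte Carlo estimation of Sobol indices affordable for expensive groundwater models.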
20.
L. M. Balakina 《Izvestiya Physics of the Solid Earth》2011,47(9):835-846
The paper addresses the interpretation of the location, type, and size of the source for the earthquake of March 11, 2011.
The source—a subvertical reverse fault trending in the azimuth of ∼25° along the island arc—is located in the middle part
of the Pacific slope of Honshu Island, between 38°–38.5°N and 35.5°N. The length of the source, about 350 km, approximately
corresponds to a magnitude ∼8.7 earthquake. In the north, the source is bounded by a sublatitudinal reverse fault, which generated
an earthquake with magnitude 7.2–7.5 in 1978. On this segment of the Pacific slope of Honshu Island, there are probably another
one or a few other large seismic sources, which are still latent. They are longitudinal reverse faults, which are comparable
in scale with the source of the March, 2011 earthquake. The recurrence period of the maximal earthquakes in such sources is
more than 1000 years.