20 similar documents found (search time: 15 ms)
1.
Hwong-wen Ma 《Stochastic Environmental Research and Risk Assessment (SERRA)》2000,14(3):195-206
The selection of optimal management strategies for environmental contaminants requires detailed information on the risks
imposed on populations. These risks are characterized by both inter-subject variability (different individuals having different
levels of risk) and by uncertainty (there is uncertainty about the risk associated with the Yth percentile of the variability distribution). In addition, there is uncertainty introduced by the inability to agree fully
on the appropriate decision criteria. This paper presents a methodology for incorporating uncertainty and variability into
a multi-medium, multi-pathway, multi-contaminant risk assessment, and for placing this assessment into an optimization framework
to identify optimal management strategies. The framework is applied to a case study of a sludge management system proposed
for North Carolina, and the impact of stochasticity on the selection of an optimal strategy is considered. Different sets of decision
criteria, reflecting different ways of treating stochasticity, are shown to lead to different selections of optimal management
strategies.
2.
I. D. Benekos C. A. Shoemaker J. R. Stedinger 《Stochastic Environmental Research and Risk Assessment (SERRA)》2007,21(4):375-390
Groundwater contamination risk assessment for health-threatening compounds should benefit from a stochastic environmental
risk assessment which considers the effects of biological, chemical, human behavioral, and physiological processes that involve
elements of biotic and abiotic aquifer uncertainty, and human population variability. This paper couples a complex model of
chemical degradation and transformation with movement in an aquifer undergoing bioremediation to generate a health risk analysis
for different population cohorts in the community. A two-stage Monte Carlo simulation has separate stages for population variability
and aquifer uncertainty yielding a computationally efficient and conceptually attractive algorithm. A hypothetical example
illustrates how risk variance analysis can be conducted to determine the distribution of risk, and the relative impact of
uncertainty and variability in different sets of parameters upon the variation of risk values for adults, adolescents, and
children. The groundwater example considers a community water supply contaminated with chlorinated ethenes. Biodegradation
pathways are enhanced by addition of butyrate. The results showed that the contribution of uncertainty to the risk variance
is comparable to that of variability. Among the uncertain parameters considered, transmissivity accounted for the major part
of the output variance. Children were the most susceptible and vulnerable population cohort.
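The two-stage Monte Carlo design described in this abstract (an outer loop over uncertain aquifer parameters, an inner loop over inter-individual variability) can be sketched as follows; the distributions, parameter names, and the toy dose equation are illustrative assumptions, not the authors' model:

```python
import random

random.seed(0)

def two_stage_mc(n_uncertainty=200, n_variability=500):
    """Nested Monte Carlo: outer loop = epistemic (aquifer) uncertainty,
    inner loop = inter-individual (population) variability."""
    percentile_95 = []
    for _ in range(n_uncertainty):
        # Outer stage: one realization of the uncertain parameters
        # (hypothetical lognormal transmissivity and source concentration).
        transmissivity = random.lognormvariate(0.0, 0.5)
        concentration = random.lognormvariate(-2.0, 0.3)
        risks = []
        for _ in range(n_variability):
            # Inner stage: one individual's exposure factors
            # (hypothetical intake rate and exposure duration).
            intake = random.lognormvariate(-3.5, 0.4)   # L/kg/day
            duration = random.uniform(1, 30)            # years
            dose = concentration * intake * duration / transmissivity
            risks.append(dose * 1e-3)  # toy slope factor
        risks.sort()
        percentile_95.append(risks[int(0.95 * n_variability)])
    # Distribution of the 95th-percentile individual risk across
    # uncertainty realizations.
    return percentile_95

result = two_stage_mc()
print(len(result))
```

Percentiles of the inner-loop risk distribution (variability) are collected per outer-loop draw, so their spread across outer draws reflects uncertainty alone.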
3.
2D Monte Carlo versus 2D Fuzzy Monte Carlo health risk assessment
E. Kentel M. M. Aral 《Stochastic Environmental Research and Risk Assessment (SERRA)》2005,19(1):86-96
Risk estimates can be calculated using crisp estimates of the exposure variables (i.e., contaminant concentration, contact rate, exposure frequency and duration, body weight, and averaging time). However, aggregate and cumulative exposure studies require a better understanding of exposure variables and uncertainty and variability associated with them. Probabilistic risk assessment (PRA) studies use probability distributions for one or more variables of the risk equation in order to quantitatively characterize variability and uncertainty. Two-dimensional Monte Carlo Analysis (2D MCA) is one of the advanced modeling approaches that may be used to conduct PRA studies. In this analysis the variables of the risk equation along with the parameters of these variables (for example mean and standard deviation for a normal distribution) are described in terms of probability density functions (PDFs). A variable described in this way is called a second order random variable. Significant data or considerable insight to uncertainty associated with these variables is necessary to develop the appropriate PDFs for these random parameters. Typically, available data and accuracy and reliability of such data are not sufficient for conducting a reliable 2D MCA. Thus, other theories and computational methods that propagate uncertainty and variability in exposure and health risk assessment are needed. One such theory is possibility analysis based on fuzzy set theory, which allows the utilization of incomplete information (incomplete information includes vague and imprecise information that is not sufficient to generate probability distributions for the parameters of the random variables of the risk equation) together with expert judgment. In this paper, as an alternative to 2D MCA, we are proposing a 2D Fuzzy Monte Carlo Analysis (2D FMCA) to overcome this difficulty. 
In this approach, instead of describing the parameters of PDFs used in defining the variables of the risk equation as random variables, we describe them as fuzzy numbers. This approach introduces new concepts and risk characterization methods. In this paper we provide a comparison of these two approaches relative to their computational requirements, data requirements and availability. For a hypothetical case, we also provide a comparative interpretation of the results generated.
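A minimal sketch of the fuzzy side of such a 2D FMCA, assuming a triangular fuzzy number for one distribution parameter and interval propagation at a few α-cut levels (all parameter values below are hypothetical):

```python
import random

random.seed(1)

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at membership level alpha."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def mc_risk(log_mean, n=1000):
    """Inner Monte Carlo loop over variability, with the distribution's
    log-mean parameter held fixed (toy risk model)."""
    return sum(random.lognormvariate(log_mean, 0.25) for _ in range(n)) / n * 1e-4

# The (hypothetical) log-mean of the concentration distribution is known
# only vaguely, so it is described as a fuzzy number rather than a PDF.
fuzzy_log_mean = (-1.5, -1.0, -0.5)

for alpha in (0.0, 0.5, 1.0):
    lo, hi = alpha_cut(fuzzy_log_mean, alpha)
    # Propagating each interval endpoint through the inner Monte Carlo loop
    # yields an interval of risk estimates at this membership level.
    print(alpha, mc_risk(lo), mc_risk(hi))
```

At each membership level α the fuzzy parameter reduces to an interval, so the result is a nested family of risk intervals rather than a single second-order distribution.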
4.
R. L. Maddalena T. E. McKone D. P. H. Hsieh S. Geng 《Stochastic Environmental Research and Risk Assessment (SERRA)》2001,15(1):1-17
Monte Carlo analysis is a statistical simulation method that is often used to assess and quantify the outcome variance in
complex environmental fate and effects models. Total outcome variance of these models is a function of (1) the variance (uncertainty
and/or variability) associated with each model input and (2) the sensitivity of the model outcome to changes in the inputs.
To propagate variance through a model using Monte Carlo techniques, each variable must be assigned a probability distribution.
The validity of these distributions directly influences the accuracy and reliability of the model outcome. To efficiently
allocate resources for constructing distributions one should first identify the most influential set of variables in the model.
Although existing sensitivity and uncertainty analysis methods can provide a relative ranking of the importance of model inputs,
they fail to identify the minimum set of stochastic inputs necessary to sufficiently characterize the outcome variance. In
this paper, we describe and demonstrate a novel sensitivity/uncertainty analysis method for assessing the importance of each
variable in a multimedia environmental fate model. Our analyses show that for a given scenario, a relatively small number
of input variables influence the central tendency of the model and an even smaller set determines the spread of the outcome
distribution. For each input, the level of influence depends on the scenario under consideration. This information is useful
for developing site specific models and improving our understanding of the processes that have the greatest influence on the
variance in outcomes from multimedia models.
5.
Z. J. Kabala H. K. El-Sayegh H. P. Gavin 《Stochastic Environmental Research and Risk Assessment (SERRA)》2002,16(6):399-424
Logarithmic sensitivities and plausible relative errors are studied in a simple no-crossflow model of a transient flowmeter
test (TFMT). This model is identical to the model of a constant-rate pumping test conducted on a fully penetrating well with
wellbore storage, surrounded by a thick skin zone, and situated in a homogeneous confined aquifer. The sensitivities of wellbore
drawdown and wellface flowrate to aquifer and skin parameters are independent of the pumping rate. However, the plausible
relative errors in the aquifer and skin parameters estimated from drawdown and wellface flowrate data can be proportionally
decreased by increasing the pumping rate. The plausible relative errors vary by many orders of magnitude from the beginning
of the TFMT. The practically important flowrate and drawdown measurements in this test, for which the plausible relative errors
vary by less than one order of magnitude from the minimum plausible relative errors, can begin approximately when the dimensionless
wellface flowrate exceeds q_D = q/Q ≈ 0.4. During most of this stage of the test, the plausible relative errors in aquifer
hydraulic conductivity (K_a) are generally an order of magnitude smaller than those in aquifer specific storativity. The
plausible relative errors in the skin hydraulic conductivity (K_s) are generally larger than the plausible relative errors
in the aquifer specific storativity when the thick skin is normal (K_s > K_a) and smaller when the thick skin is damaged
(K_s < K_a). The specific storativity of the skin zone would be so biased that one should not even attempt to estimate it from the TFMT.
We acknowledge Wiebe H. van der Molen for recommending the De Hoog algorithm and sharing his code. This research was partially
supported by the US Geological Survey, USGS Agreement #1434-HQ-96-GR-02689 and North Carolina Water Resources Research Institute,
WRRI Project #70165.
6.
Richard Dawson Jim Hall Paul Sayers Paul Bates Corina Rosu 《Stochastic Environmental Research and Risk Assessment (SERRA)》2005,19(6):388-402
A dike system of moderate size has a large number of potential system states, and the loading imposed on the system is inherently
random. If the system should fail, in one of its many potential failure modes, the topography of UK floodplains is usually
such that hydrodynamic modelling of flood inundation is required to generate realistic estimates of flood depth and hence
damage. To do so for all possible failure states may require thousands of computationally expensive inundation simulations. A
risk-based sampling technique is proposed in order to reduce the computational resources required to estimate flood risk.
The approach is novel in that the loading and dike system states (obtained using a simplified reliability analysis) are sampled
according to the contribution that a given region of the space of basic variables makes to risk. The methodology is demonstrated
in a strategic flood risk assessment for the city of Burton-upon-Trent in the UK. 5,000 inundation model simulations were
run, although it was shown that the flood risk estimate converged adequately after approximately half this number. The case
study demonstrates that, amongst other factors, risk is a complex function of loadings, dike resistance, floodplain topography
and the spatial distribution of floodplain assets. The application of this approach allows flood risk managers to obtain an
improved understanding of the flooding system, its vulnerabilities and the most efficient means of allocating resource to
improve performance. It may also be used to test how the system may respond to future external perturbations.
7.
Can global sensitivity analysis steer the implementation of models for environmental assessments and decision-making?
S. Tarantola N. Giglioli J. Jesinghaus A. Saltelli 《Stochastic Environmental Research and Risk Assessment (SERRA)》2002,16(1):63-76
We illustrate a method of global sensitivity analysis and we test it on a preliminary case study in the field of environmental
assessment to quantify uncertainty importance in poorly-known model parameters and spatially referenced input data. The focus
of the paper is to show how the methodology provides guidance to improve the quality of environmental assessment practices
and decision support systems employed in environmental policy. Global sensitivity analysis, coupled with uncertainty analysis,
is a tool to assess the robustness of decisions, to understand whether the current state of knowledge on input data and parametric
uncertainties is sufficient to enable a decision to be taken. The methodology is applied to a preliminary case study, which
is based on a numerical model that employs GIS-based soil data and expert consultation to evaluate an index that joins environmental
and economic aspects of land depletion. The index is used as a yardstick by decision-makers involved in the planning of highways
to identify the route that minimises the overall impact.
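Variance-based global sensitivity analysis of the kind described above can be sketched with a toy model; the model, its inputs, and the binning estimator below are illustrative assumptions rather than the authors' implementation:

```python
import random
import statistics

random.seed(2)

def model(x1, x2, x3):
    # Toy stand-in for an environmental index: output dominated by x1.
    return 4 * x1 + x2 + 0.1 * x3

N = 2000
samples = [(random.random(), random.random(), random.random()) for _ in range(N)]
y = [model(*s) for s in samples]
var_y = statistics.pvariance(y)

def first_order_index(i, bins=20):
    """Estimate the first-order sensitivity index S_i = Var(E[Y|x_i]) / Var(Y)
    by conditioning on equal-width bins of x_i."""
    bucket = [[] for _ in range(bins)]
    for s, out in zip(samples, y):
        bucket[min(int(s[i] * bins), bins - 1)].append(out)
    cond_means = [statistics.mean(b) for b in bucket if b]
    return statistics.pvariance(cond_means) / var_y

indices = [first_order_index(i) for i in range(3)]
print(indices)
```

A decision is robust to an input when its index is small: refining knowledge of that input would barely change the output variance, which is the kind of guidance the abstract describes.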
8.
Rubin Y. Cushey M. A. Bellin A. 《Stochastic Environmental Research and Risk Assessment (SERRA)》1994,8(1):57-77
This paper presents the principles underlying a recently developed numerical technique for modeling transport in heterogeneous porous media. The method is then applied to derive the concentration mean and variance, the concentration CDF, exceedance probabilities and exposure time CDF, which are required by various regulatory agencies for risk and performance assessment calculations. The dependence of the various statistics on elapsed travel time, location in space, the dimension of the detection volume, natural variability and pore-scale dispersion is investigated and discussed.
9.
H. Held 《Stochastic Environmental Research and Risk Assessment (SERRA)》2003,17(1-2):20-41
A comparison is made between a circular and a more adequate spherical reaction-diffusion multi-media mass balance model.
This comparison adds new aspects to ongoing debates about more effective assessments of potentially harmful substances including
persistent organic pollutants (POPs). The circular model serves as a paradigm in investigations of persistence and spatial
range of non-polar chemicals. An analytic solution of the spherical model is presented. It is utilized in order to establish
circular spatial ranges as versatile approximations of their spherical counterparts in most cases. Deviations in the few
exceptions are shown to play a minor role compared with sensitivities to parameter uncertainties, which characterize
these exceptional cases as well. The sensitivities are fundamentally linked to the multi-scale nature of the underlying system.
Finally, the insensitivity of spatial ranges with respect to dimension – circle versus sphere – is further secured by extensive
studies of the role of the mode of entry for which a set of rules is established.
Present address: Potsdam Institute for Climate Impact Research P.O. Box 60 12 03, D-14412 Potsdam, Germany e-mail: held@pik-potsdam.de
The author would like to thank B. H. Hawkins, H. A. Schweers, and M. Ströbe for helpful comments.
10.
Programs for evaluating proposed discharges of dredged material into waters of the United States specify a tiered testing and evaluation protocol that includes performance of acute and chronic bioassays to assess toxicity of the dredged sediments. Although these evaluations reflect the toxicological risks associated with disposal activities to some degree, analysis activities are limited to the sediments of each dredging project separately. Cumulative risks to water column and benthic organisms at and near the designated disposal site are therefore difficult to assess. An alternate approach is to focus attention on the disposal site, with the goal of understanding more directly the risks of multiple disposal events to receiving ecosystems. Here we review current US toxicity testing and evaluation protocols, and describe an application of ecological risk assessment that allows consideration of the temporal and spatial components of risk to receiving aquatic ecosystems. When expanded to include other disposal options, this approach can provide the basis for holistic management of dredged material disposal.
11.
Produced water discharge accounts for the greater portion of wastes arising from offshore oil and gas production operations. Development and expansion of Canada’s offshore oil and gas reserves have led to concerns over the potential long-term impacts of produced water discharges to the ocean. To examine this emerging environmental issue at a regional scale, an integrated risk assessment approach was developed in this study based on the Princeton Ocean Model (POM), a random walk (RW) and Monte Carlo simulation. The use of water quality standards arrayed in a Monte Carlo design in the developed approach has served to reflect uncertainties and quantify environmental risks associated with produced water discharge. The model was validated against field data from a platform operating off Canada’s east coast, demonstrating its usefulness in supporting effective management of future produced water discharge.
12.
Lucio Lirer Rosalba Munno Immacolata Postiglione Anna Vinci Livia Vitelli 《Bulletin of Volcanology》1997,59(2):112-124
Due to the lack of an effective policy of planning and prevention, over the past decades the area around Mt. Vesuvio has
undergone a steady increase in population and uncontrolled housing development. Consequently, it has become one of the most
hazardous volcanic areas in the world. In order to mitigate the damage that the impact of an explosive event would cause in
the area, the Department of Civil Defense has worked out an Emergency Management Plan using the A.D. 1631 subplinian eruption
as the most probable short-term event. However, from 25 000 years B.P. to present, the activity of the Somma-Vesuvio volcano
has shown a sequence of eight eruptive cycles, which always began with a strong plinian eruption. In this paper we utilize
the A.D. 79 eruption as an example of a potential large explosive eruption that might occur again at Vesuvio. A detailed tephrostratigraphic
analysis of the eruption products was combined with multivariate statistical analysis. This analysis proved useful for identifying
marker layers in the sequences, thus allowing the recognition of some major phases of synchronous deposition and hence the
definition of the chronological and spatial evolution of the eruption. By combining this reconstruction with land-use maps,
a scenario is proposed with time intervals in the eruptive sequence similar to those reported in Pliny's letter. Thus, it
was calculated that, after 7 h from the start of the eruption, a total area of approximately 300 km² would be covered with the eruption products. In the following 11 h, a total area of approximately 500 km² would be involved. The third and last phase of deposition would not cause significant variation in the total area involved,
but it would bring about an increase in the thickness of the pyroclastic deposits in the perivolcanic area.
Received: 30 November 1996 / Accepted: 29 May 1997
13.
D. Oettl R. A. Almbauer P. J. Sturm G. Pretterhofer 《Stochastic Environmental Research and Risk Assessment (SERRA)》2003,17(1-2):58-75
Although the strict legislation regarding vehicle emissions in Europe (EURO 4, EURO 5) will lead to a remarkable reduction
of emissions in the near future, traffic related air pollution still can be problematic due to a large increase of traffic
in certain areas. Many dispersion models for line-sources have been developed to assess the impact of traffic on the air pollution
levels near roads, which are in most cases based on the Gaussian equation. Previous studies gave evidence that such models
tend to overestimate concentrations in low wind speed conditions or when the wind direction is almost parallel to
the street orientation. This is of particular interest, since such conditions lead generally to the highest observed concentrations
in the vicinity of streets. As many air quality directives impose limits on high percentiles of concentrations, it is important
to have good estimates of these quantities in environmental assessment studies. The objective of this study is to evaluate
a methodology for the computation of especially those high percentiles required by e.g. the EU daughter directive 99/30/EC
(for instance the 99.8 percentile for NO₂). The model used in this investigation is a Markov Chain – Monte Carlo model to predict pollutant concentrations, which performs
well in low wind conditions as is shown here. While usual Lagrangian models use deterministic time steps for the calculation
of the turbulent velocities, the model presented here uses random time steps from a Monte Carlo simulation and a Markov Chain
simulation for the sequence of the turbulent velocities. This results in a physically better approach when modelling the dispersion
in low wind speed conditions. When Lagrangian dispersion models are used for regulatory purposes, a meteorological pre-processor
is necessary to obtain required input quantities like Monin-Obukhov length and friction velocity from routinely observed data.
The model and the meteorological pre-processor applied here were tested against field data taken near a major motorway south
of Vienna. The methodology used is based on input parameters, which are also available in usual environmental assessment studies.
Results reveal that the approach examined is useful and leads to reasonable concentration levels near motorways compared to
observations.
We wish to thank Andreas Schopper (Styrian Government) for providing air quality values, M. Kalina for providing the raw
data of the air quality stations near the motorway, and J. Kukkonen for providing the road site data set from the Finnish Meteorological
Institute (FMI). The study was partly funded by the Austrian Science Fund under project P14075-TEC.
14.
Elcin Kentel Mustafa M. Aral 《Stochastic Environmental Research and Risk Assessment (SERRA)》2007,21(4):405-417
In risk assessment studies it is important to determine how uncertain and imprecise knowledge should be included into the
simulation and assessment models. Thus, proper evaluation of uncertainties has become a major concern in environmental and
health risk assessment studies. Previously, researchers have used probability theory, more commonly Monte Carlo analysis,
to incorporate uncertainty analysis in health risk assessment studies. However, in conducting probabilistic health risk assessment,
risk analysts often suffer from a lack of data or the presence of imperfect or incomplete knowledge about the process modeled
and its parameters. Fuzzy set theory is a tool that has been used to propagate imperfect and incomplete information
in health risk assessment studies. Such analyses result in fuzzy risks, which are associated with membership functions. Since
possibilistic health risk assessment studies are relatively new, standard procedures for decision-making about the acceptability
of the resulting fuzzy risk with respect to a crisp standard set by the regulatory agency are not fully established. In this
paper, we provide a review of several available approaches which may be used in decision-making. These approaches involve
defuzzification techniques, the possibility and the necessity measures. In this study, we also propose a new measure, the
risk tolerance measure, which can be used in decision making. The risk tolerance measure provides an effective metric for evaluating the acceptability
of a fuzzy risk with respect to a crisp compliance criterion. Fuzzy risks with different membership functions are evaluated
with respect to a crisp compliance criterion by using the possibility, the necessity, and the risk tolerance measures and
the results are discussed comparatively.
15.
S. De Iaco M. Palma 《Stochastic Environmental Research and Risk Assessment (SERRA)》2002,16(5):333-341
In geostatistics, stochastic simulation is often used either as an improved interpolation algorithm or as a measure of the
spatial uncertainty. Hence, it is crucial to assess how fast realization-based statistics converge towards model-based statistics
(i.e. histogram, variogram) since in theory such a match is guaranteed only on average over a number of realizations. This
can be strongly affected by the random number generator being used. Moreover, the usual assumption of independence among simulated
realizations of a random process may be affected by the random number generator used. Simulation results, obtained by using
three different random number generators implemented in Geostatistical Software Library (GSLib), are compared. Some practical
aspects are pointed out and some suggestions are given to users of the unconditional LU simulation method.
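The dependence of realization statistics on the random number generator can be illustrated with a minimal experiment; the generators and the normal-deviate construction below are illustrative assumptions (GSLib's actual generators are not reproduced here):

```python
import random
import statistics

class LCG:
    """Minimal linear congruential generator, an illustrative stand-in
    for an alternative (lower-quality) generator."""
    def __init__(self, seed):
        self.state = seed
    def random(self):
        self.state = (1103515245 * self.state + 12345) % (2 ** 31)
        return self.state / 2 ** 31

def realization_stats(rng, n=5000):
    """Sample mean and variance of one realization of approximately
    standard normal deviates built from the generator's uniforms."""
    # Sum of 12 uniforms minus 6 is approximately N(0, 1) (CLT shortcut).
    z = [sum(rng.random() for _ in range(12)) - 6.0 for _ in range(n)]
    return statistics.mean(z), statistics.pvariance(z)

# Compare how closely each generator's realization statistics match the
# model statistics (mean 0, variance 1) at the same sample size.
m_mt, v_mt = realization_stats(random.Random(7))
m_lcg, v_lcg = realization_stats(LCG(7))
print((m_mt, v_mt), (m_lcg, v_lcg))
```

Repeating this over many seeds shows how fast realization-based statistics converge toward the model-based values, which is the convergence question the abstract raises.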
16.
A. A. Batabyal H. Beladi 《Stochastic Environmental Research and Risk Assessment (SERRA)》2002,16(5):325-332
This paper addresses the following hitherto unstudied question in renewable resource management: How should a resource manager
set the temporal control optimally for renewable resources such as rangelands and fisheries that are managed with spatial
and temporal controls? We use a dynamic and stochastic framework to first derive the resource manager's long run average net
cost function. We then demonstrate how the temporal control can be chosen to minimize this objective function.
We thank George Christakos and two anonymous referees for their comments on a previous version of this paper. Batabyal acknowledges
financial support from the Gosnell endowment at RIT. The usual disclaimer applies.
17.
J. Kros J. P. Mol-Dijkstra E. J. Pebesma 《Stochastic Environmental Research and Risk Assessment (SERRA)》2002,16(4):279-306
The prediction error of a relatively simple soil acidification model (SMART2) was assessed before and after calibration,
focussing on the Al and NO₃ concentrations at the block scale. Although SMART2 was developed especially for application at a national to European scale,
it still runs at a point support. A 5×5 km² grid was used for application on the European scale. Block characteristic values were obtained simply by taking the median
value of the point support values within the corresponding grid cell. In order to increase confidence in model predictions
on large spatial scales, the model was calibrated and validated for the Netherlands, using a resolution that is feasible for
Europe as a whole. Because observations are available only at the point support, it was necessary to transfer them to the
block support of the model results. For this purpose, about 250 point observations of soil solution concentrations in forest
soils were upscaled to a 5×5 km² grid map, using multiple linear regression analysis combined with block kriging. The resulting map with upscaled observations
was used for both validation and calibration. A comparison of the map with model predictions using nominal parameter values
and the map with the upscaled observations showed that the model overestimated the Al and NO₃ concentrations. The nominal model results were still within the 95% confidence interval of the upscaled observations, but calibration
improved the model predictions and strongly reduced the model error. However, the model error after calibration remains rather
large.
18.
J. Mohapl 《Stochastic Environmental Research and Risk Assessment (SERRA)》2003,17(1-2):76-103
Averages of annual wet deposition data are often used as an indicator of acid amounts in the atmosphere. From the viewpoint
of statistics, the average is a meaningful estimator only for identically distributed data with specific types of probability
distribution. Wet deposition data usually carry seasonal trends. To learn about year-to-year concentration changes, a more
accurate summary information accommodating the trend is thus necessary. This paper suggests a quantity (index) describing
the annual wet deposition of SO₄ and formulas for its calculation. The formulas are in some sense optimal when the data can be considered independent,
lognormally distributed, and subject to a time-dependent trend. Generalization of these assumptions is also discussed. The index
is derived from specific features of the average applied to data with lognormal distribution. The theory is utilized to analyze
a set of SO₄ precipitation concentrations observed in the Ontario region.
This work was sponsored by the Atmospheric Environment Service in Toronto and the Canadian NSERC.
I want to thank the Ontario Ministry of the Environment for provision of the data used in this study and the anonymous referee
for a number of comments that resulted in improvement of the final presentation.
19.
K. K. Benke A. J. Hamilton 《Stochastic Environmental Research and Risk Assessment (SERRA)》2008,22(4):533-539
In the past, arithmetic and geometric means have both been used to characterise pathogen densities in samples used for microbial
risk assessment models. The calculation of total (annual) risk is based on cumulative independent (daily) exposures and the
use of an exponential dose–response model, such as that used for exposure to Giardia or Cryptosporidium. Mathematical analysis suggests that the arithmetic mean is the appropriate measure of central tendency for microbial concentration
with respect to repeated samples of daily exposure in risk assessment. This is despite frequent characterisation of microbial
density by the geometric mean, since microbial distributions may be lognormal or skewed in nature. Mathematical derivation
supporting the use of the arithmetic mean has been based on deterministic analysis, prior assumptions and definitions, the
use of point-estimates of probability, and has not included from the outset the influence of an actual distribution for microbial
densities. We address these issues by experiments using two real-world pathogen datasets, together with Monte Carlo simulation,
and it is revealed that the arithmetic mean also holds in the case of a daily dose with a finite distribution in microbial
density, even when the distribution is very highly-skewed, as often occurs in environmental samples. Further, for simplicity,
in many risk assessment models, the daily infection risk is assumed to be the same for each day of the year and is represented
by a single value, which is then used in the calculation of p_Σ, a numerical estimate of the annual risk P_Σ, and we highlight the fact that p_Σ is simply a function of the geometric mean of the daily complementary risk probabilities (although it is sometimes approximated
by the arithmetic mean of daily risk in the low-dose case). Finally, the risk estimate is an imprecise probability with no
indication of error and we investigate and clarify the distinction between risk and uncertainty assessment with respect to
the predictive model used for total risk assessment.
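The relation between daily and annual risk discussed in this abstract can be sketched numerically; the dose distribution and the dose-response parameter below are hypothetical:

```python
import math
import random

random.seed(3)

# Hypothetical, highly skewed daily pathogen doses converted to daily
# infection risks via an exponential dose-response model p = 1 - exp(-r*dose)
# (both r and the dose distribution are illustrative assumptions).
r = 0.001
doses = [random.lognormvariate(-1.0, 1.5) for _ in range(365)]
daily_p = [1 - math.exp(-r * d) for d in doses]

# Exact annual risk over independent daily exposures: P = 1 - prod(1 - p_k).
annual_exact = 1 - math.prod(1 - p for p in daily_p)

# Equivalent form through the geometric mean of the complementary daily risks.
geo_comp = math.exp(sum(math.log(1 - p) for p in daily_p) / 365)
annual_geo = 1 - geo_comp ** 365

# Low-dose approximation: 365 times the arithmetic-mean daily risk.
annual_approx = 365 * (sum(daily_p) / 365)

print(annual_exact, annual_geo, annual_approx)
```

The exact annual risk depends on the geometric mean of the complementary daily risks, while the low-dose approximation uses the arithmetic-mean daily risk and always bounds the exact value from above.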
20.
José Fernández José M. Carrasco John B. Rundle Vicente Araña 《Bulletin of Volcanology》1999,60(7):534-544
In this paper we study the application of different geodetic techniques to volcanic activity monitoring, using theoretical analysis. This methodology is very useful for obtaining an idea of the most appropriate (and efficient) monitoring method, mainly when there are no records of geodetic changes prior to volcanic activity. The analysis takes into account the crustal structure of the area, its geology, and its known volcanic activity to estimate the deformation and gravity changes that might precede eruptions. The deformation model used includes the existing gravity field and vertical changes in the crustal properties. Both factors can have a considerable effect on computed deformation and gravity changes. Topography should be considered when there is a steep slope (greater than 10°). The case study of Teide stratovolcano (Tenerife, Canary Islands), for which no deformation or gravity changes are available, is used as a test. To avoid considering topography, we worked at the lowest level of Las Cañadas and examined a smaller area than the whole caldera or island. Therefore, the results are only a first approach to the most adequate geodetic monitoring system. The methodology can also be applied to active areas where volcanic risk is not associated with a stratovolcano but instead with monogenetic scattered centers, especially when sites must be chosen in terms of detection efficiency or existing facilities. The Canary Islands provide a good example of this type of active volcanic area, and we apply our model to the island of Lanzarote to evaluate the efficiency of the monitoring system installed at the existing geodynamic station. On this island topography is not important. The results of our study show clearly that the most appropriate geodetic volcano monitoring system is not the same for all volcanic zones and types, and the particular properties of each volcano/zone must be taken into account when designing each system.