Found 20 similar records; search time: 31 ms
1.
A. Oya J. Navarro-Moreno J. C. Ruiz-Molina 《Stochastic Environmental Research and Risk Assessment (SERRA)》2007,21(4):317-326
An approach to the simulation of spatial random fields is proposed. The target random field is specified by its covariance
function which need not be homogeneous or Gaussian. The technique provided is based on an approximate Karhunen–Loève expansion
of spatial random fields which can be readily realized. Such an approximate representation is obtained from a correction to
the Rayleigh–Ritz method based on dual Riesz basis theory. The resulting numerical projection procedure improves on the Rayleigh–Ritz
algorithm in the approximation of second-order random fields. Simulations illustrate the convergence and accuracy of the
proposed method.
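A minimal sketch of the basic discrete Karhunen–Loève construction the abstract builds on (not the authors' Rayleigh–Ritz correction with dual Riesz bases): tabulate the covariance function on a grid, eigendecompose the covariance matrix, and draw a realization as a weighted sum of eigenvectors. The function names and the exponential covariance are illustrative, not from the paper.

```python
import math
import random

def jacobi_eig(A, sweeps=100, tol=1e-12):
    """Eigendecomposition of a symmetric matrix by cyclic Jacobi rotations.
    Returns (eigenvalues, eigenvectors); eigenvectors[j] pairs with eigenvalues[j]."""
    n = len(A)
    A = [row[:] for row in A]
    V = [[float(i == j) for j in range(n)] for i in range(n)]
    for _ in range(sweeps):
        off = max((abs(A[i][j]) for i in range(n) for j in range(n) if i != j),
                  default=0.0)
        if off < tol:
            break
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(A[p][q]) < tol:
                    continue
                # rotation angle that zeroes A[p][q]
                theta = 0.5 * math.atan2(2.0 * A[p][q], A[q][q] - A[p][p])
                c, s = math.cos(theta), math.sin(theta)
                for k in range(n):                     # rows p, q
                    apk, aqk = A[p][k], A[q][k]
                    A[p][k] = c * apk - s * aqk
                    A[q][k] = s * apk + c * aqk
                for k in range(n):                     # columns p, q
                    akp, akq = A[k][p], A[k][q]
                    A[k][p] = c * akp - s * akq
                    A[k][q] = s * akp + c * akq
                for k in range(n):                     # accumulate eigenvectors
                    vkp, vkq = V[k][p], V[k][q]
                    V[k][p] = c * vkp - s * vkq
                    V[k][q] = s * vkp + c * vkq
    return [A[i][i] for i in range(n)], [[V[i][j] for i in range(n)] for j in range(n)]

def kl_sample(grid, cov, rng):
    """One realization of a zero-mean field on `grid` via the discrete
    Karhunen-Loeve expansion X = sum_j sqrt(lambda_j) * xi_j * e_j."""
    C = [[cov(x, y) for y in grid] for x in grid]
    vals, vecs = jacobi_eig(C)
    xi = [rng.gauss(0.0, 1.0) for _ in vals]
    return [sum(math.sqrt(max(l, 0.0)) * z * e[i] for l, z, e in zip(vals, xi, vecs))
            for i in range(len(grid))]
```

Because the construction only needs the covariance matrix on the grid, nothing here requires homogeneity or Gaussian marginals of the covariance model itself; truncating to the leading eigenpairs gives the usual low-rank approximation.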
2.
Derivation of Photosynthetically Available Radiation from METEOSAT data in the German Bight with Neural Nets
Kathrin Schiller 《Ocean Dynamics》2006,56(2):79-85
Two different models, a Physical Model and a Neural Net (NN), are used for the derivation of the Photosynthetically Available
Radiation (PAR) from METEOSAT data in the German Bight; the advantages and disadvantages of both models are discussed. A NN
is preferable to the Physical Model for deriving PAR because, by construction, it can account for the various processes that
determine PAR at the surface far better than a non-statistical model relying on averaged relations.
3.
Stuart Coles Jonathan Tawn 《Stochastic Environmental Research and Risk Assessment (SERRA)》2005,19(6):417-427
Extreme value analysis of sea levels is an essential component of risk analysis and protection strategy for many coastal regions.
Since the tidal component of the sea level is deterministic, it is the stochastic variation in extreme surges that is the
most important to model. Historically, this modelling has been accomplished by fitting classical extreme value models to series
of annual maxima data. Recent developments in extreme value modelling have led to alternative procedures that make better
use of the available data, and this has led to substantially refined estimates of extreme surge levels. However, one aspect that has been
routinely ignored is seasonality. In an earlier study we identified strong seasonal effects at one of a number of locations
along the eastern coastline of the United Kingdom. In this article, we discuss the construction and inference of extreme value
models for processes that include components of seasonality in greater detail. We use a point process representation of extreme
value behaviour, and set our inference in a Bayesian framework, using simulation-based techniques to resolve the computational
issues. Though contemporary, these techniques are now widely used for extreme value modelling. However, the issue of seasonality
requires delicate consideration of model specification and parameterization, especially for efficient implementation via Markov
chain Monte Carlo algorithms, and this issue seems not to have been much discussed in the literature. In the present paper
we make some suggestions for model construction and apply the resultant model to study the characteristics of the surge process,
especially in terms of its seasonal variation, on the eastern UK coastline. Furthermore, we illustrate how an estimated model
for seasonal surge can be combined with tide records to produce return level estimates for extreme sea levels that account
for seasonal variation in both the surge and tidal processes.
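For context, the classical annual-maxima return-level formula that point-process and seasonal models refine can be sketched as follows. The surge parameters in the usage line are invented for illustration, not estimates from this study.

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """T-year return level for a GEV(mu, sigma, xi) fitted to annual maxima:
    the level exceeded on average once every T years (classical formula;
    the paper's seasonal point-process model generalises this)."""
    y = -math.log(1.0 - 1.0 / T)   # -log of the annual non-exceedance probability
    if abs(xi) < 1e-9:             # Gumbel limit as xi -> 0
        return mu - sigma * math.log(y)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)
```

With made-up surge parameters mu = 2.0 m, sigma = 0.3 m, xi = 0.1, `gev_return_level(2.0, 0.3, 0.1, 100)` gives a 100-year level of about 3.75 m.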
4.
Ram Ranjan Ruben N. Lubowski 《Stochastic Environmental Research and Risk Assessment (SERRA)》2005,19(5):315-325
We examine the management of livestock diseases from the producers' perspective, incorporating information and incentive asymmetries
between producers and regulators. Using a stochastic dynamic model, we examine responses to different policy options including
indemnity payments, subsidies to report at-risk animals, monitoring, and regulatory approaches to decreasing infection risks
when perverse incentives and multiple policies interact. This conceptual analysis illustrates the importance of designing
efficient combinations of regulatory and incentive-based policies.
5.
Jacqueline A. MacDonald Mitchell J. Small 《Stochastic Environmental Research and Risk Assessment (SERRA)》2009,23(2):203-214
This paper summarizes the findings of a statistical analysis of the locations of metallic anomalies detected at the Pueblo
Precision Bombing Range Number 2 in Otero County, Colorado, and at the Victorville Precision Bombing Range in San Bernardino
County, California. The purpose of the study is to explore whether statistical properties of the pattern of anomaly locations
can be used to discriminate areas likely to contain unexploded ordnance (UXO) left over from previous bombing practice from
those unlikely to contain UXO. Techniques for discriminating areas with and without UXO are needed because historic records
have left an incomplete account of previous military training activities, so that locations historically used for target practice
are often unknown. This study differs from previous research on metallic anomaly data at former military training ranges in
that it analyzes the spatial pattern of the discrete locations of the anomalies, rather than the average number of anomalies
per unit area. The results indicate that differences in spatial pattern may be a distinguishing feature between areas that
were used for target practice and those that are unlikely to contain UXO, even when a large number of ferrous rocks and other
inert metallic anomalies are present. We found that at both of the former bombing ranges, the anomaly patterns in sample areas
that are distant from all known bombing targets are consistent with a complete spatial randomness pattern, while those near
the target areas fit a radially symmetric, bivariate Gaussian pattern. Furthermore, anomaly location patterns generated by
surveys with airborne metal detectors have the same statistical properties as the patterns generated by surveys with on-ground
detectors, even though the airborne systems detect only a subset of the anomalies found by the ground-based detectors. Thus,
pattern information revealed by airborne surveys with metal detectors may be useful in identifying areas where careful searches
for UXO are needed.
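A standard way to quantify the CSR-versus-clustered distinction the study relies on is the Clark–Evans nearest-neighbour ratio. This sketch is not the authors' analysis (which also fits a radially symmetric bivariate Gaussian intensity) and it ignores edge corrections.

```python
import math
import random

def clark_evans(points, area):
    """Clark-Evans ratio R = observed mean nearest-neighbour distance /
    expected mean NN distance under complete spatial randomness (CSR).
    R ~ 1 for CSR, R < 1 for clustered patterns (e.g. anomalies around a
    bombing target), R > 1 for regular patterns.  Edge effects ignored."""
    n = len(points)
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        total += min(math.hypot(xi - xj, yi - yj)
                     for j, (xj, yj) in enumerate(points) if j != i)
    observed = total / n
    expected = 0.5 / math.sqrt(n / area)   # mean NN distance under CSR
    return observed / expected
```

On a synthetic unit square, uniformly scattered "anomalies" give R near 1, while a tight Gaussian cluster around a notional target gives R well below 1.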
6.
Wind in Ireland: long memory or seasonal effect?
Jean-Christophe Bouette Jean-François Chassagneux David Sibai Rémi Terron Arthur Charpentier 《Stochastic Environmental Research and Risk Assessment (SERRA)》2006,20(3):141-151
Since Haslett and Raftery’s paper Space-Time Modelling with Long-Memory Dependence: Assessing Ireland’s Wind Power Resource (1989), modelling meteorological time series with long-memory processes, in particular the ARFIMA model, has become very common. Haslett and Raftery fitted an ARFIMA model to Irish daily wind speeds. In this paper, we attempt to reproduce Haslett and Raftery’s results (focusing on the dynamics of the wind process, and not on cross-correlation and spatial dependence), and show that an ARFIMA model does not properly capture the behaviour of the series (in the Modelling daily windspeed in Ireland section). Indeed, the series exhibit a periodic behaviour that the ARFIMA model does not take into account. Removing this periodic behaviour does not resolve the problem either, so we fit a GARMA model that accounts for both seasonality and long memory (in the Seasonality and long memory using GARMA models section). Beyond modelling the Irish daily data, we show that GARMA processes can also be used to model Dutch hourly data.
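The long-memory ingredient can be made concrete via the weights of the fractional-differencing filter (1 − B)^d used by ARFIMA models. This is a generic textbook construction, not code from the paper.

```python
def frac_diff_weights(d, n):
    """First n coefficients of the fractional-differencing filter (1 - B)^d
    behind ARFIMA models: w[0] = 1, w[k] = w[k-1] * (k - 1 - d) / k.
    For 0 < d < 0.5 the weights decay hyperbolically (like k**-(1+d))
    rather than geometrically, which is the 'long memory' signature.
    (GARMA models replace (1 - B)^d by a Gegenbauer factor
    (1 - 2*u*B + B**2)**d, which adds a seasonal period.)"""
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (k - 1 - d) / k)
    return w
```

Applying these weights to a series (a truncated convolution) removes the long-memory component, which is how ARFIMA residuals are formed in practice.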
7.
Davide D’Alimonte Dan Cornford 《Stochastic Environmental Research and Risk Assessment (SERRA)》2008,22(5):613-620
This paper addresses the problem of novelty detection in the case that the observed data is a mixture of a known ‘background’ process contaminated with an unknown other process, which generates the outliers, or novel observations. The framework we
describe here is quite general, employing univariate classification with incomplete information, based on knowledge of the
distribution (the probability density function, pdf) of the data generated by the ‘background’ process. The relative proportion of this ‘background’ component (the prior ‘background’ probability), the pdf and the prior probabilities of all other components are all assumed unknown. The main contribution is a new classification scheme that
identifies the maximum proportion of observed data following the known ‘background’ distribution. The method exploits the Kolmogorov–Smirnov test to estimate the proportions, and afterwards data are Bayes
optimally separated. Results, demonstrated with synthetic data, show that this approach can produce more reliable results
than a standard novelty detection scheme. The classification algorithm is then applied to the problem of identifying outliers in the SIC2004 data set, in order
to detect the radioactive release simulated in the ‘joker’ data set. We propose this method as a reliable means of novelty
detection in the emergency situation which can also be used to identify outliers prior to the application of a more general
automatic mapping algorithm.
8.
Storm-related sea level variations 1958–2002 along the North Sea coast from a high-resolution numerical hindcast are investigated and compared to the results of earlier studies. Considerable variations were found from year to year and over the entire period. The large-scale pattern of these variations is consistent with that derived from previous studies, while the magnitudes of the long-term trends differ. The latter is attributed to different analysis periods, improvements in the atmospheric forcing, and the enhanced spatial resolution of the numerical simulation. It is shown that the different analysis periods, in particular, represent an issue as the increase in storm-related sea levels was found to be weaker over the last few years that have not been included in earlier studies. These changes are consistent with observed changes of the storm climate over the North Sea. It is also shown that observed and hindcast trends may differ significantly. While the latter are in agreement with observed changes in the storm climate, it may be concluded that observed sea level changes along the North Sea coast comprise a considerable fraction that cannot be attributed to changes in the large-scale atmospheric circulation.
Ralf Weisse
9.
Pierre Kestener Alain Arneodo 《Stochastic Environmental Research and Risk Assessment (SERRA)》2008,22(3):421-435
Extreme atmospheric events are intimately related to the statistics of atmospheric turbulent velocities. These, in turn, exhibit
multifractal scaling, which determines the asymptotic behaviour of the velocities and whose parameter estimation is therefore
of great current interest. We combine singular value decomposition techniques and wavelet transform analysis
to generalize the multifractal formalism to vector-valued random fields. The so-called Tensorial Wavelet Transform Modulus
Maxima (TWTMM) method is calibrated on synthetic self-similar 2D vector-valued multifractal measures and monofractal 3D vector-valued
fractional Brownian fields. We report the results of applications of the TWTMM method to turbulent velocity and vorticity
fields generated by direct numerical simulations of the incompressible Navier–Stokes equations. This study reveals an intimate
relationship between the singularity spectra of these two vector fields, which are found to be significantly more intermittent than previously
estimated from longitudinal and transverse velocity increment statistics.
10.
Amitrajeet A. Batabyal 《Stochastic Environmental Research and Risk Assessment (SERRA)》2007,21(3):253-257
Recently, Batabyal and Nijkamp (Environ Econ Policy Stud 7:39–51, 2005) have used a theoretical model of antibiotic use to study the relative merits of interventionist (antibiotics) and non-interventionist
(no antibiotics) treatment options. A key assumption in their paper is that the default treatment option is the interventionist
option. Because there are several instances in which this assumption is invalid, in this paper, we suppose that the default
treatment option is the non-interventionist option. Specifically, we first derive the long run average cost of treating a
common infection such as acute otitis media (AOM). Next, we show that there is a particular tolerance level that, when used
by a physician to decide when to administer the non-antibiotic medicine, minimizes the long run average cost of treating the
common infection under study.
11.
Christoph Clauser 《Surveys in Geophysics》2009,30(3):163-191
The heat of the Earth derives from internal and external sources. A heat balance shows that most of the heat provided by external
sources is re-emitted by long-wavelength heat radiation and that the dominant internal sources are original heat and heat
generated by decay of unstable radioactive isotopes. Understanding of the thermal regime of the Earth requires appreciation
of properties and mechanisms for heat generation, storage, and transport. Both experimental and indirect methods are available
for inferring the corresponding rock properties. Heat conduction is the dominant transport process in the Earth’s crust, except
for settings where appreciable fluid flow provides a mechanism for heat advection. For most crustal and mantle rocks, heat
radiation becomes significant only at temperatures above 1200°C.
12.
Topography and morphodynamics in the German Bight using SAR and optical remote sensing data
Morphological changes in coastal areas, especially in river estuaries, are of high interest in many parts of the world. Satellite
data from both optical and radar sensors can help to monitor and investigate these changes. Data from both kinds of sensors,
now available for up to 30 years, allow examination over long timescales, while the high-resolution sensors developed
within the last decade allow increased accuracy. The creation of digital elevation models (DEMs) of, for example, the Wadden
Sea from a series of satellite images is thus already possible. ENVISAT, successfully launched on March 1, 2002, continues the
line of higher resolution synthetic aperture radar (SAR) imaging sensors with its ASAR instrument and now also allows several
polarization modes for better separation of land and water areas. This article gives an overview of sensors and algorithms
for waterline determination as well as several applications. Both optical and SAR images are considered. Applications include
morphodynamic monitoring studies and DEM generation.
Andreas Niedermeier
13.
Ram Ranjan 《Stochastic Environmental Research and Risk Assessment (SERRA)》2006,20(5):353-362
This paper estimates the expected annual impacts of the Pink Hibiscus Mealybug infestation on the economies of Florida and the rest of the United States. The approach involves a Markov chain analysis in which both short run and long run expected damages from infestation are calculated. Use is made of the CLIMEX model, which predicts the potential pest-establishment regions in the US. While predictions based upon the CLIMEX model extend the scope of damages beyond Florida, the damages depend significantly upon the rate of arrival and detection of the species in those regions. Damages are significantly higher when a longer time horizon is considered. When nursery owners bear the full cost of quarantines in the form of lost sales and treatment costs for infected plants, the cost-effectiveness of quarantines as a regulatory tool is diminished. It is the long run propensity of the system, in terms of the fraction of time spent in the possible ‘states’ of infestation and control, that determines the extent of damages, not the annual value of crops that could be potential hosts to the pest.
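The "long run propensity" idea can be sketched as a stationary-distribution computation for a small Markov chain. The three states, transition probabilities, and damage figures below are invented placeholders for illustration, not the paper's calibrated values.

```python
def stationary(P, iters=500):
    """Long-run fraction of time an ergodic Markov chain with one-step
    transition matrix P (rows sum to 1) spends in each state, obtained
    by iterating pi <- pi P from a uniform start."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Hypothetical illustration only: 0 = pest-free, 1 = infested, 2 = under control
P = [[0.90, 0.10, 0.00],
     [0.00, 0.70, 0.30],
     [0.50, 0.05, 0.45]]
annual_damage = [0.0, 8.0, 3.0]          # made-up damages per state, $ million/year
pi = stationary(P)
expected_annual_damage = sum(p * c for p, c in zip(pi, annual_damage))
```

The expected annual damage is a weighted average of the per-state costs, with weights given by the stationary distribution, which is why the fraction of time in each state, rather than the crop value alone, drives the result.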
14.
A Survey of Techniques for Predicting Earthquake Ground Motions for Engineering Purposes
Over the past four or five decades many advances have been made in earthquake ground-motion prediction and a variety of procedures
have been proposed. Some of these procedures are based on explicit physical models of the earthquake source, travel-path and
recording site while others lack a strong physical basis and seek only to replicate observations. In addition, there are a
number of hybrid methods that seek to combine benefits of different approaches. The various techniques proposed have their
adherents and some of them are extensively used to estimate ground motions for engineering design purposes and in seismic
hazard research. These methods all have their own advantages and limitations that are not often discussed by their proponents.
The purposes of this article are to summarise existing methods and the most important references, to provide a family tree showing
the connections between different methods and, most importantly, to discuss the advantages and disadvantages of each method.
John Douglas
15.
Kevin J. Johnson Christian P. Minor Verner N. Guthrie Susan L. Rose-Pehrsson 《Stochastic Environmental Research and Risk Assessment (SERRA)》2009,23(2):237-252
The remediation of sites contaminated with unexploded ordnance (UXO) remains an area of intense focus for the Department of
Defense. Under the sponsorship of SERDP, data fusion techniques are being developed for use in enhancing wide-area assessment
UXO remediation efforts and a data fusion framework is being created to provide a cohesive data management and decision-making
utility to allow for more efficient expenditure of time, labor and resources. An important first step in this work is the
development of feature extraction utilities and feature probability density maps for eventual input to data fusion algorithms,
making possible the fusion of estimates of data quality, UXO-related features, non-UXO backgrounds, and correlations among
independent data streams. Utilizing data acquired during ESTCP’s Wide-Area Assessment Pilot Program, the results presented here
successfully demonstrate the feasibility of automated feature extraction from light detection and ranging, orthophotography,
and helicopter magnetometry wide-area assessment survey data acquired at the Pueblo Precision Bombing Range #2. These data
were imported and registered to a common survey map grid and UXO-related features were extracted and utilized to construct
survey site-wide probability density maps that are well-suited for input to higher level data fusion algorithms. Preliminary
combination of feature maps from the various data sources yielded maps for the Pueblo site that offered a more accurate UXO
assessment than any one data source alone.
16.
Viroj Wiwanitkit Amornpun Sereemaspun Rojrit Rojanathanes 《Stochastic Environmental Research and Risk Assessment (SERRA)》2008,22(4):583-585
In medicine, knowledge of the toxicity of nanoparticles is limited. In reproductive medicine, little is known about the effect
of gold nanoparticles on human red blood cells. In this work, performed at Chulalongkorn University, Bangkok, Thailand, the
author carried out an experimental study to determine whether gold nanoparticles can be detected inside red blood cells by
microscopic examination. A mixture of gold nanoparticle solution and a blood sample was prepared and analyzed. Accumulation
of gold nanoparticles in red blood cells was observed after mixing the blood sample with the gold nanoparticle solution;
however, no significant destruction of the red cells was seen. The effect of gold nanoparticles on red blood cells can thus
be detected, raising the possibility of chronic toxicity from gold nanoparticles accumulated in red cells.
17.
Solomon Tesfamariam Rehan Sadiq 《Stochastic Environmental Research and Risk Assessment (SERRA)》2008,22(1):1-15
The concepts of system load and capacity are pivotal in risk analysis. The complexity in risk analysis increases when the
input parameters are stochastic (aleatory uncertainty) and/or missing (epistemic uncertainty). The aleatory and epistemic
uncertainties related to input parameters are handled through simulation-based parametric and non-parametric probabilistic
techniques. The complexities increase further when the empirical relationships are not strong enough to derive physics-based
models. In this paper, ordered weighted averaging (OWA) operators are proposed to estimate the system load. The risk of failure
is estimated by assuming a normally distributed reliability index. The proposed methodology for risk analysis is illustrated
using an example with nine input parameters. Sensitivity analyses identified that the risk of failure is dominated by the
attitude of the decision-maker used to generate the OWA weights, by missing input parameters, and by the system capacity.
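A minimal sketch of the two ingredients named in the abstract, under the simplifying (and non-authorial) assumption that the OWA load estimate enters the reliability index as a deterministic value:

```python
import math

def owa(values, weights):
    """Ordered weighted averaging: sort the inputs in descending order and
    take the weighted sum, so the weight vector encodes the decision-maker's
    attitude (weight on early positions = optimistic, emphasising the
    largest inputs; on late positions = pessimistic)."""
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

def failure_probability(load, capacity_mean, capacity_std):
    """Risk of failure under a normally distributed reliability margin:
    beta = (capacity - load) / sigma and P(failure) = Phi(-beta)."""
    beta = (capacity_mean - load) / capacity_std
    return 0.5 * (1.0 - math.erf(beta / math.sqrt(2.0)))
```

For example, the aggregated load from nine input parameters would be `owa(readings, weights)` for a chosen nine-element weight vector, and shifting weight toward the top-ranked inputs (a pessimistic load estimate) raises the resulting failure probability, which mirrors the sensitivity to decision-maker attitude noted in the abstract.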
18.
Beatriz Vaz de Melo Mendes Luis Raúl Pericchi 《Stochastic Environmental Research and Risk Assessment (SERRA)》2009,23(3):399-410
We model multivariate hydrological risks in the case that at least one of the variables is extreme. Recently, Heffernan and
Tawn (A conditional approach for multivariate extremes. J R Stat Soc B 66(3):497–546, 2004; hereafter HT04) proposed a conditional
multivariate extreme value model which applies to regions where not all variables are extreme and which simultaneously identifies
the type of extremal dependence, including negative dependence. In this paper we apply this modeling strategy to multivariate
observations of five rivers in two clearly distinct regions of the island of Puerto Rico, for two different seasons each. The
resulting effective dimensionality of ten cannot be handled by traditional models of multivariate extremes. The fitted model,
following the HT04 model and estimation strategies, is able to make long term estimates of extremes conditional on whether
other rivers are extreme or not. The model shows considerable flexibility in addressing the natural questions that arise in
multivariate extreme value assessments. In the Puerto Rico five-river application, the model clearly groups the rivers into
two regions, one of two rivers and another of three, which show strong relationships in the rainy season. This corresponds
with the geographical distribution of the rivers.
19.
Ocean/ice interaction at the base of deep-drafted Antarctic ice shelves modifies the physical properties of inflowing shelf
waters to become Ice Shelf Water (ISW). In contrast to the conditions at the atmosphere/ocean interface, the increased hydrostatic
pressure at the glacial base causes gases embedded in the ice to dissolve completely after being released by melting. Helium
and neon, which have extremely low solubilities, are supersaturated in glacial meltwater by more than 1000%. At the continental slope
in front of the large Antarctic caverns, ISW mixes with ambient waters to form different precursors of Antarctic Bottom Water.
A regional ocean circulation model, which uses an explicit formulation of the ocean/ice shelf interaction to describe for
the first time the input of noble gases to the Southern Ocean, is presented. The results reveal a long-term variability of
the basal mass loss solely controlled by the interaction between waters of the continental shelf and the ice shelf cavern.
Modeled helium and neon supersaturations from the Filchner–Ronne Ice Shelf front show a “low-pass” filtering of the inflowing
signal due to cavern processes. On circumpolar scales, the simulated helium and neon distributions allow us to quantify the
ISW contribution to bottom water, which spreads with the coastal current connecting the major formation sites in the Ross and
Weddell Seas.
Christian B. Rodehacke
20.
Jamesina J. Simpson 《Surveys in Geophysics》2009,30(2):105-130
Advances in computing technologies in recent decades have provided a means of generating and performing highly sophisticated
computational simulations of electromagnetic phenomena. In particular, just after the turn of the twenty-first century, improvements
to computing infrastructures provided for the first time the opportunity to conduct advanced, high-resolution three-dimensional
full-vector Maxwell’s equations investigations of electromagnetic propagation throughout the global Earth-ionosphere spherical
volume. These models, based on the finite-difference time-domain (FDTD) method, are capable of including such details as
the Earth’s topography and bathymetry, as well as arbitrary horizontal/vertical geometrical and electrical inhomogeneities
and anisotropies of the ionosphere, lithosphere, and oceans. Studies at this level of detail simply are not achievable using
analytical methods. The goal of this paper is to provide an historical overview and future prospectus of global FDTD computational
research for both natural and man-made electromagnetic phenomena around the world. Current and future applications of global
FDTD models relating to lightning sources and radiation, Schumann resonances, hypothesized earthquake precursors, remote sensing,
and space weather are discussed.
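The FDTD leapfrog update at the heart of such models reduces, in one dimension and normalised units, to a few lines. This toy sketch is of course not the global spherical-grid code described in the paper; grid size, source position, and Courant number are arbitrary choices.

```python
import math

def fdtd_1d(steps, n=200, src=50, courant=0.5):
    """Minimal 1-D vacuum FDTD (Yee) leapfrog update for the field pair
    Ez/Hy in normalised units, with a hard sinusoidal source.  Global
    Earth-ionosphere models extend this staggered-grid idea to the full
    3-D spherical shell with material inhomogeneities."""
    ez = [0.0] * n
    hy = [0.0] * n
    for t in range(steps):
        for k in range(n - 1):                     # magnetic-field half step
            hy[k] += courant * (ez[k + 1] - ez[k])
        for k in range(1, n):                      # electric-field half step
            ez[k] += courant * (hy[k] - hy[k - 1])
        ez[src] = math.sin(0.1 * (t + 1))          # hard source at one cell
    return ez
```

With a Courant number of 0.5 the scheme is stable and the disturbance propagates half a cell per time step, so after 100 steps the wavefront has travelled about 50 cells from the source.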