Similar Documents
Found 20 similar documents.
1.
This paper investigates trust in the scientists, government authorities and wider risk management team during the ongoing volcanic crisis in Montserrat, WI. Identifying the most trusted communicator and how trust in information can be enhanced are considered important for improving the efficacy of volcanic risk communication. Qualitative interviews, participant observations and a quantitative survey were utilised to investigate the views and attitudes of the public, authorities and scientists. Trust was found to be dynamic, influenced by political factors made more complex by the colonial nature of Montserrat’s governance and the changing level of volcanic activity. The scientists were viewed by the authorities as a highly trusted expert source of volcanic information. Mistrust among some of the local authorities towards the scientists and British Governor was rooted in the uncertainty of the volcanic situation and influenced by differences in levels of acceptable risk and suspicions about integrity (e.g. as a consequence of employment by the British Government). The public viewed friends and relatives as the most trusted source of volcanic information. High trust in this source allowed competing messages to reinforce beliefs that the risk was lower than was officially being described. The scientists were the second most trusted group among the public and were considered significantly more competent, reliable, caring, fair and open than the authorities. The world press was the least trusted, preceded closely by the British Governor’s Office and Montserratian Government officials. These results tally well with other empirical findings suggesting that government ministers and departments are typically distrusted as sources of risk-related information. These findings have implications for risk communication on Montserrat and during other volcanic crises. The importance and potential effectiveness of scientists as communicators, because of, and despite, the existence of political, cultural and institutional barriers, are exemplified by this study.
Katharine Haynes

2.
We examine the management of livestock diseases from the producers’ perspective, incorporating information and incentive asymmetries between producers and regulators. Using a stochastic dynamic model, we examine responses to different policy options including indemnity payments, subsidies to report at-risk animals, monitoring, and regulatory approaches to decreasing infection risks when perverse incentives and multiple policies interact. This conceptual analysis illustrates the importance of designing efficient combinations of regulatory and incentive-based policies.
Ram Ranjan

3.
Two different models, a Physical Model and a Neural Net (NN), are used for the derivation of the Photosynthetically Available Radiation (PAR) from METEOSAT data in the German Bight; advantages and disadvantages of both models are discussed. The use of an NN for the derivation of PAR should be preferred to the Physical Model because, by construction, an NN can take the various processes determining PAR at a surface into account much better than a non-statistical model relying on averaged relations.
Kathrin Schiller
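A minimal sketch of the statistical alternative described in this abstract: a small feed-forward network regressing surface PAR on two satellite-derived predictors. The predictors, the synthetic relationship and the network size are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Illustrative only: the synthetic relationship below merely mimics the qualitative
# behaviour of PAR (bright for high sun, reduced by cloud); it is not the paper's model.
rng = np.random.default_rng(8)
n = 5000
cos_zenith = rng.uniform(0.05, 1.0, n)         # cosine of the solar zenith angle
cloud_index = rng.uniform(0.0, 1.0, n)         # 0 = clear sky, 1 = fully overcast
par = 2400 * cos_zenith * (1 - 0.75 * cloud_index**1.5) + rng.normal(0, 30, n)  # µmol m-2 s-1

X = np.column_stack([cos_zenith, cloud_index])
X_tr, X_te, y_tr, y_te = train_test_split(X, par, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
net.fit(X_tr, y_tr)
print(f"R^2 on held-out data: {net.score(X_te, y_te):.3f}")
```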

4.
The remediation of sites contaminated with unexploded ordnance (UXO) remains an area of intense focus for the Department of Defense. Under the sponsorship of SERDP, data fusion techniques are being developed to enhance wide-area assessment UXO remediation efforts, and a data fusion framework is being created to provide a cohesive data management and decision-making utility that allows for more efficient expenditure of time, labor and resources. An important first step in this work is the development of feature extraction utilities and feature probability density maps for eventual input to data fusion algorithms, making possible the fusion of estimates of data quality, UXO-related features, non-UXO backgrounds, and correlations among independent data streams. Utilizing data acquired during ESTCP’s Wide-Area Assessment Pilot Program, the results presented here successfully demonstrate the feasibility of automated feature extraction from light detection and ranging, orthophotography, and helicopter magnetometry wide-area assessment survey data acquired at the Pueblo Precision Bombing Range #2. These data were imported and registered to a common survey map grid, and UXO-related features were extracted and used to construct survey site-wide probability density maps that are well suited for input to higher-level data fusion algorithms. Preliminary combination of feature maps from the various data sources yielded maps for the Pueblo site that offered a more accurate UXO assessment than any one data source alone.
Susan L. Rose-Pehrsson
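As a hedged illustration of how per-sensor feature maps might be combined once they are registered to a common grid, the sketch below fuses three synthetic likelihood maps with a naive-Bayes-style product. The maps, grid and blob features are placeholders, not data from the Pueblo site, and the paper's actual fusion algorithms are not reproduced.

```python
import numpy as np

# Toy fusion of per-sensor feature likelihood maps on a common grid; the three synthetic
# maps stand in for LiDAR, orthophotography and magnetometry feature densities.
rng = np.random.default_rng(9)
grid = (200, 200)
yy, xx = np.mgrid[0:grid[0], 0:grid[1]]

def blob(cy, cx, s):
    """Smooth Gaussian bump used to fake a spatially concentrated feature density."""
    return np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * s**2))

lidar = 0.2 + 0.8 * blob(80, 120, 25) + 0.05 * rng.random(grid)    # crater-like features
ortho = 0.2 + 0.8 * blob(85, 115, 35) + 0.05 * rng.random(grid)    # ground-scar features
mag = 0.2 + 0.8 * blob(82, 118, 15) + 0.05 * rng.random(grid)      # magnetic anomaly density

def normalise(m):
    return m / m.sum()

# Naive-Bayes-style combination: multiply normalised maps, then renormalise
fused = normalise(normalise(lidar) * normalise(ortho) * normalise(mag))
iy, ix = np.unravel_index(np.argmax(fused), grid)
print(f"highest fused UXO likelihood at grid cell ({iy}, {ix})")
```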

5.
The concepts of system load and capacity are pivotal in risk analysis. The complexity in risk analysis increases when the input parameters are stochastic (aleatory uncertainty) and/or missing (epistemic uncertainty). The aleatory and epistemic uncertainties related to input parameters are handled through simulation-based parametric and non-parametric probabilistic techniques. The complexities increase further when the empirical relationships are not strong enough to derive physics-based models. In this paper, ordered weighted averaging (OWA) operators are proposed to estimate the system load. The risk of failure is estimated by assuming a normally distributed reliability index. The proposed methodology for risk analysis is illustrated using an example with nine input parameters. Sensitivity analyses identified that the risk of failure is dominated by the attitude of the decision-maker used to generate the OWA weights, the missing input parameters and the system capacity.
Rehan Sadiq (Corresponding author)
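A minimal sketch of the OWA-based aggregation idea under illustrative assumptions: quantifier-guided OWA weights are generated from the decision-maker's attitude (orness), the nine normalised inputs and the capacity distribution are made-up numbers, and the risk of failure is read off a simple normal reliability index. This is not the paper's calibrated procedure.

```python
import numpy as np
from scipy.stats import norm

def owa_weights(n, alpha):
    """Quantifier-guided OWA weights from Q(r) = r**a, with the exponent chosen so the
    orness of the weights is approximately alpha (an illustrative, standard choice)."""
    a = (1 - alpha) / max(alpha, 1e-9)
    r = np.arange(1, n + 1) / n
    return r**a - (np.arange(0, n) / n)**a

def owa(values, weights):
    """OWA aggregation: weights are applied to the values sorted in descending order."""
    return np.dot(weights, np.sort(values)[::-1])

# Nine illustrative normalised input parameters (placeholders, not the paper's data)
x = np.array([0.62, 0.55, 0.71, 0.48, 0.80, 0.66, 0.52, 0.59, 0.74])
capacity_mean, capacity_sd = 0.75, 0.10          # assumed system capacity

for orness in (0.3, 0.5, 0.7):                   # attitude of the decision-maker
    load = owa(x, owa_weights(len(x), orness))
    beta = (capacity_mean - load) / capacity_sd  # simple reliability index
    print(f"orness={orness:.1f}  load={load:.3f}  risk of failure={norm.cdf(-beta):.3f}")
```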

6.
An approach to the simulation of spatial random fields is proposed. The target random field is specified by its covariance function, which need not be homogeneous or Gaussian. The technique provided is based on an approximate Karhunen–Loève expansion of spatial random fields which can be readily realized. Such an approximate representation is obtained from a correction to the Rayleigh–Ritz method based on the dual Riesz basis theory. The resulting numerical projection procedure improves on the Rayleigh–Ritz algorithm in the approximation of second-order random fields. Simulations are developed to illustrate the convergence and accuracy of the method presented.
J. C. Ruiz-Molina
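The paper's correction to the Rayleigh–Ritz method via dual Riesz bases is not reproduced here; the sketch below only illustrates the underlying idea of a truncated Karhunen–Loève simulation, using a discrete (Nyström-type) eigendecomposition of an assumed exponential covariance on a 1-D grid.

```python
import numpy as np

# Minimal truncated Karhunen-Loeve simulation of a 1-D random field on a grid.
# The exponential covariance and truncation level are illustrative choices.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
cov = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.2)    # target covariance C(s, t)

# Discrete (Nystrom-type) eigendecomposition approximating the KL eigenpairs
dx = x[1] - x[0]
evals, evecs = np.linalg.eigh(cov * dx)
order = np.argsort(evals)[::-1]
evals, evecs = np.clip(evals[order], 0.0, None), evecs[:, order]

m = 30                                                  # truncation order
phi = evecs[:, :m] / np.sqrt(dx)                        # approximate eigenfunctions on the grid
field = phi @ (np.sqrt(evals[:m]) * rng.standard_normal(m))   # one realisation

# Check: across many realisations the pointwise variance should approach C(x, x) = 1,
# up to the truncation error of the m-term expansion.
fields = phi @ (np.sqrt(evals[:m])[:, None] * rng.standard_normal((m, 2000)))
print(f"mean pointwise variance over 2000 realisations: {fields.var(axis=1).mean():.3f}")
```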

7.
The heat of the Earth derives from internal and external sources. A heat balance shows that most of the heat provided by external sources is re-emitted by long-wavelength heat radiation and that the dominant internal sources are original heat and heat generated by decay of unstable radioactive isotopes. Understanding of the thermal regime of the Earth requires appreciation of properties and mechanisms for heat generation, storage, and transport. Both experimental and indirect methods are available for inferring the corresponding rock properties. Heat conduction is the dominant transport process in the Earth’s crust, except for settings where appreciable fluid flow provides a mechanism for heat advection. For most crustal and mantle rocks, heat radiation becomes significant only at temperatures above 1200°C.
Christoph Clauser
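As a small worked example of the conductive transport the text identifies as dominant in the crust, the sketch below evaluates Fourier's law with typical, illustrative values for conductivity and geothermal gradient (not values from the paper).

```python
# Conductive heat flux from Fourier's law, q = k * dT/dz, using typical crustal values
# (illustrative numbers only, not values from the paper).
k = 2.5          # thermal conductivity, W/(m K)
dT_dz = 0.03     # geothermal gradient, K/m (30 K per km)
q = k * dT_dz    # heat-flow density, W/m^2
print(f"q = {q * 1e3:.0f} mW/m^2")   # 75 mW/m^2, of the order of typical continental heat flow
```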

8.
This paper addresses the problem of novelty detection in the case that the observed data are a mixture of a known ‘background’ process contaminated with an unknown other process, which generates the outliers, or novel observations. The framework we describe here is quite general, employing univariate classification with incomplete information, based on knowledge of the distribution (the probability density function, pdf) of the data generated by the ‘background’ process. The relative proportion of this ‘background’ component (the prior ‘background’ probability), the pdf and the prior probabilities of all other components are all assumed unknown. The main contribution is a new classification scheme that identifies the maximum proportion of observed data following the known ‘background’ distribution. The method exploits the Kolmogorov–Smirnov test to estimate the proportions, and afterwards the data are Bayes-optimally separated. Results, demonstrated with synthetic data, show that this approach can produce more reliable results than a standard novelty detection scheme. The classification algorithm is then applied to the problem of identifying outliers in the SIC2004 data set, in order to detect the radioactive release simulated in the ‘joker’ data set. We propose this method as a reliable means of novelty detection in emergency situations, which can also be used to identify outliers prior to the application of a more general automatic mapping algorithm.
Davide D’Alimonte
Dan Cornford (Corresponding author)
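A hedged, simplified sketch of the core idea (not the paper's exact estimator): on synthetic data, the largest mixing proportion consistent with the known ‘background’ CDF is found using a Kolmogorov–Smirnov-type tolerance band, and a crude Bayes-type rule then separates background from novel points using a kernel density estimate of the mixture.

```python
import numpy as np
from scipy.stats import norm, gaussian_kde, kstwobign

rng = np.random.default_rng(1)
# Synthetic mixture: 80% known 'background' N(0,1) plus 20% unknown contaminant N(4,1)
data = np.concatenate([rng.standard_normal(800), rng.normal(4.0, 1.0, 200)])
n = data.size
xs = np.sort(data)
F0 = norm.cdf(xs)                        # known background CDF at the ordered data
Fn = np.arange(1, n + 1) / n             # empirical CDF
tol = kstwobign.ppf(0.95) / np.sqrt(n)   # KS-type tolerance band

# The mixture CDF satisfies F >= p * F0, so the largest proportion p consistent with the
# sample (up to the tolerance band) is a crude estimate of the background proportion.
p_hat = np.min((Fn + tol) / np.clip(F0, 1e-12, None)).clip(0.0, 1.0)
print(f"estimated background proportion: {p_hat:.2f}")

# Simple Bayes-type separation with a KDE of the mixture density f: classify as background
# where p_hat * f0(x) exceeds the implied contaminant part f(x) - p_hat * f0(x).
f_hat = gaussian_kde(data)(data)
is_background = 2 * p_hat * norm.pdf(data) > f_hat
print(f"flagged as novel: {np.sum(~is_background)} of {n}")
```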

9.
Over the past four or five decades many advances have been made in earthquake ground-motion prediction and a variety of procedures have been proposed. Some of these procedures are based on explicit physical models of the earthquake source, travel path and recording site, while others lack a strong physical basis and seek only to replicate observations. In addition, there are a number of hybrid methods that seek to combine the benefits of different approaches. The various techniques proposed have their adherents, and some of them are extensively used to estimate ground motions for engineering design purposes and in seismic hazard research. These methods all have their own advantages and limitations that are not often discussed by their proponents. The purposes of this article are to summarise existing methods and the most important references, to provide a family tree showing the connections between different methods and, most importantly, to discuss the advantages and disadvantages of each method.
John Douglas

10.
Recently, Batabyal and Nijkamp (Environ Econ Policy Stud 7:39–51, 2005) have used a theoretical model of antibiotic use to study the relative merits of interventionist (antibiotics) and non-interventionist (no antibiotics) treatment options. A key assumption in their paper is that the default treatment option is the interventionist option. Because there are several instances in which this assumption is invalid, in this paper, we suppose that the default treatment option is the non-interventionist option. Specifically, we first derive the long run average cost of treating a common infection such as acute otitis media (AOM). Next, we show that there is a particular tolerance level and that when a physician uses this tolerance level to determine when to administer the non-antibiotic medicine, the long run average cost of treating the common infection under study is minimized.
Amitrajeet A. Batabyal

11.
In medicine, there is limited knowledge of the toxicity of nanoparticles. In reproductive medicine in particular, little is known about the effect of gold nanoparticles on human red blood cells. In this work, the author performed a study to determine whether gold nanoparticles can be detected inside red blood cells by microscopic examination. The study was carried out at Chulalongkorn University, Bangkok, Thailand, as an experimental study. A mixture of gold nanoparticle solution and a blood sample was prepared and analyzed. In this work, accumulation of gold nanoparticles in red blood cells was observed after mixing the blood sample with the gold nanoparticle solution; however, no significant destruction of the red cells was seen. The effect of gold nanoparticles on red blood cells can thus be detected, and the implication for possible chronic toxicity of gold nanoparticles accumulated in red cells is raised.
Viroj Wiwanitkit

12.
Morphological changes in coastal areas, especially in river estuaries, are of high interest in many parts of the world. Satellite data from both optical and radar sensors can help to monitor and investigate these changes. Data from both kinds of sensors have been available for up to 30 years now, allowing examinations over large timescales, while high-resolution sensors developed within the last decade allow increased accuracy. The creation of digital elevation models (DEMs) of, for example, the Wadden Sea from a series of satellite images is therefore already possible. ENVISAT, successfully launched on March 1, 2002, continues the line of higher-resolution synthetic aperture radar (SAR) imaging sensors with its ASAR instrument and now also allows several polarization modes for better separation of land and water areas. This article gives an overview of sensors and algorithms for waterline determination as well as several applications. Both optical and SAR images are considered. Applications include morphodynamic monitoring studies and DEM generation.
Andreas Niedermeier
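As a stand-in for the waterline-detection algorithms the article reviews (which handle SAR speckle and edge structure far more carefully), the sketch below separates land and water in a synthetic intensity image with a global Otsu threshold and takes the mask boundary as the waterline; the image, noise model and threshold choice are illustrative assumptions.

```python
import numpy as np

# Synthetic scene: darker speckled 'water' and brighter 'land' separated by a wavy coastline
rng = np.random.default_rng(2)
ny, nx = 128, 128
water = rng.gamma(shape=4.0, scale=5.0, size=(ny, nx))
land = rng.gamma(shape=4.0, scale=20.0, size=(ny, nx))
mask_true = np.fromfunction(lambda i, j: j > nx // 2 + 10 * np.sin(i / 15.0), (ny, nx))
image = np.where(mask_true, land, water)

def otsu_threshold(img, nbins=256):
    """Global threshold maximising the between-class variance of the grey-level histogram."""
    hist, edges = np.histogram(img, bins=nbins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(hist)
    w1 = hist.sum() - w0
    m0 = np.cumsum(hist * centers) / np.clip(w0, 1, None)
    m1 = (np.sum(hist * centers) - np.cumsum(hist * centers)) / np.clip(w1, 1, None)
    between = w0 * w1 * (m0 - m1) ** 2
    return centers[np.argmax(between)]

t = otsu_threshold(image)
land_mask = image > t
# The 'waterline' is taken as pixels where the land/water mask changes along a row
waterline = land_mask[:, 1:] != land_mask[:, :-1]
print(f"threshold={t:.1f}, waterline pixels={waterline.sum()}")
```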

13.
Extreme atmospheric events are intimately related to the statistics of atmospheric turbulent velocities. These, in turn, exhibit multifractal scaling, which determines the nature of the asymptotic behavior of the velocities and whose parameter evaluation is therefore currently of great interest. We combine singular value decomposition techniques and wavelet transform analysis to generalize the multifractal formalism to vector-valued random fields. The so-called Tensorial Wavelet Transform Modulus Maxima (TWTMM) method is calibrated on synthetic self-similar 2D vector-valued multifractal measures and monofractal 3D vector-valued fractional Brownian fields. We report the results of some applications of the TWTMM method to turbulent velocity and vorticity fields generated by direct numerical simulations of the incompressible Navier–Stokes equations. This study reveals the existence of an intimate relationship between the singularity spectra of these two vector fields, which are found to be significantly more intermittent than previously estimated from longitudinal and transverse velocity increment statistics.
Alain Arneodo
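For orientation, the sketch below illustrates the classical velocity-increment (structure-function) statistics that the abstract says the wavelet-based method refines; it recovers the monofractal exponents zeta(q) = qH of a synthetic Brownian signal and does not implement the TWTMM method itself.

```python
import numpy as np

# Structure-function estimate of scaling exponents zeta(q) from increments of a signal:
# S_q(l) = <|v(x+l) - v(x)|^q> ~ l^zeta(q).  For Brownian motion, zeta(q) = q/2.
rng = np.random.default_rng(3)
v = np.cumsum(rng.standard_normal(2**16))      # monofractal Brownian signal, Hurst H = 0.5

lags = np.unique(np.geomspace(2, 2048, 15).astype(int))
for q in (1, 2, 3, 4):
    Sq = [np.mean(np.abs(v[l:] - v[:-l]) ** q) for l in lags]
    zeta_q = np.polyfit(np.log(lags), np.log(Sq), 1)[0]   # slope in log-log coordinates
    print(f"q={q}  zeta(q)~{zeta_q:.2f}  (qH = {q * 0.5:.1f} for a monofractal field)")
```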

14.
Advances in computing technologies in recent decades have provided a means of generating and performing highly sophisticated computational simulations of electromagnetic phenomena. In particular, just after the turn of the twenty-first century, improvements to computing infrastructures provided for the first time the opportunity to conduct advanced, high-resolution three-dimensional full-vector Maxwell’s equations investigations of electromagnetic propagation throughout the global Earth-ionosphere spherical volume. These models, based on the finite-difference time-domain (FDTD) method, are capable of including such details as the Earth’s topography and bathymetry, as well as arbitrary horizontal/vertical geometrical and electrical inhomogeneities and anisotropies of the ionosphere, lithosphere, and oceans. Studies at this level of detail simply are not achievable using analytical methods. The goal of this paper is to provide an historical overview and future prospectus of global FDTD computational research for both natural and man-made electromagnetic phenomena around the world. Current and future applications of global FDTD models relating to lightning sources and radiation, Schumann resonances, hypothesized earthquake precursors, remote sensing, and space weather are discussed.
Jamesina J. Simpson
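A toy 1-D vacuum FDTD (Yee) update loop, showing only the leapfrog time-stepping that large 3-D Earth-ionosphere models build on; the grid size, source and boundary treatment (simple reflecting ends) are arbitrary illustrative choices, not any published model's configuration.

```python
import numpy as np

# 1-D free-space FDTD with a staggered (Yee) grid and leapfrog time stepping
nz, nt = 400, 800
c, dz = 3.0e8, 1.0            # speed of light (m/s), cell size (m)
dt = 0.5 * dz / c             # Courant-stable time step
eps0, mu0 = 8.854e-12, 4e-7 * np.pi
ex = np.zeros(nz)             # electric field at integer grid points
hy = np.zeros(nz - 1)         # magnetic field, staggered half a cell

for n in range(nt):
    hy += (dt / (mu0 * dz)) * (ex[1:] - ex[:-1])          # Faraday's law update
    ex[1:-1] += (dt / (eps0 * dz)) * (hy[1:] - hy[:-1])   # Ampere's law update
    ex[nz // 4] += np.exp(-((n - 60) / 20.0) ** 2)        # soft Gaussian source

print(f"peak |Ex| after {nt} steps: {np.max(np.abs(ex)):.3e}")
```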

15.
This paper estimates the expected annual impacts of the Pink Hibiscus Mealybug infestation on the economies of Florida and the rest of the United States. The approach involves a Markov chain analysis wherein both short-run and long-run expected damages from infestation are calculated. Use is made of the CLIMEX model, which predicts the potential pest-establishment regions in the US. While predictions based upon the CLIMEX model extend the scope of damages beyond Florida, the damages are significantly dependent upon the rate of arrival and detection of species in those regions. Damages are significantly higher when a longer time horizon is considered. When nursery owners bear the full cost of quarantines in the form of lost sales and the treatment costs of infected plants, the cost-effectiveness of quarantines as a regulatory tool is diminished. It is the long-run propensity of the system, in terms of the fraction of time spent in the possible ‘states’ of infestation and control, that determines the extent of damages, not the annual value of crops that could be potential hosts to the pest.
Ram Ranjan, Phone: +1-352-3921881, Fax: +1-352-3929898
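A toy version of the Markov-chain calculation, with made-up states, transition probabilities and per-state damages (not the paper's calibrated values): the stationary distribution gives the long-run fraction of time spent in each state and hence the long-run expected annual damage, and iterating the chain gives short-run expected damages.

```python
import numpy as np

# Illustrative states, annual transition probabilities and per-state damages ($ million)
states = ["no infestation", "infestation", "quarantine/control"]
P = np.array([[0.90, 0.08, 0.02],
              [0.00, 0.70, 0.30],
              [0.50, 0.10, 0.40]])
damage = np.array([0.0, 120.0, 40.0])

# Long-run propensity: stationary distribution pi solving pi P = pi with sum(pi) = 1
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi = np.linalg.lstsq(A, b, rcond=None)[0]
print(dict(zip(states, np.round(pi, 3))))
print(f"long-run expected damage: {pi @ damage:.0f} $M/yr")

# Short-run: expected cumulative damage over a finite horizon, starting uninfested
p, short_run = np.array([1.0, 0.0, 0.0]), 0.0
for _ in range(10):
    p = p @ P
    short_run += float(p @ damage)
print(f"expected cumulative damage over 10 years: {short_run:.0f} $M")
```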

16.
Storm-related sea level variations along the North Sea coast over 1958–2002 from a high-resolution numerical hindcast are investigated and compared to the results of earlier studies. Considerable variations were found from year to year and over the entire period. The large-scale pattern of these variations is consistent with that derived from previous studies, while the magnitudes of the long-term trends differ. The latter is attributed to different analysis periods, improvements in the atmospheric forcing, and the enhanced spatial resolution of the numerical simulation. It is shown that the different analysis periods, in particular, represent an issue, as the increase in storm-related sea levels was found to be weaker over the last few years, which have not been included in earlier studies. These changes are consistent with observed changes of the storm climate over the North Sea. It is also shown that observed and hindcast trends may differ significantly. While the latter are in agreement with observed changes in the storm climate, it may be concluded that observed sea level changes along the North Sea coast comprise a considerable fraction that cannot be attributed to changes in the large-scale atmospheric circulation.
Ralf Weisse
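A generic diagnostic sketch of the kind of quantity such studies examine: the annual 99th percentile of hourly surge residuals and its linear trend, computed here on a synthetic series with a weak imposed trend. None of the numbers relate to the paper's hindcast.

```python
import numpy as np

# Annual high percentile of hourly surge residuals and its linear trend (synthetic data)
rng = np.random.default_rng(4)
years = np.arange(1958, 2003)
annual_p99 = []
for i, yr in enumerate(years):
    surge = rng.gumbel(loc=0.0, scale=0.25, size=365 * 24)   # hourly surge residuals (m)
    surge += 0.001 * i                                        # weak imposed trend (1 mm/yr)
    annual_p99.append(np.percentile(surge, 99))

trend = np.polyfit(years, annual_p99, 1)[0] * 1000            # mm per year
print(f"linear trend in the annual 99th percentile: {trend:.1f} mm/yr")
```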

17.
Ocean/ice interaction at the base of deep-drafted Antarctic ice shelves modifies the physical properties of inflowing shelf waters to become Ice Shelf Water (ISW). In contrast to the conditions at the atmosphere/ocean interface, the increased hydrostatic pressure at the glacial base causes gases embedded in the ice to dissolve completely after being released by melting. Helium and neon, with their extremely low solubility, reach saturations in glacial meltwater of more than 1000%. At the continental slope in front of the large Antarctic caverns, ISW mixes with ambient waters to form different precursors of Antarctic Bottom Water. A regional ocean circulation model, which uses an explicit formulation of the ocean/ice shelf interaction to describe for the first time the input of noble gases to the Southern Ocean, is presented. The results reveal a long-term variability of the basal mass loss solely controlled by the interaction between waters of the continental shelf and the ice shelf cavern. Modeled helium and neon supersaturations from the Filchner–Ronne Ice Shelf front show a “low-pass” filtering of the inflowing signal due to cavern processes. On circumpolar scales, the simulated helium and neon distributions allow us to quantify the ISW contribution to bottom water, which spreads with the coastal current connecting the major formation sites in the Ross and Weddell Seas.
Christian B. Rodehacke

18.
Some of the major advances in the field of mining in the last three decades relate to the development of new design and planning techniques for optimizing open-pit mining and the inclusion of a stochastic perspective in economic models, which is more revealing than a purely deterministic perspective. These advances include the use of parametric techniques in the design and planning process, the formulation of criteria for establishing an optimum cut-off grade policy when the economic goal is to optimize net present value (NPV), and the introduction of economic risk analysis. This paper examines some of the difficulties involved in applying these techniques—arising largely from a lack of knowledge of the spatial location and distribution of the deposit grades—and analyses how these difficulties can be tackled with the help of geostatistical simulation techniques that take probabilistic criteria into consideration during the optimization process. These techniques enable equally likely representations of the deposit to be obtained that reproduce the main dispersion features of the starting experimental data (the covariance or variogram, as well as the histogram). Consequently, the uncertainty regarding the deposit, as well as its influence on the economic assessment of the deposit in risk terms, can be evaluated. This paper also describes a simple method for introducing price and cost increases into the risk analysis via the Monte Carlo method and shows how geological, technical and economic uncertainty can be integrated in risk analyses. Although the relationship between prices and costs is held constant in mine planning based on parametric techniques, risk analysis requires the use of models in which the main parameters with a bearing on deposit economics are considered as stochastic variables. The proposed methodology simplifies the calculations and easily integrates the different sources of uncertainty.
F. G. Bastante
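A bare-bones Monte Carlo sketch of the price/cost part of such a risk analysis: price and cost paths drift randomly, annual cash flows are discounted to an NPV distribution, and percentiles summarise the economic risk. The production schedule, grade, recovery and all distributional parameters are illustrative placeholders; the geological uncertainty that the paper handles with geostatistical simulations would enter as additional grade realisations.

```python
import numpy as np

# Monte Carlo over stochastic price and cost paths; all numbers are illustrative
rng = np.random.default_rng(5)
years, n_sims, rate = 10, 10_000, 0.08
tonnes = np.full(years, 1.2e6)          # ore processed per year (t)
grade, recovery = 1.8, 0.90             # average grade (g/t) and metallurgical recovery

npv = np.zeros(n_sims)
for s in range(n_sims):
    price = 30.0 * np.cumprod(1 + rng.normal(0.02, 0.10, years))   # $/g, random drift
    cost = 25.0 * np.cumprod(1 + rng.normal(0.03, 0.05, years))    # $/t mined and processed
    cashflow = tonnes * (grade * recovery * price - cost)
    npv[s] = np.sum(cashflow / (1 + rate) ** np.arange(1, years + 1))

print("P10/P50/P90 NPV ($M):", np.round(np.percentile(npv, [10, 50, 90]) / 1e6, 1))
```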

19.
We model multivariate hydrological risks in the case that at least one of the variables is extreme. Recently, Heffernan JE, Tawn JA (2004) A conditional approach for multivariate extremes. J R Stat Soc B 66(3):497–546 (hereafter called HT04) proposed a conditional multivariate extreme value model which applies to regions where not all variables are extreme and simultaneously identifies the type of extremal dependence, including negative dependence. In this paper we apply this modeling strategy and provide an application to multivariate observations of five rivers in two clearly distinct regions of Puerto Rico Island, for two different seasons each. This effective dimensionality of ten cannot be handled by the traditional models of multivariate extremes. The resulting fitted model, following the HT04 model and estimation strategies, is able to provide long-term estimates of extremes, conditional on whether or not other rivers are extreme. The model shows considerable flexibility to address the natural questions that arise in multivariate extreme value assessments. In the Puerto Rico five-river application, the model clearly groups the rivers into two regions, one of two rivers and another of three, which show strong relationships in the rainy season. This corresponds with the geographical distribution of the rivers.
Beatriz Vaz de Melo Mendes
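A two-variable sketch of the Heffernan–Tawn conditional model used here, under strong simplifications: margins are rank-transformed to Gumbel, and for the conditioning variable above a high threshold the relation Y ≈ a·x + x^b·Z is fitted with a Gaussian working likelihood. The data, threshold and starting values are synthetic, and the paper's ten-dimensional seasonal application is not reproduced.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import rankdata

rng = np.random.default_rng(6)
n = 5000
common = rng.gumbel(size=n)
x_raw = common + 0.3 * rng.gumbel(size=n)    # two positively dependent synthetic series
y_raw = common + 0.3 * rng.gumbel(size=n)

def to_gumbel(v):
    """Rank-based transformation to standard Gumbel margins."""
    u = rankdata(v) / (len(v) + 1.0)
    return -np.log(-np.log(u))

x, y = to_gumbel(x_raw), to_gumbel(y_raw)
u = np.quantile(x, 0.95)                     # conditioning threshold
xe, ye = x[x > u], y[x > u]

def negloglik(theta):
    """Gaussian working likelihood for Y | X = x with mean a*x + x**b * mu and sd sigma * x**b."""
    a, b, mu, lnsig = theta
    s = np.exp(lnsig) * xe**b
    m = a * xe + xe**b * mu
    return np.sum(np.log(s) + 0.5 * ((ye - m) / s) ** 2)

fit = minimize(negloglik, x0=[0.5, 0.2, 0.0, 0.0],
               bounds=[(-1, 1), (None, 1), (None, None), (None, None)])
a, b = fit.x[:2]
print(f"a~{a:.2f}, b~{b:.2f}  (a close to 1 indicates asymptotic dependence)")
```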

20.
Bayesian modelling of health risks in relation to environmental exposures offers advantages over conventional (non-Bayesian) modelling approaches. We report an example using research into whether, after controlling for different confounders, air pollution (NOx) has a significant effect on coronary heart disease mortality, estimating the relative risk associated with different levels of exposure. We use small area data from Sheffield, England, and describe how the data were assembled. We compare the results obtained using a generalized (Poisson) log-linear model with adjustment for overdispersion with the results obtained using a hierarchical (Poisson) log-linear model with spatial random effects. Both classes of models were fitted using a Bayesian approach. Including spatial random effects accounts for both overdispersion and spatial autocorrelation effects arising as a result of analysing data from small contiguous areas. The first modelling framework has been widely used, while the second provides a more rigorous model for hypothesis testing and risk estimation when data refer to small areas. When the models are fitted controlling only for the age and sex of the populations, the generalized log-linear model shows NOx effects are significant at all levels, whereas the hierarchical log-linear model with spatial random effects shows significant effects only at higher levels. We then adjust for deprivation and smoking prevalence. Uncertainty in the estimates of smoking prevalence, arising because the data are based on samples, was accounted for through errors-in-variables modelling. NOx effects are apparently significant at the two highest levels according to both modelling frameworks.
Paul Brindley
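A hedged sketch of a Bayesian Poisson log-linear disease-mapping model with exchangeable area-level random effects (PyMC, synthetic data). The spatially structured (CAR-type) random effects and the errors-in-variables treatment of smoking prevalence described in the abstract are deliberately omitted, and all variable names and values are illustrative.

```python
import numpy as np
import pymc as pm

# Synthetic small-area data: expected deaths (age/sex standardised) and two covariates
rng = np.random.default_rng(7)
n_areas = 100
expected = rng.uniform(5, 40, n_areas)          # expected CHD deaths per area
nox = rng.normal(0.0, 1.0, n_areas)             # standardised NOx exposure level
deprivation = rng.normal(0.0, 1.0, n_areas)
observed = rng.poisson(expected * np.exp(0.10 * nox + 0.20 * deprivation))

with pm.Model() as model:
    b0 = pm.Normal("intercept", 0.0, 1.0)
    b_nox = pm.Normal("b_nox", 0.0, 1.0)
    b_dep = pm.Normal("b_dep", 0.0, 1.0)
    sigma = pm.HalfNormal("sigma", 1.0)
    v = pm.Normal("area_effect", 0.0, sigma, shape=n_areas)   # unstructured overdispersion
    mu = expected * pm.math.exp(b0 + b_nox * nox + b_dep * deprivation + v)
    pm.Poisson("deaths", mu=mu, observed=observed)
    idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)

# Posterior mean of the NOx effect on the log relative-risk scale
print(idata.posterior["b_nox"].mean().item())
```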
