Similar Documents
20 similar documents found (search time: 31 ms)
1.
We examine the management of livestock diseases from the producers' perspective, incorporating information and incentive asymmetries between producers and regulators. Using a stochastic dynamic model, we examine responses to different policy options including indemnity payments, subsidies to report at-risk animals, monitoring, and regulatory approaches to decreasing infection risks when perverse incentives and multiple policies interact. This conceptual analysis illustrates the importance of designing efficient combinations of regulatory and incentive-based policies.
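The producer's reporting decision under an indemnity-plus-subsidy scheme can be sketched as a small stochastic dynamic program. The three-state herd model and all payoffs, probabilities and the discount factor below are illustrative placeholders, not the paper's calibration:

```python
import numpy as np

# States: 0 = healthy herd, 1 = at-risk animal present, 2 = infected/culled.
# Action in state 1: conceal (keep producing, risk infection) or report
# (collect indemnity + subsidy, restock healthy).  All numbers are
# illustrative placeholders, not a calibration from the paper.
REVENUE, INDEMNITY, LOSS_INFECTED = 10.0, 6.0, 15.0
P_RISK, P_INFECT, BETA = 0.1, 0.4, 0.95

def value_iteration(subsidy, tol=1e-8):
    V = np.zeros(3)
    while True:
        conceal = REVENUE + BETA * ((1 - P_INFECT) * V[0] + P_INFECT * V[2])
        report = INDEMNITY + subsidy + BETA * V[0]
        v_new = np.array([
            REVENUE + BETA * ((1 - P_RISK) * V[0] + P_RISK * V[1]),
            max(conceal, report),
            -LOSS_INFECTED + BETA * V[0],
        ])
        if np.max(np.abs(v_new - V)) < tol:
            return v_new, report >= conceal
        V = v_new

V, reports = value_iteration(subsidy=1.0)
print("producer reports at-risk animals:", reports)
```

With these invented numbers reporting dominates concealment; lowering the indemnity sufficiently flips the optimal action to concealment, which is the kind of perverse incentive the abstract refers to.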
Ram Ranjan

2.
Wind in Ireland: long memory or seasonal effect?   Cited by: 1 (self: 1, other: 0)
Since Haslett and Raftery's 1989 paper "Space-Time Modelling with Long-Memory Dependence: Assessing Ireland's Wind Power Resource", modelling meteorological time series with long-memory processes, in particular the ARFIMA model, has become very common. Haslett and Raftery fitted an ARFIMA model to Irish daily wind speeds. In this paper, we try to reproduce Haslett and Raftery's results (focusing on the dynamics of the wind process, not on cross-correlation and spatial dependence), and show that an ARFIMA model does not properly capture the behaviour of the series (in the "Modelling daily windspeed in Ireland" section). Indeed, the series shows a periodic behaviour that is not taken into account by the ARFIMA model. Removing this periodic behaviour does not resolve the problem either, so we fit a GARMA model that takes both seasonality and long memory into account (in the "Seasonality and long memory using GARMA models" section). Having fitted a GARMA process to the Irish daily data, we show that these models can also be used to model Dutch hourly data.
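The long-memory ingredient of ARFIMA, fractional differencing by (1 − B)^d, is easy to sketch. The self-contained code below (not the authors' implementation) builds the binomial filter weights, fractionally integrates white noise to produce an ARFIMA(0, d, 0) sample, and verifies that differencing by the same d recovers the noise:

```python
import numpy as np

def frac_diff_weights(d, n):
    """First n binomial weights of the filter (1 - B)^d, with w_0 = 1."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def frac_diff(x, d):
    """Apply (1 - B)^d to the series x (causal truncated convolution)."""
    w = frac_diff_weights(d, len(x))
    return np.array([np.dot(w[:t + 1], x[t::-1]) for t in range(len(x))])

# ARFIMA(0, d, 0) sample with d = 0.3: fractionally integrate white noise,
# then check that differencing by the same d gives the noise back.
rng = np.random.default_rng(0)
eps = rng.standard_normal(500)
long_memory = frac_diff(eps, -0.3)       # (1 - B)^{-0.3} eps
recovered = frac_diff(long_memory, 0.3)  # (1 - B)^{0.3} applied back
print(np.allclose(recovered, eps))
```

The round trip is exact (up to rounding) because the truncated causal filters for +d and −d compose to the identity on finite data.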
Arthur Charpentier

3.
This paper estimates the expected annual impacts of the Pink Hibiscus Mealybug infestation on the economies of Florida and the rest of the United States. The approach involves a Markov chain analysis wherein both short-run and long-run expected damages from infestation are calculated. Use is made of the CLIMEX model, which predicts the potential pest-establishment regions in the US. While predictions based upon the CLIMEX model extend the scope of damages beyond Florida, the damages depend significantly upon the rate of arrival and detection of species in those regions. Damages are significantly higher when a longer time horizon is considered. When nursery owners bear the full cost of quarantines in the form of lost sales and treatment costs of infected plants, the cost-effectiveness of quarantines as a regulatory tool is diminished. The extent of damages is determined by the long-run propensity of the system, in terms of the fraction of time spent in the possible 'states' of infestation and control, rather than by the annual value of crops that could be potential hosts to the pest.
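The long-run propensity argument can be illustrated with a toy Markov chain: the stationary distribution gives the fraction of time spent in each infestation "state", and expected annual damage follows. The states, transition probabilities and damage figures below are invented for illustration, not taken from the paper:

```python
import numpy as np

# Toy 3-state infestation chain: 0 = pest-free, 1 = infested, 2 = under
# quarantine/control.  Transition probabilities and damages are invented.
P = np.array([[0.90, 0.10, 0.00],
              [0.00, 0.70, 0.30],
              [0.50, 0.05, 0.45]])
damage = np.array([0.0, 120.0, 40.0])   # annual damage per state ($M, hypothetical)

# Long-run fraction of time in each state = stationary distribution (pi P = pi),
# i.e. the left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi /= pi.sum()

expected_annual_damage = float(pi @ damage)
print(pi.round(3), round(expected_annual_damage, 2))
```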
Ram Ranjan. Phone: +1-352-3921881, Fax: +1-352-3929898

4.
The vesiculation of magma during the 1983 eruption of Miyakejima Volcano, Japan, is discussed based on systematic investigations of water content, vesicularity, and bubble size distribution for the products. The eruption is characterized by simultaneous lava effusion and explosive sub-plinian ('dry') eruptions with phreatomagmatic ('wet') explosions. The magmas are homogeneous in composition (basaltic andesite) and in initial water content (H2O = 3.9±0.9 wt%), and residual groundmass water contents for all eruption styles are low (H2O < 0.4 wt%), suggestive of extensive dehydration of magma. For the scoria erupted during simultaneous 'dry' and 'wet' explosive eruptions, an inverse correlation was observed between vesicularity and residual water content. This relation can be explained by equilibrium exsolution and expansion of ca. 0.3 wt% H2O at shallow level with different times of quenching, and suggests that each scoria with different vesicularity, quenched at a different time, provides a snapshot of the vesiculation process near the point of fragmentation. The bubble size distribution (BSD) varies systematically with vesicularity, and total bubble number density reaches a maximum value at vesicularity Φ ∼ 0.5. At Φ ∼ 0.5, a large number of bubbles are connected with each other, and the average thickness of bubble walls reaches the minimum value below which they would rupture. These facts suggest that vesiculation advanced by nucleation and growth of bubbles when Φ < 0.5, and then by expansion of large bubbles with coalescence of small ones for Φ > 0.5, when bubble connection becomes effective.
Low vesicularity and low residual water content of lava and spatter (Φ < 0.1, H2O < 0.1 wt%), and the systematic decrease in bubble number density from scoria through spatter to lava with decreasing vesicularity, suggest that effusive eruption is a consequence of complete degassing by bubble coalescence and separation from magma at shallow levels when magma ascent rate is slow.
T. Shimano

5.
An approach to the simulation of spatial random fields is proposed. The target random field is specified by its covariance function, which need not be homogeneous or Gaussian. The technique provided is based on an approximate Karhunen–Loève expansion of spatial random fields which can be readily realized. Such an approximate representation is obtained from a correction to the Rayleigh–Ritz method based on the dual Riesz basis theory. The resulting numerical projection procedure improves the Rayleigh–Ritz algorithm in the approximation of second-order random fields. Simulations are developed to illustrate the convergence and accuracy of the method presented.
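For a field discretized on a grid, a Karhunen–Loève simulation reduces to an eigendecomposition of the sampled covariance matrix. The sketch below uses a plain eigendecomposition (not the paper's Rayleigh–Ritz correction) and an arbitrary nonstationary covariance chosen purely for illustration:

```python
import numpy as np

def cov(s, t):
    # Illustrative nonstationary covariance: Brownian-motion factor min(s,t)
    # damped by a squared exponential; any valid covariance function works.
    return np.minimum(s, t) * np.exp(-0.5 * (s - t) ** 2)

n = 200
grid = np.linspace(0.01, 1.0, n)
S, T = np.meshgrid(grid, grid, indexing="ij")
C = cov(S, T)

lam, phi = np.linalg.eigh(C)     # discrete eigenvalues / eigenfunctions
lam = np.clip(lam, 0.0, None)    # remove tiny negative round-off

# One realization: X = sum_k sqrt(lam_k) * xi_k * phi_k, xi_k ~ N(0, 1).
rng = np.random.default_rng(1)
field = phi @ (np.sqrt(lam) * rng.standard_normal(n))

recon_err = np.abs(phi @ (lam[:, None] * phi.T) - C).max()
print(field.shape, recon_err)
```

By construction the simulated field is Gaussian with covariance C; non-Gaussian targets need a different choice of the uncorrelated coefficients xi_k.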
J. C. Ruiz-Molina

6.
Advances in computing technologies in recent decades have provided a means of generating and performing highly sophisticated computational simulations of electromagnetic phenomena. In particular, just after the turn of the twenty-first century, improvements to computing infrastructures provided for the first time the opportunity to conduct advanced, high-resolution three-dimensional full-vector Maxwell’s equations investigations of electromagnetic propagation throughout the global Earth-ionosphere spherical volume. These models, based on the finite-difference time-domain (FDTD) method, are capable of including such details as the Earth’s topography and bathymetry, as well as arbitrary horizontal/vertical geometrical and electrical inhomogeneities and anisotropies of the ionosphere, lithosphere, and oceans. Studies at this level of detail simply are not achievable using analytical methods. The goal of this paper is to provide an historical overview and future prospectus of global FDTD computational research for both natural and man-made electromagnetic phenomena around the world. Current and future applications of global FDTD models relating to lightning sources and radiation, Schumann resonances, hypothesized earthquake precursors, remote sensing, and space weather are discussed.
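The FDTD update pattern at the heart of such models can be shown in one dimension. This toy Cartesian vacuum sketch (normalized units, invented grid sizes; real Earth-ionosphere codes work on a 3-D spherical grid with material parameters) leapfrogs the E and H fields in time:

```python
import numpy as np

# Minimal 1-D FDTD (Yee) sketch in normalized vacuum units: Ez and Hy are
# staggered in space and leapfrogged in time with Courant number 0.5.
nx, nt = 400, 600
courant = 0.5
Ez = np.zeros(nx)          # electric field at integer grid points
Hy = np.zeros(nx - 1)      # magnetic field at half-integer grid points

for n in range(nt):
    Hy += courant * (Ez[1:] - Ez[:-1])           # update H from curl of E
    Ez[1:-1] += courant * (Hy[1:] - Hy[:-1])     # update E from curl of H
    Ez[nx // 4] += np.exp(-((n - 40) / 12.0) ** 2)   # soft Gaussian source

print(round(float(np.abs(Ez).max()), 4))
```

The fixed end values of Ez act as perfectly conducting boundaries; global models instead close the grid on itself around the sphere.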
Jamesina J. Simpson

7.
The remediation of sites contaminated with unexploded ordnance (UXO) remains an area of intense focus for the Department of Defense. Under the sponsorship of SERDP, data fusion techniques are being developed to enhance wide-area assessment UXO remediation efforts, and a data fusion framework is being created to provide a cohesive data management and decision-making utility that allows for more efficient expenditure of time, labor and resources. An important first step in this work is the development of feature extraction utilities and feature probability density maps for eventual input to data fusion algorithms; these make it possible to fuse estimates of data quality, UXO-related features, non-UXO backgrounds, and correlations among independent data streams. Utilizing data acquired during ESTCP's Wide-Area Assessment Pilot Program, the results presented here demonstrate the feasibility of automated feature extraction from light detection and ranging, orthophotography, and helicopter magnetometry wide-area assessment survey data acquired at the Pueblo Precision Bombing Range #2. These data were imported and registered to a common survey map grid, and UXO-related features were extracted and used to construct site-wide probability density maps that are well suited as input to higher-level data fusion algorithms. Preliminary combination of feature maps from the various data sources yielded maps for the Pueblo site that offered a more accurate UXO assessment than any one data source alone.
Susan L. Rose-Pehrsson

8.
The heat of the Earth derives from internal and external sources. A heat balance shows that most of the heat provided by external sources is re-emitted by long-wavelength heat radiation and that the dominant internal sources are original heat and heat generated by decay of unstable radioactive isotopes. Understanding of the thermal regime of the Earth requires appreciation of properties and mechanisms for heat generation, storage, and transport. Both experimental and indirect methods are available for inferring the corresponding rock properties. Heat conduction is the dominant transport process in the Earth’s crust, except for settings where appreciable fluid flow provides a mechanism for heat advection. For most crustal and mantle rocks, heat radiation becomes significant only at temperatures above 1200°C.
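For the conduction-dominated crust, the steady-state geotherm with uniform radiogenic heat production follows from k d²T/dz² + A = 0. The sketch below evaluates the textbook solution with representative, illustrative crustal values (not figures taken from the paper):

```python
# Steady-state conductive geotherm with uniform radiogenic heat production A:
#   k * d2T/dz2 + A = 0  =>  T(z) = T0 + (q0 / k) * z - A * z**2 / (2 * k)
# Values below are representative, illustrative crustal numbers.
k = 2.5       # thermal conductivity (W m^-1 K^-1)
A = 1.0e-6    # radiogenic heat production (W m^-3)
q0 = 0.065    # surface heat flow (W m^-2)
T0 = 10.0     # surface temperature (deg C)

def geotherm(z):
    """Temperature (deg C) at depth z (m)."""
    return T0 + (q0 / k) * z - A * z * z / (2.0 * k)

for z_km in (10, 20, 30):
    print(f"{z_km} km: {geotherm(z_km * 1e3):.0f} C")
```

The quadratic term is the contribution of distributed radioactive decay; with these numbers the 30 km temperature stays well below the ~1200°C threshold at which heat radiation becomes significant.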
Christoph Clauser

9.
Two different models, a Physical Model and a Neural Net (NN), are used for the derivation of the Photosynthetically Available Radiation (PAR) from METEOSAT data in the German Bight; advantages and disadvantages of both models are discussed. The use of a NN for the derivation of PAR is preferable to the Physical Model because, by construction, a NN can account for the various processes determining PAR at the surface much better than a non-statistical model relying on averaged relations.
Kathrin Schiller

10.
Recently, Batabyal and Nijkamp (Environ Econ Policy Stud 7:39–51, 2005) used a theoretical model of antibiotic use to study the relative merits of interventionist (antibiotics) and non-interventionist (no antibiotics) treatment options. A key assumption in their paper is that the default treatment option is the interventionist option. Because there are several instances in which this assumption is invalid, in this paper we suppose that the default treatment option is the non-interventionist option. Specifically, we first derive the long-run average cost of treating a common infection such as acute otitis media (AOM). Next, we show that there is a particular tolerance level such that, when a physician uses it to determine when to administer the non-antibiotic medicine, the long-run average cost of treating the infection under study is minimized.
Amitrajeet A. Batabyal

11.
Hazard maps are considered essential tools in the communication of volcanic risk between scientists, the local authorities and the public. This study investigates the efficacy of such maps for the volcanic island of Montserrat in the West Indies using both quantitative and qualitative research techniques. Normal plan-view maps, which have been used on the island over the last 10 years of the crisis, are evaluated against specially produced three-dimensional (3D) maps and perspective photographs. Thirty-two demographically representative respondents of mixed background, sex, education and location were interviewed and asked to complete a range of identification and orientation tasks on the maps and photographs. The overall results show that ordinary people have problems interpreting their environment as a mapped representation. We found that respondents' ability to locate and orientate themselves, as well as to convey information relating to volcanic hazards, was improved when using aerial photographs rather than traditional plan-view contour maps. There was a slight improvement with the 3D maps, especially in terms of topographic recognition. However, the most striking increase in effectiveness was found with the perspective photographs, which enabled people to identify features and their orientation much more readily. For Montserrat it appears that well-labelled aerial and perspective photographs are the most effective geo-spatial method of communicating volcanic risks.
Katharine Haynes

12.
Ocean/ice interaction at the base of deep-drafted Antarctic ice shelves modifies the physical properties of inflowing shelf waters to form Ice Shelf Water (ISW). In contrast to the conditions at the atmosphere/ocean interface, the increased hydrostatic pressure at the glacial base causes gases embedded in the ice to dissolve completely after being released by melting. Helium and neon, with their extremely low solubilities, become supersaturated in glacial meltwater by more than 1000%. At the continental slope in front of the large Antarctic caverns, ISW mixes with ambient waters to form different precursors of Antarctic Bottom Water. A regional ocean circulation model, which uses an explicit formulation of the ocean/ice shelf interaction to describe for the first time the input of noble gases to the Southern Ocean, is presented. The results reveal a long-term variability of the basal mass loss solely controlled by the interaction between waters of the continental shelf and the ice shelf cavern. Modeled helium and neon supersaturations at the Filchner–Ronne Ice Shelf front show a "low-pass" filtering of the inflowing signal due to cavern processes. On circumpolar scales, the simulated helium and neon distributions allow us to quantify the ISW contribution to bottom water, which spreads with the coastal current connecting the major formation sites in the Ross and Weddell Seas.
Christian B. Rodehacke

13.
Morphological changes in coastal areas, especially in river estuaries, are of high interest in many parts of the world. Satellite data from both optical and radar sensors can help to monitor and investigate these changes. Data from both kinds of sensors have been available for up to 30 years now, allowing examinations over large timescales, while high-resolution sensors developed within the last decade allow increased accuracy. The creation of digital elevation models (DEMs) of, for example, the Wadden Sea from a series of satellite images is thus already possible. ENVISAT, successfully launched on March 1, 2002, continues the line of higher-resolution synthetic aperture radar (SAR) imaging sensors with its ASAR instrument, and now also offers several polarization modes for better separation of land and water areas. This article gives an overview of sensors and algorithms for waterline determination, as well as several applications. Both optical and SAR images are considered. Applications include morphodynamic monitoring studies and DEM generation.
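A minimal waterline-determination sketch: threshold the image into water and land, then trace the boundary. The code below applies Otsu's histogram threshold to a synthetic two-class image; operational SAR waterline algorithms are considerably more involved (speckle filtering, edge refinement), so this only illustrates the separation step:

```python
import numpy as np

# Synthetic "backscatter" image: dark water on the left, bright land to the
# right of column 30, plus noise.  Everything here is invented test data.
rng = np.random.default_rng(2)
nrow, ncol = 64, 64
truth = np.arange(ncol) > 30
img = np.where(truth, 0.7, 0.2) + 0.05 * rng.standard_normal((nrow, ncol))

def otsu(values, nbins=256):
    """Threshold maximizing between-class variance (Otsu's method)."""
    hist, edges = np.histogram(values, bins=nbins)
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)               # class-0 probability mass
    m0 = np.cumsum(p * centers)     # class-0 cumulative mean mass
    mt = m0[-1]                     # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        var_between = (mt * w0 - m0) ** 2 / (w0 * (1 - w0))
    return centers[np.nanargmax(var_between)]

thr = otsu(img.ravel())
land = img > thr
waterline_cols = land.argmax(axis=1)   # first land column in each row
print(round(float(thr), 3), np.median(waterline_cols))
```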
Andreas Niedermeier

14.
The paper focuses on the development of reservoir operating rules for dry and rainfall events, and their implementation in the case of the Ghézala dam in northern Tunisia (characterized by a Mediterranean climate). Rainfall events are defined in terms of depth and duration, which are correlated with each other. A depth analysis per event is performed, conditioned on the event duration. The gamma distribution provides a good fit to depth per event, especially for events lasting at least 6 days. The event duration fits a geometric distribution, whereas the dry events during the rainy season fit a negative binomial distribution. The climatic cycle length is fitted to a gamma distribution. On this basis, many 50-year synthetic event series were generated. Every synthetic streamflow sequence obtained from the synthetic rainfall sequences, as well as the one derived from the historic rainfall event time series, was optimized and optimal decisions were formulated. These decisions were assessed by means of multiple regression analysis to estimate the relation between the optimal decision at every stage (dry or rainfall event) and other system variables. Optimal rules, which have a linear form, were derived for predetermined useful storage intervals and depend on storage, inflows and downstream demand at dry or rainfall event t. The range of t is 1–13 days (rainfall events) and 1–57 days (dry events). The rules were satisfactory for every predetermined useful storage interval. The simulated dam performance generated by the operating rules was compared with the deterministic optimum operation and the historical operation. Also included is a comparison of the implicit stochastic optimization-based operation policy per event during the water years 1985–2002.
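The regression step that turns per-event optimal decisions into a linear rule can be sketched as an ordinary least-squares fit of release against storage, inflow and demand. The synthetic data below merely stand in for the optimization output; coefficients, units and distributions are invented:

```python
import numpy as np

# Regress per-event "optimal" releases on storage S, inflow I and demand D to
# obtain a linear operating rule R = a*S + b*I + c*D + e.  The synthetic data
# below stand in for the implicit stochastic optimization output.
rng = np.random.default_rng(3)
n = 400
S = rng.uniform(5, 50, n)      # useful storage at the event (hm3)
I = rng.gamma(2.0, 2.0, n)     # event inflow (hm3)
D = rng.uniform(1, 8, n)       # downstream demand over the event (hm3)
R_opt = 0.15 * S + 0.40 * I + 0.80 * D + rng.normal(0, 0.3, n)

X = np.column_stack([S, I, D, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, R_opt, rcond=None)
print("rule coefficients (a, b, c, e):", coef.round(2))
```

In the paper's scheme a separate rule of this form is fitted for each predetermined useful storage interval and event type (dry or rainfall).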
Fethi Lebdi

15.
Extreme atmospheric events are intimately related to the statistics of atmospheric turbulent velocities. These, in turn, exhibit multifractal scaling, which determines the nature of the asymptotic behavior of the velocities and whose parameter evaluation is therefore currently of great interest. We combine singular value decomposition techniques and wavelet transform analysis to generalize the multifractal formalism to vector-valued random fields. The so-called Tensorial Wavelet Transform Modulus Maxima (TWTMM) method is calibrated on synthetic self-similar 2D vector-valued multifractal measures and monofractal 3D vector-valued fractional Brownian fields. We report the results of applying the TWTMM method to turbulent velocity and vorticity fields generated by direct numerical simulations of the incompressible Navier–Stokes equations. This study reveals an intimate relationship between the singularity spectra of these two vector fields, which are found to be significantly more intermittent than previously estimated from longitudinal and transverse velocity increment statistics.
Alain Arneodo

16.
We model multivariate hydrological risks in the case where at least one of the variables is extreme. Recently, Heffernan and Tawn (A conditional approach for multivariate extremes. J R Stat Soc B 66(3):497–546, 2004; hereafter HT04) proposed a conditional multivariate extreme value model that applies to regions where not all variables are extreme and that simultaneously identifies the type of extremal dependence, including negative dependence. In this paper we apply this modelling strategy to multivariate observations of five rivers in two clearly distinct regions of the island of Puerto Rico, for two different seasons each. This effective dimensionality of ten cannot be handled by traditional models of multivariate extremes. The resulting fitted model, following the HT04 model and estimation strategy, is able to provide long-term estimates of extremes, conditional on whether or not other rivers are extreme. The model shows considerable flexibility in addressing the natural questions that arise in multivariate extreme value assessments. In the Puerto Rico five-river application, the model clearly groups the rivers into two regions, one of two rivers and another of three rivers, which show strong relationships in the rainy season. This corresponds with the geographical distribution of the rivers.
Beatriz Vaz de Melo Mendes

17.
The concepts of system load and capacity are pivotal in risk analysis. The complexity of risk analysis increases when the input parameters are either stochastic (aleatory uncertainty) or missing (epistemic uncertainty). The aleatory and epistemic uncertainties related to input parameters are handled through simulation-based parametric and non-parametric probabilistic techniques. The complexity increases further when the empirical relationships are not strong enough to derive physically based models. In this paper, ordered weighted averaging (OWA) operators are proposed to estimate the system load. The risk of failure is estimated by assuming a normally distributed reliability index. The proposed methodology for risk analysis is illustrated using an example with nine input parameters. Sensitivity analyses identified that the risk of failure is dominated by the decision-maker's attitude used to generate the OWA weights, by missing input parameters, and by the system capacity.
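OWA aggregation itself is compact: sort the inputs in decreasing order, then apply weights that encode the decision-maker's attitude. Generating the weights from a RIM quantifier Q(r) = r^α is an assumption of this sketch (the paper does not prescribe this generator), with α < 1 giving an optimistic attitude (orness > 0.5) and α > 1 a pessimistic one:

```python
import numpy as np

def owa_weights(n, alpha):
    """Weights from the RIM quantifier Q(r) = r**alpha: w_i = Q(i/n) - Q((i-1)/n)."""
    r = np.arange(n + 1) / n
    return np.diff(r ** alpha)

def owa(values, weights):
    """Aggregate: weights act on the values sorted in decreasing order."""
    return float(np.dot(np.sort(values)[::-1], weights))

def orness(weights):
    """Degree of optimism of a weight vector (1 = max operator, 0 = min)."""
    n = len(weights)
    return float(np.dot(np.arange(n - 1, -1, -1), weights) / (n - 1))

loads = np.array([4.1, 7.3, 5.6, 6.2, 3.8])   # illustrative input parameters
for alpha in (0.5, 1.0, 2.0):
    w = owa_weights(len(loads), alpha)
    print(alpha, round(owa(loads, w), 3), round(orness(w), 3))
```

At α = 1 the operator reduces to the plain arithmetic mean (orness 0.5), which is why the sensitivity of the estimated load to α mirrors the sensitivity to the decision-maker's attitude noted in the abstract.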
Rehan Sadiq (corresponding author)

18.
Storm-related sea level variations 1958–2002 along the North Sea coast from a high-resolution numerical hindcast are investigated and compared to the results of earlier studies. Considerable variations were found from year to year and over the entire period. The large-scale pattern of these variations is consistent with that derived from previous studies, while the magnitudes of the long-term trends differ. The latter is attributed to different analysis periods, improvements in the atmospheric forcing, and the enhanced spatial resolution of the numerical simulation. It is shown that the different analysis periods, in particular, represent an issue as the increase in storm-related sea levels was found to be weaker over the last few years that have not been included in earlier studies. These changes are consistent with observed changes of the storm climate over the North Sea. It is also shown that observed and hindcast trends may differ significantly. While the latter are in agreement with observed changes in the storm climate, it may be concluded that observed sea level changes along the North Sea coast comprise a considerable fraction that cannot be attributed to changes in the large-scale atmospheric circulation.
Ralf Weisse

19.
We report an analysis of the mechanisms responsible for interannual variability in the Greenland–Iceland–Norwegian (GIN) Seas in a control integration of the HadCM3 coupled climate model. Interannual variability in sea surface temperature (SST) and sea surface salinity (SSS) is dominated by a quasi-periodic ∼7-year signal. Analyses show that the mechanism involves a competition between convection and advection. Advection carries cold, fresh, Arctic water over warm, salty, Atlantic water, while convection periodically mixes these two water masses vertically, raising SST. Convection is able to raise SST because of the presence of a subsurface temperature maximum. The GIN Seas convection in HadCM3 is forced by wind stress anomalies related to the North Atlantic Oscillation (NAO). The consequent SST anomalies feedback positively to force the atmosphere, resulting in a weak spectral peak (at ∼7 years) in GIN Seas sea level pressure. Although there is no evidence of a similar oscillation in reality, key aspects of the simulated mechanism may be relevant to understanding variability in the real GIN Seas. In particular, the potential for increases in convection to raise SST offers a possible new explanation for increases in SST that occurred between the 1960s and the late 1980s/early 1990s. These SST increases may have contributed to the observed sea-ice retreat. In addition, a positive feedback between GIN Seas SST and the atmosphere could contribute to the persistence of the NAO, potentially helping to explain its red spectrum or recent northeastward shift.
Sonia R. Gamiz-Fortis

20.
Over the past four or five decades many advances have been made in earthquake ground-motion prediction and a variety of procedures have been proposed. Some of these procedures are based on explicit physical models of the earthquake source, travel-path and recording site while others lack a strong physical basis and seek only to replicate observations. In addition, there are a number of hybrid methods that seek to combine benefits of different approaches. The various techniques proposed have their adherents and some of them are extensively used to estimate ground motions for engineering design purposes and in seismic hazard research. These methods all have their own advantages and limitations that are not often discussed by their proponents. The purposes of this article are to: summarise existing methods and the most important references, provide a family tree showing the connections between different methods and, most importantly, to discuss the advantages and disadvantages of each method.
John Douglas


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号