Similar Articles
20 similar articles found.
1.
The remediation of sites contaminated with unexploded ordnance (UXO) remains an area of intense focus for the Department of Defense. Under the sponsorship of SERDP, data fusion techniques are being developed to enhance wide-area assessment UXO remediation efforts, and a data fusion framework is being created to provide a cohesive data management and decision-making utility that allows more efficient expenditure of time, labor, and resources. An important first step in this work is the development of feature extraction utilities and feature probability density maps for eventual input to data fusion algorithms, making possible the fusion of data-quality estimates, UXO-related features, non-UXO backgrounds, and correlations among independent data streams. Utilizing data acquired during ESTCP’s Wide-Area Assessment Pilot Program, the results presented here successfully demonstrate the feasibility of automated feature extraction from light detection and ranging, orthophotography, and helicopter magnetometry wide-area assessment survey data acquired at the Pueblo Precision Bombing Range #2. These data were imported and registered to a common survey map grid, and UXO-related features were extracted and used to construct site-wide probability density maps that are well suited for input to higher-level data fusion algorithms. Preliminary combination of feature maps from the various data sources yielded maps for the Pueblo site that offered a more accurate UXO assessment than any one data source alone.
Susan L. Rose-Pehrsson
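As a rough illustration of the map-level combination described above, the sketch below (Python with NumPy) fuses two hypothetical co-registered feature probability maps, say one from magnetometry anomaly density and one from crater features in the orthophotography, on a common grid. The grids, the weights, and the simple weighted-average fusion rule are placeholders, not the algorithm used in the study.

import numpy as np

# Hypothetical co-registered 100 x 100 feature probability maps on a common grid.
rng = np.random.default_rng(0)
mag_map = rng.random((100, 100))      # e.g. normalized magnetometry anomaly density
crater_map = rng.random((100, 100))   # e.g. normalized crater-feature density

def fuse_maps(maps, weights):
    """Weighted-average fusion of co-registered probability maps, renormalized
    to [0, 1]. A deliberately simple placeholder for a higher-level fusion rule."""
    maps = np.asarray(maps, dtype=float)
    weights = np.asarray(weights, dtype=float)
    fused = np.tensordot(weights / weights.sum(), maps, axes=1)
    return (fused - fused.min()) / (fused.max() - fused.min())

fused = fuse_maps([mag_map, crater_map], weights=[0.6, 0.4])
print("fused map shape:", fused.shape, "max:", fused.max())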

2.
Advanced discrimination methods and careful optimization of operational procedures are critical for efficient remediation of unexploded ordnance (UXO) contaminated sites. In this paper, we report on our experiences with a 200 acre magnetic survey that was collected and processed under production survey conditions at Chevallier Ranch, Montana. All anomalies with fitted moments above 0.05 Am² were excavated. During the survey the magnetic remanence metric was predicted but not used to guide the discrimination. The retrospective analysis presented here reveals that discrimination using remanence would have significantly reduced the total number of anomalies (with good dipolar fits) that needed to be excavated, from 524 to 290, while still recovering all 69 UXO. The false alarm rate (FAR = number of non-UXO items excavated divided by the number of UXO found) was reduced from 6.3 to 2.9. At a cut-off of 75% remanence, 77% of anomalies due to shrapnel and metallic debris and 64% of geological anomalies were rejected.

Geological anomalies due to variations in magnetite concentration introduced a significant human element into the interpretation process. Three different interpreters added a total of 305 additional anomalies that were not fit with a dipole model and which were later found to be non-UXO. Between 40 and 50% of anomalies picked by the two relatively inexperienced interpreters who analyzed the data turned out to be geology, compared to 14% for an experienced interpreter. Critical analysis of results, operator training, and feedback from the UXO technicians validating the anomalies are essential components for improving the quality and consistency of the anomaly interpretations. This is consistent with the tenets of Total Quality Management (TQM). We compare the actual FAR that resulted during the survey, when there was little feedback from UXO technician validation results, to a hypothetical result that could have been achieved had a constant feedback system been in place at the onset of operations. Feedback would have significantly reduced the number of geological anomalies and decreased the FAR from 10.7 to 4.0.

The hypothetical results presented here demonstrate the value of using TQM principles to guide the UXO remediation process. They further show that improvements in the efficiency and cost of UXO remediation require both technological advances and operational optimization of the technology when implemented in a production setting. Furthermore, by treating geophysical modeling and UXO validation as separate entities, both with respect to contracting and operational reporting, there is little incentive for the geophysicist to leave an anomaly off the dig-sheet: only negative consequences can result if that anomaly is later found to be a UXO. An incentive-based mechanism that rewards the geophysicist for reductions in follow-on costs would have strong potential to reduce the number of unnecessary excavations, and hence the total cost of the UXO remediation effort.
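For concreteness, a minimal sketch of the false-alarm-rate bookkeeping used above (FAR = non-UXO items excavated divided by UXO found). The helper function is hypothetical, and the counts are illustrative values chosen to be consistent with the FARs quoted in the abstract.

def false_alarm_rate(n_non_uxo_excavated, n_uxo_found):
    """FAR = number of non-UXO items excavated / number of UXO found."""
    return n_non_uxo_excavated / n_uxo_found

# Illustrative dig-list counts (69 UXO recovered in both cases).
print(round(false_alarm_rate(434, 69), 1))   # ~6.3: without remanence discrimination
print(round(false_alarm_rate(200, 69), 1))   # ~2.9: after a remanence cut-off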

3.
Site characterization activities at potential unexploded ordnance (UXO) sites rely on sparse sampling collected as geophysical surveys along strip transects. From these samples, the locations of target areas, those regions of the site where the geophysical anomaly density is significantly above the background density, must be identified. A target area detection approach using a hidden Markov model (HMM) is developed here. HMMs use stationary transition probabilities from one state to another for steps between adjacent locations, as well as the probability of any particular observation occurring given each possible underlying state. The approach developed here identifies the transition probabilities directly from the conceptual site model (CSM) created as part of the UXO site characterization process. A series of simulations examines the ability of the HMM approach to simultaneously determine the target area locations within each transect and to estimate the unknown anomaly intensity within the identified target area. The HMM results are compared to those obtained using a simpler target detection approach that considers the background anomaly density to be defined by a Poisson distribution and each location to be independent of any adjacent location. Results show that the HMM approach is capable of accurately identifying the target locations with limited false positive identifications when both the background and target intensities are known. The HMM approach is relatively robust to changes in the initial estimate of the target anomaly intensity and is capable of identifying target locations and the corresponding target anomaly intensity when this intensity is approximately 60% higher than the background intensity, at intensities that are representative of actual field sites. Application to data collected from a wide area assessment field site shows that the HMM approach identifies the area of the site with elevated anomaly intensity with few false positives. This field site application also shows that the HMM results are relatively robust to changes in the transect width.
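A minimal sketch of a two-state hidden Markov model of the kind described above: Poisson-distributed anomaly counts are observed along a transect, and a Viterbi decoding separates background from target windows. The transition probabilities, intensities, and simulated counts are invented for illustration; in the paper the transition probabilities come from the conceptual site model.

import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(1)

# Two hidden states: 0 = background, 1 = target area.
trans = np.array([[0.95, 0.05],      # stationary transition probabilities
                  [0.10, 0.90]])     # (taken from the CSM in the paper's approach)
start = np.array([0.9, 0.1])
rates = np.array([2.0, 3.2])         # Poisson anomaly intensities per window (target ~60% higher)

# Simulate a transect: background, then a target segment, then background again.
true_states = np.r_[np.zeros(40, int), np.ones(20, int), np.zeros(40, int)]
counts = rng.poisson(rates[true_states])

def viterbi_poisson(counts, start, trans, rates):
    """Most likely state sequence under a Poisson-emission HMM (log domain)."""
    n, k = len(counts), len(start)
    logemit = poisson.logpmf(counts[:, None], rates[None, :])
    delta = np.log(start) + logemit[0]
    back = np.zeros((n, k), int)
    for t in range(1, n):
        scores = delta[:, None] + np.log(trans)
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + logemit[t]
    states = np.zeros(n, int)
    states[-1] = delta.argmax()
    for t in range(n - 2, -1, -1):
        states[t] = back[t + 1, states[t + 1]]
    return states

decoded = viterbi_poisson(counts, start, trans, rates)
print("fraction of true target windows recovered:", (decoded[40:60] == 1).mean())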

4.
Morphological changes in coastal areas, especially in river estuaries, are of high interest in many parts of the world. Satellite data from both optical and radar sensors can help to monitor and investigate these changes. Data from both kinds of sensors, available for up to 30 years now, allow examination over large timescales, while high-resolution sensors developed within the last decade allow increased accuracy. The creation of digital elevation models (DEMs) of, for example, the Wadden Sea from a series of satellite images is thus already possible. ENVISAT, successfully launched on March 1, 2002, continues the line of higher-resolution synthetic aperture radar (SAR) imaging sensors with its ASAR instrument and now also offers several polarization modes for better separation of land and water areas. This article gives an overview of sensors and algorithms for waterline determination as well as several applications. Both optical and SAR images are considered. Applications include morphodynamic monitoring studies and DEM generation.
Andreas Niedermeier

5.
We report an analysis of the mechanisms responsible for interannual variability in the Greenland–Iceland–Norwegian (GIN) Seas in a control integration of the HadCM3 coupled climate model. Interannual variability in sea surface temperature (SST) and sea surface salinity (SSS) is dominated by a quasi-periodic ∼7-year signal. Analyses show that the mechanism involves a competition between convection and advection. Advection carries cold, fresh, Arctic water over warm, salty, Atlantic water, while convection periodically mixes these two water masses vertically, raising SST. Convection is able to raise SST because of the presence of a subsurface temperature maximum. The GIN Seas convection in HadCM3 is forced by wind stress anomalies related to the North Atlantic Oscillation (NAO). The consequent SST anomalies feed back positively to force the atmosphere, resulting in a weak spectral peak (at ∼7 years) in GIN Seas sea level pressure. Although there is no evidence of a similar oscillation in reality, key aspects of the simulated mechanism may be relevant to understanding variability in the real GIN Seas. In particular, the potential for increases in convection to raise SST offers a possible new explanation for increases in SST that occurred between the 1960s and the late 1980s/early 1990s. These SST increases may have contributed to the observed sea-ice retreat. In addition, a positive feedback between GIN Seas SST and the atmosphere could contribute to the persistence of the NAO, potentially helping to explain its red spectrum or recent northeastward shift.
Sonia R. Gamiz-Fortis

6.
Two different models, a Physical Model and a Neural Net (NN), are used for the derivation of the Photosynthetically Available Radiation (PAR) from METEOSAT data in the German Bight; advantages and disadvantages of both models are discussed. The NN should be preferred to the Physical Model for deriving PAR because, by construction, an NN can take the various processes determining PAR at the surface into account much better than a non-statistical model relying on averaged relations.
Kathrin Schiller

7.
Storm-related sea level variations along the North Sea coast over 1958–2002, derived from a high-resolution numerical hindcast, are investigated and compared to the results of earlier studies. Considerable variations were found from year to year and over the entire period. The large-scale pattern of these variations is consistent with that derived from previous studies, while the magnitudes of the long-term trends differ. The latter is attributed to different analysis periods, improvements in the atmospheric forcing, and the enhanced spatial resolution of the numerical simulation. It is shown that the different analysis periods in particular represent an issue, as the increase in storm-related sea levels was found to be weaker over the last few years, which have not been included in earlier studies. These changes are consistent with observed changes of the storm climate over the North Sea. It is also shown that observed and hindcast trends may differ significantly. While the latter are in agreement with observed changes in the storm climate, it may be concluded that observed sea level changes along the North Sea coast comprise a considerable fraction that cannot be attributed to changes in the large-scale atmospheric circulation.
Ralf Weisse

8.
Bayesian modelling of health risks in relation to environmental exposures offers advantages over conventional (non-Bayesian) modelling approaches. We report an example using research into whether, after controlling for different confounders, air pollution (NOx) has a significant effect on coronary heart disease mortality, estimating the relative risk associated with different levels of exposure. We use small-area data from Sheffield, England, and describe how the data were assembled. We compare the results obtained using a generalized (Poisson) log-linear model with adjustment for overdispersion with those obtained using a hierarchical (Poisson) log-linear model with spatial random effects. Both classes of model were fitted using a Bayesian approach. Including spatial random effects accounts for both the overdispersion and the spatial autocorrelation that arise when analysing data from small contiguous areas. The first modelling framework has been widely used, while the second provides a more rigorous model for hypothesis testing and risk estimation when data refer to small areas. When the models are fitted controlling only for the age and sex of the populations, the generalized log-linear model shows NOx effects that are significant at all levels, whereas the hierarchical log-linear model with spatial random effects shows significant effects only at higher levels. We then adjust for deprivation and smoking prevalence. Uncertainty in the estimates of smoking prevalence, arising because the data are based on samples, was accounted for through errors-in-variables modelling. NOx effects appear significant at the two highest exposure levels under both modelling frameworks.
Paul Brindley
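As a stripped-down sketch of the Bayesian Poisson log-linear setup described above, the code below fits a two-coefficient Poisson regression with a log-expected-count offset to synthetic small-area data using a random-walk Metropolis sampler. It omits the spatial random effects and the errors-in-variables treatment of the smoking covariate, so it illustrates the general framework rather than the models reported in the paper; all data and priors are assumptions.

import numpy as np

rng = np.random.default_rng(2)

# Synthetic small-area data: expected deaths E_i (from age/sex rates) and standardized NOx.
n = 100
E = rng.uniform(5, 50, n)
nox = rng.normal(0, 1, n)
beta_true = np.array([0.1, 0.2])           # intercept (log relative risk) and NOx effect
y = rng.poisson(E * np.exp(beta_true[0] + beta_true[1] * nox))
X = np.column_stack([np.ones(n), nox])

def log_post(beta):
    """Poisson log-likelihood with log(E) offset plus a vague N(0, 10^2) prior."""
    eta = np.log(E) + X @ beta
    return np.sum(y * eta - np.exp(eta)) - np.sum(beta**2) / (2 * 10**2)

# Random-walk Metropolis over the two regression coefficients.
beta, lp = np.zeros(2), log_post(np.zeros(2))
samples = []
for it in range(20000):
    prop = beta + rng.normal(0, 0.02, 2)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        beta, lp = prop, lp_prop
    if it >= 5000:                          # discard burn-in
        samples.append(beta.copy())

samples = np.array(samples)
print("posterior mean NOx log-RR:", samples[:, 1].mean().round(3))
print("95% credible interval:", np.percentile(samples[:, 1], [2.5, 97.5]).round(3))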

9.
The identification and characterization of target areas at former bombing ranges is the first step in investigating these sites for residual unexploded ordnance. Traditionally, magnetometer surveys along transects are used to identify areas with high densities of magnetic anomalies, which are likely former target areas. Combining magnetometer survey data with other data sources may reduce the amount of survey data required for site characterization, increasing characterization efficiency. Here, several techniques for incorporating secondary information into kriging estimates of magnetic anomaly density are investigated for a former bombing range located near Pueblo, Colorado. In particular, kriging with external drift, collocated ordinary cokriging, and simple kriging with local means (SKLM) are used to incorporate information from a secondary variable. The secondary variable consists of a grid of crater density values derived from a topographic light detection and ranging (LIDAR) analysis. The craters, which are clearly identifiable in the LIDAR data, were generated through munitions use at the site and are therefore related to the target locations. The results from this study indicate that including the secondary information in the kriging estimates benefits target area characterization and provides a means of elucidating target area details from only limited magnetometer transect data. For the Pueblo site, the use of SKLM with the crater density as a secondary variable and only limited magnetometer transect data provided results comparable to those obtained using much larger magnetometer transect data sets.
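A compact sketch of simple kriging with local means (SKLM) as described above: the local mean is taken from a linear regression of the primary variable (anomaly density) on the secondary variable (crater density), and the regression residuals are kriged with an assumed covariance model. The sample locations, the exponential covariance, and the regression form are hypothetical, not those of the Pueblo study.

import numpy as np

rng = np.random.default_rng(3)

# Hypothetical transect samples: coordinates, magnetic anomaly density (primary),
# and LIDAR-derived crater density (secondary, known exhaustively) at the same points.
pts = rng.uniform(0, 1000, (60, 2))
crater = rng.gamma(2.0, 1.0, 60)
anom = 5.0 + 3.0 * crater + rng.normal(0, 1.0, 60)     # primary loosely tied to secondary

# Local mean from a linear regression of the primary on the secondary variable.
A = np.column_stack([np.ones(60), crater])
coef, *_ = np.linalg.lstsq(A, anom, rcond=None)
resid = anom - A @ coef

def cov(h, sill=resid.var(), rng_par=300.0):
    """Assumed isotropic exponential covariance for the regression residuals."""
    return sill * np.exp(-h / rng_par)

def sklm_estimate(x0, crater0):
    """SKLM at location x0, where crater0 is the secondary value at x0."""
    d = np.linalg.norm(pts - x0, axis=1)
    C = cov(np.linalg.norm(pts[:, None] - pts[None, :], axis=2))
    C[np.diag_indices_from(C)] += 1e-6                 # numerical stabilization
    lam = np.linalg.solve(C, cov(d))                   # simple kriging weights
    local_mean = coef[0] + coef[1] * crater0
    return local_mean + lam @ resid

print("SKLM estimate at (500, 500):", round(float(sklm_estimate(np.array([500.0, 500.0]), 2.5)), 2))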

10.
We consider financial markets with agents exposed to external sources of risk caused, for example, by short-term climate events such as the South Pacific sea surface temperature anomalies widely known as El Niño. Since such risks cannot be hedged through investments on the capital market alone, we face a typical example of an incomplete financial market. In order to make this risk tradable, we use a financial market model in which an additional insurance asset provides another possibility of investment besides the usual capital market. Given one of the many possible market prices of risk, each agent can maximize his individual exponential utility from the income obtained from trading in the capital market, the additional security, and his risk-exposure function. Under the equilibrium market-clearing condition for the insurance security, the market price of risk is uniquely determined by a backward stochastic differential equation. We translate these stochastic equations via the Feynman–Kac formalism into semilinear parabolic partial differential equations, for which numerical simulation schemes are available. We choose two simple, qualitatively interesting models to describe sea surface temperature and, as three model agents with simple risk-exposure functions, an ENSO-risk-exposed fisher and farmer and a climate-risk-neutral bank. By simulating the expected appreciation price of risk trading, the optimal utility of the agents as a function of temperature, and their optimal investment in the risk-trading security, we obtain first insight into the dynamics of such a market in simple situations.
Peter Imkeller
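The backward-SDE and PDE machinery above is well beyond a short example, so the sketch below only illustrates the probabilistic side of the Feynman–Kac connection: a toy mean-reverting (Ornstein–Uhlenbeck) model for an SST anomaly is simulated by Euler–Maruyama, and a Monte Carlo expectation of a temperature-linked payoff is computed; Feynman–Kac identifies such expectations with solutions of associated parabolic PDEs. The model, its parameters, and the payoff are hypothetical.

import numpy as np

rng = np.random.default_rng(4)

# Toy Ornstein-Uhlenbeck model for an SST anomaly X_t (all parameters hypothetical):
#   dX_t = kappa * (mu - X_t) dt + sigma dW_t
kappa, mu, sigma = 1.5, 0.0, 0.8
T, n_steps, n_paths = 1.0, 250, 20000
dt = T / n_steps

x = np.full(n_paths, 0.5)                 # current anomaly of +0.5 degC
for _ in range(n_steps):
    x += kappa * (mu - x) * dt + sigma * np.sqrt(dt) * rng.normal(size=n_paths)

# Expected payoff of a simple temperature-linked security paying max(X_T - K, 0);
# by Feynman-Kac this expectation solves an associated semilinear-free (linear) parabolic PDE.
K = 1.0
price = np.maximum(x - K, 0.0).mean()
print("Monte Carlo expectation of the payoff:", round(float(price), 4))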

11.
We examine the management of livestock diseases from the producers’ perspective, incorporating information and incentive asymmetries between producers and regulators. Using a stochastic dynamic model, we examine responses to different policy options, including indemnity payments, subsidies to report at-risk animals, monitoring, and regulatory approaches to decreasing infection risks when perverse incentives and multiple policies interact. This conceptual analysis illustrates the importance of designing efficient combinations of regulatory and incentive-based policies.
Ram Ranjan

12.
An approach to the simulation of spatial random fields is proposed. The target random field is specified by its covariance function, which need not be homogeneous or Gaussian. The technique provided is based on an approximate Karhunen–Loève expansion of spatial random fields which can be readily realized. Such an approximate representation is obtained from a correction to the Rayleigh–Ritz method based on dual Riesz basis theory. The resulting numerical projection procedure improves on the Rayleigh–Ritz algorithm in the approximation of second-order random fields. Simulations are presented to illustrate the convergence and accuracy of the method.
J. C. Ruiz-Molina
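A minimal sketch of simulation via a truncated Karhunen–Loève expansion: the covariance function is evaluated on a grid, the resulting matrix is eigendecomposed, and realizations are built from the leading modes. This discrete eigen-decomposition is a crude stand-in for the paper's Rayleigh–Ritz correction based on dual Riesz bases, and the stationary squared-exponential covariance is used only for brevity; the method itself targets covariances that need not be homogeneous or Gaussian.

import numpy as np

rng = np.random.default_rng(5)

# Grid and an assumed (here stationary, squared-exponential) covariance function;
# any valid covariance, including nonstationary ones, could be plugged in.
t = np.linspace(0.0, 1.0, 200)
def cov(s, u, ell=0.1):
    return np.exp(-(s - u) ** 2 / (2 * ell ** 2))

C = cov(t[:, None], t[None, :])

# Discrete Karhunen-Loeve expansion: eigendecompose the covariance matrix and keep
# the leading modes explaining 99% of the variance.
vals, vecs = np.linalg.eigh(C)
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]
k = np.searchsorted(np.cumsum(vals) / vals.sum(), 0.99) + 1

xi = rng.normal(size=(k, 3))                                   # 3 independent realizations
fields = vecs[:, :k] @ (np.sqrt(vals[:k])[:, None] * xi)
print(f"kept {k} modes; simulated field shape: {fields.shape}")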

13.
The heat of the Earth derives from internal and external sources. A heat balance shows that most of the heat provided by external sources is re-emitted by long-wavelength heat radiation and that the dominant internal sources are original heat and heat generated by decay of unstable radioactive isotopes. Understanding of the thermal regime of the Earth requires appreciation of properties and mechanisms for heat generation, storage, and transport. Both experimental and indirect methods are available for inferring the corresponding rock properties. Heat conduction is the dominant transport process in the Earth’s crust, except for settings where appreciable fluid flow provides a mechanism for heat advection. For most crustal and mantle rocks, heat radiation becomes significant only at temperatures above 1200°C.
Christoph Clauser

14.
Over the past four or five decades many advances have been made in earthquake ground-motion prediction and a variety of procedures have been proposed. Some of these procedures are based on explicit physical models of the earthquake source, travel path, and recording site, while others lack a strong physical basis and seek only to replicate observations. In addition, there are a number of hybrid methods that seek to combine the benefits of different approaches. The various techniques proposed have their adherents, and some of them are extensively used to estimate ground motions for engineering design purposes and in seismic hazard research. These methods all have their own advantages and limitations, which are not often discussed by their proponents. The purposes of this article are to summarise existing methods and the most important references, to provide a family tree showing the connections between different methods, and, most importantly, to discuss the advantages and disadvantages of each method.
John Douglas

15.
The concepts of system load and capacity are pivotal in risk analysis. The complexity of risk analysis increases when the input parameters are stochastic (aleatory uncertainty), missing (epistemic uncertainty), or both. The aleatory and epistemic uncertainties in the input parameters are handled through simulation-based parametric and non-parametric probabilistic techniques. The complexity increases further when the empirical relationships are not strong enough to derive physically based models. In this paper, ordered weighted averaging (OWA) operators are proposed to estimate the system load. The risk of failure is estimated by assuming a normally distributed reliability index. The proposed methodology for risk analysis is illustrated using an example with nine input parameters. Sensitivity analyses identified that the risk of failure is dominated by the attitude of the decision-maker used to generate the OWA weights, by missing input parameters, and by the system capacity.
Rehan Sadiq (Corresponding author)
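A brief sketch of ordered weighted averaging (OWA) aggregation for a nine-parameter system load. The weights are generated from a RIM quantifier Q(p) = p**alpha, one common way to encode the decision-maker's attitude; the quantifier form, alpha values, and input values are assumptions, not the scheme or data used in the paper.

import numpy as np

def owa_weights(n, alpha):
    """Weights from an assumed RIM quantifier Q(p) = p**alpha:
    alpha < 1 optimistic, alpha = 1 neutral, alpha > 1 pessimistic attitude."""
    p = np.arange(n + 1) / n
    return np.diff(p ** alpha)

def owa(values, weights):
    """Ordered weighted averaging: weights apply to values sorted in descending order."""
    return np.sort(values)[::-1] @ weights

# Nine hypothetical normalized input parameters contributing to the system load.
x = np.array([0.9, 0.7, 0.65, 0.6, 0.5, 0.45, 0.3, 0.2, 0.1])
for alpha in (0.5, 1.0, 2.0):
    print(f"alpha={alpha}: system load = {owa(x, owa_weights(len(x), alpha)):.3f}")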

16.
Extreme atmospheric events are intimately related to the statistics of atmospheric turbulent velocities. These, in turn, exhibit multifractal scaling, which determines the nature of the asymptotic behavior of the velocities and whose parameters are therefore currently of great interest to evaluate. We combine singular value decomposition techniques and wavelet transform analysis to generalize the multifractal formalism to vector-valued random fields. The so-called Tensorial Wavelet Transform Modulus Maxima (TWTMM) method is calibrated on synthetic self-similar 2D vector-valued multifractal measures and monofractal 3D vector-valued fractional Brownian fields. We report the results of applying the TWTMM method to turbulent velocity and vorticity fields generated by direct numerical simulations of the incompressible Navier–Stokes equations. This study reveals an intimate relationship between the singularity spectra of these two vector fields, which are found to be significantly more intermittent than previously estimated from longitudinal and transverse velocity increment statistics.
Alain Arneodo

17.
This paper addresses the problem of novelty detection in the case where the observed data are a mixture of a known ‘background’ process contaminated with an unknown other process, which generates the outliers, or novel observations. The framework we describe here is quite general, employing univariate classification with incomplete information, based on knowledge of the distribution (the probability density function, pdf) of the data generated by the ‘background’ process. The relative proportion of this ‘background’ component (the prior ‘background’ probability), the pdf, and the prior probabilities of all other components are all assumed unknown. The main contribution is a new classification scheme that identifies the maximum proportion of observed data following the known ‘background’ distribution. The method exploits the Kolmogorov–Smirnov test to estimate the proportions, after which the data are Bayes-optimally separated. Results, demonstrated with synthetic data, show that this approach can produce more reliable results than a standard novelty detection scheme. The classification algorithm is then applied to the problem of identifying outliers in the SIC2004 data set, in order to detect the radioactive release simulated in the ‘joker’ data set. We propose this method as a reliable means of novelty detection in emergency situations, one that can also be used to identify outliers prior to the application of a more general automatic mapping algorithm.
Davide D’Alimonte
Dan Cornford (Corresponding author)
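A simplified sketch of the two ingredients named above, the Kolmogorov–Smirnov test and Bayes-optimal separation, on synthetic data. Unlike the paper, which estimates the unknown ‘background’ proportion from the KS statistic, this sketch assumes the mixture proportion and the outlier density are known, so it only illustrates the KS check against the pure background and the subsequent Bayes-optimal classification; all distributions and numbers are made up.

import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Mixture: known background N(0, 1) contaminated by an outlier component N(4, 1).
p_bg = 0.9                                   # assumed known here; estimated via KS in the paper
x = np.where(rng.random(2000) < p_bg,
             rng.normal(0, 1, 2000), rng.normal(4, 1, 2000))

# A KS test against the background alone flags the contamination.
print("KS p-value vs pure background:", stats.kstest(x, "norm").pvalue)

# Bayes-optimal separation once the proportions and densities are in hand.
post_bg = p_bg * stats.norm.pdf(x, 0, 1) / (
    p_bg * stats.norm.pdf(x, 0, 1) + (1 - p_bg) * stats.norm.pdf(x, 4, 1))
outliers = post_bg < 0.5
print("flagged as novel:", outliers.mean().round(3), "(true contamination 0.1)")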

18.
Statistically defensible methods are presented for developing geophysical detector sampling plans and analyzing data for munitions response sites where unexploded ordnance (UXO) may exist. Detection methods for distinguishing areas of elevated anomaly density from the background density are shown. Additionally, methods are described that aid in the choice of transect pattern and spacing to assure, with a specified degree of confidence, that a target area (TA) of specific size, shape, and anomaly density will be identified using the detection methods. Methods for evaluating the sensitivity of designs to variation in certain parameters are also discussed. The methods presented have been incorporated into the freely available Visual Sample Plan (VSP) software and demonstrated at multiple sites in the United States. Application examples from actual transect designs and surveys from the previous two years are presented.
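As an illustration of the transect-spacing question discussed above, the sketch below estimates by Monte Carlo the probability that parallel transects of a given spacing and swath width traverse a circular target area, and compares it with the simple geometric result min(1, (2R + w)/s). VSP's actual design calculations also account for target shape, anomaly density, and detection within the swath; the radius, swath, and spacings here are hypothetical.

import numpy as np

rng = np.random.default_rng(7)

def traversal_probability(radius, spacing, swath, n_trials=100000):
    """Monte Carlo probability that parallel transects (given spacing and swath width)
    pass over a circular target area of the given radius, for a random target center."""
    centers = rng.uniform(0, spacing, n_trials)        # offset from a reference transect
    offsets = np.minimum(centers, spacing - centers)   # distance to the nearest transect
    return np.mean(offsets <= radius + swath / 2)

R, w = 50.0, 2.0                                       # 50 m target radius, 2 m swath
for s in (50, 100, 200, 400):
    mc = traversal_probability(R, s, w)
    analytic = min(1.0, (2 * R + w) / s)
    print(f"spacing {s:4d} m: MC {mc:.3f}  analytic {analytic:.3f}")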

19.
This paper estimates the expected annual impacts of the Pink Hibiscus Mealybug infestation on the economies of Florida and the rest of the United States. The approach involves a Markov chain analysis wherein both short-run and long-run expected damages from infestation are calculated. Use is made of the CLIMEX model, which predicts the potential pest-establishment regions in the US. While predictions based upon the CLIMEX model extend the scope of damages beyond Florida, the damages depend significantly upon the rate of arrival and detection of the species in those regions. Damages are significantly higher when a longer time horizon is considered. When nursery owners bear the full cost of quarantines in the form of lost sales and treatment costs for infected plants, the cost-effectiveness of quarantines as a regulatory tool is diminished. The extent of damages is determined by the long-run propensity of the system, in terms of the fraction of time spent in the possible ‘states’ of infestation and control, and not by the annual value of crops that could be potential hosts for the pest.
Ram Ranjan  Phone: +1-352-3921881  Fax: +1-352-3929898
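A small sketch of the Markov chain bookkeeping behind such an analysis: a three-state annual chain (pest-free, infested, under control) with assumed transition probabilities and per-state damages, from which short-run expected damages over a finite horizon and the long-run stationary distribution are computed. All numbers are invented for illustration and are not the paper's estimates.

import numpy as np

# Hypothetical 3-state annual Markov chain: 0 = pest-free, 1 = infested, 2 = under control.
P = np.array([[0.90, 0.08, 0.02],
              [0.05, 0.70, 0.25],
              [0.30, 0.10, 0.60]])
damage = np.array([0.0, 12.0, 4.0])        # assumed expected annual damage per state ($ million)

# Short run: expected damages over a 10-year horizon from a pest-free start.
dist = np.array([1.0, 0.0, 0.0])
short_run = 0.0
for _ in range(10):
    dist = dist @ P
    short_run += dist @ damage

# Long run: stationary distribution pi solving pi P = pi with sum(pi) = 1.
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.r_[np.zeros(3), 1.0]
pi = np.linalg.lstsq(A, b, rcond=None)[0]

print("10-year expected damages ($M):", round(float(short_run), 2))
print("stationary distribution:", pi.round(3), "-> long-run annual damages ($M):", round(float(pi @ damage), 2))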

20.
Recently, Batabyal and Nijkamp (Environ Econ Policy Stud 7:39–51, 2005) have used a theoretical model of antibiotic use to study the relative merits of interventionist (antibiotics) and non-interventionist (no antibiotics) treatment options. A key assumption in their paper is that the default treatment option is the interventionist one. Because there are several instances in which this assumption is invalid, in this paper we suppose that the default treatment option is the non-interventionist one. Specifically, we first derive the long run average cost of treating a common infection such as acute otitis media (AOM). Next, we show that there is a particular tolerance level such that, when a physician uses it to determine when to administer the non-antibiotic medicine, the long run average cost of treating the infection under study is minimized.
Amitrajeet A. Batabyal

