Similar Articles
20 similar articles retrieved (search time: 257 ms)
1.
The spatial distributions and the spatial processes of crime activities in urban areas are increasingly drawing the attention of geographers and criminologists. The spatial studies of urban crime in Western countries can be divided into two schools, one of which seeks to reveal the general spatial differentiation of urban crime from a macro angle, while the other analyses the spatial processes of an individual criminal from a micro angle. As for the macro studies of urban crime, there have been many case studies and some theories. For example, the law of distance decay shows that there is a negat…

2.
Assuming a study region in which each cell has associated with it an N-dimensional vector of values corresponding to N predictor variables, one means of predicting the potential of some cell to host mineralization is to estimate, on the basis of historical data, a probability density function that describes the distribution of vectors for cells known to contain deposits. This density estimate can then be employed to predict the mineralization likelihood of other cells in the study region. However, owing to the curse of dimensionality, estimating densities in high-dimensional input spaces is exceedingly difficult, and conventional statistical approaches often break down. This article describes an alternative approach to estimating densities. Inspired by recent work in the area of similarity-based learning, in which input takes the form of a matrix of pairwise similarities between training points, we show how the density of a set of mineralized training examples can be estimated from a graphical representation of those examples using the notion of eigenvector graph centrality. We also show how the likelihood for a test example can be estimated from these data without having to construct a new graph. Application of the technique to the prediction of gold deposits based on 16 predictor variables shows that its predictive performance far exceeds that of conventional density estimation methods, and is slightly better than the performance of a discriminative approach based on multilayer perceptron neural networks.  相似文献   
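A minimal sketch of the graph-centrality density idea in Python with numpy: pairwise Gaussian similarities define a graph over synthetic 2-D training points, the principal eigenvector gives each point a centrality, and a test cell is scored from its similarity row alone, without rebuilding the graph. The kernel choice, bandwidth, and toy data are assumptions for illustration; the paper works with 16 predictor variables.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical training data: 2-D feature vectors for known mineralized cells.
X = rng.normal(0.0, 1.0, size=(200, 2))

# Pairwise Gaussian similarities define the graph over training examples.
sigma = 1.0
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-d2 / (2 * sigma**2))
np.fill_diagonal(W, 0.0)

# Eigenvector centrality: principal eigenvector of the similarity matrix,
# obtained by power iteration; central points sit in dense regions.
c = np.ones(len(X))
for _ in range(200):
    c = W @ c
    c /= np.linalg.norm(c)

def density_score(x):
    # Score a new cell from its similarities to the training points,
    # weighted by their centralities; no new graph is built.
    s = np.exp(-((X - x) ** 2).sum(-1) / (2 * sigma**2))
    return float(s @ c)

dense = density_score(np.zeros(2))       # near the centre of the cloud
sparse = density_score(np.full(2, 5.0))  # far outside it
```

Cells with a high score lie in regions dense with mineralized training examples, which is the quantity the prospectivity map ranks.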

3.
Mathematical Simulation of Criminal Location Choice in Urban Areas   Cited by: 5 (self-citations: 0, by others: 5)
Taking the expected utility and the success probability of crime as spatial variables, this paper builds a dynamic-programming model to simulate the general rules by which offenders choose crime locations within a city. The simulation shows that, over the planning horizon, an offender optimizes the choice of crime location according to the expected payoff and success probability of each offence. The first offence is always committed in the area perceived to have the highest success probability but a relatively low expected utility, whereas the last offence of the planning horizon occurs in the area with the highest expected utility but a lower success probability. If an area offers both a high expected utility and a high success probability, offences will concentrate there. The behavioural rules revealed by the model offer useful guidance for the spatial prevention of urban crime: prevention measures should be adapted to local conditions, with blanket measures such as area patrols in poorer residential or juvenile-delinquency areas, and more sophisticated, rigorous and technically advanced measures in wealthier areas or areas frequented by professional criminals.
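The location-choice rule in this abstract can be sketched as a backward dynamic programme. The region names and the (success probability, payoff) numbers below are invented for illustration, and being caught is assumed to end the planning horizon.

```python
# Hypothetical regions: (success probability p, expected payoff u).
regions = {"poor_district": (0.95, 2.0), "wealthy_district": (0.30, 10.0)}

def plan(regions, horizon):
    """Backward dynamic programme. Being caught ends the spree, so the value
    of k remaining offences is V(k) = max_r p_r * (u_r + V(k-1))."""
    V = 0.0
    by_remaining = []  # by_remaining[i] = best region with i+1 offences left
    for _ in range(horizon):
        best = max(regions, key=lambda r: regions[r][0] * (regions[r][1] + V))
        p, u = regions[best]
        V = p * (u + V)
        by_remaining.append(best)
    return by_remaining[::-1], V  # reversed into time order

sequence, value = plan(regions, horizon=2)
# First offence lands in the safe, low-payoff area; the last one in the
# high-payoff, riskier area, matching the simulated rule.
```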

4.
Examining Risk in Mineral Exploration   Cited by: 4 (self-citations: 0, by others: 4)
Successful mineral exploration strategy requires identification of some of the risk sources and considering them in the decision-making process so that controllable risk can be reduced. Risk is defined as chance of failure or loss. Exploration is an economic activity involving risk and uncertainty, so risk also must be defined in an economic context. Risk reduction can be addressed in three fundamental ways: (1) increasing the number of examinations; (2) increasing success probabilities; and (3) changing success probabilities per test by learning. These provide the framework for examining exploration risk. First, the number of prospects examined is increased, such as by joint venturing, thereby reducing chance of gambler's ruin. Second, success probability is increased by exploring for deposit types more likely to be economic, such as those with a high proportion of world-class deposits. For example, in looking for 100+ ton (>3 million oz) Au deposits, porphyry Cu-Au, or epithermal quartz alunite Au types require examining fewer deposits than Comstock epithermal vein and most other deposit types. For porphyry copper exploration, a strong positive relationship between area of sulfide minerals and deposits' contained Cu can be used to reduce exploration risk by only examining large sulfide systems. In some situations, success probabilities can be increased by examining certain geologic environments. Only 8% of kuroko massive sulfide deposits are world class, but success chances can be increased to about 15% by looking in settings containing sediments and rhyolitic rocks. It is possible to reduce risk of loss during mining by sequentially developing and expanding a mine—thus reducing capital exposed at early stages and reducing present value of risked capital. Because this strategy is easier to apply in some deposit types than in others, the strategy can affect deposit types sought. 
Third, risk is reduced by using prior information and by changing the independence of trials assumption, that is, by learning. Bayes' formula is used to change the probability of existence of the deposit sought on the basis of successive exploration stages. Perhaps the most important way to reduce exploration risk is to employ personnel with the appropriate experience and yet who are learning.  相似文献   
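Two of the risk-reduction levers described above, increasing the number of examinations and Bayesian updating across exploration stages, reduce to one-line formulas. The probabilities below are illustrative assumptions, not values from the paper.

```python
def ruin_free_chance(p_success, n_prospects):
    # Lever 1: chance of at least one discovery among n independent
    # prospects, i.e. the chance of escaping gambler's ruin by
    # examining more targets (e.g. via joint venturing).
    return 1.0 - (1.0 - p_success) ** n_prospects

def bayes_update(prior, p_pos_given_deposit, p_pos_given_barren):
    # Lever 3: Bayes' formula revises the probability that the sought
    # deposit exists after a positive result at one exploration stage.
    num = p_pos_given_deposit * prior
    return num / (num + p_pos_given_barren * (1.0 - prior))

chance_5 = ruin_free_chance(0.1, 5)     # e.g. sole-risking 5 prospects
chance_20 = ruin_free_chance(0.1, 20)   # e.g. joint-venturing into 20
posterior = bayes_update(prior=0.05, p_pos_given_deposit=0.8,
                         p_pos_given_barren=0.2)
```

With the assumed numbers, twenty prospects raise the chance of at least one success from about 0.41 to about 0.88, and one positive stage raises the deposit probability from 5% to roughly 17%.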

5.
A Conditional Dependence Adjusted Weights of Evidence Model   Cited by: 3 (self-citations: 0, by others: 3)
One of the key assumptions in weights of evidence (WE) modelling is that the predictor patterns have to be conditionally independent. When this assumption is violated, WE posterior probability estimates are likely to be biased upwards. In this paper, a formal expression for the bias of the contrasts will be derived. It will be shown that this bias has an intuitive and convenient interpretation. A modified WE model will then be developed, where the bias is corrected using the correlation structure of the predictor patterns. The new model is termed the conditional dependence adjusted weights of evidence (CDAWE) model. It will be demonstrated via a simulation study that the CDAWE model significantly outperforms the existing WE model when conditional independence is violated, and it is on par with logistic regression, which does not assume conditional independence. Furthermore, it will be argued that, in the presence of conditional dependence between predictor patterns, weights variance estimates from WE are likely to understate the true level of uncertainty. It will be argued that weights variance estimates from CDAWE, which are also bias-corrected, can properly address this issue.  相似文献   
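For reference, the classic (unadjusted) weights and contrast for a single binary predictor pattern can be computed as below; the cell counts are invented. The CDAWE correlation-based bias correction itself is not reproduced here.

```python
import math

def wofe_weights(n_cells, n_deposits, n_pattern, n_pattern_deposits):
    # Classic weights of evidence for one binary predictor pattern B.
    d, dc = n_deposits, n_cells - n_deposits       # deposit / barren cells
    bd = n_pattern_deposits                        # B present, deposit
    bdc = n_pattern - n_pattern_deposits           # B present, barren
    w_plus = math.log((bd / d) / (bdc / dc))       # W+ = ln P(B|D)/P(B|~D)
    w_minus = math.log(((d - bd) / d) / ((dc - bdc) / dc))
    return w_plus, w_minus, w_plus - w_minus       # contrast C = W+ - W-

# Invented counts: 10,000 cells, 50 deposits; the pattern covers
# 1,000 cells and contains 30 of the deposits.
wp, wm, contrast = wofe_weights(10_000, 50, 1_000, 30)
```

When patterns like this are conditionally dependent, summing such weights over layers is what inflates the posterior probabilities the paper corrects for.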

6.
Earthquake prediction: the null hypothesis   Cited by: 5 (self-citations: 0, by others: 5)
The null hypothesis in assessing earthquake predictions is often, loosely speaking, that the successful predictions are chance coincidences. To make this more precise requires specifying a chance model for the predictions and/or the seismicity. The null hypothesis tends to be rejected not only when the predictions have merit, but also when the chance model is inappropriate. In one standard approach, the seismicity is taken to be random and the predictions are held fixed. 'Conditioning' on the predictions this way tends to reject the null hypothesis even when it is true, if the predictions depend on the seismicity history. An approach that seems less likely to yield erroneous conclusions is to compare the predictions with the predictions of a 'sensible' random prediction algorithm that uses seismicity up to time t to predict what will happen after time t. The null hypothesis is then that the predictions are no better than those of the random algorithm. Significance levels can be assigned to this test in a more satisfactory way, because the distribution of the success rate of the random predictions is under our control. Failure to reject the null hypothesis indicates that there is no evidence that any extra-seismic information the predictor uses (electrical signals for example) helps to predict earthquakes.  相似文献   
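The recommended test can be sketched as a Monte Carlo comparison: the tested predictions' success rate is ranked against many runs of a random alarm algorithm issuing the same number of alarms. Everything here, the synthetic catalogue, alarm rates, and seeds, is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 1000                          # observation windows
quakes = rng.random(T) < 0.05     # synthetic catalogue: events in 5% of windows

def success_rate(alarms, quakes):
    # Fraction of declared alarms whose window actually contained an event.
    return quakes[alarms].mean() if alarms.any() else 0.0

tested = rng.random(T) < 0.1      # predictions under test (deliberately
                                  # random here, so the null is true)

# Reference distribution: a 'sensible' random algorithm issuing the same
# number of alarms, rerun many times.
n_alarms = int(tested.sum())
null_rates = []
for _ in range(2000):
    alarms = np.zeros(T, dtype=bool)
    alarms[rng.choice(T, n_alarms, replace=False)] = True
    null_rates.append(success_rate(alarms, quakes))

# One-sided p-value: how often random alarms do at least as well.
p_value = float((np.array(null_rates) >= success_rate(tested, quakes)).mean())
```

Because the distribution of the random algorithm's success rate is under our control, the significance level is well defined; a small p-value would indicate skill beyond chance.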

7.
A Hybrid Fuzzy Weights-of-Evidence Model for Mineral Potential Mapping   Cited by: 1 (self-citations: 0, by others: 1)
This paper describes a hybrid fuzzy weights-of-evidence (WofE) model for mineral potential mapping that generates fuzzy predictor patterns based on (a) knowledge-based fuzzy membership values and (b) data-based conditional probabilities. The fuzzy membership values are calculated using a knowledge-driven logistic membership function, which provides a framework for treating systemic uncertainty and also facilitates the use of multiclass predictor maps in the modeling procedure. The fuzzy predictor patterns are combined using Bayes’ rule in a log-linear form (under an assumption of conditional independence) to update the prior probability of target deposit-type occurrence in every unique combination of predictor patterns. The hybrid fuzzy WofE model is applied to a regional-scale mapping of base-metal deposit potential in the south-central part of the Aravalli metallogenic province (western India). The output map of fuzzy posterior probabilities of base-metal deposit occurrence is classified subsequently to delineate zones with high-favorability, moderate favorability, and low-favorability for occurrence of base-metal deposits. An analysis of the favorability map indicates (a) significant improvement of probability of base-metal deposit occurrence in the high-favorability and moderate-favorability zones and (b) significant deterioration of probability of base-metal deposit occurrence in the low-favorability zones. The results demonstrate usefulness of the hybrid fuzzy WofE model in representation and in integration of evidential features to map relative potential for mineral deposit occurrence.  相似文献   
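A rough sketch of the two ingredients: a logistic membership function for a continuous evidence layer, and a log-linear Bayesian update of the prior. Scaling each weight by its fuzzy membership is my simplification of the hybrid combination, and the weights, midpoints, and prior are invented.

```python
import math

def logistic_membership(x, mid, spread):
    # Knowledge-driven fuzzy membership in (0, 1) for a continuous layer.
    return 1.0 / (1.0 + math.exp(-(x - mid) / spread))

def fuzzy_posterior(prior, weights, memberships):
    # Log-linear Bayesian update under conditional independence, with each
    # evidence weight scaled by its fuzzy membership (my simplification).
    logit = math.log(prior / (1.0 - prior))
    for w, mu in zip(weights, memberships):
        logit += w * mu
    return 1.0 / (1.0 + math.exp(-logit))

mu_geochem = logistic_membership(120.0, mid=100.0, spread=10.0)  # hypothetical ppm layer
mu_fault = logistic_membership(2.0, mid=5.0, spread=1.0)         # hypothetical distance layer
post = fuzzy_posterior(prior=0.01, weights=[1.5, 2.0],
                       memberships=[mu_geochem, mu_fault])
```

The logistic form handles multiclass or continuous predictor maps smoothly instead of forcing a hard binary presence/absence cut.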

8.
The need to integrate large quantities of digital geoscience information to classify locations as mineral deposits or nondeposits has been met by the weights-of-evidence method in many situations. Widespread selection of this method may be more the result of its ease of use and interpretation rather than comparisons with alternative methods. A comparison of the weights-of-evidence method to probabilistic neural networks is performed here with data from Chisel Lake-Andeson Lake, Manitoba, Canada. Each method is designed to estimate the probability of belonging to learned classes where the estimated probabilities are used to classify the unknowns. Using these data, significantly lower classification error rates were observed for the neural network, not only when test and training data were the same (0.02 versus 23%), but also when validation data, not used in any training, were used to test the efficiency of classification (0.7 versus 17%). Despite these data containing too few deposits, these tests of this set of data demonstrate the neural network's ability at making unbiased probability estimates and lower error rates when measured by number of polygons or by the area of land misclassified. For both methods, independent validation tests are required to ensure that estimates are representative of real-world results. Results from the weights-of-evidence method demonstrate a strong bias where most errors are barren areas misclassified as deposits. The weights-of-evidence method is based on Bayes rule, which requires independent variables in order to make unbiased estimates. The chi-square test for independence indicates no significant correlations among the variables in the Chisel Lake–Andeson Lake data. However, the expected number of deposits test clearly demonstrates that these data violate the independence assumption. 
Other, independent simulations with three variables show that using variables with correlations of 1.0 can double the expected number of deposits as can correlations of –1.0. Studies done in the 1970s on methods that use Bayes rule show that moderate correlations among attributes seriously affect estimates and even small correlations lead to increases in misclassifications. Adverse effects have been observed with small to moderate correlations when only six to eight variables were used. Consistent evidence of upward biased probability estimates from multivariate methods founded on Bayes rule must be of considerable concern to institutions and governmental agencies where unbiased estimates are required. In addition to increasing the misclassification rate, biased probability estimates make classification into deposit and nondeposit classes an arbitrary subjective decision. The probabilistic neural network has no problem dealing with correlated variables—its performance depends strongly on having a thoroughly representative training set. Probabilistic neural networks or logistic regression should receive serious consideration where unbiased estimates are required. The weights-of-evidence method would serve to estimate thresholds between anomalies and background and for exploratory data analysis.  相似文献   

9.
ABSTRACT

Geospatial data conflation is aimed at matching counterpart features from two or more data sources in order to combine and better utilize information in the data. Due to the importance of conflation in spatial analysis, different approaches to the conflation problem have been proposed ranging from simple buffer-based methods to probability and optimization based models. In this paper, I propose a formal framework for conflation that integrates two powerful tools of geospatial computation: optimization and relational databases. I discuss the connection between the relational database theory and conflation, and demonstrate how the conflation process can be formulated and carried out in standard relational databases. I also propose a set of new optimization models that can be used inside relational databases to solve the conflation problem. The optimization models are based on the minimum cost circulation problem in operations research (also known as the network flow problem), which generalizes existing optimal conflation models that are primarily based on the assignment problem. Using comparable datasets, computational experiments show that the proposed conflation method is effective and outperforms existing optimal conflation models by a large margin. Given its generality, the new method may be applicable to other data types and conflation problems.  相似文献   
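At its core, optimal conflation matches counterpart features by minimizing total mismatch cost. The toy sketch below solves a three-point assignment by enumeration; the paper's minimum cost circulation model generalizes exactly this kind of matching. The coordinates are invented.

```python
from itertools import permutations
import math

# Two hypothetical point datasets to conflate (e.g. road junctions
# digitized from two sources); coordinates are invented.
src = [(0.0, 0.0), (1.0, 1.0), (5.0, 5.0)]
dst = [(0.1, -0.1), (5.2, 4.9), (1.1, 0.9)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Minimum-total-cost one-to-one matching, found by enumeration (fine at
# this size; real models solve it as an assignment / min-cost flow problem).
best = min(permutations(range(len(dst))),
           key=lambda p: sum(dist(src[i], dst[j]) for i, j in enumerate(p)))
matches = list(enumerate(best))
```

Enumeration is exponential in the number of features, which is precisely why the paper formulates the problem as network flow inside a relational database rather than brute force.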

10.
Abstract

It is widely recognized that stakeholder engagement processes produce advantages, but few studies acknowledge that they also can produce disadvantages. There is a global need to better assess stakeholder engagement processes by defining success and developing new methods to analyze stakeholder participation data. Our method of digitizing and coding stakeholder communications (1) produces a wide range of analyses, (2) tells the story of governance over time, (3) is comparable with other datasets, and (4) can be used wherever public documents exist. We demonstrate the utility of these integrated methods by examining statewide differences in public participation and success rates in Alaska’s Board of Fisheries’ (Board) proposal process. We determine that significantly different participation and success rates across the state indicate the existence of disadvantages and the need for further investigation into the equity, efficiency, and effectiveness of the Board process.  相似文献   

11.
Early warning of rainstorm-induced flash floods is the weak link in the prevention and control system for flash flood disasters in small and medium-sized catchments, and it is also the key to success or failure in flash flood prevention. Focusing on the core problems of flash flood warning, this paper reviews three aspects: the regional differentiation of flash flood disasters in China, technical methods for flash flood warning, and the current state of probabilistic flash flood warning. The distribution of flash flood disasters in China shows clear spatiotemporal differences, so it is necessary to develop warning methods tailored to these regional differences. Rainfall-based warning using critical rainfall as the indicator is currently the main technical means of rainstorm flash flood warning in China's small and medium-sized catchments, but conventional methods give only one (set of) deterministic critical rainfall threshold(s), so the warning results suffer from pronounced uncertainty. Probabilistic warning can quantitatively assess the many sources of uncertainty and deliver probabilistic flash flood warnings, and therefore has sound theoretical advantages and potential application value. The paper outlines future research priorities: (1) fully exploiting rainstorm and flood sample information, and integrating basic methods and techniques for probabilistic flash flood warning; (2) strengthening the estimation of critical rainfall thresholds and probabilistic flash flood warning under non-stationary conditions; and (3) jointly considering the occurrence probability of warning thresholds and the probability of resulting disasters, optimizing the "multi-level warning, multi-level response" approach, and advancing the construction and application of integrated operational flash flood warning systems.
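The idea of probabilistic warning can be illustrated in a few lines: instead of one deterministic critical rainfall value, treat the threshold as a distribution and report an exceedance probability for the observed rainfall. The normal distribution, its parameters, and the Monte Carlo setup below are invented stand-ins for the uncertainty sources the review discusses.

```python
import numpy as np

rng = np.random.default_rng(5)
# Monte Carlo sample of the critical rainfall threshold (mm); the normal
# distribution and parameters are assumed stand-ins for the spread produced
# by model-parameter and antecedent-moisture uncertainty.
thresholds = rng.normal(loc=80.0, scale=15.0, size=10_000)

def flood_probability(observed_rain_mm, thresholds):
    # Probability that the observed rainfall meets or exceeds the true
    # (uncertain) critical threshold.
    return float((thresholds <= observed_rain_mm).mean())

p_low = flood_probability(50.0, thresholds)    # light event: low warning probability
p_high = flood_probability(110.0, thresholds)  # heavy event: high warning probability
```

Graded probabilities like these are what a "multi-level warning, multi-level response" scheme can map onto response tiers.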

12.
Information theory makes it possible to judge and evaluate methods and results in chemical analysis. The obtained information can be expressed in different ways. One way is to define information as the decrease of uncertainty after analysis. Conditional probabilities are therefore considered when evaluating the information provided by qualitative analyses. However, the use of other information measures, such as the information gain, is often preferable. In multicomponent analysis the translation of information from signals to the amounts of the analytes has been investigated along with the relevance of individual components. Information theory can also be applied to find the optimum experimental conditions. The evaluation of the properties of analytical methods by information theory has been proposed.
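Defining information as the decrease of uncertainty after analysis leads directly to Shannon entropy. The sketch below computes the information gain of a hypothetical qualitative test that narrows four equally likely candidates; the probabilities are invented.

```python
import math

def entropy(probs):
    # Shannon entropy in bits.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical qualitative analysis: four equally likely candidate
# components before the test; the result narrows them down.
prior = [0.25, 0.25, 0.25, 0.25]
posterior = [0.7, 0.2, 0.1, 0.0]

# Information gain = decrease of uncertainty brought by the analysis.
gain = entropy(prior) - entropy(posterior)
```

Comparing such gains across candidate procedures is one way information theory can guide the choice of optimum experimental conditions.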

13.
In the oil industry, uncertainty about hydrocarbon volumes in undrilled prospects generally is expressed as an expectation curve. The curve indicates the probability of exceeding a given amount. After drilling a number of prospects conclusively, that is, once we know the amount of reserves in the targets, if any, the question arises about the validity of the prediction. Since the prediction was in the form of a probability distribution, the comparison with a single actual outcome of the process is not straightforward. I propose a specific combination of mainly well-known tests that can be applied in this hindsight analysis to address the following: (1) the measure of location or expectation, (2) the probability of success, (3) the shape of the distribution of the nonzero outcomes or success cases, and (4) a measure of rank correlation between predictions and outcomes. Even small numbers of drilled structures may suffice for obtaining conclusive results. Such statistical analysis provides useful feedback for those concerned with the maintenance and control of the prediction system.
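One of the hindsight checks, comparing the number of actual discoveries with the predicted probability of success, amounts to a simple binomial tail test. The drilling numbers below are invented for illustration.

```python
from math import comb

def p_at_least(n, k, p):
    # Binomial tail P(X >= k): the chance of at least k successes in n
    # drilled prospects if the predicted success probability p were correct.
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical hindsight data: 10 prospects drilled at a predicted 30%
# chance of success each, with 6 actual discoveries.
tail = p_at_least(n=10, k=6, p=0.3)
```

A small tail probability (here under 5%) suggests the predicted success probability was pessimistic, which is exactly the kind of feedback the prediction system needs.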

14.
Abstract

The opportunities available at a demand location are usually measured as the costs of reaching a specified critical number of facilities from that location. This method does not, however, account for multistop trips, nor for differences in the diversity of supply at the level of individual facilities. In this paper we introduce an alternative measurement method that overcomes these shortcomings. In this method the probability of successfully visiting a specific facility is assumed to be a function of the diversity of supply provided. Trip routes are constructed that have an acceptable probability of success. Then, the expected costs of travelling the optimum route are determined as an indicator of spatial opportunities. The proposed method has been implemented in a GIS environment, using typical GIS data and GIS tools for spatial analysis and display. The results of a case study indicate that the new method, compared to current methods, may lead to different evaluations of the level of opportunities at demand locations.
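A toy version of the proposed measure: each facility carries a success probability derived from its diversity of supply, and facilities are added to the route until the chance that the trip succeeds reaches a target. The greedy nearest-first ordering and all numbers are simplifying assumptions; the paper constructs optimum routes.

```python
# Hypothetical facilities: (travel cost from the demand location, probability
# of a successful visit, assumed to grow with the diversity of supply).
facilities = [(2.0, 0.6), (3.5, 0.8), (5.0, 0.4), (8.0, 0.9)]

def route_for_target(facilities, target=0.9):
    """Greedy sketch: visit the nearest facilities until the chance that at
    least one stop succeeds reaches the target probability."""
    total_cost, fail = 0.0, 1.0
    for cost, p in sorted(facilities):      # nearest first
        total_cost += cost
        fail *= 1.0 - p
        if 1.0 - fail >= target:
            break
    return total_cost, 1.0 - fail

expected_cost, p_success = route_for_target(facilities)
```

The resulting expected cost of an acceptable-probability route is the opportunity indicator; lower cost means better spatial opportunities at that demand location.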

15.
Extraction of Settlement Information in Mountainous Areas from Remote Sensing Imagery Based on a Spatial Probability Surface   Cited by: 3 (self-citations: 0, by others: 3)
Knowledge-based models for extracting settlement information from remote sensing imagery, built on the spectral characteristics of ground objects, are currently the most common approach to settlement information extraction, but because of elevation differences they perform poorly for settlements in mountainous areas. Taking part of Lijiang City, Yunnan Province as an example, and with GIS support, this study extracts settlement information in mountainous areas from remote sensing by constructing a multi-factor spatial probability surface that combines terrain and spectral information. The results show that terrain variation is the main factor affecting the extraction accuracy of settlement information in mountainous areas, accounting for slightly more than 50% of the total influence. Introducing terrain as auxiliary information on top of spectral classification and applying the spatial probability surface effectively improves the extraction of settlement information in mountainous areas, raising recognition accuracy from 57.5% to 82.5%.
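The probability-surface combination can be sketched as a weighted overlay of normalized evidence layers, with terrain given slightly more than half of the weight in line with the study's finding that terrain accounts for just over 50% of the influence. The synthetic rasters and the 0.55/0.45 split are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
shape = (50, 50)

# Hypothetical evidence rasters scaled to [0, 1]: spectral similarity to
# settlement signatures, and terrain suitability (low relief, gentle slope).
spectral = rng.random(shape)
terrain = rng.random(shape)

# Weighted overlay into a settlement-probability surface; terrain gets the
# larger weight, echoing its dominant (>50%) influence in the study.
prob_surface = 0.55 * terrain + 0.45 * spectral

# Cells above a cut-off become candidate settlement pixels.
candidates = prob_surface > 0.7
```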

16.
High accuracy surface modeling (HASM) is a method which can be applied to soil property interpolation. In this paper, we present a method of HASM combined with geographic information for soil property interpolation (HASM-SP) to improve the accuracy. Based on soil types, land use types and parent rocks, HASM-SP was applied to interpolate soil available P, Li, pH, alkali-hydrolyzable N, total K and Cr in a typical red soil hilly region. To evaluate the performance of HASM-SP, we compared it with ordinary kriging (OK), ordinary kriging combined with geographic information (OK-Geo) and stratified kriging (SK). The results showed that the methods combined with geographic information, HASM-SP and OK-Geo, obtained a lower estimation bias. HASM-SP also showed smaller MAEs and RMSEs when compared with the other three methods (OK-Geo, OK and SK). Much more detail is presented in the HASM-SP maps of soil properties, because combining different types of geographic information supplies abrupt boundaries for the spatial variation of soil properties. HASM-SP therefore not only reduces prediction errors but also accords with the distribution of geographic information, which makes the spatial simulation of soil properties more reasonable. HASM-SP has not only enriched the theory of high accuracy surface modeling of soil properties, but also provides a scientific method for application in resource management and environment planning.

17.
Measuring the Performance of Mineral-Potential Maps   Cited by: 2 (self-citations: 0, by others: 2)
D. P. Harris and others have proposed a new method for comparative analysis of favorability mappings. In their approach, Weights-of-Evidence (WofE) consistently shows poorer results than other more flexible methods. Information loss because of discretization would be a second drawback of WofE. In this paper, we discuss that the random cell selection method proposed by Harris and others necessarily results in higher success ratios for more flexible methods but this does not necessarily indicate that these methods provide better mineral-potential maps. For example, a good point density contouring method that does not use any geoscience background information also would score high in the random cell selection approach. Additionally, we show that discretization usually is advantageous because it prevents occurrences of overly high posterior probabilities. For more detailed comparison, we have conducted a number of experiments on 90 gold deposits in the Gowganda Area of the Canadian Shield comparing WofE with the more flexible weighted logistic regression method. Mineral occurrences should be modeled as discoveries at points instead of randomly sampling them together with their surrounding environments in small cells.  相似文献   
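Success ratios of the kind discussed above can be sketched as follows: rank cells by favourability, then measure the fraction of known deposits captured within the top-ranked fraction of the area. The synthetic scores and deposit locations below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1000
score = rng.random(n)                          # favourability score per cell
deposit = rng.random(n) < 0.02 + 0.1 * score   # synthetic deposits, biased
                                               # toward high-score cells

def success_ratio(score, deposit, top_fraction):
    # Fraction of known deposits captured by the top-ranked cells that
    # occupy `top_fraction` of the study area.
    k = int(len(score) * top_fraction)
    top = np.argsort(score)[::-1][:k]
    return deposit[top].sum() / deposit.sum()

curve = [success_ratio(score, deposit, f) for f in (0.1, 0.2, 0.5, 1.0)]
```

The paper's caution applies here: a flexible method can score well on such a curve under random cell selection without actually producing a better mineral-potential map.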

18.
Abstract

Triangulated irregular networks (TINs) are increasingly popular for their efficiency in data storage and their ability to accommodate irregularly spaced elevation points for many applications of geographical information systems. This paper reviews and evaluates various methods for extracting TINs from dense digital elevation models (DEMs) on a sample DEM. Both structural and statistical comparisons show that the methods perform with different rates of success in different settings. Users of DEM to TIN conversion methods should be aware of the strengths and weaknesses of the methods in addition to their own purposes before conducting the conversion.  相似文献   

19.
Mineral exploration activities require robust predictive models that result in accurate mapping of the probability that mineral deposits can be found at a certain location. Random forest (RF) is a powerful data-driven machine learning method that has so far seen little use in mineral potential mapping. In this paper, the performance of RF regression for the likelihood of gold deposits in the Rodalquilar mining district is explored. The RF model was developed using a comprehensive exploration GIS database composed of: gravimetric and magnetic survey, a lithogeochemical survey of 59 elements, lithology and fracture maps, a Landsat 5 Thematic Mapper image and gold occurrence locations. The results of this study indicate that the use of RF for the integration of large multisource data sets used in mineral exploration and for prediction of mineral deposit occurrences offers several advantages over existing methods. Key advantages of RF include: (1) the simplicity of parameter setting; (2) an internal unbiased estimate of the prediction error; (3) the ability to handle complex data of different statistical distributions, responding to nonlinear relationships between variables; (4) the capability to use categorical predictors; and (5) the capability to determine variable importance. Additionally, variables that RF identified as most important coincide with well-known geologic expectations. To validate and assess the effectiveness of the RF method, gold prospectivity maps are also prepared using the logistic regression (LR) method. Statistical measures of map quality indicate that the RF method performs better than LR, with mean square errors equal to 0.12 and 0.19, respectively. The efficiency of RF is also better, achieving an optimum success rate when half of the area predicted by LR is considered.
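A minimal prospectivity-style sketch with scikit-learn's random forest, showing the two internals the abstract highlights: the out-of-bag error estimate and variable importances. The three synthetic "evidence layers" and labels are invented; they are not the Rodalquilar data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)
n = 500
# Hypothetical evidence layers per cell (e.g. gravity, magnetics, geochemistry).
X = rng.normal(size=(n, 3))
# Synthetic occurrence label, driven mainly by the first layer.
y = (X[:, 0] + 0.1 * rng.normal(size=n) > 1.0).astype(int)

rf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
rf.fit(X, y)

oob = rf.oob_score_                   # internal out-of-bag performance estimate
importance = rf.feature_importances_  # variable importance per evidence layer
```

As in the paper, the importance ranking can be checked against geologic expectations; here the first (label-driving) layer should dominate.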

20.
A quantitative valuation study has been made of Australian state surveys with the specific goals of (1) establishing the 'worth' of current programs upgrading state government geoscientific information infrastructure, and (2) considering the results of the valuation in terms of strategic planning. The study has been done from the perspective of the community as a whole and has been undertaken in two phases reflecting the different objectives of Australian state surveys in terms of the exploration industry and government policy-making. This paper reports on the second part of this valuation process, measuring the impact of upgraded survey data on government mineral policy decision processes. The valuation methodology developed is a comparative approach used to determine net benefit foregone by not upgrading information infrastructure. The underlying premise for the geological survey study is that existing and upgraded data sets will have a different probability that a deposit will be detected. The approach used in the valuation of geoscientific data introduces a significant technical component with the requirement to model both favorability of mineral occurrence and probability of deposit occurrence for two different generations of government data. The estimation of mineral potential uses modern quantitative methods, including the U.S. Geological Survey three-part resource-assessment process and computer-based prospectivity modeling. To test the methodology, mineral potential was assessed for porphyry copper type deposits in part of the Yarrol Province, central Queensland. Results of the Yarrol case study support the strategy of the state surveys to facilitate effective exploration by improving accuracy and acquiring new data, as part of resource management. It was determined in the Yarrol Province case study that in going from existing to upgraded data sets the area that would be considered permissible for the occurrence of porphyry type deposits almost doubled.
The implication of this result is that large tracts of potentially mineralized land would not be identified using existing data. Results of the prospectivity modeling showed a marked increase in the number of exploration targets and in target rankings using the upgraded data set. A significant reduction in discovery risk also is associated with the upgraded data set, a conclusion supported by the fact that known mines with surface exposure are not identified in prospectivity modeling using the existing data sets. These results highlight the absence in the existing data sets of information critical for the identification of prospective ground. Quantitative resource assessment and computer-based prospectivity modeling are seen as complementary processes that provide the support for the increasingly sophisticated needs of Australian survey clients. Significant additional gains to the current value of geoscientific data can be achieved through the in-house analysis and characterization of individual data sets, the integration and interpretation of data sets, and the incorporation of information on geological uncertainty.

