Similar documents
1.
The reliability of raster cellular automaton (CA) models for fine-scale land change simulations has been increasingly questioned, because regular pixels/grids cannot precisely represent irregular geographical entities and their interactions. Vector CA models can address these deficiencies because the vector data structure can represent realistic urban entities. This study presents a new land parcel cellular automaton (LP-CA) model for simulating urban land changes. The innovation of this model is the use of an ensemble learning method for automatic calibration. The proposed model is applied in Shenzhen, China. The experimental results indicate that bagging-Naïve Bayes yields the highest calibration accuracy among a set of selected classifiers. The assessment of neighborhood sensitivity suggests that the LP-CA model achieves the highest simulation accuracy with neighbor radius r = 2. The calibrated LP-CA is used to project future urban land use changes in Shenzhen, and the results are found to be consistent with those specified in the official city plan.

2.
Geo-tagged travel photos on social networks often contain location data such as points of interest (POIs), as well as users’ travel preferences. In this paper, we propose a hybrid ensemble learning method, BAyes-Knn, that predicts personalized tourist routes for travelers by mining their geographical preferences from these location-tagged data. Our method trains two types of base classifiers to jointly predict the next travel destination: (1) the K-nearest neighbor (KNN) classifier quantifies users’ location history, weather condition, temperature, and seasonality and uses a feature-weighted distance model to predict a user’s personalized interest in an unvisited location; (2) a Bayes classifier introduces a smooth kernel function to estimate a priori probabilities of features and then combines these probabilities to predict a user’s latent interest in a location. The outcomes of these subclassifiers are merged into one final prediction by the Borda count voting method. We evaluated our method on geo-tagged Flickr photos and Beijing weather data collected from 1 January 2005 to 1 July 2016. The results demonstrated that our ensemble approach outperformed 12 other baseline models. In addition, our framework achieves better prediction accuracy than context-aware significant travel-sequence-pattern recommendations and frequent travel-sequence patterns.
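The Borda count merge described above can be sketched in a few lines; the candidate destinations and rankings below are hypothetical, not taken from the paper's data.

```python
def borda_count(rankings):
    """Combine ranked candidate lists via Borda count: in an n-item
    ranking, the candidate at position i earns n - 1 - i points, and
    the candidate with the highest total wins."""
    scores = {}
    for ranking in rankings:
        n = len(ranking)
        for i, candidate in enumerate(ranking):
            scores[candidate] = scores.get(candidate, 0) + (n - 1 - i)
    return max(scores, key=scores.get)

# Three hypothetical base-classifier rankings of candidate destinations:
knn_rank = ["Summer Palace", "Forbidden City", "Temple of Heaven"]
bayes_rank = ["Forbidden City", "Summer Palace", "Temple of Heaven"]
other_rank = ["Summer Palace", "Temple of Heaven", "Forbidden City"]
print(borda_count([knn_rank, bayes_rank, other_rank]))  # → Summer Palace
```

Because only ranks are combined, the base classifiers' scores never need to be on a common scale, which is what makes the method attractive for merging a KNN model with a Bayes model.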

3.
魏娜  贺晨昕  刘佩佩 《干旱区地理》2018,41(6):1178-1183
Starting from the practical needs of short-term climate prediction, and targeting county-level monthly temperature forecasts, this study applies a statistical downscaling method based on stepwise regression and principal component analysis (i.e., empirical orthogonal functions). Using surface-station temperature observations, large-scale climate variables from the US National Centers for Environmental Prediction and National Center for Atmospheric Research (NCEP/NCAR), and the National Climate Center's monthly dynamical extended-range forecast (DERF) data, January and July temperatures were predicted for 96 counties of Shaanxi Province over 1982-2015, a statistical downscaling model was built, and its predictive skill was tested by cross-validation. The results show that the statistical downscaling method based on empirical orthogonal functions and stepwise regression is reasonable and usable for predicting January and July temperatures in Shaanxi. Among the 96 counties, the sign-consistency rate between predicted and observed anomalies exceeded 60% in 50 counties for January and in 60 counties for July. The predictions capture the temperature trend well, but their amplitude of variation is clearly smaller than that of the observations.
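The skill measure used above, the rate at which predicted and observed anomalies share a sign, can be sketched as follows; the anomaly values are made up for illustration.

```python
def sign_consistency_rate(predicted, observed):
    """Fraction of cases where the predicted and observed temperature
    anomalies have the same sign (a zero anomaly counts as a miss)."""
    same = sum(1 for p, o in zip(predicted, observed) if p * o > 0)
    return same / len(predicted)

# Hypothetical anomalies (degrees C) for four county stations:
pred = [1.2, -0.5, 0.3, -0.1]
obs = [0.8, -0.2, -0.4, -0.3]
print(sign_consistency_rate(pred, obs))  # → 0.75
```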

4.
Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike previous techniques, which call for an approximate forward model (filter) to integrate secondary data into geologically constructed models, the proposed approach develops an intermediate space onto which all the primary and secondary data are easily mapped. Defining the intermediate space, which may be achieved via artificial intelligence tools such as neural networks and fuzzy inference systems, eliminates the need for the filters used in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, and the model is also consistent with the map of secondary data.

5.
Prefetching is a process in which the necessary portion of data is predicted and loaded into memory beforehand. The increasing usage of geographic data in different types of applications has motivated the development of different prefetching techniques. Each prefetching technique serves a specific type of application, such as two-dimensional geographic information systems or three-dimensional visualization, and each one is crafted for the corresponding navigation patterns. However, as the boundary between these application types blurs, these techniques become insufficient for hybrid applications (such as digital moving maps), which embody various capabilities and navigation patterns. Therefore, a set of techniques should be used in combination to handle different prefetching requirements. In this study, a priority-based tile prefetching approach is proposed, which enables the ensemble usage of various techniques at the same time. The proposed approach manages these techniques dynamically through a fuzzy-logic-based inference engine to increase prefetching performance and to adapt to various exhibited behaviours. This engine performs adaptive decisions about the advantages of each technique according to their individual accuracy and activity level using fuzzy logic to determine how each prefetching technique performs. The results obtained from the experiments showed that up to a 25% increase in prefetching performance is achieved with the proposed ensemble usage over individual usage. A generic model for prefetching techniques was also developed and used to describe the given approach. Finally, a cross-platform software framework with four different prefetching techniques was developed to let other users utilize the proposed approach.

6.
Police databases hold a large amount of crime data that could be used to inform us about current and future crime trends and patterns. Predictive analysis aims to optimize the use of these data to anticipate criminal events. It utilizes specific statistical methods to predict the likelihood of new crime events at small spatiotemporal units of analysis. The aim of this study is to investigate the potential of applying predictive analysis in an urban context. To this end, the available crime data for three types of crime (home burglary, street robbery, and battery) are spatially aggregated to grids of 200 by 200 m and retrospectively analyzed. An ensemble model is applied, synthesizing the results of a logistic regression and neural network model, resulting in bi-weekly predictions for 2014, based on crime data from the previous three years. Temporally disaggregated (day versus night predictions) monthly predictions are also made. The quality of the predictions is evaluated based on the following criteria: direct hit rate (proportion of incidents correctly predicted), precision (proportion of correct predictions versus the total number of predictions), and prediction index (ratio of direct hit rate versus proportion of total area predicted as high risk). Results indicate that it is possible to attain functional predictions by applying predictive analysis to grid-level crime data. The monthly predictions with a distinction between day and night produce better results overall than the bi-weekly predictions, indicating that the temporal resolution can have an important impact on the prediction performance.
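The three evaluation criteria can be sketched directly from their definitions; the grid counts below are hypothetical, not the study's data.

```python
def evaluate(predicted_hot, incidents, n_cells):
    """Direct hit rate, precision, and prediction index for grid-level
    crime predictions. predicted_hot: set of cell ids flagged high-risk;
    incidents: dict mapping cell id -> incident count; n_cells: grid size."""
    total = sum(incidents.values())
    hits = sum(c for cell, c in incidents.items() if cell in predicted_hot)
    hit_rate = hits / total                    # incidents correctly predicted
    correct = sum(1 for cell in predicted_hot if incidents.get(cell, 0) > 0)
    precision = correct / len(predicted_hot)   # correct predictions / predictions
    prediction_index = hit_rate / (len(predicted_hot) / n_cells)  # lift over area share
    return hit_rate, precision, prediction_index

# 100 cells, 4 flagged high-risk, 10 incidents in total:
print(evaluate({1, 2, 3, 4}, {1: 5, 2: 3, 7: 2}, 100))
```

The prediction index normalizes the hit rate by the share of area flagged, so a value of 1 means no better than flagging cells at random.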

7.
Spatial optimization is complex because it usually involves numerous spatial factors and constraints. The optimization becomes more challenging when a large set of spatial data with fine resolutions is used. This article presents an agent-based model for optimal land allocation (AgentLA) that maximizes the total amount of land-use suitability and the compactness of patterns. The essence of the optimization is the collective effort of agents in formulating the optimal patterns. A local and global search strategy is proposed to inform the agents to select sites properly. Three sets of hypothetical data were first used to verify the optimization effects. AgentLA was then applied to actual land allocation optimization problems in Panyu city in the Pearl River Delta. The study demonstrates that the proposed method performs better than the simulated annealing method for solving complex spatial optimization problems. Experiments also indicate that the proposed model can produce patterns that are very close to the global optima.

8.

Vector-based cellular automata (VCA) models have been applied in land use change simulations at fine scales. However, the neighborhood effects of the driving factors are rarely considered in the exploration of the transition suitability of cells, leading to lower simulation accuracy. This study proposes a convolutional neural network (CNN)-VCA model that adopts the CNN to extract the high-level features of the driving factors within a neighborhood of an irregularly shaped cell and discover the relationships between multiple land use changes and driving factors at the neighborhood level. The proposed model was applied to simulate urban land use changes in Shenzhen, China. Compared with several VCA models using other machine learning methods, the proposed CNN-VCA model obtained the highest simulation accuracy (figure-of-merit = 0.361). The results indicated that the CNN-VCA model can effectively uncover the neighborhood effects of multiple driving factors on the developmental potential of land parcels and obtain more details on the morphological characteristics of land parcels. Moreover, the land use patterns of 2020 and 2025 under an ecological control strategy were simulated to provide decision support for urban planning.
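The figure-of-merit reported above has a standard definition in land change validation: hits over the union of observed and simulated change. A minimal sketch, with hypothetical parcel ids:

```python
def figure_of_merit(observed, simulated):
    """Figure-of-merit: correctly simulated change (hits) divided by
    hits + misses + false alarms. Inputs are sets of parcel/cell ids
    that changed in reality and in the simulation."""
    hits = len(observed & simulated)
    misses = len(observed - simulated)
    false_alarms = len(simulated - observed)
    return hits / (hits + misses + false_alarms)

print(figure_of_merit({1, 2, 3}, {2, 3, 4}))  # → 0.5
```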

9.
Validation of high-resolution simulated temperature and precipitation data for China
朱华忠  罗天祥 《地理研究》2003,22(3):349-359
The PRISM model generates climate maps based on geographic features and regression statistics. Using observations from more than 2,450 weather stations in China and neighbouring countries and regions, PRISM was applied to generate monthly temperature and precipitation data for China at a 2.5′×2.5′ (≈4-5 km) resolution. The simulations were validated against long-term climate observations from 18 field stations of the Chinese Ecosystem Research Network, which were independent of the input data. The results show that PRISM reproduces the spatial distribution and seasonal variation of temperature and precipitation in China well. Except in alpine and subtropical regions, where differences in land cover and local terrain affect the results, the trend line between simulated and observed values is essentially consistent with the 1:1 line, with a significant correlation; the performance for precipitation is slightly poorer.

10.
In this paper we present a method that allows delineation of geologic structures in a bi-modal lithotype setting. We propose to use gravity data in combination with a priori information about the density contrast between the two lithotypes. The iterative method uses an objective function with five tunable parameters. Using an efficient parameter search, suitable ranges of these parameters are investigated to determine their optimal values, which in turn ensures good inversion results.
The approach produces structural images of the subsurface without the need for an a priori density model; the depth to the top of the inhomogeneity is also retrieved.
Besides synthetic simulations, the methodology has also been applied to a small gravity data set acquired by the industry over a basinal structure. A consistent, bi-modal image of the bedrock depression is obtained from the data, which, in this case, was the goal. Other potential areas of application include delineation of salt structures and ore deposits.

11.
Evaluating the recharge and groundwater dynamics of an aquifer is an important step toward finding a proper groundwater management scenario. Here, the statistical Kendall tau test was used to relate groundwater levels to hydro-meteorological parameters (e.g., precipitation, temperature, evaporation). Recharge to the aquifer was estimated with the Soil and Water Assessment Tool to identify critical areas/locations. Moreover, the spatiotemporal variability of groundwater levels was quantified using a space–time variogram. The overall characterization method was applied to the shallow alluvial aquifer of Kanpur city in India, using groundwater level data from 56 monitoring piezometer locations from March 2006 to June 2011. Groundwater level shows a relatively high correlation with temperature. The performance of the geostatistical model was evaluated by comparison with observed groundwater levels from January 2011 to June 2011 for two scenarios: “with limited spatiotemporal data” and “without spatiotemporal data.” The results show that spatiotemporal prediction of groundwater level can be performed even for unmonitored/missing data, demonstrating the potential applicability of the method to a general aquifer system.
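Kendall's tau, used above to relate water levels to climate drivers, is computed from concordant and discordant pairs of observations; a minimal tau-a sketch (no tie correction), with hypothetical station data:

```python
def kendall_tau(x, y):
    """Kendall's tau-a: (concordant - discordant) pairs divided by the
    total number of pairs; +1 for a perfectly increasing relation,
    -1 for a perfectly decreasing one."""
    n = len(x)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

# Hypothetical monthly mean temperature (C) vs. groundwater level (m):
print(kendall_tau([24.1, 26.3, 28.0, 30.2], [8.2, 8.6, 9.1, 9.4]))  # → 1.0
```

Because tau depends only on the ordering of values, it is robust to the skewed, non-normal distributions common in hydro-meteorological records.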

12.
The case-based-reasoning cellular automaton model (CBR-CA) is improved in two ways: the macro-scale transition probability of each class is added to the objective function to reflect class-specific change characteristics, and temporal weights are introduced into the transition probabilities to enable simulation across time scales. Because land-cover change is diverse and its spatial structure complex, a Monte Carlo (M-C) procedure determines the final conversion class of each cell. Taking the source region of the Yellow River as the test area, an initial case base was built from 1977 and 1985 land-cover data, and land-cover changes in 1995, 2000, and 2006 were simulated. The simulated conversion quantities for each class agree with reality: the overall errors for the three years are 0.002%, 0.012%, and 0.005%, respectively, and the locational accuracy is generally above 70%. Future land-cover scenarios were also projected. The model is applicable to simulating and predicting multi-class, long-time-series regional land-cover change.

13.
Cellular automata (CA) models have been widely employed to simulate urban growth and land use change. In order to represent urban space more realistically, new approaches to CA models have explored the use of vector data instead of traditional regular grids. However, the use of irregular CA-based models brings new challenges as well as opportunities. The most strongly affected factor when using an irregular space is neighbourhood. Although neighbourhood definition in an irregular environment has been reported in the literature, the question of how to model the neighbourhood effect remains largely unexplored. In order to shed light on this question, this paper proposed the use of spatial metrics to characterise and measure the neighbourhood effect in irregular CA-based models. These metrics, originally developed for raster environments, namely the enrichment factor and the neighbourhood index, were adapted and applied in the irregular space employed by the model. Using the results of these metrics, distance-decay functions were calculated to reproduce the push-and-pull effect between the simulated land uses. The outcomes of a total of 55 simulations (5 sets of different distance functions and 11 different neighbourhood definition distances) were compared with observed changes in the study area during the calibration period. Our results demonstrate that the proposed methodology improves the outcomes of the urban growth simulation model tested and could be applied to other irregular CA-based models.
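The enrichment factor mentioned above measures how over-represented a land use is in a cell's neighbourhood relative to the whole study area; a minimal sketch as a ratio of shares, with hypothetical counts:

```python
def enrichment_factor(neigh_counts, global_counts, use):
    """Enrichment factor of land use `use`: its share among a cell's
    neighbours divided by its share in the whole study area. Values
    above 1 mean `use` is over-represented in the neighbourhood."""
    neigh_share = neigh_counts.get(use, 0) / sum(neigh_counts.values())
    global_share = global_counts[use] / sum(global_counts.values())
    return neigh_share / global_share

neigh = {"urban": 6, "rural": 4}        # land uses among 10 neighbours
whole = {"urban": 200, "rural": 800}    # land uses over the whole map
print(enrichment_factor(neigh, whole, "urban"))
```

Computed at increasing neighbourhood distances, these ratios trace out exactly the kind of distance-decay curves the paper fits to reproduce push-and-pull effects between land uses.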

14.
This paper presents a new derivative-free search method for finding models of acceptable data fit in a multidimensional parameter space. It falls into the same class of method as simulated annealing and genetic algorithms, which are commonly used for global optimization problems. The objective here is to find an ensemble of models that preferentially sample the good data-fitting regions of parameter space, rather than seeking a single optimal model. (A related paper deals with the quantitative appraisal of the ensemble.)
The new search algorithm makes use of the geometrical constructs known as Voronoi cells to drive the search in parameter space. These are nearest neighbour regions defined under a suitable distance norm. The algorithm is conceptually simple, requires just two 'tuning parameters', and makes use of only the rank of a data fit criterion rather than the numerical value. In this way all difficulties associated with the scaling of a data misfit function are avoided, and any combination of data fit criteria can be used. It is also shown how Voronoi cells can be used to enhance any existing direct search algorithm, by intermittently replacing the forward modelling calculations with nearest neighbour calculations.
The new direct search algorithm is illustrated with an application to a synthetic problem involving the inversion of receiver functions for crustal seismic structure. This is known to be a non-linear problem, where linearized inversion techniques suffer from a strong dependence on the starting solution. It is shown that the new algorithm produces a sophisticated type of 'self-adaptive' search behaviour, which to our knowledge has not been demonstrated in any previous technique of this kind.
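The core geometric operation above, deciding which sampled model's Voronoi cell a new point falls in, reduces to a nearest-neighbour query under the chosen norm; a minimal sketch with hypothetical 2-D model coordinates:

```python
import math

def voronoi_cell_owner(models, point):
    """Return the previously sampled model whose Voronoi cell contains
    `point`, i.e. the nearest neighbour under the Euclidean norm."""
    return min(models, key=lambda m: math.dist(m, point))

# Three previously sampled models in a 2-D parameter space:
models = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
print(voronoi_cell_owner(models, (9.0, 1.0)))  # → (10.0, 0.0)
```

Replacing a forward-model evaluation at `point` with the stored misfit of its Voronoi cell owner is the substitution the abstract describes for speeding up any direct search.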

15.
There are many different methods to calibrate cellular automata (CA) models for better simulation of urban land-use changes. However, few studies have combined parameter updating and error control using local data in CA calibration procedures. This paper presents a self-modifying CA model (SM-CA) that uses the dual ensemble Kalman filter (dual EnKF), which enables the CA model to simultaneously update model parameters and simulation results by merging observation data (local data). We applied the proposed model to simulate urban land-use changes over a 13-year period (1993–2005) in Dongguan City, a rapidly urbanizing region in south China. The simulation results indicate that this model outperforms conventional logistic-regression CA and decision-tree CA models. In a validation using a cross-tabulation matrix, the SM-CA simulations have allocation disagreements of 10.18%, 19.64%, and 30.03% in 1997, 2001, and 2005, respectively, which are 2.12%, 2.47%, and 6% lower than those of conventional logistic-regression CA models.

16.
Research into aeolian dune form and dynamics has benefited from simple and abstract cellular automata computer models. Many of these models are based upon a seminal framework proposed by Werner (1995). Unfortunately, most versions of this model are not publicly available or are not provided in a format that promotes widespread use. In our view, this hinders progress in linking model simulations to empirical data (and vice versa). To this end, we introduce an accessible, graphical user interface (GUI) version of the Werner model. The novelty of this contribution is that it provides a simple interface and detailed instructions that encourage widespread use and extension of the Werner dune model for research and training purposes. By lowering barriers for researchers to develop and test hypotheses about aeolian dune and dune field patterns, this release addresses recent calls to improve access to earth surface models.
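A minimal one-dimensional Werner-style slab model captures the essentials the abstract refers to: stochastic erosion, downwind hops, probabilistic deposition, and avalanching. The parameter values below are illustrative, not those of the released GUI model, and avalanching is simplified to the downwind direction.

```python
import random

def werner_step(h, hop=5, p_sand=0.6, p_bare=0.4, repose=2, rng=random):
    """One iteration of a minimal 1-D Werner-style dune CA on the
    height profile h (slab counts, periodic boundary)."""
    n = len(h)
    i = rng.randrange(n)
    if h[i] == 0:                      # nothing to erode at a bare site
        return h
    h[i] -= 1                          # pick up one slab
    j = i
    while True:                        # hop downwind until deposition
        j = (j + hop) % n
        if rng.random() < (p_sand if h[j] > 0 else p_bare):
            h[j] += 1
            break
    moved = True                       # avalanche back to the repose limit
    while moved:
        moved = False
        for k in range(n):
            r = (k + 1) % n
            if h[k] - h[r] > repose:
                h[k] -= 1
                h[r] += 1
                moved = True
    return h

rng = random.Random(42)
h = [3] * 20                           # flat initial bed, 60 slabs
for _ in range(500):
    werner_step(h, rng=rng)
print(sum(h))  # → 60 (slabs are conserved)
```

Even this stripped-down version self-organizes the bed into bumps and hollows, which is why the Werner framework is such a popular teaching and hypothesis-testing tool.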

17.
杨青生  黎夏 《地理学报》2006,61(8):882-894
To simulate the complex evolution of geographical phenomena more effectively, a new method is proposed that uses rough set theory to determine the uncertain transition rules of cellular automata (CA). CA can effectively simulate the evolution of many geographical phenomena through local rules, but good methods for defining CA transition rules are still lacking. Transition rules are usually defined heuristically; they are static, and their parameter values are mostly deterministic, which limits their ability to represent uncertain, complex phenomena such as urban expansion and disease spread. By using rough sets to discover knowledge from GIS and remote-sensing data, uncertain CA transition rules can be found automatically. A rough-set-based CA shortens modelling time while extracting non-deterministic transition rules and better reflecting the characteristics of complex systems. The proposed method was used to simulate the urban development of Shenzhen and achieved better simulation results than the traditional MCE method.

18.
This paper presents a GIS-based mathematical model for the simulation of floodplain sedimentation. The model comprises two components: (1) the existing hydrodynamic WAQUA model that calculates two-dimensional water flow patterns; and (2) the SEDIFLUX model that calculates deposition of sediment based on a simple mass balance concept with a limited number of model parameters. The models were applied to simulate floodplain sediment deposition over river reaches of several kilometres in length. The SEDIFLUX model has been calibrated and validated using interpolated raster maps of sediment deposition observed after the large magnitude December 1993 flood on the embanked floodplain of the lower river Rhine in the Netherlands. The model appeared to be an adequate tool to predict patterns of sediment deposition as the product of the complex interaction among river discharge and sediment concentration, floodplain topography, and the resulting water flow patterns during various discharge levels. In the investigated areas, the resulting annual average sedimentation rates varied between 0.5 mm/year and 4.0 mm/year. The role of the most important mechanisms governing the spatial patterns of overbank deposition, i.e. inundation frequency, sediment load, floodplain topography and its influence on the flow patterns over the floodplain, are discussed.
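One common way to sketch the simple mass-balance idea behind such a deposition component is a settling flux capped by the sediment available in the water column; the function and values below are a hypothetical illustration, not SEDIFLUX's actual formulation or parameters.

```python
def deposited_mass(conc, settling_velocity, depth, dt):
    """Sediment mass deposited per unit bed area over time step dt:
    settling flux (w_s * C * dt), capped by the mass available in
    the overlying water column (C * depth). SI units assumed."""
    flux = settling_velocity * conc * dt
    available = conc * depth
    return min(flux, available)

# 0.1 kg/m3 suspended sediment, w_s = 1 mm/s, 2 m deep water, 1000 s:
print(deposited_mass(0.1, 0.001, 2.0, 1000.0))
```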

19.
Faults strongly impact groundwater flow in the unconsolidated sediments of the Lower Rhine Embayment. Hydraulic head maps show that many individual faults form a barrier to fluid flow whereas relay structures in these faults are sites of hydraulic contact between otherwise separated aquifers. The fluid flow patterns around the Rurrand Fault close to the largest open‐pit mine in the Lower Rhine Embayment is one of the first well‐documented examples of fluid flow around a fault relay zone. The effect of clay smearing could be quantified using the Shale Gouge Ratio (SGR) method that is common in hydrocarbon‐related studies but has not been applied to groundwater flow data so far. The effect of fault relay zones on groundwater flow is analysed using numerical simulations. It is concluded that fault relay needs special consideration in the evaluation of the sealing capacities of faults in sedimentary basins. Moreover, it is demonstrated that the SGR methodology is a promising tool for the estimation of fault zone hydraulic properties in hydrogeological modelling.
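The Shale Gouge Ratio used above is the clay-weighted fraction of the stratigraphy that has slipped past a point on the fault plane; a minimal sketch with hypothetical bed data:

```python
def shale_gouge_ratio(beds, throw):
    """SGR (%) at a point on a fault plane: sum of bed thickness times
    clay fraction over the slipped interval, divided by the throw.
    beds: list of (thickness, clay_fraction) pairs; throw: fault throw,
    in the same length unit as the thicknesses."""
    clay_thickness = sum(t * v for t, v in beds)
    return 100.0 * clay_thickness / throw

# Two 10 m beds (50% and 10% clay) slipped past the point, 20 m throw:
print(shale_gouge_ratio([(10.0, 0.5), (10.0, 0.1)], 20.0))  # → 30.0
```

Higher SGR values indicate more clay entrained into the fault gouge and hence a greater likelihood that the fault seals.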

20.
The Delphi expert-consensus forecasting model has been widely applied since 1948 with mixed success. This paper presents a version of the Delphi, modified to organize and generate classroom discussion, and it describes an experiment in which this technique was used to teach a class in future urban environments. An analysis of the response patterns of the students suggests that student consensus does develop and that the technique is a successful nonlecture format.
