Similar Literature
20 similar records found
1.
Variability in per-cell classification accuracy is predominantly modelled with land-cover class as the explanatory variable, i.e. with users' accuracies from the error matrix. Logistic regression models were developed to include other explanatory variables: the heterogeneity in the 3×3 window around a cell, the size of the patch to which a cell belongs, and the complexity of the landscape in which a cell is located. Per cell, the probability of correct classification was significantly (α = 0.05) higher for cells with a less heterogeneous neighbourhood, for cells belonging to larger patches, and for cells in regions with a less heterogeneous landscape. To validate the models, a leave-one-out procedure was applied in which the absolute difference between the actual and the model-estimated number of correctly classified cells was summed over 55 regions in the Netherlands. The sum of differences decreased from 60.9 to 48.1 after the variables ‘patch size’ and ‘landscape dominance’ were added to the land-cover class model. Modelling spatial variability in this way therefore led to a substantial improvement in the estimation of per-cell classification accuracy.
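The kind of model this abstract describes can be sketched with a plain logistic regression on simulated data. Everything below is an illustrative assumption, not the study's fitted model: the predictor scales, effect sizes, and the simulated relationship are invented, with effect directions chosen to mirror the reported findings.

```python
import numpy as np

# Illustrative sketch only: simulate per-cell correctness driven by the three
# explanatory variables named in the abstract, then recover the effects with
# a hand-rolled logistic regression (gradient ascent on the log-likelihood).
rng = np.random.default_rng(0)
n = 2000
heterogeneity = rng.uniform(0, 1, n)   # 3x3-window heterogeneity (assumed scale)
log_patch = rng.normal(2.0, 1.0, n)    # log patch size (hypothetical units)
dominance = rng.uniform(0, 1, n)       # landscape dominance of the region

# Assumed effect directions mirror the findings: less heterogeneity and
# larger patches raise the probability of correct classification.
true_logit = 1.0 - 2.5 * heterogeneity + 0.6 * log_patch + 1.0 * dominance
correct = rng.binomial(1, 1.0 / (1.0 + np.exp(-true_logit)))

X = np.column_stack([np.ones(n), heterogeneity, log_patch, dominance])
beta = np.zeros(4)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    beta += 0.2 * X.T @ (correct - p) / n  # averaged log-likelihood gradient

print(beta)  # the heterogeneity coefficient beta[1] should come out negative
```

With a fitted model like this, the per-cell probability of correct classification can be estimated from the cell's neighbourhood rather than from its land-cover class alone, which is the improvement the study quantifies.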

4.
A GIS-based geographic cellular automaton (CA) modelling framework, SimUrban, is proposed for simulating and predicting urban development and evolution. Built on object-oriented techniques and developed with VS.NET in a GIS environment, the framework can integrate remote sensing and GIS data as well as new transition rules and geographic CA models, enabling the simulation of urban evolution and the assessment of its accuracy. Taking Jiading District, Shanghai as a case study, a geographic CA model based on principal component analysis (PCA) was used within SimUrban to simulate the district's urban development and evolution from 1989 to 2006.
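To give a flavour of how a geographic CA transition step works, here is a toy urban-growth automaton. The neighbourhood rule, the threshold, and the random suitability surface are invented for illustration; they are not SimUrban's actual PCA-based transition rules.

```python
import numpy as np

def ca_step(urban, suitability, threshold=3.0):
    """One toy transition: a non-urban cell urbanises when the product of its
    urban-neighbour count and its suitability score passes a threshold."""
    # Count the 8 urban neighbours of every cell via shifted copies.
    # np.roll wraps at the edges (toroidal boundary), fine for a sketch.
    nb = sum(np.roll(np.roll(urban, i, 0), j, 1)
             for i in (-1, 0, 1) for j in (-1, 0, 1) if (i, j) != (0, 0))
    develop = (urban == 0) & (nb * suitability >= threshold)
    return urban | develop.astype(urban.dtype)

rng = np.random.default_rng(1)
urban = (rng.random((50, 50)) < 0.05).astype(np.uint8)  # seed settlements
suit = rng.random((50, 50))       # stand-in for a PCA-derived suitability layer
initial_urban = int(urban.sum())
for _ in range(10):               # e.g. one step per simulated year
    urban = ca_step(urban, suit)
print(initial_urban, int(urban.sum()))
```

In a real framework the suitability layer would come from remote sensing and GIS variables, and the simulated map would be compared against an observed land-use map for accuracy assessment.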

7.
Sensitivity analysis of electromagnetic (EM) measurements is important for quantifying the effect of subsurface conductivity on the measured response, and knowledge of the sensitivity functions helps in solving inverse problems for field data. In the present paper, we derive the sensitivity functions for earth models with exponentially varying conductivity. The effect of the exponential variation of conductivity on the sensitivity functions is illustrated graphically. The effect of varying the period of the electromagnetic waves on the sensitivity functions is also studied, revealing their characteristic behaviour, which in turn provides information about earth models with exponentially decreasing or increasing conductivity.

8.
A data set on soil losses and controlling factors for 58 ephemeral gullies was collected in the Belgian loess belt from March 1997 to March 1999. Of the observed ephemeral gullies, 32 developed at the end of winter or in early spring (winter gullies) and 26 developed during summer (summer gullies). The data were used to test the physically based Ephemeral Gully Erosion Model (EGEM) and to compare its performance with simple topographical and morphological indices for predicting ephemeral gully erosion. Analysis shows that EGEM is not capable of predicting ephemeral gully cross-sections well. Although conditions for input parameter assessment were ideal, some parameters, such as channel erodibility, critical flow shear stress and local rainfall depth, showed great uncertainty. Rather than revealing EGEM's inability to predict ephemeral gully erosion, this analysis stresses the problematic nature of physically based models, since they often require input parameters that are unavailable or can hardly be obtained. With respect to simple topographical and morphological indices, this study shows that over 80% (winter gullies) and about 75% (summer gullies) of the variation in ephemeral gully volume can be explained when ephemeral gully length is known. Moreover, when previously collected data for ephemeral gullies in two Mediterranean study areas are pooled with the data for summer gullies formed in the Belgian loess belt, a single length (L)–volume (V) relation emerges (V = 0.048 L^1.29; R^2 = 0.91). These findings imply that predicting ephemeral gully length is a valuable alternative to predicting ephemeral gully volume directly. A simple procedure to predict ephemeral gully length from topographical thresholds is presented here. In addition, the empirical length–volume relation can be used to convert ephemeral gully lengths extracted from aerial photos into ephemeral gully volumes.
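The pooled relation reported above (V = 0.048 L^1.29, R^2 = 0.91) makes the photo-to-volume conversion a one-liner. Units are assumed to be metres and cubic metres, consistent with field-scale gully measurements.

```python
def gully_volume(length_m: float) -> float:
    """Estimate ephemeral gully volume (m^3) from gully length (m)
    using the pooled empirical relation V = 0.048 * L**1.29."""
    return 0.048 * length_m ** 1.29

# e.g. lengths digitised from aerial photographs
for L in (10.0, 50.0, 100.0):
    print(f"L = {L:5.0f} m  ->  V = {gully_volume(L):6.2f} m^3")
```

Because the relation is a power law fitted on pooled data, it should only be trusted within the range of gully lengths covered by the original surveys.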

10.
An iterative solution to the non-linear 3-D electromagnetic inverse problem is obtained by successive linearized model updates using the method of conjugate gradients. Full wave-equation modelling for controlled sources is employed to compute model sensitivities and predicted data in the frequency domain with an efficient 3-D finite-difference algorithm. Necessity dictates that the inverse problem be underdetermined, since realistic reconstructions require solving for tens of thousands of parameters. In addition, large-scale 3-D forward modelling is required, which can easily involve several million electric-field unknowns per solution. A massively parallel computing platform has therefore been utilized to obtain reasonable execution times, and results are given for the 1840-node Intel Paragon. The solution is demonstrated on a synthetic example with added Gaussian noise, in which the data were produced by an integral-equation forward-modelling code that is independent of the finite-difference code embedded in the inversion algorithm.
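The conjugate-gradient step at the heart of each linearized update is the textbook algorithm below, shown here for a small dense symmetric positive-definite system. The real inversion would apply it matrix-free, with matrix-vector products supplied by the finite-difference forward solver rather than an explicit matrix.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=200):
    """Solve A x = b for symmetric positive-definite A by conjugate gradients."""
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p   # conjugate direction update
        rs = rs_new
    return x

rng = np.random.default_rng(0)
M = rng.random((5, 5))
A = M @ M.T + 5 * np.eye(5)   # SPD by construction
b = rng.random(5)
x = conjugate_gradient(A, b)
print(np.allclose(A @ x, b))
```

Only matrix-vector products with A are needed, which is exactly why the method scales to inversions with tens of thousands of parameters.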

11.
We describe the development of the algorithms that comprise the Spatial Decision Support System (SDSS) CaNaSTA (Crop Niche Selection in Tropical Agriculture). The system was designed to assist farmers and agricultural advisors in the tropics in making crop suitability decisions. These decisions are frequently made in highly diverse biophysical and socioeconomic environments and must often rely on sparse datasets. The field trial datasets that provide a knowledge base for an SDSS such as this are characterised by ordinal response variables. Our approach has been to apply Bayes’ formula as a prediction model. This paper does not describe the entire CaNaSTA system, but rather concentrates on the algorithm of the central prediction model. The algorithm is tested using a simulated dataset to compare results with ordinal regression, and to test the stability of the model with increasingly sparse calibration data. For all but the richest input datasets it outperforms ordinal regression, as determined using Cohen’s weighted kappa. The model also performs well with sparse datasets. Whilst this is not as conclusive as testing with real-world data, the results are encouraging.
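Applying Bayes' formula to an ordinal response from categorical site factors can be sketched as a small frequency-table classifier. This is a naive-Bayes-style simplification (conditional independence of factors is assumed here for brevity); the class labels, factor names and smoothing choice are invented and are not CaNaSTA's actual formulation.

```python
from collections import Counter, defaultdict

def fit(records):
    """records: list of (features_dict, response). Returns class priors and
    per-(factor, value) response counts."""
    priors = Counter(resp for _, resp in records)
    cond = defaultdict(Counter)
    for feats, resp in records:
        for f, v in feats.items():
            cond[(f, v)][resp] += 1
    return priors, cond

def predict(priors, cond, feats):
    """Score each response class by Bayes' formula with add-one smoothing
    and return the most probable class."""
    total = sum(priors.values())
    scores = {}
    for resp, n in priors.items():
        p = n / total
        for f, v in feats.items():
            c = cond.get((f, v), Counter())
            p *= (c[resp] + 1) / (sum(c.values()) + len(priors))
        scores[resp] = p
    return max(scores, key=scores.get)

# Invented toy field-trial records: site factors -> ordinal performance class.
data = [({"soil": "clay", "rain": "high"}, "good"),
        ({"soil": "clay", "rain": "high"}, "good"),
        ({"soil": "sand", "rain": "low"}, "poor"),
        ({"soil": "sand", "rain": "high"}, "fair")]
priors, cond = fit(data)
print(predict(priors, cond, {"soil": "clay", "rain": "high"}))
```

Because the model is just counting, it degrades gracefully as calibration data become sparse, which is the property the paper's stability tests probe.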

12.
A modified Lax-Wendroff correction for wave propagation in attenuating and dispersive media described by Zener elements is presented. Unlike the full correction, this new technique is explicit and offers large computational savings. The technique can be applied to a wide variety of hyperbolic problems; here, the concept is illustrated for wave propagation in visco-acoustic media.

13.
The effects of stress on the 2-D permeability tensor of natural fracture networks were studied using a numerical method (Universal Distinct Element Code). On the basis of three natural fracture networks sampled around Dounreay, Scotland, numerical modelling was carried out to examine fluid flow in relation to variations in burial depth, differential stress and loading direction. It was demonstrated that the permeability of all the networks decreased with depth due to the closure of aperture. The permeability approached a minimum value at some depth below which little further variation occurred. Differential stress also had a significant effect on both the magnitude and direction of permeability. The permeability generally decreased with increasing major horizontal stress for a fixed minor horizontal stress, but the various networks considered showed different behaviours. A factor, termed the average deviation angle of maximum permeability (A_m), was defined to describe quantitatively how far the direction of the major permeability component deviates from the applied major stress direction. For networks whose behaviour is controlled by one or more sets of systematic fractures, A_m is significantly greater than zero, whereas networks comprised of non-systematic fractures have A_m close to zero. In general, fractured rock masses, especially those with one or more sets of systematic fractures, cannot be treated as equivalent porous media. Specification of the geometry of the network is a necessary, but not sufficient, condition for models of fluid flow. Knowledge of the in situ stress, and the deformation it induces, is necessary to predict the behaviour of the rock mass.

14.
We analysed the sensitivity of a decision-tree-derived forest type mapping to simulated data errors in the input digital elevation model (DEM), geology and remotely sensed (Landsat Thematic Mapper) variables. We used a stochastic Monte Carlo simulation model coupled with a one-at-a-time approach. The DEM error was assumed to be spatially autocorrelated, with its magnitude being a percentage of the elevation value. The error of the categorical geology data was assumed to be positional and limited to boundary areas. The Landsat data error was assumed to be spatially random, following a Gaussian distribution. Each layer was perturbed using its error model with increasing levels of error, and the effect on the forest type mapping was assessed. The results of the three sensitivity analyses were markedly different: the classification was most sensitive to the DEM error, less sensitive to the Landsat data errors, and only slightly sensitive to the geology data error used. A linear increase in error resulted in non-linear increases in effect for the DEM and Landsat errors, while the effect was linear for geology. For example, a DEM error as small as ±2% reduced the overall test accuracy by more than 2%. More importantly, the same uncertainty level caused nearly 10% of the study area, on average, to change its initial class assignment at each perturbation. A spatial assessment of the sensitivities indicates that most of the pixel changes occurred within those forest classes expected to be more sensitive to data error. In addition to characterising the effect of errors on forest type mapping using decision trees, this study has demonstrated the generality of Monte Carlo analysis for the sensitivity and uncertainty analysis of categorical outputs, which have characteristics distinct from those of numerical outputs.
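The one-at-a-time Monte Carlo scheme can be sketched as follows: perturb one input layer per batch of runs with its error model and record how often the classified output changes. The "classifier" below is a threshold stand-in, not the study's decision tree, and the DEM noise here is spatially uncorrelated for brevity, whereas the study used a spatially autocorrelated error model.

```python
import numpy as np

def classify(dem, band):
    # Stand-in for the decision tree: thresholds chosen arbitrarily,
    # producing four classes from elevation and one spectral band.
    return (dem > 100).astype(int) * 2 + (band > 0.5).astype(int)

rng = np.random.default_rng(42)
dem = rng.uniform(50, 150, (40, 40))    # synthetic elevation grid
band = rng.random((40, 40))             # synthetic spectral band
base = classify(dem, band)              # unperturbed reference map

def change_rate(n_runs=50, dem_err=0.0, band_sd=0.0):
    """Mean fraction of cells whose class changes under the given errors."""
    changed = []
    for _ in range(n_runs):
        d = dem * (1 + rng.normal(0, dem_err, dem.shape))  # relative DEM error
        b = band + rng.normal(0, band_sd, band.shape)      # additive band noise
        changed.append(np.mean(classify(d, b) != base))
    return float(np.mean(changed))

print(change_rate(dem_err=0.02))              # perturb the DEM only (+-2%)
print(change_rate(band_sd=0.05))              # perturb the imagery only
```

Repeating this per layer at increasing error levels yields the sensitivity curves the study compares across inputs.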

15.
In this paper we review recent progress in the use of reduced complexity models for predicting floodplain inundation. We review the theoretical basis for modelling floodplain flow with simplified hydraulic treatments, based on a dimensional analysis of the one-dimensional shallow water equations. We then review how such schemes can be applied in practice and consider issues of space discretization, time discretization and model parameterisation, before going on to consider model assessment procedures. We show that a key advantage of reduced complexity codes is that they force modellers to think about the minimum process representation necessary to predict particular quantities, and act as a check on any tendency towards reductionism. At the same time, however, the strong simplifying assumptions they make (compared to standard hydraulic codes) require us also to address the question "how simple can a model be and still be physically realistic?" We show that by making this debate about acceptable levels of abstraction explicit, reduced complexity codes allow progress on a number of long-standing debates in hydraulics.

16.
Novel digital data sources allow us to attain enhanced knowledge about the locations and mobilities of people in space and time. A fast-growing body of literature already demonstrates the applicability and feasibility of mobile phone-based data in the social sciences, treating mobile devices as proxies for people. However, working with such data imposes many theoretical and methodological challenges. One major issue is the uneven spatial resolution of mobile phone data, due to the spatial configuration of mobile network base stations and its spatial interpolation. To date, various interpolation techniques have been applied to transform mobile phone data into other spatial divisions, but these do not consider the temporality and societal context that shape human presence and mobility in space and time. The paper aims, first, to contribute to mobile phone-based research by addressing the need for more attention to the spatial interpolation of such data, and by proposing a dasymetric interpolation approach to enhance the spatial accuracy of mobile phone data. Second, it contributes to population modelling research by combining spatial, temporal and volumetric dasymetric mapping and integrating it with mobile phone data. In doing so, the paper presents a generic conceptual framework for a multi-temporal function-based dasymetric (MFD) interpolation method for mobile phone data. Empirical results demonstrate how the proposed interpolation method can improve the spatial accuracy of both night-time and daytime population distributions derived from different mobile phone data sets by taking advantage of ancillary data sources. The proposed interpolation method can be applied to both location- and person-based research, and is a fruitful starting point for improving spatial interpolation methods for mobile phone data. We share the implementation of our method on GitHub as open-access Python code.
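The core dasymetric idea, stripped of the temporal and functional refinements the paper adds, is a weighted reallocation: activity counts observed per coverage area are split among target zones in proportion to an ancillary weight such as residential floor area. The cell and zone names and numbers below are invented for illustration; this is not the paper's MFD implementation.

```python
def dasymetric(counts, weights):
    """counts: {cell: activity total}; weights: {cell: {zone: ancillary weight}}.
    Split each coverage cell's count among its overlapping target zones in
    proportion to the ancillary weights, preserving the overall total."""
    out = {}
    for cell, total in counts.items():
        w = weights[cell]
        s = sum(w.values())
        for zone, wz in w.items():
            out[zone] = out.get(zone, 0.0) + total * wz / s
    return out

# Two base-station coverage areas redistributed to three target zones
# using hypothetical building floor-area weights.
counts = {"A": 100, "B": 60}
weights = {"A": {"z1": 3, "z2": 1}, "B": {"z2": 1, "z3": 1}}
print(dasymetric(counts, weights))
```

The paper's multi-temporal extension would make the weights a function of time of day and building use, so that the same cell total is redistributed differently for night-time and daytime populations.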

18.
Lacustrine basins and their deposits are good paleoclimate recorders and contain rich energy resources. Shelf-margin clinoforms do exist in deep lacustrine basins, but with striking differences from those in deep marine basins, caused by the correlation between river-derived sediment supply and lake level. This study uses empirical relationships to calculate the water and sediment discharge from rivers, and the coeval lake level, during wet–dry cycles at time scales of tens of thousands of years. The sediment supply and lake-level changes drive a stratigraphic forward model used to understand how lacustrine clinoforms develop under different climate conditions. The results show that both wet and dry cycles can be associated with thick deep-water fan deposits, supporting the existing climate-driven lacustrine model proposed on the basis of field data (e.g. the Neogene Pannonian Basin and Eocene Uinta Basin). The wet period, with high sediment supply and rising lake level, creates a highly aggradational shelf, a progradational slope and thick bottomset deposits. This contrasts with marine basin settings, where a rising shelf-margin trajectory commonly indicates limited deep-water fan deposits. This work suggests that marine-based stratigraphic models cannot be directly applied to lacustrine basins.
