Similar Documents
Found 20 similar documents (search time: 15 ms)
1.
The introduction of automated generalisation procedures in map production systems requires that generalisation systems are capable of processing large amounts of map data in acceptable time and that cartographic quality is similar to traditional map products. With respect to these requirements, we examine two complementary approaches that should improve generalisation systems currently in use by national topographic mapping agencies. Our focus is particularly on self-evaluating systems, taking as an example those systems that build on the multi-agent paradigm. The first approach aims to improve cartographic quality by utilising cartographic expert knowledge relating to spatial context. More specifically, we introduce expert rules for the selection of generalisation operations based on a classification of buildings into five urban structure types: inner city, urban, suburban, rural, and industrial and commercial areas. The second approach aims to utilise machine learning techniques to extract heuristics that reduce the search space and hence the time in which a good cartographic solution is reached. Both approaches are tested individually and in combination for the generalisation of buildings from map scale 1:5000 to the target map scale of 1:25 000. Our experiments show improvements in terms of efficiency and effectiveness. We provide evidence that the two approaches complement each other and that a combination of expert and machine-learnt rules gives better results than either approach alone. Both approaches are sufficiently general to be applicable to self-evaluating, constraint-based systems other than multi-agent systems, and to feature classes other than buildings. We also identify problems that result from the difficulty of formalising cartographic quality by means of constraints for the control of the generalisation process.
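The expert-rule approach above maps each urban structure type to a preferred set of generalisation operations. A minimal sketch of that dispatch, assuming illustrative operator sequences (the five structure types come from the abstract; the specific operators are invented for illustration, not the paper's actual rule base):

```python
# Hypothetical rule table: urban structure type -> candidate operator sequence.
# The five keys follow the abstract; the operator lists are assumptions.
OPERATOR_RULES = {
    "inner_city": ["amalgamate", "simplify", "enlarge_to_minimum_size"],
    "urban": ["typify", "simplify"],
    "suburban": ["displace", "simplify"],
    "rural": ["eliminate_small", "simplify"],
    "industrial_commercial": ["aggregate", "simplify"],
}

def select_operations(structure_type: str) -> list[str]:
    """Return the candidate generalisation operators for a classified area."""
    try:
        return OPERATOR_RULES[structure_type]
    except KeyError:
        raise ValueError(f"unknown structure type: {structure_type}")

print(select_operations("rural"))
```

In a constraint-based system, such a table would prune the operator search space before the self-evaluation loop runs, which is where the reported efficiency gain comes from.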

6.
As they increase in popularity, social media are regarded as important sources of information on geographical phenomena. Studies have also shown that people rely on social media to communicate during disasters and emergency situations, and that the exchanged messages can be used to gain insight into the situation. Spatial data mining techniques are one way to extract relevant information from social media. In this article, our aim is to contribute to this field by investigating how graph clustering can be applied to support the detection of geo-located communities on Twitter in disaster situations. For this purpose, we have enhanced the fast-greedy optimization of modularity (FGM) clustering algorithm with semantic similarity so that it can deal with the complex social graphs extracted from Twitter. We then coupled the enhanced FGM with the varied density-based spatial clustering of applications with noise spatial clustering algorithm to obtain spatial clusters at different temporal snapshots. The method was tested in a case study on typhoon Haiyan in the Philippines, and Twitter's different interaction modes were compared for creating the graph of users and detecting communities. The experiments show that communities relevant to identifying areas where disaster-related incidents were reported can be extracted, and that the enhanced algorithm outperforms the generic one in this task.
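The semantic enhancement described above can be sketched as follows: edge weights in the user graph combine interaction frequency with a text-similarity score, and greedy modularity optimisation then runs on the weighted graph. This is a toy sketch, not the authors' pipeline; the weighting function, the similarity values, and the example graph are all assumptions.

```python
# Sketch: semantic weighting of a Twitter interaction graph before
# greedy modularity community detection (networkx implementation of FGM).
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def semantic_weight(n_interactions: int, similarity: float) -> float:
    """Combine interaction count with a semantic similarity score in [0, 1]."""
    return n_interactions * (1.0 + similarity)

G = nx.Graph()
# (user_a, user_b, number of mentions, assumed tweet-text similarity)
edges = [("a", "b", 5, 0.9), ("b", "c", 4, 0.8), ("c", "a", 3, 0.7),
         ("d", "e", 6, 0.9), ("e", "f", 5, 0.8), ("a", "d", 1, 0.1)]
for u, v, n, sim in edges:
    G.add_edge(u, v, weight=semantic_weight(n, sim))

communities = greedy_modularity_communities(G, weight="weight")
print([sorted(c) for c in communities])
```

The idea is that topically unrelated interactions (the weak `a`–`d` bridge here) contribute little modularity, so the detected communities are both structurally and semantically coherent.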

7.
Rivers and streams originating in the surrounding mountainous area are the major sources of salt in the Salinas Grandes basin (Córdoba, Argentina). These rivers infiltrate when they reach the sandflat or the fringes of the mudflat, feeding springs which often form shallow lakes. At present, the lakes are distant from the playa edge, allowing inflow waters to dissolve ancient (Pleistocene?) evaporite beds. In the sandflat environment, two dominant water types have been recognized (SO₄²⁻-Cl⁻-HCO₃⁻-Na⁺ and Cl⁻-SO₄²⁻-HCO₃⁻-Na⁺), both considered original members of the brine in the saline complex. Two main sources of solutes were distinguished, one related to the waters supplied by the southern sector and the other to waters of the eastern sector. As a result of chemical evolution in the playa environment, all brines belong to the neutral type (Cl⁻-SO₄²⁻-Na⁺). Following Hardie and Eugster's (1970) model, waters from the southern sector should evolve towards an alkaline brine (Cl⁻-SO₄²⁻-HCO₃⁻-Na⁺), whereas those from the eastern sector should evolve towards a neutral one (Cl⁻-SO₄²⁻-Na⁺). A computer simulation was carried out to model the chemical evolution of the source waters. The results obtained by this methodology showed the same dichotomy (alkaline vs. neutral) established by Hardie and Eugster's (1970) model. The deficit in alkalinity could not be explained by any mechanism published to date. Gypsum dissolution is the most likely mechanism accounting for the chemical evolution of the waters investigated. When this process is included in the computations, the Ca²⁺ supplied by the gypsum beds increases the ion activity product (aCa²⁺·aCO₃²⁻) and produces a significant change in the 2Ca²⁺/(2CO₃²⁻+HCO₃⁻) ratio, switching from values less than 1 to values greater than 1. This process determines the precipitation of calcite and leads to a decrease in alkalinity, which in turn explains the existence of a neutral brine in the saline complex. An intermediate-salinity brine was detected in the mudflat which, according to the model (Hardie and Eugster, 1970), should evolve towards a SO₄²⁻-free neutral brine (Cl⁻-Na⁺-Ca²⁺). The absence of this brine type may be explained through mixing processes.
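The chemical-divide logic invoked above can be stated compactly: once calcite precipitates, the ratio 2[Ca²⁺]/(2[CO₃²⁻] + [HCO₃⁻]) decides whether the residual water evolves towards an alkaline or a neutral brine. A minimal sketch, with illustrative (not measured) concentrations:

```python
# Hedged sketch of the Hardie-Eugster (1970) chemical divide at calcite
# precipitation. Inputs are molal concentrations; example values are invented.
def brine_path(ca: float, co3: float, hco3: float) -> str:
    """Classify the evolutionary path of an evaporating water."""
    ratio = (2 * ca) / (2 * co3 + hco3)
    # ratio < 1: carbonate alkalinity survives calcite removal -> alkaline brine
    # ratio > 1: Ca2+ survives, alkalinity is consumed -> neutral (Cl-SO4-Na) brine
    return "alkaline" if ratio < 1 else "neutral"

print(brine_path(ca=0.002, co3=0.0001, hco3=0.005))  # alkalinity-dominated water
print(brine_path(ca=0.010, co3=0.0001, hco3=0.003))  # gypsum-enriched water
```

Adding Ca²⁺ from gypsum dissolution, as the abstract argues, pushes the ratio above 1 and switches the predicted path from alkaline to neutral.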

The integration of multisource heterogeneous spatial data is one of the major challenges facing many spatial data users. To facilitate multisource spatial data integration, many initiatives, including federated databases, feature manipulation engines (FMEs), ontology-driven data integration and spatial mediators, have been proposed. The major aim of these initiatives is to harmonize data sets and establish interoperability between different data sources.

However, spatial data integration and interoperability is not a purely technical exercise; nontechnical issues, including institutional, policy, legal and social issues, are also involved. The Spatial Data Infrastructure (SDI) framework aims to address both the technical and nontechnical issues and to facilitate data integration. SDIs aim to provide a holistic platform for users to interact with spatial data through technical and nontechnical tools.

This article discusses the complexity of the challenges associated with data integration and proposes a tool that facilitates data harmonization through the assessment of multisource spatial data sets against a set of measures. The measures represent harmonization criteria and are defined based on the requirements of the respective jurisdiction. Information on the technical and nontechnical characteristics of the spatial data sets is extracted from metadata and the actual data. The tool then evaluates these characteristics against the measures and identifies the items of inconsistency. It also proposes available manipulation tools or guidelines to overcome inconsistencies among the data sets, and can assist practitioners and organizations in avoiding the time-consuming and costly process of validating data sets for effective data integration.
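The assessment step described above reduces to checking extracted dataset characteristics against jurisdiction-defined measures and reporting the inconsistent items. A minimal sketch; the field names and measures below are invented for illustration, as the abstract does not specify a schema:

```python
# Sketch: evaluate a dataset's characteristics against harmonization measures
# and return the items of inconsistency. Schema and values are hypothetical.
from dataclasses import dataclass

@dataclass
class Measure:
    name: str            # characteristic to check (technical or nontechnical)
    required_value: object

def assess(dataset: dict, measures: list) -> list:
    """Return the names of measures the dataset fails to satisfy."""
    return [m.name for m in measures
            if dataset.get(m.name) != m.required_value]

measures = [Measure("crs", "EPSG:4326"),
            Measure("encoding", "UTF-8"),
            Measure("licence", "CC-BY-4.0")]
dataset = {"crs": "EPSG:3857", "encoding": "UTF-8", "licence": "CC-BY-4.0"}
print(assess(dataset, measures))  # lists the inconsistent items
```

A fuller tool would attach a suggested manipulation (e.g. a reprojection step for a CRS mismatch) to each failed measure, mirroring the guideline-proposal feature the article describes.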

17.
Geographic information systems (GIS) and spatial regression modeling techniques were used to evaluate the spatially prioritized relationships between grave density and various spatial parameters for a total of 5549 grave locations. Solar radiation was the most important predictor of grave density in the Feng-Shui locations. Similarly, spatial clustering showed that high concentrations of graves accompany significantly increasing trends in solar radiation. The results of the regression analyses indicate that grave density could be explained by the four landform parameters alone, yielding an R² value of 0.751. In contrast to conventional theory, slope and aspect were not dominant determinants of grave density, nor was a significantly increasing trend in grave density observed towards the southern direction. This provides a clear verification of a hidden assumption in Feng-Shui's long history: its approach is better suited to avoiding shadow conditions than to identifying an ideal landform location.
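The "four landform parameters explain grave density with R² = 0.751" claim corresponds to a standard ordinary-least-squares workflow. A sketch with synthetic data (the predictors, coefficients and noise level are assumptions; only the four-predictor OLS-with-R² workflow reflects the abstract):

```python
# Sketch: OLS regression of a density variable on four landform predictors
# and computation of R^2. Data are synthetic and illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 4))            # e.g. solar radiation, elevation, slope, aspect
beta = np.array([2.0, 0.5, 0.1, 0.0])  # assumed true coefficients
y = X @ beta + rng.normal(scale=0.5, size=n)

# Least squares on the design matrix with an intercept column
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coef
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(round(r2, 3))
```

With the dominant first coefficient, most of the fit comes from that predictor, which parallels the study's finding that solar radiation carries the explanatory weight while slope and aspect contribute little.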

19.
Annual freezing and thawing indices at 7 meteorological stations along the Qinghai-Xizang Railway were calculated from daily maximum and minimum temperature records for 1966–2004. Trends in the annual freezing and thawing indices were analyzed using the Mann-Kendall (MK) test and simple linear regression. The results show that: 1) The mean annual freezing indices range from 95 to 2300 °C·d and the mean annual thawing indices from 630 to 3250 °C·d. The mean annual freezing index at the 7 stations exhibited decreasing trends, at rates of −16.6 to −59.1 °C·d per decade, while the mean annual thawing index showed increasing trends, at rates of 19.83 to 45.6 °C·d per decade. 2) The MK test indicated significant decreasing trends (at the 0.05 level) in the annual freezing index for most stations, the exception being Golmud. Significant increasing trends were observed in the annual thawing index for 4 stations, excluding Golmud and Tuotuohe. Golmud was the only station with no trend in either the annual freezing or the annual thawing index.
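The indices used above have a simple definition: the annual freezing index is the accumulated magnitude of sub-zero daily mean temperatures over a year (in °C·d), and the thawing index the accumulated above-zero means. A sketch assuming daily means are taken as (Tmax + Tmin)/2 and glossing over the choice of freezing/thawing-year boundaries:

```python
# Sketch: annual freezing and thawing indices (degree-days) from daily
# Tmax/Tmin records. Year-boundary handling is simplified.
def annual_indices(daily_tmax, daily_tmin):
    """Return (freezing_index, thawing_index) in deg C * days for one year."""
    freezing = thawing = 0.0
    for tmax, tmin in zip(daily_tmax, daily_tmin):
        t_mean = (tmax + tmin) / 2.0
        if t_mean < 0:
            freezing += -t_mean   # accumulate magnitude of sub-zero means
        else:
            thawing += t_mean
    return freezing, thawing

# Toy example: three cold days and two warm days
fi, ti = annual_indices([-5, -2, 0, 12, 18], [-15, -8, -6, 2, 6])
print(fi, ti)
```

A per-decade trend in these indices, as reported in the study, would then come from fitting a line (or applying the MK test) to the 1966–2004 series of annual values.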
