Similar Articles
20 similar articles found (search time: 31 ms)
1.
Geocollaboration is a new field of research that investigates how technology can support human-human collaboration with geospatial information. This paper considers the design issues inherent in distributed geospatial software. It looks at providing a non-spatial communication channel, supporting real-time synchronous awareness, designing interaction techniques, establishing common ground, and using floor control and attention techniques. Using examples from existing geocollaboration tools and realistic geocollaboration scenarios, it demonstrates some of the design alternatives for geocollaboration. The paper concludes with a future research agenda describing the complexities in supporting longer-term geocollaboration activities.

2.
Open data has a profound effect on the working environment within which information is created and shared at all levels. At the local government level, open data initiatives have resulted in higher transparency in policy and greater engagement between decision-makers and citizens, and have changed the culture around how data analysis and evidence are used to support local governance. This article, based on data collected through an online survey, participatory workshops with data user communities in four cities (in Colombia and Spain), and interviews with the Valencia good-government office, identifies four elements of a conceptual framework to improve the re-usability of open geographic data in cities. The essential elements defined in this research are the definition of data user communities and their needs, the creation of the community of reuse, user-focused metadata, and reuse-focused legal terms. The definition of these indicators provides a framework for authorities to re-shape their current open data strategy to include data user requirements. The article closes with a roadmap for future research and implementation, together with some reflections on the conceptual framework.

3.
Geospatial processing tasks such as solar potential analyses or floodplain investigations within flood scenarios are often complex and deal with large amounts of data. When such analysis operations are performed in distributed web-based systems, existing technical capabilities are usually insufficient. Major shortcomings include the potentially long execution times and the vast messaging overhead that arises from common poll-based approaches. To overcome these issues, this article proposes an event-driven architecture for web-based geospatial processing. First, the article presents a thorough qualitative discussion of the available technologies for push-based notifications, aiming to identify the most suitable push-based messaging technologies for use with OGC Web Processing Services (WPS). Building on this, an event-driven architecture for asynchronous geospatial processing with the WPS is presented, using the WebSocket protocol as the transport protocol and the OGC Event Service as the message-oriented middleware. The proposed architecture pushes a notification to clients once a task has completed. This paradigm enables the efficient execution of web-based geospatial processing tasks as well as the integration of geographical analyses into event-driven real-time workflows.
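The messaging-overhead argument for the event-driven design can be made concrete with a back-of-the-envelope count of client-server messages (a minimal sketch; the task duration and poll interval below are hypothetical, not figures from the article):

```python
import math

def poll_messages(duration_s: float, poll_interval_s: float) -> int:
    """Request/response pairs a client exchanges while polling for completion."""
    return 2 * math.ceil(duration_s / poll_interval_s)

def push_messages() -> int:
    """With an event-driven design the client subscribes once and receives
    a single notification when the task completes."""
    return 2

# a 10-minute geoprocessing task polled every 5 seconds
print(poll_messages(600, 5))  # → 240
print(push_messages())        # → 2
```

For long-running tasks the poll count grows linearly with duration while the push count stays constant, which is the core of the case for the WebSocket-based architecture.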

4.
Weather radar data play an important role in meteorological analysis and forecasting. In particular, web-based real-time 3D visualization can enable and enhance various meteorological applications by avoiding the need to disseminate large volumes of raw data over the internet. Most existing studies, however, are limited to 2D or small-scale data analytics because of methodological limitations. This article proposes a new framework for web-based real-time 3D visualization of large-scale weather radar data using 3D Tiles and WebGIS technology. 3D Tiles is an open specification for streaming massive, heterogeneous 3D geospatial datasets online, designed to improve rendering performance and reduce memory consumption. First, the weather radar data from multiple single-radar sites across a large coverage area are organized into a spliced grid dataset (weather radar composing data, WRCD). Next, the WRCD is converted into the widely used 3D Tiles data structure in four steps: data preprocessing, data indexing, data transformation, and 3D tile generation. Last, to validate the feasibility of the proposed strategy, a prototype named Meteo3D ( https://202.195.237.252:82 ) is implemented to accommodate the WRCD collected from all the weather radar sites across the whole of China. The results show that near real-time and accurate visualization for the monitoring and early warning of strong convective weather can be achieved.
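The hierarchical tiling that underlies 3D Tiles can be illustrated with a toy quadtree subdivision of the radar coverage area (a simplification for illustration; the actual specification also defines bounding volumes, geometric error, and a tileset manifest, and the level-3 split below is an arbitrary choice):

```python
def subdivide(bbox, depth):
    """Recursively split (xmin, ymin, xmax, ymax) into quadrants,
    returning the leaf tiles at the requested depth."""
    if depth == 0:
        return [bbox]
    xmin, ymin, xmax, ymax = bbox
    xmid, ymid = (xmin + xmax) / 2, (ymin + ymax) / 2
    children = [
        (xmin, ymin, xmid, ymid), (xmid, ymin, xmax, ymid),
        (xmin, ymid, xmid, ymax), (xmid, ymid, xmax, ymax),
    ]
    tiles = []
    for child in children:
        tiles.extend(subdivide(child, depth - 1))
    return tiles

# a China-sized extent split to level 3 yields 4**3 leaf tiles,
# each of which can be streamed and rendered independently
tiles = subdivide((73.0, 18.0, 135.0, 54.0), 3)
print(len(tiles))  # → 64
```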

5.
This article reports on the initial development of a generic framework for integrating Geographic Information Systems (GIS) with Massive Multi-player Online Gaming (MMOG) technology to support the integrated modeling of human-environment resource management and decision-making. We review Web 2.0 concepts, online maps, and games as key technologies for realizing the participatory construction of spatial simulation and decision-making practices. Through a design-based research approach we develop a prototype framework, "GeoGame", that allows users to play board-game-style simulations on top of an online map. Through several iterations we demonstrate the implementation of a range of design artifacts, including real-time multi-user editing of online maps, web services, a game lobby, user-modifiable rules and scenario building, chat, discussion, and market transactions. Based on observational, analytical, experimental, and functional evaluations of these design artifacts, as well as a literature review, we argue that an MMOG GeoGame framework offers a viable approach to the complex dynamics of human-environmental systems, which require the simultaneous reconciliation of top-down and bottom-up decision-making with stakeholders as an integral part of the modeling environment. Further research will offer additional insight into the development of social-environmental models using stakeholder input and the use of such models to explore the properties of complex dynamic systems.

6.
Dynamic geospatial complex systems are inherently four-dimensional (4D) processes, and there is a need for spatio-temporal models capable of representing them realistically for improved understanding and analysis. Such systems include changes in geological structures, dune formation, landslides, pollutant propagation, forest fires, and urban densification. However, these phenomena are frequently analyzed and represented with modeling approaches that consider only two spatial dimensions and time. Consequently, the main objectives of this study are to design and develop a modeling framework for 4D agent-based modeling and to apply the approach to a 4D case study of forest-fire smoke propagation. The study area covers central and southern British Columbia and the western parts of Alberta, Canada, for forest fires that occurred during the summer of 2017. The simulation results produced realistic spatial patterns of the smoke propagation dynamics.
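One tick of smoke spreading through a 3D grid can be sketched as explicit neighbour diffusion (a toy stand-in for illustration only; the article's model is agent-based and driven by real fire and weather data, not this kernel):

```python
def diffuse_step(grid, rate=0.125):
    """One explicit diffusion step over a 3D concentration grid
    (a list of z-slices, each a list of rows)."""
    nz, ny, nx = len(grid), len(grid[0]), len(grid[0][0])
    new = [[[0.0] * nx for _ in range(ny)] for _ in range(nz)]
    for z in range(nz):
        for y in range(ny):
            for x in range(nx):
                c = grid[z][y][x]
                nbrs = [grid[z + dz][y + dy][x + dx]
                        for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                                           (0, -1, 0), (0, 0, 1), (0, 0, -1))
                        if 0 <= z + dz < nz and 0 <= y + dy < ny and 0 <= x + dx < nx]
                # each cell relaxes toward its face neighbours; mass is conserved
                new[z][y][x] = c + rate * sum(n - c for n in nbrs)
    return new

# a single smoke source at the centre of a 3x3x3 volume
grid = [[[0.0] * 3 for _ in range(3)] for _ in range(3)]
grid[1][1][1] = 100.0
after = diffuse_step(grid)
print(after[1][1][0], after[1][1][1])  # → 12.5 25.0
```

Tracking the full z-dimension, rather than collapsing it to 2D + time, is exactly the representational gap the abstract identifies.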

7.
8.
Disaster response operations require fast and coordinated actions based on real-time disaster situation information. Although crowdsourced geospatial data applications have been demonstrated to be valuable tools for gathering real-time disaster situation information, they provide only limited utility for disaster response coordination because they lack semantic compatibility and interoperability. To help overcome these semantic incompatibility and heterogeneity problems, we use Geospatial Semantic Web (GSW) technologies, combined with Web Feature Service requests to access multiple servers. However, a GSW-based geographic information system often performs poorly because of the complex geometric computations required. The objective of this research is to explore how optimization techniques can improve the performance of an interoperable geographic situation-awareness system (IGSAS) based on GSW technologies for disaster response. We conducted experiments to evaluate various client-side optimization techniques for improving the performance of an IGSAS prototype for flood disaster response in New Haven, Connecticut. Our experimental results show that the prototype greatly reduces the runtime costs of geospatial semantic queries through on-the-fly spatial indexing, tile-based rendering, efficient spatial-join algorithms, and caching, especially for spatial-join queries involving a large number of spatial features and heavy geometric computation.
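The on-the-fly spatial indexing credited with much of the speed-up can be illustrated with a uniform grid index that limits a window query to candidate cells (a generic simplification; the prototype's actual index structure and parameters are not described here):

```python
from collections import defaultdict

def build_grid_index(points, cell=1.0):
    """Hash each point into a grid cell so candidate lookup is O(1) per cell."""
    index = defaultdict(list)
    for x, y in points:
        index[(int(x // cell), int(y // cell))].append((x, y))
    return index

def points_in_bbox(index, bbox, cell=1.0):
    """Visit only the grid cells overlapping the box instead of scanning all points."""
    xmin, ymin, xmax, ymax = bbox
    hits = []
    for cx in range(int(xmin // cell), int(xmax // cell) + 1):
        for cy in range(int(ymin // cell), int(ymax // cell) + 1):
            for (x, y) in index.get((cx, cy), []):
                if xmin <= x <= xmax and ymin <= y <= ymax:
                    hits.append((x, y))
    return hits

idx = build_grid_index([(0.2, 0.3), (2.5, 2.5), (9.9, 9.9)])
print(points_in_bbox(idx, (0, 0, 3, 3)))  # → [(0.2, 0.3), (2.5, 2.5)]
```

The same idea, pruning the candidate set before any exact geometric test, is what makes client-side spatial joins over many features tractable.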

9.
A geospatial cyberinfrastructure is needed to support advanced GIScience research and education activities. However, the heterogeneous and distributed nature of geospatial resources creates enormous obstacles to building a unified and interoperable geospatial cyberinfrastructure. In this paper, we propose the Geospatial Service Web (GSW) to underpin the development of a future geospatial cyberinfrastructure. The GSW improves on the traditional spatial data infrastructure by providing highly intelligent geospatial middleware that integrates various geospatial resources over the Internet using interoperable Web service technologies. The development of the GSW focuses on establishing a platform where data, information, and knowledge can be shared and exchanged in an interoperable manner. We describe the conceptual framework and research challenges for the GSW, introduce our recent research toward building one, and present a research agenda.

10.
A data model for use in a rapid environmental assessment system is constructed. The data model is used in an information layer that supports acoustic assessments of the ocean environment. Such an assessment requires both historic and real-time oceanographic data. The foundation of the data model is Arc Marine, a framework specification for geospatial oceanographic databases that provides structures for the basic data types used in oceanographic research. Arc Marine also allows design extensions for application-specific data structures, demonstrated here by incorporating aspects of the International Organization for Standardization (ISO) 19115 Geographic Information-Metadata standard, which provides structures for recording the processing history of the data sets. The data model is used to construct a database in the open-source database management system (DBMS) PostgreSQL. The resulting system also incorporates the concept of user exits: the seamless extension of the DBMS through the inclusion of application-specific software.
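The ISO 19115 lineage idea, recording each processing step applied to a data set, can be sketched with illustrative classes (the names below are hypothetical and are not Arc Marine's actual schema or the standard's element names):

```python
from dataclasses import dataclass, field

@dataclass
class ProcessStep:
    """One entry in an ISO 19115-style lineage trail: what was done, and when."""
    description: str
    date: str

@dataclass
class MeasurementDataset:
    """A simplified oceanographic dataset record carrying its processing history."""
    name: str
    lineage: list = field(default_factory=list)

    def record_step(self, description: str, date: str):
        # appending preserves the order in which processing was applied
        self.lineage.append(ProcessStep(description, date))

ctd = MeasurementDataset("ctd-cast-017")
ctd.record_step("despiked salinity channel", "2009-03-02")
ctd.record_step("binned to 1 m depth intervals", "2009-03-03")
print(len(ctd.lineage), ctd.lineage[0].description)
```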

11.
Geographic features change over time, and each change is the result of some kind of event. Most database systems used in GIS are relational in nature, capturing change either by exhaustively storing all versions of the data or by replacing previous versions with updates. This stems from the inherent difficulty of modelling geographic objects and associated data in relational tables, which is compounded when the necessary time dimension is introduced to represent how these objects evolve. This article describes an object-oriented (OO) spatio-temporal conceptual data model called the Feature Evolution Model (FEM), which can be used for the development of a spatio-temporal database management system (STDBMS). Object versioning techniques developed in the fields of Computer Aided Design (CAD) and engineering design are utilized in the design. The model is defined using the Unified Modelling Language (UML) and exploits the expressiveness of OO technology by representing both geographic entities and events as objects. Further, the model overcomes the limitations inherent in relational approaches to representing the aggregation of objects into more complex, compound objects. A management object called the evolved feature maintains a temporally ordered list of references to features, thus representing their evolution. The model is demonstrated by its application to road network data.
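The evolved-feature idea, a management object holding a temporally ordered list of feature versions created by events, can be sketched in an OO style (class and field names are illustrative, not taken from the FEM specification):

```python
from dataclasses import dataclass, field

@dataclass
class FeatureVersion:
    """One state of a geographic feature, produced by an event."""
    valid_from: str
    geometry: tuple   # simplified: a polyline of (x, y) points
    event: str        # the event that produced this version

@dataclass
class EvolvedFeature:
    """Management object keeping a temporally ordered list of versions."""
    feature_id: str
    versions: list = field(default_factory=list)

    def apply_event(self, version: FeatureVersion):
        self.versions.append(version)   # append preserves temporal order

    def state_at(self, date: str):
        """Return the latest version valid on or before `date`."""
        current = None
        for v in self.versions:
            if v.valid_from <= date:
                current = v
        return current

road = EvolvedFeature("road-42")
road.apply_event(FeatureVersion("2001-01-01", ((0, 0), (1, 0)), "constructed"))
road.apply_event(FeatureVersion("2010-06-01", ((0, 0), (1, 0), (2, 1)), "extended"))
print(road.state_at("2005-01-01").event)  # → constructed
```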

12.
Big geospatial data is an emerging sub-area of geographic information science, big data, and cyberinfrastructure. It poses two unique challenges. First, raster and vector data structures and analyses have developed along largely separate paths for the last 20 years, which impedes geospatial researchers seeking to utilize big data platforms that do not accommodate heterogeneous data types. Second, big spatial data repositories have yet to be integrated with big data computation platforms in ways that allow researchers to spatio-temporally analyze big geospatial datasets. IPUMS-Terra, a National Science Foundation cyberinfrastructure project, addresses these challenges by providing a unified framework of integrated geospatial services that access, analyze, and transform big heterogeneous spatio-temporal data. As IPUMS-Terra's data volume grows, we seek to integrate geospatial platforms that will scale geospatial analyses and address current bottlenecks within our system. However, our work shows that there are still unresolved challenges for big geospatial analysis, the most pertinent being the lack of a unified framework for scalable, integrated vector and raster data analysis. We conducted a comparative analysis between PostgreSQL with PostGIS and SciDB and concluded that SciDB is the superior platform for scalable raster zonal analyses.
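Raster zonal analysis, the operation benchmarked on PostGIS and SciDB, reduces to grouping cell values by the labels of a co-registered zone raster. A pure-Python toy version of what an array database parallelizes over chunks:

```python
from collections import defaultdict

def zonal_mean(zones, values):
    """Mean of `values` cells grouped by the co-registered `zones` raster.
    Both inputs are row-major 2D lists of the same shape."""
    sums, counts = defaultdict(float), defaultdict(int)
    for zone_row, value_row in zip(zones, values):
        for z, v in zip(zone_row, value_row):
            sums[z] += v
            counts[z] += 1
    return {z: sums[z] / counts[z] for z in sums}

zones  = [[1, 1, 2],
          [1, 2, 2]]
values = [[10, 20, 30],
          [30, 50, 40]]
print(zonal_mean(zones, values))  # → {1: 20.0, 2: 40.0}
```

The computation is embarrassingly parallel per zone, which is why chunked array platforms scale it well.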

13.
This article presents a methodology for designing a WebGIS framework intended to automatically analyze spatial data and update statistics of interest with new information inserted daily by multiple users via a Web portal. A practical example uses vehicle accident data to assess risk on specific road segments. Two integrated blocks are described: a collaborative block and a data-analysis block. The former gives end-users computer-aided tools to view, insert, modify, and manage data related to accidents and traffic monitoring sensors, whereas the latter automatically analyzes the accident data coming from users' collaboration. Because different agencies can survey accident sites, a collaborative environment is necessary, and a Web-based solution is ideal, for permitting multi-user access and data insertion. A centralized approach to processing the data in real time is described in all its components. Server-side Structured Query Language functions optimize performance by using dedicated libraries for spatial processing and by re-structuring the attributes associated with elements, which are then re-classified for correct color scaling. The end product is a system that seamlessly integrates front-end tools for user collaboration with back-end tools that update accident risk statistics in real time and provide them to stakeholders.
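The back-end aggregation and re-classification step can be sketched outside SQL (a hypothetical simplification; the segment ids, class breaks, and report structure below are invented for illustration):

```python
from collections import Counter

def accident_counts(reports):
    """Aggregate user-submitted reports into per-segment accident counts."""
    return Counter(r["segment_id"] for r in reports)

def classify(count, breaks=(1, 5, 10)):
    """Re-classify a segment's count into a risk class for color scaling."""
    for cls, limit in enumerate(breaks):
        if count < limit:
            return cls
    return len(breaks)

reports = [{"segment_id": "A"}, {"segment_id": "A"}, {"segment_id": "B"}] + \
          [{"segment_id": "C"}] * 12
counts = accident_counts(reports)
print({seg: classify(n) for seg, n in counts.items()})  # → {'A': 1, 'B': 1, 'C': 3}
```

In the article's design this logic lives in server-side SQL functions so that every daily insertion immediately updates the risk classes served to stakeholders.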

14.
Today, many real-time geospatial applications (e.g. navigation and location-based services) involve data- and/or compute-intensive geoprocessing tasks where performance is of great importance. Cloud computing, a promising platform with a large pool of storage and computing resources, could be a practical solution for hosting vast amounts of data and for real-time processing. In this article, we explored the feasibility of using Google App Engine (GAE), Google's cloud computing technology, for a module in navigation services called Integrated GNSS (iGNSS) QoS prediction. The objective of this module is to predict the quality of iGNSS positioning solutions for prospective routes in advance. iGNSS QoS prediction involves the real-time computation of large Triangulated Irregular Networks (TINs) generated from LiDAR data. We experimented with GAE, storing a large TIN and running the two geoprocessing operations (proximity and bounding box) required for iGNSS QoS prediction. The experimental results revealed that while cloud computing can potentially be used for the development and deployment of data- and/or compute-intensive geospatial applications, current cloud platforms require improvements and special tools to handle real-time geoprocessing, such as iGNSS QoS prediction, efficiently. The article also provides a set of general guidelines for the future development of real-time geoprocessing in clouds.
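The bounding-box operation tested on the stored TIN can be sketched as a filter on triangle extents (a naive linear scan for illustration; the article's point is precisely that such operations need optimization to run in real time at scale):

```python
def bbox_of(triangle):
    """Axis-aligned extent of a triangle given as three (x, y) vertices."""
    xs = [p[0] for p in triangle]
    ys = [p[1] for p in triangle]
    return min(xs), min(ys), max(xs), max(ys)

def triangles_in_window(tin, window):
    """Return TIN triangles whose bounding box intersects the query window."""
    wxmin, wymin, wxmax, wymax = window
    hits = []
    for tri in tin:
        xmin, ymin, xmax, ymax = bbox_of(tri)
        if xmin <= wxmax and xmax >= wxmin and ymin <= wymax and ymax >= wymin:
            hits.append(tri)
    return hits

tin = [((0, 0), (1, 0), (0, 1)),    # near the origin
       ((5, 5), (6, 5), (5, 6))]    # far from the query window
print(len(triangles_in_window(tin, (0, 0, 2, 2))))  # → 1
```

For a LiDAR-derived TIN with millions of triangles, the linear scan above is exactly what a spatial index would replace.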

15.
龚建华, 李文航, 张国永, 申申, 黄琳, 孙麇. Acta Geodaetica et Cartographica Sinica (《测绘学报》), 2018, 47(8): 1089-1097
With the development of augmented reality technology, and building on virtual geographic environments, this article proposes the concept of an "augmented geographic environment" and a framework for fusing the virtual and the real. It focuses on the key computation and visualization techniques for fusing virtual geographic processes with a 3D-printed physical sand-table model, including algorithms for coordinate matching between virtual and real geographic space and for occlusion handling. Taking crowd evacuation from a school fire as a case study, a prototype augmented-geographic-environment visualization system fusing virtual and real crowd-evacuation simulation was implemented. System tests and a statistical analysis of user-experience surveys demonstrate the feasibility of augmented-geographic-environment visualization and the novelty of its interactive presentation for crowd-evacuation simulation applications.

16.
The rapid development of urban retail companies brings new opportunities to the Chinese economy. Because of the spatiotemporal heterogeneity of different cities, selecting a business location in a new area has become a challenge. The application of multi-source geospatial data makes it possible to describe human activities and urban functional zones at a fine scale. We propose a knowledge-transfer model named KTSR to support citywide business location selection at the land-parcel scale. The framework can optimize customer scores and uncover the pattern of business location selection for chain brands. First, we extract the features of each urban land parcel and study the similarities between parcels. Then, singular value decomposition (SVD) is used to build a knowledge-transfer model linking similar urban land parcels in different cities. The results show that: (1) compared with the actual scores, the estimation deviation of the proposed model decreased by more than 50%, and the Pearson correlation coefficient reached 0.84 or higher; and (2) the decomposed features were good at quantifying and describing high-level commercial operation information, which is strongly related to urban functional structures. In general, our method can support business location selection and the estimation of sales volumes and user evaluations.
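The parcel-similarity step that precedes the SVD-based transfer can be illustrated with a cosine similarity over parcel feature vectors (the feature names and values below are invented; the article's actual feature set is richer):

```python
import math

def cosine_similarity(a, b):
    """Similarity between two land-parcel feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# hypothetical parcel features: [POI density, footfall, road access]
parcel_city_a = [0.8, 0.6, 0.9]
parcel_city_b = [0.7, 0.5, 0.8]   # a structurally similar parcel elsewhere
parcel_city_c = [0.1, 0.9, 0.2]
print(cosine_similarity(parcel_city_a, parcel_city_b) >
      cosine_similarity(parcel_city_a, parcel_city_c))  # → True
```

High-similarity parcel pairs across cities are the channel through which customer-score knowledge from one city can be transferred to another.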

17.
Agent-based modeling provides a means of addressing the way human and natural systems interact to change landscapes over time. Until recently, evaluation of simulation models has focused on map comparison techniques that assess the degree to which predictions match real-world observations. However, methods that shift the focus of evaluation from patterns to processes have begun to surface; that is, rather than asking whether a model simulates a correct pattern, models are evaluated on their ability to simulate a process of interest. We build on an existing agent-based model validation method to present a temporal variant-invariant analysis (TVIA). The enhanced method, which focuses on analyzing the uncertainty in simulation results, examines the degree to which outcomes from multiple model runs match a reference describing how land-use parcels transition from one land-use class to another over time. We apply TVIA to results from an agent-based model that simulates the relationships between landowner decisions and wildfire risk in the wildland-urban interface of the southern Willamette Valley, Oregon, USA. The TVIA approach demonstrates a novel ability to examine uncertainty across time, providing an understanding of how the model emulates the system of interest.
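The variant-invariant distinction, outcomes that agree across all model runs versus those that do not, can be sketched as follows (a simplification of TVIA, which also tracks when each transition occurs):

```python
def invariant_parcels(runs):
    """Parcels that receive the same land-use transition in every model run.
    Each run maps parcel id -> (class_before, class_after)."""
    first, rest = runs[0], runs[1:]
    return {p for p, t in first.items() if all(r[p] == t for r in rest)}

run_1 = {"p1": ("forest", "urban"), "p2": ("forest", "forest")}
run_2 = {"p1": ("forest", "urban"), "p2": ("forest", "urban")}
run_3 = {"p1": ("forest", "urban"), "p2": ("forest", "forest")}
print(invariant_parcels([run_1, run_2, run_3]))  # → {'p1'}
```

The variant parcels (here "p2") are where the stochastic model disagrees with itself, and therefore where uncertainty analysis should concentrate.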

18.
19.
Object-oriented (OO) image analysis provides an efficient way to generate vector-format land-cover and land-use maps from remotely sensed images. Such image-derived vector maps, however, generally contain congested and twisted polygons with step-like boundaries, including unclassified polygons and polygons with geometric conflicts such as unreadably small areas and narrow corridors. These complex and poorly readable representations mean that such maps comply poorly with the Gestalt principles of cartography. This article describes a framework designed to improve the representation by resolving these problematic polygons. It presents a polygon similarity model that integrates semantic, geometric, and spectral characteristics of the image-derived polygons to eliminate small and unclassified polygons. In addition, an outward-inward buffering approach is presented to resolve a polygon's narrow-corridor conflicts and improve its overall appearance. A case study demonstrates that the framework reduces the number of polygons by 32% and the length of the polygon boundaries by 20%, while causing no distinct change in the distribution of land-use types (less than 0.05%) or the overall accuracy (a decrease of only 0.02%) compared with the original image-derived land-use maps. We conclude that the presented framework and models effectively improve the overall representation of image-derived maps without distinct changes to their semantic characteristics and accuracy.
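The elimination of small polygons by dissolving each into its most similar neighbour can be sketched as follows (a simplified stand-in: the article's similarity model combines semantic, geometric, and spectral terms, not the single land-use check used here, and all names are illustrative):

```python
def merge_small_polygons(polygons, neighbors, similarity, min_area=10.0):
    """Dissolve polygons under `min_area` into their most similar neighbor.
    `polygons` maps id -> area; `neighbors` maps id -> adjacent ids;
    `similarity(a, b)` scores candidate merges."""
    merged_into = {}
    for pid, area in polygons.items():
        if area < min_area and neighbors.get(pid):
            merged_into[pid] = max(neighbors[pid], key=lambda n: similarity(pid, n))
    return merged_into

polygons = {"a": 3.0, "b": 50.0, "c": 40.0}
neighbors = {"a": ["b", "c"]}
# toy similarity: a shared land-use class outweighs a different one
land_use = {"a": "grass", "b": "grass", "c": "water"}
sim = lambda p, q: 1.0 if land_use[p] == land_use[q] else 0.0
print(merge_small_polygons(polygons, neighbors, sim))  # → {'a': 'b'}
```

Choosing the merge target by similarity rather than, say, longest shared edge is what preserves the map's semantic characteristics while simplifying its geometry.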

20.
