Similar Articles
20 similar articles found (search time: 31 ms).
1.
A geospatial processing service chain framework based on Web service technology, OGC specifications, and workflow technology is proposed to achieve a platform-independent framework with process orchestration capabilities that supports complex online spatial processing tasks. On top of this framework, an online remote sensing image fusion example was implemented. The example demonstrates how mature standards and specifications such as OGC WCS, WPS, WSDL, UDDI, and BPEL4WS can be used to build GIS service chains, so that client applications ...
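As an illustration of the kind of building block such a service chain orchestrates, the following minimal sketch invokes a single WPS process with OWSLib. The service URL, process identifier, and input names are placeholders, and the BPEL4WS orchestration layer described above is not reproduced.

```python
from owslib.wps import WebProcessingService, monitorExecution

# Placeholder endpoint; a real service chain would wire several such calls
# together through a BPEL engine rather than a single script.
wps = WebProcessingService("https://example.org/wps", version="1.0.0")
print([p.identifier for p in wps.processes])          # processes advertised by the server

execution = wps.execute(
    "gs:ImageFusion",                                  # hypothetical process identifier
    inputs=[("panchromatic", "https://example.org/wcs?coverage=pan"),
            ("multispectral", "https://example.org/wcs?coverage=ms")],
)
monitorExecution(execution)                            # poll until the job finishes
print(execution.status, [o.reference for o in execution.processOutputs])
```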

2.
Geospatial processing tasks like solar potential analyses or floodplain investigations within flood scenarios are often complex and deal with large amounts of data. If such analysis operations are performed in distributed web‐based systems, technical capabilities are mostly not sufficient. Major shortcomings comprise the potentially long execution times and the vast amount of messaging overhead that arise from common poll‐based approaches. To overcome these issues, an approach for an event‐driven architecture for web‐based geospatial processing is proposed within this article. First, this article presents a thorough qualitative discussion of different available technologies for push‐based notifications. The aim of this discussion is to find the most suitable push‐based messaging technologies for application with OGC Web Processing Services (WPS). Based on this, an event‐driven architecture for asynchronous geospatial processing with the WPS is presented, building on the Web Socket Protocol as the transport protocol and the OGC Event Service as the message‐oriented middleware. The proposed architecture allows pushing notifications to clients once a task has completed. This paradigm enables the efficient execution of web‐based geospatial processing tasks as well as the integration of geographical analyses into event‐driven real‐time workflows.
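A minimal sketch of the push-based pattern described above: the client subscribes over a WebSocket and waits for a job-completion event instead of polling. The endpoint URI, subscription action, and message fields are assumptions, and the OGC Event Service middleware is not reproduced.

```python
import asyncio
import json
import websockets

# Hypothetical notification endpoint; in the described architecture the events
# would be relayed by an OGC Event Service once the WPS job finishes.
NOTIFICATION_URI = "wss://example.org/wps-events"

async def await_job_completion(job_id: str) -> dict:
    async with websockets.connect(NOTIFICATION_URI) as ws:
        # Subscribe to events for one processing job instead of polling its status.
        await ws.send(json.dumps({"action": "subscribe", "jobId": job_id}))
        while True:
            event = json.loads(await ws.recv())
            if event.get("jobId") == job_id and event.get("status") == "Succeeded":
                return event   # e.g. carries a reference to the WPS result

# asyncio.run(await_job_completion("job-42"))
```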

3.
This study examines fundamental theories such as image processing methods and classification principles. Based on the spectral characteristics of vegetation in the study area, a suitable remote sensing data source is selected and the data are preprocessed. Through analysis of key factors such as vegetation spectral characteristics and vegetation indices, feature extraction methods are chosen and a preliminary automatic computer classification is performed; the forest area of Xuancheng City is then calculated from the classification results. This approach not only compensates for the drawbacks of manual field surveys, such as heavy workload, long survey cycles, slow data acquisition, and low accuracy, but also exploits the advantages of remote sensing, such as rich information content and advanced detection techniques, making rapid and accurate forest area estimation a reality.
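The abstract gives no implementation detail; purely as an illustration of vegetation-index-based feature extraction and area estimation of the kind described, the sketch below thresholds an NDVI layer computed from synthetic red and near-infrared bands. The 0.5 threshold and 30 m pixel size are assumptions, not the study's values.

```python
import numpy as np

# Synthetic red and near-infrared reflectance bands standing in for a
# preprocessed scene of the study area.
rng = np.random.default_rng(1)
red = rng.uniform(0.02, 0.25, size=(1000, 1000)).astype(np.float32)
nir = rng.uniform(0.05, 0.55, size=(1000, 1000)).astype(np.float32)

ndvi = (nir - red) / (nir + red + 1e-6)   # vegetation index as a key factor
forest_mask = ndvi > 0.5                  # illustrative threshold only

pixel_area_m2 = 30 * 30                   # assumed 30 m ground sampling distance
forest_area_km2 = forest_mask.sum() * pixel_area_m2 / 1e6
print(f"estimated forest area: {forest_area_km2:.1f} km^2")
```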

4.
The Web 2.0 technologies and standards enable the web as a platform by allowing user participation in web applications. In the realization of Web 2.0, new knowledge and services are created by combining information and services from different sources, known as 'mashups'. The present study focuses on a spatial mashup solution for disaster management using open source GIS, mobile applications, web services in Web 2.0, Geo-RDBMS, and XML, which are at the core of intelligent geo web services. The geo-web application is developed to generate actionable GIS products at the user end during a disaster event by consuming various data and information services from the web and a central server system, as well as real-time ground observation data collected through a mobile device. The technological solution developed in this study was successfully demonstrated for disaster management in the Assam State of India during the floods in 2010.

5.
6.
ABSTRACT

Big Earth Data has experienced a considerable increase in volume in recent years due to improved sensing technologies and numerical weather prediction models. The traditional geospatial data analysis workflow hinders the use of large volumes of geospatial data due to limited disc space and computing capacity. Geospatial web service technologies bring new opportunities to access large volumes of Big Earth Data via the Internet and to process them at server-side. Four practical examples are presented from the marine, climate, planetary and earth observation science communities to show how the standard interface Web Coverage Service and its processing extension can be integrated into the traditional geospatial data workflow. Web service technologies offer a time- and cost-effective way to access multi-dimensional data in a user-tailored format and allow for rapid application development or time-series extraction. Data transport is minimised and enhanced processing capabilities are offered. More research is required to investigate web service implementations in an operational mode, and large data centres have to become more progressive towards the adoption of geo-data standard interfaces. At the same time, data users have to become aware of the advantages of web services and be trained in how to benefit from them.
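As an illustration of the subsetting pattern described above, the following sketch requests only a region of interest from a WCS 1.0.0 endpoint with OWSLib. The endpoint, coverage identifier, and output format string are placeholders and depend on the actual server.

```python
from owslib.wcs import WebCoverageService

# Placeholder endpoint; any WCS server exposing a gridded Earth-observation
# product could be substituted.
wcs = WebCoverageService("https://example.org/wcs", version="1.0.0")
print(list(wcs.contents))                      # coverages offered by the server

response = wcs.getCoverage(
    identifier="sea_surface_temperature",      # hypothetical coverage name
    bbox=(-10.0, 40.0, 5.0, 55.0),             # only the region of interest is transferred
    crs="EPSG:4326",
    format="GeoTIFF",                          # format label is server-dependent
    width=600, height=600,
)
with open("sst_subset.tif", "wb") as f:
    f.write(response.read())
```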

7.
China is one of the most disaster-prone countries in the world. Currently, the disaster prevention and relief mechanism in China is mainly based on single disaster types and is implemented by different ministries and divisions within single administrative regions. As a result, the available resources, including data, services, materials, and human resources, cannot be shared and used effectively. Based on the idea of an observation system of systems and a business system of systems, this paper presents an integrated framework for a Chinese National Disaster Reduction System of Systems (CNDRSS) to address this issue. The CNDRSS framework aims to achieve data sharing and collaboration among different disaster-related ministries/institutions by providing one-stop services for all phases of disaster management and linking together existing and planned disaster-related business systems and observation systems. The key technologies use federated databases and web services to integrate multiple disaster management systems across different ministries/institutions, and a sensor web to integrate airborne, space-borne, and in-situ observations through web services. These event-driven, focused services, which connect the various observation, processing, and mapping processes, can meet the requirements of complex disaster-chain systems.

8.
A rich amount of geographic information exists in unstructured texts, such as web pages, social media posts, housing advertisements, and historical archives. Geoparsers are useful tools that extract structured geographic information from unstructured texts, thereby enabling spatial analysis on textual data. While a number of geoparsers have been developed, they have been tested on different data sets using different metrics. Consequently, it is difficult to compare existing geoparsers or to compare a new geoparser with existing ones. In recent years, researchers have created open and annotated corpora for testing geoparsers. While these corpora are extremely valuable, much effort is still needed for a researcher to prepare these data sets and deploy geoparsers for comparative experiments. This article presents EUPEG: an Extensible and Unified Platform for Evaluating Geoparsers. EUPEG is an open source and web‐based benchmarking platform which hosts the majority of open corpora, geoparsers, and performance metrics reported in the literature. It enables direct comparison of the geoparsers hosted, and a new geoparser can be connected to EUPEG and compared with other geoparsers. The main objective of EUPEG is to reduce the time and effort that researchers have to spend in preparing data sets and baselines, thereby increasing the efficiency and effectiveness of comparative experiments.
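As a toy illustration of metrics commonly used in geoparser benchmarking (precision, recall, and F1 for toponym recognition, plus a distance-based accuracy for toponym resolution), the sketch below scores a hypothetical geoparser output against a two-entry gold standard. This is not EUPEG's code, and the data are invented.

```python
import math

def haversine_km(p, q):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0088 * math.asin(math.sqrt(a))

# Toy gold standard and system output: toponym -> resolved (lat, lon)
gold = {"Paris": (48.8566, 2.3522), "Springfield": (39.7817, -89.6501)}
predicted = {"Paris": (48.85, 2.35), "Springfield": (37.2090, -93.2923), "field": (0.0, 0.0)}

tp = [t for t in predicted if t in gold]       # correctly recognized toponyms
precision = len(tp) / len(predicted)
recall = len(tp) / len(gold)
f1 = 2 * precision * recall / (precision + recall)

errors = [haversine_km(predicted[t], gold[t]) for t in tp]
acc_at_161 = sum(e <= 161 for e in errors) / len(errors)   # "accuracy@161 km"
print(f"P={precision:.2f} R={recall:.2f} F1={f1:.2f} "
      f"mean error={sum(errors)/len(errors):.1f} km acc@161={acc_at_161:.2f}")
```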

9.
陈军, 张俊, 张委伟, 彭舒. 《遥感学报》 (Journal of Remote Sensing), 2016, 20(5): 991-1001
In recent years, the continuous emergence of multi-scale land cover remote sensing products has provided important scientific data for environmental change research, Earth system simulation, national (and global) geographic condition monitoring, and sustainable development planning. To better meet users' growing application needs, land cover remote sensing products should be continuously updated and refined to keep them current, strengthen their temporal continuity, and enrich their diversity. Focusing on the main problems faced in updating and improving large-area land cover remote sensing products, this paper reviews relevant research trends at home and abroad, including updating that combines imagery with crowdsourced information, the refinement and improvement of data types, and land cover validation, and provides a brief outlook.

10.
China's sustainable development information sharing initiative is a national-level demonstration of information sharing, aiming to provide data and information services on various sustainable development topics to government departments and the public; building a data warehouse is the foundation and support for realizing such information sharing and services. This paper first systematically extends and standardizes the multi-source, heterogeneous Chinese sustainable development data through steps such as data extraction, data transformation, data cleansing, and the creation of metadata and data dictionaries, laying the groundwork for loading the data into the warehouse. A novel multi-dimensional data organization scheme is then applied, combining four approaches: organization based on sustainable development information classification and coding, organization based on metadata and data dictionaries, organization based on space and time, and department-level organization oriented toward data marts. This enables the organization and management of massive sustainable development data and establishes a centralized sustainable development data warehouse. Finally, the data warehouse management platform is introduced, and sustainable development data sharing and web services based on the warehouse are demonstrated. The establishment of the Chinese sustainable development data warehouse provides a solid data foundation for sustainable development information sharing and services in China and offers useful lessons for building large national information sharing projects.
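A minimal sketch of the extract–transform–cleanse–load sequence described above, not the paper's actual platform: a hypothetical provincial indicator table is standardized with pandas, loaded into a SQLite warehouse table, and accompanied by a data-dictionary entry. All table, column, code, and data values are assumptions.

```python
import sqlite3
import pandas as pd

# Tiny stand-in for one of the multi-source sustainable-development tables;
# real data would be extracted from departmental sources.
raw = pd.DataFrame({
    "省份": ["安徽", "安徽", "江苏", None],
    "年份": ["2008", "2008", "2009", "2009"],
    "用水量_亿立方米": [293.0, 293.0, 549.0, 120.0],
})

# Transform: standardize column names and types
df = raw.rename(columns={"省份": "province", "年份": "year",
                         "用水量_亿立方米": "water_use_1e8_m3"})
df["year"] = df["year"].astype(int)

# Cleanse: drop duplicates and records missing key fields
clean = df.drop_duplicates().dropna(subset=["province", "year"])

# Load into a centralized warehouse table plus a data-dictionary entry
conn = sqlite3.connect("sd_warehouse.db")
clean.to_sql("fact_water_use", conn, if_exists="append", index=False)
pd.DataFrame([{
    "table_name": "fact_water_use",
    "theme_code": "RES-WAT-01",                 # hypothetical classification code
    "spatial_unit": "province", "temporal_unit": "year",
}]).to_sql("data_dictionary", conn, if_exists="append", index=False)
conn.close()
```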

11.
12.
For geospatial cyberinfrastructure-enabled web services, the ability to rapidly transmit and share spatial data over the Internet plays a critical role in meeting the demands of real-time change detection, response and decision-making. Especially for vector datasets, which serve as irreplaceable and concrete material in data-driven geospatial applications, their rich geometry and property information facilitates the development of interactive, efficient and intelligent data analysis and visualization applications. However, the big-data issues of vector datasets have hindered their wide adoption in web services. In this research, we propose a comprehensive optimization strategy to enhance the performance of vector data transmission and processing. This strategy combines: (1) pre-computed and on-the-fly generalization, which automatically determines a proper simplification level by introducing an appropriate distance tolerance, speeding up simplification; (2) a progressive attribute transmission method to reduce data size and, therefore, the service response time; (3) compressed data transmission and dynamic selection of a compression method to maximize service efficiency under different computing and network environments. A cyberinfrastructure web portal was developed for implementing the proposed technologies. After applying our optimization strategies, substantial performance enhancement is achieved. We expect this work to facilitate real-time spatial feature sharing, visual analytics and decision-making.
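A minimal sketch of two of these ingredients, geometry simplification with a distance tolerance and compressed transmission, assuming Shapely and a synthetic polyline; the tolerance value is arbitrary and the progressive attribute scheme is not reproduced.

```python
import gzip
import json
from shapely.geometry import LineString, mapping

# Synthetic dense polyline standing in for a large vector feature.
line = LineString([(x, 0.001 * (x % 7)) for x in range(10000)])

# Generalization: Douglas-Peucker-style simplification with a distance tolerance.
simplified = line.simplify(0.005, preserve_topology=True)

# Compressed transmission: serialize to GeoJSON-like payload and gzip it.
payload = json.dumps(mapping(simplified)).encode("utf-8")
compressed = gzip.compress(payload)

print(f"vertices: {len(line.coords)} -> {len(simplified.coords)}")
print(f"payload bytes: {len(payload)} -> {len(compressed)} after gzip")
```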

13.
Land use classification requires a significant amount of labeled data, which may be difficult and time consuming to obtain. On the other hand, without a sufficient number of training samples, conventional classifiers are unable to produce satisfactory classification results. This paper aims to overcome this issue by proposing a new model, TrCbrBoost, which uses old-domain data to successfully train a classifier for mapping the land use types of the target domain when new labeled data are unavailable. TrCbrBoost adopts a fuzzy CBR (Case Based Reasoning) model to estimate the land use probabilities for the target (new) domain, which are subsequently used to estimate classifier performance. Source (old) domain samples are used to train the classifiers of a revised TrAdaBoost algorithm in which the weight of each sample is adjusted according to the classifier's performance. This method is tested using time-series SPOT images for land use classification. Our experimental results indicate that TrCbrBoost is more effective than traditional classification models, provided that a sufficient amount of old-domain data is available. Under these conditions, the proposed method is 9.19% more accurate.
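For orientation, the sketch below shows the reweighting step of the classic TrAdaBoost algorithm that the abstract says is revised: misclassified source (old-domain) samples are down-weighted while misclassified target samples are up-weighted. The revised update and the fuzzy CBR component of TrCbrBoost are not reproduced here.

```python
import numpy as np

def tradaboost_weight_update(w_src, w_tgt, err_src, err_tgt, n_rounds):
    """One reweighting step in the spirit of classic TrAdaBoost.

    err_src / err_tgt are 0/1 arrays: 1 where the current weak classifier
    misclassified the corresponding source / target sample.
    """
    # Weighted error of the weak classifier on the target (new-domain) samples only.
    eps = np.sum(w_tgt * err_tgt) / np.sum(w_tgt)
    eps = min(max(eps, 1e-6), 0.499)                     # keep the factors well-defined

    beta_t = eps / (1.0 - eps)                           # target update factor (< 1)
    beta = 1.0 / (1.0 + np.sqrt(2.0 * np.log(len(w_src)) / n_rounds))  # source decay factor

    w_src_new = w_src * beta ** err_src                  # down-weight misclassified source samples
    w_tgt_new = w_tgt * beta_t ** (-err_tgt)             # up-weight misclassified target samples
    return w_src_new, w_tgt_new

# Toy usage with random misclassification indicators.
rng = np.random.default_rng(0)
w_s, w_t = np.ones(100) / 200, np.ones(100) / 200
e_s, e_t = rng.integers(0, 2, 100), rng.integers(0, 2, 100)
print(tradaboost_weight_update(w_s, w_t, e_s, e_t, n_rounds=10)[0][:5])
```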

14.
ABSTRACT

Earth observation (EO) data, such as high-resolution satellite imagery or LiDAR, have become a primary source for forest Aboveground Biomass (AGB) mapping and estimation. However, managing and analyzing the large amount of globally or locally available EO data remains a great challenge. The Google Earth Engine (GEE), which leverages cloud-computing services to provide powerful capabilities for the management and rapid analysis of various types of EO data, has emerged as an invaluable tool to address this challenge. In this paper, we present a scalable cyberinfrastructure for on-the-fly AGB estimation, statistics, and visualization over a large spatial extent. This cyberinfrastructure integrates state-of-the-art cloud computing applications, including GEE, Fusion Tables, and the Google Cloud Platform (GCP), to establish a scalable, highly extendable, and high-performance analysis environment. Two experiments were designed to demonstrate its superiority in performance over the traditional desktop environment and its scalability in processing complex workflows. In addition, a web portal was developed to integrate the cyberinfrastructure with visualization tools (e.g. Google Maps, Highcharts) to provide a Graphical User Interface (GUI) and online visualization for both the general public and geospatial researchers.
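As a minimal sketch of the server-side, on-the-fly analysis pattern such a cyberinfrastructure builds on (not the authors' AGB model), the snippet below computes a zonal mean with the Earth Engine Python API; the asset, region, and scale are placeholders, and an authenticated Earth Engine account is assumed.

```python
import ee

ee.Initialize()  # assumes an authenticated Earth Engine account

# Placeholder asset and region; an AGB-related raster would replace the DEM here.
region = ee.Geometry.Rectangle([-122.6, 37.0, -121.8, 37.6])
image = ee.Image("USGS/SRTMGL1_003")

# Server-side, on-the-fly zonal statistic: only the summary value is downloaded.
stats = image.reduceRegion(
    reducer=ee.Reducer.mean(),
    geometry=region,
    scale=30,
    maxPixels=1e9,
)
print(stats.getInfo())
```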

15.
China's social media platform, Sina Weibo, like Twitter, hosts a considerable amount of big data: messages, comments, pictures. Collecting and analyzing information from this treasury of human behavior data is a challenge, although the message exchange on the network is readable by everyone through the web or app interface. The official Application Programming Interface (API) is the gateway to access and download public content from Sina Weibo and is used to collect messages for all of mainland China. The nearby_timeline() request is used to harvest only messages with associated location information. This technical note serves as a reference for researchers who do not speak Mandarin but want to collect data from this rich source of information. Ways of visualizing the data are presented: as a point cloud, as density per areal unit, or clustered using Density‐Based Spatial Clustering of Applications with Noise (DBSCAN). The relation of messages to census information is also given.
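As an illustration of the clustering step mentioned above, the sketch below runs DBSCAN with a haversine metric on synthetic (lat, lon) points standing in for geotagged messages returned by nearby_timeline(); the eps and min_samples values are arbitrary, and the API call itself is omitted.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Synthetic (lat, lon) points standing in for geotagged Weibo messages.
rng = np.random.default_rng(0)
pts = np.vstack([
    rng.normal([31.23, 121.47], 0.01, size=(200, 2)),        # a cluster near Shanghai
    rng.normal([39.90, 116.40], 0.01, size=(200, 2)),        # a cluster near Beijing
    rng.uniform([18.0, 98.0], [45.0, 125.0], size=(50, 2)),  # scattered noise
])

kms_per_radian = 6371.0088
eps_km = 2.0                                   # neighborhood radius in kilometres
db = DBSCAN(eps=eps_km / kms_per_radian, min_samples=10,
            metric="haversine").fit(np.radians(pts))

labels = db.labels_                            # -1 marks noise points
print("clusters:", len(set(labels) - {-1}),
      "noise points:", int((labels == -1).sum()))
```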

16.
Integration of geographic information services based on multiple protocols
A multi-protocol geographic information service integration framework is designed, the key problems involved are discussed and solved, and seamless integration of imagery, vector data, and DEM data obtained from different geographic information services is achieved.

17.
The rapid development of the Internet and big data technologies has provided the technical foundation for the interconnection, sharing, and wide application of geographic information services. Faced with massive, multi-source, and heterogeneous geographic information services, how to organize services effectively, provide reasonable and efficient service compositions, broaden the application scope of geographic information services, and meet higher-level application needs has attracted wide attention from researchers. Building a geographic information service network and achieving service collaboration through semantics is one possible solution. This paper analyzes the current state of research on geographic information services and geographic information service networks and, drawing on research results in the service network field, discusses the development potential of network-based geographic information service collaboration methods from two perspectives: the representation and optimization of service networks, and the construction and optimization of service collaboration. It then identifies the challenges and research directions facing geographic information service networks and collaboration.

18.
ABSTRACT

Recent research has shown an increase in the number of extreme tornado outbreaks per year. The characterization of the spatio-temporal pattern of tornado events is therefore a critical task in the analysis of meteorological data. Currently, there are a large number of available meteorological datasets that can be used for such analysis. However, many of these datasets are distributed across multiple websites and are not accessible in a central location. This poses a significant challenge for a scientist who is interested in exploring meteorological patterns associated with tornado events. This paper presents a novel system which uses cloud-based technology for integrating, storing, exploring, analyzing, and visualizing meteorological data associated with tornado outbreaks. The system employs a novel NoSQL database schema and web services architecture for data integration and provides a user-friendly interface that allows scientists to explore the spatio-temporal pattern of tornado events. Furthermore, scientists can use this interface to analyze the relationship between different meteorological variables and properties of tornado outbreaks using a number of spatio-temporal statistical and data mining methods. The efficacy of the system is demonstrated on a use case centered on the analysis of climatic indicators of large spatio-temporally clustered tornado outbreaks.
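Purely as an illustration of storing and querying event documents in a NoSQL store (the paper's actual schema is not given in the abstract), the sketch below inserts a hypothetical tornado record into MongoDB with a 2dsphere index and runs a combined spatial and temporal query; the database, field, and data values are assumptions, and a MongoDB instance on localhost is assumed to be running.

```python
from datetime import datetime
from pymongo import MongoClient, GEOSPHERE

client = MongoClient("mongodb://localhost:27017")   # assumes a local MongoDB instance
events = client["tornado_db"]["events"]
events.create_index([("location", GEOSPHERE)])      # enable geospatial queries

# Hypothetical document layout for one tornado event with meteorological variables.
events.insert_one({
    "event_time": datetime(2011, 4, 27, 16, 0),
    "ef_scale": 4,
    "location": {"type": "Point", "coordinates": [-87.0, 33.2]},   # [lon, lat]
    "met_vars": {"cape": 3200, "srh_0_1km": 310},
})

# Spatio-temporal query: events inside a bounding polygon within a date range.
polygon = {"type": "Polygon",
           "coordinates": [[[-90, 30], [-84, 30], [-84, 36], [-90, 36], [-90, 30]]]}
cursor = events.find({
    "location": {"$geoWithin": {"$geometry": polygon}},
    "event_time": {"$gte": datetime(2011, 4, 1), "$lt": datetime(2011, 5, 1)},
})
for doc in cursor:
    print(doc["event_time"], doc["ef_scale"])
```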

19.
20.
Geographic Information Systems (GIS) are moving from isolated, standalone, monolithic, proprietary systems working in a client‐server architecture to smaller web‐based applications and components offering specific geo‐processing functionality and transparently exchanging data among them. Interoperability is at the core of this new web services model. Compliance with Open Specifications (OS) enables interoperability. Web‐GIS software's high costs, complexity and special requirements have prevented many organizations from deploying their data and geo‐processing capabilities over the World Wide Web. There are no‐cost Open Source Software (OSS) alternatives to proprietary software for operating systems, web servers, and Relational Database Management Systems. We tested the potential of the combined use of OS and OSS to create web‐based spatial information solutions. We present in detail the steps taken in creating a prototype system to support land use planning in Mexico with web‐based geo‐processing capabilities currently not present in commercial web‐GIS products. We show that the process is straightforward and accessible to a broad audience of geographic information scientists and developers. We conclude that OS and OSS allow the development of web‐based spatial information solutions that are low‐cost, simple to implement, compatible with existing information technology infrastructure, and have the potential of interoperating with other systems and applications in the future.
