Similar Literature
20 similar documents found (search time: 46 ms)
1.
The greatest challenge facing cloud computing is security. Cloud applications are borderless and highly mobile, which sets them apart from the traditional IT model. In the cloud environment the mode of service delivery changes, and with it the party responsible for security changes fundamentally. Cloud service providers must build a secure cloud computing platform to underpin cloud security services; at the same time, the services must be both open and secure, protecting the sensitive information of cloud users. Integrating desktop security-management technology has proved effective here. Research on cloud security applications proceeds mainly along two lines: securing the cloud platform itself, and 'cloudifying' network security devices and security infrastructure.

2.
ABSTRACT

The availability and quantity of remotely sensed and terrestrial geospatial data sets are on the rise. Historically, these data sets have been analyzed and queried on 2D desktop computers; however, immersive technologies, and specifically immersive virtual reality (iVR), allow for the integration, visualization, analysis, and exploration of these 3D geospatial data sets. iVR can deliver remote and large-scale geospatial data sets to the laboratory, providing embodied experiences of field sites across the Earth and beyond. We describe a workflow for the ingestion of geospatial data sets and the development of an iVR workbench, and present their application in an experience of Iceland's Thrihnukar volcano, where we: (1) combined satellite imagery with terrain elevation data to create a basic reconstruction of the physical site; (2) used terrestrial LiDAR data to provide a geo-referenced point cloud model of the magmatic-volcanic system, as well as LiDAR intensity values for the identification of rock types; and (3) used Structure-from-Motion (SfM) to construct a photorealistic point cloud of the inside of the volcano. The workbench provides tools for the direct manipulation of the georeferenced data sets, including scaling, rotation, and translation, and a suite of geometric measurement tools for length, area, and volume. Future developments will be informed by an ongoing user study that formally evaluates the workbench's mature components in the context of fieldwork and analysis activities.

3.
To overcome the limitations that desktop software imposes on geospatial information applications, this paper builds on the MapGIS K9 service engine and proposes a new service-engine capability that delivers MapGIS K9 REST-style Web-service processing functions to geospatial organizations as online, on-demand imagery-science programs. Addressing geospatial users' current need for online, on-demand access to image processing, and drawing on ENVI together with engineering applications of the MapGIS K9 service engine, the paper presents a client-server enterprise (or cloud) architecture solution to enhance situational awareness.

4.
ABSTRACT

Sentinel-2 scenes are increasingly used in operational Earth observation (EO) applications at regional, continental and global scales, in near-real-time applications, and in multi-temporal approaches. On a broader scale, they are therefore one of the most important facilitators of the Digital Earth. However, data quality and availability are not spatially and temporally homogeneous, due to effects related to cloudiness, position on the Earth, and the acquisition plan. The spatio-temporal inhomogeneity of the underlying data may therefore affect any big remote sensing analysis and is important to consider. This study presents an assessment of the metadata for all accessible Sentinel-2 Level-1C scenes acquired in 2017, enabling the spatio-temporal coverage and availability to be quantified, including scene availability and cloudiness. Spatial exploratory analysis of the global, multi-temporal metadata also reveals that higher acquisition frequencies do not necessarily yield more cloud-free scenes, and exposes metadata quality issues, e.g. systematically incorrect cloud cover estimation at high, non-vegetated altitudes. The continuously updated datasets and analysis results are accessible through a Web application called EO-Compass. It contributes to a better understanding and selection of Sentinel-2 scenes, and improves the planning and interpretation of remote sensing analyses.
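The metadata-driven availability analysis described above can be sketched in a few lines: given scene metadata records, count per-tile totals versus scenes below a cloud-cover threshold. The record fields (`tile`, `cloud_cover`) and the 20% threshold are illustrative assumptions, not the actual Sentinel-2 Level-1C metadata schema.

```python
from collections import defaultdict

# Hypothetical scene-metadata records; the field names and values are
# illustrative, not the real Sentinel-2 Level-1C metadata schema.
scenes = [
    {"tile": "32TQM", "cloud_cover": 12.5},
    {"tile": "32TQM", "cloud_cover": 88.0},
    {"tile": "32TQM", "cloud_cover": 4.1},
    {"tile": "21HUB", "cloud_cover": 97.3},
    {"tile": "21HUB", "cloud_cover": 65.0},
]

def availability(scenes, max_cloud=20.0):
    """Per-tile counts of acquired vs. usable (low-cloud) scenes."""
    stats = defaultdict(lambda: {"total": 0, "usable": 0})
    for s in scenes:
        t = stats[s["tile"]]
        t["total"] += 1
        if s["cloud_cover"] <= max_cloud:
            t["usable"] += 1
    return dict(stats)

# Tile 21HUB shows how frequent acquisition alone guarantees nothing usable.
print(availability(scenes))
```

A real analysis would aggregate per orbit and per month as well, but the counting logic is the same.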

5.
The development of a groundwater favourability map is an effective tool for the sustainable management of groundwater resources in typical agricultural regions such as southern Perak, Malaysia. Assessing the potentiality and pollution vulnerability of groundwater is a fundamental phase of favourability mapping. A geographic information system (GIS)-based Boolean operator from a spatial analyst module was applied to combine a groundwater potentiality map (GPM) model and a groundwater vulnerability to pollution index (GVPI) map, thereby establishing the favourable zones for drinking water exploration in the investigated area. The GPM model was evaluated with a GIS-based Dempster–Shafer evidential belief function model, in which six geoelectrically determined groundwater potential conditioning factors (overburden resistivity, overburden thickness, aquifer resistivity, aquifer thickness, aquifer transmissivity and hydraulic conductivity) were synthesized using the model's probability-based algorithms. The thematic maps of the seven hydrogeological parameters of the DRASTIC model were treated as pollution potential conditioning factors and analysed with the developed ordered weighted average–DRASTIC index model algorithms to construct the GVPI map. Prediction accuracies of approximately 88.8% and 85.71% for the GPM and GVPI maps were established using the receiver operating characteristic curve method and a water quality status–vulnerability zone relationship scheme, respectively. Finally, the groundwater favourability map (GFM) model was produced by applying a GIS-based Boolean operator to the GPM and GVPI maps. The GFM model reveals three distinct zones: 'not suitable', 'less suitable' and 'very suitable'. The area analysis indicates that more than 50% of the study area is covered by the 'very suitable' zones. The result is a suitability map that can be used by local authorities for the exploitation and management of drinking water in the area, and the findings can also serve as a tool to raise public awareness of groundwater issues in developing countries.
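The final Boolean overlay step can be illustrated on toy rasters. This is a minimal sketch assuming three-level class codes for both maps (1 = low, 3 = high); the paper's actual reclassification scheme may differ.

```python
import numpy as np

# Toy 3x3 class rasters; codes 1-3 (low to high) are assumed for both maps.
gpm = np.array([[3, 3, 2],
                [2, 1, 3],
                [1, 2, 3]])   # groundwater potentiality classes
gvpi = np.array([[1, 2, 1],
                 [3, 1, 1],
                 [2, 2, 3]])  # vulnerability-to-pollution classes

# Boolean overlay: 'very suitable' = high potentiality AND low vulnerability;
# 'not suitable' = low potentiality OR high vulnerability; else 'less suitable'.
very_suitable = (gpm == 3) & (gvpi == 1)
not_suitable = (gpm == 1) | (gvpi == 3)
gfm = np.where(very_suitable, 2, np.where(not_suitable, 0, 1))
print(gfm)  # 2 = very suitable, 1 = less suitable, 0 = not suitable
```

In a GIS this is the same AND/OR raster combination, applied to the reclassified GPM and GVPI layers.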

6.
Abstract

This paper discusses the role of Geoinformatics as a new scientific discipline for handling geospatial information. Depending on the scientific background of the people shaping the emerging discipline, emphasis may be placed on different aspects of Geoinformatics; applications and developments may address geoscientific, spatial planning, or computer science related matters. The field encompasses the acquisition and storage of geospatial data, the modelling and presentation of spatial information, geoscientific analyses and spatial planning, and the development of algorithms and geospatial database systems. It is the position of the author that these tools from Geoinformatics are necessary to bridge the gap between Digital Earth models and the real world with its real-world problems ('connecting through location'). It is, however, crucial that Geoinformatics represent a coherent, integrated approach to the acquisition, storage, analysis, modelling, presentation, and dissemination of geo-processes, and not a patchwork of unconnected fields of activity. Geoinformatics is as such not a part of Geography, Surveying, or Computer Science, but a new self-contained scientific discipline. The paper highlights international and national trends of the discipline and presents a number of Geoinformatics initiatives; the research and teaching activities of the newly formed Institute for Geoinformatics and Remote Sensing (IGF) at the University of Osnabrueck serve as an example. All these developments have led to the long overdue formation of a scientific 'Society for Geoinformatics' (German: Gesellschaft für Geoinformatik – GfGI) in Germany.

7.
This article considers two types of tools used for the analysis of spatial data: computational tools for grid data and visualization-based interactive exploratory techniques. The main purpose is to demonstrate that computational and visual techniques are complementary, and that their combined use in data analysis may produce a synergistic effect. To this end, we describe two example scenarios of data analysis: the evaluation of potential damage from earthquakes and the analysis of forest resources. The analyses were performed with CommonGIS, which provides powerful geocomputational tools and interactive visualization facilities that can be used efficiently in combination, in a desktop environment or via the World Wide Web.

8.
Abstract

Geospatial simulation models can help us understand the dynamic aspects of Digital Earth. To implement high-performance simulation models for complex geospatial problems, grid computing and cloud computing are two promising computational frameworks. This research compares the benefits and drawbacks of both in Web-based frameworks by testing a parallel Geographic Information System (GIS) simulation model (Schelling's residential segregation model). The parallel GIS simulation model was tested on XSEDE (a representative grid computing platform) and Amazon EC2 (a representative cloud computing platform). The test results demonstrate that cloud computing platforms can provide almost the same parallel computing capability as high-end grid computing frameworks. However, cloud computing resources are more accessible to individual scientists, easier to request and set up, and have more scalable software architecture for on-demand and dedicated Web services. These advantages may attract more geospatial scientists to utilize cloud computing for the development of Digital Earth simulation models in the future.
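A minimal, single-machine sketch of Schelling's residential segregation model (the simulation the paper parallelizes) may help: each agent whose share of like neighbours falls below a tolerance threshold relocates to a random empty cell. The grid encoding and the 0.5 threshold here are illustrative choices, not the paper's configuration.

```python
import random

def schelling_step(grid, threshold=0.5, rng=random.Random(42)):
    """One sweep of a minimal Schelling model: every agent whose share of
    like neighbours is below `threshold` moves to a random empty cell.
    Cells: 0 = empty, 1 and 2 = the two agent groups."""
    n = len(grid)

    def unhappy(r, c):
        me = grid[r][c]
        neigh = [grid[r + dr][c + dc]
                 for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                 if (dr or dc) and 0 <= r + dr < n and 0 <= c + dc < n
                 and grid[r + dr][c + dc] != 0]
        return bool(neigh) and sum(v == me for v in neigh) / len(neigh) < threshold

    movers = [(r, c) for r in range(n) for c in range(n)
              if grid[r][c] and unhappy(r, c)]
    for r, c in movers:
        empties = [(i, j) for i in range(n) for j in range(n) if grid[i][j] == 0]
        if empties:
            i, j = rng.choice(empties)
            grid[i][j], grid[r][c] = grid[r][c], 0
    return grid

grid = [[1, 2, 0],
        [2, 1, 0],
        [0, 0, 1]]
print(schelling_step(grid))  # the number of agents is conserved
```

Parallel implementations typically partition the grid into subdomains and exchange boundary rows between workers; that partitioning is what the grid/cloud comparison exercises.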

9.
ABSTRACT

The impact of climate change on groundwater vulnerability has been assessed for the Pannonian basin over 1961–2070. High-resolution climate models, aquifer composition, land cover, and a digital elevation model were the main inputs to the spatial analysis, performed using Geographical Information Systems. The analysis covers three periods: the past period 1961–1990 (1990s), the present period 2011–2040 (2020s), and the future period 2041–2070 (2050s). During the 1990s, areas of high and very high groundwater vulnerability were identified across the central, western, eastern, southeastern, and northern parts of the Pannonian basin; in these areas water availability is lower and the pollution load index is high, owing to agricultural activities. The low and very low vulnerability classes appear in the south-west of the basin and in a few peripheral locations, mainly in the north and west. Medium groundwater vulnerability spreads across the basin but is concentrated in the centre, south, and south-west. The most affected territory is Hungary, while Slovenia, Croatia, and Bosnia and Herzegovina are less affected. In the present and future periods, the area of very high groundwater vulnerability increases by 0.74% and 0.87%, respectively. The low-class area decreased by 2.33% between the 1990s and the 2020s and is expected to decrease by up to 2.97% by the 2050s. Based on this analysis and the groundwater vulnerability maps, the Pannonian basin appears more vulnerable to climate change in the present and future periods. These findings indicate that the basin's aquifers are strongly and negatively affected by changing climate conditions, and that land cover contributes to this deterioration of groundwater resources. The resulting maps of groundwater vulnerability provide an instrument for water management planning and for research.

10.
ABSTRACT

Forecasting environmental parameters in the distant future requires complex modelling and large computational resources. Due to the sensitivity and complexity of forecast models, long-term parameter forecasts (e.g. up to 2100) are uncommon and only produced by a few organisations, in heterogeneous formats and based on different assumptions of greenhouse gases emissions. However, data mining techniques can be used to coerce the data to a uniform time and spatial representation, which facilitates their use in many applications. In this paper, streams of big data coming from AquaMaps and NASA collections of 126 long-term forecasts of nine types of environmental parameters are processed through a cloud computing platform in order to (i) standardise and harmonise the data representations, (ii) produce intermediate scenarios and new informative parameters, and (iii) align all sets on a common time and spatial resolution. Time series cross-correlation applied to these aligned datasets reveals patterns of climate change and similarities between parameter trends in 10 marine areas. Our results highlight that (i) the Mediterranean Sea may have a standalone 'response' to climate change with respect to other areas, (ii) the Poles are most representative of global forecasted change, and (iii) the trends are generally alarming for most oceans.
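The trend-similarity step can be sketched with a zero-lag Pearson correlation between two aligned series, plus a small lagged variant for detecting delayed responses. The series values below are invented for illustration; they are not the AquaMaps/NASA forecasts.

```python
import numpy as np

# Two aligned yearly series (invented values), e.g. temperature anomalies
# forecast for two marine areas after harmonisation to a common resolution.
area_a = np.array([0.10, 0.20, 0.25, 0.30, 0.42, 0.50])
area_b = np.array([0.05, 0.18, 0.20, 0.33, 0.40, 0.55])

# Zero-lag Pearson correlation: values near 1 indicate similar trends.
r = np.corrcoef(area_a, area_b)[0, 1]
print(round(r, 3))

def lagged_corr(x, y, lag):
    """Correlate x(t) with y(t + lag) to detect delayed responses."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    elif lag < 0:
        x, y = x[-lag:], y[:lag]
    return np.corrcoef(x, y)[0, 1]
```

A correlation matrix over all area pairs, at several lags, is what reveals which areas respond together and which (such as the Mediterranean in the paper's findings) behave as outliers.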

11.
Advances in Earth observation data acquisition systems have led to continuously growing production of remote sensing datasets, for which timely analysis has become a major challenge. In this context, distributed computing technology can support the efficient handling of large amounts of data. Moreover, the use of distributed computing techniques, once restricted by the availability of physical computer clusters, is now widespread due to the increasing offer of cloud computing infrastructure services. In this work, we introduce a cloud computing approach for object-based image analysis and classification of arbitrarily large remote sensing datasets. The approach is an original combination of distributed methods that enables the use of machine learning in the creation of classification models through a web-based notebook system. A prototype was implemented with the methods available in the InterCloud system integrated with the Apache Zeppelin notebook system for collaborative data analysis and visualization. In this implementation, Apache Zeppelin provided the means for using the scikit-learn Python machine learning library in the design of a classification model. We also evaluated the approach with an object-based land-cover classification of a GeoEye-1 scene, using resources from a commercial cloud computing infrastructure service provider. The results show the effectiveness of the approach in handling a large data volume in a scalable way, in terms of the number of allocated computing resources.
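The classification-model step can be sketched with scikit-learn, assuming per-segment features (e.g. mean band values, texture) have already been extracted by the object-based analysis. The feature values, class labels, and choice of a random forest are illustrative assumptions; the paper does not fix a specific classifier here.

```python
# Per-segment features are assumed to be precomputed by the object-based
# analysis; values, labels, and the random-forest choice are illustrative.
from sklearn.ensemble import RandomForestClassifier

X_train = [
    [0.12, 0.45, 0.30],  # e.g. mean NIR, mean red, a texture measure
    [0.15, 0.40, 0.28],
    [0.60, 0.20, 0.70],
    [0.58, 0.22, 0.65],
]
y_train = ["vegetation", "vegetation", "built-up", "built-up"]

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)

# Classify an unseen segment by its features.
print(clf.predict([[0.59, 0.21, 0.68]]))
```

In the notebook setting, code like this runs in a Zeppelin paragraph while the heavy segmentation and feature extraction are distributed across the cloud cluster.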

12.
Geospatially enabled scientific workflows offer a promising toolset to help researchers in the Earth observation domain with many aspects of the scientific process. One such aspect is access to distributed Earth observation data and computing resources. Earth observation research often uses large datasets that require extensive CPU and memory resources to process. These resource-intensive processes can be chained; the sequence of processes (and their provenance) makes up a scientific workflow. Despite the exponential growth in the capacity of desktop computers, their resources are often insufficient for the workflow processing tasks at hand. By integrating distributed computing capabilities into a geospatially enabled scientific workflow environment, researchers gain a mechanism to overcome the limitations of the desktop computer. Most of the effort on extending scientific workflows with distributed computing has focused on the web-services approach, as exemplified by the OGC's Web Processing Service and by Grid computing. The approach described in this article instead uses remote objects via RPyC and the dynamic properties of the Python programming language. The Vistrails environment has been extended to allow geospatial processing through the EO4Vistrails package ( http://code.google.com/p/eo4vistrails/ ). To allow these geospatial processes to be executed seamlessly on distributed resources such as cloud computing nodes, the Vistrails environment has been extended with both multi-tasking and distributed processing capabilities. Multi-tasking is required to let Vistrails run processes side by side, a capability it does not currently have; distributed processing is achieved through remote objects and mobile code via RPyC.

13.
ABSTRACT

Since Al Gore created the vision for Digital Earth in 1998, a wide range of research in this field has been published in journals. However, little attention has been paid to bibliometric analysis of the Digital Earth literature. This study applies a bibliometric methodology to publications related to Digital Earth in the Science Citation Index and Social Science Citation Index databases (via the Web of Science online services) for the period 1998 to 2015. We developed a novel keyword set for 'Digital Earth', with which 11,061 scientific articles from 23 subject categories were retrieved. Based on these articles, we analyzed the spatiotemporal characteristics of publication outputs, the subject categories, and the major journals. We then evaluated authors' performance, affiliations, cooperation, and funding institutes, and finally examined the keywords. Through keyword clustering, research hotspots in the field of Digital Earth were detected. We believe the results coincide well with the position of Digital Earth research in the context of big data.

14.
15.
Abstract

Grid computing is regarded as a good foundation for the Digital Earth infrastructure: geographically dispersed geospatial resources can be connected and merged into a 'supercomputer' using grid-computing technology. Geosensor networks, on the other hand, offer a new way to collect physical data dynamically and model a real-time virtual world. Integrating geosensor networks with grid computing in a geosensor grid can be compared to equipping the geospatial information grid with 'eyes' and 'ears': real-time information from the physical world can be processed, correlated, and modeled, enabling complex and advanced geospatial analyses on the geosensor grid with high-performance computation. Several issues and challenges must be overcome before the geosensor grid becomes a reality. In this paper, we propose an integrated framework, comprising a geosensor network layer, a grid layer, and an application layer, to address these design issues. Key technologies of the geosensor grid framework are discussed, and a geosensor grid testbed is set up to illustrate the proposed framework and improve our design.

16.
Abstract

Cartography in general, and building solid landscape models in particular, requires an interdisciplinary set of skills in order to be done well. Traditional handcrafted construction methods provide quality results, but are extremely labour-intensive and therefore costly. Modern methods using digital terrain models (DTMs) and computer numerical control (CNC) milling are fast and accurate, but the finished models are visually less than optimal. Solutions are proposed using DTMs and CNC milling to create landscape models in which the initial shaping is done mechanically and the fine details are carved by hand. This 'balanced approach' to landscape modelling combines the time- and cost-advantages of modern digital technology with the quality of traditional handcrafted techniques resulting in highly accurate landscape models which still retain the artistic 'feel' of the human touch.

17.
ABSTRACT

Light detection and ranging (LiDAR) data are essential for scientific discovery in the Earth and ecological sciences, in environmental applications, and in responding to natural disasters. While collecting LiDAR data over large areas is now quite feasible, the subsequent processing steps typically involve large computational demands. Efficiently storing, managing, and processing LiDAR data are prerequisites for these LiDAR-based applications. However, handling LiDAR data poses grand geoprocessing challenges due to data and computational intensity. To tackle these challenges, we developed a general-purpose scalable framework coupled with a sophisticated data decomposition and parallelization strategy to efficiently handle 'big' LiDAR data collections. The contributions of this research are (1) a tile-based spatial index for managing big LiDAR data in the scalable and fault-tolerant Hadoop distributed file system, (2) two spatial decomposition techniques that enable efficient parallelization of different types of LiDAR processing tasks, and (3) the coupling of existing LiDAR processing tools with Hadoop, so that a variety of LiDAR processing tasks can be conducted in parallel in a highly scalable distributed computing environment through an online geoprocessing application. A proof-of-concept prototype demonstrates the feasibility, performance, and scalability of the proposed framework.
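The tile-based spatial index at the heart of contribution (1) reduces, in essence, to mapping each point's planar coordinates to a tile key so that spatially close points are stored and processed together. The tile size and key format below are assumptions for illustration, not the paper's exact scheme.

```python
from collections import defaultdict

def tile_key(x, y, tile_size=100.0):
    """Map a point's planar coordinates to its tile identifier."""
    return (int(x // tile_size), int(y // tile_size))

# Toy (x, y, z) points; tile size and key format are assumptions.
points = [(12.4, 7.9, 102.3), (150.0, 98.2, 95.1), (151.7, 101.0, 96.4)]

tiles = defaultdict(list)
for x, y, z in points:
    tiles[tile_key(x, y)].append((x, y, z))

# Each key names a tile whose points are stored (and processed) together,
# e.g. one file or block per tile in a Hadoop-style distributed store.
print(sorted(tiles))
```

Keying by tile is also what makes the parallel decomposition natural: each worker can process one tile's points independently, with neighbouring tiles fetched only when an operation needs boundary context.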

18.
Research on Distributed Geospatial Model Sharing Based on Open Interoperability Standards
Traditional standalone and closed network environments, with their limited capacity to exploit resources, cannot adequately support the sharing and integrated application of dispersed geoscience data, models, and other resources. Based on the characteristics of information exchange in networked environments, this paper proposes a service architecture for distributed geospatial model sharing. Centred on interoperability elements such as data, models, and metadata, the architecture openly couples data and model network nodes. To address interoperation among geospatial model services, an interaction interface for model-sharing services in distributed environments is proposed; it defines model-service metadata, service interaction operations, and service communication modes, minimizing the interoperation difficulties between model services and model clients in data exchange and function invocation. To lower the barrier to publishing models as model services, a geospatial model sharing platform was designed and developed, and two methods of publishing geospatial models on the platform are introduced. Finally, an application of these results to sharing the Prairie ecological model is presented.

19.
Cloud computing has been considered the next-generation computing platform, with the potential to address the data and computing challenges in the geosciences. However, only a limited number of geoscientists have adopted this platform for their research, mainly due to two barriers: (1) selecting an appropriate cloud platform for a specific application can be challenging, as many cloud services are available, and (2) existing general-purpose cloud platforms are not designed to support geoscience applications, algorithms, and models. To tackle these barriers, this research designs a hybrid cloud computing (HCC) platform that utilizes and integrates computing resources across organizations to build a unified geospatial cloud computing platform. The platform can manage different types of underlying cloud infrastructure (e.g., private or public clouds) and enables geoscientists to test and leverage cloud capabilities through a web interface. It also provides geospatial cloud services, such as workflow-as-a-service, on top of the common cloud services (e.g., infrastructure-as-a-service) provided by general cloud platforms. Geoscientists can thus easily create a model workflow by recruiting the needed models for a geospatial application or task on the fly. An HCC prototype is developed, and a dust storm simulation is used to demonstrate the capability and feasibility of the platform in facilitating the geosciences by leveraging cross-organization computing and model resources.

20.
Groundwater is an important component of water resources. Groundwater pollution endangers human health and affects production and daily life. Determining the likelihood that groundwater in a given area will be polluted, i.e. its vulnerability, provides management and decision-making departments with a scientific basis for the rational development of groundwater resources and for planning and managing the prevention of groundwater pollution. In this vulnerability assessment, the ArcGIS geographic information system platform was applied, combined with geostatistical analysis principles, to compile a groundwater system vulnerability map…
