Similar documents
20 similar documents found (search time: 46 ms)
1.
Mercury is a federated metadata harvesting, search, and retrieval tool based on both open source packages and custom software developed at Oak Ridge National Laboratory (ORNL). It was originally developed for the National Aeronautics and Space Administration (NASA), and the consortium now includes funding from NASA, the U.S. Geological Survey (USGS), and the U.S. Department of Energy (DOE). Mercury is itself a reusable software application that uses a service-oriented architecture (SOA) approach to capturing and managing metadata in support of twelve Earth science projects. Mercury also supports the reuse of metadata by enabling searches across a range of metadata specifications and standards, including XML, Z39.50, FGDC, Dublin-Core, Darwin-Core, EML, and ISO-19115. It collects metadata and key data from contributing project servers distributed around the world and builds a centralized index. The Mercury search interfaces allow users to perform simple, fielded, spatial, temporal, and other hierarchical searches across these metadata sources. This centralized repository of metadata with distributed data sources provides extremely fast search results (Table 1) to the user, while allowing data providers to advertise the availability of their data yet maintain complete control and ownership of that data.

2.
U-Pb Saturn is a new freeware package for U-Pb LA-ICP-MS data reduction. It has been developed to provide easy interaction with and visualisation of LA-ICP-MS U-Pb datasets, and allows fast and reliable reduction of hundreds of data points. Saturn offers dynamic graphic interfaces to quickly view, evaluate, and plot U-Pb and Pb-Pb isotope data. It operates online (or offline), giving the freedom to change parameters and reprocess data at any stage of data acquisition. The main interface allows the user to: (1) choose the best statistics for drift correction, (2) include/exclude offset factors, and (3) apply (or not) Pbc corrections in different modes. Signal intensities are displayed in a separate graphic interface that allows users to interact with the time-resolved signal of individual spot analyses. All graphic windows are interactive; any modification to data treatment (e.g., inclusion or exclusion of analyses of reference material, or modification of the time-resolved signal windows) is instantaneously updated in the data tables. Saturn is particularly attractive for beginners in LA-ICP-MS U-Pb geochronology as it is non-commercial, easy to install, and very interactive. Coding information and a version of the software can be accessed at http://www.air.ufop.br.

3.
Geospatial data sciences have emerged as critical requirements for high-priority application solutions in diverse areas, including, but not limited to, the mitigation of natural and man-made disasters. Three sets of metrics, adopted or customized from geostatistics, applied meteorology, and signal processing, are tested in terms of their ability to evaluate geospatial datasets, specifically two population databases commonly used for disaster preparedness and consequence management. The two high-resolution, grid-based population datasets are the LandScan dataset, available from the Geographic Information Science and Technology (GIST) group at the Oak Ridge National Laboratory (ORNL), and the Gridded Population of the World (GPW) dataset, available from the Center for International Earth Science Information Network (CIESIN) group at Columbia University. Case studies evaluate population data across the globe, specifically the metropolitan areas of Washington DC, Los Angeles, and Houston in the USA and London in the UK, as well as the country of Iran. The geospatial metrics confirm that the two population datasets have significant differences, especially in the context of their utility for disaster readiness and mitigation. While this paper primarily focuses on grid-based population datasets and disaster management applications, the sets of metrics developed here can be generalized to other geospatial datasets and applications. Future research needs to develop metrics for geospatial and temporal risks and associated uncertainties in the context of disaster management. The U.S. Government's right to retain a non-exclusive, royalty-free license in and to any copyright is acknowledged.

4.
Karst database development in Minnesota: design and data assembly
The Karst Feature Database (KFD) of Minnesota is a relational GIS-based Database Management System (DBMS). Previous karst feature datasets used inconsistent attributes to describe karst features in different areas of Minnesota. Existing metadata were modified and standardized to provide comprehensive metadata for all karst features in Minnesota. Microsoft Access 2000 and ArcView 3.2 were used to develop this working database. Existing county and sub-county karst feature datasets have been assembled into the KFD, which is capable of visualizing and analyzing the entire data set. By November 17, 2002, 11,682 karst features were stored in the KFD of Minnesota. Data tables are stored in a Microsoft Access 2000 DBMS and linked to corresponding ArcView applications. The current KFD of Minnesota has been moved from a Windows NT server to a Windows 2000 Citrix server accessible to researchers and planners through networked interfaces.

5.
Sterling Quinn. GeoJournal, 2017, 82(3): 455-473
As businesses and governments integrate OpenStreetMap (OSM) into their services in ways that require comprehensive coverage, there is a need to expand research outside of major urban areas and consider the strength of the map in smaller cities. A place-specific inquiry into the OSM contributor sets in small cities allows an intimate look at user motives, locations, and editing habits that are readily described in the OSM metadata and user profile pages, but often missed by aggregate studies of OSM data. Using quantitative and qualitative evidence from the OSM history of five small cities across North and South America, I show that OSM is not accumulating large local corpora of editors outside of major urban areas. In these more remote places OSM remains largely at the mercy of an unpredictable mix of casual contributions, business interests, feature-specific “hobbyists”, bots, and importers, all passing through the map at different scales for different reasons. I present a typology of roles played by contributors as they expand and fix OSM in casual, systematic, and automated fashion. I argue that these roles are too complex to be conceptualized with the traditional “citizen as sensor” model of understanding volunteered geographic information. While some contributors are driven by pride of place, others are more interested in improving map quality or ensuring certain feature types are represented. Institutions considering the use of OSM data in their projects should be aware of these varied influences and their potential effects on the data.

6.
Research on geospatial metadata
周成虎, 李军. 地球科学, 2000, 25(6): 579-585
According to the differences in the objects that geoscience metadata describe, geoscience metadata can be divided into database metadata, dataset metadata, and data feature-layer metadata; metadata at these different levels differ somewhat in how they are managed and used. The most critical and fundamental issue in geoscience metadata research is the formulation and use of metadata standards. From a practical standpoint, and drawing on existing geoscience metadata standards, a three-tier geoscience metadata system is proposed, consisting of a basic set, a summary set, and a detailed set, with each tier serving a different user group. Based on successful experience with metadata use, several modes of metadata use and management are summarized.

7.
Current advances in computer hardware, information technology, and data collection techniques have produced very large data sets in a wide variety of scientific and engineering disciplines. We must harness this opportunity to visualize and extract useful information from geophysical and geological data. We have approached the task of data mining by implementing a map-like approach over a web server for interrogating the data, using a client-server paradigm. The spatial data are mapped onto a two-dimensional grid from which the user (client) can query the data with the map-interface as a user extension. The data are stored on the server, while the computational gateway separating the client and the server can be the front end of an electronic publication, an electronic classroom, a survey, or an e-business. We have used a combination of Java, Java3D, and Perl for processing the data and communicating between the client and the server. The user can interrogate the geospatial data over any particular region with arbitrary dimensions and then receive back relevant statistical analysis, such as histogram plots and local statistics. We have applied this method to the following data sets: (1) distribution of prime numbers; (2) two-dimensional mantle convection; (3) three-dimensional mantle convection; (4) high-resolution satellite reflectance data over multiple wavelengths; (5) molecular dynamics describing the flow of blood in narrow vessels. Using the map-interface, one can interrogate this data over the Internet.
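The abstract does not include the server-side code (which the authors wrote in Java and Perl); the following stand-alone Python sketch only illustrates the kind of region interrogation described — extracting an arbitrary rectangular window from a gridded dataset and returning local statistics and a histogram. The function name and the synthetic grid are hypothetical.

```python
import random
import statistics

def interrogate(grid, row0, row1, col0, col1, bins=10):
    """Extract a rectangular region from a 2-D grid of values and return
    local statistics plus a histogram -- the kind of summary the
    map-interface server sends back for a user-selected region."""
    values = [grid[r][c] for r in range(row0, row1) for c in range(col0, col1)]
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0          # guard against a flat region
    counts = [0] * bins
    for v in values:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    stats = {
        "mean": statistics.fmean(values),
        "stdev": statistics.pstdev(values),
        "min": lo,
        "max": hi,
    }
    return stats, counts

# Synthetic 100x100 field standing in for a gridded geophysical dataset.
random.seed(0)
field = [[random.gauss(5.0, 2.0) for _ in range(100)] for _ in range(100)]
stats, counts = interrogate(field, 10, 40, 20, 60)
```

In the paper's client-server setting, the client would send only the region bounds and receive back `stats` and `counts` for plotting.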

8.
Modern exploration is a multidisciplinary task requiring the simultaneous consideration of multiple disparate geological, geochemical and geophysical datasets. Over the past decade, several research groups have investigated the role of Geographic Information Systems as a tool to analyse these data. From this research, a number of techniques have been developed that allow the extraction of exploration-relevant spatial factors from the datasets. The spatial factors are ultimately condensed into a single prospectivity map. Most techniques used to construct prospectivity maps tend to agree, in general, as to which areas have the lowest and highest prospectivities, but disagree for regions of intermediate prospectivity. In such areas, the prospectivity map requires detailed interpretation, and the end-user must normally resort to analysis of the original datasets to determine which conjunction of factors results in each intermediate prospectivity value. To reduce this burden, a new technique, based on fuzzy logic principles, has been developed for the integration of spatial data. Called vectorial fuzzy logic, it differs from existing methods in that it displays prospectivity as a continuous surface and allows a measure of confidence to be incorporated. With this technique, two maps are produced: one displays the calculated prospectivity and the other shows the similarity of input values (or confidence). The two datasets can be viewed simultaneously as a three-dimensional perspective image in which colour represents prospectivity and topography represents confidence. With the vectorial fuzzy logic method, factors such as null data and incomplete knowledge can also be incorporated into the prospectivity analysis.
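The abstract does not spell out the vectorial fuzzy logic formulas, so the sketch below only illustrates its central idea — producing a prospectivity map and a companion confidence map from the same fuzzy evidence layers. The mean-and-range combination used here is a simplified placeholder, not the published method, and the layer values are hypothetical.

```python
def combine_layers(layers):
    """Illustrative two-map output in the spirit of vectorial fuzzy logic:
    for each cell, a prospectivity value (here the mean of the input
    memberships) and a confidence value that is high where the input
    layers agree and low where they diverge."""
    prospectivity, confidence = [], []
    for cell_values in zip(*layers):   # one tuple of memberships per cell
        prospectivity.append(sum(cell_values) / len(cell_values))
        confidence.append(1.0 - (max(cell_values) - min(cell_values)))
    return prospectivity, confidence

# Three hypothetical evidence layers, flattened to 1-D lists of
# fuzzy membership values in [0, 1]:
geochem = [0.9, 0.2, 0.5, 0.8]
geophys = [0.8, 0.1, 0.1, 0.7]
struct  = [0.9, 0.3, 0.9, 0.9]
p, c = combine_layers([geochem, geophys, struct])
```

Cell 2 shows the point of the second map: its mean prospectivity (0.5) is intermediate, but the low confidence (the layers span 0.1-0.9) flags it as a cell whose inputs disagree and deserve closer inspection.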

9.
Indexing methods are used to evaluate aquifer vulnerability and establish guidelines for the protection of groundwater resources. The principle of an indexing method is to rank influences on groundwater to determine the overall vulnerability of an aquifer to contamination. The analytic element method (AEM) of groundwater flow modeling is used to enhance indexing methods by rapidly calculating a potentiometric surface based primarily on surface-water features. This potentiometric map is combined with a digital elevation model to produce a map of water-table depth. This is an improvement over simple water-table interpolation methods: it is physically based, properly representing surface-water features, hydraulic boundaries, and changes in hydraulic conductivity. The AEM software, SPLIT, is used to improve an aquifer vulnerability assessment for a valley-fill aquifer in western New York State. A GIS-based graphical user interface allows automated conversion of hydrography vector data into analytic elements.

10.
The basic database service system for China's important metallogenic belts consists of two parts: technical services and data services. On the technical side, data-clipping types were defined according to the characteristics of the basic data for the important metallogenic belts; storing graphic-element information and topology information in two separate long-binary fields solved the problems of data loss and missing format-conversion information that arise when spatial data are loaded into a database; and the combination of a relational database with a spatial data engine solved the problems of integrated storage and joint querying of attribute data and spatial data. On the data-service side, combining WebGIS online services, catalog services, and interpretation services with delivery of data on mailed CD-ROMs both meets user needs and keeps the data secure; data types and user levels were classified, and strategies for protecting data security were proposed.

11.
Design and development of a network platform for Earth system science data sharing
The "Earth System Science Data Sharing Network" is one of the pilot projects of the national scientific data sharing program and a component of the national science and technology infrastructure platform; its purpose is to provide data support services for basic and frontier research in Earth system science. The platform (GEODATA) is a distributed data exchange system; its overall design uses metadata to integrate dispersed geoscience data resources and provides networked sharing of geoscience data through technical encapsulation of the business logic. GEODATA has a five-layer architecture: a portal layer, a sharing-service layer, a core-service layer, a resource-management layer, and a network-platform layer, with the business-logic functions decomposed into 13 functional modules. The development follows a main-center/sub-center design: the main center is organized around a "geoscience data supermarket" concept, while the sub-centers are built on Web Services technology in a distributed data-management mode; the business logic of the two is linked into an organic whole based on the metadata life cycle. A GEODATA prototype was developed with J2EE for cross-platform deployment. Practical application shows that this platform architecture is well suited to the characteristics of Earth system science.

12.
13.
With the rise in the number of applications using geospatial data and the number of GIS applications, the number of people who come into contact with geospatial data is increasing as well. Despite many attempts to introduce standardized formats in this area, these are often ignored, for various reasons, by software developers as well as by users themselves. When creating or exporting geographical data, users choose a format according to the software they use, or the software for which the data are intended. Users then have to deal with conversion between data formats and, depending on the intended use, with transformation to the appropriate spatial reference system. This work presents findings related to this issue, obtained from several years of operation of an online service for the conversion and transformation of geographical data that is heavily used by users from all over the world. It presents statistics on the use of individual formats and spatial reference systems, from the point of view of both input and output data. The results are, among other things, shown in the form of a pie chart map in which the varying needs of users from a variety of countries can be seen. The results of this work can be used especially by developers of applications that use geospatial data; it will allow them to quickly understand current user needs.
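As an illustration of the kind of spatial reference transformation such a service performs, the sketch below implements the standard spherical Web Mercator forward projection (EPSG:4326 geographic coordinates to EPSG:3857 metres). A production service would typically delegate this to a projection library such as PROJ rather than hand-code it; the sample coordinates are only illustrative.

```python
import math

R = 6378137.0  # WGS 84 semi-major axis, used as the sphere radius

def wgs84_to_web_mercator(lon_deg, lat_deg):
    """Convert WGS 84 geographic coordinates (EPSG:4326) to spherical
    Web Mercator (EPSG:3857) metres -- the kind of spatial reference
    transformation conversion services perform internally."""
    x = R * math.radians(lon_deg)
    y = R * math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))
    return x, y

x, y = wgs84_to_web_mercator(14.42, 50.09)  # roughly Prague
```

The inverse transformation, and conversions between the thousands of other EPSG reference systems, follow the same pattern of a well-defined formula per system pair, which is why users so often outsource the step to an online service.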

14.
The current study sought to offer guidance for developing effective web-based mapping tools for wildfire warnings by identifying (1) the important content for facilitating individuals’ decision-making, and (2) the optimal interface design for ensuring usability and ease of information access. A map-based warning tool was prototyped in the Australian context, followed by a usability and effectiveness evaluation through individual interviews and verbal protocol analysis to assess participants’ interaction with the mapping interface and information in response to the simulated warning scenario. The results demonstrated variations in participants’ approaches to wildfire warning response, revealing varied information needs. Specifically, most participants relied on their own assessment of the prospective threat, requiring specific wildfire-related information before eliciting a response. In contrast, the decision of a minority of the participants was motivated by response guidance from agencies, and accurate wildfire information was less important for their response. Imperative information for both types of residents therefore needs to be highlighted in a map-based warning tool to cater to a wide audience. Furthermore, a number of heuristics were identified for designing effective interactive functions to facilitate the control of, and access to, the various maps and textual information presented on the map-based warning interface.

15.
Spatial profiling of community food security data can help the targeting of geographic areas and populations most vulnerable to food insecurity. While multiple poverty mapping systems support spatial profiling, they often lack capabilities to disseminate mapping results to a wide range of audiences and to spatially link qualitative data to quantitative analysis. To address these limitations, this study presents a web mapping framework which integrates a variety of publicly available software tools to enable spatial exploration of both quantitative and qualitative data. Specifically, our framework allows online choropleth mapping and thematic data exploration through a mixture of free mapping Application Programming Interfaces (APIs) and open source software tools for spatial data processing and desktop-like user interfaces. The study demonstrates this framework by developing a web prototype for informing food insecurity issues in Bogotá, Colombia. The prototype implementation reveals that the proposed framework facilitates the development of scalable and functionally-extensible mapping systems and the identification of community-specific food insecurity problems (e.g., food kitchens inaccessible from workplaces of low-income residents). This suggests that web-based cartographic visualization using publicly available software tools can be useful for spatial examination of community food insecurity as well as for cost-effective distribution of the resulting map information.

16.
The consistent geometric and topological representation of a fault network is possible through a method based on the implementation of 3-dimensional Generalized maps (3-G-map) enabling all subdivisions of space to be represented. The fault network is modeled as an assemblage of polygonal faces from a set of geometric data on the faults and a knowledge of the relationships between the faults. The resultant model is expressed in terms of a 3-G-map in which volume, surface, and topological information is constructed taking into account computed intersections between faults and known interception relations. The fault network can be edited through an interactive 3-D viewer which provides several tools for navigating within the 3-G-map. Information relevant to a fault network, such as block geometry, connectivity, adjacencies, and connectivity relationships, can be obtained by exploring the data structure of the 3-G-map. The fault network architecture is made comprehensive through interactive modeling and visualization.

17.
This work evaluated the functionality of various fuzzy-based fusion methods in mineral potential mapping (MPM), by which a multi-criteria decision-making problem was solved to design a layout for drilling complementary boreholes through a comprehensive analysis of geospatial datasets. The methods employed were fuzzy c-means clustering, the fuzzy gamma operator, a fuzzy inference system (FIS), fuzzy outranking, and fuzzy ordered weighted averaging (FOWA). The Kahang porphyry Cu-Mo deposit in the Isfahan province of Iran was chosen as a case study to examine the performance of these fuzzy methods in MPM. Geospatial indicator layers for assessing the potential of porphyry-type mineralization were derived from four criteria, namely geology (rock units and faults), remote sensing (alteration map), geochemistry (Cu, Mo, and factor maps), and geophysics (reduced-to-the-pole and analytical signal of magnetic data). The concentration-area multifractal method was used to reclassify each synthesized fuzzy favorability map into five classes. To appraise and compare the efficiency of each method, a productivity measure, defined as the cumulative sum of Cu grade multiplied by thickness above an economic cut-off value of 0.2%, was calculated along each of the 33 drill holes. According to the fuzzy favorability maps derived from all the fuzzy methods, the FIS and FOWA had the highest efficiency, with 80% and 78% accuracy, respectively. Eventually, taking all fuzzy maps into account led to the delineation of some new favorable zones, for which further exploratory investigations are envisioned to determine their mining potential.
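Of the methods listed, the fuzzy gamma operator has a compact standard definition: it blends the fuzzy algebraic sum (which tends to increase the combined value) and the fuzzy algebraic product (which tends to decrease it) through an exponent gamma in [0, 1]. A minimal sketch follows; the membership values are hypothetical, and the paper's actual layers and gamma value are not given in the abstract.

```python
import math

def fuzzy_gamma(memberships, gamma=0.9):
    """Fuzzy gamma operator commonly used in mineral potential mapping:
    combined = (fuzzy algebraic sum)**gamma * (fuzzy algebraic product)**(1-gamma),
    where sum = 1 - prod(1 - mu_i) and product = prod(mu_i)."""
    product = math.prod(memberships)
    algebraic_sum = 1.0 - math.prod(1.0 - m for m in memberships)
    return algebraic_sum ** gamma * product ** (1.0 - gamma)

# Combine three hypothetical evidence-layer memberships for one cell:
combined = fuzzy_gamma([0.8, 0.6, 0.9], gamma=0.9)
```

With gamma = 1 the operator reduces to the algebraic sum, with gamma = 0 to the product; intermediate gamma values trade off the two, which is why gamma is usually tuned per study area.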

18.
DRASTIC indexing and integrated electrical conductivity (IEC) modeling are approaches for assessing aquifer vulnerability to surface pollution. DRASTIC indexing is more common, but IEC modeling is faster and more cost-effective because it requires less data and fewer processing steps. This study aimed to compare DRASTIC indexing with IEC modeling to determine whether the latter is sufficient on its own. Both approaches were used to determine zones vulnerable to groundwater pollution in the Nile Delta, where assessing the nature and degree of risk is important for realizing effective measures toward damage minimization. For DRASTIC indexing, hydrogeological factors such as depth to aquifer, recharge rate, aquifer media, soil permeability, topography, impact of the vadose zone, and hydraulic conductivity were combined in a geographical information system environment to assess aquifer vulnerability. For IEC modeling, DC resistivity data were collected from 36 surface sounding points covering the entire area and used to estimate the IEC index. Additionally, the vulnerable zones identified by both approaches were tested using a local-scale resistivity survey, in the form of 1D and 2D resistivity imaging, to determine the permeable pathways in the vadose zone. A correlation of 0.82 was obtained between the DRASTIC indexing and IEC modeling results. For additional benefit, the DRASTIC and IEC models were used together to develop a vulnerability map. This map showed a very high vulnerability zone, a high-vulnerability zone, and moderate- and low-vulnerability zones constituting 19.89%, 41%, 27%, and 12% of the study area, respectively. Identifying where groundwater is more vulnerable to pollution enables more effective protection and management of groundwater resources in vulnerable areas.
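The DRASTIC index itself is a simple weighted sum of the seven hydrogeological parameter ratings the abstract lists. A minimal sketch using the standard DRASTIC weights follows; the cell ratings in the example are hypothetical, not values from this study.

```python
# Standard DRASTIC parameter weights; each parameter's rating is a
# site-specific value on a 1-10 scale.
DRASTIC_WEIGHTS = {
    "D": 5,  # Depth to water
    "R": 4,  # net Recharge
    "A": 3,  # Aquifer media
    "S": 2,  # Soil media
    "T": 1,  # Topography (slope)
    "I": 5,  # Impact of the vadose zone
    "C": 3,  # hydraulic Conductivity
}

def drastic_index(ratings):
    """Compute the DRASTIC vulnerability index as the weighted sum of the
    seven parameter ratings; higher values indicate greater aquifer
    vulnerability to surface pollution."""
    return sum(DRASTIC_WEIGHTS[k] * ratings[k] for k in DRASTIC_WEIGHTS)

# Hypothetical ratings for one grid cell:
cell = {"D": 9, "R": 8, "A": 6, "S": 6, "T": 10, "I": 8, "C": 4}
index = drastic_index(cell)  # 169
```

With ratings on the 1-10 scale, the index ranges from 23 to 230; in a GIS workflow the computation is repeated per raster cell and the resulting surface is classified into vulnerability zones like those in the map described above.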

19.
Geospatial technologies and digital data have developed and spread rapidly in conjunction with increasing computing efficiency and Internet availability. The ability to store and transmit large datasets has encouraged the development of national infrastructure datasets in geospatial formats. National datasets are used by numerous agencies for analysis and modeling purposes because these datasets are standardized and considered to be of acceptable accuracy for national-scale applications. At Oak Ridge National Laboratory, a population model has been developed that incorporates national schools data as one of its inputs. This paper evaluates spatial and attribute inaccuracies present within two national school datasets, Tele Atlas North America and the National Center for Education Statistics (NCES). Schools are an important component of the population model because they are spatially dense clusters of vulnerable populations. It is therefore essential to validate the quality of school input data. Schools were also chosen because a validated schools dataset was produced in geospatial format for Philadelphia County, enabling a comparison between a local dataset and the national datasets. Analyses found that the national datasets are not standardized and are incomplete, containing 76 to 90 percent of existing schools. The temporal accuracy of updating annual enrollment values resulted in 89 percent inaccuracy for 2003. Spatial rectification was required for 87 percent of NCES points, of which 58 percent of the errors were attributed to the geocoding process. Lastly, it was found that combining the two national datasets produced a more useful and accurate dataset.

20.
Research and application of a hydrological core metadata model for hydrological data sharing
孟令奎, 李三霞, 张文, 张东映. 水文, 2012, (1): 1-5, 12
This paper reviews the current state of research on hydrological and related metadata, analyzes the characteristics of hydrological data and metadata, and on that basis designs a concise, widely applicable hydrological core metadata model that meets the requirements of hydrological data sharing. The proposed hydrological core metadata model has been applied in the sharing of hydrological and water resources scientific data, with good results.
