Similar Documents
20 similar documents found (search time: 15 ms)
1.
In the United States, desalination has expanded considerably since the 1950s, reaching a daily production capacity of 2 BGD (billion gallons per day) with around 1336 operating plants as of 2013 (GWI, 2013). Despite this continuous growth, a steady increase in desalination investments, and growing demand for water, research on the geospatial representation of desalination plants and their characteristics over time is very limited or non-existent. This paper aims to fill this gap by developing interactive 5D and 6D geospatial models and a multi-dimensional analysis of desalination trends over the period 1950-2013. The analysis shows that desalination plants are located mainly on the East and West Coasts of the United States, with Florida, California, and Texas leading the national desalination sector. Despite geographical proximity to the sea, most plants use brackish groundwater because of economic factors related to the desalination process itself and to the disposal of the highly saline byproduct, brine. The models can be used for educational and interdisciplinary research purposes and help determine the socio-economic viability of establishing prospective desalination plants in different regions in the future. They can also support decision makers in addressing emergency issues related to water shortages and in preparing for long-term water scarcity in different US regions.

2.
A common concern when working with health-related data is that national standard guidelines are designed to preserve individual statistical information, usually recorded as text or in a spreadsheet format ('statistical confidentiality'), but lack appropriate rules for visualizing this information on maps ('spatial confidentiality'). Privacy rules to protect spatial confidentiality are becoming increasingly important as governmental agencies increasingly incorporate Geographic Information Systems (GIS) as a tool for collecting, storing, analysing, and disseminating spatial information. The purpose of this paper is to propose the first step of a general framework for presenting the location of confidential point data on maps using empirical perceptual research. The overall objective is to identify geographic masking methods that preserve both the confidentiality of individual locations and, at the same time, the essential visual characteristics of the original point pattern.
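For readers unfamiliar with geographic masking, the sketch below illustrates one common family of methods that a framework like this would evaluate: random perturbation within a "donut", which displaces each confidential point by a random distance bounded below (to guarantee a minimum level of anonymity) and above (to limit distortion of the pattern). This is a generic illustration, not the authors' method; the radii, coordinates and seed are arbitrary assumptions.

```python
import math
import random

def donut_mask(points, r_min=100.0, r_max=500.0, seed=0):
    """Displace each (x, y) point by a random vector whose length lies
    between r_min and r_max (map units), a simple 'donut' geographic mask."""
    rng = random.Random(seed)
    masked = []
    for x, y in points:
        angle = rng.uniform(0.0, 2.0 * math.pi)
        # Sample the radius so displacements are uniform over the annulus area.
        radius = math.sqrt(rng.uniform(r_min ** 2, r_max ** 2))
        masked.append((x + radius * math.cos(angle),
                       y + radius * math.sin(angle)))
    return masked

# Example: mask three confidential point locations (projected coordinates).
original = [(3500.0, 1200.0), (3620.0, 1185.0), (3410.0, 1300.0)]
print(donut_mask(original))
```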

3.
We often need to report on environmental, economic and social indicators, and properties at aggregated spatial scales, e.g. average income per suburb. To do this, we invariably create reporting polygons that are somewhat arbitrary. The question arises: how much does this arbitrary subdivision of space affect the outcome? In this paper, we develop a new, gradient-based framework for rigorously analysing the sensitivity of integrating functions to quantitative changes in their spatial configuration. The approach is applied to both analytical and empirical models, and it allows a hierarchy of sensitivity measures (from global to local) to be reported. We found the concepts of a vector space representing the spatial configurations, and of the response (hyper-)surface whose gradients indicate the sensitivities, to be helpful in developing the sensitivity-analysis framework for spatial configurations in different dimensions. The approach works well with both analytical and empirical integrating functions, and in an existing environmental reporting application it produced a clear ranking of the sensitivities of the responses to changes in the reporting regions. It also allowed us to identify which vertices, and which directions of change of those vertices, influenced the outcome most. The framework allows results to be reported hierarchically, from the sensitivity of an integrative response to changes in a whole reserve/reporting system down to the sensitivity along each dimension of the vertices in the spatial configuration. The resulting sensitivities can be readily visualized by plotting them as vectors on geographic maps, which simplifies presentation and facilitates uptake of the results where the spatial configurations are complicated.
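As a rough illustration of the idea (not the authors' gradient framework), the sketch below estimates, by finite differences, how an areal integral of a field over a reporting polygon responds to moving one polygon vertex. The test field, polygon, step size and the coarse grid used to approximate the integral are all arbitrary assumptions, so the estimates are only indicative.

```python
import numpy as np
from shapely.geometry import Point, Polygon

def field(x, y):
    """Arbitrary smooth test field (stand-in for, e.g., an environmental indicator)."""
    return np.exp(-((x - 2.0) ** 2 + (y - 1.0) ** 2) / 4.0)

def areal_integral(coords, n=60):
    """Approximate the integral of `field` over the polygon by coarse grid sampling."""
    poly = Polygon(coords)
    minx, miny, maxx, maxy = poly.bounds
    xs = np.linspace(minx, maxx, n)
    ys = np.linspace(miny, maxy, n)
    cell = (xs[1] - xs[0]) * (ys[1] - ys[0])
    total = 0.0
    for x in xs:
        for y in ys:
            if poly.contains(Point(x, y)):
                total += field(x, y) * cell
    return total

# Sensitivity of the integral to shifting vertex 2 in x and y (finite differences).
coords = [(0, 0), (4, 0), (4, 3), (0, 3)]
eps = 0.01
base = areal_integral(coords)
for dim, delta in (("x", (eps, 0.0)), ("y", (0.0, eps))):
    moved = list(coords)
    moved[2] = (coords[2][0] + delta[0], coords[2][1] + delta[1])
    print(f"dI/d{dim} at vertex 2 is roughly {(areal_integral(moved) - base) / eps:.4f}")
```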

4.
The paper describes a problem faced by National Statistical Offices when publishing the results of decennial censuses for small geographical areas. If they publish statistical tables for two or more sets of areas, users can compare the tables and produce new statistics for the areas formed by differencing, which may have populations below confidentiality thresholds. To investigate the problem, the authors construct a software system and carry out a series of experiments using a large synthetic population base for Yorkshire and Humberside. The results indicate that publishing statistics for zones close in size to the primary areas is not safe unless the zones have been carefully designed. However, publishing statistics for sufficiently large areas such as 5 km grid squares or postal sectors alongside enumeration districts is safe.
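To make the differencing risk concrete, here is a minimal sketch with made-up counts (not the authors' Yorkshire and Humberside test data): if counts are published for a large zone and for the enumeration districts wholly inside it, subtracting the two figures yields a count for the sliver of territory in between, which may fall below a confidentiality threshold.

```python
THRESHOLD = 10  # hypothetical minimum safe population for a published count

# Hypothetical published counts for two overlapping geographies.
count_zone_b = 412           # larger published zone
count_zone_a_within_b = 405  # enumeration districts wholly inside zone B

# A user can difference the two published figures to obtain the sliver population.
sliver = count_zone_b - count_zone_a_within_b
if sliver < THRESHOLD:
    print(f"Disclosure risk: differenced area has only {sliver} people")
```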

5.
This paper reports research to predict the distribution of An. minimus, a malaria vector in forest fringe areas, using GIS to support precision surveys for malaria control. Because An. minimus is a forest-associated species, generalized thematic maps (1:6,000,000) of forest cover, soil type, altitude, rainfall and temperature were used. Digitization, overlaying, integration and analysis of the thematic maps were done using Arc/Info 8.1 NT and Arc/View 3.2 (GIS, ESRI) software. GIS delineated favourable areas for An. minimus, i.e. areas where the species is likely to be found and where precision surveys can be conducted. Precision field surveys were carried out in selected locations within favourable and non-favourable areas. The species was found in all locations designated as favourable and was absent from non-favourable areas. In two districts, one where the species was reported to have disappeared in the early 1950s and the other where it was not reported in earlier surveys, GIS helped target precision surveys, and An. minimus was found. The technique can quickly cover vast and inaccessible areas and is easy to duplicate in other parts of the world to assist cost-effective malaria control. It can also delineate areas favourable for any species of flora and fauna to help precision surveys.
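The habitat-delineation step is essentially a raster overlay of environmental criteria. The sketch below shows the general idea with numpy: grid layers for forest cover, altitude and rainfall are combined into a boolean "favourable area" mask. The layers and thresholds here are invented placeholders, not the criteria used for An. minimus in the paper.

```python
import numpy as np

# Toy 4x4 thematic grids (placeholders for rasterized thematic map layers).
forest_cover = np.array([[1, 1, 0, 0],
                         [1, 1, 1, 0],
                         [0, 1, 1, 1],
                         [0, 0, 1, 1]])                  # 1 = forested
altitude_m   = np.full((4, 4), 300.0)                    # metres above sea level
rainfall_mm  = np.array([[900, 1100, 1300, 1500]] * 4, dtype=float)

# Hypothetical favourability criteria: forested, below 1000 m, rainfall >= 1000 mm.
favourable = (forest_cover == 1) & (altitude_m < 1000) & (rainfall_mm >= 1000)
print(favourable.astype(int))  # 1 = candidate area for a precision survey
```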

6.
For inherently vague and granular phenomena such as ecoregions, ecosystems, biomes, and biotopes, the interplay of granularity and vagueness leads to a trade-off in the classification and delineation of such phenomena: the goal of preciseness (lack of vagueness) of the delineation contradicts the goal of building a sophisticated classification system using the Aristotelian method of classification. This trade-off is based on the reliance on local qualities for a precise delineation of particular regions and the reliance on nonlocal qualities that serve as differentia in the Aristotelian classification. An ontological analysis of the logical interrelations between vagueness, granularity, and scale is critical for developing logically rigorous, nonlocal, and nonarbitrary classification and delineation systems for inherently vague and granular geographic phenomena.

7.
Since the Bonn 2011 conference, the “water-energy-food” (WEF) nexus has attracted global attention as a way to promote sustainable development. The WEF nexus is a complex, dynamic, and open system containing interrelated and interdependent elements. However, nexus studies have mainly focused on natural elements based on massive earth observation data. Human elements (e.g., society, economy, politics, culture) are described insufficiently, because traditional earth observation technologies cannot effectively ...

8.
9.
Kernel Density Estimation (KDE) is an important approach for analysing the spatial distribution of point features and linear features over 2-D planar space. Some network-based KDE methods have been developed in recent years, which focus on estimating the density distribution of point events over 1-D network space. However, the existing KDE methods are not appropriate for analysing the distribution characteristics of certain kinds of features or events, such as traffic jams, queues at intersections and taxi passenger-carrying events. These events occur and are distributed in 1-D road network space, and present a continuous linear distribution along the network. This paper presents a novel Network Kernel Density Estimation method for Linear features (NKDE-L) to analyse the space–time distribution characteristics of linear features over 1-D network space. We first analyse the density distribution of each linear feature along the network, then estimate the density distribution for the whole network space in terms of network distance and network topology. In the case study, we apply the NKDE-L to analyse the space–time dynamics of taxis' pick-up events, using real road network and taxi trace data from Wuhan. Taxis' pick-up events are defined and extracted as linear events (LE) in this paper. We first conduct space–time statistics of pick-up LE at different temporal granularities. We then analyse the space–time density distribution of the pick-up events in the road network using the NKDE-L, and uncover dynamic patterns of people's activities and traffic conditions. In addition, we compare the NKDE-L with the quadrat method and planar KDE. The comparison results demonstrate the advantages of the NKDE-L in analysing spatial distribution patterns of linear features in network space.
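To illustrate the basic ingredient of any network KDE (not the NKDE-L algorithm itself, which additionally handles linear events and topology at junctions), the sketch below evaluates a kernel density over a small toy road graph using shortest-path network distances from point events. The graph, bandwidth and kernel are arbitrary assumptions.

```python
import networkx as nx

def network_kde(graph, events, bandwidth=3.0):
    """Quartic-kernel density at every node, using network (shortest-path)
    distance from each event node instead of Euclidean distance."""
    density = {node: 0.0 for node in graph.nodes}
    for ev in events:
        dists = nx.single_source_dijkstra_path_length(graph, ev, weight="length")
        for node, d in dists.items():
            if d < bandwidth:
                density[node] += (1.0 - (d / bandwidth) ** 2) ** 2
    return density

# Toy road network: edges weighted by segment length.
G = nx.Graph()
G.add_weighted_edges_from(
    [("A", "B", 1.0), ("B", "C", 1.5), ("C", "D", 1.0),
     ("B", "E", 2.0), ("E", "F", 1.0)], weight="length")

print(network_kde(G, events=["B", "C"]))
```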

10.
The behaviour of the ice-dammed lake, Strupvatnet, Troms County, Norway, is described. Past observations are noted and related to observations in 1959 and to other ice-dammed lakes. Mechanisms for opening and maintaining water flow during a ‘hlaup’ are discussed. It is considered that Liestol's melt widening process operates after initiation by a pressure gradient across the dam. The lake and the internal drainage system of the glacier are thus linked. There is no evidence of lifting of the ice dam at Strupbreen.

11.
This paper presents a typology of local-government data sharing arrangements in the US at a time when spatial data infrastructures (SDI) are moving into a second generation. In the first generation, the US National Spatial Data Infrastructure (NSDI) theoretically involved a pyramid of data integration resting on local-government data sharing. Availability of local-government data is the foundation for all SDI-related data sharing in this model. However, first-generation SDI data-sharing activities and principles have gained only a tenuous hold in local governments. Some formalized data sharing occurs, but only infrequently in response to SDI programmes and policies. Previous research suggests that local-government data sharing aligns with immediate organizational and practical concerns rather than state or national policies and programmes. We present research findings echoing and extending these findings, showing that local-government data sharing is largely informal in nature and is undertaken to support existing governmental activities. NSDI principles remain simply irrelevant for the majority of surveyed local governments. The typology we present distinguishes four distinct types of local-government data sharing arrangements that reflect institutional, political, and economic factors. The effectiveness of second-generation, client-service-based SDI will be seriously constrained if the problems of local-government take-up are not addressed.

12.
New sources of data such as 'big data' and computational analytics have stimulated innovative pedestrian-oriented research. Current studies, however, are still limited and subjective in their use of Google Street View and other online sources for environment audits or pedestrian counts, because information extraction and compilation are manual, especially for large areas. This study aims to provide future research with an alternative method for conducting large-scale data collection on pedestrian counts, and possibly for environment audits, more consistently and objectively, and to stimulate discussion of the use of 'big data' and recent computational advances for planning and design. We explore and report the information needed to automatically download and assemble Google Street View images, along with other image parameters for a wide range of analysis and visualization, and explore extracting pedestrian count data from these images using machine vision and learning technology. Reliability test results, based on pedestrian information collected from over 200 street segments in Buffalo, NY, Washington, D.C., and Boston, MA, suggest that the image detection method used in this study is capable of determining the presence of pedestrians with a reasonable level of accuracy. Limitations and potential improvements of the proposed method are also discussed.
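As a rough sketch of this kind of pipeline (not the authors' implementation, which uses its own machine-vision models), the code below requests an image from the Google Street View Static API and counts pedestrian-like detections with OpenCV's stock HOG people detector. The location, image parameters and API key placeholder are assumptions, and a real deployment would batch requests along sampled street segments.

```python
import cv2
import numpy as np
import requests

# Google Street View Static API request (requires a valid API key).
params = {
    "size": "640x640",
    "location": "42.8864,-78.8784",  # hypothetical point in Buffalo, NY
    "heading": "90", "fov": "90", "pitch": "0",
    "key": "YOUR_API_KEY",
}
resp = requests.get("https://maps.googleapis.com/maps/api/streetview", params=params)
image = cv2.imdecode(np.frombuffer(resp.content, dtype=np.uint8), cv2.IMREAD_COLOR)

# Stock OpenCV HOG + linear-SVM people detector as a simple stand-in
# for the machine-vision step described in the abstract.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
boxes, weights = hog.detectMultiScale(image, winStride=(8, 8), padding=(8, 8), scale=1.05)
print(f"Pedestrian-like detections in this image: {len(boxes)}")
```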

13.
Stokkan, J. The potential model – an analysis of some methodological problems. Norsk geogr. Tidsskr. 29, 111–132.

The article is concerned with operational problems of the potential model. In connection with a general discussion of operational problems, a base model is constructed. This model is applied to a study area in order to determine the model's sensitivity to the area of the analytical unit, the distance exponent, and distances within the unit. The analysis shows that the potential is to a large degree dependent upon the definition of these three factors.
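For context, the potential model referred to here is usually written as V_i = sum over j of P_j / d_ij^b, where P_j is the mass (e.g. population) of unit j, d_ij is the distance between units i and j, b is the distance exponent, and the self-potential term uses an assumed within-unit distance. The sketch below computes it for toy data; the masses, coordinates, exponents and within-unit distances are illustrative assumptions, not values from the article.

```python
import math

def potential(units, exponent=2.0):
    """Potential V_i = sum_j P_j / d_ij**exponent for every unit i.
    Each unit is (x, y, mass, within_unit_distance)."""
    values = []
    for i, (xi, yi, _, di) in enumerate(units):
        v = 0.0
        for j, (xj, yj, pj, _) in enumerate(units):
            d = di if i == j else math.hypot(xi - xj, yi - yj)
            v += pj / d ** exponent
        values.append(v)
    return values

# Toy analytical units: (x, y, population, assumed within-unit distance).
units = [(0.0, 0.0, 50_000, 1.0),
         (10.0, 0.0, 20_000, 1.0),
         (0.0, 8.0, 35_000, 1.0)]

for b in (1.0, 2.0):  # the result is sensitive to the distance exponent
    print(f"b = {b}:", [round(v) for v in potential(units, exponent=b)])
```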

14.
This work deals with the identification of potentially contaminated areas using remote sensing, geographic information systems (GIS) and multi-criteria spatial analysis. The identification of unknown illegal landfills is a crucial environmental problem in all developed and developing countries, where a large number of illegal waste deposits exist as a result of fast, and relatively unregulated, industrial growth over the past century. The criteria used to perform the spatial analysis are selected by considering the characteristics which are ‘desirable’ for an illegal waste disposal site, chiefly related to the existence of roads for easy access and to a low population density which facilitates unnoticed dumping of illegal waste materials. A large dataset describing known legal and illegal landfills and the context of their location (population, road network, etc.) was used to perform a spatial statistical analysis to select factors and criteria allowing for the identification of the known waste deposits. The final result is a map describing the likelihood of an illegal waste deposit being located at any arbitrary location. This probability map is then used together with remote sensing techniques to narrow down the set of possibly contaminated sites (Silvestri and Omri, 2008), which are candidates for further analyses and field investigations. The integration of GIS and remote sensing is highlighted as a key instrument for environmental management and for the spatially-distributed characterization of possible uncontrolled landfill sites.
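A minimal sketch of the multi-criteria step, with invented layers and weights (not the criteria weights derived in the paper): per-cell criterion scores such as proximity to roads and low population density are normalized and combined into a single likelihood surface.

```python
import numpy as np

def normalize(layer):
    """Rescale a raster layer to the 0-1 range."""
    return (layer - layer.min()) / (layer.max() - layer.min())

# Toy 3x3 criterion rasters (placeholders for real, co-registered layers).
dist_to_road_m = np.array([[ 50., 400., 900.],
                           [120., 300., 700.],
                           [ 80., 250., 600.]])
pop_density    = np.array([[1200., 300.,  40.],
                           [ 900., 150.,  30.],
                           [ 600., 100.,  20.]])

# Easy road access (small distance) and low population density raise the score.
score = (0.6 * (1.0 - normalize(dist_to_road_m))
         + 0.4 * (1.0 - normalize(pop_density)))
print(np.round(score, 2))  # higher = more likely location for illegal dumping
```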

15.
Landsat ETM/TM data and an artificial neural network (ANN) were applied to analyse the expansion of the city of Xi'an and the land use/cover change of its surrounding area between 2000 and 2003. Supervised classification and the normalized difference barren index (NDBI) were used to delineate the urban boundary. Results showed that the urban area increased at an annual rate of 12.3%, expanding from 253.37 km^2 in 2000 to 358.60 km^2 in 2003. Large areas of farmland in the north and southwest were converted into urban construction land. The land use/cover changes of Xi'an were mainly caused by the rapid development of the urban economy, population immigration from the countryside, major infrastructure development such as transportation, and strong demand in urban markets. In addition, affected by the government policy of "returning farmland to woodland", some farmland was converted into economic woodland, such as Chinese gooseberry gardens and vineyards.
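For reference, the normalized difference barren index mentioned here is commonly computed from Landsat bands as NDBI = (SWIR - NIR) / (SWIR + NIR), with positive values tending to indicate built-up or barren surfaces. The sketch below applies that formula to toy reflectance arrays; the band values and the zero threshold are illustrative assumptions rather than values from this study.

```python
import numpy as np

def ndbi(swir, nir):
    """NDBI = (SWIR - NIR) / (SWIR + NIR), computed per pixel."""
    swir = swir.astype(float)
    nir = nir.astype(float)
    return (swir - nir) / (swir + nir + 1e-9)  # small epsilon avoids divide-by-zero

# Toy 2x3 reflectance arrays standing in for Landsat SWIR and NIR bands.
swir_band = np.array([[0.30, 0.28, 0.12],
                      [0.31, 0.15, 0.10]])
nir_band  = np.array([[0.22, 0.20, 0.35],
                      [0.21, 0.33, 0.30]])

index = ndbi(swir_band, nir_band)
built_up = index > 0  # illustrative threshold for built-up/barren pixels
print(np.round(index, 2))
print(built_up.astype(int))
```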

16.
1 Introduction  Urbanization process in Xi'an has been accelerated after the Chinese government implemented the strategy of Western Region Development of China about four years ago, which has caused the loss of farmland in urban peripheral areas. What is more, ur…

17.
Hourly precipitation data were collected from 143 first-order US weather stations during the period from 1980 to 2009 to assess the internal distribution of precipitation events lasting at least three hours. A total of 46,595 individual precipitation events were identified and evaluated using the mean, standard deviation, skewness, kurtosis, and the number of peaks occurring within an event. Mean event duration is longest along the West and Northwest coasts, the Mid-South, the Mid-Atlantic, and the Northeast, while shorter-duration events are more frequent in the Rocky Mountains, the Southwest, and the Great Plains. Mean event precipitation and standard deviation are greatest along the Gulf Coast and decrease inland. Precipitation events are positively skewed, indicating that more precipitation tends to occur earlier in the event. The most positively-skewed events are also located in regions flanking the Gulf of Mexico, while less-skewed events are common in the Northwest and Rocky Mountain regions. Event kurtosis is negative throughout the entire USA, with the most negative values generally west of the Front Range, where cyclonic development and transition produce more evenly distributed precipitation within storms. Intra-event precipitation maxima were also evaluated, with western Florida and the desert Southwest having the greatest number per event.
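As an illustration of event-level statistics of this kind (with made-up hourly totals, not station data from the study), the sketch below computes timing moments and the number of intra-event peaks for a single event. Treating the hourly totals as a weighted distribution over time, so that positive skewness means the rain is front-loaded, is an assumption about the computation; the study's exact definitions may differ.

```python
import numpy as np
from scipy.signal import find_peaks

# Hypothetical hourly precipitation totals (mm) for one 8-hour event.
event = np.array([2.0, 6.5, 4.0, 1.5, 3.5, 1.0, 0.5, 0.5])
hours = np.arange(len(event))
w = event / event.sum()               # precipitation-weighted distribution over time

mean_t = np.sum(w * hours)            # "centre of mass" of the event in time
var_t = np.sum(w * (hours - mean_t) ** 2)
skew_t = np.sum(w * (hours - mean_t) ** 3) / var_t ** 1.5    # > 0: rain front-loaded
kurt_t = np.sum(w * (hours - mean_t) ** 4) / var_t ** 2 - 3  # excess kurtosis
n_peaks = len(find_peaks(event)[0])   # interior local maxima within the event

print(f"mean hour {mean_t:.2f}, sd {np.sqrt(var_t):.2f}, "
      f"skew {skew_t:.2f}, kurtosis {kurt_t:.2f}, peaks {n_peaks}")
```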

18.
Modelling spatio-temporal dependencies resulting from dynamic processes that evolve in both space and time is essential in many scientific fields. Spatio-temporal kriging is one of the space-time procedures that has progressed the most over the last few years. Kriging predictions depend strongly on the covariance function associated with the stochastic process under study. Therefore, the choice of such a covariance function, which is usually based on the empirical covariance, is a core aspect of the prediction procedure. As the empirical covariance is not necessarily a permissible covariance function, it is necessary to fit a valid covariance model. Due to the complexity of these valid models in the spatio-temporal case, visualising them is of great help, at least when selecting the set of candidate models to represent the spatio-temporal dependencies suggested by the empirical covariogram. We focus on the visualisation of the most interesting stationary non-separable covariance functions and how they change as their main parameters take different values. We wrote specialised code for visualisation purposes. In order to illustrate the usefulness of visualisation when choosing an appropriate non-separable spatio-temporal covariance model, we focus on an important pollution problem, namely the levels of carbon monoxide in the city of Madrid, Spain.
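To give a concrete example of the kind of surface being visualised, the sketch below plots one widely used family of stationary non-separable space-time covariance functions (a Gneiting-type model, written here from standard references rather than from this paper) over a grid of spatial and temporal lags; the parameter values are arbitrary.

```python
import numpy as np
import matplotlib.pyplot as plt

def gneiting_cov(h, u, sigma2=1.0, a=1.0, alpha=1.0, beta=0.8, c=1.0, gamma=0.5):
    """One common parameterisation of a Gneiting-type non-separable space-time
    covariance (spatial dimension d = 2); h = spatial lag, u = temporal lag."""
    psi = a * np.abs(u) ** (2 * alpha) + 1.0
    return sigma2 / psi ** beta * np.exp(-c * np.abs(h) ** (2 * gamma) / psi ** (beta * gamma))

h = np.linspace(0.0, 3.0, 100)   # spatial lags
u = np.linspace(0.0, 3.0, 100)   # temporal lags
H, U = np.meshgrid(h, u)

plt.contourf(H, U, gneiting_cov(H, U), levels=20)
plt.colorbar(label="covariance")
plt.xlabel("spatial lag")
plt.ylabel("temporal lag")
plt.title("Gneiting-type non-separable covariance (illustrative parameters)")
plt.show()
```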

19.
The Digital Elevation Model that has been derived from the February 2000 Shuttle Radar Topography Mission (SRTM) has been one of the most important publicly available new spatial data sets in recent years. However, the ‘finished’ grade version of the data (also referred to as Version 2) still contains data voids (some 836,000 km2) and other anomalies that prevent immediate use in many applications. These voids can be filled using a range of interpolation algorithms in conjunction with other sources of elevation data, but there is little guidance on the most appropriate void-filling method. This paper describes: (i) a method to fill voids using a variety of interpolators, (ii) a method to determine the most appropriate void-filling algorithms using a classification of the voids based on their size and a typology of their surrounding terrain; and (iii) the classification of the most appropriate algorithm for each of the 3,339,913 voids in the SRTM data. Based on a sample of 1304 artificial but realistic voids across six terrain types and eight void size classes, we found that the choice of void-filling algorithm is dependent on both the size and terrain type of the void. Contrary to some previous findings, the best methods can be generalised as: kriging or inverse distance weighting interpolation for small and medium size voids in relatively flat low-lying areas; spline interpolation for small and medium-sized voids in high-altitude and dissected terrain; triangular irregular network or inverse distance weighting interpolation for large voids in very flat areas, and an advanced spline method (ANUDEM) for large voids in other terrains.
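As a minimal illustration of one of the candidate void-filling methods (inverse distance weighting, shown here on a toy grid rather than SRTM data), each void cell is filled from the surrounding valid cells weighted by inverse squared distance; the grid, power and global neighbourhood are assumptions for the example.

```python
import numpy as np

def idw_fill(dem, power=2.0):
    """Fill NaN cells of a DEM grid by inverse-distance-weighted interpolation
    from all valid cells (fine for a toy grid; real voids would use a local search)."""
    filled = dem.copy()
    valid_r, valid_c = np.where(~np.isnan(dem))
    for r, c in zip(*np.where(np.isnan(dem))):
        d2 = (valid_r - r) ** 2 + (valid_c - c) ** 2
        w = 1.0 / d2 ** (power / 2.0)
        filled[r, c] = np.sum(w * dem[valid_r, valid_c]) / np.sum(w)
    return filled

# Toy 4x4 elevation grid (metres) with a 2-cell void (NaN).
dem = np.array([[120., 122., 125., 127.],
                [121., np.nan, np.nan, 128.],
                [123., 124., 126., 129.],
                [125., 126., 128., 131.]])
print(np.round(idw_fill(dem), 1))
```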

20.
1 Introduction  Automated extraction of drainage features from DEMs is an effective alternative to the tedious manual mapping from topographic maps. The derived hydrologic characteristics include stream-channel networks, delineation of catchment boundaries, catchment area, catchment length, stream-channel long profiles and stream order, etc. Other important characteristics of river catchments, such as the stream-channel density, stream-channel bifurcation ratios, stream-channel order, number…
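To make the extraction step concrete, here is a minimal sketch (not from the paper) of the standard D8 flow-direction rule that underlies most automated drainage extraction: each cell drains toward the neighbour with the steepest downward slope, and channel networks are then traced from the resulting flow directions and accumulation.

```python
import numpy as np

# D8 neighbour offsets and their conventional direction codes (ESRI-style).
OFFSETS = {1: (0, 1), 2: (1, 1), 4: (1, 0), 8: (1, -1),
           16: (0, -1), 32: (-1, -1), 64: (-1, 0), 128: (-1, 1)}

def d8_flow_direction(dem):
    """Return a grid of D8 direction codes; 0 marks cells with no downslope neighbour."""
    rows, cols = dem.shape
    fdir = np.zeros_like(dem, dtype=int)
    for r in range(rows):
        for c in range(cols):
            best_code, best_slope = 0, 0.0
            for code, (dr, dc) in OFFSETS.items():
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    dist = np.hypot(dr, dc)  # 1 for edge neighbours, sqrt(2) diagonal
                    slope = (dem[r, c] - dem[rr, cc]) / dist
                    if slope > best_slope:
                        best_code, best_slope = code, slope
            fdir[r, c] = best_code
    return fdir

# Toy DEM sloping towards the lower-right corner.
dem = np.array([[9., 8., 7.],
                [8., 6., 5.],
                [7., 5., 3.]])
print(d8_flow_direction(dem))
```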
