11.
A Hierarchical Parallel simulation framework for spatially-explicit Agent-Based Models (HPABM) is developed to enable computationally intensive agent-based models for the investigation of large-scale geospatial problems. HPABM allows high-performance and parallel computing resources to be used to address the computational challenges of agent-based models. Within HPABM, an agent-based model is decomposed into a set of sub-models that function as computational units for parallel computing. Each sub-model comprises a subset of agents and their spatially-explicit environments. Sub-models are aggregated into a group of super-models that represent computing tasks. This design of super- and sub-models leads to loose coupling between agent-based models and the underlying parallel computing architecture. The utility of HPABM in enabling the development of parallel agent-based models was examined in a case study. Results of computational experiments indicate that HPABM is scalable for developing large-scale agent-based models and thus provides efficient support for large-scale geospatial simulation.
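The two-level decomposition described above (agents → sub-models → super-models dispatched as parallel tasks) can be sketched as a simple partition. The function name and group sizes below are illustrative only, not the actual HPABM API:

```python
def decompose(agents, agents_per_submodel, submodels_per_supermodel):
    """Partition agents into sub-models, then group sub-models into
    super-models that can be dispatched as independent parallel tasks."""
    submodels = [agents[i:i + agents_per_submodel]
                 for i in range(0, len(agents), agents_per_submodel)]
    supermodels = [submodels[i:i + submodels_per_supermodel]
                   for i in range(0, len(submodels), submodels_per_supermodel)]
    return supermodels

# 100 agents -> 10 sub-models of 10 agents -> 5 super-models of 2 sub-models
tasks = decompose(list(range(100)), 10, 2)
```

Each element of `tasks` is a self-contained unit of work, which is what keeps the model loosely coupled to whatever parallel backend (MPI, multiprocessing, etc.) executes it.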
12.
This paper explores the management of features that cross map-sheet boundaries (adjacent regions). To manage them effectively, Oracle's Spatial extension is used to store both the spatial data and the attribute data in an Oracle database. Data for any region of interest can then be downloaded through standard SQL statements, and the database automatically maintains data integrity. Managed this way in a spatial database, the map-sheet adjacency problem no longer arises.
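A window download of the kind described might look as follows. The table and column names are hypothetical; the SQL string uses Oracle Spatial's `SDO_FILTER` operator with an optimized-rectangle `SDO_GEOMETRY`, and no database connection is made here — only the query text is built:

```python
def window_query(table, geom_col, xmin, ymin, xmax, ymax):
    """Return SQL that downloads all features intersecting a query window,
    regardless of which original map sheet they came from."""
    return (
        f"SELECT * FROM {table} t WHERE SDO_FILTER(t.{geom_col}, "
        f"SDO_GEOMETRY(2003, NULL, NULL, "
        f"SDO_ELEM_INFO_ARRAY(1, 1003, 3), "  # 1003,3: optimized rectangle
        f"SDO_ORDINATE_ARRAY({xmin}, {ymin}, {xmax}, {ymax}))) = 'TRUE'"
    )

sql = window_query("features", "geom", 0, 0, 1000, 1000)
```

Because the database indexes one seamless layer, the query window can straddle any former sheet boundary.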
13.
Data refinement refers to the processes by which a dataset's resolution, in particular its spatial resolution, is refined; it is thus synonymous with spatial downscaling. Spatial resolution indicates measurement scale and can be seen as an index for regular data support. As a type of change of scale, data refinement is useful in many scenarios where the spatial scales of existing data, desired analyses, or specific applications need to be made commensurate and refined. Because spatial data are tied to a particular data support, they can be conceived of as support-specific realizations of random fields, suggesting that multivariate geostatistics should be explored for refining datasets from coarser-resolution versions to finer-resolution ones. In this paper, geostatistical methods for downscaling are described and implemented using GTOPO30 data and sampled Shuttle Radar Topography Mission data at a site in northwest China, with the majority of the latter's grid cells used as surrogate reference data. It was found that proper structural modeling is important for achieving increased accuracy in data refinement; here, structural modeling can be done by properly decomposing elevation fields into trends and residuals. It was also confirmed that the effects of semantic differences on data refinement can be reduced by properly estimating and incorporating biases in local means.
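The trend-plus-residual structural decomposition can be sketched as follows. A smoothing filter stands in for the geostatistical trend model, and simple spline resampling stands in for kriging of the residuals, so this is only an illustration of the decompose–refine–recombine idea, not the paper's method:

```python
import numpy as np
from scipy.ndimage import uniform_filter, zoom

def refine(coarse, factor):
    """Downscale a coarse elevation grid: decompose into a smooth trend and
    a residual, interpolate each component to the finer grid, recombine."""
    trend = uniform_filter(coarse, size=3, mode="nearest")
    resid = coarse - trend
    # Smooth cubic interpolation for the trend, gentler linear for residuals.
    return zoom(trend, factor, order=3) + zoom(resid, factor, order=1)

coarse = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 "GTOPO30" tile
fine = refine(coarse, 2)                           # refined to 8x8
```

In the paper's setting the residual refinement would instead use a fitted variogram, and local-mean biases would be estimated to handle semantic differences between the two elevation products.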
14.
Spatial co‐location pattern mining aims to discover collections of Boolean spatial features that are frequently located in close geographic proximity to each other. Existing methods for identifying spatial co‐location patterns usually require users to specify two thresholds: a prevalence threshold for measuring the prevalence of candidate co‐location patterns, and a distance threshold for searching for them. These two thresholds are difficult to determine in practice, and improper choices may cause useful patterns to be missed and meaningless patterns to be reported. The multi‐scale approach proposed in this study overcomes this limitation. First, the prevalence of candidate co‐location patterns is measured statistically using a significance test, with a non‐parametric model developed to construct the null distribution of features while accounting for spatial autocorrelation. Next, spatial co‐location patterns are explored at multiple scales rather than at a single scale (distance threshold). The validity of the co‐location patterns is evaluated based on the concept of lifetime. Experiments on both synthetic and ecological datasets show that the proposed method discovers spatial co‐location patterns correctly and completely, while significantly reducing the subjectivity in their discovery.
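The significance-testing idea can be sketched with a simple Monte-Carlo null. Note one deliberate simplification: the null below re-scatters one feature uniformly, whereas the paper's non-parametric null explicitly preserves spatial autocorrelation:

```python
import numpy as np

rng = np.random.default_rng(0)

def prevalence(a, b, d):
    """Fraction of A points having at least one B point within distance d."""
    dist = np.hypot(a[:, None, 0] - b[None, :, 0],
                    a[:, None, 1] - b[None, :, 1])
    return (dist.min(axis=1) <= d).mean()

def p_value(a, b, d, n_sim=199):
    """Monte-Carlo p-value: how often does a uniformly re-scattered B
    reach the observed prevalence? (Simplified null model.)"""
    obs = prevalence(a, b, d)
    hits = sum(prevalence(a, rng.random(b.shape), d) >= obs
               for _ in range(n_sim))
    return (hits + 1) / (n_sim + 1)

# Strongly co-located synthetic pattern: B points jittered around A points.
a = rng.random((30, 2))
b = a + rng.normal(0.0, 0.005, a.shape)
p = p_value(a, b, 0.02)
```

A small `p` flags the pattern {A, B} as significantly prevalent at scale `d`; sweeping `d` over a range of values gives the multi-scale view, with a pattern's "lifetime" being the span of scales over which it stays significant.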
15.
In this letter, a coherence-based technique for atmospheric artifact removal in ground-based (GB) zero-baseline synthetic aperture radar (SAR) acquisitions is proposed. Polarimetric measurements acquired with the GB-SAR sensor developed at the Universitat Politecnica de Catalunya are employed. The heterogeneous environment of Collserola Park on the outskirts of Barcelona, Spain, was selected as the test area, and data sets were acquired at X-band over one week in June 2005. The effects of atmospheric variations between successive zero-baseline SAR polarimetric acquisitions are treated in detail, and the need to compensate the resulting phase-difference errors when retrieving interferometric information is put forward. A compensation technique is then proposed and evaluated using control points placed inside the observed scene.
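A common way to compensate such atmospheric phase errors is sketched below under the simplifying assumption of a linear, range-dependent phase screen estimated on stable control points; the letter's coherence-based procedure is more elaborate, so treat this as an illustration of the compensation step only:

```python
import numpy as np

def compensate(phase, ranges, ctrl):
    """Fit an atmospheric phase screen phi = a + b*r on control points
    (assumed stable: zero true deformation) and subtract it everywhere."""
    slope, intercept = np.polyfit(ranges[ctrl], phase[ctrl], 1)
    return phase - (intercept + slope * ranges)

# Synthetic zero-baseline interferogram: no deformation, only a linear
# atmospheric ramp in range.
r = np.linspace(100.0, 1000.0, 50)          # slant range per pixel (m)
atmo = 0.3 + 2e-3 * r                       # atmospheric phase (rad)
ctrl = np.arange(0, 50, 10)                 # indices of control points
corrected = compensate(atmo.copy(), r, ctrl)
```

After compensation the residual phase on this synthetic scene is essentially zero, which is what allows genuine deformation signals to be read from real interferograms.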
16.
The significance of crop yield estimation is well known in agricultural management and policy development at regional and national levels. The primary objective of this study was to test the suitability of a method that estimates crop yield from predicted crop production with a MODIS-NDVI-based model on a regional scale. MODIS-NDVI data with a 250 m resolution were used to estimate winter wheat (Triticum aestivum L.) yield in one of the main winter-wheat-growing regions, located in Jining, Shandong Province. To improve the quality of the remote sensing data and the accuracy of the yield prediction, and in particular to eliminate cloud-contaminated and abnormal observations in the MODIS-NDVI series, the Savitzky–Golay filter was applied to smooth the 10-day NDVI data. The spatial accumulation of NDVI at the county level was tested for its relationship with winter wheat production in the study area, and a linear regression between the two was established using a stepwise regression method. The average yield was then derived by dividing the predicted production by the growing acreage of winter wheat at the county level. Finally, the results were validated against ground survey data, and the errors were compared with those of agro-climate models. The relative errors of the yield predicted using MODIS-NDVI were between −4.62% and 5.40%, and the overall RMSE of 214.16 kg ha−1 was lower than that of the agro-climate models (233.35 kg ha−1) in this study region. A good yield prediction for winter wheat could be obtained about 40 days before harvest, i.e. at the booting–heading stage. The method suggested in this paper is thus well suited to predicting regional winter wheat production and estimating yield.
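The processing chain (Savitzky–Golay smoothing of the NDVI series, county-level NDVI accumulation, linear regression against production, yield = production / acreage) can be sketched as follows. All numbers below are invented for illustration and are not the paper's data or coefficients:

```python
import numpy as np
from scipy.signal import savgol_filter

# A 10-day NDVI series for one county, with cloud-contaminated dips.
ndvi = np.array([0.20, 0.25, 0.10, 0.35, 0.42, 0.15, 0.55, 0.60, 0.58, 0.50])
smooth = savgol_filter(ndvi, window_length=5, polyorder=2)

# County-level regression of winter wheat production on accumulated NDVI.
acc = np.array([12.1, 14.3, 15.0, 16.8, 18.2])   # sum of NDVI per county
prod = np.array([30.5, 35.9, 37.8, 42.1, 45.6])  # production (1e4 t, toy)
slope, intercept = np.polyfit(acc, prod, 1)

# Predict production for a new county, then convert to average yield.
pred_prod = slope * 16.0 + intercept
yield_per_ha = pred_prod / 9.0   # divide by growing acreage (1e4 ha, toy)
```

The paper's stepwise regression would additionally select which accumulation periods enter the model; the forecast is usable once the accumulated NDVI through the booting–heading stage is available.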
17.
Lossy compression is being used increasingly in remote sensing, yet its effects on classification have scarcely been studied. This paper studies the implications of JPEG (JPG) and JPEG 2000 (J2K) lossy compression for image classification of forests in Mediterranean areas, exploring the impact of compression both on the images themselves and on the resulting classifications. The results indicate that classifications made with previously compressed, radiometrically corrected images and topoclimatic variables are not negatively affected by compression, even at quite high compression ratios. Indeed, compression can be applied at a compression ratio (CR, the ratio between the size of the original file and the size of the compressed file) of 10:1 or even 20:1 (for both JPG and J2K). Nevertheless, the fragmentation of the study area must be taken into account: in less fragmented zones, high CR are possible for both JPG and J2K, but in fragmented zones JPG is not advisable, and when J2K is used only a medium CR is recommended (3.33:1 to 5:1). Taking into account that J2K produces fewer artefacts at higher CR, the study not only contributes optimum CR recommendations but also finds that the J2K compression standard (ISO 15444-1) is better suited than JPG (ISO 10918-1) to image classification. Although J2K is computationally more expensive, this is no longer a critical issue with current computer technology.
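The CR definition and the recommendations above can be distilled into a small decision rule. This is one reading of the stated findings, not code from the study:

```python
def compression_ratio(original_bytes, compressed_bytes):
    """CR = size of original file / size of compressed file."""
    return original_bytes / compressed_bytes

def recommended_codecs(cr, fragmented):
    """Which codecs the study's findings support at a given CR:
    non-fragmented zones tolerate up to ~20:1 for both codecs; in
    fragmented zones JPG is not advisable and J2K only up to ~5:1."""
    if not fragmented:
        return {"JPG", "J2K"} if cr <= 20 else {"J2K"}
    return {"J2K"} if cr <= 5 else set()
```

For example, a 100 MB scene compressed to 10 MB has CR 10:1, which is acceptable for either codec in homogeneous forest but for neither in highly fragmented terrain.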
18.
Directional data are a scalar representation of vector data, and the usual interpolation methods for scalar data do not apply to them. Based on the characteristics of directional data, this paper decomposes them into components along the coordinate axes, interpolates each component to form a surface, and finally recombines the components to recover the directions, yielding an effective interpolation method for directional data. The method was verified experimentally; statistics and analysis of the interpolation results and their accuracy show that satisfactory results are obtained.
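The component-decomposition scheme can be sketched for a pair of directions. Note how it avoids the wrap-around error a naive average would make: the mean of 350° and 10° should be 0°, not 180°:

```python
import math

def interp_direction(d0, d1, t):
    """Interpolate two directions (degrees) by decomposing into x/y
    components, interpolating each, and recombining with atan2."""
    x = (1 - t) * math.cos(math.radians(d0)) + t * math.cos(math.radians(d1))
    y = (1 - t) * math.sin(math.radians(d0)) + t * math.sin(math.radians(d1))
    return math.degrees(math.atan2(y, x)) % 360.0

half = interp_direction(350, 10, 0.5)   # wraps correctly to ~0 deg
```

On a grid, the same idea applies per component: interpolate the cos- and sin-surfaces separately with any scalar method, then recombine cell by cell.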
19.
Based on the practical upgrading of the basic control network of Rui'an City, this paper discusses methods for upgrading local control networks and analyzes how existing surveying and mapping results can be transformed into the new coordinate datum.
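Transforming legacy results into a new datum is commonly done with a 2-D four-parameter (Helmert) similarity transformation estimated from points surveyed in both systems. The abstract does not specify its model, so the least-squares sketch below is an assumed, generic approach:

```python
import numpy as np

def fit_similarity(src, dst):
    """Least-squares four-parameter transformation from common points:
    x' = a*x - b*y + tx,   y' = b*x + a*y + ty."""
    n = len(src)
    A = np.zeros((2 * n, 4))
    A[0::2, 0] = src[:, 0]; A[0::2, 1] = -src[:, 1]; A[0::2, 2] = 1
    A[1::2, 0] = src[:, 1]; A[1::2, 1] =  src[:, 0]; A[1::2, 3] = 1
    params, *_ = np.linalg.lstsq(A, dst.reshape(-1), rcond=None)
    return params  # a, b, tx, ty

def apply_similarity(params, pts):
    a, b, tx, ty = params
    x, y = pts[:, 0], pts[:, 1]
    return np.column_stack([a * x - b * y + tx, b * x + a * y + ty])

# Recover a known transformation from five common points (toy data).
src = np.array([[0., 0.], [100., 0.], [0., 100.], [100., 100.], [50., 25.]])
true = np.array([0.9999, 0.0001, 500.0, -300.0])
dst = apply_similarity(true, src)
fitted = fit_similarity(src, dst)
```

With redundant common points, the least-squares residuals also serve as a quality check on the old survey results before the whole dataset is re-projected.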
20.
Two-dimensional entropy image segmentation has long been limited in practical applications by its long computation time. Drawing on ideas from biological immunity, this paper proposes an artificial immune algorithm for two-dimensional entropy image segmentation. Vaccination is introduced into the clonal selection algorithm to optimize the search for the optimal pair of segmentation thresholds. Experiments on high-resolution remote sensing images show that the algorithm not only finds the optimal threshold pair accurately, but also needs only 1.8% of the computation time of the traditional algorithm. The algorithm also verifies the feasibility and effectiveness of artificial immune ideas for image segmentation.
Copyright©北京勤云科技发展有限公司  京ICP备09084417号