92.
To address the difficulty of delineating object boundaries and the low classification accuracy in urban land-cover information extraction, this paper proposes an object-based classification method that jointly exploits imagery and LiDAR point-cloud elevation information. In segmentation, the optimal segmentation scale for each land-cover class is determined by supervised segmentation-accuracy assessment, and the final segmentation is synthesized with a granularity-theory-based multi-scale combination method, which accommodates the optimal scale of each class and yields accurate object boundaries. In classification, the ReliefF feature-selection algorithm ranks the importance of object features extracted from the imagery and point-cloud data so as to select the best feature subset, and a multiple-classifier ensemble is used to mitigate the Hughes phenomenon and improve classification accuracy. Experiments on two test areas in Stuttgart, Germany show that the method improves the accuracy and efficiency of fine-grained urban land-cover extraction over large areas and has considerable practical value.
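The ReliefF step can be sketched as follows. This is a minimal two-class simplification in Python/NumPy, not the authors' implementation (a full ReliefF weights the nearest "misses" by class priors); the function name and toy data are illustrative.

```python
import numpy as np

def relieff(X, y, n_neighbors=3, seed=0):
    """Minimal two-class ReliefF sketch: features that separate nearest
    'misses' (other class) better than nearest 'hits' (same class) get
    higher weights."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, float)
    y = np.asarray(y)
    n, d = X.shape
    # Rescale each feature to [0, 1] so per-feature differences are comparable.
    span = np.ptp(X, axis=0)
    span[span == 0] = 1.0
    Xs = (X - X.min(axis=0)) / span
    w = np.zeros(d)
    for i in rng.permutation(n):
        dist = np.abs(Xs - Xs[i]).sum(axis=1)           # L1 distance to sample i
        hit_idx = np.where((y == y[i]) & (np.arange(n) != i))[0]
        miss_idx = np.where(y != y[i])[0]
        hits = hit_idx[np.argsort(dist[hit_idx])[:n_neighbors]]
        misses = miss_idx[np.argsort(dist[miss_idx])[:n_neighbors]]
        w += np.abs(Xs[misses] - Xs[i]).mean(axis=0)    # reward class separation
        w -= np.abs(Xs[hits] - Xs[i]).mean(axis=0)      # penalize within-class scatter
    return w / n

# Synthetic check: feature 0 separates the classes, feature 1 is pure noise.
rng = np.random.default_rng(1)
X = np.column_stack([
    np.r_[np.zeros(20), np.ones(20)] + 0.05 * rng.standard_normal(40),
    rng.standard_normal(40),
])
y = np.r_[np.zeros(20, int), np.ones(20, int)]
w = relieff(X, y, n_neighbors=5)
```

Features would then be ranked by `w` and the top-scoring subset kept for the classifier ensemble.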
94.
Christopher Gold, Geo-spatial Information Science, 2016, 19(2): 157-167
Abstract: We attempt to describe the role of tessellated models of space within the discipline of Geographic Information Systems (GIS) – a speciality that grew largely out of geography and land surveying, where there was a strong need to represent information about the land's surface within a computer system rather than on the original paper maps. We look at some of the basic operations in GIS, including dynamic and kinetic applications, examine issues of topology and data structures, and produce a tessellation model that may be widely applied to both traditional "object" and "field" data types. Part I of this study examined object and field spatial models, the Voronoi extension of objects, and the graphs that express the resulting adjacencies; the required data structures were also briefly described, along with 2D and 3D structures and hierarchical indexing, and the importance of graph duality was emphasized. This second paper builds on the structures described in the first and examines how they may be modified: change may often be associated with either viewpoint or time. Incremental algorithms permit additional point insertion, with applications involving the addition of skeleton points for map scanning, contour enrichment, or watershed delineation and simulation. Dynamic algorithms permit skeleton smoothing and higher-order Voronoi diagram applications, including Sibson interpolation. Kinetic algorithms allow collision-detection applications, free-Lagrange flow modeling, and pen-movement simulation for map drawing. If desired, these methods may be extended to 3D. Based on this framework, it can be argued that tessellation models are fundamental to our understanding and processing of geographical space, and provide a coherent framework for understanding the "space" in which we exist.
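The object/field duality can be illustrated with a discrete Voronoi partition: each query point in the field is labeled with its nearest generating object. This Python/NumPy sketch is illustrative only, not code from the paper.

```python
import numpy as np

def voronoi_labels(sites, points):
    """Discrete Voronoi partition: index of the nearest site for each point."""
    # Pairwise Euclidean distances, shape (n_points, n_sites).
    d = np.linalg.norm(points[:, None, :] - sites[None, :, :], axis=2)
    return d.argmin(axis=1)

# Two sites; a grid of query points is split along the bisector x = 0.5.
sites = np.array([[0.0, 0.0], [1.0, 0.0]])
xs, ys = np.meshgrid(np.linspace(0, 1, 5), np.linspace(0, 1, 5))
grid = np.column_stack([xs.ravel(), ys.ravel()])
labels = voronoi_labels(sites, grid)
```

Cell adjacency (the dual graph discussed above) can then be read off wherever neighboring grid points carry different labels.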
95.
Obtaining long-term observed data for the sea areas of concern is usually very hard, or even impossible, in practical offshore and ocean engineering. In this paper, a new way to extend short-term data to long-term data is developed by means of the linear mean-square estimation method: long-term data for the sea area of concern are constructed from long-term records at neighboring oceanographic stations, through relevance analysis of the different data series. This compensates for the over-dependence of time-series prediction methods on the length of the data series, as well as the limit on the number of variables that can be adopted in a multiple linear regression model. Storm-surge data collected at three oceanographic stations on the Shandong Peninsula are taken as examples to analyze the effect of the number of reference stations (adjacent to the sea area of concern) and of the correlation coefficients between the reference sites and the project-construction site. Comparing the N-year return-period values calculated from the observed raw data with those from data series extended by linear mean-square estimation shows that the method gives considerably good estimates in practical ocean engineering, despite the different extreme-value distributions of the raw and extended data.
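The linear mean-square estimate amounts to a least-squares fit of the target station on its neighbors over the overlapping window, then hindcasting the missing years. A sketch in Python/NumPy; the station arrays are synthetic and `extend_series` is a hypothetical helper, not the paper's code.

```python
import numpy as np

def extend_series(target_overlap, refs_overlap, refs_full):
    """Fit target ~ a + refs @ b on the common window by least squares,
    then predict the target over the full reference record."""
    A = np.column_stack([np.ones(len(target_overlap)), refs_overlap])
    coef, *_ = np.linalg.lstsq(A, target_overlap, rcond=None)
    return np.column_stack([np.ones(len(refs_full)), refs_full]) @ coef

# Synthetic example: two neighbor stations observed for 50 "years",
# the target station only for the last 20.
rng = np.random.default_rng(0)
refs_full = rng.standard_normal((50, 2))
target_full = 2.0 + refs_full @ np.array([0.5, -0.3])
extended = extend_series(target_full[30:], refs_full[30:], refs_full)
```

With an exact linear relation, as here, the hindcast reproduces the full target record; real records would of course leave a residual.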
96.
This study considered the possibility of using visible and near-infrared (VNIR) spectral absorption feature parameters (SAFPs) to predict the concentration and map the distribution of heavy metals in sediments of the Takab area. In total, 60 sediment samples were collected along the main streams draining the mining districts and tailings sites, in order to measure the concentrations of As, Co, V, Cu, Cr, Ni, Hg, Ti, Pb and Zn and the reflectance spectra (350–2500 nm). The quantitative relationship between the SAFPs (Depth500nm, R610/500nm, R1344/778nm, Area500nm, Depth2200nm, Area2200nm, Asym2200nm) and the geochemical data was assessed using stepwise multiple linear regression (SMLR) and enter multiple linear regression (EMLR). The results showed a strong negative correlation of Ni and Cr with Area2200nm, and a significant positive correlation of As with Asym2200nm, of Ni and Co with Depth2200nm, and of Co, V and the total values with Depth500nm. The EMLR method yielded significant predictions of the Ni, Cr, Co and As concentrations from the spectral parameters, whereas the predictions for Zn, V and the total value were relatively weak. The spatial distribution pattern of the geochemical data showed that mining activities, along with the natural weathering of base-metal occurrences and rock units, have caused high concentrations of heavy metals in the sediments of the Sarough River tributaries.
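The correlation screening behind these results can be sketched as Pearson correlations between each SAFP column and each metal column (Python/NumPy; the data below are synthetic stand-ins, not the study's measurements):

```python
import numpy as np

def pearson_matrix(safps, metals):
    """Pearson correlation of every SAFP (column) with every metal (column)."""
    zs = (safps - safps.mean(0)) / safps.std(0)   # z-score each SAFP
    zm = (metals - metals.mean(0)) / metals.std(0)  # z-score each metal
    return zs.T @ zm / len(safps)

# Synthetic: 60 samples, an Area2200nm-like parameter anti-correlated with "Ni".
rng = np.random.default_rng(2)
area2200 = rng.standard_normal(60)
ni = -0.9 * area2200 + 0.1 * rng.standard_normal(60)
r = pearson_matrix(np.column_stack([area2200]), np.column_stack([ni]))
```

Strongly correlated SAFP–metal pairs found this way are the natural candidates for the subsequent regression models.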
97.
The automatic extraction of information content from remotely sensed data is always challenging. We suggest a novel fusion approach to improve the extraction of this information from mono-satellite images. A WorldView-2 (WV-2) pan-sharpened image and a 1/5000-scale topographic vector map (TOPO5000) were used as the sample data. First, the buildings and roads were manually extracted from WV-2 to establish the maximum extractable information content. Subsequently, object-based automatic extractions were performed. After obtaining the two-dimensional results, a normalized digital surface model (nDSM) was generated from the digital aerial photos underlying TOPO5000, and the automatic extraction was repeated with the nDSM fused in, so that individual object heights were included as an additional band for classification. The contribution was tested by precision, completeness and overall quality. The novel fusion technique increased the success of automatic extraction by 7% for the number of buildings and by 23% for the length of roads.
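An nDSM is simply the per-pixel difference between a surface model and a terrain model, clipped at zero and appended as an extra band. A minimal Python/NumPy sketch with toy arrays (not the study's data):

```python
import numpy as np

def ndsm(dsm, dtm):
    """Normalized DSM: object height above ground, never negative."""
    return np.clip(dsm - dtm, 0.0, None)

# Toy 2x2 scene: a 10 m building on terrain at 100 m elevation.
dtm = np.full((2, 2), 100.0)
dsm = np.array([[110.0, 110.0],
                [100.0,  99.8]])   # slight DSM noise below ground level
heights = ndsm(dsm, dtm)
bands = np.dstack([np.zeros((2, 2, 4)), heights])  # append as a 5th band
```

The zero placeholder stands in for the four spectral bands; in practice the pan-sharpened image would be stacked with `heights` before classification.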
98.
A geometric-based approach for road matching on multi-scale datasets using a genetic algorithm. Cited by: 1 (self-citations: 0, others: 1)
Alireza Chehreghan, Cartography and Geographic Information Science, 2018, 45(3): 255-269
Object matching is used in various applications, including conflation, data quality assessment, updating, and multi-scale analysis. The objective of matching is to identify objects that refer to the same entity. This article presents an optimization-based linear-object-matching approach for multi-scale, multi-source datasets. Taking geometric criteria into account, the proposed approach uses a real-coded genetic algorithm (RCGA) and sensitivity analysis to identify corresponding objects. Moreover, the approach eliminates any initial dependency on empirical parameters such as buffer distance, the spatial-similarity threshold, and criteria weights; instead, optimal values for these parameters are calculated for each dataset. Volunteered geographic information (VGI) and authoritative data of different scales and sources were used to assess the efficiency of the proposed approach. According to the results, in addition to performing efficiently on various datasets, the proposed approach identified the corresponding objects in these datasets appropriately, achieving a higher F-score.
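One common geometric criterion for linear-object matching is the Hausdorff distance between candidate polylines. The Python/NumPy sketch below is illustrative only; it is not necessarily the exact criterion set or the genetic algorithm of the article.

```python
import numpy as np

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two polylines, approximated on
    their vertex sets: the worst-case nearest-vertex distance."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    return max(d.min(axis=1).max(), d.min(axis=0).max())

road_a = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
road_b = road_a + np.array([0.0, 0.5])   # same road digitized 0.5 units off
```

A matcher would score every candidate pair with criteria like this and let the optimizer (here, the RCGA) tune thresholds and weights per dataset.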
99.
In mineral resource and reserve estimation, the accuracy of the ore bulk density directly affects how objective the resource estimate is. Accurately predicting bulk density requires a large amount of statistical analysis; the traditional approach estimates reserves using the arithmetic mean of the measured bulk densities, without considering how multiple ore components influence bulk density. This paper uses the Regression module of Excel's Data Analysis toolpak to perform a two-variable linear regression between laboratory-measured bulk density values and the corresponding grades, quickly and accurately building a mathematical model relating ore bulk density to grade, and thereby providing a more objective and scientific density model for resource and reserve estimation.
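The two-variable regression the abstract describes corresponds to an ordinary least-squares fit of density on two grades. A sketch in Python/NumPy with synthetic numbers (the coefficients below are invented, not measured values):

```python
import numpy as np

def fit_density_model(grades, density):
    """OLS fit density ~ b0 + b1*grade1 + b2*grade2; returns [b0, b1, b2]."""
    A = np.column_stack([np.ones(len(density)), grades])  # add intercept column
    coef, *_ = np.linalg.lstsq(A, density, rcond=None)
    return coef

# Synthetic: density rises with the first grade, falls slightly with the second.
rng = np.random.default_rng(3)
grades = rng.uniform(0.0, 10.0, size=(30, 2))
density = 2.7 + 0.05 * grades[:, 0] - 0.02 * grades[:, 1]
coef = fit_density_model(grades, density)
```

The fitted model then replaces the single arithmetic-mean density when estimating reserves block by block.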
100.
The impact of linearized physical processes on GRAPES 4DVAR assimilation. Cited by: 8 (self-citations: 3, others: 5)
Linearized physical processes improve the stability of minimization convergence in four-dimensional variational (4DVAR) assimilation and provide a more accurate description of atmospheric physics and dynamics during the minimization; they are a very important part of 4DVAR. By developing linearized physics in the GRAPES global model, in particular two moist linearized processes, the forecast accuracy of the tangent-linear model is improved, and with it the analysis and forecast performance of GRAPES global 4DVAR. Developing a linearized physical process first requires simplifying the strongly nonlinear terms of the original nonlinear scheme, and then regularizing the linearized scheme to suppress spurious growth of the tangent-linear perturbations. The linearized physics currently in the GRAPES global model comprises subgrid-scale orographic parameterization, vertical diffusion, cumulus deep convection, and large-scale condensation. The forecast accuracy of the linearized physics is verified by choosing initial perturbations of suitable magnitude (assimilation analysis increments) and comparing the zonal-mean error of the perturbation evolution between the nonlinear and tangent-linear models. Starting from the adiabatic version of the tangent-linear model, the 12 h forecast impact of the four linearized processes is then examined in winter and summer case experiments. The results show that adding the two dry schemes, subgrid orographic parameterization and vertical diffusion, effectively suppresses the spurious growth of low-level perturbations in the adiabatic tangent-linear model and greatly improves its forecasts. Adding the two moist schemes, cumulus deep convection and large-scale condensation, improves the approximation accuracy of the moist and temperature variables in the tangent-linear model over the tropics and the mid- and high latitudes, improving the analyses and forecasts of GRAPES global 4DVAR.
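The verification idea, comparing perturbation evolution in the nonlinear and tangent-linear models, can be shown on a toy scalar model; the logistic step below stands in for the full GRAPES model and is purely illustrative. As the perturbation shrinks, the nonlinear difference converges to the tangent-linear prediction.

```python
def model(x, dt=0.1):
    """Toy nonlinear forecast step (logistic growth), standing in for M."""
    return x + dt * x * (1.0 - x)

def tlm(x, dx, dt=0.1):
    """Tangent-linear of `model` about state x, applied to perturbation dx."""
    return (1.0 + dt * (1.0 - 2.0 * x)) * dx

# Ratio of nonlinear difference to tangent-linear prediction; should -> 1
# as the perturbation amplitude eps -> 0.
x, dx = 0.3, 1.0
ratios = [(model(x + eps * dx) - model(x)) / (eps * tlm(x, dx))
          for eps in (1e-1, 1e-2, 1e-3)]
```

The same diagnostic, applied field by field and averaged zonally, is what distinguishes a well-behaved linearized scheme from one whose perturbations grow spuriously.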