Search results 91–100 of 3029.
91.
郭庆胜  王琳  孙雅庚  周林  龙毅 《测绘学报》2016,45(7):850-857
In map generalization, the line-simplification and displacement operators are usually executed separately, and each of them can create new spatial conflicts. This paper coordinates the two operators so that spatial conflicts do not have to be detected repeatedly during generalization, improving the efficiency of map generalization data processing. Line simplification is converted into the displacement of points on the line, and displacement propagation paths are built between neighbouring map objects; under the constraints of spatial context relations and map perception rules, the displacement process can thus take line simplification into account while preserving the spatial characteristics of the map objects involved as far as possible. Finally, the effectiveness and feasibility of the algorithm are verified using a road and the buildings around it as an example.
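The paper's coordinated simplification–displacement algorithm is not reproduced here; the sketch below only illustrates the starting idea of viewing line simplification as per-vertex displacement, using Shapely's Douglas–Peucker simplification with an assumed tolerance and synthetic coordinates:

# Minimal sketch (not the authors' algorithm): express line simplification as
# per-vertex displacement vectors by projecting each original vertex onto the
# Douglas-Peucker simplified line. Such vectors could then feed a displacement
# propagation step toward neighbouring objects.
from shapely.geometry import LineString, Point

original = LineString([(0, 0), (1, 0.3), (2, -0.2), (3, 0.4), (4, 0)])
simplified = original.simplify(0.25)          # Douglas-Peucker tolerance (assumed value)

displacements = []
for x, y in original.coords:
    target = simplified.interpolate(simplified.project(Point(x, y)))  # nearest point on simplified line
    displacements.append((target.x - x, target.y - y))

for (x, y), (dx, dy) in zip(original.coords, displacements):
    print(f"vertex ({x:.1f}, {y:.1f}) -> displacement ({dx:+.2f}, {dy:+.2f})")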
92.
To address the difficulty of delineating object boundaries and the limited classification accuracy in urban land-cover information extraction, this paper proposes an object-based classification method that jointly exploits imagery and LiDAR point-cloud elevation information. For segmentation, the optimal segmentation scale of each land-cover class is determined by supervised segmentation accuracy assessment, and the final segmentation is synthesized with a granularity-theory-based scale combination method, so that the optimal scales of different classes are all respected and accurate object boundaries are obtained. For classification, the ReliefF feature selection algorithm ranks the importance of the object features derived from the imagery and the point cloud to select the best feature subset, and a multiple-classifier ensemble is used to suppress the Hughes phenomenon and improve accuracy. Classification experiments on two test areas in Stuttgart, Germany show that the method improves both the accuracy and the efficiency of fine-grained urban land-cover extraction over large areas and has considerable practical value.
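A rough sketch of the classification stage only, under stated substitutions: scikit-learn's mutual-information ranking stands in for ReliefF, a soft-voting ensemble stands in for the paper's multi-classifier combination, and the per-object features and labels are synthetic.

# Sketch of the classification stage: feature ranking followed by a
# multi-classifier ensemble. mutual_info_classif stands in for the ReliefF
# ranking used in the paper; features and labels are synthetic.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))            # per-object features from image + point cloud (synthetic)
y = rng.integers(0, 4, size=500)          # land-cover class labels (synthetic)

ensemble = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=100)),
                ("svm", SVC(probability=True)),
                ("knn", KNeighborsClassifier())],
    voting="soft")
model = make_pipeline(SelectKBest(mutual_info_classif, k=10), ensemble)
print("CV accuracy:", cross_val_score(model, X, y, cv=3).mean())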
93.
To further investigate data processing for mobile mapping systems, this paper derives, from the basic characteristics of point clouds, an original feature vector composed of 7 point-cloud features, and on that basis constructs an extended feature vector of 17 features that incorporates semantic context. A support vector machine (SVM) model is then applied in a series of experiments on recognizing street-tree points in vehicle-borne LiDAR point clouds. In the experiments, particle swarm optimization and a genetic algorithm are used to tune the SVM parameters; different feature vectors and different numbers of samples are used for learning and target recognition; and the learning curves and recognition accuracy of the feature vectors are analysed. The results show that the SVM model achieves high accuracy in street-tree point-cloud recognition.
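A minimal sketch of the recognition experiment, with grid search standing in for the paper's PSO/GA parameter tuning and synthetic 17-dimensional feature vectors in place of the real vehicle-borne LiDAR features:

# Sketch: an RBF-kernel SVM whose C and gamma are tuned by grid search
# (standing in for PSO/GA tuning); the per-point features are synthetic.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 17))               # extended point-cloud feature vectors (synthetic)
y = (X[:, 0] + X[:, 3] > 0).astype(int)       # 1 = street-tree point, 0 = other (synthetic labels)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

search = GridSearchCV(
    make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    param_grid={"svc__C": [1, 10, 100], "svc__gamma": [0.01, 0.1, 1.0]},
    cv=3)
search.fit(X_tr, y_tr)
print("best params:", search.best_params_, "test accuracy:", search.score(X_te, y_te))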
94.
We attempt to describe the role of tessellated models of space within the discipline of Geographic Information Systems (GIS) – a speciality coming largely out of Geography and Land Surveying, where there was a strong need to represent information about the land's surface within a computer system rather than on the original paper maps. We look at some of the basic operations in GIS, including dynamic and kinetic applications, examine issues of topology and data structures, and produce a tessellation model that may be widely applied to both traditional "object" and "field" data types. Part I of this study examined object and field spatial models, the Voronoi extension of objects, and the graphs that express the resulting adjacencies; the required data structures, 2D and 3D structures and hierarchical indexing were also briefly described, and the importance of graph duality was emphasized. This second paper builds on the structures described in the first and examines how they may be modified: change may often be associated with either viewpoint or time. Incremental algorithms permit additional point insertion and applications involving the addition of skeleton points, for map scanning, contour enrichment, or watershed delineation and simulation. Dynamic algorithms permit skeleton smoothing and higher-order Voronoi diagram applications, including Sibson interpolation. Kinetic algorithms allow collision detection applications, free-Lagrange flow modelling, and pen-movement simulation for map drawing. If desired, these methods may be extended to 3D. Based on this framework, it can be argued that tessellation models are fundamental to our understanding and processing of geographical space, and provide a coherent framework for understanding the "space" in which we exist.
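A small illustration of incremental site insertion into a Delaunay triangulation and its dual Voronoi diagram using SciPy; the data structures and algorithms developed in the paper itself are not reproduced, and the point sets are random.

# Minimal sketch of incremental point insertion into a Delaunay/Voronoi
# tessellation using SciPy.
import numpy as np
from scipy.spatial import Delaunay, Voronoi

rng = np.random.default_rng(2)
pts = rng.random((20, 2))

tri = Delaunay(pts, incremental=True)      # build an initial triangulation
tri.add_points(rng.random((5, 2)))         # incremental insertion of new sites
tri.close()
print("simplices after insertion:", len(tri.simplices))

vor = Voronoi(tri.points)                  # dual Voronoi diagram of the same sites
print("Voronoi regions:", len(vor.regions))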
95.
Obtaining long-term observed data for the sea areas of interest is usually very hard, or even impossible, in practical offshore and ocean engineering. In this paper, a new way to extend short-term data to long-term series is developed by means of the linear mean-square estimation method: long-term data for the sea area of interest are constructed from long-term records at neighbouring oceanographic stations through correlation analysis of the different data series. This overcomes the over-dependence of time-series prediction methods on the length of the data series, as well as the limitation on the number of variables in a multiple linear regression model. Storm-surge data collected at three oceanographic stations on the Shandong Peninsula are used as examples to analyse the effect of the number of reference stations (adjacent to the sea area of interest) and of the correlation coefficients between the reference sites and the project site. Comparing the N-year return-period values calculated from the observed raw data with those from data series extended by linear mean-square estimation shows that the method gives considerably good estimates for practical ocean engineering, despite the different extreme-value distributions of the raw and extended data.
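A sketch of the general idea, not the paper's exact formulation: fit a linear minimum mean-square estimator of the target station from the neighbouring stations on the overlapping short record, then apply it over the neighbours' long records. All series below are synthetic.

# Estimate the target station's series as a linear combination of the
# neighbouring stations' series, fitted on the short overlap and then
# applied to the full long period.
import numpy as np

rng = np.random.default_rng(3)
n_long, n_short = 2000, 300
neighbours_long = rng.normal(size=(n_long, 3))                   # long records at 3 reference stations
true_w = np.array([0.6, 0.3, 0.1])
target_short = neighbours_long[:n_short] @ true_w + 0.05 * rng.normal(size=n_short)

# Least-squares fit on the overlap = sample version of the linear
# minimum mean-square estimator
A = np.column_stack([neighbours_long[:n_short], np.ones(n_short)])
coef, *_ = np.linalg.lstsq(A, target_short, rcond=None)

# Extend the target record over the full long period
target_extended = np.column_stack([neighbours_long, np.ones(n_long)]) @ coef
print("fitted weights:", np.round(coef[:3], 3), "intercept:", round(coef[3], 3))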
96.
This study considered the possibility of using visible and near-infrared (VNIR) spectral absorption feature parameters (SAFPs) to predict the concentration and map the distribution of heavy metals in sediments of the Takab area. In total, 60 sediment samples were collected along the main streams draining the mining districts and tailing sites, and the concentrations of As, Co, V, Cu, Cr, Ni, Hg, Ti, Pb and Zn and the reflectance spectra (350–2500 nm) were measured. The quantitative relationship between the SAFPs (Depth500nm, R610/500nm, R1344/778nm, Area500nm, Depth2200nm, Area2200nm, Asym2200nm) and the geochemical data was assessed using stepwise multiple linear regression (SMLR) and enter multiple linear regression (EMLR). The results showed a strong negative correlation of Ni and Cr with Area2200nm, a significant positive correlation of As with Asym2200nm and of Ni and Co with Depth2200nm, as well as of Co, V and the total values with Depth500nm. The EMLR method yielded significant predictions of the Ni, Cr, Co and As concentrations from the spectral parameters, whereas the predictions for Zn, V and the total value were relatively weak. The spatial distribution pattern of the geochemical data showed that mining activities, along with the natural weathering of base-metal occurrences and rock units, have caused high concentrations of heavy metals in sediments of the Sarough River tributaries.
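A sketch of the "enter" regression step under assumptions: one metal (Ni) is regressed on the seven SAFPs with statsmodels, and the 60 samples below are synthetic placeholders rather than the Takab measurements.

# Regress a metal concentration on the seven spectral absorption
# feature parameters, entering all predictors at once (EMLR).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
safp_names = ["Depth500", "R610_500", "R1344_778", "Area500",
              "Depth2200", "Area2200", "Asym2200"]
X = rng.normal(size=(60, 7))                      # SAFP values (synthetic)
ni = 40 - 15 * X[:, 5] + 8 * X[:, 4] + rng.normal(scale=3, size=60)  # Ni (ppm, synthetic)

model = sm.OLS(ni, sm.add_constant(X)).fit()
print(model.summary(xname=["const"] + safp_names))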
97.
The automatic extraction of information content from remotely sensed data is always challenging. We suggest a novel fusion approach to improve the extraction of this information from mono-satellite images. A WorldView-2 (WV-2) pan-sharpened image and a 1:5000-scale topographic vector map (TOPO5000) were used as the sample data. Firstly, the buildings and roads were manually extracted from WV-2 to establish the maximum extractable information content. Subsequently, object-based automatic extractions were performed. After achieving two-dimensional results, a normalized digital surface model (nDSM) was generated from the digital aerial photos underlying TOPO5000, and the automatic extraction was repeated with the nDSM fused in, so that individual object heights were included as an additional band for classification. The contribution was assessed in terms of precision, completeness and overall quality. The novel fusion technique increased the success of automatic extraction by 7% for the number of buildings and by 23% for the length of roads.
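A minimal sketch of the fusion step, assuming synthetic arrays in place of the WV-2 image and the photogrammetric nDSM: the height layer is stacked as an additional band before classification.

# Stack a normalized digital surface model (nDSM) as an extra band on a
# multispectral image before classification. Arrays are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(5)
h, w = 64, 64
image = rng.random((h, w, 8))                 # 8-band pan-sharpened image (synthetic)
ndsm = rng.gamma(2.0, 3.0, size=(h, w, 1))    # object heights above ground, metres (synthetic)

fused = np.concatenate([image, ndsm], axis=2)         # height becomes band 9
pixels = fused.reshape(-1, fused.shape[2])
labels = (pixels[:, 8] > 6).astype(int)               # toy labels: "building" if taller than 6 m

clf = RandomForestClassifier(n_estimators=50).fit(pixels, labels)
print("training accuracy with fused height band:", clf.score(pixels, labels))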
98.
Object matching is used in various applications, including conflation, data quality assessment, updating, and multi-scale analysis. The objective of matching is to identify objects referring to the same entity. This article presents an optimization-based linear object-matching approach for multi-scale, multi-source datasets. Taking geometric criteria into account, the proposed approach uses a real-coded genetic algorithm (RCGA) and sensitivity analysis to identify corresponding objects. Moreover, any initial dependency on empirical parameters such as the buffer distance, the threshold on the spatial similarity degree, and the criterion weights is eliminated; instead, the optimal values of these parameters are calculated for each dataset. Volunteered geographic information (VGI) and authoritative data of different scales and sources were used to assess the efficiency of the proposed approach. The results show that, in addition to performing efficiently on various datasets, the proposed approach was able to identify the corresponding objects appropriately, achieving a higher F-score.
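A sketch of the weight-and-threshold optimization idea, with SciPy's differential evolution standing in for the paper's RCGA and synthetic similarity scores and match labels in place of real VGI/authoritative line pairs:

# Tune the weight of two geometric similarity criteria and a matching
# threshold to maximize F-score.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.metrics import f1_score

rng = np.random.default_rng(6)
n = 300
dist_sim = rng.random(n)                 # similarity from mean vertex distance (synthetic)
len_sim = rng.random(n)                  # similarity from length ratio (synthetic)
truth = ((0.7 * dist_sim + 0.3 * len_sim) > 0.55).astype(int)   # "same entity" labels (synthetic)

def neg_f1(params):
    w, thresh = params                   # weight of distance criterion, match threshold
    score = w * dist_sim + (1 - w) * len_sim
    return -f1_score(truth, (score > thresh).astype(int))

result = differential_evolution(neg_f1, bounds=[(0, 1), (0, 1)], seed=0)
print("optimal weight, threshold:", np.round(result.x, 3), "F-score:", -round(result.fun, 3))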
99.
In mineral resource and reserve estimation, the accuracy of the ore bulk density directly affects how objective the estimate is. Accurately predicting the ore bulk density requires a large amount of statistical analysis; the traditional approach estimates the deposit's resources and reserves with the arithmetic mean of the measured bulk densities, without considering the influence of multiple ore components on bulk density. In this paper, the "Regression" function of Excel's Data Analysis module is used to perform a two-predictor linear regression between the laboratory-measured ore bulk densities and the corresponding grades, quickly and accurately building a mathematical model relating bulk density to grade, and thereby providing a more objective and scientific bulk-density model for resource and reserve estimation.
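The same two-predictor regression can also be reproduced outside Excel; the sketch below uses statsmodels, with synthetic grades and densities standing in for the laboratory measurements.

# Ordinary least squares of measured bulk density on two ore grades,
# equivalent to Excel's Data Analysis "Regression" output.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
grade_a = rng.uniform(0.5, 5.0, size=40)      # grade of component A, % (synthetic)
grade_b = rng.uniform(0.1, 2.0, size=40)      # grade of component B, % (synthetic)
density = 2.7 + 0.08 * grade_a + 0.15 * grade_b + rng.normal(scale=0.03, size=40)  # t/m^3

X = sm.add_constant(np.column_stack([grade_a, grade_b]))
fit = sm.OLS(density, X).fit()
print(fit.params)        # intercept and the two grade coefficients
print("R^2:", round(fit.rsquared, 3))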
100.
The impact of linearized physical processes on GRAPES 4DVAR assimilation
Linearized physical processes improve the stability of minimization convergence in four-dimensional variational (4DVAR) assimilation and provide a more accurate description of atmospheric physical and dynamical processes during the minimization; they are an essential component of 4DVAR. This work develops linearized physics in the GRAPES global model, in particular two moist linearized schemes, to improve the forecast accuracy of the tangent-linear model and thereby the analysis and forecast performance of GRAPES global 4DVAR. Developing a linearized scheme first requires simplifying the strongly nonlinear terms of the original nonlinear parameterization, and then regularizing the linearized scheme to suppress spurious growth of tangent-linear perturbations. The linearized physics currently in the GRAPES global model comprises subgrid-scale orographic parameterization, vertical diffusion, cumulus deep convection, and large-scale condensation. The forecast accuracy of the linearized physics is verified by choosing initial perturbations of appropriate magnitude (assimilation analysis increments) and comparing the zonal-mean error of the perturbation evolution between the nonlinear and tangent-linear models. Starting from the adiabatic version of the tangent-linear model, the 12-h forecast impact of the four linearized schemes is then examined in one winter and one summer case. The results show that adding the two dry schemes, subgrid-scale orographic parameterization and vertical diffusion, effectively suppresses the spurious growth of low-level perturbations in the adiabatic tangent-linear model and greatly improves its forecast skill, while adding the two moist schemes, cumulus deep convection and large-scale condensation, improves the tangent-linear approximation of the moisture and temperature variables in the tropics and in middle and high latitudes, thereby improving the analysis and forecast performance of GRAPES global 4DVAR.
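A toy illustration of the verification idea only, not of GRAPES itself: the Lorenz-63 system stands in for the forecast model, and the perturbation evolved by the tangent-linear (Jacobian) model is compared with the difference of two nonlinear runs.

# Compare the evolution of a perturbation in a nonlinear model against its
# tangent-linear model; the relative error measures how well the
# tangent-linear approximation holds.
import numpy as np

def lorenz_step(x, dt=0.01, s=10.0, r=28.0, b=8.0/3.0):
    dx = np.array([s * (x[1] - x[0]),
                   x[0] * (r - x[2]) - x[1],
                   x[0] * x[1] - b * x[2]])
    return x + dt * dx

def tl_step(x, px, dt=0.01, s=10.0, r=28.0, b=8.0/3.0):
    # Tangent-linear (Jacobian) propagation of perturbation px around trajectory x
    J = np.array([[-s, s, 0.0],
                  [r - x[2], -1.0, -x[0]],
                  [x[1], x[0], -b]])
    return px + dt * (J @ px)

x = np.array([1.0, 1.0, 1.0])
dx = 1e-3 * np.array([1.0, -1.0, 0.5])     # "analysis increment"-like initial perturbation
xp, p = x + dx, dx.copy()
for _ in range(500):                        # integrate both models along the trajectory
    p = tl_step(x, p)
    x, xp = lorenz_step(x), lorenz_step(xp)

nonlinear_diff = xp - x
print("relative TL error:", np.linalg.norm(p - nonlinear_diff) / np.linalg.norm(nonlinear_diff))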