12.
DEM and remote sensing (RS) technology are important data sources and tools for studying landslide geological hazards. In recent years, with the launch of high-spatial-resolution remote sensing satellites and high-precision radar satellites, up-to-date, high-precision DEMs can be acquired, raising landslide hazard research from two dimensions to three. Using a 5 m DEM generated from IRS-P5 data, and drawing on Google's 3D visualization principle, the DEM was overlaid with high-spatial-resolution QuickBird (0.61 m) imagery on a digital globe to produce 3D visual images, and methods for extracting landslide environmental indicator parameters were studied. The results show that the method can directly read the 3D parameters of landslide environmental indicators; it is objective, accurate and fast, and can provide quantitative data for landslide hazard assessment and regional geological hazard risk evaluation.
13.
Wenwen Li, Sizhe Wang. International Journal of Geographical Information Science, 2017, 31(8): 1562-1582
Increasing research interest in global climate change and rising public awareness have generated significant demand for new tools that support effective visualization of big climate data in a cyber environment, such that anyone from any location with an Internet connection and a web browser can easily view and comprehend the data. In response to this demand, this paper introduces a new web-based platform for visualizing multidimensional, time-varying climate data on a virtual globe. The platform is built upon Cesium, an open-source virtual globe system that is highly extensible and easily integrated into a web environment. The emerging WebGL technique is adopted to support interactive rendering of 3D graphics with hardware graphics acceleration. To address the challenges of transmitting and visualizing voluminous, complex climate data over the Internet in real time, we develop a stream encoding and transmission strategy based on video-compression techniques. This strategy allows dynamic provision of scientific data at different precisions to balance the needs of scientific analysis against visualization cost. Approaches to represent, encode and decode the processed data are also introduced in detail to show the operational workflow. Finally, we conduct several experiments to demonstrate the performance of the proposed strategy under different network conditions. A prototype, PolarGlobe, has been developed to visualize climate data in the Arctic regions from multiple angles.
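The multi-precision streaming idea in this abstract can be illustrated with a minimal quantization sketch. The actual PolarGlobe pipeline uses video codecs; the 8-bit scale/offset scheme below is only an illustrative assumption, showing how a float field might be packed into an integer frame at a chosen precision and decoded with bounded error:

```python
import numpy as np

def encode(field, bits=8):
    """Quantize a float field to an integer grid plus scale/offset,
    mimicking how a value channel could be packed into a video frame."""
    lo, hi = float(field.min()), float(field.max())
    levels = 2 ** bits - 1
    q = np.round((field - lo) / (hi - lo) * levels).astype(np.uint16)
    return q, lo, hi, levels

def decode(q, lo, hi, levels):
    """Recover an approximation of the original field."""
    return q.astype(np.float64) / levels * (hi - lo) + lo

# Hypothetical temperature grid standing in for one climate-data time slice
temps = np.random.default_rng(0).uniform(-40.0, 10.0, size=(64, 64))
q, lo, hi, levels = encode(temps, bits=8)
max_err = np.abs(decode(q, lo, hi, levels) - temps).max()
```

Raising `bits` trades bandwidth for precision, which is the balance between analysis fidelity and visualization cost that the abstract describes.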
14.
P. Bogaert, D. Fasbender. Stochastic Environmental Research and Risk Assessment (SERRA), 2007, 21(6): 695-709
In spite of the exponential growth in the amount of data that one may expect to provide greater modeling and prediction opportunities, the number and diversity of sources over which this information is fragmented is growing at an even faster rate. As a consequence, there is a real need for methods that aim at reconciling them inside an epistemically sound theoretical framework. In a statistical spatial prediction framework, classical methods are based on a multivariate approach to the problem, at the price of strong modeling hypotheses. Though new avenues have recently been opened by focusing on the integration of uncertain data sources, to the best of our knowledge there have been no systematic attempts to explicitly account for information redundancy through a data fusion procedure. Starting from the simple concept of measurement errors, this paper proposes an approach for integrating multiple information processing as a part of the prediction process itself through a Bayesian approach. A general formulation is first proposed for deriving the prediction distribution of a continuous variable of interest at unsampled locations using more or less uncertain (soft) information at neighboring locations. The case of multiple pieces of information is then considered, with a Bayesian solution to the problem of fusing multiple pieces of information that are provided as separate conditional probability distributions. Well-known methods and results are derived as limit cases. The convenient hypothesis of conditional independence is discussed in the light of information theory and the maximum entropy principle, and a methodology is suggested for the optimal selection of the most informative subset of information, if needed. Based on a synthetic case study, an application of the methodology is presented and discussed.
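Under the conditional independence hypothesis the abstract discusses, fusing several pieces of soft information about the same quantity reduces, in the Gaussian special case, to a precision-weighted product of densities. The sketch below is only that special case with illustrative numbers, not the paper's general formulation:

```python
def fuse_gaussians(estimates):
    """Fuse independent Gaussian pieces of information (mean, variance)
    about the same quantity: the product of the densities is Gaussian,
    with precision equal to the sum of the individual precisions and
    mean equal to the precision-weighted average of the means."""
    total_precision = sum(1.0 / v for _, v in estimates)
    mean = sum(m / v for m, v in estimates) / total_precision
    return mean, 1.0 / total_precision

# Two hypothetical soft data: a precise source (variance 1) and a
# vague one (variance 4) reporting on the same unsampled value
mean, var = fuse_gaussians([(10.0, 1.0), (14.0, 4.0)])
# The fused mean (10.8) is pulled toward the more precise source,
# and the fused variance (0.8) is smaller than either input variance
```

Redundant sources would violate the independence assumption and make this product over-confident, which is exactly why the paper argues for accounting for information redundancy explicitly.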
15.
The northwestern Sichuan region is an important target area for lode-gold reconnaissance in China. This paper introduces the geological characteristics of the Songpan area, the principles of universal kriging, and its application in this region; it presents the main computational steps and basic results, gives geological interpretations of the significant anomaly zones obtained, and compares the method with trend-surface analysis.
17.
Applying Google Earth to a remote sensing image data archiving system is a new approach to archiving remote sensing data. It integrates the archived result files and overcomes the drawbacks of traditional archiving methods, which cannot quickly view the data's rendering, location and attribute information, and it meets the requirement of informatized work procedures for fast response from the image database. ArcGIS supplements Google Earth's weak spatial analysis capability. The system is further combined with Excel spreadsheets, ...
18.
Surveying and mapping technicians need to write documents such as design specifications, technical summaries and monitoring plans, and the drawing tools built into Word 2003 fall far short of these requirements. This paper introduces three methods of inserting AutoCAD 2004 graphics into Word 2003 documents: copy and paste, converting the image format, and using Insert Object.
19.
Spatial clustering is widely used in many fields, such as WSNs (Wireless Sensor Networks), web clustering and remote sensing, to discover groups and identify interesting distributions in the underlying database. By discussing the relationships between the optimal clustering and the initial seeds, a clustering validity index and a principle for seeking initial seeds are proposed, and on this principle we recommend an initial seed-seeking strategy: SSPG (Single-Shortest-Path Graph). With the SSPG strategy used in clustering algorithms, we find that the clustering result is optimized with higher probability. Finally, based on combinatorial optimization theory, a method is proposed to obtain an optimal reference value of the cluster number k, and it is proven to be efficient.
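The SSPG strategy itself is specific to this paper, but the general idea of choosing initial seeds that respect the data's structure rather than picking them at random can be illustrated with a farthest-first heuristic, used here purely as a hypothetical stand-in:

```python
import math

def farthest_first_seeds(points, k):
    """Pick k initial seeds that are mutually far apart: start from the
    first point, then repeatedly add the point whose minimum distance
    to the already-chosen seeds is largest. Well-separated seeds make
    it more likely that each true group receives its own seed."""
    seeds = [points[0]]
    while len(seeds) < k:
        far = max(points, key=lambda p: min(math.dist(p, s) for s in seeds))
        seeds.append(far)
    return seeds

# Three hypothetical groups around (0,0), (10,10) and (0,10)
pts = [(0, 0), (0.2, 0.1), (10, 10), (9.9, 9.8), (0, 10)]
seeds = farthest_first_seeds(pts, 3)
# One seed lands in each group rather than two in the same group
```

A graph-based strategy such as SSPG would replace the Euclidean `math.dist` with shortest-path distances, so that seed separation follows the connectivity of the data rather than straight-line geometry.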
20.
Gennady A. Kivman. Surveys in Geophysics, 1997, 18(6): 621-643
The problem of deriving tidal fields from observations has, owing to the incompleteness and imperfection of every practically available data set, an infinitely large number of allowable solutions fitting the data within measurement errors, and hence can be treated as ill-posed. Therefore, interpolating the data always relies on some a priori assumptions concerning the tides, which provide a rule of sampling or, in other words, a regularization of the ill-posed problem. Data assimilation procedures used in large-scale tide modeling are viewed in a common mathematical framework as such regularizations. It is shown that all of them (basis-function expansion, parameter estimation, nudging, objective analysis, general inversion, and extended general inversion), including those (objective analysis and general inversion) originally formulated in stochastic terms, may be considered as utilizations of one of the three general methods suggested by the theory of ill-posed problems. The problem of grid refinement, critical for inverse methods and nudging, is discussed.
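The regularization idea that unifies the assimilation schemes above can be illustrated with the simplest textbook instance, Tikhonov regularization of an ill-conditioned least-squares problem. This is a generic numerical sketch of the principle, not any of the tidal assimilation procedures themselves; the matrix and noise level are illustrative assumptions:

```python
import numpy as np

def tikhonov_solve(A, b, alpha):
    """Minimize ||A x - b||^2 + alpha ||x||^2 by solving the normal
    equations (A^T A + alpha I) x = A^T b. The penalty term encodes the
    a priori assumption (prefer small x) that tames the ill-posed fit."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ b)

# A nearly rank-deficient "observation operator": many x fit b almost
# equally well, so the unregularized inverse wildly amplifies noise
rng = np.random.default_rng(1)
A = np.array([[1.0, 1.0], [1.0, 1.000001]])
b = np.array([2.0, 2.0]) + rng.normal(0.0, 1e-4, size=2)
x_reg = tikhonov_solve(A, b, alpha=1e-3)
# The regularized solution stays bounded despite the ill-conditioning
```

Different choices of penalty correspond to different a priori assumptions about the field, which is the sense in which the abstract treats each assimilation procedure as a particular regularization.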