41.
Constitutive modeling of granular materials has been a subject of extensive research for many years. While the calculation of the Cauchy stress tensor using the discrete element method is well established in the literature, the formulation and interpretation of the strain tensor are less well documented. According to Bagi [1], researchers mostly adopt well-known continuum or discrete microstructural approaches to calculate strains within granular materials. However, neither approach fully captures the behavior of granular materials; the two are considered complementary, each with its own strengths and limitations in solving granular-mechanics problems. Zhang and Regueiro [2] proposed an equivalent-continuum approach for calculating finite strain measures at the local level in granular materials subjected to large deformations, and used three-dimensional discrete element method results to evaluate the proposed strain measures. This paper presents an experimental application of the Zhang and Regueiro [2] approach using three-dimensional synchrotron microcomputed tomography images of a sheared Ottawa sand specimen. Invariant Eulerian finite strain measures were calculated for representative element volumes within the specimen. Spatial maps of Eulerian octahedral shear and volumetric strain were used to identify zones of intense shearing within the specimen and compared well with maps of incremental particle translation and rotation for the same specimen. The local Eulerian volumetric strain was also compared with the global volumetric strain, which can be regarded as an average of all local Eulerian volumetric strains.
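The invariants involved can be illustrated with a short numerical sketch. This assumes the Eulerian (Almansi) strain tensor e = (I - b^-1)/2 with b = F F^T, one common finite-strain definition; the paper's exact formulation follows Zhang and Regueiro [2] and may differ in detail. The volumetric and octahedral shear invariants follow from the principal strains:

```python
import numpy as np

def eulerian_strain_invariants(F):
    """Volumetric and octahedral shear invariants of the Eulerian
    (Almansi) strain tensor e = (I - b^-1)/2, where b = F F^T is the
    left Cauchy-Green tensor of the deformation gradient F."""
    b = F @ F.T
    e = 0.5 * (np.eye(3) - np.linalg.inv(b))
    e1, e2, e3 = np.linalg.eigvalsh(e)            # principal strains
    vol = e1 + e2 + e3                            # volumetric invariant (trace)
    oct_shear = (2.0 / 3.0) * np.sqrt(
        (e1 - e2) ** 2 + (e2 - e3) ** 2 + (e3 - e1) ** 2
    )
    return vol, oct_shear

# Pure dilation, F = 1.1 * I: the octahedral shear invariant vanishes.
vol, gamma = eulerian_strain_invariants(1.1 * np.eye(3))
```

For a pure dilation the shear invariant is zero by construction, which is a quick sanity check on the implementation.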
43.
Artificial neural networks (ANNs) are a popular class of techniques for performing soft classification of satellite images. They have been applied successfully to estimating crop areas through sub-pixel classification of medium- to low-resolution images. Before a network can be used for classification and estimation, however, it has to be trained, and collecting the reference area fractions needed to train an ANN is often both time-consuming and expensive. This study focuses on strategies for reducing the effort needed to collect the necessary reference data without compromising the accuracy of the resulting area estimates. Two aspects were studied: (i) the spatial sampling scheme and (ii) the possibility of reusing trained networks in multiple consecutive seasons. Belgium was chosen as the study area because of the vast amount of reference data available. Time series of monthly NDVI composites from both SPOT-VGT and MODIS were used as the network inputs. The results showed that accurate regional crop area estimation (R2 > 80%) is possible using only 1% of the entire area for network training, provided that the training samples are representative of the land-use variability present in the study area. Limiting the training samples to a specific subset of the population, either geographically or thematically, significantly decreased the accuracy of the estimates. The results also indicate that using ANNs trained with data from one season to estimate area fractions in another season is not recommended: the interannual variability observed in the endmembers' spectral signatures underlines the importance of up-to-date training samples. It can thus be concluded that the representativeness of the training samples, in both the spatial and the temporal sense, is an important issue in crop area estimation with ANNs that should not be ignored.
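As a toy illustration of sub-pixel area estimation with a neural network (not the study's actual network, data, or sampling scheme), the sketch below trains a small one-hidden-layer network on synthetic NDVI mixtures; the endmember signatures, network size, and training settings are all invented for the example. A softmax output keeps the predicted fractions non-negative and summing to one:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 3 land-cover classes, each with a 12-month NDVI
# signature; a pixel's NDVI series is a fraction-weighted mixture.
endmembers = rng.uniform(0.2, 0.9, size=(3, 12))
fractions = rng.dirichlet(np.ones(3), size=500)        # reference area fractions
X = fractions @ endmembers + rng.normal(0, 0.01, (500, 12))

# One-hidden-layer network with a softmax output layer.
W1 = rng.normal(0, 0.5, (12, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 3));  b2 = np.zeros(3)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    z = h @ W2 + b2
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return h, e / e.sum(axis=1, keepdims=True)

lr = 0.5
for _ in range(3000):                                  # full-batch gradient descent
    h, p = forward(X)
    g = (p - fractions) / len(X)                       # softmax cross-entropy gradient
    gh = (g @ W2.T) * (1.0 - h ** 2)                   # backprop through tanh
    W2 -= lr * h.T @ g;  b2 -= lr * g.sum(0)
    W1 -= lr * X.T @ gh; b1 -= lr * gh.sum(0)

_, pred = forward(X)
rmse = float(np.sqrt(np.mean((pred - fractions) ** 2)))
```

Training on only a representative subset of pixels, as the study recommends, amounts to fitting this network on a small but well-spread sample of `(X, fractions)` pairs.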
44.
汤曦 (Tang Xi), 《四川测绘》, 2000, 23(2): 58-60
This paper describes several common methods for surveying and recording longitudinal and cross sections. Through data transfer and data-format conversion, section drawings are generated automatically in the AutoCAD R14 environment using VB6 and AutoLISP, producing electronic files in a standard format.
45.
In marine seismic data processing, the acquisition footprint caused by the uneven spatial distribution of receivers degrades imaging, and the effect is especially pronounced for 3D data acquired with a dual-source, single-streamer geometry. Based on a detailed analysis of the origin and characteristics of the acquisition footprint in dual-source, single-streamer 3D gas-hydrate seismic data from an offshore survey area, the footprint was suppressed pre-stack with wavelet processing, tidal and water-velocity corrections, and bin centering, and post-stack with frequency-wavenumber domain filtering. Applying these key techniques improved the signal-to-noise ratio and resolution of the data: the footprint is clearly attenuated on both time slices and sections, while amplitude, frequency, and phase attributes are preserved to the greatest extent possible. The result provides reliable seismic data for the detailed delineation of geological structures and for predicting the distribution of gas-hydrate reservoirs and favorable accumulation zones.
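The post-stack wavenumber-domain filtering step can be illustrated on a synthetic time slice. Here a periodic stripe pattern standing in for the acquisition footprint is notched out in the 2D wavenumber domain; the slice contents, stripe period, and notch width are all hypothetical, not taken from the survey:

```python
import numpy as np

# Hypothetical time slice: smooth "geology" plus a periodic stripe
# pattern along the crossline axis, mimicking an acquisition footprint.
nx = ny = 128
x, y = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
geology = np.sin(2 * np.pi * x / 64.0)                 # broad geological signal
footprint = 0.5 * np.sin(2 * np.pi * y / 8.0)          # stripes, period 8 traces
slice_ = geology + footprint

# Wavenumber-domain filtering: notch out the narrow wavenumber band
# that carries the periodic footprint energy.
F = np.fft.fft2(slice_)
ky = np.fft.fftfreq(ny)                                # cycles per trace
mask = np.ones((nx, ny))
mask[:, np.abs(np.abs(ky) - 1.0 / 8.0) < 0.02] = 0.0   # notch at |ky| = 1/8
cleaned = np.real(np.fft.ifft2(F * mask))

residual = np.abs(cleaned - geology).max()
```

Because the synthetic footprint is strictly periodic, the notch removes it almost exactly while leaving the broad-wavenumber geology untouched; on real data the notch must be chosen to balance footprint suppression against signal preservation.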
46.
With the development of remote sensing technology and the steadily increasing resolution of satellite imagery, high-resolution satellite images are now widely used across many industries. This paper describes applications of such imagery in the urban sewage sector, including digital orthophoto map (DOM) production, locating discharge outfalls along drainage channels, supervised classification of sewage pollution levels, and change monitoring.
47.
Machine learning has achieved great success in many fields, but its predictive performance often depends on the specific problem. Ensemble learning combines multiple base classifiers to make predictions, adapts well to a variety of settings, and achieves high classification accuracy. To address the low classification accuracy on the faintest-magnitude star/galaxy set in the Sloan Digital Sky Survey (SDSS), a star/galaxy classification algorithm based on Stacking ensemble learning is proposed. The complete photometric data set was obtained from SDSS Data Release 7 (SDSS-DR7) and divided by magnitude into bright, faint, and faintest subsets; the study focuses only on the faintest subset, where classification is most complex and difficult. First, 10-fold nested cross-validation was applied to the faintest set; base classifier models were then built with Support Vector Machine (SVM), Random Forest (RF), and XGBoost (eXtreme Gradient Boosting), with a Gradient Boosting Decision Tree (GBDT) as the meta-classifier. Finally, using galaxy classification accuracy and related metrics, the results were compared with Function Tree (FT), SVM, RF, GBDT, XGBoost, Stacked Denoising AutoEncoders (SDAE), Deep Belief Network (DBN), and Deep Perception Decision Tree (DPDT) models. The experiments show that the Stacking ensemble improves galaxy classification accuracy on the faintest set by nearly 10% over the FT algorithm, and also offers a clear improvement over the other traditional machine learning algorithms, strong boosting algorithms, and deep learning methods.
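The stacking mechanics (out-of-fold base-learner predictions feeding a meta-learner) can be sketched from scratch on synthetic data. This miniature uses two simple base learners, a logistic regression and a nearest-centroid scorer, with a logistic meta-learner, instead of the paper's SVM/RF/XGBoost base models and GBDT meta-classifier, and 5 plain folds rather than 10-fold nested cross-validation; the two-blob "star/galaxy" data are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic binary sample: two Gaussian blobs in 4 photometric features
# (an illustrative stand-in for SDSS magnitude-derived features).
n = 400
X = np.vstack([rng.normal(0.0, 1.0, (n, 4)), rng.normal(1.5, 1.0, (n, 4))])
y = np.repeat([0, 1], n)

def fit_logistic(X, y, lr=0.1, steps=500):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def predict_logistic(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return 1.0 / (1.0 + np.exp(-Xb @ w))

def fit_centroid(X, y):
    return X[y == 0].mean(0), X[y == 1].mean(0)

def predict_centroid(c, X):
    d0 = np.linalg.norm(X - c[0], axis=1)
    d1 = np.linalg.norm(X - c[1], axis=1)
    return d0 / (d0 + d1)          # nearer the class-1 centroid -> higher score

base = [(fit_logistic, predict_logistic), (fit_centroid, predict_centroid)]

# Out-of-fold predictions: each base learner scores every sample with a
# model trained on the other folds, so the meta-learner sees honest inputs.
k = 5
idx = rng.permutation(len(y))
folds = np.array_split(idx, k)
Z = np.zeros((len(y), len(base)))
for f in folds:
    train = np.setdiff1d(idx, f)
    for j, (fit, predict) in enumerate(base):
        Z[f, j] = predict(fit(X[train], y[train]), X[f])

w_meta = fit_logistic(Z, y)                    # meta-learner on stacked scores
stacked = (predict_logistic(w_meta, Z) > 0.5).astype(int)
accuracy = float((stacked == y).mean())
```

The key point the sketch demonstrates is that the meta-learner is trained only on out-of-fold base predictions, which is what prevents the stack from simply memorizing the base learners' training-set fit.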
48.
A fast method is proposed for determining fatigue crack growth curves for offshore platform steel. The method combines previously accumulated empirical data with current test data to determine the surface fatigue crack growth rate curve. Compared with the traditional approach, which can use only current test data, the amount of usable information increases substantially: for the same accuracy, a large number of specimens can be saved, and for a fixed number of specimens, the prediction accuracy is greatly improved. A comparative test example for offshore platform steel is also given.
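One simple way to combine prior empirical knowledge with current test data, in the spirit of the method (the paper's actual formulation is not reproduced here), is an inverse-variance update of the Paris-law exponent m in da/dN = C (ΔK)^m. The prior values and the synthetic "current test" data below are assumptions made for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Paris law, log-linear form: log10(da/dN) = log10(C) + m * log10(dK).
# Synthetic current-test data with assumed true m = 3.0, log10(C) = -11.
m_true, logC_true = 3.0, -11.0
logK = rng.uniform(1.0, 1.6, 12)               # log10(dK) for 12 measurements
logdadN = logC_true + m_true * logK + rng.normal(0, 0.05, 12)

# Least-squares fit to the current data alone.
A = np.vstack([np.ones_like(logK), logK]).T
coef, res, *_ = np.linalg.lstsq(A, logdadN, rcond=None)
m_fit = coef[1]
sigma2 = res[0] / (len(logK) - 2)
var_m = sigma2 / np.sum((logK - logK.mean()) ** 2)   # variance of the slope

# Combine with prior knowledge (hypothetical: m ~ N(3.1, 0.2^2) from past
# test programmes) by inverse-variance weighting, the conjugate normal update.
m_prior, var_prior = 3.1, 0.2 ** 2
var_post = 1.0 / (1.0 / var_prior + 1.0 / var_m)
m_post = var_post * (m_prior / var_prior + m_fit / var_m)
```

The posterior variance is smaller than either input variance, which is the formal counterpart of the paper's claim that pooling past and current data buys accuracy or, equivalently, saves specimens.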
49.
The earliest instrumental meteorological records for western China held by the meteorological services date from the 1930s and cannot support the construction of Chinese climate-change series spanning the twentieth century, whereas paleoclimate reconstructions and climate model output extend back before the instrumental era. To explore methods for merging long, multi-source climate series, a Bayesian approach was used to fuse eight tree-ring temperature reconstructions for northern Xinjiang, instrumental records, and model simulations from phase 5 of the Coupled Model Intercomparison Project (CMIP5). First, the proxy records were validated against instrumental data and gridded reconstructions were produced, which served as the prior distribution of the Bayesian model; next, Taylor diagrams were used to select the models that simulate the region's climate best; finally, the gridded reconstructions and the simulated series were merged with the Bayesian model. The results show that the Bayesian fusion model effectively extracts the useful information in each data source: the long-term (linear) trend of the fused series is closer to the instrumental series, the precision of the series is improved to some extent, the uncertainty of the result is reduced, and obvious biases in the prior distribution and the model simulations are corrected, offering a feasible approach to reconstructing long climate series.
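The core of such a fusion can be sketched as a year-by-year conjugate normal update, with the proxy reconstruction as the prior and the model simulation as the observation. The series length and error variances below are invented for illustration, and the study's actual Bayesian model is considerably richer:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical 100-year temperature anomaly series.
truth = np.cumsum(rng.normal(0, 0.05, 100))        # "instrumental-like" target
proxy = truth + rng.normal(0, 0.30, 100)           # tree-ring reconstruction (prior)
model = truth + rng.normal(0, 0.20, 100)           # CMIP5-style simulation

# Year-by-year conjugate normal update with known error variances:
# posterior precision is the sum of the prior and observation precisions.
var_p, var_m = 0.30 ** 2, 0.20 ** 2
var_post = 1.0 / (1.0 / var_p + 1.0 / var_m)
fused = var_post * (proxy / var_p + model / var_m)

rmse = lambda s: float(np.sqrt(np.mean((s - truth) ** 2)))
```

Because the posterior precision exceeds that of either source, the fused series tracks the target more closely than the noisier input, mirroring the reduced uncertainty reported in the study.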
50.
The impact of diabatic processes on 4-dimensional variational data assimilation (4D-Var) was studied using the 1995 version of NCEP's global spectral model with and without full physics. The adjoint was coded manually. A cost function measuring the spectral errors of 6-hour forecasts against "observations" (the NCEP reanalysis data) was minimized using L-BFGS (the limited-memory quasi-Newton algorithm developed by Broyden, Fletcher, Goldfarb, and Shanno) to optimize parameters and initial conditions. Minimization of the cost function constrained by an adiabatic version of the NCEP global model converged to a minimum with a significant decrease in the value of the cost function. Minimization using the diabatic model, however, failed after a few iterations because of discontinuities introduced by the physical parameterizations. Examination of the convergence of the cost function in different spectral domains reveals that the large-scale flow is adjusted during the first 10 iterations, in which the discontinuous diabatic parameterizations play very little role. The adjustment produced by the minimization gradually moves to relatively smaller scales between the 10th and 20th iterations. During this transition, discontinuities in the cost function produced by "on-off" switches in the physical parameterizations caused the cost function to stay in a shallow local minimum instead of continuing to decrease toward a deeper one. A mixed 4D-Var scheme was therefore tested in which the large-scale flow is first adjusted adiabatically to a sufficient level, with a diabatic adjustment introduced after 10 to 20 iterations.
The mixed 4D-Var produced a closer fit of the analysis to the observations, with 38% and 41% larger decreases in the cost function and the gradient norm, respectively, than the standard diabatic 4D-Var, while CPU time was reduced by 21%. The resulting optimal initial conditions improved short-range forecast skill in 48-hour statistics. The detrimental effect of parameterization discontinuities on the minimization was also reduced.
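The adjoint-based minimization at the heart of 4D-Var can be sketched on a toy linear model: the forward sweep integrates the dynamics and accumulates the misfit, and the backward (adjoint) sweep propagates sensitivities with the transpose model to give the exact gradient with respect to the initial condition. Plain gradient descent stands in here for the L-BFGS used in the paper, and the dynamics, observation noise, and step size are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy linear model x_{k+1} = M x_k with direct, slightly noisy observations.
n, steps = 6, 8
M = 0.9 * np.eye(n) + 0.02 * rng.normal(0, 1, (n, n))
x0_true = rng.normal(0, 1, n)

obs, x = [], x0_true.copy()
for _ in range(steps):
    x = M @ x
    obs.append(x + rng.normal(0, 0.01, n))

def cost_and_grad(x0):
    """J(x0) = 0.5 * sum_k ||M^k x0 - y_k||^2 and its adjoint gradient."""
    innov, x = [], x0.copy()
    for y in obs:                      # forward sweep: store innovations
        x = M @ x
        innov.append(x - y)
    J = 0.5 * sum(float(d @ d) for d in innov)
    lam = np.zeros(n)
    for d in reversed(innov):          # backward (adjoint) sweep with M^T
        lam = M.T @ (lam + d)
    return J, lam

# Plain gradient descent stands in for L-BFGS in this sketch.
x0 = np.zeros(n)
for _ in range(500):
    _, g = cost_and_grad(x0)
    x0 -= 0.05 * g

J, _ = cost_and_grad(x0)               # final cost after optimization
```

With a smooth (here linear) model the descent recovers the true initial condition to within the observation noise; the "on-off" parameterization switches discussed above correspond to making `cost_and_grad` discontinuous in `x0`, which is exactly what stalls the minimization in the diabatic case.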