5362 results found (search time: 567 ms)
941.
To accurately and intelligently discover the data users need from massive polar science data, we first systematically analyze the characteristics of polar science data, mine the semantic information implicit in the data themselves, and establish an association-indicator system for polar science data; second, we study methods for associating polar science data; finally, we design and implement a prototype system for associated queries over polar science data. Using the metadata of the polar science data sharing platform as a corpus, the association-indicator data are extracted automatically, enabling automatic association between polar science datasets. Practical application shows that the association method not only retrieves the data that directly satisfy the query conditions, but also retrieves potentially related data. This association method can promote the integration, fusion and shared application of polar science data.
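The abstract does not spell out the association algorithm; as a rough illustration of linking datasets through shared metadata indicators, the sketch below (hypothetical dataset names and keywords) scores record pairs by the Jaccard overlap of their keyword sets:

```python
def jaccard(a, b):
    """Jaccard similarity between two keyword sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a | b) else 0.0

def link_datasets(records, threshold=0.3):
    """Return pairs of dataset ids whose metadata keywords overlap enough."""
    links = []
    ids = list(records)
    for i in range(len(ids)):
        for j in range(i + 1, len(ids)):
            if jaccard(records[ids[i]], records[ids[j]]) >= threshold:
                links.append((ids[i], ids[j]))
    return links

# Hypothetical metadata records, keyed by dataset id:
records = {
    "sea_ice_extent": ["sea ice", "Arctic", "remote sensing"],
    "ice_thickness":  ["sea ice", "Arctic", "in situ"],
    "penguin_census": ["Antarctic", "biology"],
}
links = link_datasets(records)
```

A real system would of course draw the indicators from the platform's metadata fields rather than hand-typed keyword lists.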
942.
To determine the optimal experimental conditions for measuring boron isotopes by positive thermal ionization mass spectrometry, the experiments were laid out with the uniform design method, and MATLAB was used for data processing and regression-model optimization. Applying the optimized conditions to boron isotope measurement yielded high-precision results that agree with theoretical calculations.
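As a minimal sketch of the regression-model optimization step (not the authors' MATLAB code; the data points are invented), the following fits a quadratic response curve to measurements at design points by solving the normal equations, then locates the optimum of the fitted parabola:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def fit_quadratic(xs, ys):
    """Least-squares fit of y = c0 + c1*x + c2*x**2 via the normal equations."""
    S = [sum(x ** k for x in xs) for k in range(5)]
    T = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    A = [[S[0], S[1], S[2]], [S[1], S[2], S[3]], [S[2], S[3], S[4]]]
    return solve(A, T)

# Invented responses at five design points; the response peaks near x = 3:
xs = [1, 2, 3, 4, 5]
ys = [2.0, 4.9, 6.1, 5.0, 2.1]
c0, c1, c2 = fit_quadratic(xs, ys)
x_opt = -c1 / (2 * c2)          # stationary point of the fitted parabola
```

A uniform design would spread multi-factor runs evenly over the factor space; the one-factor case above only shows the fit-then-optimize pattern.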
943.
宋戈  高楠 《地理科学》2008,28(2):185-188
Taking land use in the built-up area of Harbin as the research object, an evaluation indicator system was constructed in light of Harbin's development stage and current land-use characteristics. Using the DEA method, with the calculations performed in MATLAB, the economic benefits of urban land use in Harbin from 2001 to 2005 were quantitatively analyzed and evaluated. The economic benefit of land use in Harbin was found to be moderate, with redundancy in land inputs. Suggestions for improving the economic benefits of land use in Harbin are proposed in three respects: strengthening the use of existing urban land stock, attending to the investment proportions of different land-use types, and adjusting the industrial structure.
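DEA with multiple inputs and outputs is solved as a set of linear programs; in the special case of one input and one output, the CCR (constant returns to scale) efficiency score reduces to each unit's output/input ratio normalized by the best ratio. The sketch below illustrates only that special case, with invented figures:

```python
def ccr_efficiency(inputs, outputs):
    """CCR DEA efficiency for the single-input, single-output special case:
    each unit's output/input ratio divided by the best ratio (the frontier)."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Invented land input vs. economic output for five evaluation units (years):
inputs  = [100, 120, 130, 150, 160]
outputs = [ 80, 100, 120, 130, 150]
eff = ccr_efficiency(inputs, outputs)
```

Units with efficiency below 1 sit inside the frontier, which is where input redundancy of the kind the study reports would show up.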
944.
Using the Taw Estuary as an example, data routinely collected by the Environment Agency for England and Wales over the period 1990-2004 were interrogated to identify the drivers of excessive algal growth. The estuary was highly productive, with chlorophyll concentrations regularly exceeding 100 μg L⁻¹, mostly during periods of low freshwater input from the River Taw when estuarine water residence times were longest. However, algal growth in the mid estuary was often inhibited by ammonia inputs from the adjacent sewage treatment works. The reported approach demonstrates the value of applying conventional statistical analyses in a structured way to existing monitoring data and is recommended as a useful tool for the rapid assessment of eutrophication. However, future estuarine monitoring should include the collection of dissolved organic nutrient data and targeted high temporal resolution data, because the drivers of eutrophication are complex and often very specific to a particular estuary.
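A minimal example of the kind of conventional statistical analysis the study applies: a Pearson correlation between river flow and chlorophyll, computed here on invented monitoring values, would flag the inverse flow-chlorophyll relationship described above:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Invented paired observations: chlorophyll rises as freshwater flow falls
flow        = [20.0, 15.0, 8.0, 4.0, 2.0, 3.0, 10.0, 18.0]
chlorophyll = [10.0, 25.0, 60.0, 110.0, 130.0, 120.0, 45.0, 15.0]
r = pearson(flow, chlorophyll)
```

A strongly negative `r` is consistent with algal blooms occurring under low-flow, long-residence-time conditions.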
945.
Marine CSEM data processing based on 3D modeling   (Cited by: 7; self-citations: 6; citations by others: 1)
The marine controlled-source electromagnetic (CSEM) method has become an important tool in offshore hydrocarbon exploration, but its data processing and interpretation are still at the stage of qualitative analysis and one-dimensional modeling. Building on integral-equation 3D modeling, we processed field data from the Troll field, using interactive 3D modeling to determine the background model and the initial model of the anomalous body, and finally applied a fast quasi-linear approximation inversion to the resistivity of the anomalous body, obtaining quantitative results. We also show that 2D survey lines and 2D models can still be simulated in 3D, with results superior to 2D inversion. Given the rapid development of computing technology, 3D inversion can be expected to become the mainstream of data processing and interpretation.
946.
Methods for producing zero-offset sections and wavelet sections using polynomial fitting and the wavelet transform are discussed, and a system for producing zero-offset and wavelet sections from seismic data was developed in Visual C++ on the Windows platform. Using the optimal fitting algorithm and the optimal wavelet basis function, the system was applied to field seismic data from a coalfield, achieving satisfactory geological results.
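As a toy illustration of the wavelet-decomposition idea (the paper's choice of optimal wavelet basis is not specified here), one level of the orthonormal Haar transform splits a trace into pairwise averages and differences and reconstructs it exactly:

```python
import math

def haar_step(signal):
    """One level of the Haar wavelet transform: pairwise averages
    (approximation) and pairwise differences (detail), orthonormal scaling.
    Assumes an even-length input."""
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Exact inverse of haar_step."""
    s = math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) / s, (a - d) / s]
    return out

trace = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]   # invented seismic samples
a, d = haar_step(trace)
recon = haar_inverse(a, d)
```

Recursing on the approximation coefficients gives the full multi-level decomposition used in wavelet-section processing.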
947.
Some limitations of the Hilbert–Huang transform (HHT) for nonlinear and nonstationary signal processing are noted. As an enhancement to the HHT, a method based on a time-varying vector autoregressive moving average (VARMA) model is proposed to calculate the instantaneous frequencies of the intrinsic mode functions (IMFs) obtained from the empirical mode decomposition (EMD) of a signal. By representing the IMFs as a time-varying VARMA model and using the Kalman filter to estimate the time-varying model parameters, the instantaneous frequencies are calculated from those parameters; the instantaneous frequencies and the envelopes derived from cubic spline interpolation of the maxima of the IMFs are then used to yield the Hilbert spectrum. Analysis of the length-of-day dataset and the El Centro ground motion record (1940, N–S) shows that the proposed method offers advantages in frequency resolution and produces a more physically meaningful and readable Hilbert spectrum than the original HHT method, the short-time Fourier transform (STFT) and the wavelet transform (WT). Analysis of the seismic response of a building during the 1994 Northridge earthquake shows that the proposed method is a powerful tool for structural damage detection, which is expected to be a promising area for future research.
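The paper estimates instantaneous frequency from time-varying VARMA parameters tracked by a Kalman filter; as a simplified stationary stand-in, the sketch below fits a fixed AR(2) model by least squares and reads the frequency off its complex-conjugate roots:

```python
import math

def ar2_frequency(x):
    """Estimate the dominant frequency (cycles/sample) of a narrow-band
    signal by least-squares fitting x[t] = a1*x[t-1] + a2*x[t-2].
    For complex-conjugate roots r*exp(+/- i*w): a1 = 2r*cos(w), a2 = -r**2."""
    s11 = s12 = s22 = b1 = b2 = 0.0
    for t in range(2, len(x)):
        u, v, y = x[t - 1], x[t - 2], x[t]
        s11 += u * u; s12 += u * v; s22 += v * v
        b1 += u * y;  b2 += v * y
    det = s11 * s22 - s12 * s12
    a1 = (b1 * s22 - b2 * s12) / det
    a2 = (s11 * b2 - s12 * b1) / det
    r = math.sqrt(max(-a2, 1e-12))
    w = math.acos(max(-1.0, min(1.0, a1 / (2 * r))))
    return w / (2 * math.pi)

# A pure sinusoid at 0.1 cycles/sample; the AR(2) fit should recover it:
x = [math.cos(2 * math.pi * 0.1 * t) for t in range(200)]
f = ar2_frequency(x)
```

The time-varying version would re-estimate `a1`, `a2` at every step with a Kalman filter, so the recovered frequency itself becomes a function of time.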
948.
The Land Information System (LIS) is an established land surface modeling framework that integrates various community land surface models, ground measurements, satellite-based observations, high performance computing and data management tools. The use of advanced software engineering principles in LIS allows interoperability of individual system components and thus enables assessment and prediction of hydrologic conditions at various spatial and temporal scales. In this work, we describe a sequential data assimilation extension of LIS that incorporates multiple observational sources, land surface models and assimilation algorithms. These capabilities are demonstrated here in a suite of experiments that use the ensemble Kalman filter (EnKF) and assimilation through direct insertion. In a soil moisture experiment, we discuss the impact of differences in modeling approaches on assimilation performance. Provided careful choice of model error parameters, we find that two entirely different hydrological modeling approaches offer comparable assimilation results. In a snow assimilation experiment, we investigate the relative merits of assimilating different types of observations (snow cover area and snow water equivalent). The experiments show that data assimilation enhancements in LIS are uniquely suited to compare the assimilation of various data types into different land surface models within a single framework. The high performance infrastructure provides adequate support for efficient data assimilation integrations of high computational granularity.
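A minimal sketch of the EnKF analysis step for a directly observed scalar state (perturbed-observation form; the soil-moisture numbers are invented), showing how the ensemble is pulled toward the observation:

```python
import random

def enkf_update(ensemble, obs, obs_var, rng):
    """Ensemble Kalman filter analysis step for a scalar state observed
    directly (H = 1), using the perturbed-observation form."""
    n = len(ensemble)
    mean = sum(ensemble) / n
    var = sum((m - mean) ** 2 for m in ensemble) / (n - 1)
    gain = var / (var + obs_var)                  # Kalman gain
    return [m + gain * (obs + rng.gauss(0.0, obs_var ** 0.5) - m)
            for m in ensemble]

rng = random.Random(42)
# Prior ensemble: model soil moisture around 0.30 with spread 0.05
prior = [rng.gauss(0.30, 0.05) for _ in range(500)]
posterior = enkf_update(prior, obs=0.20, obs_var=0.0004, rng=rng)
prior_mean = sum(prior) / len(prior)
post_mean = sum(posterior) / len(posterior)
```

With the prior variance much larger than the observation error, the gain is close to 1 and the analysis mean lands near the observation; direct insertion is the limiting case of replacing the state with the observation outright.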
949.
It is the goal of remote sensing to infer information about objects or a natural process from a remote location. This implies that uncertainty in measurement should be viewed as central to remote sensing. In this study, the uncertainty associated with water stages derived from a single SAR image of the Alzette (G.D. of Luxembourg) 2003 flood is assessed using a stepped GLUE procedure. The main uncertain input factors to the SAR processing chain for estimating water stages include geolocation accuracy, spatial filter window size, image thresholding value, DEM vertical precision and the number of river cross sections at which water stages are estimated. Initial results show that even with plausible parameter values, uncertainty in water stages over the entire river reach is 2.8 m on average. Adding spatially distributed field water stages to the GLUE analysis following a one-at-a-time approach considerably reduces SAR water stage uncertainty (to 0.6 m on average), thereby identifying appropriate value ranges for each uncertain SAR water stage processing factor. For the GLUE analysis, a Nash-like efficiency criterion adapted to spatial data is proposed, whereby acceptable SAR model simulations are required to outperform a simpler regression model based on the field-surveyed average river bed gradient. Weighted CDFs for all factors based on the proposed efficiency criterion allow the generation of reliable uncertainty quantile ranges and 2D maps that show the uncertainty associated with SAR-derived water stages. The stepped GLUE procedure demonstrated that not all field data collected are necessary to achieve maximum constraining, and an efficient way to decide on relevant locations at which to sample in the field is proposed. It is also suggested that the resulting uncertainty ranges and flood extent or depth maps may be used to evaluate 1D or 2D flood inundation models in terms of water stages, depths or extents. For this, the extended GLUE approach, which copes with the presence of uncertainty in the observed data, may be adopted.
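The GLUE idea above can be sketched as Monte Carlo sampling of an uncertain parameter, scoring each simulation with a Nash-Sutcliffe-like efficiency, and retaining only the "behavioural" parameter sets; the toy linear model, synthetic data and threshold below are all invented:

```python
import random

def nse(sim, obs):
    """Nash–Sutcliffe efficiency: 1 is a perfect fit; <= 0 means the
    simulation is no better than the observed mean."""
    mo = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mo) ** 2 for o in obs)
    return 1.0 - num / den

rng = random.Random(1)
xs = list(range(10))
obs = [2.0 * x + rng.gauss(0.0, 0.5) for x in xs]   # synthetic 'field' data

behavioural = []                                    # accepted (param, score)
for _ in range(2000):
    a = rng.uniform(0.0, 4.0)                       # sampled uncertain parameter
    sim = [a * x for x in xs]
    score = nse(sim, obs)
    if score > 0.9:                                 # behavioural threshold
        behavioural.append((a, score))

a_values = sorted(a for a, _ in behavioural)        # constrained parameter range
```

The spread of `a_values` plays the role of the constrained value ranges (and weighted CDFs) that the stepped procedure derives for each SAR processing factor.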
950.
A long-standing problem in operational seismology is that of reliable focal depth estimation. Standard analyst practice is to pick and identify a ‘phase’ in the P-coda. This picking will always produce a depth estimate, but without any validation it cannot be trusted. In this article we ‘hunt’ for standard depth phases like pP, sP and/or PmP, but unlike the analyst we use Bayesian statistics to classify the probability that the polarization characteristics of pickings belong to one of the mentioned depth phases given preliminary epicenter information. In this regard we describe a general-purpose PC implementation of the Bayesian methodology that can deal with complex nonlinear models in a flexible way. The models are represented by a data-flow diagram that may be manipulated by the analyst through a graphical-programming environment. An analytic signal representation is used, with the imaginary part being the Hilbert transform of the signal itself. The pickings are expressed as a plot of posterior probabilities as a function of time for pP, sP or PmP being within the presumed azimuth and incidence angle sectors for given preliminary epicenter locations. We have tested this novel focal depth estimation procedure on explosion and earthquake recordings from Cossack Ranger II stations in Karelia, NW Russia, with encouraging results. For example, pickings deviating more than 5° from the ‘true’ azimuth are rejected, while Pn incidence angle estimates exhibit considerable scatter. A comprehensive test of our approach is not easy, as recordings from so-called Ground Truth events are elusive.
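A minimal sketch of the Bayesian classification idea, reduced to a single polarization attribute (equal priors, Gaussian measurement error; the predicted values are hypothetical, and in practice incidence angle as well as azimuth would enter the likelihood):

```python
import math

def gauss_pdf(x, mu, sigma):
    """Gaussian probability density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def phase_posterior(observed, predicted, sigma=2.0):
    """Posterior probability of each candidate depth phase given one
    measured polarization attribute, equal priors, Gaussian error (degrees)."""
    likes = {ph: gauss_pdf(observed, mu, sigma) for ph, mu in predicted.items()}
    total = sum(likes.values())
    return {ph: l / total for ph, l in likes.items()}

# Hypothetical predicted attribute values (degrees) for the candidate phases:
predicted = {"pP": 47.0, "sP": 52.0, "PmP": 60.0}
post = phase_posterior(48.0, predicted)
```

Evaluating such posteriors along the P-coda as a function of time yields the kind of posterior-probability plot the article describes.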
Copyright©北京勤云科技发展有限公司  京ICP备09084417号