A total of 2,024 records matched the query; results 991-1000 are shown below.
991.
Model data selection using gamma test for daily solar radiation estimation   (Total citations: 1; self-citations: 0; citations by others: 1)
R. Remesan, M. A. Shamim, D. Han. 水文研究 (Hydrological Processes), 2008, 22(21): 4301-4309
Hydrological modelling is a complicated procedure, and all modellers face many tough questions: What input data should be used? How much data is required? And what model should be used? In this paper, the gamma test (GT) has been used for the first time in modelling one of the key hydrological components: solar radiation. The study aimed to resolve the questions about the relative importance of input variables and to determine the optimum number of data points required to construct a reliable smooth model. The proposed methodology has been studied through the estimation of daily solar radiation in the Brue Catchment, the UK. The relationship between input and output in the meteorological data sets was characterized through error variance estimation with the GT before the modelling. This work has demonstrated how the GT helps model development in nonlinear modelling techniques such as local linear regression (LLR) and artificial neural networks (ANN). It was found that the GT provided very useful information for input data selection and subsequent model development. The study has wider implications for various hydrological modelling practices and suggests further exploration of this technique for improving informed data and model selection, which has been a difficult field in hydrology over the past decades. Copyright © 2008 John Wiley & Sons, Ltd.
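For readers unfamiliar with the GT, the sketch below shows the standard near-neighbour formulation of the gamma test in Python: the intercept of a least-squares line fitted to the (δ(k), γ(k)) pairs estimates the variance of the output noise that no smooth model can remove, and its ratio to Var(y) indicates how predictable the output is from the chosen inputs. This is a minimal illustration of the technique named in the abstract, not the authors' implementation; the function name, the number of neighbours p, and the toy data are assumptions.

```python
# Minimal sketch of the gamma test (standard near-neighbour formulation);
# names, p, and the toy data are illustrative assumptions.
import numpy as np
from scipy.spatial import cKDTree


def gamma_test(X, y, p=10):
    """Estimate the output noise variance Gamma for inputs X -> output y.

    Returns (Gamma, slope, v_ratio), where v_ratio = Gamma / Var(y) is a
    scale-free indicator: values near 0 suggest a smooth, well-determined
    input-output relationship; values near 1 suggest mostly noise.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float).ravel()
    tree = cKDTree(X)
    # k=1 returns the point itself, so query p+1 neighbours and drop column 0.
    _, idx = tree.query(X, k=p + 1)
    idx = idx[:, 1:]

    delta = np.empty(p)   # mean squared input distance to the k-th neighbour
    gamma = np.empty(p)   # half mean squared output difference
    for k in range(p):
        nbr = idx[:, k]
        delta[k] = np.mean(np.sum((X[nbr] - X) ** 2, axis=1))
        gamma[k] = 0.5 * np.mean((y[nbr] - y) ** 2)

    # Least-squares line gamma = Gamma + A * delta; the intercept Gamma
    # estimates the variance of the noise on y.
    A, Gamma = np.polyfit(delta, gamma, 1)
    return Gamma, A, Gamma / np.var(y)


# Toy usage: y is a smooth function of x1, x2 plus noise; Gamma should be
# close to the true noise variance (0.01 here).
rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + rng.normal(scale=0.1, size=500)
print(gamma_test(X, y))
```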
992.
Medium- and short-term anomaly patterns of seismic activity in North China   (Total citations: 2; self-citations: 0; citations by others: 2)
A systematic study of multiple seismicity-analysis methods shows that, before the moderate-to-strong earthquakes of MS≥5.8 that have occurred in North China since 1970, about 83% were preceded by anomalous belts of M3 or M4 earthquakes, seismogenic gaps, or signal earthquakes. Signal earthquakes occurred before 92% of the mainshocks; 83% of the signal earthquakes were located within 150 km of the mainshock, and 67% occurred less than one year before it. About 90% of the anomalous belts formed within two years, and in 83% of cases the mainshock occurred within five months after the belt had formed. 83% of the seismic gaps formed within one and a half years, and in 91% of cases the mainshock occurred within 50 days after the gap had formed, which constitutes a meaningful short-term anomaly feature of seismicity patterns. Quantitative criteria for identifying anomalous belt patterns and their relation to future strong earthquakes are also discussed.
993.
田丰, 罗建明, 田力, 王建芳, 王松. 地震 (Earthquake), 2001, 21(4): 80-87
Spread-spectrum microwave technology is a high technology that has emerged in recent years and is currently one of the principal means of building broadband networks. Building a network with spread-spectrum microwave links is a first for the seismological system. The "Beijing-area spread-spectrum microwave high-speed link" project uses spread-spectrum microwave links at 2-3 Mbps to interconnect the computer networks of the main institutions directly under the 中国地震局 (China Earthquake Administration) in the Beijing area and of the Beijing Municipal Seismological Bureau, forming a wide-area spread-spectrum microwave network platform and the channel backbone for interconnecting the Administration's computer networks in urban Beijing. At the same time, a spread-spectrum microwave channel to the China Science and Technology Network was established, connecting the computer networks of the Beijing-area institutions and of the provincial (municipal) seismological bureaus nationwide to the Internet.
994.
The remediation of sites contaminated with unexploded ordnance (UXO) remains an area of intense focus for the Department of Defense. Under the sponsorship of SERDP, data fusion techniques are being developed to enhance wide-area assessment UXO remediation efforts, and a data fusion framework is being created to provide a cohesive data management and decision-making utility that allows more efficient expenditure of time, labor, and resources. An important first step in this work is the development of feature extraction utilities and feature probability density maps for eventual input to data fusion algorithms, making it possible to fuse estimates of data quality, UXO-related features, non-UXO backgrounds, and correlations among independent data streams. Utilizing data acquired during ESTCP's Wide-Area Assessment Pilot Program, the results presented here demonstrate the feasibility of automated feature extraction from light detection and ranging (LiDAR), orthophotography, and helicopter magnetometry wide-area assessment survey data acquired at the Pueblo Precision Bombing Range #2. These data were imported and registered to a common survey map grid, and UXO-related features were extracted and used to construct site-wide probability density maps that are well suited as input to higher-level data fusion algorithms. Preliminary combination of feature maps from the various data sources yielded maps of the Pueblo site that offered a more accurate UXO assessment than any one data source alone.
Corresponding author: Susan L. Rose-Pehrsson.
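As a rough illustration of the last step described above — combining co-registered feature probability density maps from several sensors into a single site-wide map — the sketch below fuses per-sensor maps with a weighted average in Python. The weighting scheme, function name, and toy maps are assumptions for illustration; the paper does not specify this particular combination rule.

```python
# Hypothetical sketch: weighted-average fusion of co-registered feature
# probability maps (e.g. LiDAR, orthophoto, magnetometry). The weights and
# map names are illustrative assumptions, not the paper's fusion algorithm.
import numpy as np


def fuse_probability_maps(maps, weights=None):
    """Weighted average of per-sensor probability maps on a common grid.

    maps    : list of 2-D arrays with values in [0, 1], already registered
              to the same survey grid.
    weights : optional per-sensor weights (e.g. reflecting data quality);
              defaults to equal weighting.
    """
    stack = np.stack([np.asarray(m, dtype=float) for m in maps])
    if weights is None:
        weights = np.ones(len(maps))
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    # Contract the weight vector against the sensor axis of the stack.
    return np.tensordot(w, stack, axes=1)


# Toy usage with three random 100 x 100 "sensor" maps.
rng = np.random.default_rng(1)
lidar, ortho, mag = (rng.uniform(size=(100, 100)) for _ in range(3))
fused = fuse_probability_maps([lidar, ortho, mag], weights=[1.0, 0.5, 2.0])
print(fused.shape, fused.min(), fused.max())
```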
995.
The well-known “Maximum Entropy Formalism” offers a powerful framework for deriving probability density functions given a relevant knowledge base and an adequate prior. The majority of results based on this approach have been derived assuming a flat uninformative prior, but this assumption is to a large extent arbitrary (any one-to-one transformation of the random variable will change the flat uninformative prior into some non-constant function). In a companion paper we introduced the notion of a natural reference point for dimensional physical variables, and used this notion to derive a class of physical priors that are form-invariant to changes in the system of dimensional units. The present paper studies the effects of these priors on the probability density functions derived using the maximum entropy formalism. Analysis of real data shows that when the maximum entropy formalism uses the physical prior it yields significantly better results than when it is based on the commonly used flat uninformative prior. This improvement reflects the significance of incorporating the additional information (contained in physical priors) that is ignored when flat priors are used in the standard form of the maximum entropy formalism. A potentially serious limitation of the maximum entropy formalism is the assumption that sample moments are available. This is not the case in many macroscopic real-world problems, where the knowledge base available is a finite sample rather than population moments. As a result, the maximum entropy formalism generates a family of “nested models” parameterized by the unknown values of the population parameters. In this work we combine this formalism with a model selection scheme based on Akaike’s information criterion to derive the maximum entropy model that is most consistent with the available sample. This combination establishes a general inference framework of wide applicability in scientific and engineering problems.
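For reference, the standard form of the maximum-entropy density under a (possibly non-flat) prior, together with Akaike's criterion used in the model-selection step, can be written as follows; the notation is generic and not necessarily that of the paper.

```latex
% Maximum-entropy density relative to a prior m(x), subject to k moment
% constraints E[f_i(X)] = \mu_i; a flat m(x) recovers the classical formalism.
p(x) = \frac{m(x)}{Z(\boldsymbol{\lambda})}
       \exp\!\Big(\sum_{i=1}^{k}\lambda_i f_i(x)\Big),
\qquad
Z(\boldsymbol{\lambda}) = \int m(x)\,
       \exp\!\Big(\sum_{i=1}^{k}\lambda_i f_i(x)\Big)\,dx .

% When only a finite sample is available, each choice of constraint set
% (of size k) defines a candidate model; Akaike's information criterion,
% with \hat{L} the maximised sample likelihood, selects among them:
\mathrm{AIC} = 2k - 2\ln\hat{L}.
```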
996.
Motile bacteria may form bands that travel with a constant speed of propagation through a medium containing a dissolved substrate, to which they respond energy-tactically. We generalize the analytical solution by Keller and Segel for such bands by accounting for (1) the presence of a porous medium, (2) substrate consumption described by a Monod kinetics model, and (3) an energy-tactic response model derived by Rivero et al. Specifically, we determine the concentration profiles of the bacteria and the substrate. We also derive various expressions for the band velocity, which is shown to equal the energy-tactic velocity at the bacterial peak divided by the tortuosity.
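As background for the generalization described above, the classical Keller–Segel traveling-band argument (constant consumption rate per cell, logarithmic chemotactic sensing, substrate fully depleted behind the band, no porous medium) fixes the band speed by a simple substrate mass balance in the moving frame. The sketch below is that textbook case, not the paper's derivation.

```latex
% Classical Keller--Segel band model: b = bacterial density, s = substrate,
% \mu = motility, \chi = tactic coefficient, k = consumption rate per cell.
\frac{\partial b}{\partial t}
  = \frac{\partial}{\partial x}\!\left(\mu\,\frac{\partial b}{\partial x}
      - \chi\, b\, \frac{\partial \ln s}{\partial x}\right),
\qquad
\frac{\partial s}{\partial t} = -\,k\,b .

% Integrating the substrate equation across a band moving at constant speed c,
% with s -> s_\infty far ahead and s -> 0 far behind, gives the band speed in
% terms of the cell count per unit cross-section N = \int b\,dx :
c\, s_{\infty} = k\!\int_{-\infty}^{\infty}\! b\,dx = kN
\quad\Longrightarrow\quad
c = \frac{kN}{s_{\infty}} .
```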
997.
Earthquake damage detection in SAR amplitude images based on a generalized Gaussian model   (Total citations: 1; self-citations: 0; citations by others: 1)
刘云华, 庾露, 单新建. 地震 (Earthquake), 2013, 33(2): 29-36
Using spaceborne ALOS synthetic aperture radar (SAR) images acquired before and after the Wenchuan earthquake of 12 May 2008 as the data source, this paper adopts the generalized Gaussian distribution as the model for the changed and unchanged classes in the difference image. Under the assumption that changed and unchanged pixels follow generalized Gaussian distributions, the probability density parameters of the two classes are estimated, and the KI algorithm is used to compute the optimal segmentation threshold and extract the changed areas. Taking the Dujiangyan area as an example, surface changes caused by the earthquake were detected automatically and regions with large changes were delineated rapidly. The results show that the change detection algorithm describes the distribution of the difference image fairly accurately and can play a useful role in extracting earthquake damage.
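A minimal sketch of the thresholding step is given below: it applies the Kittler–Illingworth (KI) minimum-error criterion to the histogram of a log-ratio difference image of two SAR amplitude scenes. For brevity the two classes are treated as Gaussian here, whereas the paper fits generalized Gaussian densities to the changed and unchanged classes before applying the KI criterion; the toy images, function name, and bin count are assumptions.

```python
# Sketch: KI minimum-error thresholding on a SAR log-ratio difference image.
# Classes are modelled as Gaussian here for brevity; the paper uses
# generalized Gaussian class models. Toy data and names are illustrative.
import numpy as np


def ki_threshold(image, bins=256):
    """Return the threshold minimising the KI criterion on a difference image."""
    x = image.ravel()
    hist, edges = np.histogram(x, bins=bins)
    p = hist / hist.sum()
    g = 0.5 * (edges[:-1] + edges[1:])      # bin centres

    best_T, best_J = None, np.inf
    for t in range(1, bins - 1):
        P1, P2 = p[:t].sum(), p[t:].sum()
        if P1 <= 0 or P2 <= 0:
            continue
        mu1 = (p[:t] * g[:t]).sum() / P1
        mu2 = (p[t:] * g[t:]).sum() / P2
        var1 = (p[:t] * (g[:t] - mu1) ** 2).sum() / P1
        var2 = (p[t:] * (g[t:] - mu2) ** 2).sum() / P2
        if var1 <= 0 or var2 <= 0:
            continue
        # Kittler-Illingworth minimum classification-error criterion
        # (note log(var) = 2 log(sigma)).
        J = 1 + P1 * np.log(var1) + P2 * np.log(var2) \
              - 2 * (P1 * np.log(P1) + P2 * np.log(P2))
        if J < best_J:
            best_J, best_T = J, g[t]
    return best_T


# Toy usage: log-ratio of two simulated amplitude images with a changed patch;
# pixels above the threshold are flagged as changed.
rng = np.random.default_rng(2)
pre = rng.gamma(4.0, 1.0, size=(200, 200))
post = rng.gamma(4.0, 1.0, size=(200, 200))
post[50:150, 50:150] *= 4.0                 # simulated change patch
log_ratio = np.log(post / pre)
T = ki_threshold(log_ratio)
change_map = log_ratio > T
print(T, change_map.sum())
```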
998.
Uncertainty in the estimation of hydrologic export of solutes has never been fully evaluated at the scale of a small‐watershed ecosystem. We used data from the Gomadansan Experimental Forest, Japan, Hubbard Brook Experimental Forest, USA, and Coweeta Hydrologic Laboratory, USA, to evaluate many sources of uncertainty, including the precision and accuracy of measurements, selection of models, and spatial and temporal variation. Uncertainty in the analysis of stream chemistry samples was generally small but could be large in relative terms for solutes near detection limits, as is common for ammonium and phosphate in forested catchments. Instantaneous flow deviated from the theoretical curve relating height to discharge by up to 10% at Hubbard Brook, but the resulting corrections to the theoretical curve generally amounted to <0.5% of annual flows. Calibrations were limited to low flows; uncertainties at high flows were not evaluated because of the difficulties in performing calibrations during events. However, high flows likely contribute more uncertainty to annual flows because of the greater volume of water that is exported during these events. Uncertainty in catchment area was as much as 5%, based on a comparison of digital elevation maps with ground surveys. Three different interpolation methods are used at the three sites to combine periodic chemistry samples with streamflow to calculate fluxes. The three methods differed by <5% in annual export calculations for calcium, but up to 12% for nitrate exports, when applied to a stream at Hubbard Brook for 1997–2008; nitrate has higher weekly variation at this site. Natural variation was larger than most other sources of uncertainty. Specifically, coefficients of variation across streams or across years, within site, for runoff and weighted annual concentrations of calcium, magnesium, potassium, sodium, sulphate, chloride, and silicate ranged from 5 to 50% and were even higher for nitrate. Uncertainty analysis can be used to guide efforts to improve confidence in estimated stream fluxes and also to optimize design of monitoring programmes. © 2014 The Authors. Hydrological Processes published by John Wiley & Sons, Ltd.
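To make the flux-calculation step concrete, the sketch below shows one common period-weighting approach: periodic concentration samples are linearly interpolated onto the daily flow record and the daily products are summed. It illustrates the general idea only — it is not one of the three site-specific interpolation methods compared in the paper — and the unit conventions and toy data are assumptions.

```python
# Sketch of annual solute export from daily flow plus periodic chemistry
# samples (linear interpolation of concentration onto the flow record).
# Units and toy data are assumptions for illustration.
import numpy as np


def annual_export(flow_days, flow_m3_per_day, sample_days, sample_mg_per_L):
    """Annual solute export in kilograms from daily flow and periodic samples."""
    # Interpolate the periodic concentrations (mg/L) onto every flow day.
    conc = np.interp(flow_days, sample_days, sample_mg_per_L)
    # mg/L * m3/day = g/day; sum over the year and convert g -> kg.
    return np.sum(conc * flow_m3_per_day) / 1000.0


# Toy usage: 365 days of synthetic flow, weekly concentration samples.
rng = np.random.default_rng(3)
days = np.arange(365)
flow = 5000.0 + 2000.0 * rng.random(365)             # m3/day
sample_days = np.arange(0, 365, 7)
samples = 1.5 + 0.3 * rng.random(sample_days.size)   # mg/L (e.g. calcium)
print(annual_export(days, flow, sample_days, samples), "kg/yr")
```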
999.
Lu Zhuo, Qiang Dai, Dawei Han. 水文研究 (Hydrological Processes), 2015, 29(11): 2463-2477
Hydrological models play a significant role in modelling river flow to support decision making in water resource management. In the past decades, many researchers have put a great deal of effort into calibrating and validating various models, with each study focusing on one or two models. As a result, there is a lack of comparative analysis of the performance of those models to guide hydrologists in choosing appropriate models for individual climate and physical conditions. This paper describes a two‐level meta‐analysis to develop a matching system between catchment complexity (based on catchment significant features, CSFs) and model types. The intention is to use the available CSF information to choose the most suitable model type for a given catchment. In this study, the CSFs include the elements of climate, soil type, land cover and catchment scale. Specific choices of model types in small and medium catchments are further explored with all CSF information obtained. In particular, it is interesting to find that semi‐distributed models are the most suitable model type for catchments with areas over 3000 km², regardless of other CSFs. The potential methodology for expanding the matching system between catchment complexity and model complexity is discussed. Copyright © 2014 John Wiley & Sons, Ltd.
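As a toy illustration of how a CSF-based matching system might be queried, the sketch below encodes only the single rule stated in the abstract (catchments larger than 3000 km² match semi-distributed models regardless of other CSFs) and defers everything else. The data structure, names, and fallback behaviour are hypothetical, not the paper's matching system.

```python
# Hypothetical sketch of querying a CSF-to-model-type matching rule.
# Only the large-catchment rule from the abstract is encoded; everything
# else is a placeholder, not the paper's two-level matching system.
from dataclasses import dataclass
from typing import Optional


@dataclass
class CatchmentSignificantFeatures:
    area_km2: float
    climate: Optional[str] = None
    soil_type: Optional[str] = None
    land_cover: Optional[str] = None


def suggest_model_type(csf: CatchmentSignificantFeatures) -> str:
    if csf.area_km2 > 3000:
        # Finding reported in the abstract for large catchments.
        return "semi-distributed"
    # Small/medium catchments require the full CSF-based matching analysis.
    return "consult full CSF matching system (lumped / semi-distributed / distributed)"


print(suggest_model_type(CatchmentSignificantFeatures(area_km2=4500, climate="temperate")))
```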
1000.