Article search
Access: 2,813 subscription full-text articles; 450 free; 490 free within China.
By subject: surveying and mapping 376; atmospheric sciences 525; geophysics 779; geology 713; oceanography 272; astronomy 585; general 117; physical geography 386.
By year:
2024: 16; 2023: 32; 2022: 66; 2021: 77; 2020: 130; 2019: 118; 2018: 121; 2017: 130; 2016: 146; 2015: 129; 2014: 157; 2013: 295;
2012: 169; 2011: 150; 2010: 129; 2009: 183; 2008: 179; 2007: 231; 2006: 204; 2005: 169; 2004: 139; 2003: 119; 2002: 92; 2001: 61;
2000: 78; 1999: 72; 1998: 80; 1997: 54; 1996: 35; 1995: 35; 1994: 33; 1993: 22; 1992: 13; 1991: 25; 1990: 14; 1989: 7; 1988: 9;
1987: 8; 1986: 6; 1985: 5; 1984: 2; 1983: 3; 1982: 1; 1981: 3; 1980: 1; 1979: 2; 1978: 1; 1972: 2.
A total of 3,753 results were found (search time: 15 ms); the first page of results is listed below.
1.
Average velocity in streams is a key variable for the analysis and modelling of hydrological and hydraulic processes underpinning water resources science and practice. The present study evaluates the impact of sampling duration on the quality of average velocity measurements acquired with contemporary instruments such as Acoustic Doppler Velocimeters (ADV) and Acoustic Doppler Current Profilers (ADCP). The evaluation combines considerations of turbulent flows and of the principles and configurations of acoustic instruments with practical experience in conducting customized analyses for uncertainty estimation. The study offers new insight into the spatial and temporal variability of the uncertainty in measured average velocities caused by variable sampling durations acting in isolation from other sources of uncertainty. Sampling durations of 90 and 150 s are found sufficient for ADV and ADCP, respectively, to obtain reliable average velocities in a flow affected only by natural turbulence and instrument noise. Longer sampling durations are needed for measurements in most natural streams, which are exposed to additional sources of data variability.
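As a rough illustration of how sampling duration controls the uncertainty of a time-averaged velocity, the following Python sketch simulates a streamwise velocity record as a mean flow plus AR(1) "turbulence" and estimates the spread of the record mean for several durations. The 25 Hz rate, 1 s integral time scale, and noise levels are assumptions chosen only for illustration; the 90 s and 150 s figures quoted above come from the study itself, not from this simulation.

```python
import numpy as np

def simulate_velocity(duration_s, fs=25.0, u_mean=0.50, sigma=0.05, tau=1.0, rng=None):
    """Synthetic streamwise velocity: mean flow plus AR(1) fluctuations with
    integral time scale tau (s), sampled at fs Hz (an ADV-like rate)."""
    rng = np.random.default_rng() if rng is None else rng
    n = int(duration_s * fs)
    phi = np.exp(-1.0 / (tau * fs))                  # AR(1) coefficient for time scale tau
    eps = rng.normal(0.0, sigma * np.sqrt(1 - phi**2), n)
    u = np.empty(n)
    u[0] = u_mean + rng.normal(0.0, sigma)
    for i in range(1, n):
        u[i] = u_mean + phi * (u[i - 1] - u_mean) + eps[i]
    return u

# Spread of the time-averaged velocity versus sampling duration.
rng = np.random.default_rng(1)
for T in (30, 60, 90, 150, 300):
    means = [simulate_velocity(T, rng=rng).mean() for _ in range(100)]
    print(f"T = {T:4d} s  ->  std of record mean = {np.std(means) * 1000:.1f} mm/s")
```

The printed spread shrinks roughly with the square root of the duration, which is the basic reason longer records are needed once extra sources of variability are present.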
2.
To date, passive flux meters have predominantly been applied in temperate environments for tracking the movement of contaminants in groundwater. This study applies these instruments to reduce uncertainty in (typically instantaneous) flux measurements made in a low-gradient, wetland-dominated, discontinuous permafrost environment. The method supports improved estimation of unsaturated and over-winter subsurface flows, which are very difficult to quantify using hydraulic gradient-based approaches. Improved subsurface flow estimates can play a key role in understanding the water budget of this landscape.
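For context only, the hydraulic gradient-based estimate that the abstract describes as difficult here is Darcy's law, q = K dh/dl. The values below are placeholders, not measurements from the study; they simply show how small the specific discharge becomes under a low gradient, which is why a time-integrating instrument such as a passive flux meter is attractive.

```python
# Conventional gradient-based flux estimate (illustrative values only).
K = 1e-5             # hydraulic conductivity, m/s (assumed)
dh, dl = 0.02, 50.0  # head drop (m) over flow-path length (m): a low gradient
q = K * dh / dl      # specific discharge (Darcy flux), m/s
print(f"q = {q:.2e} m/s  ({q * 86400 * 1000:.3f} mm/day)")
```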
3.
Water quality is often highly variable in both space and time, which poses challenges for modelling the more extreme concentrations. This study developed an alternative approach to predicting water quality quantiles at individual locations. We focused on river water quality data collected over 25 years at 102 catchments across the State of Victoria, Australia. We analysed and modelled spatial patterns of the 10th, 25th, 50th, 75th and 90th percentiles of the concentrations of sediments, nutrients and salt, for six common constituents: total suspended solids (TSS), total phosphorus (TP), filterable reactive phosphorus (FRP), total Kjeldahl nitrogen (TKN), nitrate-nitrite (NOx), and electrical conductivity (EC). To predict the spatial variation of each quantile for each constituent, we developed statistical regression models and exhaustively searched through 50 catchment characteristics to identify the best set of predictors for that quantile. The models predict the spatial variation in individual quantiles of TSS, TKN and EC well (66%–96% of spatial variation explained), while those for TP, FRP and NOx perform less well (37%–73% of spatial variation explained). The most common factors influencing the spatial variation of the different constituents and quantiles are annual temperature, the percentage of cropping land in the catchment, and channel slope. The statistical models developed can be used to predict how low- and high-concentration quantiles change with landscape characteristics, and thus provide a useful tool for catchment managers to inform planning and policy making under changing climate and land use conditions.
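A minimal Python sketch of the two-step idea described above, using synthetic placeholder data: compute a site-level concentration quantile, then regress its spatial pattern on catchment characteristics. The column names, the 20 sites, and the three predictors are illustrative assumptions, not the study's 102 Victorian catchments or its 50 candidate characteristics.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

np.random.seed(0)

# Placeholder monitoring data: many concentration records per site (e.g. TSS, mg/L).
obs = pd.DataFrame({
    "site_id": np.repeat(np.arange(20), 100),
    "conc":    np.random.lognormal(mean=1.0, sigma=0.8, size=2000),
})
# Placeholder catchment characteristics, one row per site.
catchments = pd.DataFrame({
    "site_id":       np.arange(20),
    "annual_temp":   np.random.uniform(10, 20, 20),     # deg C
    "cropping_frac": np.random.uniform(0.0, 0.6, 20),   # fraction of catchment
    "channel_slope": np.random.uniform(0.001, 0.05, 20),
})

# Step 1: site-level quantile of the log-transformed concentrations.
site_q = (np.log10(obs["conc"])
          .groupby(obs["site_id"])
          .quantile(0.90)
          .rename("log_conc_q90")
          .reset_index())

# Step 2: regress the spatial pattern of that quantile on catchment predictors.
data = site_q.merge(catchments, on="site_id")
X = data[["annual_temp", "cropping_frac", "channel_slope"]]
y = data["log_conc_q90"]
model = LinearRegression().fit(X, y)
print("R^2 (spatial variation explained):", round(model.score(X, y), 2))
```

With random placeholder data the fit is of course uninformative; the point is the workflow of quantile extraction followed by spatial regression.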
5.
We suggest a new algorithm to remove systematic effects from a large set of light curves obtained by a photometric survey. The algorithm can remove systematic effects, such as those associated with atmospheric extinction, detector efficiency, or point spread function changes over the detector. It works without any prior knowledge of the effects, as long as they appear linearly in many stars of the sample. The approach, originally developed to remove atmospheric extinction effects, is based on a lower-rank approximation of matrices, an approach that has already been suggested and used in chemometrics, for example. The proposed algorithm is especially useful when the uncertainties of the measurements are unequal; for equal uncertainties, it reduces to the Principal Component Analysis (PCA) algorithm. We present a simulation to demonstrate the effectiveness of the proposed algorithm and point out its potential, particularly in the search for transit candidates.
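The weighted rank-one fit described above can be sketched in a few lines of Python. The alternating update below is an illustration in the spirit of the abstract, not the authors' code, and the star count, noise levels, and single "airmass-like" effect are invented. It minimizes the chi-square of a rank-one model c_i * a_j against residuals with unequal uncertainties and collapses to a PCA component when all uncertainties are equal.

```python
import numpy as np

def remove_linear_systematic(resid, sigma, n_iter=20):
    """Remove one linear systematic effect from a stars x epochs residual matrix.

    Finds per-star coefficients c and a per-epoch effect a minimizing
    chi^2 = sum_ij ((resid_ij - c_i * a_j) / sigma_ij)^2, i.e. a weighted
    rank-one approximation.
    """
    w = 1.0 / sigma**2
    a = np.ones(resid.shape[1])  # initial guess for the epoch effect
    for _ in range(n_iter):
        c = (w * resid * a).sum(axis=1) / (w * a**2).sum(axis=1)
        a = (w * resid * c[:, None]).sum(axis=0) / (w * c[:, None]**2).sum(axis=0)
    return resid - np.outer(c, a), c, a

# Toy example: 300 constant stars, 100 epochs, one airmass-like systematic.
rng = np.random.default_rng(0)
n_stars, n_epochs = 300, 100
airmass_term = rng.normal(0, 1, n_epochs)
extinction_coeff = rng.uniform(0.05, 0.2, n_stars)
sigma = rng.uniform(0.01, 0.05, (n_stars, n_epochs))
resid = np.outer(extinction_coeff, airmass_term) + rng.normal(0, sigma)

cleaned, c, a = remove_linear_systematic(resid, sigma)
print("rms before:", resid.std(), " rms after:", cleaned.std())
```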
7.
The present generation of weak lensing surveys will be superseded by surveys run from space with much better sky coverage and a higher signal-to-noise ratio, such as the Supernova/Acceleration Probe (SNAP). However, the removal of any systematics or noise will remain a major concern for any weak lensing survey. One of the best ways of spotting an undetected source of systematic noise is to compare surveys that probe the same part of the sky. In this paper we study various measures that are useful for cross-correlating weak lensing surveys with diverse survey strategies. Using two different statistics – the shear components and the aperture mass – we construct a class of estimators which encode such cross-correlations. These techniques will also be useful in studies where the entire source population from a specific survey can be divided into various redshift bins to study cross-correlations among them. We perform a detailed study of the angular size dependence and redshift dependence of these observables and of their sensitivity to the background cosmology. We find that one-point and two-point statistics provide complementary tools which allow one to constrain cosmological parameters and to obtain a simple estimate of the noise of the survey.
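A toy numerical illustration of why cross-correlating two surveys of the same patch helps isolate undetected noise; the maps, noise levels, and pixel counts below are invented for illustration and are not the estimators constructed in the paper. The auto-correlation of a single noisy shear map is biased high by its own noise power, while the zero-lag cross-correlation between two surveys with independent noise recovers the shared signal.

```python
import numpy as np

rng = np.random.default_rng(3)
n_pix = 512
signal = rng.normal(0.0, 0.01, (n_pix, n_pix))              # shared cosmological shear signal
map_A = signal + rng.normal(0.0, 0.03, (n_pix, n_pix))      # survey A: signal + its own noise
map_B = signal + rng.normal(0.0, 0.03, (n_pix, n_pix))      # survey B: independent noise

signal_var = (signal**2).mean()
auto_A     = (map_A * map_A).mean()   # auto-estimate, biased high by noise power
cross_AB   = (map_A * map_B).mean()   # cross-estimate, noise averages out

print(f"true signal variance : {signal_var:.2e}")
print(f"auto  (A x A)        : {auto_A:.2e}   <- includes noise power")
print(f"cross (A x B)        : {cross_AB:.2e}   <- close to the true signal variance")
```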
8.
We develop a general formalism for analysing parameter information from non-Gaussian cosmic fields. The method can be adapted to include the non-linear effects in galaxy redshift surveys, weak lensing surveys and cosmic velocity field surveys as part of parameter estimation. It can also be used as a test of non-Gaussianity of the cosmic microwave background. Generalizing maximum-likelihood analysis to second order, we calculate the non-linear Fisher information matrix and likelihood surfaces in parameter space. To this order we find that the information content is always increased by including non-linearity. Our methods are applied to a realistic model of a galaxy redshift survey, including non-linear evolution, galaxy bias, shot noise and redshift-space distortions to second order. We find that including non-linearities allows all of the degeneracies between parameters to be lifted. Marginalized parameter uncertainties of a few per cent will then be obtainable using forthcoming galaxy redshift surveys.
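For orientation, here is the standard first-order (Gaussian) Fisher forecast for a band-power measurement in Python. The toy power-law spectrum, the two parameters, and the 10% band-power errors are assumptions for illustration; the paper's second-order, non-Gaussian extension is not reproduced here.

```python
import numpy as np

# Toy power spectrum model P(k; A, n) = A * k**n, measured in bands k_i with
# Gaussian band-power errors sigma_i. Fisher matrix:
# F_ab = sum_i dP_i/dtheta_a * dP_i/dtheta_b / sigma_i**2.
k = np.linspace(0.02, 0.2, 20)
A, n = 1.0e4, -1.5
P = A * k**n
sigma = 0.1 * P                       # assumed 10% band-power errors

dP_dA = k**n                          # derivative w.r.t. amplitude A
dP_dn = A * k**n * np.log(k)          # derivative w.r.t. spectral index n
J = np.vstack([dP_dA, dP_dn])         # 2 x n_bands Jacobian

F = (J / sigma) @ (J / sigma).T       # Fisher information matrix
cov = np.linalg.inv(F)                # forecast parameter covariance (marginalized)
print("1-sigma(A) =", np.sqrt(cov[0, 0]), " 1-sigma(n) =", np.sqrt(cov[1, 1]))
```

Inverting the full Fisher matrix, rather than taking 1/sqrt(F_aa), is what marginalizes over the other parameter and exposes any degeneracy between them.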