6,192 results found (search time: 15 ms)
81.
Net mass balance has been measured since 1958 at South Cascade Glacier using the 'direct method', i.e. area averages of snow gain and of firn and ice loss measured at stakes. Analysis of cartographic vertical photography has allowed measurement of mass balance by the 'geodetic method' in 1970, 1975, 1977, 1979–80, and 1985–97. Water-equivalent change measured by these nearly independent methods should give similar results. During 1970–97, the direct method shows a cumulative balance of about −15 m, and the geodetic method about −22 m. The deviation between the two methods is fairly consistent, suggesting no gross errors in either but rather a cumulative systematic error. The cumulative error is suspected to lie in the direct method, because the geodetic method is based on an unchanging reference (the bedrock control), whereas the direct method is referenced only to the previous year's summer surface. Possible sources of mass loss missing from the direct method are basal melt, internal melt, and ablation on crevasse walls. Possible systematic measurement errors include underestimation of the density of lost material, sinking stakes, and poorly represented areas.
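As an illustration only (not the study's code), a minimal sketch of the 'direct method' arithmetic: an area-weighted average of point balances at stakes, accumulated over years and compared against an independent geodetic estimate. The stake values, band areas, and geodetic figure below are hypothetical.

```python
# Hypothetical sketch of the 'direct method': area-weighted stake balances,
# accumulated and compared with an independent geodetic estimate.
import numpy as np

def direct_annual_balance(point_balances_m_we, band_areas_km2):
    """Area-weighted mean specific balance (m w.e.) for one balance year."""
    b = np.asarray(point_balances_m_we, dtype=float)
    a = np.asarray(band_areas_km2, dtype=float)
    return float(np.sum(b * a) / np.sum(a))

# One hypothetical balance year: snow gain (+) and firn/ice loss (-) at stakes.
annual = direct_annual_balance([1.2, 0.3, -0.8, -2.1], [0.6, 0.8, 0.5, 0.3])

# Hypothetical multi-year comparison with a geodetic (photogrammetric) estimate.
direct_series = np.array([-0.4, -0.7, -0.2, -1.1])   # m w.e. per year, direct method
geodetic_cumulative = -3.1                            # m w.e. over the same period
offset = geodetic_cumulative - direct_series.sum()    # apparent systematic error
print(f"annual: {annual:+.2f} m w.e., cumulative offset vs geodetic: {offset:+.2f} m")
```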
82.
A survey of waters adjacent to this heavily urbanized and industrialized region showed concentrations of copper up to 65 μg l⁻¹, the highest reported to date for estuarine waters, and lead up to 13.9 μg l⁻¹. Correlations between the distributions of dissolved and total metal concentrations in the water column, hydrography, and metal in the sediment were related to benthic studies in this area. Laboratory studies are cited which show the potential for adverse effects on marine animals at these metal concentrations.
83.
84.
We evaluate relative sea level (RSL) trajectories for North Carolina, USA, in the context of tide-gauge measurements and geological sea-level reconstructions spanning the last ~11,000 years. RSL rise was fastest (~7 mm/yr) during the early Holocene and slowed over time with the end of the deglaciation. During the pre-Industrial Common Era (i.e., 0–1800 CE), RSL rise (~0.7 to 1.1 mm/yr) was driven primarily by glacio-isostatic adjustment, though dampened by tectonic uplift along the Cape Fear Arch. Ocean/atmosphere dynamics caused centennial variability of up to ~0.6 mm/yr around the long-term rate. It is extremely likely (probability P = 0.95) that 20th-century RSL rise at Sand Point, NC (2.8 ± 0.5 mm/yr), was faster than during any other century in at least 2,900 years. Projections based on a fusion of process models, statistical models, expert elicitation, and expert assessment indicate that RSL at Wilmington, NC, is very likely (P = 0.90) to rise by 42–132 cm between 2000 and 2100 under the high-emissions RCP 8.5 pathway. Under all emission pathways, 21st-century RSL rise is very likely (P > 0.90) to be faster than during the 20th century. Due to RSL rise, under RCP 8.5 the current '1-in-100 year' flood is expected at Wilmington in ~30 of the 50 years between 2050 and 2100.
85.
Reducing systematic errors by empirically correcting model errors (cited 2 times: 0 self-citations, 2 citations by others)
A methodology for the correction of systematic errors in a simplified atmospheric general-circulation model is proposed. First, a method for estimating initial-tendency model errors is developed, based on a 4-dimensional variational assimilation of a long analysed dataset of observations in a simple quasi-geostrophic baroclinic model. Then, a time-variable potential-vorticity source term is added as a forcing to the same model, in order to parameterize subgrid-scale processes and unrepresented physical phenomena. This forcing term consists of a (large-scale) flow-dependent parametrization of the initial-tendency model error computed by the variational assimilation. The flow dependence is given by an analogue technique that relies on the analysis dataset. Such empirical driving substantially improves the model climatology, reducing its systematic error and improving its high-frequency variability. Low-frequency variability is also more realistic, and the model shows a better reproduction of Euro-Atlantic weather regimes. A link between the large-scale flow and the model error is found only in the Euro-Atlantic sector; other mechanisms are probably the origin of model error in other areas of the globe.
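A minimal sketch of the flow-dependent forcing idea, under stated assumptions rather than the paper's actual implementation: the correction applied at each step is the mean initial-tendency error of the k nearest analogues of the current state in the analysis dataset. The array names, shapes, and toy data are assumptions for illustration.

```python
# Sketch only: flow-dependent forcing from an analogue search over an analysis
# archive, as outlined above. Names, shapes and data are illustrative assumptions.
import numpy as np

def analogue_forcing(state, analysis_states, tendency_errors, k=10):
    """Mean initial-tendency error of the k analysis states closest to `state`.

    state           : (n,)   current model state (e.g. a flattened PV field)
    analysis_states : (m, n) archived (large-scale) analysis states
    tendency_errors : (m, n) initial-tendency errors estimated for each analysis time
    """
    dist = np.linalg.norm(analysis_states - state, axis=1)  # distance to every archived state
    nearest = np.argsort(dist)[:k]                          # indices of the k best analogues
    return tendency_errors[nearest].mean(axis=0)            # flow-dependent error estimate

# Hypothetical usage inside a time-stepping loop:
rng = np.random.default_rng(0)
archive = rng.standard_normal((500, 64))        # stand-in analysis dataset
errors = 0.1 * rng.standard_normal((500, 64))   # stand-in tendency-error estimates
x = rng.standard_normal(64)
forcing = analogue_forcing(x, archive, errors, k=20)
# x_next = model_step(x, dt) + dt * forcing     # hypothetical: correction added as a PV source term
```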
86.
87.
Icefish populations continue to decline. Historical as well as current over-exploitation of stocks, aggravated by climate change, is frequently seen as res…
88.
A 275-km-long transverse Northern Adriatic profile from the mouth of the Po River (Italian Adriatic coast) to the Kvarner region (Croatian coastal island area) was investigated in three successive case studies, in August 2008, 2009, and 2010. The short Po River discharge pulses in August result in the surface advection of riverine water, nutrients, and phytoplankton from the western to the eastern side of the Adriatic. This surface spreading exhibits inter-annual variability depending on the riverine discharge in the preceding period. The Po River discharge pulse in August 2010 in particular resulted in an extraordinary tongue-like advection of riverine water, nutrients, and phytoplankton towards the eastern Adriatic coast. The phenomenon was detected using both satellite imagery and classical oceanographic measurements. In the advected water, toxic dinoflagellates were most abundant in August 2010, when the influence of the Po was greatest.
89.
Open-coast storm surge water levels consist of several components: wind setup due to wind shear at the water surface; wave setup caused by wind-induced waves transferring momentum to the water column; an atmospheric pressure head due to the atmospheric pressure deficit over the spatial extent of the storm system; a Coriolis-forced setup or setdown due to the rotation of the earth acting on the wind-driven alongshore current at the coast; a possible seiche component due to resonance effects initiated by the moving wind system; and, if astronomical tides are present, an astronomical tide component (although the tide is typically considered a forced astronomical event rather than a direct part of the external, wind-driven meteorological component of storm surge). Typically the most important component of a storm surge is the wind setup, especially on the U.S. East Coast and the Gulf of Mexico shorelines. In many approaches to storm surge modeling, a constant-depth approximation is invoked over a limited step size in the computational domain. The use of a constant-depth approximation has received little attention in the literature, although it can be very important to the magnitude of the computed storm surge. The importance of discrete step size to the wind setup component is considered herein with a simple case computation of wind setup on a linear-slope offshore profile. The findings show that the constant-depth approximation to wind setup estimation is biased low (except in extremely shallow water) and can produce large errors if the discrete step size is not sufficiently resolved. Guidance is provided on the error one might encounter for various step sizes on different slopes.
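A minimal numerical sketch (assumed values, not the study's model) of the point made above: integrating the steady wind-setup balance dη/dx = τ / (ρ g (h + η)) shoreward over a linear slope, and comparing a single constant-depth step with a finely resolved integration.

```python
# Sketch with assumed values: steady wind setup over a linear offshore slope,
# integrating d(eta)/dx = tau / (rho * g * (h + eta)) shoreward, to compare a
# single constant-depth step against a finely resolved discretization.
import numpy as np

RHO, G = 1025.0, 9.81            # seawater density (kg/m^3), gravity (m/s^2)

def wind_setup(tau, h_offshore, h_shore, fetch_m, n_steps):
    """Wind setup (m) at the shoreline for a linear depth profile and n_steps segments."""
    x = np.linspace(0.0, fetch_m, n_steps + 1)
    h = np.linspace(h_offshore, h_shore, n_steps + 1)   # linear slope, deep to shallow
    eta = 0.0
    for i in range(n_steps):
        h_mid = 0.5 * (h[i] + h[i + 1])                  # constant depth within this segment
        eta += tau * (x[i + 1] - x[i]) / (RHO * G * (h_mid + eta))
    return eta

tau = 1.5                                                # wind stress, N/m^2 (assumed)
coarse = wind_setup(tau, h_offshore=30.0, h_shore=2.0, fetch_m=50_000.0, n_steps=1)
fine = wind_setup(tau, h_offshore=30.0, h_shore=2.0, fetch_m=50_000.0, n_steps=2000)
print(f"1 step: {coarse:.2f} m   2000 steps: {fine:.2f} m")   # the single step comes out biased low
```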
90.
A procedure is presented for the estimation of extreme values of stationary Gaussian random processes with arbitrary bandwidths. The approach is based on the analytic envelope defined by the Hilbert transform; this envelope is Rayleigh distributed regardless of bandwidth. For experimentally derived data converted into digital form, the Hilbert transform is approximated using algorithms implemented on a digital computer to produce samples of the envelope's time history. Next, the degree of correlation between these envelope samples is taken into account using a method developed from simulation studies of a series of synthetic Gaussian time histories with varying bandwidths. Once this correlation effect has been estimated, standard order-statistics methods are applied to the samples using the Rayleigh probability density function. Examples of applying the procedure to experimentally derived data are presented.
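A minimal sketch of the envelope-based workflow described above, assuming a digitised Gaussian record: form the analytic envelope with the Hilbert transform, estimate the Rayleigh scale, and read off an extreme-value quantile by order statistics. The paper's correlation correction between envelope samples is not reproduced; the effective sample size below is an assumption.

```python
# Sketch of the envelope / order-statistics workflow, using synthetic data.
import numpy as np
from scipy.signal import hilbert
from scipy.stats import rayleigh

rng = np.random.default_rng(1)
x = rng.standard_normal(20_000)                 # stand-in for a digitised Gaussian record

envelope = np.abs(hilbert(x - x.mean()))        # analytic envelope via the Hilbert transform
scale = np.sqrt(np.mean(envelope**2) / 2.0)     # Rayleigh scale (process standard deviation)

# Approximate extreme value as the (1 - 1/N) quantile of the Rayleigh envelope,
# where N is the number of effectively independent envelope samples (assumed here;
# the paper estimates it from the correlation between envelope samples).
n_eff = 2_000
extreme = rayleigh.ppf(1.0 - 1.0 / n_eff, scale=scale)
print(f"Rayleigh scale: {scale:.3f}   estimated extreme: {extreme:.3f}")
```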