941.
Discusses the tragedy of the commons as it applies to fish resources in the North Atlantic, noting that the Atlantic cod has now been so heavily exploited that strong regulation of fisheries is needed to preserve an exploitable stock. The author argues for exclusive economic zones that divide the remaining loopholes among coastal nations, and for individually transferable quotas.
942.
Seismic hazard analysis is based on data and models, both of which are imprecise and uncertain. In particular, the interpretation of historical information into earthquake parameters, e.g. earthquake size and location, yields ambiguous and imprecise data. Models based on probability distributions have been developed to quantify and represent these uncertainties. Nevertheless, the majority of the procedures applied in seismic hazard assessment take no account of these uncertainties, nor do they show the variance of the results. Therefore, a procedure based on Bayesian statistics was developed to estimate return periods for different ground motion intensities (MSK scale).

Bayesian techniques provide a mathematical model for estimating the distribution of random variables in the presence of uncertainties. The developed method estimates the probability distribution of the number of occurrences in a Poisson process described by the parameter λ. The input data are the historical occurrences of intensities at a particular site, represented by a discrete probability distribution for each earthquake. The calculation of these historical occurrences requires careful preparation of all input parameters, i.e. a modelling of their uncertainties. The obtained results show that the variance of the recurrence rate is smaller in regions with higher seismic activity than in less active regions. It can also be demonstrated that long return periods cannot be estimated with confidence, because the time period of observation is too short. This indicates that the long return periods obtained by seismic source methods only reflect the delineated seismic sources and the chosen earthquake size distribution law.
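The Gamma–Poisson conjugate pair gives a minimal illustration of the kind of Bayesian estimate this abstract describes. This is a generic sketch, not the authors' procedure (which starts from discrete intensity-occurrence distributions per historical earthquake); the catalogue numbers and prior parameters below are hypothetical.

```python
import math

def posterior_rate(n_events, years, a0=0.5, b0=0.0):
    """Conjugate Gamma posterior for a Poisson occurrence rate.

    With a Gamma(a0, b0) prior, observing n_events in `years` years of
    record gives a Gamma(a0 + n_events, b0 + years) posterior.
    Returns the posterior mean rate and its standard deviation.
    """
    a, b = a0 + n_events, b0 + years
    mean = a / b
    sd = math.sqrt(a) / b
    return mean, sd

# Hypothetical catalogue: 12 exceedances of a given intensity in 300 years.
rate, sd = posterior_rate(12, 300)
return_period = 1.0 / rate   # mean return period in years
print(rate, sd, return_period)
```

Note how the posterior standard deviation shrinks as the number of observed events grows, which mirrors the abstract's finding that recurrence-rate variance is smaller in more active regions.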
943.
Summary The study of the regime of ozone variations in the huge tropical belt (25° S to 25° N), which are in general very small and zonally nearly symmetric, permits a statistical model to be established for estimating the ozone deviations using Total Ozone Mapping Spectrometer (TOMS) data. The equatorial stratospheric winds at 25 and 50 hPa and the solar flux at 10.7 cm are used as the major predictors, and the linear trend was also estimated. A 10 m/s stratospheric wind change is related to a 1.2% ozone change at the equator, to practically no change in the 8–15° belts, and to up to a 1.4% change with opposite phase over the tropics in spring but nearly zero change in fall. The solar-cycle-related amplitude is about 1.4% per 100 units of 10.7 cm solar flux. The ozone trends are negative: not significant over the equator and about –2% per decade (significant at the 95% level) over the tropics. The latter could have been enforced by the 2 to 4% lower ozone values during 1991–1993, part of which might be related to the effects of the Mt. Pinatubo eruption, but might also be due to the strong QBO. The estimated deviations are verified against reliable observations, and the very good agreement permits applying the model for quantitative quality control of the reported ozone data from previous years. The standard deviation of the difference between observed ozone deviations and those estimated from the model is only 0.9–1.6% for yearly means, which means that instruments used for total ozone observations in the tropical belt should have a systematic error of less than 1%. Cases where the discrepancies between the model and the reported observations at a given station exceed 2–3% for a time interval of 2 or more years should be verified.
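A multiple linear regression against QBO wind, solar flux, and a trend term, as the abstract describes, can be sketched with ordinary least squares. All series below are synthetic stand-ins (the proxy periods, amplitudes, and coefficients are assumptions, not the TOMS fit), so the point is only the shape of the model, not the numbers.

```python
import numpy as np

# Hypothetical monthly series: ozone deviation (%) modelled from
# 50 hPa equatorial wind (m/s), 10.7 cm solar flux, and a linear trend.
rng = np.random.default_rng(0)
n = 240
wind = 20 * np.sin(2 * np.pi * np.arange(n) / 28)         # ~28-month QBO proxy
flux = 130 + 60 * np.sin(2 * np.pi * np.arange(n) / 132)  # ~11-year cycle proxy
trend = np.arange(n) / 120.0                              # time in decades
ozone = 0.12 * wind + 0.014 * flux - 0.2 * trend + rng.normal(0, 0.5, n)

# Design matrix [1, wind, flux, trend]; ordinary least-squares fit.
X = np.column_stack([np.ones(n), wind, flux, trend])
coef, *_ = np.linalg.lstsq(X, ozone, rcond=None)
# coef[1] * 10  -> ozone change per 10 m/s wind change
# coef[2] * 100 -> solar-cycle amplitude per 100 flux units
# coef[3]       -> trend in % per decade
```

Fitting per latitude belt and season, as the abstract does, would simply repeat this fit on each subset of the data.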
944.
Summary We propose an objective method whereby the density of Shannon information associated with the retrieval of a profile of atmospheric variables from satellite-derived infrared radiance measurements may be estimated. The technique is a natural extension of one we previously proposed to estimate the effective data density in a profile. We test the method in a comparison of simulated satellite instruments and show that it does indeed provide an objective summary of the spatial distribution of each instrument's information content. We propose that further extensions of the method be developed to include other, more traditional data sources in a fully three-dimensional scheme. We also note that analogous and compatible methods may be used to diagnose the information content of meteorological analysis and forecast fields relative to the information contained in the covariance, at the appropriate season, of the corresponding climate fields.
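For Gaussian errors, a per-level Shannon information measure reduces to the entropy decrease from the prior to the retrieved variance. The sketch below is a textbook-style illustration under that Gaussian assumption, not the authors' specific estimator, and the profile numbers are hypothetical.

```python
import numpy as np

def info_content_bits(prior_var, post_var):
    """Shannon information gained at each level of a retrieved profile,
    measured as the entropy reduction (in bits) from the prior to the
    posterior Gaussian variance: 0.5 * log2(prior / posterior)."""
    prior_var = np.asarray(prior_var, dtype=float)
    post_var = np.asarray(post_var, dtype=float)
    return 0.5 * np.log2(prior_var / post_var)

# Hypothetical 5-level temperature retrieval: this instrument narrows the
# prior variance most in the middle of the profile.
prior = np.array([4.0, 4.0, 4.0, 4.0, 4.0])
post = np.array([3.5, 2.0, 1.0, 2.0, 3.8])
bits = info_content_bits(prior, post)   # per-level information density
```

Comparing the `bits` profiles of two simulated instruments gives exactly the kind of spatial summary of information content the abstract refers to.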
945.
Summary First, we review the present status of diabatic initialization used for numerical weather prediction and conclude that its deficiency mostly stems from the shortcomings in evaluating diabatic heating rates accurately, particularly the release of latent heat by cumulus convection. This indicates the need to adjust the initial conditions for physical processes, and Krishnamurti and his colleagues introduced the concept of physical initialization in 1984. Since cumulus convection is the most sensitive to input data among the many physical processes, the adjustment of atmospheric input data to a prediction model to produce desired initial precipitation rates is referred to as cumulus initialization.

In this article we describe a general approach to diabatic initialization with special emphasis on cumulus initialization. We present the results of forecasting experiments with a version of the NCAR Community Climate Model (CCM) to demonstrate the efficacy of a cumulus initialization procedure in ameliorating the spinup problem of precipitation. Finally, we discuss application of the present methodology of cumulus initialization to a stability-dependent mass-flux cumulus parameterization of CCM2, to pave the way to a complete diabatic normal mode initialization package for CCM2. Note that the present cumulus initialization scheme can be used to assimilate into the atmospheric analysis of the tropics the precipitation rates estimated from satellite radiometric imagery data.

The National Center for Atmospheric Research is sponsored by the National Science Foundation.
946.
Where the Holyoke flood-basalt flow in the Mesozoic Hartford Basin in Connecticut is thick and contains coarse-grained, horizontal segregation sheets in its central part, the lower part of the flow is strongly depleted in incompatible elements; where the flow is thin and contains no segregation sheets, it is homogeneous throughout. This chemical variation can be explained only through compaction of the partly crystallized basalt. The composition of the segregation sheets shows that they separated from the basalt following only 33% crystallization. The segregation sheets, however, are clearly intrusive into the basalt, which must therefore have already formed a crystal mush with finite strength at this low degree of crystallinity. The incompatible element concentrations indicate that the partly crystallized basalt underwent as much as 28% compaction in the lowest 60 m of the flow. Between 60 and 130 m above the base of the flow, the crystal mush became dilated, and eventually ruptured with formation of the segregation sheets. No segregation sheet has a composition indicating separation after more than 33% crystallization of the basalt. This is interpreted to indicate that compaction ceased at this stage because of the increasing strength of the mush and the increasing density of the fractionating interstitial liquid. KEY WORDS: crystal-mush compaction; segregation sheets; flood basalt; tholeiite; Connecticut. *e-mail: philpotts{at}geol.uconn.edu
947.
Most studies of sandstone provenance involve modal analysis of framework grains using techniques that exclude the fine-grained breakdown products of labile mineral grains and rock fragments, usually termed secondary matrix or pseudomatrix. However, the data presented here demonstrate that, when the proportion of pseudomatrix in a sandstone exceeds 10%, standard petrographic analysis can lead to incorrect provenance interpretation. Petrographic schemes for provenance analysis such as QFL and QFR should not therefore be applied to sandstones containing more than 10% secondary matrix. Pseudomatrix is commonly abundant in sandstones, and this is therefore a problem for provenance analysis. The difficulty can be alleviated by the use of whole-rock chemistry in addition to petrographic analysis. Combination of chemical and point-count data permits the construction of normative compositions that approximate original framework grain compositions. Provenance analysis is also complicated in many cases by fundamental compositional alteration during weathering and transport. Many sandstones, particularly shallow marine deposits, have undergone vigorous reworking, which may destroy unstable mineral grains and rock fragments. In such cases it may not be possible to retrieve provenance information by either petrographic or chemical means. Because of this, pseudomatrix-rich sandstones should be routinely included in chemical-petrological provenance analysis. Because of the many factors, both pre- and post-depositional, that operate to increase the compositional maturity of sandstones, petrologic studies must include a complete inventory of matrix proportions, grain size and sorting parameters, and an assessment of depositional setting.
948.
Hubble Space Telescope observations by Savage, Cardelli and Sofia (1992) and Cardelli et al. (1993) have led to improved gas phase abundances for many elements in the diffuse cloud towards the star ζ Oph. Most remarkably, it was found that oxygen is much more strongly depleted than previous Copernicus observations indicated. As a consequence, chemical models of the ζ Oph cloud are severely affected by the drop in the observed oxygen abundance by almost a factor of two; some previous model calculations for the ζ Oph cloud failed to reproduce, even approximately, the observed high CO column density. The model calculations for the ζ Oph cloud developed by Wagenblast (1992), in which the abundances of all observed neutral molecules could be reproduced, have been revised, and it is found that oxygen and nitrogen hydrides are required to be formed efficiently on the surfaces of grains; further, there are indications of a high cosmic ray ionization rate of the order of 10⁻¹⁶ s⁻¹.
949.
The GEM (Galactic Emission Mapping) project is an international collaboration established with the aim of surveying the full sky at long wavelengths with a multi-frequency radio telescope. A total of 745 hours of observation at 408 MHz were completed from an equatorial site in Colombia. The observations cover the celestial band 0h < α < 24h in right ascension and −24°22′ < δ < +35°37′ in declination. Preliminary results of this partial survey will be discussed. A review of the instrumental setup and a 10° resolution sky map at 408 MHz are presented. Presented by S. Torres at the UN/ESA Workshop on Basic Space Sciences: From Small Telescopes to Space Missions, Colombo, Sri Lanka, 11–14 January 1996.
950.
This paper is designed to draw attention to the fact that the effect of focusing of solar energetic particles is always essential compared with scattering, no matter how small the value of the mean free path may be. That is why an ordinary (focusing-free) diffusion approach cannot be applied to solar cosmic ray transport. In the case of high-energy solar particles, focused diffusion is demonstrated to lead to a power-law decay of energetic particle intensity, much like ordinary diffusion. However, the power-law index of the decay is renormalized by the focusing.
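The renormalized index the abstract refers to is what one would measure from the late-time decay of an intensity profile. As a generic illustration (not the paper's transport calculation; the onset times, normalization, and index value are all assumed), a log-log least-squares fit recovers the decay index:

```python
import numpy as np

# Ordinary diffusion predicts a late-time decay I(t) ~ t**-1.5; focusing
# renormalizes this index. Fit the index from a synthetic intensity profile.
t = np.linspace(2.0, 50.0, 200)        # hours after onset (hypothetical)
index_true = -2.1                      # assumed focusing-modified index
intensity = 5e3 * t ** index_true      # noise-free power-law profile

# Straight line in log-log coordinates: slope is the decay index.
slope, intercept = np.polyfit(np.log(t), np.log(intensity), 1)
```

With real data, the departure of the fitted slope from −1.5 would quantify how strongly focusing has renormalized the ordinary-diffusion decay.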