991.
Time-relative positioning uses observations taken at two different epochs and stations with a single global positioning system (GPS) receiver to determine the position of an unknown station with respect to a known station. The limitation of this method is the degradation of positioning accuracy over time due to the temporal variation of GPS errors (ionospheric delay, satellite clock corrections, satellite ephemerides, and tropospheric delay). The impact of these errors is significantly reduced by following the one-way move from the known to the unknown station with a return move back to the known station. A loop misclosure is computed from the coordinates obtained at the known station at the beginning and at the end of the loop, and is used to correct the coordinates of the unknown station. The field tests presented in this paper show that, using the loop misclosure corrections, time-relative positioning accuracy can be improved by about 60% with single-frequency data and by about 40% with dual-frequency data. For a 4-min processing interval (an 8-min loop) and a 95% probability level, errors remain under 20 cm for the horizontal components and 36 cm for the vertical component with single-frequency data, and under 11 cm for the horizontal components and 29 cm for the vertical component with dual-frequency data.
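The misclosure correction described in this abstract can be sketched as follows. How the misclosure is apportioned to the outbound leg is not stated in the abstract; the linear-in-time weighting below is an illustrative assumption consistent with slowly drifting GPS errors.

```python
import numpy as np

def correct_with_misclosure(pos_unknown, misclosure, t_out, t_loop):
    """Correct a time-relative position using the loop misclosure.

    Assumes GPS error growth is roughly linear in time, so the misclosure
    observed after the full loop is apportioned to the outbound leg by the
    ratio of elapsed times (a hypothetical weighting, not specified in the
    paper).
    """
    frac = t_out / t_loop
    return np.asarray(pos_unknown) - frac * np.asarray(misclosure)

# Example: 4-min outbound leg within an 8-min loop (coordinates in metres)
pos = correct_with_misclosure([10.00, 5.00, 2.00], [0.12, -0.08, 0.30], 4.0, 8.0)
```

Half of the loop misclosure is subtracted from the unknown-station coordinates, since the outbound leg accumulated roughly half of the loop's error drift under the linear assumption.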
992.
Introduction: Earthquake magnitude is the most common measure of an earthquake's size, and is one of the basic parameters of an earthquake. The three most familiar magnitude scales are ML (local magnitude), MS (surface-wave magnitude) and mB/mb (body-wave magnitude). Richter (1935) introduced ML when studying earthquakes in Southern California. In 1945, Gutenberg (1945a) put forward the surface-wave magnitude scale to determine earthquake magnitude (MS) using surface waves (20 s) of s…
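Richter's ML is defined from the peak amplitude on a Wood-Anderson seismogram plus a distance correction −log A0. As a minimal sketch, the widely used Hutton and Boore (1987) form of −log A0 for Southern California is shown below; this relation is an external reference, not something given in the abstract itself.

```python
import math

def ml_hutton_boore(amp_mm, r_km):
    """Local magnitude via the Hutton & Boore (1987) -log A0 relation for
    Southern California: amp_mm is the peak amplitude (mm) on a synthesized
    Wood-Anderson seismogram, r_km the hypocentral distance (km)."""
    return (math.log10(amp_mm)
            + 1.110 * math.log10(r_km / 100.0)
            + 0.00189 * (r_km - 100.0)
            + 3.0)
```

The relation preserves Richter's original anchor: a 1 mm amplitude at 100 km distance gives ML = 3.0, and the same amplitude at greater distance yields a larger magnitude.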
993.
Upscaling is a major issue regarding the mechanical and transport properties of rocks. This paper examines three issues relative to upscaling. The first is a brief overview of Effective Medium Theory (EMT), which is a key tool for predicting average rock properties at a macroscopic scale in the case of a statistically homogeneous medium. EMT is of particular interest for the calculation of elastic properties. As discussed in this paper, EMT can thus provide a possible way to perform upscaling, although it is by no means the only one, and in particular it is irrelevant if the medium is not statistically homogeneous. This last circumstance is examined in the second part of the paper. We focus on the example of constructing a hydrocarbon reservoir model. Such a construction is a required step in making reasonable predictions for oil production. Taking into account rock permeability, lithological units and various structural discontinuities at different scales is part of this construction. The result is that stochastic reservoir models are built that rely on various numerical upscaling methods. These methods are reviewed; they provide techniques which make it possible to deal with upscaling on a general basis. Finally, a last case in which upscaling is trivial is considered in the third part of the paper: the fractal case. Fractal models have become popular precisely because they are free of the assumption of statistical homogeneity and yet do not involve numerical methods. It is suggested that using a physical criterion to discriminate whether fractality is a dream or reality would be more satisfactory than relying on a limited data set alone.
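As one concrete instance of the effective-medium averaging discussed above, the Voigt and Reuss bounds and their Hill average give the simplest estimate of an effective modulus for a statistically homogeneous two-phase rock. The moduli in the example are illustrative values (roughly quartz vs. a soft clay, in GPa), not taken from the paper.

```python
def voigt_reuss_hill(f, m1, m2):
    """Voigt (iso-strain, arithmetic) and Reuss (iso-stress, harmonic)
    bounds on the effective modulus of a two-phase mixture, plus their
    Hill average: f is the volume fraction of phase 1 (modulus m1),
    the remainder is phase 2 (modulus m2)."""
    voigt = f * m1 + (1.0 - f) * m2
    reuss = 1.0 / (f / m1 + (1.0 - f) / m2)
    hill = 0.5 * (voigt + reuss)
    return voigt, reuss, hill

# 50/50 mixture of a stiff (36 GPa) and a soft (4 GPa) phase
v, r, h = voigt_reuss_hill(0.5, 36.0, 4.0)
```

Any physically admissible effective modulus must lie between the Reuss and Voigt bounds; more elaborate EMT schemes (self-consistent, differential) refine the estimate within that bracket.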
994.
995.
Strong-motion networks have been operating in the Caribbean region since the 1970s; however, until the mid-1990s only a few analogue stations were operational and the quantity of data recorded was very low. Since the mid-1990s, digital accelerometric networks have been established on islands within the region. At present there are thought to be about 160 stations operating in this region, with a handful on Cuba, 65 on the French Antilles (mainly Guadeloupe and Martinique), eight on Jamaica, 78 on Puerto Rico (plus others on adjacent islands) and four on Trinidad. After briefly summarising the available data from the Caribbean islands, this article is mainly concerned with analysing the data recorded by the networks operating on the French Antilles in terms of their distribution with respect to magnitude, source-to-site distance, focal depth and event type; site effects at certain stations; and their predictability by ground motion estimation equations developed using data from different regions of the world. More than 300 good-quality triaxial acceleration time-histories have been recorded on Guadeloupe and Martinique at a large number of stations from earthquakes with magnitudes larger than 4.8; however, most of the records are from considerable source-to-site distances. From the data available it is found that many of the commonly used ground motion estimation equations for shallow crustal earthquakes poorly estimate the observed ground motions on the two islands; ground motions on Guadeloupe and Martinique have smaller amplitudes and are more variable than expected. This difference could be due to regional dependence of ground motions because of, for example, differing tectonics or crustal structures, or because the ground motions recorded so far are, in general, from smaller earthquakes and greater distances than the range of applicability of the investigated equations.
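The comparison the article performs, observed motions versus ground motion estimation equations, amounts to computing log residuals. A minimal sketch is below; the equation form and its coefficients are illustrative placeholders, not those of any published equation or of this study.

```python
import math

def ln_residual(pga_obs_g, mag, r_km, a=-3.5, b=0.9, c=-1.2):
    """Log residual between an observed peak ground acceleration (in g)
    and a generic ground motion estimation equation of the hypothetical
    form ln(PGA) = a + b*M + c*ln(R). A negative mean residual over a
    data set means observations are smaller than predicted, as reported
    for Guadeloupe and Martinique."""
    ln_pred = a + b * mag + c * math.log(r_km)
    return math.log(pga_obs_g) - ln_pred

resid = ln_residual(0.01, 5.0, 100.0)  # one illustrative record
```

In practice the mean and standard deviation of such residuals over all records quantify, respectively, the bias and the extra variability relative to the equation's published sigma.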
996.
Two soil CO2 efflux surveys were carried out in September 1999 and June 2002 to study the spatial distribution of diffuse CO2 degassing and estimate the total CO2 output from the Showa-Shinzan volcanic dome, Japan. Seventy-six and 81 measurements of CO2 efflux were performed in 1999 and 2002, respectively, covering most of Showa-Shinzan volcano. Soil CO2 efflux data showed a wide range of values, up to 552 g m⁻² d⁻¹. Carbon isotope signatures of the soil CO2 ranged from −0.9‰ to −30.9‰, suggesting mixing between different carbon reservoirs. Most of the study area showed CO2 efflux background values during the 1999 and 2002 surveys (B = 8.2 and 4.4 g m⁻² d⁻¹, respectively). The spatial distribution of CO2 efflux anomalies for both surveys showed a good correlation with soil temperature, indicating a similar origin for the extensive soil degassing generated by condensation processes and fluids discharged by the fumarolic system of Showa-Shinzan. The total diffuse CO2 output of Showa-Shinzan was estimated to be about 14.0–15.6 t d⁻¹ of CO2 for an area of 0.53 km².
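The total-output figure follows from scaling a mean efflux over the surveyed area, with unit conversions from g m⁻² d⁻¹ and km² to tonnes per day. The mean flux used in the example is back-calculated from the reported totals, not stated in the abstract.

```python
def total_co2_output_t_d(mean_flux_g_m2_d, area_km2):
    """Scale a mean soil CO2 efflux (g m^-2 d^-1) over the surveyed area
    to a total output in tonnes of CO2 per day."""
    area_m2 = area_km2 * 1.0e6          # km^2 -> m^2
    g_per_day = mean_flux_g_m2_d * area_m2
    return g_per_day / 1.0e6            # g -> tonnes

# A mean flux of ~26.4 g m^-2 d^-1 over the 0.53 km^2 survey area
# reproduces the ~14 t d^-1 lower estimate reported for Showa-Shinzan.
total = total_co2_output_t_d(26.4, 0.53)
```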
997.
In this paper, we first discuss the controversial result of the work by Cabanes et al. (Science 294:840–842, 2001), who suggested that the rate of past-century sea level rise may have been overestimated, considering the limited and heterogeneous locations of historical tide gauges and the high regional variability of thermal expansion, which was supposed to dominate the observed sea level. If correct, this conclusion would have solved the problem raised by the IPCC third assessment report (Church et al., Cambridge University Press, Cambridge, pp 881, 2001), namely, the factor-of-two difference between the 20th-century observed sea level rise and the computed climatic contributions. However, recent investigations based on new ocean temperature data sets indicate that thermal expansion only explains part (about 0.4 mm/year) of the 1.8 mm/year observed sea level rise of the past few decades. In fact, the Cabanes et al. conclusion was incorrect due to contamination by abnormally high ocean temperature data in the Gulf Stream area, which led to an overestimate of thermal expansion in this region. In this paper, we also estimate thermal expansion over the last decade (1993–2003), using a new ocean temperature and salinity database. We compare our result with three other estimates, two based on global gridded data sets and one based on an approach similar to that developed here. It is found that the mean rate of thermosteric sea level rise over the past decade is 1.5±0.3 mm/year, i.e. 50% of the 3 mm/year observed by satellite altimetry. For both time spans, past few decades and last decade, a contribution of 1.4 mm/year is not explained by thermal expansion and thus needs to be of water-mass origin. Direct estimates of land ice melt for the recent years account for about 1 mm/year of sea level rise. Thus, at least for the last decade, we have moved closer to explaining the observed rate of sea level rise than the IPCC third assessment report.
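The budget arithmetic behind these numbers can be laid out explicitly; the rates below are the ones quoted in the abstract.

```python
def mass_contribution(observed_mm_yr, thermosteric_mm_yr):
    """Sea-level budget residual: the part of the observed rise not
    explained by thermal expansion, attributed to added water mass
    (e.g. land ice melt)."""
    return observed_mm_yr - thermosteric_mm_yr

# Past few decades: 1.8 mm/yr observed, ~0.4 mm/yr thermosteric
recent_decades = mass_contribution(1.8, 0.4)   # ~1.4 mm/yr of mass origin
# 1993-2003 (satellite altimetry): 3.0 mm/yr observed, 1.5 mm/yr thermosteric
last_decade = mass_contribution(3.0, 1.5)      # ~1.5 mm/yr of mass origin
```

Both residuals are close to the ~1 mm/yr of land-ice melt cited from direct estimates, which is why the paper argues the last-decade budget is nearly closed.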
998.
Climate Warming and Water Management Adaptation for California
The ability of California's water supply system to adapt to long-term climatic and demographic changes is examined. Two climate-warming scenarios and a historical climate scenario are examined with population and land-use estimates for the year 2100, using a statewide economic-engineering optimization model of water supply management. Methodologically, the results of this analysis indicate that, for long-term climate change studies of complex systems, there is considerable value in including other major changes expected over a long time-frame (such as population changes), in allowing the system to adapt to changes in conditions (a common feature of human societies), and in representing the system in sufficient hydrologic and operational detail and breadth to allow significant adaptation. While the policy results of this study are preliminary, they point to a considerable engineering and economic ability of complex, diverse, and inter-tied systems to adapt to significant changes in climate and population. More specifically, California's water supply system appears physically capable of adapting to significant changes in climate and population, albeit at a significant cost. Such adaptation would entail large changes in the operation of California's large groundwater storage capacity, significant transfers of water among water users, and some adoption of new technologies.
999.
Institutional barriers and bridges affecting adaptation to local climate change impacts by small rural municipalities and Conservation Authorities (CAs, the watershed agencies) in Eastern Ontario (Canada) are examined, and elements of a community-based adaptation strategy related to water infrastructure are proposed as a case study in community adaptation to climate change. No general water scarcity is expected for the region, even under unusually dry weather scenarios; localized quantity and quality problems are likely to occur, especially in groundwater recharge areas. Municipalities can rely on some existing institutions to build an effective adaptation strategy based on a watershed/regional perspective, on their credibility, and on their expertise. Windows of opportunity, or framing issues, are offered at the provincial level (the most relevant level in a federal state) by municipal emergency-plan requirements and pending watershed source-water protection legislation; voluntary and soon-to-be-mandated climate change mitigation programs at the federal level are others.
1000.
The mechanisms involved in glacial inception are still poorly constrained due to a lack of high-resolution, cross-dated climate records at various locations. Using air isotopic measurements in the recently drilled NorthGRIP ice core, we show that no evidence exists for stratigraphic disturbance of the climate record of the last glacial inception (∼123–100 kyr BP) encompassing Dansgaard–Oeschger events (DO) 25, 24 and 23, although we lack sufficient resolution to completely rule out disturbance over DO 25. We quantify the rapid surface temperature variability over DO 23 and 24, with associated warmings of 10±2.5 and 16±2.5°C, amplitudes which mimic those observed in full glacial conditions. We use records of δ18O of O2 to propose a common timescale for the NorthGRIP and the Antarctic Vostok ice cores, with a maximum uncertainty of 2,500 years, and to examine the interhemispheric sequence of events over this period. After a synchronous North–South temperature decrease, the onset of rapid events is triggered in the North through DO 25. As with later events, DO 24 and 23 have a clear Antarctic counterpart, which does not seem to be the case for the very first abrupt warming (DO 25). This information, when added to intermediate levels of CO2 and the absence of clear ice rafting associated with DO 25, highlights the uniqueness of this first event, while DO 24 and 23 appear similar to typical full glacial DO events.