A total of 1235 results matched the query.
121.
122.
In this paper, the infeasibility of producing “objective” probabilistic climate change scenarios is discussed. Since the “true” probabilities of the different scenarios and temperature changes cannot be known, the objective must be to find the probabilities that are most consistent with our state of knowledge and expert judgment. Subjective information therefore plays, and should play, a crucial role. A new methodology, based on the Principle of Maximum Entropy, is proposed for constructing probabilistic climate change scenarios when only partial information is available. The aim is to produce information relevant to decision-making according to different agents’ judgments and subjective beliefs. The resulting estimates have desirable properties: they are the least biased estimates possible given the available information; they maximize the uncertainty (entropy) subject to the partial information provided; and the maximum entropy distribution assigns a positive probability to every event not excluded by that information, so no possibility is ignored. The probabilities obtained in this manner are the best predictions possible with the available state of knowledge and subjective information. This methodology also allows reckless and cautious positions regarding the climate change threat to be distinguished.
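The maximum-entropy construction summarized above can be illustrated with a short, hedged sketch (not the authors' implementation): given a discrete set of warming scenarios and a single piece of partial information, here an assumed expected warming, the least-biased distribution has exponential form and is found by solving for one Lagrange multiplier.

```python
# A minimal sketch of the maximum-entropy construction described above, not the
# authors' implementation. Scenario values and the mean-warming constraint are
# illustrative assumptions.
import numpy as np
from scipy.optimize import brentq

# Hypothetical discrete warming scenarios (deg C) and one piece of partial
# information: an expert-judged expected warming of 2.5 deg C.
scenarios = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
expected_warming = 2.5

def maxent_probs(x, target_mean):
    """Maximum-entropy distribution on x subject to E[x] = target_mean.

    The solution has exponential-family form p_i ~ exp(lam * x_i); lam is the
    Lagrange multiplier found by matching the constrained mean.
    """
    def mean_given_lam(lam):
        w = np.exp(lam * (x - x.mean()))   # centre x for numerical stability
        p = w / w.sum()
        return p @ x - target_mean

    lam = brentq(mean_given_lam, -50.0, 50.0)
    w = np.exp(lam * (x - x.mean()))
    return w / w.sum()

p = maxent_probs(scenarios, expected_warming)
print(dict(zip(scenarios, p.round(3))))
# Every scenario keeps a positive probability; only the stated constraint biases them.
```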
123.
We analyze the ground motion time histories due to the local seismicity near the Itoiz reservoir to estimate the near-source, surface 3D displacement gradients and dynamic deformations. The seismic data were obtained by a semipermanent broadband and accelerometric network located at surface and underground sites. The dynamic deformation field was calculated by two different methodologies: first, by the seismo-geodetic method using data from a three-station microarray located close to the dam, and second, by single-station estimates of the displacement gradients. The dynamic deformations obtained from both methods were compared and analyzed in the context of the local free-field effects. The shallow 1D velocity structure was estimated from the seismic data by modeling the body-wave travel times. The time histories obtained from both methods are quite similar in the time window of the body-wave arrivals. The strain misfits between methods vary from 1.4 to 35.0 %, and the rotational misfits vary from 2.5 to 36.0 %. Amplitudes of the displacement gradients range from 10⁻⁸ to 10⁻⁷ strain. From these results, a new scaling analysis by numerical modeling is proposed in order to estimate the peak dynamic deformations for different magnitudes, up to the expected maximum Mw in the region (M5.5). Peak dynamic deformations due to local Mw 5.5 earthquakes would reach amplitudes of 10⁻⁵ strain and 10⁻³ radians at the Itoiz dam. The single-station method proves to be an adequate option for the analysis of local seismicity where few three-component stations are available. The results obtained here could help extend the applicability of these methodologies to other sites of engineering interest.
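As a hedged illustration of the array (seismo-geodetic) approach mentioned above, the sketch below fits a linear displacement field to three stations by least squares and splits the resulting gradient tensor into strain and rotation; the station geometry and displacement values are invented, and this is not the authors' code.

```python
# Hedged sketch: horizontal displacement gradients from a three-station group.
import numpy as np

# East/North station coordinates relative to a reference station (m) -- made up
coords = np.array([[0.0, 0.0],
                   [120.0, 15.0],
                   [40.0, 95.0]])
# Horizontal displacements (ue, un) at the same time sample (m) -- made up
disp = np.array([[1.2e-6, -0.4e-6],
                 [1.5e-6, -0.1e-6],
                 [1.1e-6, -0.6e-6]])

# Fit u(x) ~ u0 + G @ dx by least squares; columns of A are [1, de, dn].
A = np.column_stack([np.ones(len(coords)), coords])
coef_e, *_ = np.linalg.lstsq(A, disp[:, 0], rcond=None)
coef_n, *_ = np.linalg.lstsq(A, disp[:, 1], rcond=None)
G = np.array([coef_e[1:], coef_n[1:]])   # 2x2 displacement-gradient tensor

strain = 0.5 * (G + G.T)                 # symmetric part: strain tensor
rotation = 0.5 * (G[1, 0] - G[0, 1])     # antisymmetric part: vertical-axis rotation
print(strain, rotation)
```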
124.
The output of several multi-century simulations with a coupled ocean–atmosphere general circulation model is examined with respect to the variability of global storm activity in winter on time scales of decades and longer. The frequency of maximum wind speed events within a grid box, using the lower limits of 8 and 10 Bft on the Beaufort wind speed scale as thresholds, is taken as the characteristic parameter. Two historical climate runs with time-dependent forcing of the last five centuries, one control simulation, and three climate change experiments are considered. The storm frequency shows no trend until recently. Global maps for the industrially influenced period hardly differ from pre-industrial maps, even though significant temperature anomalies temporarily emerge in the historical runs. Two indicators describing the frequency and the regional shift of storm activity are determined. In historical times they are decoupled from temperature. Variations in solar and volcanic forcing in the historical simulations, as well as in greenhouse gas concentrations for the industrially influenced period, are not related to variations in storm activity. Likewise, anomalous temperature regimes such as the Late Maunder Minimum are not associated with systematic storm conditions. In the climate change experiments, a poleward shift of storm activity is found in all three storm track regions. Over the North Atlantic and Southern Ocean, storm activity increases, while it decreases over the Pacific Ocean. In contrast to the historical runs, and with the exception of the North Pacific storm frequency index, the storm indices parallel the development of temperature, exceeding the 2σ range of pre-industrial variations in the early twenty-first century.
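The storm-frequency metric described above amounts to counting, per grid box, the maximum-wind events that reach a Beaufort threshold. A minimal sketch follows; the array dimensions and the conversion of 8 and 10 Bft to m/s limits are assumptions, not values taken from the paper.

```python
# Illustration only: per-grid-box counts of winter wind maxima reaching a
# Beaufort threshold. Data are random; shapes and thresholds are assumptions.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical daily maximum wind speed (m/s): (days, lat, lon)
wind_max = rng.gamma(shape=4.0, scale=3.0, size=(90, 96, 192))

BFT8 = 17.2   # assumed lower limit of Beaufort 8 (m/s)
BFT10 = 24.5  # assumed lower limit of Beaufort 10 (m/s)

# Storm frequency per grid box for one winter season
freq_bft8 = (wind_max >= BFT8).sum(axis=0)
freq_bft10 = (wind_max >= BFT10).sum(axis=0)
print(freq_bft8.shape, int(freq_bft8.max()), int(freq_bft10.max()))
```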
125.
We present the first climate prediction of the coming decade made with multiple models, initialized with prior observations. This prediction arises from an international activity to exchange decadal predictions in near real-time, in order to assess differences and similarities, provide a consensus view to prevent over-confidence in forecasts from any single model, and establish current collective capability. We stress that the forecast is experimental, since the skill of the multi-model system is as yet unknown. Nevertheless, the forecast systems used here are based on models that have undergone rigorous evaluation and individually have been evaluated for forecast skill. Moreover, it is important to publish forecasts to enable open evaluation, and to provide a focus on climate change in the coming decade. Initialized forecasts of the year 2011 agree well with observations, with a pattern correlation of 0.62 compared to 0.31 for uninitialized projections. In particular, the forecast correctly predicted La Niña in the Pacific, and warm conditions in the north Atlantic and USA. A similar pattern is predicted for 2012 but with a weaker La Niña. Indices of Atlantic multi-decadal variability and Pacific decadal variability show no signal beyond climatology after 2015, while temperature in the Niño3 region is predicted to warm slightly by about 0.5 °C over the coming decade. However, uncertainties are large for individual years and initialization has little impact beyond the first 4 years in most regions. Relative to uninitialized forecasts, initialized forecasts are significantly warmer in the north Atlantic sub-polar gyre and cooler in the north Pacific throughout the decade. They are also significantly cooler in the global average and over most land and ocean regions out to several years ahead. However, in the absence of volcanic eruptions, global temperature is predicted to continue to rise, with each year from 2013 onwards having a 50 % chance of exceeding the current observed record. Verification of these forecasts will provide an important opportunity to test the performance of models and our understanding and knowledge of the drivers of climate change.
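The quoted pattern correlations (0.62 initialized vs. 0.31 uninitialized) can be computed as a centred, area-weighted pattern correlation between forecast and observed anomaly fields. A hedged sketch with fabricated fields is shown below; it is not the forecast systems' verification code.

```python
# Hedged sketch of a centred, cos(latitude)-weighted pattern correlation.
import numpy as np

def pattern_correlation(forecast, observed, lat):
    """Centred pattern correlation of two (lat, lon) anomaly fields,
    weighted by cos(latitude)."""
    w = np.cos(np.deg2rad(lat))[:, None] * np.ones_like(forecast)
    w = w / w.sum()
    f = forecast - (w * forecast).sum()
    o = observed - (w * observed).sum()
    cov = (w * f * o).sum()
    return cov / np.sqrt((w * f**2).sum() * (w * o**2).sum())

# Fabricated anomaly fields on a 2-degree grid
lat = np.linspace(-89, 89, 90)
rng = np.random.default_rng(1)
obs = rng.normal(size=(90, 180))
fcst = 0.6 * obs + 0.8 * rng.normal(size=(90, 180))  # forecast sharing part of the signal
print(round(pattern_correlation(fcst, obs, lat), 2))
```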
126.
Bottom-up and top-down models are used to support climate policies, to identify the options required to meet GHG abatement targets and to evaluate their economic impact. Some studies have shown that the GHG mitigation options provided by economic top-down and technological bottom-up models tend to vary. One reason for this is that these models tend to use different baseline scenarios. The bottom-up TIMES_PT and the top-down computable general equilibrium GEM-E3_PT models are examined using a common baseline scenario to calibrate them, and the extent to which their mitigation options differ, and the relevance of these differences to domestic policy making, are assessed. Three low-carbon scenarios for Portugal until 2050 are generated, each with different GHG reduction targets. Both models suggest similar mitigation options and locate the largest mitigation potential in energy supply. However, they suggest different mitigation options for the end-use sectors: GEM-E3_PT focuses more on energy efficiency, while TIMES_PT relies on decreasing carbon intensity through a shift to electricity. Although the role of a common baseline scenario cannot be ignored, the models’ inherent characteristics are the main factor behind the different outcomes, thereby highlighting different mitigation options.

Policy relevance

The relevance of modelling tools used to support the design of domestic climate policies is assessed by evaluating the mitigation options suggested by a bottom-up and a top-down model. The different outcomes of each model are significant for climate policy design, since each suggests different mitigation options, such as end-use energy efficiency and the promotion of low-carbon technologies. Policy makers should carefully select the modelling tool used to support their policies. The specific modelling structure of each model makes it more appropriate for addressing certain policy questions than others. Using both modelling approaches for policy support can therefore bring added value and result in more robust climate policy design. Although the results are specific to Portugal, the insights provided by the analysis of both models can be extended to, and used in, the climate policy decisions of other countries.
127.
Maconellicoccus hirsutus (Green) (Hemiptera: Pseudococcidae) is an important pest in many countries and is responsible for considerable economic losses. Although it is not currently present in Chile, the chance of its accidental introduction rises as the list of infested countries has grown in recent years. In addition, climate change projections indicate that a larger region would become suitable habitat for this pest, allowing it to persist over time and colonize a larger proportion of the Chilean territory. In this study, the geographic distribution and the number of generations this mealybug would develop in Chile were determined under current temperatures and under two projected climatic scenarios. Cumulative degree-days were calculated for the current and future scenarios using a lower temperature threshold of 14.5 °C, with 624.5 degree-days as the thermal requirement for the species to complete one generation. The results show that under current climate conditions M. hirsutus could develop up to three generations in the north of the country (i.e. 18° South) and one generation in the region near 37° South. Under the future scenarios the pest could develop up to five generations in the north and one generation around 42° South. Present climate conditions in Chile would thus allow the establishment of the pink hibiscus mealybug if the pest enters the country. Climate change would allow the potentially invaded area to expand southward and would promote the development of more generations per year of the mealybug in the studied territory.
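The degree-day bookkeeping behind these generation counts can be sketched as follows, using the 14.5 °C threshold and 624.5 degree-days per generation quoted above; the daily temperature series and the simple-average degree-day method are illustrative assumptions, not the study's data or model.

```python
# Minimal sketch of annual generation counts from accumulated degree-days.
import numpy as np

LOWER_THRESHOLD = 14.5      # °C, developmental threshold quoted in the abstract
DD_PER_GENERATION = 624.5   # degree-days to complete one generation (abstract)

def generations_per_year(t_min, t_max):
    """Annual generations from daily min/max temperatures using the simple
    average method: DD = max(0, (Tmin + Tmax)/2 - threshold)."""
    t_mean = (np.asarray(t_min) + np.asarray(t_max)) / 2.0
    daily_dd = np.clip(t_mean - LOWER_THRESHOLD, 0.0, None)
    return daily_dd.sum() / DD_PER_GENERATION

# Hypothetical year of daily temperatures for a warm northern site
days = np.arange(365)
t_max = 25 + 6 * np.sin(2 * np.pi * (days - 30) / 365)
t_min = t_max - 8
print(round(generations_per_year(t_min, t_max), 1))
```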
128.
A biomonitoring survey using the moss species Hypnum cupressiforme Hedw. was performed in the north of Navarra (Spain) in 2006. The levels of V, Cr, Mn, Ni, Cu, Zn, As, Zr, Cd, Hg and Pb, and the total nitrogen content, were determined in the samples by means of ICP-MS, CV-AA, and the Kjeldahl method. PCA showed a differentiation between lithogenic (V, Cr, Mn, Ni, As and Zr) and anthropogenic elements (Mn, Cu, Zn, Cd, Hg and Pb). Spatial distribution maps were drawn using the kriging method in order to identify the most affected areas and the main pollution sources. A similar spatial distribution was observed for the elements belonging to each group separated by the PCA, showing an important contribution from metal industries located in the Basque Country, as well as the influence of long-range transboundary pollution processes. Background levels were also determined for the study area, along with the contamination factors for the different elements analysed. Mosses appeared to be good biomonitors of N deposition in areas of accumulation.
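As a rough illustration of the PCA grouping described above (lithogenic vs. anthropogenic elements), the sketch below standardizes a fabricated concentration matrix and extracts the first two components with plain numpy; it is not the survey's actual analysis.

```python
# Hedged sketch: PCA of standardized element concentrations via SVD.
import numpy as np

elements = ["V", "Cr", "Ni", "Zr", "Cu", "Zn", "Cd", "Pb"]
rng = np.random.default_rng(2)
# Hypothetical moss-sample concentration matrix: rows = sites, cols = elements
X = rng.lognormal(mean=1.0, sigma=0.4, size=(40, len(elements)))

# Standardize, then PCA via SVD of the centred/scaled matrix
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / (s**2).sum()
loadings = Vt[:2].T * s[:2] / np.sqrt(len(Z) - 1)   # element loadings on PC1, PC2

for name, (pc1, pc2) in zip(elements, loadings):
    print(f"{name:>2}: PC1={pc1:+.2f}  PC2={pc2:+.2f}")
print("variance explained:", explained[:2].round(2))
```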
129.
Dynamic adaptation of maize and wheat production to climate change   (total citations: 2; self-citations: 0; citations by others: 2)
130.
In order to understand the structure of fish assemblages in the modified Lima basin (Northern Portugal), two distinct datasets concerning the presence and abundance of fish species were subjected to multivariate analysis. On the River Lima, two types of flow modification are present within kilometres of one another: (a) a reduced and constant flow due to hypolimnetic release; and (b) an intense and irregular flow. A comparison of their influence on fish assemblages revealed a gradient of assemblage types from the tributaries to the main-river sites. The latter were characterised by a strong dominance of cyprinids, particularly the Iberian barbel (Barbus bocagei). The tributaries harboured two kinds of fish assemblages: those closer to the river mouth were dominated by the cyprinids Iberian chub (Squalius carolitertii) and Iberian nase (Chondrostoma polylepis), which were also frequently present in the main river, while in those further upstream the predominant species was the brown trout (Salmo trutta). Although explanatory variables such as distance from source, altitude, substrate coarseness and width were the primary correlates of fish assemblage composition, dam construction and flow regulation also had a significant effect on assemblage structure, particularly by: (i) reducing the importance of migratory species; (ii) constraining the presence of trout in the regulated segments; and (iii) simplifying the community, especially under the constant and reduced flow regime.