141.
Human adaptation to climate change is gaining increasing academic as well as political attention. Understanding how and what people around the world adapt to is, however, difficult. Climate change is often, if not always, only one of a multiplicity of exposures perforating local communities. In Biidi 2, a small Sahelian village in northern Burkina Faso, climate variability has had a great influence on inhabitants' lives since the major droughts of the early 1970s and 1980s. Tracing the intertwinement of drought, diminishing agricultural production and the need to buy food, this article explores how villagers attempt to attract development projects and negotiate with political parties in order to negate the impact of the global food crisis on their livelihoods. In doing so, the article attempts to show how adaptation to climate variability is related to multiple, intersecting processes, and in this specific case is a matter of navigating changing socioeconomic factors. Using recent theory from social anthropology, adaptation is explored as a matter of social navigation. It is suggested that this theoretical approach might help nuance and elucidate how, and to what, local people around the world adapt.
142.
A. Bun, K. Hamal, M. Jonas, M. Lesiv. Climatic Change, 2010, 103(1-2): 215-225
The focus of this study is on the preparatory detection of uncertain greenhouse gas (GHG) emission changes (also termed emission signals) under the Kyoto Protocol. Preparatory signal detection is a measure that should be taken prior to and during negotiation of the Protocol. It allows countries under the Protocol to be ranked according to their realized versus their agreed emission changes, in terms of both certainty and credibility. Controlling GHGs is affected by uncertainty and may be costly; thus, knowing whether each nation is doing its part is in the public interest. At present, countries party to the United Nations Framework Convention on Climate Change (UNFCCC) are obliged to include in their annual inventory reports direct or alternative estimates of the uncertainty associated with these inventories, consistent with the Intergovernmental Panel on Climate Change (IPCC) good practice guidance reports. As a consequence, inventory uncertainty is monitored, but not regulated, under the Kyoto Protocol. Although uncertainties are becoming increasingly available, monitored emissions and uncertainties are still dealt with separately. In our study we analyze estimates of both emission changes and uncertainties to advance the evaluation of countries and their performance under the Protocol. Our analysis allows the supply of and demand for emission credits to be examined in consideration of uncertainty. For the purpose of our exercise, we make use of the Undershooting and Verification Time concept described by Jonas et al. (Clim Change, doi:10.1007/s10584-010-9914-6, 2010).
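As a rough illustration of the preparatory-signal-detection idea (a simplified sketch, not the Undershooting and Verification Time formulation of Jonas et al.; the emission values, the 10% relative uncertainty, and the helper functions are assumptions made here for illustration):

```python
# Illustrative sketch only: a reported emission change is treated as "detectable"
# when its absolute value exceeds the (symmetric, relative) uncertainty of the
# commitment-year estimate. Numbers below are placeholders.

def detectable(e_base, e_commit, rel_uncertainty):
    """Return True if the emission change signal exceeds the reporting uncertainty."""
    signal = abs(e_base - e_commit)        # realized emission change (e.g. Mt CO2-eq)
    noise = rel_uncertainty * e_commit     # half-width of the uncertainty band
    return signal > noise

def required_undershooting(e_base, delta, rel_uncertainty):
    """Extra reduction (fraction of base-year emissions) needed so that the agreed
    reduction delta becomes detectable under the stated relative uncertainty."""
    e_target = (1.0 - delta) * e_base
    gap = rel_uncertainty * e_target - delta * e_base
    return max(gap, 0.0) / e_base

# Example: 8% committed reduction, 10% relative inventory uncertainty.
print(detectable(100.0, 92.0, 0.10))             # False: 8 < 9.2, signal hidden in noise
print(required_undershooting(100.0, 0.08, 0.10)) # 0.012 -> about 1.2% extra reduction
```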
143.
The effects of cloud shadowing, channelling, cloud side illumination and droplet concentration are investigated with regard to the reflection of shortwave solar radiation. Using simple geometric clouds coupled with a Monte Carlo model, the transmission properties of idealized cloud layers are determined. The clouds are illuminated with direct solar radiation from above. The main conclusion is that the distribution of the cloud has a very large influence on the reflectivity of a cloud layer. In particular, if the cloud layer contains vertical gaps in which the liquid water content is zero, then smaller, more numerous gaps influence the radiation more strongly than fewer, larger gaps of equal cloud fraction. At very low solar zenith angles, channelling of the radiation reduces the reflection expected on the basis of the percentage cloud cover. At high solar zenith angles, the illumination of cloud edges significantly increases the reflection, despite the shadowing of one cloud by another when the gaps are narrow. The impact of droplet concentration upon the reflection of cloud layers is also investigated. At low solar zenith angles, where channelling is important, lower concentrations increase the transmission. Conversely, when cloud edge illumination is dominant, the cloud distribution is found to be more important for higher concentrations.
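A minimal Monte Carlo sketch of the kind of broken-cloud reflection experiment described above (not the authors' model; the layer geometry, extinction coefficient, and photon budget are illustrative assumptions, with conservative isotropic scattering and Woodcock/delta tracking for the cloud/gap structure):

```python
import math, random

# Toy 2-D Monte Carlo: photons hit a broken cloud layer from above and are counted as
# reflected (exit top) or transmitted/absorbed elsewhere. Cloud occupies periodic columns
# of width cloud_w separated by clear gaps of width gap_w; scattering is isotropic and
# conservative. Woodcock (delta) tracking handles the cloud/gap structure.

def in_cloud(x, cloud_w, gap_w):
    return (x % (cloud_w + gap_w)) < cloud_w

def reflectance(n_photons=20000, depth=1.0, cloud_w=1.0, gap_w=0.5,
                beta=8.0, sza_deg=30.0, seed=0):
    rng = random.Random(seed)
    dx0, dz0 = math.sin(math.radians(sza_deg)), -math.cos(math.radians(sza_deg))
    reflected = 0
    for _ in range(n_photons):
        x, z = rng.uniform(0.0, cloud_w + gap_w), depth
        dx, dz = dx0, dz0
        while 0.0 < z <= depth:
            s = -math.log(rng.random()) / beta        # tentative free path (majorant)
            x, z = x + s * dx, z + s * dz
            if z > depth or z <= 0.0:
                break
            if in_cloud(x, cloud_w, gap_w):           # real collision only inside cloud
                phi = rng.uniform(0.0, 2.0 * math.pi) # isotropic scatter in 2-D
                dx, dz = math.cos(phi), math.sin(phi)
        if z > depth:
            reflected += 1
    return reflected / n_photons

# Same cloud fraction (2/3), but narrower and more numerous gaps in the second case:
print(reflectance(cloud_w=1.0, gap_w=0.5))
print(reflectance(cloud_w=0.5, gap_w=0.25))
```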
144.
The Continuous Plankton Recorder (CPR) has been deployed for 70 years. Although modifications to the machine have been relatively minor, there has been a steady increase in the speed at which it is towed, creating a need to quantify what effects this may have had on its sampling characteristics. Additionally, because the CPR database is one of the longest and most geographically extensive biological time series in the world, and scientists are currently focusing on understanding climate-induced ecological changes, there is increasing pressure to quantify the sampling performance and to relate CPR data to data collected by other plankton samplers. Many of these issues of consistency and comparability have been investigated throughout the decades of the CPR survey. The primary aim of this study is to draw together the results of those investigations, updating or integrating them where applicable. A secondary aim is to use the CPR database to address other, previously unexamined issues. We show that the increase in tow speed has had no effect on the depth of sampling or on the mechanical efficiency of the internal mechanism, but at the highest tow speeds there is some evidence that flow may be reduced. Depth of tow may also depend on the ship operating a particular route. We describe the processing procedures used to ensure consistency of analysis and detail the changes in taxonomic resolution that have occurred through the course of the survey. Some consistency issues remain unresolved, such as the effect of adding heavy instrumentation on the attitude of the CPR in the water and possible effects on sampling performance. The reduction of flow caused by clogging of the filtering mesh has now been quantified through the addition of flowmeters, and each CPR sample can now be calibrated for measured, or derived, filtered volume. Although estimates of abundances for large areas have been shown to be unaffected by recalibration, absolute quantification of plankton abundance is necessary to enable comparisons with other sampling devices. Several studies have now compared plankton abundances obtained with the CPR against those obtained using vertical nets at specific locations on the European continental shelf. Although catches by the CPR are almost always lower, seasonal cycles are replicated in each comparison, and interannual variability generally agrees between time series. The relative catch rates for an individual species by each device appear to be consistent, probably because of the organisms' behaviour and the attributes of the sampling device. We are now able to develop calibration factors to convert CPR catches to absolute abundances that can be integrated with other data sets where appropriate, which should increase the applicability and utility of CPR data.
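A tiny sketch of the volume-based calibration idea mentioned above (illustrative only: the filtered volume and the net-comparison factor below are assumed placeholder numbers, not survey values):

```python
def cpr_abundance_per_m3(raw_count, filtered_volume_m3, comparison_factor=1.0):
    """Convert a raw CPR sample count to an absolute abundance (individuals per m^3).

    filtered_volume_m3 -- flowmeter-measured (or derived) volume filtered for the sample
    comparison_factor  -- optional factor relating CPR catches to another device,
                          e.g. vertical nets (placeholder value used below)
    """
    return comparison_factor * raw_count / filtered_volume_m3

# Example with assumed numbers: 42 individuals counted, 3 m^3 filtered, factor 2.5.
print(cpr_abundance_per_m3(42, 3.0, 2.5))
```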
145.
146.
This paper describes a methodology that combines the outputs of (1) the Integrated Model to Assess the Greenhouse Effect (IMAGE Version 1.0) of the Netherlands National Institute of Public Health and Environmental Protection (RIVM), which, given a greenhouse gas emission policy, can estimate effects such as the change in global mean surface air temperature for a wide variety of policies, and (2) ECHAM-1/LSG, the general circulation model (GCM) of the Max Planck Institute for Meteorology in Hamburg, Germany. The combination makes it possible to calculate grid point surface air temperature changes for different scenarios with a turnaround time much shorter than that of a GCM. The methodology is based upon a geographical pattern of the ratio of grid point temperature change to global mean values during a certain period of the simulation, as calculated by ECHAM-1/LSG for the 1990 Scenarios A and D of the Intergovernmental Panel on Climate Change (IPCC). A procedure based upon signal-to-noise ratios in the outputs enabled us to estimate where we have confidence in the methodology; this covers about 23% to 83% of the total of 2,048 grid points, depending upon the scenario and the decade in the simulation. The methodology enabled IMAGE to provide useful estimates of the GCM-predicted grid point temperature changes. These estimates were within 0.5 K (0.25 K) throughout the 100 years of a given simulation for at least 79% (74%) of the grid points where we are confident in applying the methodology. The temperature ratio pattern from Scenario A enabled IMAGE to provide useful estimates of temperature change within 0.5 K (0.25 K) in Scenario D for at least 88% (68%) of the grid points where we have confidence, indicating that the methodology is transferable to other scenarios. Tests with the Geophysical Fluid Dynamics Laboratory GCM indicated, however, that a temperature ratio pattern may have to be developed for each GCM. The methodology, using a temperature ratio pattern from the 1990 IPCC Scenario A and involving IMAGE, gave gridded surface air temperature patterns for the 1992 IPCC radiative-forcing Scenarios C and E and the RIVM emission Scenario B; none of these scenarios has been simulated by ECHAM-1/LSG. The simulations reflect the uncertainty range of future warming. The work reported by the authors was carried out during their stay at the project Forestry and Climate Change of the International Institute for Applied Systems Analysis, Laxenburg, Austria.
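A minimal sketch of the pattern-scaling idea described above (illustrative only, not the IMAGE/ECHAM-1/LSG implementation; the toy grid, ratio pattern, and 2.1 K global mean change are assumptions): the grid point change is estimated as a fixed ratio pattern, derived from one GCM simulation, multiplied by the global mean change of the new scenario.

```python
import numpy as np

# Illustrative pattern scaling on a (nlat, nlon) grid. ratio_pattern would be derived
# from a GCM run as local warming divided by the area-weighted global mean warming;
# global_dT stands in for the global mean change of a simple model such as IMAGE.

def ratio_from_gcm(local_dT, lat_deg):
    """Derive the ratio pattern from a GCM change field, using cos-lat area weights."""
    w = np.cos(np.radians(lat_deg))[:, None] * np.ones_like(local_dT)
    global_mean = np.average(local_dT, weights=w)
    return local_dT / global_mean

def scale_pattern(ratio_pattern, global_dT):
    """Estimate grid point temperature change from a global mean change."""
    return ratio_pattern * global_dT

# Toy example on a 4 x 8 grid with stronger warming toward the poles.
lat = np.array([-67.5, -22.5, 22.5, 67.5])
gcm_dT = 1.0 + 0.8 * np.abs(np.sin(np.radians(lat)))[:, None] * np.ones((4, 8))
pattern = ratio_from_gcm(gcm_dT, lat)
print(scale_pattern(pattern, global_dT=2.1))  # gridded estimate for a 2.1 K scenario
```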
147.
We carry out climate simulations for 1880–2003 with GISS modelE driven by ten measured or estimated climate forcings. An ensemble of climate model runs is carried out for each forcing acting individually and for all forcing mechanisms acting together. We compare, side by side, the simulated climate change for each forcing, for all forcings, observations, unforced variability among model ensemble members, and, where available, observed variability. Discrepancies between observations and the all-forcing simulations are due to model deficiencies, inaccurate or incomplete forcings, and imperfect observations. Although there are notable discrepancies between model and observations, the fidelity is sufficient to encourage use of the model for simulations of future climate change. By using a fixed, well-documented model and accurately defining the 1880–2003 forcings, we aim to provide a benchmark against which the effect of improvements in the model, climate forcings, and observations can be tested. Principal model deficiencies include unrealistically weak tropical El Niño-like variability and a poor distribution of sea ice, with too much sea ice in the Northern Hemisphere and too little in the Southern Hemisphere. The greatest uncertainties in the forcings are the temporal and spatial variations of anthropogenic aerosols and their indirect effects on clouds.
148.
In order to systematically and visually understand the well-known but qualitative and complex relationships between synoptic fields and heavy rainfall events over the Kyushu Islands, southwestern Japan, during the BAIU season, these synoptic fields were classified using the Self-Organizing Map (SOM), which can convert complex non-linear features into simple two-dimensional relationships. It was assumed that the synoptic field patterns could be expressed simply by the spatial distribution of (1) wind components at the 850 hPa level and (2) precipitable water (PW), defined as the water vapor amount contained in a vertical column of the atmosphere. Using the SOM algorithm together with the U-matrix and K-means clustering techniques, the synoptic fields could be divided into eight patterns (clusters). One of the clusters has notable spatial features represented by a large PW content accompanied by strong wind components known as a low-level jet (LLJ). The features of this cluster indicate a typical synoptic field pattern that frequently causes heavy rainfall in Kyushu during the rainy season. In addition, an independent data set was used to validate the performance of the trained SOM. The results indicated that the SOM could successfully extract heavy rainfall events related to typical synoptic field patterns of the BAIU season. Interestingly, one specific SOM unit was closely related to the occurrence of disastrous heavy rainfall events observed during both the training and validation periods. From these results, the trained SOM showed good performance in identifying synoptic fields causing heavy rainfall in the validation period as well. We conclude that the SOM technique may be an effective tool for classifying complicated non-linear synoptic fields and, to some degree, for identifying heavy rainfall events.
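A minimal self-organizing map sketch in the spirit of the classification described above (illustrative, not the study's configuration: the map size, learning schedule, and synthetic "synoptic field" vectors are assumptions; in the paper the inputs are gridded 850 hPa winds and precipitable water):

```python
import numpy as np

# Minimal SOM: each input vector could be a flattened synoptic field, e.g. 850 hPa
# wind components concatenated with precipitable water on a grid.

def train_som(data, map_h=6, map_w=6, n_iter=2000, lr0=0.5, sigma0=3.0, seed=0):
    rng = np.random.default_rng(seed)
    n, dim = data.shape
    weights = rng.normal(size=(map_h, map_w, dim))
    yy, xx = np.mgrid[0:map_h, 0:map_w]
    for t in range(n_iter):
        lr = lr0 * np.exp(-t / n_iter)               # decaying learning rate
        sigma = sigma0 * np.exp(-t / n_iter)         # shrinking neighbourhood radius
        x = data[rng.integers(n)]
        d = np.linalg.norm(weights - x, axis=2)      # distance of x to every map unit
        bi, bj = np.unravel_index(np.argmin(d), d.shape)   # best-matching unit
        h = np.exp(-((yy - bi) ** 2 + (xx - bj) ** 2) / (2 * sigma ** 2))
        weights += lr * h[..., None] * (x - weights) # pull units toward the sample
    return weights

def best_matching_unit(weights, x):
    d = np.linalg.norm(weights - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)

# Toy usage: 500 synthetic "synoptic fields" of dimension 40.
fields = np.random.default_rng(1).normal(size=(500, 40))
som = train_som(fields)
print(best_matching_unit(som, fields[0]))   # map unit that represents this field
```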
149.
Mountain water resources management often requires hydrological models that can handle both snow and ice melt. In this study, we compared two different model types for a partly glacierized watershed in central Switzerland: (1) an energy-balance model primarily designed for snow simulations; and (2) a temperature-index model developed for glacier simulations. The models were forced with data extrapolated from long-term measurement records to mimic the typical input data situation for climate change assessments. By using different methods to distribute precipitation, we also assessed how various snow cover patterns influenced the modelled runoff. The energy-balance model provided accurate discharge estimates during periods dominated by snow melt, but its performance dropped during the glacier ablation season. The glacier melt rates were sensitive to the modelled snow cover patterns and to the parameterization of turbulent heat fluxes. In contrast, the temperature-index model poorly reproduced snow melt runoff, but provided accurate discharge estimates during the periods dominated by glacier ablation, almost independently of the method used to distribute precipitation. Apparently, the calibration of this model compensated for the inaccurate precipitation input with biased parameters. Our results show that accurate estimates of snow cover patterns are needed either to correctly constrain the melt parameters of the temperature-index model or to provide the appropriate glacier surface albedos required by the energy-balance model. Thus, particularly when only distant meteorological stations are available, carefully selected input data and efficient extrapolation methods for meteorological variables improve the reliability of runoff simulations in high alpine watersheds.
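For reference, the classic degree-day (temperature-index) melt formulation of the kind the second model type uses (a generic sketch, not the study's implementation; the degree-day factors are placeholder values, and real factors are calibrated and differ between snow and exposed glacier ice):

```python
# Degree-day melt: daily melt (mm w.e. per day) = DDF * positive degree-days.
def degree_day_melt(daily_mean_temp_c, degree_day_factor=5.0, threshold_c=0.0):
    return [degree_day_factor * max(t - threshold_c, 0.0) for t in daily_mean_temp_c]

# Example with placeholder DDFs: snow melts more slowly than exposed glacier ice.
temps = [-2.0, 1.5, 4.0, 6.5, 8.0]
print(degree_day_melt(temps, degree_day_factor=4.0))   # assumed snow DDF
print(degree_day_melt(temps, degree_day_factor=7.0))   # assumed ice DDF
```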
150.
T. Jonas, C. Marty, J. Magnusson. Journal of Hydrology, 2009, 378(1-2): 161-167
The snow water equivalent (SWE) characterizes the hydrological significance of the snow cover. However, measuring SWE is time-consuming, so alternative methods of determining it are attractive. SWE can be calculated from snow depth if the bulk snow density is known. A reliable method of estimating snow density could therefore (a) save considerable effort by, at least partly, sampling snow depth instead of SWE, and (b) allow snow-hydrological evaluations when only snow depth data are available. To generate a useful parameterization of the bulk density, a large dataset was analyzed covering snow densities and depths measured biweekly over five decades at 37 sites throughout the Swiss Alps. Four factors were identified that affect the bulk snow density: season, snow depth, site altitude, and site location. These factors constitute a convenient set of input variables for the snow density model developed in this study. The accuracy of estimating SWE with our model is shown to be equivalent to the variability of repeated SWE measurements at a single site. The technique may therefore allow a more efficient, albeit indirect, sampling of SWE without necessarily compromising data quality.
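An illustrative sketch of the depth-to-SWE conversion via a modelled bulk density (the density function below is a crude placeholder, not the published parameterization, whose season-, depth-, altitude-, and region-specific coefficients are not reproduced here):

```python
# SWE (mm) = snow depth (m) * bulk density (kg m^-3); the density model is a placeholder.

def bulk_density(month, depth_m, altitude_m):
    """Very rough bulk snow density estimate (kg m^-3) -- placeholder model only."""
    seasonal = 50.0 * ((month - 10) % 12)     # densification as winter and spring progress
    return 200.0 + seasonal * 0.5 + 30.0 * depth_m + 0.01 * altitude_m

def swe_mm(depth_m, month, altitude_m):
    """Snow water equivalent in mm, from depth and the modelled bulk density."""
    return depth_m * bulk_density(month, depth_m, altitude_m)

# Example: 1.2 m of snow in March at 1800 m a.s.l.
print(swe_mm(1.2, month=3, altitude_m=1800.0))
```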