By access: fee-based full text, 972 articles; free, 50; free (domestic), 7.
By subject: Surveying and Mapping, 17; Atmospheric Sciences, 92; Geophysics, 243; Geology, 454; Oceanography, 62; Astronomy, 105; Multidisciplinary, 4; Physical Geography, 52.
By year: 2024 (2); 2023 (4); 2022 (4); 2021 (18); 2020 (20); 2019 (19); 2018 (25); 2017 (35); 2016 (25); 2015 (40); 2014 (41); 2013 (61); 2012 (51); 2011 (60); 2010 (70); 2009 (58); 2008 (54); 2007 (46); 2006 (46); 2005 (52); 2004 (44); 2003 (33); 2002 (32); 2001 (22); 2000 (15); 1999 (9); 1998 (13); 1997 (12); 1996 (12); 1995 (15); 1994 (2); 1993 (13); 1992 (4); 1991 (4); 1990 (11); 1989 (6); 1987 (12); 1986 (7); 1985 (6); 1984 (7); 1983 (4); 1982 (2); 1981 (2); 1980 (2); 1979 (1); 1978 (1); 1977 (3); 1976 (1); 1973 (1); 1969 (1).
In total: 1,029 query results (search time: 46 ms).
901.
Changes in extreme precipitation are expected to be among the primary impacts of climate change (CC) in urban areas. To assess these impacts, rainfall data from climate models are commonly used. The main goal of this paper is to report on the state of knowledge and recent work on CC impacts, with a focus on urban areas, in order to produce an integrated review of the various approaches against which future studies can be compared or on which they can be built. Model output statistics (MOS) methods are increasingly used in the literature to study CC impacts in urban settings. A review of previous work highlights the non-stationary nature of future climate data, underscoring the need to revise urban drainage system design criteria. Comparison across these studies is made difficult, however, by the numerous sources of uncertainty arising from a plethora of assumptions, scenarios, and modelling options. All the methods used do, however, predict increased extreme precipitation in the future, suggesting increased risks of combined sewer overflows, flooding, and back-ups in existing urban sewer systems. Future studies must quantify the different sources of uncertainty more accurately by improving downscaling and correction methods. New research is needed to improve the data validation process, an aspect seldom reported in the literature. Finally, the incorporation of non-stationary conditions into the generalized extreme value (GEV) distribution should be assessed more closely; this will require close collaboration between engineers, hydrologists, statisticians, and climatologists, contributing to the ongoing reflection on this issue of social concern.
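The non-stationary GEV idea mentioned above can be illustrated with a short sketch: fit a GEV whose location parameter drifts linearly in time by maximizing the log-likelihood directly. The synthetic annual maxima, the linear trend model mu(t) = mu0 + mu1*t, and all parameter values are illustrative assumptions, not taken from the abstract.

```python
# Hedged sketch: non-stationary GEV fit on synthetic annual maxima.
# The data, trend model and parameter values are assumptions for illustration.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
years = np.arange(50)
# Synthetic annual precipitation maxima with an upward trend in location.
true_loc = 40.0 + 0.3 * years
data = stats.genextreme.rvs(c=-0.1, loc=true_loc, scale=5.0, random_state=rng)

def neg_log_lik(params):
    mu0, mu1, log_sigma, xi = params
    loc = mu0 + mu1 * years
    # scipy's shape convention: c = -xi relative to the usual GEV xi.
    return -np.sum(stats.genextreme.logpdf(data, c=-xi,
                                           loc=loc, scale=np.exp(log_sigma)))

x0 = [data.mean(), 0.0, np.log(data.std()), 0.0]
res = optimize.minimize(neg_log_lik, x0=x0, method="Nelder-Mead")
mu0, mu1, log_sigma, xi = res.x
print(f"fitted trend in location: {mu1:.2f} per year")
```

A stationary fit would force mu1 = 0; comparing the two likelihoods (e.g. by a likelihood-ratio test) is one common way to decide whether the non-stationary form is warranted.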
902.
903.
This study evaluates simulations from two new Canadian regional climate models (RCMs), CanRCM4 and CRCM5, with a focus on the models' skill in simulating daily precipitation indices and the Standardized Precipitation Index (SPI). The evaluation was carried out over the past two decades using several sets of gridded observations that partially cover North America. The new Canadian RCMs were also compared with four reanalysis products and six other RCMs. The different configurations of the Canadian RCM simulations also permit evaluating the impact of different spatial resolutions, atmospheric drivers, and nudging conditions. The results from the new Canadian models show some improvement in precipitation characteristics over the previous Canadian RCM (CRCM4), but the improvements vary by season. In winter, CanRCM4 and CRCM5 show better skill than most other models over all of North America. In summer, CRCM5 at 0.44° performs best over the United States, while CRCM4 has the best skill over Canada. CanRCM4 and CRCM4 exhibit good skill in simulating the 6-month SPI over the Prairies and the western US Corn Belt. In general, differences between runs with and without large-scale spectral nudging are small, as are differences between runs driven by different boundary conditions.
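The 6-month SPI evaluated above follows a standard recipe: accumulate precipitation over a 6-month window, fit a gamma distribution to the totals, and map each total through the fitted CDF to a standard-normal score. The sketch below applies that recipe to synthetic monthly data; for simplicity it fits one gamma to all windows, whereas operational SPI fits per calendar month.

```python
# Illustrative 6-month SPI on synthetic monthly precipitation.
# Simplification (assumed): a single gamma fit instead of per-calendar-month fits.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
monthly_precip = rng.gamma(shape=2.0, scale=30.0, size=360)  # 30 years, mm/month

window = 6
# Running 6-month totals (length 360 - 6 + 1 = 355).
totals = np.convolve(monthly_precip, np.ones(window), mode="valid")

# Fit a gamma distribution to the totals (location fixed at 0),
# then transform each total to a standard-normal quantile.
shape, _, scale = stats.gamma.fit(totals, floc=0)
spi6 = stats.norm.ppf(stats.gamma.cdf(totals, shape, scale=scale))

print(f"mean SPI: {spi6.mean():.2f}, std: {spi6.std():.2f}")
```

By construction the resulting index is approximately standard normal over the fitting period, so values below about -1.5 flag unusually dry 6-month spells.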
904.
The accuracy of old maps can hold interesting historical information and is therefore studied using distortion analysis methods. These methods start from a set of ground control points identified both on the old map and on a modern reference map or globe, and conclude with techniques that compute and visualise the distortion. Such techniques have advanced over the years but leave room for improvement, as current ones yield approximate values at a coarse spatial resolution. We propose a more elegant and more accurate way to compute the distortion of old maps by translating the technique of differential distortion analysis, used in map projection theory, to the setting where an old map and a reference map are compared directly. This enables the application of various useful distortion metrics to the study of old maps, such as the area scale factor, the maximum angular distortion and the Tissot indicatrices. Because such a technique is always embedded in a full distortion analysis method, we start by putting forward an optimal analysis method for a general-purpose study, which then serves as the foundation for the development of our technique. To that end, we discuss the structure of distortion analysis methods and the options available at every step of the process, including the different settings in which the old map can be compared to its modern counterpart, the techniques that can be used to interpolate between the two, and the techniques available to compute and visualise the distortion. We conclude by applying our general-purpose method, including the differential distortion analysis technique, to an example map also used elsewhere in the literature.
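The distortion metrics named above (area scale factor, maximum angular distortion, Tissot indicatrix) all derive from the local Jacobian of the mapping between the two maps. As a minimal sketch, assuming a toy nonlinear planar mapping standing in for "reference map to old map", the Jacobian can be estimated by finite differences and its singular values give the Tissot semi-axes:

```python
# Sketch of differential distortion metrics from a local Jacobian.
# The mapping f is a made-up example, not a real old-map transformation.
import numpy as np

def f(x, y):
    # Toy mapping with mild shear and nonlinearity.
    return np.array([1.1 * x + 0.1 * y**2, 0.9 * y + 0.05 * x])

def distortion_metrics(x, y, h=1e-5):
    # Central-difference Jacobian of f at (x, y); columns are d/dx and d/dy.
    J = np.column_stack([(f(x + h, y) - f(x - h, y)) / (2 * h),
                         (f(x, y + h) - f(x, y - h)) / (2 * h)])
    a, b = np.linalg.svd(J, compute_uv=False)  # Tissot indicatrix semi-axes (a >= b)
    area_scale = a * b                         # local area scale factor (= |det J|)
    # Maximum angular distortion 2*omega, with sin(omega) = (a-b)/(a+b).
    two_omega = 2 * np.degrees(np.arcsin((a - b) / (a + b)))
    return area_scale, two_omega

area, ang = distortion_metrics(1.0, 2.0)
print(f"area scale {area:.3f}, max angular distortion {ang:.2f} deg")
```

Evaluating these two quantities on a dense grid of points is what yields the continuous, high-resolution distortion surfaces the paper argues for.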
905.
906.
907.
We performed large-scale earthquake economic loss estimations for France and cost–benefit analyses for several French cities by developing a semi-empirical, intensity-based approach. The proposed methodology is inexpensive and easily applicable when detailed information on the regional seismic hazard and on the structural characteristics of the building stock is scarce, which is of particular importance in moderate-to-low seismic hazard regions. The exposure model is derived from census datasets, and the seismic vulnerability distribution of buildings is estimated using data mining techniques. Several hypothetical, large-scale retrofit scenarios are proposed, with increasing levels of investment. The cities, in their respective reinforced states, are then subjected to a series of hazard scenarios. Seismic hazard data for different return periods are derived from the regulatory accelerations of the French seismic zoning. Loss estimations for the original (non-reinforced) configuration show high expected building repair and replacement costs for all time spans. Finally, the benefits in terms of avoided damage are compared with the costs of each retrofit measure. Relatively limited strengthening investments reduce the probability of building collapse, the main cause of human casualties. However, the results of this study suggest that retrofitting is, on average, cost-effective only in the parts of France with the highest seismicity and over the longest time horizons.
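The cost–benefit logic described above can be sketched in a few lines: compute an expected annual loss from hazard scenarios, then compare the discounted stream of avoided losses against the retrofit's upfront cost. All numbers here (probabilities, loss figures, discount rate, horizon) are invented for illustration and are not from the study.

```python
# Hedged sketch of a retrofit cost-benefit calculation.
# All scenario probabilities, losses, costs and rates are assumed values.
def expected_annual_loss(scenarios):
    """scenarios: list of (annual exceedance probability, loss) pairs."""
    return sum(p * loss for p, loss in scenarios)

def npv_of_avoided_losses(eal_before, eal_after, years, discount_rate):
    # Present value of the annual loss reduction over the time horizon.
    saving = eal_before - eal_after
    return sum(saving / (1 + discount_rate) ** t for t in range(1, years + 1))

# Illustrative hazard scenarios: (annual probability, repair cost in M EUR).
before = [(0.01, 200.0), (0.002, 800.0)]   # non-reinforced building stock
after = [(0.01, 120.0), (0.002, 500.0)]    # assumed effect of retrofitting

retrofit_cost = 25.0  # M EUR, assumed upfront investment
benefit = npv_of_avoided_losses(expected_annual_loss(before),
                                expected_annual_loss(after),
                                years=50, discount_rate=0.03)
print(f"benefit-cost ratio: {benefit / retrofit_cost:.2f}")
```

The study's finding follows the same mechanics: only where the annual loss reduction is large (high seismicity) and the horizon long does the discounted benefit overtake the retrofit cost.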
908.
The eighteenth-century Carte de cabinet of count de Ferraris is the first large-scale (1:11,520) topographic map of the entire Belgian territory, making it a valuable source of historical information. In the past, a number of studies have tried to assess the geometric accuracy of this map, but they all suffered from limited technical capabilities for computing and visualizing the distortions, and most of them focused on only a small subset of the 275 map sheets. This paper therefore provides the first systematic and in-depth investigation of the map's local geometric accuracy. Recently, two Belgian government agencies georeferenced the Flemish and Walloon parts of the Carte de cabinet in great detail, using some 30,000 ground control points to link the old map to the modern topographic map of Belgium. These data sets represent a new and unprecedented source of accuracy information. However, given the high number of control points and our aim of computing distortions in an exact, local, quantitative and continuous way, prominent techniques for studying the geometric accuracy of old maps, such as displacement vectors, distortion grids, triangular nets and the popular MapAnalyst software, were unsuited to the task. To meet all our requirements, a new technique called Differential Distortion Analysis, inspired by the treatment of distortions in map projection theory, was used instead. Its advantages, structure and application to the Carte de cabinet are discussed in detail. The new technique makes it possible to calculate and display the map's local angular and surface distortions at very high spatial resolution. Consequently, trends in the obtained levels of accuracy could be identified and related to historical facts about the production process of the Carte de cabinet. This has yielded important new insights into the map's geometric accuracy.
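A key ingredient of such an analysis is turning scattered ground control points into a smooth old-map-to-modern-map mapping that can be differentiated everywhere. One common choice, sketched below on synthetic control points (the real georeferencing used some 30,000), is thin-plate-spline interpolation of the displacement field:

```python
# Sketch of the GCP interpolation step: build a smooth old->modern mapping
# from matched control points. The control points here are synthetic.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(2)
old_pts = rng.uniform(0, 100, size=(200, 2))  # coordinates on the old map
# Synthetic "modern" coordinates: a rotation + uniform scale + small noise,
# standing in for the survey errors of a historical map.
theta, s = 0.05, 1.02
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
modern_pts = s * old_pts @ R.T + rng.normal(0, 0.1, size=(200, 2))

# Thin-plate-spline interpolation of the displacement field (modern - old).
interp = RBFInterpolator(old_pts, modern_pts - old_pts,
                         kernel="thin_plate_spline")

def old_to_modern(pts):
    return pts + interp(pts)

query = np.array([[50.0, 50.0]])
print("mapped:", old_to_modern(query)[0])
```

Once such a differentiable mapping exists, the local Jacobian (and hence angular and areal distortion) can be evaluated at any point, which is what enables the continuous, high-resolution distortion maps described above.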
909.
Natural Hazards - Remote sensing was used to visualize the West region with the purpose of investigating recent natural hazards observed in this area. Various approaches used based on...
910.
The stratigraphy of the last deglaciation sequence in Lake Saint-Jean (Québec Province, Canada) is investigated based on 300 km of two-dimensional echo-sounder seismic profiles. The sedimentary archive of this basin is documented from the Late Pleistocene recession of the Laurentide ice front to the present day. Ten seismic units have been identified that reflect spatio-temporal variations in the depositional processes characterizing different periods of the Saint-Jean basin's evolution. During the postglacial marine flooding, a high deposition rate of mud settling from proglacial glaciomarine and then prodeltaic plumes in the Laflamme Gulf produced an extensive mud sheet, up to 50 m thick, draping the isostatically depressed marine basin floor. Subsequently, closure of the water body due to glacio-isostatic rebound occurred at 8.5 cal. ka BP, drastically modifying the hydrodynamics. Hyperpycnal flows appeared because fresh lake water replaced dense marine water, and river sediments were transferred towards the deeper part of the lake into river-related sediment drifts and confined lobes. The closure of the water body is also marked by the onset of a wind-driven internal circulation combining coastal hydrodynamics and bottom currents, with sedimentary features including shoreface deposits, sediment drifts and a prograding shelf-type body. The fingerprints of a forced regression are well expressed by mouth-bar systems and by the shoreface–shelf system, the latter unexpected in such a lacustrine setting. In both cases, a regressive surface of lacustrine erosion (RSLE) has been identified, separating sandy mouth-bar deposits from glaciomarine to prodeltaic muds, and sandy shoreface wedges from the heterolithic shelf-type body, respectively. The Lake Saint-Jean record is an example of a regressive succession driven by glacio-isostatic rebound, showing the transition from late-glacial to post-glacial depositional systems.
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号