6,812 results found (search time: 15 ms)
101.
102.
Abstract— It has now been about a decade since the first demonstrations that hypervelocity particles could be captured, partially intact, in aerogel collectors. But the initial promise of a bonanza of partially intact extraterrestrial particles, collected in space, has yet to materialize. One of the difficulties that investigators have encountered is that the location, extraction, handling and analysis of very small (10 μm and smaller) grains, which constitute the vast majority of the captured particles, are challenging and burdensome. Furthermore, current extraction techniques tend to be destructive over large areas of the collectors. Here we describe our efforts to alleviate some of these difficulties. We have learned how to rapidly and efficiently locate captured particles in aerogel collectors, using an automated microscopic scanning system originally developed for experimental nuclear astrophysics. We have learned how to precisely excavate small access tunnels and trenches using an automated micromanipulator and glass microneedles as tools. These excavations are destructive to the collector only over a very small area; this feature may be particularly important for excavations in the precious Stardust collectors. Using actuatable silicon microtweezers, we have learned how to extract and store "naked" particles, essentially free of aerogel, as small as 3 μm in size. We have also developed a technique for extracting particles, along with their terminal tracks, still embedded in small cubical aerogel blocks. We have developed a novel method for storing very small particles in etched nuclear tracks. We have applied these techniques to the extraction and storage of grains captured in aerogel collectors (Particle Impact Experiment, Orbital Debris Collector Experiment, Comet‐99) in low Earth orbit.
103.
Hurricanes and tropical storms represent one of the major hazards to coastal communities. Storm surge generated by the strong winds and low pressure of these systems can bring extensive flooding to coastal areas. In many cases the damage caused by the storm surge may exceed the damage from the wind, resulting in the total collapse of buildings. One major source of structural damage in coastal areas is therefore scour, in which the soil beneath the building that serves as its foundation is swept away by the movement of the water. Existing methodologies for forecasting hurricane flood damage do not differentiate between damage mechanisms (e.g., inundation vs. scour), and no tools are currently available that focus specifically on forecasting scour-related damage to buildings. Such a tool could provide significant advantages for planning and preparing emergency responses. The focus of this study was therefore to develop a methodology to predict possible scour depth due to hurricane storm surge using an automated ArcGIS tool that incorporates the expected hurricane conditions (flow depth, velocity, and flood duration), site-specific building information, and the associated foundation soil types. A case study from Monmouth County (NJ), where scour damage from 2012 Hurricane Sandy was recorded after the storm, was used to evaluate the accuracy of the developed forecasting tool and to relate scour depth to potential scour damage. The results indicate that the developed tool is reasonably consistent with the field observations.
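As a purely illustrative sketch of how such inputs might be combined, the code below looks up hypothetical soil parameters and integrates an excess-velocity erosion rate over the flood duration, capped by the flow depth. The relation, the soil table and every coefficient are placeholders for illustration; they are not the study's calibrated ArcGIS formulation.

```python
# Purely illustrative sketch of combining the abstract's inputs (flow depth,
# velocity, flood duration, foundation soil type) into a scour-depth estimate.
# The excess-velocity erosion relation and all coefficients are placeholders,
# not the study's calibrated ArcGIS formulation.
from dataclasses import dataclass

# Hypothetical per-soil parameters: critical velocity (m/s) below which no
# erosion occurs, and an erodibility coefficient (mm of scour per hour per
# m/s of excess velocity).
SOIL_PARAMS = {
    "loose sand": (0.25, 120.0),
    "silt":       (0.35, 60.0),
    "stiff clay": (0.80, 10.0),
}

@dataclass
class SurgeConditions:
    flow_depth_m: float   # water depth at the building
    velocity_ms: float    # depth-averaged flow velocity
    duration_h: float     # hours the site stays inundated

def scour_depth_m(cond: SurgeConditions, soil: str) -> float:
    """Estimate scour depth as erosion rate x duration, capped at 2x flow depth."""
    v_crit, k_mm_per_h = SOIL_PARAMS[soil]
    excess = max(cond.velocity_ms - v_crit, 0.0)      # no scour below critical velocity
    depth_m = k_mm_per_h * excess * cond.duration_h / 1000.0
    return min(depth_m, 2.0 * cond.flow_depth_m)      # crude physical upper bound

print(scour_depth_m(SurgeConditions(1.8, 1.2, 6.0), "loose sand"))
```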
104.
105.
In the recent past, Australia has experienced several catastrophic hazard events, and the frequency and intensity of such events are expected to increase in the future. Natural hazards can rarely be fully prevented, yet their losses can be minimized if the necessary preparedness and mitigation actions are taken before an event occurs. Identification of vulnerable groups is an important first step in any preparedness and emergency management planning process. Social vulnerability refers to population characteristics that influence the capacity of a community to prepare for, respond to and recover from disasters. Factors that contribute to social vulnerability are often hidden and difficult to capture. This study analyzes the relative levels of social vulnerability of communities at the urban–bush interface in the Blue Mountains and Ku-ring-gai local council areas in New South Wales (NSW), Australia. We tested whether a standardized social vulnerability index could be developed using a pre-existing set of indicators. We created an exploratory principal component analysis model using Australian Bureau of Statistics 2006 census data at the Census Collection District (CCD) level, identified variables contributing to social vulnerability, and used the component scores to develop a social vulnerability index, which was then mapped at the CCD level. Our results indicate that both the contributors to and the level of social vulnerability differ between and within communities; in other words, they are spatially variable. They show different spatial patterns across the study areas, which provides useful information for identifying communities most likely to experience negative disaster impacts due to their socio-demographic characteristics.
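As a rough illustration of this kind of index construction, the sketch below standardizes a handful of hypothetical CCD-level indicators, extracts principal components, and sums the component scores weighted by explained variance. The variable names, component count and weighting scheme are assumptions for illustration, not the study's actual model.

```python
# Minimal sketch of a PCA-based social vulnerability index, assuming a
# pandas DataFrame of census indicators (column names are hypothetical).
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

def social_vulnerability_index(census: pd.DataFrame, n_components: int = 2) -> pd.Series:
    """Return a per-district vulnerability score from standardized indicators."""
    # Standardize so variables with large numeric ranges do not dominate the PCA.
    z = StandardScaler().fit_transform(census.values)

    # Extract the leading components that summarize co-varying indicators.
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(z)                    # shape: (districts, components)

    # One common aggregation: weight each component score by the variance it
    # explains and sum.  The study may use a different weighting or sign rule.
    index = scores @ pca.explained_variance_ratio_[:n_components]
    return pd.Series(index, index=census.index, name="SoVI")

# Hypothetical usage with made-up CCD-level indicators.
ccd = pd.DataFrame(
    {"pct_over_65": [12.0, 25.0, 8.0],
     "pct_single_parent": [9.0, 14.0, 6.0],
     "median_income": [52_000, 31_000, 74_000],
     "pct_no_vehicle": [4.0, 11.0, 2.0]},
    index=["CCD_001", "CCD_002", "CCD_003"])
print(social_vulnerability_index(ccd))
```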
106.
Simulation of quick runoff components such as surface runoff and the associated soil erosion requires rainfall intensities at high temporal resolution. However, these data are often not available because such measurements are costly and time consuming. Current rainfall disaggregation methods have shortcomings, especially in reproducing the distribution of storm events. The objective of this study was to improve point rainfall disaggregation with a new magnitude-category disaggregation approach, introduced within a coupled disaggregation scheme (Hyetos and cascade) for multisite rainfall disaggregation. The new procedure was tested with ten long-term precipitation data sets from central Germany, treating summer and winter precipitation separately to capture seasonal variability. Results showed that dividing the rainfall amount into four daily rainfall magnitude categories (1–10, 11–25, 26–50, >50 mm) improves the simulation of high rainfall intensities (convective rainfall). The Hyetos category approach (HyetosCat) with seasonal variation reproduces observed hourly rainfall more faithfully in each month than the same model without categories. The mean absolute percentage accuracy of the standard deviation of hourly rainfall is 89.7% in winter and 95.6% in summer. Applied within the coupled HyetosCat–cascade approach, the proposed magnitude-category method successfully reproduces the statistical behaviour of local 10-min rainfall intensities in terms of both intermittency and variability. The root mean square error for disaggregated 10-min rainfall depth ranges from 0.20 to 2.38 mm in summer and from 0.12 to 2.82 mm in winter across all categories. The coupled stochastic approach preserves statistical self-similarity and intermittency in each magnitude category at a relatively low computational cost. Copyright © 2014 John Wiley & Sons, Ltd.
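To make the cascade idea concrete, the following sketch splits a single daily total into eight 3-hour depths with a multiplicative random cascade whose splitting-weight distribution depends on the daily magnitude category. The Beta parameters, the fallback for totals below 1 mm and the 3-hour target resolution are illustrative assumptions; this is not the HyetosCat–cascade implementation used in the study.

```python
# Minimal sketch of a magnitude-category multiplicative cascade that splits a
# daily rainfall total into 8 x 3-hour intervals.  Category bounds approximate
# the abstract's 1-10, 11-25, 26-50, >50 mm classes; the Beta parameters are
# purely illustrative, not calibrated HyetosCat-cascade values.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical cascade-weight distributions: wetter days tend to be peakier.
CATEGORY_BETA = {  # (lower, upper] daily total in mm -> Beta(a, b) for weight W
    (1, 10): (4.0, 4.0),
    (10, 25): (3.0, 3.0),
    (25, 50): (2.0, 2.0),
    (50, np.inf): (1.5, 1.5),
}

def pick_beta(daily_total: float) -> tuple[float, float]:
    for (lo, hi), ab in CATEGORY_BETA.items():
        if lo < daily_total <= hi:
            return ab
    return (4.0, 4.0)  # fall back for trace totals below 1 mm

def disaggregate(daily_total: float, levels: int = 3) -> np.ndarray:
    """Split one daily total into 2**levels sub-interval depths (mm)."""
    a, b = pick_beta(daily_total)
    depths = np.array([daily_total])
    for _ in range(levels):
        w = rng.beta(a, b, size=depths.size)      # weight given to the first half
        depths = np.column_stack([depths * w, depths * (1 - w)]).ravel()
    return depths                                  # mass is conserved at every level

print(disaggregate(32.5))   # eight 3-hour depths summing to 32.5 mm
```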
107.
A swath bathymetric survey was conducted on Marsili Volcano, the largest seamount in the Tyrrhenian Sea. It stands 3000 m above the surrounding oceanic crust of the 3500 m-deep Marsili back-arc basin and is located along the basin axis. The seamount has an elongated shape and a distinctive morphology, with narrow (<1000 m) ridges made up of several elongated cones in the summit zone and extensive cone fields on its lower flanks. A dredging campaign carried out at water depths between 3400 and 600 m indicates that most of Marsili Seamount is composed of medium-K calc-alkaline basalts; evolved high-K andesites were recovered only from the small cones along the summit axis. Petrological and geochemical characteristics of the least differentiated basalts reveal that at least two varieties of magma have erupted on Marsili Volcano. Group 1 basalts have plagioclase and olivine as the dominant phases and show lower Al, Ca, K, Ba, Rb and Sr, and higher Fe, Na, Ti and Zr, than the second type of basaltic magma. Group 2 basalts contain clinopyroxene as an additional phenocryst phase. In addition, the two basaltic magmas have different original pre-eruptive H2O contents (group 1 H2O-poor, group 2 H2O-rich). Moreover, comparison of the compositional trends and mineralogical compositions with MELTS [Ghiorso, M.S., Sack, R.O., Contrib. Mineral. Petrol. 119 (1995) 197–212] fractional crystallization calculations reveals that the evolved andesites can be derived only by low-pressure (0.3 kbar) fractionation of magmas compositionally similar to the least evolved group 2 basalts. Finally, we suggest that the high vesicularity of the basalts sampled at relatively great depths (>2400 m) on the edifice is governed by H2O and, probably, CO2 exsolution, and is not indicative of eruption in shallow water.
108.
109.
Geophysical data sets are growing at an ever-increasing rate, requiring computationally efficient data selection (thinning) methods to preserve essential information. Satellites such as WindSat provide large data sets for assessing the accuracy and computational efficiency of data selection techniques. A new data thinning technique based on support vector regression (SVR) is developed and tested. To manage large online satellite data streams, observations from WindSat are formed into subsets by Voronoi tessellation and each subset is then thinned by SVR (TSVR). Three experiments are performed. The first confirms the viability of TSVR for a relatively small sample, comparing it to several commonly used data thinning methods (random selection, averaging and Barnes filtering) and producing a 10% thinning rate (90% data reduction), low mean absolute errors (MAE) and large correlations with the original data. A second experiment, using a larger dataset, shows TSVR retrievals with MAE < 1 m s-1 and correlations of 0.98. TSVR was an order of magnitude faster than the commonly used thinning methods. A third experiment applies a two-stage pipeline to TSVR to accommodate online data. The pipeline subsets reconstruct the wind field with the same accuracy as in the second experiment and are an order of magnitude faster than non-pipeline TSVR. Pipeline TSVR is therefore two orders of magnitude faster than commonly used thinning methods that ingest the entire data set. This study demonstrates that TSVR pipeline thinning is an accurate and computationally efficient alternative to commonly used data selection techniques.
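A minimal sketch of the thinning idea follows: observations are grouped into spatial cells (nearest-centroid assignment, i.e. a Voronoi partition of cluster centroids, standing in for the paper's tessellation), an epsilon-insensitive SVR is fitted in each cell, and only its support vectors are retained as the thinned set. The cell count, kernel and C/epsilon values are illustrative assumptions, not the paper's configuration.

```python
# Minimal sketch of SVR-based observation thinning (TSVR-style) for scattered
# wind-speed observations.  Cluster centroids define a Voronoi-like partition;
# within each cell an epsilon-insensitive SVR is fitted and only its support
# vectors are kept.  All parameter values are illustrative.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVR

def thin_observations(lonlat: np.ndarray, wind: np.ndarray,
                      n_cells: int = 4, epsilon: float = 0.5) -> np.ndarray:
    """Return indices of retained (support-vector) observations."""
    cells = KMeans(n_clusters=n_cells, n_init=10, random_state=0).fit_predict(lonlat)
    keep = []
    for c in range(n_cells):
        idx = np.flatnonzero(cells == c)
        if idx.size < 5:                 # keep tiny cells untouched
            keep.extend(idx)
            continue
        svr = SVR(kernel="rbf", C=10.0, epsilon=epsilon).fit(lonlat[idx], wind[idx])
        keep.extend(idx[svr.support_])   # support vectors form the thinned subset
    return np.sort(np.asarray(keep))

# Hypothetical usage with synthetic data.
rng = np.random.default_rng(1)
pts = rng.uniform([120.0, -10.0], [150.0, 20.0], size=(500, 2))
speed = 8 + 2 * np.sin(pts[:, 0] / 5) + rng.normal(0, 0.3, 500)
kept = thin_observations(pts, speed)
print(f"kept {kept.size} of {speed.size} observations")
```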
110.
The sedimentological and geochemical properties of a 7.47 m long laminated sequence from hypersaline Lake Yoa in northern Chad have been investigated, representing a unique, continuous 6100-year-long continental record of climate and environmental change in the eastern Central Sahara. These data were used to reconstruct the Mid to Late Holocene history of this currently hyper-arid region, in order to address the question of whether the Mid Holocene environmental transition from a humid to a dry Sahara was progressive or abrupt. The study involved a suite of analyses, including petrographic and scanning electron microscope examination of thin sections, X-ray diffraction, X-radiography, granulometry, loss on ignition and magnetic susceptibility. The potential of micro-X-ray fluorescence core scanning was tested at very high resolution. Detailed microscopic investigation revealed the sedimentary processes responsible for the formation of the fine laminations, identified the season during which they formed, and confirmed their annually rhythmic nature. High-resolution X-ray fluorescence core scanning allowed each individual lamination to be distinguished over the entire record, opening new perspectives for the study of finely laminated sediment sequences. Geochemical and mineralogical data reveal that, owing to decreasing monsoon rainfall combined with continuous and strong evaporation, the hydrologically open, fresh Mid Holocene Lake Yoa slowly evolved into the present-day hypersaline, calcium-depleted brine, which has existed for about the past 1050 years. During the oldest part of the investigated period, Lake Yoa probably contained a permanently stratified lower water column that was nevertheless disrupted relatively frequently by mixing events. Deep-water anoxia later became more stable because of increased salinity-driven density stratification. In parallel, the sediment grain-size proxies record a progressive increase in aeolian input over the last 6100 years. Altogether, the geochemical and sedimentological indicators point to a progressive drying of the eastern Central Sahara, strengthening previous conclusions based on palaeoecological indicators.