471 search results found (search time: 15 ms)
161.
162.
This conceptual model of avalanche hazard identifies the key components of avalanche hazard and structures them into a systematic, consistent workflow for hazard and risk assessments. The method is applicable to all types of avalanche forecasting operations, and the underlying principles can be applied at any scale in space or time. The concept of an avalanche problem is introduced, describing how different types of avalanche problems directly influence the assessment and management of the risk. Four sequential questions are shown to structure the assessment of avalanche hazard, namely: (1) What type of avalanche problem(s) exists? (2) Where are these problems located in the terrain? (3) How likely is it that an avalanche will occur? and (4) How big will the avalanche be? Our objective was to develop an underpinning for qualitative hazard and risk assessments and address this knowledge gap in the avalanche forecasting literature. We used judgmental decomposition to elicit the avalanche forecasting process from forecasters and then described it within a risk-based framework that is consistent with other natural hazards disciplines.
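The four-question workflow described above can be sketched as a simple record type. The field names and example values below are illustrative stand-ins, not terminology taken from the paper:

```python
from dataclasses import dataclass

# Hypothetical sketch of the four-question assessment structure;
# field names and rating scales are illustrative, not from the paper.
@dataclass
class AvalancheProblemAssessment:
    problem_type: str  # (1) what type of avalanche problem exists
    location: str      # (2) where the problem is located in the terrain
    likelihood: str    # (3) how likely an avalanche is to occur
    size: str          # (4) how big the avalanche will be

assessment = AvalancheProblemAssessment(
    problem_type="wind slab",
    location="alpine, north through east aspects",
    likelihood="likely",
    size="D2",
)
print(assessment.problem_type)
```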
163.
The physical risk from snow avalanches poses a serious threat to mountain backcountry travelers. Avalanche risk is primarily managed by (1) assessing avalanche hazard through analysis of the local weather, snowpack, and recent avalanche activity and (2) selecting terrain that limits exposure to the identified hazard. Professional ski guides have a tremendous wealth of knowledge about using terrain to manage avalanche risk, but their expertise is tacit, which makes it difficult for them to explicitly articulate the underlying decision rules. To make this existing expertise more broadly accessible, this study examines whether it is possible to derive quantitative measures for avalanche terrain severity and condition-dependent terrain guidance directly from observed terrain selection of professional guides. We equipped lead guides at Mike Wiegele Helicopter Skiing with GPS tracking units during the 2014/2015 and 2015/2016 winters creating a dataset of 10,592 high-resolution tracked ski runs. We used four characteristics—incline, vegetation, down-slope curvature (convexities/concavities), and cross-slope curvature (gullies/ridges)—to describe the skied terrain and employed a mixed-effects ordered logistic regression model to examine the relationship between the character of most severe avalanche terrain skied on a day and the associated field-validated avalanche hazard ratings. Patterns in the regression parameter estimates reflected the existing understanding of how terrain is selected to manage avalanche risk well: the guides skied steeper, less dense vegetation, and more convoluted slopes during times of lower avalanche hazard. Avalanche terrain severity scores derived from the parameter estimates compared well to terrain previously zoned according to the Avalanche Terrain Exposure Scale. 
Using a GIS implementation of the regression analysis, we created avalanche condition-dependent maps that provide insights into what type of terrain guides deemed acceptable for skiing under different avalanche hazard conditions. These promising results highlight the potential of tracking guides’ terrain selection decisions as they manage avalanche hazard for the development of evidence-based avalanche terrain ratings and decision aids for professional and recreational backcountry travelers.
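As a rough illustration of the ordered-logistic approach described above, the sketch below fits a proportional-odds model relating slope incline to an ordinal hazard rating. The data, coefficient, and thresholds are invented stand-ins, and a plain (not mixed-effects) model is fitted with `scipy`, so this is only a minimal sketch of the technique, not the study's analysis:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(0)

# Synthetic stand-in for the terrain data: incline (degrees) driving an
# ordinal rating 0 < 1 < 2. True coefficient 0.15, thresholds 3.5 and 5.0.
n = 2000
incline = rng.uniform(10, 45, n)
latent = 0.15 * incline + rng.logistic(size=n)
y = np.digitize(latent, [3.5, 5.0])  # three ordered categories

def neg_log_lik(params):
    beta, c1, d = params
    cuts = np.array([c1, c1 + abs(d)])  # enforce ordered thresholds
    eta = beta * incline
    # cumulative probabilities P(Y <= k) under the proportional-odds model
    cum = expit(cuts[None, :] - eta[:, None])
    cum = np.hstack([np.zeros((n, 1)), cum, np.ones((n, 1))])
    p = np.clip(np.diff(cum, axis=1), 1e-12, None)
    return -np.log(p[np.arange(n), y]).sum()

fit = minimize(neg_log_lik, x0=[0.1, 3.0, 1.5], method="Nelder-Mead")
print(f"estimated incline coefficient: {fit.x[0]:.3f}")  # near 0.15
```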
164.
165.
The origin of the Martian moons, Phobos and Deimos, is still an open issue: either they are asteroids captured by Mars or they formed in situ from a circum-Mars debris disk. The capture scenario mainly relies on the remote-sensing observations of their surfaces, which suggest that the moon material is similar to outer-belt asteroid material. This scenario, however, requires high tidal dissipation rates inside the moons to account for their current orbits around Mars. Although the in situ formation scenarios have not been studied in great detail, no observational constraints argue against them. Little attention has been paid to the internal structure of the moons, yet it is pertinent for explaining their origin. The low density of the moons indicates that their interior contains significant amounts of porous material and/or water ice. The porous content is estimated to be in the range of 30–60% of the volume for both moons. This high porosity enhances the tidal dissipation rate but not sufficiently to meet the requirement of the capture scenario. On the other hand, a large porosity is a natural consequence of re-accretion of debris at Mars's orbit, thus providing support to the in situ formation scenarios. The low density also allows for abundant water ice inside the moons, which might significantly increase the tidal dissipation rate in their interiors, possibly to a sufficient level for the capture scenario. Precise measurements of the rotation and gravity field of the moons are needed to tightly constrain their internal structure in order to help answer the question of their origin.
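The porosity estimates quoted above follow from comparing bulk and grain density: phi = 1 - rho_bulk / rho_grain. A minimal sketch using Phobos's approximate bulk density (about 1.86 g/cm^3) and two assumed grain densities; the grain-density values are illustrative assumptions, not figures from the paper:

```python
# Porosity from bulk vs. grain density: phi = 1 - rho_bulk / rho_grain.
# Phobos bulk density ~1.86 g/cm^3; the grain densities below are
# illustrative assumptions for different possible rock compositions.
rho_bulk = 1.86
porosities = {rho_grain: 1.0 - rho_bulk / rho_grain for rho_grain in (2.6, 3.0)}
for rho_grain, phi in porosities.items():
    print(f"grain density {rho_grain} g/cm^3 -> porosity {phi:.0%}")
```

Denser assumed grain material implies higher porosity for the same measured bulk density, which is why the inferred porous fraction spans a range rather than a single value.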
166.
The radio emission during 201 selected X-ray solar flares was surveyed from 100 MHz to 4 GHz with the Phoenix-2 spectrometer of ETH Zürich. The selection includes all RHESSI flares larger than C5.0 jointly observed from launch until June 30, 2003. Detailed association rates of radio emission during X-ray flares are reported. In the decimeter wavelength range, type III bursts and the genuinely decimetric emissions (pulsations, continua, and narrowband spikes) were found equally frequently. Both occur predominantly in the peak phase of hard X-ray (HXR) emission, but are less in tune with HXRs than the high-frequency continuum exceeding 4 GHz, attributed to gyrosynchrotron radiation. In 10% of the HXR flares, an intense radiation of the above genuine decimetric types followed in the decay phase or later. Classic meter-wave type III bursts are associated in 33% of all HXR flares, but only in 4% are they the exclusive radio emission. Noise storms were the only radio emission in 5% of the HXR flares, some of them with extended duration. Despite the spatial association (same active region), the noise storm variations are found to be only loosely correlated in time with the X-ray flux. In a surprising 17% of the HXR flares, no coherent radio emission was found in the extremely broad band surveyed. The association but loose correlation between HXR and coherent radio emission is interpreted by multiple reconnection sites connected by common field lines.
167.
168.
Continuous, very long baseline interferometry (VLBI) campaigns over 2 weeks have been carried out repeatedly, i.e., CONT02 in October 2002, CONT05 in September 2005, CONT08 in August 2008, and CONT11 in September 2011, to demonstrate the highest accuracy of which VLBI was capable at the time. In this study, we have compared zenith total delays (ZTD) and troposphere gradients as consistently estimated from the observations of VLBI, Global Navigation Satellite Systems (GNSS), and Doppler Orbitography and Radiopositioning Integrated by Satellite (DORIS) at VLBI sites participating in the CONT campaigns. We analyzed the CONT campaigns using the state-of-the-art software following common processing strategies as closely as possible. In parallel, ZTD and gradients were derived from numerical weather models, i.e., from the global European Centre for Medium-Range Weather Forecasts (ECMWF) analysis fields, the High Resolution Limited Area Model (European sites), the Japan Meteorological Agency-Operational Meso-Analysis Field (MANAL, over Japan), and the Cloud Resolving Storm Simulator (Tsukuba, Japan). Finally, zenith wet delays were estimated from the observations of water vapor radiometers (WVR) at sites where the WVR observables are available during the CONT sessions. The best ZTD agreement, interpreted as the smallest standard deviation, was found between GNSS and VLBI techniques to be about 5–6 mm at most of the co-located sites and CONT campaigns. We did not detect any significant improvement in the ZTD agreement between various techniques over time, except for DORIS and MANAL. On the other hand, the agreement and thus the accuracy of the troposphere parameters mainly depend on the amount of humidity in the atmosphere.
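A common way to quantify the inter-technique agreement described above is the bias and standard deviation of epoch-wise ZTD differences between two co-located techniques. A minimal sketch on synthetic numbers; the series, bias, and noise level are invented, chosen only to mimic the reported 5–6 mm scatter:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic co-located ZTD series in mm (invented values, not CONT data):
# GNSS differs from VLBI by a small bias plus ~5-6 mm random scatter.
ztd_vlbi = 2400.0 + rng.normal(0.0, 20.0, 500)
ztd_gnss = ztd_vlbi + rng.normal(0.5, 5.5, 500)

diff = ztd_gnss - ztd_vlbi
bias = diff.mean()
sigma = diff.std(ddof=1)  # agreement metric: std of differences
print(f"bias {bias:.1f} mm, std {sigma:.1f} mm")
```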
169.
Determination of spherical harmonic coefficients of the Earth’s gravity field is often an ill-posed problem and leads to solving an ill-conditioned system of equations. Inversion of such a system is critical, as small errors in the data will yield large variations in the result. Regularization is a method to solve such an unstable system of equations. In this study, the direct methods of Tikhonov, truncated and damped singular value decomposition, and the iterative ν-method, algebraic reconstruction technique, range-restricted generalized minimum residual, and conjugate gradient methods are used to solve the normal equations constructed based on range-rate data of the Gravity Recovery and Climate Experiment (GRACE) for specific periods. Numerical studies show that the Tikhonov regularization and damped singular value decomposition methods, for which the regularization parameter is estimated using the quasi-optimal criterion, deliver the smoothest solutions. Each regularized solution is compared to the Global Land Data Assimilation System (GLDAS) hydrological model. The Tikhonov regularization with the L-curve delivers a solution with high correlation with this model and a relatively small standard deviation over oceans. Among the iterative methods, conjugate gradient is best suited for the same reasons, and it has the shortest computation time.
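As a minimal illustration of why such ill-conditioned systems need regularization, the sketch below compares a naive solve with Tikhonov filtering via the singular value decomposition on a small synthetic system. The matrix, noise level, and regularization parameter are invented; this is not the GRACE normal-equation setup:

```python
import numpy as np

rng = np.random.default_rng(2)

# Build a synthetic ill-conditioned system A x = b with singular values
# spanning ten decades, then add a small amount of data noise.
n = 50
U, _ = np.linalg.qr(rng.normal(size=(n, n)))
V, _ = np.linalg.qr(rng.normal(size=(n, n)))
s = np.logspace(0, -10, n)
A = U @ np.diag(s) @ V.T
x_true = rng.normal(size=n)
b = A @ x_true + 1e-6 * rng.normal(size=n)

# Naive inversion amplifies the noise by 1/s on the tiny singular values.
x_naive = np.linalg.solve(A, b)

# Tikhonov solution via SVD filtering: x = V diag(s/(s^2+lam^2)) U^T b,
# which damps the noise-dominated small-singular-value components.
lam = 1e-5
x_tik = V @ (s / (s**2 + lam**2) * (U.T @ b))

err_naive = np.linalg.norm(x_naive - x_true)
err_tik = np.linalg.norm(x_tik - x_true)
print(err_tik < err_naive)  # True: the regularized solution is far closer
```

Choosing `lam` is the crux in practice; the quasi-optimal criterion and the L-curve mentioned in the abstract are two standard data-driven ways to pick it.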
170.
The caravan park sub-sector of the Australian tourism accommodation industry provides at least half of the national tourism bed capacity, and in 2009 generated over A$1.1 billion in annual takings. However, the number of parks and park capacity is in decline nationally while both international and domestic demand for the drive-tourism experience is growing. This sets a trend towards an accommodation facilities shortage for the caravanning sector and exposes its vulnerability. This paper uses a case study of caravan parks in the Tweed Shire, New South Wales, Australia, to examine the life-cycle pattern of these parks as a discrete unit of tourist area development and to consider the sector's future. The sector's history is framed within Butler's (Canadian Geographer 24(1): 5–12 (1980)) concept of the tourist area life cycle (TALC). The historical data demonstrate the urban and market change that has occurred around and within caravan parks of this coastal region over almost two centuries. The pattern of caravan park development and evolution conformed to the involvement, exploration, development, consolidation and stagnation stages of the TALC. In 2011, caravan parks in the Tweed region were at a critical tipping point with potential for either decline or rejuvenation.
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号