31.
Fuzzy set approaches to classification of rock masses   Cited in total: 6 (self-citations: 0; citations by others: 6)
A. Aydin, Engineering Geology, 2004, 74(3-4): 227-245
Rock mass classification is analogous to a multi-feature pattern-recognition problem. The objective is to assign a rock mass to one of the pre-defined classes using a given set of criteria. This process involves a number of subjective uncertainties stemming from: (a) qualitative (linguistic) criteria; (b) sharp class boundaries; (c) fixed rating (or weight) scales; and (d) variable input reliability. Fuzzy set theory enables a soft approach that accounts for these uncertainties by allowing the expert to participate in this process in several ways. Hence, this study was designed to investigate the earlier fuzzy rock mass classification attempts and to devise improved methodologies to utilize the theory more accurately and efficiently. As in the earlier studies, the Rock Mass Rating (RMR) system was adopted as a reference conventional classification system because of its simple linear aggregation.

The proposed classification approach is based on the concept of partial fuzzy sets representing the variable importance or recognition power of each criterion in the universal domain of rock mass quality. The method enables one to evaluate rock mass quality using any set of criteria, and it is easy to implement. To reduce uncertainties due to project- and lithology-dependent variations, partial membership functions were formulated considering shallow (<200 m) tunneling in granitic rock masses. This facilitated a detailed expression of the variations in the classification power of each criterion along the corresponding universal domains. The binary relationship tables generated using these functions were processed not to derive a single class but rather to plot criterion contribution trends (stacked area graphs) and belief surface contours, which proved to be very satisfactory in difficult decision situations. Four input scenarios were selected to demonstrate the efficiency of the proposed approach in different situations and with reference to the earlier approaches.
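As a rough illustration of the fuzzy classification idea, the sketch below grades a crisp rock-mass quality score against overlapping classes. The triangular membership shapes and class prototypes are invented for the example and are not the paper's calibrated partial membership functions:

```python
def trimf(x, a, b, c):
    """Triangular membership function: 0 at a and c, 1 at b."""
    left = (x - a) / (b - a) if b > a else 1.0
    right = (c - x) / (c - b) if c > b else 1.0
    return max(0.0, min(left, right, 1.0))

# Illustrative class prototypes on the 0-100 rock mass quality axis;
# these are NOT the paper's calibrated partial membership functions
CLASSES = {"very poor": (0, 10, 30), "poor": (10, 30, 50), "fair": (30, 50, 70),
           "good": (50, 70, 90), "very good": (70, 90, 100)}

def fuzzy_classify(score):
    """Membership grade of a crisp quality score in each overlapping class."""
    return {name: trimf(score, *abc) for name, abc in CLASSES.items()}

grades = fuzzy_classify(62.0)
best = max(grades, key=grades.get)  # 'good', with grade 0.6
```

Unlike the sharp class boundaries of conventional RMR, a score of 62 here belongs partly to "fair" (0.4) and partly to "good" (0.6), which is the kind of soft assignment the fuzzy approach exploits.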

32.
Weights of evidence (WofE) modeling is usually applied to map mineral potential in areas with a large number of deposits/prospects. In this paper, WofE modeling is applied to a case study area measuring about 920 km2 with 12 known porphyry copper prospects. A pixel size of 100 m × 100 m was used in the spatial data analyses to represent, in a raster-based GIS, the lateral extents of prospects and of the geological features considered as spatial evidence. Predictor maps were created based on (a) estimates of studentized values of positive spatial association between prospects and spatial evidence; (b) the proportion of prospects in zones where spatial evidence is present; and (c) geological interpretations of positive spatial association between prospects and spatial evidence. Uncertainty due to missing geochemical evidence is shown to influence tests of the assumption of conditional independence (CI) among predictor maps with respect to prospects. For the final predictive model, the assumption of CI is rejected based on an omnibus test but accepted based on a new omnibus test. The final predictive model, which delineates 30% of the study area as zones with potential for porphyry copper, has an 83% success rate and a 73% prediction rate. The results demonstrate the plausibility of WofE modeling of mineral potential in large areas with a small number of mineral prospects.
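The core WofE calculation can be sketched as follows. The cell count comes from the abstract (920 km2 at 100 m × 100 m gives 92,000 unit cells, 12 prospects), but the evidence counts are hypothetical, chosen only to show the arithmetic of the standard positive/negative weights and their contrast:

```python
import math

def wofe_weights(n_cells, n_dep, n_cells_b, n_dep_b):
    """Positive weight W+, negative weight W-, and contrast C = W+ - W- for a
    binary evidence layer. n_cells: total unit cells; n_dep: cells containing
    a prospect; n_cells_b: cells where evidence is present; n_dep_b: prospects
    coinciding with evidence."""
    p_b_d = n_dep_b / n_dep                             # P(B | D)
    p_b_nd = (n_cells_b - n_dep_b) / (n_cells - n_dep)  # P(B | ~D)
    w_plus = math.log(p_b_d / p_b_nd)
    w_minus = math.log((1.0 - p_b_d) / (1.0 - p_b_nd))
    return w_plus, w_minus, w_plus - w_minus

# 92,000 unit cells and 12 prospects (from the abstract);
# the evidence-layer counts below are hypothetical
wp, wm, contrast = wofe_weights(92_000, 12, 9_000, 8)
```

A positive W+ and negative W- (i.e. a large contrast) indicate the positive spatial association between prospects and evidence that the predictor maps are built on; with only 12 prospects the counts behind these ratios are small, which is why the paper's handling of uncertainty matters.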
33.
Models capable of estimating losses in future earthquakes are of fundamental importance for emergency planners, for the insurance and reinsurance industries, and for code drafters. Constructing a loss model for a city, region or country involves compiling databases of earthquake activity, ground conditions, attenuation equations, building stock and infrastructure exposure, and vulnerability characteristics of the exposed inventory, all of which have large associated uncertainties. Many of these uncertainties can be classified as epistemic, implying—at least in theory—that they can be reduced by acquiring additional data or improved understanding of the physical processes. The effort and cost involved in refining the definition of each component of a loss model can be very large, for which reason it is useful to identify the relative impact on the calculated losses due to variations in these components. A mechanically sound displacement‐based approach to loss estimation is applied to a test case of buildings along the northern side of the Sea of Marmara in Turkey. Systematic variations of the parameters defining the demand (ground motion) and the capacity (vulnerability) are used to identify the relative impacts on the resulting losses, from which it is found that the influence of the epistemic uncertainty in the capacity is larger than that of the demand for a single earthquake scenario. Thus, the importance of earthquake loss models which allow the capacity parameters to be customized to the study area under consideration is highlighted. Copyright © 2005 John Wiley & Sons, Ltd.
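The "systematic variations" idea can be illustrated with a one-at-a-time sensitivity sketch. This is a deliberately toy loss proxy (a lognormal fragility on a demand/capacity ratio), not the paper's displacement-based method; all parameter values are invented:

```python
from math import erf, log, sqrt

def damage_prob(demand, capacity, beta=0.5):
    """Toy lognormal fragility: probability that displacement demand exceeds
    capacity, with beta the logarithmic dispersion (all values illustrative)."""
    return 0.5 * (1.0 + erf(log(demand / capacity) / (beta * sqrt(2.0))))

base = damage_prob(0.10, 0.12)  # baseline demand and capacity (m), hypothetical
# one-at-a-time +/-20% swings to compare the influence of each parameter
demand_swing = damage_prob(0.12, 0.12) - damage_prob(0.08, 0.12)
capacity_swing = damage_prob(0.10, 0.096) - damage_prob(0.10, 0.144)
```

Comparing the swings produced by equal perturbations of demand and capacity is the simplest version of the relative-impact ranking the abstract describes; the paper reaches its conclusion (capacity dominates for a single scenario) with a far richer parameterization.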
34.
River flooding is a problem of international interest. In the past few years many countries suffered from severe floods. A large part of the Netherlands is below sea level and river levels. The Dutch flood defences along the river Rhine are designed for water levels with a probability of exceedance of 1/1250 per year. These water levels are computed with a hydrodynamic model using a deterministic bed level and a deterministic design discharge. Traditionally, the safety against flooding in the Netherlands is obtained by building and reinforcing dikes. Recently, a new policy was proposed to cope with increasing design discharges in the Rhine and Meuse rivers. This policy is known as the Room for the River (RfR) policy, in which a reduction of flood levels is achieved by measures creating space for the river, such as dike replacement, side channels and floodplain lowering. As compared with dike reinforcement, these measures may have a stronger impact on flow and sediment transport fields, probably leading to stronger morphological effects. As a result of the latter the flood conveyance capacity may decrease over time. An a priori judgement of safety against flooding on the basis of an increased conveyance capacity of the river can be quite misleading. Therefore, the determination of design water levels using a fixed-bed hydrodynamic model may not be justified and the use of a mobile-bed approach may be more appropriate. This problem is addressed in this paper, using a case study of the river Waal (one of the Rhine branches in the Netherlands). The morphological response of the river Waal to a flood protection measure (floodplain lowering in combination with summer levee removal) is analysed. The effect of this measure is subject to various sources of uncertainty. Monte Carlo simulations are applied to calculate the impact of uncertainties in the river discharge on the bed levels. 
The impact of the “uncertain” morphological response on design flood level predictions is analysed for three phenomena, viz. the impact of the spatial morphological variation over the years, the impact of the seasonal morphological variation, and the impact of the morphological variability around bifurcation points. The impact of seasonal morphological variations turns out to be negligible, but the other two phenomena each appear to have an appreciable impact (on the order of 0.05–0.1 m) on the computed design water levels. We note, however, that other sources of uncertainty (e.g. uncertainty in the hydraulic roughness predictor), which may be influential, are not taken into consideration. In fact, the present investigation is limited to the sensitivity of the design water levels to uncertainties in the predicted bed level.
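The Monte Carlo propagation described above can be sketched in miniature: draw an uncertain discharge and an uncertain bed level, push each draw through a stage relation, and read a design value off the resulting distribution. The Chezy-type stage relation and all distributions below are illustrative stand-ins, not the paper's calibrated hydrodynamic model:

```python
import random

random.seed(0)

def design_level(discharge, bed_level, width=350.0, chezy=45.0, slope=1e-4):
    """Toy Chezy stage relation: depth h = (Q / (B * C * sqrt(i)))**(2/3),
    water level = bed level + depth. All parameter values are illustrative."""
    depth = (discharge / (width * chezy * slope ** 0.5)) ** (2.0 / 3.0)
    return bed_level + depth

# Monte Carlo propagation of an uncertain design discharge and a
# morphologically varying bed level (both spreads assumed, not from the paper)
levels = []
for _ in range(10_000):
    q = random.gauss(16_000, 1_500)   # design discharge draw (m3/s)
    zb = random.gauss(0.0, 0.05)      # bed-level deviation draw (m)
    levels.append(design_level(q, zb))
levels.sort()
p95 = levels[int(0.95 * len(levels))]  # an upper percentile as a design value
```

The mobile-bed point of the paper corresponds to the `zb` term: with a fixed bed it is identically zero, and the spread of `levels` (hence the design value) narrows accordingly.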
35.
IPCC reports provide a synthesis of the state of the science in order to inform the international policy process. This task is made difficult by the presence of deep uncertainty in the climate problem that results from long time scales and complexity. This paper focuses on how deep uncertainty can be effectively communicated. We argue that existing schemes do an inadequate job of communicating deep uncertainty and propose a simple approach that distinguishes between various levels of subjective understanding in a systematic manner. We illustrate our approach with two examples. To cite this article: M. Kandlikar et al., C. R. Geoscience 337 (2005).
36.
The accurate measurement of precipitation is essential to understanding regional hydrological processes and the hydrological cycle. Quantification of precipitation over remote regions such as the Tibetan Plateau is highly unreliable because of the scarcity of rain gauges. The objective of this study is to evaluate the performance of the Tropical Rainfall Measuring Mission (TRMM) 3B42 v7 satellite precipitation product at daily, weekly, monthly, and seasonal scales. Comparison between TRMM grid precipitation and point-based rain gauge precipitation was conducted using nearest neighbour and bilinear weighted interpolation methods. The results showed that the TRMM product could not capture daily precipitation well, because some rainfall events were missed at short time scales, but it provided reasonably good precipitation data at weekly, monthly, and seasonal scales. TRMM tended to underestimate the precipitation of small rainfall events (less than 1 mm/day), while it overestimated the precipitation of large rainfall events (greater than 20 mm/day). Consequently, TRMM showed better performance in the summer monsoon season than in the winter season. Through this comparison, it was also found that the bilinear weighted interpolation method performs better than the nearest neighbour method for TRMM precipitation extraction.
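The two grid-to-gauge extraction methods compared in the study can be sketched generically. The 2 × 2 patch of daily totals below is hypothetical; the point is only the difference between snapping to the nearest cell and weighting the four surrounding cells:

```python
import math

def bilinear(grid, x, y):
    """Bilinearly interpolate a row-major 2-D grid at fractional index (x, y)."""
    x0, y0 = int(math.floor(x)), int(math.floor(y))
    dx, dy = x - x0, y - y0
    return (grid[y0][x0] * (1 - dx) * (1 - dy)
            + grid[y0][x0 + 1] * dx * (1 - dy)
            + grid[y0 + 1][x0] * (1 - dx) * dy
            + grid[y0 + 1][x0 + 1] * dx * dy)

def nearest(grid, x, y):
    """Nearest-neighbour lookup at fractional index (x, y)."""
    return grid[round(y)][round(x)]

# 2 x 2 patch of hypothetical gridded daily totals (mm) surrounding a gauge
patch = [[2.0, 4.0],
         [6.0, 8.0]]
at_gauge_bilinear = bilinear(patch, 0.5, 0.5)  # 5.0: distance-weighted average
at_gauge_nearest = nearest(patch, 0.5, 0.5)    # snaps to a single cell value
```

For a gauge sitting between cells, bilinear weighting blends the neighbouring grid values instead of committing to one cell, which is consistent with the smoother comparison the study reports for that method.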
37.
Prediction intervals (PIs) are commonly used to quantify the accuracy and precision of a forecast. However, traditional ways of constructing PIs typically require strong assumptions about the data distribution and involve a large computational burden. Here, we improve upon the recently proposed Lower Upper Bound Estimation method and extend it to a multi-objective framework. The proposed methods are demonstrated using a real-world flood forecasting case study for the upper Yangtze River Watershed. Results indicate that the proposed methods are able to efficiently construct appropriate PIs, while outperforming other methods including the widely used Generalized Likelihood Uncertainty Estimation approach. Copyright © 2016 John Wiley & Sons, Ltd.
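Methods in the Lower Upper Bound Estimation family are typically judged by the trade-off between interval coverage and interval width, which is also what a multi-objective framework optimizes. A minimal sketch of those two standard criteria, on hypothetical forecast data:

```python
def pi_metrics(lower, upper, observed):
    """PI coverage probability (PICP) and mean PI width (MPIW), two standard
    criteria for judging prediction-interval quality."""
    n = len(observed)
    covered = sum(1 for lo, hi, y in zip(lower, upper, observed) if lo <= y <= hi)
    width = sum(hi - lo for lo, hi in zip(lower, upper))
    return covered / n, width / n

# Hypothetical forecast intervals and observations for four time steps
lower = [0.0, 1.0, 2.0, 3.0]
upper = [2.0, 3.0, 4.0, 5.0]
obs = [1.0, 2.5, 5.0, 4.0]
picp, mpiw = pi_metrics(lower, upper, obs)  # coverage 0.75, mean width 2.0
```

A good set of PIs keeps PICP at or above the nominal confidence level while keeping MPIW small; the two objectives pull in opposite directions, which is why a multi-objective treatment is natural here.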
38.
The last decade has seen major technical and scientific improvements in the study of water transfer time through catchments. Nevertheless, it has been argued that most of these developments used conservative tracers that may disregard the oldest component of water transfer, which often has transit times greater than 5 years. Indeed, although the analytical reproducibility of tracers limits the detection of the older flow components associated with the most dampened seasonal fluctuations, this is very rarely taken into account in modelling applications. Tritium is the only environmental tracer at hand to investigate transfer times in the 5- to 50-year range in surface waters, as dissolved gases are not suitable due to the degassing process. Water dating with tritium has often been difficult because of the complex history of its atmospheric concentration, but its current stabilization, together with recent analytical improvements, opens promising perspectives. In this context, the innovative contribution of this study lies in the development of a generalized likelihood uncertainty estimation (GLUE)-based approach for analysing the uncertainties associated with the modelling of transit time due to both parameter identification and tracer analytical precision issues. A coupled resampling procedure allows assessment of the statistical significance of the transfer time differences found in diverse waters. This approach was developed for tritium and the exponential-piston model but can be implemented for virtually any tracer and model. Stream baseflow, spring and shallow aquifer waters from the Vallcebre research catchments, analysed for tritium in different years with different analytical precisions, were investigated using this approach, taking into account other sources of uncertainty. The results showed three groups of waters with different mean transit times; all of the stream baseflow and spring waters were older than the 5-year threshold and thus required tritium for dating.
Low sensitivity of the results to the model structure was also demonstrated. Dual solutions were found for the waters sampled in 2013, but these results may be disambiguated when additional analyses are made in a few years. Copyright © 2016 John Wiley & Sons, Ltd.
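The forward model underlying such a calibration can be sketched as the convolution of a tritium input history with the exponential-piston model (EPM) transit-time distribution, including radioactive decay. The input history and parameter values below are hypothetical; only the EPM density and the tritium half-life are standard:

```python
from math import exp, log

HALF_LIFE = 12.32               # tritium half-life (years)
DECAY = log(2) / HALF_LIFE

def epm_pdf(t, tau, eta):
    """Exponential-piston model transit-time density with mean transit time
    tau and EPM ratio eta (the density is zero before tau * (1 - 1/eta))."""
    t0 = tau * (1.0 - 1.0 / eta)
    if t < t0:
        return 0.0
    return (eta / tau) * exp(-eta * t / tau + eta - 1.0)

def modelled_output(input_series, tau, eta, dt=1.0):
    """Convolve a yearly tritium input history (oldest first, in TU) with the
    EPM response, applying radioactive decay over the transit time."""
    n = len(input_series)
    return sum(input_series[n - 1 - i] * epm_pdf(i * dt, tau, eta)
               * exp(-DECAY * i * dt) * dt for i in range(n))

# Hypothetical smoothed 60-year input history of 10 TU per year
c = modelled_output([10.0] * 60, tau=15.0, eta=1.2)
```

In a GLUE-style analysis, many (tau, eta) pairs would be sampled, and each pair whose modelled output matches the measured tritium within the analytical precision would be retained as behavioural; the "dual solutions" mentioned above arise when disjoint parameter regions fit equally well.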
39.
Historically, observing snow depth over large areas has been difficult. When snow depth observations are sparse, regression models can be used to infer the snow depth over a given area. Data sparsity has also left many important questions about such inference unexamined. Improved inference, or estimation, of snow depth and its spatial distribution from a given set of observations can benefit a wide range of applications, from water resource management, to ecological studies, to validation of satellite estimates of snow pack. The development of Light Detection and Ranging (LiDAR) technology has provided non-sparse snow depth measurements, which we use in this study to address fundamental questions about snow depth inference using both sparse and non-sparse observations. For example, when are more data needed and when are data redundant? Results apply both to traditional manual snow depth measurements and to LiDAR observations. Through sampling experiments on high-resolution LiDAR snow depth observations at six separate 1.17-km2 sites in the Colorado Rocky Mountains, we provide novel perspectives on a variety of issues affecting the regression estimation of snow depth from sparse observations. We measure the effects of observation count, random selection of observations, quality of predictor variables, and cross-validation procedures using three skill metrics: percent error in total snow volume, root mean squared error (RMSE), and R2. Extremes of predictor quality are used to understand the range of its effect: how do predictors downloaded from the internet perform against more accurate predictors measured by LiDAR? Whereas cross validation remains the only option for validating inference from sparse observations, in our experiments the full set of LiDAR-measured snow depths can be considered the ‘true’ spatial distribution and used to understand cross-validation bias at the spatial scale of inference.
We model at the 30-m resolution of readily available predictors, which is a popular spatial resolution in the literature. Three regression models are also compared, and we briefly examine how sampling design affects model skill. Results quantify the primary dependence of each skill metric on observation count, which ranges over three orders of magnitude, doubling at each step from 25 up to 3200. Whereas uncertainty (resulting from random selection of observations) in percent error of true total snow volume is typically well constrained by 100–200 observations, there is considerable uncertainty in the inferred spatial distribution (R2) even at medium observation counts (200–800). We show that percent error in total snow volume is not sensitive to predictor quality, although RMSE and R2 (measures of spatial distribution) often depend critically on it. Inaccuracies of downloaded predictors (most often the vegetation predictors) can easily require a quadrupling of observation count to match the RMSE and R2 scores obtained by LiDAR-measured predictors. Under cross validation, the RMSE and R2 skill measures are consistently biased towards poorer results than their true validations. This is primarily a result of greater variance at the spatial scales of the point observations used for cross validation than at the 30-m resolution of the model. The magnitude of this bias depends on individual site characteristics, observation count (for our experimental design), and sampling design. Sampling designs that maximize independent information maximize cross-validation bias but also maximize true R2. The bagging tree model is found to generally outperform the other regression models in the study on several criteria. Finally, we discuss and recommend the use of LiDAR in conjunction with regression modelling to advance understanding of snow depth spatial distribution at spatial scales of thousands of square kilometres. Copyright © 2012 John Wiley & Sons, Ltd.
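The sampling-experiment logic (subsample a "true" field, fit a regression, score against the full field) can be sketched in miniature. The synthetic elevation-driven depth field, its coefficients, and the noise level below are all invented for illustration; the study instead uses real LiDAR fields, several predictors, and three regression models:

```python
import random

random.seed(42)

# Synthetic 'true' snow depth field driven by a single predictor (elevation),
# plus noise; all numbers are invented, not taken from the study.
n = 2000
elev = [random.uniform(2500.0, 3500.0) for _ in range(n)]
depth = [0.002 * (e - 2500.0) + random.gauss(0.0, 0.3) for e in elev]

def fit_and_score(k):
    """Fit depth ~ elevation by least squares on k sampled points,
    then score RMSE against the full 'true' field."""
    idx = random.sample(range(n), k)
    xs = [elev[i] for i in idx]
    ys = [depth[i] for i in idx]
    mx, my = sum(xs) / k, sum(ys) / k
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return (sum((a + b * e - d) ** 2 for e, d in zip(elev, depth)) / n) ** 0.5

rmse_small = fit_and_score(25)    # sparse-observation skill
rmse_large = fit_and_score(800)   # dense-observation skill
```

Repeating `fit_and_score` over many random draws at each observation count gives the skill-versus-count curves, and the spread across draws quantifies the uncertainty due to random selection of observations that the study reports.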
40.
The south-west region of the Goulburn–Broken catchment in the south-eastern Murray–Darling Basin in Australia faces a range of natural resource challenges. A balanced strategy is required to achieve the contrasting objectives of remediating land salinization and reducing salt export, while maintaining water supply security to satisfy human consumption and support ecosystems. This study linked the Catchment Analysis Tool (CAT), comprising a suite of farming system models, to the catchment-scale CATNode hydrological model to investigate the effects of land use change and climate variation on catchment streamflow and salt export. The modelling explored and contrasted the impacts of a series of different revegetation and climate scenarios. The results indicated that revegetation targeted only at satisfying biodiversity outcomes within a catchment is unlikely to have a much greater impact on streamflow and salt load than simple random plantings. The results also indicated that revegetation aimed at reducing salt export can do so effectively while having a disproportionately smaller effect on streamflows. Furthermore, streamflow declines can be minimized by targeting revegetation activities without significantly altering salt export. The study also found that climate change scenarios will have an equal if not more significant impact on these issues over the next 70 years. The uncertainty in CATNode streamflow predictions arising from parameter uncertainty was also investigated. Copyright © 2012 John Wiley & Sons, Ltd.