21.
In river bank filtration, impurities present in the river water travel with the bank filtrate towards the pumping well. During this passage, certain impurities, such as turbidity and total coliform, are attenuated; however, some raw river water quality parameters, such as alkalinity and electrical conductivity, increase after the water passes through the porous medium. This occurs because water moving through the soil pores takes up many of the solutes that raise alkalinity and electrical conductivity. Measurements at a river bank filtration site over one year showed that alkalinity of 32–116 mg l−1 in river water increased to 159.9–222.4 mg l−1 in the bank-filtered water. Likewise, electrical conductivity increased from 131–280 μS cm−1 to 409.6–462 μS cm−1. This study uses a probabilistic approach to investigate the variation of alkalinity and electrical conductivity of the filtrate, which varies with the natural logarithm of the concentration in the influent water. The probabilistic approach has the potential to be used for simulating the variation of alkalinity and electrical conductivity in river bank filtrate. Copyright © 2011 John Wiley & Sons, Ltd.
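The log-linear dependence described above (effluent alkalinity or conductivity varying with the natural logarithm of the influent concentration) can be sketched as a simple least-squares fit. This is a generic illustration, not the study's actual model; the data and the coefficients 20.0 and 45.0 are synthetic:

```python
import numpy as np

def fit_log_relation(c_in, c_out):
    """Least-squares fit of c_out = a + b*ln(c_in)."""
    x = np.log(np.asarray(c_in, dtype=float))
    b, a = np.polyfit(x, np.asarray(c_out, dtype=float), 1)  # slope, intercept
    return a, b

# Synthetic example: effluent alkalinity generated from a known log relation.
rng = np.random.default_rng(0)
c_in = rng.uniform(32.0, 116.0, size=50)     # river-water alkalinity, mg/L
c_out = 20.0 + 45.0 * np.log(c_in)           # assumed log-linear response
a, b = fit_log_relation(c_in, c_out)         # recovers 20.0 and 45.0
```

With noiseless synthetic data the fit recovers the assumed coefficients exactly; with field data the residual scatter would quantify the probabilistic spread.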
22.
The time-dependence of earthquake occurrence is mostly ignored in standard seismic hazard assessment even though earthquake clustering is well known. In this work, we attempt to quantify the impact of more realistic dynamics on seismic hazard estimates. We include the time and space dependences between earthquakes in the hazard analysis via Monte Carlo simulations. Our target region is the Lower Rhine Embayment, a low-seismicity area in Germany. Including aftershock sequences by using the epidemic-type aftershock sequence (ETAS) model, we find that on average the hypothesis of uncorrelated random earthquake activity underestimates the hazard by 5–10 per cent. Furthermore, we show that aftershock activity of past large earthquakes can locally increase the hazard even centuries later. We also analyse the impact of the so-called long-term behaviour, assuming a quasi-periodic occurrence of main events on a major fault in that region. We find that a significant impact on hazard is only expected for the special case of a very regular recurrence of the main shocks.
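A minimal sketch of the ETAS conditional intensity underlying such simulations, λ(t) = μ + Σ K·exp(α(mᵢ − m₀))·(t − tᵢ + c)^(−p); all parameter values here are illustrative, not those calibrated for the Lower Rhine Embayment:

```python
import math

def etas_rate(t, events, mu=0.02, K=0.05, alpha=1.0, c=0.01, p=1.2, m0=2.0):
    """ETAS conditional intensity: background rate mu plus an Omori-Utsu
    aftershock term triggered by every past event (t_i, m_i)."""
    rate = mu
    for t_i, m_i in events:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m0)) * (t - t_i + c) ** (-p)
    return rate

past = [(0.0, 5.5), (10.0, 4.2)]    # (origin time in days, magnitude)
r_near = etas_rate(11.0, past)      # shortly after the second event
r_far = etas_rate(10000.0, past)    # long after both events: decays toward mu
```

In a Monte Carlo hazard run, synthetic catalogues would be drawn by thinning or cascading from this intensity and each catalogue fed through a ground-motion model.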
23.
For Probabilistic Tsunami Hazard Analysis (PTHA), we propose a logic-tree approach to construct tsunami hazard curves (the relationship between tsunami height and probability of exceedance) and present some examples for Japan for the purpose of quantitative assessment of tsunami risk for important coastal facilities. A hazard curve is obtained by integration over the aleatory uncertainties, and numerous hazard curves are obtained for different branches of the logic-tree representing epistemic uncertainty. A PTHA consists of a tsunami source model and coastal tsunami height estimation. We developed logic-tree models for local tsunami sources around Japan and for distant tsunami sources along the South American subduction zones. Logic-trees were made for tsunami source zones, size and frequency of tsunamigenic earthquakes, fault models, and the standard error of estimated tsunami heights. Numerical simulation rather than an empirical relation was used for estimating the median tsunami heights. Weights of discrete branches representing alternative hypotheses and interpretations were determined by a questionnaire survey of tsunami and earthquake experts, whereas those representing the error of the estimated value were determined on the basis of historical data. Example tsunami hazard curves are illustrated for coastal sites, and uncertainty in the tsunami hazard is displayed by 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves.
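The percentile (fractile) and mean hazard curves described above can be computed from weighted logic-tree branches as follows; this is a generic sketch, not the authors' code, and the example branch curves and weights are invented:

```python
import numpy as np

def fractile_curves(curves, weights, fractiles=(0.05, 0.16, 0.5, 0.84, 0.95)):
    """Weighted fractile and mean hazard curves over logic-tree branches.
    curves: (n_branches, n_levels) exceedance probabilities; weights sum to 1."""
    curves = np.asarray(curves, dtype=float)
    w = np.asarray(weights, dtype=float)
    mean = w @ curves                       # weight-averaged (mean) hazard curve
    out = {}
    for f in fractiles:
        col = []
        for j in range(curves.shape[1]):
            order = np.argsort(curves[:, j])            # sort branch values
            cw = np.cumsum(w[order])                    # cumulative weight
            idx = min(int(np.searchsorted(cw, f)), len(cw) - 1)
            col.append(curves[order, j][idx])
        out[f] = np.array(col)
    return out, mean

# Three hypothetical branches evaluated at two tsunami-height levels.
fr, mean = fractile_curves([[0.1, 0.01], [0.2, 0.02], [0.4, 0.04]],
                           [0.25, 0.5, 0.25])
```

At each height level the branch values are sorted and the weighted empirical distribution is read off at the requested fractile, which is how epistemic spread is usually summarized.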
24.
An improved seismic hazard model for use in performance‐based earthquake engineering is presented. The model is an improved approximation from the so‐called ‘power law’ model, which is linear in log–log space. The mathematics of the model and uncertainty incorporation is briefly discussed. Various means of fitting the approximation to hazard data derived from probabilistic seismic hazard analysis are discussed, including the limitations of the model. Based on these ‘exact’ hazard data for major centres in New Zealand, the parameters for the proposed model are calibrated. To illustrate the significance of the proposed model, a performance‐based assessment is conducted on a typical bridge, via probabilistic seismic demand analysis. The new hazard model is compared to the current power law relationship to illustrate its effects on the risk assessment. The propagation of epistemic uncertainty in the seismic hazard is also considered. To allow further use of the model in conceptual calculations, a semi‐analytical method is proposed to calculate the demand hazard in closed form. For the case study shown, the resulting semi‐analytical closed form solution is shown to be significantly more accurate than the analytical closed‐form solution using the power law hazard model, capturing the ‘exact’ numerical integration solution to within 7% accuracy over the entire range of exceedance rate. Copyright © 2007 John Wiley & Sons, Ltd.
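The baseline power-law hazard model mentioned above, λ(im) = k₀·im^(−k), is linear in log–log space and can be fitted to 'exact' hazard data by linear regression; the intensity-measure values and rates below are synthetic, not the New Zealand data:

```python
import numpy as np

def fit_power_law(im, rate):
    """Fit rate(im) = k0 * im**(-k) by linear regression in log-log space."""
    slope, intercept = np.polyfit(np.log(im), np.log(rate), 1)
    return float(np.exp(intercept)), float(-slope)   # k0, k

im = np.array([0.1, 0.2, 0.4, 0.8])    # intensity measure, e.g. Sa in g
rate = 1e-4 * im ** (-2.5)             # synthetic 'exact' hazard data
k0, k = fit_power_law(im, rate)        # recovers k0 = 1e-4, k = 2.5
```

Real PSHA curves are concave in log–log space, which is precisely the limitation the paper's improved (curvature-aware) approximation addresses.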
25.
The quantitative probabilistic assessment of the undiscovered mineral resources of the 17.1-million-acre Tongass National Forest (the largest in the United States) and its adjacent lands is a nonaggregated, mineral-resource-tract-oriented assessment designed for land-planning purposes. As such, it includes the renewed use of gross-in-place values (GIPVs) in dollars of the estimated amounts of metal contained in the undiscovered resources as a measure for land-use planning. Southeastern Alaska is geologically complex and contains a wide variety of known mineral deposits, some of which have produced important amounts of metals during the past 100 years. Regional geological, economic geological, geochemical, geophysical, and mineral exploration history information for the region was integrated to define 124 tracts likely to contain undiscovered mineral resources. Some tracts were judged to contain more than one type of mineral deposit, and each type of deposit may contain one or more metallic elements of economic interest. For tracts where information was sufficient, the minimum number of as-yet-undiscovered deposits of each type was estimated at probability levels of 0.95, 0.90, 0.50, 0.10, and 0.05. The undiscovered mineral resources of the individual tracts were estimated using the U.S. Geological Survey's MARK3 mineral-resource endowment simulator; those estimates were used to calculate GIPVs for the individual tracts, which were then aggregated to estimate the value of the undiscovered mineral resources of southeastern Alaska. The aggregated GIPV of the estimates is $40.9 billion. Analysis of this study indicates that (1) there is only a crude positive correlation between the size of individual tracts and their mean GIPVs; and (2) the number of mineral-deposit types in a tract does not dominate the GIPVs of the tracts, but the inferred presence of synorogenic-synvolcanic nickel-copper, porphyry copper skarn-related, iron skarn, and porphyry copper-molybdenum deposits does. The influence of this study on the U.S. Forest Service planning process is yet to be determined.
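MARK3 itself is a USGS program; a generic Monte Carlo aggregation of a single tract's GIPV, under hypothetical deposit-number probabilities and a hypothetical contained-metal tonnage distribution (neither taken from the study), might look like:

```python
import random

def simulate_tract_gipv(n_dist, tonnage_sampler, price_per_tonne,
                        n_iter=10000, seed=1):
    """Monte Carlo estimate of a tract's mean gross-in-place value (GIPV).
    n_dist: list of (n_deposits, probability) pairs; tonnage_sampler(rng)
    draws contained metal (tonnes) for one undiscovered deposit."""
    rng = random.Random(seed)
    counts, probs = zip(*n_dist)
    total = 0.0
    for _ in range(n_iter):
        n = rng.choices(counts, weights=probs)[0]       # number of deposits
        total += sum(tonnage_sampler(rng) for _ in range(n)) * price_per_tonne
    return total / n_iter

# Hypothetical tract: 0, 1 or 2 deposits; lognormal contained-copper tonnage.
mean_gipv = simulate_tract_gipv(
    [(0, 0.5), (1, 0.4), (2, 0.1)],
    lambda rng: rng.lognormvariate(10.0, 1.0),   # tonnes of contained metal
    price_per_tonne=8000.0,
)
```

Tract-level means produced this way can then be summed across tracts, which is the aggregation step yielding a region-wide GIPV.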
26.
Kijko, A., Retief, S. J. P., & Graham, G. (2002). Natural Hazards, 26(2), 175–201.
In this part of our study, a probabilistic seismic hazard analysis (PSHA) for Tulbagh was performed. The applied procedure is parametric and consists essentially of two steps. The first step applies to the area in the vicinity of Tulbagh and requires estimation of the area-specific parameters: the mean seismic activity rate, λ, the Gutenberg-Richter parameter, b, and the maximum regional magnitude, m_max. The second step applies to the Tulbagh site itself and consists of estimating the parameters of the distribution of the amplitude of the selected ground motion parameter. The current application of the procedure provides an assessment of the PSHA in terms of peak ground acceleration (PGA) and spectral acceleration (SA). The procedure permits the combination of both historical and instrumental data. The historical part of the catalogue contains only the strongest events, whereas the complete part can be divided into several subcatalogues, each assumed complete above a specified magnitude threshold. In the analysis, the uncertainty in the determination of earthquake magnitude was taken into account by incorporating the concept of 'apparent magnitude'. The PSHA technique has been developed specifically for the estimation of seismic hazard at individual sites without the subjective judgement involved in the definition of seismic source zones, when specific active faults have not been mapped or identified, and where the causes of seismicity are not well understood. The results of the hazard assessment are expressed as probabilities that specified values of PGA will be exceeded during the chosen time intervals, and similarly for the spectral accelerations. A worst-case scenario indicates the possibility of a maximum PGA of 0.30 g. The results of the hazard assessment can be used as input to a seismic risk assessment.
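The area-specific step rests on a truncated Gutenberg-Richter recurrence model, and exceedance probabilities over a time interval follow from a Poisson assumption. A sketch with illustrative parameter values (not those estimated for Tulbagh):

```python
import math

def annual_rate(m, lam0, b, m_min, m_max):
    """Truncated Gutenberg-Richter annual rate of events with magnitude >= m,
    given the activity rate lam0 above m_min, the b-value, and m_max."""
    if m >= m_max:
        return 0.0
    beta = b * math.log(10.0)
    num = math.exp(-beta * (m - m_min)) - math.exp(-beta * (m_max - m_min))
    den = 1.0 - math.exp(-beta * (m_max - m_min))
    return lam0 * num / den

def exceedance_probability(rate, t_years):
    """Poisson probability of at least one occurrence in t_years."""
    return 1.0 - math.exp(-rate * t_years)

lam = annual_rate(5.0, lam0=0.5, b=0.9, m_min=4.0, m_max=6.5)
p50 = exceedance_probability(lam, 50.0)   # chance of an M>=5 event in 50 yr
```

In a full PSHA the magnitude rate would be convolved with a ground-motion model to express the same probabilities in terms of PGA or SA at the site.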
27.
Probabilistic prediction of strong earthquake activity in China by historical analogy
This paper divides 20th-century global strong-earthquake activity into seven active periods and compares them with contemporaneous records of moderate and strong earthquakes in China, producing a medium-term statistical probability prediction of future strong-earthquake activity in China; this appears to provide a fairly useful criterion.
28.
For the tornado that struck Funing, Jiangsu on 23 June 2016, two convection-allowing ensemble forecasts were designed: one driven by ERA5 reanalysis for initial and lateral boundary conditions (CEFS_ERA5), the other by NCEP GEFS (CEFS_GEFS), and the ability of the two experiments to predict this tornado was evaluated. The results show that in both convection-allowing ensembles roughly half or more of the members reproduce the characteristics of the tornadic supercell, and that 2–5 km updraft helicity (UH25) is a good forecast indicator for this tornadic supercell. Building on this analysis and accounting for position errors, a neighborhood tornado probability product based on UH25 is proposed, and the sensitivity of the tornado probability skill to the key parameters, the neighborhood radius and the UH25 threshold, is analyzed. For CEFS_ERA5 a neighborhood radius of 15 grid points and a UH25 threshold of 250 m²·s⁻² are optimal, whereas for CEFS_GEFS a radius of 15 grid points and a threshold of 100 m²·s⁻² are optimal. Overall, the neighborhood probability product markedly improves the probabilistic forecast of this tornado.
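A neighborhood probability product of the kind described can be sketched as follows; the grid, ensemble size, and UH25 values are toy data, and a square neighborhood is used instead of a circular one for simplicity:

```python
import numpy as np

def neighborhood_probability(uh_members, threshold, radius):
    """Neighborhood ensemble probability: at each grid point, the fraction of
    members whose UH25 exceeds `threshold` anywhere within `radius` grid
    points (square neighborhood)."""
    n_mem, ny, nx = uh_members.shape
    hits = np.zeros((n_mem, ny, nx), dtype=bool)
    for k in range(n_mem):
        exceed = uh_members[k] >= threshold
        for j in range(ny):
            for i in range(nx):
                j0, j1 = max(0, j - radius), min(ny, j + radius + 1)
                i0, i1 = max(0, i - radius), min(nx, i + radius + 1)
                hits[k, j, i] = exceed[j0:j1, i0:i1].any()
    return hits.mean(axis=0)      # member fraction = neighborhood probability

# Toy 10-member ensemble on a 20x20 grid; one member has a rotating storm.
uh = np.zeros((10, 20, 20))
uh[0, 10, 10] = 300.0             # UH25 in m^2 s^-2
prob = neighborhood_probability(uh, threshold=250.0, radius=3)
```

Points within the radius of the single member's storm receive probability 1/10; everywhere else the probability is zero, which is how position error is absorbed into the product.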
29.
Geotechnical engineering problems are characterized by many sources of uncertainty. Some of these sources are connected to the uncertainties of soil properties involved in the analysis. In this paper, a numerical procedure for a probabilistic analysis that considers the spatial variability of cross‐correlated soil properties is presented and applied to study the bearing capacity of spatially random soil with different autocorrelation distances in the vertical and horizontal directions. The approach integrates a commercial finite difference method and random field theory into the framework of a probabilistic analysis. Two‐dimensional cross‐correlated non‐Gaussian random fields are generated based on a Karhunen–Loève expansion in a manner consistent with a specified marginal distribution function, an autocorrelation function, and cross‐correlation coefficients. A Monte Carlo simulation is then used to determine the statistical response based on the random fields. A series of analyses was performed to study the effects of uncertainty due to the spatial heterogeneity on the bearing capacity of a rough strip footing. The simulations provide insight into the application of uncertainty treatment to geotechnical problems and show the importance of the spatial variability of soil properties with regard to the outcome of a probabilistic assessment. Copyright © 2009 John Wiley & Sons, Ltd.
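The Karhunen-Loève generation of a correlated random field can be sketched in one dimension with a discrete eigendecomposition of the covariance matrix. The paper's fields are two-dimensional, cross-correlated, and non-Gaussian; this Gaussian 1-D version, with an assumed exponential autocorrelation function and made-up parameters, only illustrates the truncated expansion:

```python
import numpy as np

def kl_random_field(x, sd, corr_len, n_terms, xi):
    """Discrete Karhunen-Loeve expansion of a zero-mean Gaussian field with
    an exponential autocorrelation function, truncated to n_terms modes.
    xi: vector of n_terms independent standard normal variables."""
    cov = sd**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    eigvals, eigvecs = np.linalg.eigh(cov)
    idx = np.argsort(eigvals)[::-1][:n_terms]     # keep the largest modes
    return eigvecs[:, idx] @ (np.sqrt(eigvals[idx]) * xi)

rng = np.random.default_rng(42)
x = np.linspace(0.0, 10.0, 50)                    # depth coordinate, m
field = kl_random_field(x, sd=2.0, corr_len=2.0, n_terms=20,
                        xi=rng.standard_normal(20))
```

A non-Gaussian marginal would then be imposed by mapping each Gaussian value through the target inverse CDF, and repeated draws of xi feed the Monte Carlo simulation.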
30.
A micropaleontologic assemblage zone is defined by the occurrence of some characteristic species among many coeval species. When the number of assemblage-defining species and the total number of species observed are designated as A and N, respectively, the ratio A/N is strongly dependent on the duration of the assemblage. Theoretical consideration on the basis of a micropaleontologic cohort model shows that, when the origination rate and extinction rate of species are known, the most reasonable ratio A/N and duration of the assemblage can be determined. The probabilistic model described in this paper provides a theoretical relation between the ratio and the duration. Inaccuracy in correlating micropaleontologic data to established assemblage zones cannot be avoided because of many natural sorting and artificial biases. The ambiguity arising when data with a small number of characteristic species are correlated with a certain assemblage is numerically estimated.