61.
An important task in modern geostatistics is the assessment and quantification of resource and reserve uncertainty. This uncertainty is valuable support information for many management decisions. Both the uncertainty at specific locations and the uncertainty in the global resource are of interest. There are many different methods for building models of uncertainty, including kriging, cokriging and inverse distance, and each method leads to different results. A method is proposed to combine the local uncertainties predicted by different models into a single measure that retains the good features of each alternative. The new estimator is the overlap of the alternate conditional distributions.
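As a minimal illustration of one way such a combination could work (the abstract does not specify the exact combination rule), the sketch below forms a simple weighted linear pool of the local conditional CDFs produced by several estimators. The function name, weights and example values are hypothetical, not the paper's "overlap" estimator.

```python
import numpy as np

def pooled_cdf(cdfs, weights=None):
    """Combine local conditional CDFs (one row per estimator, evaluated at
    common thresholds) into a single CDF by a weighted linear pool."""
    cdfs = np.asarray(cdfs, dtype=float)          # shape: (n_models, n_thresholds)
    if weights is None:
        weights = np.full(len(cdfs), 1.0 / len(cdfs))
    combined = np.average(cdfs, axis=0, weights=weights)
    return np.clip(combined, 0.0, 1.0)            # keep the result a valid CDF

# Example: CDF values of grade at one location from three estimators,
# all evaluated at the same thresholds (hypothetical numbers).
cdfs = [[0.10, 0.40, 0.75, 0.95],   # kriging
        [0.15, 0.45, 0.70, 0.90],   # cokriging
        [0.05, 0.35, 0.80, 0.97]]   # inverse distance
print(pooled_cdf(cdfs))             # -> [0.1  0.4  0.75 0.94]
```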
62.
Seismic hazard analysis is based on data and models, both of which are imprecise and uncertain. In particular, the interpretation of historical information into earthquake parameters, e.g. earthquake size and location, yields ambiguous and imprecise data. Models based on probability distributions have been developed in order to quantify and represent these uncertainties. Nevertheless, the majority of the procedures applied in seismic hazard assessment neither take these uncertainties into account nor show the variance of the results. Therefore, a procedure based on Bayesian statistics was developed to estimate return periods for different ground motion intensities (MSK scale). Bayesian techniques provide a mathematical model to estimate the distribution of random variables in the presence of uncertainties. The developed method estimates the probability distribution of the number of occurrences in a Poisson process described by the parameter λ. The input data are the historical occurrences of intensities at a particular site, represented by a discrete probability distribution for each earthquake. The calculation of these historical occurrences requires careful preparation of all input parameters, i.e. a modelling of their uncertainties. The obtained results show that the variance of the recurrence rate is smaller in regions with higher seismic activity than in less active regions. It can also be demonstrated that long return periods cannot be estimated with confidence, because the period of observation is too short. This indicates that the long return periods obtained by seismic source methods only reflect the delineated seismic sources and the chosen earthquake size distribution law.
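As a hedged illustration of the Bayesian estimation step (the abstract does not give the prior or the exact likelihood), the sketch below uses a conjugate Gamma-Poisson update for the annual occurrence rate of a given intensity at a site and derives the implied distribution of the return period. The prior parameters, event count and observation window are hypothetical.

```python
import numpy as np

def posterior_return_period(n_events, t_years, alpha0=0.5, beta0=0.01,
                            n_samples=100_000, seed=0):
    """Conjugate Gamma-Poisson update for the annual occurrence rate of a
    given intensity, followed by Monte Carlo conversion to return periods.
    alpha0/beta0 define a (hypothetical) weakly informative Gamma prior."""
    alpha = alpha0 + n_events          # prior shape + observed occurrences
    beta = beta0 + t_years             # prior rate + length of observation
    rng = np.random.default_rng(seed)
    lam = rng.gamma(alpha, 1.0 / beta, n_samples)   # posterior samples of the rate
    return 1.0 / lam                                # return-period samples

# Hypothetical site: 4 exceedances of a target intensity in 300 years of record.
T = posterior_return_period(n_events=4, t_years=300.0)
print("median return period [yr]:", round(np.median(T), 1))
print("16%-84% range [yr]:", np.percentile(T, [16, 84]).round(1))
```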
63.
DMS (dimethylsulfide), a breakdown product of cellular solutes of many species of macroalgae and phytoplankton, plays an important role in regulating the global climate and partly counteracting the "greenhouse" effect. In this paper, the advances and prospects of DMS research are reviewed and discussed with respect to DMS sample storage, measurement, and its importance in regulating the global climate and the acidity of rain and aerosols.
64.
For water inrush in bauxite deposits, in-situ stress measurement is an important means of analyzing and ascertaining the causes of inrush, optimizing prevention and control schemes, and managing the deposit floor rationally. Measurements and analysis at the Jiagou deposit show that: ① a relatively high horizontal tectonic stress exists in the deposit area, with a magnitude clearly exceeding the vertical stress; this high horizontal tectonic stress adversely affects the stability of the deposit floor and its slopes, and the axes of the mining faces are best laid out along the N70°E direction; ② continued mining has adversely disturbed the in-situ stress state near the deposit, and the superposition of the high horizontal tectonic stress with the adverse local stress changes induced by mining acts as a powerful "catalyst" for water held in the various fractures and structural planes of the floor rock mass, making it one of the important causes of floor water inrush; ③ prevention and control work should emphasize floor management and the scientific retention of a protective aquiclude barrier. It is further recommended that pre-grouting be applied to the floor and to the underlying limestone within a certain depth range, so that fractures are reinforced and karst cavities are sealed, thereby resisting the adverse stresses and eliminating or weakening their influence on water inrush in the deposit.
65.
This paper presents a statistical analysis of the algebraic strain estimation algorithm of Shimamoto and Ikeda [Shimamoto, T., Ikeda, Y., 1976. A simple algebraic method for strain estimation from deformed ellipsoidal objects: 1. Basic theory. Tectonophysics 36, 315–337]. It is argued that the error in their strain estimation procedure can be quantified using an expected discrepancy measure. An analysis of this measure demonstrates that the error is inversely proportional to the number of clasts used. The paper also examines the role of measurement error, in particular that incurred under (i) a moment-based and (ii) a manual data acquisition method. Detailed analysis of these two acquisition methods shows that in both cases the effect of measurement error on the expected discrepancy is small relative to the effect of the sample size (the number of objects). Given their relative speed advantage, this result favours the use of automated measurement methods even if they incur more measurement error on individual objects. These results are validated by means of a simulation study, as well as by reference to previous studies in the literature. They are also applied to obtain an upper bound on the error of strain estimation for various studies published in the strain analysis literature.
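The 1/N behaviour of the expected discrepancy can be reproduced with a toy Monte Carlo experiment. The sketch below uses a simplified mean-tensor estimator as a stand-in for the Shimamoto-Ikeda algorithm, with invented clast statistics; it is meant only to illustrate how the squared error of the recovered strain ratio falls off roughly as 1/N.

```python
import numpy as np

rng = np.random.default_rng(0)

def ellipse_tensor(R, phi):
    """Unit-determinant configuration tensor of an ellipse with axial
    ratio R and long-axis orientation phi."""
    c, s = np.cos(phi), np.sin(phi)
    Q = np.array([[c, -s], [s, c]])
    return Q @ np.diag([R, 1.0 / R]) @ Q.T

def estimate_Rs(n, Rs_true=2.0):
    """Deform n randomly oriented initial clasts and recover the strain
    ratio from the eigenvalues of the mean tensor (a simplified stand-in
    for the algebraic estimator, not the published algorithm)."""
    F = np.diag([np.sqrt(Rs_true), 1.0 / np.sqrt(Rs_true)])   # pure shear
    R0 = rng.lognormal(mean=0.4, sigma=0.2, size=n)           # initial axial ratios
    phi0 = rng.uniform(0.0, np.pi, size=n)                    # random orientations
    tensors = [F @ ellipse_tensor(r, p) @ F.T for r, p in zip(R0, phi0)]
    w = np.linalg.eigvalsh(np.mean(tensors, axis=0))
    return np.sqrt(w[-1] / w[0])

for n in (25, 100, 400):
    sq_err = [(estimate_Rs(n) - 2.0) ** 2 for _ in range(200)]
    print(n, np.mean(sq_err))   # expected squared discrepancy scales roughly as 1/n
```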
66.
The determination of the optimal type and placement of a nonconventional well in a heterogeneous reservoir is a challenging optimization problem, and it becomes significantly more complicated when uncertainty in the reservoir geology is included in the optimization. In this study, a genetic algorithm is applied to optimize the deployment of nonconventional wells. Geological uncertainty is accounted for by optimizing over multiple reservoir models (realizations) subject to a prescribed risk attitude. To reduce the excessive computational requirements of the base method, a new statistical proxy based on cluster analysis, which provides fast estimates of the objective function, is introduced into the optimization process. The proxy provides an estimate of the cumulative distribution function (CDF) of scenario performance, which enables the quantification of proxy uncertainty. Knowledge of the proxy-based performance estimate, in conjunction with the proxy CDF, enables the systematic selection of the most appropriate scenarios for full simulation. Application of the overall method to the optimization of monobore and dual-lateral well placement demonstrates the performance of the hybrid optimization procedure. Specifically, it is shown that by simulating only 10% or 20% of the scenarios (as selected using the proxy), optimization results very close to those achieved by simulating all cases are obtained.
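A hedged sketch of how a cluster-based proxy can screen scenarios before full simulation (the paper's exact proxy construction and CDF usage are not specified in the abstract; all names, features and values below are invented): cluster the candidate scenarios on cheap features, simulate one representative per cluster, let every scenario inherit its representative's value as a proxy estimate, and fully simulate only the top-ranked fraction.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical stand-ins: 'scenarios' are candidate well plans described by
# cheap features; full_simulation() is the expensive objective (e.g. NPV).
n_scenarios, n_features = 200, 5
features = rng.normal(size=(n_scenarios, n_features))
true_npv = features @ rng.normal(size=n_features) + 0.3 * rng.normal(size=n_scenarios)

def full_simulation(i):
    return true_npv[i]                      # placeholder for a reservoir simulation run

# 1. Cluster the scenarios and simulate one representative per cluster.
k = 20
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(features)
reps = [rng.choice(np.where(labels == c)[0]) for c in range(k)]
rep_values = {c: full_simulation(i) for c, i in enumerate(reps)}

# 2. Proxy estimate: each scenario inherits its cluster representative's value;
#    the spread of representative values gives an empirical CDF of performance.
proxy = np.array([rep_values[c] for c in labels])

# 3. Fully simulate only the top ~10% of scenarios ranked by the proxy.
top = np.argsort(proxy)[-n_scenarios // 10:]
best = max(top, key=full_simulation)
print("best scenario:", best, "proxy:", proxy[best], "true:", full_simulation(best))
```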
67.
All geochemical measurements require the taking of field samples, but the uncertainty that this process causes is often ignored when assessing the reliability of the interpretation, whether of the geochemistry or of the health implications. Recently devised methods for the estimation, optimisation and reduction of this uncertainty have been evaluated by applying them to the investigation of contaminated land. The uncertainty of measurement caused by primary sampling has been estimated for six different contaminated land site investigations, using an increasingly recognized procedure. These site investigations were selected to reflect a wide range of sizes, contaminants (organic and metals), previous land uses (e.g. tin mining, railway sidings and gas works), intended future uses (from housing to nature reserves) and routinely applied sampling methods. The results showed that the uncertainty on the measurements was substantial, ranging from 25% to 186% of the concentration values at the different sites. Sampling was identified as the dominant source of the uncertainty (>70% of the measurement uncertainty) in most cases. The fitness-for-purpose of the measurements was judged using the optimized contaminated land investigation (OCLI) method, which identifies the optimal level of uncertainty that minimizes the overall financial loss caused by the measurement procedures and by the misclassification of the contamination that the uncertainty produces. Generally the uncertainty of the actual measurements made in these site investigations was found to be sub-optimal, and too large by a factor of approximately two. The uncertainty is usually limited by the sampling, but it can be reduced by increasing the sample mass by a factor of 4 (as predicted by sampling theory). It is concluded that knowing the value of the uncertainty makes the interpretation more reliable, and that sampling is the main factor limiting most investigations. This new approach quantifies the problem for the first time, and allows sampling procedures to be critically evaluated and modified to improve the reliability of the geochemical assessment.
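The factor-of-4 increase in sample mass follows from the standard sampling-theory assumption that the sampling variance scales inversely with the sample mass (a Gy-type relation; the abstract does not state the relation explicitly, so this reconstruction is an assumption):

$$ s_{\mathrm{samp}}^{2} \propto \frac{1}{m} \quad\Longrightarrow\quad \frac{m_{2}}{m_{1}} = \left(\frac{s_{1}}{s_{2}}\right)^{2} = 2^{2} = 4 , $$

i.e. bringing a sampling uncertainty that is about twice its optimal value down to the optimal level requires roughly a four-fold increase in sample mass.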
68.
The three most important components needed for an operational flood warning system to function are: (1) a rainfall measuring system; (2) a soil moisture updating system; and (3) a surface discharge measuring system. Surface-based networks for these systems can be largely inadequate in many parts of the world, and this inadequacy particularly affects the tropics, which are most vulnerable to flooding hazards. Furthermore, the tropical regions comprise developing countries lacking the financial resources for such surface-based monitoring. The heritage of research on evaluating the potential for measuring discharge from space has now evolved into an agenda for a mission dedicated to space-based surface discharge measurements. This mission, juxtaposed with two other upcoming space-based missions, one for rainfall measurement (Global Precipitation Measurement, GPM) and one for soil moisture measurement (Hydrosphere State, HYDROS), holds promise for designing a fully space-borne system for early warning of floods. Such a system, if operational, stands to offer tremendous socio-economic benefit to many flood-prone developing nations of the tropical world. However, two competing aspects need careful assessment to justify the viability of such a system: (1) cost-effectiveness given the scarcity of surface data; and (2) flood prediction uncertainty due to uncertainty in the remote sensing measurements. This paper presents the flood hazard mitigation opportunities offered by the assimilation of the three proposed space missions within the context of these two competing aspects. The discussion is cast from the perspective of the current understanding of the prediction uncertainties associated with space-based flood prediction. A conceptual framework for a fully space-borne system for early warning of floods is proposed, and the need for retrospective validation of such a system against historical data on floods and their socio-economic impacts is stressed. This proposal for a fully space-borne system, if pursued through the wide interdisciplinary effort recommended herein, promises to enhance the utility of the three space missions beyond what their individual agendas can be expected to offer.
69.
A new earthquake catalogue for central, northern and northwestern Europe with unified Mw magnitudes, in part derived from chi-square maximum likelihood regressions, forms the basis for seismic hazard calculations for the Lower Rhine Embayment. Uncertainties in the various input parameters are introduced, a detailed seismic zonation is performed, and a recently developed technique for estimating the maximum expected magnitude is adopted and quantified. Applying the logic tree algorithm, the resulting hazard values with error estimates are obtained as fractile curves (median, 16% and 84% fractiles, and mean) plotted for pga (peak ground acceleration; median values for Cologne of 0.7 and 1.2 m/s² for probabilities of exceedance of 10% and 2%, respectively, in 50 years), for 0.4 s (0.8 and 1.5 m/s²) and 1.0 s (0.3 and 0.5 m/s²) pseudoaccelerations, and for intensity (I0 = 6.5 and 7.2). For the ground motion parameters, rock foundation is assumed. For the area near Cologne and Aachen, maps show the median and 84% fractile hazard for a 2% probability of exceedance in 50 years based on pga (maximum median value about 1.5 m/s²), and on 0.4 s (>2 m/s²) and 1.0 s (about 0.8 m/s²) pseudoaccelerations, all for rock. The pga 84% fractile map also has a maximum value above 2 m/s² and shows similarities with the median map for 0.4 s. In all maps, the maximum values fall within the area 6.2–6.3° E and 50.8–50.9° N, i.e., east of Aachen.
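A hedged sketch of the logic-tree fractile computation: each branch of the tree is one combination of input choices (zonation, maximum magnitude, ground-motion model, and so on) carrying a weight, and the weighted set of branch hazard curves defines the median, 16%/84% fractile and mean annual exceedance rates at each ground-motion level. The branch rates and weights below are invented for illustration only.

```python
import numpy as np

pga_levels = np.array([0.5, 1.0, 1.5, 2.0])          # ground-motion levels, m/s^2
branch_rates = np.array([                            # annual exceedance rates per branch
    [2.1e-3, 6.0e-4, 2.5e-4, 1.1e-4],
    [1.6e-3, 4.2e-4, 1.6e-4, 6.0e-5],
    [3.0e-3, 9.5e-4, 4.0e-4, 1.8e-4],
])
weights = np.array([0.5, 0.3, 0.2])                  # logic-tree branch weights (sum to 1)

def weighted_fractile(values, weights, q):
    """q-fractile of a discrete, weighted set of branch values."""
    order = np.argsort(values)
    cum = np.cumsum(weights[order])
    return values[order][np.searchsorted(cum, q)]

for j, a in enumerate(pga_levels):
    rates = branch_rates[:, j]
    med = weighted_fractile(rates, weights, 0.50)
    lo, hi = (weighted_fractile(rates, weights, q) for q in (0.16, 0.84))
    mean = np.average(rates, weights=weights)
    print(f"pga {a} m/s^2: median {med:.2e}, 16% {lo:.2e}, 84% {hi:.2e}, mean {mean:.2e}")
```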
70.
Under the assumptions of a triangular channel cross-section and uniform steady flow, an analytical solution for the minimum ecological in-stream flow requirement (MEIFR) is derived. Based on this analytical solution, the uncertainty of the wetted perimeter method is analyzed by comparing the two techniques for determining the critical point on the relationship curve between wetted perimeter P and discharge Q. It is clearly shown that the MEIFR results based on the curvature technique (corresponding to the maximum curvature) and the slope technique (slope equal to 1) are significantly different. On the P-Q curve, the slope at the critical point of maximum curvature is 0.39, and the MEIFR varies markedly with the choice of slope threshold. This indicates that when no well-founded slope threshold is available for the slope technique, the curvature technique may be the better choice. Applying the analytical solution of MEIFR to the losing rivers of the Western Route of the South-to-North Water Transfer Project in China, the MEIFR obtained with the curvature technique is 2.5%–23.7% of the multi-year average annual discharge, while that for the slope technique is 11%–105.7%. More general conclusions will require more detailed research covering all kinds of cross-sections.
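A hedged numerical sketch of the two critical-point criteria on a P-Q curve (the paper's solution is analytical; here the curve is built numerically with Manning's equation for an assumed triangular channel, and P and Q are normalized by their maxima so that a slope of 1 is meaningful; the geometry, roughness, bed slope and normalization are all assumptions, so the resulting percentages will differ from the paper's analytical values):

```python
import numpy as np

# Assumed triangular channel: side slope z (H:V), Manning roughness n, bed slope S0.
z, n_manning, S0 = 2.0, 0.035, 0.001
h = np.linspace(0.01, 3.0, 2000)                 # flow depth (m)
A = z * h**2                                     # cross-sectional area
P = 2.0 * h * np.sqrt(1.0 + z**2)                # wetted perimeter
Q = A * (A / P) ** (2.0 / 3.0) * np.sqrt(S0) / n_manning   # Manning discharge

Pn, Qn = P / P.max(), Q / Q.max()                # normalize both axes

# Derivatives of the P-Q curve via the depth parametrization (h is uniformly spaced).
dQ = np.gradient(Qn, h)
dP_dQ = np.gradient(Pn, h) / dQ                  # slope dP/dQ
d2P_dQ2 = np.gradient(dP_dQ, h) / dQ             # second derivative
curvature = np.abs(d2P_dQ2) / (1.0 + dP_dQ**2) ** 1.5

i_curv = np.argmax(curvature)                    # curvature technique (maximum curvature)
i_slope = np.argmin(np.abs(dP_dQ - 1.0))         # slope technique (slope = 1)

print("curvature technique: Q = %.1f%% of Qmax" % (100 * Qn[i_curv]))
print("slope technique:     Q = %.1f%% of Qmax" % (100 * Qn[i_slope]))
```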