Search results: 131 matches found (query time: 31 ms)
121.
Teacher's Aide: Variogram Interpretation and Modeling   (total citations: 13; self-citations: 0; citations by others: 13)
The variogram is a critical input to geostatistical studies: (1) it is a tool to investigate and quantify the spatial variability of the phenomenon under study, and (2) most geostatistical estimation or simulation algorithms require an analytical variogram model, which they will reproduce with statistical fluctuations. In the construction of numerical models, the variogram reflects some of our understanding of the geometry and continuity of the variable, and can have a very important impact on predictions from such numerical models. The principles of variogram modeling are developed and illustrated with a number of practical examples. A three-dimensional interpretation of the variogram is necessary to fully describe geologic continuity. Directional continuity must be described simultaneously to be consistent with principles of geological deposition and for a legitimate measure of spatial variability for geostatistical modeling algorithms. Interpretation principles are discussed in detail. Variograms are modeled with particular functions for reasons of mathematical consistency. Used correctly, such variogram models account for the experimental data, geological interpretation, and analogue information. The steps in this essential data integration exercise are described in detail through the introduction of a rigorous methodology.
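The interpret-then-model workflow described above can be sketched numerically: compute an experimental semivariogram from scattered data, then represent it with a licit analytical model such as the spherical function. This is a minimal sketch; the omnidirectional pairing, the spherical choice, and the function names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def experimental_variogram(coords, values, lags, tol):
    """Omnidirectional experimental semivariogram: average half squared
    differences of pairs whose separation falls within tol of each lag."""
    gamma = np.zeros(len(lags))
    counts = np.zeros(len(lags))
    n = len(values)
    for i in range(n):
        for j in range(i + 1, n):
            h = np.linalg.norm(coords[i] - coords[j])
            for k, lag in enumerate(lags):
                if abs(h - lag) <= tol:
                    gamma[k] += 0.5 * (values[i] - values[j]) ** 2
                    counts[k] += 1
    valid = counts > 0
    gamma[valid] /= counts[valid]
    return gamma, counts

def spherical(h, nugget, sill, a):
    """Spherical model: a mathematically consistent variogram function
    that rises to the sill at range a; zero at h = 0 by definition."""
    h = np.asarray(h, dtype=float)
    g = np.where(h < a,
                 nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3),
                 sill)
    return np.where(h == 0.0, 0.0, g)
```

Fitting then amounts to choosing nugget, sill, and range so the model tracks the experimental points while honoring geological interpretation.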
122.
There exist many secondary data that must be considered in reservoir characterization for resource assessment and performance forecasting. These include multiple seismic attributes, geological trends and structural controls. It is essential that all secondary data be accounted for with the precision warranted by that data type. Cokriging is the standard technique in geostatistics to account for multiple data types. The most common variant of cokriging in petroleum geostatistics is collocated cokriging. Implementations of collocated cokriging are often limited to a single secondary variable. Practitioners often choose the most correlated or most relevant secondary variable. Improved models would be constructed if multiple variables were accounted for simultaneously. This paper presents a novel approach to (1) merge all secondary data into a single super secondary variable, then (2) implement collocated cokriging with the single variable. The preprocessing step is straightforward and no major changes are required in the standard implementation of collocated cokriging. The theoretical validity of this approach is proven, that is, the results are proven to be identical to a “full” approach using all secondary variables simultaneously.
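The merge step can be sketched as a linear combination whose weights come from the normal equations, so the super secondary variable carries the same correlation information as using all secondary variables at once. A sketch under the stated setup; the variable names and the unit-variance standardization convention are my assumptions.

```python
import numpy as np

def super_secondary(Y, rho_sec, rho_prim):
    """Merge k standardized secondary variables (columns of Y) into one
    super secondary variable.  rho_sec is the k x k correlation matrix
    among the secondary variables; rho_prim holds their correlations
    with the primary variable.  Weights solve the normal equations."""
    w = np.linalg.solve(rho_sec, rho_prim)   # merging weights
    var_z = float(rho_prim @ w)              # variance of the raw merge
    z = (Y @ w) / np.sqrt(var_z)             # standardized super secondary
    rho_super = np.sqrt(var_z)               # its correlation with the primary
    return z, rho_super
```

Collocated cokriging then proceeds unchanged, with `z` and `rho_super` in place of the multiple secondary variables.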
123.
An important aim of modern geostatistical modeling is to quantify uncertainty in geological systems. Geostatistical modeling requires many input parameters. The input univariate distribution or histogram is perhaps the most important. A new method for assessing uncertainty in the histogram, particularly uncertainty in the mean, is presented. This method, referred to as the conditional finite-domain (CFD) approach, accounts for the size of the domain and the local conditioning data. It is a stochastic approach based on a multivariate Gaussian distribution. The CFD approach is shown to be convergent, design independent, and parameterization invariant. The performance of the CFD approach is illustrated in a case study focusing on the impact of the number of data and the range of correlation on the limiting uncertainty in the parameters. The spatial bootstrap method and CFD approach are compared. As the number of data increases, uncertainty in the sample mean decreases in both the spatial bootstrap and the CFD. Contrary to spatial bootstrap, uncertainty in the sample mean in the CFD approach decreases as the range of correlation increases. This is a direct result of the conditioning data being more correlated to unsampled locations in the finite domain. The sensitivity of the limiting uncertainty to the variogram and the variable limits is also discussed.
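The contrast with the spatial bootstrap can be illustrated directly: resample correlated Gaussian values at the data locations and record the spread of realization means, which grows with the correlation range. A sketch only; the exponential correlogram, the unconditional LU resampling, and the parameter names are illustrative assumptions, not the CFD algorithm itself.

```python
import numpy as np

def spatial_bootstrap_se(coords, corr_range, n_real, seed=0):
    """Standard error of the sample mean by spatial bootstrap:
    LU (Cholesky) simulation of correlated standard Gaussian values
    at the data locations, then the spread of realization means."""
    rng = np.random.default_rng(seed)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    C = np.exp(-3.0 * d / corr_range)                  # exponential correlogram
    L = np.linalg.cholesky(C + 1e-9 * np.eye(len(coords)))
    reals = L @ rng.standard_normal((len(coords), n_real))
    return reals.mean(axis=0).std()
```

Under this sketch the uncertainty in the mean increases with the range; the abstract's point is that the CFD approach behaves oppositely once conditioning within the finite domain is accounted for.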
124.
125.
Interpretation of geophysical data or other indirect measurements provides large-scale soft secondary data for modeling hard primary data variables. Calibration allows such soft data to be expressed as prior probability distributions of nonlinear block averages of the primary variable; poorer quality soft data leads to prior distributions with large variance, while better quality soft data leads to prior distributions with low variance. Another important feature of most soft data is that the quality is spatially variable; soft data may be very good in some areas while poorer in other areas. The main aim of this paper is to propose a new method of integrating such soft data, which is large-scale and has locally variable precision. The technique of simulated annealing is used to construct stochastic realizations that reflect the uncertainty in the soft data. This is done by constraining the cumulative probability values of the block average values to follow a specified distribution. These probability values are determined by the local soft prior distribution and a nonlinear average of the small-scale simulated values within the block, which are all known. For each realization to accurately capture the information contained in the soft data distributions, we show that the probability values should be uniformly distributed between 0 and 1. An objective function is then proposed for a simulated annealing based approach to enforce this uniform probability constraint. The theoretical justification of this approach is discussed, implementation details are considered, and an example is presented.
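The uniform-probability constraint can be expressed as a simple annealing objective: bin the block probability values and penalize departures from equal frequencies. A minimal sketch; the binning and squared-error form are my assumptions about one possible objective, not the paper's exact formulation.

```python
import numpy as np

def uniformity_objective(p_values, n_bins=10):
    """Annealing objective: penalize departure of the block cumulative
    probability values from Uniform(0, 1) via binned frequencies.
    Zero when every bin holds exactly its expected count."""
    hist, _ = np.histogram(p_values, bins=n_bins, range=(0.0, 1.0))
    target = len(p_values) / n_bins          # expected count per bin
    return float(((hist - target) ** 2).sum())
```

A simulated annealing loop would perturb small-scale values, recompute the block probability values, and accept or reject perturbations according to the change in this objective.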
126.
Truncated plurigaussian (TPG) simulation is a flexible method for simulating rock types in deposits with complicated ordering structures. The truncation of a multivariate Gaussian distribution controls the proportions and ordering of rock types in the simulation while the variogram for each Gaussian variable controls rock type continuity. The determination of a truncation procedure for complicated geological environments is not trivial. A method for determining the truncation and fitting variograms applicable to any number of rock types and any multivariate Gaussian distribution is developed here to address this problem. Multidimensional scaling is applied to place dissimilar categories far apart and similar categories close together. The multivariate space is then mapped using a Voronoi decomposition and rotated to optimize variogram reproduction. A case study simulating geologic layers at a large mineral deposit demonstrates the potential of this method and compares the results with sequential indicator simulation (SIS). Input proportion and transition probability reproduction with TPG is demonstrated to be better than SIS. Variogram reproduction is comparable for both techniques.
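The Voronoi decomposition of the Gaussian space can be sketched as a nearest-anchor rule: each pair of simulated Gaussian values takes the rock type of the closest anchor point, so anchor placement (e.g., from multidimensional scaling) controls which rock types can touch. A sketch; the anchor coordinates below are purely illustrative.

```python
import numpy as np

def truncate_plurigaussian(y, anchors):
    """Voronoi truncation rule: assign each row of Gaussian values y
    (n x 2) the index of its nearest anchor point in Gaussian space."""
    d = np.linalg.norm(y[:, None, :] - anchors[None, :, :], axis=-1)
    return d.argmin(axis=1)
```

Rock types whose anchors share a Voronoi boundary can be in contact in the simulated model; moving anchors apart suppresses that contact.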
127.
A novel grid-free geostatistical simulation method (GFS) allows representing coregionalized variables as an analytical function of the coordinates of the simulation locations. Simulation on unstructured grids, regridding and refinement of available realizations of natural phenomena including, but not limited to, environmental systems are possible with GFS in a consistent manner. The unconditional realizations are generated by utilizing the linear model of coregionalization and Fourier series-based decomposition of the covariance function. The conditioning to data is performed by kriging. The data can be measured at scattered point-scale locations or sampled at a block scale. Secondary data are usually used in conjunction with primary data for improved modeling. Satellite imaging is an example of exhaustively sampled secondary data. Improvements and recommendations are made to the implementation of GFS to properly assimilate secondary exhaustive data sets in a grid-free manner. Intrinsic cokriging (ICK) is utilized to reduce computational time and preserve the overall quality of the simulation. To further reduce the computational cost of ICK, a block matrix inversion is implemented in the calculation of the kriging weights. A projection approach to ICK is proposed to avoid artifacts in the realizations around the edges of the exhaustive data region when the data do not cover the entire modeling domain. The point-scale block value representation of the block-scale data is developed as an alternative to block cokriging to integrate block-scale data into realizations within the GFS framework. Several case studies support the proposed enhancements.
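The grid-free unconditional step can be sketched with the classical spectral method: a sum of random cosines whose frequencies are drawn from the spectral density of the covariance, evaluable at arbitrary coordinates. A sketch assuming a Gaussian covariance with practical range `corr_range`; GFS itself uses a Fourier series-based decomposition and kriging-based conditioning beyond this illustration.

```python
import numpy as np

def grid_free_gaussian(coords, corr_range, n_terms=1000, seed=0):
    """Unconditional grid-free Gaussian simulation by the spectral
    method.  For the Gaussian covariance exp(-3 h^2 / a^2), the
    spectral density is Gaussian with standard deviation sqrt(6)/a
    per dimension; the field is a normalized sum of random cosines
    and can be evaluated at any coordinates, no grid required."""
    rng = np.random.default_rng(seed)
    dim = coords.shape[1]
    omega = rng.normal(0.0, np.sqrt(6.0) / corr_range, size=(n_terms, dim))
    phi = rng.uniform(0.0, 2.0 * np.pi, size=n_terms)
    return np.sqrt(2.0 / n_terms) * np.cos(coords @ omega.T + phi).sum(axis=1)
```

Because the realization is an analytical function of the coordinates, regridding or refinement amounts to re-evaluating the same function at new locations with the same seed.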
128.
The interaction of relativistic electrons, produced by ultrafast lasers focused on strongly precompressed thermonuclear fuel, is analytically modelled. Energy loss to target electrons is treated through binary collisions and Langmuir wave excitation. The overall penetration depth is determined by quasielastic and multiple scattering on target ions. It thus appears possible to ignite efficient hot spots in a target with density larger than 300 g/cc.
129.
The majority of geostatistical estimation and simulation algorithms rely on a covariance model as the sole characteristic of the spatial distribution of the attribute under study. The limitation to a single covariance implicitly calls for a multivariate Gaussian model for either the attribute itself or for its normal scores transform. The Gaussian model could be justified on the basis that it is both analytically simple and a maximum entropy model, i.e., a model that minimizes unwarranted structural properties. As a consequence, the Gaussian model also maximizes spatial disorder (beyond the imposed covariance), which can cause flow simulation results performed on multiple stochastic images to be very similar; thus, the space of response uncertainty could be too narrow, entailing a misleading sense of safety. The ability of the sole covariance to adequately describe spatial distributions for flow studies, and the assumption that maximum spatial disorder amounts to either no additional information or a safe prior hypothesis, are questioned. This paper attempts to clarify the link between entropy and spatial disorder and to provide, through a detailed case study, an appreciation for the impact of entropy of prior random function models on the resulting response distributions.
130.
Fitting semivariograms with analytical models can be tedious and restrictive. There are many smooth functions that could be used for the semivariogram; however, arbitrary interpolation of the semivariogram will almost certainly create an invalid function. A spectral correction, that is, taking the Fourier transform of the corresponding covariance values, resetting all negative terms to zero, standardizing the spectrum to sum to the sill, and inverse transforming, is a valuable method for constructing valid discrete semivariogram models. This paper addresses some important implementation details and provides a methodology for working with spectrally corrected semivariograms.
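The four-step correction described above can be written in a few lines. A sketch; the `np.fft` conventions and the rescaling of the spectrum so the corrected covariance at lag zero equals the sill are my reading of the recipe for a circularly symmetric discrete semivariogram.

```python
import numpy as np

def spectral_correction(gamma, sill):
    """Make a discrete (circularly symmetric) semivariogram valid:
    convert to covariance, FFT, reset negative spectral terms to zero,
    rescale the spectrum so C(0) equals the sill, inverse transform."""
    cov = sill - np.asarray(gamma, dtype=float)
    spec = np.fft.fft(cov).real            # real for a symmetric sequence
    spec = np.clip(spec, 0.0, None)        # reset negative terms to zero
    spec *= sill / spec.mean()             # standardize: corrected C(0) = sill
    cov_valid = np.fft.ifft(spec).real
    return sill - cov_valid                # back to semivariogram form
```

After correction the lag-zero semivariogram value is exactly zero and the spectrum is nonnegative, i.e., the discrete model is positive definite.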

Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号