11.
Accuracy analysis of slope computation models accounting for DEM error autocorrelation   Cited by: 11 (self-citations: 1, citations by others: 10)
The error of DEM-based slope computation stems from three sources: DEM error, DEM structure, and the slope computation model. Accounting for the spatial autocorrelation of DEM error, four DEM slope computation models are analyzed and evaluated. The study shows that the third-order unweighted finite difference gives the highest slope accuracy; the more grid points in the local window, the more accurate the computed slope; equally weighted models are more accurate than unequally weighted ones; and the structural form of the DEM error autocorrelation has no effect on slope computation. Further theoretical and experimental analysis shows that the presence of DEM error autocorrelation improves not only the accuracy of terrain analysis but also the accuracy of the DEM itself.
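The third-order unweighted finite difference favored by this study can be sketched as a 3×3 moving-window computation; the function name and grid conventions below are illustrative, not taken from the paper:

```python
import numpy as np

def slope_unweighted_fd(dem, cell):
    """Slope (degrees) by the third-order unweighted finite difference:
    each partial derivative is the unweighted average of three parallel
    central differences in a 3x3 window; evaluated on interior cells."""
    z = np.asarray(dem, dtype=float)
    dzdx = ((z[:-2, 2:] + z[1:-1, 2:] + z[2:, 2:])
            - (z[:-2, :-2] + z[1:-1, :-2] + z[2:, :-2])) / (6.0 * cell)
    dzdy = ((z[2:, :-2] + z[2:, 1:-1] + z[2:, 2:])
            - (z[:-2, :-2] + z[:-2, 1:-1] + z[:-2, 2:])) / (6.0 * cell)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))
```

On a uniform incline rising one elevation unit per cell, every interior cell evaluates to exactly 45°.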
12.
In this paper, the Markov Chain Monte Carlo (MCMC) approach is used for sampling of the permeability field conditioned on the dynamic data. The novelty of the approach consists of using an approximation of the dynamic data based on streamline computations. Simulations using the streamline approach allow us to obtain analytical approximations in a small neighborhood of the previously computed dynamic data. Using this approximation, we employ a two-stage MCMC approach. In the first stage, the approximation of the dynamic data is used to modify the instrumental proposal distribution. The obtained chain correctly samples from the posterior distribution; the modified Markov chain converges to a steady state corresponding to the posterior distribution. Moreover, this approximation increases the acceptance rate and reduces the computational time required for MCMC sampling. Numerical results are presented.
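The two-stage idea above (screen each proposal with a cheap approximation before paying for the exact likelihood, then correct the acceptance ratio so the chain still targets the true posterior) can be sketched generically; the surrogate here is an arbitrary stand-in, not the paper's streamline-based approximation:

```python
import numpy as np

rng = np.random.default_rng(0)

def two_stage_mh(x0, log_post, log_post_cheap, n_iter=20000, step=0.5):
    """Two-stage (delayed-acceptance) Metropolis-Hastings: a cheap
    surrogate screens each proposal; only survivors pay for the exact
    log-posterior, and the second-stage ratio divides out the surrogate
    so the chain still targets log_post."""
    x, lp, lpc = x0, log_post(x0), log_post_cheap(x0)
    samples = np.empty(n_iter)
    for i in range(n_iter):
        y = x + step * rng.standard_normal()
        lpc_y = log_post_cheap(y)
        # Stage 1: accept/reject using the cheap approximation only
        if np.log(rng.uniform()) < lpc_y - lpc:
            lp_y = log_post(y)
            # Stage 2: correct with the exact posterior
            if np.log(rng.uniform()) < (lp_y - lp) - (lpc_y - lpc):
                x, lp, lpc = y, lp_y, lpc_y
        samples[i] = x
    return samples
```

A stage-1 rejection counts as a rejection of the whole step, which is what keeps the chain's stationary distribution exact while skipping most expensive evaluations.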
13.
14.
Wang  Yutian  Tan  Bingqi  Wang  Yifeng  Wu  Jiangtao 《Natural Resources Research》1994,3(4):284-294
We propose an information-structure-analysis (ISA) method to quantify the correlations between quantitative and qualitative variables as well as within each type of variable. This method is applied to the evaluation of mineral resources in the western Zhejiang Province of China. The district contains a number of silver-bearing Fe–Cu–Pb–Zn mineral deposits near igneous complexes and Fe–Cu–Pb–Zn zones away from the complexes. Silver anomalies occur not only in the known Fe–Cu–Pb–Zn deposits, but also in the country rock, suggesting the possible existence of silver deposits far from the igneous complexes. The tonnage distribution of silver is modeled by Monte Carlo simulation. This simulation is conducted on the basis of the correlation between silver (Ag) and lead (Pb), since no tonnage data on silver are available. The known tonnage distribution of lead in 11 control cells was used to approximate the tonnage distribution of silver in the Monte Carlo simulation. With the ISA and Monte Carlo methods, the total amount of potential polymetallic resources in 49 cells in western Zhejiang Province is predicted. Significantly, a deposit with about 24 tonnes of silver has been found within our exploration target area.
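The tonnage simulation described above can be illustrated with a generic Monte Carlo sketch. The 11 control cells and 49 target cells come from the abstract, but every numeric value below (control-cell Pb tonnages, regression coefficients, residual scatter) is hypothetical and only shows the mechanics of converting a known Pb distribution into a simulated Ag distribution:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical Pb tonnages (kt) in the 11 control cells, and an assumed
# log-linear Ag-Pb relation; all numbers are illustrative, not from the study.
pb_control = np.array([12.0, 35.0, 8.0, 60.0, 22.0, 15.0,
                       40.0, 9.0, 28.0, 55.0, 18.0])
a, b, resid_sd = 0.8, 1.0, 0.3          # log(Ag) ~ a + b*log(Pb) + noise

def simulate_ag_totals(n_cells=49, n_draws=10000):
    """Draw a Pb tonnage per cell from the empirical control distribution,
    convert it to Ag through the assumed regression, and total over cells."""
    pb = rng.choice(pb_control, size=(n_draws, n_cells))
    log_ag = a + b * np.log(pb) + resid_sd * rng.standard_normal((n_draws, n_cells))
    return np.exp(log_ag).sum(axis=1)

totals = simulate_ag_totals()
p10, p50, p90 = np.percentile(totals, [10, 50, 90])  # tonnage quantiles
```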
15.
Consider the problem of generating a realization y1 of a Gaussian random field on a dense grid of points Ω1, conditioned on field observations y2 collected on a sparse grid of points Ω2. An approach to this is to first generate an unconditional realization y over the grid Ω = Ω1 ∪ Ω2, and then to produce y1 by conditioning y on the data y2. As standard methods for generating y, such as the turning bands, spectral, or Cholesky approaches, can have various limitations, it has been proposed by M. W. Davis to generate realizations from a matrix polynomial approximation to the square root of the covariance matrix. In this paper we describe how to generate a direct approximation to the conditional realization y1 on Ω1 using a variant of Davis' approach based on approximation by Chebyshev polynomials. The resulting algorithm is simple to implement, numerically stable, and bounds on the approximation error are readily available. Furthermore, we show that the conditional realization y1 can be generated directly with a lower-order polynomial than the unconditional realization y, and that further reductions can be achieved by exploiting a nugget effect if one is present. A pseudocode version of the algorithm is provided that can be implemented using the fast Fourier transform if the field is stationary and the grid Ω1 is rectangular. Finally, numerical illustrations are given of the algorithm's performance in generating various 2-D realizations of conditional processes on large sampling grids.
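A minimal sketch of the matrix-polynomial idea: fit a Chebyshev series to the square root on the spectral interval of the covariance matrix, then evaluate that series at the matrix via the three-term recurrence, using only matrix-vector products. Dense matrices are used here for clarity; the paper's FFT-based stationary-grid variant and its conditional form are not shown:

```python
import numpy as np
from numpy.polynomial import chebyshev as Ch

def sqrt_cov_times_vector(C, v, degree=60, n_fit=200):
    """Approximate C**(1/2) @ v for a symmetric positive semi-definite
    covariance C: fit sqrt on [0, lam_max] by a Chebyshev series, then
    evaluate the series at C with the three-term recurrence (no
    factorization of C is needed)."""
    lam_max = np.linalg.norm(C, 2)                        # spectral bound
    s = np.cos(np.pi * (np.arange(n_fit) + 0.5) / n_fit)  # nodes in (-1, 1)
    coeffs = Ch.chebfit(s, np.sqrt(0.5 * lam_max * (s + 1.0)), degree)
    A = (2.0 / lam_max) * C - np.eye(len(C))              # spectrum -> [-1, 1]
    t_prev, t_curr = v, A @ v                             # T0(A)v, T1(A)v
    y = coeffs[0] * t_prev + coeffs[1] * t_curr
    for c in coeffs[2:]:
        t_prev, t_curr = t_curr, 2.0 * (A @ t_curr) - t_prev
        y = y + c * t_curr
    return y
```

Applying this operator to a standard normal vector yields an (approximate) unconditional realization with covariance C.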
16.
Many stochastic process models for environmental data sets assume a process of relatively simple structure which is in some sense partially observed. That is, there is an underlying process (X_n, n ≥ 0) or (X_t, t ≥ 0) for which the parameters are of interest and physically meaningful, and an observable process (Y_n, n ≥ 0) or (Y_t, t ≥ 0) which depends on the X process but not otherwise on those parameters. Examples are wide-ranging: the Y process may be the X process with missing observations; the Y process may be the X process observed with a noise component; the X process might constitute a random environment for the Y process, as with hidden Markov models; the Y process might be a lower-dimensional function or reduction of the X process. In principle, maximum likelihood estimation for the X process parameters can be carried out by some form of the EM algorithm applied to the Y process data. In the paper we review some current methods for exact and approximate maximum likelihood estimation. We illustrate some of the issues by considering how to estimate the parameters of a stochastic Nash cascade model for runoff. In the case of k reservoirs, the outputs of these reservoirs form a k-dimensional vector Markov process, of which only the kth coordinate process is observed, usually at a discrete sample of time points.
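The EM machinery for such partially observed models can be shown on the simplest possible case, a latent Gaussian mean observed through additive noise; this toy model is ours for illustration, not the paper's Nash-cascade example:

```python
import numpy as np

def em_latent_mean(y, s2=1.0, t2=0.5, n_iter=50):
    """EM for the simplest partially observed model: latent X_i ~ N(mu, s2),
    observed Y_i = X_i + eps_i with eps_i ~ N(0, t2); s2 and t2 known,
    mu unknown. The E-step smooths each Y_i toward the current mu; the
    M-step re-estimates mu from the smoothed values."""
    mu = 0.0
    w = s2 / (s2 + t2)                          # posterior weight on the data
    for _ in range(n_iter):
        x_hat = mu + w * (np.asarray(y) - mu)   # E-step: E[X_i | Y_i, mu]
        mu = x_hat.mean()                       # M-step
    return mu
```

In this toy case EM converges geometrically to the closed-form MLE, the sample mean of y, which makes it a convenient correctness check for the E-step/M-step plumbing.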
17.
Two different goals in fitting straight lines to data are to estimate a true linear relation (physical law) and to predict values of the dependent variable with the smallest possible error. Regarding the first goal, a Monte Carlo study indicated that the structural-analysis (SA) method of fitting straight lines to data is superior to the ordinary least-squares (OLS) method for estimating true straight-line relations. Number of data points, slope and intercept of the true relation, and variances of the errors associated with the independent (X) and dependent (Y) variables influence the degree of agreement. For example, differences between the two line-fitting methods decrease as error in X becomes small relative to error in Y. Regarding the second goal—predicting the dependent variable—OLS is better than SA. Again, the difference diminishes as X takes on less error relative to Y. With respect to estimation of slope and intercept and prediction of Y, agreement between Monte Carlo results and large-sample theory was very good for sample sizes of 100, and fair to good for sample sizes of 20. The procedures and error measures are illustrated with two geologic examples.
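One classical structural-analysis estimator is the errors-in-variables (Deming) line fit, which requires the ratio of the Y-error and X-error variances; a minimal sketch, assuming that ratio is known:

```python
import numpy as np

def deming_fit(x, y, lam=1.0):
    """Deming (errors-in-variables) straight-line fit, a classical
    structural-analysis estimator; lam = var(Y-error) / var(X-error).
    Returns (slope, intercept)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxx, syy = np.var(x), np.var(y)
    sxy = np.cov(x, y, bias=True)[0, 1]
    slope = ((syy - lam * sxx)
             + np.sqrt((syy - lam * sxx) ** 2 + 4.0 * lam * sxy ** 2)) / (2.0 * sxy)
    return slope, y.mean() - slope * x.mean()
```

With equal error variances (lam=1) this recovers the true slope, whereas the OLS slope is attenuated toward zero by the error in X, which is exactly the contrast the abstract's Monte Carlo study examines.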
18.
B. Ilbery  D. Maye 《Geoforum》2006,37(3):352-367
Local food is championed as one alternative response to industrial systems of food production and supply. While advocacy for local food is high, there is a lack of empirical evidence about the actual shape and scale of such food supply chains, especially from a retail perspective. Using supply chain diagrams, this paper presents a summary of ‘new’ agro-food geographies for five different retail types—farm shops, butchers, caterers, specialist shops, supermarkets/department stores—that all source local food from suppliers in the Scottish-English borders. Presented as five separate ‘shopping trips’, the paper examines where, how and why retailers source local food. Results reveal the complex nature of local food systems, especially in terms of intra-sector competitive dynamics (with a notable tension between direct forms of retail and established (independent) retailers), links and overlaps with ‘normal’ food retail systems and elastic notions of the ‘local’. The paper also draws a key distinction between locally produced and locally supplied food products.
19.
We studied the temporal behavior of the background shallow seismicity rate in 700 circular areas across inland Japan. To search for and test the significance of possible rate changes in background seismicity, we developed an efficient computational method that applies the space–time ETAS model proposed by Ogata in 1998 to these areas. We also conducted Monte Carlo tests using a simulated catalog to validate the applied model. Our first finding was that activation anomalies were found so frequently that the constant background seismicity hypothesis may not be appropriate and/or the triggered-event model with constraints on the parameters may not adequately describe the observed seismicity. However, quiescence occasionally occurs merely by chance. Another outcome of our study was that we could automatically find several anomalous background seismicity rate changes associated with the occurrence of large earthquakes. Very significant seismic activation was found before the M6.1 Mt. Iwate earthquake of 1998. Possible seismic quiescence was also found in an area 150 km southwest of the focal region of the M7.3 Western Tottori earthquake of 2000; the seismicity rate in that area recovered after the mainshock.
20.
From July 1996 to August 1997 the TOR project operated 130 seismographs in North Germany, Denmark and South Sweden, with the aim of collecting signals from local, regional and teleseismic earthquakes. This data set is particularly interesting since the seismic antenna crosses the most significant geological boundary in Europe, the Tornquist Zone, which in the northern part is the border between the Baltic Shield and the younger European lithosphere. Previous studies have shown significant physical changes in the crust and upper mantle across this transition zone, including two independent teleseismic tomographic studies of the TOR data set. But these two studies disagree on the orientation of the slope of the transition. Both studies used an iterative linearized inversion method. In this work we present an inversion based on Bayesian statistics, in which the solution space is examined in order to study a very large number of tomographic solutions and to assess solution uniqueness and uncertainty. The method is applied to measurements of 3345 relative teleseismic P-phase travel times from 48 teleseismic earthquakes with good azimuthal coverage with respect to the great circle arc of the TOR array. We find the lithospheric transition to dip to the northeast at around 30° to 45° off vertical.