31 search results (search time: 0 ms)
11.
Reconstructed sunspot data are available that extend the record of solar activity back to 11 360 years before the present. We examined these data using Hurst analysis, a moving-average filter, and Fourier analysis. All three procedures indicate the presence of a long-term (≈6 000 year) cycle not previously reported. A number of shorter cycles previously identified in the literature using Fourier analysis, Bayesian methods, and maximum-entropy methods were also detected in the reconstructed sunspot data.
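The Fourier step described above can be sketched on a synthetic series; the 6 000-year and 210-year components below are illustrative stand-ins, not the reconstructed sunspot record itself.

```python
import numpy as np

# Synthetic "annual" series spanning 11 360 years: a long ~6 000-year
# cycle plus a shorter ~210-year (de Vries-like) cycle, both invented.
n_years = 11360
t = np.arange(n_years)
series = (np.sin(2 * np.pi * t / 6000.0)
          + 0.3 * np.sin(2 * np.pi * t / 210.0))

# Fourier analysis: locate the dominant spectral peak (skip the DC bin).
spectrum = np.abs(np.fft.rfft(series))
freqs = np.fft.rfftfreq(n_years, d=1.0)   # cycles per year
peak = 1 + np.argmax(spectrum[1:])
dominant_period = 1.0 / freqs[peak]       # years per cycle
```

With a record only about 1.9 periods long, the 6 000-year peak falls between frequency bins, so the recovered period is approximate; this is why complementary tools such as Hurst analysis and maximum-entropy estimation are useful for very long cycles.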
12.
Transfer functions represent the frequency-domain ratio of one ground motion to another. They are a convenient way to quantify the variation of earthquake ground motions from the free field to the foundation level of a structure for studies of kinematic soil–structure interaction. Aside from ordinary filtering and baseline correction, substantial signal processing goes into the computation of transfer functions, including windowing (to extract the S-wave portion of the record) and smoothing (to reduce scatter that can obscure physically significant trends). Applying several signal processing techniques to a sample data set, we find that detailed features of the transfer-function ordinates (i.e., frequency-to-frequency variations) can be affected by the degree of smoothing and by the window length (e.g., whole record versus S-window), whereas the overall shape and magnitude of the transfer functions are relatively consistent. More important than the signal processing details is the frequency bandwidth over which the results are considered valid, because significant portions of the spectrum can be dominated by stochastic processes with little physical meaning. We argue that transfer functions should be interpreted over those portions of the spectrum having minimal noise impact, as indicated by high coherence.
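A minimal sketch of a Welch-smoothed transfer-function and coherence estimate, using SciPy's standard spectral routines; the one-pole "soil" filter and white "free-field" input below are invented stand-ins for a real record pair.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs = 100.0                        # sampling rate, Hz (illustrative)
x = rng.standard_normal(16384)    # "free-field" motion (white, for illustration)
# "Foundation" motion: x passed through a one-pole low-pass with DC gain 1.
y = signal.lfilter([0.1], [1.0, -0.9], x)

nper = 256                                      # segment length controls smoothing
f, pxx = signal.welch(x, fs=fs, nperseg=nper)   # input auto-spectrum
_, pxy = signal.csd(x, y, fs=fs, nperseg=nper)  # cross-spectrum
H = np.abs(pxy / pxx)                           # |transfer function| (H1 estimator)
_, coh = signal.coherence(x, y, fs=fs, nperseg=nper)
```

Shorter segments (`nperseg`) smooth more aggressively at the cost of frequency resolution; ordinates where `coh` drops should be treated as noise-dominated and excluded from interpretation, as the abstract argues.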
13.
This paper revisits the computation of product combinations for the quantification of resources (e.g., pore volume in hydrocarbon reservoir formations). The explanation is simplified by considering an exhaustive numerical model in which properties are first multiplied and then added or averaged for evaluation. The analysis starts without any probabilistic considerations. Abbreviated (up-scaled) computations of the sum of products are proposed, substituting averages for the individual values of each rock property in the coarser-resolution cell model. One result is that averaged properties can be used for estimation at unsampled locations in place of individual values; however, covariances and cumulants must then also be included in the abbreviated computations. The smoothing effect of kriging is found to be irrelevant provided the kriging variance is also included in the up-scaled, abbreviated pore-volume computation. Thus, the equivalence between resource volumes computed from kriging estimates and from conditional stochastic simulations is established, on the condition that the numerical estimation incorporates the complete covariance and cumulant information as well. An example shows that pore-volume predictions from a kriging model match the unbiased results from stochastic simulations.
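The covariance term the abstract insists on follows from one identity: the average of a product equals the product of the averages plus the covariance. A toy check, with made-up porosity and thickness values:

```python
import numpy as np

rng = np.random.default_rng(1)
phi = rng.uniform(0.1, 0.3, 500)                  # porosity per fine cell (illustrative)
h = 2.0 + 10.0 * phi + rng.normal(0, 0.2, 500)    # thickness, correlated with phi

# Exhaustive (fine-scale) computation: sum of cell-by-cell products.
exhaustive = np.sum(phi * h)

# Abbreviated (up-scaled) computation: the product of averages alone is
# biased; adding the population covariance restores the exact value.
n = phi.size
cov = np.cov(phi, h, ddof=0)[0, 1]
abbreviated = n * (phi.mean() * h.mean() + cov)
```

The same bookkeeping extends to triple products, where third-order cumulants enter, which is why kriging estimates augmented with their variances can reproduce simulation-based volumes.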
14.
This paper shows the potential of gravity data for mapping the bottom topography of a buried landfill. To this end, a gravity inversion method is presented that estimates the landfill's bottom depths at discrete points, assuming that the density contrast decreases with depth according to a hyperbolic law. The method's efficiency was tested on synthetic data from simulated waste landfills, producing estimated bottom topographies very close to the true ones. The method was further evaluated by applying it to gravity data from the abandoned Thomas Farm Landfill site, Indiana, USA, whose bottom topography is known. The estimated topography showed close agreement with the known bottom topography of the Thomas Farm Landfill.
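A hyperbolic decrease of density contrast with depth admits a closed-form Bouguer-slab attraction, which is convenient for checking a forward model. The density contrast, decay parameter, and thickness below are invented for illustration, not taken from the paper.

```python
import numpy as np

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
rho0 = -400.0      # surface density contrast, kg/m^3 (waste lighter than host soil)
beta = 8.0         # hyperbolic decay parameter, m (assumed)
t = 15.0           # landfill thickness, m (assumed)

# Hyperbolic law: drho(z) = rho0 * beta / (beta + z).
z = np.linspace(0.0, t, 10001)
drho = rho0 * beta / (beta + z)

# Slab attraction g = 2*pi*G * integral of drho over thickness:
# numerically (trapezoid rule) and in closed form.
g_numeric = 2 * np.pi * G * np.sum(0.5 * (drho[1:] + drho[:-1]) * np.diff(z))
g_closed = 2 * np.pi * G * rho0 * beta * np.log((beta + t) / beta)
```

The negative sign of the result corresponds to the gravity low typically observed over a landfill, which is the signal such an inversion fits.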
15.
This paper describes a fast microcomputer-based remote-sensing image processing system built from a TMS32010 digital signal processor (DSP) and an IBM-PC/XT microcomputer. On this system, exploiting the characteristics of microwave remote-sensing imagery, we developed a library of fast image-processing routines in TMS32010 macro assembly language. Tests on airborne microwave radiometer images show that processing on this system, programmed in TMS32010 assembly, runs 100–500 times faster than FORTRAN implementations on an ordinary microcomputer. In addition, a new smoothing filter algorithm is proposed; it suppresses Gaussian white noise better than a median filter while keeping image edges from being blurred.
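The abstract does not specify the proposed filter; as a sketch of the same goal, a classic sigma filter averages only those neighbors whose values are close to the centre pixel, suppressing Gaussian noise while leaving sharp edges intact. The window size and threshold below are arbitrary choices, and the 1-D signal stands in for an image row.

```python
import numpy as np

def sigma_filter(x, half_window=2, threshold=0.15):
    """Average only the neighbors within `threshold` of the centre value (1-D sketch)."""
    out = np.empty_like(x, dtype=float)
    n = x.size
    for i in range(n):
        lo, hi = max(0, i - half_window), min(n, i + half_window + 1)
        win = x[lo:hi]
        near = win[np.abs(win - x[i]) <= threshold]  # always includes x[i] itself
        out[i] = near.mean()
    return out

rng = np.random.default_rng(2)
edge = np.where(np.arange(200) < 100, 0.0, 1.0)   # ideal step edge
noisy = edge + rng.normal(0.0, 0.03, 200)         # Gaussian white noise
smoothed = sigma_filter(noisy)
```

Because pixels across the step differ by far more than the threshold, they never enter each other's averages, so the edge survives while same-side noise is averaged down.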
16.

Tomography is an important technique for inverting the three-dimensional structure of the ionosphere, and adding constraints can alleviate the ill-posedness of the tomographic model. In iterative tomography, however, the many voxels crossed by no ray tend to respond to the imposed constraints in ways that contradict their intent (here called negative constraints). To address this problem, an iterative tomography method via parameter smoothing (ITPS) is proposed. In each iteration, every vertical profile is first fitted and corrected with a Chapman function by least squares, yielding two-dimensional images of the Chapman-function parameters; each parameter image is then smoothed with a moving window, and the electron density of each voxel is corrected accordingly. Experiments show that ITPS reduces negative constraints to a certain extent and suppresses disturbances in the tomographic images. Compared with the MART and CMART algorithms, ITPS performs better in the vertical profiles, the F2-layer critical frequency (foF2), the F2-layer peak height (hmF2), the slant total electron content (STEC), and the electron density above hmF2. Relative to the more accurate CMART algorithm, ITPS achieves average improvement rates of 7.49% for foF2 and 6.60% for hmF2, 5.19% for STEC, and 11.41% for electron density above hmF2.
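The per-profile fitting step of ITPS can be sketched with an α-Chapman layer fitted by least squares; the peak density (in arbitrary units), peak height, and scale height below are illustrative values, not results from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def chapman(h, nmf2, hmf2, H):
    """Alpha-Chapman electron-density profile."""
    z = (h - hmf2) / H
    return nmf2 * np.exp(0.5 * (1.0 - z - np.exp(-z)))

# A synthetic vertical profile: peak density 1.0 (arbitrary units)
# at hmF2 = 300 km with a 50 km scale height.
heights = np.linspace(100.0, 800.0, 71)
profile = chapman(heights, 1.0, 300.0, 50.0)

# Least-squares fit, as done for every vertical profile in each iteration;
# the fitted parameters are what ITPS smooths across neighbouring profiles.
popt, _ = curve_fit(chapman, heights, profile, p0=[0.8, 280.0, 40.0])
nmf2_fit, hmf2_fit, H_fit = popt
```

Smoothing the parameter images (rather than the densities themselves) is what lets the method reach into voxels no ray crosses without imposing contradictory density constraints on them.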
17.
It is possible to reconstruct the past variation of an environmental variable from measured historical indicators when the modern values of the variable and the indicators are known. In a Bayesian statistical approach, the selection of a prior probability distribution for the past values of the environmental variable can then be crucial and the selection therefore should be made carefully. This is particularly the case when the data are noisy and the statistical model used is complex since the influence of the prior on the results can then be especially strong. It can be difficult to elicit the prior probability distribution from the available information, since usually there are no measured data on the past values of the variable one wants to reconstruct and different reconstructions are typically consistent with each other only at a coarse level. To overcome these difficulties we propose to use a non-informative smoothing prior, possibly in combination with an informative prior, that simply penalizes for roughness of the reconstruction as measured by the variability of its values. We believe that it can sometimes be easier to set an overall prior distribution on the roughness than to agree on a prior for the actual values of the reconstructed variable. Note that by using a smoothing prior one incorporates into the model itself the smoothing step usually done before or after the actual numerical reconstruction. Another idea proposed in this paper is to integrate the reconstruction model with a multiscale feature analysis technique known as SiZer. Multiscale analysis of the posterior distribution of the reconstructed variable makes it possible to infer its statistically significant features such as trends, maxima and minima at several different time scales. While only temperature is considered in this paper, the technique can be applied to other environmental variables.
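A minimal version of such a roughness-penalizing prior is Gaussian in the second differences of the reconstruction; the posterior mode then solves a ridge-type linear system. The "temperature" series, noise level, and penalty weight below are invented:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 120
truth = np.sin(np.linspace(0.0, 3.0 * np.pi, n))   # hypothetical past variable
y = truth + rng.normal(0.0, 0.3, n)                # noisy indicator-based values

# Second-difference operator D: D @ x measures the roughness of x.
D = np.diff(np.eye(n), 2, axis=0)

# Smoothing prior p(x) ~ exp(-lam * |D x|^2 / 2) with y = x + noise gives the
# posterior mode as the minimizer of |y - x|^2 + lam * |D x|^2, i.e. the
# solution of (I + lam * D'D) x = y.
lam = 50.0
x_map = np.linalg.solve(np.eye(n) + lam * D.T @ D, y)

def roughness(v):
    return np.sum(np.diff(v, 2) ** 2)
```

This makes explicit the point in the abstract that the smoothing step is absorbed into the model itself: the prior on roughness, not a post-hoc filter, produces the smooth reconstruction.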
18.
Exploratory data analysis (EDA) is a toolbox of data manipulation methods for looking at data to see what they seem to say, i.e. one tries to let the data speak for themselves. In this way there is hope that the data will lead to indications about 'models' of relationships not expected a priori. In this respect EDA is a pre-step to confirmatory data analysis, which delivers measures of how adequate a model is. In this tutorial the focus is on multivariate exploratory data analysis for quantitative data using linear methods for dimension reduction and prediction. Purely graphical multivariate tools such as 3D rotation and scatterplot matrices are discussed after having introduced the univariate and bivariate tools on which they are based. The main tasks of multivariate exploratory data analysis are identified as 'search for structure' by dimension reduction and 'model selection' by comparing predictive power. Resampling is used to support validity, and variable selection to improve interpretability.
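The linear workhorse of such dimension reduction is principal components, which can be sketched with a plain SVD; the three correlated variables below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(4)
# 200 samples of 3 variables; the third is nearly a combination of the first
# two, so most of the variance should live in a 2-D subspace.
a = rng.standard_normal(200)
b = rng.standard_normal(200)
X = np.column_stack([a, b, 0.7 * a - 0.4 * b + rng.normal(0, 0.05, 200)])

Xc = X - X.mean(axis=0)                  # centre each variable
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)          # variance share per component
scores = Xc @ Vt.T                       # projections, e.g. for scatterplot matrices
```

Plotting the first two columns of `scores` is the 'search for structure' step: most of the spread of the data appears in a plane, flagging a near-linear relationship not assumed a priori.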
19.
Idealized numerical experiments on the influence of the effective terrain scale in the GRAPES mesoscale model
For the regional GRAPES model, a series of two- and three-dimensional idealized numerical simulation experiments was designed and carried out under airflow conditions of differing properties. Comparative analysis of simulations with terrain of different scales shows that the choice of model terrain scale has a very important influence on the model's forecast skill. It is further pointed out that a terrain scale of six grid lengths can be regarded as the finest terrain scale the regional GRAPES model can effectively resolve; terrain at scales below six grid lengths is harmful to the model's forecast skill and should be filtered out, …
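The recommendation to filter terrain below six grid lengths can be sketched as a one-dimensional spectral cut-off; the grid spacing and the two terrain waves below are illustrative, not GRAPES settings.

```python
import numpy as np

dx = 10.0e3                  # 10 km grid spacing (assumed)
n = 256
x = np.arange(n) * dx
# Terrain with a well-resolved 32*dx ridge plus poorly resolved 4*dx ripples.
terrain = (800.0 * np.sin(2 * np.pi * x / (32 * dx))
           + 120.0 * np.sin(2 * np.pi * x / (4 * dx)))

# Zero out all spectral components with wavelength shorter than 6 grid lengths.
spec = np.fft.rfft(terrain)
freqs = np.fft.rfftfreq(n, d=dx)
wavelength = np.full(spec.size, np.inf)
wavelength[1:] = 1.0 / freqs[1:]
spec[wavelength < 6 * dx] = 0.0
filtered = np.fft.irfft(spec, n)
```

The surviving field contains only terrain the grid can represent with several points per wavelength, which is the sense in which six grid lengths acts as the model's effective terrain resolution.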
20.
Smoothing is essential to many oceanographic, meteorological, and hydrological applications. There are two predominant classes of smoothing problems. The first is fixed-interval smoothing, where the objective is to estimate model states within a time interval using all available observations in the interval. The second is fixed-lag smoothing, where the objective is to sequentially estimate model states over a fixed or indefinitely growing interval by restricting the influence of observations within a fixed window of time ahead of the evolving estimation time. In this paper, we use an ensemble-based approach to fixed-interval and fixed-lag smoothing, and synthesize two algorithms. The first algorithm is a fixed-interval smoother whose computation time is linear in the interval. The second algorithm is a fixed-lag smoother whose computation time is independent of the lag length. The complexity of these algorithms is presented, shown to improve upon existing implementations and verified with identical-twin experiments conducted with the Lorenz-95 system. Results suggest that ensemble methods yield efficient fixed-interval and fixed-lag smoothing solutions in the sense that the additional increment for smoothing is a small fraction of either filtering or model propagation costs in a practical ensemble application. We also show that fixed-interval smoothing can perform as fast as fixed-lag smoothing, and it may not be necessary to use a fixed-lag approximation for computational savings alone.
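The ensemble mechanics behind both smoother classes reduce to one regression: past states are updated using the cross-covariance between the lagged ensemble and the ensemble at the observation time. A scalar random-walk sketch (model, noise levels, and ensemble size all invented, far simpler than the paper's Lorenz-95 experiments):

```python
import numpy as np

rng = np.random.default_rng(5)
n_ens, n_steps = 400, 20
q, r = 0.1, 0.2          # model-noise and observation-noise standard deviations

# Propagate an ensemble of scalar random walks, storing every step.
X = np.zeros((n_steps + 1, n_ens))
for k in range(n_steps):
    X[k + 1] = X[k] + rng.normal(0.0, q, n_ens)

# One observation y of the final state; update the mean of ALL stored states
# with the lagged cross-covariance (an EnKS-style fixed-interval mean update).
y = 0.8
obs_ens = X[-1]
var_obs = obs_ens.var()
means = X.mean(axis=1)
smoothed = means.copy()
for k in range(n_steps + 1):
    cov_k = np.mean((X[k] - means[k]) * (obs_ens - means[-1]))
    gain = cov_k / (var_obs + r**2)      # Kalman gain from ensemble statistics
    smoothed[k] = means[k] + gain * (y - means[-1])
```

A fixed-lag variant simply restricts the loop to a window of recent `k` behind each new observation; the per-observation cost is the same regression either way, which is why the smoothing increment stays a small fraction of the filtering cost.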