11.
Reconstructed sunspot data are available that extend the record of solar activity back to 11,360 years before the present. We examined these data using Hurst analysis, a moving-average filter, and Fourier analysis. All three procedures indicate the presence of a long-term (≈6,000-year) cycle not previously reported. A number of shorter cycles previously identified in the literature by Fourier analysis, Bayesian methods, and maximum-entropy methods were also detected in the reconstructed sunspot data.
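The cycle-detection pipeline described above can be sketched in a few lines. This is a minimal illustration on a synthetic series, not the actual reconstructed sunspot record; the 6,000-year period, 10-year sampling, and noise level are assumptions chosen for the example.

```python
import numpy as np

def moving_average(x, w):
    # boxcar (moving-average) filter of window length w
    return np.convolve(x, np.ones(w) / w, mode="valid")

def dominant_period(x, dt=1.0):
    # Fourier analysis: period of the strongest non-DC spectral peak
    x = x - np.mean(x)
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=dt)
    k = 1 + np.argmax(spec[1:])          # skip the zero-frequency bin
    return 1.0 / freqs[k]

def hurst_rs(x):
    # crude rescaled-range (R/S) estimate of the Hurst exponent
    ns, rs = [], []
    for n in [2 ** k for k in range(4, int(np.log2(len(x))))]:
        chunks = x[: (len(x) // n) * n].reshape(-1, n)
        dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
        r = dev.max(axis=1) - dev.min(axis=1)
        s = chunks.std(axis=1)
        ns.append(n)
        rs.append(np.mean(r[s > 0] / s[s > 0]))
    slope, _ = np.polyfit(np.log(ns), np.log(rs), 1)
    return slope

# synthetic stand-in for the reconstructed record: an assumed 6,000-yr
# cycle plus noise, sampled every 10 years over 11,360 years
rng = np.random.default_rng(0)
t = np.arange(0.0, 11360.0, 10.0)
series = np.sin(2 * np.pi * t / 6000.0) + 0.3 * rng.standard_normal(t.size)
period = dominant_period(moving_average(series, 5), dt=10.0)
hurst = hurst_rs(series)
```

With roughly 1.9 cycles in the record, spectral leakage places the peak in the nearest frequency bin, so the recovered period is close to (though not exactly) 6,000 years.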
12.
Transfer functions represent the ratio in the frequency domain of one ground motion to another. Transfer functions are a convenient way by which the variation of earthquake ground motions from the free-field to the foundation level of a structure can be quantified for studies of kinematic soil–structure interaction. Aside from ordinary filtering and baseline correction, substantial signal processing occurs in the computation of transfer functions, including windowing (to extract the S-wave portion of the record) and smoothing (to reduce scatter that can obscure physically significant trends). Utilizing several signal processing techniques on a sample data set, we find that detailed features of the transfer function ordinates (i.e., frequency-to-frequency variations) can be affected by the degree of smoothing and by the window length (e.g., whole record versus S-window). However, the overall shape and magnitude of the transfer functions are relatively consistent. More important than signal processing details is the frequency bandwidth over which the results are considered valid, because significant portions of the spectrum can be dominated by stochastic processes with little physical meaning. We argue that transfer functions should be interpreted over those portions of the spectrum having minimal noise impact, as indicated by high coherence.
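A transfer-function estimate restricted to high-coherence bands, as the abstract recommends, can be sketched as follows. This is a generic spectral-ratio example with synthetic signals, not the paper's data set; Welch segment averaging plays the role of smoothing, and the 0.9 coherence threshold is an assumed cutoff.

```python
import numpy as np
from scipy import signal

# two synthetic "ground motions": y is x scaled by 0.5 plus incoherent
# noise, so the true transfer-function amplitude |H(f)| is 0.5 wherever
# the coherence is high
fs = 100.0
rng = np.random.default_rng(1)
x = rng.standard_normal(2 ** 14)
y = 0.5 * x + 0.05 * rng.standard_normal(x.size)

nperseg = 256                                  # smoothing via Welch averaging
f, Pxx = signal.welch(x, fs=fs, nperseg=nperseg)
_, Pxy = signal.csd(x, y, fs=fs, nperseg=nperseg)
_, Cxy = signal.coherence(x, y, fs=fs, nperseg=nperseg)

H = np.abs(Pxy) / Pxx                          # transfer-function amplitude
valid = Cxy > 0.9                              # interpret only high-coherence bins
H_valid = H[valid]
```

Bins failing the coherence test are excluded from interpretation, which is precisely the bandwidth-limiting step the abstract argues for.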
13.
J.A. Vargas-Guzmán, Natural Resources Research, 2008, 17(4): 245–254
This paper revisits the computation of product combinations for the quantification of resources (e.g., pore volume for hydrocarbon reservoir formations). The explanations are simplified by considering an exhaustive numerical model of properties that are first multiplied and then added or averaged for evaluation. The analysis starts without any probabilistic considerations. Abbreviated (up-scaled) computations of the sum of multiplications are proposed by substituting averages for the individual values of each rock property in the coarser-resolution cell model. A key result is that averaged properties can be used for estimation at unsampled locations instead of individual values; however, covariances and cumulants must also be included in the abbreviated computations. The smoothing effect of kriging is found to be irrelevant if the kriging variance is also included in the up-scaled abbreviated pore-volume computations. Thus, the equivalence between the computation of resource volumes from kriging estimates and from conditional stochastic simulations is established, on the condition that the numerical estimation incorporates the complete covariance and cumulant information as well. An example shows that pore-volume prediction from a kriging model matches the unbiased result from stochastic simulations.
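The core point (that averages alone are biased and the covariance term must enter the abbreviated computation) follows from the exact identity Σᵢ aᵢbᵢ = n(ā·b̄ + cov(a, b)) with the population covariance. A minimal numeric check, using hypothetical porosity and thickness values rather than any data from the paper:

```python
import numpy as np

# fine-scale model: 500 cells with correlated porosity and net thickness
rng = np.random.default_rng(2)
porosity = rng.uniform(0.05, 0.30, 500)
thickness = 1.0 + 10.0 * porosity + 0.2 * rng.standard_normal(500)

exact = np.sum(porosity * thickness)            # exhaustive sum of products

n = porosity.size
naive = n * porosity.mean() * thickness.mean()            # averages only: biased
cov = np.cov(porosity, thickness, ddof=0)[0, 1]           # population covariance
abbreviated = n * (porosity.mean() * thickness.mean() + cov)
```

The abbreviated form with the covariance term reproduces the exhaustive sum exactly, while the averages-only version does not when the properties are correlated.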
14.
This paper shows the potential of gravity data to map the bottom topography of a buried landfill. To this end, a gravity inversion method is presented for estimating the landfill's bottom depths at discrete points, assuming that the density contrast decreases with depth according to a hyperbolic law. The method's efficiency was tested on synthetic data from simulated waste landfills, producing estimated bottom topographies very close to the true ones. The method was further evaluated by applying it to gravity data from the abandoned Thomas Farm Landfill site, Indiana, USA, whose bottom topography is known. The estimated topography shows close agreement with the known bottom topography of the Thomas Farm Landfill.
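A forward-model building block for such an inversion is the gravity effect of material whose density contrast decays hyperbolically with depth. The particular decay law below, Δρ(z) = Δρ₀β²/(β + z)², and all parameter values are assumptions for illustration, not necessarily the paper's exact parameterization; the slab integral has a closed form against which the numerical quadrature can be checked.

```python
import numpy as np

G = 6.674e-11                    # gravitational constant, m^3 kg^-1 s^-2

def density_contrast(z, drho0=-300.0, beta=50.0):
    # hypothetical hyperbolic decay law: drho(z) = drho0 * beta^2 / (beta + z)^2
    return drho0 * beta ** 2 / (beta + z) ** 2

def slab_gravity_numeric(depth_to_bottom, nz=10000):
    # Bouguer-slab gravity effect: g = 2*pi*G * integral of drho(z) dz,
    # evaluated by the trapezoidal rule
    z = np.linspace(0.0, depth_to_bottom, nz)
    fz = density_contrast(z)
    return 2.0 * np.pi * G * np.sum(0.5 * (fz[1:] + fz[:-1]) * np.diff(z))

def slab_gravity_analytic(t, drho0=-300.0, beta=50.0):
    # closed form of the same integral: 2*pi*G*drho0*beta*t/(beta + t)
    return 2.0 * np.pi * G * drho0 * beta * t / (beta + t)

g_num = slab_gravity_numeric(30.0)    # landfill bottom at 30 m depth
g_ana = slab_gravity_analytic(30.0)
```

The negative sign reflects the mass deficit of the waste relative to the host material, which is what makes the landfill detectable as a gravity low.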
15.
This paper describes a fast microcomputer-based remote-sensing image processing system built from a TMS32010 digital signal processor (DSP) and an IBM-PC/XT microcomputer. On this system, exploiting the characteristics of microwave remote-sensing imagery, we developed a library of fast image-processing routines in TMS32010 macro assembly language. Tests on airborne microwave radiometer images show that image processing programmed in TMS32010 assembly on this system runs 100–500 times faster than the same processing programmed in FORTRAN on an ordinary microcomputer. In addition, a new smoothing-filter algorithm is proposed; it suppresses Gaussian white noise better than a median filter while still preserving image edges without blurring.
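The new smoothing algorithm itself is not specified in the abstract, but the edge-preservation property it is benchmarked against can be demonstrated with the median-filter baseline the abstract mentions. This is a generic illustration, not the paper's filter:

```python
import numpy as np
from scipy.signal import medfilt

# an ideal step edge: the median filter reproduces it exactly, while a
# moving average of the same width blurs it
edge = np.array([0.0] * 10 + [1.0] * 10)

median_out = medfilt(edge, kernel_size=5)
mean_out = np.convolve(edge, np.ones(5) / 5.0, mode="same")
```

The median filter leaves the step untouched, whereas the moving average produces intermediate values around the transition; the paper's contribution is a filter that keeps this edge behavior while outperforming the median filter on Gaussian white noise.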
16.
Tomography is an important technique for inverting the three-dimensional structure of the ionosphere. Adding constraints can alleviate the ill-posedness of the tomographic model. In iterative tomography, however, the many grid cells crossed by no rays tend to make the imposed constraints produce results contrary to their intent (here called negative constraints). To address this problem, an iterative tomography method via parameter smoothing (ITPS) is proposed. In each iteration, every vertical profile is first fitted and corrected using the Chapman function and least squares, yielding two-dimensional images of the Chapman-function parameters; each parameter image is then smoothed with a moving-window method, and the electron density of each grid cell is corrected accordingly. Experiments show that ITPS can reduce negative constraints to some extent and suppress disturbances in the tomographic images. Compared with the MART and CMART algorithms, ITPS performs better for vertical profiles, the F2-layer critical frequency (foF2), the F2-layer peak height (hmF2), slant total electron content (STEC), and electron densities above hmF2. Relative to the more accurate CMART algorithm, the average improvement rates of ITPS are 7.49% for foF2, 6.60% for hmF2, 5.19% for STEC, and 11.41% for electron densities above hmF2.
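The per-profile step of ITPS, fitting a Chapman layer by least squares, can be sketched as below. The profile values, noise level, and parameter units are assumptions for illustration; only the Chapman functional form and the least-squares fit come from the abstract.

```python
import numpy as np
from scipy.optimize import curve_fit

def chapman(h, Nm, hm, H):
    # Chapman-layer electron density profile (Ne in units of 1e12 el/m^3):
    # Ne(h) = Nm * exp(0.5 * (1 - z - exp(-z))), with z = (h - hm) / H
    z = (h - hm) / H
    return Nm * np.exp(0.5 * (1.0 - z - np.exp(-z)))

# synthetic vertical profile: assumed truth NmF2 = 1.0, hmF2 = 300 km, H = 60 km
rng = np.random.default_rng(6)
heights = np.arange(100.0, 800.0, 10.0)
ne_true = chapman(heights, 1.0, 300.0, 60.0)
ne_obs = ne_true * (1.0 + 0.02 * rng.standard_normal(heights.size))

# least-squares fit of the Chapman parameters, as done for each vertical
# profile in an ITPS iteration
popt, _ = curve_fit(chapman, heights, ne_obs, p0=(0.5, 250.0, 50.0))
Nm_fit, hm_fit, H_fit = popt
```

In ITPS the fitted (Nm, hm, H) values over all profiles form two-dimensional parameter images, which are then smoothed with a moving window before the grid densities are corrected.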
17.
It is possible to reconstruct the past variation of an environmental variable from measured historical indicators when the modern values of the variable and the indicators are known. In a Bayesian statistical approach, the selection of a prior probability distribution for the past values of the environmental variable can then be crucial and the selection therefore should be made carefully. This is particularly the case when the data are noisy and the statistical model used is complex, since the influence of the prior on the results can then be especially strong. It can be difficult to elicit the prior probability distribution from the available information, since usually there are no measured data on the past values of the variable one wants to reconstruct, and different reconstructions are typically consistent with each other only at a coarse level. To overcome these difficulties we propose to use a non-informative smoothing prior, possibly in combination with an informative prior, that simply penalizes roughness of the reconstruction as measured by the variability of its values. We believe that it can sometimes be easier to set an overall prior distribution on the roughness than to agree on a prior for the actual values of the reconstructed variable. Note that by using a smoothing prior one incorporates into the model itself the smoothing step usually done before or after the actual numerical reconstruction. Another idea proposed in this paper is to integrate the reconstruction model with a multiscale feature analysis technique known as SiZer. Multiscale analysis of the posterior distribution of the reconstructed variable makes it possible to infer its statistically significant features such as trends, maxima and minima at several different time scales. While only temperature is considered in this paper, the technique can be applied to other environmental variables.
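A smoothing prior of the kind described, one that penalizes roughness measured by the variability of the reconstruction's values, can be sketched in a conjugate Gaussian setting. This is a generic illustration under assumed unit-variance iid noise and a squared-second-difference roughness measure, not the paper's temperature model:

```python
import numpy as np

def posterior_mean_smoothing_prior(y, lam):
    # Gaussian smoothing prior penalizing squared second differences
    # (roughness); under iid unit-variance noise, the posterior mean solves
    #   (I + lam * D2^T D2) x = y
    n = y.size
    D2 = np.diff(np.eye(n), n=2, axis=0)      # second-difference operator
    return np.linalg.solve(np.eye(n) + lam * D2.T @ D2, y)

# a noisy stand-in for an indicator-based reconstruction of a smooth signal
rng = np.random.default_rng(7)
t = np.linspace(0.0, 1.0, 200)
truth = np.sin(2.0 * np.pi * t)
y = truth + 0.3 * rng.standard_normal(t.size)
xhat = posterior_mean_smoothing_prior(y, lam=100.0)
```

The posterior mean is a smoothed version of the data, which is the abstract's point: the smoothing step usually done before or after the reconstruction is instead built into the model through the prior.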
18.
C. Weihs, Mathematical Applications, Information Services R-.Z., CIBA-GEIGY Ltd., CH- Basel, Switzerland. Acta Geographica Sinica (English Edition), 1993, (5)
Exploratory data analysis (EDA) is a toolbox of data-manipulation methods for looking at data to see what they seem to say, i.e. one tries to let the data speak for themselves. In this way there is hope that the data will lead to indications about 'models' of relationships not expected a priori. In this respect EDA is a pre-step to confirmatory data analysis, which delivers measures of how adequate a model is. In this tutorial the focus is on multivariate exploratory data analysis for quantitative data using linear methods for dimension reduction and prediction. Purely graphical multivariate tools such as 3D rotation and scatterplot matrices are discussed after introducing the univariate and bivariate tools on which they are based. The main tasks of multivariate exploratory data analysis are identified as 'search for structure' by dimension reduction and 'model selection' by comparing predictive power. Resampling is used to support validity, and variable selection to improve interpretability.
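The 'search for structure' by linear dimension reduction that the tutorial centers on can be illustrated with principal component analysis via the SVD. This is a generic sketch on synthetic data, not an example from the tutorial itself:

```python
import numpy as np

def pca(X, k):
    # PCA via SVD of the centered data matrix: project onto the first k
    # principal directions and report the per-component variance ratios
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:k].T                  # reduced-dimension coordinates
    var_ratio = s ** 2 / np.sum(s ** 2)
    return scores, var_ratio

# three variables, two of which are nearly collinear: structure that
# dimension reduction should reveal
rng = np.random.default_rng(4)
z = rng.standard_normal(300)
X = np.column_stack([z,
                     2.0 * z + 0.1 * rng.standard_normal(300),
                     rng.standard_normal(300)])
scores, ratio = pca(X, 2)
```

Plotting the two score columns against each other is exactly the kind of graphical follow-up (scatterplots of reduced coordinates) the tutorial's EDA workflow describes.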
19.
20.
Smoothing is essential to many oceanographic, meteorological, and hydrological applications. There are two predominant classes of smoothing problems. The first is fixed-interval smoothing, where the objective is to estimate model states within a time interval using all available observations in the interval. The second is fixed-lag smoothing, where the objective is to sequentially estimate model states over a fixed or indefinitely growing interval by restricting the influence of observations to a fixed window of time ahead of the evolving estimation time. In this paper, we use an ensemble-based approach to fixed-interval and fixed-lag smoothing and synthesize two algorithms. The first is a fixed-interval smoother whose computation time is linear in the interval length; the second is a fixed-lag smoother whose computation time is independent of the lag length. The complexity of these algorithms is presented, shown to improve upon existing implementations, and verified with identical-twin experiments conducted with the Lorenz-95 system. The results suggest that ensemble methods yield efficient fixed-interval and fixed-lag smoothing solutions, in the sense that the additional increment for smoothing is a small fraction of either filtering or model-propagation costs in a practical ensemble application. We also show that fixed-interval smoothing can perform as fast as fixed-lag smoothing, so it may not be necessary to use a fixed-lag approximation for computational savings alone.
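The fixed-lag ensemble smoothing idea can be sketched on a scalar toy system: each new observation updates not only the current ensemble but also the stored ensembles of the last L analysis times, through ensemble cross-covariances. This is a naive illustration, not the paper's algorithms — the model, parameters, and perturbed-observation update are assumptions, and this loop costs O(L) per step, whereas the paper's fixed-lag smoother removes the dependence on lag length.

```python
import numpy as np
from collections import deque

# toy scalar AR(1) system observed with noise
rng = np.random.default_rng(8)
N, L, phi, Q, R = 200, 3, 0.95, 0.1, 0.5

truth = 0.0
ens = rng.standard_normal(N)
lagged = deque(maxlen=L + 1)          # recent state ensembles
truths = deque(maxlen=L + 1)          # matching true states (for scoring)
err_filter, err_smooth = [], []

for _ in range(300):
    # propagate truth and ensemble forward one step
    truth = phi * truth + np.sqrt(Q) * rng.standard_normal()
    ens = phi * ens + np.sqrt(Q) * rng.standard_normal(N)
    lagged.append(ens.copy())
    truths.append(truth)
    y = truth + np.sqrt(R) * rng.standard_normal()

    # perturbed-observation EnKF update, applied to every stored ensemble:
    # lagged states are corrected via their cross-covariance with the
    # current predicted observation ensemble
    hx = lagged[-1].copy()
    innov = y + np.sqrt(R) * rng.standard_normal(N) - hx
    denom = hx.var(ddof=1) + R
    for i in range(len(lagged)):
        gain = np.cov(lagged[i], hx, ddof=1)[0, 1] / denom
        lagged[i] = lagged[i] + gain * innov
    ens = lagged[-1]

    err_filter.append(abs(ens.mean() - truth))
    if len(lagged) == L + 1:
        # the oldest stored ensemble has now absorbed L later observations
        err_smooth.append(abs(lagged[0].mean() - truths[0]))
```

Averaged over the run, the lagged (smoothed) estimates beat the filtered ones, illustrating the benefit that smoothing buys for a modest additional cost per observation.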