Similar Documents
20 similar documents found; search took 156 ms.
1.
Parameter selection for estimating burial depth with the Welch power-spectrum method (total citations: 1, self-citations: 0, other citations: 1)
Studies how the window type, window length, number of FFT points, and number of overlapping samples affect the burial depth of an anomalous body estimated with the Welch spectral-estimation method. Using the PSD function in Matlab, the analysis is carried out for the example of an infinitely long horizontal cylinder. Based on the results, recommendations are given for choosing the window function, window length, FFT length, and overlap when computing the burial depth of an anomalous body with the Welch method.
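The four parameter choices studied above map one-to-one onto the arguments of a standard Welch implementation; a minimal sketch in Python, with SciPy's `welch` standing in for Matlab's PSD function (the signal and sampling rate are made up):

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(0)
fs = 100.0                       # sampling rate in Hz (hypothetical)
t = np.arange(0, 20, 1 / fs)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * rng.standard_normal(t.size)

# The four choices studied above: window type, window length,
# number of FFT points, and number of overlapping samples.
f, pxx = welch(x, fs=fs,
               window="hann",    # window type
               nperseg=256,      # window length
               nfft=512,         # FFT points
               noverlap=128)     # overlapping samples

peak_hz = f[np.argmax(pxx)]      # dominant frequency, near 5 Hz here
```

Longer windows sharpen the frequency axis at the cost of fewer averaged segments; more overlap recovers some averaging, which is exactly the trade-off the paper's recommendations address.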

2.
Both the short-time Fourier transform (STFT) and the generalized S-transform (GST) have been applied to seismic time-frequency analysis, but their respective characteristics and differences have received relatively little study. Comparing their theoretical formulas, window functions, and performance on real seismic data shows that the STFT has the same resolution over the whole time-frequency plane: it treats the signal uniformly, lacks time-frequency focusing, and cannot selectively raise the resolution in regions of interest. The GST offers high temporal resolution for high-frequency seismic signals and high frequency resolution for low-frequency signals; the shape of its window function can be adjusted coarsely through the parameter p and fine-tuned through λ, so that the GST can focus its time-frequency resolution on specific regions of the signal.
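The contrast drawn above can be illustrated by the window widths alone. A toy sketch, assuming an illustrative width law σ(f) = λ/|f|^p for the GST window (not the paper's exact definition):

```python
import numpy as np

def stft_sigma(f, sigma0=0.05):
    """STFT: one fixed Gaussian window width for every frequency."""
    return np.full_like(np.asarray(f, dtype=float), sigma0)

def gst_sigma(f, lam=1.0, p=1.0):
    """Generalized S-transform: width sigma(f) = lam / |f|**p shrinks
    with frequency; p reshapes the window coarsely, lam fine-tunes it
    (illustrative form, not the paper's exact window)."""
    f = np.asarray(f, dtype=float)
    return lam / np.abs(f) ** p

f = np.linspace(5.0, 50.0, 10)
s_stft = stft_sigma(f)   # constant: uniform resolution everywhere
s_gst = gst_sigma(f)     # narrows: better time resolution at high f
```

The constant STFT width is the "整体性较强" (uniform-treatment) behavior described above, while the shrinking GST width is what delivers time-frequency focusing.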

3.
Adaptive weighted median filtering with improved windows (total citations: 1, self-citations: 0, other citations: 1)
Because the window function and window size of the traditional multistage 2-D median filter affect how noise in seismic data is handled, an adaptive weighted multistage 2-D median filter with improved windows is proposed. The window functions of the traditional filter are modified so that linear events and fine details are preserved, and an adaptive weighting function is then built on the improved windows; its adaptivity underpins both noise attenuation and the fidelity of the valid signal, and the resulting filter removes noise markedly well. Comparisons on a theoretical model and on field data show that the method removes noise and preserves valid signal better than the traditional 2-D multistage median filter.
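The multistage idea behind the improved windows can be sketched with plain directional sub-windows; this is a simplified stand-in (cross and diagonal line windows only, no adaptive weights):

```python
import numpy as np

def multistage_median(img, half=1):
    """Minimal multistage (directional-window) 2-D median filter.
    Medians are taken along four line-shaped sub-windows and combined,
    which preserves linear events better than one square window.
    Simplified sketch; the paper adds adaptive weighting on top."""
    out = img.astype(float).copy()
    n, m = img.shape
    k = np.arange(-half, half + 1)
    for i in range(half, n - half):
        for j in range(half, m - half):
            subs = [np.median(img[i, j + k]),      # horizontal
                    np.median(img[i + k, j]),      # vertical
                    np.median(img[i + k, j + k]),  # diagonal /
                    np.median(img[i + k, j - k])]  # diagonal \
            out[i, j] = np.median(subs + [img[i, j]])
    return out

img = np.ones((7, 7))
img[3, 3] = 100.0                # isolated noise spike
den = multistage_median(img)     # spike suppressed, background kept
```

Because each sub-window is a line, a linear event lying along any of the four directions survives one of the medians intact, which is the "protect linear features" property discussed above.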

4.
符伟, 刘财. 《世界地质》, 2015, 34(2): 505-510
The L1 regularization method is applied to seismic spectral inversion, verifying that spectral inversion can broaden the spectral bandwidth and raise resolution. The sparsity of the L1-regularized solution and the instability caused by applying a rectangular window are discussed in detail, an algorithm that selects the window length automatically is proposed, and the scheme is tested on seismic models. The method effectively broadens the spectral bandwidth of the data, improves the precision of reflection seismic exploration, and offers a new approach to thin-bed identification and more complex seismic surveys.
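A generic sketch of the L1-regularized least-squares problem behind such sparse inversions, solved with plain ISTA soft-thresholding on a made-up linear operator (this is not the paper's algorithm, and the automatic window selection is not reproduced):

```python
import numpy as np

def ista(G, d, lam, n_iter=1000):
    """Minimal ISTA solver for min_m 0.5*||G m - d||^2 + lam*||m||_1,
    the kind of L1-regularised inversion used in spectral inversion."""
    L = np.linalg.norm(G, 2) ** 2           # Lipschitz constant of G^T G
    m = np.zeros(G.shape[1])
    for _ in range(n_iter):
        g = m - G.T @ (G @ m - d) / L       # gradient step
        m = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # shrink
    return m

rng = np.random.default_rng(0)
G = rng.standard_normal((60, 100))          # hypothetical forward operator
m_true = np.zeros(100)
m_true[[10, 40, 70]] = [1.0, -2.0, 1.5]     # sparse "reflectivity"
d = G @ m_true
m_hat = ista(G, d, lam=0.1)                 # sparse solution recovered
```

The soft-threshold step is what produces the sparse solutions whose properties the paper discusses; the regularization weight trades data fit against sparsity.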

5.
Seismic exploration is widely used in oil, gas, and coalfield prospecting, and numerical modeling of seismic wavefields is the foundation of the whole seismic data-processing chain. Radial basis functions (RBFs) are introduced into acoustic wavefield modeling: spatial second derivatives are constructed with an RBF meshless method, while time stepping uses a simple second-order difference formula. The influence of the shape parameter c on accuracy is examined in detail, and an empirical range of 2-4 times the average data-point spacing is recommended. Several models are simulated with the RBF meshless method and compared against a finite-difference scheme of fourth order in space and second order in time. At the same accuracy, the RBF method needs far fewer data points per wavelength than the fourth-order rectangular-grid finite-difference method needs grid points; that is, its spatial sampling rate is lower, indicating that the RBF method has smaller numerical dispersion.
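The RBF construction of a spatial second derivative can be sketched in one dimension, with c set to 3 times the average point spacing, inside the 2-4x range recommended above (bare-bones RBF-FD without polynomial augmentation; nodes and test function are illustrative):

```python
import numpy as np

def rbf_d2_weights(x, xc, c):
    """Differentiation weights w with f''(xc) ~= w @ f(x), built from
    Gaussian RBFs phi(r) = exp(-r**2 / c**2); c is the shape parameter."""
    A = np.exp(-((x[:, None] - x[None, :]) / c) ** 2)  # interpolation matrix
    r = xc - x                                          # offsets to the nodes
    # exact second derivative of each Gaussian basis function at xc
    b = (4 * r**2 / c**4 - 2 / c**2) * np.exp(-(r / c) ** 2)
    return np.linalg.solve(A, b)

h = 0.1                              # average node spacing
x = np.array([-2 * h, -h, 0.0, h, 2 * h])
w = rbf_d2_weights(x, 0.0, c=3 * h)  # c = 3x spacing, in the 2-4x range

approx = w @ np.cos(x)               # estimate of cos''(0) = -1
```

The weights are exact for any function in the span of the Gaussian basis; changing c inside the 2-4x range rebalances accuracy against the conditioning of the interpolation matrix.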

6.
Applying the microtremor H/V spectral-ratio method in cities means facing a large amount of complex cultural noise, so microtremor data with strong noise must be denoised or analyzed carefully. To address the difficulty existing methods have with heavily contaminated microtremor data, and their cumbersome signal-extraction workflow, a multi-weighted spectral-ratio denoising method based on XGBoost (extreme gradient boosting) is proposed. The amplitude and frequency of the recorded microtremor data are first analyzed to build amplitude-weighted, frequency-weighted, and multi-weighted spectral ratios; from the multi-weighted ratios, the denoised spectral-ratio curve is then obtained with XGBoost. Compared against the traditional STA/LTA (short-time average / long-time average) method on real high-noise data, the proposed method extracts the signal from high-noise data better.

7.
Processing and interpretation of geophysical survey signals in hydro-engineering investigations (total citations: 2, self-citations: 0, other citations: 2)
To address signal distortion in the discrete Fourier transform of electromagnetic geophysical survey signals, the role and influence of windows of different scales on signal analysis are studied. Windows of different scales bring out different features of the signal, each with its own strengths and weaknesses. In interpretation, a wavelet analysis should be run first to reject pathological signals; window functions of suitable scales are then chosen according to the signal and the survey objective, so that the signals can be processed and interpreted accurately and effectively.

8.
To improve the poor denoising performance of existing time-frequency algorithms on ground-penetrating radar (GPR) signals, the reassigned time-frequency spectral decomposition is used for optimized GPR denoising. Because traditional time-frequency transforms partition the signal with a window function, they yield a blurred time-frequency distribution, so a badly chosen threshold easily damages the useful components during time-frequency denoising. Reassignment rearranges the time-frequency transform so that the signal energy concentrates near the true frequencies, which yields higher resolution and markedly lowers the chance of damaging useful information during denoising. Comparisons on synthetic and field data show that the reassigned spectral decomposition denoises better than traditional methods.

9.
Using the method of moments, with the frequencies within successive grade classes of the unordered data set as the measure, the generalized spectrum (D_q-q) and multifractal spectrum (f(α)-α) of Au, Ag, As, and other main ore-forming elements are studied for the ore bodies of the Damoqujia gold deposit, Jiaodong, and for the near-ore wall rocks (Linglong granite, Jiaodong Group). For weights -0.2 ≤ q ≤ 1.6, the generalized and multifractal spectra of ore-element grades differ clearly between ore bodies and near-ore wall rocks. In the ore bodies the generalized spectra of the three elements are nearly straight lines and the multifractal spectra are narrow, indicating dense, relatively uniform grade classes close to a simple fractal; in the wall rocks the generalized spectra are distinctly curved and the multifractal spectra are wide, indicating a broader, less uniform range of grade classes. Moreover, the α at the extremum of f(α) is smaller in the ore bodies than in the wall rocks, meaning the grade of the most probable subset is higher and the ore-forming elements are enriched in the ore bodies. Multifractal analysis thus shows the ore-forming elements to be more enriched and more uniform in the ore bodies than in the near-ore wall rocks.
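The method of moments itself is easy to sketch on a synthetic multifractal; here a binomial cascade stands in for the grade-frequency measure, and D_q is evaluated over the quoted weight range (a single box size only, unlike a full regression over scales):

```python
import numpy as np

def binomial_cascade(m, n):
    """Multiplicative binomial measure after n subdivisions: a standard
    synthetic multifractal standing in for the grade-frequency measure."""
    p = np.array([1.0])
    for _ in range(n):
        p = np.concatenate([m * p, (1 - m) * p])
    return p

def d_q(p, q, eps):
    """Method-of-moments generalized dimension at one box size eps."""
    chi = np.sum(p ** q)                    # partition function
    return np.log(chi) / ((q - 1) * np.log(eps))

n = 10
p = binomial_cascade(0.3, n)
eps = 2.0 ** -n
qs = [-0.2, 0.0, 0.5, 1.6]                  # the weight range quoted above
dq = [d_q(p, q, eps) for q in qs]           # strictly decreasing curve
```

A strictly decreasing, curved D_q-q is the multifractal signature described for the wall rocks; a flat D_q would correspond to the near-simple-fractal ore-body case.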

10.
贺梅, 刘财, 周寅, 罗腾, 张鹏, 丁玲. 《世界地质》, 2015, 34(2): 476-483
Comparing the short-time Fourier transform, continuous wavelet transform, S-transform, and generalized S-transform shows that, viewed from the window's perspective, they share a unified inner-product form and differ only in the choice of σ(f) in the window function. By analyzing how σ(f) varies with signal frequency in each method, and the resulting trade-off between time-window size and time-frequency resolution, the relative advantage of the generalized S-transform is explained. The generalized S-transform is then applied to four classes of thin-interbed models to study how their time-frequency behavior corresponds to geological properties and to guide prediction of stratigraphic cycle direction.

11.
The application of frequency distribution statistics to data provides objective means to assess the nature of the data distribution and viability of numerical models that are used to visualize and interpret data. Two commonly used tools are the kernel density estimation and reduced chi-squared statistic used in combination with a weighted mean. Due to the wide applicability of these tools, we present a Java-based computer application called KDX to facilitate the visualization of data and the utilization of these numerical tools.
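A minimal sketch of the two tools KDX wraps, kernel density estimation and a weighted mean with its reduced chi-squared, written in Python rather than Java (the data values are hypothetical):

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical measurements (e.g. ages) with 1-sigma uncertainties
x = np.array([10.1, 10.3, 9.9, 10.0, 10.2])
s = np.array([0.20, 0.20, 0.10, 0.15, 0.20])

# Kernel density estimate: an objective look at the data distribution
kde = gaussian_kde(x)
grid = np.linspace(9.0, 11.0, 400)
density = kde(grid)

# Inverse-variance weighted mean with its reduced chi-squared (MSWD)
w = 1.0 / s**2
mean = np.sum(w * x) / np.sum(w)
mswd = np.sum(w * (x - mean) ** 2) / (x.size - 1)
```

An MSWD near 1 indicates scatter consistent with the stated uncertainties; values well above 1 flag excess scatter, which is the viability check the abstract refers to.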

12.
Cluster analysis can be used to group samples and to develop ideas about the multivariate geochemistry of the data set at hand. Due to the complex nature of regional geochemical data (neither normal nor log-normal, strongly skewed, often multi-modal data distributions, data closure), cluster analysis results often strongly depend on the preparation of the data (e.g. choice of the transformation) and on the clustering algorithm selected. Different variants of cluster analysis can lead to surprisingly different cluster centroids, cluster sizes and classifications even when using exactly the same input data. Cluster analysis should not be misused as a statistical "proof" of certain relationships in the data. The use of cluster analysis as an exploratory data analysis tool requires a powerful program system to test different data preparation, processing and clustering methods, including the ability to present the results in a number of easy-to-grasp graphics. Such a tool has been developed as a package for the R statistical software. Two example data sets from geochemistry are used to demonstrate how the results change with different data preparation and clustering methods. A data set from S-Norway with a known number of clusters and known cluster membership is used to test the performance of different clustering and data preparation techniques. For a complex data set from the Kola Peninsula, cluster analysis is applied to explore regional data structures.
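The sensitivity to data preparation is easy to reproduce: clustering exactly the same synthetic "geochemical" input raw and after a log transform can give different memberships (illustrative data; k-means stands in for whichever clustering variant, and the package discussed above is an R package, not this Python sketch):

```python
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(0)
# Two skewed, lognormal element populations (hypothetical data)
X = np.vstack([rng.lognormal(0.0, 1.0, (100, 3)),
               rng.lognormal(1.5, 1.0, (100, 3))])

# Exactly the same input, two data preparations
_, lab_raw = kmeans2(X, 2, minit="++", seed=0)          # raw values
_, lab_log = kmeans2(np.log(X), 2, minit="++", seed=0)  # log-transformed

agree = np.mean(lab_raw == lab_log)
agree = max(agree, 1.0 - agree)     # cluster labels are arbitrary
```

On heavy-tailed raw data the centroids chase the outliers, so the raw and log-transformed classifications generally disagree for part of the samples, which is the dependence on preparation the abstract warns about.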

13.
All variables of several large data sets from regional geochemical and environmental surveys were tested for a normal or lognormal data distribution. As a general rule, almost all variables (up to more than 50 analysed chemical elements per data set) show neither a normal nor a lognormal data distribution. Even when different transformation methods are used, more than 70 % of all variables in every single data set do not approach a normal distribution. Distributions are usually skewed, have outliers and originate from more than one process. When dealing with regional geochemical or environmental data, normal and/or lognormal distributions are an exception and not the rule. This observation has serious consequences for the further statistical treatment of geochemical and environmental data. The most widely used statistical methods are all based on the assumption that the studied data show a normal or lognormal distribution. Neglecting that geochemical and environmental data show neither a normal nor a lognormal distribution will lead to biased or faulty results when such techniques are used. Received: 21 June 1999 · Accepted: 14 August 1999
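The testing procedure can be sketched as follows, with a made-up two-process lognormal mixture standing in for a survey variable; both the normal and the lognormal hypothesis are rejected, as the abstract reports for real survey data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# A variable produced by two lognormal processes, as is typical of
# regional geochemical data (hypothetical values)
x = np.concatenate([rng.lognormal(0.0, 0.5, 300),
                    rng.lognormal(2.5, 0.5, 100)])

p_raw = stats.normaltest(x).pvalue          # H0: x is normal
p_log = stats.normaltest(np.log(x)).pvalue  # H0: x is lognormal

# Both null hypotheses are rejected for this mixed-process variable.
```

Because the log-transformed variable is still a mixture of two populations, no single transformation can make it normal, which is the core point of the abstract.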

14.
We present a method for fitting trishear models to surface profile data, by restoring bedding dip data and inverting for model parameters using a Markov chain Monte Carlo method. Trishear is a widely-used kinematic model for fault-propagation folds. It lacks an analytic solution, but a variety of data inversion techniques can be used to fit trishear models to data. Where the geometry of an entire folded bed is known, models can be tested by restoring the bed to its pre-folding orientation. When data include bedding attitudes, however, previous approaches have relied on computationally-intensive forward modeling. This paper presents an equation for the rate of change of dip in the trishear zone, which can be used to restore dips directly to their pre-folding values. The resulting error can be used to calculate a probability for each model, which allows solution by Markov chain Monte Carlo methods and inversion of datasets that combine dips and contact locations. These methods are tested using synthetic and real datasets. Results are used to approximate multimodal probability density functions and to estimate uncertainty in model parameters. The relative value of dips and contacts in constraining parameters and the effects of uncertainty in the data are investigated.
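The accept/reject machinery is independent of the trishear model itself; a bare-bones Metropolis sampler on a toy one-parameter misfit illustrates it (the Gaussian "posterior" is hypothetical, standing in for the dip-restoration error):

```python
import numpy as np

def metropolis(log_prob, x0, step, n, seed=0):
    """Bare-bones Metropolis sampler: each trial model's misfit-derived
    (log) probability is compared with the current one and the trial is
    accepted or rejected accordingly (generic sketch)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    lp = log_prob(x)
    chain = np.empty((n, x.size))
    for i in range(n):
        prop = x + step * rng.standard_normal(x.size)
        lp_prop = log_prob(prop)
        if np.log(rng.random()) < lp_prop - lp:   # accept/reject rule
            x, lp = prop, lp_prop
        chain[i] = x
    return chain

# Toy posterior: restoring dips with the true parameter value 2.0
# gives zero error, so the probability peaks there (hypothetical)
log_prob = lambda m: -0.5 * np.sum((m - 2.0) ** 2) / 0.1**2
chain = metropolis(log_prob, [0.0], step=0.05, n=20000)
post_mean = chain[5000:].mean()     # discard burn-in
```

The retained chain approximates the posterior, so multimodal densities and parameter uncertainties fall out of histogramming it, as described above.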

15.
Geologists may want to classify compositional data and express the classification as a map. Regionalized classification is a tool that can be used for this purpose, but it incorporates discriminant analysis, which requires the computation and inversion of a covariance matrix. Covariance matrices of compositional data will always be singular (noninvertible) because of the unit-sum constraint. Fortunately, discriminant analyses can be calculated using a pseudo-inverse of the singular covariance matrix; this is done automatically by some statistical packages such as SAS. Granulometric data from the Darss Sill region of the Baltic Sea are used to explore how the pseudo-inversion procedure influences discriminant analysis results, comparing the algorithm used by SAS to the more conventional Moore–Penrose algorithm. Logratio transforms have been recommended to overcome problems associated with analysis of compositional data, including singularity. A regionalized classification of the Darss Sill data after logratio transformation differs only slightly from one based on raw granulometric data, suggesting that closure problems do not severely influence regionalized classification of compositional data.
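The singularity and its pseudo-inverse fix can be demonstrated directly (made-up four-part compositions; `numpy.linalg.pinv` implements the Moore–Penrose algorithm discussed above):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical grain-size compositions: four fractions closing to 1
raw = rng.random((50, 4))
comp = raw / raw.sum(axis=1, keepdims=True)

cov = np.cov(comp, rowvar=False)

# The unit-sum constraint makes cov singular: its rows sum to zero,
# so the rank is 3, not 4 (tol guards against roundoff)
rank = np.linalg.matrix_rank(cov, tol=1e-12)

# The Moore-Penrose pseudo-inverse still exists and can stand in for
# the ordinary inverse in the discriminant (Mahalanobis) computations
cov_pinv = np.linalg.pinv(cov)
```

The pseudo-inverse inverts the covariance on the 3-dimensional subspace the data actually occupy and ignores the constrained direction, which is why discriminant analysis remains workable.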

16.
Traditional hand-drawn paper mapping suffers from delayed updating of information and awkward sharing and exchange; even the computer mapping in common use today merely re-traces paper base maps and is not true digital computer mapping with professional applications. Generating basic geological maps directly from field survey data fed into the computer has therefore become a pressing practical need for professionals. The authors tried using MAPGIS 6.6, developed by the China University of Geosciences, supplemented by common chart-processing software, to convert the raw data into MAPGIS plain-text format files, import them into the mapping system, and process them directly into maps. This largely solves the problem of producing basic geological maps directly from field data: mapping precision is high, basic graphics and data are shared, result information is updated in real time, basic spatial analysis is possible, full digitization of the information is achievable, and map quality can be checked against the source data.

17.
Wavelet transforms have been used widely to analyse environmental data. These data typically comprise a series of measurements taken at regular intervals in time or space. The analysis offers a decomposition of the data that distinguishes components at different spatial scales but also, unlike Fourier analysis, can resolve local intermittent features. Most wavelet methods require the data to be sampled at regular intervals and little attention has been paid to developing methods for data that are not. In this paper, we derive a discrete Haar wavelet transform for irregularly sampled data and show how the resulting wavelet coefficients can be used to estimate contributions of variance. We discuss the interpretation of these statistics using data on apparent soil electrical conductivity of soil measured across a landscape as an example.
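For regularly sampled data, one level of the Haar split and its exact variance partition look like this (the paper's actual contribution, the irregular-sampling version, is not reproduced here):

```python
import numpy as np

def haar_step(x):
    """One level of the orthonormal discrete Haar transform for a
    regularly sampled series: pairwise smooths and details."""
    x = np.asarray(x, dtype=float)
    smooth = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return smooth, detail

x = np.array([2.0, 4.0, 6.0, 6.0, 1.0, 3.0, 5.0, 7.0])
s, d = haar_step(x)

# The coefficients partition the energy exactly, which is what lets
# squared wavelet coefficients be read as contributions of variance
energy_in = np.sum(x ** 2)
energy_out = np.sum(s ** 2) + np.sum(d ** 2)
```

Applying the step recursively to the smooth part yields coefficients at successively coarser scales, each set contributing its share of the total variance.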

18.
Given that current geotechnical-survey software is inflexible in office data processing and cannot fully reject anomalous data, Excel is proposed for sharing and statistically analyzing survey data, meeting the particular requirements of office processing and reducing the workload; it is also suggested that survey results be presented graphically in charts to make the results easier to apply.

19.
A data reduction method is described for determining platinum-group element (PGE) abundances by inductively coupled plasma-mass spectrometry (ICP-MS) using external calibration or the method of standard addition. Gravimetric measurement of volumes, the analysis of reference materials and the use of procedural blanks were all used to minimise systematic errors. Internal standards were used to correct for instrument drift. A linear least squares regression model was used to calculate concentrations from drift-corrected counts per second (cps). Furthermore, mathematical manipulations also contribute to the uncertainty estimates of a procedure. Typical uncertainty estimate calculations for ICP-MS data manipulations involve: (1) carrying standard deviations from the raw cps through the data reduction, or (2) calculating a standard deviation from multiple final concentration calculations. It is demonstrated that method 2 may underestimate the uncertainty of the calculated data. Methods 1 and 2 do not typically include an uncertainty component from a regression model. As such models contribute to the uncertainty estimates affecting the calculated data, an uncertainty component from the regression must be included in any final error calculations. Confidence intervals are used to account for uncertainty estimates from the regression model; these confidence intervals are simpler to calculate than uncertainty estimates from method 1, for example. The data reduction and uncertainty estimation method described here addresses problems with PGE data reported in an article in the literature and addresses both precision and accuracy. The method can be applied to any analytical technique where drift corrections or regression models are used.
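A minimal sketch of the external-calibration step: a least-squares line through drift-corrected cps versus standard concentrations, inverted for an unknown, with the regression scatter that feeds the confidence intervals (all numbers are hypothetical):

```python
import numpy as np

# Hypothetical external-calibration data: standard concentrations
# (ng/g) vs drift-corrected counts per second for one PGE isotope
conc = np.array([0.0, 0.5, 1.0, 2.0, 5.0, 10.0])
cps = np.array([12.0, 251.0, 498.0, 1004.0, 2480.0, 4995.0])

# Ordinary least-squares line: cps = a + b * conc
A = np.vstack([np.ones_like(conc), conc]).T
(a, b), _, _, _ = np.linalg.lstsq(A, cps, rcond=None)

# Invert an unknown sample's drift-corrected cps to a concentration
unknown_cps = 1500.0
unknown_conc = (unknown_cps - a) / b

# Standard error of the regression: the component of uncertainty that
# methods 1 and 2 omit and that the confidence intervals account for
n = conc.size
resid = cps - A @ np.array([a, b])
s = np.sqrt(np.sum(resid ** 2) / (n - 2))
```

The point of the abstract is that `s`, the scatter of the calibration line itself, must enter the final uncertainty budget alongside the counting statistics of the raw cps.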

20.
赵勇, 宋阳, 罗勇, 张静. 《新疆地质》, 2006, 24(4): 460-463
Using seismic physical-modeling technology and the geological interpretation of the 3-D seismic survey of the Che 38 well area, Junggar Basin, a 3-D physical model was built at a chosen modeling similarity ratio. With ultrasonic reflection techniques, full 3-D data were acquired over the model with two different acquisition geometries. Comparing 3-D data of different bin sizes provides an experimental basis for choosing the bin size for different geological targets, and a complete workflow from field acquisition through data processing to interpretation is proposed. According to the seismic-geological conditions and exploration targets at hand, forward physical modeling can thus guide a sensible choice of 3-D seismic bin size and other acquisition parameters.


Copyright©北京勤云科技发展有限公司  京ICP备09084417号