Similar Documents
17 similar documents found.
1.
The least-squares collocation (LSC) method has attracted wide attention for its ability to combine different kinds of gravity observations in local gravity field approximation, but the instability of its results has severely hindered wider application. A spectral decomposition of the covariance matrix of the gravity observations shows that the matrix is ill-conditioned: inverting it is a non-stationary, signal-amplifying process in which tiny observation errors are magnified by the small singular values, so the collocation results become unstable and inaccurate. Introducing the Tikhonov regularization algorithm, with the regularization parameter selected by the L-curve method, the small singular values of the observation covariance matrix are modified so that their amplification of observation errors is suppressed. Experiments determining the regional geoid from gravity anomalies computed with the EGM2008 gravity field model over a mountainous area, a hilly area, and a sea area verify the effectiveness of the proposed algorithm.
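The regularization scheme this abstract describes, damping the small singular values of the observation covariance matrix and picking the parameter at the corner of the L-curve, can be sketched in a few lines of numpy. This is a minimal illustration under stated assumptions, not the paper's implementation: a Hilbert matrix stands in for the ill-conditioned covariance matrix, and the L-curve corner is located by a crude finite-difference maximum-curvature scan.

```python
import numpy as np

# Hypothetical stand-in for the ill-conditioned covariance matrix: a Hilbert matrix
n = 12
C = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1.0)
rng = np.random.default_rng(0)
x_true = rng.standard_normal(n)
l = C @ x_true + 1e-6 * rng.standard_normal(n)      # observations with small noise

U, s, Vt = np.linalg.svd(C)

def tikhonov_solve(alpha):
    # Filter factors s_i^2/(s_i^2 + alpha) damp the small singular values,
    # suppressing the amplification of observation errors
    return Vt.T @ (s / (s**2 + alpha) * (U.T @ l))

# L-curve: scan alpha, form log residual norm vs log solution norm,
# and take the point of maximum curvature as the corner (crude scan)
alphas = np.logspace(-12, 0, 60)
rho = np.log([np.linalg.norm(C @ tikhonov_solve(a) - l) for a in alphas])
eta = np.log([np.linalg.norm(tikhonov_solve(a)) for a in alphas])
d_rho, d_eta = np.gradient(rho), np.gradient(eta)
curvature = (d_rho * np.gradient(d_eta) - d_eta * np.gradient(d_rho)) \
            / np.maximum((d_rho**2 + d_eta**2) ** 1.5, 1e-30)
alpha_corner = alphas[np.argmax(curvature)]
x_reg = tikhonov_solve(alpha_corner)
```

With a matrix this ill-conditioned, the unregularized solve is dominated by amplified noise, while any reasonable corner alpha keeps the solution bounded.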

2.
The covariance function between airborne gravity anomalies and the corresponding surface gravity anomalies is established with the least-squares collocation method; because the covariance matrix is severely ill-conditioned, observation errors are amplified, degrading the stability and accuracy of the downward-continuation results. Introducing truncated singular value decomposition, with the regularization parameter selected by the generalized cross-validation (GCV) method, suppresses the amplification of observation noise caused by the ill-conditioned covariance matrix. Airborne gravity anomalies over land and sea computed from the EGM2008 gravity field model were used as test data, and the corresponding surface gravity anomalies obtained by downward continuation with the proposed method confirm its effectiveness.
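The truncated-SVD-with-GCV idea can be illustrated with a small numpy sketch: discard the singular values beyond a truncation level k and choose k by minimizing the GCV function. The Hilbert matrix here is a hypothetical ill-conditioned operator, not the airborne/surface covariance model of the paper.

```python
import numpy as np

# Hypothetical ill-conditioned operator (Hilbert matrix) linking a smooth
# "surface" signal to noisy "airborne" observations
n = 12
A = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1.0)
rng = np.random.default_rng(1)
x_true = np.sin(np.linspace(0.0, np.pi, n))
b = A @ x_true + 1e-5 * rng.standard_normal(n)

U, s, Vt = np.linalg.svd(A)
beta = U.T @ b

def tsvd_solve(k):
    # Keep the k largest singular values; the discarded tail is what
    # amplifies the observation noise
    return Vt[:k].T @ (beta[:k] / s[:k])

def gcv(k):
    # GCV for TSVD: squared residual norm over (n - k)^2; for a square
    # operator the residual is the discarded spectral tail of b
    return np.sum(beta[k:] ** 2) / (n - k) ** 2

ks = np.arange(1, n)
k_best = ks[np.argmin([gcv(k) for k in ks])]
x_tsvd = tsvd_solve(k_best)
```

GCV balances the shrinking residual against the shrinking effective degrees of freedom, so the minimum tends to fall where the spectral coefficients drop to the noise level.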

3.
Six deployment schemes using a small number of surface gravity measurements and four methods of processing airborne gravimetry data with sparse ground control are proposed, and a fusion model of gridded mean gravity anomalies based on least-squares collocation is established. The model effectively estimates the systematic bias that may be present in airborne gravity data and improves the quality of the gridded data.

4.
Based on the conventional three-dimensional variational assimilation (3DVAR) framework and regularization techniques from inverse-problem theory, a 3DVAR method with a regularization constraint term is proposed for wind field fusion, and data fusion experiments were carried out over the South China Sea. The model-function method was used to determine a suitable regularization parameter, and for a typhoon case QuikSCAT scatterometer sea surface wind data were fused with sea surface winds from the South China mesoscale model. The results show that the regularized 3DVAR method clearly removes the spurious information introduced by conventional 3DVAR fusion: the analysed wind, vorticity, and divergence fields are evenly distributed with a clear structure and a well-defined cyclone centre, and the observations dominate the analysis. A quantitative assessment with the degrees of freedom for signal (DFS) method shows that, relative to conventional 3DVAR, the observations in the regularized 3DVAR system provide more DFS and exert a stronger influence on the analysis field. Validation against independent observations shows that, compared with the statistics of the South China mesoscale model and of conventional 3DVAR, the wind field from the regularized 3DVAR method has the smallest root-mean-square error and the largest correlation coefficient.
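In the linear case, a 3DVAR cost function with an added regularization term has a closed-form minimizer, which a toy one-dimensional fusion sketch can make concrete. Everything below is hypothetical (grid, error variances, first-difference smoothness operator, mu = 1); the paper's model-function parameter choice and its 2-D wind fields are not reproduced.

```python
import numpy as np

# Toy 1-D fusion: background field on a grid, sparse observations, and a
# 3DVAR cost with an extra first-difference (Tikhonov) smoothness term
n = 50
grid = np.linspace(0.0, 1.0, n)
truth = np.sin(2.0 * np.pi * grid)
xb = 0.8 * truth                                   # biased background (model) field
obs_idx = np.arange(5, n, 10)                      # sparse observation locations
H = np.zeros((obs_idx.size, n))
H[np.arange(obs_idx.size), obs_idx] = 1.0          # observation operator
rng = np.random.default_rng(2)
y = truth[obs_idx] + 0.05 * rng.standard_normal(obs_idx.size)

B_inv = np.eye(n) / 0.2**2                         # inverse background error covariance
R_inv = np.eye(obs_idx.size) / 0.05**2             # inverse observation error covariance
L = np.diff(np.eye(n), axis=0)                     # first-difference regularization operator
mu = 1.0                                           # regularization weight (hypothetical)

# J(x) = (x-xb)' B^{-1} (x-xb) + (Hx-y)' R^{-1} (Hx-y) + mu |L x|^2
# is quadratic, so the analysis solves the normal equations directly:
A = B_inv + H.T @ R_inv @ H + mu * L.T @ L
rhs = B_inv @ xb + H.T @ R_inv @ y
x_analysis = np.linalg.solve(A, rhs)
```

Because the observation errors are assumed smaller than the background errors, the analysis is drawn toward the observations while the mu-term keeps it smooth between them.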

5.
Model-updating problems for large structures are ill-posed to varying degrees, and small errors in the measured data can cause the solution to fail. This paper studies the solution of ill-conditioned model-updating systems under measurement noise. First, the structural model-updating problem and the regularization methods commonly used in mathematics are introduced. Then, a numerical cantilever-beam example is used to examine the applicability of truncated singular value decomposition (TSVD) regularization and the L-curve method to model updating. The results show that appropriate regularization effectively solves the ill-conditioned model-updating problem, and that the method also applies to model-updating equations that only partially satisfy the discrete Picard condition. Finally, the practicality of the regularization method is verified with a physical model test of a jacket platform.

6.
Downward continuation of the geomagnetic field is the principal means of transforming magnetic data between levels, and a key technique for fusing multi-source marine magnetic survey data and constructing a three-dimensional marine magnetic background field model. Based on the principles of frequency-domain potential-field downward continuation, four continuation methods were used to continue magnetic data measured at a height of 300 m down to 200 m, and the results were compared with data actually measured at 200 m in the same sea area, both to assess each method and to provide a reference for building the three-dimensional model. The results show that the Tikhonov regularization method agrees best with the measured data; the integral iteration method readily amplifies the high-frequency noise in the measurements and has the largest continuation error; and the iterative Tikhonov and Landweber iteration methods give the smoothest contours. The errors of the methods are similar, all about 4 nT, but all of the computed fields lack local detail and still differ somewhat from the measured data.
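The trade-off this comparison describes, direct downward continuation amplifying high-frequency noise while Tikhonov regularization keeps it bounded, can be seen in a one-dimensional FFT sketch. The field, heights, noise level, and alpha below are illustrative choices, not the survey's values.

```python
import numpy as np

# 1-D synthetic example: upward-continue a "true" field to the survey height,
# add noise, then continue back down
n = 256
dx = 100.0                                            # grid spacing (m)
x = np.arange(n) * dx
true_field = np.exp(-((x - 12800.0) / 2000.0) ** 2)   # field at the lower level
k = np.abs(np.fft.fftfreq(n, d=dx)) * 2.0 * np.pi     # wavenumber (rad/m)
h = 100.0                                             # continuation distance (300 m -> 200 m)

upward = np.fft.ifft(np.fft.fft(true_field) * np.exp(-k * h)).real
rng = np.random.default_rng(3)
obs = upward + 5e-3 * rng.standard_normal(n)          # noisy data at the upper level

# Direct downward continuation multiplies by exp(+kh): noise near the Nyquist
# wavenumber is amplified by exp(pi) ~ 23 on this grid
naive = np.fft.ifft(np.fft.fft(obs) * np.exp(k * h)).real

# Tikhonov-regularized continuation: minimize |e^{-kh} X - obs|^2 + alpha |X|^2
# per wavenumber, giving the bounded filter e^{-kh} / (e^{-2kh} + alpha)
alpha = 1e-2
tik_filter = np.exp(-k * h) / (np.exp(-2.0 * k * h) + alpha)
regularized = np.fft.ifft(np.fft.fft(obs) * tik_filter).real
```

The regularized filter matches exp(+kh) at low wavenumbers, where the signal lives, and rolls off where exp(-2kh) falls below alpha, which is where the direct operator would mostly amplify noise.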

7.
The basic principle of measuring deep-sea hydrothermal temperature fields by acoustic methods is introduced. For wide single-peak, narrow single-peak, and double-peak temperature field models, reconstruction simulations are carried out with both the least-squares method and a Fourier regularization method, and the absolute, relative, and root-mean-square errors of the reconstructed fields are given. Comparing the two approaches over the model fields shows that least squares reconstructs the wide single-peak field with higher accuracy, whereas the regularization method recovers the narrow single-peak and double-peak fields, whose high-temperature zones are small relative to the measurement domain, better than least squares does.

8.
Application of Surfer 8.0 to the gridding of gravity anomaly data
The gridding of gravity anomaly data with the Surfer 8.0 software is described. Taking the characteristics of gravity field data into account, a data-format conversion program was written so that the interpolation methods built into Surfer 8.0 could be applied directly to grid data over different terrains (land, mountainous areas, and the ocean). Cross-validation and comparative analysis of the results show that gridding gravity anomaly data with Surfer 8.0 is practical and feasible.

9.
Multi-source gravity data over the South Yellow Sea and its surroundings (shipborne gravity measurements, modern satellite-altimetry-derived gravity, land gravity measurements, and Earth gravity field models) were compiled and merged, and the Bouguer gravity anomaly characteristics of the region were analysed. An apparent-density inversion based on large-depth potential-field downward continuation, using the cutting method and an interpolation-iteration method, was applied to process and interpret the South Yellow Sea Bouguer gravity and, combined with geological information from Shandong and the Korean Peninsula, the boundary between the Yangtze block and the Sino-Korean block in the South Yellow Sea was inferred. The offshore boundary runs along the fault at the northern margin of Qianliyan, extends NE to 125°E, turns NNW, turns NE again near 36°N, enters the Korean Peninsula, and continues NE past the southern side of Hongseong. The northern margin of the Yangtze block is the Sulu orogenic belt, which crosses the Yellow Sea to join the Gyeonggi orogenic belt, the Sulu ultrahigh-pressure metamorphic belt corresponding to the Hongseong complex.

10.
Multiple suppression has increasingly become a focus of single-channel seismic data processing, and one of its key techniques is adaptive matched filtering. Conventional matched-filtering algorithms, however, are not always suited to seismic signals that vary non-stationarily in time and space, and regularized non-stationary regression was proposed to address this. Starting from the basic principles, the regularization parameters are tested on model data to characterize their influence on the processing results; the technique is then applied to the prediction and subtraction of multiples in field data, where it achieves good multiple suppression. Practical comparisons confirm that the technique outperforms conventional matched filtering in removing multiples.

11.
Research on the basic resolution of a new-generation mean gravity anomaly model for China's land and sea areas
Two problems in establishing a new-generation digital model of mean gravity anomalies over China's land and sea areas are studied: the determination of the basic resolution, and the selection of known gravity points for fitting and estimation. Computations over the whole country confirm the conclusions of this paper.

12.
Deflections of the vertical (DOVs) over oceans cannot be directly measured, which restricts their applications. A local covariance function of the anomalous potential is put forward in this paper, in conjunction with the least-squares collocation (LSC) method, to compute oceanic DOVs from gravity data along a profile. The covariance functions of the gravity field quantities are derived directly as functions of x, y and z, without introducing coordinate transformations for along- or cross-profile components. In the proposed methodology, gravity data along a profile are used to calculate the residual gravity anomaly with the remove-compute-restore technique; the residual gravity anomaly is then used to estimate the parameters of the proposed covariance function of the local anomalous gravity field, which is applied in LSC to compute the residual DOVs along the profile. Model DOVs are added back to the residual DOVs to recover the full DOVs along the profile. The results of a simulation experiment show that the proposed methodology is feasible and effective.

13.
The recovery of quantities related to the gravity field (i.e., geoid heights and gravity anomalies) is carried out in a test area of the central Mediterranean Sea using 5' × 5' marine gravity data and satellite altimeter data from the Geodetic Mission (GM) of ERS-1. The optimal combination of the two heterogeneous data sources is performed using (1) the space-domain least-squares collocation (LSC) method, and (2) the frequency-domain input-output system theory (IOST). The results derived by these methods agree at the level of 2 cm in terms of standard deviation in the case of geoid height prediction. The gravity anomaly prediction results of the same methods vary between 2.18 and 2.54 mGal in terms of standard deviation. In all cases, the spectral techniques have a much higher computational efficiency than the collocation procedure. To investigate the importance of satellite altimetry for gravity field modeling, a pure gravimetric geoid solution, carried out in a previous study for our test area by the fast collocation approach (FCOL), is compared with the combined geoid models. The combined solutions give more accurate results than the gravimetric geoid solution, at the level of about 15 cm in terms of standard deviation, when the geoid heights derived by each method are compared with TOPEX altimeter sea surface heights (SSHs). Moreover, nonisotropic power spectral density functions (PSDs) can be easily used by IOST, whereas LSC requires isotropic covariance functions. The results show that higher prediction accuracies are always obtained when a priori nonisotropic information is used instead of isotropic information.

14.
The approach presented is directed toward a specific adaptation of the least-squares collocation with noise, yielding smooth predictions of geophysical quantities. The smoothing corresponds here to a truncated gravity field equivalent to an (n′, n′) spherical-harmonic expansion. This is reflected in the truncation, at the degree n′, of the pertinent covariance and cross-covariance functions in most (but not all) instances. The smooth predictions of geophysical quantities, made in an equilateral grid corresponding to the truncation degree n′, serve in constructing contour maps after having been densified for the needs of a contour routine. Such a densification is carried out efficiently via errorless collocation with the degree truncation n′ throughout. Consistent with this procedure, "residuals" at observation points (i.e., discrepancies between the contour map and the data) are computed using the same algorithm. The complete collocation approach is utilized for a 2° resolution of the earth's gravity field with emphasis on the oceanic geoid, based on the residuals from a global spherical-harmonic adjustment of SEASAT altimetry. The presented results include contour maps of geoid undulations and gravity anomalies. They are compared to the results of a point-mass adjustment, another technique based on the spherical-harmonic adjustment. The agreement between these two techniques is found to be excellent.

15.
Two mean dynamic topography (MDT) fields are determined in the Fram Strait between Svalbard and Greenland. New airborne gravity anomalies, older data, and two different mean sea surface (MSS) fields are combined using the least squares collocation (LSC) technique. The results are compared to an oceanographic MDT model and two synthetic MDT fields. The same main currents are seen in all fields. Additionally, smaller scale features are revealed in the new MDT fields. Geostrophic surface currents derived from the MDT models are compared to moorings and Lagrangian drifters. The agreement is desultory. The oceanographic data are an inadequate basis of comparison due to data gaps. Nevertheless, it is the only one available.

16.
Spherical cap harmonic analysis is an effective method for building a regional gravity field model. When the gravity field is modelled in spherical cap coordinates, only the vertical component of the gravity potential is involved and no differentiation with respect to the polar angle is needed, so a single orthogonal basis suffices. For further analysis in spherical cap coordinates, however, such as derivative operations analogous to those in planar processing used to highlight the boundaries of geological bodies, derivatives with respect to the polar angle must be computed, and a representation in a single orthogonal basis then converges poorly and carries large errors. In this paper the gravity field is modelled in spherical cap coordinates with two orthogonal basis functions; the radial derivative and the modulus of the surface derivative are derived and computed, and the scheme is checked against a prism forward model. The results show that modelling with two basis functions is better suited to representing the potential field on a curved surface. Finally, surface derivatives of satellite-altimetry gravity anomaly data over the Okinawa Trough area were computed: the radial derivative clearly delineates the major tectonic units of the East China Sea shelf, while the surface-derivative gradient belts mark the uplift belt along the outer margin of the shelf and indicate several local structures within the shelf basin. This shows that spherical cap harmonic analysis with two basis functions is an effective method for modelling a potential field on a curved surface.

17.
Estimation of the open-boundary inputs by solving a weak-constraint variational formulation for an Arctic tide model is considered as an ill-posed problem, in the sense that the solution is very sensitive to data noise and to grid size. Mathematically, spatial discretization of the cost function to be minimized and penalization of the normal flow through the open boundary act as regularization of the problem. A heuristic rule for choosing the regularization parameter is applied to select a suitable spatial resolution and the weight assigned to the open-boundary penalty. It is shown that these provide a better fit of the solution to a control data set than a finer grid does, with the energy flux through the open boundary in agreement with other model estimates. The M2 solution obtained is much closer to the control data than other modern solutions, while the accuracy of the simulated K1 constituent is within the same error level. The tidal maps for these waves exhibit certain distinctions in comparison with other charts.


Copyright©北京勤云科技发展有限公司  京ICP备09084417号