Similar Articles
1.
Multibeam bathymetric data provide critical information for the modeling of seabed geology and benthic biodiversity. The accuracy of these models depends on the accuracy of the bathymetric data, which contain uncertainties that are stochastic at individual soundings but exhibit a distinct spatial distribution with increasing magnitude from nadir to the outer beams. A restricted spatial randomness method that simulates both the stochastic and spatial characteristics of the data uncertainty performed better than a complete spatial randomness method in analyzing the impact of bathymetric data uncertainty on derived seafloor attributes.
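A minimal sketch (not the authors' code) of the contrast between the two simulation strategies: complete spatial randomness draws every sounding error from one distribution, while restricted spatial randomness lets the error standard deviation grow with beam angle from nadir to the outer beams. The swath geometry and error magnitudes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative swath geometry: beam angles from -65 deg (port) to +65 deg (starboard).
beam_angle = np.linspace(-65.0, 65.0, 256)          # degrees from nadir
true_depth = 100.0                                   # m, flat seafloor for the example

# Complete spatial randomness (CSR): one error distribution for every sounding.
sigma_csr = 0.5                                      # m, assumed uniform uncertainty
err_csr = rng.normal(0.0, sigma_csr, beam_angle.size)

# Restricted spatial randomness (RSR): uncertainty grows from nadir to the outer beams.
# A simple assumed model: sigma rises with 1/cos(angle) as outer-beam footprints widen.
sigma_rsr = 0.2 / np.cos(np.radians(beam_angle))     # m, smallest at nadir
err_rsr = rng.normal(0.0, 1.0, beam_angle.size) * sigma_rsr

depth_csr = true_depth + err_csr
depth_rsr = true_depth + err_rsr

# Derived-attribute impact: compare the across-track slope computed from each realization.
slope_csr = np.gradient(depth_csr)
slope_rsr = np.gradient(depth_rsr)
print("slope std, CSR realization:", slope_csr.std())
print("slope std, RSR realization:", slope_rsr.std())
```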

2.
High-precision seabed terrain reconstruction based on multi-source bathymetric data fusion
Building on multi-source bathymetric data construction techniques, this paper analyzes the tension-spline interpolation algorithm and the "remove-restore" approach to multi-source bathymetric data fusion. A test area was selected, and multibeam, single-beam, and historical-chart depths were fused to build a high-precision seabed terrain model. To address the lack of error assessment in multi-source depth fusion, a split-sample method was used to evaluate the depth uncertainty of the fused result, producing a spatial distribution of its reliability. The results show that the method achieves good fusion in both data-sparse and high-density areas: it preserves the detail of the high-resolution depth data while realistically representing the seabed morphology of the study area, and the resulting terrain is accurate and reliable, with error percentages concentrated around 0.5%. The complete fusion and assessment workflow can serve as a reference for high-precision seabed terrain construction from multi-source bathymetric data.
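A hedged sketch of the "remove-restore" idea and the split-sample check described above, not the authors' implementation: a long-wavelength trend taken from the coarse (chart-like) depths is removed from the dense soundings, the residuals are gridded, the trend is restored on the output grid, and a held-out subset of soundings is used to estimate the uncertainty of the fused surface. scipy's generic griddata stands in for the tension-spline interpolator; all data below are synthetic.

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(1)

# Synthetic stand-ins: sparse "historical chart" depths and dense "multibeam" depths.
coarse_xy = rng.uniform(0, 1000, (60, 2))
coarse_z = 50 + 0.02 * coarse_xy[:, 0] + rng.normal(0, 0.5, 60)
dense_xy = rng.uniform(0, 1000, (3000, 2))
dense_z = 50 + 0.02 * dense_xy[:, 0] + 2 * np.sin(dense_xy[:, 1] / 50) + rng.normal(0, 0.2, 3000)

# Split-sample: hold out 10% of the dense soundings for uncertainty assessment.
n_hold = dense_xy.shape[0] // 10
hold_xy, hold_z = dense_xy[:n_hold], dense_z[:n_hold]
fit_xy, fit_z = dense_xy[n_hold:], dense_z[n_hold:]

grid_x, grid_y = np.meshgrid(np.linspace(0, 1000, 201), np.linspace(0, 1000, 201))

# Remove: subtract a long-wavelength trend surface built from the coarse data.
trend_at_fit = griddata(coarse_xy, coarse_z, fit_xy, method="linear")
trend_at_fit = np.where(np.isnan(trend_at_fit),
                        griddata(coarse_xy, coarse_z, fit_xy, method="nearest"),
                        trend_at_fit)
residual = fit_z - trend_at_fit

# Grid the residuals, then Restore: add the trend back on the output grid.
res_grid = griddata(fit_xy, residual, (grid_x, grid_y), method="linear")
trend_grid = griddata(coarse_xy, coarse_z, (grid_x, grid_y), method="nearest")
fused = trend_grid + res_grid

# Split-sample uncertainty: evaluate the fused model at the held-out soundings.
pred_at_hold = griddata(fit_xy, residual, hold_xy, method="linear") + \
               griddata(coarse_xy, coarse_z, hold_xy, method="nearest")
diff = pred_at_hold - hold_z
print("split-sample RMS error (m):", np.sqrt(np.nanmean(diff ** 2)))
```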

3.
Gridding heterogeneous bathymetric data sets for the compilation of digital bathymetric models (DBMs) poses specific problems when there are extreme variations in source data density. This requires gridding routines capable of subsampling high-resolution source data while preserving as much of the small detail as possible, and at the same time interpolating in areas with sparse data without generating gridding artifacts. A frequently used gridding method generalizes bicubic spline interpolation and is known as continuous curvature splines in tension. This method is further enhanced in this article to specifically handle heterogeneous bathymetric source data. Our method constructs the final grid by stacking several surfaces of different resolutions, each generated using the splines-in-tension algorithm. With this approach, the gridding resolution is locally adjusted to the density of the source data set: areas with high-resolution data are gridded at higher resolution than areas with sparse source data. In comparison with some of the most widely used gridding methods, our approach yields superior DBMs from heterogeneous bathymetric data sets, preserving small bathymetric details in the high-resolution source data while minimizing interpolation artifacts in regions sparsely constrained by data. Common problems such as artifacts from ship tracklines are suppressed. Although our stacked continuous curvature splines-in-tension gridding algorithm has been specifically designed to construct DBMs from heterogeneous bathymetric source data, it may be used to compile regular grids from other geoscientific measurements.
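A simplified, hypothetical sketch of the stacking idea: grid everything at a coarse resolution, grid again at a fine resolution only where the source data are dense, and let the fine surface override the coarse one. scipy's griddata is used here as a stand-in for the continuous curvature splines-in-tension solver used in the article, and the density threshold is an arbitrary assumption.

```python
import numpy as np
from scipy.interpolate import griddata
from scipy.stats import binned_statistic_2d

rng = np.random.default_rng(2)

# Heterogeneous synthetic soundings: dense in one corner, sparse elsewhere.
sparse = rng.uniform(0, 1000, (200, 2))
dense = rng.uniform(0, 300, (20000, 2))
xy = np.vstack([sparse, dense])
z = 100 + 0.03 * xy[:, 0] + 3 * np.sin(xy[:, 1] / 40) + rng.normal(0, 0.3, len(xy))

def grid(res):
    gx, gy = np.meshgrid(np.arange(0, 1000, res), np.arange(0, 1000, res))
    return gx, gy, griddata(xy, z, (gx, gy), method="linear")

# One coarse surface everywhere, one fine surface that is only trustworthy where data are dense.
gx_c, gy_c, coarse = grid(50.0)
gx_f, gy_f, fine = grid(10.0)

# Data density per fine cell; keep the fine surface only where density is adequate.
counts, _, _, _ = binned_statistic_2d(xy[:, 0], xy[:, 1], z, statistic="count",
                                      bins=[np.arange(0, 1001, 10), np.arange(0, 1001, 10)])
dense_enough = counts.T > 2                      # assumed threshold: >2 soundings per cell

# Stack: start from the coarse surface resampled to the fine grid, overwrite dense cells.
coarse_on_fine = griddata(np.column_stack([gx_c.ravel(), gy_c.ravel()]), coarse.ravel(),
                          (gx_f, gy_f), method="linear")
stacked = np.where(dense_enough & ~np.isnan(fine), fine, coarse_on_fine)
print("fine cells kept:", int(dense_enough.sum()), "of", dense_enough.size)
```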

4.
The bathymetric data plotting interface unit developed here (hereafter, the interface unit) uses an MCS8031 single-chip microcomputer to manage its operation. It can read the data measured by a survey instrument and send them to a printer to be plotted as curves, or store the measured data in the interface unit's memory and later retrieve them in sequence for plotting once printing becomes possible. After development, the interface unit was connected to a CS-60 echo sounder and a nine-pin printer and used in a bathymetric survey of seabed topography at sea, with satisfactory results. The article also discusses applying this plotting method with other types of survey instruments and other printers.

5.
Over the past five decades, several approaches for estimating probabilities of extreme still water levels have been developed. Currently, different methods are applied not only on transnational but also on national scales, resulting in a heterogeneous level of protection. Applying different statistical methods can yield significantly different estimates of return water levels, but even the use of the same technique can produce large discrepancies, because there is subjective parameter choice at several steps in the model setup. In this paper, we compare probabilities of extreme still water levels estimated using the main direct methods (i.e. the block maxima method and the peaks over threshold method), considering a wide range of strategies to create the extreme value datasets and a range of different model setups. We primarily use tide gauge records from the German Bight but also consider data from sites around the UK and Australia for comparison. The focus is on testing the influence of the following three main factors, which can affect the estimates of extreme value statistics: (1) detrending the original data sets; (2) building samples of extreme values from the original data sets; and (3) the record lengths of the original data sets. We find that using different detrending techniques biases the results of the extreme value statistics. Hence, we recommend using a 1-year moving average of high waters (or hourly records if these are available) to correct the original data sets for seasonal and long-term sea level changes. Our results highlight that the peaks over threshold method yields more reliable and more stable (i.e. using short records leads to the same results as when using long records) estimates of probabilities of extreme still water levels than the block maxima method. In analysing a variety of threshold selection methods we find that using the 99.7th percentile water level leads to the most stable return water level estimates along the German Bight. This also holds for the international stations considered. Finally, to provide guidance for coastal engineers and operators, we recommend the peaks over threshold method and define an objective approach for setting up the model. If this is applied routinely around a country, it will help overcome the problem of heterogeneous levels of protection resulting from different methods and varying model setups.
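A hedged sketch of the two direct methods compared above, applied to a synthetic hourly water-level record: annual (block) maxima fitted with a GEV distribution versus exceedances of the 99.7th-percentile threshold fitted with a Generalized Pareto Distribution. The synthetic record, the 72-hour declustering window and the distribution fits are illustrative assumptions, not the paper's data or exact setup.

```python
import numpy as np
from scipy.stats import genextreme, genpareto

rng = np.random.default_rng(3)

# Synthetic hourly still water levels for 40 years (illustrative, not tide-gauge data).
years, hours = 40, 40 * 365 * 24
level = 1.0 * np.sin(2 * np.pi * np.arange(hours) / 12.42) + rng.gumbel(0.0, 0.3, hours)

T = 100.0                                   # target return period in years

# Block maxima: one maximum per year, fitted with a GEV distribution.
annual_max = level.reshape(years, -1).max(axis=1)
c_bm, loc_bm, scale_bm = genextreme.fit(annual_max)
rl_bm = genextreme.ppf(1.0 - 1.0 / T, c_bm, loc=loc_bm, scale=scale_bm)

# Peaks over threshold: 99.7th percentile threshold, GPD fit to the excesses,
# crude declustering by keeping only peaks separated by at least 72 hours.
u = np.percentile(level, 99.7)
idx = np.flatnonzero(level > u)
keep = [idx[0]]
for i in idx[1:]:
    if i - keep[-1] > 72:
        keep.append(i)
excess = level[keep] - u
c_gp, _, scale_gp = genpareto.fit(excess, floc=0.0)
lam = len(excess) / years                   # mean number of independent peaks per year
rl_pot = u + genpareto.ppf(1.0 - 1.0 / (lam * T), c_gp, loc=0.0, scale=scale_gp)

print(f"{T:.0f}-year level, block maxima: {rl_bm:.2f}")
print(f"{T:.0f}-year level, POT (u = 99.7th pct): {rl_pot:.2f}")
```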

6.
A method which utilizes the lateral offset information obtained by comparing swath bathymetric data at track crossover points as a further constraint on the navigation is presented. The method, based on generalized least squares inversion theory, derives a new navigational solution that minimizes the overall misfit between the pairs of topography at crossovers while trying to remain smooth and close to the starting model. To achieve high numerical efficiency during inversions of large matrices, we employed sparse matrix algorithms. The inversion scheme was applied to a set of Sea Beam data collected over the East Pacific Rise near 9° 30' N in early 1988, at a time when the Global Positioning System had limited coverage. The starting model was constructed by taking evenly spaced samples of positions along the tracklines. For each of the 361 crossovers, we gridded the bathymetric data around the crossover point, compared the gridded maps, and calculated the offset and the uncertainty associated with this estimate. A suite of inversion solutions was obtained depending on the choice of three free parameters (that is, the a priori model variance, the correlation interval of the a priori model, and the trade-off coefficient between fitting the data and remaining close to the a priori model). The best solution was chosen as the one that minimizes both the Sea Beam topography and free-air gravity anomaly differences at crossovers. The improvement was significant: the initial rms mismatches in topography and in free-air gravity anomalies at crossovers were reduced from 610 m to 75 m and from 2.5 mGal to 1.9 mGal, respectively.
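A hedged, heavily simplified sketch of the crossover-adjustment idea with sparse matrices: the unknowns are navigation corrections, each crossover contributes one observation of the difference between two corrections, and a damped least-squares solution (scipy's lsqr) stands in for the generalized least squares inversion with its a priori covariance and smoothness constraints. The geometry and noise levels are synthetic assumptions.

```python
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(4)

n_nav = 500                                  # navigation samples along the tracklines
n_xover = 361                                # crossovers, as in the survey described above

# True (unknown) navigation errors we would like to recover, in metres.
true_err = np.cumsum(rng.normal(0, 2.0, n_nav))

# Each crossover compares two navigation samples i and j; the measured lateral offset
# is (error_i - error_j) plus noise from comparing the gridded bathymetry patches.
i_idx = rng.integers(0, n_nav, n_xover)
j_idx = rng.integers(0, n_nav, n_xover)
offset = true_err[i_idx] - true_err[j_idx] + rng.normal(0, 5.0, n_xover)

# Sparse design matrix: one +1 and one -1 per crossover row.
G = lil_matrix((n_xover, n_nav))
for k in range(n_xover):
    G[k, i_idx[k]] += 1.0
    G[k, j_idx[k]] -= 1.0
G = G.tocsr()

# Damped least squares: the damping keeps the solution close to the starting model
# (zero correction), standing in for the a priori model covariance of the paper.
sol = lsqr(G, offset, damp=0.1)[0]

pred = G @ sol
print("rms crossover misfit before adjustment:", np.sqrt(np.mean(offset ** 2)))
print("rms crossover misfit after adjustment: ", np.sqrt(np.mean((offset - pred) ** 2)))
```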

8.
Classifying marine-science literature with supervised machine learning typically requires a large amount of manual labeling. To address this, a co-training method from semi-supervised learning is applied to reduce the labeling effort. The method trains a separate classifier on each of two views; starting from a small number of labeled documents, it extracts useful information from a large pool of unlabeled documents, improves both classifiers through co-training, and produces a final classification model. Experiments show that with only two manually labeled documents, the final classification performance is very close to that of a supervised classifier trained on more than 1,500 manually labeled documents. This indicates that applying co-training to marine literature classification can greatly reduce the manual labeling effort while maintaining good classification performance.
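A toy, hypothetical sketch of two-view co-training with scikit-learn, not the paper's system: the two views are taken here to be the title and the body of each document (an assumption), each view trains a Naive Bayes classifier, and in each round the most confidently predicted unlabeled document from one view is labeled for the other.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB

# Toy corpus standing in for marine-science documents.
titles = ["ocean tide model", "seafloor mapping sonar", "plankton bloom survey",
          "multibeam bathymetry grid", "coastal tide gauge record", "benthic species count"]
bodies = ["harmonic analysis of tidal constituents", "swath sonar imagery of the seabed",
          "chlorophyll sampling during a spring bloom", "gridding of multibeam soundings",
          "sea level observations at a harbour", "abundance of benthic invertebrates"]
labels = np.array([0, 1, -1, -1, -1, -1])        # 0 = physical, 1 = mapping, -1 = unlabeled

vec_a, vec_b = TfidfVectorizer(), TfidfVectorizer()
Xa, Xb = vec_a.fit_transform(titles), vec_b.fit_transform(bodies)

y = labels.copy()
for _ in range(3):                               # a few co-training rounds
    lab = np.flatnonzero(y != -1)
    clf_a = MultinomialNB().fit(Xa[lab], y[lab])
    clf_b = MultinomialNB().fit(Xb[lab], y[lab])
    unlab = np.flatnonzero(y == -1)
    if unlab.size == 0:
        break
    for clf, X in ((clf_a, Xa), (clf_b, Xb)):
        if unlab.size == 0:
            break
        # Each classifier labels the unlabeled document it is most confident about;
        # the new label is then available to the other view in the next fit.
        proba = clf.predict_proba(X[unlab])
        best = unlab[int(np.argmax(proba.max(axis=1)))]
        y[best] = int(clf.predict(X[best])[0])
        unlab = np.flatnonzero(y == -1)

print("labels after co-training:", y)
```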

9.
The existing studies of the Azores triple junction, although based on specific geological or geophysical data, largely rely upon morphological considerations. However, there is no systematic bathymetric coverage of this area, and the quality of the available bathymetric charts does not allow consistent morpho-structural analysis. In this work we present a new bathymetric grid elaborated from all the available data sources in an area between 24° W and 32° W and between 36° N and 41° N. The basic data set corresponds to the merge of NGDC data with new swath profiles. This new map, included as an Appendix, combined with other results from seismology and neotectonics, is the basis for the study of the morpho-structural pattern of the Azores area, the present-day stress field and its implications for the current view of the Azores geodynamics. As a major result, we conclude that the Azores region is controlled by two sets of conjugate faults with 120° and 150° strikes that establish the framework for the onset of volcanism, expressed as linear volcanic ridges or as point-source volcanism. This interaction develops what can be considered the morphological signature of the segmentation of the Azores spreading axis. We argue that the Azores domain, presently in a broad transtensional regime, is acting simultaneously as an ultra-slow spreading centre and as a transfer zone between the MAR and the dextral Gloria Fault, as it accommodates the differential shear movement between the Eurasian and African plates.

10.
Automated threshold selection methods for extreme wave analysis
The study of the extreme values of a variable such as wave height is very important in flood risk assessment and coastal design. Often values above a sufficiently large threshold can be modelled using the Generalized Pareto Distribution, the parameters of which are estimated using maximum likelihood. There are several popular empirical techniques for choosing a suitable threshold, but these require the subjective interpretation of plots by the user. In this paper we present a pragmatic automated, simple and computationally inexpensive threshold selection method based on the distribution of the difference of parameter estimates when the threshold is changed, and apply it to a published rainfall data set and a new wave-height data set. We assess the effect of the uncertainty associated with our threshold selection technique on return level estimation by using the bootstrap procedure. We illustrate the effectiveness of our methodology by a simulation study and compare it with the approach used in the JOINSEA software. In addition, we present an extension that allows the selected threshold to depend on the value of a covariate such as the cosine of wave direction.
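A simplified, hypothetical sketch in the spirit of the approach described above, not the paper's exact algorithm: scan candidate thresholds, fit the GPD to the excesses, track how the shape and the modified scale change as the threshold changes, pick the lowest threshold above which the changes stay small, and bootstrap the return level above that threshold. The synthetic record, the stability rule and the 3-hourly sampling are assumptions.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(5)

# Synthetic significant-wave-height record (illustrative only).
hs = rng.gumbel(2.0, 0.6, 20000)
obs_per_year = 2920                                   # assumed 3-hourly observations
years = hs.size / obs_per_year

def fit_gpd(u):
    exc = hs[hs > u] - u
    c, _, s = genpareto.fit(exc, floc=0.0)
    return c, s - c * u                               # shape and threshold-invariant modified scale

# Scan candidate thresholds; both parameters should be roughly constant once the
# threshold is high enough for the GPD to hold.
cands = np.quantile(hs, np.linspace(0.85, 0.995, 30))
fits = np.array([fit_gpd(u) for u in cands])
steps = np.abs(np.diff(fits, axis=0))                 # parameter change between thresholds
stable = steps < np.median(steps, axis=0)             # crude stability rule (an assumption)
pick = next((i for i in range(steps.shape[0]) if stable[i:].all()), 0)
u_star = cands[pick]

# Bootstrap the 100-year return level above the selected threshold.
T, rls = 100.0, []
exc = hs[hs > u_star] - u_star
lam = exc.size / years                                # exceedances per year
for _ in range(200):
    boot = rng.choice(exc, exc.size, replace=True)
    c, _, s = genpareto.fit(boot, floc=0.0)
    rls.append(u_star + genpareto.ppf(1 - 1 / (lam * T), c, loc=0.0, scale=s))
print(f"threshold: {u_star:.2f}, 100-yr return level: {np.mean(rls):.2f} +/- {np.std(rls):.2f}")
```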

11.
12.
In this paper we examine the use of bathymetric sidescan sonar for automatic classification of seabed sediments. Bathymetric sidescan sonar, here implemented through a small receiver array, retains the speed advantage of sidescan by illuminating large swaths, but also enables the gathered data to be located spatially. The spatial location allows the image intensity to be corrected for depth and insonification angle, thus improving the use of the sonar for identifying changes in seafloor sediment. We investigate automatic tools for seabed recognition, using wavelets to analyse the image of Hopvågen Bay in Norway. We use the back-propagation elimination algorithm to determine the most significant wavelet features for discrimination. We show that the selected features agree well with the grab-sample results in the survey under study and can be used in a classifier to discriminate between different seabed sediments.
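A hedged sketch of the general wavelet-feature pipeline, not the authors' implementation: decompose intensity-corrected image patches with a 2-D wavelet transform (PyWavelets), use subband energies as features, and train a classifier. The feature-ranking step here uses a generic random-forest importance instead of the paper's back-propagation elimination algorithm, and the patches are synthetic.

```python
import numpy as np
import pywt
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(6)

def wavelet_features(patch, wavelet="db4", level=3):
    """Energy of each wavelet subband of a sidescan image patch."""
    coeffs = pywt.wavedec2(patch, wavelet, level=level)
    feats = [np.mean(coeffs[0] ** 2)]
    for cH, cV, cD in coeffs[1:]:
        feats += [np.mean(cH ** 2), np.mean(cV ** 2), np.mean(cD ** 2)]
    return np.array(feats)

# Synthetic stand-ins for intensity-corrected patches of two sediment textures:
# smooth (mud/sand) versus rough (gravel).
def make_patch(rough):
    base = rng.normal(0, 1.0 if rough else 0.2, (64, 64))
    return 100 + np.cumsum(base, axis=1) * 0.05 + rng.normal(0, 0.5, (64, 64))

X = np.array([wavelet_features(make_patch(k)) for k in ([0] * 100 + [1] * 100)])
y = np.array([0] * 100 + [1] * 100)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X[::2], y[::2])
print("held-out accuracy:", clf.score(X[1::2], y[1::2]))
print("most informative subbands:", np.argsort(clf.feature_importances_)[::-1][:3])
```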

13.
Using the positioning equipment of a tide-gauge-free bathymetric survey system, two sets of positioning data, one static and one dynamic, were collected, and the baseline solutions for different baseline lengths were analyzed from both the static and the dynamic perspectives. The results show that for baseline lengths within 40 km, the positioning results fully satisfy the ±10 cm accuracy requirement of the tide-gauge-free bathymetric survey system.

14.
In 1979, the General Bathymetric Chart of the Oceans (GEBCO) published Sheet 5.17 in the Fifth Edition of its series of global bathymetric maps. Sheet 5.17 covered the northern polar region above 64° N, and was long the authoritative portrayal of Arctic bathymetry. The GEBCO compilation team had access to an extremely sparse sounding database from the central Arctic Ocean, due to the difficulty of mapping in this permanently ice-covered region. In the past decade, there has been a substantial increase in the database of central Arctic Ocean bathymetry, due to the declassification of sounding data collected by US and British Navy nuclear submarines, and to the capability of modern icebreakers to measure ocean depths in heavy ice conditions. From these data sets, evidence has mounted to indicate that many of the smaller (and some larger) bathymetric features of Sheet 5.17 were poorly or wrongly defined. Within the framework of the project to construct the International Bathymetric Chart of the Arctic Ocean (IBCAO), all available historic and modern data sets were compiled to create a digital bathymetric model. In this paper, we compare both generally and in detail the contents of GEBCO Sheet 5.17 and version 1.0 of IBCAO, two bathymetric portrayals that were created more than 20 years apart. The results should be helpful in the analysis and assessment of previously published studies that were based on GEBCO Sheet 5.17.

15.
Seamounts and knolls are 'undersea mountains', the former rising more than 1000 m from the seafloor. These features provide important habitats for aquatic predators, demersal deep-sea fish and benthic invertebrates. However, most seamounts have not been surveyed and their numbers and locations are not well known. Previous efforts to locate and quantify seamounts have used relatively coarse bathymetry grids. Here we use global bathymetric data at 30 arc-sec resolution to identify seamounts and knolls. We identify 33,452 seamounts and 138,412 knolls, representing the largest global set of identified seamounts and knolls to date. We compare estimated seamount numbers, locations, and depths with validation sets of seamount data from New Zealand and the Azores. This comparison indicates that the method we apply finds 94% of seamounts, but may overestimate seamount numbers along ridges and in areas where faulting and seafloor spreading create highly complex topography. The seamounts and knolls identified herein are significantly geographically biased towards areas surveyed with ship-based soundings. As only 6.5% of the ocean floor has been surveyed with soundings, it is likely that new seamounts will be uncovered as surveying improves. Seamount habitats constitute approximately 4.7% of the ocean floor, whilst knolls cover 16.3%. The regional distribution of these features is examined, and we find a disproportionate number of productive knolls, with a summit depth of <1.5 km, located in the Southern Ocean. Less than 2% of seamounts are within marine protected areas, and the majority of these are located within exclusive economic zones, with few on the High Seas. The database of seamounts and knolls resulting from this study will be a useful resource for researchers and conservation planners.
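A minimal, hypothetical sketch of identifying such features from a bathymetry grid, not the authors' algorithm: estimate a regional background seafloor with a broad median filter, flag cells that rise more than 1000 m above it as seamount candidates, and label connected regions. The 200 m lower bound used for knolls, the window size and the synthetic grid are assumptions.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(7)

# Synthetic bathymetry grid (depths positive-down, metres) standing in for a tile of a
# 30 arc-second global grid; two Gaussian bumps play the role of a seamount and a knoll.
ny, nx = 200, 200
yy, xx = np.mgrid[0:ny, 0:nx]
depth = 4000 - 1500 * np.exp(-((xx - 60) ** 2 + (yy - 70) ** 2) / 200.0) \
             -  600 * np.exp(-((xx - 150) ** 2 + (yy - 140) ** 2) / 300.0) \
             + rng.normal(0, 20, (ny, nx))

# Regional "background" seafloor: a broad median filter; the window size is an
# assumption that controls what counts as the surrounding seafloor.
background = ndimage.median_filter(depth, size=61)
relief = background - depth                      # positive where terrain rises above it

seamount_mask = relief > 1000.0                  # seamount definition used in the abstract
knoll_mask = (relief > 200.0) & ~seamount_mask   # assumed lower bound for knolls

n_seamounts = ndimage.label(seamount_mask)[1]
n_knolls = ndimage.label(knoll_mask)[1]
print("seamount features:", n_seamounts, " knoll features:", n_knolls)
print("seamount area fraction:", seamount_mask.mean(), " knoll area fraction:", knoll_mask.mean())
```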

16.
Several bathymetric data sets are compared and assessed under the constraints of an ocean current model and velocity observations. The root-mean-square (rms) differences among the data sets reach 20 m in the shallow Tsushima/Korea Straits. Numerical experiments to simulate the Tsushima Warm Current are performed using four different topography data sets. The JTOPO1 data (MIRC, 2003) give the smallest rms difference from long-term horizontal velocity observations. Several least-squares combinations of the topography data sets are then sought to minimize the rms difference between the observed and modeled barotropic velocities. Most of the data sets reveal a large bias of 30–60 m at the Western Channel compared to independent sounding depths.
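A hedged sketch of what a least-squares combination of topography data sets could look like, under the simplifying (and assumed) premise that the modeled barotropic velocity responds roughly linearly to the blended topography, so that model runs driven by the individual data sets can be combined with weights that sum to one. The numbers are synthetic; this is not the paper's model setup.

```python
import numpy as np

rng = np.random.default_rng(8)

# Modeled barotropic velocities at the observation sites, one column per run of the
# ocean model driven by a different topography data set (synthetic values).
n_obs, n_sets = 120, 4
A = rng.normal(0.3, 0.1, (n_obs, n_sets))
true_w = np.array([0.5, 0.3, 0.15, 0.05])
v_obs = A @ true_w + rng.normal(0, 0.02, n_obs)      # "observed" velocities

# Least-squares weights constrained to sum to one (Lagrange-multiplier formulation).
K = np.zeros((n_sets + 1, n_sets + 1))
K[:n_sets, :n_sets] = 2 * A.T @ A
K[:n_sets, n_sets] = 1.0
K[n_sets, :n_sets] = 1.0
rhs = np.concatenate([2 * A.T @ v_obs, [1.0]])
w = np.linalg.solve(K, rhs)[:n_sets]

rms = np.sqrt(np.mean((v_obs - A @ w) ** 2))
print("weights:", np.round(w, 3), " rms velocity misfit:", round(float(rms), 4))
```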

17.
The effectiveness of the time-reduction procedure is critical for evaluating the information content of high-definition (HD) geomagnetic marine surveys. The use of magnetogradiometers is a good solution for time reduction in regional offshore studies, but for technical reasons it is often not practicable in very detailed coastal, canal or harbour surveys, where the use of coastal base stations may be preferable. On the other hand, the uncertainty of the full transferability of the coastal magnetograms to large areas of marine surveys can corrupt the time-reduced data sets with a residual geomagnetic time component. The phenomenon is related to the distance from the coastal observatory and to the homogeneity of the local magnetic characteristics of the crust. The maximum applicability distance of the time-line correction (TL) is usually evaluated only qualitatively, which is shown to be inadequate for geomagnetic marine surveys in high definition. In this work we present a quantitative method to evaluate the stability of the coastal observatory magnetograms over the nearby marine area, together with the numerical degree of precision of the correction. The method is based on a double survey of the same profile (timer track: TT) at two different times. The surveys produce two different raw data sets whose difference is related only to the geomagnetic time variations. Using the coastal observatory magnetograms we time-reduce the two data sets: if the coastal observatory magnetograms are fully coherent over the whole survey area, the difference between the two reduced data sets will be zero. However, if the time variations measured at the observatory are inadequate in amplitude or phase to model the corresponding time variations along the surveyed profile (TT), the discrepancy between the time-reduced data sets will not be zero. Similar TTs starting on various courses from the base station permit the surveyed area to be split into sectors with a variable degree of time coherence, and a degree of precision to be assigned to the time-reduced survey.
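A minimal sketch of the timer-track idea with synthetic numbers, not the authors' processing chain: the same profile is measured twice at different times, both passes are time-reduced with the coastal observatory magnetogram, and the residual difference between the two reduced passes quantifies how well the observatory record models the time variations over that profile.

```python
import numpy as np

rng = np.random.default_rng(9)

# Times (s) and total-field readings (nT): coastal observatory magnetogram sampled once
# per minute, plus two passes over the same 300-point profile (all values synthetic).
t_obs = np.arange(0, 7200, 60.0)
obs = 47000.0 + 20.0 * np.sin(2 * np.pi * t_obs / 7200.0) + rng.normal(0, 0.5, t_obs.size)

profile_anomaly = 50.0 * np.exp(-((np.arange(300) - 150) / 40.0) ** 2)
t1 = np.arange(0, 300, 1.0) * 2.0                              # first pass
t2 = 3600.0 + np.arange(0, 300, 1.0) * 2.0                     # second pass, one hour later

def field_at(t):
    # survey reading = crustal anomaly + time variation + a residual component the
    # observatory does not see (the lateral incoherence the method is meant to detect)
    residual = 3.0 * np.sin(2 * np.pi * t / 1800.0)
    return 47000.0 + profile_anomaly + 20.0 * np.sin(2 * np.pi * t / 7200.0) + residual

pass1, pass2 = field_at(t1), field_at(t2)

# Time reduction: subtract the observatory magnetogram interpolated to the survey times.
red1 = pass1 - np.interp(t1, t_obs, obs - obs.mean())
red2 = pass2 - np.interp(t2, t_obs, obs - obs.mean())

# If the observatory record were fully transferable, red1 - red2 would be ~zero.
diff = red1 - red2
print("rms discrepancy between time-reduced passes (nT):", np.sqrt(np.mean(diff ** 2)))
```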

18.
A Bayesian network model has been developed to simulate a relatively simple problem of wave propagation in the surf zone (detailed in Part I). Here, we demonstrate that this Bayesian model can provide both inverse modeling and data-assimilation solutions for predicting offshore wave heights and depth estimates given limited wave-height and depth information from an onshore location. The inverse method is extended to allow data assimilation using observational inputs that are not compatible with deterministic solutions of the problem. These inputs include sand bar positions (instead of bathymetry) and estimates of the intensity of wave breaking (instead of wave-height observations). Our results indicate that wave breaking information is essential to reduce prediction errors. In many practical situations, this information could be provided from a shore-based observer or from remote-sensing systems. We show that various combinations of the assimilated inputs significantly reduce the uncertainty in the estimates of water depths and wave heights in the model domain. Application of the Bayesian network model to new field data demonstrated significant predictive skill (R2 = 0.7) for the inverse estimate of a month-long time series of offshore wave heights. The Bayesian inverse results include uncertainty estimates that were shown to be most accurate when given uncertainty in the inputs (e.g., depth and tuning parameters). Furthermore, the inverse modeling was extended to directly estimate tuning parameters associated with the underlying wave-process model. The inverse estimates of the model parameters not only showed an offshore wave height dependence consistent with results of previous studies but the uncertainty estimates of the tuning parameters also explain previously reported variations in the model parameters.
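A toy illustration of the probabilistic inverse step, far simpler than the Bayesian network described above: given a qualitative observation of wave-breaking intensity from shore, Bayes' rule updates a discretized prior over offshore wave height and yields both an estimate and its uncertainty. The bins, the prior and the likelihood table are invented for illustration.

```python
import numpy as np

# Discretized offshore wave-height bins (m) and a flat prior over them (assumed).
H = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
prior = np.full(H.size, 1.0 / H.size)

# Assumed likelihood table P(observed breaking class | offshore height):
# rows = breaking class (none, moderate, intense), columns = offshore height bin.
likelihood = np.array([
    [0.70, 0.40, 0.20, 0.10, 0.05, 0.02],   # none
    [0.25, 0.45, 0.50, 0.40, 0.30, 0.18],   # moderate
    [0.05, 0.15, 0.30, 0.50, 0.65, 0.80],   # intense
])

observed_class = 2                           # shore-based observer reports intense breaking
posterior = likelihood[observed_class] * prior
posterior /= posterior.sum()

mean_h = float(np.sum(posterior * H))
std_h = float(np.sqrt(np.sum(posterior * (H - mean_h) ** 2)))
print("posterior over offshore wave height:", np.round(posterior, 3))
print(f"inverse estimate: {mean_h:.2f} m +/- {std_h:.2f} m")
```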

19.
The multi-source, heterogeneous, and massive nature of seabed topographic survey products is analyzed, and it is pointed out that properly constructing the sequence of crossover discrepancies is the key to data quality assessment and refinement. To avoid the uncertainty that distortion in gridded results may introduce into the assessment, and to fully and accurately extract the gross-error and systematic-bias information implied by the crossover discrepancies, it is proposed that the seabed terrain surface be built from a triangulated irregular network and that the discrepancies at all depth crossover points be computed. The results show that the sequence of crossover discrepancies constructed in this way is unique, and on this basis the errors potentially hidden in the seabed terrain product can be further detected and mitigated.
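A hedged sketch of computing crossover discrepancies from a triangulated surface, using synthetic soundings rather than real survey products: one survey's soundings are triangulated (Delaunay-based linear interpolation), the surface is evaluated at the other survey's positions, and the differences form the discrepancy sequence used to detect systematic bias and gross errors.

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator

rng = np.random.default_rng(10)

def seabed(x, y):
    return 30 + 0.01 * x + 2.0 * np.sin(y / 80.0)

# Two overlapping surveys of the same area (synthetic soundings; survey B carries a bias).
xy_a = rng.uniform(0, 1000, (4000, 2))
z_a = seabed(xy_a[:, 0], xy_a[:, 1]) + rng.normal(0, 0.10, 4000)
xy_b = rng.uniform(0, 1000, (1500, 2))
z_b = seabed(xy_b[:, 0], xy_b[:, 1]) + 0.25 + rng.normal(0, 0.10, 1500)   # 0.25 m bias

# Triangulated (TIN) surface from survey A, evaluated at survey B's sounding positions.
tin_a = LinearNDInterpolator(xy_a, z_a)
z_a_at_b = tin_a(xy_b)

# Crossover discrepancy sequence: survey B depth minus the TIN surface of survey A.
d = z_b - z_a_at_b
d = d[~np.isnan(d)]                      # drop points outside the triangulation

print("mean discrepancy (systematic bias, m):", round(float(np.mean(d)), 3))
print("std of discrepancies (m):", round(float(np.std(d)), 3))
print("possible gross errors (> 3 sigma):", int(np.sum(np.abs(d - d.mean()) > 3 * d.std())))
```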

20.
A method for evaluating the results of reconstructing the trajectories of drifting buoys is proposed. It is based on comparing estimates of the power spectral densities of the current-velocity components computed from three data sets: the coordinates of a drifting buoy with a built-in GPS receiver; the coordinates derived from the first set by introducing the gaps and observational errors characteristic of actual Doppler-based trajectory measurements by the Argos satellite location and data collection system; and the coordinates obtained by interpolating the second set. As an example, we describe the implementation of the proposed method and demonstrate that its application improves the reliability of the reconstructed drifting-buoy trajectories.
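A hedged sketch of the spectral comparison, using a synthetic trajectory instead of real buoy data: positions are differentiated to velocities for the reference (GPS-like) set and for a degraded, then interpolated, set, and Welch power spectral densities of a velocity component are compared to judge how much of the oscillatory variance the reconstruction retains. Sampling rates, error levels and the interpolation scheme are assumptions.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(11)

dt = 3600.0                                            # 1-hour GPS fixes (assumed)
t = np.arange(0, 30 * 24) * dt                         # 30 days
# Synthetic drifter east coordinate: mean drift plus an inertial-like oscillation (metres).
x_gps = 0.05 * t + 800.0 * np.sin(2 * np.pi * t / (17 * 3600.0)) + \
        np.cumsum(rng.normal(0, 30.0, t.size))

# "Argos-like" set: irregular gaps and larger location errors, then linear interpolation
# back onto the regular time axis (a simple stand-in for the reconstruction being assessed).
keep = np.sort(rng.choice(t.size, int(0.4 * t.size), replace=False))
x_argos = x_gps[keep] + rng.normal(0, 350.0, keep.size)
x_interp = np.interp(t, t[keep], x_argos)

def u_psd(x):
    u = np.gradient(x, dt)                             # east velocity component (m/s)
    return welch(u, fs=1.0 / dt, nperseg=256)

f, p_gps = u_psd(x_gps)
_, p_int = u_psd(x_interp)

# The PSD ratio near the oscillation band shows how much variance the reconstruction keeps.
band = (f > 1.0 / (24 * 3600)) & (f < 1.0 / (10 * 3600))
print("variance ratio (reconstructed / GPS) in the oscillation band:",
      round(float(p_int[band].sum() / p_gps[band].sum()), 2))
```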
