Similar documents (20 results)
1.
A 9 arc-second grid of gravimetric terrain corrections has been computed over Australia using version 2 of the GEODATA 9 arc-second digital elevation model (DEM) and the fast Fourier transform technique. This supersedes the 27 arc-second grid previously reported in this journal, which was computed from the version 1 GEODATA DEM. The improved resolution is possible because of the removal of errors present in the version 1 DEM.
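The "fast Fourier transform technique" mentioned here replaces the O(N²) spatial convolution of a DEM with a distance-dependent kernel by an O(N log N) frequency-domain product. A minimal, generic sketch of that speed trick (not the authors' code; the grid and kernel values below are illustrative):

```python
import numpy as np

def fft_convolve2d(grid, kernel):
    # Full linear 2-D convolution via zero-padded FFTs -- the core of
    # FFT-based terrain-correction computation.
    ny = grid.shape[0] + kernel.shape[0] - 1
    nx = grid.shape[1] + kernel.shape[1] - 1
    spec = np.fft.rfft2(grid, (ny, nx)) * np.fft.rfft2(kernel, (ny, nx))
    return np.fft.irfft2(spec, (ny, nx))

# Tiny demonstration grid and a radial kernel (stand-in for a 1/r-type kernel).
rng = np.random.default_rng(0)
dem = rng.random((8, 8))
y, x = np.mgrid[-2:3, -2:3]
kern = 1.0 / (1.0 + np.hypot(x, y))
fast = fft_convolve2d(dem, kern)
```

The frequency-domain result is identical (to rounding) to the direct spatial convolution, but scales far better as the DEM grows.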

2.
The Mellin transform is a mathematical tool which has been applied in many areas of mathematics, physics and engineering. Its application in geophysics is in the solution of potential-field problems for the determination of the mass, as well as the depth to the basement, of some solid mineral deposits. In this study, the Mellin transform is used to determine the depth to the top (h) and the depth to the bottom (H) of the basement beneath a profile across an anomalous magnetic body. Ibuji, the study area, is located in Ifedore Local Government Area of Ondo State, Nigeria; it is underlain by Precambrian basement-complex rocks and bounded by the geographical coordinates 5°00′00″ E to 5°04′30″ E and 7°24′00″ N to 7°27′36″ N. The magnetic anomaly profile due to a two-dimensional body (vertical thin sheet) over a magnetic spring in the study area was digitised, and the values of magnetic amplitude (nT) against horizontal distance (at intervals of 5 m) obtained from the digitised profile were then used to compute the Mellin transform using Matlab programs. In order to determine the depths H and h, the amplitudes were evaluated at three arbitrary points (s = ¼, ½ and ¾), with 0 < s < 1, where s is the transform variable, taken here to be real and positive. The value obtained for H was 47.95 m, which compares favourably with results obtained using other methods. The value obtained for h, however, is subject to a convergence restriction: at lower values of s the computation diverges, while at higher values of s (about 0.9) it converges, giving h = 32.56 m. The Ibuji magnetic anomaly was therefore analysed to have a depth to the bottom (H) of 47.95 m and a depth to the top (h) of 32.56 m using this mathematical tool.
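The Mellin transform of a sampled profile can be evaluated numerically as M{f}(s) = ∫ f(x) x^(s−1) dx over the sampled support. A minimal sketch (not the authors' Matlab implementation), sanity-checked against the analytic pair M{e^(−x)}(s) = Γ(s):

```python
import numpy as np

def mellin(f_vals, x, s):
    # Numerical Mellin transform M{f}(s) = integral of f(x) * x**(s-1) dx,
    # computed over the sampled support by the trapezoidal rule.
    y = f_vals * x ** (s - 1.0)
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

# Check at s = 1, where the exact value is Gamma(1) = 1.
x = np.linspace(1e-6, 50.0, 200_001)
m1 = mellin(np.exp(-x), x, 1.0)
```

In practice the transform would be applied to the digitised amplitude-versus-distance samples at the chosen values of s (¼, ½, ¾), then inverted algebraically for the depth parameters.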

3.
To assess how the grid spacing of elevation data affects the accuracy of intermediate-zone terrain corrections computed by the surface-integral method, the vertical rectangular prism method and the mean-elevation vertical prism method, 450 survey stations in a test area were selected and intermediate-zone corrections were computed with elevation data of different grid spacings. Comparison shows that the accuracy of the surface-integral method is relatively insensitive to elevation grid spacing, whereas the vertical prism method is sensitive. The intermediate zone (50–2 000 m) was then divided into 10 annular intervals for computation; statistics show that about 45% and 30% of the error fall in the 50–200 m and 200–500 m intervals, respectively. Improving intermediate-zone terrain-correction accuracy therefore requires denser elevation grids within the 50–200 m and 200–500 m ranges.

4.
For regional and national studies, there is a pressing need to update the terrain corrections (TC) in the French gravity database. We have recomputed the TC for all the French gravity stations from 50 m out to a distance of 167 km. We compute the TC with a flat-top-prism algorithm and three DEMs with grid spacings of 50, 250 and 1000 m, used in the zones 53 m–3 km, 3–10 km and 10–167 km, respectively. Analysing the DEM/station Δz and comparing our results with those previously obtained in the Alps area, we estimate the accuracy of our TC to be better than 1 mGal. To cite this article: G. Martelet et al., C. R. Geoscience 334 (2002) 449–454.
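A flat-top-prism TC algorithm rests on the closed-form vertical attraction of a right rectangular prism (a Nagy-type formula). A sketch of that building block, checked against the infinite Bouguer-slab limit 2πGρt; the station position, density and prism extents below are illustrative assumptions, not values from the paper:

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def prism_gz(x1, x2, y1, y2, z1, z2, rho):
    # Vertical attraction (m/s^2) at the origin of a prism occupying
    # [x1,x2] x [y1,y2] x [z1,z2], z positive downward (Nagy-type closed form).
    total = 0.0
    for i, x in enumerate((x1, x2)):
        for j, y in enumerate((y1, y2)):
            for k, z in enumerate((z1, z2)):
                r = math.sqrt(x * x + y * y + z * z)
                sign = (-1.0) ** (i + j + k)
                total += sign * (x * math.log(y + r) + y * math.log(x + r)
                                 - z * math.atan2(x * y, z * r))
    return G * rho * total

# Wide, thin prism directly beneath the station: result should approach the
# Bouguer slab value 2*pi*G*rho*t ~ 1.12e-4 m/s^2 for rho = 2670, t = 100 m.
gz = prism_gz(-20e3, 20e3, -20e3, 20e3, 0.0, 100.0, 2670.0)
```

A TC code sums such prism effects (DEM cell by DEM cell) minus the slab term, which is why the slab limit is a natural correctness check.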

5.
Estimating palaeowind strength from beach deposits
The geological record of past wind conditions is well expressed in the coarse gravel, cobble and boulder beach deposits of Quaternary palaeolakes in the Great Basin of the western USA and elsewhere. This paper describes a technique, using the particle-size distribution of beach deposits, to reconstruct palaeowind conditions when the lakes were present. The beach particle technique (BPT) is first developed using coarse beach deposits from the 1986–87 highstand of the Great Salt Lake in Utah, combined with instrumental wind records from the same time period. Next, the BPT is used to test the hypothesis that wind conditions were more severe than at present during the last highstand of Lake Lahontan (≈ 13 ka), which lasted only a decade or two at most. The largest 50 beach clasts were measured at nine beach sites located along the north, west and south sides of Antelope Island in the Great Salt Lake, all of which formed in 1986–87. At these sites, the largest clast sizes range from 10 to 28 cm (b-axis), and fetch lengths range from 25 to 55 km. Nearshore wave height was calculated by assuming that the critical threshold velocity required to move the largest clasts represents a minimum estimate of the breaking-wave velocity, which is controlled by wave height. Shoaling transformations are undertaken to estimate deep-water wave heights and, ultimately, wind velocity. Wind estimates for the nine sites, using the BPT, range from 6.5 to 17.4 m s−1, which is in reasonable agreement with the instrumental record from Salt Lake City Airport. The same technique was applied to eight late Pleistocene beaches surrounding the Carson Sink sub-basin of Lake Lahontan, Nevada. Using the BPT, estimated winds for the eight sites range from 9.7 to 27.1 m s−1. The strongest winds were calculated for a cobble/boulder beach with a fetch of 25 km. Instrumental wind records for the 1992–99 period indicate that wind events of 9–12 m s−1 are common and that the strongest significant wind event (≥ 9 m s−1 for ≥ 3 h) reached an average velocity of 15.5 m s−1. Based on this preliminary comparison, it appears that the late Pleistocene western Great Basin was a windier place than at present, at least for a brief time.
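The first links of the BPT inference chain — a threshold velocity sufficient to entrain the largest clast, taken as a minimum breaking-wave velocity, then converted to a breaker height via u ≈ √(gH) — can be caricatured in a few lines. The entrainment coefficient and clast density below are assumptions for illustration; this is not the authors' calibrated procedure, which also includes shoaling transformations back to deep-water wave height and wind speed:

```python
import math

G_ACCEL = 9.81  # m/s^2

def threshold_velocity(d_clast, rho_s=2650.0, rho_w=1000.0, c=1.0):
    # Assumed entrainment criterion u^2 ~ c * g * d * (rho_s - rho_w) / rho_w.
    # c and rho_s are illustrative assumptions, not calibrated values.
    return math.sqrt(c * G_ACCEL * d_clast * (rho_s - rho_w) / rho_w)

def breaker_height(u):
    # Solitary-wave breaker criterion u ~ sqrt(g * H)  ->  H ~ u^2 / g.
    return u * u / G_ACCEL

u = threshold_velocity(0.28)   # 28 cm b-axis: the largest clast size quoted
h = breaker_height(u)
```

The point of the sketch is the monotonic chain: larger clasts imply faster threshold flow, hence larger breakers, hence stronger winds.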

6.
We report technical and data treatment methods for making accurate, high-precision measurements of 18O/16O in Ca–Mg–Fe garnet utilising the Cameca IMS 1280 multi-collector ion microprobe. Matrix effects were similar to those shown by previous work, whereby Ca abundance is correlated with instrumental mass fractionation (IMF). After correction for this effect, there appeared to be no significant secondary effect associated with Mg/Fe2+ for routine operational conditions. In contrast, investigation of the IMF associated with Mn- or Cr-rich garnet showed that these substitutions are significant and require a more complex calibration scheme. The Ca-related calibration applied to low-Cr, low-Mn garnet was reproducible across different sample mounts and under a range of instrument settings and therefore should be applicable to similar instruments of this type. The repeatability of the measurements was often better than ± 0.2‰ (2s), a precision that is similar to the repeatability of bulk techniques. At this precision, the uncertainties due to spot-to-spot repeatability were at the same magnitude as those associated with matrix corrections (± 0.1–0.3‰) and the uncertainties in reference materials (± 0.1–0.2‰). Therefore, it is necessary to accurately estimate and propagate uncertainties associated with these parameters – in some cases, uncertainties in reference materials or matrix corrections dominate the uncertainty budget.

7.
Oxygen air-water gas exchange was measured using floating chambers in two shallow tidal estuaries of differing bathymetry and local terrain, near Waquoit Bay, Massachusetts (United States). The specific chamber design permitted measurements of gas flux in 15 min, allowing analysis of the relationship with wind speed and tidal stage. Exchange coefficients ranged from 0.5 to 2.5 g O2 m−2 h−1 atm−1 (equivalent to piston velocities of 1.5 to 7 cm h−1) for wind speeds of 0.3 to 9 m s−1 at 10 m elevation. While the relationships for each estuary appear linear (significant linear regressions with wind speed were shown for each estuary, and the slopes were different at the 99.5% confidence level), the range of speeds differed at the two sites and an exponential function of wind speed was consistent with the combined data from both estuaries. A power function of wind speed was not an acceptable model. The exchange coefficients for our estuaries are from 57% to as low as 9% of that predicted by previously published generic equations. Because the atmospheric correction can be significant in shallow, metabolically active coastal waters, we suggest that empirically determined relationships for gas exchange versus wind for a specific estuary are preferable to the predictions of the general equations. While the floating chamber method should be used cautiously, at low wind speeds (below 8 m s−1) and in slowly flowing waters, it provides a convenient approach for quantifying these site-specific differences. The differences, especially those between shallow sheltered systems and the open waters best fit by some published relationships, are ecologically important and do not appear yet to be measurable by other methods.
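Fitting an exponential gas-exchange/wind relationship k = a·exp(b·U) reduces to linear least squares on ln(k). A generic sketch with invented, noiseless data (the coefficients below are illustrative, not the study's fitted values):

```python
import numpy as np

def fit_exponential(u, k):
    # Fit k = a * exp(b * u) by ordinary least squares on ln(k):
    # ln(k) = ln(a) + b * u is linear in u.
    b, ln_a = np.polyfit(u, np.log(k), 1)
    return np.exp(ln_a), b

# Invented wind speeds (m/s) and synthetic exchange coefficients.
u = np.array([0.5, 2.0, 4.0, 6.0, 8.0, 9.0])
k = 0.6 * np.exp(0.17 * u)
a_fit, b_fit = fit_exponential(u, k)
```

With real chamber data one would compare this fit against linear and power-law alternatives, as the authors did, before preferring the exponential form.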

8.
For many years, regardless of the level of detail of a gravity survey, computation of the Bouguer gravity anomaly has required both a terrain correction and an intermediate-layer (Bouguer slab) correction. By analysing how these two corrections compensate the gravity at each station, this paper proposes a terrain-correction method that dispenses with the separate intermediate-layer correction, to meet the needs of fine interpretation in microgravity surveying. The method is based on forward modelling of the actual terrain (and lithology): a gravity datum surface can be chosen according to the geological conditions of the survey area and a variable-density terrain correction applied. The method largely removes the gravity effects of topographic relief and inhomogeneous lithology at each station, yielding more reliable gravity-anomaly data.

9.
Experiments have been conducted in a shear cell in order to provide insight into the separation of flocs by size and density in a hydrocyclone. The size of the aggregates was measured after shearing at a rate comparable to that found within a typical hydrocyclone. Two types of coal tailings from the Hunter Valley, NSW, Australia, with average sizes of about 0.4 μm and 10 μm, were investigated. The size of aggregates after shearing was measured for a range of different polymeric flocculants of varying molecular weight and charge density. Under certain conditions 90% of the aggregates from the submicron tailings had sizes greater than 38 μm after shear at 1200 s−1 for 30 s. Over 90% of the aggregates produced from the 10 μm tailings were larger than 82 μm with the flocculants tested. The size of the aggregates remained sufficiently large after shearing to be suitable for hydrocyclone dewatering. The density of the aggregates was calculated from the aggregate size and mass fractal dimension. The small difference between the density of the aggregates and water was found to be the factor limiting their velocity under the applied centrifugal acceleration.
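Calculating aggregate density from size and mass fractal dimension commonly uses the scaling (ρ_agg − ρ_w) = (ρ_p − ρ_w)(d/d₀)^(Df−3). A sketch with illustrative values (the abstract does not give the actual fractal dimension or solid density used):

```python
def aggregate_density(d_agg, d_primary, rho_p, rho_w=1000.0, df=2.3):
    # Mass-fractal scaling: the excess density over water falls off as
    # (d_agg / d_primary) ** (Df - 3), since Df < 3 means a porous structure.
    return rho_w + (rho_p - rho_w) * (d_agg / d_primary) ** (df - 3.0)

# A 38 um aggregate built from 0.4 um primary coal particles
# (rho_p, df are illustrative assumptions).
rho_agg = aggregate_density(38.0, 0.4, rho_p=1400.0)
```

The result sits only slightly above the density of water, which is exactly the effect the abstract identifies as limiting settling velocity under centrifugal acceleration.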

10.
Intermediate-zone terrain-correction methods and their accuracy in regional gravity surveys
冯治汉 (Feng Zhihan), 《物探与化探》 (Geophysical and Geochemical Exploration), 2007, 31(5): 455–458
Building on a discussion of intermediate-zone terrain-correction methods, intermediate-zone correction values were computed for 764 stations of the 1:200 000 regional gravity survey of the western North Qilian area using 20 m × 20 m, 50 m × 50 m and 100 m × 100 m square compartments. By shifting the compartment grid and recomputing as a check, the degree to which each algorithm approximates the terrain and the accuracy of the computed corrections are discussed.

11.
By digitising the terrain body and its density information as digital images, computer simulation can compute the gravity field strength of the terrain at any point with good accuracy. For terrain correction of gravity anomalies, subtracting the simulated terrain gravity field strength at each station from the measured value yields the field strength after both the terrain correction and the intermediate-layer correction. The two corrections are thus completed in one step, which both simplifies the workflow and improves accuracy. For an example terrain body, the gravity field strength was computed both by digital-image simulation and by exact integration; the two results agree closely. Digital-image simulation of the terrain gravity field has small error and high accuracy, and the method is well suited to terrain correction of gravity anomalies.

12.
The present study evaluates several critical issues related to precision and accuracy of Cu and Zn isotopic measurements with application to estuarine particulate materials. Calibration of reference materials (such as the IRMM 3702 Zn) against the JMC Zn and NIST Cu reference materials was performed in wet and/or dry plasma modes (Aridus I and DSN-100) on a Nu Plasma MC-ICP-MS. Different mass bias correction methods were compared. More than 100 analyses of certified reference materials suggested that the sample-calibrator bracketing correction and the empirical external normalisation methods provide the most reliable corrections, with long-term external precisions of 0.06 and 0.07‰ (2SD), respectively. Investigation of the effect of variable analyte to spike concentration ratios on Zn and Cu isotopic determinations indicated that the accuracy of Cu measurements in dry plasma is very sensitive to the relative Cu and Zn concentrations, with deviations of δ65Cu from −0.4‰ (Cu/Zn = 4) to +0.4‰ (Cu/Zn = 0.2). A quantitative assessment (with instrumental mass bias corrections) of spectral and non-spectral interferences (Ti, Cr, Co, Fe, Ca, Mg, Na) was performed. Titanium and Cr were the most severe interfering constituents, contributing inaccuracies of −5.1‰ and +0.60‰ on δ68/64Zn, respectively (for 500 μg l−1 Cu and Zn standard solutions spiked with 1000 μg l−1 of Ti or Cr). Preliminary isotopic results were obtained on contrasting sediment matrices from the Scheldt estuary. Significant isotopic fractionation of zinc (from 0.21‰ to 1.13‰ for δ66Zn) and copper (from −0.38‰ to 0.23‰ for δ65Cu) suggests control by physical mixing of continental and marine water masses characterized by distinct Cu and Zn isotopic signatures. These results provide a stepping-stone to further evaluate the use of Cu and Zn isotopes as biogeochemical tracers in estuarine environments.
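Of the mass bias corrections compared, sample–calibrator bracketing is the simplest to state: each sample's instrumental ratio is normalised to the reference ratio interpolated (here, its midpoint) between the standards run immediately before and after it. A generic sketch with invented ratio values:

```python
def ssb_delta_permil(r_sample, r_std_before, r_std_after):
    # Sample-calibrator bracketing: the drifting reference ratio is
    # interpolated between the two bracketing standards, and the sample's
    # deviation from it is reported in permil (parts per thousand).
    r_ref = 0.5 * (r_std_before + r_std_after)
    return (r_sample / r_ref - 1.0) * 1000.0

# Invented raw 66Zn/64Zn ratios, drifting slowly between bracketing standards.
delta = ssb_delta_permil(0.56782, 0.56750, 0.56758)
```

The method implicitly assumes the instrumental mass bias drifts smoothly between the two standards, which is why it is typically cross-checked against external normalisation, as in this study.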

13.
Recent analytical developments in germanium stable isotope determination by multicollector ICP-MS have provided new perspectives for the use of Ge isotopes as geochemical tracers. Here, we report the germanium isotope composition of the NIST SRM 3120a elemental reference solution that has been calibrated relative to internal isotopic standard solutions used in previous studies. We also intercalibrate several geological reference materials as well as geological and meteoritic samples using different techniques, including online hydride generation and a spray chamber for sample introduction to MC-ICP-MS, and different approaches for mass bias correction such as sample–calibrator bracketing, external mass bias correction using Ga isotopes and double-spike normalisation. All methods yielded relatively similar precisions at around 0.1‰ (2s) for δ74/70Ge values. Using igneous and mantle-derived rocks, the bulk silicate Earth (BSE) δ74/70Ge value was re-evaluated to be 0.59 ± 0.18‰ (2s) relative to NIST SRM 3120a. Several sulfide samples were also analysed and yielded very negative values, down to −4.3‰, consistent with a recent theoretical study of Ge isotope fractionation. The strong heavy-isotope depletion in ore deposits also contrasts with the generally positive Ge isotope values found in many modern and ancient marine sediments.

14.
Scale is one of the most important but unsolved issues in the various scientific disciplines that deal with spatial data. The arbitrary choice of grid cell size for contour-interpolated digital elevation models (DEM) is one of the major sources of uncertainty in the hydrologic modelling process. In this paper, an attempt was made to identify methods for determining an optimum cell size for a contour-interpolated DEM prior to hydrologic modelling. Twenty-meter-interval contour lines were used to generate DEMs of five different resolutions, viz., 30, 45, 60, 75, and 90 m, using the TOPOGRID algorithm. The obtained DEMs were explored for their intrinsic quality using four different methods: sink analysis, fractal dimension of the derived stream network, entropy measurement and semivariogram modelling. These methods were applied to determine the level of artifacts (interpolation error) in the DEM surface and the derived stream network, the spatial information content, and the spatial variability, respectively. The results indicated that a 90 m cell size is sufficient to capture the terrain variability for subsequent hydrologic modelling in the study area. The significance of this research work is that it provides methods which DEM users can apply to select an appropriate DEM cell size prior to detailed hydrologic modelling.
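One of the four screening methods, entropy measurement, treats the DEM's elevation histogram as a probability distribution: falling Shannon entropy at coarser cell sizes signals lost spatial information content. A minimal sketch (synthetic elevations; bin count is an assumption):

```python
import numpy as np

def dem_entropy(elev, bins=32):
    # Shannon entropy (bits) of the elevation histogram, treating bin
    # frequencies as probabilities; empty bins contribute nothing.
    counts, _ = np.histogram(elev, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(1)
rough = dem_entropy(rng.uniform(200.0, 600.0, 10_000))  # varied synthetic terrain
flat = dem_entropy(np.full(10_000, 300.0))              # featureless terrain
```

Comparing this statistic across the 30–90 m resampled DEMs would show how much information each coarsening step discards.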

15.
An HF-free sample preparation method was used to purify silicon in twelve geological RMs. Silicon isotope compositions were determined using a Neptune multi-collector ICP-MS instrument in high-resolution mode, which allowed separation of the silicon isotope plateaus from their interferences. A 1 μg g−1 Mg spike was added to each sample and standard solution for online mass bias drift correction. δ30Si and δ29Si values are expressed in per mil (‰), relative to the NIST SRM 8546 (NBS-28) international isotopic RM. The total variation of δ30Si in the geological reference samples analysed in this study ranged from −0.13‰ to −0.29‰. Comparison with δ29Si values shows that these isotopic fractionations were mass dependent. IRMM-17 yielded a δ30Si value of −1.41 ± 0.07‰ (2s, n = 12), in agreement with previous data. The long-term reproducibility for natural samples obtained on BHVO-2 yielded δ30Si = −0.27 ± 0.08‰ (2s, n = 42) on a 12-month time scale. An in-house Si reference sample was produced to check the long-term reproducibility of a mono-elemental sample solution; this yielded a comparable uncertainty of ± 0.07‰ (2s, n = 24) over 5 months.

16.
DEM generalization is the foundation of representing and analysing terrain and the basis of multi-scale observation; it is also at the core of building multi-scale geographic databases. This paper proposes a new algorithm using profile simplification in four directions (4-DP). The algorithm is composed of two parts: extraction of terrain feature points, both in local windows and along global profile lines, and reconstruction of the DEM. The 5 m resolution DEM of Suide, in the Loess Plateau of China, was used as the original data. In the experiment, generalized DEMs at 5 m and 25 m resolution were obtained by removing small details, and the optimal threshold was computed. The method was compared with the classic VIP and Aggregate algorithms using three evaluation methods. The results show that the method effectively retains the main geographical information of the terrain surface.

17.
A new earthquake catalogue for central, northern and northwestern Europe with unified Mw magnitudes, in part derived from chi-square maximum likelihood regressions, forms the basis for seismic hazard calculations for the Lower Rhine Embayment. Uncertainties in the various input parameters are introduced, a detailed seismic zonation is performed and a recently developed technique for maximum expected magnitude estimation is adopted and quantified. Applying the logic tree algorithm, resulting hazard values with error estimates are obtained as fractile curves (median, 16% and 84% fractiles, and mean) plotted for pga (peak ground acceleration; median values for Cologne 0.7 and 1.2 m/s2 for probabilities of exceedance of 10% and 2%, respectively, in 50 years), 0.4 s (0.8 and 1.5 m/s2) and 1.0 s (0.3 and 0.5 m/s2) pseudoaccelerations, and intensity (I0 = 6.5 and 7.2). For the ground motion parameters, rock foundation is assumed. For the area near Cologne and Aachen, maps show the median and 84% fractile hazard for 2% probability of exceedance in 50 years based on pga (maximum median value about 1.5 m/s2), and 0.4 s (>2 m/s2) and 1.0 s (about 0.8 m/s2) pseudoaccelerations, all for rock. The pga 84% fractile map also has a maximum value above 2 m/s2 and shows similarities with the median map for 0.4 s. In all maps, the maximum values fall within the area 6.2–6.3° E and 50.8–50.9° N, i.e., east of Aachen.
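The two hazard levels quoted — 10% and 2% probability of exceedance in 50 years — correspond, under the usual Poisson occurrence assumption, to fixed mean return periods. A quick check of that conversion:

```python
import math

def return_period(p_exceed, t_years=50.0):
    # Poisson occurrence model: P = 1 - exp(-t / T)  =>  T = -t / ln(1 - P).
    return -t_years / math.log(1.0 - p_exceed)

t10 = return_period(0.10)  # 10% in 50 years -> roughly 475-year return period
t02 = return_period(0.02)  # 2% in 50 years  -> roughly 2475-year return period
```

These are the standard 475- and 2475-year hazard levels at which fractile curves such as those in this study are typically read off.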

18.
The rate of extraterrestrial accretion for particles in the size range 0.45 μm to ∼20 μm was determined from dust concentrates extracted from Greenland Ice Sheet Project 2 (GISP2) ice core samples. Using instrumental neutron activation analysis (INAA), we determined the iridium (Ir) content of the dust. Following a core-specific correction for terrestrial Ir and assuming a chondritic Ir abundance of 500 ppb, we measure an average accretion rate for 0.45 μm to ∼20 μm particles over the entire Earth of 0.22 (± 0.11) × 10⁹ g/yr (kton/yr) for 317 years of ice through the interval 6 to 20 ka. This is consistent with the interplanetary dust accretion rate of 0.17 (± 0.08) × 10⁹ g/yr that we derive from published ³He data for the GISP2 core. Accounting for particles that are larger and smaller than those detected by our experiment, our best estimate of the total accretion rate (including particle sizes up to about 4 cm in diameter) is 2.5 × 10⁹ g/yr. The uncertainty in this estimate is dominated by statistical fluctuations in the number of particles expected to end up in the ice core and not by measurement error. Based on Monte Carlo simulations, we estimate the upper limit for total extraterrestrial accretion to Earth of 6.25 × 10⁹ g/yr (95% confidence level). This accretion rate is consistent with some estimates from micrometeorite concentrations in polar ice, estimates from ground-based radar studies, and with accretion estimates of ³He-bearing interplanetary dust particles, assuming that ³He is correlated with particle surface area. It is, however, lower than estimates based on platinum group element studies of marine sediments. The conflict may indicate systematic errors with either the marine or the non-marine samples, departures from the assumed particle spectrum of Grün and coauthors, or time-variable accretion rates, with the early Holocene period being characterized by low levels of accretion.
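The scaling from a measured extraterrestrial Ir deposition flux to a whole-Earth accretion rate is simple arithmetic: divide by the assumed chondritic Ir mass fraction (500 ppb) and multiply by Earth's surface area. A sketch (the input flux below is back-calculated for illustration; it is not a measured value from the paper):

```python
EARTH_AREA_CM2 = 5.1e18   # Earth's surface area in cm^2
IR_CHONDRITIC = 500e-9    # assumed Ir mass fraction of chondritic dust (500 ppb)

def accretion_rate_g_per_yr(ir_flux_g_cm2_yr):
    # Whole-Earth dust mass flux implied by an extraterrestrial Ir
    # deposition flux per unit area.
    return ir_flux_g_cm2_yr / IR_CHONDRITIC * EARTH_AREA_CM2

rate = accretion_rate_g_per_yr(2.16e-17)  # illustrative flux in g Ir / cm^2 / yr
```

An Ir flux of about 2 × 10⁻¹⁷ g cm⁻² yr⁻¹ reproduces an accretion rate of the order quoted in the abstract (≈ 0.22 × 10⁹ g/yr), showing how extreme the Ir sensitivity requirement is.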

19.
Viewshed analysis is widely used in many terrain applications, such as siting and path-planning problems. But viewshed computation is very time-consuming, in particular for applications with large-scale terrain data. Parallel computing, a mainstream technique with tremendous potential, has been introduced to enhance the performance of viewshed analysis. This paper presents a revised parallel viewshed computation approach based on the existing serial XDraw algorithm in a distributed parallel computing environment. A layered data-dependency model for the XDraw algorithm is built to explore the scheduling strategy, so that a fine-granularity scheduling strategy over a process-level and thread-level parallel computing model can be adopted to improve the efficiency of viewshed computation. A parallel algorithm, XDraw-L, is designed and implemented taking this scheduling strategy into account. The experimental results demonstrate a distinct improvement in the computational performance of XDraw-L compared with coarse-partition algorithms such as XDraw-E, presented by Song et al. (Earth Sci Inf 10(5):511–523, 2016), and XDraw-B, the basic serial XDraw algorithm. The fine-granularity scheduling algorithm greatly improves the scheduling performance of the grid cells between the layers within a triangle region.
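The primitive underlying XDraw-style viewshed algorithms is the single-ray line-of-sight test: walking outward from the observer, a cell is visible if and only if its elevation angle exceeds the running maximum seen so far (XDraw's layered sweep interpolates these maxima across rays, which is the source of the data dependency between layers). A one-ray sketch:

```python
def los_visible(heights, observer_h):
    # heights[i] is the terrain elevation at distance i+1 along one ray
    # from the observer; returns per-cell visibility flags.
    visible = []
    max_slope = float("-inf")
    for dist, h in enumerate(heights, start=1):
        slope = (h - observer_h) / dist  # tangent of the elevation angle
        visible.append(slope > max_slope)
        if slope > max_slope:
            max_slope = slope
    return visible

vis = los_visible([1.0, 5.0, 2.0, 6.0], observer_h=0.0)
```

Because each cell's test depends only on the running maximum from cells nearer the observer, layers at equal distance can be processed in parallel once the previous layer is done — the property the paper's scheduling strategy exploits.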

20.
A 150 μm thick fused layer of rock has been produced by rotating two metadolerite core faces against each other at 3000 r.p.m. under an axial load of 330 kg for 11 s using friction welding apparatus. Scanning electron microscopy and electron microprobe analysis reveal that the melt layer comprises sub-angular to rounded porphyroclasts of clinopyroxene, feldspar and ilmenite (>20 μm diameter), derived from the host metadolerite, set within a silicate glass matrix. Thermal calculations confirm that melting occurred at the rock interface and that mean surface temperatures in excess of 1400°C were attained. The fused layer shows many textural similarities with pseudotachylyte described from fault zones. Morphologically, the fused layer consists of a series of stacks of porphyroclasts welded together by melt to form 'build-ups' oriented at right angles to the friction surface. There is also evidence of gouging, ploughing and plucking, as well as transfer and adhesion of material having occurred between the rock faces. The mean surface velocity attained by the metadolerite (0.24 m s−1) and the duration of the experiment are comparable with the velocities and rise times of typical single-jerk earthquakes occurring during stick-slip seismic faulting within brittle crust (i.e. slip rates of 0.1–0.5 m s−1 for, say, 1–10 s). In these respects the experiment successfully simulated frictional fusion on a fault plane in the absence of an intergranular fluid. Power dissipation during the experiment was about MW m−2, comparable only to very low values for earthquakes (e.g. 1–100 MW m−2 for displacement rates of 0.1–0.5 m s−1 at shear stresses of 100–1000 bars). This indicates that melting on fault planes during earthquakes should be commonplace. Field evidence, however, does not support this contention. Either pseudotachylyte is not being recognized in exhumed ancient seismic fault zones, or melting only occurs under very special circumstances.
