Similar Documents
20 similar documents found (search time: 46 ms)
1.
Scherbaum et al. (2004, Bull Seismol Soc Am 94(6):2164–2185) proposed a likelihood-based approach to selecting and ranking ground-motion models for seismic hazard analysis in regions of low seismicity. The results of their analysis were first used within the PEGASOS project (Abrahamson et al. 2002, in Proceedings of the 12th ECEE, London, Paper no. 633), so far the only application of a probabilistic seismic hazard analysis (PSHA) in Europe based on a SSHAC Level 4 procedure (Budnitz et al. 1997, Recommendations for PSHA: guidance on uncertainty and use of experts, NUREG/CR-6372-V1). The outcome of this project has generated considerable discussion (Klügel 2005a, b, Eng Geol 78:285–307; Klügel 2005c, Eng Geol 82:79–85; Musson et al. 2005, Eng Geol 82(1):43–55; Budnitz et al. 2005, Eng Geol 78(3–4):285–307), a central part of which relates to the issue of ground-motion model selection and ranking. Since at the time of the study by Scherbaum et al. (2004) only records from one earthquake were available for the study area, here we test the stability of their results using more recent data. Increasing the data set from 12 records of one earthquake in Scherbaum et al. (2004) to 61 records of 5 earthquakes, most of which occurred since the publication of the original study, does not change the set of the three top-ranked ground-motion models (Abrahamson and Silva 1997, Seismol Res Lett 68(1):94–127; Lussou et al. 2001, J Earthquake Eng 5(1):13–33; Berge-Thierry et al. 2003, Bull Seismol Soc Am 95(2):377–389). Only for the lower-ranked models do we obtain modifications in the ranking order.
Furthermore, the records from the Waldkirch earthquake (December 5, 2004, Mw = 4.9) enabled us to develop a new stochastic model parameter set for the application of Campbell's (2003, Bull Seismol Soc Am 93(3):1012–1033) hybrid empirical model to SW Germany and neighbouring regions.
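The likelihood-based criterion of Scherbaum et al. (2004) scores each candidate model by how plausible the observed ground motions are under that model's predictive distribution. The sketch below is a minimal illustration of such an LH-style ranking, not the paper's actual procedure or data: the residual sets are hypothetical and the median LH is used as a single summary score.

```python
import math

def lh_value(z):
    """LH measure: probability that |Z| >= |z| for standard normal Z,
    i.e. LH = erfc(|z| / sqrt(2)). Values near 1 mean a good fit."""
    return math.erfc(abs(z) / math.sqrt(2.0))

def rank_models(residuals_by_model):
    """Rank ground-motion models by their median LH value (higher = better).
    `residuals_by_model` maps a model name to its list of normalized
    residuals (observed minus predicted log ground motion, over sigma)."""
    scores = {}
    for model, residuals in residuals_by_model.items():
        lhs = sorted(lh_value(z) for z in residuals)
        n = len(lhs)
        median = lhs[n // 2] if n % 2 else 0.5 * (lhs[n // 2 - 1] + lhs[n // 2])
        scores[model] = median
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical residual sets: model "A" has the smaller residuals.
ranking = rank_models({
    "A": [0.1, -0.2, 0.3, -0.1],
    "B": [1.5, -2.0, 1.8, -1.2],
})
print(ranking[0][0])  # "A"
```

A perfectly unbiased model with well-estimated sigma yields LH values uniformly distributed on [0, 1], so a median LH near 0.5 indicates consistency between data and model.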

2.
We address possibilities of minimising environmental risks using statistical features of current-driven propagation of adverse impacts to the coast. The recently introduced method for finding the optimum locations of potentially dangerous activities (Soomere et al. in Proc Estonian Acad Sci 59:156–165, 2010) is expanded to account for the spatial distributions of the probabilities and times of reaching the coast for passively advecting particles released in different sea areas. These distributions are calculated using large sets of Lagrangian trajectories derived from Eulerian velocity fields provided by the Rossby Centre Ocean Model, with a horizontal resolution of 2 nautical miles, for 1987–1991. The test area is the Gulf of Finland in the northeastern Baltic Sea. The potential gain from using the optimum fairways from the Baltic Proper to the eastern part of the gulf is a decrease of up to 44% in the probability of coastal pollution and a similar increase in the average time for reaching the coast. The optimum fairways are mostly located to the north of the gulf axis (by 2–8 km on average) and meander substantially in some sections. The robustness of this approach is quantified as the typical root mean square deviation (6–16 km) between the optimum fairways specified from different criteria. Drastic variations in the width of the 'corridors' for almost optimal fairways (2–30 km, for an average width of 15 km) signify that the sensitivity of the results to small changes in the environmental criteria varies widely in different parts of the gulf.
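The trajectory-counting idea behind these probability maps can be illustrated with a toy Monte Carlo experiment. The 1-D drift-plus-noise "current" below is a hypothetical stand-in for the Rossby Centre Ocean Model velocity fields; only the counting logic (fraction of released particles that reach the coast) matches the method described above.

```python
import random

def coastal_hit_probability(y0, n_particles=2000, n_steps=200,
                            drift=0.05, noise=0.5, coast=10.0, seed=1):
    """Fraction of passively advecting particles released at cross-gulf
    position y0 (km) that reach the coast (|y| >= coast) within n_steps.
    The velocity field is a toy drift + random walk, standing in for
    Lagrangian trajectories computed from an ocean model."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_particles):
        y = y0
        for _ in range(n_steps):
            y += drift + rng.gauss(0.0, noise)
            if abs(y) >= coast:
                hits += 1
                break
    return hits / n_particles

# Releases near the gulf axis (y0 = 0) should hit the coast less often
# than releases started close to it (y0 = 8).
p_centre = coastal_hit_probability(0.0)
p_near = coastal_hit_probability(8.0)
print(p_centre < p_near)  # True: the axis is the safer fairway
```

In the study itself, the same hit statistics are accumulated per release cell from five years of modelled trajectories, and the fairway is routed through the cells with the lowest hit probability (or longest time to coast).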

3.
An upgrade of the Siberian Solar Radio Telescope (SSRT) [Smolkov et al., 1986; Grechnev et al., 2003] to a multiwave radio heliograph has been started. The radio heliograph being created is designed mainly to measure coronal magnetic fields, to determine the locations of solar-flare energy release, and to investigate coronal mass ejections. These tasks define the parameters of next-generation radio heliographs: a high spatial resolution, a high image acquisition rate, and a high sensitivity are required simultaneously, and all of these should be realized in the widest possible frequency range, from fractions of a GHz to tens of GHz. The expected parameters of the future SSRT-based radio heliograph are: spatial resolution 12″–24″, temporal resolution 0.02–1.0 s, frequency range 4–8 GHz, sensitivity up to 100 K, left- and right-hand circular polarizations, and data rate 0.5–20 Mb s⁻¹ (normal and flare modes). In this paper, we describe the broadband antennas, analog optical data-transmission lines, and correlator used in the 10-antenna radio heliograph prototype.

4.
This study extends the stochastic analysis of transient two-phase flow in randomly heterogeneous porous media (Chen et al. in Water Resour Res 42:W03425, 2006) by incorporating direct measurements of the random soil properties. The log-transformed intrinsic permeability, soil pore-size distribution parameter, and van Genuchten fitting parameter are treated as normally distributed stochastic variables with a separable exponential covariance model. These three random variables, conditioned on the given measurements, are decomposed via Karhunen–Loève decomposition. Combined with the conditional eigenvalues and eigenfunctions of the random variables, we conduct a series of numerical simulations using the stochastic transient water–oil flow model (Chen et al. in Water Resour Res 42:W03425, 2006) based on the KLME approach to investigate how the number and location of measurement points, the different random soil properties, and the correlation length of the random soil properties affect the stochastic behavior of water and oil flow in heterogeneous porous media.
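The Karhunen–Loève decomposition used here expands a correlated random field in the eigenpairs of its covariance function. A minimal numerical sketch for the exponential covariance model is given below; it finds only the dominant eigenpair of the discretized covariance matrix by power iteration, which is a stand-in for the full KLME machinery of Chen et al. (2006). The grid and parameters are illustrative.

```python
import math

def exp_cov(x1, x2, sigma2=1.0, eta=2.0):
    """Exponential covariance C(x1, x2) = sigma^2 exp(-|x1 - x2| / eta)."""
    return sigma2 * math.exp(-abs(x1 - x2) / eta)

def dominant_kl_mode(xs, cov, iters=500):
    """Leading Karhunen-Loeve eigenpair of the covariance matrix
    discretized on the grid `xs`, found by power iteration."""
    n = len(xs)
    C = [[cov(xi, xj) for xj in xs] for xi in xs]
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = math.sqrt(sum(wi * wi for wi in w))  # Rayleigh estimate via norm
        v = [wi / lam for wi in w]
    return lam, v

xs = [i * 0.5 for i in range(20)]           # grid on [0, 9.5]
lam1, phi1 = dominant_kl_mode(xs, exp_cov)
total_var = sum(exp_cov(x, x) for x in xs)  # trace = sum of all eigenvalues
print(lam1 / total_var)                     # fraction of variance in mode 1
```

Truncating the expansion after the modes that capture most of the trace is what makes the KLME approach efficient; conditioning on measurements modifies the covariance (and hence these eigenpairs) before the expansion is formed.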

5.
The stochastic Green's function method, which simulates one component of the far-field S-waves from an extended fault plane at high frequencies (Kamae et al., J Struct Constr Eng Trans AIJ, 430:1–9, 1991), is extended to simulate the three components of the full waveform in layered half-spaces over a broadband frequency range. The method first computes ground motions from small earthquakes, which correspond to the ruptures of sub-faults on the fault plane of a large earthquake, and then constructs the strong motions of the large earthquake by superposing the small ground motions using the empirical Green's function technique (e.g., Irikura, Proc 7th Japan Earthq Eng Symp, 151–156, 1986). The broadband stochastic omega-square model is proposed as the moment-rate function of the small earthquakes, in which random and zero phases are used at higher and lower frequencies, respectively. The zero phases are introduced to simulate a smooth ramp in the moment function with a duration of 1/fc s (fc being the corner frequency) and to reproduce coherent strong motions at low frequencies (i.e., the directivity pulse). For the radiation coefficients, the theoretical values of double-couple sources are used at lower frequencies, and the theoretical isotropic values for the P-, SV-, and SH-waves (Onishi and Horike, J Struct Constr Eng Trans AIJ, 586:37–44, 2004) at high frequencies. The proposed method uses the theoretical Green's functions of layered half-spaces instead of the far-field S-waves, which reproduce the complete wavefield, including the direct and reflected P- and S-waves and surface waves, at broadband frequencies. Finally, the proposed method is applied to the 1994 Northridge earthquake, and the results show excellent agreement with the observation records at broadband frequencies. At the same time, the method still needs improvement, especially because it underestimates the high-frequency vertical components in the near-fault range.
Nonetheless, the method will be useful for modeling high-frequency contributions in hybrid methods, which use stochastic and deterministic approaches for high and low frequencies, respectively (e.g., the stochastic Green's function method + finite difference methods; Kamae et al., Bull Seism Soc Am, 88:357–367, 1998; Pitarka et al., Bull Seism Soc Am 90:566–586, 2000), because it reproduces the full waveforms in layered media, including not only the random characteristics at higher frequencies but also the theoretical, deterministic coherencies at lower frequencies.
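The hybrid-phase omega-square spectrum described above can be sketched directly. The amplitude form |Mdot(f)| = M0 / (1 + (f/fc)^2) is the standard omega-square moment-rate spectrum, and the phase rule (zero below the corner frequency, random above) follows the abstract; the sharp switch at exactly fc is a simplifying assumption of this sketch.

```python
import math, random

def omega_square_spectrum(freqs, m0=1.0, fc=1.0, seed=0):
    """Amplitude and phase of a broadband stochastic omega-square
    moment-rate spectrum: |Mdot(f)| = M0 / (1 + (f/fc)^2), with zero
    phase at and below the corner frequency fc (coherent range, giving
    the directivity pulse) and random phase above it (stochastic range)."""
    rng = random.Random(seed)
    spec = []
    for f in freqs:
        amp = m0 / (1.0 + (f / fc) ** 2)
        phase = 0.0 if f <= fc else rng.uniform(-math.pi, math.pi)
        spec.append((amp, phase))
    return spec

freqs = [0.1, 0.5, 1.0, 2.0, 10.0]
spec = omega_square_spectrum(freqs, m0=1.0, fc=1.0)
# Low-frequency amplitude tends to M0; at f = fc it is M0 / 2;
# well above fc it decays as f^-2.
print([round(a, 2) for a, _ in spec])  # [0.99, 0.8, 0.5, 0.2, 0.01]
```

Inverse-transforming such a spectrum yields a moment-rate pulse whose coherent low-frequency part is shared by all sub-events, while the high-frequency parts are mutually incoherent, which is exactly the behavior the superposition scheme needs.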

6.
In this study, the spatial distributions of seismicity and seismic hazard were assessed for Turkey and its surrounding area. For this purpose, earthquakes with magnitudes M ≥ 4 that occurred between 1964 and 2004 in the region (30–42°N, 20–45°E) were used. For the estimation and mapping of the seismicity parameters, Turkey and the surrounding area were divided into 1,275 circular subregions. The b-value of the Gutenberg–Richter frequency–magnitude distribution is calculated both in the classical way and with a new alternative method, in both cases using the least-squares approach; in the new alternative method, the a-value of the Gutenberg–Richter distribution is taken as a constant. The b-values calculated by the new method were mapped, and the results obtained from the two methods were compared. The b-value shows different spatial distributions across Turkey for the two techniques, and the b-value map prepared with the new technique is more consistent with the regional tectonics, earthquake activity, and epicenter distributions. Finally, the return period and the probability of occurrence of M ≥ 6.5 earthquakes within 75 years were calculated using the Poisson model for both techniques. The return-period and occurrence-probability maps determined from the two techniques are consistent with each other, and are more consistent with the b-value seismicity maps calculated from the new method. In the western part of the study region, the occurrence probability and return period of M ≥ 6.5 earthquakes were calculated from the Poisson model as 90–99% and 5–10 years, respectively.
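The two quantities at the core of this analysis are easy to state concretely: a least-squares fit of log10 N = a − b·M to a cumulative frequency–magnitude distribution, and the Poisson probability of at least one event in a time horizon. The sketch below uses a synthetic catalogue and illustrative numbers, not the study's data.

```python
import math

def gr_least_squares(mags, counts):
    """Least-squares fit of log10 N = a - b*M (Gutenberg-Richter).
    `counts` are cumulative event counts per magnitude bin."""
    xs, ys = mags, [math.log10(n) for n in counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = -sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my + b * mx
    return a, b

def poisson_hazard(return_period, horizon):
    """Probability of at least one event in `horizon` years under a
    Poisson model with mean recurrence `return_period` years:
    P = 1 - exp(-horizon / return_period)."""
    return 1.0 - math.exp(-horizon / return_period)

# Synthetic catalogue following log10 N = 5.0 - 1.0 M exactly.
mags = [4.0, 4.5, 5.0, 5.5, 6.0]
counts = [10 ** (5.0 - 1.0 * m) for m in mags]
a, b = gr_least_squares(mags, counts)
print(round(a, 3), round(b, 3))           # 5.0 1.0
print(round(poisson_hazard(30.0, 75.0), 2))  # 0.92
```

With a return period of 30 years, the 75-year occurrence probability is about 92%, inside the 90–99% band reported for the western part of the study region.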

7.
Unconditional stochastic studies of groundwater flow and solute transport in a nonstationary conductivity field show that the standard deviations of the hydraulic head and solute flux are very large in comparison with their mean values (Zhang et al. in Water Resour Res 36:2107–2120, 2000; Wu et al. in J Hydrol 275:208–228, 2003; Hu et al. in Adv Water Resour 26:513–531, 2003). In this study, we develop a numerical method of moments that conditions on measurements of hydraulic conductivity and head to reduce the variances of the head and the solute flux. A Lagrangian perturbation method is applied to develop the framework for solute transport in a nonstationary flow field. Since the derived moment equations are too complicated to solve analytically, a numerical finite difference method is implemented to obtain the solutions. Instead of using an unconditional conductivity field as input for calculating the groundwater velocity, we combine a geostatistical method with a method of moments for flow to conditionally simulate the distributions of head and velocity based on measurements of hydraulic conductivity and head at selected points. The developed theory is applied in several case studies to investigate the influence of the measurements of hydraulic conductivity and/or hydraulic head on the variances of the predicted head and solute flux in nonstationary flow fields. The results show that conditioning significantly reduces the head variance. Since the hydraulic-head measurement points are treated as interior (Dirichlet) boundary conditions, conditioning on both hydraulic conductivity and head measurements reduces the head variance much more than conditioning on conductivity measurements alone. For the solute flux, however, the variance reduction achieved by conditioning is not as significant.

8.
Let {Y, Y_i, −∞ < i < ∞} be a doubly infinite sequence of identically distributed, asymptotically linear negative quadrant dependent (ALNQD) random variables, and {a_i, −∞ < i < ∞} an absolutely summable sequence of real numbers. This work is inspired by Wang et al. (Econometric Theory 18:119–139, 2002) and by Salvadori (Stoch Environ Res Risk Assess 17:116–140, 2003), who used linear combinations of order statistics to estimate the quantiles of the generalized Pareto and extreme value distributions. In this paper, we prove the complete convergence of the associated moving-average sums under suitable conditions. The results obtained improve and generalize the results of Li et al. (1992) and Zhang (1996), and extend those for negatively associated sequences and ρ*-mixing sequences. CIC Number O211; AMS (2000) Subject Classification 60F15, 60G50. Research supported by the National Natural Science Foundation of China.

9.
The altimetric satellite signal is the sum of the geoid and the dynamic topography, but only the latter is relevant to oceanographic applications. Poor knowledge of the geoid has prevented oceanographers from fully exploiting altimetric measurements through their absolute component, so applications have concentrated on ocean variability through analyses of sea-level anomalies. Recent geodetic missions such as CHAMP, GRACE and the forthcoming GOCE are changing this perspective. In this study, data assimilation is used to reconstruct the Tropical Pacific Ocean circulation during the 1993–1996 period. Multivariate observations are assimilated into a primitive-equation ocean model (OPA) using a reduced-order Kalman filter (the Singular Evolutive Extended Kalman filter). A 6-year (1993–1998) hindcast experiment is analyzed and validated by comparison with observations. In this experiment, the new capability offered by an observed absolute dynamic topography (built using the GRACE geoid to reference the altimetric data) is used to assimilate, in an efficient way, the in-situ temperature profiles from the TAO/TRITON moorings together with the T/P and ERS-1/2 altimetric signal. The GRACE data improve the compatibility between the two observation data sets, circumventing the difficulties encountered in previous studies such as Parent et al. (J Mar Syst 40–41:381–401, 2003). This improvement allows more efficient data assimilation, as evidenced by assessing the results against independent data, and leads in particular to significantly more realistic currents and vertical thermal structures.
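At the heart of any Kalman-filter assimilation, reduced-order or not, is the analysis step that blends a model background with an observation according to their error covariances. The scalar sketch below is a deliberately minimal stand-in for the Singular Evolutive Extended Kalman filter used with OPA; the sea-surface-height numbers are hypothetical.

```python
def kalman_update(x, P, z, H, R):
    """One analysis step of a Kalman filter: background state x with
    error variance P is updated with observation z, observation
    operator H and observation-error variance R (all scalars here)."""
    K = P * H / (H * P * H + R)       # Kalman gain
    x_new = x + K * (z - H * x)       # innovation update
    P_new = (1.0 - K * H) * P         # analysis error variance
    return x_new, P_new

# Background SSH anomaly 0.10 m (variance 0.04 m^2) updated with an
# altimetric observation of 0.20 m (error variance 0.01 m^2).
x, P = kalman_update(0.10, 0.04, 0.20, 1.0, 0.01)
print(round(x, 3), round(P, 4))  # 0.18 0.008
```

In the reduced-order (SEEK) setting, P is not a full matrix but is represented by a small set of dominant error modes, which makes the same update affordable for a global-state ocean model.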

10.
Extreme high precipitation amounts are among the environmental events with the most disastrous consequences for human society. This paper deals with the identification of 'homogeneous regions' according to the statistical characteristics of precipitation extremes in the Czech Republic, i.e. the basic and most important step toward regional frequency analysis. Precipitation totals measured at 78 stations over 1961–2000 are used as the input dataset. Preliminary candidate regions are formed by cluster analysis of site characteristics, using average-linkage clustering and Ward's method. Several statistical tests for regional homogeneity are utilized, based on the 10-yr event and the variation of L-moment statistics. In compliance with the results of the tests, the area of the Czech Republic has been divided into four homogeneous regions. The findings are supported by simulation experiments designed to evaluate the stability of the test results. Since the regions formed also reflect climatological differences in precipitation regimes and the synoptic patterns causing high precipitation amounts, their future application need not be limited to the frequency analysis of extremes.
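The L-moment statistics on which the homogeneity tests rest are computed from probability-weighted moments of the ordered sample (Hosking and Wallis). A minimal sketch for the first two sample L-moments and the L-CV is given below; the precipitation values are hypothetical.

```python
def sample_l_moments(data):
    """First two sample L-moments (l1, l2) and the L-CV t = l2 / l1,
    via probability-weighted moments: with x sorted ascending,
    b0 = mean(x), b1 = (1/n) * sum_{j=2..n} ((j-1)/(n-1)) * x_(j),
    then l1 = b0 and l2 = 2*b1 - b0."""
    x = sorted(data)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum(j * x[j] for j in range(n)) / (n * (n - 1))  # 0-based j
    l1 = b0
    l2 = 2.0 * b1 - b0
    return l1, l2, l2 / l1

# Hypothetical annual precipitation maxima (mm) at one station.
l1, l2, t = sample_l_moments([55, 62, 48, 90, 71, 66, 80, 52, 59, 74])
print(round(l1, 1), round(t, 3))  # 65.7 0.12
```

Regional homogeneity tests then compare the spread of the at-site L-CV (and higher L-moment ratios) across a candidate region against the spread expected from sampling variability alone.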

11.
A beam-hardening correction method based on multi-energy statistics
In X-ray computed tomography (X-CT) systems, conventional CT reconstruction algorithms are based on the assumption of a monoenergetic source, whereas real sources are polyenergetic. Reconstructing images directly from polyenergetic projection data with a conventional algorithm therefore produces beam-hardening artifacts. In this paper, the actual polyenergetic spectrum is discretized into several sub-spectra, and a mathematical model is built on the fact that photon counts follow a Poisson distribution. On the basis of a conventional iterative reconstruction algorithm, statistical methods are then used for parameter estimation. Experimental results show that the method effectively removes cupping artifacts from the reconstructed images and improves image quality.
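The nonlinearity that causes beam hardening is easy to exhibit once the spectrum is split into sub-spectra, as described above. The sketch below computes a polyenergetic log-projection through a homogeneous object; the two-component spectrum and attenuation coefficients are illustrative, not from the paper.

```python
import math

def poly_projection(length, spectrum):
    """Polyenergetic (multi-energy) log-projection through a homogeneous
    object: p = -ln( sum_i w_i * exp(-mu_i * L) ), where the source
    spectrum is split into sub-spectra with weights w_i and linear
    attenuation coefficients mu_i."""
    transmitted = sum(w * math.exp(-mu * length) for w, mu in spectrum)
    return -math.log(transmitted)

# Two sub-spectra: a soft component (mu = 0.5/cm) and a hard one (0.2/cm).
spectrum = [(0.6, 0.5), (0.4, 0.2)]
# A monoenergetic source gives a projection linear in path length; the
# polyenergetic projection falls below that line (the beam "hardens" as
# the soft photons are preferentially absorbed), which shows up as
# cupping artifacts in images reconstructed by a monoenergetic algorithm.
p1, p2 = poly_projection(1.0, spectrum), poly_projection(2.0, spectrum)
print(p2 < 2 * p1)  # True: sub-linear growth = hardening
```

A statistical correction in the spirit of the paper estimates the material path lengths (or sub-spectrum parameters) inside an iterative Poisson-likelihood reconstruction, so that this nonlinearity is modeled rather than ignored.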

12.
The open literature reveals several types of bivariate exponential distribution. Of these, only the Nagao–Kadoya distribution (Nagao and Kadoya, 1970, 1971) has a general form whose marginals are standard exponential distributions and whose correlation coefficient spans 0 ≤ ρ < 1. On the basis of the principle that, if a theoretical probability distribution represents the statistical properties of sample data, then probabilities computed from the theoretical model should fit the observed ones well, numerical experiments were carried out to investigate the applicability of the Nagao–Kadoya bivariate exponential distribution for modeling the joint distribution of two correlated random variables with exponential marginals. The results indicate that this model is suitable for analyzing the joint distribution of two exponentially distributed variables. The procedure for using this model to represent the joint statistical properties of two correlated exponentially distributed variables is also presented.
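The joint density in question has a closed form involving the modified Bessel function I0; the sketch below implements it directly and checks it numerically. The density formula used, f(x, y) = exp(−(x + y)/(1 − ρ)) · I0(2√(ρxy)/(1 − ρ)) / (1 − ρ) for standard exponential marginals, is the standard form usually attributed to this family; treat it as an assumption of this sketch rather than a quotation from the paper.

```python
import math

def bessel_i0(z, terms=60):
    """Modified Bessel function of the first kind, order 0, via its
    power series I0(z) = sum_k ((z/2)^(2k)) / (k!)^2."""
    total, term = 1.0, 1.0
    for k in range(1, terms):
        term *= (z / 2.0) ** 2 / k ** 2
        total += term
    return total

def nagao_kadoya_pdf(x, y, rho):
    """Joint density of the Nagao-Kadoya bivariate exponential with
    standard exponential marginals and correlation 0 <= rho < 1."""
    c = 1.0 - rho
    return math.exp(-(x + y) / c) * bessel_i0(2.0 * math.sqrt(rho * x * y) / c) / c

# With rho = 0 the density factorizes into the two exponential marginals.
assert abs(nagao_kadoya_pdf(1.0, 2.0, 0.0) - math.exp(-3.0)) < 1e-12

# Crude midpoint-rule check that the rho = 0.5 density integrates to ~1.
h, total = 0.1, 0.0
for i in range(1, 151):
    for j in range(1, 151):
        x, y = (i - 0.5) * h, (j - 0.5) * h
        total += nagao_kadoya_pdf(x, y, 0.5) * h * h
print(round(total, 2))  # close to 1
```

The same density evaluated on a grid is what the paper's numerical experiments compare against empirical joint relative frequencies of the two correlated variables.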

13.
In this work, we carried out a preliminary study of traffic-derived pollutants from primary sources (vehicles) on roads (paved areas), road borders and surrounding areas. The study focuses on the identification, distribution and concentration of pollutants and magnetic carriers. Magnetic parameters and their analyses suggest that the magnetic signal of vehicle-derived emissions is controlled by a magnetite-like phase. Magnetic grain-size estimations reveal the presence of fine particles (0.1–5 μm) that can be inhaled and are therefore dangerous to human health. Magnetic susceptibility results (about 175 × 10⁻⁵ SI) show a higher magnetic concentration (magnetic enhancement) in the central area of the tollbooth line, which is related to higher traffic. In addition, magnetic susceptibility was measured on roadside soils along a length of 120 km and used to generate a 2-D contour map, which shows higher magnetic values (100–200 × 10⁻⁵ SI) near the edge of the road. The observed distribution of magnetic values indicates that magnetic particles emitted by vehicles accumulate and are mainly concentrated within a few meters (1–2 m) of the edge of the road. Consequently, magnetic susceptibility appears to be a suitable indicator of traffic-related pollution. Non-magnetic studies show an enrichment of some trace elements, such as Ba, Cr, Cu, Zn and Pb, that are associated with traffic pollution. Furthermore, statistical correlations between the content of toxic trace metals and the magnetic variables support the use of magnetic parameters as potential proxies for traffic-related pollution in this study area.

14.
The natural-climatic causes of changes in river runoff and seasonal recharge of groundwater in the Don basin are considered. A joint analysis is made of changes in the statistical characteristics of the series of air temperature, precipitation, and mean annual and dry-season runoff, both for the entire observation period and for the periods 1940–1969 and 1970–2000, which have comparable durations. The presence of statistically significant ascending trends in air temperature, precipitation, and dry-season (groundwater) runoff for the period 1970–2005 is demonstrated. Climatic changes in the Don basin also affect the formation of extreme low-water conditions in small and medium rivers, including cases of zero runoff. Zoning of the territory by runoff-formation conditions is carried out, and new estimates of the natural groundwater resources of the Don basin for the period 1970–2000 are constructed. Appropriate maps are compiled.

15.
Using ground temperature data from meteorological stations together with earthquake, ground-tilt and precipitation data, the spatial-temporal distribution of the "Underground Hot Vortex" (UHV) in China was analyzed in detail. The results show that the life span of a UHV cell is 3–8 seasons (1.5 years on average), its mean horizontal scale is 600 km, and its characteristic velocity is about 400 km/a. UHV are likely to appear in areas where crustal movement is intense and the absolute value of the vertical deformation rate is relatively high; their activity can hardly be detected where the crust is stable and vertical deformation is weak. Most "Underground Hot Vortex Groups" originate at the edge of the Indian Plate and then migrate eastwards in a leap-frog style, needing 5–10 years to reach the eastern border of China. Their horizontal migration velocity of 200–500 km/a is nearly equal to the characteristic velocity of a single UHV. Project sponsored by the National Climbing Project and a Key Project of the Chinese Academy of Sciences.

16.
The objective of the study presented in this paper is to investigate the predictive capabilities of a process-based sand–mud model in a quantitative way. This recently developed sand–mud model bridges the gap between noncohesive sand models and cohesive mud models: it explicitly takes into account the interaction between the two sediment fractions and the temporal and spatial changes in bed composition [Van Ledden (2002) 5:577–594; Van Ledden et al. (2004a) 24:1–11; Van Ledden et al. (2004b) 54:385–391]. The application of this model to idealized situations has demonstrated good qualitative agreement between observed and computed bed levels and bed-composition developments. In real-life situations, however, a realistic quantitative prediction of the magnitude and timescale of this response is important for assessing the short-term and long-term impacts of human interventions and/or natural changes. For this purpose, the Friesche Zeegat in the Wadden Sea (the Netherlands) is used as a reference to hindcast the morphological response over the period 1970–1994. Due to the closure of the Lauwerszee in 1969, the tidal prism of this tidal basin was reduced by about 30%, and significant changes in bed level and bed composition occurred in the following decades as the basin adjusted to the new hydrodynamic conditions. We modeled the long-term bed-level and bed-composition development in the Friesche Zeegat over the period 1970–1994, starting from the 1970 geometry, using a research version of Delft3D that incorporates the sand–mud formulations proposed by Van Ledden (2002). The computed total net deposition in the tidal basin over 1970–1994 agrees well with the observations, but the observed decrease of the import rate with time is not predicted. The model predicts net deposition in the deeper parts and on the intertidal area of the basin and net erosion in between, which qualitatively resembles the observations.
Furthermore, the computed distribution of sand and mud in the basin of the Friesche Zeegat appears realistic. Analysis of the results shows that the absence of the decreasing import rate in the basin is caused by a poor quantitative prediction of the changes in the hypsometry of the basin. Because of this, the computed velocity asymmetry in the main channel tends toward flood dominance, whereas the observations indicate that the system was ebb-dominant in 1992. Although the sand–mud model needs to be further improved and verified, the results presented in this paper indicate that it can be applied as a first step to estimate the effects of human interventions on large-scale bed-level and bed-composition changes in tidal systems with sand and mud.

17.
18.
The main purpose of this paper is to evaluate and compare some of the statistical tropospheric scintillation models against one year of data (1999–2000) measured using the SUPERBIRD-C satellite in Tronoh, Malaysia. Eight statistical models of monthly mean scintillation intensity are briefly reviewed and their predictions compared with the measurements. The results are discussed in order to understand the potential and the limits of each prediction model within this case study. In the context of our measurements, the Karasawa and Ortgies-T models have the best overall performance. The agreement with the satellite measurements is found to depend mainly on how each prediction model is parameterized in terms of the radiometeorological variables along the Earth–space path.

19.
The 1991–1993 lava flow is the most voluminous flow erupted at Mount Etna, Sicily, in over 300 years. Estimates of the volume obtained by various methods range from 205 × 10⁶ m³ (Tanguy 1996) to over 500 × 10⁶ m³ (Barberi et al. 1993). This paper describes the results of an electronic distance measurement (EDM)-based field survey of the upper surface of the 1991–1993 flow field undertaken in 1995. The results were digitised, interpolated and converted into a digital elevation model, which was then compared with a pre-eruption digital elevation model constructed from a 1:25 000 contour map of the area based on 1989 aerial photographs. Our measurements are the most accurate to date and show that the 1991–1993 lava flow occupies a volume of 231 ± 29 × 10⁶ m³. Received: 20 July 1996 / Accepted: 5 November 1996
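The volume estimate reduces to differencing the two DEMs cell by cell and multiplying by the cell footprint. A minimal sketch with toy grids (the elevations and cell size are illustrative, not survey data):

```python
def flow_volume(pre_dem, post_dem, cell_area):
    """Lava-flow volume from pre- and post-eruption DEMs: sum of the
    per-cell elevation difference times the cell footprint area."""
    total = 0.0
    for row_pre, row_post in zip(pre_dem, post_dem):
        for z_pre, z_post in zip(row_pre, row_post):
            total += (z_post - z_pre) * cell_area
    return total

# Toy 3 x 3 grids (elevations in m) on 100 m x 100 m cells.
pre = [[100.0, 101.0, 102.0],
       [100.0, 100.0, 101.0],
       [ 99.0, 100.0, 100.0]]
post = [[100.0, 106.0, 102.0],
        [103.0, 110.0, 101.0],
        [ 99.0, 104.0, 100.0]]
print(flow_volume(pre, post, 100.0 * 100.0))  # 220000.0 (m^3)
```

In practice the uncertainty of such an estimate (the ±29 × 10⁶ m³ quoted above) is driven by the vertical errors of the two surfaces and by interpolation between survey points.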

20.
The 0.5°×0.5° grid-resolution distribution of lightning density over China and its adjacent regions has been analyzed using the satellite-borne OTD (Apr 1995–Mar 2000) and LIS (Dec 1997–Mar 2003) databases. It is shown that: (i) the variability of the lightning density (LD) is particularly pronounced across the different subareas, being 9 times greater on the south side of the Himalayas than on the north side and 2.5 times greater over eastern than western China, with maximum and minimum LD of 31.4 fl/km²/a (in the Guangzhou region) and less than 0.2 fl/km²/a (in the deserts of western China), respectively. The LD over the Chinese mainland varies regularly with latitude and distance from the coast, consistent with the trend in annual mean precipitation. In conclusion, the Qinghai–Tibet Plateau, China's three-step staircase topography and latitude are three important factors affecting the macro-scale characteristics of the LD distribution. (ii) The regional differences

