Similar Articles
20 similar articles found.
1.
The effects of climate change and population growth in recent decades are leading us to consider their combined and potentially extreme consequences, particularly for hydrological processes, which can be modeled using a generalized extreme value (GEV) distribution. Most GEV models are based on a stationarity assumption for hydrological processes, in contrast to the nonstationary reality caused by climate change and human activities. In this paper, we present the nonstationary generalized extreme value (NSGEV) distribution and use it to investigate the risk of Niangziguan Springs discharge decreasing to zero. Rather than assuming the location, scale, and shape parameters to be constant, as one would for a stationary GEV analysis, the NSGEV approach can reflect the dynamic processes by defining the GEV parameters as functions of time. Because most GEV models are designed to evaluate maxima (e.g. flooding, represented by positive numbers), whereas spring discharge cessation is a minimum, we derived an NSGEV model for minima by negating the data, i.e. using negative rather than positive numbers. Applying the model to Niangziguan Springs showed that the probability of zero discharge will be 1/80 in 2025 and 1/10 in 2030. After 2025, the rate of decrease in spring discharge will accelerate, and the probability that Niangziguan Springs will cease flowing will increase dramatically. The NSGEV model is a robust method for analysing karst spring discharge. Copyright © 2013 John Wiley & Sons, Ltd.
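A minimal sketch of the negation trick for block minima, under stated assumptions (synthetic annual minima and a GEV whose location drifts linearly in time; this is not the authors' fitted model): the minima are negated so a standard GEV maximum-likelihood fit applies, and the probability of zero discharge in a future year is read off the survival function.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

# Hypothetical annual-minimum discharge series (m^3/s), one value per year.
rng = np.random.default_rng(0)
years = np.arange(1960, 2010)
t = (years - years[0]) / 10.0                       # decades since start
q_min = 8.0 - 0.8 * t + rng.gumbel(0.0, 0.6, size=t.size)

# Negate the minima so the block-minimum problem becomes a block-maximum one.
z = -q_min

def nsgev_nll(params):
    """Negative log-likelihood of a GEV whose location drifts linearly in time."""
    mu0, mu1, log_sigma, xi = params
    mu = mu0 + mu1 * t
    # scipy's genextreme uses c = -xi relative to the usual hydrological shape xi.
    return -genextreme.logpdf(z, c=-xi, loc=mu, scale=np.exp(log_sigma)).sum()

res = minimize(nsgev_nll, x0=[z.mean(), 0.0, np.log(z.std()), 0.1],
               method="Nelder-Mead")
mu0, mu1, log_sigma, xi = res.x

# P(discharge <= 0) = P(-discharge >= 0): the GEV survival function at zero.
t_2030 = (2030 - years[0]) / 10.0
p_zero = genextreme.sf(0.0, c=-xi, loc=mu0 + mu1 * t_2030, scale=np.exp(log_sigma))
print(f"P(zero discharge in 2030) ~ {p_zero:.3f}")
```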

2.
Changes in river flow regimes have resulted in a surge in the number of methods for non-stationary flood frequency analysis. A common assumption is a time-invariant distribution function with time-dependent location and scale parameters, while the shape parameter remains time-invariant. Here, instead of the location and scale parameters of the distribution, the mean and standard deviation are used. We analyse the accuracy of the two methods with respect to the estimation of the time-dependent first two moments, time-invariant skewness, and time-dependent upper quantiles. The maximum likelihood (ML) method with a time covariate is compared with the Two-Stage (TS) method (combining Weighted Least Squares and L-moments techniques). The comparison is made by Monte Carlo simulations. Assuming a parent distribution that ensures the asymptotic superiority of the ML method, the Generalized Extreme Value distribution is considered with various values of the first two moments changing linearly in time, constant skewness, and various time-series lengths. Analysis of the results indicates the superiority of the TS method in all analysed aspects. Moreover, the estimates from the TS method are more resistant to the choice of probability distribution, as demonstrated by case studies of Polish rivers.
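A toy sketch of the Two-Stage idea, under simplifying assumptions (ordinary rather than weighted least squares for the trend, and a trend in the mean only): detrend first, then fit the GEV to the residuals by L-moments using Hosking's rational approximation.

```python
import numpy as np
from scipy.special import gamma as Gamma
from scipy.stats import genextreme

rng = np.random.default_rng(1)

def lmom_gev(x):
    """GEV parameters from sample L-moments (Hosking's rational approximation)."""
    x = np.sort(x)
    n = x.size
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1, l2 = b0, 2 * b1 - b0
    t3 = (6 * b2 - 6 * b1 + b0) / l2                 # L-skewness
    c = 2.0 / (3.0 + t3) - np.log(2) / np.log(3)
    k = 7.8590 * c + 2.9554 * c**2                   # shape (Hosking's k = scipy's c)
    alpha = l2 * k / ((1 - 2.0**(-k)) * Gamma(1 + k))
    xi = l1 - alpha * (1 - Gamma(1 + k)) / k
    return xi, alpha, k                              # location, scale, shape

# GEV sample whose mean drifts linearly in time (a nonstationary parent).
n = 60
t = np.arange(n) / n
x = genextreme.rvs(c=0.1, loc=10 + 3 * t, scale=2.0, size=n, random_state=rng)

# Two-stage estimate: (1) least-squares trend in the mean (the paper uses
# weighted least squares), (2) L-moments on the detrended sample.
A = np.column_stack([np.ones(n), t])
beta, *_ = np.linalg.lstsq(A, x, rcond=None)
xi, alpha, k = lmom_gev(x - A @ beta)
print(f"trend slope {beta[1]:.2f}, GEV shape (scipy c) {k:.2f}")
```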

3.
Long-term time-dependent stochastic modelling of extreme waves
This paper presents a literature survey on time-dependent statistical modelling of extreme waves and sea states. The focus is twofold: statistical modelling of extreme waves, and space- and time-dependent statistical modelling. The first part consists of a literature review of statistical modelling of extreme waves and wave parameters, most notably the modelling of extreme significant wave height. The second part focuses on statistical modelling of time- and space-dependent variables in a more general sense, covering the methodology and models used in other relevant application areas as well. It was found that limited effort has been put into developing statistical models for waves that incorporate spatial and long-term temporal variability, and it is suggested that model improvements could be achieved by adopting approaches from other application areas. In particular, Bayesian hierarchical space–time models were identified as promising tools for spatio-temporal modelling of extreme waves. Finally, a review of projections of future extreme wave climate is presented.

4.
It is common in geostatistics to use the variogram to describe the spatial dependence structure and to use kriging as the spatial prediction methodology. Both methods are sensitive to outlying observations and are strongly influenced by the marginal distribution of the underlying random field. Hence, they lead to unreliable results when applied to extreme value or multimodal data. As an alternative to traditional spatial modeling and interpolation, we consider the use of copula functions. This paper extends existing copula-based geostatistical models. We show how location-dependent covariates, e.g. a spatial trend, can be accounted for in spatial copula models. Furthermore, we introduce geostatistical copula-based models that are able to deal with random fields having discrete marginal distributions. We propose three different copula-based spatial interpolation methods. First, by exploiting the relationship between bivariate copulas and indicator covariances, we present indicator kriging and disjunctive kriging. As a second method, we present simple kriging of the rank-transformed data. The third method is a plug-in prediction that generalizes the frequently applied trans-Gaussian kriging. Finally, we report on the results obtained for the so-called Helicopter data set, which contains extreme radioactivity measurements.
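A small sketch of the second interpolation method, simple kriging of rank-transformed data, with assumed ingredients (an exponential covariance on the Gaussian scale, synthetic skewed observations, and a hypothetical prediction location); the back-transform goes through the empirical quantiles of the data.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.stats import norm, rankdata

rng = np.random.default_rng(2)
xy = rng.uniform(0, 10, size=(40, 2))            # observation locations
z = np.exp(rng.normal(size=40))                  # skewed observations

# Normal-score (rank) transform: map ranks to standard-normal quantiles.
y = norm.ppf(rankdata(z) / (z.size + 1))

def cov_exp(h, corr_len=3.0):
    """Assumed exponential covariance on the Gaussian scale."""
    return np.exp(-h / corr_len)

x0 = np.array([[5.0, 5.0]])                      # prediction location
C = cov_exp(cdist(xy, xy))
c0 = cov_exp(cdist(xy, x0)).ravel()
w = np.linalg.solve(C + 1e-8 * np.eye(len(z)), c0)   # simple-kriging weights
y_hat = w @ y                                    # prediction, Gaussian scale

# Back-transform through the empirical quantile function of the data.
z_hat = np.quantile(z, norm.cdf(y_hat))
print("rank-kriging prediction at x0:", z_hat)
```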

5.
In the present paper a statistical model for extreme value analysis is developed that accounts for seasonality. The model is applied to significant wave height data from the N. Aegean Sea. To build this model, a non-stationary point process is used which incorporates, apart from a time-varying threshold and harmonic functions with a period of one year, a component μw(t) estimated through the wavelet transform. The wavelet transform has a dual role in the present study. It detects the significant "periodicities" of the signal by means of the wavelet global and scale-averaged power spectra, and is then used to reconstruct the part of the time series, μw(t), represented by these significant features. A number of candidate models, which incorporate μw(t) in their location and scale parameters, are tried. To avoid overparameterisation, an automatic model selection procedure based on the Akaike information criterion is carried out. The best model is evaluated graphically by means of diagnostic plots. Finally, "aggregated" return levels with return periods of 20, 50 and 100 years, as well as time-dependent quantiles, are estimated by combining the results of the wavelet analysis and the Poisson process model, identifying a significant reduction in return level estimation uncertainty compared to simpler non-stationary models.

6.
Regional models of extreme rainfall must address the spatial variability induced by orographic obstacles. However, the proper detection of orographic effects often depends on the availability of a well-designed rain gauge network. The aim of this study is to investigate a new method for identifying and characterizing the effects of orography on the spatial structure of extreme rainfall at the regional scale, including where rainfall data are lacking or fail to describe rainfall features thoroughly. We analyse the annual maxima of daily rainfall data in the Campania region, an orographically complex region in Southern Italy, and introduce a statistical procedure to identify spatial outliers in a low-order statistic (namely, the mean). The locations of these outliers are then compared with a pattern of orographic objects identified a priori through an automatic geomorphological procedure. The results show a direct and clear link between a particular set of orographic objects and a local increase in the spatial variability of extreme rainfall. This analysis allowed us to objectively identify areas where orography produces enhanced variability in extreme rainfall. It has direct implications for rain gauge network design criteria and has led to promising developments in the regional analysis of extreme rainfall. Copyright © 2015 John Wiley & Sons, Ltd.

7.
Simulation of the rainfall-runoff process in urban areas is of great importance, considering the consequences and damage of extreme runoff events and floods. The first issue in flood hazard analysis is rainfall simulation. Large-scale climate signals have proved effective in rainfall simulation and prediction. In this study, an integrated scheme is developed for rainfall-runoff modeling that considers different sources of uncertainty. This scheme includes three main steps: rainfall forecasting, rainfall-runoff simulation, and future runoff prediction. In the first step, data-driven models are developed and used to forecast rainfall using large-scale climate signals as rainfall predictors. Because different sources of uncertainty strongly affect the output of hydrologic models, in the second step the uncertainty associated with input data, model parameters, and model structure is incorporated into rainfall-runoff modeling and simulation. Three rainfall-runoff simulation models are developed to account for conceptual (structural) model uncertainty in real-time runoff forecasting. To analyze the uncertainty of the model structure, streamflows generated by the alternative rainfall-runoff models are combined through a weighting method based on K-means clustering. Model parameter and input uncertainty are investigated using an adaptive Markov chain Monte Carlo method. Finally, the calibrated rainfall-runoff models are driven with the forecasted rainfall to predict future runoff for the watershed. The proposed scheme is applied to the case study of the Bronx River watershed, New York City. The uncertainty analysis reveals that simultaneous estimation of model parameters and input uncertainty significantly changes the probability distribution of the model parameters. It is also observed that combining the outputs of the hydrological models with the proposed clustering scheme improves the accuracy of runoff simulation in the watershed by up to 50% compared with the individual models. The results indicate that the developed methodology not only provides reliable tools for rainfall and runoff modeling but also adequate lead time for implementing mitigation measures against potentially extreme runoff events and flood hazards. The results of this study can also be used to identify the main factors affecting flood hazard analysis.

8.
Mixed extreme wave climate model for reanalysis databases
Hindcast or wave reanalysis databases (WRDB) constitute a powerful alternative to instrumental records in the design of offshore and coastal structures, since they offer important advantages for the statistical characterization of wave climate variables, such as continuous long-term records of significant wave heights, mean and peak periods, etc. However, reanalysis data are less accurate than instrumental records, making extreme value analyses derived from WRDB prone to underpredicting design return period values. This paper proposes a mixed extreme value model to deal with maxima, which takes full advantage of both (i) hindcast or wave reanalysis data and (ii) instrumental records, reducing the uncertainty in its predictions. The resulting mixed model consistently merges the information given by both kinds of data sets, and it can be applied with any extreme value analysis distribution, such as generalized extreme value, peaks over threshold or Pareto–Poisson. The methodology is illustrated using both synthetically generated and real data, the latter taken from a given location on the northern Spanish coast.
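A hedged sketch of one way such a mixed likelihood can be set up (the paper's exact formulation may differ): instrumental and reanalysis annual maxima share a single GEV, and a location offset delta absorbs the reanalysis bias, so both records inform one return-level estimate.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

rng = np.random.default_rng(3)
# Hypothetical annual-maximum significant wave heights: a long, biased
# reanalysis record plus a short, accurate instrumental record.
hs_inst = genextreme.rvs(c=-0.1, loc=6.0, scale=0.8, size=15, random_state=rng)
hs_rean = genextreme.rvs(c=-0.1, loc=5.6, scale=0.8, size=60, random_state=rng)

def joint_nll(p):
    """One GEV for the true climate; reanalysis maxima carry a bias offset delta."""
    mu, log_sig, xi, delta = p
    sig = np.exp(log_sig)
    ll = genextreme.logpdf(hs_inst, c=-xi, loc=mu, scale=sig).sum()
    ll += genextreme.logpdf(hs_rean, c=-xi, loc=mu - delta, scale=sig).sum()
    return -ll

res = minimize(joint_nll, x0=[6.0, 0.0, 0.05, 0.0], method="Nelder-Mead")
mu, log_sig, xi, delta = res.x

# 100-year return level from the merged fit, on the instrumental scale.
rl100 = genextreme.isf(1 / 100, c=-xi, loc=mu, scale=np.exp(log_sig))
print(f"reanalysis bias ~ {delta:.2f} m, 100-yr Hs ~ {rl100:.2f} m")
```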

9.
In climatology and hydrology, univariate Extreme Value Theory has become a powerful tool to model the distribution of extreme events. The Generalized Pareto Distribution (GPD) is routinely applied to model excesses in space or time by letting the two GPD parameters depend on appropriate covariates. Two possible pitfalls of this strategy are the modeling and the interpretation of the scale and shape GPD parameter estimates, which are often, and incorrectly, viewed as independent variables. In this note we first recall a statistical technique that makes the GPD estimates less correlated within a Maximum Likelihood (ML) estimation approach. In a second step we propose novel reparametrizations for two method-of-moments approaches particularly popular in hydrology: the Probability Weighted Moment (PWM) method and its generalized version (GPWM). Finally, these three inference methods (ML, PWM and GPWM) are compared and discussed with respect to the issue of correlations.
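For reference, a compact PWM fit of the GPD following Hosking and Wallis (1987), using the probability-weighted moments a_r = E[X(1−F(X))^r]; the reparametrizations proposed in the note itself are not reproduced here.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(4)
exc = genpareto.rvs(c=0.2, scale=1.5, size=500, random_state=rng)  # excesses

# Probability-weighted moments a_r = E[X (1 - F(X))^r] for r = 0, 1.
x = np.sort(exc)
n = x.size
i = np.arange(1, n + 1)
a0 = x.mean()
a1 = np.sum((n - i) / (n - 1) * x) / n

# PWM estimators of the GPD (Hosking's k; scipy's shape is c = -k).
k = a0 / (a0 - 2 * a1) - 2
sigma = 2 * a0 * a1 / (a0 - 2 * a1)
print(f"PWM fit: shape c = {-k:.3f} (true 0.2), scale = {sigma:.3f} (true 1.5)")
```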

10.
Extreme value analysis of precipitation is of great importance for several types of engineering studies and policy decisions. For return level estimation of extreme 24-h precipitation, practitioners often use daily measurements (usually 08:00–08:00 local time), since high-frequency measurements are scarce. Annual maxima of daily series are smaller than or equal to continuous 24-h precipitation maxima, so the resulting return levels may be systematically underestimated. In this paper we use a rule, derived earlier, for converting the generalized extreme value (GEV) distribution of daily maxima to that of 24-h maxima. We develop an estimator for the conversion exponent by combining daily maxima and high-frequency sampled 24-h maxima in one joint log-likelihood. Once the conversion exponent has been estimated, GEV parameters of 24-h maxima can be obtained at sites where only daily data are available. The new methodology has also been extended to spatial regression models.

11.
Droughts, like many climatic and environmental phenomena, are events with a random duration. In the monitoring and risk management of this type of phenomenon, it is important to develop measures of the risk that an ongoing event will end. This work develops a risk measure conditional on the current state of the event that can be easily updated in real time. The measure is based on the hazard function of the duration of an event, which is modeled as a parametric function of covariates describing the current state of the process. The use of (time-dependent) internal covariates is often required to describe that state, in which case maximum likelihood methods cannot be used to estimate the model. Therefore, an approach based on partial likelihood functions that permits the inclusion of both external and internal covariates is suggested. This approach is very general, but it has the drawback of requiring some programming to implement. However, it is proved that for durations with a geometric distribution, an equivalent and easily implemented approach based on generalized linear models can be used to estimate the hazard function. This methodology is applied to develop a risk measure in drought analysis. The approach is exemplified using the drought series from a Spanish location (Huesca) and internal covariates derived from the rainfall series. The whole modeling process is thoroughly described, including the covariate selection procedure and some new validation tools.
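A sketch of the geometric-duration shortcut using a binomial GLM (here via statsmodels, with synthetic drought-months and a hypothetical rainfall-anomaly covariate): every month of an ongoing drought contributes one Bernoulli observation "did the event end?", and the fitted model yields a real-time hazard.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
# Hypothetical data: for each month inside an ongoing drought, record whether
# the drought ended that month (1/0) and a covariate, e.g. a rainfall anomaly.
n = 400
rain_anom = rng.normal(0.0, 1.0, n)
ended = rng.binomial(1, 1 / (1 + np.exp(-(-2.0 + 1.2 * rain_anom))))

# Geometric-duration hazard as a binomial GLM with logit link: every
# drought-month is one Bernoulli trial "does the event end now?".
X = sm.add_constant(rain_anom)
fit = sm.GLM(ended, X, family=sm.families.Binomial()).fit()
print(fit.params)                    # intercept, rainfall-anomaly effect

# Real-time update: hazard of termination given this month's anomaly.
x_now = sm.add_constant(np.array([0.8]), has_constant="add")
print("P(drought ends this month):", float(fit.predict(x_now)[0]))
```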

12.
Large observed datasets are often non-stationary and/or dependent on covariates, especially in the case of extreme hydrometeorological variables. This makes estimation by classical hydrological frequency analysis difficult. A number of non-stationary models have been developed that use linear or quadratic polynomial functions or B-spline functions to estimate the relationship between parameters and covariates. In this article, we propose a regularised generalized extreme value model with B-splines (the GEV-B-splines model) in a Bayesian framework to estimate quantiles. Regularisation is based on penalties and aims to favour parsimonious models, especially in high-dimensional settings. The penalties are introduced in a Bayesian framework and the corresponding priors are detailed. Five penalties are considered and the corresponding priors are developed for comparison purposes: the least absolute shrinkage and selection operator (Lasso), Ridge, and smoothly clipped absolute deviation (SCAD) methods (SCAD1, SCAD2 and SCAD3). Markov chain Monte Carlo (MCMC) algorithms are developed for each model to estimate quantiles and their posterior distributions. The approaches are tested and illustrated using simulated data with different sample sizes. A first simulation used polynomial B-spline functions in order to choose the most efficient model in terms of the relative mean bias (RMB) and relative mean error (RME) criteria. A second simulation was performed with the SCAD1 penalty for a sinusoidal dependence to illustrate the flexibility of the proposed approach. Results show clearly that the regularised approaches lead to a significant reduction of the bias and the mean square error, especially for small sample sizes (n < 100). A case study models annual peak flows at the Fort-Kent catchment with total annual precipitation as covariate. Conditional quantile curves are given for the regularised and maximum likelihood methods.
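A rough frequentist stand-in for the paper's Bayesian GEV-B-splines machinery, under stated assumptions (penalized maximum likelihood instead of MCMC, an L1 penalty on coefficient differences instead of the Lasso/SCAD priors, and an arbitrary penalty weight): the GEV location is a cubic B-spline in the covariate.

```python
import numpy as np
from scipy.interpolate import BSpline
from scipy.optimize import minimize
from scipy.stats import genextreme

rng = np.random.default_rng(6)
n = 120
cov = np.sort(rng.uniform(0, 1, n))       # covariate, e.g. scaled annual precip
x = genextreme.rvs(c=-0.1, loc=50 + 20 * np.sin(2 * np.pi * cov), scale=5.0,
                   size=n, random_state=rng)

# Cubic B-spline design matrix for the location parameter mu(cov).
knots = np.r_[[0.0] * 4, np.linspace(0.25, 0.75, 3), [1.0] * 4]
nb = len(knots) - 4                       # number of basis functions
B = BSpline(knots, np.eye(nb), 3)(cov)    # n x nb design matrix

lam = 2.0                                 # penalty weight (assumed)

def pen_nll(p):
    """GEV log-likelihood with mu = B @ beta plus an L1 roughness penalty."""
    beta, log_sig, xi = p[:nb], p[nb], p[nb + 1]
    nll = -genextreme.logpdf(x, c=-xi, loc=B @ beta,
                             scale=np.exp(log_sig)).sum()
    # L1 penalty on coefficient differences: shrinks mu(cov) toward a constant.
    return nll + lam * np.abs(np.diff(beta)).sum()

p0 = np.r_[np.full(nb, x.mean()), np.log(x.std()), 0.05]
res = minimize(pen_nll, p0, method="Nelder-Mead",
               options={"maxiter": 30000, "maxfev": 30000})
print("spline coefficients for mu(cov):", np.round(res.x[:nb], 1))
```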

13.
In the wavelet modulus maxima denoising algorithm, all high-frequency wavelet coefficients are treated as noise, ignoring the useful information they still contain; as a result, modulus maxima propagation points are wrongly selected and the computed noise variance still contains useful information. To address these problems, a wavelet modulus maxima denoising algorithm combining wavelet entropy with inter-scale correlation is proposed. The high-frequency wavelet coefficients are correlated across scales to locate the effective signal; the high-frequency coefficients at the largest scale are divided into several small intervals and the wavelet entropy of each interval is computed; the mean of the high-frequency coefficients in the interval with the largest wavelet entropy is taken as the noise variance, and the threshold at the largest scale is computed from the threshold formula proposed by Donoho; the modulus maxima locations obtained by thresholding are compared with the locations of useful information obtained from the correlation step, modulus maxima at matching locations are retained, and noise-induced maxima at non-matching locations are removed; an ad hoc algorithm is then used to search the modulus maxima scale by scale, and the signal is reconstructed with the alternating projection algorithm. The algorithm achieves adaptive threshold selection and effectively removes wrongly selected modulus maxima propagation points. Applying the proposed algorithm and traditional denoising methods to simulated signals, comparative analysis verifies its effectiveness.
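A simplified sketch of the entropy-based noise estimate with PyWavelets, under strong simplifications: the modulus-maxima tracking, the cross-scale correlation step, and the alternating-projection reconstruction are all replaced by plain soft thresholding, so only the maximum-entropy noise interval and the Donoho threshold are illustrated.

```python
import numpy as np
import pywt

rng = np.random.default_rng(7)
n = 1024
t = np.linspace(0, 1, n)
clean = np.sin(2 * np.pi * 5 * t) + (t > 0.5)       # smooth part plus an edge
noisy = clean + 0.2 * rng.normal(size=n)

# Stationary (undecimated) wavelet transform keeps coefficients time-aligned.
coeffs = pywt.swt(noisy, "db4", level=4)            # [(cA4, cD4), ..., (cA1, cD1)]
d1 = np.asarray(coeffs[-1][1])                      # finest-scale details

# Wavelet-entropy step: split the detail coefficients into intervals and treat
# the maximum-entropy interval as noise-dominated (noise energy is the most
# "disordered"); its standard deviation estimates the noise level.
blocks = d1.reshape(16, -1)
e = blocks**2
p = e / e.sum(axis=1, keepdims=True)
H = -(p * np.log(p + 1e-12)).sum(axis=1)
sigma = blocks[np.argmax(H)].std()

# Donoho's universal threshold, applied (softly) to all detail levels.
thr = sigma * np.sqrt(2 * np.log(n))
den = [(ca, pywt.threshold(cd, thr, mode="soft")) for ca, cd in coeffs]
rec = pywt.iswt(den, "db4")
print("RMSE noisy   :", np.sqrt(np.mean((noisy - clean) ** 2)))
print("RMSE denoised:", np.sqrt(np.mean((rec - clean) ** 2)))
```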

14.
Starting from a recent paper by Murshed (Stoch Environ Res Risk Assess 25:897–911, 2011), which showed the good performance of the Beta-k distribution in analyzing extreme hydrologic events, we propose two new four-parameter distribution functions strongly related to the Beta-k distribution, namely the Beta-Dagum and Beta-Singh-Maddala distributions. More specifically, the new distributions are generalizations of reparametrizations of the Beta-k and Beta-p distributions, respectively. For these distributions, particular interpretations in terms of maxima and minima of sequences of random variables can be derived, and the maximal and minimal domains of attraction can be obtained. Moreover, the method of maximum likelihood, the method of moments and the method of L-moments are examined to estimate the parameters. Finally, two applications to real data on maxima and minima of river flows are reported, showing the potential of these two models in extreme event analysis.
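For orientation, the generic beta-generated construction that underlies such families, written for a Burr XII (Singh-Maddala-type) baseline available in scipy; this is a hedged stand-in with its own parameter count, not the paper's exact four-parameter Beta-Singh-Maddala parametrization.

```python
import numpy as np
from scipy import stats

# Beta-generated family: if F is a baseline CDF, the "Beta-F" CDF is
# G(x) = I_{F(x)}(a, b), the regularised incomplete beta function at F(x).
# Here the baseline is Burr XII (Singh-Maddala type), available in scipy.
def beta_sm_cdf(x, a, b, c, d, scale=1.0):
    return stats.beta.cdf(stats.burr12.cdf(x, c, d, scale=scale), a, b)

def beta_sm_pdf(x, a, b, c, d, scale=1.0):
    # Chain rule: g(x) = beta.pdf(F(x); a, b) * f(x).
    u = stats.burr12.cdf(x, c, d, scale=scale)
    return stats.beta.pdf(u, a, b) * stats.burr12.pdf(x, c, d, scale=scale)

x = np.linspace(0.1, 10.0, 5)
print(beta_sm_cdf(x, a=2.0, b=1.5, c=2.0, d=3.0))
print(beta_sm_pdf(x, a=2.0, b=1.5, c=2.0, d=3.0))
```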

15.
Heavy rainfall events often occur in the southern French Mediterranean regions during autumn, leading to catastrophic floods. A non-stationary peaks-over-threshold (POT) model with climatic covariates for these heavy rainfall events is developed herein. A regional sample of events exceeding the threshold of 100 mm/d is built using daily precipitation data recorded at 44 stations over the period 1958–2008. The POT model combines a Poisson distribution for the occurrence and a generalized Pareto distribution for the magnitude of the heavy rainfall events. The selected covariates are the seasonal occurrence of southern circulation patterns for the Poisson distribution parameter, and monthly air temperature for the generalized Pareto distribution scale parameter. According to the deviance test, the non-stationary model provides a better fit to the data than a classical stationary model. Such a model, incorporating climatic covariates instead of time, makes it possible to re-evaluate the risk of extreme precipitation on a monthly and seasonal basis, and can also be used with climate model outputs to produce future scenarios. Existing scenarios of the future changes projected for the covariates included in the model are tested to evaluate possible future changes in extreme precipitation quantiles in the study area.


Citation: Tramblay, Y., Neppel, L., Carreau, J., and Najib, K., 2013. Non-stationary frequency analysis of heavy rainfall events in southern France. Hydrological Sciences Journal, 58 (2), 280–294.
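A sketch of the magnitude half of such a POT model, with synthetic exceedances and a hypothetical monthly air temperature as covariate on a log-linear GPD scale, plus the deviance test against the stationary fit mentioned in the abstract.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genpareto

rng = np.random.default_rng(8)
# Hypothetical POT sample: exceedances above 100 mm/d, with the monthly air
# temperature (degC) at each event as covariate.
n = 200
temp = rng.uniform(5, 25, n)
excess = genpareto.rvs(c=0.1, scale=np.exp(2.0 + 0.05 * temp), size=n,
                       random_state=rng)

def nll(p):
    """GPD for magnitudes with a log-linear scale in temperature."""
    a, b, xi = p
    return -genpareto.logpdf(excess, c=xi, scale=np.exp(a + b * temp)).sum()

res = minimize(nll, x0=[2.0, 0.0, 0.05], method="Nelder-Mead")
a, b, xi = res.x

# Deviance test against the stationary model (b = 0), as in the abstract.
res0 = minimize(lambda q: nll([q[0], 0.0, q[1]]), x0=[2.0, 0.05],
                method="Nelder-Mead")
print(f"slope b = {b:.3f}, deviance = {2 * (res0.fun - res.fun):.1f}")
```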

16.
A Fast and Reliable Method for Surface Wave Tomography
We describe a method to invert regional- or global-scale surface-wave group or phase-velocity measurements to estimate 2-D models of the distribution and strength of isotropic and azimuthally anisotropic velocity variations. Such maps have at least two purposes in monitoring the Comprehensive Nuclear-Test-Ban Treaty (CTBT): (1) they can be used as data to estimate the shear velocity of the crust and uppermost mantle and the topography of internal interfaces, which are important in event location; and (2) they can be used to estimate surface-wave travel-time correction surfaces for phase-matched filters designed to extract surface-wave packets with low signal-to-noise ratios.

The purpose of this paper is to describe one useful path through the large number of options available in an inversion of surface-wave data. Our method appears to provide robust and reliable dispersion maps on both global and regional scales. The technique we describe has a number of features that have motivated its development and commend its use: (1) it is developed in a spherical geometry; (2) the region of inference is defined by an arbitrary simple closed curve, so the method works equally well on local, regional, or global scales; (3) spatial smoothness and model amplitude constraints can be applied simultaneously; (4) the selection of the model regularization and smoothing parameters is highly flexible, which allows the effect of variations in these parameters to be assessed; (5) the method allows for the simultaneous estimation of the spatial resolution and amplitude bias of the images; and (6) the method optionally allows for the estimation of azimuthal anisotropy.

We present examples of the application of this technique to observed surface-wave group and phase velocities globally and regionally across Eurasia and Antarctica.
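A toy regularised inversion illustrating feature (3), simultaneous smoothness and amplitude constraints, on a 1-D model with synthetic "ray" averages; the real method works in spherical geometry with far richer options, so this is only a schematic of the regularisation idea.

```python
import numpy as np

rng = np.random.default_rng(9)
# Toy linear tomography: travel-time-like data d = G m + noise, where each row
# of G spreads sensitivity over a few neighbouring cells of a 1-D model.
n_cells, n_data = 50, 120
G = np.zeros((n_data, n_cells))
for r in range(n_data):
    j = rng.integers(0, n_cells - 5)
    G[r, j:j + 5] = 1.0 / 5                    # a short "ray" averaging 5 cells
m_true = np.sin(np.linspace(0, 3 * np.pi, n_cells))
d = G @ m_true + 0.02 * rng.normal(size=n_data)

# Regularised inversion: minimise |Gm - d|^2 + a^2 |m|^2 + b^2 |Lm|^2, where L
# is a first-difference operator enforcing spatial smoothness, so amplitude
# and smoothness constraints act simultaneously.
a, b = 0.05, 0.5
L = np.eye(n_cells, k=1)[:-1] - np.eye(n_cells)[:-1]
A = np.vstack([G, a * np.eye(n_cells), b * L])
rhs = np.concatenate([d, np.zeros(n_cells), np.zeros(n_cells - 1)])
m_hat = np.linalg.lstsq(A, rhs, rcond=None)[0]
print("data misfit:", np.linalg.norm(G @ m_hat - d))
```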

17.
Multicomponent probability distributions such as the two-component Gumbel distribution are sometimes applied to annual flood maxima when individual floods are seen as belonging to different classes, depending on physical processes or time of year. However, hydrological inconsistencies may arise if only nonclassified annual maxima are available to estimate the component distribution parameters. In particular, an unconstrained best fit to annual flood maxima may yield some component distributions with a high probability of simulating floods with negative discharge. In such situations, multicomponent distributions cannot be justified as an improved approximation to a local physical reality of mixed flood types, even though a good data fit is achieved. This effect usefully illustrates that a good match to data is no guarantee against degeneracy of hydrological models. Copyright © 2016 John Wiley & Sons, Ltd.
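A worked example of the diagnostic the abstract implies: for hypothetical fitted Gumbel components, the probability of simulating a negative discharge is simply the component CDF evaluated at zero.

```python
import math

def gumbel_cdf(x, mu, beta):
    """CDF of the Gumbel (EV1) distribution: exp(-exp(-(x - mu)/beta))."""
    return math.exp(-math.exp(-(x - mu) / beta))

# Hypothetical unconstrained two-component fit: a dominant component plus a
# wide minor component whose scale is large relative to its location.
components = {"major": (300.0, 80.0), "minor": (50.0, 200.0)}
for name, (mu, beta) in components.items():
    print(f"{name}: P(Q < 0) = {gumbel_cdf(0.0, mu, beta):.3f}")
# The minor component assigns roughly 28% probability to negative discharge,
# the kind of hydrological inconsistency the abstract warns about.
```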

18.
Extreme value theory for the maximum of a time series of daily precipitation amounts is described. A chain-dependent process is assumed as the stochastic model for daily precipitation, with the intensity distribution being the gamma distribution. To examine how the effective return period for extremely high precipitation amounts would change as the parameters of the chain-dependent process change (i.e., the probability of a wet day and the shape and scale parameters of the gamma distribution), a sensitivity analysis is performed. This sensitivity analysis is guided by results from statistical downscaling that relate patterns in large-scale atmospheric circulation to local precipitation, providing a physically plausible range of changes in the parameters. For the particular location considered in the example, the effective return period is most sensitive to the scale parameter of the intensity distribution.
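A sketch of the sensitivity analysis under the usual simplification that day-to-day dependence is ignored, with assumed parameter values: the effective return period of a high daily amount as a function of the wet-day probability and the gamma intensity parameters.

```python
import numpy as np
from scipy.stats import gamma

# Annual maximum of daily precipitation under a simplified chain-dependent
# process: wet days occur with probability pw and wet-day amounts are gamma.
# Ignoring day-to-day dependence, P(annual max <= x) ~ [1 - pw*(1 - F(x))]^365,
# so the effective return period of level x is 1 / (1 - that probability).
def return_period(x, pw, shape, scale):
    p_exceed_day = pw * gamma.sf(x, a=shape, scale=scale)
    p_annual = (1 - p_exceed_day) ** 365
    return 1.0 / (1.0 - p_annual)

base = dict(pw=0.3, shape=0.8, scale=10.0)          # assumed parameter values
x = 80.0                                            # mm, a high daily amount
print("baseline T :", return_period(x, **base))

# Sensitivity: a 10% increase in the gamma scale parameter shortens the
# return period far more than a 10% change in the wet-day probability.
print("pw +10%    :", return_period(x, 0.33, 0.8, 10.0))
print("scale +10% :", return_period(x, 0.3, 0.8, 11.0))
```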

19.
Gully delineation is a critical aspect of accurately determining soil losses, but the associated methodologies are rarely detailed. Here, we describe a new gully mapping method, the normalized topographic method (NorToM), based on processing digital elevation model (DEM) data, and we assess the associated errors when it is applied over a range of geomorphological scales. The NorToM is underpinned by two gully detection variables (normalized slope and elevation) calculated over local windows of prescribed size, and a group of filtering variables. For four study sites, DEMs of gullies were obtained using field and airborne photo-reconstruction and evaluated using total station and differential global positioning system (dGPS) surveys. NorToM provided accurate areal and volume estimates at the individual gully scale, but differences increased at the larger gully-system and gully-network scales. We were able to identify optimal parameters for using the NorToM approach and so confirm that it represents a useful scale-independent means of gully mapping that is likely to be valid in other environments. Its main limitations are that the normalization process might be time-consuming at regional scales and that it requires a fixed window size when applied to landforms with extreme variations in dimensions. Copyright © 2014 John Wiley & Sons, Ltd.
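A sketch of one of the two NorToM detection variables, normalized elevation over a local moving window, computed on a toy DEM with scipy.ndimage; the window size and detection threshold are illustrative assumptions, and the normalized-slope variable and filtering steps are omitted.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(10)
# Toy DEM: a gentle regional slope with an incised channel ("gully") added.
ny, nx = 200, 200
x = np.arange(nx)
dem = np.tile(0.05 * x, (ny, 1)) + 0.3 * rng.normal(size=(ny, nx))
dem[95:105, 20:180] -= 3.0                        # carve a gully-like trench

# Normalized elevation over a local moving window: 0 at the local minimum,
# 1 at the local maximum.
w = 31                                            # window size (assumed)
zmin = ndimage.minimum_filter(dem, size=w)
zmax = ndimage.maximum_filter(dem, size=w)
z_norm = (dem - zmin) / np.maximum(zmax - zmin, 1e-9)

# Candidate gully cells: locally low, i.e. small normalized elevation.
gully = z_norm < 0.15                             # threshold is illustrative
print("flagged cells:", gully.sum(), "of", gully.size)
```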

20.
Local extreme rainfall often results in disasters such as flash floods and landslides. To date, accurately predicting such events remains one of the most difficult tasks for operational weather forecast centers. In this paper, we simulate an extreme precipitation event with ensemble Kalman filter (EnKF) assimilation of Doppler radial-velocity observations, and analyze the uncertainties of the assimilation. The results demonstrate that, without assimilating radar data, neither a deterministic forecast from a single initialization nor an ensemble forecast with added perturbations or multiple physical parameterizations can predict the location of the strong precipitation. However, the forecast was significantly improved by assimilating radar data, especially the precipitation location. The direct cause of the improvement is the buildup of a deep mesoscale convective system through EnKF assimilation of the radar data. Under a large-scale background favorable for mesoscale convection, effective perturbations of the upstream mid-to-low-level meridional wind and moisture are key factors for the assimilation and forecast. Uncertainty still exists in the forecast of this case due to its limited predictability. Both differences in the large-scale initial fields and differences in the EnKF analyses arising from small-amplitude initial perturbations can critically influence the event's prediction. The forecast could be improved through more cycles of EnKF assimilation. Sensitivity tests also suggest that more accurate forecasts can be expected from improved numerical models and observations.
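For concreteness, a minimal stochastic EnKF analysis step on a toy linear problem; the paper's setting (a full NWP model state with Doppler radial-velocity observation operators and cycling) is far richer, so this only sketches the update equations.

```python
import numpy as np

rng = np.random.default_rng(11)
# Toy sizes: in the paper the state is a full model state and y are Doppler
# radial-velocity observations; here both are small random stand-ins.
n_state, n_obs, n_ens = 40, 10, 30
H = np.zeros((n_obs, n_state))
H[np.arange(n_obs), np.arange(0, n_state, 4)] = 1.0   # observe every 4th cell
R = 0.5 * np.eye(n_obs)                               # obs-error covariance
X = rng.normal(size=(n_state, n_ens))                 # forecast ensemble
y = rng.normal(size=n_obs)                            # observations

# Ensemble perturbations and sample covariances.
Xp = X - X.mean(axis=1, keepdims=True)
Pf_Ht = Xp @ (H @ Xp).T / (n_ens - 1)                 # Pf H^T
S = H @ Pf_Ht + R                                     # innovation covariance
K = Pf_Ht @ np.linalg.inv(S)                          # Kalman gain

# Stochastic EnKF: update each member with a perturbed observation.
Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, size=n_ens).T
Xa = X + K @ (Y - H @ X)
print("analysis/forecast spread ratio:",
      Xa.std(axis=1).mean() / X.std(axis=1).mean())
```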
