Similar articles
A total of 20 similar articles were retrieved.
1.

Using homogenized daily observations from 669 meteorological stations in the Yangtze River Basin over the past 60 years (1961–2020), and an extreme-value analysis that combines relative and absolute thresholds, this study identifies extreme high-temperature, extreme low-temperature, extreme drought and extreme precipitation events in the basin and analyses their annual frequencies and linear trends. On this basis, and taking the occurrence of extreme climate events across China into account, a composite hazard-level index for multiple extreme climate events is constructed to give a relatively objective rating of the overall hazard posed by extreme climate events in the Yangtze River Basin. The results show that, relative to other regions of China, most of the basin has a high composite extreme-climate hazard level; although the composite annual frequency has shown a weak linear decrease since 1961, since the 1990s the hazard of extreme climate events in the basin has been markedly higher than in other parts of the country. Analysis of the hazard and variability of the individual event types shows the following. The annual frequency of extreme drought events in the basin has decreased linearly over the past 60 years, yet compared with other regions of China most of the basin still has a drought hazard level of at least medium, indicating that the basin remains prone to extreme drought. The annual frequency of extreme precipitation events has increased weakly; the hazard-index analysis shows that the high-hazard areas lie mainly in the middle and lower reaches of the Yangtze, with western Hunan, most of Jiangxi and southern Hubei facing a very high hazard of extreme precipitation. Extreme high-temperature events have increased significantly over most of the basin in the past 60 years, and have become especially frequent since the beginning of the 21st century, although the hazard level remains low relative to other regions of China. Extreme low-temperature events have decreased significantly, but their hazard relative to other regions has risen markedly. Since the start of the 21st century the composite hazard of extreme climate events in the Yangtze River Basin has kept increasing: compound heat–drought events, in which extreme high temperatures and extreme drought occur together, have become frequent, the basin's share of the national totals of extreme precipitation and extreme low-temperature events has kept rising, and the resulting socio-economic impacts have become increasingly severe, underscoring the importance and urgency of strengthening risk prevention for extreme climate events in the basin.
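As a minimal illustration of the combined relative/absolute threshold criterion described above (the station data, the 95th-percentile and the 50 mm values below are assumptions, not the paper's actual definitions), a Python sketch of extreme-precipitation-day identification and annual frequency counting might look like this:

```python
import numpy as np

def extreme_precip_days(daily_precip, rel_pct=95.0, abs_mm=50.0):
    """Flag extreme-precipitation days with a combined criterion:
    a day is extreme if it exceeds the station's wet-day 95th
    percentile (relative threshold) OR a fixed 50 mm total
    (absolute threshold).  Both thresholds are illustrative."""
    daily_precip = np.asarray(daily_precip, dtype=float)
    wet = daily_precip[daily_precip >= 1.0]              # wet days (>= 1 mm)
    rel_thresh = np.percentile(wet, rel_pct) if wet.size else np.inf
    return (daily_precip >= rel_thresh) | (daily_precip >= abs_mm)

# Example: annual frequency of extreme days from a synthetic 1961-2020 record
rng = np.random.default_rng(0)
years = np.repeat(np.arange(1961, 2021), 365)
precip = rng.gamma(shape=0.3, scale=8.0, size=years.size)  # mm/day, synthetic
flags = extreme_precip_days(precip)
annual_freq = np.array([flags[years == yr].sum() for yr in np.unique(years)])
trend = np.polyfit(np.unique(years), annual_freq, 1)[0]    # linear trend (days/yr)
```

The annual counts produced this way are the kind of series whose linear trend and hazard level the study examines.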


2.
There is an urgent need for the development and implementation of modern statistical methodology for long-term risk assessment of extreme hydrological hazards in the Caribbean. Notwithstanding the inevitable scarcity of data relating to extreme events, recent results and approaches call into question standard methods of estimation of the risks of environmental catastrophes that are currently adopted. Estimation of extreme hazards is often based on the Gumbel model and on crude methods for estimating predictive probabilities. In both cases the result is often a remarkable underestimation of the predicted probabilities for disasters of large magnitude. Simplifications do not stop here: assumptions of data homogeneity and temporal independence are usually made regardless of potential inconsistencies with genuine process behaviour and the fact that results may be sensitive to such mis-specifications. These issues are of particular relevance for the Caribbean, given its exposure to diverse meteorological climate conditions. In this article we present an examination of predictive methodologies for the assessment of long-term risks of hydrological hazards, with particular focus on applications to rainfall and flooding, motivated by three data sets from the Caribbean region. Consideration is given to classical and Bayesian methods of inference for annual maxima and daily peaks-over-threshold models. We also examine situations where data homogeneity is compromised by an unknown seasonal structure, and the situation in which the process under examination has a physical upper limit. We highlight the fact that standard Gumbel analyses routinely assign near-zero probability to subsequently observed disasters, and that for San Juan, Puerto Rico, standard 100-year predicted rainfall estimates may be routinely underestimated by a factor of two.
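The underestimation attributed to the Gumbel model can be illustrated with a small, hedged sketch: synthetic annual maxima are drawn from a heavy-tailed GEV and the 100-year return level is estimated under both a Gumbel and a GEV fit. The data and parameter values are invented for illustration and do not come from the Caribbean records used in the paper.

```python
import numpy as np
from scipy import stats

# Synthetic annual-maximum daily rainfall (mm) with a heavy upper tail;
# c = -0.2 in SciPy's convention corresponds to a Frechet-type GEV.
annual_max = stats.genextreme.rvs(c=-0.2, loc=100, scale=30,
                                  size=60, random_state=42)

# Fit both models by maximum likelihood.
gum_loc, gum_scale = stats.gumbel_r.fit(annual_max)
gev_c, gev_loc, gev_scale = stats.genextreme.fit(annual_max)

# 100-year return level = quantile with annual exceedance probability 1/100.
p = 1 - 1.0 / 100
rl_gumbel = stats.gumbel_r.ppf(p, gum_loc, gum_scale)
rl_gev = stats.genextreme.ppf(p, gev_c, gev_loc, gev_scale)
print(f"100-year level: Gumbel {rl_gumbel:.0f} mm, GEV {rl_gev:.0f} mm")
```

With a genuinely heavy-tailed parent, the Gumbel return level typically falls well below the GEV one, which is the direction of bias the abstract describes.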

3.
The coastal zones are facing the prospect of changing storm surge statistics due to anthropogenic climate change. In the present study, we examine these prospects for the North Sea based on numerical modelling. The main tool is the barotropic tide-surge model TRIMGEO (Tidal Residual and Intertidal Mudflat Model), used to derive storm surge climate and extremes from atmospheric conditions. The analysis is carried out using an ensemble of four 30-year atmospheric regional simulations under present-day and possible future-enhanced greenhouse gas conditions. The atmospheric regional simulations were prepared within the EU project PRUDENCE (Prediction of Regional scenarios and Uncertainties for Defining EuropeaN Climate change risks and Effects). The research strategy of PRUDENCE is to compare simulations of different regional models driven by the same global control and climate change simulations. These global conditions, representative of 1961–1990 and 2071–2100, were prepared by the Hadley Centre based on the IPCC A2 SRES scenario. The results suggest that under future climatic conditions, storm surge extremes may increase along the North Sea coast towards the end of this century. Based on a comparison between the results of the different ensemble members, as well as on the variability estimated from a high-resolution storm surge reconstruction of the recent decades, this increase is found to be significantly different from zero at the 95% confidence level for most of the North Sea coast. The exception is the east coast of the UK, which is not affected by this increase in storm surge extremes.

4.
Extreme rainfall events recently occurring in Korea have been shown to change frequency-based rainfall amounts quite significantly. Regardless of the reason for these extremes, the general concern of most hydrologists is how to handle these events for practical applications in hydrology. The aim of our study is to evaluate these extremes and their effect on frequency-based rainfall amounts, and in particular whether they can be assumed to lie within normal levels. As there is no commonly accepted methodology for this kind of study, we follow simplified steps: (1) estimation of the climatological variance of frequency-based rainfall amounts, (2) estimation of confidence intervals of frequency-based rainfall amounts (lower and upper bounds for the 5 and 1% significance levels estimated using the climatological variance), and (3) evaluation of the effect of extra rainfall events on the frequency-based rainfall amounts. Twelve stations on the Korean peninsula with relatively long records are selected, using annual maximum rainfall data collected from 1954 to 1998. From this study we conclude that (1) at least 30 years of data should be used for the frequency analysis in order to assure the stability of the variance of frequency-based rainfall amounts, (2) the estimated climatological variances all range from 5 to 8% of the frequency-based rainfall amounts, and (3) even though the frequency-based rainfall amount may appear to become extreme after seemingly abnormal events, it still remains under its upper bound for the 5 or 1% significance levels estimated using the climatological variance, and it decays exponentially back to the normal level as further events are added. Thus, there is no need to be alarmed by the seemingly abnormal events observed so far; we simply need to account for the variability inherent in frequency-based rainfall amounts.
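The paper's exact procedure for the climatological variance is not reproduced here, but a parametric bootstrap gives a comparable, hedged illustration of confidence bounds on a frequency-based (T-year) rainfall amount estimated from annual maxima; the Gumbel choice, record length and parameter values below are assumptions.

```python
import numpy as np
from scipy import stats

def t_year_rainfall(sample, T=100):
    """Gumbel T-year rainfall estimate from an annual-maximum sample."""
    loc, scale = stats.gumbel_r.fit(sample)
    return stats.gumbel_r.ppf(1 - 1.0 / T, loc, scale)

def bootstrap_bounds(sample, T=100, n_boot=2000, alpha=0.05, seed=1):
    """Parametric-bootstrap confidence bounds for the T-year estimate."""
    rng = np.random.default_rng(seed)
    loc, scale = stats.gumbel_r.fit(sample)
    reps = [t_year_rainfall(stats.gumbel_r.rvs(loc, scale, size=len(sample),
                                               random_state=rng), T)
            for _ in range(n_boot)]
    return np.percentile(reps, [100 * alpha / 2, 100 * (1 - alpha / 2)])

# Synthetic 45-year annual-maximum record (mm)
annual_max = stats.gumbel_r.rvs(loc=150, scale=40, size=45, random_state=0)
estimate = t_year_rainfall(annual_max)
lower, upper = bootstrap_bounds(annual_max)
print(f"100-yr rainfall {estimate:.0f} mm, 95% bounds ({lower:.0f}, {upper:.0f}) mm")
```

Checking whether an estimate updated with a new extreme year still lies inside such bounds is the kind of test the abstract describes.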

5.
The analysis of the impact of climate change (CC) on flood peaks has been the subject of several studies. However, a flood is characterized not only by its peak, but also by other characteristics such as its volume and duration. Little effort has been directed towards the study of the impact of CC on these characteristics. The aim of the present study is to evaluate and compare flood characteristics in a CC context for the watershed of the Baskatong reservoir (Province of Québec, Canada). Comparisons are based on observed flow data and on simulated flow series obtained from hydrological models using meteorological data from a regional climate model for a reference period (1971–2000) and a future period (2041–2070). To this end, two hydrological models, HSAMI and HYDROTEL, are considered. Correlations, stationarity, change-points, and the multivariate behaviour of flood series were studied. The results show that, at various levels, all flood characteristics could be affected by CC. Copyright © 2011 John Wiley & Sons, Ltd.
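A hedged sketch of how the three flood characteristics (peak, volume, duration) might be extracted from a daily flow series with a simple threshold method is given below; the threshold value and the synthetic hydrograph are illustrative and are not the procedure used in the study.

```python
import numpy as np

def flood_events(daily_flow, threshold):
    """Return peak, volume and duration for each contiguous spell with
    flow above `threshold`.  Flow in m3/s at a daily time step, so the
    volume is the flow summed over the spell times 86400 s (m3)."""
    daily_flow = np.asarray(daily_flow, dtype=float)
    above = daily_flow > threshold
    events, start = [], None
    for i, flag in enumerate(np.append(above, False)):   # sentinel closes last spell
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            spell = daily_flow[start:i]
            events.append({"peak": float(spell.max()),
                           "volume": float(spell.sum() * 86400),
                           "duration": i - start})
            start = None
    return events

# Example with a synthetic spring-flood hydrograph over a 50 m3/s baseflow
flow = np.concatenate([np.full(100, 50.0),
                       50 + 900 * np.exp(-0.5 * ((np.arange(60) - 30) / 8.0) ** 2),
                       np.full(100, 50.0)])
print(flood_events(flow, threshold=200.0))
```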

6.
Tectonic movement along faults is often reflected by characteristic geomorphological features such as linear valleys, ridgelines and slope-breaks, steep slopes of uniform aspect, regional anisotropy and tilt of terrain. Analysis of digital elevation models, by means of numerical geomorphology, provides a means of recognizing fractures and characterizing the tectonics of an area in a quantitative way. The objective of this study is to investigate the use of numerical geomorphometric methods for tectonic geomorphology through a case study. The methodology is based on general geomorphometry. In this study, the basic geometric attributes (elevation, slope, aspect and curvatures) are complemented with the automatic extraction of ridge and valley lines and surface-specific points. Evans' univariate and bivariate methodology of general geomorphometry is extended with texture (spatial) analysis methods, such as trend, autocorrelation, spectral, and network analysis. Terrain modelling is implemented with the integrated use of: (1) numerical differential geometry; (2) digital drainage network analysis; (3) digital image processing; and (4) statistical and geostatistical analysis. Application of digital drainage network analysis is emphasized. A simple shear model with a principal displacement zone of NE–SW orientation can account for most of the morphotectonic features found in the basin by geological and digital tectonic geomorphology analyses. Copyright © 2003 John Wiley & Sons, Ltd.

7.
The geochemical analysis of fumarolic gases collected at quiescent and active volcanic systems over time is one of the main tools for understanding changes in the state of activity for surveillance and risk assessment. The continuous output of chemical species through fumarolic activity, which characterizes the inter-eruptive intervals, also has a major and general influence on the environment. The mobilization of chemical species due to weathering of volcanic rocks, or the input of gaseous components from fumarolic activity, results in some modification of the environment affecting, in particular, water, soils, and the consequent growth of the plants present in these areas. In this paper, the chemical composition of fumarolic gases collected at Vulcano island (Sicily, southern Italy) is investigated, with the aim of discovering how the data change over the monitored period and of designing a strategy for the environmental surveillance of volcanic systems that takes the nature of the analyzed data into account. In order to summarize the contribution of all the components that can affect the chemical composition of volcanic gases, a multivariate statistical approach appears to be suitable. Since many of those methods assume independent observations, the possible presence of time-dependent structures should be carefully verified. In this framework, given the compositional nature of geochemical data, we have applied recent theoretical and practical developments in the field of compositional data analysis to work in the correct sample space and to isolate groups of parts responsible for significant changes in the gas chemistry. The proposed approach can be generalized to the investigation of complex environmental systems.
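Compositional data analysis typically works in log-ratio coordinates rather than on raw concentrations. The following sketch shows a centred log-ratio (clr) transform, one standard entry point to that framework; the gas species and values are invented for illustration, and the paper's own log-ratio methodology may differ.

```python
import numpy as np

def closure(x):
    """Rescale each row to sum to 1 (compositional closure)."""
    x = np.asarray(x, dtype=float)
    return x / x.sum(axis=1, keepdims=True)

def clr(x):
    """Centred log-ratio transform: log of each part divided by the
    geometric mean of its row.  Rows are compositions (all parts > 0)."""
    comp = closure(x)
    gmean = np.exp(np.log(comp).mean(axis=1, keepdims=True))
    return np.log(comp / gmean)

# Illustrative fumarole compositions (mol %): H2O, CO2, SO2, HCl, HF
samples = np.array([[88.0,  9.5, 1.6, 0.7, 0.2],
                    [85.0, 12.0, 2.1, 0.7, 0.2],
                    [90.5,  7.2, 1.5, 0.6, 0.2]])
coords = clr(samples)   # coordinates suitable for standard multivariate statistics
```

Working on clr (or other log-ratio) coordinates avoids the spurious correlations that the constant-sum constraint induces in raw percentages.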

8.
Long-range variability of the Beijing climate studied with detrended fluctuation analysis
Scale invariance is widespread in natural systems, including the climate system; one of its signatures is a power-law relationship among observable quantities, and it reveals the complexity of the climate system. To explore the objective basis of climate predictability, detrended fluctuation analysis (DFA) was applied to the Beijing annual mean temperature series for 1870–2003 and the precipitation series for 1725–2003. The results show that both the annual mean temperature and the annual precipitation can be divided into several scale-invariant regions. Within specific scaling ranges, both exhibit positive long-range correlation, which provides a theoretical basis for interannual and decadal climate prediction.
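A minimal DFA implementation (first-order detrending) is sketched below to make the method concrete; the window sizes and the white-noise test series are illustrative, not the Beijing records analysed in the paper.

```python
import numpy as np

def dfa(series, scales, order=1):
    """Detrended fluctuation analysis.  Returns the fluctuation function
    F(s) for each window size in `scales`; the slope of log F(s) versus
    log s is the scaling exponent alpha (alpha > 0.5 indicates positive
    long-range correlation)."""
    x = np.asarray(series, dtype=float)
    profile = np.cumsum(x - x.mean())              # integrated, mean-removed series
    fluct = []
    for s in scales:
        n_seg = len(profile) // s
        segs = profile[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        rms = []
        for seg in segs:
            coeffs = np.polyfit(t, seg, order)     # local polynomial trend
            rms.append(np.mean((seg - np.polyval(coeffs, t)) ** 2))
        fluct.append(np.sqrt(np.mean(rms)))
    return np.array(fluct)

# Example: the exponent for white noise should be close to 0.5
rng = np.random.default_rng(0)
noise = rng.standard_normal(4096)
scales = np.unique(np.logspace(1, 3, 20).astype(int))
F = dfa(noise, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"estimated DFA exponent alpha = {alpha:.2f}")
```

An exponent noticeably above 0.5 over a given scaling range is what the abstract refers to as positive long-range correlation.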

9.
Calibration of hydrologic models is very difficult because of measurement errors in input and response, errors in model structure, and the large number of non-identifiable parameters of distributed models. The difficulties even increase in arid regions with high seasonal variation of precipitation, where the modelled residuals often exhibit high heteroscedasticity and autocorrelation. On the other hand, support of water management by hydrologic models is important in arid regions, particularly if there is increasing water demand due to urbanization. The use and assessment of model results for this purpose require a careful calibration and uncertainty analysis. Extending earlier work in this field, we developed a procedure to overcome (i) the problem of non-identifiability of distributed parameters by introducing aggregate parameters and using Bayesian inference, (ii) the problem of heteroscedasticity of errors by combining a Box–Cox transformation of results and data with seasonally dependent error variances, (iii) the problems of autocorrelated errors, missing data and outlier omission with a continuous-time autoregressive error model, and (iv) the problem of the seasonal variation of error correlations with seasonally dependent characteristic correlation times. The technique was tested with the calibration of the hydrologic sub-model of the Soil and Water Assessment Tool (SWAT) in the Chaohe Basin in North China. The results demonstrated the good performance of this approach to uncertainty analysis, particularly with respect to the fulfilment of statistical assumptions of the error model. A comparison with an independent error model and with error models that only considered a subset of the suggested techniques clearly showed the superiority of the approach based on all the features (i)–(iv) mentioned above.  相似文献   
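The full error model of the paper (Box–Cox transformation, seasonally dependent variances and a continuous-time autoregressive error process) is not reproduced here; the sketch below only illustrates two of its ingredients, a Box–Cox transformation of flows and a lag-1 autocorrelation check on the transformed residuals, using invented data.

```python
import numpy as np
from scipy import stats

# Synthetic "observed" and "simulated" daily flows with multiplicative error
rng = np.random.default_rng(3)
simulated = 5 + 20 * rng.gamma(2.0, 1.0, size=1000)
observed = simulated * np.exp(0.15 * rng.standard_normal(1000))

# Box-Cox transform both series with a common lambda estimated from the
# observations; residuals in the transformed space should be closer to
# homoscedastic than the raw residuals.
obs_t, lam = stats.boxcox(observed)
sim_t = stats.boxcox(simulated, lmbda=lam)
resid = obs_t - sim_t

# Lag-1 autocorrelation of the transformed residuals, the quantity an
# autoregressive error model is meant to absorb.
phi = np.corrcoef(resid[:-1], resid[1:])[0, 1]
print(f"Box-Cox lambda = {lam:.2f}, lag-1 residual autocorrelation = {phi:.2f}")
```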

10.
Multi-objective optimisation is being increasingly applied in water supply management to identify optimal operating options. However, a key challenge in the implementation of multi-objective optimisation is interpreting the large and multidimensional Pareto-optimal set. This paper shows how cluster, visual and post-optimisation analysis can aid the decision maker in addressing this challenge. This is demonstrated for a case study based on the South East Queensland Water Grid, Australia, as part of a broader operational planning framework. Firstly, cluster analysis identifies a smaller set of representative options to aid in visual analysis. Secondly, visual analysis techniques are used to identify the trade-offs between objectives and the relationships between decision variables and objective performance, and to shortlist promising operating options. Finally, post-optimisation analysis techniques identify efficient operating options from the Pareto set, based on decision-maker preferences. Together these techniques can be used to identify a shortlist of operating options for further consideration using multicriteria analysis.
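The abstract does not state which clustering algorithm was used, so the sketch below uses k-means purely as a stand-in to show how a large Pareto set might be reduced to a handful of representative operating options; the two objectives and their values are invented.

```python
import numpy as np
from scipy.cluster.vq import kmeans2, whiten

# Synthetic two-objective Pareto front (e.g. operating cost vs supply deficit):
# the objectives conflict, so lower cost comes with a higher deficit.
rng = np.random.default_rng(7)
cost = np.sort(rng.uniform(10, 100, size=200))
deficit = 120 - cost + rng.normal(0, 2, size=200)
pareto = np.column_stack([cost, deficit])

# Cluster the standardised objective vectors and keep, for each cluster,
# the member closest to its centroid as a "representative" operating option.
std = whiten(pareto)
centroids, labels = kmeans2(std, k=5, minit='++', seed=1)
rep_idx = []
for k in range(5):
    members = np.where(labels == k)[0]
    closest = np.argmin(np.linalg.norm(std[members] - centroids[k], axis=1))
    rep_idx.append(members[closest])
print(pareto[rep_idx])   # shortlist of representative options for visual analysis
```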

11.
Many dating techniques include significant error terms that are not independent between the samples to be dated. This is typically the case in Optically Stimulated Luminescence (OSL) dating, where the conversion from characteristic equivalent doses to the corresponding ages using the annual dosimetry data includes error terms that are common to all produced datings. Dealing with these errors is essential to estimate ages from a set of datings whose chronological ordering is known. In this work, we propose and study a Bayesian model to address this problem. For this purpose, we first consider a multivariate model with multiplicative Gaussian errors in a Bayesian framework. This model relates a set of characteristic equivalent doses to the corresponding ages while taking into account the systematic and non-systematic errors associated with the dosimetry. It thus offers the opportunity to deal properly with stratigraphic constraints within OSL datings, but also with other datings whose errors are independent of the systematic errors of OSL (e.g. radiocarbon). Then, we use this model to extend an existing Bayesian model for the assessment of characteristic equivalent doses from Single Aliquot and Regenerative (SAR) dose measurements. The overall Bayesian model leads to the joint estimation of all the variables (which include all the dose–response functions and characteristic equivalent doses) of a sequence of possibly heterogeneous datings. We also consider a more generic solution consisting in using the age model directly with a set of characteristic equivalent dose estimates and their associated standard errors. We finally give an example of application on a set of five OSL datings with stratigraphic constraints and observe a good agreement between the two approaches.

12.
In optical dating, especially single-grain dating, various patterns of distributions in equivalent dose (De) are usually observed and analysed using different statistical models. None of these methods, however, is designed to deal with outliers that do not form part of the population of grains associated with the event of interest (the ‘target population’), despite outliers being commonly present in single-grain De distributions. In this paper, we present a Bayesian method for detecting De outliers and making allowance for them when estimating the De value of the target population. We test this so-called Bayesian outlier model (BOM) using data sets obtained for individual grains of quartz from sediments deposited in a variety of settings, and in simulations. We find that the BOM is suitable for single-grain De distributions containing outliers that, for a variety of reasons, do not form part of the target population. For example, De outliers may be associated with grains that have undesirable luminescence properties (e.g., thermal instability, high rates of anomalous fading) or with contaminant grains incorporated into a sample when collected in the field or prepared in the laboratory. Grains that have much larger or smaller De values than the target population, due to factors such as insufficient bleaching, beta-dose heterogeneity or post-depositional disturbance, may also be identified as outliers using the BOM, enabling these values to be weighted appropriately for final De and age determination.  相似文献   

13.
Against the background of urbanization, extreme precipitation events occur frequently and flood hazards are becoming increasingly prominent, so exploring the influence of urbanization on extreme precipitation has become a topical and difficult research question. Taking the Taihu Plain in the lower reaches of the Yangtze River as an example, and based on long daily records (1976–2015) from 40 rain gauges in the region combined with land-use/land-cover and socio-economic data under urbanization, this study compares the spatio-temporal variation of extreme-precipitation indices at different stages of urbanization and quantitatively assesses how different levels of urbanization...

14.
ABSTRACT

The Pettitt test is widely used in climate change and hydrological analyses. However, studies show that this test has difficulty detecting change points, especially in small samples. This study presents a bootstrap application of the Pettitt test and compares it numerically with the classical Pettitt test through an extensive Monte Carlo simulation. The proposed test outperforms the classical test in all simulated scenarios. An application of the tests is conducted on the historical series of naturalized flows of the Itaipu Hydroelectric Plant in Brazil, for which several studies have shown a change point in the 1970s. When the series is split into shorter sub-series, to simulate actual situations of small samples, the proposed test is more powerful than the classical Pettitt test in detecting the change point. The proposed test can be an important tool for detecting abrupt changes in water availability, in support of decision making on hydroclimatological resources.
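A hedged sketch of the Pettitt statistic, its classical approximate p-value and a simple permutation-based alternative (a resampling stand-in for the paper's bootstrap scheme, which is not reproduced here) is given below; the example series is synthetic.

```python
import numpy as np

def pettitt(x):
    """Pettitt change-point statistic K and the classical approximate
    p-value  p ~ 2*exp(-6*K^2 / (n^3 + n^2))."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sgn = np.sign(x[None, :] - x[:, None])          # sgn[i, j] = sign(x_j - x_i)
    U = np.array([sgn[:t + 1, t + 1:].sum() for t in range(n - 1)])
    K = np.abs(U).max()
    p_approx = min(1.0, 2.0 * np.exp(-6.0 * K**2 / (n**3 + n**2)))
    return K, p_approx

def pettitt_resampled(x, n_resamples=2000, seed=0):
    """p-value for the Pettitt statistic from random permutations of the
    series (a simple resampling stand-in for the bootstrap version)."""
    rng = np.random.default_rng(seed)
    K_obs, _ = pettitt(x)
    exceed = sum(pettitt(rng.permutation(x))[0] >= K_obs
                 for _ in range(n_resamples))
    return (exceed + 1) / (n_resamples + 1)

# Example: a short series with a mean shift after year 20
rng = np.random.default_rng(11)
series = np.concatenate([rng.normal(100, 10, 20), rng.normal(115, 10, 20)])
print(pettitt(series), pettitt_resampled(series, n_resamples=500))
```

Comparing the resampled and the asymptotic p-values on short sub-series is the kind of small-sample comparison the study carries out by Monte Carlo simulation.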

15.
Climate change is an issue of major concern nowadays. Its impact on the natural and human environment is studied intensively, as the expected shift in climate will be significant in the next few decades. Recent experience shows that the effects will be critical in coastal areas, resulting in erosion and inundation phenomena worldwide. In addition to that, coastal areas are subject to "pressures" from upstream watersheds in terms of water quality and sediment transport. The present paper studies the impact of climate change on sediment transport and morphology in the aforementioned coupled system. The study regards a sandy coast and its upstream watershed in Chalkidiki, North Greece; it is based on: (a) an integrated approach for the quantitative correlation of the two through numerical modeling, developed by the authors, and (b) a calibrated application of the relevant models Soil and Water Assessment Tool (SWAT) and PELNCON-M, applied to the watershed and the coastal zone, respectively. The examined climate change scenarios focus on a shift of the rainfall distribution towards fewer and more extreme rainfall events, and an increased frequency of occurrence of extreme wave events. Results indicate the significance of climatic pressures in wide-scale sediment dynamics, and are deemed to provide a useful perspective for researchers and policy planners involved in the study of coastal morphology evolution in a changing climate.

16.
In the monitoring of earthquakes and nuclear explosions using a sparse worldwide network of seismic stations, it is frequently necessary to make reliable location estimates using a single seismic array. It is also desirable to screen out routine industrial explosions automatically in order that analyst resources are not wasted upon detections which can, with a high level of confidence, be associated with such a source. The Kovdor mine on the Kola Peninsula of NW Russia is the site of frequent industrial blasts which are well recorded by the ARCES regional seismic array at a distance of approximately 300 km. We describe here an automatic procedure for identifying signals which are likely to result from blasts at the Kovdor mine and, wherever possible, for obtaining single-array locations for such events. Carefully calibrated processing parameters were chosen using measurements from confirmed events at the mine over a one-year period for which the operators supplied ground-truth information. Phase arrival times are estimated using an autoregressive method, and slowness and azimuth are estimated using broadband f-k analysis in fixed frequency bands and time windows fixed relative to the initial P-onset time. We demonstrate the improvement to slowness estimates resulting from the use of fixed frequency bands. Events can be located using a single array if, in addition to the P phase, at least one secondary phase is found with both an acceptable slowness estimate and a valid onset-time estimate. We evaluate the on-line system over a twelve-month period; every event known to have occurred at the mine is detected by the process, and 32 out of 53 confirmed events were located automatically. The remaining events were classified as “very likely” Kovdor events and were subsequently located by an analyst. The false alarm rate is low; only 84 very likely Kovdor events were identified during the whole of 2003, and none of these were subsequently located at a large distance from the mine. The location accuracy achieved automatically by the single-array process is remarkably good, and is comparable to that obtained interactively by an experienced analyst using two-array observations. The greatest problem encountered in the single-array location procedure is the difficulty in determining arrival times for secondary phases, given the weak Sn phase and the complexity of the P-coda. The method described here could be applied to a wide range of locations and sources for which the monitoring of seismic activity is desirable. The effectiveness will depend upon the distance between source and receiver, the nature of the seismic sources and the level of regional seismicity.

17.
The earthquake in Central Finland on 16 November 1931 and its aftershock the same day are investigated. It is the strongest event known to have occurred in this area and thus of importance for understanding the seismicity there. The original macroseismic questionnaires were re-examined using statistical analysis and taking into account the recommendations for intensity assessments according to the European Macroseismic Scale (EMS-92, -98). The data were augmented with contemporary press reports. Test theory was applied when preprocessing the data, and intensity assessment was carried out by means of correspondence analysis. Different approaches were applied to determine the macroseismic field and trace the isoseismals. Some of the practical problems involve the treatment of audible observations. The macroseismic magnitudes were estimated at 4.3 (±0.2) for the main shock and 3.7 (±0.2) for its largest aftershock. Despite the small magnitudes, earthquake light sightings were also reported for the events.

18.
How can spatially explicit nonlinear regression modelling be used to obtain nonpoint source loading estimates in watersheds with limited information? What is the value of additional monitoring, and where should future data-collection efforts focus? In this study, we address these two frequently asked questions in watershed modelling by implementing Bayesian inference techniques to parameterize SPAtially Referenced Regressions On Watershed attributes (SPARROW), a model that empirically estimates the relation between in-stream measurements of nutrient fluxes and the sources/sinks of nutrients within the watershed. Our case study is the Hamilton Harbour watershed, a mixed agricultural and urban residential area located at the western end of Lake Ontario, Canada. The proposed Bayesian approach explicitly accounts for the uncertainty associated with the existing knowledge of the system and the different types of spatial correlation typically underlying the parameter estimation of watershed models. Informative prior parameter distributions were formulated to overcome the problem of inadequate data quantity and quality, whereas the potential bias introduced by the pertinent assumptions is subsequently examined by quantifying the relative change of the posterior parameter patterns. Our modelling exercise offers the first estimates of export coefficients and delivery rates from the different subcatchments and thus generates testable hypotheses regarding the nutrient export ‘hot spots’ in the studied watershed. Despite substantial uncertainties characterizing our calibration dataset, ranging from 17% to nearly 400%, we arrived at an uncertainty level for the whole-basin nutrient export estimates of only 36%. Finally, we conduct modelling experiments that evaluate the potential improvement of the model parameter estimates and the decrease of the predictive uncertainty if the uncertainty associated with the current nutrient loading estimates is reduced. Copyright © 2012 John Wiley & Sons, Ltd.

19.
Abstract

Southern Ontario, Canada, has been impacted in recent years by many heavy rainfall and flooding events that have exceeded existing historical estimates of infrastructure design rainfall intensity–duration–frequency (IDF) values. These recent events and the limited number of short-duration recording raingauges have prompted the need to research the climatology of heavy rainfall events within the study area, review the existing design IDF methodologies, and evaluate alternative approaches to traditional point-based heavy rainfall IDF curves, such as regional IDF design values. The use of additional data and the regional frequency analysis methodology were explored for the study area, with the objective of validating identified clusters or homogeneous regions of extreme rainfall amounts through Ward's method. As the results illustrate, nine homogeneous regions were identified in Southern Ontario using the annual maximum series (AMS) for daily and 24-h rainfall data from climate and rate-of-rainfall or tipping bucket raingauge (TBRG) stations, respectively. In most cases, the generalized extreme value and logistic distributions were identified as the statistical distributions that provide the best fit for the 24-h and sub-daily rainfall data in the study area. A connection was observed between extreme rainfall variability, temporal scale of heavy rainfall events and location of each homogeneous region. Moreover, the analysis indicated that scaling factors cannot be used reliably to estimate sub-daily and sub-hourly values from 24- and 1-h data in Southern Ontario.

Citation Paixao, E., Auld, H., Mirza, M.M.Q., Klaassen, J. & Shephard, M.W. (2011) Regionalization of heavy rainfall to improve climatic design values for infrastructure: case study in Southern Ontario, Canada. Hydrol. Sci. J. 56(7), 1067–1089.
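A minimal sketch of the regionalization step, clustering stations into candidate homogeneous regions with Ward's linkage on standardized site attributes, is given below; the station attributes are invented, and the choice of nine regions simply mirrors the number reported in the abstract.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Illustrative site attributes: latitude, longitude, mean and coefficient of
# variation (CV) of the annual maximum series (AMS) at each station.
rng = np.random.default_rng(5)
n_sites = 40
attrs = np.column_stack([
    rng.uniform(42.0, 45.5, n_sites),      # latitude
    rng.uniform(-83.0, -74.0, n_sites),    # longitude
    rng.uniform(40.0, 90.0, n_sites),      # mean AMS 24-h rainfall (mm)
    rng.uniform(0.2, 0.5, n_sites),        # CV of the AMS
])

# Standardise the attributes, apply Ward's minimum-variance linkage and cut
# the dendrogram into the chosen number of candidate homogeneous regions.
z = (attrs - attrs.mean(axis=0)) / attrs.std(axis=0)
tree = linkage(z, method='ward')
regions = fcluster(tree, t=9, criterion='maxclust')   # nine regions, as in the study
print(np.bincount(regions)[1:])                       # stations per region
```

In a regional frequency analysis the candidate regions would then be checked with homogeneity measures before pooling the station data.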

20.
Parametric methods of flood frequency analysis (FFA) involve fitting a probability distribution to the observed flood data at the site of interest. When the record length at a given site is relatively long and the flood data exhibit skewness, a distribution with more than three parameters, such as the log-Pearson type 3 distribution, is often used in FFA. This paper examines the suitability of the five-parameter Wakeby distribution for annual maximum flood data in eastern Australia. We adopt a Monte Carlo simulation technique to select an appropriate plotting position formula and to derive a probability plot correlation coefficient (PPCC) test statistic for the Wakeby distribution. The Weibull plotting position formula has been found to be the most appropriate for the Wakeby distribution. Regression equations for the PPCC test statistics associated with the Wakeby distribution for different levels of significance have been derived. Furthermore, a power study to estimate the rejection rate associated with the derived PPCC test statistics has been undertaken. Finally, an application using annual maximum flood series data from 91 catchments in eastern Australia is presented. Results show that the developed regression equations can be used with a high degree of confidence to test whether the Wakeby distribution fits the annual maximum flood series data at a given station. The methodology developed in this paper can be adapted to other probability distributions and to other study areas. Copyright © 2014 John Wiley & Sons, Ltd.
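SciPy has no Wakeby distribution, so the sketch below only demonstrates the mechanics of a PPCC computation with the Weibull plotting position, using the GEV as a stand-in; in an application to the Wakeby distribution its quantile function (e.g. obtained from L-moment fitting) would replace the ppf call, and the flood data here are synthetic.

```python
import numpy as np
from scipy import stats

def ppcc_weibull(sample, dist, *fit_args):
    """Probability-plot correlation coefficient using the Weibull plotting
    position p_i = i / (n + 1).  `dist` is a scipy.stats distribution and
    `fit_args` are its fitted parameters."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    p = np.arange(1, n + 1) / (n + 1.0)       # Weibull plotting positions
    q = dist.ppf(p, *fit_args)                # fitted quantiles
    return np.corrcoef(x, q)[0, 1]

# Stand-in example with the GEV on a synthetic annual-maximum flood series
ams_flood = stats.genextreme.rvs(c=-0.1, loc=300, scale=120,
                                 size=60, random_state=2)
params = stats.genextreme.fit(ams_flood)
r = ppcc_weibull(ams_flood, stats.genextreme, *params)
print(f"PPCC = {r:.3f}")   # compared with a critical value to accept/reject the fit
```

The derived regression equations in the paper supply the critical PPCC values against which such a correlation would be judged.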
