Similar articles
20 similar articles found (search time: 31 ms)
1.
This study evaluates two (of the many) modelling approaches to flood forecasting for an upland catchment (the River South Tyne at Haydon Bridge, England). The first approach uses ‘traditional’ hydrological models: a rainfall–runoff model (the probability distributed model, or PDM) simulates flow in the upper catchment, and those flows are then routed to the lower catchment using two kinematic wave (KW) routing models. When run in forecast mode, the PDM and KW models employ model updating procedures. The second approach uses neural network models, which produce forecasts through a ‘pattern-matching’ process. Following calibration, the models are evaluated in terms of their fit to continuous stage data and to flood event magnitudes and timings within a validation period. Forecast lead times of 1 h, 2 h and 4 h are selected (the catchment has a response time of approximately 4 h). The ‘traditional’ models generally perform adequately at all three lead times. The neural networks produce reasonable forecasts of small- to medium-sized flood events but have difficulty forecasting the magnitude of the larger flood events in the validation period. Possible modifications to the latter approach are discussed. © Crown copyright 2002. Reproduced with the permission of Her Majesty's Stationery Office. Published by John Wiley & Sons, Ltd.
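The "model updating procedures" mentioned for forecast mode are not specified in the abstract; a common generic form is autoregressive error correction, in which the last observed forecast error is assumed to decay towards zero over the lead time. The sketch below illustrates that idea only; the coefficient and function name are illustrative, not taken from the study.

```python
def update_forecast(raw_forecast, last_error, rho=0.8, lead_steps=1):
    """Correct a raw model forecast with the most recent observed error,
    assuming the error decays autoregressively (AR(1)) with coefficient
    rho per time step. This is a generic sketch, not the exact PDM/KW
    updating scheme used in the study."""
    return raw_forecast + (rho ** lead_steps) * last_error
```

With `rho=0.5` and a two-step lead, an error of 2.0 units contributes only 0.5 units of correction, reflecting the diminishing value of the last observation at longer lead times.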

2.
Keith Beven was amongst the first to propose and demonstrate a combination of conceptual rainfall–runoff modelling and stochastically generated rainfall data in what is known as the ‘continuous simulation’ approach for flood frequency analysis. The motivations included the potential to establish better links with physical processes and to avoid restrictive assumptions inherent in existing methods applied in design flood studies. Subsequently, attempts have been made to establish continuous simulation as a routine method for flood frequency analysis, particularly in the UK. The approach has not been adopted universally, but numerous studies have benefitted from applications of continuous simulation methods. This paper asks whether industry has yet realized the vision of the pioneering research by Beven and others. It reviews the generic methodology and illustrates applications of the original vision for a more physically realistic approach to flood frequency analysis through a set of practical case studies, highlighting why continuous simulation was useful and appropriate in each case. The case studies illustrate how continuous simulation has helped to offer users of flood frequency analysis more confidence about model results by avoiding (or exposing) bad assumptions relating to catchment heterogeneity, inappropriateness of assumptions made in (UK) industry‐standard design event flood estimation methods, and the representation of engineered or natural dynamic controls on flood flows. By implementing the vision for physically realistic analysis of flood frequency through continuous simulation, each of these examples illustrates how more relevant and improved information was provided for flood risk decision‐making than would have been possible using standard methods. 
They further demonstrate that integrating engineered infrastructure into flood frequency analysis and assessment of environmental change are also significant motivations for adopting the continuous simulation approach in practice. Copyright © 2016 John Wiley & Sons, Ltd.
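The continuous simulation chain described above (stochastic rainfall, conceptual rainfall–runoff model, annual maxima, empirical frequency curve) can be sketched end to end. The rainfall generator and the single linear reservoir below are deliberately minimal stand-ins, assuming illustrative parameter values rather than any model from the paper.

```python
import random

def simulate_daily_flow(rain, k=0.9):
    """Single linear reservoir: each day the store drains a fraction
    (1 - k) of its contents as streamflow. A minimal stand-in for a
    conceptual rainfall-runoff model such as those used in practice."""
    storage, flows = 0.0, []
    for p in rain:
        storage = k * storage + p
        flows.append((1 - k) * storage)
    return flows

def flood_frequency(n_years=100, days=365, seed=1):
    """Continuous simulation for flood frequency: generate stochastic
    rainfall, simulate runoff, extract annual maxima, and assign
    empirical return periods via Weibull plotting positions."""
    rng = random.Random(seed)
    maxima = []
    for _ in range(n_years):
        # ~30% wet days with exponentially distributed depths (illustrative).
        rain = [rng.expovariate(0.5) if rng.random() < 0.3 else 0.0
                for _ in range(days)]
        maxima.append(max(simulate_daily_flow(rain)))
    maxima.sort(reverse=True)
    n = len(maxima)
    # Return period T = (n + 1) / rank for the rank-th largest maximum.
    return [((n + 1) / (rank + 1), q) for rank, q in enumerate(maxima)]
```

The resulting (return period, flow) pairs trace an empirical flood frequency curve; longer simulated records push the reliable part of the curve to rarer events.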

3.
《Marine pollution bulletin》2008,56(10-12):407-414
The EU Water Framework Directive demands protection of the functioning and structure of our aquatic ecosystems. The defined means to realize this goal are: (1) optimizing the habitat-providing conditions and (2) optimizing the water quality. The effects of these measures on the structure and functioning of aquatic ecosystems then have to be assessed and judged; the available tool for doing so is monitoring. Present monitoring activities in The Netherlands cover target monitoring and trend monitoring, which is insufficient to meet the EU requirements. Given the EU demands, the ongoing budget reductions in The Netherlands and a growing flow of unused new ecological concepts and theories (e.g. new theoretical insights related to resource competition theory, the intermediate disturbance hypothesis, and tools to judge system quality such as ecological network analysis), it is suggested that the present monitoring tasks be redistributed between governmental services (final responsibility for the programme and logistic support) and academia (data analysis, data interpretation and development of concepts suitable for ecosystem modelling and of tools to judge the quality of our ecosystems). This will lead to intensified co-operation between both arenas and consequently an increased exchange of knowledge and ideas. Suggestions are made to extend the Dutch monitoring with surveillance monitoring and to shift the focus from ‘station oriented’ to ‘area oriented’ without changing the operational aspects or their costs. The extended data sets will allow proper calibration and validation of dynamic ecosystem models, which is not possible now. The described cost-effective change in environmental monitoring will also let biological and ecological theories play the pivotal role they should in future integrated environmental management.

4.
SANDRINE DELMEIRE, Hydrological Processes, 1997, 11(10): 1393–1396
The aim of this study, undertaken by Geoimage, was to establish a fast and precise method for locating flooded areas at two sites in southern France; satellite imagery appeared to be the appropriate tool. Two types of flood had to be distinguished: (i) an oceanic flood, of long duration but low intensity, in the Rhône Valley, and (ii) a torrential flood, of short duration but high intensity, in the Var Valley. Because ERS-1 images of both sites acquired during the floods were available, the methodology could be tested. A multitemporal approach was set up using ERS-1 images in PRI mode acquired before, during and after the flood. In the case of the oceanic flood, the characteristic radar response enabled areas under water to be extracted and identified at each image acquisition date; given images at each stage of the flood, its evolution can therefore be reconstituted precisely in time and extent. In the case of the torrential flood, it is more difficult to localize the flooding with precision. This is explained by the state of the water surface, which carries a large swell in this case: radar is sensitive to these surface changes, an interaction occurs, and the results were degraded. Nevertheless, simulation studies from other satellite data make it possible to locate areas of greater or lesser hydrological risk. © 1997 John Wiley & Sons, Ltd.
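The extraction of "areas under water" from multitemporal SAR relies on the fact that smooth open water acts as a specular reflector, so flooded pixels usually darken sharply between the pre-flood and during-flood acquisitions. A minimal change-detection sketch, with an illustrative threshold not taken from the study:

```python
def flood_mask(before_db, during_db, drop_threshold=3.0):
    """Flag pixels whose radar backscatter (in dB) dropped by at least
    drop_threshold between the pre-flood and during-flood images.
    The 3 dB threshold is illustrative; operational values depend on
    sensor, incidence angle and land cover."""
    return [(b - d) >= drop_threshold for b, d in zip(before_db, during_db)]
```

As the abstract notes, this simple darkening assumption breaks down for rough, swell-covered water, where backscatter can stay high and the flooded area is underestimated.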

5.
Fluvial geomorphology is rapidly becoming centrally involved in practical applications that support the agenda of sustainable river basin management. In the UK its principal contributions to date have been in flood risk management and river restoration. There is a new impetus: the European Union's Water Framework and Habitats Directives require all rivers to be considered in terms of their ecological quality, defined partly in terms of ‘hydromorphology’. This paper focuses on the problematic definition of ‘natural’ hydromorphological quality for rivers, the assessment of departures from it, and the ecologically driven strategies for restoration that must be delivered by regulators under the EU Water Framework Directive (WFD); the Habitats Directive contains similar concepts under different labels. Currently available definitions of ‘natural’ or ‘reference’ conditions derive largely from a concept of ‘damage’, principally to channel morphology. Such definitions may, however, be too static to form sustainable strategies for management and regulation, even though they attract public support. Interdisciplinary knowledge remains scant, yet it is needed at a range of scales from catchment to microhabitat. The most important contribution of the interdisciplinary R&D effort needed to supply management tools to regulators under the WFD and Habitats regulations is to interpret the physical habitat contribution to biodiversity conservation, in terms of ‘good ecological quality’ in rivers and the ‘hydromorphological’ component of this quality. Contributions from ‘indigenous knowledge’, through public participation, are important but often understated in this effort to drive the ‘fluvial hydrosystem’ back to spontaneous, affordable, sustainable self-regulation. Copyright © 2006 John Wiley & Sons, Ltd.

6.
R. OBERSTADLER, H. HÖNSCH, D. HUTH, Hydrological Processes, 1997, 11(10): 1415–1425
GAF examined, under contract to DARA (the German Space Agency), the applicability of ERS-1 SAR data for flood mapping under operational conditions. The flood event investigated was the flooding of the Rhine valley in winter 1993–1994. In order to carry out an examination close to end-user needs, the specific user requests concerning information about flood events were identified. Mapping accuracy with respect to flood extent and flood level, the production of flood maps, and the depiction of the runoff turned out to be the points of greatest interest. These specific user information needs were incorporated into the project objectives to define both the applicability and the deficits of ERS-1 data for operational flood mapping. After a detailed analysis of the time aspects of the traditional mapping method and a satellite data analysis, a visual interpretation as well as an automatic classification were applied, including various filtering steps to derive the flood boundary. The visual interpretation proved to be the more accurate method. Problematic areas for both the visual interpretation and the automatic classification turned out to be settlements, forests and bushes, as well as regions with layover and foreshortening effects. Comparison between the flood level derived from satellite data and the flood level registered by the water authority boards revealed a height difference ranging between 0.5 and 2.0 m; the relatively coarse resolution and problems with correct interpretation of the flood line proved to be the reasons for this difference. In general the results are acceptable, but compared with field measurements of the water level they are too inaccurate. A cost–benefit analysis as well as a proposal for an operational GIS system using ERS-1 SAR data are still under investigation. © 1997 John Wiley & Sons, Ltd.

7.
Meslem  A.  Iversen  H.  Iranpour  K.  Lang  D. 《Bulletin of Earthquake Engineering》2021,19(10):4083-4114

In the framework of the multi-disciplinary LIQUEFACT project, funded under the European Commission's Horizon 2020 programme, the LIQUEFACT Reference Guide software has been developed, incorporating both data and methodologies collected and elaborated in the project's various work packages. Specifically, this refers to liquefaction hazard maps, methodologies and results of liquefaction vulnerability analysis for both building typologies and critical infrastructures, liquefaction mitigation measures, and cost–benefit considerations. The software targets a wide range of user groups with different levels of technical background and different requirements (urban planners, facility managers, structural and geotechnical engineers, or risk modellers). In doing so, the LIQUEFACT software allows users to assess liquefaction-related risk and assists them in liquefaction mitigation planning. Depending on the user's requirements, the LIQUEFACT software can be used to conduct the liquefaction hazard analysis, the risk analysis, and the mitigation analysis separately. At the liquefaction hazard stage, users can geo-locate their assets (buildings or infrastructures) against the pre-defined macrozonation and microzonation maps in the software and identify those assets/sites that are potentially susceptible to earthquake-induced liquefaction damage. For potentially susceptible sites the user can commission a detailed ground investigation (e.g. CPT, SPT or VS30 profile), and these data can be used by the software to customize the level of susceptibility to specific site conditions. Users can either use built-in earthquake scenarios or enter their own earthquake scenario data. In the Risk Analysis, the user can estimate the level of impact of the potential liquefaction threat on the asset and evaluate its performance. For the Mitigation Analysis, the user can develop a customized mitigation framework based on the outcome of the risk and cost–benefit analysis.


8.
Using eye tracking to evaluate the usability of animated maps (total citations: 1; self-citations: 0; citations by others: 1)
Cartographic animation has been developed and widely used in geo-visualisation and many other areas in recent years. The usability of animated maps is a key characteristic affecting map users' effectiveness and efficiency in accomplishing tasks. In this paper, an eye tracking approach was proposed as a visual analytics method to evaluate the usability of animated maps by capturing participants' eye movement data and quantitatively analysing the accuracy (effectiveness) and response time (efficiency) of users' task completion. In the study, a set of animated traffic maps represented by three important visual variables (colour hue, size and frequency) was used for the usability evaluation. The experimental results showed that the usability of these three visual variables for cartographic animation affects the usability of animated maps. Red, yellow and aqua were found to convey map information more effectively than other colour hues. Size was found to be more usable than colour hue for both animated maps and static maps. Usability was not found to be proportional to the playback rate of animated maps. Furthermore, the usability of frequency, colour hue and size was found to be related to the display's size. We hope that the analysis approach presented in this paper and the results of this study will help in the design of cartographic animation displays with better usability.
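The two usability measures named above have straightforward operational forms: effectiveness as the proportion of correctly completed tasks, and efficiency as the mean response time of the correct completions, aggregated per visual variable. A sketch of that aggregation (record layout and function name are illustrative, not from the study):

```python
def usability_metrics(trials):
    """Aggregate effectiveness (accuracy) and efficiency (mean response
    time over correct answers only) per visual variable. Each trial is
    a tuple (variable, correct, response_time_seconds)."""
    stats = {}
    for var, correct, rt in trials:
        s = stats.setdefault(var, {"n": 0, "hits": 0, "rt_sum": 0.0})
        s["n"] += 1
        if correct:
            s["hits"] += 1
            s["rt_sum"] += rt
    return {var: {"accuracy": s["hits"] / s["n"],
                  "mean_rt": s["rt_sum"] / s["hits"] if s["hits"] else None}
            for var, s in stats.items()}
```

Restricting mean response time to correct trials is a common convention in usability studies, since the duration of a failed attempt measures something other than efficiency.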

9.
Aiming at reducing losses from flood disasters, a dynamic risk assessment model for flood disaster is studied in this article. The model is built upon the projection pursuit cluster principle and risk indexes in the system, proceeding from the whole structure to its component parts. A fuzzy analytic hierarchy approach is employed to screen the index system and determine the index weights, while the future value of each index is simulated by an improved back-propagation neural network algorithm. The proposed model adopts a dynamic evaluation method to analyse temporal data and assesses risk development through comprehensive analysis. Projection pursuit theory is used for clustering the spatial data, and the optimal projection vector is applied to calculate the risk cluster type. The flood disaster risk level is thereby confirmed, and control strategies can then be tailored to local conditions. This study takes the Tunxi area of Huangshan city as an example. After establishment of the dynamic risk assessment model and its verification and application against actual and simulated flood disaster data from 2001 to 2013, the comprehensive risk assessment results show that the development trend of flood disaster risk remains downward overall, despite rises in a few years, which is in accordance with actual conditions. The proposed model is shown to be feasible in both theory and application, providing a new way to assess flood disaster risk.

10.
Elcin Kentel, Journal of Hydrology, 2009, 375(3-4): 481–488
Reliable river flow estimates are crucial for appropriate water resources planning and management. River flow forecasting can be conducted by conceptual or physical models, or by data-driven black box models. Development of physically based models requires an understanding of all the physical processes which impact a natural process and the interactions among them. Since identification of the relationships among these physical processes is very difficult, data-driven approaches have recently been utilized in hydrological modelling. Artificial neural networks are one of the widely used data-driven approaches for modelling hydrological processes. In this study, estimation of future monthly river flows for the Guvenc River, Ankara, is conducted using various artificial neural network models, whose success relies on the availability of adequate data sets. A direct mapping from inputs to outputs is identified, without explicit consideration of the complex relationships among the dependent and independent variables of the hydrological process. Past precipitation, river flow data and the associated month are used to predict future river flows for the Guvenc River. The impacts of various input patterns, the number of training cycles, and the initial values assigned to the connection weights are investigated. One major weakness of artificial neural networks is that they may fail to generate good estimates for extreme events, i.e. events that do not occur at all, or often enough, in the training data set, so it is very important to be able to identify such unlikely events. A fuzzy c-means algorithm is used in this study to cluster the training and validation input vectors into regular and extreme events, so that the user has an idea of the risk that the artificial neural network model will generate unreliable results.
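Fuzzy c-means differs from hard clustering in that each input vector receives a degree of membership in every cluster, which is exactly what lets a new event be flagged as "partly extreme". A minimal one-dimensional version (with two clusters, fuzzifier m = 2, and naive min/max initialization, all illustrative choices rather than the paper's configuration):

```python
def fuzzy_cmeans_1d(xs, c=2, m=2.0, iters=50):
    """Minimal 1-D fuzzy c-means. Returns cluster centres and, for each
    point, its membership degrees (each row sums to 1). Tiny epsilon
    guards avoid division by zero when a point coincides with a centre."""
    centres = [min(xs), max(xs)] if c == 2 else list(xs[:c])
    for _ in range(iters):
        u = []
        for x in xs:
            row = []
            for ck in centres:
                d = abs(x - ck) or 1e-12
                row.append(1.0 / sum((d / (abs(x - cj) or 1e-12)) ** (2 / (m - 1))
                                     for cj in centres))
            u.append(row)
        # Centres are membership-weighted means of the data.
        centres = [sum(u[i][k] ** m * xs[i] for i in range(len(xs))) /
                   sum(u[i][k] ** m for i in range(len(xs)))
                   for k in range(c)]
    return centres, u
```

Applied to a flow record, points with high membership in the high-valued cluster correspond to the extreme events for which the neural network's estimates should be treated with caution.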

11.
A neural network-based approach is presented for the detection of changes in the characteristics of structure-unknown systems. The approach relies on the use of vibration measurements from a ‘healthy’ system to train a neural network for identification purposes. Subsequently, the trained network is fed comparable vibration measurements from the same structure under different episodes of response in order to monitor the health of the structure. The methodology is applied to actual data obtained from ambient vibration measurements on a steel building structure that was damaged under strong seismic motion during the Hyogo-Ken Nanbu earthquake of 17 January 1995. The measurements were made before and after repairs to the damaged frame. A neural network is trained with data recorded after the repairs, which represent the ‘healthy’ condition of the building. The trained network, subsequently fed data recorded before the repairs, successfully identified the difference between the damaged storey and the undamaged storey. This study shows that the proposed approach has the potential to be a practical damage detection tool for smart civil structures. © 1998 John Wiley & Sons, Ltd.

12.
Flood vulnerability assessment plays a key role in risk management, so techniques that make this assessment more straightforward while improving the results are important. In this briefing, we present an automated calculation of a flood vulnerability index implemented through a web management interface (PHP) that enhances the ability of decision makers to strategically guide investment. Testing the applicability of this methodology through the website requires many case studies, covering the full range of scales such as river basin, subcatchment and urban area. This requires prompt handling of large amounts of data, which has led to the development of this automated tool to help organize, monitor, process and compare the data of different case studies. The authors aim to create a network of knowledge between the different institutions and universities in which this methodology is used, and also hope to encourage collaboration between members of the network on managing flood vulnerability information and on promoting further studies of flood risk assessment at all scales. Copyright © 2009 John Wiley & Sons, Ltd.

13.
Self-organizing maps (SOMs) have been widely and successfully adopted for science and engineering problems; not only are their results unbiased, but they can also be visualized. In this study, we propose an enforced SOM (ESOM) coupled with a linear regression output layer for flood forecasting. The ESOM re-executes a few extra training patterns, e.g. the peak flow; recycling these input data increases the mapping space of peak flow in the topological structure of the SOM, and the weighted sum of the extended output layer of the network improves the accuracy of forecasting peak flow. We investigated the ESOM neural network using flood data of the Da-Chia River, Taiwan, and evaluated its performance against results obtained from a commonly used back-propagation neural network. The results demonstrate that the ESOM neural network is highly efficient at clustering, especially for the peak flow, and strongly capable of modelling the flood forecast. The topology maps created from the ESOM are interesting and informative. Copyright © 2007 John Wiley & Sons, Ltd.

14.
When a scarce water resource is distributed between different users by a Water Resource Management Authority (WRMA), the replenishment of this resource as well as the meeting of users’ demand is subject to considerable uncertainty. Cost optimization and risk management models can assist the WRMA in its decision about striking the balance between the level of target delivery to the users and the level of risk that this delivery will not be met. Addressing the problem as a multi-period dynamic network optimization, the proposed approach is also based on further developments in stochastic programming for scenario optimization. This approach tries to obtain a “robust” decision policy that minimizes the risk of wrong decisions when managing scarce water resources. In the paper we also illustrate two application examples for water resources management problems.

15.
In this paper we extend the generalized likelihood uncertainty estimation (GLUE) technique to estimate spatially distributed uncertainty in models conditioned against binary pattern data contained in flood inundation maps. Untransformed binary pattern data already have been used within GLUE to estimate domain‐averaged (zero‐dimensional) likelihoods, yet the pattern information embedded within such sources has not been used to estimate distributed uncertainty. Where pattern information has been used to map distributed uncertainty it has been transformed into a continuous function prior to use, which may introduce additional errors. To solve this problem we use here ‘raw’ binary pattern data to define a zero‐dimensional global performance measure for each simulation in a Monte Carlo ensemble. Thereafter, for each pixel of the distributed model we evaluate the probability that this pixel was inundated. This probability is then weighted by the measure of global model performance, thus taking into account how well a given parameter set performs overall. The result is a distributed uncertainty measure mapped over real space. The advantage of the approach is that it both captures distributed uncertainty and contains information on global likelihood that can be used to condition predictions of further events for which observed data are not available. The technique is applied to the problem of flood inundation prediction at two test sites representing different hydrodynamic conditions. In both cases, the method reveals the spatial structure in simulation uncertainty and simultaneously enables mapping of flood probability predicted by the model. Spatially distributed uncertainty analysis is shown to contain information over and above that available from global performance measures. Overall, the paper highlights the different types of information that may be obtained from mappings of model uncertainty over real and n‐dimensional parameter spaces. 
Copyright © 2002 John Wiley & Sons, Ltd.
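The core computation described in this GLUE extension, a per-pixel inundation probability in which each ensemble member's binary wet/dry map is weighted by its zero-dimensional global performance measure, can be sketched directly. Function and argument names are illustrative:

```python
def glue_flood_probability(wet_maps, global_likelihoods):
    """Distributed uncertainty from binary pattern data: for each pixel,
    the probability of inundation is the likelihood-weighted fraction of
    Monte Carlo ensemble members that predict the pixel wet. wet_maps is
    a list of equal-length 0/1 pixel lists; global_likelihoods holds one
    non-negative global performance measure per ensemble member."""
    total = sum(global_likelihoods)
    weights = [l / total for l in global_likelihoods]  # normalize to sum to 1
    n_pix = len(wet_maps[0])
    return [sum(w * m[i] for w, m in zip(weights, wet_maps))
            for i in range(n_pix)]
```

Because the weights carry the global likelihood, a parameter set that performs poorly overall contributes little to the mapped probability, which is the conditioning behaviour the paper exploits.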

16.
Much of the nonlinearity and uncertainty in the flood process arises because the hydrologic data required for estimation are often extremely difficult to obtain. This study employed a back-propagation network (BPN) as the main structure in flood forecasting to learn and demonstrate the sophisticated nonlinear mapping relationship. However, a deterministic BPN model implies high uncertainty and poor consistency in verification work even when the learning performance is satisfactory for flood forecasting. Therefore, a novel procedure is proposed in this investigation which integrates a linear transfer function (LTF) and a self-organizing map (SOM) to efficiently determine the intervals of the weights and biases of a flood forecasting neural network and so avoid these problems. A SOM network with classification ability was applied to the solutions and parameters of the BPN model in the learning stage, to classify the network parameter rules and obtain the winning parameters. The outcomes from this stage were then used as the ranges of the parameters in the recall stage. Finally, a case study was carried out in the Wu-Shi basin to demonstrate the effectiveness of the proposal. Copyright © 2009 John Wiley & Sons, Ltd.

17.
A common source of uncertainty in flood inundation forecasting is the hydrograph used. Given the role of sea–air–hydro–land chain processes in the water cycle, flood hydrographs in coastal areas can be indirectly affected by sea state. This study investigates sea-state effects on precipitation, discharge and flood inundation forecasting, implementing coupled atmospheric, ocean wave, hydrological and hydraulic-hydrodynamic models. The Chemical Hydrological Atmospheric Ocean wave System (CHAOS) was used for coupled hydro-meteorological-wave simulations either accounting or not accounting for the impact of sea state on precipitation and, subsequently, on the flood hydrograph. CHAOS includes the WRF-Hydro hydrological model and the WRF-ARW meteorological model, two-way coupled with the WAM wave model through the OASIS3-MCT coupler. Subsequently, the 2D HEC-RAS hydraulic-hydrodynamic model was forced by the flood hydrographs to map the inundated areas. A flash flood event that occurred on 15 November 2017 in Mandra, Attica, Greece, causing 24 fatalities and extensive damage, was selected as the case study. The models were calibrated using historical flood records and previous studies. Human interventions such as hydraulic works, as well as the urban areas, were included in the hydraulic modelling geometry domain. The representation of the resistance caused by buildings was based on Unmanned Aerial System (UAS) data, while the local elevation rise method was used in the urban flood simulation. The flood extent results were assessed using the Critical Success Index (CSI) and a penalized variant of CSI. Integrating sea state affected the forecast precipitation and discharge peaks, causing differences of up to +24% and from −8% to +36% respectively, improving the inundation forecast by 4.5% and flooding approximately 70 additional building blocks. The precipitation forcing time step was also highlighted as a significant factor for such a small-scale flash flood. The integrated multidisciplinary methodological approach could be adopted in operational forecasting for civil protection applications, facilitating the protection of socio-economic activities and human lives during similar future events.
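The Critical Success Index used above to score flood extent is a standard forecast verification measure: hits divided by the sum of hits, misses and false alarms, ignoring the (usually dominant) correctly dry pixels. A minimal implementation over binary masks:

```python
def critical_success_index(pred_wet, obs_wet):
    """CSI (threat score) for binary flood extent masks:
    hits / (hits + misses + false alarms). 1.0 is a perfect match,
    0.0 means no overlap; correct negatives (dry/dry) are ignored."""
    hits = sum(1 for p, o in zip(pred_wet, obs_wet) if p and o)
    misses = sum(1 for p, o in zip(pred_wet, obs_wet) if not p and o)
    false_alarms = sum(1 for p, o in zip(pred_wet, obs_wet) if p and not o)
    denom = hits + misses + false_alarms
    return hits / denom if denom else 1.0
```

The "penalized" variant mentioned in the abstract is not defined there, so it is not reproduced here.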

18.
Hydrological models used for flood prediction in ungauged catchments are commonly fitted to regionally transferred data. The key issue of this procedure is to identify hydrologically similar catchments, for which the dominant controls on the process of interest have to be known. In this study, we applied a new machine-learning-based approach to identify the catchment characteristics that indicate the active processes controlling runoff dynamics. A random forest (RF) regressor was trained to estimate the drainage velocity parameters of a geomorphologic instantaneous unit hydrograph (GIUH) in ungauged catchments, based on regionally available data. We analysed the learning procedure of the algorithm and identified preferred donor catchments for each ungauged catchment. Based on the machine learning results for catchment grouping, a classification scheme for drainage network characteristics was derived and applied in a flood forecasting case study. The results demonstrate that the RF could be trained properly with the selected donor catchments to successfully estimate the required GIUH parameters. Moreover, our results showed that drainage network characteristics can be used to identify the influence of geomorphological dispersion on the dynamics of catchment response.
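Once the GIUH ordinates are fixed (here by the RF-estimated drainage velocity parameters), the catchment response is obtained by convolving effective rainfall with the unit hydrograph. The sketch below shows that final routing step with illustrative ordinates; deriving the ordinates from drainage network geometry is the part the paper's RF supports and is not reproduced here.

```python
def giuh_hydrograph(effective_rain, unit_hydrograph):
    """Discrete convolution of effective rainfall with unit hydrograph
    ordinates. If the ordinates sum to 1, runoff volume is conserved:
    total flow equals total effective rainfall."""
    n = len(effective_rain) + len(unit_hydrograph) - 1
    q = [0.0] * n
    for i, p in enumerate(effective_rain):
        for j, u in enumerate(unit_hydrograph):
            q[i + j] += p * u
    return q
```

Faster drainage velocities concentrate the ordinates near the origin, producing a peakier, earlier hydrograph, which is why those parameters dominate the forecast dynamics.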

19.
Probabilistic seismic risk assessment for spatially distributed lifelines is less straightforward than for individual structures. While procedures such as the ‘PEER framework’ have been developed for risk assessment of individual structures, these are not easily applicable to distributed lifeline systems, due to difficulties in describing ground‐motion intensity (e.g. spectral acceleration) over a region (in contrast to ground‐motion intensity at a single site, which is easily quantified using Probabilistic Seismic Hazard Analysis), and since the link between the ground‐motion intensities and lifeline performance is usually not available in closed form. As a result, Monte Carlo simulation (MCS) and its variants are well suited for characterizing ground motions and computing resulting losses to lifelines. This paper proposes a simulation‐based framework for developing a small but stochastically representative catalog of earthquake ground‐motion intensity maps that can be used for lifeline risk assessment. In this framework, Importance Sampling is used to preferentially sample ‘important’ ground‐motion intensity maps, and K‐Means Clustering is used to identify and combine redundant maps in order to obtain a small catalog. The effects of sampling and clustering are accounted for through a weighting on each remaining map, so that the resulting catalog is still a probabilistically correct representation. The feasibility of the proposed simulation framework is illustrated by using it to assess the seismic risk of a simplified model of the San Francisco Bay Area transportation network. A catalog of just 150 intensity maps is generated to represent hazard at 1038 sites from 10 regional fault segments causing earthquakes with magnitudes between five and eight. The risk estimates obtained using these maps are consistent with those obtained using conventional MCS utilizing many orders of magnitude more ground‐motion intensity maps.
Therefore, the proposed technique can be used to drastically reduce the computational expense of a simulation‐based risk assessment, without compromising the accuracy of the risk estimates. This will facilitate computationally intensive risk analysis of systems such as transportation networks. Finally, the study shows that the uncertainties in the ground‐motion intensities and the spatial correlations between ground‐motion intensities at various sites must be modeled in order to obtain unbiased estimates of lifeline risk. Copyright © 2010 John Wiley & Sons, Ltd.
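The key bookkeeping step in the catalog-reduction idea is that when redundant intensity maps are merged, their probabilistic weights are summed, so expectations computed from the reduced catalog remain unbiased. The sketch below uses simple rounding to group near-identical maps in place of the paper's K-Means step (that substitution, and all names, are illustrative):

```python
def reduce_catalog(maps, weights, round_to=1):
    """Merge ground-motion intensity maps whose rounded intensity
    vectors coincide, summing their probabilistic weights so that the
    reduced catalog preserves total probability. Rounding stands in for
    the K-Means clustering step of the full framework."""
    catalog = {}
    for m, w in zip(maps, weights):
        key = tuple(round(x, round_to) for x in m)
        rep, w_old = catalog.get(key, (m, 0.0))  # keep first map as representative
        catalog[key] = (rep, w_old + w)
    return list(catalog.values())
```

Because the weights are carried along rather than discarded, risk metrics evaluated on the small catalog approximate those from the full Monte Carlo set, which is the property the paper verifies against conventional MCS.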

20.

Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号