Rainfall-triggered landslides cause thousands of deaths worldwide every year. One approach to limiting the socioeconomic consequences of such events is the development of climatic thresholds for landslide initiation. In this paper, we propose a method that incorporates antecedent rainfall and streamflow data to develop a landslide initiation threshold for the North Shore Mountains of Vancouver, British Columbia. Hydroclimatic data were gathered for 18 storms that triggered landslides and 18 storms that did not. Discriminant function analysis separated the landslide-triggering storms from the nonlandslide-triggering storms and selected the variables most meaningful for this separation. Discriminant functions were developed for both storm groups, and the difference between their discriminant scores, ΔCS, serves as a measure of landslide susceptibility during a storm. The variables that best separate the two storm groups are the 4-week rainfall prior to a significant storm, the 6-h rainfall during a storm, and the number of hours during a storm in which discharge at Mackay Creek exceeded 1 m³/s. Three thresholds were identified. The Landslide Warning Threshold (LWT) is reached when ΔCS equals −1. The Conditional Threshold for Landslide Initiation (CTLI) is reached when ΔCS equals zero; it implies that landslides become likely once a rainfall intensity of 4 mm/h is exceeded, at which point the Imminent Threshold for Landslide Initiation (ITLI) is reached. The LWT allows time to issue a landslide advisory and to move personnel out of hazardous areas. The methodology proposed in this paper can be transferred to other regions worldwide where the type and quality of available data permit this kind of analysis.
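A minimal sketch of how such a two-group discriminant analysis and the ΔCS-based threshold check could be reproduced, assuming a handful of hypothetical storm records with the three selected variables; the numbers are placeholders, and the signed decision score of a binary linear discriminant stands in here for the paper's ΔCS:

```python
# Sketch of a two-group discriminant analysis for storm classification;
# hypothetical records, not the authors' data or code.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Each row: [4-week antecedent rainfall (mm), 6-h storm rainfall (mm),
#            hours with discharge > 1 m^3/s at the index gauge]
landslide_storms = np.array([
    [310.0, 42.0, 18], [275.0, 38.0, 14],
    [330.0, 51.0, 22], [260.0, 35.0, 12],
])
quiet_storms = np.array([
    [120.0, 20.0, 3], [180.0, 25.0, 6],
    [150.0, 18.0, 2], [200.0, 22.0, 5],
])

X = np.vstack([landslide_storms, quiet_storms])
y = np.array([1] * len(landslide_storms) + [0] * len(quiet_storms))

lda = LinearDiscriminantAnalysis().fit(X, y)

def delta_cs(storm):
    """Signed distance from the decision boundary, standing in for dCS:
    0 marks the conditional threshold, -1 the earlier warning level."""
    return float(lda.decision_function([storm])[0])

score = delta_cs([290.0, 45.0, 16])
if score >= 0.0:
    print(f"dCS = {score:.2f}: conditional threshold (CTLI) reached")
elif score >= -1.0:
    print(f"dCS = {score:.2f}: landslide warning threshold (LWT) reached")
else:
    print(f"dCS = {score:.2f}: below the warning threshold")
```

In practice the classifier would be trained on the full sample of 18 + 18 storms, and the −1 and 0 cut-offs recalibrated to the resulting score scale.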
Limiting global warming to ‘well below’ 2°C above pre-industrial levels and pursuing efforts to limit the temperature increase even further to 1.5°C is an integral part of the 2015 Paris Agreement. To achieve these aims, cumulative global carbon emissions after 2016 should not exceed 940 to 390 Gt of CO2 (for the 2°C target) and 167 to −48 Gt of CO2 (for the 1.5°C target) by the end of the century. This paper analyses the EU’s cumulative carbon emissions in different models and scenarios (global models, EU-focused models and national carbon mitigation scenarios). Owing to the deeper reductions in energy use and in the carbon intensity of the end-use sectors in the national scenarios, we identify an additional mitigation potential of 26–37 Gt of cumulative CO2 emissions up to 2050 compared with what is currently included in global or EU scenarios. These additional reductions could help both to reduce the need for carbon dioxide removals and to bring cumulative emissions in global and EU scenarios in line with a fairness-based domestic EU budget for a 2°C target, while still remaining well above the budget for 1.5°C (a sketch of the underlying cumulative-budget bookkeeping follows the key policy insights below).
Key policy insights:
Models used for policy advice, such as global integrated assessment models or EU models, fail to capture some of the mitigation potential available at the sectoral level.
Global and EU models rely on significant CO2 emission reductions from carbon capture and storage not only to reach the 1.5°C target but also to reach the 2°C target.
Global and EU model scenarios are not compatible with a fair domestic EU share in the global carbon budget for either the 2°C or the 1.5°C target.
Integrating additional sectoral mitigation potential from detailed national models can help bring down cumulative emissions in global and EU models to a level comparable to a fairness-based domestic EU share compatible with the 2°C target, but not the 1.5°C aspiration.
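A minimal sketch of the cumulative-budget bookkeeping behind this comparison, assuming two illustrative annual EU emission pathways and a placeholder budget figure; none of the numbers are taken from the paper or its underlying scenarios:

```python
# Sketch of cumulative carbon-budget bookkeeping: integrate annual
# pathways to cumulative emissions and compare them with a budget.
# All figures are illustrative placeholders, not scenario results.
import numpy as np

years = np.arange(2016, 2051)  # 2016..2050 inclusive

# Hypothetical EU annual CO2 emissions (Gt/yr): a model pathway and a
# national-scenario pathway with deeper end-use reductions.
model_pathway = np.linspace(3.5, 1.0, len(years))
national_pathway = np.linspace(3.5, 0.3, len(years))

def cumulative_gt(annual):
    """Cumulative emissions as the sum of annual values (Gt CO2)."""
    return float(np.sum(annual))

model_total = cumulative_gt(model_pathway)
national_total = cumulative_gt(national_pathway)
print(f"Model pathway, 2016-2050:    {model_total:6.1f} Gt CO2")
print(f"National pathway, 2016-2050: {national_total:6.1f} Gt CO2")
print(f"Additional mitigation:       {model_total - national_total:6.1f} Gt CO2")

# Compare against an assumed fairness-based domestic EU budget (Gt CO2).
eu_budget_2c = 70.0  # placeholder, not a figure from the paper
for name, total in [("model", model_total), ("national", national_total)]:
    verdict = "within" if total <= eu_budget_2c else "above"
    print(f"{name} pathway is {verdict} the assumed budget of {eu_budget_2c:.0f} Gt")
```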
At the beginning of the twenty-first century, a technological change took place in geodetic astronomy with the development of Digital Zenith Camera Systems (DZCS). Such instruments provide vertical deflection data at an angular accuracy level of 0.1″ and better. Recently, DZCS have been employed to collect dense sets of astrogeodetic vertical deflection data in several test areas in Germany for which high-resolution digital terrain model (DTM) data (10–50 m resolution) are available. These considerable advancements motivate a new analysis of the method of astronomical-topographic levelling, which uses DTM data for the interpolation between the astrogeodetic stations. We present and analyse a least-squares collocation technique that uses DTM data for the accurate interpolation of vertical deflection data. The combination of both data sets allows a precise determination of the gravity field along profiles, even in regions with rugged topography. The accuracy of the method is studied with particular attention to the density of astrogeodetic stations. The error propagation rule of astronomical levelling is derived empirically; it accounts for the signal omission error, which increases with the station spacing. In a test area in the German Alps, the method was successfully applied to determine a quasigeoid profile 23 km in length. For station spacings from a few hundred metres to about 2 km, the accuracy of the quasigeoid was found to be about 1–2 mm, which corresponds to a relative accuracy of about 0.05–0.1 ppm. Application examples are given, such as the local and regional validation of gravity field models computed from gravimetric data and the economical determination of the gravity field in regions with sparse geodetic coverage.
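A minimal sketch of least-squares collocation interpolation of vertical deflections along a profile, followed by the astronomical levelling integral; the Gaussian covariance model, noise level, and station values are assumptions for illustration, and the DTM-based reduction of the deflections that the paper relies on is omitted here:

```python
# Sketch of least-squares collocation (LSC) for interpolating vertical
# deflections along a profile, then astronomical levelling by integrating
# the interpolated deflections. All values are illustrative assumptions.
import numpy as np

def gauss_cov(d, c0=1.0, corr_len=2.0):
    """Assumed Gaussian covariance of the deflection signal vs. distance (km)."""
    return c0 * np.exp(-(d / corr_len) ** 2)

# Astrogeodetic stations: profile positions (km) and observed deflections
# of the vertical in the profile direction (arcsec).
x_obs = np.array([0.0, 1.5, 3.2, 5.0, 6.8])
eps_obs = np.array([2.1, 2.4, 1.8, 1.2, 1.6])
noise_var = 0.1 ** 2  # assumed 0.1 arcsec observation noise

# Dense prediction grid along the profile.
x_pred = np.arange(0.0, 7.0, 0.1)

# LSC prediction: eps_hat = C_ps (C_ss + C_nn)^(-1) l, applied to centred
# data because collocation assumes a zero-mean signal.
mean_eps = eps_obs.mean()
C_ss = gauss_cov(np.abs(x_obs[:, None] - x_obs[None, :]))
C_ps = gauss_cov(np.abs(x_pred[:, None] - x_obs[None, :]))
l = eps_obs - mean_eps
eps_pred = mean_eps + C_ps @ np.linalg.solve(
    C_ss + noise_var * np.eye(len(x_obs)), l)

# Astronomical levelling: height anomaly differences from the negative
# path integral of the deflections (trapezoidal rule); arcsec -> rad and
# km -> m give the result in metres.
ARCSEC = np.pi / (180.0 * 3600.0)
dzeta = -np.cumsum(0.5 * (eps_pred[1:] + eps_pred[:-1]) * ARCSEC
                   * np.diff(x_pred) * 1000.0)
print(f"Height anomaly change over {x_pred[-1]:.1f} km: {1000.0 * dzeta[-1]:.1f} mm")
```

The millimetre-level accuracies quoted in the abstract depend on the dense DTM reduction and station spacing; this sketch only illustrates the prediction and integration steps.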