Similar Literature
20 similar records retrieved.
1.
Developing economy greenhouse gas emissions are growing rapidly relative to developed economy emissions (Boden et al. 2010), and developing economies as a group have greater emissions than developed economies. These developments are expected to continue (U.S. Energy Information Administration 2010), which has led some to question the effectiveness of emissions mitigation in developed economies without a commitment to extensive mitigation action from developing economies. One often-heard argument against proposed U.S. legislation to limit carbon emissions to mitigate climate change is that, without participation from large developing economies like China and India, stabilizing temperature at 2 degrees Celsius above preindustrial levels (United Nations 2009), or even reducing global emissions levels, would be impossible (Driessen 2009; RPC Energy Facts 2009) or prohibitively expensive (Clarke et al. 2009). Here we show that significantly delayed action by rapidly developing countries is not a reason to forgo mitigation efforts in developed economies. This letter examines the effect of a scenario with no explicit international climate policy and two policy scenarios, full global action and a developing economy delay, on the probability of exceeding various global average temperature changes by 2100. This letter demonstrates that even when developing economies delay any mitigation efforts until 2050, action by developed economies will appreciably reduce the probability of the more extreme levels of temperature change. This paper concludes that early carbon mitigation efforts by developed economies will considerably affect the distribution of future climate change, whether or not developing countries begin mitigation efforts in the near term.

2.
Gary Yohe 《Climatic change》2010,99(1-2):295-302
Article 2 of the United Nations Framework Convention on Climate Change commits its parties to stabilizing greenhouse gas concentrations in the atmosphere at a level that “would prevent dangerous anthropogenic interference with the climate system.” Authors of the Third Assessment Report of the Intergovernmental Panel on Climate Change (IPCC 2001a, b) offered some insight into what negotiators might consider dangerous by highlighting five “reasons for concern” (RFCs) and tracking concern against changes in global mean temperature; they illustrated their assessments in the now iconic “burning embers” diagram. The Fourth Assessment Report reaffirmed the value of plotting RFCs against temperature change (IPCC 2007a, b), and Smith et al. (2009) produced an updated embers visualization for the globe. This paper applies the same assessment and communication strategies to calibrate the comparable RFCs for the United States. It adds “National Security Concern” as a sixth RFC because many now see changes in the intensity and/or frequency of extreme events around the world as “risk enhancers” that deserve attention at the highest levels of the US policy and research communities. The US embers portrayed here suggest that: (1) US policy-makers will not discover anything really “dangerous” over the near to medium term if they consider only economic impacts that are aggregated across the entire country, but that (2) they could easily uncover “dangerous anthropogenic interference with the climate system” by focusing their attention on changes in the intensities, frequencies, and regional distributions of extreme weather events driven by climate change.

3.
Global and regional trends in greenhouse gas emissions from livestock
Following IPCC guidelines (IPCC 2006), we estimate greenhouse gas emissions related to livestock in 237 countries and 11 livestock categories during the period 1961–2010. We find that in 2010 emissions of methane and nitrous oxide related to livestock worldwide represented approximately 9 % of total greenhouse gas (GHG) emissions. Global GHG emissions from livestock increased by 51 % during the analyzed period, mostly due to strong growth of emissions in developing (Non-Annex I) countries (+117 %). In contrast, developed country (Annex I) emissions decreased (−23 %). Beef and dairy cattle are the largest source of livestock emissions (74 % of global livestock emissions). Since developed countries tend to have lower CO2-equivalent GHG emissions per unit GDP and per quantity of product generated in the livestock sector, the amount of wealth generated per unit GHG emitted from the livestock sector can be increased by improving both livestock farming practices in developing countries and the overall state of economic development. Our results reveal important details of how livestock production and associated GHG emissions have evolved in time and space. Discrepancies with higher tiers demonstrate the value of more detailed analyses, and discourage over-interpretation of smaller-scale trends in the Tier 1 results, but do not undermine the value of global Tier 1 analysis.
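A toy calculation of the wealth-per-emissions idea in the abstract above (all numbers are hypothetical, chosen only to illustrate the intensity comparison):

```python
def ghg_intensity(emissions_mt_co2e, gdp_billion_usd):
    """Emission intensity of a livestock sector: tonnes of CO2-equivalent
    GHG emitted per million USD of sector output."""
    tonnes = emissions_mt_co2e * 1e6          # Mt -> t
    gdp_millions = gdp_billion_usd * 1e3      # billion -> million USD
    return tonnes / gdp_millions

# Hypothetical Annex I vs Non-Annex I livestock sectors: the developed
# economy generates more wealth per unit of GHG emitted.
intensity_annex1 = ghg_intensity(500, 400)        # 1250 t CO2e per M$
intensity_non_annex1 = ghg_intensity(2000, 600)   # higher intensity
```

Lowering the second figure toward the first, by improved farming practices or economic development, is exactly the lever the abstract describes.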

4.
In public debate surrounding climate change, scientific uncertainty is often cited in connection with arguments against mitigative action. This article examines the role of uncertainty about future climate change in determining the likely success or failure of mitigative action. We show by Monte Carlo simulation that greater uncertainty translates into a greater likelihood that mitigation efforts will fail to limit global warming to a target (e.g., 2 °C). The effect of uncertainty can be reduced by limiting greenhouse gas emissions. Taken together with the fact that greater uncertainty also increases the potential damages arising from unabated emissions (Lewandowsky et al. 2014), any appeal to uncertainty implies a stronger, rather than weaker, need to cut greenhouse gas emissions than in the absence of uncertainty.
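A minimal Monte Carlo sketch of the abstract's central point. The warming distribution, its central estimate, and its spread are illustrative assumptions, not values from the study:

```python
import random

def exceedance_probability(mean_warming, sd, target=2.0, n=100_000, seed=1):
    """Monte Carlo estimate of the probability that realized warming
    exceeds `target`, for a normally distributed warming outcome."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n) if rng.gauss(mean_warming, sd) > target)
    return hits / n

# Same central estimate (1.8 degrees), different spreads: wider
# uncertainty raises the chance of overshooting the 2-degree target.
p_narrow = exceedance_probability(1.8, 0.3)
p_wide = exceedance_probability(1.8, 0.9)
```

Because the central estimate sits below the target, widening the distribution moves more probability mass past it, which is the "greater uncertainty means greater likelihood of failure" result.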

5.
The design of optimal environmental policy is subject to model uncertainty. We investigate the consequences in a simple linear model in which the policymaker aims to stabilise the atmospheric content of carbon. We study how decision-makers' concern about robustness alters policy, using the Hansen and Sargent (2003, 2008) approach. The analysis shows that a policymaker who fears model misspecification should react more aggressively to changes in the stock of atmospheric carbon and reduce emissions more strongly.

6.
A. Bun  K. Hamal  M. Jonas  M. Lesiv 《Climatic change》2010,103(1-2):215-225
The focus of this study is on the preparatory detection of uncertain greenhouse gas (GHG) emission changes (also termed emission signals) under the Kyoto Protocol. Preparatory signal detection is a measure that should be taken prior to/during negotiation of the Protocol. It allows the ranking of countries under the Protocol according to their realized versus their agreed emission changes and in terms of both certainty and credibility. Controlling GHGs is affected by uncertainty and may be costly. Thus, knowing whether each nation is doing its part is in the public interest. At present, however, parties to the United Nations Framework Convention on Climate Change (UNFCCC) are obliged to include in the reporting of their annual inventories direct or alternative estimates of the uncertainty associated with these inventories, consistent with the Intergovernmental Panel on Climate Change's (IPCC) good practice guidance reports. As a consequence, inventory uncertainty is monitored, but not regulated, under the Kyoto Protocol. Although uncertainties are becoming increasingly available, monitored emissions and uncertainties are still dealt with separately. In our study we analyze estimates of both emission changes and uncertainties to advance the evaluation of countries and their performance under the Protocol. Our analysis allows supply and demand of emissions credits to be examined in consideration of uncertainty. For the purpose of our exercise, we make use of the Undershooting and Verification Time concept described by Jonas et al. (Clim Change doi:10.1007/s10584-010-9914-6, 2010).

7.
Chris Hope 《Climatic change》2013,117(3):531-543
PAGE09 is an updated version of the PAGE2002 integrated assessment model (Hope 2011a). The default PAGE09 model gives a mean estimate of the social cost of CO2 (SCCO2) of $106 per tonne of CO2, compared to $81 from the PAGE2002 model used in the Stern review (Stern 2007). The increase is the net result of several improvements that have been incorporated into the PAGE09 model in response to the critical debate around the Stern review: the adoption of the A1B socio-economic scenario, rather than A2, whose population assumptions are now thought to be implausible; the use of ranges for the two components of the discount rate, rather than the single values used in the Stern review; a distribution for the climate sensitivity that is consistent with the latest estimates from IPCC 2007a; less adaptation than in PAGE2002, particularly in the economic sector, which was criticised for possibly being over-optimistic; and a more theoretically justified basis of valuation that gives results appropriate to a representative agent from the focus region, the EU. The effect of each of these adjustments is quantified and explained.

8.
Greenhouse gas emission inventories are computed with rather low precision. Moreover, their uncertainty distributions may be asymmetric. This should be accounted for in compliance and trading rules. In this paper we model the uncertainty of inventories as intervals or as fuzzy numbers; the latter allows us to better shape the uncertainty distributions. The compliance and emission trading rules obtained generalize the results for the symmetric uncertainty distributions that were considered in earlier papers by the present authors (Nahorski et al., Water Air & Soil Pollution. Focus 7(4–5):539–558, 2007; Nahorski and Horabik, J Energy Eng 134(2):47–52, 2008). However, unlike in the symmetric case, in the asymmetric fuzzy case it is necessary to apply approximations because of nonlinearities in the formulas. The final conclusion is that the interval uncertainty rules can be applied, but with a much higher substitutional noncompliance risk, which is a parameter of the rules.
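A minimal sketch of an interval-based compliance rule in the spirit described above. This is a simplified illustration, not the authors' fuzzy-number formulation, and the numbers are invented:

```python
def interval_compliant(reported, half_width, limit):
    """Interval-uncertainty compliance test: the party passes only if
    the entire interval [reported - half_width, reported + half_width]
    lies at or below its emission limit."""
    return reported + half_width <= limit

def tradable_credits(reported, half_width, limit):
    """Credits that can be safely sold under the interval rule: the
    undershoot remaining after accounting for the interval's upper bound."""
    return max(0.0, limit - (reported + half_width))

# Symmetric +/- 4.5-unit uncertainty on 90 reported units vs a 100-unit limit:
ok = interval_compliant(90.0, 4.5, 100.0)
credits = tradable_credits(90.0, 4.5, 100.0)
```

An asymmetric (e.g., fuzzy) uncertainty description would replace the single `half_width` with separate lower and upper spreads, which is where the nonlinearities discussed in the abstract arise.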

9.
Expert elicitation studies have become important barometers of scientific knowledge about future climate change (Morgan and Keith, Environ Sci Technol 29(10), 1995; Reilly et al., Science 293(5529):430–433, 2001; Morgan et al., Climate Change 75(1–2):195–214, 2006; Zickfeld et al., Climatic Change 82(3–4):235–265, 2007, Proc Natl Acad Sci 2010; Kriegler et al., Proc Natl Acad Sci 106(13):5041–5046, 2009). Elicitations incorporate experts’ understanding of known flaws in climate models, thus potentially providing a more comprehensive picture of uncertainty than model-driven methods. The goal of standard elicitation procedures is to determine experts’ subjective probabilities for the values of key climate variables. These methods assume that experts’ knowledge can be captured by subjective probabilities—however, foundational work in decision theory has demonstrated this need not be the case when their information is ambiguous (Ellsberg, Q J Econ 75(4):643–669, 1961). We show that existing elicitation studies may qualitatively understate the extent of experts’ uncertainty about climate change. We designed a choice experiment that allows us to empirically determine whether experts’ knowledge about climate sensitivity (the equilibrium surface warming that results from a doubling of atmospheric CO2 concentration) can be captured by subjective probabilities. Our results show that, even for this much studied and well understood quantity, a non-negligible proportion of climate scientists violate the choice axioms that must be satisfied for subjective probabilities to adequately describe their beliefs. Moreover, the cause of their violation of the axioms is the ambiguity in their knowledge. We expect these results to hold to a greater extent for less understood climate variables, calling into question the veracity of previous elicitations for these quantities. 
Our experimental design provides an instrument for detecting ambiguity, a valuable new source of information when linking climate science and climate policy which can help policy makers select decision tools appropriate to our true state of knowledge.

10.
We present further steps in our analysis of the early anthropogenic hypothesis (Ruddiman, Clim Change 61:261–293, 2003) that increased levels of greenhouse gases in the current interglacial, compared to lower levels in previous interglacials, were initiated by early agricultural activities, and that these increases caused a warming of climate long before the industrial era (~1750). These steps include updating observations of greenhouse gas and climate trends from earlier interglacials, reviewing recent estimates of greenhouse gas emissions from early agriculture, and describing a simulation by a climate model with a dynamic ocean forced by the low levels of greenhouse gases typical of previous interglacials in order to gauge the magnitude of the climate change for an inferred (natural) low greenhouse gas level relative to a high present day level. We conduct two time slice (equilibrium) simulations using present day orbital forcing and two levels of greenhouse gas forcing: the estimated low (natural) levels of previous interglacials, and the high levels of the present (control). By comparing the former to the latter, we estimate how much colder the climate would be without the combined greenhouse gas forcing of the early agriculture era (inferred from differences between this interglacial and previous interglacials) and the industrial era (the period since ~1750). With the low greenhouse gas levels, the global average surface temperature is 2.7 K lower than present day—ranging from ~2 K lower in the tropics to 4–8 K lower in polar regions. These changes are large, and larger than those reported in a pre-industrial (~1750) simulation with this model, because the imposed low greenhouse gas levels (CH4 = 450 ppb, CO2 = 240 ppm) are lower than both pre-industrial (CH4 = 760 ppb, CO2 = 280 ppm) and modern control (CH4 = 1,714 ppb, CO2 = 355 ppm) values. 
The area of year-round snowcover is larger, as found in our previous simulations and some other modeling studies, indicating that a state of incipient glaciation would exist given the current configuration of earth’s orbit (reduced insolation in northern hemisphere summer) and the imposed low levels of greenhouse gases. We include comparisons of these snowcover maps with known locations of earlier glacial inception and with locations of twentieth century glaciers and ice caps. In two earlier studies, we used climate models consisting of atmosphere, land surface, and a shallow mixed-layer ocean (Ruddiman et al., Quat Sci Rev 25:1–10, 2005; Vavrus et al., Quat Sci Rev 27:1410–1425, 2008). Here, we replaced the mixed-layer ocean with a complete dynamic ocean. While the simulated climate of the atmosphere and the surface with this improved model configuration is similar to our earlier results (Vavrus et al., Quat Sci Rev 27:1410–1425, 2008), the added information from the full dynamical ocean is of particular interest. The global and vertically-averaged ocean temperature is 1.25 K lower, the area of sea ice is larger, and there is less upwelling in the Southern Ocean. From these results, we infer that natural ocean feedbacks could have amplified the greenhouse gas changes initiated by early agriculture and possibly account for an additional increment of CO2 increase beyond that attributed directly to early agriculture, as proposed by Ruddiman (Rev Geophys 45:RG4001, 2007). However, a full test of the early anthropogenic hypothesis will require additional observations and simulations with models that include ocean and land carbon cycles and other refinements elaborated herein.

11.
Global Circulation Models (GCMs) provide projections for future climate warming using a wide variety of highly sophisticated anthropogenic CO2 emissions scenarios as input, each based on the evolution of four emissions “drivers”: population p, standard of living g, energy productivity (or efficiency) f, and energy carbonization c (IPCC WG III 2007). The range of scenarios considered is extremely broad, however, and this is a primary source of forecast uncertainty (Stott and Kettleborough, Nature 416:723–725, 2002). Here, it is shown both theoretically and observationally how the evolution of the human system can be considered from a surprisingly simple thermodynamic perspective in which it is unnecessary to explicitly model two of the emissions drivers: population and standard of living. Specifically, the human system grows through a self-perpetuating feedback loop in which the consumption rate of primary energy resources stays tied to the historical accumulation of global economic production (or p×g) through a time-independent factor of 9.7±0.3 mW per inflation-adjusted 1990 US dollar. This important constraint, and the fact that f and c have historically varied rather slowly, points towards substantially narrowed visions of future emissions scenarios for implementation in GCMs.
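The stated constraint can be turned into a back-of-the-envelope check. The 9.7 mW per 1990 US$ factor comes from the abstract; the cumulative-production figure below is an assumed illustration, not a value from the paper:

```python
# 9.7 mW per inflation-adjusted 1990 US$, from the abstract.
LAMBDA_W_PER_1990_USD = 9.7e-3

def implied_power_demand(cumulative_production_1990_usd):
    """Global primary-energy consumption rate (W) implied by the fixed
    ratio to the historical accumulation of economic production (p x g)."""
    return LAMBDA_W_PER_1990_USD * cumulative_production_1990_usd

# An illustrative cumulative world production of 1.7e15 1990-US$ would
# imply about 16.5 TW of primary power demand under this relation.
power_w = implied_power_demand(1.7e15)
```

The point of the constraint is that once cumulative production is known, current power demand follows without separately modelling population or standard of living.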

12.
Simulated climate variables in a simple energy balance model subject to linearly increasing external forcing (due to increasing greenhouse gas emissions) and random internal forcings have been studied for more accurate climate prediction. The numerical method for such a system requires careful treatment of the random forcings. Mathematical analyses show that the effect of random forcings should be diminished in the numerical integration method by the reciprocal of the square root of the integration time step, $1/\sqrt{\Delta t}$, which we call an attenuator. Our simulations consistently show that the attenuator desirably reduces the variances of simulated climate variables and eliminates overestimation of the variances. However, the attenuator tends to bias the estimates of the climate feedback parameter obtained from a simple regression analysis of simulated variables toward unrealistically low values. This is because the reduced random forcings amplify the negative effect of a warming trend due to greenhouse gas emissions (when added to random forcing) on feedback estimation. Without the attenuator, the estimated feedback is much more accurate. The bias induced by the attenuator was largely resolved for the feedback estimation by the methodology of Lindzen and Choi (Asia-Pacific J Atmos Sci 47(4):377–390, 2011), which minimizes the negative effect of the warming trends by isolating short (few-month) segments of increasing and decreasing temperature changes.
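A hedged sketch of the variance effect described above, not the authors' model: a zero-dimensional stochastic energy-balance model with invented parameter values. Here the attenuation is implemented as multiplying the per-step noise by sqrt(dt), which is one common reading of the scaling in the abstract (and the standard Ito convention for white-noise forcing); treat it as illustrative:

```python
import math
import random

def simulate_ebm(attenuate, n_steps=20_000, dt=0.1, lam=1.0,
                 heat_cap=5.0, sigma=2.0, seed=0):
    """Euler-Maruyama integration of a toy energy-balance model,
    dT = (-lam*T) dt / C + noise / C. With `attenuate=True` the
    per-step random forcing is scaled by sqrt(dt); with False it is
    applied at full strength each step, overestimating the variance."""
    rng = random.Random(seed)
    temp, temps = 0.0, []
    scale = math.sqrt(dt) if attenuate else 1.0
    for _ in range(n_steps):
        forcing = sigma * rng.gauss(0.0, 1.0) * scale
        temp += (-lam * temp * dt + forcing) / heat_cap
        temps.append(temp)
    return temps

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# For dt < 1 the attenuator shrinks the simulated temperature variance.
var_naive = variance(simulate_ebm(attenuate=False))
var_att = variance(simulate_ebm(attenuate=True))
```

Because the toy dynamics are linear and start from zero, the attenuated path is an exact rescaling of the naive one, which makes the variance reduction easy to see in isolation.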

13.
On the time-varying trend in global-mean surface temperature
The Earth has warmed at an unprecedented pace in the decades of the 1980s and 1990s (IPCC in Climate change 2007: the scientific basis, Cambridge University Press, Cambridge, 2007). In Wu et al. (Proc Natl Acad Sci USA 104:14889–14894, 2007) we showed that the rapidity of the warming in the late twentieth century was a result of concurrence of a secular warming trend and the warming phase of a multidecadal (~65-year period) oscillatory variation and we estimated the contribution of the former to be about 0.08°C per decade since ~1980. Here we demonstrate the robustness of those results and discuss their physical links, considering in particular the shape of the secular trend and the spatial patterns associated with the secular trend and the multidecadal variability. The shape of the secular trend and rather globally-uniform spatial pattern associated with it are both suggestive of a response to the buildup of well-mixed greenhouse gases. In contrast, the multidecadal variability tends to be concentrated over the extratropical Northern Hemisphere and particularly over the North Atlantic, suggestive of a possible link to low frequency variations in the strength of the thermohaline circulation. Depending upon the assumed importance of the contributions of ocean dynamics and the time-varying aerosol emissions to the observed trends in global-mean surface temperature, we estimate that up to one third of the late twentieth century warming could have been a consequence of natural variability.

14.
A new approach is proposed to predict concentration fluctuations in the framework of one-particle Lagrangian stochastic models. The approach is innovative since it allows the computation of concentration fluctuations in dispersing plumes using a Lagrangian one-particle model with micromixing, but with no need to simulate background particles. The model is also extended to the treatment of chemically reactive plumes, allowing the computation of plume-related chemical reactions in a Lagrangian one-particle framework separately from the background chemical reactions and accounting for the effect of concentration fluctuations on chemical reactions in a general, albeit approximate, manner. These characteristics should make the proposed approach an ideal tool for plume-in-grid calculations in chemistry transport models. The results are compared to the wind-tunnel experiments of Fackrell and Robins (J Fluid Mech, 117:1–26, 1982) for plume dispersion in a neutral boundary layer and to the measurements of Legg et al. (Boundary-Layer Meteorol, 35:277–302, 1986) for line source dispersion in and above a model canopy. Preliminary reacting plume simulations are also shown comparing the model with the experimental results of Brown and Bilger (J Fluid Mech, 312:373–407, 1996; Atmos Environ, 32:611–628, 1998) to demonstrate the feasibility of computing chemical reactions in the proposed framework.

15.
Fifty-four broadband models for computation of solar diffuse irradiation on a horizontal surface were tested in Romania (South-Eastern Europe). The input data consist of surface meteorological data, column integrated data, and data derived from satellite measurements. The testing procedure is performed in 21 stages intended to provide information about the sensitivity of the models to various sets of input data. No model can be ranked “the best” for all sets of input data. However, some of the models performed better than others, in the sense that they were ranked among the best for most of the testing stages. The best models for solar diffuse radiation computation are, on equal footing, the ASHRAE 2005 model (ASHRAE 2005) and the King model (King and Buckius, Solar Energy 22:297–301, 1979). The second best model is the MAC model (Davies, Bound Layer Meteor 9:33–52, 1975). Details about the performance of each model in the 21 testing stages are found in the Electronic Supplementary Material.

16.
For many decades, attempts have been made to find the universal value of the critical bulk Richardson number ($Ri_{Bc}$; defined over the entire stable boundary layer). By analyzing an extensive large-eddy simulation database and various published wind-tunnel data, we show that $Ri_{Bc}$ is not a constant; rather, it strongly depends on bulk atmospheric stability. A (qualitatively) similar dependency, based on the well-known resistance laws, was reported by Melgarejo and Deardorff (J Atmos Sci 31:1324–1333, 1974) about forty years ago. To the best of our knowledge, this result has largely been ignored. Based on data analysis, we find that the stability-dependent $Ri_{Bc}$ estimates boundary-layer height more accurately than the conventional constant-$Ri_{Bc}$ approach. Furthermore, our results indicate that the common practice of setting $Ri_{Bc}$ to a constant in numerical modelling studies implicitly constrains the bulk stability of the simulated boundary layer. The proposed stability-dependent $Ri_{Bc}$ does not suffer from such an inappropriate constraint.
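For illustration, the conventional constant-$Ri_{Bc}$ diagnosis of boundary-layer height that the abstract argues against can be sketched as follows. The profile values and the critical value 0.25 are hypothetical; the paper's point is precisely that a single critical value is questionable:

```python
def bulk_richardson(z, theta_z, theta_s, u_z, v_z, g=9.81):
    """Bulk Richardson number over the layer from the surface to height z:
    Ri_B = (g / theta_s) * (theta_z - theta_s) * z / (u^2 + v^2)."""
    return (g / theta_s) * (theta_z - theta_s) * z / (u_z ** 2 + v_z ** 2)

def boundary_layer_height(profile, ri_bc=0.25):
    """Lowest level at which Ri_B first exceeds the critical value ri_bc.
    `profile` is a list of (z, theta, u, v) tuples ordered by height."""
    theta_s = profile[0][1]
    for z, theta, u, v in profile[1:]:
        if bulk_richardson(z, theta, theta_s, u, v) > ri_bc:
            return z
    return None

# Hypothetical stable-layer profile: potential temperature (K) increasing
# with height (m), wind components (m/s) roughly constant aloft.
profile = [(0, 285.0, 0.0, 0.0), (50, 285.5, 5.0, 0.0),
           (100, 286.5, 6.0, 0.0), (200, 289.0, 6.5, 0.0)]
h = boundary_layer_height(profile)
```

A stability-dependent scheme would replace the fixed `ri_bc=0.25` with a function of bulk stability, which is the modification the paper proposes.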

17.
Spatial GHG inventory at the regional level: accounting for uncertainty
R. Bun  Kh. Hamal  M. Gusti  A. Bun 《Climatic change》2010,103(1-2):227-244
A methodology and geo-information technology for spatial analysis of greenhouse gas (GHG) emission processes from mobile and stationary sources of the energy sector, at the level of elementary plots, are developed. The methodology, which takes into account the territorial specificity of point, line, and area emission sources, is based on official statistical surveys. The spatial distribution of emissions and their structure for the main branches of the energy sector in the territory of the Lviv region of Ukraine are analyzed. The relative uncertainties of the emission estimates obtained are calculated using knowledge of the spatial location of emission sources, following the Tier 1 and Tier 2 approaches of the IPCC methodologies. The sensitivity of the total relative uncertainty to changes in input data uncertainties is studied for the biggest point sources of emissions. Several scenarios for a transition to alternative energy generation are considered, and the corresponding changes in the structure of greenhouse gas emissions are analyzed. The influence of these structural changes on the total uncertainty of the greenhouse gas inventory results is studied.
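A sketch of the Tier 1 style combination of source-category uncertainties for a summed inventory (emissions-weighted root-sum-of-squares). The category values below are illustrative placeholders, not figures from the Lviv inventory:

```python
import math

def combined_relative_uncertainty(emissions, rel_uncertainties):
    """Relative uncertainty of a total formed by summing independent
    source categories: sqrt(sum((e_i * u_i)^2)) / sum(e_i), the usual
    Tier 1 error-propagation rule for addition."""
    total = sum(emissions)
    rss = math.sqrt(sum((e * u) ** 2
                        for e, u in zip(emissions, rel_uncertainties)))
    return rss / total

# Hypothetical point, line, and area sources (kt CO2e) with relative
# uncertainties: a large, well-known source dominates the total.
u_total = combined_relative_uncertainty([800, 150, 50], [0.05, 0.30, 0.50])
```

This shows why a shift toward many small, poorly known sources (as in some restructuring scenarios) can raise the total inventory uncertainty even if total emissions fall.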

18.
19.
The local thermal effects in the wake of a single cube with a strongly heated rear face, representing a large building in an urban area, are studied using large-eddy simulation (LES) for various degrees of heating, characterized by the local Richardson number, $Ri$. New wall models are implemented for momentum and temperature, and comparison of the flow and thermal fields with the wind-tunnel data of Richards et al. (J Wind Eng Ind Aerodyn 94:621–636, 2006) shows fair agreement. Buoyancy effects are quite evident at low $Ri$, and a significant increase in the turbulence levels is observed for such flows. Apart from the comparisons with experiments, further analysis included the estimation of the thermal boundary-layer thickness and the heat transfer coefficient for all $Ri$. For sufficiently strong heating, the heat transfer coefficient at the leeward face is found to be higher than that at the roof surface. This suggests that, beyond a certain $Ri$ value, buoyancy forces from the former surface dominate the strong streamwise convection of the latter. Quadrant analysis along the shear layer behind the cube showed that the strength of sweeps that contribute to momentum flux is considerably enhanced by heating. The contribution of different quadrants to the heat flux is found to be very different from that to the momentum flux at lower $Ri$.

20.
The surface air temperature increase in the southwestern United States was much larger during the last few decades than the increase in the global mean. While the global temperature increased by about 0.5 °C from 1975 to 2000, the southwestern US temperature increased by about 2 °C. If such an enhanced warming persisted for the next few decades, the southwestern US would suffer devastating consequences. To identify major drivers of southwestern climate change we perform a multiple-linear regression of the past 100 years of the southwestern US temperature and precipitation. We find that in the early twentieth century the warming was dominated by a positive phase of the Atlantic multi-decadal oscillation (AMO) with minor contributions from increasing solar irradiance and concentration of greenhouse gases. The late twentieth century warming was about equally influenced by increasing concentration of atmospheric greenhouse gases (GHGs) and a positive phase of the AMO. The current southwestern US drought is associated with a near maximum AMO index occurring nearly simultaneously with a minimum in the Pacific decadal oscillation (PDO) index. A similar situation occurred in mid-1950s when precipitation reached its minimum within the instrumental records. If future atmospheric concentrations of GHGs increase according to the IPCC scenarios (Solomon et al. in Climate change 2007: working group I. The Physical Science Basis, Cambridge, 996 pp, 2007), climate models project a fast rate of southwestern warming accompanied by devastating droughts (Seager et al. in Science 316:1181–1184, 2007; Williams et al. in Nat Clim Chang, 2012). However, the current climate models have not been able to predict the behavior of the AMO and PDO indices. 
The regression model supports the climate model (CMIP3 and CMIP5 AOGCM) projections of a much warmer and drier southwestern US only if the AMO abandons its 1,000-year cyclic behavior and instead continues to rise close to its 1975–2000 rate. If the AMO continues its quasi-cyclic behavior, the southwestern US temperature should remain stable and the precipitation should significantly increase during the next few decades.
