Similar Articles
20 similar articles found (search time: 31 ms)
1.
Despite decades of research, large multi-model uncertainty remains about the Earth’s equilibrium climate sensitivity to carbon dioxide forcing as inferred from state-of-the-art Earth system models (ESMs). Statistical treatments of multi-model uncertainties are often limited to simple ESM averaging approaches. Sometimes models are weighted by how well they reproduce historical climate observations. Here, we propose a novel approach to multi-model combination and uncertainty quantification. Rather than averaging a discrete set of models, our approach samples from a continuous distribution over a reduced space of simple model parameters. We fit the free parameters of a reduced-order climate model to the output of each member of the multi-model ensemble. The reduced-order parameter estimates are then combined using a hierarchical Bayesian statistical model. The result is a multi-model distribution of reduced-model parameters, including climate sensitivity. In effect, the multi-model uncertainty problem within an ensemble of ESMs is converted to a parametric uncertainty problem within a reduced model. The multi-model distribution can then be updated with observational data, combining two independent lines of evidence. We apply this approach to 24 model simulations of global surface temperature and net top-of-atmosphere radiation response to abrupt quadrupling of carbon dioxide, and four historical temperature data sets. Our reduced-order model is a 2-layer energy balance model. We present probability distributions of climate sensitivity based on (1) the multi-model ensemble alone and (2) the multi-model ensemble and observations.
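The 2-layer energy balance model referred to above can be sketched in a few lines. The parameter values below (forcing F, feedback strength lam, heat-uptake coefficient gamma, heat capacities C and C_d) are illustrative assumptions, not the paper's fitted estimates:

```python
import numpy as np

def two_layer_ebm(F, lam, gamma, C=8.0, C_d=100.0, dt=0.25, n_years=150):
    """Integrate the 2-layer energy balance model with forward Euler.

    C   dT/dt   = F - lam*T - gamma*(T - T_d)   (surface / mixed layer)
    C_d dT_d/dt = gamma*(T - T_d)               (deep ocean)

    F in W m^-2; heat capacities in W yr m^-2 K^-1 (illustrative values).
    """
    n = int(n_years / dt)
    T = np.zeros(n + 1)      # surface temperature anomaly (K)
    T_d = np.zeros(n + 1)    # deep-ocean temperature anomaly (K)
    for i in range(n):
        T[i + 1] = T[i] + dt / C * (F - lam * T[i] - gamma * (T[i] - T_d[i]))
        T_d[i + 1] = T_d[i] + dt / C_d * gamma * (T[i] - T_d[i])
    return T, T_d

# Abrupt 4xCO2 forcing; equilibrium warming is F/lam, approached slowly
# because the deep ocean keeps taking up heat.
F_4x, lam, gamma = 7.4, 1.2, 0.7
T, T_d = two_layer_ebm(F_4x, lam, gamma)
```

Fitting (F, lam, gamma, C, C_d) to each ESM's abrupt-4xCO2 run is what reduces the multi-model problem to a parametric one.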

2.
We present a method for constraining key properties of the climate system that are important for climate prediction (climate sensitivity and rate of heat penetration into the deep ocean) by comparing a model's response to known forcings over the twentieth century against climate observations for that period. We use the MIT 2D climate model in conjunction with results from the Hadley Centre's coupled atmosphere–ocean general circulation model (AOGCM) to determine these constraints. The MIT 2D model, which is a zonally averaged version of a 3D GCM, can accurately reproduce the global-mean transient response of coupled AOGCMs through appropriate choices of the climate sensitivity and the effective rate of diffusion of heat anomalies into the deep ocean. Vertical patterns of zonal mean temperature change through the troposphere and lower stratosphere also compare favorably with those generated by 3D GCMs. We compare the height–latitude pattern of temperature changes as simulated by the MIT 2D model with observed changes, using optimal fingerprint detection statistics. Using a linear regression model as in Allen and Tett, this approach yields an objective measure of model-observation goodness-of-fit (via the residual sum of squares weighted by differences expected due to internal variability). The MIT model permits one to systematically vary the model's climate sensitivity (by varying the strength of the cloud feedback) and the rate of mixing of heat into the deep ocean, and to determine how the goodness-of-fit with observations depends on these factors. This provides an efficient framework for interpreting detection and attribution results in physical terms. With aerosol forcing set in the middle of the IPCC range, two sets of model parameters are rejected as implausible when the model response is compared with observations. The first set corresponds to high climate sensitivity and slow heat uptake by the deep ocean.
The second set corresponds to low sensitivities for all magnitudes of heat uptake. These results demonstrate that fingerprint patterns must be carefully chosen if their detection is to reduce the uncertainty of physically important model parameters which affect projections of climate change. Received: 19 April 2000 / Accepted: 13 April 2001

3.
We report the results of an uncertainty decomposition analysis of the social cost of carbon as estimated by FUND, a model that has a more detailed representation of the economic impact of climate change than any other model. Some of the parameters particularly influence impacts in the short run whereas other parameters are important in the long run. Some parameters are influential in some regions only. Some parameters are known reasonably well, but others are not. Ethical values, such as the pure rate of time preference and the rate of risk aversion, therefore affect not only the social cost of carbon, but also the importance of the parameters that determine its value. Some parameters, however, are consistently important: cooling energy demand, migration, climate sensitivity, and agriculture. The last two are subject to a large research effort, but the first two are not.

4.
A simplified climate model is presented which includes a fully 3-D, frictional geostrophic (FG) ocean component but retains an integration efficiency considerably greater than extant climate models with 3-D, primitive-equation ocean representations (20 kyears of integration can be completed in about a day on a PC). The model also includes an Energy and Moisture Balance atmosphere and a dynamic and thermodynamic sea-ice model. Using a semi-random ensemble of 1,000 simulations, we address both the inverse problem of parameter estimation, and the direct problem of quantifying the uncertainty due to mixing and transport parameters. Our results represent a first attempt at tuning a 3-D climate model by a strictly defined procedure, which nevertheless considers the whole of the appropriate parameter space. Model estimates of meridional overturning and Atlantic heat transport are well reproduced, while errors are reduced only moderately by a doubling of resolution. Model parameters are only weakly constrained by data, while strong correlations between mean error and parameter values are mostly found to be an artefact of single-parameter studies, not indicative of global model behaviour. Single-parameter sensitivity studies can therefore be misleading. Given a single, illustrative scenario of CO2 increase and fixing the polynomial coefficients governing the extremely simple radiation parameterisation, the spread of model predictions for global mean warming due solely to the transport parameters is around one degree after 100 years forcing, although in a typical 4,000-year ensemble-member simulation, the peak rate of warming in the deep Pacific occurs 400 years after the onset of the forcing. The corresponding uncertainty in Atlantic overturning after 100 years is around 5 Sv, with a small, but non-negligible, probability of a collapse in the long term.

5.
Climate Change Prediction
The concept of climate change prediction in response to anthropogenic forcings at multi-decadal time scales is reviewed. This is identified as a predictability problem with characteristics of both the first and the second kind (due to the slow components of the climate system). It is argued that, because of the non-linear and stochastic aspects of the climate system and of the anthropogenic and natural forcings, climate change contains an intrinsic level of uncertainty. As a result, climate change prediction needs to be approached in a probabilistic way. This requires a characterization and quantification of the uncertainties associated with the sequence of steps involved in a climate change prediction. A review is presented of different approaches recently proposed to produce probabilistic climate change predictions. The additional difficulties found when extending the prediction from the global to the regional scale, and the implications that these have for the choice of prediction strategy, are finally discussed.

6.
The majority of climate change impact assessments account for climate change uncertainty by adopting the scenario-based approach. This typically involves assessing the impacts for a small number of emissions scenarios while neglecting the role of climate model physics uncertainty. Perturbed physics ensemble (PPE) climate simulations offer a unique opportunity to explore this uncertainty. Furthermore, PPEs make risk-based impact estimates possible: they present decision-makers with a range of estimates that spans the climate model physics uncertainty inherent in a given climate model and emissions scenario, arising from incomplete understanding of physical processes in the climate model. This is generally not possible with the scenario-based approach. Here, we present the first application of a PPE to estimate the impact of climate change on heat-related mortality. Using the estimated impacts of climate change on heat-related mortality in six cities, we demonstrate the benefits of quantifying climate model physics uncertainty in climate change impact assessment over the more common scenario-based approach. We also show that the impacts are more sensitive to climate model physics uncertainty than to emissions scenario uncertainty, and least sensitive to whether the climate change projections come from a global climate model or a regional climate model. The results demonstrate the importance of presenting model uncertainties in climate change impact assessments if the impacts are to be placed within a climate risk management framework.

7.
Anthropogenic greenhouse gas emissions may trigger climate threshold responses, such as a collapse of the North Atlantic meridional overturning circulation (MOC). Climate threshold responses have been interpreted as an example of “dangerous anthropogenic interference with the climate system” in the sense of the United Nations Framework Convention on Climate Change (UNFCCC). One UNFCCC objective is to “prevent” such dangerous anthropogenic interference. The current uncertainty about important parameters of the coupled natural-human system implies, however, that this UNFCCC objective can only be achieved in a probabilistic sense. In other words, climate management can only reduce, but not entirely eliminate, the risk of crossing climate thresholds. Here we use an integrated assessment model of climate change to derive economically optimal risk-reduction strategies. We implement a stochastic version of the DICE model and account for uncertainty about four parameters that have been previously identified as dominant drivers of the uncertain system response. The resulting model is, of course, just a crude approximation as it neglects, for example, some structural uncertainty and focuses on a single threshold, out of many potential climate responses. Subject to this caveat, our analysis suggests five main conclusions. First, reducing the numerical artifacts due to sub-sampling the parameter probability density functions to reasonable levels requires sample sizes exceeding 10^3. Conclusions of previous studies that are based on much smaller sample sizes may hence need to be revisited. Second, following a business-as-usual (BAU) scenario results in odds for an MOC collapse in the next 150 years exceeding 1 in 3 in this model. Third, an economically “optimal” strategy (one that maximizes the expected utility of the decision-maker) reduces carbon dioxide (CO2) emissions by approximately 25% at the end of this century, compared with BAU emissions. Perhaps surprisingly, this strategy leaves the odds of an MOC collapse virtually unchanged compared to a BAU strategy. Fourth, reducing the odds of an MOC collapse to 1 in 10 would require an almost complete decarbonization of the economy within a few decades. Finally, further risk reductions (e.g., to 1 in 100) are possible in the framework of the simple model, but would require faster and more expensive reductions in CO2 emissions.
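The first conclusion, that sample sizes must exceed 10^3, follows from the standard error of a Monte Carlo probability estimate, which can be illustrated directly. The 1-in-3 collapse probability below comes from the abstract; the Bernoulli draw is a generic stand-in for full model runs:

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_collapse_probability(n_samples, p_true=1/3):
    """Estimate a threshold-crossing probability from n_samples random
    parameter draws (a Bernoulli stand-in for full model evaluations)."""
    crossings = rng.random(n_samples) < p_true
    return crossings.mean()

# Sampling error of the estimate shrinks like sqrt(p*(1-p)/N), so it
# takes a 100x larger sample to gain one decimal digit of precision.
p = 1/3
se_small = np.sqrt(p * (1 - p) / 100)      # N = 100
se_large = np.sqrt(p * (1 - p) / 10_000)   # N = 10^4
p_hat = estimate_collapse_probability(100_000)
```

With N = 100 the estimate of a 1-in-3 probability is uncertain by roughly +/- 0.05, which is why conclusions drawn from small sub-samples can be numerical artifacts.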

8.
Probabilistic climate change projections using neural networks
Anticipated future warming of the climate system increases the need for accurate climate projections. A central problem is the large uncertainties associated with these model projections, and the fact that uncertainty estimates are often based on expert judgment rather than objective quantitative methods. Further, important climate model parameters are still given as poorly constrained ranges that are partly inconsistent with the observed warming during the industrial period. Here we present a neural network based climate model substitute that increases the efficiency of large climate model ensembles by at least an order of magnitude. Using the observed surface warming over the industrial period and estimates of global ocean heat uptake as constraints for the ensemble, this method estimates ranges for climate sensitivity and radiative forcing that are consistent with observations. In particular, negative values for the uncertain indirect aerosol forcing exceeding -1.2 W m^-2 in magnitude can be excluded with high confidence. A parameterization to account for the uncertainty in the future carbon cycle is introduced, derived separately from a carbon cycle model. This allows us to quantify the effect of the feedback between oceanic and terrestrial carbon uptake and global warming on global temperature projections. Finally, probability density functions for the surface warming until year 2100 for two illustrative emission scenarios are calculated, taking into account uncertainties in the carbon cycle, radiative forcing, climate sensitivity, model parameters and the observed temperature records. We find that warming exceeds the surface warming range projected by IPCC for almost half of the ensemble members. Projection uncertainties are only consistent with IPCC if a model-derived upper limit of about 5 K is assumed for climate sensitivity.
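The idea of a neural-network model substitute can be sketched with a cheap analytic stand-in for the climate model and a small numpy network. Everything here, the stand-in function, the parameter ranges, and the network size, is invented for illustration and is not the paper's emulator:

```python
import numpy as np

rng = np.random.default_rng(1)

def expensive_model(S, F):
    """Stand-in for a costly climate-model run: warming as a smooth
    nonlinear function of climate sensitivity S and forcing F."""
    return S * F / (3.7 + 0.4 * S)

# A few hundred "expensive" runs form the training set.
X = rng.uniform([1.5, 1.0], [6.0, 8.0], size=(400, 2))
y = expensive_model(X[:, 0], X[:, 1])
Xm, Xs, ym, ys = X.mean(0), X.std(0), y.mean(), y.std()
Xn, yn = (X - Xm) / Xs, (y - ym) / ys  # normalise inputs and target

# One hidden layer, full-batch gradient descent on squared error.
W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr, losses = 0.05, []
for _ in range(8000):
    h = np.tanh(Xn @ W1 + b1)
    err = (h @ W2 + b2).ravel() - yn
    losses.append(float((err ** 2).mean()))
    gW2 = h.T @ err[:, None] / len(yn); gb2 = err.mean(keepdims=True)
    gh = err[:, None] @ W2.T * (1 - h ** 2)
    gW1 = Xn.T @ gh / len(yn); gb1 = gh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

def emulator(S, F):
    """Cheap substitute: microseconds per call instead of model-hours."""
    xn = (np.array([S, F]) - Xm) / Xs
    return ((np.tanh(xn @ W1 + b1) @ W2 + b2) * ys + ym).item()
```

Once trained, the emulator can be evaluated over millions of parameter combinations, which is what makes observationally constrained ensembles of this size feasible.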

9.
Uncertainty forms an integral part of climate science, and it is often used to argue against mitigative action. This article presents an analysis of uncertainty in climate sensitivity that is robust to a range of assumptions. We show that increasing uncertainty is necessarily associated with greater expected damages from warming, provided the function relating warming to damages is convex. This constraint is unaffected by subjective or cultural risk-perception factors, it is unlikely to be overcome by the discount rate, and it is independent of the presumed magnitude of climate sensitivity. The analysis also extends to “second-order” uncertainty; that is, situations in which experts disagree. Greater disagreement among experts increases the likelihood that the risk of exceeding a global temperature threshold is greater. Likewise, increasing uncertainty requires increasingly greater protective measures against sea level rise. This constraint derives directly from the statistical properties of extreme values. We conclude that any appeal to uncertainty compels a stronger, rather than weaker, concern about unabated warming than in the absence of uncertainty.
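The convexity argument is Jensen's inequality. A quadratic damage function (a common illustrative choice, not necessarily the article's) makes it concrete: two beliefs with the same mean warming but different spread imply strictly different expected damages.

```python
import numpy as np

rng = np.random.default_rng(2)

def damages(T):
    """A convex (here quadratic) damage function of warming T."""
    return 0.3 * T ** 2

# Same expected warming, different uncertainty about climate sensitivity.
T_mean = 3.0
narrow = rng.normal(T_mean, 0.5, 100_000)
wide = rng.normal(T_mean, 1.5, 100_000)

d_certain = damages(T_mean)        # damages at the mean warming
d_narrow = damages(narrow).mean()  # expected damages, low uncertainty
d_wide = damages(wide).mean()      # expected damages, high uncertainty
```

For the quadratic, E[0.3 T^2] = 0.3 (mu^2 + sigma^2), so expected damages grow with the spread sigma regardless of discounting or risk perception, which is the article's central point.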

10.
Most dynamical models of the natural system contain a number of empirical parameters which reflect our limited understanding of the simulated system or describe unresolved subgrid-scale processes. While the parameterizations basically introduce some uncertainty to the model results, they also hold the prospect of tuning the model. In general, a deterministic tuning is related to an inversion of the model, which is often impossible or requires considerable computing effort for most climate models. Another way to adjust the model parameters to a specific observed process is stochastic fitting, where a set of parameters and model output are taken as random variables. Here, we present a dynamical-statistical approach with a simplified model of the El Niño-Southern Oscillation (ENSO) cycle whose parameters are adjusted to simulated and observed data by means of Bayesian statistics. As the ENSO model, we employ the Schopf-Suarez delayed oscillator model. Monte Carlo experiments highlight the large sensitivity of the model results to varied model parameters and initial values. The statistical adjustment is done by Bayesian model averaging of the Monte Carlo experiments. Applying the method to simulated data, the posterior ensemble mean is much closer to the reference data than the prior ensemble mean. The learning effect of the model is evident in the leading empirical orthogonal functions and statistically significant in the mean state. When the method is applied to the observed ENSO time series, the ENSO model in its classical setup is not able to account for the temporally varying periodicity of the observed ENSO phenomenon. An improved setup with continuous adjustment periods and an extended parameter range is developed in order to allow the model to learn from the data gradually. The improved setup leads to promising results during the twentieth century and even a weak forecast skill over 6 months. Thus, the described method offers a promising tool for data assimilation in dynamical weather and climate models. However, the simplified ENSO model is barely appropriate for operational ENSO forecasts owing to its limited physical complexity.
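The delayed oscillator at the core of this setup is a one-line delay differential equation. A minimal Euler integration, in nondimensional units and with illustrative (not the study's fitted) parameter values, looks like:

```python
import numpy as np

def delayed_oscillator(alpha=0.75, delta=6.0, dt=0.05, n_steps=8000):
    """Euler integration of the delayed-oscillator ENSO equation

        dT/dt = T - T^3 - alpha * T(t - delta)

    T is a nondimensional east-Pacific SST anomaly, delta the delay from
    oceanic wave propagation, alpha the strength of the delayed damping.
    """
    lag = int(delta / dt)
    T = np.empty(n_steps + lag)
    T[:lag] = 0.2  # constant history before t = 0
    for i in range(lag, n_steps + lag - 1):
        T[i + 1] = T[i] + dt * (T[i] - T[i] ** 3 - alpha * T[i - lag])
    return T[lag:]

T = delayed_oscillator()
```

For sufficiently large delay the fixed points are unstable and the solution settles into a bounded, self-sustained oscillation, which is what the Monte Carlo experiments perturb via alpha, delta, and the initial history.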

11.
This paper examines the uncertainty in the change in the heat content in the ocean component of a general circulation model. We describe the design and implementation of our statistical methodology. Using an ensemble of model runs and an emulator, we produce an estimate of the full probability distribution function (PDF) for the change in upper ocean heat in an Atmosphere/Ocean General Circulation Model, the Community Climate System Model v. 3, across a multi-dimensional input space. We show how the emulator of the GCM’s heat content change and hence, the PDF, can be validated and how implausible outcomes from the emulator can be identified when compared to observational estimates of the metric. In addition, the paper describes how the emulator outcomes and related uncertainty information might inform estimates of the same metric from a multi-model Coupled Model Intercomparison Project phase 3 ensemble. We illustrate how to (1) construct an ensemble based on experiment design methods, (2) construct and evaluate an emulator for a particular metric of a complex model, (3) validate the emulator using observational estimates and explore the input space with respect to implausible outcomes and (4) contribute to the understanding of uncertainties within a multi-model ensemble. Finally, we estimate the most likely value for heat content change and its uncertainty for the model, with respect to both observations and the uncertainty in the value for the input parameters.

12.
There is increasingly clear evidence that human influence has contributed substantially to the large-scale climatic changes that have occurred over the past few decades. Attention is now turning to the physical implications of the emerging anthropogenic signal. Of particular interest is the question of whether current climate models may be over- or under-estimating the amplitude of the climate system's response to external forcing, including anthropogenic. Evidence of a significant error in a model-simulated response amplitude would indicate the existence of amplifying or damping mechanisms that are inadequately represented in the model. The range of uncertainty in the factor by which we can scale model-simulated changes while remaining consistent with observed change provides an estimate of uncertainty in model-based predictions. With any model that displays a realistic level of internal variability, the problem of estimating this factor is complicated by the fact that it represents a ratio between two incompletely known quantities: both observed and simulated responses are subject to sampling uncertainty, primarily due to internal chaotic variability. Sampling uncertainty in the simulated response can be reduced, but not eliminated, through ensemble simulations. Accurate estimation of these scaling factors requires a modification of the standard "optimal fingerprinting" algorithm for climate change detection, drawing on the conventional "total least squares" approach discussed in the statistical literature. Code for both variants of optimal fingerprinting can be found on .
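The total least squares variant can be sketched with synthetic data: when both the observed and simulated responses carry noise from internal variability, the scaling factor from ordinary regression is biased low, whereas the TLS estimate, obtained from the smallest right singular vector of the combined data matrix, is not. All numbers below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic setup: observations y are a scaled model response x, with
# internal-variability noise contaminating both quantities.
n = 200
signal = np.sin(np.linspace(0, 4, n))           # true forced response pattern
beta_true = 1.3                                  # true scaling factor
x = signal + rng.normal(0, 0.1, n)               # noisy simulated response
y = beta_true * signal + rng.normal(0, 0.1, n)   # noisy observations

# Ordinary least squares ignores the noise in x: attenuated (biased low).
beta_ols = (x @ y) / (x @ x)

# Total least squares: direction of least variance of the stacked data.
_, _, Vt = np.linalg.svd(np.column_stack([x, y]))
v = Vt[-1]                    # smallest right singular vector (a, b)
beta_tls = -v[0] / v[1]       # line a*x + b*y = 0  =>  y = -(a/b) x
```

For a single regressor the TLS slope always lies above the ordinary slope (and below the inverse-regression slope), which is exactly the correction the modified fingerprinting algorithm supplies.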

13.
Towards quantifying uncertainty in transient climate change
Ensembles of coupled atmosphere–ocean global circulation model simulations are required to make probabilistic predictions of future climate change. “Perturbed physics” ensembles provide a new approach in which modelling uncertainties are sampled systematically by perturbing uncertain parameters. The aim is to provide a basis for probabilistic predictions in which the impact of prior assumptions and observational constraints can be clearly distinguished. Here we report on the first perturbed physics coupled atmosphere–ocean model ensemble in which poorly constrained atmosphere, land and sea-ice component parameters are varied in the third version of the Hadley Centre model (the variation of ocean parameters will be the subject of future study). Flux adjustments are employed, both to reduce regional sea surface temperature (SST) and salinity biases and also to admit the use of combinations of model parameter values which give non-zero values for the global radiation balance. This improves the extent to which the ensemble provides a credible basis for the quantification of uncertainties in climate change, especially at a regional level. However, this particular implementation of flux-adjustments leads to a weakening of the Atlantic overturning circulation, resulting in the development of biases in SST and sea ice in the North Atlantic and Arctic Oceans. Nevertheless, model versions are produced which are of similar quality to the unperturbed and un-flux-adjusted version. The ensemble is used to simulate pre-industrial conditions and a simple scenario of a 1% per year compounded increase in CO2. The range of transient climate response (the 20 year averaged global warming at the time of CO2 doubling) is 1.5–2.6°C, similar to that found in multi-model studies. 
Measures of global and large scale climate change from the coupled models show simple relationships with associated measures computed from atmosphere-mixed-layer-ocean climate change experiments, suggesting that recent advances in computing the probability density function of climate change under equilibrium conditions using the perturbed physics approach may be extended to the transient case.

14.
Most previous land-surface model calibration studies have defined global ranges for their parameters to search for optimal parameter sets. Little work has been conducted to study the impacts of realistic versus global ranges, as well as model complexities, on the calibration and uncertainty estimates. The primary purpose of this paper is to investigate these impacts by employing Bayesian Stochastic Inversion (BSI) with the Chameleon Surface Model (CHASM). CHASM was designed to explore the general aspects of land-surface energy balance representation within a common modeling framework that can be run from a simple energy balance formulation to a complex mosaic type structure. BSI is an uncertainty estimation technique based on Bayes theorem, importance sampling, and very fast simulated annealing. The model forcing data and surface flux data were collected at seven sites representing a wide range of climate and vegetation conditions. For each site, four experiments were performed with simple and complex CHASM formulations as well as realistic and global parameter ranges. Twenty-eight experiments were conducted, with 50,000 parameter sets used for each run. The results show that the use of global and realistic ranges gives similar simulations for both modes at most sites, but the global ranges tend to produce some unreasonable optimal parameter values. Comparison of simple and complex modes shows that the simple mode has more parameters with unreasonable optimal values. The choice of parameter ranges and model complexity has significant impacts on the frequency distributions of parameters, the marginal posterior probability density functions, and the uncertainty estimates of simulated sensible and latent heat fluxes. Comparison between model complexity and parameter ranges shows that the former has the more significant impact on parameter and uncertainty estimation.
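The Bayesian machinery behind such a calibration can be caricatured with plain importance sampling: draw parameter sets from a prior range, weight each by a Gaussian likelihood of the observed flux, and compare posteriors under "global" versus "realistic" ranges. The flux model and every number below are hypothetical stand-ins, not CHASM:

```python
import numpy as np

rng = np.random.default_rng(4)

def surface_flux_model(albedo, roughness):
    """Toy stand-in for a land-surface model: a sensible heat flux
    (W m^-2) as a function of two parameters (purely illustrative)."""
    return 400 * (1 - albedo) * roughness

obs_flux, obs_sigma = 180.0, 15.0  # hypothetical site observation

# "Global" versus "realistic" prior ranges for the albedo parameter.
n = 50_000
albedo_global = rng.uniform(0.0, 1.0, n)
albedo_realistic = rng.uniform(0.1, 0.35, n)
roughness = rng.uniform(0.5, 1.5, n)

def posterior_mean(albedo):
    """Importance-sampling posterior mean under a Gaussian likelihood."""
    sim = surface_flux_model(albedo, roughness)
    w = np.exp(-0.5 * ((sim - obs_flux) / obs_sigma) ** 2)
    return (w * albedo).sum() / w.sum()

mean_global = posterior_mean(albedo_global)
mean_realistic = posterior_mean(albedo_realistic)
```

A realistic prior confines the posterior to physically plausible values by construction, whereas a global range lets compensating parameter combinations fit the observation at unreasonable parameter values, which is the effect the study reports.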

15.
Wide-ranging climate changes are expected in the Arctic by the end of the 21st century, but projections of the size of these changes vary widely across current global climate models. This variation represents a large source of uncertainty in our understanding of the evolution of Arctic climate. Here we systematically quantify and assess the model uncertainty in Arctic climate changes in two CO2 doubling experiments: a multimodel ensemble (CMIP3) and an ensemble constructed using a single model (HadCM3) with multiple parameter perturbations (THC-QUMP). These two ensembles allow us to assess the contribution that both structural and parameter variations across models make to the total uncertainty and to begin to attribute sources of uncertainty in projected changes. We find that parameter uncertainty is a major source of uncertainty in certain aspects of Arctic climate, but also that uncertainties in the mean climate state in the 20th century, most notably in the northward Atlantic ocean heat transport and Arctic sea ice volume, are a significant source of uncertainty for projections of future Arctic change. We suggest that better observational constraints on these quantities will lead to significant improvements in the precision of projections of future Arctic climate change.

16.
We have characterized the relative contributions to uncertainty in predictions of global warming amount by year 2100 in the C4MIP model ensemble (Friedlingstein et al., 2006) due to both carbon cycle process uncertainty and uncertainty in the physical climate properties of the Earth system. We find carbon cycle uncertainty to be important: on average, the spread in transient climate response is around 40% of that due to the more frequently debated uncertainties in equilibrium climate sensitivity and global heat capacity. This result is derived by characterizing the influence of different parameters in a global climate-carbon cycle 'box' model that has been calibrated against the 11 general circulation models (GCMs) and Earth system models of intermediate complexity (EMICs) in the C4MIP ensemble, a collection of current state-of-the-art climate models that include an explicit representation of the global carbon cycle.
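This kind of attribution of ensemble spread to parameter groups can be mimicked with a toy box model: hold one group fixed at its central value and compare variances. The functional form and all numbers below are invented; only the method (fix-one-group variance comparison) mirrors the study.

```python
import numpy as np

rng = np.random.default_rng(6)

def box_model_warming(ecs, airborne_frac):
    """Toy climate-carbon 'box' model: 2100 warming from equilibrium
    climate sensitivity and the airborne fraction of emitted CO2."""
    conc_ratio = 1 + 1.5 * airborne_frac       # CO2 rise scales with airborne fraction
    return ecs * np.log(conc_ratio) / np.log(2) * 0.6  # 0.6 = transient fraction

# Independent uncertainty in physical climate (ecs) and carbon cycle (af).
n = 100_000
ecs = rng.normal(3.0, 0.8, n).clip(1.0)
af = rng.normal(0.45, 0.1, n).clip(0.1, 0.9)

total_var = box_model_warming(ecs, af).var()
var_ecs_only = box_model_warming(ecs, 0.45).var()  # carbon cycle fixed
var_af_only = box_model_warming(3.0, af).var()     # physical climate fixed
frac_carbon = var_af_only / total_var              # carbon-cycle share of variance
```

Comparing `var_af_only` against `var_ecs_only` is the toy analogue of the paper's finding that the carbon-cycle contribution to the spread is a substantial fraction of the physical-climate contribution.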

17.
We investigate an important scientific uncertainty facing climate-change policymakers, namely, the impact of potential abrupt climatic change. We examine sequential decision strategies for abating climate change where near-term policies are viewed as the first of a series of decisions which adapt over the years to improving scientific information. We compare two illustrative near-term (1992–2002) policies (moderate and aggressive emission reductions) followed by a subsequent long-term policy chosen to limit global-mean temperature change to a specified ‘climate target’. We calculate the global-mean surface temperature change using a simple climate/ocean model and simple models of greenhouse-gas concentrations. We alter model parameters to examine the impact of abrupt changes in the sinks of carbon dioxide, the sources of methane, the circulation of the oceans, and the climate sensitivity, ΔT2x. Although the abrupt changes increase the long-term costs of responding to climate change, they do not significantly affect the comparatively small cost difference between near-term strategies. Except for an abrupt increase in ΔT2x, the investigated abrupt climate changes do not significantly alter the values of the climate target for which each near-term strategy is preferred. In contrast, innovations that reduce the cost of limiting greenhouse-gas emissions offer the potential for substantial abatement cost savings, regardless of which level of near-term abatement is selected.

18.
This article traces the development of uncertainty analysis through three generations punctuated by large methodology investments in the nuclear sector. Driven by a very high perceived legitimation burden, these investments aimed at strengthening the scientific basis of uncertainty quantification. The first generation, building on the Reactor Safety Study, introduced structured expert judgment in uncertainty propagation and distinguished variability from uncertainty. The second generation emerged in modeling the physical processes inside the reactor containment building after breach of the reactor vessel; operational definitions and expert judgment for uncertainty quantification were elaborated. The third generation developed in modeling the consequences of release of radioactivity and transport through the biosphere; expert performance assessment, dependence elicitation and probabilistic inversion are among its hallmarks. Third generation methods may be profitably employed in current Integrated Assessment Models (IAMs) of climate change. Possible applications of dependence modeling and probabilistic inversion are sketched. It is unlikely that these methods will be fully adequate for quantitative uncertainty analyses of the impacts of climate change, and a penultimate section looks ahead to fourth generation methods.

19.
Changes in climate impacts on construction and infrastructure across the territory of Russia are considered. The focus is on the changes in the characteristics of daily air temperature and precipitation expected by the middle of the 21st century, which are of high importance for building design. The assessment of expected changes is based on the results of ensemble computations using the MGO global climate model and the embedded regional model (MGO RCM) with a horizontal resolution of 25 km. Along with the ensemble-averaged estimates of changes in applied climate parameters, the uncertainty of estimates associated with natural climate variability is analyzed using the data of numerical experiments. Attention is drawn to the most significant effects of climate change which should be taken into account when developing measures for adapting the construction sector in Russia.

20.
Impact of climate change on Pacific Northwest hydropower
The Pacific Northwest (PNW) hydropower resource, central to the region’s electricity supply, is vulnerable to the impacts of climate change. The Northwest Power and Conservation Council (NWPCC), an interstate compact agency, has conducted long term planning for the PNW electricity supply for its 2005 Power Plan. In formulating its power portfolio recommendation, the NWPCC explored uncertainty in variables that affect the availability and cost of electricity over the next 20 years. The NWPCC conducted an initial assessment of potential impacts of climate change on the hydropower system, but these results are not incorporated in the risk model upon which the 2005 Plan recommendations are based. To assist in bringing climate information into the planning process, we present an assessment of uncertainty in future PNW hydropower generation potential based on a comprehensive set of climate models and greenhouse gas emissions pathways. We find that the prognosis for PNW hydropower supply under climate change is worse than anticipated by the NWPCC’s assessment. Differences between the predictions of individual climate models are found to contribute more to overall uncertainty than do divergent emissions pathways. Uncertainty in predictions of precipitation change appears to be more important with respect to impact on PNW hydropower than uncertainty in predictions of temperature change. We also find that a simple regression model captures nearly all of the response of a sequence of complex numerical models to large scale changes in climate. This result offers the possibility of streamlining both top-down impact assessment and bottom-up adaptation planning for PNW water and energy resources.
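The closing claim, that a simple regression model captures the response of the complex model chain, can be sketched as an ordinary least-squares fit of generation change on large-scale temperature and precipitation changes. The training data below are synthetic, standing in for per-scenario outputs of the complex models:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical per-scenario changes in annual temperature (K) and
# precipitation (%), and the resulting change in hydropower generation
# (%) from a chain of complex models (synthetic stand-in values).
dT = rng.uniform(0.5, 4.0, 30)
dP = rng.uniform(-15, 10, 30)
gen = -2.0 * dT + 1.1 * dP + rng.normal(0, 1.0, 30)

# Simple surrogate: gen ~ a*dT + b*dP + c, fit by least squares.
A = np.column_stack([dT, dP, np.ones_like(dT)])
coef, *_ = np.linalg.lstsq(A, gen, rcond=None)
a, b, c = coef
pred = A @ coef
r2 = 1 - np.sum((gen - pred) ** 2) / np.sum((gen - gen.mean()) ** 2)
```

A high r2 for such a two-predictor fit is what makes the surrogate useful: new emissions pathways can be screened through the regression without rerunning the full hydrologic model chain.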
