Narrowing uncertainty in global climate change
Unknowns hamper initiating climate-mitigation policies
by Chris Forest, Mort Webster, and John Reilly

Experts disagree about how climate might change
in the future, but they generally agree that great uncertainty
exists in any projection of future climate. The Joint
Program on the Science and Policy of Global Change, an
interdisciplinary program at the Massachusetts Institute
of Technology (MIT), has sought since its start in 1991 to
better quantify that uncertainty. The program’s multiyear
effort to develop the capability to make probabilistic projections
of future climate change is now bearing fruit.
Figure 1. The likelihood of a change in equilibrium global mean surface temperature caused by a doubling of atmospheric carbon dioxide shows a mean of 3.5 °C and a median of 2.9 °C.
Why focus on simply describing uncertainty, when the
scientific objective usually centers on how to reduce it?
Our motivation is that almost all the decisions we make
as individuals and as a society are made under uncertainty.
Decisions we make with regard to climate—whether
to reduce emissions or not, and by how much—will also
involve uncertainty. However, those decisions can be
improved with an accurate description of the uncertainty
inherent in the climate system. A vigorous scientific
research program will undoubtedly improve our understanding
of the climate system, and may reduce uncertainty
after years or decades of observation and measurement.
But some, and maybe much, of the uncertainty we
face in projecting decades into the future is irreducible.
Even as the necessary research and measurement proceed,
we need to decide how much we should begin
deflecting the world’s economy from a path that would
likely more than double the concentration of greenhouse
gases (GHGs) by the end of this century. As new data
become available, we will need to revise our estimate of
uncertainty, and that, in turn, may involve a midcourse
correction of policies designed to address climate change.
The modeling system developed in the Joint Program,
the MIT Integrated Global Systems Model (IGSM), calculates
how much a given emissions-reduction policy will
likely reduce the odds of serious impacts from global climate
change. The analytical model takes uncertain
inputs that affect economic activities, GHG emissions,
and the climate system’s response and calculates the
probability of specific outcomes. Thus, it forecasts possible
temperature increases for the next 100 years and the
probability that each will occur. Subsequent runs using
lower emissions show how emissions-control policies
could change the probabilities.
Figure 2. Projected temperature increases between 1990 and 2100 could exceed 8 °C at the South Pole and 12 °C at the North Pole with no emissions policy, and roughly half those values under a policy of significantly lower emissions.
For decades, pervasive uncertainty has stymied the climate-
change policy-making process. How much will
temperature change, and how soon? How sensitive is the
climate to GHGs in the atmosphere? And what impact
will policies to limit emissions actually have on future
temperatures? It may not be possible to decide who is right in these debates, but describing likelihoods more precisely can help decision-makers focus on avoiding the serious adverse effects that have a substantial chance of occurring.
The MIT IGSM is a set of linked computer models that
simulate economic growth and its associated emissions,
the flows of GHGs into and out of the land masses and
oceans, chemical reactions in the atmosphere, climate
dynamics, and changes in natural terrestrial ecosystems.
The models and the processes they simulate—each with
its own uncertainties—interact, with outputs from one
serving as inputs for another (Figure 3).
Figure 3. In the integrated global system model, the outputs of the human and natural emissions models are driving forces for the atmospheric chemistry and climate model, whose outputs drive a terrestrial ecosystems model.
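To make that dataflow concrete, here is a skeletal sketch in Python of how such linked components can be composed. The function names mirror Figure 3; their bodies are placeholders, not the actual IGSM equations.

```python
# Skeleton of the linked structure in Figure 3. Names follow the
# figure; the bodies are placeholders, not the real IGSM equations.
def human_emissions_model(params):
    """Economic activity -> anthropogenic GHG and pollutant emissions."""
    return {"co2": ..., "ch4": ..., "so2": ...}

def natural_emissions_model(params):
    """Land and ocean processes -> natural GHG fluxes."""
    return {"co2": ..., "ch4": ...}

def chemistry_climate_model(human, natural, params):
    """Emissions -> atmospheric chemistry and climate dynamics."""
    return {"temperature": ..., "precipitation": ...}

def ecosystems_model(climate, params):
    """Climate -> changes in natural terrestrial ecosystems."""
    return {"carbon_uptake": ...}

def run_igsm(params):
    # Outputs of one component serve as inputs to the next.
    human = human_emissions_model(params)
    natural = natural_emissions_model(params)
    climate = chemistry_climate_model(human, natural, params)
    ecosystems = ecosystems_model(climate, params)
    return climate, ecosystems
```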
Uncertainty calculations involve quantifying the likely values of each of the many parameters that drive the model, from economic growth and energy-efficiency improvement to emissions of methane from agriculture and of nitrous oxide from industry. Defining the uncertainties in a critical process involves estimating a probability density function (pdf), which describes the relative likelihood of each possible value, for the parameter or parameters of that process. For the climate system, this mathematical modeling involves processes that affect climate sensitivity, the uptake of heat and carbon dioxide by the oceans, and the role of aerosols and other pollutants in climate. Some of these factors (reflective sulfate aerosols, for example) have likely contributed a cooling effect that has at least partially offset warming caused by GHGs.
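As a concrete illustration, such parameter pdfs can be encoded with standard statistical distributions. The families and numbers below are invented placeholders, not the Joint Program's fitted values.

```python
from scipy import stats

# Invented placeholder pdfs, not the Joint Program's fitted values.
parameter_pdfs = {
    # Climate sensitivity (deg C per CO2 doubling): right-skewed, so a
    # lognormal shape is a common choice; median set to 2.9 deg C.
    "climate_sensitivity": stats.lognorm(s=0.4, scale=2.9),
    # Rate of heat uptake by the deep ocean: positive and uncertain.
    "ocean_heat_uptake": stats.lognorm(s=0.5, scale=2.0),
    # Net aerosol forcing (W/m^2): a cooling effect, centered below zero.
    "aerosol_forcing": stats.norm(loc=-0.75, scale=0.4),
}

dist = parameter_pdfs["climate_sensitivity"]
print(dist.pdf(3.0))   # relative likelihood of a 3.0 deg C sensitivity
print(dist.ppf(0.5))   # median: 2.9 deg C by construction
```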
Figure 1 shows a sample pdf for climate sensitivity. Climate sensitivity is the change in global mean surface temperature that a doubling of atmospheric carbon dioxide would cause once the Earth system had fully adjusted to the higher concentration of CO2. It is widely
used to summarize how different climate models
respond to external forcings. Estimating this pdf involves a
statistical exercise that finds the parameter combinations
that, when used in the MIT IGSM, best fit the most recent
50 years of atmospheric and oceanic temperature measurements,
as well as the calculated interrelationships
among climate sensitivity, the effects of aerosols, and the
rate at which the deep oceans take up heat.
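The sketch below caricatures that fitting step with a single parameter: each candidate climate sensitivity is scored by how well a toy warming curve matches synthetic "observations", and the scores are normalized into a pdf. The toy response model, the noise level, and all numbers are assumptions for illustration, not the actual multi-parameter estimation.

```python
import numpy as np

# Synthetic "observations": 50 years of warming plus noise.
years = np.arange(50)
obs = 0.012 * years + np.random.default_rng(0).normal(0.0, 0.1, 50)

def modeled_warming(sensitivity):
    # Toy transient response: warming over the record scales with
    # sensitivity, crudely damped by ocean heat uptake (the 0.2 factor).
    return 0.2 * sensitivity * (years / years[-1])

candidates = np.linspace(0.5, 10.0, 200)   # deg C per CO2 doubling
log_like = np.array([-0.5 * np.sum((obs - modeled_warming(s)) ** 2) / 0.1 ** 2
                     for s in candidates])
pdf = np.exp(log_like - log_like.max())
pdf /= np.trapz(pdf, candidates)           # normalize to a pdf

print(f"posterior mean: {np.trapz(candidates * pdf, candidates):.1f} deg C")
```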
Uncertainties represented with similar pdfs for other
uncertain parameters in the Earth system—
the ocean, atmosphere, and biosphere, which
all contribute to climate change—and the economic
model are propagated through the
IGSM by running the model many times with
varying values for each parameter. In every
run, the model uses a numerical sampling
method to choose a value for each parameter.
The sampling method is constructed so that
each discrete value for a variable represents an
equal likelihood drawn from the input pdf.
Thus, each parameter set has an equal probability
of occurring.
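A minimal sketch of such an equal-probability sampling scheme (a Latin hypercube design) follows; the two input pdfs are the invented examples from the earlier sketch.

```python
import numpy as np
from scipy import stats

def latin_hypercube(pdfs, n_runs, seed=0):
    """Draw n_runs equally probable parameter sets from the given pdfs."""
    rng = np.random.default_rng(seed)
    samples = {}
    for name, dist in pdfs.items():
        # One uniform draw inside each of n_runs equal-probability strata.
        u = (np.arange(n_runs) + rng.uniform(size=n_runs)) / n_runs
        values = dist.ppf(u)      # inverse cdf: probability -> value
        rng.shuffle(values)       # randomize pairing across parameters
        samples[name] = values
    return samples

# Invented example pdfs (see the earlier sketch).
pdfs = {"climate_sensitivity": stats.lognorm(s=0.4, scale=2.9),
        "aerosol_forcing": stats.norm(loc=-0.75, scale=0.4)}
parameter_sets = latin_hypercube(pdfs, n_runs=250)
```

Because each stratum covers the same slice of probability, every one of the 250 parameter sets is equally likely, which is what lets the outcomes later be treated as equally weighted.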
For each set of input parameter values, the
IGSM run produces a different result for outcomes
that include predicted temperature
change and sea-level rise. It takes a cluster of
16 computer processors operating nonstop
for about a month to produce 250 runs—
enough to get a good approximation of the
distribution of outcomes that would be
obtained if the model were run thousands of times. Taking
the results from all the runs yields a range of values
for a given outcome, with equal probability of occurrence
for each value. The uncertainties of the input parameters
are, thus, reflected in the uncertainty of the results.
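In outline, the ensemble step looks like the following sketch, with a microsecond-scale toy model standing in for the month-long IGSM runs; the parameter draws and coefficients are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# 250 equally likely parameter sets (in practice, from the Latin
# hypercube sketch above; simple random draws suffice here).
sens = stats.lognorm(s=0.4, scale=2.9).rvs(250, random_state=rng)
aero = stats.norm(loc=-0.75, scale=0.4).rvs(250, random_state=rng)

def run_toy_model(sensitivity, aerosol_forcing):
    # Stand-in for one IGSM run: warming by 2100 grows with climate
    # sensitivity and is partly offset by aerosol cooling. Invented.
    return 0.8 * sensitivity + 0.5 * aerosol_forcing

outcomes = run_toy_model(sens, aero)   # one outcome per run
# Each outcome carries probability 1/250, so the 250 values together
# approximate the distribution of projected temperature change.
```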
Figure 2 shows sample results for the calculated temperature
change between 1990 and 2100, presented by
latitude. The solid lines show results from “business as
usual” assumptions. Ninety-five percent of the calculated
values fall between the lines marked “upper 95%
bound” and “lower 95% bound.” Only 5% of the
values
are outside those bounds, with 2.5% falling above and
2.5% falling below. At the median, half of the temperature
values are above and half are below.
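Because every run is equally likely, those curves are simple percentiles of the ensemble, as this short sketch shows with a stand-in set of outcomes.

```python
import numpy as np

# Stand-in ensemble of 250 equally likely warming outcomes (deg C);
# in practice these come from the IGSM runs described above.
outcomes = np.random.default_rng(1).normal(2.5, 1.2, 250)

median = np.percentile(outcomes, 50)
lo, hi = np.percentile(outcomes, [2.5, 97.5])
print(f"median: {median:.2f} deg C; 95% bounds: {lo:.2f} to {hi:.2f} deg C")
```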
As expected, predicted temperature change varies with
latitude. Estimated warming—as well as the associated
uncertainty—is significantly greater near the poles than
in the tropics. The upper bound is especially worrisome
at the poles. Because 2.5% of the model results were
above that bound, the data suggest that a 1 in 40 chance
exists that warming will exceed 8 °C at the South Pole
and 12 °C at the North Pole. Warming is greater at the
North Pole because the smaller area of Arctic sea ice means
less reflected sunlight.
How might emission-control policies change those outcomes?
The dashed lines in the figure show results assuming
an emissions-control policy that, by our best estimate
of conditions, would stabilize atmospheric concentrations
of carbon dioxide at 550 ppm. Such a policy would
require incredibly efficient vehicles and homes, and the
use of noncarbon fuels or carbon sequestration. But the
reduction in emissions significantly reduces the temperature
increase. Most important, it cuts the unlikely-but-possible
high-end temperature results almost in half.
The larger reduction of the worst-case outcomes
reflects the nature of the stringent policy. It is a cap on
emissions over the period. Emissions uncertainty is an
important contributor to overall uncertainty. An emissions
cap, if effectively implemented, would obviously
eliminate the possibility of high emissions. However, it
would not have any effect if little growth in emissions
had occurred, whether by good fortune (technology) or
bad (poor economic growth). A different form of policy—
a pollution tax, for example—would not cap emissions
in an absolute sense and would more likely shift
the entire temperature distribution lower.
The implications of changes in pdfs and abstract concepts
such as global average surface temperature are
hard for even experts to grasp. Ideally, we could further
describe these climate-system responses in terms of their
impacts on agriculture, human health, and fragile
ecosystems. However, the ability to represent such
effects probabilistically remains a task for the future. In the
meantime, we have attempted to identify some critical
values and the likelihood of exceeding them, as portrayed
in the table. The left column identifies some specific,
serious changes that could occur. The next three
columns present the odds that those changes will occur
assuming no emissions policy, a relatively lenient emissions-
control policy, and the stringent policy. Although
the lenient policy helps, the more-stringent policy dramatically
reduces the probability of the selected outcomes.
One lesson of this work is the impossibility of
completely eliminating a risk. We can only reduce the
chances of it occurring.
Table. The probability that the serious changes listed in column 1 could occur over a 100-year period with no policy to stabilize atmospheric concentrations of carbon dioxide, with a relatively lenient 750-ppm policy, and with a stringent 550-ppm policy.
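The table's calculation reduces to counting runs: with equally likely ensemble members, the probability of a serious change is the fraction of runs that exceed the corresponding threshold. The sketch below illustrates this with invented stand-in ensembles for the three policy cases.

```python
import numpy as np

rng = np.random.default_rng(2)
# Invented stand-in ensembles of warming by 2100 (deg C) under each
# policy case; the shifts and spreads are for illustration only.
ensembles = {
    "no policy":           rng.normal(3.5, 1.5, 250),
    "lenient (750 ppm)":   rng.normal(2.8, 1.1, 250),
    "stringent (550 ppm)": rng.normal(2.0, 0.8, 250),
}

threshold = 4.0   # a hypothetical "serious change" criterion, deg C
for policy, temps in ensembles.items():
    # With equally likely runs, the exceedance probability is simply
    # the fraction of runs above the threshold.
    p = np.mean(temps > threshold)
    print(f"{policy:>20}: P(warming > {threshold} deg C) = {p:.2f}")
```

Note that even the stringent case pushes the exceedance probability toward zero without reaching it, matching the lesson that risks can be reduced but not eliminated.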
This effort is an early attempt to assimilate the
available information on the Earth system and the economic
forces that relate to future climate change, and to
describe quantitatively the uncertainty in future projections.
Although such exercises guide our understanding
of the likely conditions of future climate, much work
remains to be done, and there will always be unknowables
that defy quantification. Natural processes are governed by unchanging physical laws; the challenge is to use available data to constrain the parameters of the Earth system when we have an incomplete understanding of all the processes that create variability in the climate system.
A debate continues about how to quantify uncertainty
in human systems—for example, economic growth and
emissions projections. Under the best of circumstances,
such efforts require the judgment of experts on future
growth and technology possibilities. Our observations are the economy's past responses and behavior, but its future response is not constrained in the way that physical properties constrain the response of natural systems.
At the same time, the Earth system is so complex—and
our time series of good observations so short relative to
the time scale on which the system operates—that our
ability to know the behavior of this system may be quite
limited. This means that some possible responses may not be captured in a distribution created with a model that includes only what we now know about the Earth system's behavior.
One of the pressing challenges is to understand the processes that have led to the relatively abrupt changes of several degrees Celsius within a decade or so that are found in Earth's paleorecord. Current models do not
provide an explanation for such changes.
Further Reading
- Forest, C.; Allen, M.; et al. Constraining uncertainties in climate models using climate change detection techniques. Geophys. Res. Lett. 2000, 27 (4), 569–572. Joint Program reprint no. 2000-10.
- Forest, C.; Stone, P.; et al. Quantifying uncertainties in climate system properties with the use of recent climate observations. Science 2002, 295, 113–117. Joint Program reprint no. 2002-1.
- Prinn, R.; Jacoby, H. A.; et al. Integrated global system model for climate policy assessment: Feedbacks and sensitivity studies. Climatic Change 1999, 41, 469–546. Joint Program reprint no. 1999-4.
- Reilly, J.; Stone, P. H.; et al. Uncertainty and climate change assessments. Science 2001, 293, 430–433.
- Webster, M.; Babiker, M.; et al. Uncertainty in emissions projections for climate models. Atmospheric Environment 2002, 36 (22), 3659–3670. Joint Program reprint no. 2002-3.
- Webster, M.; Forest, C.; et al. Uncertainty analysis of climate change and policy response. Climatic Change 2003, 61 (3), 295–320. Joint Program reprint no. 2003-11.
Chris Forest is a research scientist in the Joint Program
on the Science and Policy of Global Change at the Massachusetts
Institute of Technology. Mort Webster is an assistant professor of public policy at
the University of North Carolina at Chapel Hill. John
Reilly is
the associate director
for research of the Joint Program and a senior
research scientist in the Laboratory for Energy and the
Environment. This article is adapted
from the July–December 2003 issue of energy & environment,
the newsletter of MIT’s Laboratory for Energy and
the Environment.