The Climate Action Report, a periodic report to the United Nations, was
issued in early June. A media frenzy claimed that this report
somehow contained revelatory new science that changed the debate on
global warming. In fact, the report has little new science. But since 1992, when America
embarked on the Rio Treaty, a great deal of new science has come
forward. The United States is a leader in studying the subject. The
U.S. has invested some $45 billion in research funding on this
question over the past 10 years.
I wanted to update you on the latest science since 1992 and assure
you that what is in the Climate Action Report is really nothing
new. The scientific facts on which everyone agrees are that, as a result of
using coal, oil, and natural gas, the carbon dioxide content of the
air is increasing. The air's concentration of other human-produced
greenhouse gases, like methane, has also increased. These
greenhouse gases absorb infrared radiation emitted by the earth's
surface, and they retain some of that energy close to earth.
The computer simulations of climate change say that, based on how we
understand climate to work, the low layer of air from one to five
miles up (the low troposphere), where the radiation is trapped,
should warm. That low layer of air warming should, in turn, warm
the surface. Scientific facts gathered in the past 10
years do not support the notion of catastrophic human-made warming
as a basis for drastic carbon dioxide emission cuts.
You probably know that the Kyoto agreement fails to stop the
hypothesized human-made global warming. Kyoto would hurt America's
and the world's workers and the struggling poor and the elderly,
owing to the severe cuts in energy use that it entails.
MEASURING SURFACE TEMPERATURE
Now for the science. There are two important records that we'll look
at. I just told you how we think climate operates in the presence
of increasing carbon dioxide and greenhouse gases in the air from
human activities. The layer of air one to five miles up retains
energy and that layer, in turn, heats the surface of the earth. The
human-made greenhouse warming component must warm both layers of
air, with computer simulations indicating the low troposphere would
warm more quickly and by a greater amount than the surface.
Let's start with the surface temperature
records. They are made by thermometers, and go back to about the
mid-19th century in locations scattered around the world. For some
locations the records go back even further.
Two groups have analyzed these surface temperature records: the
Climatic Research Unit in Great Britain, and the NASA Goddard
Institute for Space Sciences. They broadly say the same thing: The
19th century was cooler than the 20th century. There may be some
disagreement on the exact amount of the warming, but certainly the
20th century was warmer than the 19th.
To see if the 20th-century surface warming is from human activity or
not, we begin looking in detail at the surface record. In the 20th
century, three trends are easily identified. From 1900 to 1940, the
surface warms strongly. From 1940 to about the late 1970s, a slight
cooling trend is seen. Then from the late 1970s to the present,
warming occurs. Briefly, the surface records show early
20th-century warming, mid-20th-century cooling, and late
20th-century warming. Most of the increase in the air's concentration
of greenhouse gases from human activities--over 80 percent--occurred
after the 1940s. That
means that the strong early 20th century warming must be largely,
if not entirely, natural.
Likewise, the mid-20th-century cooling can't be a warming response owing to the
air's added greenhouse gases. The only portion of this record that
could be largely human-made is that of the past few decades. The
slope of that trend calculated over the past few decades is about
one-tenth of a degree Centigrade per decade.
Almost all the computer models agree that the human-made warming
would be almost linear in fashion. So over a century the
extrapolated warming trend expected from continued use of fossil
fuels would amount to about 1 degree Centigrade per century. That's
what the surface temperature says would be the upper limit.
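That extrapolation is simple arithmetic, and can be checked in a couple of lines; the 0.1 degree-per-decade slope is the figure quoted above, not a new measurement:

```python
# Extrapolate the recent surface trend over a full century.
# 0.1 deg C per decade is the slope quoted in the talk.
surface_trend = 0.1      # deg C per decade
decades = 10             # decades in a century

century_warming = surface_trend * decades
print(century_warming)   # 1.0 deg C per century, the stated upper limit
```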
I gave you a scientific test to do early in my remarks. The
question is, What happens in the low layer of air from one to five
miles up that must warm in response to the increase in greenhouse
gas concentrations? The surface warming can be attributed
to human-made greenhouse gas emissions only if the low troposphere
also warms, assuming the computer simulations are accurate. One
can have surface warming for a variety of reasons. So the key
layer of air to look at is the one-to-five-mile-high layer of air.
MEASURING AIR TEMPERATURE
Satellites were launched starting in 1979 to measure this layer of air.
The satellites look down and record these measurements daily. I've
plotted the monthly averages. There are lots of jigs and jags in
the data, and they are real.
The air temperature varies not only on a daily and monthly
basis, but also from year to year. A very large warming spike in
1997-1998 is a strong, natural phenomenon called El Niño, a
warming of the Pacific that in turn warms the air. Because the
Pacific covers so much of the globe, its warming raises the global
average temperature. But it doesn't last very long, and after the El
Niño subsided, temperatures fell.
El Niños are natural and occur every several years. In 1982, an
equally strong El Niño was developing in the Pacific. But
then, a volcano erupted. Material lofted by strong volcanic
eruptions can temporarily cool temperatures. So those two events
occurring at nearly the same time meant there was a net cooling
just after 1982, instead of an unmasked strong El
Niño-driven pulse of warmth.
El Niño is part of a system of ocean and air changes called the
El Niño Southern Oscillation, in which the La Niña
phase tends toward cooling. Detailed physical understanding of the
El Niño Southern Oscillation is lacking.
Again, these phenomena are naturally
occurring. They have existed for many millennia prior to
human-added greenhouse gases in the air.
I asked the computer to naively draw a linear trend through the data
recorded by satellites. This linear trend probably has a bias, an
upward bias because of that strong 1997-1998 El Niño warm
pulse. Nonetheless, the fitted trend is positive: four-hundredths
of a degree Centigrade per decade.
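Drawing a naive straight line through a monthly series is a least-squares fit. Here is a minimal sketch, using a synthetic stand-in series (not the actual satellite data) whose built-in trend matches the quoted four-hundredths of a degree per decade:

```python
import numpy as np

# Synthetic monthly lower-troposphere anomalies (deg C); a stand-in
# for the satellite record, NOT the real data.
rng = np.random.default_rng(0)
months = np.arange(240)                    # 20 years of monthly values
anomalies = (0.04 / 120) * months + rng.normal(0, 0.1, months.size)

# Naive least-squares linear trend through the whole series.
slope_per_month, _ = np.polyfit(months, anomalies, 1)
trend_per_decade = slope_per_month * 120   # 120 months per decade
print(round(trend_per_decade, 2))          # close to 0.04 deg C per decade
```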
Remember, this is the layer of air sensitive to the human-made warming
effect, and the layer that must warm at least as much as the
surface according to the computer simulations. Yet, the projected
warming from human activities can't be found in the low troposphere
in any great degree. The four-hundredths of a degree Centigrade
might be entirely due to this El Niño bias. If the small
warming trend in the low troposphere were assumed to be entirely
human-caused, the trend is much smaller than forecast by any model.
Extrapolated over a century, the observed trend indicates a
human-made warming trend no greater than four-tenths of a degree
Centigrade per century. In contrast, the computer models say this very key layer of air must
be warming from human activities. The predictions are that the air
must be warming at a rate of approximately a quarter of a degree
Centigrade per decade.
Comparing what the computer models say
should be happening with the actual satellite observations shows a
mismatch of around a factor of 6. That is, this layer of air just
is not warming the way the computer simulations say it should.
There should have been about half a degree Centigrade of warming
in this layer of air over the period of satellite
observations. The human-made warming trend isn't there.
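The factor-of-6 mismatch follows directly from the two trends quoted above; a sketch of the arithmetic:

```python
# Trends as quoted in the talk (deg C per decade).
model_trend = 0.25       # predicted for the low troposphere by the models
observed_trend = 0.04    # fitted to the satellite record

mismatch = model_trend / observed_trend
print(round(mismatch, 2))   # 6.25 -- a mismatch of around a factor of 6
```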
An argument is often made that the measurements made by satellites
looking down on this key layer of air are biased, or that the
satellites have instrumental problems.
The researchers worked very hard to make these measurements the best
possible, and to correct for any of the deficiencies seen in them.
But it's always useful to have an independent set of data, and we
have that from NOAA (the National Oceanic and Atmospheric
Administration) scientists and from other groups around the
world. Measurements are also made of this layer
of air from weather balloons that carry thermometers. Balloons are
launched worldwide every day to make the measurements. The balloon
data go back to 1957, and importantly, they overlap with the
satellite data which began in 1979 and have continued through the
present. During the period of overlap, the correlation coefficient
between the two data sets, the technical measure of how well these
two independent measurements agree, is well over 99 percent.
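A correlation coefficient over the period of overlap can be computed as below. The two series here are synthetic stand-ins (a shared signal plus small independent instrument noise), only to illustrate why two records tracking the same air yield a correlation above 99 percent:

```python
import numpy as np

# Stand-ins for the overlapping satellite and balloon monthly series:
# a shared, ENSO-like swing plus small independent measurement noise.
# These are illustrative, NOT the real records.
rng = np.random.default_rng(1)
months = np.linspace(0.0, 12.0 * np.pi, 264)   # ~22 years of overlap
shared = 0.3 * np.sin(months)                  # deg C, common variability
satellite = shared + rng.normal(0, 0.01, months.size)
balloon = shared + rng.normal(0, 0.01, months.size)

# Pearson correlation coefficient between the two independent records.
r = np.corrcoef(satellite, balloon)[0, 1]
print(r > 0.99)   # True: both records track the same underlying change
```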
In other words, the satellite data and the balloon data both say that
the records reflect the actual change in this layer of air. Again,
as with the satellite record, one can recognize short-term natural
variations--El Niño, La Niña, volcanic eruptions--but
one does not see the decades-long human-caused warming trend
projected by climate models.
Often, one sees these same data from this
key layer of air with a linear trend drawn through them. However,
because of bias in the record from a natural phenomenon, it is not
appropriate to draw a straight line through the four decades of the
temperature record. One must work around the natural phenomenon I'm
going to tell you about.
Every 20 to 30 years, the Pacific Ocean
changes sharply. The sudden shift is called the Pacific Decadal
Oscillation, or PDO, and produces an ocean, air, and wind current
shift. Fishermen will notice, for example, migrations of fish
species along the West Coast.
In 1976-1977 the Pacific Decadal Oscillation shifted, in what is
labeled the Great Pacific Climate Shift of 1976-1977. As a result,
temperatures changed dramatically from their former average (since
around 1946), and returned to warmth seen from around 1923 to 1946.
So sharp is the shift that the appropriate thing to do is to look
for a secular trend (which might be the human-made trend) before
1976-1977, and then after 1976-1977. But drawing a straight line
through that natural event should be avoided.
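The hazard of fitting one line across a step can be illustrated with synthetic numbers: a record that is perfectly flat before and after a sudden shift still yields a nonzero linear trend when fitted end to end. This is only an illustration, not the actual PDO record:

```python
import numpy as np

# Synthetic 40-year record with no trend at all, just a sudden
# 0.2 deg C step at year 20 (a stand-in for a PDO-like shift).
years = np.arange(40)
temps = np.where(years < 20, 0.0, 0.2)

# One straight line through the whole record reports spurious warming...
whole_slope = np.polyfit(years, temps, 1)[0]

# ...while each regime, fitted separately, shows no secular trend.
before_slope = np.polyfit(years[:20], temps[:20], 1)[0]
after_slope = np.polyfit(years[20:], temps[20:], 1)[0]
print(round(whole_slope, 4))   # ~0.0075 deg C/yr from the step alone
print(abs(before_slope) < 1e-8, abs(after_slope) < 1e-8)   # essentially zero
```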
The PDO is natural, because proxy records--of tree growth, for
example--detail the oscillation going back several centuries, prior
to the human activities that significantly increased the
content of greenhouse gases in the air.
It is also known from computer simulations that the human-made warming
trend is supposed to grow steadily over decades. So a human-made
origin for the sudden 1976-1977 shift is ruled out for two reasons:
One, it's not what the models project; and two, we see this event before the
build-up of human-made greenhouse gases, and it is therefore
natural. The satellite data and the balloon data agree when both records
coexist, from 1979 to the present. The balloon record reaches back
four decades. Neither record sees a meaningful human-made warming
trend. If you remember just one thing from this talk, remember this: That
layer of air cannot be bypassed; that layer of air must warm if
computer model projections are accurate in detailing the human-made
warming trend from the air's increased greenhouse gases. But that
layer of air is not warming. Thus the human-made effect must be
small. Additionally, the recent warming trend in
the surface record must not owe to the human-made effect. The
surface temperature is warming for some other reason, likely
natural influences. The argument here, from NASA and NOAA data, is
that this layer of air from one to five miles in altitude is not
warming the way computer simulations say it must warm in the
presence of human activity. Therefore, the human-made effect is
small. The surface data must be warming from natural effects,
because the human-made warming trend must appear both in the low
troposphere and at the surface. All models are in agreement on
this point. So if the surface data are warming for a natural reason, what might
that be? Our research team studies changes in the energy output of
the sun and its influence on life and the environment of earth.
Records of sunspot activity reach back to
the days of Galileo, some 400 years ago. Scientists then could
project an image of the sun and draw these dark sunspots that were
seen through early telescopes. We know sunspots to be areas of
intense magnetic activity, and from NASA satellite measurements in
the last 20 years, we know that over time periods of decades, when
the magnetism of the sun is strong, the energy output of the sun is
also more intense. That is, the sun is a little bit brighter when
magnetism is high, and the sun is a bit fainter when magnetism is
low. The sharp ups and downs in the sunspot record define the familiar
11-year cycle, or sunspot cycle. The period is not exactly 11
years. It varies between eight and 15 years, and there is no good
explanation for the cause of the cycle. But I'm not going to look
at the short term, but rather at the changing sun over decades to
centuries. Over the past half-century, the sun has become very active, and it
is more active than it has been for 400 years. Therefore, the sun
is likely at its brightest in 400 years.
Also noteworthy is a feature called the Maunder Minimum. In the 17th
century, the observations of sunspots show extraordinarily low
levels of magnetism on the sun, with little or no 11-year cycle.
That phase of low solar activity has not been encountered in modern
times (although radiocarbon records indicate that a Maunder-minimum
episode occurs for a century every several centuries). The
17th-century Maunder Minimum corresponds with the coldest century
of the last millennium.
That may not be a coincidence. If the sun's energy output had faded, the
earth may have cooled in response to that decrease in the sun's
total energy output.
The next step is to look closer at the temperature records on earth,
and see if they link to the decadal-to-century changes in the sun's
energy output. Climate scientists believe they can reliably
reconstruct Northern Hemisphere land temperature data back to, say,
the year 1700.
When changes in the energy output of the sun, drawn from the envelope
of the changes in the sun's magnetism, are superposed on
the reconstructed temperature record, the two records show a close
correspondence: the ups and downs of each record match fairly well.
The coincident changes in the sun's energy output and the temperature records
on earth tend to argue that the sun has driven a major portion of
the 20th century temperature change. For example, a strong warming
in the late 19th century, continuing in the early 20th century, up
to the 1940s, seems to follow the sun's energy output changes.
The mid-20th-century cooling, and some of the latter 20th-century
warming, also seem matched to changes in the sun.
To review: The surface warming that should be occurring from
human-made actions, which is predicted to be accompanied by low
troposphere warming, cannot be found in modern records from balloon
and satellite platforms.
Thus, the recent surface warming trend may
owe largely to changes in the sun's energy output.
ECONOMIC CONSEQUENCES OF THE POLICY
Science is the primary tool to understand
human-caused global warming. But economic consequences of policies
meant to cut greenhouse gas emissions also enter the policy
discussion. Kyoto-type greenhouse gas emission cuts
are expected to make little impact on the forecast rise in
temperature, according to the computer simulations (which seem to
give exaggerated warming trends, as discussed). One forecast, from
the UK Meteorological Office, underscores the point. Without Kyoto,
that model predicts a rise in globally averaged temperature of just
about 1 degree Centigrade by the year 2050. Implementing Kyoto,
according to that model, would result in a slightly but
insignificantly lower temperature trend. The temperature rise
avoided by the year 2050--the difference between the two trends--is
six-hundredths of a degree. That is insignificant in the course of
natural variability of the climate. Another way to look at the
averted warming is that the temperature rise expected to occur by
2050 is projected to occur by 2053 if the emission cuts are
implemented. The conclusion is that one Kyoto-type cut in greenhouse gas emissions
averts no meaningful temperature rise, as projected by the models.
In order to avoid entirely the projected warming, British
researchers estimate that 40 Kyoto-type cuts in greenhouse gas
emission would be required.
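The three-year delay can be recovered from the quoted numbers. A sketch of the arithmetic, assuming the roughly 1 degree of projected warming accumulates linearly over the half-century to 2050 (that linearity is my simplifying assumption, consistent with the near-linear model trends described earlier):

```python
# Figures quoted from the UK Meteorological Office forecast.
warming_by_2050 = 1.0    # deg C projected without Kyoto
averted_by_2050 = 0.06   # deg C avoided with Kyoto implemented

# Assume the warming accrues linearly over ~50 years (2000-2050).
rate = warming_by_2050 / 50.0          # 0.02 deg C per year
delay = averted_by_2050 / rate         # years of warming "bought back"
print(round(delay, 1))   # 3.0 -- the 2050 warming arrives around 2053
```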
The cost of implementing one Kyoto-type cut is enormous. Fossil fuels
supply approximately 85 percent of energy needs in the United
States; worldwide the fraction is about 80 percent. International
policy discussions propose expensive solutions centered on sharp
fossil fuel use cuts and a massive increase in solar and wind
power. A cost-effective solution that does not stunt energy use and
energy growth is to shut down coal plants, extend the licenses of
the 100 nuclear power plants in the United States, and build about
800 more. However, that is not under serious discussion as a
solution to what is often described as the most pressing crisis
facing the earth.
Renewable energy sources like solar and
wind are not only expensive but also environmentally damaging in
their vast land coverage. Those renewable energy sources are not
foreseen as capable of meeting projected energy and economic growth needs.
For economic growth, fossil fuels will be relied on for the next
decade or two.
The cost of engaging in one Kyoto-type greenhouse gas emission cut
ranges between $100 billion and $400 billion of lost GDP annually
in the United States. For comparison, consider that the Social
Security Trustees estimated $407 billion was transferred to
retirees in 2001. The $400 billion annual loss in GDP is
approximately numerically equal to the total amount of public and
private primary and secondary education spending in the United
States. A recent study from Yale University says that over the next 10
years, Kyoto-type cuts would cost about $2.7 trillion in lost GDP in
the United States. Those costs must be increased if the
target of greenhouse gas emission cuts is not one Kyoto-type
agreement but 40.
Another possible target for emission cuts
is the benchmark of stabilizing the atmosphere at a level of 550
parts per million of equivalent carbon dioxide concentration. That
target probably will be discussed at the World Summit on
Sustainable Development in Johannesburg. Current discussions imply that
developed countries like the United States would be forced to go to
zero net carbon emissions by the year 2050. Beyond 2050, the United
States would produce net negative carbon emissions, i.e., the
United States would not only continue to emit zero net carbon but
would also begin removing carbon from the atmosphere.
In summary, little evidence supports the idea of catastrophic
human-made global warming effects. Undertaking a Kyoto-type program
would produce little abatement of the forecast warming, while the cost
of such a program would divert resources and attention from major
environmental, health, and welfare challenges.
In that regard, forecasts are made of the hypothesized impacts of
projected human-made global warming effects. For example, one
scenario is that hurricanes may increase because more carbon
dioxide has been added to the air. This would be a serious economic
impact because hurricanes are the costliest natural disaster in the
U.S. But hurricanes have not increased in number or severity in the
past 50 years. The cost of property damage has increased, because
the cost of property has risen along with the rise in U.S.
wealth--not because carbon dioxide has been added to the air.
Another scenario is that human-made global
warming will bring sweeping epidemics of infectious diseases like
malaria in the United States. But malaria was once endemic to the
United States. Malaria outbreaks were quelled not by controlling the
weather, or by controlling the amount of carbon dioxide in the air,
but through increased wealth. That the United States became
wealthier from fossil fuel use meant people could be protected from
malaria by living inside screened or climate-controlled structures,
by reducing the disease vector, mosquitoes, and by advancing
medical knowledge and care. In contrast, nearly one million people
die from malaria each year; many of its victims are children in
Africa and other developing nations.
Diminishing the impact of natural
disasters is an immediate worldwide need that rests on keeping the
U.S. and world economy vibrant. Energy use, that is, fossil fuel
use, helped achieve stunning progress for humankind and the
environment in the 20th century. For example, life expectancy in
the U.S. in the 20th century nearly doubled.
Agricultural experts estimate that
technology has improved crop output. But some of the increase in crop
growth--about 10 percent--may owe to the added carbon
dioxide in the air, that is, the aerial fertilization effect from
carbon dioxide. Carbon dioxide is not a toxic pollutant. It is
essential to life on earth.
The latest scientific results are good news: The human influence on
global climate change is small and will be slow to develop. The
conclusion comes from the lack of meaningful warming trends of the
low layer of air, in contradiction to the computer simulations that
project a strong human effect should already be present. Those
results present an opportunity to improve climate theory and computer
simulations of climate, and to obtain crucial measurements.
The economic consequences of relying not on science but on the
anti-scientific Precautionary Principle are considerable, and they are
not speculative. The economic impacts of significantly cutting
fossil fuel use will be hard-felt, and they will be devastating to
those on fixed incomes, those in developing countries, and those on
the margins of the economy.
For the next several decades, fossil fuel use is key to improving the
human condition. Freed from their geologic repositories, fossil
fuels have been used for many economic, health, and environmental
benefits. But the environmental catastrophes that have been
forecast from their use have yet to be demonstrated.

Sallie Baliunas, Ph.D., is
a Senior Scientist at the George C. Marshall Institute and co-host of
TechCentralStation.com. The views expressed here do not necessarily
represent those of any institutions with which she is affiliated.