"For Earth Day, Nine Scientists Offer Data on the State of the Environment"

Report Environment

"For Earth Day, Nine Scientists Offer Data on the State of the Environment"

April 19, 1990

Authors: Kent Jeffreys, Elizabeth Whelan, and Winston J. Brill

(Archived document, may contain errors)

765
April 19, 1990

FOR EARTH DAY, NINE SCIENTISTS OFFER DATA ON THE STATE OF THE ENVIRONMENT

INTRODUCTION

April 22, 1990, marks the twentieth anniversary of Earth Day, an event designed to focus attention on environmental and ecological issues. While the average American has little interest in the details of scientific and technical issues, most opinion surveys find that the public has an overriding concern for the quality of the environment. Yet concern about the environment and proposed solutions for perceived problems are almost wholly grounded in scientific theories and on the current state of knowledge in specific fields of research.

Without accurate data Americans cannot hope to produce effective responses to actual problems, let alone distinguish true problems from the scores of perceived threats.

Science does not always provide a simple answer to even a simple question.

This complexity has discouraged many Americans from making the necessary careful examination of science-based policy issues. Policy makers are not above making this mistake, often producing ineffective or even harmful environmental regulations that lead to further debates and more failed policies. Although years of research may be needed even before an actual problem can be identified, many clamor for an immediate solution to every fear.

Supposed Crises as Norm. Science fiction and fear should not substitute for sound policies. On topics ranging from potential climate change to bioengineering, from the global to the microscopic, appeals to unsubstantiated speculation and untested or unproven "scientific" assertions find a ready audience in today's sensationalized media coverage of all events. Almost any claim of impending cataclysm receives an enormous amount of coverage, regardless of the true weight of the evidence. For much of today's media, news by definition is bad news. This automatically focuses on accidents or supposed crises as if they were the norm. By the time the facts are marshalled to rebut false assumptions, as typically they have been, the public focus has shifted to the next crisis. The result: the public never hears the full story.

The nine short papers here assembled are by scientists addressing important environmental issues. They apply fact and analysis to these topics. As Americans mark the Earth Day anniversary, their opinions should be as informed and their knowledge as complete as possible if rational, effective environmental policies are to be enacted.

Aaron Wildavsky, Professor, Political Science and Public Policy, Survey Research Center, University of California at Berkeley, analyzes popular views of the risks involved in new products, inventions, or even in the search for these. He concludes that attempts to eliminate every risk to society involve the even greater risk of stopping human progress and failing to meet new challenges in such areas as medicine and the environment.

Bernard Cohen, Professor of Physics at the University of Pittsburgh, shows that exaggerated fears of radiation ignore even far greater risks from using other, more polluting fuel sources, and cost society billions of dollars in wasteful expenditures from unneeded regulations.

Elizabeth Whelan, President of the American Council on Science and Health, points out that America's food supply is made safe and plentiful because of pesticides and chemicals. Efforts to eliminate these will make food less safe and more costly.

Winston J. Brill, President of Winston J. Brill and Associates, and a research consultant on biotechnology, addresses the potential harm that arises from inappropriate restrictions on genetic research. The potential agricultural, medical, and environmental benefits from bioengineering are enormous and should not be hindered by public misconceptions.

Edward C. Krug, soil scientist with the Illinois State Water Survey and participant in the National Acid Precipitation Assessment Program's 1990 Conference, offers evidence that acid rain is not killing lakes or forests in the Northeast. He argues that the small environmental impact from manmade emissions of sulfur can be corrected for less than one percent of the cost of the Clean Air Act approach.

S. Fred Singer, Professor of Environmental Sciences at the University of Virginia, finds that ozone in the earth's atmosphere, which protects us from harmful ultraviolet radiation, is not at an abnormally low level. Seasonal declines in ozone over Antarctica have been observed for over 30 years and have had no adverse effects on living creatures.

Patrick J. Michaels, environmental scientist of the Department of Environmental Sciences at the University of Virginia, addresses the weaknesses in current computer models predicting global warming. He offers empirical data showing that average worldwide temperatures are not rising significantly.

Randy Simmons, Director of the Institute of Political Economy at Utah State University, examines cases that show that commercial trade in rare species creates an incentive to protect them. Banning trade, as that of elephant ivory, raises the prices received by poachers.

Bruce Yandle, Alumni Professor of Economics at Clemson University, argues that Superfund, the nation's primary program for dealing with hazardous waste sites, is failing to accomplish its environmental goals, even as it grows into a huge, wasteful bureaucracy. Political porkbarrel projects, not a cleaner environment, are the main result of this program.

Kent Jeffreys
Environmental Policy Analyst

THE MYTH OF THE RISK-FREE SOCIETY
Aaron Wildavsky

Many critics of new technologies, goods and services are motivated by fear of risks and the desire to create a society free of all risks to health and safety. Yet while it is legitimate to avoid undue dangers, the attempt to create a risk-free society in fact would needlessly harm much of the human race.

Despite the introduction of new technologies, goods, and services that have the potential to cause harm of some sort, the people of the United States and, indeed, the entire Western world are healthier and safer than they have ever been.

Life expectancy continues to rise, and fatal accident rates continue to decline. People of every age and ethnic group are doing better than those in previous decades. Innovations that benefit mankind, even though they involve grave risks, include such things as heart transplants and vaccinations against polio.

Creating More Benefits. What accounts for this explosion of good health and increasing safety? Quite simply, modern technology improves health and safety more than it causes harm. It does this by reducing hazards and by finding safer ways to do things. Such advances are directly related to material prosperity.

People in wealthier countries are healthier than people in poorer nations. Conversely, sustained declines in income and rises in unemployment are accompanied by declining health.

Health and safety are in part functions of the amount of such resources as wealth, knowledge, and energy that can be utilized to enable individuals and, therefore, societies to respond to the constantly shifting challenges of life.

Even if it is true that new discoveries and inventions create more benefits than hazards, why settle for any risks at all? Why not accept the health and safety aspects of modern technology and leave the harmful ones behind? Why cannot government simply mandate that all products and activities be made safer?

The fact is that the good and the bad effects are inextricably intertwined in the same objects. Therefore risk taking usually is essential for bringing benefits. For example, individuals accept the risk of contracting a hospital-caused disease because of the benefits of having hospitals. Or, to discover a vaccine against polio, which saved thousands of people from lives of physical disability, human volunteers risked death in early trials of the medicine.

Trial and Error Essential. Some critics would reject new innovations, or even the search for them, unless they can be shown to do no harm. But this undermines the trial-and-error process essential for any new discovery and substitutes instead a mandated public policy of trial without error, or no trial without prior guarantees against error. What is wrong with this apparently prudent approach?

First, it is literally impossible for a researcher or experimenter to prove that some unforeseen consequence will not occur. Attempting through regulations to prevent all potential dangers inflicts huge, unnecessary costs on society.

Government's attempts to impose a margin of safety on particular activities have no logical stopping points. It might be irresistible for government bureaucrats or others to suggest that if a one-in-a-million risk is good, then a one-in-one-billion risk is even better. Never mind that the cost of reaching such low risk levels redirects society's resources in ways that result in greater overall harm.

Since greater wealth and other resources offer the means for a safer and healthier society, reducing society's wealth through over-regulation with no attempt to balance costs with benefits ultimately will reduce the people's health and safety. A strategy of pure prevention is akin to making an organism so sensitive that the smallest threat of harm leads it to set off so many of its defenses that it collapses from exhaustion and dies.

Willingness to Take Risks. The second problem with a public policy of preventing any risk or danger from any new innovations follows from the fact that failure to innovate itself can be very harmful. Society cannot continue to realize the growth in safety and health it has experienced for the past century without encouraging technical, engineering, industrial, and scientific progress. The more that policy makers restrict experimentation or the trial-and-error approach, the less innovation there will be. Old hazards will not be reduced as much as they would have been. New benefits to society will be delayed or never materialize.

Thousands or millions of people might suffer or die from lack of new medicines.

Finding a solution to new dangers, such as AIDS, will require a willingness to take some risks in order to eliminate others.

It is impossible to develop rapid improvements in health and safety without an innovative technology based on continuous, noncentralized, trial-and-error decision making. The history of promoting safety through a centralized system is poor. If Big Brother knew best, the Soviet Union would have rising rather than declining health and safety rates. Just as Eastern Europe was enslaved in the name of some perfect liberty, so health can be harmed in the name of some perfect safety.

Safety without risk is a delusion. Uncertainty about the consequences of present acts and about others as yet unforeseen cannot be reduced to zero.

Health and safety are provided in the same manner as other goods, by means of trial-and-error risk taking. Only by allowing this process to continue will society be able to reap the benefits of increasing levels of health.

RADIATION AND OUR SOCIETY
Bernard L. Cohen

Nuclear radiation consists of various subatomic-size particles such as gamma rays, electrons, and alpha particles travelling at speeds of approximately 100,000 miles per second. They penetrate deep inside the human body, each one damaging about 100,000 cells as it passes. This damage might initiate a fatal cancer, or, if it is in a reproductive cell, it could cause genetic defects in later generations.

This makes radiation seem very dangerous, but before panicking one should recognize that every person is struck by about 15,000 of these particles every second, a total of 500 billion every year or 40 trillion in a lifetime, from natural sources alone. When a patient receives an x-ray, he is struck by about a hundred billion particles.
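These exposure figures can be checked with simple arithmetic. The sketch below uses only the per-second rate quoted above; the 80-year lifetime is an assumption, not a figure from the text.

```python
# Rough arithmetic check of the natural background exposure figures quoted above.
# Assumption: an 80-year lifetime (not specified in the original text).
PARTICLES_PER_SECOND = 15_000
SECONDS_PER_YEAR = 365.25 * 24 * 3600      # about 3.16e7 seconds

hits_per_year = PARTICLES_PER_SECOND * SECONDS_PER_YEAR
hits_per_lifetime = hits_per_year * 80

print(f"hits per year:     {hits_per_year:.2e}")      # ~4.7e11, roughly 500 billion
print(f"hits per lifetime: {hits_per_lifetime:.2e}")  # ~3.8e13, roughly 40 trillion
```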

It is true that any single one of these trillions of particles might cause a cancer or a genetic disease. But the probability of any one particle having such an effect is very low, about one chance in 20 quadrillion, or 20 million billion. Thus, even though such a particle passes through each human body 15,000 times every second, less than one percent of the population is ever affected by them. Of course, people are subject to innumerable other such fatal games of chance. For example, every bite of food and every breath of air contains trillions of carcinogenic molecules, any one of which can cause cancer, and millions of germs that can kill through causing other diseases. Walking involves risk of a fatal fall; staying still, however, risks cardiovascular problems from lack of exercise. Every action involves risk.

Resulting in Bad Public Policy. To understand radiation risks, one must compare them quantitatively with other risks. Thanks to over a half century of research, effects of radiation are better understood than those of almost any other environmental agent. The nature of these effects is agreed upon, with relatively insignificant variations, by such scientific groups as the National Academy of Sciences Committee on Biological Effects of Ionizing Radiation, the International Commission on Radiological Protection, the Congressionally chartered National Council on Radiation Protection, the United Nations Committee on Effects of Atomic Radiation, the U.S. Environmental Protection Agency, Britain's National Radiological Protection Board, and similar official bodies charged with responsibility for radiation protection in every nation of the world.

While the general nature and effects of radiation are well understood, policy makers often act without applying the available quantitative information. The results are bad public policies. For example, use of electricity in the U.S. has increased steadily for over a century, growing at a rate of around 4.5 percent annually in recent years. This growth requires new power plants, and the only viable fuels for these are nuclear or coal. The best scientific estimates of the health effects caused by a plant producing one million kilowatts of electricity are:

1) From a coal-burning plant, about 25 deaths per year due to air pollution.

2) From a nuclear plant, about 0.05 deaths per year owing to radiation resulting from reactor accidents, treated on a probabilistic basis; radioactive waste, adding up effects over millions of years; escape of radioactive materials during operations; and all other effects.

Thus rational scientific analysis indicates that a coal-burning plant is 500 times more harmful to public health than a nuclear plant. Yet coal-burning plants are often chosen over nuclear plants because the public considers the latter to be too dangerous.
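The 500-to-1 comparison follows directly from the two estimates above; a minimal check using only those figures:

```python
# Estimated deaths per year for a one-million-kilowatt plant, as quoted above.
coal_deaths_per_year = 25.0
nuclear_deaths_per_year = 0.05

print(f"coal / nuclear harm ratio: {coal_deaths_per_year / nuclear_deaths_per_year:.0f}")  # 500
```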

The fear of radiation appears even more irrational when comparing public reaction to radiation from different sources. Nearly any risk can be reduced by spending money. For example, new highway safety measures such as improved street lighting, upgraded guard rails, or break-away sign supports can save lives at a cost of about $100,000 per life saved. Money spent on screening programs to detect cancers in early stages also saves lives at a similar cost. The Nuclear Regulatory Commission (NRC) estimates that a nuclear power plant built in the early 1970s with technology available at that time will cause an average of one death due to reactor accidents over a 40-year operating lifetime. Despite tens of millions of dollars spent to research reactor safety, this estimate has not changed substantially since 1975.

The press sometimes refers to reactor accidents that can cause up to 50,000 deaths, but these are expected only once in 10 million years, an average of only 0.005 deaths per year. The NRC in the late 1970s and early 1980s tightened safety requirements that increased the cost of a nuclear power plant by $2 billion. Presumably this was to avert that one death, a cost of $2 billion per life saved.

$100 Million to Save a Life. Even if reactor accidents are not considered, great sums are spent for small statistical gains in regulating nuclear power. About $100 million per life saved is being spent on radioactive waste management. An NRC regulation requires that nuclear plants install equipment to reduce releases of radioactive iodine if it will save one statistical life for every $100 million spent.
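Set side by side, the cost-per-life-saved figures cited in this section (and the radon figure discussed just below) make the disparity plain. A minimal sketch using only the numbers given in the text:

```python
# Cost per statistical life saved, as quoted in the text (dollars).
cost_per_life_saved = {
    "highway safety measures":          100_000,
    "radon reduction in homes":          25_000,        # figure given below
    "NRC radioactive iodine rule":  100_000_000,
    "post-1970s reactor upgrades":  2_000_000_000,
}

baseline = cost_per_life_saved["radon reduction in homes"]
for measure, cost in sorted(cost_per_life_saved.items(), key=lambda kv: kv[1]):
    print(f"{measure:30s} ${cost:>13,}  ({cost / baseline:,.0f}x the radon cost)")
```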

A far greater danger of harm from radiation exposure is from radon in homes. Radon is a naturally occurring, radioactive gas that can become trapped in well insulated homes, building up to potentially dangerous levels. It is estimated that 10,000 Americans die each year from exposure to radon, over a thousand times more than are ever expected to die from nuclear power's radiation. A substantial fraction of these lives can be saved by rather simple and cheap procedures. Such prevention efforts cost about $25,000 per life saved, using the same sort of estimates as those applied to nuclear power safety measures. Yet only 2 percent of American families have taken even the first step of spending $12 to measure radon levels at home.

Policy Makers Ignoring Facts. Since the particles are absolutely identical, when a human cell is struck by a particle of radiation the health effects are the same whether that particle comes from radon in the home or radiation released by a nuclear power plant. Current policies result in spending hundreds of millions or billions of dollars per life saved to avoid small risks, when far greater benefits are available for far lower expenditures. The information is available to policy makers. Yet they too often ignore the facts and thereby waste huge amounts of the nation's wealth and health.

PESTICIDES, THE NATION'S FOOD SUPPLY - AND YOUR HEALTH
Elizabeth M. Whelan

Pesticide has become a pejorative word. With headlines claiming that Alar (technically a growth regulator, not a pesticide) formerly used on apples, ethylene dibromide or EDB (a fungicide used on grains), and other agricultural chemicals cause cancer, Americans have become understandably concerned about pesticide use on produce.

Despite the flurry of regulatory activity to deal with the public anxiety about pesticides, there is no evidence that pesticide residues cause human cancer, and no recorded cases of any human ill health related to exposures to residues of pesticides used in an approved, regulated manner. Pesticides assure man that most crops survive the ravages of insects and end up on the dinner table. The public and policy makers should reject policies that might endanger America's food supply.

Only the Dose Makes the Poison. A look at the facts shows that fears over pesticides are misplaced. The most basic principle of toxicology is that only the dose makes the poison. Pesticides by definition are poison. They are meant to kill insects and other predators. Indeed, any chemical can be harmful if the dose is high enough. The scientific reality is that the exposure of American consumers to pesticides is minuscule. The Environmental Protection Agency sets allowable tolerances for residues in food over 100 times more stringent than necessary to protect health. The Food and Drug Administration inspects food to make sure those tolerances are respected. With the margin of safety built in, there is no further health protection to be achieved by further limiting exposure.

Demands that no poisons or carcinogens be in any food are unrealistic. To begin with, most natural foods abound in toxins. Further, the concerns about pesticide residues come almost exclusively from the observation that these chemicals can be designated carcinogens. Yet this fact is based exclusively on animal studies, with no discernible link to humans. For example, scientists have now determined that a multitude of natural chemicals, including those in pepper, mustard, mushrooms, and bread, also cause cancer in animals. The scientific consensus now is that extrapolating from high-dose animal experiments to trace exposures of humans to pesticides and other chemicals is not scientifically valid.

Epidemiologists have identified a number of factors that increase the risk of human cancers. Cigarette smoking, overexposure to sunlight and radiation, alcohol abuse (particularly in conjunction with smoking), certain occupational exposures, and reproductive practices, such as women having their first child at an advanced age, contribute to the risks of developing specific types of malignancies. No cancer epidemiology textbook lists pesticide residues as a cause or probable cause of human cancer.

Necessary Human Intervention. The world's ability to feed its billions of inhabitants is in part a testament to man's ability to protect crops from insects.

Without deliberate human intervention, nature rapidly would eradicate the world's food-producing capacity and unleash the crop-destroying pestilence that has plagued much of human history. Calls for limits on or the banning of pesticides, based on unsubstantiated fears of a one-in-a-billion chance of contracting cancer, would threaten millions of people by reducing their food supplies. Many would be threatened even with starvation. Without pesticides there would not be an abundant, inexpensive, varied supply of food for this nation on a year-round basis.

In assessing the current public anxiety about pesticides, the public and policy makers are faced with a stark choice: either they can respond to those worries with facts, or they can bow to perceptions. Consumers must evaluate the purely hypothetical risks of pesticides, based on no evidence, contrasted with the grave dangers posed to the food supply by denying farmers the chemical tools they need to feed this country and the rest of the world.

BIOTECHNOLOGY OFFERS ENVIRONMENTAL BENEFITS
Winston J. Brill

Most of the food we eat, from meat to fruit to vegetables, is the product of traditional cross-breeding, a form of genetic engineering. Animals with desirable traits, such as less fat, or plants more resistant to pests are mated or bred with one another to produce more organisms with those traits. Biotechnology is a newer science that involves transferring genetic material that produces an organism's traits directly from one cell to another in an effort to improve or introduce desired traits in an organism. In this regard it is much like the traditional, but far slower and less certain, methods of breeding for selective traits.

Genetic engineers can go beyond traditional breeding. They isolate one or several genes from an organism and introduce these genes into the chromosome of another organism. Species barriers can be overcome. For example, bacterial genes can be added to plants.

The new techniques of biotechnology have great promise for more efficient agriculture. Application of these techniques may be critical to maintain a satisfactory quality of life as the planet's population increases and as greater attention is paid to environmental problems.

Inhibiting Crop Pests. Laboratories already have produced genetically engineered plants that ward off caterpillar and virus problems. Microorganisms have been engineered to inhibit a variety of crop pests after the microorganisms have been applied to the field. These engineered organisms reduce or eliminate the need for pesticides. There is good reason to believe that the pests will not readily overcome this resistance, as they have with most chemical pesticides.

Work also is progressing to produce plants that utilize fertilizers more efficiently, which can reduce the cost of producing crops and possibly decrease fertilizer pollution in lakes and streams. Biotechnology should be able to produce plants that will grow in hostile climates, for example, in the extremely dry parts of Africa. In human health care, genetically engineered organisms are now producing important pharmaceuticals; in fact, this is where most of the biotechnology research and commercial activity has focused.

While special caution is required any time there is application of a new technology, the level of caution should be based on scientific principles and relevant experience. The National Academy of Sciences and other highly respected scientific groups have published reports stating that a genetically engineered organism should be no more dangerous than that same organism genetically modified by traditional methods, such as mutation and breeding. Thus, regulations that have been satisfactory for traditionally altered organisms should be satisfactory for organisms that have been modified through genetic engineering.

This conclusion is scientifically sound. The best a genetic engineer could hope to do is to add fewer than ten foreign genes to the recipient organism and have that organism survive in nature. An organism contains hundreds of thousands of genes, and the addition of a few will not radically change its character. Therefore a tomato containing a couple of genes from another organism will still look and in most respects be like other tomatoes. It may, however, have improved taste or resistance to a pest. Genetic engineering is much more specific and predictable than breeding, which is really a random mixing of the hundreds of thousands of genes of each parent organism.

Predictions Unfounded. Anti-biotechnology activists have frightened the public into believing that genetically engineered organisms, when used in a field, could cause havoc. They evoke science fiction-like visions of mutant organisms spreading like a plague and destroying human life. There is no scientific basis for these predictions. Very little press has focused on the conclusions from the National Academy of Sciences. Thus, the public has generally been equating genetically engineered organisms with dangers, instead of with the benefits of better food, better health care, and fewer environmental problems.

The Environmental Protection Agency (EPA) regulates field testing of genetically engineered microorganisms, while the U.S. Department of Agriculture regulates genetically engineered plants. Current regulations, especially those of the EPA, are far too restrictive and play a major role in limiting the research advances in this area. A laboratory that wants to field test a genetically engineered organism, a test that may involve as little as a square meter in area, has to submit a tremendous amount of paperwork to the regulatory agency. It has been estimated that a request for a single field test costs a minimum of $200,000.

In some cases state and local regulations impose additional costs. Before an engineered plant or microorganism can be shown to be effective for commercialization, many dozens of field tests will be necessary.

Regulatory Barriers Hindering Competition. Large corporations may be able to afford this. Small entrepreneurial companies usually cannot. The greatest problem, however, is that the university researcher often cannot afford such a field test, unless supported by a corporation.

The net result is that many good university researchers are avoiding basic science projects that involve field testing genetically engineered organisms. This eventually will reduce future benefits, since most applications from industrial research are developed from basic research. Biotechnology will not advance if these regulatory and public perception problems continue. The U.S. might find itself becoming less competitive in areas involving biotechnology, since many other nations have not erected these regulatory barriers.

ACID RAIN AND ACID LAKES: THE REAL STORY
Edward C. Krug

For more than a dozen years, the conventional wisdom held that rain in the Northeast U.S. was acidified by Midwest electric utilities' coal burning, creating an aquatic "silent spring": thousands of Northeast lakes were allegedly dead, with thousands more soon to die. President Jimmy Carter endorsed a report by his Council on Environmental Quality in 1980 calling acid rain one of the two most serious environmental problems of the century. Such fear was understandable given the widespread claims of disaster. For example, the Congressional Office of Technology Assessment in 1984 claimed that 80 percent of Northeast lakes were or would be acidified by acid rain. The U.S. Environmental Protection Agency in 1980 claimed that acid rain had increased the acidity of Northeast lakes 100-fold over the last 40 years. In 1981 the National Academy of Sciences claimed that, at then-current levels of acid rain, the number of acidified lakes would more than double by 1990.

These initial claims of disaster were unsubstantiated and have been refuted by extensive research of the past ten years. Yet the authority and consistent repetition of these claims established a belief which persists to this day.

The ten-year National Acid Precipitation Assessment Program (NAPAP), however, proves that the claims of an aquatic silent spring are unsubstantiated hyperbole. NAPAP's comprehensive national lake survey found only 240 of 7,000 Northeast lakes, or 3.4 percent of the total, to be acid-dead, although this was assumed to be due to acid rain. And this percentage was found to be stable.

Furthermore, most of these dead lakes have little recreational value simply because they are small and hard to get to.

Pressure for Politically Desired Results. When NAPAP published these lake survey results in its 1987 Interim Assessment, environmentalists and their allies in Congress were outraged. NAPAP's director, Lawrence Kulp, left in the midst of the turmoil. Members of Congress demanded that the new director produce the politically desired results. This furor diverted public and political attention from the expensive survey results showing no aquatic silent spring and allowed policy makers to proceed with plans to deal with a problem of far greater magnitude than the one that actually existed. Thus, last October 5, NAPAP's new director, Dr. James R. Mahoney, told the Senate Subcommittee on Environmental Protection that deacidification of about half of the acidic lakes over the next 50 years might conceivably justify the potential $300 billion cost of proposed acid rain controls. But his own NAPAP report admitted that all 240 acid-dead lakes could be deacidified simply by adding acid-neutralizing limestone for just $20 million, one fifteen-thousandth of the cost of the proposed acid rain bill.
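The "one fifteen-thousandth" ratio follows directly from the two cost figures just cited; a quick sketch, using only the numbers given in the text:

```python
# Cost comparison quoted in the text: liming the 240 acid-dead lakes
# versus the proposed acid rain control program.
liming_cost = 20e6             # $20 million for limestone treatment
acid_rain_bill_cost = 300e9    # $300 billion in proposed controls

fraction = liming_cost / acid_rain_bill_cost
print(f"liming is 1/{1 / fraction:,.0f} of the proposed program's cost")  # 1/15,000
```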

The February 1990 NAPAP conference on the final results of this $600 million research program reduced aquatic effects to a trivial issue. While it was previously reported by NAPAP that most of the existing acid-dead lakes are acidic because of acid rain, additional study of Adirondack lakes (considered to be the lakes most severely acidified by acid rain) showed that over half of the fishless acidic lakes were acidified by natural organic acids. Most of today's acidic lakes were probably naturally acidic before acid rain. The existing models do not yet incorporate these types of natural acidity.

These new research findings on the widespread importance of natural acidity were further supported by lake sediment analyses showing that, while some Adirondack lakes have recently become more acidic, the average Adirondack lake is more alkaline now than it was 150 years ago, not more acidic. New research shows that land-use change is almost universal in the Northeast. Only 60,000 of over 60,000,000 acres of forest have not been cut and regrown, or about 0.1 percent. Land-use changes can result in either alkalinization or acidification of lakes and streams. This introduces major uncertainties about any changes there have been in acidity or alkalinity.

Rivers of Hunger Predating Industrial Activity. The notion that acid rain is responsible for acidity in lakes and streams is also contradicted by the existence of highly acidic surface waters in regions without acid rain. Fraser Island, Cooloola National Park, and Tasmania in Australia, and the Westland area of New Zealand have no acid rain, yet are filled with highly acidic lakes and streams. Indeed, the magnitude of acidic surface waters in areas without acid rain dwarfs that of areas supposedly devastated by acid rain. In the Amazon basin, the Rio Negro, a river system the size of the Mississippi, is naturally acidic and fishless. The naturalist and explorer Alexander von Humboldt wrote about these "rivers of hunger" nearly 200 years ago, definitely pre-dating industrial activity in that part of the world.

The public belief in the acid rain problem was created by repetitive and unproven assertions from ostensibly authoritative sources in U.S. government programs and agencies. This misinformed public opinion has, in turn, dominated the politics of acid rain. The U.S. is now on the verge of adopting a Clean Air Act which, among other things, seeks a multi-billion dollar reduction of acid rain emissions to restore lakes and streams. Yet most of this concern is not based on scientific fact. Policy makers who ignore the evidence on acid rain will waste billions of dollars and, in the long run, cast doubts on the credibility of environmental concerns.

STRATOSPHERIC OZONE
S. Fred Singer

Ozone is a natural component of the earth's atmosphere. It is a type of oxygen molecule formed when high energy components of solar radiation break apart other compounds, allowing the oxygen to recombine as ozone. In high concentrations at the planet's surface, ozone is considered a pollutant that can irritate lungs and eyes, and lower crop yields. This ozone can be created by reactions of either manmade or natural chemicals and is a primary constituent of urban smog.

High in the stratosphere, however, ozone forms a natural layer that efficiently absorbs ultraviolet (or UV) radiation from the sun. UV radiation is the tanning component of sunlight, and has been linked to benign, non-melanoma skin cancers. UV radiation also can stimulate the body to produce Vitamin D, which can reduce the incidence of certain diseases, such as rickets and osteoporosis.

Cloud of Suspicion. Some research in the 1970s suggested that the normally inactive chlorofluorocarbons, or CFCs, used in many modern applications, including refrigeration, computer chip manufacture, and fire extinguishers, could percolate up into the stratosphere and there be decomposed and attack ozone. In 1980, the National Academy of Sciences estimated that the maximum possible reduction in stratospheric ozone due to CFCs was approximately 18 percent.

Over the years, this number has been revised sharply downward, to as low as between 2 and 4 percent. But the initial estimate has held the public's attention.

Under this cloud of suspicion, voluntary restraints on CFC use were adopted for some noncritical applications. By 1978, the U.S. had unilaterally banned CFC use in all aerosol propellants.

In 1985, a British group at an ozone observing station at Halley Bay, Antarctica, announced that every October since 1975 they had found a short-lived decline in the amount of stratospheric ozone. The magnitude of the decrease had grown steadily, reaching nearly 50 percent of the total ozone. The finding was quickly confirmed by satellite instruments, which also indicated that the phenomenon covered a large geographic region. It seemed that a smoking gun had been found, linking ozone destruction with CFCs and chlorine, a chemical component of CFCs.

Limited Conditions. However, the precipitous decline in observed ozone levels around October is dependent upon the existence of a number of precise climatic conditions. For example, the stratospheric layers must be isolated from other air layers so that warmer air or chemical contaminants do not interfere with the ozone-depleting reaction. These conditions are limited to the South Pole and smaller pockets near the North Pole, and even then only for short periods each year.

Further, G.M.B. Dobson, the Oxford University professor who started modern ozone observations, and for whom the measuring unit for ozone is now named, noted this temporary disappearance of ozone in 1956. Dobson noted that when the Halley Bay Antarctic station was first set up in 1956, the monthly reports "showed that the values in September and October 1956 were about 150 [Dobson] units [50 percent] lower than expected. In November the ozone values suddenly jumped up to those expected. It was not until a year later, when the same type of annual variation was repeated, that we realized that the early results were indeed correct and that Halley Bay showed a most interesting difference from other parts of the world."

Tremendous Fluctuations. The discovery, or perhaps rediscovery, of the Antarctic ozone hole was combined with a March 1988 report by the National Aeronautics and Space Administration Ozone Trends Panel calling for a complete ban on CFCs. The NASA report indicated that global ozone levels had declined by a total of around 3 percent since 1969.

This ominous report convinced many that CFCs were destroying ozone throughout the stratosphere. However, the report failed to mention that the year selected as the starting point for ozone measurements was actually a peak year for ozone levels. Since ozone's existence in the stratosphere is closely linked to the amount of solar radiation, it fluctuates tremendously over seasons and from year to year. Additionally, measurements of UV radiation at the earth's surface show that it actually has declined since 1974, even though theory predicts that it should increase when ozone is reduced.

Scientific caution was not followed by many in the international environmental community. Arising from the recommendations of the Montreal Protocol of 1987, drawn up by representatives of most of the world's industrialized countries, global controls for CFCs have been adopted by most of the major industrial nations, calling for a 50 percent reduction in world CFC output by the year 2000. Further demands are being pressed for total elimination of CFCs by the year 2000.

The case against CFCs is based on the scientific theory of ozone depletion, a theory that is plausible but quite incomplete and certainly not reliable in its quantitative predictions. There is even evidence that volcanoes, and perhaps salt spray and bio-chemical emissions from the oceans, contribute substantially to stratospheric amounts of chlorine, which minimizes the effects of manmade CFCs.

Even assuming the accuracy of the current theories, the actual threat would be quite small, to both humans and plant and animal life. Normally UV radiation increases the closer to the equator, or the higher in altitude, one goes. A 5 percent decrease in the ozone layer would, under the environmentalists' own theory, increase UV exposure to the same extent as moving about 60 miles south, the distance from Palm Beach to Miami.

Unnecessary Sacrifice. CFCs contribute greatly to the welfare of modern man. They are non-toxic, nonflammable, inexpensive compounds. Alternatives may turn out to be toxic to humans, corrosive to existing equipment, and less energy-efficient in use, may decay over time requiring frequent replacement, and are certain to be more costly. With such important and direct consequences resulting from a ban, the scientific supports for a total ban need to be greatly enhanced.

Otherwise society will be asked to sacrifice both public health and economic vitality for a threat that may not exist.

THE SCIENCE AND POLITICS OF GLOBAL WARMING
Patrick J. Michaels

In recent years environmentalists and others have seen global warming looming as an apocalyptic ecological disaster. Premature predictions include such catastrophes as droughts in some areas and floods in others, caused by increasing concentrations of infrared or heat-absorbing trace gases, such as carbon dioxide, in the atmosphere, accompanied by rapidly rising temperatures, sea level, and evaporation rates. This fear has been accompanied by the Worldwatch Institute's calling for a wholesale reordering, a fundamental restructuring of the world economy.

The scientific question critical to the policy implications of global warming, however, is not "How much will it warm?" Instead, the proper question is "Why has it warmed so little?" The answers now beginning to emerge will allow policy makers to avoid rash decisions based on incomplete knowledge or untested opinion.

Global Warming Prior to 1940. Because of the drastic alterations in the infrared-absorbing composition of the atmosphere due to human activities, some climate models suggest that the atmosphere should have warmed up some 2.0 degrees Celsius in the last century. In fact, the warming since 1880 has been 0.45 plus or minus 0.10 degrees. A calculation designed to show in which years changes took place found that 90 percent of this warming was prior to 1940.

Yet it is since 1940 that the lion's share of the trace gases alleged to cause global warming have been emitted.

Assumptions in those models concerning oceanic warming patterns predict that the Northern Hemisphere should warm up first and most. In fact, there has been no net change in mean hemispheric temperature during the last half-century. The Southern Hemisphere, which should have warmed up the least and the slowest, shows slight warming, even though the magnitude of warming appears to be a factor of three under what is suggested by climate models.

Remote Likelihood of Disaster. The most alarming aspects of global warming are predictions of ecological and agricultural dislocations caused by increasing aridity or dryness and a disastrous sea level rise of a meter or so by the mid-21st century that could flood coastal cities. Recent findings indicate that the likelihood of both is becoming much more remote.

In the greenhouse projections, increased aridity in mid-latitude agriculture results not from lack of rainfall as much as it does from increased evaporation of water into the atmosphere owing to warmer temperatures, which is primarily a daytime phenomenon. Yet many temperature histories now show that, while daytime temperatures have either remained the same or declined, nighttime temperatures appear to be rising relative to the day values. This has the effect of minimizing any evaporation increase. A benefit of this effect is a longer growing season and better growing conditions. This phenomenon has now been observed in most locations where it has been looked for.

Satellite measurements now indicate rapid growth of the Greenland Ice Cap, and there is strong evidence of similar increases in Antarctica. As a result of this evidence, a panel of the American Geophysical Union in late 1989 reduced estimates of climate-related sea level rise to twelve inches. This is below the lower limit suggested in a National Academy of Sciences report used as the basis for many predictions of global floods.

Counteracting Global Warming. Other factors seem to offset some of the predicted causes of global warming. For example, dramatic increases in cloudiness have now been detected, with the increases concentrated in the more industrialized Northern Hemisphere. The cloud type that shows the most increase, with some regions showing growth of over 10 percent, is the ocean-surface stratocumulus, which is the most effective cloud type at counteracting the global warming effects of trace-gas pollution in the atmosphere. Calculations show that the increase in cloudiness could have produced a net global cooling of 1.5 degrees, perhaps offsetting warming from other sources.

Warmer nights and normal or slightly cooler days are consistent with both increased trace gases in the atmosphere, the elements said to cause global warming, and with increased clouds, which would reduce the rise in daytime temperatures.

Revisions of the earlier climate models that originally predicted disastrous global warming have reduced considerably the estimates of prospective warming said to occur because of a doubling of atmospheric carbon dioxide. Even so, the most conservative of these models still predicts that the world's land areas should have warmed up approximately 1.4 degrees since 1950. The actual net warming of these surfaces has been between 0.2 and 0.25 degrees, a factor of six to seven times less than had been forecast to have occurred already.
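The "factor of six to seven" follows from the figures just quoted; a small sketch, using only the numbers given in the text:

```python
# Predicted vs. observed land-surface warming since 1950, as quoted in the text.
predicted_warming_c = 1.4
observed_range_c = (0.2, 0.25)

for observed in observed_range_c:
    print(f"observed {observed:.2f} C -> prediction is {predicted_warming_c / observed:.1f}x larger")
# Prints factors of about 7.0 and 5.6, i.e. roughly six to seven times.
```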

Some policy makers have been criticized for not enthusiastically supporting expensive programs to reduce emissions said to cause global warming. These individuals, however, merely are responding to well tested and scientifically proven facts rather than to incomplete models and dated analyses.

ENDANGERED SPECIES PROTECTION
Randy T. Simmons

Although hunting elephants has been illegal in Kenya for over a decade, the country's elephant population fell from 65,000 in 1979 to 19,000 last year.

Kenya's wildlife managers blame this on the international ivory market. Yet in Zimbabwe, shops openly sell ivory and hides. These goods come from elephants that are culled from large herds in that country's game parks. Animals must be removed from these herds periodically to prevent overly rapid population growth that could result in too many elephants and not enough food. Zimbabwe has found that the best way to protect elephants is to give its citizens the opportunity to benefit from their presence. The result: the elephant population has grown from 30,000 to 43,000 over the past decade.

Increasing Herds. There are two conflicting approaches to elephant protection in Africa today. Kenya's ban on hunting and its efforts to suppress the ivory trade are typical of most of Central and Eastern Africa. The results have been disastrous. From 1979 to 1989, Central Africa's elephant population declined from 497,000 to 274,800 and East Africa's from 546,650 to 154,720.
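A compact way to compare the population figures cited so far (all numbers are from the text; treating Zimbabwe's "past decade" as 1979 to 1989 is an assumption):

```python
# Elephant population changes quoted in the text, roughly 1979 to 1989.
populations = {
    "Kenya (hunting ban)":      (65_000, 19_000),
    "Central Africa (bans)":    (497_000, 274_800),
    "East Africa (bans)":       (546_650, 154_720),
    "Zimbabwe (managed trade)": (30_000, 43_000),
}

for region, (start, end) in populations.items():
    change = (end - start) / start * 100
    print(f"{region:26s} {start:>8,} -> {end:>8,}  ({change:+.0f}%)")
# Shows declines of roughly 45 to 72 percent under trade bans,
# versus about a 43 percent increase in Zimbabwe.
```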

By contrast, the elephants of Botswana, South Africa, and Zimbabwe are increasing, and now account for 20 percent of the continent's elephants. These southern African countries all support conservation through utilization, allowing safari hunting and tourism on private, state, and communal lands, and the sale of ivory and hides. Because individuals and local communities can own and profit from elephants, they have an incentive to make certain that elephants on their preserves are not wiped out.

The Kenyan approach to wildlife protection is typical of the international effort to save many species from extinction. The Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES) establishes two levels of protection. If a species is listed on Appendix I, all trade in products from that species is banned. If it is listed on Appendix II, some trade is allowed, with official permits, under a quota system administered by CITES.

The problem with this approach, using the elephant example, is that it attempts to answer the question: How do we stop the market in ivory in order to remove the incentive for poaching elephants? The question it should ask instead is: How do we make elephants valuable enough that people have an incentive to be careful stewards rather than careless exterminators?

Failin g to Eliminate Demand. Economic theory teaches that a government ban on the supply of a valued commodity can never wholly eliminate demand. It usually accomplishes three things, however: 1) prices increase, 2) people with a comparative advantage at avoidi ng detection, usually criminals and corrupt public officials, take over the formerly legal market, and, 3) if the resource is publicly owned, it is rapidly consumed.

Legalizing trade and protecting property rights reverses these outcomes: 1) prices drop as legal supplies grow; 2) there is no premium price due to criminality or corruption; and 3) property rights encourage wise stewardship of the resource, because any loss will fall on the owner rather than be spread among the millions of public owners of the resource.

Trade bans on wildlife products have failed to protect species for which there is a commercial demand. Many species of Latin American parrots, for example, are protected by a CITES Appendix I listing. Rather than reducing the decline in native parrot populations, prohibition has accelerated it. The profit in trading in protected birds is often greater than that in producing illegal drugs.

Protected by Commercialization. Prohibition has completely failed to protect Africa's black rhinoceros. Rhino horn is so highly prized by Arabs for ceremonial dagger handles and by Asians as a medicine and an aphrodisiac that a ten-pound horn can sell for $80,000. About 50,000 rhinos existed in Africa when the 1976 CITES ban went into effect. These rhinos dwindled to 14,800 by 1980, and to about 3,500 today, most of which are in Zimbabwe and South Africa. In South Africa, in fact, rhino populations have dramatically increased.

Contrasting with the poor record of trade bans, commercialization protects a broad variety of species. Seabirds are farmed in Iceland, crocodiles and butterflies are raised in Papua New Guinea, and crocodile farming is a multimillion dollar business in Zimbabwe.

U.S. policies affecting commercialization of native endangered species have been contradictory. In 1979, for example, the U.S. Fish and Wildlife Service revised its own regulations to allow commercial foreign trade in American alligators. It was hoped that trade in alligator products would help reduce the threat from poachers against other endangered crocodile species. Ten years later, alligator farming is so successful that wild populations have exploded, even as profits from hides and meat have reached record highs.

Green Sea Turtles. Yet contrary to its policy on alligators, the Fish and Wildlife Service rejected a 1983 request to allow commercial use of captive-bred green sea turtles. These animals remain endangered, as do other sea turtle species that farmers are capable of raising. Farms could be used to replenish natural stocks as well as to supply the commercial trade. Without a change in the philosophy of banning trade, the world will continue to lose species. America should take the lead in promoting this market alternative to the continuing decline of endangered species.

SUPERFUND
Bruce Yandle

The Comprehensive Environmental Response, Compensation, and Liability Act, popularly known as Superfund, was passed in December 1980 to create a regulatory scheme and a $1.6 billion account to clean up hazardous waste sites.

Funds were to be collected between 1981 and 1986, primarily from a special tax on petroleum and chemical feedstocks, that is, raw materials such as petroleum-based materials used in industrial processes and products.

Everyone Pays. Superfund is only loosely based upon the principle that the polluter should pay for cleanup. In fact, all producers pay the identical rate of tax on feedstocks regardless of present or past pollution practices. The cleanest and most careful chemical firm pays the same tax per pound of chemicals as the most negligent polluter.

The original Superfund law required that the Environmental Protection Agency specify at least 400 cleanup sites, which is about one project for each of the 435 Congressional districts, and stipulated that every state must have at least one Superfund site with a priority designation. The EPA would then seek to identify any businesses contributing to pollution at a given site and sue them to cover the cost of cleanup. The money in the Fund would be used if the identified polluter could not be found or could not pay.

The law had the immediate and politically valuable effect of generating a large demand for what were recognized as huge pork barrel projects. Local lobbying efforts were organized in response to the availability of large federal grants. Like most large environmental programs, Superfund became ensnarled in bureaucratic red tape and severe management problems and has been heavily criticized for ineffectiveness and waste, even by its proponents.

This failure to relate the Superfund tax directly to the areas affected by pollution is demonstrated by the fact that Superfund tax collections under the 1980 law were highest in the Southwest, where the petroleum industry is concentrated.

Yet Superfund site expenditures were highest in the more polluted and politically stronger Northeast.

Full Liability. Superfund is premised upon the principle of joint-and-several, retroactive, strict liability. This means that all firms presumed to have contributed any amount of pollution to a Superfund site, no matter how small, would be legally responsible for the full cost of any cleanup. In addition, a firm using state-of-the-art techniques and acting in a perfectly legal manner at the particular site could become liable for the entire site later.

EPA often acts against the wealthiest firm, to saddle it with the cost of each Superfund site. Some firms have been included in Superfund lawsuits even though they produced only nonhazardous wastes that later were placed at a site that included another firm's hazardous wastes. EPA also can target the transporters of waste, who control neither its production nor its disposal. This forces the victimized firm to find other responsible parties to share the blame, and the cost. In some cases almost 300 other firms and insurance companies are being sued by the company chosen by EPA. This aspect of the law has hindered the development of private pollution insurance, since no insurer can determine that a client will not be held liable for another company's pollution.

Although the problems with Superfund were great, its political and porkbarrel benefits were even greater. The Superfund program was expanded by the Superfund Amendments and Reauthorization Act of 1986 (SARA). SARA extends these programs until 1991 and increases the Fund to $8.5 billion by creating a new excise tax on almost every industry. SARA even provides federal funds to private groups so that they may sue the EPA should it fail to place a particular site on the Superfund list.

Disappointing Results and Limited Benefits. After ten years, Superfund should be graded not on its good intentions, but on its disappointing results. The Congressional Office of Technology Assessment last October found that about 50 percent of Superfund cleanups addressed speculative future risks and about 75 percent of all cleanups were unlikely to work over the long term. The benefits of Superfund activities are not spread throughout society, as may be argued for air quality, but instead are local, often limited to very small numbers of individuals.

Cleaning local hazardous waste sites should be a function of the states or local communities rather than the Federal government. Therefore the states might well assume an increasing share of Superfund taxes, which must be reauthorized in 1991. Any Superfund money remaining in the Federal Treasury could be put into a revolving state loan fund, as is the case currently with the Wastewater Treatment Plant federal grant program.
