Table of Contents...
Two Friends Debate Risk Assessment and Precaution
In this issue of the Precaution Reporter, Adam Finkel, a risk
assessor, and Peter Montague, an advocate for precaution, engage in a
dialog about risk and precaution.
Adam Finkel Reviews Cass Sunstein's Book, Risk and Reason
This book review by Adam Finkel started the dialog: "I believe that
at its best, QRA [quantitative risk assessment] can serve us better
than a 'precautionary principle' that eschews analysis in favor of
crusades against particular hazards that we somehow know are
needlessly harmful and can be eliminated at little or no economic or
human cost."
A Letter To My Friend Who Is a Risk Assessor
In response to Adam Finkel's review of Cass Sunstein's book,
Peter Montague wrote this letter explaining why the nation's asbestos-
removal programs for schools, and President Bush's invasion of Iraq,
are not examples of precaution -- and listing some major problems with
decisions based narrowly on quantitative risk assessment.
Risk Assessment and Precaution: Common Strengths and Flaws
"The biggest challenge I have for you is a simple one: explain to
me why 'bad precaution' doesn't invalidate the precautionary
principle, but why for 25 years you've been trashing risk assessment
based on bad risk assessments!"
Two Rules for Decisions: Trust in Economic Growth vs. Precaution
Why do we need precaution? Because there is growing evidence that
the entire planet and all its inhabitants are imperiled by the total
size of the human enterprise. As a result, the precautionary principle
has arisen as a new way to balance our priorities. Two overarching
decision rules are competing for supremacy -- "trust in economic
growth" vs. "precaution." Europe is edging toward precaution. The U.S.
is, so far, sticking with "trust in economic growth."
From: Rachel's Precaution Reporter #103, Aug. 15, 2007
TWO FRIENDS DEBATE RISK ASSESSMENT AND PRECAUTION
By Peter Montague
Recently my friend Adam Finkel -- a skilled and principled risk
assessor -- won two important victories over the Occupational Safety
and Health Administration (OSHA).
In 2000, Adam was appointed Regional Administrator for OSHA in charge
of the Denver office. When he began to suspect that OSHA inspectors
were being exposed to dangerous levels of beryllium, a highly toxic
metal, he took precautionary action -- urging OSHA to test beryllium
levels in its own inspectors. It cost him his job.
The agency did not even want to tell its inspectors they were being
exposed to beryllium, much less test them. So Adam felt he had no
choice -- in 2002, he blew the whistle and took his concerns public.
OSHA immediately relieved him of his duties as Regional
Administrator, and moved him to Washington where they changed his
title to "senior advisor" and assigned him to the National Safety
Council -- a place where "they send people they don't like," he would
later tell a reporter.
Adam sued OSHA under the federal whistleblower protection statute and
eventually won two years' back pay, plus a substantial lump sum
settlement, but he didn't stop there. In 2005, he lodged a Freedom of
Information Act lawsuit against the agency, asking for all monitoring
data on toxic exposures of all OSHA inspectors.
Meanwhile, OSHA began testing its inspectors for beryllium, finding
exactly what Adam had suspected they would find -- dangerously high
levels of the toxic metal in some inspectors.
Adam is now a professor in the Department of Environmental and
Occupational Health, School of Public Health, University of Medicine
and Dentistry of New Jersey (UMDNJ); and a visiting scholar at the
Woodrow Wilson School, Princeton University. At UMDNJ, he teaches
"Environmental Risk Assessment."
Earlier this summer, Adam won his FOIA lawsuit. A judge ruled that
OSHA has to hand over 2 million lab tests on 75,000 employees going
back to 1979. It was a stunning victory over an entrenched
bureaucracy.
Meanwhile, in 2006, the American Public Health Association selected
Adam to receive the prestigious David Rall Award for advocacy in
public health. You can read his acceptance speech here.
When Adam's FOIA victory was announced early in July, I sent him a
note of congratulations. He sent back a note, attaching a couple of
articles, one of which was a book review he had published recently
of Cass Sunstein's book, Risk and Reason. Sunstein doesn't "get" the
precautionary principle -- evidently, he simply sees no need for it.
Of course, the reason we need precaution is that the cumulative
impacts of the human economy now threaten to wreck our only
home -- as Joe Guth explained last week in Rachel's News (reprinted
in this issue of the Precaution Reporter).
In any event, I responded to Adam's book review by writing "A
letter to my friend, who is a risk assessor," and I invited Adam to
respond, which he was kind enough to do.
So there you have it.
Do any readers want to respond to either of us? Please send responses.
From: Journal of Industrial Ecology (pg. 243), Oct. 1, 2005
ADAM FINKEL REVIEWS CASS SUNSTEIN'S BOOK, RISK AND REASON
A review of: Risk and Reason: Safety, Law, and the Environment, by
Cass R. Sunstein. Cambridge, UK: Cambridge University Press, 2002, 342
pp., ISBN 0521791995, $25.00 (also in paperback: ISBN 0521016258).
By Adam M. Finkel
As someone dedicated to the notion that society needs quantitative
risk assessment (QRA) now more than ever to help make decisions about
health, safety, and the environment, I confess that I dread the
arrival of each new book that touts QRA or cost-benefit analysis as a
"simple tool to promote sensible regulation." Although risk analysis
has enemies aplenty, from both ends of the ideological spectrum, with
"friends" such as John Graham (Harnessing Science for Environmental
Regulation, 1991), Justice Stephen Breyer (Breaking the Vicious
Circle, 1994), and now Cass Sunstein, practitioners have their hands
full.
I believe that at its best, QRA can serve us better than a
"precautionary principle" that eschews analysis in favor of crusades
against particular hazards that we somehow know are needlessly harmful
and can be eliminated at little or no economic or human cost. After
all, this orientation has brought us increased asbestos exposure for
schoolchildren and remediation workers in the name of prevention, and
also justified an ongoing war with as pure a statement of the
precautionary principle as we are likely to find ("we have every
reason to assume the worst, and we have an urgent duty to prevent the
worst from occurring," said President Bush in October 2002 about
weapons of mass destruction in Iraq). More attention to benefits and
costs might occasionally dampen the consistent enthusiasm of the
environmental movement for prevention, and might even moderate the on-
again, off-again role precaution plays in current U.S. economic and
foreign policies. But "at its best" is often a distant, and even a
receding, target -- for in Risk and Reason, Sunstein has managed to
sketch out a brand of QRA that may actually be less scientific, and
more divisive, than no analysis at all.
To set environmental standards, to set priorities among competing
claims for environmental protection, or to evaluate the results of
private or public actions to protect the environment, we need reliable
estimates of the magnitude of the harms we hope to avert as well as of
the costs of control. The very notion of eco-efficiency presupposes
the ability to quantify risk and cost, lest companies either waste
resources chasing "phantom risks" or declare victory while needless
harms continue unabated. In a cogent chapter (Ch. 8) on the role of
the U.S. judiciary in promoting analysis, Sunstein argues persuasively
that regulatory agencies should at least try to make the case that the
benefits of their efforts outweigh the costs, but he appears to
recognize that courts are often ill-equipped to substitute their
judgments for the agencies' about precisely how to quantify and to
balance. He also offers a useful chapter (Ch. 10) on some creative
ways agencies can transcend a traditional regulatory enforcer role,
help polluters solve specific problems, and even enlist them in the
cause of improving eco-efficiency up and down the supply chain. (I
tried hard to innovate in these ways as director of rulemaking and as
a regional administrator at the U.S. Occupational Safety and Health
Administration, with, at best, benign neglect from political
appointees of both parties, so Sunstein may be too sanguine about the
practical appeal of these innovations.)
Would that Sunstein had started (or stopped) with this paean to
analysis as a means to an end -- perhaps to be an open door inviting
citizens, experts, those who would benefit from regulation, and those
reined in by it to "reason together." Instead, he joins a chorus of
voices promoting analysis as a way to justify conclusions already
ordained, adding his own discordant note. Sunstein clearly sees QRA as
a sort of national antipsychotic drug, which we need piped into our
homes and offices to dispel "mass delusions" about risk. He refers to
this as "educating" the public, and therein lies the most
disconcerting aspect of Risk and Reason: he posits a great divide
between ordinary citizens and "experts," and one that can only be
reconciled by the utter submission of the former to the latter. "When
ordinary people disagree with experts, it is often because ordinary
people are confused," he asserts (p. 56) -- not only confused about
the facts, in his view, but not even smart enough to exhibit a
rational form of herd behavior! For according to Sunstein, "millions
of people come to accept a certain belief [about risk] simply because
of what they think other people believe" (p. 37, emphasis added).
If I thought Sunstein was trying by this to aggrandize my fellow
travelers -- scientists trained in the biology of dose-response
relationships and the chemistry and physics of substances in the
environment, the ones who actually produce risk assessments -- I
suppose I would feel inwardly flattered, if outwardly sheepish, about
this unsolicited elevation above the unwashed masses. But the reader
will have to look long and hard to find citations to the work of
practicing risk assessors or scientists who helped pioneer these
methods. Instead, when Sunstein observes that "precisely because they
are experts, they are more likely to be right than ordinary people. .
. brain surgeons make mistakes, but they know more than the rest of us
about brain surgery" (p. 77), he has in my view a quaint idea of where
to find the "brain surgeons" of environmental risk analysis.
He introduces the book with three epigrams, which I would oversimplify
thus: (1) the general public neglects certain large risks worthy of
fear, instead exhibiting "paranoia" about trivial risks; (2) we
maintain these skewed priorities in order to avoid taking
responsibility for the (larger) risks we run voluntarily; and (3)
defenders of these skewed priorities are narcissists who do not care
if their policies would do more harm than good. The authors of these
epigrams have something in common beyond their worldviews: they are
all economists. Does expertise in how markets work (and that
concession would ignore the growing literature on the poor track
record of economists in estimating compliance costs in the regulatory
arena) make one a "brain surgeon" qualified to bash those with
different views about, say, epidemiology or chemical carcinogenesis?
To illustrate the effects of Sunstein's continued reliance on one or
two particular subspecies of "expert" throughout the rest of his book,
I offer a brief analysis of Sunstein's five short paragraphs (pp.
82-83) pronouncing that the 1989 public outcry over Alar involved
masses of "people [who] were much more frightened than they should
have been."[1] Sunstein's readers learn the following "facts" in this
passage:
** Alar was a "pesticide" (actually, it regulated the growth of apples
so that they would ripen at a desired time).
** "About 1 percent of Alar is composed of UDMH [unsymmetrical
dimethylhydrazine], a carcinogen" (actually, this is roughly the
proportion found in raw apples -- but when they are processed into
apple juice, about five times this amount of UDMH is produced).
** The Natural Resources Defense Council (NRDC) performed a risk
assessment claiming that "about one out of every 4,200 [preschool
children] exposed to Alar will develop cancer by age six" (actually,
NRDC estimated that exposures prior to age six could cause cancer with
this probability sometime during the 70-year lifetimes of these
children -- a huge distinction, with Sunstein's revision making NRDC
appear unfamiliar with basic assumptions about cancer latency).
** The EPA's current risk assessment is "lower than that of the NRDC
by a factor of 600" (actually, the 1/250,000 figure Sunstein cites as
EPA's differs from NRDC's 1/4,200 figure by only a factor of 60