To cite this article: Charles Weiss (2007) Communicating Uncertainty in
Intelligence and Other Professions, International Journal of Intelligence and
CounterIntelligence, 21:1, 57-85, DOI: 10.1080/08850600701649312.
Published online: 13 Jun 2012.

CHARLES WEISS
Communicating Uncertainty in Intelligence and Other Professions

Dr. Charles Weiss is Distinguished Professor and, until recently, Chair of
Science, Technology, and International Affairs at the Edmund A. Walsh School
of Foreign Service, Georgetown University, Washington, D.C. A Harvard-trained
biochemical physicist, he was the first Science and Technology Adviser to the
World Bank, serving in that capacity from 1971 to 1986.

Recent events have focused new attention on the need for intelligence
professionals to present alternative hypotheses to policymakers in a way
that makes clear the uncertainties in the evaluation and interpretation of
the evidence on which they are based, and the fact that it is rarely
possible to exclude alternative explanations.[2] This information improves
the ability of decisionmakers, if they so wish, to take into account the
risk that intelligence estimates may not be correct.

This problem is not unique to the intelligence profession. Experts from
many other fields face the problem of conveying technical judgments
involving uncertainty to their nonspecialist clients. Doctors, for example,
must routinely advise their patients about the risks involved in various
alternative treatments. Unlike doctors, however, intelligence analysts
cannot often support their judgments with statistical analysis of empirical
data derived from a large number of similar past cases.[3]

CONSIDERATION OF ALTERNATIVE HYPOTHESES
A number of recent introspective works responding to recent intelligence
failures have called attention to the need for intelligence analysts to give
proper attention to hypotheses and data collection efforts that are contrary
to what they regard as the most likely interpretation of available
information, and especially to hypotheses that run counter to prevailing
models and preconceptions.[4] This reluctance to confront uncertainty has
parallels in science and medicine, both of which discourage interpretations
contrary to prevailing paradigms.

Like intelligence analysts, scientific advisors to policymakers have long
prided themselves on "speaking truth to power."[5] In practice, matters are
more complicated. In science advising as in intelligence analysis, "truth"
may be probabilistic, and may depend on an advisor's best educated guess as
to the outcome of experiments that have not yet been performed or the
interpretation of data that are not quite in point. Like intelligence
professionals, scientific advisors must adjust to the needs of their
advisees, who bear ultimate responsibility for their decisions.[6]

Moreover, like most intelligence analysis, most scientific research is
concerned with filling gaps in existing paradigms; revolutionary concepts
require years to become established. In principle, this derives from the
dictum that "extraordinary claims require extraordinary evidence." In
practice, scientists' judgment regarding the quality of evidence often
depends on how closely it fits their preconceptions.[7] Abandonment of a
fundamental paradigm may owe as much to the death or retirement of an older
generation of scientists as to the success of the new model in winning them
over.[8]

Similarly, young doctors are advised to look for the most common diagnosis
before considering rare or exotic diseases: "When you hear hoofbeats, think
horses, not zebras."[9] They also learn to be hesitant to point out mistakes
or disagreements with senior authority figures, a phenomenon well known in
other professions, such as aviation, and in bureaucracies of all kinds.
Nevertheless, these fields have well-established procedures for identifying
and highlighting less likely possibilities that might undermine key
assumptions, and for carrying out the tests needed to eliminate (or possibly
confirm) them. Doctors, for example, conduct tests intended to rule out
possible but less likely diagnoses.[10] Many a scientific reputation has
been established by a dramatic experiment that overturned long-held
preconceptions.

Environmental scientists are particularly alert to possible surprises, and
emphasize research on indicators that could be the first signs of more
serious environmental damage than would be predicted by the hypothesis
deemed most likely in a particular situation. Extensive research is
underway, for example, to test for phenomena that would indicate an
increased likelihood of catastrophic sea level rise due to the melting of
the Antarctic ice shelf, or of the weakening of the Gulf Stream (and
consequent chilling of Western Europe) due to possible melting of the
Greenland icecap and consequent weakening or disruption of the "oceanic
conveyer belt."[11] Even so, these scientists have been criticized for
"anchoring" on past estimates of climate change, and in particular for the
relatively small change in the consensus projections of global warming
despite the many scientific advances of the past decade.[12]

Even the problem of deliberate deception is not unique to intelligence
analysis. A medical patient may be too embarrassed to share the complete
circumstances of an illness or injury. Fakery in scientific research, such
as the recent "dry-labbing" of human cloning by a Korean scientist,[13] is
dealt with by peer review, and by the requirement that important findings
be confirmed by independent researchers, the latter being typically
unavailable to intelligence practitioners. Environmentalists, somewhat like
intelligence analysts, sometimes face disinformation put out to the public,
in this case by opponents of one or another regulation. Scientists detect
such disinformation rather easily, but often have difficulty refuting it
convincingly before a general audience.[14] The resulting frustration
resembles that felt by intelligence analysts, or their policymaker clients,
when they cannot refute misinformation in the press.

The Intelligence Community (IC) has developed many devices to attack the
problem of unexpected surprises, including the use of "red teams" to attack
the assumptions underlying conventional analysis.[15] One of the most
respected analysts of the intelligence profession, Richards Heuer, has
proposed a method of "Analysis of Competing Hypotheses" (ACH), which
"requires the analyst to explicitly identify all reasonable alternatives and
have them compete against one another for the analyst's favor, rather than
evaluating their plausibility one at a time."[16] Heuer's method seeks to
distinguish "key drivers" that are "diagnostic" in the sense that they
"influence your judgment on the relative likelihood of the various
hypotheses."[17] In this way, the ACH method forces the analyst to "begin
with a full set of alternative hypotheses, to identify the few items of
evidence or assumptions that have the greatest diagnostic value in judging
[their] relative likelihood, seeking evidence to refute hypotheses [rather
than] looking for evidence to confirm a favored hypothesis."[18] More
elaborate, but less user-friendly, versions of this technique employ
Bayesian statistics to assign probabilities to each alternative.[19] Now
that the National Intelligence Strategy has identified "exploring
alternative analytic views" as one of the ten major "enterprise objectives"
of the national intelligence effort, the use of these methods may become
more common in the future.[20]
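
To make the Bayesian variant concrete, the short sketch below shows how a
small set of competing hypotheses might be updated as evidence accumulates.
It is a minimal illustration only: the hypotheses, evidence items, and
likelihood figures are hypothetical and are not drawn from Heuer's work or
from any actual assessment.

```python
# Minimal sketch of a Bayesian "Analysis of Competing Hypotheses" update.
# All hypotheses, evidence items, and probabilities below are invented for
# illustration; they do not come from Heuer or any actual estimate.

priors = {
    "H1: weapons program is active":  0.30,
    "H2: program exists but dormant": 0.30,
    "H3: no program exists":          0.40,
}

# P(evidence | hypothesis). An item is "diagnostic" in Heuer's sense when
# these values differ sharply across hypotheses.
likelihoods = {
    "procurement of dual-use equipment": {
        "H1: weapons program is active":  0.8,
        "H2: program exists but dormant": 0.5,
        "H3: no program exists":          0.3,
    },
    "no observed testing activity": {
        "H1: weapons program is active":  0.2,
        "H2: program exists but dormant": 0.7,
        "H3: no program exists":          0.8,
    },
}

def update(posterior, evidence_item):
    """Apply Bayes' rule for one evidence item and renormalize."""
    unnormalized = {h: p * likelihoods[evidence_item][h]
                    for h, p in posterior.items()}
    total = sum(unnormalized.values())
    return {h: v / total for h, v in unnormalized.items()}

posterior = dict(priors)
for item in likelihoods:
    posterior = update(posterior, item)
    print(f"After '{item}':")
    for hypothesis, prob in posterior.items():
        print(f"  {hypothesis}: {prob:.2f}")
```

In this formulation no hypothesis is evaluated in isolation: each item of
evidence shifts probability across all of the alternatives at once, and an
item matters most when its likelihood differs sharply between hypotheses,
which is the sense in which evidence is called "diagnostic."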
