25 December 1997
Source: Hardcopy from the Central Intelligence Agency, Center for the Study of Intelligence

From http://www.odci.gov/csi/index.html:
Requests for copies of A Compendium of Analytic Tradecraft Notes,
or the DI Strategic Plan (August 1996), should be sent by fax to (703) 874-3875
to the attention of the DI Communications Staff.


Central Intelligence Agency
Washington, D.C. 20505

                                   December 17, 1997 

Mr. John Young 
JYA/Urban Deadline 
251 West 89th St, Suite 6E 
New York, NY 10024 

Dear Mr. Young: 

     Enclosed are the materials you requested. I hope you 
will find them informative and would enjoy hearing your 
reactions to them. 

     For additional information about the Directorate of 
Intelligence or the substantive issues we cover, please feel 
free to call me at (703) 874-3883. 

                                   Sincerely, 

                                   [Signature] 

                                   Clark Shannon 
                                   External Outreach Coordinator 
                                   Directorate of Intelligence 

Enclosure


Directorate of Intelligence

A Compendium of
Analytic Tradecraft Notes

Volume I
(Notes 1-10)

Reprinted with a
new Foreword
by the Deputy Director
for Intelligence

February 1997



Contents

Page
Foreword     v
Summary vii
Note 1 Addressing US Interests in DI Assessments         1
Note 2 Access and Credibility 5
Note 3 Articulation of Assumptions 9
Note 4 Outlook 13
Note 5 Facts and Sourcing 17
Note 6 Analytic Expertise 25
Note 7 Effective Summary 29
Note 8 Implementation Analysis 33
Note 9 Conclusions 37
Note 10 Tradecraft and Counterintelligence 41




iii


A Compendium of
Analytic Tradecraft Notes

Foreword by the Deputy Director for Intelligence

The core mission of the CIA's Directorate of Intelligence, which I head, has remained the same despite a decade rocked by profound changes in the world. As the analytical arm of the CIA, we seek to provide the most useful, high-quality analysis to US policymakers, lawmakers, warfighters, law enforcement officers, negotiators, and other officials responsible for protecting US national interests. Our analysis continues to draw from a wide range of sources--from newspapers to the most sensitive technical collection systems.

How we fulfill our mission, however, has changed dramatically. Intelligence analysts are challenged as never before to be creative and proactive in meeting intelligence needs. Long analytic papers largely focused on the Soviet threat worldwide that were the norm 10 years ago have given way to a combination of briefings and short but insightful written and multimedia products covering a broad range of regional and transnational issues.

Now more than ever, new products are tailored to the individual intelligence consumer's concerns. Our analysts put the highest premium on knowing what their consumers need.

The revolution in information technologies has improved our access to sources and our ability to quickly deliver intelligence. But it has also made our work more challenging as we are bombarded with information of varying quality, relevance, and depth.

To meet the challenge of political change and technological advances and take advantage of the opportunities they present, the Directorate of Intelligence has reexamined its core analytic "tradecraft" skills, updating them to reflect how we do our business. The word "tradecraft" comes from our colleagues in the clandestine service, who use it to embody the special skills and methods required to do their business. We have borrowed it to capture our special skills and methods.

We are making the Compendium of Analytic Tradecraft Notes available to scholars to shed light on how we support intelligence consumers. This release, although unusual for an intelligence agency, reflects our renewed commitment to reach out to academia and the public, as defined in the Directorate's new Strategic Plan.


v


Pursuit of expertise in analytic tradecraft is a central element of this plan. Our tradecraft enables analysts to provide "value-added" to consumers of intelligence by ensuring:

Analytic tradecraft skills also serve as "force multipliers," helping us provide top-quality analysis:

The author of the Tradecraft Notes is Jack Davis, a retired CIA officer who has spent 40 years as practitioner, teacher, and critic of intelligence analysis. The Notes benefited from the foresight and sharp editorial pen of F. Douglas Whitehouse, Chief of the Directorate's Product Evaluation Staff, during 1994-96. Above all, they reflect the strong commitment to raising analytic standards begun by my predecessor, Douglas E. MacEachin, who served during 1993-95.

Requests for additional copies of Tradecraft Notes or copies of the Strategic Plan should be addressed to my Communications Chief, Jon Nowick, on (703) 874-3882. I also welcome your comments on this volume as my colleagues and I work to enhance the way we serve our nation.

[Signature]

John C. Gannon
Deputy Director for Intelligence
Central Intelligence Agency


vi


Summary

Below is a brief characterization of each of the 10 Tradecraft Notes presented in this volume.

Note 1. Addressing US Interests in DI Assessments

PES launched the tradecraft series with a note on the various ways analysts can assure value added for their important clients in the world of policymakers, warfighters, and law enforcement officials:

Note 2. Access and Credibility

Demonstrates how the nine evaluation criteria used by PES work together to enhance access to and credibility with the US officials who count most on the analyst's substantive agenda:

Note 3. Articulation of Assumptions

US officials who must make their policy decisions amidst uncertainty about developments abroad are more likely to rely on DI assessments when the assumptions supporting the analysts' estimative judgments are made clear.

Tradecraft tips cover techniques for:


vii


Note 4. Outlook

The outlook sections of DI assessments must address not only what analysts believe to be the most likely course of developments but also important alternative outcomes that could confront US officials:

Note 5. Facts and Sourcing

What the analysts know and how they know it are central to their credibility not only with policy officials asked to rely on DI assessments for designing and taking action, but also with the managers who are asked to place the Directorate's authority behind the memorandum or briefing.

The two evaluation standards at play here--the handling of facts and of sourcing--were combined in one note that sets out guidelines for analysts and illustrates "best practices" for explicit and tough-minded depiction of information derived from reporting and research. For example:

Note 6. Analytic Expertise

Analytic expertise establishes the DI's authority to speak to an issue on which US national security interests are at stake. This note addresses the several facets of analytic expertise and provides tradecraft tips for effectively conveying authority in DI assessments.

The aspects covered include:


viii


Note 7. Effective Summary

An effective summary makes the distinctive DI value added stand out for the intended consumers within strict space limitations. For example:

Note 8. Implementation Analysis

Implementation Analysis is one of the techniques for extending customized intelligence support to policymakers and action takers outlined in Note 1, Addressing US Interests. When the DI provides an assessment of tactical alternatives available to the United States for pursuing opportunities and averting dangers, the role of the analyst complements but is distinctive from the role of the policymaker:

Analysts--calling upon their expertise on foreign cultures--identify and evaluate alternatives for implementing objectives.

Policy officials first set the objectives and then make decisions about which tactics to adopt.

Note 9. Conclusions

Analysts must be precise in conveying their level of confidence in the conclusions of a DI assessment; that is, their findings based on organizing, evaluating, and interpreting all-source information:


ix


Note 10. Tradecraft and Counterintelligence

Guidelines for countering deception efforts to distort analysts' understanding of what is going on abroad build on the advice for rigorous evaluation of information contained in Notes 5 (Facts and Sourcing) and 9 (Conclusions).

For issues where analysts suspect the presence of deception and for all assessments on which policy officials may directly rely in making decisions on whether to take military or other action to defend US interests, we recommend a structured effort to validate the integrity of key information.

On these suspect and sensitive issues, we urge analysts to answer a blunt question:





x


Directorate of
Intelligence

Notes on Analytic
Tradecraft

Product Evaluation
Staff


March 1995


Note 1

Addressing US Interests in DI Assessments


This is the first of a series of Product Evaluation Staff
notes to clarify the standards used for evaluating DI
assessments and to provide tradecraft tips for putting the
standards into practice.


The DI professional ethic requires that analysts provide direct support to policymakers' efforts to define and defend US national security interests. The standard is for DI products to convey distinctive value added that promotes feedback, tasking, and access generally. In many cases, this standard requires going beyond general relevance to US interests to customized assistance for identified policy officials who have the "action" on designing, implementing, or monitoring US policy on an individual issue.

The ultimate judges of the utility of DI products for the policymaking process are the officials who choose whether or not to rely on them. The opportunities for adding distinctive intelligence value are usually clear to DI analysts when policy officials ask for a study, memorandum, or briefing. These officials indicate what they expect to gain from their request: for example, specialized information, research findings, cause-and-effect analysis, or cost-benefit assessment of tactical alternatives.

The challenge of addressing US interests effectively is greater when the initiative for launching a DI product comes largely from the intelligence side of the relationship. It is here that Agency monitors (such as the staffs of the congressional oversight committees) are most likely to raise questions about the policymaking utility of individual DI products.


1


How to proceed? Each analytic assignment represents a distinctive opportunity for providing support to policy officials and the policymaking process. That said, veteran analysts and managers have learned from direct experience and from feedback from policy officials that DI products are usually best directed to meet the needs of the user when they are either organized around or specifically highlight one or more of the following values:

Veteran analysts and managers also recommend the following Tradecraft Tips for increasing the utility of DI assessments for policymakers.

Getting Started


2


Special Tradecraft Challenges

Drafting and Self Review


3



4


Directorate of
Intelligence

Notes on Analytic
Tradecraft

Product Evaluation
Staff


April 1995


Note 2

Access and Credibility


This is the second in a series of Product Evaluation Staff notes to
clarify the standards used for evaluating DI assessments and to
provide tradecraft tips for putting the standards into practice.

The goal of enhancing the policy utility and analytic quality of DI papers and briefings is centered on efforts to promote analysts' access to and credibility with the US officials who count most in policymaking, warfighting, and law enforcement:

The two standards are complementary. Without credibility, analysts will lose their access to the hands-on policy and operational officials who need sound and reliable intelligence analysis to succeed in their professional missions. Without access, even unquestionably objective assessments do little to promote the effective use of intelligence by consumers.

As indicated below, the nine standards the Product Evaluation Staff uses to evaluate DI analytic performance reflect the importance of increasing access and protecting credibility:


5


Increasing Access

Addressing US Interests. The most important and demanding charge given to DI analysts is to assist US officials to fulfill their obligation to design, implement, and monitor national security policy. The highest standard is to structure DI assessments to underscore what officials need to know to get their jobs done (for example, about vulnerabilities and strengths of adversaries). When this is not practical, DI deliverables should contain a carefully crafted section that addresses implications, dangers, and opportunities with the role of US policymakers, warfighters, and law enforcement officials as action officers clearly in mind.

Sophistication of the Analysis/Depth of Research. Especially in broadly distributed and fully developed assessments, analysts have to convey their distinctive expertise on an issue. This demonstration of authority to speak to the issue should be based on some combination of research methodology, firsthand knowledge of country or subject, all-source data bases, closeness to collectors, and clear articulation of facts and assumptions--attributes that make a busy policymaker want to come back to the analyst for additional information and analysis.

Unique Intelligence Information. Especially in quick turnaround deliverables for small numbers of key officials, the analyst should make appropriate use of unique, at times highly classified, information that provides insights not otherwise available to well-informed officials. Policymakers and warfighters who have the action on key issues almost always have the clearances as well. At times they need assistance, however, in understanding the context and character of intelligence from clandestine collection and other special sources.

Effective Summary. The analyst's charge is to convey the distinctive values of the paper in the prescribed space. To maximize service to heavily engaged officials, the most important findings and judgments should be highlighted within a bare-bones depiction of the general context. Even more than in the main text, the summary should be made "actionable" via emphasis on US policy implications.

Maintaining Credibility

The Facts--Or What We Know. With emphasis on what is new, different, attention-worthy, a DI assessment should set out what the Directorate knows with sufficient confidence to warrant reliance by policymakers and warfighters in planning and executing US courses of action. When relevant, DI deliverables should also address what analysts do not know that could have significant consequences for the issue under consideration. If the paper notes intelligence gaps, it should, when appropriate, suggest collection strategies to fill the gaps.


6


Sources of the Facts--How We Know It. DI assessments have to depict the sources of information on which consumers are asked to rely within the general rules for using evidence. Direct evidence (for example, imagery and most intercepts) should be distinguished from testimonial evidence (for example, most clandestine and embassy reporting). On complex matters (for example, the attitudes and plans of foreign leaders) analysts should make explicit their levels of confidence in the evidence.

Conclusions. Assessments should enunciate the conclusory findings from the hard evidence (for example, well-documented events) in terms of trends, patterns, and precedents that underscore dangers or opportunities for US policymakers, warfighters, and law enforcement officials. When appropriate, the calculus that leads to the extension of the factual base to an actionable finding should be spelled out (for example, rules used to establish degrees of risk in dual-purpose technological transfers).

Clear Articulation of Assumptions. When analysts address uncertainty--matters that require interpretations and estimates that go well beyond the hard evidence--their argumentation must clarify the premises, suppositions, and other elements of critical thinking that underlie the judgments. For example, effective argumentation for consumers, who will often have their own strong opinions, requires the analyst to clarify not only degree of confidence in key assumptions but also the criticality of the latter to bottom-line judgments.

Outlook. The outlook sections of estimative assessments (for example, the likely course and impact of political, economic, and military developments in foreign countries) should identify the dynamics that will have the greatest impact on subsequent developments. In other words, what are the drivers that will determine the outcome, or what drivers would have to change to alter the outcome?



7


Directorate of
Intelligence

Notes on Analytic
Tradecraft

Product Evaluation
Staff


May 1995


Note 3

Articulation of Assumptions


This is the third in a series of Product Evaluation Staff notes to
clarify the standards used for evaluating DI assessments and to
provide tradecraft tips for putting the standards into practice.

DI analysts are regularly tasked to assist policymakers, warfighters, and law enforcement officers in managing the uncertainty that complicates US efforts to deal with national security threats and opportunities. Many issues cannot be addressed with certainty--by analysts, by policy officials, or even by the foreign players involved. The pattern of political and economic developments in, say, a newly democratic country can depend on so many actors, institutional variables, and contingencies that the outlook cannot be predicted with high confidence. Estimating over an extended timeline or during a period of political or economic crisis increases the burden of uncertainty.

As a rule, the greater the degree of uncertainty attending an issue:

This tradecraft note on Articulation of Assumptions and the next note on Outlook will address the DI tradecraft for argumentation that sets the standard for helping consumers deal effectively with the uncertainty they face in planning for, taking, and monitoring US action.

By "argumentation" we mean the communication in an intelligence assessment of the structure of the analysts' critical thinking in support of the bottom-line judgments. Under the DI tradecraft standard, when policymakers, warfighters, and law enforcement officials are asked to rely on DI analysis, the reasoning that ties evidence to assumptions to judgments must be:


9


Drivers and Linchpins

The DI tradecraft for the articulation of assumptions places emphasis on identifying the drivers or key variables--the uncertain factors that analysts judge most likely to determine the outcome of a complex situation. At times the economy is the key uncertain factor; at times the loyalty of the security forces; at times the leadership skills of a president or dictator. At times all three are judged to carry equal weight in driving future developments.

The analysts' working assumptions about the drivers are sometimes referred to as linchpin assumptions because these are the premises that hold the argument together and warrant the validity of the conclusion.

The following are hypothetical examples to illustrate the relationships of drivers, linchpin assumptions, and conclusions:

Linchpin assumptions are by definition debatable and subject to error. Thus, analysts must defend their judgments by marshaling supporting evidence and reasoning. For instance, in the Egyptian example, the analysts should offer convincing evidence for the assumption that "the military probably will continue supporting the government."

Moreover, on an important issue such as this, the inherent uncertainty has to be accounted for by addressing plausible alternative courses of development, a topic that will be covered in the Tradecraft Note on "Outlook." Here we present recommendations for refining and articulating the assumptions that support the analysts' bottom-line conclusions.


10


Tradecraft Tips

1. Each assessment represents a distinctive challenge in terms of how to set out the argumentation. As a rule of thumb, the more complex and controversial the issue, the more the analysts should make clear the sinews of the reasoning:

2. Before starting to draft an assessment, analysts should open up the search for drivers and not rely solely on what were determined to be the key factors in the last exercise. That is, they should put aside their previous conclusions and focus initially on the range and alignment of assumptions:

3. Also, analysts should open up the process of determining assumptions about the drivers:

--In the Egyptian example, the subordinate assumptions to the linchpin assumption that the military probably will continue to support the government could include judgments about the military's stake in secular government and antagonism toward the Islamic extremists.


11


4. Analysts should review their first draft against a tough standard:

5. Some specific questions analysts should ask about the draft:

In sum, DI analysts can speak with authority on substantively complex and politically controversial issues only through sound and precise argumentation. The more effectively drivers and assumptions are identified and defended, the greater the credibility of the bottom-line conclusions.




12


Directorate of
Intelligence

Notes on Analytic
Tradecraft

Product Evaluation
Staff


June 1995


Note 4

Outlook


This is the fourth in a series of Product Evaluation Staff notes to
clarify the standards used for evaluating DI assessments and to
provide tradecraft tips for putting the standards into practice.

This Note on Outlook continues coverage of DI standards for helping policymakers, warfighters, and law enforcers manage uncertainty regarding threats to and opportunities for US security interests. (Please see Note 3, Articulation of Assumptions.)

The outlook sections of DI assessments must address not only what analysts believe to be the most likely course of development but also important alternative outcomes that could confront US officials. This concern for the unlikely or unexpected reflects an experience-based appreciation of (1) the fallibility of analytic judgment, (2) the special analytic requirements of US officials for addressing long-shot threats and opportunities, and (3) the adversarial character of the policymaking process.

The perils of analytic judgment include:

Policymakers and other consumers who carry the burden of executing US policy often have a different attitude than analysts toward unlikely events.


13


Finally, the outlook on important issues can be subject to deep-seated disagreements--among and between analysts, Administration officials, and congressmen--that generate suspicions about the objectivity of DI memoranda and briefings.

General Rules

As indicated in previous Tradecraft Notes, each DI assessment represents a distinctive challenge to analysts on how to assure credibility with the consumers who count most in planning and executing US policy. The general rules that follow should weigh heavily as analysts choose the most appropriate approach.

1. In all cases, the analysts' judgment on the most likely outcomes must follow clearly from the marshaling of evidence and articulation of assumptions.

2. When the US stake is high (for example, political stability of an important ally or adversary) and plausible alternative outcomes are judged to be in the range of, for instance, 20 percent or greater, the outlook section should also address these potentially important outcomes. Policymakers and other consumers should get enough understanding of these alternatives to decide whether to review contingency plans, ask for more evidence, or levy additional tasks on analysts.

Analysts should identify:


14


3. On issues vital to US security (for example, military intentions of adversaries), analysts must help policymakers and warfighters engage in contingency planning by:

Tradecraft Tips

1. Analysts should avoid unrealistic precision in citing odds. Weather forecasters may have sufficient data to determine that the chances of rain are slightly better than even. Analysts rarely have such luxury in assessing the political or economic outlook.

2. Also avoid phrases that compound unavoidable uncertainty with unnecessary confusion: for example, "real possibility" and "good chance."

3. As a rule, use constructions that tie the outcome to the driver and linchpin assumptions, rather than flat predictions.

4. To minimize misinterpretation when making important judgments, combine probabilistic phrases and rough numerical odds.


15


5. To understand the often intense concern of hands-on policy officials about a 20 percent likelihood of a disastrous turn of events happening on their watch, analysts should consider whether they would voluntarily get on an airplane that had a one-in-five chance of crashing.

6. To deal rigorously with unlikely developments, analysts should switch their focus from whether something will happen to how it could happen. Some techniques for generating credible alternative scenarios by structuring the available evidence differently:


16


Directorate of
Intelligence

Notes on Analytic
Tradecraft

Product Evaluation
Staff


July 1995


Note 5

Facts and Sourcing


This is the fifth in a series of Product Evaluation Staff notes to
clarify the standards used for evaluating DI assessments and to
provide tradecraft tips for putting the standards into practice.

The present note is longer than those previously published
because it addresses two closely related evaluation
standards.

The credibility of DI assessments with key consumers starts with the analyst's skill in organizing and evaluating the information presented in support of policymaking, warfighting, and law enforcement. Estimative judgments are also important, and at times essential for addressing complex and uncertain issues. But extensive feedback makes clear that US officials with hands-on responsibility for planning and executing policy take their first measure of DI analysts in terms of the judgment they exercise in providing actionable and reliable information:

This tradecraft note, on Facts and Sourcing, presents DI guidelines for depiction in memorandums and briefings of what the DI knows--that is, the character and source of its information. The guidelines will be illustrated by hypothetical examples of "best practices" for handling recurring challenges in characterizing information--for example, addressing the reported attitudes and intentions of foreign actors.


17


Definitions

DI analysts have a good track record in conveying what they know clearly and credibly to consumers of intelligence. What is undertaken here is codification of rules of thumb into general guidelines.

The following definitions are an attempt to promote standardized use of common terms and concepts relating to intelligence information.

Fact: Verified information; something known to exist or to have happened.

Information: The content of reports, research, and analytic reflection on an intelligence issue that helps analysts and their consumers evaluate the likelihood that something is factual and thereby reduces uncertainty.

Direct Information: Information relating to an intelligence issue under scrutiny, the details of which can, as a rule, be considered factual because of the nature of the source, the source's direct access to the information, and the concrete and readily verifiable character of the contents. For example:

Indirect Information: Information relating to an intelligence issue, the details of which may or may not be factual, the doubt reflecting some combination of the source's questionable reliability, the source's lack of direct access, and the complex character of the contents. For example:

Sourcing: Depiction of the manner in which information was obtained, in order to assist in evaluating the likelihood that the content is factual. A single report from a source or collection platform can contain both direct and indirect information.

Data: Organized information that provides context for evaluating the likelihood that a matter


18


under scrutiny is factual. The information can be either direct (a chronology of events based on observation by US Embassy officers) or indirect (a chronology based on reports provided by a liaison intelligence service).

These terms are illustrated by the following hypothetical example.

We believe country X has begun a major crackdown on the "Extremist Movement," which the government holds responsible for the campaign of terrorism over the past two years.

The Army has been ordered to support the police in cleaning out Extremist strongholds (direct information), according to special intelligence (sourcing). The President of X reportedly is using last week's attack on a shopping center in a working-class neighborhood to justify calling upon the Army to close down the terrorist campaign (indirect information), according to a reliable clandestine source (sourcing). The pro-government press reports (sourcing) that the Extremists cannot match Army firepower and are taking high casualties (indirect information). A US Embassy observer reports (sourcing) seeing Army trucks deliver more than 100 prisoners, some badly wounded, to the Central Prison (direct information). According to country X police officials (sourcing), these were part of the 1,000 Extremists rounded up so far in the crackdown (indirect information). CIA's "Country X Terrorism Chronology" indicates this is the first time the Army has been used against the Extremists since the terrorism campaign began in 1993 (data).

Guidelines

Rule 1. Be precise about what is known.

With US interests, and at times lives, on the line, policymakers, warfighters, and law enforcement officials need to be informed precisely what the all-source analysts know and how they know it. In making decisions on whether and how to take action, it is important for them to know if the information is direct or indirect, and if and why the analysts have concluded it is factual.

Most key consumers of intelligence have learned to respect the complexity of the national security issues they are charged with managing and the frequent uncertainty about what is taking place and what lies ahead. DI analysts should write to their standard and not to that of the occasional consumer who wants answers no matter what the circumstances.

Thus, in the name of reliability, analysts should never exaggerate what is known. They should report any important gaps in information bearing on US decisionmaking and potential courses of action, as well as relevant information that seems to contradict the main flow of information.


19


Best Practices

Analysts should be precise as well in sourcing information. The phrase "according to the US Embassy," for example, does not inform the reader whether the information is direct or indirect.

Best Practices

Rule 2. Distinguish carefully between information and fact.

Analysts may have direct information on what a foreign leader said, for example, and responsibly conclude that it is factual. But what that foreign leader believes, intends to do, and will do cannot be known to be true on the basis of a report on what he or she said.

Best Practices

Rule 3. Distinguish carefully between information and estimative judgment.

Analysts' estimative judgments, as indicated, are an important element in the process of supporting policymakers, warfighters, and law enforcement officials. As a rule, these judgments cannot rely solely on reported opinions of foreign players or clandestine sources.


20


They must be amply argued in terms of the entire body of available information and sound inferential reasoning.

Best Practices

Also, care should be taken to avoid confusion over whether the DI analyst is making an estimative judgment or a source is expressing an opinion. Thus, the following formulation should be avoided: "Country X has turned the corner toward recovery, as indicated by a reliable clandestine source with access to the Finance Minister."

Best Practices

Rule 4. Take account of substantive complexity.

The more complicated an issue (that is, the more inherently difficult it is to be certain on a matter), the greater the informational demands to establish that the matter under consideration is factual. The burden of proof for determining what foreign leaders or groups believe or intend to do, for example, is much greater than that required for determining what they have done or said. Again, analysts may properly make a conditional judgment about what a foreign leader intends to do. (We believe country X is preparing to invade country Y.) But only rarely can this be stated as verified or factual information. (Country X will invade country Y.)

Best Practices


21


The mindsets of foreign leaders--their attitudes, for example--are also extremely difficult to verify, even with direct or firsthand information, and should rarely be stated as factual. When appropriate, mindset can be a subject for the analyst's estimative judgment.

Best Practices

Rule 5. Take account of policy sensitivity.

As with substantively complex matters, the burden of proof is high on matters that are controversial among policymakers or politically sensitive between the administration and Congress. Directly stated, a solid informational base is needed to present as factual (rather than as the analysts' conditional judgment) something that will be seen by key consumers as bad news. As a rule, then, on controversial matters analysts should place emphasis on the relevant information, and not on estimative conclusions.

Best Practices

Similarly, when addressing the behavior of a given country regarding treaties and agreements with the United States, DI analysts should place emphasis on reporting rather than interpretation. As a rule in these matters, analysts monitor (report relevant information), and policymakers verify (decide whether a violation of a treaty or an agreement has taken place).


22


Best Practices

--The President of country X told the US Ambassador that all official assistance had been terminated, although small-scale shipments of weapons by private groups that sympathize with the guerrillas' cause might still be taking place.

--According to a reliable clandestine source, a senior assistant to the President of X has ordered the military to hide small shipments of weapons for the guerrillas in trucks engaged in normal cross-border trade.

--Special intelligence indicates that a military mission from country X is to await further orders before purchasing abroad communications equipment requested by the guerrillas.

Rule 6. Take account of the possibility of deception.

Deception can be defined as the manipulation of information by a foreign government, group, or individual to get US intelligence analysts to reach an erroneous conclusion. Deception often works because it gives busy analysts what they are seeking--seemingly reliable information on which to base a conclusion.

Here is where the strength of the all-source analyst comes into play. One test for detecting and countering deception is to determine whether all the sources and collection platforms that should be reporting on a matter have indeed done so.

Databases, or organized information, also help. Is the reported or observed information consistent, in all important details, with past patterns? Do intelligence services with an interest in the matter have a record of perpetrating deceptions?

So does critical thinking. Is the information consistent with the analysts' best judgments on the subject's interests, capabilities, and methods of operation?

Best Practices


23


coup planning or unusual movements of troops. Opposition political groups abroad report that they are unaware of any serious coup activity, according to multiple reliable clandestine sources. The government, therefore, could be running a deception operation involving Colonel X, to smoke out "disloyal" officers.

Rule 7. Use the term "evidence" sparingly.

This tradecraft note uses the term "information" as synonymous with the term "evidence" as it usually is employed in DI assessments. That is, both are used to refer to the content of reports and research that helps reduce the uncertainty surrounding a specific matter.

The US legal system, however, uses "evidence" in a more specialized manner to refer to matters introduced in a court case and subject to proof and refutation by contending parties. Because the DI is increasingly providing assessments to support law enforcement officials and because even other kinds of assessments can be subpoenaed for possible use in a court case, analysts should avoid using the term "evidence" when "information" serves their purposes just as well. At times, characterization of the information is sufficient to make the analysts' point for the benefit of consumers.

Best Practices

Special Circumstances

This note has concentrated on general guidelines for depicting the character and source of the DI's information. The DI serves a broad range of audiences with varied needs and entitlements for intelligence support on a multiplicity of substantive issues. Thus, the special character of the subject matter, the delivery vehicle, or the audience can require exceptions to the rules posited above.

For example, either the limited clearances of the recipients of an assessment or a customer's request for an unclassified memorandum can require the analyst to avoid precision in depicting information and sources.


24


Directorate of
Intelligence

Notes on Analytic
Tradecraft

Product Evaluation
Staff


August 1995


Note 6

Analytic Expertise


This is the sixth in a series of Product Evaluation
Staff notes to clarify the standards used for evaluating
DI assessments and to provide tradecraft tips for
putting the standards into practice.

Analytic expertise establishes the DI's authority to speak to an issue on which US national security interests are at stake. Demonstration of expertise in DI assessments is thus needed to gain access to and credibility with the policymakers, warfighters, and law enforcement officials who carry the heaviest burden for US decision and action on national security issues.

As with all evaluation criteria addressed in these tradecraft notes, analysts have to tailor their demonstration of expertise in individual memorandums and briefings to the circumstances of each assignment. These include the degree of substantive complexity, the analytic requirements and timeline of the policymaking process, and whether the DI production unit already has established close ties to its key clients.

As a rule, though, no DI product should be delivered without deliberate effort to make it stand out from the formidable competition, including other intelligence providers, journalists, scholars, lobbyists, and various additional purveyors of information and judgment:


25


How much expertise needs to be demonstrated? Taking space and time limitations into account, the more the better. The DI not only has to stand out regarding the breadth and depth of its analytic expertise but also has to convince its key customers that it can put this expertise to use for their benefit.

Again, the appropriate means through which to convey expertise varies with each analytic assignment. As a general test, we recommend that analysts read their first draft critically, to answer the following questions as if they were posed by a policy official deciding whether to rely on the DI to meet the challenges on his or her agenda.

Tradecraft Tips

Research papers are natural vehicles for effective display of analytic expertise. DI veterans, over the years, have devised an armory of measures for demonstrating hard-earned individual and collective expertise even in quick turnaround papers and within the space limitations for most DI deliverables. We list below some of these measures:


26



27


Expertise in Support of the Intelligence Consumer

We end with an obvious caution. When the analysts know their clients' specific needs, demonstration of expertise is readily projected as a tool or means for providing value added. With larger assessments prepared for a broader audience, more care has to be taken to avoid excessive display of knowledge--that is, substantive expertise as an end in itself. Here is where investment in knowing the timelines of the policymaking process and the specific dangers and opportunities policy officials are dealing with comes into play.


28


Directorate of
Intelligence

Notes on Analytic
Tradecraft

Product Evaluation
Staff


September 1995


Note 7

Effective Summary


This is the seventh in a series of Product Evaluation Staff notes to
clarify the standards used for evaluating DI assessments and to
provide tradecraft tips for putting the standards into practice.

The summary plays a key role in determining the kind of impact a DI report or memorandum will have on the policymaking process. Thus, within the allotted space, the summary has to make the distinctive DI value added stand out--whether consisting of special intelligence with immediate bearing on US security interests, important findings from the sifting of all-source information, or actionable judgments about pending threats and opportunities.

As with other criteria covered in this series of tradecraft notes, analysts have to tailor each summary to the circumstances of the individual assignment. One way or another, though, the analyst has to crystallize the DI expertise presented in the text in a manner that will seize the attention of the intended audience.

When the clients for the assessment are few in number and their priority interests regarding US policy and action agendas are well known to the DI, the analysts' job is to craft a summary that crisply presents the what's new and the so what.

When an assessment is broadcast to a larger and more diffuse audience, the analysts' challenge in determining the structure and content of the summary is greater. Usually, the two most important goals are:


29


To account for the diversity of the potential audience for broadcast assessments, other introductory sections can be put to use in flagging the full range of value added and DI expertise. In particular, an introductory textbox, placed immediately after the summary, can be used to meet the needs of specialized segments of the audience:

A preface is also a useful instrument, in longer and broadly distributed assessments, for supplementing the summary:

Tradecraft Tips

DI veterans offer the following experience-based tips for enhancing the effectiveness of summaries:


30


--Use short paragraphs, bullets and sub-bullets, bold face, and italics to break up the space and to help the busy decisionmaker latch on quickly to the key information, relationships, and judgments.

--One official reported that he did not have time for "whole sentences"--just for what it was the analyst wanted him to know.

--This usually means policy implications. In effect, the summary is a vehicle for explaining why a busy policy official should spend time on a DI product.

--Strive to make the summary actionable; that is, structure it to help a policy official get something started, stopped, or otherwise sorted out.

--If the assessment makes a bottom-line predictive judgment, refer to the key assumptions and other essential elements of the argumentation.

--If the assessment conveys important new findings, cover the reliability of the sourcing and the essence of the methodology.

Reverse Blank


31


Directorate of
Intelligence

Notes on Analytic
Tradecraft

Product Evaluation
Staff


October 1995


Note 8

Implementation Analysis


This is the eighth in a series of Product Evaluation Staff notes to
clarify the standards used for evaluating DI assessments and to
provide tradecraft tips for putting the standards into practice.

This note discusses Implementation Analysis, one of the techniques for extending customized intelligence support to policymakers, law enforcement officials, and warfighters outlined in Note 1, Addressing US Interests in DI Assessments.

Implementation Analysis provides decisionmakers and action takers with an assessment of tactical alternatives for pursuing opportunities and averting dangers regarding established US policy goals. The role of the analyst complements but is distinct from the role of the policymaker:

Analysts identify and evaluate alternatives for implementing objectives; policy officials first set the objective and then make the decisions about which tactics to adopt.

Illustrative hypothetical cases of Implementation Analysis include:


33


Why and When

Policy implementation regularly involves attempting to advance US interests under uncertain and often risky conditions. In these circumstances, policy officials recognize that US objectives can be difficult to achieve. Deterring rogue states from developing weapons of mass destruction and promoting democratic practice in polities with entrenched authoritarian traditions, for example, require overcoming deeply embedded obstacles. Policy officials need support from intelligence analysts in identifying and assessing opportunities and dangers.

Intelligence analysts, through Implementation Analysis, can bring important value added to the policymakers' table:

Even when the analysts' inventory of alternatives largely mirrors that of policy officials, the latter benefit from exposure to the organized information and rigorous argumentation of the former.

In addition to its intrinsic value added, Implementation Analysis can also help the DI maintain effective policy ties on politically charged issues. When pursuit of US goals entails coping with multiple difficult challenges, heavily engaged policy officials do not appreciate a steady stream of DI assessments that reminds them (as well as congressional critics, for example) that they are working against long odds. Implementation Analysis sends the message that the intelligence team, although professionally committed to objectivity, is not indifferent to the challenges the policy team faces and will lend its expertise to help advance and protect US interests even under difficult circumstances.

In any case, analysts should place a premium on close lines of communication in delivering Implementation Analysis:

The initial DI delivery of Implementation Analysis usually consists of a briefing or memorandum for one or a handful of principal policy officials. In recognition of the value added, the latter have in the past subsequently asked for:


34


Tradecraft Tips

Veteran analysts offer the following tradecraft tips for putting Implementation Analysis into practice:

--Recognition of the policymakers' role as "action officers" charged with getting things started or stopped among adversaries and allies overseas.

--Recognition of the policy officials' propensity at times to take risk for gain. For example, policymakers may see a one-in-five chance of turning an unsatisfactory situation around as a sound investment of US prestige and their professional energies.

--For example, if the objective of US policy is to leverage a foreign government or organization to reverse its course, analysts can first think of what developments could lead to that outcome and then make an inventory of the forces at play that could trigger such developments.

--If US policy calls for getting from point A to point B, then here are some gateways to consider and some cul-de-sacs to avoid.

--If the US policy objective is to strengthen the forces for moderation in country X, then the following factors should be examined as potential catalysts for the desired development.


35


--When analysts are pessimistic about prospects for advancing US objectives, they can employ a variation of the Sherlock Holmes methodology for moving forward when there is no promising suspect in a murder case. Provide policymakers with an assessment of the least unpromising tactical alternatives.

--DI analysts, in effect, are the scouts that provide information and insights to help the policymaker-coach develop the most sensible game plan, win or lose.

--Presentation and ranking of alternatives via a matrix containing a rough cost-benefit analysis is a useful way to convey the analysts' underlying all-source findings and linchpin assumptions. At the same time, such a matrix focuses the DI effort on analyzing alternatives, while leaving the actual choosing to the policymakers.
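Such a matrix can be sketched in code. The alternatives, scores, and net-score ranking below are hypothetical illustrations, not drawn from the note:

```python
# Hypothetical tactical alternatives scored on rough benefit and cost
# (all names and numbers are illustrative assumptions, not from the note).
alternatives = {
    "Tighten export controls":    {"benefit": 4, "cost": 2},
    "Offer economic incentives":  {"benefit": 3, "cost": 3},
    "Seek multilateral demarche": {"benefit": 2, "cost": 1},
}

def rank_alternatives(matrix):
    """Rank alternatives by net score (benefit minus cost), best first.

    The ranking only organizes the analysts' findings; choosing among
    the alternatives remains the policymakers' job.
    """
    return sorted(
        matrix,
        key=lambda name: matrix[name]["benefit"] - matrix[name]["cost"],
        reverse=True,
    )

print(rank_alternatives(alternatives))
```

The point of the design is the division of labor the note describes: the code orders the matrix, but it deliberately stops short of selecting an option.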


36


Directorate of
Intelligence

Notes on Analytic
Tradecraft

Product Evaluation
Staff


December [sic] 1995


Note 9

Conclusions


This is the ninth in a series of Product Evaluation Staff notes to
clarify the standards used for evaluating DI assessments and to
provide tradecraft tips for putting the standards into practice.

Conclusions--the analysts' findings based on organizing, evaluating, and interpreting the all-source information available on the issue at hand--usually are the most important value added for key consumers of DI assessments. Because policymakers and action takers rely on DI assessments as they define and defend US interests, analysts must be precise in conveying the level of confidence in their conclusions, taking appropriate account of the prospect of deception and other sources of uncertainty.

Heavily engaged policy officials often receive directly much of the open-source and intelligence information available to DI analysts. But the policy officials need DI assistance in keeping track of the facts, fictions, and trivia. The important event has to be identified from among the ordinary, the underlying patterns have to be cobbled together from seemingly unrelated pieces of information, and the reliable sources have to be distinguished from the self-serving.

On complex national security issues the information available to the analysts rarely speaks for itself. Gaps and inconsistencies are the rule. This is where the DI analysts' expertise comes into play. When seasoned and skeptical DI analysts believe they have made their case by a careful scrubbing of ample all-source information, they should lean forward in making their conclusions precise and clear to the policy officials responsible for management of the issue. Analysts who have organized and evaluated their information are able to conclude with authority, for example:


37


DI assessments are particularly valuable to policy officials when the analytic findings are derived from the collective databases of a multidisciplinary team. Depiction of the political context for a foreign country's lax policy on narcotics controls, for example, or its financial and technological potential for pursuing a nuclear weapons program enables users of DI analysis to take a better measure of the potential risks and benefits of contemplated US policy initiatives.

In contrast, when available information is incomplete or susceptible to foreign deception operations and other sources of ambiguity, the analysts' reasonable doubts about, say, cause-and-effect relationships should be shared with the officials who may rely on DI assessments in taking policy actions.

Many of the issues that the DI tackles are inherently complex and thus shrouded by uncertainties. The analysts have a professional obligation--where warranted--to conclude that they do not know. In such instances, presentation of two or more plausible interpretations of the available information makes a more useful conclusion than a single unreliable one masked in vague language (for example, "a real possibility").

Analysts should be particularly wary about projecting thin information as a DI conclusion. When, for example, analysts do not have a solid informational base and are relying on a small number of reports depicting an event as unprecedented or a pattern as well-established, they should attribute such conclusions to the source. Clandestine agents, foreign officials, and the local media may jump to conclusions. DI analysts should not.

In sum, the value added to policymakers, negotiators, warfighters, and law enforcement officials of conclusions in DI memorandums and briefings rests on:


38


Tradecraft Tips

DI veterans offer the following recommendations for conveying conclusions effectively:

1. When analysts have reached firm conclusions on complex and especially controversial issues, take the time to present the data, to point to the relationships and other implications, and to state the conclusions forcefully. For example:

2. Again, when an issue is complex and controversial, describe the credentials that lie behind the findings--in a textbox or footnote, if not in the main text. For example: depict the reliability of the sources of the information and other specific characteristics of a database; spell out the indicators used to determine diversion to military use of dual-purpose technology imports.

3. To minimize confusion when conveying a necessarily qualified conclusion, think of supplementing adverbial descriptors with a statement of rough numerical odds:
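The pairing of adverbial descriptors with rough numerical odds can be sketched as a lookup table. The descriptors and the specific percentage ranges below are assumptions for illustration; the note does not prescribe a scale:

```python
# Illustrative mapping of adverbial descriptors to rough numerical odds.
# Both the vocabulary and the ranges are hypothetical, not from the note.
ODDS = {
    "almost certain": (0.90, 0.99),
    "probable":       (0.60, 0.85),
    "even chance":    (0.45, 0.55),
    "unlikely":       (0.15, 0.40),
    "remote":         (0.01, 0.10),
}

def qualify(statement, descriptor):
    """Append rough numerical odds to a qualified conclusion."""
    low, high = ODDS[descriptor]
    return (f"{statement} ({descriptor}; "
            f"roughly {round(low * 100)}-{round(high * 100)} percent)")

print(qualify("Country X will proceed with the test", "probable"))
```

A fixed table like this keeps the numerical gloss consistent across assessments, so the same descriptor always signals the same band of odds to the reader.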

4. When the quality of available information requires either reserving judgment about conclusions or presenting multiple plausible interpretations, consider including a textbox or annex on information gaps and collection requirements.

5. When the text must be kept brief because of space limitations of a DI art form, the findings can be laid out in some detail in a textbox. This coverage can be useful both for those consumers who need a quick study into the issue and those with direct responsibility for decision and action who have an interest in taking precise account of what the DI knows.

6. Also use a textbox to explain any major shift in a DI conclusion from previous assessments or the basis for a contrasting conclusion held by other Intelligence Community analysts.

7. When appropriate, use chronologies, matrices, and other graphics to supplement the text in conveying complex trends and relationships. Even the best informed policy officials appreciate graphics that help them evaluate important information.


39


8. Conclusions are the bedrock foundation for estimative judgments in DI assessments that address future patterns of development. In papers that are divided between sections that set out the findings and those that make predictive judgments, analysts may find it useful to summarize the findings in a textbox immediately preceding the estimative portion. This helps clarify the argument for the critical reader. It can also help analysts audit their own logic trail.





40


Directorate of
Intelligence

Notes on Analytic
Tradecraft

Product Evaluation
Staff


December 1995


Note 10

Tradecraft and Counterintelligence


This is the tenth in a series of Product Evaluation Staff
notes to clarify the standards used for evaluating DI
assessments and to provide tradecraft tips for putting
the standards into practice.

This tradecraft note addresses the aspect of counterintelligence (CI) on which the DI is most directly responsible for improved Agency performance: Countering deception operations by foreign governments and organizations aimed at distorting the conclusions and judgments of analysts' assessments.

A subsequent note will address related CI aspects on which DI substantive expertise and analytic skill can make important contributions, including (1) countering espionage by helping to identify the US secrets foreign intelligence services are most interested in obtaining and (2) analytic support of efforts to manipulate foreign intelligence operations to US advantage.

Deception is here defined as all efforts by a foreign intelligence service or adversarial group to distort and otherwise manipulate the perceptions of analysts in order to place analysts and their policy clients at a disadvantage in understanding and dealing with the perpetrating country or group. Deception operations can be divided into two closely related subsets:

Countering Deception

For DI analysts, the first step in improving CI performance is to show increased respect for the deceiver's ability to manipulate perceptions and judgments by compromising collection systems and planting disinformation. Next, analysts must adjust the balance between speed and care in producing DI assessments. There is no "free lunch" here. The analysts' major


41


defense for countering deception--exercising increased care in evaluating information and in testing working assumptions--places pressures on the DI commitment to timeliness in servicing consumer demand for tailored products.

The key to holding down the opportunity costs of countering deception is to tie the effort as closely as possible to the normal processes analysts use to expand expertise and to ensure quality and policy utility in their memorandums and briefings:

Warning Signs

Through an investment in understanding the warning signs that a deception operation may be under way, analysts become more expert about their subjects as well as about when and how to apply their main weapons for protecting the integrity of DI assessments:

The first set of warning signs addresses the likelihood that a country or organization is engaged in an attempt to distort the analysts' perceptions:

1. Means. The country or entity (for example, terrorist group) being assessed has the experience and means to undertake sophisticated deception operations. Nearly all countries and entities given priority attention under PDD 35 have well-practiced means to deceive, often in third countries as well as at home.

2. Opportunity. The foreign country or entity is known to have countered the collection systems or platforms on which the DI analyst is particularly dependent. For example, when a target country has knowledge of the periodicity and acuity of technical collection vehicles that pass over an area it wishes to protect, analysts have to be aware that the resultant information may be incomplete if not also deliberately distorted.


42


Enhanced knowledge of both the reach and the vulnerabilities of collection systems will also help analysts in dealing with the everyday challenges of evaluating incomplete and ambiguous information.

3. Motive. A motive to deceive is believed to be present. Accomplished intelligence services--for example, in Russia, China, or Cuba--have the ability to mount elaborate denial and disinformation operations virtually across the board. Would they see a high payoff from distorting analyst and policymaker perceptions about the issue at hand?

The second set of warning signs focuses on anomalies in the information available to the analyst. Investment in addressing these warning signs reinforces core skills regarding what the analysts know, how they know it, and what they do not know that could affect the outcomes in ways important to policy officials, warfighters, and law enforcement officials. See, for example, Tradecraft Note No. 5, Facts and Sourcing (July 1995).

These warning signs include:

4. Suspicious gaps in collection. For example, when information central to the analysts' conclusions and judgments received through one collection channel is not supported to the extent expected from the take of other collection systems. In other words, the analysts are not receiving the range and volume of information they would expect if there were no deliberate tampering with sources and collection platforms.

5. Contradictions to carefully researched patterns. Does new information undercut the trends and relationships honed through past research? While analysts must be open-minded about the possibility of unexpected change, they should examine critically information that signals an inexplicable change, for example, in an adversary's priorities and practices.

6. Suspicious confirmations. For example, when a new stream of information from clandestine sources or technical collection seems to reinforce the rationale for or against a US policy initiative. In these circumstances, receiving the same "story" from more than one DO or diplomatic source does not in itself speak to the authenticity of the information.


43


Tradecraft Tips

The effort needed to meet the DI standard of ensuring appropriate protection against deception at the lowest net cost to timeliness will vary from case to case. For each assignment, analysts and managers will have to weigh the costs to credibility of being deceived against the opportunity costs of increased care. Below are general schemes for proceeding efficiently against two distinct levels of risk for DI assessments on complex issues.

Regular Issues. We recommend a two-step defense against deception for assessments on which there is no specific reason to suspect deception is at play and on which the primary value added to consumers is a general increase in understanding (as contrasted with tailored support for decision and action):

Step One. Organize the key information by setting it out, for example, on a yellow pad or blackboard, and examine it critically with warning signs of deception in mind. Are all the expected collection systems kicking in? Is the information appropriately consistent? Are the pieces of information on which conclusions and judgments are most heavily based from a reliable clandestine source or uncompromised collection platform?

Step Two. Play Devil's Advocate and develop a hypothetical argument for the case that deception is taking place; that is, a foreign entity is attempting to manipulate perceptions through some combination of denial and disinformation. Then determine to what extent the information organized in Step One lends support to the case for the existence of a deception operation. Look for "hits and misses" between the information and the hypothetical case that substantially raise or diminish the likelihood of the presence of an elaborate deception effort.
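The two-step defense can be sketched as a simple tally of "hits" for the hypothetical deception case. The report fields, the warning signs counted, and the threshold below are all illustrative assumptions, not a method prescribed by the note:

```python
# Minimal sketch of Steps One and Two: lay out the key reports, then
# score them against a hypothetical deception case by counting warning
# signs (suspicious gaps, contradictions of researched patterns).
# Field names and the threshold of 2 are hypothetical.
reports = [
    {"source": "clandestine", "fits_researched_pattern": False,
     "corroborated_by_other_channels": False},
    {"source": "technical", "fits_researched_pattern": True,
     "corroborated_by_other_channels": True},
]

def deception_warning_score(reports):
    """Count hits for the deception hypothesis across the laid-out reports."""
    score = 0
    for r in reports:
        if not r["corroborated_by_other_channels"]:
            score += 1  # suspicious gap in collection
        if not r["fits_researched_pattern"]:
            score += 1  # contradiction of carefully researched patterns
    return score

# Flag the issue for an extended defense when warning signs accumulate.
print(deception_warning_score(reports) >= 2)
```

The tally is only a bookkeeping aid for the yellow-pad exercise; the analytic judgment about whether the hits add up to a live deception operation remains with the analyst.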

Suspect and Sensitive Issues. Where there is reason to suspect the presence of deception (based on general warning signs and the exercises outlined above), analysts should undertake a more elaborate defense of the integrity of their assessments. With or without cause for suspicion, an extended defense should also be employed on sensitive issues--those on which the policy officials may directly rely in making decisions on whether to take military, diplomatic, or economic actions to defend US interests.

On suspect and sensitive issues, we recommend that analysts prepare a textbox or annex that addresses the possibility that a deception operation is distorting the assessment's conclusions and estimative judgments. Managers should consider preparation and defense of the textbox


44


as an essential element of the analyst's effort, even in instances when they determine it is unnecessary to publish it.

The content of the recommended assessment of the integrity of information will vary with circumstances. But at a minimum it should convey that (1) the possibility of the presence of deception has been taken seriously, (2) analytic tests to determine the likelihood of deception have been executed, and (3) any reasonable doubts are forthrightly reported.

Specific questions that analysts can address with profit include:

1. What is known of the ability of the country or group under examination to engage in deception? About incentives to do so?

2. What is known about the reliability and integrity of the sources and collection platforms most heavily depended upon?

3. What can be said about the availability and consistency of information from all relevant sources and platforms?

4. What tradecraft tests were used to evaluate the authenticity of the information relied upon to reach key conclusions?

In sum, such a textbox provides the analysts with an answer to a question likely to be posed with greater frequency by Agency leaders and policy officials in the wake of the Ames case: How do you know you are not being deceived?


45


[End]

Hypertext conversion by JYA/Urban Deadline