Cognitive traps for intelligence analysis
- This article deals with a subset of the intellectual process of intelligence analysis itself, as opposed to intelligence analysis management, which, in turn, is a subcomponent of intelligence cycle management. For a complete hierarchical list of articles in this series, see the intelligence cycle management hierarchy.
Intelligence analysis is subject to many of the general cognitive traps that appear in other disciplines, as well as to a set of traps specific to intelligence analysis. The traps that lie between an analyst and clear thinking were first articulated systematically by Dick Heuer.[1] The traps may be facets of the analyst's own personality or of the analyst's organizational culture. The most common personality trap is assuming that the people being studied think like the analyst, a phenomenon called mirror-imaging. Experienced analysts can usually, although not always, detect that they are mirror-imaging if they are willing to examine variants of what seems most reasonable within their personal framework. Peer review, especially by people with different backgrounds, can be a wise safeguard. When mirror-imaging, the analyst may regard a legitimate question as a personal attack, rather than looking beyond ego to the merits of the question. Organizational culture can also create a trap in which individual analysts become unwilling to challenge the acknowledged experts in the group.
Target fixation, in which a pilot becomes so intent on delivering a bomb to a target that he forgets to fly the airplane and crashes into the target, reflects a more basic human tendency than many realize. Analysts may fixate on one hypothesis, looking only at evidence consistent with a preformed view and ignoring other relevant views. A desire for rapid closure is another form of idea fixation.
"Familiarity with terrorist methods, repeated attacks against U.S. facilities overseas, combined with indications that the continental United States was at the top of the terrorist target list might have alerted us that we were in peril of a significant attack. And yet, for reasons those who study intelligence failure will find familiar, 9/11 fits very much into the norm of surprise caused by a breakdown of intelligence warning."[2] The breakdown happened, in part, because there was poor information sharing among analysts, for example, in different FBI offices. At a conceptual level, US intelligence knew that al-Qaida actions almost always involved multiple, near-simultaneous attacks. The FBI did not put together the information of foreign students performing oddly in flight training.
On the day of the hijackings, under much greater time pressure, no analyst associated the multiple hijackings with al-Qaeda's multiple-attack signature. The failure to imagine that a major attack could occur within the US meant that the irregularities detected by the Federal Aviation Administration and North American Aerospace Defense Command did not flow into a center where analysts could fuse them with the earlier reports about the possibility of using hijacked airliners and the oddities in the flight-school reports.
Analogies can be extremely useful, but a forced analogy, laden with assumptions of cultural or contextual equivalence, is yet another cognitive trap, especially when the analyst is unaware of the differences between his or her own context and that of others. It can be hard to admit that one does not know something; it is even harder to deal with a situation when one is unaware of lacking critical knowledge. Ignorance can stem from a lack of study, an inability to mesh new facts with old, or simple denial of conflicting facts.
Organizational Culture
Even though an individual may be a creative thinker, the analyst's organization may not support productive thought. Managers especially concerned with appearances are often at the root of suppressed creative conflict and reliance on stereotypes. Stereotyping is related to stovepiping, in which a group heavily invested in a particular collection technology may ignore valid information from other sources (functional specialization). It was a Soviet tendency to value HUMINT from espionage above all other sources, such that Soviet OSINT had to develop outside the state intelligence organization, in the USA Institute (later USA-Canada) of the Soviet Academy of Sciences.[3]
Another problem with specialization can result from security compartmentation. An analytic team with unique access to one source may overemphasize that source's significance. This can be a particular problem with long-term HUMINT relationships, in which the partners develop personal bonds.
Just as individual analysts can reject evidence that contradicts prior judgments, the same phenomenon can capture entire groups. There is a delicate balance between the thoughtful use of deliberately contrarian "red teams" and the politicized use of ideologues who want support for only one policy. The latter has recently also been called stovepiping, referring not to intelligence collection disciplines but to the flow of information.
The Other Culture
There are many levels at which one can misunderstand another culture, whether of an organization or of a country. One key trap is the rational-actor hypothesis, which ascribes "rational" behavior to the other side, but with the definition of rationality drawn from one's own culture. The social anthropologist Edward T. Hall illustrated one such conflict[4] with an example from the American Southwest. "Anglo" drivers became infuriated when "Hispanic" traffic police cited them for going one mile per hour above the speed limit, only to see a "Hispanic" judge dismiss the charges. "Hispanic" drivers, in turn, were convinced that "Anglo" judges were unfair because they would not dismiss charges in light of the circumstances.
Both cultures were rational, but in different ways, regarding the enforcement of laws and the adjudication of charges; both believed that one of the two had to be flexible and the other formal. In Anglo culture, police had discretion about speeding tickets, while the court was expected to stay within the letter of the law. In Hispanic culture, the police were expected to be strict, but the courts would balance the situation. There was a fundamental lack of empathy: each side was ethnocentric and assumed the other culture was its mirror image. In the traffic example, each culture denied the rationality of the other, yet each was acting rationally within its own value set.
In a subsequent interview, Hall spoke widely about intercultural communication [5]. He summed up years of study in the simple statement "I spent years trying to figure out how to select people to go overseas. This is the secret. You have to know how to make a friend. And that is it!"
To make a friend, one has to understand the culture of the potential friend, one's own culture, and how things that are rational in one may not translate to the other. Key questions are:
- "What is culture?"
- "How can an individual be unique within a culture?"
Hall said "If we can get away from theoretical paradigms and focus more on what is really going on with people, we will be doing well. I have two models that I used originally. One is the linguistics model, that is, descriptive linguistics. And the other one is animal behavior. Both involve paying close attention to what is happening right under our nose. There is no way to get answers unless you immerse yourself in a situation and pay close attention. From this, the validity and integrity of patterns is experienced. In other words, the pattern can live and become apart of you.
"The main thing that marks my methodology is that I really do use myself as a control. I pay very close attention to myself, my feelings because then I have a base. And it is not intellectual."
Proportionality bias is the assumption that small things are small in every culture, yet things have different priorities in different cultures. In Western, especially Northern European, culture, time schedules are very important, and being late can be a major discourtesy. Waiting one's turn is a Western cultural norm, and Westerners may wrongly read a failure to stand in line as a cultural failing. "Honor killing" seems bizarre in some cultures but is taken very seriously in others.
Even within a culture, individuals remain individuals. The presumption of unitary action by organizations is another trap. In Japanese culture, the lines of authority are very clear, but the senior individual will also seek consensus. American negotiators may push for quick decisions when the Japanese need to build consensus first -- and once it exists, they may execute faster than Americans.
The Other Side is Different
The analyst's country (or organization) is not identical to that of the opponent. One error is to mirror-image the opposition, and assume it will act as your country and culture would under the same circumstances: "the belief that the perpetrators will not carry out a particular act because the defender, in their place, would not do it. It seemed inconceivable to the U.S. planners in 1941 that the Japanese would be so foolish to attack a power whose resources so exceeded those of Japan, thus virtually guaranteeing defeat".[2]
In like manner, no analyst in US Navy force protection conceived of an Arleigh Burke-class destroyer, such as the USS Cole, being attacked by a small suicide boat, much like the boats Japan had planned to use extensively in its defense against invasion in World War II.
The Other Side Makes Different Technological Assumptions
Mirror-imaging can also take place among one's own analysts, who may commit to a common set of assumptions rather than challenging them. Another country may operate within quite different cultural frameworks, which can affect the targets of intelligence positively or negatively.
Other countries may approach technology differently, as was certainly the case in the Pacific Theater of World War II. The Japanese seem to have believed that their language was so complex that, even if their cryptosystems such as PURPLE were broken, outsiders could not truly understand the content. That was not strictly true, but it was true enough that there were cases in which even the intended recipients did not clearly understand the writer's intent.
For its part, the US Navy assumed that ships anchored in the shallow waters of Pearl Harbor were safe from torpedo attack, even though the British had demonstrated shallow-water aerial torpedo attack in the 1940 Battle of Taranto.
Just as the US underestimated the ability of the Imperial Japanese Navy to mount a successful attack on Pearl Harbor, so, even though the 9/11 conspirators had demonstrated the organizational capacity to coordinate the simultaneous hijacking of four airliners, no one suspected that the hijackers' weapon of choice would be the box-cutter.[2]
The US Navy underestimated the danger of suicide boats in harbor and set rules of engagement that allowed an unidentified boat to come alongside the USS Cole without being warned off or taken under fire. An Arleigh Burke-class destroyer is one of the most powerful warships ever built, yet US security doctrine did not protect the docked ship.[2]
The Other Side Doesn't Make Decisions As Yours Does
Mirror-imaging, or assuming the other side thinks as you do, is one of the great hazards. Looking again at policymakers rather than analysts, this was a huge problem during Vietnam, with the President and the Secretary of Defense assuming that Ho Chi Minh would react to situations in the same way as would Lyndon Baines Johnson or Robert S. McNamara, the epitome of Western rationality. In like manner, quite apart from US political manipulation of intelligence, there was a serious misapprehension that Saddam Hussein would view the situation vis-à-vis Kuwait as the State Department and the White House viewed it.
Opposing countries are not monolithic, even within their governments. There can be bureaucratic competition that becomes associated with different ideas. Some dictators, such as Hitler and Stalin, were known for creating internal dissension so that only the leader was in complete control. A current trap, which analysts understand but which politicians either may not understand or may exploit to play to domestic fears, is to ignore the actual political and power structure of Iran and to equate the power of the President of Iran with that of the President of the United States.
Opponents are not always rational, and they may take greater risks than one's own country would. Returning to the Iranian example, an apparently irrational statement from President Mahmoud Ahmadinejad does not carry the weight that a similar statement from Supreme Leader Ali Khamenei would. Taking risks to sustain the illusion of a WMD threat appears to have been part of Saddam Hussein's doctrine. Opponents are unlikely either to act in the way that is best for your side or to take the worst-case approach to which you are most vulnerable. There is also a danger among analysts of assuming that the opponent is all-wise and will always know all of your side's weaknesses.
The Other Side May Be Trying to Confuse You
Analysts need to form hypotheses, but they need to be open to data that either confirms or disproves a hypothesis, rather than searching for evidence that supports only one theory. They must also remember that the enemy may be deliberately deceiving them with information that seems plausible. Bacon observed that "the most successful deception stories were apparently as reasonable as the truth. Allied strategic deception, as well as Soviet deception in support of the operations at Stalingrad, Kursk, and the 1944 summer offensive, all exploited German leadership’s preexisting beliefs and were, therefore, incredibly effective." Stories that Hitler considered implausible were not believed. Western deception staffs mixed "ambiguity-type" and "misleading-type" deceptions, the former intended simply to confuse analysts and the latter to make one false alternative seem especially likely.
Of all modern militaries, the Russians treat strategic deception -- or, in their term, maskirovka, which goes beyond the English phrase to include deception, operational security, and concealment -- as an integral part of all planning, with the highest levels of command involved.[6] See London Controlling Section for a WWII Allied deception organization.
Bacon wrote "The battle of Kursk was also an example of effective Soviet maskirovka. While the Germans were preparing for their Kursk offensive, the Soviets created a story that they intended to conduct only defensive operations at Kursk. The reality was the Soviets planned a large counteroffensive at Kursk once they blunted the German attack. .... German intelligence for the Russian Front assumed the Soviets would conduct only “local” attacks around Kursk to “gain a better jumping off place for the winter offensive.” The counterattack by the Steppe Front stunned the Germans [7].
The opponent may also try to overload the analytical capability.[8] As a warning to those preparing the intelligence budget, and to those agencies where the fast track to promotion is in collection, one's own side can produce so much raw data that the analyst is overwhelmed, even without enemy assistance.
References
- ^ Heuer, Richards J. Jr. (1999). "Chapter 2: Perception: Why Can't We See What Is There To Be Seen?", Psychology of Intelligence Analysis. History Staff, Center for the Study of Intelligence, Central Intelligence Agency. Retrieved on 2007-10-29.
- ^ Porch, Douglas (September 2002). "Surprise and Intelligence Failure". Strategic Insights I (7). US Naval Postgraduate School.
- ^ "The Amerikanisti", Time, July 24, 1972, <http://www.time.com/time/magazine/article/0,9171,906150,00.html>. Retrieved on 2007-10-28.
- ^ Hall, Edward T. (1973). The Silent Language. Anchor.
- ^ Sorrells, Kathryn (Summer 1998). Gifts of Wisdom: An Interview with Dr. Edward T. Hall. Retrieved on 2007-10-28.
- ^ Smith, Charles L. (Spring 1988). "Soviet Maskirovka". Airpower Journal.
- ^ Bacon, Donald J. (December 1998). Second World War Deception: Lessons Learned for Today’s Joint Planner, Wright Flyer Paper No. 5. (US) Air Command and Staff College. Retrieved on 2007-10-24.
- ^ Luttwak, Edward (1997). Coup D'Etat: A Practical Handbook. Harvard University Press.