
Succeeding By 'Thinking Like The Enemy'

You might not be familiar with the term “red team,” but it’s a concept used by the CIA, the military and many corporations to assess their vulnerabilities and better protect themselves against threats.

Micah Zenko, a senior fellow with the Council on Foreign Relations, analyzes this concept in his new book “Red Team: How to Succeed By Thinking Like the Enemy.” He tells Here & Now’s Indira Lakshmanan that the theme of his book is “you can’t grade your own homework.”

Book Excerpt: ‘Red Team’

By Micah Zenko


This is a book about how to improve the performance of an institution by enabling it to see the world in a new and different way. Institutions—whether they are military units, government agencies, or small businesses—operate according to some combination of long-range strategies, near-term plans, day-to-day operations, and to-do lists. Decision-makers and their employees do not simply show up at their jobs each morning anew and decide then and there how to work and what to work on. The existing guidance, practices, and culture of an institution are essential to its functioning effectively. Yet, the dilemma for any institution operating in a competitive environment characterized by incomplete information and rapid change is how to determine when its standard processes and strategies are resulting in a suboptimal outcome, or, more seriously, leading to a potential catastrophe. Even worse, if the methods an institution uses to process corrective information are themselves flawed, they can become the ultimate cause of failure.

This inherent problem leads to the central theme of this book: you cannot grade your own homework. Think back to a high school class where you struggled every day to grasp the subject. Now, imagine that the teacher empowered you to grade your own homework. At first this would seem like a great boon—a guaranteed 100 percent every time! No matter how poorly you actually performed, you could decide your own grade for each assignment. In correcting those assignments you would develop a range of rationalizations as to why you really deserved an A, in spite of inferior results: “this wasn’t covered in class,” “the teacher did a lousy job,” “I was really tired,” or maybe “just this one last time.” Now, imagine your shock when, after a semester of self-grading, the teacher hands out the final exam and announces that this time she will be the one holding the red pen. This would expose all the things that you should have learned or maybe thought you understood, but never really did. Grading your own homework might feel good in the short term, but it completely clouds one’s self-awareness, and can eventually lead to a failing grade.

The warning that “you cannot grade your own homework” has relevance far beyond the classroom. Consider the mistaken self-evaluation strategy that was employed by the CIA in its post-9/11 detention and interrogation program. Internal assessments of its operations’ necessity and effectiveness—including the use of “enhanced interrogation techniques” (i.e., torture) against suspected terrorists—were conducted by the same CIA personnel that had been assigned to develop and manage the program, and also by outside contractors who had obvious financial interests in continuing or expanding it. In June 2013, an internal CIA review found that its personnel regularly made “assessments on an ad hoc basis” to determine if “various enhanced techniques were effective based upon their own ‘before and after’ observations” of changes in a detainee’s demeanor. Unsurprisingly, the CIA personnel and outside contractors judged with confidence that the program they worked in was both highly effective and needed.

Despite requests by National Security Advisor Condoleezza Rice and the Senate Select Committee on Intelligence in the mid-2000s to commission what would have been the equivalent of a red team alternative analysis of these programs, none was ever ordered by senior CIA officials. As the Agency acknowledged: “The sole external analysis of the CIA interrogation program relied on two reviewers; one admitted to lacking the requisite expertise to review the program, and the other noted that he did not have the requisite information to accurately assess the program.” An informed and empowered red team, comprised of knowledgeable experts holding the requisite security clearances, would have offered a more realistic evaluation of the use of torture and provided recommendations for how to revise or terminate the detention and interrogation program.

An astonishing number of senior leaders are systemically incapable of identifying their organization’s most glaring and dangerous shortcomings. This is not a function of stupidity, but rather stems from two routine pressures that constrain everybody’s thinking and behavior. The first is comprised of cognitive biases, such as mirror imaging, anchoring, and confirmation bias. These unconscious influences on decision-making under uncertain conditions make it inherently difficult to evaluate one’s own judgments and actions. As David Dunning, a professor of psychology at Cornell University, has shown in countless environments, people who are highly incompetent in terms of their skills or knowledge are also terrible judges of their own performance. For example, people who perform the worst on pop quizzes also have the widest variance between how they thought they performed and the actual score that they earned.

The second related pressure stems from organizational biases—whereby employees become captured by the institutional culture that they experience daily and adopt the personal preferences of their bosses and workplaces more generally. Over a century ago, the brilliant economist and sociologist Thorstein Veblen illustrated how our minds become shaped and narrowed by our daily occupations:

What men can do easily is what they do habitually, and this decides what they can think and know easily. They feel at home in the range of ideas which is familiar through their everyday line of action. A habitual line of action constitutes a habitual line of thought, and gives the point of view from which facts and events are apprehended and reduced to a body of knowledge. What is consistent with the habitual course of action is consistent with the habitual line of thought, and gives the definitive ground of knowledge as well as the conventional standard of complacency or approval in any community.

Though we would now refer to this derisively as “going native” or “clientism”—whereby people become incapable of perceiving a subject critically after years of continuous study—any honest employee or staffer should recognize this all-pervasive phenomenon that results in organizational biases. This is particularly prominent in jobs that require deep immersion in narrow fields of technical or classified knowledge, and those that are characterized by rigid hierarchical authority—the military is a clear example. Taken together, these common human and organizational pressures generally prevent institutions from hearing bad news, without which corrective steps will not be taken to address existing or emerging problems.

Excerpted from the book RED TEAM: HOW TO SUCCEED BY THINKING LIKE THE ENEMY by Micah Zenko. Copyright © 2015 by Micah Zenko. Published by Basic Books. Used by permission of the publisher.

Guest

  • Micah Zenko, senior fellow with the Council on Foreign Relations and author of “Red Team: How to Succeed By Thinking Like the Enemy.” He tweets @micahzenko.

Copyright 2020 NPR. To see more, visit https://www.npr.org.
