The Art of Thinking Clearly
Rolf Dobelli
Overview: Swiss author and businessman Rolf Dobelli became interested in heuristics, biases, and other cognitive errors in a rather unexpected way: a chance invitation to speak at an event with Nassim Nicholas Taleb (The Black Swan). He began to consume works on failures to think clearly, which led him to compile a list of such errors. The Art of Thinking Clearly represents the culmination of this effort. In the book, Dobelli catalogs 99 such cognitive errors. Examples include “Why You Should Forget the Past” (sunk cost fallacy), “Why the Last Cookie in the Jar Makes Your Mouth Water” (scarcity error), and “Why Speed Demons Appear to Be Safer Drivers” (intention-to-treat error). Each brief chapter includes anecdotes, examples, and practical advice to help the reader recognize the error and avoid it.
- Which error struck home for you the most? Can you provide an example of a decision you made that was adversely affected by this error?
- Chapter 6, “Don’t Accept Free Drinks,” points out the danger of reciprocity, in which people have “extreme difficulty being in another person’s debt.” As an example, he recounts that he and his wife invited an uninteresting couple to dinner simply because that couple had hosted them a few months earlier. How do we see the pull of reciprocity in our interactions with subordinates? With peers? With outside contractors? Do existing rules regarding professional and unprofessional relationships, and relations with non-federal entities, adequately guard against the reciprocity principle?
- In chapter 15, “Why You Systematically Overestimate Your Knowledge and Abilities,” Dobelli describes the overconfidence effect, in which we “systematically overestimate our knowledge and ability to predict—on a massive scale.” For example, he cites a survey in which 93 percent of US students rated themselves as above-average drivers. Is this danger particularly acute in the Air Force, where performance reports tend to treat nearly everyone as above average? What dangers does the overconfidence effect introduce into our thinking at a strategic level? What about on a day-to-day basis?
- Most of us are generally aware of the groupthink error explained in chapter 25, “The Calamity of Conformity.” Under this error, “a group of smart people makes reckless decisions because everyone aligns their opinions with the supposed consensus.” This danger is particularly acute in a close-knit group that cultivates team spirit. In your small groups, how do you avoid this danger? Who is responsible for questioning tacit assumptions and serving as devil’s advocate? Can you describe a time when a team you were part of succumbed to groupthink, and what dangers arose from it?
- Along the lines of problems that teams present, Dobelli describes the phenomenon of social loafing in chapter 33, “Why Teams Are Lazy.” This chapter runs contrary to the narrative that in teams, the whole is greater than the sum of its parts. Under this phenomenon, individual team members tend to hold back in both participation and accountability, realizing that others will cover for their lack of effort. Does your organization suffer from this effect? Do you as an individual tend to hold back sometimes, realizing that it will likely go unnoticed? What does this say about our assumption that teams are an efficient way to operate?
- In chapter 48, “Why Experience Can Damage Your Judgment,” Dobelli explains association bias, in which our brains are wired to make connections that can create “false knowledge.” In other words, we tend to “overlearn” the lessons of history. To what degree do we see this in Air Force operations? Do we as leaders display a tendency to overcorrect, forming connections between past events and outcomes where none should exist?
- Dobelli uses the fable of the fox and the grapes to illustrate our tendency to avoid cognitive dissonance in chapter 50, “Sweet Little Lies.” Under this cognitive error, we reinterpret what happened retrospectively to avoid the dissonance that arises when we set out to do something and fail to accomplish it. For example, I may try hard to win a quarterly award; when I come up short, I convince myself that the award criteria were unfair, or that I never really wanted the award after all. Can you think of a time when you rationalized a past action or decision to avoid cognitive dissonance? How can this phenomenon affect an organization’s thinking?
- Chapter 69, “Disregard the Brand New,” warns against neomania, placing “far too much emphasis on flavor-of-the-month inventions and the latest ‘killer apps’ while underestimating the role of traditional technology.” Dobelli cites examples of futurists who predicted a world far more advanced than the one that actually emerged. Are we as an Air Force particularly susceptible to this cognitive error? What futuristic technologies have we placed too much emphasis on, and what traditional technologies have we inappropriately discounted?
- Dobelli discusses domain dependence in chapter 76, “Knowledge Is Nontransferable.” Under this principle, “[i]nsights do not pass well from one field to another.” The skills a person gains in one area do not necessarily transfer to other areas; for example, a good teacher might not make a good salesperson, even though the fields have some overlap. As an Air Force, our leadership has focused on the importance of “multi-domain command and control.” To what degree does domain dependence complicate this focus area? How can we break down barriers between domains so that insights pass more readily from one field to another?
- Dobelli lays out the familiar default effect in chapter 81, “Why You Go with the Status Quo.” This effect describes the power of the default choice: we tend to stick with the status quo simply because it is easier, more familiar, and less risky, even when it is not the most advantageous option. He writes, “The default setting is as warm and welcoming as a soft pillow, into which we happily collapse.” What decisions are you making that default to the status quo simply because it is familiar? What about your organization? How can you be aware of this phenomenon and guard against relying on it too much?
- Dobelli’s last chapter might be his most provocative. In chapter 99, “Why You Shouldn’t Read the News,” he provides three reasons for reading more books and long articles and fewer current news accounts: 1) we react disproportionately to flashy news headlines; 2) current news rarely helps us make better decisions; and 3) news is “a waste of time” that could be better spent elsewhere. In a world in which we are bombarded with headlines, what do you think about his argument? Should we spend more time becoming educated on current events, or less? How much time do you spend just trying to keep up with current events through various media outlets, including social media? Could you focus less on currency and more on depth of understanding? What outlets for succinct news summaries have you found helpful so you can spend less time absorbed in current events?