Why do we think we understand things more than we actually do? đŸ˜”

Jaffer Jamaludeen
4 min read · Aug 26, 2021


The illusion of explanatory depth

The illusion of explanatory depth (IOED) describes our belief that we understand more about the world than we actually do. It is often not until we are asked to actually explain a concept that we come face to face with our limited understanding of it.

Why it happens

The illusion of explanatory depth happens for four reasons:

  • When information is not in front of us, our memory of it is foggy, but we aren’t aware of the gap.
  • Being able to briefly explain a few parts or levels of a concept leads us to believe we understand the entire concept better than we do.
  • Explanations have no natural end-point, so there is no “complete” answer to measure ourselves against, which inflates our confidence that we can explain anything well.
  • We rarely explain things, and therefore don’t get the practice or feedback we need to recognize our own shortcomings.

How it all started

The term “illusion of explanatory depth” was coined in 2002 by Yale researchers Leonid Rozenblit and Frank Keil. In their experiment, Rozenblit and Keil asked Yale undergraduates to rate their understanding of everyday items like sewing machines, cell phones, and zippers. They then asked subjects to write a detailed explanation of how each item works and to re-rate their own knowledge of these items. Confronted with the limits of their own understanding, participants consistently scored themselves much lower than they had before writing these explanations. In their paper, “The Misunderstood Limits of Folk Science: An Illusion of Explanatory Depth,” Rozenblit and Keil concluded that having to explain basic concepts helps people realize they don’t know as much as they think they do.

Example 1 — Rash decision-making

Suppose you happen upon information about a new graduate program that seems to be a great fit for you. Its slogan and marketing draw on your values of innovation, social betterment, and diversity. It’s located in a city you’ve always wanted to move to, and its recent graduates have gone on to successful careers in a variety of fields you can easily picture yourself in. You’re already starting your application, and you begin to tell your friends and family about your excitement.

All of a sudden, one of your friends calls and asks you to tell him more about the program. You launch into the reasons you want to go: it’s in a great city, it has reputable alumni, and its website really speaks to your values.

“No, no, no,” your friend says, stopping you. “I want you to explain the program itself. What are you going to learn about? What is the program actually training you to do?”

Often, subjects and concepts catch us when we’re thinking fast and running high on passion, and our feelings don’t translate neatly into explanation. In these moments, we may make rash decisions without thinking them through, and being asked to explain ourselves can actually help get us back in touch with our own thoughts.

If it weren’t for your friend, you might spend thousands of dollars on a graduate program you don’t even fully understand. Only the attempt to explain the program can put you back in touch with reality and prompt you to do more research.

Example 2 — Political extremism

In one study by Phil Fernbach and colleagues, experimenters asked subjects which political causes were close to their hearts and then asked them to give reasons why they supported those causes. Afterward, the experimenters offered to donate money on the subjects’ behalf to advocacy groups for these causes, and more often than not, the subjects agreed.

In a second trial, experimenters asked subjects which political causes were close to their hearts and then asked them to simply explain those causes in detail. Afterward, once again, the experimenters offered to donate money on the subjects’ behalf to advocacy groups for these causes. In this trial, however, people largely declined the offer. After explaining the causes, the subjects often realized they didn’t actually know as much as they thought they did, and that they would rather dig deeper into the issues before giving money.

While in the first trial, being prompted to argue for one side confirmed subjects’ already held views, being asked to explain the issue itself in the second trial caused subjects to think twice about supporting a cause. As you can see, our political landscape’s focus on argument and reason, rather than thorough explanation, often confirms people’s already held views, promoting extremism and division.

How to avoid it

You can avoid the IOED by explaining concepts in detail, out loud, before debating them or making decisions about them. Keep your mind, eyes, and ears open, and be sure to understand all sides of a topic before picking one to advocate for.

