Facts and feelings: knowing better by knowing ourselves

FACTS and FEELINGS: from what I read in today’s paper, there seems to be little public will to distinguish between these two when firmly asserting knowledge claims.  And from what I hear in science-based podcasts, our biased brains make it hard to do so even when we try.  As Theory of Knowledge teachers, aiming for thinking critically and appreciating what it takes to know, we’re tackling no lightweight project!  We might seriously welcome resources that give us support.  So today I’m recommending two I consider to be entertaining and helpful – a totally delightful book named Factfulness and a short video on why we can be so convinced we’re right.

Factfulness: Hans Rosling

No surprise here!  The first resource absolutely has to be the 2018 book by Hans Rosling, with Ola Rosling and Anna Rosling Rönnlund.  He was working on this book up to his death in February 2017, even taking the manuscript into the hospital in his last days.  Ola and Anna, family and co-authors, then took the book to its publication.  Surely its full title is one that will prompt any teacher of Theory of Knowledge to snatch it up:  Factfulness: Ten Reasons We’re Wrong About the World – and Why Things Are Better Than You Think.


You’ve almost certainly met Hans Rosling already – the Swedish professor of international health who made statistics comprehensible, visually, to an audience worldwide. He was an educator who argued that our beliefs about the world should be grounded in facts (data!), and reached an audience of millions as a compelling speaker and through the Gapminder Foundation.  You may want to click back to the tribute I wrote on his death to this thinker I so admire:   Thank you, Hans Rosling: numbers, facts, and the world.   

“Just as I have urged you to look behind the statistics at the individual stories, I also urge you to look behind the individual stories at the statistics.  The world cannot be understood without numbers.  And it cannot be understood with numbers alone.” Hans Rosling (128)

The book, though, isn’t just about facts – or our woeful ignorance of them.  It’s more relevant to our TOK course because it’s also about the biases of our brains and habits of our minds that can stand in the way of evaluating information clearly.  Rosling calls these “instincts” because of how deeply entrenched they seem to be.  Although I wouldn’t use the term “instincts” myself, the ten he identifies give a very good entry point into becoming aware of them and thinking more critically.

I’ll spoil nothing for readers if I outline here, oh-so-briefly, the “ten reasons” of the title.  They will show you why the book is so relevant to TOK.  But that’s all.  An outline like this one leaves for your own reading the qualities that make it a pleasure to read: the stories taken from many parts of the world, the nuances of the discussion, and the inspiration to learn more, and better, about the world.

  1. the gap instinct  We tend to think in polarities, dividing the world into rich and poor, developed and developing – and this simplified thinking is reinforced by dramatic stories and by ignorance.  When distributions and averages are understood, with awareness of how comparisons are drawn, the data demonstrate spreads and gradations, not distinct groups with gaps between them.  For TOK?  Understanding statistics, but also biased assumptions.
  2. the negativity instinct  We have an instinct “to notice the bad more than the good.  There are three things going on here: the misremembering of the past; selective reporting by journalists and activists; and the feeling that as long as things are bad it’s heartless to say they are getting better.” (65). For TOK? Understanding selection and emphasis in representing the world, as in the media.
  3. the straight-line instinct  We tend to think in straight lines as we project a present rate into the future, neglecting other possibilities such as doubling (as in an epidemic), S-bends as a situation rises then plateaus, humps as a high point is passed, or curving slides as the rate drops.  We thereby distort our predictions.  For TOK?  Recognizing assumptions, visualizing statistics.
  4. the fear instinct  Frightening things get our attention in our own lives and the media, but they are not necessarily the most dangerous in posing risk.  “Our natural fears of violence, captivity, and contamination make us systematically overestimate these risks.” For TOK?  Attention filters on sense perception, influence of intuition and emotion on reasoning.
  5. the size instinct  A single number can seem impressive, but can be misleading unless it is compared with some other relevant number (e.g. other places, other times) or divided by it (e.g. per person) to see things in proportion.  For TOK?  Numbers and the world/statistics, perspectives and argument.
  6. the generalization instinct  “The gap instinct divides the world into ‘us’ and ‘them’, and the generalization instinct makes ‘us’ think of ‘them’ as all the same.”  We need to question our categories: look for differences within groups; look for similarities and differences across groups; beware of “the majority” and the huge range the term encompasses. (165). For TOK?  Excellent treatment of the pitfalls in generalizing, in inductive reasoning.
  7. the destiny instinct  Slow change is not no change. “Cultures, nations, religions, and people are not rocks. They are in constant transformation.” (170). Africa is not “destined” to be poor nor Europe rich! We need to keep track of gradual improvements, update our knowledge, and recognize cultural change. For TOK?  Excellent treatment of the assumptions and biases built into perspectives on the world.
  8. the single perspective instinct  It is better to look at problems from different perspectives in order to understand them and find solutions.  For TOK?  Central to our course: the influence of perspectives on what we claim to know.
  9. the blame instinct  We tend to simplify causes by blaming individuals, groups, experts, or the media. We should look for causes, not villains, and for systems of interacting causes, not individuals.  For TOK? Complexities in the search for causes and explanations.
  10. the urgency instinct  “Data must be used to tell the truth, not to call to action, no matter how noble the intentions…. Urgency is one of the worst distorters of our worldview.” (236). We should take a breath, insist on the data, beware of fortune-tellers, and be wary of dramatic action. For TOK?  The influence of emotion on reaching conclusions, the importance of reason.

“I don’t tell you not to worry.  I tell you to worry about the right things…. Be less stressed by the imaginary problems of an overdramatic world, and more alert to the real problems and how to solve them.” Hans Rosling (241)

Ultimately, what I love about Hans Rosling’s book is its insistence on the importance of learning – resisting our human biases and continually updating with facts our understanding of the world. This is also what makes it a perfect book for a teacher of IB Theory of Knowledge. It reinforces our aims as we teach our course, and does so with captivating stories of the world and reflections upon a more hopeful version of it than we usually encounter.

And I’ll just add that my own hardcover copy of Factfulness is special to me for a personal reason.  It was a gift to me on a recent significant birthday from dear friends, who clearly know me well.

Facts and Feelings: Motivated Reasoning

The second resource I recommend is smaller, lighter, and narrower in focus than Hans Rosling’s book: it’s already packaged into an 11-minute video for class delivery at a level suitable for Theory of Knowledge students, and it’s free.  Yet what these two resources have in common is a treatment of facts and feelings, and the need to be aware of both in order to think critically.

The video is a TED talk by Julia Galef, host of the Rationally Speaking podcast.  In it, she treats motivated reasoning in a way that’s instantly accessible to students.  She uses a metaphor (soldier mindset vs scout mindset) to establish her contrasts and a story (the Dreyfus affair in France over 100 years ago) to illustrate them.  She is arguing for becoming more aware of our own motivations and responses in order to make better judgments. What I find most interesting, though, is that she argues not for more logic and analysis but instead for an emotional shift – a shift away from feeling defensive or tribal (soldier mindset) and toward feeling the value of being curious, open, and personally grounded enough to be willing to be wrong (scout mindset).  Recognizing facts thus depends on recognizing feelings.

The video makes a good introduction to basic biases in evaluating evidence – those facts of the world that Rosling vigorously defends with data.  Galef focuses on motivated reasoning, but it would be easy in TOK to expand a little more on her presentation to recognize at least three intersecting biases.  In my mind, they really cluster together.

  1. motivated reasoning  As she explains, we seek and interpret evidence in a way that supports our fears or desires, and we are motivated to endorse claims that reinforce what we want.  If information we’re given contradicts our beliefs, we experience uncomfortable “cognitive dissonance”, a sense of friction or conflict.  We can resolve the conflict by changing our beliefs or by rejecting the information that contradicts them – but are inclined to discard the information regardless of how well grounded it may be.  If we hold a belief with an emotional investment, we want to defend it and shoot down the enemy belief, position, or ideology that is in conflict with our own.
  2. confirmation bias  Galef does not treat the closely related cognitive bias known as confirmation bias, which inclines us to notice and accept evidence that supports our beliefs and not notice evidence that does not. Confirmation bias works more unconsciously than motivated reasoning, more like a filter constantly screening what we see or recognize. We notice primarily what fits our expectations – and thereby have our assumptions and prejudices confirmed.

If we have our beliefs continuously confirmed by our cognitive biases, and if emotional investment in them gives us motivation to reject counter-evidence and differing views, then we really do have problems when it comes to thinking critically!   But let me add just one more difficulty, equally part of being human.

  3. fundamental attribution bias  It’s not always easy to see from other people’s points of view, and maybe we don’t even want to try if the point of view opposes our own. This cognitive bias is our tendency to attribute to ourselves the best of motives and attribute to our opponents…well, certainly not the best!  Are they just stupid?  Or are they deliberately lying?  Or are they even conspiring to cover up the truth?  How else could they think such a thing that is (from our point of view) so obviously wrong?

Although Julia Galef deals only with the first of this trio in her video, you’ll readily see why I group them together: a major means of countering them is the same.  She urges us to try to be open and curious – not to block out opposing views, and certainly not to react with defensive hostility.  This advice certainly fits what we aim for in TOK as we explore the effect of different perspectives on what we claim to know.

Facts, feelings, and TOK

In selecting these two resources to recommend today, I confess that I’m not thinking only of material for planning classes.  I’m thinking of the continuing education we need ourselves as teachers, and the encouragement we deserve in this educational task we’ve taken on.  At some moments and in some contexts, it feels particularly challenging to lead students to think critically and understand careful methodologies of building knowledge! Any boost we can get from good resources is welcome!  For ourselves rather than just for our students, I’ll close with some comments from Hans Rosling – not for the facts he gives, but for the feelings that frame them.

“Most important of all, we should be teaching our children humility and curiosity.

“Being humble, here, means being aware of how difficult your instincts can make it to get the facts right.  It means being realistic about the extent of your knowledge.  It means being happy to say ‘I don’t know.’ It also means, when you do have an opinion, being prepared to change it when you discover new facts.  It is quite relaxing being humble, because it means you can stop feeling pressured to have a view about everything, and stop feeling you must be ready to defend your views all the time. 

“Being curious means being open to new information and actively seeking it out.  It means embracing facts that don’t fit your worldview and trying to understand their implications.  It means letting your mistakes trigger curiosity instead of embarrassment.  ‘How on earth could I be so wrong about that fact?  What can I learn from that mistake?  Those people are not stupid, so why are they using that solution?’  It is quite exciting being curious, because it means you are always discovering something interesting.”  (249)

Humble and open, curious and active in seeking knowledge:  this sounds to me like a fine recipe for maximum enjoyment in learning and knowing.  I think I’ll leave Hans Rosling the last word!

Resources

Julia Galef, “Why you think you’re right — even if you’re wrong”, TED.  August 8, 2016.  https://www.youtube.com/watch?v=w4RLfVxTGH4

Hans Rosling with Ola Rosling and Anna Rosling Rönnlund, Factfulness: Ten Reasons We’re Wrong About the World – and Why Things Are Better Than You Think.  Flatiron Books, New York, 2018.