
  HOW TO THINK

Alan Jacobs is the Distinguished Professor of the Humanities at Baylor University. He has written extensively for The Atlantic, the Wall Street Journal, The New Atlantis, and Harper’s, and is the author of several books, including a well-received biography of C. S. Lewis and a book on the pleasures of reading.

  Find him on Twitter @ayjay. http://blog.ayjay.org/

  ALSO BY ALAN JACOBS

  The Narnian: The Life and Imagination of C. S. Lewis (2005)

  Original Sin: A Cultural History (2008)

  The Pleasures of Reading in an Age of Distraction (2011)

  The Book of Common Prayer: A Biography (2013)

  HOW TO THINK

  A GUIDE FOR THE PERPLEXED

  ALAN JACOBS

  First published in Great Britain in 2017 by

  PROFILE BOOKS LTD

  3 Holford Yard

  Bevin Way

  London

  WC1X 9HD

  www.profilebooks.com

  First published in the United States of America in 2017 by Convergent Books, an imprint of the Crown Publishing Group, a division of Penguin Random House LLC

  Copyright © Alan Jacobs, 2017

  The moral right of the author has been asserted.

  All rights reserved. Without limiting the rights under copyright reserved above, no part of this publication may be reproduced, stored or introduced into a retrieval system, or transmitted, in any form or by any means (electronic, mechanical, photocopying, recording or otherwise), without the prior written permission of both the copyright owner and the publisher of this book.

  A CIP catalogue record for this book is available from the British Library.

  eISBN 978 1 78283 406 9

  To the students and faculty of the Honors College at

  Baylor University

  CONTENTS

  Introduction

  one: Beginning to Think

  two: Attractions

  three: Repulsions

  four: The Money of Fools

  five: The Age of Lumping

  six: Open and Shut

  seven: A Person, Thinking

  Conclusion: The Pleasures and Dangers of Thinking

  Afterword: The Thinking Person’s Checklist

  Acknowledgments

  HOW TO THINK

  INTRODUCTION

  Why we’re worse at thinking than we think

  “What were you thinking?” It’s a question we ask when we find someone’s behavior inexplicable, when we can’t imagine what chain of reasoning could possibly lead to what they just said, or did. But even when we’re not at the point of exasperation, we can still find ourselves wondering where our friends and family and neighbors got such peculiar ideas. And it might even happen, from time to time, in the rare quiet hours of our lives, that we ask how we got our own ideas—why we think the way we do.

  Such matters strike me as both interesting and important: given the questions that constantly confront us as persons and societies, about health and illness, justice and injustice, sexuality and religion, wouldn’t we all benefit from a better understanding of what it means to think well? So in the past few years I’ve read many books about thinking, and while they offer varying and in some cases radically incompatible models of what thinking is, there’s one trait all of them share: they’re really depressing to read.

  They’re depressing because even when they don’t agree on anything else, they provide an astonishingly detailed and wide-ranging litany of the ways that thinking goes astray—the infinitely varied paths we can take toward the seemingly inevitable dead end of Getting It Wrong. And these paths to error have names! Anchoring, availability cascades, confirmation bias, the Dunning-Kruger effect, the endowment effect, framing effects, group attribution errors, halo effects, ingroup and outgroup homogeneity biases, recency illusions ... that’s a small selection, but even so: what a list. What a chronicle of ineptitude, arrogance, sheer dumb-assery. So much gone wrong, in so many ways, with such devastating consequences for selves and societies. Still worse, those who believe that they are impeccably thoughtful turn out to be some of the worst offenders against good sense.*

  So surely, I think as I pore over these books, it’s vital for me (for all of us) to get a firm grip on good thinking and bad, reason and error—to shun the Wrong and embrace the Right. But given that there appear to be as many kinds of mental error as stars in the sky, the investigation makes me dizzy. After a while I find myself asking: What are these people even talking about? What, at bottom, is thinking?

  THINKING IN ACTION: AN EXAMPLE

Imagine that you and your partner are buying a car. You’re not a pure impulse buyer, so you’re not going to choose on appearance alone (unless, of course, a car is so hideously ugly that you’d be ashamed to be seen in it). You know that there are many factors to keep in mind, and you try to remember what they all are—gas mileage, reliability, comfort, storage space, seating, sound system. Do we need extra features, like a GPS? you might ask. How much more would it cost to have that installed?

  A checklist helps, but it’s not going to tell you which items on the list should have greater priority and which less. Maybe you’d say in general that comfort is more important than gas mileage, but what if the car’s an absolute guzzler? That could be a deal breaker.

  Anyway, here you are at the used car lot. This blue Toyota looks nice, and the reviews on the major websites are positive. You look it over, you sit in it and consult your lumbar region: Everything feel pretty good down there? You take it for a test drive and it seems to you that the ride is a little rough, though it could be that you’re paying too much attention and have made yourself oversensitive, like the princess in “The Princess and the Pea.” You try to factor in that possibility.

  You go through this ritual three or four times and then you make your decision, which you’re relatively pleased with until you get home and your partner comments that the obviously best choice would have been the one you ruled out at the beginning because you thought it looked hideous, at which point you reflect that maybe you shouldn’t have tried to make this decision on your own.

  This is what thinking is: not the decision itself but what goes into the decision, the consideration, the assessment. It’s testing your own responses and weighing the available evidence; it’s grasping, as best you can and with all available and relevant senses, what is, and it’s also speculating, as carefully and responsibly as you can, about what might be. And it’s knowing when not to go it alone, and whom you should ask for help.

  The uncertainties that necessarily accompany predicting the future—not only do you not know what will happen but you don’t even know how you’ll feel about what happens, whether you’ll eventually stop noticing that uncomfortable seat or will want to drive the car off a cliff because of it—mean that thinking will always be an art rather than a science. (Science can help, though; science is our friend.)

  My father had an almost unerring ability to buy bad cars, for a simple reason: He never actually thought about it. He acted always on impulse and instinct, and his impulses and instincts, like mine and yours, weren’t very reliable. But he liked acting impulsively, and I believe he would rather have owned a lousy car than devoted research and planning to the task of purchasing one. (Verily, he had his reward.) But I was always annoyed with him because it seemed obvious to me that buying a decent automobile isn’t that hard. Yes, no matter what you do, you can end up with a lemon, but with due diligence you dramatically reduce the likelihood of that happening. It’s a matter of observing the percentages and refusing to heed your immediate impulses—a bit like playing poker, in that respect.

The problem is, as things-we-think-about go, buying a car is one of the simpler and more straightforward cases. It contains all the key elements, but it’s considerably less complicated than the issues and questions—political, social, religious—that really befuddle us and set us at odds with our fellow residents of this vale of tears. If everything we have to think about were as easy as buying a car, then I’d need only to write a blog post or a few tweets to set us all on the right path. Instead, I’ve had to write this book.

  SPEED KILLS

  A few years ago, the eminent psychologist Daniel Kahneman summarized a lifetime of research into cognitive error in a big book called Thinking, Fast and Slow, and near the end of that book he came around to the really central question: “What can be done about biases? How can we improve judgments and decisions, both our own and those of the institutions that we serve and that serve us?”

  To which he replies: “The short answer is that little can be achieved without a considerable investment of effort.” Well, that’s fine; after all, we’d all be happy to invest a great deal of effort to rid ourselves of biases that deform our thinking, would we not? But as Kahneman continues, the news gets worse. A considerable part of our thinking apparatus, the part that generates our immediate intuitions, “is not readily educable. Except for some effects that I attribute mostly to age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues.” This is not encouraging.

  It is perhaps presumptuous of me to think that I can offer a more hopeful picture than the great Daniel Kahneman, but I truly believe that there are some insufficiently explored ways to understand and ameliorate the problems we have in thinking. We have thought too much in recent years about the science of thinking and not enough about the art. There are certain humanistic traditions, some of them quite ancient, that can come to our aid when we’re trying to think about thinking, and to get better at it.

But it would be foolish to neglect what people like Kahneman have studied and learned. In the passage I just quoted, he talks about “intuitive thinking”: this is the “fast” kind. It’s what provides us with snap judgments, instantaneous reads on a given situation, strong predispositions toward approving some ideas and disapproving others. Kahneman calls this System 1, and says that it is supplemented and sometimes corrected by System 2, which is conscious reflection. We go through life basically running System 1; System 2 kicks in only when we perceive a problem, an inconsistency, an anomaly that needs to be addressed. This is why another psychologist who has researched thinking, Jonathan Haidt, uses a different set of terms when he’s describing essentially the same distinction: he thinks of intuitive thinking as an elephant, and conscious decision-making as the rider. The idea is that our intuitive thinking is immensely powerful and has a mind of its own, but can be gently steered—by a rider who is truly skillful and understands the elephant’s inclinations. It’s a hopeful image, and indeed Haidt is more cheerful about the possibility of better thinking than Kahneman is.

  In this book I will be writing largely about the rider rather than the elephant, System 2 rather than System 1. I will certainly draw a great deal from major scholars like Kahneman and Haidt, and several others, but I will also suggest that they do not always frame our problems with thinking in the most useful and constructive ways. In particular I’m going to argue that we go astray when we think of our task primarily as “overcoming bias.” For me, the fundamental problem we have may best be described as an orientation of the will: we suffer from a settled determination to avoid thinking. Relatively few people want to think. Thinking troubles us; thinking tires us. Thinking can force us out of familiar, comforting habits; thinking can complicate our lives; thinking can set us at odds, or at least complicate our relationships, with those we admire or love or follow. Who needs thinking?

  Moreover, conscious thinking is, as Kahneman indicates in his book’s title, slow. Jason Fried, the creator of the popular project-management software Basecamp, tells a story about attending a conference and listening to a talk. He didn’t like the talk; he didn’t agree with the speaker’s point of view; as the talk went on he grew more agitated. When it was over, he rushed up to the speaker to express his disagreement. The speaker listened, and then said: “Give it five minutes.”*

  Fried was taken aback, but then he realized the point, and the point’s value. After the first few moments of the speaker’s lecture, Fried had effectively stopped listening: he had heard something he didn’t agree with and immediately entered Refutation Mode—and in Refutation Mode there is no listening. Moreover, when there is no listening there is no thinking. To enter Refutation Mode is to say, in effect, that you’ve already done all the thinking you need to do, that no further information or reflection is required.

  Fried was so taken by the speaker’s request, he adopted “Give it five minutes” as a kind of personal watchword. It ought to be one for the rest of us too; but before it can become one, we should probably reflect on the ways that our informational habits—the means (mostly online means) by which we acquire and pass on and respond to information—strongly discourage us from taking even that much time. No social-media service I know of enforces a waiting period before responding, though Gmail allows you to set a delay in sending emails, a delay during which you can change your mind and “unsend.” However, the maximum delay allowed is thirty seconds. (Twenty-four hours might be more useful.)*

  Does it seem to you that I’m exaggerating the problem? Or just blaming social media? Could be. But as soon as I read Fried’s anecdote I realized that I too am regularly tempted to enter Refutation Mode—and the more passionate I feel about a topic, the more likely I am to succumb to that temptation. I know what it’s like to become so angry at what someone has written online that my hands shake as they hover over the keyboard, ready to type my withering retort. Many are the tweets I wish I could take back; indeed many are the tweets I have actually deleted, though not before they did damage either to someone else’s feelings or to my reputation for calm good sense. I have said to myself, If I had just thought about it I wouldn’t have sent that. But I was going with the flow, moving at the speed of the social-media traffic.

  Maybe you’re confident that you’re not like that. But before you dismiss the possibility, why don’t you just give it five minutes?

  CONSENSUS AND EMOTION

  It could be coincidence, or synchronicity, or fate; but sometimes there’s a blessed convergence between what you read and what you need. A few months ago I happened to be reading, for unrelated reasons, essays by two wise writers, Marilynne Robinson and T. S. Eliot. And I happened to be reading them at a moment when I was undertaking a serious reassessment of the time I was spending online, especially on social media. That was when the idea for this book began to coalesce in my mind.

  In a 1994 essay called “Puritans and Prigs,” Robinson challenges the contemptuous attitudes many people have toward the Puritans—the very word is no more than an insult now—and gives a more generous and accurate account of what they thought and why they thought it. In the writing of the essay it occurred to her that “the way we speak and think of the Puritans seems to me a serviceable model for important aspects of the phenomenon we call Puritanism.” That is, the kinds of traits we label “puritan”—rigidity, narrowness of mind, judgmentalism—are precisely the ones people display whenever they talk about the Puritans.*

And why is this? Why are people so puritanical about the Puritans? “Very simply,” Robinson writes, “it is a great example of our collective eagerness to disparage without knowledge or information about the thing disparaged, when the reward is the pleasure of sharing an attitude one knows is socially approved.” That is, we deploy the accusation of Puritanism because we know that the people we’re talking to will share our disparagement of Puritanism, and will approve of us for invoking it. Whether the term as we use it has any significant relationship to the reality of Puritan actions and beliefs is totally irrelevant. The word doesn’t have any meaning as such, certainly not any historical validity; it’s more like the password to get into the clubhouse.

  Robinson further comments that this kind of usage “demonstrates how effectively such consensus can close off a subject from inquiry,” which may be the most important point of all. The more useful a term is for marking my inclusion in a group, the less interested I will be in testing the validity of my use of that term against—well, against any kind of standard. People who like accusing others of Puritanism have a fairly serious investment, then, in knowing as little as possible about actual Puritans. They are invested, for the moment anyway, in not thinking.

  Robinson’s analysis is acute, and all the more so given that it was written before the Internet became a culturewide phenomenon. Why would people ever think, when thinking deprives them of “the pleasure of sharing an attitude one knows is socially approved”—especially in an online environment where the social approval of one’s attitudes is so much easier to acquire, in the currency of likes, faves, followers, and friends? And to acquire instantaneously?

  Robinson concludes this reflection with the sobering comment that in such an environment “unauthorized views are in effect punished by incomprehension,” not because we live in a society of conscious and intentional heresy hunters, though to some extent we do, “but simply as a consequence of a hypertrophic instinct for consensus.” If you want to think, then you are going to have to shrink that “hypertrophic instinct for consensus.” But given the power of that instinct, it is extremely unlikely that you, dear reader, are willing to go to that trouble.

  That instinct for consensus is magnified and intensified in our era because we deal daily with a wild torrent of what claims to be information but is often nonsense. Again, this is no new thing. T. S. Eliot wrote almost a century ago about a phenomenon that he believed to be the product of the nineteenth century: “When there is so much to be known, when there are so many fields of knowledge in which the same words are used with different meanings, when everyone knows a little about a great many things, it becomes increasingly difficult for anyone to know whether he knows what he is talking about or not.” And in such circumstances—let me add emphasis to Eliot’s conclusion—“when we do not know, or when we do not know enough, we tend always to substitute emotions for thoughts.”*