Saturday, December 3, 2011

Daniel Kahneman

I first heard Nobel-prize-winning psychologist Daniel Kahneman at TED a couple of years ago, when he talked about the difference between experience and the memory of experience (see my post here and watch the TED video here).  Kahneman is a great thinker, and delightful to listen to.  He introduces ideas in a gentle, intuitive way, taking simple concepts and formalizing them.  As you reflect on these ideas, their significance grows and sheds light on other things you've observed.  So I quickly signed up for his talk in the Rotman Speaker Series this week.

In this talk, Kahneman was presenting ideas from his book Thinking, Fast and Slow.  He described two modes of thinking, which he calls System 1 and System 2.

System 1 is all about fast thinking; in the blink of an eye we exploit our past experience, our intuitive reactions and our emotions to come to an answer.  We make all kinds of decisions without even knowing we're making them, using our automatic thinking processes.  It's good to have System 1 thinking, because it would be too exhausting to think through everything with System 2.  For instance, we unconsciously and automatically judge a person's mood from their expression.  We handle the myriad decisions around driving a car in a sort of auto-pilot mode.  When someone says 2+2, we immediately respond 4.

System 2 is more deliberative and thoughtful.  It is more logical.  We are aware that we are thinking.  But it takes much more energy and time.  We would use System 2 to come up with an answer to 17x24. 

Kahneman points out that people are lazy, and we often use the effortless System 1 to answer a question, avoiding the thoughtful effort of thinking it through with System 2.  Take the old chestnut question about a bat and a ball:
A bat and a ball together cost $1.10.
The bat costs $1.00 more than the ball.
How much does the ball cost?
Most people whip out the answer ten cents, without taking time to check it, and that answer is patently wrong: if the ball cost ten cents, the bat would cost $1.10 and the pair would cost $1.20.  (The ball actually costs five cents.)  Ten cents is a System 1 answer, given instinctively and without thinking.  Since System 1 thinking is easy, we often don’t apply System 2 thinking when we should, and we don’t even use it to check our System 1 conclusions.  Kahneman pointed out that giving the wrong answer to that question is not a case of ignorance – even students at MIT get it wrong about 50% of the time, and at some universities up to 85% get it wrong.
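The arithmetic behind the puzzle is easy to check mechanically.  Here's a little Python sketch of my own (not from the talk) that plays both roles – first the hasty System 1 guess, then the System 2 algebra – working in cents to keep the numbers exact:

```python
# Bat-and-ball puzzle, in cents to avoid floating-point noise.
total = 110   # bat and ball together cost $1.10
diff = 100    # the bat costs $1.00 more than the ball

# The intuitive System 1 answer: ball = 10 cents.
ball = 10
bat = ball + diff
print(bat + ball)   # 120 -- that's $1.20, not $1.10, so ten cents is wrong

# The System 2 answer: ball + (ball + diff) = total, so ball = (total - diff) / 2.
ball = (total - diff) // 2
print(ball)         # 5 -- the ball costs five cents
```

The point, of course, is that nobody needs a computer for this – the check is trivial once you bother to run it, which is exactly what System 1 tempts us to skip.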

When a question is really hard, and demands effortful System 2 thinking, we often substitute an easier question and use System 1 to answer that question.  He described an experiment where students were asked “How happy are you?”  It’s a fairly hard question to come up with an answer to how happy you are because it involves so many factors.  After answering this question, students were asked “How many dates did you have last month?”  This is a relatively easy, quantitative question with a single answer. When asked in this order, there was no correlation between the answers to the two questions.  However, if the students were asked first about the number of dates and then about how happy they were, there was a high correlation.  The students were using number of dates as a proxy for the happiness question – in other words switching to an easier question and using System 1 to answer it.

System 1 is extremely weak at dealing with statistics – it prefers to deal in stories.  And the more coherent the story, the more (unfounded) confidence we have in our System 1  conclusions.  Kahneman described an experiment where people were asked the following two questions before taking a trip, around a time when there had been significant news about terrorist activities:
How much would you pay for travel insurance that pays $100,000 in case of death for any reason?
How much would you pay for travel insurance that pays $100,000 in case of death in a terrorist incident?
People are willing to pay more for the second type of insurance, even though death in a terrorist incident is just one kind of death for any reason – the second policy covers strictly less than the first, so paying more for it is not a rational decision.  The explanation is that System 1 thinking is involved in the second question – the words ‘terrorist incident’ arouse emotions that produce an intuitive System 1 response, overriding the System 2 comparison we should be making.

Kahneman told a cute story about himself.  He ran into a colleague in a small hotel in Australia, and was greatly surprised at the coincidence.  What’s the probability, after all?  Two weeks later he met the same colleague at the theatre in London.  Clearly the second incident had even lower probability.  But Kahneman observed he was less surprised in London – he’d already laid down the experience that he tended to meet this colleague in unusual places.  His reaction was based on System 1.

People in marketing understand this dichotomy.  For better or for worse, appealing to System 1's intuition and emotion can be more compelling than appealing to System 2's logic.  The same is true for fundraising: a pure System 2 appeal is not likely to be effective for most people.

For me, Kahneman’s message was rather discouraging.  It put more scientific evidence around what we often observe when people make decisions solely on emotion, undeterred by facts, whether in response to a politician or a commercial.  When Kahneman was asked how one might train children to be more reflective and use System 2 thinking more often, his only suggestion was to lead by example.  So, get out there and use System 2 thinking as often as possible.
