Entries categorized "Cognitive Bias"

The Myth of the Rational Judge Lives

I think that even most economists now reject the theory of the rational actor as a myth.  Do lawyers still follow a version of this myth?  Orin Kerr has a post about judging that makes me think he believes in rational judges.

Do judges really objectively observe these things called "facts" and "legal arguments"?  Is there any empirical support for the position that judges rationally weigh legal and factual arguments?  And that a judge's biases, background, and perceptions don't inform how he views facts and arguments?

If so, I would like to see this research, as it would be a groundbreaking refutation of David Hume's theory of practical reason!


Cognitive Dissonance, Racial Profiling, and Right-Wing Extremists

Shouldn't all of these statements be true (or false)?

  • Black people commit more crimes than whites.  Therefore, racial profiling should be allowed.
  • Muslim males were responsible for the first World Trade Center bombing.  They were also responsible for 9/11.  Therefore, religion-based profiling should be allowed; as should racial profiling.
  • Tim McVeigh blew up the federal building in Oklahoma City.  McVeigh was white, and was a right-wing radical.  Therefore, racial and viewpoint profiling should be allowed.

It all seems very simple to me.  Yet right-wingers are outraged over a recent report involving profiling.

In fairness to the Right: Who wants to bet that the Department of Homeland Security will publish a report stating that mosques should be monitored? 

Yet another reason I hate everyone.  Either we racially profile, or we don't.  You people on the Left can't say that we should monitor right-wing radicals if you won't accept that we should monitor Muslim males.  You people on the Right can't say that we should monitor Muslims but refuse to monitor right-wing radicals.

Free yourselves from the shackles of partisanship-induced cognitive dissonance!


Gary Condit and Chandra Levy: Undone by Randomness and Confirmation Bias

Gary Condit was a Congressman who had an affair with a 23-year-old intern, Chandra Levy.  He was about to break off the affair.  Ms. Levy was upset, and was considering going public.  Later, she disappeared.

Gary Condit must have done it, right?  How could he have not?  The narrative makes perfect sense.

Wrong (h/t):

Authorities in Washington, D.C. may be close to an arrest in the murder of former government intern Chandra Levy, a case that made headlines, and brought down a congressman eight years ago.

There are reports that D.C. police have submitted evidence to the U.S. Attorney's Office in an effort to get an arrest warrant for a man identified as Ingmar Guandique.

He's behind bars, convicted of assaulting two women jogging in Washington's Rock Creek Park.

How many narratives make perfect sense?  How often do we see patterns that do not exist?  How often are we fooled by randomness?

Also, might the police have found Ms. Levy's killer earlier if they hadn't concluded that Gary Condit was somehow involved?  What evidence do we overlook once our minds have already been made up, once we've already reached a conclusion? 

I don't pimp books on cognitive bias because I think those books are fun and games.  Some are collections of graduate-school-level articles.  It ain't popcorn for the mind.  Ignorance of the biases that blind our thinking can have devastating effects.  People lose their jobs and go to prison based on simple thinking errors.

If someone had said, "Hey, maybe we should look at the evidence as if Gary Condit were not involved in Levy's death," might they have found her killer sooner?  How much more damage did Levy's killer cause while investigators focused on Condit?


Why I Want to Finish in Third Place

Would you rather finish in second or third place?  Would you rather win a silver medal or a bronze medal?  The right answer seems intuitive.  What's there to think about?  A silver medal is objectively better than winning a bronze medal.  So what chump would want to finish in third place!

The answer, surprisingly, is: the chump who wants to be happy.  If you want to be happy, third place is preferable to second place.  The second-place winner is consumed by what he missed out on - first place!  The third-place winner is consumed by what he almost missed out on - a medal at all!

This idea is explored in, "When Less is More: Counterfactual Thinking and Satisfaction among Olympic Medalists":

Research on counterfactual thinking has shown that people's emotional responses to events are influenced by their thoughts about "what might have been." The authors extend these findings by documenting a familiar occasion in which those who are objectively better off nonetheless feel worse. In particular, an analysis of the emotional reactions of bronze and silver medalists at the 1992 Summer Olympics—both at the conclusion of their events and on the medal stand—indicates that bronze medalists tend to be happier than silver medalists. The authors attribute these results to the fact that the most compelling counterfactual alternative for the silver medalist is winning the gold, whereas for the bronze medalist it is finishing without a medal. Support for this interpretation was obtained from the 1992 Olympics and the 1994 Empire State Games. The discussion focuses on the implications of endowment and contrast for well being.

The rest of the paper is here. That paper and many others are contained in the fabulous Heuristics and Biases: The Psychology of Intuitive Judgment (here).

The paper helps explain why I'm so happy, and why I don't have a high desire for status.  I know what not winning any medal at all is like.  I grew up in a family of six with an annual household income of $10,000.  Each day I feel very lucky to do what I do.


Amazon.com Pricing Scam? Price Goes Up After Book Put Into Wish List?

This could be my imagination.  I could be fooled by randomness.  I don't know.  Right now, I want answers. (UPDATE: In contact with Amazon.com people.  Will report my results.)

I added Heuristics and Biases: The Psychology of Intuitive Judgment to my wish list.  It cost $37 at the time.

A day or two later, the price had increased to $40.  I still didn't buy it.  A couple of days later, the price increased to $45.90.  After a week of the price remaining constant, I ordered the book.  This was on Friday.

Today I went on Amazon and saw that the book now costs $36.72.  I'm livid.  Friday's order has already been processed, so I can't cancel it.

Do they have some sort of pricing algorithm that increases the prices of items placed in your wish list?  In other words: To get me interested, show me a lower price.  Then, once I show interest by adding the item to my Wish List, increase the price?
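
Here's that accusation written out as a rough Python sketch - pure speculation on my part, since I obviously have no idea how Amazon actually prices anything.  Every name and number below is made up just to restate the hypothesis in code form:

    # The suspected wish-list pricing rule, as a purely hypothetical sketch.
    # Nothing here reflects Amazon's actual systems; the function and numbers
    # are invented to illustrate the accusation.

    def suspected_price(base_price, in_wish_list, days_since_added):
        """Hypothetical rule: show a low price to attract interest, then raise
        the price once the shopper signals interest via the wish list."""
        if not in_wish_list:
            return base_price                        # the lure price
        markup = min(0.25, 0.05 * days_since_added)  # creep upward, capped at 25%
        return round(base_price * (1 + markup), 2)

    print(suspected_price(37.00, in_wish_list=False, days_since_added=0))  # 37.0
    print(suspected_price(37.00, in_wish_list=True, days_since_added=2))   # 40.7
    print(suspected_price(37.00, in_wish_list=True, days_since_added=5))   # 46.25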

Has this happened to any of you?

Whatever the case, I am not pleased.  I am going to contact Amazon immediately.  At the very least, I expect to receive a $10 credit to my account.

Oh, and at $36.72, Heuristics and Biases is a freaking steal.  It's a good deal at $45.90, when you consider the hundreds of hours that go into compiling so much information.  Still, ten bucks saved is ten bucks earned.

UPDATE: Removed pending my formal investigation.


Memory Metaphor

Take a moment to reflect on whether the following statement is true or false: "Memory can be likened to a storage chest in the brain, into which we deposit material and from which we can withdraw it later if needed.  Occasionally, something gets lost from the 'chest,' and then we say we have forgotten."

Does that sound like an accurate memory metaphor?  If a chest is archaic, think of a computer.  Memories are files we store in our brain, just as we store files in a computer.  When we want to remember something, we pull the file out from our brain and review it.

Does that sound accurate?  This is not a trick question.  Have you made up your mind?

Most of you probably thought that the metaphor was accurate.  Is it?  To test it, try this thought experiment:

Close your eyes and recall a scene in which you experienced something pleasurable.  Don't read any further until you have finished replaying your experience.

Once you've replayed your experience, or re-lived your memory, scroll down....


































Did you see yourself in that scene?  Most people do.  But if you saw yourself, then you must have reconstructed that scene (unless, of course, you were looking at yourself during the original experience).


Pretty cool, huh?  Pretty scary, too, when you realize that people go to prison based on the memories of eyewitnesses.  Further implications of memory are explored in The Psychology of Judgment and Decision Making (available here).


Availability Bias and Water Landings

Out of several million commercial flights, how many emergency water landings have there been?  One.  That's it.  Statistically speaking, none of us has any real chance of drowning after an airplane we are on crashes into the ocean.  It is literally more likely that lightning will strike you several times than that you will die when a commercial airplane makes an emergency water landing.

Yet that one water landing occurred very recently, and so the availability bias comes into play.  The availability bias "is a phenomenon (which can result in a cognitive bias) in which people base their prediction of the frequency of an event or the proportion within a population based on how easily an example can be brought to mind."  Since we just had an airplane land in the water, something must be done about the "problem." And so we have this:

DALLAS (AP) -- American Airlines is limiting the number of passengers on some planes while it orders additional life rafts needed in case of a water landing like the one made this month on the Hudson River by a US Airways jet.

American will allow no more than 228 people including passengers and crew on its Boeing 767-300 aircraft, which normally holds 236 people including a crew of 11, spokesman Tim Wagner said Wednesday.

Putting aside the irrationality of preventing a non-existent problem, let's look at the cost.  Assume the average ticket price is $600.  For every flight, American Airlines will need to recoup an immediate $4,800 loss (8 fewer passengers times $600 per ticket).  Multiply that by thousands of flights, and you're talking about millions of dollars a year.
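
A quick back-of-the-envelope in Python.  The capacity figures come from the AP story above; the $600 average fare and 3,000 affected flights a year are my own illustrative assumptions:

    # Back-of-the-envelope cost of the seat cap, using the numbers quoted above.
    # The $600 average fare and 3,000 affected flights per year are assumptions
    # for illustration; the capacity figures come from the AP report.

    normal_capacity = 236     # Boeing 767-300: passengers plus a crew of 11
    capped_capacity = 228     # American's new limit
    avg_ticket_price = 600    # assumed average fare, in dollars
    flights_per_year = 3_000  # assumed number of affected flights

    seats_lost_per_flight = normal_capacity - capped_capacity           # 8 seats
    revenue_lost_per_flight = seats_lost_per_flight * avg_ticket_price  # $4,800
    annual_revenue_lost = revenue_lost_per_flight * flights_per_year

    print(f"Lost per flight: ${revenue_lost_per_flight:,}")  # $4,800
    print(f"Lost per year:   ${annual_revenue_lost:,}")      # $14,400,000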

Millions of dollars to solve a problem that does not exist?  Bad thinking is expensive.  Even at $50, this book might be the best bargain of your life.


"The Psychology of Judgment and Decision Making"

It takes a lot for a book to wow me, especially considering that I've been studying critical thinking and cognitive bias since 2000.  Yet this book, published in 1993 (!), has wowed me: The Psychology of Judgment and Decision Making.

It pulls together most of the research on cognitive bias into a slim volume.  Best of all, there is a test at the beginning of the book.  Take it, and then amaze yourself at your numerous wrong answers.  I thought for sure I'd ace the test.  Fortunately, I did not.

It's pricey new.  I was able to borrow a copy from my library through inter-library loan for less than the cost of a latte.  Highly recommended.


"The Gray Lady Is A Tramp"

I wonder if the New York Times isn't failing because it has alienated right-wing readers.  Ms. Torre, while overstating how bad the Times is, does make a good point: Why publish the McCain-lobbyist (non)scandal on the slimmest of evidence while ignoring the John Edwards love-child story - which was supported by eyewitness and video evidence?

I used to think it was a liberal conspiracy.  How embarrassing.  I now realize that cognitive bias is to blame. 

If you're liberal, the slimmest evidence will prove to you that McCain was sleeping with a lobbyist.  Even the strongest evidence would not persuade liberals that John Edwards was cheating on his wife.  Hell, people bought into the myth that Edwards' dad was a mill worker, even though his dad was a company man through-and-through.

The solution, of course, is to diversify news rooms.  Not racial or other pretend diversity, but actual intellectual diversity.  Everyone who has studied team building would agree.

People are different, and thus every team should combine "big picture" and "detail" people.  Put two big-picture folks together, and you'll have great ideas; but nothing will ever get done.  One person instructed his law partner to manage me thusly: "Let him do whatever he wants.  But do not trust him about deadlines.  Ever."  Ouch.  Put two detail-oriented people together, and no one will forget the deadline; but will there be much of a brief to file?

Even in trial practice, there should always be a "law person" and a "fact person" on each case.  Some people are better at turning facts into compelling stories.  Others are better at researching the law.  Few do both well; no one does both equally well.  A team should thus include one or more of each.

And so too should a news room have a few right-wing cranks.  Given that the newspaper industry is so ignorant of basic business management principles, it's no surprise that newspapers are going out of business.


Ultimatum, Human Rationality, and Goal Setting and Goal Attainment

People are irrational, as the game Ultimatum proves.  In Ultimatum, there are two players and a pot of booty.  Each player has total power to make one move, and there is no negotiation. 

The first player gets to propose (only once) how to split the booty between the two players.  The second player gets to accept or reject the offer.  No counter-offers are allowed.  If the second player rejects the offer, both players leave with nothing.

In theory, the second player should always accept any offer.  After all, he went into the game with nothing.  If he accepts the offer, he will leave with something.  Accepting the offer will always make the second player better off.

Of course, the second player does not accept just any offer.  Instead, an offer must be "fair" before the second player will accept it.  How is turning down an offer to become better off rational?  It's not.
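
To make the contrast between the rational-actor prediction and observed behavior concrete, here's a quick Python sketch of a single round.  The fairness-minded responder's 30% cutoff is my own illustrative assumption, not a figure from any study:

    # One round of the Ultimatum game: the proposer offers a split of the pot;
    # the responder either accepts (both keep their shares) or rejects (both get zero).

    def play_round(pot, offer_to_responder, accept_rule):
        """Return (proposer_payoff, responder_payoff) for one round."""
        if accept_rule(pot, offer_to_responder):
            return pot - offer_to_responder, offer_to_responder
        return 0, 0

    # The textbook "rational" responder accepts anything greater than nothing.
    def rational_responder(pot, offer):
        return offer > 0

    # A fairness-minded responder rejects offers below 30% of the pot.
    # (The 30% cutoff is an illustrative assumption, not a figure from the paper.)
    def fairness_responder(pot, offer):
        return offer >= 0.30 * pot

    pot, lowball_offer = 100, 10  # proposer keeps 90, offers 10

    print(play_round(pot, lowball_offer, rational_responder))  # (90, 10): accepted
    print(play_round(pot, lowball_offer, fairness_responder))  # (0, 0): rejected; both leave with nothing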

Yet does Ultimatum prove that people are irrational?  What if, using higher cognitive processes, a person could learn to accept "unfair" but economically beneficial offers?  What would that say of human rationality? 

On a concrete level, wouldn't you like your decisions to be rational?  I sure would.  Dr. Peter M. Gollwitzer, an expert on goal attainment and human motivation, suggests that we can become more rational.

In a short paper entitled "Self-Regulation in Ultimatum Bargaining: Controlling Emotion with Binding Goals," he and his colleagues explain that "participants with a goal to stay calm accepted unfair ultimatums more than participants who were not given such a goal."  The bottom-line result is that when one knows he is about to be offered an "unfair" but economically beneficial deal, he should set a goal to remain calm.  (That's called a suppression strategy, since the goal is to suppress one's emotion - in this case, anger.)

The authors hypothesize, however, that there is a better strategy, namely cognitive reappraisal.  Cognitive reappraisal "is defined as a reconstrual of an emotional stimulus, in a way that significantly lessens the unwanted emotional impact."  In other words, accentuate the positive.  Don't say, "This deal is unfair because it undervalues my work."  Instead, think, "This is an opportunity for me to increase my revenues by 15%."

In terms of New Year's Resolutions: Don't suppress the urge to eat a chocolate-chip cookie.  Instead, reappraise it.  "Turning down this cookie will allow me to look better in a swimsuit."

The entire paper (8 pages, double-spaced) is available for download here.