Entries categorized "Cognitive Bias"

FoxNews, Framing, and "Politics and the English Language"

There's been a lot of research on framing, with George Lakoff's work being the best:

According to a report at Media Matters, in August of 2009, after Fox News' Sean Hannity used the term "public option," the pollster Frank Luntz encouraged him to say "government option" instead.

"If you call it a 'public option,' the American people are split," Luntz said. "If you call it the 'government option,' the public is overwhelmingly against it."

As Lakoff notes:

Language always comes with what is called "framing." Every word is defined relative to a conceptual framework. If you have something like "revolt," that implies a population that is being ruled unfairly, or assumes it is being ruled unfairly, and that they are throwing off their rulers, which would be considered a good thing. That's a frame.

If you then add the word "voter" in front of "revolt," you get a metaphorical meaning saying that the voters are the oppressed people, the governor is the oppressive ruler, that they have ousted him and this is a good thing and all things are good now. All of that comes up when you see a headline like "voter revolt" - something that most people read and never notice. But these things can be affected by reporters and very often, by the campaign people themselves.

Liberals, of course, call people who entered the United States illegally "undocumented immigrants." These people just need a few documents, and everything will be OK. An abortion is not a termination of a pregnancy or killing a fetus: It's a choice. And so it'd be unfair to single out FoxNews. In fact, all media is corrupt. Into the Buzzsaw.

Metaphors We Live By is the denser work, but Don't Think of an Elephant is good, too. (And of course see this very blog's discussion of cognitive bias.)

At least once each year I read George Orwell's "Politics and the English Language":

Now, it is clear that the decline of a language must ultimately have political and economic causes: it is not due simply to the bad influence of this or that individual writer. But an effect can become a cause, reinforcing the original cause and producing the same effect in an intensified form, and so on indefinitely. A man may take to drink because he feels himself to be a failure, and then fail all the more completely because he drinks. It is rather the same thing that is happening to the English language. It becomes ugly and inaccurate because our thoughts are foolish, but the slovenliness of our language makes it easier for us to have foolish thoughts. The point is that the process is reversible. Modern English, especially written English, is full of bad habits which spread by imitation and which can be avoided if one is willing to take the necessary trouble. If one gets rid of these habits one can think more clearly, and to think clearly is a necessary first step toward political regeneration: so that the fight against bad English is not frivolous and is not the exclusive concern of professional writers.


Cognitive Bias and Climategate

Did a whistleblower reveal hidden documents to the public, or was a hacker responsible?  If your answer is, "That's a stupid question," then you are likely free of mind control.  If the question makes sense to you, your brain has been through too many spin cycles.

How much money do people spend on DVDs each year?  How many will balk at spending a few bucks on this?  And thus it's easy to see why America is finished as a great country.  


Credit Cards and Cognitive Bias

Tonight PBS's Frontline will air a fascinating report on how credit card companies manipulate consumers into bad deals.  Some of the stuff is illegal, although some perfectly legal tactics rely on cognitive biases like optimism bias:

"I used to use the word 'penalty pricing' or 'stealth pricing,'" [a credit card exec.] tells FRONTLINE. "When people make the buying decision, they don't look at the penalty fees because they never believe they'll be late. They never believe they'll be over limit, right? ... Our business took off. ... We were making a billion dollars a year."

More details on the special are here.  

There's a lesson for criminal defense lawyers in there, too.  How many conditions of probation should be inserted into a plea agreement?  Oh, of course every client will follow every term and condition of probation.  Next thing you know, you're saying, "How could you have been so stupid as to not do your community service?"  Well, how could you have been so optimistic that he would?

Optimism bias means - in law and in life - that one should be bound to as few terms as possible, since it's not likely that one is really going to do everything he says he will.  Always ask, "What's the worst that can happen?" rather than, "How many things can I take on?"  It's a simple framing maneuver.  

And don't worry that so much negativity will keep you from taking on any obligations: optimism bias will usually see to that.  One just needs to even the scales.

Reframing the issue away from optimism bias is also a procrastination killer.  "I'll get to it tomorrow" presupposes that you will be able to.  Maybe you'll get sick; get into a car wreck; need to attend to a family emergency.  If you truly can't get to it until tomorrow, fine.  

Don't put it off, though, just because you assume that you'll be around tomorrow.  I just had some sort of super flu.  Missed zero deadlines and asked for zero extensions because I don't procrastinate - even though I am naturally a procrastinator.  If I have time to do something, I do it immediately because there's no guarantee I'll be around to do it tomorrow.  

Probably there's a life lesson in there, too.  Do you assume your friends, family, and pets will be here tomorrow?  Why?  People die unexpectedly so often that it's to be expected.  

Recognizing that the sun might not come out tomorrow is often the way to ensure sunny days today.


Thinking Like a Scientist

Science is not supposed to be about the memorization of trivia. Instead, science is supposed to be about method, about process – most of all, about thinking. Most scientists, sadly, are weak problem solvers.

Scientists amass a great amount of trivia – often life-saving trivia. Few of them amassed that trivia through anything other than rote memorization. Scientists learn through an act of brutal memorization. The latest poster boy for scientists as ignoramuses is Dan Carey, Ph.D., an assistant professor of exercise physiology at the University of St. Thomas in Minnesota.

§§§

Pop quiz: Would you prefer 50% of x or 100% of y? Think about it for a minute.

If you answered the question, you’re not thinking. You can’t say what you’d prefer unless you know the values of x and y. For example: Would you rather earn 50% of $100,000 or 100% of $10,000? Cast concretely, the question is retarded. It’s not even a choice. And yet a scientist who receives taxpayer money to fund research - and tells people how to live their lives - proves that he’s not thinking.

§§§

In a New York Times piece about weight loss, Dr. Carey and others seem perplexed that people who exercise at the pace of a snail don’t lose fat. You’d think that’d be a silly mystery. If there are 3,500 calories in a pound of fat, and a person burns only 200 calories a day “exercising,” how much weight loss would one expect? Yet dogma clouds clear thinking.
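To make the arithmetic explicit, here is a quick back-of-the-envelope sketch in Python (the 3,500-calorie and 200-calorie figures are the illustrative numbers above; diet is ignored entirely):

```python
# Back-of-the-envelope: expected weekly fat loss from light exercise alone,
# ignoring diet. The figures are the illustrative numbers from the text.
CALORIES_PER_POUND_OF_FAT = 3500
calories_burned_per_day = 200      # a slow, "fat-burning zone" session

weekly_deficit = calories_burned_per_day * 7                    # 1,400 calories
pounds_per_week = weekly_deficit / CALORIES_PER_POUND_OF_FAT    # 0.4 lb

print(f"Expected loss: {pounds_per_week:.1f} lb per week")
```

Less than half a pound a week, before counting a single replaced calorie. Hardly a mystery.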

There is a dogma in exercise science that one who wants to lose fat should train in the so-called “fat-burning zone.” Here is the logic: At a low intensity (think brisk walk), a person burns mostly fat calories. Therefore, a person who wants to burn fat should train in the fat-burning zone. Here is how Dr. Carey puts it:

“If you work out at an easy intensity, you will burn a higher percentage of fat calories” than if you work out at a higher intensity, Carey says, so you should draw down some of the padding you’ve accumulated on the hips or elsewhere — if you don’t replace all of the calories afterward.

Return to the pop quiz. How does the fat-burning zone – in isolation – make any sense?

§§§

One could burn a greater percentage of calories from fat while actually burning less fat. I would rather have 50% of $100,000 than 100% of $10,000, because a smaller percentage of a bigger number can be the greater amount.

Now we can see why Carey can’t think. In an hour-long hard workout (where my pulse stays between 160 and 180), I might burn 800 calories. What if only 50% of those calories come from fat? Isn’t that superior to burning 100% of calories from fat if I’m burning only 200 total fat calories in his weak workout?
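The same point as a minimal sketch, using the illustrative numbers above (an 800-calorie hard hour at 50% fat versus a 200-calorie easy hour at 100% fat):

```python
# Percentage of calories from fat is meaningless without the total.
# The numbers are the illustrative figures from the text.
def fat_calories(total_calories: float, fraction_from_fat: float) -> float:
    """Absolute fat calories burned in a workout."""
    return total_calories * fraction_from_fat

hard_hour = fat_calories(800, 0.50)   # hard workout: 50% of 800  -> 400 fat calories
easy_hour = fat_calories(200, 1.00)   # easy workout: 100% of 200 -> 200 fat calories

print(hard_hour, easy_hour, hard_hour > easy_hour)   # 400.0 200.0 True
```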

Clearly 400 is greater than 200. Yet Carey, like most scientists, cannot think. Instead, he tosses around dogma – as if saying “the fat-burning zone” does anything other than make the person seem like a fool.

If Carey were just another idiot with an opinion, we’d pass over his opinion in silence. Carey, however, is emblematic of the scientific profession. Why think when you can rely upon dogma?

§§§

Thus you can see that although this is a post about exercise, it has nothing at all to do with exercise.


Money and Cognitive Bias

Bankrate.com has a fantastic article entitled "12 Hair-Raising Money Tales."  Some of the tales involve banker corruption, deceit, and so-called gotcha capitalism.  Many money mistakes involve decisions, and decisions are subject to cognitive bias.  Take this example: 

In 1995, my husband's employer in Pensacola, Fla., sold to a company in Bowling Green, Ky. The economy was strong, and we were enjoying our Florida home and thinking about retirement. But my husband took a consulting position with the Kentucky company.

We took equity from our residence in Florida and bought a small home on a large lake in Kentucky. We planned to downsize, sell the Florida home and travel after my husband completed his work.

Isn't that typical?  That's optimism bias. "Optimism bias is the demonstrated systematic tendency for people to be over-optimistic about the outcome of planned actions. This includes over-estimating the likelihood of positive events and under-estimating the likelihood of negative events. It is one of several kinds of positive illusion to which people are generally susceptible."  

Someone gets a promotion: Don't just celebrate with a nice dinner.  Buy a new car!  A new house!  Spend right up to your new salary.  Live as if that new salary is always going to be there.

Let's put aside the spiritually-draining concept of supersizing one's spending.  If you're mostly happy with your current salary, why buy more stuff just because you have more money?  Your lifestyle was fine before your raise.  Plus, money is freedom.  Save enough money that you can tell your boss to go to Hell.  Watch your job-related stress fall.  Oh, you thought it was a coincidence that your boss always tells you to buy a house or to upgrade your car?    

I see optimism bias with friends my age.  Why are you in such a hurry to get into a house?  Besides the fact that owning a house makes one a debt slave: Why do you think you'll be able to keep paying the mortgage?  Is there a guarantee that the money will always be there?  Unless your parents are rich, what's your safety net?  No one needs a safety net, though.  Optimism bias.

Most people, though, underestimate the likelihood of negative outcomes.  It can't happen to me.  Yet another aspect of our culture of narcissism.


Games for the Brain

I could go on about neuroplasticity, cognitive enhancement, improving brain cross-lateralization, or whatever.  If you'd find that persuasive, then you would not, by definition, need to be sold on this most excellent site: http://www.gamesforthebrain.com/  It is interesting, though, how playing a few daily rounds of Rotate has changed the way I look at buildings and other structures.  Cool site.


The Elitism Bias/In-Group Bias

The financial crisis has given us insight into a bias that blinds the elite.  Ask yourself: How furious were you to learn the details of the bailouts?  Unless you're reading this blog from a Goldman Sachs network, you were likely outraged.  

How is it, then, that even progressives like Barack Obama were unable to anticipate the populist outrage? Cognitive psychology has the answer.  It's the in-group bias:

In-group bias is the preferential treatment people give to those whom they perceive to be members of their own groups.

Experiments in psychology have shown that group members will award one another higher pay-offs even when the "group" they share seems random and arbitrary, such as having the same birthday, having the same final digit in their U.S. Social Security Number, or even being assigned to the same flip of a coin.

We all have these in-group biases.  Are you white?  Black?  A lawyer?  A judge?  Let me make posts critical of lawyers, judges, blacks, or whites while measuring your blood pressure.  It's going to go up.  "How dare he attack my people?"  We are herd animals, and so an attack on those within our in-group is an attack on ourselves.  

More interesting is that the bias is so subtle that it destroys thinking without our knowing that it's destroying our thinking.  Think about it.  How could Obama not know that the bailouts would destroy his credibility?

Obama and his advisors have pollsters monitoring public opinion.  He has an army of Ph.D.s.  Yet none of his best and brightest were able to anticipate the populist outrage resulting from the bailouts.  How could they have missed this?  The in-group bias is so strong that it literally prevents you from thinking critically.  You don't even know what questions to ask in those expensive polls.

Robin Hanson, at the must-read Overcoming Bias, has a great post on the bias of the elite.  He lists two anecdotes involving well-qualified people being excluded from leadership positions for lacking elite credentials.  He concludes:

So it seems the US has a finance and policy elite defined by college ties and related social connections, an elite with a strong sense that only people in their circle can really be trusted, and that their institutions must be saved at all cost at taxpayer expense if necessary. 

Perfect.  And doesn't that seem consistent with the bailouts?  Moreover, the bias is stronger than Hanson states.  The bias is so strong that no one was even able to anticipate how furious the public would become at what Paul Krugman aptly described as "the spectacle of government supported institutions paying giant bonuses."  

Cognitive bias is literally no different from brain damage.  The only answer to bias is true intellectual and cultural diversity.  

Obama, like anyone who makes serious decisions, needs to find a Napoleon's Corporal.  Before Napoleon gave the final command for a battle decision, he briefed a lower-ranking enlisted soldier, asking the soldier if he understood the plan.  Only if the corporal understood the plan would Napoleon move forward.

Imagine if Obama had had a Napoleon's Corporal in the room.  "Hey, man, I'm about to give billions of dollars of no-strings-attached money to Goldman Sachs, AIG, and other Wall Street investment bankers.  Ya dig?"  What corporal would have understood that plan?

Now, here is what is more perverse.  The in-group bias would prevent Obama from having the right Napoleon's Corporal.  We all know that the corporal would be a political science intern from Harvard College rather than an ag major from a community college.  And thus, we see that the in-group climbs in through the window after we've kicked it out the front door.

Four Concentric Circles

If you don't believe me, trace the outline of each circle with your mouse or finger.  Our eyes don't deceive us, since our eyes only gather sensory data.  It's our brains that do the processing.  Everyone recognizes that optical illusions fool us; few recognize that our brains make similar mistakes when thinking about politics, policy, and personal problems.



State v. Outing and Change Blindness: Will the Connecticut Supreme Court Respect Empirical Evidence?

There's an interesting case pending before the Connecticut Supreme Court.  The Court states the issues as follows:

[T]he defendant sought to introduce expert testimony regarding, among other things, the following identification concepts: 

  • (1) that witnesses who experience heightened levels of stress during a crime tend to make inaccurate identifications; 
  • (2) that under the "weapons focus effect," witnesses tend to focus on a perpetrator's weapon as opposed to the perpetrator's facial features; 
  • (3) that there is a weak correlation between a witness's confidence and the accuracy of an identification; 
  • (4) that pursuant to the "disguise effect," a perpetrator's use of a disguise makes an accurate identification more difficult; and 
  • (5) that when multiple witnesses discuss the crime with each other, the different versions become melded in such a way that the witnesses can no longer be certain of what they actually saw. 

The trial court prohibited the defendant's expert from presenting testimony regarding these concepts, concluding that they were matters of common sense.

How did the trial court conclude that it was common sense?  Did the court ask the jurors in voir dire about eyewitness identification bias and decision making?

The actual, empirical, real research shows that people consider eyewitness identification to be highly reliable. There are, however, some common sense exceptions.

People consider external factors in determining whether an eyewitness identification is reliable.  If it can be shown that it was dark outside, jurors are less likely to believe an eyewitness.  If it was raining, or the witness wasn't wearing her glasses, or if the witness is a geriatric: Jurors view eyewitness identification suspiciously.   Those are matters of common sense. 

The really interesting stuff, however, is not.  It's the stuff inside an eyewitness's mind that jurors don't understand.  Let's look at one example.  Please bear witness to this video:

http://www.youtube.com/watch?v=nkn3wRyb9Bk

Most people clearly miss something huge.  The concept is known as inattentional blindness or change blindness.

We can only see what we are paying attention to.  When we are paying attention to one thing, we miss something obvious about the rest of the scene.  Here is another example of change blindness:

Now, how does that relate to eyewitness identification?  If I point a gun at someone's face, what will you focus on?  It won't be my eyes, ears, or nose.  It'll be the gun.  That is scientific fact.  How then could you reliably describe my face?

Now, maybe it's still possible to pay attention to the gun and my face.  Still, most people are not able to.  Here is an abstract from one (of many dozens) of studies:

Three experiments investigated the role of 'change blindness' in mistaken eyewitness identifications of innocent bystanders to a simulated crime. Two innocent people appeared briefly in a filmed scene in a supermarket. The 'continuous innocent' (CI) walked down the liquor aisle and passed behind a stack of boxes, where upon the perpetrator emerged and stole a bottle of liquor, thereby resulting in an action sequence promoting the illusion of continuity between perpetrator and innocent. The 'discontinuous innocent' (DI) was shown immediately afterward in the produce aisle. Results revealed that: (1) more than half of participants failed to notice the change between the CI and the perpetrator, (2) among those who failed to notice the change, more misidentified the 'CI' than the 'DI', a pattern that did not hold for those who did notice the change. Participants were less likely to notice the change when they were distracted while watching the video

Did most of you - judges, lawyers, law professors, and smart people in other professions - know about change blindness?  If your answer is yes: Did you know about it before you studied the issue?  In other words, was it a matter of "common sense" to you?

Of course not.  Change blindness is part of a body of literature on cognitive bias that is only a few decades old.  Although some of these concepts are being popularized by books like Predictably Irrational, the knowledge hasn't trickled down to the Main Street juror.

Why shouldn't lawyers be allowed to educate jurors about the faults of eyewitness identification?  Perhaps there are good reasons - although I haven't heard any.  That inattentional blindness is a matter of common sense, however, is not such a reason.


Cognitive Dissonance, Thomas More, and Jurisprudence

As a follow-up to this post:

Most of you know that Thomas More is a saint.  He refused to sign an oath stating that King Henry VIII's marriage was valid.  Signing the oath would have violated God's law.  It would have been perfectly legal under man's law. He didn't want to violate his conscience.  He died for his beliefs.  What a hero!

Thomas More also burned at least five Lutherans at the stake.  Burning Lutherans at the stake was perfectly legal under man's law.  Under God's law, too, I presume.

Isn't it fascinating that Thomas More is heralded as a hero of the rule of law?  It should be. 

Yet it's not.  Cognitive dissonance.  You hear a lot about the Thomas More as depicted in A Man for All Seasons.  Yet people find it incredible when I tell them that Saint Thomas More burned people alive for believing in a different theology than his own.  Oh, it's true.  He did.

Cognitive dissonance.  We can't believe that a man who burned people alive was a hero of the law.  We can't believe that someone who burned others alive was a good man. 

We'll just pretend those five charred bodies don't exist.  We'll let their memories pass through the wind like screams in the night.