Right is Right (even when it's wrong)

I’m often surprised, in mediations and in life, by how differently two people can recall a history or interpret the same facts. Apparently neutral bits of information lead one person to see a giraffe and the other a tornado. This disconnect occurs in ordinary situations, but it gets super-charged in a conflict. We see it in political discussions all the time: we scratch our heads and wonder how seemingly intelligent people can be so completely wrong. Right?

In mediation, once-agreeable partners now recall the intent of a large financial gift from grandma entirely differently. Or they disagree about the value of the father’s day-to-day involvement in the children’s lives. Or the other’s annoying habits take on a dangerous quality. Armed with articles, statistics and advice, they come prepared to prove their rightness. But no amount of evidence will convince the other person of the obvious truth.

Yale Law professor Dan Kahan sheds some light on this phenomenon in a recent study addressing why good evidence doesn’t resolve political debates. The 1,000-person study first asked people to work through a brainteaser about a skin cream. The researchers found that participants who tested high in math skills were more likely to solve the problem correctly. In the second round, they posed the same problem using gun-control statistics. This time, the math whizzes not only failed to do better than the average test taker; they actually did worse when the evidence didn’t match their political beliefs about the issue. Kahan and his team termed this “identity-protective cognition.”

Identity-protective cognition. Honestly, who can’t relate to that term? We read the articles that support what we already believe. We summarily shut down sources whose facts contradict our settled notions.

Other, unsurprising insights from researchers on this phenomenon highlight how hard it is to change thinking when logic alone won’t do the job:

  •  The longer a belief is held, the more tightly it is held. Once we’ve fully latched on to a notion, we’re not likely to give it up.
  •  Information that supports or contradicts our values is the most susceptible to creative cognition. The more our identity is on the line, the more likely we are to bend the facts to fit our beliefs.
  •  No form of counter-argument (stories, facts, imagery) breaks that habit of bending facts to fit beliefs. The only effective strategy is information from a trusted expert.

Fortunately, the solution to this conundrum often plays out in mediation as well. 

First, mediators set the stage for resolution. Research by Claude Steele suggests that affirmation helps people perform better in intellectually demanding situations because they want to prove themselves right. In mediation, we create buy-in from participants to resolve their dispute peaceably.

As the mediation proceeds, we frame the situation neutrally. We remove the point/counterpoint positioning of traditional argument and focus on the broader interests that lie beneath those positions. Those broader interests, or shared values (e.g., “we want our children to have strong relationships with both parents”), redirect the disagreement away from the other person’s position and toward a common goal.

In mediation, we also use shared experts for facts. In a litigated family law case, for example, each attorney hires an evaluator to value the family business. Those evaluators then testify in court on their client’s behalf, and the judge decides which one is right(er). In mediation, the couple chooses a single evaluator together and uses that evaluator’s recommendation to value the business. The business evaluator becomes the trusted expert whom both believe.

Human behavior often falls short of the precision of a computer or machine. We are quirky, unpredictable beings who often behave in ways that don’t make sense and aren’t healthy. I love mediation because it starts from that assumption and moves forward from there.