Tuesday, March 13, 2012

Political Deceptology: The Science of Effective Lying


In the last blog I argued that Rick Sanctimonious was more effective as a politician than Willard Romney because he strongly believes in what he is dishing out, while Romney often appears not to be too convinced by his own spiel.  Both may, or may not, be delivering to us a pack of lies, but that is totally irrelevant because every politician (as well as everyone else) is usually lying and facts don’t matter, especially in politics.

I am not saying that I would bet on Rick beating Willard to the nomination. He probably won’t because Willard is backed by 9 billionaires and Rick has only one, and in today's presidential campaigns, the number of billionaire backers you have trumps even effective political lying.

The point is, however, that a lie is only politically effective if the perpetrator believes in the lies he is telling, or at least is perceived to believe in them.  When Rick Sanctimonious says weird things such as “JFK makes me want to throw up,” it actually reinforces the belief that he really does believe in what he is saying.  It is even more convincing when he has to pay a political price in the elite liberal media for his weirdness.

There is more to it than just being a true believer, however.  Explaining this involves a teensy-weensy bit of cognitive science, so stick with me here.

Most of the characteristics of effective political lying are based on how the human brain takes in data, processes the data, and makes decisions.  Cognitive scientists have demonstrated that the human mind can only handle a small part of the massive volume of data coming in through our senses.

If you took everything in, your thought process would look something like the circuitry of an Indian telephone system.


     Photo attributed to Marcus Ferrell; widely circulating on the Internet

To deal with this overwhelming mess, your mind uses filters to declutter the incoming data. These filters are not totally random. They come from childhood, family, school, and other learned experiences.  Some may even be based on heredity and our ancient genes that allowed us to survive in primitive times. Most filters seem to come from early childhood.[i]

After filtering everything through our cognitive illusions, most of the data get lost, but our thought process and decision-making process become simpler, reducing the mess above to this:


Photo from a web page attributed to: © Dept. of Physics, Univ. of Illinois at Urbana-Champaign, 1996.

Our filters are all loosely hooked together into a (not always coherent) system we call our world view or our belief system. One observer calls this our package of “cognitive illusions.”[ii]  This is the source of the intuitions and educated guesses that make the complex pathway of life easier to navigate.

Whatever their origin, and whatever is lost in the filtering, the filters in our brain allow us to get on with daily life and make decisions about what to eat, what to wear, whom to vote for, and other very complex matters without laboriously sifting through all the factual data and reasoning everything out.

Our filters determine how we understand the world and make meaning out of the chaos of the universe.  No one could survive without them.

Daniel Kahneman, the world's most influential living psychologist, explains this as our two brains: the slow brain and the fast brain. By necessity, we use the fast brain for almost everything.[iii] Malcolm Gladwell, in his best-selling book Blink, tries to make the case that the fast brain can produce better decisions than the slow brain.[iv]

Gladwell's view is not supported by anything in cognitive science, but sometimes our filters do more or less coincide with reality. Then they work for us. Sometimes they don't, and then we end up getting royally screwed by our own ways of thinking.[v] The latter happens far more often than the former, especially in politics.

One of the ways the fast brain filters things is through the adoption of rules of thumb, sometimes referred to as “heuristics.”  Political consultants, magicians, and advertising wizards realized a long time ago that these heuristics can be triggered by language. They have developed an armory of techniques to get us to buy into whatever they are trying to sell us.  For advertisers, some of the words that trigger our brain heuristics are “SALE!”, “BARGAIN,” “FREE!”, “NEW!”, “BEST,” or “TOP RATED.”  We become weak and helpless when attacked by these little buzz words, and out comes the credit card.

Politicians do the same thing. 

An example:  Romney will often say things like, “I am going to bring you smaller government, free enterprise, lower taxes, and a strong defense. Obama is a socialist who wants to promote class warfare. He wants to take away your hard-earned cash and give it to some loafing, lazy illegal immigrant.”  Everyone cheers wildly. No one asks for any proof or evidence.  Romney is using buzz words that trigger heuristics, hooking into a way of seeing the world that resonates with his listeners.

George Lakoff, in his book “Don’t Think of an Elephant,” calls this “framing.” He defines it as using language to evoke a certain world view and certain strongly held values.

People are often amazed and baffled that poor people vote for candidates who promise to raise their taxes, take away their social services, and transfer money to rich people and big corporations.  There is a book and a movie about why people in Kansas vote against their own economic self-interest.[vi]  In the 2010 elections, a Tea Party advocate--who was on Medicare!--told South Carolina congressman Bob Inglis in a town hall meeting that he wanted to force “the government to get its hands off my health care.”  Inglis, for once in his life, was speechless.

The reason people don’t vote their own interests is that they never analyze data or think things through logically with their slow brains when deciding about political issues. They vote for the candidate who best triggers the cognitive heuristics in the fast brain--the one who taps into their intuitive core world view and values.


A politician cannot just mumble any nutty thing over and over again and get away with it.  It can be nutty, but he has to believe it to make it work, and it has to hook into some deeply held values.  This is how Ron Paul gets votes. He is patently a total fruitcake, but (1) he believes firmly in his delusions, (2) some of them hook into values that everyone holds dear, and (3) he is emotional and ANGRY. You will never see him smile when he unloads his true beliefs.

The key for the effective political liar is not to get all tangled up in policy details or other wonky stuff like Hillary Clinton famously used to do.  Don’t worry about facts. If a strongly held frame does not fit the facts, the facts will be ignored and the frame will be kept.[vii]

Republicans have completely mastered this game.

Democrats (Bill Clinton, excepted) struggle with lying effectively.

Why can’t Democrats lie effectively?

That will be the subject of the next blog. 


And how can we protect ourselves from lies, political and otherwise (assuming we want to)?


That important question is best answered by Pamela Meyer in her TED talk "How to Spot a Liar," which every reader of this blog should watch.



[i] I know there is an authoritative scientific source for this statement but I have dozens of books scattered all over my house and I cannot locate the right one.

[ii] Ben Goldacre, Bad Science (2008).

[iii] Daniel Kahneman, Thinking, Fast and Slow (2011). David Brooks described the theory as follows: “We are dual process thinkers. We have two interrelated systems running in our heads. One is slow, deliberate and arduous (our conscious reasoning). The other is fast, associative, automatic and supple (our unconscious pattern recognition).” http://www.economicsandethics.org/2011/10/david-brooks-on-daniel-kahneman-and-the-complexities-of-the-mind.html.

[iv] Gladwell argues that in an age of information overload, experts often make better decisions with snap judgments than they do drowning in volumes of analysis. http://en.wikipedia.org/wiki/Blink_(book).

[v] In Blink, Gladwell cites numerous examples in which lightning-quick intuition saved the day. On the other hand, intuitive decisions made in the blink of an eye can also lead to disaster, as in the case where four New York policemen, using rapid, intuitive, fast-brain judgment, fired 41 shots at an innocent man standing in his own doorway.

[vi] Thomas Frank, What's the Matter with Kansas? How Conservatives Won the Heart of America (2004). In the book, Frank argues that politicians generate votes by evoking cultural issues such as abortion, immigration, or gay marriage. However, once in office, they turn their attention to issues favoring their wealthy business supporters, such as deregulation of polluting industries.

[vii] George Lakoff, Don’t Think of an Elephant (2004).
