Read a book review yesterday by John Loftus. It made a lot of sense to me, this idea that people are constantly falling into the mind traps of irrational thought…a hidden world of “sways” or “pulls” that trap us in irrational thinking. People seem to be drawn to the irrational, so I don’t find it surprising how much irrationality goes into faith. Sadly, faulty decision making rooted in irrational values, fears, and biases more often than not results in the suffering and deaths of so many in our world. Until we are free of religious superstition, how can we possibly expect freedom from irrationality? It seems our irrational selves can only be dealt with when we free our minds of the arrogant belief that we are not irrational.
Here’s a little excerpt from his review:
The authors focus on three hidden currents and forces that cause us to act irrationally: “value attribution,” which is our inclination to assign a person or thing a certain value based on our initial perceptions; “loss aversion,” which is the tendency to go to great lengths to avoid possible losses; and “diagnosis bias,” which describes our blindness to all evidence that contradicts our initial assessment of a person or situation. Along the way they show other ways we’re influenced by the sway of irrational behavior.
About value attribution the authors say: “Once we attribute a value to a person or thing, it dramatically alters our perceptions of subsequent information” (p. 55), and then “it’s very difficult to view it in any other light” (p. 56). It is such “a strong force that it has the power to derail our objective and professional judgment” (p. 63).
About loss aversion the authors say: “The more there is on the line, the easier it is to get swept into an irrational decision” (p. 22). A closely linked sway is called “commitment”: the more committed a person is to an idea, the harder it becomes for him or her to take a different path. Independently these two forces have a powerful effect on us, “but when the two forces combine, it becomes that much harder to break free and do something else” (p. 30).
About diagnosis bias the authors say that it “causes us to distort or even ignore objective data” (p. 75). As such, “we often ignore all evidence that contradicts what we want to believe” (p. 88).
The authors give us plenty of interesting examples and psychological studies showing that this is what human beings do in ordinary decision making, irrationality that has sometimes cost many lives. My argument is that if this is what takes place in our ordinary decision making, then how much more does this apply when it comes to faith!
For some reason this is reminding me of Nietzsche when he wrote on how one becomes what one is:
Seeing that before long I must confront humanity with the most difficult demand ever made of it, it seems indispensable to me to say who I am. Really, one should know it, for I have not left myself “without testimony.” But the disproportion between the greatness of my task and the smallness of my contemporaries has found expression in the fact that one has neither heard nor even seen me. I live on my own credit; is it perhaps a mere prejudice that I live? … Under these circumstances I have a duty against which my habits, even more the pride of my instincts, revolt at bottom, namely, to say: Hear me! For I am such and such a person. Above all, do not mistake me for someone else!
At the end of their book, the Brafman brothers make a good summarizing statement. What if people accepted this idea of irrationality and began to actively distinguish between their opinions and the facts?
“It is only by recognizing and understanding the hidden world of sways that we can hope to weaken their influence and curb their power over our thinking and our lives.” (p. 181).