Yesterday on the drive home I was listening to the Winsome Conviction podcast from Biola University. Communications professor Tim Muehlhoff was talking about the book he recently co-wrote with apologetics professor Sean McDowell, End the Stalemate: Move Past Cancel Culture to Meaningful Conversations.
Dr. Muehlhoff talks about a couple of types of bias we often bring to our interactions, especially around hot-button topics.
The first type of bias is “my side bias.” This bias comes into play when we tend to trust someone’s claims or conclusions based not so much on their knowledge or expertise in a matter as on their agreement with us on another matter that is important to us. A relatively innocuous form of my side bias can be found among sports fans. One has his or her preferred team (based on home state, or where one attended college), and tends to believe good things about that team simply because of affinity.
Moving to more serious matters: in theology, for example, I’m likely to accept the scholarship of conservative evangelical writers because I am a conservative evangelical. Until Craig Keener, or Craig Blomberg, or Darrell Bock, or Ben Witherington III, or N. T. Wright proves unreliable, I am at the very least going to take their research and arguments into consideration, even if I don’t agree with them on every minute point. In politics, this translates into trusting by default the news and commentary put forth by those on one’s own side of the aisle, whether left or right, progressive or conservative. We have a natural tendency to trust those with whom we agree in one area when they speak to other topics, even if they have not actually demonstrated any concrete knowledge or expertise in those other areas.
The other type of bias is “mind blindness,” and this occurs when we assume that someone who is from an “opposing side” on some issue important to us—or even simply not aligned with our position on that issue—is automatically suspect and wrong on any other issue as well. This takes the form of, “Well, we know John Doe believes X about Y, so he must be wrong when he says A about B as well.” This is not unlike Nathanael’s question in John 1:46: “Can anything good come out of Nazareth?” Because we have a preconceived prejudice against people from a certain place, or of a certain political persuasion, we tend to dismiss out of hand anything they say, without evaluating the claim on its own merits and based on evidence.
In theology, for instance, I’m going to be much more circumspect when reading material from a cessationist such as John MacArthur. Because I already don’t agree with his conclusions about certain spiritual gifts still being for today, I’m more likely to look for the flaws in any statement he makes. And even if a quote from MacArthur circulating in some meme is true and accurate, I’m not likely to share it online, because I don’t want to lend him any credence that may lead others to start consuming his materials and potentially follow him into what I consider to be serious error. But the fact that MacArthur is (in my view) wrong in that area of theology doesn’t give me a blank check to reject absolutely anything he says about any theological subject.
And, if you’ll allow me to be honest for a minute here, I find myself questioning the majority of what certain people on social media—some of whom I even know in real life—post, simply because of certain stances they have taken on social, political, or theological matters (or, frequently, the manner in which they voice those stances or parrot certain party lines). I like to think of myself as a rational, analytical thinker (when my classmates and I played “Star Trek” during recess in the mid-70s, I was always Spock), but in reality, I am just as susceptible to mind blindness as anyone else. I need to improve in that area, making a sincere effort to actually listen to what people are saying, and not simply dismiss everything they say because of something they have said in the past with which I disagree, even vehemently.
Both of these biases, the “my side bias” and “mind blindness,” are forms of what is known in logic as the genetic fallacy: accepting or rejecting an argument based on its origin rather than its content or logical coherence. If I want to avoid falling into the trap of the elephant and the rider, I need to work to avoid both biases.
You can listen to the podcast, or read a transcript of the conversation, here.
That's good stuff.