Monday, May 04, 2009

Let's Poke a Troll! #1

Well, it's been a few days since I replied to Dave's spam that went directly into my inbox. Poked him with a few questions, including one at a link about "Disconfirmation bias" from the dark bulbs of Telic Thoughts. If Creationists can't even tell us what evidence would support their theory of randomness, well, how can we be biased into ignoring it? Given the predictability of Creationists, who've been pulling the same tricks for centuries now, I didn't bother to read the link. Until now. Bracing myself...

We've been talking about confirmation bias, which is the tendency to seek out data that confirm one's preconceptions, while ignoring data that conflict with those preconceptions. But there is a flip-side to this phenomenon known as disconfirmation bias.
Yes, we're quite knowledgeable about confirmation bias, and we routinely deal with denialism, which is ignoring contradicting evidence. Of course, on the topic of evolution versus creationism, Creationists, to my knowledge, have never bothered coming up with examples of what would be confirming or contradicting evidence for their unfalsifiable "hypothesis."

Quotes from some psychologists:
When evaluating an argument, can one assess its strength independently of one's prior belief in the conclusion? A good deal of evidence indicates the answer is an emphatic no. This phenomenon, which we refer to as the prior belief effect, has important implications. Given two people, or groups, with opposing beliefs about a social, political, or scientific issue, the degree to which they will view relevant evidence as strong will differ. This difference, in turn, may result in a failure of the opposing parties to converge on any kind of meaningful agreement, and, under some circumstances, they may become more extreme in their beliefs.
The history of science says otherwise. New ideas come up quite often, get rigorously tested, and if they keep passing tests, they become accepted. I'm sensing the usual sour grapes involved in psychobabble arguments instead of honest inquiry, but that's mostly because I have a lot of prior experience with Creationists. Cynicism is no substitute for a sound, logical argument based on evidence. Science and skepticism may be conservative, but they're inherently expansive.

More psychobabble follows, including one example involving the death penalty. Of course, that's an ethical/moral argument for the most part, not one of science.

Back to Telic Thoughts' commentary:
It makes sense that a disconfirmation bias would exist. If the human brain is wired to defend its preconceptions with confirmation bias, attacking beliefs that threaten those preconceptions would likely be part of the same strategy. This undercuts Michael Shermer's belief that "Skepticism is the antidote for the confirmation bias." In reality, hyper-skepticism, or selectively applied skepticism, may simply be another facet of the same brain processes that generate confirmation bias.
What is "hyper-skepticism"? Where are Shermer and my fellow skeptics applying our skepticism selectively? That's what I'd like to hear you talk about.
What does disconfirmation bias look like? I would like to propose three possible signs that disconfirmation bias is taking place, where one may be defending their preconceptions more so than playing the honest skeptic who is simply trying to "follow the evidence."

1. According to Edwards and Smith, "When one is presented an argument to evaluate, there will be some automatic activation in memory of material relevant to the argument." Searching one's "memory banks" can easily become relying on stereotypes. A stereotype, after all, is the brain's "summation" of previous experience that is linked by certain cues. Thus, I hypothesize that when one is confronted with an argument that challenges their preconceptions, the more that person relies on stereotypes, the more it is likely they are exhibiting disconfirmation bias to protect a preconception.
The difficulty with this is that sometimes people actually live up to negative stereotypes, and I can't recall the last time a Creationist broke from the mold. In contrast, skeptics rarely ever act according to the predictions of Creationists.
How can you tell if stereotype is involved? Often, it is obvious. For example, if a critic on the Internet poses his own argument against my design hypothesis, and I begin to rail against Richard Dawkins, obviously my brain has been tapping into information about Dawkins to interpret my opponent. Oftentimes, however, the evidence is more subtle. And that takes me to my second sign.
Just a note: I have yet to meet a Creationist who knows anything whatsoever about Dawkins's views. From what I've seen, they've done everything they can to shove Dawkins into their straw man molds and failed. At least he deserves kudos for demonstrating some sliver of self-awareness.
2. If one's brain is on "a deliberative search of memory for material that will undermine the argument simply" and the "search include stored beliefs and arguments that offer direct evidence against the premises and conclusion of the presented argument," it stands to reason the person with disconfirmation bias will have a strong tendency to link a current argument with the perceived failures of previously-experienced arguments. This creates a mental inertia that leads to two expressions of disconfirmation bias:
Oh, boy. Have you ever considered that the presented argument is exactly the same as a previously experienced and refuted argument? That's why we've got things like the Index to Creationist Claims. I haven't seen anything new in Creationism in decades aside from this one guy claiming ice is magnetic. And even then, it seems he was copying Kent Hovind or someone of his ilk, trying to defend the "vapor canopy" tripe.
a. Misrepresentation: Let's say that Jones develops an argument that threatens the preconceptions of Smith. But let us also say that Smith had previously successfully dismantled a similar, but different, argument that was once posed by Miller. The memory of this experience will shape the way Smith reacts to Jones. The brain processes involved in disconfirmation bias will cause Smith to morph Jones' position into that of Miller's. Smith will feel vindicated by the disconfirmation bias, while Jones will recognize that Smith is attacking a "straw man."
The problem I experience with this: All subtle variations of Creationism have almost all the same underlying faults. Usually, if they're making a straw man claim on me, it's for an unimportant detail. Even if I concede that minor inaccuracy, they tend to continue whining about it for the rest of the thread while ignoring any questions about their premises.
b. Faulty Extrapolation: This is a more subtle version of misrepresentation, where Smith's brain is so highly activated that it is sensitized to "cues" from Jones that lead Smith to believe that Jones is reaching for Miller's point. Smith will not focus on the actual argument Jones is making, but will be trying to "anticipate" where he thinks the argument is going in order to cut it off. In this case, Smith is not really disconfirming Jones' argument; he is creating an illusion of disconfirmation in his mind because he thinks he knows where the argument is going (when it may not even be going in that direction).
I'm sometimes guilty of this, but mostly in an effort to rouse the creationist from his circular mill by shock. Of course, much of the time, the extrapolation is something the creationists never thought of, and never made a cutoff limit for. They tend to make up principles that only seem to apply when it's convenient for them.
3. Finally, to put this lengthy blog (and tired blogger) to bed, there is the dead give-away of personal attack. When someone attacks another person by questioning their motivations or with ridicule (and more), they are seeking to discredit the argument by discrediting the person who makes the argument. If such personal attacks are linked to stereotypes, it becomes clear the person's brain is adopting an "end justifies the means" approach to disconfirmation bias.
There's a difference between an ad hominem fallacy and an insult. None of my arguments are based on the stupidity of the opponent. Sometimes I have to use ridicule to shock someone awake at the absurdity of their arguments. Or to keep my sanity with laughter when faced with someone who parrots debunked arguments over and over and over. When I lob an insult at someone, it's a conclusion based on their poor logic, stereotypical robotic behavior, ignorance of the most basic knowledge, etcetera. I never use it as a premise to disregard someone's opinion.
In summary, skepticism is a good thing, but skepticism can be just another facet of the way brains defend "their territory." Add tribalistic group behavior to the picture and the whole process is amplified and entrenched. I propose that you can detect disconfirmation bias at work, in individuals or groups, when hyper-skepticism, stereotype, misrepresentation, faulty extrapolations, and personal attacks occur more often than not.
There's a difference between skepticism and rationalization. Skepticism requires that someone make an escape route in the form of specific types of evidence that would openly contradict a false belief. Without leaving room for falsification of a bias, you can never get out of it.

That's why Creationism is a bad idea: It's unfalsifiable. There's no telling what kind of evidence would support or contradict it, since they tend to fall quiet or change the subject when pressed on the matter. That's why Creationists use stereotypes, misrepresentation, faulty extrapolations, straw men, and ad hominem fallacies to defend themselves.

There was a time in my life when I was willing to entertain some of the fluffier forms of Creationism like theistic evolution. The fact that I could be convinced to join the ranks of atheism means that I was perfectly able to listen to the other side. Cynicism is no substitute for a collection of good, logical arguments. I write the way I do because I seek to emulate the wonderful people who won me over with unapologetic vigilance and irreverence in the face of would-be PC Police.

6 comments:

Don said...

They're so clever! They're being skeptical...of the skeptics! I want to vomit whenever I hear that one.

MWchase said...

I really want to see the disconfirmation bias that could produce QM. Without any of the observational evidence to back it up, you would frankly be insane for positing that the world works the way it actually does.

(More mildly, disconfirmation bias should have prevented us from ever getting that far. Conventional wisdom was that the Poisson spot wouldn't exist, because the idea sounds ridiculous: block light with a disc, and you get a spot of light in the center of the shadow.)

Dunc said...

Hang on a minute - what they're calling "disconfirmation bias" is actually just good old confirmation bias - the tendency to accept or reject information depending on whether it conforms to your current beliefs. Disconfirmation bias would be a bias to accept information which disconfirms your current beliefs, or to reject information which supports them.

Even when they're playing psychobabble semantic games, they can't get it right.

Don said...

I think what they're trying to do is use "skeptical language" to break out the old "You wouldn't change your mind if there was evidence!" gambit.

MWchase said...
This comment has been removed by the author.
MWchase said...

O, cru-el fate, that we should be hoist by our own petard.

Hee hee.

But yeah, by that interpretation (the denotationally correct one, as opposed to "like that other thing, except we don't like it"), only very confused people would exhibit "disconfirmation bias".

Firefox thinks that "disconfirmation" isn't a word. Since I've no clue how one would "disconfirm" something, I'm actually inclined to agree.