I am writing this post in the hope that many woos - fans of creationism, "alternative medicine", conspiracy theories, psychics, alien visitations, cryptozoology, and so forth - are directed here. I'll be speaking to you as best I can.
There is a phenomenon among antkind known as the "circular mill." This happens when part of an army ant colony ends up going in a circle, each ant following the ants in front of it. As a result, they keep traveling in a circle, usually until they die of exhaustion. I've read about this as an illustration of the dangers of groupthink: the circular mill happens because each ant's decision becomes dependent on the other ants' decisions, not on the environment or the end goal of their activities. Given ants' comparatively simple nature, you can't expect them to have a reflective moment.

You're different. You're a sapient being. You're capable of introspection, self-doubt, and reevaluation. That's why many of us skeptics get easily frustrated when we see you going in circles and repeating cliches as justification for your behavior. I hate it when someone doesn't exercise their ability to think independently, and, unfortunately, there aren't many "woos" who express themselves in a manner that convinces me they've given the matter any genuine thought.
Here's an informal list of some of the priorities you should focus on:
First: You do not argue from authority in a scientific blog debate.
In an ideal world for scientific discourse, there would be no need for the concept of authority. Unfortunately, we live in a world where snap decisions must be made. If, for example, a spider bit me and my arm went numb shortly thereafter, I would go to the nearest hospital's emergency room and trust the doctors to pick the right treatment (and hopefully expedite that decision by describing the spider that bit me). If the world didn't have such time constraints, it'd be reasonable to argue over the matter, compare evidence for various antivenom concoctions, get an education in toxicology, and so on. In the real world, where there are time constraints, other things to do, etcetera, authorities in various fields are a handy time saver.
What separates a scientist from a layperson isn't a magical property that renders one beyond question. A trained scientist is a person who knows how to evaluate evidence, knows where to look for it, and has a large collection of conclusions from that evidence memorized for quick decisions. The latter two traits are time-saving measures. If the doctor treating my spider bite knows off the top of his head what kind of neurotoxin the described spider secretes, as well as a drug to counteract it, he can treat me that much faster. In a life or death situation, his memorized knowledge, or his ability to quickly locate the appropriate recorded knowledge, would be invaluable.
Extend the example to longer time periods: If I have a condition that could irreparably harm me over days instead of hours, he could tell me what he thinks is happening, what he plans to do, and why he thinks it's the appropriate treatment. If I happen to know about some factor in myself or my medical history that could affect that treatment negatively, I'd be able to tell him, and he'd be able to change his decision accordingly. Of course, having a medical history he can consult while he makes the decision saves even more time. As an authority, he's expected to know what conditions to look for.
Extend it further: I have a slower condition that'll cause problems over the course of decades. On that timescale, I should be encouraged to research the condition for myself so that I can reasonably discuss it with my doctor. If I find good evidence that raises questions about a decision, I can bring that to his attention, get second opinions from other experts, etcetera. That's pretty much how skeptics like me expect our doctors to work these days.
So, to hammer the point home: Authority is a time-saving tool, not an absolute source of information. In my typical blog conversation, I am not interested in saving time. I'm interested in learning.
The danger of blindly trusting a person's authority is that you're basing your decisions purely on someone else's judgment. In a pinch, that may be a useful shortcut, but if you have time to think, you should keep an eye on the environment and the larger picture so that you can respond appropriately if that person's actions don't fit what's going on.
Second: Know how to evaluate evidence.
This is a common failure point, and one I get frustrated with when dealing with a number of you. There are different levels of quality when it comes to evidence. To stay with the medical theme, a large, repeated, double-blind, placebo-controlled study is golden. Each part of that design is meant to reduce or eliminate bias. Many people underestimate the body's ability to heal itself, the mind's ability to deceive itself or rationalize away symptoms, unaccounted-for factors, and plain dumb luck and coincidence. Large studies with many subjects are better than small studies or individual anecdotes because a larger number of subjects will dilute coincidences. Patients must be blinded to which treatment they get (with their consent, of course), because knowing they've gotten the "real" treatment will change how they look at their symptoms, usually for the better. The experimenters must be blinded so that they won't consciously or unconsciously fudge the results in one direction or the other. The experiment must be repeatable so that other, independent researchers with different biases can see if they get the same results.
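To make that point about sample size concrete, here's a minimal simulation sketch. It's my own illustration, not part of the original argument, and the numbers are made up: every patient has a 30% chance of recovering on their own, and the "treatment" does absolutely nothing. The question is how often pure coincidence makes the useless treatment look impressive anyway.

    # A do-nothing "treatment" versus untreated controls, where every patient
    # has the same 30% chance of recovering on their own. The only thing that
    # differs between the two groups is luck.
    import random

    random.seed(42)

    def apparent_benefit(n_patients, chance_recovery=0.3):
        """Recovery-rate difference between a 'treated' group and a control
        group when the treatment has zero real effect."""
        treated = sum(random.random() < chance_recovery for _ in range(n_patients))
        control = sum(random.random() < chance_recovery for _ in range(n_patients))
        return (treated - control) / n_patients

    for n in (10, 100, 1000):
        trials = 1000
        # Count how often coincidence alone makes the useless treatment look
        # at least 10 percentage points better than no treatment at all.
        fooled = sum(apparent_benefit(n) >= 0.10 for _ in range(trials))
        print(f"{n} patients per group: fooled in {100 * fooled / trials:.0f}% of simulated trials")

With ten patients per group, chance alone makes the worthless treatment look noticeably better a large fraction of the time (around 40% with these numbers); with a thousand per group, essentially never. That's all "diluting coincidence" means.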
Sometimes, compromises have to be made. In medicine, it's often for ethical reasons: you wouldn't test a new vaccine against a placebo, because it'd be unethical to ask parents, children, and those around them to risk death or disfigurement from infectious diseases for the sake of testing a vaccine that might be only slightly more effective, or less prone to side effects, than the standard of care. In other cases, the compromise comes from limits on the available evidence. For example, there might be only one known fossil of a newly discovered species to work with.
Anecdotes and eyewitness tales are the worst kind of evidence because they involve the most compromises, often unnecessary ones. They leave in the biases of the witnesses, which leads me to my next point:
Third: Everyone is biased.
Everyone. Ev-er-yone. Yes, that includes you and me. Albert Einstein is credited with saying, "Common sense is the collection of prejudices acquired by age eighteen." The sentiment is essentially correct. We learn by observation, but if we don't temper our minds with reason, those prejudices can never be overcome. Skepticism is anathema to closed-mindedness.
Most people who call skeptics closed-minded don't understand what the word really means or why we've rejected certain ideas. (It's important to note that rejection is a tentative state, not an absolute one.) For a fuller treatment of open-mindedness, I'll refer you to QualiaSoup's excellent YouTube video on the subject, which also covers many of these points.
It's because of our biases that we need the scientific method. Science is essentially about the minimization and removal of biases. When someone asks me to trust in "other ways of knowing," the only alternative they typically offer me is another set of biases. One particularly galling set of biases is that of "faith," which just strikes me as setting oneself up as a god. I'm just another fallible being, and I don't have the arrogance to assume I'm right before I've even looked at an argument. I may have a great deal of confidence in my position, but I always allow for the possibility that I'm wrong. And on a number of topics, being wrong would be cause for celebration.
Fourth: Please don't quote stereotypes.
A major irritation for me in blog discussions is that "woos" will often cite stereotypes of skeptics they've learned from television, even in the face of blatant contradiction. If you want to impress me, try asking questions about what I think instead of relying on what others in the circular mill say I think. Far too many internet trolls spend their time telling me what I think or what my position is, and ignoring my protestations when I explain what I truly believe and why I believe it. If you think I've done something stereotypically irrational, tell me exactly where it happened and ask me to explain that action or quote.
2 comments:
Excellent piece, as expected from you, but I thought there was one point that was missing. I am 100% on the Skeptical Side, but I do think many of 'us' make the same mistake of stereotyping that the woomeisters do, and tend to assume that people who buy into nonsense are as one-dimensional as they think we are. (A good example: Orac's initial handling of the Daniel Hauser case as simply a religiously-based refusal of treatment, when the case, as Orac admitted, was much more complex than that.)
Of course the difference is that Orac, like a good Skeptic, corrected himself, something the wooers are -- usually -- incapable of. But we should be aware of our own tendencies as well.
Yeah, I tend to fall into thinking in stereotypes, but there are plenty out there who seem devoted to reinforcing them rather than defying them.