30 January 2011

Frolicking with Fallacies 5: Poisoning the Well

Let’s continue our discussion of the Ad Hominem family by taking a look at the fallacy known as “Poisoning the Well”.  This is one of the easiest fallacies to grasp.  Yet some people have trouble understanding why it’s a fallacy at all.

When you try to discredit someone’s argument or position before that person has even had a chance to voice it, then you’re Poisoning the Well.  In other words, you’re anticipating that your opponent will make an argument, offer an opinion, or promote an idea that you find distasteful or threatening, so you pre-empt it by turning people against him before he even opens his mouth.  You seed suspicion and mistrust to prevent the argument from getting a fair hearing.  The well is poisoned – don’t draw its water.  And that’s the goal.  You want to predispose his audience to distrust him, or to interpret his words the way you want them interpreted, not the way he intends them to be understood.  It then becomes more difficult for your opponent to make his case and defend himself, no matter how rationally persuasive his argument may be.

In a democracy, Poisoning the Well is the way the game is played.  It’s a routine, probably integral, part of the system, which is one of many reasons democracy doesn’t work well in practice.  In advance of an election, candidates and their cronies work overtime to plant seeds of mistrust, to turn people against their opponents, long before arguments are made, before issues are debated.  The goal isn’t necessarily to win people over to your side; it may just be to prevent them from allying themselves with the other guy.  So you call his integrity into question.  You constantly mention past misdeeds, real or alleged, regardless of whether they’re relevant to the current issues.  You dig up dirt from his college days.  Whatever.  As long as it helps you close the minds of voters to whatever he has to say, it’s expected.  Irrational and dishonest, but expected.  Dishonest because of the irrelevance; irrational because it doesn’t take into account the long-term effects of reinforcing cultures of dishonesty.

This happens in religious disputes as well – both in disputes between people of different sects or denominations or religions, and in disputes between believers and atheists.  A common creationist tactic in churches, in home schooling, and at places like Liberty “University” is to gird pupils for entry into the real world by warning them of the trickery they should expect from the scientists and scholars of the world, whose minds are corrupted by Satan, and whose allegiances to godless “secular humanism” or “scientism” lead them to spread demonic propaganda in the guise of objective science.  This begins early in life, and continues till death.

It works.  To a point.

It’s easy to lead people to reject perfectly rational arguments supported by mountains of evidence just by sowing mistrust.  In the Christian world, Paul started it nearly two thousand years ago by warning his flock not to pay heed to the philosophers who’d try to trick them into sinfulness.  Paul didn’t add that the philosophers would “trick” people by asking them to think rationally about their beliefs and practices, because that wouldn’t have helped him sow mistrust.

I had a student once who was absolutely puzzled that an atheist could have values.  He’d been told so often that all atheists were nihilists, doomed to wallow in sin and evil without any comprehension of right and wrong, that he couldn’t understand the secular moral theories he was supposed to learn in the course.  He couldn’t make any sense of them.  Before he could learn, he had to unlearn, to overcome the poisoned-well tactics that had defined his interactions with the world, probably since he learned to speak.  I felt bad for this guy.  After all, I’d once been him.

Atheists poison the well, too, by dismissing someone as a Christian or Muslim or Buddhist before they’ve had a chance to speak, or before others have had a chance to read their words.  Some atheists, particularly those who defected from a religion, become convinced that nothing that issues from a religious source could possibly have any merit.  For them, the well appears to be irreversibly poisoned.  And so they try to poison it for others as well – or, as they see it, simply point out to others that it’s poisoned.  Yet, in so doing, they prevent the arguments of religious people from getting a fair hearing, and may actually prevent themselves from being rationally convinced by a good argument.  I can think of several excellent arguments made by religious people, from Albert Schweitzer to Soren Kierkegaard to Lao Tzu, that it would be irresponsible, irrational, in fact shameful to ignore.  Even if you’re convinced that religion – or conservatism or atheism or vegetarianism or whatever – is a tremendous force of evil in the world, you have to take the arguments for it as they come, in good faith and honesty.

The scary thing about Poisoning the Well is that it’s so effective at preventing people from taking sensible arguments seriously or understanding them accurately – at leading them to scoff, to ridicule, to ignore, to dismiss, even just to misunderstand.  Once people have made up their minds, they find it difficult to change them, because that belief is now part of their identity.  It’s part of what makes them who they are, so they have a stake in protecting it from disconfirmation.  Once someone believes the well is poisoned, it’s difficult to convince her to drink – no matter how much evidence you gather to prove that the water is not only safe, but rejuvenating.  The sad fact is, reason and evidence have never been as persuasive as propaganda and fallacy.

Now, some people have trouble understanding why Poisoning the Well is considered a fallacy.  After all, they say, it doesn’t actually involve any reasoning, just a refusal to listen to reason.  And, besides, don’t we owe it to others to prevent them from being suckered in by some shyster or miscreant?  Maybe the best way to do this is by ensuring they won’t listen to what that miscreant has to say.

Here’s where good intentions hit the wall of reality.  First, refusing to reason, or helping to ensure that others refuse to reason, is a logical fallacy because it involves a refusal to pay heed to logic.  It’s irrational to ignore reason.  I’ll elaborate.  To be a rational person involves both a commitment and a pattern of behaviour.  You can think of these as the internal and external facets of rationality.  As a pattern of behaviour, being a rational person means that you believe and act according to the best evidence and logic available to you, that what influences your beliefs and actions is relevant and well-justified.  But that isn’t enough, because sometimes this just happens by accident.  Sometimes, without any intent or effort, you happen to do the right thing or the rational thing.  If there are three options open to you in a situation, and you just guess, you could choose the most rational option by chance – not because you recognized it as rational, but because it happened to be the one you picked.

And that’s why the second facet is important.  Being a rational person involves more than just acting and believing rationally.  It also involves a commitment to do so, to ensure that your thoughts and deeds are guided by evidence and logic, that you believe not because an idea seems fun or comforting, but because it’s justified.  That commitment (also an attitude) extends to your relationships with others.  The rational person attempts to help others act rationally as well.  That doesn’t mean you impose anything on them.  It means you provide the means for others to rationally assess options when it’s in your power to do so.  To be a rational person, then, you must make a long-term commitment to yourself and others.

To refuse to reason is to refuse to make that commitment.  And to prevent others from reasoning is to subvert their ability to be rational people by leading them away from evidence and reasoning that they need to make rational decisions.  People who are new to philosophy sometimes find this odd, because they’ve been led to believe that logic and ethics exist in two different worlds.  Not so!  They are inextricably intertwined.

Second, yeah – preventing people from listening to arguments and ideas that you don’t like may actually be the most effective way of ensuring that they won’t be persuaded by them.  It’s true.

At least, in the short term. 

As fundamentalists and cultists of every religion have learned, this tactic only works as long as you continuously re-poison the well to shield people from the arguments of your opponents.  It requires a long-term commitment of its own – seclusion, isolation, censorship, complete immersion in the bowdlerized fantasy world you’re trying to maintain.  You have to home school your kids, for instance, so you can carefully vet everyone they interact with, monitor everything they see and hear and experience, and continually reinforce their suspicion of the outside world if you want the Poisoning the Well strategy to keep working.  You have to make sure they are surrounded only by the party line: Veggie Tales videos and shelves of books by Tim LaHaye and James Dobson and Bob Larson, summer camps sponsored by the local church or its parent organization and staffed by carefully screened sycophants.  It’s a lot of work to maintain a Poisoning the Well strategy long-term.

Why?  Because there’s always the risk of an argument or an idea slipping through.  And though your kids might be predisposed to reject it if they believe it comes from the enemy – say, those dastardly secular humanists – what happens if they don’t know where it comes from?  Sure, you can predispose them to scoff at a secular humanist or scientist or scholar.  But that only works if they know the source of what they’re presented with.  If they can’t pin the idea to a source, they may reflect on it.  If they don’t know who’s made the argument, they might listen to it.  And then the strategy may fail.

And at that point, if they’re bright enough, they begin to understand what you’ve done to them. 

The long-term effect, then, the real kicker, is that eventually the Poisoning the Well strategy tends to backfire.  And instead of preventing people from listening to the enemy, all you’ve done is turn them against you, because nobody likes being manipulated.  Thus, not only is Poisoning the Well logically fallacious and ethically suspect, it’s also impractical.

As is often the case, the root of this particular fallacy is fear.  People use it because they fear the ideas of others.  They fear the consequences of those ideas, really.  But you don’t accomplish anything of worth by refusing to listen to someone else’s argument, and you accomplish even less by preventing others from listening. 

The fear is that people will become convinced by an argument you find offensive or dangerous or false.  But if that argument stands on its own merits, if it’s justified by logic and evidence, then it ought to be accepted, whether you like it or not.  In that case, you’ve no reason to believe it false.  If you do, you’re irrational and no one should bother with you anyway.

And if an argument is justified, if its conclusion is true, then only a fool would be offended by it.

And if a justified argument is dangerous, then you’re just going to have to find a way to cope with that danger.

And if your beliefs and ideas and arguments fail in the face of something more rational, you should accept that.  Abandon what is unjustified for what is justified, no matter how much you love it, for there is no virtue in an obstinate commitment to failure.

Reason, friends, can be a harsh mistress!
