SoftestPawn’s Weblog

Oh no, not another blog!

Archive for the ‘Metadebates’ Category

Articles about debating, arguing, discussing, and all that

Debunking the Debunking Handbook

Posted by softestpawn on January 16, 2012

SkepticalScience has published the Debunking Handbook, which is intended to summarise how to show an argument is wrong. Unfortunately… it is itself wrong in some fairly fundamental ways.

The summary at the beginning says:

“Debunking myths is problematic. Unless great care is taken, any effort to debunk misinformation can inadvertently reinforce the very myths one seeks to correct. To avoid these “backfire effects”, an effective debunking requires three major elements. First, the refutation must focus on core facts rather than the myth to avoid the misinformation becoming more familiar. Second, any mention of a myth should be preceded by explicit warnings to notify the reader that the upcoming information is false. Finally, the refutation should include an alternative explanation that accounts for important qualities in the original misinformation”

For a start, this is phrased to suggest that you already know what is fact and what is ‘myth’; i.e. this is not a way of evaluating an argument for its worth or otherwise, but a way of selling a specific argument.

It’s not a debunk manual, it’s a ‘spin’ manual.

First: the refutation must focus on core facts rather than the myth

Any non-trivial problem has myriad facts that can be interpreted in different ways to suggest different conclusions – this is what makes understanding people, the world and the universe so interesting. This guide says you should push the facts that support your views and avoid analysing those that contradict them. This is far from ‘debunking’ an argument; to focus on some facts and avoid others is spin.

The example given is the claim, made by some climate skeptics, that the sun has driven recent climate warming. The debunking is supposedly that the sun’s measured total radiation output does not match warming over the last few decades, and therefore the skeptic claim is wrong. By itself, this is Fine and Good, but it ignores the myriad effects that various solar outputs – different particles and radiation wavelengths – have on the atmosphere and so on temperatures. The conclusion may well be right, but the text ignores or oversimplifies the facts that support an alternate view and picks those that support the agenda of the so-called ‘debunker’. This is not a debunk, it’s a sell.

Second: any mention of a myth should be preceded by explicit warnings

This is an obvious statement of intent: a claim that the argument is wrong without saying why.

It’s not even a refutation, let alone a debunk.

Finally: the refutation should include an alternative explanation

This is clearly wrong as it has nothing to do with showing how the initial argument is wrong, and can result in missing the point.

If you claim that aliens move clouds around, I can counter with a similarly clueless argument that the clouds are sentient and move themselves. The discussion can then move to how silly it is that clouds are sentient, and so shift the focus away from evaluating the original claim about aliens.

At worst, having shown that it is silly to think clouds are sentient, one might (poorly) conclude that therefore aliens do indeed move clouds around, since that was the only alternative considered.

The Worldview Backfire Effect

It is ironic that such a publication should talk about how people are biased by their “worldviews and sense of cultural identity” without considering how these might affect the authors themselves.

In particular I enjoyed the phrase “Self-affirmation and framing aren’t about manipulating people” because, clearly, they are (see also, for example, Tversky and Kahneman’s very interesting article Judgment Under Uncertainty: Heuristics and Biases). That’s what is interesting about them.

Removing framing to get at the underlying objective data and arguments is extremely difficult, and will continue to be so while publications like the “Debunking Handbook” encourage others to muddy the waters.


Posted in Evidence Based Beliefs, Metadebates, Science

Aggregating Adversarial Argument

Posted by softestpawn on September 26, 2011

I am reasonably intelligent. I am interested in the topic, and so am well read and informed. My conclusions follow reasoned lines, match presented evidence well, and are rationally the most likely. They are scientific. Many other interested and intelligent people have also come to the same conclusions.

You, however, disagree. How can this be? Since my conclusions are rational and informed, you must be biased by your ideology, your vested interests. You must be selectively ignoring key evidence: denying the science. You must therefore be ‘anti-science’. Maybe you have been persuaded by ‘misinformation’ distributed by vested interests and lobbyists, using tried and tested techniques to appeal to your emotions rather than reason. Or maybe you just lack the mental skills to be able to properly assess the complexities and uncertainties.

Convinced? No? Then it must be because one ‘cannot reason someone out of a belief they did not reason themselves into’. I am still the rational, correct one, and you are, basically, unreasonable.

Yeah, right.

This inability to understand, assess, value and maybe even argue convincingly for widespread opinions that we disagree with is a sign of intellectual weakness in an argument. It suggests ideologically-based bias: we have been too narrowly selective about the ways in which we assess the facts, the reasons and the effects. Can those of us interested in these things make good cases for both evolution and creationism? For homoeopathy? Both for and against late-term abortion? Any abortion? For cake? Smoking in pubs? Permitting a new local Tesco’s? How much fruit and veg we ‘should’ eat? And for what?

If we cannot understand and adjust the priorities assigned to the evidence and reasoning that form the conclusions many other people hold, then we probably haven’t understood the problem properly. Even if later, with hindsight, we find we have come to the ‘correct’ conclusion, we will likely have got there by accident or social identity, not reason.

This is not about persuading or convincing, or about getting inside the heads of people you disagree with in order to change their mind. It’s about better understanding the problems and their associated issues, and so coming to conclusions that more usefully match the real world.

Posted in Metadebates

Cautiously Approaching the Precautionary Approach

Posted by softestpawn on August 25, 2009

Following a post about the appalling Precautionary Principle, some folks said I was a bit harsh and that I had constructed an extreme version that it was never meant to be.

I’ve not been given an alternative definition, but all the same let’s have a look at a softer ‘Precautionary Approach’. In this case we’ll consider it at its fuzziest and nicest: that we should not do dangerous things, and that if something might plausibly be dangerous, then the people who want to do it should prove it safe before they start.

All well and good it seems. This is the mellow, relaxed and comfortable “better safe than sorry” or “look before you leap”. It means people have to prove stuff is safe before they start doing it. And what can be wrong with that?

Well it’s wrong, wrong, wrong and wrong as follows:

  • It considers only direct harm
  • It includes the potential for possible harm, even with little or no supporting evidence
  • Proof lies with the advocates
  • It makes the decision for you

It considers only direct harm

Most activities include quite definite harms. These are traded against the benefits of doing, or indeed the harms of not doing: if I am being chased by a velociraptor, I am not going to pause and look to see what’s on the other side of the log I’m about to leap over. There is a much higher certainty of danger in not doing than in doing.

When it comes to more complicated activities, like building a factory or a hospital, we have definite harms: the land occupied will destroy wildlife, and the building will require energy and roads. On the other hand we will have the things the factory makes, the lives saved by the hospital, the employment. There are costs and benefits too from the houses, shops, recreation activities, schools, hospitals and factories needed by the workforce.

And the same effects will appear harmful to some and beneficial to others. Sheep grazing keeps the heather down. If you replace sheep with some other local industry, then the heather can run riot; this is good for the heather, but not so good for the moorland. Creatures that like living in heather will thrive, at the expense of those that like moorland. Are you a fan of heather or moorland?

Caution is not free. The continued ban on GM crops affects the wealthy very little; they can afford the warm, cuddly feeling of ‘protecting the environment’ at the expense of a little more wealth. That same ban, even if justified, is cold, heartless and cruel to those living on the edge of starvation, who die very young, who are always hungry, diseased and frail.

It includes the potential for risk of harm

Most activities carry some risk that they might cause harm. With the precautionary approach, we’re not only interested in the ‘known risks’, but also the ‘unknown risks’.

There is no indication of quite how much possible risk matters; any “plausible” cause is sufficient. So, for example, the initial study that drew a “link” between the MMR vaccine and autism was sufficient to “take precautions”. There was no actual evidence of a risk of harm (the single study that started the scare carried no significant weight against the existing evidence), but there was enough to indicate the potential for such a risk.

And that’s enough. Being generally precautious provides no help in evaluating the possible risk of harm, and so it all rather depends on the emotions and enthusiasms of those applying the approach. If you are scared or highly reluctant to use modern medicine, then the above tenuous link between MMR vaccine and autism is enough to “take precautions”, and keep your children safe from such injections. This exposes your children to the harm of not taking the vaccine, but that’s not part of the consideration.

So we end up making decisions based on lack of evidence; based on ‘fantasy’ where we can dig out any possible plausible danger and use it as if it was actual danger. You might as well make decisions based on the plausible possibility that aliens live in clouds.

Proof lies with the advocate

This is not necessarily part of being generally precautious, but it seems to come with the Precautionary Principle: if you want something, you must prove it’s completely safe.

Which has two problems:

(1) It’s rarely possible to prove that anything is completely safe. Lots of clever people have looked for links between the MMR vaccine and autism and found nothing, but maybe that’s just because they haven’t found the right thing yet. Maybe there are other dangers. And so anyone ‘being precautious’ can do so for ever, happy in the knowledge that, because they’re using the important-sounding “Precautionary Principle”, they are doing the Right Thing.

(2) Expecting someone with a vested interest (who wants to build the factory, or to vaccinate children with the MMR vaccine) to be fair about the evidence they present is a bit naive. Or rather, it’s a way of maintaining the Precaution no matter what evidence is presented. Because you can’t trust it, obviously: it’s from the evil capitalists who want to build the factory, or the evil drug companies who want to make money from vaccine injections.

It makes the decision for you. Or not

Evidence and analysis inform our decisions, but should not make them. Sometimes the decision is obvious based on that information but sometimes it is not.

The precautionary approach provides a way of shortcutting all that palaver, and tells you not to do something if it might be dangerous. Yet it contradicts itself: it will tell you opposite things depending on how you apply it to the same problem.

For example: “Do I get up tomorrow?” The precautious approach tells you that, because there are many possible (and known) harms in doing so, you should not. Yet “Do I stay in bed all day?” also has many possible (and known) harms, and the precautious approach tells you that you should not do that either.
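To make that self-contradiction concrete, here is a minimal sketch in Python of a rule that forbids any action with a possible harm attached (the ‘harms’ listed are invented purely for illustration). Applied to getting up and to staying in bed, it forbids both:

    # A minimal sketch of the "precautious" rule: forbid anything that has
    # any possible harm attached, however speculative. The harms listed here
    # are invented placeholders, purely for illustration.

    def precautionary_verdict(action, possible_harms):
        """The rule's advice: 'do not' as soon as any possible harm exists."""
        if possible_harms:
            return f"Do not '{action}' (possible harms: {', '.join(possible_harms)})"
        return f"'{action}' is permitted"

    get_up = ["trip on the stairs", "catch a cold", "road accident on the commute"]
    stay_in_bed = ["muscle wastage", "bed sores", "lose your job"]

    print(precautionary_verdict("get up tomorrow", get_up))
    print(precautionary_verdict("stay in bed all day", stay_in_bed))
    # Both verdicts say "do not": the same rule forbids an action and its
    # opposite, so it cannot actually make the decision for you.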

Instead you have to choose when to use it, and choose which way to phrase the same question in order for it to ‘work’, and that’s the giveaway that it’s not a useful scientific approach. It’s a way of providing a pseudo-scientific cover over an emotional approach.

So if we look at opponents of wind farms, nuclear power, road bypasses, new shops, and so on, we sometimes see people who find mere possibilities of potential harm and seem to think that this by itself is enough to call a halt.

The Reckless Approach

Taking the same approach in the opposite direction exposes the lack of evidence, reason and analysis behind the Precautionary Approach.

The Reckless Approach says that “if there is any potential benefit to the action, we should act”.

And if we apply this approach to any ideas, we can quickly see that we’ll end up doing lots of really rather dangerous things.

Risk Analysis

The way we really – and sometimes even scientifically – decide whether to do something is to weigh up the costs against the benefits, including the uncertainties, which are usually expressed as risks (potential costs or harms) and potential benefits.
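As a rough sketch of what that weighing can look like (in Python, with entirely invented probabilities and impacts; a real assessment would be far more involved), each potential cost or benefit gets a rough likelihood and size, and the expected totals of doing and not doing are compared:

    # A toy expected-value comparison: weigh the risks (potential harms) and
    # potential benefits of doing something against those of not doing it.
    # All figures are invented placeholders in arbitrary units.

    def expected_value(outcomes):
        """Sum of probability * impact over (label, probability, impact) tuples."""
        return sum(prob * impact for _, prob, impact in outcomes)

    # Hypothetical bypass: negative impacts are harms, positive are benefits.
    build = [
        ("habitat loss",                1.0, -30),
        ("wildlife deaths",             0.8, -10),
        ("faster, cheaper travel",      1.0, +60),
        ("fewer town-centre accidents", 0.5, +20),
    ]
    dont_build = [
        ("continued congestion",           1.0, -40),
        ("town-centre accidents continue", 0.5, -20),
    ]

    print("Build:      ", expected_value(build))        # 32.0
    print("Don't build:", expected_value(dont_build))   # -50.0
    # The point is not the numbers, but that the harms *and* benefits of both
    # doing and not doing are weighed, rather than one possible harm being
    # treated as decisive on its own.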

This might be brief and scrappy – as we run from the velociraptor, we have to make snap judgements on very little, but often very important, information. This is a long-running issue in the military (the snap judgements, not the velociraptors), where life-and-death decisions need to be made on little and uncertain information.

For the more complicated issues involved in environmental impacts, impact assessments are well established, if not entirely straightforward.

Which is perhaps the problem; carrying out a proper assessment is complicated, and involves making difficult decisions in weighing the various effects against each other. How do you weigh, for example, the many benefits of faster, cheaper travel provided by a bypass against the deaths of cute furry animals? Against the permanent loss of certain habitats?

But doing anything else is insufficient. To pick and choose amongst the effects of an action and wave only those around, as if they force only one possible decision, is not rational, scientific or even moral.

(See also Risky Business, Adam Curtis on the Precautionary Principle, SIRC on Beware The Precautionary Principle, Precautionary Principle, Evidence Based Belief, EU Commission’s Communication on Precautionary Principle)

Posted in Environmentalism, Metadebates, Politics

Deferring to Authority

Posted by softestpawn on August 1, 2009

“Deferring to Authority” is to claim someone’s opinion is valuable because they are an expert in the subject. “Arguing from Authority” is to claim that your opinion is valuable because you are an expert.

This is one in a series of posts about evidence and how it does or does not support a claim. Although this particular one is about opinion, not evidence…

There’s nothing very bad about using an expert’s opinion, but it can be seen as irrelevant in a controversial subject where the evidence is being discussed. If you are arguing about whether homeopathy can cure cancer then you want to look at the facts, not quote people who might have vested interests. Or others who might also have vested interests.

Workable Life

Few of us have the time, inclination, or expertise to carefully check every single thing that we decide. Using expert opinion is a perfectly sensible way to approach life.

There are just a few things to bear in mind:

  • Authority might not be Authority
  • Authority isn’t always expert
  • Authority can be wrong

Authority might not really be authority

Gillian McKeith is famous for handing out advice on diet, and claims to be an expert on the subject. She even used the title ‘Dr’ – a title that is restricted in the UK to try and prevent people from claiming a qualification they don’t have. The Advertising Standards Authority were notified, proceedings pursued, and she has been forced to (mostly) stop using it. She has no training, qualifications or authority for her opinions on poo, carrots or lard-fried chips.

But you’ll notice here I’ve not put any links; this is my opinion. It’s based on expertise gathered from far too many hours on Ben Goldacre’s BadScience web site, where the actions to stop her misuse of the title were first formed by some of the forum regulars. So I am an expert, but you should check it yourself, rather than believe me… Google is only a few clicks away…

Authority might not be an expert

Expertise is often quite narrow. People who work in the ‘environment’ industry are not experts in all of it; thus neither David Bellamy’s nor David Attenborough’s opinions on global warming are expert opinions. It goes further than that: an expert in one particular field of climate cannot claim expertise in the whole subject.

This specialisation can sometimes be quite surprising. Your General Practice doctor has quite a wide medical expertise, but when it comes to a road accident, you will probably be better off with an amateur St John Ambulance volunteer.

Some authority qualifications are so broad as to be meaningless and offer no real authority at all. A “scientist” is generally just somebody who researches something, and is no more intelligent for it than members of many other professions; a “government scientist” in particular is not an expert on everything, any more than you or I are.

Scientists running in packs (or ‘committees’ as they are sometimes known) are similarly suspect. Pronouncements from Committees of Scientists on things outside their fields of expertise should be treated cautiously.

Authority might just be wrong

This is perhaps too obvious to bother mentioning, but in any controversial field we expect differences of opinion amongst experts; some of them must be wrong.

And sometimes great swathes of experts in a field can be quite spectacularly wrong.

So?

Expert opinion is a perfectly sensible thing to use. My GP’s expertise trumps that of pretty much anyone else I know on medical matters.

Just remember that sufficient evidence trumps opinion every time.

Further Reading

Argument From Authority (Wikipedia)

Deferring to Authority: Popular Science Communication as a language of control (PDF)

Posted in Evidence Based Beliefs, Metadebates, Politics, Science

Do Aliens Live In Clouds? Evidence Based Belief

Posted by softestpawn on July 15, 2009

All day long we are told stuff: stuff that’s true, stuff that’s not, stuff that may be, stuff that’s ambiguous, stuff that’s incomprehensible, stuff that’s interesting, stuff that’s quickly forgotten. Stuff that might save your life. Stuff that might save your hair.

Sometimes it’s stuff that isn’t very important. Did you know, for example, that powdered fish scales are used to clear beer when it’s brewed? Unless you’re a beer-drinking vegan, you probably don’t really care if it’s true or not. Someone can tell you that in the pub and you can nod knowingly or look amazed or dispute it, whatever the social occasion, ego, company and number of beers consumed call for, and sod the facts. And many of us enjoy aromatherapy or horoscopes without caring whether they really do what they say they will.

But if it’s stuff that’s important then such gossip, rumour and hearsay isn’t good enough. If it’s going to affect our job (is that really illegal?), or our relationships (is she really having an affair?) or our health (will bacon really give you a heart attack?) or our family (will vaccinations really hurt your children? Will not vaccinating them be worse?) or our future (will using fossil fuels really kill us all?) then we need better information.

This is the first in a series of posts about how evidence can help tell us what to believe. It introduces the context and various principles in the paragraphs below, which link to more detailed explanations.

So we might defer to authority; that is, we ask someone who is an expert on the subject and use their opinion.

This is a generally workable approach, but if we want more than opinion (expert or not), rumour and gossip, then we need facts that are relevant: we need evidence. And we need enough evidence. Until then, it’s just a fantasy.

We need to be clear about what it is we’re trying to show, about whether in fact we can show it, and what that means to our idea.

We are rarely going to get “certain proof”, so we need to understand some ordinary things about uncertainty. It may be that some things are so uncertain that we shall just have to settle for not knowing.

Diversions are everywhere: if someone makes an extraordinary claim, it’s up to them to find evidence to support it; it’s not up to you to find evidence to disprove it. It’s not up to you to provide a workable alternative. And a rubbished alternative doesn’t make the original claim right.

As new evidence arrives, we need to understand how it supports (or contradicts) a new link in a chain of reasoning, or how it confirms (or contradicts) existing evidence. We need to check that it is actually new; that it’s not just the same facts wearing a different face.
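Purely as an illustration (the post itself uses no formal machinery, and the numbers below are invented), a toy Bayesian update in Python shows both halves of this: genuinely independent new evidence shifts our belief in a claim, while the same facts re-presented in a different guise carry no new information and shouldn’t be counted again:

    # A toy Bayesian update: one way of picturing how a new piece of evidence
    # shifts belief in a claim. All numbers are invented for illustration.

    def update(prior, p_evidence_if_true, p_evidence_if_false):
        """Posterior probability of the claim after seeing one piece of evidence."""
        numerator = prior * p_evidence_if_true
        return numerator / (numerator + (1 - prior) * p_evidence_if_false)

    belief = 0.5  # start undecided about the claim

    # Two genuinely independent observations, each more likely if the claim is
    # true than if it is false, legitimately strengthen the belief:
    for _ in range(2):
        belief = update(belief, p_evidence_if_true=0.7, p_evidence_if_false=0.3)
    print("After two independent observations:", round(belief, 2))  # 0.84

    # The same facts "wearing a different face" are not a third observation;
    # feeding them in again would overstate the evidence.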

Most of all, throughout what can sometimes be an eye-opening, invigorating and mind-blowing exploration of all this information, we must be careful not to introduce our own bias; that we don’t put our opinion before the evidence, and especially that we don’t collect only the evidence that fits our initially ignorant opinion.

In the end, when someone tells us that “Aliens live in clouds!!”, and it’s important for us to know, then our response is not “That’s silly, because…” but:

“Oh yes? Show me!”

In the meantime, we don’t know if aliens live in clouds, but we have no reason to think so.

And that’s alright.

(With thanks to the badscience discussion forums)

Posted in Evidence Based Beliefs, Metadebates, Science