SoftestPawn’s Weblog


Barriers to Open Research

Posted by softestpawn on December 7, 2009

One of the many so-called ‘requirements’ of good scientific research is openness, and the mythical welcoming of criticism that will result.

Humans don’t like doing this and tend to have to be forced to, through rigorous process, reams of paperwork and checks, and evil picky enforcers who come around and badger you to follow the process – the process that not only means you have cleaned out the rotating cell-nutrienting widget breeder, but also shows that you have done so.

Full disclosure is a basic requirement of openness. Such disclosure is not necessarily public; it just means you have to show your working – all your working – so that other people can come along and check it.

Importantly, it also means showing it to people who will criticise it. That’s the point. If you just show it to your mates, there’s all sorts of social conventions that get in the way of proper criticism.

You need to show it to your rivals. To those who really will take it apart piece by piece. Otherwise you’re not really subjecting it to proper scrutiny. You ready for that? You want to do it? eh? eh? Then you’re a Proper Scientist, doing Proper Science. You Urban Spaceman you.

Thoroughness takes time

Because by making it ready for your competition to look at, you will run it through every possible check you can think of, that your mates can think of, your colleagues, boss, drinking companions, any handy six-year-old child. That’s a lot of work. It’s effort that takes you away from ‘real’ research. And it’s not fun – who likes writing up?

The Reputation

And you might even find stuff that’s wrong, with all the implications that has for your reputation and career and funding, let alone the extra work to fix it. It’s so much easier to just publish the results and ‘lose’ the rest of it… maybe no-one will notice… …perhaps for years …maybe not until you move on.

The Confusion

And besides which, if you publish it people might misread it. Misrepresent it. Those who aren’t experts like you might get ‘confused’ about your conclusions. They might ‘confuse’ others.

As an aside, I do find the bandying about of ‘misinformation’ and ‘confusion’ by the more extreme climatologists quite ironic. All information can, of course, also be misinformation, and any complex subject is likely to be confusing. That they feel they can declare what is clear information while denying open scrutiny is hubris, not expertise.

It’s spectacularly ironic given the repeated use of the term ‘consensus’ – a piece of “misinformation”, as it’s badly defined and not properly measured, and “confusing”, as it’s not even relevant.

The Education

A mostly sensible comment via stoat’s blog brings out another problem with publishing more than you have to: the more you publish, and the more publicly, the wider the potential audience of less expert people who will want to poke at your data and ask about it.

Though to use that as an excuse to deny FOI requests for data and code, rather than lessons, is ‘misleading’ to put it mildly.

And to complain that you’re receiving lots of requests for the evidence for something you claim is globally vital for the survival of the human race seems a little, well, disconnected.

The Conclusion

So there are no real incentives to be open and thorough, and plenty against it. In some industries the long-term benefits (i.e., showing you’ve got it right) have been recognised and are forced onto unwilling staff, to a greater or lesser extent.

We have seen the effects of not having this imposed on the CRU team; the comments in Harry’s Readme (if it’s real) demonstrate quite how poorly the work is understood even internally. If these had to be prepared for external release, just think of all the extra work required to check everything rigorously. Of course, if it had already been openly published (even if not publicly), we wouldn’t see comments like Harry’s. Poor chap.

So what does this tell us about other climate research? Nothing really. That’s the point.

Posted in Environmentalism, Global Warming, Science

Humans Don’t Do Science

Posted by softestpawn on September 27, 2009

“Science”, we are frequently and rightly told, relies on being open about what it is doing and how, on welcoming informed criticism, on being willing to drop discredited ideas, on experimenting to test theories to try and break them, and generally progresses by proving existing ideas are not (quite) correct.

(This assumes a modern somewhat subverted meaning of ‘science’ which will purple the pedants, but it will do for now. As will this:)

‘Scientists’ are those who do the research that brings us more and better science.

The implication, then, is that as science needs the above to work, and scientists do science, scientists are therefore open, willing to drop their concepts when discredited, welcoming of informed criticism, and so on.

Wot tosh.

Humans eh?

Scientists are human, and so are as selfish, greedy, proud, sociable, sensitive, prejudiced, noble, dislikable, charming, arrogant when given half a chance, and generally as emotionally involved as other humans. And some scientists seem to be unaware that this breaks the requirements to ‘do science well’.

It should surprise nobody that people get emotionally involved in their work, especially if it requires a lot of effort, some specialised skills, and the results look good and are valued. This applies to most of us, and it applies just as well to a scientist who has developed a respected theory. Nobody welcomes criticism of work they are proud of.

Reputations are based on theories and ideas too, not directly on rigour in the workplace. Newton is remembered for his laws of motion and gravitation (and a mythical accident for discovering the latter), not for his work practices.

Similarly, sometimes a huuuge amount of time, effort, money and reputation is invested in developing certain concepts, and few people can be objective when assessing their own life’s work. Skills and knowledge are accumulated and not willingly abandoned. Dark matter, neutrinos, the search for the Higgs Boson, for example, are all current research programmes that might turn out to be a complete waste of time, but there is a tremendous momentum in pursuing those particular concepts.

And so sometimes there are quite large communities of people that are emotionally invested in certain concepts. Since funds are often limited, these communities can make up quite large proportions of the overall field (the CERN experiments suck up quite a lot of the physics community’s funds). If we ask certain slices of the research community what the ‘consensus’ is on a topic, the results are biased by the various investments of these communities.

Iterative Steps

Of course the key here is that we are looking at research, where we are investigating things we don’t know very well. As soon as we run a proper experiment to test the theory, the people-yness of those involved becomes nearly irrelevant.

In the meantime we can perhaps rely on the ‘iterative’ nature of science; that we can count on the overall continual reviews to eventually correct mistakes and improve on theories. This, though, is not a set of incremental improvements, where we gradually work our way closer to the ‘truth’ in the manner of many mathematical iterations. Some models have to be completely abandoned, not just improved on.

Such a messy approach is perhaps fine for general research, but is insufficient if we need to act upon it. In some cases (such as education, climate change, materials to build bridges, buildings and airplanes) we need to assess what we actually know, and know now, from amongst all the people-y assumptions and reputations and opinions.

Being Scientific

One of the key aspects of really scientific disciplines, including ones outwith research, is that we remove the people-yness of those involved as much as possible.

For example, we record all the data and methods (“audit”) because we expect to make mistakes, and so we need to be able to go back and check every step.

We let others have access to this (“full disclosure”) because, again, we expect to make mistakes, and so we need to let other people check every step. It also helps to compensate for some of the ordinary people problems; if you know the details of your work are going to be scrutinised by all and sundry, you tend to be much more careful with that work, and much more careful with drawing conclusions from it.

We run formal assessment reviews to check methods, data sources, and citations.

Where possible, experiments are designed to remove ordinary personal biases, such as the ‘double blind trials’ used to test whether medical treatments work.
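The mechanical core of blinding is simple enough to sketch. As a loose illustration only – not any specific trial’s protocol, and all the names here are invented – random assignment behind opaque codes might look like:

```python
import random

def blind_assign(participants, seed=None):
    """Randomly assign participants to 'treatment' or 'control'.

    Returns (coded, key): staff and participants see only the opaque
    codes in `coded`; the `key` mapping codes to actual arms stays
    sealed until the data are collected ('unblinding').
    """
    rng = random.Random(seed)
    coded = {}   # participant -> opaque code (what everyone works with)
    key = {}     # code -> actual arm (held back until analysis)
    for i, person in enumerate(participants):
        code = f"P{i:03d}"
        coded[person] = code
        key[code] = rng.choice(["treatment", "control"])
    return coded, key

coded, key = blind_assign(["alice", "bob", "carol", "dave"], seed=42)
# Outcomes are recorded against codes like 'P000'; only after data
# collection is `key` opened to compare the two arms.
```

Neither the person administering the treatment nor the one assessing the outcome ever handles anything but the codes, which is what keeps the ordinary human biases out of the measurement.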

These extra tasks are tracked and checked and recorded, to make sure they are done.

Except that we don’t even do all these very well. It’s expensive, and it diverts effort from the task (even if it improves the quality of knowledge overall), and so we tend to bypass them when we can. They only tend to be properly implemented where we need very, very high levels of confidence – such as in medicine, and in bridges, buildings and airplanes – and are willing to pay for it.

It’s an odd leftover from the past that we don’t require the same rigour for informing public policy, such as in education, re-employment, and major environmental impacts.

And when scientists from some of the more ‘careless’ disciplines hold forth, we ought to consider carefully whether their views have been as openly, rigorously and systematically checked as they imply – or even believe themselves.

(“The Golem: What you should know about science” is a much more thorough take on the above)

Posted in Politics, Science