Posted on Friday 19 June 2009 at 8:27 am by Jacob Aron
In Getting It Right, Getting It Wrong


“Science is inevitably biased to some extent,” says Dr Daniele Fanelli, “because it’s made by human beings.” One might easily dismiss this claim as unfounded, but Fanelli has the numbers to back it up. His recent research paper combined over 20 previous studies on scientific misconduct, and found that nearly 2% of scientists admit to falsifying or fabricating data.

Whilst most scientists would shudder at the thought of distorting or inventing results, it seems that a small number are prepared to do so. Fanelli, a researcher in science and technology studies at the University of Edinburgh, believes quantifying and identifying this practice is essential to improving science.

He’s not alone. The UK Research Integrity Office (UKRIO) is an independent advisory body set up in 2006 to support good practice in research and help address cases of scientific misconduct. UKRIO head James Parry stresses that whilst misconduct is not a common occurrence, it is a problem. “We need to take steps to actively promote good conduct and research,” he says.

What causes a scientist to turn away from good conduct, and good science? Fame and fortune are obvious answers, but Fanelli argues some scientists might feel forced into it. “There is an excessive pressure to publish, an excessive reliance on publication record to assess scientific careers.” With scientists needing to keep up appearances, perhaps publishing a falsified paper in an obscure journal seems like the only solution.

It isn’t just smaller journals that fall foul of misconduct, as even the giants of the science publishing world can get it wrong. Parry recalls the case of Jan Hendrik Schön, a physicist at Bell Labs in New Jersey. Over the course of a few years Schön published a slew of papers on superconductivity in high profile journals, including Science and Nature. “It turned out he was faking results,” says Parry. “Some of the data used in one paper had actually been used in another – he’d just labelled it differently.”

Intentionally mislabelling data is high on the list of crimes against science, but Fanelli’s research shows that a much larger proportion of scientists are guilty of lesser offences. One third of those asked admit to a variety of “questionable research practices”, including dropping data based on gut feeling or allowing funding sources to influence a study. Whilst these may just be the research equivalent of a parking ticket or speeding fine, their high prevalence is worrying.

More worrying is that the true misconduct figures could be even higher. Scientists in the surveys Fanelli analysed were self-reporting, and may have chosen not to admit their misconduct. When asked about their colleagues, 14% reported knowing someone who had falsified results, whilst 72% suggested other questionable research practices were taking place. Even these figures don’t paint the whole picture, because one case of misconduct could be reported multiple times. “How these figures relate to the true frequency of misconduct is partly an open question,” says Fanelli.

Whilst just answering a survey might be easy, actually dealing with a colleague’s misconduct can be harder. “It’s a very stressful situation,” explains Parry, but the UKRIO can help. “If someone comes to us with concerns, we offer confidential and independent advice and guidance.” This support can play a crucial role in exposing potentially harmful misconduct, especially when it comes to health and biomedical research. “It’s the area where there is the most potential for mishap if things go wrong,” says Parry.

It is also the area with the most reported misconduct. “Medically related research has consistently higher admission rates,” says Fanelli. There are two possible explanations for this. Perhaps these researchers are more aware of issues surrounding scientific misconduct and so are more honest, or maybe misconduct rates simply are higher in medicine. Both explanations could be true.

Should we be concerned that we don’t know how many researchers are cooking the scientific books? Fanelli believes this behaviour is not necessarily bad for science, because dodgy data can be used to support research that is subsequently accepted as true. The 19th century scientist Gregor Mendel was posthumously accused of reporting data that was too good to be true, yet his work forms the foundation of modern genetics. Thus science is self-correcting in the long term, but for contemporary research, misconduct is more of a problem.

The solution, says Fanelli, is greater transparency. “Scientists should report more faithfully what they actually did.” He suggests that if dropping a few data points lends weight to an argument then scientists should go ahead and do so, but must admit to it. And of course, he practises what he preaches: “I’m trying to be as unbiased and objective as I possibly can.”

Fanelli, D. (2009). How Many Scientists Fabricate and Falsify Research? A Systematic Review and Meta-Analysis of Survey Data. PLoS ONE, 4(5). DOI: 10.1371/journal.pone.0005738

2 Comments

  1. It’s interesting that there’s a huge taboo against just inventing data out of thin air – rightly so of course – but there is much less of a taboo against not publishing data, i.e. sweeping it under the carpet.

    If anyone were proven to have made data up, their career would be over. But every scientist knows that “uncomfortable” data often doesn’t get published. It’s a running joke at scientific meetings. Yet the consequences are just as bad.

    To take an extreme example, if someone made up a result with p=0.05, they would be sacked. But if they did a lot of work and ran twenty statistical tests until they found a “significant” p=0.05 result just by chance (which will happen in roughly 1 in 20 tests, by definition of the threshold), they would… be published. It’s a big problem.

    By Neuroskeptic on Friday 19 June, 2009 at 12:46 pm
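The commenter’s multiple-testing point is easy to check with a short simulation (a minimal sketch in Python, not from the article; the function name and trial count are illustrative). When every null hypothesis is true, each test still crosses the 0.05 threshold one time in twenty by chance, so a batch of twenty tests yields at least one spurious “significant” result most of the time:

```python
import random

random.seed(0)

def any_false_positive(n_tests=20, alpha=0.05):
    # Under the null hypothesis a p-value is uniform on [0, 1],
    # so each test comes out "significant" with probability alpha
    # by chance alone.
    return any(random.random() < alpha for _ in range(n_tests))

trials = 100_000
hits = sum(any_false_positive() for _ in range(trials))
rate = hits / trials
print(f"Chance of >=1 spurious significant result in 20 null tests: {rate:.2f}")
```

The simulated rate agrees with the analytic value 1 − 0.95²⁰ ≈ 0.64: running twenty unplanned tests makes a chance “discovery” more likely than not, which is exactly why undisclosed data-dredging can be as misleading as outright fabrication.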

