
Bad science or bad behaviour?

Everyone has their prejudices and biases, scientists included. Publication bias is a problem that has long been acknowledged within the research community: the term describes instances where the research that appears in the published literature is systematically unrepresentative of the population of completed studies. Readers and reviewers of that research are therefore in danger of drawing the wrong conclusions about what that body of research shows. Outcome reporting bias is a related, more specific problem, defined as the selection for publication of a subset of the originally recorded outcome variables on the basis of the results.

These issues, and other sources of research bias, were the subject of a presentation delivered by Prof Matthias Egger at the recent Society for Academic Primary Care (SAPC) 45th Annual Scientific Meeting, which was hosted by the RCSI and the HRB Centre for Primary Care in the Printworks in Dublin Castle. Prof Egger, who is Director of the Institute of Social and Preventive Medicine at the University of Bern, Switzerland, gave a talk entitled ‘Bad science or bad behaviour? The reporting of medical research’, in which he discussed the quality of clinical research and the impact of standardised reporting guidelines.


Prof Matthias Egger

One of the major problems with medical research, according to Prof Egger, is that positive results are much more likely to be published than negative ones. Another problem is that novel studies reporting positive results are more likely to be published than subsequent studies on the same topic, even though the latter may be more methodologically sound.

“Often, what I observe is that a study that has showed something is often weak, but because the results are so interesting it gets into a top journal, and the very good studies that follow that are actually better than the first one do not make it into the top journal because they replicate it. But they are often methodologically better than the first,” Prof Egger told delegates.

‘If you only publish what appears to bring science forward, you will have a lot of false positives. You need the replication studies in order to establish that it is not a false positive’

He said these problems have been recognised by researchers and medical journal publishers for some time, and attempts have been made to solve them. In 1993, 30 experts — including medical journal editors, epidemiologists and methodologists — met in Ottawa, Canada, to discuss ways of improving the reporting of randomised trials. This meeting resulted in the Standards of Reporting Trials (SORT) statement, a 32-item checklist and flow diagram encouraging investigators to report how randomised trials were conducted.

Concurrently, and independently, another group of experts, the Asilomar Working Group on Recommendations for Reporting of Clinical Trials in the Biomedical Literature, convened in California, US, with a similar mandate. This group also published recommendations for authors reporting randomised trials.

In 1995, representatives from both groups met in Chicago, US, with the aim of merging the best of the SORT and Asilomar proposals into a single, coherent, evidence-based recommendation. This resulted in the Consolidated Standards of Reporting Trials (CONSORT) Statement, first published in 1996. Further meetings of the CONSORT Group in 1999 and 2000 led to the publication of a revised CONSORT Statement in 2001. The most recent CONSORT Statement dates from 2010.

Another, more recent, innovation has been the move to publish only the results of trials that have been registered.

The International Committee of Medical Journal Editors (ICMJE) decided that from 1 July 2005, no trial would be considered for publication unless it had been included on a clinical trials registry. The World Health Organisation (WHO) also pushed for clinical trial registration with the launch of its International Clinical Trials Registry Platform. There has also been action from the pharmaceutical industry, which released plans to make clinical trial data more transparent and publicly available. The revised Declaration of Helsinki, released in October 2008, states: “Every clinical trial must be registered in a publicly accessible database before recruitment of the first subject.”

However, Prof Egger said that these two initiatives have not had the impact he would have hoped for.

He referred to a Cochrane Review, published in 2012, which asked whether use of the CONSORT Statement affects the completeness of reporting of randomised controlled trials (RCTs) published in medical journals. A total of 53 reports describing 50 evaluations of 16,604 RCTs were assessed for adherence to at least one of 27 outcomes. Some 69 of 81 meta-analyses showed a relative benefit from CONSORT endorsement on completeness of reporting. Comparing endorsing and non-endorsing journals, 25 outcomes were improved with CONSORT endorsement, five of them significantly. The number of evaluations per meta-analysis was often low, with substantial heterogeneity, and validity was assessed as ‘low’ or ‘unclear’ for many evaluations. The results suggested that journal endorsement of CONSORT may benefit the completeness of reporting of the RCTs they publish, and no evidence suggested that endorsement hinders it. However, despite relative improvements where CONSORT is endorsed by journals, the completeness of trial reporting remains suboptimal, according to the review. The authors stated that medical journals were not sending a clear message about endorsement to authors submitting manuscripts for publication; as such, the fidelity of endorsement as an ‘intervention’ has been weak to date. They recommended that journals take further action on their endorsement and implementation of CONSORT to facilitate accurate, transparent and complete reporting of trials.

Prof Egger also stated that a study published in the Journal of Clinical Epidemiology found compliance with prospective trial registration guidance remained low in high-impact journals, which has implications for primary end-point reporting.

The study found that 40 of 144 articles (28 per cent) did not comply with the registration policy and were retrospectively registered.

“Again, I’m disappointed because the editors actually don’t adhere to their own guidelines; they don’t check these things thoroughly and it is possible for quite a large number of trials to slip through the system,” he said.

Another review found that 10 years after the ICMJE declaration, one-quarter of phase 3 paediatric epilepsy clinical trials still remain unpublished.

“So registration doesn’t mean that the full dataset or the full results became available,” he said.

“It helps because you know this trial has been done and you can make an effort to get the results but again, it is disappointing that so many trials, even if registered, are not published.”

He argued, however, that hope is on the horizon.

In the past couple of years, a movement has started to have studies in the life sciences published on the basis of their hypotheses and methodologies rather than their results. Under the initiative, called Registered Reports, editors commit to publish on the basis of study protocols, before the results are known. Peer review takes place before the experiments are conducted, and passing this stage of review virtually guarantees publication.

To raise the profile of the initiative, an open letter was written to The Guardian newspaper in the UK and was signed by more than 80 scientists and members of journal editorial boards. In addition to the journal Cortex, where the first completed articles were published, the format has been taken up by more than a dozen journals across neuroscience, psychology, psychiatry, biology, nutrition and medicine (see box below). The originator of Registered Reports, Mr Chris Chambers, said that the proposed publication process will eliminate the need for scientists to strive for “publishable” results.

Some of the scientific journals that use the Registered Reports method include:

AIMS Neuroscience

Attention, Perception, and Psychophysics

Cognition and Emotion

Cognitive Research: Principles and Implications

Comprehensive Results in Social Psychology

Cortex

Drug and Alcohol Dependence

Experimental Psychology

Human Movement Science

International Journal of Psychophysiology

Journal of European Psychology Students

Journal of Personnel Psychology

Journal of Media Psychology (Editorial)

Nicotine and Tobacco Research

NFS Journal (Announcement)

Perspectives on Psychological Science

Stress and Health

Work, Aging and Retirement

Speaking to the Medical Independent (MI) following his presentation, Prof Egger said: “This is all relatively new. It sort of came up a few years ago and now increasingly journals are adopting it. It is far from being widely adopted or being a standard. It is still the exception. And you saw the list of journals I showed, it looks like 20 or so, but it is a small minority of the journals that allow authors to submit. I think we will need to see what happens and to what extent other disciplines adopt that idea. But I think what is very important is that there is a promise to publish, otherwise you have the same problem if the methods and the procedures and the analysis plan are high quality, are sound, then you need to have the promise that the paper will be published when the results become available. If you only publish what appears to bring science forward, you will have a lot of false positives. You need the replication studies in order to establish that it is not a false positive.”

The SAPC 45th Annual Scientific Meeting was attended by more than 400 researchers and clinicians who discussed key topics in primary care research, including diagnosis and safety, multimorbidity, mental health, dementia and prescribing. It had attendees from Ireland, the UK, Sweden, Italy, Denmark, New Zealand, Australia, Canada and the US.
