
Table 1 Factors promoting the likelihood of acceptance or rejection from Implementation Science by manuscript type

From: Enhancing the reporting of implementation research

For each manuscript type, the factors promoting the likelihood of acceptance, the factors promoting the likelihood of rejection, and the preferred reporting methods are listed below.

Debate
Factors promoting acceptance: Papers which question or challenge existing implementation policies, practices, evidence or theory and suggest modifications or alternatives.
Factors promoting rejection: Papers which fail to contextualise their work in, or demonstrate how they build upon, the existing implementation research literature.
Preferred reporting methods: N/A

Effectiveness
Factors promoting acceptance: Studies that fit our journal scope, that employ rigorous experimental or quasi-experimental designs (i.e. designs eligible for inclusion in Cochrane EPOC reviews), and that evaluate the implementation of an evidence-based practice or policy, or the de-implementation of practices demonstrated to be of low or no clinical benefit.
Factors promoting rejection: Studies which lack a rigorous study design, such as quality improvement reports, service evaluations or uncontrolled before-after studies; studies evaluating the effectiveness of novel clinical, organisational, public health or policy interventions.
Preferred reporting methods: CONSORT for trials

Economic evaluation
Factors promoting acceptance: Any cost-effectiveness analysis that compares the costs and outcomes of two or more implementation strategies; cost and cost-consequences analyses where disaggregated costs and outcomes are presented.
Preferred reporting methods: CHEERS

Intervention development reports
Factors promoting acceptance: Reports prepared and submitted prior to the reporting of the effectiveness of the intervention; plans for (robust) evaluation are made explicit; an empirical and/or theoretical rationale is provided.
Factors promoting rejection: Post hoc submission (i.e. submitted after the reporting of the effectiveness of the intervention); no plans for (robust) evaluation.
Preferred reporting methods: None specified

Methodology
Factors promoting acceptance: Articles that present methods which are either completely new or offer an improvement to an existing method; articles reporting empirical comparisons of one or more methodological approaches, or which clearly state what they add to the existing literature.
Factors promoting rejection: Descriptive accounts of largely established methods without any associated novel methodological insights.
Preferred reporting methods: N/A

Pilot and feasibility studies
Factors promoting acceptance: Studies that fit our journal scope and are conducted with the explicit purpose of assessing feasibility and planning for an intervention that is expected to contribute to existing knowledge; studies indicating how a subsequent study will draw from the pilot study; clear plans for further evaluation, or clear reasons for not undertaking one.
Factors promoting rejection: No justification for conducting the study; over-claiming on the basis of the results.
Preferred reporting methods: None specified

Process evaluation
Factors promoting acceptance: Studies that fit our journal scope, are submitted contemporaneously with or following reports of intervention effectiveness, and take account of the main evaluation outcomes; studies evaluating fidelity of implementation, mechanisms of impact, and/or contextual influences on implementation and outcomes.
Factors promoting rejection: Process evaluations submitted in advance of the conduct of the main effectiveness analysis (it cannot be clear whether they are explaining an effect or the absence of an effect); process evaluations that do not take account of the main evaluation outcomes.
Preferred reporting methods: None specified

Protocols
Factors promoting acceptance: Protocols that fit our journal scope and our inclusion criteria for rigorous study designs; that have been through a competitive peer-review process to receive funding from a nationally or internationally recognised research agency; that have received appropriate ethics review board approval; and that have been submitted at one of three possible time points: (1) within 3 months of ethics approval, (2) prior to enrolment of the first participant/cluster, or (3) before the end of participant/cluster recruitment (i.e. prior to the commencement of data cleaning or analysis).
Factors promoting rejection: Protocols that have not been the subject of peer review by a national or international research agency; protocols that have not received ethics review board approval; protocols for quality improvement or service evaluations, which lack a rigorous study design; protocols for pilot or feasibility studies; protocols for systematic reviews and other types of synthesis (we usually refer these to the BMC journal Systematic Reviews); protocols submitted for studies where data cleaning and analysis have begun.
Preferred reporting methods: As SPIRIT was developed for clinical trials, we prefer authors to complete, as far as they can, the CONSORT checklist or the appropriate extension.

Qualitative studies
Factors promoting acceptance: Studies that fit the journal scope and meet applicable criteria for quality and validity.
Factors promoting rejection: Studies where there are doubts about whether planned data saturation has been achieved; single-site case studies with limited typicality; studies that fail to link to relevant theory, lack contextualisation, or make little reference to previous relevant qualitative studies or reviews.
Preferred reporting methods: None specified

Short reports
Factors promoting acceptance: Brief reports of data from original research which present relatively modest advances in knowledge or methods.
Factors promoting rejection: Reports of meetings, ‘doing implementation’ or ‘lessons learned’.
Preferred reporting methods: N/A

Systematic reviews and other syntheses
Factors promoting acceptance: Systematic reviews and other types of synthesis (such as rapid, realist or scoping reviews) that fit our journal scope and which may cover issues such as the effects of implementation interventions and/or influences on the uptake of evidence.
Factors promoting rejection: Non-systematic or narrative literature reviews that fail to use explicit methods to identify, select and critically appraise relevant research; reviews and syntheses that fail to adhere to recognised quality and reporting standards.
Preferred reporting methods: PRISMA; RAMESES for realist reviews