Table 2 Summary of the lessons learned from our experiences of ‘doing’ evaluation

From: The practice of ‘doing’ evaluation: lessons learned from nine complex intervention trials in action

| Lesson learned | Summary of learning |
| --- | --- |
| 1. Different interpretations of study objectives and ‘success’ among the team | Through pre-intervention and ongoing training, foster a shared understanding across the entire study team of why data are being collected, the processes and goals valued in the study, and how individual practice feeds into the study’s rationale and outcomes. |
| 2. Value of good communication to address challenges as they arise in the field | Plan intra-study communication structures carefully to ensure staff at all levels feel empowered to engage in reflection on the progress of the evaluation and the interpretation of its outcomes, for example through frequent, supportive meetings and clear mechanisms for reporting and managing issues that arise. |
| 3. Dialogue between different components of the evaluation | Establish mechanisms for ongoing collaboration between sub-study teams, to share experiences and observations from across study components, to encourage interpretation of research activities as the trial progresses, and to facilitate the synthesis of data from different disciplinary perspectives at the analysis stage. |
| 4. Value of the role of field research coordinator | Recognise and support the vital role of a field research coordinator in bridging the everyday, practical project management of a study with an ongoing, scientific interpretation of evaluation activities, which can feed into generating meaningful results. |
| 5. Value of collecting field notes during evaluation | Promote continuous, inward reflection on the activities of an evaluation among team members through mechanisms for collecting, regularly reviewing and storing field notes, helping to make more meaningful interpretations of trial results at the analysis stage. |
| 6. Recognition of, and reflection on, overlap between intervention and evaluation | In addition to careful planning and piloting of evaluation activities, establishing and maintaining the processes and structures described above should support the timely identification of, and reflection on, possible overlaps between intervention and evaluation activities, feeding into the interpretation of trial results and usefully informing future implementation of the intervention. |