by Rachel Aston, Ruth Aston, Timoci O’Connor
How often do we really use research to inform our evaluation practice? Many of us tend to use research and evidence to help us understand what we are evaluating, what outcomes we might expect to see and in what time frame, but we don’t often use research to inform how we do evaluation.
At both the Australian Evaluation Society’s International Conference in September and the Aotearoa New Zealand Evaluation Association conference in July, Rachel and Ruth Aston, Timoci O’Connor and Robbie Francis presented our different perspectives on this discussion—Tim and Rachel as evaluators, Ruth as a researcher, and Robbie as a program practitioner.
Monitoring program design and implementation for continuous improvement
In Ruth’s study of 7,123 complex interventions to reduce exposure to modifiable risk factors for cardiovascular disease, she found that two program-specific factors can influence the magnitude (amount) of an intervention’s impact. These were:
- design: what the intervention looks like
- implementation: how, and how well, the design is enacted within a specific context.
Eleven specific indicators make up these two factors, but Ruth found that 80 per cent of reviewed interventions did not monitor many of these indicators. She concluded that often we don’t monitor the indicators that can give us critical information about how to improve the effectiveness of complex interventions.
Research can help us address this practice gap. Evaluative thinking, along with design thinking and implementation science, can help us operationalise and embed the processes, principles and decision-making structures needed to facilitate progressive impact measurement and continuous improvement.
An evaluation practice example
Rachel is currently working on a three-year impact evaluation of a stepped care model for primary mental healthcare services. One of the challenges in this project is that mental health outcomes are unlikely to shift over the course of the evaluation. Further, the stepped care model itself is dynamic: it’s being rolled out in three stages over two years, but progress towards impact needs to be monitored from day one.
By incorporating the research on the importance of monitoring design and implementation, we are able to look at the quality, fidelity and reach of implementation of the stepped care model. One of the tools we’re using to do this is the Consolidated Framework for Implementation Research (CFIR), a validated framework comprising a large number of constructs, developed by Laura Damschroder and her colleagues (https://cfirguide.org/).
The constructs and overall framework can be used to build data collection tools, such as surveys, interview schedules and observation protocols, and to develop coding frameworks for analysis. By using the CFIR and focusing on how, how well and how much the stepped care model has been implemented, we can develop actionable feedback to improve implementation and, consequently, the effectiveness of the model.
A program practitioner’s perspective
Robbie Francis, Director of The Lucy Foundation, described how the Foundation has used information gained from the monitoring and evaluation of the design and implementation of its Coffee Project in Pluma Hidalgo, Mexico. Her reflections reinforce how adaptations can be made to program design and implementation to improve the potential for impact. Robbie also offers an important practical message about the place of principles in evaluating the impact of the Coffee Project.
[Video: Robbie Francis, The Lucy Foundation]
We have a role and a duty as evaluators to use the evidence we have at hand to inform and enhance our practice. This includes traditional research, evaluation practice experience, and program practitioner insights.
While this is important for any evaluation, it is arguably more important when evaluating complex interventions aiming to achieve social change. If we are going to continue to invest in and evaluate complex interventions, which seems likely given the challenging nature of the social problems we face today, then we need to think critically about our role as evaluators in advocating for the importance of:
- effective design
- attention to implementation science in planning and roll-out
- effective monitoring
- developing tools and measures for quality monitoring of implementation and intervention design
- sharing and disseminating exemplars of quality intervention design and implementation
- working with policy makers and commissioners so that evaluations of complex interventions can use design and implementation as proxy indicators of impact during early rollout.
Above all, we need to accept, review and use all forms of evidence at our disposal. This will enable us to continually learn, become evidence-informed practitioners and use evaluative thinking in our work to improve our practice and to generate useful, accurate and timely evaluation.
Damschroder, L., Aron, D., Keith, R., Kirsh, S., Alexander, J. and Lowery, J. (2009). Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implementation Science, 4(1).
Dr Ruth Aston
Research Fellow, University of Melbourne
Ruth has nine years’ experience in research and project evaluation. Ruth has managed several large-scale evaluations across Australia and internationally. She recently completed her PhD on 'Creating Indicators for Social Change in Public Health'. She also has a strong interest in interdisciplinary research with diverse cultural groups.
Rachel Aston
Senior Consultant, ARTD Consultants
Rachel is an experienced social researcher and evaluator who joined ARTD in 2018. She brings over six years’ experience conducting research and evaluation for government, NGOs and in the higher education sector. Rachel’s academic background is in anthropology and social research.
Timoci O’Connor
Lecturer, University of Melbourne
Timoci has over ten years’ experience in conducting research and evaluation projects in the public health, education, international development and community sectors. He holds a Master of Public Health and is currently completing his PhD, which explores the nature of feedback in community-based health interventions that use mobile technologies and describes its influence on program outcomes. He is I-Kiribati/Fijian.
Robbie Francis
Director, The Lucy Foundation
Robbie Francis is a young woman who has packed a lot into 29 years. Having lived with a physical disability since birth, she has worked in the disability sector for over a decade as a support worker, documentary maker, human rights intern, researcher, consultant and as an advisor. In 2014 Robbie co-founded The Lucy Foundation, a social enterprise committed to empowering people with disabilities by working with local communities to promote education, employment and a culture of disability inclusiveness through sustainable trade.