
May 2020
by Jade Maloney

Over the last couple of months, evaluators around the world have been grappling with the question of whether and how we evaluate in the COVID-19 context. What can and should be done now, and what should wait? How can we be most useful?

For a recent online session with AES members, which Keren Winterford, Greg Masters and I hosted on behalf of the NSW Committee, I rounded up a range of reflections on these questions to prompt discussion.

We need to consider carefully whether to pause or press on

Evaluation is the oxygen that powers decision making. Too little and we are likely to make poor decisions. And when faced with big challenges, we need more than usual. Too much evaluation without action leads to hyperventilation. Analysis paralysis. As an evaluator, it is your responsibility to keep the breathing steady. [Chris Lysy]

To decide whether to pause or press on with our existing evaluations, we need to ask ourselves a series of questions.

Can it be done without undue stress to an organisation responding to COVID-19? At the best of times, evaluation can be anxiety inducing; does the organisation or team have the bandwidth to engage?

Can the evaluation still be useful now? Can you adapt your questions? Can you assess how COVID-19 adaptations are working? Can you help to identify what adjustments should continue post COVID-19?

Can you adapt your methods to comply with physical distancing? Will the people you are trying to engage, engage online? Can you draw on existing data sources?

The World Bank’s Adapting Evaluation Designs sets out four questions that you can adapt to work through whether to press on. Better Evaluation has also begun a series on Adapting evaluation in the time of COVID-19. Part 1: MANAGE has a range of useful prompts to help you work through changes to stakeholder context, engagement, decision-making protocols, information needs, Terms of Reference and budgeting.

Think beyond individual “evaluations” to tap into our value

I think one of the key gaps or aspects I don’t see addressed much is around utility of evaluation in this space. A lot of the discussion online is around the ‘how’ – how do we adapt evaluation? But I feel a deeper question is around the ‘why’ of evaluation. Why is it still important to do evaluation in this context? Is it actually important to making a difference? This is quite a tricky question and one that can make an evaluator really uncomfortable as it forces us to reconsider our work. But, on the contrary, I see this as an opportunity to reinforce our conviction, sense of purpose and clarity. Evaluation was already often an after-thought and now urgent customer-facing delivery initiatives are definitely taking priority. The case for evaluation will be harder to make. We need to genuinely think about the value evaluation can bring in these times and more broadly. [Florent Gomez, NSW Department of Customer Service]

As Michael Quinn Patton has said, we need to be prepared to make a case for the value of evaluation now. We can do this by proactively demonstrating the ongoing relevance of evaluative thinking, supporting real-time sensemaking of data, engaging in systems thinking (identifying the interconnections and their implications), enabling decision-making based on “good enough” data, and identifying the potential for negative unintended consequences so they can be prevented. In other words, “All evaluators must now become developmental evaluators, capable of adapting to complex dynamic systems, preparing for the unknown, for uncertainties, turbulence, lack of control, nonlinearities, and for emergence of the unexpected.”

For guidance on sense-making in real time, check out Canadian facilitator Chris Corrigan’s blog. First, observe the situation. Then, look for patterns and inquire into these. What do you notice in general? What are the exceptions to these generalisations? The contradictions? The surprises? What are you curious about? Then, using complexity concepts, look at what is keeping the patterns you have identified in place and the actionable insights that could enable change.

Sense-making in real time

My team at ARTD have also developed the 3 R Framework as a tool for using evaluative thinking under pressure. It is based around questions because, in our experience, being an evaluator is about asking effective questions at the right time, not about having all the answers. You can use the framework to direct responses at an organisational, team, program or individual level. If you’re applying it within your organisation, team or to a program, we suggest getting a diverse group of people together to reflect, drawing on existing data, stories and experiences to ensure you are not missing critical insights as you make decisions.

3 R Framework 

While being useful right now, we can also keep our eye on the long game: what data needs to be collected now to enable evaluation of pandemic responses?

Think through the implications of your choices

Among evaluators I have spoken to around Australia and overseas, there is a strong concern about the equity implications of changes. It is important that we recognise the differential impacts of the crisis, consider accessibility when adapting our methods, and ask whose voices are missed if we draw only on existing data.

We also need to be as purposeful in choosing our online methods as we are in planning methods generally. Not everything has to become a Zoom session. Asynchronous online methods (contributing at different times) have different benefits and drawbacks from synchronous online methods (contributing at the same time).

Remember: not everything is changing and some things should

One of the things I have found most useful in this time is my colleague Emily Verstege’s reminder (with reference to Kieran Flanagan and Dan Gregory’s Forever Skills) that, while many things are changing, including how we evaluate, what is at the core of evaluation is not. We can take comfort in this, as well as in the potential to change things that need changing.

One of the benefits of taking our regular AES session online was the ability to engage with members in regional areas and other jurisdictions. It’s something the committee is already thinking about continuing when physical distancing rules are relaxed.

I have been most struck by the validity of that old maxim that necessity is the mother of invention. In many areas of work and life, more fundamental change has occurred in the last few weeks than in previous years, despite the relentless urging for innovation. Witness working-from-home arrangements, the expansion of telehealth services, and the online delivery of educational programs.

Hopefully, one of the legacies of this awful crisis is that some of these new practices become ingrained and that we become more comfortable challenging the status quo and traditional modes of operation. Returning to normal is neither feasible nor desirable. Evaluators have a large role to play in leading that campaign but we also need to challenge our existing practices. [Greg Masters, Nexus Management Consulting and NSW AES Committee member]

If you have ideas for further AES blogs, the AES Blog Working Group would be keen to hear them. Please complete the online form below.

 

-------------------------- 

Jade Maloney is a Partner and Managing Director of ARTD Consultants, which specialises in program evaluation.