by Charlie Tulloch
As we move into 2021 after an interrupted 2020, it is a good time to reflect on the place of evaluators in the working world. It is clear that many sectors and vocations have been forced to significantly upscale, downscale or adapt to changing economic and global circumstances.
Fortunately for us, evaluation retains a central role in the face of these challenges: growing complexity demands ongoing analysis of policy and program successes and failures. Indeed, evaluators now face an increasingly diverse set of choices when it comes to defining their career directions.
The Australian Evaluation Society's final Victorian seminar of 2020 explored this topic in depth, drawing on the wisdom and experiences of six fantastic evaluators of different ages, genders, study backgrounds and vocational sectors (academia, private, government, international development, philanthropy). This article reflects on the insights from this session.
by AES QLD Committee Members
Evaluators in the AES network are increasingly being challenged to apply evaluative thinking, methods and tools to innovative, emergent, place-based or otherwise complex initiatives. These initiatives often seek to achieve improvements not only in individuals and institutions, but in the systems that hold 'wicked' societal problems in place. The desired systems-level outcomes are often difficult to define, predict and measure and can change and evolve as the implementing organisations learn which strategies are most effective in reaching their goal.
In response, a recent issue of the AES QLD regional committee's newsletter focussed on resources, methods and mindsets to support members in evaluating complex systems change initiatives. Here are the take-outs.
by Eleanor Williams
COVID-19 has, for many, been a time of adaptation and creation of a new sense of normality. As we move away, gratefully, from local crisis management, we have the opportunity to reflect on not only our own resilience through this time, but what we have learned and how we have adapted through adversity.
Eleanor Williams from the Centre for Evaluation and Research Evidence, Victorian Department of Health and Human Services and the Australian Public Sector Evaluation Network shares her reflections on Evaluation Adaptation through COVID-19.
by Eunice Sotelo & Victoria Pilbeam
Many evaluators are familiar with realist evaluation, and have come across the realist question “what works for whom, in what circumstances and how?” The book Doing Realist Research (2018) offers a deep dive into key concepts, with insights and examples from specialists in the field.
We caught up with Brad Astbury from ARTD Consultants about his book chapter. Before diving in, we quickly toured his industrial chic coworking office on Melbourne’s Collins Street – brick walls, lounges and endless fresh coffee. As we sipped on our fruit water, he began his story with a language lesson.
by Rachel Aston, Ruth Aston, Timoci O’Connor
How often do we really use research to inform our evaluation practice? Many of us tend to use research and evidence to help us understand what we are evaluating, what outcomes we might expect to see and in what time frame, but we don’t often use research to inform how we do evaluation.
By Ruby Fischer
Evaluations are like diets – you know they’re good for you, you always start off with good intentions and desperate optimism, but eventually you slip back into your old habits. So how do you stick to them? Here are 5 tips from AES NSW’s latest seminar on how NGOs can stick with evaluation in our do-more-with-less world.
By Zazie Tolmer
Late last year an opportunity came up for a Clear Horizon consultant to work full time as an embedded evaluator in a Collective Impact initiative. I jumped at the opportunity and have been part of the backbone team for the last eight months.
By Gerard Atkinson
Have you ever felt like you have put in a lot of work on an evaluation, only to find that what you have delivered hasn’t had the reach or engagement you expected? I’m not sure I have met an evaluator who hasn’t felt this way at least once in their career.
It was because of this that late last month I led a session at the 2018 Australasian Evaluation Society conference in Launceston, titled “Evolving the evaluation deliverable”.
By Liz Smith
At the 2018 AES conference, Ignite presentations were introduced to light some fire in our evaluation belly. Ignite presentations are a set formula of five minutes and 20 slides with each slide advancing automatically after 15 seconds. Presenters have to concisely and quickly pitch their idea.
By Denika Blacklock
I have been working in development for 15 years and have specialised in M&E for the past 10 years. In all that time, I have never been asked to design an M&E framework for, or undertake an evaluation of, a project that did not focus entirely on a logframe. Understandably so: it is a practical tool for measuring results – particularly quantitative results – in development projects.