Group: Can Evaluators Solve All The World’s Problems – and if so, how?
Alan Woodward

I have worked across all aspects of evaluation for over 30 years – most of my working life. I have conducted evaluations, designed program evaluations, used evaluation findings for service reforms and for meeting accountability requirements, commissioned evaluations and, in more recent years, provided strategic and expert advice to governments and organisations on evaluation strategies. I established the Lifeline Research Foundation, an internal unit that facilitates evaluation of Lifeline services for ongoing service improvement.
I have used evaluation findings for policy and program development. I have advocated for evaluation to be included in reform initiatives in suicide prevention. I have sat on national evaluation steering committees for the Department of Health.
I am a Fellow of the AES, a Past President of the AES and was involved in the development of the AES Code of Ethics.
• Program evaluation
• Developmental evaluation
• Participative evaluation
• Evaluation capacity building in organisations and service sectors
I hope they will retain and build their enthusiasm for evaluation as an important practice. I hope they will be encouraged as they tackle the challenges that they face in their work, within organisational and institutional settings, and with complicated evaluation topics.
I like to challenge people by raising issues and problems for discussion, and maybe putting different perspectives forward; I am inclusive and like to hear from everyone.
Group: Evaluation and Value for Money
Mentor: Julian King | Associate Mentor: Heidi Peterson

Julian King
For the past 20 years I have provided evaluation services through my company Julian King & Associates. I work across multiple sectors, with recent examples including projects in health, education, justice, international development, climate, trade and enterprise, public financial management, governance, agriculture, transport, energy, insurance, and female economic empowerment. My earlier experience included roles with KPMG Consulting in Melbourne, Health Canada, and the NZ Ministry of Health.
Through doctoral research, I developed the Value for Investment approach to evaluation and value for money, which combines the strengths of evaluation and economics. The approach has attracted interest from evaluators and international development teams around the world, and consequently my consulting practice focuses on teaching and mentoring in its use.
Heidi Peterson
I am currently a Senior Consultant with Clear Horizon, where I have led a number of evaluation projects across Australia in different sectors, including disability, health, education and gender equality. I have provided advice on evaluating Value for Money across a range of projects and programs, including in international development, climate change and place-based initiatives.
Prior to this, I led the internal evaluation activities of the United Kingdom government’s Global Challenges Research Fund and the Newton Fund (a £2.2 billion investment to address challenges in developing countries). As part of this, I worked to adapt and implement the Value for Investment approach for both funds. I am also currently undertaking a part-time PhD focusing on innovative and participatory approaches to Value for Money.
Evaluation and value for money: the ‘Value for Investment’ approach
We hope mentees will develop:
– Confidence that as evaluators, we have the tools and techniques to provide clear answers to value for money questions, and
– Skills to navigate this space, including the expectations and assumptions of stakeholders, and the interface between evaluative and economic thinking about VFM
Specific topics will be identified with mentees – but the sorts of things we might cover include:
– Knowledge of:
o Definitions of value for money (VFM)
o Evaluative questions about VFM
o The role of explicit evaluative reasoning and mixed methods to evaluate VFM
o The role of economic methods of evaluation – including the insights we can gain from using these methods, and why they are often not enough on their own to provide a comprehensive evaluation of VFM
– Skills in:
o How to design an evaluation framework including a VFM-oriented theory of change and rubrics
o Identifying fit-for-purpose evidence sources and methods (including using economic methods evaluatively) and synthesising mixed methods evidence to make evaluative judgements
o Reporting on VFM
o Collaborative development and implementation of an appropriate VFM approach with stakeholders
o Being able to explain and justify the pillars of the VFI approach – evaluative reasoning, mixed methods, why CBA alone is not enough, and why stakeholder participation matters (a simple sketch of evidence synthesis follows this list).
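To make the synthesis idea above concrete, here is a minimal, hypothetical sketch of rubric-based synthesis of mixed evidence, written in Python. The criteria, ratings and 'weakest-link' rule are invented for illustration only – they are not the VFI method itself, and in practice the rubric and synthesis logic would be developed with stakeholders.

```python
# Hypothetical sketch only: the criteria, ratings and synthesis rule
# below are invented for illustration, not the actual VFI methodology.

# Shared ordinal rating scale used by the rubric, worst to best.
SCALE = ["poor", "adequate", "good", "excellent"]

# Mixed evidence, already judged against agreed rubric standards.
# Note that the economic result (a benefit-cost ratio) informs one
# criterion; it does not decide VFM on its own.
judgements = {
    "economy":       "good",       # e.g. procurement and cost data
    "efficiency":    "adequate",   # e.g. BCR of 1.3 plus delivery data
    "effectiveness": "excellent",  # e.g. outcome measures, interviews
    "equity":        "good",       # e.g. participant feedback by group
}

def synthesise(ratings: dict[str, str]) -> str:
    """Illustrative 'weakest-link' rule: overall VFM is capped by the
    lowest-rated criterion. A real evaluation would agree the synthesis
    logic with stakeholders rather than hard-code it."""
    return min(ratings.values(), key=SCALE.index)

print(f"Overall VFM (weakest-link rule): {synthesise(judgements)}")
# -> Overall VFM (weakest-link rule): adequate
```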
We will provide a flexible approach to mentoring that responds to our mentees' needs and questions. We aim to foster a collaborative learning space where mentees can seek support from the mentors and each other. We will support mentees to bring real-life challenges, opportunities and questions so that the learning experience can be as practical as possible. Mentees can expect that we will share relevant tips and tricks that we have learned from our own experience, while also encouraging other mentees to share their insights. We invite mentees to come prepared with questions and ideas to lead their own learning.
Group: Managing evaluation in complexity: linking theory and practice in our choices
Co-Mentors: Judy Oakden and Anne Stephens

Judy Oakden
For the last 15 years I have conducted and provided strategic advice on evaluation and research projects and offered training and mentoring to build evaluation capacity. For 30 years I have consulted with central and local government agencies, NGOs and businesses, both in Aotearoa New Zealand and internationally (I was previously a market researcher).
I am a member of the Kinnect Group and owner of Pragmatica Limited.
I am active in the Australasian Evaluation Society (AES), the Aotearoa New Zealand Evaluation Association (ANZEA) and the American Evaluation Association (AEA).
Anne Stephens
I’ve been an evaluator since 2012 and, prior to that, a social researcher – academic, educator, teacher, writer, permaculture designer and gardener, public servant and social activist (at the same time – awkward), administrator and business manager.
– Managing evaluation in complexity
– The art and practice of systemic evaluation, including the challenges of managing a consultancy that tries to use this approach
The field of complexity offers important insights for evaluation. But it can be hard to make sense of the different worldviews from the complexity literature and learn how to use complexity-informed approaches to create knowledge. Often, it takes effort to cut through the confusing array of definitions and interpretations.
But if we persevere and adopt some of the complexity-informed worldviews and stances about knowledge creation, it changes how we think about and undertake our evaluation practice.
A 'straw' list of possible topics for these mentoring sessions (to provide some possible structure for the group) includes:
● Talking past one another: Why is understanding different worldviews and how we create knowledge important?
● Whose perspectives count: Who should we include, and how do we combat deficit thinking?
● Identifying some of the fundamental concepts: path dependence, emergence, self-organisation and feedback – what are they, and what do they look like in practice? How might they inform our evaluation practice?
● Reflection and reflexivity: Why is it important? How can we use it more?
Mentees can expect to develop a firmer grasp of some complexity-informed approaches. They will also have a chance to consider the implications for their work evaluating strategies, policies, programmes or projects.
We offer mentees the opportunity to come together to explore an interplay of ideas with peers. We will all learn together by contributing at each session, working collaboratively and reflecting on our practice. As the mentors, we aim to encourage generative conversations where mentees can share their experiences and questions. We will be flexible and responsive to the group's needs.
Judy:
I hope that mentees become more confident in understanding the consequences of the choices they make in their evaluation practice when either taking or not taking a complexity-informed approach.
Anne:
I like to go in with a bit of an agenda, but it is 'straw' – open to negotiation, flexible, and able to be changed by group consensus.
Group: ECB United
Mentor: Duncan Rintoul | Associate Mentor: Liam Downing

Duncan Rintoul
I started my professional career at Wesley Mission in 1999 before a ten-year stint in Urbis’s Public Policy team (2000-2010) where I undertook research and evaluation consultancies for a broad range of federal and state government clients. I moved to the University of Wollongong in 2011, where I managed the consultancy arm of the Institute for Innovation in Business and Social Research and embarked on my PhD in web survey design. I started Rooftop Social in 2013, a company that I run to this day. We provide services in four areas: social research, evaluation, facilitation and capacity building.
Over the last few years, my evaluation work has generally fallen into four main quadrants:
1. Education – I held a four-year part-time role in the NSW Department of Education (2016-19), at the Centre for Education Statistics and Evaluation. I continue to do lots of work for this department, as well as a few others in the education sector.
2. Health – particularly around innovation in models of care, including work for clients such as the NSW Agency for Clinical Innovation and eHealth NSW.
3. Environment and infrastructure – particularly working alongside Vanessa Hood (Associate Director at Rooftop Social) who was the evaluation lead at Sustainability Victoria before she teamed up with me in 2015.
4. Social justice and community services – this includes work for NGOs like Anglicare, as well as government agencies like Legal Aid NSW and the NSW Department of Communities and Justice.
I have a particular interest in Evaluation Capacity Building (ECB). I have taught introductory and intermediate short courses in evaluation since 2009 for the AES and the Research Society, as well as for clients such as the Australian Government Department of Industry and Science, the Australian Communications and Media Authority, the Victorian Department of Education and Training, QLD Department of Natural Resources, Mines and Energy, Transport for NSW, the NSW Department of Family and Community Services and Financial Literacy Australia. My role at the NSW Department of Education (2016-19) was mostly focused on ECB; videos and other resources that I helped create can be found on the Evaluation Resource Hub on the DoE website. For several years, Vanessa Hood and I have also run a workshop on ‘practical ECB’ for the AES, as part of the professional development program it offers.
I served on the Board of the AES from 2012-16 and remain a regular contributor to AES initiatives and conferences. Within the Research Society, I established the Government and Social Research Network and have served on several conference committees. I am also a member of the University of Wollongong Human Research Ethics Committees (HRECs).
Liam Downing
I did a BA, half a teaching degree and honours in the rhetoric of gender theory at uni, so kind of a standard education for an evaluator, right?
I commenced in social research and evaluation in 2006 at Eureka Strategic Research (later Ipsos-Eureka), where I worked until 2011 (just after I moved from Canberra to Orange, where I’ve lived since, despite my current role being headquartered in Parramatta).
I then worked for seven years in higher education, building and operationalising evaluations and growing evaluative practice in the equity space at Charles Sturt University and across the sector.
I commenced full-time Evaluation Capacity Building (ECB) work in 2018 for a short stint before jumping back into evaluation design a year and a bit later. I now lead a team of 15 evaluators, ECBers (because evaluations need ECB) and data analysts in responsively evaluating innovation across 9 separate (and significant) initiatives, within utilisation-focused, developmental and realist paradigms of evaluation.
Duncan:
I’d like to focus the group on ECB, targeting people who are working in ECB roles at present. Ideally the group would include people who have a few years under their belt in this space and are wanting to level up their practice and influence.
Liam:
Definitely ECB, and maybe alongside that the idea of integrating evaluation into a PMO and a profession for fun and profit. While that second part is a bit flippant (and I should have said ‘for real impact’), evaluation is very much a key element in the multidisciplinary team that delivers work where I am, and it’s likely to be the case elsewhere. Also, potentially working on ways to influence others to develop positive evaluation mindsets more broadly, and on working with people who may be into slightly reductive metrics to take into account the more sophisticated insight that utilisation-focused, developmental and realist paradigms of evaluation can offer in actual real-life settings!
Duncan:
My main hope is that they will learn and implement useful strategies for building evaluation capacity within their sphere of influence. I’d also hope that their sphere of influence grows, along with their confidence and connection with others in the same boat. As part of this I’d hope they hook into the literature, so they can stand on the shoulders of giants. And get good at being able to demonstrate impact from ECB efforts. And get really good at giving and receiving feedback on each others’ ECB challenges and approaches – as iron sharpens iron.
Liam:
I hope to provide a space for mentees to knock around and shape their ideas on building ECB into their work, whether they are quanties or qualies, EPs (Evaluation Professionals – who call evaluation their profession) or POEs (Practitioners of Evaluation – who don’t call themselves evaluators but are definitely actually evaluating). I also hope to provide a space to connect to the literature on theory and practice (which in many cases is really quotable and fun reading), and perhaps to build some additional connections! A key hope (and one I work really hard on) is that each of these spaces will be super safe and super productive (they go hand-in-hand in my experience).
Duncan
Me: Informal, energetic, optimistic, generous.
Them: Curious, eager, honest, ambitious, brave, open, willing to share, into group process
Liam
Me: Sharing, fun, love a good and meaningful GIF/visual, focused on workshopping solutions
Them: See the potential happy bits of evaluation, willing to try new things, see the value in cracking open their practice in a safe space with others
Group: Public Policy Evaluators – PPE!
Mentor: Rick Cummings | Associate Mentor: Dorothy Lucks

Rick Cummings
I have been conducting evaluation studies for the past 40 years, mainly in Western Australia but also nationally and for the World Bank in Papua New Guinea. I gained my PhD in evaluation from Murdoch University and held an academic position there from 1995. I have taught postgraduate policy research and evaluation units at Murdoch and the University of Western Australia, and have supervised a number of Masters and PhD students. I retired from Murdoch in 2015 and now hold an Emeritus Professor position.
In the AES, I have been a member of a number of committees, was Vice-President from 2002 to 2004 and President from 2005 to 2007. I was made a Fellow in 2013 and currently chair the Fellows Committee. I have been conducting AES professional development workshops for over 20 years on topics such as policy evaluation, program logic, monitoring and evaluation, performance management and evaluative thinking.
I was a mentor in the pilot program and am pleased to see that my mentoring group is still meeting informally.
Dorothy Lucks
I have over 25 years of experience in strategic sustainable development and in monitoring, evaluation and learning. I am a credentialled international evaluator with the Canadian Evaluation Society. I represented the Australasian Evaluation Society on the International Organization for Cooperation in Evaluation and on the Global EvalPartners Executive Committee (2014-2018), and from 2015 to 2020 was Co-Chair of the EVALSDGs Network, which connects policy makers, institutions and practitioners who advocate for evaluation for the 2030 Agenda and the SDGs, and who support processes to integrate evaluation and real-time continuous improvement into national and global systems.
Rick:
My PhD dissertation was on enhancing utilisation of evaluation information and this remains one of my major areas of interest. I’m a firm believer that there is a lot we can do in the ways we plan and conduct evaluation studies that will increase their use by decision makers and other stakeholders. I am also interested in how we evaluate public policies, which are often complex and long-term strategies adopted by governments to address major social issues. Finally, I have a strong interest in how we plan and report evaluation studies, again to improve their quality and the likelihood the results will be used. More recently I have delivered workshops on topics such as program logic, monitoring and evaluation and evaluative thinking.
Dorothy:
My PhD focussed on multi-stakeholder relationships in sustainable development and evaluation, covering project, programme, institutional and thematic evaluations. I provide technical support and conduct evaluation for sustainable development programs in Australia and internationally across 32 countries, with a range of Australian government agencies, UN agencies, the World Bank, ADB and AfDB. This means that I have a very diverse experience in evaluation with a focus on systems thinking and complexity.
Rick:
I hope they will realise that they already have many of the skills to be a good evaluator but also that, where there are gaps, they can find good quality courses and workshops. I hope they gain the confidence to ask questions and recognise that no one has all the answers. Finally, I hope they will recognise that they can make a valuable contribution to improving public policy and, through this, to improving people’s lives.
Dorothy:
How to gather and apply a range of skills that will generate higher quality evaluations that are actively used by decision-makers.
Rick:
I like to set out a topic (decided by the mentees) to investigate and discuss, and to have pre-reading (usually my selection, but mentees will add readings as well) done prior to the session. At the session, I like some structure: we do some social catch-up, then discuss the topic of the day with mentees each contributing questions or comments, then have a mentee present a problem that they have previously outlined to the group in a written summary. We end with a discussion of any future activities (AES workshops, conferences, podcasts) mentees might be interested in, and set out the arrangements and topics for the next session. I also like to have guest speakers (AES Fellows, AES award winners, etc.) for at least one of the sessions.
Dorothy:
Very consultative and starting from where the mentee is currently situated. I like to use practical examples from my and their own work. I expect mentees to communicate openly and to actively work to get the most value possible from the sessions.