The smallest Russian Doll… a practitioner’s take on developmental evaluation
By Zazie Tolmer
Late last year an opportunity came up for a Clear Horizon consultant to work full time as an embedded evaluator in a Collective Impact initiative. I jumped at the chance and have been part of the backbone team for the last eight months.
Over that time, the way I approach my practice has changed considerably, and I finally feel I am getting a handle on what it means to be a Developmental Evaluator. I have learnt:
To go where the positive energy is – Rather than trying to situate evaluation through planning, I focus on the pockets of people where there is genuine, current interest in drawing on evaluative thinking, for example, wrapping evaluation around a small prototype or developing a theory of change. This provides a place to work from that is immediately useful and creates demand for evaluation. I initially tried to drive the scoping and development of an M&E framework and plan for the initiative, but I did not get very far, fast! Collective Impact initiatives operate deliberately in the complex, and the rationalisation required in more standard M&E planning goes against the grain.
To do Russian Doll evaluation – When you don’t have a plan but you want to start, it can help to begin with a small, discrete evaluation project, for example, an exploratory qualitative study looking at partners’ perceptions and experience of the impacts of the work. Once you have one piece completed, you can spring off it into other evaluative projects. It can be really hard to reconcile all the different evaluation needs and tensions on a Collective Impact initiative. I have found that if you produce one evaluative output, the rest of the backbone and partners: a) understand concretely what evaluation can do; and b) are better able to articulate what they might need next from evaluation. In my mind, this approach looks like a Russian doll set: you start with the smallest doll and keep wrapping more evaluation layers around it, until your evaluation covers the full initiative.
To listen, keep listening and never stop listening – I have learnt to leave the consultant in me at the door and to be guided by the group rather than taking the lead in my area of ‘expertise’. The group have a much better understanding than I do of what is needed next. My job is really to listen out for where there might be an opportunity for an evaluative piece and to translate this into something that can be delivered within the time and resourcing constraints. I’m also learning to leave ‘me’ out of it. For example, I have stopped thinking of the evaluation pieces as my work and am emphasising quality (honestly the best that can be done with the available resources and time) over rigour (bullet-proof evidence).
Listening also means anticipating. The evaluation work I do for the initiative includes pieces that we have identified together and others that are bubbling along in the background, ready to be shared when the timing is right. These pieces are more like hunches that sometimes work out and sometimes don’t. When they do, they create good momentum!
At this year’s AES conference in Launceston, the team and I will be presenting on the transformative power of Developmental Evaluation.
Zazie is a principal consultant at Clear Horizon.