by Anthea Rutter
Anne and I have been colleagues and friends for many years. I have long been an admirer of her ability as a practical evaluator and I refer to Anne and Ian’s book frequently for my own practice. I caught up with Anne at the AES International Conference in Launceston, Tasmania, where we found time to share some lunch and some great conversation.
I am always intrigued by the many routes professionals take into the field of evaluation. Although I have known Anne for many years, I was unsure of how she came into the field.
When I was an academic in social work, we were starting to pick up contracts in evaluation. I liked project work, so I always put my hand up. I enjoyed the organising aspect, and the focus on adding new knowledge and improving practice rather than delivering services. Eventually, I began sub-contracting for some small evaluation companies before starting my own business.
As all of us are aware, we are influenced by a number of elements which eventually shape what and who we are. I asked Anne about the influences which have helped define her practice.
Being part of the evaluation community of practice has been important throughout my career. Being a lone evaluator would be tough without opportunities to engage with other evaluators through conferences and AES Network meetings. You need to interact with others to see different people’s take on things and to test your ideas. This is essential for informed practice. Being in a relationship with another evaluator also has its benefits for testing your ideas out.
Anne’s last comment made me reflect that I don’t know a partnership where both parties are evaluators. Over the course of a career, all of us have faced challenges, including AES Fellows. We all have an opportunity to learn from those experiences. I was keen to find out from Anne about the challenges she has faced during her career.
A major challenge in evaluation is managing the political aspect, negotiating the report and findings. People often challenge the findings, so you need all the skills you can muster in terms of negotiation to advocate for and justify your conclusions.
Also, some clients do not fully understand what an evaluation can and can’t do. Expectations that were not part of the original Terms of Reference and fall outside the evaluation’s scope often surface when the draft report is delivered.
After more than 20 years in a career, there are bound to be a few changes along the way. I was interested to find out from Anne what had changed and where the field of evaluation stands today.
When I was a newbie, I wasn’t sure that the field of evaluation was a good fit for me. At that time, over 20 years ago, the field of evaluation had a more quantitative and positivist focus, with a strong public sector performance and financial management leaning. I was not sure whether I fitted. But evaluation has evolved so much since then. There has been a huge paradigm shift from the quant/qual debates to the evolution of a range of evaluation-specific methods: Realist, Most Significant Change, participatory, developmental etc. Evaluators are also more diverse in their professional backgrounds and methodological leanings. The field is so much richer. It will be interesting to see what happens in the next 10 years!
Anne and I discussed the fact that, with all of the changes in the profession over the years, the skills and competencies evaluators need may well have shifted too. Anne was very specific in her answer, and her ideas covered the whole gamut of an evaluation.
Evaluators need foundation skills, including formulating theories of change and evaluation questions, identifying mixed methods data sources and matching data to questions, data collection, analysis and reporting. Evaluators also need foundation skills in how to build organisational systems for both monitoring and evaluation functions. More and more evaluators are being asked to build capacity within organisations for the above competencies and this may be a new skillset for evaluators.
Evaluators also need facilitation skills and conflict resolution skills. And everyone needs an understanding of ethics – you need to know when ethical standards are being upheld and when they are being compromised.
What do you wish you had known before starting out as an evaluator?
I wish I had known how to better predict and manage my workload as an evaluation consultant… particularly to enjoy the lean periods and just relax into them. The peaks and troughs always evened out over the long term but in retrospect I feel that the troughs were not well used to relax and recuperate from the demanding peaks.
The final question to Anne was a bit of crystal ball gazing. I asked what she saw in the AES’s future. Again, in true Anne fashion, she was very clear about what the AES should be doing and where it should focus.
I think we should attempt to develop a closer role with government bodies. There are a number of opportunities for building a stronger link between the AES and both levels of government. An AES sub-committee once undertook an exercise mapping government bodies across the states and territories and nationally. Though a big job, it points to an opportunity for the AES to provide an avenue for evaluation capacity building in government.
In terms of training, the AES training program could also cater better to advanced evaluators by identifying specialist areas that could be developed and delivered by experienced trainers in those areas. The AES also needs to make sure its partnerships are robust and, not least, consult its members regularly.
Through her company, Anne Markiewicz and Associates, Anne assists organisations to establish Monitoring and Evaluation (M&E) systems and regularly conducts workshops on developing M&E frameworks for AES members. In 2016, she co-authored Developing Monitoring and Evaluation Frameworks with Ian Patrick.