By Legacy User A on Wednesday, 28 October 2020
Category: Leaders

Alan Woodward: Applying knowledge to improve service delivery and community outcomes

by Anthea Rutter

Alan has worked in evaluation for a number of years. He works in mental health and suicide prevention as a policy adviser, program developer and researcher/evaluator. He has held senior executive positions at Lifeline Australia, including the establishment of the Lifeline Research Foundation. He is a part-time Commissioner with the National Mental Health Commission.

Alan’s passion for the way in which evaluation can inform improvements for people and communities was evident throughout my interview with him, but I was still curious as to how he came to evaluation. 

I came into Evaluation at the right time! I was working in the NSW public sector as a Program Manager and it was suggested that I should look at program evaluation models. This was in the early 90s, the halcyon days when evaluation was regarded as part of public sector management reform. Some of that was the work done by Brian Lennie and Sue Funnell in the development of program evaluation and program logic models.

I then worked with natural resources agencies looking at how to utilise program evaluation to review and improve the performance of various programs. I joined the (then) Australasian Evaluation Society around that time, because of my interest in evaluation theory and methods, and its relevance to policy development.

As I have discovered, evaluators have a wide range of interests, so I was interested to find out Alan’s.

Over the past five years my main interest has been in specifying, designing and commissioning evaluation. Looking back on my career I feel that my trajectory has been similar to that of others: developing basic skills, then leading evaluation projects, and more recently managing, designing and establishing evaluations. Another important consideration is looking at how the evaluation is going to be used, before the project commences.

Challenges, I guess, are what shape us and our careers, so I was keen to find out from Alan what he saw as his major challenges.

I think there is a tension between the operational requirements of an organisation, which in most organisations take precedence, and the need to know how the actual service is working (or not). I really believe in the notion of creative tension operating between those who operate programs or deliver services and those who generate data or knowledge about what is delivered or achieved – and the way this tension brings out the best in both, as well as helping with continual improvement.

Closely aligned to challenges are the highlights of a career, and Alan's highlight was a very personal success story.

In 2011, I was instrumental in establishing the Lifeline Research Foundation as an internal but non-operational unit in the organisation, with the purpose of using research and evaluation to generate knowledge for improved suicide prevention. This was achieved with the support of a 12-member Advisory Group including some of the best experts in mental health and suicide prevention research in the country. The recent results of the Lifeline Crisis Text trial referred to how research and evaluation have informed the co-design of this new service – a real, practical way of showing the value of using knowledge to inform better service delivery.

I also found my time on the Board of the AES and as President for two years enormously rewarding. I was involved in the work of transitioning the society from Canberra to Melbourne, and the AES has provided me with professional development, networking and the support of many good friends.

All of us are influenced by people, processes and events during our career. It was satisfying to learn that other Fellows have played a part in the development of Alan’s practice.

I have been influenced by many evaluators – most of them other Fellows! The work of Sue Funnell and Brian Lennie in program evaluation and program logic has had a significant and lasting influence on me. Another influence was Chris Milne, with his way of looking at issues to create solutions with an intellectual robustness. I've also had some great teachers – people such as Rick Cummings, Ralph Straton and John Owen. I take great heed of the work of Patricia Rogers, whose emphasis on the quality of practice in evaluation has been a constant reminder of the importance of doing evaluation well. Another field of interest is the contribution made by realist evaluation. I particularly remember the AES conference in Perth in the 90s when Nick Tilley spoke about the realist approach, and its appropriateness for suicide prevention. Realist methods show a way through to deal with the complexities of suicide behaviour. Here, I also acknowledge Gill Westhorp and her contribution to my thinking and practice.

I asked Alan how he felt the field of evaluation had changed during his career, and his response was quick and concise.

New theories, new models and new people. Evaluation is more mature as a multi-disciplinary professional body of knowledge. The other thing is that there has been a stronger emphasis on utilisation in evaluation. This goes back to Michael Quinn Patton, who has provided us with such a great basis for best practice. The evaluation community is much better nowadays at engaging with consumers and stakeholders – the people who will use evaluation data and findings.

I asked Alan what he thought were the main skills or competencies that evaluators need to have or develop to keep pace with emerging trends in evaluation practice. Alan’s response echoed the utilisation ideas mentioned previously.

First and foremost, evaluators have to be good at working with stakeholders to create the design and methodology to match the purpose. I have been involved in commissioned evaluations where the commissioners have not been explicit about when and why they need data and findings, or have not assessed the level of maturity of a program to judge whether a summative evaluation is the best option. Evaluators need to be clear about what results are needed and ensure the evaluation design is right for that time, and that it serves the needs of those who will use the evaluation for decision making or planning.

A bit of blue-sky dreaming here, but Alan's response to what social issues and problems evaluators ought to be thinking about and seeking to resolve in the next decade was probably a hope rather than a reality!

World peace! For evaluators, one of the things they could turn their minds to is how knowledge is created, shared and applied in the modern world. One of the challenges is that different sources and pieces of knowledge are sometimes viewed as being of equal value. It is a real challenge to reinforce the importance of robust knowledge and data being analysed to provide real, quality information.

When I asked Alan about how the AES can still be relevant in the future, he pointed to the importance of developing mutually rewarding partnerships.

I think that one of the ways the AES can remain relevant in the future would be to have good working relationships with industry, which would enable clusters of people to engage with the evaluation community, instead of trying to find a single evaluator. I think that the ability of the AES to work in partnership with the Centre for Program Evaluation (at the University of Melbourne) is a good opportunity to offer people skills and development. I also feel that it is important for the AES to advocate with industry bodies and government to advance the quality of evaluation from the commissioning perspective.

Not many of us are able to design and develop a foundation to provide assistance and support in one of Australia’s biggest social crises – suicide and its prevention. This blog is a tribute to a continuing and satisfying career for one of our Fellows – Alan Woodward.
