by Anthea Rutter
Colin has held a number of roles in the field of evaluation for over 37 years. He has managed evaluations across the Commonwealth Government and has worked as a private consultant as well as an educator in evaluation. He has worked with a dozen universities, including long associations with Flinders University and the University of South Australia, where he is currently an online course facilitator as well as a teacher of MBA courses.
Colin has never seen evaluation as his "bread and butter" yet has had an ongoing relationship with it over many decades.
Colin was a contemporary of Anona Armstrong (the founder of the Australian Evaluation Society) and assisted in running the 1984 and 1986 evaluation conferences (long before there was an AES!). A significant motivation for Colin has been the idea that evaluation concepts, together with sociometric and psychometric data collection tools, can form important foundations for evidence-based management. In 1982 he built an Apple personal computer, then wrote a regular column for the AES Magazine on the use of computers in evaluation. He set up outcomes measurement for the first computerised records management systems for the Commonwealth Rehabilitation Service, which were used for performance indicators in program management and evaluation.
Colin has considerable evaluation expertise across a myriad of areas, but I was keen to narrow that down to his main areas of interest.
From my discussions with the Fellows, it is clear that careers do not follow a set pattern, and Colin exemplified that! He moved from being a rehabilitation psychologist to a program manager, then into program management research and evaluation. He then developed an interest in goal attainment scaling: the idea that individuals participate in setting and evaluating their own goals, and that the outcomes can be measured. He has extended this instrument from its individual focus to the evaluation of programs. He has also used it in organisational change, as a guide to help directors establish strategic goal setting and self-evaluation of their governance roles, and to examine how managers and directors evaluate their own performance.
Challenges are part of developing a career, as well as of developing as a person. Colin described a number of professional challenges.
In the 80s Colin developed a consultancy with other professionals. However, he found it tough going to have to keep generating business, and so moved back to teaching, initially as research program director at RMIT's Graduate School of Business. His observation is that the skills in research, psychometrics, data collection and analysis "so often saved me". Another observation was that "you need evaluation in the recipe whether you are doing program management, organisational governance, performance audits or other evidence-based processes. It is essential to be able to 'talk truth to power'." He added a comment a lot of us could identify with: "Evaluation is always trying to morph into a tool to suit whatever the managers need."
A career as long and successful as Colin's would surely have its highlights, and he described several.
Colin was President of the AES in 1994 and stepped down after a year to run the AES 1997 Conference in South Australia. A highlight at that time was that he was able to represent the AES at the first international evaluation conference jointly run by the American Evaluation Association and the Canadian Evaluation Society in 1995. A personal highlight for Colin was that he was able to introduce a number of master's and PhD students to evaluation.
Another highlight came when he was running a large government tender for the graduate certificate in public sector management and was able to provide oversight of all the assessment and capstone projects (with a bias towards evaluation), work which gave him great satisfaction.
All of us stand upon the shoulders of giants – and Colin cited a number of people who had influenced him throughout his career.
Colin mentioned two of our AES Fellows, Jerry Winston and Anona Armstrong, who mentored him. Others were Elaine Martin and Tom Kiresuk, the author of Goal Attainment Scaling. Michael Quinn Patton and his utilisation-focused evaluation were also influences, both in program management and governance. Colin adapted the Project Management Body of Knowledge to his theoretical framework, an organisational evaluation capability maturity model, which enabled him to help organisations map how evaluation fitted into their structure and culture.
Colin's observations of the changes to the field of evaluation over the years were particularly insightful.
Colin talked about the move to a stakeholder approach in evaluation, illustrated by the emergent themes of utilisation-focused evaluation (Patton) and empowerment evaluation (David Fetterman), which brought stakeholders to the forefront. He drew a parallel with education, where established curriculum- and practice-based approaches contrast with 'flipped-classroom' approaches that are student-centred and self-managed. Empowerment of consumers in the evaluation of their services was also advanced by Yolande Wadsworth in her manual and model of a bottom-up, consumer-driven approach. These have been crucial changes, which have tried to make evaluation owned by the stakeholders rather than driven by the funders or owners.
As evaluation trends change, new skills are required to cope with the changes to evaluation practice. I asked Colin what he saw as the main skills and competencies evaluators now need.
Colin acknowledged the importance of data collection and analysis, but feels it is crucial to understand governance and strategy, as these provide the context for the evaluation and potential paths to implementing change. He has translated Patton's utilisation-focused approach, with its attention to power and values, to the context of an organisation's Board of Directors and their requirements for evaluation. The subsequent evaluation report then becomes a historical resource that can serve as corporate memory.
I asked Colin what he saw as the main social issues or problems that evaluators ought to be thinking about and seeking to resolve in the next decade.
The first is Indigenous and cultural issues, one of the priority areas to emerge from the AES strategic plan. The second is a problem within human services: how do we enable evaluation to give a voice to the clients and consumers of human services, so they can talk to the politicians? How do we "talk truth to power", i.e. surface the values around stakeholder impact? The lack of an evidence-based approach to decision-making in Parliament, in particular regarding the impact of human services on stakeholders, is a huge gap which has to be addressed by evaluators and their employers.
Colin has been widely involved with the AES: in the early evaluation conferences of the 80s and as convenor of AES conferences; as co-editor of the Evaluation Journal of Australasia; and as Chair of the Committee on Ethics & Standards in Evaluation. He also set up and facilitated regional meetings in the South Australian Branch of the AES. In 1992, Colin developed an interim history of the AES and evaluation. He received the Outstanding Contribution to Evaluation (ET&S) Award in 1992, and then one of the first Fellows Awards in 2003. I felt that Colin would be in an ideal position to discuss the future of the AES.
Colin suggested we re-examine discussions on ways to sustain evaluation, such as credentialing evaluators and registering membership, for example as a certified practising evaluator. Though previously hesitant, he agreed that accreditation gives members some protection from litigation, although he saw a possible problem if accredited evaluators sought higher salaries. Evaluation is a small industry, and there is no legal requirement for accreditation like those imposed on practising psychologists, for example.
Another approach to sustaining the profession is to think of evaluation as an important adjunct to education, especially in the area of teaching managers in the corporate and business world about evaluation. The society could then be seen as an authority on standards and the basis for facilitating more evidence-based practice. He envisaged a school or an institute for education around the various approaches to evaluation which could provide basic entry-level qualifications. The society would have a role on the advisory committee for the courses, as well as in credentialing courses and evaluation degrees. We need to get closer to education and be able to codify a valid and widely recognised body of knowledge in evaluation, something like the Project Management Body of Knowledge or the Institute of Internal Auditors' Standards of Audit Practice.
--------------------------
Colin Sharp is currently an online course facilitator at University of South Australia, as well as a teacher of MBA courses.