
Improving Evaluation

ACT


Date and time: Wednesday 8 July 2020, 7.30 - 8.45pm AEST. Please see below for start times in your region

Facilitators: Frances Byers

Venue: Via Zoom. Details will be emailed to registrants prior to the session start time

Register online by: noon on Tuesday 7 July. Places limited to 15 participants

The AES invites you to join a facilitated online networking session to catch up with your colleagues in the evaluation field. This is an opportunity to meet new people from across the AES membership base as well as the wider evaluation community and to discuss several key issues facing evaluation and evaluators, such as:

  1. Adjusting to new ways of working
  2. Challenges overcome and lessons learned
  3. Reflective networking

Facilitators will be from the AES regional committees and the free sessions will be hosted via Zoom. AES members and the wider evaluation community are welcome to join our new series of engagement sessions.


Date and time: Wednesday 24th June 2020, 7.30 - 8.45pm AEST. Please see below for start times in your region

Facilitators: Frances Byers

Venue: Via Zoom. Details will be emailed to registrants prior to the session start time

Register online by: noon on Tuesday 23 June. Places limited to 15 participants



Date and time: Wednesday 1st July 2020, 5.30 - 6.45pm AEST. Please see below for start times in your region

Facilitators: Frances Byers

Venue: Via Zoom. Details will be emailed to registrants prior to the session start time

Register online by: noon on Tuesday 30 June. Places limited to 15 participants



Date and time: Wednesday 17th June 2020, 5.30 - 6.45pm AEST. Please see below for start times in your region

Facilitators: Frances Byers

Venue: Via Zoom. Details will be emailed to registrants prior to the session start time

Register online by: noon on Tuesday 16 June. Places limited to 15 participants


Date and time: Wednesday 27th May 2020, 5.30 - 6.45pm AEST. Please see below for start times in your region
Facilitators: Frances Byers and Charlie Tulloch
Venue: Via Zoom. Details will be emailed to registrants prior to the session start time
Register online by: noon on Tuesday 26 May. Places limited to 15 participants


Session overview

In an unhurried conversation, there is time to think differently and connect with people in a refreshing way. Unhurried isn't always slow, but it has a pace where people find it easy to join in and not feel crowded out. And listening can be as satisfying as talking.

Unhurried conversations use a simple format to create good, human interaction.

In an unhurried conversation there is no agenda, but one key norm about how to take turns to speak. We lift up an everyday object (like a cup or pair of glasses), and whoever lifts it up gets to talk. Everyone else listens, which means the speaker won't get interrupted. When the speaker has finished, they put the object down, and someone else then takes a turn. Sometimes there are long pauses between speakers, sometimes not.

When you've finished speaking you are giving up control of what happens next. When the next person picks up the talking object they might follow on from what's been said or bring something new to the conversation. They can also hold the object without speaking, keeping silence until they're ready to begin.

The conversations often move between light topics and more personal and profound ones. And in the end, we often find that all these are connected. 

Unhurried conversations are a facilitation approach created by Johnnie Moore and Antony Quinn, and now being offered weekly online by unhurried enthusiasts in different parts of the world. For more information about Unhurried, and to participate in other Unhurried conversations going on across the world see: https://www.unhurried.org/conversations 

About the facilitators

Frances Byers and Charlie Tulloch will host the AES unhurried conversations. Frances is from the Canberra AES Regional Network Committee and Charlie is from the Victoria committee. 

Session start times
  • Victoria, NSW, ACT, QLD, TAS: 5.30pm
  • SA, NT: 5.00pm
  • Perth: 3.30pm
  • New Zealand: 7.30pm

 

Date and time: Wednesday 3rd June 2020, 5.30 - 6.45pm AEST. Please see below for start times in your region
Facilitators: Frances Byers and Charlie Tulloch
Venue: Via Zoom. Details will be emailed to registrants prior to the session start time
Register online by: noon on Tuesday 2 June. Places limited to 15 participants


Date and time: Wednesday 8th April 2020. 5.30pm - 7.00pm
Venue: Hyatt Hotel Canberra, 120 Commonwealth Avenue, Canberra ACT 2600
Register online by: 6 April 2020

Date and time: Tuesday 3rd March 2020. 6.00pm - 8.00pm
Discussion facilitated by: Kim Grey (Charles Darwin University), Tanja Porter (ACIL Allen) and Samantha Mayes (KPMG)
Venue: Polish White Eagle Club, 38 David Street, Turner ACT 2612
Register online by: 25 February 2020

Workshop title: Designing and Implementing a Monitoring and Evaluation System
Dates and time: POSTPONED

  • Monday 30th March: Introduction to Designing and Implementing a Participatory Monitoring and Evaluation System; 
  • Tuesday 31st March: Planning for Monitoring and Evaluation functions; 
  • Wednesday 1st April: Developing System Capabilities for Data Collection and Analysis; 
  • Thursday 2nd April: Developing System Capabilities for Reflection and Reporting for Learning and Program Improvement

People can choose to participate in the full program, or in part of it, depending on their experience and needs.

Location: Canberra
Facilitators: Anne Markiewicz and Ian Patrick
Registrations close:  TBA
Fees (GST inclusive): For all 4 workshops: Members $1,750, Non-members $2,450. For day workshops: Members $484, Non-members $665 (per day)

About the 4-day Intensive Workshop

Designing and implementing a Monitoring and Evaluation System for a program or an organisation is becoming an increasingly important task required to support the assessment of agreed results and to aid organisational learning for program improvement. A robust and well-considered Monitoring and Evaluation System is also required to determine the scope and parameters of routine monitoring and periodic evaluation processes; to identify how monitoring and evaluation data will be collected and analysed; and to determine how data will be used to inform learning and reporting for accountability, program improvement and decision-making. The Public Governance, Performance and Accountability (PGPA) Act (2013) and the Department of Finance Resource Management Guide No.131 ‘Developing Good Performance Information’ (April 2015) affirm the importance of planning to identify program intentionality and to outline how a program’s performance will be measured and assessed.

Date and time: Thursday 12th March 2020, 9am to 5pm (registration from 8.30am) (and Sydney: Friday 13th March 2020)
Location: Lyneham and Dickson Rooms at Mantra MacArthur Hotel, 219 Northbourne Avenue, Turner ACT 2612
Facilitator: Brad Astbury
Register online by: 5 March 2020
Fees (GST inclusive): Members $484, Non-members $665, Student member $260, Student non-member $350

Workshop Overview

This workshop provides an overview of the origins and evolution of evaluation theory. Attention to theory in evaluation has focused predominantly on program theory and few evaluation practitioners have received formal training in evaluation theory. This workshop seeks to remedy this by introducing a framework for conceptualising different theories of evaluation and a set of criteria to support critical thinking about the practice-theory relationship in evaluation.

Workshop Content

Participants will learn about:

  • the nature and role of evaluation theory
  • major theorists and their contributions
  • approaches to classifying evaluation theories
  • key ways in which evaluation theorists differ and what this means for practice
  • dangers involved in relying too heavily on any one particular theory, and
  • techniques for selecting and combining theories based on situational analysis.

Case examples will be used to illustrate why evaluation theory matters and how different theoretical perspectives can inform, shape and guide the design and conduct of evaluations in different practice settings.

PL competencies

This workshop aligns with competencies in the AES Evaluator’s Professional Learning Competency Framework. The identified domains are:

  • Domain 1 – Evaluative attitude and professional practice
  • Domain 2 – Evaluation theory
  • Domain 4 – Research methods and systematic inquiry

Who should attend?

The workshop is designed for both new and experienced evaluators and commissioners of evaluation.

About the facilitator

Brad Astbury is a Director at ARTD Consulting and works out of the Melbourne office.  He has over 18 years of experience in evaluation and applied social research and considerable expertise in combining diverse forms of evidence to improve both the quality and utility of evaluation. He has managed and conducted needs assessments, process and impact studies and theory-driven evaluations across a wide range of policy areas for industry, government, community and not-for-profit clients. Prior to joining ARTD, Brad worked for over a decade at the University of Melbourne where he taught and mentored postgraduate evaluation students.


 

Date and time: Thursday 20th February 2020, 9am to 5pm (registration from 8.30am)
Title: Evaluation and Value for Money: An approach using rubrics and mixed methods 
Location: Keelty room at Novotel Canberra, 65 Northbourne Avenue, Canberra ACT  2600
Facilitators: Julian King
Register online by: 13 February 2020
Fees (GST inclusive): Members $484, Non-members $665, Student member $260, Student non-member $350

Workshop Overview

It is important for good resource allocation, accountability, learning and improvement that policies, programs and other initiatives undergo rigorous evaluations of value for money (VfM). Many evaluators, however, lack training and experience in this area. This workshop offers evaluators a set of techniques to address that gap. These techniques build on approaches that will already be familiar to many evaluators, though for other evaluators and commissioners they will be new. Drawing on a decade of the Kinnect Group's experience using rubrics for evaluation, together with Julian King's doctoral research on evaluation and value for money, this workshop offers an approach to VfM that is rigorous, built on sound theory, and practical to use.

Date and time: Tuesday 19th November 2019, 9am to 5pm (registration from 8.30am)
Location: Four Seasons A room at Pavilion on Northbourne, 242 Northbourne Avenue, Dickson ACT 2602
Facilitators: Vanessa Hood and Natalie Moxham
Register online by: extended to Wednesday 13th November 2019
Fees (GST inclusive): Members $484, Non-members $665, Student member $260, Student non-member $350

Purpose of Workshop

Stakeholders are more likely to feel ownership of an evaluation and adopt its recommendations if they are engaged throughout the process. Using participatory approaches and having strong facilitation skills is therefore vitally important for evaluators. This is becoming increasingly apparent as projects become more complex and budgets shrink.

Date and time: Tuesday 12th November 2019. 6.00pm - 8.00pm
Discussion facilitated by: Tanja Porter (ACIL Allen) and Kim Grey (Charles Darwin University)
Venue: Polish White Eagle Club, 38 David Street, Turner ACT 2612
Register online by: Tuesday 5th November 2019

Date and time: Wednesday 4th December 2019. 5.30pm - 7.30pm
Venue: Hyatt Hotel Canberra, 120 Commonwealth Ave, Canberra ACT 2600
Register online by: 2 December 2019

Date and time: Wednesday 10th July 2019. 12.00 - 2.00pm
Topic: Managing conflict in evaluations.
Presenter: Ruth Pitt
Venue: Walpiri Learning Lab, Ground Floor, Charles Perkins House (PM&C), 16 Bowes St Woden ACT 2606
Register online by: Monday 8 July 2019

Date and time: Tuesday 27th August 2019. 6.00pm - 8.00pm
Discussion facilitated by: Tanja Porter (ACIL Allen)
Venue: Polish White Eagle Club, 38 David Street, Turner ACT 2612
Register online by: Tuesday 20th August 2019

Date and time: Wednesday 17th July 2019. 5.30 - 7.00pm
Venue: Griffin Room, Hyatt Hotel, 120 Commonwealth Avenue, Canberra
Register online by: Monday 15th July 2019

The AES Canberra Regional Committee invites all AES members in Canberra to join us on 17 July 2019 for a cosy networking event.

Let's brave Canberra's winter chill to meet in the Hyatt's cosy Griffin Room (next to the Speaker's Bar) for a fireside talk and chat.

Our special guests will be AES CEO Bill Wallace (visiting from Melbourne), AES President John Stoney, and Board members Sharon Clarke (visiting from Adelaide) and Susan Garner.

The cosy networking event will give AES members the opportunity to meet each other, to learn more about the AES, and to share their views about how the local committee could support their learning and development needs.

Nearly 100 AES members live in Canberra and region: we would love to catch up with you, and any other AES members visiting Canberra on 17 July.

Note: The event is not catered but the Hyatt offers a wide variety of food and drink choices.


 

Date and time: Monday 29th July 2019, 9am to 5pm (registration from 8.30am)
Location: Forest room at Pavilion on Northbourne, 242 Northbourne Avenue, Dickson ACT 2602
Facilitators: Duncan Rintoul & Vanessa Hood
Register online by: 23 July 2019
Fees (GST inclusive): Members $440, Non-members $605, Student member $220, Student non-member $302.50

Purpose of Workshop

This one-day course has been custom designed for people who want to commission better evaluations. It is for people with a role in planning, commissioning and managing evaluations. It is suitable for beginners through to those with a few years' experience. 

The training is interactive and hands-on, with lots of practical examples and group activities through the day to keep the blood pumping and the brain ticking. It will provide you with tools that you can start using immediately.

Date and time: Tuesday 21st May 2019. 12.00 - 2.00pm
Topic: Improving performance measurement
Presenter: Graham Smith, Numerical Advantage
Venue: ACIL Allen Consulting, Level 6, 54 Marcus Clarke Street Canberra ACT 2601
Register online by: Friday 17th May 2019

Date and time: Monday 18th March 2019, 9am to 5pm (registration from 8.30am)
Location: Four Seasons B room at Pavilion on Northbourne, 242 Northbourne Avenue, Dickson ACT 2602
Facilitator: Dr Mark Griffin
Register online by: 12 March 2019
Fees (GST inclusive): Members $440, Non-members $605, Student member $220, Student non-member $302.50

Purpose of Workshop

A robust evaluation makes use of both qualitative and quantitative research methods. At the same time, many people commissioning or conducting evaluations have little training in quantitative methods such as survey design and statistics; indeed, some colleagues may even feel anxious thinking about them. This workshop is not intended to turn evaluation practitioners into hard-core data scientists. Instead, its goal is to give practitioners the tools they need to work productively and in close collaboration with data scientists, and to give commissioners the tools they need to scope out projects involving statistical components, to assess the value of subsequent bids from potential statistical consultants, and to maximise the potential for statistical work to lead to true insights and business value within the commissioner's organisation.

Date and time: Wednesday 13th February 2019. 12.00 - 1.30pm
Topic: Developing defensible criteria use in public sector evaluation
Presenter: Mathea Roorda, Allen and Clarke
Venue: CentraPlaza, Wiradjuri Studio, Ground Floor, 16 Bowes Place, Phillip ACT 2616
Register online by: Monday 11 February 2019. Please note that spaces are limited to 20 people.

This is a free event organised by the Canberra Regional Network Committee of the AES. Our seminar series provides an opportunity for you to meet with AES members and others in Canberra and to share and learn from the experiences of fellow evaluators. Members are encouraged to bring along colleagues with an interest in the topic even if they are not yet members of the AES. Please pass this onto your colleagues and networks.

Date and time: Monday 25th February 2019. 12.00 - 1.30pm
Topic: Realist axiology
Presenter: Dr Gill Westhorp, Charles Darwin University
Venue: CentraPlaza, Learning Lab 1&2, Ground Floor, 16 Bowes Place, Phillip ACT 2616
Register online by: Thursday 21 February 2019

This is a free event organised by the Canberra Regional Network Committee of the AES. Our seminar series provides an opportunity for you to meet with AES members and others in Canberra and to share and learn from the experiences of fellow evaluators. Members are encouraged to bring along colleagues with an interest in the topic even if they are not yet members of the AES. Please pass this onto your colleagues and networks.

Workshop title: Designing and Implementing a Monitoring and Evaluation System
Dates and time: 8th, 9th, 10th, 11th April 2019 (4 days) 9am to 5pm (registration from 8.30am) each day

  • Monday 8th April: Introduction to Designing and Implementing a Participatory Monitoring and Evaluation System; 
  • Tuesday 9th April: Planning for Monitoring and Evaluation functions; 
  • Wednesday 10th April: Developing System Capabilities for Data Collection and Analysis; 
  • Thursday 11th April: Developing System Capabilities for Reflection and Reporting for Learning and Program Improvement

People can choose to participate in the full program or part of the program dependent upon their experience and needs

Location: Keelty room at Novotel Canberra, 65 Northbourne Avenue, Canberra ACT 2600
Facilitators: Anne Markiewicz and Ian Patrick
Registrations close:  Friday 29th March 2019
Fees (GST inclusive): For all 4 workshops: Members $1,650, Non-members $2,270. For day workshops: Members $440, Non-members $605 (per day)

About the 4-day Intensive Workshop

Designing and implementing a Monitoring and Evaluation System for a program or an organisation is becoming an increasingly important task required to support the assessment of agreed results and to aid organisational learning for program improvement. A robust and well-considered Monitoring and Evaluation System is also required to determine the scope and parameters of routine monitoring and periodic evaluation processes; to identify how monitoring and evaluation data will be collected and analysed; and to determine how data will be used to inform learning and reporting for accountability, program improvement and decision-making. The Public Governance, Performance and Accountability (PGPA) Act (2013) and the Department of Finance Resource Management Guide No.131 ‘Developing Good Performance Information’ (April 2015) affirm the importance of planning to identify program intentionality and to outline how a program’s performance will be measured and assessed.

Date and time: Tuesday 26th February 2019, 9am to 5pm (registration from 8.30am)
Location: Forest room at Pavilion on Northbourne, 242 Northbourne Avenue, Dickson ACT 2602
Facilitator: Carina Calzoni
Register online by: 20 February 2019
Fees (GST inclusive): Members $440, Non-members $605, Student member $220, Student non-member $302.50

Purpose of Workshop

To support the introduction of the enhanced Commonwealth performance framework, the Department of Finance released a series of guides. Developing good performance information (Resource Management Guide No. 131. April 2015)* emphasises that qualitative and quantitative performance information is critical to telling a coherent performance story that demonstrates the extent to which a Commonwealth entity is meeting its purposes through the activities it undertakes.

A performance story report is essentially a short report about how a program contributed to outcomes. Although these reports vary in content and format, most are brief, mention program context and aims, relate to a plausible results chain, and are backed by empirical evidence (Dart and Mayne, 2005). The term 'performance story' was introduced by John Mayne in a paper published in 2004.

Performance story reports aim to strike a good balance between depth of information and brevity. They aim to be easy for staff and stakeholders to understand and help build a credible case about the contribution that a program has made towards outcomes or targets. They also provide a common language for discussing different programs and helping teams to focus on results.

This workshop will explore some different approaches to telling an accurate and meaningful performance story, and how they are developed. It will offer some steps for collecting and reporting performance information to tell a clear picture of performance and explore the role of program logic and evidence. It will be an interactive and engaging workshop involving case studies and group process.

*This guide is available on the Department of Finance website at https://www.finance.gov.au/resource-management/performance

Date and time: Monday 19th November 2018. 11.00am - 1.00pm
Topic: So, what’s next? Implications for performance, evaluation and reporting following the Report into the operation of the Independent Review into the PGPA Act and Rule
Facilitator: John Stoney and Kim Grey
Venue: Centraplaza - Learning Lab 1, Ground Floor, 16 Bowes Place, Phillip ACT 2616
Register online by: Thursday 15 November 2018

This event will be delivered in collaboration with the Australian Public Service (APS) Community of Practice event.

This is a free event organised by the Canberra Regional Network Committee of the AES. Our seminar series provides an opportunity for you to meet with AES members and others in Canberra and to share and learn from the experiences of fellow evaluators. Members are encouraged to bring along colleagues with an interest in the topic even if they are not yet members of the AES. Please pass this onto your colleagues and networks.

Date and time: Wednesday 14th November 2018, 9am to 4pm (registration from 8.30am)
Location: Forest room at Pavilion on Northbourne, 242 Northbourne Avenue, Dickson ACT 2602
Facilitators: Professor Alexander M Clark, PhD, and Bailey J Sousa, PMP - International Institute for Qualitative Methodology (IIQM), University of Alberta, Canada
Register online by: 8 November 2018
Fees (GST inclusive): Members $440, Non-members $605, Student member $220, Student non-member $302.50

About the Workshop

During this introductory, highly participatory course, registrants will develop a ‘hands on’ deep understanding of the what, why, and how of qualitative research methods. Each participant will be supported to develop their knowledge of the main qualitative research methods and appreciate how the components of rigorous design, analysis, and knowledge translation can create effective qualitative research.

Date and time: Monday 26th November 2018, 9am to 5pm (registration from 8.30am)
Location: Forest room at Pavilion on Northbourne, 242 Northbourne Avenue, Dickson ACT 2602
Facilitator: Charlie Tulloch
Register online by: 19 November 2018
Fees (GST inclusive): Members $440, Non-members $605, Student member $220, Student non-member $302.50

Purpose of Workshop

This workshop is aimed at those who are new or inexperienced in the evaluation field. Its purpose is to outline the key concepts, terms and approaches that are relevant to commissioning or conducting evaluation projects. The workshop will step through the set of activities that are most often involved in framing, conducting and reporting on evaluation findings. It will also introduce participants to the AES Evaluators’ Professional Learning Competency Framework, along with further sources for new evaluators to continue building their skills and knowledge in this field.

Date and time: Tuesday 31st July 2018. 12.00 - 1.30pm
Topic: A tested model for evaluation of capacity development
Facilitator: Fiona Kotvojs, Kurrajong Hill Pty Ltd
Venue: Meeting room GR1, DFAT, 255 London Circuit, Canberra 2601
Register online by: Thursday 26th July

This is a free event organised by the Canberra Regional Network Committee of the AES. Our seminar series provides an opportunity for you to meet with AES members and others in Canberra and to share and learn from the experiences of fellow evaluators. Members are encouraged to bring along colleagues with an interest in the topic even if they are not yet members of the AES. Please pass this onto your colleagues and networks.

Date and time: Tuesday 21st August 2018, 9am to 5pm (registration from 8.30am)
Location: Pods 2 room at Crowne Plaza Canberra, 1 Binara Street, Canberra ACT 2601
Facilitator: Jess Dart, Clear Horizon
Register online by: Noon on 15 August 2018
Fees (GST inclusive):  Members $440, Non-members $605, Student member $220, Student non-member $302.50

Date and time: Friday 27th July 2018, 9am to 5pm (registration from 8.30am)
Location: Grosvenor Room at Mercure Canberra, Cnr Ainslie & Limestone Ave, Braddon ACT 2612
Facilitator: Carolyn Page
Register online by: Friday 20 July 2018
Fees (GST inclusive):  Members $440, Non-members $605, Student member $220, Student non-member $302.50

About the workshop

A series of capability reviews have warned of declining policy and evaluation capability across many Australian Public Service agencies, particularly in relation to long-term or broad-ranging strategic policy issues, and public servants’ capacity to undertake the stewardship of their particular policy arena. This is often the indirect result of the structural and cultural separation within agencies of specialist expertise held by policy, program and service-delivery teams. This can be accompanied by a lack of opportunity for these groups to learn from each other and to find ‘safe’ forums to speculate about and contribute ideas to future policy—or to include evaluative thinking within their day-to-day mind-set.

Date and time: Tuesday 15th May 2018. 6.00 - 7.00pm
Topic: Conducting field visits
Presenter: Graham Smith
Venue: Smith’s Alternative Bookshop, 76 Alinga Street, City ACT
Register online by: Thursday 10th May

This is a free event organised by the Canberra Regional Network Committee of the AES. Our book club series provides an opportunity for you to meet with AES members and others in Canberra and to share and learn from the experiences of fellow evaluators. Members are encouraged to bring along colleagues with an interest in the topic even if they are not yet members of the AES. Please pass this onto your colleagues and networks.

There are some aspects of evaluation that seem so obvious. For example, the need to conduct field visits as part of an evaluation. We all do field visits, and probably don't think a great deal about the science in doing them. But like all obvious things, the process can be done well or it can detract from the final evaluation.

A recent issue of the journal New Directions for Evaluation turned its attention to this, with a special issue on Conducting and Using Evaluative Site Visits. The whole issue is of interest, but we will be focusing on the chapter written by Donna Podems, a South Africa-based evaluation consultant. This is an engagingly written piece that combines 'war stories' with some reflections on the standards for field visits.

Reference: Podems, Donna. Site Visits: Necessary Evil or Garden of Eden. New Directions for Evaluation 156, Winter 2017.

If you do not have access to this journal, please let us know when registering (in the Special Requirements section) and a pdf copy will be sent to you on the understanding that it is for personal research use only.

Presenter: Graham Smith is a long-term evaluator and performance auditor, and in his time has had many interesting site visits in both these fields. He is currently focusing on performance measurement, a field in which he is both a consultant and a PhD student.

Please note that no catering is provided so you will need to purchase your own food and drinks.


 

Date and time: Monday 19th February 2018, 9am to 5pm (registration from 8.30am)
Location: Keelty room at Novotel Canberra, 65 Northbourne Avenue, Canberra ACT 2600
Presenter: Duncan Rintoul & Vanessa Hood
Register online by: Monday 12 February 2018
Fees (GST inclusive): Members $440, Non-members $605, Student member $220, Student non-member $302.50

Purpose of Workshop

This one-day course has been custom designed for people who want to commission better evaluations.
It is for people with a role in planning, commissioning and managing evaluations. It is suitable for beginners through to those with a few years' experience. 
The training is interactive and hands-on, with lots of practical examples and group activities through the day to keep the blood pumping and the brain ticking. It will provide you with tools that you can start using immediately.

Date and time: Friday 30th June 2017, 9am to 5pm (registration from 8.30am)
Location: Meeting Room M22 at Dialogue Canberra, 4 National Circuit Barton ACT 2600 
Presenter: Dr. Krystin Martens
Register online by: Monday 26th June 2017
Fees (GST inclusive): Members $440, Non-members $605, Student member $220, Student non-member $302.50

Abstract
Evaluation rubrics clearly set out criteria and standards for assessing different types and levels of performance. Rubrics can be phrased in generic terms for use in a variety of settings, or in specific terms for use in a single unique context. As an evaluation-specific methodology, rubrics embody the very nature and logic of evaluation, and can assist program planners and evaluators to clarify what “good” looks like for particular programs, interventions, or policies. The Better Evaluation website contains more information on evaluation rubrics: http://www.betterevaluation.org/en/evaluation-options/rubrics

About the workshop
The workshop leads participants through rubric design and use, so they learn to develop and implement a rubric that synthesises evidence and values to reach sound and transparent evaluative conclusions. Participants will decompose a rubric to understand how the tool embodies the very nature and logic of evaluation, then explore the many ways its elements can be recombined to fit need and purpose. The discussion will also address how to facilitate stakeholder participation and buy-in, and how to know whether you have designed, developed and implemented a good rubric: one that assists evaluators in drawing evaluative conclusions and promotes understanding and evaluation use.

It is expected that by the end of the workshop participants will be able to:

  • use a rubric to draw evaluative conclusions
  • analyse the various elements that comprise a rubric in order to build a rubric that is credible and fit to purpose
  • gain the basic skills needed to facilitate stakeholder buy in for rubric development and use
  • gain a basic awareness of reliability issues and skills needed to calibrate multiple rubric users/uses

The workshop approach integrates best practice in adult learning. It has been developed to be relevant and practical by using hands-on experiential learning techniques. Participants will work through real life examples independently and in small groups.

This session will enhance evaluator competencies in Theoretical Foundations (Evaluative Knowledge, Theory, and Reasoning) by furthering knowledge of: the logic of evaluation; evaluative actions; and synthesis methodologies. Secondarily, the session will further Evaluative Attitude and Professional Practice by forwarding self-awareness through promotion of transparency in the evaluation process.

Who should attend?
This workshop caters to beginner and intermediate level commissioners and coordinators of evaluation. Novice rubric users and those seeking further technical knowledge are welcome.


About the facilitator: Dr. Krystin Martens
Krystin Martens (pictured) is a Lecturer and the Coordinator of Online Learning at the University of Melbourne’s Centre for Program Evaluation. Krystin began working in evaluation in 2003 focusing on multi-year, multi-site school-based health program, policy, and product (e.g., curriculum) evaluations. She has since conducted evaluations in an array of settings including international development and higher education. She has also engaged in evaluation capacity building initiatives and organisational development projects. Her work has involved client organisations including the U.S. and Swiss National Science Foundations, the U.S. Centers for Disease Control and Prevention, the European Commission’s Education and Culture Directorate General, and Heifer Project International. Krystin has a Ph.D. in Interdisciplinary Evaluation from Western Michigan University. 


Date and time: Thursday 11 May 2017. 12.30 - 2.00pm
Topic: Performance Measurement – a chore or a vital tool?
Presenter: Graham Smith
Venue: DFAT, 255 London Circuit, Canberra 2601
Register online by: Tuesday 9 May

Performance measurement is mandated in many organisations including the Commonwealth and ACT governments. Statements such as ‘what gets measured, gets managed’ indicate the centrality of performance measurement to management everywhere. But there are significant implementation challenges and some question its value.

In this session, the presenter - Graham Smith - will open by giving a framework based on his recent research that helps to understand where, and in what circumstances, performance measurement is more likely to be effective (a vital tool), and where it can sometimes be unhelpful (a chore, or worse, detrimental to the organisation).

Feedback will be sought from the group on their experiences with good and poor performance measurement approaches, and we will try to learn from these by fitting them into this framework. Finally, we will go back to the framework and see what it says about practical steps that might be taken towards improving performance measures and addressing practical difficulties.

This brief session does not attempt to be a workshop on how to measure performance, but an interactive discussion of the principles that help to determine when and how it is more likely to be effective.


Graham Smith has a wide background in various types of review including performance audit and evaluation. He has provided evaluation, audit and consultancy services for over 20 years, primarily to government.

In recent times, Graham has focused on performance measures, working for clients including the ACT Audit Office and the ANAO, as well as delivering training workshops. He is a regular speaker at AES conferences (including presenting on performance measurement systems) and has served the AES in his regional group and on the Board. He is currently most of the way through studies at the University of Canberra toward a PhD in performance measurement.

Tea, coffee and water will be provided but you will need to bring your own sandwiches.

Date and time: Thursday 4th and Friday 5th of May 2017, 9am to 5pm (registration from 8.30am)
Location: Keelty room, Novotel Canberra, 65 Northbourne Avenue Canberra
Presenter: Anne Markiewicz
Register online by: 3:00pm, Monday 1 May 2017
Fees (GST inclusive): Members $770, Non-members $935, Student member $385, Student non-member $550

Monitoring and Evaluation (M&E) Frameworks are becoming increasingly important for developing an agreed approach to the assessment of results achieved and to aid organisational learning. The M&E Framework identifies expected results, key evaluation questions and the means to answer these questions through routine monitoring and periodic evaluation. It also provides a guide to the implementation of M&E processes over the life of a program or other initiative. Monitoring and evaluation functions are essential to the effective operation of programs and will contribute to the overall value derived from them.

Monitoring and Evaluation Frameworks support decision-making, resource allocation and a process of regular program-focused reflection and learning leading to program refinement and improvement over time. The Public Governance, Performance and Accountability (PGPA) Act (2013) and the Department of Finance Resource Management Guide No.131 ‘Developing Good Performance Information’ (April 2015) affirm the importance of planning to identify program intentionality and to outline how program performance will be measured and assessed.

This workshop follows the structure of the text book ‘Developing Monitoring and Evaluation Frameworks’ authored by Anne Markiewicz and Ian Patrick. It will present a clear and staged conceptual model for the systematic development of an M&E Framework. It will examine a range of steps and techniques involved in the design and implementation of the framework; explore potential design issues and implementation barriers; cover the development of Program Theory and Program Logic; the identification of key evaluation questions; the development of performance indicators; and identification of processes for multi-method data collection, on-going analysis and reflection based on data generated.

The facilitator will encourage interactive peer to peer dialogue to share experiences and learning, and also draw on case studies to encourage application of knowledge and skills to evaluation contexts.

Content

  • The importance and function of monitoring and evaluation processes
  • 'Table of Contents' for the development of an M&E Framework – what to do and in what order
  • Design of a viable M&E framework
  • Application of M&E frameworks to programs
  • Key challenges and barriers, and how to address them

Outcomes and Benefits

  • Understanding of an overall structure for the development of a M&E Framework
  • Identification of clear steps and stages involved in the process of development of an M&E Framework, and building knowledge and skills in their implementation
  • Use of case studies to develop key components of an M&E Framework for an initiative
  • Understand how to best support participatory processes in design and implementation of an M&E Framework

Who should attend?

This workshop offers professionals from across government, universities and not for profit and consulting organisations foundation skills in planning for monitoring and evaluation of a program. You would benefit most from the workshop if you have some prior knowledge of evaluation, particularly program theory and program logic and some practical experience with evaluation activities.

About Anne Markiewicz

Anne Markiewicz (pictured) is Director of Anne Markiewicz and Associates, a consultancy that specialises in developing Monitoring and Evaluation Frameworks, and in the delivery of training, mentoring and capacity building in monitoring and evaluation. Anne is the co-author of the text book ‘Developing Monitoring and Evaluation Frameworks’ (Sage 2016). She has extensive experience in the design and implementation of monitoring and evaluation frameworks for a wide range of initiatives, and in building the capacity of organisations to plan for monitoring and evaluation. Anne has been an evaluator for over 20 years, has been recognised by the Australasian Evaluation Society through a number of awards for excellence in evaluation, and is a Fellow of the Society. Anne has delivered this training program extensively in Australia, the Pacific, the USA and the UK.

Date and time: Monday 20 and Tuesday 21 February 2017, 9am to 5pm (registration from 8.30am)
Location: Novotel Canberra, 65 Northbourne Ave, Canberra
Presenter: David Roberts
Register online by: Monday 13th February 2017
Fees (GST inclusive): Members $770, Non-members $935, Student member $385, Student non-member $550

Reporting for accountability is a key object of the PGPA Act 2013 and the enhanced Commonwealth Performance Framework. The Department of Finance Resource Management Guide No. 131, Developing good performance information (RMG.131) argues that Government entities should be able to tell a cohesive performance story using a mix of quantitative and qualitative information. RMG.131 defines qualitative information as

Information that emphasises narrative rather than numbers. Qualitative inquiry involves capturing and interpreting the characteristics of something to reveal its larger meaning. This can involve tapping into the experiences of stakeholders through observations, interviews, focus groups and analysis of documents (p. 50)

When done well, qualitative inquiry provides rich data that allow managers to develop a compelling and descriptive story about the strengths and weaknesses of projects, programs and initiatives.

Purpose of Workshop

The aim of the workshop is to support the implementation of the enhanced Commonwealth Performance Framework through an appreciation of the value of qualitative inquiry for evaluation and accountability reporting and the distinctive nature of the insights it reveals.

Workshop participants will gain a good understanding of the planning and preparation required to:

  • use qualitative inquiry in policy and reporting cycles
  • assess the quality of qualitative research
  • use a range of qualitative data collection and analysis methods, including in-depth interviews, direct observation and written documents.

Participants will develop a conceptual model for the use of qualitative inquiry across the policy cycle; and gain basic skills in several key qualitative techniques through hands-on exercises.

Learning strategies

The approach is based on adult learning principles and experiential learning. There will be some presentations but most of the workshop will engage participants in exercises and discussion. In most of the exercises, some participants (in rotation) will act as observers and lead the debriefing session after each exercise.

Who should attend

This workshop is aimed at performance managers, programme managers and other officers responsible for program evaluation and reporting on the performance of programmes or other activities. It caters for beginner to intermediate level, whether you are commissioning or coordinating an evaluation or performance report for the first time or are seeking further technical information about using qualitative inquiry to prepare a performance story.

About the Facilitator

With over 30 years’ experience in government, David Roberts is now a self-employed consultant with wide experience in using qualitative and quantitative methods. He is the immediate past-President of the AES and was Chair of the AES Awards Committee for three years. David has training in Anthropology, Evaluation and Community Development. He has conducted workshops in areas such as Qualitative Methods, Evaluation Design, Elicitation Techniques, Participatory Research and Program Theory.

Date and time: Friday 24 March 2017, 9am to 5pm (registration from 8.30am)
Location: The Boardroom, Novotel Canberra, 65 Northbourne Avenue, Canberra
Presenter: Dr Jess Dart, Founder Director, Clear Horizon Consulting Pty Ltd
Register online by: Friday 17 March 2017 
Fees: Members $440, Non-members $605, Student member $220, Student non-member $302.50

About the workshop
To support the introduction of the enhanced Commonwealth performance framework, the Department of Finance released a series of guides. Developing good performance information (Resource Management Guide No. 131. April 2015)* emphasises that qualitative and quantitative performance information is critical to telling a coherent performance story that demonstrates the extent to which a Commonwealth entity is meeting its purposes through the activities it undertakes.
A performance story report is essentially a short report about how a program contributed to outcomes. Although such reports may vary in content and format, most are short, mention program context and aims, relate to a plausible results chain, and are backed by empirical evidence (Dart and Mayne, 2005). The term 'performance story' was introduced by John Mayne in a paper published in 2004.
Performance story reports aim to strike a good balance between depth of information and brevity. They aim to be easy for staff and stakeholders to understand, and help build a credible case about the contribution that a program has made towards outcomes or targets. They also provide a common language for discussing different programs and helping teams to focus on results.
This workshop will explore some different approaches to telling an accurate and meaningful performance story, and how they are developed. It will offer some steps for collecting and reporting performance information to tell a clear picture of performance and explore the role of program logic and evidence. It will be an interactive and engaging workshop involving case studies and group process.
*This guide is available on the Department of Finance website at www.finance.gov.au

Who should attend?
This workshop is aimed at performance managers, programme managers and other officers responsible for measuring and reporting on the performance of programmes or other activities. It caters for beginner to intermediate level, whether you are creating, commissioning or coordinating a meaningful performance story for the first time or are seeking further technical information about how to tell a meaningful story about what has been achieved and demonstrate the extent to which your organisation is meeting its purpose through the activities it undertakes.

About the presenter
Jess Dart's professional interests are in evaluation methods, evaluation theory, collaborative approaches, and strategic planning. She has a PhD in program evaluation and an MSc in Sustainable Agriculture. Her doctoral research involved adapting and testing a story-based monitoring and evaluation tool named the 'Most Significant Change' technique (MSC). She went on to co-author the user guide with Rick Davies. Jess is the founder of Clear Horizon Consulting, a medium-sized consulting company specialising in evaluation and strategy.
Jess has extensive experience in performance story approaches. From 2008 to 2012 she championed the 'performance story reporting' pilot process with two divisions of the Commonwealth Government, which led to over 20 performance story reports being written. She also developed a particular approach to documenting and creating performance stories named 'Collaborative Outcome Reporting' (COR).

Date and Time: Monday 14 November 2016, 9am to 5pm (registration from 8.30am)
Location: Reid Room, Novotel Canberra, 65 Northbourne Avenue, Canberra ACT
Presenters: Duncan Rintoul and Vanessa Hood
Register by: Monday 7 November 2016
Fees: Members $440, Non-members $605, Student members $220, Student non-members $302.50

About the workshop
This one-day course has been custom designed for people who want to commission better evaluations. It is for people with a role in planning, commissioning and managing evaluations. It is suitable for beginners through to those with a few years' experience who want to gain knowledge and consolidate their understanding of:

  • different types of evaluation and how they can be used to inform policy, strategy and project work
  • principles and steps in successfully planning and implementing an evaluation project
  • techniques for developing and prioritising your evaluation questions
  • effective strategies for stakeholder engagement in the evaluation process
  • elements that make up a good evaluation brief / approach to market
  • factors that influence the scale, budget and timeframe of an evaluation
  • what to look for in an external evaluation team
  • assessment of evaluation proposals and the procurement process
  • management of evaluation consultancies
  • ethical conduct, governance and risk management in evaluation.

The training is interactive and hands-on, with lots of practical examples and group activities through the day to keep the blood pumping and the brain ticking. It will provide you with tools that you can start using immediately.

Who should attend?
Anyone working in government, the community sector or business who has a role in planning, commissioning or managing evaluations. No prior experience in evaluation is required.

The course is limited to 20 participants. If the course books out, it will be run again.

About the presenters

Duncan Rintoul has over 15 years' research and evaluation experience across a broad range of policy areas, organisational contexts and methodologies and he was a Director of the AES Board from 2012-16. He currently works part-time for the NSW Department of Education, in the Centre for Education Statistics and Evaluation, where he is leading the drive to build evaluation capacity. He is also writing up his PhD at the University of Wollongong, on questionnaire design for the web. In addition, Duncan runs Rooftop Social with Vanessa Hood and teaches popular introductory courses in evaluation for the AES and the Australian Market and Social Research Society. He was also on the Urbis team that won the AES Award for Evaluation Project of the Year in 2011.


Vanessa Hood blends the worlds of facilitation and evaluation and has over 15 years' experience in a range of settings, including behaviour change for sustainability. Vanessa is passionate about working with people and uses a range of creative facilitation techniques to help participants engage deeply with technical content and, importantly, with each other. In her current role as Associate Director with Rooftop Social she works with a range of NGOs and government organisations across Australia. She regularly delivers structured training, coaching and mentoring in facilitation and evaluative thinking. Prior to this, Vanessa was the Evaluation Lead at Sustainability Victoria, where she had responsibility for numerous internal evaluations at project and strategic organisational levels. She is an active member of the AES, including as a member of the newly formed Design and Evaluation Special Interest Group.

Date and Time: 12:00 PM-1:30PM, Wednesday 14 September, 2016
Venue: DFAT, 255 London Circuit, Canberra
Presenters: 
Scott Bayley, Principal Specialist Performance Management and Results, DFAT.
Penny Davis, Assistant Director, Office of Development Effectiveness, DFAT.
Elizabeth Prior, PNG Branch, DFAT. 
Jo Hall, Independent evaluation consultant.
Register by: Monday 12 September

This is a free event

In the Department of Foreign Affairs and Trade (DFAT), independent evaluations are undertaken at two levels:

  • Strategic evaluations produced by the Office of Development Effectiveness (ODE). These are high-level evaluations of aid program policies, strategies and approaches to common development issues; and
  • Operational evaluations managed by country and regional programs. These focus on individual aid activities.

This presentation, given by internal and external DFAT evaluators, looks at what they have learned about the 'why, what and how' of developing and presenting evaluation findings in ways that are useful, and how to support their uptake within the Australian international aid program. DFAT evaluators will outline the factors that inhibit or enhance the implementation of recommendations and provide a range of tips for making recommendations more effective. The session will cover:

  1. Factors that inhibit or enhance the implementation of recommendations.
  2. Options for preparing for evaluations to enhance the implementation of recommendations.

The discussion is targeted towards AES members and others who have found themselves asking why evaluators should make recommendations, what types of recommendations should be made, or how recommendations should be made. There will be opportunity for questions and discussion.

Date and time: Monday 5 and Tuesday 6 December 2016, 9am to 5pm (registration from 8.30am)
Location: Novotel Canberra, 65 Northbourne Ave, Canberra ACT
Presenter: Dr Gill Westhorp
Online registration closes: Tuesday 29 November 2016
Fees: Members $770, Non-members $935, Student member $385, Student non-member $550

About the workshop
Instead of asking whether or not a program or intervention ‘works’, realist evaluation provides methods for determining “what works for whom in what contexts, in what respects, and how”. Realist evaluation is particularly useful when new interventions are being developed; when interventions are being considered for replication or scaling up; when programs are complex or are being introduced in complex settings; or when previous evaluations of programs have found mixed outcomes.

This practical and applied program will:

  • introduce the concepts that underpin realist approaches and explain the specific ways that three terms are used in realist evaluation: context, mechanism and outcome;
  • introduce realist evaluation design, demonstrating the implications of taking a realist approach for all aspects of the evaluation design process;
  • discuss the nature of evidence required and provide examples of methods for identifying program mechanisms, and for identifying ‘for whom’ programs are and are not effective;
  • introduce the differences between realist interviewing and ‘traditional’ kinds of interviewing.

Outcomes and Benefits

By the end of the workshop, it is expected that participants will:

  • Understand where realist evaluation ‘fits’ and how it differs from other evaluation approaches (including other theory-based approaches);
  • Understand how to move from descriptive evaluation to explanatory evaluation;
  • Be able to explain key ideas to colleagues;
  • Understand how to approach realist evaluation design.

Target Audience
Evaluation practitioners and commissioners in Government, NGOs, consultants, and academics will benefit from this program. No experience in realist evaluation is necessary. Those with some background in realist approaches will be assisted to work at a more advanced level. This will be an applied program and all participants are requested to 'bring a program' (or policy, initiative, strategy) to work on.

The presenter
Dr Gill Westhorp is a specialist in realist research and evaluation methodologies, a part-time consultant and part-time academic. She is Director of a small research and evaluation consultancy company specialising in realist approaches; a Professorial Research Fellow at Charles Darwin University, Darwin, Australia; an Associate at RMIT University, Melbourne, Australia; a member of the core team for the RAMESES I (standards for realist synthesis) and RAMESES II (standards for realist evaluation) projects based in Oxford, UK; and a member of the Advisory Committee for the Centre for the Advancement of Realist Evaluation and Synthesis (CARES) at Liverpool University, UK.

Date and time: Tuesday 23 August 2016, 9am to 5pm (registration from 8.30am)
Location: University House, Balmain Crescent, Acton, Canberra
Presenter: Dr Ian Patrick, Ian Patrick and Associates
Register online by: Wednesday, 17 August 2016
Fees: Members $440, Non-members $605, Student member $220, Student non-member $302.50

Purpose of Workshop
The workshop will provide participants with insight into theory based approaches to evaluation, and specifically into the role of Program Theory and Program Logic to provide a clear understanding, focus and direction to the practice of evaluation. The use of Program Theory and Program Logic will be clearly detailed within a staged conceptual model, with guidance provided on how they can be applied within the planning and implementation of an evaluation.

Areas covered in the workshop include the use of Program Theory and Program Logic to:

  • Identify the expected cause and effect relationships within a program, and the critical assumptions which underpin whether anticipated change occurs
  • Establish relationships between the more operational constructs of inputs, activities, outputs, outcomes and impacts as they apply to a program
  • Identify critical areas of focus for monitoring and evaluation, including determining evaluation questions across different evaluation domains.

The role of stakeholders in the development of the Program Theory and Program Logic, and ways to promote their participation, will be a point of emphasis. The workshop will consider how monitoring and evaluation activities can establish the validity of the Program Theory and Program Logic, and assist in adjusting these models as a program matures or understanding of its identity changes. Constraints and limitations in the use of Program Theory and Program Logic will also be identified, together with common pitfalls in implementation and means to address these.

Teaching/Learning Strategies and Resources to be Used

The workshop will incorporate a mix of training methods including presentations, use of case studies, and small group interactive work. There will be ample opportunity for open discussion and questions.

Target group:

This workshop is pitched at an Intermediate level.

The workshop's worth

Theory based approaches to evaluation are increasingly recognised as having a core role in evaluation, and their use is seen as a means to resolve debates regarding choice of an appropriate evaluation methodology. The importance of a theory based approach is also reinforced within recent Australian government legislation and guidelines. The Public Governance, Performance and Accountability Act (2013) and Resource Management Guide 131 'Developing Good Performance Information' (Department of Finance, April 2015) highlight the important place of logic models as representations of how a program's purpose will be met, the chain of reasoning that connects critical elements to that purpose, and the performance information needed to tell an effective 'performance story'. With a blend of conceptual material and practice, the workshop will position participants to make effective use of Program Theory and Program Logic. The workshop contents are also closely related to the recent SAGE publication, Developing Monitoring and Evaluation Frameworks, of which Ian Patrick is joint author.

About the Trainer

Dr. Ian Patrick is a self-employed consultant undertaking evaluation related roles in both Australia and the Asia-Pacific region. Ian has considerable experience as a trainer and has delivered workshops in areas such as Developing M&E Frameworks, Introduction to M&E, Advanced M&E, Impact Assessment, and Participatory Evaluation. This experience crosses Australia, New Zealand, United States, UK, Ireland and a range of developing countries particularly in the Asia-Pacific region. Ian is an Honorary Senior Fellow with the School of Social and Political Sciences, University of Melbourne.

Context
Over the past few years, our local AES Committee of volunteers has coordinated a schedule of professional development and networking activities, including Hot Issues Breakfasts, lunchtime seminars, book-club and formal workshops. The Canberra evaluation scene is thriving and our activities continue to enjoy good levels of engagement and participation from evaluation enthusiasts of all levels from novices to experts.

We are now pleased to announce that we will be hosting the 2017 Australasian Evaluation Society Conference in Canberra. The conference provides an exciting opportunity to both showcase the evaluation scene in our beautiful city and create an outstanding event that delivers a useful legacy.

In the meantime we want to continue to provide local activities but recognise that this will require some re-prioritisation and re-organisation as we also organise the conference.

Invitation

We invite you to an ideas lab to explore options and possibilities for the 2017 AES International Conference and for driving local activities over the next eighteen months. The purpose of the session is to generate ideas about contemporary issues in evaluation that we want to pursue, and to explore opportunities made possible by hosting an international evaluation conference.

Date and Time: Wednesday 27 July 2016, 3.00 - 5.00pm
Venue: Industry House, Department of Industry, Innovation and Science, 10 Binara St, Canberra (Ground floor meeting room G 021)

Agenda
1. Conference Ideas

  • Themes, sub-themes and logistics
  • Keynote speakers

2. Questions

  • What can we learn from past conference experiences?
  1. AES conferences
  2. Evaluation on the Edge, Canadian Evaluation Society Conference 2016 (Kim Grey, PM&C)
  3. Other conference experiences
  • What are the opportunities and issues we want to pursue?
  • What are our priorities for ongoing local activities?
  • How can we get involved?

Light afternoon tea refreshments will be provided.

RSVP: 1pm Monday 25 July 2016

We hope that you’ll be able to join us.

Yours sincerely
John Stoney, Susan Garner, Julie Elliott, Kim Grey, Scott Bayley, Stephen Horn, Lisa Barney, Frances Byers, Sue Sutton, Joan ten Brummelaar

Date and Time: Wednesday 11 May, 2016, from 7:30am to 8:45am
Venue: Waldorf Apartment Hotel, 2 Akuna Street, Canberra
Presenters: Emma Williams & Kim Grey
Register by: Monday 9 May 2016
Cost: Free (just pay for your own Breakfast)

Event Description:
The term ‘evaluative thinking’ is increasingly used – but in two very different senses. In one, non-evaluators begin to think the way evaluators think, applying aspects of evaluative thinking to the way they design initiatives, record activities and use data internally. The other sense of ‘evaluative thinking’, also sometimes called ‘evaluative reasoning’, refers to critical thinking being applied by evaluators at different stages of evaluations, to result in more robust processes and more actionable findings.

Both of these will be explored at a breakfast session as participants discuss what improvements in practice could result from greater insight into one or both meanings of the term.

Suggested references:
https://tgarchibald.wordpress.com/2013/11/15/evaluative-thinking-lit-review/ 

http://www.eers.org/sites/default/files/Archibald_PromotingEvaluativeThinking.pdf 

The betterevaluation.org rainbow framework places ‘valuing evaluative thinking’ in the Develop Evaluation Capacity step, but does it apply in other places?

http://betterevaluation.org/plan/manage_evaluation/evaluation_capacity 

About the Facilitators
Emma Williams is Principal Scientist, Evaluation for Northern Contexts, at the Northern Institute, Charles Darwin University. She has run sessions on evaluative thinking with people in a range of roles and in other countries, and is keen to hear what it means to you.

Kim Grey works on government evaluations in Indigenous affairs and is currently undertaking the Master of Evaluation by research at the University of Melbourne. She is interested in how evaluative thinking helps question assumptions and span the divide between evaluation and monitoring.

Date: Thursday 25th February 2016, 9:00am to 5:00pm (registration at 8:30am)
Venue: University House, Balmain Crescent, Acton, Canberra 
Presenters: Vanessa Hood and Natalie Moxham
Register by: Monday 22 February 2016
Cost: Members $440, Non-members $605, Student member $220, Student non-member $302.50

About the workshop - Overview

Stakeholders are more likely to feel ownership of an evaluation and implement the recommendations if they participate throughout the process. Evaluators therefore need to make the most of every opportunity they have with their stakeholders to encourage participation. However, working with stakeholders is not always easy and can be one of the most challenging aspects of managing or conducting an evaluation. The solution to encouraging participation is to have strong facilitation and team building skills.

Facilitation is process focused and a collaborative way of working where the leader takes a neutral position and helps the group achieve their objective (e.g. to articulate a vision of success, make a decision or prioritise recommendations). Facilitative processes value the wisdom of the group and allow everyone to express their view. They enable people to connect with each other and to work effectively as a group. A facilitator actively listens and responds to the needs of the group. They communicate verbally and non-verbally and encourage the group to do this by using active processes. Facilitators are aware of their actions and values and understand how these influence the group. They tap into the hearts and minds of the group and help them understand underlying issues.

Content

Participants leave with practical facilitation skills they can use immediately.

Facilitation fundamentals:
• Understanding the group - thinking and learning styles and personalities, diversity and difference, ensuring all voices are heard, dealing with conflicts
• Understanding yourself - values and behaviour and how they affect the group
• Understanding the facilitation process - developing a facilitation plan to achieve the group's objectives
• Choosing and using the right facilitation tool for the situation - pros and cons

Introduction to participatory methods that require strong facilitation:

✪ Participatory evaluation design including a shared vision of success
✪ Most Significant Change for data collection and analysis
✪ Group semi-structured interviews and structured discussion methods
✪ Participatory data analysis and reflection (e.g. in a summit workshop)
✪ Building evaluative capacity of teams

Who should attend

Internal evaluators and external consultants, program managers and policy officers who:
• conduct their own evaluations
• manage or commission evaluations
• design and manage evaluation capacity building projects
• facilitate lessons learned sessions.

Presenters

Vanessa and Natalie are experienced facilitators and evaluators. They have used participatory approaches in their work and have trained other people in this area.
Vanessa is a facilitator and evaluator with over 15 years' experience, in a range of contexts including sustainability and agriculture. Her experience is mainly with State government and she also works as a consultant with Rooftop Social. She is passionate about working with people and uses creative facilitation techniques. These methods enable participants to engage deeply with technical content and, more importantly, with each other. She established evaluation and outcome focused approaches across a government organisation, including embedding reflective practice at all levels of the business. She has also conducted evaluations at project and strategic organisational level. Vanessa delivers training, mentors others in evaluation and regularly facilitates group workshops.

Natalie is a facilitator and consultant in organisational processes, participatory program design, monitoring and evaluation. She has successfully designed and delivered training in facilitation, program design, monitoring and evaluation. Natalie's participatory approach uses strength-based engagement processes that build agency and collaboration with communities, networks, organisations and programs. Natalie lives on DjaDjaWurrung Country in central Victoria and has a long-term commitment to, and extensive experience in, working with Indigenous groups, as well as in the Asia-Pacific region and Australia. She holds a Master's in international development and works across the community, NGO and government sectors.

Date and Time: Wednesday 25 November 2015, from 7:30am to 9.00am
Venue: A. Baker, New Acton Pavilion Unit 2, 15 Edinburgh Ave Canberra ACT (02) 6287 6150 (refer to attached map)
Presenter: Imogen Wall, Australian National Development Index
Cost: Free - just pay for your own breakfast. (Credit card is the preferred method of payment)
Register online by: Friday 20 November

Event Description:
In the spirit of holiday cheer and goodwill and as the 2015 EvalYear draws to an end, we invite you to meet ANDI and join a conversation over breakfast about what matters most.

The Australian National Development Index (ANDI) is a framework that is being developed to capture a complete picture of national wellbeing. It will incorporate a comprehensive set of the key social, health, economic and environmental factors contributing to our overall quality of life. ANDI will include these critical domains of people's lives that lead to enhanced wellbeing. In a nutshell, ANDI expresses the kind of Australia we all aspire to live in and provides a holistic and integrated approach to measuring wellbeing.

This breakfast will include a presentation on options and issues for moving beyond a focus on economic growth to effectively measure equitable and sustainable wellbeing, and the role evaluators can play in this.

About the Presenter
Imogen Wall is an independent consultant in ideas design - helping people formulate, clarify and communicate their creative projects. She worked for many years with the Australian Bureau of Statistics in survey design, social research and wellbeing measurement. As part of this, she reviewed the flagship product Measures of Australia's Progress, galvanising a nationwide public conversation about what matters most to Australians. She is now working with the Australian National Development Index – an independent consortium developing an international index of equitable and sustainable wellbeing – and on an online tool to put statistical power into the hands of communities and propagate ecological thinking.


Date and Time: Tuesday 10 November 2015, from 7:30am to 8.45am
Venue: Waldorf Apartment Hotel, 2 Akuna Street, Canberra
Instructions: Enter from the London Street entrance.
Presenter: Emma Williams
Cost: Free (just pay for your own Breakfast)
Register online by: Friday 6 November

Event Description:
During 2015, the ACT Chapter of the AES is promoting discussion about the AES Evaluators' Professional Learning Competency Framework, with monthly activities devoted to different competencies. Copies of the entire framework can be downloaded at http://www.aes.asn.au/professional-learning/pl-resources.html.

This ACT event will examine the core competency of Interpersonal Skills – twelve areas of skill needed to communicate effectively with clients, consumers and other evaluation stakeholders, including displaying empathy and written, verbal and non-verbal language skills.

The AES is now looking at identifying levels for these skills, from beginner to advanced. How can you tell what level of skill you have in each interpersonal skill area? How can you identify your own strengths and weaknesses, and how can you improve skills such as empathy or non-verbal communication?

This Hot Issues Breakfast will explore these issues, including a short presentation on research on the topic, followed by participant discussion. A potential evaluation capacity building technique (learning circles*) will be introduced.

About the Facilitator

Emma Williams is Principal Scientist, Evaluation for Northern Contexts, at the Northern Institute, Charles Darwin University, and has studied non-verbal communication and evaluative use. She thinks she may be the only AES member to be actively bi-jurisdictional (NT and ACT), but would be happy to hear of others.

* Kishchuk, N., Gauthier, B., Roy, S. N., & Borys, S. (2013). Learning circles for advanced professional development in evaluation. The Canadian Journal of Program Evaluation, 28(1), 87–96.

Date and time: Thursday 3 September 2015, 7:30 am to 9:00 am
Venue: Waldorf Hotel, 2 Akuna Street, Canberra ACT
Cost: Buy your own breakfast
Register online by: Friday 28 August 2015

The AES Canberra Committee warmly welcomes others from the ACT to an informal breakfast meeting. The breakfast is a social occasion to give those attending the AES Conference in Melbourne a chance to meet each other before the conference.

We particularly encourage first-time conference attendees to come along and meet the friendly ACT members as a relaxed way to meet people before the conference. In Melbourne you will then be able to spot some friendly faces in the crowd!

If you have been to AES Conferences before, and will know others attending, it gives you the chance to introduce yourself to new ACT attendees, make them welcome at the Conference and introduce them to evaluators from elsewhere.

New and seasoned attendees are encouraged to come and enjoy breakfast with us.

Date and time: Friday 11 September 2015, 9:00 am to 12:00 pm (registration from 8.30am)
Venue: The RUC (ACT Rugby Union Club), 51 Blackall Street, Barton 
Presenter: Penny Hawkins, Head of Evaluation, UK Department of International Development (DFID)
Register online by: Thursday 10 September 2015
Fees: AES Members $220, Non-members $302.50, Student AES member $110.00, Student non-member $151.00

aes15 International Evaluation Conference speakers presenting a post-conference half-day evaluation masterclass in Canberra

About the masterclass
This interactive masterclass will explore the vital role of leadership in evaluation for contributing to public policies in meaningful ways, particularly in complex, contested and politicised spaces, and the risks if this role is ignored. Topics covered will include creating an enabling environment for good evaluation; building evaluation culture, capacity and systems; embedding evaluative thinking; developing evaluation policy in organisations; and political savviness.

The half-day master class is limited to 20 participants and is built around contemporary evaluation needs. By the end of the session, participants will be able to apply the ingredients of effective evaluation leadership for making a positive contribution.

Who should attend
The masterclass aims to support evaluation team leaders, experienced evaluation practitioners and evaluation thought leaders. If you wish to explore some of the challenges of contemporary evaluation and how they might be navigated, this workshop is for you.

About the presenter
Penny Hawkins is an evaluation specialist with extensive experience in public sector evaluation and a former AES President. She is currently Head of Evaluation at the UK Department of International Development (DFID). Before taking up this role in 2013, Penny was an evaluation specialist at The Rockefeller Foundation in New York and held a number of evaluation management roles in New Zealand government departments including as Head of Evaluation for the New Zealand Aid Programme at the Ministry of Foreign Affairs and Trade. She currently serves as the Chair of the OECD-DAC Network on Development Evaluation and from 2003–13 was a faculty member for the International Program for Development Evaluation Training (IPDET) at Carleton University in Canada.

Penny has contributed to several evaluation publications including co-editing a book published in 2012 Evaluation Cultures – Sense Making in Complex Times. In 2007, she received the AES Award for Outstanding Contribution to Evaluation. Penny's longstanding commitment to the evaluation profession stems from her optimism that evaluation can make a positive contribution to world development and human wellbeing.

Date and time: Thursday 10 September 2015, 9:00 am to 5:00 pm (registration from 8.30am)
Venue: Waldorf Apartment Hotel, 2 Akuna Street, Canberra
Presenters: Kate McKegg and Judy Oakden
Register online by: 9 September
Fees: AES Members $440, Non-members $605, Student AES member $220, Student non-member $302.50

aes15 International Evaluation Conference speakers presenting a post-conference workshop in Canberra

About the workshop

Developmental Evaluation (DE) integrates evaluative thinking, learning and knowledge into innovative policy and program development. It is particularly well suited to complex policy issues, responding to crises, radical program re-design, replicating effective programs in new contexts and changing or creating more effective processes, ideas and services to increase their likelihood of succeeding.

Similar to the role of research and development, DE involves providing real-time feedback to program staff, creating a continuous development loop.

DE is an approach to evaluation grounded in the premise that systems change must be rooted in the values, creativity, traditions, skills and practices of the communities and organisations whom the change will most affect. DE taps into the many diverse and often competing values that underpin the complexity and uncertainty of policy and program development; it makes these values transparent, and provides people with a pathway to learning about what is being developed, how well or how effective any development is, and how they might respond, adapt and continue developing to ensure desired results are achieved.

Workshop participants will learn about the unique niche, strengths and limitations of DE, including the implications of systems thinking and complexity for evaluations. Participants will have the opportunity to learn and engage in facilitated pragmatic, hands-on application of developmental evaluation's concepts. The combination of theoretical and experiential learning will ensure the knowledge gained by participants about developmental evaluation can be more easily applied in real world contexts.

"[DE] sits alongside, it doesn't control or dampen the core values of innovation" (Nan Wehipeihana, cited in Patton, 2010) 

Who should attend
This workshop will be most useful for those with some prior evaluation experience or knowledge: internal and external evaluators, experienced policy makers, program developers, funders and commissioners of evaluation keen to effectively infuse evaluative thinking, learning and knowledge into collaborative policy and program development, particularly in innovative and complex situations.

About the presenters
Kate McKegg is an independent evaluation consultant with over 20 years' evaluation experience. She is the director of The Knowledge Institute Ltd (www.knowledgeinstitute.co.nz) and a member of the Kinnect Group. Kate is the current President of the Aotearoa New Zealand Evaluation Association (www.anzea.org.nz) and a former board member of the Australasian Evaluation Society. Kate is co-editor of New Zealand's only evaluation text, Evaluating Policy and Practice, a New Zealand Reader (2003), and, along with Nan Wehipeihana, Kataraina Pipi and Veronica Thompson, was a recipient of the Australasian Evaluation Society 2013 Best Evaluation Policy and Systems Award for the He Oranga Poutama Developmental Evaluation.

Judy Oakden is an independent evaluation and research consultant, owner of Pragmatica Limited, and a member of the Kinnect Group. She has held management roles in evaluation, market research and management consulting, and also worked in public relations.
Judy's evaluation practice has a strong focus on utilisation. She has knowledge in the development and use of evaluative rubrics to frame evaluations, and the use of methods such as rich pictures (from Soft Systems Methodology) and Human Systems Dynamics to work with complexity. She is adept in the use of sense making processes during analysis.
Judy regularly develops evaluation systems and frameworks, conducts research and evaluation, and mentors others in developing evaluation systems and evaluation rubrics.
Ongoing education and continuous improvement is embedded in Judy's ethos, allowing her to develop trusted relationships with her clients based on the ongoing value she delivers. Recently Judy achieved Professional Certification from the Human Systems Dynamics Institute (US).

Date and Time: Wednesday 12 August 2015, from 12:30pm to 2:00pm
Venue: ACT Health, Conference Room, Level 3, 1 Moore Street, Canberra
Instructions: Go up the lift to level 3 in 1 Moore St, turn left out of the lift. The conference room will be signposted.
Presenters: Helen Lilley
Cost: Free, bring your lunch, tea and coffee provided
Register online by: 11 August 2015 

Event Description: 

This ACT event covers the core competency Research methods and systematic inquiry – with a particular focus on the practical aspect of Questionnaire Design. The seminar is an introduction to two questionnaire planning checklists - a Questionnaire Planning Guide and a Questionnaire Design Checklist.

In 2012 Population Health Division, ACT Health funded the development and pilot of a one day Questionnaire Design Course for ACT Health staff. The two checklists were developed specifically for this course. The course was jointly developed by Dr Marian Currie and Helen Lilley, ACT Health and Dr Helen Jordan, University of Melbourne. The course is conducted regularly free of charge for ACT Health staff.

The presenter will outline key issues covered in the checklists and participants will have the opportunity to critique a range of questions from existing questionnaires.

About the presenter
Helen Lilley
Helen was a physiotherapist for 20 years and is currently the Evaluation Coordinator for the Health Improvement Branch, Population Health Division, ACT Health. In this role Helen is responsible for: managing the evaluation of the ACT Healthy Weight Initiative which aims to keep rates of overweight and obesity at or below their current level; coordinating the evaluation of health promotion programs; and, providing advice on questionnaire design. Helen was an academic supervisor for Masters of Health and Community Development students at the University of Canberra for 5 years and supervised a number of evaluation projects in this role.

Date and time: Monday 3 August 2015, 9:00 am to 5:00 pm (registration from 8.30am)
Venue: Waldorf Hotel, 2 Akuna Street, Canberra
Presenter: Jess Dart, Clear Horizon
Register online by: 31 July 2015
Fees: AES Members $440, Non-members $605, Student AES member $220, Student non-member $302.50

About the workshop

To support the introduction of the enhanced Commonwealth performance framework, the Department of Finance released a series of guides. Developing good performance information (Resource Management Guide No. 131. April 2015)* emphasises that good performance information is critical to telling a cohesive performance story that demonstrates the extent to which a Commonwealth entity is meeting its purposes through the activities it undertakes.

A performance story report is essentially a short account of how a program contributed to outcomes. Although these reports may vary in content and format, most are short, mention program context and aims, relate to a plausible results chain, and are backed by empirical evidence (Dart and Mayne, 2005). The term 'performance story' was introduced by John Mayne in a paper published in 2004.

Performance story reports aim to strike a good balance between depth of information and brevity. They aim to be easy for staff and stakeholders to understand, and help build a credible case about the contribution that a program has made towards outcomes or targets. They also provide a common language for discussing different programs and helping teams to focus on results.

This workshop will explore some different approaches to performance story, and how they are developed. It will offer some steps to building one and explore the role of program logic and evidence. It will be an interactive and engaging workshop involving case studies and group process.

*This guide is available on the Department of Finance website at www.finance.gov.au 

Who should attend?

This workshop is aimed at programme managers and other officers responsible for measuring and reporting on the performance of activities. It caters for beginner to intermediate level, whether you are creating, commissioning or coordinating a meaningful performance story for the first time or are seeking further technical information about how to tell a meaningful story about what has been achieved and demonstrate the extent to which your organisation is meeting its purpose through the activities it undertakes.

About the presenter
Jess Dart's professional interests are in evaluation methods, evaluation theory, collaborative approaches, and strategic planning. She has a PhD in program evaluation and an MSc in Sustainable Agriculture. Her doctoral research involved adapting and testing a story-based monitoring and evaluation tool named the 'Most Significant Change' technique (MSC); she went on to co-author the user guide with Rick Davies. Jess is the founder of Clear Horizon Consulting, a medium-sized consulting company specialising in evaluation and strategy.
Jess has extensive experience in performance story approaches. From 2008 to 2012 she championed the 'performance story reporting' pilot process with two divisions of the Commonwealth government, which led to over 20 performance story reports being written. She also developed a particular approach to documenting and creating performance stories named Collaborative Outcome Reporting (COR).

For a copy of the Powerpoint presentation click here

Date and time: Thursday 23 July 2015, 9:00 am to 5:00 pm (registration from 8.30am)
Venue: Waldorf Hotel, 2 Akuna Street, Canberra ACT 2601
Presenter: Carol Vale, Murawin
Register online by: midday 22 July 2015
Fees: AES Members $440, Non-members $605, Student AES member $220, Student non-member $302.50

About the workshop

It is often said that "Indigenous Australians are the most studied people in the country", and whilst there may be an element of truth in that statement, it is also reasonable to note that for the most part evaluations are undertaken by evaluators with varying degrees of cultural competency in relation to engagement with Indigenous communities. The need to increase the number of Indigenous people doing research at all levels of projects is important to ensure that cultural nuances and protocols are captured in the evaluation project.
This workshop will provide an overview of an evaluation framework that draws on Indigenous understandings and perspectives and western-constructed evaluation models. The aim of the workshop is to provide participants with insights into stakeholder engagement processes that will enhance their evaluation practice.

The workshop will explore:

• Indigenous stakeholder engagement
• cultural competency continuum
• building capacity of local researchers
• Indigenous worldviews and the overlay of evaluation practices.

Outcomes

At the end of the workshop, participants will be able to:
• apply principles of Indigenous stakeholder engagement to their practice
• measure their cultural competence in relation to undertaking evaluation in Indigenous communities
• develop strategies to enhance their evaluation practice in consideration of Indigenous world views.

Who should attend?

Beginner to experienced evaluators, policy advisors and managers looking for techniques to assist with the evaluation processes they use with Indigenous communities.

About the presenter

Carol is a Dunghutti woman from NSW with extensive hands-on experience in social research and evaluation. She is Managing Director of Murawin, a company she jointly established in 2013. Prior to Murawin, Carol had a significant career in human services and public policy across a range of government departments in NSW and Qld. Carol now turns her extensive lived and employment experience to a broader audience, spanning the international, business and corporate, non-government, community, individual and government sectors, to empower all to influence change; while her work now extends beyond the Indigenous space, this certainly remains a key focus.

Date and Time: Wednesday 8 July 2015, from 12:30pm to 2:00pm
Venue: ACT Health, Conference Room, Level 3, 1 Moore Street, Canberra
Instructions: Go up the lift to level 3 in 1 Moore St, turn left out of the lift. The conference room will be signposted.
Presenters: Susan Garner and Stephen Horn
Cost: Free, bring your lunch
Register online by: 7 July 2015 

Event Description: 

During 2015, the International Year of Evaluation, the ACT Chapter of the AES is promoting discussion about the AES Evaluators' Professional Learning Competency Framework. This Framework has been designed to support the AES to raise the quality and awareness of evaluation and its role in public policy, and in community and organisational change projects.

Copies of the framework can be downloaded at http://www.aes.asn.au/professional-learning/pl-resources.html.

This ACT event will examine the core competency of Research methods and systematic inquiry – with a particular focus on the use of data in evaluation. This seminar is intended to "whet your appetite" about the range and scope of data, its different forms, and the strengths, challenges and limitations of the data on which evaluative judgements can be based. The presenters will provide case study examples of identifying and using data for evaluation projects to generate interest and discussion. Hopefully it will also stimulate an appetite for further development of this core competency!

Please, bring your lunch. Tea, coffee and drinks will be provided.

From the Framework:
Research methods and systematic inquiry
"Within the scope of an evaluation, knowledge and skills in research methods and systematic inquiry are essential for collecting valid and reliable data on which evaluative judgements can be based. This competency covers the knowledge and skills evaluators need to conduct systematic inquiry in an evaluation".
Source: AES Evaluators' Competency Framework page 7.

About the presenters
Stephen Horn
Stephen Horn is an independent statistician working primarily in official statistics. He is intrigued by the challenges of complex public data capture and transformation, and by the ethics and statistics of evidence-based policy, and seeks to apply quality principles to data practice for government applications. His expertise includes data management; social reporting; disclosure control; collection support; weighting, imputation and nonresponse adjustment; survey quality assurance; design and use of customer surveys; estimation for process control; and technical review of longitudinal surveys.

Susan Garner
Susan Garner is an experienced policy analyst and program evaluator with over 25 years of experience across a range of government portfolios responsible for health and ageing, welfare, social and human services. Susan has been working as a private consultant over the last 7 years, delivering a range of consultancy projects for government, private and not-for-profit organisations. Susan is particularly interested in the use of data in policy development and evaluation to inform evidence based recommendations for decision makers about their policies and programs. She is committed to building capacity in research methods and systematic inquiry as a core competency for evaluators.

 


Date and Time: Thursday 2 July 2015, from 7:30am to 8.45am
Venue: Waldorf Apartment Hotel, 2 Akuna Street, Canberra
Instructions: Enter from the London Street entrance.
Presenters: Scott Bayley and John Stoney
Cost: Free (just pay for your own Breakfast)
Register online by: Wednesday 1 July 2015

Event Description:
During 2015, the International Year of Evaluation, the ACT Chapter of the AES is promoting discussion about the AES Evaluators' Professional Learning Competency Framework. This Framework has been designed to support the AES to raise the quality and awareness of evaluation and its role in public policy, and in community and organisational change projects. Copies of the framework can be downloaded at http://www.aes.asn.au/professional-learning/pl-resources.html.

This ACT event will examine the core competency of Evaluation Theory – with a particular focus on how to obtain a useful sense of how and where various theories 'fit' within the broader pantheon of evaluative theory and practice. A challenge for practitioners is what could be called the 'transdiscipline effect' – evaluation has drawn upon (as well as contributed to) numerous fields, so that there is no single 'theory of evaluation' but rather a plethora of approaches, models and methodologies. Scott and John will lead a discussion of two frameworks – Shadish, Cook and Leviton's (1991) evaluation theory framework and Alkin & Christie's (2004) 'evaluation family tree' – that can assist practitioners in developing an overarching perspective on evaluation theory.

About the Facilitators:
Scott Bayley joined AusAID in 2012 (now integrated into the Department of Foreign Affairs and Trade) as Principal Specialist Performance Management and Results. He is responsible for leading organisational change to achieve the continuous improvement of performance management practices. John Stoney currently divides his time between policy development work (and moonlighting in evaluation) at DSS, and being an Evaluation Fellow at Charles Darwin University. Both Scott and John have a longstanding interest in evaluation capacity and capability development and the interface between evaluation theory and practice.

Date and time: Thursday, 28 May 2015, 9am to 5pm (registration from 8.30am)
Location: Canberra TBC
Presenter: Dr Ian Patrick. Ian Patrick & Associates
Register online by: 25 May 2015 
Fees: Members $440, Non-members $605, Student member $220, Student non-member $302.50

Purpose of Workshop
The workshop will provide participants with insight into theory based approaches to evaluation, and specifically into the role of Program Theory and Program Logic to provide a clear understanding, focus and direction to the practice of evaluation. The use of Program Theory and Program Logic will be clearly detailed within a staged conceptual model, with guidance provided on how they can be applied within the planning and implementation of an evaluation.

Areas covered in the workshop include the use of Program Theory and Program Logic to:
• Identify the expected cause and effect relationships within a program, and the critical assumptions which underpin whether anticipated change occurs.
• Establish relationships between the more operational constructs of inputs, activities, outputs, outcomes, and impacts as they apply to a program
• Identify critical areas of focus for monitoring and evaluation including determining evaluation questions across different evaluation domains

The role of stakeholders in the development of the Program Theory and Program Logic and ways to promote their participation will be a point of emphasis. The workshop will consider how monitoring and evaluation activities can establish the validity of the Program Theory and Program Logic, and assist in making adjustments to these models as a program matures or understandings about its identity change. Constraints and limitations in the use of Program Theory and Program Logic will also be identified, together with common pitfalls in implementation and means to address these.

Teaching/Learning Strategies and Resources to be Used
The workshop will incorporate a mix of training methods including presentations, use of case studies, and small group interactive work. There will be ample opportunity for open discussion and questions.

Who should attend
This workshop is pitched at an intermediate level.

The workshop's worth
Theory based approaches to evaluation are increasingly recognised as having a core role in evaluation, and their use is seen as a means to resolve debates regarding choice of an appropriate evaluation methodology. In an Australian context, the Public Governance, Performance and Accountability Act (2013) and subsequent guidance on performance management from the Department of Finance (2015) support the need for the development and use of logic models to identify clear connections and factors critical to achieving intended results. With a blend of conceptual material and practice, the workshop will position participants to make effective use of Program Theory and Program Logic. The workshop contents are closely related to the forthcoming SAGE publication, Developing Monitoring and Evaluation Frameworks, of which Ian Patrick is joint author.

About the Trainer
Dr. Ian Patrick is a self-employed consultant undertaking evaluation-related roles in both Australia and the Asia-Pacific region. Ian has considerable experience as a trainer and has delivered workshops in areas such as Developing M&E Frameworks, Introduction to M&E, Advanced M&E, Impact Assessment, and Participatory Evaluation. This experience spans Australia, New Zealand, the United States, the UK, Ireland and a range of developing countries. Ian is an Honorary Senior Fellow with the School of Social and Political Sciences, University of Melbourne.

Date and Time: Thursday, 14 May 2015, from 5:30pm to 6:45pm
Venue: Hyatt Hotel, 120 Commonwealth Avenue, Canberra
Register online by: 13 May 2015
Cost: Free

Event Description: 

We are continuing our focus on the International Year of Evaluation's aim to strengthen evaluation capacities. This time, instead of a workshop, we are proposing an interactive discussion between evaluators on our own experiences with the third domain of the Evaluators' Professional Learning Competency Framework - Attention to Culture, Stakeholders, and Context.

Please come along to a new venue and time we are trialling — interactive discussion at the Hyatt, 120 Commonwealth Avenue (a historic Canberra landmark near the ramp to the Parliament Building). It will provide a cultured context for our discussion, and we expect the discussion to be lively. As the Framework notes:

The evaluator is surrounded by, and works within, a multiplicity of value perspectives, including cultural, social and political. These value perspectives are embedded within the evaluand, the context within which an evaluand exists, and in the perspectives of evaluation commissioners and stakeholders.

The evaluator must be cognisant of, and responsive to, such value perspectives.

Building on the March event, where participants looked at their own evaluative attitudes and professional practice, this event will focus on our understanding of the value perspectives we bring with us to our evaluations, and how those values sometimes inform – and sometimes impede – our capacity to be responsive to evaluand and commissioner values.

For example, an evaluator with a strong social justice focus and egalitarian values may find it more difficult to be responsive to the values of a hierarchical and even paternalistic evaluand. How can evaluators best respond in these situations? Again, evaluators in some policy areas find themselves working within rapidly shifting political contexts – what are the best ways to keep the evaluation on track while being responsive to sometimes dramatic contextual changes?

We are also looking to build a reading list on these topics; participants are asked to provide at least one reading they have found useful in this area and the final list will be circulated to all participants. 

 

Date and Time: Tuesday 12 and Wednesday 13 May 2015, from 9:00am to 5:00pm (registration from 8:30am)
Location: Mantra on Northbourne
Presenter: Dr Gill Westhorp
Register online by: 5 May 2015
Fees: (Day one only) Members $440, Non-members $605, Student member $220, Student non-member $302.50. (Days one and two) Members $770, Non-members $935, Student member $385, Student non-member $550

Event Description: 

Many policies and programs are implemented in large systems, or expect to make changes at multiple levels of a system. Many approaches to program theory either assume that the program itself is simple, or ignore the implications of context for whether and how programs work.

The first day of this program will introduce various approaches to 'systems', 'complexity' and 'context'. Participants will explore the implications for program design and for commissioning and conducting evaluations, and in particular, the many uses of theory for dealing with complexity.

The second day will focus on skills and strategies for evaluators working with complex systems. It will present a particular approach for

  • 'layering' systems, program theories and formal theories from different disciplines; and
  • using formal theories for evaluation design and analysis of evaluation findings.

Implications for tendering, managing evaluations, and reporting will also be discussed.

The first day is designed for evaluators and researchers, policy makers, strategic policy analysts, program designers, performance and quality improvement staff, and others interested in conceptualising 'what works in complex systems'. The second day is appropriate for intermediate to advanced evaluators and researchers.

Each day will involve presentations, practical examples, small group work and whole group discussion.

About the presenter
Dr Gill Westhorp is an internationally-recognised specialist in realist research and evaluation methodologies, with an interest in the relationship between realist and complexity theories. She is Director of a small research and evaluation consultancy company; a University Fellow at Charles Darwin University; an Associate at RMIT University; a member of the core team for the RAMESES I and RAMESES II projects based in Oxford, UK; and a member of the Advisory Committee for the Centre for the Advancement of Realist Evaluation and Synthesis (CARES) at Liverpool University, UK.

Date and Time: Monday, 23 March 2015, from 12:30pm to 2:00pm
Venue: ACT Health, Conference Room, Level 3, 1 Moore Street, Canberra
Register online by: 18 March 2015

Event Description: 

2015 has been declared the International Year of Evaluation. Responding to 2015 EvalYear's focus on strengthening evaluation capacity, our networking and capacity-building activities in the ACT will promote the AES Evaluators' Competency Framework. This Framework has been designed to support the AES's work to raise the quality and awareness of evaluation and its role in public policy, and community and organisational change projects.

Our first event to celebrate 2015 EvalYear will outline how we plan to use the Framework to enhance evaluation knowledge and expertise at a local level and will seek feedback about priorities, preferences and training needs.

We will also examine the foundational competency Evaluative Attitude and Professional Practice. This set of knowledge, skills and attitudes influences all the other competencies.

Ultimately it is the attitude, thinking and ethical standards that we apply to our work, the data we collect and the evidence we bring to bear, and our commitment to professional development that are the hallmarks of our professional practice.

A range of experienced practitioners will present their perspectives on the role of Evaluative Attitude and Professional Practice and lead a discussion on practical examples for maintaining integrity, building competence as an evaluator and engaging in reflective practice.

The seminar is targeted towards AES members and others looking to enhance their evaluation knowledge and expertise in 2015. The seminar will involve short presentations and plenty of opportunity for interaction and discussion.

Patton, M., Utilization-Focused Evaluation (p. 192)

 

Directions: Take the lift to Level 3, turn left and the Conference Room will be facing you. There is no security pass to gain access to the conference room and an AES sign will be on the door. 

 

Date and Time: Thursday 1 and Friday 2 May 2014, from 9:00 am to 5:00 pm (registration from 8:30 am)
Location: University House, Balmain Crescent, Acton, ACT
Topic: Developing Monitoring and Evaluation Frameworks
Presenter: Dr Ian Patrick
Register online by: 24 April 2014

Fees: Members $770.00 | Non-members $935 (inclusive of GST)

Monitoring and Evaluation (M&E) Frameworks are becoming increasingly important for developing an agreed approach to the assessment of results achieved and to aid organisational learning. The M&E Framework identifies expected results, key evaluation questions and the means to answer these questions through routine monitoring and periodic evaluation. It also provides a guide to the implementation of M&E processes over the life of a program or initiative. Monitoring and evaluation functions are essential to the effective operation of initiatives and programs and will contribute to the overall value derived from them. M&E Frameworks should support decision-making, allocation of resources and program refinement based on lessons learned.

This workshop will present a clear, staged conceptual model for the systematic development of an M&E Framework. It will examine the steps and techniques involved in designing and implementing the framework, and explore potential design issues and implementation barriers. It will cover the development of a Program Logic, the identification of key evaluation questions, the development of performance indicators, and the identification of processes for ongoing analysis and reflection based on the data generated.

The facilitator will encourage interactive peer to peer dialogue to share experiences and learning.

Content

  • The importance and function of monitoring and evaluation processes
  • 'Table of Contents' for the development of an M&E Framework – what to do and in what order
  • Design of a viable M&E framework
  • Application of M&E frameworks to programs
  • Key challenges and barriers, and how to address them

Outcomes and Benefits

  • Understanding of the overall structure for the development of an M&E Framework
  • Identification of the clear steps and stages involved in the process, and building knowledge and skills in their implementation
  • Use of case studies to develop key components of an M&E Framework for an initiative
  • Understanding of how best to support the consultative and participatory processes involved in designing and implementing an M&E Framework

Who should attend?
This workshop offers professionals from across government, universities, and not-for-profit and consulting organisations foundation skills in a key approach to program evaluation. You would benefit most from the workshop if you have some prior knowledge of evaluation theory, especially program theory and program logic, and some practical experience with evaluation activities.

About the facilitator: Ian Patrick
Dr. Ian Patrick is Director of Ian Patrick and Associates, a consultancy which operates in Australia and the Asia-Pacific region. Ian focuses on design of M&E Frameworks, program evaluation and training and mentoring in evaluation. He has extensive experience in development and implementation of M&E Frameworks for a range of organisations in the government, non-government and university sectors. Ian has worked as an evaluator for around 20 years and his sectoral experience crosses many areas such as governance, education, health, law and justice, and migration and Indigenous issues. Ian is a Senior Fellow in the School of Social and Political Sciences, University of Melbourne. Previously he worked at the International NGO Training Research Centre in the UK leading the evaluation practice area. Ian has previously delivered training programs on developing M&E Frameworks in Australia and New Zealand, USA, Europe and in Asia-Pacific countries.

Conditions:
For Student and Organisational member registrations, please contact the AES Professional Learning Coordinator.
