
Improving Evaluation

Register now to take advantage of early bird rates!

Program:

Day 1: Monday 20th May

Day 2: Tuesday 21st May

Day 3: Wednesday 22nd May

Participants may register for all three days of workshops or for individual workshops.

Dates and time: Monday 20 May, Tuesday 21 May, and Wednesday 22 May 2019, 9am to 5pm (registration from 8.30am) each day

Venue: Melbourne Metropole Central Hotel, 44 Brunswick Street, Fitzroy

Situated in the heart of Melbourne's cafe district, the Metropole is also adjacent to Melbourne's CBD. More details on venue: http://www.metropolecentral.com.au/

Target audience: Emerging practitioners, project managers and supervisors; a mixed audience from the private, government, NFP/NGO and community sectors.

Registrations close: Early bird: 3 May. Final registrations: 13 May

Fees (GST inclusive):

Type                            Early bird    After 3 May

AES members
  Day rate                      $440.00       $484.00
  3 days                        $1,250.00     $1,452.00

Non-members
  Day rate                      $605.00       $665.00
  3 days                        $1,725.00     $1,905.00

Full-time student day rate*     $300.00       $330.00

* Students must send proof of their full-time student status to the AES by email.

Please note: each option is a one-day workshop, so there are two workshops to choose from on each day. If you are attending all three days and would like to do a mix of workshops from the two options, select either Option 1 or Option 2 when registering and then enter the workshop names in the Special Requirements box, or email the AES with the workshops you would like to attend.


Day 1: Monday 20th May 2019

Workshop Option 1: Introduction to Evaluation facilitated by Charlie Tulloch

This workshop is aimed at those who are new or inexperienced in the evaluation field. Its purpose is to outline the key concepts, theories and approaches that are relevant to commissioning or conducting evaluation projects. The workshop will step through the set of activities that are most often involved in framing, conducting and reporting on evaluation findings. It will also introduce participants to the AES Evaluators’ Professional Learning Competency Framework and related sources for those new to evaluation to continue building their skills and knowledge in this field.

Outcomes and Benefits

This is a knowledge-building workshop that will help attendees build greater clarity about core evaluation concepts. It is suited to those who are new or inexperienced in evaluation planning or practice, or who feel they would benefit from a succinct recap of the core themes and theory underpinning the field.

About the Facilitator: Charlie Tulloch

Charlie Tulloch has worked extensively with leaders from State and Commonwealth Governments on large and small evaluation projects across a breadth of sectors. He has been an evaluation consultant for the past decade, working in small (HLB Mann Judd), medium (ACIL Allen Consulting) and large (KPMG) companies.

In 2018, Charlie founded Policy Performance Pty Ltd to lead evaluation planning and projects. Charlie has served as a tutor in Impact Evaluation at the University of Melbourne since 2015. He is also a DTF(Vic)-accredited Investment Logic Modelling facilitator, providing business case support services.

 

Workshop Option 2: Theories of Evaluation facilitated by Brad Astbury

This workshop provides an overview of the origins and evolution of evaluation theory. Attention to theory in evaluation has focused predominantly on program theory, and few evaluation practitioners have received formal training in evaluation theory itself. This workshop seeks to remedy this by introducing a framework for conceptualising different theories of evaluation and a set of criteria to support critical thinking about the practice-theory relationship in evaluation.

Outcomes and Benefits

Topics covered include, but are not limited to: the nature and role of evaluation theory; major theorists and their contributions; approaches to classifying evaluation theories; key ways evaluation theorists differ and what this means for practice; techniques for selecting and combining theories based on situational analysis; and the dangers of relying too heavily on any one theory. Case examples will be used to illustrate why evaluation theory matters and how different theoretical perspectives can inform, shape and guide the design and conduct of evaluations in different practice settings.

About the Facilitator: Brad Astbury

Brad Astbury is a Director at ARTD Consulting and works out of the Melbourne office.  He has over 18 years of experience in evaluation and applied social research and considerable expertise in combining diverse forms of evidence to improve both the quality and utility of evaluation. He has managed and conducted needs assessments, process and impact studies and theory-driven evaluations across a wide range of policy areas for industry, government, community and not-for-profit clients. Prior to joining ARTD, Brad worked for over a decade at the University of Melbourne where he taught and mentored postgraduate evaluation students.


Day 2: Tuesday 21st May 2019

Workshop Option 1: Evaluation reports: writing, editing and design facilitated by Ruth Pitt

Despite the increasing popularity of visual and creative presentation methods, writing is still a core skill for evaluators. Evaluators need to communicate complex topics to diverse audiences and produce attractive, error-free reports that meet their clients’ needs. This workshop offers practical strategies for delivering quality reports when facing tight deadlines and budgets.

This workshop develops participants’ skills in writing, editing and designing evaluation reports. The workshop focuses on how to make the most of the time remaining before the deadline, whether one hour, one day or one week, to improve a draft report. It also presents a range of software programs, Word tips and design tools that can make this work easier. The workshop includes software demonstrations, practical activities and information on useful resources for further competency development.

Outcomes and Benefits

Specifically, by the end of the workshop, participants will be able to:

  • explain the importance and features of quality evaluation reports
  • apply principles of clear writing
  • correct common writing problems such as jargon, passive voice and weak verbs
  • perform quick checks for errors using Word and specialist editing software
  • choose appropriate design techniques to support both communication and visual appeal.

The workshop is suitable for evaluators who write and review reports and other materials to communicate evaluation findings.

About the Facilitator: Ruth Pitt

Ruth Pitt started her career in a writing and editing consultancy before moving into evaluation. She has since worked in diverse evaluation roles, in government departments, not-for-profit organisations, consultancies and universities. She has consulted for a number of international organisations on presenting evaluation findings, including writing and editing evaluation reports.

 

Workshop Option 2: Program Logic and Theory of Change facilitated by Carina Calzoni

This workshop introduces the program logic / theory of change concept and lays out a step-by-step process for creating a logic / theory of change model. A program logic / theory of change focuses not just on what a project is trying to achieve and how, but also on who will be changing. The course includes discussion of how program logic / theory of change can be used for program design and how it can provide the structure for monitoring and evaluation plans.

The course will commence with an overview of program logic / theory of change and a hands-on introduction to developing a simple hypothetical logic model. Following a more detailed overview of the various approaches to program logic development and their relative strengths, participants will be introduced to a structured process for developing a logic / theory of change, using a hypothetical behaviour change project.

The course will conclude with a bridging session that outlines the process for using program logic / theory of change to develop meaningful targets, monitoring systems and well-targeted evaluation plans. The training will include a mix of expert presentation, small group work, and question and answer sessions.

Outcomes and Benefits

By the end of the course, participants will:

  • Understand the concepts of program logic/theory of change and the steps involved in creating a model
  • Understand how program logic / theory of change can be used for planning, for monitoring and evaluation, and for reporting

The course is designed for people involved in project planning, project managers and people who work in monitoring and evaluation roles.

About the Facilitator: Carina Calzoni

Carina Calzoni is passionate about program design, monitoring and evaluation. She has nearly 20 years of professional evaluation experience within government and consulting to governments and not-for-profit organisations, across a wide range of sectors and levels of complexity. She has an in-depth understanding of public policy and program design and management, and a deep appreciation for a utilisation-focused approach to evaluation in this context.

Carina has a Master's in Evaluation as well as qualifications in Public Policy and Applied Science, which give her the breadth of skills and knowledge to work adaptively across a range of specialist fields. She has been involved in a large number of complex evaluations, involving both qualitative and quantitative methods and program planning processes, across a wide range of sectors including agriculture, natural resource management, regional development, education, health, mental health and social enterprises. She is also an experienced monitoring and evaluation trainer and facilitator.


Day 3: Wednesday 22nd May 2019

Workshop Option 1: A Gentle Introduction to the Collection and Analysis of Statistical Data for Evaluation facilitated by Mark Griffin

A robust evaluation makes use of both qualitative and quantitative research methods. At the same time, many people commissioning or conducting evaluations have little training in, or understanding of, quantitative methods such as survey design and statistics; some may even feel anxious about them. This workshop is not intended to turn evaluation practitioners into hard-core data scientists. Instead, its goal is to give practitioners the tools to work productively and in close collaboration with data scientists, and to give commissioners the tools to scope projects involving statistical components, to assess the value of subsequent bids from potential statistical consultants, and to maximise the potential for statistical work to lead to true insights and business value within the commissioner's organisation. Accordingly, the workshop will not focus on the technical intricacies of the mathematical techniques discussed. It will instead work through a series of case studies where statistical methods have been applied in a sophisticated manner, introducing along the way a range of statistical methods, the types of research questions that can be asked using each method, and some basic guidelines for proper statistical practice, such as the importance of checking the statistical properties of a dataset prior to conducting the analysis.

Outcomes and Benefits

  • To give evaluation practitioners the skills to work productively and in close collaboration with data scientists, to choose statistical methods that offer true value to a commissioner, and to report statistical findings in a way that delivers genuine business value.
  • To give evaluation commissioners the skills to scope projects that include statistical components, to assess bids from statistical consultants, and to convert consultants' reports into genuine business value.

About the Facilitator: Mark Griffin

Dr Mark Griffin is the Founding Director of Insight Research Services Associated (www.insightrsa.com) and holds academic appointments at the University of Queensland and the University of Sydney. Mark serves on the Executive Committee of the Statistical Society of Australia, and is the Founder and Co-Chair of the Special Interest Group for Business Analytics within the International Institute of Business Analysis. Mark has been the primary statistician for a number of large surveys (including a survey of 140,000 parents receiving the Positive Parenting Program in Queensland), and Insight is a member of a number of government panels, including that for the Therapeutic Goods Administration within the Australian Department of Health. Since the formation of Insight, Mark has presented over 90 two-day and 15 five-day workshops in statistics around Australia, and has recently started an annual international speaking tour.

 

Workshop Option 2: Essentials of Qualitative and Quantitative Methods for Evaluators and Evaluation Users facilitated by Samantha Abbato

Have you ever felt baffled by the academic jargon of evaluation methods and skipped to the conclusions of a paper or report, hopeful that the author “knew what they were doing”? Or have you felt uncertain about the right way to interpret and assess qualitative or quantitative methods? The purpose of this workshop is to increase understanding of core methods and the ability to critique qualitative, quantitative and mixed methods for evaluation. The goal is for participants, whether evaluators or commissioners of evaluation, to apply the learnings of the workshop to increase the rigour of their methods.

Qualitative and quantitative methods are the basic building blocks of almost all evaluations. The majority of evaluations undertaken today rely on mixed methods, that is, a combination of qualitative and quantitative methods. Decisions made on the basis of evaluation conclusions and recommendations assume that the evidence base is sound. Evaluation misuse and overuse happen when decisions are made on flawed or weak evidence, without an understanding of the methods on which that evidence is based. It is critical not only for evaluators, but for all users of evaluation findings, to be able to understand the essentials and critique the basic methodological foundation of findings.

The workshop will be interactive, involving the sharing of experiences as well as hands-on exercises. Practical strategies, “how-tos” and checklists will be provided, discussed and put to the test, without the technical or research jargon.

Outcomes and Benefits

This workshop is for evaluators, evaluation commissioners and other professionals who would like to:

  • Understand the essentials of quantitative and qualitative methods, their theoretical underpinnings and the differences between them;
  • Apply practical strategies to assess qualitative and quantitative methodologies by breaking the method down into three simple steps: sampling, data collection and analysis;
  • Use checklists of key practical questions and other strategies for both types of methods to assess the credibility of evidence;
  • Identify weaknesses in a method, acknowledge its limitations and identify strategies to strengthen it;
  • Confidently apply or challenge evaluation findings on the basis of the method evidence.

About the Facilitator: Samantha Abbato

“It is rare to find someone who is able to navigate and translate across both the academic and grass-roots worlds.” - Brad, Brisbane NGO client

Samantha Abbato is an evaluation consultant and director of Visual Insights, which takes a Pictures and Stories approach to evaluation. As an independent evaluation consultant for the past 14 years, working with more than 50 NGO and government organisations in Queensland, New South Wales and the Northern Territory, she regularly applies mixed-methods (qualitative and quantitative) approaches. Her evaluation work is based on an extensive quantitative and qualitative academic background that includes a PhD in epidemiology (UC Berkeley), an MPH in biostatistics and four years of applied academic training in the qualitative methods of medical anthropology (UC Berkeley), applied to a thesis and publication in Aboriginal and Torres Strait Islander health. She is a specialist in health and community sector evaluation with extensive experience in qualitative and quantitative evaluation approaches, working with Aboriginal and Torres Strait Islander communities and a range of culturally and linguistically diverse communities, including refugees.

Sam has held university lecturing positions, including as a lecturer in epidemiology and qualitative methods at the School of Public Health, University of Queensland (1997-2000). She continues to publish peer-reviewed evaluation and research papers in collaboration with clients, using both quantitative methods (including statistical analysis) and qualitative methods (including case study). Sam received the 2015 AES Evaluation Publication Award (Caulley Tulloch Award). Drawing on her own shift from a purely academic approach to a “use focus” aimed at achieving outcome goals in partnership with organisations, she is able to share effective tools and strategies that are responsive to organisations’ needs whilst maintaining the integrity of method and evidence.


 
