Workshop title: Designing and Implementing a Monitoring and Evaluation System
Dates and times: 13th to 16th May 2019 (4 days), 9am to 5pm each day (registration from 8.30am)

  • Monday 13th May: Introduction to Designing and Implementing a Participatory Monitoring and Evaluation System
  • Tuesday 14th May: Planning for Monitoring and Evaluation Functions
  • Wednesday 15th May: Developing System Capabilities for Data Collection and Analysis
  • Thursday 16th May: Developing System Capabilities for Reflection and Reporting for Learning and Program Improvement

Participants can choose to attend the full program or part of it, depending on their experience and needs.

Location: Simpson room at Novotel Brisbane, 200 Creek Street, Brisbane 4000
Facilitators: Anne Markiewicz and Ian Patrick
Registrations close: Tuesday 7th May 2019
Fees (GST inclusive): For all 4 workshops: Members $1,650, Non-members $2,270. For day workshops: Members $440, Non-members $605 (per day)

About the 4-day Intensive Workshop

Designing and implementing a Monitoring and Evaluation System for a program or an organisation is an increasingly important task, supporting the assessment of agreed results and aiding organisational learning for program improvement. A robust and well-considered Monitoring and Evaluation System is also required to determine the scope and parameters of routine monitoring and periodic evaluation processes; to identify how monitoring and evaluation data will be collected and analysed; and to determine how data will be used to inform learning and reporting for accountability, program improvement and decision-making. The Public Governance, Performance and Accountability (PGPA) Act 2013 and the Department of Finance Resource Management Guide No. 131, ‘Developing Good Performance Information’ (April 2015), affirm the importance of planning to identify program intent and to outline how a program’s performance will be measured and assessed.

This workshop draws on the textbook ‘Developing Monitoring and Evaluation Frameworks’ (SAGE, 2016) by Anne Markiewicz and Ian Patrick. It presents a clear, staged conceptual model for the systematic development and implementation of an M&E System. The workshop comprises four separate but inter-related components, one presented each day:

  • Day One (Mon 13 May) provides an introduction to the principles of designing and implementing a participatory monitoring and evaluation system based on program theory and program logic
  • Day Two (Tues 14 May) identifies how to plan for monitoring and evaluation functions
  • Day Three (Wed 15 May) focuses on data collection and analysis processes
  • Day Four (Thurs 16 May) covers learning and reporting for program improvement and decision-making.

The four-day intensive training program is structured so that participants can enrol in all four days, providing a comprehensive guide to developing an M&E System. Alternatively, participants can enrol in any one or more of the training days, depending on their prior experience, needs and orientation. Each day is structured as a stand-alone event in terms of content.

This modular training approach should appeal to participants who have already attended a two-day ‘Developing Monitoring and Evaluation Frameworks’ workshop delivered by Anne Markiewicz or Ian Patrick and would like to extend that learning with further content on implementing monitoring and evaluation plans, focused on the processes of data collection, analysis, learning and reporting (Days 3 and 4).


Day 1: Monday 13th May 2019 - Introduction to Designing and Implementing a Participatory Monitoring and Evaluation System

About the Workshop

The scope of the workshop is foundational, introducing participants to the role and purpose of participatory monitoring and evaluation systems for programs and organisations. The first day will consider the elements that make up a participatory monitoring and evaluation system, the importance of stakeholder engagement, and the role of program theory and program logic in embedding an evaluation-led, theory-based approach to developing a monitoring and evaluation system for a program or organisation.

Workshop Content
  • The elements that make up a Monitoring and Evaluation system and the sequence of steps and stages in developing a system
  • The importance and inter-connectedness of monitoring and evaluation functions and processes
  • Stakeholder engagement and stakeholder roles in developing the M&E system
  • Developing Program Theory – its importance in identifying anticipated change and the underlying assumptions to be tested through the Monitoring and Evaluation system
  • Developing Program Logic – its importance in detailing how the theory is operationalised over time through a sequential set of inputs, outputs and outcomes
  • Evaluation capacity development in the design and implementation of an M&E system
Learning Outcomes
  • Understanding the core concepts and components of a Monitoring and Evaluation system
  • Understanding the role of key stakeholders in the process of developing a system
  • Understanding the role of Program Theory and Program Logic as foundations for monitoring and evaluation functions
  • Understanding the importance and approaches to planning and guiding the implementation of the M&E system
PL competencies

Day 1 of the workshop strongly aligns with competencies in the AES Evaluator’s Professional Learning Competency Framework:

  • Domain 2 – Evaluation theory
  • Domain 3 – Culture, stakeholders and context
  • Domain 7 – Evaluation activities

Day 2: Tuesday 14th May 2019 - Planning for Monitoring and Evaluation Functions

About the Workshop

Day 2 will focus on developing the core components of the M&E system including agreed Evaluation Questions, Key Performance Indicators and Targets; and Monitoring and Evaluation Plans that identify the range of data required from routine monitoring and periodic evaluation processes. The focus will be on developing a set of robust agreed Evaluation Questions that link with the Program Theory and Program Logic, supported by a mixed-methods approach to data collection. Various data collection options will be considered for inclusion as part of the M&E planning process.

Workshop Content
  • The importance and function of developing evaluation questions as a core component of the design of the M&E system
  • The role of evaluation domains to provide focus and structure to the areas investigated through monitoring and evaluation
  • The role of key performance indicators and targets, and how to incorporate them within monitoring and evaluation plans
  • Developing a Monitoring Plan that maps the routine monitoring data to be collected
  • Developing an Evaluation Plan that maps the periodic evaluation data to be collected
Learning Outcomes
  • Understanding the processes used in developing evaluation questions to guide the development of the M&E system
  • Understanding the role and function of key performance indicators and targets
  • Skills in developing monitoring and evaluation plans that outline the range of data to be collected
  • Skills in mapping out data collection options for both routine monitoring and periodic evaluation activities
PL competencies

Day 2 strongly aligns with competencies in the AES Evaluator’s Professional Learning Competency Framework:

  • Domain 2 – Evaluation theory
  • Domain 4 – Research methods and systematic inquiry
  • Domain 5 – Project management
  • Domain 7 – Evaluation activities 

Day 3: Wednesday 15th May 2019 - Developing System Capabilities for Data Collection and Analysis

About the Workshop

Day 3 will focus on developing capabilities for the collection, analysis and synthesis of monitoring and evaluation data. In an M&E System, such capabilities are crucial for implementing the monitoring and evaluation plans that have been developed. This day will cover a range of data collection processes and the knowledge and skills required for their effective use. A Data Collection Resource Guide will be provided to assist in reviewing options, methods, tools and techniques for data collection (both qualitative and quantitative). The day will cover approaches to effective data entry and analysis and how to develop effective databases. The importance of data synthesis will be considered, before examining the evaluation capacity building required for the effective implementation of data collection and analysis systems.

Workshop Content
  • The importance and function of developing effective data collection and analysis processes as part of the M&E system
  • Designing and implementing data collection processes – qualitative and quantitative (e.g. one-time and tracking surveys, semi-structured interviews, focus groups, case studies, narrative stories, workshops and direct observation)
  • The role of databases in supporting effective data entry and analysis
  • Evaluation capacity building requirements for data collection and analysis
  • Data integration and data synthesis
Learning Outcomes
  • Understanding the elements required for the development and implementation of effective data collection, data entry and data analysis processes
  • Understanding data collection options: qualitative, quantitative, visual and others
  • Understanding requirements to support effective data entry and analysis
  • Understanding capacity building requirements for personnel collecting, entering and analysing data
  • Understanding the need for data synthesis processes that support data aggregation
PL competencies

Day 3 strongly aligns with competencies in the AES Evaluator’s Professional Learning Competency Framework:

  • Domain 4 – Research methods and systematic inquiry
  • Domain 5 – Project management
  • Domain 7 – Evaluation activities

Day 4: Thursday 16th May 2019 - Developing System Capabilities for Reflection and Reporting for Learning and Program Improvement

About the Workshop

The final day will consider how to translate data collected through monitoring and evaluation into useful and useable evaluative processes and products. It will cover the formulation of defensible and robust evaluative judgements, extending to reaching conclusions about program performance and identifying lessons and recommendations for program improvement and decision-making. Consideration will be given to different types of reporting for different audiences: formative, summative, topic-specific, case studies, newsletters, bulletins and visual presentations. Ethical issues that arise in formulating conclusions, lessons and recommendations, including their objectivity and independence, will also be considered. Finally, the day will examine the importance of fostering a learning culture as part of a well-functioning M&E System.

Workshop Content
  • Developing useful and useable processes and products for program improvement and decision-making
  • Evaluative judgements, conclusions, lessons and recommendations about program performance, quality and value
  • Learning functions as part of the M&E system
  • Reporting functions as part of the M&E system
  • Ethical issues and dilemmas in undertaking reflection and reporting.
Learning Outcomes
  • Understanding how to embed a culture of learning and reflection as part of the M&E system
  • Understanding the role and function of forming evaluative judgements and conclusions about program performance, quality and value
  • Understanding the role and function of formulating useful and useable lessons and recommendations for program improvement and decision-making
  • Appreciation of some of the ethical issues and dilemmas encountered in M&E reflection and reporting.
PL competencies

Day 4 strongly aligns with competencies in the AES Evaluator’s Professional Learning Competency Framework:

  • Domain 1 – Evaluative attitude and professional practice
  • Domain 4 – Research methods and systematic inquiry
  • Domain 6 – Interpersonal skills
  • Domain 7 – Evaluation activities

Who should attend?

This workshop offers professionals from across government, universities, the private sector, and not-for-profit and consulting organisations foundation skills in designing and implementing a monitoring and evaluation system for a program or organisation. You will benefit most from the workshop if you have some prior knowledge of evaluation, particularly program theory and program logic, and some practical experience with evaluation activities.

About the facilitator: Anne Markiewicz

Anne Markiewicz is the Director of Anne Markiewicz and Associates, a consultancy specialising in training, mentoring and capacity building in monitoring and evaluation. Anne is the co-author of the textbook ‘Developing Monitoring and Evaluation Frameworks’ (SAGE, 2016). She has extensive experience in designing and implementing monitoring and evaluation systems for a wide range of initiatives and in building the capacity of organisations to plan for monitoring and evaluation. Anne has been an evaluator for over 20 years, has received a number of awards for excellence in evaluation from the Australasian Evaluation Society, and is a Fellow of the Society. She has trained extensively in evaluation across Australasia, the Pacific, the United Kingdom and the USA.

About the facilitator: Ian Patrick

Dr Ian Patrick is an independent consultant and Director of Ian Patrick and Associates. His career as an evaluator spans around 20 years, with a focus on both Australia and the Asia-Pacific region. He has broad experience across social sectors including health, education, law and justice, community development, human rights and Indigenous issues. Ian has worked with a range of organisations and programs to develop monitoring and evaluation systems, and has conducted evaluation-related training programs including Introduction to Evaluation, Participatory Approaches in Evaluation, and Developing Monitoring and Evaluation Frameworks. He was awarded the AES Best Evaluation Policy and System Award in 2012 for the Monitoring and Evaluation Framework for the Mongolia Australia Scholarship Program. Ian is an Honorary Senior Fellow in the Development Studies Program at the University of Melbourne and previously led the evaluation practice area at the International NGO Training and Research Centre, UK.

 



