
This is the AES Blog, where we regularly post articles by the Australasian evaluation community on the subjects that matter to us. If you have an idea, please contact us by email. Blog guidelines can be found here.



Fellows

September 2019
by Anthea Rutter

All of us in the AES were greatly shocked and saddened by the sudden death of Jenny Neale. Jenny had been a member of the Australasian Evaluation Society for over 20 years and was an active contributor to the society, both in her local Regional Network Committee in Wellington and at the AES International Conferences.

Jenny was a Senior Research Fellow at the Health Services Research Centre, Faculty of Health, University of Wellington, New Zealand.

I interviewed Jenny last year and was rewarded by a frank discussion of life in the field of evaluation, its ups and downs and its frustrations!

The first question I asked her to reflect on was what had brought her into the field of evaluation.

I have swapped between being a researcher and then working in the evaluation field. I first got into evaluation through the Wellington Evaluation Group. I think it would be correct to say that in those days they were a fairly loose list of people. At that stage I was teaching an applied research degree, and I guess that would have been the 90’s.

The Fellows have a diverse range of evaluation or research interests, which keep them involved in the profession. I was, therefore, interested to find out what Jenny's main areas of interest had been.

My role for the last 8 years has been that of evaluator in the health services research unit at the University. We evaluate a range of health services initiatives. My main interest has been the social justice area. So, I think I am in the right space, as working in the health services field fits well into that.

For most of us, a career spanning a number of years brings a few challenges – some more than others. So I asked Jenny what the major challenges to her practice had been.

I think it is something that we are facing again at the moment – people’s understanding of evaluation, what it is and what it does, as well as what it is not. Sometimes people ask you for an evaluation but at the end of the project they decide they want something different.  So, it’s that whole issue around what evaluation means. Then there is an idea out there that you can evaluate impact immediately.

Another challenge is that people underestimate the amount of time it takes to actually do an evaluation as people want things yesterday.

The other sort of challenge for us is that the main evaluators in NZ are contractors or work for government departments. So, people move in and out of evaluation and research. If you are a more mature person, you sort of drifted into evaluation (as it was a new field in the 80s). So new evaluators, who possibly have been trained in evaluation, talk about new methods and some of them are things we knew years ago. I try very hard to not say “In my day”!

Apart from challenges to practice, careers also have a number of highlights. It was pleasing to note that highlights for Jenny involved the bringing together of evaluators from Australia and New Zealand.

One of my early highlights was teaching with John (Owen) on a course in Wellington. The second time he brought his course over to NZ he also brought Ros Hurworth with him. Then we had some feedback which suggested that it would be good to have local content and, subsequently, I provided that. And John ran a course with my master's students. Another highlight was joining the AES and then getting to know a lot of people in the field. Ralph Straton was over here as a visiting scholar at the time of 9/11. He was due to head to the US. But the insurance companies would not cover US travel. So, Ralph stayed in Wellington and we all learnt a lot. It was a very good interchange of ideas and skills.

Evaluation practice is defined by many factors, including people, evaluation models and cultures. For Jenny, her practice was influenced a lot by Māori and Pacifica.

I think a lot comes from where I sit with Māori and Pacifica; there is also a strong social justice factor, particularly because we are bi-cultural (the Treaty) but becoming multi-cultural. So, treaties of understanding and friendship with other Pacific nations. We have to be clear that it's important for them as well as Māori. We don't want evaluations to just be a 'tick box', which leaves people worse off than before.

I am not a theorist; my research was always applied. There have been other influences which have shaped my practice: listening to people talk at AES conferences. Michael Quinn Patton was also influential, John Owen’s teaching and book was very influential and useful. Also, the Community of Practice within AES was very important.

I wanted to find out from Jenny in what ways the field of evaluation has changed during her career.

Things in evaluation become fashionable and then go out of fashion. But I think that people are realising that it is important. I think it has changed. We now have a number of different theories and practices, for example, realist evaluation and developmental evaluation. The profession has changed from being a broad field where it had practitioners who had a research background to people looking to apply what they know.

With these changes, what skills and competencies are required to keep pace with emerging trends in evaluation practice? Jenny's response was spot on!

I think it’s a bit like life skills, you need to be open to different ideas. Listen to people and ideas. Then craft it in the field you are working in. I remember a short debate a while ago about RCTs as being the gold standard, other countries still talk about it as the only method. In social services it does not always work. You need theoreticians and practitioners and the people in between. So, you need both ends of the spectrum. Most of us are in the middle. Both sides ensure that it is a lively debate. You need to be a good listener and adapt key ideas. Problems are the same in several countries, but the context differs. Making sure there is open debate.

Jenny was very definite when asked about the key issues that evaluators ought to be thinking about and seeking to resolve in the next decade.

The main one which AES is tackling is the professionalism aspect, and quite a bit of work has been done in this area already, and in particular what we need to make it a recognised profession. In government jobs both in NZ and Australia people move between policy, research or evaluation. 

The other thing which we must do is to undertake an educative job in explaining what evaluation and monitoring means.

Finally, I asked Jenny to comment on what she wished she had known before she set out on the evaluation path.

Certainly, wishing I had known more… But my comment is really a wish – wishing other people understood what evaluation could do and what it was! I guess I assumed that if someone wanted an evaluation, they knew what they would get. Certainly, I think that moving between research and evaluation was advantageous in terms of methodologies and seeing what others were doing.

 

This piece is a tribute to a Fellow of the Australasian Evaluation Society who made her mark on the work of the Society as well as on the profession of evaluation. I personally regarded her as a friend, and she will be missed.


 


September 2019
by Jade Maloney

Ever found yourself more engaged in the coffee break than the conference agenda?

Ahead of the International Evaluation Conference #aes19SYD unconference day, Ruth McCausland, Kath Vaughan-Davies and I trialled an approach for the Australasian Evaluation Society NSW meet-up that combined the best of both worlds – purposeful encounters with a coffee break vibe.

We adapted Open Space Technology, established by Harrison Owen in the 1980s, with the aim of finding “a way towards meetings that have the energy of a good coffee break combined with the substance of a carefully prepared agenda.” The approach has since been used around the world as a way of enabling people to self-organise around purpose.

At the NSW meet-up, about 30 evaluators braved the wind and cold to talk about evaluation topics that keep them up at night. For those of you in evaluation, it will be no surprise that these were many and varied:

  • managing your involvement in participatory action research
  • communicating findings effectively, particularly the negative
  • scoping evaluations effectively to meet and manage expectations
  • identifying value and dealing with attribution in an education context
  • planning for the data required for statistical analyses and the ethics of analysis
  • crafting useful and useable recommendations.

And that was before we got to our back-pocket topics.

Working in what we dubbed the “East Wing”, the “West Wing” and “next door”, groups took their discussions in different directions.

The group discussing evaluation in an education context shared references: the four levels in Kirkpatrick’s Evaluating Training Programs (reaction, learning, behaviour and results), Guskey’s additional fifth level (although organisational support isn’t a level in the same way), as well as Michie, van Stralen and West’s COM-B system (thinking about behaviour change in terms of capability, opportunity and motivation).

The participatory action research group was prompted by a question from one evaluator about whether he'd become too involved. They segued into how an evaluator’s participation can shape what is being evaluated and questioned whether this matters. The many lines between questions and the “really??” underneath the word "objective" in their record capture the connecting threads of their conversation, but you had to be there for the depth.

Instead of a traditional report back, we came together as we began – in a circle. The energy was palpably different, shifting from hesitant suggestions to each person sharing something they’d take forward and participants building on each other’s thoughts.

Some focused on practical tips, such as taking the time to clearly scope evaluations upfront and having findings meetings before delivering reports; some on tools (like the COM-B system); others on the process. One participant described it as bringing to life a community of practice in the AES. A few said the problem they’d started with might still keep them up at night, but they felt less alone in it. While we all came from diverse backgrounds, we found common ground among our experiences in NGO, government and private sector evaluations.

Not having set questions to answer gave people the freedom to discuss what they wanted and to go deep on a subject, and the process enabled all to have a voice.

Want to experience the process for yourself? Come along to the #aes19SYD unconference on Tuesday, September 17, to discuss how we might un-box evaluation to better contribute to reconciliation, social justice and a healthy planet. You don’t have to have the answers – just a question and the passion to hold a discussion with others on the subject.

If you’d like to learn more about Open Space Technology, there is a wealth of resources online. Chris Corrigan’s website has an easy-to-navigate collation. Or you could go back to the source: Harrison Owen’s Open Space Technology.

-------------------------- 

Jade is a Partner & Managing Director at ARTD Consultants.


 

September 2019
by Aneta Cram, Francesca Demetriou and Eunice Sotelo

We’ve heard it time and again: people don’t necessarily set out to be evaluators, but fall into the field. For those of us relatively new or emerging, this can be confusing to navigate.

As three self-identified early career evaluators (ECEs), who also grapple with what it means to be ‘early career’ or ‘emerging’, we were interested to learn more about how ECEs orient themselves, set their career pathway, and build their evaluation capacity.

For the past eight months we’ve been working on a research project exploring the experiences that current self-identified ECEs have had entering the field and developing their careers across the diverse range of entry pathways and work contexts in Australia and, in part, New Zealand.

Our project

We chose to take an exploratory approach to this research for a number of reasons. Firstly, we wanted to hear people’s lived experiences and be able to share them without the confines of a set analytical framework. Secondly, we didn’t know what would emerge or what we would find. From our own experiences as ECEs working in different sectors in Australia and abroad, we knew what interested us about entering the field, but – because of the variety of individuals and experiences – we didn’t want to make any assumptions about who our research participants might be or what their experiences have been.

Our overarching research questions were: What are early career evaluators’ experiences in entering and developing careers in the evaluation profession? What facilitating factors, opportunities and challenges emerge as important to early career evaluators in entering and developing a career in the evaluation profession?

We decided to contact ECEs through evaluation associations and our own professional networks, and asked them to support our work by sharing project information with their networks. From this, we received responses from 49 self-identified ECEs.

Even though we would have liked to interview them all, as this is a voluntary project we only had the capacity to interview 14. The 14 were ECEs from five different states in Australia and New Zealand. We wanted to include a diverse range of individuals, so we chose our participants based on age range, geographical location, sector and cultural identity.

Findings

We are excited to share with you some of our emerging findings and see if they align or differ from your own experiences entering the evaluation field.

From the preliminary analysis, some of the stand-out findings are:

  • There is ambiguity around what it means to be in the ‘early career’ or ‘emerging’ stages of evaluation work.
  • Peer support and mentorship, access to training and resources, and evaluation associations all play important roles in supporting early career evaluators.
  • Early career evaluators experience different and unique enablers and challenges across the variety of workplace contexts.
  • Individuals have faced challenges around age discrimination, cultural representation in the field, and how identity plays out in the way that individuals approach evaluation practice.
  • Early career evaluators bring unique and diverse values, experiences and lenses to evaluation from their prior professional experience, life experiences and identities.

You can read the emerging findings report here. We will be presenting these early findings and conducting a participatory sensemaking session on Wednesday, 18 September, at the Australasian Evaluation Society’s conference, and will incorporate feedback from the session into the final report. Come along and help us make sense of the Australian and New Zealand ECE experience/s.

 -------------------------- 

The research team includes:

Francesca Demetriou

 

Francesca Demetriou (Project Lead) works as a freelance evaluation consultant. She also volunteers her Monitoring and Evaluation skills to the Asylum Seeker Resource Centre’s Professional Mentoring Program.

 

  

Aneta Katarina Raiha Cram

Aneta Katarina Raiha Cram is a Māori and Pākehā (caucasian) evaluator from Aotearoa New Zealand. Her primary evaluation experience has been grounded in Kaupapa Māori – a culturally responsive indigenous way of approaching evaluation and research that prioritises Māori knowledge and ways of being – working with Māori communities in New Zealand. Aneta identifies as an ECE. She is currently working as a sole contractor while living in the United States, and will return to New Zealand early next year to begin her next journey as a doctoral candidate.


Eunice Sotelo

Eunice Sotelo is an educator and evaluator, with a particular interest in evaluation capacity building. Before moving to Australia over three years ago, she worked as a high school teacher, copywriter and copy editor. Her experience as a migrant – moving to Canada from the Philippines as a teenager, and working in China for two years – has shaped the way she sees her role in the evaluation space. She recently volunteered as a mentor and trainer for the Lived Experience Evaluator Project (LEEP) at the Asylum Seeker Resource Centre.

 


 


September 2019
by Gerard Atkinson

There are less than two weeks to go until the International Evaluation Conference #aes19SYD, taking place on 15–19 September here in Sydney. For those presenting at the conference, it’s time to polish up your presentation skills and get your materials ready. In keeping with the theme of “unboxing evaluation”, we’ve unboxed the art of developing effective and engaging presentations and put together an easy guide you can use not just at conferences but for any presentation.

Here are our top tips:

Prepare your thinking.

Preparing for a presentation is entirely different to rehearsal and takes place before you even start making your slides. Effective preparation is about identifying what you want to talk about, doing your research, and building a framework for delivering your presentation. Rehearsal, though important, comes much later.

Create an objective statement.

An objective statement is a single sentence that frames your rationale and scope. A good objective statement clearly articulates the given time period, what the presentation will achieve, and what you want your audience to do as a result. For example, when I teach my one-hour presentations seminar, my objective statement is: “Over the next 50 minutes, I want to cover the key elements of creating and delivering a compelling presentation to inspire you to go make your own.” It’s not setting out to change the world, but it sets out the scope of what to create.

Do your research.

This goes beyond just topic research (which is crucial, of course), and includes understanding such things as:

  • the level of knowledge of the audience
  • the number of people
  • the level of seniority
  • the venue size and layout
  • available technology
  • the time of day.

Develop a presentation framework.

Start building the structure of your presentation as a list or (my favourite) a storyboard. There are many different frameworks and formats out there, and you’ll see quite a few at AES 2019, including the rapid-fire Ignite presentations. My personal favourite framework adapts traditional storytelling techniques by following a format of “Open-Body-Close”. It’s a simple framework but can be adapted to presentations of nearly all formats and lengths. Here’s how it works:

  • The opening section is designed to engage an audience and preview the talk.
  • The body section, which can be repeated for each key point of your presentation, states the point, supports it, then links it logically with the next key point.
  • The closing section reinforces engagement, reviews the topics covered, and gives a call to action to the audience.

By using this framework you can tell many different kinds of stories, for example chronologically or starting broadly and delving deeper into a topic as you go along. You can adapt it to fit the narrative you want to tell.

Kill the deck (if you can).

This is always a controversial tip, but there’s a good reason for it. Slides distract the audience. If you can remove a slide from a presentation, do it. If you need to use slides, remember that they should always be used to underline the point you are trying to make. Photos and (well-designed) charts do this best, followed by diagrams. If you need to use bullet points or text, keep it short and avoid reading them out verbatim.

Use speaker notes.

Scripts can be useful in laying out in exact terms what you want to say in a presentation, but they make it hard to be engaging. Actors train for years to be able to take a script and make it look natural. Instead use speaker notes, which are a shorthand version of a script that outline in abbreviated form the content of each key point. They act as prompts for what you want to say, but allow you to deliver a more natural style of speaking.

Develop useful handouts.

Your slides will not convey the full content of your talk on their own (see above). This means that they shouldn’t be used as handouts. Instead, a handout should be a practical resource that turns the key points of your talk into tools that the audience can use afterwards. Most importantly, distribute handouts after the talk to avoid having distractions during the presentation. 

Rehearse, rehearse, rehearse.

Rehearsal is about replicating your presentation environment as closely as possible. Find a room, set it up as you will on the day, and rehearse the talk as if it were the real thing. It’ll help you get a feel for your timing and flow, and boost your confidence. If you can get some sympathetic co-workers to sit in and give feedback, even better. Repeat this process. The more times you can run through the presentation ahead of time, the more comfortable you will be with the material.

Present with credibility.

Credibility is a combination of confidence, character, and charisma. Confidence comes from research and rehearsal. Character and charisma come from the way you deliver your presentation. Some quick ways to build credibility are to use open body language to engage with the audience, and to vary the way you use your voice (tone, volume, tempo). Both go a long way in engaging the audience and carrying them along with you throughout your presentation.

Handle Q&As at the end.

Question and answer sessions are seen by some people as the trickiest part of a presentation because they can be hard to predict. Prior research can help you anticipate and prepare for some of the questions you might be asked. It’s best to keep questions until the end of the presentation, as this helps keep things on track. Let the audience know at the start of the presentation so that they can note down their questions for later. To handle Q&As, here’s a five-step process:

  • Ask: Take a step forward while asking the audience if they have any questions.
  • Select: Select questioners by gesturing to them with an open palm (rather than pointing), or by using their name if you know it.
  • Listen: Give questioners total concentration, eye contact, and actively listen to their question.
  • Repeat: Pause, then repeat or rephrase the question to the whole group to show you understand what they’re asking. This also helps when there’s no roving microphone.
  • Answer: Make eye contact with members of the audience while answering.

An alternative (and compatible) approach to managing Q&As effectively comes from Eve Tuck.

  • Ask a neutral person to facilitate the Q&A.
  • At the end of the presentation, invite the audience to talk to each other for a few minutes and share the questions they are thinking of asking.
  • Have the facilitator encourage the audience to consider whether those questions are useful to the broader discussion and best asked during the session, or in another context (e.g. the coffee break).

See Eve’s Twitter feed for the full list of suggestions for Q&As.

I hope these tips help you prepare, construct and deliver your own presentations with confidence. I’m looking forward to seeing a lot of great presentations at #aes19SYD.

-------------------------- 

Gerard is a Manager at ARTD Consultants.