DELEGATES SHOULD WEAR A FACE MASK whenever possible | PRESENTERS & CHAIRS are to wear face masks whilst in their presentation room, except when presenting 
Tuesday, August 30
 

9:00am ACST

Opening plenary: Welcome to Country, followed by Nan Wehipeihana "Evaluation in pursuit of Indigenous health equity. Weaving courage, evidence and evaluative insights in a funder-commissioned evaluation"
Welcome to Country: Frank Wangutya Wanganeen
Opening address: Kiri Parata, President, Australian Evaluation Society

Keynote address: "Evaluation in pursuit of Indigenous health equity. Weaving courage, evidence and evaluative insights in a funder-commissioned evaluation."

Nan Wehipeihana (Director, Weaving Insights) 

Health equity has been defined as the principle underlying a commitment to reduce and eliminate disparities in health and its determinants. An equity perspective acknowledges these disparities as not only avoidable but unfair and unjust. It recognises that different people with different levels of advantage require different approaches and resources to get equitable health outcomes (Bloomfield, 2019).

Equity is usually defined as the absence of inequity. This reinforces a deficit mindset of Indigenous peoples, and subconsciously blames them for the failure of the system. We typically see Indigenous people labelled as ‘hard-to-reach’ as opposed to the lack of responsiveness of services to reach out and into Indigenous communities.

Evaluation in pursuit of Indigenous health equity challenges evaluators to be ‘stewards of the public good’ (Greene, 2002). Greene argues that “strong, worthy, powerful evaluation is conducted by active engagement with democratic politics, especially the politics of difference”. Evaluation, by its very nature, is not neutral or value free. It reflects the political, organisational and institutional context of the funder and the interpersonal relationships of power, authority and voice in that context. This is the dominant discourse that holds sway and determines what counts as ‘good’ evaluation and as appropriate and valid analysis and evaluative conclusions.

What does it take to pursue Indigenous health equity within a funder-commissioned evaluation? Some lessons learned, tips and strategies about weaving courage, evidence and evaluative insights in a highly political, COVID-19 dominated environment.

Greene, J. (2002). Towards Evaluation as a ‘Public Craft’ and Evaluators as Stewards of the Public Good or On Listening Well. Keynote address presented at the 2002 Australasian Evaluation Society International Conference, October/November 2002, Wollongong, Australia.
Bloomfield, A. (2019). Achieving Equity. Wellington: Ministry of Health. https://www.health.govt.nz/about-ministry/what-we-do/work-programme-2019-20/achieving-equity
Wehipeihana, N., Sebire, K., Spee, K. & Oakden, J. (2022). In Pursuit of Māori Health Equity: Evaluation of the Māori Influenza and Measles Vaccination Programme. Wellington: Ministry of Health (publication pending).



Chair
John Pilla

Convenor AES22; Director, Uneek Consulting; Director, Guidelines Economists Network International (GENI)

Speakers
Kiri Parata

President, Australian Evaluation Society
Kia ora I'm Kiri, living on Kabi Kabi Country on the Sunshine Coast of Queensland. I share my time between here and my home in Aotearoa New Zealand where my iwi (tribal) affiliations are Te Ātiawa, Ngāti Toarangatira, Ngāti Raukawa, Ngāti Ruanui, and Ngāi Tahu. I'm completing...
Nan Wehipeihana

Director, Weaving Insights
Nan is the director of Weaving Insights (www.weavinginsights.co.nz) and a member of the Kinnect Group (www.kinnect.co.nz). Nan is a founding member of the Aotearoa New Zealand Evaluation Association (ANZEA) and Ma Te Rae, Māori Evaluation Association – the first Indigenous Eva...
Uncle Frank Wangutya Wanganeen

Frank Wangutya Wanganeen is a Kaurna / Nurungga man raised on Point Pearce Mission on the Yorke Peninsula of South Australia. He is passionate about Reconciliation and has been on the Adelaide City Council Reconciliation Committee, Campbelltown Reconciliation Committee, and Salisbury...


Tuesday August 30, 2022 9:00am - 10:30am ACST
Hall D (plenary)
  Plenary

11:00am ACST

Identifying unintended consequences through inclusive evaluative design
Niketa Kulkarni (Clear Horizon)

While Do No Harm principles and safeguard measures are increasingly common in program design, evaluative activities often miss signs of unintended negative impact. Such gaps are particularly concerning when the negative outcomes pertain to women, persons with disabilities, or other vulnerable populations. Drawing on examples from the presenter's fifteen years of international research experience, this paper will present a rationale and methodology for inclusive evaluation design that can help mitigate the risk of missing unintended consequences of an intervention.

Evaluative activities designed to test the validity of impact pathways often prioritise measuring progress towards intended positive changes. Evaluations less frequently investigate links between program initiatives and unintended consequences, often because they are unknown or considered outside scope at the time of evaluation design. Nevertheless, there remain possibilities of unintended harmful consequences resulting from similar or concurrent pathways. For example, women's economic empowerment initiatives may strengthen income streams. However, disrupted dynamics within the home may also increase risks of gender-based violence, especially amongst communities where gender roles are rigidly defined.

Evaluation designs which focus on articulated pathways to positive impact and do not involve inputs from beneficiary communities may miss identifying potential negative consequences. Inclusive evaluation design strategies can reduce confirmation bias and increase the visibility of potential harmful consequences. For example, insights could be drawn from in-depth qualitative narration of personal experiences loosely prompted by open ended questions about the significance of an intervention. Importantly, purposive sampling methods should ensure participation of women, persons with disabilities, and other marginalised communities to capture results particular to their circumstance. Discussions with community organisations, support institutions, or advocacy groups may also shed light on alternative outcomes. Evaluations can then validate the strength and frequency of these outcomes, and present these alongside the findings on the intended outcomes for a more balanced report.

Chair
Nataly Bovopoulos

Manager, Grosvenor Performance Group
Nataly is a Manager at Grosvenor Performance Group specialising in evaluation services. She has over 15 years' experience in the mental health and community sectors as a researcher, people manager and senior leader. She has expertise in wellbeing in the workplace, with her PhD research...

Speakers
Niketa Kulkarni

Principal Consultant, Clear Horizon
Niketa is an experienced gender and M&E specialist with over 15 years working in international development. Niketa is trained in both qualitative and quantitative research methodologies and is particularly interested in designing gender and socially inclusive research and frameworks...
Nikki Bartlett

Senior Consultant, Clear Horizon
Nikki has over ten years' experience supporting the design and implementation of programs, MEL frameworks and evaluations across the Asia-Pacific. Nikki’s particular experience focuses on culturally responsive and inclusive design and methodologies to amplify the voices of stakeholders...


Tuesday August 30, 2022 11:00am - 11:30am ACST
Riverbank Room 1

11:00am ACST

Committed to mentoring
Julie Elliott, Jill Thomas (JA Thomas & Associates), Francesca Demetriou (Francesca Demetriou), David Turner (David Turner), David Roberts (Roberts Brown)

Navigating professional life as an evaluator can be daunting. This session reports on the AES Group Mentoring Program, which supports members to successfully steer their professional development, growth and competencies as evaluation practitioners. We discuss key features of the online, group mentoring program theory and design, including its objectives, expectations and evaluation. The innovative nature of the AES Group Mentoring Program and the success of the pilot in 2021 have important implications for how we support evaluators to refine their knowledge and practice in an increasingly digitised world. The emerging mentoring program theory can also make an important and transferable contribution to mentoring programs in other disciplines.

Mentoring programs don't just happen. Our online, group mentoring interweaves thoughtful planning and the sustained commitment of a diverse group of people with the insights of seasoned evaluators and associate mentors to guide small groups of emerging and mid-term evaluators with similar interests as they work together towards their professional development goals. Unlike a traditional one-to-one mentoring program, group mentoring also provides mentees with access to the diverse experiences and perspectives of their peers. These peer support networks enable group mentees to grow professionally together. We expect to see sustainable benefits beyond the program. Our evaluation ensures that we build a solid understanding of mentors' and mentees' expectations and their experience with the program.

This session will be of interest to everyone who has experience of the program, as well as those contemplating joining as mentors or mentees in the future and those involved in mentoring more broadly.

Chair
Squirrel Main

Evaluation Manager, Paul Ramsay Foundation
An evaluator and a data analyst with a passion for measuring and improving outcomes across education, health and community wellbeing sectors, I was The Ian Potter Foundation’s Research and Evaluation Manager from 2015 until 2022. Over the past two decades, I've conducted evaluations...

Speakers
Julie Elliott

Evaluator
Collaborator and evaluation scholar-practitioner committed to acknowledging the complexity inherent in all human settings.


Tuesday August 30, 2022 11:00am - 11:30am ACST
Riverbank Room 2

11:00am ACST

Evaluation in 12 Words and Pictures - A Game for all Players 9 to 99
Samantha Abbato (Visual Insights People)

Evaluation has a vital role in evidencing the difference that programs and other initiatives are making and in informing program improvements. However, newcomers to evaluation can find it intimidating, confusing, and hard to access because of its perceived complexity and jargon. A substantial block to starting the pathway to evaluation is understanding the language and basic concepts and weaving these into good program management. The presenter uses a set of 12 visual flashcards and game props to explain the basics of program evaluation and how it connects to practice improvement, with minimal jargon and maximum fun.

This skill-building session is designed with those new to evaluation in mind.

The learning objectives include:
  1. Understand the essential components of program evaluation
  2. Articulate the role of program evaluation in program learning and improvement
  3. Know how evaluation concepts and tools can be used for program management and strategic decision-making.
Using a 'pictures and stories' flashcard approach, participants will be guided through the 12 evaluation words and related concepts. Simple analogies related to everyday activities, such as food and craft, will be woven into the session. The 12 pictorial flashcards will be presented separately and then knitted together to provide a straightforward evaluation landscape for participants to take-away, adapt, and build. Interactive activities will be combined with knowledge sharing to consolidate participant learning.

Chair
John Stoney

John has been an internal evaluation practitioner within the Australian Government for nearly 10 years (currently with the Dept. of Social Services as a member of its central Evaluation Unit). He describes this as his 'full-time job', as he has also been (effectively part-time) at varying...

Speakers
Samantha Abbato

Director, Visual Insights People
Dr. Samantha Abbato is an evaluation consultant and Director of Visual Insights. Sam has a passion for maximising evaluation use through effective communication and evaluation skill building using a pictures and stories approach and increasing the academic rigour of evidence. Sam’s...


Tuesday August 30, 2022 11:00am - 12:00pm ACST
Room E3

11:00am ACST

Measuring Local Change - Stronger Communities for Children program
Sharon Forrester (Ninti One Limited), Anthony Rologas (Ninti One Limited), Steve Fisher (Community Works), Linda Ivatts (National Indigenous Australians Agency)

Established in 2013, Stronger Communities for Children (SCfC) is a program that supports children in ten remote Aboriginal communities in the Northern Territory to achieve the best start in life. Currently, up to 125 community activities are supported by the program, each decided by a committee of local people in each site. These activities range across diverse priorities including, for example, support to early childhood development, nutrition programs, junior bush rangers and community events. The SCfC Program closely aligns with Closing the Gap Priority Reform 1 - Formal Partnerships and Shared Decision Making, and Priority Reform 2 - Building the Community-Controlled Sector.

A challenge for the SCfC program has been the development of a meaningful, relevant monitoring and evaluation strategy. This strategy needs to collect information that is rigorous in measuring the impact of the work but also adaptable to the diversity of program activities. It must be suitable for application locally by non-specialists and also tell the story of the program and its participants, reflecting its evolution over time. The purpose of the proposed session is to showcase the way the program has worked in close partnership with local communities since 2017 to develop and apply a methodology called Measuring Local Change. This approach has enabled a local assessment of the activities against their intended results to provide data to decision-making groups, often called local community boards, about the impact of decisions they have made on the allocation of resources from the program.

In the session, Community leaders, participating local Indigenous organisations and National Indigenous Australians Agency staff, all of whom have worked on Stronger Communities for Children, will provide a description of the components of Measuring Local Change. As well as the overall approach, we will also describe the Wheel of Measures and examples of its application in specific communities. To bring together the data from all ten communities, Ninti One developed a report called the SCfC Storybook, which will also feature in the session.

Using participatory methods, the team will canvass the views of session participants on the challenges of applying this strategy. They will identify key topics that emerge and facilitate further discussion. All session participants will have the opportunity to contribute. Insights will come from their reactions and responses. A summary of the session will be prepared and distributed after the event.

Chair
Speakers
Hyemi Jacka

Stronger Communities for Children Program Project Coordinator, Ninti One Limited
Hyemi has over 15 years' experience working with youth and children in roles as program manager, coordinator, and developer across diverse cultures in Australia, Asia, North America and sub-Saharan Africa. Over the last two years, she has worked on Stronger Communities for Children...
Allison Stewart

NT Strategic Partnerships, National Indigenous Australians Agency
Allison is Policy Manager for the Stronger Communities for Children (SCfC) program within the National Indigenous Australians Agency. Allison is committed to social justice and collective action for sustained beneficial change and has a broad range of leadership experience in community...


Tuesday August 30, 2022 11:00am - 12:00pm ACST
Room E2

11:00am ACST

Overcoming culturally biased methods: A step-by-step approach to applying culturally responsive evaluation to evaluation design
Kathryn Dinh (Lotus Evaluation), Heather Worth (UNSW), Bridget Haire (UNSW), Khuat Thu Hong (Institute for Social Development Studies)

Culturally responsive evaluation (CRE) contests the assumed universalism of Western-derived evaluation methods and stresses the importance of adapting evaluation methods to be context-sensitive. It represents an opportunity for evaluators to respect the values and perspectives of the communities participating in an evaluation and for mutual empowerment in addressing inequities. While there is a strong body of literature in evaluation methods, frameworks and approaches that reflect Indigenous knowledge, ways of knowing and experience, and conceptual frameworks for identifying the cultural worldviews of participating communities, a challenge for evaluators is how to bring these together and apply them in an evaluation. Given that, in today's globalized societies, there are many political, religious, cultural, social and other influences that determine how people behave, a further challenge for evaluators is how to practise culturally responsive evaluation so that the multiple ways of knowing and being that underpin these influences can be reflected in evaluation design.

This paper draws on previous research to outline a practical, seven-step approach to modifying evaluation methods to reflect alternative world views. It is also informed by the literature, in particular culturally responsive Indigenous evaluation. For each step, the paper provides guidance on implementation and discusses some of the assumptions and issues that may arise.

This is an emerging approach with potential for widespread application. Should it work as intended, this approach will provide evaluators with a practical process for designing bespoke methods that reflect the values and perspectives of a community participating in an evaluation. In applying culturally responsive evaluation in this way, this approach could also assist evaluators to contribute towards the wider goals of redressing colonial and institutional inequities that can be perpetuated by evaluation.

Chair
Speakers
Kathryn Dinh

Director, Lotus Evaluation
Hi, I am an Australian-Vietnamese evaluator and researcher in culturally responsive evaluation (CRE) methods and approaches relevant to South East and East Asia. I am Director of Lotus Evaluation, located in Sydney, and have recently completed my PhD at UNSW in CRE. I am also an Affiliate...


Tuesday August 30, 2022 11:00am - 12:00pm ACST
Room E1

11:00am ACST

Telling Our Stories Our Way: Grounding First Nations Data Sovereignty
Corinne Hodson (Manager, Community Engagement and Partnerships, Barang Regional Alliance), Peter Riley (Executive Director, Central Australian Academic Health Science Network), Skye Trudgett (CEO, KOWA Collaborations), Darren Clinch (Director, Notitia Consulting), Jessie Sleep (Chief Executive, Far West Community Partnerships)

Aboriginal and Torres Strait Islander People are exposed daily to deficit stories which don't reflect our values, Community or Culture. We want data to tell our stories about our Communities, Elders and Dreamings, and what matters to our Communities. The UN Declaration on the Rights of Indigenous Peoples talks about the concept of Indigenous Data Sovereignty, but challenges remain in being empowered to have our own data, being empowered to have skills to use data and having the power to use it the way we want to use it from a Community based level. Using current examples from communities across Australia, we ask: What on earth is Indigenous Data Sovereignty in practice and on the ground? What changes when Aboriginal people lead the design, collection, analysis and sharing of data ourselves? How can we keep government and data holders accountable for First Nations access to data and data ownership?

Chair
Kathleen Stacey

Managing Director, beyond... (Kathleen Stacey and Associates)
Kathleen Stacey is the Managing Director and Principal Consultant at beyond... She spent her formative working years within the public sector and academia, before establishing and expanding beyond... into its current form. The company conducts consultancy, evaluation, research and...

Speakers
Peter Riley

Executive Director, Central Australian Academic Health Science Network
Peter Riley is a Wiradjuri and Weilwan man from central western NSW and is currently the Executive Director of the Central Australian Academic Health Science Network (CA AHSN). CA AHSN is a regional partnership of Aboriginal Community Controlled Services (ACCSs) and research organisations/universities...
Skye Trudgett

CEO, Kowa
Skye is a Gamilaroi researcher who has contributed to numerous evaluations and research projects including place-based, systems change and government initiatives. Skye has a particular interest in Indigenous Data Sovereignty & Governance and seeks to support the practical application...


Tuesday August 30, 2022 11:00am - 12:00pm ACST
Hall D (plenary)

11:30am ACST

Evaluation of Gippsland Aboriginal Advocacy Support Service (GAASS): Investigating the experiences of employees in preparation for work with NDIS clients, carers, families and Community
Valerie Prokopiv (Federation University Australia)

The Gippsland Aboriginal Advocacy Support Service (GAASS) is a service provided by the National Disability Insurance Scheme (NDIS) to help Aboriginal and Torres Strait Islander People with a disability, their families and carers to understand the NDIS and access services within the Gippsland region. GAASS is working towards increasing workforce capacity in disability services. Since its inception, GAASS has been training and upskilling workers to learn the skills required to meet the needs of the local Aboriginal Community and respond to issues that affect access to the NDIS, ensuring inclusion and positive outcomes. The approach adopted by GAASS reflects the needs of Aboriginal and Torres Strait Islander people with a disability living in a regional area and is 'open and accountable to the Aboriginal Community and under the law'.

In collaboration with the GAASS, the evaluation investigated the experiences of workers employed by GAASS in gaining the necessary skills in preparation for working with people with a disability, families and carers in the local Aboriginal community. It aimed to investigate the barriers and enablers to the successful implementation of NDIS services by GAASS.

The approach to the evaluation was informed by a participatory evaluation and co-design framework, working in partnership with all stakeholders and embedding a capacity-building focus. The evaluation of GAASS utilised a qualitative method of data collection, gathering data through semi-structured interviews. The evaluation informed future training programs for GAASS, ensuring that employees feel adequately prepared and are able to deliver NDIS support services effectively. The results also showed a need to employ both Aboriginal and Torres Strait Islander and non-Aboriginal and Torres Strait Islander workers to ensure the needs of Community are met.

A metasynthesis, Indigenous Experiences and Underutilisation of Disability Support Services in Australia, reviewed the current literature regarding the experiences of Aboriginal and Torres Strait Islander people who engage with disability support services.

Chair
Nataly Bovopoulos

Manager, Grosvenor Performance Group
Nataly is a Manager at Grosvenor Performance Group specialising in evaluation services. She has over 15 years' experience in the mental health and community sectors as a researcher, people manager and senior leader. She has expertise in wellbeing in the workplace, with her PhD research...

Speakers
Valerie Prokopiv

Research Fellow, Federation University Collaborative Evaluation and Research Group
Valerie is the Deputy Director - Operations of the Collaborative Evaluation and Research Group (CERG) at Federation University Australia. She has extensive experience working in community and industry, bringing those skills to CERG as part of a team dedicated to ethical, robust...


Tuesday August 30, 2022 11:30am - 12:00pm ACST
Riverbank Room 1

11:30am ACST

PDIA: A new fad or the future for capacity development programs?
Erin Blake (Blake Consulting)

Problem Driven Iterative Adaptation (PDIA) is a step-by-step approach which helps to break down a problem into its root causes, identify entry points, search for possible solutions, take action, reflect on the lessons, adapt and then act again.

Several insights have been gleaned from first-hand experiences supporting a PDIA-inspired approach, coupled with thinking and working politically principles, aimed at facilitating improvements in the delivery of health services across Vanuatu. These insights have implications for evaluators and evaluation practice, and relate to:
  • Challenging 'projectized, program logic, indicator-orientated' mindsets and systems.
  • Arriving at shared understanding of problem driven change and what success looks like.
  • Supporting a locally based facilitation team to identify opportunities for reform, navigate working with new partners and measure change.
  • Balancing demands to intervene and act, particularly during health emergencies, with the demand to implement a problem-driven capacity development process which allows people to fail and learn.
  • Developing new ways of framing monitoring, evaluation and learning (MEL) plans and implementing MEL for adaptive management systems (including question based and qualitative approaches, as well as information management systems).
  • Centering, operationalizing and measuring the effectiveness of program principles.

PDIA is a promising approach. However, there are significant challenges to implementation, including those associated with navigating new ways of delivering and measuring the effectiveness of international aid in systems unprepared for change.

This presentation will be of interest to evaluators who are curious about adaptive management, capacity development and/or the localisation of international development. It will explore a relatively new approach that seeks to take greater stock of local knowledge, context and power dynamics, one that requires a significant shift in understanding and practice for program staff and evaluators, and which draws on developmental and principles-focused evaluation theory.

Chair
Squirrel Main

Evaluation Manager, Paul Ramsay Foundation
An evaluator and a data analyst with a passion for measuring and improving outcomes across education, health and community wellbeing sectors, I was The Ian Potter Foundation’s Research and Evaluation Manager from 2015 until 2022. Over the past two decades, I've conducted evaluations...

Speakers
Erin Blake

Monitoring, Evaluation and Learning Consultant, Erin Blake Consulting
Erin is an independent international development Monitoring, Evaluation and Learning (MEL) consultant with over 15 years' experience. He has a passion for ‘working with people to do MEL better’ and working on complex social change programs that seek to bring about long-term positive...


Tuesday August 30, 2022 11:30am - 12:00pm ACST
Riverbank Room 2

12:00pm ACST

Using Monitoring, Evaluation and Learning to support locally led systemic reform - the case of the Vanuatu Skills Partnership
Ellis Silas (Vanuatu Skills Partnership), Stuart Kinsella (Vanuatu Skills Partnership)

The onset of the COVID-19 pandemic in early 2020 led to a renewed focus on localisation within the international development sector and the broader debate about the decolonisation of the industry itself. The closure of international borders, particularly in the Pacific, has meant that donors and their agents have had to rethink the delivery of their programs, and this has often meant a move away from more traditional approaches where expatriate advisers or 'outsiders' typically work alongside local staff in-country and, in many cases, 'wield the power' of program leadership. Donors and aid agencies have been grappling with this 'new normal', with the opportunity it presents to empower local staff to lead on program development and delivery counterbalanced by perceived risks around the appropriate level of oversight and accountability for donor-funded programs.

The Vanuatu Skills Partnership (the 'Partnership') is a nation-wide initiative that aims to improve human resource development for service delivery reform in Vanuatu. While it has been funded by the Australian Government for over a decade, it has progressively become a stand-out example of a locally-led reform movement.

At the heart of the Partnership's success has been the way that monitoring, evaluation and learning (MEL) has been embedded within the Partnership's management structure, planning and decision-making processes. MEL is at the heart of locally driven change processes that are being led by, and therefore owned by, local staff and partners within government and civil society.

This paper provides a practical example of how MEL can be 'decolonised' within a donor-funded development program and used in a meaningful way by local actors to support systemic reform that is locally led. It focuses on the characteristics of the enabling environment needed to ensure that MEL is salient to and useful for the drivers of change within the complexity of the local political economy. In so doing, it provides lessons that are of relevance to other donor-funded programs looking to move towards a more localised development cooperation model.

Chair
Nataly Bovopoulos

Manager, Grosvenor Performance Group
Nataly is a Manager at Grosvenor Performance Group specialising in evaluation services. She has over 15 years' experience in the mental health and community sectors as a researcher, people manager and senior leader. She has expertise in wellbeing in the workplace, with her PhD research...

Speakers
Ellis Theodor Silas

Quality Systems Manager, Vanuatu Skills Partnership
Mr Ellis Silas is 38 years old and is currently employed by the Vanuatu Skills Partnership (an Australian Government funded program in Vanuatu) as its Quality Systems Manager. He is responsible for the ongoing consolidation of an effective implementation and monitoring...
Stuart Kinsella

Quality Systems Support, Vanuatu Skills Partnership
Stuart is an experienced MEL practitioner who has worked from within, and on behalf of, both government and non-government organisations in the international development sector for close to 15 years, delivering both 'in-house' monitoring roles and external/independent reviews and evaluations...


Tuesday August 30, 2022 12:00pm - 12:30pm ACST
Riverbank Room 1

12:00pm ACST

From Words to Wisdom - New Ways of Using Text Data to Inform Evaluation
Mark Mackay (Complete the Picture Consulting), Curtis Murray (University of Adelaide), Lewis Mitchell (University of Adelaide), Simon Tuke (University of Adelaide)

Evaluators can access a range of text-based data including documents (reports, submissions, journal articles, emails, etc.), responses to open-ended survey questions and more novel sources such as social media and blogs. Readily available tools now exist to assist the evaluator extract meaning from text-based data and present the findings in visually engaging ways.
The purpose of the presentation is to demonstrate how evaluators can use text-based analysis by applying the methods to several different text-based data types, including publicly available written submissions to a Parliamentary Inquiry, a synthetic data set relating to an open-ended survey question and publicly available patient opinion data.
The analysis will be based on the use of the open-source KNIME® software, which includes specific text analysis tools. Analysis will include topic analysis and sentiment analysis, with visualisations such as word graphs.
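
As an illustrative sketch only (not the presenters' KNIME workflow), the same two techniques, topic analysis and sentiment analysis, can be scripted in Python with open-source libraries; the file name survey_responses.csv and the column name "response" below are assumptions for the example.

import pandas as pd
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

nltk.download("vader_lexicon", quiet=True)

# Open-ended survey responses, one per row (illustrative file and column names).
responses = pd.read_csv("survey_responses.csv")["response"].dropna()

# Topic analysis: bag-of-words counts fed into a small LDA model.
vectorizer = CountVectorizer(stop_words="english", max_features=2000)
counts = vectorizer.fit_transform(responses)
lda = LatentDirichletAllocation(n_components=5, random_state=0)
lda.fit(counts)

terms = vectorizer.get_feature_names_out()
for i, weights in enumerate(lda.components_):
    top_terms = [terms[j] for j in weights.argsort()[-8:][::-1]]
    print(f"Topic {i}: {', '.join(top_terms)}")

# Sentiment analysis: VADER compound score per response (-1 negative to +1 positive).
sia = SentimentIntensityAnalyzer()
scores = responses.apply(lambda text: sia.polarity_scores(text)["compound"])
print("Mean sentiment:", round(scores.mean(), 3))

The presentation itself will demonstrate these steps as KNIME workflows rather than code.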

Main findings and/or conclusions
According to the World Economic Forum (Desjardins, 2019), the volume of data generated each day in 2019 included 500 million tweets, 294 billion emails and 4 petabytes of data created on Facebook (Meta). Daily data creation is estimated to grow to 463 exabytes.

Evaluators have long been accustomed to dealing with quantitative approaches to summarising numeric data and have successfully employed statistical techniques in evaluations. Previously, unstructured data such as text was more difficult to analyse. The development of open-source tools, such as KNIME®, means that text analysis is now more accessible to evaluators.

Reference:
Desjardins, J. (2019). How much data is generated each day? World Economic Forum, accessed 28 September 2019.

Chair
Speakers
Mark Mackay

Director, Complete the Picture Consulting
Mark has more than 30 years of experience in providing business research and evaluation, and strategy development. Before setting up Complete the Picture Consulting in 2009, Mark worked in private industry and various state government departments, including numerous health organisations...


Tuesday August 30, 2022 12:00pm - 12:30pm ACST
Room E2

12:00pm ACST

Application of principles to practice: an examination of two rapid evaluations.
Jane Howard (Department of Health), Diz McKinnon (Department of Health)

This presentation illustrates how evaluators can deliver rapid evaluations with minimal resourcing that are adaptable and responsive to changing environments and deliver evaluations that are of use and value to multiple stakeholders.

In fast paced policy environments, it can be challenging for evaluators to engage stakeholder groups and balance competing needs in rapid evaluation (RE) delivery. We will present two linked case studies that demonstrate the practical application of the RE methodology, and examine stakeholder relationships, clarity of purpose and scope, fit for purpose evaluation, and key lessons learned.

The overarching policy context is to ensure culturally responsive support for Aboriginal people to maintain appropriate, affordable public housing as a pathway to better lives and stronger communities. The rapid evaluations were delivered by the evaluation unit of a large government department during a time of COVID-19 restrictions, with key policy, funding and program decisions resting on evaluation results.

Key stakeholders were an Aboriginal Community Controlled Organisation (ACCO) responsible for managing rental public properties for Victorian Aboriginal people, and a state-government Housing division. Both stakeholders required the findings of the independent evaluations to support budget and program development needs. However, evaluation needs differed between stakeholder groups, requiring a balance to be struck with what data was collected and how it was analysed.

A significant challenge was the requirement to create and develop strong and respectful relationships with each stakeholder group within the truncated timeframes allowed by a rapid evaluation. The strength of these relationships, built through persistence, honesty and reliability, underpinned every activity that followed. These relationships were crucial to establishing clarity between all stakeholders for the evaluations' purpose and scope, which in turn contributed to their utility. A further learning was developing a deeper understanding of when and how to adapt the rapid evaluation model to ensure that the evaluation suits the evaluand's context.

Chair
John Stoney

John has been an internal evaluation practitioner within the Australian Government for nearly 10 years (currently with the Dept. of Social Services as a member of its central Evaluation Unit). He describes this as his 'full-time job', as he has also been (effectively part-time) at varying...

Speakers

Tuesday August 30, 2022 12:00pm - 12:30pm ACST
Room E3

12:00pm ACST

Evaluation 'drop in' Clinics
Delyth (Del) Lloyd (Centre for Evaluation and Research Evidence)

Our evaluation unit serves two large government departments. We train over 500 people a year in monitoring and evaluation. We pilot tested "evaluation drop-in clinics" in 2021, guided by the literature on implementation science and evaluation capacity building. The clinics aimed to support staff in translating their evaluation training into their evaluation practice. The pilot was extended and is now part of our ongoing service offering. We have some tips and tricks to share about how to run a successful evaluation clinic, some lessons learned, and early evidence that our approach is making a difference.

Speakers

Tuesday August 30, 2022 12:00pm - 12:30pm ACST
Riverbank Room 2

12:00pm ACST

Giving the Participant the Driver's Seat: Using Virtual Reality animation for Evaluation
Samantha Abbato (Visual Insights People)

Evaluation has been reluctant to travel the road of participatory film methods. Researchers from fields including public health, social science, and education are already well in the distance with their use of participatory video, Videovoice, and digital storytelling.

Virtual Reality (VR) animation presents an opportunity to weave a distinctly participatory approach to evaluation storytelling and data collection through an immersive communication tool that seats the audience of the evaluation findings in the passenger seat.

Is it time that we immersed ourselves, taking the lead from evaluation participants and other disciplines, and take off along the participatory road with VR?

Chair
Squirrel Main

Evaluation Manager, Paul Ramsay Foundation
An evaluator and a data analyst with a passion for measuring and improving outcomes across education, health and community wellbeing sectors, I was The Ian Potter Foundation’s Research and Evaluation Manager from 2015 until 2022. Over the past two decades, I've conducted evaluations...

Speakers
Samantha Abbato

Director, Visual Insights People
Dr. Samantha Abbato is an evaluation consultant and Director of Visual Insights. Sam has a passion for maximising evaluation use through effective communication and evaluation skill building using a pictures and stories approach and increasing the academic rigour of evidence. Sam’s...


Tuesday August 30, 2022 12:00pm - 12:30pm ACST
Riverbank Room 2

12:00pm ACST

How behavioural science can improve evaluation - 3 useful approaches
Kizzy Gandy (Kantar Public)

Curiously, many behavioural scientists say the biggest contribution behavioural science has made to government is more and better evaluations. Yet few evaluators would say the reverse - that behavioural science has influenced the conduct of evaluations. Since the ultimate purpose of evaluation is to figure out how to make things better - and every policy, program, and service involves behaviour - this talk will argue that weaving behavioural science into evaluation is essential. Learn how behavioural science (a 'new way of weaving') improves theory of change, offers new methods for data collection and analysis, and generates more practical recommendations based on evaluation findings.

Chair
Squirrel Main

Evaluation Manager, Paul Ramsay Foundation
An evaluator and a data analyst with a passion for measuring and improving outcomes across education, health and community wellbeing sectors, I was The Ian Potter Foundation’s Research and Evaluation Manager from 2015 until 2022. Over the past two decades, I've conducted evaluations...

Speakers
Kizzy Gandy

National Director, Program Evaluation, Kantar Public
I have expertise in behavioural science, international development and evaluation. I love talking about integrating all three. Also come talk to me about improving the rigour of causal claims in impact evaluation through theory-based approaches and quasi-experimental methods.


Tuesday August 30, 2022 12:00pm - 12:30pm ACST
Riverbank Room 2

12:00pm ACST

New ways of weaving during a pandemic: Adapting to the dynamic flow of an evaluand
Natasha Moloczij (Allen and Clarke)

COVID-19 significantly influenced daily life in Victoria. Programs that were being implemented, and their corresponding evaluations, were challenged to adapt and innovate. Using a developmental evaluation approach, we adapted and adjusted to the constantly changing public health challenges as we implemented a justice sector evaluation.

Key practice learnings were the importance of documenting what happened and when (both in the initiative and due to COVID restrictions), creating data collection strategies to fit an online environment (observations, interviews, survey and sense-making sessions), embedding constant adaptation into analysis, and re-assessing usual evaluation processes during uncertain environments.

Chair
Squirrel Main

Evaluation Manager, Paul Ramsay Foundation
An evaluator and a data analyst with a passion for measuring and improving outcomes across education, health and community wellbeing sectors, I was The Ian Potter Foundation’s Research and Evaluation Manager from 2015 until 2022. Over the past two decades, I've conducted evaluations...

Speakers
Natasha Moloczij

Senior Consultant, Allen and Clarke


Tuesday August 30, 2022 12:00pm - 12:30pm ACST
Riverbank Room 2

12:00pm ACST

Utility in practice, a prisons pilot project: maximising an evaluation to meet the needs of prisoners, the project implementation team and policymakers
Samantha McArdle (Wellbeing SA), John Strachan (Department for Correctional Services (SA))

In 2021, a range of nutritious menu changes were introduced in the first of six State-run prisons. Using developmental evaluation principles, the pilot project and its evaluation were co-designed and implemented using a "managing for impact" approach to: engage stakeholders, maximise the take up and impact of the intervention, and support sustained changes to prison operations.

Contrary to common assumptions about correctional facilities, the project and evaluation took a responsive, respectful and gentle approach - recognising that changes to food provision might be of heightened importance, given that many of prisoners' other civil liberties are restricted.

The mixed methods evaluation used a wide range of data sources to tell a compelling story of success. However, this value judgment (although necessary for a pilot project) was only one outcome of the evaluation. In addition, the evaluation met the needs of a range of stakeholders: showcasing "prisoner voice" about issues affecting their own health and wellbeing and inviting their input; valuing and empowering kitchen staff (including paid prisoners); informing project actions and timeframes; identifying an operational shortfall that was halting the project's success; providing an influential communication tool for Executive, Senior Management staff and policymakers; and allaying the concerns of Union representatives.

Moving beyond outcomes and judgments, the evaluation produced a roadmap for how to successfully undertake health promotion-based change management at prison sites. This is a substantial outcome as it creates an established pathway to support future health promotion initiatives for the benefit of a population who has experienced high levels of disadvantage.

This presentation will demonstrate novel, far-reaching ways to utilise evaluation for a range of stakeholders and purposes. Evaluation can play an instrumental role in maximising the responsiveness, impact and sustainability of interventions, in this case with an equity lens and within a highly controlled setting.

Chair
Speakers
Sam McArdle

Senior Evaluation Officer, Wellbeing SA
Program evaluation, campaign evaluation, mixed methods, maximising utility for stakeholder groups


Tuesday August 30, 2022 12:00pm - 12:30pm ACST
Room E1

1:30pm ACST

Participatory Realist Evaluation in a large state government agency: weaving evaluation findings into policy and implementation
Eleanor S C Kerdo (Victorian Department of Education), Hayden Jose (Victorian Department of Education)

The paper shares the experiences of an internal evaluation team using participatory realist evaluation approaches to evaluate a large government pilot initiative providing a two-year structured induction program for graduate teachers employed in state government schools.

We will explore the challenges and facilitators of participatory realist approaches through this evaluation and provide insights generated from both the evaluator and client perspectives. These approaches also include evaluation capacity building to ensure evaluation findings can be understood and used to shape pilot initiatives and possible future iterations.

Participatory realist approaches have particular value for pilot initiatives in their ability to rapidly synthesise evidence sources, collaboratively test program theory, examine contexts, mechanisms and outcomes for program participants, and provide insights to influence policy and implementation.

We explore examples of how the approaches have facilitated rapid uptake of findings to improve program delivery, including to support implementation fidelity and adaptation. We discuss the challenges and enablers for using participatory realist approaches throughout the evaluation, including in the context of changes to program implementation due to COVID-19, differences in policy and implementation across different pilot areas, and the balance between implementation and outcomes findings across the evaluation lifespan.

Participatory realist approaches have also been useful in looking at impacts of place-based implementation within the pilot to understand underlying generative mechanisms that explain 'how' the outcomes were achieved and the influence of context, as described by Pawson and Tilley (1997).

Chair
Martina Donkers

Independent Evaluator
I'm an independent evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed methods evaluation with a complexity and systems lens. I like rubrics, semi-structured interviews...

Speakers
Eleanor Kerdo

Assistant Director, Attorney-General's Department
Eleanor is passionate about social justice, science, and access to safe and high quality health, human services and education. Eleanor is an experienced evaluator specialising in participatory realist approaches and has experience in both biomedical research and consumer-led research...


Tuesday August 30, 2022 1:30pm - 2:00pm ACST
Room E3

1:30pm ACST

A framework for evaluating in digital contexts
Matt Healey (First Person Consulting), Mallory Notting (First Person Consulting)

The impacts of COVID-19 over the last two years are undeniable, and have led to the adoption of new technologies, ways of working and implementation of programs in ways that are unprecedented. Evaluation as a practice has continued alongside these changes and, as good evaluators, there is value in stepping back and considering the broader lessons from these changes in delivery and subsequent approaches to evaluation.

This session will present a framework for understanding evaluation across two domains: evaluating digital interventions, and implementing digital evaluation methods.

Starting with a background to the framework, facilitators will then discuss different types of digital interventions - ranging from podcasts to websites to video games - and the key features that differentiate them. Then the focus will shift to digital evaluation methods, from 'basic' methods such as online surveys through to more 'sophisticated' approaches such as embedded data collection (e.g., analytics), weighing the pros and cons of each.

The final component of the session will focus on building attendees' understanding of the pros, cons, and practicalities of using digital interventions, digital methods, and the interface between the two. This includes how and when to undertake different steps in the evaluation planning process, and the importance of clarity in terms of 'what' is measured by different approaches. Of particular importance is the role of embedded measurement and what it actually means - for instance 'listens', 'views', 'shares', and 'likes'. Finally, the session will close with discussion on the role and importance of 'humanising' the anonymous user and the value of more traditional methods in this process.

Attendees will receive a 1-page summary copy of the framework that outlines these key considerations, and that can be taken away for use in their own practice.

Chair
Kate Williams

Senior Research Fellow, Australian Health Services Research Institute, University of Wollongong
Dr Kate Williams is a senior research fellow at the Australian Health Services Research Institute, University of Wollongong, where she works in multi-disciplinary teams on commissioned research and evaluation projects. Kate has more than 20 years’ experience in research and evaluation...

Speakers
Matt Healey

Senior Consultant, First Person Consulting
I've been working as an evaluation consultant since early 2014. In that time I've led or contributed to over 100 projects spanning sustainability, climate change, public health, health promotion, emergency management, energy efficiency, renewable energy, waste management, innovation...
Mallory Notting

First Person Consulting


Tuesday August 30, 2022 1:30pm - 2:00pm ACST
Riverbank Room 2

1:30pm ACST

Exploring Action Research as a method of creating evidence that is informed by Aboriginal and Torres Strait Islander ways of being, doing and knowing.
Lucas Moore (QATSICPP), Reno French (QATSICPP), Lisa Hillan (QATSICPP)

Over the past two decades there has been growing acknowledgement of the need to decolonise approaches to evaluating policy, programs and services impacting on the lives of Aboriginal and Torres Strait Islander people. During this time, Aboriginal and Torres Strait Islander leaders, researchers and organisations have called for their ways of knowing, doing and being to be at the forefront of evaluative practice.
This presentation outlines how a community controlled peak body is utilising an action research methodology guided by First Nations wisdom and knowledge to drive evaluation. It will highlight how evaluative evidence is utilised to ensure new approaches are developed to address the overrepresentation of Aboriginal and Torres Strait Islander children in child protection. Action research was selected as it is inclusive, supports self-determination, and is complementary to other values and practices in Aboriginal and Torres Strait Islander culture, such as yarning and collective decision making.

Two action research projects will be explored. The first focused on systems reform and collaboration between community controlled organisations and a child protection body to evaluate and inform the groundbreaking implementation of delegated authority in one jurisdiction.

The second project involved Aboriginal and Torres Strait Islander practitioners building the evidence base to inform the development of systems and practice grounded in a First Nations view of responding to domestic and family violence.

Critically, the approach to evaluation was via research that was Aboriginal and Torres Strait Islander led throughout. Project reflections to date suggest participants found the action research approach inclusive, participatory and effective in assisting them to address critical practice and systemic issues.
The presentation will conclude with a discussion of insights from the research, including reflections on ensuring evaluative research is meaningful for Aboriginal and Torres Strait Islander peoples.

Chair
Doyen Radcliffe

Doyen Radcliffe is a Yamatji Naaguja man from the Midwest Region of Western Australia. Doyen is a community minded individual with a passion for empowering Indigenous communities to reach their real potential to improve quality of life, health, social and economic wellbeing, and inclusion...

Speakers
Reno French

Project Officer, QATSICPP
Reno is a project officer working on the implementation of Delegated Authority. Reno has worked in the Community Controlled, Federal Government and Local Government sectors, with experience in management and government initiatives. Reno is qualified in Child, Youth and Family intervention...
Lucas Moore

Policy Officer, QATSICPP
Lucas Moore has been working and volunteering for 20 years in systemic advocacy, community development and the media. He worked for over 12 years with the CREATE Foundation, a national advocacy body promoting the voices of children and young people with a care experience. While...


Tuesday August 30, 2022 1:30pm - 2:00pm ACST
Room E1

1:30pm ACST

New tools for supporting culturally safe evaluation and practice
Anna Leditschke (Lowitja Institute)

Our national Aboriginal and Torres Strait Islander research organisation has brought together learning from its two decades of research with contemporary developments in cultural safety, to develop several new tools for use in evaluation and organisational cultural change initiatives. After an overview of available tools, participants in this workshop will undertake an 'art gallery' walk in small groups to gain hands-on experience with the tools, before sharing their experiences, interests and insights with the full participant group.

Chair
Florent Gomez

Flo is a well-seasoned evaluator with over 15 years' experience in evaluation across consulting and government in Europe and Australia. He is currently an independent evaluator and chairs the New South Wales committee of the Australian Evaluation Society. Flo is passionate about bringing...

Speakers
Rosemary Smith

Executive Manager of Policy, Lowitja Institute
Rosemary is a proud Ngarabul from Northern NSW, growing up on Wonnarua country in the Hunter Valley and now living and working on Ngunnawal country in Canberra. Over the years, Rosemary has spent time working across Government, corporates and NGOs in Aboriginal & Torres Strait Islander...


Tuesday August 30, 2022 1:30pm - 2:30pm ACST
Room E2

1:30pm ACST

Showcasing evaluation through publication in the Evaluation Journal of Australasia
John Guenther (Batchelor Institute of Indigenous Tertiary Education), Jeffery Adams (Massey University, NZ), Anthea Rutter (The University of Melbourne), Melissa Forbes (University of Southern Queensland), Caroline Ladewig (Grosvenor Performance Group), Kwadwo Adusei-Asante (Edith Cowan University)

Evaluators have a vital role to play in communicating the value of evaluation, demonstrating innovation in practice, and sharing effective materials, patterns and practices. Evaluators are also challenged to demonstrate whose values matter in evaluation and to explore inclusive approaches. Writing for publication is one really positive way that evaluators can share their knowledge and the important learnings that come from working in this field.

The editorial team of the Evaluation Journal of Australasia (EJA)—led by John Guenther—extends an invitation for evaluators to address these issues and to contribute articles on any other subject connected with evaluation. The journal is intended to be critical and intellectually rigorous, but at the same time readable and accessible to the widest possible range of scholars and practitioners. Manuscripts from evaluators at all career stages are welcomed.

In this session you will meet members of the editorial team, who will provide guidance on developing journal articles and navigating the EJA publishing and review process. It's easy to be daunted by the task of writing for journals, but this session will show that it is not too difficult. The session will also provide opportunities to network with other potential authors and get some practical tips about how to improve your publication profile. Participants in previous EJA conference sessions have gone on to contribute journal articles and book reviews, and to peer review for the journal.

Chair
avatar for Susie Fletcher

Susie Fletcher

Senior consultant, Australian Healthcare Associates
Dr Susie Fletcher is an experienced health services researcher with over 50 peer-reviewed journal articles and 3 book chapters in mental health and primary care. She is passionate about improving health outcomes through integrating services across sectors; her recent work has focused... Read More →

Speakers
avatar for John Guenther

John Guenther

Research Leader, Education and Training, Batchelor Institute of Indigenous Tertiary Education
John Guenther is a senior researcher and evaluator with the Batchelor Institute of Indigenous Tertiary Education, based in Darwin. Much of his work has been based in the field of education. He has worked extensively with community-based researchers in many remote parts of the Northern... Read More →
avatar for Jeff Adams

Jeff Adams

Associate Professor, SHORE & Whariki Research Centre, Massey University
I have led many research and evaluation projects across a range of public and community health and social service initiatives. In addition I maintain a competitively funded research program investigating sexuality, gender and health. I have experience in teaching program evaluation... Read More →
avatar for Anthea Rutter

Anthea Rutter

Research Fellow, Centre for Program Evaluation. The University of Melbourne
Anthea Rutter is a Senior Research Fellow in the Assessment and Evaluation Research Centre (formerly the Centre for Program Evaluation) at The University of Melbourne. She has extensive experience working with a wide range of community, state and national organisations. She is particularly... Read More →


Tuesday August 30, 2022 1:30pm - 2:30pm ACST
Riverbank Room 1

1:30pm ACST

Relationships, reciprocity and trust: Centring First Nations people in evaluation to support effective co-design of financial governance training
Samantha Togni (S2 Consulting)

Panellists: Valerie Martin, Peggy Granites, Robyn Lawson, Karina Menkhorst, Peter Marin, Samantha Togni, Marjorie Brown

Over three years, developmental evaluation supported the co-design of a financial governance training program aimed at strengthening the capacity of directors of two First Nations corporations in remote Australia who receive income from mining. Operating at the interface of different knowledge systems, laws and values, this co-design process involved Indigenous directors, land council staff and corporate and financial governance trainers. Developmental evaluation offered an approach that engaged all parties and centred the Indigenous directors’ values and perspectives in the co-design.

Relationships are central to developmental evaluation as it harnesses the knowledge and experience of all stakeholders engaged in the co-design of new initiatives in complex contexts. Developmental evaluation de-centres the evaluator as ‘expert’, instead situating the evaluator within the co-design team to facilitate evaluative thinking and iterative learning. Understanding how developmental evaluation practice supports culturally safe, Indigenous-led co-design is important. Directors, land council staff, trainers and the evaluator will share their experiences of weaving together the co-design, learning and evaluation processes.

In conversation with the evaluator, panellists will share how they engaged in the developmental evaluation, ensuring that it became integral to the co-design process and the training program delivery. They will explore how the training program was transformed to align with Indigenous ways of learning by centring the directors’ values and perspectives in the evaluation, and how reciprocal learning occurred: the directors learned about financial governance and investing, while the land council and trainers learned about delivering effective financial governance training in this context. This reciprocity enhanced relationships and built trust between the stakeholders, which further enhanced the learning environment and the training’s effectiveness.

When there is readiness and capacity for developmental evaluation, this approach can effectively centre First Nations people’s values and perspectives, strengthening relationships and the quality of the evaluation in support of Indigenous aspirations.


Chair
avatar for Michael Pilbrow

Michael Pilbrow

Founder and Chairman, Strategic Development Group
Michael Pilbrow is based in regional Australia from where he engages in an unusual mix of evaluation, co-operative development and disaster recovery and resilience work. Michael has led evaluations in Australia and globally in areas as diverse as mining governance, digital technology... Read More →

Speakers
avatar for Belinda Wayne

Belinda Wayne

Director, Granites Mine Affected Area Aboriginal Corporation
Hello, my name is Belinda Napaljarri Wayne. I am from Yuendumu Community northwest of Alice Springs on the Tanami Road in the NT. I speak Warlpiri, which is my first language. I went to school in Yuendumu when I was young and then at 13 years old I went to Yirara College in Alice... Read More →
avatar for Samantha Togni

Samantha Togni

Evaluation facilitator, S2 Consulting
Samantha Togni is an evaluation and social research consultant based in Mparntwe, Alice Springs. Sam has more than 25 years’ experience in Indigenous health and wellbeing research and evaluation, working with rural and remote Aboriginal organisations in northern and central Australia... Read More →
avatar for Karina Menkhorst

Karina Menkhorst

Good Governance Program Coordinator, Central Land Council
Karina Menkhorst is the Good Governance Program Coordinator at the Central Land Council (CLC) in Alice Springs. In this role Karina works closely with the Directors of two Aboriginal Corporations based in the Tanami Desert of the Northern Territory - the Granites Mine Affected Area... Read More →
avatar for Peter Marin

Peter Marin

Director, MLCS Corporate
Peter is Director of MLCS Corporate, Chartered Accountants and Business Advisors based in Adelaide South Australia.  MLCS Corporate specialise in working with Australian Indigenous People in business planning and economic development.  Peter specialises in Corporate Governance with... Read More →
avatar for Valerie Martin

Valerie Martin

Director, Kurra Aboriginal Corporation
avatar for Peggy Granites

Peggy Granites

Director, Kurra Aboriginal Corporation
Peggy Napurrula Brown is a senior Warlpiri woman living in Yuendumu. Peggy is custodian for Dead Bullock Soak and a Director of Kurra Aboriginal Corporation.
avatar for Marjorie Brown

Marjorie Brown

Director, Granites Mine Affected Area Aboriginal Corporation
Hello, I’m Marjorie Nampajimpa Brown. I am originally from Willowra. I went to school here. I am a senior Warlpiri woman and Luritja on my father’s side. I have 4 children and 11 grandchildren. I worked at the Literacy Centre in Willowra School when I was 35. I learned English... Read More →


Tuesday August 30, 2022 1:30pm - 2:30pm ACST
Hall D (plenary)

2:00pm ACST

Translating evaluation findings into practice: a partnership approach
Cathy Stirling (Insight Consulting)

Evaluation reports often fail to get the right information to the right people at the right time to facilitate change in practice. Whilst there has been an increase in the use of evidence-based programs created abroad, and the requirement to evaluate these programs, there is often little emphasis on how to translate evaluation findings into practice to ensure these programs meet local needs. This presentation will show how partnerships between evaluators and practitioners can help improve outcomes in programs serving vulnerable people.

Insight Consulting evaluated Functional Family Therapy (FFT) as an innovative response to address adolescent violence in the home. The evaluation draws on a mixed-methods approach. To understand the context, sixty stakeholders were interviewed, including 11 Ozchild staff, 15 adolescents, 17 parents and 17 Aboriginal community stakeholders. To understand the impact of FFT on the 65 families who completed the program, the following data were used: administrative data, surveys, and reliable and valid outcome measures.

The partnership approach evolved over three years with the following key learnings:
  • The importance of practitioner buy-in for the evaluation
  • Seeing practitioners as equal partners alongside evaluators
  • Embedding outcome measurement into practice, including improving the quality and quantity of data captured, and ensuring this data is used to continuously refine the program to meet the needs of people in the community
  • Updating the theory of change to represent the contextual barriers and enablers to implementing FFT in the Central Coast
  • Targeting stakeholders within the community to ensure the program can be sustained over time.

This presentation will show how partnerships between evaluators and practitioners help to ensure evaluation findings become embedded in practice.


Chair
avatar for Martina Donkers

Martina Donkers

Independent Evaluator
I'm an independent evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed methods evaluation with a complexity and systems lens. I like rubrics, semi-structured interviews... Read More →

Speakers
avatar for Cathy Stirling

Cathy Stirling

Senior Consultant, Insight Consulting
Cathy has twenty years' experience as a researcher and evaluator across academic, government and non-government organisations. She is passionate about knowledge translation and working in partnership to make sure research and evaluation findings are embedded into practice.


Tuesday August 30, 2022 2:00pm - 2:30pm ACST
Room E3

2:00pm ACST

Taking a digital approach to impact reporting: Creating the Impact Hub
Cliona Fitzpatrick (Movember), Bree Earle (Movember), Jen Anderson (Movember)

The stories organisations tell about the difference they make can engage, inform and inspire their stakeholders. This presentation will share the experiences of a global health organisation in creating an interactive impact reporting tool, known as the Impact Hub. It will describe how evaluation findings were woven into the Impact Hub via simple, sharp, tailored messages for its fundraisers and donors.

The presentation will demonstrate new ways of weaving evaluation findings into impact statements, and the benefits of embedding these alongside bold visuals and stories.

The presentation will also highlight the practical nature of this new tool, demonstrating how impact findings can be easily used and shared cross-functionally within the organisation and with external stakeholders. It will also describe how readers can explore and choose the level of detail they wish to engage with.

Chair
avatar for Kate Williams

Kate Williams

Senior Research Fellow, Australian Health Services Research Institute, University of Wollongong
Dr Kate Williams is a senior research fellow at the Australian Health Services Research Institute, University of Wollongong, where she works in multi-disciplinary teams on commissioned research and evaluation projects. Kate has more than 20 years’ experience in research and evaluation... Read More →

Speakers
avatar for Cliona Fitzpatrick

Cliona Fitzpatrick

Research & Impact Manager, Movember
Exposing health disparities and demonstrating impact. Cliona is passionate about building evidence, and then presenting it in innovative and compelling ways to reach all audiences. Using skills established in a fourteen-year research and evaluation career, she is now leading how Movember... Read More →
avatar for Bree Earle

Bree Earle

Head of Content, Movember
Golf balls, moustaches and stories. These three simple things have paved Bree’s career so far. Bree is a content strategist with over eight years’ experience in the non-profit sector. From grassroots golf to the hairy halls of Movember, Bree is passionate about using storytelling... Read More →


Tuesday August 30, 2022 2:00pm - 2:30pm ACST
Riverbank Room 2

2:00pm ACST

"- Story Catching, Story Telling and Story Weaving - How story can be weaved into other programs and policies beyond the Monitoring and Evaluation of Indigenous programs."
Susan Rooney-Harding (The Story Catchers), Jade Wilson (Department for Infrastructure and Transport), Nerissa Walton (Nereus Consulting)

Evaluating First Nations programs requires an inclusive, engaging, authentic and values-based evaluation approach. We use a variety of evidence-based methodologies and have developed and refined an innovative story and participatory-focused process that employs documentary film and participatory forums to analyse narrative data. This ensures that we embrace the values and perspectives of the communities, stakeholders and clients that we work with.

We will analyse the participatory monitoring and evaluation process that was used in the second evaluation, and the lessons learned from the first evaluation, of the 'On the Right Track Remote' (OTRTR) driver licensing service delivery in the APY and Maralinga Tjarutja (MT) Lands in remote South Australia.

We will show the final documentary report and discuss the uses of the narrative data for the evaluation and how it can be woven into other programs and policies beyond the evaluation. These may include Reconciliation Action Plans, education programs, meeting communications and events.

Story Catching - Collect - 15min

We will discuss our process in the creation of an agile evaluation framework, and the story collection methods that we adopted. This process had to accommodate dynamic community needs and a flexible service delivery approach due to the ongoing COVID-19 pandemic.

Storytelling - Collate and Collaborate - 15min

We will discuss how the narrative data is edited for analysis in workshops and yarning circles, and then, post-engagement, for the written and visual documentary report. Topics include:
1. The filmmakers' and evaluators' analytical process
2. The engagement process
3. The Communities and Stakeholders (Yarning Circle)
4. The Clients and Program staff (Workshop)

Story Weaving - Circulate - 10 mins

We will discuss how the narrative data can be edited into fit-for-purpose pieces that can then be woven into other programs and policies beyond the evaluation.

Chair
avatar for Doyen Radcliffe

Doyen Radcliffe

Doyen Radcliffe is a Yamatji Naaguja man from the Midwest Region of Western Australia. Doyen is a community minded individual with a passion for empowering Indigenous communities to reach their real potential to improve quality of life, health, social and economic wellbeing, and inclusion... Read More →

Speakers
avatar for Jade Wilson

Jade Wilson

Coordinator, Dept. for Infrastructure and Transport
I am a proud Arabana, Ngarrindjeri and Pitjantjatjara woman. I grew up in Port Augusta, South Australia, and have worked for the Australian Government for 14 years in the Aboriginal Affairs division. I have been working in the Road Safety field since 2017 and am currently the Coordinator... Read More →
avatar for Nerissa Walton

Nerissa Walton

Principal, Nereus Consulting
Hi there, I've worked in a range of evaluation, advisory and project management fields throughout my career. I am passionate about maximising benefits achieved through public and private funding and opportunities for landholders and communities to engage in nature-based enterprises... Read More →
avatar for Susan Rooney-Harding

Susan Rooney-Harding

Director, The Story Catchers
Susan Rooney-Harding: Director, Videographer, Editor, Business Development, and Project Management. Susan is a documentary filmmaker and a creative qualitative data specialist. Her inquisitive and intuitive nature is central to her ability to capture meaningful stories for a greater... Read More →


Tuesday August 30, 2022 2:00pm - 3:00pm ACST
Room E1

2:30pm ACST

Weaving evaluation into program implementation: Using evaluation findings to inform program learning and decision making
Jake Phelan (Australian Volunteers Program)

How do you weave the findings from multiple evaluations into the fabric of a global international development program? This paper presents the experience of a large, government-funded program supporting a diverse range of skilled volunteers to work with partner organisations internationally. The program has a strong organisational commitment to learning and continuous improvement, supported by a dedicated Monitoring, Evaluation and Learning (MEL) team and investment in research and evaluation. Each year the program collects monitoring data from hundreds of individuals and partner organisations across a large range of thematic sectors. It has commissioned numerous formative evaluations plus reviews of specific areas of the program, and has conducted multiple summative evaluations. In this data-rich context, the challenge is promoting the value of evaluation to strengthen a culture of organisational learning, and making sense of the evidence gathered to inform practice.

The program's MEL team supports several initiatives to address this challenge, and the paper will present learning from different strategies used, some more successfully than others. Annual regional reflection workshops are held with program staff from 22 countries, where staff are supported to collaboratively make sense of data collected. Findings and recommendations from research and evaluations are summarised and communicated in a variety of ways to appeal to different stakeholders. And senior managers and Government stakeholders are supported to review evaluation recommendations - some of which have resonated more than others - and consider what it means for a program to 'learn and adapt.' The paper will share an evaluation user's perspective on how evidence can have impact.

Justification: This paper fits within the theme of 'evaluation for all'. It showcases how evaluations can involve and have an impact on program staff and other stakeholders in 22 countries. In doing so, it speaks to questions of how to weave evaluation into the culture of a global program, how to support staff to engage with evaluative processes, and how to support individuals to make sense of findings from multiple evaluations and weave findings and recommendations into operational and strategic decision-making.

The paper shares insights from an evaluation user's perspective about how evidence can have impact, and how evaluators and organisations can help weave evaluation findings into decision-making processes. It will be particularly relevant for AES members interested in utilisation-focused evaluation and the process by which evidence informs action.

The paper will provide a case study of the approaches a large program or organisation can use to engage stakeholders in evaluative processes and communicate findings, to improve the program's outcomes. These range from internal evaluation steering groups, informal communication channels, research summaries and videos to annual reflection exercises and formal mechanisms to manage and follow-up on evaluation findings. The paper will explore how the program's innovation pathways and new models have been tested, and how decision-makers have been supported to consider evaluation findings as they review policies and strategies and prepare plans. It will also address the institutional barriers to evaluation impact. The paper will ask what can be done to ensure that evaluations resonate with practitioners and how a culture of organisational learning and evidence-based decision making can be encouraged and strengthened.

Chair
avatar for Martina Donkers

Martina Donkers

Independent Evaluator
I'm an independent evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed methods evaluation with a complexity and systems lens. I like rubrics, semi-structured interviews... Read More →

Speakers
JP

Jake Phelan

I manage the semi-independent unit responsible for monitoring, evaluation and learning for the Australian Volunteers Program. The program matches a broad range of skilled Australians with partner organisations overseas, to support these organisations to achieve their own development... Read More →


Tuesday August 30, 2022 2:30pm - 3:00pm ACST
Room E3

2:30pm ACST

Challenges and lessons learnt in establishing data sets for rapidly changing initiatives
Veronique Roussy (EACH), Virginia Lewis (La Trobe University), Geraldine Marsh (La Trobe University), Liz Senior (EACH), Danielle Siler (IPC Health)

This presentation draws upon the experience of EACH and La Trobe University in evaluating part of Victoria's COVID-19 rapid response initiatives, as delivered by a consortium of five independent community health organisations in 2020-2021.

Argument:
The establishment of useful, purposeful data sets to evaluate community-based primary health care interventions faces significant challenges when context changes rapidly.

Main findings:
Quantifying, describing and evaluating the processes and impact of the C-19 Consortium's delivery of rapid response COVID-19 testing, community engagement and vaccinations across metropolitan Melbourne was challenged by:
- Different and changing needs of organisations and levels of government.
- Continued uncertainty around timeframes and repeated short-term contracts.
- Development and implementation of a custom data platform being a low priority during the crisis.
- Lack of access to government-held data which could have assisted in demonstrating impact.

The crisis nature of COVID-19 and the developmental approach used by evaluators helped to break down some barriers, including:
- Demonstrating the complementary skills required for effective evaluation - i.e., design (knowing what to measure), evaluation (how to measure and analyse) and data/ICT systems (setting up systems for data collection, analysis and reporting).
- Valuing the need to employ dedicated data systems capability as early as possible.
- Valuing the routine gathering of qualitative data.

Finally, program logic modelling and a behavioural model of health service usage proved to be powerful tools to unpack the rapidly evolving programs and inform the development of case studies to showcase program development and lessons learnt.

Implications for evaluation theory/practice:
A flexible and impact-driven approach is required to evaluate public health interventions that change on the go. Even in crisis situations, it is paramount to invest upfront in designing, implementing and operating purposeful data systems.

Chair
avatar for Kate Williams

Kate Williams

Senior Research Fellow, Australian Health Services Research Institute, University of Wollongong
Dr Kate Williams is a senior research fellow at the Australian Health Services Research Institute, University of Wollongong, where she works in multi-disciplinary teams on commissioned research and evaluation projects. Kate has more than 20 years’ experience in research and evaluation... Read More →

Speakers
GM

Geraldine Marsh

Research Fellow, Australian Institute for Primary Care and Ageing, La Trobe University
Geraldine is a health services researcher and evaluator with more than 25 years’ experience undertaking research, evaluation, policy development and program delivery in public health, primary care and workplace health. Geraldine has contributed to and managed large-scale national... Read More →
avatar for Danielle Siler

Danielle Siler

Executive Lead C-19 Collaborative, IPC Health
Danielle is Executive Lead for the C-19 Collaborative at IPC Health. The C-19 Collaborative is a community health consortium in Victoria, Australia. IPC Health is the lead agency for this consortium. Danielle has operational oversight of the Collaborative’s funded programs including... Read More →


Tuesday August 30, 2022 2:30pm - 3:00pm ACST
Riverbank Room 2

2:30pm ACST

Weaving a Rebalance of Power and Advancing Equity through Culturally Responsive and Equity-Oriented Evaluation Practices: A Case Example
Veronica G Thomas (Howard University), Eva Sarr (The Centre for Multicultural Program Evaluation)

This mini workshop will provide participants with a step-by-step approach for incorporating culturally responsive and equity-focused principles and strategies in the evaluation planning, implementation, and reporting/dissemination of results.  Initially, participants will be given an overview of the major tenets and principles of culturally responsive and equity-focused evaluation.  Subsequently, using a relevant case study, as well as the facilitators' experiences, the participants will learn how to:
  • rebalance power and reconcile the perspectives of diverse communities and affected parties throughout the evaluation process
  • ensure protection of all participants through an equity-oriented perspective
  • ask evaluation questions that matter to different populations with different goals including ones that address equity, power rebalance, and systems drivers of inequality
  • incorporate cultural perspectives when establishing validity and rigor in evaluation design
  • analyze and report quantitative and qualitative data from an equity-oriented and culturally responsive perspective
  • incorporate strategies for messaging and disseminating evaluation findings in ways that are culturally appropriate, inclusive, accessible, useful, and actionable

During the workshop, participants will engage in small-group activities using culturally responsive and equity-focused principles to address questions raised by the case example. Through facilitator feedback and guidance, as well as small group discussions, the participants will strengthen their understanding of how culture and equity-related issues can be operationalized throughout the entire evaluation process. Participants will receive a list of relevant resources that can be used to deepen their understanding in this area.

Note: Participants are requested to bring mobile phones to the skills building session.    

Chair
avatar for Florent Gomez

Florent Gomez

Flo is a well-seasoned evaluator with over 15 years' experience in evaluation across consulting and government in Europe and Australia. He is currently an independent evaluator and chairs the New South Wales committee of the Australian Evaluation Society. Flo is passionate about bringing... Read More →

Speakers
avatar for Eva Sarr

Eva Sarr

CEO & Founding Director, The Center for Multicultural Program Evaluation
Eva Sarr is an indigenous Wolof woman of Serer ancestry from Sene-Gambia, in West Africa. She is also an indigenous Celtic-Scottish and Irish and 6th generation Australian woman with a multi-denominational background. Her father was Muslim and her mother, Catholic. Eva is a mixed... Read More →
avatar for Veronica Thomas

Veronica Thomas

Professor, Howard University
Veronica G. Thomas, PhD is a Professor in the Department of Human Development and Psychoeducational Studies at Howard University in Washington, DC (USA). She also serves as the Evaluation and Continuous Improvement (ECI) Director for the Georgetown-Howard Universities Center for Clinical... Read More →


Tuesday August 30, 2022 2:30pm - 3:30pm ACST
Room E2

2:30pm ACST

AES State of Evaluation Project
Charlie Tulloch (Policy Performance), Jade Maloney (ARTD Consultants), Robert Sale (Nous Group), Matt Wright (KPMG), Lauren Bierbaum (KPMG)


The State of Evaluation Project is being led by the AES Relationships Committee and commenced in mid-2021. A working group is guiding the project, with support from a project officer engaged to lead this research task.

The State of Evaluation project seeks to:
  • Generate an evidence-based report regarding the practice of evaluation across Australia
  • Better understand the perception of evaluation among those who commission or use evaluation outputs.

This interactive presentation seeks to share the draft findings of the project and to discuss emerging trends and patterns across the Australian evaluation sphere. The session's ambition is to seek feedback on these trends and patterns, and to identify gaps and other features of the field that the study may have overlooked.

Participants will benefit from hearing first-hand from the project team about the findings that have emerged, providing them with a contemporary understanding of the Australian evaluation environment. The evaluation community will benefit more broadly from this 'first-of-its-kind' study into the evaluation landscape and the major trends and patterns that may affect its future directions.

The findings discussed at this session will feed into the draft and final reports, with the study to be released in late 2022. Participants will have the opportunity to discuss their views further with the working group members at the conference and beyond.

The lead facilitator and support members will be determined by the State of Evaluation working group to provide sufficient skill and knowledge of the topic. The project officer will also likely be involved.




Chair
avatar for Susie Fletcher

Susie Fletcher

Senior consultant, Australian Healthcare Associates
Dr Susie Fletcher is an experienced health services researcher with over 50 peer-reviewed journal articles and 3 book chapters in mental health and primary care. She is passionate about improving health outcomes through integrating services across sectors; her recent work has focused... Read More →

Speakers
avatar for Jade Maloney

Jade Maloney

Partner & Managing Director, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research... Read More →
avatar for Matt Wright

Matt Wright

Director, KPMG
Matt Wright is a Director in KPMG’s Policy, Economics and Public Impact Practice. Matt is one of KPMG’s national technical leads for evaluation, and specialises in complex program evaluations for government clients, with a focus on the effectiveness and efficiency of policies... Read More →
avatar for Lauren Bierbaum

Lauren Bierbaum

Manager, KPMG
Lauren is a Manager in KPMG’s Policy, Economics and Public Impact practice. Lauren specialises in program evaluation for health, ageing and human services programs with a focus on culturally and linguistically diverse communities.
avatar for Robert Sale

Robert Sale

Director, Nous Group
Robert partners with leaders in government, not-for-profit and Aboriginal Community Controlled organisations to deliver effective and empowering outcomes for their clients. He has conducted evaluations and reviews of numerous programs, policies and organisations; developed evaluation... Read More →


Tuesday August 30, 2022 2:30pm - 3:30pm ACST
Riverbank Room 1

2:30pm ACST

'Old Ways of Weaving' - evaluation perspectives from the oldest living Culture. Case studies from a Community-led evaluation.
Victorian Aboriginal Community Controlled Health Organisation (VACCHO)

The Victorian Aboriginal Community-Controlled Health Organisation’s (VACCHO) ‘Healthy Communities’ program centres Culture, Kinship and Country to support Aboriginal people and their Communities to be healthy in mind, body and spirit. Under this program are self-determined projects led by Budja Budja Aboriginal Co-operative, Goolum Goolum Aboriginal Co-operative, and Moogji Aboriginal Council East Gippsland Inc. ‘Impact Yarns’, a methodology developed by Kowa Collaborations, is being utilised to understand the impact of the program and reflect on its meaning through a First Nations lens. Aboriginal ways of knowing, being and doing and the 8 ways of Aboriginal learning (Aboriginal pedagogies) are being applied to empower Aboriginal Community Controlled Organisations (ACCOs) and the Communities they serve to share their stories of change in ways meaningful to them – including through yarning, art, dance, film, and photos.

This panel, moderated by VACCHO, will celebrate the ways in which Aboriginal perspectives can enrich evaluation practice. The panellists, including Skye Trudgett (Kowa Collaborations) and representatives from the participating ACCOs, will share their stories and insights from conducting self-determined and culturally driven evaluations in their Communities, through discussion of the following key questions:

  • We may not call it ‘evaluation’, but Aboriginal Communities have existing ways of knowledge creation and understanding change, and have done so for millennia – what does this look and feel like in your Community?
  • How have you been able to apply cultural approaches and perspectives through the evaluation using ‘Impact Yarns’ and what results did you see?
  • How have Aboriginal voices been centred throughout the evaluation and how has the process ensured self-determination?
  • How might mainstream stakeholders (governments, funders, organisations, the wider evaluation community) improve their readiness to receive evaluations done in this way?
  • How might we create greater synthesis between Indigenous knowledges and non-Indigenous knowledges to enhance evaluation practice?

Chair
avatar for Michael Pilbrow

Michael Pilbrow

Founder and Chairman, Strategic Development Group
Michael Pilbrow is based in regional Australia from where he engages in an unusual mix of evaluation, co-operative development and disaster recovery and resilience work. Michael has led evaluations in Australia and globally in areas as diverse as mining governance, digital technology... Read More →

Speakers
avatar for Skye Trudgett

Skye Trudgett

CEO, Kowa
Skye is a Gamilaroi researcher who has contributed to numerous evaluations and research projects including place-based, systems change and government initiatives. Skye has a particular interest in Indigenous Data Sovereignty & Governance and seeks to support the practical application... Read More →
avatar for Tanisha Lovett

Tanisha Lovett

Cultural Engagement Worker, Goolum Goolum
My name is Tanisha Lovett and I am a proud Gunditjmara and Wotjobaluk woman based in Horsham. I am a young Aboriginal leader in my community. I work at Goolum Goolum, which is my local Aboriginal Co-operative, in the health and well-being team. I work with a range of Aboriginal and Torres... Read More →
avatar for Tammy Bundle

Tammy Bundle

Chief Executive Officer, Moogji Aboriginal Council East Gippsland Inc
Tammy Bundle is the Chief Executive Officer at Moogji Aboriginal Council East Gippsland Inc. She is a highly motivated professional with more than 11 years’ experience in Aboriginal health, financial management, leadership and creating positive clinical outcomes. She is committed... Read More →
avatar for Abbie Lovett

Abbie Lovett

Aboriginal Health Worker Trainee, Budja Budja Aboriginal Co-operative
I am a proud Gunditjmara Woman who has lived in the Gariwerd area with family for the majority of the last 30 years. I am currently working as an Aboriginal Health Worker Trainee at Budja Budja Aboriginal Cooperative, Halls Gap, Victoria. Over the past 12 months I have been involved... Read More →


Tuesday August 30, 2022 2:30pm - 3:30pm ACST
Hall D (plenary)

3:00pm ACST

Ethical evaluation in 3 dimensions
Keryn Hassall (Aptavit)

Ethical issues are woven through our work as evaluators. But most advice on evaluation ethics addresses only one dimension - reducing the risk of harm to participants. Avoiding direct harm is important but evaluation ethics is much broader. This presentation explains 3 dimensions of evaluation ethics - our ethical responsibilities for the design and conduct of evaluation, for knowledge production, and for the effects of evaluation findings and how findings are used. Policies and programs are not always neutral and ethical evaluation practice must consider risks of harm in all three dimensions.

Chair
avatar for Doyen Radcliffe

Doyen Radcliffe

Doyen Radcliffe is a Yamatji Naaguja man from the Midwest Region of Western Australia. Doyen is a community minded individual with a passion for empowering Indigenous communities to reach their real potential to improve quality of life, health, social and economic wellbeing, and inclusion... Read More →

Speakers
avatar for Keryn Hassall

Keryn Hassall

Aptavit
I'm interested in how people think - about evaluation, about policies and programs, about their work, and how people think about other people. I have two primary areas of professional focus: (1) Organisational capability and practice change - using organisation theory and practical... Read More →


Tuesday August 30, 2022 3:00pm - 3:30pm ACST
Room E1

3:00pm ACST

The ins and outs of weaving a collaborative evaluation model alongside the development of a national workforce development strategy.
Kerryn Garner (Emerging Minds), Joanna Schwarzman (Australian Institute of Families Studies), Claire Marsh (Emerging Minds), Melinda Goodyear (Emerging Minds)

The National Workforce Centre for Child Mental Health was established in 2017 to increase the capacity of Australia's health, community and social services workforces to contribute to the prevention and earlier identification of mental health issues in children aged 0-12 years. The National Workforce Centre is operated by Emerging Minds, a non-government organisation, which has resourced an internal evaluation team since 2019 alongside an external evaluation commissioned at program commencement in 2017. In addition to building an internal organisational evaluation culture, the team draws upon expertise from a number of external research and evaluation organisations.

Evaluation functions for the National Workforce Centre span project-level process and impact evaluation, component-level monitoring and evaluation, evaluation of co-design processes, informing organisational decision-making, and contributing to the evidence base on workforce capacity and children's mental health. Operational teams, community and stakeholder advisory panels and external evaluation partners participate in these evaluation functions.

This presentation will explore how weaving together internal and external evaluation functions has maximised the value of evaluation and organisational learning for Emerging Minds. It will also examine the facilitators and challenges to implementing this model and realising its benefits as both the project and its evaluation have matured.

The value of internal evaluators for evaluation capacity building has been recognised, and the independence of external evaluators, as well as the necessity of evaluation resourcing, have also been considered. However, there are fewer examples of an internal evaluation team and external evaluation partners developing an effective collaborative working relationship that provides the opportunity to realise a wide range of the benefits of evaluation.

Speakers
avatar for Martina Donkers

Martina Donkers

Independent Evaluator
I'm an independent evaluator with a background in program design, grants, and science communication. I have a Master of Evaluation, and I'm finding my sweet spot in qualitative and mixed methods evaluation with a complexity and systems lens. I like rubrics, semi-structured interviews... Read More →
avatar for Claire Marsh

Claire Marsh

Senior Research Officer, Emerging Minds
Claire has worked at Emerging Minds since the inception of its flagship program, the National Workforce Centre for Child Mental Health, in 2017. She works on the ongoing evaluation and continuous quality improvement of this program. Claire has a Bachelor of Health Science (Honours... Read More →
avatar for Kerryn Garner

Kerryn Garner

Senior Research Officer, Emerging Minds
Kerryn is part of the Research and Evaluation Team at Emerging Minds undertaking evaluation of the National Workforce Centre for Child Mental Health. She has a background in health and social services evaluation and communications, and has worked on projects across mental health... Read More →


Tuesday August 30, 2022 3:00pm - 3:30pm ACST
Room E3

3:00pm ACST

Towards better inclusion of gender and sexually diverse people in evaluation
Jeffery Adams (Massey University)

Gender and sexually diverse people are routinely overlooked in 'mainstream' program evaluation - but it doesn't have to be this way! Find out why this group is often neglected and what you can do about it as an evaluator. Practical tips and resources to address this issue will be provided, and the benefits of including gender and sexually diverse people outlined. Fuller inclusion and recognition of gender and sexually diverse people will result in higher-quality, more ethical evaluation practice and go some way towards achieving equity in health and wellbeing outcomes for this group.

Chair
avatar for Doyen Radcliffe

Doyen Radcliffe

Doyen Radcliffe is a Yamatji Naaguja man from the Midwest Region of Western Australia. Doyen is a community minded individual with a passion for empowering Indigenous communities to reach their real potential to improve quality of life, health, social and economic wellbeing, and inclusion... Read More →

Speakers
avatar for Jeff Adams

Jeff Adams

Associate Professor, SHORE & Whariki Research Centre, Massey University
I have led many research and evaluation projects across a range of public and community health and social service initiatives. In addition I maintain a competitively funded research program investigating sexuality, gender and health. I have experience in teaching program evaluation... Read More →


Tuesday August 30, 2022 3:00pm - 3:30pm ACST
Room E1

3:00pm ACST

Applying advanced technologies to evaluation
Ethel Karskens (Clear Horizon)

Innovative technological solutions have transformed nearly every industry over the past twenty years. These new ways of collecting, storing and analysing data can create new opportunities for the evaluation space.

Among them, three technologies in particular have the power to disrupt the field and unlock enormous potential for evaluators: the Internet of Things, blockchain technology, and Natural Language Processing.

More specifically, the Internet of Things can enable the collection of new types of data to evaluate impact; blockchain technology allows the decentralisation of collected information, bringing greater efficiency and transparency; and Natural Language Processing supports the analysis of large sets of qualitative data. Although these technologies are widespread and potentially strong allies for evaluators, they have not yet been leveraged in this space.

This presentation will review case studies and the opportunities they present for the evaluation space. It will show evaluators how to draw inspiration from new technologies in other fields and how to apply them to their own work.
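
To make the Natural Language Processing point concrete, the sketch below shows one way an evaluator might surface themes from open-text responses. It is illustrative only and not drawn from the presentation: it assumes Python with the scikit-learn library installed, and the survey responses are invented examples.

# Minimal sketch (not from the presentation): using NLP to surface themes in
# qualitative evaluation data. Assumes scikit-learn is installed; the responses
# below are hypothetical examples, not real data.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

responses = [
    "The program helped me feel more connected to my community",
    "Transport to the service was difficult and sessions were often cancelled",
    "Staff listened to us and adapted the sessions to our needs",
    "It was hard to attend because of transport and work commitments",
]

# Convert free-text responses into a document-term matrix
vectorizer = CountVectorizer(stop_words="english")
doc_term = vectorizer.fit_transform(responses)

# Fit a small topic model to surface recurring themes
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(doc_term)

# Print the top words for each theme so an evaluator can label them
terms = vectorizer.get_feature_names_out()
for i, weights in enumerate(lda.components_):
    top_terms = [terms[j] for j in weights.argsort()[-5:][::-1]]
    print(f"Theme {i + 1}: {', '.join(top_terms)}")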

Chair
avatar for Kate Williams

Kate Williams

Senior Research Fellow, Australian Health Services Research Institute, University of Wollongong
Dr Kate Williams is a senior research fellow at the Australian Health Services Research Institute, University of Wollongong, where she works in multi-disciplinary teams on commissioned research and evaluation projects. Kate has more than 20 years’ experience in research and evaluation... Read More →

Speakers
avatar for Ethel Karskens

Ethel Karskens

Data and Insights Lead, Clear Horizon
I lead the data and insights strategy of Clear Horizon. This includes dashboard development and other data solutions to create insights for our clients. I am interested in innovation, data for good, and creating a data-driven culture in organisations.


Tuesday August 30, 2022 3:00pm - 3:30pm ACST
Riverbank Room 2

3:00pm ACST

Honouring our First Nations Children – Our Stories, Our Knowledge, Our Values - Our Evaluation
Lucas Moore (QATSICPP), Candice Butler (QATSICPP)

QATSICPP created a Centre of Excellence to capture our own evidence base on effective service and system responses to support our children to grow up strong and safe in family, community, culture and country. We embed our Aboriginal and Torres Strait Islander knowledge systems and ways of knowing, being and doing in our research and evaluation approach. We walk audiences through how we uphold cultural principles and Indigenous data sovereignty, the elements of conducting evaluation as a collective research team and how we tell the story that emerges from evaluation so our communities that receive support have increased self-determination.

Chair
avatar for Doyen Radcliffe

Doyen Radcliffe

Doyen Radcliffe is a Yamatji Naaguja man from the Midwest Region of Western Australia. Doyen is a community minded individual with a passion for empowering Indigenous communities to reach their real potential to improve quality of life, health, social and economic wellbeing, and inclusion... Read More →

Speakers
avatar for Eva Ruggiero

Eva Ruggiero

Policy Officer, QATSICPP
I lived most of my life in Brisbane, but grew up near Chicago and am new to learning about my Cherokee Indian ancestry. I deliver project management support and research assistance for Aboriginal and Torres Strait Islander colleagues across Queensland's community controlled sector... Read More →


Tuesday August 30, 2022 3:00pm - 3:30pm ACST
Room E1

3:00pm ACST

Sustaining Indigenous youth futures in urban and regional places through culturally based programs: The Ngaramura Program.
Fiona Sheppeard (University of Wollongong), Kathleen Clapham (University of Wollongong), Michelle Wilson (Coomaditchie United Aboriginal Corporation), Lorraine Brown (Coomaditchie United Aboriginal Corporation), Narelle Thomas (Coomaditchie United Aboriginal Corporation), Kaitlen Wellington (University of Wollongong), Valerie Harwood (University of Sydney)

This presentation contributes to the Conference sub-theme of Weaving Perspectives. It brings together Aboriginal community members from Coomaditchie, a leading cultural organisation in the Illawarra region of NSW, and researchers from the Ngarruwan Ngadju First Peoples Health and Wellbeing Research Centre, with whom they have a long-standing collaborative relationship, to explore the findings of a mixed-methods evaluation. Ngaramura, in the Dharawal language 'See the Way', is a supportive pathway that assists Indigenous young people to re-engage with education through a cultural learning framework. Delivered since 2018, Ngaramura is an important program because it addresses the significant disparity between educational and employment outcomes for Indigenous young people through a place- and strengths-based approach to young people's learning. It offers Indigenous school students who have been suspended, or are at risk of suspension, an alternative culturally safe and structured environment with opportunities for both cultural and western academic learning. Ngaramura's philosophy and pedagogy is based on the Aboriginal cultural work of the Elders, incorporating an individualised education program for each young person. As one young person observed: 'Yeah I'm not really sure on it, but it (school) just feels different than before coming here... Before I used to just hate it, wouldn't really go. Now not so much. It feels better than before' (Ngaramura participant). Beginning with an interrogation of the intangible 'difference' that Ngaramura makes to the lives of young Indigenous people in the program, this presentation embraces the values and perspectives of First Nations people, offering multiple perspectives around Ngaramura and its impact on young Indigenous futures. The paper contributes to the body of knowledge in the field of evaluation by presenting new insights into how and why place-based, culturally based programs work in urban and regional Indigenous contexts.

Chair
avatar for Doyen Radcliffe

Doyen Radcliffe

Doyen Radcliffe is a Yamatji Naaguja man from the Midwest Region of Western Australia. Doyen is a community minded individual with a passion for empowering Indigenous communities to reach their real potential to improve quality of life, health, social and economic wellbeing, and inclusion... Read More →

Speakers

Tuesday August 30, 2022 3:00pm - 3:30pm ACST
Room E1

4:00pm ACST

Plenary two: community panel "Storytelling for systems change at community level: insights from the field"
In this session, Rachel Fyfe of Dusseldorp Forum, Jane McCracken, Executive Officer of Hands Up Mallee, and Alistair Ferguson, Executive Director of Maranguka, Bourke, will share their experiences of using storytelling to evaluate systems change work within their own communities and those they work with. Thea Snow, Director of the Centre for Public Impact ANZ, will facilitate a conversation exploring questions such as:

  • What are the strengths of using storytelling to measure and evaluate systems change work?
  • How does storytelling support community led approaches?
  • What does good storytelling look like?
  • What are the limitations of stories?

A session grounded in practice and place, this panel will be conversational, honest and based in lived experience, rather than theory. It will be a joint exploration with the audience of both the great potential that storytelling holds in helping us understand the impact of complex, community-led work, and some of the very real barriers which sometimes get in the way.

Speakers
avatar for Rachel Fyfe

Rachel Fyfe

Communications Manager, Dusseldorp Forum
Rachel Fyfe is the Communications Manager at Dusseldorp Forum, a family foundation that works alongside communities to create a better society for Australia. Rachel has worked in NFP communications for over 16 years and is currently exploring the role of storytelling in community-led... Read More →
avatar for Thea Snow

Thea Snow

Director, Centre for Public Impact ANZ
Thea leads CPI's work in Australia and New Zealand. Thea's experience spans the private, public and not-for-profit sectors; she has worked as a lawyer, a civil servant and, most recently, as part of Nesta's Government Innovation Team. She recently returned to Melbourne after spending... Read More →
avatar for Jane McCracken

Jane McCracken

Executive Officer, Hands Up Mallee
Jane McCracken is the Executive Officer at Hands Up Mallee, based in the Northern Mallee region of Victoria. Hands Up Mallee was established to bring local leaders and community together to address social issues and improve health and wellbeing outcomes. Hands Up Mallee works in partnership... Read More →
avatar for Alistair Ferguson

Alistair Ferguson

Maranguka, Bourke, Executive Director
Alistair was the Chairperson of the Bourke Aboriginal Community Working Party for more than ten years. Justice reinvestment is a core component of the Maranguka initiative. Key to Alistair’s community development is the belief in seeing communities truly empowered and taking responsibility... Read More →


Tuesday August 30, 2022 4:00pm - 5:30pm ACST
Hall D (plenary)
  Plenary

5:30pm ACST

AES 2022 Annual General Meeting
Join the Australian Evaluation Society (AES) Board as we celebrate another year’s achievements by members of the AES, and introduce newly elected Board members.

Chair
avatar for Kiri Parata

Kiri Parata

President, Australian Evaluation Society
Kia ora I'm Kiri, living on Kabi Kabi Country on the Sunshine Coast of Queensland. I share my time between here and my home in Aotearoa New Zealand where my iwi (tribal) affiliations are Te Ātiawa, Ngāti Toarangatira, Ngāti Raukawa, Ngāti Ruanui, and Ngāi Tahu. I'm completing... Read More →

Tuesday August 30, 2022 5:30pm - 6:15pm ACST
Hall D (plenary)
  Special session
 
Wednesday, August 31
 

9:00am ACST

Plenary three: Katina D’Onise "Public health challenges and evaluative thinking: rapid responses in the time of Covid-19"
Katina D’Onise, Executive Director, Prevention and Population Health Directorate, Wellbeing SA

In public health we are constantly striving to make the changes in society that will lead to sustainable, positive, meaningful changes in people’s lives. This is a highly complex and challenging task as we attempt to reform legislation and policy, to educate and communicate effectively, and to engage meaningfully with the myriad of different communities that make up our society. Key to our effectiveness is being data driven and understanding how effective we have been. Evaluation plays a key role in all components of public health work: structuring the thinking on what we hope to achieve and how, assessing how we are faring against what we planned, deciding whether we need to redirect the work and, finally, evaluating our effectiveness with a view to making decisions going forward. To increase our effectiveness, can we look to other fields of practice which (intentionally or otherwise) use evaluative thinking as a fundamental tenet of how they work, and which also operate in complex environments and seek to make sustainable change? These matters will be discussed, drawing on the context and learning from responding to the COVID-19 pandemic in South Australia.

Speakers
avatar for Katina D'Onise

Katina D'Onise

Executive Director, Prevention and Population Health Directorate, Wellbeing SA
Professor Katina D’Onise is Executive Director, Prevention and Population Health Directorate, in Wellbeing SA. Katina oversees a range of functions including Health Promotion, Cancer Screening and Epidemiology (data collections, data analysis, evaluation, IT systems). A Public Health... Read More →


Wednesday August 31, 2022 9:00am - 10:00am ACST
Hall D (plenary)
  Plenary

10:30am ACST

Using outcome hierarchies to drive consistent and efficient evaluation planning for Government Portfolios
Jasper Odgers (Department of Planning and Environment), Bethany Hanson (Clear Horizon)

Outcome hierarchies are not a new concept in evaluation. In this session, we will discuss how we have used them in a novel way to efficiently develop consistent evaluation plans for multiple programs in a large investment portfolio being delivered by the NSW Department of Planning and Environment.  
 
Between 2022 and 2030, the NSW Climate Change Fund (CCF) will invest over $2.5b across its portfolio of 45 programs. In an environment where evaluation resourcing and approaches can vary, how could we enable efficient and consistent evaluation planning without compromising the potential insights to be gained for learning and improvement? 
 
The new CCF Evaluation Framework uses outcome hierarchies to determine the key evaluation questions that each program will embed in its evaluation plan. This streamlines the evaluation planning process, which is critical where there are limited evaluation resources available to support program teams.
 
This approach also creates consistency across the evaluation plans of a diverse portfolio of programs. Programs map their outcomes, and ask corresponding KEQs, against broader strategic outcomes, facilitating the evaluation of impact not only at the program level, but at the policy and strategy levels as well.
 
We believe this approach could be usefully applied across other portfolios of programs where efficiency and consistency in program-level evaluation planning and reporting are important. It is easy to engage with regardless of evaluation experience or expertise, and is ideal for institutions with limited evaluation capacity and maturity.
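
As an illustration of the general idea only (not the Department's actual framework), the sketch below shows how an outcome hierarchy might be encoded so that each program's evaluation plan inherits a consistent set of key evaluation questions. The level names and KEQs are assumptions made for the example, and Python is used purely for illustration.

# Illustrative only: a simple encoding of an outcome hierarchy where each level
# carries the key evaluation questions (KEQs) that programs mapping to it inherit.
# Level names and questions are hypothetical.
OUTCOME_HIERARCHY = {
    "Activities and outputs": [
        "Was the program delivered as intended?",
    ],
    "Program outcomes": [
        "To what extent were the program's intended outcomes achieved?",
    ],
    "Strategic (portfolio) outcomes": [
        "How did the program contribute to the portfolio's strategic outcomes?",
    ],
}

def plan_keqs(mapped_levels):
    """Return the standard KEQs for the hierarchy levels a program maps to."""
    return [keq for level in mapped_levels for keq in OUTCOME_HIERARCHY[level]]

# A hypothetical program that maps to the top two levels of the hierarchy
print(plan_keqs(["Program outcomes", "Strategic (portfolio) outcomes"]))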

Chair
avatar for Rae Fry

Rae Fry

Senior Manager, ARTD
I joined ARTD in 2022 after 10 years in specialist evaluation and policy analysis roles in NSW Government and NGOs. I have strong experience in project management, stakeholder engagement and evaluation advice and coaching for program teams.

Speakers
avatar for Jasper Odgers

Jasper Odgers

Manager, Evaluation & Performance, NSW Department of Planning and Environment
Jasper is currently working within the NSW environment department to increase organisational evaluation maturity. He is deeply interested in improving the way we value the benefits of environmental programs, and helping them to measure their impact. He is interested in the rise of... Read More →
avatar for Bethany Hanson

Bethany Hanson

Principal Consultant, Clear Horizon
Beth is a progressive and creative professional with experience in development of logic-based monitoring and evaluation frameworks; evaluation capacity building and reporting; project management; governance arrangements; and government and non-government stakeholder engagement and... Read More →


Wednesday August 31, 2022 10:30am - 11:00am ACST
Riverbank Room 1

10:30am ACST

Evaluation in regulatory contexts: Same same but different, with perspectives from the experts and practitioners
Dana Cross (Grosvenor)

There is a limited body of knowledge available regarding the application of evaluation in regulatory contexts within Australia, and the similarities and differences between evaluation practice in this sector and in others. This session will facilitate a panel discussion between regulatory experts from the Victorian Environment Protection Authority and the Department of Jobs, Precincts and Regions on the modern value of evaluation in regulation and the wins and pitfalls of evaluating regulatory programs, with insights from real-life case studies offering attendees practical tips and tricks. The panel will be moderated by our Victorian Program Evaluation Lead, who will guide the discussion drawing on her expertise and on input from audience members gathered through prepared sli.do polls, to focus the conversation on areas of interest.

Chair
avatar for Carina Calzoni

Carina Calzoni

Director, Clear Horizon Consulting
Carina has over 15 years of professional evaluation experience working at the practitioner level with grassroots community groups, working within State government policy levels and consulting for government and not-for-profit sectors. She has a good understanding of local, state and... Read More →

Speakers
avatar for Dana Cross

Dana Cross

Grosvenor
Dana is a public sector expert, possessing over a decade of deep experience advising Government organisations on program evaluation, organisational review, optimisation, procurement and performance management. She is a member of Grosvenor’s Executive Leadership Team and Public Sector... Read More →


Wednesday August 31, 2022 10:30am - 11:30am ACST
Hall D (plenary)

10:30am ACST

Weaving together or patching things up: Are commissioners, implementers, and evaluators partners or antagonists?
Gordon Freer (Insight Strategies)

The SPRING programme worked with businesses in East Africa and South Asia to deliver impact for girls. It aimed to empower girls, but was also a very experimental programme - it was a key learning experience in how to develop an innovative, sustainable approach to achieving development goals by engaging the private sector. It was also a multi-donor programme (FCDO, Australian DFAT and USAID), which meant the unique needs of these groups had to be balanced. In addition, the programme had its own internal monitoring team and an external evaluation team. The two teams often found themselves telling a similar story, albeit from different perspectives and for different purposes.

The challenge faced by all the groups (donors, implementers, evaluators) was how to work together, how to deliver robust, valuable work, and how to ensure that all of this was done collegially, effectively and efficiently - and, ultimately, in a way that benefitted the programme.

This presentation reflects on an adaptive evaluation of an adaptive, complicated programme, where three partners worked together in different ways, in varying degrees and for different purposes. Both the implementer and the evaluator utilised their own monitoring data and could draw on the data and perspective of the other. The donors had access to all of this data, and oversaw the programme from a strategic position, ensuring the programme and the evaluation remained true to purpose. This unique tripartite relationship holds lessons for the broader evaluation community.

Chair
avatar for Michael Amon

Michael Amon

Director, Data Policy, Evaluation & Visualisation, Attorney-General’s Department
As with most of us here, I have a passion for evaluation and bringing evidence-based policy to the policy space. I've led and established a few evaluation units/teams in the federal government. I've also led policy development branches, learning & development teams, and have a PhD... Read More →

Speakers

Wednesday August 31, 2022 10:30am - 11:30am ACST
Room E1

10:30am ACST

Mind the innovation gap: the unfortunate tale of great programme design let down by traditional commissioning and contracting methods, and what this means for evaluators.
Judy Oakden (Pragmatica Limited - a member of the Kinnect Group), Julie Elliott (Independent evaluation consultant)

Are you a funder, provider or evaluator working on long-standing and profoundly entrenched complex, wicked problems? Have you noticed commissioning and contracting sometimes gets in the way of the programme's intended goals or services? Have you seen or had to use workarounds to make contracting possible? You are not alone. This session is for you.

LEARNING OBJECTIVES: This skill-building session is for funders, providers or evaluators looking for innovative ways to think through commissioning and contracting issues in programme and service implementation. This session may change how you design and evaluate programmes in the future.

WHY THIS SKILL IS IMPORTANT: New research shows commissioning and contracting can help or hamper innovative initiatives. Tensions abound in navigating traditional contracting rules and procedures associated with New Public Management. Evaluators who understand these tensions may be better able to explain gaps in programme and service delivery and suggest ways to address them.

This skill-building session will lean into some of the critical commissioning and contracting tensions. Instead of the prevailing rigid, predetermined ways of contracting for accountability only, this session will show how a complexity framing can help focus on more equitable service provision. We will introduce you to some new and successful ways of thinking about commissioning and contracting.

HOW WE WILL TEACH THIS SKILL: We will challenge some traditional assumptions about "good commissioning" and the implications for evaluation, such as:
• balancing the tensions between accountability and learning in fast-moving, emerging situations
• allowing for naturally occurring, serendipitous changes in service provision
• coping with the unanticipated, unplanned changes that self-organise in service delivery
• incorporating practical ways to feed back progress to funders and providers, ensuring projects stay on track.

We will use complexity-informed methods and approaches to quickly demonstrate an alternative way of looking at commissioning and contracting.

Chair
avatar for Gerard Atkinson

Gerard Atkinson

Director, ARTD Consultants
I am a Director with ARTD Consultants with expertise in: - program and policy evaluation - workshop and community facilitation - business analytics and data visualisation - market and social research - financial and operational modelling - non-profit, government and business... Read More →

Speakers
JO

Judy Oakden

Director, The Kinnect Group
Judy has held management roles in evaluation, market research and management consulting, and also worked in public relations. Judy shares a passion for finding better ways to help people navigate complexity and deal with the frustrating and seemingly intractable issues they face on... Read More →
avatar for Julie Elliott

Julie Elliott

Evaluator
Collaborator and evaluation scholar-practitioner committed to acknowledging the complexity inherent in all human settings.


Wednesday August 31, 2022 10:30am - 11:30am ACST
Room E2

10:30am ACST

Fellows forum: The Role of Evaluation in a Complicated World
Fellows of the Australian Evaluation Society

In many areas, the world is a complicated place. Policies are expected to work in varying environments with different or multiple target groups. What are the roles that evaluation can play in these situations? In this session, several AES Fellows will discuss their experiences and views when addressing this challenge. There will be ample opportunity to ask questions and seek the views of the Fellows on your own challenges when confronted with complicated policy situations.


Chair
avatar for Ghislain Arbour

Ghislain Arbour

Senior Lecturer, The University of Melbourne
Doctor Ghislain Arbour is a Senior Lecturer at the University of Melbourne where he coordinates the Master of Evaluation. Research and consultancy: A primary research interest of Dr Arbour is the clarification of necessary concepts in the analysis and judgement of the performance of... Read More →

Speakers
avatar for Rick Cummings

Rick Cummings

Emeritus Professor, Murdoch University
Rick Cummings is an Emeritus Professor in Public Policy at Murdoch University. He has 40 years of experience conducting evaluation studies in education, training, health, and crime prevention primarily for the state and commonwealth government agencies and the World Bank. He currently... Read More →
avatar for Nan Wehipeihana

Nan Wehipeihana

Director, Weaving Insights
Nan is the director of Weaving Insights (www.weavinginsights.co.nz) and a member of the Kinnect Group (www.kinnect.co.nz). Nan is a founding member of the Aotearoa New Zealand Evaluation Association (ANZEA) and Ma Te Rae, Māori Evaluation Association – the first Indigenous Eva... Read More →


Wednesday August 31, 2022 10:30am - 11:30am ACST
Room E3
  Special session

10:30am ACST

"...It's time we researched ourselves back to life"
Jen Lorains (Children's Ground), Amunda Gorey (Children's Ground)

At 'Ampe-kenhe Ahelhe' (Children's Ground (CG) in Central Australia) we are changing the way services are designed, delivered and evaluated. The CG Approach is a different system. It was designed by our First Nations leaders, as our solution - so we can change the future for our children. It is backing us as the decision-makers - we are in the driver's seat.

For too long our people have been the subjects not the leaders of evaluation and research: "Our people have been researched to death. It's time we researched ourselves back to life" (William Tilmouth, Senior Arrernte man, CG Chair). We believe the history of data collected about us and interpreted without us has seen policies developed that want to control us instead of empower us. At CG we are empowered as decision-makers and evaluators.

CG is a 25-year approach. We are working with our little ones now and walking with them and our families while they grow. We have evaluation in each community from the beginning.

Our 'Ingkerrekele Arntarnte-areme' (First Nations Governance Committee) have worked alongside our Western-trained evaluation staff to design, collect and analyse the numbers (quantitative) and stories (qualitative). Our staff are collecting data every day. We use this in our planning for learning and health delivery. We also look at all the data together - so all our families are involved in telling the story of how CG is going for our children, families and communities.

Our presentation will share how we are achieving First Nations leadership in Western evaluation and how this is empowering our communities. It will show how we are designing the data collection, and how we are creating the knowledge and evidence from our ways of thinking and a Western perspective. We will also share some early findings from our 25-year evaluation.

Chair
avatar for Lena Etuk

Lena Etuk

Director, Research & Evaluation, Cultural & Indigenous Research Centre Australia
I’m an applied Sociologist with 16+ years of experience in evaluation and social research. At CIRCA I lead an amazing team of research consultants from a huge range of diverse backgrounds. We specialise in qualitative evaluation and research with non-English speaking CALD and Aboriginal... Read More →

Speakers
JL

Jen Lorains

Director Research & Evaluation, Children's Ground
Jen Lorains is the Director of Research & Evaluation at Children’s Ground. She works with each community to evaluate and evidence the impact of Children’s Ground’s empowerment, systems reform and integrated service platform. Jen has undergraduate and postgraduate qualifications... Read More →


Wednesday August 31, 2022 10:30am - 11:30am ACST
Riverbank Room 2

11:00am ACST

Weaving evaluation capability through public sector practice: a reflexive conversation
Ruth Nicholls (National Indigenous Australians Agency), Anne Fordham (National Indigenous Australians Agency)

As evaluation capability practitioners within a government agency we seek to weave evaluative thinking throughout cycles of policy and program design and delivery. We work within the National Indigenous Australians Agency, where programs are complex, integrated and increasingly place based/responding to local needs. A key challenge is to more deeply embed evaluative thinking in practice.

We explore this through applying 'reflexive practice', a process of thinking about the context we work in, our positionality working in cross-cultural contexts, and the extent to which we can influence systems change to strengthen evaluation's role in public sector processes. In this presentation we share a reflexive conversation about our collective endeavours. Our work involves providing support, tools, technical knowledge, and building relationships.

The aim of this presentation is to show the value of reflexive practice to consider challenges and opportunities for strengthening organisational evaluation capability. We suggest reflexivity is critical to fostering 'group values', using evaluation to enable collective reflection and surface diverse understandings of how policies and programs work. Our discussion will draw on our experience as researchers on the topics of ethics and the social values of sustainability. This reflexive practice has the potential to extend evaluative thinking beyond 'conventional' program evaluation approaches, and weave different cultural values throughout evaluation, so vital to improving the evidence base about policies and programs.

Chair
avatar for Rae Fry

Rae Fry

Senior Manager, ARTD
I joined ARTD in 2022 after 10 years in specialist evaluation and policy analysis roles in NSW Government and NGOs. I have strong experience in project management, stakeholder engagement and evaluation advice and coaching for program teams.

Speakers
avatar for Ruth Nicholls

Ruth Nicholls

Senior Adviser, Performance Monitoring and Evaluation, National Indigenous Australians Agency
Dr Ruth Nicholls is a senior adviser in Performance, Monitoring and Evaluation at the National Indigenous Australians Agency. Ruth works as an in-house developmental evaluator, supporting projects through co-design processes. She has previously led the Evaluation Capability team... Read More →


Wednesday August 31, 2022 11:00am - 11:30am ACST
Riverbank Room 1

11:30am ACST

"Evaluation Readiness Weaving evaluative inquiry through the fabric of policy/program design, delivery, and performance"
Michael Cole (Department of Social Services), Kale Dyer (Department of Social Services), John Stoney (Department of Social Services)

Over the past decade, the Australian Government has progressively created an authorising environment for evaluation through its public management reform agenda. This has included the introduction of the Public Governance, Performance and Accountability Act 2013, the enhanced Commonwealth Performance Framework and, more recently, the Commonwealth Evaluation Policy and associated toolkit. Even so, previous experience indicates that such high-level imprimatur is not sufficient on its own to effectively embed evaluative inquiry and thinking into departmental policies and programs.

To seize such opportunities and embed them within the culture of an organisation and the Australian Public Service more widely, evaluation expertise must be of immediate practical value to policymakers and program managers, and it must be combined with organisational capacity building in performance and evaluation.

This session describes an evaluation unit within the Australian Department of Social Services adopting an Evaluation Readiness approach to achieve this. It involves working collaboratively with policy/program owners to co-design Program Logics, Performance Measurement Frameworks, and Performance and Evaluation Strategies. It also requires providing specialist evaluation expertise in combination with capacity/capability strategies, and developing collaborative relationships with the department's design, data and procurement specialists. This approach is showing promising signs of being able to weave evaluative inquiry into the fabric of policies, programs, pilots, and projects so that it's threaded throughout their design, implementation, and performance - as opposed to being a badge sewn on at the end.

The presentation will outline not just the critical elements of the approach but how an Evaluation Readiness approach supports good practice in performance and evaluation at multiple levels - macro (Australian Government), meso (organisational), and micro (policy/program). It will also outline the emerging signs of success observed at the latter two levels, as well as some emerging challenges being faced as it continues to be developed and delivered.

Chair
avatar for Michael Amon

Michael Amon

Director, Data Policy, Evaluation & Visualisation, Attorney-General’s Department
As with most of us here, I have a passion for evaluation and bringing evidence-based policy to the policy space. I've led and established a few evaluation units/teams in the federal government. I've also led policy development branches, learning & development teams, and have a PhD... Read More →

Speakers
avatar for John Stoney

John Stoney

John has been an internal evaluation practitioner within the Australian Government for nearly 10 years (currently with the Dept. of Social Services as a member of its central Evaluation Unit). He describes this as his 'full-time job', as he has also been (effectively part-time) at varying... Read More →
avatar for Michael Cole

Michael Cole

Assistant Director, Department of Social Services
Michael began his evaluation work 20 years ago in international development programs in Asia with AusAID, United Nations, Asian Development Bank, and many of the larger international non-government organizations. On returning to Australia a few years ago, initially, he worked in research... Read More →


Wednesday August 31, 2022 11:30am - 12:00pm ACST
Room E1

11:30am ACST

Weaving evaluation into a large government agency: Sharing the ATO Evaluation Hub's favoured methods, tools and activities to facilitate learning by current and emerging evaluators
Megan Lugg, Robert Grimshaw, and Abhishek Easwaran (Australian Taxation Office)

The Australian Public Service (APS) reform agenda emphasises the need to strengthen APS evaluation capability to deliver the best outcomes for all Australians. This need is recognised in the Department of Finance Commonwealth Evaluation Policy and Toolkit released in December 2021.

As a large government revenue collection agency, the Australian Taxation Office (ATO) shares this need but also faces specific challenges. Many priority interventions relate to risk and compliance, and to the need to engender community confidence in the Australian tax and superannuation systems. As such, evaluation capability building (ECB) and the selection and development of evaluation methods, tools and practices must consider this particular APS context.

Additionally, most ATO people in defined evaluation roles are emerging evaluators who are unsure how to navigate the wide range of evaluation tools and methods available to them. Where evaluation does happen, there is a strong focus on using experimental methods.

The Evaluation Hub was established to build the ATO's evaluation culture, capacity and practice. During this presentation Hub members will share favoured methods and artefacts to facilitate learning by current and emerging evaluators:
  • An evaluation maturity model to generate discussion around and describe a pathway towards a mature evaluation capability and culture
  • A SharePoint platform that provides a starting point for learning about evaluation and for engaging with tailored guidance, tools and methods
  • A glossary to establish a common language and shared understanding of evaluation terms and concepts
  • A service offer which is underpinned by collaboration, learning and fitness for purpose
  • An adaptation of the evaluation life cycle that helps us explain to internal clients what evaluation looks like and which phase we are at when working with them
  • An Evaluation Community of Practice as an open forum where ATO people interested in evaluation share experiences and learn from each other

Chair
avatar for Rae Fry

Rae Fry

Senior Manager, ARTD
I joined ARTD in 2022 after 10 years in specialist evaluation and policy analysis roles in NSW Government and NGOs. I have strong experience in project management, stakeholder engagement and evaluation advice and coaching for program teams.

Speakers
avatar for Liesl Harrold

Liesl Harrold

Manager, Bespoke Evaluation, Australian Taxation Office
Liesl works in the Australian Taxation Office (ATO), helping business areas build their evaluation culture, capacity and practice. With over 20 years of evaluation experience, Liesl has also worked for Queensland Treasury and Trade where she assisted government agencies, community... Read More →
avatar for Robert Grimshaw

Robert Grimshaw

Evaluation Manager, Australian Taxation Office
Robert works in the Australian Taxation Office Evaluation Hub, helping business areas to build their evaluation culture, capacity and practice. This includes providing advice and support for using fit-for-purpose evaluation to inform evidence-based policy and program delivery, and... Read More →


Wednesday August 31, 2022 11:30am - 12:00pm ACST
Riverbank Room 1

11:30am ACST

Found in translation
Susan Garner (Garner Willisson), Lucky Chhetri (Three Sisters Empowering Women of Nepal), Sheila Holcombe (Analytical Business Consulting)

This paper presents a story about a collaboration which developed over the last two years of the global COVID-19 pandemic between three Nepali women and two Australian evaluators.
The three women shared their perspectives about how women and girls from rural and remote parts of Nepal were empowered to become socially and economically independent.

The two Australian evaluators captured these perspectives to develop a monitoring and evaluation framework, and a strategic plan that will guide the work of the Nepalese women-led social enterprise into the future.
The perspectives of three Nepali women were woven into the language of evaluation to translate their story into practical ways to measure and report on the outcomes and impact of their work over the last 27 years and into the future.

Adopting a theory-based approach to evaluation, we developed a theory of change that drew upon historical, cultural, and philosophical perspectives for understanding the work of the women-led social enterprise, which had fostered the growth of independent, self-sufficient, decision-making women and girls from rural and remote areas of Nepal.

The monitoring, evaluation, and planning (MEP) framework developed through the collaboration explored the context of the social enterprise's work; developed a theory of change and outcomes chains; identified evaluation criteria and questions; and prepared monitoring and evaluation plans using participatory evaluation methods to monitor and evaluate the outcomes of the work. The strategic plan 2021-2023 identified key actions for moving beyond the COVID-19 pandemic.

The short paper will highlight the features of a unique collaboration between the three Nepali women and two Australian evaluators which helped to build evaluation capability in an international development context.

Chair
avatar for Lena Etuk

Lena Etuk

Director, Research & Evaluation, Cultural & Indigenous Research Centre Australia
I’m an applied Sociologist with 16+ years of experience in evaluation and social research. At CIRCA I lead an amazing team of research consultants from a huge range of diverse backgrounds. We specialise in qualitative evaluation and research with non-English speaking CALD and Aboriginal... Read More →

Speakers
avatar for Susan Garner

Susan Garner

Director, Garner Willisson
I see myself as an 'accidental' evaluator having come from a career in science and public policy. I managed my first evaluation project as a policy analyst in the health portfolio. With post graduate qualifications in public policy and public health I found a natural affinity to evaluative... Read More →


Wednesday August 31, 2022 11:30am - 12:00pm ACST
Riverbank Room 2

11:30am ACST

The great debate: should evaluation be the central source of evidence for government decision-making?
Eleanor Williams (Victorian Department of Health), George Argyrous (University of Technology Sydney)

Evidence-based policy making is held up by many as the goal for governments everywhere with the premise that decisions and investments should be made on the basis of 'what works'. Evaluation, as an evidence source seeking to answer questions of effectiveness and efficiency of policies and programs, is well-placed to contribute to the process but is valued to varying degrees depending on the context. Evaluation must compete with a range of influencing sources of information like population surveys, social media analysis, input from service users and expert advisers.

This debate will explore the role that evaluation should play in evidence-based policy by presenting both sides of the question: should evaluation be the central source of evidence for government decision-making? The presenters will discuss the challenges that face evaluators if evaluation is to be a central source of evidence and, alternatively, if it is not.
This debate-style presentation seeks to situate evaluation as part of the broader government policy-making process and weave the specific concept of evaluation into the broader context of evidence utilisation. Panellists will bring multiple perspectives including those of government decision-makers and academics responsible for generating high quality evaluation evidence for decision-making. The moderator will bring experience relevant to both sides of the debate and will work actively to engage audience participation in the topic.

The session will start with the two presenters setting out the case for each side, before inviting the audience to make their own contributions to the debate and concluding with a vote from session participants for which position has been most compelling.


Chair
avatar for Catriona Flavel

Catriona Flavel

Principal Adviser, Performance and Evaluation, Department for Environment and Water
Catriona is an experienced evaluator with proven expertise undertaking complex monitoring, evaluation and learning processes in a range of sectors including education, civil society, environment, private sector development and governance. Catriona has led a number of meta-evaluations... Read More →

Speakers
avatar for George Argyrous

George Argyrous

Head of Measurement, Evaluation and Learning, Paul Ramsay Foundation
avatar for Eleanor Williams

Eleanor Williams

Executive Director, Victorian Department of Health
Eleanor Williams is Executive Director of Policy and Strategy at the Victorian Department of Health and was previously the Director of their Centre for Evaluation and Research Evidence. Eleanor holds a Masters of Public Policy and Management and Masters of Evaluation from the University... Read More →


Wednesday August 31, 2022 11:30am - 12:30pm ACST
Hall D (plenary)

11:30am ACST

Developmental Evaluation Demystified
Jess Dart (Clear Horizon Consulting)

In this time of uncertainty and wicked societal challenges, developmental evaluation is coming of age! If you'd like to learn more about what it is and how to use it, come to this skill-building workshop.

Developmental evaluation (DE) is defined by its purpose - it is there to help develop things in real-time. It's a type of evaluation that establishes a continuous feedback loop between the evaluator and the team designing and implementing social change initiatives. This rapid feedback cycle can assist innovators working in complex or uncertain environments to test and adapt their design as it is being developed. DE is increasingly an accompaniment to place-based work, social innovation, systems change initiatives, and co-design.

Learning objectives:
  • To be able to explain what developmental evaluation is, and isn't
  • To know where to find resources to learn more.

In this skill-building session, a short overview will be provided, followed by a longer Q&A. Participants will be given a resource sheet to help them find out more. Come along and get your questions answered.

Chair
avatar for Gerard Atkinson

Gerard Atkinson

Director, ARTD Consultants
I am a Director with ARTD Consultants with expertise in: - program and policy evaluation - workshop and community facilitation - business analytics and data visualisation - market and social research - financial and operational modelling - non-profit, government and business... Read More →

Speakers
avatar for Jess Dart

Jess Dart

Chief Evaluator and Founder, Clear Horizon Consulting
Dr Jess Dart is the founder and Chief Evaluator of Clear Horizon, an Australian-based specialist evaluation company. Having received the 2018 Outstanding Contribution to Evaluation Award from the Australian Evaluation Society (AES), Jess is a recognised leader with over 25 years of... Read More →


Wednesday August 31, 2022 11:30am - 12:30pm ACST
Room E2

11:30am ACST

If you build it, they will come: Building organisational evaluative practice and capacity
Georgina Roberts (Grosvenor)

As evaluators, we are often focused on what is in front of us - the evaluation we are planning, the evaluation we are delivering, and how we will balance time, cost and quality objectives in doing so. Rarely do we get the opportunity to lift our lens to the organisational perspective, and look at how we can support organisations to improve their evaluative practice and capacity, to make best use of the evaluation work we are so good at delivering.

We found in researching this topic that there is a limited body of context-neutral knowledge regarding evaluation capacity building. While there were many articles exploring evaluation capacity building approaches in international development organisations or specific industry or sector contexts, there was little to provide a general approach which organisations might adapt and apply in building their own evaluation practices and capacity. Our presentation builds on the body of knowledge by drawing from work done by Taylor-Powell, Boyd and King to create a practical framework which is supplemented by a real-life case study from our work with the Environment Protection Authority (EPA) Victoria's Industry Guidance Unit. This session will give attendees a framework they can apply in their own organisations to build their evaluation practice and capacity, while benefitting from the lived experience of EPA Victoria in this regard.

The interactive session opens with an exploration of the work of Taylor-Powell, Boyd and King in providing a context-neutral framework for improving organisational evaluation practice and capacity, and with insights from EPA Victoria's own journey, before facilitating attendees through World Café discussions to develop their own framework for building evaluation capacity in their organisations. Attendees will be supported by complimentary hand-outs and tools provided by our facilitator, as well as by the insights of our facilitator and the members of their small group.

Chair
avatar for Ghislain Arbour

Ghislain Arbour

Senior Lecturer, The University of Melbourne
Doctor Ghislain Arbour is a Senior Lecturer at the University of Melbourne where he coordinates the Master of Evaluation. Research and consultancy: A primary research interest of Dr Arbour is the clarification of necessary concepts in the analysis and judgement of the performance of... Read More →

Speakers
avatar for Georgina Roberts

Georgina Roberts

Senior Manager, Grosvenor Performance Group
Georgina is one of Grosvenor Performance Group's most experienced evaluators, currently leading the Program Evaluation Practice. With ten years of experience planning and conducting evaluations at all stages of the program lifecycle, Georgina is highly competent in the development... Read More →


Wednesday August 31, 2022 11:30am - 12:30pm ACST
Room E3

12:00pm ACST

Weaving a Theory of Change into complex corporate, policy and program settings for better practice performance information and reporting: A Commonwealth perspective
Evie Cuthbertson (Grosvenor)

Increased public scrutiny and expectations of a return on public investment, together with legislative and regulatory requirements for agencies to frame and report a complete, robust performance story, have prompted many Commonwealth agencies to enhance their corporate performance frameworks.

Building capability in the use of a Theory of Change to frame corporate-level performance stories has been integral to transitioning to reporting on outcomes over which each of the entities has control.

This paper provides insights into why evaluative techniques are relevant and effective - beyond program evaluation. It will demonstrate how agencies, irrespective of whether they are responsible for policy design and advice and/or its implementation either individually or in partnership with other portfolio agencies, jurisdictions, or diverse stakeholder groups, are able to deliver better practice in performance reporting by weaving evaluative techniques into their performance measurement frameworks.

The merits of this approach include performance measures that endure through changes to the machinery of government, elections and changing government priorities, and the ability to demonstrate impact through an outcomes-based performance narrative.

Chair
avatar for Michael Amon

Michael Amon

Director, Data Policy, Evaluation & Visualisation, Attorney-General’s Department
As with most of us here, I have a passion for evaluation and bringing evidence-based policy to the policy space. I've led and established a few evaluation units/teams in the federal government. I've also led policy development branches, learning & development teams, and have a PhD... Read More →

Speakers
avatar for Evie Cuthbertson

Evie Cuthbertson

Senior Manager, Grosvenor Performance Group
Evie is a seasoned evaluation consultant. Her work is characterised as being sensible, practical and accessible. Over the last 20 years she has designed and delivered a range of evaluation related projects and worked with a diverse mix of... Read More →


Wednesday August 31, 2022 12:00pm - 12:30pm ACST
Room E1

12:00pm ACST

Innovative practices to support ex-post CBA evaluation
Danielle Spruyt (NSW Treasury), Aditya Prasad (NSW Treasury)

Ex-post cost-benefit analysis (CBA) provides a method to capture the full range of an implemented initiative's impacts (social, economic, environmental and cultural) and evaluate the net social benefits of an initiative to the NSW community.

The updated NSW Government evaluation guidelines include guidance on preparing for and undertaking ex-post CBA. This complements the NSW Guide to CBA and is designed to support the weaving together of appraisal, monitoring and evaluative practices to build strong evidence on the effectiveness and net social benefits of an initiative. Ex-post CBA supports comprehensive consideration of actual benefits and costs when drawing conclusions about an initiative, and contributes to building evidence to inform the design and appraisal of future initiatives.

Key topics covered in the presentation will include:
  • the challenges of ex-post CBA
  • the role of innovative practices in building:
    • monitoring to support economic evaluation
    • strong foundations in outcome evaluation
    • stakeholder engagement in identifying and quantifying benefits
  • assessing the strength of evidence.
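
For readers unfamiliar with the arithmetic that underpins ex-post CBA, the following is a minimal, generic sketch of comparing discounted actual benefits and costs to derive a net social benefit and a benefit-cost ratio. The figures and discount rate are illustrative only and are not drawn from the NSW Guide to CBA or the NSW evaluation guidelines.

```python
# Minimal, generic sketch of ex-post CBA arithmetic: discount observed (actual)
# benefits and costs to present values and compare them. All numbers are
# illustrative assumptions, not NSW guidance.

def present_value(cash_flows, discount_rate):
    """Discount a list of annual cash flows (year 0 first) to present value."""
    return sum(cf / (1 + discount_rate) ** year for year, cf in enumerate(cash_flows))

costs = [10.0, 2.0, 2.0, 2.0]      # actual costs, $m per year
benefits = [0.0, 5.0, 7.0, 9.0]    # monetised actual benefits, $m per year
rate = 0.07                        # illustrative central discount rate

pv_benefits = present_value(benefits, rate)
pv_costs = present_value(costs, rate)

print(f"Net social benefit (NPV): ${pv_benefits - pv_costs:.1f}m")
print(f"Benefit-cost ratio (BCR): {pv_benefits / pv_costs:.2f}")
```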

Chair
avatar for Rae Fry

Rae Fry

Senior Manager, ARTD
I joined ARTD in 2022 after 10 years in specialist evaluation and policy analysis roles in NSW Government and NGOs. I have strong experience in project management, stakeholder engagement and evaluation advice and coaching for program teams.

Speakers
avatar for Danielle Spruyt

Danielle Spruyt

Principal Economic Analyst, NSW Treasury
The Centre for Evidence and Evaluation (CEE) in NSW Treasury works to strengthen the quality of evidence that supports government decision-making. Danielle Spruyt (PhD) is a Principal Economist, and evaluation lead. Danielle has previously led development and evaluation of government... Read More →
avatar for Aditya Prasad

Aditya Prasad

Economic Analyst, NSW Treasury
Adi Prasad works in the Centre for Evidence and Evaluation in NSW Treasury. He specialises in providing advice and expertise on developing cost-benefit analyses and evaluations and has a strong interest in evidence-based policy. Adi has a background in policy development and education... Read More →


Wednesday August 31, 2022 12:00pm - 12:30pm ACST
Riverbank Room 1

12:00pm ACST

Valuing Collecting Institutions in the Northern Territory: Weaving Evaluation into the Sector
Alison Reedy (Northern Territory Government), Ilka Schacht (Northern Territory Government)

Collecting institutions such as museums, art galleries, archives and libraries need an evidence base that demonstrates their value to the economy and to society in order to attract continuing support from the community, government, and corporate and philanthropic donors. While there exists a general understanding of the intrinsic contribution of cultural institutions to the 'public good', the evaluation of collecting institutions is an emerging practice. In the Northern Territory (the Territory), monitoring and evaluation of cultural institutions is mainly limited to measuring tangible outputs. Counting the number of visitors, exhibitions, other public events or activities, publications, media presence, and revenue generation is common. In contrast, there is no monitoring and evaluation of the social return on the government's substantial investment of public money in such institutions.

The cultural and linguistic diversity of Aboriginal peoples is a defining element of the Territory and forms a backdrop to waves of migration and settlement of people from around the world. This cultural context, and the vast geographical size and sparse population of the Territory, influence where collecting institutions are located, their purpose and their functions. Many stakeholders have an interest in Territory collecting institutions, including residents, interstate and international visitors, and arts and cultural organisations and practitioners, with the main financial stakeholder being the Territory Government. All place different values on, and have different expectations of, Territory collecting institutions.

This presentation explores the development of a sector-wide monitoring and evaluation framework for collecting institutions, the first of its kind in any Australian state or territory. This new project is a complex undertaking. First, there is no standard approach to measuring the value of intangible cultural assets, which have multiple dimensions and a number of methodological approaches that could be used. Second, the priorities and values of collecting institutions should reflect the diversity and perspectives of their context, audiences, and stakeholders, and the development of a sector-wide framework necessarily requires the difficult untangling, weaving in and reconciling of the multiple threads of values, perspectives and expectations.

In this presentation, we examine the challenges of this untangling and weaving process, and the tying together of multiple methodological approaches, towards the development of an inclusive monitoring and evaluation framework for Northern Territory collecting institutions.

Chair
avatar for Lena Etuk

Lena Etuk

Director, Research & Evaluation, Cultural & Indigenous Research Centre Australia
I’m an applied Sociologist with 16+ years of experience in evaluation and social research. At CIRCA I lead an amazing team of research consultants from a huge range of diverse backgrounds. We specialise in qualitative evaluation and research with non-English speaking CALD and Aboriginal... Read More →

Speakers
AR

Alison Reedy

Northern Territory Government
Dr Alison Reedy is an education and social sciences researcher and evaluator working in Territory Families, Housing and Communities in the NT.


Wednesday August 31, 2022 12:00pm - 12:30pm ACST
Riverbank Room 2

1:30pm ACST

The Power of Many: How collaboration can strengthen evaluations
Cara Donohue (Charles Darwin University), Ruth Nicholls (Charles Darwin University), Gunjan Veda (Movement for Community Led Development)

How does the process of collaboration between evaluators work, particularly for multi-country, multi-organizational teams? What happens when researchers and practitioners come together to pool knowledge and lived experiences in an evaluation project? In this session, we will share our experience of undertaking a collaborative theory-based review of Community-Led Development, weaving together perspectives and expertise of academia-based evaluators and a team of Monitoring, Evaluation and Learning practitioners in the International Development field. We'll look at successes, challenges, and lessons learned from the collaborative process, as well as how such collaborations can strengthen the evaluation field.

As the world of evaluation increasingly seeks to incorporate diverse perspectives, values, and domains of expertise, examining the practicalities of working together can help us all become better evaluators and collaborators. Learning from each other, including multiple voices, and working together (and doing it well!) brings an increasing richness and depth to the field of evaluation while enhancing our perspectives and credibility. This is possible even for specialised approaches and methodologies such as Realist Synthesis. This session will provide practical take-aways both for evaluators engaging in collaborative projects, and for commissioners initiating them.

Chair
avatar for Laura Baker

Laura Baker

Principal, ACIL Allen

Speakers
avatar for Ruth Nicholls

Ruth Nicholls

Senior Adviser, Performance Monitoring and Evaluation, National Indigenous Australians Agency
Dr Ruth Nicholls is a senior adviser in Performance, Monitoring and Evaluation at the National Indigenous Australians Agency. Ruth works as an in-house developmental evaluator, supporting projects through co-design processes. She has previously led the Evaluation Capability team... Read More →
avatar for Cara Donohue

Cara Donohue

Research Associate, Realist Research Evaluation and Learning Initiative, Charles Darwin University
Cara Donohue is a Research Associate with the Realist Research Evaluation and Learning Initiative (RREALI) within the Northern Institute at Charles Darwin University. She specialises in realist and theory-based approaches to evaluation, research, and program design. Cara has a work... Read More →


Wednesday August 31, 2022 1:30pm - 2:00pm ACST
Riverbank Room 1

1:30pm ACST

Slow cooking and weaving evaluation into a 12-year community-led initiative
Jess Dart (Clear Horizon), Niall Fay (Fay Fuller Foundation), Jess Decampo (The Australian Centre for Social Innovation), Melinda Chiment (Clear Horizon), Mutsumi Karasaki (Clear Horizon)

Panelists:
  • Victoria Halburd – Fay Fuller Foundation, partner and funder
  • Jess Decampo – TACSI – the implementation partner
  • Dr Melinda Chiment – Clear Horizon – evaluator
  • Dr Mutsumi Karasaki – Clear Horizon – evaluator
  • Two community participants – from towns in South Australia
Moderator: Dr Jess Dart

This panel will explore the opportunities and challenges of evaluating a 12-year place-based, community-led initiative - and what we’ve learned from the first three years of evaluating Our Town.
Starting in 2019, Our Town has been supporting six regional towns in South Australia to reclaim the mental wellbeing of their communities through community-led action, with an even broader vision of using what we learn to create better conditions for community-led work.

With a strong commitment to weaving evaluation into the initiative, the evaluation team, Clear Horizon, were contracted for the full 12-year period. The evaluators were brought in at the inception phase of the initiative, with a strong focus on developmental evaluation as well as impact evaluation. In the current phase the evaluators are supporting the communities in building their own capability to undertake meaningful measurement, evaluation, and learning (MEL) systems at the community level.

The panel brings together diverse perspectives from across the initiative, including two community members who are leading the work in their towns, the Fay Fuller Foundation (funder and initiative partner), The Australian Centre for Social Innovation (social innovation partner), and Clear Horizon (the evaluators).

After brief introductions, we’ll kick off with a concise overview of Our Town and the evaluation journey before posing a series of questions to our panel to draw out the value and challenges of the evaluation from multiple perspectives. We will then open the floor to audience questions.

Chair
Speakers
avatar for Jess Dart

Jess Dart

Chief Evaluator and Founder, Clear Horizon Consulting
Dr Jess Dart is the founder and Chief Evaluator of Clear Horizon, an Australian-based specialist evaluation company. Having received the 2018 Outstanding Contribution to Evaluation Award from the Australian Evaluation Society (AES), Jess is a recognised leader with over 25 years of... Read More →
VH

Victoria Halburd

Social Impact and Communications, Fay Fuller Foundation
MC

Melinda Chiment

Principal Consultant, Clear Horizon
Melinda’s personal and professional mission is to connect communities with tools, data and resources to promote social good and inspire change. Melinda is trained in quantitative and qualitative research methods having received her PhD from the University of Queensland and has a... Read More →
avatar for Mutsumi Karasaki

Mutsumi Karasaki

Senior Consultant, Clear Horizon
Mutsumi Karasaki is a senior consultant in Clear Horizon’s Social Impact team. As an experienced anthropologist/sociologist, he is passionate about building an inclusive and caring society through collaboration and co-design. Having completed a PhD in Public Health, Mutsumi has... Read More →


Wednesday August 31, 2022 1:30pm - 2:30pm ACST
Room E1

1:30pm ACST

Citizen science: What is the value and the feasibility for evaluators and for our stakeholders?
Katherine Pontifex (Wellbeing SA), Yvonne Laird (University of Sydney)

Citizen science is an approach to actively engaging the public in the research process, for example in developing questions, designing the project methodology, data collection and analysis, and interpreting and disseminating findings. Interest in citizen science is growing amongst public policy stakeholders, including as an approach to gathering data to plan and evaluate programs and policies. Citizen science offers a useful approach to participatory evaluation, enabling evaluators to actively engage diverse members of the public in the evaluation process. This session will provide an introduction to citizen science and the types of community participation which it may facilitate.

To illustrate real-world use, there will be examples of its application in health sector projects as part of The Australian Prevention Partnership Centre's Citizen Science in Prevention project. This will include a case study of an evaluation project undertaken in 2021 and the evaluation lead will provide reflections on this experience.
Participants will then be guided through World Café discussions (including some online polling) to explore the potential value of citizen science in enabling participatory evaluation practices, and its value for project stakeholders - including funders (government and non-government) and the community. The discussions will then explore the scenarios in which this approach is most likely to be beneficial, seeking the insights of evaluation practitioners and stakeholders. This interactive approach will encourage participants to reflect on the opportunities for using citizen science methods within their practice.

Participants will leave this session with a broader understanding of the citizen science approach and the capability to assess its potential for use in their work. The presenters will provide a brief summary of the key discussion points to participants following the Conference via the Conference administration.

Chair
Speakers
avatar for Yvonne Laird

Yvonne Laird

Lecturer in Prevention and Health Promotion, The University of Sydney
Yvonne is a Lecturer in Prevention and Health Promotion based in the Prevention Research Collaboration at The University of Sydney. Yvonne is currently co-leading a program of work funded by the Australian Prevention Partnership Centre which seeks to build capacity in citizen science... Read More →
avatar for Katherine Pontifex

Katherine Pontifex

AES22 Program Co-Chair. Manager, Evaluation Services, Wellbeing SA
Katherine is the Manager, Evaluation Services at Wellbeing SA. She is an experienced evaluation expert with an extensive background working in government on health and social programs, policies and systems. Her evaluation practice is firmly grounded in a pragmatic approach with an... Read More →


Wednesday August 31, 2022 1:30pm - 2:30pm ACST
Room E3

1:30pm ACST

Rapid Insight Cycles: Generating usable insights 'on-the-run' for equity-focused decision-making at policy and operational levels
Nan Wehipeihana (Research Evaluation Consultancy | Kinnect Group), Judy Oakden (Kinnect Group), Kahiwa Sebire (Invalue Consulting)

In 2020 and 2021, the NZ Ministry of Health launched an equity-focused programme intended to positively affect the influenza vaccination rates of Māori. In the background, COVID-19 meant policy and operational decisions across all levels of the health system were changing rapidly. Learning 'on-the-run' was crucial to successful outcomes. The Ministry sought an evaluation partner that would track and unpack vaccination rates, contributing factors, and, importantly, equity for Māori as the programme rolled out.

We developed a new approach, Rapid Insight Cycles (RICs), to bring the best mix of available data, evidence and analytic insights to the Ministry at regular intervals. RICs are an agile and iterative approach, responding to information questions that emerge over the evaluation and weaving answers to the overall evaluation questions with each cycle.

In this panel, we will present RICs and discuss the experience of using them as an evaluation approach. The panel will discuss:

  • How do the evaluators manage resources, adapt methods and respond to timing needs? How do they maintain a clear line of sight on the key evaluation questions and intended evaluation outcomes while adapting?
  • What is the value of small, frequent data collection cycles and presenting findings regularly? What are the challenges and considerations of determining which data is presented to the client, when sharing all the data is not feasible?
  • Why was collaborative sense-making powerful in reflecting on and informing equity-focused decisions? 

Chair
avatar for Judy Gold

Judy Gold

Co-Director, Cultivating Change

Speakers
JO

Judy Oakden

Director, The Kinnect Group
Judy has held management roles in evaluation, market research and management consulting, and also worked in public relations. Judy shares a passion for finding better ways to help people navigate complexity and deal with the frustrating and seemingly intractable issues they face on... Read More →
avatar for Nan Wehipeihana

Nan Wehipeihana

Director, Weaving Insights
Nan is the director of Weaving Insights (www.weavinginsights.co.nz) and a member of the Kinnect Group (www.kinnect.co.nz). Nan is a founding member of the Aotearoa New Zealand Evaluation Association (ANZEA) and Ma Te Rae, Māori Evaluation Association – the first Indigenous Eva... Read More →
avatar for Kahiwa Sebire

Kahiwa Sebire

Kahiwa Sebire is a puzzling pattern-spotter, an enthusiastic solution finder and a life-long learner. Exploring possibilities and meaning-making with sticky notes and whiteboards (or their digital siblings) in tow.  Kahiwa Sebire has over 10 years’ experience working in and with... Read More →


Wednesday August 31, 2022 1:30pm - 2:30pm ACST
Hall D (plenary)

1:30pm ACST

Rapid prototyping evaluation
Matt Healey (First Person Consulting)

Most evaluators will have experienced barriers or constraints to the implementation of an evaluation at some point in their careers. Whether due to timing, budget, staffing or methodological constraints, we have all had issues like these to overcome.

A good evaluator is able to step outside their comfort zone and weave different ways of working into their practice to address these constraints. However, we don't know what we don't know, and so where might we start?

This interactive session will introduce participants to the rapid prototyping process. Rapid prototyping is a mindset and process often used by designers to quickly generate and test ideas with little impact on project resourcing. The session will take participants through the process as a way of introducing them to a very different way of undertaking what many will be experts in - evaluation planning.

However, in line with the conference theme of 'weaving', we will be incorporating tools and processes that designers use. Working in small teams, participants will be tasked with the development of the core features of an evaluation plan (a program logic, key evaluation questions and data collection approaches) which must account for a range of constraints - both common and uncommon.

It will be fast-paced, hands-on and require creative contributions from those participating (though observers are welcome). The structure of the session will include an introduction to rapid prototyping, introducing the scenario and scope, and outlining the materials that participants can use as part of their prototyping. The final stage of the session will be a whole group vote for who has 'best' addressed the constraints.

By the end of the session, participants will have applied rapid prototyping in an evaluation context. They will feel equipped to adapt the tool to their own practice, and will appreciate its value in different scenarios.

Chair
avatar for Kylie Berg

Kylie Berg

Director Business Development, Allen + Clarke Consulting
I am the Director of Business Development + Systems in Allen + Clarke's Melbourne office. I'm often the first person our Australian clients will speak to about their needs and how we could help. I’m delighted to be in Adelaide for AES22.

Speakers
avatar for Matt Healey

Matt Healey

Senior Consultant, First Person Consulting
I've been working as an evaluation consultant since early 2014. In that time I've led or contributed to over 100 projects spanning sustainability, climate change, public health, health promotion, emergency management, energy efficiency, renewable energy, waste management, innovation... Read More →


Wednesday August 31, 2022 1:30pm - 2:30pm ACST
Room E2

1:30pm ACST

AES Diversity and Inclusion Policy
AES Vitality Committee

The AES 2019-2022 strategic plan committed to developing and implementing an inclusion and diversity strategy as an important step in honouring the AES’ organisational vision. As part of this strategic work, the AES Vitality committee has worked with Allen + Clarke to develop a high-level Diversity and Inclusion Policy with input from the AES Board, Fellows, Committees and members. The Policy is intended to be a living document that captures the AES’ ongoing commitment to providing an inclusive and welcoming environment, recognising that, as a member-based organisation, diversity and inclusion is fundamental to the enrichment of all our activities. At this session, members of the Vitality Committee and Allen + Clarke will step through the process for developing the Policy and the components of the Policy, including the vision, definitions, commitment, roles, actions and accountabilities.



Chair
avatar for Duncan Rintoul

Duncan Rintoul

Director, Rooftop Social
ECB devotee, mentor in the AES group mentoring program, used to be on the AES board, still have heaps to learn. Keeping busy doing research and evaluation and facilitation work in education and justice and sustainability and health and you name it. Looking forward to catching up with... Read More →

Speakers
avatar for Eleanor Williams

Eleanor Williams

Executive Director, Victorian Department of Health
Eleanor Williams is Executive Director of Policy and Strategy at the Victorian Department of Health and was previously the Director of their Centre for Evaluation and Research Evidence. Eleanor holds a Masters of Public Policy and Management and Masters of Evaluation from the University... Read More →
avatar for Florent Gomez

Florent Gomez

Flo is a well-seasoned evaluator with over 15 years' experience in evaluation across consulting and government in Europe and Australia. He is currently an independent evaluator and chairs the New South Wales committee of the Australian Evaluation Society. Flo is passionate about bringing... Read More →


Wednesday August 31, 2022 1:30pm - 2:30pm ACST
Riverbank Room 2

2:00pm ACST

Tackling seven common challenges facing internal evaluators in education: A discussion on process, impact, and team culture
Lisa Walker (CSIRO), Christopher Banks (CSIRO), Julia Siddiqui (CSIRO)

CSIRO's Education and Outreach business unit is seeking to build an evidence-based, impact-focused approach to Science, Technology, Engineering and Mathematics (STEM) education program design, delivery, and improvement. To achieve this, a small team of specialised evaluators has been embedded within project teams, working to build an evaluation culture, understand what works and improve the effectiveness of the organisation's education programs. Five years and close to 15 programs into the journey, both the Impact and Evaluation team and program delivery staff have overcome many challenges and learnt a great deal.

CSIRO evaluators and program staff will participate in a facilitated conversation to reflect on progress to date and share their solutions to seven common challenges:
  • Funding of evaluation (building into project proposals)
  • Integrating impact planning
  • Navigating jurisdictional research approval requirements
  • Ensuring a strengths-based approach to evaluation
  • Integrating Indigenous research methodologies
  • Developing a set of common indicators across programs
  • Building a culture of continuous improvement

Attending this presentation will provide insights into how Australia's national science agency is evaluating its STEM programs, along with practical tips on how to build a culture of evaluation and evidence-based programming.

Chair
avatar for Laura Baker

Laura Baker

Principal, ACIL Allen

Speakers
avatar for Lisa Walker

Lisa Walker

Impact and Evaluation, CSIRO
I am a Principal Advisor within CSIRO's Education and Outreach Business unit in the Impact and Evaluation (I&E) Team. I have a Bachelor of Economics/Arts and a Master of Sustainability. As part of my role in CSIRO, I am responsible for developing evaluation plans and data collection... Read More →
avatar for Christopher Banks

Christopher Banks

Manager, Impact & Evaluation, CSIRO
Christopher Banks is the Manager of the Impact and Evaluation team in CSIRO's Education and Outreach unit. Chris holds a Bachelor of Psychology, a Master of Applied Social Psychology (with a focus on program evaluation), and a PhD involving neighbourhood disadvantage. Chris has over... Read More →
avatar for Julia Siddiqui

Julia Siddiqui

Executive Manager, CSIRO Education & Outreach, CSIRO
Julia Siddiqui is an Executive Manager in CSIRO Education and Outreach.  She has over 20 years experience in leadership and management roles across private, government and non-government organisations. Julia has a tertiary degree in Microbiology (Hons) and an advanced diploma in... Read More →


Wednesday August 31, 2022 2:00pm - 2:30pm ACST
Riverbank Room 1

3:00pm ACST

Plenary four: government panel "Weaving Evaluation in policy and program development"
State and Federal jurisdictions are increasingly using evaluation as a strategic business tool with many aiming to embed evaluation in everyday business. The panellists will discuss their approach to weaving evaluation into their organisations, their experience to date, the challenges they are facing and share the strategies they are adopting to support staff to embrace evaluation as a core activity in policy and program development and management.

Speakers
avatar for Suzanne Butler

Suzanne Butler

Director, Evaluation Leadership, Policy and Capability Unit, Australian Centre for Evaluation (ACE), Australian Government, Department of the Treasury
Suzanne currently leads a team in the Australian Centre for Evaluation (ACE) responsible for embedding good evaluation principles and practices across government and fostering an evaluative culture that supports continuous learning about what works, why, and for whom. This includes... Read More →
avatar for Jessica Hartmann

Jessica Hartmann

Branch Manager, Policy Analysis and Evaluation Branch, National Indigenous Australians Agency
Dr Jessica Hartmann has over twenty years of experience working across a range of agencies with research, advisory and policy functions at both the Australian and state government level, as well as internationally, mainly in the areas of water and fisheries. Currently Dr Hartmann is... Read More →
avatar for Danielle Spruyt

Danielle Spruyt

Principal Economic Analyst, NSW Treasury
The Centre for Evidence and Evaluation (CEE) in NSW Treasury works to strengthen the quality of evidence that supports government decision-making. Danielle Spruyt (PhD) is a Principal Economist, and evaluation lead. Danielle has previously led development and evaluation of government... Read More →
avatar for Eleanor Williams

Eleanor Williams

Executive Director, Victorian Department of Health
Eleanor Williams is Executive Director of Policy and Strategy at the Victorian Department of Health and was previously the Director of their Centre for Evaluation and Research Evidence. Eleanor holds a Masters of Public Policy and Management and Masters of Evaluation from the University... Read More →


Wednesday August 31, 2022 3:00pm - 4:30pm ACST
Hall D (plenary)
  Plenary
 
Thursday, September 1
 

9:00am ACST

Plenary five: Amy Gullickson "Soul and Maturity: On Being Evaluators"
Amy Gullickson (Centre for Program Evaluation, The University of Melbourne)

Evaluation is woven throughout government, organisations, communities, and life. By its nature, evaluation is intimately connected with the values of those entities and their people because the act of evaluation deals with something they think is important. So, while values may not always be explicitly addressed in our evaluation processes, they are always at the centre of the work. Thus, as evaluators, we are essentially touching the souls of these entities – their reason for being. Maybe this is why evaluative findings can provoke strong, varied reactions and responses from clients and stakeholders; protecting a soul is serious business.

So, what does it mean to be an evaluator, if we acknowledge that our work involves the souls of organisations and programs? The AES evaluator professional learning competencies focus on knowledge, skills, and techniques – but those do not necessarily prepare one for dealing with souls. What can we, as evaluators, weave into ourselves and our practice to be able to handle souls with integrity? I propose that attending to our maturity as individuals can help ensure evaluation honours and respects the souls with which we work, and thus, can help evaluation positively influence our world.

Speakers
avatar for Amy Gullickson

Amy Gullickson

Associate Professor, Director, Centre for Program Evaluation, The University of Melbourne
Associate Professor Amy Gullickson is the Director of the University of Melbourne Centre for Program Evaluation, which has been delivering evaluation and research services, thought leadership, and qualifications for more than 30 years. She is also a co-founder and current chair of... Read More →


Thursday September 1, 2022 9:00am - 10:00am ACST
Hall D (plenary)
  Plenary

10:30am ACST

Completing the picture: Building the tools to weave Indigenous paradigms into evaluation practice
Sara Hudson (NSW Aboriginal Land Council), Rachel Bertram (University of Technology Sydney)

Celebrating the diversity of Australia requires challenging the Western paradigms that currently dominate evaluation practice. Weaving Indigenous paradigms into evaluation practice involves recognising the distinctiveness of each approach and bringing them together to form a more complete picture. Understanding, celebrating, and integrating Indigenous paradigms into the way we measure success will enable evaluators to take Australia one step further towards reconciliation.

While research is needed to gain a deeper understanding of the way cultural values shape our metrics of success, we also need practical tools to integrate these learnings into evaluation practice to build capability across both Indigenous and non-Indigenous organisations.

We are building those tools!

A university, in partnership with an Indigenous community-controlled organisation, is building an Indigenous Evaluation Hub that provides free, reliable and accessible evaluation resources to community. This presentation will discuss the origins of the Indigenous Evaluation Hub and the research informing its development, allow time for sharing, and show how the practical tools can support you in your work.

Our work has highlighted that defining and measuring the success of Indigenous programs needs to take into account metrics that are relevant and meaningful to Indigenous communities. For example, evaluating whether an Indigenous program is successful involves assessing the extent to which the outcomes experienced align with the priorities, needs and cultural values of the Indigenous community the program has been designed for.

Drawing insights from the shared wisdom of Indigenous people, researchers and stakeholders from across the sectors, this project supports organisations to celebrate and integrate Indigenous paradigms into evaluation practice - providing them with practical tools and resources to embed Indigenous perspectives in their methods. It will provide a holistic view of community impact and encourage innovative ways of engaging community to include a greater diversity of voices in our narratives.

Chair
AR

Alison Reedy

Northern Territory Government
Dr Alison Reedy is an education and social sciences researcher and evaluator working in Territory Families, Housing and Communities in the NT.

Speakers
avatar for Sara Hudson

Sara Hudson

Senior Policy Officer and PhD Candidate, NSW Aboriginal Land Council
Sara is a Senior Policy Officer at NSW Aboriginal Land Council (NSWALC). She is responsible for managing and coordinating the development and implementation of a range of policy initiatives to support the NSW Aboriginal Land Council's strategic goals. Sara is also responsible for... Read More →
avatar for Rachel Bertram

Rachel Bertram

Co-founder & manager, Social Impact Toolbox, University of Technology Sydney
Rachel is a social impact and evaluation specialist, based at UTS Business School, University of Technology Sydney. She manages the design and delivery of the Social Impact Toolbox (https://socialimpacttoolbox.com/). She also works both within UTS and with the NFP sector, delivering... Read More →
avatar for Bronwen Dalton

Bronwen Dalton

Professor, UTS
Professor Bronwen Dalton is the Head of the Department of Management and the Director of the Masters of Not-for-Profit and Social Enterprise Program at the University of Technology, Sydney. She is also the founder of Ruff Sleepers, a charity that washes and grooms dogs of homeless... Read More →


Thursday September 1, 2022 10:30am - 11:00am ACST
Room E1

10:30am ACST

Learning together - Reflections on the evaluation of the Victorian place-based suicide prevention trial from the community, the policy maker and the evaluator
Anne Redman (Sax Institute), Merryl Whyte (Mildura Base Public Hospital), Andrew Dare (Department of Health, Victoria)

Since 2017, the Victorian Department of Health (DoH) has been partnering with Primary Health Networks (PHNs) to support 12 Victorian communities to trial a place-based approach to suicide prevention. The aim is to collaboratively develop and implement locally tailored, coordinated and evidence-informed approaches to build these communities' suicide prevention capacity. A Collective Impact approach was adopted, with the evaluation design and implementation embedded from the outset.

This panel discussion will reflect on our learning and the successes and challenges of embedding evaluation in a complex, place-based program from a number of different viewpoints: a coordinator who worked with their community to plan, collaborate and implement the trial; a DoH representative who provided the backbone for the state-wide implementation; and an evaluation team member who worked collaboratively across all levels of the system to design and implement the developmental, formative and summative evaluations.

The format will include:
  • Presentation (5 minutes) on the place-based suicide prevention trial and the evaluation (evaluator)
  • Presentation from the two panel members on their reflections on the successes and challenges of weaving evaluation in the trial (5 minutes each)
  • A panel discussion (facilitated by the evaluator, 25 minutes) exploring:
  • What was the value for your community/organisation of embedding evaluation (including a shared measurement system) in the trial? Were there challenges?
  • What evaluation approaches are important for supporting iterative and ongoing learning and decision-making at the community, organisational and state-wide level? Were there examples where this worked well? Any suggestions for how this could have been improved?
  • What principles do you think are important for ensuring evaluation has value at the community, state-wide and (potentially) national level?
  • If you were doing this evaluation again, what would you keep, and what would you change?
  • Questions from the audience (10 minutes)

Chair
avatar for Michaela Sargent

Michaela Sargent

CEO, Exemplar International
Michaela Sargent is the CEO of Exemplar International Development (www.exemplar-international.com), an Australian-based international development managing contractor focused on health, disability inclusion (health, education and employment) and addressing the health impacts of... Read More →

Speakers
avatar for Anne Redman

Anne Redman

Director- Evaluate, Sax Institute
I have more than 25 years’ experience managing large-scale research and evaluation projects for a wide range of government and non-government organisations, with expertise in managing complex evaluation projects across a diverse portfolio, including health, mental health and wellbeing... Read More →
avatar for Andrew Dare

Andrew Dare

Senior Policy Officer, Victorian Department of Health
Andrew is a Senior Policy Officer in the Victorian Suicide Prevention and Response Office. His professional interests include suicide prevention, postvention, and social policy more broadly. He has worked in Commonwealth, State and local governments as well as the not-for-profit and... Read More →


Thursday September 1, 2022 10:30am - 11:30am ACST
Hall D (plenary)

10:30am ACST

Rapid, rigorous, resource lite: Impact measurement for anyone
Min Seto (Alliance Social Enterprises - Australian Social Value Bank)

Increasingly, NGOs, governments and investors want to understand the social impact of their programs so they can make decisions about where to direct their limited resources. But social impact measurement is often so resource intensive that for many it is out of reach. So how can impact evaluation be made accessible to all?

Join this skill building session to learn about a new rigorous valuation methodology called Wellbeing Valuation and how the Australian Social Value Bank (ASVB) enables anyone to use it to calculate the social impact of their programs in dollar terms.

Importantly, those without the capability to perform their own social cost-benefit analysis (SCBA) will see how they can use the ASVB's online calculator to conduct rapid social impact assessments for themselves.

This session will take participants through a step-by-step demonstration of how the ASVB calculator can be used to understand the social value created by a program.

The application of the ASVB methodology allows for the comparison of different social programs in a way not previously possible. This means that Governments, service providers, and investors alike can use it to inform decisions about where to direct their resources so as to generate the most benefit to society, despite resource limitations for evaluation. The ASVB calculator can be used ex-ante to forecast the potential social value of a program; it can also be used ex-post to estimate how much social value was realised by a program.
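
To make the idea of 'social impact in dollar terms' concrete, the sketch below shows the general shape of such a calculation; the figures, the deadweight adjustment and the formula are hypothetical illustrations only and are not drawn from the ASVB's actual values or methodology.

    # Hypothetical illustration only - invented values, not ASVB figures or formulas.
    value_per_outcome = 4500     # assumed dollar value of one person achieving the outcome
    people_with_outcome = 120    # forecast (ex-ante) or measured (ex-post) number of people
    deadweight = 0.25            # assumed share of outcomes that would have occurred anyway
    program_cost = 250000        # total cost of delivering the program, in dollars

    attributable_value = value_per_outcome * people_with_outcome * (1 - deadweight)
    net_social_value = attributable_value - program_cost
    benefit_cost_ratio = attributable_value / program_cost

    print(f"Attributable social value: ${attributable_value:,.0f}")  # $405,000
    print(f"Net social value: ${net_social_value:,.0f}")             # $155,000
    print(f"Benefit-cost ratio: {benefit_cost_ratio:.2f}")           # 1.62

Run with forecast inputs, the same arithmetic supports an ex-ante estimate; run with measured inputs after delivery, it supports the ex-post comparison described above.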

This session will provide each participant with the knowledge to determine whether the ASVB is a good addition to their evaluation toolkit.

Chair
avatar for Mathea Roorda

Mathea Roorda

Senior consultant, Allen + Clarke
Values inform what matters, and are at the heart of evaluation. You literally can't get to an evaluative judgement without them. I'm interested in approaches to systematically identifying what matters, and for whom. Come talk with me about the values identification matrix (VIM) I... Read More →

Speakers
avatar for Min Seto

Min Seto

Executive Officer, Alliance Social Enterprises - Australian Social Value Bank
Min Seto has led the development of the Australian Social Value Bank and its management under the not-for-profit social enterprise, Alliance Social Enterprises. Min's background is in the Management of Community Services, but she has spent the last seven years advising and supporting... Read More →


Thursday September 1, 2022 10:30am - 11:30am ACST
Riverbank Room 2

10:30am ACST

Amplify Social Impact Online - An introduction to CSI's survey design, distribution and analysis platform
Elizabeth-Rose Ahearn (University of New South Wales)

Over the past 20 years, multiple research projects have been conducted to identify how the social sector, within Australia and globally, generates its own evidence base and implements evidence in practice. These studies have unanimously identified a lack of resourcing, internal capacity, clarity from funders, and ability to identify appropriate measurement tools and methods as the barriers preventing the sector from fully engaging with the evidence base. The Centre for Social Impact's Amplify Online is a world-first tech solution that aims to remove these barriers for the sector.

The platform provides a self-service repository of over 500 outcome indicators, classified into relevant outcome areas, and ranked for quality. Users can either search through the indicators, or have a survey designed for their program objectives based on automated text-based search functionality. The platform then distributes surveys to beneficiaries and stakeholders and offers automated analysis and benchmarking of the data collected via the platform.

Designed for the Australian non-profit sector, the platform is completely free for all Australian small and medium non-profits, with a minimal data hosting fee for large non-profits. This session will offer a comprehensive overview of the platform, including how to integrate it with existing or future evaluation activities, with the overall objective of increasing the effectiveness and societal benefit of non-profit programs.

Chair
avatar for Eden Robertson

Eden Robertson

Research and Evaluation Manager, Starlight Children's Foundation
Dr Eden Robertson is an experienced psychosocial researcher, with a special interest in supporting children with a serious illness and their families. Dr Robertson has completed a Bachelors of Psychology (Honours), Graduate Certificate in Adolescent and Young Adult Health and Wellbeing... Read More →

Speakers
avatar for Elizabeth-Rose Ahearn

Elizabeth-Rose Ahearn

Amplify Social Impact Manager, Centre for Social Impact, UNSW
Elizabeth-Rose Ahearn is a mixed-methods researcher with expertise in community sector program evaluation, psychometrics, program design and implementation. She is the Project Manager of the Centre for Social Impact’s innovative Amplify Social Impact Online (Amplify Online) – the social sector’s first online repository of impact measurement tools that are consistent, evidence-based and accessible.Under Rose’s guidance, Amplify Online has gone from concept to reality, supporting community sector organisations to easily identify the ou... Read More →


Thursday September 1, 2022 10:30am - 11:30am ACST
Riverbank Room 1

10:30am ACST

Problem-solving ethics: Weaving resources and strategies into navigating ethics in evaluation
Lisette Kaleveld (University of Western Australia), Samantha Abbato (Visual Insights People)

Ethical issues are woven throughout all stages of evaluation, and for this reason ethical guidelines are critical (such as the AES Ethical Guidelines and the National Statement on Ethical Conduct in Human Research). However, various dynamic relationships are also interwoven through evaluation work, and within these relationships, evaluators face different ethical responsibilities and constraints, which may play out in surprising ways. Existing resources assist with navigating this terrain - as do informal supports and strategies. In this workshop we will examine a case study of navigating ethics in the field, and use problem solving to identify useful strategies (including resources and actions) that can be applied – and perhaps come up with new ideas too.

This session is for people who want to collaboratively build practical strategies for effective ethical decision making. It follows the 2021 AES sessions on ethics in evaluation: a presentation seminar with five panellists and a follow-up interactive session involving more than sixty AES members. The outputs of this session will be coded and thematically analysed, and presented together with the case study to engage participants in furthering the identification of practical solutions for evaluators navigating ethics.

We will consider:
  1. Unique ethical challenges experienced by evaluators
  2. How evaluation ethics is situated in networks that include a range of relationships and agendas (i.e., Human Research Ethics Committees, commissioners, clients/practitioners, participants and evaluation peers)
  3. New ideas for practical ways to support each other in navigating ethics.

The session will open with a 10-minute introduction and 5 minutes for questions. Participants will move into small groups for a facilitated analysis and discussion, using analysed data from 2021, a case study and a solutions-focused approach. Group feedback and a final discussion will focus on identifying actionable strategies to build practical support for ethical evaluation practice.

Speakers
avatar for Emma Williams

Emma Williams

Associate Professor, RREALI CDU Maburra
Emma is a Credentialed Evaluator and an Associate Professor at Charles Darwin University in Australia, where she researches, conducts and teaches evaluation, particularly realist evaluation. She has moved between government, academe and private consulting in evaluation over more than... Read More →
avatar for Samantha Abbato

Samantha Abbato

Director, Visual Insights People
Dr. Samantha Abbato is an evaluation consultant and Director of Visual Insights. Sam has a passion for maximising evaluation use through effective communication and evaluation skill building using a pictures and stories approach and increasing the academic rigour of evidence. Sam's... Read More →
avatar for Lisette Kaleveld

Lisette Kaleveld

Research Officer, Centre for Social Impact
Lisette is a social researcher who has worked on research and evaluation projects conducted in academic contexts, as well as for the not-for-profit sector, as a public servant and as an independent consultant. Lisette’s evaluation experience spans education, health and mental health... Read More →


Thursday September 1, 2022 10:30am - 11:30am ACST
Room E3

10:30am ACST

How can we address environmental sustainability in our evaluations?
Patricia Rogers (Consultant)

Many evaluators and evaluation managers want to ensure that evaluations include consideration of environmental impacts and sustainability, given the numerous interconnected environmental crises the world faces, including climate change, deforestation, biodiversity loss, and declining air, water and soil quality. However, current evaluation practice and training do not usually include attention to environmental impacts unless these are stated objectives of programs or policies, nor do evaluation teams usually include both people with expertise in systematic evaluative investigation and reporting and people with expertise in natural systems.

This session will introduce participants to ways of incorporating environmental sustainability into the ways evaluations are framed, commissioned, managed, designed, undertaken and reported. It will introduce some tools that can be used by evaluators and evaluation managers, illustrated by examples, and provide an opportunity for participants to reflect on how they might apply these in their own practice. To support ongoing learning, participants will be provided with links to additional resources and networks. The session is intended to share lessons from recent innovations and support participants to contribute to these innovations.



Chair
avatar for Julie Elliott

Julie Elliott

Evaluator
Collaborator and evaluation scholar-practitioner committed to acknowledging the complexity inherent in all human settings.

Speakers
avatar for Patricia Rogers

Patricia Rogers

Evaluator and researcher, Footprint Evaluation Initiative
Founder of BetterEvaluation and former Professor of Public Sector Evaluation at RMIT University. Now working as consultant and advisor. My work has focused on supporting appropriate choice and use of evaluation methods and approaches to suit purposes and context. I am currently working... Read More →


Thursday September 1, 2022 10:30am - 11:30am ACST
Room E2

11:00am ACST

Culturally adaptive governance for culturally safe evaluation: Lessons from an Aboriginal majority owned consulting firm
Cat Street (Curijo), Belinda Kendall (Curijo)

The AES' First Nations Cultural Safety Framework provides clear and practical guidance on culturally safe evaluation theory, practice and use. We consider this framework from a governance perspective, focusing specifically on how organisational governance can create culturally safe environments and, in turn, achieve culturally safe evaluations.

Organisational governance offers opportunities for culturally safe evaluations by shifting the focus and narrative from a deficit approach that blames Indigenous people and communities - the historical approach of policy - to the system that has failed them. This presentation will reflect on the progress of an Aboriginal majority-owned consulting firm in implementing Duke's Culturally Adaptive Governance Framework (CAGF) in its expanding evaluation practice. The CAGF allows for governance that is conscious of power. It puts Indigenous governance, knowledges and cultures at the centre and pays attention to structural enablers of culturally safe evaluation practice. It also ensures that self-determination is at the heart of governance arrangements. As the firm expands its evaluation practice, we will reflect on the strengths and opportunities that arose through implementing the CAGF. We will share examples of how we are implementing the three defining attributes of the CAGF - invest, effect and foster - to ensure we meet the AES' principles of culturally safe evaluation.

Chair
AR

Alison Reedy

Northern Territory Government
Dr Alison Reedy is an education and social sciences researcher and evaluator working in Territory Families, Housing and Communities in the NT.

Speakers
avatar for Cat Street

Cat Street

Research, Monitoring and Evaluation Practice Manager, Curijo
avatar for Lauren O’Flaherty

Lauren O’Flaherty

Consultant, Curijo Pty Ltd
Lauren is a First Nations woman with connections to the Wonnarua People of NSW on her maternal side. Lauren works across Curijo's Consulting and Research, Monitoring and Evaluation Practices and was awarded one of the Emerging Indigenous Evaluators Grants for this conference. Lauren... Read More →


Thursday September 1, 2022 11:00am - 11:30am ACST
Room E1

11:30am ACST

Designing for impact with multicultural community-led projects: Evaluation of a Cancer screening grants program
Tove Andersson, Lauren Temminghoff, Osman Osman, Ayesha Ghosh, Kate Broun, Kerryann Lotfi-Jam (Cancer Council Victoria)

This paper describes the design and implementation of a community-led outcomes and impact evaluation for 11 multicultural community projects about cancer screening in Melbourne. There are many factors to consider when evaluating multicultural projects, including adapting the design to work in multiple languages, across diverse cultures, and with participants with varied literacy and digital literacy. Furthermore, demonstrating outcomes and impact for smaller-scale community-led projects is a challenge, as many designs and methods are disproportionate to the size and style of a community-based intervention.

The evaluation of the grants program provides a practical example of an evaluation that was successfully implemented using a codesign model with small multicultural organisations with varying project and evaluation experience, and was adapted for remote delivery during COVID-19. The evaluation measured outcomes and impact using a post-activity survey and a follow-up survey two months later, to see whether post-activity knowledge and intentions had translated into action. Participants also reflected on their capacity and implementation in a post-project qualitative interview. The evaluation was enhanced by simple and culturally appropriate tools codesigned by participating organisations; strong participant understanding of the project and evaluation aims; simple instructions and structure; factoring the evaluation into the plan and budget from the beginning; and adapting the design to remote delivery.

This design allowed the collective outcomes and impact of diverse smaller projects to be demonstrated to the funder, built the evaluation capacity of participants, and provided insights into evaluating remotely with multicultural communities. The project shows that rigorous evaluation of diverse multicultural projects is possible when attention is paid to collaborative planning, tailored tools, and appropriate training and support. Evaluating these projects captures critical evidence that supports the continuation of funding to grassroots organisations best placed to deliver community interventions.

Chair
avatar for Eden Robertson

Eden Robertson

Research and Evaluation Manager, Starlight Children's Foundation
Dr Eden Robertson is an experienced psychosocial researcher, with a special interest in supporting children with a serious illness and their families. Dr Robertson has completed a Bachelors of Psychology (Honours), Graduate Certificate in Adolescent and Young Adult Health and Wellbeing... Read More →

Speakers
avatar for Tove Anderson

Tove Anderson

Evaluation Coordinator, Cancer Council Victoria
Tove is an evaluation specialist with experience in applied social research and partnership work in international development, collective impact and public health programs. She completed her Master of Migration and Ethnic Relations at the University of Malmö in Sweden and has evaluated... Read More →


Thursday September 1, 2022 11:30am - 12:00pm ACST
Riverbank Room 1

11:30am ACST

"The boys have as much say as what we do around what happens": Co-design in indigenous youth programmes and evaluation design"
Gill Potaka-Osborne (Whakauae Research Services), Teri Albert (Te Oranganui Trust), Hayden Bradley (Te Oranganui Trust)

Tūngia te ururoa kia tupu whakaritorito he tutū o te harakeke. 

Clear the undergrowth so the new shoots of the flax will grow.

Every adult would tell you that interacting with teens can be challenging. Imagine evaluating with teens who are experiencing social isolation, are disconnected from their culture, and are involved in alcohol and other drugs. Throw Covid-19 restrictions into the mix and you will begin to understand what confronted a mature evaluator who undertook separate evaluations of two youth programmes delivered by a tribally led health and social services provider in Aotearoa New Zealand. Each of the programmes was founded on innovation and targeted marginalised Māori rangatahi (youth); however, that's where the similarity ended. One programme worked with rangatahi aged 16-24 years who wanted to be work ready, while the other worked with rangatahi aged 12-15 years who were disconnected from the community they lived in. The evaluator recognised early on that traditional evaluator data collection methods would not cut it and entered conversations with each of the programme teams. Being a Kaupapa Māori community evaluator was helpful for understanding the community landscape, but less helpful in the youth development space. The traditional evaluation process was flipped on its head: the programme facilitators undertook the bulk of the data collection with the rangatahi whilst the evaluator assumed the observer role. Guided by each of the programme facilitators, the evaluator then carried out traditional data collection activities such as face-to-face interviews and a survey. Analysis was completed by the evaluator and sent to each of the programme teams: first as a means of sensemaking, and second as a way of learning fundamental evaluation skills. Both programmes were committed to making a difference for the rangatahi they worked with and, in their distinct ways, were exemplars of youth programmes.

Speakers
AR

Alison Reedy

Northern Territory Government
Dr Alison Reedy is an education and social sciences researcher and evaluator working in Territory Families, Housing and Communities in the NT.
avatar for Hayden Clifford Bradley

Hayden Clifford Bradley

Team Leader, Te Oranganui
Ko Tainui te waka; Ko Karioi te maunga; Ko Waikato te awa; Ko Ookapu te marae; Ko Ngaati te Wehi te hapuu; Ko Tainui te Iwi; Ko Hayden Bradley toku ingoa. I am an active father to three young tamariki and raise my whānau in Whanganui with my beautiful wife. I have an absolute passion for rangatahi... Read More →
avatar for Wheturangi Walsh-Tapiata

Wheturangi Walsh-Tapiata

Mātaiwhetu (CE), Te Oranganui
Wheturangi Walsh-Tapiata is indigenous to Aotearoa New Zealand and descends from a number of the tribal communities along the lower half of the West Coast of the North Island. She is the Chief Executive Officer of an indigenous health and social service organisation called Te Oranganui... Read More →
avatar for Gill Potaka Osborne

Gill Potaka Osborne

Researcher, Whakauae Research Services
Ko Aotea te waka; Ko Ruapehu te maunga; Ko Whanganui te awa; Ko Ātihaunui-ā-Pāpārangi, Ko Raukawa ki te Tonga nga iwi; Ko Ngāti Tuera, Ngāti Pamoana, Ngāti Pareraukawa ngā hapū; Ko Pungarehu, ko Parikino, ko Koriniti, Ko Ngātokowaru Marae ngā marae. E rere kau mai te awa nui mai i... Read More →


Thursday September 1, 2022 11:30am - 12:00pm ACST
Room E1

11:30am ACST

Evaluating place-based approaches in schools: Reflections from evaluators and practitioners in Schools as Community Hubs
Hayley Paproth (University of Melbourne), Janet Clinton (University of Melbourne), Ruth Aston (University of Melbourne)

The focus of this panel is to discuss perspectives in the evaluation of schools that are community hubs, where school use is extended to communities for cultural, sporting and education events, and to access primary health and social services.

Given the rapidly changing nature of education and school settings in response to the COVID-19 pandemic, and the greater focus on supporting the wellbeing and health of students, teachers and the wider community, this research is timely and necessary for informing ongoing educational reform (McShane & Coffey, 2022; Winthrop et al., 2021).

The project offers an opportunity to test the impact of evaluative activity and evaluative thinking in the context of a dynamic and complex environment, as well as a value for money approach to economic evaluation and success case methodology. The speakers will present a synthesis of evidence gathered as part of the research project, Building Connections: Schools as Community Hubs, which involves researchers, evaluators, policymakers, architects, principals, and teachers (Building Connections, 2022). The project involves a nationwide survey, network mapping analysis, and the development of an implementation and evaluation framework.

In this panel, we will present three papers that highlight the evaluation challenges, and our response to these using success case methodology, value for money and an evaluation framework that embeds evaluative thinking and action. This will be followed by a discussion, moderated by the facilitator.
  • Applying the Success Case Method to evaluate Schools as Community Hubs effectiveness
  • Investigating the impact and cost-effectiveness of school-community partnerships
  • An Evaluation Framework for Schools as Community Hubs

Chair
avatar for Michaela Sargent

Michaela Sargent

CEO, Exemplar International
Michaela Sargent is the CEO of Exemplar International Development (www.exemplar-international.com), an Australian-based international development managing contractor focused on health, disability inclusion (health, education and employment) and addressing the health impacts of... Read More →

Speakers
avatar for Ruth Aston

Ruth Aston

Senior Lecturer, Assessment and Evaluation Research Centre (AERC)
Dr Ruth Aston is a Senior Lecturer at the Assessment and Evaluation Research Centre and Honorary Research Fellow at the Centre for Adolescent Health at Murdoch Children's Research Institute. Ruth has a background in public health, including working on a three-year national project... Read More →
avatar for Janet Clinton

Janet Clinton

Professor, University of Melbourne
Professor Janet Clinton is Director of the Centre for Program Evaluation (‘CPE’) at the Melbourne Graduate School of Education (‘MGSE’). She was previously the senior academic in Program Evaluation and the Academic Director for the School of Population Health at the University... Read More →
avatar for Hayley Paproth

Hayley Paproth

PhD Candidate & Research Assistant, University of Melbourne - MGSE
Hayley Paproth is a PhD Candidate and Research Assistant at the Centre for Program Evaluation, at the University of Melbourne.Her PhD is part of the Building Connections: Schools as Community Hubs research project, and she is investigating the outcomes of successful Schools as Community... Read More →
avatar for Katina Tan

Katina Tan

Research Fellow, University of Melbourne
Katina is a Research Fellow at the Centre for Program Evaluation, University of Melbourne. She is an education researcher, and qualified finance professional. Katina has more than 15 years of corporate experience spanning strategic and operational finance, assurance, and consultancy... Read More →


Thursday September 1, 2022 11:30am - 12:30pm ACST
Hall D (plenary)

11:30am ACST

Beyond justification: The potential of program theories and its radical implications for evaluation practice and program planning
Ghislain Arbour (Centre for Program Evaluation, University of Melbourne)

A program theory is often understood as an explanation of how a program produces a chain of outcomes that ultimately leads to the intended outcome. Unfortunately, by assuming the occurrence of an intended outcome in its very definition, this perspective skews the development of a program theory towards the search for an explanation that fits the intent of the program rather than one that reveals its most probable consequences.

This popular view of program theory can have serious consequences for program development and evaluation. Overall, it risks turning a program theory into a tool of justification rather than a tool of investigation that can challenge the causal assumptions behind a program and inform better policy alternatives.

This presentation aims to offer a different perspective on program theory, one that can unlock its full reflective potential for evaluation and program planning. To do so, it carefully establishes the essential characteristics of a program theory around the notion of causal explanation. From there, it logically derives a few lessons and challenges common current conceptions in evaluation practice.

Presentation Plan
The presentation will start with a short introduction to get to know the audience and its needs with respect to program theory (5 min.), followed by the main lecture (35 min.), and will end with a group discussion around a short practical application (10 min.).

Chair
avatar for Emma Williams

Emma Williams

Associate Professor, RREALI CDU Maburra
Emma is a Credentialed Evaluator and an Associate Professor at Charles Darwin University in Australia, where she researches, conducts and teaches evaluation, particularly realist evaluation. She has moved between government, academe and private consulting in evaluation over more than... Read More →

Speakers
avatar for Ghislain Arbour

Ghislain Arbour

Senior Lecturer, The University of Melbourne
Doctor Ghislain Arbour is a Senior Lecturer at the University of Melbourne where he coordinates the Master of Evaluation.*Research and consultancy*A primary research interest of Dr Arbour is the clarification of necessary concepts in the analysis and judgement of the performance of... Read More →


Thursday September 1, 2022 11:30am - 12:30pm ACST
Room E3

11:30am ACST

Advances in rubric implementation
Gerard Atkinson (ARTD Consultants), Jack Rutherford (ARTD Consultants), Julian King (Julian King & Associates Limited)

The increased uptake of rubric-driven evaluation has been a positive development for the field, and has improved the clarity, cultural sensitivity, and communication of evaluation activities. This presentation looks at a range of approaches being integrated at the cutting edge of rubric design and implementation:
  • innovations in co-creation of rubrics with stakeholders
  • multi-program evaluation using a single rubric
  • rubric testing, refinement and standardisation
  • rubric-driven data collection
  • dynamic rubric assessment based on individual stakeholder values
The talk will present examples of these novel approaches based on experience and lessons from the field.
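
As a purely hypothetical sketch of what a rubric can look like as a data structure with a simple synthesis rule (the criteria, descriptors and 'weakest criterion' rule below are illustrative only, and are not the approaches or tools presented in this talk):

    # Hypothetical rubric: ordered performance levels and per-criterion descriptors.
    LEVELS = ["poor", "adequate", "excellent"]

    rubric = {
        "Reach": {
            "excellent": "All priority groups are engaged",
            "adequate": "Most priority groups are engaged",
            "poor": "Few priority groups are engaged",
        },
        "Quality": {
            "excellent": "Delivery meets all agreed standards",
            "adequate": "Delivery meets most agreed standards",
            "poor": "Delivery falls short of agreed standards",
        },
    }

    def overall_rating(judgements):
        # Synthesise per-criterion ratings using a 'weakest criterion' rule.
        return min(judgements.values(), key=LEVELS.index)

    # Strong reach but only adequate quality yields an overall rating of 'adequate'.
    print(overall_rating({"Reach": "excellent", "Quality": "adequate"}))

A representation like this is one way ideas such as reusing a single rubric across programs, or driving data collection from the descriptors, could be operationalised.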


Chair
avatar for Mathea Roorda

Mathea Roorda

Senior consultant, Allen + Clarke
Values inform what matters, and are at the heart of evaluation. You literally can't get to an evaluative judgement without them. I'm interested in approaches to systematically identifying what matters, and for whom. Come talk with me about the values identification matrix (VIM) I... Read More →

Speakers
avatar for Julian King

Julian King

Director, Julian King & Associates
Julian specialises in evaluation and value for money. The Value for Investment evaluation system, developed through Julian’s doctoral research, combines evaluative and economic thinking to assess the quality, impact and value of policies and programs. Julian received the 2021 A... Read More →
avatar for Gerard Atkinson

Gerard Atkinson

Director, ARTD Consultants
I am a Director with ARTD Consultants with expertise in: - program and policy evaluation - workshop and community facilitation - business analytics and data visualisation - market and social research - financial and operational modelling - non-profit, government and business... Read More →
avatar for Jack Rutherford

Jack Rutherford

Consultant, ARTD Consultants
Jack joined ARTD in 2018 after completing his Honours thesis in behavioural ecology the previous year, during which he tried to teach colours to jumping spiders. He brings his skills and insights from ecology to the public policy ecosystem, thinking critically about evaluation... Read More →


Thursday September 1, 2022 11:30am - 12:30pm ACST
Riverbank Room 2

11:30am ACST

Where to next? Weaving values into evaluation practice
Amy Gullickson (University of Melbourne), Michael Harnar (Western Michigan University), Allison Clarke (University of Melbourne), Jill Thomas (University of Melbourne)

This interactive session follows on from the Long Paper presentation on the research commissioned by AES into the knowledge, skills and attributes (KSAs) related to the logic of evaluation and values in evaluation practice. In this session, our goal is to hear from you about how the general logic of evaluation features in your evaluation practice, your reactions to the findings, and potential next steps for the research and updates to the AES Evaluator's Professional learning competencies. The format will be as follows:
  • 10 minute recap of the findings to kick start discussion
  • 5-minute individual reflection: how participants use the logic of evaluation and what KSAs they associate with it. How does the logic of evaluation relate to you and your practice?
  • 20 minute small group roundtable discussions on the following questions:
  1. After seeing this research on the KSAs, which aspects do you feel like you need to learn? What would you like to learn and how would you like to learn it? What pathways for learning would be feasible and useful?
  2. What questions do you have about values and the general logic?
  3. To continue this research, where else should we look? Who should we talk to? What kind of evidence would you need to feel it was worth updating the competencies?
  • 15 minute sharing of findings and ideas from group discussions

Chair
avatar for Julie Elliott

Julie Elliott

Evaluator
Collaborator and evaluation scholar-practitioner committed to acknowledging the complexity inherent in all human settings.

Speakers
avatar for Amy Gullickson

Amy Gullickson

Associate Professor, Director, Centre for Program Evaluation, The University of Melbourne
Associate Professor Amy Gullickson is the Director of the University of Melbourne Centre for Program Evaluation, which has been delivering evaluation and research services, thought leadership, and qualifications for more than 30 years. She is also a co-founder and current chair of... Read More →
avatar for Allison Clarke

Allison Clarke

Evaluator & Research Fellow, Assessment and Evaluation Research Centre (AERC)
Allison (she/her) is an evaluation specialist with skills in monitoring and evaluation for community and organisational learning. Allison works on Wurundjeri Woi Wurrung Country at the Assessment and Evaluation Research Centre AERC (formerly known as the Centre for Program Evaluation... Read More →
avatar for Jill Thomas

Jill Thomas

Director, J.A Thomas & Associates
Jill is an experienced evaluator and analyst, having worked in the health, higher education and finance sectors in major cities and far northern Queensland. Jill specialises in working with organisations to design and implement performance monitoring and evaluation frameworks, conduct... Read More →


Thursday September 1, 2022 11:30am - 12:30pm ACST
Room E2

12:00pm ACST

Measuring community resilience after disasters
Michael Pilbrow (Strategic Development Group)

After a disaster, hearing stories is a valuable way to learn from people in a traumatised rural community at the heart of a recovery effort. Strategic Development Group worked with the Business Council of Cooperatives and Mutuals to understand the contribution of cooperatives in recent bushfires, cyclones and floods. We established the crucial, often overlooked role that cooperatives play due to their social connections, high levels of trust, risk pooling and efficient communication avenues (see https://coopfarming.coop/wp-content/uploads/2021/09/Primary-producer-co-operatives-the-beating-heart-of-community-resilience-and-recovery.pdf). We recommend that cooperatives be given a seat at the table with government and others in building community resilience and preparing for future disasters.

Chair
avatar for Eden Robertson

Eden Robertson

Research and Evaluation Manager, Starlight Children's Foundation
Dr Eden Robertson is an experienced psychosocial researcher, with a special interest in supporting children with a serious illness and their families. Dr Robertson has completed a Bachelors of Psychology (Honours), Graduate Certificate in Adolescent and Young Adult Health and Wellbeing... Read More →

Speakers
avatar for Michael Pilbrow

Michael Pilbrow

Founder and Chairman, Strategic Development Group
Michael Pilbrow is based in regional Australia from where he engages in an unusual mix of evaluation, co-operative development and disaster recovery and resilience work. Michael has led evaluations in Australia and globally in areas as diverse as mining governance, digital technology... Read More →
avatar for Sarah Dyer

Sarah Dyer

Principal Consultant, Strategic Development Group
I have 30 years experience in development with expertise in gender equality and disability and social inclusion in policy and program design and in facilitating inclusive monitoring and evaluation processes. In all aspects of my practice I seek to ensure that voice is given to people... Read More →


Thursday September 1, 2022 12:00pm - 12:30pm ACST
Riverbank Room 1

12:00pm ACST

Assessing Evaluation Capacity: Introducing the Evaluation Capacity Health Check
Judy Gold (Cultivating Change), Anna Vu (HealthWest), Kate Baker (HealthWest), Emma Thomas (Cultivating Change)

Social purpose organisations are increasingly expected to prioritise accountability, learning and improvement. To support this, Cultivating Change, HealthWest and primary prevention partners co-designed an 'Evaluation Capacity Health Check' to assess current evaluation capacity. The tool allows self-assessment of 17 capabilities across four domains: Leadership and Culture, Staff Capacity, Systems and Structures, and Collective MEL Efforts. Described by the partners as 'revolutionary', the simple tool is available online and can be used by individuals, teams and entire organisations. The resulting assessment can then be used to identify opportunities for improvement, including advocating for increased organisational focus on, and investment in, evaluation.
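
As a hypothetical illustration of how self-ratings might be rolled up into domain-level scores to highlight where to focus improvement (the rating scale, capability names and scoring below are invented for this sketch and do not describe the Health Check's actual design):

    # Hypothetical roll-up of 1-5 self-ratings into domain averages; not the Health Check's scoring.
    responses = {
        "Leadership and Culture": {"Leaders champion evaluation": 4, "Learning culture": 3},
        "Staff Capacity": {"Evaluation skills": 2, "Time allocated to evaluation": 2},
        "Systems and Structures": {"Data systems": 3, "Evaluation budget": 1},
        "Collective MEL Efforts": {"Shared indicators": 2, "Reporting with partners": 3},
    }

    domain_scores = {
        domain: sum(ratings.values()) / len(ratings)
        for domain, ratings in responses.items()
    }

    # List domains from weakest to strongest to flag where to advocate for investment.
    for domain, score in sorted(domain_scores.items(), key=lambda item: item[1]):
        print(f"{domain}: {score:.1f}")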

Chair
avatar for Eden Robertson

Eden Robertson

Research and Evaluation Manager, Starlight Children's Foundation
Dr Eden Robertson is an experienced psychosocial researcher, with a special interest in supporting children with a serious illness and their families. Dr Robertson has completed a Bachelors of Psychology (Honours), Graduate Certificate in Adolescent and Young Adult Health and Wellbeing... Read More →

Speakers
avatar for Judy Gold

Judy Gold

Co-Director, Cultivating Change
avatar for Emma Thomas

Emma Thomas

Co-Director, Cultivating Change
Emma is an experienced evaluation practitioner with over 15 years working for social purpose organisations. Emma helps organisations plan, collect and use evidence to support performance measurement, decision making, learning and practice improvements. Emma has worked with a range... Read More →


Thursday September 1, 2022 12:00pm - 12:30pm ACST
Riverbank Room 1

12:00pm ACST

Cake and competencies: how we evaluated ourselves and reinvigorated our professional development as self-taught evaluators
Kate Williams (University of Wollongong), Darcy Morris (University of Wollongong), Karen Quinsey (University of Wollongong)

Long ago and far away, I was introduced to program evaluation. A colleague tried, in vain, to convince me it was different to research. Fast forward 22 years and I am guiding regular professional development of a multidisciplinary team of evaluators, many of whom are - like me - highly experienced yet self-taught.

This presentation will describe how our 'Evaluation Special Interest Group' used the AES competency framework to evaluate ourselves as evaluators and plan learning activities. Thus, our professional development program became more engaging and systematic, as demonstrated by a (not strictly scientific) follow-up survey.

A cake roster also helped.

Chair
avatar for Eden Robertson

Eden Robertson

Research and Evaluation Manager, Starlight Children's Foundation
Dr Eden Robertson is an experienced psychosocial researcher, with a special interest in supporting children with a serious illness and their families. Dr Robertson has completed a Bachelors of Psychology (Honours), Graduate Certificate in Adolescent and Young Adult Health and Wellbeing... Read More →

Speakers
avatar for Kate Williams

Kate Williams

Senior Research Fellow, Australian Health Services Research Institute, University of Wollongong
Dr Kate Williams is a senior research fellow at the Australian Health Services Research Institute, University of Wollongong, where she works in multi-disciplinary teams on commissioned research and evaluation projects. Kate has more than 20 years’ experience in research and evaluation... Read More →
avatar for Darcy Morris

Darcy Morris

Research Fellow, Australian Health Services Research Institute, University of Wollongong
Darcy has a strong and diverse set of research skills and applies this expertise to a multitude of regional and national health service research, development and evaluation projects covering diverse health-related content areas. He has extensive experience in quantitative and qualitative... Read More →


Thursday September 1, 2022 12:00pm - 12:30pm ACST
Riverbank Room 1

12:00pm ACST

Cumulative evaluation in practice: Sharing the power of systematically evaluating small community events and activities to increase evaluators' ability to report on project outcomes.
Madeleine Bing-Fish (Wellbeing SA), Clare McGuiness (Wellbeing SA)

This presentation provides an example of systematically evaluating community activities across three South Australian bushfire-affected communities. It will demonstrate how cumulative data collection is increasing the ability to report on the overarching project's impact in supporting communities' mental health and wellbeing following the 2019-20 bushfires.

This presentation aligns with the sub-theme Materials, Patterns and Practices by modelling an innovative way to capture small amounts of data cumulatively over the life of a project. It will demonstrate how the use of consistent measurement questions, visually appealing surveys and a simple question mapping tool is contributing to a cumulative analysis of project outcomes.



Chair
avatar for Eden Robertson

Eden Robertson

Research and Evaluation Manager, Starlight Children's Foundation
Dr Eden Robertson is an experienced psychosocial researcher, with a special interest in supporting children with a serious illness and their families. Dr Robertson has completed a Bachelors of Psychology (Honours), Graduate Certificate in Adolescent and Young Adult Health and Wellbeing... Read More →

Speakers
avatar for Madeleine Bing-Fish

Madeleine Bing-Fish

Principal Partnerships Officer, Wellbeing SA


Thursday September 1, 2022 12:00pm - 12:30pm ACST
Riverbank Room 1

12:00pm ACST

Weaving together - learnings from the field when working with diverse knowledges towards transformative practice
Stephanie Harrison (Pandanus Evaluation)

The purpose of the presentation is to share learnings from my experiences working in the field as an emerging evaluator. This presentation is a reflection on how I situated myself 'within the weave' of a recent evaluation commission by applying my own cultural lens as well as working to centre First Nations' voices and perspectives in the evaluation process.

The evaluation process was intended to facilitate the development of contextually and culturally responsive evaluation practice aimed at utilisation of findings. The key concept was how to develop processes and adapt my positioning to ensure I was facilitating a learning journey that was inclusive, accessible, and guided by the values of the client. Meaningful collaboration, regular communication, and being able to 'sit' in the uncomfortableness of not-knowing were central to the process.

The main findings will be shared as a reflection and discussion on my experiences working with West Papuan political refugees and arts activists on a complex and politically volatile project. This reflection on practice and the learning journey is important for other evaluators working to support First Nations' self-determination and culturally centred practice. I will discuss some of the challenges faced when working with and across diverse perspectives.

This presentation will contribute to the body of knowledge in the field of evaluation by describing the transformation that can occur when evaluators are open to weaving together perspectives and sitting with the uncomfortableness of 'not knowing'.

Chair
AR

Alison Reedy

Northern Territory Government
Dr Alison Reedy is an education and social sciences researcher and evaluator working in Territory Families, Housing and Communities in the NT.

Speakers
avatar for Stephanie Harrison

Stephanie Harrison

Evaluation Consultant, Pandanus Evaluation
Stephanie grew up in Central Australia and has worked on a diverse range of community art and cultural programs both as a facilitator and program manager. She recently completed her Master of Evaluation through the University of Melbourne and lives on a small farm in lutruwita/Tasmania... Read More →


Thursday September 1, 2022 12:00pm - 12:30pm ACST
Room E1

1:30pm ACST

Providing a pathway through complexity: using 'simple rules' to guide evaluation of a transformational change program in mental health care
Kate Williams (University of Wollongong), Cristina Thompson (University of Wollongong)

The Australian public mental health care system has the characteristics of a complex adaptive system. It is dynamic and nonlinear, and the actions of clinicians - operating both individually and within established cultures and informal, hierarchical groups - ensure that 'top-down' instructions are not simply accepted but adapted to local circumstances. This presentation explains how we used a complexity lens throughout the five-year evaluation of a major mental health reform program.

The Pathways to Community Living Initiative (PCLI) aims for transformational change in the delivery of care to people with severe and persistent mental illness who have experienced, or are at risk of experiencing, long hospital stays. Strategic leadership and resources are provided by the NSW Ministry of Health, and the program has been progressively implemented across the state since 2015. The program aims to embed contemporary, recovery-oriented models of care in mental health services by facilitating multidisciplinary working, ensuring consumer and carer engagement in care planning, and building stronger links between inpatient and community mental health teams. Cross-sector working with aged care and disability care providers is an important element of the program.

Evaluating such a large and multi-faceted program over a long period of time could easily lead to data overload and confusion. Using a complexity lens - specifically, the 'simple rules' of transformation in large health care systems - helped focus our efforts and generate useful insights to strengthen the cross-disciplinary, inter-organisational working essential for achieving and sustaining this systemic change. Evaluation insights were shared regularly via formative feedback and recommendations, leading to positive changes in program design and implementation. We believe this evaluative approach gave the program the best possible chance of achieving its intended impact and outcomes, added value for stakeholders, and provided a coherent framework for the reporting of findings.

Chair
avatar for Jess Dart

Jess Dart

Chief Evaluator and Founder, Clear Horizon Consulting
Dr Jess Dart is the founder and Chief Evaluator of Clear Horizon, an Australian-based specialist evaluation company. Having received the 2018 Outstanding Contribution to Evaluation Award from the Australian Evaluation Society (AES), Jess is a recognised leader with over 25 years of... Read More →

Speakers
avatar for Kate Williams

Kate Williams

Senior Research Fellow, Australian Health Services Research Institute, University of Wollongong
Dr Kate Williams is a senior research fellow at the Australian Health Services Research Institute, University of Wollongong, where she works in multi-disciplinary teams on commissioned research and evaluation projects. Kate has more than 20 years’ experience in research and evaluation... Read More →


Thursday September 1, 2022 1:30pm - 2:00pm ACST
Riverbank Room 2

1:30pm ACST

Process tracing as an evaluation tool
Richard Bell (Tetra Tech)

Process tracing has gained some prominence and popularity in academic circles as a qualitative tool enabling within-case empirical analysis of a specific causal process. It represents a rigorous and comprehensive analysis of qualitative data to identify causal-process observations (CPOs) that link an independent to a dependent variable along a fully-constructed causal chain. Process tracing enables a researcher to confirm if results are consistent with the program theory, as well as to see if alternative explanations can be ruled out. However, with this rigour comes substantial resource-intensity. This presentation outlines the possibility of applying process tracing to program evaluation using the Australian Aid program as an example of how this might be done.

Process tracing sits alongside realist evaluation and contribution analysis as a form of theory-based evaluation (TBE). In practice, mechanisms in TBE tend to be treated as assumptions rather than explicitly theorised. Process tracing seeks to be distinct by studying more explicitly the underlying causal mechanisms connecting inputs to outcomes, rather than looking at triggers for certain causal mechanisms in certain circumstances (to arrive at what worked, for whom and why) or treating causal mechanisms as assumptions (Beach & Schmitt, 2015; Stern et al., 2012). Foremost, process tracing is a research tool that enables the systematic development of evidence - for example from interviews, feedback forms and surveys, program and government records and documents, and budget scrutiny - to properly link cause and effect. Given the detailed nature of process tracing, this approach is best suited to large-scale impact assessment seeking to provide precise attribution of outcomes to program activities.
This presentation will describe process tracing in detail, followed by an outline of how it could be applied to large-scale impact evaluations in development assistance programs. It will include a detailed example of a country-level evaluation.

Chair
avatar for Graham Smith

Graham Smith

Director, Numerical Advantage
I consult and teach, primarily on performance measurement, which was the focus of a recent Ph.D, and on which I recently delivered an AES workshop. I also conduct performance audits as well as wide-ranging evaluation services.

Speakers
avatar for Richard Bell

Richard Bell

Manager, Tetratech
I’m an Australia-based consultant for Tetratech International working on international development and specialising in Governance in Fragile and Conflict Affected States. I work in program design, evaluation and research. My approach is based in analysis of institutional structures... Read More →


Thursday September 1, 2022 1:30pm - 2:00pm ACST
Room E1

1:30pm ACST

Weaving greater equity into evaluation practice
Erin Blake (Blake Consulting)

Promoting greater equity in evaluation has been a hot topic for reflection, discussion and learning at the Multicultural Evaluation Special Interest Group (MESIG), and across the AES over the last two years. New and emerging approaches hold great promise for better addressing issues of equity while also making the practice of evaluation more equitable. There is, however, a long way to go in translating this concept into concrete actions in the different contexts and communities within which individual evaluators work.

This interactive session aims to 'maintain the momentum' of the MESIG's efforts to promote greater equity in evaluation among the AES community, by:
  1. Providing a safe space to discuss often difficult issues, such as how racism, ableism, ageism, sexual discrimination, homophobia and other 'isms' impact on the practice of evaluation.
  2. Drawing on the wealth of experience and ideas of the AES membership to promote more equitable evaluation practices in Australia and across the region.

Delivered in a world café format, the session will provide a space for participants to:
  1. Reflect deeply on their values; on relevant ideas and issues that have arisen at other sessions of the conference; on the challenges people face in applying their values and the concept of equity to the practice of evaluation; and on the strategies developed to address these challenges.
  2. Connect with like-minded people, learn from their peers about how they centre values and promote equity in their evaluation practice, and share wisdom, practice knowledge and tips.

The discussions from this session will be documented and shared via the MESIG's email list to interested members with the aim of furthering the conversation on equity in evaluation at AES, and influencing improved policy and practices more broadly, including the focus of future MESIG initiatives.

Chair
avatar for Kahiwa Sebire

Kahiwa Sebire

Invalue Consulting
Kahiwa Sebire is a puzzling pattern-spotter, an enthusiastic solution finder and a life-long learner. Exploring possibilities and meaning-making with sticky notes and whiteboards (or their digital siblings) in tow.  Kahiwa Sebire has over 10 years’ experience working in and with... Read More →

Speakers
avatar for Erin Blake

Erin Blake

Monitoring, Evaluation and Learning Consultant, Erin Blake Consulting
Erin is an independent international development Monitoring, Evaluation and Learning (MEL) consultant with over 15 years experience. He has a passion for ‘working with people to do MEL better’ and working on complex social change programs that seek to bring about long-term positive... Read More →


Thursday September 1, 2022 1:30pm - 2:30pm ACST
Room E2

1:30pm ACST

Evaluating contribution from the ground up
Sara Lystlund Hansen (Allen+Clarke), Roxanne Bainbridge (Central Queensland University), Mathea Roorda (Allen+Clarke)

Grounded theory applies inductive reasoning and methods to generate hypotheses and theory from the ground up. Contribution analysis, by contrast, is a primarily deductive approach that tests an existing program logic to make credible causal claims about interventions and their results. What happens when these seemingly divergent approaches are combined in a systems-focused evaluation?

In this presentation, we will reflect on our experience of using grounded theory and contribution analysis to develop warranted cases that describe the contribution of an Australian Government program that aims to strengthen comprehensive primary health care (cPHC) for Aboriginal and Torres Strait Islander people. We analysed more than 50 transcripts of interviews and workshops to surface factors that enable or inhibit contribution to cPHC. We will provide examples of how contribution cases were developed, tested and then revised through engagement with key evaluation stakeholders. These cases were integral to supporting defensible evaluative judgements about the program's contribution in a complex context.

Chair
avatar for Laura Manrique

Laura Manrique

Monitoring, Evaluation and Learning Officer, Pacific Community
Hola! I'm Laura Manrique, I come from Colombia and currently I'm working as Monitoring, Evaluation and Learning Officer at the Pacific Community here in the sunny city of Noumea, in New Caledonia. I have experience in M&E of social projects and programmes focused on leadership and... Read More →

Speakers
avatar for Roxanne Bainbridge

Roxanne Bainbridge

Director Centre for Indigenous Health Equity Research, Central Queensland University
I am a Gungarri/Kunja Aboriginal researcher from South Western Queensland in Australia and Professorial Research Fellow at Central Queensland University where I am Director for the Centre for Indigenous Health Equity Research. My current priority evaluation is embedded in a partnership... Read More →
avatar for Mathea Roorda

Mathea Roorda

Senior consultant, Allen + Clarke
Values inform what matters, and are at the heart of evaluation. You literally can't get to an evaluative judgement without them. I'm interested in approaches to systematically identifying what matters, and for whom. Come talk with me about the values identification matrix (VIM) I... Read More →
avatar for Dr Sara Lystlund Hemi

Dr Sara Lystlund Hemi

Research Consultant, Allen + Clarke
Sara Lystlund Hansen has a PhD in Cultural Anthropology and in-depth experience in ethnographic methods and analysis, including participatory research, thick-description, storytelling and narrative analysis. Sara has a longstanding interest in social justice, structures of power and... Read More →


Thursday September 1, 2022 1:30pm - 2:30pm ACST
Riverbank Room 1

1:30pm ACST

Evaluator Competencies: Self-Assessment
Taimur Siddiqi (The Incus Group), Amy Gullickson (University of Melbourne), Delyth Lloyd (Department of Health), Lauren Wildschut (Stellenbosch University), Sarah Mason (University of Mississippi), George Argyrous (Paul Ramsay Foundation), Anne Stephens (Ethos of Engagement)

In 2020, we worked as part of a consortium to develop an online Self-Assessment tool for the AES Evaluators Professional Learning Competency Framework. We performed an alpha test of the new tool at the inaugural FestEVAL 2020 and followed it up with a beta test at FestEVAL 2021. We're back with the final version for curious, emerging and established evaluators to (re)assess their competencies and identify strengths and opportunities for further development!

In this Skill Building Session, we will share the self-assessment results of AES 2022 conference attendees and provide you with a chance to reflect on both these summary results and your individual results, including by forming groups around shared opportunities for improvement and common areas of professional development interest. The group discussions will help identify what you think you need to learn and experience to advance, and how the AES can support that.

We invite anyone who is interested to complete the self-assessment beforehand and bring their individual results to discuss at the session!
 
You can complete the self-assessment here anytime before Thursday September 1st: https://tinyurl.com/AESselfassessGamma  

Chair
avatar for Amanda Jones

Amanda Jones

Senior Manager - Research & Evaluation, VACCA
I have been in community sector research and evaluation for 28 years, mainly in the mainstream child and family sector. Three and half years ago I joined VACCA in this capacity and am loving being challenged to rethink my practice in an ACCO context.

Speakers
avatar for Amy Gullickson

Amy Gullickson

Associate Professor, Director, Centre for Program Evaluation, The University of Melbourne
Associate Professor Amy Gullickson is the Director of the University of Melbourne Centre for Program Evaluation, which has been delivering evaluation and research services, thought leadership, and qualifications for more than 30 years. She is also a co-founder and current chair of... Read More →
avatar for Taimur Siddiqi

Taimur Siddiqi

The Incus Group
Taimur is an experienced evaluation and impact measurement professional who is Managing Director of The Incus Group and current member of the AES Pathways Committee. In his consulting role, he has completed numerous evaluation and impact measurement projects, working with a range... Read More →


Thursday September 1, 2022 1:30pm - 2:30pm ACST
Room E3

1:30pm ACST

Collaborating with people with lived experience in evaluation: a practical framework and perspectives
Alexandra Lorigan (ARTD Consultants), Jade Maloney (ARTD Consultants), Sharon Marra-Brown (ARTD Consultants)

In policy, there is increasing recognition of the importance of involving people with lived experience to ensure policy and services reflect their needs and support their goals. In mental health and disability, there is a commitment to 'nothing about us without us'. For evaluation to operate in this way, we need to rethink the conception of the evaluator as 'independent objective outsider' and instead engage people with lived experience to inform all stages from design through reporting. But how do we do this well in practice?

We will begin by presenting a practical framework any evaluator can use for engaging lived experience evaluators and lived experience advisory groups - covering recruitment channels and processes, position descriptions and terms of reference, induction, training and support, options for roles and responsibilities, considerations of collaboration modes to suit different communication needs and tasks, managing power dynamics, and timelines.

Guided by an experienced facilitator, our panel discussion will then weave together the perspectives of evaluators and people with lived experience from evaluation projects in the mental health sector. They will share their views on what has worked to enable the perspectives of people with lived experience to shape evaluations, and how others can learn from this.

The panel will be facilitated by ARTD managing director (and convenor of the AES 2019 Conference), Jade Maloney. Panellists will include Sharon Marra-Brown and Alex Lorigan who are leading NSW Towards Zero Suicides evaluations and Victorian lived and living experience workforce benchmarking projects, along with people with lived experience engaged in these projects as lived experience team members or advisory group members.

Chair
avatar for Allison Clarke

Allison Clarke

Evaluator & Research Fellow, Assessment and Evaluation Research Centre (AERC)
Allison (she/her) is an evaluation specialist with skills in monitoring and evaluation for community and organisational learning. Allison works on Wurundjeri Woi Wurrung Country at the Assessment and Evaluation Research Centre AERC (formerly known as the Centre for Program Evaluation... Read More →

Speakers
avatar for Jade Maloney

Jade Maloney

Partner & Managing Director, ARTD Consultants
I work with government agencies, not-for-profits and citizens to co-design, refine, communicate and evaluate social policies, regulatory systems and programs. I am passionate about ensuring citizens have a voice in shaping the policies that affect their lives, translating research... Read More →
avatar for Alexandra Lorigan

Alexandra Lorigan

Senior Consultant, ARTD Consultants
Alexandra supports evaluations and reviews in areas of complex social policy, most commonly in the mental health and disability sectors. She is passionate about building social inclusion through initiatives that support or build the capacity of people with lived experience, their... Read More →
avatar for Sharon Marra-Brown

Sharon Marra-Brown

Senior manager, ARTD Consultants
Curious for a living - evaluation specialist, combining technical excellence with emotional intelligence. Talk to me about working in health, mental health and suicide prevention, working with lived and living experience researchers, my decade plus in the public service or how I weave... Read More →
avatar for Rosie Dale

Rosie Dale

I am passionate about using my lived experience with mental illness to reduce stigma and increase hope. I represent the lived experience sector and hope to encourage mental health services and systems to be more trauma informed and lived experience/peer worker designed.


Thursday September 1, 2022 1:30pm - 2:30pm ACST
Hall D (plenary)

2:00pm ACST

COVID-19 Vaccination - Integrating evaluation, continuous improvement and delivery during a one-in-one-hundred-year event
Naveen Tenneti (Victorian Department of Health), Naomi Bromley (Victorian Department of Health)

The rollout of COVID-19 vaccines has progressed at a pace and coverage that is unprecedented in the history of vaccination programs. In Victoria, as of 13 March 2022, 80.9% of the population had received two doses and 61.6% of the eligible population had received a third dose. This translates to over 14.5 million doses administered in just 12 months, 41% (6 million) of which were delivered by the State system, at a rate of approximately 38,000 doses per day.

From the outset, robust evaluation and continuous improvement methodologies were embedded in the Victorian Department of Health's COVID-19 vaccination program. A key feature of the evaluation approach was a rapid review platform which involved critical appraisal of 'work in progress' efforts and rapidly identifying practical recommendations for improvement. This work was supplemented by a knowledge management structure supporting evidence synthesis and insight, as well as systems for analysis of consumer sentiment and feedback.

The summative evaluation of the program explored equity and access, trust and confidence, and safety and quality. It employed a mixed methods approach including document review, data monitoring and stakeholder interviews. Three key stages of the program were identified to frame the evaluation, based on vaccine demand, vaccine supply and delivery capacity: high demand/low supply, high demand/high supply, and low demand/high supply. These stages are also characterised by external influences including COVID-19 case numbers, vaccine mandates, and vaccine safety signals.

Alongside an overview of the evaluation framework and the evidence appraisal and rapid review methodologies, practical recommendations regarding these methods' utility in evaluating future public health programs will be discussed. The approach provides a useful case study for funding bodies and policymakers on how evaluation can be prioritised and integrated into program design, even in high-pressure environments such as a pandemic.

Chair
avatar for Jess Dart

Jess Dart

Chief Evaluator and Founder, Clear Horizon Consulting
Dr Jess Dart is the founder and Chief Evaluator of Clear Horizon, an Australian-based specialist evaluation company. Having received the 2018 Outstanding Contribution to Evaluation Award from the Australian Evaluation Society (AES), Jess is a recognised leader with over 25 years of... Read More →

Speakers
avatar for Jo Norman

Jo Norman

Director, Centre for Evaluation and Research Evidence, Dept of Health/Dept of Families, Fairness and Housing
I run a large internal evaluation unit, directing a team of 30 expert evaluators and analysts to: directly deliver high priority projects; support program area colleagues to make the best use of external evaluators; and, build generalist staff capacity in evaluation principles and... Read More →


Thursday September 1, 2022 2:00pm - 2:30pm ACST
Riverbank Room 2

2:00pm ACST

"Social media as an evaluation listening and monitoring tool Case study: Instagram, Australian farming women, social media and climate change"
Amy Samson (Coutts J&R), Kelly Fielding (The University of Queensland), Natalie Collie (The University of Queensland)

There can be little debate about the pervasiveness of social media in the way humans now communicate and consume information. The conversations and information exchanges enabled by social media platforms are increasingly recognised by social researchers as a valuable source of public opinion and narrative, providing rich insights into current events and issues. Evaluators understand that projects and programs do not happen in a vacuum: program logics are necessarily underpinned by layers of context that have the potential to shape outcomes and impacts.

This paper considers social media as a listening tool. Its content can be mined for contextual background for a project or program, considered as part of project or program design, and listened to for conversations around project activities or relevant topics in real time. It does this by presenting a case study of learnings gained from an analysis of the Instagram content of 25 Australian farming women, undertaken to understand their experiences in dealing and coping with extreme climate challenges. These women are at the forefront of a farm's response to drought, yet according to the literature their contributions to the farm have been hidden, and there is little research on the gendered impacts of climate change in Australia.

Common emerging themes include on-farm drought management strategies, maintaining a house garden, caring for wildlife, expressions of anger and frustration at the impact of the drought, and expressions of community support, political action, resilience and hope. The Instagram stories are nuanced and rich in qualitative detail, highlighting their potential to contribute to a better understanding of the gendered impacts of climate change on Australian farms. This type of information could, for example, provide further context for more inclusive design of projects and programs supporting community and farming resilience around climate change.

Chair
avatar for Graham Smith

Graham Smith

Director, Numerical Advantage
I consult and teach, primarily on performance measurement, which was the focus of a recent Ph.D, and on which I recently delivered an AES workshop. I also conduct performance audits as well as wide-ranging evaluation services.

Speakers
avatar for Amy Samson

Amy Samson

Principal Consultant, Coutts J&R
Amy Samson is a PhD candidate at UQ investigating social media use by Australian women on the farm. She is also Principal Consultant at Coutts J&R with expertise in the evaluation of communication and social media strategies. Her interest in this area started growing when she worked... Read More →


Thursday September 1, 2022 2:00pm - 2:30pm ACST
Room E1

2:30pm ACST

Collaborating on evaluation guidance: building the NSW government evaluation guidelines
Danielle Spruyt (NSW Treasury)

Evaluation of government initiatives is necessary to support the NSW Government's commitment to evidence-informed policy and investment.

NSW Treasury’s Centre for Evidence and Evaluation (CEE) has coordinated an update of the government’s program evaluation guidelines to reflect innovations in NSW Government processes and in evaluation best practice. The guidelines are one of a suite of documents within a framework of evidence requirements that inform policy and budget setting in New South Wales.

The process of updating the guidelines has involved extensive consultation across government clusters, drawing upon the evaluation expertise that sits within clusters, and building a framework relevant to a diverse range of evaluation challenges.

This presentation will discuss lessons learned from:
  • the process of including the different stakeholder perspectives and requirements for evaluation in an over-arching guidance document (lessons in collaboration)
  • building evidence across the life of an initiative (lessons in coordinating requirements)
  • key developments within the guidelines that respond to stakeholder evaluation challenges and guidance requirements, including:
    • monitoring and evaluation planning at initiative design and implementation
    • coordinating stages of evaluation (including process, outcome and economic evaluation)
    • engaging and collaborating with client and community stakeholders.

Chair
avatar for Amanda Jones

Amanda Jones

Senior Manager - Research & Evaluation, VACCA
I have been in community sector research and evaluation for 28 years, mainly in the mainstream child and family sector. Three and half years ago I joined VACCA in this capacity and am loving being challenged to rethink my practice in an ACCO context.

Speakers
avatar for Danielle Spruyt

Danielle Spruyt

Principal Economic Analyst, NSW Treasury
The Centre for Evidence and Evaluation (CEE) in NSW Treasury works to strengthen the quality of evidence that supports government decision-making. Danielle Spruyt (PhD) is a Principal Economist, and evaluation lead. Danielle has previously led development and evaluation of government... Read More →


Thursday September 1, 2022 2:30pm - 3:00pm ACST
Room E3

2:30pm ACST

Using evaluation to shine bright at the Starlight Children's Foundation
Eden Robertson (Starlight Children's Foundation), Claire Treadgold (Starlight Children's Foundation)

Every minute of every day, a child or young person is admitted to hospital in Australia. Starlight, the broadest-reaching children's charity in Australia, brings positive experiences to these families through our in-hospital and online programs. The overarching goal of our programs is to support children and young people to ultimately live their best lives.

In this presentation, we share our recently developed Evaluation Framework and learnings. We established this Framework to ensure that our programs continually create more impactful experiences and meet the needs of our families, and that we are held accountable to funding bodies. Our Framework defines what, when, how and why we evaluate each of our programs. It is underpinned by key principles such as being strengths-based and inclusive of culturally and linguistically diverse groups.

We designed our Framework over six months through numerous ideation and planning sessions. We collaborated with Starlight team members who are responsible for operational and strategic planning of programs. Key considerations within our Framework include: i) impact; ii) satisfaction; iii) program check-ins; iv) health professional feedback; and v) reach. Each program has well-defined evaluations to address these five areas. Rather than taking a standardised approach, we purpose-designed evaluations for each program so that they are stakeholder-driven, feasible, and provide relevant findings. This also allows us to easily pivot to address the constantly shifting hospital system.

To complement our Framework, we also developed Learning Modules to upskill Starlight team members to conduct their own evaluations. These Modules cover what evaluation is, evaluation design, data analysis, interpreting findings, and closing the feedback loop.

Program evaluation helps Starlight shine brighter for children, young people, and their families. Our Framework ensures that evaluations are regular, rigorous and embedded into business-as-usual. Our Framework and experiences provide valuable knowledge for other non-profits and health services when evaluating.

Chair
avatar for Jess Dart

Jess Dart

Chief Evaluator and Founder, Clear Horizon Consulting
Dr Jess Dart is the founder and Chief Evaluator of Clear Horizon, an Australian-based specialist evaluation company. Having received the 2018 Outstanding Contribution to Evaluation Award from the Australian Evaluation Society (AES), Jess is a recognised leader with over 25 years of... Read More →

Speakers
avatar for Eden Robertson

Eden Robertson

Research and Evaluation Manager, Starlight Children's Foundation
Dr Eden Robertson is an experienced psychosocial researcher, with a special interest in supporting children with a serious illness and their families. Dr Robertson has completed a Bachelors of Psychology (Honours), Graduate Certificate in Adolescent and Young Adult Health and Wellbeing... Read More →


Thursday September 1, 2022 2:30pm - 3:00pm ACST
Riverbank Room 2

2:30pm ACST

Zero Suicide Healthcare: program evaluation for improved suicide care and prevention
Alan Woodward (Alan Woodward Consulting), Sue Murray OAM (Zero Suicide Institute of Australasia)

Context
The World Health Organisation estimates that one person dies by suicide every 40 seconds. In Australia, more than half of the people who die by suicide have been in recent contact with the healthcare system. Improvements to healthcare will support the prevention of suicides.

Purpose
Zero Suicide Healthcare (ZSH), with its vision of no suicide deaths among people under the care of health systems, offers a multi-pronged framework of service, system and cultural improvements based on a data-informed, continuous quality improvement approach. An emphasis on cultural change ensures healthcare professionals are supported, motivated and equipped to adopt practice improvements in the care of people who are suicidal. Evaluation design and practice have a part to play in supporting data-informed, person-centred reforms to healthcare for more effective suicide prevention.

Findings
ZSH has been shown, internationally and nationally, to reduce suicides by more than 20% and to reduce hospital re-admissions for suicidal behaviour by 35% (Turner et al., 2021).
However, there has not been a single source of understanding of how the elements of the ZSH framework interact to achieve outcomes. The linkages between activity, practice improvement and systems change are complicated. The development of the Evaluation Framework incorporating the Theory of Change and Theory of Action for ZSH provides greater clarity about the overall design of the framework using program theory. This will underpin the evaluation of ZSH across projects.

Implications
The ZSH Evaluation Framework will be presented and explained, showing how it supports a consistent approach to evaluation and data monitoring of ZSH programs across different healthcare services and in different countries.

Chair
avatar for Graham Smith

Graham Smith

Director, Numerical Advantage
I consult and teach, primarily on performance measurement, which was the focus of a recent Ph.D, and on which I recently delivered an AES workshop. I also conduct performance audits as well as wide-ranging evaluation services.

Speakers

Thursday September 1, 2022 2:30pm - 3:00pm ACST
Room E1

2:30pm ACST

Speed vs evidence: A digital evaluator's perspective
Anna Kean (Movember), Matt Healey (First Person Consulting)

The desire of digital product teams to bring a new program idea to market quickly, without first taking the time to establish an evidence base for the intervention, demonstrates a key tension between product decision-makers and digital evaluators. Using a case study approach, this presentation will outline how one evaluator negotiated this tension and (eventually) met the needs of both parties.

The evaluation of a mental health digital product called Movember Conversations will form the case study for this presentation. Movember recognised that while people are generally aware of the need for healthy conversations about mental health, many do not have the confidence or skills to have effective conversations. To meet this need, Movember developed a product that adapted the Ask, Listen, Encourage-action, Check-in (ALEC) model originally promoted by R U OK?

While ALEC was a widely known model, there was no clear evidence base demonstrating whether or not ALEC led to beneficial conversations. However, due to COVID-19, a business decision was made to fast-track the development of this product to support the mental health of men during the pandemic, and the product was released in May 2020.

Fast forward to 2022, with the product in market for 22 months and data demonstrating a range of outcomes for product users, it was time to revisit the intervention effectiveness question. A research study was incorporated into the evaluation to determine whether product users could learn and apply the ALEC model concepts to real life conversations, compared to a control group.

This presentation will explore the challenges encountered along the product development journey and offer digital evaluators recommendations for how to handle this tension.



Chair
avatar for Laura Manrique

Laura Manrique

Monitoring, Evaluation and Learning Officer, Pacific Community
Hola! I'm Laura Manrique, I come from Colombia and currently I'm working as Monitoring, Evaluation and Learning Officer at the Pacific Community here in the sunny city of Noumea, in New Caledonia. I have experience in M&E of social projects and programmes focused on leadership and... Read More →

Speakers
avatar for Anna Kean

Anna Kean

Monitoring, Evaluation and Learning Manager, Movember
Anna is an evaluator working to understand the reach and impact Movember's various health programs and products are having on men across the world. Movember develops many programs and products to support men with their mental health, as well as supporting men living with prostate or testicular... Read More →


Thursday September 1, 2022 2:30pm - 3:00pm ACST
Riverbank Room 1

2:30pm ACST

Comic-based digital storytelling: An innovative tool in the Evaluator's toolkit
Hilary Davis (Swinburne University of Technology)

Comic-based digital storytelling has unique characteristics that hold much promise for the evaluator's toolkit. Traditional digital storytelling is a process whereby 'ordinary people' create their own narratives, weaving them into two-to-five-minute digital stories using visual elements such as photographs, images, signs and music. A participatory research approach is used to co-design and co-create digital stories. Narrative is provided through visual elements that convey knowledge or lived experiences which are difficult to express using words alone. They present stories of spaces, places and people that are often unseen or unheard in mainstream media. Thus, digital stories can be powerful artefacts with the potential to capture and share personal, community, and program development.

Springboarding from this traditional digital storytelling approach, we introduce comic-based digital stories. Often associated with young people's entertainment, comics cover a broad range of topics, from superheroes to farming life (aka Footrot Flats). We have been using comic-based digital stories in mental health program evaluations in two distinct contexts: a Victorian rural outreach program, and a mental health promotion program focused on tradespeople or 'tradies'. Comics were chosen for these digital stories due to their unique ability to engage primarily male audiences, create understanding, and help bridge health literacy and digital divides.

This presentation explores the process of identifying, capturing, co-designing and sharing comic-based digital stories. It showcases three diverse examples, co-created with key stakeholders: service providers, community members and researchers. We highlight how comics authentically and sensitively communicate mental health issues and can be applied in both face-to-face and online outreach activities. Ultimately, we argue that co-designed comic-based digital storytelling has potential for sharing mental health promotion messages and service support. These stories build understandings between service providers, program evaluators and community members about contextually situated mental health.

Chair
avatar for Kahiwa Sebire

Kahiwa Sebire

Invalue Consulting
Kahiwa Sebire is a puzzling pattern-spotter, an enthusiastic solution finder and a life-long learner. Exploring possibilities and meaning-making with sticky notes and whiteboards (or their digital siblings) in tow.  Kahiwa Sebire has over 10 years’ experience working in and with... Read More →

Speakers

Thursday September 1, 2022 2:30pm - 3:00pm ACST
Room E2

2:30pm ACST

More than a self-licking ice cream
Sharon Babyack (Community First Development), Donna-Maree Stephens (Community First Development)

There is an intersecting space where different perspectives meet and are sometimes woven together. There are great insights and lessons in the way First Nations’ people in Australia have navigated this space. Due to the ongoing impacts of colonisation, these hard-won lessons have been borne of necessity and through skillful negotiation. These lessons provide a clear pathway to ‘Right Way’ practices where relationship building takes place, trust is built and solutions are designed and implemented together.

Informed by a research partnership with 11 strong First Nations’ communities, discover more about the importance of interdependence, honourable engagement, and the conduit of mutual respect to rebalance power when in an intersecting space of diverse viewpoints.

This presentation will explore some of the key findings of the research project and co-design processes within the field of evaluation that celebrate the strengths and prized insights of First Nations’ people.

Chair
avatar for Allison Clarke

Allison Clarke

Evaluator & Research Fellow, Assessment and Evaluation Research Centre (AERC)
Allison (she/her) is an evaluation specialist with skills in monitoring and evaluation for community and organisational learning. Allison works on Wurundjeri Woi Wurrung Country at the Assessment and Evaluation Research Centre AERC (formerly known as the Centre for Program Evaluation... Read More →

Speakers
avatar for Donna Stephens

Donna Stephens

SEWB Workforce Coordinator/SEWB Acting Manager, AMSANT
Ms. Donna-Maree Stephens is an Iwaidja woman, of the Muran clan, northwest Arnhem land in the Northern Territory.  A social research and evaluation consultant with an interest in transformational practice and impact, Ms Stephens currently works within AMSANT, the peak body of Aboriginal... Read More →
avatar for Sharon Babyack

Sharon Babyack

General Manager Impact & Strategy, Community First Development
My role at Community First Development involves oversight of research, evaluation, communications and effectiveness of the Community Development program. During my time with the organisation I have led teams to deliver major change processes and strategic priorities, have had carriage... Read More →


Thursday September 1, 2022 2:30pm - 3:00pm ACST
Hall D (plenary)

3:00pm ACST

Plenary six: "Unfolding the conference tapestry"

Participate in an interactive session where keynote speakers and award winners will share their insights and thoughts about what they will bring home to weave into their evaluation practice as a result of the conference. We will all have an opportunity to contribute strategic thoughts about what we need to do to weave evaluation into our own practice and decision making.

Followed by:
Closing address by the AES President
Handover to aes23 Brisbane

Thursday September 1, 2022 3:00pm - 4:30pm ACST
Hall D (plenary)
  Plenary
 