The Expert:

Brooke Hayward

Brooke Hayward (MAppPsyc) joined Ko Awatea as Evaluation Officer in February 2015. She has eight years of experience in research and evaluation in health and community development-related fields. As Ko Awatea’s Evaluation Officer, Brooke is responsible for building evaluation capacity among staff at Counties Manukau Health and supporting and leading evaluations for both internal and external partners. Brooke also has experience working in local government settings through her role as Research and Evaluation Officer with Healthy Together Victoria.

Their View:

Evaluation is imperative for services or organisations to understand the impact they have and to capture valuable learning to improve the efficiency and efficacy of interventions, services and care.

Evaluation should be approached as an integral part of project design and project management, not as an ad hoc activity completed on behalf of your service or team by ‘someone’, ‘somewhere’, months or even years after work has commenced.

Failing to integrate evaluation into project design and management limits the availability of relevant, complete and reliable data. It is also a missed opportunity for improved project management, design and measurement.

Lack of logic is a fundamental barrier to integrating evaluation with project design and management. A logic model ties together a clear vision for the project, its objectives, and how it intends to achieve those objectives. Logic models go by many names: driver diagram, programme logic, programme model. Whatever the term, they serve a common purpose – to create clear service or project aims and objectives that are well aligned to activities, deliverables or actions. Logic models create a shared understanding of change and a clear narrative for your choice of interventions.

In evaluation we meet many people seeking support to evaluate their project, programme or service who lack a logic model. For example, some project managers know their deliverables and timeline, but struggle to articulate a clear narrative about the aim of the project or the purpose of an activity.

This lack of logic leads to many difficulties. How do we make design decisions in the absence of a clear direction or overarching objectives? How do we communicate project aims and objectives to stakeholders in a way that creates understanding, buy-in and engagement? How do we measure success when we don’t know what we are aiming for?

A logic model serves as a project management and design tool that translates readily into evaluation questions, indicators and areas for measurement. When evaluation and design intersect, this implies:

• advanced planning of monitoring and evaluation/measurement

• monitoring and data collection along the way (not retrospectively)

• that monitoring and evaluation outcomes influence design (i.e. inform change and improvements in interventions)

• that monitoring shapes project implementation (i.e. you are responsive to project data in your project delivery; if something isn’t working, you adapt!).

For support around translating your project logic into evaluation questions, indicators and areas of measurement, please contact Brooke Hayward, Evaluation Officer, or Luis Villa, Research and Evaluation Manager, Ko Awatea Research and Evaluation Office.
