Three things you can do to evaluate projects more effectively
A ‘feel good’ factor in a project may have a lot of appeal, but it is not as sustainable or useful as solid evaluation findings. Measuring impact can be a challenge, but here are three ways we find evidence that what we do is making a difference.
At Kaleidoscope we run mission projects. These are projects we think are important, that make the world a kinder, more connected and joyful place. We deliver them because no-one else will, at least not on their own, and because we feel strongly about our responsibility to create the change we want to see.
The problem with having a mission comes back to a phrase that’s been occupying my thoughts: “that’s nice, but so what?”. We might complete a wonderful project, getting a warm fuzzy glow from doing something to benefit others, but how can we be sure that it really made a difference?
Without rigorous, critical evaluation we have no idea about the true impact of what we’ve done, how well it really worked and how we can do better in the future. That means we are unable to justify the value of our mission work. It also leaves us short of achieving the impact we’re capable of, never learning from our experiences or improving our practice.
Evaluation isn’t easy and it isn’t always comfortable. It carries the possibility that our approach, or even our very idea, wasn’t delivering what we expected it to. If we really care about doing good and doing the most good we can, then we have to accept that. We also need to learn from our mistakes.
Beyond this, though, rigorous evaluation can also bolster our pride and belief in what we do. It allows us to know, with confidence, the effects of our work. It can give us insights we didn’t have. It proves to others and to ourselves that what we’re doing truly is worthwhile. As far as I’m concerned, that’s worth just about any cost.
So how do we evaluate and measure impact? At Kaleidoscope we are lucky enough to have evaluation experts in-house, but you don’t have to be an expert. Here are some steps you can take:
1. Design a Theory of Change.
This is an essential start to a project: it will help clarify your pathways to outcomes and impact, and any underlying assumptions that must hold for things to go as intended. There are lots of free resources to help you get to grips with the basics, so before long you’ll be more than ready to get stuck in yourself.
2. Agree impact measures.
It’s all well and good clarifying the changes you wish to make through your work, but you also need a way of confirming whether you’ve achieved them.
Working out your output, outcome and impact measures needn’t be difficult. Some projects may need quite resource-intensive methods to test findings, but you can often get sufficient data from sources like Twitter to understand how many people you reached via social media and how many engaged with your work. We also find that a well-designed survey goes a long way in allowing us to gauge the outcomes of our work and to understand behaviour change.
3. Build in time for evaluation and reflection from the outset.
Impact measurement needn’t be an onerous imposition, but to make the best use of it you need to devote time. Standardised templates can speed up some processes. Create space throughout the lifespan of a project to reflect on your findings and their implications. This reflection will improve practice in the future.
Kaleidoscope’s approach to evaluation can be seen throughout the planning, implementation and follow-up phases of our Super Melting Pot conference. This was our most ambitious event ever, bringing people together from across health, care and beyond to mainstream kindness, build cognitive diversity and foster connection. It was essential to properly measure the effects of all this activity.
Through a combination of surveys, follow-up discussions and observations, we interrogated the effect the event had on participants and on Kaleidoscope as an organisation. Now, six months on, we are distributing a follow-up survey to assess whether Super Melting Pot’s outcomes have been achieved and to inform what we plan for the future. Who knows? It might one day help us create a second Super Melting Pot that’s even bigger and better than the first.
Our approach to doing ‘good things’ hasn’t always been this rigorous. Having seen the possibilities of what evidence can do to enhance our impact, however, we’ve become evaluation evangelists – although by this point, I probably didn’t need to point that out, did I?