How do we contribute to change? Takeaways from our impact assessment

Nicole Anand

The plan

A few months ago we shared why we want to assess our impact. We are an intermediary organisation, which means we support partners to reach their goals. More specifically, we focus on supporting partners to use data and technology to increase their impact. In other words, we help other groups make an impact and bring about systems change. To understand the impact we have, we have to understand how we contribute to these outcomes without conflating our impact with that of our partners. Ultimately, we want to generate evidence that can inform our continuous improvement.

And so we recently undertook a two-part plan to do this. The first part was developing an organisational learning strategy aimed at better responding to partner needs and measuring our impact. The second part was developing our approach to impact assessment.

We turned the impossibly big task of assessing our impact into something more manageable by focusing on one of our strategic programmes – light touch data and technology support (LiTS). You can read more about how we collect and analyse external feedback and internal reflections on LiTS here. This formed the foundation of our programmatic impact assessment.

Today, we are sharing a Primer on our approach, which summarises our objectives and methodology. Along with it, we are sharing our assessment tool – a social return on investment framework – designed to measure our LiTS support to partners. We are also setting up calls with peer organisations to talk through the framework and see how it might be relevant for other organisations facing similar impact assessment challenges.

As we’ve undertaken this task, we’ve learned a lot about how to define the scope of an assessment to keep it specific and meaningful, how to design an appropriate mixed-methods approach for our particular needs, and how to fold ongoing activities into measurement. Here’s what we’ve learned.

Our takeaways about assessing impact

We’ve heard from partners that measurement is difficult, and we agree. One of the biggest challenges we’ve heard about is not knowing how or where to begin. We wanted to share a few conclusions based on our experience. First, we call out common myths that lead organisations either to avoid impact assessment or to measure impact in ways that overlook critical components. Second, we offer a few insights we discovered along the way that are likely to be useful for others.

Debunking myths to help navigate assessment

  1. Reporting to funders is not impact assessment. Reporting to funders is a useful trigger for an organisation to reflect on results because it requires communicating about them. However, the stories we share with funders illustrate how we changed “the field”. These accounts don’t necessarily capture our specific role – nor do they tease out our unique contribution to the impact.
  2. Partner feedback is insufficient for measuring change. Impact is a series of interrelated steps that starts long before there is something tangible to point to as evidence of success. This means measurement should include components that feed into change but may not be the obvious indicator of it. For example, partner feedback is a great way to understand satisfaction levels with support, but it may not indicate why the quality of the support is high or low. Quality can be a result of team capabilities, coordination efficiencies, or any number of other inputs to a complex system. Understanding it requires further – and possibly different types of – data and analysis.
  3. Monitoring and evaluation expertise is not required to develop indicators. There are various ways of developing indicators in an evaluation process. More often than not, organisations turn to an expert to define the relevant categories for measurement. However, we noticed that there is often a gap between what an expert might deem important and what a user’s experience tells us is critical. Identifying patterns in user feedback and using them to inform indicators is a powerful way of measuring what matters most (see the sketch after this list).
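
To make that last point concrete, here is a minimal sketch in Python of what identifying patterns in feedback could look like in practice. The feedback snippets, theme names and keywords are all invented for illustration and are not drawn from our LiTS data; a real process involves reading and coding responses carefully, but even a simple tally of recurring themes can surface candidate indicators.

```python
# Illustrative sketch (not our actual tooling): surfacing candidate
# indicators from recurring themes in free-text partner feedback.
# The feedback snippets and theme keywords below are invented.
from collections import Counter

feedback = [
    "The support helped us choose a secure database quickly",
    "Responses were fast, but we needed more help with data analysis",
    "We now feel more confident assessing tools ourselves",
    "Quick turnaround; the security review saved us from a risky vendor",
]

# Hypothetical theme keywords an organisation might start from.
themes = {
    "responsiveness": ["quick", "fast", "turnaround"],
    "risk reduction": ["secure", "security", "risky"],
    "growing confidence": ["confident", "ourselves"],
    "analysis support": ["analysis"],
}

counts = Counter()
for snippet in feedback:
    text = snippet.lower()
    for theme, keywords in themes.items():
        if any(keyword in text for keyword in keywords):
            counts[theme] += 1

# Themes that recur across partners become candidate indicators to refine.
for theme, count in counts.most_common():
    print(f"{theme}: mentioned in {count} of {len(feedback)} responses")
```

Themes that keep recurring across many partners are strong candidates to formalise as indicators, which is how user experience, rather than expert judgement alone, can shape what gets measured.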

Tips for getting started: from goals to methods

  1. Identify what you aim to change separately from the goals of your partners. Our partners work on issues of transparency and accountability, human rights and environmental justice, and they have diverse sets of goals. Our own outcome goals are necessarily different. We look at three outcomes related to data and technology that illustrate our goal of systems change: reducing harm, decreasing waste, and growing technical intuition in civil society. This means we assess the value of our contributions as they move us toward these outcomes.
  2. Recognise the limitations of existing methods, but don’t let them limit your methodology. Many civil society organisations lack in-house monitoring and evaluation (M&E) skills. It’s not the first skillset anyone hires for and, even when we’re looking, it’s a hard skill to find. We’ve tried to grow these skills internally and to search for external M&E resources. The search leads to plenty of methods, but few that are wholly applicable to our work. From contribution analysis to beneficiary assessment, we found a number of methods inspiring, and we blended them to develop our SROI framework.
  3. Proceed with caution when attempting to quantify qualitative data. Many civil society organisations have experience designing survey questions that quantify feedback. A rating of high (5) or low (1) can be useful for a quick, broad understanding of risks and opportunities. In impact assessment, however, quantified values can obscure meaning. For example, a social change may be scored as a 1, 2 or 3, but if the number simply indicates a big or small change, it strips away the underlying meaning. One way to combat this is to anchor each quantity in richer layers of qualitative meaning, as in the sketch below. In the end, it is important to think through when quantification is necessary and what the numbers actually mean when you use them.
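
As an illustration of that last point, here is a minimal sketch in Python of one way to keep a score tied to its qualitative meaning. The rubric, outcomes and evidence below are hypothetical and are not taken from our SROI framework; the point is simply that a number such as 2 only communicates something when it travels with the description and the evidence behind it.

```python
# Illustrative sketch with a hypothetical rubric (not our SROI framework):
# each numeric score is stored alongside a qualitative anchor and the
# evidence it is based on, so the number never travels without its meaning.
from dataclasses import dataclass

# Hypothetical anchors describing what each score is meant to represent.
ANCHORS = {
    1: "small change: marginal improvement, no change in how the partner works",
    2: "moderate change: the partner adapted a process or tool as a result of support",
    3: "significant change: lasting shift in how the partner works with data and technology",
}

@dataclass
class OutcomeRating:
    outcome: str   # e.g. "reducing harm" or "growing technical intuition"
    score: int     # 1-3; only meaningful alongside its anchor and evidence
    evidence: str  # the qualitative observation the score is based on

    def describe(self) -> str:
        return f"{self.outcome}: {self.score} ({ANCHORS[self.score]}); evidence: {self.evidence}"

# Invented examples for illustration only.
ratings = [
    OutcomeRating("reducing harm", 2,
                  "partner dropped a tool that exposed beneficiary data"),
    OutcomeRating("growing technical intuition", 3,
                  "partner now runs its own tool assessments without our help"),
]

for rating in ratings:
    print(rating.describe())
```

Keeping the anchor and evidence alongside the score means the qualitative meaning stays available whenever numbers are aggregated or compared.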

We know that many of you are finding ways to articulate your impact and drive continuous improvement. We would love to hear about your approach and to talk through ours. You can contact Nicole at nicole@theengineroom.org or schedule a call with our team here.