Can you imagine a company that makes, say, New Year party decorations? The company has brisk sales in December, and by January it has piles of cash in its vault. The boss rejoices in the company’s success: “Good job, everyone. We have coffers full of money!” However, the company has never counted this cash, and does not keep track of the costs of making or selling the decorations. Or, stranger yet, can you imagine a World Cup football tournament where nobody keeps score during the games? Well, if you are reading this blog on the PHAP website, there is a fairly good chance that you work for just such a company, or play in just such a sport.
Humanitarian action aims to save lives, alleviate suffering and protect human dignity. It does accomplish those goals, and we know that some humanitarian actors do it better than others, that it works well in some places, and less well in others. But how do we know these things? How do we know that it meets people’s needs? That it is effective? Or efficient? Often, in humanitarian action, the answer has been that the success of our programming is more a question of faith than evidence. In response, increasing emphasis has been placed on evidence-based approaches and decision-making.
We are fast approaching Humanitarian Evidence Week, a week full of webinars, discussions, blogs and debates dedicated to the promotion of evidence-based approaches. Evidence, generally speaking, seems like a good thing. The value and impact of this drive towards a more evidence-based humanitarianism is not, however, unproblematic. Questions need to be asked. What does the greater focus on evidence mean in practice? And at the conceptual level, is our faith in evidence somewhat misplaced?
The fact is, we cannot imagine a company that has no idea whether it turned a profit, or a football team that does not know if it scored more goals than the other team. How, then, did we come this far in humanitarian action with such an underdeveloped use of evidence to know if our work has led to its intended impact? One obvious concern is the degree to which evidence-based approaches shift humanitarian action towards measurable, technocratic approaches, and away from an engagement with the profoundly political nature of the crisis and hence of the needs of people.
A deeper concern is the seeming lack of a culture of evidence. Drivers and commitments towards the use of evidence do not seem inherent in the sector, hence the need for policies and incentives to be constructed in order to push our uptake of evidence-based approaches. What, one might ask, can we expect of evidence within a business model based on charity, and within a sector plagued by a longstanding accountability deficit? Is it even possible to engineer an evidence-based humanitarian action in this fashion?
At a more pragmatic level, challenges related to the use and quality of evidence abound. It is already difficult to test whether, for example, a new drug or water filter works better than the existing one. It is harder still to assess the effectiveness of new approaches or policies, not to mention the use of evidence to decide not just whether we are doing things right, but whether we are doing the right things.
An additional barrier to measuring effectiveness comes from the fact that we, the measurers, have a stake in the findings. Bias, in other words, is a problem, and perhaps even the result of political choices. As Rafia Zakaria caustically opines in the New York Times, USAID uses enrolment figures as evidence of success in getting Afghan girls into school, even though we know this evidence tells us little about whether the girls stay in school or learn anything. Another problem is attribution. How do we find evidence that tests whether giving a woman a chicken leads to her empowerment?
This blog highlights just a few of the critical questions that must both temper and guide the sector’s push towards the greater use of evidence in its analyses and decision-making. For some of the answers, join us on 7 November for Contested Evidence: The challenges and limits of evidence-based approaches to humanitarian action.
Marc DuBois will be facilitating and moderating the online discussion on Contested Evidence on 7 November. Currently an independent humanitarian consultant, researcher and blogger, he was the Executive Director of MSF-UK from March 2008 until March 2014.