On 7 November, PHAP organized an online discussion on Contested Evidence: The challenges and limits of evidence-based approaches to humanitarian action. The event, part of the 2017 Humanitarian Evidence Week, featured Marc DuBois, Paul Knox Clarke (ALNAP), Nancy Cartwright (University of Durham and UCSD), Lars Peter Nissen (ACAPS), and Momodou Touray (Independent Advisor).
While many of the questions from participants were answered during the event (you can listen to these in the event recording), there were more questions than time allowed, and some of the guest experts have answered follow-up questions in writing, which you can now read on this page.
“It is clear that many decisions made in the humanitarian sector are not evidence-based. As far as I am concerned, this might be due to a lack of expertise within humanitarian organizations. However, it might also be related to a lack of interest in evidence and/or knowledge of its importance. What reactions have you usually gotten when trying to convince managers to incorporate evidence into their decision-making?”
- Professor of Nutrition, Canada
I think it is partly due to a lack of expertise, but it also comes from a lack of incentives. If it is possible to get funded without a strong evidence-based approach, why would you spend additional scarce resources on producing more information? Having said that, my experience is not that there is resistance to evidence; it simply tends not to be prioritized.
I think this is a great question, and it goes to the root of the issue: Why has humanitarian action evolved so far without calls for evidence to play a central role in decision-making, evaluation, etc.? The answer is very case specific, but underlying these individual cases is a system that runs on good intentions, and we do not need evidence to prove our good intentions. The system has a harder time ensuring that those good intentions yield good results. There, though, we also run into the problems raised in the webinar by Nancy Cartwright – which might also play out in the manager's mind. Does the evidence actually tell us what we say it tells us?
“Evidence in aid is time dependent. Indeed, receiving perfect information too late will not be useful during a response, yet there is high pressure to achieve methodologically correct evidence in real time. How can we square that circle?”
- Humanitarian Operations Advisor, UK
We develop “good enough,” practical tools that can serve us even in the murkiest of situations. An overly complicated methodology is not right for a fast-moving situation. What is often missing is the willingness to take risks and make judgement calls, and then to hold yourself accountable for the decisions you have made. Many of us tend to prefer to hide behind methodologies and data.
In my years of work in development and humanitarian action, I have come to realize that “methodologically correct evidence” is an illusion that no serious actor pursues. Actors do, in various contexts, define what constitutes a “satisficing” level of confidence in the evidence available. Once the evidence is considered “fit for purpose,” relatively little time is usually spent assessing its “correctness” – I think this is an effective way of responding on the spur of the moment. However, while evidence for operational purposes is being used to respond, evidence for matters relating to recovery is explored in a parallel process.
“Do you think there are cultural differences between humanitarians – often accused of being cowboys, with a cultural bias toward 'doing' – and academia's slower timeframe and mindset – more focused on methodology and seeking correct approaches – thus creating a clash of timeframes, accessibility, understanding, etc.?”
- Humanitarian Advisor, UK
I think Larissa Fast's article covers this well. Here is a quote on the cultural divide: "The roots of the divide emerge from the differing conceptions of the purpose, sources, characteristics, and time frames of data collection and use on the part of researchers and practitioners." So maybe not so much about “cowboys” and “librarians” as deeper issues.
“Can efforts toward improving evidence also be made accessible and useful beyond larger organizational contexts – i.e., to smaller local civil society organizations (CSOs) that lack the capacity to collect large amounts of relevant data – and if so, how?”
- Systems Designer & Analyst, UK
So far, most efforts have been directed at ensuring that evidence plays a larger role in the formal decision-making of the large international humanitarian agencies. There are good examples from various contexts of local organizations playing a larger role – for example in Bangladesh and Kenya. However, there is still a long way to go.
I do not think the issue is that local organizations have weak capacity to collect data. They have a deep contextual understanding, which is often missing from large international agencies, and tapping into this knowledge will help create a more inclusive and robust humanitarian narrative.
Improving evidence is about providing better methods for the collection, analysis, and use of data to support decision-making. There seems to be a thirst for more evidence to be provided and made available in response situations – which implies costs, expertise, and investments that smaller local CSOs may find hard to mobilize.
At the moment, in response to the question “Do they have access…”, I would say not always. This situation creates a dependence on secondary data for local CSOs, even though they are often the ones closest to the emergency situation.
On the question of “how” local CSOs can better benefit from efforts to improve access to evidence, I can only refer to the call for knowledge networks and knowledge products mentioned in my intervention on 7 November. Particularly helpful in this endeavor would be, in my view, better use of open data platforms and knowledge platforms to ease access to evidence. For instance, the World Bank’s “Open Knowledge Platform” regularly shares publications on thematic issues and makes these readily available online. I think the humanitarian response community should learn from that experience and develop a similar platform to support knowledge networks and better dissemination of knowledge products.
“Overall, if you balance the pros and cons of using evidence in aid, including its flaws and risks, do you think an evidence-based approach in aid tends to damage work standards and the quality of humanitarian action?”
- Child Protection Officer, Tanzania
I do not see any cons to using evidence in aid. I can see how an overly rigid focus on unachievable standards can have a detrimental impact, but that is very different from saying that it is a bad thing to use evidence. The key is that we must have the courage to work with the data we have and hold ourselves accountable for the decisions we make, with the evidence at hand as one input. Giving evidence a larger role in decision-making cannot be a bad thing.
There is a risk that an “excess emphasis” on the use of evidence in aid – e.g., insistence on demonstrating the impact of aid – can seriously damage work standards. The evidence for this is in the rising number of cases where the principal-agent relationship is leading to “forged” or “exaggerated” reports of impact, as well as the “manipulation” of indicators to influence decisions in favor of more funding.
At the same time, though, the quest for evidence has driven significant investment in mobilizing better data at the country level, and certainly, much of this “information and statistical capacity building” has led to a better understanding of development and humanitarian outcomes and of what influences them.
Provided that humanitarian and development actors are aware of just how far and how much “evidence” they should demand from actors on the ground, the damage to work standards and quality of humanitarian action can be limited.
“Often, discussions on the value of evidence do not include donors, and therefore it is hard for them to understand the importance of investing in research and evidence. What levels of financial and non-financial investment do you think are needed in the humanitarian sector to overcome challenges in the quality and use of evidence?”
- Director, Consultancy, UK
I am not sure about the premise here. Is there enough funding for evidence-based research? I don't know. But it seems to me that a much stronger call for evidence has come from the donors, and that it is donors who are insisting on seeing evidence of the effectiveness of the programs they fund. Secondly, I would also respond that humanitarian agencies themselves must finance this research if they are unable to obtain funding for it. It is their responsibility to ensure that their programming is effective.
"In my opinion, we need in the humanitarian sector an open-data approach to evidence, i.e. by using programs similar to open-source software, in which multiple organizations can contribute and update collected data (e.g. map data, metrics, individuals’ data - subject to privacy constraints, etc.). To what extent do you agree or disagree with this? And how do you think we can move towards this approach without hindering people’s privacy and security?"
- Independent Consultant, UK
We always need to keep an eye on the privacy issue, but I do believe it is manageable. We should share data, but I find that we focus too much on creating portals where we can share data, and then we expect miracles to happen. Our data is often not good enough to “speak on its own.” Therefore, in addition to all the focus on building portals and sharing data sets, I wish we would also focus on building an ecosystem of analysts who would share not only data but also analysis, help each other, and together make sense of crises.
You can access the rest of the Q&A as well as further resources on the event page.