Abstract
Program evaluation can support capacity building and inform practice and policy. Yet long-term efforts to ensure evaluation use (EU) in the humanitarian sector are seldom documented, leaving much uncertainty about EU conditions. This study examined conditions that influenced EU by stakeholders of a humanitarian non-governmental organization (NGO) in Burkina Faso striving to base its health care program on solid evidence. It used 36 qualitative semi-structured interviews and a single case study design to document stakeholders’ (n = 26) perceptions of EU conditions. Analyses focussed on characteristics of five broad conditions of research use previously documented. Results demonstrate that EU was facilitated by intended users with proactive attitudes, research experience, and willingness to participate in program evaluations. Also helpful was an organizational culture that valued learning, feedback, and accountability, wherein leaders collaborated toward common goals. Evaluation-based knowledge that met information needs and that was actionable, contextualized, and quickly accessible enhanced EU. Knowledge transfer strategies promoting EU were diverse, participatory, adapted to needs, and regularly followed up. Evaluators who were trusted, experienced, credible, and adaptable promoted EU most effectively. Conversely, EU was compromised when intended users felt distrustful, uninformed, or unable to engage in program evaluations. Knowledge contradicting expectations or deemed inapplicable impeded EU. Adapting knowledge transfer strategies required time and interactions. Initially, evaluations were not sufficiently adapted and put into plain language, which hampered EU. EU conditions are numerous and intricately interrelated, but interpersonal relationships, trust, and effective communication are key conditions for evaluators and stakeholders wishing to promote EU.
Notes
The terms “program evaluation” and “evaluation” are used interchangeably in the current article to facilitate reading.
Acknowledgements
Over the course of this study, Léna D’Ostie-Racine received funding from the Global Health Research Capacity Strengthening Program (GHR-CAPS), a partnership of the Canadian Institutes of Health Research and the Québec Population Health Research Network. She was later also funded by the Fonds de recherche du Québec – Société et culture. The authors wish to express their utmost gratitude for the kind assistance and proactive participation of HELP managers and staff, the external evaluators, the district health management teams of Dori and Sebba in Burkina Faso, and the ECHO representatives, who together made this study possible. The authors also wish to thank Ludovic Queuille for his support throughout the study and for his insightful comments on previous drafts of the present article. The authors are also thankful to Didier Dupont for his consultations on qualitative analyses and to Karine Racicot for her remarkable help in reviewing and clarifying the application of the codebook. We also wish to thank all those, including Zoé Ouangré and Xavier Barsalou-Verge, who helped transcribe the interviews, which contained a vast array of African, Canadian and European accents. Our gratitude also goes out to all colleagues who provided support and insights throughout the study and/or commented on drafts of this article.
Funding
This work was supported by the European Commission (ECHO), which had no influence on the conduct of this evaluation. The first author received financial support from the Fonds de recherche du Québec – Société et culture (FRQSC) and from the Global Health Research Capacity Strengthening Program (GHR-CAPS), a partnership of the Canadian Institutes of Health Research and the Québec Population Health Research Network, as well as support from Équipe RENARD.
Ethics declarations
Declaration of Conflicting Interests
The first author has benefited from HELP’s logistical assistance. The second and third authors have both worked as consultants for HELP. The funders and the NGO HELP did not take part in decisions on the study design, data collection, or analysis, nor in the preparation or publication of the manuscript.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendix 1
Interview excerpts from 2009 and 2011. Evaluation partners: HS = HELP staff; EE = external evaluators; MoH = Ministry of Health representatives; ER = funding agency (ECHO) representatives; AP = advocacy partners.
CONDITIONS | Verbatim |
User | |
- User attitude (+ Change) | 2009_EE3: We are producing relatively interesting data that demonstrate very interesting effects of the project but also particular problems and to my surprise, because often it does not always work, in this case we feel an NGO and people ... (managers), very open to evaluation, very open to criticism which is rarely the case by these NGOs and a real willingness to use what we did. And in fact, we realize in the following months that they really took our recommendations into consideration. They really changed their way of doing things and their way of doing interventions. 2009_EE7: A: I would say that the receptivity of people, from the beginning they were receptive! 2009_HS23: R: Everywhere, they had submitted the (confidentiality) forms as you did to explain that people are free to participate or not, that it’s confidential and it’s just to improve and to see how the work is going, if we can improve and what we can improve. So we have not seen people withdraw or say no I do not want to participate. People were happy and they joined/adhered. |
- Utility perception | 2011_HS7: the R4 (sub-program) is normally the light of HELP. It is this subprogram that normally sheds light on what we are doing by capturing in analytic form the way in which we actually intervene at the level of this project .... So, this sub-program helps us to explain our intervention by helping to identify what works and what works less well ... of course it is useful, from the moment that it helps us to explain situations and it gives arguments to do our work. Because these reports are submitted but there is also an effort of simplification. Through poster presentations and Policy Briefs, there are Policy Briefs that result from these studies and enable access to a wide public. |
- Expertise (real) (+ change) | 2011_HS22: The team was really receptive. The people, they listened. I think it’s because the field seemed impenetrable. Yes, so, people were listening with great interest, but the field seemed impenetrable, we did not understand much about it. So, a little time was necessary before people were able to understand. It’s like I said, the field of research is not given to everyone. So when we come and talk about gibberish, sometimes even the wording of the research question, people do not understand. They did not understand what the research objective was, what we were looking for, it took time for people to understand these things and personally I confess there are aspects that I did not master very well. Especially, the aspect: data production and data dissemination. At first, frankly, we did not really know what it was, particularly the diffusion part, but as we went along with the implementation of certain activities, it enabled us to understand. And we even became actors in the diffusion activities and actors in the implementation of certain research projects. So, that helped us a lot. |
- Expertise (perceived) (+ change + jargon) | 2011_HS7: Well, when I say that sometimes there are concepts that are a little technical with which I am not very familiar it doesn’t mean that I don’t always understand what is said but I mean that I don’t go in the details. I cannot partake in a debate on whether this type of data collection strategy is reliable and it enables us to get the information we need. No, I can’t partake in such discussions because it is not at all my field. 2011_HS2: In fact, I think there is an increasing interest on my part. At the beginning, I had just arrived and even the goal of leading a project like this on the field was already a huge challenge for me, it was really a challenge to take on. And it was as if these super intelligent researchers were coming and for me, I didn’t have to get involved in the research you know, because I could only slow them down because I didn’t measure up, so there was that also separating us. And also, I wasn’t trained in research so when people spoke to me about research, or about a research protocol, I didn’t know what it was, I didn’t know what a qualitative study was or a quantitative study and the interviews you know, all of that it exceeded me. When technical terms are used it scares you, you don’t even…. Hmmm… however, it was nothing too complex, it was just terms but I felt that I wasn’t in my place. |
- Participation | 2011_EE8: After they more or less collaborated, you know people who say they would like to be involved. Some people would say that but between their words and their actual interest to collaborate… |
- Proactiveness (+ Organisational culture) | 2011_EE8: I think it is linked to the development of an evaluation culture. And why him (a HELP staff member) exactly? It was because he was receptive and proactive, not just saying yes, yes, it’s good to talk about it. He demonstrated that he is interested, he acted and did things. There was an international conference on health promotion in which we had a short session to present different research results and communications from our project and he came. He took part in the delegation and he presented, that is an example. |
- Mandate | 2011_HS22 (following the question: what factors promote people’s tendency to participate in, or to use, evaluations): I think the principal factors are according to the field (of work) and the other factor I believe is… hmmm the position, or the interim position that we hold (e.g., of management): when the manager was not there, and since everything usually went through her, we often had the opportunity to be involved, when we replaced her. |
Organisation | |
- Culture | 2009_EE3: so the idea for me, in my head, is that I do an evaluation for an NGO that I know or don’t know and that it will be really, really difficult, because I have long experience in evaluating NGOs where often the evaluation reports are not used at all, because the NGOs are not well organized, or because they don’t want to use them, etc.… 2011_HS2 (looking back on 2009): so I think that by the nature of things it was not easy to integrate the evaluation strategy with the other three (subprograms of the exemption program) as well as the three other subprograms integrate together. So the other (evaluation strategy) was a little, well, a little external, whereby studies would be undertaken on this or that without the feeling that there was a comprehensive plan or that it was discussed with the team. 2009_EE27: R: it’s really nice, especially the HELP team, they were always really, really nice. They really helped me; they were always there when I needed a hand and all. They were welcoming, smiling. It was a nice atmosphere. |
- Engagement | 2009_EE8: (I am) happily surprised by the continuity of their position, of their interest in these documentation/evaluation activities that were not, let’s say, easily integrated into their usual humanitarian aid mandate at ECHO, such as saving lives. Despite all of that they keep supporting us, encouraging us in spite of budgetary restrictions mostly; there was no pressure to eliminate these kinds of activities, which would be the easiest solution for them. So there is a real desire on their end, a real interest in all of this (evaluation/documentation activities). 2009_HS (management): so we are also in a consultation group with these (other) NGOs; we see each other once a month, often more, especially with the active stakeholders, and so they accompany us and we do it! We really try to use the data, but not only for us; it’s good because we put that in our project. |
- Leadership | 2009_EE3: with HELP, the follow-up story there is that when they (HELP decision makers) decided to try to replicate the same project in Burkina Faso (as in Niger), as they knew that I had worked in Burkina Faso and that it went well... so at the same time that they wrote their funding application to ECHO for the Burkina project, they immediately consulted me. Usually NGOs go out looking for researchers or evaluators only once they have written their project; they have the money and need someone to do the assessment. There it was different; that’s why we managed to do all that. |
- Resources (financial) | 2011_HS11: I think for the participants, it had an effect. Now, this effect deserves to be maintained or reinforced. Otherwise, if we limit ourselves to one research project and if there are no other partners who will finance the research, the knowledge will degrade and the effects will not be long-lasting. Perhaps if at this time there was another research theme with a little funding, we would see that after two or three research projects, people would be able, without anyone behind them, to properly conduct research and maybe look for knowledgeable people to read their writings/reports. But if nothing is done, these effects will disappear by then. (Q: So it would be early reinforcement?) HS11: yes! It’s good to have a foundation, but you have to work to consolidate and strengthen that foundation. |
- Resources (time) | 2011_HS11: However, the difficulty at one point was that the members of the action research team did not have the time, since they were members of the ECD (équipe cadre de district/district management team), and at some point the research was suffering terribly, since they could no longer move forward because they also had all the activities of the district too. So it was necessary to send reminder letters, to call, to send emails, to even review the planning of the various research stages to postpone further, to allow the people to advance. So I think there was a change in the timeline twice. |
- Communication | 2009_HS: often he’s embarrassed because people come here, and then they do things he does not even know about; they even make decisions and everything, and so it is, but it’s a matter of communication even among us too, I believe. Yes! |
- Objectives | 2009_HS2: no, I think, in fact, all the evaluations are not incorporated into the daily work; these are things that come one at a time; we say we will evaluate that, then we will evaluate that, etc. There are some evaluations that require a lot of work and, it’s true, I feel it’s an extra workload; but those that do not require additional work are when they are aware that it is useful, etc. And sometimes they have trouble because evaluations can disrupt the activities we are planning to do. For example, in the health subprogram, the supervisions etc. They had trouble understanding that all our vehicles were needed for this evaluation study; they did not understand, since for them the feeling they had was that the evaluation was more important than our own activities, those we do every day, etc., and it did not go down very well. |
- Needs consensus | 2011_HS7: I’m more at the stage where the studies begin (…), right at the start, we discuss these with the Université de Montréal to clarify the interest of exploring these themes for us, for our project… |
Knowledge | |
- Aligned with need | 2011_HS7: R4 (the evaluation strategy subprogram)? It’s us, we are the ones making the request; we are the ones who need R4 (the evaluation strategy). What we do is very complementary. That said, it is we who must facilitate the work of R4 so that they can give us the information we need, as answers to all the questions that arise around the user-fee exemption program. |
- Accessible | ER: (following the question: you have access to the documentation that is produced by HELP?) Yes, it is shared. We ourselves receive it and try to disseminate it to the actors and our key partners. Each time we receive the documents, when people come by, or when we go somewhere for our advocacy work, we try to put the documents on the table and discuss the points that are in the documents; we are really doing this advocacy every day. |
- Applicable | 2009_HS7: We had, EE3 I believe, an evaluator who did a field study, even before we got the final results, and who said that at the HELP level, people do not have the information that access to care is free in some villages. And so, there are women who still go to health facilities and who pay. This is very good information for us. It is true that in our indicators it was said that all target groups have free access to 90% of care. But if we’re told “no, no, it’s not true because we have seen people who do not have access”, that’s good information. We must know why; that interests us, to look and see that there are flaws in our system. So we decided we will formalize the information; we will write a roadmap where we will explain who has access to care, what the terms are, and when a patient arrives how it is done, and we will give it to all health workers; we will post the information in all health facilities. We will do a program on the radio and then we will explain, in addition to the work that the facilitators do every day by going to the villages to sensitize the community. We will redo all this. Like that, nobody will say “I was not aware”. But if we had not had this feedback we would not have it! 2011_HS10: And it’s like a training that leads to the development of a protocol and the execution of action research, and so there were various themes; each person chose a theme in relation to a problem he could identify in his district and formulated the research theme. |
- Timely | 2009_HS11: (discussing the action research project) well, it’s simple, since there is first the training, and the themes had already been determined for each team, and a blueprint of the procedures had already been explained so that each team could follow the procedures. Well, I was responsible for ensuring that the teams got to work immediately and followed the steps that had been planned. However, the difficulty at one point was that the members of the action research team did not have time, since it involved people from the ECD (district management team), and at some point the research was suffering terribly, since they had their work activities for the district too, so they could no longer move forward with the research. So it was necessary to send reminder letters, to call, to send emails, to even review and modify the planning of the various stages, to give time allowing people to advance. So I think the planned timeline was modified twice. |
KT Strategy | |
- Fits needs | 2009_ER31: A: Yes, we share all the information, and every time there is a document that comes out, it is in relation to this document that we try to review the whole advocacy strategy and its implementation. So these are documents that we use a lot at our organization (OE). It is very useful and very relevant to us, to advance in our advocacy, because so long as we do not have documents from evaluations that have been carried out, that enable us to work with the partner, to see what has been done and what still must be done, we will not advance. So for us it is very important. 2009_ER20: I’m pretty comfortable with the pace of things because what is being published is on relevant and interesting topics. I know of other projects where systematic reviews are undertaken but the topics, quality and interest of the publications were suffering. But I’m pretty comfortable with how it’s going and I would not want to comment otherwise on it. Because it’s also part of that trust that we have; there’s no lack of interest in publishing when it’s needed and when it’s relevant. You see? 2011_HS22: That is to say that, for the recommendations, there should be incentives for the implementation of the recommendations, for the leaders of HELP. That is, it should lead to recommendations for HELP in general. In fact, there should even be a more active incentive to implement this recommendation. However, it’s true that I am not even aware of a recommendation that has not been implemented but... Q: I’m not sure I understand, to apply? R: That’s it! How to do it, how to apply/implement it, what opportunity can we have for... here, that’s it. Q: It’s not like one needs to show why it would be important? R: no. Q: So more showing how to implement? A: How, and even follow-up to remind them. Q: Do a follow-up? A: Yes, follow-up on the implementation of the recommendations. It is to attract attention. 
It’s true, it’s too much to ask them perhaps, but as they are involved in the projects, it’s one of the benefits of having them next door. 2011_HS11: Yes, it answered information needs. And I think it was really a good opportunity. In any case, the factors that facilitated these studies are that they were conducted in our own field; that means that the studies were opening doors to allow us to look better inside, so the field of study was very well targeted; it really seemed like a mirror. It reflected the true image of the thing we are trying to understand, and because it was very easy to import results from somewhere else, or at least to tell us the results achieved when we did this and so forth. So the studies were realistic and specific to our work context. |
- Tailored to EPs | 2011_HS2 (looking back to 2009): here I think that by nature it was not easy to integrate this component (evaluation strategy) with the 3 others (part of the exemption program), whereas the 3 other project components were well integrated, intermingled in fact, even in terms of their activities, which came together. The other (evaluation) was a little bit, like, a little bit external. So then they came to study this or that without the feeling that it was planned in common or discussed with the team. |
- Adapted | 2011_EE8: I think there are limitations in the end because we do different things, etc. Their responsibility is the implementation. It’s not the documentation. And, with people, there is what you say and what you do. Effort is certainly needed; it is now clear that the reports were not worth sending, because nobody read them, and it was not only internally (HELP), it was also external (partners). |
- Interpersonal relationships | 2009_ER20: Because it’s also part of that trust; there’s no lack of interest in publishing when it’s needed and when it’s relevant. You see? We do not have this type of relationship; it is not a contractual relationship like those found elsewhere, where there is a certain quota of publications required, for example. We are not at all in this situation. 2009_MoH14 (discussing relations with HELP): It’s just for that, since it’s more of a fraternal relationship than a working relationship, and by the way it makes things easier. 2009_HS10: It also means that for the teams who conducted the research together, it makes a connection, and then it allows interpersonal relationships to build, and that facilitates more mingling and easy communication between HELP and its partners. |
- Exchange mechanisms | 2011_HS2: Yes. Even before I left Ouaga, I was more and more involved from the moment they felt that I was interested, from the moment that, when they sent me an email, I took the time to answer point by point; and they felt that, so instead of sending me one email they sent me 5 and then it was 10; they felt that there was a return, so it’s positive. 2009_MoH25: the intervention by HELP and then often the studies. We made this accessible in all health centers... but as people here have a phobia of reading, they do not want to read, so often it means that people do not have the information, and yet the information is delivered in their lockers. (Q: did you take time to read these documents?) Yes, a lot. And then with the meetings they were organizing; they came and presented what was done... (They were saying) you remember we came between such times, we asked for this and that, and it is this person who has... when he (an evaluator) came to Dori and presented the results of his studies, I was there. So all that, there are meetings that, as I said, allow us to lend ourselves more to the studies and such activities. |
- Common language | 2009_HS23: R: Yes, and I was explaining clearly; she had evaluators, but since I explained clearly, the women understood me better! Q: Did you speak the same language as them? R: yes, yes, Peuhl. 2011_HS22: The team was really receptive. The people, they listened. I think it’s because the field seemed impenetrable. Yes, so, people were listening with great interest, but the field seemed impenetrable; we did not understand much about it. So, a little time was necessary before people were able to understand. It’s like I said, the field of research is not given to everyone. So when we come and talk in gibberish, sometimes even the wording of the research question, people do not understand. They did not understand what the research objective was, what we were looking for; it took time for people to understand these things, and personally I confess there are aspects that I did not master very well. Especially the aspects of data production and data dissemination. At first, frankly, we did not really know what it was, particularly the diffusion part, but as we went along with the implementation of certain activities, it enabled us to understand. And we even became actors in the diffusion activities and actors in the implementation of certain research projects. So, that helped us a lot. |
- Timing | 2011_HS2: then the results did not come; it’s as if, in monthly meetings for example, we did not get updates on the progress of activities, etc. ... and people had trouble understanding where it was at, what was happening with all the data they came to take, where it was going, what it was going to be used for, etc... |
- Participatory | 2009_HS9: Yes, it is a very participative process, and that is the particularity, because throughout the process they participated in the analysis of the situation to identify all the problems, and they also participated in the process from defining their protocol and choosing the data collection material. And they even went to find employees, or they themselves supervised the data collection, and the analysis was also done by them, and they drew the conclusions; that allows them to appreciate the relevance of what they drew as conclusions and the actions that can be recommended from these results. We think that this is the interest of this approach. |
- Monitoring | 2009_HS9: for the process, there was a follow-up to find out if the teams had followed the different steps. Now, once the last activity, the presentation of the results, had taken place in July, there would be follow-up afterwards, seeing that the teams implement the conclusions they drew from their research. It’s probably later that they should do it. |
Evaluator(s) | |
- Interpersonal skills | ER20: In fact we contacted (a senior evaluator) because he was known internationally as a specialist in the field because of his publications and his commitment, and I think I would put forward his commitment. He is someone who, apart from the fact that he is extremely talented and has a huge list of publications, is also someone who, for us, is a collaborator who really, eh, how do they say, someone who is without pretence. Someone who really wants to dig deeper together on the relevance, but is also someone who is very nice to work with, and so it makes a lot of things easier. And I think he has come to trust us over the months and years. At least that’s how I feel it. |
- Collaborative attitude | 2011_EE8: After we collaborate, well, I do not know what people can say about the project, but after all we cannot wish for too much integration, because the R4 component (evaluation strategy) is different from components 1, 2 and 3 (HELP projects), which are all directly related to the implementation of the exemption. So they have an obligation, and they are all together, speaking from a geographical point of view, but they are working on the same thing, on the implementation of the project. We do something different, so at a given moment it’s normal also that there are distinctions. We do not participate directly in the implementation; it’s normal and we do not complain. They are not directly involved in evaluation activities, and to some extent it is normal. What I mean is that I’m not against integration but, at the same time, it’s normal, so we should not try to exaggerate in the other direction; sometimes, from the moment we ask ourselves the question of integration, there are people who... who can think that they do not do enough things together, with..., but everyone has their own field of activity, their own responsibility also. |
- Communication (change) or communication clash | HS2: For example, (one of the evaluators), from what I have seen since the beginning of the project so far, I see a very strong evolution of his ability even to share and translate these activities so that everyone understands. When I see his presentations from his beginnings and those from now, I find that he has really improved. 2009_HS (concerning the lack of communication with the evaluators): yes, and then information that must be shared; you cannot send us someone and then say here, and then there are evaluators who arrive and who want to sit at the desk there! And nobody knows what they are doing! In fact it’s something that is written somewhere in the big project document; we said we’ll do this, that’s good and we have to do it to get results, that’s clear, but when they plan to come we have to know what phase is about to happen and why! |
- Mentoring | 2011_HS10: they accompanied us to develop the protocols and to carry them out. That’s a little the way that I see it... So it’s not like a protocol that is emailed to you and that you co-write. It’s more of a supported activity like that, over time. 2011_HS11: I think for the participants, it had an effect. Now, this effect deserves to be maintained or reinforced. Otherwise, if we limit ourselves to one research project and if there are no other partners who will finance the research, the knowledge will degrade and the effects will not be long-lasting. Perhaps if at this time there was another research theme with a little funding, we would see that after two or three research projects, people would be able, without anyone behind them, to properly conduct research and maybe look for knowledgeable people to read their writings/reports. But if nothing is done, these effects will disappear by then. (Q: So it would be early reinforcement?) HS11: yes! It’s good to have a foundation, but you have to work to consolidate and strengthen that foundation. |
- Adaptable (change + communication) | 2009_HS2: But now it’s really improved; after that, even the evaluator was aware of that, but sometimes it’s a little rushed... well, we discussed it in meetings, etc.; as they were also aware, they said good, now we will communicate better, so now, even when they discuss between themselves, for example in Ouagadougou, they leave us a copy so that we see what is going on... by internet, by email, or even Skype! 2011_EE: (on the need to adapt the KT): we will adapt and try to adjust our strategies, including our products (publications, presentations), etc., to circumvent these difficulties. The example of the evaluation reports is significant. People do not read the reports, so we have to see about writing 4-page notes (e.g., policy briefs). Now, if partners do not read 4 pages, well! Not reading a report we can understand, but 4 pages; we could always say good, we could do otherwise, we could make a film every time we need to transfer information (knowledge), video clips for people to watch, a song, dance or whatever. |
- Credible expertise | 2009_ER20: And it became a regional project where they really reinvested, and they needed a helping hand from more specialists to help with the protocol, because it’s a country where it was not yet known, and we started with other partners on the same topic of the user-fee exemption, and we got in touch with him (the principal evaluator), whom we knew because he had already published; it’s not a big world, but he’s a specialist, and that’s how the project has evolved. |
- Network | HS10 (following questions about the benefits of participating in the evaluations): it allowed me to update my knowledge and skills in the field. Then I was able to take advantage of the mingling with the other people on the team. |
- Resources (institutional) | 2009_HS: There are two organizations that are particularly concerned: (an NGO partner of HELP) and HELP. Because they planned it as part of their project; the others joined in, they did not have these kinds of activities planned in their project, and thus, as it was planned by (the other NGO) and by HELP, there were budgets and resources. The idea, the principle, is the use of resources to carry out these activities, considering that it is mainly HELP and (the partner NGO); and after, HELP acts as a motor organization, because we have planned this a little more, so we have a little more resources; it is because we have a partnership with the University of Montreal, so HELP is the engine behind all of this… We figured it is better to use external resources, otherwise we risk drowning and not doing things well, not doing them in time, or not being capable of doing them. Therefore we sought the services of an external evaluator. |
- Geographical distance | 2011_HS2: So that’s the team that is based in Dori, because the project was at Dori; but the R4 component that was studying it, even if there were a lot of missions to Dori, was still based in Ouaga, so there is also a geographical distance. Between us we could exchange information; I went into the offices of some and even had informal discussions over a drink here and there, and that made everything more cemented, unlike the R4 component: even if they made the effort to share the information, it was through written documents that arrived via the internet, so either we read them or we don’t, and it was not in the same rhythm as the field team. |
Other coded conditions | |
Change - evolution | 2011_HS2: It’s all those who were not used to being in contact with that, for whom it caused a change. Now at the level of the partners it’s the same: there are some health workers who were involved, and when you discuss with them they really say: we saw the evaluators come to take our data; “take our data”, it’s even the expression they use; they came to take our data and go away, etc. ... But after having participated in presentations, having seen that it served for something, having seen the results, it encouraged them to collaborate more the next time. 2011_HS2: (Q: Can you describe at this point the level of integration of the evaluation strategy (R4) in the exemption project?): It is a good question because it has evolved a little, I think, over time. I do not know if I have to go back so far in time, but at the beginning it was very separated actually; people had a hard time figuring out what the R4 was doing, and then, as we said last time, it was often students who came to do their study and then left without explaining exactly what it was. It was planned at the level of Ouaga and decided at the level of Ouaga, and the research protocols were done at the level of Ouaga, and then it happened at the level of Dori only when logistics were needed in the field; in the end it felt like that, but it was not necessarily the case, because there were documents circulating that were not necessarily read at the field level. In any case, it was very separate; it felt like that, whatever, whoever’s fault it was. And I also think that the two principal evaluators did a lot of work to make it more and more integrated, but it was not easy because it was really different. 
The problems, the objectives differed a bit, and it’s really when they started to involve the team members well, and then also when the field team started to understand the interest behind these studies, when they also started to see the results, because when we take the household survey for example, it took a very long time before there were usable results, but that’s where it started to integrate. |
Clash - challenges | 2011_HS10: Not catastrophic, but there are always difficulties. I think that once we had major difficulties where the budget that was planned, for example, was totally underestimated. So we had to request a supplementary budget, and there we had to explain a lot, a lot of times... Once we conducted a study, but we had not taken into account the season of the study; I think it was the rainy season and it was complicated: the vehicles got bogged down and the evaluators could not reach the sampled villages with their motorcycles. Well, it’s logistical difficulties like that. |
Cite this article
D’Ostie-Racine, L., Dagenais, C. & Ridde, V. Examining Conditions that Influence Evaluation use within a Humanitarian Non-Governmental Organization in Burkina Faso (West Africa). Syst Pract Action Res 34, 1–35 (2021). https://doi.org/10.1007/s11213-019-09504-w