17-May-2022
1. Tell us about a memorable recent evaluation experience
Conducting evaluations in the context of COVID-19 has been a significant recent experience. In March 2020, the onset of the pandemic brought travel restrictions while we were evaluating the Bank's strategies and programs in Gabon, an evaluation I was leading. This limited the team's ability to make direct observations in the field and conduct in-person interviews with the end beneficiaries of the Bank's interventions. As a result, we had to be very flexible in adapting to this new context, including redesigning our data collection approach. We worked closely with the government and the Bank's office in Gabon and strengthened coordination between team members operating in different parts of the world. Virtual interviews, support from local consultants for site visits, and increased use of secondary data allowed us to collect all the evidence and prepare an evaluation report that was appreciated by the various stakeholders, including the Bank's Operations and Development Effectiveness Committee.
2. Would you say that evaluations are similar, or each evaluation is unique?
Evaluations differ in many ways: in their nature, scope, objectives, stakeholder expectations, and so on. For example, an institutional evaluation focused on analyzing the Bank's internal policies and strategies does not present the same challenges as an evaluation of the Bank's strategies and programs in a given country. Likewise, a project evaluation does not present the same complexity as a thematic or sector evaluation. Similarly, a real-time evaluation of an ongoing intervention will place more emphasis on design and implementation issues than a summative evaluation of a completed intervention, where more attention is paid to effectiveness and sustainability.
However, all evaluations adopt the same rigorous approach based on a clear conceptual framework, stakeholder involvement, and a sound methodological approach. The goal is to gather sufficient evidence to make objective judgments about the results of interventions and the lessons that can be learned from them.
3. Tell us a story about an evaluation that taught you a valuable lesson
I recall a field data collection mission for a completed power project. The project had been awarded a management concession, and all the project reports made it look like a success story. Following a meeting with the management company, we insisted on visiting the facilities, despite the company's attempts to discourage us. Direct observations and discussions with the engineers on-site revealed a lack of equipment maintenance and hence a risk to sustainability. This anecdote reminds us of the importance of triangulating sources of information and, especially, of the added value of site visits and exchanges with the final beneficiaries.
4. How can the evaluation practice be strengthened?
I believe that action is needed at five levels:
- Strengthen the self-evaluation system: at the Bank level, for example, operational teams would be better equipped to handle accountability issues and to collect reliable data on the Bank's interventions, which would facilitate ex-post evaluation.
- Diversify the offering of evaluation products to meet the growing demand for timely and high-quality evaluations. This would require extensive consultation with stakeholders to identify their evaluation knowledge needs.
- Ensure ongoing professionalization and capacity building of evaluation teams to better design and conduct credible and influential evaluations.
- Continue to promote learning from experience. This means focusing on activities that disseminate the lessons learned from evaluations to the various stakeholders.
- Mobilize resources and join efforts: initiatives in Africa such as national evaluation associations, the Evaluation Platform for Regional African Development Institutions (EPRADI), and the African Parliamentarians' Network on Development Evaluation (APNODE), as well as global networks such as the Evaluation Cooperation Group and EvalPartners, are very useful for promoting the evaluation practice through standard-setting, experience sharing, and more.
5. How do you think evaluation practice will evolve? What changes will we see?
I will mention three major developments:
- The world is facing new challenges and increasing complexity in development interventions. Innovative evaluation approaches will be needed to address these challenges and to capture indirect or unintended effects.
- The demands for high-quality evaluations will drive further professionalization of the evaluation function and a readjustment of approaches and tools. The revision of the evaluation criteria of the Development Assistance Committee of the Organization for Economic Co-operation and Development (OECD-DAC) is an example.
- Like many other disciplines, evaluation will be influenced by technological change. Evaluators will increasingly use sophisticated data collection and analysis techniques, such as geographic information systems (GIS).
6. What do you like most about your job?
Evaluation is one of the key sources of information that informs decision-making. I find my work exciting, and I feel that I contribute to the production of evaluative knowledge and to learning at the institutional level through the practice of evaluation. In addition, I like the fact that the evaluator is a key agent of change through constant interaction with a wide variety of stakeholders: those involved in the design and implementation of interventions, the decision-makers, and the end beneficiaries.