03-March-2022
1. Tell us more about your work as a Chief Quality and Methods Advisor
As Chief Quality and Methods Advisor, I serve the Evaluator General, Managers, and Evaluation Teams on questions of evaluation standards, approaches, designs, methods and tools for Independent Development Evaluation’s work. My work also involves reviewing IDEV data collection, management, analysis, and reporting standards and tools. I also revise and update quality assurance processes, manuals, standards and tools and support evaluation capacity development in the department. Last but not least, I lead the preparation of the three-year rolling work program and subsequent annual updates.
2. Tell us about one of your most memorable recent work experiences
A recent memorable work experience was the 2022-2024 IDEV work program, for which I was the task manager. The work program seeks to operationalize the three objectives of the independent evaluation function: accountability, learning, and promoting an evaluation culture. In addition, it places strong emphasis on stakeholder engagement and the utility of evaluations. We followed a two-phase design approach. First, we conducted a desk review and stakeholder consultations to establish a longlist of potential evaluations. Second, we prioritized evaluations from the list generated in phase one, based on four key criteria: timeliness, primary stakeholder interest, materiality, and alignment with the Bank’s DBDM (Development and Business Delivery Model) priorities (especially for corporate evaluations). This mapping exercise resulted in three budgeted scenarios: low, base, and high. The three scenarios were first presented to the Bank’s Committee on Operations and Development Effectiveness (CODE) for comments, guidance, and endorsement. The work program was finally approved by the Board. This process was memorable because it gave all Bank stakeholders an opportunity to ensure that their information needs were adequately catered for in the proposed evaluations. It also provided me with an avenue to meet the Bank’s various stakeholders and learn about what they do.
3. What are the pitfalls to avoid in your work?
Keeping the quality of evidence at the center of an evaluation is key to its credibility. When reviewing evaluations, one must bear in mind that the analytical phase is where the strength and validity of evidence are assessed, and where any weaknesses caused by data gaps are addressed. Quality assurance at this stage is essential, and I ensure that findings are adequately supported by evidence. I assess both the quality of the evidence presented and the clarity of the analysis, identifying any gaps or weaknesses in the evidence. One key consideration in this regard is ensuring transparency and a clear evidence trail. In terms of transparency, the sources on which findings are based must be clear, and their reliability and validity assessed. Records of interviews should be kept to allow each finding to be traced back to its sources. In addition, the findings and conclusions of an evaluation should be coherently anchored in the analysis and documented in evaluation reports. Moreover, to delineate the evidence trail, I check that evaluation reports contain cross-references to the pertinent sections and paragraphs of the document, helping readers easily identify the findings that led to a particular recommendation and the analysis that led to a particular conclusion.
4. How can evaluation quality and method be strengthened?
Firstly, operations staff and other stakeholders should be more involved in evaluation reference groups. These groups are set up for each evaluation to review the accuracy and quality of various products throughout the evaluation process. Their participation improves both the quality and the utility of the evaluation through stronger ownership, adoption, and implementation of IDEV recommendations. Secondly, evaluations should continue to be reviewed by both internal and external experts. The use of reviewers is an important part of the quality control process. In addition, and as an example, IDEV has developed a standard evaluation process that has been codified, with an accompanying flowchart, to further clarify and harmonize evaluation practices across the department. Moreover, the peer review process at IDEV has been strengthened through the use of standardized Terms of Reference, checklists, review templates, and reporting formats. Thirdly, following the adoption of the revised international evaluation criteria by the OECD/DAC in December 2019, and the full development of the accompanying guidance for their use, IDEV has updated its evaluation manual accordingly.
Finally, maintaining quality and method requires investing in enhancing the capacities of IDEV staff. The challenges associated with conducting empirical data collection and site visits during the COVID-19 pandemic have fundamentally changed how data is gathered. IDEV has already organized two trainings on the use of remote data collection tools, new sources of evidence such as “big data” and geo-spatial data, and remote stakeholder engagement.
5. How do you think evaluation quality and method will evolve? What changes will we see?
COVID-19 has presented a scenario in which we need new methods that can deliver evaluative information quickly enough to inform urgent decisions. IDEV’s interest in Rapid Evaluation is heightened by the expectation that the continent will face new emergencies requiring a quick turnaround by IDEV. We have already planned to use the Rapid Evaluation methodology in the 2024 work program to assess the progress of the Africa Investment Forum.
6. What do you enjoy most about your work?
I particularly enjoy the fact that I get a broad overview of the types of evaluative knowledge generated by the department through my review of all evaluations, from the Concept Note to the final evaluation report presented to the Board. These evaluations include cluster evaluations, impact evaluations, country and regional evaluations, sector and thematic evaluations, corporate evaluations, evaluation synthesis, and other products, e.g., the annual Management Action Record System (MARS) report. From this evaluative knowledge, I also get an overview of the work of the Bank.