31-January-2022
1. What is your most memorable experience with IDEV?
I have two memorable experiences with IDEV. The first is when IDEV undertook the Comprehensive Evaluation of the Development Results of the AfDB in 2015-2016. At the initial stage of the evaluation, we were advised that it was "mission impossible". However, IDEV maximized the use of its staff's skills and potential and delivered the evaluation. This always resonates in my mind and reminds me how strong teamwork turned the "impossible" into the "possible".
The second memorable experience is how IDEV responded to the constraints imposed by the COVID-19 pandemic. In evaluation, it is important to interact directly with stakeholders, particularly program/project beneficiaries in the field, to establish credible facts. In the early stage of the pandemic, many member countries imposed travel restrictions to curtail the pandemic's negative impact. The IDEV team quickly became innovative and implemented various approaches, including virtual interviews, use of big data, online surveys, and extensive engagement of national consultants in member countries. As a result, IDEV delivered its work program without delays and informed many decision-making processes.
2. What have been your most important lessons learned over the course of your career as an evaluator?
I would like to share three main lessons I learned as an evaluator:
1) Adequate engagement with stakeholders enhances the quality and usefulness of an evaluation. In an evaluation, stakeholders speak to the performance of an intervention by providing evidence in the form of perceptions, views, and other facts. In addition, stakeholders use or own the evaluation results. Evaluators therefore have to identify the right stakeholders and engage them throughout the evaluation process.
2) Timely delivery of evaluation results is critical for the usefulness of an evaluation. If an evaluation is delivered after policymakers have made a decision, it can no longer influence the desired change.
3) An evaluator must keep learning new ideas and approaches to meet the demands of stakeholders and emerging situations. Practical and continuous training in evaluation methodologies (quantitative and qualitative), big data, and evaluation management is important to make evaluations useful to our stakeholders.
3. What are the common errors/pitfalls to avoid as an evaluator?
In my opinion, the three main mistakes to be avoided by an evaluator are:
a) Using inaccurate data/evidence. As evaluators, we should always check the accuracy of the data or evidence we use in an evaluation. Incorrect data or evidence leads to wrong conclusions and poor decision making.
b) Viewing stakeholders as "alien" or external to the evaluation. Evaluators should involve all relevant stakeholders in the evaluation process.
c) Neglecting triangulation when analyzing data and reporting evaluation results. Evaluators should pay attention to the different methodologies and sources of data when conducting analysis and reporting. This helps to reach an objective and unbiased conclusion.
4. How can evaluation practice be strengthened?
Evaluation practice should be strengthened in the areas of conceptualizing/designing evaluations, analytical tools including rating scales, and knowledge management. This requires a concerted effort among evaluators around the world. I hope that the creation of the Global Evaluation Initiative and EvalPartners will help to improve evaluation practice. Moreover, strengthening evaluation practice in AfDB regional member countries, in terms of both evaluation skills and institutional development, is equally important to enhance development results. Thus, supporting M&E systems in regional member countries is crucial. Otherwise, evaluation will continue to be a one-sided exercise facing shortages of critical inputs in terms of data and ownership.
5. How do you think evaluation practice will evolve? What changes will we see?
As global and regional issues change, stakeholders demand new evidence to support decision making. This requires innovation and creativity in evaluation methodologies for measuring results that credibly demonstrate what worked and what did not. Evaluators around the world have come up with different approaches in the past; however, I believe we still have a long distance to travel, given the complexity of development, in terms of measuring development results. The lessons from the COVID-19 pandemic also urge us to invest in big data, artificial intelligence, and other ways of gathering data.
6. What do you enjoy most about your work?
I enjoy my work in evaluation in general. The most interesting parts are interacting with stakeholders, both in the office and in the field at intervention sites. In addition, I have enjoyed the wide range of topics I have been involved in at IDEV.