Monitoring and Evaluation

IBTCI is a leader in the design and implementation of monitoring and evaluation programs and systems. We undertake regular performance monitoring and reporting of programs, periodic review of specific activities under different programs and projects, and overall evaluation of their impact. IBTCI has monitored and evaluated programs in a wide range of areas, including agriculture, banking/finance, economic growth, education, environment, gender, health, infrastructure, and private sector development. In this work, IBTCI staff use a range of tools, including needs assessments, evaluability assessments, institutional analyses, process evaluations, economic and cost analyses, cost-effectiveness and cost-benefit analyses, and impact assessments.

We use a number of quantitative and qualitative research methods, including research design, sampling, weighting and estimation, experimental designs, longitudinal analysis, rapid appraisal, and ethnography. In particular, we review and evaluate activities against the following criteria: impact, effectiveness, efficiency, replicability, scalability, and sustainability. Evaluations carried out by IBTCI address cross-cutting issues such as gender, conflict management, private and voluntary partnerships, transition initiatives, and information technology.
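The weighting and estimation methods mentioned above can be illustrated with a minimal sketch: survey observations are weighted by the inverse of their sampling probability to produce a design-based estimate of a population total (a Horvitz–Thompson style estimator). The values and sampling rates below are illustrative assumptions, not data from any IBTCI project.

```python
# Minimal sketch of survey weighting: each observed value is weighted by the
# inverse of its sampling probability, yielding a design-based estimate of the
# population total. All numbers here are illustrative.

def weighted_total(values, sampling_probs):
    """Horvitz-Thompson style estimate of a population total."""
    return sum(v / p for v, p in zip(values, sampling_probs))

# Two strata sampled at different rates: 1-in-10 and 1-in-2.
values = [4, 6, 5]          # observed values from sampled units
probs = [0.1, 0.1, 0.5]     # probability each unit was sampled
estimate = weighted_total(values, probs)
```

Weighting in this way corrects for unequal selection probabilities across strata, so that over- or under-sampled groups do not bias the estimate.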

Capabilities

In all engagements of this nature, IBTCI tailors its monitoring and evaluation methodologies to the particular needs of missions, programs, and projects. Monitoring and evaluation must be thorough and rigorous, yet responsive to each client's specific requirements. IBTCI has also learned that the depth and accuracy of monitoring activities closely determine the quality of evaluations, and therefore promotes the use of web-based systems to improve monitoring and management reporting functions. IBTCI has successfully completed numerous monitoring and evaluation projects and has received highly favorable written commendations for its work.

  • Monitoring: IBTCI views monitoring as a continuing function that provides the client with regular feedback and early indications of progress, or the lack thereof, toward intended results. Actual performance is tracked against what was planned or expected, according to pre-determined standards. IBTCI staff collect and analyze data on implementation processes, strategies, and results, and recommend corrective measures.
  • Evaluation: IBTCI views evaluation as a time-bound exercise to assess systematically and objectively the relevance, performance, and success of ongoing and completed programs and projects. Evaluation is undertaken selectively to answer specific questions that guide decision-makers and to establish whether the underlying theories and assumptions used in program design were valid, what worked, what did not, and why. The aim is to determine relevance, efficiency, effectiveness, impact, and sustainability. IBTCI also views evaluations as a vehicle for extracting cross-cutting lessons from operating-unit experiences and for determining whether the strategic results framework needs modification.
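The monitoring function described above, tracking actual performance against pre-determined targets and flagging shortfalls for corrective action, can be sketched in a few lines. The indicator names, target values, and the 90% achievement threshold are illustrative assumptions, not drawn from any IBTCI engagement.

```python
# Minimal sketch of performance monitoring: compare actual results to planned
# targets and flag indicators falling short of an achievement threshold.
# Indicator names, targets, and the 0.9 threshold are illustrative.

def track_indicators(planned, actual, threshold=0.9):
    """Return each indicator's achievement rate and whether it is on track."""
    report = {}
    for name, target in planned.items():
        value = actual.get(name, 0)
        rate = value / target if target else 0.0
        report[name] = {"rate": round(rate, 2), "on_track": rate >= threshold}
    return report

planned = {"farmers_trained": 500, "loans_disbursed": 200}
actual = {"farmers_trained": 510, "loans_disbursed": 150}
report = track_indicators(planned, actual)
```

A report of this shape gives managers the early signal monitoring is meant to provide: the off-track indicators are the ones that prompt corrective measures before the next evaluation cycle.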

The monitoring and evaluation methodologies used by IBTCI staff allow for a rigorous focus on results, learning, and the practical application of knowledge gained. IBTCI staff often plan monitoring and evaluation in parallel, since evaluation is an important monitoring tool and monitoring is an important input to evaluation. Meaningful information about outcomes and outputs is captured regardless of the unit of analysis used in a monitoring and evaluation plan. An evaluation plan covering the outcomes of the period is drawn up as a key element of performance assessment. Most importantly, IBTCI staff view planning not primarily as scheduling but as determining the best approach for the needs and nature of what is being monitored and evaluated.