Institutional function
At a glance
Strategic Planning, Monitoring and Evaluation is an advisory unit providing technical support to ensure that all of our work sustains meaningful impacts across the region.
Farid Ahmad
Head of Unit
Our Strategic Planning, Monitoring, and Evaluation (SPM&E) Unit focusses on performance and outcomes on the ground. Through participatory strategic planning, review, and evaluation processes, SPM&E provides guidance, advice, and technical support to integrate results-based planning, monitoring, and evaluation across all of our work so that together we can produce impactful results. SPM&E also ensures timely internal and external quality evaluations and impact assessments of institutional and programmatic achievements against set objectives and outcomes.
Our impact as a regional knowledge organisation is achieved through the utilisation and upscaling of knowledge generated with our vast network of working partners and through knowledge sharing. Fundamentally important to us are three vital ‘I’s’ — Innovation, Integration and Impact. Not everything can be known about how change occurs, yet it is important to have and share a vision for potential impact from the start. This is a fundamental tenet of innovation with regard to impact pathways.
Using a theory-based approach, the main impact pathways can be identified, making the theory of change explicit in terms of the different actors, the users of outputs, and the outcomes leading to clear development impacts. With this in mind, we work on the twin elements of impact pathways: validating our contribution to changes in the HKH region’s poverty and wellbeing, in physical and social vulnerabilities, and in ecosystem services; and increasing process understanding of how change takes place in complex biophysical and socio-cultural contexts. This approach provides us with improved evidence of impact within our strategic and results framework. The focus is to facilitate stakeholder accountability and to support learning and to inform and influence policy.
Each of our regional programmes and initiatives has developed a results-based log-frame with a common set of specific, measurable, attainable, relevant, and timely indicators, as well as reporting mechanisms which satisfy diverse development partners and assist implementation partners in measuring, documenting, and reporting. Indicators incorporate issues of gender equality, environmental sustainability, and other emerging cross-cutting issues such as governance, poverty, and economic analysis. Log-frame indicators are set to ensure measurement of both quantitative and qualitative outputs and outcomes. Each initiative develops a results-based monitoring and evaluation plan to serve as an important management instrument for implementing partners to track and report results. The initiative-level monitoring and evaluation plans consider detailed pathways to impact. To evaluate regional programmes and their achievements, information collected at the initiative level is compiled and analysed systematically. Scientific quality, product use, and the upscaling of ICIMOD-generated knowledge are given special attention in monitoring and evaluation.
Our mission and vision are focused on making a positive difference in the wellbeing of people and the environment through impact on poverty, people’s vulnerabilities, and ecosystem services. Impact is an essential aim for us, and we work together with partners, seeking to benefit the women, men, and children of the HKH region. Our institutional theory of change comprises a long chain of results for reaching all of our beneficiaries. Our impact pathway analysis approach guides the monitoring and evaluation of regional programmes and initiatives, the conduct of external evaluations, and the measurement of the impact of selected initiatives and projects.
We think about our reach at three levels:
We think of beneficiaries in two ways:
Guided by our vision, we work on the multiple elements of impact pathways — validating our contributions to changes in regional poverty, wellbeing, physical and social vulnerabilities, and ecosystem services, and increasing process understanding of how change takes place in complex biophysical and socio-cultural contexts.
Multiple pathways — science-policy-practice impact pathways, regional cooperation pathways, and capacity building and communication impact pathways — are critical to achieving the expected outcomes and impacts defined in the strategic results framework.
Our evaluation function serves both learning and accountability purposes for our stakeholders. The impact pathways developed for each regional programme and initiative are the main basis for evaluating our programmes. At the institutional level, internal and external mid-term and terminal evaluations and external quinquennial reviews are conducted based on terms of reference approved by our Board of Governors and the ICIMOD Support Group, to assess the programmes in general and our overall performance in terms of relevance, efficiency, effectiveness, impact, and sustainability. The principles, methods, and tools used for evaluation are of international standard. Internal evaluations and impact studies mainly focus on learning from our programmes, while external evaluations serve the multiple purposes of independence, transparency, accountability, and learning.
The credibility of evaluations is ensured through independent expert evaluators and a transparent evaluation process. Each evaluation considers both intended and unintended results, and both positive and negative impacts, including external factors affecting the programmes such as changes in the basic policy environment and in general economic and financial conditions. The evaluations reflect the different interests and needs of the many parties involved in development cooperation by also including perspectives of gender, governance, economic analysis, and poverty.
Programme evaluation is mandatory for all of our regional programmes and initiatives. All initiatives undergo mid-term and final evaluations in line with specific donor requirements.
Rigorous impact evaluations are given the highest priority. Various impact evaluation designs are developed for initiatives depending on the nature of each initiative’s mandate. Randomised evaluation methodologies are applied wherever appropriate given an initiative’s population coverage. Both experimental and non-experimental evaluation methodologies are also developed and applied.
The purpose of conducting monitoring and evaluation is both to learn and to provide a mechanism of accountability to various stakeholders. Our evaluation processes are transparent, with results widely available. Learning from programme monitoring and evaluation is communicated both internally and externally. Evaluation findings are reported in a timely fashion to donors and the Board of Governors, and opportunities to disseminate learning to other stakeholders are constantly reviewed, identified, and taken up.
Feedback mechanisms are built into the cycle of programme planning, implementation, monitoring, and evaluation. Review and planning sessions organised every four months help consolidate learning from each regional programme, and partners also regularly report results and learning.