
Measuring outcome and impact

Planning, monitoring and evaluation (PME) at War Child takes place at project, programme and organisational level. In 2013 we took significant steps in implementing our PME strategy.

War Child takes seriously its responsibility to ensure quality project design, implementation, and monitoring and evaluation, in line with our own quality standards and those developed internationally. We are also committed to identifying and communicating our contribution to the success or failure of projects, and to taking the steps required to follow up on lessons learned. Where projects are implemented by partner organisations, War Child’s responsibility extends to selecting capable partners in line with our partnership policy and minimum standards, and to assessing the quality of their project design, implementation, monitoring and reporting. Where needed, we support partners to improve project design before implementation begins, and we support them during project implementation, monitoring and reporting. War Child remains responsible for monitoring and evaluating partner projects, and for identifying and communicating the extent to which we have contributed to the success or failure of projects implemented by partner organisations.


War Child’s global strategy and Programming Guidelines provide the framework for planning at country level. Planning is led by the development of three- to five-year country strategies, which are then operationalised through annual plans. These plans indicate how projects will work within a country’s strategic framework to address the education, protection and psychosocial development rights of children affected by armed conflict. All projects are developed using a logical framework, outlining a clear purpose, planned activities, results, and measurable indicators. Monitoring and evaluation plans are also developed for each project, specifying how and when baseline and monitoring data will be collected and external evaluations will be commissioned. Lessons learnt and best practices from previous projects inform all new plans.


War Child staff and partner organisations monitor the implementation of project activities and record in activity reports the number of children and adults who participate. These output numbers are entered into the central project administration system and compared with the outputs originally planned for the project. If needed, targets are revised or extra activities are planned in order to reach them. We also increasingly monitor outcomes: the changes in the lives of participants realised through our projects. Outcome monitoring starts with collecting baseline data on the ‘before’ situation of the children and adults a project targets. Data collected at the mid-point or end of the project can then be compared with this baseline, allowing us to measure the change that participants have experienced. Children contribute to monitoring through various creative and participatory evidence collection tools integrated into War Child’s interventions. Each project’s results contribute to one or more of War Child’s strategic outcomes in education, child protection and psychosocial support.

At both country and organisational levels, financial reporting takes place every three months and narrative reporting every six months. Aggregated project outputs are reported every six months and project outcomes every twelve months. Major deviations are flagged and addressed through project revisions, determined in collaboration with line management.


War Child commissions external evaluations, which assess a project’s relevance, effectiveness, efficiency, impact and sustainability, in addition to its adherence to quality standards. We reserve 3 to 5 percent of every project budget for evaluation. Key findings and lessons learned from evaluations are shared in annual reports, and an online portal is being developed to publish reports, summaries and War Child’s commitments to follow up on lessons learned.

Evaluations are usually conducted by external consultants with expertise in child-centred programming and/or PME, using evidence collection tools that enable the meaningful participation of children in the process. Evaluations generally take a mixed-methods approach that includes a desk study, questionnaires, interviews and focus group discussions with the children and adults who have participated in the project, as well as other key informants such as community leaders. War Child recognises that by improving the quality of baseline data collection and outcome monitoring in our projects, we can also improve the quality of evaluations.
