Evaluation guidance for economic appraisal


This page provides guidance on the ex post evaluation of policies, programmes and projects, complementing the guidance on appraisal.

11.1 Introduction to evaluation

11.1.1 Evaluation plays an important role complementary to appraisal. Evaluation is an ex post activity which examines the outturn of a project, programme or policy, and is designed to ensure that the lessons learned are fed back into the decision-making process. This ensures government action is continually refined to reflect what best achieves objectives and promotes the public interest.

11.1.2 Departments should make arrangements to measure outturns and record them. Outturns should be compared with initial estimates and the results used to consider how to improve the quality of the assumptions in future appraisals, including, for example, the estimates of costs and benefits and the assumptions made about risks and appraisal optimism.

11.1.3 Evaluation is like appraisal conducted in retrospect. Thus the general principles and techniques of the NIGEAE apply as much to evaluation as to appraisal. This includes the principle of proportionate effort.

11.1.4 When any policy, programme or project is completed or has advanced to a pre-determined degree, it should undergo a comprehensive evaluation. Major or ongoing programmes, involving a series of smaller capital projects, must also be subject to ex post evaluations (also known as post project evaluations (PPEs) or post implementation reviews (PIRs)). Major expenditures or changes in resource use should be followed by full-scale post implementation reviews. Lesser decisions require a more modest evaluation effort.

11.1.5 Every appraisal of any substance should indicate how the proposals concerned will be evaluated after completion and how the results of the evaluation will be disseminated.

11.1.6 PPEs are an integral part of the process involved in completing a project and should not be seen as an additional complication to the appraisal process, but rather as an opportunity to learn valuable lessons and to avoid repeating mistakes. They also provide an important input into a gateway 5 review (see section 10.5).

11.1.7 An evaluation might address a project, programme or policy, particular aspects of one of these activities, or key issues affecting a number of activities. Where a programme may consist of a large number of small scale projects or activities, it may be appropriate as part of the evaluation of the programme to select a representative sample of these for detailed evaluation. Where a department proposes to adopt such an approach, it should present the details of its proposals for sampling in advance of the exercise to the relevant supply division for consideration. It may equally be appropriate to conduct more than one PPE for a particular project, for example, where it has been implemented in stages. In these cases the gateway 5 review may need to be repeated in line with the outcomes of each stage or phase.

Evaluation terms

Post project evaluation (PPE): General term for an ex post assessment of a project.

Post implementation review (PIR): General term for an ex post assessment of a policy, programme or project. Also known as Post Review.

Project evaluation review (PER): PRINCE2 term for an ex post assessment of management effectiveness, conducted at project closure, i.e. on implementation. Previously known as a Project Management Evaluation (PME).

Post project review (PPR): PRINCE2 term for an ex post assessment of benefits and costs obtained from an activity, conducted 6 to 12 months after implementation. This is the main substance of a PPE and hence the terms PPR and PPE are now often used interchangeably.

11.2 Conducting an evaluation

11.2.1 PERs should be conducted by the project manager at project closure. However, all other evaluations should be led by individuals who have not been involved in the management or implementation of the proposal under consideration. This is so that they are in a position to take an independent and unbiased view. It is desirable and should be possible to maintain this principle even for the evaluation of lower value projects, although it may not be practical for all de minimis expenditures (i.e. those below £1m). Basic principles for planning and managing evaluations are outlined at 1.6 above.

11.2.2 DoF will generally expect departments to conduct project evaluations in accordance with the post review section of the programme and project management and assurance web pages and according to PRINCE2 procedures. This requires evaluation to be conducted in two stages:

Project evaluation review (PER)

This reviews the effectiveness of the project management up to the point of project closure. Led by the project manager, it should result in an end project report and a lessons learned report as described in the PRINCE2 guidance.

Post project review (PPR)

A PPR, often referred to as a post project evaluation (PPE), should be planned before project closure and is the main substance of the ex post evaluation. It compares outturns against estimates for all relevant costs and benefits, and generally reviews success in achieving objectives. It should be conducted 6 to 12 months after project closure, led by an individual independent of the project board and project team.

Initiatives which have a long life should be evaluated regularly to ensure that they remain affordable and continue to provide value for money. This should include re-examining the information used and assumptions made in the original business case to ensure that these remain valid.

11.2.3 Departments should apply the guidance provided on the programme and project management and assurance webpages and in PRINCE2 on the conduct of PERs and PPRs. That guidance is not repeated here. However, the following paragraphs provide some general principles for ex post evaluation that are relevant to PPRs/PPEs and apply generally to the ex post evaluation of policies, programmes and projects.

11.2.4 Planning for evaluation must begin at the appraisal stage and should ensure that the appraisal reports contain the information needed for evaluation. This should include an outline plan, setting out the general boundaries of the proposed evaluation, as indicated at 2.9.12-17 above.

11.2.5 An evaluation should normally follow this sequence:

  • establish exactly what is to be evaluated and how the outturns can be measured
  • define the counterfactual i.e. estimate what would have happened if the intervention (e.g. the project, programme, policy or financial assistance) had not occurred
  • compare the outturn with the target outturn, and with the effects of the chosen counterfactual(s)
  • present the results and recommendations
  • disseminate and use the results and recommendations
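The comparison steps in this sequence can be sketched in a few lines of Python. This is an illustration only: all figures, item names and the simple percentage-variance measure are hypothetical, not drawn from any actual appraisal or from DoF guidance.

```python
# Hypothetical sketch of the outturn-vs-estimate comparison step in an
# evaluation. All figures are invented for illustration.

def variance_pct(outturn: float, estimate: float) -> float:
    """Percentage by which outturn exceeds (+) or undershoots (-) the estimate."""
    return 100.0 * (outturn - estimate) / estimate

# Appraisal estimates vs measured outturns (£000s), plus a counterfactual
# benefit (what was expected to occur without the intervention).
appraisal = {"capital_cost": 950, "annual_benefit": 240}
outturn = {"capital_cost": 1100, "annual_benefit": 210}
counterfactual_benefit = 60  # annual benefit assumed with no intervention

for item in appraisal:
    print(f"{item}: {variance_pct(outturn[item], appraisal[item]):+.1f}% vs estimate")

# Net additional annual benefit attributable to the intervention
additional_benefit = outturn["annual_benefit"] - counterfactual_benefit
print(f"annual benefit over counterfactual: £{additional_benefit}k")
```

In practice the streams of costs and benefits would be compared year by year, and against each counterfactual considered, rather than as single totals.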

11.2.6 The status quo or other baseline option used in the original appraisal should normally inform the counterfactual. However, viewing events from a post hoc position, evaluators may judge that the counterfactual would actually have been quite different from what was envisaged at the time of the appraisal, due to, for example, alternative states of the world and/or alternative management decisions. In such circumstances it may be helpful to consider other counterfactuals in addition to the original baseline option. The streams of costs and benefits that would have occurred in the counterfactual(s) should be estimated and set out so that the actual outturn costs and benefits can be compared with them.

11.2.7 The above sequence applies broadly as much to projects as to policies and programmes. However, proportionate effort should be applied: when dealing with individual projects, particularly smaller ones, there tends to be less emphasis upon detailed consideration of alternative counterfactual states of the world. In most cases effort should be concentrated upon evaluating the extent to which objectives have been achieved, whether assumptions have proved accurate (for example, by comparing outturns with target outturns), and what lessons can be learned.

11.2.8 In general, evaluation reports should summarise:

  • whether, and if so why, the outturn differed from that foreseen in the appraisal
  • how effective the activity was in achieving its objectives, and why
  • the cost-effectiveness of the activity
  • what the results imply for future management or policy decisions
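One element of such a report, the cost-effectiveness of the activity, can be illustrated as cost per unit of outcome achieved, compared with the figure implied by the original appraisal. The figures and the outcome measure below are hypothetical, chosen only to show the arithmetic.

```python
# Hypothetical cost-effectiveness comparison for an evaluation report.
# Figures are invented; a real evaluation would use measured outturns.

outturn_cost = 1_100_000       # total outturn cost in £
units_of_outcome = 2_750       # e.g. additional training places delivered

cost_per_unit = outturn_cost / units_of_outcome
print(f"outturn cost-effectiveness: £{cost_per_unit:,.2f} per outcome unit")

# The appraisal estimated £950,000 for 3,000 places
appraised_cost_per_unit = 950_000 / 3_000
print(f"appraised cost-effectiveness: £{appraised_cost_per_unit:,.2f} per outcome unit")
```

A worsening in cost per unit relative to the appraisal would prompt the evaluator to examine why, for example cost overruns, optimism in the original estimates, or shortfalls in delivery.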

11.2.9 The results obtained should generally lead to recommendations for the future. These might include, for example, changes in procurement practice, improvements to methods for estimating costs or benefits, changes to management procedures, or the continuation, modification or replacement of a project, programme or policy.

11.2.10 The results and recommendations should feed into future decision making. The methods used to achieve this may require senior management endorsement. Efforts should be made to disseminate the results widely within the organisation, and for this purpose it may be useful to employ summaries of the main points, and synthesis reports incorporating the results from a number of evaluations with common features.

11.2.11 A PPR template is provided on the programme and project management and assurance web pages. This indicates the minimum content of a PPE and is recommended for general use to document PPRs/PPEs.

11.3 DoF monitoring of post project evaluations (PPEs)

11.3.1 In January 1993 the Audit Review Group examined the control of capital projects and recommended that DoF Supply should rigorously pursue and monitor the completion of PPEs. Since then, supply divisions, with the assistance of DoF economists have regularly monitored departments' performance in completing PPEs.

11.3.2 DoF requires all projects to be subject to proper monitoring and control measures including PPEs for all projects, both above and below the de minimis level. Such measures can help ensure good VFM by identifying difficulties, preventing the repetition of mistakes, revealing positive points and generally learning lessons which may be of use in other projects and/or other departments. However, proportionate effort should always be applied and, as indicated at 11.1.7 above, where a programme consists of a large number of small scale projects or activities, it may instead be more appropriate to evaluate the programme by selecting a representative sample of them for detailed evaluation. Where a department proposes to adopt such an approach, it should present the details of its proposals for sampling in advance of the exercise to the relevant supply division for consideration.

11.3.3 DoF will expect both a PER and PPR to be prepared for all projects as described at 11.2.2 above, in accordance with the guidance at DoF's Successful Delivery (NI) website and PRINCE2. However, DoF recognises that the PER is primarily a management tool for the use of the SRO and project board, whereas the PPR is the main substance of the ex post evaluation or PPE. Accordingly, DoF Supply will only require the submission of PPRs and will generally use the term PPE to refer to the PPR.

11.3.4 In monitoring PPEs, DoF intends to give greater priority to the larger projects and areas where lessons learned can be of most value. To date, DoF has required departments to submit PPEs to Supply Division for all projects above delegated limits. From now on, DoF will request to see PPEs only for larger projects and those projects which DoF believes to have substantial read across to other projects. In future, the letter of approval from DoF will stipulate in each case whether or not a PPE report must be submitted to Supply.

11.3.5 This does not affect the continuing requirement to ensure that suitable arrangements for PPEs are made for all projects, but simply means that DoF will not require sight of all PPEs as a matter of routine. DoF approval of all projects above delegated limits will still be conditional upon satisfactory arrangements for PPEs in all cases. DoF will request an assurance from departments on an annual basis that all PPEs that are due to be carried out have actually been completed.

11.3.6 In the interest of ensuring good practice and VFM, DoF may occasionally require departments to produce a list of all projects approved in the previous year which were below delegated limits, but above de minimis levels, and to specify when these projects will be, or have been, completed and whether PPEs have been undertaken or when they are due to commence. DoF may ask to see a sample of these PPEs from time to time by way of quality assurance.
