Evaluator sought


Date: June 23, 2012

CALL FOR EXPRESSIONS OF INTEREST
INDEPENDENT PROGRESS REVIEW (IPR) OF GL-DFID
PROGRAMME PARTNERSHIP ARRANGEMENT

Gender Links seeks the services of an experienced evaluator to undertake an Independent Progress Review (IPR) of its three-year Programme Partnership Arrangement (PPA) with DFID, currently nearing its halfway mark. GL’s first-year PPA report can be accessed at https://www.genderlinks.org.za/page/sponsors (DFID PPA).

For more information, see the Evaluation Strategy that accompanies this call.

TIMEFRAMES AND SUBMISSION INFORMATION
The consultancy will cover a 30-day period from mid-August to the end of September. A format for the Expression of Interest (EOI), due by COB Friday 6 July 2012, is attached at Annex A. Shortlisted candidates should be prepared to make a presentation on Friday 13 July. Please take note of the accompanying documents required. Late applications and/or applications that do not use the attached format will not be considered. The EOI, references, CVs and samples of at least two previous evaluations should be sent to hr@genderlinks.org.za. Please direct all queries to Vivian Bakainganga at this email address or phone +27 (0) 11 622 2877. GL will only contact shortlisted candidates for interviews. GL reserves the right not to appoint anyone if suitable candidates are not identified.

BACKGROUND
Gender Links is a Southern African NGO founded in March 2001 with offices in Johannesburg, Mauritius (Francophone base) and Botswana (headquarters of the Southern African Development Community) as well as seven other countries. The vision of the organisation is a region in which women and men are able to participate equally in all aspects of public and private life in accordance with the provisions of the SADC Protocol on Gender and Development. GL has four programme areas: the SADC Protocol on Gender and Development; media; governance; and justice.

DFID provides significant funding to civil society organisations (CSOs) annually in line with its overall strategy to alleviate poverty and promote peace, stability and good governance. The Programme Partnership Arrangements (PPA) and Global Poverty Action Fund (GPAF) are two of DFID’s principal funding mechanisms and will provide £480 million to approximately 230 CSOs between 2011 and 2013. The current political climate and results-based agenda demand a rigorous assessment of the effectiveness of funds disbursed to ensure that they are managed to provide value for money.

One of the key tools in the performance assessment of each agency is the Independent Progress Review (IPR), which will be commissioned by the individual grantees.

Coffey International Development is the Evaluation Manager for the PPA and GPAF and is responsible for assessing the performance of individual grantees and of the funding mechanisms as a whole. The Evaluation Strategy, which accompanies this call, lays out the approach and methodology to the Evaluation and should be read in full in preparation for the IPR.

In terms of grantee performance, the Evaluation is concerned with:

  • the extent to which grantee organisations are performing against their objectives;
  • the extent to which grantee organisations and their achievements align with DFID’s theories of change (Annexes 2 and 3);
  • the impact of DFID’s funding, in terms of the additional benefits realised because of the funding and its attributable contribution to organisational effectiveness and to the results set out in grantees’ logframes. The impact assessment will consider the value for money organisations derive from DFID funding.

Grantees will be assessed according to standard criteria based on the OECD DAC criteria: relevance, efficiency, effectiveness and results. Further definition of these criteria is provided in Appendix 8.1.1. The criteria should be used to structure the IPR.

PURPOSE
The purpose of the IPR is threefold:

  • To assess the extent to which comments provided as part of the Annual Review Process(es) have been acted upon by grantees;
  • To verify, and supplement where necessary, grantees’ reporting through the Annual Review Process, the changing lives case study and, for PPA holders only, the additionality report; and
  • To independently evaluate the impact that DFID funding has had on organisations and projects and to assess the value for money of the funding. The IPR should answer the questions:

“What has happened because of DFID funding that wouldn’t have otherwise happened?” and “To what extent does the use of funding represent good value for money?”

ANNUAL REVIEW PROCESS ACTIONS
The IPR will have an important role in assessing the extent to which comments provided during the Annual Review Process (ARP) have been acted upon by grantees. Grantees are accountable to DFID for their use of the grants. The ARP is the process by which DFID holds grantees to account and ensures that they are working towards their stated objectives. The feedback provided during the ARP is DFID’s principal management tool, and as such, it is extremely important that this feedback be acted upon by grantees. The IPR will provide an independent assessment of the extent to which feedback has been acted upon.

VERIFICATION OF GRANTEES’ REPORTING
Grantees will be assessed by the Evaluation Manager according to the criteria defined in Annex B. The IPR will contribute to this assessment by:

  • Verifying grantee reporting related to the evaluation criteria; and
  • Providing an independent assessment of the organisation or project in relation to the evaluation criteria.
Some relevant assessment questions are detailed below – these questions are guidelines only. The Independent Evaluator should use their discretion in obtaining the information relevant to the assessment criteria.

Relevance

  • Representativeness: Do the planned interventions and outcomes (as expressed in the LogFrame) reflect the needs of the target population?
  • Targeting: To what degree do the planned interventions and outcomes reach the poorest and most marginalised? To what degree do these interventions maximise the impact on the poor and marginalised? Is the balance between these two targeting principles appropriate to the situation? (Note: in cases where the organisation or programme is not working directly with beneficiaries an assessment should be made of the implicit or explicit results chain that link the outcomes to changes for the beneficiary population)
  • Do the planned interventions, outcomes and targeting continue to be relevant to the needs of the target population? Does the targeting strategy continue to be appropriate?

Efficiency

  • To what extent are grantees able to evidence their cost effectiveness and as such demonstrate an understanding of their costs, the factors that drive them, the linkages to their performance and an ability to achieve efficiency gains?

Effectiveness

  • Distinctive offering: What is the distinctive offering of the organization and how does it complement or add value to DFID’s portfolio? Examples here might include:
    • The organization has distinctive expertise in a particular area of work;
    • The organization provides support and advice in this area and/or builds the capacity of DFID and others;
    • The project or programme fills a gap in DFID’s portfolio, complementing existing work in country programmes, or offering a channel to provide support where DFID has no presence;
    • Linking together different levels of operation; and
    • Networking and bringing together other actors.
  • Learning and innovation
  • How has organisational culture promoted or impeded learning and innovation?
  • Assess the extent to which the organization has learned from its work and has incorporated the lessons into improved performance. Examples and case studies should be provided. A distinction should be made between two types of learning. Firstly, learning that improves the organization’s capacity (for example improved capacity to monitor and evaluate). This learning is essentially organizational development for the grantee. Assess the degree to which this learning has demonstrably improved programming, in the intervention from which it arose and beyond. Secondly, learning that provides contextual knowledge, for example learning about the situation of a target population. This learning is largely specific to a particular context and will have little generalizability. Assess the degree to which this learning has demonstrably improved programming, in the intervention from which it arose.
  • Assess the extent to which the organization has produced generalizable learning that has been incorporated into its own practice and shared with others. Assess the degree to which this learning has demonstrably improved programming. Describe the strategy for communicating the learning and assess the extent to which others took up the learning in changed policy and practice. Examples and case studies should be provided. This type of learning overlaps with innovation.
  • Innovation is a special type of learning. It is distinguished from learning in general by novelty. Assess the extent to which grantees develop, test, and achieve the adoption by others of new knowledge, such as in techniques, approaches, and design of interventions. Describe the organization’s strategy for communicating the innovation and the extent to which it was taken up by others. If it has not yet been taken up by others, provide evidence indicating the potential for replication and scale-up. Two levels of innovation should be distinguished. Firstly, incremental innovation. This is innovation that applies or develops existing knowledge in new ways. For example, it might involve the application of an existing method to a new context, or it might involve elaboration and improvement of an existing method. Secondly, radical innovation. This is innovation that produces entirely new knowledge. For example, it might involve the development and testing of a new method for vulnerability mapping.
  • Monitoring and evaluation. Assess the organization’s monitoring and evaluation capacity, and in particular its ability to measure results (focusing on the quality of reported results and lessons learned rather than an assessment of M&E systems themselves). Indicate, with clear examples, the trajectory of change. Identify and assess any impact assessment studies and clarify what part they play in the organization’s monitoring and evaluation system.

Sustainability

  • Assess the extent to which an intervention or its results are likely to be sustainable. This should include an examination of the outcome of the uptake of learning and innovation by others. It should also include the nature of partnerships built with civil society, governmental and international organisations and their impact on sustainability. Elements of sustainability might include leveraging funds for continuation, securing policy adoption of an intervention or approach, or building capacity of southern actors to deliver a service or to monitor service delivery.

Results – please see also the note on Impact at Annex C

  • Performance against the LogFrame: To what extent is the organization achieving (or progressing towards) the intended outcomes?
  • Changes in lives. Assess the information about what changes these outcomes are making in people’s lives and how many people are affected.
  • Changes in civil society. To what extent are citizens doing things for themselves (for example community organizations managing and delivering services)? To what extent is civil society enabled to hold government to account?
  • Assess what conditions led to success and failure – external, internal, or a combination of interventions.
  • To what extent does DFID funding achieve additionality, i.e. enable CSOs to achieve things they would have otherwise not been able to achieve? Assessment of additionality will be covered during the impact assessment as described below.

IPR METHODS
The methods to be used in the IPR include:

Document review – this will include assessment of the following funding-related documents:

  • Organisations’ applications for funding
  • DFID’s business case for funding (PPA only)
  • Organisation’s MOU with DFID for funding
  • Updated versions of organisational (PPA) logframes / project logframes (GPAF)
  • Organisations’ annual review reports and comments provided by DFID
  • Changing Lives case studies submitted
  • Additionality reports (PPA only)
  • The review should also consider other relevant organisational documents such as:
    • Organisational mission statement and strategy
    • Organisational monitoring & evaluation strategy
    • Impact studies undertaken by the CSO
    • Financial information / information on resources spent
    • Statement of experience
    • Information on synergies / collaboration with DFID country programmes, other actors etc
    • Published material (e.g. to demonstrate sharing of learning with others)
    • Additional documents as required and appropriate (e.g. information to assess changes in lives / changes in civil society)

Interviews and workshops with key stakeholders:

  • Interviews and workshops with management teams to determine how funding is allocated and used
  • Beneficiary interviews
  • Interviews with staff at grantee organisation involved in strategic aspects / delivery of work
  • Interviews with partners looking at e.g. uptake of learning and innovation, partnerships built with civil society, governmental and international organizations, building capacity of southern actors etc
  • Additional interlocutors as appropriate
The consultant or consulting firm commissioned to carry out the IPR and the PPA/GPAF Manager are jointly responsible for choosing the methods that are most appropriate for the purpose of this evaluation. The consultant or consulting firm is also required to present a detailed statement of evaluation methods, including a description of data collection instruments and procedures, information sources and procedures for analysing the data.

Proposed work plan

The consultancy is envisaged to take place over 30 person days during a four-month period (with a final report-back session to the Board in March 2013). The model is based on GL’s five-year evaluation (available on our website under Monitoring and Evaluation). It is underpinned by a process approach, with constant interaction between the evaluators and the organisation at all levels. The proposed work plan links the evaluation to key events and programme work, so that these are assessed not only on paper and through interviews with beneficiaries, but also through observation of the work underway. While the methodology and work plan are open to discussion, it is important to agree a road map at the outset in order to make the best use of the time. Some key dates, such as the date of submission of the application, the presentation, the briefing meeting with the ED and the Gender and Media Summit, are not negotiable. The deadline for submission of the final report is also fixed.

ACTIVITY                                     DATES              NO. OF DAYS
Call out                                     22 June
Final submission date for proposals          6 July
Presentations                                13 July
Successful candidates announced              20 July
Key documents, meetings at HQ                Week of 6 August   3 days
Observation of Alliance meeting              13-15 August       3 days
Observation of management, M&E processes     16-17 August       2 days
Field visits                                 20-31 August       10 days
Report and feedback                          3-14 September     10 days
Final draft                                  21 September       2 days
Total                                                           30 days

Support available to the evaluators

  • GL has a Chief of Operations, a Monitoring and Evaluation Officer and a Knowledge and Learning Manager, who will provide briefing materials, analysis and support to the evaluators as required. This team will also assist in setting up meetings.
  • The GL administration will make all travel and logistic arrangements. The evaluator will be accommodated at the GL guest house while in Johannesburg, if not based there.

IPR CONSULTANT
The IPR shall be carried out by a suitably-qualified and experienced consultant or consulting firm (referred to as “IPR consultant” in the following). The consultant profile should include:

  • A specialist with a minimum of seven years’ experience in programme/project delivery in an international development context
  • Experience of results-based monitoring and evaluation
  • No conflict of interest with the ongoing activities of grantees.
  • A post-graduate degree or equivalent in monitoring and evaluation or social sciences.
  • Qualifications and/or experience in gender, media and governance.
  • Proven experience in evaluating organisations that operate regionally, and in working with strategic programmatic documents and logframes. Specific experience of conducting DFID evaluations would be an advantage.
  • Knowledge and experience of organisational systems and development, including financial systems, preferably including managing an NGO in a challenging funding environment.
  • The ability to think and write critically and constructively.
  • Excellent interpersonal and written skills; ability to use IT to maximum advantage in such an undertaking.
  • Fluency in English; knowledge of French and/or Portuguese would be an added advantage.
  • A sound reputation for independence and fairness; compliance with ethical standards for evaluators.

EOI SUBMISSION PROCESS
The format for the Expressions of Interest is attached at Annex A. The consultancy will cover a 30-day period from mid-August to the end of September. Shortlisted candidates should be prepared to make a presentation on Friday 13 July. Please take note of the accompanying documents required. Late applications and/or applications that do not use the attached format will not be considered. The EOI, references, CVs and samples of at least two previous evaluations should be sent to hr@genderlinks.org.za. Please direct all queries to Vivian Bakainganga at this email address or phone +27 (0) 11 622 2877. GL will only contact shortlisted candidates for interviews. GL reserves the right not to appoint anyone if suitable candidates are not identified.

Please take note of the following:

  • GL is seeking the services of ONE consultant. The number of consulting days may not exceed 30 and should be allocated along the lines of this call.
  • Travel and accommodation expenses will be paid directly by GL (economy class airfares, mid-class hotel); these should not be included in the budget.

QUALITY ASSURANCE
It is imperative that the evidence collected as part of the IPR be robust and reliable. Where high quality data is not available, the limitations of the data and any conclusions drawn from it must be clearly stated. The following table provides a framework for appraising the quality of evaluation evidence submitted to the Evaluation Manager. Grantees are responsible for quality assuring the IPR as it is undertaken. The Evaluation Manager will also undertake a quality assurance exercise and will provide comments in an Evaluation Manager Report.

The framework is organised by appraisal focus, with a key appraisal question and key quality indicators for each.

FINDINGS

1. How credible are the findings?
  • Findings/conclusions are supported by data/study evidence
  • Findings/conclusions ‘make sense’/have a coherent logic
  • Findings/conclusions are resonant with other knowledge and experience
  • Use of corroborating evidence to support or refine findings

2. How well does the evaluation/evidence address its original aims and purpose?
  • Clear statement of study aims and objectives (where relevant)
  • Findings clearly linked to the purposes of the study – and to the initiative or policy being studied
  • Summary of conclusions directed towards aims of study
  • Discussion of limitations of study in meeting aims

3. Scope for drawing wider inference – how well is this explained?
  • Discussion of what can be generalised to the wider beneficiary population
  • Detailed description of the contexts in which the study was conducted to allow applicability to other settings/contextual generalities to be assessed
  • Discussion of how hypotheses/theories of change may relate to wider theories of change at the policy level
  • Discussion of limitations on drawing wider inference

DESIGN

4. How defensible is the research design?
  • Discussion of how the overall evaluation/research strategy was designed to meet the aims of the study
  • Discussion of the rationale of the study design
  • Use of different features of design/data sources evident in findings presented
  • Discussion of limitations of research design and their implications for the study evidence

DATA COLLECTION

5. How well was the data collection carried out?
  • Discussion of who conducted data collection, the procedures/documents used for collection/reporting, and checks on origin/status
  • Description of fieldwork methods and how these may have influenced the data collected

ANALYSIS

6. How well has the approach to and formulation of the analysis been conveyed?
  • Description of the form of the original data
  • Clear rationale for the choice of data management method
  • Discussion, with examples, of how any constructed analytic concepts have been devised and applied

REPORTING

7. How clear are the links between data, interpretation and conclusions – i.e. how well can the route to any conclusions be seen?
  • Clear conceptual links between analytic commentary and presentations of original data
  • Discussion of how/why particular interpretation/significance is assigned to specific aspects of data
  • Discussion of how explanations/theories/conclusions were derived

NEUTRALITY

8. How clear are the assumptions/theoretical perspectives/values that have shaped the form and output of the evaluation/evidence submitted?
  • Discussion/evidence of the main assumptions/hypotheses/theoretical ideas on which the evaluation was based and how these affected the form, coverage or output of the evaluation
  • Discussion/evidence of the ideological perspectives/values of the evaluation team and their impact on the methodological or substantive content of the evaluation
  • Evidence of openness to new/alternative ways of viewing subject/theories/assumptions
  • Discussion of how error or bias may have arisen in design/data collection/analysis and how it was addressed, if at all
  • Reflections on the impact of the researcher on the evaluation process

AUDITABILITY

9. How adequately has the research process been documented?
  • Discussion of strengths and weaknesses
  • Documentation of, and reasons for, changes in coverage/data collection/analytic approach, and their implications
  • Reproduction of main study documents

ANNEX A
FORMAT FOR EXPRESSION OF INTEREST TO CONDUCT AN ORGANISATIONAL AND SPECIFIC PROGRAMME EVALUATION FOR GENDER LINKS

  • ADMINISTRATIVE INFORMATION

NAME OF CONSULTANT

 

KEY CONTACT/S

 

ADDRESS

 

E MAIL

 

PHONE/CELL PHONE

 

  • METHODOLOGY

With reference to the briefing document and the overall Evaluation Strategy provided, please submit a one- to two-page synopsis of how you would go about this assignment. Pay careful attention to data sources, sampling, data collection tools and procedures, as well as target groups, geographic areas, programme areas, and the outputs and outcomes to be assessed. Shortlisted candidates will be expected to make a presentation in Johannesburg on 13 July 2012.

  • CONSULTANT PROFILE

Please provide a brief narrative and your CV/s which should include:

  • Educational qualifications
  • Relevant work experience
  • Skills and competencies
  • SUPPORTING DOCUMENTATION

  • Written references from at least two previous clients.
  • Examples of at least two previous evaluations that you have undertaken (written outputs).
  • Any other relevant documentation that the consultant submitting the proposal may wish to include.

  • FEE

Please provide a daily rate for your fees. All other costs will be covered and/or reimbursed directly by GL as per our financial regulations and should not be factored into the EOI.

Undertaking
The information presented here is true and reflective of the capacity and ability of the consultant. If required, the consultant will be available to make a presentation on Friday 13 July 2012.

 

__________________________                                                     __________________________
Name                                                                                                                      Designation

 

__________________________                                                     __________________________
Signed                                                                                                                       Date

 

Annex B:   Assessment criteria

Criteria

Sub-criteria

Definition

Relevance

Representativeness

The degree to which the supported civil society organisations represent and respond to the needs and priorities of their constituencies (including, where relevant, the poorest and most marginalized). This will include an assessment of whether the planned interventions, as described in the LogFrame, continue to respond to these needs and priorities.

Targeting strategy

The extent to which the interventions target the poorest and most marginalized, and the extent to which they target in such a way as to achieve maximum benefit. These targeting strategies are likely to be mutually exclusive, and the assessment will reflect on the way in which the balance between them has been struck. This will include an assessment of whether the targeting continues to be relevant.
Grantees are required to describe the extent to which DFID funding impacts on their targeting strategy.

Effectiveness

Added value

Whether grantees offer a distinctive competence or otherwise complement and add value to DFID’s portfolio, and how this has been developed and/or demonstrated throughout the funding period. Examples here might include:
The organization has distinctive expertise in a particular area of work,
The organization provides support and advice to other organisations in this area and/or builds the capacity of DFID and others
The project or programme fills a gap in DFID’s portfolio, complementing existing work in country programmes, or offering a channel to provide support where DFID has no presence
Linking together different levels of operation
Networking and bringing together other actors
Grantees are required to describe to what extent DFID funding enables them to provide the added value described.

Learning

The extent to which grantees learn from their work, and integrate the learning into improved programming, as well as the extent to which others (civil society, governmental and international organisations) make use of this learning in altered policy and practice. Learning will be understood under the following headings:
Learning that improves the organization’s own capacity: This learning is essentially organizational development for the grantee. Grantees will need to show that this learning has demonstrably improved programming, in the intervention from which it arose and beyond.
Learning that provides contextual knowledge, essential for good programming: for example learning about the situation of a target population. This learning is largely specific to a particular context and will have little generalizability. Grantees will need to show that this learning has demonstrably improved programming in the intervention from which it arose.
Learning that can be shared with others: for example, improved ways of ensuring participation of marginalized groups. This is learning that can be generalized from the intervention context. Grantees will need to describe their strategy for communicating the learning and the extent to which others took up the learning. Grantees should also use this section to report on their interaction with the Learning Partnership and its four thematic sub-groups and how this interaction affects their capacity to learn and share learning. This type of learning overlaps with innovation.
Grantees are required to describe the extent to which DFID funding impacts on their capacity to learn and use learning in any of the categories above.

Innovation

The extent to which grantees develop, test, and achieve the adoption by others of new knowledge, such as in techniques, approaches, and design of interventions. Innovation is a special type of learning. It is distinguished from learning in general by novelty. Two levels of innovation will be distinguished:
Incremental innovation: This is innovation that applies or develops existing knowledge in new ways. For example, it might involve the application of an existing method to a new context, or it might involve elaboration and improvement of an existing method. Grantees will need to describe their strategy for communicating the innovation and the extent to which it was taken up by others. If it has not yet been taken up by others, grantees will need to provide evidence suggesting that it has the potential for replication and scale-up.
Radical innovation: This is innovation that produces entirely new knowledge. For example, it might involve the development and testing of a new method for vulnerability mapping. Grantees will need to describe their strategy for communicating the innovation and the extent to which it was taken up by others. If it has not yet been taken up by others, grantees will need to provide evidence suggesting that it has the potential for replication and scale-up.
Grantees are required to describe the extent to which DFID funding impacts on their capacity to innovate or share their innovations.

Partnership approach

The extent to which partnerships are made with others (civil society, the private sector, governmental and international organisations) that enhance the effectiveness and impact of interventions and encourage sustainability. Partnerships that build sustainability might include leveraging funds for continuation, securing policy adoption of an intervention or approach, building capacity of southern actors to deliver a service or to monitor service delivery.
Grantees are required to describe the extent to which DFID funding influences their partnership approach.

M&E

The extent to which grantees effectively monitor and evaluate their performance and assess their impact. Effective M&E and impact assessment includes demonstrable assessment and reporting of results at different levels, especially outputs and outcomes.
Grantees are required to describe the extent to which DFID funding influences their M&E systems and capacity to undertake impact assessments.

Efficiency

Cost effectiveness

In its simplest form cost effectiveness assesses the extent to which grantees have delivered units of outputs and outcomes at the ‘least cost’ in order to achieve the ‘desired’ results, typically through the formulation of unit costs. Whilst the assessment of a grantee’s cost effectiveness is most appropriate for outputs and outcomes of a quantitative nature, it is also an appropriate tool for capturing results that are harder to express in monetary units. This is particularly relevant to PPA fund holders and GPAF organisations where outputs and outcomes are presented in more qualitative terms. In these instances, grantees will be expected to demonstrate an acute understanding of the key drivers of the costs that are incurred – ‘cost drivers’ are the strategic and operational determinants of a specific resource or activity cost. These cost drivers reflect the interdependencies between the strategic decisions that organisations make concerning the ways in which resources are used and the operational requirements associated with the delivery of activities that are relevant to the needs and priorities of poor and marginalised people.

It is expected that grantees are able to evidence and demonstrate to a reasonable degree what costs have been incurred, why they have been incurred and the extent to which the costs incurred have been driven by the necessity to deliver the quality and quantity of results required. Essentially, this approach to the assessment of a grantee’s cost effectiveness seeks to understand and demonstrate the strength of the relationship between the ‘value’ and ‘money’ parts of the ‘value for money’ equation.
Whether cost effectiveness is being assessed at the input, output or outcome level, an underlying principle of the assessment is that grantees should be able to demonstrate that funding and resources are being allocated and managed in ways that deliver the greatest added value at the least cost. Consequently, the cost effectiveness assessment should draw on evaluation findings concerning the relevance, effectiveness and results achieved by individual grantees.
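The unit-cost formulation described above can be illustrated with a short sketch. All figures, programme names and the `unit_cost` helper below are hypothetical and purely for illustration:

```python
# Illustrative unit-cost calculation (all figures and names are hypothetical).
# Unit cost = total cost of an activity / number of output units delivered.

def unit_cost(total_cost: float, units_delivered: int) -> float:
    """Cost of delivering one unit of output."""
    return total_cost / units_delivered

# Two hypothetical training programmes delivering the same type of output.
programme_a = unit_cost(total_cost=12_000.0, units_delivered=300)  # 40.0 per trainee
programme_b = unit_cost(total_cost=9_000.0, units_delivered=200)   # 45.0 per trainee

# 'Least cost' comparison: programme A delivers the same output more cheaply,
# though cost drivers (location, target group, quality) must be understood
# before judging one programme more cost effective than the other.
print(programme_a, programme_b)  # → 40.0 45.0
```

In practice the comparison is only meaningful once the cost drivers behind each figure are understood, as the paragraph above stresses.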

Results

Performance against the logframe

The extent to which grantees have delivered outputs and achieved the changes indicated in their logframes. The first annual review will largely assess outputs, while subsequent reviews will increasingly be able to assess outcomes. For GPAF organisations this assessment will be at project level; for PPA organisations, the assessment will cover the whole organisation or the part of an organisation’s programme covered by the PPA. Note: grantees are required to demonstrate and evidence wherever possible the extent to which results are attributable to DFID funding.

Improving lives

An assessment of the extent and manner of changes in the lives of poor and marginalised people resulting from the outcomes achieved, and the extent to which these changes are likely to be sustained. It is recognised that PPA/GPAF agency reporting in this area is likely to be illustrative of changes, rather than comprehensive across the portfolio. See Annex 9. Note: grantees are required to demonstrate and evidence wherever possible the extent to which changes in people’s lives are attributable to DFID funding.

Changes in civil society

The extent to which citizens are able to do things for themselves, for example community organisations managing and delivering a particular service, and the extent to which civil society organisations are able to hold governments and other actors (such as the private sector and international bodies) to account. Note: grantees are expected to demonstrate and evidence wherever possible the extent to which changes in civil society are attributable to DFID funding.

 

ANNEX C
Impact assessment of DFID funding
This section sets out the proposed approach to the assessment of the additional impacts achieved by grantees as a result of DFID’s funding. It begins by explaining the fundamental principles that underpin the assessment of impact and the types of techniques typically used to undertake quantitative analysis. The purpose here is not to prescribe that all grantees should apply these, and only these, quantitative techniques. The intention is to provide an overview of a robust approach that should be considered where it is appropriate, cost-effective and proportionate to do so. The section also stresses the importance of a mixed-methods approach to impact assessment that uses qualitative research to explain ‘why’ and ‘how’ the programme is effecting the type and scale of changes that are quantitatively assessed.

The section concludes by providing guidance on contribution analysis, which adopts a theory of change approach to evaluation. This approach is informed by a wide range of evidence sources and perspectives brought together to produce a ‘plausible’ assessment of the ‘contribution’ of grantees to higher level outcomes and impacts. This Evaluation Strategy is first and foremost concerned with ensuring that grantees are able to produce the most robust evidence possible by rigorously using evaluation approaches and research tools that best suit the variety of ways in which DFID funding has been used across both the PPA and GPAF portfolios.

Impact assessment is defined here as the ‘net’ impact that an organisation or project intervention has, in terms of the additional benefits realised that are directly attributable to the activities delivered by the organisation or project intervention. The additionality of the funding is of key importance to DFID, as it is crucial to understanding the net impact of its interventions. Additionality is defined as follows: “an impact arising from an intervention is additional if it would not have occurred in the absence of the intervention”. Typically, this requires a comparison between what actually happened (the factual) and what would have happened in the absence of the intervention, otherwise called the counterfactual. The fundamental problem that all impact assessment faces is that we cannot observe what would have happened to those affected by the intervention had it not taken place. Impact evaluation therefore requires a rigorous approach to establishing the counterfactual. The most robust way to do this is to compare the outcomes achieved by those who benefited from an intervention with the outcomes achieved by a group of people who are similar in every way to the beneficiaries except that they were not subject to the intervention being evaluated, i.e. by using a comparison or control group. This approach to the assessment of impact and additionality typically involves experimental or quasi-experimental approaches and methodologies.
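The comparison-group logic can be sketched in a few lines. The outcome scores below are invented for illustration only; real designs must ensure the comparison group genuinely resembles the beneficiaries:

```python
# Illustrative sketch of estimating 'net' (additional) impact using a
# comparison group. All data are hypothetical.
# Net impact ≈ mean outcome of beneficiaries − mean outcome of a similar
# group not reached by the intervention (the counterfactual).

def mean(values):
    return sum(values) / len(values)

beneficiaries = [62, 70, 68, 74, 66]   # outcome scores after the intervention
comparison    = [58, 61, 60, 63, 58]   # similar group, no intervention

net_impact = mean(beneficiaries) - mean(comparison)
print(net_impact)  # → 8.0, the difference attributable to the intervention
```

The subtraction is the easy part; the rigour lies in constructing a comparison group similar enough that the difference can plausibly be attributed to the intervention.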

Grantees should note that, depending on the level of expenditure and the ‘evaluability’ of the type of investment or intervention, the expectation is that the additionality and impacts of DFID’s funding should be quantitatively assessed as far as possible. This approach does not exclude qualitative methodologies, which are required to ensure that any evaluation of impact is firmly grounded in the context of a grantee’s activities. Crucially, a mixed-method approach provides a qualitative explanation of ‘why’ and ‘how’ the programme is effecting the type and scale of change assessed through quantitative research.

Acknowledging the impact attribution problem
The higher-level objective of PPA and GPAF funding is to alleviate poverty by strengthening civil society and, in doing so, to contribute to the achievement of the Millennium Development Goals and good governance. These goals sit at the highest level, and DFID’s investment through the PPA and GPAF towards achieving them is relatively small in the context of the global body of interventions aimed at alleviating poverty. Moreover, a large number of important factors external to DFID’s and the grantees’ involvement, which vary according to circumstance, will influence the results achieved. For these reasons, experimental or quasi-experimental approaches to credibly assessing and quantifying the effects attributable to the funding may be difficult to achieve. Under these conditions it is necessary to consider alternative methods for assessing the funds’ ‘contribution’ to change that do not rely solely on scientifically quantifying ‘attributable’ change.

Contribution analysis
Whatever the evaluation design or research methodologies used to evaluate the impact of DFID’s funding it is essential that a rigorous assessment of a grantee’s additionality is undertaken. At the very least this should result in a ‘plausible’ account of the difference that DFID’s funding has made to the effectiveness and performance of grantees. Contribution analysis is an approach that can help grantees overcome the attribution problem by systematically constructing an evidence-based and plausible assessment of changes that would not have happened without the support of DFID’s funding.

Contribution analysis involves assessing the ‘contribution’ that the funding is making or has made to results through a ‘theory of change’ based approach. Essentially, this requires an evidence-based approach to verifying the plausibility of the theories of change that underpin the rationale for the different ways in which grantees have used DFID funding to either:

  • indirectly ‘enhance’ the delivery of results (in the logframe), as in the majority of cases for PPA grantees; or
  • directly deliver results (in the logframe), as in the majority of cases for GPAF grantees.
Contribution analysis entails a more pragmatic, inclusive and iterative evaluation process than more experimental methods, which for some grantees may not be feasible or practical given the variety of ways in which DFID funding is being used.
Contribution analysis involves the following six steps, which a grantee would typically follow:

Step 1: Develop a theory of change and the risks to it
Establish and agree with stakeholders a ‘plausible’ theory of change that accurately reflects the ways in which DFID funding has been used to deliver, or enhance the delivery of, planned results. Focus specifically on the type and nature of the cause and effect relationships at each stage in the impact logic of the theory of change. The three ‘circles of influence’ (Montague et al., 2002) are useful in this respect:

  • direct control – where DFID funding has fairly direct control of the results, typically at the output level;
  • direct influence – where DFID funding has a direct influence on the expected results, such as the reactions and behaviours of its target groups through direct contact, typically intermediate outcomes; and
  • indirect influence – where DFID funding can exert significantly less influence on the expected results due to its lack of direct contact with those involved and/or the significant influence of other factors.

Grantees should identify and articulate the assumptions that have been made in order to establish a set of cause and effect linkages between DFID funding, how it has been used and how this relates to the delivery of activities and ultimately the achievement of results set out in the logframe. In parallel, grantees should identify external influencing factors that could affect these linkages.

In the case of PPA grantees where DFID funding has been used in an unrestricted/indirect way, a theory of change will need to be developed that specifically focuses on how DFID funding has been used, e.g. to improve organisational effectiveness through strengthening human resource management in order to ultimately enhance the delivery of results. In these instances this is a distinctly different theory of change or impact logic from that presented in the grantee’s logframe.

Step 2: Set out the attribution problem to be addressed
Grantees should determine the specific cause and effect questions that need to be assessed through the evaluation process, and assess the nature and extent of the attribution problem by asking:

  • What do we know about the nature and extent of the contribution expected?
  • What would show that DFID funding has made an important contribution?
  • What would show that DFID funding has ‘made a difference’?
  • What would indicate that DFID funding has had the effects envisaged in the theory of change underpinning the way in which the grant has been used?
  • How difficult is it to evidence these effects and why?

Step 3: Gather existing evidence on the theory of change
Grantees should gather evidence through routine monitoring and management data as far as possible. Whatever the nature of the theory of change underpinning how DFID funding has been used, it is advisable to establish a baseline position against which to benchmark progress. For example, if DFID funding has been used to enhance a grantee’s human resource management, a simple survey could be undertaken of a sample of project offices to establish the current state of human resource management from the perspective of those who benefit from it. Further questions could elaborate on the extent to which this enhances the capacity of project offices to deliver their activities and ultimately achieve their results.
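Benchmarking against such a baseline amounts to a simple before-and-after comparison. The office names and survey ratings below are invented, purely to show the shape of the calculation:

```python
# Hypothetical sketch of benchmarking progress against a baseline, as in the
# HR-management survey example. Office names and scores are invented.

baseline  = {"office_a": 2.1, "office_b": 3.0, "office_c": 2.5}  # mean ratings (1-5 scale)
follow_up = {"office_a": 3.4, "office_b": 3.2, "office_c": 3.1}  # same survey, one year on

# Change per office since the baseline was established.
changes = {office: round(follow_up[office] - baseline[office], 1)
           for office in baseline}
print(changes)  # → {'office_a': 1.3, 'office_b': 0.2, 'office_c': 0.6}
```

A repeated survey like this only benchmarks progress; linking the movement to DFID funding still requires the contribution narrative built up in Steps 4 to 6.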

Step 4: Assemble and assess the contribution narrative and challenges to it
From the outset it is important to validate whether the theory of change, and the assumptions it depends on, hold true. This validation process should be undertaken systematically and regularly in order to iteratively build a convincing, plausible and evidence-based narrative of the effects DFID funding is having in direct and/or indirect ways. It is also essential that this process involves relevant external stakeholders who are in a position to verify that the original theory of change and subsequently observed changes are plausible and credible.

Step 5: Gather additional evidence
This Evaluation Strategy provides guidance, tools and templates for gathering different types of evidence that could be required to supplement monitoring and management data. The type of evidence gathered will largely depend on the ways in which DFID funding is being used. Ideally the evidence base would consist of a combination of quantitative and qualitative data focused on testing and proving a plausible theory of change that is specific to DFID funding.

Step 6: Revise and strengthen the contribution narrative
This is a continuous process of testing and revising the theory of change that underpins the central argument that DFID’s funding is making a difference. In this way contribution analysis has a formative effect in that it enables grantees to quickly understand whether or not DFID funding is being used in an optimal way to deliver the changes envisaged at the outset.

There are several analytical approaches, in addition to contribution analysis, that could be used to assess the additionality of DFID funding. The key reason for presenting this approach, however, is to demonstrate that this Evaluation Strategy is fully committed to gathering the best possible evidence concerning the impact and value for money attributable to DFID funding, however great the challenge. Even where a scientific approach to impact evaluation is not possible or appropriate, the approach to assessing the additionality of DFID funding should at the very least be as plausible and rigorous as possible, including evaluation designs and activities that entail predominantly qualitative research methodologies.

While responsibility for assessing and reporting on the additionality of DFID funding rests with grantees, the independent evaluators who undertake the independent progress reviews (IPRs) will be involved in the impact assessment. Where feasible, grantees should involve them as early as possible so that they can provide technical support in designing the assessment or carrying out the steps described above.

Notes

This is reflected in grantees’ initial applications and their logframes. For PPA holders it is also reflected in the business cases prepared by DFID to justify funding.

http://www.oecd.org/document/22/0,2340,en_2649_34435_2086550_1_1_1_1,00.html

GPAF holders will receive comments from the GPAF Fund Manager, and PPA holders will receive comments from DFID. The Evaluation Manager will be involved to some extent in preparing the comments and recommendations for both funds.

HMT Green Book

Evaluability is defined in this context as the extent to which grantees’ activities can be measured to produce reliable evidence-based judgements of performance, impact and value for money.

Please see the Key Evaluation Terms document and the NONIE paper on impact evaluation for more guidance.

Mayne, J. (2008) ‘ILAC Brief 16: Contribution analysis – an approach to exploring cause and effect’, ILAC.

See section 3.2 for further details on IPRs.

 


Download : PPA Evaluation strategy
Download : Expression of interest PPA Review
