CALL FOR EXPRESSIONS OF INTEREST
INDEPENDENT PROGRESS REVIEW (IPR) OF GL-DFID
PROGRAMME PARTNERSHIP ARRANGEMENT
Gender Links seeks the services of an experienced evaluator to undertake an Independent Progress Review (IPR) of its three-year Programme Partnership Arrangement (PPA) with DFID, currently nearing its halfway mark. GL’s first year PPA report can be accessed at https://www.genderlinks.org.za/page/sponsors (DFID PPA).
For more information on the Evaluation Strategy click here
TIMEFRAMES AND SUBMISSION INFORMATION
The consultancy will cover a 30-day period from mid-August to the end of September. A format for the Expression of Interest (EOI), due by COB Friday 6 July 2012, is attached at Annex A. Shortlisted candidates should be prepared to make a presentation on Friday 13 July. Please take note of the accompanying documents required. Late applications and/or applications that do not use the attached format will not be considered. The EOI, references, CVs and samples of at least two previous evaluations should be sent to hr@genderlinks.org.za. Please direct all queries to Vivian Bakainganga at this email address or phone +27 (0)11 622 2877. GL will only contact shortlisted candidates for interviews. GL reserves the right not to appoint anyone if suitable candidates are not identified.
BACKGROUND
Gender Links is a Southern African NGO founded in March 2001, with offices in Johannesburg, Mauritius (Francophone base) and Botswana (headquarters of the Southern African Development Community), as well as in seven other countries. The vision of the organisation is a region in which women and men are able to participate equally in all aspects of public and private life, in accordance with the provisions of the SADC Protocol on Gender and Development. GL has four programme areas: the SADC Protocol on Gender and Development; media; governance; and justice.
DFID provides significant funding to civil society organisations (CSOs) annually in line with its overall strategy to alleviate poverty and promote peace, stability and good governance. The Programme Partnership Arrangements (PPA) and Global Poverty Action Fund (GPAF) are two of DFID’s principal funding mechanisms and will provide £480 million to approximately 230 CSOs between 2011 and 2013. The current political climate and results-based agenda demand a rigorous assessment of the effectiveness of funds disbursed to ensure that they are managed to provide value for money.
One of the key tools in the performance assessments of each agency is the Independent Progress Review (IPR) which will be commissioned by the individual grantees.
Coffey International Development is the Evaluation Manager for the PPA and GPAF and is responsible for assessing the performance of individual grantees and of the funding mechanisms as a whole. The Evaluation Strategy, which accompanies this call, lays out the approach and methodology to the Evaluation and should be read in full in preparation for the IPR.
In terms of grantee performance, the Evaluation is concerned with:
Grantees will be assessed according to standard criteria based on the OECD DAC criteria: relevance, efficiency, effectiveness and results. Further definition of these criteria is provided in appendix 8.1.1. The criteria should be used to structure the IPR.
PURPOSE
The purpose of the IPR is threefold:
What has happened because of DFID funding that would not otherwise have happened? And to what extent does the use of funding represent good value for money?
ANNUAL REVIEW PROCESS ACTIONS
The IPR will have an important role in assessing the extent to which comments provided during the Annual Review Process (ARP) have been acted upon by grantees. Grantees are accountable to DFID for their use of the grants. The ARP is the process by which DFID holds grantees to account and ensures that they are working towards their stated objectives. The feedback provided during the ARP is DFID’s principal management tool and, as such, it is extremely important that this feedback be acted upon by grantees. The IPR will provide an independent assessment of the extent to which feedback has been acted upon.
VERIFICATION OF GRANTEE REPORTING
Grantees will be assessed by the Evaluation Manager according to the criteria defined in Annex B. The IPR will contribute to this assessment by:
Relevance
Efficiency
Effectiveness
Sustainability
Results – please also see the note on Impact at Annex C
IPR METHODS
The methods to be used in the IPR include:
Document review – this will include the assessment of the funding-related documents:
Interviews and workshops with key stakeholders:
Proposed work plan
The consultancy is envisaged to take place over 30 person days during a four-month period (with a final report-back session to the Board in March 2011). The model is based on GL’s five-year evaluation (available on our website under Monitoring and Evaluation). It is underpinned by a process approach, with constant interaction between the evaluators and the organisation at all levels. The proposed work plan links the evaluation to key events and programme work, so that these are assessed not only on paper and through interviews with beneficiaries, but also through observation of the work underway. While the methodology and work plan are open to discussion, it is important to agree a road map at the outset in order to make the best use of the time. Some key dates, such as the date of submission of the application, the presentation, the briefing meeting with the ED, and the Gender and Media Summit, are not negotiable. The deadline for submission of the final report is also fixed.
| ACTIVITY | DATES | No of days |
| Call out | 22 June | |
| Final submission date for proposals | 6 July | |
| Presentations | 13 July | |
| Successful candidates announced | 20 July | |
| Key documents, meetings at HQ | Week of 6 August | 3 days |
| Observation of Alliance meeting | 13-15 August | 3 days |
| Observation of management and M&E processes | 16-17 August | 2 days |
| Field visits | 20-31 August | 10 days |
| Report and feedback | 3-14 September | 10 days |
| Final draft | 21 September | 2 days |
| Total | | 30 days |
Support available to the evaluators
IPR CONSULTANT
The IPR shall be carried out by a suitably-qualified and experienced consultant or consulting firm (referred to as “IPR consultant” in the following). The consultant profile should include:
EOI SUBMISSION PROCESS
The format for the Expression of Interest is attached at Annex A. Full submission requirements, deadlines and contact details are set out under Timeframes and Submission Information above.
Please take note of the following:
QUALITY ASSURANCE
It is imperative that the evidence collected as part of the IPR be robust and reliable. Where high quality data is not available, the limitations of the data and any conclusions drawn from it must be clearly stated. The following table provides a framework for appraising the quality of evaluation evidence submitted to the Evaluation Manager. Grantees are responsible for quality assuring the IPR as it is undertaken. The Evaluation Manager will also undertake a quality assurance exercise and will provide comments in an Evaluation Manager Report.
| Appraisal focus | Key appraisal questions | Key quality indicators |
| FINDINGS | 1. How credible are the findings? | Findings/conclusions are supported by data/study evidence |
| FINDINGS | 2. How well does the evaluation/evidence address its original aims and purpose? | Clear statement of study aims and objectives (where relevant) |
| FINDINGS | 3. Scope for drawing wider inference – how well is this explained? | Discussion of what can be generalised to the wider beneficiary population |
| DESIGN | 4. How defensible is the research design? | Discussion of how the overall evaluation/research strategy was designed to meet the aims of the study |
| DESIGN | 5. How well was the data collection carried out? | Description of fieldwork methods and how these may have influenced the data collected |
| ANALYSIS | 6. How well has the approach to and formulation of the analysis been conveyed? | Description of the form of the original data |
| REPORTING | 7. How clear are the links between data, interpretation and conclusions – i.e. how well can the route to any conclusions be seen? | Clear conceptual links between analytic commentary and presentations of original data |
| NEUTRALITY | 8. How clear are the assumptions/theoretical perspectives/values that have shaped the form and output of the evaluation/evidence submitted? | Discussion/evidence of the main assumptions/hypotheses/theoretical ideas on which the evaluation was based and how these affected the form, coverage, or output of the evaluation |
| AUDITABILITY | 9. How adequately has the research process been documented? | Discussion of strengths and weaknesses |
ANNEX A
FORMAT FOR EXPRESSION OF INTEREST TO CONDUCT AN ORGANISATIONAL AND SPECIFIC PROGRAMME EVALUATION FOR GENDER LINKS
| NAME OF CONSULTANT | |
| KEY CONTACT/S | |
| ADDRESS | |
| EMAIL | |
| PHONE/CELL PHONE | |
With reference to the briefing document and the overall Evaluation Strategy provided, please submit a one- to two-page synopsis of how you would go about this assignment. Pay careful attention to data sources, sampling, data collection tools and procedures, as well as target groups, geographic areas, programme areas, and the outputs and outcomes to be assessed. Shortlisted candidates will be expected to make a presentation in Johannesburg on 13 July 2012.
Please provide a brief narrative and your CV/s which should include:
Please provide a daily rate for your fees. All other costs will be covered and/or reimbursed directly by GL as per our financial regulations, and should not be factored into the EOI.
Undertaking
The information presented here is true and reflective of the capacity and ability of the consultant. If required, the consultant will be available to make a presentation on Friday 13 July 2012.
__________________________ __________________________
Name Designation
__________________________ __________________________
Signed Date
Annex B: Assessment criteria
| Criteria | Sub-criteria | Definition |
| Relevance | Representativeness | The degree to which the supported civil society organisations represent and respond to the needs and priorities of their constituencies (including, where relevant, the poorest and most marginalised). This will include an assessment of whether the planned interventions, as described in the LogFrame, continue to respond to these needs and priorities. |
| Relevance | Targeting strategy | The extent to which the interventions target the poorest and most marginalised, and the extent to which they target in such a way as to achieve maximum benefit. These targeting strategies are likely to be mutually exclusive, and the assessment will reflect on the way in which the balance between them has been struck. This will include an assessment of whether the targeting continues to be relevant. |
| Effectiveness | Added value | Whether grantees offer a distinctive competence or otherwise complement and add value to DFID’s portfolio, and how this has been developed and/or demonstrated throughout the funding period. Examples here might include: |
| Effectiveness | Learning | The extent to which grantees learn from their work and integrate the learning into improved programming, as well as the extent to which others (civil society, governmental and international organisations) make use of this learning in altered policy and practice. Learning will be understood under the following headings: |
| Effectiveness | Innovation | The extent to which grantees develop, test, and achieve the adoption by others of new knowledge, such as in techniques, approaches, and design of interventions. Innovation is a special type of learning, distinguished from learning in general by novelty. Two levels of innovation will be distinguished. |
| Effectiveness | Partnership approach | The extent to which partnerships are made with others (civil society, the private sector, governmental and international organisations) that enhance the effectiveness and impact of interventions and encourage sustainability. Partnerships that build sustainability might include leveraging funds for continuation, securing policy adoption of an intervention or approach, and building the capacity of southern actors to deliver a service or to monitor service delivery. |
| Effectiveness | M&E | The extent to which grantees effectively monitor and evaluate their performance and assess their impact. Effective M&E and impact assessment includes demonstrable assessment and reporting of results at different levels, especially outputs and outcomes. |
| Efficiency | Cost effectiveness | In its simplest form, cost effectiveness assesses the extent to which grantees have delivered units of outputs and outcomes at the ‘least cost’ in order to achieve the ‘desired’ results, typically through the formulation of unit costs. While the assessment of a grantee’s cost effectiveness is most appropriate for outputs and outcomes of a quantitative nature, it is also an appropriate tool for capturing results that are harder to express in monetary units. This is particularly relevant to PPA fund holders and GPAF organisations, where outputs and outcomes are presented in more qualitative terms. In these instances, grantees will be expected to demonstrate an acute understanding of the key drivers of the costs that are incurred – ‘cost drivers’ are the strategic and operational determinants of a specific resource or activity cost. These cost drivers reflect the interdependencies between the strategic decisions that organisations make concerning the ways in which resources are used and the operational requirements associated with the delivery of activities that are relevant to the needs and priorities of poor and marginalised people. It is expected that grantees are able to evidence and demonstrate, to a reasonable degree, what costs have been incurred, why they have been incurred, and the extent to which the costs incurred have been driven by the necessity to deliver the quality and quantity of results required. Essentially, this approach to the assessment of a grantee’s cost effectiveness seeks to understand and demonstrate the strength of the relationship between the ‘value’ and ‘money’ parts of the ‘value for money’ equation. |
| Results | Performance against the logframe | The extent to which grantees have delivered on outputs and achieved the changes indicated in their LogFrames. In the first annual review this will largely assess outputs, while subsequent reviews will be able to increasingly assess outcomes. For GPAF organisations this assessment will be at project level; for PPA organisations, the assessment will be of the whole organisation or of the part of an organisation’s programme covered by the PPA. Note: grantees are required to demonstrate and evidence wherever possible the extent to which results are attributable to DFID funding. |
| Results | Improving lives | An assessment of the extent and the manner of changes in the lives of poor and marginalised people as a result of the changes achieved, and the extent to which these changes are likely to be sustained. It is recognised that PPA/GPAF agency reporting in this area is likely to be illustrative of changes, rather than comprehensive across the portfolio. See Annex 9. Note: grantees are required to demonstrate and evidence wherever possible the extent to which changes in people’s lives are attributable to DFID funding. |
| Results | Changes in civil society | The extent to which citizens are able to do things for themselves, for example community organisations managing and delivering a particular service, and the extent to which civil society organisations are able to hold governments, as well as other actors such as the private sector and international bodies, to account. Note: grantees are expected to demonstrate and evidence wherever possible the extent to which changes in civil society are attributable to DFID funding. |
ANNEX C
Impact assessment of DFID funding
This section sets out the proposed approach to the assessment of the additional impacts achieved by grantees as a result of DFID’s funding. It starts by explaining the fundamental principles that underpin the assessment of impact and the type of techniques typically used to undertake quantitative analysis. The purpose here is not to prescribe that all grantees should apply these and only these quantitative techniques; the intention is to provide an overview of a robust approach that should be considered where it is appropriate, cost-effective and proportionate to do so. The section also stresses the importance of a mixed-methods approach to the impact assessment that uses qualitative research to explain ‘why’ and ‘how’ the programme is effecting the type and scale of changes that are quantitatively assessed.
The section concludes by providing guidance on contribution analysis, which adopts a theory of change approach to evaluation. This approach is informed by a wide range of evidence sources and perspectives brought together to produce a ‘plausible’ assessment of the ‘contribution’ of grantees to higher level outcomes and impacts. This Evaluation Strategy is first and foremost concerned with ensuring that grantees are able to produce the most robust evidence possible by rigorously using evaluation approaches and research tools that best suit the variety of ways in which DFID funding has been used across both the PPA and GPAF portfolios.
Impact assessment is defined here as the ‘net’ impact that an organisation or project intervention has, in terms of the additional benefits realised that are directly attributable to the activities delivered by the organisation or project intervention. The additionality of the funding is of key importance for DFID, as it is crucial to understanding the net impact of its interventions. Additionality is defined as follows: “an impact arising from an intervention is additional if it would not have occurred in the absence of the intervention”. Typically, this requires a comparison between what actually happened (the factual) and what would have happened in the absence of the intervention, otherwise called the counterfactual. The fundamental evaluation problem that all impact assessment faces is that we cannot observe what would have happened to those affected by the intervention if the intervention had not taken place. Impact evaluation therefore requires a rigorous approach to establishing the counterfactual. The most robust way to do this is to compare the outcomes achieved by those who benefited from an intervention with the outcomes achieved by a group of people who are similar in every way to the beneficiaries, except that they were not subject to the project intervention being evaluated, i.e. by using a comparison or control group. This approach to the assessment of impact and additionality typically involves experimental or quasi-experimental approaches and methodologies.
Grantees should note that, depending on the level of expenditure and ‘evaluability’ of the type of investment or intervention, the expectation is that the additionality and impacts of DFID’s funding should be quantitatively assessed as far as possible. This approach does not exclude qualitative methodologies, which are required to ensure that any evaluation of impact is firmly grounded in the context of a grantee’s activities. Crucially, a mixed-method approach provides a qualitative explanation of ‘why’ and ‘how’ the programme is effecting the type and scale of change assessed through quantitative research.
Acknowledging the impact attribution problem
The higher-level objective of PPA and GPAF funding is to alleviate poverty by strengthening civil society and, in doing so, to contribute to the achievement of the Millennium Development Goals and good governance. These goals are at the highest level, and DFID’s investment through the PPA and GPAF towards achieving them is relatively insignificant in the context of the global corpus of interventions aimed at alleviating poverty. Moreover, there are a large number of very important factors external to DFID’s and the grantees’ involvement, which vary according to circumstance and which will influence the results achieved. For these reasons, experimental or quasi-experimental approaches to credibly assessing the attributable effects and impacts on observed changes may be difficult to achieve and quantify. Under these conditions it is necessary to consider alternative methods for assessing the funds’ ‘contribution’ to change that do not rely solely on scientifically quantifying ‘attributable’ change.
Contribution analysis
Whatever the evaluation design or research methodologies used to evaluate the impact of DFID’s funding it is essential that a rigorous assessment of a grantee’s additionality is undertaken. At the very least this should result in a ‘plausible’ account of the difference that DFID’s funding has made to the effectiveness and performance of grantees. Contribution analysis is an approach that can help grantees overcome the attribution problem by systematically constructing an evidence-based and plausible assessment of changes that would not have happened without the support of DFID’s funding.
Contribution analysis involves assessing the ‘contribution’ that the funding is making or has made to results through a ‘theory of change’ based approach. Essentially this requires an evidence-based approach to verifying the plausibility of theories of change that underpin the rationale for the different ways in which grantees have used DFID funding to either:
indirectly ‘enhance’ the delivery of results (in the logframe), in the majority of cases for PPA grantees; or
directly deliver results (in the logframe), in the majority of cases for GPAF grantees.
Contribution analysis entails a more pragmatic, inclusive and iterative evaluation process than more experimental methods that for some grantees may not be feasible or practical given the variety of ways in which DFID funding is being used.
Contribution analysis involves the following six steps, which a grantee would typically follow:
Step 1: Develop a theory of change and the risks to it
Establish and agree with stakeholders a ‘plausible’ theory of change that accurately reflects the ways in which DFID funding has been used to deliver, or enhance the delivery of, planned results. Focus specifically on the type and nature of cause and effect relationships at each stage in the impact logic of the theory of change. The three ‘circles of influence’ (Montague et al., 2002) are useful in this respect:
Grantees should identify and articulate the assumptions that have been made in order to establish a set of cause and effect linkages between DFID funding, how it has been used and how this relates to the delivery of activities and ultimately the achievement of results set out in the logframe. In parallel, grantees should identify external influencing factors that could affect these linkages.
In the case of PPA grantees, where DFID funding has been used in an unrestricted/indirect way, these linkages and a theory of change will need to be developed that specifically focus on how DFID funding has been used, e.g. to improve organisational effectiveness through strengthening human resource management in order to ultimately enhance the delivery of results. In these instances this is a distinctly different theory of change, or impact logic, from that presented in the grantee’s logframe.
Step 2: Set out the attribution problem to be addressed
Grantees should determine the specific cause and effect questions to be assessed through the evaluation process, and should assess the nature and extent of the attribution problem by asking:
Step 3: Gather existing evidence on the theory of change
Grantees should gather evidence through routine monitoring /management data as far as possible. Whatever the nature of the theory of change underpinning how DFID funding has been used it is advisable to establish a baseline position in order to benchmark the progress made. For example, if DFID funding has been used to enhance human resource management of a grantee then a simple survey could be undertaken of a sample of project offices in order to establish the current state of human resource management from the perspective of those that benefit from it. Further questions could elaborate on the extent to which this enhances the capacity of project offices to deliver their activities and ultimately achieve their results.
Step 4: Assemble and assess the contribution narrative and challenges to it
From the outset it is important to validate whether the theory of change, and the assumptions on which it depends, hold true. This validation process should be undertaken systematically and regularly in order to iteratively build up a convincing and plausible evidence-based narrative of the effects DFID funding is having in direct and/or indirect ways. It is also essential that this process involves relevant external stakeholders who are in a position to externally verify that the original theory of change and future observed changes are plausible and credible.
Step 5: Gather additional evidence
This Evaluation Strategy provides guidance, tools and templates for gathering different types of evidence that could be required to supplement monitoring and management data. The type of evidence gathered will largely depend on the ways in which DFID funding is being used. Ideally the evidence base would consist of a combination of quantitative and qualitative data focused on testing and proving a plausible theory of change that is specific to DFID funding.
Step 6: Revise and strengthen the contribution narrative
This is a continuous process of testing and revising the theory of change that underpins the central argument that DFID’s funding is making a difference. In this way contribution analysis has a formative effect in that it enables grantees to quickly understand whether or not DFID funding is being used in an optimal way to deliver the changes envisaged at the outset.
There are several analytical approaches, in addition to contribution analysis, that could be used to assess the additionality of DFID funding. The key reason for presenting this approach is to demonstrate that this Evaluation Strategy is fully committed to gathering the best possible evidence concerning the impact and value for money attributable to DFID funding, however great the challenge. Even if a scientific approach to impact evaluation is not possible or appropriate, the approach to assessing the additionality of DFID funding should at the very least be as plausible and rigorous as possible, including evaluation designs and activities that entail predominantly qualitative research methodologies.
While responsibility for assessing and reporting on the additionality of DFID funding rests with grantees, the independent evaluators who will undertake the independent progress reviews (IPRs) will be involved with the impact assessment. Where feasible, they should be involved as early as possible by grantees so that they can provide technical support to design the assessment or carry out the steps described above.
This is reflected in grantees’ initial applications and their logframes. For PPA holders this is also reflected in the business cases prepared by DFID to justify funding.
GPAF holders will receive comments from the GPAF Fund Manager, and PPA holders will receive comments from DFID. The Evaluation Manager will be involved in preparing the comments and recommendations to some extent for both funds.
Evaluability is defined in this context as the extent to which grantees’ activities can be measured to produce reliable evidence-based judgements of performance, impact and value for money.
Please see the Key Evaluation Terms document and the NONIE paper on impact evaluation for more guidance
Mayne, J. (2008) ‘ILAC Brief 16 – Contribution analysis: an approach to exploring cause and effect’, ILAC
Download: PPA Evaluation Strategy
Download: Expression of Interest – PPA Review