
CONTENTS

I. Introduction
   Purpose of evaluation, users of this guide
   Accountability for monitoring and evaluation
   Glossary of terms

II. Standardised components of monitoring and evaluation in project proposals
   Introduction to evaluation approach at proposal stage
   Monitoring and project preparation
   Frameworks for project evaluation
   Project evaluability (baseline data) - monitoring and evaluation indicators
   Content of evaluation section in project proposals: standard templates
   Sample evaluation section in investment project proposals
   Sample evaluation section in non-investment project proposals

III. Evaluation and monitoring of project implementation
   Introduction to monitoring of the implementation of investment projects
   Indicators of progress in the implementation of investment projects (definition; rationale; optimum number)
   List of key indicators for investment projects
   Implementation performance and decision-making (When are decisions required? Who needs to know/decide?)
   Monitoring of non-investment projects
   Relationship of monitoring to evaluation
   Mid-term evaluation

IV. Project completion reporting
   Rationale for project completion reporting
   Content of investment project completion report
   Standard templates
   Sample of investment project completion report
   Content of non-investment project completion report
   Sample of non-investment project completion report

V. Conducting evaluations under the Multilateral Fund
   A. Background and rationale for evaluation
   B. Timing, scope and focus of Multilateral Fund evaluations
      1. Timing
      2. Scope
      3. Focus
   C. Evaluation management and procedures
      1. Initiating a specific evaluation
      2. Evaluation work plan
      3. Roles and responsibilities
   D. Procedures for implementing work plans
      1. Selecting projects for evaluation
      2. Evaluation framework matrix
      3. Activity/effort analysis
      4. Data collection plan
      5. Budget
      6. Collecting and analysing data (see later section for general aspects)
      7. Reporting
   E. Data collection and analysis
      1. Types of data
      2. Data sources
      3. Methods of data collection
      4. Instrumentation
      5. Indicators

Appendices
   Appendix I: Sectoral evaluation matrix
   Appendix II: Non-investment project evaluation matrix – Training projects
   Appendix III: Non-investment project evaluation matrix – Institutional strengthening projects
Glossary of Terms
For the purposes of this Guide, the following definitions will be assumed:

| Term | Definition |
| --- | --- |
| Activity | Action taken or work performed within a project in order to transform inputs into outputs. |
| Assumption | External factors, influences, situations or conditions which are necessary for project success, worded in terms of positive conditions. Assumptions are external factors which are quite likely but not certain to occur and which are important for the success of the project, but which are largely or completely beyond the control of project management. |
| Baseline benchmarks | Data that describe the situation before any project intervention. |
| Effectiveness | A measure of the extent to which a project is successful in achieving its planned objectives or results. |
| Efficiency | A measure of the extent to which inputs were supplied and managed, and activities organised, in the most appropriate manner at the least cost to produce the required outputs. |
| Evaluability | The extent to which a project has been defined in such a way as to enable evaluation later on. |
| Ex-post evaluation | An evaluation conducted after project completion. |
| Findings vs. conclusions | A finding is a factual statement (e.g. 405 tonnes of ODS were phased out). A conclusion is a synthesis of findings incorporating the evaluator’s analysis (e.g. the project was not efficient since it cost twice as much to phase out 3 tonnes of ODS compared to the costs in other similar projects). |
| Impact/effect | An expression of the ultimate planned and unplanned changes brought about as a result of a project; the planned and unplanned consequences of the project. In projects that follow logical frameworks, effects are generally related to the purpose, impacts to the goal. |
| Indicator | An explicit statistic or benchmark that defines how performance is to be measured. |
| Input | Resources such as human resources, materials, services, etc., which are required for achieving the stated results by producing the intended outputs through relevant activities. |
| Objective | Expresses the particular effect which the project is expected to achieve if completed successfully and on time. |
| Output | The physical products, institutional and operational changes, or improved skills and knowledge to be achieved by the project as a result of good management of the inputs and activities. |
| Project | A planned undertaking designed to achieve certain specific objectives/results within a given budget and specified time period through various activities. |
| Stakeholders | Interested and committed parties; a group of people with a vested interest in the phenomena under study. |
V. Conducting evaluations under the Multilateral Fund
A. Background and rationale for evaluation
In the context of the Multilateral Fund,
an evaluation may be defined as “an assessment, as systematic and independent as possible, of projects
or clusters of projects, their design, implementation and results. The aim of
evaluation is to assess the continued relevance of Fund support to various
types of projects in various regions, the efficiency of project implementation,
and the effectiveness of such projects in achieving the Fund’s/project’s
objectives, as well as any lessons that can help guide future policy and
practice”.
The purpose of Multilateral Fund evaluations is to provide information on:
- overall Fund performance in reducing ODS according to established targets;
- the effectiveness of projects in particular sectors, and of non-investment projects;
- the strengths and limitations of various types of projects;
- the major causes of observed failures to reach targets;
- possible actions that might improve performance of the Fund.
The Executive Committee and all other stakeholders, such as Article 5 countries and Implementing Agencies, are intended to benefit from evaluation information and lessons learned that will help them improve their efforts to achieve the goals of the Montreal Protocol. The Executive Committee establishes evaluation priorities through an annually approved budget for evaluations.
The Executive Committee considered the Multilateral Fund’s
work programme and work plan for monitoring and
evaluation at its Twenty-second Meeting and adopted
deliverables 1, 2, 4, and 5 in the work programme and
outputs 1 to 4 in the work plan.
Output 1 mandates the preparation of an Evaluation Guide covering both investment and non-investment projects. This guide incorporates and builds on the guidelines and procedures already developed by the Implementing Agencies, and draws on, inter alia:
- project baseline data;
- data from progress and completion reports;
- evaluation data collected by the Implementing Agencies;
- established guidelines for evaluation data collection.
B. Timing, scope and focus of Multilateral Fund evaluations
Evaluations can be classified according
to their timing, their scope and their focus.
1. Timing
Evaluations may be undertaken during project implementation
or after projects have been completed as characterised below.
| Evaluation timing | Description | Rationale |
| --- | --- | --- |
| Mid-term evaluation | An evaluation of a specific project, done at any time during project implementation. | Projects that may require mid-term evaluations include those that are very large, that have high risks associated with their design, that are using novel technology, or that are experiencing problems, such as implementation delays. |
| Ex-post evaluation | An evaluation of one or more projects that takes place at some point after operational project completion. | Such evaluations are intended to confirm that projects performed as reported, and to facilitate future decision-making by learning about strengths, weaknesses and unplanned effects of projects of various types. |
2. Scope
The scope of Fund evaluations will respond to particular
needs which will be identified by the Executive Committee’s evaluation work
programme. Evaluations may examine a collection of
projects in a sector or region, or may focus on a single
project.
| Type of evaluation | Scope |
| --- | --- |
| Evaluation of a single investment project | Such an evaluation would focus on a single project, but would examine the context in which it is situated. The project may be in the process of being implemented, or it may be completed. |
| Evaluation of projects within a sector (sectoral or thematic) | Such evaluations would normally deal with a group of projects within the sector. They could include both investment and non-investment projects, and both completed and non-completed projects. Specific evaluation studies may relate to a designated geographic area or theme, or be limited in other ways. |
| Evaluation of non-investment projects | Such evaluations would normally deal with a group of completed projects and may be designed to focus on one or more of a combination of particular issues, sectors, Implementing Agencies, or geographic areas. |
3. Focus
The focus of an evaluation refers to the
types of issue it is to address. These are described by the major questions an
evaluation is expected to answer. The Executive Committee has considered the
following as illustrative of key potential questions for sectoral and
thematic evaluations (training and institutional
strengthening) supported by the Fund. The following tables provide possible
evaluation questions for sectoral, training, and institutional strengthening projects. (Appendices I-III provide
additional examples).
| Sectoral evaluations | Training | Institutional strengthening |
| --- | --- | --- |
| Effectiveness and effects | Effectiveness and effects | Effectiveness and effects |
| In general, how effective have the various types of investment projects been in achieving ODP targets and reducing ODS within the sector? | To what extent is training supported by the Fund effective? | To what extent is institutional strengthening supported by the Fund effective? |
| Was the old technology successfully discontinued? | Is training impacting the enabling environment in ways that support achievement of the Fund’s objectives? | Is institutional strengthening impacting the enabling environment in other ways that support achievement of the Fund’s objectives? |
| What have been the effects of the new technology on operating costs? On market demand? On safety and environment? | Is technical training leading to more effective technical conversions? | |
| How sustainable are the project results? | | |
| Efficiency | Efficiency | Efficiency |
| What were the major implementation challenges and how were they overcome? How efficient are the various approaches to project implementation (e.g.: financial intermediary; local executing agency; ozone unit)? | Are training activities planned and implemented in the most cost-effective way? How could cost-effectiveness be improved? | Are institutional strengthening activities planned and implemented in the most cost-effective way? How could cost-effectiveness be improved? |
| Which aspects of investment projects in this sector (equipment, technical assistance, training) worked very well? | Do Implementing Agencies include suitable monitoring and evaluation of training activities that enable such activities to benefit from participant feedback? | Have expenditures been allocated appropriately among the allowable categories? |
| How effective was transfer of technology in the various projects and regions? | | Have regional network activities been implemented in a cost-effective way? |
| Project design | Project design | Project design |
| What were the critical factors in the enabling environment that have affected project success? How have they contributed to or hindered project efficiency and effectiveness? | Are Implementing Agencies addressing the most pressing training needs? | Was the chosen mechanism appropriate for the institutional strengthening tasks? |
| Did the design of various types of projects change prior to implementation? | To what extent are training activities suitably targeted to reach people and institutions with a need for such support? | Did the original provisions reflect the needs? |
| Was the level of funding provided by the Fund understood by the enterprise and appropriate to the need and incremental cost requirements? | Are training programmes designed in conformity with contemporary international standards for training? | Did original project documents contain adequate information for subsequent evaluation? |
| Did original project documents contain adequate information for subsequent evaluation? | Did original project documents contain adequate information for subsequent evaluation? | |
| Lessons learned | Lessons learned | Lessons learned |
| What lessons have been learned that may be useful in guiding future project preparation, approval, or implementation? | What lessons have been learned that may be useful in guiding future project preparation, approval, or implementation? | What lessons have been learned that may be useful in guiding future project preparation, approval, or implementation? |
| What lessons have been learned about monitoring and evaluation under the Fund? | What lessons have been learned about monitoring and evaluation under the Fund? | What lessons have been learned about monitoring and evaluation under the Fund? |
C. Evaluation management and procedures
The general process for approving and conducting evaluations under the Fund is described below.
The Sub-Committee
on Monitoring, Evaluation and Finance recommends the annual evaluation work
programme and work plan of the Multilateral Fund
for approval by the Executive Committee. The approved work programme and plan
of the Fund on monitoring and evaluation is the normal basis on which specific
evaluations are carried out; however, the Executive Committee may decide to
conduct special evaluations at any time. The annual work programme provides, in
the form of proposed outputs, a summary description of
specific evaluations to be undertaken. The management of
these evaluations is the responsibility of the Secretariat as described below.
1. Initiating a specific evaluation
The Monitoring and Evaluation Officer within the Secretariat has overall responsibility for
managing evaluations approved by the Executive Committee. For each evaluation,
it is the responsibility of the Monitoring and Evaluation Officer to prepare
terms of reference (TOR) leading to the contracting
of external consultants. The content of the TOR is as
follows:
Terms of reference:
- Background
- Reasons for evaluation
- Scope and focus
- Specific evaluation requirements
- Estimated level of effort
- Description of required evaluators
- Schedule for the evaluation
- Indicative costs
Using established contracting procedures, the Secretariat will contract a firm or consultant to conduct the evaluation. The Secretariat typically issues a letter of invitation to qualified consulting firms to submit the qualifications of the personnel proposed and the professional fees for the assignment. The TOR are normally included with this invitation to bid.
2. Evaluation work plan
Once evaluators have been contracted, the first deliverable in
the contract is normally a work plan for the assignment, with the details
worked out in consultation with the Secretariat. The suggested outline for such
an evaluation work plan is shown below.
Evaluation work plan outline:
- Overview
- Evaluation team
- Project selection
- Evaluation matrix
- Methodology
- Activity/effort analysis
- Data collection plan
- Budget
The evaluation work plan is an
important control document as it supplements the contract and enables the
Monitoring and Evaluation Officer to
exercise control over the quality of the evaluation. The evaluation work
plan will conform to the general requirements of this guide and will continue
to evolve in matters of operational detail.
3. Roles and responsibilities
a) Evaluation Team
In order to benefit from a range of perspectives, and to
ensure a balance of independent views and a mix of expertise,
evaluations are normally conducted by teams of independent experts who are not
directly linked to the preparation and/or implementation of projects and
activities approved under the Multilateral Fund. These
teams are contracted under the normal procedures for
contracting of consultants. The specific composition of
each evaluation team will vary according to the evaluation
needs and cost effectiveness considerations. Evaluation
teams for a simple project evaluation may include as few as one or two external
consultants.
Each evaluation conducted by a team will involve an Evaluation Team Leader with expertise in the work of the Multilateral Fund, ODS technology and/or evaluation methodology, and with experience leading evaluation teams in international contexts. Evaluation teams will be contracted by the Fund Secretariat. The Team Leader’s role is to:
- Lead the evaluation team in all aspects of the work, so as to produce all required outputs according to agreed standards and time frames;
- Be responsible for co-ordinating the implementation of the required evaluations;
- Liaise with the Monitoring and Evaluation Officer within the Secretariat;
- Participate with the team in data collection and analysis;
- Be responsible for drafting the evaluation report;
- Submit reports that respond to the TOR to the Secretariat.
b) Multilateral Fund Secretariat
The Fund Secretariat ensures that
evaluations relate to the evaluation needs of the Fund, the
decisions of the Executive Committee and the requirements of
the Executive Committee’s work programme on monitoring
and evaluation. The role of the Secretariat is to:
- Manage the evaluation process;
- Provide an ongoing link between the evaluation and the Secretariat;
- Approve the evaluation work plan developed by the Evaluation Team Leader;
- Facilitate communication between the evaluation team and Implementing Agencies, participating Article 5 countries and bilateral agencies;
- Provide technical expertise and participate in field missions as required;
- Provide data from the Secretariat’s databases and archives;
- Review the final evaluation report to ensure it meets the requirements of the TOR and has adequate technical quality.
c) Implementing agencies
Implementing agencies are expected to support the evaluation process by:
- Being responsive to the requirements of evaluation team members;
- Meeting the evaluators at Headquarters and/or in field offices as required;
- Facilitating meetings with financial intermediaries and enterprises as appropriate;
- Advising the evaluation team on suitable approaches for data collection if requested;
- Providing relevant data on projects, enterprises and their context;
- Commenting on the accuracy of data in report drafts;
- Contributing to the formulation of lessons learned.
d) Article 5 Countries
Involvement of Article 5 countries is the key to improving the Fund’s performance in reducing ODS. Country representatives such as Ozone Officers are important contributors to the work of evaluation teams. The role of Article 5 country representatives is to:
- Meet with the evaluators during field missions;
- Advise the evaluation team on suitable approaches for data collection if requested;
- Provide relevant data and interpretation on projects implemented within the country;
- Facilitate the collection of data within government departments and on site visits to enterprises;
- Advise on local product markets;
- Comment on the accuracy of data in report drafts;
- Contribute to the formulation of lessons learned.
D. Procedures for implementing work plans
1. Selecting projects for evaluation
Sometimes the selection of specific projects to be evaluated will be specified in the TOR. In other situations, such as with sectoral evaluations, all projects that have certain characteristics will be reviewed, but at different levels of detail, as described below.
The Evaluation Team Leader, in consultation with the
Monitoring and Evaluation Officer,
and within the context of the approved work programme,
will make the technical decision about the particular
projects which will be included in an evaluation, and at
what level of examination. The selection of projects for site visits will
depend on a variety of factors including the needs for coverage, cost
efficiency, and the scale and type of projects (e.g.: demonstration; completed
or ongoing).
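To make the idea of review levels concrete, the screening step can be sketched as a simple filter over a project list. The sketch below is purely illustrative: the project identifiers, field names and selection criteria are hypothetical assumptions, not prescribed by this Guide.

```python
# Hypothetical screening of a project list for a sectoral evaluation:
# all foam-sector projects receive a desk review; only completed ones
# are shortlisted for site visits. IDs and fields are illustrative.
projects = [
    {"id": "P-001", "sector": "foam", "status": "completed"},
    {"id": "P-002", "sector": "refrigeration", "status": "completed"},
    {"id": "P-003", "sector": "foam", "status": "ongoing"},
]

desk_review = [p for p in projects if p["sector"] == "foam"]
site_visits = [p for p in desk_review if p["status"] == "completed"]

print([p["id"] for p in desk_review])   # ['P-001', 'P-003']
print([p["id"] for p in site_visits])   # ['P-001']
```

In practice the Team Leader would weigh the factors named above (coverage, cost efficiency, project scale and type) rather than apply a single mechanical rule.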
2. Evaluation framework matrix
The framework for data collection and
analysis is recorded in an evaluation matrix. This matrix
outlines the key questions and sub-questions to be addressed, and shows the
indicators and sources of data to be included in the data analysis relative to
each question.
Three generic evaluation matrices (including possible evaluation questions, indicators and sources of data) are presented in Appendices I-III: a matrix for a sectoral evaluation (Appendix I), a matrix for an evaluation of training projects (Appendix II), and a matrix for an evaluation of institutional strengthening projects (Appendix III).
Using the generic evaluation matrix as a
guide, the Team will refine the evaluation questions and develop the specific
indicators and data sources required to address the specific TOR.
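Teams that keep their matrix in electronic form might represent each row as a small record, as in the following sketch; the `MatrixRow` structure and its field names are illustrative assumptions, with the example row adapted from the sectoral matrix in Appendix I.

```python
from dataclasses import dataclass, field

@dataclass
class MatrixRow:
    """One row of an evaluation framework matrix (illustrative structure)."""
    question: str                                    # key evaluation question
    sub_questions: list = field(default_factory=list)
    indicators: list = field(default_factory=list)
    data_sources: list = field(default_factory=list)

# Example row adapted from Appendix I.
row = MatrixRow(
    question="Was the old technology successfully discontinued?",
    sub_questions=["How was the de-commissioned equipment rendered unusable?"],
    indicators=["% old technology destruction", "months for phase-out"],
    data_sources=["Project documents", "Enterprise", "Country representatives"],
)
print(row.question, "->", ", ".join(row.indicators))
```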
3. Activity/effort analysis
The work plan will include a table of the activities to be
undertaken, who will undertake them, and the amount of time planned for each.
This table will link to the personnel costs in the budget. The Team will divide responsibilities so that all aspects of data collection and analysis are carried out efficiently. In practice, this may involve different team members conducting different site and country visits.
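A minimal sketch of such an activity/effort table, rolled up into the personnel line of the budget, follows; the activities, assignments, day counts and day rates are all hypothetical.

```python
# Illustrative activity/effort table: (activity, team member, planned days).
activities = [
    ("Document review",       "Team Leader",  5),
    ("Country field mission", "Consultant A", 10),
    ("Site visits",           "Consultant B", 8),
    ("Report drafting",       "Team Leader",  7),
]
day_rate = {"Team Leader": 600, "Consultant A": 450, "Consultant B": 450}  # US $/day

# Roll the planned days up into the personnel cost line of the budget.
personnel_cost = sum(days * day_rate[member] for _, member, days in activities)
print(f"Planned personnel cost: US ${personnel_cost:,}")  # US $15,300
```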
4. Data collection plan
The Evaluation Team Leader will develop a detailed data
collection plan; assign specific roles and
responsibilities; schedule specific activities such as site
visits; and develop the necessary data collection methods and instruments.
In developing the detailed data collection
plan, the Team may review available Implementing Agency
reports and project completion reports. The Evaluation Team Leader may make a
preliminary request for data from Implementing Agencies and from Ozone
Officers.
5. Budget
The work plan will include a budget for the
costs of personnel, travel, and other expenses. This budget is
indicative of the emphasis of various components of the evaluation;
however, contracting may be on a fixed fee basis with payments linked
to specific deliverables.
6. Collecting and analysing data (see Section E for general aspects)
a) Initial analysis
The first level of analysis will be through the existing data
found in Implementing Agency reports, of which the Project Completion Reports
are particularly important. The initial data analysis will help the team to
understand what data are not available and need to be collected elsewhere, and
will help define issues that require follow-up.
b) Country field missions
Field missions are an important supplement to existing
reported data. They provide an opportunity to validate available data, to
supplement it, and to collect data on developments following operational
completion of a project.
Once the dates of field missions are
known, the Secretariat informs the concerned Article 5 countries and
Implementing Agencies of the start of the evaluation field
mission. The nature of their involvement and expected support will be
indicated.
Country missions may begin with in-country briefings with the
Ozone Officer to review and obtain input and assistance on the data collection plan.
The purpose of site visits will be to gain additional
understanding by confirming and/or complementing information available
from existing data sources, and situating the findings in the context. During
the mission, data will be collected according to the data collection plan
(through interviews and visits with government representatives,
Implementing Agencies’ field offices, enterprises, and
bilateral donors as applicable) with modifications made as
needed and as agreed by the Team.
c) Non-investment evaluations
As in other types of evaluations, studies of non-investment
projects will involve analysis of extensive existing data (e.g. internal
evaluations of training workshops, country
programmes and reports). These tend to be self-reported data that are collected
before or at project completion. In addition, evaluations emphasising effects
and impact will require follow-up or tracer study methods such
as questionnaire surveys, telephone interviews, electronic
communication, and, when warranted, visits to the field.
7. Reporting
The Team Leader bears overall responsibility for the final
analysis and reporting. Following accepted practice for sound evaluation, the
Team Leader will attempt to share drafts of relevant sections of
reports with involved Implementing Agencies and Article 5 countries to give
them the opportunity to correct factual errors in the drafts. While every
attempt will be made to ensure factual accuracy, the substantive conclusions of
the evaluation are the responsibility of the evaluators.
The Evaluation Team Leader will submit the report to the
Monitoring and Evaluation Officer.
The latter ensures conformity to the TOR, technical accuracy and quality, and
may require revisions before submitting the report to the Sub-Committee.
a) Sectoral evaluations
The outline of each evaluation report will
be tailored to the specific TOR and other requirements. A suggested outline is
provided below to indicate the type of reporting desired. The emphasis is on
clear reports that state what was found, the resulting conclusions and
recommendations directed at specific stakeholders. Every report should contain
a concise executive summary of 2-5 pages.
Sectoral evaluation report outline:
- Executive summary
- Introduction
  - Background
  - Description of projects (investment; non-investment)
  - Evaluation methodology
  - Organisation of report
- Design and rationale
  - Assumptions
  - Sector context
  - Context - enabling environment
  - Design
  - Changes
  - Evaluability
  - Alternative designs
- Cost
  - Planned/actual
  - Cost sharing
  - Sources of extra cost
- Effectiveness and effects
  - Achievement of results
  - ODS phase-out
  - Institutional strengthening at operational level
  - Differences by sector, region
  - Equipment rendered unusable
  - Effects on enterprises
  - Effects on safety/environment
- Implementation efficiency
  - Conversion of inputs to outputs
  - Differences by component
  - Differences by type of project, region, agency
  - Project management
- Sustainability
- Conclusions
- Recommendations and follow-up
- Lessons learned
- Annex 1 - TOR
- Annex 2 - Evaluation matrix
- Annex 3 - Organisations visited
- Annex 4 - Project list
b) Reporting on evaluations of non-investment projects
The outlines of the evaluation reports for non-investment projects will follow the key questions of the evaluation framework matrix. Sample outlines for a training evaluation and for an institutional strengthening evaluation are shown below.
Training evaluation report outline:
- Executive summary
- Introduction
  - Background
  - Description of projects
  - Evaluation methodology
  - Organisation of report
- Design and rationale
  - Assumptions
  - Context - enabling environment
  - Design
  - Relevance of plan
  - Changes
- Cost
  - Planned/actual
  - Cost sharing
  - Sources of extra cost
- Effectiveness and effects
  - Achievement of targets
  - Effects on enterprises
  - Effects on safety/environment
- Implementation efficiency
  - Delivery of inputs
  - Project management
- Sustainability
- Conclusions
- Recommendations
- Lessons learned
- Annex 1 - TOR
- Annex 2 - Evaluation matrix
- Annex 3 - Organisations visited and interviews conducted
- Annex 4 - Project list
Institutional strengthening evaluation report outline:
- Executive summary
- Introduction
  - Background
  - Description of IS funding
  - Evaluation methodology
  - Organisation of report
- Design and rationale
  - Assumptions
  - Design
  - Relevance of plan
  - Level of responsibility
  - Variations in different category countries
  - Changes in roles of units
- Cost
  - Planned/actual
  - Cost sharing
  - Sources of extra cost
- Effectiveness and effects
  - Achievement of objectives: data-gathering; information exchange; dissemination; monitoring; co-ordination
  - Fulfilment of obligations
  - Differences by sector, region, category of country
  - Regional networks
  - Effects on ODS phase-out
- Efficiency
  - Time lags in implementation
  - Capital expenditures
  - Professional staff
  - Operational costs
  - Regional networks
- Sustainability
  - Need for continuation
  - Government plans
- Conclusions
- Recommendations
- Lessons learned
- Annex 1 - TOR
- Annex 2 - Evaluation matrix
- Annex 3 - Organisations visited and interviews conducted
- Annex 4 - Project list
E. Data collection and analysis
1. Types of data
Data can be hard or soft, quantitative or qualitative. Hard (quantitative) data generally include technical or financial facts, such as the amount of ODS phased out through a project or the number of trainees who participated in a course. Soft (qualitative) data reflect perceptions or judgements; they include both non-technical judgements, such as people's perceptions of what took place, and the expert judgement of an individual who is knowledgeable and experienced in a particular field. Valid evaluations try to obtain as many types of data from as many sources as possible. One of the rules of thumb of evaluation is that the more sources confirm a finding, the more valid the finding.
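This triangulation rule of thumb can be expressed as a simple tally of how many independent sources confirm each finding, as in the sketch below; the findings and sources listed are hypothetical examples.

```python
from collections import defaultdict

# (finding, confirming source) pairs gathered during data collection.
evidence = [
    ("405 tonnes of ODS phased out", "Project completion report"),
    ("405 tonnes of ODS phased out", "Enterprise production records"),
    ("405 tonnes of ODS phased out", "Ozone unit interview"),
    ("Old equipment rendered unusable", "Site visit"),
]

confirmations = defaultdict(set)
for finding, source in evidence:
    confirmations[finding].add(source)

# The more independent sources confirm a finding, the more valid it is.
for finding, sources in confirmations.items():
    print(f"{finding}: confirmed by {len(sources)} source(s)")
```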
2. Data sources
Evaluation studies draw from many data sources, as it is a combination of sources that lends strength to evaluation findings. Some of the major sources include the following:

Documents:
- Project proposals;
- Project documents;
- Project progress reports;
- Project completion reports;
- Country programmes.

Interviews:
- Government officials;
- Persons involved in any aspect of project implementation;
- Persons involved in training and institutional strengthening supported by the Fund;
- Bilateral donors involved in the sector;
- Managers (e.g.: production; marketing) and technical personnel from involved enterprises;
- Persons involved in product markets (e.g.: distributors; retailers).

Enterprises:
- Equipment and production processes;
- Production reports;
- Product sampling.
Note that there are instances where data are missing or not
available, in which case alternative sources may provide data with which to
address the questions. In extreme cases, there are no data and the questions
cannot be answered, at least at the time of the evaluation.
This would suggest recommendations for improved data systems in future project
approvals and implementation.
3. Methods of data collection
It is expected that the Evaluation Team will use a combination of methods of data collection and analysis, including:
- review of project proposals and reports, especially project completion reports;
- surveys and telephone interviews with project stakeholders;
- country and on-site visits to enterprises, where the volume of projects warrants it;
- selective sampling, through market surveys, of products considered to be ozone-friendly.
Whatever methods are used, the evaluators will protect the confidentiality of those who provide data by avoiding interpretations and conclusions that could be traced back to the person providing them.
4. Instrumentation
Each evaluation team will also develop
data collection instruments and procedures suited
to the needs of particular evaluation studies and sites. The types of
instruments normally used include:
Interview protocols:
- Country officials;
- Persons knowledgeable about project implementation;
- Persons who have been supported by non-investment projects;
- Other stakeholders (bilateral donors; persons involved with product markets).

Checklists:
- Factors in the enabling environment;
- Environmental and safety concerns.

Questionnaire surveys:
- Training participant tracer surveys.
5. Indicators
Indicators are important quantifiable measures of various aspects of project performance: the amount of ODP phased out is one example; the proportion of training participants who are successful in applying new skills is another; the time taken to reach agreed targets is a third. Each of the evaluation questions will be judged using one or more indicators of this type. The use of indicators helps make the rules of judgement transparent, and it provides a sound and rational basis for data analysis.
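As a minimal illustration, the indicators just named reduce to simple ratios; the figures in the sketch below are hypothetical.

```python
# Hypothetical figures for two of the indicators named above.
odp_target = 405.0   # ODP tonnes planned for phase-out
odp_actual = 380.0   # ODP tonnes actually phased out
print(f"Planned/actual target achievement: {odp_actual / odp_target:.1%}")  # 93.8%

participants = 120   # training participants followed up
applying = 84        # reporting successful application of new skills
print(f"Participants applying new skills: {applying / participants:.1%}")   # 70.0%
```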
Sectoral evaluation report outline:
- Executive summary
- Introduction
  - Background
  - Description of projects
  - Evaluation methodology
  - Organisation of report
- Design and rationale
  - Assumptions
  - Sector context
  - Regulation/legislation
  - Context - enabling environment
  - Design
  - Relevance of plan
  - Changes
- Cost
  - Planned/actual
  - Cost sharing
  - Sources of extra cost
- Effectiveness and effects
  - Achievement of targets
  - Differences by sector, region, etc.
  - Effects on enterprises
  - Effects on safety/environment
  - Sustainability
- Implementation efficiency
  - Delivery of inputs
  - Project management
- Conclusions
- Recommendations
- Lessons learned
- Annex 1 - TOR
- Annex 2 - Evaluation matrix
- Annex 3 - Organisations visited and interviews conducted
- Annex 4 - Project list
Appendix I: Sectoral Evaluation Matrix
The following matrix includes
generic questions, indicators and data sources. It is included to suggest the
types of questions and approaches that may be useful; however, it is not
intended to be prescriptive – each evaluation will need to
develop a matrix that addresses its TOR.
| Possible evaluation questions | Possible sub-questions | Possible indicators | Possible sources of data |
| --- | --- | --- | --- |
| Effectiveness and effects | | | |
| In general, how effective have the various types of investment projects been in achieving ODP targets and reducing ODS within the sector? | Were there differences by region or Implementing Agency? Were there differences by sub-sector? Were there differences by type of technology? | Baseline + ODS reduction; change in ODP; planned/actual target achievement | Project documents; enterprise data; country representatives; project implementation agencies |
| Was the old technology successfully discontinued? | For how long was the old technology in use after implementation of the project? How was the de-commissioned equipment rendered unusable? | % old technology destruction; % of various means of disposal; months for phase-out | Project documents; enterprise; country representatives; project implementation agencies |
| What have been the effects of the new technology on operating costs? On market demand? On safety and environment? | What were the effects on production following conversion? What were the effects of conversion on product quality, price, market acceptance? What were the effects on safety and the environment? | % change in products; % change in costs; % market penetration; changes in accident rates; safety guidelines | Project documents; enterprise; product testing; market sampling |
| How sustainable are the project results? | Has the project led to plans for additional conversions? What are the risks of re-conversion? | Number of inquiries about adopting technology; instances of re-conversion | Project documents; enterprise; country representatives; project implementation agencies; bilateral agencies |
| Efficiency | | | |
| What were the major implementation challenges and how were they overcome? How efficient are the various approaches to project implementation (e.g.: financial intermediary; local executing agency; ozone unit)? | How has the capacity of local Implementing Agencies affected project efficiency and effectiveness? Have conversions complied with environmental/safety standards? Have new equipment or processes introduced new safety or environmental risks? | Time to various project milestones; frequency of specific contextual constraints; frequency of specific environmental or safety concerns | Project documents; enterprises; country representatives; project implementation agencies and associates |
| Which aspects of investment projects in this sector (equipment, technical assistance, training) worked very well? | Were there contextual factors that affected the implementation of certain components? | Frequency of specific contextual constraints | Project documents and IAs; enterprises; country representatives |
| How effective was transfer of technology in the various projects and regions? | What types of difficulty were encountered in obtaining non-ODS technology? Is there any evidence of conversion back to ODS? Have other producers demonstrated interest in adopting this technology? | Frequency of specific difficulties; instances of re-conversion; number of inquiries about adopting technology | Project documents; enterprises; country representatives; project implementation agencies; bilateral agencies |
| Project design | | | |
| What were the critical factors in the enabling environment that have affected project success? How have they contributed to or hindered project efficiency and effectiveness? | Have there been effective changes in regulation and policy during project implementation? Are there constraints in the enabling environment that the Fund or country should attempt to address? Have training and institutional strengthening activities supported the success of investment projects? Were assumptions valid? Are there any contextual factors that should be a concern for future project approvals? | Checklist of critical factors in the enabling environment; list of changes in legislation/regulation | Country representatives, IAs, project implementation agencies, enterprises, bilateral agencies; legislation, regulations |
| Did the design of various types of project change prior to implementation? | Did the technology implemented differ from the technology approved? Why and with what effects? | % of each alternative technology changed; % popularity of alternative technologies | Project documents; enterprise; country representatives; project implementation agencies |
| Was the level of funding provided by the Fund understood by the enterprise and appropriate to the need and incremental cost requirements? | Did the cost change appreciably during implementation? If so, who paid the additional cost? | % change in project cost; % cost borne by different stakeholders | Project documents; enterprise; country representatives; project implementation agencies |
| Did original project documents contain adequate information for subsequent evaluation? | | Sufficient material available to complete evaluability checklist (e.g.: baseline data, training needs assessments include skill levels prior to training) | Project documents |
| Lessons learned | | | |
| What lessons have been learned that may be useful in guiding future project preparation, approval, or implementation? | What are the implications of the findings for additional and/or alternative information in future project proposals? | | All stakeholders |
Appendix II: Non-Investment Project Evaluation Matrix – Training Projects
The following matrix includes
generic questions, indicators and data sources. It is included to suggest the
types of question and approach that may be useful; however, it is not intended
to be prescriptive – each evaluation will need to develop a
matrix that addresses its TOR.
| Possible evaluation questions | Possible sub-questions | Possible indicators | Possible sources of data |
| --- | --- | --- | --- |
| Design | | | |
| Are Implementing Agencies addressing the most pressing training needs? | Are training needs assessments conducted in conformity with contemporary international standards? Do programming priorities reflect priorities of key stakeholders? | Expert judgement; congruence of training demand and supply | Training experts; stakeholders: IAs, countries |
| To what extent are training activities suitably targeted to reach people and institutions with a need for such support? | Are policies and procedures for identification of training participants suitable for addressing identified needs? | Expert judgement | Training experts; stakeholders: IAs, countries |
| Are training programmes designed in conformity with contemporary international standards for training? | Do training workshops incorporate key principles for effective adult learning? Are training materials effective in supporting training outcomes? | Expert judgement; participant ratings of satisfaction; effectiveness of materials | Training experts; training participants; training manuals and materials |
| Did original project documents contain adequate information for subsequent evaluation? | | Sufficient material available to complete evaluability checklist (e.g.: baseline data, training needs assessments include skill levels prior to training) | Project documents |
| Effectiveness and effects | | | |
| To what extent is training supported by the Fund effective? | Are participants learning the intended knowledge and skills? Is training being applied on the job? If not, what are the constraints? | Skill performance; knowledge acquisition; % participants reporting successful transfer; frequency of constraints | Tests and records; training participants; Ozone Units; enterprises |
| Is training impacting the enabling environment in ways that support achievement of the Fund’s objectives? | What policies, regulations, procedures have been initiated by countries as a result of training programmes? | Frequency of targeted changes to regulations, etc. (e.g.: customs and import, licensing, re-export, non-compliance measures); degree of implementation of Article 4 of the Montreal Protocol; extent of financial support of ODS phase-out activities | Training participants; Ozone Units; enterprises; IAs |
| Is technical training leading to more effective technical conversions? | | Reduced time for introduction of new technology | Enterprises; project completion reports |
| Efficiency | | | |
| Are training activities planned and implemented in the most cost-effective way? How could cost-effectiveness be improved? | What are unit training costs, and how do they compare with costs of other international training of this type? What is the breakdown of training costs and are there ways to reduce cost components without negatively affecting quality? | Cost comparisons; expert judgement | Budgets; financial reports; training experts; other UN agencies |
| Do Implementing Agencies include suitable monitoring and evaluation of training activities that enable such activities to benefit from participant feedback? | Does M&E address all the steps in the training cycle: attitudes? learning? transfer? impact? How might monitoring and evaluation systems be improved? | Expert judgement | Training experts |
| Lessons learned | | | |
| What lessons have been learned that may be useful in guiding future project preparation, approval, or implementation? | | | All stakeholders |
Appendix III: Non-Investment Project Evaluation Matrix – Institutional Strengthening Projects
The following matrix includes
generic questions, indicators and data sources. It is included to suggest the
types of questions and approaches that may be useful; however, it is not
intended to be prescriptive – each evaluation will need to
develop a matrix that addresses its TOR.
| Possible evaluation questions | Possible sub-questions | Possible indicators | Possible sources of data |
| --- | --- | --- | --- |
| Design | | | |
| Was the chosen mechanism appropriate for the institutional strengthening tasks? | Is the designated mechanism a central national facility? | Degree of confidence in the mechanism | Ozone/institutional strengthening experts; stakeholders: IAs; enterprises |
| Did the original provisions reflect the needs? | Was funding adequate for country requirements? | Amount of supplementary funding required | Government representatives; ozone unit |
| Did original project documents contain adequate information for subsequent evaluation? | Did the proposal conform to the requirements of the TOR and qualifying areas of expenditure? Did documents identify indicators? | Number of instances of non-congruence | Project documents |
| Effectiveness and effects | | | |
| To what extent is institutional strengthening supported by the Fund effective? | Are ozone units collecting and processing data to fulfil national obligations as parties to the Protocol? Have units exchanged relevant information with other countries and disseminated information to end-users? Are capacities to co-ordinate phase-out activities being enhanced? Are capacities to monitor phase-out activities being enhanced? Have units served as a focal point for the Fund Secretariat and IAs, including reporting? | Extent to which obligations for data collection and reporting to the Meeting of the Parties are met; amount of information exchange and public awareness activities; improved co-ordination; improved monitoring; contributions to country programmes; adoption/changes/harmonisation of legislation and/or regulations | Ozone units; Ozone Secretariat; enterprises; Implementing Agencies; Fund Secretariat |
| Is institutional strengthening impacting the enabling environment in other ways that support achievement of the Fund’s objectives? | Have regional networks been effective in supporting institutional strengthening? What actions have been initiated by countries as a result of the institutional strengthening programme? | Ratings of the extent to which regional networks are effective; frequency of various actions | Ozone Units; enterprises; IAs; participants in regional networks |
| Efficiency | | | |
| Are institutional strengthening activities planned and implemented in the most cost-effective way? How could cost-effectiveness be improved? | What has been the time lag in implementation and what are the reasons? | Planned/actual time variance | Reports of ozone units; ozone units |
| Have expenditures been allocated appropriately among the allowable categories? | What proportions have been allocated between capital and recurrent expenditures in various categories of country? | Proportions of budget | Proposals; reports; Ozone Units |
| Have regional network activities been implemented in a cost-effective way? | Have network meetings conformed to standards of similar international gatherings of this type? | Cost comparisons | UNEP reports and budgets |
| Lessons learned | | | |
| What lessons have been learned that may be useful in guiding future project preparation, approval, or implementation? | | | All stakeholders |
(UNEP/OzL.Pro/ExCom/23/68, Decision 23/5, para. 17)
(Supporting document: UNEP/OzL.Pro/ExCom/23/4)
