Maintaining BRIDGE

Section Description: This section covers evaluation, monitoring, documentation and sustainability.

Target Audience: BRIDGE partners, project managers, administrators, facilitators.

Maintaining and Sustaining BRIDGE Programs

One of the key messages of this manual is to consider, plan and implement BRIDGE programs in a sustainable, meaningful manner. This section summarises and reiterates the key points that relate to sustainable BRIDGE programming, and looks at the post-election period as a particularly important part of the electoral cycle from a sustainability point of view.

For BRIDGE partners or implementing organisations, a process-driven rather than event-driven (workshop/election) approach assumes continuing dialogue with the client even as a program comes to a close, for example by working through the recommendations of a BRIDGE program evaluation report. Program planners need to ask whether future interventions are desirable, given the priority BRIDGE places on empowering clients to internalise BRIDGE as a sustainable professional development tool. Instead of further interventions, routine follow-up visits could be considered as part of an overall networking approach. These assumptions could be spelled out in maintenance and sustainability plans, for incorporation into the client organisation's professional development and planning cycle.

When it comes to designing BRIDGE programs in the post-election environment, experience has shown that immediately after an electoral event both attention and funding, whether from government or donors, are likely to be withdrawn. This is often coupled with staff reductions and the loss of internal and external expertise.

However, this post-election period can also be seen as a moment of opportunity to implement a capacity development or sustainability plan, allowing for a focus on planning and working with core or permanent staff in a way that the operational imperatives of the pre-election period do not permit. A post-election evaluation process can be used as an opportunity to bring together stakeholders and repair differences by looking forward and seeking to improve the electoral process. BRIDGE can be an ideal vehicle for designing workshops that serve both these purposes.

In maintaining and sustaining BRIDGE programs, continuity of staff is desirable, and yet the ability of the client organisation to attract and retain capable facilitators may be beyond the control of program planners. Nevertheless, the key personnel, even if not permanent, who could drive, own, implement and administer any future program should be identified and included in any maintenance and planning process. The departure of a single key BRIDGE-trained person may, after all, break the chain of sustainability.

Continuing and increasing the number of networks and partners after a program is complete is a core component of any sustainability plan. Organisations could look to other organisations - national and international - to continue the identified work.

The following table is a summary of the points made in the manual related to good practice in implementing sustainable, high quality and relevant BRIDGE programs:

Stage | Measures enhancing sustainability
Before program
  • Participatory needs assessment reviewing in detail existing capacities (three layers: individual, organisational, systemic)
  • Showcase BRIDGE
  • Encourage dialogue inside beneficiary institution on professional development and relevance of BRIDGE
  • Official demand for BRIDGE comes from beneficiary
  • Include beneficiary in needs assessment or scoping mission team
  • Identify most relevant unit inside institution to become anchor of BRIDGE program and involve it in all aspects of scoping mission and program definition. In most cases, this would be an existing training unit
  • Design with beneficiary a flexible and customised program with realistic program objectives that answer priority needs. If beneficiary has strategic plan, ensure that BRIDGE program contributes to its achievement
  • Allocate sufficient time to program - think long-term
  • Develop monitoring and evaluation indicators and methods for the program as a whole, and agree on the choice of each workshop with the beneficiary
  • Secure long-term financial resources, including from beneficiary institution, to support sustainability plan
  • Tailor planned number of facilitators (TtF) to objectives defined
  • Establish Steering Committee to supervise implementation and measure impact
  • Project implementation team includes training unit
  • Capacity-based selection of potential local facilitators. Must include personnel from training unit (if it exists)
  • Use existing training resources in BRIDGE workshop resources (customisation process).
During program
  • Coordinate closely with senior management, relevant technical units and other providers of capacity development (e.g. BRIDGE partners) - if applicable - to apply outcomes of workshop activities to on-going and planned change processes
  • Negotiate criteria for selection of participants (target group, level, gender, diversity, capacity) and strive to participate in selection process
  • Ensure visibility of workshops and their outcomes inside the institution and with the wider stakeholder community
  • Involve training unit in a meaningful fashion in each step of preparing, delivering and evaluating workshops
  • Accredit local pool of facilitators (according to needs identified to serve long-term strategy)
  • Choose workshop activities that allow participants to apply skills and knowledge for addressing concrete institutional needs
  • Analyse the workshop (schedule, activities, trainers, resources) and the results of participant evaluations with the training unit after each workshop
  • Involve training unit in writing workshop report
  • Assist training unit in presenting workshop results to Steering Committee
  • Jointly monitor (BRIDGE partner + training unit + relevant technical unit) workshop impact
After program
  • Support beneficiary institution to plan for continued implementation of professional development program, including financial needs. This could involve advising institution about reforming training unit into a full-blown capacity development unit
  • Support fund-raising from national budget and donors for continued implementation of professional development program
  • Advise human resources unit to incorporate professional development as part of induction and incentive strategy
  • Support training unit in compiling, finalising and archiving training resources based on lessons learnt during program
  • Final ‘lessons learnt’ workshop with institution and joint drafting of final report
  • Present final report to Steering Committee with recommendations for sustainability
  • Disseminate final report with recommendations to wider electoral stakeholder community
  • Periodically evaluate the program impact on institution according to pre-agreed schedule and indicators (see evaluation plan). In particular, wherever workshops triggered change processes inside institution, document and evaluate the outcomes of these
  • Coordinate with providers of long-term technical assistance to support implementation of change processes and policy development identified during program
  • Help secure support for networks of electoral stakeholders that may have emerged during the program

Transition: Reporting, Documenting and Updating BRIDGE

BRIDGE can be particularly useful and successful as a capacity development tool because it aims to systematically transfer ownership of and responsibility for the conduct of BRIDGE to the client organisation or country. Ideally this occurs over the first two or three years of the rollout of BRIDGE. The aim is to have the client organisation or country develop and implement a professional or community development strategy that is taken up and institutionalised. Commitment from senior managers and a pool of accredited facilitators will be necessary so that control of BRIDGE can be transferred from international donors or funders to the client.

Transition marks the completion of a program to the satisfaction of the client. On this occasion, program records and documentation are completed and relevant sections delivered to the client. A transfer document is drafted. The purpose of the transfer procedure is to ensure the following:

  • contractual conditions have been satisfied
  • delivered outputs conform with specifications
  • the program is integrated into the ongoing business
  • legal and psychological ownership is transferred
  • all accounts are paid

Transition also marks the point at which the program team's responsibility for development ends and the end user is fully capable of taking on whatever the project produced. Purely at a practical level, this requires certain adjustments by both parties. However, there is also an important psychological element in transition that program managers ignore at their peril.

Capacity building and the transition process for handing over responsibilities to counterparts should begin at the start of the intervention. In transferring responsibility for a program, program managers should prepare a transition strategy that includes sustainability strategies, developed in close consultation with the clients. The transition comprises three main elements:

  • the documentation process
  • the closing of programs and workshops, through some form of celebration
  • the sustainability planning process (dealt with in Part 10)

Documentation

All projects generate many documents. Provided the project's logical framework has been followed, the preparation, dissemination and filing of all documents should be a straightforward process.

Archiving of BRIDGE documentation is a responsibility of the BRIDGE Office. It is the responsibility of the implementing organisation to get all the correct documents to the BRIDGE Office securely and within reasonable time frames. All relevant documentation should be emailed (or sent) to the BRIDGE Office. Progress or summary reports should also be supplied to provide material for inclusion in the quarterly BRIDGE newsletters.

Program reports

To ensure that BRIDGE partners are informed of BRIDGE events, the BRIDGE Office requires a descriptive article about the workshop or event. In addition to this article, please send the following files, reports or documents:

  • Scoping or needs assessment report
  • Workshop report, including:
      • Names of facilitation team
      • Implementing organisation details
      • Donor details
      • Participant profile
      • Workshop content
      • Lessons learned
      • Workshop agenda
      • Participant list
      • Group photo and other workshop photos
      • Participant evaluation report
      • Facilitator evaluation report
      • Any feedback on the module/s run (suggested improvements, criticisms, compliments)
  • Lead facilitator report
  • Media coverage

All of the above reports will be archived in the BRIDGE Office as a repository of information on past BRIDGE events.

TtF Reports

Lead facilitators usually bear the responsibility of writing and sending the TtF report to the implementing organisation and the BRIDGE office. The information in these reports may vary from organisation to organisation, but in general there are common features that should be included in every TtF report.

In general, a TtF Report could include:

  • scope of work
  • short overview of the TtF
  • selection and production of materials
  • facilitation team
  • participants and quality of participation
  • venue
  • evaluation summary by participants
  • recommendations by facilitators (for future TtFs, for TtF Facilitators Notes)
  • media coverage
  • general observations and conclusions

Also, a TtF report for the BRIDGE website could include:

  • a short summary of the TtF context in terms of the broader BRIDGE program
  • summary of facilitators and participants (where they come from)
  • a group photo
  • a short summary of evaluations

BRIDGE Office role

The BRIDGE Office will publish all news articles written for the website with the related photos of the event. The remaining reports or documents will be archived in the BRIDGE Office.

One of the challenges of creating such a comprehensive curriculum on electoral administration is keeping it up to date and relevant. For this reason, the curriculum has been designed to be an active document that can be updated as new information becomes available, and is open to improvements and innovations from those who facilitate and participate in BRIDGE workshops.

The curriculum is updated annually. In between updates, the BRIDGE Office collects feedback, suggestions and new material from facilitators and other stakeholders, which can be incorporated at each update. Facilitators who are registered on the website will be notified of updates by email.

Updating BRIDGE Content

Reporting and documentation are also important to BRIDGE because it is through feedback from facilitators and implementers in the field that the BRIDGE Office is able to improve and update the BRIDGE curriculum.

The BRIDGE Office actively seeks feedback and suggestions from facilitators who have used the curriculum, in order to improve the content and make it easier to use. Facilitators and other stakeholders using the curriculum are encouraged to give feedback in various ways:

  • Where they have created a new activity, submitting it for inclusion in the curriculum
  • Where they have had problems running an activity, whether due to clarity, complexity or other reasons, letting the BRIDGE Office know, and providing any amendments or suggestions on improving the activity for easier use
  • Giving general feedback on how they found the different activities or modules
  • Giving general suggestions for improvements
  • Identifying potential resources for use or reference in the curriculum
  • Identifying any outdated content or documents that should be updated or removed
  • Identifying any numbering or typographical errors

The most up-to-date version of the curriculum is the one available on the website. With each update only a fraction of the total documents will be changed, so a system has been put in place to keep track of updates and make them easy for facilitators and implementers to follow. More information can be found in the update section of the BRIDGE website.

For facilitators and implementers working from a previous download, a hard copy or a DVD copy, this update section should be the first place to look for assistance in working out what has changed since the version they hold, and whether or not they need to substitute any of the updated documents.
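To make this concrete, the short sketch below compares a locally held list of document versions (for example, noted down from a DVD copy) against the current list published on the website's update section. It is a purely hypothetical illustration: the CSV files, their columns and the idea of a machine-readable version list are assumptions made for the example, not an actual BRIDGE facility.

    # Hypothetical sketch: work out which curriculum documents have changed
    # since a held copy. Assumes two illustrative CSV files, each with
    # 'document' and 'version' columns; neither is a real BRIDGE format.
    import csv

    def load_versions(path):
        with open(path, newline="", encoding="utf-8") as f:
            return {row["document"]: row["version"] for row in csv.DictReader(f)}

    local = load_versions("my_copy_versions.csv")      # versions you hold
    current = load_versions("website_versions.csv")    # versions online

    to_substitute = sorted(d for d, v in local.items() if current.get(d, v) != v)
    newly_added = sorted(d for d in current if d not in local)

    print("Documents to substitute:", to_substitute)
    print("Documents added since this copy:", newly_added)

The same comparison can of course be done by hand against the update section; the point is simply that only the changed documents need replacing.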

Translators will also want to work from the most up-to-date version of the curriculum and should also refer to the website.

Version 2 introduced many new modules to the curriculum, expanding in response to demand. However, the BRIDGE partners are open to the inclusion of additional modules outside the 23 Version 2 modules, should there be a demonstrated need.

Ideas, suggestions and content can be sent to the BRIDGE Office to be kept on file for possible new modules to be introduced in the future.

Evaluating BRIDGE

Although evaluation happens at the end of an event or program, it should be considered from the very first stages of planning. 2.5 Planning for Evaluation gives an introduction to evaluation and outlines the steps taken in planning for evaluation at the beginning of a program. A good evaluation process is built on strong foundations set at the beginning.

Refer to: 8.6 Annex 6: Post-workshop Evaluation Sheets for questions to both client organisation and participants after BRIDGE workshops and 8.5 Annex 5: BRIDGE Evaluation Cycle for a summary of the main elements of evaluation, and things to consider when designing an evaluation process for BRIDGE.

Evaluation by the client organisation

This would normally be achieved by collating the workshop evaluation sheets (daily, landmark, or end of workshop or program) and creating a written report which summarises the strengths and weaknesses of the program, and makes recommendations based on these findings. The report would normally be prepared by the program organisers.

End-of-workshop evaluation sheets, in which participants rate facilitators and content, give an indication of how participants felt at the end of the workshop. But participants cannot, at the end of a workshop, tell the full story of whether they have benefited from the training, because they have not yet had time to put into practice what they have learned. It is therefore useful also to distribute evaluation sheets several weeks later and ask participants how they are using the skills and information gained from the workshop in their work environment; how easy or difficult it is for them to apply new knowledge and skills; and what would make the program more effective. One should remember that the reason for training is not to improve how participants perform in the training room, but how they perform outside it.

Care should be taken when designing surveys: both open and closed questions should be asked. Open-ended questions are questions with no single definite answer; these can be useful, but the drawback is that they can sometimes be hard to interpret. Closed questions have a restricted set of answers from which the respondent chooses (one choice may be 'other'), and it is easy to gather data from these types of questions. A report of these collated sheets would need to be prepared by the program organisers.
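As a minimal illustration of how closed-question data can be collated, the sketch below tallies 1-5 ratings from a set of end-of-workshop evaluation sheets. The file name, CSV layout and question columns are assumptions made for the example, not a prescribed BRIDGE format.

    # Minimal sketch: summarise closed-question (1-5 rating) responses from
    # end-of-workshop evaluation sheets. Assumes one CSV row per participant
    # and one column per question; the layout is illustrative only.
    import csv
    from collections import Counter

    def summarise_ratings(path, questions):
        counts = {q: Counter() for q in questions}
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                for q in questions:
                    answer = row.get(q, "").strip()
                    if answer.isdigit():  # closed (rating) answers only
                        counts[q][int(answer)] += 1
        for q, tally in counts.items():
            n = sum(tally.values())
            mean = sum(r * c for r, c in tally.items()) / n if n else 0.0
            print(f"{q}: n={n}, mean={mean:.2f}, "
                  f"distribution={dict(sorted(tally.items()))}")

    summarise_ratings("evaluation_sheets.csv", ["facilitation", "content", "venue"])

Open-ended responses do not lend themselves to this kind of tallying; they still need to be read, grouped by theme and summarised by whoever prepares the report.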

Evaluation reports should not be so lengthy that decision-makers don't bother to read them. To make an impact, and increase the likelihood that decision-makers read reports, evaluation reports should be broken up into easy-to-consume 'chunks' of information, for example 'Issues', 'Evidence' and 'Recommendations'.

If client organisations wish to evaluate the participants of a BRIDGE workshop using tests (separately from the evaluation of the workshop itself), they may do so. Formal tests of participant learning could be used some time after the workshop has been completed, ensuring that Learning Outcomes are matched with the test content.

Clients may also wish to assess the level of program stakeholder satisfaction (e.g. donors, sponsors) after a program.

Evaluation by and of the facilitators (and the program team)

If a client wishes to evaluate the facilitators of a BRIDGE workshop, they may do so. The BRIDGE partners have a process of 'quality control' of all accredited facilitators, which can draw on information from workshop evaluation reports. Facilitators themselves are encouraged to engage in self-appraisal and peer appraisal during in-workshop monitoring (a self-evaluation form is included as a Facilitators Resource in every module). They are also encouraged to conduct post-workshop facilitator evaluations as part of their end-of-workshop debrief. They may also be responsible for preparing post-workshop evaluations on behalf of the program organisers or partner organisations. Results of these debriefs could also be included in the final reports of the program.

In order for evaluations to reflect BRIDGE's capacity development philosophy and values, beneficiaries should not simply provide input or render opinions about activities or interventions; they should be participants who are involved in the evaluation process right from the start. The BRIDGE partners recommend that an 'empowerment' or 'participatory' evaluation approach be adopted where possible. In this approach, which is fundamentally democratic, the entire group - not just an evaluator - is responsible for conducting the evaluation (of a program) and assessing their own achievements. Evaluators are co-equal - with the client, beneficiaries or stakeholders - so that the whole process is a shared and collaborative one. This derives from the partners' acknowledgement and respect for people's capacity to create knowledge about, and solutions to, their own experiences.

Post-program evaluation tasks

Post-program evaluations can usefully be spread over three stages: the first seeks to assess immediate impacts, the second focuses on mid-term organisational impacts, and the third looks at longer-term organisational impacts. Tasks to be performed at each stage are summarised in the tables below.

Table 5: Short-term evaluation

Who is being evaluated? | Immediate post-workshop evaluation (to be conducted as soon as possible after the end of the program) | Product of evaluation
BRIDGE partners and country client
  • Project history and outcomes can be collated
  • Donor reports
  • Other reports (including archived information)
Project team and counterpart training unit
  • Debriefing of facilitator
  • Post-program assessment
  • Constructive forward planning
  • Standard evaluation process
  • Standard report format
  • Briefing of country client
  • Collated project information/history
  • Recommendations on future BRIDGE opportunities (standard format)
Facilitators
  • Workshop evaluation
  • End of training evaluation
Participants
  • Application of learning (if operational-related)
  • Improved work plans
  • Expanded view of job
  • Personal enrichment (measurement)

Table 6: Medium-term evaluation

Who is being evaluated? | Organisational impact (to be assessed on the occasion of the next electoral event or before the end of a six-month period, whichever occurs first) | Product of evaluation
BRIDGE partners and country client
  • Stakeholder surveys
  • Collation of information
  • Report to donors
  • Report to country client
  • Proposal for future work/continuity
  • Agreement on further country client strategy
  • Strategy for future training/capacity development
Project team and counterpart training unit
  • Input into impact assessment
  • Report to BRIDGE partners on process
Facilitators
  • Input into impact assessment
  • Increased skill levels
  • Bigger pool of experience
Participants
  • Interviews
  • Improved work plans
  • Changed operations
  • More positive work environment

Table 7: Long-term evaluation

Who is being evaluated? | Organisational impact (to be assessed after at least a year) | Product of evaluation
BRIDGE partners and country client
  • Stakeholder surveys
  • Collation of information
  • Report to donors
  • Report to country client
  • Proposal for future work/continuity
  • Agreement on further country client strategy
  • Strategy for future training/capacity development
Project team and counterpart training unit
  • Input into impact assessment
  • Report to BRIDGE partners on process
Facilitators
  • Input into impact assessment
  • Increased skill levels
  • Bigger pool of experience
Participants
  • Interviews
  • Improved work plans
  • Changed operations
  • More positive work environment

Evaluation reports

The program organisers would be responsible for preparing the reports associated with workshops and the program. These reports may be tailored according to the audience, which may include a client such as an EMB, donors, or other stakeholders.

The program report should:

  • Be clearly dated
  • Include the clearly stated purpose of the report
  • Specify the training events being evaluated and the time period during which they took place
  • Include an appropriate amount of detail for the needs of the intended audience
  • Include information that is presented in an interesting and understandable way, with graphics that help to make the findings clear
  • Not contain unnecessary information

Also, it should be clear who the audience for the report is, and the evaluators should have clear expectations for how it will be used by that audience.

An evaluation report should include the following components:

  • Executive Summary
  • Details of the training event(s) being evaluated:
      • time span
      • number of times conducted
      • number of participants
      • number and names of facilitators (and accreditation status)
      • purpose and objectives of the training event(s)
      • key content areas
  • Methodology:
      • composition of evaluation team
      • objectives of evaluation
      • selection of sample (size, characteristics)
      • number and location of sites visited
  • Analysis of findings
  • Interpretation
  • Recommendations (for changes in or maintenance of training, organisational systems and procedures, and environmental factors)
  • Annexes/Appendices that could include data analyses