My Interests & Skill Areas

The following are some of my interest and skill areas: Programme & Strategy Development; Monitoring, Evaluation & Research; Training, Coaching & Mentoring; Participatory M&E Processes; Capacity Development (log-frames, the Most Significant Change technique, Rights-Based Programming, Rights-Based Approaches to M&E); Participatory Planning Processes; use of Open Space Technology; Indicator Development; and Documentation & Development of Case Studies.

Monday, January 19, 2009

Utilising evaluation findings

Most development organisations spend considerable resources (financial, human, and time) conducting evaluations and other research; however, there is compelling evidence that evaluation results are not fully utilised. As development practitioners and evaluators, how can we contribute to the discussion on the utilisation of evaluation findings? Please share your thoughts and experiences.

6 comments:

  1. We must first define who the evaluation is meant for and carry them along from the beginning. I once had an experience where we were contracted to evaluate the CDP activities of an oil company, and throughout our six-week assignment we never met with management to be briefed on the scope. Obviously, their objective was not to use the evaluation to enhance work delivery; they just wanted to signpost that an evaluation had been carried out. For an evaluation to be used, its scope must be thoroughly discussed by all stakeholders, and agreement reached on how each wants to measure success. Only then can the results become useful. Secondly, every evaluation must have an audience to whom the results are disseminated and a deliberate action plan to implement its recommendations. The stakeholders should therefore meet again after the evaluation is completed and agree on the deliberate actions to be taken to implement the recommendations.

  2. Hi Charles, these are very useful comments. I particularly like your suggestion of developing a kind of action plan to carry through the implementation of the evaluation's recommendations. One problem I have with some recommendations is that, in some cases, they seem too theoretical and very difficult to implement. What I think the consultant should do is discuss and agree on the recommendations with the client before finalising the evaluation report, and possibly help the client develop the follow-up plan. Cheers

  3. I find this discussion very interesting; in fact, I have also put together comments based on the recent evaluation we had.
    I think we should begin thinking about how the evaluation will be used during the preparation stage. The evaluation objectives and how the project team hopes to use the information generated by the evaluation must be aligned.
    Some questions to ask ourselves include:
    • Was the process we used within each activity the right one?
    • How can we improve our techniques (e.g., training techniques)?
    • What do the evaluation recommendations mean for the project and organisation?
    • What are the key learning points from this for the future?
    • What were the critical factors for success?
    • What are the pitfalls to look out for next time?
    To facilitate sharing and build institutional memory, the process and conclusions from these discussions should be documented.
    I strongly agree with Charles that we need to come up with a plan to ensure that evaluation results and recommendations are effectively utilised.

    For those recommendations not adopted, the reasons for not adopting them should be stated. The utilisation plan should also outline what will be done, by whom, and when. This document can be a valuable addition to institutional memory, since it captures decisions that may be useful for future evaluators, proposal writers, donors, or new staff joining the team. Depending on the content of the utilisation plan, it can also be beneficial to attach it to the evaluation report circulated to donors and key stakeholders. This shows that the organisation is taking the evaluation seriously and has the capacity to learn and improve.

  4. Hi Alice, you made some very crucial points which need to be taken on board by those who commission evaluations. The whole issue of institutional memory and accountability needs to be taken seriously. It appears to me that the financial audit is given more attention than the outcome of the project on which the funds were spent.

    Secondly, I believe that evaluation findings should provide opportunities for program/project enhancement and learning.

  5. • This challenge may reflect evolving perspectives in development practice, where country programmes are designed to take up rigorous impact evaluation and to use emerging findings across all investments rolled out. MCC prescribes this in its Compacts with all beneficiary countries, including Ghana. Evaluation findings have so far provided further incentives for Project Managers to refine the direction of planned activities.
    • A second MCC caveat prescribes that project funds be released only when performance is demonstrated on indicators measuring changes in investments. Data Quality Assessments (DQAs), which appraise the soundness, validity, and completeness of the data representing the actual phenomenon on the ground, are therefore conducted by MiDA on a periodic basis. DQA findings are important for management decision-making, for revising strategies for programme interventions, and for systemic re-engineering to strengthen the M&E interplay between all stakeholders, provided the recommendations from the assessments are tracked and implemented.
    • Collaboration with academic entities, especially universities, to provoke further thought on evaluation findings is encouraged by MCC.
    • All of the above, designed to engender the use of findings from evaluations and assessments, seem to have had a smooth take-off so far. Nonetheless, the continued commitment of the M&E Unit and Project Managers, and buy-in from key stakeholders across the entire evaluation roll-out, are required. A system should be established to ensure that findings are disseminated to all stakeholders rather than put on the back burner to gather dust. Action plan matrices, setting out the role each stakeholder plays in responding to the recommendations in evaluation findings, along with the relevant timelines, should also be agreed on.

  6. What a great and needed discussion, thanks!
    I think that if we reward staff for drawing on and learning from past project evaluations, and make familiarity with evaluations of similar projects a condition of future ones, many mistakes could be avoided and further improvements made. It is great that the MCC requires this!
    Better knowledge management tools to help technical staff (especially new staff) identify the most relevant recent evaluations would also help.
    Finally, organisations too rarely invest in ex-post evaluations, which are really needed, as sustainability is one key aspect that we know so little about...
    Thanks, Jindra Cekan
