
Tuesday, August 12, 2014

Further Resources and Links for those who attended the Bridge M&E Colloquium on 12 August 2014


Today, I got the opportunity to present to the Bridge M&E Colloquium on the work I'm doing with the CSIR Meraka Institute on the ICT4RED project. My first presentation gave some background about the ICT4RED project. 




I referred to the Teacher Professional Development course, which is available under a Creative Commons licence here. This resource also includes a full description of the micro-accreditation (badging) system.

What really seemed to excite the participants in the meeting was the project's 12 Component model, which suggests that one has to pay attention to much more than just technology when implementing a project of this nature. My colleagues published a paper on this topic here.

Participants also resonated with the "Earn as you Learn" model that the project follows: if teachers demonstrate that they meet certain assessment criteria, they earn technology and peripherals for themselves and for their schools. A paper on the gamification philosophy that underlies the course is available here. The model was also documented in a learning brief here.


And then I was able to speak a little more about the evaluation design of the project. The paper that underlies this work is available here, and the presentation is accessible below:



I think what sets our project evaluation apart from many others being conducted in South Africa, is that it truly uses "Developmental Evaluation" as the evaluation approach. For more information about this (and for a very provocative evaluation read in general), make sure you get your hands on Michael Patton's book. A short description of the approach and a list of other resources can also be found here.

People really liked the idea of using Learning Briefs to document learning from and for team members, and to share it with a wider community. This is an idea inspired by the DG Murray Trust. I blogged about the process and template we used before. An example of the learning brief that the M&E team developed for the previous round is available here. More learning briefs are available on the ICT4RED blog.

I also explained that we use the Impact Story Tool for capturing and verifying an array of anticipated and unanticipated impacts. I've explained the use and analysis of the tool in more detail in another blog post. There was immediate interest in this simple little tool.

A neat trick that also got some people excited is how we use SurveyMonkey. To make sure that our data is available quickly to all potential users on the team, we capture our data (even data collected on paper) in SurveyMonkey, and then share the results with our project partners via its sharing interface - even before we've really been able to analyse the data. The SurveyMonkey site explains this in a little more detail with examples.

The idea of using non-traditional electronic means to help with data collection also got some participants excited. I explained that we have a Whatsapp group for facilitators, and we monitor this, together with our more traditional post-training feedback forms, to ascertain if there are problems that need solving. In an upcoming blog post, I'll share a little bit about exactly how we used the WhatsApp data, and what we were able to learn from it.
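
Purely as an illustration of what that kind of monitoring could look like (not the actual analysis, which I'll describe in that upcoming post), here is a minimal sketch that scans an exported WhatsApp group chat for messages mentioning possible problems. It assumes the common "date, time - sender: message" export format, and the keyword list and file name are hypothetical.

```python
import re

# Typical line in a WhatsApp chat export (the exact format varies by phone and locale):
# "12/08/2014, 14:03 - Nombulelo: The projector in Class 4 is not charging"
LINE = re.compile(r"^(?P<date>[\d/]+), (?P<time>[\d:]+) - (?P<sender>[^:]+): (?P<text>.+)$")

# Hypothetical keywords that suggest a problem needing follow-up.
PROBLEM_WORDS = ("broken", "not working", "charging", "stolen", "no signal", "help")

def flag_problems(path):
    """Return (date, sender, message) tuples for messages that mention a possible problem."""
    flagged = []
    with open(path, encoding="utf-8") as chat:
        for line in chat:
            match = LINE.match(line.strip())
            if not match:
                continue  # skip continuation lines, media notices, etc.
            text = match.group("text").lower()
            if any(word in text for word in PROBLEM_WORDS):
                flagged.append((match.group("date"), match.group("sender"), match.group("text")))
    return flagged

if __name__ == "__main__":
    for date, sender, message in flag_problems("facilitators_chat.txt"):
        print(f"{date} | {sender}: {message}")
```

In practice the flagged messages would simply be read alongside the post-training feedback forms to decide what needs follow-up; the point is the triage, not the script.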

Monday, February 17, 2014

Working Rigorously with Stories - Impact Story Tool




I've had some people email me about a paper I presented at the 2013 SAMEA conference. This paper introduces a tool for collecting and rigorously analysing impact stories that could be used as part of an evaluation. The full paper with the tool can be accessed here. The abstract is presented below:

 Beneficiary stories are an easily collected data source, but without specific information in the story, it may be impossible to attribute the mentioned changes to an intervention or to verify that the change actually occurred. Approaches such as Appreciative Inquiry and the Most Significant Change Technique have been developed in response to the need to work more rigorously with this potentially rich form of data. The “Impact Story Tool” is yet another attempt to make the most of rich qualitative data and was developed and tested in the context of a few programme evaluations conducted by Feedback RA.
The tool consists of a story collection template and an evaluation rubric that allows for the story to be captured, verified and analysed. Project participants are encouraged to share examples of changes in skills, knowledge, attitudes, motivations, individual behaviours or organizational practice. The tool encourages respondents to think about the degree to which the evaluated programme contributed towards the mentioned change, and also asks for the details of another person that may be able to verify the reported change. The analyst collects the story, verifies the story and then codes the story using a rubric. When a number of stories are collected in this way, they are then analysed together with other evaluation data. It may illuminate which parts of a specific intervention are most frequently credited with contributing towards a change.
Besides introducing the tool as it was used in three different evaluations, the usefulness of this tool and possible drawbacks are discussed.
 (The picture above is of a character known as "Benny Bookworm" from a South African TV show called "Wielie Walie" which I watched as a child)
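
The paper describes the tool and rubric themselves; purely as an illustration of the aggregation step mentioned in the abstract (not the actual rubric from the paper), here is a minimal sketch of how coded stories might be brought together to see which intervention components are credited most often. The field names and the contribution scale are hypothetical.

```python
from collections import Counter

# Hypothetical structure for a coded impact story: each story gets a verification
# status, a rubric score for strength of contribution (e.g. 0 = none, 3 = strong),
# and the intervention components the respondent credited with the change.
stories = [
    {"id": 1, "verified": True,  "contribution": 3, "components": ["training", "tablets"]},
    {"id": 2, "verified": True,  "contribution": 2, "components": ["training"]},
    {"id": 3, "verified": False, "contribution": 1, "components": ["peer support"]},
]

# Only verified stories with at least a moderate contribution score are counted.
credited = Counter()
for story in stories:
    if story["verified"] and story["contribution"] >= 2:
        credited.update(story["components"])

# Which parts of the intervention are most frequently credited with a change?
for component, count in credited.most_common():
    print(f"{component}: credited in {count} verified stories")
```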

Wednesday, November 20, 2013

ICT4RED Learning Workshop

I'm contributing to the evaluation of the ICT4RED (Information Communication Technology for Rural Education Development) initiative - a very ambitious project that rolls out teacher professional development to enable teachers and learners to use 21st century methods and tablet computers in rural schools in Cofimvaba in the Eastern Cape. More information about the project is available here and here.

We decided to use a developmental evaluation approach - I'm practically embedded in the organization that's responsible for implementing the project. I'm finding that this is a wonderful opportunity to influence what happens... But because this project is so different from the many failed technology projects that I've evaluated before, I sometimes wonder whether I am "objective" enough to add actual value.

We organised the M&E team's work into four categories -
  • Monitoring - Measuring progress made on outputs, facilitating occasional debrief meetings with team members, reflecting on abundant data from various social media streams and participating in weekly project management meetings
  • Evaluation - Measuring success after the various implementation phases, incorporating some self-evaluation workshops and other more standard evaluation measures including ethnographic descriptions, baseline and follow-up surveys, and a small-scale RCT and tracer study
  • Learning - Asking team members to occasionally reflect on what they've learnt - from successes or failures
  • Model Development - Developing a modular theory of action underpinned by a theory of change that can be used to support scale up and replication. 

 Yesterday we held a "learning workshop" where team members had to reflect on what they've learnt.
Our first template for the learning briefs asked for the following information:

Project Name
    Give the project name here
Submitted by
    Give the name and component name
Date
    Give the date on which you submit the learning brief
What was the learning?
    Please describe the learning that occurred
Learning brief type
    Indicate which of the following three is applicable and provide a short description
    •    Learning from failure during implementation
    •    Learning from implementation success
    •    Learning from review of previous research and practice (i.e. not practically tested yet)
The Context
    Say something about the context of the learning / project context, add relevant pictures if they are available
Why is this learning important?
    Please describe why this learning is important
Evidence Base
    Please indicate what the evidence base is for this learning brief; if possible, provide references that may help the reader track down the evidence.
Recommendations for future similar projects
    Please provide your recommendations as a list. If possible, add pictures, graphs or diagrams
Recommendations that should be taken into account by the current project
    Please indicate which of the above recommendations should be taken into account for the current project

My colleagues, who are great at packaging information, asked that we focus the learning presentations on:
  • What we designed
  • What we learnt
  • What we're doing now
  • Advice for policy and practice



This made for quite nice presentations. We'll probably adapt our learning brief templates accordingly.

Thursday, February 14, 2013

Simple Evaluation Tools

I'm starting a project soon where I will have to develop and compile really simple evaluation materials for organizations that may not have much M&E expertise. Here is one of the really simple but striking tools that I came across in the Community Sustainability Engagement Evaluation Toolbox.






Getting the right tools into people's hands is of course only part of the solution to making sure that evaluation at grassroots level improves. Sometimes it is less the case that people don't know how to do M&E, and more that they are spread too thin to also do M&E...