Friday, December 20, 2013

Making a serious point... With a Minnie Mouse ribbon on my head

The ICT4RED project released this video of team members sharing how their lives were affected by the project (follow Mobilina Cofimvaba's channel on YouTube for more videos about the project). In the video I share how I was motivated to use Twitter as a professional learning tool.



For some reason the M&E people got associated with being mice. Maybe because we snoop around everywhere, maybe because we need big ears to listen, maybe because our job involves us being quiet... (Though I haven't managed to be quiet yet...) So for the tablet fun day on 2 November I donned a Minnie Mouse ribbon to man the M&E Mouse station.



Monday, December 09, 2013

Twitter as a professional development tool

I was one of the earlyish adopters of Twitter. In 2008, however, I couldn't see the point of maintaining a Facebook profile and a Twitter profile just to keep my friends and family updated about goings-on. Very few of my friends were on Twitter, so it was hard to find a reason to check in... Since I wasn't into the Kardashians' and Hiltons' business, Twitter just did not have what I wanted. So my account became dormant.

In 2013, however, I started evaluating a tech-for-education initiative. Maggie Verster @maggiev showed me how teachers could use Twitter to develop their own personal learning network. Here is a nice summary. Finally I could see a use for it.


Now Twitter is my professional social networking tool and Facebook is kept for personal networking. I use Twitter in a general sense:

*To find newspaper articles I'm interested in. All the big newspapers post links to top stories on Twitter, which point to their online sites. The Mail & Guardian is one publication that I follow at @mailandguardian
*To check traffic between Johannesburg and Pretoria on the @itrafficGP handle whenever I travel
*To get a sense of public sentiment on major news stories. #RipNelsonMandela was quick to trend once the news broke. I was also amused by the #underdog story

But the real value is in the professional applications of Twitter. It helps me to find relevant content and people and to share my own content and interests with others. I have used Twitter:

*To find interesting blog posts by other evaluators and development players. @BetterEval, for example, posts snippets from their blog onto Twitter. So do @Worldbank, @DGMurrayTrust, @Tshikululu and @RockefellerFDN
*To find information about education and evaluation events. This year I followed the American Evaluation Association's #eval13 conference from afar, and Bridge @BridgeProjectSA is very good at keeping a running commentary going on Twitter for their education events
*To publicise my own blog content to potential users. My handle @benitaW sometimes carries links to www.mandeblog.blogspot.com
*To share interesting reading with other people. Twitter is probably not the best content curation tool, but it's easy to find an article you've read and shared if you need to. It also helps to show colleagues what your thinking is influenced by. They may suggest content on other or similar viewpoints... in essence allowing a little debate to take place, and extending your horizons a bit
*To express opinions about published content. @DBE_SA occasionally puts out very good and very nonsensical content that I just *have to* respond to.
*To maintain a back channel of communication at events. At the #SAMEA 2013 conference there was quite a vibe going on Twitter between people attending the conference (@aidencholes @SouthernHemis @mmarais). At the #ICT4RED #tabletfunday I helped someone find their lost cellphone via Twitter.
*To live-tweet events. I kept up a running commentary of the #SAMEA 2013 conference sessions I attended. This helped me to keep a record of important points, and provided other members of the international evaluation community (e.g. @txtPablo @guijti @patriciajrogers) with a sense of important news (see the small script sketch after this list for one way to archive a hashtag like this).
*To find other like-minded professionals. I started following @aidencholes because he is linked to the Narrative Lab and they also look at narrative methods... the topic of a recent conference paper. At the SAMEA conference we finally met face to face and we already had lots to talk about. @louisevanrhyn also works with schools.
*To figure out who the movers and shakers are in other fields that I'm interested in. Dave Snowden @snowded is a systems thinker whose work I started following as a result of Twitter.
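
As an aside for the more technically inclined: keeping your own archive of a conference hashtag can be scripted. Here is a minimal sketch using the tweepy library for Python - purely illustrative, not something from my own workflow, and it assumes you have a Twitter API bearer token with search access:

import tweepy

# Placeholder credential - replace with your own bearer token
client = tweepy.Client(bearer_token="YOUR_BEARER_TOKEN")

# Recent search only covers roughly the last week, so run this during
# or shortly after the event you are following
response = client.search_recent_tweets(query="#eval13 -is:retweet", max_results=100)

for tweet in response.data or []:
    print(tweet.id, tweet.text)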

So if you are ready to take the plunge, here is a ten-day Twitter challenge that Sean Cole @seanhcole created for South African teachers. It applies well to evaluators too. Give it a bash!

Evaluators (#eval #evaluation) that I follow:
@patriciajrogers
@clysy
@John_Gargani
@ejanedavidson
@AnnKEmery
@txtPablo
@chiyanlam
@evalu8r
@EvaluationMaven
@sukist
@guijti


Evaluators from South Africa
@Duganf
@SouthernHemis
@aidencholes
@mmarais
@alfredeinstein
@developmentWorx

Organizations involved in evaluation
@EvalPartners
@aeaweb
@BetterEval
@CDIwageningenUR
@JPAL_Global
@gatesfoundation
@Worldbank
@DGMurrayTrust
@RockefellerFDN


Wednesday, November 20, 2013

ICT4RED Learning Workshop

I'm contributing to the evaluation of the ICT4RED (Information Communication Technology for Rural Education Development) initiative - a very ambitious project that rolls out teacher professional development to enable teachers and learners to use 21st-century methods and tablet computers in rural schools in Cofimvaba in the Eastern Cape. More information about the project is available here and here.

We decided to use a developmental evaluation approach - I'm practically embedded in the organization that's responsible for implementing the project. I'm finding that this is a wonderful opportunity to influence what happens... But because this project is so different from the many failed technology projects that I've evaluated before, I sometimes wonder whether I am "objective" enough to add actual value.

We organised the M&E team's work into four categories -
  • Monitoring - Measuring progress made on outputs, facilitating occasional debrief meetings with team members, reflecting on abundant data from various social media streams, and participating in weekly project management meetings
  • Evaluation - Measuring success after the various implementation phases - incorporating some self-evaluation workshops and other more standard evaluation measures, including ethnographic descriptions, baseline and follow-up surveys, and a small-scale RCT and tracer study
  • Learning - Asking team members to occasionally reflect on what they've learnt - from successes or failures
  • Model Development - Developing a modular theory of action underpinned by a theory of change that can be used to support scale-up and replication.

 Yesterday we held a "learning workshop" where team members had to reflect on what they've learnt.
Our first template for the learning briefs asked for the following information:

Project Name
    Give the project name here
Submitted by
    Give your name and the component name
Date
    Give the date on which you submit the learning brief
What was the learning?
    Please describe the learning that occurred
Learning brief type
    Indicate which of the following three types is applicable and provide a short description
    •    Learning from failure during implementation
    •    Learning from implementation success
    •    Learning from review of previous research and practice (i.e. not practically tested yet)
The Context
    Say something about the context of the learning / project context, add relevant pictures if they are available
Why is this learning important?
    Please describe why this learning is important
Evidence Base
    Please indicate what the evidence base is for this learning brief, if possible, provide references that may help the reader track down the evidence base.
Recommendations for future similar projects
    Please provide your recommendations as a list. If possible add pictures, graphs or diagrams
Recommendations that should be taken into account by the current project
    Please indicate which of the above recommendations should be taken into account for the current project

My colleagues, who are great at packaging information, asked that we focus the learning presentations on:
& What we designed
? What we learnt
# What we're doing now
! Advice for policy and practice



This made for quite nice presentations. We'll probably adapt our learning brief templates accordingly.

Monday, November 11, 2013

Monitoring and Evaluation framework and design for SAESC - July 2013

In July 2013, I did a presentation on the M&E framework I helped to develop for the South African Extraordinary Schools Coalition.

Better Evaluation, the OECD DAC evaluation criteria, and Michael Quinn Patton's Developmental Evaluation all featured in it.



Friday, October 25, 2013

Finding Data for Evaluations

Finding existing government data is often a huge problem for evaluators designing evaluations in Africa. We don't always know what is available, and even when we do, it is usually very hard to get hold of the relevant data.


The first eResearch Africa conference took place from 6 to 10 October 2013 in Cape Town and was hosted by the Association of South African University Directors of Information Technology.

Presentations from the conference are available here.


What I do find exciting is that quite a few of the presentations spoke about making government data available. I learnt about the Accelerated Data Programme in a presentation by Lynn Woolfrey, and was also excited to see the Human Sciences Research Council and UCT share something about their initiatives to make data available.





Tuesday, October 08, 2013

One of my new favourite productivity tools

I'm currently working with some smart people who aren't shy to self identify as geeks and nerds. One of them introduced me to a tool that has now become indispensable at work.

Trello.com is an electronic version of the old-fashioned whiteboard in the office, except it's much more portable, it integrates with Dropbox, and it sends you email reminders of tasks due. And it's free! I use it to:

* Manage task lists with evaluation teams - task cards move across three lists: to do, doing, done.
* Keep track of data flows - each type of data gets a card that moves across lists like: to administer, administering, received back, initial quality control done, to capturing, capturing done, data quality control done, added to master database (a small scripting sketch for this follows at the end of this post).
* Keep track of action points between meetings - I imagine it can work pretty well for this too.

It's not quite as cute as this Lego calendar that syncs with Google Calendar, but it comes close. Here is a description of Trello by Mashable.
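
For the scripting-inclined, Trello also exposes a REST API, so cards on the data-flow board can be created and moved from a script instead of by hand. Below is a minimal sketch using Python's requests library - illustrative only, not part of my own setup; the API key, token and list IDs are placeholders you would replace with your own from Trello's developer settings:

import requests

API = "https://api.trello.com/1"
AUTH = {"key": "YOUR_API_KEY", "token": "YOUR_TOKEN"}  # placeholders

def create_card(list_id, name, desc=""):
    # Create a card (e.g. one per data collection instrument) on a list
    r = requests.post(f"{API}/cards",
                      params={**AUTH, "idList": list_id, "name": name, "desc": desc})
    r.raise_for_status()
    return r.json()["id"]

def move_card(card_id, list_id):
    # Move a card to another list, e.g. from "capturing" to "capturing done"
    r = requests.put(f"{API}/cards/{card_id}", params={**AUTH, "idList": list_id})
    r.raise_for_status()

# Example: walk the baseline survey through the data-flow board
card = create_card("LIST_ID_TO_ADMINISTER", "Baseline learner survey")
move_card(card, "LIST_ID_RECEIVED_BACK")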




Wednesday, September 25, 2013

African Evaluation Journal

Africa finally has its own evaluation journal. The first issue has been compiled and will be released soon. Congratulations to AfrEA and SAMEA for your work on this!

The African Evaluation Journal will publish high quality peer-reviewed articles of merit on any subject related to evaluation, and provide targeted information of professional interest to members of AfrEA and its national associations and evaluators across the globe. This will encompass the following aims:
  • To build a high quality, useful body of evaluation knowledge for development.
  • To develop a culture of peer-reviewed publication in African evaluation.
  • To stimulate Africa-oriented knowledge networks and collaborative efforts.
  • To strengthen the African voice in evaluation.
Editor-in-Chief: Mark Abrahams, Division for Lifelong Learning, University of the Western Cape, South Africa
Associate Editor: Guy Blaise Nkamleu, Principal Evaluator, African Development Bank, Tunisia

Monday, September 16, 2013

SAMEA Conference 16 - 20 September 2013

I can't wait. I am looking forward to this week's SAMEA conference.

Good luck to Babette and team. I'm sure the blood, sweat and tears you've invested in this conference will pay off.


I'm involved in a paper session on Evaluating ICT for Education programmes, and I'll also be sharing something about a little tool I call the "Impact Story Template". 

Friday, March 15, 2013

An Excellent Read - Application of Systems thinking

What it looks like when it's fixed

The more we study the major challenges of our time – such as poverty, crime, unemployment, health and the environment – the more we realise that conventional solutions are failing to create the impact they had in the past.
What it looks like when it's fixed provides a case study in the development of a different approach that offers new hope in tackling the most daunting challenges facing our society and institutions.
This work draws on the growing body of systems and design thinking knowledge to address the wicked social problems facing our society. What it looks like when it's fixed offers a new holistic way of understanding complex social systems, building stakeholder cohesion and designing solutions that will work in our era.

About the author: Dr Barbara Holtmann uses systems and design thinking to facilitate understanding and insight among key stakeholders dealing with fragile social systems across the world. She has worked in business, government and most recently at the CSIR. Barbara is Vice President of the International Centre for the Prevention of Crime and serves on the boards of Women in Cities International and the Open Society Foundation for South Africa. She was the recipient of the Ann van Dyk Applied Research Award in 2010.

Tuesday, March 12, 2013

Survey of ICT in Education in Africa

This is from the FOSSA (Free and Open Source Software Africa) website:

How are ICTs currently being used in the education sector in Africa, and what are the strategies and policies related to this use?

infoDev helped to coordinate a comprehensive study surveying the current landscape of ICT in education initiatives in Africa, and was interested in collaborating with partner organizations who wished to be involved in this work.
Key questions:
- How are ICTs currently being used in the education sector in Africa, and what are the strategies and policies related to this use?
- What are the common challenges and constraints faced by African countries in this area?
- What is actually happening on the ground, and to what extent are donors involved?

You can download the reports from Free and Open Source Software Africa's Reports and White Papers page. A new survey is forthcoming.

Monday, March 04, 2013

How to specify your needs if you require a case study

A client is interested in contracting us to write up a case study for one of their programmes, but they don't really know what information will be necessary. Since there are no terms of reference yet for the case study, I suggested that the client clarify the following, so that we can assess the level of effort required.

1. What will the case study be used for? (To document lessons learnt, to help with marketing, to document evidence of a successful initiative)
2. What is the final product that you have in mind, and how long does it need to be? (A written report, or a presentation, or a glossy publication)
3. Who will be reading the Case Study?
4. How many background documents do you have available? (Project descriptions, evaluation findings, participation data, survey data)
5. What kind of additional data collection will be necessary? (Interviews, photos, site observations)
6. Would you want to meet with the evaluation team before the assignment starts, and after it is completed? 

I came across this useful little guide on how to use Case Studies to do Program Evaluation. It helps one to assess whether a case study should be used, and how to do it.
Edith D. Balbach, Tufts University
March 1999
Copyright © 1999 California Department of Health Services
Developed by the Stanford Center for Research in Disease Prevention
The Better Evaluation page on Case Studies can be found here.

Thursday, February 28, 2013

Real Time Evaluation and the rise of the Evaluation SWAT team

This five minute presentation shares Michael Patton's view on real time evaluation. Looks like evaluators will soon have to get tactical training to make sure they are ready to execute like a SWAT team!


Thursday, February 14, 2013

Simple Evaluation Tools

I'm starting a project soon where I will have to develop and compile really simple evaluation materials for organizations that may not have a lot of expertise to do M&E. Here is one of the really simple but striking tools that I came across in the Community Sustainability Engagement Evaluation Toolbox.






Getting the right tools into people's hands is of course only part of the solution to making sure evaluation at the grass roots improves. Sometimes it is less the case that people don't know how to do M&E, and more that they are spread too thin to also do M&E...

Monday, February 11, 2013

WEF Global Competitiveness Report

Sadly, but not surprisingly, South Africa still hovers near the bottom on most of the education-related indicators measured by the World Economic Forum's Global Competitiveness Report.

This just means that there is a lot of scope for making a difference here! To play with the data, go here: 

Thursday, February 07, 2013

MOOCs that Evaluators might consider



In a previous post I shared some ideas about Massive Open Online Courses (MOOCs). I came across a listing of free courses offered by some prominent US universities via online platforms. The full list, with more than 200 courses, is here:

The site uses the following key to provide information on the certification offered through these courses.
Free Courses Credential Key
CC = Certificate of Completion
SA = Statement of Accomplishment
CM = Certificate of Mastery
C-VA = Certificate, with Varied Levels of Accomplishment
NI = No Information About Certificate Available
NC = No Certificate

What caught my eye is the fact that there are quite a few courses listed that might be interesting to evaluators looking to improve their stats capacity.

Introduction to Statistics (NI) – UC Berkeley on edX – January 30 (TBD weeks)
Probability and Statistics (NC) – Carnegie Mellon
Statistical Reasoning (NC) – Carnegie Mellon

A few of the courses that started recently and also look interesting include:

Data Analysis (NI) – Johns Hopkins on Coursera – January 22 (8 weeks)
Introduction to Databases (SA) – Stanford on Class2Go – January 15 (9 weeks)
Introduction to Infographics and Data Visualization (CC) Knight Center at UT-Austin - January 12 (6 weeks)
Social Network Analysis (CC) – University of Michigan on Coursera – January 28 (9 weeks)

Looks like we will have to keep a closer eye on this type of information! 
 

Monday, February 04, 2013

Reflections from various Evaluations of ICT projects

After doing a few evaluations of ICT projects implemented in schools, I reflected on some of the lessons we've learnt throughout. It's not an exhaustive list, and certainly a lot of it is common sense, but somehow it is the common-sense things that people do not always plan for.




Some of the key questions that I would like to see answered in evaluations of these type of initiatives include:

• Is the content relevant? (Content review)
• Is the content user friendly for the intended users? (Heuristic evaluation)
• Was it implemented at the requisite “dosage” level for it to possibly work? (Fidelity monitoring)
• Can it effect change? (Experimental design)
• At what cost to participants and donors? (Cost analysis)
• Only then can you start to answer: Did it work? (Quasi-experimental design)
• Does it work better than “something else” (comparative analysis), or how does it work with “something else”?

Tuesday, January 29, 2013

Online Tertiary Education

Thomas Friedman wrote an article in the NYTimes about the "revolution" in universities.
Revolution Hits the Universities
Nothing has more potential to let us reimagine higher education than massive open online course, or MOOC, platforms.
I think this is a wonderful development and one that I have eagerly awaited. Having access to great education opportunities without having to travel will help me become a better evaluator. Already I visit www.betterevaluation.org, www.mymande.org and www.statistics.com for some of my personal capacity development needs. I might pursue formal credentialing some time in the future via this route.

I acknowledge that this move to online training is a juggernaut that will not be stopped. I just wonder what the systemic effects will be. How much "blood" will be shed in this "revolution" before the necessary checks and balances are implemented? As with all revolutions, it's not going to have good effects for everybody!

One category of "deaths" that I foresee is that of the average university professor as a teacher.

If everyone does a course with the "best" prof in the world, the second- and third-best profs won't have teaching jobs anymore. The effect might be that we end up with a dangerously monolithic way of thinking, with all kinds of implications for how we define problems, seek answers and develop the body of scientific knowledge. On the other hand, a common language may finally emerge, allowing more people to stand on the shoulders of giants to reach for diverse solutions in their diverse contexts.

Back when TV was introduced we had no idea what impact it would eventually have. I think we are standing in that exact same spot again...

Monday, January 28, 2013

A new start for 2013

I recently took on a long-term development project that involves a certain baby with beautiful blue eyes, so the blogging had to move to the back burner. But here is a fresh contribution for this month.



As part of the M&E I do for educational programmes, I frequently suggest to clients that they not only consider the question “Did the project produce the anticipated gains?” but that they also answer the question “Was the project implemented as planned?” This is because sub-optimal implementation is, in my experience, almost always to blame for negative outcomes of the type of initiatives tried out in education.

Consider the example of a project which aims to roll out extra computer lessons in Maths and Science in order to improve learner test scores in those subjects. We not only do pre- and post-testing of the learner test scores in the participating schools, but we also track how many hours of exposure the kids got, what content they covered, how they reacted to the content, etc. And we attend the project progress meetings where the project implementer reflects on the implementation of the project. Where we eventually don’t see the kind of learning gains anticipated, we are then able to pinpoint what went “wrong” with the implementation – frequently we can predict what the outcome will be based on what we know from the implementation. This manual on implementation research outlines a more systematic approach to figuring out how best to implement an initiative – written with the health sector in mind.
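
To make the dosage tracking concrete, here is a minimal sketch of the kind of summary I have in mind - purely illustrative, with a made-up file and made-up column names, not the actual analysis from any of our evaluations:

import pandas as pd

# Hypothetical data: one row per learner, with pre/post test scores and
# the number of hours of computer lessons actually received
df = pd.read_csv("learner_scores.csv")
df["gain"] = df["post_score"] - df["pre_score"]

# Bucket learners by the "dosage" they actually got
df["dosage"] = pd.cut(df["exposure_hours"],
                      bins=[0, 10, 20, 40],
                      labels=["low", "medium", "high"])

# If even the high-dosage group shows no gain, weak implementation is
# probably not the only explanation
print(df.groupby("dosage", observed=True)["gain"].agg(["count", "mean"]))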

Of course implementation success and the final outcomes of the project are only worth investigating for interventions where there is some evidence that the kind of intended changes are possible. If there is no evidence of this kind, we sometimes conduct a field trial with a limited number of kids, on limited content, over a short period of time, in an implementation context similar to the one designed for the bigger project. This helps us to answer the question “Under ideal circumstances, can the initiative make a difference in test scores?”

What a client chooses to include in an evaluation is always up to them, but let this be a cautionary tale: a client recently declined to include a monitoring/evaluation component that considered programme fidelity, on the basis that it would make the evaluation too expensive. When we started collecting post-test data for the evaluation, we discovered a huge discrepancy between what happened on the ground and what was initially planned – leaving the donor, the evaluation team and the implementing agency with a situation that had progressed too far to fix easily. Perhaps if there had been better internal monitoring this situation could have been prevented. But involving the evaluators in some monitoring would definitely have helped too!