Wednesday, July 13, 2016

Evaluative Rubrics - Helping you to make sense of your evaluation data

Three times in one week I've now found myself explaining the use of evaluation rubrics to potential evaluation users. I usually start with an example that people can relate to:
When your high school creative writing paper was graded, your teacher most likely gave you an evaluative rubric which specified that you would do well if you 1) used good grammar and spelling, 2) structured your arguments well, and 3) found an innovative and interesting angle on your topic. In essence, this rubric helped you to know what was "good" and what was "not good".
In an evaluation, a rubric does exactly the same. What is a good outcome if you judge a post-school science and maths bridging programme? How do the outcomes of "being employed" or "busy with a third-year B.Sc. degree at university" compare to an outcome like "being a self-employed university drop-out with three registered patents", or to an outcome like "being unemployed and not sure what to do about the future"? A rubric can help you to figure this out.
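If you like to think in code, the idea above can be sketched as a data structure: an ordered set of performance levels, each with an agreed description of merit. This is a minimal illustration only; the level names and descriptions below are hypothetical, not taken from any real rubric.

```python
# A toy evaluative rubric for the hypothetical bridging-programme example:
# ordered performance levels (weakest first), each with an agreed
# description of what counts as that level.
RUBRIC = [
    ("poor", "Unemployed and not sure what to do about the future"),
    ("adequate", "Unemployed but actively job-hunting or upskilling"),
    ("good", "Employed, or busy with further study"),
    ("excellent", "Thriving, e.g. a self-employed innovator with patents"),
]

def level_index(level: str) -> int:
    """Position of a level in the rubric (higher means better)."""
    return [name for name, _ in RUBRIC].index(level)

def compare(level_a: str, level_b: str) -> str:
    """Say which of two rated outcomes the rubric judges as better."""
    diff = level_index(level_a) - level_index(level_b)
    if diff == 0:
        return "equally good"
    return "first is better" if diff > 0 else "second is better"
```

The point of the sketch is that, once the levels are written down and agreed, judging "how good is this outcome?" becomes a transparent lookup rather than a private opinion.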

E. Jane Davidson has some excellent resources on rubrics here and here. If you need a rubric on evaluating value for investment, Julian King has a good resource here. And of course, there is the usual great content on BetterEvaluation here.

I love how Jane describes why we need evaluation rubrics:
Evaluative rubrics make transparent how quality and value are defined and applied. I sometimes refer to rubrics as the antidote to both ‘Rorschach inkblot’ (“You work it out”) and ‘divine judgment’ (“I looked upon it and saw that it was good”)-type evaluations.

Monday, February 29, 2016

Writing Summaries for Evaluation Reports


Last year I attended a course on "Using Evidence for Policy and Practice" presented by Philip Davies from the International Initiative for Impact Evaluation (3ie). I found his guidelines for what should go into the 1:3:25 summaries (a 1-page statement of main messages, a 3-page executive summary, and a 25-page report) most helpful.
The full course material is available on the African Evidence Network's website.

Wednesday, October 07, 2015

What I'm up to at the 2015 SAMEA Conference

The SAMEA conference is happening from 12 to 16 October and I'm looking forward to it. 


Since January, I've had to temporarily downscale my professional involvement in the M&E and educational networks, and I had to neglect this little blog a bit, because of a second long-term development project I took on in January 2015. The project has lovely brown eyes, an infectious laugh, and goes by the name of Clarissa. I'm happy to report that no major clashes with the first development project (named Ruan) have so far occurred, but it's been a bit of an adjustment to balance work, and volunteering, and life in general.


So what am I up to at the conference?
I'll be tweeting from @benitaw if you are interested in my perspective on the conference. I will also be helping at an IOCE stand at the conference, aiming to promote the VOPE Institutional Capacity Toolkit which my consultancy developed under the EvalPartners leadership of Jennifer Bisgard, Patricia Rogers, Jim Rugh, and Matt Galen. This is an online toolkit full of helpful resources aimed at equipping VOPEs (Voluntary Organisations for Professional Evaluation) to become more accountable and more active.

Then, I'll be teaming up with Cara Waller (from CLEAR) and Donna Podems (from OtherWise) in a session for African VOPEs on Friday 16th October. This is a ‘world-café’ style event, from 10–11:30am, to be held as a joint ‘Made in Africa’ and ‘Discussing the Professionalisation of Evaluation and Evaluators’ stream session. The aim of the session is to provide a space for those involved with VOPEs in the region (and those with an interest in strengthening African VOPEs) to come together to discuss current topics around building quality supply and generating demand for evaluation in contextually-specific ways. So please come and chat all things VOPE on the day!

Good luck to my colleague Fazeela Hoosen and the rest of the SAMEA board on hosting this year's conference with the DPME and the PSC. I know (and boy.... do I know) it is very hard work. So thanks in advance for all of the hours you are putting in, to make this event happen. 

Thursday, October 16, 2014

True Confessions of an Economic Evaluation Phobic

You know how the forces at work in the universe sometimes conspire and confront you with a persistent nudge... over and over again? Well, this week's nudge was "You know nothing about economic evaluation... do something about it, other than ignoring it".

Words like "cost-benefit analysis", "cost-efficiency analysis" and "cost-utility analysis"... actually anything with the word "cost" or "expenditure" in it... make me nervous. So my usual strategy is to ignore the "Efficiency" criterion suggested by the OECD DAC, or I start fidgeting around for the contact details of one of my economist friends and pass the job along. I have even managed to be part of a team doing a Public Expenditure Tracking Survey without touching the "Expenditure" side of the data.
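For fellow phobics, the arithmetic behind two of those scary terms is actually quite tame. Here is a toy sketch with entirely hypothetical numbers (a made-up bridging programme and made-up values), purely to show the shape of the calculations, not how to do a real economic evaluation:

```python
def cost_effectiveness_ratio(total_cost: float, units_of_outcome: float) -> float:
    """Cost per unit of outcome, e.g. rand spent per learner who passes."""
    return total_cost / units_of_outcome

def benefit_cost_ratio(monetised_benefits: float, total_cost: float) -> float:
    """Benefits divided by costs; a ratio above 1 suggests benefits exceed costs."""
    return monetised_benefits / total_cost

# Hypothetical programme: R1,000,000 spent, 200 learners pass, and a pass
# is valued (by assumption) at R8,000 per learner.
print(cost_effectiveness_ratio(1_000_000, 200))    # 5000.0 rand per pass
print(benefit_cost_ratio(200 * 8_000, 1_000_000))  # 1.6
```

The difference between the two is the hard part in practice: cost-effectiveness leaves the outcome in its natural units, while cost-benefit requires putting a money value on the outcome, which is where the real economists earn their keep.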

But then I found these two resources that helped me to start to make a little bit more sense of it all. They are:

The South African Department of Planning, Monitoring and Evaluation's Guideline on Economic Evaluation. At least it starts to explain the many different kinds of economic evaluation you should consider if you work within the context of South Africa's National Evaluation Policy Framework.


And then there is this: a free ebook by Julian King (http://www.julianking.co.nz/downloads/) that presents a short theory for helping to answer the question "Does XYZ deliver good (enough) value for investment?" Essentially, that is the question any evaluator is supposed to help answer.

So, now, there is one more topic on my ever-expanding reading list! If there is a "Bible" of economic evaluation, let me have the reference, ok?