Thursday, October 16, 2014

True Confessions of an Economic Evaluation Phobic

You know how the forces at work in the universe sometimes conspire and confront you with a persistent nudge... over and over again? Well, this week's nudge was: "You know nothing about economic evaluation... do something about it - other than ignoring it".

Words like "cost-benefit analysis", "cost-effectiveness analysis" and "cost-utility analysis"... actually anything with the word "cost" or "expenditure" in it... make me nervous. So my usual strategy is to ignore the "Efficiency" criterion suggested by the OECD DAC, or to start fidgeting around for the contact details of one of my economist friends so I can pass the job along. I have even managed to be part of a team doing a Public Expenditure Tracking Survey without touching the "Expenditure" side of the data.
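For fellow phobics, it helps to see that the scary terms mostly boil down to simple ratios: what differs is whether outcomes are expressed in money (cost-benefit), in natural programme units (cost-effectiveness), or in standardised utility units such as QALYs (cost-utility). Here is a back-of-envelope sketch with entirely made-up numbers, just to take the edge off the jargon:

```python
# Hypothetical figures for an imaginary programme - only the arithmetic
# matters here, not the numbers themselves.

programme_cost = 1_000_000          # total programme cost, in Rand

# Cost-benefit analysis: monetise the outcomes, then compare money to money.
monetised_benefits = 1_400_000      # e.g. estimated gains in lifetime earnings
net_benefit = monetised_benefits - programme_cost
benefit_cost_ratio = monetised_benefits / programme_cost

# Cost-effectiveness analysis: cost per unit of a natural outcome measure.
learners_reaching_benchmark = 2_500
cost_per_outcome = programme_cost / learners_reaching_benchmark

# Cost-utility analysis: cost per standardised utility unit (e.g. QALYs),
# mostly encountered in health economics.
qalys_gained = 80
cost_per_qaly = programme_cost / qalys_gained

print(f"Net benefit: R{net_benefit:,.0f} (benefit-cost ratio {benefit_cost_ratio:.2f})")
print(f"Cost per learner reaching the benchmark: R{cost_per_outcome:,.0f}")
print(f"Cost per QALY gained: R{cost_per_qaly:,.0f}")
```

The hard part, of course, is not the division - it is costing the intervention properly and defending the outcome estimates, which is exactly where the resources below come in.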

But then I found these two resources that helped me start to make a little more sense of it all. They are:

The South African Department of Planning, Monitoring and Evaluation's Guideline on Economic Evaluation. At least it starts to explain the very many different kinds of economic evaluation you should consider if you work within the context of South Africa's National Evaluation Policy Framework.


And then this. 
http://www.julianking.co.nz/downloads/


A free ebook by Julian King that presents a short theory to help answer the question "Does XYZ deliver good (enough) value for investment?" - essentially the question any evaluator is supposed to help answer.

So, now, there is one more topic on my ever-expanding reading list! If there is a "Bible" of economic evaluation, let me have the reference, ok?

Friday, September 12, 2014

What if, mid-career as a researcher, you become interested in Evaluation?

An old classmate, who took the market research route after completing her Research Psychology Master's degree, asked me for a couple of references to check out should she want to develop her evaluation knowledge and skills. What came to mind were the following professional development resources. I'm sure there are many more easily accessible ones, but this is a good start for a list!

Tuesday, August 12, 2014

Further Resources and Links for those who attended the Bridge M&E Colloquium on 12 August 2014


Today, I got the opportunity to present to the Bridge M&E Colloquium on the work I'm doing with the CSIR Meraka Institute on the ICT4RED project. My first presentation gave some background about the ICT4RED project. 




I referred to the availability of the Teacher Professional Development course under a Creative Commons licence here. This resource also includes a full description of the micro-accreditation or badging system.

What seemed to get the participants in the meeting really excited was the 12-component model of the project - which suggests that one has to pay attention to much more than just the technology when implementing a project of this nature. My colleagues published a paper on this topic here.

The "Earn as you Learn" model that the project follows also resonated with participants - if teachers demonstrate that they comply with certain assessment criteria, they earn technology and peripherals for themselves and for their schools. A paper on the gamification philosophy that underlies the course is available here. The "Earn as you Learn" model was also documented in a learning brief here.


And then I was able to speak a little more about the evaluation design of the project. The paper that underlies this work is available here, and the presentation is accessible below:



I think what sets our project evaluation apart from many others being conducted in South Africa is that it truly uses "Developmental Evaluation" as its evaluation approach. For more information about this (and for a very provocative evaluation read in general), make sure you get your hands on Michael Patton's book. A short description of the approach and a list of other resources can also be found here.

People really liked the idea of using Learning Briefs to document learning for / from team members, and to share it with a wider community. This is an idea inspired by the DG Murray Trust. I blogged about the process and template we used before. An example of the learning brief that the M&E team developed for the previous round is available here. More learning briefs are available on the ICT4RED blog.

I also explained that we use the Impact Story Tool for capturing and verifying an array of anticipated and unanticipated impacts. I've explained the use and analysis of the tool in more detail in another blog post. There was immediate interest in this simple little tool.

A neat trick that also got some people excited is how we use SurveyMonkey. To make sure that our data is available quickly to all potential users on the team, we capture our data (even data collected on paper) in SurveyMonkey, and then share the results with our project partners via the SurveyMonkey sharing interface - even before we've really been able to analyse the data. The SurveyMonkey site explains this in a little more detail, with examples.

The idea of using non-traditional electronic means to help with data collection also got some participants excited. I explained that we have a Whatsapp group for facilitators, and we monitor this, together with our more traditional post-training feedback forms, to ascertain if there are problems that need solving. In an upcoming blog post, I'll share a little bit about exactly how we used the WhatsApp data, and what we were able to learn from it.
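In the meantime, to give a flavour: WhatsApp can export a group chat as a plain text file, and even a few lines of code can turn that into rough monitoring data. Here is a minimal sketch - the export format varies by phone and locale, so the pattern below (and the file name) is an assumption you would adjust to your own export:

```python
import re
from collections import Counter

# One common WhatsApp export format: "16/10/2014, 09:23 - Name: message".
# Adjust the pattern to match the format of your own exported chat file.
LINE = re.compile(r"^(\d{2}/\d{2}/\d{4}), (\d{2}:\d{2}) - ([^:]+): (.*)$")

messages_per_sender = Counter()
flagged = []  # messages that hint at problems needing follow-up

with open("facilitator_group_export.txt", encoding="utf-8") as chat:
    for line in chat:
        match = LINE.match(line.strip())
        if not match:
            continue  # skip continuation lines of multi-line messages
        date, time, sender, message = match.groups()
        messages_per_sender[sender] += 1
        if any(word in message.lower() for word in ("problem", "broken", "help")):
            flagged.append((date, sender, message))

print(messages_per_sender.most_common(5))   # the most active facilitators
for date, sender, message in flagged:
    print(date, sender, message)            # candidate issues to follow up on
```

Keyword-spotting like this is obviously crude and no substitute for actually reading the conversations, but it is a quick way to triage a busy group.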

Tuesday, June 24, 2014

Exciting Learning from people involved in South African ICT in Education

I've been fortunate to be invited to a small gathering of people working with CoZaCares in the ICT space in South Africa. The luxury of sitting down for two days and listening to people talk about what they are passionate about is something to truly savour.

I did a presentation on some ideas I have to define and measure learners' 21st Century Skills in the context of the ICT4RED project. I currently have more questions than answers, but I'm sure we will get there soon. Here is a link to a summary table comparing different definitions of 21st Century Skills.

Other presentations I really enjoyed were:
* Barbara Dale Jones on the role of Bridge, learning communities and knowledge management
* Fiona Wallace on the CoZaCares model of ICT intervention
* John Thole on Edunova's programme to train and deploy youth to support ICT implementation in schools
* Siobhan Thatcher on Edunova's model for deploying Learning Centres in support of schools
* Brett Simpson from Breadbin Interactive on the learning they've done on the deployment of their content repository
* Ben Bredenkamp from Pendula ICT on their model for ICT integration and their experience of the One Laptop per Child project in South Africa
* Dylan Busa from Mindset on the relaunch of their website content
* Merryl Ford and Maggie Verster on the ICT4RED project




Wednesday, May 28, 2014

Impact Evaluation Guidance for Non-profits

InterAction has this lovely guidance note and webinar series on impact evaluation available on their website.

Impact Evaluation Guidance Note and Webinar Series

With financial support from the Rockefeller Foundation, InterAction developed a four-part series of guidance notes and webinars on impact evaluation. The purpose of the series is to build the capacity of NGOs (and others) to demonstrate effectiveness by increasing their understanding of and ability to conduct high quality impact evaluation.
The four guidance notes in the series are:
  1. Introduction to Impact Evaluation, by Patricia Rogers, Professor in Public Sector Evaluation, RMIT University
  2. Linking Monitoring & Evaluation to Impact Evaluation,  by Burt Perrin, Independent Consultant
  3. Introduction to Mixed Methods in Impact Evaluation, by Michael Bamberger, Independent Consultant
  4. Use of Impact Evaluation Results, by David Bonbright, Chief Executive, Keystone Accountability
Each guidance note is accompanied by two webinars. In the first webinar, the authors present an overview of their note. In the second webinar, two organizations - typically NGOs - present on their experiences with different aspects of impact evaluation. In addition, each guidance note has been translated into several languages, including Spanish and French. Webinar recordings, presentation slides and the translated versions of each note are provided on the website.

Wednesday, May 21, 2014

Resources on Impact Evaluation

This post consolidates a list of impact evaluation resources that I usually refer to when I am asked about impact evaluations. 

This cute video explains the factors that distinguish impact evaluation from other kinds of evaluation, in two minutes. Of course randomization isn't the only way of credibly attributing causes and effects - and this is a particularly hot evaluation methodology debate. For an example of why this is sometimes an irrelevant debate, see this write-up on parachutes and Chris Lysy's cartoons on the topic.
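For readers who want a feel for why randomization supports attribution in the first place: random assignment makes the comparison group a fair stand-in for the counterfactual, so a simple difference in means recovers the effect. A toy simulation (all numbers invented) illustrates the logic:

```python
import random

random.seed(1)

# A thousand imaginary learners with varying baseline ability.
baseline = [random.gauss(50, 10) for _ in range(1000)]

# Random assignment: treatment and control groups are alike, on average,
# in everything except the programme itself.
treated = [random.random() < 0.5 for _ in range(1000)]

TRUE_EFFECT = 5  # by construction, the programme adds 5 points
outcome = [b + (TRUE_EFFECT if t else 0) + random.gauss(0, 5)
           for b, t in zip(baseline, treated)]

treated_mean = sum(y for y, t in zip(outcome, treated) if t) / sum(treated)
control_mean = sum(y for y, t in zip(outcome, treated) if not t) / (
    len(treated) - sum(treated))

# The difference in means lands close to the true effect, because the
# randomized control group estimates what would have happened anyway.
print(f"Estimated effect: {treated_mean - control_mean:.2f} (true effect: {TRUE_EFFECT})")
```

The debate below is not about whether this logic works - it does - but about how often real programmes allow it, and what counts as credible evidence when they don't.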

Literature on the Impact Evaluation Debate

The impact evaluation debate flared up after this report, titled "When Will We Ever Learn?", was released in 2006. In the United States there was also a prominent funding mechanism (from about 2003 or so) that required programmes to include experimental evaluation methods in their design, or forgo funding.

The bone of contention was that Randomized Controlled Trials (RCTs) and experimental methods (and to some extent quasi-experimental designs) were held up as the "gold standard" in evaluation - which, in my opinion, is nonsense. So the debate about what counts as evidence started again. The World Bank and big corporate donors were perceived to push for experimental methods; evaluation associations (with members committed to mixed methods) pushed back, saying methods can't be determined without knowing what the questions are; and others pushed back saying that RCTs are probably applicable in only about 5% of the cases in which evaluation is necessary.

The methods debate in evaluation is really an old debate. Some really prominent evaluators decided to leave the AEA because it adopted a position that they equated with "the flat earth movement" in geography. Here is a nice overview article (The 2004 Claremont Debate: Lipsey vs. Scriven. Determining Causality in Program Evaluation and Applied Research: Should Experimental Evidence Be the Gold Standard?) to summarise some of it.
The Network of Networks on Impact Evaluation then sought to write a guidance document, but even after this was released, there was a feeling that not enough was said to counter the "gold standard" mentality. This document, titled "Designing impact evaluations: different perspectives", provides a bit more information on the "other views".

Literature on Impact Evaluation Methods
If you are interested in literature on evaluation methods, look at BetterEvaluation to get a quick overview.

I like Shadish, Cook and Campbell for understanding experimental and quasi-experimental methods, but this online knowledge base resource is good too.
 
For resources on other, more mixed-methods approaches to impact evaluation, you need to look at Realist Synthesis, the General Elimination Method, Theory-Based Evaluation, and something that I think has potential: the Collaborative Outcomes Reporting approach.


The South African Department for Performance Monitoring and Evaluation's guideline on Impact Evaluation is also relevant if you are interested in work in the South African context.

Wednesday, April 02, 2014

Getting authorisation to do Research and Evaluation in Schools

 
A colleague working at an educational NGO asked this question about working in schools in South Africa:

I just wanted to ask a quick question. Do I need to get permission from the relevant Provincial Department of Education to carry out research in schools if the schools are part of a project we’re running? In other words, the district is aware of us and probably interacting with us?
 
My answer: 
I've only done research or evaluations in a few Provinces, not all of them, but in all of those Provinces the Education Departments have guidelines for researchers that require you to fill in forms, submit your research proposal (and sometimes your evaluation instruments) for review, and also bind you to some promises about the use of your research or evaluation findings (e.g. the Province may require copies of reports, or may require you to present your findings). Check any of the Provinces' annual reports to see which Director in the Provincial office is in charge of research, and lodge your enquiry about requirements there if you can't find details on the Provincial Education website.

The officials in Education Districts are often not aware of the Provincial requirements, so one might be able to get away without Provincial authorization, but this is a bad idea for at least two reasons: 

* It helps if the Research Directorate in the Provincial Education Department has your details on their database, because it promotes the use and coordination of research, and
* It can save you a lot of headaches should someone complain about your research later on.

Since Education in schools is a Provincial competence, I have been unable to get blanket approval from National Education to work in multiple Provinces - so that meant filling in the different forms and providing the different details to the different Provinces, and following up on the outcome of each of these processes.

Besides Provincial approval, some clients might also require that any human-subjects research gets vetted by a research ethics approval board, like the ones attached to universities or science councils. I've only dealt with a few of these, but they mostly require you to prove that you have authorization to conduct the research, so the two go hand in hand.

Of course, approval by the Province and by research ethics boards is still not all you need to ensure that you conduct your work ethically - some fields (e.g. marketing research - see the ESOMAR guidelines) have their own ethics guidelines... so it would be good to study these and make sure your practice remains above board.

And then this, of course, is also true:

Live one day at a time emphasizing ethics rather than rules.
Wayne Dyer


 

Thursday, March 27, 2014

I am because you are

In a previous blogpost I reflected on how African values shape my practice of Evaluation.

This week I attended a seminar during which Gertjan van Stam shared some provocative views on development in Africa. I started reading his book 'Placemark'. I love the way he gives voice to rural Africa. I find it interesting that this Dutch engineer manages to give voice to Africa in a way that I can relate to.

His beautifully written take on Ubuntu:

I am, because You are

Is it possible that people in rural areas of Africa can connect with people in urban areas around the world?

That one can walk into a scene and meet someone who walks into the same scene, even if it is geographically separated?

That we explore and connect rural and urban worlds worldwide without anyone being forced into cultural suicide?

That we meet around the globe and relate, embrace, love, and build meaningful relationships?

That we find ways to be of significance and support to each other and together shuffle poverty and disease into the abyss?

That we encourage each other to withstand drunkenness and drugs, bullying, self harm, and greed?

That we share spiritual nutrition to deal with wealth, loss, alienation and pain in this generation?

That we unite through social networks, overcoming divides and separations?

That we share ancient, tested, and new resources, opportunities, visions, and dreams that lead to knowledge, understanding and wisdom?

That we collaborate to discuss, and engineer tools, taking into account the integral health of all systems?

That together, South and North, build capacity, mutual accountability, and progress, for justice and fairness?

That I am, because You are?

Monday, March 17, 2014

21st Century Skills of Rural African Teachers and Learners



I'm evaluating a project that aims to build the 21st century skills of rural African teachers and learners. Until recently I did not even know what people meant when they used the phrase "21st Century Skills", but I have been enlightened and must now find a way to measure them for our evaluation.

It seems I'm not the only one struggling with the problem of having to measure something very broad - there is a range of resources available that grapple with the idea of defining and measuring 21st Century Skills. Some of the resources I found particularly useful include:

Everything I read, however, seems to focus on contexts that are neither rural nor African. Perhaps there is scope for our project to contribute to the general discussion on 21st Century Skills by adapting the definitions and measures specifically for our context? Perhaps this is an opportunity to develop an example of African Made, African Owned Evaluation?

Wednesday, February 26, 2014

My plans for AfrEA 2014 conference

I'm off to Cameroon on Sunday for a week of networking, learning and sharing at the 2014 AfrEA conference in Yaoundé. I love seeing bits of my continent. If internet access is available I'll try to tweet from @benitaW.

I am facilitating a workshop on Tuesday together with the wise Jim Rugh and the efficient Marie Gervais, to share a bit about a VOPE toolkit EvalPartners is developing. (A VOPE is an evaluation association or society... a Voluntary Organization for Professional Evaluation.)

Workshop title: Establishing and strengthening VOPEs: testing and applying the EvalPartners Institutional Capacity Toolkit

Abstract: One of the EvalPartners initiatives, responding to requests received from leaders of many VOPEs (Voluntary Organizations for Professional Evaluation), is to develop a toolkit which provides guidance to those who wish to form new (even informal) VOPEs, and to leaders of existing VOPEs who seek guidance on strengthening their organization's capacities. During this workshop participants will be introduced to the many subjects addressed in the VOPE Institutional Capacity Toolkit, and asked to test the tools to determine how such resources could help them strengthen their own VOPEs.

The workshop will be very interactive with lots of exploring, engaging, and evaluating of the toolkit resources. Participants should not come to this workshop expecting that they will sit still for more than 30 minutes at a time. We'll use a combination of learning stations and fishbowls as the workshop methodology.  I'm really looking forward to it!

Eventually the toolkit will be made available online. Follow @vopetoolkit on Twitter for more news about developments.

I have served on the boards of both AfrEA and SAMEA, so I hope that the resources that the Toolkit task force and their marvellous team of collaborators put together will be of use to colleagues across the continent who are still founding or strengthening their VOPEs. It is hard and sometimes thankless work to serve on a VOPE board, and if this toolkit can make someone's life a little easier with examples, tools and advice, I would count it a worthy effort.

I expect that the workshop will be a good opportunity to get some feedback to guide us in the completion of the work.

Monday, February 17, 2014

Working Rigorously with Stories - Impact Story Tool




I've had some people email me about a paper I presented at the 2013 SAMEA conference. This paper introduces a tool for collecting and rigorously analysing impact stories that could be used as part of an evaluation. The full paper with the tool can be accessed here. The abstract is presented below:

 Beneficiary stories are an easily collected data source, but without specific information in the story, it may be impossible to attribute the mentioned changes to an intervention or to verify that the change actually occurred. Approaches such as Appreciative Inquiry and the Most Significant Change Technique have been developed in response to the need to work more rigorously with this potentially rich form of data. The “Impact Story Tool” is yet another attempt to make the most of rich qualitative data and was developed and tested in the context of a few programme evaluations conducted by Feedback RA.
The tool consists of a story collection template and an evaluation rubric that allows for the story to be captured, verified and analysed. Project participants are encouraged to share examples of changes in skills, knowledge, attitudes, motivations, individual behaviours or organizational practice. The tool encourages respondents to think about the degree to which the evaluated programme contributed towards the mentioned change, and also asks for the details of another person that may be able to verify the reported change. The analyst collects the story, verifies the story and then codes the story using a rubric. When a number of stories are collected in this way, they are then analysed together with other evaluation data. It may illuminate which parts of a specific intervention are most frequently credited with contributing towards a change.
Besides introducing the tool as it was used in three different evaluations, the paper also discusses its usefulness and possible drawbacks.
 (The picture above is of a character known as "Benny Bookworm" from a South African TV show called "Wielie Walie" which I watched as a child)
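For those who ask what the analysis step actually involves: once the stories are verified and coded against the rubric, the quantitative side is little more than counting codes across stories. A hypothetical sketch (the field names and codes here are invented for illustration - the real rubric is in the paper):

```python
from collections import Counter

# Invented examples of coded stories. In the real tool each story is
# captured with a template, verified with the named third party, and
# then scored against the rubric.
stories = [
    {"change_type": "teaching practice", "contribution": "high",
     "verified": True, "credited_component": "training sessions"},
    {"change_type": "learner motivation", "contribution": "medium",
     "verified": True, "credited_component": "tablets"},
    {"change_type": "teaching practice", "contribution": "high",
     "verified": False, "credited_component": "training sessions"},
]

# Only stories that survived verification feed into the analysis.
verified = [s for s in stories if s["verified"]]

# Which parts of the intervention are most frequently credited with change?
print(Counter(s["credited_component"] for s in verified).most_common())

# How strongly do respondents credit the programme for the change?
print(Counter(s["contribution"] for s in verified))
```

Counts like these only become meaningful alongside the other evaluation data, which is how the tool is meant to be used.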

Monday, February 10, 2014

Reflection: Thanks Tom Grayson

So since I started a stroll down memory lane, I thought I'd share this too. In 2002 I decided I wanted to be an evaluator. I was working at an evaluation company, and I decided to start my own consultancy, so I was getting great practical exposure. But I really did not have a good academic grounding in the theory and literature surrounding Evaluation. This was back in the day before MOOCs and webinars... so I had to READ to get my education.

During my studies I had read Cook and Campbell, and somehow I also stumbled upon Guba and Lincoln. I was introduced to Utilization Focused Evaluation.  In 2004 I got Rossi, Lipsey and Freeman for a going away present from Khulisa, and I read any evaluation journal articles I could lay my hands on.

It was after reading something that Tom Grayson (from the University of Illinois at Urbana-Champaign) wrote in a journal article about teaching evaluation that I decided to email him. I asked him for some reading material that would give me a good basis in Evaluation. He responded by sending me a package of course reading materials via post... This was such an unexpected gesture of goodwill. Above is a little handwritten note that he sent with the material.
 
So Tom, thanks a lot. And this is me letting you know about my adventures in evaluation!

Monday, February 03, 2014

Reflection: I decided to become an evaluator in May of 2002

As I was packing up the FeedbackRA office, I found a few files that I needed to clear out. This one is special, because it was during this workshop that I chose to embrace the identity of Evaluator.

It is a file for a workshop titled "Evaluation for Development: An Advanced Course in Evaluation". It was presented by Michael Quinn Patton in Pretoria in 2002, and it was arranged by Zenda Ofir from Evalnet.
  
My academic training in the field of Research Psychology meant that I was comfortable with research methods, but I also wanted to be involved in development... I didn't have a good idea of what I wanted to do with the research skills, and frankly, before joining Khulisa I had never heard of evaluation as a career option. The Community Psychology training I did at Honours and Master's level resonated deeply with me. Previously I had thought that I wanted to be a project officer at an NGO or international development organization, but I also realised that I liked doing the research. I think that after about a year's working experience I started to think of myself as a researcher.

I will forever be grateful for the experience I got at Khulisa Management Services, and for the fact that Jennifer Bisgard let me go to this workshop in 2002. Thanks to Zenda Ofir for arranging it, and thanks to Michael Quinn Patton for preaching/teaching so convincingly.

Thursday, January 30, 2014

So long FeedbackRA, and thanks for all the fish!

Below is a picture of me, my business partner, and our spouses on the day we moved into our offices in April 2006, and then a picture of the people in the Feedback offices today, for one last time (Terence is away in Rwanda on a jobbie).



Today is a little bit of a sad day for me. We're packing up the FeedbackRA offices, which have been the place where I practised my profession for the past eight years. I'm starting at my new offices tomorrow, and it means that my relationship with FeedbackRA is one step closer to dissolving.

The timeline:
I started the business in 2002 together with two colleagues. Our first job was a survey we did for MTN. Our first evaluation job was for the Gauteng Education Development Trust.

In 2004 I quit my full-time job to earn my own salary at Feedback. In that year we did a nice piece of research for the Department of Science and Technology, and started expanding our CSI business base.

In 2006 we graduated to proper offices. We appointed our first employees and embarked on a range of projects that just saw the business grow - in terms of its focus, our skills and the turnover.

In 2007 and 2008 we did a strategic piece of research on JIPSA, and I got to interview ministers and captains of industry. In this time I was also involved with setting up SAMEA, and in 2009 I started to contribute to AfrEA too.

In September 2009 I took a step back to reflect on the important things in my life. Up until 2009 I managed the business and the business finances, and delivered as a key consultant, while keeping an eye on things at SAMEA and AfrEA. I just couldn't do it all any longer.

In 2010 I scaled down on my work and volunteer responsibilities, and we got Daleen involved to help us run the business. My life was much simpler after that... but it was still tough, because the business kept on growing. I did some lovely work with colleagues from Stellenbosch on a Public Expenditure Tracking Survey... and I learnt so much from being the "junior" on the team.

In 2010 we also started negotiations with a range of other high-profile consultants to see if they would like to join as business partners. We worked together on a few projects, we had a strategic planning session... everything looked good. Our business was expanding and we decided to take up more office space. We started two big contracts which allowed us the scope to do some longer-term planning, but they also opened the business up to risk... because things did not always go according to plan.

In 2011 an advertisement for the only other job I ever thought I'd consider crossed my desk. I decided to apply, and let my business partners know. We let the conversation with the other potential business partners cool down a bit. After six months in limbo, I found out that I did not get the other job, but it took only one month for me to realise that what I'd gained at Feedback was significant enough for me to call it quits. The stress of managing a business and a full consulting plate was just too much. I had taken on one too many assignments - because a colleague I respect a lot twisted my arm. This had bad consequences for me as an individual. I didn't do my best work any longer. I wanted to do things well again... I wanted to focus on things that I could do well, instead of just taking on jobs to make sure the cash flow was sorted. A week after my decision, I found out I was going to be a mom, so suddenly I had another reason to re-evaluate my priorities.

I sold my interest in FeedbackRA in September 2011, and handed over all management responsibility to my colleagues. I worked as a consultant at FeedbackRA until April 2012, and then returned on a part-time basis on selected assignments from October 2012. In this time I realised that I really preferred working on education issues, and the area of ICT for Education became my core focus.

I continued working at Feedback until October 2013, when I decided to start my own consultancy again – Benita Williams Evaluation Consultants. I still helped out with some of the FeedbackRA work, and by January 2014 I was able to take over some of the FeedbackRA staff.

Tomorrow will be the first day in our new offices, which is quite exciting. But there is a side of me that is really sad and nostalgic for what was.

Thanks colleagues, collaborators, clients and friends. Thanks to my family. It is the end of an era, so.... "So long, and thanks for all the fish!"