Thursday, October 16, 2014
Terms like "cost-benefit analysis", "cost-efficiency analysis" and "cost-utility analysis"... actually anything with the word "cost" or "expenditure" in it... make me nervous. So my usual strategy is to ignore the "Efficiency" criterion suggested by the OECD DAC, or to start fidgeting around for the contact details of one of my economist friends and pass the job along. I have even managed to be part of a team doing a Public Expenditure Tracking Survey without touching the "Expenditure" side of the data.
But then I found these two resources that helped me to start to make a little bit more sense of it all. They are:
The South African Department of Planning, Monitoring and Evaluation's Guideline on Economic Evaluation. At least it starts to explain the many different kinds of economic evaluation you should consider if you work within the context of South Africa's National Evaluation Policy Framework.
And then this.
A free ebook by Julian King that presents a short theory for helping to answer the question "Does XYZ deliver good (enough) value for investment?", which is essentially the question any evaluator is supposed to help answer.
So now there is one more topic on my ever-expanding reading list! If there is a "Bible" of economic evaluation, let me have the reference, ok?
Friday, September 12, 2014
- If you are willing to spend some time learning from online lectures, try out any of the free online courses developed by EvalPartners, Rockefeller and the IOCE. New entrants are admitted in January, March and September each year, learning is totally self-paced, and the courses are certified.
- If you are looking for less intensive professional development, why not check out the American Evaluation Association's Coffee Break Webinars? (I think you do have to be an AEA member though!)
- If you are looking for something to read about any evaluation method, approach, tool or task, check out Better Evaluation. Subscribe to their blog and their twitter stream to get handy little tips. An amazing resource made available totally free!
- Do you only have time for a short email or blog every now and again? Sign up for the American Evaluation Association Tip-a-Day blog/emails, or check out the collection of evaluation blogs curated at EvalCentral.
- If you are looking for an accredited online course, try out the Claremont E-learning options. They usually have bursaries available for Developing Country Evaluators.
- What are the two evaluation books I suggest you read first? Utilization-Focused Evaluation by Michael Quinn Patton, and Purposeful Program Theory by Sue Funnell & Patricia Rogers.
- If you are planning to work in the M&E of Government programmes in South Africa, you have to be familiar with The South African Department of Planning, Monitoring and Evaluation's National Evaluation Policy Framework, and their guidelines.
Tuesday, August 12, 2014
Today, I got the opportunity to present to the Bridge M&E Colloquium on the work I'm doing with the CSIR Meraka Institute on the ICT4RED project. My first presentation gave some background about the ICT4RED project.
I referred to the availability of the Teacher Professional Development course under a Creative Commons licence here. This resource also includes a full description of the micro-accreditation, or badging, system.
What seemed to get the participants in the meeting really excited is the 12-component model of the project, which suggests that one has to pay attention to much more than just technology when implementing a project of this nature. My colleagues published a paper on this topic here.
Participants also resonated with the "Earn as you Learn" model that the project follows: if teachers demonstrate that they comply with certain assessment criteria, they earn technology and peripherals for themselves and for their schools. A paper on the gamification philosophy that underlies the course is available here. The Earn as you Learn model was documented in a learning brief here.
And then I was able to speak a little more about the evaluation design of the project. The paper that underlies this work is available here, and the presentation is accessible below:
I think what sets our project evaluation apart from many others being conducted in South Africa, is that it truly uses "Developmental Evaluation" as the evaluation approach. For more information about this (and for a very provocative evaluation read in general), make sure you get your hands on Michael Patton's book. A short description of the approach and a list of other resources can also be found here.
People really liked the idea of using Learning Briefs to document learning for / from team members, and to share with a wider community. This is an idea inspired by the DG Murray Trust. I blogged about the process and template we used before. An example of the learning brief that the M&E team developed for the previous round, is available here. More learning briefs are available on the ICT4RED blog.
I also explained that we use the Impact Story Tool for capturing and verifying an array of anticipated and unanticipated impacts. I've explained the use and analysis of the tool in more detail in another blog post. There was immediate interest in this simple little tool.
A neat trick that also got some people excited is how we use SurveyMonkey. To make sure that our data is available quickly to all potential users on the team, we capture our data (even data collected on paper) in SurveyMonkey, and then share the results with our project partners via the SurveyMonkey sharing interface, even before we've really been able to analyse the data. The SurveyMonkey site explains this in a little more detail with examples.
The idea of using non-traditional electronic means to help with data collection also got some participants excited. I explained that we have a Whatsapp group for facilitators, and we monitor this, together with our more traditional post-training feedback forms, to ascertain if there are problems that need solving. In an upcoming blog post, I'll share a little bit about exactly how we used the WhatsApp data, and what we were able to learn from it.
Tuesday, June 24, 2014
I did a presentation on some ideas I have to define and measure learners' 21st Century Skills in the context of the ICT4RED project. I currently have more questions than answers, but I'm sure we will get there soon. Here is a link to a summary table comparing different Definitions of 21st Century Skills.
Other presentations I really enjoyed were:
* Barbara Dale Jones on the role of Bridge and learning communities and knowledge management
* Fiona Wallace, on the CoZaCares model of ICT intervention
* John Thole on Edunova's programme to train and deploy youth to support ICT implementation in Schools
* Siobhan Thatcher from Edunova, presenting Edunova's model for deploying Learning Centres in support of schools
* Brett Simpson from Breadbin Interactive on the learning they've done on the deployment of their content repository.
* Ben Bredenkamp from Pendula ICT talking about their model for ICT integration and their experience of the One Laptop per Child project in South Africa.
* Dylan Busa from Mindset speaking about the relaunch of their website content.
* Merryl Ford and Maggie Verster talking about the ICT4RED project
Wednesday, May 28, 2014
Impact Evaluation Guidance Note and Webinar Series

With financial support from the Rockefeller Foundation, InterAction developed a four-part series of guidance notes and webinars on impact evaluation. The purpose of the series is to build the capacity of NGOs (and others) to demonstrate effectiveness by increasing their understanding of, and ability to conduct, high quality impact evaluation.
The four guidance notes in the series are:
- Introduction to Impact Evaluation, by Patricia Rogers, Professor in Public Sector Evaluation, RMIT University
- Linking Monitoring & Evaluation to Impact Evaluation, by Burt Perrin, Independent Consultant
- Introduction to Mixed Methods in Impact Evaluation, by Michael Bamberger, Independent Consultant
- Use of Impact Evaluation Results, by David Bonbright, Chief Executive, Keystone Accountability

Each guidance note is accompanied by two webinars. In the first webinar, the authors present an overview of their note. In the second, two organizations (typically NGOs) present on their experiences with different aspects of impact evaluation. In addition, each guidance note has been translated into several languages, including Spanish and French. Webinar recordings, presentation slides and the translated versions of each note are provided on the website.
Wednesday, May 21, 2014
This post consolidates a list of impact evaluation resources that I usually refer to when I am asked about impact evaluations.
This cute video explains, in two minutes, the factors that distinguish impact evaluation from other kinds of evaluation. Of course randomization isn't the only way of credibly attributing causes and effects, and this is a particularly hot evaluation methodology debate. For an example of why this is sometimes an irrelevant debate, see this write-up on parachutes and Chris Lysy's cartoons on the topic.
The impact evaluation debate flared up after the report titled "When Will We Ever Learn" was released in 2006. In the United States there was also a prominent funding mechanism (from about 2003 or so) which required programmes to include experimental evaluation methods in their design or forgo funding.
The methods debate in evaluation is really an old debate. Some really prominent evaluators decided to leave the AEA because the association took a position that they equated with "the flat earth movement" in geography. Here is a nice overview article (The 2004 Claremont Debate: Lipsey vs. Scriven. Determining Causality in Program Evaluation and Applied Research: Should Experimental Evidence Be the Gold Standard?) that summarises some of it.
The South African Department for Performance Monitoring and Evaluation's guideline on Impact Evaluation is also relevant if you are interested in work in the South African Context.
Wednesday, April 02, 2014
I just wanted to ask a quick question: do I need to get permission from the relevant Provincial Department of Education to carry out research in schools if the schools are part of a project we're running, and the district is therefore aware of us and probably interacting with us?
Of course, approval by the province and research ethics boards is still not all you need to ensure that you conduct your work ethically. Some fields (e.g. marketing research; see the ESOMAR guidelines) have their own guidelines about ethics, so it would be good to study these and make sure your practice remains above board.
And then this, of course, is also true:
Live one day at a time emphasizing ethics rather than rules.
Thursday, March 27, 2014
In a previous blogpost I reflected on how African values shape my practice of Evaluation.
This week I attended a seminar during which Gertjan Van Stam shared some provocative views on development in Africa. I started reading his book 'Placemark'. I love the way he gives voice to rural Africa. I find it interesting that this Dutch engineer manages to give voice to Africa in a way that I can relate to.
His beautifully written take on Ubuntu:
I am, because You are
Is it possible that people in rural areas of Africa can connect with people in urban areas around the world?
That one can walk into a scene and meet someone who walks into the same scene, even if it is geographically separated?
That we explore and connect rural and urban worlds worldwide without anyone being forced into cultural suicide?
That we meet around the globe and relate, embrace, love, and build meaningful relationships?
That we find ways to be of significance and support to each other and together shuffle poverty and disease into the abyss?
That we encourage each other to withstand drunkenness and drugs, bullying, self harm, and greed?
That we share spiritual nutrition to deal with wealth, loss, alienation and pain in this generation?
That we unite through social networks, overcoming divides and separations?
That we share ancient, tested, and new resources, opportunities, visions, and dreams that lead to knowledge, understanding and wisdom?
That we collaborate to discuss, and engineer tools, taking into account the integral health of all systems?
That together, South and North, build capacity, mutual accountability, and progress, for justice and fairness?
That I am, because You are?
Wednesday, February 26, 2014
I'm off to Cameroon on Sunday for a week of networking, learning and sharing at the 2014 AfrEA conference in Yaoundé. I love seeing bits of my continent. If internet access is available I'll try to tweet from @benitaW.
I am facilitating a workshop on Tuesday together with the wise Jim Rugh and the efficient Marie Gervais to share a bit about a VOPE toolkit EvalPartners is developing. (A VOPE, or Voluntary Organization for Professional Evaluation, is an evaluation association or society.)
Workshop title: Establishing and strengthening VOPEs: testing and applying the EvalPartners Institutional Capacity Toolkit
Abstract: One of the EvalPartners initiatives, responding to requests received from leaders of many VOPEs (Voluntary Organizations for Professional Evaluation), is to develop a toolkit which provides guidance to those who wish to form even informal VOPEs, and leaders of existing VOPEs who seek guidance on strengthening their organization’s capacities. During this workshop participants will be introduced to the many subjects addressed in the VOPE Institutional Capacity Toolkit, and asked to test the tools as they determine how they could help them apply such resources in strengthening their own VOPEs.
The workshop will be very interactive with lots of exploring, engaging, and evaluating of the toolkit resources. Participants should not come to this workshop expecting that they will sit still for more than 30 minutes at a time. We'll use a combination of learning stations and fishbowls as the workshop methodology. I'm really looking forward to it!
Eventually the toolkit will be made available online. Follow @vopetoolkit on twitter for more news about developments.
I served on the boards of both AfrEA and SAMEA so I hope that the resources that the Toolkit task force and their marvellous team of collaborators put together in the toolkit will be of use to colleagues across the continent who are still founding or strengthening their VOPEs. It is hard and sometimes thankless work to serve on a VOPE board, and if this toolkit can make someone's life a little easier with examples, tools and advice, I would count this as a worthy effort.
I expect that the workshop will be a good opportunity to get some feedback to guide us in the completion of the work.
Monday, February 17, 2014
(The picture above is of "Benny Bookworm", a character from the South African TV show "Wielie Walie", which I watched as a child.)

Beneficiary stories are an easily collected data source, but without specific information in the story it may be impossible to attribute the mentioned changes to an intervention, or to verify that the change actually occurred. Approaches such as Appreciative Inquiry and the Most Significant Change technique have been developed in response to the need to work more rigorously with this potentially rich form of data. The "Impact Story Tool" is yet another attempt to make the most of rich qualitative data, developed and tested in the context of a few programme evaluations conducted by Feedback RA.

The tool consists of a story collection template and an evaluation rubric that allow the story to be captured, verified and analysed. Project participants are encouraged to share examples of changes in skills, knowledge, attitudes, motivations, individual behaviours or organisational practice. The tool encourages respondents to think about the degree to which the evaluated programme contributed towards the mentioned change, and also asks for the details of another person who may be able to verify the reported change. The analyst collects the story, verifies it, and then codes it using the rubric. When a number of stories have been collected in this way, they are analysed together with other evaluation data. This may illuminate which parts of a specific intervention are most frequently credited with contributing towards a change.

Besides introducing the tool as it was used in three different evaluations, its usefulness and possible drawbacks are discussed.
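Purely for illustration, here is a minimal sketch of how rubric-coded stories might be tallied once collected. The field names, the 0-3 contribution scale and the `credited_components` helper are hypothetical assumptions for this sketch, not part of the actual Impact Story Tool:

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical rubric fields; the real Impact Story Tool rubric is not
# reproduced here, so these names and the 0-3 scale are illustrative only.
@dataclass
class Story:
    change_type: str    # e.g. "skills", "attitudes", "practice"
    contribution: int   # rubric score: 0 (no contribution) .. 3 (strong)
    verified: bool      # did a second person confirm the reported change?

def credited_components(stories, min_contribution=2):
    """Count which change types are most frequently credited to the
    programme, counting only verified stories whose contribution score
    meets the threshold."""
    return Counter(
        s.change_type
        for s in stories
        if s.verified and s.contribution >= min_contribution
    )

stories = [
    Story("skills", 3, True),
    Story("skills", 2, True),
    Story("attitudes", 1, True),   # contribution too weak to count
    Story("practice", 3, False),   # unverified, excluded
]
print(credited_components(stories))  # Counter({'skills': 2})
```

The point of the sketch is simply that once each story carries a contribution score and a verification flag, the "which components get credited most often" question becomes a straightforward filtered tally alongside the qualitative analysis.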
Monday, February 10, 2014
During my studies I had read Cook and Campbell, and somehow I also stumbled upon Guba and Lincoln. I was introduced to Utilization Focused Evaluation. In 2004 I got Rossi, Lipsey and Freeman for a going away present from Khulisa, and I read any evaluation journal articles I could lay my hands on.
It was after reading something that Tom Grayson (from the University of Illinois at Urbana-Champaign) wrote in a journal article about teaching evaluation that I decided to email him. I asked him for some reading material that would give me a good basis in evaluation. He responded by sending me a package of course reading materials via post... This was such an unexpected gesture of goodwill. Above is a little handwritten note that he sent with the material.
So Tom, thanks a lot. And this is me letting you know about my adventures in evaluation!
Thursday, January 30, 2014
Today is a little bit of a sad day for me. We're packing up the FeedbackRA offices, which have been the place where I practised my profession for the past eight years. I'm starting at my new offices tomorrow, and it means that my relationship with FeedbackRA is one step closer to dissolving.
I started the business in 2002 together with two colleagues. Our first job was a survey we did for MTN. Our first evaluation job was for the Gauteng Education Development Trust.
In 2004 I quit my fulltime job to earn my own salary at Feedback. In this year we did a nice piece of research for the Department of Science and Technology, and started expanding our CSI business base.
In 2006 we graduated to proper offices. We appointed our first employees and embarked on a range of projects that just saw the business grow – in terms of its focus, our skill and the turnover.
In 2007 and 2008 we did a strategic piece of research on JIPSA, and I got to interview ministers and captains of industry. In this time I was also involved with setting up SAMEA, and in 2009 I started to contribute to AfrEA too.
In September 2009 I took a step back to reflect on the important things in my life. Up until 2009 I managed the business and business finances, and delivered as a key consultant, while keeping an eye on things at SAMEA and AfrEA. I just couldn't do it all any longer.
In 2010 I scaled down on my work and volunteer responsibilities, and we got Daleen involved to help us run the business. My life was much simpler after that… but it was still tough, because the business kept on growing. I did some lovely work with colleagues from Stellenbosch on a Public expenditure tracking survey… and I learnt so much from being the “junior” on the team.
In 2010 we started negotiations with a range of other high-profile consultants to see if they would like to join as business partners. We worked together on a few projects, we had a strategic planning session… everything looked good. Our business was expanding and we decided to take up more office space. We started two big contracts which allowed us the scope to do some longer term planning, but it also opened the business up to risk… because things did not always go according to plan.
In 2011 an advertisement for the only other job I ever thought I'd consider crossed my desk. I decided to apply, and let my business partners know. We let the conversation with the other potential business partners cool down a bit. After six months in limbo, I found out that I did not get the other job, but it took only one month for me to realise that what I'd gained at Feedback was significant enough for me to call it quits. The stress of managing a business and a full consulting plate was just too much. I took on one too many assignments, because a colleague I respect a lot twisted my arm, and this had bad consequences for me as an individual. I didn't do my best work any longer. I wanted to do things well again... I wanted to focus on things that I could do well, instead of just taking on jobs to make sure cash flow was sorted. A week after my decision, I found out I was going to be a mom, so suddenly I had another reason to re-evaluate my priorities.
I sold my interest in Feedback RA in September 2011, and handed over all management responsibility to my colleagues. I worked as a consultant at FeedbackRA until April 2012, and then returned on a part time basis on selected assignments from October 2012. In this time I realised that I really preferred working on Education issues, and the area of ICT for Education became my core focus.
I continued working at Feedback until October 2013, when I decided to start my own consultancy again – Benita Williams Evaluation Consultants. I still helped out with some of the FeedbackRA work, and by January 2014 I was able to take over some of the FeedbackRA staff.
Tomorrow will be the first day in our new offices, which is quite exciting. But there is a side of me that is really sad and nostalgic for what was.
Thanks colleagues, collaborators, clients and friends. Thanks to my family. It is the end of an era, so.... "So long, and thanks for all the fish!"