Example: analysis of a social problem for an ongoing project

Nancy Waweru runs the Mrembo (beautiful girls) Project in Nairobi, Kenya.

Through her organization, Vijana Amani Pamoja (VAP), adolescent girls attend after-school programs that prepare them to face the dangers of the Majengo slums[1]. She started the project with support from local teachers and practically zero budget, but with a clear sense that she would eventually need to demonstrate the program’s impact to win grants. She chose the storytelling approach because the program was new and needed an open, exploratory style of monitoring if it was going to evolve to better suit the girls’ needs[2].

Nancy took 20 minutes out of scheduled events at the beginning and end of the ten-week program and prompted the girls to tell stories (a pre-post evaluation design). She read all the stories herself to build a basic understanding of how patterns changed during the program, and also fed the data into the shared analysis tool for deeper comparisons. Searching for “vijana amani pamoja” or “VAP” on djotjog.com/search reveals 70 stories that name the organization.
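
The same lookup can be sketched offline against an exported copy of the story data. A minimal sketch in Python, assuming a hypothetical stories.csv export with invented column names and cutoff dates (not the actual djotjog.com interface):

```python
import csv

# Hypothetical CSV export of the shared storytelling dataset; the file name and
# column names ("text", "collected_date") are assumptions for illustration.
with open("stories.csv", newline="", encoding="utf-8") as f:
    stories = list(csv.DictReader(f))

def names_vap(story):
    # Mirror the djotjog.com/search query for stories naming the organization.
    text = story["text"].lower()
    return "vijana amani pamoja" in text or "vap" in text.split()

vap_stories = [s for s in stories if names_vap(s)]
print(len(vap_stories), "stories name the organization")

# Pre-post design: compare stories collected before the ten-week program began
# with stories collected at its end (the cutoff dates are invented).
pre = [s for s in vap_stories if s["collected_date"] < "2011-01-15"]
post = [s for s in vap_stories if s["collected_date"] >= "2011-03-20"]
print(len(pre), "pre-program stories,", len(post), "post-program stories")
```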

Additional searches by location turn up another 100 or so relevant stories. A visual summary (below) shows who told stories (demographics) and what they were about (topic). Most stories told by girls are coded red, meaning they describe negative events.

Further filtering to show only stories from girls reveals that self-esteem and respect are the major shared themes. Comparing this result to all stories from girls of the same age in Kibera, another Nairobi slum, shows that the Mrembo project girls are different.
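
That filter-and-compare step is straightforward to reproduce. Continuing the hypothetical dataset above (the storyteller_gender, storyteller_age, location, and themes fields, and the age band, are all assumptions):

```python
from collections import Counter

def is_girl(story, min_age=11, max_age=16):
    # Age band is illustrative; set it to match the actual cohort.
    return (story["storyteller_gender"] == "female"
            and min_age <= int(story["storyteller_age"]) <= max_age)

def top_themes(records, n=5):
    counts = Counter()
    for s in records:
        # Assumed format: themes tagged as a semicolon-separated list.
        counts.update(t.strip().lower() for t in s["themes"].split(";") if t.strip())
    return counts.most_common(n)

mrembo_girls = [s for s in vap_stories if is_girl(s)]
kibera_girls = [s for s in stories if is_girl(s) and s["location"] == "Kibera"]

print("Mrembo girls:", top_themes(mrembo_girls))
print("Kibera girls:", top_themes(kibera_girls))
```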

There are many other possible comparisons. Instead of comparing girls in two slums, Nancy could compare girls to boys, or drill down on these specific issues and read the stories themselves. Through the girls’ stories Nancy realized that rape and sexual assault were a major issue, even for 11-year-old girls. So she changed the curriculum to focus much more on these issues, which in turn led to more sessions on teen pregnancy and HIV/AIDS. This is what iterative program design is all about. But by recording the learning process in this way, she not only improves her chances of getting funding, she also shares this data with other organizations that could use it as a baseline (see below).

Nancy was also able to compare the second iteration of her Mrembo program[3] to a similar USAID-funded program in Kibera, called Sita Kimya (“I will not be silent!”), for which hundreds of people had shared stories (see http://chewychunks.wordpress.com/2011/07/25/comparing-two-rape-prevention-programs/). By contrasting words found or omitted in these two sets of stories, she saw that, according to the girls’ perceptions, the Mrembo program addressed HIV/AIDS, early marriage, and pregnancy, while Sita Kimya did not. Sita Kimya was narrowly focused on preventing rape, and most of the 314 stories were from men ages 16 to 30. I posted these findings on my blog and got an interesting response from Michael Owigar, that program’s “Chartered Marketer,” with a litany of defenses:

“Sita kimya for example was not addressing early marriage and sexual violence manifested itself in other more prominent ways like intimate partner violence, child defilement etc.... Sita Kimya for example was implemented in under a year. Lastly, how did you gather your data Sita kimya for example segmented beneficiaries into cohorts- males 15-24 (Morio) Females 15-24 ( Sella’s) Males 25+ (Oyoo) and Females 25+. The project actually reached more females than males.”

Here are the fruitful beginnings of an actual conversation between local implementing organizations, based around data. Michael’s last comment is a typical example of why standard monitoring and evaluation methods are broken: by their internal accounting, they reached more women than men, and yet out of 7,619 stories from Kibera only 22 percent of those about Sita Kimya are from women. Most of the women they “reached” chose to talk about something else, and so were not counted in our approach, which defers to the community’s sense of priorities. Given Sita Kimya’s intent to change community-wide behavior, these (free) numbers and stories are precisely what they needed to evaluate whether their outreach was effective. I wish they had been more open to a shared monitoring approach, but the project ended and its traces disappeared from the Internet, leaving the GlobalGiving storytelling data as the only public record of what they tried and what people liked or disliked about it (Google “sita kimya usaid” and see for yourself).
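
Both the word-level contrast and the gender breakdown above boil down to simple counting over the same story records. A rough sketch, again reusing the hypothetical stories list and assumed field names from the earlier snippets:

```python
from collections import Counter
import re

# Hypothetical subsets filtered by program name and location.
kibera_stories = [s for s in stories if s["location"] == "Kibera"]
sita_kimya_stories = [s for s in kibera_stories if "sita kimya" in s["text"].lower()]
mrembo_stories = [s for s in stories if "mrembo" in s["text"].lower()]

def word_counts(records):
    counts = Counter()
    for s in records:
        counts.update(re.findall(r"[a-z']+", s["text"].lower()))
    return counts

def found_or_omitted(found, omitted, min_count=3, top=20):
    # Frequent words in one set of stories that never appear in the other.
    return [w for w, n in found.most_common()
            if n >= min_count and w not in omitted][:top]

mrembo_words = word_counts(mrembo_stories)
sita_words = word_counts(sita_kimya_stories)
print("In Mrembo, not Sita Kimya:", found_or_omitted(mrembo_words, sita_words))
print("In Sita Kimya, not Mrembo:", found_or_omitted(sita_words, mrembo_words))

# Cross-check the reach claim: what share of Sita Kimya stories came from women?
from_women = [s for s in sita_kimya_stories if s["storyteller_gender"] == "female"]
share = 100 * len(from_women) / len(sita_kimya_stories)
print(f"{share:.0f}% of Sita Kimya stories are from women")
```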

The happy ending to Nancy’s story is that she has since raised $6,609 on GlobalGiving and attracted grant funding from the Girl Effect, Women Win, and the Dutch Embassy in Nairobi.

Read more examples of story-based analysis

[1] http://www.globalgiving.org/projects/improving-lives-of-girls-in-nairobi-slums/

[2] Randomized controlled trials (RCTs) determine whether an intervention should be replicated. Storytelling reveals whether the broader program is working: http://www.nominettrust.org.uk/knowledge-centre/blogs/rcts-should-only-be-part-story

[3] http://chewychunks.wordpress.com/2011/06/04/nairobi-slum-girls-get-straight-talk-from-vaps-mrembo-project/
