Mixing Research Methods in an Impact Evaluation in Pakistan

www.oxfam.org.uk/policyandpractice

WHAT IS THIS CASE STUDY ABOUT?

Mixed-methods research for evaluation, combining qualitative and quantitative approaches, is widely recognized for its potential. It also poses challenges, especially for NGOs with limited resources and capacity for undertaking complex programmes of research and evaluation. This study describes one of Oxfam's first attempts to integrate qualitative research into an effectiveness review that was based primarily on the use of a quantitative methodology. It examines the way in which different qualitative research methods, including literature review, semi-structured interviews, and focus group discussions, were used to inform survey research and analysis, as well as the various problems encountered in this process. These difficulties hampered the full integration of methods and left a number of research questions unanswered. Several methodological lessons emerge from this and are summarized at the end of this study, which aims to provide a constructive insight into practice for anyone interested in conducting mixed-methods research.

WHY WAS THE RESEARCH NEEDED?

In 2014, one of Oxfam GB's projects in Pakistan, Empowering Small Producers, especially Women, in the Dairy Sector, was randomly selected for inclusion in its 2014/15 sample of effectiveness reviews. This project in South Punjab aimed to increase women's incomes and promote their empowerment by reviving the local dairy sector and enhancing the role of women within it. In conjunction with a local partner, the Doaba Foundation, Oxfam established a dairy enterprise and collection centres and instituted community learning groups designed to promote women's economic leadership.
As an empowerment project with a large number of participants, the effectiveness review was originally planned as a quantitative study using a quasi-experimental design, to tease out cause-and-effect relationships and estimate project impacts. Many, but not all, effectiveness reviews are based on quantitative impact evaluations of this kind (see Oxfam's How are effectiveness reviews carried out?). While changes in women's employment, incomes and consumption can be measured relatively easily, women's empowerment is much more elusive. NGOs have long struggled to identify adequate indicators and systematically capture changes in empowerment. Conceptual questions (for example, about the extent to which empowerment can be defined using local or universal criteria) are compounded by the challenges of data collection in a context in which there can be considerable sensitivity to the issues being studied.

An opportunity to address these questions in greater depth arose when additional funds became available as the impact evaluation was being planned. Given a longstanding desire to 'mix methods', the idea of adding a narrative or qualitative component to the evaluation crystallized. This had previously only been done by undertaking follow-up research after a quantitative impact evaluation had been completed and published. By integrating qualitative and quantitative research methods, the different approaches can provide complementary perspectives. Furthermore, qualitative methods would allow for in-depth engagement with particular questions about project impacts and the reasons for observed impacts.

WHAT METHODS WERE USED?

Quantitative data were collected through household surveys in the project district in South Punjab. Three hundred women (or their spouses) who had participated in the project were interviewed, as well as a comparison group of 500 women from different villages.
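The logic of comparing participants with non-participants in a quasi-experimental design of this kind can be illustrated with a deliberately minimal sketch: each participant is paired with the comparison-group woman whose estimated propensity score is closest, and the outcome difference is averaged. All data below are made up, and the one-to-one matching rule is a hypothetical simplification of the propensity score matching actually used in the review.

```python
# Illustrative sketch only, not Oxfam's actual analysis: estimating an
# average treatment effect on the treated (ATT) via one-to-one
# nearest-neighbour matching on pre-estimated propensity scores.

def nearest_neighbour_att(treated, comparison):
    """Each unit is a (propensity_score, outcome) tuple. In a real
    evaluation the scores would come from a model (e.g. a logistic
    regression) of participation on household characteristics."""
    diffs = []
    for score, outcome in treated:
        # pair with the comparison unit whose score is closest
        match = min(comparison, key=lambda c: abs(c[0] - score))
        diffs.append(outcome - match[1])
    return sum(diffs) / len(diffs)

# Toy data: (propensity score, household income outcome)
participants = [(0.62, 110.0), (0.55, 95.0), (0.70, 120.0)]
comparison   = [(0.60, 100.0), (0.50, 90.0), (0.72, 105.0)]

att = nearest_neighbour_att(participants, comparison)
print(round(att, 2))  # 6.67
```

In a real analysis, matching with replacement, calipers and covariate balance checks would all matter; the sketch only shows the shape of the comparison that underpins the design.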
Because baseline data were not gathered from the project site before it started, respondents were asked to recall basic information about their household's situation in 2009. The aim of this quantitative research was to measure differences in income and women's empowerment that were causally attributable to the project, using the comparison with non-participants. Details of the results, the propensity score matching technique and the other statistical methods used can be read in the full effectiveness review report, Women's Empowerment in Pakistan: Impact Evaluation of the Empowering Small Scale Producers in the Dairy Sector Project.

The qualitative component of the evaluation aimed at gathering information on relevant cultural and socio-economic characteristics of the survey population, in particular of women involved in dairy production. This was expected to serve two purposes:

1. To help inform Oxfam's understanding of the concept of 'empowerment', so that an index measuring it would be contextually appropriate.
2. To inform decisions taken during quantitative analysis, for example the determination of 'cut-off' points marking different degrees of empowerment. In the past, evaluators had taken these decisions without reference to the local context.

As the evaluation progressed and various problems arose, a third use for the qualitative component became apparent: to provide evidence that would help to inform, as well as corroborate or falsify, hypotheses that were associated with the results of quantitative analysis.

In addition to a workshop with programme and Doaba Foundation staff that was already planned to directly inform the survey research (see below), qualitative data were taken from different sources:

- A literature review of women's empowerment issues in Pakistan.
- Six key informant interviews with active women leaders.
- Six key informant interviews with dairy processors (all men).
- Ten focus group discussions with female farmer-group members involved in the project and four with male group members, or husbands of female group members.

Village sites for qualitative research were selected at the suggestion of Oxfam's local partner organization, the Doaba Foundation. The literature review was conducted by a professional research and evaluation consultant in Pakistan, with a background in gender. Semi-structured interviews and focus group discussions were conducted in the field by a team of local female researchers under the consultant's management. The qualitative and quantitative components were set to be sequenced one after the other, to allow time for primary qualitative data to thoroughly inform survey questions.

HOW DID QUALITATIVE RESEARCH METHODS ACTUALLY INFORM THE REVIEW?

The qualitative evaluation informed the overall evaluation and the quantitative analysis components in multiple ways, some planned and some unforeseen.

Developing a context-sensitive framework and indicators of empowerment

Oxfam needed a reliable measurement tool to assess the impact of its interventions on women's empowerment. A composite index was adapted from the emerging academic literature on women's empowerment and Oxfam's own experiences. This index allowed different dimensions and indicators to inform an overall measure of empowerment with measurable levels of change. In order to make sure this measure was context-specific, a four-day workshop was held in Islamabad with programme and Doaba Foundation staff. In the index, empowerment was defined as comprising different types of power (power over, power with, power to, and power from within), and this helped the formulation of specific survey questions.

Informing survey analysis and synthesis

For some outcomes, qualitative research findings corroborated survey results.
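Before turning to those findings, the mechanics of a composite index with cut-off points, as described above, can be made concrete with a minimal sketch. Only the four types of power are taken from the text; the indicator groupings, the equal weighting of dimensions and the cut-off values are illustrative assumptions, not Oxfam's actual specification.

```python
# Hypothetical sketch of a composite empowerment index: binary survey
# indicators are grouped into dimensions, dimension scores are averaged
# into one index, and cut-off points classify the overall result.

CUTOFFS = [(0.66, "high"), (0.33, "medium"), (0.0, "low")]  # illustrative

def empowerment_index(indicators):
    """indicators: dict mapping dimension name -> list of 0/1 scores
    from survey questions. Returns (index, classification)."""
    # each dimension contributes equally, regardless of how many
    # survey questions feed into it
    dim_scores = [sum(vals) / len(vals) for vals in indicators.values()]
    index = sum(dim_scores) / len(dim_scores)
    for cutoff, label in CUTOFFS:
        if index >= cutoff:
            return index, label

# One (invented) respondent's answers, grouped by type of power
respondent = {
    "power over":        [1, 0, 1],   # e.g. control over income, assets
    "power with":        [1, 1],      # e.g. group membership
    "power to":          [0, 1, 1],   # e.g. decision-making ability
    "power from within": [1],         # e.g. self-confidence
}

index, level = empowerment_index(respondent)
print(round(index, 2), level)  # 0.83 high
```

In the actual review, the placement of those cut-off points was precisely one of the decisions that qualitative research was meant to inform.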
For instance, focus-group discussants and survey data established that selling milk alone provided insufficient income, but was one of a range of income-generating activities that women had to engage in. Other findings from qualitative data challenged or nuanced interpretations suggested by the quantitative survey data. Survey results, for example, showed a positive difference in project participants' ability to vaccinate and promote de-worming of livestock. Interviews suggested, however, that in some instances husbands had attended the relevant training instead of their wives, who had therefore not been directly empowered.

Focus groups also shed light on marketing and sales data. Surprisingly, project participants made less profit from milk sales than women in the comparison group. Interviews suggested that this may have been due to operational problems with the new milk enterprises, which were not keeping up with competitors in the area like Nestlé, who were said to have provided advance loans to milk sellers so that they could capture a greater share of the market themselves. Without the additional information from the discussions, the sales data might have led the project team to disregard this state of affairs.

CHALLENGES POSED BY MIXING METHODS

The addition of a qualitative component to an evaluation originally designed purely as a quantitative study posed a number of challenges, of which the three main ones are discussed below.

Managing different components of the evaluation

The design of a complex evaluation employing different research methods is challenging. Staff involved as evaluators are often specialists in quite different research methods, each with its own framing and logic. In this evaluation there was also a need to outsource different data-collection components to local staff. As a consequence, connecting the disparate pieces of evidence was like assembling a puzzle.
The following circumstances added to this difficulty:

- Managing the qualitative evaluation at a distance through a consultant, while a number of elements were also being trialled for the first time.
- A need to agree on protocols for qualitative data collection, including the recording, transcription and translation of interviews. Quantitative data collection, and in particular survey research, has more widely accepted standards.

Sequencing of qualitative and quantitative research

Most of the qualitative data collection was originally planned for an early stage of the evaluation, so that interviews could inform the subsequent survey design. This turned out to be difficult for a range of reasons, including delays in qualitative data collection by local consultants. Likewise, qualitative information should have played a more integral role in the analysis of the actual survey results, though this only happened to a limited extent. In practice, aside from the literature review (conducted before the workshop), qualitative data were gathered and analysed in parallel with quantitative data collection, and findings were interpreted separately, before being brought together. This meant that the sequencing of qualitative and quantitative research was not fully successful, and while both types of evidence provided important insights, often this took the form of pointing to gaps in understanding.

Questions that were not asked and/or remain unanswered

Some important issues were uncovered by the qualitative research but emerged too late to be incorporated into the survey design, and so were omitted from the large-scale data collection. These included:

- Historic prices and context for sales in the project district. In some focus groups, participants mentioned that the price paid by middlemen for dairy products from the project area had gone up significantly. Because questions on the local sector were not included in the survey, a potential causal link was flagged, but could not be disentangled and tested appropriately.
- The context-specific nature of empowerment. Qualitative research revealed additional dimensions of empowerment that were not built into the original conceptual framework. For example, voting freedom for women in the project district is highly constrained. More men than women are registered to vote. In addition, in interviews women disclosed that when they voted, this was often done at the 'will and wish' of their husbands and families. Some focus-group data also pointed to some men taking on a second wife, an indicator (to them) of the first wife's disempowerment.

As the published effectiveness review points out, in the future an empowerment index might include such contextually important indicators, but this would require more qualitative, in-depth research preceding the setting of indicators. The effectiveness review report was refreshingly honest about these issues and the problems encountered, setting an important benchmark for similar reviews.

LESSONS LEARNED

Oxfam's goal is to produce project evaluations that are as rigorous, informative, and cost-effective as possible within the context of considerable resource and capacity constraints. The dynamic between this ambition, constraining factors and the production of reliable evidence remains an ongoing challenge. Here is a summary of the key methodological lessons Oxfam GB learned from the endeavour to mix methods in this particular impact evaluation: