Over the summer, I spent ten weeks as an intern with the Overseas Development Institute (ODI) in London, working in their Research and Policy in Development (RAPID) department. The department, as the name suggests, examines the links between research and policy, asking questions such as how to promote the uptake of research in policy, how to empower policymakers to request the research they need, and how to shape research so as to facilitate uptake.
One of the fields to which RAPID has made a significant contribution is evidence-based policy. Recently the Department for International Development (DFID) published a How To Note: Assessing the Strength of Evidence (1). In conjunction, DFID has been pushing for policy briefs that are short and visual, in the interest of communicating a more digestible message to policymakers. In theory, pushing for simplification is an obvious and positive step. However, the How To Note has generated heated debate among researchers who are concerned that the rigour of their research, and the nuance of their message, are threatened. Amongst other concerns is an uneasiness that assessing studies with different designs, which ask different questions and explore different contexts, and reducing the assessment to a score of high, medium or low strength, demands such oversimplification that the true meaning of the individual studies is often lost. Researchers are forced to expose themselves to possible criticism and conflict by making such judgements, the end result of which is more a caricature of the research than a summary of it.
As a statistician, one is constantly attempting to find a balance between oversimplifying the picture and emphasising nuance to the point where no picture can be discerned. My immediate inclination, therefore, was to suggest that by gathering more systematic meta-data on the studies included in a systematic review, one would be able to create visual summaries of the information. The researcher could then carefully calibrate the amount of information retained. The result should be visually accessible for policymakers, whilst ensuring that an appropriate amount of information is retained.
This became, to some degree, my research project for the summer. I worked on it in preparation for a workshop on an implementation framework for the How To Note, which will be chaired by Louise Shaxson of the ODI. The workshop is an attempt on the part of the DFID to consult with researchers. I gathered a sample of studies from a body of literature, which had been reviewed in the past by ODI staff, on land rights and their effects on rural households. I created a database of meta-data on those studies, and I experimented with certain techniques for visual summary. Along the way, I summarised the literature on conducting systematic reviews and recorded any thoughts that I had on the process, what I struggled with, and where I found possible solutions. I found the process rewarding.
I showed that maps can be used to reveal the geographic clustering of studies, and thereby identify potential bias, and that network models can capture the interactions between research papers by modelling cross-citations, thereby showing the influence that individual studies have on the body of literature. I also experimented with using multiple correspondence analysis (MCA) to reflect patterns in the results, in order to pick up, for example, where research design unduly influences the outcomes of a study. This last approach proved problematic.
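To make the citation-network idea concrete, the sketch below builds a small directed graph of cross-citations using the networkx library. The study names and citation links are entirely invented for illustration; the point is that studies cited often within the sample stand out as influential nodes, whether measured by simple in-degree or by PageRank.

```python
import networkx as nx

# Hypothetical cross-citation data: (citing study, cited study).
# All study names are invented for illustration.
citations = [
    ("StudyA", "StudyC"), ("StudyB", "StudyC"),
    ("StudyD", "StudyC"), ("StudyD", "StudyA"),
    ("StudyE", "StudyB"),
]

G = nx.DiGraph(citations)

# In-degree: how often each study is cited within the sample
in_deg = dict(G.in_degree())

# PageRank weights a citation by the influence of the citing study
pr = nx.pagerank(G)

most_cited = max(in_deg, key=in_deg.get)
print(most_cited, in_deg[most_cited])  # StudyC, cited 3 times
```

Drawing the graph (for example with `nx.draw`) then gives the kind of visual summary discussed above, with node size proportional to influence.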
I ran the MCA and created the visuals, but I was not able to demonstrate that interesting patterns could be discerned. I was convinced that quantitative, qualitative and mixed-methods studies could be evaluated in the same database. And yet, as I tailored my database to make sense for one study, it would stubbornly become inappropriate for capturing information about another. This leads me to the key point I want to highlight in this article: where the group of studies is diverse, and the policy outcome to which the synthesis is to contribute is unknown, creating a global template for collecting meta-data using categorical variables becomes impossible.
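For readers curious about what the MCA step involves, one standard formulation is a correspondence analysis of the one-hot indicator matrix of the categorical meta-data. The sketch below follows that formulation; the studies, variables and categories are all invented for illustration, and a real review would of course use the meta-data template under discussion.

```python
import numpy as np
import pandas as pd

# Hypothetical categorical meta-data for five studies;
# all variable names and values are invented for illustration
meta = pd.DataFrame({
    "design":  ["RCT", "case_study", "survey", "RCT", "survey"],
    "region":  ["Africa", "Asia", "Africa", "Asia", "Africa"],
    "finding": ["positive", "none", "positive", "positive", "none"],
})

# One-hot indicator matrix: one column per category
Z = pd.get_dummies(meta).to_numpy(dtype=float)

# Correspondence analysis of the indicator matrix (i.e. MCA):
# normalise, compute standardised residuals, then decompose
P = Z / Z.sum()
r, c = P.sum(axis=1), P.sum(axis=0)
S = np.diag(r ** -0.5) @ (P - np.outer(r, c)) @ np.diag(c ** -0.5)
U, sv, Vt = np.linalg.svd(S, full_matrices=False)

# Coordinates of the studies on the first two principal axes;
# studies that plot close together share similar meta-data profiles
coords = (np.diag(r ** -0.5) @ U[:, :2]) * sv[:2]
print(coords.shape)  # one 2-D point per study
```

Plotting `coords` is where one would hope to see, say, studies of a given design clustering with a given finding; as noted above, on the real land-rights literature no such clean pattern emerged.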
For a synthesis to be valuable, the conclusions of the papers need to be reflected: questions such as whether or not a relationship was recorded and, if so, how strong it was and in what direction it ran, must be addressed. But how does one create categories of outcomes when each paper asks a slightly different question, searches for slightly different relationships, or evaluates more than one question, and when some papers ask what the relationship is while others ask why it occurs?
In a recent discussion with Keith Coleman, a public sector strategy consultant, it became clear to me that the problem arises when researchers are unclear about the purpose of the research, a situation that tends to occur because policymakers themselves are unsure about their logical framework. Before evidence is gathered on a topic, policymakers need to know what outcome they wish to achieve, and how the research will influence the achievement of that outcome. Of course, this is easier said than done in a world of complex, emergent and chaotic policy issues (2). If policymakers could frame the question clearly enough, however, then social scientists should be able to be more selective in their choice of studies, compare papers with diverse research designs, and yet evaluate them all according to the same criteria. The result would be a framework which would, without distortion, facilitate assessment according to common categories and variables in a database.
A strategic outlook on the part of the person commissioning the research is a necessary, if not sufficient, condition for a high-quality output from a systematic review. There is scope for improving a researcher’s techniques for synthesis and communication, but without an understanding of the research user’s intent, a truly convincing and clear summary will never be achieved.
(1) Available for download from https://www.gov.uk.
Louise Shaxson is my supervisor and a research fellow at RAPID.
(2) For more on this see Shaxson, L. 2011. Why wicked issues need more than Wikis. Seminar at the Centre for International Governance Innovation, Waterloo, Canada.