Use the typology search to search our podcasts using terms from the NCRM research methods typology.
Laura Camfield on undertaking cross-national mixed methods research
Catherine McDonald, Laura Camfield (22-02-23)
In this episode of the Methods podcast, host Catherine McDonald talks to Laura Camfield, Professor of Development Research and Evaluation in the School of International Development at the University of East Anglia.
Laura discusses cross-country comparisons and the importance of theory in relation to mixed methods work, and gives her advice on what can and can't be compared. She also talks about the ethical challenges of presenting project work, particularly around recognising younger researchers and other members of the team.
This series of the Methods podcast is produced by the National Centre for Research Methods as part of the EU Horizon 2020 funded YouthLife project, and looks at how researchers can do better longitudinal research on youth transitions.
For further information on the YouthLife project, visit www.EUqualimix.ncrm.ac.uk
Methods: Laura Camfield
Catherine McDonald 0:00
Hello, and welcome to Methods, a podcast from the National Centre for Research Methods. In this series, as part of the EU Horizon 2020 funded Youth Life project, we're looking at how researchers can do better longitudinal research on youth transitions. I'm Catherine McDonald, and today I'm talking to Laura Camfield, Professor of Development Research and Evaluation at the School of International Development at the University of East Anglia. Laura has over 25 years' worth of experience in cross-national mixed methods research, and I began by asking her how that has evolved.
Laura Camfield 0:37
Perhaps this is paradoxical, or perhaps other practitioners would say the same thing, but I think as time has progressed, I've got much less confident, much more aware of the difficulties. It is a challenging thing to do well, and I think both the cross-national aspect and the mixing of methods aspect carry particular problems, some of which are similar, in that you often see work that's cross-national in the sense that lots of countries are involved, but actually it's multi-country, it's not cross-nationally comparative. Similarly, you often see mixed methods research, I'm doing my little inverted commas symbol with my fingers, which is actually multi-methods research, because when you come to the outputs, the data has been presented quite separately. There's no attempt at integration in terms of findings. But I think more importantly, there's no sense of the different methods' philosophical underpinnings engaging in any kind of critical dialogue. So that element of integration that you sometimes don't find in cross-national studies, where you've got lots of countries but actually you get lots of country papers and not much iteration, and also in mixed methods studies, those are challenges in themselves. And then when you add the two together, I think it becomes even more challenging. Added to which, when you're in a cross-national mixed methods study, you are dealing, by definition, with an awful lot of organisations. So if you think you've maybe got four or five participating countries, in each of those participating countries it's very rare to have the same team offering qualitative, quantitative and impact-related expertise. So you might have your five countries, but then you've got three institutions in each of those five countries, so potentially 15 different actors, and then whichever institutions in the UK are providing supervisory support, intellectual leadership, etc., because it often is done in that way, that slightly top-down way.
So although I become very excited by the idea of doing that kind of research, the more elements the better, my more sensible side says, actually, it's really quite hard to pull off, and maybe it's more important to have a slightly more scaled-down design, which you can do really well. So I've answered that question in quite a roundabout way, but essentially I'm saying I'm now 'less is more', whereas when I first started out, I was, yeah, you know, eight countries, 73 different methods, bring it on!
Catherine McDonald 2:39
So what would you say is the main challenge in cross-national and mixed methods work?
Laura Camfield 2:44
Multidisciplinarity is probably the big challenge in cross-national and mixed methods work, because you are bringing together very different perspectives and understandings of how the world works, what it means to do research, what constitutes valid knowledge, what's interesting to policymakers. I mean, on pretty much any fundamental question you mention, including what you actually need to know from this research exercise, in a multidisciplinary team you will have a very wide set of views. And these can't always be predicted by discipline, but it is, of course, often the case that the economists, understandably, will be very interested in the more material elements: income and expenditure data, asset indices. Whereas anthropologists will be more drawn to the more subjective elements, but they might also feel that actually there's so much you can't capture in a quantitative survey that you really need to have rigorous, in-depth qualitative research alongside it. They might in fact be very critical of the quantitative survey. They might say, for example, we're embarking on the survey with an inadequate understanding of what a household is in each of the different countries in which we're working, or how kinship relationships work, and that means that we're actually going to generate data that's really not valid, it can't be used in the way that we hope. And economists might retort, perhaps scornfully, and say, well, actually, you know, this is the way we've always done it in economics, the World Bank does its LSMS surveys, everyone seems to accept that, I don't have any problems getting published, why are you asking me to worry about this thing? Because if you collect enough data, then any small problems of the sort that you've described will just sort of iron themselves out. So you see quite fundamental differences in understandings about what methods can and can't do.
And maybe not always a great deal of mutual respect for methods that are particularly associated with particular disciplines.
Catherine McDonald 4:22
So drill down a little bit into your mixed methods work. How do you see the relationship between theory and mixed methods approaches?
Laura Camfield 4:31
I think it's tremendously important, in a couple of different ways. One of them is recognising that methods arise from particular traditions and have particular theoretical underpinnings, and if you attempt to combine them without being aware of where they're coming from theoretically, you can run into an awful lot of problems. The second aspect of theory, of course, is the extent to which you're actually engaging with it. Coming from, or being located in, a more applied field like Development Studies, I find that lip service is obviously paid to theory, but it rarely drives our work in the way that it should. So we have the fig leaf, of course, of Amartya Sen's capabilities approach, and if you say, you know, my study is based around the capabilities approach as the underpinning theoretical framework, you can get away with an awful lot, because it is essentially more of a framework for data collection than it is a theoretical understanding. So I think within my field, engagement with theory has perhaps not been as strong as it could be. But I'm starting to see, particularly in evaluation and in causal studies that are not randomised-controlled-trial influenced, a much greater drawing on ideas of middle-range theory, which I find tremendously helpful. I mean, that feels to me intuitively plausible. When you talk to a student who's about to put together a proposal for supervision, and they say to you, I want to use Marxism to analyse labour relations in Telangana, you know, that's not necessarily a bad project, but that statement doesn't tell you an awful lot about what they're actually going to do. And I sort of worry that, you know, they have a very high-level theoretical understanding, and then they have the empirical work that they're going to go out and do, but how are they, in the end, going to bridge between those two bits?
So within sociology, middle-range theory, where you're building up from the ground, constantly testing different understandings, understandings drawn from the literature, of course, but also from your own empirical research, where you're constantly critical and questioning, I mean, I find that very helpful. And something that I encourage students to do is to look at the realist evaluation tradition that one increasingly sees within impact evaluation in international development, which sets out in a very clear way how contexts and mechanisms can interact to produce particular outcomes. Something development is always very interested in is: why is it that this works here and it doesn't work there? Or why is it that this works for this particular kind of people, or for these people overall, but actually, when you look at this particular subgroup, you see it's really not working? So this provides you with quite a concrete way to think about it, and to think about it in a very fine-grained way. So there's that sense of building up to your middle-range theory, whilst also having these bigger theoretical understandings. I mean, I like grand theory as much as the next person. I could quite happily sit down with Bourdieu and Lucas and read about social reproduction. But in terms of my practice, in terms of the practical theoretical understandings that I use to investigate a particular problem, I find I sometimes have to go a little bit deeper. And what is really helpful then is to engage with the literature to see how people have picked up these big ideas and applied them appropriately in smaller-scale studies, and also to see, you know, what understandings people have developed themselves through a more inductive approach, and how you might take those understandings, those new relationships, and apply them in your own work.
Catherine McDonald 7:44
And what about comparison and generalisability? Do you have a particular approach to that, and if so, what is it?
Laura Camfield 7:51
Well, I would say I've become more cautious about what can and can't be compared, and I have a mental checklist. In fact, I don't really need my own checklist, because there are lots of checklists out there that you could use. But essentially, I'm always thinking: what are the aspects of this particular context that would make it comparable to this other context? What are the aspects of the way in which data was collected in this context, versus in this other context, that might create differences or similarities? So essentially, I'm always trying to look at things the way a critical epidemiologist would, to start from the premise that, okay, I think I've started comparing two things, I think I see a difference, but actually, what are the many ways in which that difference could in fact be explained away? Is it to do with the personality of the researchers? I've actually seen that happen. Is it to do with the particular way in which research methods were applied? Is it to do with differences in the context that are not immediately visible but tremendously important? If you're doing an evaluation, for example, you might think you see a genuine difference, but maybe that difference is explained by an earlier project about which you know nothing, rather than by anything that the intervention you're evaluating has actually done. So I think always being ready to have your comparisons overturned, and being a bit cautious about what you actually compare. I don't necessarily think everything can be compared, and I don't think that the biggest comparisons are the most useful ones. I mean, often I think you get a lot more when you look at a more detailed level. So I was fortunate enough to be at a really stimulating mixed methods workshop earlier this week, and I talked a little bit about a paper that I'd written that was not cross-nationally comparative; it stayed within a single country.
And because I'd been asked to talk about cross-nationally comparative work, I felt I had to say, okay, would it have been better if I'd made it cross-nationally comparative? And then I went through the kind of things that I think about when I ask, you know, would it have worked better? A lot of that is around methodological caution, as I've already described, but part of it is around thinking, well, actually, is that the most interesting comparison to be made? So I felt, in the paper that I talked about, which was comparing three different methods and the way in which they yielded quite different data about the same group of young people's experiences, that the comparative dimension was methodological, and there was a story to be told there; I didn't feel I needed to overlay it with other stories. But I also thought, actually, given the dataset I had, if I'd wanted to take it further, I could have compared different cohorts, different cohorts' ambitions. I'd focused on a single rural community in a particular region; actually, I could have looked across multiple regions, I could have looked at a rural community and an urban community. So in that situation, it wasn't about saying, okay, I've got data on Ethiopia, I want to make it richer and more interesting, I know, why don't I go and get some data from India? That's not necessarily the obvious thing to do. There are situations in which that can work, but not all situations. I felt I could actually have gone deeper just within that single-country dataset. So I think that's probably something I would encourage students to do. I mean, I don't necessarily think that it's always more enlightening to have more countries, particularly if you are doing a one-year PhD, and the more you add, the less deep you can go with what you actually have.
So thinking about comparison, I don't think it's great in all cases. Thinking about generalisability, I think to do that you need to have a very good and critical understanding of the data that you're working from, whatever sort of data it is. People say, of course, that qualitative data is particularly problematic to make generalisations from. But actually, I think if you have a very good understanding of your context, if you know what your qualitative data can and can't tell you, then you're in a much better position to make a sensible comparison than you might be if you had a large and sprawling quantitative dataset that you perhaps know rather less well, and in which there might be all sorts of problems that you're not entirely sure of. So I don't think having a small sample necessarily means that you can't generalise, but you need to be very cautious: no sweeping generalisations, please. And the sorts of theoretical generalisations that look more at the processes that you're observing within your data, even though there's not an awful lot of it, are always going to carry a little bit more weight, particularly because they can be referenced back to the literature, than the kind of generalisations that say, 'okay, well, I only talked to 10 girls, but they said this, and that means every girl in Ethiopia thinks that'. I mean, that's nonsense, clearly. But perhaps you might also hear people say, 'well, I surveyed 2,000 girls in Ethiopia, and because there are 2,000 of them, I can say something that applies to the whole of Ethiopia', regardless of the fact that your survey sample might have come from a single region, or be heavily biased towards people from particular socio-economic backgrounds or urban centres or whatever. I think that problem sits on both sides. And coming from more of a qualitative orientation, I'd like to fondly imagine that qualitative researchers are, you know, sometimes more aware of this problem than quantitative researchers are, but that's just my prejudice.
Catherine McDonald 12:35
And are there particular research questions or issues that you feel are best suited to a cross-national mixed methods approach?
Laura Camfield 12:43
I think issues where there's a lot of comparative data. I know that sounds a bit like looking where the streetlamp is, rather than at the more interesting stuff around it. So I've seen some very good comparisons of welfare regimes; that seems to work particularly well. There's often comparable administrative data, good knowledge both about policy and interventions, and data showing outcomes, often over quite long periods. So that sort of study, which you see a lot in social policy, seems to work very well. I've seen some really nice studies recently of the effects of COVID-19. Now, I know that if you wanted, for example, to compare death rates for COVID-19, even just in Europe, you run up against the problem of very different forms of measurement. Although I think we can say with some certainty that the UK death rates were the worst, exactly by what proportion very much depends on what kind of measure you use. So I think you want to look somewhere where you know that there are no well-known measurement problems. For example, income and expenditure surveys, whoa, yes, that's probably one to steer clear of; but good administrative data collected by the respective tax offices, that can offer some points of comparison. So these COVID-19 studies that I've mentioned: Linda Hantrais has been involved in one, and the demographers that I met at the workshop I mentioned earlier were involved in another. I mean, COVID-19, in addition of course to being a global tragedy, has provided a very interesting lens with which to look at shared problems, and also to look at why certain problems are not shared. Here you have something that has affected everybody in the world to some extent, and yet you see very different responses to it.
So those differences in responses and differences in outcomes can tell us so much about how society is structured in different contexts, and how it benefits certain types of people and disadvantages other types. In terms of data, this is the gift that goes on giving; in terms of personal experience, of course, it's been absolutely horrendous, and it's ongoing. You know, I wouldn't in any way diminish that. But here's a global problem where it seems that we do need cross-national research to understand it. And of course, as we move now into a new global financial crisis, as people are experiencing universal rises in fuel costs and in food, we're maybe a little bit less vulnerable in the UK, but it certainly doesn't feel like that when you go to the supermarket and you try to buy cheese. I mean, these feel like situations in which, if funders can be agile enough, if researchers can be agile enough, we can come together. I'm thinking here from a bit of a European comparative tradition, because of course the advantage of excellent research institutions, with researchers on staff, with university-funded time to engage in these sorts of comparisons, is not the reality in most institutions in the Global South. So I don't mean to suggest a kind of exclusive practice. But I think that where that potential for collaboration exists, these sorts of global problems present great opportunities for cross-national approaches. And it's not the case, of course, that cross-national work hasn't been done successfully. IDS had an excellent food diaries project that arose out of the last global financial crisis, and I imagine they're going to reinvigorate that as the pattern continues.
So I suppose what I'm saying, in summary, is that where you have a big question, where you have something that's very much at the forefront of people's minds, that is affecting everybody globally but in very different ways, then that's an ideal opportunity to do cross-national research, to try and understand what kinds of societal structures create greater resilience among populations to these sorts of common shocks. I mean, it's not pleasant to be, for example, in the UK right now and watch the great unfettered-capitalism experiment that's going on. But if you've ever wondered, you know, what would happen if we really did what free marketeers said, then now you know. And it's good to be a researcher out there chronicling it, I have to say.
Catherine McDonald 16:22
And you mentioned people's experience there, in addition to the data that was being gathered. So obviously that makes me want to ask what your approach is to bringing together the analysis of qualitative and quantitative elements. So what is your approach, and why do you take it?
Laura Camfield 16:40
The reality of doing it has been, as I've sort of said, qualitative and quantitative sitting in different institutions, never the twain shall meet, people with quite different disciplinary backgrounds collecting and analysing separately, which makes it very hard. What I would say is, if you're doing your integration when you've got two piles of analysis dumped on your desk from different teams who can't even be in the same room together without arguing, then you've already left it too late; it's all over for you, essentially. As you set a project up, I think it's good to think about, well, how can we bring these different elements together from the very start, so that you have genuine traction, genuine iteration, mutual respect. You start with your qualitative work, that informs the quantitative measures that you develop, and then the findings from your quantitative research feed back in. I mean, it's a sequential approach, but it's a very carefully calibrated sequential approach that gives enough time for different people to feed in at different points, and also to have the kind of debate that challenges the paradigms that underpin these methods, and attempts to move towards an approach that's really a bit more interdisciplinary. And what's quite helpful is often to have younger researchers in your team who are pragmatists, who have knowledge of a wide range of research methods. It can be challenging, of course, to find people who have very high-level quantitative skills but are also experienced in, and understand the value of, qualitative research, but I have met those people; they do exist. So have those people cheerleading, encouraging others, thinking always about integrated outputs.
So even in the way that you select outputs: say you've collected all your data and you're thinking about the kinds of papers you'll write; be aware that the very title you think of may mean you end up taking much more from the quantitative than from the qualitative, or vice versa, because obviously people understand things in quite different ways. So trying to be as inclusive as possible from the start, at every stage of data collection, of analysis and of writing, is tremendously important. If you're co-writing a paper with economist colleagues, and the first time they see your qualitative data is when you're presenting your analysis in front of them, then that's not good, really. You want your economist colleagues to be sitting alongside you as you look at your data, as you analyse it, and vice versa as they do their quantitative analysis, constantly saying: it's interesting that you're looking at that and not at this, why is that? Or: this seems interesting, what would happen if we focused on that instead? I mean, I think having that kind of challenge, that encouragement to think a bit more broadly, to think, actually, you know, why do I do this thing, this methodological practice that I've been doing almost in my sleep since I first learned it, that could be open to question. So I think having that awareness on both sides of the equation is tremendously important. I've been very lucky in the past to have PhD students who've worked in an integrated way, and to work with colleagues like Katie Rolland at the IDS. I'm very, very happy to work in that integrated way myself, and I think from that you get the most rich and exciting papers, papers that are not a kind of one plus one makes two.
But, as other mixed methods people have said, a sort of one plus one makes three, which might reinforce the prejudice against mixed methods people that they can't actually add up, but it also conveys the idea that there's a synergistic element when you do mixed methods work, which exceeds the simply additive element of putting qual with quant. There's something about bringing them together which makes both of them better.
Catherine McDonald 19:48
So I'd like to come on now to talk about ethical dilemmas. Now I can imagine that you have encountered some. Have you got any advice around ethical dilemmas to pass on?
Laura Camfield 19:59
I would say, try and benefit as much as possible from the experience of other projects that have tried to do similar things. So one thing Young Lives did very well under the leadership of Ginny Morrow was to publish a number of ethics papers about the sorts of challenges of doing longitudinal research, and particularly of doing research with children and young people in the Global South, and anyone who wanted to do that kind of research, I'd happily point them towards that set of papers. What can be a little bit harder to see, I mean, there's a great deal of writing around ethics, but what can be a bit harder to see is the ethical challenges of the way that you're actually constructing the research. So younger researchers' work being appropriated by more senior researchers: they're not receiving appropriate credit, they're not being given the opportunity to take lead authorship of papers. I did a review for ESRC with a couple of colleagues about five years ago, of all the projects that they'd funded in the ESRC-FCDO Poverty Alleviation Programme. Lots of great projects, and everyone worked cross-nationally, they worked with partners in the Global South, but when I looked at the authorship, you would almost not know it. There was one project, and it's one that's extremely well thought of, and you know, they've produced good stuff, I won't say which one it is, but it was of note that the Principal Investigator had 25 first-authored papers on that project. And I mean, it's lovely to see this being celebrated in that way, but I did sort of think, well, okay, where are your country collaborators that have done so much work? Wouldn't it be nice for them to lead occasionally? Maybe they're a little bit less advanced in their careers than you are, and they'd actually really benefit from leading a paper in World Development or wherever.
So I suppose I would say that looking at the way your project is structured and thinking, actually, is this fair, can we do it better? That's an ethical dimension which is sometimes neglected, even down to the ethics of the bureaucracy around grant management, the sort of assumptions that are made by funders about the level of due-diligence-type information that your partner colleagues can provide. You know, how do you approach that sort of thing sensitively? How do you say to them, we trust you, we respect you, we wouldn't be asking for this step if we didn't have to, and now tell us how we can help make this as painless for you as possible, because we recognise that this is putting a big burden on you. So having that ethical sensitivity about the way you work as a researcher, as well as about what you actually do in the field, is so important.
Catherine McDonald 22:13
And so to come full circle towards the end of a project and the writing of the research, how do you approach that? And do you have top tips that you'd like to pass on?
Laura Camfield 22:21
I think something I've learned is, I love collecting data, I could do it all day. That is absolutely the most pleasurable part of doing any of these sorts of projects: collecting data, talking about data, reading data, love it. When it comes to actually writing about it, that can be a bit harder, even if you're writing in a very collaborative way, so that it's also, in a sense, social, not just you and your computer. I think what I would say is, it's great to collect data as a public good, and one can always be more comprehensive and more thorough in data collection, and it's good to have a dataset that can do practically anything. But when you come to actually write up, the dataset that you think can do practically anything cannot, in fact, do all of those things you imagined. And at that point you might think, well, if I'd asked at the outset, what papers am I going to want to write from this dataset, what are the key specific questions I want to answer with it? If you'd had that in your mind from the beginning, then you might have collected slightly different data, which would mean that you could actually answer the question you want to answer in a much more in-depth and interesting way than you're currently able to do. So I suppose, as you embark on your project: first, obviously, look at the literature, look at what's already known, find the gaps, but also look at the specific ways in which your dataset, which you're now going to go out and collect, can fill them. It sounds like I'm giving this advice to an individual researcher, but actually it's advice that applies as well, I think, to country teams. You can lose sight of what you think you're doing in the desire to just do the data collection really, really well, and then you get to the end of it.
And you end up with a mound of data, unanalysable amounts of data, and basically all you do is archive it, because you just don't know what on earth you're going to do with it. And yet, when you actually try to write a specific paper, you think, okay, well. This is actually something that happened for us in Young Lives. We wanted to dig deep into the effects of childhood nutrition, and we realised that our data looked good to development studies specialists in terms of nutrition, but if you wanted to speak to nutritional epidemiologists, then actually there was a lot of stuff that was missing, a lot of really interesting stuff that they would like to have known. So our papers were really limited in terms of their audience, and a little bit on the superficial side. If we'd thought from the outset, we really want to write this sort of paper, what do we need to have to make it credible to the kinds of audiences that we're going to be aiming for, if we'd put a bit more thought in at the outset, we wouldn't have had that problem at the end. So I think it's the gap between what you have and what you want to do that is always the challenge in writing. And if you find yourself in that position as you're writing, then don't say to yourself, hey, you know, I can wing this, I can bridge this gap. Be critical and reflective and say, you know, on balance, I recognise that these sorts of data would have been useful in enriching our understanding of this particular problem, and I would advocate that anyone else in the future collects this data; however, given the limitations that I've just outlined, I still feel that this paper can make a valuable contribution to x, y, z. So I suppose, be reflective and be humble, but ideally don't put yourself in a position where you have to be reflective and humble, by thinking at the start: what do you actually want to do at the end in terms of your writing up?
Catherine McDonald 25:14
My thanks to Professor Camfield. The Youth Life project is funded by the EU Horizon 2020 Research and Innovation Programme and is a twinning initiative between the universities of Southampton, Tallinn and Bamberg, and the Netherlands Interdisciplinary Demographic Institute. You can find out more about the project at https://www.euqualimix.ncrm.ac.uk/. This was a Research Podcasts production. Thank you for listening, and remember to subscribe wherever you receive your podcasts.