Item nonresponse and measurement error in cross-national surveys: Methods of data collection and analysis


Principal Investigator: Jouni Kuha, London School of Economics and Political Science

Co-Investigators: Chris Skinner, Irini Moustaki and Jonathan Jackson (LSE), Sarah Butt (City University London)

Project duration: 1 April 2013 - 30 September 2014


Cross-national surveys are one of the key resources of social science. For example, the European Social Survey (ESS) interviews over 50,000 people in around 30 countries. Such data can be used to address many research questions about attitudes and behaviour of individuals, and how these vary across countries. Answers to such questions can inform many areas of social science and public policy making.

The complexity of the surveys raises methodological challenges which need to be met in order to make the best use of the data. Two of these are problems of data quality: measurement error where the answers by survey respondents are in some way erroneous, and nonresponse where some questions are not answered at all. These can distort conclusions from any survey. In cross-national surveys the problem is compounded if the levels and patterns of nonresponse and error are also different between countries.

The goal of this project is to develop and evaluate research methods for these problems. The researchers will focus on cross-national surveys because the challenges can be particularly serious there and knowledge about them is most lacking: even if we know something about how a method works in one country, we often know little about how it works in other countries and how this may affect cross-national comparisons.

The first strand of the project concerns multiple questions which are used jointly to measure constructs such as attitudes. For these we will study latent variable modelling, a statistical method that can be used to analyse such measurement and how it varies across countries. We will extend the method to also analyse nonresponse in the questions. We will then apply it to ESS questions on attitudes to immigration, an area where complex patterns of measurement error and nonresponse are likely. The results of our analysis may then inform the design of questions in the ESS itself, which will include a large module on immigration in 2014.
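As a toy illustration of why such modelling matters (this is not the project's actual model, and all intercepts, loadings and sample sizes below are invented), the following simulation shows how country-specific measurement differences can distort a naive cross-country comparison of attitudes:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000  # respondents per country (illustrative)

# Two hypothetical countries with IDENTICAL latent attitude distributions
theta_a = rng.normal(0.0, 1.0, n)
theta_b = rng.normal(0.0, 1.0, n)

# Country B's three items have shifted intercepts (differential item
# functioning), e.g. due to translation differences -- an assumed form
# of cross-national measurement error
intercept_a, intercept_b = 0.0, 0.4
y_a = intercept_a + 0.8 * theta_a[:, None] + rng.normal(0.0, 0.6, (n, 3))
y_b = intercept_b + 0.8 * theta_b[:, None] + rng.normal(0.0, 0.6, (n, 3))

# A naive comparison of mean sum-scores wrongly suggests country B holds
# more positive attitudes, even though the latent distributions are identical
naive_gap = y_b.sum(axis=1).mean() - y_a.sum(axis=1).mean()
print(round(naive_gap, 2))  # close to 3 * 0.4 = 1.2, an artefact of the items
```

A latent variable model that allows the item parameters to differ by country would attribute this gap to the measurement process rather than to a genuine attitude difference.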

We will also study a method of data collection whose aim is to reduce data problems in multiple-item questions in the first place. This is “prompting”, where survey interviewers query initial answers of “don’t know” to try to replace them with substantive answers. Prompting can reduce “don’t knows”, but it may also replace them with ill-considered replies. We will investigate these possibilities through an experiment in an ESS innovation survey, and use the results to provide recommendations about the use of prompting in the future.

The second strand of the project concerns questions on sensitive topics such as illegal behaviour. When asked about these, people often lie or refuse to answer at all. There are alternative methods of questioning which avoid direct questions but still provide information about the behaviours. We will examine one of them, the “item count technique”, in which respondents report only how many items on a list apply to them rather than which ones, together with methods of analysing it. We will study it in a 7-country “Fiducia” survey, with an experiment using item count questions on acts such as downloading pirated material from the internet. The results can tell us whether the method reduces nonresponse and error compared to direct questions, and provide recommendations for its use.
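In the standard version of the item count technique, one randomly chosen group receives a list of innocuous items and the other receives the same list plus the sensitive item; because respondents report only a count, no individual reveals the sensitive answer, yet the difference in mean counts estimates the behaviour's prevalence. A minimal sketch with simulated data (the item probabilities and the 15% prevalence are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20_000  # respondents per group (illustrative)
true_prevalence = 0.15  # hypothetical rate of the sensitive behaviour

# Control group: count of "yes" answers to four innocuous items
p_innocuous = np.array([0.5, 0.3, 0.6, 0.2])
control_counts = rng.binomial(1, p_innocuous, size=(n, 4)).sum(axis=1)

# Treatment group: the same four items plus the sensitive item
treatment_counts = (rng.binomial(1, p_innocuous, size=(n, 4)).sum(axis=1)
                    + rng.binomial(1, true_prevalence, size=n))

# Difference-in-means estimator of the sensitive behaviour's prevalence
estimate = treatment_counts.mean() - control_counts.mean()
print(round(estimate, 3))  # close to the true prevalence of 0.15
```

The design protects respondents because a count of, say, 2 out of 5 never identifies which items were answered “yes”, which is why the technique may reduce refusals and lying relative to direct questions.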

The research team includes statisticians, survey researchers and social scientists from the LSE and City University London. To enable users across the world to learn from the project, we will develop a training module for the EduNet e-learning site of the ESS. We will also provide information on a project website and disseminate our findings through academic conferences and publications.

Listen to Jouni Kuha talk about the project findings in podcast 'To probe or not to probe - Nonresponse in surveys'.