Conference Agenda

Overview and details of the sessions of this conference.

Session Overview
Session
Increasing respondent engagement
Time:
Monday, 08/July/2024:
3:30pm - 5:00pm

Session Chair: Joost Kappelhof
Session Chair: May Doušak
Location: C401, Floor 4

Iscte's Building 2 / Edifício 2

Session Abstract

In terms of its methodological rigour and data quality, the ESS has been a benchmark international comparative face-to-face survey project for over 20 years. However, the upcoming transition from interviewer-administered to self-completion surveys brings new methodological and ethical challenges to the ESS, with respondent engagement being one of them. To this end, new ways of motivating respondents in self-completion surveys need to be developed and evaluated, thereby ensuring that the ESS remains the benchmark for methodological rigour and data quality in international comparative surveys.

In face-to-face surveys, trained interviewers can keep the respondent focused on the survey and can even stop the interview when the external context (circumstances) is not appropriate. It is much more challenging to keep the respondent focused (solely) on the survey in self-completion: a paper questionnaire offers no control over, or information on, respondent engagement or the number of sittings, and web surveys are conducted on devices with multiple applications constantly battling for the user's attention. While not true for all question types and all respondents, research into cognitive processes shows that respondent focus and response times generally affect the quality of the responses (Tourangeau, 1989; Yan and Tourangeau, 2007).

Passive monitoring of the respondent by collecting paradata while they complete a self-completion survey can provide vast and diverse information on their level of engagement. Still, it cannot improve the quality of responses while the interview is underway in the way a trained interviewer can. Active monitoring and intervention (e.g. notifications when the respondent is speeding, straight-lining, not responding, etc.) can partly increase the respondent's focus, but can also influence the responses or even the willingness to complete the survey. Graphical approaches and gamification can make the survey more enjoyable. Still, the literature shows that researchers should mainly focus on the fundamental components of respondent burden coming from the instrument itself (e.g. Downes-Le Guin et al., 2012).

The session focuses both on instrument design to keep respondents engaged and on approaches to monitoring and assisting the respondent through the survey in an ethically appropriate manner, with the aim of producing data of the highest possible quality.


Presentations

How far is close enough? Effect of distance to the closest outlet on the effectiveness of shop vouchers as a non-monetary incentive in a face-to-face survey

Dragan Bagić, Luka Jurković

Faculty of Humanities and Social Sciences, University of Zagreb, Croatia

Continuing declines in response rates (RR) in survey studies have led to various experiments with survey procedures to boost RR. One of the most often-used solutions is to introduce incentives for respondents. Both controlled trials and meta-analyses of such studies have confirmed that incentives are effective and can boost RR to some extent, regardless of the survey mode (Jia et al., 2021; Pforr et al., 2015). It is well established that monetary incentives in the form of cash or checks are more effective than non-monetary incentives such as store vouchers, lotteries, small gifts, etc. (Abdelazeem et al., 2023; Jia et al., 2021). The general explanation is that monetary incentives face no obstacles to use, and respondents can spend them whenever and wherever they want.

Although many studies have explored the effects of incentives, there is limited knowledge about the effectiveness of different types of non-monetary incentives. In meta-analyses, non-monetary incentives are often treated as a homogeneous group, despite the fact that differences in their specific characteristics can have a significant impact on their effectiveness. Shop vouchers are considered non-monetary incentives because respondents are limited in where and how they can use them. They are frequently used as a supplement to monetary incentives in cases where giving cash to respondents is not legal or can place a tax burden on them. However, only a limited number of studies have specifically looked at the effects of shop vouchers, and there is not enough evidence to determine how specific characteristics of shop vouchers, such as their value or the type of shop they can be used in, influence their effectiveness (Willimack et al., 1995).

The purpose of this paper is to enhance our understanding of how certain characteristics of shop vouchers affect their effectiveness. Specifically, we examine how the distance between a respondent's home and the nearest retail chain outlet impacts the effectiveness of vouchers in increasing RR. If the diminished effectiveness of non-monetary incentives such as shop vouchers is due to usage barriers, then we can hypothesise that the distance between the respondent's home and the nearest location where the voucher can be used will influence voucher effectiveness, since a greater distance presents an obvious potential obstacle.

The hypothesis will be tested using observational data from ESS R10 in Croatia. R10 data collection was conducted between April and November 2021 on an individual-based gross sample of 3,940 respondents. Shop vouchers worth around 6.5 euros were offered to all respondents as a conditional incentive, resulting in an RR of 51%. Anonymized standard contact forms, including the respondent's gender, age, settlement size and type, and the final contact outcome, serve as the database for the analysis, supplemented with two additional variables: the exact distance between the respondent's home address and the nearest retail outlet where the voucher can be used, and whether that outlet is in the same settlement. Multilevel logistic regression was performed to test the main hypothesis.
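The abstract does not state which software was used; as a minimal sketch of a random-intercept (multilevel) logistic regression of contact outcome on voucher distance, with respondents nested in PSUs, one could use Python's statsmodels. All file and variable names below are illustrative assumptions, not the authors' actual coding.

import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# Hypothetical contact-form data: one row per sampled person, with the final
# contact outcome recoded as a 0/1 response indicator.
df = pd.read_csv("ess_r10_contact_forms.csv")

# Random-intercept (multilevel) logistic model: respondents nested in PSUs.
vc = {"psu": "0 + C(psu)"}
model = BinomialBayesMixedGLM.from_formula(
    "response ~ distance_km + same_settlement + gender + age + C(settlement_size)",
    vc,
    df,
)
result = model.fit_vb()  # variational Bayes estimation of the mixed model
print(result.summary())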



Increasing respondent engagement in self-completion surveys

Joost Kappelhof1, Paulette Flore1, May Doušak2, Roberto Briceno-Rosas3

1SCP - The Netherlands Institute for Social Research; 2University of Ljubljana; 3GESIS - Leibniz Institute for the Social Sciences

In terms of its methodological rigour and data quality, the ESS has been a benchmark international comparative face-to-face survey project for over 20 years. However, the upcoming transition from interviewer-administered to self-completion surveys brings new methodological and ethical challenges to the ESS, with respondent engagement being one of them. To this end, new ways of motivating respondents in self-completion surveys need to be developed and evaluated, thereby ensuring that the ESS remains the benchmark for methodological rigour and data quality in international comparative surveys.

In face-to-face surveys, trained interviewers can keep the respondent focused on the survey and can even stop the interview when the external context (circumstances) is not appropriate. It is much more challenging to keep the respondent focused (solely) on the survey in self-completion: a paper questionnaire offers no control over, or information on, respondent engagement or the number of sittings, and web surveys are conducted on devices with multiple applications constantly battling for the user's attention. While not true for all question types and all respondents, research into cognitive processes shows that respondent focus and response times generally affect the quality of the responses (Tourangeau, 1989; Yan and Tourangeau, 2007).

Passive monitoring of the respondent by collecting paradata while they complete a self-completion survey can provide vast and diverse information on their level of engagement. Still, it cannot improve the quality of responses while the interview is underway in the way a trained interviewer can. Active monitoring and intervention (e.g. notifications when the respondent is speeding, straight-lining, not responding, etc.) can partly increase the respondent's focus, but can also influence the responses or even the willingness to complete the survey. Graphical approaches and gamification can make the survey more enjoyable. Still, the literature shows that researchers should mainly focus on the fundamental components of respondent burden coming from the instrument itself (e.g. Downes-Le Guin et al., 2012).
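As a minimal illustration of the kind of active-monitoring checks mentioned above, the following Python sketch flags speeding and straight-lining from simple paradata. The function names and thresholds are assumptions chosen for illustration, not the ESS implementation.

from statistics import pstdev

def is_speeding(item_times_sec, min_sec_per_item=1.5):
    """Flag a page as 'speeding' when the mean time per item is implausibly short."""
    return sum(item_times_sec) / len(item_times_sec) < min_sec_per_item

def is_straightlining(grid_answers):
    """Flag an item battery when every item received the same response."""
    return len(grid_answers) > 1 and pstdev(grid_answers) == 0

# A web survey front end could show a gentle prompt when such flags fire.
if is_speeding([0.8, 0.6, 0.9, 0.7]) and is_straightlining([3, 3, 3, 3]):
    print("Please take a moment to consider each statement before answering.")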

The paper focuses both on instrument design to keep respondents engaged and on approaches to monitoring and assisting the respondent through the survey in an ethically appropriate manner, with the aim of producing data of the highest possible quality.



Self-completion feasibility testing in ESS Round 11 in Slovakia. Aims, worries and reality.

Michal Kentoš, Denisa Fedáková

Centre of Social and Psychological Sciences, Slovak Republic

Slovakia belongs to the group of countries where the self-completion approach has rarely been used in comparable surveys. The ESS plans to transition to a self-completion approach from Round 13, which makes feasibility testing necessary. The NCT in Slovakia made its first attempt as early as possible, i.e. shortly after the ESS Round 11 face-to-face fieldwork was completed in December 2023. A sample of 1,046 households was selected from the same 523 PSUs as the sample for the F2F approach. Fieldwork started in January 2024 and employed a concurrent approach (push-to-web and paper). The full ESS questionnaire was used and delivered to addresses by a postal operator. An invitation letter, a GDPR letter and a return envelope were enclosed, and three reminder letters were delivered later at the specified intervals. The aim of the testing was to verify a few basic parameters: i) the reliability of the postal operator, ii) the technical solution (push-to-web), iii) response rates across modes and demographic groups, iv) feedback from the helpline and communication with respondents, and v) practical issues experienced in the self-completion mode. Preliminary results indicate an overall low response rate for both the web and the paper mode. Final results will be presented, and follow-up interventions focused on instrument design and respondent engagement will be discussed.



Surely shorter is better? A questionnaire length experiment in a self-completion survey

Tim Hanson1, Eva Aizpurua2, Rory Fitzgerald1, Marta Vukovic3

1European Social Survey Headquarters (City, University of London), United Kingdom; 2NatCen Social Research; 3University of Vienna

As surveys increasingly transition to self-completion approaches, understanding the impact of design decisions on survey outcomes becomes paramount. This includes assessing the effects of different questionnaire lengths on survey quality and considering what length of questionnaire can reasonably be fielded in a self-completion environment. This paper presents findings from a self-completion experiment conducted in Austria in 2021, which compared the full European Social Survey Round 10 questionnaire (estimated 50-minute completion time) with a shorter version (anticipated 35-minute completion time). The analysis includes a comparison between the longer and shorter versions based on response rates, sample composition, and data quality indicators. Except for response rates, which were significantly but only slightly higher in the shorter condition, results for both versions were generally comparable. Fielding a shorter questionnaire did not produce clear benefits in terms of sample composition or data quality. These results suggest that a questionnaire that would traditionally be regarded as lengthy for self-completion can yield acceptable outcomes.
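As a minimal sketch of how a response-rate difference between two experimental conditions can be tested, one could use a two-sample proportions z-test in Python; the counts below are placeholders for illustration only, not the study's actual figures.

from statsmodels.stats.proportion import proportions_ztest

# Placeholder counts (NOT the study's figures): completed interviews and
# issued sample for the long and short questionnaire versions.
completes = [420, 460]
issued = [1000, 1000]

stat, pval = proportions_ztest(count=completes, nobs=issued)
print("response rates:", [c / n for c, n in zip(completes, issued)])
print(f"z = {stat:.2f}, p = {pval:.3f}")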



The role of survey enjoyment in understanding nonresponse in an online probability panel

Katya Kostadintcheva, Patrick Sturgis, Jouni Kuha

London School of Economics and Political Science, United Kingdom

Online probability panels are an increasingly common feature of the modern survey landscape. Their design is based on recruiting a random sample of respondents who agree to complete surveys at regular intervals for small monetary incentives. However, compared to interviewer-based survey panels, they are characterised by considerably higher rates of nonresponse at each panel wave, on top of already lower initial recruitment rates relative to other survey modes. Together, this means that the net cross-sectional response rate for this type of design is often very low. Given their increasing prevalence, it is essential that we better understand the factors that lead online panel respondents to decline survey invitations.

In this paper, we focus on the role of survey enjoyment in determining subsequent nonresponse. The research uses data from the UK's Kantar Public Voice online probability panel, which employs a mixture of face-to-face and address-based sampling to recruit respondents randomly. We model nonresponse across multiple panel waves using discrete-time survival analysis. This approach enables us to deal with the unbalanced data structure that characterises this type of panel, where some panel members receive more frequent survey invitations than others based on their response propensity. We use information from earlier survey waves of the panel to predict the probability of nonresponse in future waves. Specifically, we examine whether respondents' previous evaluation of the survey experience, defined as survey enjoyment, predicts subsequent survey panel participation.

Additionally, we assess whether the role of survey enjoyment is moderated by the respondent's personality, measured using a short-form version of the Big Five Personality Inventory. The rationale here is that, according to 'leverage-salience' theory, respondents who have less intrinsic motivation to complete surveys will, all else equal, require a more enjoyable survey experience to incentivise them to respond.
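As a minimal sketch, a discrete-time survival model of this kind can be fitted as a logistic regression on person-period data; the Python example below uses illustrative file and variable names, since the paper's actual specification and software are not stated in the abstract.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical person-period file: one row per panel member per survey
# invitation, up to and including the first nonresponse. Illustrative columns:
# nonresponse (0/1), wave, enjoyment_prev (enjoyment rating from the previous
# completed wave), and Big Five short-form scores.
pp = pd.read_csv("panel_person_period.csv")

# Discrete-time hazard model: logistic regression on person-period data,
# with wave dummies as the baseline hazard and an enjoyment x personality
# interaction to test moderation.
model = smf.logit(
    "nonresponse ~ C(wave) + enjoyment_prev * (openness + conscientiousness"
    " + extraversion + agreeableness + neuroticism)",
    data=pp,
)
result = model.fit()
print(result.summary())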


