Spending wisely: Investigating survey mode effects in discrete choice experiment responses

Lead Research Organisation: University of Aberdeen
Department Name: Institute of Applied Health Sciences

Abstract

To collect information from patients or the general public, researchers often use surveys. In health economics, surveys are used to ask patients or the general public about the kind of National Health Service (NHS) they would like. Typically, these surveys have been sent by post to a random sample of people. Over time the number of people who answer postal surveys has decreased, and recently researchers have turned to the internet to collect information. Internet surveys have several advantages compared to postal surveys: they are cheaper, faster, and can include pictures or videos. But using the internet may change who answers the survey, how they answer the survey questions, and how easy it is for them to provide accurate answers.

Not everyone in the UK has access to and uses the internet: 27% of the UK population do not have internet access, either through dial-up or broadband connections. People without internet access are, on average, older, poorer and less likely to live in South-East England than those with access. Furthermore, a random sample of UK internet users cannot be asked to complete the survey. First, there is no database of all email addresses. Second, even if there were, sending people unsolicited email (SPAM) is illegal. For these reasons, researchers using the internet to collect information usually pay market research companies for access to their 'online panels'. In the UK, these online panels are groups of people who have volunteered to answer surveys online for market research companies in exchange for rewards (money, vouchers, or entry to a prize draw). Market research companies advertise on many websites and search engines to encourage people to volunteer. In 2010 the American Association for Public Opinion Research (AAPOR) advised that "researchers should avoid [volunteer] online panels when one of the research objectives is to accurately estimate population values".

Studies have used internet surveys to collect information about patients' or the public's preferences for health care, but few have tested if internet surveys give different results from surveys sent by post or interviews in people's homes. The aim of this research project is to address this knowledge gap and test whether different ways of collecting survey information affect a study's results. In particular, are the preferences reported in internet surveys different from those reported in postal surveys or interviews?

We will compare four ways to collect survey information: internet panel survey, postal survey, postal invitation to complete an internet survey and in-person interviews. We will compare answers to a survey asking the general public about their use of community pharmacies to manage minor illness. We will consider:

- How well respondents to each survey represent the general population
- If the answers people give are different across the different surveys and if this would lead us to different conclusions about the type of community pharmacist service people prefer
- If the quality of people's answers is better for some surveys than others
- By how much the cost of collecting information differs across the different survey types.

The results of this research will help researchers to make decisions about how they collect information from patients or the general public in the future.

Technical Summary

Surveys of patient or public preferences are used in health economics research. While in-person interviews are recommended for preference elicitation research, they are expensive, and researchers have used mail surveys or, increasingly, internet surveys instead. The survey mode may affect who responds to the survey, how respondents answer the questions and their ability to provide accurate responses. Only one health economics study has investigated the effect of survey mode on the data obtained.

This study aims to investigate the effect of survey mode on responses to a questionnaire asking the general population about their preferences for health care. We will compare four survey modes: internet panel survey, mail survey, mail invitation to complete an internet survey, and in-person interviews. The surveys will elicit preferences for a health care good relevant to the general population: the use of community pharmacies for managing minor illness. No previous research has compared all four survey modes in a preference elicitation study. An identical questionnaire will be used to collect data across the four survey modes and data collection for each mode will take place at the same time. For each mode, we will consider three measures of data quality (sample representativeness, response rate, response validity). We will test if the elicited preferences are significantly different across modes, and will also compare the survey cost across modes. To achieve the study aims, the proposed research has five objectives:

- to test if respondents to each mode are representative of the general population
- to test if the elicited preferences are significantly different across modes
- to explore the use of statistical techniques to account for differences in respondent characteristics across modes
- to test if response validity varies across modes
- to compare research costs across survey modes.
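The second objective could, for example, be examined with a likelihood-ratio test: fit one pooled choice model to all four modes, fit a separate model per mode, and test whether the mode-specific fits improve the log-likelihood by more than chance. The sketch below is illustrative only; the log-likelihood values, parameter count, and number of modes are hypothetical placeholders, not results or methods from this study.

```python
# Hedged sketch: likelihood-ratio test of preference equality across survey modes.
# All numbers below are hypothetical, for illustration only.
from scipy.stats import chi2

ll_pooled = -2510.4                              # maximised log-likelihood, all modes pooled
ll_by_mode = [-640.2, -615.8, -622.1, -601.9]    # maximised log-likelihood per mode
k = 6                                            # preference parameters per model (assumed)
n_modes = 4

lr = 2 * (sum(ll_by_mode) - ll_pooled)           # LR statistic
df = k * (n_modes - 1)                           # extra parameters in the mode-specific fits
p_value = chi2.sf(lr, df)                        # upper-tail chi-squared probability
print(f"LR = {lr:.1f}, df = {df}, p = {p_value:.4g}")
```

A small p-value would indicate that allowing preferences to differ by mode fits the data significantly better than a single pooled model.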

Planned Impact

This research will directly benefit health economists, health services researchers, and social scientists who conduct research using surveys and in particular researchers who use surveys to measure preferences. While in-person interviews have long been advocated for preference elicitation studies, mail surveys were frequently used because they were cheaper. In recent years, however, health economists have faced lower response rates to mail surveys and increasingly have used the internet to collect survey data. However, the survey mode may also affect who responds to the survey, how respondents answer the questions and their ability to provide accurate responses. In 2010 the American Association for Public Opinion Research (AAPOR) cautioned that "researchers should avoid [volunteer] online panels when one of the research objectives is to accurately estimate population values".

This study will systematically investigate the effect of survey mode on the quality of responses to a survey of the general population about their preferences. We will compare four modes: internet panel survey, mail survey, mail invitation to complete an internet survey and in-person interviews. No previous research has compared all four modes in a preference elicitation study. The results of this research will provide researchers with a characterisation and quantification of the advantages and disadvantages of each mode and thus allow them to make an informed decision about which mode(s) to use in their research.

The proposed research will have an indirect impact on population health, wealth and culture. UK government decision making uses cost benefit analysis (CBA) to compare the costs, benefits, and risks of policies and proposals. CBA requires that costs and benefits are measured. For many costs and benefits, such as the health benefits of new drug treatments for cancer or the environmental benefits of reducing greenhouse gas emissions, no market price exists. If a value for these benefits is not included in government decisions, then public spending will not reflect all that society believes to be important. Surveys are used to elicit values for these non-market costs and benefits. This research will provide a better understanding of how the choice of survey mode affects the quality of the data obtained and thus the validity of the values inferred.

The research will also indirectly benefit users of survey results. For instance, the results of in-person surveys to measure preferences for health and health care have fed into measures such as the EQ-5D, which are widely used in cost-effectiveness studies of health technologies. Our results will benefit users such as the National Institute for Health and Care Excellence (NICE), and other health care decision makers who make resource allocation decisions, when they have to compare preferences elicited using different survey modes and potentially different populations.

Publications


 
Description MRC Methodology Grant
Amount £232,152 (GBP)
Organisation Medical Research Council (MRC) 
Sector Academic/University
Country United Kingdom of Great Britain & Northern Ireland (UK)
Start 02/2012 
End 07/2014
 
Title CAPI data set 
Description The data collected using the CAPI survey has been deposited with the UK Data Archive under a Creative Commons Attribution-ShareAlike 4.0 International licence 
Type Of Material Database/Collection of data 
Year Produced 2016 
Provided To Others? Yes  
Impact No impacts yet 
 
Description HERU - DCE Methodology 
Organisation Medical Research Council (MRC)
Department MRC Research Grant
Country United Kingdom of Great Britain & Northern Ireland (UK) 
Sector Public 
PI Contribution I am a co-applicant on a grant application, 'Spending wisely: Investigating survey mode effects in discrete choice experiments' (MRC Methodology Grant). I contributed to the research application and protocol. I am one of two researchers responsible for delivering the research (including ongoing development of the research materials and methods). Dr Verity Watson of HERU is the principal investigator and researcher responsible for delivering the research. Professor Mandy Ryan of HERU is a co-applicant; she contributes to team meetings, steering group meetings and ongoing conduct of the study.
Collaborator Contribution This research was funded through an MRC Methodology Grant
Impact This collaboration resulted in the award of funding via an MRC Methodology Grant. It is a multidisciplinary collaboration (health economists and a pharmacist)
Start Year 2012
 
Description Mode and perspective 
Organisation University of Sheffield
Country United Kingdom of Great Britain & Northern Ireland (UK) 
Sector Academic/University 
PI Contribution Development of grant application, Conducted literature review, Developed a poster for presentation at a conference, Wrote a journal article
Collaborator Contribution The collaboration started in 2014 with the development of a grant application to build on our research in this grant and collaborators' research on two other MRC-funded grants. The grant application was unsuccessful. Since then, VW has continued the collaboration and the team has expanded to address the comments on the previous grant; a new MRC application was submitted in 2016. The collaboration started with a literature review, which was presented as a poster at a UK conference in 2014. Building on this, VW developed a manuscript for submission to a peer-reviewed journal with one of the new collaborators. This paper has been accepted for publication in Health Economics.
Impact Poster at a UK conference. Since 2015 this collaboration has been multidisciplinary, including economists and psychologists. One output so far: Tsuchiya, A. and Watson, V. (forthcoming). Re-thinking "the different perspectives that can be used when eliciting preferences in health". Health Economics (no DOI yet)
Start Year 2014
 
Description Sample representativeness 
Organisation Technical University of Berlin
Country Germany, Federal Republic of 
Sector Academic/University 
PI Contribution I have contributed data and knowledge of survey methods. Collaborators at UoA and TU Berlin have contributed statistical skills
Collaborator Contribution The partnership is developing statistical methods to test multidimensional sample representativeness. The partners have the experience and knowledge of statistical methods needed to develop these tests.
Impact Presentation at the TU Berlin. Not multidisciplinary.
Start Year 2016
 
Description Sample representativeness 
Organisation University of Aberdeen
Department School of Engineering
Country United Kingdom of Great Britain & Northern Ireland (UK) 
Sector Academic/University 
PI Contribution I have contributed data and knowledge of survey methods. Collaborators at UoA and TU Berlin have contributed statistical skills
Collaborator Contribution The partnership is developing statistical methods to test multidimensional sample representativeness. The partners have the experience and knowledge of statistical methods needed to develop these tests.
Impact Presentation at the TU Berlin. Not multidisciplinary.
Start Year 2016
 
Description Haindorf statistics seminar 
Form Of Engagement Activity Participation in an activity, workshop or similar
Part Of Official Scheme? No
Geographic Reach International
Primary Audience Other academic audiences (collaborators, peers etc.)
Results and Impact Talk raised the interest of statisticians in issues of data comparison. I received useful feedback and ideas

None that I am aware of
Year(s) Of Engagement Activity 2014
 
Description Health services research and pharmacy practice conference 
Form Of Engagement Activity A talk or presentation
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Health professionals
Results and Impact The talk prompted questions afterwards

None that I am aware of
Year(s) Of Engagement Activity 2014
 
Description Policy Brief 
Form Of Engagement Activity A magazine, newsletter or online publication
Part Of Official Scheme? No
Geographic Reach National
Primary Audience Professional Practitioners
Results and Impact Based on the applied case study, we wrote a 'policy brief' as part of a series in HERU. The policy brief was a two-page summary of our research findings, focusing on the results of most relevance to pharmacy practice and primary care. The policy brief was sent to around 200 people across the UK in pharmacy organisations and primary care organisations, within the NHS, within government and in arm's-length bodies. The policy brief is also available electronically on the HERU website and was tweeted by the HERU account (836 followers); it was retweeted 17 times, including by the Chief Pharmaceutical Officer for Scotland and several pharmacists.
Year(s) Of Engagement Activity 2016
URL http://www.abdn.ac.uk/heru/news/10080/