Purpose – The purpose of this paper is to examine the effect of topic sensitivity and the research design techniques of forced answering (FA) (i.e. respondents cannot proceed if they leave an answer blank) and response options (use of a "prefer not to answer" (PNA) option) on respondents' motives for participating in an internet‐based survey.

Design/methodology/approach – Data were collected in a field experiment in Hong Kong using a 2×2×2 factorial design. The variables manipulated were topic sensitivity, use of FA, and response options. The dependent variables were eight specific motives obtained from responses to the survey participation inventory (SPI).

Findings – Topic sensitivity has a significant influence on seven of the eight motives. The use of FA does not appear to affect motives. In contrast, the response option "PNA" has a significant effect on all motives except "obligation". The SPI appears to be a viable measure for use with Hong Kong online panellists, and perhaps with other Asian and non‐Western cultures/countries as well.

Research limitations/implications – The present study tested only two specific topics, each with a specific level of sensitivity. Further research should apply the SPI to topics of varying levels of sensitivity. The present study also used a sample of panel members; future research could examine motivation for survey participation with off‐line samples.

Practical implications – There are differences in motivation for survey participation among panellists. The authors relate panellists' motivation to topic sensitivity and confirm that panellists who answered questions about a sensitive topic were less motivated to participate on every motivational aspect except incentives. They also find that the survey design feature of FA is largely unrelated to panellists' motivation.

Originality/value – This is one of the few studies to show the impact of topic sensitivity, FA, and response options on motives for responding. It is the first use of the SPI in a non‐Western culture/nation.
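The 2×2×2 between-subjects design described above can be sketched in code. The snippet below is a minimal illustration, not the authors' analysis: it simulates panellists in each of the eight cells of a topic sensitivity × forced answering × PNA-option design and computes the main-effect mean difference for each factor on a single motive score. The effect sizes, cell sizes, and the assumed pattern (sensitivity lowers motivation, FA has no effect, PNA raises it) are invented to mirror the direction of the reported findings only.

```python
# Hedged sketch of a 2x2x2 between-subjects factorial design, with
# simulated data only. Variable names and effect magnitudes are
# illustrative assumptions, not values from the paper.
import itertools
import random
import statistics

random.seed(0)

FACTORS = ["sensitive_topic", "forced_answering", "pna_option"]

def simulate_motive_score(sensitive, forced, pna):
    """Simulate one panellist's motive score (e.g. a 1-7 SPI item mean).

    Assumed structure for illustration: sensitive topics lower
    motivation, forced answering has no effect, a PNA option raises it.
    """
    base = 4.0
    base -= 0.8 if sensitive else 0.0
    base += 0.5 if pna else 0.0
    return base + random.gauss(0, 0.5)

# Simulate 50 panellists in each of the 8 cells of the design.
data = []
for sens, fa, pna in itertools.product([0, 1], repeat=3):
    for _ in range(50):
        data.append({
            "sensitive_topic": sens,
            "forced_answering": fa,
            "pna_option": pna,
            "score": simulate_motive_score(sens, fa, pna),
        })

def main_effect(factor):
    """Mean score at the high level minus mean score at the low level."""
    hi = [d["score"] for d in data if d[factor] == 1]
    lo = [d["score"] for d in data if d[factor] == 0]
    return statistics.mean(hi) - statistics.mean(lo)

for factor in FACTORS:
    print(f"{factor}: {main_effect(factor):+.2f}")
```

With this simulated structure, the sensitivity effect comes out negative, the forced-answering effect near zero, and the PNA effect positive; a real analysis of such a design would typically use a factorial ANOVA (or MANOVA across the eight motives) rather than raw mean differences.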
Asia Pacific Journal of Marketing and Logistics – Emerald Publishing
Published: Jan 7, 2014
Keywords: Hong Kong; Internet‐based survey method; Survey respondent motives; Topic sensitivity