Sensitive Topics

This article provides guidance for how to collect data on sensitive topics.

== Read First ==
For certain topics, respondents have incentives to conceal the truth due to taboos, social pressure (e.g. [[Social Desirability Bias]]), or fear of retaliation. This can create bias, the size and direction of which can be hard to predict. To limit it, it is essential to guarantee anonymity and confidentiality, and to develop [[Survey Protocols]] that protect privacy and maximize trust. If this is not sufficient, experimental methods such as the [[Randomized Response Technique]], [[List Experiments]], and [[Endorsement Experiments]] can be used.
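To give a concrete sense of how the Randomized Response Technique recovers a prevalence estimate, here is a minimal sketch in Python. The coin-flip forced-response design, the made-up answers, and all variable names below are assumptions for illustration only; they are not prescribed by this article.
<syntaxhighlight lang="python">
import numpy as np

# Hypothetical "yes"/"no" answers (1 = "yes") under a forced-response design:
# each respondent privately flips a fair coin, answers "yes" on heads,
# and answers the sensitive question truthfully on tails.
answers = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 0, 1])

p_forced = 0.5            # probability of a forced "yes" (the coin flip)
p_yes = answers.mean()    # observed share of "yes" answers

# P(yes) = p_forced + (1 - p_forced) * prevalence, so invert:
prevalence = (p_yes - p_forced) / (1 - p_forced)

# In small samples the estimate can fall slightly below 0 or above 1.
print(f"Estimated prevalence of the sensitive behavior: {prevalence:.2f}")
</syntaxhighlight>
Because the coin flip is private, no individual answer reveals the respondent's true status; only the aggregate share of "yes" answers is informative.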


== Guidelines ==
=== Survey Design for Sensitive Data ===
* Never start with difficult or sensitive modules! Start with easy questions and work up to harder ones, expecting that the respondent will become increasingly comfortable with, and trusting of, the enumerator as the interview proceeds.
* Survey mode: self-administered questionnaires may provide more accurate data than interviewer-administered interviews
* Frame questions to avoid social desirability bias
* One possible strategy is a [[List Experiments|list experiment]]: ask respondents to count how many statements in a list are true, where the list shown to the treatment group contains '''one''' additional sensitive statement (e.g. "My partner is sometimes violent with me"). The difference in mean counts between the treatment and control arms reveals the prevalence of the sensitive item (or the lack of it), as sketched below.
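The estimation step for the list-experiment strategy above can be illustrated with a minimal Python sketch; the item counts and variable names are made up for illustration.
<syntaxhighlight lang="python">
import numpy as np

# Hypothetical item counts reported by respondents. The control list has
# 4 non-sensitive statements; the treatment list adds one sensitive
# statement, so treatment counts range from 0 to 5.
control = np.array([1, 2, 0, 3, 2, 1, 2, 3, 1, 2])
treatment = np.array([2, 2, 0, 3, 2, 2, 2, 3, 1, 2])

# Difference in mean counts estimates the share of respondents for whom
# the sensitive statement is true.
prevalence = treatment.mean() - control.mean()

# Standard error of a difference in means between independent samples.
se = np.sqrt(treatment.var(ddof=1) / len(treatment)
             + control.var(ddof=1) / len(control))

print(f"Estimated prevalence: {prevalence:.2f} (SE {se:.2f})")
</syntaxhighlight>
No individual respondent's answer to the sensitive statement is ever observed; only the aggregate difference between arms identifies its prevalence.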


=== Survey Protocols for Collecting Sensitive Data ===
* Make sure the respondent knows that responses will never be personally identified. This should be part of the Informed Consent module.
* Interviews should be done privately, without even family members present (especially when discussing issues such as domestic violence)
* Enumerators who share characteristics with the respondent (same gender, age, ethnic group, background) may garner increased trust

== Back to Parent ==
This article is part of the topic [[Questionnaire Design]]


== Additional Resources ==
* DIME Analytics’ guidelines on [https://github.com/worldbank/DIME-Resources/blob/master/survey-instruments.pdf survey design and pilot]
* Survey Methods for Sensitive Topics: https://graemeblair.com/papers/sensitive.pdf
* Bowling, Ann. "Mode of questionnaire administration can have serious effects on data quality." Journal of Public Health 27.3 (2005): 281-291. [https://academic.oup.com/jpubhealth/article/27/3/281/1511097]
* Kreuter, Frauke, Stanley Presser, and Roger Tourangeau. "Social Desirability Bias in CATI, IVR, and Web Surveys: The Effects of Mode and Question Sensitivity." Public Opinion Quarterly 72.5 (2009): 847-865. doi:10.1093/poq/nfn063 [https://academic.oup.com/poq/article/72/5/847/1833162]


[[Category: Questionnaire Design]]
