= Remote Surveys =
Depending on whether data is collected in-person or remotely, there are two methods of [[Primary Data Collection|primary data collection]]: '''remote surveys''' and [[Field Surveys|field surveys]]. The [[Impact Evaluation Team|impact evaluation team]] (or research team) can use '''remote surveys''' either to carry out a '''follow-up''' survey after an initial round of [[Field Surveys|field surveys]], or when the study area is not accessible, for instance because of a conflict. In such cases, the '''research team''' can still collect data that helps address common development challenges, such as lack of access to water, electricity, or healthcare.
== Read First ==
* [[Primary Data Collection|Primary data collection]] is the process of gathering data through surveys, interviews, or experiments.
* See the guidelines on [[Preparing for Remote Data Collection|preparing for remote data collection]].
* [https://www.worldbank.org/en/research/dime/data-and-analytics DIME Analytics] has created [https://github.com/worldbank/DIME-Resources/blob/master/survey-preparing.pdf guidelines for data collection].
* Depending on the medium over which they are conducted, '''remote surveys''' can take different forms: [[Remote Surveys#Phone Surveys (CATI)|phone surveys]], [[Remote Surveys#Other Remote Surveys|web surveys]], and [[Remote Surveys#Other Remote Surveys|recorded surveys]].
== Phone Surveys (CATI) ==
In [[Phone Surveys (CATI)|phone surveys]] or '''computer-assisted telephone interviews (CATI)''', the enumerator calls the respondent and asks them questions over the telephone. The enumerator enters the responses into a programmed digital survey, which is then shared electronically with the research team. '''Phone surveys''' have two major advantages. First, they allow the [[Impact Evaluation Team|research team]] to [[Primary Data Collection|collect data]] remotely, which is cost-effective since collecting data through face-to-face interviews can often be expensive. Second, they are a good option for collecting data during emergencies or in conflict areas while still retaining the personal interaction that researchers value.
However, when a '''research team''' decides to use a phone survey to collect data, it must keep in mind concerns about:
* [[Phone Surveys (CATI)#Feasibility|Feasibility]], that is, assessing whether a phone survey can be carried out at all, for example by checking what share of the sample has working phone numbers (see the sketch after this list).
* [[Phone Surveys (CATI)#Preparation|Preparation]], that is, planning the steps and guidelines involved in [[Preparing for Remote Data Collection|remote data collection]].
* [[Phone Surveys (CATI)#Communication|Communication]], that is, establishing a chain of command and strictly following the established [[Preparing for Remote Data Collection#Protocols|protocols for remote data collection]].
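As a rough illustration of the feasibility check, the minimal Python sketch below uses toy data and hypothetical column names (<code>hh_id</code>, <code>phone_number</code>, <code>outcome</code>) to compute two numbers a research team typically looks at before committing to CATI: phone-number coverage in the sample frame, and the response rate implied by the enumerators' call log.
<syntaxhighlight lang="python">
# Minimal sketch (hypothetical column names): assess phone-survey feasibility
# by checking phone-number coverage in the baseline sample, then compute the
# response rate implied by a call log.
import pandas as pd

# Hypothetical baseline data: one row per respondent from the field survey.
baseline = pd.DataFrame({
    "hh_id": [101, 102, 103, 104, 105],
    "phone_number": ["0771234567", None, "0779876543", "", "0775550123"],
})

# Coverage: share of the sample frame with a usable phone number.
has_phone = baseline["phone_number"].fillna("").str.strip() != ""
coverage = has_phone.mean()
print(f"Phone-number coverage: {coverage:.0%}")

# Hypothetical call log: one row per call attempt, with the outcome.
call_log = pd.DataFrame({
    "hh_id": [101, 101, 103, 105],
    "outcome": ["no_answer", "completed", "completed", "refused"],
})

# Response rate: completed interviews over all households with a phone number.
completed = call_log.loc[call_log["outcome"] == "completed", "hh_id"].nunique()
response_rate = completed / has_phone.sum()
print(f"Response rate among households with a phone number: {response_rate:.0%}")
</syntaxhighlight>
In practice, the same calculations would be run on the study's actual sample frame and call log rather than the toy data shown here.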
== Other Remote Surveys ==
Besides phone surveys, there are other forms of '''remote data collection''' as well. These include '''web surveys''' and '''recorded surveys'''.
=== Web surveys ===
In a '''web survey''', the [[Impact Evaluation Team|research team]] directly shares a link to a [[Questionnaire Programming|programmed questionnaire]] with respondents via text or email, without relying on an enumerator. The respondent then clicks the link and answers the questions on a phone or a laptop. Web surveys therefore require respondents to have internet access and to be able to read and understand a written questionnaire.
Web surveys are useful because they reduce survey costs considerably, as no enumerators are involved. They can also be programmed easily using Open Data Kit (ODK)-based tools, without much modification in the drafting process. However, they suffer from lower '''response rates''', which arise due to [[Selection Bias|self-selection]].
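As a rough illustration of how self-selection can be detected, the minimal Python sketch below (using toy data and a hypothetical <code>completed_web_survey</code> flag) compares baseline characteristics of respondents and non-respondents to a web survey. Large gaps between the two groups suggest that those who clicked the link are not representative of the full sample frame.
<syntaxhighlight lang="python">
# Minimal sketch (hypothetical variable names): check for self-selection in a
# web survey by comparing baseline characteristics of respondents and
# non-respondents.
import pandas as pd

# Hypothetical sample frame with baseline characteristics and a flag for
# whether each person completed the web survey.
frame = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4, 5, 6],
    "age":           [24, 41, 37, 29, 52, 33],
    "urban":         [1, 0, 1, 1, 0, 1],
    "completed_web_survey": [1, 0, 1, 1, 0, 0],
})

# Compare means of baseline characteristics by response status.
balance = frame.groupby("completed_web_survey")[["age", "urban"]].mean()
print(balance)

# Overall response rate for the web survey.
print(f"Response rate: {frame['completed_web_survey'].mean():.0%}")
</syntaxhighlight>
In practice, the comparison would use the study's actual baseline variables, and differences would be tested formally rather than inspected by eye.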
=== Recorded surveys ===
In a '''recorded survey''', the questionnaire is pre-recorded in either written or oral form. Examples of recorded surveys include '''interactive voice response (IVR) surveys''', '''short message service (SMS)''' surveys, and surveys shared via social media and email. These surveys are effective when they are very short (4-5 questions and/or less than 5 minutes on average), have zero or minimal [[SurveyCTO_Programming_Work_Flow#Pseudo Code|skip patterns]], and the questions are straightforward to understand.
The downside of a recorded survey is the lack of human interaction: the respondent cannot ask for clarification on a question. This can lead to lower '''response rates''' than phone surveys or in-person interviews. Recorded surveys can also produce data that is '''noisy''', that is, answers that vary a lot or are vague because respondents could not interpret the questions accurately.
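As a rough illustration of how short and flat a recorded survey needs to be, the minimal Python sketch below encodes a hypothetical three-question SMS-style questionnaire with a single skip pattern. The <code>get_answer</code> callable is a stand-in for whatever SMS or IVR gateway actually delivers the questions and collects the replies.
<syntaxhighlight lang="python">
# Minimal sketch (hypothetical questions): a very short SMS-style questionnaire
# with a single skip pattern, stored as a list of question dictionaries. Each
# question is only asked if its "ask_if" condition (if any) holds for the
# answers collected so far.
QUESTIONS = [
    {"name": "has_electricity", "text": "Does your household have electricity? (yes/no)"},
    {"name": "outage_days",
     "text": "How many days last week was the power out?",
     "ask_if": lambda answers: answers.get("has_electricity") == "yes"},
    {"name": "water_source", "text": "What is your main source of drinking water?"},
]

def run_survey(get_answer):
    """Walk through the questionnaire, applying the skip logic.

    `get_answer` is any callable that takes the question text and returns the
    respondent's reply (e.g. a function wrapping an SMS or IVR gateway).
    """
    answers = {}
    for q in QUESTIONS:
        condition = q.get("ask_if")
        if condition is not None and not condition(answers):
            continue  # skip pattern: question does not apply
        answers[q["name"]] = get_answer(q["text"]).strip().lower()
    return answers

# Example run with canned replies instead of a real SMS gateway.
canned = iter(["yes", "2", "piped water"])
print(run_survey(lambda text: next(canned)))
</syntaxhighlight>
Keeping the flow this simple, with at most one or two conditions, is what makes such surveys workable without an enumerator.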
== Related Pages ==
[[Special:WhatLinksHere/Remote_Surveys|Click here for pages that link to this topic]].
== Additional Resources ==
* Andrew et al. (World Bank Group), [http://documents.worldbank.org/curated/en/877231468391801912/pdf/106741-PUB-BOX396275B-PUBLIC-PUBDATE-7-6-16.pdf General guide on conducting phone surveys]
* DIME Analytics (World Bank), [https://paper.dropbox.com/doc/A-Guide-to-WFH-Data-Collection-AwiZ_7GxbzaF8aXCRri8qCAg-DVhtGiokwboliGBqmjYfS Guide to work-from-home data collection]
* Özler and Cuevas (World Bank), [https://blogs.worldbank.org/impactevaluations/reducing-attrition-phone-surveys?CID=WBW_AL_BlogNotification_EN_EXT Blog post on reducing attrition in phone surveys]
* Goldstein and Kondylis (World Bank), [https://blogs.worldbank.org/impactevaluations/impact-evaluations-time-covid-19-part-1 Blog post on impact evaluations in the time of COVID-19]
* Markus Goldstein (World Bank), [https://blogs.worldbank.org/impactevaluations/calling-it-in-using-phones-for-repeat-surveys Blog post on using phones for repeat surveys]
* Herath et al. (J-PAL Africa), [https://www.saldru.uct.ac.za/2019/09/27/surveying-young-workseekers-in-south-africa/ Blog post on increasing response rates in surveys]
* IPA, [https://www.poverty-action.org/sites/default/files/IPA-Policy-on-Restarting-Face-to-Face-Data-Collection.pdf Restarting face-to-face data collection]
* IPA, [https://www.poverty-action.org/publication/remote-surveying-pandemic-handbook IPA Handbook: Remote Surveys in a Pandemic]
* J-PAL, [https://www.povertyactionlab.org/blog/3-20-20/best-practices-conducting-phone-surveys Best practices for conducting phone surveys]
* J-PAL, [https://www.povertyactionlab.org/sites/default/files/files/2020/03/spouse-and-gender-relations-phone-survey-protocol.pdf Protocols for phone surveys]
* J-PAL South Asia, [https://www.povertyactionlab.org/sites/default/files/research-resources/2020/03/transitioning-to-CATI-Checklists.pdf Checklists and resources for transitioning to CATI]
* Nagpal et al., [https://gh.bmj.com/content/6/Suppl_5/e005610.full Who do phone surveys miss, and how to reduce exclusion: recommendations from phone surveys in nine Indian states]
* SurveyCTO, [https://www.surveycto.com/videos/data-collection-and-phone-surveying/ Adapting to COVID-19]
* SurveyCTO, [https://www.surveycto.com/blog/decentralized-data-collection/ Conducting remote and decentralized data collection with SurveyCTO]
* Tavneet Suri, [https://www.youtube.com/watch?v=ZMe_YU18BOI#action=share Webinar on adaptations for phone surveys]
* Patricia Paskov and Gailius Praninskas, [https://blogs.worldbank.org/impactevaluations/we-ran-video-survey-heres-how-it-worked-guest-blog-post/ Blog post on video surveys]

[[Category: Research Design]]