Depending on whether data is collected in-person or remotely, there are two methods of [[Primary Data Collection|primary data collection]] - '''remote surveys''' and [[Field Surveys|field surveys]]. The [[Impact Evaluation Team|impact evaluation team]] (or research team) can conduct '''remote surveys''' either as a '''follow-up''' survey after an initial round of [[Field Surveys|field surveys]], or when the study area is not accessible, for instance in the case of a conflict. In such cases, the '''research team''' can still collect data that can help address common development challenges such as lack of access to water, electricity, or healthcare.
 
== Read First ==
* [[Primary Data Collection|Primary data collection]] is the process of gathering data through surveys, interviews, or experiments.
* See also: [[Preparing for Remote Data Collection|Preparing for remote data collection]].
* Remote surveys are also a cost-effective method of performing a follow-up after an initial in-person interview.
* [https://www.worldbank.org/en/research/dime/data-and-analytics DIME Analytics] has created the following [https://github.com/worldbank/DIME-Resources/blob/master/survey-preparing.pdf guidelines for data collection].
* Depending on the medium over which they are conducted, '''remote surveys''' can be of different types - [[Remote Surveys#Phone Surveys (CATI)|phone surveys]], [[Remote Surveys#Other Remote Surveys|web surveys]], and [[Remote Surveys#Other Remote Surveys|recorded surveys]].
 
== Phone Surveys (CATI) ==
In [[Phone Surveys (CATI)|phone surveys]] or '''computer-assisted telephone interviews (CATI)''', the enumerator calls the respondent and asks them the questions over the telephone. The enumerator enters the responses into a programmed digital survey, which is then shared electronically with the research team. '''Phone surveys''' have two major advantages. First, they allow the [[Impact Evaluation Team|research team]] to [[Primary Data Collection|collect data]] remotely, which is cost-effective since collecting data through face-to-face interviews can often be expensive. Second, they are a great option for collecting data during emergencies or in conflict areas while still retaining the personal touch that researchers prefer.

However, when a research team decides to use a phone survey for collecting data, they must keep in mind concerns about:
# '''Feasibility''', that is, assessing whether or not a phone survey can be carried out.
# '''Preparation''', that is, planning the steps and guidelines involved in [[Preparing for Remote Data Collection|remote data collection]].
# '''Communication and data quality''', that is, establishing a chain of command and monitoring data quality during data collection.
 
=== Feasibility ===
The research team must keep in mind that in many cases, data collection through phone surveys will not be feasible. In that case, they will have to rely on field data collection, or pause data collection entirely if in-person data collection is not possible either. To decide whether conducting a phone survey is feasible, ask the following questions:
*'''Is it practical to ask these questions over a telephone?'''
This is an important consideration because in some cases, asking questions over the phone can drastically alter the nature of the data collected, or the manner in which respondents interpret the questions. Examples of questions which cannot be asked over the phone include questions on sensitive topics (such as gender-based violence) and questions with test components (such as a math test in a school-based survey).
* '''How long is the survey?'''
General guidelines and best practices suggest that phone interviews should be short, as longer surveys result in lower response rates. If your original questionnaire (for a field survey) took longer than 20 minutes to answer on average, you might need to reduce the length of your questionnaire or the number of questions when transitioning to a phone survey.
However, sometimes it is not possible to reduce the duration of the survey without affecting the outcomes of interest. In such cases, you might consider conducting the survey over multiple phone calls.
* '''Does a majority of the target population have access to telephones?'''
This is important because if the target population does not have access to telephones, the survey will introduce a [[Selection Bias|selection bias]] into the sample. Further, it will lead to lower response rates and reduce the cost-effectiveness of the survey.
* '''Are contact details of the target population easily available?'''
In the case of a '''follow-up survey''' (that is, if a prior round of surveys has already been conducted), it is likely that the research team has already collected contact information for the respondents. But in the case of a '''baseline survey''' (that is, when it is the first round of the survey), if the research team does not have access to contact numbers, they can consider '''random digit sampling (RDS)'''. This is a method of selecting people for telephone surveys by generating or selecting telephone numbers at random. However, it is only possible when the research team has access to a long list of phone numbers from the population.
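As an illustration, drawing from such a list can be sketched as a simple, reproducible random draw. The frame, country prefix, and sample size below are hypothetical, and real deployments would use the sampling tools of their survey platform or statistical software:

```python
import random

def draw_phone_sample(sampling_frame, n, seed=None):
    """Draw a simple random sample of phone numbers from a frame.

    sampling_frame -- list of phone-number strings, e.g. obtained from a
                      telecom operator (an assumption for this sketch)
    n              -- number of respondents to contact
    seed           -- fix this so the exact sample can be re-drawn later
    """
    rng = random.Random(seed)
    # De-duplicate first, so a number listed twice is not twice as
    # likely to be drawn; sort for a stable order across runs.
    frame = sorted(set(sampling_frame))
    return rng.sample(frame, n)

# Hypothetical frame of 10,000 subscriber numbers
frame = ["+000700" + str(i).zfill(6) for i in range(10_000)]
sample = draw_phone_sample(frame, n=500, seed=2022)
```

Fixing the seed matters operationally: the research team can document and re-create the exact list of numbers given to the survey firm.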


If the answers to the above questions are yes, then the research team can move to the next step: [[Preparing for Remote Data Collection|preparing for remote data collection]]. In the next section, we discuss the logistics of data collection for a phone survey.


=== Preparation ===
The steps involved in preparing for phone surveys are different from the steps in [[Preparing for Field Data Collection|preparing for a field survey]]. At each stage of this process, everyone involved in the data collection (government agencies, survey firm, enumerators) must communicate clearly. The research team must carefully work through certain additional steps before conducting a phone survey. These steps deal with the following:
# '''Timeline for a phone survey.''' <br/> For a phone survey, each step leading up to, and including, the data collection will take additional time. It is therefore important to account for this, and allocate a bigger window for each step.
# '''IRB approvals for a phone survey.''' <br/> For a phone survey, the research team must obtain additional approvals from an IRB for aspects like oral consent, [[Random Audio Audits|audio audits]], and incentives for respondents (see [[Preparing for Remote Data Collection#IRB Approvals|IRB approvals for remote data collection]]).
# '''Procurement-related documents.''' <br/> [[Procuring a Survey Firm|Procuring a survey firm]] involves drafting [[Survey Firm TOR|terms of reference (TOR)]] and preparing a [[Survey Budget|budget]] for each component of the survey process. These procurement-related documents should also reflect the fact that the data collection will be conducted over the phone, and not in-person. An example of an additional cost in the case of phone surveys is the cost of setting up a call center for interviewers.
# '''Questionnaire for a phone survey.''' <br/> There are some differences between designing a field survey and designing a phone survey. These involve managing the length of a phone survey, including additional requirements for verbal consent, and adding '''audio audits'''.
# '''Remote data collection protocols.''' <br/> Similarly, the protocols for phone surveys are different from [[Survey Protocols|protocols for field surveys]]. Some of these differences are in terms of the team setup, communication protocols, and data-quality checks.
# '''Enumerator training.''' <br/> The process of [[Enumerator Training|training enumerators]] becomes even more challenging in the case of a work-from-home phone survey. For instance, when a pandemic like COVID-19 prevents the research team from training enumerators in-person, the research team must make certain logistical and content-based changes to the training method. These changes include sharing recorded training sessions, preparing '''enumerators''' for follow-up questions from respondents, and discussing steps to keep respondents engaged.

'''Note:''' These steps will also be useful in case the research team has to transition from an in-person survey to a phone survey, for instance, during a pandemic or natural disaster.
 
=== Communication during data collection ===
Since there are so many aspects involved in the process of data collection, it is important to make sure that all members of the research team communicate frequently and effectively. Communication becomes even more important when data collection is being carried out remotely. Some guidelines to keep in mind are:
* Set up strong monitoring and communication protocols before starting data collection. Test them during training and the pilot, to identify any gaps that might arise.
* Set up a clear chain of command, and make sure each enumerator knows who their '''first point of contact (POC)''' is on the research team.
* Schedule regular calls to check in with each enumerator, and to resolve any concerns they may have.
* Conduct team meetings at least once a week.
* Consider using a team-collaboration and communication tool to maintain a 24/7 channel for communication with supervisors in the survey team. You may add enumerators too if the survey team is small.
* Monitor data carefully, and on a daily basis. Provide regular feedback to enumerators on the quality of data being collected.
 
=== Data quality assurance ===
When dealing with phone surveys, the research team must run data quality checks more extensively. Since phone surveys have an inherent degree of self-selection, which can result in samples that are not entirely representative, it is essential to keep track of outcomes of interest. Therefore, in addition to the guidelines for [[Monitoring Data Quality|monitoring field data quality]], there are some steps that the research team must follow to monitor data quality in the context of phone surveys:
 
* Track progress and completion rates of surveys more closely.
* Since the survey is short, include consistency checks on all questions if possible.
* Include more stringent reviews of enumerator performance, and share this feedback with enumerators regularly.
* Insert [[Random Audio Audits|audio audits]] at predefined or random points of the survey. Then ask the supervisors to review the audio audits on a daily basis and cross-check the responses with the submitted forms. One way to do this is to create an audio-audits form for the supervisor to fill in while reviewing audio audits.
* If conference calls are feasible, the supervisor can shadow the enumerator a few times to ensure that the enumerator is following all guidelines.
* Ask the supervisor to call a few respondents at random to confirm that the enumerator actually called them.
* [[Back Checks|Back checks]] can be run on certain questions as well.
* Carefully track data quality throughout the process of data collection. This is a departure from in-person data collection, where the emphasis on data quality checks is higher during the initial days of data collection.
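A daily completion-rate and consistency check of this kind can be sketched in a few lines. The field names, status codes, and the 80% threshold below are illustrative, not part of any survey platform's schema:

```python
from collections import defaultdict

def daily_quality_report(submissions, min_completion_rate=0.8):
    """Summarise one day of phone-survey submissions.

    submissions -- list of dicts with keys 'enumerator', 'status'
                   ('complete' or 'partial'), 'hh_size', 'n_children'
                   (field names are illustrative only).
    Returns per-enumerator completion rates, the enumerators below the
    completion threshold, and the indices of submissions that fail a
    simple consistency check (children cannot outnumber household members).
    """
    totals, completes = defaultdict(int), defaultdict(int)
    inconsistent = []
    for i, s in enumerate(submissions):
        totals[s["enumerator"]] += 1
        if s["status"] == "complete":
            completes[s["enumerator"]] += 1
        if s["n_children"] > s["hh_size"]:
            inconsistent.append(i)
    rates = {e: completes[e] / totals[e] for e in totals}
    flagged = sorted(e for e, r in rates.items() if r < min_completion_rate)
    return rates, flagged, inconsistent

# Illustrative submissions from one day of calling
subs = [
    {"enumerator": "E01", "status": "complete", "hh_size": 5, "n_children": 2},
    {"enumerator": "E01", "status": "complete", "hh_size": 4, "n_children": 1},
    {"enumerator": "E02", "status": "partial",  "hh_size": 3, "n_children": 4},
    {"enumerator": "E02", "status": "partial",  "hh_size": 6, "n_children": 2},
]
rates, flagged, bad_rows = daily_quality_report(subs)
```

Running a report like this every evening gives supervisors a concrete list of enumerators to follow up with and submissions to re-verify the next morning.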


== Other Remote Surveys ==
Besides phone surveys, there are other forms of '''remote data collection''' as well. These include '''web surveys''' and '''recorded surveys'''.
=== Web surveys ===
In a '''web survey''', the [[Impact Evaluation Team|research team]] directly shares a link to a [[Questionnaire Programming|programmed questionnaire]] with respondents via text or email, without relying on an enumerator. The respondent then answers the questions in the survey by clicking the link, either on a phone or a laptop. Therefore, in the case of web surveys, it is important for respondents to have access to the internet and the ability to read and understand a written questionnaire.


Web surveys are useful because they reduce survey costs considerably, as there are no enumerators involved. If the need to transition from an in-person survey to a web-based survey arises, it can be a very low-cost effort: pre-programmed questionnaires, such as those built with open data kit (ODK)-based tools, can be reused for web-based data collection without the need for a lot of modification. However, web surveys also suffer from lower '''response rates''', which arise due to [[Selection Bias|self-selection]].
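One recurring practical step is generating a personalised link per respondent, so that each submission can be tied back to the sample list. The base URL and hashing scheme below are purely illustrative; in practice the survey platform's own respondent-ID or pre-load mechanism would be used:

```python
import hashlib

BASE_URL = "https://surveys.example.org/endline"  # hypothetical survey URL

def survey_link(respondent_id, salt="endline-2022"):
    """Build a personalised web-survey link with a short, stable token.

    The token identifies the respondent without putting the raw ID in
    the URL. The salt is a project-level secret (illustrative here),
    so tokens cannot be guessed from the ID alone.
    """
    token = hashlib.sha256(f"{salt}:{respondent_id}".encode()).hexdigest()[:10]
    return f"{BASE_URL}?r={token}"

# One link per sampled household, ready to send by text or email
links = {rid: survey_link(rid) for rid in ["HH-0001", "HH-0002"]}
```

Because the token is deterministic, the same mapping from respondent ID to token can be re-created later when merging submissions back onto the sample.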


=== Recorded surveys ===
In a '''recorded survey''', the questionnaire is pre-recorded in either written or oral form. Examples of recorded surveys include '''interactive voice response (IVR) surveys''', '''short message service (SMS) surveys''', and surveys shared via social media and email. These surveys are effective when they are very short (4-5 questions and/or less than 5 minutes on average), have zero or minimal [[SurveyCTO_Programming_Work_Flow#Pseudo Code|skip patterns]], and the questions are very straightforward to understand.
 


The downside of a recorded survey is that there is a lack of human interaction, and the respondent cannot ask for more clarity on a question. This can lead to lower '''response rates''' compared to phone surveys as well as in-person interviews. Recorded surveys can also result in data that is potentially '''noisy''', that is, the answers either vary a lot or are vague because respondents could not interpret questions accurately.
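To see why skip patterns matter here, a questionnaire's flow can be sketched as a small graph of questions: even a single branch doubles the number of paths a pre-recorded IVR or SMS survey must handle, which is why recorded surveys work best with little or no branching. The questions and answer encoding below are illustrative only:

```python
def run_questionnaire(answers, flow):
    """Walk a questionnaire represented as a dict of questions.

    flow maps question id -> (question text, next_fn), where next_fn
    picks the next question id from the answer (None means stop).
    'answers' is a dict of pre-recorded responses, standing in for
    IVR keypresses. Returns the question ids actually asked, in order.
    """
    asked, q = [], "q1"
    while q is not None:
        asked.append(q)
        _text, next_fn = flow[q]
        q = next_fn(answers.get(q))
    return asked

# A 3-question flow with a single skip: q2 is asked only if the
# respondent reports having children.
flow = {
    "q1": ("Do you have children? (1=yes, 2=no)",
           lambda a: "q2" if a == 1 else "q3"),
    "q2": ("Are they enrolled in school? (1=yes, 2=no)",
           lambda a: "q3"),
    "q3": ("Did your household have electricity yesterday? (1=yes, 2=no)",
           lambda a: None),
}
path_yes = run_questionnaire({"q1": 1, "q2": 1, "q3": 1}, flow)
path_no = run_questionnaire({"q1": 2, "q3": 1}, flow)
```

With an enumerator, branching like this is free; in a pre-recorded format every extra branch is another message sequence or voice menu to record and test, so flows are kept as flat as possible.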


== Related Pages ==
[[Special:WhatLinksHere/Remote_Surveys|Click here for pages that link to this topic]].


== Additional Resources ==
* Andrew et al. (World Bank Group), [http://documents.worldbank.org/curated/en/877231468391801912/pdf/106741-PUB-BOX396275B-PUBLIC-PUBDATE-7-6-16.pdf General guide on conducting phone surveys]
* DIME Analytics (World Bank), [https://paper.dropbox.com/doc/A-Guide-to-WFH-Data-Collection-AwiZ_7GxbzaF8aXCRri8qCAg-DVhtGiokwboliGBqmjYfS Guide to work-from-home data collection]
* Özler and Cuevas (World Bank), [https://blogs.worldbank.org/impactevaluations/reducing-attrition-phone-surveys?CID=WBW_AL_BlogNotification_EN_EXT Blog post on reducing attrition in phone surveys]
* Goldstein and Kondylis (World Bank), [https://blogs.worldbank.org/impactevaluations/impact-evaluations-time-covid-19-part-1 Blog post on impact evaluations in the time of Covid-19]
* Markus Goldstein (World Bank), [https://blogs.worldbank.org/impactevaluations/calling-it-in-using-phones-for-repeat-surveys Blog post on using phones for repeat surveys]
* Herath et al. (J-PAL Africa), [https://www.saldru.uct.ac.za/2019/09/27/surveying-young-workseekers-in-south-africa/ Blog post on increasing response rates in surveys]
* IPA, [https://www.poverty-action.org/sites/default/files/IPA-Policy-on-Restarting-Face-to-Face-Data-Collection.pdf Restarting face-to-face data collection]
* IPA, [https://www.poverty-action.org/publication/remote-surveying-pandemic-handbook IPA Handbook: Remote Surveys in a Pandemic]
* J-PAL, [https://www.povertyactionlab.org/blog/3-20-20/best-practices-conducting-phone-surveys Best practices for conducting phone surveys]
* J-PAL, [https://www.povertyactionlab.org/sites/default/files/files/2020/03/spouse-and-gender-relations-phone-survey-protocol.pdf Protocols for phone surveys]
* J-PAL South Asia, [https://www.povertyactionlab.org/sites/default/files/research-resources/2020/03/transitioning-to-CATI-Checklists.pdf Checklists and resources for transitioning to CATI]
* IPA, [https://docs.google.com/document/d/1HhDcTJB0YgouBnN8SOu-05r73gh7pcS4_q2c93qs6aM/edit Guide on conducting phone surveys during a pandemic]
* Nagpal et al., [https://gh.bmj.com/content/6/Suppl_5/e005610.full Who do phone surveys miss, and how to reduce exclusion: recommendations from phone surveys in nine Indian states]
* SurveyCTO, [https://www.surveycto.com/videos/data-collection-and-phone-surveying/ Adapting to COVID-19]
* SurveyCTO, [https://www.surveycto.com/blog/decentralized-data-collection/ Conducting remote and decentralized data collection with SurveyCTO]
* Tavneet Suri, [https://www.youtube.com/watch?v=ZMe_YU18BOI#action=share Webinar on adaptations for phone surveys]
* Patricia Paskov and Gailius Praninskas, [https://blogs.worldbank.org/impactevaluations/we-ran-video-survey-heres-how-it-worked-guest-blog-post/ Blog post on video surveys]
[[Category: Research Design]]
