Internet-based surveys: relevance, methodological considerations and troubleshooting strategies
===============================================================================================

* Vikas Menon
* Aparna Muraleedharan

* research design
* data accuracy
* data collection

Sir,

Internet-based surveys have steadily gained popularity with researchers because of their myriad advantages, such as the ability to reach a larger pool of potential participants within a shorter period of time (vis-à-vis face-to-face surveys), access to subjects who may be geographically dispersed or otherwise difficult to reach, and efficient data management and collation.1 2 This is in addition to obvious attractions such as convenience, relatively low cost and user-friendly features such as a comfortable pace and an enhanced sense of participant control. With the advent of the COVID-19 pandemic and dwindling opportunities for face-to-face data collection, internet-based tools offer a powerful alternative for rapid data collection. Moreover, from a public health perspective, they could be useful tools to track public perceptions, myths and misconceptions3 in times of disaster.

Many methodological issues confront a prospective researcher while designing online questionnaires/surveys. A few of them, with some corresponding suggestions for troubleshooting, are outlined below:

1. Web or mailed questionnaire?—A meta-analysis of 39 studies4 concluded that response rates to mail surveys are, in general, higher than those to web surveys. Interestingly, two important factors underlying this variation were the type of respondent and the medium of follow-up reminders. While college students were more responsive to web surveys, physicians and laypersons were found to be more receptive to mail surveys. Further, follow-up reminders were more effective when given by mail, probably owing to greater personalisation, than on the web. Recently, a hybrid method called the web-push survey, wherein initial and subsequent follow-up contacts are made by mail to request a response by web, was found to be a parsimonious method of eliciting responses.5 However, results on combining web and mail surveys have not been consistent. Examination of different methods of using web portals for surveying consumers about their experience with medical care revealed that response rates were higher for mail questionnaires than for web-based protocols.6 A study of physicians7 found that initial mailing of the questionnaire, followed by web surveying of non-responders, increased response rates and enhanced the representativeness of the sample. For surveys with a shorter time frame, the reverse method, namely an initial web survey followed by mailing the questionnaire to non-responders, was recommended, though key outcome variables did not differ between these data collection methods. The message appears to be that hybridisation involving both web and mail surveys may be a resource-effective way to augment response rates, though the optimal method of combining them may differ based on respondent and survey characteristics.

2. How to enhance response rates?—Response rates to email surveys are highly variable and traditionally in the range of 25%–30%, especially without follow-up reminders or reinforcements.8 Reasons for this include survey fatigue, competing demands and privacy concerns.
These response rates are much lower than traditional response rates to telephonic surveys, which in turn are lower than response rates to face-to-face surveys.9 10 However, with the use of multimode approaches, the response rate can be improved to as much as 60%–70%.11 Substantial evidence exists on methods to enhance response rates in internet surveys. More than a decade ago, a Cochrane review recommended prenotifying respondents, shortening questionnaires, incentivising responses and making multiple follow-up contacts.12 Subsequently, personalised invitations,13 personalisation of reminders,14 dynamic strategies such as changes in the wording of reminders15 and inclusion of Quick Response (QR) codes16 have all been found to augment response rates (a brief sketch of QR code generation is given after this list).

3. How long is too long?—In academic surveys, the association between questionnaire length and response rates, particularly for web surveys, is weak and inconsistent.17 18 Considering the attention span of an average adult, 20 min has been recommended as the maximum length of web surveys by market research experts.19 However, it is difficult to be dogmatic about this, as many non-survey factors, such as familiarity with the device and platform, may affect interest and attention span.20 On current evidence, restricting survey length to below 13 min may offer the best balance between response burden and data quality.21

4. What are the optimal frequency and number of reminders?—Sending reminders has been noted to positively influence survey response rates.21 Response rates appear to improve up to a maximum of three to four reminders, beyond which concerns about spamming or pressurising the respondents increase.14 Less is known about the optimal frequency of reminders. Blumenberg and others (2019) noted that sending reminders once every 15 days was associated with higher response rates, especially for the first two questionnaires in a series.17 Researchers should carefully weigh the effort involved in sending multiple reminders against the risks of survey fatigue and lower engagement in later surveys, especially when multiple rounds of surveys are indicated.

5. How to minimise the non-representativeness of the sample?—In a typical internet survey, respondents are not selected through probability sampling, and this may affect the generalisability of findings. A larger sample will not solve this problem because information about non-responders is not available. One solution, though more resource-intensive, would be to randomly select the required number of respondents from a defined list (such as a professional membership directory in a survey of professionals), seek their consent through telephone calls, collect basic sociodemographic information from those who decline to participate and send the questionnaire only to those who consent, with a request not to forward it to anyone else.22 This prevents snowballing, which can also affect sample representativeness. Further, if the sociodemographic profile of those who declined matches that of those who consented, then we can conclude that responders are representative of the intended population (the first sketch after this list illustrates such a check). Of course, this strategy will only work when a list of potential respondents is available. To avoid missing out on those with no internet access or low-frequency users, investigators may contact people randomly selected from a list through another mode (telephone, face to face or mail) and ask them to complete the survey. Further, allowing respondents to complete the survey in a variety of modes (web/mail/telephone) may be another strategy to avoid missing out on those with limited access to the web.
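To make the representativeness check in point 5 concrete, the sketch below draws a random sample of invitees from a defined list and compares the sociodemographic profile of those who consent with those who decline. It is a minimal illustration only, not part of the studies cited: the file name, column names and sample size are hypothetical placeholders, and it assumes the pandas and SciPy packages are available.

```python
# Minimal sketch of the representativeness check described in point 5.
# Assumptions: a CSV 'membership_directory.csv' with one row per potential
# respondent, a categorical 'sex' column and a boolean 'consented' column
# recorded during the telephone contact. All names here are hypothetical.
import pandas as pd
from scipy.stats import chi2_contingency

directory = pd.read_csv("membership_directory.csv")

# Randomly select the required number of invitees from the sampling frame
# (instead of circulating an open, forwardable survey link).
invitees = directory.sample(n=200, random_state=42)

consenters = invitees[invitees["consented"]]
decliners = invitees[~invitees["consented"]]
print(f"{len(consenters)} consented, {len(decliners)} declined")

# Compare a basic sociodemographic variable (here, sex) across the two groups.
table = pd.crosstab(invitees["consented"], invitees["sex"])
chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")
# A non-significant difference offers some reassurance that responders
# resemble the intended population on this characteristic.
```

In practice, several characteristics (age, sex, work setting and so on) would be compared, and consent status would come from the telephone contacts rather than from the directory itself.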
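Similarly, the QR code strategy mentioned in point 2 requires little effort to implement. The snippet below is a minimal sketch assuming the third-party Python `qrcode` package (installed with Pillow support); the survey URL and output file name are placeholders.

```python
# Minimal sketch: turn a survey link into a QR code image that can be embedded
# in printed or emailed invitations and reminders.
# Assumes the third-party 'qrcode' package (pip install "qrcode[pil]");
# the URL and file name below are hypothetical placeholders.
import qrcode

survey_url = "https://example.org/my-survey"  # placeholder survey link
img = qrcode.make(survey_url)                 # build the QR code image
img.save("survey_invitation_qr.png")          # embed this image in the invitation
```

Respondents can then scan the code with a phone camera and be taken directly to the survey, avoiding the need to type a long URL.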
In conclusion, hybrid survey designs incorporating email surveys with web-based reminders probably lead to quicker and higher response rates. Fortnightly, or even weekly, reminders (to a maximum of three to four) would serve to enhance response rates. Questionnaire length is not consistently associated with response rates, but limiting the survey to under 20 min may be a pragmatic consideration. Other suggestions to improve response rates, which must be treated as preliminary owing to limited evidence, include sending personalised email reminders, modifying reminder content over the survey life cycle and creating QR codes that seamlessly direct respondents to the survey.

## Footnotes

* Contributors VM conceptualised the work, did the review of the literature and wrote the first draft of the manuscript. AM co-conceptualised the work, contributed to the review of the literature and revised the first draft of the manuscript for intellectual content. Both authors read and approved the final version of the manuscript.
* Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.
* Competing interests None declared.
* Patient consent for publication Not required.
* Provenance and peer review Not commissioned; externally peer reviewed.
* Received April 29, 2020.
* Revision received May 27, 2020.
* Accepted June 28, 2020.
* © Author(s) (or their employer(s)) 2020. Re-use permitted under CC BY-NC. No commercial re-use. See rights and permissions. Published by BMJ.

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made are indicated, and the use is non-commercial. See: [http://creativecommons.org/licenses/by-nc/4.0/](http://creativecommons.org/licenses/by-nc/4.0/).

## References

1. Ahern NR. Using the Internet to conduct research. Nurse Res 2005;13:55–70. doi:10.7748/nr2005.10.13.2.55.c5968
2. Lefever S, Dal M, Matthíasdóttir Á. Online data collection in academic research: advantages and limitations. Br J Educ Technol 2007;38:574–82. doi:10.1111/j.1467-8535.2006.00638.x
3. Geldsetzer P. Use of rapid online surveys to assess people's perceptions during infectious disease outbreaks: a cross-sectional survey on COVID-19. J Med Internet Res 2020;22:e18790. doi:10.2196/18790
4. Shih T-H. Comparing response rates from web and mail surveys: a meta-analysis. Field Methods 2008;20:249–71. doi:10.1177/1525822X08317085
5. Delnevo CD, Singh B. The effect of a web-push survey on physician survey response rates: a randomized experiment. Health Economics and Outcome Research 2019.
6. Fowler FJ, Cosenza C, Cripps LA, et al. The effect of administration mode on CAHPS survey response rates and results: a comparison of mail and web-based approaches. Health Serv Res 2019;54:714–21. doi:10.1111/1475-6773.13109
7. Beebe TJ, Locke GR, Barnes SA, et al. Mixing web and mail methods in a survey of physicians. Health Serv Res 2007;42:1219–34. doi:10.1111/j.1475-6773.2006.00652.x
8. Fincham JE. Response rates and responsiveness for surveys, standards, and the journal. Am J Pharm Educ 2008;72:43. doi:10.5688/aj720243
9. Fricker S, Galesic M, Tourangeau R. An experimental comparison of web and telephone surveys. Public Opin Q 2005;69:370–92. doi:10.1093/poq/nfi027
10. Aquilino WS. Telephone versus face-to-face interviewing for household drug use surveys. Int J Addict 1991;27:71–91. doi:10.3109/10826089109063463
11. Yun GW, Trumbo CW. Comparative response to a survey executed by post, e-mail, & web form. J Comput Mediat Commun 2000;6:1–2. doi:10.1111/j.1083-6101.2000.tb00112.x
12. Edwards PJ, Roberts I, Clarke MJ, et al. Methods to increase response to postal and electronic questionnaires. Cochrane Database Syst Rev 2009:MR000008. doi:10.1002/14651858.MR000008.pub4
13. Short CE, Rebar AL, Vandelanotte C. Do personalised e-mail invitations increase the response rates of breast cancer survivors invited to participate in a web-based behaviour change intervention? A quasi-randomised 2-arm controlled trial. BMC Med Res Methodol 2015;15.
14. Muñoz-Leiva F, Sánchez-Fernández J, Montoro-Ríos F, et al. Improving the response rate and quality in web-based surveys through the personalization and frequency of reminder mailings. Qual Quant 2010;44:1037–52. doi:10.1007/s11135-009-9256-5
15. Sauermann H, Roach M. Increasing web survey response rates in innovation research: an experimental study of static and dynamic contact design features. Res Policy 2013;42:273–86. doi:10.1016/j.respol.2012.05.003
16. Harrison S, Henderson J, Alderdice F, et al. Methods to increase response rates to a population-based maternity survey: a comparison of two pilot studies. BMC Med Res Methodol 2019;19:65. doi:10.1186/s12874-019-0702-3
17. Blumenberg C, Menezes AMB, Gonçalves H, et al. The role of questionnaire length and reminders frequency on response rates to a web-based epidemiologic study: a randomised trial. Int J Soc Res Methodol 2019;22:625–35. doi:10.1080/13645579.2019.1629755
18. Cook C, Heath F, Thompson RL. A meta-analysis of response rates in web- or internet-based surveys. Educ Psychol Meas 2016.
19. Revilla M, Ochoa C. Ideal and maximum length for a web survey. Int J Mark Res 2017;59:9.
20. Antoun C, Couper MP, Conrad FG. Effects of mobile versus PC web on survey response quality: a crossover experiment in a probability web panel. Public Opin Q 2017;81:280–306.
21. Fan W, Yan Z. Factors affecting response rates of the web survey: a systematic review. Comput Human Behav 2010;26:132–9. doi:10.1016/j.chb.2009.10.015
22. Ameen S, Praharaj S. Problems in using WhatsApp groups for survey research. Indian J Psychiatry 2020;62:327. doi:10.4103/psychiatry.IndianJPsychiatry_321_20

Dr Vikas Menon obtained a bachelor's degree from the Jawaharlal Institute of Postgraduate Medical Education and Research (JIPMER), Puducherry, in 2004. He completed postgraduate training in psychiatry at JIPMER in 2008. He has been a faculty member at JIPMER, Puducherry, since 2012 and is currently an Additional Professor of Psychiatry. His main research interests include digital psychiatry, suicidology and mood disorders.