An exploration of the strengths and weaknesses of using text messaging as a tool for self-report data collection in psychological research 

How should a researcher design an SMS self-report study?

So far, we've seen that there is indeed the capacity for SMS to be used in self-report psychological research, from both a researcher and participant perspective. The next logical questions are how SMS can be applied in a research setting, what quality of data it provides, and how researchers can design their studies to maximise data quality.

It is tempting to jump straight to possible research topics or populations and evaluate the quality of the data SMS provides, potentially in comparison to other modes (e.g. Lim, Hocking, & Hellard, 2008; Suwamaru, 2012; Walsh & Brinker, 2012). Doing so is indeed an important part of investigating the properties of a research tool, and there is a growing literature outlining the varied topics and populations where SMS may be useful (e.g. La Rue, Li, Karimi, & Mitchell, 2012; Lim et al., 2010). However, the literature's focus on the topics of self-report SMS research overlooks some fundamental considerations. This section focusses on the relationship between research design choices when using SMS to collect self-report information and the consequent participant retention, response rates, response delays, and data completeness.

To evaluate how a data collection tool performs, participants need to respond to the study in the first place. The previous section found that people were generally open to the idea of participating in research using SMS, but response rates were worryingly low when they were actually asked to do so. Because of this, the first paper in this section, Delay between recruitment and participation impacts on pre-inclusion attrition, investigates one way in which researchers may retain participants once recruited.

Secondly, once participants are engaged with a study, SMS has limited usefulness as a tool for self-report data collection if it is associated with poor response behaviour: responses that are missing, delayed, or incomplete, and participants dropping out of a study altogether. A fundamental difficulty in comparing the performance of SMS with other data-collection methods is that differences in performance may be moderated by other factors, such as the nature of the questions or participant characteristics, and it is impossible to identify and assess every possible moderator of this kind. Instead, I adopt two practical approaches to the problem. One way of examining how study features may impact on response behaviour in SMS research, or interact with the SMS mode itself, is to look for patterns in the extant literature. The second paper in this section, Response rates where SMS is used as a tool for self-report psychological research: a meta-analysis, does this to provide suggestions regarding how to retain participants, and maximise response rates, when using SMS as a tool for data collection.

The second approach is to design research with an explicitly methodological hypothesis in mind. This provides a clearer picture of the properties of a data collection tool than incidental examination of how that tool fared while a substantive hypothesis was being pursued. The remaining three papers in this section take this approach. Perhaps due to the relatively small literature currently using SMS as a tool for self-report research, the meta-analysis revealed two critical knowledge gaps. The first concerns the temporal factors that may influence response rates, including the time of day and frequency of sampling. These are addressed in the third paper in this section, Temporal considerations for self-report research using Short Message Service. The second concerns how much information can be expected from self-report SMS responses. The fourth paper, As you Likert – cross-mode equivalence of administering lengthy self-report instruments via text message, examines how many questions may be asked via SMS before response behaviour and quality break down. The fifth paper, Short and sweet? Length and informative content of open-ended responses using SMS as a research mode, examines whether the mode limits the length and informative content of the answers it yields.

Delay between recruitment and participation impacts on pre-inclusion attrition

Despite delay between recruitment and active participation being a common aspect of psychological research, its impact on dropout rates has received little research attention. This is likely due to the intuitive sense that longer delays will increase the dropout rate. Pre-inclusion attrition diminishes sample sizes and may threaten data representativeness. One hundred and two university undergraduates were recruited to participate in a short, one-off study via Short Message Service (SMS). Upon receipt of an SMS indicating consent to participate, the researchers delayed sending the study questions for one day, one week, one month, or two months. Delay was significantly associated with response rate: 80% in the one-day condition, 56% at one week, and 42% at one month. No responses were received in the two-month delay condition. This research confirms that the delay between recruitment and active participation impacts on pre-inclusion attrition when conducting research via SMS. [click here to read the paper in full]


Response rates where SMS is used as a tool for self-report psychological research: a meta-analysis

Short Message Service (SMS) is a ubiquitous text-based communication tool with potential applications for collecting self-report research data. This meta-analysis draws on the existing literature to explore whether particular study features are associated with response rates when using SMS as a tool for self-report data collection. SMS-related keywords were used to search six major journal databases. Of the original sample of 1049 papers, 122 studies met the inclusion criterion of participants sending self-report data to researchers via SMS. Response rates averaged 53% across studies. Studies where the research topic was salient to participants, and where participants reported on themselves (rather than on others or on facts), had significantly higher response rates. Incentives and follow-up messages for missed responses were also associated with significantly higher response rates; follow-up reminders were particularly effective for child samples, while incentives were more effective for adult samples. SMS provides acceptable response rates when used as a tool for self-report data collection.

[I'm still considering publishing this, so no full text available online quite yet]
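Since the full text isn't available yet, here is a minimal, generic sketch of how per-study response proportions might be pooled under a random-effects model (DerSimonian-Laird on the logit scale). This is not necessarily the method used in the paper, and the study counts below are invented purely for illustration.

```python
# Generic random-effects pooling of per-study response proportions
# (DerSimonian-Laird on the logit scale). The counts below are made up
# for illustration only; they are NOT the studies from the meta-analysis.
import numpy as np

# (responses received, responses requested) for each hypothetical study
studies = [(120, 200), (45, 110), (300, 420), (60, 150), (80, 95)]

r = np.array([s[0] for s in studies], dtype=float)
n = np.array([s[1] for s in studies], dtype=float)

# Logit-transformed response rates and their approximate variances
y = np.log(r / (n - r))
v = 1.0 / r + 1.0 / (n - r)

# Fixed-effect weights and Q statistic for between-study heterogeneity
w = 1.0 / v
y_fixed = np.sum(w * y) / np.sum(w)
q = np.sum(w * (y - y_fixed) ** 2)
df = len(studies) - 1

# DerSimonian-Laird estimate of between-study variance (tau^2)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - df) / c)

# Random-effects pooled estimate, back-transformed to a proportion
w_star = 1.0 / (v + tau2)
y_pooled = np.sum(w_star * y) / np.sum(w_star)
pooled_rate = 1.0 / (1.0 + np.exp(-y_pooled))

print(f"Pooled response rate: {pooled_rate:.1%} (tau^2 = {tau2:.3f})")
```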

 

Temporal considerations for self-report research using Short Message Service

When using Short Message Service (SMS) as a tool for data collection in psychological research, participants can be contacted at any time. This study examined how sampling frequency and time of day of contact impacted on response rates, response completeness, and response delay in repeated-measures data collected via SMS. Eighty-five undergraduate students completed a six-item self-report questionnaire via SMS, in response to twenty SMS prompts sent on a random schedule. One group responded across two days, the other on a compressed schedule of one day. Overall, there was a high response rate. There was no significant difference in response rate, completeness, or delay between those responding across one or two days. Timing between prompts did not impact on response behaviour. Responses were more likely to be complete if prompts were sent during the working day. The shortest time between prompts was fifteen minutes, and the use of an undergraduate sample limits generalizability. When conducting repeated-measures sampling using SMS, researchers should be aware that more frequent sampling can be associated with poorer data quality, and should aim to collect data during the working day rather than in the mornings or evenings. [click here to read the paper in full]
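As a rough illustration of the kind of prompt schedule described above (twenty prompts at random times, at least fifteen minutes apart), here is a minimal sketch of one way such a schedule could be generated. The 09:00–21:00 sampling window and the specific date are assumptions for illustration; the paper does not specify its window here.

```python
# A minimal sketch of generating a random SMS prompt schedule with a
# minimum gap between prompts. The 09:00-21:00 window is an assumption
# for illustration, not the window used in the study.
import random
from datetime import datetime, timedelta

def random_schedule(n_prompts=20, window_start="09:00", window_end="21:00",
                    min_gap_minutes=15, day=datetime(2014, 1, 1)):
    start = day + timedelta(hours=int(window_start[:2]), minutes=int(window_start[3:]))
    end = day + timedelta(hours=int(window_end[:2]), minutes=int(window_end[3:]))
    window = (end - start).total_seconds() / 60.0

    # Reserve the minimum gaps up front, scatter prompts in the remaining
    # slack, then add the gaps back in; this guarantees the spacing constraint.
    slack = window - (n_prompts - 1) * min_gap_minutes
    if slack < 0:
        raise ValueError("Window too short for this many prompts at this gap.")
    offsets = sorted(random.uniform(0, slack) for _ in range(n_prompts))
    return [start + timedelta(minutes=off + i * min_gap_minutes)
            for i, off in enumerate(offsets)]

for t in random_schedule():
    print(t.strftime("%H:%M"))
```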

I also presented this as "When should I TXT? Temporal factors associated with response rates in an SMS ambulatory assessment paradigm" at the Australasian Mathematical Psychology Society Conference (Canberra, 12th-14th February). Click here to read the slides.

 

As you Likert – cross-mode equivalence of administering lengthy self-report instruments via text message

One of the most widely used data services worldwide, Short Message Service (SMS) offers an unprecedented opportunity for researchers to communicate with participants at any location or time. One concern when using SMS for research is whether the mode's brevity may make it unsuitable for administering multi-question self-report psychological instruments originally developed for paper or online administration. Across two studies, this paper explores the psychometric properties and cross-mode measurement invariance of self-report, Likert-style psychology questionnaires administered via SMS. The first study (n=417) examined this using different-length variants of the same instrument, while the second (n=911) used instruments of varying lengths. Results demonstrated that, whilst some questionnaires were problematic, a self-report Likert-style instrument of up to sixty items can be administered via SMS, with response rates, internal reliability, and factor structures comparable to online administration. However, in instruments over ten items in length, mean responses tended to be higher, leading to a lack of equivalence in latent means and intercepts.

This paper has been accepted into the peer-reviewed 2014 ACSPRI Social Science Methodology Conference proceedings. Click here to read it in full.

This paper was presented at the ACSPRI Social Science Methodology Conference, December 2014, Sydney, Australia. Click here to read the slides.


Short and sweet? Length and informative content of open-ended responses using SMS as a research mode

Short Message Service (SMS) is one of the most widely used data services worldwide. This paper examines the assumption that the 160-character limit forces responses that are brief, and therefore comparatively uninformative, relative to other data collection modes. In laboratory classes, 463 psychology undergraduate students were randomly assigned to complete a two-item questionnaire by SMS, email, online survey, or paper survey. Two weeks later, participants completed a multiple-choice self-report risk-taking questionnaire on paper. While SMS response lengths were statistically significantly shorter than those yielded by other modes, they did not contain less information. [click here to read the paper in full]

This paper was presented at the International Society for the Study of Individual Differences conference, July 2013, Barcelona, Spain. I thought I was just going for a poster, but the day after I arrived I found out that, without telling me, they'd upgraded my poster to a talk. Jet-lagged out of my brain, I pulled an all-nighter and gave the talk. It went... OK. The slides are ugly, but you can read them here, or read the poster I had actually prepared. Moral of the story: don't get so excited that you've got a scholarly legit reason to be in the same room as Frank Spinath that you neglect to double-check the conference program, lest you wind up with a surprise!presentation. I'm sure this lesson is applicable to everyone.

 

References

La Rue, E. M., Li, Y., Karimi, H. A., & Mitchell, A. M. (2012). A Description of the Development and Architecture of an SMS-Based System for Dealing With Depression. Procedia Technology, 5, 670–678. http://doi.org/10.1016/j.protcy.2012.09.074

Lim, M. S. C., Sacks-Davis, R., Aitken, C. K., Hocking, J. S., & Hellard, M. E. (2010). Randomised Controlled Trial of Paper, Online and SMS Diaries for Collecting Sexual Behavior Information from Young People. Journal of Epidemiology and Community Health, 64(10), 885–889. http://doi.org/10.1136/jech.2008.085316

Lim, M. S. C., Hocking, J. S., & Hellard, M. E. (2008). SMS STI: A review of the uses of mobile phone text messaging in sexual health. International Journal of STD & AIDS, 19, 287–290. http://doi.org/10.1258/ijsa.2007.007264

Suwamaru, J. K. (2012). An SMS-based HIV/AIDS Education and Awareness Model for Rural Areas in Papua New Guinea. Global Telehealth, 161–169. http://doi.org/10.3233/978-1-61499-152-6-161

Walsh, E. I., & Brinker, J. K. (2012). Evaluation of a Short Message Service diary methodology in a nonclinical, naturalistic setting. Cyberpsychology, Behavior and Social Networking, 15(11), 615–618. http://doi.org/10.1089/cyber.2012.0189