Use of Smartphones to Collect Information about Health Behaviors:
Feasibility Study
New
Supporting Statement: Part A
Program official/project officer: Shanta Dube
Office on Smoking and Health
Centers for Disease Control and Prevention
Tel: (770) 488-6287
Email: skd7@cdc.gov
March 19, 2013
Table of Contents
Section A
A-1 Circumstances Making the Collection of Information Necessary
A-2 Purpose and Use of the Information Collection
A-3 Use of Improved Information Technology and Burden Reduction
A-4 Efforts to Identify Duplication and Use of Similar Information
A-5 Impact on Small Businesses or Other Small Entities
A-6 Consequences of Collecting the Information Less Frequently
A-7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5
A-8 Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency
A-9 Explanation of Any Payment or Gift to Respondents
A-10 Assurance of Confidentiality Provided to Respondents
A-11 Justification for Sensitive Questions
A-12 Estimates of Annualized Burden Hours and Costs
A-13 Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers
A-14 Annualized Cost to the Government
A-15 Explanation for Program Changes or Adjustments
A-16 Plans for Tabulation and Publication and Project Time Schedule
A-17 Reason(s) Display of OMB Expiration Date is Inappropriate
A-18 Exceptions to Certification for Paperwork Reduction Act Submissions
Appendices
Appendix A Authorizing Legislation
Appendix B1 Federal Register Notice
Appendix B2 Comments in Response to the Federal Register Notice
Appendix C1 Screener/CATI Recruitment
Appendix C2 CATI Informed Consent Statement
Appendix D Initial CATI Survey (Recruitment)
Appendix E1 First Web Survey Follow-up for Smartphone Users
Appendix E2 Invitation for First Web Survey Follow-up for Smartphone Users
Appendix F Second Web Survey Follow-up for Smartphone Users
Appendix G1 First Text Message Survey Follow-up for non-Smartphone Users
Appendix G2 Invitation for First Text Message Survey Follow-up for non-Smartphone Users
Appendix H Second Text Message Survey Follow-up for non-Smartphone Users
Use of Smartphones to Collect Information about Health Behaviors: Feasibility Study
Background
This is a new Information Collection Request. The National Center for Chronic Disease Prevention and Health Promotion (NCCDPHP), Centers for Disease Control and Prevention (CDC), requests a one-year approval from the Office of Management and Budget (OMB) to conduct a feasibility study examining the use of smartphones to collect information about health behaviors.
Despite the high level of public knowledge about the adverse effects of smoking, tobacco use remains the leading preventable cause of disease and death in the U.S., resulting in approximately 443,000 deaths annually. During 2005-2010, the overall proportion of U.S. adults who were current smokers declined from 20.9% to 19.3%. Despite this decrease, smoking rates remain well above the Healthy People 2010 target of reducing adult smoking prevalence to 12%, and the decline in prevalence was not uniform across the population.i In addition, smoking has been estimated to cost the United States $96 billion in direct medical expenses and $97 billion in lost productivity each year.ii
One of the highest priorities emanating from the American Recovery and Reinvestment Act of 2009 is tobacco control and cessation programs. In addition, the Family Smoking Prevention and Tobacco Control Act gave the Food and Drug Administration new authority to regulate tobacco products, and the Children’s Health Insurance Program Reauthorization Act of 2009 included increases in Federal excise taxes on tobacco products. These developments reinforce the importance of timely collection of data related to tobacco usage.
CDC’s Office on Smoking and Health (OSH) created the National Tobacco Control Program (NTCP) in 1999 to encourage coordinated efforts nationwide to reduce tobacco-related diseases and deaths. The program provides funding and technical support to state and territorial health departments for comprehensive tobacco control programs. The four goals of the NTCP are to: (1) prevent initiation of tobacco use among young people; (2) eliminate nonsmokers’ exposure to secondhand smoke; (3) promote quitting among adults and young people; and (4) identify and eliminate tobacco-related disparities. The four components of the NTCP are: (1) population-based community interventions; (2) counter-marketing; (3) program policy/regulation; and (4) surveillance and evaluation.
Public health practice depends on having access to valid and reliable assessments of populations. Mobile communications technologies have potential immediate applicability to CDC's tobacco prevention and control activities, and there is interest in investigating their use to facilitate rapid response. Traditionally, paper-based surveys (both interviewer-administered and self-administered) and telephone surveys have been the primary modes for collecting population-based data on health and behavioral indicators. However, these modes have various drawbacks, including errors resulting from interviewer transcription and data entry, challenges with conducting studies on tight timelines, and inaccurate data resulting from retrospective self-report. With advancing technology and methods, traditional survey modes are increasingly being replaced by electronic methods of data collection, which merge the processes of data collection and data entry. This merging may be beneficial for data accuracy: for example, Lai et al. observed that electronic diaries were more accurate than other survey modes that rely on patient memory for recall.iii
Given the continuing rise in smartphone use among the general population, smartphones present an attractive mode for reaching populations for survey research. Nearly 116 million Americans will use a smartphone at least monthly by the end of 2012, up from 93.1 million in 2011. By 2013, smartphones will represent over half of all mobile phone users, and by 2016, nearly three in five consumers will have a smartphone.iv This recent and continuing rise in smartphone use and availability is revolutionizing approaches to data collection; mobile communications technologies provide a unique opportunity to expand on more traditional modes of data collection and to reach adults aged 18-29, a population that is historically difficult to reach for surveys and that engages in some risky behaviors at higher rates than its older counterparts. Smartphone survey applications offer additional data collection features, including instant location data, multimedia (camera/video), and communication tools such as push notifications, e-mail, and short message service (SMS). However, because smartphones are primarily personal devices, there is a need to understand users' perspectives on the feasibility and usability of smartphones for participating in surveys. Initial studies suggest that willingness to participate in smartphone surveys may vary by demographic characteristics; previous research has found that younger individuals may be more willing to participate.v,vi,vii One study that compared participation by survey mode observed that smartphone respondents were younger, more diverse, and less affluent than traditional computer respondents.viii In addition, Mays et al. found that college students preferred completing a daily assessment of their drinking via mobile surveys over paper surveys.ix Enthusiasm for a particular survey mode does not necessarily translate into compliance, however: smartphone surveys have higher abandonment rates than other survey modes.x,xi Contingent incentives (i.e., incentives given upon survey completion), as well as respondents' curiosity and interest in the research project, are strong motivators for smartphone participation and compliance.xii
Rapid response, which may be defined as the ability to generate data on emerging topics rapidly and efficiently, has been identified as one of the key steps for enhanced surveillance in tobacco control, and smartphones can play an important role in enhancing surveillance efforts and meeting the needs of tobacco control programs. First, tobacco control requires the ability to make rapid assessments in an ever-evolving environment of changing levels of tobacco control funding and policies and tobacco industry promotions. A study of the feasibility of using mobile phones for data collection showed that mobile phone data collection has the potential to dramatically improve data quality, accuracy, and timeliness in diverse areas ranging from disease monitoring to agricultural management to emergency response services.xiii Second, in some instances there is a need to follow tobacco users over time to understand cessation as well as exposure to promotions; smartphones can provide ecological momentary assessments (experiential and behavioral data collected in real time), especially in rapidly changing environments.xiv,xv Third, smartphone survey features can provide additional data points such as tobacco industry influences, point-of-sale/retail environment, and product pricing obtained by scanning bar codes on tobacco products. Finally, smartphones may be utilized to target specific populations.
Despite the numerous advantages of smartphone data collection, some challenges persist, including maintaining participation over time, issues related to usability, and the costs of implementation. In general, evaluating the feasibility and usability of smartphones for data collection from the end user perspective is needed. Therefore, the proposed feasibility study will focus on process outcomes in order to explore how smartphones might be employed for conducting population-based surveys.
Primary outcomes of interest include defining and evaluating the process of conducting surveys by smartphone; quantifying outcome rates (response, cooperation, refusal, and contact rates; item non-response; and measurement error); evaluating behavioral data in comparison to data collected by cell phone and landline phone; and evaluating the cost and value of these data relative to other data collection efforts. To do this, we propose a feasibility study with adults ages 18 to 65 that consists of recruiting cell phone users (both smartphone users and feature phone users), conducting an initial computer-assisted telephone interview (CATI) via cell phone, and enrolling consenting respondents into a brief longitudinal study consisting of two follow-up surveys conducted via Web for respondents with smartphones, or via text message for respondents with feature cell phones.
The legal justification for the survey may be found in Section 301 of the Public Health Service Act (42 USC 241).
This study will collect information on alcohol and tobacco use, attempts to quit tobacco use, and exposure to forces promoting and discouraging tobacco use. Data on alcohol and tobacco use are generally regarded as being no greater than minimally sensitive. Therefore, the data collection will have little or no effect on the respondent’s privacy. Nevertheless, safeguards will be put in place to ensure that all collected data remain private.
The proposed study involves a minimum amount of information in identifiable form (IIF). Respondents will be recruited through random digit dial, and neither names nor addresses will be collected. The data collection contractor, ICF, will have access to respondents’ phone numbers for recruitment purposes and e-mail addresses in order to provide incentive payments. Please refer to section A.10 for a description of how data will be de-identified prior to transmission to CDC.
Overview of the Data Collection System
The purpose of this data collection is to evaluate the utility of smartphones to collect data; because not all cell phones are smartphones, we will also collect data from respondents with feature phones (cell phones that do not have Web access), both to assess the utility of collecting data via these devices and to serve as a comparison for process and outcome data collected by smartphone. One hundred percent of the recruitment and initial interviews will be conducted on cell phones using CATI. Seventy-four percent of the two follow-up surveys will be conducted with smartphone respondents via Web survey. Twenty-six percent of the follow-up surveys will be conducted via text message with respondents who have feature cell phones and not smartphones.
The following diagram in Exhibit 1 illustrates the study design:
Exhibit 1. Overview of Feasibility Study Design
RDD CATI recruitment using a cell phone sampling frame
Smartphone respondents (74%): Conduct follow-up surveys via Web. Respondents will be provided a link to the survey URL via text message.
Feature phone respondents (26%): Conduct follow-up surveys via text. Respondents will receive survey questions via text and will text their survey responses.
By recruiting and conducting follow-up surveys with respondents both with and without smartphones, we will be able to compare and evaluate process and outcome differences due to survey modes. The proposed feasibility study does not include respondents who do not have wireless (cell) phones, but there is an existing literature base on differences between cell phone and non-cell phone populations.
This data collection comprises the following instruments: screener/CATI recruitment (Appendix C1); CATI Informed Consent Statement (Appendix C2); Initial CATI Survey (Appendix D); two Web Survey follow-ups for smartphone users (Appendices E1 and F); and two Text Message Survey follow-ups for non-smartphone users (Appendices G1 and H).
Items of Information to be Collected
Process evaluation as well as outcome evaluation data will be collected. Risk behavior questions will be drawn from previous OMB-approved data collections, such as the Youth Risk Behavior Survey (OMB No. 0920-0493, exp. 11/30/2011) and the National Adult Tobacco Survey (OMB No. 0920-0828, exp. 7/31/2015). The following topics will be addressed in the initial CATI and/or follow-up surveys:
Process Evaluation
Percent consenting to participate in the follow-up surveys;
Type of cell phone (smartphone or feature phone);
Type of cell phone plan (prepaid, unlimited minutes, etc.);
Delay between text invitation and initial Web survey login (smartphone respondents) or text response (feature phone respondents);
Percent attrition to the first and second follow-up surveys;
Volume and type of Helpdesk questions;
Percent of survey break-offs and time-outs;
Percent of text-based opt-outs;
Difficulties encountered by staff with different kinds of smartphones "seeded" into the sample;
Percent of PayPal email bouncebacks; and
Percent of respondents reporting difficulties obtaining the incentive.
AAPOR Outcome Rate Evaluations
Response rate
Cooperation rate
Refusal rate
Contact rate
These rates will be determined for smartphone participants and feature phone participants using AAPOR formulas. We will compare these outcome rates against rates for established studies, as shown in the table shell below.
Study and Survey Stage | Response Rate | Cooperation Rate | Refusal Rate | Contact Rate
Smartphone: CATI Recruit | X | X | X | X
Smartphone: 1st Follow-up Survey | | X | X | X
Smartphone: 2nd Follow-up Survey | | X | X | X
Feature phone: CATI Recruit | X | X | X | X
Feature phone: 1st Follow-up Survey | | X | X | X
Feature phone: 2nd Follow-up Survey | | X | X | X
National Adult Tobacco Survey (NATS): Cell Respondents | X | X | X | X
National Adult Tobacco Survey (NATS): Landline Respondents | X | X | X | X
Behavioral Risk Factor Surveillance System (BRFSS): Cell Respondents | X | X | X | X
National Health Interview Survey (NHIS) | X | X | X | X
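To make the planned comparisons concrete, the sketch below shows how the four outcome rates could be computed from final call disposition counts. It is a minimal illustration, assuming the simplest AAPOR Standard Definitions formulas (RR1, COOP1, REF1, and CON1) and omitting the e-adjustment for cases of unknown eligibility; the disposition counts are placeholders, not study data.

```python
# Minimal sketch: AAPOR outcome rates from final disposition counts.
# Formulas follow AAPOR Standard Definitions (RR1, COOP1, REF1, CON1);
# the e-adjustment for unknown-eligibility cases is omitted here.

def aapor_rates(I, P, R, NC, O, UH, UO):
    """I=complete, P=partial, R=refusal, NC=non-contact,
    O=other non-interview, UH/UO=unknown eligibility."""
    denom = I + P + R + NC + O + UH + UO
    return {
        "response_rate": I / denom,               # RR1
        "cooperation_rate": I / (I + P + R + O),  # COOP1
        "refusal_rate": R / denom,                # REF1
        "contact_rate": (I + P + R + O) / denom,  # CON1
    }

# Illustrative counts only (not study data):
print(aapor_rates(I=900, P=50, R=300, NC=400, O=40, UH=250, UO=50))
```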
Risk Behavior Outcome Evaluation
Alcohol use
Tobacco use
Attempts to quit tobacco use, and
Exposure to forces promoting and discouraging tobacco use
When possible, behavioral risk questions will be identical to those asked on existing surveys such as the NATS, BRFSS, and NHIS. Outcomes will be determined for smartphone participants and feature phone participants.
The analysis will be conducted in three parts. First, we will examine the relationship between survey mode and respondents' demographic characteristics through pairwise contingency tables. Next, a similar approach will be used to examine the bivariate relationships between survey mode and the risk behavior measures; logistic models will then be developed for each risk behavior measure to examine whether survey mode affected responses after adjusting for demographic characteristics such as age, gender, education level, marital status, and employment status. Finally, we will use a logistic regression model to compare mode effects and the accuracy of data estimates when benchmarked against NATS, BRFSS, and NHIS data.
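As a concrete illustration of the adjusted mode-effect analysis described above, the sketch below fits one such logistic model. It is a minimal sketch only: the file name and column names (survey_mode, current_smoker, and the demographic variables) are hypothetical stand-ins for the actual analysis file.

```python
# Minimal sketch of the adjusted mode-effect logistic model described above.
# The file name and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("feasibility_responses.csv")  # hypothetical analysis file

# Does survey mode predict a risk behavior after adjusting for demographics?
model = smf.logit(
    "current_smoker ~ C(survey_mode) + age + C(gender) + C(education)"
    " + C(marital_status) + C(employment_status)",
    data=df,
).fit()
print(model.summary())  # the mode coefficients indicate mode effects
```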
Identification of Websites and Website Content Directed at Children under 13 Years of Age
All respondents will be aged 18 or older. When a potential respondent is contacted for the CATI interview, he or she first completes a screener to determine study eligibility (see Appendix C1). Specifically, the person answering the telephone is asked whether he or she has been reached on a cellular telephone and is aged 18 or older. For those who respond that they are not on a cellular telephone or that they are younger than 18, the interview is terminated. There will be no Websites with content directed at children under 13 years of age. Follow-up Web surveys will be developed for smartphone participants. Web surveys will be accessible only to study participants who enter a unique ID, which will be sent via text to a phone number provided by the participant. Once a survey has been completed using a specific ID, the survey cannot be accessed again. IDs will be generated randomly and will be non-sequential, making it highly unlikely that a respondent could input another ID and gain access to the survey.
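The following sketch illustrates one way such random, non-sequential IDs could be generated; the ID length and character set are illustrative assumptions, not study specifications.

```python
# Sketch: generating random, non-sequential survey access IDs.
# The 8-character length and 36-symbol alphabet are assumptions; they give
# 36**8 (about 2.8e12) possible IDs, so guessing a live ID is highly unlikely.
import secrets
import string

ALPHABET = string.ascii_uppercase + string.digits

def new_survey_id(issued: set, length: int = 8) -> str:
    while True:
        candidate = "".join(secrets.choice(ALPHABET) for _ in range(length))
        if candidate not in issued:  # enforce uniqueness across participants
            issued.add(candidate)
            return candidate

issued_ids = set()
print(new_survey_id(issued_ids))  # e.g., 'Q7K2M9TX'
```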
The data collected by this method are different in nature from those collected by traditional public health surveys. Advances in smartphone technology would make smartphones a potential tool for additional public health surveillance. The overall goals of the feasibility study are to explore how smartphones and feature phones might successfully be used to collect behavioral risk information. Specifically, we will address the following:
Quality of the solution: How smartphones should be used to collect the best possible quality data.
Technological feasibility: Whether and under what circumstances smartphones can be used to collect population-based public health and behavior data.
Cost effectiveness: How much such a data collection would cost compared to more conventional data collection approaches.
Perhaps the most important goal of the evaluation is to determine whether and how smartphones can be leveraged to yield "good" data, and how these data compare to data gathered via text messages to feature phones and to data collected via more traditional RDD CATI surveys. Obtaining high-quality data is a matter of minimizing survey error. For the purposes here, it is useful to distinguish between error associated with the sample and error associated with survey responses. From our perspective, "error" can mean either a systematic bias in the results or random noise in the data.
Coverage of the target population is one such source of sample error. Young people and renters, for instance, are heavy cell phone users, but older Americans are much less likely to have adopted a wireless-only lifestyle (Blumberg & Luke, 2010). Statistically adequate coverage of a specific, young population might be feasible, but coverage of all American adults might not. The study will include respondents ages 18-65 in order to evaluate population coverage of both younger and older respondents.
In the pilot survey, we will evaluate quality of:
Sample. We will evaluate unit non-response by comparing initial CATI responses for 1) people who opt in to the follow-up study to those who do not, 2) people who have smartphones to those who do not, and 3) people who comply with the follow-up surveys to those who do not.
Data. We will compare item non-response in the initial CATI survey, the follow-up Web surveys with smartphone respondents, and the follow-up text message surveys with feature phone respondents.
To determine how smartphone surveys can be implemented, we will conduct a process evaluation as part of the feasibility study to document lessons learned. Process evaluation began with two focus groups of four participants each. During these groups, participants were asked questions to discern what kinds of questions, and how many, could be asked on a survey completed by smartphone or feature phone; what type of incentive would be needed; and what other barriers to participation exist. Process evaluation outcomes measured during the quantitative data collection will include:
Delay between text invitation and initial login;
Volume and type of Helpdesk questions received;
Percent of survey break-offs and time-outs;
Percent of text-based opt-outs;
Percent attrition to the first and second follow-up surveys;
Difficulties encountered by staff with different kinds of smartphones "seeded" into the sample;
Percent of PayPal email bouncebacks; and
Percent of respondents reporting difficulties obtaining the incentive.
We will evaluate the relative cost-effectiveness of the different survey types being piloted (CATI interview, smartphone Web surveys, and feature phone text message surveys). For instance, we might discover that the cost of conducting a series of brief surveys via smartphone (i.e., a diary study) is $x per question and that the item response rate on sensitive questions is y%. A similar study conducted retrospectively by cell phone might cost $a per question and achieve b% response. Even if the smartphone cost is higher ($x > $a), if the smartphone ratio of cost to quality ($x:y%) is lower than the cell phone ratio ($a:b%), then smartphone research might be preferable to conventional methods. We will evaluate this cost:quality ratio for smartphone vs. feature phone vs. conventional RDD cell phone surveys with 18-65 year olds and, if cell sizes are large enough, for subpopulations including smokers, 18-34 year olds, and minorities.
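The arithmetic behind this comparison is illustrated below with hypothetical placeholder figures (the dollar amounts and response rates are invented for illustration only): even when the smartphone mode costs more per question, its cost-to-quality ratio can be lower.

```python
# Worked illustration of the cost:quality comparison described above.
# All figures are hypothetical placeholders, not study estimates.
smartphone = {"cost_per_question": 2.00, "item_response_rate": 0.90}
cati_cell = {"cost_per_question": 1.50, "item_response_rate": 0.60}

def cost_quality_ratio(mode):
    # Dollars spent per unit of item response obtained; lower is better.
    return mode["cost_per_question"] / mode["item_response_rate"]

print(f"smartphone: {cost_quality_ratio(smartphone):.2f}")  # 2.22
print(f"CATI cell:  {cost_quality_ratio(cati_cell):.2f}")   # 2.50
```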
CDC will use the results of the study to create a Standard Operating Procedure (SOP) for implementing smartphone Web surveys and feature phone text message surveys. This SOP will detail the circumstances under which each type of data collection can produce high-quality data, the circumstances under which each is affordable, and the specific methods by which smartphone Web surveys and feature phone text message surveys can be carried out.
We would continuously monitor new innovations and adjust our systems to take advantage of the latest technology available. Other future improvements may include addition of voice capability and providing survey material in Spanish and other languages. A more comprehensive and thorough understanding of the above goals would help CDC better prepare for public health surveillance in the future.
This study will collect information on alcohol and tobacco use, attempts to quit tobacco use, and exposure to forces promoting or discouraging tobacco use. Data on alcohol and tobacco use are generally regarded as being no greater than minimally sensitive. Therefore, the data collection will have little or no effect on the respondent's privacy. However, because data will be collected from respondents younger than 21 years of age, safeguards will be put in place to ensure that all collected data remain private. First, at the outset of the initial CATI interview, interviewers will ask respondents whether they are in a place where they can answer questions privately. In addition, although questions may be sensitive (for example, "In the past year, how often did you drink any type of alcoholic beverage?"), participants' responses will not divulge any sensitive information; responses will typically be yes, no, a numeric value, or a frequency (every day, sometimes, never, etc.). For both the follow-up Web and text message surveys, respondents will enter responses directly into their cell phones. Data from the initial CATI interview will be stored on secure servers, and data from the follow-up surveys will be transmitted to the contractor, ICF, via secured data transmission.
The proposed study involves a minimum amount of information in identifiable form (IIF). Respondents will be recruited through random digit dial, and neither names nor addresses will be collected. The data collection contractor will have access to respondents’ phone numbers for recruitment purposes and e-mail addresses in order to provide incentive payments. Please refer to section A.10 for a description of how response data will be de-identified prior to transmission to CDC.
One hundred percent of the initial CATI information collection involves the use of Computer Assisted Telephone Interviewing (CATI) to reduce burden to the respondent and permit electronic collection and submission of responses. The follow-up information collection will also be electronic; 26% will be conducted via text messages with feature phone users and 74% will be conducted via Web survey through a link provided via text message to smartphone users.
Collecting data with smartphones has been the subject of some research. A project from the University of Washington called the Open Data Kit (http://opendatakit.org/) uses open-source software to collect and compile survey data on smartphones in support of surveys in developing nations, but the focus of that project is surveys conducted by interviewers in the field, for which data are recorded on a smartphone.
Some studies in the past two years have addressed smartphone surveys. In 2011, the Annual Meeting of the American Association for Public Opinion Research hosted a symposium on smartphone and iPad data collection at which several researchers presented work on implementing surveys in these contexts. These studies, however, focused on the design and presentation of smartphone surveys. None specifically concerned response rates, behavioral-risk outcomes, or cost, the major concerns of the present research. In evaluating data quality and cost, the present research represents an important extension of this work, which has focused on best practices.
No small businesses will be involved in this data collection.
Because the proposed study is a feasibility study, this data collection will be conducted only once. As noted above, CDC will use the results of the study to create a Standard Operating Procedure (SOP) for implementing smartphone and text message research. There are no legal obstacles to reducing the burden.
There are no special circumstances with this information collection request. This request fully complies with the guidelines of 5 CFR 1320.5.
A 60-day Notice was published in the Federal Register on February 27, 2012 (Vol. 77, No. 38, pp. 11545-11547); see Appendix B1. One public comment was received and acknowledged (see Appendix B2).
Because of the limited scope of this feasibility study, CDC did not consult with persons outside the agency on the design of the study.
Study participants will receive up to $10 for their participation in the feasibility study. This amount was determined based on the time commitment and effort asked of respondents; it demonstrates appreciation for participation and will help prevent panel attrition. The estimated maximum cost to a participant for their phone usage is $4.40.
Specifically, smartphone and feature phone participants will be asked to participate in three separate surveys (an initial CATI interview and two follow-up surveys), so they represent a panel of participants. Effectively incentivizing a smartphone panel is difficult. Ideally, the survey will not collect respondent addresses, since preserving respondents' sense of anonymity is an important part of obtaining accurate data. Without addresses, however, we can transmit incentives in only two ways: using an internet bank such as PayPal as a go-between, or sending an electronic gift code via text message. We will offer all respondents the choice between these options. For PayPal reimbursement, respondents will need to provide email addresses and have a PayPal account. If they prefer not to provide their email addresses, we will send them Amazon.com gift codes via text message. The former is more like cash; the latter is easier and better preserves privacy. Which option respondents choose is an outcome of interest.
In conventional surveys, incentives provided in close proximity to participation help promote response, but in this case the small amounts involved make incremental distribution of incentives infeasible. A $3 or $4 Amazon.com gift code does not have much real value, so providing the entire incentive at the end of the survey cycle seems ideal. However, keeping respondents engaged with the incentive is important. We will therefore use a "points" system like that employed by some internet panels. Each survey that respondents complete will earn them three or four points, up to a maximum of 10 points for completing all three surveys. At the start of each survey, respondents will be informed of their current number of points. At the end of the last survey, respondents will redeem each point for approximately one dollar (with a minimum of $3 for respondents who completed the initial phone interview but did not participate in either of the follow-up surveys).
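A minimal sketch of this points logic appears below. The exact point value assigned to each survey is an assumption consistent with the description above (three or four points per survey, a 10-point maximum, roughly one dollar per point, and a $3 floor after the initial interview).

```python
# Sketch of the incentive "points" logic described above. The per-survey
# point values (3, 3, 4) are assumptions consistent with the text.
POINTS = {"initial_cati": 3, "followup_1": 3, "followup_2": 4}

def incentive_dollars(completed: set) -> int:
    points = sum(POINTS[survey] for survey in completed)
    if "initial_cati" in completed:
        points = max(points, 3)  # $3 minimum after the initial interview
    return min(points, 10)       # ~$1 per point, $10 maximum

print(incentive_dollars({"initial_cati"}))                              # 3
print(incentive_dollars({"initial_cati", "followup_1", "followup_2"}))  # 10
```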
CDC has determined that this activity does not constitute research with human subjects and therefore does not require IRB approval. Prior to initiating data collection, the project director will train all interviewers.
This study will collect information on tobacco use and alcohol use in the survey component. The data are being gathered to determine whether it is feasible to collect this information (historically collected by mail, telephone, or in-person surveys) through internet surveys on smartphones.
This submission has been reviewed by staff in CDC’s Information Collection Review Office, who determined that the Privacy Act does not apply. Respondent names and addresses will not be collected for the smartphone feasibility study. In order to distribute incentives, the data collection contractor will have temporary access to an email address for some respondents, but this information will not be linked to response data.
Precautions will be taken in how the data are handled to prevent a breach of security. Survey data and identifying information (phone number and email address) will be handled in ways that prevent unauthorized access at any point during the study. To maintain security, the telephone number and email address of the respondent will be excluded from the data file used for analysis. If reports or tabular data are submitted, the data will be reviewed to determine whether subjects can be identified when small cell counts occur. If there is the potential for identification of subjects (cell counts of fewer than 30 records), the data in these cells will be removed. Respondents will be told during the initial screener that the information they provide will be maintained in a secure manner. All interviewers will be required to sign a non-disclosure agreement on the date of hire, which will be reinforced at training.
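The small-cell rule could be applied mechanically along the lines of the sketch below; the table contents are invented for illustration.

```python
# Sketch of the small-cell suppression rule: any tabular cell based on
# fewer than 30 records is removed before release. Data are illustrative.
import pandas as pd

MIN_CELL = 30  # suppression threshold from the text

def suppress_small_cells(counts: pd.DataFrame) -> pd.DataFrame:
    # Replace counts below the threshold with a missing value (suppressed).
    return counts.mask(counts < MIN_CELL)

table = pd.DataFrame({"smokers": [120, 12], "nonsmokers": [480, 95]},
                     index=["age 18-34", "age 35-65"])
print(suppress_small_cells(table))  # the cell with 12 records is suppressed
```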
Verbal consent will be elicited from participants. Before the telephone interview, the interviewer will read the informed consent script (see Appendix C2) to each participant. The consent script describes the study, the types of questions that will be asked on the actual surveys, the risks and benefits of participation, and participants’ rights, and it provides information on whom to contact with questions about any aspect of the study. The consent script also indicates that participation is completely voluntary and that participants can refuse to answer any question or discontinue the study at any time without penalty or loss of benefits. The interviewer will enter a code via the keyboard to signify that the participant was read the informed consent script and agreed to participate.
Participation in the survey is voluntary. Interviewers will tell respondents that "Any information you give me will be treated in a secure manner and will not be disclosed, unless otherwise compelled by law." Interviewers will also tell respondents: "I will not ask for your last name, address, or other personal information that can identify you. You do not have to answer any question you do not want to, and you can end the study at any time."
The surveys (Appendix D: Initial CATI Survey; Appendix E1: First Web Survey Follow-up for Smartphone Users; Appendix F: Second Web Survey Follow-up for Smartphone Users; Appendix G1: First Text Message Survey Follow-up for non-Smartphone Users; Appendix H: Second Text Message Survey Follow-up for non-Smartphone Users) ask about general alcohol and tobacco use, and include demographic questions such as respondent’s race and ethnicity.
While an individual may be sensitive about answering questions about alcohol or tobacco use, the items are, for the most part, not of a sensitive nature and are commonly found in surveys of health behavior. Data on tobacco use are generally regarded as being no greater than minimally sensitive. Therefore, the data collection will have little or no effect on the respondent's privacy. Nevertheless, safeguards will be put in place to ensure that all collected data remain secure. There are no questions concerning illegal drug use or other criminal acts. There are no questions about emotionally charged experiences such as parental or sexual abuse. Race and ethnicity questions will conform to OMB standards.
The total average time to recruit, screen, and conduct the initial CATI interview is eight minutes or less. Approximately one minute is needed to introduce the survey and screen for an eligible respondent (Appendix C1, Screener/CATI Recruitment), and approximately seven minutes are required to complete the informed consent process and interview (Appendix D, Initial CATI Survey). We estimate that we will contact 1,990 respondents for screening and 1,590 for recruitment.
After recruitment is completed, each respondent will be routed to two follow-up surveys. Slightly different versions of the follow-up survey will be used for smartphone users (Appendix E1 and Appendix F) and feature phone users (Appendix G1 and Appendix H). The estimated burden per response for all follow-up surveys is three minutes.
We have not found data to inform the expected rate at which respondents completing the initial CATI interview will consent to participate in smartphone or feature phone follow-up surveys, nor the expected completion rate for the follow-up surveys; both are outcomes of interest for this feasibility study. For the purposes of computing annualized burden hours and cost, we assume that 70% of the respondents completing the initial CATI interview will agree to and complete the first follow-up survey, and that 85% of those completing the first follow-up survey will also complete the second. For the smartphone data collection, we hope to engage 700 respondents for the first Web survey follow-up and 595 respondents for the second; for the feature phone data collection, we hope to engage 200 respondents for the first text message survey follow-up and 170 respondents for the second.
Prior to fielding the study, the data collection contractor will pre-test recruitment instruments and procedures. The total estimated burden for the pre-test is 20 hours.
The total estimated annualized burden for the Smartphone Study and associated support activities is 306 hours.
Estimated Annualized Burden Hours
Type of Respondents | Form Name | Number of Respondents | Number of Responses per Respondent | Avg. Burden per Response (in hours) | Total Burden (in hours)
Adults Aged 18 to 65, All cell phone users | Pre-test (CATI Screener/CATI Recruitment) | 20 | 1 | 8/60 | 3
Adults Aged 18 to 65, All cell phone users | CATI Screener | 1,990 | 1 | 1/60 | 33
Adults Aged 18 to 65, All cell phone users | CATI Recruitment | 1,590 | 1 | 7/60 | 186
Adults Aged 18 to 65, Smartphone Users | First Web Survey Follow-up for Smartphone Users | 700 | 1 | 3/60 | 35
Adults Aged 18 to 65, Smartphone Users | Second Web Survey Follow-up for Smartphone Users | 595 | 1 | 3/60 | 30
Adults Aged 18 to 65, Non-smartphone Users | First Text Message Survey Follow-up for non-Smartphone Users | 200 | 1 | 3/60 | 10
Adults Aged 18 to 65, Non-smartphone Users | Second Text Message Survey Follow-up for non-Smartphone Users | 170 | 1 | 3/60 | 9
Total | | | | | 306
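The row and total figures in the table reproduce as follows (a simple check, rounding each row to the nearest whole hour as the table does):

```python
# Arithmetic behind the burden table: respondents x burden per response,
# with each row rounded half-up to the nearest whole hour.
import math

rows = [
    ("Pre-test",                        20, 8 / 60),
    ("CATI Screener",                 1990, 1 / 60),
    ("CATI Recruitment",              1590, 7 / 60),
    ("First Web follow-up",            700, 3 / 60),
    ("Second Web follow-up",           595, 3 / 60),
    ("First text message follow-up",   200, 3 / 60),
    ("Second text message follow-up",  170, 3 / 60),
]

def round_half_up(x: float) -> int:
    return math.floor(x + 0.5)

total_hours = sum(round_half_up(n * burden) for _, n, burden in rows)
print(total_hours)       # 306 hours
print(total_hours * 23)  # $7,038 at $23/hour (see the cost table below)
```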
The total estimated cost to respondents is $7,038, using $23 per hour as the average hourly wage (http://www.bls.gov/ncs/ocs/sp/nctb1475.pdf). Additional information is provided in the following table.
Estimated Annualized Burden Costs
Type of Respondents | Form Name | No. of Respondents | Total Burden (in hours) | Avg. Hourly Wage Rate | Total Annualized Respondent Costs
Adults Aged 18 to 65, All cell phone users | Pre-test of CATI Screener/Initial CATI Survey | 20 | 3 | $23 | $69
Adults Aged 18 to 65, All cell phone users | CATI Screener | 1,990 | 33 | $23 | $759
Adults Aged 18 to 65, All cell phone users | Initial CATI Survey | 1,590 | 186 | $23 | $4,278
Adults Aged 18 to 65, Smartphone Users | First Web Survey Follow-up for Smartphone Users | 700 | 35 | $23 | $805
Adults Aged 18 to 65, Smartphone Users | Second Web Survey Follow-up for Smartphone Users | 595 | 30 | $23 | $690
Adults Aged 18 to 65, Non-smartphone Users | First Text Message Survey Follow-up for non-Smartphone Users | 200 | 10 | $23 | $230
Adults Aged 18 to 65, Non-smartphone Users | Second Text Message Survey Follow-up for non-Smartphone Users | 170 | 9 | $23 | $207
Total | | | | | $7,038
There are no capital, startup, operational, or maintenance costs to respondents other than the costs of cell phone talk minutes and data.
The total contract award to the data collection contractor, ICF, is $95,000 over a 12-month period. These costs cover the activities in Table A-14 below.
Additional costs will be incurred indirectly by the government in personnel costs of staff involved in oversight of the study and in conducting data analysis. It is estimated that two CDC employees will be involved for approximately 10% of their time (for federal personnel, 100% time = 2,080 hours annually). The two salaries are $50.03 and $56.48 per hour. The direct costs in CDC staff time will be approximately $22,154 annually.
The total annualized cost for the study over a 12-month period, including the contract cost and federal government personnel cost is $117,154.
Type of Cost | Description | Annualized Cost
CDC Personnel | 10% of GS-14 @ $104,062/year | $10,406
CDC Personnel | 10% of GS-14 @ $117,478/year | $11,748
CDC Personnel | Subtotal, CDC Personnel | $22,154
Contractual Costs | Conduct focus groups, instrument design, survey programming, data collection and cleaning, analysis, report writing | $95,000
Total | | $117,154
This is a new, one-time data collection.
Six weeks are needed for data collection. Therefore, we are requesting OMB approval by May 6 in order to allow for project completion before the end of the current contract period. A proposed project schedule is included in the table below.
Schedule
Phase | Activity | Timeline
Planning I | CDC approves work plan | Prior to OMB approval
Planning I | Project kickoff | Prior to OMB approval
Planning I | Draft study design plan | Prior to OMB approval
Planning I | Existing knowledge review | Prior to OMB approval
Focus Groups | Focus group recruiting begins | Prior to OMB approval
Focus Groups | Develop focus group moderator guide | Prior to OMB approval
Focus Groups | Conduct 2 focus groups* | Prior to OMB approval
Focus Groups | Draft focus group report | Prior to OMB approval
Planning II | Study design plan update | 1 week following OMB approval
Pilot | Program data collection instruments | 1 week following updated study design
Pilot | Data collection begins | 2 weeks following instrument programming
Pilot | Data collection ends | 6 weeks after data collection begins
Reporting | Draft process evaluation/survey methods report | 2 weeks after data collection ends
Reporting | Draft outcomes evaluation and cost analysis | 4 weeks after data collection ends
Reporting | Final report | 4 weeks after data collection ends
*Combined, the 2 focus groups were conducted with fewer than 10 participants.
The evolution of mobile communications technologies provides a unique opportunity for innovation in public health surveillance. Text messaging and smartphone Web access are immediate, accessible, and confidential, a combination of features that could make them ideal for ongoing research, surveillance, and evaluation of risk behaviors and health conditions. We will explore the perceived feasibility, advantages, and disadvantages of conducting a population-based survey via smartphones and feature phones. A deeper understanding of the factors that promote and hinder participation will be useful in creating a population-based pilot survey using mobile communications technology. We will determine and describe the technological feasibility: whether and under what circumstances smartphones and feature phones can be used to collect population-based public health and behavior data. In addition, response, contact, cooperation, and refusal rates will be calculated separately for smartphone users and feature phone users. As mobile communications continue to evolve, a better understanding of how mobile communications technologies can be used to collect data on risk behaviors and health conditions is critical to public health surveillance and evaluation efforts.
The results of the focus groups were presented at the 2012 annual meeting of the American Association for Public Opinion Research; results of the full feasibility study will be presented at the 2013 Joint Statistical Meetings.
The OMB expiration date will be displayed on all data collection instruments.
There are no exceptions to the certification statement.
i CDC. Vital Signs: Current Cigarette Smoking Among Adults Aged ≥18 Years --- United States, 2005--2010. MMWR 2011;60(35):1207-1212.
ii CDC. Smoking-attributable mortality, years of potential life lost, and productivity losses --- United States, 2000--2004. MMWR 2008;57:1226-1228.
iii Lai JW, Lorelle V, Link MW, Pearson J, Makowska H, Benezra K, Green M. The Nielsen Company Life360: Usability of Mobile Devices for Time Use Surveys. Presented at AAPOR, May 14-17, 2009.
iv http://www.emarketer.com/Mobile/Article.aspx?R=1009014, accessed October 26, 2012.
v Haberer JE, Kiwanuka J, Nansera D, Wilson IB, Bangsberg DR. Challenges in using mobile phones for collection of antiretroviral therapy adherence data in a resource-limited setting. AIDS and Behavior 2010;14(6):1294-1301.
vi Lai JW, Lorelle V, Link MW, Pearson J, Makowska H, Benezra K, Green M. The Nielsen Company Life360: Usability of Mobile Devices for Time Use Surveys. Presented at AAPOR, May 14-17, 2009.
vii Stapleton C. The Smart(Phone) Way to Collect Survey Data. Service Management Group. Annual Conference of the American Association for Public Opinion Research, Phoenix, AZ; May 2011.
viii Ibid.
ix Mays D, Cremeens J, Usdan S, Martin RJ, Arriola KJ, Bernhardt JM. The feasibility of assessing alcohol use among college students using wireless mobile devices: Implications for health education and behavioural research. Health Education Journal 2010;69(3):311-320.
x Buskirk TD, Andrus C, Gaynor M, Gorrell C. An App a Day Could Keep the Doctor Away: Quantifying the Use of Prevention-Related Smartphone Apps Among iPhone Users. Saint Louis University School of Public Health. Presented at AAPOR, 2011.
xi Mays D, Cremeens J, Usdan S, Martin RJ, Arriola KJ, Bernhardt JM. The feasibility of assessing alcohol use among college students using wireless mobile devices: Implications for health education and behavioural research. Health Education Journal 2010;69(3):311-320.
xii Lai JW, Lorelle V, Link MW, Pearson J, Makowska H, Benezra K, Green M. The Nielsen Company Life360: Usability of Mobile Devices for Time Use Surveys. Presented at AAPOR, May 14-17, 2009.
xiii Mourão S, Okada K. Mobile Phone as a Tool for Data Collection in Field Research. World Academy of Science, Engineering & Technology 2010;70:222-226.
xiv Shiffman S, Stone AA, Hufford MR. Ecological Momentary Assessment. Annual Review of Clinical Psychology 2008;4:1-32.
xv Mulvaney SA, Rothman RL, Dietrich MS, Wallston KA, Grove E, Elasy TA, Johnson KB. Using mobile phones to measure adolescent diabetes adherence. Health Psychology 2012;31(1):43-50.