This is an update to the letter originally submitted on July 19, 2012, and approved on September 5, 2012. We have made two primary modifications to the proposal:
The number of treatment conditions, and
The type of analysis we will be using.
We have also modified the questions that participants will be asked. As a result of these modifications, the number of respondents will decrease from 150 to 100, and the number of burden hours will decrease from 25 to 17.
In the original request, we planned for five conditions, including a control. We have scaled back the number of conditions and now have one control condition and one treatment condition (see Table 1).
In the original request, we proposed using a frequentist approach to determine the sample size. We now propose using a Bayesian statistical method to determine the number of participants to include in the study. This method allows us to use the data as they accumulate and to modify procedures during the study. The adaptation in this case would be to stop data collection before reaching the frequentist-based sample size if interim results suggest that the findings are clear. Such an approach is similar to what industry uses for medical device and clinical trials (see U.S. Department of Health and Human Services, 2010). It gives us some flexibility in determining the sample size, specifically in setting the stopping criteria. At this time we plan for 100 participants. We will decide whether to stop or continue data collection by examining the estimates and their precisions. We will not run more than 100 participants unless we send a revised OMB memo; however, it is quite possible that our stopping criteria will lead us to stop data collection before we run all 100 participants.
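To make the precision-based stopping idea concrete, the following minimal sketch (in Python) shows one possible form such an interim check could take. It assumes a simple Beta-Binomial model for the proportion of accurately answered questions in each condition, a uniform prior, and a hypothetical credible-interval width threshold of 0.10; none of these choices, nor the example interim counts, is specified in this memo, and the sketch is not the study's analysis plan.

# Illustrative sketch only. The Beta(1, 1) prior, the 0.10 interval-width
# threshold, and the example counts below are assumptions for illustration,
# not the memo's specified analysis.
from scipy import stats

def credible_interval_width(accurate, n, level=0.95):
    # Posterior for the proportion of accurate answers under a uniform prior.
    posterior = stats.beta(1 + accurate, 1 + n - accurate)
    lower, upper = posterior.interval(level)
    return upper - lower

def should_stop(accurate_control, n_control, accurate_treatment, n_treatment,
                max_width=0.10):
    # Stop early if both conditions' accuracy estimates are already precise enough.
    return (credible_interval_width(accurate_control, n_control) < max_width and
            credible_interval_width(accurate_treatment, n_treatment) < max_width)

# Example interim check after 30 participants per condition (hypothetical data).
print(should_stop(accurate_control=21, n_control=30,
                  accurate_treatment=27, n_treatment=30))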
Participants in the usability studies will be randomly assigned to one of the conditions outlined in Table 1.
Table 1. Conditions proposed for research on commitment statements.
Condition | Description | Introduction
1. Control | Factual questions are asked without introductory text or a commitment to accurate reporting. | None
2. Accuracy commitment | The test administrator orally tells the participant about the importance of answering the questions accurately before the participant receives the questions. The test administrator then has the participant commit in writing to reporting accurately by signing their name on a contract. | “When you answer the questions that you’ll see on the following screens, it’s very important that you take your time and think carefully so that you answer each question as accurately as possible.” “Would you be willing to commit to answer each question as accurately as you can?” After the respondent says yes, the test administrator will say: “Please check this box and sign your name if you are willing to commit to answer each question as accurately as you can.”
The revised set of questions is included below.
The estimated time to answer the 10 factual questions is 8.5 minutes, and the estimated time to complete the short-term memory task is 1.5 minutes, so the total time for the experimental session is 10 minutes. With a maximum of 100 participants, the maximum estimated burden for this research is 17 hours (100 participants × 10 minutes ≈ 17 hours).
Revised List of Questions:
Q1. From the time you walked into this room until right now, how many minutes do you think have passed?
Q2. One of the first questions you answered at the beginning of today’s session was:
During the last year, how many times did you complete a survey on the Internet?
What answer did you give to that question?
Q3. Also, you answered this question earlier:
Five years ago, about how many hours did you use the Internet during a typical week?
What answer did you give to that question?
Q4. Also, you answered this question earlier:
Not including email, how uncomfortable are you with providing personal information on Internet forms and surveys?
Extremely uncomfortable
Very uncomfortable
Moderately uncomfortable
Slightly uncomfortable
Not uncomfortable at all
What answer did you give to that question?
Q5. On what day of the week did you first talk to someone about participating in this study?
Monday
Tuesday
Wednesday
Thursday
Friday
Saturday
Sunday
Q6. The first questionnaire that you completed when you walked in today was the “Background Questionnaire.” You completed this on the computer. It asked about your computer use and Internet experience. How many minutes did you spend answering the Background Questionnaire?
Q7. When you spoke with someone by telephone to schedule your visit here today, you told them how you first heard about participating in usability studies at the Census Bureau. What did you tell the person over the phone?
Q8. When you spoke with someone by telephone to schedule your visit here today, you were asked for what two purposes you used the Internet, besides checking your email. What two purposes did you tell the person over the telephone?
Q9. When you spoke with someone by telephone to schedule your visit here today, you were asked, “How many times in the past month did you eat out?”
What number did you tell the person over the telephone?
Q10. When you spoke with someone by telephone to schedule your visit here today, you were asked, “How many times in the past month did you go shopping?”
What number did you tell the person over the telephone?
References
Conrad, F. (2011). Interactive interventions in web surveys can increase response accuracy. Proceedings of the American Association for Public Opinion Research (AAPOR) 66th Annual Conference, Phoenix, AZ, May 2011.
Cannell, C., Oksenberg, L., & Converse, J. (1977). Experiments in interviewing techniques: Field experiments in health reporting, 1971-1977. Hyattsville, MD: National Center for Health Services Research.
Krosnick, J. A. (1991). Response strategies for coping with the cognitive demands of attitude measures in surveys. Applied Cognitive Psychology, 5(3), 213–236.
Lezak, M. D. (1983). Neuropsychological assessment. New York: Oxford University Press.
U.S. Department of Health and Human Services, Food and Drug Administration, Center for Devices and Radiological Health (2010). Guidance for the Use of Bayesian Statistics in Medical Device Clinical Trials. Available at: http://www.fda.gov/MedicalDevices/DeviceRegulationandGuidance/GuidanceDocuments/ucm071072.htm
The contact person for questions regarding data collection and statistical aspects of the design of this research is listed below:
Erica L. Olmsted-Hawala
Center for Survey Measurement
Room 5K104D
U.S. Census Bureau
Washington, D.C. 20233
(301) 763-4893
Erica.l.olmsted.hawala@census.gov