OMB Control Number: 1220-0141
Expiration Date: 07/31/2027
DATE: August 30, 2024

NOTE TO THE REVIEWER OF: OMB CLEARANCE 1220-0141 “Cognitive and Psychological Research”

FROM: Douglas Williams and Robin Kaplan, Office of Survey Methods Research

SUBJECT: Submission of Materials for ATUS Time and Activity Recall Research
Please accept the enclosed materials for approval under the OMB clearance package 1220-0141 “Cognitive and Psychological Research.” In accordance with our agreement with OMB, we are submitting a brief description of the study.
The total estimated respondent burden for this study is 410 hours.
If there are any questions regarding this project, please direct them to Douglas Williams (202-691-5707; Douglas.Williams@bls.gov).
Introduction
The American Time Use Survey (ATUS) provides nationally representative estimates of how, where, and with whom Americans spend their time, and is the only federal survey providing data on the full range of nonmarket activities, from housework to caregiving. The ATUS diary collects information about the activities respondents performed on one day and, for most activities, where they were and who they were with. It also includes summary questions about the diary activities, such as questions about times when the respondent may have provided childcare as a secondary activity.
The ATUS has been conducted solely via Computer Assisted Telephone Interviewing (CATI) since the survey began in 2003. To modernize ATUS data collection and offer an alternative to CATI, BLS has been developing a self-administered web diary tool that would make the ATUS a mixed-mode CATI and web-based survey.
The ATUS program has conducted several rounds of testing of a web diary prototype. Research to date has included a literature review of web diaries, mixed-mode testing to compare reporting behaviors, usability testing to identify ways to improve respondents’ ability to use the tool and report activities, and most recently a pilot test to compare activity reporting between CATI and web. This research has provided rich insight into how web responses may differ from CATI. Specifically, it has shown that respondents have difficulty identifying activities using the ATUS lexicon (the categorical grouping and labeling of activities). It has also shown that respondents vary in the level of detail they report, sometimes omitting brief activities or activities perceived as unimportant.
The purpose of this research is threefold. The first goal is to understand how respondents conceptualize their day when recalling what they did (i.e., their activities). This information will help identify what is most salient to respondents and inform the guides and instructions incorporated into the web diary. The second is to understand whether respondents use terminology for vague activities (e.g., socializing) that could improve the terms used in the ATUS lexicon. The third focuses on mental models, which are used in user experience research to identify design improvements. Mental model questions ask respondents about their expectations of how a system will work, in this case, how they would expect to report time and activity information.
Methodology
To accomplish these goals, the proposed questions will be tested using a mixture of cognitive interviewing and online surveys.
Cognitive Interviews
Cognitive interviews provide an in-depth understanding of respondents’ thought processes and reactions to the questions. One-on-one interviews will be conducted in which participants are asked to recall their day with little guidance. This will help assess how participants organize their day, identify salient themes, and indicate where and what type of instructions would be most beneficial. Participants will then be asked why certain activities were excluded and how they would label activities that have complex label mappings. Participants will also be asked about their mental models, that is, their expectations for self-administered web reporting, which will help inform the web diary design. Interviews will primarily be conducted in person, with Microsoft Teams used for any remote interviews. Interviews will be conducted by OSMR staff who are experienced in conducting these types of interviews. The testing protocol for one-on-one interviews is included in Appendix A.
Online Surveys
Online surveys will be conducted asking respondents to recall their most salient activities and then to freely recall activities covering the entirety of their day. This approach replicates the self-administered web mode, with embedded debriefing probes to gain a deeper understanding of respondents’ recall of their activities from the previous day. An experiment will be embedded in which half of the participants (n=750) will receive 15 individual text boxes to input their activities and the other half (n=750) will receive 22 individual text boxes (these numbers were determined by previous research showing that participants included a median of 14-17 items in ATUS web diary studies). The number of activities and the level of detail included in each condition will be compared. Online surveys enable us to reach a larger sample than interviews alone, allowing us to identify common activities, among a much larger group, that are most susceptible to omission. The instrument will also include debriefing probes about mental models and preferences for self-reporting on the web to inform the web diary design. The testing protocol for online surveys is included in Appendix B.
These testing activities will be coordinated iteratively, as activities collected during cognitive testing will be needed for the classification activity during online testing. The findings from both modes of testing will be evaluated qualitatively. The feedback and findings from the cognitive interviews will inform the design and placement of instructions in the web diary, while findings from the online surveys will provide insight into how respondents recall their day and their preferences for reporting in a web mode.
As this is pretesting, we expect that modifications may be made during the course of the study based on initial results. Although the goals of the testing, and overall design, will remain the same, findings from preliminary results may be used to improve the interview questions. Modifications are likely to range from slight changes to question wording to the inclusion of additional instructions.
Participants
Activities reported in the ATUS are expected to vary by respondent and household composition. For example, those with children are more likely to have childcare activities, and those who are employed will be able to report work activities. These specific activities have, in the case of childcare, demonstrated under-reporting in a web mode and, in the case of work, may be missing intervening activities (such as lunch, breaks, or commuting/travel time). As such, this pretesting will target people who have children under the age of 5 and people who are currently employed.
Cognitive Interviews
Up to 30 participants will take part in one-on-one cognitive interviews. This number was determined to be sufficient to identify common behaviors in activity reporting and recommendations for necessary guidance (i.e., instructions) in a self-administered mode.
Since ATUS activity reporting differs for some subgroups, at least 10 participants will be recruited who have children aged 5 or under, and at least another 10 will be recruited who are currently employed.
To find eligible participants for one-on-one interviews, we will use the following recruitment methods:
Advertisements on Craigslist.
Snowball sampling, by asking participants to advertise the study to others who may meet the eligibility criteria.
Flyers on bulletin boards at grocery stores, libraries, and other community locations.
Emails to community listservs.
Interested people will be asked to call or email OSMR staff to learn more about the study and find out whether they are eligible. The study recruiter will administer screening questions to ensure the subgroups of interest are selected. The screening questions are in Appendix C and the advertisements are in Appendix D. As we reach the end of the data collection period, we may revise the advertisements to specifically target the subgroups we are interested in, such as by removing group names from the advertisement if we no longer need them or using different subgroup labels if the advertisements are not bringing in the target people.
Online Testing
Up to 1,500 participants will take part in online testing. This sample size was determined to be sufficient to explore differences in activity labeling and identify common terms or themes. It also accounts for break-offs, incomplete data, and participants who do not follow the task instructions, similar to other OMB-approved samples used for BLS studies of this nature.
To find eligible participants for online surveys, we will use the CloudResearch platform. CloudResearch is an online marketplace where individuals sign up to participate in online research tasks. Given that we are interested in labeling applied to a broad array of verbatim activities, across a large number of participants, no screening will be conducted.
Burden Hours
This study will use up to 410 burden hours for recruitment and data collection (Table 1).
For one-on-one interviews, we anticipate making contact with up to 60 people to conduct eligibility screening, in order to schedule interviews with 30 people. Screening is expected to take no more than 5 minutes. The interview sessions will take no more than 60 minutes.
For online surveys, participants will spend up to 15 minutes completing the online task.
Table 1. Estimated Burden Hours
Mode | Participants Contacted | Recruitment Hours (per person) | Recruitment Total Hours | Participants Completed | Session Hours (per person) | Session Total Hours | Total Collection Burden (hours)
One-on-One Interview | 60 | 0.08 | 4.8 | 30 | 1 | 30 | 34.8
Online Survey | 1,500 | 0 | 0 | 1,500 | 0.25 | 375 | 375
Total Hours | | | | | | | 409.8
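As a check of the arithmetic in Table 1, assuming the per-person times stated above (with the 5-minute screening converted to roughly 0.08 hours):
Recruitment screening: 60 contacts × 0.08 hours = 4.8 hours
One-on-one interview sessions: 30 participants × 1 hour = 30 hours
Online survey sessions: 1,500 participants × 0.25 hours = 375 hours
Total: 4.8 + 30 + 375 = 409.8 hours, rounded up to the 410 hours requested.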
Payment to Participants
Participants who complete the one-on-one interviews will receive $50 for their participation. This payment offsets participants’ costs of participation, including travel and child care.
Participants who complete the online survey will receive $3.75, a typical rate for similar online tasks.
Data Confidentiality
Cognitive Interviews
Statement for cognitive interview participants
Cognitive interview participants will be read this information and asked for their verbal consent prior to beginning the study. For remote interview participants, the following information will be shown via screen sharing in Microsoft Teams. In-person cognitive interviews will be audio recorded with the participant’s permission. If permission is given, the interviewer will state the date and time of the interview and re-ask permission for the recording.
With your permission, I would like to audio record our conversation. This will allow me to concentrate on what you are saying instead of taking notes while you are talking.
All your responses and everything you say will be kept strictly confidential, and only researchers working on this project will see your answers or hear the recording.
Your participation in this research project is voluntary, and you have the right to stop at any time.
We estimate the session will last up to 1 hour. If you have any comments regarding this estimate or any other aspect of this study, send them by email to BLS_PRA_Public@bls.gov.
The Bureau of Labor Statistics is conducting this voluntary study under OMB No. 1220-0141, which expires on 7/31/2027. Without this currently approved number, we could not conduct this research. Your responses are also protected by law:
The Bureau of Labor Statistics, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act (44 U.S.C. 3572) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent. Per the Federal Cybersecurity Enhancement Act of 2015, Federal information systems are protected from malicious activities through cybersecurity screening of transmitted data. The Privacy Act notice describes the conditions under which information related to this study will be used by BLS employees and agents.
In accordance with the Privacy Act of 1974 as amended (5 U.S.C. 552a), this study is being conducted by the U.S. Department of Labor, Bureau of Labor Statistics (BLS), under the authority of 29 U.S.C. 2. Your voluntary participation is important to the success of this study and will enable the BLS to better understand the behavioral and psychological processes of individuals, as they reflect on the accuracy of BLS information collections. The information will only be used by and disclosed to BLS personnel and contractors who need the information for activities related to improving BLS information collection. Information on routine uses can be found in the system of records notice, DOL/BLS – 14, BLS Behavioral Science Research Laboratory Project Files (81 FR 47418).
Do you have any questions before we proceed?
Do you agree to participate?
Yes, I agree.
No – terminate
Do you agree to our recording our session?
Yes, I agree (obtain consent after starting recording)
No (state that this session will not be recorded)
Online Surveys
Statement for online participants
Online survey participants will be informed of the OMB number and the voluntary nature of the study.
This voluntary study is being collected by the Bureau of Labor Statistics under OMB No. 1220-0141 (Expiration Date: July 31, 2027). Without this currently approved number, we could not conduct this survey. This survey will take approximately 15 minutes to complete. If you have any comments regarding this estimate or any other aspect of this study, send them to BLS_PRA_Public@bls.gov. The BLS cannot guarantee the protection of survey responses and advises against the inclusion of sensitive personal information in any response. This survey is being administered by SurveyMonkey and resides on a server outside of the BLS Domain. Your participation is voluntary, and you have the right to stop at any time.
Attachments
Appendix A: Cognitive Interview Protocol
Appendix B: Online survey
Appendix C: Screening Questions for Cognitive Interviews
Appendix D: Advertisements for Cognitive Interviews