Supporting Statement For
Cognitive and Psychological Research
OMB Control Number 1220-0141
OMB Expiration Date: 7/31/2024
This ICR seeks OMB approval for an extension of the BLS Cognitive and Psychological generic package administered by the Bureau of Labor Statistics (BLS) Behavioral Science Research Center (BSRC) to conduct research that improves the quality of data collection and dissemination by examining the psychological and cognitive aspects of methods and procedures. BLS staff, employing state-of-the-art user experience and cognitive psychological testing methods, will conduct these research and development activities. The use of cognitive techniques to improve the quality of data collection and dissemination has been advocated by the Cognitive Aspects of Survey Methodology seminar sponsored by the National Academy of Sciences. The planned research and development activities will be conducted during FY2024 through FY2026 with the goal of improving overall data quality through improved procedures.
A. Justification
1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.
BSRC conducts psychological research focusing on the design and execution of the data collection process in order to improve the quality of data collected and published by BLS. BSRC conducts research aimed at improving data collection quality by assessing questionnaire/form management and administration, issues related to interviewer training and respondent interaction, as well as how the information is presented to data users and stakeholders. BSRC staff work closely with economists and/or program specialists responsible for defining the concepts to be measured by BLS collection programs.
Both questionnaires and forms are used in BLS surveys. Questionnaires specify the preferred wording of the questions to be asked, whereas forms specify the data items to be collected. Each possesses distinctive problems, which in many cases can be related to respondent characteristics, survey content, or format of administration. Such problems impede the effectiveness of surveys and the mission of BLS in general.
The purpose of this request for clearance for cognitive psychological research and development activities by the BSRC is to enhance the quality of BLS data collection procedures, respondent materials, and data dissemination. The basic goal of BSRC is to improve, through interdisciplinary research, the quality of the data collected and published by the BLS. BLS is committed to producing the most accurate and complete data within the highest quality assurance guidelines. It is with this mission in mind that BSRC was created to aid in the effort of not only maintaining, but also improving the quality of the data collection process.
In 1988, Commissioner Janet Norwood established this laboratory to employ behavioral science to investigate all forms of oral and written communication used in the collection and processing of survey data. This exploration also includes all aspects of data collection such as mode, manuals, and interviewer training. BSRC performs a state-of-the-art service for many programs within BLS, DOL, and other agencies as requested, providing questionnaire redesign efforts, survey updates, and improvements in the overall quality of the data collection process. These efforts, in turn, increase data quality and reduce respondent burden. The techniques proposed here have been successfully applied to many BLS surveys.
The research techniques and methods BSRC uses in these studies will include both analyses of questionnaire construction and interview process, as well as survey technology. Within the structure of the questionnaire, BSRC anticipates conducting analyses in the following domains:
Question Analysis - Evaluation of individual questions, appraising question intent, assessment of semantic clarity, and an examination of relationships between questions.
Term Analysis - Evaluation of specific wording and phrases in terms of their psycholinguistic properties and an assessment of respondent interpretation of the meaning of these terms, at both the conscious and unconscious levels.
Instruction Analysis - Inspection of instructions for their semantic clarity, the degree to which they reflect the stated intention of investigators, ease of interpretation, and other considerations which may elicit unambiguous and appropriate answers or behaviors from respondents or interviewers.
Format Analysis - Review of questionnaires or subsets of questions for perceptual characteristics in order to facilitate better respondent comprehension and to promote more focused attention on the questionnaire or form.
Within the interview process, BSRC conducts several analyses to assess nonverbal communication, interpersonal dynamics, and symbolic interaction--the use of cultural symbols to make social statements. Staff members conduct research to evaluate the overall effectiveness of data collection procedural characteristics, including:
Interviewer Characteristics and Behavior Analysis - Study of the presentation of appearance, manner, relation to respondent population, etc., in order to enhance interpersonal skills of interviewers in general and to develop and improve procedures for training interviewers.
Respondent Characteristics and Behavior Analysis - Assessment of the social, cultural, ethnic, and demographic characteristics of the respondent and how they may bear upon interactions with the interviewer. Staff members also observe the behavior of respondents for cues concerning their reactions to the interview process. Because BLS collects data from different populations that change over time, the analysis of respondent characteristics needs frequent updating.
Mode Characteristics - Examination of the unique properties of interviewer and/or respondent behavior as a function of the media used to collect data; for example, self-administered surveys, personal interviews, telephone interviews, video interviews, and interviews utilizing computer-assisted technologies (e.g., CAPI, CASI, and CATI).
Usability Analysis - Evaluation of the effectiveness, efficiency, and satisfaction with which respondents complete tasks assigned to them, especially when using self-guided instruments (PAPI or CASI). This method also applies to data collectors’ interactions with CATI or CAPI instruments, and data user interactions with published information.
Data Collection Methodology Analysis – Assessment of alternative formats for collecting survey data (e.g., respondent provided records, administrative records). Staff will evaluate the validity and reliability of data collected through the alternative methodology as well as the level of respondent burden relative to current procedures.
BLS also uses a variety of methodologies, such as usability analysis, debriefings, and in-depth interviews, to better understand how to communicate more effectively with stakeholder and user communities through its website and other materials.
2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.
The purpose of BSRC’s data collection is to improve Federal data collection and dissemination processes through scientific research. Theories and methods of cognitive science provide essential tools for the development of effective questionnaires. For example, they can provide an understanding of how respondents comprehend survey questions, recall relevant facts from memory, make judgments, and respond. On the basis of such knowledge, BLS can tailor questions to increase the accuracy and validity of the collected information and to reduce respondent burden. BLS can make similar improvements with respect to other aspects of the data collection process.
BSRC’s research contributes to BLS and to the entire survey research field. BSRC shares its research results with BLS and beyond through seminars, training sessions, reports, publications, and presentations at professional conferences. The BSRC staff has instituted a method of peer review to encourage high standards of social science research practice. To show how the research has been used, a list of BSRC staff publications and internal reports covering the last five years can be found in Attachment I.
The BSRC’s research is expected to 1) improve the data collection instruments employed by BLS, 2) increase the accuracy of the economic data produced by BLS and on which economic policy decisions are based, 3) increase the ease of administering survey instruments for both respondents and interviewers, 4) increase response rates in panel surveys as a result of reduced respondent burden, 5) increase the ease of use of the BLS website and other BLS products, and 6) enhance the reputation of BLS through an enhanced user experience with BLS products and publications.
The application of cognitive and psychological theories and methods to survey data collection is widespread and well established. The consequences of failing to scientifically investigate the data collection process are to lag in the use of accepted practices, to apply archaic survey development techniques based on intuition and trial and error, and ultimately to incur a severe cost in data quality and in burden to respondents, interviewers, and data users alike. For example, without knowledge of what respondents can be expected to remember about the past and how to ask questions that effectively aid retrieval of the appropriate information, survey researchers cannot ensure that respondents will not take shortcuts that avoid careful thought in answering the questions, or that respondents will not be subject to undue burden. Likewise, without investigation of the interviewers’ roles and abilities in the data collection process, survey researchers cannot ensure that interviewers will ask their questions correctly with ease and fluency, or record the respondent’s answers correctly.
3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also, describe any consideration of using information technology to reduce burden.
Staff members will design, conduct, and interpret research that contributes new knowledge of the cognitive aspects of human behavior in relationship to questionnaire design and survey methodology. Cognitive psychological research methods in use include such techniques as probing, memory cueing, group discussion, and in-depth interviewing. Depending on research goals, these methods may be used separately or in combination with one another.
Each method has its own advantages, which can include rapid and in-depth testing of questionnaire items, a more detailed understanding of the respondents’ comprehension of concepts, and access to special populations who can be quickly recruited and tested. BSRC will use different methods in different studies, depending on the aspects of the data collection process being evaluated.
Burden in this collection will be held to a minimum. The proposed approach to research of data collection methods is designed to obtain the maximum amount of information for the minimum burden. The research includes such methods as:
cognitive interviews, including the use of vignettes
focus groups
eye tracking
structured evaluation tasks such as card sorts
experiments involving the administration of forms to study respondents
usability tests of existing or proposed data collection and data dissemination systems (including survey respondent materials, software applications, and the public BLS website).
BSRC will use a variety of technologies to support its research. For some methods, such as cognitive interviews, usability tests, or focus groups, BSRC may make audio and/or video recordings of the sessions, with the respondent’s consent, storing the recordings in secure locations. These recordings allow for more detailed analysis of the sessions and let team members who were unable to attend a session live observe it afterward. BSRC may also use tools to automate the presentation of test materials to respondents.
BSRC will also use online tools to recruit respondents and administer unmoderated, self-administered methods. With these tools, BSRC can conduct studies easily and efficiently with a large number of respondents with specific characteristics of interest. These online studies allow researchers to administer smaller tasks across large groups of respondents, reducing the burden for any one respondent. Finally, these self-administered online methods allow for experimentation with survey features, such as question wording or format, in a way that is simply not possible in the laboratory or via video conferencing, given the resources required to obtain the sample sizes necessary to detect statistical differences.
4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item A.2 above.
This research does not duplicate any other research effort being done within the BLS. It will provide critical supplemental information beyond that currently available in the field of survey methodology as it applies to BLS surveys.
This research also does not duplicate any outside-of-government research effort, as its purpose is not to replicate survey research studies. The staff of BSRC is cognizant of current research being done in the field of cognitive psychology through attendance at conferences, research reported in professional journals, and through in-house staff meetings and peer review processes. There is no current, similar, existing data that can be used or modified for the purposes of improving the overall data collection process.
5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.
BSRC data collection efforts focus primarily on information gained through interviews, telephone interviews, and self-administered questionnaires with individuals recruited from the general public. However, in some instances, organizational goals necessitate the involvement of businesses, state agencies, or other entities. To the extent these establishments are included in a research project, they normally participate only once.
6. Describe the consequence to federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.
The planned collection of data will allow BSRC to suggest modifications and alterations to survey research in an ongoing manner. Because this collection is expected to be an ongoing effort, it has the potential to have immediate impact on all survey collection methods within BLS jurisdiction. Its delay would sacrifice potential gain in survey modification within BLS as a whole.
7. Explain any special circumstances that would cause an information collection to be conducted in a manner:
requiring respondents to report information to the agency more often than quarterly;
requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;
requiring respondents to submit more than an original and two copies of any document;
requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years;
in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;
requiring the use of statistical data classification that has not been reviewed and approved by OMB;
that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or
requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.
There are no special circumstances.
8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.
Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.
Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years -- even if the collection-of-information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.
Federal Register: One comment was received as a result of the Federal Register notice published in 88 FR 86681 on Thursday, December 14, 2023.
The comment, which was emailed to BLS on January 1, 2024, was out of scope. The mission of BLS is to measure labor market activity, working conditions, price changes, and productivity in the U.S. economy to support public and private decision making. The research covered by this collection is critical to maximize the quality of BLS data and minimize the burden to respondents.
Outside Consultation: Consultations with individuals outside BLS to obtain views on the availability of data, frequency of collection, suitability of particular methods, clarity of instruction and record keeping, disclosure, or reporting format, and on the data elements to be recorded and/or disclosed, are frequent and ongoing with the National Center for Health Statistics, the Bureau of the Census, the National Center for Science and Engineering Statistics, the National Agricultural Statistics Service, the University of Maryland, the University of Michigan, and other federal agencies and institutions of higher learning. Consultants come from a wide range of subject areas and expertise.
The individual responsible for the BSRC research efforts is:
Rebecca L. Morrison
Director of Behavioral Science Research
Office of Survey Methods Research
Bureau of Labor Statistics
Washington, DC 20212
202-691-5715
9. Explain any decision to provide any payments or gifts to respondents, other than remuneration of contractors or grantees.
Respondents for activities such as cognitive interviews, focus groups, and usability tests under this clearance will receive a small stipend. This practice has proven necessary and effective in recruiting respondents to participate in this small-scale research, and is also employed by the other Federal cognitive laboratories. For sessions conducted in person, or using methods equivalent to in person, the incentive is up to $50 for participation in a cognitive interview or usability test of up to 60 minutes, and up to $90 for participation in a focus group of up to 90 minutes. For cognitive interviews or usability tests of up to 60 minutes conducted through other means, the incentive is up to $40; for focus groups of up to 90 minutes conducted through other means, the incentive is up to $75. BLS may provide smaller incentives than these amounts at its discretion. All requests for incentives must be justified in writing to OMB.
10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.
The data collected from respondents will be tabulated and analyzed only for the purpose of evaluating the research in question. The goal of the consent process, whether through a full consent form or other means, is to ensure respondents understand the voluntary nature of the studies, the use of the information, and that the interview may be recorded or observed.
When feasible, respondents will be shown the consent form (Attachment II), which includes the Privacy Act Statement. The Privacy Act Statement given to respondents is as follows:
In accordance with the Privacy Act of 1974 as amended (5 U.S.C. 552a), this study is being conducted by the U.S. Department of Labor, Bureau of Labor Statistics (BLS), under the authority of 29 U.S.C. 2. Your voluntary participation is important to the success of this study and will enable the BLS to better understand the behavioral and psychological processes of individuals, as they reflect on the accuracy of BLS information collections. The information will only be used by and disclosed to BLS personnel and contractors who need the information for activities related to improving BLS information collection. Information on routine uses can be found in the system of records notice, DOL/BLS – 14, BLS Behavioral Science Research Laboratory Project Files (81 FR 47418).
When not feasible to show the Consent form, such as with some remote or self-administered methods, the researcher will identify the most efficient way to get the required information to the respondent. The approach used for each study will be included in the individual study package.
BSRC research involving BLS surveys with current OMB approval, such as mail or CATI surveys, use the pledge of the existing approved collection or the Privacy Act Statement.
The Confidential Information Protection and Statistical Efficiency Act (CIPSEA) safeguards the confidentiality of individually identifiable information acquired under a pledge of confidentiality for exclusively statistical purposes by controlling access to, and uses made of, such information. CIPSEA includes fines and penalties for any knowing and willful disclosure of individually identifiable information by an officer, employee, or agent of the BLS.
When a study is being conducted under a pledge of confidentiality, the CIPSEA statement provided to respondents in the consent form is as follows:
The Bureau of Labor Statistics, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act (44 U.S.C. 3572) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent. Per the Federal Cybersecurity Enhancement Act of 2015, Federal information systems are protected from malicious activities through cybersecurity screening of transmitted data. The Privacy Act notice describes the conditions under which information related to this study will be used by BLS employees and agents.
BLS policy on the confidential nature of respondent identifiable information (RII) states that “RII acquired or maintained by the BLS for exclusively statistical purposes and under a pledge of confidentiality shall be treated in a manner that ensures the information will be used only for statistical purposes and will be accessible only to authorized individuals with a need-to-know.”
11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.
Most of the questions that are included on BLS questionnaires are not of a sensitive nature. However, it is possible that some potentially sensitive questions may be included in questionnaires that are tested under this clearance. One of the purposes of this testing is to identify such questions, determine sources of sensitivity, and alleviate them insofar as possible before the actual survey is administered in a production setting.
12. Provide estimates of the hour burden of the collection of information. The statement should:
Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.
If this request for approval covers more than one form, provide separate hour burden estimates for each form.
Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 14.
Estimated Burden
The FY 2024, FY 2025, and FY 2026 estimated burdens are as follows:
| Activity | FY | No. of Respondents | No. of Responses per Respondent | Total Responses | Average Burden (Hours) | Total Burden (Hours)* | Hourly Wage Rate | Total Burden Cost |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Individuals and Households | 2024 | 10,000 | 1 | 10,000 | 0.3333 | 3,333 | $33.82 | $112,722.06 |
| Individuals and Households | 2025 | 10,000 | 1 | 10,000 | 0.3333 | 3,333 | $33.82 | $112,722.06 |
| Individuals and Households | 2026 | 10,000 | 1 | 10,000 | 0.3333 | 3,333 | $33.82 | $112,722.06 |
| Total Individuals and Households | | 30,000 | | 30,000 | | 9,999 | | $338,166.18 |
| Private Sector | 2024 | 450 | 1 | 450 | 1 | 450 | $33.82 | $15,219.00 |
| Private Sector | 2025 | 450 | 1 | 450 | 1 | 450 | $33.82 | $15,219.00 |
| Private Sector | 2026 | 450 | 1 | 450 | 1 | 450 | $33.82 | $15,219.00 |
| Total Private Sector | | 1,350 | | 1,350 | | 1,350 | | $45,657.00 |
| Total FY24-26 | | 31,350 | | 31,350 | | 11,349 | | $383,823.18 |
| Average Annualized Amounts | | 10,450 | | 10,450 | | 3,783 | $33.82 | $127,941.06 |

*Burden estimates include recruiting, screening, and participation.
The estimated dollar cost for these individuals is based on average hourly earnings of employed wage and salaried workers, $33.82 per hour, taken from the August 2023 Current Employment Statistics Program data.
The burden hours are estimated based on the anticipation that the research will require approximately one hour per respondent for interviews and other methods moderated by a researcher. Unmoderated (self-administered) studies range from 5 to 20 minutes on average. Each study will differ substantively from the others. The projects are expected to be complex, at times involving several testing methods to test the hypotheses of a given research question.
In addition to burden hours required for data collection, the Office of Management and Budget has instructed that time spent recruiting and screening respondents for studies be included in estimates of burden. We estimate that screening takes approximately 10 minutes per household respondent for moderated sessions, and 2 minutes per respondent for self-administered methods. Business respondents are often sampled from an existing BLS frame and so screening is not necessary.
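The arithmetic behind the burden table can be checked directly; the following sketch (illustrative only, not part of the official submission) recomputes the Question 12 totals from the per-row inputs, using the $33.82 hourly wage rate cited below.

```python
# Recompute the Question 12 burden-table totals from the per-row inputs.
WAGE = 33.82  # average hourly earnings, August 2023 CES data

rows = [
    # (activity, respondents per FY, responses each, avg burden hours, no. of FYs)
    ("Individuals and Households", 10_000, 1, 0.3333, 3),
    ("Private Sector", 450, 1, 1.0, 3),
]

total_hours = 0
total_cost = 0.0
for activity, n, resp, avg, years in rows:
    hours_per_fy = round(n * resp * avg)   # e.g. 10,000 x 0.3333 -> 3,333
    cost_per_fy = hours_per_fy * WAGE      # e.g. 3,333 x $33.82 -> $112,722.06
    total_hours += hours_per_fy * years
    total_cost += cost_per_fy * years

print(f"FY24-26 hours: {total_hours:,}")          # FY24-26 hours: 11,349
print(f"FY24-26 cost: ${total_cost:,.2f}")        # FY24-26 cost: $383,823.18
print(f"Annualized hours: {total_hours // 3:,}")  # Annualized hours: 3,783
```

These figures match the table: 11,349 total hours and $383,823.18 over the three fiscal years, or 3,783 hours annualized.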
Coverage of the Estimates
The estimates cover the time that each respondent will spend being screened and participating in the research. The time required to travel to the laboratory, if needed, is not covered, since distances and modes of transportation are unknown. No retrieval of information by respondents is anticipated, although it is possible that validation of data at some point may require respondents to keep and check records. In this case, burden hour requests will include the estimated time required to gather records.
Basis for the Estimate
These estimates are based on the BSRC’s previous experience in conducting such research under the existing OMB Clearance 1220-0141, and on expectations concerning the research projects to be conducted in the next 3 years. BSRC staff and its facilities have been increasingly utilized by both program offices and outside agencies, and it is anticipated that this trend will continue.
13. Provide an estimate of the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14).
a) There will be no total capital and start-up cost component for respondents or record keepers resulting from the collection of information.
b) The respondents and record keepers will have no expenses for operation and maintenance or purchase of services resulting from the collection of information.
14. Provide estimates of the annualized cost to the Federal Government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), any other expense that would not have been incurred without this collection of information. Agencies also may aggregate cost estimates from Items 12, 13, and 14 into a single table.
The maximum cost to the Federal Government is $50,000 annually for FY 2024, FY 2025, and FY 2026. These costs consist entirely of the reimbursements paid to respondents and advertisement costs. Other costs, such as operational expenses (e.g., equipment, overhead, printing, fees for use of online panels, and support staff) and any other expense that would not have been incurred without the paperwork burden, are in place as a function of the laboratory proper and would be incurred regardless of this research.
15. Explain the reasons for any program changes or adjustments.
This is a request for extension to the existing OMB Clearance 1220-0141 in order to continue the research mission for another 3 years. The expected total burden for the next 3 years is 11,349 hours, with an annual average of 3,783 hours. This is an increase of 2,949 hours from the previously approved package. The expanded use of unmoderated online testing, mixed mode testing, and projects investigating transitioning from interviewer-administered to self-administered surveys account for the increase.
16. For collections of information whose results will be published, outline plans for tabulations, and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.
This clearance request covers survey methodology and questionnaire quality assurance, including the exploratory activities leading to the evaluation of data quality. Depending on the circumstances, both quantitative and qualitative analyses are planned for the evaluation of these activities.
The results of these investigations will be used primarily to improve the quality of data collection and to assure total collection quality as it relates to data dissemination. Because BLS uses state-of-the-art cognitive psychological testing methods, methodological papers may be written that include tallies of response problems, recall strategies, or results from other testing procedures. These methodological results may be included as an appendix or footnote in a report containing data from a larger data collection effort, or may be prepared for presentation at professional meetings or publication in professional journals. While these methodological publications may include substantive findings as part of the results, the substantive findings will not be published on their own.
This project schedule calls for research to commence once OMB approval is received.
The proposed time schedule is ongoing in nature, with research publication dates dependent on the data collection specified in researchers’ proposals and their subsequent results.
17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.
The BSRC is not seeking approval to omit the display of the expiration date for OMB approval of the information collection.
18. Explain each exception to the certification statement.
There are no exceptions to the certification statement “Certification for Paperwork Reduction Act Submissions.”
ATTACHMENT I
BEHAVIORAL SCIENCE RESEARCH LAB RECENT ARTICLES AND REPORTS BASED ON OMB APPROVED STUDIES
Kopp, B., & Yu, E. (2018). Usability Testing of a Prototype Expenditure Diary. Internal report.
Mockovak, W. (2018). Preliminary Results from Navigation and Card Sorting Tasks for Business Owners and Students Recruited Using Amazon’s Mechanical Turk. Internal report.
Mockovak, W., Fox, J., & Kopp, B. (2018). Cognitive Testing Summary for the Taxonomy Team. Internal report.
Westat (2018). HSOII Cognitive Testing Findings and Recommendations. Internal BLS Report.
Yu, E. (2018). Final testing report for the 2019 CE Surveys Revisions. Internal Report.
Earp, M., Fox, J., & Kaplan, R. (2019). Evaluating Qualifiers in Rating Scales. Presented at the European Survey Research Association 2019 Conference, Zagreb, Croatia, July 18, 2019.
Fox, J., Biagas, D., & Edgar, J. (2019). Usability Testing of BLS.gov Interactive Charts and Tables. Internal Report.
Fox, J.E. (2019). The Impact of Question Format on Reading and Response Behaviors. Presented at the FedCASIC Workshop, Washington, DC, April 16, 2019.
Holzberg, J., Ellis, R., Kaplan, R., Virgile, M., & Edgar, J. (2019). Can They and Will They? Exploring Proxy Response of Sexual Orientation and Gender Identity in the Current Population Survey. Journal of Official Statistics, 35(4), 885-911.
Kaplan, R. (2019). Summary of the Occupational Employment Statistics (OES) “Count Column” Test. Internal report.
Kaplan, R., Holzberg, J., & Phipps, P. (2019). Pretesting SOGI Questions: How Do In-Person Cognitive Interviews Compare to Online Testing? Paper presented at AAPOR, Toronto, Canada.
Kaplan, R., & Holzberg, J. (2019). What Does it Mean to be Burdened? Exploring Subjective Perceptions of Burden. Paper presented at the European Survey Research Association (ESRA) conference, Zagreb, Croatia.
Kaplan, R., & Holzberg, J. (2019). What Does it Mean to be Burdened? Exploring Subjective Perceptions of Burden. Presented at the DC-AAPOR Respondent Burden Workshop.
Kaplan, R., & Yu, E. (2019). Exploring the Mind of the Interviewer: Findings from Research with Interviewers to Improve the Survey Process. Poster presented at the Interviewer Effects from a Total Survey Error Perspective. Lincoln, NE.
Kaplan, R., Biagas, D., & Edgar, J. (2019). Summary of Cognitive Testing of the OES Point Estimate Form. Internal report.
Kaplan, R., Kopp, B., & Phipps, P. (2019). Contrasting Stylized Questions of Sleep with Diary Measures from the American Time Use Survey. Chapter accepted to the Questionnaire Design, Evaluation, and Testing Wiley Volume.
Fox, J., Biagas, D., & Van Horn, S. (2020). Usability Testing the Next Generation of BLS News Releases. Internal Report.
Fox, J.E., Earp, M., & Kaplan, R. (2020). Item Scale Performance. Presented at the American Association of Public Opinion Research (AAPOR) Annual Conference, Virtual, June 11, 2020.
Kaplan R., & Edgar J. (2020). Multi-mode question pretesting: Using traditional cognitive interviews and online testing as complementary methods in Survey Methods: Insights from the Field, Special issue: ‘Advancements in Online and Mobile Survey Methods’. Retrieved from https://surveyinsights.org/?p=14659.
Kaplan, R., & Holzberg, J. (2020). Measuring Subjective Perceptions of Burden over Time. Paper presented at AAPOR Virtual Conference.
Kaplan, R., & Yu, E. (2020). Exploring the Mind of the Interviewer: Findings from Research with Interviewers to Improve the Survey Process. Chapter published in Interviewer Effects from a Total Survey Error Perspective, Edited by Olson, K., Smyth, J. D., Dykema, J., Holbrook, A. L., Kreuter, F., & West, B. T. CRC Press.
Kaplan, R., Kopp, B., & Phipps, P. (2020). Contrasting Stylized Questions of Sleep with Diary Measures from the American Time Use Survey. Advances in Questionnaire Design, Development, Evaluation and Testing, 671-695.
Mockovak, W. (2020). What Is Gained by Asking Retrospective Probes after an Online, Think-Aloud Cognitive Interview? Paper presented at the General Online Research Conference. Berlin, Germany, September 2020.
Yu, E. (2020). Cognitive testing report for the 2021 CE Surveys Revisions. Internal Report.
Yu, E., & Van Horn, S. (2020). Recommended revisions to the Veterans Supplement to the CPS. Internal Report.
Fox, J.E., Van Horn, S., & Biagas, D. (2021). Results of the Scoping Interviews for the Survey of Respirator Use and Practices (SRUP). Internal Report.
Fox, J.E. (2021). Results of the Round 1 Cognitive Interviews for the Survey of Respirator Use and Practices. Internal Report.
Johnson, N., & Fox, J.E. (2021). Implementing a New Web Collection Tool for Multi-Worksite Respondents in the Current Employment Statistics Survey. Presented at the FedCASIC Workshop, Virtual, April 14, 2021. Available at https://www.census.gov/fedcasic/fc2021/pdf/5B_NJohnson.pdf.
Van Horn, S., Fox, J.E., & Biagas, D. (2021). Using Moderated and Unmoderated Usability Testing to Assess Interactive BLS News Releases. Presented at the American Association of Public Opinion Research (AAPOR) Annual Conference, Virtual, May 11, 2021.
Fox, J.E. (2022). Results of the Round 2 Cognitive Interviews for the Survey of Respirator Use and Practices. Internal Report.
Grotophorst, Z., Denton, S., Lee, L., & Bulgar-Medina, J. (2022). Developing a Self-Administered Eldercare Module for the American Time Use Survey. Conference of the Federal Committee on Statistical Methodology, Washington, DC.
Kaplan, R., & Yu, E. (2022). Summary of Online Testing of BLS Survey Questions. Internal memo.
NORC (2022). ATUS Eldercare Questions for Web. Internal Report.
NORC (2022). Mode Differences in CATI and Self-Administered ATUS Web Diaries. Internal Report.
Simpson, H. (2022). BLS Recommendations for Final Third Wave Measurement Changes ORS FY 2022 IAA – Deliverable 7b. Internal Memo.
Williams, D., & Kaplan, R. (2022). Debriefing Results for the Business Response Survey 3.0. Internal Memo.
Fox, J.E. (2023). The Impact of Response Option Wording on Survey Scale Performance. Poster presented at the American Association of Public Opinion Research (AAPOR) Annual Conference, Philadelphia, PA, May 12, 2023.
Kaplan, R., & Narine, V. (2023). Summary of cognitive testing of the American Time Use Survey Leave Module. Internal memo.
Kaplan, R., & Narine, V. (2023). Summary of cognitive testing of the Current Population Survey Work Schedule Supplement. Internal memo.
Kaplan, R., & Walker, T. (2023). Summary of cognitive testing of the Current Population Survey Disability Supplement. Internal memo.
Kaplan, R., & Williams, D. (2023). Cognitive Test Results for COVID-19 and Telework Related Question in the CPS. Internal Memo.
Kaplan, R., & Yu, E. (2023). Please Answer This Important Question: Can Web Prompts Reduce Item Nonresponse? Paper presented at the American Association for Public Opinion Research (AAPOR) Conference, Philadelphia, PA.
Walker, T., Narine, V. R., & Williams, D. (2023). Mode Differences in CATI and Self-Administered American Time Use Survey Web Diaries – OSMR Reanalysis and Corrections. Internal Memo.
Walker, T., Narine, V. R., Williams, D., & Denton, S. (2023). Understanding Potential Mode Effects in Transitioning from a Computer Assisted Telephone Interview to a Self-Administered Web Diary. 78th Annual Conference of the American Association for Public Opinion Research, Philadelphia, PA.
Westat (2023). ATUS Web Diary Development and Testing: Final Report. Internal Report.
Williams, D. (2023). Pilot Test Results for ETA Sponsored Survey on Layoffs and STC. Internal Memo.
Williams, D., Stang, S., & Ulrich, F. (2023). Survey Length & Burden: The Effect of Number of Survey Questions and Complexity on Survey Outcomes. 10th Conference of the European Survey Research Association, Milan, Italy.
Williams, D., Stang, S., & Thomas, E. (2023). Classifying Response Errors in a Low-Budget Establishment Survey to Improve Data Quality. 78th Annual Conference of the American Association for Public Opinion Research, Philadelphia, PA.
ATTACHMENT II
Consent
OMB Control Number: 1220-0141
Expiration Date: month xx, 2027
The Bureau of Labor Statistics (BLS) is conducting research to increase the quality of BLS surveys. This study is intended to suggest ways to improve the procedures the BLS uses to collect survey data.
The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act (44 U.S.C. 3572) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent. Per the Federal Cybersecurity Enhancement Act of 2015, Federal information systems are protected from malicious activities through cybersecurity screening of transmitted data. The Privacy Act notice describes the conditions under which information related to this study will be used by BLS employees and agents.
During this research, you may be audio and/or video recorded, or you may be observed. If you do not wish to be recorded, you may still participate in this research.
We estimate it will take you an average of xx minutes to participate in this research. If you have any comments regarding this estimate or any other aspect of this study, send them to Rebecca L. Morrison, Bureau of Labor Statistics, Office of Survey Methods Research (1220-0141), 2 Massachusetts Ave, NE, Room 2850, Washington, D.C. 20212 or by email to BLS_PRA_Public@bls.gov.
Your participation in this research project is voluntary, and you have the right to stop at any time.
The OMB control number for this information collection is 1220-0141 and expires <date>. Without this approved OMB number, BLS would not be able to conduct this survey.
I have read and understand the statements above. I consent to participate in this study.
__________________________ __________________________ ____________
Participant's signature Participant’s printed name Date
__________________________
Researcher's signature
PRIVACY ACT STATEMENT
In accordance with the Privacy Act of 1974 as amended (5 U.S.C. 552a), this study is being conducted by the U.S. Department of Labor, Bureau of Labor Statistics (BLS), under the authority of 29 U.S.C. 2. Your voluntary participation is important to the success of this study and will enable the BLS to better understand the behavioral and psychological processes of individuals, as they reflect on the accuracy of BLS information collections. The information will only be used by and disclosed to BLS personnel and contractors who need the information for activities related to improving BLS information collection. Information on routine uses can be found in the system of records notice, DOL/BLS – 14, BLS Behavioral Science Research Laboratory Project Files (81 FR 47418).
1 Internal reports available upon request.