Supporting Statement for
Cognitive and Psychological Research
OMB Control Number 1220-0141
OMB Expiration Date: 3/31/2021
This ICR seeks OMB approval for an extension of the Cognitive and Psychological generic package administered by the Bureau of Labor Statistics (BLS) Behavioral Science Research Center (BSRC), which conducts research to improve the quality of data collection by examining the psychological and cognitive aspects of survey methods and procedures. BLS staff, employing state-of-the-art cognitive psychological testing methods, will conduct these research and development activities. The use of cognitive techniques to improve the quality of data collection has been advocated by the Cognitive Aspects of Survey Methodology seminar sponsored by the National Academy of Sciences. The planned research and development activities will be conducted during FY 2021 through FY 2023 with the goal of improving overall data quality through improved procedures. This revision also requests an increase in incentive payments from $40 to $45, which recovers some of the inflation since the incentive was set at $40 in 2006.
A. Justification
1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.
BSRC conducts psychological research focusing on the design and execution of the data collection process in order to improve the quality of data collected and published by BLS. BSRC conducts research aimed at improving data collection quality by assessing questionnaire/form management and administration, issues related to interviewer training and respondent interaction, as well as how the information is presented to data users and stakeholders. BSRC staff work closely with economists and/or program specialists responsible for defining the concepts to be measured by BLS collection programs.
Both questionnaires and forms are used in BLS surveys. Questionnaires specify the preferred wording of the questions to be asked, whereas forms specify the data items to be collected. Each possesses distinctive problems, which in many cases can be related to respondent characteristics, survey content, or format of administration. Such problems impede the effectiveness of surveys and the mission of BLS in general.
The purpose of this request for clearance for cognitive psychological research and development activities by the BSRC is to enhance the quality of BLS data collection procedures, respondent materials, and data dissemination. The basic goal of BSRC is to improve, through interdisciplinary research, the quality of the data collected and published by BLS. BLS is committed to producing the most accurate and complete data within the highest quality assurance guidelines. It is with this mission in mind that BSRC was created to aid in the effort of not only maintaining, but also improving, the quality of the data collection process.
In 1988, Commissioner Janet Norwood established this laboratory to employ behavioral science to investigate all forms of oral and written communication used in the collection and processing of survey data. This exploration also includes all aspects of data collection such as mode, manuals, and interviewer training. BSRC performs a state-of-the-art service for many programs within BLS, DOL, and other agencies as requested, providing questionnaire redesign efforts, survey updates, and improvements in the overall quality of the data collection process. These efforts, in turn, increase data quality and reduce respondent burden. The techniques proposed here have been successfully applied to many BLS surveys.
The research techniques and methods BSRC uses in these studies will include both analyses of questionnaire construction and interview process, as well as survey technology. Within the structure of the questionnaire, BSRC will conduct analyses in the following domains:
Question Analysis - Evaluation of individual questions, appraising question intent, assessment of semantic clarity, and an examination of relationships between questions.
Term Analysis - Evaluation of specific wording and phrases in terms of their psycholinguistic properties and an assessment of respondent interpretation of the meaning of these terms, at both the conscious and unconscious levels.
Instruction Analysis - Inspection of instructions for their semantic clarity, the degree to which they reflect the stated intention of investigators, ease of interpretation, and other considerations which may elicit unambiguous and appropriate answers or behaviors from respondents or interviewers.
Format Analysis - Review of questionnaires or subsets of questions for perceptual characteristics in order to facilitate better respondent comprehension and to promote more focused attention on the questionnaire or form.
Within the interview process, BSRC conducts several analyses to assess nonverbal communication, interpersonal dynamics, and symbolic interaction (the use of cultural symbols to make social statements). Staff members conduct research to evaluate the overall effectiveness of data collection procedural characteristics, including:
Interviewer Characteristics and Behavior Analysis - Study of the presentation of appearance, manner, relation to respondent population, etc., in order to enhance interpersonal skills of interviewers in general and to develop and improve procedures for training interviewers.
Respondent Characteristics and Behavior Analysis - Assessment of the social, cultural, and ethnic characteristics of the respondent and how they may bear upon interactions with the interviewer. Staff members also observe the behavior of respondents for cues concerning their reactions to the interview process. Because BLS constantly collects data from different populations that change over time, the analysis of respondent characteristics needs frequent updating.
Mode Characteristics - Examination of the unique properties of interviewer and/or respondent behavior as a function of the media used to collect data; for example, self-administered surveys, personal interviews, telephone interviews, and interviews using computer-assisted technologies (e.g., CAPI, CASI, and CATI).
Usability Analysis - Evaluation of the effectiveness, efficiency, and satisfaction with which respondents complete tasks assigned to them, especially when using self-guided instruments (PAPI or CASI). This method also applies to data collectors’ interactions with CATI or CAPI instruments, and data user interactions with published information.
Data Collection Methodology Analysis – Assessment of alternative formats for collecting survey data (e.g., respondent provided records, administrative records). Staff will evaluate the validity and reliability of data collected through the alternative methodology as well as the level of respondent burden relative to current procedures.
BLS also uses a variety of methodologies, such as usability analysis, debriefings, and in-depth interviews, to better understand how to communicate more effectively with stakeholder and user communities through its website and other materials.
2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.
The purpose of BSRC's data collection is to improve Federal data collection and dissemination processes through scientific research. Theories and methods of cognitive science provide essential tools for the development of effective questionnaires. For example, they can provide an understanding of how respondents comprehend survey questions, recall relevant facts from memory, make judgments, and respond. On the basis of such knowledge, BLS can tailor questions to increase the accuracy and validity of the collected information and to reduce respondent burden. BLS can make similar improvements with respect to other aspects of the data collection process.
BSRC’s research contributes to BLS and to the entire survey research field. BSRC shares its research results with BLS and beyond through seminars, training sessions, reports, publications, and presentations at professional conferences. The BSRC staff has instituted a method of peer review to encourage high standards of social science research practice. To show how the research has been used, a list of BSRC staff publications and internal reports covering the last five years can be found in Attachment I.
The BSRC’s research is expected to 1) improve the data collection instruments employed by BLS, 2) increase the accuracy of the economic data produced by BLS and on which economic policy decisions are based, 3) increase the ease of administering survey instruments for both respondents and interviewers, 4) increase response rates in panel surveys as a result of reduced respondent burden, 5) increase the ease of use of the BLS website and other BLS products, and 6) enhance the reputation of BLS through an enhanced user experience with BLS products and publications.
The application of cognitive and psychological theories and methods to survey data collection is widespread and well established. The consequences of failing to scientifically investigate the data collection process are to lag in the use of accepted practices, to apply archaic survey development techniques based on intuition and trial and error, and ultimately to incur a severe cost in data quality and in burden to respondents, interviewers, and data users alike. For example, without knowledge of what respondents can be expected to remember about the past and how to ask questions that effectively aid in the retrieval of the appropriate information, survey researchers cannot ensure that respondents will answer carefully rather than take shortcuts, or that respondents will not be subject to undue burden. Likewise, without investigation of the interviewers’ roles and abilities in the data collection process, survey researchers cannot ensure that interviewers will ask their questions correctly, with ease and fluency, or record the respondents’ answers correctly.
3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also, describe any consideration of using information technology to reduce burden.
Staff members will design, conduct, and interpret research that contributes new knowledge of the cognitive aspects of human behavior in relationship to questionnaire design and survey methodology. Cognitive psychological research methods in use include such techniques as probing, memory cueing, group discussion, and in-depth interviewing. Depending on research goals, these methods may be used separately or in combination with one another.
Each method has its own advantages, which can include rapid and in-depth testing of questionnaire items, a more detailed understanding of the respondents’ comprehension of concepts, and access to special populations who can be quickly recruited and tested. BSRC will use different methods in different studies, depending on the aspects of the data collection process being evaluated.
Burden in this collection will be held to a minimum. The proposed approach to research of data collection methods is designed to obtain the maximum amount of information for the minimum burden. The research includes such methods as:
cognitive interviews, including the use of vignettes
focus groups
structured evaluation tasks such as card sorts
experiments involving the administration of forms to study respondents
usability tests of existing or proposed data collection and data dissemination systems (including survey respondent materials and the public BLS website).
BSRC will use a variety of technologies to support its research. For some methods, such as cognitive interviews, usability tests, or focus groups, BSRC may make electronic recordings of the sessions, with the respondent's consent, storing the recordings in secure locations. These recordings allow for more detailed analysis of the sessions. They also allow team members who were not able to attend a session live to observe it later. BSRC may also use tools to automate the presentation of test materials to respondents.
BSRC will also use online tools to recruit respondents and administer unmoderated, self-administered methods. With these tools, BSRC can conduct studies with a large number of respondents with specific characteristics of interest easily and efficiently. These online studies can allow researchers to administer smaller tasks across large groups of respondents, reducing the burden for any one respondent. Finally, these self-administered online methods allow for experimentation with survey features such as question wording or format in a way that is simply not possible in the laboratory, given the resources required to obtain sample sizes large enough to detect statistical differences.
4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item A.2 above.
This research does not duplicate any other research effort within BLS. It will provide critical supplemental information beyond that currently available in the field of survey methodology as it applies to BLS surveys.
This research also does not duplicate any outside-of-government research effort, as its purpose is not to replicate survey research studies. The staff of BSRC is cognizant of current research being done in the field of cognitive psychology through attendance at conferences, research reported in professional journals, and through in-house staff meetings and peer review processes. There is no current, similar, existing data that can be used or modified for the purposes of improving the overall data collection process.
5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.
BSRC data collection efforts focus primarily on information gained through interviews, telephone interviews, and self-administered questionnaires with individuals recruited from the general public. However, in some instances, organizational goals necessitate the involvement of businesses, state agencies, or other entities. To the extent these establishments are included in a research project, they normally participate only once.
6. Describe the consequence to federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.
The planned collection of data will allow BSRC to suggest modifications and alterations to survey research in an ongoing manner. Because this collection is expected to be an ongoing effort, it has the potential to have immediate impact on all survey collection methods within BLS jurisdiction. Its delay would sacrifice potential gain in survey modification within BLS as a whole.
7. Explain any special circumstances that would cause an information collection to be conducted in a manner:
requiring respondents to report information to the agency more often than quarterly;
requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;
requiring respondents to submit more than an original and two copies of any document;
requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years;
in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;
requiring the use of statistical data classification that has not been reviewed and approved by OMB;
that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or
requiring respondents to submit proprietary, trade secret, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.
There are no special circumstances.
8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.
Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.
Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years -- even if the collection-of-information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.
Federal Register: One comment was received in response to the Federal Register notice published at 85 FR 64168 on October 9, 2020. The comment, emailed to BLS on October 9, 2020, expressed the opinion that there is no benefit to BLS from this collection. BLS disagrees: the work covered by this collection is critical to maximizing the quality of BLS data and minimizing the burden on respondents.
Outside Consultation: Consultation with individuals outside BLS to obtain views on the availability of data, frequency of collection, suitability of particular methods, clarity of instruction and record keeping, disclosure, or reporting format, and on the data elements to be recorded and/or disclosed, is frequent and ongoing with the National Center for Health Statistics, the Bureau of the Census, the National Center for Science and Engineering Statistics, the National Agricultural Statistics Service, the University of Maryland, the University of Michigan, and other federal agencies and institutions of higher learning. Consultants come from a wide range of subject areas and expertise.
The individual responsible for the BSRC research efforts is:
Douglas Williams
Director of Behavioral Science Research
Office of Survey Methods Research
Bureau of Labor Statistics
Washington, DC 20212
(202) 691-5707
9. Explain any decision to provide any payments or gifts to respondents, other than remuneration of contractors or grantees.
Respondents for activities such as cognitive interviews, focus groups, and usability tests under this clearance will receive a small stipend. This practice has proven necessary and effective in recruiting respondents to participate in this small-scale research, and it is also employed by the other Federal cognitive laboratories. The incentive for participation in a cognitive interview or usability test is $45, and for participation in a focus group is $75. The slight increase in the incentive for cognitive interviews and usability tests is an adjustment toward the purchasing power of the $40 incentive originally offered in 2006; according to the BLS Consumer Price Index, $40 in 2006 is worth $52.43 today. BLS may provide smaller incentives than these amounts at its discretion; however, any requests for larger amounts must be justified in writing to OMB.
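The inflation rationale above can be checked with simple arithmetic. The sketch below uses only the two dollar figures stated in this item ($40 set in 2006 and its stated present-day equivalent of $52.43); the derived percentages are illustrative calculations, not official BLS figures.

```python
# Implied inflation behind the incentive amounts in item 9.
# Both dollar figures come from the text above; everything else is
# derived arithmetic, not an official CPI series lookup.
base_2006 = 40.00         # incentive amount set in 2006
equivalent_today = 52.43  # stated CPI-adjusted value of that $40
new_incentive = 45.00     # requested incentive amount

cumulative_inflation = equivalent_today / base_2006 - 1
recovered = (new_incentive - base_2006) / (equivalent_today - base_2006)

print(f"cumulative CPI growth since 2006: {cumulative_inflation:.1%}")  # 31.1%
print(f"share of lost purchasing power recovered: {recovered:.0%}")     # 40%
```

In other words, the $45 incentive restores roughly two fifths of the purchasing power lost since 2006, rather than the full CPI-adjusted amount.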
10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.
The data collected from respondents will be tabulated and analyzed only for the purpose of evaluating the research in question. The goal of the consent process, whether through a full consent form or other means, is to ensure respondents understand the voluntary nature of the studies, the use of the information, and that the interview may be recorded or observed.
When feasible, respondents will be shown the consent form explaining the Privacy Act Statement (Attachment II). The Privacy Act Statement given to respondents is as follows:
In accordance with the Privacy Act of 1974 as amended (5 U.S.C. 552a), this study is being conducted by the U.S. Department of Labor, Bureau of Labor Statistics (BLS), under the authority of 29 U.S.C. 2. Your voluntary participation is important to the success of this study and will enable the BLS to better understand the behavioral and psychological processes of individuals, as they reflect on the accuracy of BLS information collections. The information will only be used by and disclosed to BLS personnel and contractors who need the information for activities related to improving BLS information collection. Information on routine uses can be found in the system of records notice, DOL/BLS – 14, BLS Behavioral Science Research Laboratory Project Files (81 FR 47418).
When not feasible to show the Consent form, such as with some remote or self-administered methods, the researcher will identify the most efficient way to get the required information to the respondent. The approach used for each study will be included in the individual study package.
BSRC research involving BLS surveys with current OMB approval, such as mail or CATI surveys, uses the pledge of the existing approved collection or the Privacy Act Statement.
The Confidential Information Protection and Statistical Efficiency Act (CIPSEA) safeguards the confidentiality of individually identifiable information acquired under a pledge of confidentiality for exclusively statistical purposes by controlling access to, and uses made of, such information. CIPSEA includes fines and penalties for any knowing and willful disclosure of individually identifiable information by an officer, employee, or agent of the BLS.
When a study is being conducted under a pledge of confidentiality, the CIPSEA statement provided to respondents in the consent form is as follows:
The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act (44 U.S.C. 3572) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent. Per the Federal Cybersecurity Enhancement Act of 2015, Federal information systems are protected from malicious activities through cybersecurity screening of transmitted data. The Privacy Act notice on the back of this form describes the conditions under which information related to this study will be used by BLS employees and agents.
BLS policy on the confidential nature of respondent identifiable information (RII) states that “RII acquired or maintained by the BLS for exclusively statistical purposes and under a pledge of confidentiality shall be treated in a manner that ensures the information will be used only for statistical purposes and will be accessible only to authorized individuals with a need-to-know.”
11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.
Most of the questions that are included on BLS questionnaires are not of a sensitive nature. However, it is possible that some potentially sensitive questions may be included in questionnaires that are tested under this clearance. One of the purposes of this testing is to identify such questions, determine sources of sensitivity, and alleviate them insofar as possible before the actual survey is administered.
12. Provide estimates of the hour burden of the collection of information. The statement should:
Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.
If this request for approval covers more than one form, provide separate hour burden estimates for each form.
Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 14.
Estimated Burden
The FY 2021, FY 2022, and FY 2023 estimated burdens are as follows:
| Activity | No. of Respondents | No. of Responses per Respondent | Total Responses | Average Burden (Hours) | Total Burden (Hours) | Hourly Wage Rate | Total Burden Cost |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Individuals and Households, FY 2021† | 12,000 | Once | 12,000 | 0.3333 | 4,000 | $29.47 | $117,880.00 |
| Individuals and Households, FY 2022 | 6,000 | Once | 6,000 | 0.3333 | 2,000 | $29.47 | $58,940.00 |
| Individuals and Households, FY 2023 | 6,000 | Once | 6,000 | 0.3333 | 2,000 | $29.47 | $58,940.00 |
| Total | 24,000 | | 24,000 | | 8,000* | | $235,760.00 |
| Private Sector, FY 2021 | 150 | Once | 150 | 1 | 150 | $29.47 | $4,420.50 |
| Private Sector, FY 2022 | 150 | Once | 150 | 1 | 150 | $29.47 | $4,420.50 |
| Private Sector, FY 2023 | 100 | Once | 100 | 1 | 100 | $29.47 | $2,947.00 |
| Total | 400 | | 400 | | 400** | | $11,788.00 |
| FY21-23 Overall Total | 24,400 | | 24,400 | | 8,400 | | $247,548.00 |
| Average Annualized Amounts | 8,133 | | 8,133 | | 2,800 | $29.47 | $82,516.00 |

†Increase in participants and burden hours for FY21 is due to a one-time project investigating methods for transitioning survey questions from interviewer-administered to mixed mode and self-administered surveys (see item 15).
*Burden estimates include recruiting, screening, online studies, and interviews.
**Burden estimates include recruiting and interviews.
The estimated dollar cost for these individuals is based on average hourly earnings of employed wage and salaried workers, $29.47 per hour, taken from the August 2020 Current Employment Statistics Program data.
The burden hours are estimated based on the anticipation that the research will require approximately one hour per respondent for interviews and other methods moderated by a researcher. Unmoderated (self-administered) studies range from 5 to 20 minutes on average. Each study will differ substantively from the others. The projects are expected to be complex, at times involving several cognitive testing methods to test the hypotheses of the given research question.
In addition to burden hours required for data collection, the Office of Management and Budget has instructed that time spent recruiting and screening respondents for studies be included in estimates of burden. We estimate that screening takes approximately 10 minutes per household respondent for moderated sessions, and 2 minutes per respondent for self-administered methods. Business respondents are often sampled from an existing BLS frame and so screening is not necessary.
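As a cross-check, the totals in the burden table follow directly from the per-respondent figures. The short sketch below reproduces them; all inputs are taken from the table above, with the 0.3333-hour average burden treated as one third of an hour.

```python
# Reproduce the FY 2021-2023 burden totals from the table in item 12.
# Respondent counts, average burden hours, and the $29.47 wage rate
# are all taken from the supporting statement.
WAGE = 29.47  # August 2020 CES average hourly earnings

# (number of respondents, average burden in hours) per fiscal year
households = [(12_000, 1 / 3), (6_000, 1 / 3), (6_000, 1 / 3)]  # FY21-FY23
private_sector = [(150, 1.0), (150, 1.0), (100, 1.0)]           # FY21-FY23

def burden(rows):
    """Total burden hours and dollar cost for a respondent group."""
    hours = sum(round(n * avg) for n, avg in rows)
    return hours, round(hours * WAGE, 2)

hh_hours, hh_cost = burden(households)      # 8,000 hours, $235,760.00
ps_hours, ps_cost = burden(private_sector)  # 400 hours, $11,788.00

print(hh_hours + ps_hours)  # 8400 total burden hours
print(hh_cost + ps_cost)    # 247548.0 total burden cost
```

The reproduced totals (8,400 hours; $247,548.00) match the FY21-23 overall totals in the table.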
Coverage of the Estimates
The estimates cover the time that each respondent will spend being screened and participating in the research. The time required to travel to the laboratory, if needed, is not covered, since distances and modes of transportation are unknown. No retrieval of information by respondents is anticipated, although it is possible that validation of data at some point may require respondents to keep and check records. In this case, burden hour requests will include the estimated time required to gather records.
Basis for the Estimate
These estimates are based on the BSRC’s previous experience in conducting such research under the existing OMB Clearance 1220-0141, and on expectations concerning the research projects to be conducted in the next 3 years. BSRC staff and its facilities have been increasingly utilized by both program offices and outside agencies, and it is anticipated that this trend will continue.
13. Provide an estimate of the total annual cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14).
a) There will be no total capital and start-up cost component for respondents or record keepers resulting from the collection of information.
b) The respondents and record keepers will have no expenses for operation and maintenance or purchase of services resulting from the collection of information.
14. Provide estimates of the annualized cost to the Federal Government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), any other expense that would not have been incurred without this collection of information. Agencies also may aggregate cost estimates from Items 12, 13, and 14 into a single table.
The maximum cost to the Federal Government is $30,000 annually for FY 2021, FY 2022, and FY 2023, comprised entirely of the reimbursements paid to respondents and newspaper advertisement costs. Other costs, such as operational expenses (e.g., equipment, overhead, printing, fees for use of online panels, and support staff), are part of the laboratory's standing operations and would be incurred regardless of this collection.
15. Explain the reasons for any program changes or adjustments.
This is a request for extension to the existing OMB Clearance 1220-0141 in order to continue the research mission for another 3 years. The expected burden for the next 3 years is 8,400 hours, with an annual average of 2,800 hours. This is an increase of 2,100 hours from the previously approved package. The request for FY 2021 includes 1,000 burden hours for individuals and households to cover a one-time project investigating methods for transitioning survey questions from interviewer-administered to mixed mode and self-administered surveys, an area relevant to several BLS surveys. The requests for FYs 2021 and 2022 include an additional 50 hours per year to cover a two-year project involving private sector respondents.
16. For collections of information whose results will be published, outline plans for tabulations, and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.
This clearance request covers survey methodological and questionnaire quality assurance work, including the exploratory activities leading to the evaluation of data quality. Depending on the circumstances, both quantitative and qualitative analyses are planned for the evaluation of these activities.
The results of these investigations will be used primarily to improve the quality of data collection and to assure total collection quality as it relates to data dissemination. Because BLS is using the latest techniques and cognitive psychological testing methodology, methodological papers may be written that include tallies of response problems, recall strategies, or results from other testing procedures used. The methodological results may be included as a methodological appendix or footnote in a report containing data from a larger data collection effort, or may be prepared for presentation at professional meetings or publication in professional journals. While these methodological publications may include substantive findings as part of the results, the substantive findings will not be published on their own.
This project schedule calls for research to commence once OMB approval is received.
The proposed time schedule is continuously ongoing in nature, with publication dates dependent on individual researchers' proposals, data collection, and subsequent results.
17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.
The BSRC is not seeking approval to omit the display of the expiration date for OMB approval of the information collection.
18. Explain each exception to the certification statement.
There are no exceptions to the certification statement “Certification for Paperwork Reduction Act Submissions.”
ATTACHMENT I
BEHAVIORAL SCIENCE RESEARCH LAB RECENT ARTICLES AND REPORTS BASED ON OMB APPROVED STUDIES
Edgar, J. (2016). Cognitive Testing of Revised CIPSEA Pledge. Internal BLS Report.
Edgar, J. (2016). Respondent Perceptions: What happens after legislation and technology collide, collaborate and compromise? 2016 FCSM Policy Conference.
Edgar, J., Esposito, J., Kopp, B., Mockovak, W., and Yu, E. (2016). Identifying Factoryless Goods Producers in the U.S. Statistical System. Presented at the 5th International Conference on Establishment Surveys, Geneva, Switzerland.
Edgar, J., Kaplan, R., Earp, M. (2016). Online Testing of Revised CIPSEA Pledge. Internal BLS Report.
Edgar, J., Kaplan, R. (2017). Does It Matter? Impact of Confidentiality Pledges on Web Survey Response. 2017 American Association of Public Opinion Research Conference.
Edgar, J. (2017). Cognitive Testing Results: CPS UI Non-filers. Internal BLS report.
Edgar, J., Samuel, S. (2017). Factoryless Goods Producers Sensitivity Results Memo. Internal BLS Report.
Westat (2016). Factoryless Goods Producers: Report on In-depth Interviews with Establishments. Internal BLS Report.
Eggleston, C., Olmsted Hawala, E., Edgar, J. (2017). Do They Read It? Using Paradata to Evaluate the Extent to Which Respondents Attend to Confidentiality Pledge Language. 2017 American Association of Public Opinion Research Conference.
Ellis, R., Virgile, M., Holzberg, J., Nelson, D. V., Edgar, J., Phipps, P., & Kaplan, R. (2017). Assessing the Feasibility of Asking Sexual Orientation and Gender Identity in the Current Population Survey: Executive Summary. Internal Report.
Ellis, R., Virgile, M., Holzberg, J., Nelson, D. V., Edgar, J., Phipps, P., & Kaplan, R. (2017). Assessing the Feasibility of Asking Sexual Orientation and Gender Identity in the Current Population Survey: Results from Cognitive Interviews. Internal Report.
Fox, J., Biagas, D., and Edgar, J. (2019). Usability Testing of BLS.gov Interactive Charts and Tables. Internal Report.
Fox, J., Biagas, D., and Van Horn, S. (2020). Usability Testing the Next Generation of BLS News Releases. Internal Report.
Fox, J., Earp, M., and Kaplan, R. (2020). Item Scale Performance. Presented at the American Association of Public Opinion Research, Remote Conference.
Holzberg, J., Ellis, R., Kaplan, R., Virgile, M., & Edgar, J. (2019). Can They and Will They? Exploring Proxy Response of Sexual Orientation and Gender Identity in the Current Population Survey. Journal of Official Statistics, 35(4), 885-911.
Holzberg, J., Ellis, R., Virgile, M., Nelson, D., Edgar, J., Phipps, P., & Kaplan, R. (2017). Assessing the Feasibility of Asking about Gender Identity in the Current Population Survey: Results from Focus Groups with Members of the Transgender Population. Internal Report.
Jones, C., Martinelli, C., Edgar, J. (2016). Collecting Previously Reported Data: Testing Telephone Interviewing Techniques in the Occupational Employment Statistics Survey. Presentation at the Questionnaire Design, Evaluation and Testing 2 conference.
Kaplan, R. & Edgar, J. (2017). Confidentiality Concerns, Do They Matter More than Confidentiality Pledges? Presented at the American Association of Public Opinion Research, New Orleans, LA.
Kaplan, R., Kopp, B., & Phipps, P. (2017). Using wearable devices to assess the validity of diary and stylized sleep measures. Presented at the European Survey Research Association, Lisbon, Portugal.
Kaplan, R., Kopp, B., & Phipps, P. (2019). Contrasting Stylized Questions of Sleep with Diary Measures from the American Time Use Survey. Chapter accepted to the Questionnaire Design, Evaluation, and Testing Wiley Volume.
Kaplan, R., & Phipps, P. (2016). Results from the Office of Survey Methods Research’s Pretesting of the SOII Respondent Re-contact Survey. Internal report.
Kaplan, R., & Yu, E. (2016). What would you ask? Exploring why interviewers select different techniques to reduce question sensitivity. Presented at the American Association of Public Opinion Research, Austin, TX.
Kopp, B., Kaplan, R., & Phipps, P. (2016). Contrasting Stylized Questions of Sleep with Diary Measures from the American Time Use Survey. Presented at the American Association of Public Opinion Research, Austin, TX.
Kaplan, R., Kopp, B., & Phipps, P. (2016). Contrasting Stylized Questions of Sleep with Diary Measures from the American Time Use Survey. Invited presentation at the Questionnaire Design, Evaluation, and Testing Conference, Miami, FL.
Kaplan, R., Biagas, D., and Edgar, J. (2019). Summary of Cognitive Testing of the OES Point Estimate Form. Internal report.
Kaplan, R. (2019). Summary of the Occupational Employment Statistics (OES) “Count Column” Test. Internal report.
Kaplan, R., and Yu, E. (2019). Exploring the Mind of the Interviewer: Findings from Research with Interviewers to Improve the Survey Process. Poster presented at the Interviewer Effects from a Total Survey Error Perspective workshop, Lincoln, NE.
Kaplan, R. Holzberg, J., and Phipps, P. (2019). Pretesting SOGI Questions: How Do In-Person Cognitive Interviews Compare To Online Testing? Paper presented at AAPOR, Toronto, Canada.
Kaplan, R., and Holzberg, J. (2019). What Does it Mean to be Burdened? Exploring Subjective Perceptions of Burden. Paper presented at the European Survey Research Association conference, Zagreb, Croatia.
Kaplan, R. and Holzberg, J. (2019). What Does it Mean to be Burdened? Exploring Subjective Perceptions of Burden. Presented at the DC-AAPOR Respondent Burden Workshop.
Kaplan, R. and Holzberg, J. (2020). Measuring Subjective Perceptions of Burden over Time. Paper presented at AAPOR Virtual Conference.
Kaplan, R. L., Kopp, B., & Phipps, P. (2020). Contrasting Stylized Questions of Sleep with Diary Measures from the American Time Use Survey. Advances in Questionnaire Design, Development, Evaluation and Testing, 671-695.
Kaplan, R., and Yu, E. (2020). Exploring the Mind of the Interviewer: Findings from Research with Interviewers to Improve the Survey Process. Chapter published in Interviewer Effects from a Total Survey Error Perspective, Edited by Olson, K., Smyth, J. D., Dykema, J., Holbrook, A. L., Kreuter, F., & West, B. T. CRC Press.
Kopp, B., Edgar, J. (2016). Current Population Survey Program Electronic Mediation of Contingent Work Question Cognitive Testing. Internal BLS report.
Kopp, B., & Yu, E. (2016). Final Cognitive Testing Report for the CEQ 2017 Changes. Internal Report.
Kopp, B. (2016). Usability Test Results from the Testing of the CE Paper Diary. Internal Report.
Kopp, B. & Yu, E. (2018). Usability Testing of a Prototype Expenditure Diary. Internal report.
Mockovak, W., Fox, J., and Kopp, B. (2018). Cognitive Testing Summary for the Taxonomy Team. Internal report.
Mockovak, W. (2018). Preliminary Results from Navigation and Card Sorting Tasks for Business Owners and Students Recruited Using Amazon’s Mechanical Turk. Internal report.
Mockovak, W. (2020). What Is Gained by Asking Retrospective Probes after an Online, Think-Aloud Cognitive Interview? Paper presented at the General Online Research Conference. Berlin, Germany, September 2020.
Mockovak, W., & Kaplan, R. (2016). Comparing Face-to-Face Cognitive Interviewing with Unmoderated, Online Cognitive Interviewing with Embedded and Follow-Up Probing. Presented at the Questionnaire Design, Evaluation, and Testing Conference, Miami, FL.
Phipps, P., Kaplan, R., & Kopp, B. (2017). Exploring Interviewer and Respondent Interactions Surrounding Sleep Questions in the American Time Use Survey. Presented at the American Association of Public Opinion Research, New Orleans, LA.
Redline, C., Bournazian, J., Edgar, J., Ridolfo, H. (2017). Do Establishments Understand It? Cognitive Interviewing Assessment of Confidentiality Pledges for Establishment Surveys. 2017 American Association of Public Opinion Research Conference.
Scherer, A., Edgar, J. (2016). Confidentiality Pledge Changes: TryMyUI Testing. Internal BLS Report.
Swallow, A., Kaplan, R., Edgar, J. (2017). Exploring Respondents' Perceptions of Data Confidentiality and Enhanced Cybersecurity. 2017 FedCASIC Conference.
Westat (2017). Factoryless Goods Producers Enterprise vs. Establishment – Task Order #27. Internal BLS Report.
Westat (2016). Factoryless Goods Producers: Report on In-depth Interviews with Industry Associations. Internal BLS Report.
Westat (2017). Testing Global Questions for the Consumer Expenditure Gemini Recall Interview. Internal BLS Report.
Westat (2016). U.S. Consumer Expenditure Survey (CE) - Electronic Desktop Diary Design Improvements – Task Order #24. Internal BLS Report.
Westat (2018). HSOII Cognitive Testing Findings and Recommendations. Internal BLS Report.
Yan, T., Warren, A., Sun, H., & Muller, M. (2017). Testing Global Questions for the Consumer Expenditure Gemini Recall Interview. Internal Report.
Yu, E., (2017). Results from pre-testing the proposed questions for collecting outlet information in the Q183 Consumer Expenditure Quarterly Interview Survey. Internal Report.
Yu, E. (2017). Finding the value of electronic records in the CE. Internal Report.
Yu, E. & Swallow, A. (2018). The Roles of Typicality, Specificity, and Set Size in Providing Examples for Survey Questions: Evidence from an online study using self-administered comprehension probes.
Yu, E. (2018). Final testing report for the 2019 CE Surveys Revisions. Internal Report.
Yu, E. (2020). Cognitive testing report for the 2021 CE Surveys Revisions. Internal Report.
Yu, E. & Van Horn, S. (2020). Recommended revisions to the Veterans Supplement to the CPS. Internal Report.
Yu, E. (2016). Electronic records online study report. Internal Report.
Yu, E. (2016). Testing instructions for organizing electronic records for the redesigned CE interview – Final report. Internal Report.
Yu, E. (2016). Testing New Interview Protocols: Lessons Learned about Interviewers, Respondents, and Survey Content. Presented at the Annual Conference of the Joint Statistical Meetings, Chicago, IL.
Yu, E., Martinez, W., Kopp, B., & Fricker, S. (2016). Using Text Analysis to Find the Meaning of Respondent Burden. Presented at the Annual Conference of the American Association for Public Opinion Research, Austin, TX.
ATTACHMENT II
Consent Form
OMB Control Number: 1220-0141
Expiration Date: month xx, 2024
The Bureau of Labor Statistics (BLS) is conducting research to increase the quality of BLS surveys. This study is intended to suggest ways to improve the procedures the BLS uses to collect survey data.
The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act (44 U.S.C. 3572) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent. Per the Federal Cybersecurity Enhancement Act of 2015, Federal information systems are protected from malicious activities through cybersecurity screening of transmitted data. The Privacy Act notice on the back of this form describes the conditions under which information related to this study will be used by BLS employees and agents.
During this research you may be audio and/or video recorded, or you may be observed. If you do not wish to be recorded, you may still participate in this research.
We estimate it will take you an average of xx minutes to participate in this research.
Your participation in this research project is voluntary, and you have the right to stop at any time. If you agree to participate, please sign below.
The OMB control number for this information collection is 1220-0141 and expires <date>. Without this approved OMB number, BLS would not be able to conduct this survey.
------------------------------------------------------------------------------------------------------------
I have read and understand the statements above. I consent to participate in this study.
___________________________________ ___________________________
Participant's signature Date
___________________________________
Participant's printed name
___________________________________
Researcher's signature
PRIVACY ACT STATEMENT
In accordance with the Privacy Act of 1974 as amended (5 U.S.C. 552a), this study is being conducted by the U.S. Department of Labor, Bureau of Labor Statistics (BLS), under the authority of 29 U.S.C. 2. Your voluntary participation is important to the success of this study and will enable the BLS to better understand the behavioral and psychological processes of individuals, as they reflect on the accuracy of BLS information collections. The information will only be used by and disclosed to BLS personnel and contractors who need the information for activities related to improving BLS information collection. Information on routine uses can be found in the system of records notice, DOL/BLS – 14, BLS Behavioral Science Research Laboratory Project Files (81 FR 47418).
1 Internal reports available upon request.