Alternative Supporting Statement for Information Collections Designed for
Research, Public Health Surveillance, and Program Evaluation Purposes
2019 National Survey of Early Care and Education COVID-19 Follow-up
OMB Information Collection Request
0970 - 0391
Supporting Statement
Part A
October 2020
Submitted By:
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
4th Floor, Mary E. Switzer Building
330 C Street, SW
Washington, D.C. 20201
Project Officers: Ivelisse Martinez-Beck and Ann Rivera
Part A
Executive Summary
Type of Request: This Information Collection Request is for a reinstatement with changes to a previously approved information collection (OMB #0970-0391, previously titled, “The 2019 National Survey of Early Care and Education (2019 NSECE): The Household, Provider, and Workforce Surveys”). We are requesting one year of approval.
Progress to Date: Data collection for the 2019 NSECE ended in July 2019, as scheduled. Participants included 11,000 center-based or home-based early care and education (ECE) providers, and 5,000 workforce members from center-based providers, as well as 8,000 households with children under age 13. Data are currently being analyzed and released.
Timeline: One additional year of approval is requested to field two additional waves of data collection from early care and education providers and workforce members to understand their experience of the COVID-19 pandemic.
Previous Terms of Clearance: None.
Summary of changes requested: Two waves of longitudinal data collection are proposed for providers and workforce members who completed interviews in the 2019 NSECE. Using characteristics of sample members from the 2019 NSECE, the collected data will allow nationally-representative descriptions of the impact of the COVID-19 pandemic on the nation’s ECE supply as of 2019 and the individuals who were working to provide that supply in 2019.
We do not intend for this information to be used as the principal basis for public policy decisions.
Time Sensitivity: Data collection approval is requested by November 1, 2020, so that timely data collection can take place during and in the immediate aftermath of the pandemic emergency. Moreover, as delays occur in data collection launch, we anticipate increased difficulties in reaching sampled providers and workforce members as contact information becomes invalid due to pandemic-induced changes in employment, residence, and financial ability to maintain phone numbers and internet access.
A1. Necessity for Collection
There are no legal or administrative requirements that necessitate this collection. ACF is undertaking the collection at the discretion of the agency. ACF seeks approval for information collection (IC) activities as part of its effort to better understand how the COVID-19 pandemic is affecting early care and education (ECE) supply and workforce, through a COVID-focused follow-up IC of the 2019 National Survey of Early Care and Education's sampled ECE programs and workforce (2019 NSECE; OMB #0970-0391), sponsored by the Office of Planning, Research, and Evaluation, Administration for Children and Families (ACF), U.S. Department of Health and Human Services (HHS). A follow-up study using the NSECE sample offers a unique opportunity to answer questions about the effect of the pandemic on ECE because the sample is nationally representative of all sectors of the ECE system and offers recent pre-pandemic data that can be used for pre- and post-pandemic comparisons.
The 2012 and 2019 NSECE surveys provided the first national portrait of demand for and supply of ECE in 20 years. In March 2020, about a year after the 2019 NSECE data were collected (January-June 2019), the COVID-19 pandemic struck the United States. The pandemic has dramatically affected the ECE system: some states mandated closures of all ECEs except those providing emergency care to essential workers; ECEs have adopted different policies and operating procedures to support worker and child safety; and multiple funding streams have emerged to help ECEs weather closures and additional costs during the pandemic.1 Data are needed to identify these changes to the ECE system and their effects, including how many and which providers were able to access financial assistance and the impact of this assistance on programs' capacity and quality. Americans rely on ECE in order to work; thus, a functional ECE system is critical to the country's economic recovery. Policy makers, researchers, and practitioners need to better understand how the pandemic is affecting ECEs in order to better support the ECE workforce and families during and after the pandemic.
A2. Purpose
Purpose and Use
The COVID-19 pandemic has had an enormous impact on the early care and education sector, although consistent, representative data documenting that impact are not available. The NSECE COVID-19 Follow-up will provide consistent and representative data to ACF for better understanding how the pandemic has affected providers and workforce members, how the ECE supply available to families may have changed, and how providers and workforce members might better be supported in future emergency situations. The information collected is meant to contribute to the body of knowledge on ACF programs. It is not intended to be used as the principal basis for a decision by a federal decision-maker, and is not expected to meet the threshold of influential or highly influential scientific information.
Research Questions or Tests
The NSECE COVID-19 Follow-up is designed to address the research questions listed in Appendix A.
Study Design
We propose two longitudinal waves following up the nationally representative samples achieved in the 2019 NSECE of center-based ECE providers, paid home-based ECE providers, and classroom-assigned center-based workers. The resulting data will document the experiences of ECE providers during the COVID-19 pandemic. Center-based ECE providers operated by school districts that require review of research protocols will not be fielded, due to time constraints in securing approval; publicly available administrative data will be sought to represent these centers where available. NSECE COVID-19 Follow-up data will describe the pandemic experiences of providers active in 2019. To the extent that new providers have entered ECE provision since spring 2019, those providers will not be described by the collected Follow-up data.
Exhibit A1. Key Features of Proposed Data Collection Activities
Home-based Provider Survey
Instrument(s): Home-based Provider Questionnaire, Wave 1 and Wave 2
Respondents: paid home-based providers who participated in the 2019 NSECE
Content: Employment calendar, experience with pandemic programs, ECE practices during reference period, ECE status during focal week, current personal situation
Purpose: Understand provider practices and statuses since March 2020, as well as personal impact on providers
Mode: web, with phone as needed
Duration: 20 minutes

Center-based Provider Survey
Instrument(s): Center-based Provider Questionnaires, Wave 1 and Wave 2
Respondents: center-based providers who participated in the 2019 NSECE
Content: Care status calendar, experience with pandemic programs, ECE practices during reference period, ECE status during focal week
Purpose: Understand center practices and statuses since March 2020
Mode: web, with phone as needed
Duration: 20 minutes

Workforce Survey
Instrument(s): Center-based Worker Questionnaire, Wave 1 and Wave 2
Respondents: center-based workers who participated in the 2019 NSECE
Content: Employment calendar, ECE practices during reference period, ECE status during focal week, current personal situation
Purpose: Understand ECE professional experiences and personal impact on ECE workers since March 2020
Mode: web, with phone as needed
Duration: 20 minutes
Other Data Sources and Uses of Information
The most important additional data sources will be the 2019 NSECE data, which will provide the baseline pre-pandemic conditions for participating sample members, as well as the basis for assessing any non-response bias. In addition, the 2012 and 2019 NSECE analyses and data files have linked to tract-level data from the Census Bureau’s American Community Survey for community-level contextual information, for example, on community poverty density and urbanicity. We plan to make use of school districts’ administrative data to gather information for centers that cannot be fielded due to research review requirements.
Additional linkages to publicly available data may be undertaken by secondary data analysts to understand how pandemic conditions such as local infection rates or emergency child care licensing rules may have affected workers or programs.
A3. Use of Information Technology to Reduce Burden
Data collection is proposed using web-based interviewing, which allows respondents to complete the interview at their own convenience and with minimal contact with contractor data collection staff. Self-administration is generally faster than interviewer administration because most adults read silently faster than an interviewer can read questions aloud, so web-based interviews are expected to reduce burden relative to interviewer administration. We also propose to use responses from prior interviews, administrative data sources, and programming within the instrument to minimize burden. The proposed approach minimizes cost to the government, maximizes respondent flexibility and convenience, and reduces the processing time needed to provide data to the agency and research community.
A4. Use of Existing Data: Efforts to reduce duplication, minimize burden, and increase utility and government efficiency
Every effort has been made to determine whether similar research and information exists by searching existing data and reports, and in consultation with federal staff. As part of these efforts, the study team reviewed reports and other published resources that specifically focus on the impacts of COVID-19 on ECEs. Reports were found for 23 states as well as from three private organizations that conducted multi-state surveys.
Several state and three national surveys have been conducted to identify the impact of COVID-19 on early care and education in the U.S. These include both household surveys, which primarily focus on identifying families’ needs and preferences for child care in light of COVID-19; and provider surveys, which primarily focus on the impact of COVID-19 on program closures and financial needs.
Though extant data sets provide useful information, they also leave multiple gaps in our knowledge base on the impact of COVID-19 on early care and education (ECE) programs and the ECE workforce. First, existing data from national organizations, such as the National Association for the Education of Young Children, LENA and the National Association of Family Child Care Providers, and from each of the 23 states with published reports use convenience sampling procedures via web-based surveys. This sampling technique favors data from motivated providers and cannot be assumed to yield representative data. Second, the data collected tend to focus on a limited number of topics and do not include much, if any, information on members of the ECE workforce or the impact of COVID-19 on the quality of care offered and educational practices used in ECE settings.
A COVID follow-up on the 2019 NSECE differs from existing studies in several important ways:
A COVID follow-up of the 2019 NSECE would allow for reliable comparison of pre- and post-pandemic data. The 2019 NSECE provider surveys were fielded in January-June of 2019, yielding recent, comprehensive, and nationally representative data on center- and home-based ECE programs as well as members of the ECE workforce. A COVID follow-up survey would allow for comparisons of the availability and usage of child care, dimensions of quality, and educational practices by type of care pre- and post-pandemic.
A COVID follow-up of the 2019 NSECE would provide nationally representative data on the impact of the pandemic on ECE. This nationally representative picture of the pandemic's impact on early care and education would serve to contextualize and situate smaller state surveys.
A COVID follow-up of the 2019 NSECE would survey multiple sectors of providers. The 2019 NSECE collected data from providers across all sectors of ECE (informal home-based providers, licensed home-based providers, and center-based providers, including Head Start, pre-K, and community-based ECE). This is in contrast to existing surveys, which have targeted only one or two sectors of ECE (e.g., just community-based centers and/or family child care providers). Having pre- and post-pandemic data from providers in each of these sectors would allow for analyses of shifts in the workforce (e.g., from center- to home-based care).
A COVID follow-up of the 2019 NSECE would provide information on low-income families. The 2019 NSECE oversampled low-income communities. A COVID follow-up of this sample would allow researchers to explore the impact of COVID-19 by comparing high- and low-poverty-density neighborhoods.
We therefore conclude that additional information beyond existing research and data is needed to better understand the impacts of COVID-19 on all sectors of ECEs across the country. The study team concluded that no existing data source can provide all data needed to fully address the study’s objectives. We also propose to seek administrative data for school-based centers where available, rather than imposing burden on these organizations.
A5. Impact on Small Businesses
Some ECE providers in the selected sample are small businesses. The proposed data collection approach provides flexibility in timing and mode of interview participation to minimize burden on these businesses and other respondents.
A6. Consequences of Less Frequent Collection
Given the extended duration of the current pandemic, a single wave of data could not capture both the pandemic experience and the efforts to recover after the pandemic. Both are critical to the agency's information needs for supporting ECE providers and the availability of high-quality ECE to families, which in turn supports families' participation in the economy and their children's development.
A7. Now subsumed under 2(b) above and 10 (below)
A8. Consultation
Federal Register Notice and Comments
In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on July 30, 2020, Volume 85, Number 147, page 45887, and provided a sixty-day period for public comment. During the notice and comment period, two comments were received, which are attached. In response to comments from Local Initiatives Support Corporation (LISC), we have included questions on anticipated facilities improvement needs in the Center-based and Home-based questionnaires. We note that these questions (D27-D28 in Center-based, and D31-D32 in Home-based) have not undergone cognitive testing, so may require further adjustment prior to fielding. We have reviewed comments from the Center for the Study of Child Care Employment and find that the proposed questionnaires and design are consistent with their recommendations.
Expert consultations have taken place regarding how best to design an information collection to support the agency’s information needs, as well as on data availability, data collection, and dissemination strategies. We have consulted several individuals outside of ACF, through group discussions, materials review, and one-on-one correspondence.
Daphna Bassok, University of Virginia
Erin Hardy, Brandeis University
Kate Gallagher, Buffett Early Childhood Institute, University of Nebraska
Heather Sandstrom, Urban Institute
Robert M. Goerge, Chapin Hall at the University of Chicago
Marcy Whitebook, Center for the Study of Child Care Employment, University of California, Berkeley
A9. Tokens of Appreciation
We propose to use pre-paid incentives as part of the overall data collection strategy for the NSECE COVID-19 study for the following reasons:
To increase the likelihood that data collection can be completed within the three-month data collection windows for both proposed waves of the COVID-19 Follow-up. Compressed time periods are essential, both to facilitate collection of two separate waves of data, and because extended data collection periods will make it more difficult to chart the course of the pandemic over time.
To minimize bias in estimates, ensuring that the final data sets are representative of the target population.
To ensure that target sample sizes are achieved. Target sample sizes are required to support high priority subgroup analyses, especially ECE providers and workers in rural areas.
Pre-Paid Incentive and Experiment
Research about the effectiveness of survey incentives has found that pre-paid mail incentives have a larger impact per dollar than pre-paid incentives by other modes or post-paid incentives. Based on detailed experimentation in the 2017 National Household Education Survey (NHES), the US Department of Education implemented a $5 pre-pay incentive in the 2019 NHES (OMB # 1850-0768). Although the NHES was a household survey, it appears to be the most rigorous test available of pre-payment incentives.
We propose a $5 unconditional pre-paid incentive to encourage participation in all three sample types; however, we will build into the Wave 1 data collection an experiment on the timing of the pre-payment. Our data collection approach calls for two parallel modes of outreach, which will occur simultaneously but in a coordinated fashion: email encouragement to complete the web questionnaire, together with interviewer telephone outreach to encourage web completion (or a telephone interview, if needed). We propose an experiment to understand the optimal timing of the pre-paid incentive, which will be sent by postal mail.
The table below indicates four experimental conditions, varying the timing of a postal mailing with incentive and including no-incentive conditions as controls. Studies often include pre-payment incentives in initial mailings to encourage self-response before deploying more costly interviewer outreach methods. Also, the 2017 NHES found that larger pre-payment incentives were more effective in the first two mailings to respondents than in subsequent outreach.
However, the 2012 and 2019 NSECE experience with early care and education providers and workers is that they rarely respond to invitations for survey participation without personalized interviewer outreach supplemented by repeated email encouragement. For these individuals, a pre-payment incentive may be more effective once respondents have been convinced of the legitimacy of the survey through interviewer communication and once participation is facilitated by the ease of an emailed survey link (rather than having to type in a web address from a hard-copy mailing).
The 2019 NSECE included a similar $2 pre-payment incentive for the listed home-based provider sample to induce sampled individuals to reveal whether or not they were eligible for the listed home-based provider survey (many would have stopped providing care and so become ineligible). Because there was no experimental design, we are unable to report findings of the efficacy of that incentive offer.
The table below includes a breakdown of the approximate percentage of respondents we propose to assign to each condition.
Exhibit A2. Proposed experiment for pre-paid incentive to encourage self-administration
Treatment category | Percentage of sample | Timing of postal mailing | Dollar amount of postal mailing
Initial postal mailing with incentive | 42 | Prior to email or interviewer outreach | $5
Postal mailing in Week 4 with incentive | 42 | Week 4, after attempted email and interviewer outreach | $5
Initial postal mailing, no incentive | 8 | Prior to email or interviewer outreach | None
Postal mailing in Week 4 with no incentive | 8 | Week 4, after attempted email and interviewer outreach | None
The experiment pertains to sample members for whom we have both an email address and a mailing address. Some respondents have only one type of contact information (only email or only a mailing address), while others have both. If the only contact information we have is a mailing address, we will include a $5 pre-paid incentive with the initial mailing. If the only contact information we have is an email address, the sample member will not receive a pre-paid incentive.
We propose an experiment in Wave 1 to identify the optimal timing of a postal mailing with incentive among this survey population. We would implement the pre-payment incentive in Wave 2 based on the Wave 1 results, which may yield different implications for different sample types or sample subgroups. We will evaluate the experiment in terms of overall completion rates, costs incurred, and representativeness of the sample.
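To make these allocation rules concrete, the sketch below shows one way the Exhibit A2 percentages and the contact-information rules described above could be implemented. The condition labels, function name, and use of simple randomization are illustrative assumptions for this sketch, not a description of the contractor's actual sample-management procedures.

```python
import random

# Illustrative sketch only: allocate one sample member to a pre-paid
# incentive condition, following the Exhibit A2 percentages (42/42/8/8)
# and the rules for members with only one form of contact information.
CONDITIONS = [
    ("Initial postal mailing with incentive ($5)", 0.42),
    ("Week 4 postal mailing with incentive ($5)", 0.42),
    ("Initial postal mailing, no incentive", 0.08),
    ("Week 4 postal mailing, no incentive", 0.08),
]

def assign_incentive_condition(has_email: bool, has_mailing_address: bool) -> str:
    """Return a pre-paid incentive condition for one sample member."""
    if has_email and has_mailing_address:
        # Experimental cases: randomize across the four timing/amount conditions.
        labels = [label for label, _ in CONDITIONS]
        weights = [weight for _, weight in CONDITIONS]
        return random.choices(labels, weights=weights, k=1)[0]
    if has_mailing_address:
        # Mailing address only: $5 pre-paid incentive with the initial mailing.
        return "Initial postal mailing with incentive ($5), non-experimental"
    # Email only: no pre-paid incentive can be mailed.
    return "No pre-paid incentive, non-experimental"

# Example: a sample member with both email and mailing address is randomized.
print(assign_incentive_condition(has_email=True, has_mailing_address=True))
```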
A10. Privacy: Procedures to protect privacy of information, while maximizing data sharing
Personally Identifiable Information
Information will not be maintained in a paper or electronic system from which data are actually or directly retrieved by an individual's personal identifier.
Assurances of Privacy
Information collected will be kept private to the extent permitted by law. Respondents will be informed of all planned uses of data, that their participation is voluntary, and that their information will be kept private to the extent permitted by law. As specified in the contract, the Contractor will comply with all Federal and Departmental regulations for private information.
Due to the sensitive nature of this research (see A11 for more information), the evaluation will obtain a Certificate of Confidentiality. The study team has applied for this Certificate and will provide it to OMB once it is received. The Certificate of Confidentiality helps to assure participants that their information will be kept private to the fullest extent permitted by law.
Data Security and Monitoring
As specified in the contract, the Contractor shall protect respondent privacy to the extent permitted by law and will comply with all Federal and Departmental regulations for private information. The Contractor has developed a Data Safety and Monitoring Plan that assesses all protections of respondents’ PII. The Contractor shall ensure that all of its employees, subcontractors (at all tiers), and employees of each subcontractor, who perform work under this contract/subcontract, are trained on data privacy issues and comply with the above requirements.
As specified in the evaluator's contract, the Contractor shall use Federal Information Processing Standard (FIPS) compliant encryption (Security Requirements for Cryptographic Modules, as amended) to protect all instances of sensitive information during storage and transmission. The Contractor shall securely generate and manage encryption keys to prevent unauthorized decryption of information, in accordance with the Federal Information Processing Standard. The Contractor shall ensure that this standard is incorporated into the Contractor's property management/control system and shall establish a procedure to account for all laptop computers, desktop computers, and other mobile devices and portable media that store or process sensitive information. Any data stored electronically will be secured in accordance with the most current National Institute of Standards and Technology (NIST) requirements and other applicable Federal and Departmental regulations. In addition, the Contractor must submit a plan for minimizing, to the extent possible, the inclusion of sensitive information on paper records, and for the protection of any paper records, field notes, or other documents that contain sensitive information or PII, that ensures secure storage and limits on access.
A11. Sensitive Information 2
The home-based provider and workforce questionnaires ask respondents to report their total household income for the year preceding questionnaire completion (calendar year 2019 for Wave 1 and 2020 for Wave 2). Questionnaires ask respondents to report income as an exact dollar amount. Because income may be difficult to remember or report, respondents who do not know or refuse to provide an exact amount are then asked to describe their income using pre-defined approximate ranges, such as less than $15,000, $15,001 to $22,500, and so on.
These same questionnaires have additional questions about respondents’ current financial situation, such as recent food insecurity, and receipt of government assistance for low-income individuals. Understanding the financial impact of the COVID-19 pandemic on providers and workforce members and their ability to get help is a critical purpose of the COVID-19 Follow-up, so these questions are essential to achieving the study objectives.
Last, the home-based provider questionnaire may raise additional disclosure concerns for some respondents, such as individuals without full work authorization in the U.S. or those providing ECE services without full compliance with licensing or other requirements. The questionnaire therefore very intentionally avoids any reference to such issues as visa status or licensing status. We view it as essential in gaining cooperation and respondents' trust to be able to assure them that no questions will be asked on these potentially sensitive topics. Visa status and licensing status are not germane to the research questions for the study.
A12. Burden
Explanation of Burden Estimates
Each of the three instruments is expected to average 20 minutes of administration time. Administration time will be considerably shorter for respondents who are not participating in early care and education activities during the survey focal periods, the last week of October 2020 and the first week of April 2021. These rates of prevalence are not possible to know or accurately estimate at this time, given the volatility of the current economic and pandemic situation, but we have used the following assumptions: 20 percent of home-based providers completing the COVID-19 Follow-up interview will not be providing ECE during the focal weeks of each wave; 25 percent of center-based providers will not be providing ECE; and 33 percent of center-based workforce members will not be providing ECE during the focal weeks of each wave.
Estimated Annualized Cost to Respondents
Exhibit A3. Estimated Annualized Cost to Respondents
Instrument | Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Estimated Total/Annual Burden Hours | Average Hourly Wage Rate | Total/Annual Respondent Cost
Home-based Provider Interview, Waves 1 and 2; in ECE during focal week | 2,700 | 1.5 | 0.35 | 1,418 | $12.27 | $17,393
Home-based Provider Interview, Waves 1 and 2; not in ECE during focal week | 675 | 1.5 | 0.25 | 252 | $12.27 | $3,092
Center-based Provider Interview, Waves 1 and 2; in ECE during focal week | 4,388 | 1.5 | 0.37 | 2,435 | $25.81 | $62,849
Center-based Provider Interview, Waves 1 and 2; not in ECE during focal week | 1,463 | 1.5 | 0.22 | 483 | $25.81 | $12,466
(Center-based) Workforce Interview, Waves 1 and 2; in ECE during focal week | 2,367 | 1.5 | 0.375 | 1,331 | $12.27 | $16,337
(Center-based) Workforce Interview, Waves 1 and 2; not in ECE during focal week | 1,166 | 1.5 | 0.239 | 418 | $12.27 | $5,129
Estimated Total Annual Burden Hours and Respondent Cost | | | | 6,337 | | $117,307
Note: The above average hourly wage information includes estimates from the Bureau of Labor Statistics. The median hourly wage for 11-9031 Education Administrators, Preschool and Child Care Center/Program is $25.81.3 The median hourly wage for home-based providers or workforce members is $12.27.4
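To illustrate how the figures in Exhibit A3 are derived, the short sketch below reproduces the arithmetic for one row (home-based providers in ECE during the focal week). The rounding shown is an assumption made for illustration; the table values above are the estimates of record.

```python
# Illustrative sketch of the Exhibit A3 arithmetic for a single row:
# home-based providers in ECE during the focal week.
respondents = 2_700              # number of respondents (Exhibit A3)
responses_per_respondent = 1.5   # number of responses per respondent (Exhibit A3)
hours_per_response = 0.35        # average burden hours per response (Exhibit A3)
hourly_wage = 12.27              # BLS median hourly wage used for this group

total_burden_hours = respondents * responses_per_respondent * hours_per_response
total_respondent_cost = total_burden_hours * hourly_wage

print(round(total_burden_hours))     # 1418 burden hours
print(round(total_respondent_cost))  # 17393 dollars (total/annual respondent cost)
```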
A13. Costs
Honoraria will be provided directly to individual participants as compensation for their time participating in the survey.
Exhibit A4. Proposed Honoraria Models for 2019 NSECE COVID-19 Follow-up
Sample | Post-paid Honorarium Value** | Estimated COVID-19 Follow-up Questionnaire Completion Time (hours) | Estimated 2020 Wages
Home-based provider | $10 | 0.33 | $12.27/hour
Center-based provider | $10 | 0.33 | Median: $25.81/hour
(Center-based) Workforce | $10 | 0.33 | Median: $12.27/hour
Note: Estimated wages from BLS as indicated for Exhibit A3.
A14. Estimated Annualized Costs to the Federal Government
Exhibit A5. Estimated Costs for Proposed Data Collection
Cost Category | Estimated Costs
Instrument Development and OMB Clearance | $162,260
Field Work | $5,211,009
Analysis | $1,294,691
Publications/Dissemination | $407,789
Total costs over the request period | $7,075,749
A15. Reasons for changes in burden
ACF is requesting to reinstate the previously approved information collection (0970-0391) with changes, initiating two additional waves of information collection and thus increasing the previously approved burden estimate.
A16. Timeline
Pending OMB approval, ACF proposes the following updated project timeline:
Questionnaire design: July 2020 – September 2020
Wave 1 data collection: November 1, 2020 – January 31, 2021
Wave 1 data processing/analysis: February 2021 – May 2021
Wave 2 data collection: April 1, 2021 – June 30, 2021
Initial findings release, Wave 1: April 2021 – June 2021
Wave 2 data processing/analysis: July 2021 – October 2021
Initial findings release, Wave 2: September 2021 – November 2021
Data files available for public use: November 2021 – December 2021
A17. Exceptions
No exceptions are necessary for this information collection.
Attachments
Instrument 1: NSECE COVID-19 Follow-up Home-based Provider Questionnaire
Instrument 2: NSECE COVID-19 Follow-up Center-based Provider Questionnaire
Instrument 3: NSECE COVID-19 Follow-up Workforce Questionnaire
Appendix A: NSECE COVID-19 Follow-up Study Research Questions
Appendix B: Center-based Provider COVID-19 Follow-up Survey Contact Materials
Appendix C: Home-based Provider COVID-19 Follow-up Survey Contact Materials
Appendix D: Workforce COVID-19 Follow-up Survey Contact Materials
1 Hunt Institute (2020). COVID-19 resources and policy considerations: Child care state actions. Retrieved from: http://www.hunt-institute.org/covid-19-resources/state-child-care-actions-covid-19/
2 Examples of sensitive topics include (but are not limited to): social security number; sex behavior and attitudes; illegal, anti-social, self-incriminating and demeaning behavior; critical appraisals of other individuals with whom respondents have close relationships, e.g., family, pupil-teacher, employee-supervisor; mental and psychological problems potentially embarrassing to respondents; religion and indicators of religion; community activities which indicate political affiliation and attitudes; legally recognized privileged and analogous relationships, such as those of lawyers, physicians and ministers; records describing how an individual exercises rights guaranteed by the First Amendment; receipt of economic assistance from the government (e.g., unemployment or WIC or SNAP); immigration/citizenship status.
3 https://www.bls.gov/oes/current/oes119031.htm
4 https://www.bls.gov/oes/current/oes399011.htm