Supporting Statement for OMB Clearance for the Study of Non-Response to the School Meals Application Verification Process
Part B
Social Science Research Analyst
Office of Policy Support
Food and Nutrition Service
United States Department of Agriculture
3101 Park Center Drive, Room 1014
Alexandria, Virginia 22302
Phone: 703-305-2105
Email: Holly.Figueroa@fns.usda.gov
TABLE OF CONTENTS
LIST OF APPENDICES
PART B. COLLECTION OF INFORMATION USING STATISTICAL METHODS
1. Respondent Universe and Sampling Methods
2. Procedures for the Collection of Information
3. Methods to Maximize Response Rates
4. Test of Procedures
5. Individuals Consulted on Statistical Aspects of the Design
References
TABLES
B1.1 Respondent Universe, Samples, and Expected Response Rates
B5.1 Individuals Consulted on Data Collection or Analysis
LIST OF APPENDICES
1 Code of Federal Regulations, Title 7, Part 210
2 Code of Federal Regulations, Title 7, Part 245
3 Improper Payment Information Act (IPIA) of 2002 (PL 107-300)
4 Improper Payment Elimination and Recovery Act (IPERA) of 2010 (PL 111-204)
5 School Meals: USDA Could Improve Verification Process for Program Access (GAO-15-634T)
6 FNS – National School Lunch and School Breakfast Programs (OIG-27601-0001-41)
7 Framework for Case Study of Verification Outcomes
8 The Healthy, Hunger-Free Kids Act (HHFKA, Public Law 111-296; December 13, 2010)
9 The 2004 Child Nutrition and WIC Reauthorization Act
10 Verification data request
11 District interview
12.a Household survey—English
12.b Household survey—Spanish
13 Reapplication data request
14 District recruitment letter
15 District frequently asked questions
16 District recruitment call script
17 Verification data request advance email
18 Verification data request template
19 District interview invitation email
20.a Household survey advance letter—English
20.b Household survey advance letter—Spanish
21.a Household survey brochure—English
21.b Household survey brochure—Spanish
22.a Household survey frequently asked questions—English
22.b Household survey frequently asked questions—Spanish
23 Reapplication data request advance email
24 Reapplication data request template
25 Public Comments to 60 Day Federal Register Notice
26 Response to Public Comments
27.a NASS comments
27.b Response to NASS Comments
28 Pretest Memo
29.a Household survey call script—English
29.b Household survey call script—Spanish
30 Contractor Confidentiality Agreement
31 Contractor Federal Wide Assurance
32 Burden Table
33.a Household survey thank you letter—English
33.b Household survey thank you letter—Spanish
34 Verification data request thank you email
35 Verification data request pre-visit telephone protocol
36 Verification data request confirmation email
37 Respondent payment log
38 Reapplication data request thank you email
39 State recruitment letter
40.a Household survey door hanger—English
40.b Household survey door hanger—Spanish
41.a Police letter—English
41.b Police letter—Spanish
42 IRB approval letter
PART B. COLLECTION OF INFORMATION USING STATISTICAL METHODS
1. Respondent Universe and Sampling Methods
Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.
The study will use a multi-stage clustered sampling design. At the first stage, the study team, together with FNS, will purposively select 25 school districts.¹ At the second stage, within each district, two samples of households will be selected using equal probability sampling from the group of households initially approved for school meal benefits on the basis of income or categorical eligibility and selected for verification. The first sample will consist of those households that did not respond to a verification request (“nonresponding households”); the second sample will consist of those households that responded to the verification request and did not experience a change in certification status due to the verification process (“responding households with no changes”).
The proposed sampling design is the most efficient design to achieve the goals of the study. It provides the accuracy needed to examine the current verification process and to assess how the results compare to those of the prior study, the Case Study of National School Lunch Program Verification Outcomes in Large Metropolitan School Districts (2004) (OMB Control Number 0584-0516, Evaluation of the NSLP Application and Verification and Pilot Program, expiration date October 31, 2003). It improves on the 2004 study by using larger sample sizes, which will yield estimates at least as precise as those obtained in 2004. This increased precision will allow an accurate comparison of the results of this study with those reported in Burghardt et al. (2004).
This is a case study design of districts with particular characteristics suited to the research objectives, not a representative sample. The respondent universe for this study is school districts eligible for the study (described below) and within such districts the households initially approved for school meal benefits on the basis of income or categorical eligibility and selected for verification that either (1) did not respond to a verification request (“nonresponding households”) or (2) did not experience a change in certification status due to the verification process (“responding households with no changes”). The target population for the household survey excludes households that the State or district directly verified as well as those that responded to verification and subsequently experienced a change in status. Appendix 7 provides a framework for the verification outcomes of interest in this case study.
Districts that are ineligible for the study are those in which all schools are exempt from verification, and those in which almost all students selected for verification are directly verified. Those districts that cannot make application and verification data available at a central location are also considered ineligible.
To determine which districts have schools that are exempt from verification and which districts have a high percentage of students who are directly verified, the study team will review data from the FNS-742 School Food Authority (SFA) Verification Collection Report form (OMB Control Number 0584-0594, expiration date September 30, 2019). This form, completed by school districts annually, provides district-level information such as the percentage of students certified for free or reduced-price meals, the percentage certified on the basis of an application, and the percentage certified on the basis of categorical eligibility. However, the FNS-742 form does not indicate whether a district’s application and verification data are available at a central location. Hence, during the recruitment phase, the study team will assess this eligibility condition when making initial contact with SFA Directors (District Recruitment Call Script, Appendix 16) and discussing the study’s data collection requirements.
To meet the household sample size requirements (n=42 completed surveys for nonresponding households and n=32 completed surveys for responding households with no changes per district), the selected districts must have a sufficient number of households that were selected for verification but not directly verified. However, depending on what the analysis of the most up-to-date FNS-742 data reveals, the study team might include districts that have a large number of households in one group of households, but not both. Lastly, for logistical and financial reasons, the selected districts must be clustered geographically; however, the study team will ensure the selected districts represent multiple FNS regions.
The FNS-742 data will be the primary sampling frame for the selection of the districts. The school district selection process is described below. The selected districts will then provide the sampling frames of the households belonging to the nonresponding households and responding households with no changes groups. Districts can provide these sampling frames because they are required to verify a small sample of approved free or reduced-price (F/RP) applications by November 15 of each year. The verification process can result in four types of certification outcomes for households initially approved on the basis of income or categorical eligibility: no change in status, a change in eligibility status from free meals to reduced-price meals, a change in eligibility status from reduced-price meals to free meals, or a loss of eligibility for F/RP meals. A loss of eligibility can result from (1) the verification process finding incomes too high to qualify for F/RP meals, or (2) a household failing to respond to the verification request.
As mentioned above, the households from each group will be sampled with equal probability from each district’s frame. To ensure the desired level of precision, the study team will attempt to achieve 42 completed surveys from each district for the nonresponding households and 32 completed surveys from responding households with no changes. Given the expected response rates shown in Table B1.1, this means that the team will need to sample approximately 62 nonresponding households and 41 responding households with no changes per district. Overall, the team will contact 1,235 nonresponding households and 821 responding households with no changes.² To achieve these response rates, the study team will employ a variety of efforts, including using well-trained, professional recruiters and data collectors, sending advance informational materials to sampled households, and providing households with a modest financial incentive to cover the costs of participation. These efforts are fully discussed in Part B, Question 3.
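The per-district and overall contact counts follow from dividing the completion targets by the expected response rates; as a check of the figures above (a worked restatement, with rounding to whole households):

$$\frac{42}{0.68} \approx 62 \text{ per district}, \qquad 20 \times \frac{42}{0.68} \approx 1{,}235 \text{ contacts overall};$$

$$\frac{32}{0.78} \approx 41 \text{ per district}, \qquad 20 \times \frac{32}{0.78} \approx 821 \text{ contacts overall}.$$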
Table B1.1 shows the respondent universe, the initial sample sizes to be released for contact, expected response rates, and the target number of completed cases for each level of data collection.
In February 2017, an analysis of the most recent FNS-742 data (from school year 2014-2015) showed that of 19,403 school districts in the file, 4,030 were exempt from verification, leaving 15,373 potentially eligible school districts. However, only 61 of these districts had at least 62 nonresponding households and at least 41 responding households with no changes. Thus, the school district respondent universe is estimated to contain approximately 61 districts drawn from approximately 13 States. The exact number of school districts and States will be determined using the most up-to-date FNS-742 data at the time of recruitment.
Table B1.1. Respondent Universe, Samples, and Expected Response Rates
| Respondent | Universeᵃ | Initial Sample | Expected Response Rate | Target Completed Cases | 2004 Study³ Response Rates |
|---|---|---|---|---|---|
| School districts (SFAs) | 61 | 25 | 80% | 20 | na⁴ |
| Verification Data Request | | 20 | 100% | 20 | 100% |
| SFA Director Interview | | 20 | 100% | 20 | na⁵ |
| Reapplication Data Request | | 20 | 100% | 20 | 100% |
| Households | 20,260 | 2,056 | 72% | 1,480 | 79.8%⁶ |
| Nonresponding households | 15,750ᵇ | 1,235 | 68% | 840ᶜ | 74.4% |
| Responding households with no changes | 4,510ᵇ | 821 | 78% | 640ᵈ | 83.3% |
| States | 13 | 13 | 100% | 13 | na |
| Police Departments | 20 | 20 | 100% | 20 | na |
| TOTAL | 20,354 | 2,114 | 72.5% | 1,533 | 78.5% |
ᵃ Based on FNS-742 data from SY 2014-2015. For purposes of this study, the universe consists only of school districts meeting the study’s eligibility criteria (described in the preceding paragraphs), States in which those school districts are located, households within those districts that meet the inclusion criteria, and police departments with jurisdiction over the areas including school districts in the final study sample (n=20).
ᵇ Based on FNS-742 data from SY 2014-2015, these were the numbers of nonresponding households and responding households with no changes in the 61 school districts that met the study’s inclusion criteria.
ᶜ For the nonresponding households, 42 completed cases in a district will provide 95 percent confidence intervals of ±0.15 or less for an outcome expressed as a proportion.
ᵈ For the responding households with no changes, 32 completed cases in a district will provide 95 percent confidence intervals of ±0.17 or less for an outcome expressed as a proportion.
Selecting school districts
The study team will determine which of the eligible school districts will be included in this case study. FNS-742 data from the most recent school year available will be used as the sampling frame from which to determine eligible school districts. The geographic spread of potentially eligible districts will be analyzed in order to form logistically feasible clusters. Based on that clustering, expectations of eligible household sample sizes, and the desire to encompass diversity by region, urbanicity, enrollment, and concentration of students receiving F/RP meals, the study team will propose a set of the most advantageous districts from which the sample should be selected. The study team will then identify districts that (1) meet the criteria for the study, and (2) would be willing to participate. These conversations will produce the initial sample of 25 school districts from which the final 20 districts will be enrolled. In the event that 20 districts do not agree to participate, the study team will return to the original list of 61 eligible SFAs and, using the methods described above, select additional districts until 20 SFAs have been enrolled.
Sampling households
After district recruitment, the study team will work with the participating districts to obtain the sampling frames of the households belonging to the two primary study groups: nonresponding households and responding households with no changes.
From each district the study team will try to obtain 42 completed interviews from nonresponding households and 32 completed interviews from responding households with no changes. The study team estimates the response rate for nonresponding households will be about 68 percent and the response rate for responding households with no changes will be about 78 percent, based on response rates of 74.4 percent and 83.3 percent, respectively, for similar groups in the Case Study of National School Lunch Program Verification Outcomes in Large Metropolitan School Districts (2004). The study team reduced expected response rates from those obtained in 2004 to account for the general trend of declining participation in household surveys. In addition, the universe of school districts for the current study is much smaller than that of the 2004 study: policies such as the Community Eligibility Provision were not in place in 2004 but now limit the universe of eligible districts. Because the universe is smaller, the sample for the current study will be fundamentally different from the 2004 sample. As such, the study team conservatively anticipates a slightly lower response rate for the current study than in the 2004 study.
To obtain the desired number of completed surveys in each group, the study team must sample 62 nonresponding households and 41 responding households with no changes per district. However, some districts may not have enough eligible households to achieve the desired number of completed interviews, or may have only exactly as many eligible households as required completed interviews. In these few instances, the team will select all households from that district and a larger sample of households from the other districts in order to reach the desired number of completed interviews across the 20 districts.
The households will be selected with equal probability sampling from each of the two sampling frames provided by the districts.
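For concreteness, a minimal sketch of the equal-probability draw is shown below (pandas-based; the frame contents, per-district sample sizes, and the take-all rule follow the text above, while everything else is illustrative):

```python
# Illustrative equal-probability (simple random) selection of households
# from a district-provided frame; per-district sizes (62 and 41) per Section B.1.
import pandas as pd

def draw_sample(frame: pd.DataFrame, n: int, seed: int = 2018) -> pd.DataFrame:
    """Select n households with equal probability; take all if the frame is small."""
    if len(frame) <= n:
        return frame.copy()  # take every eligible household (see Question 2)
    return frame.sample(n=n, random_state=seed)

# Toy frame standing in for one district's list of nonresponding households.
frame = pd.DataFrame({"household_id": range(1, 201)})
nonresponding_sample = draw_sample(frame, n=62)
```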
2. Procedures for the Collection of Information
Describe the procedures for the collection of information including:
Statistical methodology for stratification and sample selection,
Estimation procedure,
Degree of accuracy needed for the purpose described in the justification,
Unusual problems requiring specialized sampling procedures, and
Any use of periodic (less frequent than annual) data collection cycles to reduce burden.
Statistical methodology for stratification and sample selection. The current study is not a nationally representative study. The proposed study design does not call for explicit stratification. As described in the section above, the school districts will be selected purposively to ensure they satisfy the eligibility and feasibility requirements specified above. Households from the two groups (nonresponding households and responding households with no changes) will be selected randomly from the sampling frames provided by the districts.
Estimation procedure and degree of accuracy needed for the purpose described in the justification.
Estimation procedure. This is not a nationally representative study; therefore, concern regarding the estimation procedure is not applicable. This study will not characterize any of the findings as being representative of all Child Nutrition Programs nor of all participants in Child Nutrition Programs. Nonetheless, the collected data will allow for both district-level and overall estimates. The main estimate of interest will be the percentage of households whose survey-determined eligibility differs from the pre-verification eligibility status based on data from the districts.
Degree of accuracy. This is not a nationally representative study; therefore, concern regarding the accuracy is not applicable. Nonetheless, the study will report estimates for all districts (overall) and for each district.
For nonresponding households, 42 completed interviews per district will provide district-level 95 percent confidence intervals with half widths of 15 percentage points or less for percentage measures with a mean of 50 percent (the half width of the 95 percent confidence interval will be shorter for measures with means closer to 0 or 100 percent). For responding households with no changes, 32 completed interviews per district will provide district-level 95 percent confidence intervals with half widths of 17 percentage points or less for a binary measure with a mean of 50 percent. For analyses across the 20 districts, the confidence intervals will be much smaller than the district-level 95 percent confidence intervals given the larger sample sizes.
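These half widths correspond to the standard large-sample confidence interval for a proportion under simple random sampling; as a check (a textbook formula, not a study-specific derivation):

$$h = z_{0.975}\sqrt{\frac{p(1-p)}{n}}, \qquad 1.96\sqrt{\frac{0.5(1-0.5)}{42}} \approx 0.151, \qquad 1.96\sqrt{\frac{0.5(1-0.5)}{32}} \approx 0.173.$$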
Although the proposed sampling design will not produce a sample that generalizes to a broader population because the 20 districts will be selected purposively, the study team plans to compute household nonresponse weights. These weights, which will make use of relevant characteristics that are known for non-respondents (such as household size and pre-verification certification status), will account for potential survey nonresponse bias. The precision estimates presented above, however, do not consider the potential loss of precision from the increase in the estimates’ variance due to the variability of nonresponse weights or the increase in precision due to the use of the finite population correction (fpc). The use of the fpc, however, may offset the loss of precision due to the nonresponse weights in some districts.
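For reference, two textbook quantities capture the competing effects just described: Kish's approximation to the design effect induced by variable weights, and the finite population correction applied to the variance of a mean (standard formulas, not study-specific values):

$$\mathrm{deff}_w \approx 1 + \mathrm{CV}^2(w) = \frac{n \sum_i w_i^2}{\left(\sum_i w_i\right)^2}, \qquad \widehat{\mathrm{Var}}_{\mathrm{fpc}}(\bar{y}) = \left(1 - \frac{n}{N}\right)\frac{s^2}{n},$$

where $w_i$ are the nonresponse-adjusted weights, $n$ is the number of completed cases in a district, and $N$ is the number of frame households in that district.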
General analysis approach. The study team’s analyses of collected data will consist of basic frequencies and cross-tabulations, with standard errors presented in some tables. Specifically, it will begin with a review of frequencies to identify outliers or implausible values and assess missing data. If necessary, analysts will impute incomplete income data from the household survey (if a source is reported and amount is not known) based on actual income amounts reported by other cases in the data. The study team will review analyses that utilized imputed data to determine if the imputation is appropriate or if households with missing data should be excluded from the analyses. The team will then construct variables to facilitate table production. For example, the analytic file created from the household interview data will include a constructed value for income relative to the federal poverty level (FPL). The team will also construct indicators of eligibility status based on the household interviews and categorical variables to indicate the combinations of pre-verification, post-verification, and study-determined eligibility statuses. Using the data along with any imputations and constructed variables, the study team will prepare frequency tables and cross-tabulations similar to those presented in Burghardt et al. (2004).
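As an illustration of the constructed income-relative-to-FPL variable, the sketch below applies the 2017 HHS poverty guidelines for the 48 contiguous states and the school meals thresholds (free at or below 130 percent of FPL; reduced-price at or below 185 percent). The column names are hypothetical; the actual analytic file layout will differ.

```python
# Illustrative sketch of the income-to-FPL construction; column names are
# hypothetical. 2017 HHS poverty guidelines (48 contiguous states + DC):
# $12,060 for a one-person household plus $4,180 per additional member.
import pandas as pd

GUIDELINE_BASE = 12060      # annual FPL, household of one (2017)
GUIDELINE_INCREMENT = 4180  # added per additional household member

def annual_fpl(household_size: int) -> int:
    """Return the 2017 poverty guideline for a given household size."""
    return GUIDELINE_BASE + GUIDELINE_INCREMENT * (household_size - 1)

def add_constructed_variables(df: pd.DataFrame) -> pd.DataFrame:
    """Add income-relative-to-FPL and survey-determined eligibility status."""
    out = df.copy()
    # Monthly income (e.g., October 2017) annualized for comparison to the FPL.
    out["annual_income"] = out["monthly_income"] * 12
    out["pct_fpl"] = 100 * out["annual_income"] / out["household_size"].map(annual_fpl)
    # School meals thresholds: free <= 130% FPL, reduced-price <= 185% FPL.
    out["survey_eligibility"] = pd.cut(
        out["pct_fpl"],
        bins=[-float("inf"), 130, 185, float("inf")],
        labels=["free", "reduced-price", "not eligible"],
    )
    return out
```

A categorical variable like survey_eligibility can then be cross-tabulated against the pre-verification and post-verification statuses from the district data to produce tables similar to those in Burghardt et al. (2004).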
Unusual problems requiring specialized sampling procedures. The study team does not foresee problems with the proposed sampling design. The only potential problem would be if the selected districts do not have enough households in both groups to achieve the desired number of completed surveys. If that were to happen, the study team will sample all the eligible households from those districts and sample a larger number of households from the other districts so that the average number of completes per district equals the desired number of completes (42 and 32 for the two groups of households respectively).
Any use of periodic (less frequent than annual) data collection cycles to reduce burden. This is a one-time study; concern regarding the periodicity of data collection cycles is not applicable.
Summary of Data Collection Procedures
State recruitment. The study team will work with the FNS Regional Office (RO) Director to engage the sampled districts’ State Child Nutrition (CN) Directors in supporting the study. The RO director will send the CN director a State Recruitment Letter (see Appendix 39) describing the study and asking for their support in confirming the legitimacy of the study as necessary. CN directors are also asked to identify someone in their office to be a contact for the study and send their contact information to the RO director. The RO director will give the contact information to the study team for inclusion on SFA recruitment materials.
SFA recruitment. SFA and school participation in the study is encouraged under the Healthy, Hunger-Free Kids Act of 2010 (P.L. 111–296), Section 305: “States, State educational agencies, local educational agencies, schools, institutions, facilities, and contractors participating in programs authorized under this Act and the Child Nutrition Act of 1966 (42 U.S.C. 1771 et seq.) shall cooperate with officials and contractors acting on behalf of the Secretary, in the conduct of evaluations and studies under those Acts.” SFA directors in sampled districts will be sent district recruitment letters (Appendix 14), district frequently asked questions (Appendix 15), and copies of the household survey brochure (Appendix 21. a/b) ahead of a telephone recruiting call. (The district recruitment call script, which includes text for calling or speaking to the SFA Director, is included in Appendix 16).
Verification data collection procedures. The study team will send a verification data request advance email (Appendix 17) to SFA directors requesting data on each household that was part of the district’s 2017 verification sample. The email will include the Verification Data Request (Appendix 10) which will request data on: background information on households selected for verification, including household size and reported income; information related to the original application for school meal benefits, including the district’s initial eligibility determination and whether the household was certified based on application or categorical eligibility (participation in SNAP, TANF, or FDPIR); and information related to the verification process, including an indicator of whether the application was selected “for cause” and the result of the verification process. The data request will include a Microsoft Excel template to facilitate data collection (Verification data request template, Appendix 18). The study team will perform a preliminary review of all collected data as soon as it is received to ensure it includes all requested information, and a more thorough review of the data once it has been standardized to ensure it includes all expected variables and observations. The study team will send verification data request thank you emails (Appendix 34) to SFA directors upon receipt of verification data.
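A preliminary completeness review of a returned Excel template might look like the sketch below (pandas-based; the required column names are placeholders, since the actual template fields are defined in Appendix 18):

```python
# Minimal sketch of the preliminary review of a returned verification data
# template. Column names are placeholders for the Appendix 18 fields.
import pandas as pd

REQUIRED_COLUMNS = [
    "household_id", "household_size", "reported_income",
    "initial_eligibility", "categorical_eligibility_source",
    "selected_for_cause", "verification_result",
]

def review_submission(path: str) -> list[str]:
    """Return a list of problems found in a district's Excel submission."""
    problems = []
    df = pd.read_excel(path)
    missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
    if missing:
        problems.append(f"missing columns: {missing}")
    if df.empty:
        problems.append("no household records in file")
    # Flag columns that were provided but left entirely blank.
    for col in set(REQUIRED_COLUMNS) - set(missing):
        if df[col].isna().all():
            problems.append(f"column entirely blank: {col}")
    return problems
```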
We expect that some districts may not be able to provide electronic files in a timely manner. For these districts, the study team will send trained staff to collect the necessary data onsite. Staff will contact the district using the verification data request pre-visit telephone protocol (Appendix 35) to schedule the best time to visit, and will send the verification data request confirmation email (Appendix 36) a few days ahead of the scheduled visit.
District interview data collection procedures. Trained interviewers will conduct the District Interview (Appendix 11) with a total of 20 SFA directors, one from each of the sampled school districts, using a telephone-administered questionnaire that includes questions about the district’s process for selecting applications for verification for cause and general verification procedures. The interview will collect information on: how, when, and how often the district selects applications for cause; the district’s procedures for communicating with households whose applications have been selected “for cause;” and the district’s process for ensuring the integrity of application review. The trained data collection staff will send a district interview invitation email (Appendix 19) to SFA directors in early 2018. The email will ask SFA directors to select a date and time to complete the interview, and offer them the opportunity to designate a representative to complete the interview in their stead if they are knowledgeable about the process of selecting applications for cause. Trained interview staff will take notes during the interview to facilitate summarizing interview findings, and will record interviews to serve as backups in case interview notes are incomplete or lost. The data collection period will last eight weeks.
Household survey procedures. In early 2018, after the two groups of households have been selected using the district verification data, an advance package will be sent by postal mail to each sampled household. It will include a household survey advance letter informing the household of the study and encouraging participation (Appendix 20. a/b) and a household survey brochure (Appendix 21. a/b) describing the aims of the study and providing answers to household survey frequently asked questions. Using the household survey call script (Appendix 29. a/b), trained field interviewers will then contact households to schedule the in-person survey.
The household survey will collect detailed information on household structure, sources of income, employment history of adult members of the household, and monthly income (for October 2017) by source for each member of the household. Interviewers will ask respondents to provide documentation for each income amount reported for October 2017. The household survey will include other self-reported characteristics such as household size, level of education of the respondent, grade of the student, and race/ethnicity of the respondent and student. The survey will also collect information about how often students in the household eat school meals, parental perceptions of the verification process, and parental perceptions of the school meal programs.
The interviewers will ask households in the nonresponding households sample about the reasons they failed to respond to verification requests (e.g., whether respondents thought the requirements were difficult to understand, had difficulty obtaining proof of various types of income, or were aware the household would lose eligibility for F/RP meals if they did not respond to the request). The interviewers will ask responding households with no changes why they chose to complete the request (for example, the request was easy, the respondent did not want the child to lose eligibility, and so forth). They will also ask both responding and nonresponding households about their perceptions of the barriers to responding to verification. The household survey will be structured to investigate differences in perceptions between households that responded to verification requests and those that did not.
Upon completing the survey, households will be asked to sign the Respondent Payment Log (Appendix 37) verifying they received their $25 Visa gift card. Households will also receive a household survey thank you letter (Appendix 33. a/b) recognizing them for participating in the study. The household survey software will enable the study team to monitor item-level missing data in real time throughout the field period, and a member of the study team will review item frequencies and cross-tabulations once the first 100 interviews are completed to identify any errors or incorrect values and verify that the skip logic is functioning properly. The data collection period for the household survey will last 12 weeks.
Reapplication data collection procedures. In spring 2018, a reapplication data request advance email (Appendix 23) will be sent to SFA directors with Reapplication Data Requests (Appendix 13) for updated data on households’ reapplication and certification status as of March 1, 2018. These variables will include: reapplication and ultimate certification status, including whether the household reapplied for F/RP school meals between the district’s verification determination date (late October/early November 2017) and March 1, 2018; direct certification information, including whether the household was directly certified and which program was the basis for direct certification; and enrollment information, including whether the student is no longer enrolled and if so, the last date of enrollment. The data request will include a Reapplication Data Request Template to facilitate data collection (Appendix 24). Reapplication data request thank you emails (Appendix 38) will be sent to thank SFA directors upon receipt of reapplication data. A preliminary review of all received reapplication data will be performed to ensure it is complete, followed by a thorough review of all variables and observations once data has been standardized and prepared for analysis. The reapplication data collection period is expected to last six weeks, ending in May 2018.
3. Methods to Maximize Response Rates
Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.
Anticipated response rates are shown in Table B1.1 (see Section B.1). Response rates are based on those achieved in similar studies, including the aforementioned 2004 study and the second Access, Participation, Eligibility, and Certification (APEC-II) study on school meals (OMB Control Number 0584-0530, NSLP/SBP Access, Participation, Eligibility, and Certification Study, expiration date August 31, 2015). A wide range of comparable methods will be used to maximize participation and reduce nonresponse in all aspects of the data collection:
SFAs will be recruited by trained, permanent, professional research staff with relevant experience working with school meals programs or other relevant stakeholders.
A State Recruitment Letter (see Appendix 39) will be sent to each State child nutrition director to build support for the study and encourage SFA directors to offer their full cooperation.
If necessary, field interviewers will be sent onsite to collect district-level verification data. This significantly limits the burden placed on district and school staff in providing the data required for analysis.
Advance information will be sent to sampled households, including: (1) A household survey advance letter (Appendix 20. a/b) describing the importance of the study, incentives, privacy protections, and the fact that receipt of benefits will be unaffected by study participation; and (2) a household survey brochure with frequently asked questions (Appendix 21. a/b) that will include general information about the study and instructions on who to contact with questions or for additional information.
All study materials will include a toll-free study hotline number and a dedicated study email address if respondents have questions or require additional assistance.
A $25 Visa gift card will be provided to households who participate in the household survey.
Thank you emails will be sent to district staff upon their submission of the verification data (Appendix 34) and reapplication data (Appendix 38).
Data collectors will make multiple attempts to contact sampled households throughout the data collection period, including an average of 10 phone calls placed at varying times of day and on varying days of the week. Data collectors will make up to three in-person visits to households that: (1) have no confirmed working phone numbers (for example, all known phone numbers are out of service) or (2) have not responded to telephone attempts after the first three weeks of data collection. Data collectors will leave a household survey door hanger (Appendix 40. a/b) with a name, phone number, and email address if sample members are not home during each in-person attempt.
Data collectors will carry photo identification and copies of study materials to validate their visits to neighborhoods and households included in the study. A police letter (Appendix 41.a/b) will be sent to police departments in participating school districts explaining the purpose of the study.
Data collectors will also carry a copy of the police letter, which can be shown to local authorities to legitimize the work and their presence in a neighborhood, or when inquiring about the location of a hard-to-find address.
Data collectors will be qualified, well-trained professional interviewers. Project-specific training will emphasize achieving high response rates by focusing on sensitivity issues relevant to the study population (for example, stigma associated with public assistance and fear of being investigated), the privacy protections that respondents can be assured of, and refusal conversion techniques. A sufficient number of data collectors will be bilingual in Spanish in order to maximize response among non-English speaking respondents. All study materials provided to households will be available in Spanish.
In addition to the steps outlined above, the study team will use data collection management software to prepare daily reports to track data collection efforts against target response rates. The reports will provide the information needed to take corrective action if required, such as identifying patterns of nonresponse. As a contingency plan, data collectors may use additional techniques such as offering to schedule interviews outside of the home or over the phone to increase respondents’ levels of comfort and to allow for greater flexibility in scheduling; however, the study team does not anticipate needing to use these techniques. Based on similar studies, the planned methods of data collection should result in accurate and reliable data necessary for planned analyses and modeling at the response rates noted in this document. The number of completed instruments will be the numerator in response rate calculations. A completed instrument will be defined as one in which all critical items for inclusion in the main analysis are complete and within valid ranges. All attempted respondents, excluding those determined to be ineligible, will be the denominator in response rate calculations.
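Stated as a formula (a restatement of the definitions in the preceding sentences):

$$\mathrm{RR} = \frac{C}{A - I},$$

where $C$ is the number of completed instruments, $A$ is the number of sampled cases attempted, and $I$ is the number determined to be ineligible.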
After data have been collected, a non-response analysis will be conducted and the results will be used in constructing weights to be used in analysis. At a minimum, the non-response analysis will examine which district, school, and household characteristics are correlated with non-response, and the study team will use the results of the non-response analysis to define non-response adjustments to the sampling weights (which account for the household selection probability). We will use logistic regression models within district and household groups to predict the response propensity of households based on these characteristics. The resulting propensity scores will be used to construct nonresponse weighting adjustments. If any response rates should fall below 80 percent, as expected for the household survey, the analysis will also estimate the potential for bias and the ability of weighting adjustments to correct for that bias.
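A minimal sketch of the propensity-score adjustment described above is shown below, assuming statsmodels and placeholder predictor names; the production weighting procedure, including fitting within district and household groups, may differ.

```python
# Sketch of a logistic-regression nonresponse adjustment: base weights are
# inflated by the inverse of each household's estimated response propensity.
# Predictor columns are placeholders for the characteristics named above
# (e.g., household size, pre-verification certification status as a 0/1 flag).
import pandas as pd
import statsmodels.api as sm

def nonresponse_adjusted_weights(df: pd.DataFrame) -> pd.Series:
    """Return base_weight / predicted response propensity for respondents."""
    predictors = ["household_size", "preverification_status_free"]
    X = sm.add_constant(df[predictors])
    # "responded" is a 0/1 indicator of survey completion.
    model = sm.Logit(df["responded"], X).fit(disp=0)
    propensity = model.predict(X)
    # Only respondents carry analysis weight; nonrespondents get zero.
    return (df["base_weight"] / propensity).where(df["responded"] == 1, 0.0)
```

In practice, estimated propensities are often collapsed into a small number of adjustment classes before weighting, which limits the weight variability discussed in Question 2; the sketch shows the direct inverse-propensity form for brevity.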
4. Test of Procedures
Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.
Pretesting of study instruments and protocols took place from February through March 2017. A pretest memo describing the procedures and findings from the pretest can be found in Appendix 28.
Selecting pretest participants. The study team selected five potential districts that met the sampling criteria described in Supporting Statement B Question 1 but were unlikely to be included in the final sample because they are geographically isolated from other districts likely to be selected. The study team then worked with the Western and Southeastern FNS Regional Offices and State Child Nutrition Directors in California, Arizona, and Georgia to obtain contact information for SFA directors identified as potential pretest participants. Three of the five SFAs agreed to participate in the pretest. Pretest procedures and findings are summarized below. As part of the pretest activities, the three SFAs provided contact information for 10-25 households that met the selection criteria described in Supporting Statement B Question 1 to serve as candidates for pretesting the household survey.
Pretest evaluation criteria. In order to minimize burden on respondents, study staff interviewed SFA directors about the verification data request (Appendix 10), district interview (Appendix 11), and reapplication data request (Appendix 13) in a single phone call which lasted between 60 and 90 minutes. SFA directors received the two data requests via email approximately one week before the phone call, and were asked to review the data requests prior to the call. SFA directors were asked to consider the burden of the two data requests, the clarity of the instructions provided with the requests, and the utility of the data request templates. Trained data collection staff then called the SFA directors to discuss their feedback on the data requests, and conduct the district interview. During the district interview, study staff used spontaneous probes when the respondents hesitated or expressed confusion in order to identify problematic questions or instructions.
Although contact information for potential household pretest respondents was collected for all three districts, Tucson, Arizona was chosen as the site of the pretest for the household survey due to its proximity to the data collection center. Recruiting for the household survey pretest began in late February 2017, with study staff scheduling in-person interviews with five responding households with no changes and one nonresponding household. During the pretest, study staff tested both the household survey call script (Appendix 29. a/b) and the household survey instrument⁷ (Appendix 12. a/b). The trained data collection staff were once again encouraged to use spontaneous probes when respondents showed hesitation or confusion, and time was set aside at the end of the household survey to ask respondents about the clarity and flow of survey questions and procedures. Additional details on pretest procedures for districts and households can be found in the pretest memo in Appendix 28.
Summary of pretest findings. Overall, respondents found the survey materials to be clear, and the data collection staff reported few problems with the study instruments. Slight modifications were made to some materials based on feedback from respondents and interviewers, mostly related to instructions and question text. A complete list of revisions resulting from the pretest can be found in Appendix 28. During the pretest there was some difficulty in getting nonresponding households to agree to participate; the study team will employ a variety of methods, including contingency plans if necessary, to maximize response rates. These methods and contingency plans are discussed in detail in the previous section. In addition to testing the clarity of study materials and instruments, the pretest also measured the burden associated with completing the district and household interviews. SFA directors were also asked to provide burden estimates for completing both the verification and reapplication data requests. Those estimates were used to calculate total burden estimates, which can be found in Supporting Statement A, Question 12 and Appendix 32.
5. Individuals Consulted on Statistical Aspects of the Design
Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
A summary of staff consulted on statistical aspects of the design is presented in Table B5.1. The same staff will be responsible for the collection and analysis of the study’s data.
Table B5.1. Individuals Consulted on Data Collection or Analysis
| Name | Title | Telephone |
|---|---|---|
| Mathematica Staff (Contractor) | | |
| Eric Zeidman | Project Director | 609-936-2784 |
| Quinn Moore | Senior Researcher | 609-945-6592 |
| Cheryl De Saw | Senior Survey Researcher | 609-275-2204 |
| Daniela Golinelli | Senior Statistician | 202-838-3597 |
| Joshua Leftin | Researcher | 202-250-3531 |
| Bryce Onaran | Survey Researcher | 202-484-4524 |
| FNS Staff | | |
| Courtney Paolicelli | Social Science Research Analyst | 703-605-4370 |
| Holly Figueroa | Social Science Research Analyst | 703-305-2105 |
| NASS Staff | | |
| Alison Black | Methods Division | 202-690-2388 |
References
Burghardt, J., Silva, T., & Hulsey, L. (2004). Case study of National School Lunch Program verification outcomes in large metropolitan school districts. Alexandria, VA: U.S. Department of Agriculture, Food and Nutrition Service, Office of Analysis, Nutrition, and Evaluation.
The Healthy, Hunger-Free Kids Act of 2010, Pub. L. 111–296, 124 Stat. 3183. Retrieved from http://childnutrition.ncpublicschools.gov/regulations-policies/public-law/pl-111-296.pdf/view.
U.S. Department of Agriculture, Food and Nutrition Service, Office of Policy Support. (2015, May). Program error in the National School Lunch Program and School Breakfast Program: Findings from the Second Access, Participation, Eligibility, and Certification Study (APEC II), Volume 1. Alexandria, VA: Author.
1 The study team will purposively recruit 25 districts in order to enroll 20 districts in the case study.
2 Calculations are as follows:
Nonresponding households: 1,235 household contacts ÷ 20 school districts ≈ 62 households per district; 62 households × 68% response rate ≈ 42 completed interviews with nonresponding households per district.
Responding households with no changes: 821 household contacts ÷ 20 school districts ≈ 41 households per district; 41 households × 78% response rate ≈ 32 completed interviews with responding households with no changes per district.
3 Case Study of National School Lunch Program Verification Outcomes in Large Metropolitan School Districts (2004), OMB Control Number 0584-0516, Evaluation of the NSLP Application and Verification and Pilot Program, expiration date October 31, 2003.
4 Like the 2004 study, this data collection will employ a purposive sample of districts meant to best support the study in terms of district characteristics and their ability to provide necessary data. The study will initially select 25 districts in order to recruit 20 participating districts. The 2004 study included 21 participating districts but the initial sample size is unavailable.
5 The 2004 study did not include an SFA Director interview.
6 According to the 2004 study final report, these response rates were calculated using the total eligible sample of 1,488 households as determined after recruitment and data collection activities were completed. The initial sample included 1,554 households, with whom 1,164 surveys were completed, for a comparable household response rate of 74.9 percent.
7 The draft, pretest version of this instrument included questions about student’s gender. These questions were later removed from the instrument due to privacy concerns, and they will not be included in the final instruments.