OMB: 0584-0568

OMB Submission (Part B)




School Foodservice Indirect Cost Study



Contract # GS-10F-0086K

Order # AG-3198-D-11-0047








March 26, 2012







Prepared for:

John Endahl

Office of Research and Analysis

Food and Nutrition Service/USDA

3101 Park Center Dr, Rm 1004

Alexandria, VA 22302




Prepared by:

Abt Associates Inc.

55 Wheeler Street

Cambridge, MA 02138


In Partnership with:

Kokopelli Associates LLC



Part B. Statistical Methods

B.1 Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

The respondent universe for the School Foodservice Indirect Cost Study will include:

  • 51 State Child Nutrition Directors;

  • 51 State Department of Education Staff responsible for setting/reviewing LEA indirect cost rates;

  • 14,875 Public School Food Authority Directors;

  • 14,875 Public School District Business Managers;

  • 3,890 Private School Food Authority Directors; and

  • 3,890 Private School District Business Managers.

More detail on the respondent universe for each of these groups is provided below in the context of sampling methods. We describe the procedures that will be used to select the sample of school districts/School Food Authorities (SFAs) for the School Foodservice Indirect Cost Study, including:

  • sampling frame;

  • sample sizes;

  • sample selection;

  • sampling weights; and

  • response rates.

Sampling Frame

In order to draw an efficient, nationally representative sample of SFAs for the study, and to permit the rapid recruitment of sampled SFAs necessary to meet the deadline for the report to Congress specified in P.L. 111-296, the sampling frame must have the following characteristics:

  • The frame must include all SFAs or be a nationally representative sample of SFAs.

  • The frame must have SFA-level information on variables needed for sampling: size (e.g., enrollment or numbers of lunches and breakfasts) and the proportion of students approved for free/reduced-price meals.

The database created by FNS from the Form FNS-742 SFA Verification Summary Report (VSR) (OMB Control No. 0584-0026, Expiration Date 3/31/2013) meets these criteria and will be used as the sampling frame for the School Foodservice Indirect Cost Study. This database, which is compiled annually, contains the most up-to-date information on all SFAs participating in the NSLP. In addition, it is a “rich” sampling frame in that it contains information on several key SFA characteristics, such as enrollment and the number of students approved for free, reduced-price, and paid meals. The database also includes information on poverty level. Contact information for the selected SFAs will be imported from external sources such as the QED Education Database from MCH Strategic Data or the EdConnect Database from Agile Education Marketing. While these databases should have up-to-date contact information for SFA Directors and LEA Business Managers, we anticipate that project staff will need to find some contact information from LEA web sites. This information will enable us to work with the nationally representative sample for the study and facilitate the rapid recruitment of selected SFAs.

Sample Size

The target population for this survey is all public and private SFAs participating in the NSLP and SBP. In order to reach the required level of precision for public SFAs, we must have an initial sample of 2,373 SFAs and, assuming an 80 percent response rate,1 a final sample of 1,898 completed public school district/SFA surveys. The sample will be stratified by region and then, within each region, by SFA size. The total initial sample of 2,373 SFAs will be allocated equally to each region. Within each region, the sample will be allocated in proportion to the number of SFAs in each size group in the population. We will select an initial sample of at least 339 SFAs in each size group to ensure an analytic sample of at least 271 completed school district/SFA surveys in each of the three size groups. The sample will also be examined to ensure an analytic sample of at least 271 completed school district/SFA surveys in each of two poverty strata (≥60% and ≤59% of enrolled students approved for free and reduced-price meals). An illustrative allocation of the total sample by region and size is shown in Exhibit B.1. The distribution of SFAs by size and region is taken from the FNS School Food Authority Verification Summary Report (VSR) data set for SY 2010-11. The size groups were defined in terms of student enrollment as follows: Large = 10,000 or more; Medium = 1,000 to 9,999; and Small = less than 1,000.

In order to reach the required level of precision, we will select an initial sample of 125 private SFAs, stratified by FNS region to ensure that the sample is nationally representative and has face validity. The sample of private SFAs will be allocated to each region in proportion to the number of private SFAs in the region. An equal probability systematic sample will be selected in each stratum after sorting the SFAs by State and size. Assuming an 80% response rate, this will provide an analytic sample of 100 private SFAs.
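The proportional allocation step described above can be expressed as a simple rounding calculation. The sketch below is illustrative only: the regional frame counts for private SFAs, the function name, and the largest-remainder rounding rule are assumptions for exposition, and the additional constraint of at least 339 initially sampled public SFAs per size group would be applied on top of this proportional step.

```python
def proportional_allocation(total_n, frame_counts):
    """Allocate total_n sample slots across strata in proportion to frame counts,
    using a largest-remainder rule so the allocation sums exactly to total_n."""
    grand_total = sum(frame_counts.values())
    raw = {k: total_n * v / grand_total for k, v in frame_counts.items()}
    alloc = {k: int(x) for k, x in raw.items()}
    leftover = total_n - sum(alloc.values())
    # Hand the remaining slots to the strata with the largest fractional parts.
    for k in sorted(raw, key=lambda k: raw[k] - alloc[k], reverse=True)[:leftover]:
        alloc[k] += 1
    return alloc

# Example: allocate the 125 private SFAs across FNS regions in proportion to a
# hypothetical regional breakdown (placeholder counts summing to the 3,890
# private SFAs in the universe, not the actual VSR frame distribution).
private_frame = {"Mid-Atlantic": 410, "Midwest": 780, "Mountain Plains": 350,
                 "Northeast": 640, "Southeast": 420, "Southwest": 510, "Western": 780}
print(proportional_allocation(125, private_frame))
```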

Exhibit B.1. Distribution of Public SFA Population and Sample of Completed Public School District/Public SFA Surveys by Strata

Region | Small Population | Small Sample | Medium Population | Medium Sample | Large Population | Large Sample | Total Population | Total Sample
Mid-Atlantic | 533 | 116 | 861 | 188 | 92 | 35 | 1,486 | 339
Midwest | 1,942 | 159 | 1,694 | 139 | 111 | 41 | 3,747 | 339
Mountain Plains | 1,776 | 238 | 534 | 71 | 79 | 30 | 2,389 | 339
Northeast | 775 | 148 | 923 | 176 | 41 | 15 | 1,739 | 339
Southeast | 264 | 67 | 785 | 203 | 181 | 69 | 1,230 | 339
Southwest | 1,302 | 181 | 735 | 103 | 146 | 55 | 2,183 | 339
Western | 1,108 | 146 | 745 | 99 | 248 | 94 | 2,101 | 339
Total | 7,700 | 1,055 | 6,277 | 979 | 898 | 339 | 14,875 | 2,373

Poverty Rate | Small Population | Small Sample | Medium Population | Medium Sample | Large Population | Large Sample | Total Population | Total Sample
Low (0-59 pct) | 4,646 | 636 | 4,616 | 700 | 603 | 228 | 9,865 | 1,564
High (60-100 pct) | 3,054 | 419 | 1,661 | 279 | 295 | 111 | 5,010 | 809
Total | 7,700 | 1,055 | 6,277 | 979 | 898 | 339 | 14,875 | 2,373


Sample Selection

For public SFAs, we will select an equal probability systematic sample within each region-by-size-group stratum. Before selection, the public SFAs in each stratum will be sorted by poverty level and State, so the systematic sample gives both proportional representation (implicit stratification). The number of SFAs at each poverty level is sufficient to ensure a sample of 271 completed school district/SFA surveys from each poverty group. The sampling procedure for private SFAs is the same as that for public SFAs, except that we will not sort by poverty rate.

While we will not explicitly stratify the sample by single-school versus multiple-school LEAs (public or private), our proposed systematic sampling design is tantamount to such stratification. Within each FNS region, we will select an equal probability systematic sample after sorting the SFAs by State and size. This method is equivalent to creating strata by State and size within each region and then sampling from each stratum, giving State and size proportional representation. The only difference is that with explicit strata we would need to select a sample from every stratum, whereas systematic sampling carries no such restriction; not every State and size combination may be represented, but the sample will still give proportional representation to State and size. The variability between single-school and multiple-school LEAs is likewise addressed by the sorting performed before systematic selection.
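To make the selection mechanism concrete, the sketch below implements equal probability systematic sampling with a fractional interval after sorting the frame, which is what produces the implicit stratification described above. The record structure, field names, and stratum sample size in the usage comment are hypothetical placeholders, not the study's production code.

```python
import random

def systematic_sample(stratum_frame, n, sort_keys):
    """Equal probability systematic sample of n units from one stratum:
    sort the frame on the given keys (implicit stratification), then take
    every k-th unit from a random start, where k = N / n."""
    ordered = sorted(stratum_frame, key=lambda rec: tuple(rec[k] for k in sort_keys))
    N = len(ordered)
    interval = N / n                         # fractional skip interval k
    start = random.random() * interval       # random start in [0, k)
    return [ordered[int(start + i * interval)] for i in range(n)]

# Within one public region-by-size stratum, the SFAs would be sorted by poverty
# level and State before selection (placeholder field names):
# sample = systematic_sample(stratum_frame, n=116,
#                            sort_keys=["poverty_level", "state"])
```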

Sampling Weights, Response Rates and Non-Response Adjustments

Each responding school district/SFA in the sample will be assigned a sampling weight. The base weight is the inverse of the probability of selection of the responding school district/SFA. The base weight will be adjusted for nonresponse to the survey. We anticipate that all states and the District of Columbia will complete the State-level surveys and provide the requested archival data. We anticipate an 80% response rate for the SFA survey. The sampling weights ensure that the results will be nationally representative.
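A minimal sketch of these weighting steps is shown below, assuming hypothetical field names. The base weight is the inverse selection probability (N_h/n_h under equal probability selection within stratum h), and the nonresponse adjustment ratio-adjusts base weights within a weighting class so that respondents carry the weight of the full selected sample in that class.

```python
def base_weight(stratum_frame_count, stratum_sample_size):
    """Base weight = inverse of the probability of selection; under equal
    probability systematic sampling within a stratum this is N_h / n_h."""
    return stratum_frame_count / stratum_sample_size

def nonresponse_adjusted_weights(units):
    """Ratio-adjust base weights within one weighting class so that responding
    SFAs carry the weight of all selected SFAs in the class.  `units` is a
    list of dicts with placeholder keys 'base_weight' and 'responded'."""
    selected_total = sum(u["base_weight"] for u in units)
    respondent_total = sum(u["base_weight"] for u in units if u["responded"])
    factor = selected_total / respondent_total
    return [u["base_weight"] * factor if u["responded"] else 0.0 for u in units]
```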

In addition, we will perform sensitivity analyses comparing the characteristics of respondents and nonrespondents using information from the Form FNS-742 SFA Verification Summary Report (VSR) database used to construct the sampling frame. This will allow us to assess the extent to which survey nonresponse differs by SFA characteristics such as income level, SFA size, or region.
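The nonresponse sensitivity check could proceed along the lines sketched below, comparing frame variables between responding and nonresponding sampled SFAs; the variable names are placeholders standing in for the FNS-742 frame fields.

```python
from statistics import mean

def compare_respondents(sampled_records, variables=("enrollment", "pct_free_reduced")):
    """Compare mean frame characteristics of responding and nonresponding SFAs.
    `sampled_records` is a list of dicts with a boolean 'responded' flag plus
    the frame variables of interest (placeholder names)."""
    summary = {}
    for var in variables:
        resp = [r[var] for r in sampled_records if r["responded"]]
        nonresp = [r[var] for r in sampled_records if not r["responded"]]
        summary[var] = {"respondents": mean(resp),
                        "nonrespondents": mean(nonresp),
                        "difference": mean(resp) - mean(nonresp)}
    return summary
```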

B.2 Describe the procedures for the collection of information including:

• Statistical methodology for stratification and sample selection,

• Estimation procedure,

• Degree of accuracy needed for the purpose described in the justification,

• Unusual problems requiring specialized sampling procedures, and

• Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

This is a one-time data collection effort with no unusual problems that require specialized sampling procedures. Procedures for the collection of information addressed below include:

  • statistical methodology for stratification and sample selection;

  • estimation procedure;

  • degree of accuracy needed;

  • unusual problems requiring specialized sampling procedures; and

  • use of periodic (less frequent than annual) data collection cycles to reduce burden.

Statistical Methodology for Stratification and Sample Selection

The statistical methodology for stratification and sample selection was discussed in Section B.1.

Estimation Procedures

National estimates of the prevalence of LEA practices and procedures for obtaining an approved indirect cost rate will be derived from the study sample. We will provide a confidence interval around each estimate to account for sampling variation.
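As an illustration of the estimation step, the sketch below computes a weighted prevalence estimate for a 0/1 practice indicator and an approximate confidence interval using the Kish effective sample size. In practice, design-based variance estimation (e.g., Taylor linearization or replicate weights) would be used; this simplified sketch only shows the form of the estimates, and the function and argument names are assumptions.

```python
import math

def weighted_prevalence_ci(indicators, weights, z=1.96):
    """Weighted prevalence of a 0/1 LEA practice indicator with a normal
    approximation confidence interval (Kish effective sample size)."""
    total_w = sum(weights)
    p = sum(w * y for w, y in zip(weights, indicators)) / total_w
    n_eff = total_w ** 2 / sum(w ** 2 for w in weights)  # effective n under unequal weights
    se = math.sqrt(p * (1 - p) / n_eff)
    return p, (p - z * se, p + z * se)
```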

Degree of Accuracy Needed: Precision, Statistical Power, and Minimum Detectable Differences

The sample sizes for this study are sufficient to ensure that the level of precision meets FNS’ needs. For public SFAs, the study will provide national estimates with a 95% confidence interval of ±5 percentage points. This sample will also provide a 90% confidence interval of ±5 percentage points for each subgroup. For private SFAs, the sample has been designed to provide a 95% confidence interval of ±10 percentage points. No subgroup analysis will be conducted for private SFAs. Below we discuss the required sample sizes to meet these precision requirements.
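The precision targets above follow from the standard sample-size formula for a proportion. The sketch below shows that arithmetic under the conservative assumptions of p = 0.5 and no design effect (illustrative assumptions, not the study's exact computation); the results line up with the 271-completion subgroup target and are close to the private SFA target of 100 completions.

```python
import math

def required_completions(half_width, z, p=0.5, deff=1.0):
    """Completed surveys needed so a proportion's confidence interval half-width
    does not exceed `half_width` (conservative p = 0.5, optional design effect)."""
    return math.ceil(deff * z ** 2 * p * (1 - p) / half_width ** 2)

print(required_completions(0.05, z=1.96))   # 385: +/-5 points at 95% confidence (national)
print(required_completions(0.05, z=1.645))  # 271: +/-5 points at 90% confidence (subgroups)
print(required_completions(0.10, z=1.96))   # 97:  +/-10 points at 95% confidence (private SFAs)
```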

FNS is also interested in estimates of the difference between key subgroups of LEAs in the prevalence of LEA practices and procedures for calculating and applying indirect cost rates. In this context, accuracy is defined in terms of minimum detectable differences (MDDs). The sample sizes for this study provide an MDD of about 10 percentage points (roughly 0.2 standard deviations of a prevalence rate near 50 percent) between key subgroups, with 80% statistical power and α = .10.
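The MDD calculation can be sketched as follows, assuming the α = .10 criterion is two-sided (z = 1.645), 80% power (z = 0.8416), a conservative prevalence near 50 percent, and two subgroups of 271 completed surveys each; these are illustrative assumptions rather than the study's design-effect-adjusted computation.

```python
import math

def mdd_two_proportions(n1, n2, p=0.5, z_alpha=1.645, z_power=0.8416):
    """Minimum detectable difference in prevalence rates between two subgroups,
    for the given alpha and power critical values and conservative p = 0.5."""
    se_diff = math.sqrt(p * (1 - p) / n1 + p * (1 - p) / n2)
    return (z_alpha + z_power) * se_diff

print(round(mdd_two_proportions(271, 271), 3))  # ~0.107, i.e., roughly 10-11 percentage points
```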

B.3 Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.

We do not anticipate a problem obtaining the necessary response rates. The major factor ensuring high response rates is that SFA participation in the survey is not voluntary. HHFKA stipulates that “States, State educational agencies, local educational agencies, schools, institutions, facilities, and contractors participating in programs authorized under this Act and the Child Nutrition Act of 1966 (42 U.S.C. 1771 et seq.) shall cooperate with officials and contractors acting on behalf of the Secretary, in the conduct of evaluations and studies under those Acts” as a condition of receiving funding. We plan to reference HHFKA in the invitation to respondents. In addition, every effort will be made to minimize the burden placed on web survey respondents. The web survey allows respondents to work within their schedules by starting and stopping the survey as often as they need to.

B.4 Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

All instruments are based on those used for the School Lunch and Breakfast Cost Study-II (SLBCS-II) (OMB No. 0584-0533; Expiration Date: 2/28/2009). A pretest of the SFA Director and School District Business Manager web surveys will be conducted with no more than 9 respondents each, chosen from LEAs/SFAs in Massachusetts (which are not included in the study sample).

B.5 Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.

Name | Affiliation | Telephone Number | E-mail
Frederic Glantz | Principal Investigator, Kokopelli Associates LLC | 505-983-0785 | fred@kokopelliassociates.com
K.P. Srinath | Sampling Statistician, Abt Associates Inc. | 301-634-1836 | KP_Srinath@abtassoc.com
Jacob Klerman | Director of Analysis, Abt Associates Inc. | 617-520-2613 | Jacob_Klerman@abtassoc.com
Chris Logan | Senior Associate, Abt Associates Inc. | 617-349-2817 | Chris_Logan@abtassoc.com
Matthew Gregg | Statistician, USDA/National Agricultural Statistics Service | 202-720-3388 | Matthew.Gregg@nass.usda.gov


1 The mandatory reporting requirement in the Healthy Hunger-Free Kids Act of 2010 (HHFKA) should ensure a high response rate, but we expect that some portion of respondents will not provide data within the study schedule. For a recent study, the Direct Verification Evaluation (OMB No. 0584-0546, Exp. 12/31/2010), response rates for the LEA surveys across five States ranged from 80.8% to 96.2%. That survey was conducted by mail and by web, with email and telephone follow-up. Based on this experience, we expect at least an 80 percent response rate for the LEA Business Manager and School Foodservice Director web surveys.

