
Special Nutrition Program Operations Study (SN-OPS)


Statement for Paperwork Reduction Act Submission

Revision of current OMB Number 0584-0562

Part B: Supporting Statement



January 30, 2013










Office of Nutrition Analysis

Food and Nutrition Service

United States Department of Agriculture

Project Officer: John Endahl

Telephone: 703-305-2127


Table of Contents

Page

Introduction A-3

Part A: Justification A-5


A.1 Circumstances That Make the Collection of Information Necessary A-5

A.2 Purpose and Use of the Information A-10

A.3 Use of Information Technology and Burden Reduction A-16

A.4 Efforts to Identify Duplication and Use of Similar Information A-17

A.5 Impact on Small Businesses or Other Small Entities A-17

A.6 Consequences of Collecting the Information Less Frequently A-17

A.7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5 A-18

A.8 Comments in Response to Federal Register Notice and Efforts to Consult Outside Agency A-19

A.9 Explanation of Any Payment or Gift to Respondents A-19

A.10 Assurance of Confidentiality Provided to Respondents A-19

A.11 Justification for Sensitive Questions A-20

A.12 Estimates of Annualized Burden Hours and Costs A-20

A.13 Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers A-23

A.14 Annualized Cost to the Federal Government A-23

A.15 Explanation for Program Changes or Adjustments A-23

A.16 Plans for Tabulation and Publication and Project Time Schedule A-24

A.17 Reason(s) Display of OMB Expiration Date is Inappropriate A-28

A.18 Exceptions to Certification for Paperwork Reduction Act Submission A-29


Part B. Collections of Information Employing Statistical Methods B-3


B.1 Respondent Universe and Sampling Methods B-3

B.2 Procedures for the Collection of Information B-5

B.3 Methods to Maximize the Response Rates and to Deal with Nonresponse B-15

B.4 Test of Procedures or Methods to be Undertaken B-17

B.5 Individuals Consulted on Statistical Aspects and Individuals
Collecting and/or Analyzing Data B-18




Tables

Table A1. Estimates of respondent burden A-21

Table A2. Annualized cost to respondents A-22

Table A3. Data collection schedule A-25

Table B1. Distribution of eligible SFAs in the 2011-12 FNS-742 universe file B-4

Table B2. Expected margins of error* for various sample sizes (n) and
design effects (DEFF) B-7

Table B3. Proposed sample sizes for the SFA survey B-10

Table B4. Expected sample sizes and corresponding standard error of an
estimated proportion under proposed design for selected

analytic domains B-11

Table B5. Number of base year SFA respondents by FNS region, enrollment size category, and percent of students eligible for free/reduced price lunch B-14

Table B6. Number of base year SFA respondents to be selected for site visits by FNS region, enrollment size category, and percent of students eligible for free/reduced price lunch B-14


Appendixes

A Cross-walk of the SFA Director survey from base year and Option
Year 1 A-1

B Cross-walk of the State Child Nutrition Director survey from base year
and Option Year 1 B-1

C Option Year 1 Research Issues and Research Questions C-1

D1 Invitation Letter to State Child Nutrition Director D1-1

D2 Follow-up Email Reminder to State CN Directors D2-1

D3 State CN Director Telephone Interviewer Script D3-1

D4 Thank You Postcard to State Child Nutrition Director D4-1

D5 Invitation Letter to SFA Directors D5-1

D6 Follow-Up Email Reminder to SFA Directors D6-1

D7 Follow-Up Reminder Postcard to SFA Directors D7-1

D8 Telephone Script to call SFA Directors D8-1

D9 Thank You Letter to SFA Director for Completing the
SNPOS Survey D9-1

D10 Invitation Letter to SFA Directors for On-Site Observations D10-1

D11 Confirmation Letter to SFA Directors Participating in the
On-Site Data Collection D11-1

E State Child Nutrition Director Survey 2012 E-1

F School Food Authority (SFA) Director Survey 2012 F-1

G On-Site Data Collection Instruments G-1

H 2011 State Child Nutrition Director Survey H-1

I 2011 SFA Director Survey I-1





SUPPORTING STATEMENT B


Special Nutrition Program Operations Study



PART B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

B.1 Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.



The respondent universe for the proposed survey includes all SFAs operating in public school districts in the United States and outlying territories that were required to submit form FNS-742 (SFA Verification Summary Data, 7 CFR Part 245, Determining Eligibility for Free & Reduced Price Meals, OMB# 0584-0026, expiration date 3/31/2013) to USDA-FNS in SY 2011-12. In general, all SFAs that participated in the NSLP or SBP are included in the respondent universe, with the following exceptions:


  • SFAs that operate only in Residential Child Care Institutions that do not have day time students;

  • SFAs that do not have students who are eligible for free/reduced-price lunch;

  • SFAs in some outlying territories that are not required to complete form FNS-742; and

  • Private schools that participate in the NSLP.



The SY 2011-12 FNS-742 database will be used to construct the SFA sampling frame (i.e., the universe file) from which the respondent samples will be drawn. There are currently over 19,000 SFAs in the 2011-12 FNS-742 database. However, only the approximately 15,000 SFAs operating in public school districts will be included in the sampling frame. Note that the unit of analysis for the proposed study will be the SFA, which usually (but not always) coincides with a local education agency (LEA) as defined in the U.S. Department of Education’s Common Core of Data (CCD) Local Education Agency Universe Survey File maintained by the National Center for Education Statistics (NCES). Exceptions are SFAs that operate school food programs for multiple school districts and those operating individual schools (e.g., some public charter schools). In the 2011-12 FNS-742 database, approximately 85 percent of the eligible SFAs are expected to match a district (LEA) in the current CCD universe file. Table B1 summarizes the distribution of eligible SFAs in the current sampling frame by enrollment size class and categories of poverty status based on the percentage of students eligible for free/reduced-price lunch.


Table B1. Distribution of eligible SFAs in the 2011-12 FNS-742 universe file (sampling frame) by enrollment size class and percent of students eligible for free/reduced price lunch

| SFA enrollment size class1 | Less than 30 | 30 to 59 | 60 or more | Total  |
| Under 1,000                |        1,146 |    3,474 |      3,299 |  7,919 |
| 1,000 to 4,999             |        1,421 |    2,441 |      1,400 |  5,262 |
| 5,000 to 24,999            |          434 |      707 |        508 |  1,649 |
| 25,000 or more             |           55 |      129 |        112 |    296 |
| Total                      |        3,056 |    6,751 |      5,319 | 15,126 |

Column headings give the percent of students eligible for free/reduced price lunch.

1 Number of students with access to NSLP/SBP as reported in 2011-12 FNS-742.



Expected Response Rates


The response rate is the proportion of sampled SFAs that complete the SFA survey. Based on experience with the base year SFA survey, we expect to achieve an SFA response rate of 80 percent. Thus, we plan to sample 1,875 SFAs to obtain 1,500 completed surveys with SFA directors. The State Child Nutrition Director survey will be conducted among all 56 State directors and will not involve any sampling. We expect at least a 95 percent response rate for the State Child Nutrition Director survey.



Previous Data Collections and Response Rates


This data collection is similar to the base year data collection conducted in SY 2011-12. The assumed 80 percent and 95 percent response rates for the SFA and State Child Nutrition Directors, respectively, are based on experience in the prior surveys involving SFA directors and State Child Nutrition Directors.



B.2 Describe the procedures for the collection of information including:


  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


Below we describe the procedures for the collection of information including statistical methodology for stratification and sample selection, estimation procedure, and the degree of accuracy needed for the purpose described in the justification.


A goal of the survey sample design is to obtain a nationally representative sample of SFAs that will yield population estimates with a precision of ±5 percent at the 95 percent level of confidence for the overall SFA population and for specified subgroups of SFAs. Under simple random sampling, this translates to a sample size of 400-500 responding SFAs for each subgroup. For example, with three key subgroups of roughly equal size, the total required sample size would range from 1,200-1,500 SFAs to meet the specified precision levels. In general, however, simple random sampling is not efficient for the multiple analytic objectives of the study. For example, while a simple random (or self-weighting) sample would be optimal for estimating the overall prevalence of SFAs reporting various types of food service practices or programs, it can be inefficient for estimating the numbers of students involved in these types of services or programs. A stratified sample design using variable rates that depend on the size of the SFA would better meet these conflicting objectives. Stratification not only helps to ensure that adequate sample sizes are obtained for important analytic subgroups of interest, but can also be effective in reducing the sampling errors of estimates that are correlated with enrollment size.
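As a check on the simple-random-sampling figure quoted above, the standard sample-size formula n = z²p(1−p)/e² can be evaluated directly. The sketch below is illustrative only and uses the conservative worst case p = 0.5:

```python
import math

def srs_sample_size(moe, p=0.5, z=1.96):
    """Respondents needed under simple random sampling to achieve the
    given margin of error (moe) for an estimated proportion p at the
    confidence level implied by z (1.96 for 95 percent confidence)."""
    return math.ceil(z * z * p * (1 - p) / (moe * moe))

# Worst case (p = 0.5) for a +/-5 percent margin of error:
n = srs_sample_size(0.05)  # 385 respondents before any design effect
```

Inflating this baseline of roughly 385 by a design effect in the 1.1-1.3 range lands in the 400-500 range cited above for each subgroup.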


A stratified sampling design employing varying sampling fractions will be used to select the SFA sample for the study. Such a design will generally inflate the standard errors of prevalence estimates as compared with simple random sampling but is justifiable for the reasons mentioned above. A measure of the relative precision of a complex sample design is given by the design effect (DEFF), which is defined to be the ratio of the variance of an estimate based on the complex sample design to the hypothetical variance based on a simple random sample of the same size. A design effect of 1.00 means that the complex sample is roughly equivalent to a simple random sample in terms of sampling precision. (A design effect less than 1.00 can sometimes occur if the sampling rates in some strata are very high, resulting in non-negligible finite population correction factors.) Under the proposed design, we have estimated that the resulting design effects will range from slightly under 1.00 to 1.50 depending on the subgroup being analyzed. As indicated in table B2, which summarizes the expected margins of error of a prevalence estimate under the proposed design for a range of sample sizes and design effects, a total SFA sample size of 1,500 responding SFAs should be more than adequate to meet or exceed the ±5 percent precision requirement even for a design effect as large as 1.5. For a subgroup consisting of 500 SFAs for which the design effect is 1.10 (e.g., this would be reasonable for subgroups defined by size of SFA, but may be larger for other subgroups), the expected level of precision for the subgroup would be at most ±4.9 percent (and would be smaller for prevalence estimates below or above 50 percent).


Table B2. Expected margins of error* for various sample sizes (n) and design effects (DEFF)

| n     | DEFF = 1.10 | DEFF = 1.25 | DEFF = 1.50 |
| 100   | 11.0%       | 12.5%       | 15.0%       |
| 200   | 7.8%        | 8.8%        | 10.6%       |
| 300   | 6.4%        | 7.2%        | 8.7%        |
| 400   | 5.5%        | 6.3%        | 7.5%        |
| 500   | 4.9%        | 5.6%        | 6.7%        |
| 600   | 4.5%        | 5.1%        | 6.1%        |
| 700   | 4.2%        | 4.7%        | 5.7%        |
| 800   | 3.9%        | 4.4%        | 5.3%        |
| 900   | 3.7%        | 4.2%        | 5.0%        |
| 1,000 | 3.5%        | 4.0%        | 4.7%        |
| 1,100 | 3.3%        | 3.8%        | 4.5%        |
| 1,200 | 3.2%        | 3.6%        | 4.3%        |
| 1,300 | 3.1%        | 3.5%        | 4.2%        |
| 1,400 | 2.9%        | 3.3%        | 4.0%        |
| 1,500 | 2.8%        | 3.2%        | 3.9%        |

*Entries correspond to 95% confidence limits for an estimated prevalence of approximately 50%. For estimated prevalence less than 50% or greater than 50%, the confidence limits will be smaller than those indicated in the table.
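The entries in table B2 appear to follow a single formula in which DEFF multiplies the half-width 2√(0.25/n) directly; the sketch below reproduces them to one decimal place. Note that the more common convention multiplies the variance by DEFF (i.e., the half-width by √DEFF), which would yield smaller margins for DEFF > 1, so the tabulated values can be read as conservative:

```python
import math

def table_b2_moe(n, deff):
    """Margin of error, in percent, matching the table B2 entries:
    2 * DEFF * sqrt(0.25 / n) at an estimated prevalence of 50 percent.
    DEFF here scales the half-width directly; the textbook convention
    scales the variance (half-width by sqrt(DEFF)), which gives
    smaller values whenever DEFF > 1."""
    return round(100 * 2 * deff * math.sqrt(0.25 / n), 1)
```

For example, `table_b2_moe(1500, 1.50)` reproduces the 3.9% entry in the bottom-right cell of the table.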



Sample Stratification and Selection


As indicated in Section B.1, an SFA-level database derived from 2011-12 Verification Summary Reports data (form FNS-742) will be used to construct the SFA sampling frame. In addition to a unique identifier (SFAID), name of SFA, and state in which the SFA is located, the database includes information about the type of control of the SFA/school district (public or private), the number of schools participating in the NSLP/SBP, total enrollment in participating schools, and the number of students eligible for free or reduced-price lunch. This information, along with data from the most recent NCES Common Core of Data (CCD) LEA universe file, where applicable, will be used to stratify SFAs for sampling purposes. Note that all known eligible SFAs, including those that cannot be matched to the current CCD file, will be included in the sampling frame.


The types of SFA/district-level variables that can be used either as explicit or implicit stratifiers include region (defined by the seven FNS regions1), enrollment size class, a measure of poverty status defined by the percent of students eligible for free/reduced-price lunch, minority status defined by the percent of non-white students served by the SFA, type of locale (e.g., central city, suburban, town, rural), and instructional level of schools served by the SFA (e.g., elementary schools only, secondary schools only, or both)2. Since many of these characteristics are related, it will not be necessary to employ all of them in stratification to account for the variation in SFAs. Thus, we propose to define explicit sampling strata based on three primary variables: SFA enrollment size, FNS region, and poverty status. Note that since type-of-locale, minority status, and instructional level will not be available for SFAs that are not matched to LEAs in the CCD file, the non-matched cases will be placed in a separate category for sampling purposes. The CCD variables will be used as implicit stratifiers (i.e., sorting variables) to ensure appropriate representation in the sample. A total sample of 1,875 SFAs will be allocated to the strata in proportion to the aggregate square root of the enrollment of SFAs in the stratum. Such an allocation gives large SFAs relatively higher selection probabilities than smaller ones and is expected to provide acceptable sampling precision for both prevalence estimates (e.g., the proportion of SFAs with a specified characteristic) and numeric measures correlated with enrollment (e.g., the number of students in SFAs with access to various food services or programs).
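The square-root allocation described above can be sketched as follows. The two strata and their enrollment figures are hypothetical, chosen only to show that larger SFAs receive a higher per-SFA selection rate:

```python
import math

def sqrt_allocation(stratum_enrollments, total_sample):
    """Allocate total_sample across strata in proportion to the
    aggregate square root of SFA enrollment within each stratum."""
    aggregates = [sum(math.sqrt(e) for e in stratum)
                  for stratum in stratum_enrollments]
    grand_total = sum(aggregates)
    return [round(total_sample * a / grand_total) for a in aggregates]

# Hypothetical strata: many small SFAs versus a few large ones.
small = [400] * 200     # 200 SFAs of 400 students each
large = [10000] * 20    # 20 SFAs of 10,000 students each
allocation = sqrt_allocation([small, large], 30)
```

Here the small-SFA stratum receives 20 of the 30 selections (a 1-in-10 rate per SFA) while the large-SFA stratum receives 10 (a 1-in-2 rate), illustrating the higher selection probabilities for large SFAs.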


During the Base Year, 1,400 SFAs completed the SFA directors’ survey in 2011. To permit longitudinal analyses of the prior SFA respondents, all of the still-eligible SFAs selected for the Base Year (including responding as well as nonresponding SFAs) will be retained in the sample for the current survey. We currently estimate that 1,734 (i.e., all but 34 of the 1,768 SFAs sampled in the Base Year) are included in the current SFA sampling frame. While we expect the 34 non-matched SFAs to be closures, we will conduct further checks of the status of these cases prior to sampling. Any cases that are found to be open and can be linked to an SFA in the new frame will be placed in the appropriate stratum for sampling. Assuming an 80 percent response rate, 1,387 of the 1,734 SFAs from the base year will complete the follow-up survey. To achieve the desired total sample size of 1,500 respondents, a supplemental sample of 141 SFAs will be selected from those SFAs in the current frame that do not appear in the previous frame, bringing the total number to be sampled to 1,875. Similar to procedures used to select the original Base Year sample, the supplemental sample will be selected at rates that depend on the size of the SFA, where large SFAs are selected at relatively higher rates than smaller ones. Because the sampling rates for the carry-over samples from the Base Year will not be adjusted for any changes in size category, an additional design effect will be introduced; however, it is expected to be modest and to have a relatively small impact on cross-sectional estimates. Table B3 summarizes the expected numbers of SFAs to be sampled and the corresponding expected numbers of responding SFAs by percent eligible for free/reduced-price lunch and enrollment size class.
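The sample-size arithmetic in the preceding paragraph can be verified step by step (a check of the quoted figures, not the study's actual sampling code):

```python
# Carry-over sample from the Base Year.
base_year_sampled = 1768
unmatched = 34                      # presumed closures, pending verification
carryover = base_year_sampled - unmatched            # 1,734 SFAs retained

# Expected completes at an 80 percent response rate.
response_rate = 0.80
carryover_completes = int(carryover * response_rate)  # 1,387

# Supplemental sample needed to reach 1,500 total completes.
target_completes = 1500
needed = target_completes - carryover_completes       # 113
supplement = round(needed / response_rate)            # 141

total_sampled = carryover + supplement                # 1,875
```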



Table B3. Proposed sample sizes for the SFA survey

| Percent eligible for free/reduced price lunch1 | Enrollment size class2 | Expected number of SFAs to be sampled | Expected number of responding SFAs3 |
| Under 60 percent   | Less than 1,000 | 292   | 234   |
|                    | 1,000 to 4,999  | 579   | 463   |
|                    | 5,000 to 24,999 | 357   | 286   |
|                    | 25,000 or more  | 124   | 99    |
|                    | Subtotal        | 1,352 | 1,082 |
| 60 percent or more | Less than 1,000 | 148   | 118   |
|                    | 1,000 to 4,999  | 181   | 145   |
|                    | 5,000 to 24,999 | 131   | 105   |
|                    | 25,000 or more  | 63    | 50    |
|                    | Subtotal        | 523   | 418   |
| All SFAs           | Total           | 1,875 | 1,500 |

1Calculated from the numbers of students eligible for free or reduced price lunch as reported in 2011-12 FNS-742.

2Number of students with access to NSLP/SBP as reported in 2011-12 FNS-742.

3Based on 80% response rate. Note: See Table B4 for additional breakouts of the sample.



Expected Levels of Precision


Table B4 summarizes the approximate survey sample sizes and standard errors to be expected under the proposed design for selected subgroups. The standard errors in table B4 reflect design effects ranging from 1.0 or less to 1.5 depending on subgroup. The design effect primarily reflects the fact that under the proposed stratified design, large SFAs will be sampled at relatively higher rates (i.e., have smaller sampling weights) than small SFAs. The standard errors in table B4 can be converted to 95 percent confidence bounds by multiplying the entries by 2. For example, an estimated proportion of the order of 20 percent (P = 0.20) for SFAs in which fewer than 30 percent of students are eligible for free/reduced price lunch will be subject to a margin of error of ±4.6 percent at the 95 percent confidence level. Similarly, an estimated proportion of the order of 50 percent (P = 0.50) for SFAs in the Northeast region will be subject to a margin of error of ±8.6 percent at the 95 percent confidence level.


Table B4. Expected sample sizes and corresponding standard error of an estimated proportion under proposed design for selected analytic domains

| Domain (subset) | Expected sample size* | SE† at P = 0.20 | SE† at P = 0.33 | SE† at P = 0.50 |
| Total sample    | 1,500 | 0.012 | 0.014 | 0.015 |
| Percent of students eligible for free/reduced price lunch | | | | |
| Less than 30    | 394   | 0.023 | 0.027 | 0.028 |
| 30 to 59.9      | 688   | 0.018 | 0.021 | 0.022 |
| 60 or more      | 418   | 0.024 | 0.028 | 0.030 |
| FNS Region      |       |       |       |       |
| Mid Atlantic    | 173   | 0.035 | 0.041 | 0.043 |
| Midwest         | 339   | 0.024 | 0.028 | 0.030 |
| Mountain        | 173   | 0.035 | 0.041 | 0.043 |
| Northeast       | 167   | 0.035 | 0.041 | 0.043 |
| Southeast       | 195   | 0.033 | 0.039 | 0.041 |
| Southwest       | 213   | 0.032 | 0.038 | 0.041 |
| Western         | 240   | 0.034 | 0.040 | 0.042 |
| SFA Enrollment Size |   |       |       |       |
| Under 1,000     | 352   | 0.020 | 0.024 | 0.026 |
| 1,000 to 4,999  | 608   | 0.015 | 0.018 | 0.019 |
| 5,000 or more   | 540   | 0.015 | 0.018 | 0.019 |

* Expected number of responding eligible SFAs, assuming a response rate of 80 percent. The standard errors in this table are given for illustration. Actual standard errors will depend on the characteristics being estimated and may differ from those shown.

† Assumes unequal weighting design effect ranging from 0.78 to 1.87 depending on subgroup.
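The multiply-by-2 conversion described in the text can be applied to the table B4 entries as follows. This is illustrative only; 2 is a round-number stand-in for the normal critical value 1.96:

```python
def moe_95(standard_error):
    """Approximate 95 percent margin of error from a standard error,
    using the multiply-by-2 rule of thumb described in the text."""
    return round(2 * standard_error, 3)

# SFAs with fewer than 30 percent of students eligible, P = 0.20:
low_poverty_moe = moe_95(0.023)   # +/-0.046, i.e. +/-4.6 percentage points
# Northeast region, P = 0.50:
northeast_moe = moe_95(0.043)     # +/-0.086, i.e. +/-8.6 percentage points
```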



Estimation and Calculation of Sampling Errors


For estimation purposes, sampling weights reflecting the overall probabilities of selection and differential nonresponse rates will be attached to each data record providing usable SFA data. The first step in the weighting process will be to assign a base weight to each sampled SFA. The base weight is equal to the reciprocal of the probability of selecting the SFA for the study, which will vary by sampling stratum under the proposed stratified sample design, and also depend on whether the SFA was originally sampled for the Base Year or was selected for the supplemental sample. Next, the base weights will be adjusted for nonresponse within cells consisting of SFAs that are expected to be homogeneous with respect to response propensity. To determine the appropriate adjustment cells, we will conduct a nonresponse bias analysis to identify characteristics of SFAs that are correlated with nonresponse. The potential set of predictors to be used to define the adjustment cells will include SFA-level characteristics that are available from the FNS database and data from the most recent CCD file. Within these cells, a weighted response rate will be computed and applied to the SFA base weights to obtain the corresponding nonresponse-adjusted weights.
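The weighting steps described above can be sketched for a simplified record layout. The field names and the single adjustment cell are hypothetical; a real implementation would build cells from the nonresponse bias analysis:

```python
from collections import defaultdict

def nonresponse_adjusted_weights(records):
    """Each record is a dict with 'cell' (adjustment-cell id),
    'base_weight' (reciprocal of the selection probability), and
    'responded' (bool). Within each cell, respondent base weights are
    divided by the cell's weighted response rate; nonrespondents get
    zero weight, so the adjusted weights of respondents still sum to
    the cell's total base weight."""
    cell_total = defaultdict(float)
    cell_resp = defaultdict(float)
    for r in records:
        cell_total[r["cell"]] += r["base_weight"]
        if r["responded"]:
            cell_resp[r["cell"]] += r["base_weight"]

    adjusted = []
    for r in records:
        if r["responded"]:
            rate = cell_resp[r["cell"]] / cell_total[r["cell"]]
            adjusted.append(r["base_weight"] / rate)
        else:
            adjusted.append(0.0)
    return adjusted

# Four SFAs in one cell, three responding: weighted response rate = 0.75.
records = [
    {"cell": "A", "base_weight": 10.0, "responded": True},
    {"cell": "A", "base_weight": 10.0, "responded": True},
    {"cell": "A", "base_weight": 10.0, "responded": True},
    {"cell": "A", "base_weight": 10.0, "responded": False},
]
weights = nonresponse_adjusted_weights(records)
```

In the example, each respondent's weight is inflated from 10.0 to 13.33 so that the three respondents together carry the full cell total of 40.0.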


To properly reflect the complex features of the sample design, standard errors of the survey-based estimates will be calculated using jackknife replication. Under the jackknife replication approach, 100 subsamples or "replicates" will be formed in a way that preserves the basic features of the full sample design. A set of weights (referred to as “replicate weights”) will then be constructed for each jackknife replicate. Using the full sample weights and the replicate weights, estimates of any survey statistic can be calculated for the full sample and for each of the jackknife replicates. The variability of the replicate estimates is used to obtain the variance of the survey statistic. The replicate weights can be imported into variance estimation software (e.g., SAS, SUDAAN, WESVAR) to calculate standard errors of the survey-based estimates. In addition to the replicate weights, stratum and unit codes will be provided in the data files to permit calculation of standard errors using Taylor series approximations if desired. Note that while replication and Taylor series methods often produce similar results, jackknife replication has some advantages in reflecting statistical adjustments used in weighting such as nonresponse and poststratification (e.g., see Rust, K.F., and Rao, J.N.K., 1996. Variance estimation for complex surveys using replication techniques. Statistical Methods in Medical Research, 5: 283-310).
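A minimal sketch of delete-one-group jackknife variance estimation for a weighted mean follows. Assigning units to the 100 replicate groups at random is a simplification; in practice, replicate formation would respect the stratified design:

```python
import random

def jackknife_variance(values, weights, n_reps=100, seed=7):
    """Delete-one-group (JK1) jackknife variance of a weighted mean.
    Units are assigned at random to n_reps groups; replicate r zeroes
    out group r and reweights the remaining units by n_reps/(n_reps-1).
    The spread of the replicate estimates around the full-sample
    estimate yields the variance estimate."""
    rng = random.Random(seed)
    groups = [rng.randrange(n_reps) for _ in values]

    def weighted_mean(ws):
        return sum(w * y for w, y in zip(ws, values)) / sum(ws)

    full = weighted_mean(weights)
    factor = n_reps / (n_reps - 1)
    squared_devs = 0.0
    for r in range(n_reps):
        rep_w = [0.0 if g == r else w * factor
                 for w, g in zip(weights, groups)]
        squared_devs += (weighted_mean(rep_w) - full) ** 2
    return squared_devs * (n_reps - 1) / n_reps
```

As a sanity check, a constant outcome yields (near-)zero variance, while a 50/50 binary outcome over 400 equally weighted units yields a variance near the simple-random-sampling value 0.25/400.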



Sampling plan for on-site visits


A total of 125 SFAs will be recruited for the on-site visits that are planned for the spring of 2013. The SFAs will be selected so as to cover a broad range of SFAs with respect to geography (FNS region), size category, and poverty status. Within the selected SFAs, a maximum of three schools will be selected for the in-person site visits, including where possible one elementary school, one middle school, and one high school per SFA.3 To enable linking of site-visit observations with SFA survey data collected in the base year, the sample will be restricted to the 1,400 SFAs that completed the base year survey. Table B5 summarizes the distribution of the 1,400 responding SFAs by region, size category, and poverty status. Table B6 summarizes the proposed numbers of SFAs to be selected for the site visits. Note that the sample sizes in Table B6 assume that all of the selected SFAs will agree to participate in the site visits. In the event that a selected SFA does not agree to participate, a backup SFA will be selected from other SFAs in the same cell to replace the non-cooperating SFA.


As indicated above, up to three schools will be selected from each cooperating SFA. Where possible, the selected SFAs will be linked to districts (LEAs) in the current Common Core of Data (CCD) Universe files maintained by NCES to develop an initial list of schools for subsampling. The SFA will be asked to review and update the school lists where necessary, and one school from each grade level will be selected for the site visits. A “backup” school for each grade level will also be designated at the time of sampling to act as a substitute in case the originally sampled school does not agree to participate. Note that where CCD lists are not available, the SFA will be asked to provide a list of the schools it serves for sampling purposes.




Table B5. Number of base year SFA respondents by FNS region, enrollment size category, and percent of students eligible for free/reduced price lunch

Each enrollment size category shows the count of SFAs with under 60 percent / 60 percent or higher of students eligible for free/reduced price lunch.

| FNS Region      | Total | Under 1,000 | 1,000 to 4,999 | 5,000 or more |
| Northeast       | 147   | 26 / 3      | 80 / 1         | 31 / 6        |
| Mid Atlantic    | 142   | 13 / 6      | 62 / 4         | 47 / 10       |
| Southeast       | 183   | 3 / 5       | 27 / 35        | 80 / 33       |
| Midwest         | 278   | 55 / 19     | 116 / 19       | 51 / 18       |
| Southwest       | 237   | 26 / 42     | 42 / 34        | 48 / 45       |
| Mountain Plains | 188   | 71 / 18     | 42 / 10        | 40 / 7        |
| Western         | 225   | 21 / 24     | 40 / 23        | 77 / 40       |
| Total           | 1,400 | 215 / 117   | 409 / 126      | 374 / 159     |




Table B6. Number of base year SFA respondents to be selected for site visits by FNS region, enrollment size category, and percent of students eligible for free/reduced price lunch

Each enrollment size category shows the count of SFAs with under 60 percent / 60 percent or higher of students eligible for free/reduced price lunch.

| FNS Region      | Total | Under 1,000 | 1,000 to 4,999 | 5,000 or more |
| Northeast       | 13    | 2 / 0       | 7 / 0          | 3 / 1         |
| Mid Atlantic    | 13    | 1 / 1       | 6 / 0          | 4 / 1         |
| Southeast       | 14    | 0 / 0       | 2 / 3          | 6 / 3         |
| Midwest         | 25    | 5 / 2       | 9 / 2          | 5 / 2         |
| Southwest       | 21    | 2 / 4       | 4 / 3          | 4 / 4         |
| Mountain Plains | 18    | 6 / 2       | 4 / 1          | 4 / 1         |
| Western         | 21    | 2 / 2       | 4 / 2          | 7 / 4         |
| Total           | 125   | 18 / 11     | 36 / 11        | 33 / 16       |



B.3 Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


Overall response rate projections were presented earlier. Achieving the specified response rates involves locating sample members and securing their participation using the procedures described below. We estimate that 80 percent of the sampled SFA Directors will complete either the web-administered survey or a self-administered paper survey. We estimate that 80 percent of the sampled SFAs will participate in the on-site data collection. We expect all State Child Nutrition Directors to complete their survey.


Below we describe procedures to be followed to maximize the number of sample members who complete the survey:


  • The letters inviting SFA Directors and State Child Nutrition Directors to participate in the surveys will be very carefully developed to emphasize the importance of this study and how the information will help the Food and Nutrition Service (FNS) to better understand and address current policy issues related to Special Nutrition Program (SNP) operations.

  • Before the SFAs are invited to participate in the study, the contractor will gain support from relevant associations representing organizations with an interest in the success of this study (e.g. School Nutrition Association).

  • Designated FNS regional staff will serve as regional study liaisons and be kept closely informed of the project so that they will be able to answer questions from SFAs and encourage participation.

  • The contractor will have a toll free number that SFAs can call to ask any questions related to the study.

  • Sampled SFA Directors will have the option of completing the survey using the mode of their choice (hard copy or web). The State Child Nutrition Directors will have the option of completing a hard copy survey or a telephone survey.

  • We will follow up by telephone with all sampled SFA Directors and State Child Nutrition Directors who do not complete the survey within a specified period and urge them to complete it. At that point, if a State Child Nutrition Director prefers to complete the survey over the telephone, a telephone interviewer will administer it. The SFA Directors will not be given the option of completing a telephone survey because they need to gather data to complete the survey, and it is not practical to complete the SFA survey on the telephone.

  • Follow-up reminders will be sent either by email (if an email address is available) or by regular mail to respondents who have not mailed the survey back or completed the web survey.


The following procedures will be used to maximize the completion rates for surveys that are administered by telephone:


  • Use a core of interviewers with experience working on telephone surveys, particularly interviewers who have proven their ability to obtain cooperation from a high proportion of sample members, to administer the survey over the telephone to State Child Nutrition Directors who do not complete the hard copy survey.

  • All telephone interviewers will complete training specific to this study.

  • Use call scheduling procedures that are designed to call numbers at different times of the day (between 8am and 6pm) and week (Monday through Friday), to improve the chances of finding a respondent at work.

  • Make every reasonable effort to obtain an interview at the initial contact, but allow respondents flexibility in scheduling appointments to be interviewed.

  • Conduct silent monitoring of interviews to identify and promptly correct behaviors that could be inviting refusals or otherwise contributing to low cooperation rates.

  • Leave a message on voice mail in order to let the respondent know the call was for a research study.

  • Provide a toll-free number for respondents to call to verify the study’s legitimacy or to ask other questions about the study.

  • Require six unsuccessful call attempts to a number without reaching someone before considering whether to treat the case as “unable to contact.”

  • Implement refusal conversion efforts for first-time refusals and use interviewers who are skilled at refusal conversion and will not unduly pressure the respondent.



B.4 Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


The discussion below provides the results of the pretest of the survey instruments and the feasibility study.


Pretest. The State CN Director and SFA Director surveys were pretested to determine (1) the clarity of the wording, (2) the availability of the information, and (3) the response burden. The State CN Director survey was pretested with three states. For the SFA Director survey, Westat identified a pool of potential SFAs ranging in size and poverty level from across all FNS regions. Westat contacted SFAs until nine agreed to participate in the pretest.


SFAs participating in the pretest reported that the survey took longer to complete than the original 1.5-hour estimate. Respondents provided valuable feedback on question wording and identified questions that were difficult to answer. The questionnaires have been revised to reflect the results of the pretest.


B.5 Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.



The contractor, Westat, will conduct this study.


| Name                 | Affiliation | Telephone Number | E-mail                         |
| Juanita Lucas-McLean | Westat      | 301-294-2866     | JuanitaLucas-McLean@Westat.com |
| Adam Chu             | Westat      | 301-251-4326     | AdamChu@Westat.com             |
| Kim Standing         | Westat      | 301-294-3943     | KimStanding@Westat.com         |
| John Endahl          | FNS/USDA    | 703-305-2127     | john.endahl@fns.usda.gov       |
| Jennifer Rhorer      | NASS/USDA   | 202-720-2616     | Jennifer_rhorer@nass.usda.gov  |




1 The seven regions (and states) are: Northeast (CT, ME, MA, NH, NY, RI, VT), Mid-Atlantic (DE, DC, MD, NJ, PA, PR, VA, VI, WV), Southeast (AL, FL, GA, KY, MS, NC, SC, TN), Midwest (IL, IN, MI, MN, OH, WI), Southwest (AR, LA, NM, OK, TX), Mountain Plains (CO, IA, KS, MO, MT, NE, ND, SD, UT, WY), and Western (AK, AZ, CA, GU, HI, ID, NV, OR, WA).

2 Elementary school is defined as any school with any span of grades from kindergarten through grade 6. Middle or junior high school is defined as any school that has no grade lower than grade 6 and no grade higher than grade 9. High school is defined as any school that has no grade lower than grade 9 and continues through grade 12. Schools that do not fit these definitions are categorized as “other.”

3 Elementary school is defined as any school with any span of grades from kindergarten through grade 6. Middle or junior high school is defined as any school that has no grade lower than grade 6 and no grade higher than grade 9. High school is defined as any school that has no grade lower than grade 9 and continues through grade 12.

