Part A: Justification
A.1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.
This study updates a previously approved and discontinued data collection effort (OMB number 0584-0529), “Feasibility of Computer Matching in the National School Lunch Program.” It builds on analysis of that data collection, as well as other studies of data matching, by examining current methods of direct certification used by States and districts, specific improvements in direct certification methods made since 2005, and challenges facing States and districts in attaining high matching rates. It is needed to help the Food and Nutrition Service (FNS), State agencies and districts, and State child nutrition (CN) directors recognize promising trends, understand new approaches, and provide technical assistance for continuous improvement in direct certification practices.
Direct certification enables children in households that receive Supplemental Nutrition Assistance Program (SNAP) or other public assistance program benefits to be certified to receive free school meals without an application. The Child Nutrition and Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) Reauthorization Act of 2004 (PL 108-265) required States and local education agencies (LEAs)1 to use direct certification. FNS issued a new guideline, effective for school year (SY) 2009–2010, that direct certification must apply to all students in a household, to the extent possible, if any household member receives SNAP, Food Distribution Program on Indian Reservations (FDPIR), or Temporary Assistance for Needy Families (TANF) benefits.
Use of direct certification has increased since the 2004 reauthorization but is still not universal, despite the mandate. In SY 2010–2011, 85 percent of National School Lunch Program (NSLP) districts directly certified children in SNAP households; these districts included 97 percent of all students in NSLP schools. Most States now employ computer data-matching techniques to directly certify categorically eligible students. These techniques involve matching names and other identifying information between an electronic student enrollment file and electronic files containing information on children receiving benefits from SNAP or other programs that confer eligibility for free school meals. A comprehensive set of information on current State and local direct certification practices is needed to inform policy and program operation decisions to improve the reach and efficacy of direct certification. Such information is not available; this collection will meet that need.
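To illustrate the general approach, a minimal sketch of a deterministic match between a student enrollment file and a SNAP participant file appears below. The field names, normalization rules, and records are hypothetical; actual State and district systems use their own data elements, algorithms, and matching software.

```python
# Minimal illustrative sketch of deterministic data matching for direct certification.
# Field names and normalization are assumptions, not any State's actual specification.

def normalize(record):
    """Build a comparison key from identifying fields (hypothetical names)."""
    return (
        record["last_name"].strip().upper(),
        record["first_name"].strip().upper(),
        record["dob"],  # assumed ISO date string, e.g., "2003-09-14"
    )

def direct_certification_match(enrollment, snap_participants):
    """Return enrolled students whose identifiers match a SNAP participant record."""
    snap_keys = {normalize(p) for p in snap_participants}
    return [s for s in enrollment if normalize(s) in snap_keys]

# Example with made-up records: the student matches a SNAP record and would be
# directly certified without an application.
students = [{"last_name": "Doe", "first_name": "Jane", "dob": "2003-09-14"}]
snap = [{"last_name": "DOE", "first_name": "jane ", "dob": "2003-09-14"}]
print(direct_certification_match(students, snap))
```

In practice, States and districts vary in the identifiers used, the tolerance for near matches (for example, phonetic or probabilistic matching), and the frequency of matching; documenting that variation is one purpose of this collection.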
FNS has authority to conduct this study under its responsibility for the development and implementation of national policy for the NSLP. This responsibility includes the promulgation of regulations, monitoring of State operations, review and reimbursement of State and local expenditures, and program evaluations. States and districts, as well as schools and other institutions, participating in the NSLP are expected to cooperate with officials and contractors acting on behalf of FNS in the conduct of evaluations and studies under the Richard B. Russell National School Lunch Act and the Child Nutrition Act of 1966.
A.2. Indicate how, by whom, how frequently, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.
On behalf of FNS, Mathematica will collect information for the National School Lunch Program Direct Certification Improvement Study. The previously approved data collection studied the feasibility of expanding the use of computer matching for certification and verification of eligible children under the NSLP. With respect to direct certification, that study “found considerable variation in the methods and effectiveness of direct certification across States, suggesting that it may be possible to increase effectiveness in some States and thereby expand direct certification to more [SNAP] or TANF recipients” (Cole and Logan 2007). That study also provided a number of suggestions for how States and local districts could make improvements. The current study will build on those results, provide a current picture of direct certification efforts, and further assess the relationship between direct certification characteristics and performance. Therefore, the key purpose of this study is to describe and characterize the practices used by States and local education agencies to conduct direct certification. The study is not intended to produce national estimates or to draw comparisons between States. The information collected for this study will help FNS, State CN directors, and districts recognize promising trends, understand new approaches, and identify steps needed to improve their direct certification efforts.
The project has 11 study objectives: (1) update national information on current practices used by States and districts to conduct direct certification; (2) describe State information systems (ISs) and databases that are used to conduct direct certification and what analyses are conducted to determine the efficiency of the data matching, and correlate State system and database characteristics with State performance measures, including those based on the agency’s direct certification reporting; (3) develop a comprehensive, up-to-date reference library of data-matching algorithms and computer code used for NSLP direct certification at the State and local levels, including a library of the data elements, formats, and definitions for all variables used in the matching; (4) examine relationships between direct certification implementation procedures, information systems and databases, and State performance measures of direct certification; (5) determine what barriers exist in the use of data matching in direct certification in NSLP in different States and districts; (6) determine what States have been doing with direct certification grants awarded by FNS, in terms of improvements made and their effects; (7) identify best practices that could be used to provide technical assistance to those States developing continuous improvement plans to reach higher rates of data matching; (8) examine the current plans for improvement of the direct certification process in the future and the capability to adopt any potential changes that might be required in the Child Nutrition and WIC Reauthorization; (9) explore the records of unmatched SNAP households with school-aged children and of categorically eligible SNAP children (as determined by NSLP application) to determine how direct certification could be further improved; (10) estimate the “national” direct certification matching rates under various scenarios (Optional Task); and (11) develop model continuous improvement plans for States using State-level matching and for States using district-level matching (Optional Task).
In order to meet the study objectives, the project will include three data collection efforts: (1) a web-based national survey of State and local nutrition program administrators; (2) in-person interviews conducted with State- and district-level staff responsible for direct certification in seven case study States; and (3) an exploration of unmatched SNAP participant records and NSLP applications in case study States.
The web-based, national survey of direct certification practices (Appendix A) will be completed during the first phase of data collection by all entities directly responsible for conducting direct certification. Specifically, all 50 States, the District of Columbia, and five territories will be asked to complete the survey, as will all districts in States in which direct certification data matching is conducted at the district level.
We designed the survey to collect the detailed information required to address the study objectives while minimizing burden on survey respondents. We will ask respondents only questions relevant to the direct certification method they employ—State-level matching, district-level matching, or letter method2—and whether they are State or district staff. To further reduce burden, we will provide a short form of the district-level survey to approximately two-thirds of districts and a more detailed long form to the remaining districts. The short version of the district-level survey will provide updated national information on LEAs’ current direct certification practices in three key areas: (1) student enrollment data characteristics, (2) LEA data matching process characteristics, and (3) methods of linking children in the same household. These key areas of interest are also included in the long-form survey.
The long version of the survey explores the LEA direct certification process in greater detail for each of the above areas, including attributes of the information systems and databases and specific data-matching algorithms and results. It will also address the following additional topics: planned changes to direct certification; experiences interacting with State data systems; challenges and barriers faced in the direct certification process; and the feasibility of using Medicaid databases for direct certification in the future. (Questions included in the shortened survey are highlighted in yellow in Appendix A.) Construction of this survey was informed both by the previously approved data collection and by recent work Mathematica has conducted on behalf of FNS involving semistructured interviews on direct certification practices with States identified as having strong direct certification performance.
The in-person, semistructured interviews (Appendix B) will be completed during site visits to seven case study States selected based on having direct certification processes in place that best address the key research questions. We will conduct interviews with either individuals or small groups with a time limit of 60 minutes. The interviews will have eight types of respondents: (1) State CN staff, (2) State education staff, (3) State SNAP staff, (4) State Medicaid staff, (5) State TANF staff, (6) State IS staff, (7) district staff, and (8) district IS staff. In some States, SNAP, Medicaid, and/or TANF programs might be integrated, such that the individuals most knowledgeable about topics relevant to direct certification of SNAP recipients will also be the most appropriate respondents for questions related to Medicaid and/or TANF. Because States differ in their approach to direct certification, we will interview only the respondent types relevant for that particular case study State.
There could be some advantage to conducting the site visits prior to the national survey and using our findings to help inform and refine the composition of the national survey. However, two important disadvantages to this timing outweigh the potential benefit: (1) timing of the survey is critical, because it is important to conduct the survey as close to the start of the school year as possible, when information about the initial match is still fresh in respondents’ minds, and conducting the semistructured interviews first would delay the start date for the survey; and (2) gathering base information from the survey first will make on-site visits more efficient for interviewer and interviewee, because the interviewer will come to the meeting more informed about State direct certification operations and the interviewee will be prepared to expand on previous answers and provide important context.
To explore the accuracy of direct certification matches and provide insight into how data matching could be improved, we will collect and examine unmatched SNAP participant records and NSLP applications from a sample of districts within the seven case study States.3 More specifically, we will select a sample of 28 districts from the seven States and request all applications for which a student was determined to be categorically eligible for school meal benefits, that is, eligible based on participation in SNAP, TANF, or other programs that confer eligibility for free meals. These applications represent students who could have been directly certified without an application but were not. We expect to collect 2,100 to 2,150 applications, a figure based on the average number of NSLP applications per district that include categorically eligible students (28 districts × 76 applications = 2,128 total NSLP applications). In addition, we will request that each case study State provide the SNAP participant files used in the initial matching with student enrollment data. Using both the SNAP participant data and the categorically eligible NSLP applications, we will be able to (1) describe the characteristics of children with SNAP records who are not matched to enrollment data and (2) conduct an independent match of the sampled categorically approved NSLP applications. These analyses will identify the types of children who are not directly certified through the State’s matching procedures and provide insight into the accuracy and completeness of these procedures.
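As a simple illustration of this component, the sketch below (with hypothetical field names) shows how unmatched SNAP records can be isolated for review and how the expected application count cited above is derived.

```python
# Illustrative sketch only: isolating SNAP records with no counterpart among
# directly certified students, and the expected volume of applications to review.

def key(record):
    # Hypothetical identifying fields; real reviews may use additional elements.
    return (record["last_name"].upper(), record["first_name"].upper(), record["dob"])

def unmatched_snap_children(snap_participants, certified_students):
    """Return SNAP records that did not match any directly certified student."""
    certified = {key(s) for s in certified_students}
    return [p for p in snap_participants if key(p) not in certified]

# Expected number of categorically eligible NSLP applications to collect,
# per the sampling plan described above:
print(28 * 76)  # 28 sampled districts x 76 applications per district = 2,128
```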
The contractor is committed to meeting its ethical and contractual obligations to preserve the confidentiality of the sensitive and personal data entrusted to it. Its policies, procedures, and technical safeguards are designed to protect confidential information and data from unauthorized disclosure, use, or alteration. These measures are implemented companywide and are consistent with the Federal Information Security Management Act of 2002 (FISMA); OMB Circular A-130, Management of Federal Information Resources; the Privacy Act; and National Institute of Standards and Technology (NIST) computer security standards and guidance. All contractor staff are required to comply with a Confidentiality Pledge and to complete security awareness training, as well as training on the use of specific security measures.
The contractor’s standard safeguards include Federal Information Processing Standard (FIPS) 140-2 compliant data encryption methods, removing identifiers from data as soon as practicable and controlling access to information on a need-to-know basis. When not in use, hard copy and external media that contain confidential data are stored in controlled access areas. The full complement of standard procedures is documented in the contractor’s Corporate Security Manual, which is available upon request.
A.3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also, describe any consideration of using information technology to reduce burden.
We designed the data collection methodology for this study to minimize burden and provide the flexibility required to meet the needs of respondents. We are committed to compliance with the E-Government Act of 2002 to promote the use of technology.
We will administer the national survey as a web survey. This allows easy access and efficient collection of data and ensures the privacy of respondents’ information. (A sample screenshot is included in the package as Appendix C.) Because the survey will include program-specific and technical questions, we have designed the web survey so that a respondent can save responses and then hand off sections to other administrators who have the relevant knowledge. Web surveys will be password protected, with data transmitted via a secure tunnel to a database residing behind a monitored firewall. The web survey will include functions for tracking survey responses, enabling project staff to keep abreast of the status of survey respondents; the database will alert staff to past-due surveys so staff can follow up with nonrespondents. Twenty percent of this data collection will be submitted electronically, but the URL has not yet been created. For the in-person site visits, the contractor will use extant data to create State and district profiles and other documentation to closely familiarize itself with the details of the direct certification efforts of each case study State (see Appendix C). Staff carrying out the in-person site visits will use this information to make State-specific preparations for the visits, ensuring that the interviews are specific, streamlined, and an efficient use of the respondents’ time.
In requesting categorically eligible applications for the exploration of unmatched records, we will minimize the burden on respondents by accepting those data in the format (such as hard copy, Excel file, text file, or other flat file format) and delivery method (such as a secure file exchange (FX) site or mailed hard copies) that are most convenient for respondents. In accordance with the Privacy Act, the contractor will safeguard all data, and only authorized users will have access to them.
A.4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purpose described in item 2 above.
Every effort has been made to avoid duplication in the data collection. Previous efforts have collected similar data, but those data are no longer current, they do not account for more recent changes in direct certification regulations, they did not collect data from districts in States performing district-level matching, and they do not provide the level of detail required for this study.
The previously approved data collection occurred in late 2005 (survey of States) and early 2006 (telephone interviews with six case study States), before all States and districts were required to have direct certification systems in place for SY 2008–2009. Given that technology changes rapidly, that data-matching procedures are continually evolving, and that States and districts very likely changed their systems to meet the SY 2008–2009 deadline, the information previously collected is outdated. In addition, the survey for the previously approved data collection effort did not include district-level administrators. It is critical that information be collected from local districts in the 19 States (plus Ohio) that employ more decentralized data matching methods. Without information on how direct certification is handled in those districts, FNS would not have an understanding of how the data matching systems and methods are operating in 40 percent of the country. In addition, the current data collection effort will ask more detailed questions about student enrollment data characteristics, data matching techniques, and other relevant attributes of the direct certification systems and processes employed by States and districts.
Given the above, the information on direct certification procedures to be collected in this study does not exist elsewhere. FNS does not require States or districts to report any information related to their computer matching activities; therefore, this data collection does not duplicate State or district efforts.
A.5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.
We will not contact any small businesses or entities during the course of this study.
A.6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.
This is a one-time collection effort. If the data are not collected, FNS will not have the information it needs to address the study objectives outlined in Section A.2. For example, FNS will not be able to gather updated national information on the current processes and procedures used by States and districts to conduct direct certification with computer matching techniques; FNS will not be able to explore the relationship between these methods and overall direct certification performance measures; and FNS will not be able to identify steps for continuous improvement in data matching techniques and tools to increase matching rates. These limitations would significantly hinder FNS’ ability to assess the feasibility of improving the certification process used for the NSLP.
A.7. Explain any special circumstances that would cause an information collection to be conducted in a manner:
requiring respondents to report information to the agency more often than quarterly;
requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;
requiring respondents to submit more than an original and two copies of any document;
requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years;
in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;
requiring the use of a statistical data classification that has not been reviewed and approved by OMB;
that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or
requiring respondents to submit proprietary trade secret, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.
There are no special circumstances. We will conduct data collection in a manner consistent with the guidelines in 5 CFR 1320.5.
A.8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency’s notice, soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments.
Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting form, and on the data elements to be recorded, disclosed, or reported.
In accordance with 5 CFR 1320.8(d), a notice of the proposed information collection inviting public comment was published in the Federal Register on March 11, 2011 (Volume 76, Number 48, pages 13342–13344). No public comments were received in response.
In addition to soliciting comments from the public and from the National Agricultural Statistics Service (NASS), FNS drew upon its experience conducting best practice interviews with States as a part of the annual report to Congress on NSLP direct certification implementation progress. FNS also consulted five district directors, four State-level directors, and six Mathematica senior technical staff about the availability of data, design, level of burden, and clarity of instructions for this collection:
DISTRICT PILOT TEST PARTICIPANTS
Ms. Jackie Schumacher, Food Service Director 307-587-4285
Cody, WY
Ms. Lena Harris-Wilson, Food Service Director 307-221-0355
Cheyenne, WY
Ms. Vicki Hoffman, Food Service Director 316-973-2160
Wichita, KS
Ms. Nancy Coughenour, Food Service Director 913-993-9723
Shawnee Mission, KS
Ms. Cynthia Schrader, Food Service Director 913-684-1569
Leavenworth, KS
STATE PILOT TEST PARTICIPANTS
Dr. Colleen Fillmore, Child Nutrition Director 208-332-6820
Boise, ID
Ms. Janet Hawk, Coordinator of School Nutrition Programs 609-984-0692
Trenton, NJ
Ms. Cheryl Johnson, Director, Child Nutrition & Wellness 785-296-2276
Topeka, KS
Ms. Tamra Jackson, Nutrition Programs Supervisor 307-777-6263
Cheyenne, WY
MATHEMATICA SENIOR TECHNICAL STAFF
Kevin Conway: Project Director 609-750-4083
Nancy Cole: Senior Researcher 617-674-8353
John W. Hall: Senior Statistician 609-275-2357
Quinn Moore: Senior Researcher 919-240-4879
Lara Hulsey: Researcher 609-936-2778
Brandon Kyler: Senior Program Analyst 609-716-4381
A.9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.
There are no payments or gifts to respondents.
A.10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.
The information provided in this study through the national survey of direct certification practices, through the in-person semistructured interviews, and through the collection of unmatched SNAP records and NSLP applications will be kept private to the extent allowed by law. Results will be reported only at the State level, and the names of participating districts will not be revealed. We will assure survey and semistructured interview respondents in writing that they will not be personally identified in any publications. Moreover, we will ensure that any published reports with tabular summaries or frequency distributions will not allow the deductive disclosure of any participant in this study. This assurance is contained in the frequently asked questions page that will be transmitted to respondents with the introductory materials (see Appendix D).
The exploration of unmatched records of SNAP participants in seven in-depth study States will require the collection of SNAP participant data and NSLP applications. Both the SNAP participant data and NSLP applications will contain private information, such as names, addresses, dates of birth, Social Security numbers, and program participation information.
More specifically, the application for school meal benefits requires the household’s current income, the names of all household members, and the Social Security number of the adult household member who signs the application (or an indication that the adult does not possess a Social Security number). However, if the application is made for a member of a SNAP or TANF household, the application must enable the household to provide the appropriate case number in lieu of the names of all household members, household income information, and Social Security number. Section 9(b) of the National School Lunch Act (Public Law 103-448) restricts the use or disclosure of any eligibility information to persons directly connected with the administration or enforcement of the program. It also authorizes States and local school food authorities to conduct verification of eligibility for free and reduced-price meals. Social Security numbers may be used to identify household members in carrying out efforts to verify the correctness of information stated on the application.
In addition, Section 7(b) of the Privacy Act of 1974 (P.L. 93-579, 5 U.S.C. 552a note) requires that Federal, State, or local government agencies requesting that individuals disclose their Social Security numbers inform those individuals (1) whether the disclosure is mandatory or voluntary, (2) by what statutory or other authority the number is solicited, and (3) what uses will be made of it. The Department’s prototype Privacy Act Statement, which fulfills these criteria, has been incorporated into sections 245.6(a)(1) and 245.6a(a)(2) of the regulations governing free and reduced-price eligibility and has been included in the Department’s prototype free and reduced-price application.
We will collect the SNAP participant data from State staff. We anticipate that, in most States, the participant data will be available electronically and, therefore, will be transmitted to the contractor via a secured transfer site. The secure file exchange (FX) site uses a secure sockets layer (SSL) certificate to encrypt the data transmission, which conforms to the strictest data security protocols. Users will access the FX site using a site-specific user name and password. We will provide technical assistance to States transferring the data via our secured FX site.
State or district staff will collect the NSLP applications and transmit them to the contractor in one of three ways. First, if the applications are available electronically, they can be sent via the secure transfer site. If applications are available only by hard copy, then State or district staff can either (1) deliver the necessary files in-person to contractor staff during a site visit or (2) ship the hard-copy applications via mail or courier. When received at our location, we will keep these files in a secure, locked location accessible only by authorized project staff. Upon completion of the study, the SNAP participant data and NSLP applications will be destroyed.
The contractor has a long history of protecting the privacy of records and considers it a critical aspect of any study’s scientific integrity and legality. As discussed in Section A.2 above, the contractor’s policies, procedures, and technical safeguards are designed to protect confidential information and data from unauthorized disclosure, use, or alteration. Only authorized personnel with a need to know will have access to data containing personally identifiable information (PII). These measures are implemented companywide and are consistent with the Federal Information Security Management Act of 2002 (FISMA); OMB Circular A-130, Management of Federal Information Resources; the Privacy Act; and National Institute of Standards and Technology (NIST) computer security standards and guidance. In addition, the contractor’s standard safeguards include Federal Information Processing Standard (FIPS) 140-2 compliant data encryption methods, removing identifiers from data as soon as practicable, and controlling access to information on a need-to-know basis.
The USDA Privacy Office has determined that, in accordance with OMB Memorandum M-03-22, a Privacy Impact Assessment (PIA) is not required for any phase of this data collection. The semistructured interviews, web surveys, and the SNAP records and NSLP applications already collected by States and districts fall outside the scope of the PIA requirement. There will be no creation of a new System of Records and no new collection of PII. Once the contractor has completed its process analysis using data provided by the States and districts and provided FNS with the aggregated evaluation report, all electronic and paper records will be destroyed consistent with NIST data destruction guidelines. FNS does not, and will not, see any individual SNAP records or NSLP applications.
A.11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior or attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.
FNS and the contractor will comply with the Privacy Act of 1974. The questions in the national survey of direct certification practices will relate to program details, direct certification practices, and the respondents’ opinions of effectiveness. During the pilot test, no participants identified any question as sensitive. Moreover, web administration is the preferred mode for collecting potentially sensitive information because it does not require respondents to disclose anything they feel is threatening directly to an interviewer, either face-to-face or over the telephone. The comparative privacy of answering questions on a computer reduces the perceived threat and has been found to improve the accuracy of responses.
During site visits with in-depth study States, the semistructured interview questions will relate to program details, direct certification practices, and participants’ opinions of effectiveness. These questions are not considered sensitive, based on our experience asking similar questions in the best practice interviews with States conducted as a part of the annual report to Congress on NSLP direct certification implementation progress.
A.12. Provide estimates of the hour burden of the collection of information. The statement should:
Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I.
Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories.
The study will collect data from a total of 6,513 respondents across all States. There are three categories of data collection: (1) a web-based, national survey (States and districts); (2) in-depth interviews during site visits; and (3) collection of unmatched SNAP participant records and NSLP applications. The web-based, national survey will be conducted with 56 State and territory CN program directors and approximately 6,265 district administrators (2,500 districts will receive a long version of the survey; 3,765 districts will receive a short version). In-depth interviews during site visits will be conducted with 7 State CN agency officials; 7 State education staff; 7 State SNAP officials; 7 State Medicaid agency officials; 7 State TANF officials; 14 (2 per State) State IS staff; 18 district staff; and 18 district IS staff. Unmatched SNAP participant records will be collected by 7 State staff, and NSLP applications will be collected by 100 district staff.
The burden estimate for the web-based, national survey of direct certification practices is 1.0835 hours (65 minutes) for State CN staff, inclusive of the respondents’ time to prepare for and complete the survey; the burden estimate is 1.0 hour (60 minutes) for district staff completing the long version of the survey and 0.334 hours (20 minutes) for district staff completing the short version of the survey. For all persons who decline to participate in the survey, the burden estimate is 0.1002 hours (6 minutes) and includes the respondents’ time to read a letter and/or respond to a telephone call. For all respondents interviewed during the site visits, the burden estimate is 1.334 hours (80 minutes), including respondents’ time to read an introductory letter, receive a reminder letter, and prepare for and participate in the visit. The burden for gathering unmatched SNAP records is 4.0 hours for each case study State; the burden for district staff to gather categorically eligible NSLP applications is 4.0 hours at each sampled district. Estimates for the interviews and record collection are based on the contractor’s prior corporate experience.
Table A.12.1 outlines the burden estimates for the national survey of direct certification practices and site visits; the table reflects the expected average length of the survey, interviews, and collection of unmatched SNAP records and NSLP applications. Appendix A provides a copy of the national survey of direct certification practices. Appendix B provides the semistructured interview protocols that we will use during site visits.
Table A.12.1. Annual Burden Estimate

| Respondent Type | Estimated Number of Respondents | Responses per Respondent | Total Annual Responses | Estimated Hours per Response | Estimated Total Burden (Hours) |
|---|---|---|---|---|---|
| Web-Based Survey | | | | | |
| State CN staff (long survey), complete | 50 | 1 | 50 | 1.0835 | 54.18 |
| State CN staff (long survey), attempted | 6 | 1 | 6 | 0.1002 | 0.60 |
| District staff (long survey), complete | 2,000 | 1 | 2,000 | 1.00 | 2,000.00 |
| District staff (long survey), attempted | 500 | 1 | 500 | 0.1002 | 50.10 |
| District staff (short survey), complete | 3,012 | 1 | 3,012 | 0.334 | 1,006.01 |
| District staff (short survey), attempted | 753 | 1 | 753 | 0.1002 | 75.45 |
| Site Visits | | | | | |
| State CN staff | 7 | 1 | 7 | 1.334 | 9.34 |
| State education staff | 7 | 1 | 7 | 1.334 | 9.34 |
| State SNAP staff | 7 | 1 | 7 | 1.334 | 9.34 |
| State Medicaid staff | 7 | 1 | 7 | 1.334 | 9.34 |
| State TANF staff | 7 | 1 | 7 | 1.334 | 9.34 |
| State IS staff | 14 | 1 | 14 | 1.334 | 18.68 |
| District staff | 18 | 1 | 18 | 1.334 | 24.01 |
| District IS staff | 18 | 1 | 18 | 1.334 | 24.01 |
| Unmatched SNAP Records and NSLP Applications | | | | | |
| State CN staff (SNAP unmatched records) | 7 | 1 | 7 | 4.00 | 28.00 |
| District staff (NSLP applications) | 100 | 1 | 100 | 4.00 | 400.00 |
| Total | 6,513 | | 6,513 | | 3,727.74 |
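The totals in Table A.12.1 follow directly from the row-level figures. The sketch below is provided only as an arithmetic check; it reproduces the table's row-level rounding (two decimal places) before summing.

```python
from decimal import Decimal, ROUND_HALF_UP

# (respondents, estimated hours per response) for each row of Table A.12.1
rows = [
    (50, "1.0835"), (6, "0.1002"),                 # State CN staff, long survey
    (2000, "1.00"), (500, "0.1002"),               # District staff, long survey
    (3012, "0.334"), (753, "0.1002"),              # District staff, short survey
    (7, "1.334"), (7, "1.334"), (7, "1.334"),      # Site visits: State CN, education, SNAP staff
    (7, "1.334"), (7, "1.334"),                    # Site visits: State Medicaid, TANF staff
    (14, "1.334"), (18, "1.334"), (18, "1.334"),   # State IS, district, district IS staff
    (7, "4.00"), (100, "4.00"),                    # SNAP unmatched records; NSLP applications
]
cent = Decimal("0.01")
total_responses = sum(n for n, _ in rows)
total_hours = sum((Decimal(n) * Decimal(h)).quantize(cent, ROUND_HALF_UP) for n, h in rows)
print(total_responses, total_hours)  # 6513 responses and 3727.74 burden hours
```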
The total cost to respondents for their time in this collection is $127,347.44 (Table A.12.2). To calculate the annualized cost to State and local agency respondents, we used mean hourly wage rates from the Bureau of Labor Statistics’ May 2009 National Industry-Specific Occupational Employment and Wage Estimates.
Table A.12.2. Annual Cost to Respondents

| Respondent Type | Instrument Type | Average Hours per Response | Number of Respondents | Frequency of Response | Mean Hourly Wage Rate | Cost to Respondent |
|---|---|---|---|---|---|---|
| State CN Staff | Long survey | 1.0835 | 50 | 1 | $37.74 | $2,044.75 |
| State CN Staff | Long survey (attempted) | 0.1002 | 6 | 1 | $37.74 | $22.64 |
| State CN Staff | Site visit | 1.334 | 7 | 1 | $37.74 | $352.49 |
| State CN Staff | SNAP unmatched records | 4.00 | 7 | 1 | $23.56 | $659.68 |
| District Staff b | Long survey | 1.00 | 2,000 | 1 | $35.74 | $71,480.00 |
| District Staff b | Long survey (attempted) | 0.1002 | 500 | 1 | $35.74 | $1,790.57 |
| District Staff b | Short survey | 0.334 | 3,012 | 1 | $35.74 | $35,954.80 |
| District Staff b | Short survey (attempted) | 0.1002 | 753 | 1 | $35.74 | $2,696.58 |
| District Staff b | Site visit | 1.334 | 18 | 1 | $35.74 | $858.12 |
| District Staff b | NSLP applications | 4.00 | 100 | 1 | $22.73 | $9,092.00 |
| District IS Staff b | Site visit | 1.334 | 18 | 1 | $22.73 | $545.75 |
| State IS Staff a | Site visit | 1.334 | 14 | 1 | $23.56 | $440.10 |
| State Education Staff a | Site visit | 1.334 | 7 | 1 | $37.74 | $352.49 |
| State SNAP Staff a | Site visit | 1.334 | 7 | 1 | $37.74 | $352.49 |
| State Medicaid Staff a | Site visit | 1.334 | 7 | 1 | $37.74 | $352.49 |
| State TANF Staff a | Site visit | 1.334 | 7 | 1 | $37.74 | $352.49 |
| Total | | | | | | $127,347.44 |
a North American Industry Classification System (NAICS) 999200: State Government.
b NAICS 999300: Local Government.
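The total in Table A.12.2 can likewise be reproduced by multiplying each row's total burden hours (as rounded in Table A.12.1) by the corresponding mean hourly wage rate. The sketch below is an arithmetic check only.

```python
from decimal import Decimal, ROUND_HALF_UP

# (total burden hours, mean hourly wage) for each row of Table A.12.2
cost_rows = [
    ("54.18", "37.74"), ("0.60", "37.74"), ("9.34", "37.74"), ("28.00", "23.56"),  # State CN staff
    ("2000.00", "35.74"), ("50.10", "35.74"), ("1006.01", "35.74"),                # District staff, surveys
    ("75.45", "35.74"), ("24.01", "35.74"), ("400.00", "22.73"),                   # District staff, other activities
    ("24.01", "22.73"), ("18.68", "23.56"),                                        # District IS and State IS staff
    ("9.34", "37.74"), ("9.34", "37.74"), ("9.34", "37.74"), ("9.34", "37.74"),    # State education, SNAP, Medicaid, TANF
]
cent = Decimal("0.01")
total_cost = sum((Decimal(h) * Decimal(w)).quantize(cent, ROUND_HALF_UP) for h, w in cost_rows)
print(total_cost)  # 127347.44, matching the $127,347.44 total in Table A.12.2
```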
A.13. Provide estimates of the total annual cost burden to respondents or record keepers resulting from the collection of information (do not include the cost of any hour burden shown in items 12 and 14). The cost estimates should be split into two components: (a) a total capital and start-up cost component annualized over its expected useful life; and (b) a total operation and maintenance and purchase of services component.
There are no capital/start-up or ongoing operation/maintenance costs associated with this information collection.
A.14. Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost and any other expense that would not have been incurred without this collection of information.
The total costs of this study include a firm-fixed-price contract with Mathematica for $999,964, which covers design of the study and development of data collection instruments, data collection, analysis, and report writing, plus time spent by the Federal project officer (GS-13, Step 2) to manage data collection ($4,000). In addition, there are two option tasks (see Section A.2) that may be exercised at the discretion of FNS, at a cost of $97,493 and $74,734, respectively. Annual contract costs for the study, including the two option tasks, are as follows:
Year 1: $336,075 Base Contract (September 15, 2010, through September 30, 2011)
Year 2: $663,889 Base Contract + $172,227 Option Tasks (October 1, 2011, through September 30, 2012)
A.15. Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of OMB Form 83-I.
This is a reinstatement with changes of a previously approved data collection that will build on the data collection for “Feasibility of Computer Matching in the National School Lunch Program” (OMB Number 0584-0529). In contrast to the original data collection effort, which focused solely on State-level respondents, the proposed data collection will expand the study to all districts in district-matching States. There are approximately 6,265 districts across 19 States, plus Ohio, that perform data matching at the local level. The revised collection will also include in-person interviews of State and local staff in the seven in-depth study States, whereas the previous collection interviewed respondents by telephone in six States. In addition, the current study further expands data collection efforts by collecting unmatched SNAP records and categorically eligible NSLP applications from a sample of districts in the seven in-depth study States. This information is critical to exploring the accuracy of direct certification matches and providing insight into how data matching could be improved. Taken together, this expansion of the data collection effort will substantially increase the number of respondents from whom data will be collected (6,513 versus 225). As a result, the burden hours for this revised data collection are 3,727.74, compared with 144.25 for the previously approved collection.
A.16. For collections of information whose results are planned to be published, outline plans for tabulation and publication.
The contractor will deliver analyses derived from this data collection to FNS via four key deliverables: a main final report, an unmatched records report, presentations (a briefing for FNS and two conference presentations), and a direct certification metadata repository. We describe each of these key deliverables in greater detail below. In addition, there are two option tasks to be exercised at the discretion of FNS that could result in two separate reports: an estimation of national certification matching rates report and a continuous improvement plan report. Each of the reports and the presentations will present key findings of the study in clear, nontechnical language that makes them understandable by a large audience. Table A.16.1 presents the schedule for data collection and the delivery of these products to FNS.
Table A.16.1. Data Collection, Analysis, and Reporting Schedule

| Activity | Time Schedule |
|---|---|
| Data Collection* | |
| Send introductory letter to survey respondents | Shortly after receiving OMB clearance |
| Conduct national survey of direct certification practices | September 4, 2012, to November 26, 2012 |
| Send introductory letter to in-depth study States | Shortly after receiving OMB clearance |
| Conduct in-depth data collection | October 29, 2012, to March 18, 2013 |
| Data Analysis | |
| Compile survey data and prepare data files | December 28, 2012 |
| Complete data analysis | June 7, 2013 |
| Final table shells | June 7, 2013 |
| Reports | |
| Main Report – Final | October 25, 2013 |
| Unmatched Records Report – Final | October 18, 2013 |
| OPTIONAL: Estimation of National Certification Matching Rates Report – Final | TBD |
| OPTIONAL: Continuous Improvement Plan Report – Final | TBD |
| Presentations | |
| FNS presentation | October 18, 2013 |
| Conference “A” presentation | TBD |
| Conference “B” presentation | TBD |
| Direct Certification Metadata Repository | November 1, 2013 |
| Data Files | |
| Restricted use files (Final) | June 7, 2013 |
| Restricted use files (Main Report) | September 4, 2013 |
| Restricted use files (Unmatched Records Report) | August 28, 2013 |
| Raw data/analytic files | October 16, 2013 |
| Public use files | November 6, 2013 |
* Assumes receipt of OMB clearance on or about August 1, 2012.
TBD = to be determined.
Main Final Report. The main final report, to be published on the FNS website (http://www.fns.usda.gov/ora/), will provide an updated nationwide profile of direct certification methods; direct certification rates by method, trends, and process improvements; and a discussion of barriers and challenges encountered. The report will consist of an executive summary, six sections, and appendixes: (1) background of NSLP direct certification and data matching, (2) discussion of the study plan and methodology, (3) current direct certification practices, (4) barriers to direct certification and planned improvements, (5) practices and barriers associated with direct certification performance, and (6) lessons learned and conclusions.
We will present information in tables and in narrative form, with analysis of the study’s objectives and questions based on the information gathered. We will organize this discussion by topic, rather than by type of analysis, in order to provide readers with a fuller, more textured understanding of each dimension of direct certification. Our general approach will be to use information from the national survey to give the broader picture of direct certification practices nationally, and then to use analysis of the in-depth data collection to support the conclusions of the national survey analysis, provide counterexamples, or give a more nuanced local perspective on the patterns we find nationally.
Unmatched Records Report. The unmatched records report, to be published on the FNS website (http://www.fns.usda.gov/ora/), will provide a descriptive analysis of unmatched SNAP records from States selected for the in-depth data collection, as well as an independent match of SNAP records used for direct certification against children listed on categorically approved applications. The analysis will focus on examining the limitations of direct certification processes and exploring the reasons for the nonmatches, along with next steps and possible solutions. The published report will consist of an executive summary, four sections, and appendixes: (1) background on the States and districts selected for the unmatched case review, (2) discussion of the methodology used in the unmatched case review, (3) results of the data analysis, and (4) conclusions and next steps. The appendixes will contain detailed information, business process flow charts describing the States’ direct certification processes, and tables on each of the States’ and districts’ direct certification systems.
Direct Certification Metadata Repository. The direct certification metadata repository (DCMDR) prototype will be a one-stop technical solution to gather and display the most up-to-date information on data-matching information systems (ISs) and database characteristics to facilitate improved direct certification systems in each State and district. The database will include edited information gathered from the national survey of direct certification practices, supplementary information gathered from the in-depth case study interviews, and technical documentation on student and program data. Information stored in the database will include the following: direct certification typology, program data used in matching, attributes of State ISs, data element attributes, privacy/security considerations, frequency of matching, data-matching algorithms and computer code, and data-matching rates. The DCMDR couples the need for current information to identify promising new practices with the technological capacity to quickly capture process and technological changes. Users will be able to query the database using a simple Section 508-compliant interface. In addition, the DCMDR will have reporting functionality so that information can be readily used to assist States and districts with their continuous improvement plans.
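For illustration only, the sketch below shows one possible record layout for DCMDR entries based on the information categories listed above. The field names are hypothetical; the actual repository schema will be determined during development of the prototype.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class DCMDREntry:
    """One hypothetical DCMDR record describing a State's or district's matching setup."""
    state: str                             # State or territory
    district: Optional[str]                # populated only for district-level matching
    dc_typology: str                       # e.g., "State-level", "district-level", "letter"
    programs_matched: List[str]            # e.g., ["SNAP", "TANF", "FDPIR"]
    information_system: str                # IS or database used for matching
    data_elements: List[str]               # identifiers used in the match (name, DOB, etc.)
    matching_frequency: str                # e.g., "weekly", "monthly"
    matching_algorithm: str                # description of, or reference to, stored code
    privacy_security_notes: str            # privacy/security considerations
    matching_rate: Optional[float] = None  # reported direct certification rate, if available
```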
If FNS exercises either or both of the option tasks, we will conduct complementary analyses to further understand and improve direct certification practices employed by States and districts. The first option task would be to estimate the national direct certification matching rates under various scenarios, using data gathered in the national survey. Data from previous studies suggest that State-level matching yields higher rates of direct certification, on average, than district-level matching. However, a multivariate analysis has not been conducted to control for State characteristics and other features of direct certification. The contractor would then conduct the necessary multivariate analysis to examine the characteristics of States and their direct certification methods to identify the critical factors that influence higher matching rates. The second option would be to develop a matrix that categorizes the key characteristics of States’ data-matching processes, thereby enabling States to compare their processes and procedures and identify potential areas for improvement. The contractor will also develop model plans for each typology of States that will provide a guide for continuous improvement plans.
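If the first option task is exercised, the multivariate analysis could take a form similar to the sketch below. The variable names and values are hypothetical and serve only to illustrate regressing State matching rates on matching method while controlling for other State characteristics.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical State-level data (made-up values for illustration only)
df = pd.DataFrame({
    "match_rate":         [0.82, 0.74, 0.91, 0.68, 0.88, 0.79],
    "state_level_match":  [1, 0, 1, 0, 1, 0],      # 1 = State-level matching
    "matching_frequency": [12, 4, 52, 2, 12, 4],   # matches per year
    "snap_caseload":      [1.2, 0.6, 2.5, 0.4, 1.8, 0.9],  # millions of participants
})

# OLS model relating matching rates to method and other characteristics
model = smf.ols("match_rate ~ state_level_match + matching_frequency + snap_caseload",
                data=df).fit()
print(model.params)  # coefficients indicate factors associated with higher rates
```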
A.17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.
The agency will display the OMB approval number and expiration date on all instruments.
A.18. Explain each exception to the certification statement identified in Item 19, “Certification for Paperwork Reduction Act.”
There are no exceptions to the certification statement. The agency is able to certify compliance with all provisions under Item 19 of OMB Form 83-I.
1 The Richard B. Russell National School Lunch Act (NSLA) and the Child Nutrition and WIC Reauthorization Act of 2004 use two different terms to refer to the local entities that enter into agreements with State agencies to operate the NSLP: LEAs and School Food Authorities (SFAs). In essence, LEAs are responsible for the application, certification, and verification functions of the school meal programs. SFAs are responsible for other aspects of the NSLP, such as meal pattern requirements and meal counting and claiming of reimbursements. For consistency’s sake, we will use the term “district” throughout the remainder of this document. However, it is important to note that the sampling frame consists of SFAs.
2 The Healthy, Hunger-Free Kids Act of 2010 (PL 111-296) required State agencies to phase out the use of the letter method as their primary method for direct certification with SNAP. Full compliance with this requirement is to occur by SY 2012-2013. However, it is appropriate to include questions regarding the letter method for two reasons: first, some States may still use the letter method as a secondary means of certifying children for school lunches; and second, there may be instances in which a State or district has not yet phased out the use of the letter method. It is important to capture that information.
3 Although this component of data collection is technically a compilation of extant data, it is appropriate that the collection of unmatched SNAP participant records and NSLP applications remain in this clearance package. Responding to this request will impose a burden on respondents as they will be required to gather and submit these records and documents to Mathematica. As such, we want to be as transparent as possible regarding all the activities required of this proposed data collection effort.