Assessment of the Contributions of an Interview to Supplemental Nutrition Assistance Program Eligibility and Benefit Determinations:
OMB Supporting Statement Part A
May 2, 2013
Project Officer: Rosemarie Downer
CONTENTS
PART A: JUSTIFICATION 5
A1. Circumstances that Make Data Collection Necessary 5
A2. Purpose and Use of the Information 7
A3. Use of Information Technology and Burden Reduction 12
A4. Efforts to Identify Duplication and Use of Similar Information 13
A5. Impacts on Small Businesses and Other Small Entities 13
A6. Consequences of Collecting the Information Less Frequently 14
A7. Special Circumstances Relating to the Guideline of 5 CFR 1320.5 14
A8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency 15
A9. Explanation of Any Payment or Gift to Respondents 16
A10. Assurance of Confidentiality Provided to Respondents 17
A11. Justification for Sensitive Questions 20
A12. Estimates of Hour Burden Including Annualized Hourly Costs 21
A13. Estimate of the Total Annual Cost Burden to Respondents or Record-Keepers 24
A14. Annualized Cost to the Federal Government 25
A15. Explanation for Program Changes or Adjustments 25
A16. Plans for Tabulation and Publication and Project Schedule 25
A17. Reason(s) Display of OMB Expiration Date Is Inappropriate 29
A18. Exceptions to Certification for Paperwork Reduction Act Submissions 29
APPENDIX A: Administrative Data Elements
APPENDIX B: Tailored Performance Data Indicators
APPENDIX C: Letters to SNAP Directors and Community-Based Organizations
APPENDIX D: Client Survey
APPENDIX E: Client Survey Correspondence
APPENDIX F: Focus Groups with Procedural Denials Guide
APPENDIX G: Focus Group Correspondence
APPENDIX H: Time-Use Data Collection Protocol
APPENDIX I: Site Visits and Interview Protocol
APPENDIX J: QC-Like Review Form
APPENDIX K: Mathematica Confidentiality Agreement
APPENDIX L: Public Comments
APPENDIX M: Response to Public Comments
TABLES
A.12.1. Annual Burden Estimate
A.12.2. Annual Cost to Respondents
A.16.1. Project Schedule
PART A: JUSTIFICATION
A1. Circumstances that Make Data Collection Necessary
Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.
This is a new information collection request. Section 17(a)(1) of the Food and Nutrition Act of 2008 [7 U.S.C. 2026(a)(1)] (included in this request) provides general legislative authority for the planned data collection. It authorizes the Secretary of Agriculture to enter into contracts with private institutions to undertake research that will help to improve the administration and effectiveness of the Supplemental Nutrition Assistance Program (SNAP) in delivering nutrition-related benefits.
The U.S. Department of Agriculture's Food and Nutrition Service (FNS) seeks approval to conduct data collection as part of the Assessment of the Contributions of an Interview to SNAP Eligibility and Benefit Determinations. The overall aim of this evaluation is to examine the impact of eliminating client interviews at SNAP certification and recertification. FNS has contracted with Mathematica Policy Research to conduct the evaluation. This package requests clearance for a new data collection, which will occur through the following means:
Administrative data
Tailored performance data
Client survey
Focus groups with procedural denials
Time-use data collection
Site visits
SNAP is a critical source of support for many low-income families and individuals. In recent years, States have changed the way clients enroll in SNAP. A central feature of the changes is a waiver that allows States to conduct the in-person eligibility interview over the telephone. Many States have implemented this interview waiver. Some States have expressed interest in exploring alternative certification approaches that do not require conducting any interviews in the SNAP eligibility determination process. However, little data are available to assess the impact of eliminating a certification interview on client access, customer service, and program integrity.
This study will focus on the contributions of interviews to the determination of SNAP eligibility and benefits. It will examine whether there are differences in payment accuracy, program access, administrative costs, and client satisfaction under two conditions: usual application procedures and no-interview demonstration procedures. The study will be conducted in three States.
State officials, local office staff, and client advocates, along with FNS, all recognize the potential risks and rewards of eliminating the in-person interview for most clients. The potential rewards might include reductions in administrative costs for States and increases in program access for some clients. However, the potential risks, such as reduced access for other clients and increased payment errors, are important enough to warrant careful study of the effects of eliminating the interview. To the extent possible, the study must isolate changes in the interview model from other changes that could influence key program outcomes.
The three States selected to participate in the study are North Carolina, Oregon, and Utah. North Carolina and Oregon will designate demonstration and comparison sites to test how not conducting an interview (demonstration sites) compares with the current interview model (comparison sites) in each State. Utah will randomly assign active cases to a no-interview group or an interview group at the start of the pilot, and all new applications will be randomly assigned thereafter. The study will examine the impacts of these demonstrations on payment accuracy, State administrative costs, and client access. It will also examine how these conditions affect the steps eligibility workers must take to ensure that accurate information is collected from clients.
A2. Purpose and Use of the Information
Indicate how, by whom, how frequently, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.
Mathematica will collect information for the Assessment of the Contributions of an Interview to SNAP Eligibility and Benefit Determinations on behalf of FNS. To obtain a detailed and comprehensive view of the implementation of SNAP modernization initiatives, Mathematica will collect data via in-person interviews (Appendix I); telephone client surveys (Appendix D); focus group discussions (Appendix F); and through administrative case records (Appendix A), tailored performance data (Appendix B), and other relevant materials.
The study has eight research objectives: (1) describe the no-interview demonstration in each State; (2) describe any modernization activities in each State that complement the demonstration to make its application more effective; (3) describe the process for implementing the demonstration; (4) describe the response of clients to the demonstration; (5) describe the response of SNAP staff to the demonstration; (6) describe the response of community-based organizations (CBOs) and other stakeholders to the demonstration; (7) document how key program outcomes change after the demonstration is implemented; and (8) document the main takeaway points from the study to inform FNS's consideration of future studies.
The overall purpose of this study is to meet these research objectives with the precision necessary to inform future SNAP policy. The study will quantify the impact of eliminating the eligibility interview and examine how this change affects participation, efficiency, access, payment accuracy, and client satisfaction.
Site Visit Interviews
A key source of data for the study will be on-site interviews conducted using a set protocol (see Appendix I). The study team will conduct a single visit to the demonstration and comparison sites in North Carolina and Oregon. The visit will involve observations of local offices in the demonstration and comparison areas, interviews with State and local SNAP staff, and interviews with CBOs. In Utah, the visit will focus on the State office, the centralized call center, and other centralized operations centers, where the study team will interview State officials, administrators, and call center staff. The team will also interview staff at a sample of local employment centers and CBOs to obtain their reactions to the demonstration.
During the visit, Mathematica will also conduct focus groups with individuals whose applications were procedurally denied, described in additional detail in the following section.
Client Data
Another key source of data for the study is from the clients who apply for SNAP benefits and those participating in SNAP. Mathematica will directly collect two types of data in English and Spanish from clients: (1) data from a survey of SNAP clients who recently applied or were recertified for benefits and (2) data from focus groups with individuals who applied for SNAP benefits but were denied because they did not complete the application process (referred to as procedural denials).
The short survey of SNAP clients will ask about their recent application or recertification interview experiences to provide the clients' perspective on the process (see Appendix D). The study team will select the sample for the client survey from among clients who applied or were recertified within the prior two months, thereby focusing on experiences that can be recalled with a sufficient degree of accuracy. The study team expects to complete 462 interviews per site in each State, yielding a 95 percent confidence interval of no more than plus or minus 5 percentage points.
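For context, a simplified check of that precision target (assuming simple random sampling and maximum outcome variance, p = 0.5; this is an illustration, not the study's formal precision calculation) shows that a plus or minus 5 percentage point margin of error at 95 percent confidence requires roughly n = (1.96^2 x 0.5 x 0.5) / 0.05^2, or about 385 completed interviews; the target of 462 completes per site exceeds this minimum, leaving room for design effects and subgroup estimates.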
The study will use semistructured focus group guides (see Appendix F) to explore the reasons clients' applications were procedurally denied. The discussions with procedural denial clients will focus on their experiences with the SNAP application process and will cover six main topics: (1) SNAP knowledge and expectations before applying for benefits, (2) experiences with the SNAP application process, (3) submitting verification documents, (4) access to help in completing the SNAP application, (5) the SNAP interview process and communication of key information, and (6) overall impressions. A total of 12 focus groups will be conducted, with an average of 10 participants per group.
Administrative Data
The study will collect three main types of administrative data: (1) monthly extracts of each State's SNAP caseload case records, (2) disposition data from reapplications of individuals who were denied benefits without an eligibility interview, and (3) State quality control (QC) records. Burden associated with the collection of administrative data is included in estimates approved under OMB Control Number 0584-0512 (expiration date 1/31/2016).
The study will obtain monthly case record extracts and application data from each State in two batches spanning from two years before the demonstration through the demonstration period. The study team will use these data to examine trends in the following outcomes: SNAP participation overall and by subgroup; participation in other programs (for example, Temporary Assistance for Needy Families); program access as measured by the number of applications submitted; applicant deductions; approval and denial rates; and other outcome measures that might be related to the interview model.
FNS requires that participating States offer eligibility interviews to any client who applies under the no-interview demonstration and is deemed ineligible. States will provide interview data on all clients denied eligibility through the demonstration, including (1) whether and when the client was offered an interview after denial, (2) whether the client accepted the interview offer, (3) whether the client was determined eligible after the interview, and (4) which client information differed when comparing the original and the interview-based applications.
Finally, the study will obtain disposition records from each participating State's QC process. Using form FNS 380-1 (approved under OMB Control Number 0584-0299, expiration date 2/29/2016) and form FNS 245 (approved under OMB Control Number 0584-0034, expiration date 1/31/2016), FNS requires all States to conduct QC reviews of a random sample of the State caseload to monitor payment accuracy. In addition, FNS requires the States participating in this study to conduct supplemental QC-like reviews on a sample of cases participating in the demonstration. States will be asked to provide electronic records of both the ongoing QC reviews and the supplemental QC-like reviews.
Performance and Time-Use Data
The study team will collect performance-related data for demonstration and comparison sites at two points in time: before the start of the demonstration and after implementation. In North Carolina and Oregon, performance and time-use data will be collected for staff in offices in the demonstration areas and compared with data for staff in offices in the comparison areas. In Utah, the State will identify specific teams to manage the no-interview cases (demonstration).1 The performance of staff on those teams will be compared with that of staff on the teams managing the traditional cases (comparison).
All performance data and supplemental retrospective administrative cost data will be collected electronically from State and/or local SNAP office staff. Using these data, 12 performance indicators will be calculated for both the demonstration and comparison sites. These indicators assess inputs (for example, the number of applications received); outputs (for example, the percentage of scheduled interviews completed); and outcomes (for example, change in administrative costs). The performance indicators will include the change in the (1) number of applications received; (2) method of submission; (3) percentage of clients completing scheduled interviews2; (4) percentage of scheduled interviews completed; (5) time spent processing each application component; (6) timeliness of the application process; (7) denial rates and reasons (including procedural denials); (8) proportion of cases with client-initiated changes; (9) number of fraud cases reported; (10) number of workers assigned to various tasks (if relevant); (11) staff workload; and (12) administrative costs by activity.
The study will collect time-use data (Appendix H) via web-based activity logs from a sample of caseworkers in both demonstration and comparison sites over the course of a workweek. These data will cover four general activities—certification, recertification, other case management, and non–case-related activities—in order to estimate the difference in costs between the current interview approach and the no-interview demonstration test. SNAP staff will complete a daily work log that identifies which clients the caseworker served (demonstration or comparison); the number of cases on which they worked; and the number of hours worked on tasks and subtasks associated with certification, recertification, other case management activities, and non–case-related activities.
A3. Use of Information Technology and Burden Reduction
Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also, describe any consideration of using information technology to reduce burden.
The study will comply with the E-Government Act of 2002 by promoting the use of technology. Because State staff resources are limited, the data collection is designed to minimize the burden on the States. Mathematica will work closely with State data managers to articulate data requirements and to ensure a comprehensive understanding of the meaning behind analysis variables. The time-use and performance data collections will rely on an accessible database into which SNAP staff can enter relevant information (100 percent electronic submission) as part of their normal work routines. The study will use computer-assisted telephone interviewing (CATI) for the client survey as an efficient alternative to conducting interviews and tracking responses on paper. Features such as programmed skip patterns on the survey instrument will reduce respondent burden and minimize questions asked in error.
A4. Efforts to Identify Duplication and Use of Similar Information
Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purpose described in item 2 above.
There is no similar data collection available, and every effort has been made to avoid duplication. FNS has reviewed USDA reporting requirements, State administrative agency reporting requirements, and special studies by other government and private agencies. FNS is the sole Federal agency that administers SNAP. The information required for this study is not currently reported to State agencies on a regular basis in a standardized form.
This is the first study of its kind. Most SNAP modernization changes are recent; thus, the information to be collected in this study does not exist elsewhere. FNS does not require most States (unless certain waivers are in place) to report any information related to the implementation of their modernization efforts; therefore, this data collection does not duplicate State efforts. FNS has not previously tried to isolate the impact of the eligibility interview mode, so there is currently no reliable information on whether the proposed no-interview model has an impact.
A5. Impacts on Small Businesses and Other Small Entities
If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.
FNS has determined that the requirements for this information collection do not adversely impact small businesses or other small entities. Although smaller States are involved in this data collection effort, they deliver the same program benefits and perform the same functions as any other State. Thus, they maintain the same kinds of information on file. FNS estimates that one percent of respondents are small entities.
A6. Consequences of Collecting the Information Less Frequently
Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.
This is a one-time data collection. If this study is not conducted, FNS could not evaluate the effectiveness of this program. The data collection plan described in this submission will help FNS understand the impact of eliminating the required client interview for SNAP certification and recertification. The study is designed to compare sites that eliminate the interview at certification and recertification with sites that use the State's current interviewing practices (or, in Utah, with randomly assigned clients who retain the interview). The study will evaluate whether and to what extent participation, efficiency, payment accuracy, administrative costs, client access, and staff and client satisfaction are affected by having or not having an interview. The results of this study will inform FNS about the contributions of client interviews to program operating efficiency and access. In the absence of these results, FNS will lack the means to assess the potential efficacy of State modernization changes related to interview requirements.
A7. Special Circumstances Relating to the Guideline of 5 CFR 1320.5
Explain any special circumstances that would cause an information collection to be conducted in a manner:
requiring respondents to report information to the agency more often than quarterly;
requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;
requiring respondents to submit more than an original and two copies of any document;
requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years;
in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;
requiring the use of a statistical data classification that has not been reviewed and approved by OMB;
that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or
requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.
There are no special circumstances. The study will conduct data collection in a manner consistent with 5 CFR 1320.5.
A8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency
If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments.
Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting form, and on the data elements to be recorded, disclosed, or reported.
a. Federal Register Notice and Comments
In accordance with 5 CFR 1320.8(d), a notice of the proposed information collection and an invitation for public comment were published in the Federal Register on November 29, 2011 (Volume 76, Number 229, pages 73584-73586). One public comment was received (Appendix L) and responded to (Appendix M).
b. Consultations Outside of the Agency
FNS consulted with the following individuals about the design, level of burden, and clarity of instructions for this collection:
Name | Title | Affiliation | Phone Number
Alana Landey | Acting Director, Division of Economic Support for Families | Office of Human Services Policy, Office of the Assistant Secretary for Planning and Evaluation, U.S. Department of Health and Human Services | 202-401-6636
Michael Jacobsen | Methods Branch | National Agricultural Statistics Service (NASS) | 202-690-8639
A9. Explanation of Any Payment or Gift to Respondents
Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.
As a token of appreciation and to increase the response rate without unduly influencing participation, FNS plans to offer a financial incentive to focus group participants, as is customary. During screening calls (Appendix F1), the SNAP procedural denial applicants recruited for the focus groups will be offered $30, disbursed upon completion of the discussion. All invited SNAP participants will be informed that these incentives will not affect the value of any potential SNAP benefits. Based on experience, this amount has been sufficient to encourage participation in focus groups conducted with similar populations. As a further incentive to increase participation, light refreshments will be provided. In addition, the study will provide transportation stipends to those who express a need for them.
The study will survey SNAP clients by telephone about their recent application or recertification experiences under the tested interview conditions to obtain their perspectives on the process. Clients will receive an advance letter (Appendix E) about the survey, including a $2 pre-interview cash incentive to increase awareness of and interest in the survey. After completing the survey, clients will also receive a $10 Visa gift card to a local store as a post-interview incentive to achieve the highest possible response rate. Clients will be informed that these incentives will not affect the value of any potential SNAP benefits. The telephone interview will last five to seven minutes.
A10. Assurance of Confidentiality Provided to Respondents
Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.
A system of records notice (SORN) titled FNS-8 USDA/FNS Studies and Reports, published in the Federal Register on April 25, 1991 (Volume 56, Number 80, pages 19078-19080), discusses the terms of protection that will be provided to respondents. Participants in this study will be subject to the assurances and safeguards provided by the Privacy Act of 1974 (5 U.S.C. 552a), which requires the safeguarding of individuals against invasion of privacy. The Privacy Act also provides for the private treatment of records maintained by a Federal agency and retrieved by the individual's name or some other identifier.
Individuals participating in this study will be notified that the information they provide will not be published in a form that identifies them. Our contractor, Mathematica, will make certain that all surveys are held securely and that in no instance will responses be made available except in tabular form. Under no condition will information be made available to SNAP program personnel. SNAP program staff responsible for assisting Mathematica in the recruitment of study participants will be fully informed of Mathematica’s policies and procedures regarding privacy of the data.
Additionally, Mathematica will comply with the following legislation as required:
E-Government Act of 2002 (P.L. 107-347, Title V, Subtitle A, “Confidential Information Protection”)
The Freedom of Information Act (5 U.S.C. 552)
USA Patriot Act of 2001 (P.L. 107-56)
Office of Management and Budget (OMB) Federal Statistical Confidentiality Order of 1997
Computer Security Act of 1987
Federal Information Security Management Act (FISMA)
OMB Circular A-130, Management of Federal Information Resources
Presidential Decision Directive 63, Critical Infrastructure Protection (CIP)
Presidential Decision Directive 67, Enduring Constitutional Government and Continuity of Government Operations
Homeland Security Presidential Directive 7
National Institute of Standards and Technology’s Guide for Developing Security Plans for Information Technology Systems (Special Publication 800-18)
U.S. Government “Plain Language” Guidelines
Mathematica will protect the privacy of all information collected for the evaluation and will use it for research purposes only. No information that identifies any study participant will be released. Further, personally identifiable data will not be entered into the analysis file and data records will contain a numeric identifier only. When reporting the results, data will be presented only in aggregate form so that individuals and institutions will not be identified. Mathematica will include a statement to this effect with all requests for data. Further, the study team will maintain no individually identifiable information beyond the duration of the study. All members of the study team having access to the data will be trained on the importance of privacy and data security. All data will be kept in secured locations and identifiers will be destroyed as soon as they are no longer required.
Mathematica will employ the following safeguards during the study:
All employees at Mathematica will sign a confidentiality pledge (Appendix K) that emphasizes the importance of privacy and describes their obligations.
Access to identifying information on sample members will be limited to those who have direct responsibility for providing and maintaining sample locating information. At the conclusion of the research, these data will be destroyed.
Identifying information will be maintained on separate forms and files, which are linked only by sample identification number.
Access to the file linking sample identification numbers with the respondents’ IDs and contact information will be limited to a small number of individuals who have a need to know this information.
Access to the hard-copy documents will be strictly limited. Documents will be stored in locked files and cabinets. Discarded materials will be shredded.
Computer data files will be protected with passwords and access will be limited to specific users. Especially sensitive data will be maintained on removable storage devices that are kept physically secure when not in use.
Employees will be required to notify their supervisors, the project director, and the Mathematica security officer if private information has been disclosed to an unauthorized person, used in an improper manner, or altered in an improper manner. The project director and Mathematica security officer, in consultation with FNS, will then determine the appropriate action to be taken based on the nature of the breach of privacy.
The study team will notify interview and focus group respondents that participation is voluntary and will not affect their benefits, and that their individual responses will be kept private and not disclosed to anyone apart from members of the research team, except as required by law. Mathematica has a long history of protecting the privacy of records and considers it a critical aspect of any study's scientific integrity and legality. During eligibility screening calls (Appendix G), staff will explain to potential focus group participants that the information requested from them is for research purposes only and that their identities will not be disclosed to anyone outside the evaluation project, including SNAP staff. Focus group moderators will also notify participants that their conversations in the focus groups will be audio-recorded, that their recorded comments will be heard only by members of the research team and retained only until transcribed, and that the transcription summaries will not reveal their identities.
A11. Justification for Sensitive Questions
Provide additional justification for any questions of a sensitive nature, such as sexual behavior or attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.
The interview questions for SNAP staff, CBOs, and vendors will relate primarily to program details and their opinions about effectiveness and will not be sensitive. Additionally, respondents to the client survey and members of the participant focus groups are not likely to view the discussion questions about their SNAP experiences as sensitive.
As described in Section A.10, the study team will notify all client survey respondents at the outset of their interview, and focus group participants during the screening call and again at the outset of the focus group, of the intent to maintain privacy. In addition, the study team will inform them that participation is voluntary and they need not answer any questions that make them uncomfortable. The study team will also inform them that there are no penalties if they decide not to respond, either to the information collection as a whole or to any particular question. All responses will be kept private and will not be reported to SNAP staff or any other program, agency, or organization, except as otherwise required by law. Rather, all the responses will be combined so that no individual is identifiable.
A12. Estimates of Hour Burden Including Annualized Hourly Costs
Provide estimates of the hour burden of the collection of information. The statement should:
Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I.
Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories.
For all interviews of State, district or county, and local office SNAP staff, and CBO staff, the average burden estimate across all types of staff is 2.06 hours, which includes the respondents' time to prepare for and complete the interview. A client survey pretest was conducted with a burden of 7 minutes per respondent. For client survey respondents, including the respondents' time to read an advance letter and complete the survey, the burden estimate is 0.1667 hours (10 minutes).3 For client survey refusers, the burden estimate is 0.0833 hours (5 minutes), including time to read the advance letter and field attempts to conduct the survey. For focus group participants, the burden estimate is 1.667 hours (100 minutes), which includes the respondents' time to be screened and recruited, receive a reminder call, read a reminder letter, and participate in the group. For those who decline to participate in the focus groups, the burden estimate is 0.0833 hours (5 minutes), including the respondents' time to be screened. Estimates for the interviews, record collection, and focus groups are based on Mathematica's prior corporate experience with similar populations; examples are In-Depth Case Studies of Advanced Supplemental Nutrition Assistance Program Modernization Initiatives and Evaluation of Elderly Nutrition Demonstrations, each conducted on behalf of FNS. The estimate of burden for the client survey is based on a pretest conducted in January 2012 with 9 SNAP clients.
All interviews with staff from State-, district-, county-, and local-level SNAP offices, as well as staff from CBOs, will be conducted once, about 13 months into the demonstration period. SNAP clients and procedural denials will be interviewed or will participate in a focus group only once. This sums to a total of 986 hours: State SNAP staff, 36 hours; district/county SNAP staff, 36 hours; local office SNAP staff, 120 hours; CBO staff, 18 hours; SNAP clients participating in the survey, 462 hours; and SNAP client survey nonresponders, 73 hours. The number of survey nonresponders is based on a starting sample of 3,648 clients, of whom 95 percent are expected to be eligible for the survey, with an expected 80 percent response rate among eligible clients. The burden for clients with procedural denials participating in the focus groups is estimated at 200 hours; for respondents who elect not to participate in the focus groups (refusers), the estimated total burden is 40 hours. The number of refusers is based on the assumption that, in order to have 120 respondents ultimately attend the focus groups, the study team will have to recruit 300 people; in order to recruit 300 people, the team will have to initially contact twice as many, or 600.
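To show the arithmetic behind these figures: 3,648 sampled clients x 0.95 (expected eligibility) x 0.80 (expected response rate) is approximately 2,772 completed client surveys, and the remaining 3,648 - 2,772 = 876 sampled clients are counted as survey nonresponders (which suggests the nonresponder count includes sampled clients later found ineligible); 600 initial focus group contacts - 120 participants = 480 refusers; and the 2.06-hour staff average equals total staff interview hours divided by staff respondents, (36 + 36 + 120 + 18) / (12 + 18 + 60 + 12) = 210 / 102, or about 2.06.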
Table A.12.1. Annual Burden Estimate
Affected Public | Respondent | Estimated # Respondents | Annual Responses per Respondent | Total Annual Responses | Estimated Avg. # of Hours per Response | Estimated Total Hours
State and Local Agencies a, g | State SNAP staff | 12 | 1 | 12 | 3 | 36
 | District/County SNAP staff | 18 | 1 | 18 | 2 | 36
 | Local office SNAP staff | 60 | 1 | 60 | 2 | 120
 | SUBTOTAL | 90 | --- | 90 | --- | 192
Not-for-Profit Organizations g | CBO staff b | 12 | 1 | 12 | 1.5 | 18
 | SUBTOTAL | 12 | --- | 12 | --- | 18
Individuals and Households | Active SNAP participants (Pretest of client survey) | 9 | 1 | 9 | 0.1167 | 1.05
 | Active SNAP participants (client survey) c | 2,772 | 1 | 2,772 | 0.1667 | 462.0
 | Active SNAP participants (client survey nonresponders) d | 876 | 1 | 876 | 0.0833 | 73
 | SNAP procedural denials e | 120 | 1 | 120 | 1.667 | 200
 | SNAP procedural denials (focus group nonresponders) f | 480 | 1 | 480 | 0.0833 | 40
 | SUBTOTAL | 4,257 | --- | 4,257 | --- | 776.07
Grand Total | | 4,359 | 1 | 4,359 | 0.2262 | 986.07
a 100 percent participation is expected from this group.
b CBO staff will be administered applicable items of the master interview protocol (see Appendix I for marked items).
c Client survey respondents will receive an advance letter before the interview.
d Client survey nonresponders will receive an advance letter before fielding a call attempting the interview.
e Focus group members will participate in a brief screening call or interview, participate in the focus group, and receive a reminder call and letter before the focus group.
f Focus group refusers will participate in a brief screening call or interview.
g Burden for interview is inclusive of time-use data collection protocol and site visits and interview protocol.
The total cost to respondents for their time in this collection is $10,255.54 (Table A.12.2). The annualized cost to State and local agencies was calculated using the mean hourly wage rate categories determined by the Bureau of Labor Statistics, May 2010, National Industry-Specific Occupational Employment and Wage Estimates. The annualized cost to not-for-profit organizations was calculated using the average hourly wage estimates published by the Bureau of Labor Statistics for the National Compensation Survey, April 2009. The annualized cost to SNAP participants and procedural denials was calculated using the Federal minimum wage as of July 24, 2009.
Table A.12.2. Annual Cost to Respondents
Respondent Type | Estimated # Respondents | Total Annual Responses | Estimated Avg. # of Hours per Response | Estimated Total Hours | Mean Hourly Wage Rate | Cost to Respondent
State SNAP staff | 12 | 12 | 3 | 36 | $23.89 a | $860.04
District/County SNAP staff | 18 | 18 | 2 | 36 | $22.12 b | $796.32
Local office SNAP staff | 60 | 60 | 2 | 120 | $22.12 b | $2,654.40
Not-for-Profit Organizations | 12 | 12 | 1.5 | 18 | $17.68 c | $318.24
Client Survey - Pretest | 9 | 9 | 0.1167 | 1.05 | $7.25 d | $7.61
Client Survey | 2,772 | 2,772 | 0.1667 | 462.0 | $7.25 d | $3,349.50
Client survey nonresponders | 876 | 876 | 0.0833 | 73.0 | $7.25 d | $529.25
Focus group | 120 | 120 | 1.667 | 200.0 | $7.25 d | $1,450.29
Focus group nonresponders | 480 | 480 | 0.0833 | 40.0 | $7.25 d | $289.88
Grand Total | | | | | | $10,255.54
a North American Industry Classification System (NAICS) 999200: State Government.
b NAICS 999300: Local Government.
c Wages in the Nonprofit Sector: Healthcare, Personal Care, and Social Service Occupations, National Compensation Survey
d Federal minimum wage as of July 24, 2009
A13. Estimate of the Total Annual Cost Burden to Respondents or Record-Keepers
Provide estimates of the total annual cost burden to respondents or record keepers resulting from the collection of information (do not include the cost of any hour burden shown in items 12 and 14). The cost estimates should be split into two components: (a) a total capital and start-up cost component annualized over its expected useful life; and (b) a total operation and maintenance and purchase of services component.
There are no capital/start-up or ongoing operation/maintenance costs associated with this information collection. The study will provide reporting tools to SNAP offices for the purpose of reporting performance and time-use data.
A14. Annualized Cost to the Federal Government
Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost and any other expense that would not have been incurred without this collection of information.
The annualized cost to the Federal government is $622,441. The total cost of this study includes a firm-fixed-price contract with Mathematica for $1,863,322, which covers the design of the study and development of data collection instruments, data collection, analysis, and report writing, plus time spent by the Federal project officer (GS-13, Step 10) to manage the data collection ($4,000).
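This figure is consistent with annualizing the total cost over a three-year period (an assumption inferred from the arithmetic): ($1,863,322 + $4,000) / 3 is approximately $622,441 per year.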
A15. Explanation for Program Changes or Adjustments
Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-1.
This is a new information collection request. This study is a program change that will add 986 hours to the OMB Inventory.
A16. Plans for Tabulation and Publication and Project Schedule
For collections of information whose results are planned to be published, outline plans for tabulation and publication.
This study will conduct three primary quantitative analyses: (1) a difference-in-differences analysis in the demonstration States (North Carolina and Oregon); (2) a post-comparison group analysis in those same States; and (3) a comparison of mean outcomes between the random assignment groups in Utah. A fourth, qualitative analysis will compare demonstration and comparison sites in the demonstration States.
Quantitative Analysis
The study team will use a difference-in-differences approach to estimate the impact of the demonstration on SNAP outcomes for which both pre- and post-implementation measures are available. Outcomes to be analyzed include the following:
Applications submitted
Method of application submission
Applicants requesting application assistance
Follow-up contacts made to applicants
Application approval rates
Application processing timeliness
Applicant information, including
Household size
Household composition
Sources of income
Deduction amounts
Average benefits
Clients participating in SNAP
Clients participating in multiple benefit programs
Client churning
SNAP administrative cost per case
Payment error
Staff workload
Client satisfaction with the application process
The difference-in-differences approach will help identify whether the no-interview demonstration is affecting these key outcomes in the demonstration sites. The study will tabulate impact estimates in the final report separately for North Carolina and Oregon. The analysis will examine overall impacts as well as impacts by demographic and economic characteristic subgroups; all analyses will include indications of associated statistical significance.
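In its simplest form, the difference-in-differences estimate for an outcome Y is Impact = (Y_demo,post - Y_demo,pre) - (Y_comp,post - Y_comp,pre), where Y_g,t denotes the mean outcome for group g (demonstration or comparison) in period t (before or after implementation). This is the textbook form of the estimator; the study's actual specification may be regression-adjusted with covariates. Subtracting the comparison group's pre-post change nets out trends common to both groups, so the remaining difference can be attributed to the demonstration under the assumption of parallel pre-implementation trends.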
For North Carolina and Oregon, the study team will use a post-comparison group design for outcome measures for which data are collected at only one time point. The team will estimate policy impacts on the number and percentage of applicants who contacted someone for assistance to complete the application process, levels of client satisfaction, and other findings from the client survey. The study team will tabulate differences in post-implementation outcomes by State and subgroup in the final report, with an indication of associated statistical significance. This analysis will not support causal inference, and caution will be exercised in interpreting the results.
Random assignment of the no-interview demonstration in Utah will ensure that, on average, the unobservable characteristics of the demonstration and control group members are similar. This allows inferences to be drawn about whether the demonstration, rather than unobserved factors, is causing differences in outcomes between these groups. The random assignment analysis will examine a broad set of outcomes, including the following:
Applications submitted
Method of application submission
Applicants requesting application assistance
Follow-up contacts made to applicants
Application approval rates
Application processing timeliness
Applicant information, including
Household size
Household composition
Sources of income
Mean amount of income reported (across all clients)
Deduction amounts
Average benefits
Clients participating in SNAP
Clients participating in multiple benefit programs
Client churning
SNAP administrative cost per case
Payment error
Staff workload
Client satisfaction with the application process
At a computational level, the basic analysis for the random assignment State will mirror that of the post-comparison group analysis. The study team will tabulate impact estimates in the final report and examine impacts by demographic and economic subgroups, with an indication of associated statistical significance.
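As a simple illustration, with random assignment the basic impact estimate for an outcome Y reduces to a difference in group means, Impact = Y_no-interview - Y_interview, and standard two-sample tests (for example, a t-test) can be used to gauge statistical significance; the study's actual estimates may be regression-adjusted.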
Qualitative Analysis
For process outcomes, the study will examine qualitative data from interviews with staff, focus groups with clients, and program observations. The study team will analyze all information assembled from these sources to answer key questions about the comparison between usual interview practices and the no-interview alternative. The purpose of the analysis is to describe how the demonstration was implemented and operated, and to explore the outcomes of the demonstration.
The study team will analyze qualitative site visit data through theme tables to identify similarities and differences across States and common lessons learned. It will develop tables that reflect the key research questions and will produce one table for each State. Tables will identify several research questions around a specific outcome and provide a summary of the responses from the various respondents and sites. Table cells will summarize conclusions about a given question for a desired level of staff or location. Looking across the tables for each State will allow the team to identify common themes and outliers.
Table A.16.1 presents the project schedule.
Table A.16.1. Project Schedule
Task | Date
Data Collection |
Administrative Data |
Receive first batch of case record extracts | 12/1/2012
Receive second batch of case record extracts | 11/1/2013
Receive tailored administrative data | 11/1/2013
Post-Demonstration Data |
Mail advance letter to client survey sample members | 8/1/2013
Conduct client survey (start) | 8/15/2013
Schedule second round site visits | 8/1/2013
Recruit focus group participants and send reminder letters | 8/1/2013
Conduct second round site visits (start) | 9/1/2013
Conduct focus groups (start) | 9/1/2013
Interim Data Analysis |
Submit final table shells | 12/27/2013
Complete data analysis | 1/17/2014
Interim Report |
Submit final interim report to FNS | 5/16/2014
Final Data Analysis |
Submit final table shells | 1/10/2014
Complete data analysis | 2/28/2014
Final Report |
Submit final report to FNS | 8/15/2014
Briefing |
Conduct briefing | 6/30/2014
Public Use Files |
Submit final file documentation, codebooks, and data files incorporating FNS comments | 8/22/2014
A17. Reason(s) Display of OMB Expiration Date Is Inappropriate
If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.
All forms completed as part of the data collection will display the expiration date for OMB approval.
A18. Exceptions to Certification for Paperwork Reduction Act Submissions
Explain each exception to the certification statement identified in Item 19 "Certification for Paperwork Reduction Act."
There are no exceptions to the certification statement.
1 Utah is still discussing options for managing the no-interview cases. Having a separate team is the ideal choice, but due to technological constraints, this option may not be possible. If Utah is not able to identify specific staff to manage the no-interview cases, it may not be possible to evaluate time-use for the demonstration.
2 This percentage will be calculated from all applicants regardless of whether an interview was ever scheduled.
3 Multiple attempts will be made to complete the survey, and follow-up appointments will be scheduled with respondents who are unavailable at the time of the initial telephone contact; however, only a single response to the survey will be required.