OMB Control Number: 0584-xxxx
Expiration Date: xx/xx/xxxx
Attachment E.5.a FNS Response to NASS Comments
Question
Please describe the respondent universe for the different studies more clearly. Perhaps repeat the standard definition of elderly to clearly define the population. For the Study of State Interventions, it will be difficult to make inferences to the other states since they were not selected using statistical methods.
Response
The “elderly” population in this study is defined as individuals age 60 or over.
Study of State Interventions: Respondents for this study are key stakeholders involved with SNAP administration at the State and local levels. To obtain a variety of perspectives, we will interview staff members at various levels (including administrators, supervisors, and front-line staff members) and those responsible for the design, initial implementation, and operations of each intervention. This primarily includes staff in the agencies that administer SNAP at the State and local levels.
To the extent we are able to garner their cooperation, we will also interview partner agency staff members in States with a Combined Application Project (CAP) (where we will interview Social Security Administration (SSA) administrators involved in planning discussions and front-line staff members involved in the application and eligibility determination process) and in States that conduct data verification across agencies (where we will interview partner agency staff members involved in planning discussions about data sharing or staff members integral to the operation of the data systems).
We will also interview representatives of non-governmental community-based organizations (CBOs) that may refer elderly individuals to SNAP or provide them with supplemental or alternative support, such as meals or food baskets.
The main purpose of the study of State interventions is to document how various interventions aimed at increasing SNAP elderly enrollment are implemented in the study States. As such, this component of the study is not meant to serve the purpose of making statistical inferences. Rather, it is intended as a way of deepening the understanding of how interventions were implemented, which will help in interpreting the findings from the study of intervention effects.
Study of Elderly Participant Perspectives: Respondents for this study fall into three categories of elderly individuals:
SNAP participants: Individuals age 60 and over who are currently enrolled in SNAP and receive benefits
Non-participating applicants: Eligible individuals age 60 and over who attempted to apply for SNAP but did not succeed, or eligible individuals age 60 and over who enrolled in SNAP after reaching age 60 but are no longer participating
Non-participants: Individuals age 60 and over who are eligible for SNAP but have not applied since reaching age 60
Study of Intervention Effects: The study of intervention effects is designed as an evaluation of natural experiments, in which we will attempt to isolate the impact of particular interventions aimed at increasing elderly enrollment in SNAP. In an ideal scenario, interventions would be assigned to States at random, and the impact of adoption could be calculated; in that case, findings from the study would be generalizable to all States. In reality, States adopt interventions in a non-random manner, so we believe there is no “universe” to which findings are directly generalizable. However, our study is still important because (1) it can show whether the adoption of interventions led to the intended effect in States that have already adopted policies, and (2) it can provide suggestive evidence that adopting some interventions in States that have not yet done so might lead to better program outcomes.
Question
For the Study of Elderly Participant Perspectives, please explain why a convenience sample will be used rather than a probability sample. The sample was drawn randomly initially but then only those that scheduled interviews first were actually used. There could be inherent characteristics in those that responded first that could influence the results of the study.
Response
We appreciate that the reviewer identified this problem in the original Supporting Statement (Part B); it is a valid point. In fact, our description of the sampling strategy contained an error (the strategy was not described that way in the Design Plan for the study). We have revised the Supporting Statement in the relevant places to clarify our strategy. In sections B1 and B3, we revised the text to clarify that the interviews will not use a straightforward convenience sample, because we plan to take steps to minimize sample bias.
We also added a footnote to support the rationale for having a small sample size in qualitative research in general. The text of the footnote is as follows:
Although we chose this approach to minimize the sample bias that could occur with a straightforward convenience sample, the sample sizes in each local area will nonetheless be too small to make general claims about representativeness. Small sample sizes are customary in qualitative research, and rigor in this case requires taking reasonable steps to reduce obvious sources of bias and being clear about the limitations of the sample and the analysis process. Because the goal of qualitative research is meaning rather than generalizability, the representativeness of the sample is a lower priority than it would be in quantitative research. Source: Mason, M. (2010). Sample size and saturation in PhD studies using qualitative interviews. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 11(3).
Question
Is there a targeted response rate? How will the missing values of variables in the standardized file be treated? What are the new variables constructed from the raw data and how will that affect the data analysis?
Response
It is unclear from the comment which study these questions pertain to. Based on the language used in the questions, we will assume that these questions pertain to the Study of Intervention Effects.1 With that caveat, we believe that the response rate for quantitative data requests associated with the Study of Intervention Effects will be close to 100%.
Missing values of variables in the standardized file will be coded as missing. Mathematica’s and SPR’s standard approach is not to impute data when conducting quantitative analysis.
Because we have not collected or reviewed the administrative data for these States on these interventions, we are not able to determine which constructed variables will be needed. However, on other projects that use data from multiple States, we often have to construct variables for race, education, income, benefit receipt, and others to standardize the data and ensure we are able to assess similar information.
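For illustration only, the following is a minimal sketch of the kind of standardization step described above, assuming hypothetical State extracts with differing raw race codes and a shared output layout. The file structure, column names, and code mappings are placeholders, not the actual data specification, and unrecognized or blank values are simply coded as missing rather than imputed.

    import pandas as pd

    # Hypothetical recode maps from each State's raw race codes to a common scheme.
    # Placeholder values; real mappings would come from each State's data dictionary.
    RACE_MAPS = {
        "state_a": {1: "White", 2: "Black", 3: "Other"},
        "state_b": {"W": "White", "B": "Black", "O": "Other"},
    }

    def standardize(raw: pd.DataFrame, state: str) -> pd.DataFrame:
        """Map one State's raw extract onto the standardized file layout."""
        out = pd.DataFrame(index=raw.index)
        out["state"] = state
        # Codes not found in the map (or blank) become NaN: coded as missing, not imputed.
        out["race"] = raw["race_raw"].map(RACE_MAPS[state])
        # Convert benefit amounts to numeric; non-numeric entries become NaN.
        out["benefit_amount"] = pd.to_numeric(raw["benefit_amt_raw"], errors="coerce")
        return out

    # Example with two small hypothetical extracts.
    a = standardize(pd.DataFrame({"race_raw": [1, 2, 9], "benefit_amt_raw": ["120", "", "80"]}), "state_a")
    b = standardize(pd.DataFrame({"race_raw": ["W", "O", None], "benefit_amt_raw": [200, 150, None]}), "state_b")
    combined = pd.concat([a, b], ignore_index=True)
    print(combined)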
1 If the question about response rates was in reference to the Study of Elderly Participant Perspectives, we added more text to the Supporting Statement (Part B) to clarify that we expect an attrition rate of 30 percent for the interviews and focus groups.