Part B: Collections of Information Employing Statistical Methods for the Supplemental Nutrition Assistance Program (SNAP) Employment & Training Study
October 27, 2014
Submitted to:
Office of Management and Budget
Submitted by:
Food and Nutrition Service
United States Department of Agriculture
CONTENTS
PART B: COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS 1
B.1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection. 1
1. Selecting registrants and participants from study states for survey data collection 1
2. Selecting providers from study states for survey data collection 3
3. Selecting focus group participants from study states 3
B.2. Describe the procedures for the collection of information including: 3
1. Data collection 4
2. Statistical methodology, estimation, and degree of accuracy 6
3. Unusual problems requiring specialized sampling procedures 7
4. Periodic data collection cycles to reduce burden 7
B.3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied. 7
1. Survey data collection 7
2. Focus group data collection 10
B.4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information. 10
1. Survey pre-test 10
2. Focus group pre-test 11
B.5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency. 11
References 12
The SNAP E&T study will collect survey data from SNAP work registrants (Instrument I.1), SNAP E&T participants (Instrument I.1), and SNAP E&T program providers (Instrument I.2). The study team will also conduct focus groups with SNAP E&T participants (Instruments I.3 and I.4). This section provides information on sampling procedures for the SNAP E&T study.
The study team has randomly selected a nationally representative sample of 25 states using a stratified probability proportional to size (PPS) sampling design. The study team also selected 5 additional states to serve as backups to replace any of the 25 states from the initial sample that decline or are unable to participate (no more than 30 states would be contacted to obtain the 25 selected). The study team used a composite size measure, based on state-level work registrants and SNAP E&T participant counts, as the measure of size (MOS) for sampling. This measure is a weighted sum of both the number of work registrants and the number of SNAP E&T participants in a state. Therefore, if a state has a large number of work registrants but few SNAP E&T participants, the chances of selection might be similar to a state with fewer work registrants but a large number of SNAP E&T participants. The study team carried out this sampling using a three-stage process:
Select “certainty” states. States with an MOS large enough that they would be included in the sample of 25 states with a probability of 1 are called “certainty” states. The study team selected all certainty states for the initial sample.
Select “second-level” certainty states. States with an MOS large enough to be included in a sample of 30 states with a probability of 1 (but not necessarily large enough to be included in a sample of 25 states with a probability of 1) are called “second-level” certainty states. The sample size of 30 reflects the initial sample plus 5 backup states. Therefore, all second-level certainty states were included in either the initial sample or as a backup state. The study team assigned these states to the sample or backup by using simple random sampling.
Select remaining states. In the final stage, PPS sampling was used to select the remaining smaller states for the sample. Before sampling, the study team sorted these states into implicit strata defined by the work components offered by each state. The study team grouped the various work activities into four broad categories: (1) basic education and upfront work training, (2) job training, (3) higher education training, and (4) unique activities. Although each state may provide several activities across the four categories, a state is assigned to only one category.1 The study team also considered including geographic region as a stratum variable but determined that states would be well represented geographically in the sample without the inclusion of the additional stratum.
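The composite size measure and the staged certainty/PPS selection described above can be sketched as follows. This is an illustrative sketch only: the 50/50 weighting of registrants and participants, the toy state counts, and the use of systematic PPS are assumptions for demonstration, not the study's actual parameters.

```python
import random

def composite_mos(registrants, participants, w_reg=0.5, w_part=0.5):
    """Weighted sum of work registrants and SNAP E&T participants.
    The relative weights here are illustrative, not the study's."""
    return w_reg * registrants + w_part * participants

def select_pps(states, n):
    """Select n states: certainty states first, then systematic PPS.

    states: list of (name, mos) tuples. A state is a certainty
    selection when its expected number of hits,
    slots_remaining * mos / total_mos, is at least 1."""
    selected = []
    remaining = list(states)
    while True:  # removing a certainty state can promote others
        total = sum(m for _, m in remaining)
        slots = n - len(selected)
        certain = [(s, m) for s, m in remaining if slots * m / total >= 1]
        if not certain:
            break
        selected.extend(s for s, _ in certain)
        remaining = [sm for sm in remaining if sm not in certain]
    k = n - len(selected)
    if k == 0:
        return selected
    # Systematic PPS for the remaining (non-certainty) states.
    total = sum(m for _, m in remaining)
    interval = total / k
    start = random.uniform(0, interval)
    points = [start + j * interval for j in range(k)]
    cum, i = 0.0, 0
    for s, m in remaining:
        cum += m
        while i < k and points[i] <= cum:
            selected.append(s)
            i += 1
    return selected
```

Because certainty states are removed before the systematic pass, no remaining state's size exceeds the selection interval, so no state can be drawn twice.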
Using state data from Form FNS-583 (OMB Control Number 0584-0339, Expiration date 7/31/2017) for fiscal year (FY) 2013 from FNS, the study team created the composite size measure and work component stratification variable. The study team used the quarterly and annual counts of work registrants and SNAP E&T participants that states submit to FNS each year. Given the possibility that counts could vary across quarters as a result of seasonal patterns, the yearly count may be a more accurate representation of work registrants and SNAP E&T participants for each state. Therefore, the study ultimately used only the annual counts of work registrants and SNAP E&T participants from each state to construct the composite size measure.
The state sampling frame included 49 states and the District of Columbia (hereafter, referred to as “states”). The study team excluded Rhode Island from the sample because state officials did not provide FNS with complete data for FY 2013.
Each of the 25 selected states will be asked to provide administrative data for work registrants and SNAP E&T participants over a specific period. The study team will use the data to create the sampling frame for the SNAP E&T registrant and participant survey. The study team expects that about 95 percent of the individuals in the sample frame will be eligible and will aim to have 80 percent of those eligible complete the survey. To obtain the desired number of completed surveys (3,000 completed, with 1,500 from each group), the study team will select 1,974 individuals for each group, for a total sample of 3,948. Before selection, work registrants and SNAP E&T participants within states will be sorted to ensure adequate distribution of several factors, including age, gender, zip code, and household type. The study team will sample work registrants and SNAP E&T participants by using sequential random selection, with samples released in phases until the desired number of completed surveys is obtained. The initial release of sample will include only enough individuals to obtain the desired number of completed surveys under the most optimistic assumptions about eligibility and cooperation. After about three weeks of data collection, the study team will review the progress and determine whether additional sample releases should be made and how many individuals should be included in them. Each release of sample will be a random subsample of the full sample.
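The sample sizes above follow directly from the stated rates, assuming the 95 percent eligibility rate and 80 percent response rate apply independently. A quick arithmetic check:

```python
import math

def required_sample(target_completes, eligibility_rate, response_rate):
    """Sample to release so the expected number of eligible completes
    reaches the target, rounding up to a whole person."""
    return math.ceil(target_completes / (eligibility_rate * response_rate))

# 1,500 completes per group at 95% eligible x 80% responding
per_group = required_sample(1500, 0.95, 0.80)  # 1,974 per group
total = 2 * per_group                          # 3,948 overall
```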
The proposed sample sizes will yield precise estimates for work registrants and SNAP E&T participants. The precision of those estimates will be a function of the sample size and the expected design effects.
The study team will collect a current list of SNAP E&T program providers from each state. The list will include contact information to be used to create the sample frame for the providers. The study team estimates a 95 percent eligibility rate and an 80 percent response rate for providers. Thus, a sample size of 658 providers will supply the desired 500 completed interviews. Selection of the sample of providers will not be independent of the participant sample, but rather will be limited (to the extent possible) to locations where sampled SNAP E&T participants reside. The study team will first identify providers serving areas covered by individuals in the participant sample. Then, the study team will randomly select an equal number of providers across the 25 states for the provider sample.
From the 25 states included in the study, the study team will select 5 states, in consultation with FNS, in which to target SNAP E&T participants for focus group discussions. Selection criteria—including voluntary or mandatory status, geographic region, size of the SNAP E&T program, long-term and short-term unemployed, and the SNAP E&T program services and components offered—will help to ensure that the focus group participants represent variation in key characteristics. Once the 5 participating states have been selected, the study team will use provider lists to identify three locations in each state for focus groups. The goals when selecting these sites will be to have a mix of urban, suburban, and rural locations where SNAP E&T program training is provided, as well as to capture different key features of SNAP E&T program service provider characteristics with each focus group. The study team will also identify SNAP E&T locations where there are large concentrations of Spanish-speakers, and will conduct some Spanish language focus groups in these areas. The study team will work with community-based organizations that work with local SNAP offices to help identify a place to conduct the focus groups (for example, a church or community center).
A random sample of focus group participants will be drawn from the administrative data provided by the five selected states; this sample will be separate from the sample drawn for the registrant and participant survey. The study team will identify a sample large enough to account for individuals whose contact information is no longer accurate. Participants will be selected to represent a broad range of demographic characteristics (age, gender, race, education level, urban or rural location, able-bodied adults without dependents, and so on) and type of group. The study team will design a screening questionnaire for use in recruiting subjects and will assign bilingual recruiters to contact households where Spanish is likely to be the primary language (Attachment A.11). This screening will allow organization of subgroups of individuals with diverse characteristics. The study team anticipates, on average, about a four-to-one ratio of contacts to participants. Thus, a total sample of 600 SNAP E&T participants will be used to target about 120 to 150 focus group participants. Also, based on previous experience with similar populations, the study team plans to over-recruit for the focus groups by about 20 percent to account for no-shows.
Statistical methodology for stratification and sample selection,
Estimation procedure,
Degree of accuracy needed for the purpose described in the justification,
Unusual problems requiring specialized sampling procedures, and
Any use of periodic (less frequent than annual) data collection cycles to reduce burden.
The SNAP E&T Study will collect administrative data to select states and sample survey and focus group participants. From this sample, the study will field surveys of work registrants, SNAP E&T participants, and SNAP E&T program providers in 25 states. Focus groups will provide more in-depth information in 5 states. This section provides more information on data collection procedures.
The study team will collect administrative data from FNS and from states for the purpose of sampling states, work registrants, and SNAP E&T participants. Using these data, the study team will select states, create the sampling frames for surveys, and obtain respondent contact information. Administrative data will come from two sources: (1) federally collected SNAP E&T program data and (2) state-level caseload files.
Once the necessary data have been obtained from states, the study team will select nationally representative samples of SNAP work registrants and SNAP E&T participants, along with a sample of SNAP E&T program providers. The study team will select samples large enough to obtain completed surveys with 1,500 work registrants, 1,500 SNAP E&T participants, and 500 SNAP E&T program providers. The study team will administer two surveys: (1) a single instrument for the SNAP E&T registrant and participant survey, with specialized pathways for each, and (2) a survey for SNAP E&T program providers.
SNAP E&T program data collection will use a variety of integrated methodologies and instruments to gather the information required from respondents. The study team will develop web and computer-assisted telephone interviewing (CATI) instruments for the registrant and participant survey. For the SNAP E&T program provider survey, respondents will have the option of completing either a web-based survey or a telephone survey with an interviewer.
SNAP E&T registrant and participant survey. The study team will release the sample in three waves, each a week apart. The study team will conduct the data collection for each state over a 10-week period. The study team will target an 80 percent response rate for the SNAP E&T registrant and participant survey.
The study team will send advance letters (Attachment A.1) to sample members reminding them of the importance of the SNAP E&T program. The advance letter will be printed on USDA letterhead and signed by a high-ranking FNS official. Letters will be written in both English and Spanish, at a fifth-grade level to ensure comprehension. The letter will include an address for the web-based survey and a toll-free number to call to ask questions or complete the survey. The letter also will describe the incentive. Letters will be sent first-class with address service, as that will enable us to update incorrect addresses.2
The study team anticipates that some sample members will speak Spanish. The study team will develop bilingual study materials and offer to conduct interviews in Spanish to minimize nonresponses due to language barriers. Using information from administrative records on primary language or language of application, the study team will attempt to identify Spanish-speaking households before contacting them for CATI interviews, so that the initial contact will be from a bilingual interviewer. The study team will also evaluate the need for interviews in other languages and, in consultation with FNS, will determine whether to provide additional language service using study team staff (which can provide ad hoc translation for many Asian and European languages) or our language service partners, Interviewing Services of America or Magnus Language Valet.
High response rates are correlated with high-quality data. The study team has had success administering CATI surveys to SNAP and other low-income populations, but CATI response rates, in general, have been declining for many years. To achieve a high response rate, multiple survey modes should be offered (De Leeuw et al. 2007). To maximize response to the SNAP E&T registrant and participant survey, the study team will give sampled work registrants and participants two options for responding: (1) CATI and (2) web. Web surveys have been less effective with low-income populations because of the group’s relatively low rates of Internet access. However, web access has expanded in recent years—at home, at work, through smart phones, and in public places such as libraries (Zickuhr 2012). SNAP E&T participants are also likely to be familiar with computers because some SNAP E&T work programs use computers exclusively to administer training. The CATI questionnaire will be programmed to match the web-based questionnaire in order to minimize mode effects (differences in responses depending on whether the survey is conducted via the Internet or telephone). The study will use universal survey design practices (Matulewicz and Coburn 2008) so that respondents with differing reading levels and cognitive abilities can complete the web-based questionnaire without the assistance that telephone interviewers offer. The study team expects the web-based and CATI surveys to take about the same amount of time to complete (20 minutes).
Provider survey. The provider survey will follow a data collection schedule similar to the registrant and participant survey. The study team anticipates an 80 percent response rate for the SNAP E&T provider survey. As with the registrant and participant survey, the study team will send advance letters to sample members reminding them of the importance of the SNAP E&T program. The letters will be printed on USDA letterhead, signed by a high-ranking FNS official, and will include an address for the web-based survey, a completion deadline, and a toll-free number to call to ask questions or complete the survey (see Attachment A.8). The primary completion mode for the provider survey will be the web. However, in order to provide maximum flexibility to respondents, the study team will also provide respondents with the option to complete the survey over the phone. We anticipate that 90 percent of provider respondents will complete the survey via the web and 10 percent will complete it over the phone.
The study team will use a screening questionnaire for recruitment of subjects and will assign trained bilingual recruiters to contact households where Spanish is likely to be the primary language. This screening will allow organization of subgroups of individuals with diverse characteristics (Attachment A.11). The study team anticipates, on average, about a five-to-one ratio of contacts to participants, and plans to over-recruit for the focus groups by about 20 percent to account for no-shows.
The study team has extensive experience conducting focus groups with individuals similar to those who participate in the SNAP E&T program, and relies upon a set of well-established, field-tested, standardized procedures. Every focus group will be conducted by two staff members who are highly skilled data collectors. Each session will last approximately 90 minutes. One person will moderate the focus group while the other takes notes on a laptop. At the beginning of the focus group session, the study team will provide an overview of the purpose of the study and of the focus group. Participants will be given letters regarding informed consent (Attachment A.14), advising them of their rights as subjects—including their right to discontinue participation at any time or refuse to answer any question. Participants will be asked to sign a release agreeing to the digital recording of the discussion and to the use of the recording for research purposes, as long as respondents are not identified by name.
When the focus group session is complete, the trained moderator and note taker will debrief, review the notes for major themes, and ensure that all research questions were answered. Topical areas found to be lacking in coverage will be emphasized during the next focus group and question probes will be tailored accordingly.
All data will be transferred to a password-protected laptop. At the end of each session, the recordings will be sent to a data transcription service, Landmark Associates, via a secure file transfer protocol (FTP) site. All submitted media files and completed transcripts will be stored and maintained on a 256-bit encrypted secure server. The files will be available only when password and user identification are validated.
Following completion of all interviews and focus groups, the study team will use the NVivo 10 software program to analyze the qualitative data collected. Researchers will import a verbatim transcript of each group into NVivo and will code the data using a standard coding scheme that will be honed over multiple coding cycles. A comprehensive list of nodes will be included in the coding scheme in order to identify and capture salient themes across groups. Attributed codes will be assigned to each transcript, allowing the research team to analyze the results across different types of groups.
Separate sets of weights will be constructed for each of the target groups being sampled for the surveys: (1) work registrants, (2) SNAP E&T participants, and (3) SNAP E&T program providers. Weights are needed to account for differential sampling probabilities arising from the PPS design, for nonresponse (which will likely occur during data collection), and for any potential undercoverage in the sampling frames.
Construction of the analysis weights for each of the target groups will involve four main steps. First, probabilities of selection will be calculated for the responding units in the analysis samples. The inverses of the probabilities of selection will serve as the initial sampling weights. Second, nonresponse weighting adjustments will be made to the sampling weights to correct for different response rates among different groups. Nonresponse weighting adjustment factors will be developed during an extensive nonresponse bias analysis in which variables associated with nonresponse propensities will be determined using chi-squared automatic interaction detector (CHAID) procedures (Kass 1980). Variables identified during the CHAID procedures will be used in logistic regression models to estimate response propensities. The propensity scores for respondents will then be used for the nonresponse adjustments to the sampling weights. Third, post-stratification to known population totals or ratio adjustments to estimated totals will be made to the weights to correct for any undercoverage in the sampling frames. Population totals, including those based on gender and race and ethnicity, will be considered. As a fourth and final step, we will identify and adjust for any extreme outlier weights that may result from the first three steps of the weight construction. Weights that are extremely large relative to all other weights can increase design effects and give certain cases undue influence on estimates based on the analysis samples. We will determine a systematic protocol for trimming extreme weights once the initial weights are constructed (based on Potter 1990).
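The four weighting steps above can be sketched numerically. This simplified illustration substitutes a weighting-cell nonresponse adjustment for the full CHAID/logistic-propensity modeling and uses a simple cap-and-rescale rule in place of a formal trimming protocol; all counts and parameters are hypothetical.

```python
from collections import defaultdict

def build_weights(cases, pop_total, trim_multiple=3.0):
    """cases: dicts with selection probability 'p', adjustment 'cell',
    and a 'responded' flag. Returns respondents with final weight 'w'."""
    # Step 1: base weight = inverse of the probability of selection.
    for c in cases:
        c["w"] = 1.0 / c["p"]
    # Step 2: nonresponse adjustment -- respondents in a cell absorb
    # the weight of that cell's nonrespondents (stand-in for the
    # CHAID/propensity-score adjustment).
    cell_all, cell_resp = defaultdict(float), defaultdict(float)
    for c in cases:
        cell_all[c["cell"]] += c["w"]
        if c["responded"]:
            cell_resp[c["cell"]] += c["w"]
    resp = [c for c in cases if c["responded"]]
    for c in resp:
        c["w"] *= cell_all[c["cell"]] / cell_resp[c["cell"]]
    # Step 3: post-stratify so weights sum to a known population total.
    factor = pop_total / sum(c["w"] for c in resp)
    for c in resp:
        c["w"] *= factor
    # Step 4: cap extreme weights, then re-ratio to the population total.
    mean_w = sum(c["w"] for c in resp) / len(resp)
    cap = trim_multiple * mean_w
    for c in resp:
        c["w"] = min(c["w"], cap)
    factor = pop_total / sum(c["w"] for c in resp)
    for c in resp:
        c["w"] *= factor
    return resp
```

After all four steps, the respondent weights sum to the post-stratification control total, which is the basic check applied to any weighting run.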
The study has no unusual problems requiring specialized sampling procedures.
The study has only one cycle of data collection.
The SNAP E&T study will employ a variety of methods to maximize response rates and deal with nonresponses. This section describes these methods for the survey and focus group data collection efforts.
The study team will target an 80 percent response rate for the SNAP E&T registrant and participant survey and the SNAP E&T program provider survey. To obtain 1,500 completed SNAP E&T registrant and participant surveys, a sample size of 1,974 for each group (for a total of 3,948) will be used. The study team will create a sample of 658 SNAP E&T program providers to target 500 completed provider surveys.
The SNAP E&T study will provide work registrants and SNAP E&T participants two options for responding to the survey: (1) CATI and (2) web-based. Respondents can call a toll-free number to complete the survey over the phone. Interviewers available through the toll-free number will also be available to provide assistance or answer any questions respondents may have. The CATI questionnaire will be programmed to match the web-based questionnaire, which will use language that is accessible to respondents with varying reading levels and cognitive abilities. In addition, all instruments will be available in English and Spanish. For CATI surveys, bilingual interviewers can conduct the interview in Spanish and possibly other languages, if deemed necessary.
For the SNAP E&T registrant and participant survey, the study team will offer sample members a differential incentive to encourage web-based completion or call-in completion (versus the incentive offered to those sample members to whom the study team initiates a call). The study team will provide $40 to the respondent as a token of our appreciation after the respondent completes the interview in one of these less-expensive modes (initiates the call or completes a web-based survey) and $20 if the study team initiates the call. In addition, if it is necessary to increase the response rate, in week 7 of the survey, the study team will identify nonrespondents and mail each a letter with $5 as a “pre-incentive” to visit the website or call us to complete the survey (Attachment A.4).
The study plan for successfully locating respondents and convincing them to participate combines intensive locating efforts to identify missing contact information with proven methods for gaining cooperation. Addresses and telephone numbers will be missing for many sample members or might be incorrect; that is, their contact information has changed since it was last provided to the state or is otherwise incorrect in program records (as a result, for example, of an expected normal amount of data entry error). When records lack correct information, the study team will, during sample selection, send records to two vendors for updates: (1) Accurint, a division of LexisNexis, which uses a comprehensive database compiled from sources such as credit bureaus, motor vehicle administrations, and voter registration lists; and (2) MelissaData, which uses National Change of Address (NCOA) data. NCOA is a registry, maintained by the U.S. Postal Service, of people who move or change address in the United States. If neither vendor provides updated information on a case, the study team will refer it to Mathematica’s locating department, which is highly skilled at searching specialized online databases, social media, and other sources for contact information. The study team will also obtain updated address information from advance letters returned by the postal service.
To further reinforce study legitimacy among work registrants and SNAP E&T participants, the study team will create a one-page flyer (Attachment A.6) that describes the study’s purpose and the evaluator’s role in conducting a brief survey sponsored by FNS. The study team will ask state administrators to email flyers to local SNAP offices and SNAP E&T program providers so they can be distributed to office staff and posted in areas where they will be visible to visiting SNAP work registrants and SNAP E&T participants.
During the field period, a series of reminder letters (Attachments A.3 and A.5) and a postcard (Attachment A.2) will be sent every two weeks to all sample members who are presumed to be eligible and have not responded. The reminder materials will briefly describe the purpose of the survey, state that the study team has been trying to reach the sample member by telephone, highlight the incentive, and provide the study’s toll-free number to call and complete the survey or obtain web log-in information. Table B.3 summarizes the approximate schedule for survey data collection activities for the SNAP E&T registrant and participant survey, by release. The exact timing of each activity may vary depending on the analysis of paradata during data collection.
Table B.3. Summary of SNAP E&T registrant and participant survey data collection activities, by release
| | Mail advance letter | Begin outbound calls | Mail reminder postcard | Mail 1st reminder letter | Mail pre-incentive letter | Mail 2nd reminder letter in colored envelope | Complete data collection |
| Release 1 | Week 1 | Week 2 | Week 4 | Week 6 | Week 7 | Week 8 | Week 10 |
| Release 2 | Week 2 | Week 3 | Week 5 | Week 7 | Week 8 | Week 9 | Week 11 |
| Release 3 | Week 3 | Week 4 | Week 6 | Week 8 | Week 9 | Week 10 | Week 12 |
In addition to the biweekly reminders, if the study team is not on target to reach an 80 percent response rate by week 7, the study team will send nonrespondents an additional “Time is Running Out” letter (Attachment A.4). The letter will include $5 as a token of our appreciation and will encourage nonrespondents to visit the website or call to complete the survey. The study team will send these letters to all nonrespondents, which the study team anticipates will be no more than 30 percent of the sample. Finally, instead of sending the traditional reminder letter in week 8 of the survey, the study team will send a “Last Chance” letter (Attachment A.5). This final letter will be sent in an oversized, colored envelope to draw additional attention to the letter and increase the likelihood that it will be opened.
The study team has developed and refined methods to build rapport and overcome the reluctance of sample members to participate in the interview. The study team’s approaches focus on preventing and converting refusals. The strategies aim to convince sample members that (1) the study is legitimate and worthwhile, (2) their participation is important and appreciated, and (3) the information provided will be held confidential and will not affect their job or their eligibility for SNAP or other benefits. The study team interviewers are well prepared to address common respondent concerns in these areas. Telephone refusals are flagged in the CATI scheduler and, one week after a refusal, the study team will send a letter to the sample member emphasizing the importance of the study and addressing typical concerns. Interviewers skilled at converting refusals will then contact the sample member by telephone.
The study team will select data collectors who have solid experience and performance in comparable topic areas, and who have demonstrated (1) reliability, (2) communication skills, (3) accurate reading and recording proficiency, and (4) aptitude for the administrative and reporting requirements of survey work. The study team will use certified bilingual interviewers to complete interviews in Spanish. To develop the skills necessary to encourage participation among low-income households, all telephone interviewers receive general interviewer training before being assigned to a study. This training involves essential interviewing skills, probing, establishing rapport, avoiding refusals, eliminating bias, and being sensitive to at-risk and special populations. In addition, all interviewers will receive project-specific training that reviews study goals, provides a question-by-question review of the instrument, and conveys best practices for interviewing for the specific study. Interviewers will also conduct mock interviews prior to conducting real interviews. Data collector training sessions will take place no earlier than two weeks before data collection begins.
The study team will develop a comprehensive data collection training manual and plan. Topics will include (1) an overview of the study; (2) a review of general interviewing techniques, such as contacting, scheduling, confirmation methods, refusal avoidance, refusal conversion, and confidentiality issues; (3) frequently asked questions and approved responses; (4) a glossary of terms; (5) procedures and contacts for addressing problems; and (6) strategies for maximizing valid responses and obtaining high response rates. All interviewing staff will receive these materials.
The study team will also target an 80 percent response rate for the SNAP E&T program provider survey. To obtain 500 completed interviews, the study team will start with a sample size of 658 providers. The study team will send each of the providers in the sample an email containing password-embedded links to the web-based survey (Attachment A.8). Emails will highlight the importance and ease of the survey and provide a toll-free number for questions or concerns. The study team will send a reminder email (Attachment A.9) up to four times and a postcard (Attachment A.10) to encourage survey completion. As previously stated, the exact timing of each reminder may vary depending upon the analysis of paradata during data collection. As is customary, provider respondents will not be offered an incentive.
Once individuals have been recruited to participate in focus groups, the study team will prepare and send a personally addressed letter to each individual who agreed to participate (Attachment A.12), confirming the time, date, and place of the focus group. The materials will also note the $40 incentive, offered as a token of appreciation after the focus group, and the additional $10 for participants who arrive 15 minutes early. In addition, the day before the focus group, the study team will call each participant to remind him or her of the time and location (Attachment A.13). After each focus group has been conducted, a member of the study team will send a thank-you note to each participant.
The SNAP E&T study will pre-test the survey instruments with SNAP work registrants, SNAP E&T participants, and SNAP E&T program providers. It will also pre-test the focus group discussion guide in English and Spanish.
The study team followed a careful review and pre-testing process established for developing all data collection instruments. Trained interviewers, including a certified bilingual interviewer, conducted a pre-test of the combined work registrant and participant instrument in August 2014. Two Spanish-speaking work registrants were included in the pre-test. Trained interviewers also pre-tested the SNAP E&T program provider data collection instrument. At the end of the pre-test, the study team conducted a debriefing interview with each respondent to discuss (1) the flow of introductory and transitional text, (2) the clarity of questions and response options, (3) question sensitivity, (4) respondent burden, and (5) survey length.
Using the pre-test results, the study team prepared a memo identifying issues requiring discussion with FNS and recommending changes to the instruments. The final instruments, revised based on the pre-test results, are included as attachments.
The focus group discussion guide was developed in English and translated into Spanish. The study team conducted focus groups to pre-test the guide with English and Spanish respondents. After the pre-tests, the study team assessed which questions worked well and which needed adjustment. In consultation with FNS, the study team revised the instrument based on pre-test findings. The final instrument is included as an attachment.
Gretchen Rowe, Project Director; Senior Researcher, Mathematica Policy Research; (202) 484-4221
Stephanie Boraas, Survey Director; Survey Researcher, Mathematica Policy Research; (202) 484-3292
Nick Beyler, Statistician; Mathematica Policy Research; (202) 250-3539
John Hall, Statistician; Senior Fellow, Mathematica Policy Research; (609) 275-2357
Bonnie Harvey, Statistical Assistant; Mathematica Policy Research; (202) 552-6409
Frank Potter, Statistician; Senior Fellow, Mathematica Policy Research; (609) 945-6573
Vivian Gabor, Principal; Gabor & Associates Consulting; (202) 841-3071
Alicia Koné, Principal; Koné Consulting; (425) 275-2895
Claire Wilson, Focus Group Task Leader; Executive Director of Research, Insight Policy Research; (703) 504-9484
Brian Estes, Senior Researcher; Insight Policy Research; (703) 504-9492
Leslie Smith, Methodology Division; United States Department of Agriculture, National Agricultural Statistics Service
1 The study team developed the groups to ensure that states providing less common types of activities, such as on-the-job training or higher education, were included. All states provided some type of basic education or upfront training (category 1). Therefore, the study team will first assign any states reporting participation in activities under category 2 to that group, then category 3, then category 4. The study team will assign any remaining states to category 1. The study team will not base the assignments on the proportion of participants in each of the categories, but rather on the existence of participation.
2 Address service means that the U.S. Postal Service will return mail to the sender if the mail contains a wrong address. If a forwarding address for the recipient is available, the new address will be attached to the letter and the letter will be mailed to that address.