
OMB Number: 0584-XXXX

Expiration Date: XX/XX/XXXX


Appendix C. Pretest Methods and Summary of Findings

OMB No. 0584-XXXX

Modernizing Channels of Communication
With SNAP Participants

March 2, 2020

Project Officer: Andrew Burns

Office of Policy Support

Food and Nutrition Service

U.S. Department of Agriculture

1320 Braddock Place

Alexandria, VA  22314

703.305.1091

Andrew.Burns@usda.gov



Modernizing Channels of Communication With SNAP Participants: Draft Pretest Methods and Summary of Findings



On behalf of the U.S. Department of Agriculture (USDA), Food and Nutrition Service (FNS), its research team, Insight Policy Research (Insight), pretested the following data collection instruments for the Modernizing Channels of Communication With SNAP Participants study:

  1. Introductory Telephone Call With State MCS Staff and Administrators Protocol

  2. State MCS Staff and Administrators Interview Protocol

  3. Software Developers Interview Protocol

  4. Local Office Frontline Staff Group Interview Protocol

  5. Other Stakeholders or Community Partners Interview Protocol

  6. SNAP Participants Focus Group Demographic Questionnaire

  7. SNAP Participants Focus Group Protocol

  8. SNAP Office Waiting Room Questionnaire

The primary objective of the pretest was to ensure the instruments would be clear and easy for respondents to understand. The pretest objectives were as follows:

  • Identify problems related to communicating the intent or meaning of questions.

  • Determine whether respondents could provide the information requested.

  • Identify problems with introductions, instructions, or explanations.

  • Assess the time needed to complete the questionnaire and other respondent burden issues.

  1. Recruitment and Data Collection Methods

In March 2019, with the help of FNS Northeast Regional Office staff, the FNS research team sent a one-page overview of the study via email to New York State’s (NYS) Supplemental Nutrition Assistance Program (SNAP) director, requesting the State’s participation in the pretest. The research team followed up with the NYS SNAP director via email to provide additional information and then by telephone to request New York City’s (NYC) participation.

After securing permission from NYS, we scheduled an interview to pretest the Introductory Telephone Call With State MCS Staff and Administrators Protocol with two of NYC’s leads for mobile communication strategies (MCS). During the discussion with the MCS leads, the FNS research team also explained the need to pretest six additional interview protocols: one with State MCS staff and administrators, a second with software developers, a third with local office frontline staff, a fourth with other stakeholders or community partners involved with MCS, and two protocols with SNAP participants. SNAP participants who are involved in the pretest of the focus group and waiting room questionnaire will also pretest the focus group screener.

After conducting the introductory protocol with the State MCS staff and administrators, the research team was directed to speak with the NYS Office of Research and Evaluation. During a subsequent telephone call, the research team learned that further NYS approval was needed to continue pretesting the protocols. At that time, the FNS research team began working with the Maryland State SNAP department to complete the pretest on an expedited schedule. All interviewees signed a form indicating their consent to participate in the pretest.

Table X. Summary of Pretest Participants


| # | Participant Title | State | Instrument(s) Tested |
| --- | --- | --- | --- |
| 1 | Deputy commissioner for the NYC Human Resources Administration | New York | Introductory Telephone Call With State MCS Staff and Administrators Protocol; State MCS Staff and Administrators Interview Protocol |
| 2 | Director of Business Process Innovation | New York | Introductory Telephone Call With State MCS Staff and Administrators Protocol |
| 3 | Director of the Bureau of Systems Modernization and Information Analysis | Maryland | Software Developers Interview Protocol |
| 4 | Former SNAP Local Office Frontline Worker 1 | Maryland | Local Office Frontline Staff Group Interview Protocol |
| 5 | Former SNAP Local Office Frontline Worker 2 | Maryland | Local Office Frontline Staff Group Interview Protocol |
| 6 | Case manager at non-profit organization that partners with State SNAP office | Maryland | Other Stakeholders or Community Partners Interview Protocol |
| 7 | SNAP Participant 1 | Maryland | SNAP Participants Focus Group Protocol; SNAP Office Waiting Room Questionnaire; SNAP Participants Focus Group Demographic Questionnaire |
| 8 | SNAP Participant 2 | Maryland | SNAP Participants Focus Group Protocol; SNAP Office Waiting Room Questionnaire; SNAP Participants Focus Group Demographic Questionnaire |
| 9 | SNAP Participant 3 | Maryland | SNAP Participants Focus Group Protocol |



  2. Findings

In NYC, SNAP-related MCS are administered at the city level. For its SNAP participants, NYC implemented a website, the NYC ACCESS HRA mobile application (app), and text messaging services. For the purposes of the pretest, the FNS research team conducted interviews with NYC SNAP staff who were involved in the development and improvement of the website, mobile app, and text messaging platforms. In contrast, in Maryland, SNAP-related MCS are administered at the State level, and SNAP beneficiaries have access to the MyDHR (Maryland Department of Human Resources) mobile-optimized website as well as third-party apps (e.g., the FreshEBT mobile app). Table 1 provides a summary of the findings.

Table 1. Summary of Pretest Findings

| Instrument | Summary of Findings | Resulting Changes to Instrument |
| --- | --- | --- |
| Introductory Telephone Call With State MCS Staff and Administrators Protocol | Duration: 41 minutes (anticipated 60 minutes); further clarification needed on research team’s approach to compiling State-specific understanding of MCS; one probe was deemed unnecessary | Adjusted introductory text; deleted one probe |
| State MCS Staff and Administrators Interview Protocol | Duration: 75 minutes (anticipated 90 minutes); further clarification needed on definition of MCS; further clarification needed on research team’s approach to compiling State-specific understanding of MCS; several questions appeared redundant; some confusion about definition of terms | Adjusted introductory text and added definition of MCS; deleted redundant questions; added clarifying context to provide background |
| Software Developers Interview Protocol | Duration: 55 minutes (anticipated 60 minutes); further clarification needed on research team’s approach to compiling State-specific understanding of MCS; wording of some questions was unclear | Adjusted introductory text; clarified wording of some questions |
| Local Office Frontline Staff Group Interview Protocol | Duration: 51 minutes (anticipated 60 minutes); further clarification needed on research team’s approach to compiling State-specific understanding of MCS; further clarification needed on a few questions; even in States lacking a formal MCS, participants may use mobile communication for SNAP | Adjusted introductory text; added clarifying text to and adjusted wording of a few questions; added two optional questions for States with no text messaging capabilities or mobile app to ask staff about presence of third-party or informal mobile communications |
| Other Stakeholders or Community Partners Interview Protocol | Duration: 27 minutes (anticipated 60 minutes); further clarification needed on research team’s approach to compiling State-specific understanding of MCS; even in States lacking a formal MCS, participants may use mobile communication for SNAP; lack of general understanding of how community partners’ clients use MCS | Adjusted introductory text; added two optional questions for States with no text messaging capabilities or mobile app to ask staff about presence of third-party or informal mobile communications; added an introductory question to assess whether, when, and how clients use MCS for SNAP |
| SNAP Participants Focus Group Protocol | Duration: 61 minutes (anticipated 90 minutes); lack of flow between introduction and icebreaker; some participants focused on third-party MCS; some participants may use MCS informally to communicate with their caseworkers; even in States lacking a formal MCS, participants may use mobile communication for SNAP; confusion around the wording of some questions | Adjusted ordering of icebreaker and introductory text; added clarifying text to focus conversation on State-sponsored MCS; added probe to assess informal MCS; added two optional questions for States with no text messaging capabilities or mobile app to ask staff about presence of third-party or informal mobile communications; adjusted wording of some questions to improve clarity |
| SNAP Office Waiting Room Questionnaire | Duration: 10–11 minutes (anticipated 5–7 minutes); confusion around the wording of some questions | Deleted two questions to ensure completion within allotted time; adjusted wording in some questions to improve clarity |
| SNAP Participants Focus Group Demographic Questionnaire | Duration: 5–11 minutes (anticipated to be included as part of 90-minute focus group); confusion around the wording of some questions; additional response options needed for some questions | Adjusted wording in some questions to improve clarity; added additional response options for some questions |

  3. Introductory Telephone Call With State MCS Staff and Administrators Protocol

The FNS research team interviewed the deputy commissioner for the NYC Human Resources Administration and the director of Business Process Innovation Projects on April 17, 2019, via telephone.

  a. Duration

The interview lasted 41 minutes, which suggests that the number of questions is appropriate for the 45- to 60-minute protocol and that there is time to provide more information at the beginning of the protocol about the data sources for previously collected data. After the pretest was completed, one of the probes for the interview questions (i.e., a subquestion) was deleted to allow a more open-ended response. None of the main interview questions were deleted.

  b. General Findings

The interviewees said the questions were clearly worded and were appropriate for their experiences and levels of knowledge. The interviewees were able to address every topic but provided more detail than expected about the State’s MCS and also had questions about the data sources used to collect information about the MCS. As a result, the FNS research team plans to include more information about the definition of an MCS and to indicate the sources from which the research team obtained information on the State’s existing MCS.

  c. Question-by-Question Findings and Recommendations

Table 2 provides the findings and recommendations for specific questions. See appendix F for the revised Introductory Telephone Call With State MCS Staff and Administrators Protocol, including the suggested changes described in table 2.

Table 2. Item-Level Recommendations for Introductory Telephone Call With State MCS Staff and Administrators Protocol

Question Number From Draft Instrument

Findings/Observations

Recommendations

Introductory text at the beginning of the interview

During the pretest, the team discovered it would be helpful to include a definition of MCS at the beginning of each interview with State staff and provide more context about how the State-specific conceptual map and/or State MCS profile was compiled.

Add the following text to the introduction: “We have done some initial research to get a preliminary understanding of your State’s mobile communication strategies (MCS). For the purposes of this study, MCS include text messaging, mobile applications participants can download on a smartphone or tablet, and websites that are optimized for viewing on mobile devices. These findings will help FNS and States improve communication with clients and identify best practices that lead to improved program outcomes. The information we have reviewed so far has been collected from publicly available reports such as the SNAP State Options Report and your State’s public-facing website [IF APPLICABLE: as well as the SNAP Process and Improvement Grant information]. Over the course of the study, we plan to conduct two interviews with you and other key staff involved in the MCS implementation.”

Introduction to Section C. Functional Components of Each MCS

More information was needed on when the team collected the data and from which data sources.

Add the following text to the introduction: “As a reminder, this depiction of your State’s MCS is based on information available from public sources. Our preliminary review was last updated in [INSERT DATE].”

E.1.a. Did you collect baseline data?

This probe is unnecessary; the team wants to allow stakeholders to provide the timeframe and range of conventional and unconventional data and monitoring techniques, not lead stakeholders to respond only about baseline data.

Delete the probe.



  4. State MCS Staff and Administrators Interview Protocol

To pretest the State MCS Staff and Administrators Interview Protocol, the FNS research team spoke with the deputy commissioner for the NYC Human Resources Administration on May 1, 2019, via telephone.

  a. Duration

The interview lasted 75 minutes. The protocol has been lengthened to 90 minutes to allow for all the questions to be answered.

  b. General Findings

The respondent had no difficulty understanding or answering the questions in the protocol, and the conversation flowed smoothly; however, she said it seemed “old-fashioned” to discuss the development and implementation of the mobile app and the mobile-optimized website separately because, in NYC’s case, the two tools were developed simultaneously and were fully integrated. To remedy this situation, the FNS research team suggests adding language acknowledging that each State has used a different process to design and develop its comprehensive MCS approach.

  c. Question-by-Question Findings and Recommendations

Table 3 provides the findings and recommendations for specific questions. See appendix G for the revised State MCS Staff and Administrators Interview Protocol, including the suggested changes described in table 3.

Table 3. Item-Level Recommendations for State MCS Staff and Administrators Interview Protocol

Question Number From
Draft Instrument

Findings/Observations

Recommendations

Introductory text at the beginning of the interview

During the pretest, the team discovered it would be helpful to include a definition of MCS at the beginning of each interview with stakeholders.

Add the following text: “For the purposes of this study, MCS include text messaging, mobile applications participants can download on a smartphone or tablet, and websites that are optimized for viewing on mobile devices.”

Introduction to Section C. Function Components of Each MCS

More information was needed on when the team collected the data and from which data sources.

Add the following text to the introduction: “As a reminder, this depiction of your State’s MCS is based on information available from public sources. Our preliminary review was last updated in [INSERT DATE]. We realize that each State has developed a unique approach for designing and implementing MCS.”

C.4. [IF APPLICABLE] Are there plans to add some or all of the above-mentioned functionality to your State’s MCS? Why or why not?

This question is very similar to the question that precedes it (C.3):
“b. Which of these functions are the most commonly used?”

Delete the first part of this question to avoid redundancy and confusion.

Probe under question E.1.

1. How, if at all, has implementing these MCS influenced the way your State conducts business?

Probe: Has implementation of MCS influenced staffing, workflow, client’s demand for face-to-face local offices, and/or workload? Has it influenced case management? Which areas were most affected?

Probe [IF APPLICABLE]: If your State has any view-only functions in place, what value do those have to your State?

Probe: How have these strategies influenced participants’ experience with SNAP?

The probe “How have these strategies influenced participants’ experience with SNAP?” was somewhat duplicative of an earlier question.

Delete the probe.

F.2. Has there been any additional progress toward meeting your original program goals?

The interviewee was confused by this question.

Add an introduction with background to provide context for the question and confirm the goals.

  5. Software Developers Interview Protocol

To pretest the Software Developers Interview Protocol, the FNS research team conducted an interview with the director of the Bureau of Systems Modernization and Information Analysis in Maryland. In her previous role as the assistant director of the Innovation group, she oversaw the planning, testing, and implementation of the former (not mobile-optimized) website and the current MyDHR mobile-optimized website. This protocol was pretested in person on June 13, 2019.

  a. Duration

The interview lasted 55 minutes, which suggests the number of questions is appropriate.

  b. General Findings

The interviewee said the questions were clearly worded and easy to understand. However, she deferred some questions about contracts and pricing to the contracts or procurement group within the State. She noted that the software developers or the team that oversaw the implementation may not know about the structure of payment for MCS services. The interviewee similarly noted that some States may be hesitant to provide information about the cost structures for their products or paid purchases. She suggested that when framing any questions surrounding cost structure, interviewers should preface them by stating that they are interested not only in the dollar amount paid but also in comparing differences in resources. There was also some confusion surrounding what was considered “obtaining consent” for different forms of MCS.

  c. Question-by-Question Findings and Recommendations

Table 4 provides the findings and recommendations for specific questions. See appendix H for the revised Software Developers Interview Protocol, including the suggested changes described in table 4.

Table 4. Item-Level Recommendations for Software Developers Interview Protocol

Question Number From
Draft Instrument

Findings/Observations

Recommendations

Introductory text at the beginning of the interview

During the pretest, the team discovered it would be helpful to include a definition of MCS at the beginning of each interview with stakeholders.

Add the following text: “For the purposes of this study, MCS include text messaging, mobile applications participants can download on a smartphone or tablet, and websites that are optimized for viewing on mobile devices.”

B.1.a.

1. Tell me about the planning and preparations for developing the MCS in the State.

a. How long did the planning and development process take?

Because the team interviewed someone from the State MCS development team, it was unclear whether the State utilized in-house developers, contract workers, or a combination of the two for development of the MCS.

Add the following text to the probe: “Do you use in-house developers?”

B.3.a.

1. How did you plan for potential data security or privacy issues when developing or implementing these tools?

a. [IF APPLICABLE] How do you obtain consent to participate in these communication mechanisms from clients in the State?

The interviewee was unclear with respect to what was considered “giving consent.”

Add to the probe a few examples of ways to give consent.

  6. Local Office Frontline Staff Group Interview Protocol

To pretest the Local Office Frontline Staff Group Interview Protocol, the FNS research team met in person with two former frontline workers in Maryland. Cumulatively, the workers had served in intake positions, quality control positions, and supervisory roles in a local office. Each interviewee had worked on managing Maryland SNAP cases prior to the transition from the previous (not mobile-optimized) website to the MyDHR mobile-optimized website. The protocol was pretested on June 13, 2019.

  a. Duration

The interview lasted 51 minutes, which suggests the number of questions is appropriate.

  b. General Findings

At times, the interviewees needed questions to be repeated or rephrased. Overall, however, the interviewees reported that the questions were clearly worded and easy to understand. Cumulatively, the interviewees were able to answer all the questions with the help of examples and probes, but neither of them understood what was meant by “uptake” in the question “Has uptake varied over time?” and asked the interviewer to clarify the meaning. In future administrations of the protocol, interviewers could use alternative language to ask about the popularity of MCS among SNAP participants (for question B.1).

  c. Question-by-Question Findings and Recommendations

Table 5 provides the findings and recommendations for specific questions. See appendix I for the revised Local Office Frontline Staff Group Interview Protocol, including the suggested changes described in table 5.

Table 5. Item-Level Recommendations for Local Office Frontline Staff Group Interview Protocol

Question Number From
Draft Instrument

Findings/Observations

Recommendations

Introductory text at the beginning of the interview

During the pretest, the team discovered it would be helpful to include a definition of MCS at the beginning of each interview with stakeholders.

Add the following text: “For the purposes of this study, MCS include text messaging, mobile applications participants can download on a smartphone or tablet, and websites that are optimized for viewing on mobile devices.”

Introduction to Section C. Function Components of Each MCS

More information was needed on when the team collected the data and from which data sources.

Add the following text to the introduction: “As a reminder, this depiction of your State’s MCS is based on information available from public sources. Our preliminary review was last updated in [INSERT DATE].”

B.1. Tell me about the uptake of these mobile communication tools among SNAP participants.

The probe “Has uptake varied over time?” needed to be clarified. This question was better understood when it was posed as “whether more people used the mobile website over time.”

Add the following probe to the question: “How popular are these tools among SNAP participants? Are they used often?”

A.1. Please start by telling me about your responsibilities related to mobile communication in the [LOCAL SNAP OFFICE] and experience with your State’s MCS development and implementation.

When asked this question, interviewees did not always address the client’s use of the MCS.

Add the definition of MCS to the question so that the question reads, “Please start by telling me about your responsibilities related to mobile communication in the [LOCAL SNAP OFFICE] and experience with your State’s MCS development and implementation of text messaging, mobile applications SNAP participants can download on a smartphone or tablet, and websites that are optimized for viewing on mobile devices.”

C.1. Probe: Can you confirm that this list of available functions is accurate?

Probe [IF APPLICABLE]: What level of functionality does the app provide? Can clients only view information about their cases (view-only), or are they also able to upload documents, report changes, or submit recertification applications?

Interviewees did not understand what was meant by “report changes.”

Replace “report changes” with “initiate changes.”

D.2. What portion of case management activities do you think clients can complete via their mobile devices?

Interviewees did not understand what was meant by “case management.”

Add the following clarifying probe: “For example, was the processing of new applications, or the process for recertifying, conducting quality control, or other clerical activities, affected by SNAP participants using the MCS? If so, how?”

Section E: Ease of Use: Text Messaging

For States that do not have text messaging service available, it would still be helpful for researchers to determine whether text messaging (related to SNAP activities) is provided by a third-party provider.

Add the following question to the beginning of the E: Ease of Use: Text Messaging section: “Based on our understanding, there is no text messaging option available for communication between SNAP participants and the State; however, do you or any other groups (outside of the State) use mobile communications (e.g., text messaging) to communicate with clients?”

Section F: Ease of Use: Mobile App

For States that do not have a mobile app available, it would still be helpful for researchers to determine whether a mobile app (related to SNAP activities) is provided by a third-party provider.

Add the following question to the beginning of the F: Ease of Use: Mobile App section: “Based on our understanding, there is no SNAP mobile app sponsored by the State; however, do you or any other groups (outside of the State) use mobile apps to communicate with SNAP clients?”



  7. Other Stakeholders or Community Partners Interview Protocol

To pretest the Other Stakeholders or Community Partners Interview Protocol, the FNS research team conducted a telephone interview with a case manager at a Maryland nonprofit that partners with the State’s SNAP office to help elderly Maryland residents complete the SNAP application and assist them with other case management activities. This protocol was pretested on June 26, 2019.

  a. Duration

The interview lasted 27 minutes, which suggests the number of questions on the protocol is appropriate.

  b. General Findings

The interviewee was able to understand and respond to the questions. However, the team discovered during the interview that the case manager works exclusively with older populations who are not familiar with the relevant technology and do not have access to a computer. As a result, the case manager typically uses a laptop to assist SNAP applicants and participants with SNAP-related activities and does not have experience using MCS for those activities. The team learned the importance of carefully prescreening community partners prior to the interview. The FNS research team recommends conveying clearer expectations about which staff or organizations would be ideal participants for this interview, as well as adding two additional questions to the protocol to provide more context about SNAP participants’ use of MCS and which MCS are available to them.

  c. Question-by-Question Findings and Recommendations

Table 6 provides the findings and recommendations for specific questions. See appendix J for the revised Other Stakeholders or Community Partners Interview Protocol, including the suggested changes described in table 6.

Table 6. Item-Level Recommendations for Other Stakeholders or Community Partners Interview Protocol

Question Number From
Draft Instrument

Findings/Observations

Recommendations

Introductory text at the beginning of the interview

During the pretest, the team discovered it would be helpful to include a definition of MCS at the beginning of each interview with stakeholders.

Add the following text: “For the purposes of this study, MCS include text messaging, mobile applications participants can download on a smartphone or tablet, and websites that are optimized for viewing on mobile devices.”

Introduction to Section D. Function Components of Each MCS

More information was needed on when the team collected the data and from which data sources.

Add the following text to the introduction: “As a reminder, this depiction of your State’s MCS is based on information available from public sources. Our preliminary review was last updated in [INSERT DATE].”

Section E: Ease of Use: Text Messaging

For States that do not have text messaging service available, it would still be helpful for researchers to determine whether text messaging (related to SNAP activities) is provided by a third-party provider.

Add the following question to the beginning of the E: Ease of Use: Text Messaging section: “Based on our understanding, there is no text messaging option available for communication between SNAP participants and the State; however, do you or any other groups (outside of the State) use mobile communications (e.g., text messaging) to communicate with clients?”

Section F: Ease of Use: Mobile App

For States that do not have a mobile app available, it would still be helpful for researchers to determine whether a mobile app (related to SNAP activities) is provided by a third-party provider.

Add the following question to the beginning of the F: Ease of Use: Mobile App section: “Based on our understanding, there is no SNAP mobile app sponsored by the State; however, do you or any other groups (outside of the State) use mobile apps to communicate with SNAP clients?”

Section A. Recap Background on State’s MCS Implementation

The team learned through the pretest interview that this particular organization’s clients were not likely to use the MCS for SNAP-related activities. It would be helpful to ask the interviewee, early in the interview, how popular the MCS are among its clients.

Add the following question to the Recap Background on State’s MCS Implementation section: “Tell me about how your clients use [INSERT NAME OF MCS].”

  8. SNAP Participants Focus Group Protocol

To pretest the SNAP Participants Focus Group Protocol, the FNS research team conducted an in-person focus group with three individuals who were current SNAP participants. All three of the participants had heard of the MyDHR website, but only two had accessed the website via a mobile device or tablet. Two of the participants were aware of the FreshEBT app, and one had downloaded and used the app. Although most of the focus group discussion focused on the MyDHR mobile-optimized website, there was a great deal of confusion regarding the services available through the nonoptimized website versus the mobile-optimized MyDHR website, such as which link to use to access the MyDHR homepage.

  a. Duration

The focus group lasted 61 minutes, but additional time was needed to conduct the usability testing of the MyDHR mobile-optimized website. Although a Wi-Fi network was available at the pretest location, none of the three participants was able to connect to the website on their telephones during the focus group, so the ease-of-use testing questions and activities were excluded from the pretest.

  b. General Findings

The interviewees said the questions were clearly worded and easy to understand. However, there were several terms and phrases used throughout the focus group that required further explanation or definition. For example, the interviewees did not understand what was meant by “acquired” or “incentive.”

  c. Question-by-Question Findings and Recommendations

Table 7 provides the findings and recommendations for specific questions. See appendix K for the revised SNAP Participants Focus Group Protocol, including the suggested changes described in table 7.

Table 7. Item-Level Recommendations for SNAP Participants Focus Group Protocol

Question Number From
Draft Instrument

Findings/Observations

Recommendations

Icebreaker text

Some of the text in the icebreaker section was more appropriate for and aligned with the introduction.

Move some of the text from the icebreaker section to a more appropriate place in the introduction to provide more clarity and continuity of the conversation.

A.2. Based on this diagram, are there any SNAP-related activities you can perform on your telephone that we haven’t already discussed?

Some participants responded to this question by describing the functionality of the FreshEBT app instead of the State-sponsored MyDHR mobile-optimized website.

Modify the question as follows so it asks interviewees to describe only the functionality available in the State-sponsored MCS: “2. Based on this diagram, are there any SNAP-related activities you can perform on [INSERT NAME OF STATE’S MCS] on your phone that we haven’t already discussed (e.g., via text messaging, on a mobile app, or on a mobile-optimized website)?”

B.1. Please raise your hand if you have ever received text messages from your local SNAP agency.

Some participants said they had received emails or text messages from caseworkers regarding their SNAP accounts.

Add a probe to the question that asks if participants have ever received informal text messages from the State office; for example, “Have you received personal messages from your local SNAP agency or case worker? Did you receive messages through an app or mobile-optimized website?”

Section B. Functionality of Text

For States that do not have text messaging service available, it would still be helpful for researchers to determine whether text messaging (related to SNAP activities) is provided by a third-party provider.

Add the following question to the beginning of the section: “Based on our understanding, there is no official text messaging communication available between SNAP participants and the State; however, do you ever use text messaging to communicate with the SNAP local office or others who help you manage your SNAP case?”

C.2 Did you have to download anything to start using the app? If so, please describe the instructions for how you acquired the app and first logged in.

Participants were not sure what was meant by “acquired.”

Replace “acquired” with “came upon.”

Section C. Functionality of Mobile App

For States that do not have a mobile app available, it would still be helpful for researchers to determine whether a mobile app (related to SNAP activities) is provided by a third-party provider.

Add the following question to the beginning of the section: “Based on our understanding, there is no official State SNAP mobile app; however, do you ever use mobile apps for SNAP?”

D.2 Did you have to do anything prior to logging into the website? If so, please describe the process of how you acquired the website and first logged in.

Participants were not sure what was meant by “acquired.”

Replace “acquired” with “accessed.”

D.3 How did you find out about the website?

There was some confusion surrounding the website.

Add the name of the website to this question.



  9. SNAP Office Waiting Room Questionnaire

To pretest the SNAP Office Waiting Room Questionnaire, the FNS research team tested the questionnaire in person with two individuals who were current SNAP participants.

  a. Duration

The questionnaire was read to the participants and was completed in 10–11 minutes, which indicated several questions should be eliminated. The team deleted two questions in an attempt to reduce the amount of time required to complete the questionnaire to 5–7 minutes.

  b. General Findings

Although the interviewees indicated the questionnaire was clear and easy to understand, at several points during the pretest participants provided responses that were not relevant to the questions. As a result, the team recommends rewording several questions to help participants better understand the intent of each question and the questionnaire’s focus on SNAP-related activities that are conducted on a mobile device (beyond using wireless telephone services for SNAP-related activities).

  c. Question-by-Question Findings and Recommendations

Table 8 provides the findings and recommendations for specific questions. See appendix L for the revised SNAP Office Waiting Room Questionnaire, including the suggested changes described in table 8.

Table 8. Item-Level Recommendations for SNAP Office Waiting Room Questionnaire

Question Number From
Draft Instrument

Findings/Observations

Recommendations

Have you ever used your telephone to get information about SNAP through a website or app? If so, what did you do?

Can you tell me about what it was like? Did you get the information you needed or do what you needed to do? Was it easy to use?

Did using this mobile technology save you time? For instance, did it save you a trip to your local office or from making a telephone call and waiting on hold?

Did you still have to take additional steps to get what you needed?

Did you have to ask someone (a friend, family member, caseworker) for help using the mobile technology?

How did you find out about these mobile technology tools?

This question took a long time to administer because it contained several separate questions instead of probes for one question.

Delete several of the questions that follow the first question and modify the first question to focus on the probes related to accessing the MCS and whether it saved time, which align with other questions across the protocols. The modified question could read, “Have you ever used your smartphone to get information about SNAP through a website or app? If so, what did you do?

Probe: Can you tell me about what it was like? Did you get the information you needed or do what you needed to do? Was it easy to use?

Probe: Did using this mobile technology save you time? For instance, did it save you a trip to your local office or from making a telephone call and waiting on hold?”

What part of the SNAP process do you wish you could do on your telephone?

Some participants responded to this question by stating they would like to use the mobile telephone to make calls as part of the SNAP process.

Because the team’s focus is on improvements to the three specific SNAP MCS (text message, mobile app, and mobile-optimized website), the team recommends naming the three MCS of focus in the question; for example, “What part of the SNAP process do you wish you could do on your phone via text message, mobile app, or a website viewed on your phone?”

How could [LOCAL SNAP OFFICE] better connect with you on your mobile device?

Some participants responded to this question by stating they would like to use the mobile telephone to make calls as part of the SNAP process.

Because the team’s focus is on improvements to the three specific SNAP MCS (text message, mobile app, and mobile-optimized website), the team recommends naming the three MCS of focus in the question; for example, “How could [LOCAL SNAP OFFICE] better connect with you on your mobile device via text message, mobile app, or a website viewed on your phone?”



  10. SNAP Participants Focus Group Demographic Questionnaire

To pretest the SNAP Participants Focus Group Demographic Questionnaire, the FNS research team briefly tested the questionnaire with two individuals who were current SNAP participants. Participants were notified that the questionnaire was intended to be administered prior to the focus group and that their answers would not affect their eligibility for the focus group.

  a. Duration

The demographic questionnaire took 5–11 minutes to complete.

  b. General Findings

The participants were able to understand the questions with relative ease but provided several responses that were not already listed on the questionnaire. Although the questionnaire includes “other” fields, the FNS research team recommends adding several answer options and providing additional clarification that the questionnaire is asking about the number of people in one’s SNAP household as opposed to the number of people who live in the same residential unit.

  c. Question-by-Question Findings and Recommendations

Table 9 provides the findings and recommendations for specific questions. See appendix O for the revised SNAP Participants Focus Group Demographic Questionnaire, including the suggested changes described in table 9.

Table 9. Item-Level Recommendations for SNAP Participants Focus Group Demographic Questionnaire

Question Number From
Draft Instrument

Findings/Observations

Recommendations

6. Including you, how many people currently live or stay in your home?

When responding to this question, one participant who lived in a group home was unsure if she should count the other residents of the home even though they were not part of her SNAP household.

Replace “home” with “SNAP household” to clarify that the question is asking for the number of people in the respondent’s SNAP household.

7. About how many years have you been receiving SNAP benefits?

When responding to this question, some participants who had received SNAP across multiple States at different times were not sure which time period to cite.

Amend the question as follows: “Including you, how many people are currently in your SNAP household?”

8. Where do you access the internet? (Select all that apply)

When responding to this question, one participant answered, “on the telephone,” which is not a current option.

Add “phone” as a potential answer for this multiple-choice question.

10. What are all the ways you pay for the internet on your telephone or tablet?

Some participants who responded to this question said they utilized a free telephone provided through SNAP that included a set amount of free data.

Because some SNAP programs provide mobile telephones with free or discounted data, the team recommends adding “Government-subsidized phone plan (e.g., a set amount of data is provided free, and any additional amount is paid for by phone holder)” and “N/A” to the list of potential responses.

10. What are all the ways you pay for the internet on your telephone or tablet?

Some participants who responded to this question said they did not pay for internet service for their telephones.

Add the following question prior to this question: “What are all the ways you pay for the internet on your phone or tablet?”

11. How do you access the internet on your telephone? (Select all that apply)

When responding to this question, one participant answered, “Google.”

Reword the question as follows to focus on connectivity: “How do you connect to the internet on your phone? (Select all that apply)”


