RTI International, Altarum Institute, and the University of California, Berkeley (UCB) conducted pretests of the draft data collection instruments for the WIC Nutrition Education Study in November and December 2013. Based on the pretest findings, the instruments were revised, and the Participant and Site Surveys were translated into Spanish. Pretests of the translated instruments were conducted with Spanish-speaking individuals in March 2014.
This appendix summarizes the pretest procedures and key findings. Based on these findings, RTI provided the Food and Nutrition Service (FNS) with “red line” instruments along with recommendations for revisions.
To obtain feedback from a diverse group of WIC staff and participants, we identified local agencies (LAs) in four FNS regions to recruit pretest participants to test the Phase I and Phase II data collection instruments. These regions were
Western Region: California (three LAs);
Midwest Region: Illinois and Wisconsin (one LA each);
Northeast Region: Maine and Massachusetts (one LA each); and
Southeast Region: North Carolina (one LA) and Florida (two LAs).
The LAs represented diversity in terms of caseload size, type of agency (e.g., government, nongovernment), geographic setting (e.g., urban, rural), number of WIC staff, and race/ethnicity of participants served. Before LAs were recruited, we emailed the Regional Offices and State agencies (SAs) affiliated with these agencies to inform them of the study. All SAs indicated that they had no reservations about the selected LAs participating in the pretests.
To recruit participants for the LA Survey pretest, we worked with three of the LAs, located in California, Illinois, and Massachusetts. The director of each LA was contacted by phone to request the agency’s participation. Each director chose to complete the LA Survey personally rather than delegate it, believing they were the most appropriate respondent.
One week before the date of the scheduled pretest interview, the participants were sent a paper copy of the survey along with instructions to complete it in advance of the interview and record the start and end times for completing the survey. Completed surveys were returned to the interviewer prior to the scheduled pretest interview. On the day of the interview, the interviewer obtained the participant’s verbal consent and then used a debriefing guide to lead him/her through a discussion of the survey questions.
The average response time for the three participants was 48 minutes (see Table 1). Each respondent skipped two or more questions based on the skip patterns. The respondent who took the longest reported spending some time contacting the SA about the percentage of high-risk participants; because no respondent had these data available, we recommend deleting Question 6.[1] Participants reported that they have a good idea of the predominant high-risk conditions across all participant categories, but not within each category, so we recommend combining the two questions on high-risk conditions (Questions 7 and 8) into a single question about the predominant high-risk conditions among all participants.
Table 1. LA Survey: Pretest Participants’ Time to Complete Survey and Opinions Regarding Survey Length
| Participant | Time to Complete (minutes) | Opinion on Survey Length |
|---|---|---|
| California | 55 | “A little long. Time to do survey will depend on how familiar the person who completes it is with the data.” |
| Illinois | 29 | “About right” |
| Massachusetts | 60 | “Okay” |
| | Average: 48 | |
In addition, we recommend combining the two questions on methods of nutrition education by appointment type (Questions 12 and 13) into one question using a table format. We further recommend minor changes throughout the survey to clarify terminology or instructions, add or revise response options, and revise question order or format (Questions 2, 5, 9, 12, 20–24, 27, 31, 32, and final table). Based on these revisions, we estimate respondent burden to be 45 minutes for the LA Survey.
To pretest the Site Survey, we again worked with the three LAs in California, Illinois, and Massachusetts. The LA Directors were contacted by phone to request their agencies’ participation, and each recommended an individual at one of their sites to take part. We conducted pretests with a site supervisor, a senior nutritionist, and a nutrition education coordinator.
As with the LA Survey pretest, participants were sent a paper copy of the survey one week in advance with instructions to complete it and record their start and end times. Completed surveys were returned to the interviewer before the scheduled interview, during which the interviewer obtained verbal consent and used a debriefing guide to lead a discussion of the survey questions.
The average response time for the three participants was 62 minutes (see Table 2). One respondent took 90 minutes, but only because the site provides nutrition education in multiple modes and has a large number of staff, so she answered nearly all of the questions. Two of the three participants had to consult records, reports, or other staff members to complete the survey; one reported spending 15 minutes with her supervisor getting input on questions she was uncertain about. Overall, the three participants found the survey reasonable in length and believed their peers could complete it without too much difficulty. However, one participant thought that peers who spend most of their time with clients and have little “office time” might find it challenging to make time for the survey, and another suggested including more examples and clarification for some questions.
Because it took participants, on average, over an hour to complete the survey, we recommend dividing it into a set of core questions plus two modules to reduce respondent burden: half of the respondents will receive the core questions and one module, and the other half will receive the core questions and the second module (i.e., two versions of the survey). Question 1 and Question 37 were deleted after the survey was divided because they were no longer relevant. In addition, an optional Nutrition Education Staff Summary form was added to Version 1 to collect information on characteristics of staff because participants thought it would be useful, “especially for sites that have a large number of staff.”
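A minimal sketch of how the split-sample assignment could be implemented (the seeded random half-and-half split below is an illustrative assumption, not the study’s specified allocation procedure):

```python
import random

def assign_versions(site_ids, seed=0):
    """Randomly assign half the sites to Version 1 (core questions plus
    Module 1) and half to Version 2 (core questions plus Module 2)."""
    rng = random.Random(seed)
    ids = list(site_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {sid: ("Version 1" if i < half else "Version 2")
            for i, sid in enumerate(ids)}

# Example: six sites split three and three between the two versions.
print(assign_versions(["S1", "S2", "S3", "S4", "S5", "S6"]))
```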
Table 2. Site Survey: Pretest Participants’ Time to Complete Survey and Opinions Regarding Survey
| Participant | Time to Complete (minutes) | Opinion on Survey Length | Overall Ease/Difficulty to Complete^a |
|---|---|---|---|
| California | 90 | “Length was about right given the amount of information asked.” | 4, “Some questions were detailed” |
| Illinois | 43 | “Appearance-wise, quite lengthy, but not as bad as I thought it would be.” | “Overall fairly easy” |
| Massachusetts | 45–60 | “A little long.” | 2 = easy |
| | Average: 62 | | |

^a Rated using a scale from 1 to 5, where 1 is “very easy” and 5 is “very difficult.”
The new format was tested by two Altarum staff members with experience working with local WIC agencies. One staff member completed Version 1 in 45 minutes, and the other completed Version 2 in 55 minutes; the 10-minute difference resulted from the second staff member’s more descriptive response to the open-ended question on nutrition education projects. Based on this testing, we estimate respondent burden to be 45 minutes for each version of the Site Survey.
During the pretests, no two respondents calculated the number of full-time equivalents (FTEs) the same way, and none could clearly explain how they did the calculation. So that FTEs can be calculated consistently across sites, the question on the number of full- and part-time staff members (Question 20) was restructured, and Question 21 on FTEs was deleted. Additionally, we recommend other minor changes throughout the survey to clarify terminology or instructions, add or revise response options, and revise question order or format (Questions 2, 6, 9–13, 38, 41–43, 45, and 48).
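With the restructured staff counts, FTEs can then be derived consistently during analysis rather than by each respondent. A minimal sketch of one conventional formula, assuming a 40-hour full-time week (the survey itself does not define this threshold):

```python
def total_fte(weekly_hours, full_time_hours=40.0):
    """Full-time equivalents: sum of staff members' weekly hours
    divided by the hours of one full-time position."""
    return sum(weekly_hours) / full_time_hours

# Example: two full-time staff (40 h/week) and two half-time staff
# (20 h/week) amount to 3.0 FTEs.
print(total_fte([40, 40, 20, 20]))  # 3.0
```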
To pretest the Spanish-language Site Survey, the directors of two LAs in Florida were contacted by phone to request their agencies’ participation. Each director identified one individual to complete the survey and participate in the phone interview.
Again, participants were sent a paper copy of the survey one week in advance with instructions to complete it and record their start and end times; one participant received Site Survey Version 1, and the other received Version 2. On the day of the interview, the interviewer obtained verbal consent and used a debriefing guide to lead a discussion of the survey questions.
The response time was 38 minutes for Site Survey Version 1 and 60 minutes for Version 2. The Version 1 respondent did not need to consult other individuals or locate records to answer the questions; the Version 2 respondent needed to find information to respond to Question 4. Both respondents rated the difficulty of completing the survey a “2” on a 5-point scale, where 1 is “very easy” and 5 is “very difficult.”
Both respondents reported that the instructions were clear; however, they recommended using the term “la clínica” in place of “del sitio” in the instructions and throughout the survey because that is the term familiar to Spanish-speaking respondents. Both also suggested revisions to terminology in other questions (e.g., using “Universidad” rather than “bachillerato” for college degree and “clases de nutrición” rather than “sesiones de educación grupal” for nutrition education group sessions).
Pretest respondents were also asked for their opinions on translation revisions suggested by FNS. In some cases they agreed with the changes; in others they stated that the original terms were appropriate or recommended a term different from the one the FNS reviewer suggested.
To recruit participants for the pretest of the Phase I in-depth interviews, we worked with three LAs, located in Wisconsin, California, and Maine. The LA directors were contacted by phone to request their agencies’ participation, and each identified one or two appropriate staff members. We conducted interviews with five individuals: one clinic coordinator, one supervising nutritionist, one nutritionist, and two nutrition paraprofessionals.
Two days before the scheduled interview, we sent participants the selected question modules for review. On the day of the interview, the interviewer obtained verbal consent and administered the selected modules following the interview guide; each respondent answered the module(s) related to their job role, per the instructions in the guide. After each module, the interviewer used a debriefing guide to discuss any problematic questions or other concerns before moving on to the next module.
Table 3 presents the amount of time it took respondents to complete each module. The interview guide was effective in eliciting descriptions of how respondents provide nutrition education, along with information about their experience with training, technology, coordination, and use of reinforcement materials; hence, few changes to the guide are recommended. Based on the average time spent per module during the pretest, a 30-minute interview (burden) is sufficient to complete two to three modules per respondent (see the worked check after Table 3). We suggest removing two duplicative questions in Module B (items l and m), which will shorten that module by a couple of minutes and allow Modules A and B together to be completed in approximately 30 minutes. We also recommend adding a question to Module C asking respondents to comment on the effectiveness of technology-based nutrition education, because two respondents volunteered this information in response to another question and had interesting points to offer. Lastly, we suggest revising the questions in Modules A and B about assistance with using skills learned in training, because pretest respondents understood the term “mentoring” differently.
Table 3. Phase I In-depth Interviews: Pretest Participants’ Time to Complete Each Module
| Module | Respondent Times | Average (minutes) |
|---|---|---|
| A-Individual | M1: 18 min; W1: 21 min; M2: 16 min | 18 |
| B-Group | C1: 20 min; W1: 15 min; C2: 15 min | 17 |
| C-Technology | C1: 9 min; W1: 4 min; C2: 5 min | 6 |
| D-Reinforcers | M1: 5 min; C2: 5 min; M2: 3 min | 4 |
| E-Coordination | C1: 3 min; M1: 1 min; C2: 3 min; M2: 3 min | 2.5 |

W = Wisconsin, C = California, M = Maine
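As a quick check on the 30-minute burden estimate, the per-module averages in Table 3 can be summed for representative module assignments (the combinations below are illustrative; actual assignments depend on each respondent’s job role):

```python
# Average minutes per module, transcribed from Table 3.
avg_minutes = {"A": 18, "B": 17, "C": 6, "D": 4, "E": 2.5}

# Sum representative two- and three-module assignments.
for combo in (("A", "B"), ("A", "C", "D"), ("B", "D", "E")):
    total = sum(avg_minutes[m] for m in combo)
    print("+".join(combo), "=", total, "minutes")
# A+B = 35 minutes, which the recommended trimming of Module B brings
# down to roughly 30; combinations built around C, D, and E fall well under 30.
```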
We describe next the pretests of the Participant Survey with five English-speaking participants; the pretests conducted in March 2014 with four Spanish-speaking participants, which tested the translated instruments, are described afterward.
To recruit participants for the interviews, RTI and UCB contacted local WIC clinics in Raleigh, NC, and Oakland, CA, respectively. In addition to posting study flyers at the Raleigh clinic, RTI and UCB staff members announced the study during group nutrition education sessions or approached clients about their interest in the study. Interested individuals were screened for eligibility and, if eligible, an interview was scheduled. We conducted interviews with five women: one pregnant woman, one postpartum woman, one woman with an infant up to 11 months old, and two women with children aged 1 to 4 years.
On the day of the interview, the interviewer obtained the participants’ written consent and asked them to complete the questionnaire, noting the start and end times for Parts 1 and 2. After the participant finished completing the questionnaire, the interviewer used a debriefing guide to lead a discussion using cognitive interviewing (“think aloud”) techniques to identify any problematic questions or other concerns. Each participant received a $50 cash gift for their participation.
Table 4 presents the amount of time it took respondents to complete each part of the Participant Survey and their opinions about the survey.[2] Pretest participants completed the combined version of the survey, which includes questions asked at all three time periods—baseline, interim, and final.[3] On average, it took participants 24 minutes to complete both parts of the survey; excluding the 38-minute outlier reduces the average burden to 20.5 minutes. Because participants in the full-scale survey administration will complete only a subset of the questions (not all questions are asked at each time period), we estimate respondent burden at 20 minutes for each version (baseline, interim, final) of the survey (about 10 minutes for Part 1 and 10 minutes for Part 2).
Table 4. Phase II Participant Surveys: Pretest Participants’ Time to Complete the Survey and Opinions on Survey
| Participant | Part 1 (min) | Part 2 (min) | Total (min) | Opinion on Survey Length | Overall Ease/Difficulty to Complete^a |
|---|---|---|---|---|---|
| CA1 | 5 | 12 | 17 | “It was appropriate. Had to think about some questions more than others.” | 2 = easy |
| CA2 | 11 | 27 | 38 | “Fine” | 2 = easy |
| NC1 | 5 | 17 | 22 | “Okay. It could take longer for less educated people.” | 1 = very easy |
| NC2 | 7 | 13 | 20 | No comment | 2, “everyday stuff; just had to think back” |
| NC3 | 10 | 13 | 23 | “Just right. Felt less than 23 minutes.” | 1 = very easy |
| | | | Average: 24 | | |

^a Rated using a scale from 1 to 5, where 1 is “very easy” and 5 is “very difficult.”
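The burden figures cited above follow directly from Table 4; a quick arithmetic check (CA2 is the 38-minute outlier):

```python
# Total completion times in minutes, transcribed from Table 4.
times = {"CA1": 17, "CA2": 38, "NC1": 22, "NC2": 20, "NC3": 23}

mean_all = sum(times.values()) / len(times)              # 24.0 minutes
trimmed = [t for p, t in times.items() if p != "CA2"]    # drop the outlier
mean_trimmed = sum(trimmed) / len(trimmed)               # 20.5 minutes
print(mean_all, mean_trimmed)
```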
The following changes were recommended to the Survey for Parent/Caregiver with Eligible Child:
Make textual changes to the self-efficacy and stages-of-change tables (Questions 14 and 15), given that most respondents indicated they were already engaged in the desired behaviors. These changes will help reduce the number of respondents who report already doing the behavior at baseline.
Make textual changes to reduce the social desirability bias of certain questions (Questions 8, 14, 22, and 43).
Add more descriptive text to clarify meaning of some questions and/or alleviate confusion that one or more respondents had with the questions (Questions 19, 30, 32, 41, 46, and 54).
Reformat some questions to alleviate confusion that one or more respondents had with answering them (Questions 43, 44, 49, 69, and 70). These revisions will not change the response burden.
Eliminate questions and/or response options that were found to be duplicative (Questions 34, 36, 40, 42, 45, 48, 55, 56, and 57).
Add a question to capture the value of the information given at most recent WIC visit (following Question 34).
Add a question to capture the impact of bringing the child to a WIC clinic visit (following Question 34).
Add a question to look at continuity between visits (Question 42c).
For consistency, the above changes were also recommended for the other two surveys, along with the following survey-specific changes:
For the Survey for Pregnant Women, eliminate the breastfeeding questions (Questions 37, 43, and 47) because pregnant women cannot already be breastfeeding the child in question.
For the Survey for Postpartum Women, add more descriptive text to clarify the meaning of Questions 9 and 11, and add breastfeeding questions (Questions 35 and 71), because breastfeeding is an educational topic WIC discusses with this audience.
To recruit participants for the Spanish-language pretests, RTI and UCB attended group nutrition education sessions at local WIC clinics in Raleigh, NC, and Oakland, CA, respectively, and announced the study at the beginning of each session. After the session, interested attendees approached RTI and UCB staff members and were screened for eligibility; if eligible, an interview was scheduled. We conducted interviews with four Spanish-speaking women: one pregnant woman, one postpartum woman, and two women with children aged 12 months to 4 years.
As in the English-language pretests, the interviewer obtained participants’ written consent, had them complete the questionnaire while noting the start and end times for Parts 1 and 2, and then used a debriefing guide with cognitive interviewing (“think aloud”) techniques to identify any problematic questions or other concerns. Each participant received a $50 cash gift.
Table 5 presents the amount of time it took respondents to complete each part of the Participant Survey and their opinions about the survey.[4] Pretest participants completed the combined version of the survey, which includes questions asked at all three time periods—baseline, interim, and final.[5] On average, it took participants 41 minutes to complete both parts of the survey. Because participants in the full-scale administration will complete only a subset of the questions (not all questions are asked at each time period), we estimate respondent burden at 30 minutes for the baseline survey and 25 minutes each for the interim and final surveys, longer than the English version (20 minutes at each time period). This is consistent with other surveys we have conducted: completing a survey in Spanish generally takes longer than in English because the translated document is longer.
Table 5. Phase II Participant Surveys: Pretest Participants’ Time to Complete the Survey and Opinions on Survey
| Participant | Part 1 (min) | Part 2 (min) | Total (min) | Opinion on Survey Length | Overall Ease/Difficulty to Complete^a |
|---|---|---|---|---|---|
| CA1 | 17 | 40 | 57 | “Some questions seemed repetitive.” | 2 = easy |
| CA2 | 9 | 28 | 37 | “There was a lot of reading involved.” | 2 = easy |
| NC1 | 9 | 16 | 25 | No comment | 2 = easy |
| NC2 | 13 | 32 | 45 | No comment | 2 = easy |
| | | | Average: 41 | | |

^a Rated using a scale from 1 to 5, where 1 is “very easy” and 5 is “very difficult.”
The following changes were recommended to the three versions of the surveys:
Change “How often do you do the following things” to “How many times do you do the following things.”
In Question 32, change “Talk with WIC staff one-on-one” to “Talk one-on-one with a WIC staff person” to use consistent language throughout surveys.
In Question 73, fix translation error; change “each topic” to “the topic.”
Throughout the survey, revise the translation of “one-on-one” so it is more meaningful to respondents.
For consistency, the above changes, except for the two translation-related items, will be made to the English versions as well.
To recruit participants to pretest the focus group moderator guide, two RTI team members attended three classes at the WIC clinic in Raleigh, NC. At the beginning and/or end of each class, a staff member informed class participants about the study and asked them to see an RTI team member after class to schedule an interview. We conducted interviews with three women: one pregnant woman, one postpartum woman, and one woman with a child.
On the day of the interview, the interviewer began by asking participants to read and sign an informed consent form. The interviewer then led a discussion with each participant using the moderator guide. The interviewer stopped intermittently to ask each participant if any of the questions were difficult to answer and/or understand. Each interview lasted approximately 30 minutes, and participants received a $50 cash gift.
The moderator guide was useful in leading a stimulating and productive discussion with each participant. Participants found the questions easy to answer and enjoyed sharing their opinions. However, Questions 13 and 14, which asked about changes made in the past year to try to be healthier, confused the pregnant participant, who asked, “Have I made changes since becoming pregnant or since receiving WIC?” These questions were reworded to specify “since receiving WIC” and revised to ask about the past 6 months, because 12 months may be difficult to recall. No additional changes were made to the moderator guide. Based on this pretesting and our experience conducting focus groups, the estimated burden for each group is 90 minutes.
To recruit participants for the Nutrition Educator Survey pretest, we worked with three WIC LAs in Wisconsin, California, and Maine. The LA directors were contacted by phone to request their agencies’ participation, and each identified an appropriate staff member. We conducted interviews with three individuals: one clinic coordinator, one nutritionist, and one nutrition paraprofessional who is bilingual, with English as a second language.
One week before the pretest interview, we sent participants a paper copy of the survey along with instructions to complete it and record their start and end times. Completed surveys were returned to the interviewer before the scheduled interview, during which the interviewer obtained verbal consent and used a debriefing guide to lead a discussion.
The average response time was 25 minutes (see Table 6). To reduce respondent burden to 20 minutes and address respondents’ comments about the ease/difficulty of completing the survey, we recommend (1) reducing or expanding response options in Questions 5, 8, 9, 10, 14, 16, 17, 18, 20, 23, 24, 25, and 27 and (2) making revisions to alleviate confusion that one or more respondents had with Questions 1, 7, and 12. Additionally, we recommend removing two questions (Questions 21 and 25) that are not necessary for addressing the research questions and that respondents had difficulty answering. Based on these revisions, we estimate respondent burden to be 20 minutes for the Nutrition Educator Survey.
Table 6. Nutrition Educator Survey: Pretest Participants’ Time to Complete the Survey and Opinions on Survey
| Participant | Time to Complete (minutes) | Opinion on Survey Length | Overall Ease/Difficulty to Complete^a |
|---|---|---|---|
| Wisconsin | 29 | “Overall very thorough” | 2 = easy |
| California | 26 | “It does take time to read through each question, but they were easy.” | 1 = very easy |
| Maine | 20 | “About right” | 1 = very easy |
| | Average: 25 | | |

^a Rated using a scale from 1 to 5, where 1 is “very easy” and 5 is “very difficult.”
During an assessment of client-centered services conducted for the Michigan WIC Program in September 2013, four Altarum staff members administered the Individual Nutrition Education Assessment Form while observing 20 individual sessions of certification and nutrition education activities with participants. Following the observations, the team compared their experiences using the form. Team members who had observed the same LA staff members conducting individual sessions compared their ratings and found them to be consistent. While conducting the cognitive interviews at the Raleigh WIC clinic, RTI also observed two group nutrition education sessions to test the Group Nutrition Education Assessment Form.
Based on the pretesting, we recommend the following changes to the Individual Nutrition Education Assessment Form:
Delete second row of Question 12, “Approach to Education Topics,” because the features were duplicative of the previous row.
Delete one of three response items in Question 17, “Information Gathering,” to make it easier for the observer to answer.
Revise the wording of the response items in Question 19 to provide clarification.
For the Group Nutrition Education Assessment Form, we recommend the following changes:
Remove “educational props” from Question 9 because they are included in Question 10.
Change response option for facilitation style (Question 15) from a three-point scale to a two-point scale to make it easier for observer to provide an answer.
Make the same changes to this form as described above for the Individual Form.
We emailed four SAs (Maryland, Oklahoma, Illinois, and Arizona) regarding the administrative data request and asked them to respond to the following questions:
1. Approximately how much time (in minutes or hours) would it take for your SA to review the request; prepare, run, and review the data; and submit the data to RTI?
2. Would your SA be able to provide all three data items requested? If not, which would you not be able to provide?
3. Is the data request clear? If not, what is unclear and/or what suggestions do you have for revising it?
The comments from three of the four SAs (Maryland, Oklahoma, and Illinois) showed that we had substantially underestimated the amount of time needed to respond to the data request (3 hours): the SAs estimated that it would take 8 to 20 hours, suggesting that states vary in their capacity to respond to these types of requests.
All three SAs can provide the dates of WIC visits and whether nutrition education was provided at those visits. Only Illinois can report whether a WIC participant is high risk, and even there the agency would need to look in each participant’s record because its information system has no high-risk field. In Maryland, staff usually “turn off” the high-risk indicator after nutrition education has been completed. Oklahoma has no standardized definition of high risk and no corresponding indicator field in its information system; if provided with a list of nutrition risk conditions, it could indicate which, if any, of those risk codes appear in participant records. All three SAs found the data request clear, except that Oklahoma, having no standard definition of “high risk,” suggested defining the term in the request.
Because FNS decided it was too burdensome to ask SAs to comply with the data request, we revised our procedures to request the information from the pilot sites for a 30 percent subsample of participants. We emailed four of the sites that participated in the Phase I pretesting and asked them to review the revised administrative data request and to provide the estimated burden.
Three of the four sites responded to our email request. The three sites provided burden estimates of 3 hours, 4 hours, and 1.5 hours, for an average of 2.8 hours.
[1] Question numbers correspond to the pretest versions of the instruments, which are available upon request.
[2] For the full-scale study, Part 1 will be completed before the WIC participant’s appointment, and Part 2 will be completed after the appointment.
[3] Following the pretests, we will split the combined survey into three versions (baseline, interim, final).
[4] For the full-scale study, Part 1 will be completed before the WIC participant’s appointment, and Part 2 will be completed after the appointment.
[5] Following the pretests, we will split the combined survey into three versions (baseline, interim, final).