Study Plan for Evaluating and Improving Response Rates and Conducting a Non-Response Bias Analysis for the NPS Visitor Survey Card
Submitted to OMB/OIRA by the NPS Social Science Program
March 20, 2009
Introduction
The current Information Collection Request for the NPS Visitor Survey Card (VSC) was approved for 18 months (expiration date May 31, 2010). This approval is contingent on NPS meeting certain terms of clearance to make the data collected more suitable for GPRA and other program evaluation purposes. According to these terms, NPS is to submit a plan for a study it will undertake to evaluate and improve the VSC collection methods, including a non-response bias assessment. This plan is to include actions to increase the current response rate of 29%. While 29% is significantly higher than the industry average for customer satisfaction surveys without follow-ups, the study plan described in this document considers ways to increase this rate.
A two-step process was devised to meet the terms of clearance:
1. A VSC Technical Task Force appointed by the NPS Visiting Chief Social Scientist prepared the study plan. This plan was reviewed and revised by the Visiting Chief Social Scientist after consultation with the Associate Director, Natural Resource Stewardship and Science.
2. Based upon options identified in the plan, an expanded VSC Task Force, under the direction of the Visiting Chief Social Scientist, will engage NPS stakeholders to:
Develop a statement of purpose for the VSC
Assess the feasibility of the alternative approaches described in the plan
Develop an acceptable method for determining non-response bias in facility/program/service evaluations
Revise questions on the survey to address the above items.
The study plan outlines four options that the NPS could undertake to increase VSC response rates and incorporate an enhanced non-response bias analysis plan into the methodology. Additional options may be identified during the study; therefore, the ones listed in this plan likely will be modified before a final course of action that meets NPS needs is determined.
Some measures to meet the terms of clearance involve significant changes to the VSC’s questionnaire, field methods, analysis, and cost. The implications of these changes also will be studied over the coming year.
Alternatives
The Technical Task Force identified four alternatives for further study. These describe an array of likely response rates based on different field methods applied to the VSC. They range from a “no-change alternative” to an option that could achieve response rates similar to surveys incorporating multiple follow-ups to non-respondents.
Estimated costs for each alternative are not quantified at this time. Instead, they are presented on a qualitative scale, ranging from “low” to “very high.” Further study is needed to pinpoint these costs.1 Nevertheless, the following description provides an approximate idea of the budgetary impact of implementation and operation.
Low: Little or no change from the current VSC budget.
Moderate: Significant increase to the current VSC budget, but one that could be accommodated by re-programming Social Science Program funds and terminating some other activities.
High: Costs exceed the total annual appropriated budget of the Social Science Program and would require a significant increase in base funding.
Very High: The budget impact of “Very High” is estimated at approximately twice that of “High.”
1. 30-35% Response Rate with a Basic Non-Response Bias Analysis
Questionnaire Content: No change.
Field Methods: No change in most parks. The mailback survey would continue, with an option to return the survey in drop boxes at parks that request them. Modest increases in response rate would continue over time as the VSC project works with parks to improve their field methods.
Non-Response Bias Analysis: An annual non-response bias analysis would be added to the protocol in which a stratified random sample of 52 park units (10% sampling error at 90% confidence level) would be selected to participate. At these units, non-respondents would be asked to verbally respond to the overall satisfaction question from the current VSC survey instrument. Answers would be recorded by interviewers. Findings from non-respondents would be compared to respondents. If no significant differences are found, the probability of non-response bias is reduced. If significant differences are detected consistently over a specified period (e.g., two years), actions to improve response rates would be implemented. (Potential actions include alternatives 3 and 4.)
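To illustrate how the respondent/non-respondent comparison might be carried out at each sampled park, the following is a minimal sketch in Python using scipy. The five-point satisfaction scale, the counts, and the 0.05 significance threshold are assumptions made for the example; they are not specified by this plan.

```python
# Minimal sketch of the planned comparison at one sampled park; the satisfaction
# categories, counts, and 0.05 threshold are hypothetical assumptions.
from scipy.stats import chi2_contingency

# Counts on the overall satisfaction question (very good ... very poor)
respondents     = [210, 130, 40, 12, 8]   # returned the card by mail or drop box
non_respondents = [ 55,  38, 15,  6, 4]   # answered verbally during re-contact

chi2, p_value, dof, expected = chi2_contingency([respondents, non_respondents])

ALPHA = 0.05  # assumed significance level
if p_value < ALPHA:
    print(f"Significant difference (p = {p_value:.3f}); track over the two-year period.")
else:
    print(f"No significant difference (p = {p_value:.3f}); concern about bias is reduced.")
```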
Benefits:
Adding the non-response bias analysis to the current survey method ensures that the representativeness of the sample can be estimated without the larger investment required for improving response rates (i.e., follow-ups to non-respondents). This would give the NPS the confidence that the current approach can be used for GPRA and other program evaluation purposes.
Implementation in FY10 with little disruption to current operations.
GPRA-related data are not attached to visitor names or addresses, preserving respondents’ anonymity.
Minimum impact to park units. Only parks selected for the non-response bias analysis are affected.
Estimated Costs: Low
Additional costs would stem from adding the non-response bias analysis procedure to the current training manual for VSC coordinators at each park participating in the analysis. They also include extra training provided by park coordinators to interviewers at the non-response bias sample sites, additional survey materials, oversight, data entry, and analysis.
2. 30-35% Response Rate with an Expanded Instrument for More In-Depth Non-Response Bias Analysis
Questionnaire Content: Demographic variables added to the instrument.
Field Methods: No change for most parks. The mailback survey would continue, with an option to return the survey in drop boxes at parks that request them. Modest increases in response rate would continue over time as the project works with parks to improve their field methods.
Non-Response Bias Analysis: A non-response bias analysis would be conducted using the added demographic questions. Two approaches were considered:
A stratified random sample of 52 park units (10% sampling error at 90% confidence level) would be selected to participate in this analysis. At these units, non-respondents would be asked one or two demographic questions and the overall satisfaction question from the modified questionnaire. Answers would be recorded by interviewers. Findings from non-respondents would be compared to respondents to check for significant differences. If no significant differences are found, the probability of non-response bias is reduced. If differences are detected consistently over a specified period (e.g., two years), actions to improve response rates could be implemented in future years. (Potential actions include alternatives 3 and 4.)
OR
Late respondents to the mailback survey could be compared to early respondents on the demographic questions and overall satisfaction. (This does not require asking non-respondents demographic and satisfaction questions on-site, reducing the burden of the survey on staff and visitors.) Late respondents would serve as a proxy for non-respondents. If no significant differences are found, the probability of non-response bias is reduced. If differences are detected consistently over a specified period (e.g., two years), actions to improve response rates could be implemented in future years. (Potential actions include alternatives 3 and 4.)
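As a concrete illustration of the second approach, the following Python sketch compares early and late waves of mail returns on satisfaction and one demographic item. The field names, the wave cutoff (median days to return), and the choice of tests are assumptions for the example only.

```python
# Minimal sketch of a wave analysis in which late returns proxy for non-respondents;
# all data, the cutoff rule, and the tests below are illustrative assumptions.
import pandas as pd
from scipy.stats import mannwhitneyu, chi2_contingency

returns = pd.DataFrame({
    "days_to_return": [3, 5, 7, 9, 12, 21, 28, 30, 35, 41],
    "satisfaction":   [5, 5, 4, 5, 4, 4, 3, 4, 3, 3],      # 1 = very poor ... 5 = very good
    "age_group":      ["<40", "40+", "<40", "40+", "<40",
                       "40+", "40+", "<40", "40+", "40+"],
})

cutoff = returns["days_to_return"].median()            # assumed early/late split
early = returns[returns["days_to_return"] <= cutoff]
late = returns[returns["days_to_return"] > cutoff]

# Ordinal satisfaction: rank-based comparison of the two waves
u_stat, p_sat = mannwhitneyu(early["satisfaction"], late["satisfaction"])

# Categorical demographics: contingency-table comparison of the two waves
wave_table = pd.crosstab(returns["days_to_return"] > cutoff, returns["age_group"])
chi2, p_demo, dof, expected = chi2_contingency(wave_table)

print(f"Satisfaction, early vs. late waves: p = {p_sat:.3f}")
print(f"Age group, early vs. late waves:    p = {p_demo:.3f}")
```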
Benefits:
Adding either non-response bias analysis to the current survey methodology ensures that the representativeness of the sample can be estimated without the larger investment required for improving response rates (i.e., follow-ups to non-respondents). This would give the NPS greater confidence that the current approach can be used for GPRA and other WASO program evaluation purposes.
Implementation could be accomplished by FY11.
GPRA-related data are not attached to visitor names or addresses, preserving respondents’ anonymity.
Minimum impact on park units. Only parks selected for the non-response bias analysis are affected.
By adding demographic variables to the instrument, NPS would have a means to collect this important visitor information at the unit, region, and system levels. The additional questions also would increase park buy-in to the survey. Systematic information on who park visitors are is one of the most common needs reported by park managers.
Estimated Costs: Low to Moderate (depending on the non-response bias analysis option)
Additional costs relate to the design of the new instrument and adding the non-response bias analysis procedure to the current training manual for VSC coordinators at each park participating in the analysis. The additional costs include extra training provided by park coordinators to interviewers at the non-response bias sample sites, additional survey materials, oversight, data entry, and analysis.
3. 50-60% Response Rate with an On-site Return Option
Questionnaire Content: Demographic variables added to the instrument. Evaluation questions refined.
Field Methods: An added lock box option for the collection of surveys on-site would be implemented at all park units where this is feasible.2 The option for respondents to complete the survey on-site and return it in a secure manner has dramatically increased the response rate at most parks where this has been done and in similar work for the Bureau of Land Management. In locations where the use of drop boxes is feasible, response rates regularly exceed 60%.
Non-Response Bias Analysis: A more elaborate non-response bias analysis would be conducted using demographic variables. Two approaches were considered:
A stratified random sample of 52 park units (10% sampling error at 90% confidence level) would be selected to participate in the analysis. At these units, non-respondents would be asked one or two demographic questions and the overall satisfaction question from the modified questionnaire. Answers would be recorded by interviewers. Findings from non-respondents would be compared to respondents to check for significant differences. If no significant differences are found, the probability of non-response bias is reduced. If differences are detected consistently over a specified period (e.g., two years), actions to improve response rates could be implemented in future years. (Potential actions include alternative 4.)
OR
Late respondents to the mailback survey could be compared to early respondents on the demographic questions and overall satisfaction. (This does not require asking non-respondents demographic and satisfaction questions on-site, reducing the burden of the survey on staff and visitors.) Late respondents would serve as a proxy for non-respondents. If no significant differences are found, the probability of non-response bias is reduced. If differences are detected consistently over a specified period (e.g., two years), actions to improve response rates could be implemented in future years. (Potential actions include alternative 4.)
Benefits:
Substantial increase in response rate without the burden of capturing visitor addresses for follow-up contacts.
Ensures that more refined GPRA-related data are captured through the improved instrument; these data are not attached to visitor names or addresses, preserving respondents’ anonymity.
With the additional questions, the survey supplies a more useful profile of the visitor population for the field, as well as more fine-grained feedback on visitors’ evaluations (e.g., visitor satisfaction with a facility could include sub-items for cleanliness, availability, type, and general condition).
Implementation could be accomplished by FY12 if funding were available from the FY11 appropriation.
Adding the demographic items and refined evaluation questions increases the utility of the survey, as well as buy-in by park units. Systematic information on who park visitors are is one of the most common needs reported by park managers.
An NPS organizational benefit would result from integrating additional evaluation information from visitors into operational systems (e.g., performance evaluations, budget allocations, strategic planning).
Estimated Costs: High
Substantial costs relate to the design and production of an expanded questionnaire and adding the non-response bias analysis procedure to the current training for VSC coordinators at each park participating in this analysis. Training would be done by video or interactive, verifiable Web-based instruction. Other costs include additional survey materials, oversight, data entry, and analysis.
Additional substantial costs are related to drop box purchase and shipping, expanded reporting and printing of survey results, database development, and higher indirect costs.
The NPS would incur costs related to integrating additional evaluation information from visitors into operational systems (e.g., performance evaluations, budget allocations, strategic planning).
Other Costs:
Greater hour burden on visitors due to the added length of the questionnaire.
4. 75-85% Response Rate with Follow-Ups to Non-Respondents
Questionnaire Content: Demographic variables added to the instrument. Evaluation questions refined.
Field Methods: An added lock box option for the collection of the surveys on-site would be implemented at all park units where this is feasible. The option for respondents to complete the survey on-site and return it in a secure manner has dramatically increased response rates at most parks where this has been done and in similar work for the Bureau of Land Management. In locations where the use of drop boxes is feasible, response rates regularly exceed 60%. In addition, for those respondents who choose to mail the survey back at a later date (rather than use a drop box), multiple follow-ups would be employed to obtain an overall response rate of 75-85%. This would require collecting names and addresses from visitors selecting the mailback option.
Non-Response Bias Analysis: A non-response bias analysis would be conducted using demographic variables and overall satisfaction. Two approaches were considered:
A stratified random sample of 52 park units (10% sampling error at 90% confidence level) would be selected to participate in the analysis. At those units, non-respondents would be asked one or two demographic questions and the overall satisfaction question from the modified questionnaire. Answers would be recorded by interviewers. Findings from non-respondents would be compared to respondents to check for significant differences. If no significant differences are found, the probability of non-response bias is reduced. If differences are detected, and assuming the response rate is 75-85%, NPS will add a section to the national survey report describing the bias and the implications for interpreting the results.
OR
A survey log would be used by interviewers at all parks to record the observable characteristics of groups contacted (date and time, group size, number of children and adults, group type, and race and ethnicity using the OMB protocol for observing these). Characteristics of respondents returning the questionnaire by drop box or mail would be compared to the characteristics of those who accept the survey, but fail to return it (using either response mode), and those who refuse to participate at all. This approach does not require asking non-respondents demographic and satisfaction questions on-site, reducing the burden of the survey on staff and visitors. If no significant differences are found, the probability of non-response bias is reduced. If differences are detected, and assuming the response rate is 75-85%, NPS will add a section to the national survey report describing the bias and the implications for interpreting the results.
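To show what the survey-log comparison might look like in practice, the following Python sketch tabulates a hypothetical log and compares the composition of the three response outcomes. The observed characteristic (group type), its categories, and the counts are illustrative assumptions, and the chi-square test stands in for whatever significance test is ultimately adopted.

```python
# Minimal sketch of the survey-log comparison; categories and counts are hypothetical.
import pandas as pd
from scipy.stats import chi2_contingency

log = pd.DataFrame(
    {"family": [340, 110, 45], "friends": [120, 55, 25],
     "alone":  [ 60,  40, 20], "tour":    [ 30, 15, 10]},
    index=["returned", "accepted_not_returned", "refused"],
)

# Composition of each outcome group (row percentages) for a side-by-side comparison
composition = (log.div(log.sum(axis=1), axis=0) * 100).round(1)
print(composition)

# Formal check for differences in observable characteristics across outcomes
chi2, p_value, dof, expected = chi2_contingency(log)
print(f"chi-square = {chi2:.1f}, df = {dof}, p = {p_value:.4f}")
```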
Benefits:
The very high response rate significantly reduces the likelihood of non-response bias. This would give NPS the confidence that the current approach can be used for GPRA and other program evaluation purposes.
Adding demographic items and refined evaluation questions increases the utility of the survey, as well as buy-in by individual park units. Systematic information on who park visitors are is one of the most common needs reported by park managers.
Implementation could be accomplished by FY12 if funding were available from the FY11 appropriation.
An NPS organizational benefit would result from integrating additional evaluation information from visitors into operational systems (e.g., performance evaluations, budget allocations, strategic planning).
Estimated Costs: High to Very High
Substantial costs relate to the design and production of an expanded questionnaire and adding the non-response bias analysis procedure to the current training for VSC coordinators at each park participating in this analysis. Training would be done by video or interactive, verifiable Web-based instruction. Other costs include additional survey materials, oversight, data entry, and analysis.
Additional substantial costs relate to follow-ups to non-respondents selecting the mailback option, lock box purchase and shipping, expanded reporting and printing of survey results, database development, and increased indirect costs.
The NPS would incur costs related to integrating additional evaluation information from visitors into operational systems (e.g., performance evaluations, budget allocations, strategic planning).
Other Costs:
Major overhaul of the entire VSC operation.
Visitors who opt for the mailback option are not anonymous because they are asked to supply contact information for follow-ups.
Greater burden on visitors due to the added length of the survey instrument, the collection of follow-up contact information, and repeated contacts with non-respondents.
Next Steps
During the next several months, the NPS Social Science Program will engage internal stakeholders and the NPS leadership to evaluate the four alternatives and to better understand emerging needs of the NPS relevant to the VSC. To supplement this effort, the NPS may consult with additional outside survey experts.
During the study of the VSC alternatives, the relative costs of each alternative will be quantified and refined. The Technical Task Force projected the cost estimates based on the assumption that sampling would occur at 330 units annually. However, the survey could be implemented at a sample of NPS units on a rotating basis. Using this approach, the 330 NPS units participating in the VSC survey would be divided into smaller groups, with each group conducting the survey on a cyclical schedule. This approach has benefits and limitations that need to be assessed during the study.
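As a simple illustration of the rotating-basis idea, the following Python sketch assigns 330 units to a repeating three-year cycle. The cycle length and placeholder unit codes are assumptions for the example; the actual grouping (for instance, stratified by region or visitation) would be decided during the study.

```python
# Minimal sketch of one possible rotating schedule; a three-year cycle and
# placeholder unit codes are assumptions made only for this example.
from collections import Counter

def rotating_schedule(units, cycle_years=3, start_year=2010):
    """Assign each unit the first year in which it would field the VSC survey."""
    return {unit: start_year + (i % cycle_years) for i, unit in enumerate(units)}

units = [f"UNIT-{n:03d}" for n in range(1, 331)]    # placeholder codes for 330 units
schedule = rotating_schedule(units)
print(Counter(schedule.values()))   # Counter({2010: 110, 2011: 110, 2012: 110})
```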
A discussion has begun regarding technological improvements—such as mobile touch-screen kiosks—that could streamline the collection, entry, and analysis of VSC data. Because this technology requires substantial investment up-front, it would need to be field-tested. If tests proved successful, use of such technology could make administration of the VSC survey less expensive, reducing the cost of any of the alternatives.
Technical Task Force Members
Dr. Steven Hollenhorst, Principal Investigator, Visitor Services Project, University of Idaho
Jen Hoger Russell, Project Director, Visitor Survey Card
Bret Meldrum, Branch Chief, Visitor Use and Social Sciences, Yosemite National Park
Michael Savidge, Program Manager, Golden Gate National Recreation Area
Charles Jacobi, Recreation Specialist, Acadia National Park
Megan McBride, Senior Research Associate, NPS Social Science Program
Study Plan Submitted by:
Dr. Jim Gramann, NPS Visiting Chief Social Scientist
1 The estimates assume that surveying occurs at 330 NPS units annually. Amending this sampling schedule is discussed at the end of the plan.
2 At this time, the number of parks where use of drop boxes is feasible is not known. The VSC is conducting an online survey of park VSC coordinators to determine how many parks could use the drop boxes if they were made available.