
Supporting Statement Part B for

Paperwork Reduction Act Submission


OMB Control Number 1024-0216


National Park Service Visitor Survey Card




  1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


The Park Studies Unit at the University of Idaho has designed this collection and plans to survey a random sample of visitors at approximately 330 units of the National Park System. Approximately 132,000 visitors will be surveyed annually (400 cards will be distributed at each unit). The survey content and methodology, described below, are similar to those used for previous administrations of the survey since 1998.


Respondent Universe

The population of interest for this collection will be adults 18 years of age or older who visit a National Park during a designated sampling period. Visitors will be randomly selected to participate in the study.


Visitor Selection

A random sample of visitors will be intercepted at the end of their visit at designated locations, at varying times and days, during key visitation periods for each park unit. Sampling will occur during peak visitation periods. During the sampling periods, visitors will be asked on-site to participate in the survey. Willing visitors will be given instructions on how to complete the Visitor Survey Card (VSC) and given the option to return the survey on-site or to use the self-addressed return-mail option. All refusals will be recorded by the interviewer on the Surveyor Record Sheet, and this information will be used to evaluate any non-response bias.


At a selection of 33 parks, a random sample of visitors will be intercepted at the end of their visit. We will ask all visitors contacted whether they would be willing to answer two questions (see Item 2 below) prior to receiving the survey. The answers to these questions, as well as the visitor's gender, will be recorded by the interviewer on the Surveyor Record Sheet. The responses at these selected parks will be used to evaluate any non-response bias.


Expected Response Rate

Based on experience from previous years, we expect to receive 39,600 annual responses (a 30% response rate). This response rate is at the high end of rates achieved for customer service evaluations in the private sector (10% to 30%). The response rate has increased from 26% (FY 2005) to 30% (FY 2009). This surveying effort will produce sample sizes that are robust, with acceptable margins of error, at both the individual park and national levels.


2. Describe the procedures for the collection of information including:

* Statistical methodology for stratification and sample selection,

* Estimation procedure,

* Degree of accuracy needed for the purpose described in the justification,

* Unusual problems requiring specialized sampling procedures, and

* Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


The distribution of the visitor survey cards will be evenly divided between weekends and weekdays, and between two blocks of time, 8:00 a.m. to noon and 1:00 p.m. to 5:00 p.m. Locations, days, dates, and times will be determined on a park-by-park basis. We will allow for slight deviations in the schedule when weather or inconsistent visitation patterns require them. The protocols for the distribution of survey cards call for contacting every third person or vehicle. This protocol is to be followed with the following exceptions (a brief sketch of the resulting selection rules appears after this list):


  • When vehicular or foot traffic is heavy, the surveys will be spread out over the entire four-hour period. Instead of sampling every third person or vehicle, a time interval of three to five minutes between parties will be used.

  • When vehicular or foot traffic is slow, the survey distribution will be adjusted from every third group to every group until the flows are deemed to be consistent (these adjustments will be duly noted on the interviewer's record sheets).
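The following is a minimal illustrative sketch, not part of the approved protocol, of how these three intercept rules could be expressed as selection logic. The function name, the arrival times, and the three-minute interval used for the heavy-traffic case are assumptions for illustration only:

# Illustrative sketch of the intercept rules described above.
# Party arrival times and traffic labels are hypothetical.

def select_parties(arrival_times_min, traffic="normal"):
    """Return indices of parties to intercept during a four-hour period.

    arrival_times_min -- arrival time of each party, in minutes from the start
    traffic -- "normal": every third party
               "heavy":  next party arriving at least 3 minutes after the last intercept
               "slow":   every party
    """
    selected = []
    if traffic == "normal":
        selected = list(range(2, len(arrival_times_min), 3))  # every third party
    elif traffic == "slow":
        selected = list(range(len(arrival_times_min)))        # every party
    elif traffic == "heavy":
        last = float("-inf")
        for i, t in enumerate(arrival_times_min):
            if t - last >= 3:   # 3-minute interval between intercepted parties
                selected.append(i)
                last = t
    return selected

# Example: parties arriving roughly every 2 minutes over the first half hour
print(select_parties(list(range(0, 30, 2)), traffic="heavy"))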


All parks participating in the VSC are assigned a survey month between February 1 and August 31. In all cases, this is a month during the park’s peak visitation season. Each park receives 400 surveys to distribute during its assigned month. With a response rate of 30%, an average of 120 surveys is returned per park. The associated margin of error is +/- 9 percentage points at the 95% confidence level for each park. This level of accuracy is sufficient because—as noted in Part A—a consistently low rating for any item signals the need for additional systematic investigation into potential problems with the provision of that item. This can be done through staff debriefings, observation of program or facility use, focus groups of visitors, or more comprehensive visitor surveys.


The corresponding margin of error for the national dataset is less than +/- 1 percentage point at a 95% confidence level. This is desirable for reporting System-level performance with respect to national GPRA goals.
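As a cross-check of the figures above, a minimal sketch using the standard conservative margin-of-error formula for a proportion (p = 0.5, 95% confidence) reproduces both values. This is an illustration only, not the Park Studies Unit's own computation:

# Margin of error for a proportion, in percentage points, at 95% confidence.
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of a 95% confidence interval for a proportion."""
    return 100 * z * math.sqrt(p * (1 - p) / n)

print(round(margin_of_error(120), 1))    # ~8.9 pp for a single park (120 returns)
print(round(margin_of_error(39600), 2))  # ~0.49 pp for the national dataset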


The survey cards will be distributed at locations with high concentrations of visitors, such as popular trailheads, visitor centers, and campgrounds. The survey coordinator in each park will randomly schedule a minimum of eight survey days (each a four-hour time period) during the month. Parks with low visitation may schedule more days. Sampling days are stratified so that four fall on weekdays and four on weekend days. In addition, equal numbers of mornings and afternoons are included in the schedule. This is the best-case scenario; when circumstances such as weather, fluctuations in visitation, or construction make it impossible to distribute all 400 survey cards to a random sample of visitors in eight days, parks will be instructed to add as many days beyond the minimum of eight as are needed to ensure a representative sample of park visitors.
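A hypothetical sketch of how such a stratified schedule could be drawn for an assigned month is shown below. The year, month, and seed are placeholders, and the drawing procedure is one possible implementation rather than the coordinators' actual method:

# Draw eight four-hour survey periods: four weekday and four weekend dates,
# with morning and afternoon blocks balanced overall.
import calendar
import random

def draw_schedule(year, month, seed=None):
    rng = random.Random(seed)
    n_days = calendar.monthrange(year, month)[1]
    weekdays = [d for d in range(1, n_days + 1) if calendar.weekday(year, month, d) < 5]
    weekends = [d for d in range(1, n_days + 1) if calendar.weekday(year, month, d) >= 5]
    schedule = []
    for pool in (weekdays, weekends):
        picked = rng.sample(pool, 4)
        blocks = ["8:00 a.m.-noon", "8:00 a.m.-noon", "1:00-5:00 p.m.", "1:00-5:00 p.m."]
        rng.shuffle(blocks)
        schedule.extend(zip(picked, blocks))
    return sorted(schedule)

print(draw_schedule(2012, 7, seed=1))  # placeholder year and month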


Initial visitor contacts are made by NPS employees or uniformed park volunteers in each of the NPS units. This contact takes approximately one minute. Interviewers use the following contact script as a guide:


"Hello! My name is (name), and (full park name) is conducting an important survey. We are asking visitors for their opinions about the park's services. Your participation in the survey is voluntary and anonymous, and you can complete the card in about three minutes when it is convenient for you. The survey card has its own preaddressed, postage-paid envelope. Just drop it in a US mailbox (or, if drop boxes are used at the site, a convenient drop box in the park) when you are through with it. Your opinion is important! Would you be willing to complete the survey card and mail it back to us?"


Visitors who agree to participate are thanked and given a postage-paid, return-addressed survey card. They will be encouraged to complete the survey and mail it back at the end of their park visit or, in some cases, to deposit the completed survey in a locked metal drop box on-site. The survey card takes approximately three minutes to complete.


If the visitor answers "No," interviewers will follow with "Thank you! Have a great day experiencing your National Park."


Selected Non-Response Bias Check Sites


A non-response bias check will be conducted at a random sample of 10% of the participating NPS sites (n=33). At these sites, during the initial contact, a random sample of visitors will be asked to participate in the survey using the script below. They will also be asked two questions to assess whether there is a non-response bias. The interviewer will record the responses to these questions as well as the gender of the respondent. If the visitor refuses to answer the two non-response bias check questions, the interviewer will simply record the visitor's gender, from observation, on the log sheet.


At the selected sites, interviewers will be instructed to use the following script when approaching selected visitors:


"Hello! My name is (name), and (full park name) is conducting an important survey. We are asking visitors for their opinions about the park's services. Your participation in the survey is voluntary and anonymous, and you can complete the card in about three minutes when it is convenient for you. The survey card has its own preaddressed, postage-paid envelope. Just drop it in a US mailbox (or, if drop boxes are used at the site, a convenient drop box in the park) when you are through with it. Your opinion is important! Would you be willing to complete the survey card and mail it back to us?"



If the visitor agrees to take the survey, or refuses the survey but agrees to answer the non-response bias check questions, the interviewer will ask:


"Would you be willing to answer two questions?"


"On a scale of 1 to 5, what, in your opinion, is the overall quality of facilities, services, and recreational opportunities here today at (full park name)? Select 1 for Very Good, 2 for Good, 3 for Average, 4 for Poor, or 5 for Very Poor."


My last question is, "What year were you born?"


The interviewer will log the responses to both questions and note the gender of the respondent on a log sheet.


If the visitor declines to participate, the interviewer will follow up with

"Thank you! And have a great day experiencing your National Park."


The interviewer will note the gender of the respondent and the reason for non-participation on the log sheet (for example, "in a hurry to catch up with family").


All Visitor Survey Cards will be returned to the University of Idaho Park Studies Unit. Returned cards are scanned, and the data are used to create reports for each individual park. Park data are then combined by region and for the entire National Park System to create park, regional, and national data reports. Parks are advised to use caution when interpreting responses to any question with a sample size of fewer than 30. Graphs of individual measures with samples of fewer than 30 are flagged with a bold caution warning. In addition, parks with fewer than 30 returned surveys are omitted from the national reports. Graphs with fewer than 10 responses are removed from the data report, and an explanation for the absence is put in its place. Finally, parks with discrepancies in their data-collection methods (as determined by the University of Idaho) do not receive individual reports, and their results are not used in preparing the agency report.
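The reporting thresholds described above can be summarized in a short hypothetical sketch; the function names are illustrative and do not reflect the Park Studies Unit's actual reporting software:

# Reporting rules for a single measure's graph and for national-report inclusion.
def graph_treatment(n_responses):
    """How the graph for a single measure is handled in a park report."""
    if n_responses < 10:
        return "remove graph and insert an explanation for the absence"
    if n_responses < 30:
        return "show graph with a bold caution warning"
    return "show graph"

def include_in_national_report(n_returned_cards, methods_ok=True):
    """Whether a park's results are used in the national (agency) report."""
    return methods_ok and n_returned_cards >= 30

print(graph_treatment(25))
print(include_in_national_report(118))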


3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.

Several steps will be taken to maximize the response rate and to ensure an accurate and reliable sample at all collection sites.


Multiple options for completing the survey

Respondents will be given the option to complete the survey on-site or to mail it back. Based on past experience, we expect that most respondents will use the on-site drop boxes to return their surveys. Using the on-site drop boxes has increased our response rate from 24% to 30%. As funding and logistics allow, VSC staff will continue to employ drop boxes in as many units as possible. This effort should continue to help maintain higher response rates.


Addressing potential non-response bias

To address the issue of non-response bias, we will randomly select 33 park units to participate in a non-response bias analysis. The non-response bias questions will be administered on-site to all visitors contacted to participate in the study. The findings from this on-site follow-up will be statistically compared to the sample of survey respondents to identify significant differences or potential non-response bias.
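One way such a comparison could be run, assuming the quality ratings, birth years, and gender counts have been tabulated from the Surveyor Record Sheets, is sketched below. The library choice (scipy) and all numbers are illustrative assumptions, not actual survey data or the project's own analysis code:

# Compare respondents and non-respondents on the on-site questions.
from scipy import stats

# Hypothetical quality ratings: 1 = Very Good ... 5 = Very Poor
quality_respondents    = [1, 2, 1, 2, 3, 1, 2, 2]
quality_nonrespondents = [2, 2, 3, 1, 2, 3, 2]

# Welch two-sample t-test on the quality rating
t, p = stats.ttest_ind(quality_respondents, quality_nonrespondents, equal_var=False)
print(f"quality rating: t = {t:.2f}, p = {p:.3f}")

# Chi-square test on hypothetical gender counts [male, female] in each group
gender_table = [[52, 48],   # respondents
                [40, 60]]   # non-respondents
chi2, p, dof, _ = stats.chi2_contingency(gender_table)
print(f"gender: chi-square = {chi2:.2f}, p = {p:.3f}")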


Based upon the anticipated national response rate of around 30%, the following language will be included in the national report:


"Low survey response rates increase the probability of non-response bias. Non-response bias occurs when those who choose to participate in a survey differ substantially and systematically from those who choose not to participate. If these differences are related to GPRA measures, the results may be unreliable."


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


The questions in the current version of the VSC have been tested and approved annually by OMB since 1998. The script and process for the interviewer were tested for clarity and accuracy in recording responses using a random sample of seven people entering the University of Idaho College of Natural Resources building. The same seven volunteers were also given the current version of the VSC and were asked to review it and provide comments concerning the overall structure, sequence, and clarity of the questions. These individuals were also asked to estimate the time burden of the survey. Comments and suggestions provided by reviewers and pre-test participants were evaluated and used to revise the survey instrument where appropriate.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.

Jennifer Hoger-Russell, Visitor Survey Card Project Director

University of Idaho - Park Studies Unit

208-885-4806


Dr. Steven Hollenhorst, Director of the Park Studies Unit

University of Idaho - Park Studies Unit

208-885-7911


Dr. Bruce Peacock, Social Science Division Chief

National Park Service

970-267-2106


Statistical consultant:


Dr. Lena Le, Social Science Researcher

Department of Conservation Social Sciences

University of Idaho

208-885-2585




