
NPS Form 10-201 (Rev. 09/2019)                                OMB Control No. 1024-0224
National Park Service                                         Expiration Date: 05/31/2023


PROGRAMMATIC REVIEW AND CLEARANCE PROCESS

FOR NPS-SPONSORED PUBLIC SURVEYS






The scope of the Programmatic Review and Clearance Process for NPS-Sponsored Public Surveys is limited to individual surveys of park visitors, potential park visitors, and residents of communities near parks. Use of the programmatic review is restricted to non-controversial surveys of these populations that are unlikely to raise topics of significant interest in the review process, and to information collections that do not attract attention to significant, sensitive, or political issues. Examples of such issues include seeking opinions regarding political figures and obtaining citizen feedback on high-visibility or high-impact issues such as the reintroduction of wolves in Yellowstone National Park, the delisting of specific endangered species, or drilling in the Arctic National Wildlife Refuge.



SUBMISSION DATE: 12.21.2021

PROJECT TITLE: Socioeconomic Monitoring (SEM) Pilot Survey: Phase Two


ABSTRACT: (not to exceed 150 words)


A strong mandate and need for socioeconomic research exist in the National Park Service (NPS), expressed in the NPS strategic goals for science, in statements by NPS leadership, in the report of the Second Century Commission, and in other outlets. This mandate resulted in a pilot socioeconomic monitoring study in 2015/2016 at a sample of park units across the U.S., described here as Phase One. The purpose of this study is to use outcomes from that pilot to progress into Phase Two at up to 28 NPS units across the United States during the 2022 season. Phase Two will further explore visitor demographics and characteristics at a larger subset of NPS units, applying recommendations from the 2015/16 pilot and validating instrument and method refinements. Data collected during the 2022 Phase Two sampling period will provide individual park managers with key information about their visitors and support analysis for future implementation.


PRINCIPAL INVESTIGATOR CONTACT INFORMATION:

NAME:

Mandi Roberts

TITLE:

Vice President

AFFILIATION:

Otak, Inc.

ADDRESS:

11241 Willows Road NE, Suite 200, Redmond, WA 98052

EMAIL:

mandi.roberts@otak.com

PHONE:

425-822-4446


PARK OR PROGRAM LIAISON CONTACT INFORMATION:

NAME:

Bret Meldrum

TITLE:

Chief, Social Science Program

AFFILIATION:

National Park Service, Environmental Quality Division

ADDRESS:

1201 Oakridge Drive, Fort Collins, CO 80525

EMAIL:

bret_meldrum@nps.gov

PHONE:

(970) 267-7295




PROJECT INFORMATION:

Where will the collection take place? 28 NPS units of varying sizes and types

Sampling Period Start Date: 3/1/2022

Sampling Period End Date: 10/31/2022

Type of Information Collection Instrument: (Check ALL that Apply)

X Mail-Back Questionnaire

Face-to-Face Interview

X On-Site Questionnaire

Focus Groups

Telephone Survey

Other (List)

Will an electronic device be used to collect information?

No X Yes – Type of Device: Android Tablets


SURVEY JUSTIFICATION:

Social science research in support of park planning and management is mandated in the NPS Management Policies 2006 (Section 8.11.1, “Social Science Studies”). The NPS pursues a policy that facilitates social science studies in support of the NPS mission to protect resources and enhance the enjoyment of present and future generations (National Park Service Act of 1916, 38 Stat 535, 16 USC 1, et seq.). NPS policy mandates that social science research will be used to provide an understanding of park visitors, the non-visiting public, gateway communities and regions, and human interactions with park resources. Such studies are needed to provide a scientific basis for park planning and development.

A strong mandate and need for socioeconomic research exist in the National Park Service (NPS). These are expressed in the NPS strategic goals for science, in statements by the NPS leadership, in the report of the Second Century Commission, NPS Next emphases, and through the Department of the Interior priorities for 2018-2022. Additionally, a recent GAO report identified the need to better understand and monitor customer experience dimensions where results more directly link to investments made by the bureau. The need for socioeconomic research and monitoring also was identified in an external review of the NPS Social Science Program and supported in the 2008 Interior Appropriations Bill Joint Explanatory Statement.

In response, an NPS SEM working group developed a statement of mission, goals, and objectives that emphasized collecting, organizing, and making available high-quality social science trend data to NPS managers. The NPS administered a pilot SEM program in 14 NPS units in 2015-2016 (Phase One) to determine the viability and cost of such a program. In total, over 6,000 surveys were completed, providing the NPS with useful insights about in-park visitor characteristics.

Data generated through this collection are needed to understand and gauge how well the NPS is serving the American public at park-level and bureau-wide resolutions. Results from the second phase of this Socioeconomic Monitoring (SEM) pilot will indicate what further refinements are needed to improve data quality and will help determine opportunities to generalize results to a wider population in future research efforts.



SURVEY METHODOLOGY

  A. Respondent Universe:

The universe of respondents for this study is adults (ages 18 and over) visiting one of the 28 selected park units during each unit's peak month, defined by the five-year monthly average visitation.
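A minimal sketch of how a unit's peak month could be identified from monthly visitation counts, assuming five years of NPS Visitor Use Statistics data are available in a simple mapping (the month set and figures below are illustrative, not actual VUS data):

```python
from statistics import mean

# Hypothetical five-year monthly visitation counts for one park unit
# (illustrative numbers only, not actual NPS Visitor Use Statistics).
monthly_visits = {
    "Jan": [12000, 11500, 13000, 12500, 11800],
    "Jun": [85000, 90000, 88000, 87500, 91000],
    "Jul": [95000, 97000, 96500, 94000, 98000],
    "Oct": [40000, 42000, 41000, 39500, 43000],
}

# Peak month = month with the highest five-year monthly average.
five_year_avg = {month: mean(counts) for month, counts in monthly_visits.items()}
peak_month = max(five_year_avg, key=five_year_avg.get)
print(peak_month, five_year_avg[peak_month])  # e.g., "Jul"
```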

  B. Sampling Plan / Procedures:

Each NPS unit will be designated as one of three types (nature, historic, or recreation), adapted from Haefele, Loomis, and Bilmes (2016)¹. Historic parks will be subdivided into urban and non-urban categories based on the NPS Visitor Use Statistics (VUS) population class.

(1) Nature – National Park areas that focus on the preservation of natural environments and features, shorelines, and bodies of water.

(2) Historic – National Park areas that focus on the preservation of American history and culture or the commemoration and remembrance of significant events and people.

  • Historic Urban – historic areas with a population class of Urban; Suburban; or Mixed, with most of the surrounding population class considered Urban or Suburban.

  • Historic Non-urban – historic areas with a population class of Rural; Outlying; Remote; Mixed, with most of the surrounding population class considered Rural or Outlying; or No Boundary Data.

(3) Recreation – National Park areas that focus on nature-based recreation opportunities.


Park unit size and sampling strata will be determined by average annual visitation, sorted in descending order within each category (nature, historic, or recreation). Parks accounting for at least 80% of a category's total annual visitation will be designated "large" parks, with the remaining parks designated "small" parks.
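A minimal sketch of this size designation, assuming a list of parks in one category with average annual visitation (the park names and figures are illustrative placeholders):

```python
# Designate "large" vs. "small" parks within one category using the 80% rule:
# sort by average annual visitation (descending) and mark parks as "large"
# until they cumulatively account for at least 80% of the category's total.
parks = [
    ("Park A", 4_500_000),
    ("Park B", 2_100_000),
    ("Park C", 900_000),
    ("Park D", 400_000),
    ("Park E", 100_000),
]

total = sum(v for _, v in parks)
threshold = 0.80 * total

sizes = {}
cumulative = 0
for name, visits in sorted(parks, key=lambda p: p[1], reverse=True):
    # Parks counted before the running total reaches 80% are "large".
    sizes[name] = "large" if cumulative < threshold else "small"
    cumulative += visits

print(sizes)
```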

Table 1. Number of units in the sampling frame per stratum

                                  Unit Size
  Park Type & Class           Large    Small    Total
  Nature                         29       67       96
  Historic: Urban                23       79      102
  Historic: Non-Urban            34       80      114
  Recreation                     17       35       52
  Total                         103      261      364

Using a random number generator, a proportional number of parks will be selected from each stratum to create a stratified random sample of up to 28 parks. The sample will include a blend of nature, historic (urban and non-urban), and recreation sites, along with units of varying annual visitation volumes.
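The form does not spell out the allocation mechanics beyond "proportional"; the following is a minimal sketch, assuming the stratum counts from Table 1, simple rounding, and placeholder unit lists:

```python
import random

# Strata sizes from Table 1 (number of units in the sampling frame).
strata = {
    ("Nature", "Large"): 29, ("Nature", "Small"): 67,
    ("Historic Urban", "Large"): 23, ("Historic Urban", "Small"): 79,
    ("Historic Non-Urban", "Large"): 34, ("Historic Non-Urban", "Small"): 80,
    ("Recreation", "Large"): 17, ("Recreation", "Small"): 35,
}
TARGET = 28
total_units = sum(strata.values())  # 364

# Proportional allocation: each stratum contributes roughly its share of the
# 28-park target. The exact rounding rule is an assumption; the form only
# states that selection is proportional.
allocation = {s: round(TARGET * n / total_units) for s, n in strata.items()}

# Randomly select the allocated number of parks from each stratum.
# Park identifiers are placeholders for the real unit lists in each stratum.
sample = {}
for stratum, k in allocation.items():
    frame = [f"{stratum[0]} ({stratum[1]}) unit {i}" for i in range(1, strata[stratum] + 1)]
    sample[stratum] = random.sample(frame, k)

print(sum(allocation.values()), allocation)  # sums to 28 for these counts
```

Sampling of Visitors within Park Units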

A random sample of visitors will be intercepted while visiting one of the selected NPS sites during a 10-day sampling period. Surveyors will be stationed at specific intercept locations within each NPS unit (e.g., visitor centers, attraction areas, trailheads, and near park entrances) based on insights from park staff, NPS visitor use statistics, prior research, and professional experience.

Surveys that require intercepting visitors in vehicles will be conducted by safely flagging vehicles into a designated survey area. Surveyors will be instructed to attempt to intercept every nth vehicle passing, where n is based on the anticipated volume and the number of visitor contacts required at each NPS unit.

Where surveying requires intercepting individuals on foot or otherwise outside of their vehicles, visitors traveling past the intercept locations or within the designated survey area will be approached at random. Surveyors will be instructed to attempt to intercept every nth group passing, where n is based on the anticipated volume and the number of visitor contacts required at each NPS unit.
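A minimal sketch of the every-nth selection rule, assuming the interval n is set from the anticipated volume and the required number of contacts (the figures and the stream of passing groups are illustrative):

```python
import itertools

def target_interval(anticipated_volume: int, required_contacts: int) -> int:
    """Pick n so that intercepting every nth passing group yields roughly
    the required number of contacts (a simplifying assumption)."""
    return max(1, anticipated_volume // required_contacts)

def systematic_intercepts(passing_groups, n):
    """Yield every nth group (or vehicle) passing the intercept location."""
    for count, group in enumerate(passing_groups, start=1):
        if count % n == 0:
            yield group

# Illustrative example: 1,200 groups expected per shift, 150 contacts needed.
n = target_interval(1200, 150)            # -> every 8th group
groups = (f"group {i}" for i in itertools.count(1))
first_three = list(itertools.islice(systematic_intercepts(groups, n), 3))
print(n, first_three)                     # 8 ['group 8', 'group 16', 'group 24']
```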



Follow-up (Mail-Back/Online) Survey

All respondents who agree to participate in the onsite intercept survey will be asked to complete the mail-back/online follow-up survey.

  C. Instrument Administration:

On-site (Intercept) Survey

Within each park unit, visitors will be sampled and asked to participate in an intercept survey about their park experience and trip characteristics. Across the 10-day sampling period at each park, surveyors will contact potential respondents in locations where they are likely to have already experienced the site, rather than where they are just arriving. This approach limits the number of potential respondents who have just arrived at the park and have yet to spend time in it.

If the visitor agrees to participate, the survey will be verbally administered by the surveyor and the responses will be recorded via an Android Tablet. If the visitor does not agree, surveyors will thank them for their time, attempt to ask the three non-response bias questions, and then sample the next nth visitor. This process will be standardized across all park units using the protocols established for surveyors.

Following a brief introduction of the purpose of the survey, the potential respondent (the adult group member with the most recent birthday) will be asked if they would be willing to take part in the 5-minute survey. The intercept survey will include the questions used for the non-response bias check (i.e., U.S. residency, permanent or seasonal residency in the local area, and nights away from the permanent residence on this trip) as well as basic trip characteristics that apply to the current visit. Four potential outcomes are expected following the request to participate: (1) complete refusal; (2) partial refusal, answering the non-response questions but nothing further; (3) completed intercept, but refusal of the mail-back; and (4) completed intercept and acceptance of the mail-back.

As part of the intercept protocol, surveyors will assign a unique identifier to each survey. The identifier will be linked to the mail-back survey, printed on the postage-paid envelope, and included on a slip of paper inside the packet along with the URL for the online survey. This unique identifier will also serve as the password to access the online survey.
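The form does not specify the identifier format; one possible sketch, assuming a park alpha code plus a zero-padded sequence number, with the same string reused as the online-survey password:

```python
# One possible identifier scheme (the actual format is not specified in this
# form): a four-letter park alpha code plus a zero-padded sequence number.
# The same string would be written on the mail-back form, the return envelope,
# and the URL slip, and used as the password for the online survey.
def make_identifier(park_code: str, sequence: int) -> str:
    return f"{park_code.upper()}-{sequence:04d}"

print(make_identifier("ZION", 37))  # ZION-0037
```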

The final question on the intercept survey will give the respondent an opportunity to provide a mailing or email address, which will be used for the follow-up protocol of the “Tailored Design Method” (reminder protocols for mail-back surveys).

Due to the COVID-19 pandemic, the research team recognizes a need to consider contingencies in instrument administration. If COVID-19 conditions continue into the sampling period, the research team will adopt several strategies to ensure a safe environment for both respondents and surveyors, including, but not limited to, the following practices:

  1. Social distancing (at least 6 feet) required between surveyor and respondent.

  2. Masks will be worn by surveyors and recommended for respondents.

  3. Surveyors will not allow the respondent to interact with the tablet and all questions will be asked verbally.

  4. Hand sanitizer will be available at the sampling site.

  5. Respondents will be asked to provide their email address and/or mailing address if they are not comfortable with taking a mail-back survey packet from the surveyor on the day they are intercepted.

Example Script for Intercept Survey:

Hello, I am working with [NPS Site] conducting a 5-minute survey to improve visitor experiences in the park. May I ask you several questions about your [NPS Site] experience?

  • If NO – The surveyor will thank the visitor and ask them to answer the three questions that will serve as a non-response bias check (in Section E below)

  • If YES – The surveyor will begin the intercept visitor survey with the recruited individual after reading the Paperwork Reduction and Privacy Act below. The surveyor will verbally administer the survey and record responses on an Android Tablet.

Before we begin, I would like to let you know that this survey has been approved by the Office of Management and Budget. It is important to note that a Federal agency may not conduct or sponsor, and you are not required to respond to, a collection of information unless it has a valid OMB control number. The control number for this collection is XXX and this number is valid through XXX. Secondly, your participation is voluntary, and your name will never be connected with your individual responses. This survey will only take about five minutes of your time today.



Upon completion of the intercept survey, respondents will be thanked for their time. The following protocol will be used to ask the respondent to participate in the follow-up survey.

Follow-up (Mail-Back/Online) Survey

A follow-up survey will be administered to capture full trip characteristics, such as spending and travel patterns, that are typically not known until after the trip is completed. All on-site respondents agreeing to complete and return the follow-up survey will be given a packet with a paper copy of the survey and a self-addressed, postage-paid envelope.

The respondent will be instructed to complete and return the follow-up survey after their trip to the current NPS site. Surveys may be mailed back from any U.S. mailbox. International respondents will be encouraged to mail the survey back before leaving the U.S. All respondents will be given a card with a web link as an alternative for completing the online version of the survey. To safeguard respondent anonymity, each survey (on-site and follow-up) will have an identifier unique to each respondent. We will use the identifiers to monitor non-response bias.

Upon agreeing to participate in the follow-up survey, each respondent will be asked to provide their home mailing address and/or email address. All respondents will receive correspondence by postal mail or email, based upon their preference. A reminder postcard will be sent to all contacts within the week following the end of each on-site data collection period. This postcard will thank them for their participation and encourage them to complete and return the follow-up survey if they have not already done so.

Following the “Tailored Design Method,” a second reminder and survey packet (with the same unique IDs) will be mailed or emailed to all non-respondents three weeks after the initial contact. The mailed packets will be sent to domestic and international addresses and will include the appropriate postage-paid, self-addressed return envelopes. Two to four weeks after the replacement survey is sent, a final contact will be made by mail or email with all remaining non-respondents.

  D. Expected Response Rate / Confidence Level:

As stated in Table 3 above, we anticipate a total of 36,000 visitor contacts during this pilot study. Based on previous pilots and recent NPS survey research efforts, we forecast generally consistent rates across park units: an 85% response rate for the intercept survey, 90% participation in the non-response survey among those who give soft refusals, a 90% acceptance rate for the mail-back survey among those who complete the intercept survey, and a 40% completion rate among those who take a mail-back survey form. Each of these is described in further detail below.

Onsite (Intercept) Survey

Based on previous NPS research experience with this method, we anticipate that 85% of visitors contacted during each sampling period will agree to participate in the intercept survey (n=30,600 total anticipated participants). RRC Associates and ITRR conduct frequent intercept surveys across the U.S., and these percentages are based upon the average refusal rates obtained. For example, in a recent study by the research team using these same methods at Zion National Park, an intercept survey achieved a participation rate of approximately 86% across a variety of locations in the park. Surveyors will aim for as widespread participation as possible using a variety of standardized techniques, such as administering the survey verbally, reading from the tablet, and jointly reading the questions with visitors who would otherwise be unable to participate.

Of the 15% of contacted visitors who do not agree to participate in the intercept survey (noted in the table below as soft refusals), we expect 90% to answer the non-response bias questions (n=4,860), with roughly 10% of these visitors completely refusing to participate in any part of the collection (hard refusals, n=540).
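These counts follow directly from the assumed rates; a minimal arithmetic check that reproduces the Table 4a figures:

```python
# Reproduce the Table 4a counts from the assumed onsite rates.
contacts = 36_000
completed_onsite = round(contacts * 0.85)              # 30,600
soft_refusals = contacts - completed_onsite            # 5,400
completed_nonresponse = round(soft_refusals * 0.90)    # 4,860
hard_refusals = soft_refusals - completed_nonresponse  # 540

print(completed_onsite, soft_refusals, completed_nonresponse, hard_refusals)
```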

Table 4a. Anticipated Onsite Survey Response Rates

  Total Number of Visitor Contacts (see Table 3):           36,000
  Completed Onsite Surveys (85% of contacts):               30,600
  Soft Refusals (15% of contacts):                           5,400
  Completed Non-Response Surveys (90% of soft refusals):     4,860
  Hard Refusals (10% of soft refusals):                        540



Follow-up (Mail-back/Online) Survey

Based on 2020/21 NPS studies applying coupled intercept and mail-back survey methods, we estimate that 90% of visitors who complete the intercept survey will agree to take a mail-back/online form. Of those, we anticipate that 40% will complete and return the mail-back or online survey. The following estimates for the mail-back/online survey are therefore based on visitor contacts from the intercept survey. All visitors who are given a mail-back/online survey form will have already taken the intercept survey, which includes the non-response bias questions. Thus, no extra effort is necessary to collect non-response bias information from respondents who do not complete the mail-back/online survey.

Table 4b. Anticipated Follow-up Survey Response Rates

  Completed Onsite Surveys (see Table 4a):                  30,600
  Accepted Follow-up Surveys (90% of completed onsite):     27,540
  Completed Follow-up Surveys (40% of accepted surveys):     9,180
  Non-respondents (60% of accepted surveys):                13,770


Using a 95% confidence level, both the intercept/onsite survey and the mail-back/online survey have a margin of error under +/- 5.0%.
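This margin-of-error claim can be checked with the standard formula for a proportion at 95% confidence; a minimal sketch, assuming the most conservative p = 0.5, the completed-response counts above, and no finite-population correction:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Margin of error for a proportion at ~95% confidence."""
    return z * math.sqrt(p * (1 - p) / n)

print(f"{margin_of_error(30_600):.3%}")  # onsite sample, roughly 0.56%
print(f"{margin_of_error(9_180):.3%}")   # follow-up sample, roughly 1.02%
```

Both values fall well under the stated +/- 5.0% bound.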


  E. Strategies for dealing with potential non-response bias:

To account for potential non-response bias among visitors who do not agree to participate, visitors who decline the intercept survey (those referred to above as soft refusals) will be asked the following questions:

  1. Do you currently live in the U.S.?

  2. Are you a permanent or seasonal resident of the local area around [NPS Site]?

  3. On this trip, have you, or will you [and your personal group], stay overnight away from your permanent residence either inside [NPS Site] or within the local area?

Responses to these questions will be analyzed and compared to those of respondents who completed the entire intercept survey to explore any non-response bias concerns. Because the intercept survey will be linked to the online survey via a unique identifier, which also serves as the password for the online survey, respondents who do not complete the follow-up survey (either by mail-back or online) can be compared to those who did participate. Thus, non-response bias checks will be conducted on both intercept and mail-back survey respondents. Chi-square tests will be conducted between respondents and non-respondents to identify any issues of underrepresentation due to non-response bias. The multi-method approach to the mail-back/online survey will allow for more widespread participation among respondents, limiting non-response issues.
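A minimal sketch of one such chi-square comparison, assuming responses to a single non-response bias question have been tallied for intercept respondents and soft refusals (the counts are illustrative placeholders, and SciPy is assumed to be available):

```python
from scipy.stats import chi2_contingency

# Illustrative contingency table for one non-response bias question
# ("Do you currently live in the U.S.?"); rows = respondent group,
# columns = answer (Yes, No). Counts are placeholders, not study data.
table = [
    [2_450, 310],   # completed the intercept survey
    [400, 70],      # soft refusals (answered only the bias-check questions)
]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}, dof = {dof}")
# A small p-value would flag a difference between respondents and
# non-respondents worth examining for potential underrepresentation.
```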


  F. Description of any pre-testing and peer review of the methods and/or instrument:

All instruments have been pilot tested and used in other park research studies, such as the 2015-16 pilot study referenced earlier. In addition to the OMB approval process, the research team will build a data management system that ensures confidentiality for all data collected.


The instruments for this study were developed using questions from the Pool of Known Questions. These pre-approved questions have been peer-reviewed, used across a variety of park studies, and piloted for necessary improvements. After Phase One, the questions were further refined based on analysis for non-response issues, errors in respondent answers, and through consultation with additional experts. Multiple steps have been taken to ensure the questions provide reliable and accurate results with minimal respondent burden.


The data collection method is a proven mixed-method approach (intercepting visitors on-site, followed by full-trip details via a mail-back/online survey). This method has been used successfully by social science researchers across the country at multiple destinations. Studies using this method have been peer-reviewed and determined scientifically valid. The tablet and online survey portions will be internally tested, checked for quality assurance, and monitored for consistency. A more holistic picture of trip characteristics and preferences is captured through this mixed-method approach: the intercept survey captures information from a wider range of the population, while the mail-back/online survey records full-trip details that are difficult to assess on-site.


BURDEN ESTIMATES

The combined annual burden for this collection is estimated to be 5,772 hours. This includes the initial contact time and the time to complete and return the questionnaires for both sample groups (i.e., on-site and mail-back surveys). We anticipate contacting 36,000 visitors at 28 units during the sample period of this pilot study. We used the numbers of completed responses in Tables 4a and 4b above.

  • On-site Survey

We anticipate that the initial contact for the on-site survey will take less than a minute. We expect that 85% (n=30,600) of all visitors contacted will agree to take 5 minutes to complete the on-site survey (30,600 respondents x 5 minutes = 2,550 hours).

  • On-site non-response survey

We anticipate that 90% (n=4,860) of the visitors who decline to complete the full survey will agree to answer the non-response bias check questions (4,860 respondents x 2 minutes = 162 hours). No burden is calculated for the remaining visitors who completely refuse to participate.



  • Mail-back Survey

Visitors completing the on-site survey will be asked if they would be willing to complete the follow-up survey. We expect that 40% of those accepting a survey packet (n=9,180) will complete and return the post-experience survey. The survey will take 20 minutes to complete (9,180 respondents x 20 minutes = 3,060 hours).

Table 5. Burden Estimates

                                  Completed     Completion Time*   Burden Hours
                                  Responses     (minutes)          (rounded up)
  On-site Survey*                    30,600            5               2,550
  On-site non-response survey         4,860            2                 162
  Mail-back Survey                    9,180           20               3,060
  Total burden requested
  under this ICR:                    44,640                            5,772

* Initial contact time is added to the time to complete the surveys
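The burden figures in Table 5 above can be reproduced directly from the completed-response counts and completion times; a minimal arithmetic check:

```python
# Reproduce the Table 5 burden hours (completed responses x minutes / 60).
instruments = {
    "On-site survey": (30_600, 5),
    "On-site non-response survey": (4_860, 2),
    "Mail-back survey": (9_180, 20),
}

burden_hours = {name: responses * minutes / 60
                for name, (responses, minutes) in instruments.items()}
total_responses = sum(r for r, _ in instruments.values())   # 44,640
total_hours = sum(burden_hours.values())                    # 5,772
print(burden_hours, total_responses, total_hours)
```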

REPORTING PLAN:

NPS personnel will be updated throughout the process as the need arises. A final technical report will be delivered to NPS managers in the Natural Resource Data Series format, or another desired format, developed collaboratively to address the study purpose and identify key findings for management and planning needs. Presentations and participation in workshops will also be provided at the request of NPS colleagues.








NOTICES

Privacy Act Statement


General: This information is provided pursuant to Public Law 93-579 (Privacy Act of 1974), December 31, 1974, for individuals completing this form.


Authority: National Park Service Research mandate (54 USC 100702)


Purpose and Uses: This information will be used by the NPS Information Collections Coordinator to ensure appropriate documentation of information collections conducted in areas managed by, or sponsored by, the National Park Service.


Effects of Nondisclosure: Providing the information is mandatory in order to submit Information Collection Requests through the Programmatic Review Process.


Paperwork Reduction Act Statement


We are collecting this information subject to the Paperwork Reduction Act (44 U.S.C. 3501), as authorized by the National Park Service Research mandate (54 USC 100702). This information will be used by the NPS Information Collections Coordinator to ensure appropriate documentation of information collections conducted in areas managed by, or sponsored by, the National Park Service. All parts of the form must be completed in order for your request to be considered. We may not conduct or sponsor, and you are not required to respond to, this or any other Federal agency-sponsored information collection unless it displays a currently valid OMB control number. OMB has reviewed and approved the National Park Service Programmatic Review Process and assigned OMB Control Number 1024-0224.


Estimated Burden Statement


Public reporting burden for this form is estimated to average 60 minutes per collection, including the time for reviewing instructions, gathering information, and completing and reviewing the form. This time does not include the editorial time required to finalize the submission. Comments regarding this burden estimate or any aspect of this form should be sent to the Information Collection Clearance Coordinator, National Park Service, 1201 Oakridge Dr., Fort Collins, CO 80525.


¹ Haefele, M., Loomis, J. B., & Bilmes, L. (2016). Total Economic Valuation of the National Park Service Lands and Programs: Results of a Survey of the American Public. HKS Working Paper No. 16-024. Available at SSRN: https://ssrn.com/abstract=2821124 or http://dx.doi.org/10.2139/ssrn.2821124

RECORDS RETENTION - PERMANENT. Transfer all permanent records to NARA 15 years after closure. (NPS Records Schedule, Resource Management And Lands (Item 1.A.2) (N1-79-08-1)).
