Supporting Statement A
National Park Service Recreation Fee Pricing Study
Survey Pre-Test and Pilot
OMB Control Number 1024-NEW
Terms of Clearance: This is a new collection
Justification
1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection.
The National Park Service (NPS) is requesting approval to conduct a pre-test and pilot of a national household survey designed to obtain information about U.S. residents’ potential behavioral responses to changes in the level and structure of entrance fees at national parks. The pre-test and pilot will inform the design and administration of a future survey of recent and potential park visitors that is intended to help determine revenue and access implications of different entrance fee rates and collection models. The entrance fee options evaluated will include: (1) increases in entrance fees, (2) different fee structures, and (3) technology-based solutions for collecting entrance fees to make the entrance process more convenient and efficient for visitors.
Following the most recent price update, Congress requested that the NPS provide data that could inform future entrance fee reviews. However, the available data are deficient. There have been very few reviews of entrance fees at national parks, and the NPS does not have any comprehensive information on how visitors might respond to changes in entrance fees, fee structures, and technology-based solutions for collecting entrance fees.
The Federal Lands Recreation Enhancement Act (FLREA), 16 U.S.C. §§ 6801-6814, Public Law 108-447, authorizes the NPS and four other agencies to collect and retain recreation fees, including entrance fees and amenity fees for certain facilities, equipment, and services (such as campgrounds). Around two-thirds of all FLREA revenues are collected at NPS sites (U.S. Department of the Interior and Department of Agriculture, 2015, p. 9). FLREA requires that fee revenue be used to enhance the visitor experience, with at least 80 percent of the money staying at the particular park where it was collected, and the other 20 percent used at parks that do not collect fees. This revenue funds thousands of projects that benefit the visitor experience. Entrance fees collected by the NPS totaled $249 million in fiscal year 2020.
In 2006, the NPS instituted a standard entrance fee model/pricing structure, grouping park units into four categories in an effort to simplify and standardize various entrance fees (individual, vehicle, motorcycle, and annual park pass) across similar types of parks. A moratorium on fee increases was put in place by the NPS Director in 2008 and remained in effect until 2014, at which point the NPS updated its entrance fee pricing structure for the first time since 2006. Proposed fee changes at an NPS site are subject to public review and comment through the civic engagement process to help determine when a park should adopt the recommended pricing structure.
In 2015, the Department of the Interior’s Office of the Inspector General (OIG) published a review of NPS’ recreation fee program. One of the recommendations that remained unresolved was for the NPS to reevaluate the current entrance fee model to determine necessary updates and establish intervals for periodic reviews to ensure that the fee model remains up to date (OIG, 2015). In a separate review, the Government Accountability Office (GAO) reaffirmed OIG’s recommendation, noting that while NPS guidance on recreation fees directs the agency to set fees at a reasonable level, it does not call for periodic reviews of these fees. GAO’s guide on federal user fees states that federal agencies should regularly review fees and make changes if warranted. If federal user fees are not reviewed and adjusted regularly, federal agencies run the risk of undercharging or overcharging users. In addition, GAO noted that because park units are not required to provide information on decisions not to change their fees or deviate from the fee schedule, the NPS is missing opportunities to ensure that entrance fees are reasonable. The report recommended that the NPS revise its guidance to periodically review entrance fees and direct park units to provide information on their decisions to not increase fees (GAO, 2015).
In 2017, the NPS proposed changes in entrance fee prices and structures, including higher fees during the five-month peak season at 17 of the most visited national parks. After receiving more than 100,000 public comments, the majority of which were in opposition to the proposed changes, the NPS implemented a 10-percent across-the-board price increase for daily entrance fees and park-specific annual passes, as well as a reduction in fee-free days.
Collecting information directly from the public about their attitudes and behavioral responses will help the NPS make better decisions about entrance fee pricing, balance visitor access and revenue effects, and enable more efficient long-term management decisions and improved visitor service. The NPS committed to Congress that it would undertake a study of this type.
Legal Justification:
The Federal Lands Recreation Enhancement Act of 2004 (FLREA; 16 U.S.C. §§ 6801-6814) - authorizes the NPS to collect and retain recreation fees, including entrance fees and amenity fees for certain facilities, equipment, and services (such as campgrounds). FLREA also requires that the public have opportunities to participate in the development of, or changes to, a recreation fee established under FLREA.
Protection, interpretation, and research in System (54 U.S.C. § 100701) - Recognizing the ever-increasing societal pressures being placed upon America’s unique natural and cultural resources contained in the System, the Secretary shall continually improve the ability of the Service to provide state-of-the-art management, protection, and interpretation of, and research on, the resources of the System.
Research mandate (54 U.S.C. § 100702) - The Secretary shall ensure that management of System units is enhanced by the availability and utilization of a broad program of the highest quality science and information.
2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection. [Be specific. If this collection is a form or a questionnaire, every question needs to be justified.]
The NPS does not have any comprehensive information on visitor responsiveness to changes in entrance fee rates or collection models, or the effect of price changes on visitor access. The NPS has contracted Resource Systems Group (RSG), Inc., to conduct a pre-test and pilot of this survey instrument. The findings will be used to develop a full-scale general population survey to evaluate visitor perceptions and behavior under different entrance fee pricing models. The final version of the general population survey will be submitted to OMB as a separate ICR.
The components of this collection are described below:
Pre-test
The pre-test will be conducted to (1) validate the survey questions, (2) investigate various sampling methods, (3) estimate the respondent burden, and (4) determine the usability of the survey design. The pre-test survey will be administered to a small, national sample of U.S. households. Respondents will be asked whether they have visited one or more units of the National Park System within the last two years and whether they plan to visit one or more units over the next twelve months. If the respondent meets either criterion, they will be asked for detailed information about one of their previous or planned park visits, followed by a set of questions about entrance fees and fee structures, attitudinal questions regarding entrance fees, and finally, a set of demographic questions. The pre-test version of the survey will also ask the respondent for any comments or suggestions that might improve the wording, flow, or layout of the questionnaire.
Debriefing Interviews
At the end of the pre-test, respondents will be asked to provide a telephone number if they would be interested in participating in a post-survey debriefing interview. A sample of 20 respondents volunteering a telephone number will be contacted to participate. The debriefing interview will proceed sequentially through each section of the pre-test survey completed by the respondent, with the interviewer focusing the discussion on segments with anticipated higher response burden, such as the stated preference questions or the map-based park finder tool. Upon the conclusion of the pre-test, researchers will revise the survey to address any issues identified by respondents before administering the pilot test.
Pilot
The pilot survey will be administered to a random sample of U.S. households. The respondents will be asked if they have visited one or more units of the National Park System within the last two years and whether they plan to visit one or more units over the next twelve months. If the respondent meets either criterion, they will be asked about the following:
previous or planned park visits,
entrance fees and fee structures,
attitudes regarding entrance fees, and
demographics.
A non-response follow-up questionnaire will be administered to a sample of pilot non-respondents. The pilot responses will be analyzed to assess potential problems with survey questions, response options, stated preference (SP) attribute levels, and survey implementation logistics. The pilot results will also provide updated estimates of the response rate for the main survey, the response rate for the non-response follow-up survey, and the completion rate for the SP questions.
Justification of Pre-test and Pilot Survey Questions
Section 1: Past Park Unit Visitation - The questions in this section will be tested to determine if they can adequately capture the number of times the respondent visited one or more units of the National Park System in the past two years and which specific park units they visited. If the respondent has visited a park in the last two years, they will be asked a series of questions about one of these visits, including the season, the mode of entry, the duration of stay, and the type and amount of entrance fee paid. This information is necessary to determine whether the respondent meets the criteria to respond to the SP questions in Section 4. The SP questions will go further to ask recent park visitors about their preferences for different types of entrance fee passes (daily individual, weekly individual, daily vehicle pass, weekly vehicle pass, etc.) and how responsive they would be to changes in the price of entrance fees.
Section 2: Future Park Unit Visitation - Respondents who have not made a trip to a national park in the previous two years will be asked about any future plans to visit one or more national parks. The questions in this section will be tested to determine if they can adequately capture information about the number of times the respondent plans to visit one or more units in the next 12 months, and which specific park units they plan to visit. If one or more visits are planned, the respondent will be asked a series of questions about the expected season, mode of entry, duration of stay, and expected type and amount of entrance fee that will be paid. Again, this information is necessary to determine whether the respondent meets the criteria to respond to the SP questions in Section 4. The SP questions will go further by asking future park visitors about their preferences for different types of entrance fee passes and how responsive they would be to changes in the price of entrance fees.
Section 3: No Past or Future Visitation - Respondents that have not visited a park unit in the last two years and do not plan to visit a park unit in the next 12 months will be presented with a list of possible reasons that may apply. These respondents will not be shown the stated preference exercises.
Section 4: Stated Preference Exercises - Respondents who have visited a park unit within the last two years or plan to visit a park unit in the next 12 months will be presented with a series of SP questions about their most recent visit (or planned visit) to a particular park unit. They will be told that the NPS is considering changes to entrance fees and annual pass policies. The respondent will be asked how they would choose to visit that particular park unit if different pass options and entrance fees were in place. The options will differ in terms of the type of pass they are asked about (e.g., daily individual pass, weekly individual pass, daily vehicle pass, weekly vehicle pass, annual vehicle pass), the season (peak vs. off-peak), the purchase location (park entrance vs. online), and the cost. The respondent will be asked to report the number of days they would visit the park under the new pricing structure, with the option of not visiting at all. This information is a critical component of the survey because it constitutes the core SP choice experiment questions. The responses will be used to evaluate the adequacy of wording and attribute levels, and to determine if the information can be used to measure the price elasticity of demand for park visits and ultimately the potential change in revenues resulting from the implementation of alternative entrance fee prices and structures.
Section 5: Fees and Payment Attitudes - Respondents will be asked a series of questions about fees and payments, such as: (1) which park passes they were aware of, (2) which passes apply to them, and (3) their opinions about different entrance fee structures. Responses to these questions will be evaluated to test whether this information can be used to better understand respondents' attitudes towards different fee structures and their opinions regarding increasing revenues at national parks. This section will be shown to all respondents.
Section 6: Demographics - The questions in this section will be used to obtain basic demographic characteristics, including household size, gender, age, education level, ethnicity, race, employment status, and income. Demographics will be used to identify segments of the population that are more or less responsive to changes in entrance fees and/or fee structures. The respondent characteristics will be compared with those from other NPS and general population surveys. This section will be shown to all respondents.
Pilot Non-Response Follow-Up Survey Questions
The follow-up survey will contain a subset of questions from the pilot survey selected to characterize non-respondents with respect to park visitation history, attitudes toward the Park Service, and demographic characteristics.
3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden and specifically how this collection meets Government Paperwork Elimination Act (GPEA) requirements.
We expect that at least 99% of the sample will use the electronic submission method. We will use a web-based survey instrument for the pre-test and pilot versions of the survey. Respondents will be selected through address-based sampling and receive invitations via mail to participate online. This approach meets Government Paperwork Elimination Act (GPEA) requirements to provide an option to submit information electronically to Federal agencies.
A toll-free number will be provided, and telephone interviews will be offered as an option to ensure that households without internet access have an opportunity to participate in the survey. The interviewer will guide the respondent through the non-SP survey questions. Due to their complexity, the SP questions will be printed and mailed to the respondent. The respondent will then call the toll-free number to provide responses to these SP questions over the phone. Based on past experience with this method, we anticipate that fewer than 1% of respondents will complete the survey via telephone (Fowler et al., 2018).
4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.
There is no known duplication of effort. An exhaustive review of previous relevant efforts revealed the following:
In 2000, 2008, and 2018, the NPS conducted the Comprehensive Survey of the American Public (OMB Control Number 1024-0254). This national household survey included a few questions that provided minimal information regarding the general public's opinions toward entrance fees. However, these questions did not collect the data necessary to quantify how park visitors would alter their visitation when presented with changes in entrance fees or fee structures.
In 2006, a national telephone survey was conducted to assist in the pricing of a new Interagency Annual Pass. This study utilized an SP method (contingent valuation) to determine the price elasticity of demand for this new pass, and the effects of different potential prices on agency revenues (Aadland et al., 2012). However, this study was specific to the Interagency Annual Pass and did not provide any information on park-specific entrance fees.
In 2017, an NPS internal review of existing entrance fee data (prices and revenues) was conducted to determine how park visitation and fee revenues might be expected to change under different entrance fees and fee structures. Unfortunately, empirical evidence is limited because entrance fees at national parks have varied little since 2006; from 2008 to 2014 the NPS held prices constant. Price increases since 2014 have been confounded by large increases in visitation associated with the NPS Centennial and increased marketing by partners.
Given the available data and existing studies, nothing can currently be said about expected visitor responses to significant price changes or entirely new pricing structures. As such, there is no known duplication of effort.
5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.
This information collection will not impact small businesses or other small entities. It will only be sent to individual households.
6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.
The NPS committed to Congress that it would undertake a study of fees and pricing in response to criticisms that the 2017 seasonal fee proposal was made using limited data. The NPS still does not have current or comprehensive information on how visitation will change in response to adjustments in entrance fees or fee structures. As a result, the NPS is unable to accurately estimate the revenue or equity implications of these actions when evaluating alternatives to address budgetary issues, as well as broader benefits and costs from a public policy perspective. This study will provide the information necessary to support data-driven pricing and fee structure decisions going forward.
7. Explain any special circumstances that would cause an information collection to be conducted in a manner:
* requiring respondents to report information to the agency more often than quarterly;
* requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;
* requiring respondents to submit more than an original and two copies of any document;
* requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;
* in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;
* requiring the use of a statistical data classification that has not been reviewed and approved by OMB;
* that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or
* requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.
No special circumstances apply to this information collection.
8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice [and in response to the PRA statement associated with the collection over the past three years] and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.
Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.
Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years — even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.
On August 2, 2021, we published a Federal Register Notice (86 FR 16411) informing the public of our intent to request OMB approval for a new information collection. We solicited comments for a period of 60 days ending on October 1, 2021. We did not receive any comments from the public. In addition to the 60-day FRN, the research team contacted three reviewers external to NPS with relevant expertise to evaluate the questionnaire and sampling plan for methodological soundness, clarity, and content.
The reviewer’s comments and the study team’s responses are provided as a supplemental document in ROCIS. Based on the peer review process, the survey team updated the research design and questionnaire where appropriate.
Table 8.1. Persons consulted outside the agency

| Reviewer | Reviewer Title | Affiliation |
| Reviewer #1 | Senior Economist | University of Montana |
| Reviewer #2 | Associate Professor | University of Wyoming |
| Reviewer #3 | Research Economist | U.S. Forest Service |
9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.
We are requesting to include a modest monetary incentive ($2) with the invitation letter to the pre-test and pilot surveys, and a $5 incentive to accompany the request for the non-response follow-up survey. The higher incentive for the non-response survey is expected to improve cooperation with a more reticent subset of the sampled population.
Past research demonstrates the effectiveness of monetary incentives in increasing survey response rates (e.g., Church, 1993; Jobber et al., 2004; Edwards et al., 2005). This can be especially important in general population surveys, which tend to see lower response rates than surveys with sample frames consisting of on-site users or user lists. A monetary incentive can encourage survey responses over a wide range of population demographics and may help reduce selection bias. Both Messer and Dillman (2010) and Stratus Consulting (2015) included non-contingent incentives in general population surveys that used address-based sampling and online questionnaire designs, with response rates ranging between 23 and 31%. In addition, a national general population mail survey of comparable complexity conducted by NPS (Visibility Valuation Study - OMB Control Number 1024-0255) used a $2 non-contingent incentive and received an overall response rate of 33%. Based on this accumulated research and experience, we expect the proposed incentives to improve response rates for the pre-test, pilot, and non-response follow-up surveys.
10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.
Respondent names and addresses will not appear in any reports or findings. All responses will be anonymous: each respondent's name and address will be assigned an ID number that will be used to track survey responses, and contact information will be retained only for the purpose of follow-up contact with non-respondents.
11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.
No questions of a sensitive or personal nature will be asked.
12. Provide estimates of the hour burden of the collection of information. The statement should:
* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.
* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.
* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 14.
We estimate that we will receive 641 completed responses (115 pre-test; 16 debriefing interviews; 460 pilot; and 50 non-response follow-up) totaling 275 annual burden hours. We estimate the dollar value of the burden hours is $10,700.
Burden estimates are based upon the time required for respondents to read all correspondence and instructions and to complete the surveys. This collection will proceed in four distinct phases:
Pre-test: 500 residential addresses listed in the United States Postal Service's Delivery Sequence File (DSF) will be used to establish the sample for the pre-test. We expect 8% of those addresses (n=40) to be invalid. From the remaining valid addresses (n=460), we anticipate a 25% response rate, resulting in approximately 115 completed responses. We expect that respondents will spend 30 minutes, on average, completing the survey if they have made a recent visit or plan to make a visit to a national park. For respondents who have no past or future visitation, we estimate they will require 10 minutes to complete the survey. This pre-test will be used to evaluate completion time and solicit comments or suggestions that might improve the wording, flow, or layout of the questionnaire. Respondents will be asked to provide a telephone number and name that will establish the sample for the pre-test debrief.
Pre-test Debrief: From the respondents providing telephone numbers, we will contact 20 respondents in the order they completed the survey to participate in a pre-test debriefing interview and 16 of those contacts will complete the interview. During the debriefing interview, we will proceed sequentially through each section of the survey and ask respondents for comments on question wording, flow, and layout. Respondents who completed the SP section of the survey will be prioritized for these debriefing calls. We expect these interviews will take 30 minutes, on average, to complete.
Pilot: 2,000 additional households, drawn from the same DSF frame, will be contacted to complete the pilot survey. We expect 8% of those addresses (n=160) to be invalid. From the remaining valid addresses (n=1,840), we anticipate a 25% response rate, resulting in approximately 460 completed survey responses. We expect that it will take approximately 30 minutes to complete the survey for past and future visitors, and 10 minutes for respondents with no past or future visitation.
Non-Response Follow-Up Survey: 250 randomly selected non-respondents from the pilot will be contacted to complete a brief mail follow-up survey. Assuming a 20% response rate, this implies 50 respondents. We expect respondents will spend no more than 5 minutes completing the follow-up survey.
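The sample-size arithmetic behind the four phases above can be reproduced with a short script (an illustrative sketch only; the function name and rounding convention are ours, not part of the collection design):

```python
def expected_completes(addresses, invalid_rate, response_rate):
    """Expected completed responses from a mailed sample."""
    valid = addresses * (1 - invalid_rate)  # deliverable addresses
    return round(valid * response_rate)

# Pre-test: 500 addresses, 8% invalid, 25% response rate -> 115
pretest = expected_completes(500, 0.08, 0.25)
# Pilot: 2,000 addresses, same invalid and response rates -> 460
pilot = expected_completes(2000, 0.08, 0.25)
# Non-response follow-up: 250 contacts, no invalids, 20% response rate -> 50
followup = expected_completes(250, 0.0, 0.20)

print(pretest, pilot, followup)  # 115 460 50
```

Together with the 16 debriefing interviews, these counts sum to the 641 completed responses cited above.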
We estimate the total burden of this collection to be 275 hours (Table 12.1).
Table 12.1: Total Estimated Burden
| Respondents | Number of Contacts | Number of Completed Responses | Estimated Completion Time (minutes) | Total Burden (hours) |
| Pre-test | | | | |
| Past Visitors | 220 | 50 | 30 | 25 |
| Future Visitors | 220 | 50 | 30 | 25 |
| No Past or Future Visitation | 60 | 15 | 10 | 3 |
| Subtotal | 500 | 115 | | 53 |
| Debriefing Interview | 20 | 16 | 30 | 8 |
| Pilot | | | | |
| Past Visitors | 880 | 200 | 30 | 100 |
| Future Visitors | 880 | 200 | 30 | 100 |
| No Past or Future Visitation | 240 | 60 | 10 | 10 |
| Subtotal | 2,000 | 460 | | 210 |
| Nonresponse Survey | 250 | 50 | 5 | 4 |
| TOTAL | 2,770 | 641 | | 275 |
We estimate the annual dollar value of the burden hours associated with this information collection request is $10,700.
We used the Bureau of Labor Statistics (BLS) News Release USDL-22-0105, January 28, 2022, Employer Costs for Employee Compensation—December 2021, to calculate the cost of the total annual burden hours. Table 12.2 lists the hourly rate for all civilian workers as $38.91, including benefits.
Table 12.2: Estimated Dollar Value of Burden Hours
| Respondents | Number of Completed Responses | Total Burden Hours | Dollar Value of Burden Hours (Including Benefits) | Total Dollar Value of Burden Hours |
| Pre-test | 115 | 53 | $38.91 | $2,062 |
| Debriefing Interview | 16 | 8 | $38.91 | $311 |
| Pilot | 460 | 210 | $38.91 | $8,171 |
| Nonresponse Follow-Up | 50 | 4 | $38.91 | $156 |
| Total | 641 | 275 | | $10,700 |
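The dollar values in Table 12.2 follow directly from multiplying each row's burden hours by the $38.91 fully loaded hourly rate. A minimal sketch of the arithmetic (variable names are illustrative):

```python
HOURLY_RATE = 38.91  # BLS all-civilian-workers rate, including benefits

burden_hours = {
    "Pre-test": 53,
    "Debriefing Interview": 8,
    "Pilot": 210,
    "Nonresponse Follow-Up": 4,
}

# Row dollar values, rounded to the nearest dollar
dollar_values = {row: round(h * HOURLY_RATE) for row, h in burden_hours.items()}

total_hours = sum(burden_hours.values())        # 275
total_value = round(total_hours * HOURLY_RATE)  # 10700

print(dollar_values, total_hours, total_value)
```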
13. Provide an estimate of the total annual [non-hour] cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14).
* The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information [including filing fees paid]. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.
* If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.
* Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.
There are no non-hour burden costs resulting from the collection of this information.
14. Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information.
The total annual cost to the National Park Service to administer this information collection is $232,920. This includes the cost of salaries and benefits for the federal employee administering this information collection ($6,754). To determine average hourly rates, we used Office of Personnel Management Salary Table 2022-RUS for a GS-13, step 6. We used the Bureau of Labor Statistics (BLS) News Release USDL-22-0105, January 28, 2022, Employer Costs for Employee Compensation—December 2021, to calculate the most current benefits rates for government employees and multiplied the hourly rate by 1.6 to obtain a fully burdened rate of $84.42 (Table 14.1).
Table 14.1 Annualized Cost to the Federal Government

Position | GS Level | Hourly Rate* | Hourly Rate incl. Benefits (×1.6) | Estimated Hours Spent on Collection Work | Total Cost Including Benefits
NPS Economist | 13/6 | $52.76 | $84.42 | 80 | $6,754
The operational expenses (Table 14.2) for this collection are approximately $226,166 which includes contracted services through RSG, Inc. ($213,641) and other operational expenses ($12,525).
Table 14.2 Operational Expenses

Support Staff | Estimated Costs
Subject Matter Expert – Modeling | $17,792
Senior Statistician | $25,683
Senior Economist 4 | $23,200
Senior Economist 3 | $14,717
Senior Social Scientist 3 | $31,116
Technical Specialist – Survey Programming | $35,078
Senior Social Scientist 2 | $5,227
Social Scientist 1 | $34,623
Staff Economist 2 | $26,205
Subtotal | $213,641

Other Expenses | Estimated Costs
Printing and Postage | $5,325
DSS Mailing List (Purchased Address Sample) | $950
Incentives: Pre-test (500 × $2 = $1,000); Pilot (2,000 × $2 = $4,000); Nonresponse Follow-Up (250 × $5 = $1,250) | $6,250
Subtotal | $12,525
TOTAL | $226,166
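The figures in Tables 14.1 and 14.2 reconcile to the $232,920 total reported in the response to question 14. A minimal sketch of that arithmetic, treating the 80 in Table 14.1 as hours worked on the collection (the interpretation that reproduces the $6,754 staff cost):

```python
# Reconcile the annualized cost to the Federal Government (question 14)
hourly_rate = 52.76                            # OPM GS-13, step 6, RUS hourly rate
burdened_rate = round(hourly_rate * 1.6, 2)    # 1.6 benefits multiplier -> 84.42
staff_cost = round(burdened_rate * 80)         # 80 hours -> 6,754

contracted = 213_641                 # RSG, Inc. subtotal (Table 14.2)
other = 5_325 + 950 + 6_250         # printing/postage, address sample, incentives
operational = contracted + other    # 226,166
total = staff_cost + operational    # 232,920
```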
15. Explain the reasons for any program changes or adjustments in hour or cost burden.
This is a new collection.
16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.
There are no plans to publish the results at the end of this collection. The results will be summarized for internal review by the study team and NPS to inform subsequent phases of the study. Data tabulation will include response frequencies and measures of central tendency, as appropriate. Section B.2 describes the analytical framework that will be used to summarize results and estimate the price elasticity of demand from the stated-preference (SP) questions.
Pre-test data collection is estimated to occur between May 1 and June 15, 2022, with pilot data collection following between July 1 and August 31. Data analysis and reporting will be completed by the fall of 2022.
17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.
We will display the OMB control number and expiration date on the information collection instruments.
18. Explain each exception to the certification statement.
There are no exceptions to the certification statement.
References Cited
Aadland, D., B. Anatchkova, B.D. Grandjean, J.F. Shogren, B. Simon, and P.A. Taylor. 2012. Valuing Access to U.S. Public Lands: A Pricing Experiment to Inform Federal Policy. Social Science Quarterly 93(1): 248-269.
Church, A. 1993. Estimating the Effect of Incentives on Mail Survey Response Rates: A Meta-Analysis. Public Opinion Quarterly 57: 62-79.
Edwards, P., R. Cooper, I. Roberts, and C. Frost. 2005. Meta-analysis of randomized trials of monetary incentives and response to mailed questionnaires. Journal of Epidemiological Community Health 59: 987-999.
Fowler, M., T. Cherry, T. Adler, M. Bradley, and A. Richard (RSG). 2018. 2015-2017 California Vehicle Survey. California Energy Commission. Publication Number: CEC-200-2018-006.
GAO. Government Accountability Office. 2015. National Park Service: Revenues from Fees and Donations Increased, but Some Enhancements Are Needed to Continue This Trend. Report to Congressional Requesters. GAO-16-166. Retrieved from https://www.gao.gov/assets/680/674187.pdf.
Jobber, D., J. Saunders, V-W. Mitchell. 2004. Prepaid monetary incentive effects on mail survey response. Journal of Business Research 57: 347-350.
Messer, B. and D. Dillman. 2010. Using Address-Based Sampling to Survey the General Public by Mail vs. 'Web plus mail'. Technical Report 10-13, Pullman WA: Social and Economic Science Research Center.
OIG. Department of the Interior Office of the Inspector General. 2015. Review of National Park Service's Recreation Fee Program. Report No. C-IN-NPS-0012-2013. Retrieved from https://www.doioig.gov/sites/default/files/2021-migration/CINNPS00122013Public.pdf.
Stratus Consulting. 2015. Economic Valuation of Restoration Actions for Salmon and Forests and Associated Wildlife in and along the Elwha River. Prepared for Adam Domanski and Giselle Samonte, U.S. Department of Commerce, National Oceanic and Atmospheric Administration. September 14.
U.S. Department of the Interior and Department of Agriculture. 2015. Implementation of the Federal Lands Recreation Enhancement Act: Triennial Report to Congress. Retrieved from usda.gov (2015-fourthflrea-triennial.pdf).