Supporting Statement A
Glen Canyon Survey
OMB Control Number 1024-0270
Terms of Clearance: The pilot study (Phase 1) is approved. The National Park Service must submit the results of the pilot study and an explanation of changes to the survey due to the results with the information collection request for the full study.
General Instructions
A completed Supporting Statement A must accompany each request for approval of a collection of information. The Supporting Statement must be prepared in the format described below, and must contain the information specified below. If an item is not applicable, provide a brief explanation. When the question “Does this ICR contain surveys, censuses, or employ statistical methods?” is checked "Yes," then a Supporting Statement B must be completed. OMB reserves the right to require the submission of additional information with respect to any request for approval.
Justification
1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection.
On September 18, 2014, the National Park Service received a Notice of Action (NOA) from the Office of Management and Budget approving a pilot survey to collect information concerning the economic value of National Park System resources within the Glen Canyon Dam region of the Colorado River Corridor (which includes the Glen Canyon Dam and Grand Canyon National Park). Following this approval, the pretest was conducted in November and December 2014. The primary goal of the pretest was to determine whether the survey instrument and the sampling methods performed as anticipated. The pretest report (Glen Canyon Passive Use Survey: Pretest Report, January 2015) is included in ROCIS as a supplementary document to this ICR. The pretest results suggested that the survey and sampling methods will provide the level of detail and data needed to serve as one piece of information that the Secretary of the Interior will use to evaluate future dam operation plans associated with the ongoing Glen Canyon Draft Environmental Impact Statement (DEIS). Specifically, the pretest responses showed that the numerical range of the key willingness-to-pay (WTP) parameter in the conjoint valuation question sufficiently captured the stated range of respondent WTP. The pretest responses also showed no problems with respondent comprehension and interpretation of the survey questions.
Following the completion of the pretest, we are now requesting approval of a revised version of the instrument that will be used to survey a broader respondent universe. We anticipate that the results could be used to inform the development of an operating plan for the Glen Canyon Dam in accordance with the Grand Canyon Protection Act (GCPA).
The fundamental basis of this request is the need to supplement the currently available information, which comes from studies that are now extremely outdated. This collection will take into account all relevant case studies related to this issue; however, it will closely parallel the work of Welsh et al. (1995) by using stated choice methods to measure the changes in value associated with the impacts on riparian areas caused by changes in operations of Glen Canyon Dam.
The need for this collection is well established within the NPS, the US Bureau of Reclamation, and other agencies responsible for the management of the Colorado River System. The determination of the time requirements and the need for data collection are based on broad input and evidence from management issues and studies related to Glen Canyon Dam and the flows of the Colorado River through the Grand Canyon.
Legal Justification:
The Grand Canyon Protection Act of 1992 (Pub. L. 102-575, Section 1804(c)) directs that the Secretary shall operate Glen Canyon Dam in accordance with the additional criteria and operating plans specified in section 1804 and exercise other authorities under existing law in such a manner as to protect, mitigate adverse impacts to, and improve the values for which Grand Canyon National Park and Glen Canyon National Recreation Area were established, including, but not limited to, natural and cultural resources and visitor use.
The Colorado River Storage Project Act of 1956 (43 U.S.C. 620–620o) authorizes and directs the Secretary to investigate, plan, construct, and operate facilities to mitigate losses of, and improve conditions for, fish and wildlife and public recreational facilities.
The National Park Service Organic Act of 1916 (39 Stat. 535; 16 U.S.C. 1 et seq.) requires that the National Park Service (NPS) preserve the national parks for the use and enjoyment of present and future generations. At the field level, this means resource preservation, public education, facility maintenance and operation, and physical developments that are necessary for public use, health, and safety.
The National Environmental Policy Act of 1969 (42 U.S.C. 4321 et seq.) requires visitor use data in impact assessments of development on users and resources as part of each park's general management plan.
2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection. [Be specific. If this collection is a form or a questionnaire, every question needs to be justified.]
The original Terms of Clearance for this collection required that we conduct a pilot study (Phase 1). The results were used to evaluate the overall performance of the survey instrument and the sampling methods, determine whether the levels for the conjoint questions worked as expected, make any necessary modifications to the final version of the questionnaire, and predict a possible response rate.
The results from the pilot test are summarized below:
A total of 225 U.S. household addresses were selected for the pretest sample. Of the 225 surveys mailed, 23 were returned as undeliverable and 49 completed surveys were returned, for an overall response rate of 24% (49 of the 202 deliverable surveys).
The results from the pretest suggest that most respondents understood the questions, followed the instructions, and had adequate information to answer the stated-preference conjoint questions.
Overall, 39.7% of the respondents voted in favor of the action plans presented, but, as expected, this percentage was lower when the cost of the plan (bid amount) was higher.
While the pretest sample size was too small to estimate any meaningful discrete choice model parameters, respondents reacted to the key choice attribute (cost) as predicted by theory (a downward-sloping demand curve).
Responses across the range of high and low bids presented suggest that no change to the overall bid range is needed in the final survey instrument to capture the essential bid response distribution.
We expect that similar results from the full survey will provide managers with information that could be used to determine options for experimental management actions that meet the requirements of the Grand Canyon Protection Act and minimize impacts to resources.
Phase 2 of this collection will use the modified version of the mail-back survey to collect information from households selected from two sample frames: (1) national (all US households) and (2) the Glen Canyon Region (the Western Area Power Administration (WAPA) marketing area and all households served by utilities powered by the Glen Canyon Dam). The justification for the survey questions is described in the box below.
Justification of Survey Questions
Section 1 - The questions in this section will be used to check respondents’ understanding of, and familiarity with, the condition and management of the resources along the Colorado River Corridor.
Section 2 – The questions in this section constitute the core discrete choice experiment (DCE) valuation questions. Specifically, the section
Includes two stated choice conjoint questions. Each choice question presents one action scenario and the no-action scenario (or opt-out).
For each choice question, outlines the key differences between the two policies and describes the resources affected by those policies.
Identifies the respondent’s voting preference between Proposed Plan A and the Existing Management Plan, and between Proposed Plan B and the Existing Management Plan.
Section 3 - The questions in this section will be used to better understand the respondents’ voting preferences for the proposed management options.
Section 4 - The questions in this section identify the respondents’ understanding of, and opinions about, the future operation of Glen Canyon Dam.
Section 5 - The question in this section will be used to determine the respondents’ understanding of cultural, natural, and environmental resources.
Section 6 - The questions in this section will be used to understand more about the respondents’ attitudes toward, and use of, the National Park System.
Section 7 - The questions in this section will be used to obtain demographic characteristics (education level, income category, age, gender, race/ethnicity, urban/rural residence).
The final phase (Phase 3) will be a short non-response telephone survey that will test for any non-response bias.
A final report of the results, including data collection methods, response rates, analysis, and non-response corrections if needed, will be presented to the National Park Service.
3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.
No automated or electronic techniques will be used.
4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.
There is no known duplication with studies currently underway. A thorough review of previous survey efforts in the park units, as well as a comprehensive analysis of existing data gaps, was conducted in order to narrowly define the scope of any new information needed to satisfy the current management objectives. This review found that Welsh et al. (1995) was the most recent study addressing this topic. Therefore, up-to-date information on the economic value of the Glen Canyon NPS resources is needed.
5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.
This information collection will not impact small businesses or other small entities.
6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.
The results of this survey will provide one piece of information that the Secretary of the Interior will use to evaluate plans associated with the ongoing Glen Canyon Environmental Impact Statement (EIS) process. If this collection is not conducted, the NPS would be forced to rely on outdated sources of information, thereby potentially compromising the accuracy and reliability of future policies and recommendations.
7. Explain any special circumstances that would cause an information collection to be conducted in a manner:
* requiring respondents to report information to the agency more often than quarterly;
* requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;
* requiring respondents to submit more than an original and two copies of any document;
* requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;
* in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;
* requiring the use of a statistical data classification that has not been reviewed and approved by OMB;
* that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or
* requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.
No special circumstances apply to this information collection.
8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice [and in response to the PRA statement associated with the collection over the past three years] and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.
Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.
Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years — even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.
A Federal Register notice requesting comments was published on September 23, 2013 (78 FR 58344). The notice announced that we were preparing an information collection to be submitted to OMB for approval. In that notice we solicited public comments for 60 days, ending November 23, 2013. We received three requests to review the survey instruments. In response, we provided a summary of the study purpose and design and informed the requesters that the final versions of the survey would be available for review once the request was submitted to OMB.
On September 18, 2014, OMB approved a pretest version of the survey to be conducted to inform the administration of the full and final survey. During November and December 2014, we conducted the pretest using the sampling design described in Part B. All participants were asked to provide feedback at the end of the survey concerning the structure of the survey questions. Two of the feedback statements dealt directly with comprehension: about 20% of respondents said that the descriptions of the plans were hard to understand, while 13% said that the survey provided enough information to make a choice between the options given. A key focus of the pretest was the understandability and effectiveness of the conjoint question section in conveying information and eliciting consistent, meaningful responses.
While the pretest sample size was too small to estimate any meaningful discrete choice model parameters, respondents reacted to the key choice attribute (cost) as predicted by theory (a downward-sloping demand curve). Additionally, responses across the range of high and low bids presented suggest that no change to the overall bid range is needed in the final survey instrument to capture the essential bid response distribution.
We received seven comments specifically addressing the 30-day Federal Register notice of July 9, 2014 (79 FR 38946). The comments are attached in ROCIS as supplemental documents. In summary, the comments received from the organizations listed below primarily expressed overall objections to the study and questioned the overall utility of the collection. However, none of the letters identified specific changes or editorial corrections that could be made to the survey or the methodology. The NPS gave a presentation and addressed many questions regarding this survey and its methodology at the August 28, 2014 Glen Canyon Dam Adaptive Management Work Group (AMWG) meeting. The AMWG meets semi-annually and is attended by representatives of national, state, and tribal natural resource agencies, stakeholders, and environmental organizations, including the National Park Service. Economists from the NPS, Colorado State University, and Montana State University also provided updates and addressed additional questions during two AMWG stakeholder conference calls (November 13, 2014 and December 16, 2014). Through these efforts, the NPS has responded to the verbal and written comments provided by these agencies. A summary of the comments received from the following organizations is included below:
Colorado River Energy Distributors Assoc. (CREDA)
Comment: This collection is not necessary, will not have practical utility, and does not clearly meet the requirements of 5 CFR 1320. The public will have an opportunity to comment on actual alternatives in the public draft of the EIS. The survey alternatives do not accurately portray the LTEMP alternatives; therefore the ICR is unnecessary and misleading. The purpose and intent of the ICR need to be clarified; otherwise CREDA believes it is an unwarranted and unnecessary burden on respondents. ICR materials were not available until recently. The commitment to “include or summarize each comment in our request to OMB to approve this ICR” was not met. There are inaccurate and misleading references in the Authorizing Statute(s) information and in Supporting Document A.
NPS Response: In order to collect information from the public, we must be granted approval by the Office of Management and Budget to do so. In accordance with, and as required by, the Paperwork Reduction Act of 1995 (implemented at 5 CFR 1320), we submitted the required documentation to OMB and were granted approval to collect the information for the pilot study associated with this collection. We are again following the guidance provided by OMB to request approval to collect the information requested herein. For the conjoint analysis methodology, respondents are provided with information about the resource outcomes, not the alternatives. This methodology individually values the management outcomes, such as the conditions of river beaches, native fish populations, and trout populations. The outcome levels selected for the survey are set statistically to maximize estimation efficiency and are intended to represent the range of potential impacts. The values of the LTEMP alternatives will then be estimated by setting the individual outcome levels to match those of the respective alternatives and adding their indicated values together. The NPS presented and addressed these questions regarding the survey methodology at the August 28, 2014 AMWG meeting. The NPS also provided updates and addressed questions during the November 13, 2014 and December 16, 2014 AMWG stakeholder calls.
Southern Nevada Water Authority
Comment: The survey fails to adequately represent resource interactions, dam operations, and associated management actions. The survey over-emphasizes recreational values and underemphasizes values of other stakeholders. Results will misrepresent the value of important resources and provide false valuation of contemplated actions. Request that AMWG be given opportunity to discuss the survey’s details at their August 2014 meeting.
NPS Response: For the conjoint analysis methodology, respondents are provided with information about the resource outcomes, not the alternatives. This methodology values individually the management outcomes such as the conditions of river beaches, native fish populations, and trout populations. The outcome levels selected for the survey are set statistically to maximize estimation efficiency and are intended to represent the range of potential impacts. The values of LTEMP alternatives will then be estimated by setting individual outcome levels to match those of the respective alternatives and adding their indicated values together. The NPS presented and addressed these questions regarding the survey methodology at the August 28, 2014 AMWG meeting. The NPS also provided updates and addressed questions during the November 13, 2014 and December 16, 2014 AMWG stakeholder calls. Therefore, these comments have been addressed by the NPS.
Comment: The FRN lacks specific information that would aid the public in more fully understanding the purpose and need of the ICR. Unclear how any data and/or information collected via the ICR survey instruments would be used by the NPS. The CRB suggests that the appropriate venues for those activities should be through the AMWG and with the input of the LTEMP EIS co-lead agencies (i.e., Reclamation and NPS) and cooperating agencies. It is not clear that any information collected by the NPS would be of any practical value and contribute to the overall analysis of the six detailed and complex alternatives being evaluated through the LTEMP EIS process. The CRB suggests that both survey instruments significantly over-simplify and/or under-state the current state of scientific knowledge and uncertainty. As presently structured, the ICR is incomplete and potentially misleading. The CRB suggests that the most meaningful and appropriate venue in which to solicit public feedback is through the LTEMP EIS process.
NPS Response: The current 30 day FRN attempts to provide the clarity requested. The title has been changed to “Glen Canyon Passive Use Survey.” For the conjoint analysis methodology, respondents are provided with information about the resource outcomes, not the alternatives. This methodology values individually the management outcomes such as the conditions of river beaches, native fish populations, and trout populations. The outcome levels selected for the survey are set statistically to maximize estimation efficiency and are intended to represent the range of potential impacts. The values of LTEMP alternatives will then be estimated by setting individual outcome levels to match those of the respective alternatives and adding their indicated values together. The NPS presented and addressed questions regarding the survey methodology at the August 28, 2014 AMWG meeting. The NPS also provided updates and addressed questions during the November 13, 2014 and December 16, 2014 AMWG stakeholder calls. Therefore, these comments have been addressed by the NPS.
Arizona Department of Water Resources
Comment: The alternatives presented in the survey do not represent the range of alternatives in the EIS and would result in little or no practical utility. It would be more appropriate for the public to comment on actual alternatives in the public draft of the LTEMP EIS.
NPS Response: For the conjoint analysis methodology, respondents are provided with information about the resource outcomes, not the alternatives. This methodology values individually the management outcomes such as the conditions of river beaches, native fish populations, and trout populations. The outcome levels selected for the survey are set statistically to maximize estimation efficiency and are intended to represent the range of potential impacts. The values of LTEMP alternatives will then be estimated by setting individual outcome levels to match those of the respective alternatives and adding their indicated values together. The NPS presented and addressed questions regarding the survey methodology at the August 28, 2014 AMWG meeting. The NPS also provided updates and addressed questions during the November 13, 2014 and December 16, 2014 AMWG stakeholder calls. Therefore, these comments have been addressed by the NPS.
Western Area Power Administration
Comment: The FRN is insufficient to discern the utility of the IC; WAPA therefore recommends that NPS clarify the scope and purpose of the IC to allow parties to better understand its utility. The title of the IC is misleading. WAPA requested that NPS share the survey document and proposed that NPS integrate the collection of information through the survey, the economic analysis, and any analysis being conducted to inform the Secretary on alternative management options.
NPS Response: The current 30-day FRN attempts to provide the clarity requested. The title has been changed to “Glen Canyon Passive Use Survey.” All documents associated with this submission are posted on Reginfo.gov, as required by the Office of Management and Budget. The request for additional information in the 60-day Federal Register notice provided three separate addresses, one of which this letter was addressed to and received at. The Reginfo.gov website is displayed, as required, in the 30-day Federal Register notice of July 9, 2014 (79 FR 38946) for this request. A second 60-day notice was not required for the final survey because the request was made in the 60-day FRN published on September 23, 2013 (78 FR 58344) and closed on November 23, 2013. This study is only one of many being conducted to inform the Secretary on alternative LTEMP management options.
Irrigation & Electrical Districts Association of Arizona
Comment: Echoed comments from others. Concerned that documents were hidden or obscure and not easily available for review by the public and interested parties, and that the collection is so fatally flawed as to be beyond salvage. Asserted an improper use of federal funds for which there is no credible use in the upcoming EIS analysis.
NPS Response: All documents associated with this submission are posted on Reginfo.gov, as required by the Office of Management and Budget. The request for additional information in the 60-day Federal Register notice provided three separate addresses, one of which this letter was addressed to and received at. The Reginfo.gov website is displayed, as required, in the 30-day Federal Register notice of July 9, 2014 (79 FR 38946) for this request. A second 60-day notice was not required for the final survey because the request was made in the 60-day FRN published on September 23, 2013 (78 FR 58344) and closed on November 23, 2013.
The NPS presented and addressed questions regarding the survey methodology at the August 28, 2014 AMWG meeting. The NPS also provided updates and addressed questions during the November 13, 2014 and December 16, 2014 AMWG stakeholder calls. Therefore, these comments have been addressed by the NPS.
American Public Power Association
Comment: The collection is not necessary for proper performance of NPS functions as required by 5 CFR 1320 and will not have practical utility. Concerned by methodologies used and requested further examination of all aspects of this ICR including survey methodologies.
NPS Response: In order to collect information from the public, we must be granted approval by the Office of Management and Budget to do so. In accordance with, and as required by, the Paperwork Reduction Act of 1995 (implemented at 5 CFR 1320), we submitted the required documentation to OMB and were granted approval to collect the information for the pilot study associated with this collection. We are again following the guidance provided by OMB to request approval to collect the information requested herein. The NPS presented and addressed questions regarding the survey methodology at the August 28, 2014 AMWG meeting and provided updates and addressed questions during the November 13, 2014 and December 16, 2014 AMWG stakeholder calls. Therefore, these comments have been addressed by the NPS.
Each of the organizations above rejected the notion that this collection is needed. The NPS participated in a number of conference calls coordinated by these groups to address the concerns voiced in this correspondence. The Agency maintains that the basis for this collection is the need to update Welsh et al. (1995), the most recent study addressing this topic; up-to-date information on the economic value of NPS resources along the Colorado River is therefore overdue and necessary for NPS management needs.
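As background on the methodology described in the responses above, the minimal Python sketch below illustrates how individually valued outcomes could be combined to estimate the value of an alternative by matching its outcome levels and summing the indicated values. The attribute names, levels, and dollar amounts are hypothetical placeholders for illustration only; they are not survey levels or study results, and in the actual analysis the per-outcome values would be estimated from the discrete choice responses as described in Supporting Statement B.

# Illustrative only: hypothetical attributes, levels, and per-level household values.
# None of these numbers come from the survey; they simply show the aggregation step.
outcome_values = {
    "river_beaches": {"degraded": 0.0, "stable": 12.0, "improved": 20.0},
    "native_fish": {"declining": 0.0, "stable": 15.0, "increasing": 25.0},
    "trout_population": {"low": 0.0, "moderate": 8.0, "high": 10.0},
}

def alternative_value(levels_by_attribute):
    # Sum the per-outcome values that match the alternative's outcome levels.
    return sum(outcome_values[attr][level] for attr, level in levels_by_attribute.items())

# A hypothetical alternative described by the outcome levels it is expected to produce.
example_alternative = {"river_beaches": "improved", "native_fish": "stable", "trout_population": "moderate"}
print(alternative_value(example_alternative))  # 20 + 15 + 8 = 43 (illustrative dollars per household)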
In addition to the pilot survey, we solicited feedback from three professionals with expertise in economic valuation, natural resource management and planning as well as survey design and methodology. The reviewers were asked to provide comments concerning the structure of the revised survey instrument and to provide feedback about the validity of the questions and the clarity of instructions. We also asked if the estimated time to complete seemed adequate. We received several editorial and grammatical suggestions to provide clarity and to correct punctuation. Those edits were incorporated into the final versions of the surveys.
The following individuals provided feedback:
John Loomis, Colorado State University
Lynne Koontz, National Park Service
Michael Welsh, Industrial Economics
9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.
No incentives will be provided.
10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.
We will not provide any assurances of confidentiality. Names and mailing addresses will be stored in separate data files and will only be combined for the purpose of making follow-up contact with non-respondents. Respondent names and addresses will not appear in any of our reports or findings. All responses will be anonymous, and respondents’ names will never be associated with their responses. At the completion of the study, all name and address files will be destroyed.
11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.
No questions of a sensitive or personal nature will be asked.
12. Provide estimates of the hour burden of the collection of information. The statement should:
* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.
* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.
* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 14.
The estimated total burden of this collection will be 758 hours (Table 1).
Mail Survey: We will mail a total of 4,626 surveys (3,470 national surveys and 1,156 regional surveys) to a random sample of adults (18 years and older) in households within the national and regional sampling areas. All respondents will receive similar instructions and answer a series of similar questions (e.g., general socio-demographics, visitation patterns, recreational preferences, and values). We estimate that it will take about 30 minutes to complete and return the mail survey; this time includes the time to read the correspondence and instructions included in the survey package. We expect a 30% response rate (n = 1,041) for the national survey and a 40% response rate (n = 462) for the regional survey (1,503 responses × 30 minutes ≈ 752 burden hours). While these anticipated response rates are greater than the rate received for the pretest survey (24%), the pretest drew on a national population and, due to scheduling, was conducted during the less-than-optimal holiday season. Previous studies have shown that regional and local samples generally have higher response rates than national samples because of their greater proximity to the area covered by the survey.
Non-respondent survey: After the third (and final) mailing, we will initiate the non-response survey. We will identify all non-respondents and telephone a random sample of 200 households to answer a short non-response survey (approximately 5 minutes). We expect a 35% response rate (n = 70) for this final segment of the collection (70 responses × 5 minutes ≈ 6 hours).
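As a cross-check, the short Python calculation below reproduces the burden-hour figures shown in Table 1 from the sample sizes, expected response rates, and completion times stated above; it is provided for illustration only.

# Reproduces the Table 1 burden-hour estimates from the figures stated above.
national_returns = round(3470 * 0.30)        # 1,041 completed national mail surveys
regional_returns = round(1156 * 0.40)        # 462 completed regional mail surveys
mail_hours = round((national_returns + regional_returns) * 30 / 60)   # 1,503 x 30 min = 752 hours
phone_returns = round(200 * 0.35)            # 70 completed non-response phone surveys
phone_hours = round(phone_returns * 5 / 60)  # 70 x 5 min = 6 hours
print(mail_hours, phone_hours, mail_hours + phone_hours)  # 752 6 758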
Table 1: Estimation of respondent burden
Survey | Sample | Number of Contacts | Number of Completed Surveys | Estimated Completion Time (minutes) | Total Annual Burden (Hours)
Mail Survey | National | 3,470 | 1,041 | 30 | 521
Mail Survey | Regional | 1,156 | 462 | 30 | 231
Non-respondent survey | National | 100 | 35 | 5 | 3
Non-respondent survey | Regional | 100 | 35 | 5 | 3
Total | | 4,826 | 1,573 | | 758
We estimate the total dollar value of the burden hours to be $25,385. We multiplied the estimated burden hours by $33.49, the hourly rate for individuals or households. This wage figure includes a benefits multiplier and is based on the National Compensation Survey: Occupational Wages in the United States, published by the Bureau of Labor Statistics (BLS news release USDL-15-1132, Employer Costs for Employee Compensation—March 2015, dated June 10, 2015; http://www.bls.gov/news.release/pdf/ecec.pdf).
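The dollar values in Table 2 follow directly from the burden hours and the $33.49 hourly rate; the brief Python check below is illustrative only.

# Reproduces the Table 2 dollar values from the burden hours and hourly rate above.
hourly_rate = 33.49
mail_cost = round(752 * hourly_rate)   # $25,184
phone_cost = round(6 * hourly_rate)    # $201
print(mail_cost, phone_cost, mail_cost + phone_cost)  # 25184 201 25385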
Table 2. Estimated Dollar Value of Burden Hours
Activity | Annual Number of Responses | Total Annual Burden Hours | Dollar Value of Burden Hours (Including Benefits) | Total Dollar Value of Annual Burden Hours
Mail Survey | 1,503 | 752 | $33.49 | $25,184
Non-response survey | 70 | 6 | $33.49 | $201
TOTAL | 1,573 | 758 | | $25,385
13. Provide an estimate of the total annual [non-hour] cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14).
* The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information [including filing fees paid]. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.
* If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.
* Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.
There are no non-hour burden costs resulting from the collection of this information.
14. Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies also may aggregate cost estimates from Items 12, 13, and 14 in a single table.
There are no Federal employee costs associated with this collection; all staff costs are contracted through the University of Montana. The estimates below cover the operational expenses associated with this collection, including contract support, travel, and printing, and total $308,684.
Table 3. Operational Expenses
Operational Expenses | Estimated Cost
Contract Support | $174,666 ($119,066 + $17,446 + $38,154)
Travel | $1,044
Subcontracts | $24,000
Survey Support (sample procurement, printing, postage, mailing, etc.) | $63,000
Indirect Costs (University of Montana) | $45,974
TOTAL | $308,684
15. Reasons for change in burden
The estimated burden for the final survey is 758 hours. This represents a net increase of 1,492 respondents and 723 hours from the estimated 81 respondents and 35 hours previously approved for the pilot survey.
16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.
After data collection is complete, University of Montana researchers will prepare a final report that will present the survey results and the willingness to pay (WTP) estimates. Section B.2 describes the analytical techniques that will be used to summarize the results and estimate the WTP values.
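For context on how WTP estimates are commonly derived in a stated choice analysis, the minimal Python sketch below shows the standard ratio of an attribute coefficient to the negative of the cost coefficient, assuming a linear-in-attributes random utility model. The coefficient names and values are hypothetical placeholders, not study results; the actual model specification and estimation procedures are described in Section B.2.

# Hypothetical estimated coefficients from a linear-in-attributes choice model.
# These are placeholder values, not study results.
coefficients = {"beach_condition": 0.45, "native_fish": 0.60, "cost": -0.02}

def marginal_wtp(attribute, coefs):
    # Marginal willingness to pay: the attribute coefficient divided by the
    # negative of the cost coefficient (dollars per unit improvement).
    return -coefs[attribute] / coefs["cost"]

for attr in ("beach_condition", "native_fish"):
    print(attr, marginal_wtp(attr, coefficients))  # 22.5 and 30.0 (illustrative dollars)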
Time schedule
Phase | Step | Time Schedule (preliminary estimate)
Survey Administration | Mail initial contact letter | 2 weeks following final OMB approval
Survey Administration | 1st survey mailing, reminder postcard, 2nd survey mailing, 3rd survey mailing | Completed within 1.5 months of initial survey mailing
Survey Administration | Main data collection complete | 2 months following OMB approval
Analysis | Begin data analysis | 2 months following OMB approval
Analysis | Complete data analysis | 3 months following OMB approval
Non-response survey | Begin non-response follow-up survey | 2 weeks following completion of main data collection
Non-response survey | Non-response phone survey complete | Within 1 month of main data collection
Reports | Draft report | 5 months following OMB approval
Reports | Final report delivery to NPS | 6 months following OMB approval
17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.
We are not seeking approval to exclude the expiration date - we will display the OMB control number and expiration date on all information collection instruments included in this request.
18. Explain each exception to the certification statement.
There are no exceptions to the certification statement.
References Cited
Loomis, J. et al. (2005). Final Report of the Protocol Evaluation Panel on the Recreation Monitoring Program of the Grand Canyon Monitoring and Research Center.
National Research Council. (1999). Downstream: Adaptive Management of Glen Canyon Dam and the Colorado River Ecosystem. National Academy Press, Washington, D.C.
Welsh, M. P., R. C. Bishop, M. L. Phillips, and R. M. Baumgartner. (1995). “GCES Non-Use Value Study.” Prepared for Glen Canyon Environmental Studies Non-Use Value Committee.