Supporting Statement for
Paperwork Reduction Act Information Collection Submission
OMB Control Number 1090-0010
Klamath Nonuse Valuation Survey
Terms of Clearance: “Approval is granted for the focus group/pre-test portion of the ICR. Once the focus groups and pre-tests are complete, DOI should summarize and report the results to OMB, and any revisions to the supporting statements and the survey instruments.” DOI expects the previous terms of clearance to apply to the information collection requested here, as DOI has not yet begun to implement this collection.
A. Justification
A1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.
The U.S. Department of the Interior (DOI) is requesting approval to modify an existing information collection for the Klamath Nonuse Valuation Survey. The 60-day notice was published in the Federal Register on June 9, 2009 (74 FR 27340). OMB has approved DOI’s request to close out the focus groups approved under this information collection request and to field the survey instrument. DOI requests approval to make changes to the instrument as a result of an Information Quality Act request.
The Klamath River Basin provides essential habitat for several fish species, including Chinook salmon, coho salmon, steelhead trout, Pacific lamprey, and the Lost River and shortnose suckers. Some of these species are important components of ocean and/or in-river harvest (Chinook salmon and steelhead trout), while others are rarely harvested because of fishery regulations, limited availability, and/or listed status under the Endangered Species Act (ESA). In addition to its importance as fish habitat, the Klamath River and its tributaries also provide water to agriculture through the Bureau of Reclamation’s Klamath Irrigation Project. Oversubscription of Klamath water has thwarted recovery of depressed fish stocks and led to economic hardship for farming and fishing communities, prompting federal disaster relief for farmers in 2001 and for fishermen in 2006.
In February 2010, the U.S. government; the states of Oregon and California; the chairmen of the Klamath, Yurok, and Karuk Tribes; and the utility company PacifiCorp formally announced the final Klamath Basin Restoration Agreement (KBRA) and Klamath Hydroelectric Settlement Agreement (KHSA). These agreements define a set of activities, including the removal of four dams on the Klamath River by 2020; the dam removals are designed to restore fisheries and provide water supply certainty in the Basin. The Hydroelectric Settlement Agreement calls for the Secretary to determine whether dam removal will advance restoration of the salmonid fisheries of the Klamath Basin and is in the public interest.
The Secretary, acting through the Bureau of Reclamation, has authority to undertake these studies under both the Secure Water Act of 2009 (March 30, 2009, 123 Stat. 991) and the Bureau of Reclamation’s general planning authority pursuant to the Act of June 17, 1902 (32 Stat. 388). These studies will be conducted in accordance with all applicable legal requirements.
The Secretary of the Interior authorized development of the Klamath Project on May 15, 1905, under provisions of the Reclamation Act of 1902 (32 Stat. 388). The KHSA and the KBRA will affect Klamath Project operations.
Absent the agreements, the Department would be participating in a regulatory process administered by the Federal Energy Regulatory Commission (FERC).
Under the KHSA, the Secretary of the Interior is to determine by March 31, 2012, whether the potential removal of these dams will advance restoration of the salmonid fisheries of the Klamath Basin and is in the public interest, which includes but is not limited to consideration of potential impacts on affected local communities and Tribes. The determination will be based on a number of factors, including an economic analysis.
To comply with the Secretary’s responsibilities, one important area of benefits that needs to be addressed is “nonuse value.” Nonuse values accrue to members of the public who value Klamath Basin improvements regardless of whether they ever consume Klamath fish or visit the Klamath Basin. Nonuse value is a component of the total value an individual places on the environmental change. To measure these benefits, DOI has contracted with RTI International in Research Triangle Park, NC, to design and implement a stated-preference (SP) valuation survey of the U.S. public. The survey, which will measure the total value (including nonuse value) that individuals place on the environmental change, will be the only component of the larger economic analysis that assesses the benefits that the public as a whole holds for dam removal and KBRA implementation, actions that will be funded in part by federal taxpayer money.
A2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection. [Be specific. If this collection is a form or a questionnaire, every question needs to be justified.]
DOI has contracted with RTI International in Research Triangle Park, NC, to conduct the survey. Under the original information collection request, focus groups were conducted to test the survey instrument. The testing is complete, and the survey is ready to be administered.
The final survey will provide information for the economic analysis of the KBRA and the Klamath Hydroelectric Settlement Agreement. The economic analysis provides one piece of information that the Secretary of the Interior will use to evaluate the plans.
There are three primary survey instruments.
Main survey instrument: the full survey, which will be administered by mail and Web.
Follow-up phone survey: as part of the nonresponse follow-up, we will attempt to ask nonrespondents for whom we have telephone numbers three questions that can be used to identify nonresponse bias.
Nonresponse survey: a greatly shortened version of the main survey for the nonresponse follow-up.
The survey instruments posted with this submission contain all the documents associated with the survey, including the prenotification postcard and cover letters. Several versions of the main survey instrument were created as part of the conjoint experimental design and to test specific hypotheses about the impact of the survey instrument on willingness-to-pay (WTP). The versions include:
The order of human uses listed on page 5 of the survey (see description below). The alternate order is listed at the end of the survey instrument.
Conjoint experimental design. There will be 16 blocks of two conjoint choice questions. Most respondents will receive one randomly assigned block of two choice questions.
A one-question version of the survey. In the pilot test, 50% of the sample will receive a version of the survey that has only one conjoint question. There will be 16 versions of the one-question survey created by deleting the second conjoint question (and revising the wording of the pages leading up to the conjoint section). If the proportion of the sample that selects Plan A is significantly different in the one-question version of the survey, 20% of the final sample will also receive the one-question version.
Cost in question 43. The costs will be randomly assigned. The device cost will be $25/$50/$110/$175. The reduced electricity cost will be $1/$2/$10/$15. (There are 16 combinations of device cost and reduced electricity cost; each combination will be assigned to one conjoint block so as not to add to the number of versions of the survey.)
In total, there will be 64 versions of the main instrument (2 orders for human uses × 16 blocks of the conjoint experiment = 32 two-question versions, plus 32 versions of the one-question survey).
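The version arithmetic can be checked with a short enumeration; this is an illustrative sketch, and the order and block labels are ours, not part of the survey materials.

```python
from itertools import product

# Sketch checking the version count; labels are illustrative.
orders = ["original", "alternate"]     # 2 orderings of human uses (p. 5)
blocks = range(1, 17)                  # 16 conjoint blocks

# Two-question versions: each ordering crossed with each block.
two_question = list(product(orders, blocks))     # 2 x 16 = 32

# One-question versions: the same crossing, with the second conjoint
# question deleted from each block.
one_question = list(product(orders, blocks))     # another 32

total_versions = len(two_question) + len(one_question)
print(total_versions)  # 64
```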
Below we discuss the justification for each question in the main survey instrument.
Section 1 (pp. 1–3) Introducing the Survey Issues
The first three pages of the survey introduce the issues to be addressed in the survey. It begins by placing the Klamath Basin issues within a national context. It informs respondents that the Klamath is one of many river basins across the country facing problems and requiring solutions, and the federal government is involved in making decisions about restoring the basin. It then describes how the survey is organized, what types of information will be provided, and what types of questions will be asked.
To encourage and motivate response, the survey informs respondents that their opinions will help government decision makers choose the best options for restoring the Klamath. It also describes some of the unique features of the Klamath.
Section 2 (p. 4) Introduction to the Klamath River Basin
Because many respondents are likely to know very little about the basin, this page provides basic geographic, demographic, and wildlife resource information about the Klamath. For reference, the full-page map showing the outline of the basin and its main water features of interest for the survey (Upper Klamath Lake and the Klamath River) will also be inserted in the survey booklet next to this page.
Question 1 asks whether the respondent had any prior knowledge about the Klamath, and Question 2 asks whether they have ever visited the basin. These questions provide basic indicators of familiarity with the Klamath, which can be used in the analysis to test whether and how prior knowledge affects preferences for restoring the basin.
Section 3 (p. 5) Human Uses of Klamath River Basin Water
This section introduces respondents to the many uses of Klamath Basin water resources. It describes the main ecosystem services (although it does not use this technical term) provided by the basin, and it introduces the parties who benefit from these services. It also introduces the idea that water is a scarce resource and that some of its uses are in direct competition with each other. Competition over water resources is a main source of environmental pressures within the basin.
Question 3 asks respondents about their uses of rivers in their area. This question provides data that can be used to test whether individuals’ own interactions with river resources in their area affect their preferences for restoring the Klamath’s river resources. It also provides respondents with an opportunity to pause from acquiring information about the Klamath.
Section 4 (pp. 6–9) Fish Resources under Threat in the Klamath
These pages introduce two main indicators of river ecosystem health in the Klamath Basin: the status of salmonid populations and risks to threatened (coho salmon) and endangered (Lost River and shortnose suckers) fish species. They describe the main causes of degradation in these indicators: dams, water withdrawals for irrigation, overfishing, and water pollution. The map on page 7 is especially important for describing how dams block migratory fish from accessing a large portion of their native habitat in the basin.
Introducing and describing these indicators is critical at this stage of the survey because they are also the main attributes that will be used in the SP choice tasks to represent improvements in river ecosystem health.
Questions 4, 5, and 6 offer respondents an opportunity to reflect on these two key indicators and consider the importance (i.e., level of concern) that they place on them. The responses can also serve as informal cross-checks on the preference weights for these indicators that are estimated as part of the data analysis.
Section 5 (pp. 10–14) Conflicts and Agreements over Water Management in the Klamath
These pages describe the recent history of conflict and more recent agreements over the management of Klamath Basin resources. The discussion emphasizes that, although it is true that there has been substantial and well-publicized disagreement in the past, most parties involved have now agreed on a key set of principles for resolving these conflicts: dam removal, water sharing, and fish restoration. All of the plans for moving forward with restoration of the basin contain these main elements.
Page 12 describes how the agreements would be paid for by a combination of money from PacifiCorp and PacifiCorp ratepayers, the states of Oregon and California, and the federal government.
Page 13 describes in general terms the main positive and negative impacts that would result from implementing these agreements. It emphasizes that there are important trade-offs for the government (and thus the respondent) to consider in moving forward.
Questions 7 and 8 ask respondents to describe their familiarity with the conflicts and agreement. These questions provide data that can be used to test whether individuals’ knowledge affects their preferences for restoring the Klamath’s river resources, and they provide respondents with another opportunity to pause from acquiring new information.
Question 9 asks respondents how much they agree that Oregon and California residents should pay more than residents of other states. The purpose of this question is mainly to remind respondents that all U.S. residents would pay, but Oregon and California residents would pay more. This feature is likely to have a large impact on the perceived fairness of any restoration plan for the Klamath.
Question 10 asks respondents if they receive power from PacifiCorp. This information will help identify individuals who are likely to end up paying the most for Klamath restoration plans, which may be an important factor affecting stated preferences for these plans.
Question 11 asks respondents how much they agree with federal government involvement in the Klamath Basin restoration. This question provides data that can be used to test whether individuals’ attitudes about the role of the federal government affect their preferences for restoring the Klamath’s river resources. It also provides respondents with another opportunity to pause from acquiring information about the Klamath.
Question 12 presents respondents with a series of seven general statements about the role of humans in protecting natural resources. It asks respondents to indicate their level of agreement with each statement. The purpose of this question is to gauge respondents’ overall support for environmental protection and to give respondents an opportunity to reflect on and provide feedback on these general issues, before responding to the choices about very specific plans in the Klamath.
Section 6 (pp. 15–18) Choice Task 1
These pages introduce the first choice task and describe the main features and outcomes of two options: NO ACTION and ACTION PLAN A. Text is included to encourage the respondent to think carefully about each choice and to answer in the same way as they would if it were an actual vote between the two options presented. Pages 16 and 17 allow the reader to compare the two options side by side in three main dimensions:
changes in wild salmonid populations
changes in extinction risks for suckers and coho
added costs to the household
The first two dimensions are described in words and illustrated with color graphics.
Page 18 describes the voting scenario and reminds respondents to consider their household’s budget and spending trade-offs before responding.
Question 13 asks whether the respondent has ever had the opportunity to vote on a similar program. The purpose of this question is to gauge how plausible the choice context is for respondents and also to give respondents a break from acquiring information.
Question 14 asks the respondent to choose (vote for) the option they prefer. The choice data from this question will be used to analyze preferences and estimate willingness to pay (WTP) for Klamath River ecosystem restoration and its associated attributes.
Question 15 asks for respondents’ level of certainty in their response to Question 14. Measuring certainty in this way will allow us to test for its effect on stated preferences and to investigate whether placing more weight on responses involving higher levels of certainty affects WTP estimates.
Section 7 (pp. 19–22) Choice Task 2
These pages introduce a second choice task between NO ACTION and a new alternative: ACTION PLAN B. For this choice, the respondent is asked to assume that the previously described ACTION PLAN A is not an option. Pages 20 and 21 provide detailed descriptions of NO ACTION and ACTION PLAN B outcomes, using the same three dimensions and format that were used on pages 16 and 17 to describe the NO ACTION and ACTION PLAN A options. Page 22 again describes the voting scenario and reminds respondents to consider their household’s budget and spending trade-offs before responding.
Question 16 asks the respondent to choose (vote for) the option they prefer. The choice data from this question will be used to analyze preferences and estimate WTP for Klamath River ecosystem restoration and its associated attributes.
Question 17 asks for respondents’ level of certainty in their response to Question 16. Measuring certainty in this way will allow us to test for its effect on stated preferences and to investigate whether placing more weight on responses involving higher levels of certainty affects WTP estimates.
Section 8 (pp. 23–24) Stated Choice Debriefing Questions
Question 18 presents a series of 10 statements describing possible reactions to the choice questions and asks respondents to indicate their level of agreement. The answers to these questions will be used to determine whether and how the attitudes reflected in these statements affected respondents’ stated choices and trade-offs.
Questions 19 and 20 use the same format as Question 18 but specifically ask about the choice of the NO ACTION and action plans, respectively. They each present two statements and ask for respondents’ level of agreement. The answers to these questions will also be used to determine whether and how the attitudes reflected in these statements affected respondents’ stated choices and trade-offs.
Question 21 asks respondents about their perceived likelihood that the results of the survey will be used by the government to assist in decision making. The purpose of this question is to gauge how relevant respondents believe their answers are for policy, which may affect their stated preferences in the choice tasks.
Section 9 (p. 25) Recreational Use of Klamath
This section asks about respondents’ past use of the Klamath River Basin (Questions 22, 23, and 24). The purpose of these questions is to develop indicators of recreational use, which will be used in the analysis to determine how recreational use affects respondents’ stated preferences and total WTP (including both use and nonuse values) for Klamath Basin restoration.
Section 10 (pp. 26–29) About You and Your Household
This section primarily collects information about respondents’ and their households’ characteristics. These data will be used as explanatory variables in the SP analysis and to compare the survey samples’ characteristics with those of the populations they represent (i.e., to test for nonresponse bias).
Questions 25 to 34 ask for standard sociodemographic information (gender, age, marital status, household size, household income, education, homeownership, employment status, and ethnicity).
Question 35 specifically asks about membership in one of the Klamath Basin tribes, and Question 36 asks about employment in specific industries that are most likely to be affected by Klamath Basin restoration activities.
Questions 37 to 39 ask about specific economic conditions and expectations for the respondent and their household. The purpose of these questions is to investigate how the current economic recession may be affecting stated preferences and WTP for Klamath Basin restoration.
Questions 40 and 41 ask about the respondent’s electric bill and willingness to buy a device that would lower their electric bill over the next 10 years. The purpose of these questions is to get a rough estimate of the respondent’s time preferences for models that use the discount rate to convert the 20-year payment to a single amount. The question will also provide information on how preferences for a good with use value (a device to lower electric bills) vary between the different strata.
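As a rough illustration of how the device question can inform time preference, the sketch below solves for the discount rate at which the device’s cost equals the present value of its bill savings over 10 years. The break-even model and the bisection approach are our own assumptions for illustration, not the analysis plan; the dollar levels are taken from the cost randomization described earlier.

```python
# Sketch: find the annual discount rate at which a device costing
# `cost` exactly pays for itself via `saving` per year for `years`.
# (Illustrative model; the actual analysis model is not specified here.)

def present_value(annual, rate, years):
    """PV of a constant annual amount received at the end of each year."""
    return sum(annual / (1 + rate) ** t for t in range(1, years + 1))

def breakeven_rate(cost, saving, years=10, lo=0.0, hi=1.0, tol=1e-6):
    """Bisect for the rate at which the PV of savings equals the cost."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if present_value(saving, mid, years) > cost:
            lo = mid   # PV still above cost -> implied rate is higher
        else:
            hi = mid
    return (lo + hi) / 2

# Example using two of the randomized levels: a $110 device that
# saves $15/year. A respondent who buys at this price reveals a
# discount rate at or below the break-even rate.
r = breakeven_rate(110, 15)
print(f"break-even annual rate: {r:.3f}")
```

A respondent who declines the device at a given price/saving combination reveals a discount rate above the corresponding break-even rate, which is the sense in which these questions bound time preferences.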
Question 42 asks whether the respondent is the person in the household with the most recent birthday. To randomize the selection of a respondent from each household, the instructions included in the survey’s cover letter request that the person with the most recent birthday respond to the survey. This question verifies whether this randomization took place but, to promote as high response rates as possible, it also asks the respondent to complete and return the survey even if these instructions were not followed.
One-question version of survey:
The number of SP questions each respondent answers has also been shown to affect responses in some surveys. The effects seem to vary by survey, and it is difficult to identify whether the differences in survey responses reflect strategic bias or learning or some other effect.
Because the survey will be administered primarily by mail, it is possible that survey respondents will look at both conjoint questions before answering. Instead of considering the two questions as separate and unrelated, respondents might let the cost of one plan relative to the benefits influence their response to the other plan, perhaps selecting the plan that was “the best deal”. Alternatively, looking at both questions might help the respondent think more carefully and provide more accurate answers.
In theory, a single, dichotomous choice question will be incentive compatible. However, asking a single question of each respondent greatly increases sample size requirements. We will test whether the number of conjoint questions presented to a respondent affects their responses by comparing responses to a one-question survey with our two-question survey. In the pilot test, 50% of the sample will only get one conjoint question. The one-question version will be created by dropping the second question in each of the 16 blocks of conjoint questions used in the main survey. If there is a significant difference in the percent of the sample selecting Plan A in the two versions of the survey, then 20% of the final sample will also be mailed the one-question version of the survey.
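The planned comparison of Plan A shares across the one- and two-question versions can be carried out with a standard two-proportion z-test. The sketch below uses made-up pilot counts purely for illustration; the actual test and sample sizes may differ.

```python
from math import sqrt

# Sketch of the split-sample comparison: does the share selecting
# Plan A differ between the one-question and two-question versions?
# (Two-proportion z-test; the counts below are hypothetical.)

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for H0: p1 == p2, using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical pilot counts: 140/400 chose Plan A in the one-question
# version versus 180/400 in the two-question version.
z = two_proportion_z(140, 400, 180, 400)
print(abs(z) > 1.96)  # significant at the 5% level?
```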
Experimental Design for SP Choice Questions
The main version of the survey instrument includes two stated choice conjoint questions. Each choice question presents one version of the action plan and the no-action plan (or opt-out). The action plans vary across four attributes: the increase in salmon and steelhead population, the risk of extinction for coho salmon, the risk of extinction for the two suckers, and the cost per household.
In an SP question, the levels of the attributes do not need to match existing levels or existing projections. In fact, SP methods are used in marketing to estimate demand for new product features that do not currently exist. In the health literature, researchers are often interested in preferences for outcomes that are not currently feasible with given treatments. In the environmental economics literature, researchers are often interested in valuing the outcome but not the plan used to achieve the outcomes. Past surveys have presented plausible but hypothetical plans to achieve the outcomes of interest and asked survey respondents to value changes in outcome levels.
For this survey, DOI needs estimates of the total value that individuals place on the outcomes associated with the KBRA and dam removal. The agreements provide a real process and payment vehicle for the SP questions. To provide policy-relevant estimates, the levels for the attributes need to encompass the range of likely outcomes. Based on expert judgment and existing literature, the Environmental Impact Statement (EIS) will provide quantitative or qualitative estimates of the impact on fish populations. The impacts laid out in the EIS will be used to create outcome levels or ranges of outcome levels for which we will calculate WTP. Currently, the expert panels for different fish are still meeting. The current attribute ranges for the SP questions are based on current thinking about possible outcomes. If the fisheries experts revise their estimates, the attribute levels will be adjusted to reflect the latest information at the time data collection begins.
The three levels for the increase in salmon and steelhead population are based on information from the scientific literature on current and historic runs and the professional judgment of fisheries biologists involved in the project. The estimates in the literature for historic and current levels of population vary depending on the methods and data sources. The text in the survey describes a range of current and historic population levels based on the more conservative articles. The levels for the population increase attribute are 30% increase, 100% increase, and 150% increase.
For the coho salmon and the suckers, their threatened and endangered status was converted into a scale depicting risk of extinction based on an article by Patrick and Damon-Randall (2008). The levels for the coho salmon attribute are high, moderate, and low. The status of the suckers is more precarious, and the attribute levels will be very high, high, and moderate.
Finally, the levels for the cost attribute are based on reactions from focus groups and one-on-one interviews. The cost levels are $12, $24, $48, and $90 per year for 20 years.
As described in more detail in Section A16, the survey will be mailed out to 10% of the sample at the beginning. After two mailings of the survey instrument, we will use the responses we have received to conduct preliminary analysis on the data. If more than 80% of the sample selects the no-action plan, then the cost of the plans will be adjusted downward. A simple conditional logit will be estimated with the data to examine whether the other attribute levels are significant. If the levels are not significant, then the levels will be adjusted up or down to create greater differences between the plans.
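The pilot-phase checks described above can be sketched as follows. The data are simulated, and a simple binary logit (action plan vs. no action) fit by Newton-Raphson stands in for whichever conditional logit specification the analysts ultimately use; the utility coefficients are invented for illustration.

```python
import numpy as np

# Sketch of the pilot check: estimate a simple binary logit on the
# conjoint attributes and inspect the no-action share. Simulated data.
rng = np.random.default_rng(0)
n = 800

# Attributes of the offered action plan (levels from the survey design).
cost = rng.choice([12, 24, 48, 90], n)    # $/year for 20 years
pop = rng.choice([30, 100, 150], n)       # % salmonid population increase

# Simulate choices from an assumed utility: u = 0.5 - 0.03*cost + 0.01*pop
X = np.column_stack([np.ones(n), cost, pop])
true_beta = np.array([0.5, -0.03, 0.01])
p = 1 / (1 + np.exp(-X @ true_beta))
y = (rng.random(n) < p).astype(float)     # 1 = chose the action plan

# Newton-Raphson on the logit log-likelihood.
beta = np.zeros(3)
for _ in range(25):
    mu = 1 / (1 + np.exp(-X @ beta))
    grad = X.T @ (y - mu)
    hess = (X * (mu * (1 - mu))[:, None]).T @ X
    beta += np.linalg.solve(hess, grad)

no_action_share = 1 - y.mean()
print(no_action_share > 0.80)        # would trigger lowering the costs?
print(beta[1] < 0 and beta[2] > 0)   # cost deters, fish gains attract
```

In the real pilot, statistically insignificant attribute coefficients (rather than the sign checks shown here) would prompt adjusting attribute levels to sharpen the differences between plans.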
Experimental Design
The experimental design for the SP conjoint questions was created using Sawtooth Software (2009). Each respondent will answer two conjoint questions that consist of one action plan and the no-action plan. To encourage respondents to think about the trade-offs between the plans and no action, the experimental design was created as if the respondents were answering one question comparing two action plans. The design includes a restriction that there are no dominated pairs (one plan is better for all the attributes). If respondents compare the plans, they will see that the plans involve trade-offs. For the actual SP questions, the plans will be split into two questions comparing each plan to the no action plan individually. The format of this design actually provides additional information about the respondent’s preferences, because we may know something about how the respondent ranks the two action plans (for example, if they select plan A and select the no-action plan instead of Plan B, we know that they prefer A to B). This additional information can be incorporated into the analysis; however, the models are more complicated to run.
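The “no dominated pairs” restriction can be illustrated by enumerating candidate plans and filtering out any pair in which one plan is at least as good on every attribute. The numeric codings of the risk levels below are our own illustrative assumption; the attribute levels are those stated earlier.

```python
from itertools import product, combinations

# Sketch of the no-dominated-pairs restriction used in the design.
POP = [30, 100, 150]        # % salmonid population increase
COHO_RISK = [1, 2, 3]       # low=1, moderate=2, high=3 (illustrative coding)
SUCKER_RISK = [2, 3, 4]     # moderate=2, high=3, very high=4 (illustrative)
COST = [12, 24, 48, 90]     # $/year for 20 years

plans = list(product(POP, COHO_RISK, SUCKER_RISK, COST))

def dominates(a, b):
    """a weakly dominates b: >= population gain, <= both risks, <= cost."""
    return (a[0] >= b[0] and a[1] <= b[1]
            and a[2] <= b[2] and a[3] <= b[3])

# Keep only pairs in which neither plan dominates the other, so every
# offered comparison involves a genuine trade-off.
valid_pairs = [(a, b) for a, b in combinations(plans, 2)
               if not dominates(a, b) and not dominates(b, a)]
print(len(plans))  # 108 candidate plans
```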
A3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.
Data will be collected using both a mail questionnaire and a Web instrument. At least four important factors support using this electronic method of data collection. First, it can help increase response rates beyond what can be achieved through mail alone. Individuals differ in their likelihood of responding in various modes, often because of personal mode preferences. In addition, offering a Web mode can decrease nonresponse bias by bringing in people who would not respond to the mail survey request; those who complete only the mail survey may be systematically different from the rest of the population.
Second, electronic data collection has been demonstrated to lead to improvements in the quality of collected data (Baker, Bradburn, and Johnson, 1995), through capabilities such as automated skip logic and prompting for missing data.
Third, also related to the previous benefit, is a likely reduction in respondent burden. Having automated skip logic relieves the respondent from this task. Some studies have also shown a reduction in the time needed to complete the survey when moving from a paper-based to a computer-based instrument (e.g., Baker, Bradburn, and Johnson, 1995).
Fourth, cost savings are possible through the use of Web survey instruments, which have minimal variable costs (cost per additional interview) and can avoid more costly follow-up of the same sample cases.
Maximizing response rate is very important to the validity of the survey. In addition, because of the subject matter, we believe that using self-administered survey instruments is very important. However, we recognize that Web and mail surveys are different modes and different types of respondents may favor one mode over another. In Section B3, we discuss our approach to identifying possible mode effects.
A4. Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.
To our knowledge, no studies to date have used SP methods to estimate total household values (including nonuse values) for Klamath River Dam removal or other restoration measures; however, a limited number of studies have used these methods to investigate values for related programs in other parts of the United States. Although a number of other economic valuation studies have addressed dam removal activities in the United States, most of them have applied revealed preference (RP) methods and focused on use-related values (Robbins and Lewis, 2008; Provencher, Sarakinos, and Meyer, 2008; Lewis, Bohlen, and Wilson, 2008; Loomis, 1999).
Table A1 identifies and summarizes key features of nine existing studies that have estimated total values for U.S. river ecosystem restoration using SP methods. Similar to the current study, the majority of these assessed total values for western rivers, with only one study done in the East (Adams, 2004). The study geographically closest to the Klamath River study is Douglas and Taylor (1999), which estimated total values for restoration activities in the Trinity River, a southern tributary of the Klamath River that will not be affected by the current restoration program. Despite its proximity to the Klamath, the results of the Trinity River study are difficult to interpret or to transfer directly to the current program.
Only two of these studies have specifically dealt with dam removal; the Elwha Dam removal project (Loomis, 1996) is the most similar to the Klamath River plans. However, all but one of the studies included fish recovery as a key response to the restoration program being evaluated. Five of the studies specifically describe increases in salmon and other anadromous fish populations, and four of these use specific numbers of additional fish to describe the impacts of the program.
All nine of the studies used contingent valuation methods (CVMs) rather than conjoint methods to elicit WTP; however, a few of them included split-sample designs to measure scope effects associated with alternative programs. The most common form of payment vehicle was an increase in taxes, followed by an increase in utility (power or water) bills.
Table A1. Previous Valuation Studies of Dam Removal or Related Restoration Efforts

Loomis, 1996
* River ecosystem studied: Elwha River Basin, Olympic Peninsula, WA
* Main restoration program elements: Dam removal (2)
* Main program impacts: Increases in four species of salmon and steelhead
* Fish population metrics: Increase of pink salmon and other fish species
* Fish population metric range: 200,000 pink salmon with a total increase of 300,000 fish
* SP valuation method: CVM
* SP question format: Dichotomous choice
* Payment vehicle: Taxes
* Survey mode: Mail (telephone follow-up)
* Sample frame: Clallam County, WA; rest of WA; and rest of United States
* Sample size: Total: 2,500
* Survey year: 1994
* Main value estimate: Average annual household WTP for the dam removal program

Welsh et al., 1995
* River ecosystem studied: Colorado River (including parts of the Grand Canyon) below the Glen Canyon Dam, AZ
* Main restoration program elements: Three alternative flow release regimes from the dam
* Fish population metrics: Qualitative: “improvement,” change in “danger of extinction”
* Fish population metric range: NA
* SP valuation method: CVM
* SP question format: Dichotomous choice
* Payment vehicle: Taxes, utility bills
* Survey mode: On-site, mail, and telephone
* Sample frame: Power service (marketing) area (WY, UT, CO, NM, AZ, NV) and rest of United States
* Sample size: Total: 5,950
* Survey year: 1994–1995
* Main value estimate: Average annual household WTP for the dam water release alternative

Bell, Huppert, and Johnson, 2003
* River ecosystem studied: Five Pacific Northwest estuaries in WA and OR
* Main restoration program elements: Coho enhancement program
* Main program impacts: Coho salmon recovery
* Fish population metrics: WA survey: allowable catch of coho salmon
* Fish population metric range: 80,000–160,000
* SP valuation method: CVM
* SP question format: Dichotomous choice
* Payment vehicle: Taxes
* Survey mode: Mail and telephone
* Sample frame: Coastal WA and OR
* Sample size: 5,000 (1,000 per estuary)
* Completed surveys: 2,006
* Survey year: 2000
* Main value estimate: Average household WTP over 5 years by income level and estuary

Douglas and Taylor, 1999
* River ecosystem studied: Trinity River, CA
* Main restoration program elements: Increase Trinity River flows
* Main program impacts: Increase anadromous fish population and improved boating recreation
* Fish population metrics: Number of spawning adult anadromous fish
* Fish population metric range: 9,000–105,000
* SP valuation method: CVM
* SP question format: Open-ended/bid cards
* Payment vehicle: Utility bill
* Survey mode: In person
* Sample frame: Trinity users and households in WA, OR, CA, and NV
* Sample size: Total: 5,000
* Completed surveys: Total: 2,347
* Survey year: 1993–1994
* Main value estimate: Average annual WTP by users or households

Hanemann, Loomis, and Kanninen, 1991
* River ecosystem studied: San Joaquin Valley, CA
* Main restoration program elements: Five programs: two for wetland habitat, two for water contamination, and one for river flows
* Main program impacts: Wetlands program: maintain or increase wetland habitat
* Fish population metrics: Salmon improvement
* Fish population metric range: Not mentioned
* SP valuation method: CVM
* SP question format: Double-bounded dichotomous choice
* Payment vehicle: Taxes
* Survey mode: Mail and telephone
* Sample frame: San Joaquin Valley, CA, OR, WA, and NV households
* Sample size: 1,960
* Completed surveys: Total: 1,004
* Survey year: 1989
* Main value estimate: Average annual CA household WTP for each program

Loomis et al., 2000
* River ecosystem studied: South Platte River, CO
* Main restoration program elements: Conservation easement, riparian buffers, reduced flow diversion
* Main program impacts: Ecosystem services: wastewater dilution, natural water purification, erosion control, habitat for wildlife
* Fish population metrics: Improve habitat for six native fish so they are not in danger of extinction
* Fish population metric range: NA
* SP valuation method: CVM
* SP question format: Dichotomous choice
* Payment vehicle: Water bill
* Survey mode: Telephone
* Sample frame: Towns near the river, CO
* Sample size: 462
* Completed surveys: 96
* Survey year: 1998
* Main value estimate: Average monthly household WTP for river restoration

Sanders, Walsh, and Loomis, 1990
* River ecosystem studied: 11 rivers in Colorado
* Main restoration program elements: Protection of rivers under the Wild and Scenic Rivers Act
* Main program impacts: Recreation and ecosystem preservation
* Fish population metrics: NA
* Fish population metric range: NA
* SP valuation method: CVM
* SP question format: Open-ended
* Payment vehicle: NA
* Sample frame: CO
* Sample size: ~420
* Completed surveys: 214
* Survey year: 1983
* Main value estimate: Average annual household WTP for increments of river protection by use and preservation values

Adams, 2004
* River ecosystem studied: Huron River, MI
* Main restoration program elements: Dam removal or keeping dam in current condition
* Main program impacts: Improved river recreation and fish vs. continued pond recreation and fish
* Fish population metrics: Reduction of lake fish population with increase in river fish population
* Fish population metric range: NA
* SP valuation method: CVM
* SP question format: Dichotomous choice
* Payment vehicle: Taxes
* Sample frame: Ann Arbor, MI
* Sample size: 2,000
* Completed surveys: 766
* Survey year: 2003
* Main value estimate: Average annual household WTP for dam removal or dam maintenance

Olsen, Richards, and Scott, 1991
* River ecosystem studied: Columbia River Basin in WA, OR, ID, and MT
* Main restoration program elements: Dam flow and dam passage changes
* Main program impacts: Doubling the salmon and steelhead runs by 2000
* Fish population metrics: Quantity of fish in salmon and steelhead runs
* Fish population metric range: Double the amount (an increase of 5 million fish)
* SP valuation method: CVM
* SP question format: Open-ended
* Payment vehicle: Power bill
* Sample frame: WA, OR, ID, and western MT
* Sample size: 4,028
* Completed surveys: Nonusers: 695
* Survey year: 1989
* Main value estimate: Average monthly household WTP for a guaranteed doubling of the salmon and steelhead runs
These studies vary widely in the geographic extent of the market surveyed and in how the WTP estimates were applied. Four studies estimate values only for households in the immediate area of the river or watershed. Four others use a tiered approach, assessing different WTP estimates for households in the immediate area versus those in the rest of the state, nearby states, or the rest of the country.
A5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.
The survey will be sent only to households and will not impact small businesses or other small entities.
A6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.
This is not a periodic data collection. If the Agency did not conduct the survey, the information basis for the Secretary of the Interior’s determination would be incomplete. The survey is the only component of the economic analysis that captures the total value of the change (including nonuse values). It is also the only component of the analysis that will include data on the values and opinions of people living outside the Klamath Basin area, who are federal taxpayers and who will be supporting the Klamath activities through their tax dollars.
A7. Explain any special circumstances that would cause an information collection to be conducted in a manner:
* requiring respondents to report information to the agency more often than quarterly;
* requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;
* requiring respondents to submit more than an original and two copies of any document;
* requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;
* in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;
* requiring the use of a statistical data classification that has not been reviewed and approved by OMB;
* that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or
* requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.
All information collection and recordkeeping activities in this submission are consistent with the guidelines in 5 CFR 1320.6.
A8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice [and in response to the PRA statement associated with the collection over the past three years] and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden.
Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.
Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years — even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.
In accordance with the Paperwork Reduction Act, DOI published a 60-day notice requesting comments regarding this information collection request (74 FR 27340; June 9, 2009). The Agency received no comments. On August 30, 2010, the Agency published a 30-day notice requesting comments and received one general comment on the overall design of the project, but no comments on the survey instrument. The pilot test was approved on December 14, 2010. The Agency received additional comments on the survey instrument shortly after the survey instrument originally submitted to OMB was distributed to the Klamath Technical Coordinating Committee at the Committee’s request. All of the comments focused on the background material and description of the no action and action alternatives. No comments were received on the questions themselves. Further revisions were made based on these comments and changes suggested by the team of federal biological scientists working on the project. Four additional one-on-one interviews will be conducted to test for reactions to the changes in the survey wording during the 30-day comment period.
In addition to input from RTI International staff and its project team of consultants, including Dr. V. Kerry Smith of Arizona State University and Dr. John Duffield of the University of Montana, DOI has received feedback from an external expert review panel of Dr. Trudy Cameron (University of Oregon), Dr. Kevin Boyle (Virginia Tech), and Dr. Wiktor Adamowicz (University of Alberta) regarding the design of the survey instrument and the plans for data collection and data analysis. The experts first provided feedback on a draft plan for designing the survey instrument, collecting the data, and analyzing the data; the plan was revised based on their comments. The experts then provided a second round of feedback on the revised plan for collecting and analyzing the data and on the draft survey instrument.
The survey plan and an outline of the survey instrument were also presented twice to the Klamath stakeholder group. The stakeholder group includes representatives from all the groups that signed the Klamath Hydroelectric Settlement Agreement and the KBRA, representatives from the parties that declined to sign the agreements (Siskiyou County and the Hoopa Tribe), and members of the public.
As part of the instrument design process, we held six focus groups in different parts of the country: Medford, OR; Klamath Falls, OR; Eureka, CA; Kansas City, KS; Raleigh, NC; and Phoenix, AZ. In addition, the instrument was pretested through 10 one-on-one interviews—4 in Oregon, 3 in California, and 3 in other parts of the country. The survey was revised in the following ways based on feedback from the expert reviewers, focus groups and one-on-one interviews:
* Shortened text and simplified language
* Revised maps to increase geographic scope and add landmarks, and added an explanation of the maps in the text
* Revised descriptions of the Klamath Basin, fish in the basin, and the agreements to present information people said they needed, delete unnecessary information, and address the benefits and costs of the agreement more fully
* Revised the descriptions of the action and no-action plans and the payment vehicle
* Reduced the number of SP conjoint questions
* Where potential sources of bias in responses were noted, either addressed these in the text of the survey or added a debriefing question to identify respondents who may be protesting and to identify reasons for SP responses unrelated to economic trade-offs
Based on comments received following approval of the pilot test survey instrument on December 17, 2010, the following types of changes were made to the instrument:
* Some information was changed to be consistent with the draft Layperson’s Guide to the Klamath River being prepared as part of the overall project and with Agency information provided to the media.
* Changes were made to the descriptions of some of the uses of the Klamath Basin, the reasons for declining fish populations, the threats to the endangered species, the history behind the agreements, the main features of the agreements, the impacts of the agreements, and how the agreement would be paid for.
* The baseline population of Chinook salmon and steelhead trout through 2060 was revised in the graph from a projected 30% decline to the current population, with the statement “Scientists expect that wild populations of these fish will remain at low levels in the future.” The change was made based on input from the biological scientists working on the project.
A9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.
A small monetary incentive of $2 (two $1 bills) will be sent to sample members to help gain their participation; incentives have been shown consistently across studies to increase cooperation (for reviews, see Heberlein and Baumgartner, 1978, and Singer et al., 1999). The money will be included in the initial survey mailing to respondents. This amount is provided as a token of appreciation intended to build a social exchange between the organizations making the survey request and the individual (Dillman, 1978; Dillman, 2000). Furthermore, incentives have been shown to reduce nonresponse bias by increasing cooperation, particularly among those who are not interested or involved in the survey topic (Groves, Singer, and Corning, 2000; Groves, Presser, and Dipko, 2004; Groves et al., 2006). Thus, the use of incentives is instrumental in increasing response rates and reducing nonresponse bias.
A token of appreciation can help curb nonresponse, but with falling response rates in household surveys (e.g., Curtin, Presser, and Singer, 2000; Stussman, Dahlhamer, and Simile, 2005), higher incentives are sometimes needed. Therefore, to gain a better understanding of potential nonresponse bias, a nonresponse follow-up phase will be implemented that increases the incentive amount to $20 for 20% of the nonrespondents after the third mailing (a reminder letter). The nonresponse follow-up will include a letter sent by Federal Express or Priority Mail to 20% of the nonrespondents offering a $20 incentive to return a short survey consisting of 6 questions taken from the main survey (which will be included in the Federal Express mailing). A few days later, we will call the nonrespondents with telephone numbers to reiterate the offer and answer questions. By offering a significantly higher incentive and a much shorter survey instrument, we hope to entice some percentage of the nonrespondents to return the survey. The characteristics of these “high incentive” responders will be compared to the sample of “low incentive” responders to help evaluate nonresponse bias. Although the literature suggests that including an upfront incentive generates a higher response rate than the promise of an incentive, our budget does not allow us to mail all the nonrespondents $20 up front.
A10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.
At the beginning of the survey, the following statement is included:
Your participation in this survey is voluntary. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific individual. We will not provide information that identifies you to anyone outside the study team, except as required by law. Your responses will be stored separately from your name and address, and when analysis of the questionnaire is completed, all name and address files will be destroyed.
A Federal agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number.
In addition, the following statement is included in the survey before we ask the demographic questions: “Finally, we would like to ask you a few questions about you and your household. Responses to these questions will be used only for statistical purposes and to compare respondents to this survey with the U.S. population as a whole. The reports prepared for this study will summarize findings across the sample and will not associate responses with an individual. Your answers will not be saved or stored in a way that can be associated with your name or address.”
A11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.
Participants will not be asked any questions that are personal or sensitive in nature.
A12. Provide estimates of the hour burden of the collection of information. The statement should:
* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.
* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.
* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 14.
Table A2 summarizes the burden estimates for the final data collection. The surveys will be mailed out in two batches according to the schedule in Table A4. The initial mailing will be to 10% of the sample, which we will use as a pretest of the mailing procedures and the conjoint attribute levels. The first 10% of the sample will receive the first and second mailing (as needed for nonrespondents to the first mailing). After some preliminary analysis of the responses returned from the first 10% of the sample, the remaining 90% of the sample, the main survey sample, will be released. As needed, respondents will receive the prenotification postcard, two mailings of the survey, a reminder postcard between the first and second survey instrument mailings with the address and password for the web version of the survey, and a reminder letter to respondents who have not returned their surveys several weeks after the second mailing of the survey instrument. Finally, we will select 20% of the nonrespondents from the main survey sample for nonresponse follow-up that includes a Federal Express letter and a shorter version of the survey, a higher incentive offer, and a follow-up telephone call.
For the estimated response rates, we assumed the lower bound for each response rate to provide a conservative estimate. The actual response rate is likely to be higher. From an original sample of 13,000 addresses, we conservatively expect that 80% (10,400) of the addresses will be valid.
10% Sample (Pretest) first mailing: The initial 10% of sample households will receive a prenotification post card followed by the first mailing of the survey instrument with a cover letter. The post card will be mailed to 1,300 households, of which we expect 80% of the addresses to be valid. The first mailing will be sent to 1,040 households with valid mailing addresses. DOI estimates that everyone will spend 0.008 hours (approximately 30 seconds) looking at the post card. DOI estimates that 20% of the households in the Klamath area and 15% of households from outside the Klamath area will complete the survey after the first mailing. This yields a total of 177 respondents, and DOI estimates that respondents will spend 0.50 hours (30 minutes) looking at the letter and completing the survey. DOI estimates that there will be 863 nonrespondents after the first mailing and that nonrespondents will spend, on average, 0.05 hours (3 minutes) looking at the letter and survey.
10% Sample reminder postcard and second mailing: A reminder postcard will be sent that includes the address for the web version of the survey along with the respondent’s password (the postcard will fold over so password will not be visible). Then the survey instrument and a second cover letter will be mailed to nonrespondents. DOI estimates that after the second mailing 10% of the households that did not respond to the first mailing will complete the survey. This yields a total of 86 respondents, and DOI estimates that respondents will spend 0.50 hours (30 minutes) completing the survey and reading the letter and postcard. DOI estimates that there will be 777 nonrespondents and that nonrespondents will spend on average 0.05 hours (3 minutes) looking at the letter, postcard and survey.
90% Sample (Main survey sample) first mailing: The remaining 90% of sample households will receive a prenotification post card followed by the first mailing of the survey instrument with a cover letter. The post card will be mailed to 11,700 households, of which we expect 80% of the addresses to be valid. The first mailing will be sent to 9,360 households with valid mailing addresses. DOI estimates that everyone will spend 0.008 hours (approximately 30 seconds) looking at the post card. DOI estimates that 20% of the households in the Klamath area and 15% of households from outside the Klamath area will complete the survey after the first mailing. This yields a total of 1,591 respondents, and DOI estimates that respondents will spend 0.50 hours (30 minutes) looking at the letter and completing the survey. DOI estimates that there will be 7,769 nonrespondents after the first mailing and that nonrespondents will spend, on average, 0.05 hours (3 minutes) looking at the letter and survey.
90% Sample reminder postcard and second mailing: A reminder postcard will be sent that includes the address for the web version of the survey along with the respondent’s password (the postcard will fold over so the password will not be visible). The survey instrument and a second letter will be mailed to nonrespondents. DOI estimates that after the second mailing 10% of the households that did not respond to the first mailing will complete the survey. This yields a total of 777 respondents, and DOI estimates that respondents will spend 0.50 hours (30 minutes) completing the survey and reading the letter and postcard. DOI estimates that there will be 6,992 nonrespondents and that nonrespondents will spend, on average, 0.05 hours (3 minutes) looking at the postcard, letter, and survey.
90% Sample third mailing: After the second mailing, the 6,992 nonrespondent households will be sent a reminder letter and will spend 0.008 hours (approximately 30 seconds) looking at the letter. The letter will provide the web address of the survey and a toll-free number and email address the respondent can use to request another copy of the survey. DOI expects that 5% of nonrespondents will complete the survey. DOI estimates that 350 respondents will spend 0.50 hours (30 minutes) on the survey.
Fourth mailing (Nonresponse follow-up): After the third mailing, 20% of the nonrespondents (1,328 households) will be sent, by Federal Express or Priority Mail, a letter and a much shorter version of the survey offering a $20 incentive to return the shorter survey. DOI assumes that 1,328 households will spend 0.008 hours (approximately 30 seconds) looking at the package. DOI assumes there will be telephone numbers for 65% of the nonrespondents. For respondents with telephone numbers, the letter and survey will be followed by a phone call from a live operator, who will either talk to the household or leave a message reiterating the higher incentive and offering to mail another copy of the survey if the household needs one. DOI expects that 20% of nonrespondents will complete the shorter survey after the phone call reminder. DOI estimates that 173 respondents will spend an average of 0.17 hours (10 minutes) on the shorter survey, letter, and phone call. DOI estimates that 691 nonrespondents will spend 0.08 hours (5 minutes) on the survey, letter, and phone call.
For the 35% of households without telephone numbers, DOI expects that 10% of nonrespondents will complete the survey after receiving the Federal Express letter. DOI estimates that 46 respondents will spend 0.08 hours (5 minutes) on the shorter survey and letter. DOI estimates that 418 nonrespondents will spend 0.05 hours (3 minutes) on the survey and letter.
Total Cost: The Agency estimates that participating in the survey will cost respondents $27.42 per hour in lost potential salary and benefits. Respondents will spend an annual total of 2,571 hours, at a cost of $70,496.82.3
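As a rough illustration, the wave-by-wave arithmetic above can be sketched as follows. This is only a sketch using the stated assumptions (80% deliverable addresses, a 10%/90% pretest/main split, 10% completion on the second mailing, 5% on the third); the variable names are ours, and the first-mailing respondent counts are not recomputed because they depend on the share of the sample drawn from the Klamath area.

```python
# Sketch of the burden arithmetic described above, using the stated
# response-rate assumptions. Figures match the narrative up to rounding.

total_sample = 13_000
valid = round(total_sample * 0.80)        # 10,400 deliverable addresses

pretest_valid = round(1_300 * 0.80)       # 1,040 pretest (10%) households
main_valid = round(11_700 * 0.80)         # 9,360 main-sample households

# Second mailing: 10% of first-mailing nonrespondents complete the survey.
pretest_second = round(863 * 0.10)        # 86 pretest respondents
main_second = round(7_769 * 0.10)         # 777 main-sample respondents

# Third mailing (reminder letter): 5% of remaining nonrespondents complete.
main_third = round(6_992 * 0.05)          # 350 respondents

# Annualized respondent cost: total burden hours x loaded hourly rate.
total_cost = round(2_571 * 27.42, 2)      # $70,496.82
```

This mirrors the entries in Table A2 up to rounding; the first-mailing respondent totals (177 and 1,591) additionally reflect the 20%/15% Klamath/non-Klamath completion split.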
Table A2. Burden Calculations

Type of Respondent | No. of Respondents | No. of Responses per Respondent | Total Annual Responses | Average Burden Hours per Response | Total Annual Hour Burden
10% Sample postcard mailing | 1,300 | | | |
Sample of valid addresses | 1,040 | 1 | 1,040 | 0.008 | 8.32
10% Sample first mailing nonrespondents | 863 | 1 | 863 | 0.05 | 43.15
10% Sample first mailing respondents | 177 | 1 | 177 | 0.5 | 88.5
10% Sample reminder postcard and second mailing nonrespondents | 777 | 1 | 777 | 0.05 | 38.85
10% Sample reminder postcard and second mailing respondents | 86 | 1 | 86 | 0.5 | 43
90% Sample postcard mailing | 11,700 | | | |
Sample of valid addresses | 9,360 | 1 | 9,360 | 0.008 | 74.88
90% Sample first mailing nonrespondents | 7,769 | 1 | 7,769 | 0.05 | 388.45
90% Sample first mailing respondents | 1,591 | 1 | 1,591 | 0.5 | 795.5
90% Sample reminder postcard and second mailing nonrespondents | 6,992 | 1 | 6,992 | 0.05 | 349.60
90% Sample reminder postcard and second mailing respondents | 777 | 1 | 777 | 0.5 | 388.5
90% Sample third mailing reminder letter: nonrespondents | 6,992 | 1 | 6,992 | 0.008 | 55.94
90% Sample third mailing reminder letter: respondents | 350 | 1 | 350 | 0.5 | 175
Nonresponse follow-up mailing | 1,328 | 1 | 1,328 | 0.008 | 10.62
Nonresponse bias sample mailing nonrespondents (households with telephone numbers) | 691 | 1 | 691 | 0.08 | 55.28
Nonresponse bias sample mailing respondents (households with telephone numbers) | 173 | 1 | 173 | 0.17 | 29.41
Nonresponse bias sample mailing nonrespondents (households without telephone numbers) | 418 | 1 | 418 | 0.05 | 20.9
Nonresponse bias sample mailing respondents (households without telephone numbers) | 46 | 1 | 46 | 0.08 | 3.68
Total | | | 39,430 a | | 2,571

a Total includes multiple contacts with some households. The total number of unique individuals contacted will be 10,400 (total valid addresses).
Note: Numbers may not sum because of rounding.
A13. Provide an estimate of the total annual [non-hour] cost burden to respondents or record keepers resulting from the collection of information. (Do not include the cost of any hour burden shown in Items 12 and 14).
* The cost estimate should be split into two components: (a) a total capital and start-up cost component (annualized over its expected useful life) and (b) a total operation and maintenance and purchase of services component. The estimates should take into account costs associated with generating, maintaining, and disclosing or providing the information [including filing fees paid]. Include descriptions of methods used to estimate major cost factors including system and technology acquisition, expected useful life of capital equipment, the discount rate(s), and the time period over which costs will be incurred. Capital and start-up costs include, among other items, preparations for collecting information such as purchasing computers and software; monitoring, sampling, drilling and testing equipment; and record storage facilities.
* If cost estimates are expected to vary widely, agencies should present ranges of cost burdens and explain the reasons for the variance. The cost of purchasing or contracting out information collection services should be a part of this cost burden estimate. In developing cost burden estimates, agencies may consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate.
* Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.
The cost burden on respondents and record-keepers, other than hour burden, is zero.
A 14. Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies also may aggregate cost estimates from Items 12, 13, and 14 in a single table.
The estimated annualized cost to the federal government is $583,785 (Table A3). The cost was calculated based on quantification of hours and operational expenses.
Table A3. Annualized Cost to the Federal Government

Expense Category | Cost
Labor plus overhead | $282,561
Supplies, postage, copying, incentives, telephone | $301,174
TOTAL | $583,785
A15. Reasons for change in burden
There is no change in the burden requested for this submission.
A16. For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.
After data collection is complete, RTI will prepare a final report presenting the results from the survey and the WTP estimates. Section B.2 describes the analytical techniques that will be used to summarize the results and estimate the WTP values.
The time schedule for the project is presented in Table A4. The first mailing will go to 10% of the sample as a pretest; after a few weeks, the mailings for the remaining 90% of the sample will be sent.
Table A4. Time Schedule

Date | First 10% of Sample (Pretest) | Last 90% of Sample (Main survey sample) | Analysis
3/1/11 | Lead postcard mailing | |
3/4/11 – 3/9/11 | 1st survey mailing | |
3/16/11 | Reminder postcard with website address and password | |
3/28/11 – 4/1/11 | 2nd survey mailing | |
4/18/11 | | | Begin analysis of 10% sample
5/9/11 | | Lead postcard mailing | Complete analysis of 10% sample
5/12/11 – 5/17/11 | | 1st survey mailing |
5/24/11 | | Reminder postcard with website address and password |
6/7/11 – 6/10/11 | | 2nd survey mailing |
6/27/11 – 7/1/11 | | 3rd mailing | Begin analysis of final data
7/30/11 | | | Main data collection complete
8/1/11 | | 4th mailing, Federal Express (20% NR*) | Begin nonresponse follow-up study
8/1/11 – 8/17/11 | | Reminder phone call to all households with a number, 2 attempts then message |
9/12/11 | | Data collection complete |
9/19/11 | | | Draft report
9/30/11 | | | Final report
9/30/11 | | | Final data delivery to DOI

*NR = nonrespondents
A17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.
Not Applicable.
A18. Explain each exception to the certification statement.
There are no exceptions to the certification statement.
1 The U.S. Government and PacifiCorp are parties only to the KHSA. The U.S. Government becomes a party to the KBRA upon enactment of authorizing legislation.
2 A similar and potentially relevant SP study conducted outside the United States is Johansson and Kriström (2009), which includes a contingent valuation (CV) analysis of changes in water flow from a hydroelectric dam in Sweden. Another is Morrison and Bennett (2004), which uses SP methods to estimate and compare values for river restoration projects in five catchments in New South Wales, Australia.
3 Based on an average hourly wage of $19.41. See BLS, Employer Costs for Employee Compensation – December 2009, March 10, 2010 (http://www.bls.gov/news.release/ecec.nr0.htm).