Supporting Statement B 5-16-2013


Social Values of Ecosystem Services at Cape Lookout National Seashore

OMB: 1024-0267


Supporting Statement B


Social Values of Ecosystem Services at

Cape Lookout National Seashore


OMB Control Number 1024-NEW




1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of organizations (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


Cape Lookout National Seashore receives approximately 530,000 visitors a year. We used the NPS Public Use Statistics1 for Cape Lookout National Seashore to determine the potential respondent universe for the on-site survey, and 2010 Census data2 to determine the potential respondent universe for Carteret County, NC (Table 1 below).


There will be two target populations for this collection. The first will be a random sample of all on-site visitors (18 years and older) visiting the park during the two sampling periods (summer and fall). The second will be a random sample of local residents (18 years and older) living in the county surrounding the park.


Table 1. Respondent universe and expected number of responses

                            Respondent    Initial     Response    Total expected
                            universe      contacts    rate        returned
  Visitor survey (summer)     62,456        2,112       35%            739
  Visitor survey (fall)       46,339        1,056       35%            370
  Resident survey             54,100        3,600       30%          1,080
  Non-respondent survey        4,579        1,900       10%            190
  Total                                     8,668                    2,379
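The expected-return column is the product of initial contacts and the anticipated response rate; a quick sketch (using the figures from the table above) confirms the arithmetic:

```python
# Verify Table 1: expected returns = initial contacts x response rate.
segments = {
    "Visitor survey (summer)": (2_112, 0.35),
    "Visitor survey (fall)":   (1_056, 0.35),
    "Resident survey":         (3_600, 0.30),
    "Non-respondent survey":   (1_900, 0.10),
}

expected = {name: round(contacts * rate) for name, (contacts, rate) in segments.items()}
total_contacts = sum(contacts for contacts, _ in segments.values())   # 8,668
total_expected = sum(expected.values())                               # 2,379
```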



2. Describe the procedures for the collection of information including:

* Statistical methodology for stratification and sample selection,

* Estimation procedure,

* Degree of accuracy needed for the purpose described in the justification,

* Unusual problems requiring specialized sampling procedures, and

* Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


The sample sizes chosen for each segment of this study will result in a +/- 3% margin of error (at the 95% confidence level). This degree of accuracy will be more than sufficient to meet the needs of this study. We will use the following equation (Dillman, 2007) to determine the sample size:



n = N × p(1 − p) / [ (N − 1) × (b/c)² + p(1 − p) ]

Where:

n: sample size


N: population size. For the resident survey, the population size is estimated from the 2010 U.S. Census, which counted 54,100 people 18 years or older residing in Carteret County, North Carolina. For the visitor survey, we use 2011 visitation statistics, which show 508,116 visitors, as the baseline population size.


p: proportion of the population expected to choose a response category


b: acceptable amount of sampling error; in this case 3%


c: Z-score associated with the acceptable confidence level; in this case 95%, so Z = 1.96
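Plugging the two population sizes into the Dillman formula gives the minimum number of completed surveys needed for a ±3% margin of error at 95% confidence. The sketch below assumes the conventional conservative choice p = 0.5, which the document does not state explicitly:

```python
import math

def dillman_sample_size(N, p=0.5, b=0.03, c=1.96):
    """Minimum completed-survey count n for population N at margin of
    error b and confidence Z-score c (Dillman, 2007). p = 0.5 is the
    most conservative choice and is an assumption here."""
    return math.ceil(N * p * (1 - p) / ((N - 1) * (b / c) ** 2 + p * (1 - p)))

resident_n = dillman_sample_size(54_100)    # ~1,047 completed surveys
visitor_n = dillman_sample_size(508_116)    # ~1,065 completed surveys
```

Both completion targets in Table 1 (1,080 resident and 1,109 visitor surveys) sit slightly above these minimums, so the planned samples meet the stated accuracy requirement.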


Visitor survey: The visitor survey will be conducted over two seasons (summer and fall). Visitation records show that July has more visitors than October. Therefore, we will stratify the sample across the two survey periods. We will contact 2,112 visitors in July and 1,056 in October. We anticipate a 35% response rate, which would result in 1,109 completed surveys.


Resident survey: This will be a mail-out/mail-back survey, and we anticipate a lower response rate because we will not have the personal contact with respondents that the visitor survey provides. To achieve 1,080 completed surveys, we will mail surveys to 4,000 local residents; we expect that 3,600 will be received, assuming that 10% (n = 400) will be returned as undeliverable. The sample will be stratified by zip code. There are 17 zip codes in Carteret County, and the number of surveys sent to each will be proportional to its population size.
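The proportional-by-zip-code allocation described above can be sketched with a largest-remainder split of the 4,000 mailings. The zip codes and population counts below are hypothetical placeholders; the actual allocation would use 2010 Census counts for Carteret County's 17 zip codes:

```python
def allocate(total, populations):
    """Largest-remainder method: shares proportional to population,
    rounded so the allocations sum exactly to `total`."""
    pop_sum = sum(populations.values())
    quotas = {z: total * p / pop_sum for z, p in populations.items()}
    alloc = {z: int(q) for z, q in quotas.items()}
    leftover = total - sum(alloc.values())
    # Hand the remaining units to the zips with the largest fractional parts.
    for z in sorted(quotas, key=lambda z: quotas[z] - int(quotas[z]), reverse=True)[:leftover]:
        alloc[z] += 1
    return alloc

# Hypothetical zip populations, for illustration only.
zips = {"28512": 8_200, "28516": 11_400, "28557": 9_900, "28570": 16_800}
shares = allocate(4_000, zips)   # sums exactly to 4,000
```

The largest-remainder step matters because naive rounding of each zip's share can leave the total a few surveys above or below 4,000.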


An unusual situation is that residents of the area may also be visitors to the park. During the face-to-face contact at the park (see script below), each visitor approached will be asked whether they received a survey at their residence in April. Only those who did not receive a resident survey will be invited to complete a visitor survey. This method will be used in both the July and October survey periods to avoid double sampling.




3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


Several steps will be taken to maximize response rates and ensure an accurate and reliable sample.

We will follow The Tailored Design Method (Dillman, 2007) for mail surveys to help ensure a high response rate and representative sample.


The process for both samples will follow the protocols described below:


On-site Visitor survey


  1. A random sample of visitors will be approached at several locations in the park.

  2. The mail-back method will be used as the most suitable method for collecting surveys. The selected visitors will be asked if they would like to participate in the survey.

  3. Those who refuse will be asked to provide a reason for their refusal.

  4. Those agreeing to participate will be asked for their mailing address or email address (as a substitute) for follow-up purposes. The zip code will also be used for non-response bias checking.

  5. All selected individuals will be asked three questions, and gender will be observed and recorded. The information will be recorded in a survey log for non-response bias checking. Those who agree to participate in the survey will receive instructions on how to complete the survey and return it by mail.

  6. Individuals will be provided a pre-addressed, postage-paid envelope to return the survey. Follow-up reminders will be sent 2, 4, and 6 weeks after the initial contact at the park.


Interview script

The initial contact with visitors will be used to explain the study and determine whether visitors are interested in participating (see script below). The initial contact will take approximately 1 minute to complete. If a group is encountered, the interviewer will ask the individual within the group who has the next birthday to serve as the respondent for the study. All individuals approached (both those who refuse and those who accept) will be asked three on-site non-response bias questions, used to collect information for the final analysis (see below); this will take an additional two minutes. The number of refusals will be recorded and used to calculate the overall response rate for the collection.


Visitors selected for participation in the survey will be read the following script:


Hello, my name is _________. I am conducting a survey for the National Park Service to better understand your opinions about Cape Lookout National Seashore programs and services. Your participation is voluntary and all responses will be kept anonymous. Would you be willing to take a survey and mail it back to us using the self-addressed envelope? It will take about 20 minutes to complete after your visit. Have you already received a survey from Cape Lookout in the mail?



IF YES: Thank you and enjoy the rest of your visit.

IF NO: Ask to speak to the person who has the next birthday (at least 18 years old). Are you willing to participate?

IF NO: Thank you. Would you mind if I ask you a few quick questions?

IF YES: Thank you. Would you mind if I ask you a few quick questions?

  1. Is this your first time visiting Cape Lookout National Seashore?

  2. What is your zip code?

  3. How many people are in your group?

IF NO: Ask reason for refusal.


Once the visitor has agreed to participate in the study, we will ask them to provide, or personally record, their name, mailing address, and email address on a survey log sheet; this information will be used only to follow up with non-respondents.


Addressing potential non-response bias


In addition to the three on-site non-response questions (see script above), we will use zip codes to calculate the distance traveled to the park as a check for non-response bias. Participants' gender and group size will also be recorded in the survey log. If non-response bias is found, the data will be weighted to reduce its effect.
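The weighting adjustment described above can be sketched as simple cell weighting: each respondent group's weight is its share among all contacts divided by its share among returned surveys. The shares below are hypothetical; in practice they would come from the survey log (all contacts) versus the returned questionnaires:

```python
# Cell-weighting sketch with hypothetical gender shares.
contact_share   = {"female": 0.52, "male": 0.48}   # from the on-site log
responded_share = {"female": 0.60, "male": 0.40}   # from returned surveys

# weight = population (contact) share / respondent share
weights = {g: contact_share[g] / responded_share[g] for g in contact_share}

# Multiplying each returned survey by its group's weight makes the
# weighted sample match the gender mix of everyone contacted.
```

The same computation applies to any logged characteristic (group size, distance-traveled band) for which respondents and non-respondents differ.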


Residence Survey


1. Participants will be informed about the survey request in advance. We will mail an introductory letter.

2. There will be a cover letter (included with the questionnaire) that explains:

  • The purpose of the survey

  • The reason for participation

  • The terms of anonymity and how the results will be used

3. We will provide a pre-addressed, postage paid envelope to return the survey.

4. We will send reminders during the survey period to those who have not completed the survey.


Addressing potential non-response bias


We will send a short survey card to 1,900 non-respondents, and we expect that 10% (n = 190) will respond. The information from this survey will be used to test for non-response bias among local residents. The results will be weighted if non-response bias is found.




4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


A thorough literature review was conducted prior to designing the questionnaire. Sixteen of 22 questions in the visitor survey (73%), and 15 of 24 questions in the resident survey (62%), came from the NPS Pool of Known Questions that have previously been approved by OMB (OMB Control Number: 1024-0224; Expiration Date: 8/31/2014). We also contacted five experts in the field of survey development and design and asked them to provide feedback on the clarity of questions, questionnaire design, and sampling procedures. The experts provided editorial comments and suggestions that were used to improve the design and order of questions.


Two groups of graduate students (University of Idaho) were used to test the length and clarity of the questionnaires. One group (n = 8) tested the resident survey and another group (n = 7) tested the visitor survey. We asked both groups to complete the entire questionnaire as if they were visitors to Cape Lookout or residents of Carteret County. Each student was also asked to record the time needed to answer the entire questionnaire and to note any questions that needed rewording to improve clarity. Average completion times were calculated from the times reported by the two groups.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Collection and analysis agency:


Lena Le, Ph.D., Assistant Professor and Interim Director

Park Studies Unit

University of Idaho

Phone: 208-885-2585

Email: lenale@uidaho.edu

2 http://quickfacts.census.gov/qfd/states/37/37031.html
