
SUPPORTING STATEMENT


SOCIOECONOMICS OF CORAL REEF CONSERVATION


OMB CONTROL NO. 0648-0646



B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g. establishments, State and local governmental units, households, or persons) in the universe and the corresponding sample are to be provided in tabular form. The tabulation must also include expected response rates for the collection as a whole. If the collection has been conducted before, provide the actual response rate achieved.


The potential respondent universe for this study is adults, eighteen years or older, who live near, and may use, coral reefs affected by activities related to NOAA's Coral Reef Conservation Program. The total population (all individuals) of the potentially impacted area is 11,244,759. Respondents will be classified into seven geographical jurisdictions and 22 reporting units, as defined in Table 5. In American Samoa, face-to-face interviews will be conducted; in the remaining jurisdictions, a combination of internet, face-to-face, and telephone interviews are the likely implementation modes. Each geographical jurisdiction is expected to be surveyed once every five to six years. Respondents will be randomly selected from the target audiences (see B.2). In our most recent round of completed coral reef jurisdictional surveys, we observed face-to-face response rates of 90% in American Samoa, 60% in Guam, and 51% in CNMI; the corresponding telephone survey response rates were 13.4% in Florida, 29% in Hawaii, 28% in the USVI, and 2% in Puerto Rico. A previous study of a different marine resource (Hawaiian monk seals) reported response rates of 50% for mail surveys, 40% for internet surveys, and 80% for in-person surveys [1]. Dillman et al. (2009) consider a response rate above 50% high for a mail survey [2].



Table 5: Study Jurisdictions and Reporting Units

Jurisdiction | Reporting Units | Population
Puerto Rico | 9 socio-economic regions (Aguadilla, Arecibo, Bayamon, Caguas, Carolina, Humacao, Mayaguez, Ponce, and San Juan) | 3,725,789 [3]
Florida | Monroe, Miami-Dade, Martin, Broward, and Palm Beach Counties | 5,784,043 [4]
U.S. Virgin Islands | St. Thomas, St. Croix, and St. John | 106,405 [5]
Guam | 19 county subdivisions | 159,358 [6]
American Samoa | Tutuila Island; Ofu, Olosega, and Ta'u Counties | 55,070 [7]
Main Hawaiian Islands | Hawaii, Honolulu, Kauai, and Maui Counties | 1,360,211 [8]
Commonwealth of the Northern Mariana Islands (CNMI) | Rota, Tinian, and Saipan Municipalities | 53,883 [9]
Total | | 11,244,759


2. Describe the procedures for the collection, including: the statistical methodology for stratification and sample selection; the estimation procedure; the degree of accuracy needed for the purpose described in the justification; any unusual problems requiring specialized sampling procedures; and any use of periodic (less frequent than annual) data collection cycles to reduce burden.


For each of the jurisdictional populations, we intend to select a random sample of individuals over the age of eighteen, stratified geographically as described in Table 6. The random sample will be obtained from the selected survey firm using standard sample selection tools. The sample frame will be developed from telephone directories, mailing lists obtained and maintained by the survey firms, and other sources as needed, depending on the coverage of these sources. These strata have been designed to account for the differing sizes of the populations in the areas close to coral reefs.
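
To illustrate the selection procedure, the following is a minimal sketch (Python) of drawing a simple random sample within each geographic stratum from a combined sample frame; the record layout and the per-stratum targets are illustrative assumptions, not a description of the survey firms' actual tools.

    import random

    def stratified_sample(frame, targets, seed=0):
        # frame: list of records (dicts) with a "stratum" key, compiled from
        # telephone directories, contractor mailing lists, and other sources.
        # targets: dict mapping stratum name -> required sample size
        # (e.g., the per-stratum figures in Table 6).
        rng = random.Random(seed)
        sample = []
        for stratum, size in targets.items():
            pool = [record for record in frame if record["stratum"] == stratum]
            sample.extend(rng.sample(pool, min(size, len(pool))))
        return sample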


We have used the standard approach to estimating the sample size for a stratified population:

n = [t² N p(1-p)] / [t² p(1-p) + a² (N-1)]

where n is the sample size, N is the total number of cases, a is the expected error, t is the value taken from the t distribution corresponding to a certain confidence level, and p is the probability of an event. The final sample size will be based on available resources.
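
As an illustration, the following minimal sketch (Python) evaluates this formula; the parameter values shown (t = 1.96 for a 95% confidence level, p = 0.5 as the most conservative assumption, and a = 0.05) are illustrative assumptions and are not fixed by this statement.

    import math

    def stratum_sample_size(N, t=1.96, p=0.5, a=0.05):
        # n = [t^2 N p(1-p)] / [t^2 p(1-p) + a^2 (N-1)]
        n = (t**2 * N * p * (1 - p)) / (t**2 * p * (1 - p) + a**2 * (N - 1))
        return math.ceil(n)

    # Under these assumed parameters, any large stratum requires roughly
    # 385 respondents, consistent with several per-stratum targets in Table 6.
    print(stratum_sample_size(1360211))  # -> 385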

Table 6: Sampling Requirements by Geographical Jurisdictions

Jurisdiction | Total Sample | Sample Size by Stratum
1. American Samoa | 652 | Tutuila Island: 448; Ta'u County: 124; Olosega County: 40; Ofu County: 40
2. CNMI | 900 | Saipan Municipality: 375; Tinian Municipality: 270; Rota Municipality: 255
3. Guam | 712 | Agat Municipality: 75; Piti Municipality: 75; Asan Municipality: 75; Talofofo Municipality: 75; Merizo Municipality: 75; Tamuning Municipality: 165; Mangilao Municipality: 172
4. Hawaii | 1,700 | Hawaii County: 400; Honolulu County: 500; Kauai County: 400; Maui County: 400
5. Florida | 2,000 | Monroe County: 385; Miami-Dade County: 410; Martin County: 400; Broward County: 405; Palm Beach County: 400
6. Puerto Rico | 3,500 | Aguadilla Region: 388; Mayaguez Region: 388; Arecibo Region: 388; Ponce Region: 389; Bayamon Region: 392; San Juan Region: 391; Caguas Region: 388; Humacao Region: 388; Carolina Region: 388
7. US Virgin Islands | 1,125 | St. Croix: 385; St. Thomas: 385; St. John: 355
Total | 10,589 |


In addition to questions regarding the impact of the Coral Reef Conservation Program, the surveys will collect socioeconomic and demographic information. This additional information will be used to sort and categorize the survey results in order to control for as many variables as possible, and will ensure a respondent pool large enough (particularly in the more populated jurisdictions) to support comparisons between strata where required.


In each of the jurisdictions, we intend to hire qualified survey contractors with databases of contact information to allow for the greatest possible randomization and coverage of survey participants. NOAA will also work with these contractors to select the most cost-effective survey methodology that will resonate with the population being measured. In American Samoa, participants will be selected for face-to-face interviews because of the very low incidence (and low reliability) of both cellular phones and land lines; in other locations, local opinion-poll contractors will select participants at random using a combination of internet and telephone polling. In some locations, mixed modes, such as face-to-face interviews combined with telephone surveys, or internet combined with telephone surveys, may be used to increase response rates. The methodology to be employed in each jurisdiction can be found in Table 7.


Table 7: Survey Methodology by Geographical Jurisdictions

Jurisdiction | Survey Mode | Estimated Response Rate (based on previous NOAA 0646 surveys)
1. American Samoa | Face-to-face | 90%
2. CNMI | Telephone or face-to-face | 20% (telephone); 29% (face-to-face)
3. Guam | Telephone or face-to-face | 13% (telephone); 60% (face-to-face)
4. Hawaii | Telephone or internet | 30% (telephone)
5. Florida | Telephone or internet | 13% (telephone)
6. Puerto Rico | Telephone or internet | 6% (telephone)
7. US Virgin Islands | Telephone or face-to-face | 20% (telephone); 20% (face-to-face)


Survey Specific Challenges

As can be seen from Table 7, we have selected different data-collection methods for different jurisdictions. Table 8 shows the percent of the population classified as internet users in each of the seven jurisdictions. In general, we will attempt to collect data using a mixture of internet and telephone methods. The one exception is American Samoa, where an in-person household survey will be conducted due to the extremely low level of internet usage in this jurisdiction (approximately 6 percent). In addition, average internet use in CNMI, Guam, Puerto Rico, and the US Virgin Islands is 39 percent, compared with 79 percent for Hawaii and Florida. As a result, we will likely use a telephone survey or a mixed-mode approach in these jurisdictions in order to capture non-internet users.


Table 8: Internet Usage in Survey Jurisdictions

Jurisdiction | Population | Percent of Population Classified as Internet Users
1. American Samoa | 55,070 | 6%
2. CNMI | 53,883 | 30%
3. Guam | 159,358 | 56%
4. Hawaii | 1,360,211 | 79%
5. Florida | 5,784,043 | 80%
6. Puerto Rico | 3,725,789 | 40%
7. US Virgin Islands | 106,405 | 28%
Source: Hawaii and Florida data from the 2010 US Census. Other data from "Internet World Statistics" (http://www.internetworldstats.com/): American Samoa, March 2011; CNMI, August 2010; Guam, June 2010; Puerto Rico, June 2011; US Virgin Islands, December 2002.


We expect that there will be some language issues. As Table 9 shows, several major languages beyond English are spoken in each jurisdiction.


Table 9: Languages Spoken in Survey Jurisdictions

Jurisdiction | Major Languages Spoken
1. American Samoa | English, Samoan
2. CNMI | English, Chamorro, Carolinian, Tagalog, Chinese, Korean, Japanese
3. Guam | English, Chamorro, Tagalog, Chinese, Korean, Japanese
4. Hawaii | English
5. Florida | English, Spanish
6. Puerto Rico | English, Spanish
7. US Virgin Islands | English, Spanish


This language issue will be mitigated by the use of polling specialists who speak the local languages. Where appropriate, the survey contractors will ensure that the questions posed in the survey are translated into the proper cultural contexts. Responses will be tracked to determine whether there are statistically significant differences in survey results between respondents who speak English at home and those who do not. In addition, surveys will be translated into local languages where appropriate. In past iterations of the survey for which this renewal and extension is being submitted, language issues were successfully accounted for by multilingual survey staff who assisted in translating the questionnaire. There were no major barriers, and this is not expected to change significantly for the next round of surveys.
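
As one illustration of how this comparison might be carried out, the sketch below applies a chi-square test of independence to hypothetical response counts; the counts, the three response categories, and the use of scipy are assumptions for the example only, not prescribed analysis choices.

    from scipy.stats import chi2_contingency

    # Hypothetical cross-tabulation of one categorical survey item:
    # rows are home-language groups, columns are response categories.
    table = [
        [120, 80, 40],  # speaks English at home
        [60, 70, 30],   # speaks another language at home
    ]
    chi2, p_value, dof, _ = chi2_contingency(table)
    # A small p-value would indicate a statistically significant difference
    # in responses between the two language groups.
    print(f"chi2={chi2:.2f}, p={p_value:.3f}, dof={dof}")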

We also expect some risk of sample-selection bias toward respondents with higher incomes, particularly for the telephone and internet surveys. In areas where access to phone and internet services is not widely available, this bias may be more than minimal. To the greatest extent possible, we hope this can be corrected through the use of telephone surveys. If responses appear to over-represent high-income groups, we will use weighting procedures in the post-survey analysis to adjust for the bias. Specifically, we will up-weight the under-represented groups if expected responses are not obtained. We will identify 'control totals' for the population that the survey is aiming to reach and calculate weights that adjust the sample totals to the control totals. For example, suppose the distribution of income groups in the population is as follows:

Income | Percent of Population
$0-$25K | 28.00
$25K-$50K | 27.00
$50K-$75K | 18.00
$75K-$100K | 11.00
$100K plus | 16.00
Total | 100.00

However, the response distribution is:



Income | Percent of Respondents
$0-$25K | 5.00
$25K-$50K | 27.00
$50K-$75K | 18.00
$75K-$100K | 11.00
$100K plus | 39.00
Total | 100.00

In response to this disparity, we would weight the sample for the "$0-$25K" and "$100K plus" groups to bring them into line with each group's proportion of the population as a whole. That is, we would apply a weight of 5.6 to the results for the "$0-$25K" group (i.e., 28 percent divided by 5 percent) and a weight of approximately 0.41 to the "$100K plus" group (16 percent divided by 39 percent) to correct for its over-representation.
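
A minimal sketch (Python) of this post-stratification calculation, reproducing the worked example above (the group labels are shorthand):

    # Control totals (population shares) and observed respondent shares.
    population_share = {"0-25K": 28.0, "25-50K": 27.0, "50-75K": 18.0,
                        "75-100K": 11.0, "100K plus": 16.0}
    respondent_share = {"0-25K": 5.0, "25-50K": 27.0, "50-75K": 18.0,
                        "75-100K": 11.0, "100K plus": 39.0}

    # Weight for each group = population share / respondent share.
    weights = {group: population_share[group] / respondent_share[group]
               for group in population_share}
    print(weights)  # "0-25K" -> 5.6; "100K plus" -> ~0.41, as in the text.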

Periodicity

This survey will be conducted approximately every five to six years to minimize the cost burden.

3. Describe the methods used to maximize response rates and to deal with nonresponse. The accuracy and reliability of the information collected must be shown to be adequate for the intended uses. For collections based on sampling, a special justification must be provided if they will not yield "reliable" data that can be generalized to the universe studied.


While the surveys conducted in person are expected to yield standard response rates (80% based on previous NOAA surveys of targeted populations) [10], there is some concern about potential non-response in the telephone/internet surveys. While response rates for many surveys in the United States have been declining for years, previous studies indicate that the low response rates commonly associated with internet polling can be somewhat improved with the use of pre-poll telephone calls. To accomplish this, polling representatives ask respondents whether they are willing to participate in the online study and then direct them to it via a secure link or email. In addition, we will conduct extensive online advertising to encourage response. Research has shown that under these conditions, internet and telephone surveys can achieve response rates similar to those found in mail surveys [11].


A variety of techniques have been incorporated into this study to maximize response rates. The surveys are user-friendly, with clear, easy-to-comprehend questions. Each questionnaire is short and can be completed quickly (see Part A). The survey topic and related questions were developed to be interesting to respondents. Each survey uses listing options that allow the respondent to answer questions by checking appropriate boxes, which may aid recall and analysis.


As an option, in-person surveys may be conducted at respondents' homes, and participants will be given the opportunity to receive and/or return the survey by mail if they are unable to complete it at the time of the interview. Individuals who complete the survey by mail will receive a pre-addressed stamped return envelope.


There may be instances (subject to budget constraints) where the mode of survey delivery in a given U.S. jurisdiction will be via mail. If this is the case, survey implementation will be based on the Dillman Tailored Design Method [12]. This approach includes multiple steps and points of contact. The initial mailing will include the questionnaire, a pre-addressed stamped envelope, and a detailed cover letter. The cover letter will explain the project and why a response is important, state that all personal information will be kept confidential, and provide instructions for completing and returning the survey (via mail/fax/email). Addresses on envelopes will be handwritten, and colored envelopes will be used to make them stand out. Surveys will be tracked using individual identification numbers. A follow-up thank-you postcard will be sent seven to nine days after the questionnaire. The postcard will express appreciation for participating and will indicate that if the completed questionnaire has not yet been mailed, it is hoped that it will be returned soon. Three weeks after the initial mailing, a second mailing will be sent to all who have not returned the survey. This follow-up will consist of a different cover letter, another copy of the questionnaire, and another pre-addressed stamped envelope.


For internet surveys, we will use a number of techniques [13] to increase response, including:

  • Subject lines on contact emails that clearly indicate the purpose of the survey and explicitly avoid spam-like language in the subject line or body of the message (e.g., a title in all capital letters)

  • Information on how the respondent's name was obtained, the survey's purpose, the use of the data, and guarantees of anonymity

  • Personalized messages

  • Use of a ".gov" reply email address

  • An indication of how long the survey takes to complete and the cutoff date

  • Use of only clean and updated email lists

  • Scheduled regular reminders and follow-ups


Cross-cultural research faces additional methodological challenges that, if not properly addressed, may considerably increase the risk of inferential errors during the administration of surveys [14]. Specifically, concepts may carry culture-specific attributes and meanings that need to be explicitly taken into account to ensure sound interpretation of cross-cultural data [15]. As discussed above (see Question 2), we will address this cross-cultural issue by using polling specialists who speak the local language to conduct in-person and phone surveys. These specialists' knowledge of local culture and idioms is anticipated to have a positive impact on survey response rates.


To increase response rates for telephone surveys, we will use a number of techniques. First, we will work with survey firms to ensure an accurate, up-to-date list of phone numbers from which to draw potential respondents. Second, we will use a combination of proven approaches to increase survey response, including training interviewers so they are sensitive to cultural issues and know how to administer the survey, clearly establishing researcher credentials in the introduction, and increasing call attempts while targeting call times. These methods have proven effective in increasing response rates for telephone surveys [16]. In addition, we will ensure that multilingual telephone staff are available for calls where language may be an issue.


4. Describe any tests of procedures or methods to be undertaken. Tests are encouraged as effective means to refine collections, but if ten or more test respondents are involved OMB must give prior approval.


The entities contracted to implement the surveys will be asked to demonstrate their competence in survey administration techniques. This would include providing past examples of survey work or recordings of previous survey efforts. They will be expected to test each instrument on nine participants prior to execution of the full survey, including participants interviewed by staff who speak their languages. This survey pre-test will allow for the refinement and correction of any methodological issues that are identified.


5. Provide the name and telephone number of individuals consulted on the statistical aspects of the design, and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Individuals consulted on the statistical aspects of the design, survey implementation and data analysis are listed below:

Peter E.T. Edwards and Jarrod Loerzel will supervise data collection. Data analysis will be completed by Jarrod Loerzel, Matt Gorstein, Arielle Levine, and Peter E.T. Edwards.


Peter E.T. Edwards, PhD.

Natural Resource Economist and Social Science Coordinator

The Baldwin Group Inc. (on contract for)

NOAA Coral Reef Conservation Program

National Ocean Service, Office for Coastal Management

1305 East West Highway, SSMC4, Room 10417

Silver Spring, MD, 20910

Tel 240-533-0784, Fax 301-713-4012/4389

Peter.Edwards@noaa.gov


Jarrod Loerzel

Social Scientist

JHT Inc. (on contract for)

National Centers for Coastal Ocean Science

NOAA National Ocean Service

Hollings Marine Laboratory

331 Fort Johnson Road

Charleston, SC 29412

Tel 843-460-9938

Jarrod.Loerzel@noaa.gov


Matt Gorstein

Social Scientist

JHT Inc. (on contract for)

National Centers for Coastal Ocean Science

NOAA National Ocean Service

Hollings Marine Laboratory

331 Fort Johnson Road

Charleston, SC 29412

Tel 843-460-9933

Matt.Gorstein@noaa.gov


Arielle Levine, PhD

Social Scientist

The Baldwin Group Inc. (on contract for)

NOAA Coral Reef Conservation Program

National Ocean Service, Office for Coastal Management

Tel 619-594-5600

Arielle.Levine@noaa.gov



References


Cantor, D., and Cunningham, P. (2002). "Methods for Obtaining High Response Rates in Telephone Surveys." In Ver Ploeg, M., Moffitt, R.A., and Citro, C.F. (Eds.), Studies of Welfare Populations: Data Collection and Research Issues. Panel on Data and Methods for Measuring the Effects of Changes in Social Welfare Programs, Committee on National Statistics, Division of Behavioral and Social Sciences and Education, National Research Council.

Dillman, D.A. (2000). Mail and Internet Surveys: The Total Design Method (2nd ed.). New York: Wiley.

Dillman, D., Smyth, J., and Christian, L. (2009). Internet, Mail and Mixed-Mode Surveys: The Tailored Design Method. New York: John Wiley & Sons.

Division of Instructional Innovation and Assessment, The University of Texas at Austin (2007). "Guidelines for Maximizing Response Rates." Instructional Assessment Resources. http://www.utexas.edu/academic/diia/assessment/iar/teaching/gather/method/surveyResponse.php

Loomis, David K. (2009). "Beach Users' Perceptions Concerning Zuma Beach Restoration." University of Massachusetts Amherst.

NOAA (2007). "Washington-Oregon-California Purse Seine Survey." OMB Control No. 0648-0369, Gulf States Marine Fisheries Commission (GSMFC).

NOAA (2011). "2011 National Marine Recreational Fishing Expenditure Survey."

Peng, T.K., Peterson, M.F., and Shyi, Y.-P. (1991). "Quantitative Methods in Cross-National Management Research: Trends and Equivalence Issues." Journal of Organizational Behavior, 12(2), 87-107.

Singh, J. (1995). "Measurement Issues in Cross-Cultural Research." Journal of International Business Studies, 26(3), 597-619.

Sustainable Resources Group International, Inc. (2011). "Public Perception and Attitudes about the Hawaiian Monk Seal, Survey Results Report." Prepared for NOAA Fisheries Service Pacific Islands Regional Office.






[1] See Sustainable Resources Group International, Inc. (2011), "Public Perception and Attitudes about the Hawaiian Monk Seal, Survey Results Report," prepared for NOAA Fisheries Service Pacific Islands Regional Office, April 2011; Loomis, David K. (2009), "Beach Users' Perceptions Concerning Zuma Beach Restoration," University of Massachusetts Amherst; NOAA (2007), "Washington-Oregon-California Purse Seine Survey," OMB Control No. 0648-0369, Gulf States Marine Fisheries Commission (GSMFC); and NOAA (2011), "2011 National Marine Recreational Fishing Expenditure Survey."

[2] Dillman, D., Smyth, J., and Christian, L. (2009). Internet, Mail and Mixed-Mode Surveys: The Tailored Design Method. New York: John Wiley & Sons.

[4] http://factfinder2.census.gov/faces/nav/jsf/pages/index.xhtml [Census total population figures for 5 counties, 2010]

[6] http://2010.census.gov/news/releases/operations/cb11-cn179.html [Guam county subdivisions: Agana Heights, Agat, Asan, Barrigada, Chalan Pago-Ordot, Dededo, Hagåtña, Inarajan, Mangilao, Merizo, Mongmong-Toto-Maite, Piti, Santa Rita, Sinajana, Talofofo, Tamuning, Umatac, Yigo, and Yona]

[7] http://2010.census.gov/news/xls/cb11cn177_as.xls [Eastern & Western Districts; Ofu, Olosega, and Ta'u Counties, 2010]

[8] http://factfinder2.census.gov/faces/nav/jsf/pages/index.xhtml [Census total population figures for Hawai'i, Honolulu, Kauai, and Maui Counties, 2010]

[10] See Sustainable Resources Group International, Inc. (2011), "Public Perception and Attitudes about the Hawaiian Monk Seal, Survey Results Report," prepared for NOAA Fisheries Service Pacific Islands Regional Office, April 2011; NOAA (2007), "Washington-Oregon-California Purse Seine Survey," OMB Control No. 0648-0369, Gulf States Marine Fisheries Commission (GSMFC); and NOAA (2011), "2011 National Marine Recreational Fishing Expenditure Survey."

[11] See DSS Research, Inc. (2000), "Complementary Methodologies: Internet versus Mail Surveys."

[12] Dillman, D., Smyth, J., and Christian, L. (2009). Internet, Mail and Mixed-Mode Surveys: The Tailored Design Method. New York: John Wiley & Sons.

[13] See Dillman, D.A. (2000), Mail and Internet Surveys: The Total Design Method (2nd ed.), New York: Wiley; and Division of Instructional Innovation and Assessment, The University of Texas at Austin (2007), "Guidelines for Maximizing Response Rates," Instructional Assessment Resources, http://www.utexas.edu/academic/diia/assessment/iar/teaching/gather/method/surveyResponse.php

[14] Singh, J. (1995). "Measurement Issues in Cross-Cultural Research." Journal of International Business Studies, 26(3), 597-619.

[15] Peng, T.K., Peterson, M.F., and Shyi, Y.-P. (1991). "Quantitative Methods in Cross-National Management Research: Trends and Equivalence Issues." Journal of Organizational Behavior, 12(2), 87-107.

[16] These approaches have been shown to have a positive impact on response rates; see Cantor, D., and Cunningham, P. (2002), "Methods for Obtaining High Response Rates in Telephone Surveys," in Ver Ploeg, M., Moffitt, R.A., and Citro, C.F. (Eds.), Studies of Welfare Populations: Data Collection and Research Issues, Committee on National Statistics, Division of Behavioral and Social Sciences and Education, National Research Council.


