SUPPORTING STATEMENT
SOCIOECONOMICS OF CORAL REEF CONSERVATION
OMB CONTROL NO. 0648-xxxx
INTRODUCTION
This request is for a new information collection.
The National Oceanic and Atmospheric Administration (NOAA) created the Coral Reef Conservation Program to safeguard and ensure the welfare of the coral reef ecosystems along the coastlines of America’s States and Territories. The administration of this program has potential economic and cultural impacts on the lives of nearby residents and citizens. In accordance with its mission goals, NOAA has designed a survey to help assess the impacts of the Coral Reef Conservation Program.
The survey is designed to be repeated every three to four years in order to provide longitudinal data about the impact of the Coral Reef Conservation Program.
The purpose of this information collection is to obtain information from individuals in the seven US jurisdictions containing coral reefs. Specifically, NOAA is seeking information on the behaviors and activities related to coral reefs, as well as information on knowledge and attitudes related to coral reefs and specific reef protection activities.
The Coral Reef Conservation Program (CRCP), developed under the authority of the Coral Reef Conservation Act of 2000 (P.L. 106-562; 16 U.S.C. 6401 et seq.), is responsible for programs intended to enhance the conservation of coral reefs. Under this authority, CRCP works with local partners in Florida, the US Virgin Islands, Puerto Rico, Hawaii, American Samoa, Guam, and the Commonwealth of the Northern Mariana Islands to reduce key threats to coral reefs, including climate change, land-based sources of pollution, and impacts from fishing.
CRCP is embarking on a new National Coral Reef Monitoring Program (NCRMP), intended to enhance the conservation of coral reefs. As part of this program, CRCP intends to gather and monitor a collection of socioeconomic variables, including those related to knowledge, attitudes, and perceptions of coral reefs and coral reef management of jurisdictional residents.
CRCP intends to use the information collected through this instrument for research purposes as well as for measuring and improving the results of its reef protection programs. Because many of CRCP's efforts to protect reefs rely on education and on changing attitudes toward reef protection, the information collected will allow CRCP staff to ensure that programs are designed appropriately at the start, that future program evaluation efforts are as successful as possible, and that outreach efforts are targeting the intended recipients with useful information.
2. Explain how, by whom, how frequently, and for what purpose the information will be used. If the information collected will be disseminated to the public or used to support information that will be disseminated to the public, then explain how the collection complies with all applicable Information Quality Guidelines.
The purpose of the survey is to gather longitudinal information from residents in Florida, US Virgin Islands, Puerto Rico, Hawaii, American Samoa, Guam, and Commonwealth of the Northern Mariana Islands (CNMI) related to their knowledge, attitudes, and perceptions of coral reefs and coral reef management practices.
As part of the NCRMP, CRCP, in consultation with partners and stakeholders, developed a set of long-term core indicators that will be measured over time for each of the coral reef jurisdictions. The data gathered as part of this information collection request will assist CRCP in tracking these indicators and improving the results of its existing and future programs. A description and the relevance of each indicator are shown in Table 1 below.
Table 1: National Indicators for the National Coral Reef Ecosystem Monitoring Program
National Indicator | Priority | Importance of Gathering Data to Measure Indicator
1. Participation in reef activities (including snorkeling, diving, fishing, harvesting) | Critical | Understand the economic and recreational importance of coral reefs to local residents; understand the level of extractive and non-extractive pressure on reefs
2. Knowledge of coral reef rules and regulations | Critical | Tracking this information over time at the jurisdictional or national level will provide a better understanding of the effect of investing in education and outreach
3. Perceived compliance with coral reef rules and regulations | Critical | Determine how people are impacting coral reefs and the effectiveness of regulations and enforcement efforts
4. Perceived resource condition | Critical | Complement biophysical information; key to understanding public support for various management strategies
5. Knowledge of threats to coral reefs | Critical | Monitoring this information over time is key to tracking whether CRCP constituents understand threats to coral reefs; data gathered will help inform management strategies and education and outreach efforts
6. Attitudes towards coral reef management strategies | Critical | Monitoring this information over time will be valuable to decision-makers; information collected will assist decision-makers in evaluating and improving existing strategies and designing new management approaches
7. Participation in behaviors that may improve coral reef health | Critical | Improve existing knowledge and gain a better understanding of how human behaviors impact coral reefs positively and negatively
8. Cultural importance of reefs | Critical | Understand the traditional and cultural significance of coral reefs to jurisdictional residents and whether that significance is changing over time
9. Population change near coral reefs | Important | Determine how changing population trends increase pressure on coral reefs and reef-adjacent populations
10. Economic impact of coral reef fishing to jurisdiction | Important | Track the economic contributions of coral reefs to reef fishing and justify government funding of coral reef protection programs
11. Economic impact of dive/snorkel tourism to jurisdiction¹ | Important | Track the economic contributions of coral reefs to tourism and justify government funding of coral reef protection programs
While the indicators to be measured are applicable to all jurisdictions, it is important to note that there are considerable geographical, cultural, and linguistic differences among nearby residents and tourists to these coral reef areas. In order to provide flexibility in the data collection instrument to account for those and other differences, CRCP decided to construct a bank of questions instead of administering a single survey to all jurisdictions. The question bank will ensure that specific topics relevant to each of the seven jurisdictions are addressed, and that the questions asked as part of the surveys will be relevant to the target audiences and the sampled populations.
The bank of questions (which ultimately contains 138 questions) was created in coordination with NOAA staff and partners in these jurisdictions, and incorporates questions from earlier regional and local surveys, published articles, and other information pertaining to coral reefs and coral reef management. All of the questions included in the bank are associated with one or more national indicators and are therefore relevant to measuring these indicators. In addition to the indicator-related questions, a number of demographic questions were included to allow CRCP to sort the responses into subgroups and analyze how demographics relate to question responses.
Table 2 presents a summary of the question categories included in the question bank.
Table 2: Question Bank Categories
Question Number | Category | Description
1-13 | Attitude toward / importance of coral reefs | Importance of coral reef aspects, including willingness to pay for coral reef protection, and satisfaction with the state of coral reefs over time
14-28 | Participation in coral reef activities | Frequency of participation in coral reef activities, including activities conducted at the coral reef jurisdiction and how deterioration of coral reef conditions could affect participation in these activities
29-35 | Perceived threats to coral reefs | Perceived threats in the coral reef jurisdiction, including familiarity with common threats to coral reefs and perception of their potential impact
36-48 | Marine Protected Areas | Familiarity with Marine Protected Areas (MPAs), including perceived purpose, benefits and impact, and effect on coral reef activities
49-66 | Resource conditions of coral reefs | Perception of the condition of coral reefs over time, and willingness to accept actions such as limited access, increased restrictions on coral reef activities (e.g. fishing, boating), more stringent pollution regulation, and statutes limiting development
67-73 | Coral reef changes since establishment of MPAs | Perceived changes since the introduction of MPAs and the impact of these changes on personal use of coral reef areas
74-81 | Knowledge of rules/regulations | Knowledge of regulations and restrictions applicable to coral reef activities in MPAs, and knowledge/perceptions of the effectiveness of traditional or cultural methods for managing resources
82-85 | Compliance with rules/regulations | Perception of the level of compliance with regulations related to coral reefs (e.g. by fishers, divers, local population, tourists), perception of enforcement levels, and rationale for following coral reef regulations
86-100 | Coral reef management processes | Level of support for environmental causes, including donations, volunteering activities, and involvement in activities related to the management of coral reefs; perceptions of the success of coral reef strategies and regulations and of the roles of the Federal government, local government, and local communities in protecting coral reefs
101-104 | Support for management processes and regulations | Perception of the success of different actions and regulations to address problems in coral reef areas; level of support for specific regulations and measures aimed at protecting coral reefs
105-109 | Sources of information available | Identification of the most relevant sources of information about coral reefs (e.g. newspapers, radio, brochures, NOAA publications) and level of trust in information sources
110-119 | Coral reef financial reliance | Reliance on coral reefs as a personal source of food or income, including involvement in commercial fishing activities and their impact on personal income
120-138 | Demographic questions | Generic demographic information to facilitate the categorization and analysis of responses, including family members, age, gender, education, occupation, household income, place of residence, race, languages, religious affiliation, and membership in community groups
Information on each jurisdiction will be collected at regular intervals of three to four years. The information will be collected by contractors in close coordination with CRCP, in accordance with the methodology set forth in Part B. For each jurisdiction, CRCP will work with contractors to define the survey objectives and the data collection strategy, and to select relevant questions from the question bank and tailor them to the specific jurisdiction. CRCP is planning to use the following approach to select the questions for each jurisdiction:
Identify the categories of questions that are necessary for that jurisdiction. Within each category, select the questions and answer choices that are most applicable to that jurisdiction (e.g. questions on tribal affiliation are rarely applicable to residents of Florida).
Prioritize the chosen questions in order to obtain the most critical information while staying under the 30-minute threshold.
The questions for the Guam interviews have already been selected from the bank, and are included in this submission as a separate document.
As described in Question 3 below, the information will be collected by using the most efficient and effective means in the individual jurisdiction. During the three years covered by this clearance we expect to use face-to-face interviews in American Samoa, phone or internet based survey techniques in Hawaii, Florida, and Puerto Rico, telephone surveys in Commonwealth of the Northern Mariana Islands (CNMI) and Guam, and phone or face-to-face interviews in the US Virgin Islands (USVI).
For each survey after the Guam survey, a nonsubstantive change request will be submitted, listing the selected questions and briefly describing the information collection venue and sampling methodology applicable to that jurisdiction.
Data collected will not be disseminated to the public in a way that could potentially reveal personally identifiable information (PII). Only aggregate and summary statistics will be made publicly available, allowing the identities of survey respondents to remain confidential. CRCP will maintain the data in accordance with the highest standards of information security and will keep PII only as long as is absolutely necessary to complete the survey.
CRCP acknowledges the potential for bias during data collection, for example, from non-response to certain questions or untruthful answers (these scenarios are addressed in Part B's detailed descriptions of methodology).
The risk that these potential biases will skew the analysis is minimized by the fact that CRCP will primarily use the information as indicative parameters for analyzing the effectiveness of its programs. The information collected will not be used by CRCP to conduct comprehensive evaluations of its programs, nor will data from this survey be used in isolation to make decisions about these programs. Any decisions to modify existing programs or to create new coral reef initiatives will be made using information collected from a number of sources, including this survey and other tools such as formal program assessments and evaluations and CRCP's strategic plans.
NOAA will retain control over the information and safeguard it from improper access, modification, and destruction, consistent with NOAA standards for confidentiality, privacy, and electronic information. See response to Question 10 of this Supporting Statement for more information on confidentiality and privacy. The information collection is designed to yield data that meet all applicable information quality guidelines. Prior to dissemination, the information will be subjected to quality control measures and a pre-dissemination review pursuant to Section 515 of Public Law 106-554.
We are planning to conduct face-to-face interviews in American Samoa due to the low density of internet and phone connections; phone interviews in CNMI and Guam; and, depending on feasibility in each location and advice from local survey firms, phone or face-to-face interviews in USVI and phone or internet interviews in Hawaii, Puerto Rico, and Florida.
This combination of information collection techniques has been designed with the objective of selecting the most cost-effective approach depending on the specific conditions in each jurisdiction, and at the same time, to reduce the burden on respondents.
The use of internet-based versus phone-based techniques will depend on the percentage of internet users in each jurisdiction. In jurisdictions with high internet-use rates, such as Florida, Hawaii, and Puerto Rico, most of the information may be collected electronically. However, in jurisdictions with a lower proportion of internet users, such as the US Virgin Islands, CNMI, and Guam, a significant percentage of information may be collected via phone surveys.
4. Describe efforts to identify duplication.
A literature review was conducted to identify studies analyzing knowledge, attitudes, reef use patterns, and protection activities, including social and economic data related to the communities affected by coral reef conservation programs. There are no published studies that provide this information.
In addition, there are no currently approved information collections requesting similar information in the seven jurisdictions containing coral reefs. There is a currently approved collection (OMB Control Number 0648-0585) to conduct a survey to estimate individuals’ preferences and economic values of the Hawaiian coral reef ecosystem. However, the scope of this study only includes one jurisdiction, and its focus is only to evaluate a number of specific management actions provided in the survey.
Finally, this effort is being coordinated by the CRCP’s Social Science Coordinator. Part of her job is to coordinate survey efforts occurring in the jurisdictions to reduce survey fatigue and avoid unnecessary expenditure of resources. All efforts will be made to ensure that this data collection is not redundant with other efforts in the jurisdictions.
N/A. Only individuals will be interviewed.
One of the main objectives of this collection is to assist the Coral Reef Conservation Program (CRCP) to fulfill its mission of enhancing the conservation of coral reefs. The information requested will allow CRCP to gauge the effects of its existing conservation programs and improve them accordingly. In addition, the information will allow CRCP to design new programs and ensure that they are as successful as possible.
Not conducting this collection could undermine CRCP's ability to effectively evaluate its programs and to ensure that they are helping achieve its mission.
No special circumstances are anticipated. The information requested will be voluntary and the collection will be conducted in accordance with OMB guidelines.
A Federal Register Notice was published on July 1, 2011 (76 FR 38618). One public comment was received. This comment was a request via email for a hard copy of the question bank. The document was sent.
The question bank and the sampling strategies for this collection were developed in consultation with key CRCP staff and partners and are modeled on the national indicators for this program.
No payments or gifts are provided to respondents.
As stated on the questionnaires, identifying information (name, address, telephone number, email address) will be used only to administer the survey. This information will be viewed only by the contractor compiling the data and will be destroyed at the end of the information collection. This process will maintain the anonymity of the responses received. Results will be aggregated so that no responses can be attributed to individuals.
All data received from the surveys will be placed on a secure server and will be password protected. This website will not be available to the public. All computerized data will be maintained in a manner that is consistent with NOAA’s IT Security Program. No data files will contain personal identifiers.
11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private.
For this collection, no sensitive questions will be asked. However, if a respondent perceives a particular question as sensitive (e.g. religious affiliation), we will treat the response to that question as completely voluntary, and a no-response option will be included in the menu of possible answers. In addition, if a respondent is interested in learning why a specific question is being asked, the survey administrator will explain the purpose of the question. We do not expect that many respondents will be uncomfortable identifying their religious affiliation.
The rationale behind the inclusion of religious affiliation questions in the survey is that in certain jurisdictions, especially in remote areas, coral conservation attitudes and practices may be linked to religious beliefs and local cultural ideas of nature. The collection of this information will allow CRCP to better understand the practices attributed to these beliefs, and thus adequately tailor its programs to these jurisdictions.
To address potential sensitivity issues associated with these questions, personal identifying information will not be stored and will be used only to administer the survey; respondents will be made aware of this practice. Identifying information will be viewed only by the contractor compiling the data and will be destroyed at the end of the information collection. This process will maintain the anonymity of the responses received.
A variety of instruments and platforms will be used to collect information from respondents. The annual burden hours requested (1,191) are based on the maximum number of collections we expect to conduct over the requested period for this clearance, even though we do not expect 100% response. Using average labor rates for the specific jurisdictions and for the nation as a whole when jurisdictional information is not available, the burden estimate results in expected labor costs of $20,204.
The response burden is based on the average number of questions asked. Depending on the jurisdiction, the composition of these questions will change to fit the particular circumstances. For statistical purposes, NOAA will always ask a core set of questions (i.e., demographics). These questions generally have a lower response burden than the more detailed questions in the survey. The response burden is based on three components: the survey administrator explaining the purpose and need to the respondent, demographic questions for statistical purposes, and programmatic questions. We estimate that the survey administrator will take 1 minute to explain the purpose and need of the survey to the respondent (if the call recipient declines the survey, this time falls under the non-response burden). The remaining number of questions will be determined by NOAA's research priorities at the time. The questions have been divided into indicator groups, and NOAA will adjust the emphasis on, and the number of questions asked from, each group to keep the total time needed within 30 minutes.
We acknowledge that not all respondents contacted will be willing to participate in the survey. For these negative responses we estimate a non-response burden of 1 minute for the survey administrator to explain the purpose and need for the survey and the respondent to decline. Based on previous NOAA surveys of the targeted population we expect a 50% response rate for mail surveys, 80% response rate for in-person surveys and 40% response rate for internet surveys2.
Table 3: Estimates of Burden Hours (3.5-year time frame)
Requirements | # of Respondents | Responses per Respondent | Total # of Responses | Response Time | Total Burden (hours) | Labor Cost
Florida | 2,600 | 1 | 2,600 | 30 min. | 1,300 | $24,648
Guam | 650 | 1 | 650 | 30 min. | 325 | $4,657
Hawaii | 975 | 1 | 975 | 30 min. | 488 | $10,023
American Samoa | 358 | 1 | 358 | 30 min. | 179 | $3,741
Puerto Rico | 1,625 | 1 | 1,625 | 30 min. | 813 | $10,034
Commonwealth of the Northern Mariana Islands | 325 | 1 | 325 | 30 min. | 163 | $3,396
US Virgin Islands | 488 | 1 | 488 | 30 min. | 244 | $4,111
Non-response burden | 3,779 | 1 | 3,779 | 1 min. | 63 | $0
Total Responses | 10,800 | | | | |
Total Public Burden | | | | | 3,573 | $60,611
Annualized | 3,600 | | | | 1,191 | $20,204
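For illustration only, the burden-hour arithmetic in Table 3 can be reproduced from the respondent counts and response times shown above. The sketch below is not part of the collection; it assumes the annualized figures divide the totals over a three-year period, which reproduces the annualized values in the table.

```python
# Illustrative check of the Table 3 burden-hour arithmetic (not part of the collection).
respondents = {
    "Florida": 2600, "Guam": 650, "Hawaii": 975, "American Samoa": 358,
    "Puerto Rico": 1625, "CNMI": 325, "US Virgin Islands": 488,
}
RESPONSE_MINUTES = 30        # minutes per completed survey
NONRESPONSE_CONTACTS = 3779  # declined contacts, at 1 minute each
YEARS = 3                    # assumed annualization period (matches the table's annualized figures)

response_hours = sum(respondents.values()) * RESPONSE_MINUTES / 60  # 3,510.5 hours
nonresponse_hours = NONRESPONSE_CONTACTS * 1 / 60                   # about 63 hours
total_hours = response_hours + nonresponse_hours                    # about 3,573 hours
print(round(total_hours), round(total_hours / YEARS))               # 3573 1191
```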
There will be no cost to respondents beyond burden hours.
The government will implement two surveys each year, taking three and a half years to complete all seven surveys. The total cost to the government for these seven surveys is estimated at $630,000, which averages to $180,000 per year. Contractor costs are roughly $120,000 per year, or $420,000 in total. These costs include survey design and preparation of the draft OMB clearance package.
NOAA staff time and travel required to participate in planning and design activities are estimated to average $60,000 a year, or $210,000 in total for the three and a half years. NOAA staff will be responsible for fielding the survey (including response tracking, coding and processing the data, and delivery of final data files) as well as data analysis and reporting. Fielding the survey and processing the data are estimated at 0.20 FTE for a GS-09 per survey, resulting in a cost of roughly $48,000 per year. Additionally, the travel costs for NOAA staff to conduct and deliver the survey will be roughly $12,000 per year (Table 4).
Table 4: Government Cost Distribution of all 7 surveys
Cost Category | Total Cost for 3.5 Years ($) | Cost / Year ($)
Contractor Costs | 420,000 | 120,000
NOAA Personnel Costs (FTE + Travel) | 210,000 | 60,000
TOTAL | 630,000 | 180,000
Not applicable. This is a new information collection request.
Data collected under this clearance will only be used for research purposes, to measure and improve the results of CRCP programs, and to target outreach efforts.
While the agency does not intend to publish its findings, it may receive requests to release some of its findings through congressional inquiries or Freedom of Information Act (FOIA) requests. CRCP will disseminate findings when appropriate, strictly following NOAA's guidelines and all applicable laws and regulations.
Not applicable.
Not applicable.
The potential respondent universe for this study is adults, eighteen years or older, who live near, and may use, coral reefs affected by activities related to NOAA's Coral Reef Conservation Program. The total population (all individuals) of the potentially impacted area is 11,244,759. Respondents will be classified into seven geographical jurisdictions and 22 reporting units, as defined in Table 5. In American Samoa, face-to-face interviews will be conducted; in the remaining jurisdictions, a combination of internet- and phone-based interviews will be used. Each of the geographical jurisdictions is expected to be surveyed once every three to four years. Respondents will be randomly selected from the target audiences. Based on previous NOAA surveys of the target populations, we anticipate that the response rate will be 50 percent for mail surveys, 40 percent for internet surveys, and 80 percent for in-person surveys.3 Dillman et al. (2009) consider a response rate above 50% a high response rate for mail surveys.4
Table 5: Study Jurisdictions and Reporting Units
Jurisdiction | Reporting Units | Population
Puerto Rico | | 3,725,789⁵
Florida | | 5,784,043⁶
U.S. Virgin Islands | | 106,405⁷
Guam | | 159,358⁸
American Samoa | | 55,070⁹
Main Hawaiian Islands | Maui County | 1,360,211¹⁰
Commonwealth of the Northern Mariana Islands (CNMI) | Saipan Municipality | 53,883¹¹
Total | | 11,244,759
For each of the jurisdictional populations, we intend to select a random sample of individuals over the age of eighteen, stratified geographically as described in Table 6. The random sample will be obtained from the selected survey firm using standard sample selection tools. The sample frame will be developed from telephone directories, mailing lists obtained and maintained by the survey firms, and other sources as needed, depending on the coverage of these sources. These strata have been designed to account for the differing sizes of the populations in the areas close to coral reefs. We have used the standard approach to estimating sample size for a stratified population:
n = [t² N p(1 - p)] / [t² p(1 - p) + a² (N - 1)]
where N is the total number of cases (the population size), n is the sample size, a is the expected error, t is the value taken from the t distribution corresponding to a certain confidence interval, and p is the probability of an event. The final sample size will be based on available resources.
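For illustration only, the formula can be evaluated as in the sketch below. The values chosen for t, p, and a (a 95 percent confidence level, p = 0.5, and a 4 percent expected error) are hypothetical defaults, not the survey's actual design parameters, and the function name is illustrative.

```python
def sample_size(N, t=1.96, p=0.5, a=0.04):
    """n = t^2 * N * p(1-p) / (t^2 * p(1-p) + a^2 * (N-1)); N is the population size."""
    return (t**2 * N * p * (1 - p)) / (t**2 * p * (1 - p) + a**2 * (N - 1))

# Hypothetical example using Guam's 2010 population from Table 5.
print(round(sample_size(159358)))  # roughly 600
```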
Table 6: Sampling Requirements by Geographical Jurisdictions
Jurisdiction | Total Sample | Sample Size by Strata
1. American Samoa | 550 | 400 Tutuila Island; 50 Ta’u County; 50 Olosega County; 50 Ofu County
2. CNMI | 500 | 400 Saipan Municipality; 50 Tinian Municipality; 50 Rota Municipality
3. Guam | 1,000 | 100 Agat Municipality; 100 Piti Municipality; 100 Asan Municipality; 100 Talofofo Municipality; 100 Merizo Municipality; 250 Tamuning Municipality; 250 Mangilao Municipality
4. Hawaii | 1,500 | 350 Hawaii County; 750 Honolulu County; 100 Honolulu County; 300 Maui County
5. Florida | 4,000 | 250 Monroe County; 1,250 Miami-Dade County; 500 Martin County; 1,000 Broward County; 1,000 Palm Beach County
6. Puerto Rico | 2,500 | 1,000 South & West PR; 1,000 North & East PR; 250 Western Islands PR; 250 Eastern Islands PR
7. US Virgin Islands | 750 | 350 St. Croix Island; 350 St. Thomas Island; 50 St. John Island
Total | 10,800 |
In addition to asking the questions regarding the impact of the Coral Reef Conservation Programs, the surveys will collect information on socioeconomics and demographics. This additional information will be used to sort and categorize the survey results in order to control for as many variables as possible. This approach will ensure a large enough respondent pool (particularly in more populated jurisdictions) to make comparisons between strata where required.
In each of the jurisdictions, we intend to hire local surveying contractors with databases of contact information in order to allow for the greatest possible randomization of survey participants. NOAA will also work with these contractors to select the most cost-effective survey methodology that will resonate with the population being measured. In American Samoa, survey participants will be selected for face-to-face interviews due to the very low incidence of either cellular phones or land lines; in other locations, local opinion poll contractors will select participants at random using a combination of internet and telephone polling. The methodology to be employed by jurisdiction can be found in Table 7.
Table 7: Survey Methodology by Geographical Jurisdictions
Jurisdiction | Survey Method | Estimated Response Rate (based on previous NOAA surveys)
1. American Samoa | Face-to-face (and mail as back-up) | 50-80%
2. CNMI | Telephone | 50-80%
3. Guam | Telephone | 50-80%
4. Hawaii | Telephone or Internet | 50-80%
5. Florida | Telephone or Internet | 50-80%
6. Puerto Rico | Telephone or Internet | 50-80%
7. US Virgin Islands | Telephone or Face-to-face | 50-80%
We do not intend to compare survey results between jurisdictions (though comparisons between the larger regional strata are possible), so there is no concern about comparability issues between methodologies.
As can be seen from Table 7, we have selected a number of different methods to collect data in different jurisdictions. Table 8 shows the percentage of the population classified as internet users for the seven jurisdictions. In general, we will attempt to collect data using a mixture of internet and telephone methods. The one exception is American Samoa, where an in-person household survey, backed up by mail surveys, will be conducted due to the extremely low level of internet usage in this jurisdiction (approximately 6 percent). In addition, the average internet use in CNMI, Guam, Puerto Rico, and the US Virgin Islands is 39 percent, as compared to 79 percent for Hawaii and Florida. As a result, in the jurisdictions with lower internet usage we will supplement the internet survey with a telephone survey, or in the case of USVI with face-to-face interviews, to capture non-internet users.
Table 8: Internet Usage in Survey Jurisdictions
Jurisdiction | Population | Percent of Population Classified as Internet Users
1. American Samoa | 55,070 | 6%
2. CNMI | 53,883 | 30%
3. Guam | 159,358 | 56%
4. Hawaii | 1,360,211 | 79%
5. Florida | 5,784,043 | 80%
6. Puerto Rico | 3,725,789 | 40%
7. US Virgin Islands | 106,405 | 28%
Source: Data for Hawaii and Florida from the 2010 US Census. Other data from “Internet World Statistics”: American Samoa data from March 2011, CNMI data from August 2010, Guam data from June 2010, Puerto Rico data from June 2011, and US Virgin Islands data from December 2002 (see http://www.internetworldstats.com/).
We expect that there will be some language issues. Table 9 shows there are several major languages spoken beyond English by the populations of each jurisdiction.
Table 9: Languages Spoken in Survey Jurisdictions
Jurisdiction | Major Languages Spoken
1. American Samoa | English, Samoan
2. CNMI | English, Chamorro, Carolinian, Tagalog, Chinese, Korean, Japanese
3. Guam | English, Chamorro, Tagalog, Chinese, Korean, Japanese
4. Hawaii | English, Hawaiian Pidgin
5. Florida | English, Spanish
6. Puerto Rico | English, Spanish
7. US Virgin Islands | English, Negerhollands, Virgin Islands Creole
This language issue will be ameliorated by the use of polling specialists who speak the local language. These contractors will also be used to ensure that the questions posed in the survey are translated into the proper cultural contexts. Responses will be tracked to see if there are statistically significant differences in the survey results between those who speak English at home and those who do not. In addition, mail and internet surveys will be translated into local languages.
We also expect that there is some risk of sample selection bias towards those with higher incomes, particularly for the telephone and internet surveys. In areas where access to phone and internet services is not widely available, this bias may be more than minimal. To the greatest extent possible, we hope that this can be corrected through the use of telephone surveys. If responses appear to favor high-income groups, we will use weighting procedures in the post-survey analysis to adjust for bias. Specifically, we will overweight the underrepresented groups if expected responses are not obtained. We will identify ‘control totals’ for the population that the survey is aiming to reach and calculate weights to adjust the sample totals to the control totals. For example, suppose the distribution of income groups in the population is as follows:
Income | Percent of Population
$0-$25K | 28.00
$25K-$50K | 27.00
$50K-$75K | 18.00
$75K-$100K | 11.00
$100K plus | 16.00
Total | 100.00
However, the response distribution is:
Income | Percent of Respondents
$0-$25K | 5.00
$25K-$50K | 27.00
$50K-$75K | 18.00
$75K-$100K | 11.00
$100K plus | 39.00
Total | 100.00
In response to this disparity, we would weight the sample for the “$0-$25K” and “$100K plus” groups to bring them into line with their proportions in the population as a whole. That is, we would apply a weight of 5.6 to the results for the “$0-$25K” group (i.e., 28 percent divided by 5 percent) and a weight of 0.41 to the “$100K plus” group (i.e., 16 percent divided by 39 percent) to deal with its overrepresentation.
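A minimal sketch of this weighting step, using the illustrative income distributions above (hypothetical figures, not survey data):

```python
# Post-stratification weights: population share divided by response share for each group.
population_pct = {"$0-$25K": 28.0, "$25K-$50K": 27.0, "$50K-$75K": 18.0,
                  "$75K-$100K": 11.0, "$100K plus": 16.0}
response_pct   = {"$0-$25K": 5.0,  "$25K-$50K": 27.0, "$50K-$75K": 18.0,
                  "$75K-$100K": 11.0, "$100K plus": 39.0}

weights = {group: population_pct[group] / response_pct[group] for group in population_pct}
print(weights["$0-$25K"])               # 5.6  -- underrepresented group weighted up
print(round(weights["$100K plus"], 2))  # 0.41 -- overrepresented group weighted down
```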
This survey will be conducted every three to four years to minimize the cost burden.
While the surveys conducted in person are expected to yield standard rates of response (80% based on previous NOAA surveys of targeted population),12 there is some concern about the potential for non-response in the telephone/internet surveys. While response rates for many surveys have been declining in the United States for years, previous studies have indicated that the low response rates commonly associated with internet polling can be somewhat improved with the use of pre-poll telephone calls. To accomplish this, polling representatives ask respondents whether they are willing to participate in the online study and then direct them via a secure link or email. In addition we will conduct extensive online advertising to encourage response. Research has shown that under these conditions internet and telephone surveys can reach similar response rates as those found in mail surveys.13
A variety of techniques have been incorporated into this study to maximize response rates. The surveys are user-friendly, with clear, easy-to-comprehend questions. Each questionnaire is concise and can be completed in a short period of time (see Part A). The survey topic and related questions were developed to be interesting to respondents. Each survey makes use of listing options that allow the respondent to answer questions by checking appropriate boxes, which may aid in recall and analysis.
In-person surveys will be conducted at respondents’ homes, and participants will be given the opportunity to receive and/or return the survey by mail if they are unable to complete it at the time of the interview. Individuals who complete the survey by mail will receive a pre-addressed stamped return envelope.
The implementation of the mail surveys is based on the Dillman Tailored Design Method.14 This approach includes multiple steps and points of contact. The initial mailing will include the questionnaire, a pre-addressed stamped envelope, and a detailed cover letter. The cover letter will explain the project and why a response is important, state that all personal information will be kept confidential, and provide instructions for completing and returning the completed survey (via mail/fax/email). Addresses on envelopes will be handwritten, and colored envelopes will be used to make them stand out. Surveys will be tracked using individual identification numbers. A follow-up thank-you postcard will be sent seven to nine days after the questionnaire. The postcard will express appreciation for participating and will indicate that if the completed questionnaire has not yet been mailed, it is hoped that it will be returned soon. Three weeks after the initial mailing, a second mailing will be sent to all who have not returned the survey. This follow-up will consist of a different cover letter, another copy of the questionnaire, and another pre-addressed stamped envelope.
For internet surveys, we will use a number of techniques15 to increase response rates, including:
Subject lines on contact emails will clearly indicate the purpose of the survey and will explicitly avoid spam language in the subject line or body of the message (e.g. a title in all caps)
Information on how the respondent’s name was obtained, the survey’s intention, the use of the data, and guarantees of anonymity
Personalized messages
Use of a .gov reply email address
Indication of how long the survey takes to complete and the cutoff date.
Use of only clean and updated email lists
Scheduled regular reminders and follow-ups.
Cross-cultural research faces additional methodological challenges that, if not properly addressed, may considerably increase the risk of inferential errors during the administration of surveys.16 Specifically, concepts may entail culture-specific attributes and meanings which need to be explicitly taken into account to ensure sound interpretation of cross-cultural data.17 As discussed above (see Question 2), we will address this cross-cultural issue by using polling specialists who speak the local language to conduct in-person and phone surveys. These polling specialists’ knowledge of local culture and idioms is anticipated to have a positive impact on survey response rates.
In terms of increasing response rates for telephone surveys, we will use a number of techniques. First, we will work with survey firms to ensure that we have an accurate, up-to-date list of phone numbers from which to draw potential respondents. Second, we will use a combination of proven approaches to increase survey response, including conducting interviewer training so interviewers are sensitive to cultural issues and know how to administer the survey, clearly establishing researcher credentials in the introduction, and increasing call attempts and targeting call times. These methods have proven to be effective in increasing response rates for telephone surveys.18 In addition, we will ensure that we have multilingual telephone staff available for specific calls where language may be an issue.
Contracted polling groups will be asked to demonstrate their survey administration techniques on nine participants prior to execution of the full survey. These participants will include individuals interviewed in their own languages. This sample test will allow for the refinement and correction of any methodological issues that are identified.
Christy Loper, Ph.D.
Coral Reef Conservation Program
US National Oceanic and Atmospheric Administration
On detail to: Office of the Assistant Secretary for Conservation and Management
1401 Constitution Avenue, Suite 6224
Washington, DC 20230
202-482-5143 (office)
240-429-7044 (cell)
christy.loper@noaa.gov
Individuals consulted on the statistical aspects of the design:
Victoria Adams, Ph.D., Economist
Booz Allen Hamilton
8283 Greensboro Drive
McLean, VA 22102
Telephone: (703) 377-4942
The individuals and firms that will collect and analyze the data have not yet been identified.
Cantor, D. and Cunningham, P. (2002). “Methods for Obtaining High Response Rates in Telephone Surveys,” in Ver Ploeg, M., Moffitt, R.A., and Citro, C.F. (Eds.), Studies of Welfare Populations: Data Collection and Research Issues. Panel on Data and Methods for Measuring the Effects of Changes in Social Welfare Programs, Committee on National Statistics, Division of Behavioral and Social Sciences and Education, National Research Council.
Dillman, D., J. Smyth and L. Christian. (2009) Internet, Mail and Mixed‐Mode Surveys: The Tailored Design Method. New York: John Wiley & Sons.
Dillman, D. A. (2000). Mail and Internet Surveys: The Total Design Method (2nd ed.). New York: Wiley.
Division of Instructional Innovation and Assessment, The University of Texas at Austin. “Guidelines for Maximizing Response Rates.” Instructional Assessment Resources. 2007. http://www.utexas.edu/academic/diia/assessment/iar/teaching/gather/method/surveyResponse.php
Loomis, David K., “Beach Users Perceptions Concerning Zuma Beach Restoration”, University of Massachusetts Amherst, 2009.
NOAA, “2011 National Marine Recreational Fishing Expenditure Survey” 2011.
Peng, T. K., Peterson, M. F., & Shyi, Y.-P. (1991). Quantitative Methods in Cross-National Management Research: Trends and Equivalence Issues. Journal of Organizational Behavior, 12(2), 87-107.
“Public Perception and Attitudes about the Hawaiian Monk Seal, Survey Results Report”, Sustainable Resources Group International, Inc., prepared for NOAA Fisheries Service Pacific Islands Regional Office, April 2011.
Singh, J. (1995). Measurement Issues in Cross-Cultural Research. Journal of International Business Studies, 26(3), 597-619.
“Washington-Oregon-California Purse Seine Survey”, NOAA, 2007, OMB Control #: 0648-0369, Gulf States Marine Fisheries Commission (GSMFC).
1 CRCP will track this information for these indicators (9-11) indirectly through secondary sources and separate data collection activities. This will reduce the burden on participants.
2 See “Public Perception and Attitudes about the Hawaiian Monk Seal, Survey Results Report”, Sustainable Resources Group International, Inc., prepared for NOAA Fisheries Service Pacific Islands Regional Office, April 2011. “Washington-Oregon-California Purse Seine Survey”, NOAA, 2007, OMB Control #: 0648-0369, Gulf States Marine Fisheries Commission (GSMFC). NOAA,“2011 National Marine Recreational Fishing Expenditure Survey” 2011. For internet surveys see “Beach Users Perceptions Concerning Zuma Beach Restoration”, David K. Loomis, University of Massachusetts Amherst, 2009.
3 See “Public Perception and Attitudes about the Hawaiian Monk Seal, Survey Results Report”, Sustainable Resources Group International, Inc., prepared for NOAA Fisheries Service Pacific Islands Regional Office, April 2011. “Beach Users Perceptions Concerning Zuma Beach Restoration”, David K. Loomis, University of Massachusetts Amherst, 2009.
“Washington-Oregon-California Purse Seine Survey”, NOAA, 2007, OMB Control #: 0648-0369, Gulf States Marine Fisheries Commission (GSMFC). NOAA,“2011 National Marine Recreational Fishing Expenditure Survey” 2011.
4 Dillman, D., J. Smyth and L. Christian. (2009) Internet, Mail and Mixed‐Mode Surveys: The Tailored Design Method. New York: John Wiley & Sons.
5 http://factfinder2.census.gov/faces/nav/jsf/pages/index.xhtml [Census Total Population figure, 2010]
6 http://factfinder2.census.gov/faces/nav/jsf/pages/index.xhtml [Census Total Population figures for 5 counties, 2010]
7 http://2010.census.gov/news/xls/cb11cn180_vi.xls [Three islands only]
9 http://2010.census.gov/news/xls/cb11cn177_as.xls [Eastern & Western Districts; Ofu, Olosega, Tau Counties, 2010]
10 http://factfinder2.census.gov/faces/nav/jsf/pages/index.xhtml [Census Total Population figures for Hawai’i, Honolulu, Kauai, and Maui Counties, 2010]
11 http://2010.census.gov/news/xls/cb11cn178_cnmi.xls [Three municipalities]
12 See “Public Perception and Attitudes about the Hawaiian Monk Seal, Survey Results Report”, Sustainable Resources Group International, Inc., prepared for NOAA Fisheries Service Pacific Islands Regional Office, April 2011, “Washington-Oregon-California Purse Seine Survey”, NOAA, 2007, OMB Control #: 0648-0369, Gulf States Marine Fisheries Commission (GSMFC) and NOAA, “2011 National Marine Recreational Fishing Expenditure Survey” 2011.
13 See, 2000, “Complementary Methodologies: Internet versus Mail Surveys”, DSS Research, Inc.
14 Dillman, D., J. Smyth and L. Christian. (2009) Internet, Mail and Mixed Mode Surveys: The Tailored Design Method. New York: John Wiley & Sons.
15 See Dillman, D. A. (2000). Mail and Internet surveys: The total design method (2nd ed.). New York:
Wiley. Division of Instructional Innovation and Assessment, The University of Texas at Austin. “Guidelines for Maximizing Response Rates.” Instructional Assessment Resources. 2007. http://www.utexas.edu/academic/diia/assessment/iar/teaching/gather/method/surveyResponse.php
16 Singh, J. (1995). Measurement Issues in Cross-Cultural Research. Journal of International Business Studies, 26(3), 597-619.
17 Peng, T. K., Peterson, M. F., & Shyi, Y.-P. (1991). Quantitative Methods in Cross-National Management Research: Trends and Equivalence Issues. Journal of Organizational Behavior, 12(2), 87-107.
18 These approaches have been shown to have a positive impact on response rates; see Cantor, D. and Cunningham, P. (2002) “Methods for Obtaining High Response Rates in Telephone Surveys,” in Studies of Welfare Populations: Data Collection and Research Issues, Eds. Ver Ploeg, M., Moffitt, R.A., and Citro, C.F., Committee on National Statistics, Division of Behavioral and Social Sciences and Education, National Research Council.