SUPPORTING STATEMENT
SOCIOECONOMICS OF the Ocean Guardian Education Program
OMB CONTROL No. 0648-xxxx
B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g. establishments, State and local governmental units, households, or persons) in the universe and the corresponding sample are to be provided in tabular form. The tabulation must also include expected response rates for the collection as a whole. If the collection has been conducted before, provide the actual response rate achieved.
We estimate the population of students who participate in the Ocean Guardian Program to be 7,887. This estimate was obtained from the Ocean Guardian grant applications. The number of student participants at each school varies from 12 to 755. If a school has 120 or fewer students participating in the program, we will sample the school's entire population. If a school has more than 120 participating students, we will draw a random sample of 100 students, working with the teachers at the school to select those students at random from their class lists. There are 5 versions of the survey as a result of using choice experiments; the versions will allow us to estimate the marginal willingness to pay for different attributes of the Ocean Guardian Program.
The expected response rate is approximately 50-60%. The New York City Department of Education conducts annual surveys of parents, students and teachers, with an average response rate of roughly 52.4% over the past 5 years. The response rate may be slightly higher in this case, since teachers will be requesting the data on behalf of NOAA's ONMS.
Total Population | Sample Size from Schools with 120 or Fewer Students (9 schools) | Sample Size from Schools with More Than 120 Students | Total Sample | Expected Response Rate
7,887 | 783 | 1,800 | 2,583 | 50-60%
2. Describe the procedures for the collection, including: the statistical methodology for stratification and sample selection; the estimation procedure; the degree of accuracy needed for the purpose described in the justification; any unusual problems requiring specialized sampling procedures; and any use of periodic (less frequent than annual) data collection cycles to reduce burden.
Statistical Analysis
Data analysis (see below) will be geared toward understanding the attitudes and preferences parents have towards the Ocean Guardian Program in addition to estimating their marginal willingness to pay for various characteristics/opportunities that the Ocean Guardian Program has to offer.
Degree of Accuracy Needed for the Purpose Described in the Justification
The method we are using to collect data is stated-preference conjoint analysis (Louviere, Hensher and Swait, 2009). The survey has 7 attributes: one attribute with 3 levels, 5 attributes with 2 levels, and a price attribute with 6 levels. This yields 3 × 2^5 × 6 = 576 possible attribute combinations. Given our sample size, a full factorial design would not yield results we could analyze, so we are using a fractional factorial design.
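As a rough illustration of the scale involved, the sketch below (in Python, with attribute names omitted) enumerates the full factorial implied by the stated attribute levels and then draws a simple random fraction. An actual fractional design would be constructed for orthogonality or D-efficiency rather than drawn at random; this is only a sketch of the combinatorics.

```python
import itertools
import random

# Attribute levels as stated above: one 3-level attribute,
# five 2-level attributes, and a 6-level price attribute.
levels = [3, 2, 2, 2, 2, 2, 6]

# Full factorial: every combination of attribute levels.
full_factorial = list(itertools.product(*(range(n) for n in levels)))
print(len(full_factorial))  # 3 * 2**5 * 6 = 576 profiles

# A simple (non-optimal) fractional design: a random subset of profiles.
# In practice an orthogonal or D-efficient design would be used instead.
random.seed(0)
fraction = random.sample(full_factorial, 30)
```

The fractional design keeps the survey short enough for respondents while still allowing the main effects of each attribute to be identified.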
Determination of Minimum Sample Size

In Orme (1998), the following formula is given for determining the minimum sample size for a given design:
N = 500 * NLEV/(NALT*NREP)
where,
N = minimum sample size required
NLEV = the largest number of levels in any attribute (here 6, for the price attribute)
NALT = number of alternatives (options) per choice set, not including the status quo (here 2)
NREP = number of choice sets per respondent (here 5).
Therefore, in our design, the minimum sample size required for statistical efficiency is 500 × 6 / (2 × 5) = 300. Our planned sample size is 2,583, which not only meets the minimum requirement but provides a substantial margin of safety. Even at a 50% response rate, similar to the New York survey mentioned above, we would still have roughly 1,300 responses, well above the minimum of 300.
In addition to the above, as a general rule, six observations are needed for each attribute in a bundle of attributes to identify statistically significant effects (Bunch and Batsell, 1989; Louviere et al., 2000). Given that there are 5 different versions of the survey and we are sampling 2,583 persons, roughly 517 persons would receive each survey version. We are therefore confident that at least 6 persons would complete each version.
Analysis of Choice Questions. Analysis of the choice questions, used to estimate non-market economic use values and how those values change with Ocean Guardian attributes and socioeconomic factors, will start with a standard multinomial logit model grounded in random utility theory, as described by Ben-Akiva and Lerman (1985). To summarize their exposition, let U be household utility (well-being). Consider U to be a function of a vector zin of attributes for alternative i, as perceived by household respondent n. The variation of preferences between individuals is partially explained by a vector Sn of socio-demographic characteristics for person n.
Uin = V(zin, Sn) + ε(zin, Sn) = Vin + εin
The “V” term is known as indirect utility and “ε” is an error term treated as a random variable
(McFadden 1974), making utility itself a random variable. An individual is assumed to choose the option that maximizes their utility. The choice probability of any particular option (Status Quo Option A, Option B, or Option C) is the probability that the utility of that option is greatest across the choice set Cn:
P (i│Cn) = Pr[Vin + εin ≥ Vjn + εjn , for all j ∈ Cn, j not equal to i]
If error terms are assumed to be independently and identically distributed, and if this distribution can be assumed to be Gumbel, the above can be expressed in terms of the logistic distribution:
Pn(i) = exp(μVin) / Σj∈Cn exp(μVjn)
The summation occurs over all options j in the choice set Cn. The assumption of independent and identically distributed error terms implies independence of irrelevant alternatives, meaning the ratio of choice probabilities for any two alternatives is unchanged by the addition or removal of other unchosen alternatives (Blamey et al., 2000). The "μ" term is a scale parameter; a convenient value may be chosen for it without affecting valuation results if the marginal utility of income is assumed to be linear. The analyst must specify the deterministic portion of the utility equation, V, with sub-vectors z and S. The vector z comes from the choice experiment attributes, and the vector S comes from attitudinal, recreational, and socio-demographic questions in the survey. Econometric software will be used to estimate the regression coefficients for z and S with a linear-in-parameters model specification. These coefficients are then used to estimate the average household value of a change from one level of a particular attribute to another. The welfare effect of a change is given by (Holmes & Adamowicz, 2003):
Welfare = (1/βc)[V0 − V1]
where βc is the coefficient on cost, V0 is an initial scenario, and V1 is a change scenario.
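For illustration, the welfare formula can be evaluated with hypothetical coefficient values; neither the attribute name nor the magnitudes below come from this study, and a negative cost coefficient is assumed, as is standard.

```python
# Hypothetical estimated coefficients (for illustration only).
beta_cost = -0.05          # coefficient on the price attribute
beta_field_trips = 0.40    # coefficient on one program attribute (assumed name)

# Indirect utility in the initial (V0) and changed (V1) scenarios,
# holding everything except the one attribute fixed.
v0 = 0.0                   # attribute at its base level
v1 = beta_field_trips      # attribute switched on

# Welfare = (1 / beta_c) * (V0 - V1); with a negative cost coefficient
# this yields a positive willingness to pay for the improvement.
welfare = (1 / beta_cost) * (v0 - v1)
print(round(welfare, 2))  # 8.0
```

In dollar terms, a household with these (invented) coefficients would be willing to pay $8.00 for the attribute change.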
The standard multinomial logit model treats the multiple observations (choice experiment replications) from each household as independent. An alternative is to model these as correlated with a random parameters (mixed) logit model. Thus a random parameters logit model will also be tested using techniques described by Greene (2007).
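The estimation itself would be done in standard econometric software. Purely as a sketch of the likelihood being maximized, the following self-contained example simulates choices from a conditional logit data-generating process and recovers the coefficients by maximum likelihood; all data, dimensions, and parameter values are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Simulate 500 choice occasions among 3 alternatives described by
# 2 attributes (e.g., cost and one program attribute); values invented.
rng = np.random.default_rng(0)
n_obs, n_alt, n_attr = 500, 3, 2
X = rng.normal(size=(n_obs, n_alt, n_attr))
beta_true = np.array([-0.5, 1.0])

# Random utility: U = V + Gumbel error; each respondent picks the
# alternative with the highest realized utility.
u = X @ beta_true + rng.gumbel(size=(n_obs, n_alt))
choice = u.argmax(axis=1)

def neg_loglik(beta):
    v = X @ beta                                   # deterministic utility V
    v = v - v.max(axis=1, keepdims=True)           # numerical stability
    p = np.exp(v) / np.exp(v).sum(axis=1, keepdims=True)
    return -np.log(p[np.arange(n_obs), choice]).sum()

res = minimize(neg_loglik, np.zeros(n_attr), method="BFGS")
beta_hat = res.x  # should land close to beta_true

# Marginal WTP for the non-price attribute: -beta_attr / beta_cost.
mwtp = -beta_hat[1] / beta_hat[0]
```

The same ratio of coefficients is what drives the welfare formula above; a mixed (random parameters) logit would replace the fixed beta with a distribution across respondents.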
Unusual Problems Requiring Specialized Sampling Procedures
We do not anticipate any unusual problems that require specialized sampling procedures.
3. Describe the methods used to maximize response rates and to deal with nonresponse. The accuracy and reliability of the information collected must be shown to be adequate for the intended uses. For collections based on sampling, a special justification must be provided if they will not yield "reliable" data that can be generalized to the universe studied.
To increase response rates we plan to utilize the Ocean Guardian teacher or point of contact at each school. By having a faculty member reach out to the students, we are utilizing an existing parent-teacher relationship to increase response rates. Additionally, an initial letter will be sent home to parents informing them of the upcoming survey. We will then send home a letter with a link to the survey. One week later we will send a reminder letter with a link to the survey. Lastly, we will send a thank-you letter with contact information in case parents still need a link to the survey.
There will be 4 separate contacts to inform and remind parents to complete the survey. In cases where schools collect demographic profiles of parents, we can compare our respondents to the school's demographics and weight the data accordingly. With these outreach procedures, and with weighting as needed, we are confident that the survey will yield results that can be extrapolated to the parent population.
Although we do not have demographics of the schools, we will collect census data for the zip codes the schools serve. This data will then be used to weight the sample from each school to be consistent with the population.
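A minimal sketch of this weighting step, assuming hypothetical category shares (the income categories and all figures below are invented for illustration): each respondent's weight is the ratio of the census share to the sample share for that respondent's category.

```python
# Hypothetical shares: respondent sample vs. census figures for the
# zip codes a school serves (category labels are illustrative).
sample_shares = {"income_low": 0.20, "income_mid": 0.50, "income_high": 0.30}
census_shares = {"income_low": 0.30, "income_mid": 0.45, "income_high": 0.25}

# Post-stratification weight: population share / sample share,
# applied to every respondent in that category.
weights = {k: census_shares[k] / sample_shares[k] for k in sample_shares}
print(weights)  # under-represented groups (here, low income) are up-weighted
```

In practice the resulting weights would typically be normalized so they sum to the sample size before being applied in the analysis.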
4. Describe any tests of procedures or methods to be undertaken. Tests are encouraged as effective means to refine collections, but if ten or more test respondents are involved OMB must give prior approval.
Many of the survey questions, particularly those related to the socio-economic data, and the research methods proposed for this collection have been repeatedly deployed in past information collections by NOAA. ONMS has routinely used importance/satisfaction questionnaires and has used the stated preference method in the past to develop estimates of ecosystem services. Additionally, several ONMS staff members have reviewed the survey and provided feedback, including education coordinators, ONMS conservation staff, and NOAA's Education Evaluator. We have also had informal discussions with some parents about the topic and the survey.
5. Provide the name and telephone number of individuals consulted on the statistical aspects of the design, and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
NOAA Project Leads
Dr. Danielle Schwarzmann was the primary advisor on the statistical aspects of the study design. Dr. Schwarzmann is an economist with the Office of National Marine Sanctuaries.
Project Lead
Dr. Danielle Schwarzmann
Economist
NOAA/NOS/Office of National Marine Sanctuaries
1305 East West Hwy., SSMC4, 11th floor
Silver Spring, MD 20910
Telephone: 240-533-0705
Fax: 301-713-0404
E-mail: Danielle.Schwarzmann@noaa.gov
Project Co-Lead
Seaberry J. Nachbar
Education Coordinator, Ocean Guardian School Program Director
CA B-WET Program Coordinator
NOAA's Office of National Marine Sanctuaries
Phone: (831) 647-4204
Fax: (831) 647-4250
Email: seaberry.nachbar@noaa.gov
Project Co-Lead
Dr. Bob Leeworthy
Chief Economist
NOAA/NOS/Office of National Marine Sanctuaries
1305 East West Hwy., SSMC4, 11th floor
Silver Spring, MD 20910
telephone: (240) 533-0647
fax: (301) 713-0404
e-mail: Bob.Leeworthy@noaa.gov
Project Co-Lead
Naomi Pollack
Program Coordinator
Ocean Guardian School Program
NOAA’s Office of National Marine Sanctuaries
99 Pacific St., Building 455-A
Monterey, CA 93940
Phone: (831) 236-7677
Project Co-Lead
Sylvia Hitz
Hollings Scholar