SUPPORTING STATEMENT
COASTAL ZONE MANAGEMENT ACT EXTERNAL EVALUATION
OMB CONTROL NUMBER 0648-xxxx
B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
1. Describe (including a numerical estimate) the potential respondent universe and any
sampling or other respondent selection method to be used. Data on the number of entities
(e.g. establishments, State and local governmental units, households, or persons) in the
universe and the corresponding sample are to be provided in tabular form. The tabulation
must also include expected response rates for the collection as a whole. If the collection has
been conducted before, provide the actual response rate achieved.
The total respondent universe is 125 potential interviewees. Of this universe, we anticipate
conducting 78 interviews in total. The target for local collaborators of the state Coastal Zone
Management Programs is 48 interviewees, stratified by program structure (networked vs. policy
focused): 6 programs per type, with 4 interviews per program. The target for local collaborators
of the NERRS program is 18 interviewees (no specific stratification, but the sample will be
chosen to ensure geographic and size distribution): 6 NERRS programs, with 3 interviews each.
The target for national-level collaborators (who know both the CZ and NERRS programs) is 12
interviews, 1 per person. These collaborators: have significant previous experience working with
CZ programs; are leaders in current government organizations and non-profits with a coastal zone
management focus; are leaders in current government organizations and non-profits likely to
interact with CZ programs; or are academics who focus on coastal zone issues.
To choose the specific sample (interviewees) from the universe of potential interviewees, SRA
will conduct the following steps:
• Consult with NOAA planning group to categorize CZ and NERRS programs by
stratification criteria and develop a list of national collaborators;
• Develop a universe of potential interviews (8-10) through talking with each program
regarding their collaborators, partners, actual and potential customers of their
information, including contact information (telephone and e-mail);
• Send an introductory e-mail describing the study, requesting an interview, and estimating
the time commitment. SRA will then schedule interviews with the first responders on that
list until 6 interviews are reached. If an interviewee is unavailable or unresponsive, SRA
will move to the next interviewee on the list.
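The first-responder selection step described above can be sketched as follows (a minimal illustration only; the function and data names are hypothetical and the actual outreach is conducted by e-mail and telephone):

```python
# Illustrative sketch of the selection step: walk an ordered list of candidate
# collaborators and confirm the first responders until the target is reached.
# Unresponsive or unavailable candidates are skipped in favor of the next name.

def select_interviewees(candidates, target=6):
    """Return the names of the first `target` candidates who responded."""
    confirmed = []
    for person in candidates:
        if len(confirmed) == target:
            break
        if person.get("responded"):  # first responders are scheduled first
            confirmed.append(person["name"])
        # otherwise: move on to the next candidate on the list
    return confirmed

# Hypothetical candidate pool for one program:
pool = [
    {"name": "A", "responded": True},
    {"name": "B", "responded": False},  # unresponsive: skipped
    {"name": "C", "responded": True},
]
print(select_interviewees(pool, target=2))  # ['A', 'C']
```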
Tabular Form of Interview Protocol

State Coastal Zone Programs (48)
  Policy Focused (24): 6 local government, 6 state government, 6 non-profit, 6 business
  Network Focused (24): 6 local government, 6 state government, 6 non-profit, 6 business
National Estuarine Research Reserves (18)
  6 local government, 6 non-profit, 6 business
National Collaborators (12)
  6 federal government, 6 academics/researchers
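The targets in the table above can be tallied to confirm that the per-group counts sum to the 78 planned interviews (a simple arithmetic check; the dictionary below merely restates the table):

```python
# Restating the interview-protocol table to verify per-group and overall totals.
targets = {
    "CZ policy focused":  {"local government": 6, "state government": 6,
                           "non-profit": 6, "business": 6},   # 24
    "CZ network focused": {"local government": 6, "state government": 6,
                           "non-profit": 6, "business": 6},   # 24
    "NERRS":              {"local government": 6, "non-profit": 6,
                           "business": 6},                    # 18
    "national":           {"federal government": 6,
                           "academics/researchers": 6},       # 12
}

group_totals = {group: sum(counts.values()) for group, counts in targets.items()}
print(group_totals)               # totals per group
print(sum(group_totals.values())) # 78
```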
Response rate is anticipated to be at least 80 percent, based on rates for recent similar surveys
(see Question 3).
2. Describe the procedures for the collection, including: the statistical methodology for
stratification and sample selection; the estimation procedure; the degree of accuracy
needed for the purpose described in the justification; any unusual problems requiring
specialized sampling procedures; and any use of periodic (less frequent than annual) data
collection cycles to reduce burden.
We do not intend to conduct any statistical analysis of the responses received. Rather, we
anticipate a descriptive evaluation.
3. Describe the methods used to maximize response rates and to deal with nonresponse.
The accuracy and reliability of the information collected must be shown to be adequate for
the intended uses. For collections based on sampling, a special justification must be
provided if they will not yield "reliable" data that can be generalized to the universe
studied.
Our pool of possible interviewees is greater than the number of interviews planned in the scope
of work. In the event that a respondent is unreachable or does not respond, we will move on to
another possible interviewee in that group. We will send the background and questions in
advance and then schedule a phone interview. If a respondent is unavailable for a phone
interview but wishes to participate, we can arrange for e-mail submission. Furthermore, we have
staff on both the east and west coasts, so we are easily available for 12 hours on any given
business day.
We anticipate, at a minimum, an 80% response rate for our interview requests, based on past
experience of the contractor with similar process designs. Examples: 1) in the development of a
2003 Report to Congress, we interviewed a series of personnel from the areas of public health,
microbiology, wastewater management, and/or water quality. Employing methods similar to
those recommended here, we interviewed 76 of 88 candidates (86%); 2) in a 2001 effort to
support the Delaware River Basin Commission to develop a model for the fate and transport of
PCBs in the Delaware Estuary, SRA staff interviewed 71 of 80 candidates (89%). We have
found in past experience that such personnel are committed enough to the program or issue that
they are eager to talk, provided they have the time.
4. Describe any tests of procedures or methods to be undertaken. Tests are encouraged as
effective means to refine collections, but if ten or more test respondents are involved OMB
must give prior approval.
The survey questions were developed based on reviews of previous data collection efforts (both
internal and external), focused on the overall mission of the CZMA program, as well as a
series of discussions with both CZ and NERRS program managers. Prior to implementing our
survey in the interviews, we will beta test the survey form through two test interviews with
people familiar with the subject matter, but who are not part of the sample.
5. Provide the name and telephone number of individuals consulted on the statistical
aspects of the design, and the name of the agency unit, contractor(s), grantee(s), or other
person(s) who will actually collect and/or analyze the information for the agency.
In order to ensure a robust methodological design, we recommend the use of a small (2-4 person)
informal review panel of evaluation/program experts to assist in two phases of this evaluation:
1) methodological review; and 2) interpretation of findings. Suggestions include:
1) Yvonne Watson (EPA’s Evaluation Support Division) – Phone: 202-566-2239; Email:
watson.yvonne@epa.gov
2) Bill Michaud (SRA International) – Phone: 860-738-7501; Email:
bill_michaud@sra.com.
3) Steve Yaffee or Julia Wondolleck (Ecosystem Management Initiative)
Data collection and analysis will be performed by:
1. Linda Manning (The Council Oak) – Phone: 703-942-8512; Email:
lmanning@thecounciloak.com.
2. Greg Frey (SRA) – Phone: 503-236-7100; Email: greg_frey@sra.com.
Author: Richard Roberts
File Created: 2009-11-23