Supporting Statement for OMB 0596-NEW
Overcoming Barriers to Wildland Fire Defensible Space
May 2012
B. Collections of Information Employing Statistical Methods
Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.
The sampling universe will be all persons living in counties selected for their coverage by the Colorado State Forest Service (CSFS), and thus within or near forest areas, i.e., in or near the wildland-urban interface. For this study the universe consists of approximately 2.8 million people in 36 of Colorado's 64 counties. Collection will be limited to 4,500 persons (one per household) over the 3-year life of this approval (approximately 1,500 per year). With 2.8 million people in approximately 1.12 million households, this yields an overall sampling fraction of approximately 1:248. Sampling will be done on a county-by-county basis using the principle of probability proportionate to size, or more accurately in our case, probability proportionate to estimated size (ppes) (Hansen, Hurwitz, & Madow, 1953: 341-342), and thus no case weighting is anticipated. Furthermore, no person/household will receive more than one survey (i.e., sampling without replacement). Sampling for this collection will be focused on, and limited to, residents of counties selected in consultation with our State partner, CSFS. A random sample of addresses within the geographic boundaries of the sampling frame will be purchased from a commercial sampling firm.
The target is one adult per household. Because the number of available households varies considerably by county (e.g., some counties have only a few thousand residents while others have more than 1 million), and assuming average household size is consistent throughout, we will have a universe of about 1.12 million households (5.116 million persons in Colorado, less metro non-WUI areas, leaving 2.8 million persons, divided by an average household size of 2.49). This is inexact, as the process will include consultation with CSFS to choose the counties involved, and some leeway is needed to make informed choices at that time. If anticipated response rates are achieved, any county with more than about 18,526 residents (7,440 households) should yield enough responses to be tabulated by county as intended. If there is insufficient response (n < 30) or the population in a given county is too small (fewer than about 18,526 residents), adjacent counties in the CSFS service district will be amalgamated into one or more analytical groups for the purposes of reporting (e.g., Mineral County, pop. 712, and Archuleta County, pop. 12,084, might be combined to achieve a usable number of targeted households). We will ask CSFS to assist so that counties that are dissimilar in wildland fire service aspects are not combined for analysis. Thus our minimum subsample size will be 30, and sampling and post-hoc analysis decisions (amalgamation) will be made to ensure sufficient responses to achieve practical significance (see the arithmetic sketch below).
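The household universe, sampling fraction, and county-size threshold above can be reproduced with simple arithmetic. The short Python sketch below shows our reading of that derivation; the rounding conventions are an assumption on our part, not a definitive restatement:

# Reproduce the household universe and sampling-fraction arithmetic.
wui_population = 2_800_000        # persons in the 36 selected counties
avg_household_size = 2.49         # statewide average persons per household
total_sample = 4_500              # surveys over the 3-year approval

households = wui_population / avg_household_size   # ~1,124,498 (~1.12 million)
sampling_fraction = households / total_sample      # ~249.9, stated as ~1:248

# County-size threshold implied by the minimum subsample of 30:
# 30 sampled households x 248 = 7,440 households; x 2.49 = ~18,526 residents.
min_subsample = 30
threshold_households = min_subsample * 248                       # 7,440
threshold_residents = threshold_households * avg_household_size  # ~18,526

print(round(households), threshold_households, round(threshold_residents))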
Based on previous research by the principal investigators and others conducting reasonably similar studies, this survey can expect a response rate of approximately 70%, though it will likely be somewhat lower. This will vary by county, unfortunately in rather unpredictable ways. Because nonresponse bias is as important as sample size or a high response rate (Crompton & Tian-Cole 2001; Vaske 2008), nonresponse bias checks will also be conducted, especially if response rates fall below the targeted rate. The primary mechanism for nonresponse bias checks will be to send a sample of non-respondents a short survey with a few key variables. If needed, a second mechanism will be used: comparisons between those who complete the survey on-line and those who opt to fill out a mailed questionnaire (the web-plus-mail approach). Telephone follow-up, as is often recommended, will not be possible with the original sampling list because it will contain only postal or email addresses, not phone numbers; telephone methods could therefore be used only for broad population comparisons.
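For illustration, the planned nonresponse bias check amounts to testing whether respondents and the nonresponse follow-up sample differ on the key variables. A minimal Python sketch follows; the variable and all counts are placeholders, not collected data:

from scipy.stats import chi2_contingency

# Rows: main-survey respondents vs. nonresponse follow-up sample.
# Columns: categories of one key variable (e.g., defensible space action
# taken: none / some / substantial). All counts are illustrative.
table = [
    [120, 340, 210],   # respondents
    [ 15,  30,  25],   # nonresponse follow-up sample
]
chi2, p, dof, expected = chi2_contingency(table)

# A non-significant result (p >= .05) suggests no detectable difference
# between respondents and non-respondents on this variable.
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")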
Web-based surveys generally report lower response rates than mail or intercept methodologies. Although this may occur here, we note that our population of interest will be much more attuned to the topic of interest: their own safety and surviving a wildland fire event. It has been reported that such surveys can attain response rates over 80%, but that is not likely here. We will follow the advice of experts such as Dillman and colleagues to communicate effectively and use follow-ups to attain the maximum response rate. In particular, the results of studies such as Lozar Manfreda et al. (2008) and other work by Dillman and his colleagues suggest that web surveys (vs. email or postal mail) yield response rates about 11 percentage points lower, and that this difference is affected by the type of survey (one-time surveys like this one tend to fare worse) and the mode of contact (email outperforms postal contact). We are aware of these differences and expect that a focus on the homeowner's residence and the affective nature of fire loss risk will produce results in the upper tier of these otherwise broad comparison surveys.
For more specific information from the recent literature, please see:
Lozar Manfreda, K., Bosnjak, M., Berzelak, J., Haas, I., & Vehovar, V. (2008). Web surveys versus other survey modes. International Journal of Market Research, 50, 79–104.
Dillman, D. A., Smyth, J. D., & Christian, L. M. (2009). Internet, mail and mixed-mode surveys: The tailored design method (3rd ed.). Hoboken, NJ: Wiley.
Dillman, D. A., Reips, U.-D., & Matzat, U. (2010). Advice in surveying the general public over the internet. International Journal of Internet Science, 5(1), 1–4.
Messer, B. L., & Dillman, D. A. (2010). Surveying the General Public by Mail vs. 'Web plus Mail'. Pullman, WA: Washington State University, Social & Economic Sciences Research Center (SESRC), Technical Report 10-13.
Vaske, J.J. (2011). Advantages and Disadvantages of Internet Surveys: Introduction to the Special Issue. Human Dimensions of Wildlife, 16, 149–153.
B.1 Summary table of proposed universe and sample strata
|                           | Population     | Households            | Sample size expected | Respondents expected |
| Universe (CO)             | 5,116,796      | 2,212,898             | 4,500                | 3,150                |
| Strata (36 CSFS counties) | 712 to 572,003 | 285 to 229,720 (est.) | 2 to 926             | 30 to 648¹           |

¹ After consolidation to achieve minimum strata sizes.
Describe the procedures for the collection of information including:
Statistical methodology for stratification and sample selection,
A random sample of addresses within the geographic boundaries of the sampling frame (i.e., the service region of CSFS districts) will be purchased from a commercial sampling firm. The sampling firm will be instructed to stratify by county and select sufficient potential respondents from household listings on a probability-proportional-to-estimated-size basis (see the allocation sketch below). If needed, we will further randomly subsample the lists supplied to more closely approximate the target number of respondents. One adult per household is targeted, and instructions in the cover letter will ask each household to identify the intended adult respondent.
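For illustration, allocating the annual sample across county strata in proportion to estimated household counts could look like the following Python sketch (the county household counts are hypothetical placeholders; actual counts will come from the commercial sampling frame):

# Allocate the annual sample across county strata in proportion to
# estimated household counts (probability proportional to estimated size).
est_households = {      # hypothetical counts, for illustration only
    "Larimer":   129_000,
    "Archuleta":   4_850,
    "Mineral":       285,
}
annual_sample = 1_500
total = sum(est_households.values())

allocation = {county: round(annual_sample * n / total)
              for county, n in est_households.items()}

# Strata allocated fewer than the 30-response minimum would be flagged
# for amalgamation with an adjacent county, as described earlier.
for county, n in allocation.items():
    flag = "  <- candidate for amalgamation" if n < 30 else ""
    print(f"{county}: {n}{flag}")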
Estimation procedure,
No estimation procedures are needed to obtain the data beyond the random sampling of selected counties. Data will be analyzed using a combination of simple population estimates (percentages, means, etc.) and more advanced statistical techniques (e.g., regression) to estimate the effects of parameters such as exposure to educational information on desired outcome behaviors or defensible space actions (a sketch of such a model follows).
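As a sketch of the regression approach mentioned, a logistic model of a binary defensible space action on exposure to educational information might look like the following; the variable names and synthetic data are illustrative assumptions, not the final analysis specification:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic illustration of the planned analysis; not collected data.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "received_info": rng.integers(0, 2, n),    # exposure to educational info
    "years_resident": rng.integers(1, 40, n),  # illustrative covariate
})
# Simulate a defensible space action influenced by information exposure.
logit_p = -0.5 + 0.8 * df["received_info"] + 0.01 * df["years_resident"]
df["action_taken"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit("action_taken ~ received_info + years_resident", data=df).fit()
print(model.summary())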
Degree of accuracy needed for the purpose described in the justification,
In determining the required sample size for the proposed study, we specified the following: the ability to detect small effects (around a 3% difference) between key participant groups in any one year; a probability of no more than .05 of interpreting differences as significant when they are in fact not; a probability of .8 of finding differences that do exist (statistical power); assumed population variability of .5 (the worst case); and a response rate of .77. Accordingly, we determined that we would need to request information from 1,500 persons per year of this three-year data collection to have sufficient sample size to detect small effect sizes at alpha = .05 and 80% power in a "worst case scenario" in terms of population variance. Counties sampled in a given year that have small populations may, of course, have broader confidence intervals around some estimates at the 95% level than those with large populations, even with the aggregation process described above. Confidence intervals and similar measures, such as effect sizes (Hedges' g, point-biserial correlation, or phi), will be reported to assure the reader of the quality of the data for the analysis being done. We plan to report these aspects transparently to assure the reader and to meet QA/QC goals.
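One standard route to a yearly target of this magnitude is the conservative margin-of-error calculation for a proportion (p = .5, a margin of 3 percentage points, 95% confidence), inflated by the expected response rate. The Python sketch below is our reconstruction under those assumptions, not necessarily the exact calculation used:

import math
from scipy.stats import norm

alpha = 0.05           # Type I error rate
margin = 0.03          # detectable difference of ~3 percentage points
p = 0.5                # worst-case population variability
response_rate = 0.77   # expected response rate

z = norm.ppf(1 - alpha / 2)                              # ~1.96
n_complete = math.ceil(z**2 * p * (1 - p) / margin**2)   # ~1,068 completes
n_mailed = math.ceil(n_complete / response_rate)         # ~1,388 mailed

# Rounding up for safety gives the 1,500 contacts requested per year.
print(n_complete, n_mailed)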
Unusual problems requiring specialized sampling procedures, and
None.
Any use of periodic (less frequent than annual) data collection cycles to reduce burden.
This survey is designed to be administered once for each participating household, with only one person from each selected household asked to fill out the survey.
Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.
Accepted survey procedures involving multiple strategies and mailings, as outlined by Dillman et al. (2009), Vaske (2008), and others, will be used. In keeping with accepted best practices (Dillman et al., 2009), an invitation to the survey will be sent to individuals' postal addresses up to three times, or until they respond, whichever comes first. The survey process will include recipient pre-notification, short intervals between follow-up mailings, and personalized appeals (i.e., avoiding the appearance of a mass mailing).
To maximize response rates, we will employ a mixed-mode survey design. A pre-notice letter will be sent to all households, followed one week later by an invitation to participate in the survey. The invitation will include the URL for the electronic version of the survey and an access code. One week later, a reminder/thank-you postcard, again with the URL and access code, will be mailed to those who have not completed the survey; subsequent reminder postcards will follow at one-week intervals. If a respondent does not complete the survey via the web, they will receive a paper questionnaire. Additionally, the cover letter will contain clear but concise language identifying the purpose of the survey and why it has been sent to the respondent. These factors combined help ensure a high degree of accuracy in obtaining valid responses. Successive invitations will be sent as reminders only to those who have yet to respond; this encourages response while ensuring that all targeted potential respondents have an opportunity to respond.
In addition, the electronic survey will employ commercially available, user-friendly web hosting and will allow respondents to stop and resume the survey without losing information entered previously.
Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.
Pilot testing was conducted with 7 university or agency individuals by Colorado State University staff. They were selected on the basis of having lived in areas similar to the intended survey areas, without regard to prior experience with wildland fire, and excluding anyone involved in developing this survey. Most reported completion times of about 25 minutes, which was rounded up to 30 minutes for purposes of the burden estimate. Minor changes were made to the instrument, primarily in question wording and layout for clarity.
Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
James Absher, Ph.D., Research Social Scientist, USDA Forest Service, and Jerry J. Vaske, Ph.D., Professor at Colorado State University, headed the preparation and review of this submission package and the attached instrument. Both are extensively trained in statistical and methodological issues, have substantial experience in the human dimensions of wildfire, and have considerable experience in survey design and implementation. Dr. Absher holds a degree in statistics (Stanford, 1970), Dr. Vaske authored a text on survey research methods (Vaske, 2008), and both have individually taught research methods and statistics courses numerous times at the university level.
We acknowledge and thank the National Agricultural Statistics Service staff, and the OMB and OIRA desk officers, who reviewed and provided helpful comments on drafts of this request.
Data collection and analysis will be overseen by Drs. Absher and Vaske, with the assistance of a graduate research assistant with statistical expertise under the supervision of Dr. Vaske. In addition, the PSW station has extensive on-site capabilities for data analysis and survey management.
James Absher, Ph.D.
Research Social Scientist
Pacific Southwest Research Station
USDA Forest Service
4955 Canyon Crest Drive
Riverside, CA 92507
(951) 680-1559; jabsher@fs.fed.us
Jerry Vaske, Ph.D.
Professor
Colorado State University
1480 Campus Delivery
Fort Collins, CO 80523
(970) 491-2360; jerryv@warnercnr.colostate.edu
Appendix 1: Literature cited in the responses above
Dillman, D. A., Smyth, J. D., & Christian, L. M. (2009). Internet, mail, and mixed-mode surveys: The tailored design method (3rd ed.). Hoboken, NJ: Wiley & Sons.
Hansen, M. H., Hurwitz, W. N., & Madow, W. G. (1953). Sample survey methods and theory (Vol. 1). New York: John Wiley.
Vaske, J. J. (2008). Survey research and analysis: Applications in parks, recreation and human dimensions. State College, PA: Venture Publishing.