PART B:
STATISTICAL METHODS OF DATA COLLECTION
B1 Potential Respondent Universe
The target group of interest in the national survey is the universe of outdoor arts festivals, with the festival organizer as spokesperson/survey respondent. We estimate the size of the universe to be approximately 6,000 festivals. Based on a projected response rate of 75%, we anticipate completed surveys from roughly 4,500 festivals.
The field studies involve several different target groups: festival administrators, audience members, volunteers, and artists. Approximately 3,150 audience members will answer a short survey questionnaire, and approximately 56 festival volunteers and 56 artists will participate in open-ended focus group discussions.
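For illustration, the arithmetic behind these estimates can be reproduced as follows. This is a planning sketch only; the per-site figures (350 general attendees, 100 special-program attendees, and the lower bound of 8 volunteers and 8 artists per site) are drawn from the field research plan described in section B2.

```python
# Planning arithmetic behind the estimates above; all figures come from the
# text of this supporting statement, assuming the seven case-study sites
# described in section B2 and the lower bound of 8 participants per focus group.
universe = 6_000                  # estimated outdoor arts festivals in the U.S.
response_rate = 0.75              # projected national survey response rate
print("expected national survey responses:", int(universe * response_rate))             # 4,500

sites = 7
general_per_site, special_per_site = 350, 100
print("expected audience respondents:", sites * (general_per_site + special_per_site))  # 3,150

volunteers_per_site = artists_per_site = 8
print("focus group volunteers:", sites * volunteers_per_site)                           # 56
print("focus group artists:", sites * artists_per_site)                                 # 56
```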
B2 A. Sampling Method and Respondent Universe
National Survey
The national survey will consist of two complementary efforts: (1) a mini-survey of 200 festivals chosen at random from the master list of all outdoor arts festivals in the U.S., with intensive follow-up to achieve a very high response rate (e.g., 90 percent), and (2) a survey of all other festivals in the universe. Because we expect to capture responses from nearly every festival in the mini-sample, it will help us understand more fully the characteristics of the universe.
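A minimal sketch of how the 200-festival mini-sample might be drawn is shown below, assuming the master list is available as a flat file with one row per festival; the file names and random seed are illustrative, not part of the study's actual procedures.

```python
# Minimal sketch: draw a simple random sample of 200 festivals for the
# mini-survey and keep the remainder for the overall national survey.
# "festival_master_list.csv" and the output file names are hypothetical.
import pandas as pd

MINI_SAMPLE_SIZE = 200
SEED = 2009  # fixed seed so the draw can be documented and reproduced

frame = pd.read_csv("festival_master_list.csv")               # full universe (~6,000 rows)
mini = frame.sample(n=MINI_SAMPLE_SIZE, random_state=SEED)    # simple random mini-sample
rest = frame.drop(mini.index)                                 # all other festivals

mini.to_csv("mini_survey_sample.csv", index=False)
rest.to_csv("national_survey_sample.csv", index=False)
```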
The latter survey is complicated by the fact that a single organizer might present multiple festivals and could therefore receive requests to complete multiple questionnaires. The potential burden on such a “multiple organizer” would be substantial, and the likelihood of his or her completing numerous survey questionnaires is slight. Therefore, to the extent that we can identify “multiple organizers,” our methodology limits the number of festivals any given organizer is asked to report on to three.
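One way the three-festival cap could be applied is sketched below, assuming the frame carries an identifier linking festivals to a common organizer; the "organizer_id" column and file names are hypothetical.

```python
# Sketch of limiting each "multiple organizer" to at most three questionnaires.
# The shuffle makes the three retained festivals a random subset for organizers
# that present more than three. Column and file names are hypothetical.
import pandas as pd

MAX_PER_ORGANIZER = 3

frame = pd.read_csv("national_survey_sample.csv")
capped = (
    frame.sample(frac=1, random_state=0)   # shuffle rows
         .groupby("organizer_id")          # group festivals by organizer
         .head(MAX_PER_ORGANIZER)          # keep at most three per organizer
)
capped.to_csv("national_survey_mailout.csv", index=False)
```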
Field Research
Seven festival sites will be selected for in-depth case study. Sites will be selected to maximize diversity in geography, artistic discipline, governance structure, ticket price, and mission. Working with the executive directors at each site, we will recruit 8-10 artists and 8-10 volunteers for focus group discussions about the nature of their involvement in the festival, its benefits, and the impact of the festival on the community.
At each festival site, two surveys will be administered. The first is a survey of approximately 350 people per site who are attending general festival activities; its purpose is to gather demographic data about attendees. At each festival, five research assistants will deliver the survey questionnaire to every nth person entering the festival site, with care given to procedural uniformity, including administering the survey on the same weekday and during the same general timeframe across sites. Research assistants will emphasize the importance of the survey and the brevity of the one-page questionnaire.
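The sampling interval for the "every nth person" rule would be set from each site's expected gate count. The sketch below shows the back-of-the-envelope calculation; the attendance figure and assumed cooperation rate are hypothetical planning values, not figures from the study.

```python
# Back-of-the-envelope calculation of the intercept interval for the general
# audience survey. The attendance and cooperation-rate figures are assumptions
# for planning, not values from the study.
TARGET_COMPLETES = 350          # target completed questionnaires per site
ASSUMED_COOPERATION = 0.70      # hypothetical share of approached attendees who respond
expected_attendance = 25_000    # hypothetical gate count for one festival weekend

approaches_needed = TARGET_COMPLETES / ASSUMED_COOPERATION
interval = max(1, round(expected_attendance / approaches_needed))
print(f"Approach every {interval}th person entering the festival site")
```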
The second survey is of people attending special programs sponsored by the festival; its purpose is to gather attitudinal and behavioral data about arts and festival attendance, as well as demographic data. These questionnaires, each with a pencil, will be placed on every nth seat. We anticipate a high response rate because the questionnaire is brief and respondents are seated with little reason not to assist the festival. We will collect approximately 100 surveys of special program attendees at each site.
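A companion calculation for the seated special-program survey is sketched below; the venue capacity, number of programs surveyed, and response rate are hypothetical planning assumptions.

```python
# Seat-placement interval for the special-program survey. All inputs here are
# hypothetical planning assumptions, not figures from the study.
TARGET_COMPLETES = 100          # target completed questionnaires per site
ASSUMED_RESPONSE_RATE = 0.80    # assumed high, since respondents are seated
seats_per_program = 400         # hypothetical venue capacity
programs_surveyed = 2           # hypothetical number of special programs per site

placements_needed = TARGET_COMPLETES / ASSUMED_RESPONSE_RATE
interval = max(1, round(programs_surveyed * seats_per_program / placements_needed))
print(f"Place a questionnaire and pencil on every {interval}th seat")
```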
We will also conduct an in-depth interview with each festival Executive Director to aid us in drawing a detailed portrait of the history, mission, programs and challenges of the selected festivals. These Executive Directors are leaders in the field and can provide insight into the ways in which their festivals do and do not exemplify festivals more broadly. The interviews will also help us identify and assess existing data held by these organizations that may be incorporated into the study. Further, the interviews provide an opportunity for Executive Directors to ask us in detail about the goals and methods of the study.
B. Estimation Procedures
The primary method of data analysis for national and site surveys will be descriptive. We will summarize responses by using means, standard deviations, and frequencies of responses. In some cases, we may conduct simple cross-tabs, mean comparisons, or correlations to assess relationships among particular items.
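By way of illustration, the planned descriptive analysis could be carried out along the following lines; the file name and item names ("attendance", "ticket_price", "discipline", "region") are placeholders, not the actual questionnaire variables.

```python
# Illustrative descriptive analysis: means, standard deviations, frequencies,
# a simple cross-tabulation, and a correlation. Column names are placeholders.
import pandas as pd

responses = pd.read_csv("national_survey_responses.csv")

# Means and standard deviations for numeric items
print(responses[["attendance", "ticket_price"]].agg(["mean", "std"]))

# Frequencies for a categorical item
print(responses["discipline"].value_counts(dropna=False))

# Simple cross-tab and correlation to assess relationships among items
print(pd.crosstab(responses["region"], responses["discipline"]))
print(responses["attendance"].corr(responses["ticket_price"]))
```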
C. Degree of Accuracy
We plan to use information from the sampling frame and the responses to the high-response-rate mini-survey to assess the quality and representativeness of the responses to the overall national survey. From the sampling frame we can investigate whether non-participating organizations differ in terms of geographic location and type of organization. Similarly, we plan to compare the responses to the high-response-rate mini-survey with those to the overall national survey. Based on these comparisons, a weighting adjustment will likely be created to ensure that the national sample properly represents the universe of outdoor arts festivals.
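A minimal cell-weighting (post-stratification) sketch of the kind of adjustment described above is shown below; the grouping variables ("region", "org_type") stand in for the frame characteristics named in this section and are not the actual study variables.

```python
# Sketch of a cell-weighting adjustment: weight = frame count / respondent count
# within each cell defined by frame characteristics. Variable and file names
# are hypothetical placeholders.
import pandas as pd

frame = pd.read_csv("festival_master_list.csv")
resp = pd.read_csv("national_survey_responses.csv")

frame_counts = frame.groupby(["region", "org_type"]).size().rename("n_frame")
resp_counts = resp.groupby(["region", "org_type"]).size().rename("n_resp")

cells = pd.concat([frame_counts, resp_counts], axis=1).dropna()
cells["weight"] = cells["n_frame"] / cells["n_resp"]

# Attach each cell's weight to its respondent records
resp = resp.merge(cells["weight"].reset_index(), on=["region", "org_type"], how="left")
resp.to_csv("national_survey_weighted.csv", index=False)
```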
Since post-data-collection statistical adjustments require analysis procedures that reflect departures from simple random sampling, we will estimate the “design effect” associated with a weighted estimate. The term design effect describes the variance of the weighted sample estimate relative to the variance of an estimate that assumes a simple random sample. In a wide range of situations, the adjusted standard error of a statistic should be calculated by multiplying the usual formula by the square root of the design effect (deft). Thus, the formula for computing the 95 percent confidence interval around a percentage is

p ± 1.96 × deft × √( p(1 − p) / n ),

where p is the sample estimate (expressed as a proportion) and n is the unweighted number of sample cases in the group being considered. The average design effects for this study will be calculated using replicate weights. Replicate weighting is one way to compute sampling errors that reflect a complex sample design. The replication method involves splitting the full sample into smaller groups, or replicate samples, each constructed to mirror the composition of the full sample; each replicate consists of almost the full sample but with some respondents removed. The variation in the estimates computed from the replicate samples is used to estimate the sampling errors of survey estimates from the full sample. Even if we do need to weight the national sample, we anticipate that the average design effect will be small and that the overall degree of accuracy of our weighted estimates will be sufficient to meet the purposes of this project described in the “Needs and Uses of the Data” section of the supporting statement.
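The sketch below illustrates the replicate-weight logic with a simple delete-one-group jackknife on simulated data; it is not the study's actual replicate construction, and the number of replicate groups and the simulated inputs are arbitrary.

```python
# Illustrative delete-one-group jackknife estimate of the design effect for a
# weighted percentage, using simulated data. Not the study's actual replicates.
import numpy as np

def weighted_pct(y, w):
    """Weighted percentage of a 0/1 indicator y with weights w."""
    return 100.0 * np.sum(w * y) / np.sum(w)

def design_effect(y, w, n_reps=30, seed=0):
    """Variance of the weighted estimate (from jackknife replicates) relative to
    the variance assuming a simple random sample of the same size."""
    rng = np.random.default_rng(seed)
    groups = rng.integers(0, n_reps, size=len(y))      # random replicate groups
    full = weighted_pct(y, w)
    reps = np.array([weighted_pct(y[groups != g], w[groups != g]) for g in range(n_reps)])
    var_rep = (n_reps - 1) / n_reps * np.sum((reps - full) ** 2)   # jackknife variance
    p = full / 100.0
    var_srs = (100.0 ** 2) * p * (1.0 - p) / len(y)                # SRS variance (pct points^2)
    return var_rep / var_srs

# Simulated responses and weights, for demonstration only
rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=500).astype(float)
w = rng.uniform(0.5, 2.0, size=500)
deff = design_effect(y, w)
# The adjusted 95 percent confidence interval then follows the formula above:
#   p ± 1.96 * sqrt(deff) * sqrt(p * (1 - p) / n)
print("estimated design effect:", round(deff, 2))
```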
D. Specialized Sampling - Not applicable
E. Less Frequent Data Collection - Not applicable
B3 Procedures to Deal with Non-Response
National Survey - The data collection schedule calls for multiple mailings to encourage festivals’ survey participation. The first contact is a personalized pre-survey letter sent in paper form, since first-class surface mail often has greater impact than email. This letter will explain the upcoming survey to the festival organizer and request participation when the emailed invitation arrives the following week. Once the survey invitation is emailed, three email reminders at one-week intervals are scheduled for follow-up of non-respondents, and a fourth reminder is planned if necessary. In addition, telephone prompting of non-respondents will occur as needed. Together, these steps form a rigorous data collection procedure designed to encourage survey participation.
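For illustration only, the contact schedule described above can be laid out as follows; the launch date is a hypothetical placeholder, and actual dates will follow the approved data collection plan.

```python
# Sketch of the non-response follow-up schedule. The launch date is hypothetical,
# and the timing of telephone prompting is an assumption ("as needed" in the plan).
from datetime import date, timedelta

invitation = date(2009, 6, 1)                       # hypothetical email invitation date
schedule = [
    ("personalized pre-survey letter (paper)", invitation - timedelta(weeks=1)),
    ("survey invitation (email)", invitation),
]
for i in range(1, 4):                               # three scheduled email reminders
    schedule.append((f"email reminder {i}", invitation + timedelta(weeks=i)))
schedule.append(("optional fourth email reminder", invitation + timedelta(weeks=4)))
schedule.append(("telephone prompting of non-respondents", invitation + timedelta(weeks=5)))

for step, when in schedule:
    print(f"{when:%Y-%m-%d}  {step}")
```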
The survey website is available 24 hours a day, seven days a week, and is compliant with Section 508 of the Rehabilitation Act. A toll-free technical support phone number is listed at the site and on the printed survey letter in case a festival organizer has questions about the research study or encounters difficulty using the Internet.
B4 Pre-Testing of Procedures
Silber & Associates pre-tested the survey questionnaire in October 2008 on seven festival organizers. The objectives were to: (a) test the questionnaire for wording, flow, and meaning; (b) determine the average time to complete the survey; and (c) conduct post-survey cognitive interviews with respondents to understand their interpretation of the questions and the reasoning behind their answers. The audience survey questionnaires were also pre-tested with a small number of people to determine the average respondent burden.
B5 Person Responsible for Statistical Aspects of the Design
Contact information for the individuals responsible for the statistical aspects of the design is:
Bohne G. Silber, Ph.D. (national survey)
Silber & Associates
13067 Twelve Hills Road, Suite B
Clarksville, Maryland 21029-1144
(410) 531-2121, ext. 11
bgsilber@silberandassociates.com
Timothy Triplett, Survey Statistician
The Urban Institute
2100 M Street, NW
Washington, DC 20037
(202) 261-5579
Carole Rosenstein, Ph.D. (fieldwork)
Arts Management Program
College of Arts and Sciences
1023 Clemens Hall
Buffalo, NY 14260
(716) 645-2437 x1468