B. Collections of Information Employing Statistical Methods
1. Potential Respondent Universe and Sample Selection Method
The potential respondent universe for the ArtBeat initiative is all adult attendees of NEA-sponsored performances. Survey participation decisions occur at two levels.
First, NEA grantee organizations may elect to participate in the voluntary ArtBeat initiative. If an organization elects to participate, then it encourages its audience members to visit the survey website and answer the questions pertaining to the sponsored event.
Second, audience members may elect to participate in the voluntary ArtBeat survey. Each participating NEA grantee organization will be provided a unique link to an NEA website hosting a web-based survey of its particular audience.
Consequently, while the potential respondent universe is all adult attendees of NEA-sponsored performances, the actual respondent universe is all adult attendees of NEA-sponsored performances organized by those grantees who agreed to participate in the ArtBeat initiative.
This collection does not provide for sampling at either decision level. With respect to audience members, the survey is being made available to all attendees in order to reduce the burden on grantees. Because participation by respondents is optional, the most successful respondent targeting and recruitment methods require direct one-on-one interaction between surveyors and audience members. As described in Attachment A, Section 12, this intensive in-person survey delivery yields a 38% participation rate.
The in-person survey would put an enormous burden on participating NEA grantee organizations, including requiring them to print and administer surveys and enter responses into a spreadsheet. In addition, this kind of in-person approach would require grantees to select a sample of audience members. Asking arts organizations to become familiar with and implement sampling techniques in such a varied set of circumstances -- from live ticketed events with assigned seats to open festivals and outdoor performances -- would be excessive.
Because the ArtBeat approach minimizes the burden on NEA grantees, and because the survey and data collection are web-based and largely automated, the opportunity to participate in the ArtBeat survey will eventually be made available to all grantees that present live events. However, as described in Attachment A, Section 12, approximately 100 grantees with live audiences will be invited to participate in this pilot phase, of whom 27 (the first three to agree within each of the nine arts disciplines) will ultimately be enrolled.
In subsequent years, it is assumed that 67% of all invited grantees (i.e., those with live audiences) will elect to participate in the ArtBeat survey. This figure is based on the grantee participation rate in a preliminary survey of audiences among NEA grantees conducted in FY2012 and referenced in Attachment A, Section 1. In that preliminary survey, WolfBrown used a variety of collection approaches (web-based, leaving surveys on seats, approaching audience members during and after performances, etc.) and offered grantees a financial incentive ($1,000) and a report summarizing their audience data in exchange for participating. This method resulted in a 61% participation rate (31 of 51). However, the NEA believes that the twin burdens of survey administration and data entry suppressed participation among grantees unwilling to dedicate their limited staff resources to distributing surveys and typing data into spreadsheets.
Although the ArtBeat project is not offering any financial incentives to grantees, it will provide participating grantees with an individualized report and the raw data pertaining to their audience members' responses. Importantly, by making the survey entirely web-based, the ArtBeat initiative promises these benefits while drastically reducing grantees' survey administration and data entry burden. We expect this innovation to increase the grantee participation rate by 5 to 6 percentage points, from 61% to the assumed 67%.
Lastly, as described in Attachment A, Section 12, the NEA expects approximately 10% of audience members will participate in the voluntary ArtBeat survey.
2. Information Collection Procedures
NEA grantees with live audiences will be invited to participate in the ArtBeat survey. For the pilot phase of the ArtBeat information collection, the NEA Office of Research and Analysis has identified 100 grantees with live audiences to be invited to participate in the ArtBeat survey. This number represents approximately 10 grantees for each of the nine NEA arts disciplines that commonly have live audiences. In other words, the universe of NEA grantees with live audiences is stratified into nine strata, and, within each stratum, 10 grantees are randomly selected to be contacted. Stratification ensures that differences between types of grantees (for example, museums and festivals) do not influence the outcome of the pilot phase, and random selection within each stratum minimizes the chance that budgetary or other organization-specific attributes systematically influence which grantees are invited to participate in the survey. In the pilot phase, the first three grantees per discipline to agree to participate will become part of this information collection. In subsequent years, every grantee with live audiences will be invited and eligible to participate if they choose to do so.
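The stratified selection step described above can be sketched as follows. This is an illustrative sketch only: the discipline labels and grantee identifiers are placeholders, and the NEA's actual discipline list and grantee universe would be substituted.

```python
import random

# Placeholder labels standing in for the nine NEA arts disciplines
# that commonly have live audiences (assumed names, not the official list).
DISCIPLINES = ["Dance", "Music", "Opera", "Theater", "Musical Theater",
               "Folk Arts", "Visual Arts", "Literature", "Presenting"]

def select_invitees(grantees_by_discipline, per_stratum=10, seed=None):
    """Draw a simple random sample of `per_stratum` grantees within each
    discipline stratum, so that organization-specific attributes do not
    systematically influence who is invited."""
    rng = random.Random(seed)
    return {
        discipline: rng.sample(grantees, min(per_stratum, len(grantees)))
        for discipline, grantees in grantees_by_discipline.items()
    }

# Hypothetical universe: 30 grantees per discipline.
universe = {d: [f"{d}-grantee-{i}" for i in range(1, 31)] for d in DISCIPLINES}
invited = select_invitees(universe, per_stratum=10, seed=2013)
```

With nine strata of 10 grantees each, the sketch invites 90 organizations, consistent with the "approximately 100" figure cited above.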
Each participating grantee will be provided a unique link to an NEA website dedicated to the survey of its particular audience. Participating grantees will be encouraged to promote the survey link to their audience members via posters, flyers, program inserts, etc. At the NEA website, audience members will be briefly introduced to the survey before being redirected to a survey-taking platform. There, they will complete a short survey and have an opportunity to view a summary of data collected from audiences attending NEA-funded live events.
The collection of information will be fully web-based, and therefore all responses will be completed through electronic submission.
Because this information collection relies on grantees self-selecting into the ArtBeat survey, it may appear to create a selection bias in the survey results. However, the results of this study will be determined by audience responses, so the selection bias concern is relevant only if the population of audience members attending events organized by grantees who choose to participate in the ArtBeat survey differs sufficiently from the population of audience members attending events organized by grantees who do not. Generally, it is very unlikely that two such distinct populations exist; however, to ensure that the survey analysis properly accounts for the possibility of bias, a set of weights will be created using demographic information to make the surveyed audience members representative of the general arts-going population. The construction of these weights is described in Section 3 below.
3. Methods to Maximize Response Rates and Deal with Non-Response
Since a grantee's decision to participate in the ArtBeat survey is voluntary, the NEA will offer grantees raw and summarized (tabulated, graphed, etc.) data related to their events in order to maximize the grantee participation rate. The NEA expects that the promise of receiving these data, which could be used for internal programming purposes, will invest grantees in encouraging participation among their audience members. Grantees will be provided with a “toolkit” that suggests how to publicize and promote the ArtBeat survey among potential respondents. Thus, the burden on the grantee associated with this initiative is minimal in comparison to a method that would require sampling and targeting1. Given this approach to survey implementation, an anticipated 10% response rate among audience members is a reasonable expectation (see Attachment A, Section 12).
While there may not be a strong reason to believe that audience members who participate in the ArtBeat survey differ from audience members who refuse to participate, this expected response rate strongly suggests that, to ensure unbiased results, the NEA should adjust the ArtBeat responses for non-response. To do so, the NEA plans to leverage the Survey of Public Participation in the Arts (SPPA), an NEA-sponsored collection that is periodically implemented by the Census Bureau as a supplement to its Current Population Survey.
The SPPA offers the most comprehensive data available regarding attendance at arts events. Most recently administered in 2012 (data from that administration are currently being analyzed by the NEA Office of Research & Analysis), the survey represents the nation’s most authoritative information regarding the arts attendance habits of the American public. The SPPA also contains the age, race, ethnicity, gender, and education demographics that the Census Bureau uses to create the weights that ensure survey unbiasedness2. To take an example, data from the 2008 SPPA3 show that among individuals participating in any benchmark arts activities4 at least once in the last 12 months, women make up 55% and college graduates make up 30.4% of attendees.
To aid in properly accounting for non-response bias, the ArtBeat questionnaire includes questions regarding respondents’ gender, age, race, ethnicity, and education. We assume the demographics of attendees at NEA-sponsored events on the average are similar to the demographics of SPPA respondents who report having attended arts events. With this in mind, we plan to use the SPPA results to create a set of weights that will allow us to adjust the ArtBeat responses for non-response.
The control totals will be derived from the SPPA along the gender, ethnicity, race, age category, and education dimensions for those respondents who reported attending an event associated with live audiences. We will use a raking procedure to create a set of weights for the ArtBeat survey. This raking procedure will be repeated for each of the replicate weights provided by Census for the SPPA. Weights derived in this manner will permit computing unbiased proportions of respondents selecting a particular ArtBeat response and appropriate standard errors.
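A minimal sketch of the raking (iterative proportional fitting) step is shown below, using only two illustrative dimensions (gender and education) and invented control totals. The production procedure would rake over all five dimensions (gender, ethnicity, race, age category, and education) against SPPA-derived control totals, and repeat the run for each set of Census replicate weights.

```python
def rake(respondents, margins, dims, max_iter=100, tol=1e-8):
    """Iterative proportional fitting: repeatedly scale respondent weights so
    that weighted category totals match the control totals on each dimension
    in turn, until the adjustment factors converge to 1."""
    for _ in range(max_iter):
        max_adjust = 0.0
        for dim in dims:
            # Current weighted total for each category of this dimension.
            totals = {}
            for r in respondents:
                totals[r[dim]] = totals.get(r[dim], 0.0) + r["weight"]
            # Scale each weight so category totals match the control totals.
            for r in respondents:
                factor = margins[dim][r[dim]] / totals[r[dim]]
                r["weight"] *= factor
                max_adjust = max(max_adjust, abs(factor - 1.0))
        if max_adjust < tol:
            break
    return respondents

# Illustrative control totals (NOT actual SPPA figures).
margins = {"gender": {"female": 55.0, "male": 45.0},
           "education": {"college": 30.0, "no_college": 70.0}}
respondents = [{"gender": g, "education": e, "weight": 1.0}
               for g in ("female", "male")
               for e in ("college", "no_college")]
rake(respondents, margins, ["gender", "education"])
```

After raking, the weighted totals reproduce each set of control totals simultaneously, which is what permits computing unbiased proportions from a non-random respondent pool.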
In order to compute unbiased tallies of individuals selecting a particular ArtBeat response, the computed proportion of people selecting that response will be multiplied by the total number of people who attended an NEA-sponsored event with live audiences. This information can usually be obtained from grantees’ final descriptive reports (FDRs), which grantees are required to submit at the close of every grant. In addition, participating grantees will be asked to provide estimates of attendee numbers before the FDR submission. This will serve two purposes. First, it will permit more timely calculation of estimates, as FDRs may not be submitted until well after the performance is complete. Second, some grantees may choose to inform their audience members about the ArtBeat survey during particular timeframes rather than throughout the entire duration of an event (e.g., a museum exhibition may last several months and overlap fiscal years). In such cases, the universe of potential respondents is limited to those who attended during the time frame in which the survey was available.
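The tally computation above amounts to scaling a weighted proportion by total attendance. The sketch below illustrates this; the field names, weights, and attendance figure are hypothetical.

```python
def weighted_proportion(respondents, predicate):
    """Weighted share of respondents for whom predicate(r) is True,
    using the non-response-adjusted weights."""
    total = sum(r["weight"] for r in respondents)
    chosen = sum(r["weight"] for r in respondents if predicate(r))
    return chosen / total

def estimated_tally(respondents, predicate, total_attendance):
    """Scale the weighted proportion up by total attendance (taken from
    FDRs or grantees' interim attendance estimates) to estimate a head count."""
    return weighted_proportion(respondents, predicate) * total_attendance

# Illustrative respondents with adjusted weights and one survey answer.
sample = [{"weight": 2.0, "captivated": True},
          {"weight": 1.0, "captivated": True},
          {"weight": 3.0, "captivated": False}]
count = estimated_tally(sample, lambda r: r["captivated"], total_attendance=12000)
```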
4. Test Procedures or Methods
As mentioned earlier and in Attachment A, Section 8, in FY2012 WolfBrown conducted extensive testing of questionnaire formats and methods of inviting audience members to participate in the survey. WolfBrown, experts in surveying audiences, assessed a number of questions and response options and identified those that best capture intrinsic impact, the concept in which the NEA is most interested. The survey included in this Information Collection Request is based upon the WolfBrown study's findings and uses the recommended survey questions with some minor modifications.
Notwithstanding the benefits the NEA received from the WolfBrown study, this phase of the ArtBeat initiative is being conducted as a pilot test for several reasons. First, the WolfBrown study did not include an automated online survey option like the one the ORA is administering for the ArtBeat initiative. This pilot phase will allow further testing not only of the ArtBeat survey itself, but also of the mechanisms for implementing it and of all the materials that accompany it. The “PR Toolkit”, for example, is being tested through this pilot phase of the ArtBeat initiative. Other innovations being tested in this pilot include implementing the survey on a website to which all audience members are referred, developing a scalable system that connects individual responses to NEA-sponsored events, linking responses to other data the NEA collects about those events (such as data from the final descriptive reports), capturing response rates, and making raw and summarized response data available to participating grantees.
With respect to the survey itself, this pilot phase will also allow us to test which of two question options gauging experience with creating or performing art results in a lower respondent drop-out rate. The more generic version simply asks "Do you have any experience with creating or performing art?" with response options "Yes, I have extensive experience", "Yes, I have some experience", and "No". The more detailed version asks a similar question but requires respondents to provide a response for each of 10 NEA arts disciplines (Dance, Music, Opera, etc.).
The more detailed question is clearly more informative. However, it is also likely to cause response fatigue and increase respondent drop-out. The pilot phase of the survey will therefore randomly assign respondents either the more generic or the more detailed version of the question. A simple analysis of survey drop-out rates will allow the NEA to assess whether the benefits of the additional information gleaned from the more detailed question are outweighed by an unacceptable increase in respondent drop-out. Based on this analysis, future versions of the ArtBeat survey will include only one of the two versions of the question.
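One simple way to carry out this comparison is a two-proportion z-test on the drop-out rates of the two randomly assigned question versions. The sketch below uses invented counts purely for illustration; any comparable test of proportions could be substituted.

```python
import math

def two_proportion_ztest(dropouts_a, n_a, dropouts_b, n_b):
    """Two-sided z-test for a difference in drop-out rates between the
    generic (A) and detailed (B) question versions."""
    p_a, p_b = dropouts_a / n_a, dropouts_b / n_b
    pooled = (dropouts_a + dropouts_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative counts: 50 of 500 drop out on version A, 80 of 500 on B.
z, p = two_proportion_ztest(50, 500, 80, 500)
```

A significant positive difference in drop-out for the detailed version would argue for retaining the generic question in future iterations.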
Additionally, the survey asks respondents to identify the top emotions evoked by the arts event. Respondents will be given a list of emotions from which to choose their first, second, and/or third most memorable feelings. Since reviewing one's emotions may introduce fatigue into the response process, the survey will ask half of respondents to identify their three most memorable emotions and the other half to identify their top two. Differences in drop-out rates and in rates of response to this question will indicate whether the version with more options lowers the probability of answering the question or of completing the survey. Future versions of the ArtBeat survey will include only one of the two versions of this question.
Similarly, we will use the pilot phase of the ArtBeat initiative to evaluate the utility of the two affect questions currently included in the survey ("To what extent, if at all, did you feel captivated during the event?" and "To what extent, if at all, did you lose track of time during the event?"). Both questions are included on the recommendation of WolfBrown, drawing on their extensive audience impact survey experience and, more specifically, their testing of the questionnaire on audience members at NEA-sponsored events.
The "captivation" question is most directly related to the performance measure found in the NEA Strategic Plan, in which we commit to measuring the share of adults who report being affected by NEA-funded events. The "track of time" question, while not as closely linked to the NEA’s performance measure, is more literal, may be better understood by respondents, and therefore may be more reliable.
Hence, there is a trade-off between the two measures' validity and reliability. The "captivation" question has greater content validity; the "track of time" question appears to be more reliable. This pilot phase of the ArtBeat survey will allow us to compare the results of the two measures. If the results from the two variables appear to be very similar, we may conclude that the "captivation" question is as reliable as the "track of time" question, which would suggest that subsequent iterations of the ArtBeat survey should include only the "captivation" question. If, however, the pilot phase shows that the "track of time" results differ considerably from the "captivation" results, we may conclude that the two questions are capturing different concepts.
5. Individuals Consulted on Statistical Aspects, Collecting, and/or Analyzing Data
Both internal and external personnel were consulted on this project.
Externally, heavy use was made of information provided by Alan Brown and Jennifer Novak-Leonard of WolfBrown, the premier organization in surveying audiences of live arts events, and by other WolfBrown consultants.
Alan Brown, Principal
WolfBrown
alan@wolfbrown.com
415.796.3060
Internally, the NEA Office of Research & Analysis contains a number of individuals with an extensive background in quantitative social science applications including public survey construction and analysis:
Steven Shewfelt, Ph.D.
Deputy Director of Research & Analysis
National Endowment for the Arts
202.682.5563
Joanna Woronkowicz, Ph.D.
Formerly, Senior Evaluation Officer at the NEA Office of Research & Analysis
Currently, professor at the School of Public and Environmental Affairs at Indiana University, Bloomington, IN.
Melissa Menzer, Ph.D.
Office of Research & Analysis
National Endowment for the Arts
202.682.5548
Roman Ivanchenko, Ph.D.
Office of Research & Analysis
National Endowment for the Arts
202.682.5743
The NEA ORA staff will conduct the ArtBeat data analysis.
1 A survey takes 5 minutes to complete. On average, it takes 2 minutes to enter the responses into a spreadsheet. A refusal to take the survey takes approximately 1 minute. Assuming a 38% response rate for the in-person targeted recruitment method, obtaining roughly 100 completed surveys would require approaching 263 candidate respondents, for an approximate recruitment, answering, and data entry time of 863 minutes, or 14.4 hours. This is almost five times the current estimate of 3 burden hours per grantee shown in Attachment A, Section 12.
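The arithmetic in footnote 1 can be reproduced as follows. The target of roughly 100 completed surveys is an assumption inferred from the 263-contact figure (263 × 0.38 ≈ 100).

```python
TARGET_RESPONSES = 100   # assumed target of completed surveys (implicit in the 263 figure)
RESPONSE_RATE = 0.38     # in-person participation rate (Attachment A, Section 12)
SURVEY_MIN = 5           # minutes to complete the survey
ENTRY_MIN = 2            # minutes to key one response into a spreadsheet
REFUSAL_MIN = 1          # minutes spent on each refusal

approached = round(TARGET_RESPONSES / RESPONSE_RATE)            # 263 candidates
refusals = approached - TARGET_RESPONSES                        # 163 refusals
total_minutes = (TARGET_RESPONSES * (SURVEY_MIN + ENTRY_MIN)
                 + refusals * REFUSAL_MIN)                      # 863 minutes
total_hours = total_minutes / 60                                # about 14.4 hours
```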
2 An additional benefit of including demographic items is the ability to use them to evaluate the extent to which the NEA-sponsored events with live audiences reach the underserved populations.
3 http://www.nea.gov/research/2008-sppa.pdf, Table 3-2.
4 Jazz, classical music, opera, musical plays, non-musical plays, ballet performances, and visits to art museums or galleries constitute benchmark art activities.