Poetry Out Loud
Evaluation Plan
Updated
August 1, 2018
Prepared for:
National Endowment for the Arts (NEA)
Prepared by:
Social Policy Research Associates (SPR)
Contract #: C17‐05
Contents
Introduction to the POL Evaluation Plan ........................................................................................... 1
A Review of the Literature ................................................................................................................. 4
Revisiting the Logic Model and Evaluation Planning Matrices ........................................................ 20
Data Collection ................................................................................................................................. 28
Data Analysis .................................................................................................................................... 46
Deliverables and Timeline ................................................................................................................ 52
Next Steps for the Evaluation .......................................................................................................... 56
Appendix A: Study Framing Document .......................................................................................... A‐1
Appendix B: Bibliography ................................................................................................................ B‐1
Appendix C: Power Analysis ............................................................................................................ C‐9
Appendix D: Customizable Email for School Recruitment ............................................................ D‐1
Appendix E: Consent and Assent Forms ......................................................................................... E‐1
Appendix F: Interview and Focus Group Protocols ........................................................................ F‐1
Appendix G: Survey Instrument ..................................................................................................... G‐1
Appendix H: Survey Instrument Technical Details: Scale Validity and Reliability ......................... H‐1
Introduction to the POL Evaluation Plan
The National Endowment for the Arts (NEA) commissioned a multi‐year study to better
understand student‐level outcomes associated with the Poetry Out Loud program implemented
under optimal conditions. In December 2016, the NEA awarded Social Policy Research Associates
(SPR) a contract to conduct the study over a 29‐month period ― December 2016 through April
2019.1 This evaluation plan builds on the foundational work of the Study Framing Document (see
Appendix A) and presents the revised and final evaluation design. The document that follows
opens with a review of the literature, then presents the plans for data collection and data
analysis, provides an overview of the project's deliverables and timeline, and closes with next
steps for the evaluation.
Poetry Out Loud (POL) Program Overview
This study is a new data collection request; the data to be collected are not available from any
existing source. The data collection activities are
planned for September 2018 through June 2019. The study will provide the National Endowment
for the Arts (NEA) a better understanding of student‐level outcomes associated with the Poetry
Out Loud program.
Founded in 2005, Poetry Out Loud (POL) is a national arts education program, implemented
annually, that encourages the study of great poetry. The program consists of a tiered poetry
recitation competition, offered to high schools across the country and supported by free
educational materials. The competition typically begins at the classroom level during the fall
semester; winners advance to a school-wide competition, then to a regional competition (if
implemented in the state), then to a state competition, and ultimately to the national finals in
Washington, DC, held in late April or early May. The program is a partnership among the
National Endowment for the Arts (NEA), the Poetry Foundation,2 and the state and jurisdictional
arts agencies of the United States. POL serves more than 3 million students and 50,000 teachers
from 10,000 schools in every state plus Washington, DC, the US Virgin Islands, and Puerto Rico.
Information about the competition and instructional resources is provided through the
Poetry Out Loud website (poetryoutloud.org). Participating teachers use the Poetry Out Loud
toolkit (including the Teacher’s Guide and classroom posters) and online resources (including
lesson plans, learning recitation videos, and information on how to run a competition) to teach
poetry recitation and run classroom competitions. Students select, memorize, and recite poems
from an online anthology of more than 900 classic and contemporary poems. Information on
evaluation criteria and judging is also publicly available on the website.
1 The NEA later approved a one‐year no‐cost extension to allow the research team to begin collecting data in Fall
2018 rather than Fall 2017, thus extending the project to April 2020. These changes will be reflected in the Gantt
chart in the Deliverables and Timeline section.
2 The Poetry Foundation, publisher of Poetry magazine, is an independent literary organization committed to a
vigorous presence for poetry in our culture.
Poetry Out Loud is generally implemented in schools and classrooms in one of two
ways—requiring mandatory student participation or allowing students to participate voluntarily
in the program. Mandatory participation means that a teacher requires his or her entire class(es)
to participate in the Poetry Out Loud program. Some schools may additionally require grade‐
level participation or even school‐wide participation. In contrast, some schools may opt to have
students voluntarily participate in the program. This means that students self‐select to
participate in Poetry Out Loud, whether in the classroom or in an after‐school club.
Each organizing partner makes significant contributions to program planning and
implementation. Each year, the NEA and Poetry Foundation collaboratively: develop or update
the content and design of all Poetry Out Loud program materials (including the Teacher’s Guide,
anthology, poster, and website); coordinate and provide technical assistance to program
managers at the state arts agencies; plan the Poetry Out Loud National Finals; and invest in
expanding the program’s reach to new audiences. The NEA provides funding to state arts
agencies to implement the program and to run the national finals as well as support and
resources for state and local‐level partners, teachers, and students. The Poetry Foundation
provides funding for the program’s prizes, travel, permissions, website, materials, and
distribution of materials in addition to support and resources for state and local‐level partners,
teachers, and students. Each state arts agency is responsible for administering Poetry Out Loud
in its state. This includes publicizing the program, recruiting schools to implement Poetry Out
Loud in the classroom, and conducting a state competition. Each state arts agency receives an
NEA grant of $17,500 to assist with expenses of Poetry Out Loud program coordination.
The study supports the Agency’s FY 2018‐2022 Strategic Plan, which seeks in part to
“expand and promote evidence of the value and impact of the arts for the benefit of the
American people” (Strategic Objective 3.2). The current evaluation study will be the first since
2008. The prior implementation evaluation, which was commissioned by the Poetry Foundation,
focused on the reach, support, and engagement with POL by students and participating schools,
providing compelling evidence that the program had continued to grow (over the course of the
three years) and reach increasingly diverse students, rural schools, and schools with and without
existing strong arts programs. Additionally, the evaluation found that POL helped to facilitate
both the engagement and retention of teachers by providing them resources to bolster existing
curricula. With respect to student‐level outcomes, the evaluation focused largely on poetry
appreciation and engagement. However, since the evaluation engaged only state‐level POL
student champions, these study findings are not assumed to be representative of POL
participants in general.
The current evaluation was requested by NEA senior leadership and program partners
who seek to build upon the past evaluation by increasing understanding of POL’s impact on
student participants. Specifically, agency and partner staff expressed interest in understanding
the impact of POL on students who had not volunteered to participate – that is, students whose
teachers required their participation in POL (“mandatory student participation”) – in order to
reduce or eliminate the bias associated with self‐selection. The study will focus on assessing
student outcomes not only in poetry appreciation and engagement, but also in social and
emotional development and academics. In order to more fully
understand the impact of POL, a quasi‐experimental design was sought that would establish a
comparison group of students who did not participate in POL.
Program managers are also interested in understanding the effectiveness of the program
when it is implemented under conditions promoted by the POL partners as optimal. The current
study is structured as an efficacy study in order to examine the student‐level benefits of this
program under these optimal conditions. Because POL programming varies across schools and not all
schools that implement POL do so under optimal conditions, the present study is not intended to
be representative of the entire universe of schools implementing POL.
A Review of the Literature
POL is a national program, available to schools in every state. Given the breadth of the program’s
scope and the specificity of its design (having students study, memorize, and perform poetry),
and the required rigor of our evaluation design (a mixed‐methods, quasi‐experimental design),
we conducted a broad sweep of the literature to ensure that our review included not only
research that addressed outcomes in areas aligned with this evaluation (poetry appreciation and
engagement, academic achievement and engagement, and socio‐emotional development), but
also research that provides us with a comprehensive understanding of the different research
designs and methods employed in relevant studies. The first half of the literature review focuses
on outcomes of interest for the POL evaluation as examined in arts education and poetry‐
specific research. In it, the research team looks at research on populations, settings, and art
forms similar to POL, as well as at some studies of programs different from POL but which the
research team felt shed light on the evaluation. The second half of the literature review focuses
on methodology―especially issues of selection bias in research and ways to address those
issues.
Ultimately, our team reviewed 84 documents. Because many touched upon multiple areas of
interest that guided our literature review, the categories below are intentionally not mutually exclusive.3 An
overview of these areas of interest and the numbers of research documents in our review that
address these areas is provided in Exhibit 1.
3 We identified sources primarily using the search engine Google Scholar—an index that includes a wide range of
scholarly literature and most peer‐reviewed journals. Variations of the following search terms were used to identify
relevant literature: “Quasi‐experimental design and poetry,” “school poetry program,” “poetry engagement in the
classroom/high school,” “assessing poetry engagement in students,” “evaluation of poetry programs,” “poetry and
verbal development/literacy,” “poetry recitation program,” “poetry and positive outcomes for students,” “poetry
and socio‐emotional outcomes," "poetry and academic outcomes," "attitudes towards reading and writing," "literacy
and the arts,” “evaluations of school arts programs,” “evaluation of school theater/performing arts programs,”
“theater arts and academic outcomes,” “arts participation and academic outcomes/socio‐emotional outcomes,” and
“evaluation of/school public speaking programs.” Literature identified with these search terms was examined for
relevance and reviewed more thoroughly if determined to be relevant. We also drew on the NEA website’s archive
of working papers, “Research: Art Works”, as well as on direct recommendations from the NEA and on the arts‐
related research of our own team.
The research team also conducted a review of validated measures and instruments used in education and arts
research, which informed the development of the evaluation survey instrument. Search terms for the instrument
review included: “measures of life‐long learning in high school,” “measures of communication abilities,”
“communication skills‐self assessment,” “openness to diverse perspectives instruments,” “measuring 21st century
skills,” “measuring student engagement,” “measuring socio‐emotional development/learning,” “attitudes/anxiety
about public speaking scale,” “attitudes towards poetry scale,” and “assessing poetry engagement in high school,”
and “measuring high schoolers’ attitudes towards reading and writing.”
Exhibit 1. Literature Review Overview

Areas of Interest                            # of Sources
Age Range
  High school                                          23
  Middle school                                        20
  < Elementary school                                  10
  Adult                                                 7
Context
  Provides Context4                                    31
Design
  Qualitative                                          15
  Quasi-experimental                                   12
  Meta-analysis/Lit Review                              9
  Descriptive                                           3
  Case Study                                            3
  Experimental                                          3
  Correlational                                        15
Data Collection Method
  Survey                                               17
  Interview                                            14
  Existing Data                                         9
  Observational                                         2
  Other                                                17
Outcomes
  Socio-emotional                                      20
  Academic                                             18
  Literacy                                             14
  Other                                                 7
  School Engagement                                     6
  Poetry Appreciation                                   5
  Community/Civic Engagement                           20
Population
  General Youth                                        29
  Low Socioeconomic Status                              5
  Other                                                 5
  English Language Learner                              4
  Minority                                              2
  Teachers                                             29
Setting
  School Based                                         27
  Community Based                                       4
  Extracurricular                                       4
Study Focus
  Poetry                                               21
  Arts Education General                               15
  Theater                                               6
  Other                                                 1
  Public Speaking                                      21

4 This category includes sources such as The Arts and Education: New Opportunities for Research (Arts Education
Partnership, 2004) and Benchmarking for Success: Ensuring U.S. Students Receive a World‐Class Education (National
Governor's Association and Council of Chief State School Officers 2008), publications that provided the research
team with topical or methodological background, while not necessarily being research studies or directly applicable
ones.
Situating Poetry in the Research Base
Poetry is language that focuses on rhythm, sound, image, and other aesthetic or experimental
qualities more than sense‐making. The National Endowment for the Arts and the Poetry
Foundation created POL based on a clear valuing of the multidimensional and multisensorial
facets of poetry. POL's design aims to support students in reaping the benefits of poetry study,
memorization, and performance, recognizing that dynamic poetry study can not only strengthen
student performance in English Language Arts, but also support students in public speaking, self‐
confidence, and learning about literary history. The purpose of the current evaluation is to
examine the degree to which these and other outcomes are affected by participation in POL.
However, there are challenges to systematically assessing the effect of poetry programming on
student outcomes. Despite the multidimensional and sensorial characteristics of poetry, it is not
generally characterized as part of ‘the arts’ in school settings. That is, unlike music, dance, visual
art, and theater—which are characterized as ‘arts’ and often taught as part of distinct arts
modules—poetry is typically categorized as an English/Language Arts (ELA) curricular element.
For example, a recent study of arts education in K‐12 public schools mentions creative writing
and poetry only once (Parsad and Spiegelman, 2012). At the same time, poetry―especially
poetry memorization and recitation—is rarely, if ever, distinguished in research or educational
standards from other aspects of the ELA curriculum. For example, the Common Core standards
for ELA mention poetry in relation to “Reading” standards and analyzing “Range, Quality, and
Complexity” of texts, but not in “Speaking & Listening” or in “Writing.” Similarly, neither the
National Council of Teachers of English (NCTE) nor the English Language Arts Standards mention
poetry specifically. There is a great deal of research focusing on the importance of ELA curricula
for a variety of student outcomes. Common Core, the current set of national standards for ELA
and math sponsored by the Council of Chief State School Officers (CCSSO) and the National
Governors Association Center for Best Practices, prides itself on having been developed on a
foundation of extensive scholarly research and evidence (NGA, CCSSO and Achieve, 2008). And
yet, because poetry tends to be grouped with ELA at large, there is no deep research base that
can tell us about the value of poetry programs like Poetry Out Loud for student outcomes.
Arts Participation and Academic Achievement Generally
Because the body of empirical research on poetry studies is scant, we broadened our literature
review to look at arts participation generally, looking specifically at studies focused on outcomes
that were similar to those of POL. The body of research demonstrating a strong statistical link
between participation in the arts and impacts in areas deemed critical by researchers and arts
proponents is not robust. As Thomas (2016) notes, most arts and arts education research is
primarily descriptive in nature, making it challenging to assert with confidence whether
participation in the arts will result in strong academic performance. Still, the existing research
yields interesting correlative findings from which we can build. For example, we reviewed several
studies that investigate the links between arts participation and academic outcomes, measured
largely through correlations between arts involvement and standardized test scores. Some
looked at outcomes more generally (e.g., Inoa, 2014; Catterall, Chapleau and Iwanaga, 1999; and
Harland et al., 2000), as measured by standardized test scores.5
There were a few studies that were cited frequently. These include a study by Vaughn and Winner
(2000), who examined the relationship between the number of years of arts participation and SAT
scores. (This study is also discussed in the methodology section.) They found that students who
participated in more years of arts classes had higher math, verbal, and composite SAT scores.
Given our evaluation’s investigation into outcomes related to the performative aspect of POL, it
is interesting to note that the researchers also found that participation in theater had the highest
correlation with SAT verbal scores. Catterall and colleagues’ 1999 examination of the National
Educational Longitudinal Study (NELS) data also found that participation in the arts over a long
period of time resulted in consistently higher academic outcomes by the 12th grade when
compared to non‐arts involved students. Like Vaughn and Winner, their findings also indicated
that sustained involvement in theater arts was linked to improvements in reading proficiency,
gains in self‐concept, and higher levels of empathy and tolerance. In 2012, Catterall et al. (also
discussed in the methodology section) expanded their analysis, looking at data from four
different longitudinal studies to examine the relationship between arts participation and
academic outcomes, focusing specifically on youth from “economically and socially
disadvantaged backgrounds.” Catterall and colleagues found that arts‐engaged youth from
socially and economically disadvantaged backgrounds had stronger achievement outcomes than
similarly disadvantaged youth who were not arts‐engaged.
Arts Participation and English Language Arts Outcomes
While the studies mentioned in the previous section provided some useful insights about the
connections between arts participation and academic outcomes in general, and the potential
influence of sustained arts involvement on academic outcomes generally, they did not tell us
enough about the potential connections between specific kinds of arts participation and
outcomes for high school students focused in ELA in particular. Indeed, in its 2004 report, “The
5 Below, in the section on Addressing Relevant Methodological Issues, we discuss issues with the lack of statistically
significant findings and other limitations in study design, laying the groundwork for the mixed‐method POL
evaluation design.
Arts and Education: New Opportunities for Research,” the Arts Education Task Force reported a
dearth of research on the connection between the arts and older students’ language and literacy
development, noting that most work done in this research area is focused on younger children.
Our review of the literature confirmed this gap, though the studies by Vaughn and Winner
(2000), and Catterall, Chapleau and Iwanaga (1999) indicate that there is some connection
between ELA outcomes and performing that warrants further study. Indeed, Podlozny’s (2000)
meta‐analysis of experimental studies examining the links between drama instruction and verbal
development and achievement yielded interesting results. This meta‐analysis included 80 studies
involving a range of children and youth engaged in school‐based drama and using at least
one measure of verbal achievement (oral measures of understanding, written measures of
understanding, reading achievement, reading readiness, oral language development, vocabulary,
and writing). The study found strong correlations between drama instruction and verbal
achievement outcomes on six of the seven areas (vocabulary was the only area where the
correlation was not significant). Interestingly, there were significant effects for both enacted and
non‐enacted texts (though, not surprisingly, the effects for the enacted texts were larger).6 The
finding that drama instruction can improve both enacted and non‐enacted texts is intriguing,
providing some evidence that the potential verbal development and achievement benefits of
arts programs may extend beyond program participation.
Poetry Studies and English Language Arts Outcomes
Studies on the academic effects of studying poetry (and especially the effects of memorization
and recitation of poetry) at the high school level are few, and those that exist are generally small
in scale and qualitative in nature. For example, a rare study that examined poetry memorization
and recitation in a school setting (Athanases, 2005) focused on a single 10th grade classroom
where students selected a poem (written by someone other than themselves) to recite to their
classmates at the end of the unit. Using ethnographic methods such as classroom observations,
observation of poem recitation rehearsals, and student reflections, Athanases found that the
students strengthened their ability in writing about poetry, and in using and applying concepts
such as assessment of dramatic situation and subtext. While not generalizable, the methods are
instructive and the findings of this study point to the potential power of the performative aspect
of poetry study and how it is connected to strong literacy practices. Indeed, Fisher and Frey
(2007) named reading aloud as a literacy strategy that works, noting that hearing others perform
and hearing themselves read are effective strategies for facilitating fluent reading skills.
Moreover, reading poetry out loud can lead to greater enjoyment of (and therefore engagement
in) poetry and literature in general. A study by Crozer (2014) demonstrated that framing poetry
curriculum and instruction as “play” for a sample of fourth, fifth and sixth grade students
increased both student and teacher engagement with literature. Similarly, in a study by Ivey and
Broaddus (2001), which included surveys and interviews of 1,765 sixth grade students in 23
schools and follow‐up interviews with 31 students, students reported strong enjoyment around
reading poetry and plays out loud.
6 Enacted texts were acted out in class, while non‐enacted were simply read.
Research focused on adults indicates that poetry engagement may also be shaped by teacher
effectiveness. Findings from “Poetry in America” (Schwartz et al., 2006)—the first national, in‐
depth survey of people's attitudes about and experiences with poetry―provided insights around
adults’ reading and listening habits, early experiences with and perceptions of poetry, and how
people took in poetry. One major finding of interest for our study is that a sizeable portion of the
survey’s 1000 respondents (both those who identified as current poetry users and non‐users)
indicated that teachers were influential in people’s early experiences with poetry. The role of
effective teaching is an important line of analysis to consider in our evaluation, particularly given
POL’s focus on giving teachers the support they need to teach poetry effectively and with
confidence.
Some literature suggests that poetry engagement can also be influenced by the medium used to
teach poetry. Hughes (2009), for example, looked at how performance and the medium for
teaching poetry influence outcomes. In particular, Hughes studied the performative aspect of
digital new media—the way in which the creation and posting of content foregrounds the
persona of the content creator/poster—as a mechanism for teaching and creating poetry. She
collected data from a single classroom of 28 students in Ontario, Canada, using photographic
and video documentation and semi‐structured interviews. A key finding was that students were
more excited about and engaged with poetry after participating. While this research was focused on a
poetry writing program and digital media, many of the same elements fostered via the medium
are present in POL. For example, the performative aspect, the audience focus (students shared
their poems with the class), and the collaborative ethos described by Hughes are aspects that
are also present in POL. The study provides a compelling argument that alternative modes of
teaching poetry that draw on performative elements―such as, in POL’s case, the frame of the
competition―are key to increasing student engagement.
In addition to increased engagement in poetry and literature, some poetry‐focused research also
highlights outcomes related to stronger understandings of poetry and poetry constructs (e.g.
meter, form, diction, metaphor), increased student confidence in their own understanding of
poetry (and teacher assessment of student understanding), and increased skill development in
English Language Arts. For example, Koukis (2010) conducted a study of 19 under‐performing
students, examining if, after a 10‐week poetry reading and writing workshop in tracked English
classes, they thought of themselves as more successful English students. The study also
examined the extent to which students' knowledge of poetry increased, scores on the poetry sections of
standardized tests improved, and knowledge of poetry‐related concepts improved. Findings
indicate that students had a better understanding and knowledge of poetry after the workshops,
and their test scores on literacy concepts and terms improved. Focus groups and interviews also
indicated that students developed a clearer understanding of metaphor and other poetry terms.
Wiseman (2010, 2011) studied 22 eighth‐grade students’ responses to a year‐long poetry
program implemented via weekly 45‐minute workshops in students’ English classrooms and
taught by a teaching artist from the community. Workshops focused on having the students
write and read poetry and often included contemporary hip‐hop and rap songs among the
poems taught. Wiseman used ethnographic and observational methods to document students’
experiences and responses to the program throughout the year. Findings included positive
outcomes in areas aligned with POL’s goals, including improved reading comprehension
specifically regarding poetry, comfort with poetic devices, analytical capacity, self‐confidence as
it pertains to self‐expression, and “creative manipulation of emotional and social topics while
integrating and expanding students’ language” (2011, p. 76).
Arts Participation and Socio‐emotional Development
While the “value” of the arts (including poetry) is often measured by its ability to influence
academic outcomes, there is also a body of research that investigates the value that arts
participation has in supporting various aspects of socio‐emotional development (e.g., self‐
concept, self‐confidence, positive behavioral change). While most empirical studies are small,
one mixed methods study conducted in England by Harland et al. (2000) was significantly large
and focused on students at the high school level.7 This study included the administration of over
2,200 surveys to 11th graders at 22 schools, qualitative case studies of five secondary schools
noted for having strong arts reputations, and an examination of performance on national
academic tests for 27,607 students in 152 schools. In terms of socio‐emotional outcomes,
intriguing findings emerged from the case studies. School administrators reported that the arts
contributed to a more positive school culture by encouraging a positive, cohesive atmosphere,
while students reported that participation in arts classes provided opportunities to learn about
social and cultural issues, and contributed to their personal and social development. They also
reported having enriched expressive skills and an increased sense of self‐confidence as a result
of arts participation.
The scope and scale of Harland's study were not typical.8 Much of the research linking the arts
and socio‐emotional outcomes investigates programs that use the arts—specifically the dramatic
arts—as a kind of therapeutic intervention. Daykin et al. (2008) conducted a comprehensive
review of research on the impact of performing arts participation on adolescent health and
behavior and found that all the studies that met their quality standards for inclusion into the
review were focused on drama interventions, leading the researchers to conclude that there is
“relatively little reporting of evaluated non‐drama interventions within non‐clinical settings” (p.
12).
Daykin et al.’s finding that research into the connection between performative arts and
emotional health is largely focused in the dramatic arts is not surprising and confirms our
assumption that reviewing theater‐focused research would support our efforts to
evaluate the connections between the performative aspects of POL and the project’s desired
outcomes. This makes sense given the element of “play” in this art form, which enables
participants to safely try on different roles and points of view when they recite. It creates a space
7 England and the United States have very different school systems; thus, while the team found the findings of
interest, it remains circumspect about their broad applicability to the current study.
8 Elpus (2013) also conducted a large‐scale study looking at the connections between arts education and positive
youth development, using data from the National Longitudinal Study of Adolescent Health. While his study yielded
interesting results, some of which link arts education to positive behavioral outcomes, some of the specific
indicators used (e.g., drinking, substance abuse, levels of sexual activity, suspension) are not useful for the purposes
of our study. However, we found his methods instructive and discuss them further in the methods section of this
literature review.
for participants to question a character’s choices, the motivation for those choices, and the life
contexts that may have shaped those choices. This creates opportunities for the development of
theory of mind9 and empathy, which is foundational for positive behavioral change. However,
empirical studies measuring empathy development in general are largely focused in early
childhood—there are few studies that concretely link the development of empathy with
participation in the arts, particularly for high school age youth (Goldstein, 2011).10
Still, Daykin et al.’s review of the literature was useful, indicating evidence of a positive
relationship between drama participation and peer interactions and social skills. Some of the
studies they highlight include a randomized control trial study by McArdle et al. (2002), which
yielded some evidence of student‐reported improvement in self‐concept as well as teacher‐
reported improvements in behavior; however, this study was focused on a very specific and
younger target population (‘at‐risk’ 11‐year‐olds). Daykin et al. also highlight a mixed methods
study by Walsh‐Bowers and Basso (1999) that was intriguing to our team because the drama
intervention (15 weeks) is longer than most and therefore more closely aligned with the POL
program time frame, and because it incorporated teachers' perspectives in its data collection
methods. In fact, the study found significant improvements
in students’ social skills, as reported by teachers, though student reports did not support this
finding, which we find useful to think about as we work to ensure that our evaluation design
triangulates and clearly distinguishes findings across data collection groups and strategies.
In our review of the empirical research linking arts and socio‐emotional outcomes, we were
intentional about seeking out research studies that took place in schools, in order to align the
research contexts with the POL program. Again, we found that what little empirical research
exists is theater‐based and often has a specific intervention focus. Joronen (2012), for example,
conducted a controlled study of a school‐based theater arts program designed to enhance social
relationships and reduce bullying. (Joronen is also discussed in the methodology section.) The
9 Theory of mind refers to our general understanding of what others may be thinking or feeling [Wellman, Cross, &
Watson (2001), cited in Goldstein (2011)].
10 Goldstein's (2011) study about the relationship between the development of social‐cognitive skills and different
forms of arts participation offers suggestive findings, particularly in terms of the development of empathy through
acting. Empathy, understood as an emotional response to another’s emotional state, has been previously linked to
participation in the arts and has been shown to be related to the development of critical social skills during
adolescence, such as increased understanding of the perspectives of others (which in turn helps in problem solving and
avoiding conflict). Her findings indicate that acting, rather than music or other types of arts, appears to be a
conducive medium for developing empathy because, as students develop their characters, they focus on learning about
them and thinking about the characters’ motivations and emotions.
researchers used one scale focused on social relationships from a School Well‐Being profile to
measure social relationships and bullying before and after the theater program intervention. The
researchers found a statistically significant effect on social relationships and a decrease in
bullying victimization. Kim and Boyns (2015) examined the effects of a five‐week theater
intervention for a small sample of autistic teens (N=18), which culminated in a public
performance of a musical about the autism spectrum. The findings from this quasi‐experimental,
non‐equivalent group design study included significant increases in comfort with others, self‐
esteem, and empathy.
While the research described above helped to inform our thinking about where to focus our
energies in our investigation into the socio‐emotional outcomes related to POL, it is important to
note that the studies above do not align well with POL in numerous ways. They differ in terms of
target populations (most studies are focused on younger children whereas POL is focused on
high school‐aged youth), in specificity (some target ‘at‐risk’ youth or youth on the autism
spectrum, whereas POL does not specify a target population beyond high school age), and in
intention (many of the programs are designed to address specific behaviors or attitudes,
whereas POL is focused on the study of poetry, with no social interventionist aims). And finally,
of course, the research above is focused on drama, which helps us to think about links that can
be made around the performance aspect of POL, but it does not help us to understand the ways
in which poetry study can support improved socio‐emotional outcomes.
Poetry and Socio‐emotional Development
The literature base linking poetry with socio‐emotional outcomes is thin, and largely includes
studies of programs that took place in non‐school‐based settings. These include several studies
of poetry programs for youth that are located only partly in schools (Weinstein 2010) or fully in
non‐school locations such as libraries or juvenile detention centers (Crawford Barniskis, 2012;
Lazzari, Amundson and Jackson, 2005). Weinstein (2010) conducted a multi‐year study of youth
spoken word (YSW) programming using ethnographic methods. She collected data by being a
participant‐observer at in‐ and out‐of‐school workshops and via interviews with youth poets,
teaching artists, program administrators, and classroom teachers. She found that teenagers who
participate in YSW programs identify multiple personal and social benefits from their
participation. They begin to see themselves as writers and to act on that self‐perception, and
they report positive effects on self‐confidence, sense of self‐efficacy, belonging, and purpose
through participation in the poetry programs. Weinstein also cites several other studies of
spoken word programs (Fisher, 2003, 2005, 2007; Holbrook and Salinger, 2006; Jocson, 2006;
and Weiss and Herndon, 2001), noting that evaluations of those programs have shown increased
participant confidence, self‐efficacy, and understanding of genre and process.
Crawford Barniskis (2012) conducted a mixed methods study rooted in grounded theory11, which
examined teens' experiences in arts programs hosted by the public library and the programs' influence on
civic engagement. This was a particularly intriguing study, given its focus on civic engagement—
11 Grounded theory is an inductive methodology, in which conceptual categories emerge during the research and
inform the research as it continues.
an outcome not touched on by any of the other poetry‐specific studies we reviewed. The
program consisted of sessions on graffiti, digital photography, poetry, drawing, and dance.
Fourteen teens, ages 12‐18, participated in the study. Participants completed short surveys
assessing their civic and social attitudes and they participated in focus groups and interviews at
the end of the six‐session program. Participants shared that the program facilitated a sense of
belonging, social connection, creativity, and a sense of being valued. The author suggests that
these represent key values that facilitate civic engagement, an area of interest for us as a
measure of social and emotional development.
Finally, Arts Education Partnership (2004) suggests that further research should explore the
relationship between arts participation and student resilience and the role of the arts in helping
young people cope in difficult situations as well as become active change agents in their
schools and communities.
Methodological Issues: Addressing Self‐Selection Issues and Strengthening
Causality Claims
Almost two decades after Winner and Hetland (2000) concluded there was insufficient evidence
to make causality claims about the effects of arts involvement on a variety of student
outcomes, several studies have attempted to address this gap. The challenge has been that while
correlational studies find positive and statistically significant relationships between involvement
in the arts and student outcomes (Catterall, Chapleau, and Iwanaga, 1999; Catterall, Dumais,
and Hampden‐Thompson, 2012; Vaughn and Winner, 2000), the findings, though robust, have not
supported a cause‐and‐effect relationship between the two. This relationship is important for those
who are interested in establishing whether art programs or increased youth involvement in the
arts have an impact on student outcomes—that is, statistically significant effects that can be
attributed to a program or intervention. The following paragraphs describe how selection bias
has been identified in the research and how it has hindered researchers’ ability to make causality
claims. They then describe how researchers have utilized various research designs and techniques to
strengthen causality inferences. The section concludes with a commentary on the advantages of
utilizing mixed methods in advancing our understanding of program effects—in this case POL—
and the potential for mixed method research to strengthen causal inquiries.
Issues of Selection Bias: Improved Designs, Better Controls
Researchers who have examined the relationship between involvement in the arts and student
outcomes using correlational designs have mentioned the difficulty of addressing sources of
selection bias. Selection bias arises when the mechanism by which individuals or units are
selected into a study is itself related to the outcomes of interest. For example, Winner and Cooper (2000) mention
that students who are highly involved in the arts are also those who are typically more engaged
in school. Thus, it is quite possible that students do “better” not because of their involvement in
the arts, as they found in their research, but because of their high levels of engagement in
school. Other research identifies similar sources of bias. As described earlier, Vaughn and Winner
(2000) find that there is a link between arts involvement and SAT scores but note that the
positive difference in achievement between students who were highly involved in the arts and
those who were not may be explained by the fact that students who tend to participate in arts
activities were already high achievers.
More recent work examining engagement in the arts and academic achievement using large
longitudinal secondary datasets—see results described earlier in the literature review—also
conclude that the results of their work do not support making causal claims (Catterall, Dumais, &
Hampden‐Thompson, 2012).12 Even though students who were highly involved in the arts
consistently demonstrate more positive outcomes than students who do not engage as much
with the arts, their research does not help explain the mechanisms by which participating in the
arts influences student outcomes. Because their research uses secondary data that had already
been collected, there were no viable ways to address selection bias.
How do more recent studies try to address these sources of student selection bias? Current
research emphasizes the importance of improved research designs and better controlled
studies. A fundamental concept in the discussions of causal attribution is the issue of
comparison. Specifically, the ability to compare outcomes between two groups that are similar
except for the treatment, program, or intervention of interest. Using a comparison group, one
can infer that the differences between those who experience the treatment and those who did
not are only attributable to the treatment itself and not to other factors.
Random assignment is considered by many to be the ideal type of design that supports causal
inference (NRC, 2002; US DOE, IES, and NCES, 2011). In this literature review, we found a few
studies that utilized random assignment to examine the relationship between arts and student
outcomes. For example, Inoa et al. (2014), conducted a study using multi‐stage cluster
randomized design to select four schools to implement a program where theater arts were
infused in the curriculum and four to serve as a control group. The study’s goal was to examine
the effect of integrating theater arts into the curriculum and examine its impact in students’
literacy and mathematics achievement. The researchers found that even though students who
were in the theater arts infused programs consistently outperformed those in the control
schools, most of the differences were not statistically significant. As limitations of the study, the
researchers mention that the lack of statistically significant findings may have been a result of
small samples,13 underscoring the issue that insufficient statistical power is a potential threat to
the validity of findings. Moreover, beyond the issue of non‐significant findings, we think that Inoa
and his colleagues could have addressed other sources of bias that might have explained their
findings. For example, the authors could have included details about how schools’ buy‐in could
12 These data sets include four major longitudinal data collection efforts: the National Education Longitudinal Study
(NELS:88), the Early Childhood Longitudinal Study ‐ Kindergarten (ECLS‐K), the Education Longitudinal Study of 2002
(ELS:2002), and the National Longitudinal Survey of Youth (NLSY97).
13 Insufficient statistical power is also present in other studies we reviewed, including Goldstein (2011) and Thomas
(2016). Low statistical power, due to small sample sizes, small effects, or both, negatively affects the likelihood
that a statistically significant finding reflects, in fact, a true effect.
have made a difference in the results or if there were differences in implementation—issues
discussed in more detail below.14
When randomization is not feasible, researchers rely on quasi‐experimental approaches to
approximate the underlying logic of experiments, in which researchers randomly
assign units to treatment and control conditions. The goal in these studies is to construct a
comparison group that differs from the treatment group only in terms of the intervention and
thereby address issues of bias. In recent years, several research studies have examined the relationship
between the arts and student outcomes using these types of methods. For example, Elpus’s
(2013) study examining postsecondary outcomes of youth involved in the arts (visual arts, music,
dance, drama, and films and media arts) used a series of observable covariates and statistical
controls to address the differences that exist between students who elect to study arts and
students who do not. As mentioned previously, prior research suggests substantial differences
exist between students who elect and students who do not elect to study arts. Elpus’s (2013) use
of propensity score matching seeks to address the issue of selection bias by adjusting for a series
of observable covariates that are theoretically related to students’ selection into art classes. As
the author notes, selection into the arts is a complex phenomenon that has not been fully
addressed in prior research. Nevertheless, literature does provide strong evidence that gender,
race/ethnicity, language proficiency, student prior achievement, and parental education are
linked to selection into arts classes.
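The sketch below illustrates the general propensity-score-matching logic described above; it is not Elpus's (2013) actual model or the present evaluation's analysis. The file name, column names, and covariates are hypothetical placeholders.

    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    # Hypothetical analysis file: one row per student with an arts-participation
    # flag ("arts"), observed covariates, and an outcome. All names are placeholders.
    df = pd.read_csv("students.csv")
    covariates = ["prior_achievement", "parent_education", "female", "ell"]

    # Step 1: model the probability of selecting into the arts (the propensity
    # score) from observable covariates thought to drive selection.
    ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["arts"])
    df["pscore"] = ps_model.predict_proba(df[covariates])[:, 1]

    # Step 2: nearest-neighbor matching (with replacement) on the propensity
    # score: each participant is paired with the closest non-participant.
    treated = df[df["arts"] == 1]
    control = df[df["arts"] == 0].reset_index(drop=True)
    match_idx = [(control["pscore"] - p).abs().idxmin() for p in treated["pscore"]]
    matched_control = control.loc[match_idx]

    # Step 3: compare mean outcomes in the matched sample.
    att = treated["outcome"].mean() - matched_control["outcome"].mean()
    print(f"estimated effect on the treated (matched sample): {att:.2f}")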
Other quasi‐experimental studies employ similar strategies to control for issues of bias and then
utilize techniques that help isolate causal effects of the intervention or program. For example,
Thomas’ (2016) evaluation of a music education intervention matched treatment and control
schools via propensity score matching using average daily attendance, percentage of low income
students, and passing rates in a standardized state test. Then, to evaluate the impact of the
program, she utilized a difference‐in‐differences approach, comparing the change in outcomes
from before to after the program in the treatment group with the corresponding change in the control group.
Because the treatment and control groups were initially matched, the assumption of the model
is that it accounts for unobserved factors that affect treatment and control in similar ways.
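The following sketch illustrates the difference-in-differences logic described above in its simplest form, assuming a hypothetical matched student file with pre- and post-program scores. The file and column names are placeholders, and Thomas (2016) may have implemented the estimator differently (for example, within a regression framework).

    import pandas as pd

    # Hypothetical matched file: one row per student with a treatment flag
    # ("treated" = 1 or 0) and scores on the same measure before ("pre_score")
    # and after ("post_score") the program. All names are placeholders.
    df = pd.read_csv("matched_students.csv")

    means = df.groupby("treated")[["pre_score", "post_score"]].mean()

    # Change over time within each group ...
    change_treated = means.loc[1, "post_score"] - means.loc[1, "pre_score"]
    change_control = means.loc[0, "post_score"] - means.loc[0, "pre_score"]

    # ... and the difference-in-differences estimate: the extra change in the
    # treatment group beyond the change shared with the comparison group.
    did = change_treated - change_control
    print(f"treated change: {change_treated:.2f}   "
          f"comparison change: {change_control:.2f}   DiD estimate: {did:.2f}")

Because the comparison group's change is subtracted out, influences that affect both groups equally (such as ordinary maturation over the school year) do not bias the estimate.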
In another quasi‐experimental study, Martin and his colleagues (2013) carefully discuss the
importance of controlling for covariates. Their work confirms that controlling for covariates such
as age, gender, language background, parental education, and prior achievement is vital to
understanding the variance associated with arts participation beyond these differences. In addition to accounting
for these aspects, Martin et al. (2013) also control for prior measures of academic motivation
and engagement. In doing so, they addressed another source of bias that Winner and Cooper's
(2000) research had identified: Winner and Cooper's findings suggested that participating in the arts may be
associated with greater engagement in school, and that this relationship could have explained the
higher levels of achievement observed in students who participated in the arts compared to
those who did not.
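As a simple illustration of covariate adjustment of the kind Martin et al. (2013) describe, the sketch below contrasts an unadjusted regression of an achievement outcome on an arts-participation indicator with a model that also controls for demographics, prior achievement, and prior motivation and engagement. The data file, variable names, and model specification are hypothetical and are not drawn from Martin et al.'s study or from this evaluation's planned analysis.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical student-level file with an achievement outcome, an
    # arts-participation indicator, and covariates of the kind Martin et al.
    # (2013) control for. All file and column names are placeholders.
    df = pd.read_csv("students.csv")

    # Unadjusted model: the arts coefficient mixes any program association with
    # pre-existing differences between participants and non-participants.
    raw = smf.ols("achievement ~ arts", data=df).fit()

    # Adjusted model: adding demographics, prior achievement, and prior
    # motivation/engagement estimates the association net of those differences.
    adj = smf.ols(
        "achievement ~ arts + age + C(gender) + parent_education"
        " + prior_achievement + prior_engagement",
        data=df,
    ).fit()

    print("unadjusted arts coefficient:", round(raw.params["arts"], 2))
    print("adjusted arts coefficient:  ", round(adj.params["arts"], 2))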
14 Behavioral experiments that were conducted in controlled environments with small numbers of participants, such
as Rauscher (1997), were not included in our review because these studies are very different from studies conducted
at schools. For a meta‐analysis of experiments, see Podlozny (2000).
Other Sources of Bias at the Institutional and Implementation Levels
Controlling for selection bias at the student‐level has been one of the primary contributions of
quasi‐experimental research. However, as several studies we reviewed point out, there are other
sources of bias at the organizational and the implementation level that may come into play. For
example, Joronen et al.'s (2014) controlled evaluation of a drama program to enhance social
relationships finds that matching strategies may not be able to control for other sources of
selection bias operating at the school level. The authors posit that part of the reason why the
drama intervention in their study might have been effective is that the school that implemented
the intervention could have been more willing to participate in the intervention than the
comparison school. This initial school buy‐in may be indicative of their preference to engage in
the promotion of social health, which was the research outcome of interest. “Institutional”
selection biases of other types can be more difficult to detect, particularly if one is not familiar
with complex environments such as schools.
Research in schools is challenging because of the complexity of the environment where
interventions or programs take place. There are numerous factors that are continuously at work
influencing students’ outcomes and this makes it challenging to isolate the effect of any one
intervention/program on student outcomes. Other sources of bias could potentially exist at the
school level, including systematic reasons why students access or do not access specific
programs. For example, students who have demonstrated stronger ability in an academic subject
may systematically be assigned to teachers or classrooms where the teacher has additional
expertise in the specific teaching of that area (Darling‐Hammond, 1998). Another example of
how the school context may interact with the potential effects of an intervention or program is
the additional opportunities students are offered at school, especially if they are related to the
outcome of interest (Eccles et al. 2003). More specifically, thinking about students’ involvement
in school art activities, one would think it is important to determine whether schools have a
strong extracurricular offering or offer additional support for students to improve the outcomes
of interest at the time the intervention or program is implemented. However, of the quantitative
studies reviewed here, we found that only a few described or accounted for these contextual
differences. To help think through potential sources of bias, either at the student or at the
institutional level, it can be helpful to have a better understanding of the context in which the
intervention or program takes place. A more nuanced understanding about how processes of
implementation take place can also be helpful in thinking through alternative explanations that
could account for observed differences in outcomes—an advantage of mixed methods research,
which we address later in this section.
In addition to the context, another point of criticism that is often made about quantitative
studies is that they tend to say too little about the interventions or programs themselves and
often fail to address the degree or quality of their implementation. The need to address these
elements is substantiated by research. Findings from implementation research studies have
demonstrated that programs can vary substantially in quality, in the amount of exposure
participants receive to the intervention or program, in staff facility with delivering the intervention's content, and in the
instructional strategies employed to deliver that content (Durlak, 2015; Century
and Cassata, 2016).
As mentioned before, one of the criticisms of Inoa and colleagues’ (2014) research is the lack of
discussion of the actual program and of other aspects of program implementation. Beyond the
lack of statistically significant findings, their report does not address programmatic factors that
could have given additional insight into their non‐statistically significant findings. For example,
the researchers do not explain how the program was expected to improve language arts and
mathematics learning and do not offer details about the program itself beyond the normative
hours with which it was supposed to be carried out. Additionally, there was no discussion about
differences in implementation of the program or variations in the intensity or approaches
teachers used to teach the content. A better description of these elements would indeed have
been useful to advance other research.
As other studies show, the details of the program and its implementation matter. For example,
Joronen et al.'s (2011) study found that program intensity―that is, greater exposure to the
intervention―influenced the outcomes they examined. Their results―described earlier in the
literature review―indicate that while the effect of the program they evaluated was statistically
significant for the high‐intensity intervention classes, it was not significant for the low‐intensity
intervention classes. Martin et al.'s (2013) findings indicate that arts
participation and engagement were good predictors of both academic and nonacademic
outcomes. They emphasize that the extent to which students are engaged with arts participation
is a better predictor of academic outcomes than is sheer quantity of arts participation; that is,
the quality of the arts education programming is critical.
Qualitative research may help address some of the knowledge gaps about programs. Harding
and Seefeldt (2013) describe how qualitative research can provide valuable insights into some of
the challenges of causal inference. These include obtaining information about the components
and characteristics of a program or intervention. As mentioned earlier in this section, many
quantitative studies tend to give minimal information about the program or intervention they
study and as a result it is difficult to assess what the program/intervention is and if it is
comparable to others. Qualitative research can also offer insights on how selection into the
program/intervention takes place, which is of critical interest to quantitative inquiry. In addition,
qualitative research can also inform the measurement of concepts, offer description of causal
mechanisms, and be helpful in explaining the heterogeneity of effects.
However, the majority of the qualitative studies reviewed did not focus on questions about why or how programs work; instead, most focused on understanding students' beliefs, perspectives, and subjective experiences related to poetry and poetry appreciation.
Given the limitations of quantitative research and the potential for qualitative methods to fill in the gaps, we believe a mixed-methods approach to the evaluation of POL will enable us to understand whether participation in POL has an impact on student outcomes, and will also provide us with insights as to how and why it does or does not affect those outcomes. While our literature review yielded few studies using a mixed-methods approach, bringing quantitative and qualitative methodologies together equips a study to provide more thorough answers to its research questions. Mixed methods leverage the strength of quantitative methods to generalize results
along with the strength of qualitative research to understand the program and its
implementation—how the program works. Qualitative research can also play an important role
in studies seeking to make their causality inferences more robust.
The systematic use of triangulation of quantitative and qualitative data sources in mixed
methods studies has the potential to strengthen conclusions, especially when several pieces of
evidence point in the same direction. Mixed methods offer the opportunity of leveraging a
variety of data sources obtained by different approaches to inform research questions. In
addition to reinforcing findings, qualitative information is uniquely equipped to add depth to
them.
The review of the literature confirms the outcomes of interest for the study and the planned
methodological approach for collecting data and measuring those outcomes. In the next section,
we review key aspects of the framing document―the logic model and the evaluation planning
matrices.
Revisiting the Logic Model and Evaluation Planning Matrices
The Study Framing Document (an early study design deliverable presenting POL's logic model, the study's research questions, and the evaluation planning matrices that map research questions to the outcomes of interest) appears in full in this evaluation plan as Appendix A.15 The logic model and matrices are also presented here in the body of the text, for
ease of demonstrating how the logic model is foundational to the evaluation (while also having a
broader purpose) and how the outcomes and indicators identified in the matrices were
confirmed as appropriate measures by the literature review just presented. The logic model is
shown in Exhibit 2 and the matrices follow in Exhibits 4, 5, and 6.
15
Note that the Study Framing Document is considered a developmental document for this study; the evaluation
plan presents updated content and should be considered the most up‐to‐date presentation of the logic model and
matrices.
Exhibit 2. Logic Model
Note that the three research domains for which we have developed evaluation planning matrices (see below) closely overlap with, but do not exactly align with, the four impact areas shown in the logic model. In part, this is because the NEA requested a study that is more focused on student outcomes than on teacher and community outcomes; thus, our research domains focus primarily on students. Outcomes related to teachers and communities are not a focus of this study, though we will present findings related to both in the likely event that they emerge as a natural part of our inquiry process. Slight alignment shifts are also due to the fact that the logic model has been revised since the study was launched; assisting the NEA and program partners with the revision was part of SPR's charge in conducting the study. The first two matrices are focused on (1) student academic engagement and performance and (2) student socio-emotional development, respectively. These domains map almost exactly to their respective impact areas in the logic model. The third research domain, student poetry appreciation and engagement, overlaps with its impact area but differs in an important way: it focuses on outcomes associated with students, whereas the logic model impact area encompasses community members as well. Exhibit 3 shows the relationship of research domains to logic model impact areas.
Exhibit 3. Research Domains and Logic Model Impact Areas

Research Domain | Logic Model Impact Area
Student Academic Engagement and Performance | Students' Academic Skills and Performance are Strengthened
Student Social and Emotional Development | Students' Social and Emotional Health Increases
Student Poetry Appreciation and Engagement | Awareness and Appreciation of Poetry and Arts Programming Increases16
(no corresponding research domain) | Teacher Knowledge of and Confidence in Teaching Poetry Increases
Exhibits 4, 5, and 6 map research questions for each domain to desired outcomes, constructs, indicators, and data sources.
16
Like the research domain, this impact area includes students; however, unlike the research domain, it also
includes outcomes associated with community members.
Exhibit 4. Academic Engagement and Performance

Research Questions:
- Does student participation in POL correlate with increased academic engagement in English classes and/or in school more generally?
- Does POL have a positive impact on students' reading comprehension and/or analytical skills (particularly regarding poetry)?
- Are POL students more likely to be comfortable using metaphor, simile, or a wider vocabulary in writing or in speaking after the program?

Constructs: ELA Proficiency; Lit History; Learn/Engage; Analytical Cap

Outcomes: Academic engagement in English classes; academic engagement in school; academic motivation in school; post high school aspirations; academic achievement in English classes and in school; reading comprehension; analytical skills in reading poetry; comfort with different poetry forms and devices; vocabulary development

Indicators: # absences; # suspensions; standardized ELA scale scores; standardized ELA proficiency scores; relevant ELA assessments; student GPA; scale scores on standardized tests of reading comprehension; relevant results from interviews and surveys

Data Sources: Admin Interviews; Teacher Interviews; Student Interviews; Student Surveys; Student Records
Exhibit 5. Social and Emotional Development

Research Questions:
- Do students experience increased self-confidence in their public speaking abilities, social skills, intellectual abilities, or in general after participating in POL?
- Do students feel more secure, empowered, and/or articulate in expressing themselves after participating in POL?
- Are students more likely to engage in civic activities during or after participation in POL?
- Are students more likely to engage in extracurricular activities during or after participation in POL?

Constructs: Art Prog; Sense of Self; Confidence; Community Engagement

Outcomes: Self-confidence; empowerment; civic engagement and leadership; in- and out-of-school engagement

Indicators: Scaled survey scores related to confidence in public speaking; scaled scores related to comfort with self-expression; survey scores related to participation in community activities; survey scores related to involvement in student leadership; survey scores related to participation in extracurricular activities, school clubs, and/or after school programs; relevant results from interviews

Data Sources: Admin Interviews; Teacher Interviews; Student Interviews; Student Surveys; Student Records
Exhibit 6. Poetry Appreciation and Engagement

Research Questions:
- Does participating in POL correlate with students' increasing their likelihood of reading or writing poetry for pleasure?
- Does POL promote the sharing of poems among students and, if so, by what means?
- Do students talk about poetry or POL on social media networks after participation versus before?
- Does a teacher or a school's participation in POL correlate with greater incorporation of poetry in classroom/school instruction?
- Does POL participation correlate with any attitudinal changes toward poetry, academics, public speaking/performing, or post high school aspirations?

Constructs: Exposure to SAA/NEA/PF; Poetry Exposure; Arts Appreciation

Outcomes: Behaviors related to reading poetry; behaviors related to writing poetry; sharing poetry with peers; sharing poetry via social media (Facebook, Instagram); increased poetry content in curriculum; attitudes toward poetry; attitudes toward public speaking; post high school aspirations

Indicators: Agreement with reading poetry; agreement with writing poetry; frequency scale of poetry exchanges by social media type; frequency scale of poetry inclusion in curriculum; scale of attitude toward poetry; scale of comfort with public speaking; attitude about finishing HS; % planning to go to college; relevant results from interviews and surveys

Data Sources: Admin Interviews; Teacher Interviews; Student Interviews; Student Surveys; Student Records
Brief Overview of the Evaluation Design
The purpose of the evaluation of the Poetry Out Loud program is to understand student‐level
outcomes associated with the implementation of POL programs. The evaluation uses a mixed-methods design that combines a quasi-experimental component (a treatment group of students participating in POL and a comparison group of non-participating students from the same schools) with qualitative on-site data collection. The quasi-experimental component will include pre- and post-student surveys for the treatment and comparison groups and analysis of student record data for all students (treatment and comparison); the qualitative data collection will help us understand POL program implementation17 and the counterfactual (i.e., the experiences of those in the
comparison group). This design will allow the research team to analyze all outcomes of interest.
It also helps us to provide insight into the factors affecting those outcomes and to identify how
outcomes have changed after implementation of the program.
To learn about the efficacy of the Poetry Out Loud program, SPR will select a purposive sample
of 10 POL‐participating schools across the U.S. to conduct quantitative and qualitative data
collection activities. In consultation with the NEA, SAA staff, and other project partners, SPR will
recruit school sites that meet the criteria to be part of the study. School site selection is addressed in detail in the section that follows. As noted in the evaluation
planning matrices, the study is guided by a series of research questions focused on the
assessment of the program’s impact in three different domains: students’ academic
engagement and performance, poetry engagement and appreciation, and socio‐emotional
development.
17
A detailed understanding of the counterfactual—the programming that the comparison group (students not participating in POL) receives—is important, but the scope of the study precludes site visits long enough to observe
non‐participating classrooms and interview non‐participating teachers and students. Thus, our interview protocols
for participating teachers will include a few questions about what poetry programming would look like in the
absence of POL participation.
Data Collection
We plan to select a purposive sample of 10 schools to include in the study from about 1,360
schools in 18 high‐performing states. Because the NEA is interested in assessing Poetry Out Loud
program outcomes in schools located in states that provide optimal conditions for program
success, schools will be selected only from states that offer these conditions. Based on the NEA
program staff’s years of experience, the NEA’s criteria to determine whether states offer optimal
conditions for the POL programs are as follows:
states should have an overall count of participating students exceeding 2,500;
an overall count of participating schools exceeding 20;
presence of ancillary activities supporting state finals competitions, direct student
exposure to a working artist, and celebratory activities for students and families such as a
welcome banquet or reception;
formal teacher recognition at the state level;
opportunities for winning students to perform at local arts events throughout the state;
strong support for the POL program from executive leadership at the state arts agency;
workshops for teachers and/or students facilitated by the state arts agency;
matching or overmatching of POL grant money with funds from the state arts agency;
and an annual program assessment.
According to the NEA, 18 states meet many of these conditions although they do not necessarily
need to meet them all in order to be considered high‐performing POL states.18 Using NEA data
files from these states, we obtained information about students in POL‐participating schools in
each of those 18 states. As Exhibit 7 below shows, there were about 1,360 participating schools
and over 190,000 participating students in the school year (SY) 2015‐16.19 This exhibit also shows
there was quite a bit of variation in the number of schools participating in POL across states and
in the number of students participating in each school, as some schools reported having only one
participating student and others reported well over 1,000.
18
These states are California, Georgia, Massachusetts, Minnesota, Mississippi, Missouri, Montana, Nevada, New
Hampshire, New Jersey, New York, Ohio, Pennsylvania, Tennessee, Texas, Virginia, Washington, and West Virginia.
19
We excluded 21 schools from the original files NEA provided because these were marked as “withdrew” or “TBA,”
or did not clearly report the number of participating students or school location.
Exhibit 7. Participating Schools and Participating Students in High-performing POL States

State | Schools in POL | Students in POL | Average # of Students per School | Min Students per School | Max Students per School
California | 223 | 31,098 | 139 | 1 | 1,750
Georgia | 88 | 13,365 | 152 | 2 | 2,100
Massachusetts | 87 | 21,219 | 244 | 2 | 1,850
Minnesota | 38 | 2,707 | 71 | 3 | 450
Mississippi | 68 | 5,512 | 81 | 1 | 1,000
Missouri | 29 | 4,134 | 143 | 6 | 468
Montana | 60 | 5,633 | 94 | 10 | 800
Nevada | 48 | 2,751 | 57 | 1 | 230
New Hampshire | 41 | 8,178 | 199 | 5 | 1,300
New Jersey | 160 | 27,155 | 170 | 2 | 2,000
New York | 125 | 13,245 | 105 | 2 | 1,200
Ohio | 56 | 9,276 | 166 | 4 | 875
Pennsylvania | 123 | 6,744 | 55 | 1 | 650
Tennessee | 29 | 3,927 | 135 | 4 | 1,520
Texas | 26 | 2,220 | 85 | 3 | 900
Virginia | 50 | 7,446 | 149 | 1 | 780
Washington | 70 | 21,357 | 305 | 2 | 1,300
West Virginia | 41 | 4,584 | 112 | 3 | 740
Total | 1,362 | 190,551 | | |
Identifying and Recruiting Schools to Participate in the Study
Efficacy studies such as this one enable us to examine the benefits of an intervention under
optimal conditions for the implementation of the Poetry Out Loud program in schools. Because
the phenomena of interest are observed under optimal conditions, the likelihood of observing program effects, if they exist, is maximized. In addition, to reduce sources of self-selection bias, SPR will recruit schools where POL programming is mandatory in at least one grade level. More specifically, schools will be selected so that (1) states are optimally implementing Poetry Out Loud;20 (2) schools are implementing mandatory
20
As noted in Part A, schools will be selected from states that are optimally implementing POL. Optimal conditions as determined by the Poetry Out Loud program partners are as follows: states should have an overall count of participating students exceeding 2,500; an overall count of participating schools exceeding 20; presence of ancillary activities supporting state finals competitions, direct student exposure to a working artist, and celebratory activities for students and families such as a welcome banquet or reception; formal teacher recognition at the state level; opportunities for winning students to perform at local arts events throughout the state; strong support for the POL program from executive leadership at the state arts agency; workshops for teachers and/or students facilitated by the state arts agency; matching or overmatching of POL grant money with funds from the state arts agency; and an annual program assessment. Eighteen states were identified by the NEA and the Poetry Foundation as optimally implementing POL.
POL programming in at least one grade level;21 (3) schools meet the necessary conditions to
implement the study, including having a minimum of 900 POL‐participating students and about
900 non‐participants who are matched using propensity score methods, allowing the
implementation of a school‐wide online survey, and having the ability to provide student‐level
data for all students in the school; and (4) schools possess other features that help achieve a good mix of school sites, primarily in terms of geography and secondarily in terms of locale (urban/rural) and student body composition.
Identifying and Recruiting Optimally Implementing POL Schools
To identify schools that are optimally implementing Poetry Out Loud and have mandatory POL
programming in at least one grade level, the SPR research team will begin working with the NEA
staff to contact State Arts Agency officers in optimally implementing states to obtain
recommendations on which schools would meet the study’s criteria. To aid in this effort, SPR will
develop an introductory email—to be sent by the NEA, on NEA letterhead—explaining the aims of the study, outlining the criteria for school selection, and requesting a meeting to discuss their recommendations for schools that meet the criteria, along with other preliminary information necessary to determine eligibility. The initial email to SAAs will outline (1) the purpose of the study, (2) the school characteristics that would best fit the aims and needs of the study and that we are asking them to identify, and (3) what the research activities of the study will entail. NEA and/or SPR staff will then discuss the request with SAAs by phone.
To make it easier to approach the schools and districts that SAAs identify as candidates to participate in the study, the research team will begin recruitment efforts by requesting that the NEA and State Arts Agency officials send principals and school district superintendents a letter of support that encourages participation in the research and introduces SPR. The team will provide some basic language for the NEA and State Arts Agency officials that they can use to draft the letter, customizing it with relevant additional detail. Based on our experience recruiting other sites for research studies, having the support of a high-level entity that funds the program being evaluated can make an important positive difference in the recruitment process. Overall, we think that expressed support from the NEA and state arts agencies will increase the likelihood that school and district administrators will agree to participate in the study. The letters will list some of
the benefits of participation for schools and districts, such as: (1) an opportunity for districts and
schools to learn how others are implementing POL; (2) an opportunity for districts and schools to
participate in research that examines how POL programming benefits youth; (3) the chance to contribute to a knowledge base with rigorous evidence about incorporating POL programming in schools, which will provide information that policymakers and educators across the country can use (i.e., participating in the study is a way to give back); and (4) the possibility of professional development opportunities related to POL, along with free resources that they can share with their students.
21
NEA defined "mandatory" participation at the classroom level as individual teachers deciding that their class will participate in POL and that every student in the class will be required to select and memorize a poem and compete in the classroom and/or school competition. Mandatory participation at the grade level is when all teachers in a particular grade or grades agree to participate in POL and require all students in that grade level to select and memorize a poem and compete in the classroom and/or school competition. Selecting schools with mandatory participation prevents self-selection bias in the sample.
Once SPR has a final list of recommended schools that implement mandatory grade-level POL programming, and the NEA/SAA letter has been sent to school principals and district superintendents, we will begin contacting schools. Overall, the team expects to contact about 30 schools and school districts and to reach final agreements to participate in the study with 10 of them.
SPR will first contact school principals via email. The initial communication will contain key
information about the study and will let principals know that the research team would like to
schedule a 15-minute phone conversation.22 The call will briefly describe the purpose of the study and the research activities we plan to undertake. It will also be used to determine whether the selected school meets the necessary criteria for the study and whether the principal would be willing to participate in the research activities. During the call, after a brief overview of the study, the research team will talk with principals about a number of issues that help determine whether their school is appropriate for the study and whether they would be willing to participate. The following list includes topics that the team will explore with
the principals. The topics will also help the team prepare to work with the school in order to
implement the study there.23 Topics of conversation may include:
Why the school decided to participate in Poetry Out Loud
Benefits of participating in Poetry Out Loud
Additional funding or in‐kind resources for POL
Details about how POL is implemented – (mandatory for the whole school, by grade, by
class)
Duration of participation in Poetry Out Loud
Details about the English Language Arts department
Presence of teaching artists at the school
School guidelines for teachers to obtain Professional Development
Details about students and school‐issued email addresses
Details about student record data
How the school measures ELA academic achievement
Standardized assessments used by school and how student GPA is calculated
Existence of school district level data sharing agreements for research purposes
Details about what activities will occur if the school decides to participate in the study
Use of passive parental consent in the school
Exploring how the research team could implement the school-wide survey
22
These details of the letter (and, in more detail, the phone call) will include (1) the purpose of the study, (2) a preview of school characteristics that would best fit the aims and needs of the study, (3) what the research team will be asking of schools, (4) what data the team will collect, (5) how the data will be used, and (6) benefits of participating in the study.
23
Before conducting the site questionnaire, the research team will already have a list of information about the school. This information includes: total student enrollment, number of POL participants, school geographic location, school locale (urban/suburban/rural), percent of students participating in free and reduced lunch, and percent of minority students.
Once we have assessed the school site information and determined which schools meet the criteria for the study, the research team will contact the school superintendent's office using a process similar to the one followed with school principals. The main goals of communicating with school district officers are to obtain additional information, determine school site eligibility (which includes establishing data sharing agreements to access de-identified student-level data), and make sure we follow the research protocol in school sites within the district.
There are important topics to discuss with school district personnel, including the process for
obtaining student‐level data, restrictions on student‐level variables that can be released, the
process for establishing data sharing agreements, permission to conduct student surveys within
the school, and parental consent. Because this research is not under contract with the
Department of Education, it is likely that the Family Educational Rights and Privacy Act (FERPA)
will require parental consent to allow data collection activities. Additionally, because the research involves direct interaction with students, all districts are likely to require parental consent (see Appendix E). To this end, SPR will pursue a passive consent strategy.
The preliminary list of topics for school district personnel include items such as:
School district history of collaboration with a research consortium or researchers
School district process for establishing data sharing agreements to provide de‐identified
student level data for research purposes
School district capacity to provide student record data for all students for a specific
school
How students who participate in Poetry Out Loud are identified in student record data
Details about what research activities would include
Process for school sites to gain approval from the district to engage in research activities
School district guidelines around using passive parental consent
Process and guidelines for obtaining de‐identified student data for research purposes
Exploration of what would facilitate the school district’s willingness to collaborate with
NEA and the State Arts Agency to carry out this research
Potential impediments to the school district participating in the research
The team recognizes that not all school districts will be willing or able to agree to the full list of
requests. However, the team has included various steps in the recruitment process to increase
the likelihood of recruiting schools that will be able to provide the most comprehensive support
for the study.
Identifying Schools That Meet Criteria for the Study
There are three important criteria schools need to meet to participate in the study: (1)
mandatory implementation of POL programming in at least one grade level, (2) number of
students participating and not participating in POL and (3) willingness/ability to participate in all
three research activities (online student survey; student and teacher interviews; and provision of
student-level record data). The first criterion, identifying schools that implement mandatory participation, is intended to minimize student self-selection. In regard to the second criterion, one of the primary goals for this study is to carry out the research using a quasi-experimental design, which aims to strengthen the causal inferences that can be made. To make this possible, each of the schools in the purposive sample will need to have 900 or more students participating in POL and at least as many non-POL participants.
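Because the design calls for matching POL participants to non-participants using propensity score methods, the sketch below illustrates one common way such matching can be carried out. It is a minimal illustration in Python on simulated data with placeholder covariate names (prior ELA score, grade level, free/reduced-lunch status), not the actual matching specification the team will adopt; in practice, matching would be done within schools and covariate balance would be checked afterward.

import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)

# Simulated student records; variable names are illustrative, not the
# actual fields districts will provide.
n = 2000
students = pd.DataFrame({
    "pol":       rng.integers(0, 2, n),    # 1 = POL participant
    "prior_ela": rng.normal(500, 50, n),   # prior ELA scale score
    "grade":     rng.integers(9, 13, n),
    "frl":       rng.integers(0, 2, n),    # free/reduced-price lunch
})
covariates = ["prior_ela", "grade", "frl"]

# Step 1: estimate each student's propensity to participate in POL.
ps_model = LogisticRegression(max_iter=1000).fit(students[covariates], students["pol"])
students["pscore"] = ps_model.predict_proba(students[covariates])[:, 1]

# Step 2: 1:1 nearest-neighbor matching on the propensity score (with replacement).
treated = students[students["pol"] == 1]
control = students[students["pol"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
matched_control = control.iloc[idx.ravel()]

print(f"{len(treated)} POL students matched to {len(matched_control)} comparison students")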
In regard to the third criterion, agreement to participate in all phases of the research, the
recruitment strategy builds in steps to assess which schools will be able to participate fully in the
research.
Obtaining a Good Mix of Schools
After finalizing the list of school site recommendations, the research team will provide a
description of these schools in terms of geographic location, locale (urban/suburban/rural), and
student body composition. For this, we will first obtain schools’ information using Common Core
of Data (CCD) from the NCES. As the selection process is underway, the research team will look at the mix of schools and assess its quality, prioritizing school site location first and other school site characteristics second. The objective is primarily to ensure school sites are well dispersed across the U.S. and secondarily to diversify the mix of schools in terms of other school characteristics.
Student Surveys
SPR will conduct online pre‐ and post‐surveys for all students who will be participating in POL
during SY2018‐19. The research team will also survey students who did not participate in the
program.24 The pre‐ and post‐student surveys are designed to gather the necessary data to
answer the research questions of interest. To measure changes in responses and determine
whether the program has had an impact on a variety of outcomes, survey data will be collected
before and after POL programming has occurred (and before and after standard ELA curriculum
for non‐participants).
Designing the Survey Instrument and Cognitive Testing
Informed by the literature review, the research team began developing the survey instrument by
identifying existing measurement scales and survey items that will yield information about each
of the three domains of interest: 1) students’ academic engagement and motivation, 2) social
and emotional development, and 3) poetry appreciation and engagement. Measurement scales
are the indicators that will be used to answer the research questions guiding the evaluation.
24
As discussed earlier in the report, gathering survey data for non‐POL participants strengthens the ability of making
inferences about the effects of the program. Without a comparison group, it is not possible to determine the impact
of POL participation on student outcomes.
Before delving into each of the three domains, it is important to describe how measurement
scales are useful in designing the instrument and how these are expected to yield the necessary
information about specific constructs. Constructs can be defined as concepts that cannot be
directly observed nor can they be directly measured (e.g. intelligence, motivation). For this
reason, researchers have actively developed measurement scales composed of a collection of
purposely selected items (also known as indicators) that are interrelated and that together
represent a specific underlying construct (Carifio and Perla, 2007). Indicators in this case are the
direct measures that provide specific information. In general, scales are preferred over single
survey questions because they are composed of multiple indicators that have been previously
subjected to reliability and validity tests and are backed by empirical evidence that shows the
items together serve as proxies for a specific construct (Clark and Watson, 1995; DeVellis, 2003).
Thus, constructs are inferred from direct measurements of multiple items, indicators, or
variables, that theoretically are related (Borsboom et al., 2003). For example, the construct of
student academic engagement can theoretically be represented by a variety of behaviors,
including students’ level of effort, their ability to persist at different tasks, as well as their
intrinsic motivation toward learning and enthusiasm toward engaging in school activities. All
together, these behaviors make up our notion of academic engagement. Importantly, the more
that is known about the theoretical foundations of a particular construct, the more likely it is that reliable, valid, and useful scales will be available to measure it.
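To make the idea of a multi-item scale concrete, the following Python sketch uses hypothetical item names and responses (not items from the actual instrument) to show one simple way several Likert-type indicators can be combined into a composite score for a construct such as academic engagement. Published scales may prescribe different scoring rules (sums, reverse-coded items, subscales), so this is only a generic illustration.

import pandas as pd

# Hypothetical responses to four 5-point Likert items intended to tap the
# "academic engagement" construct (item names and values are illustrative only).
responses = pd.DataFrame({
    "eng_effort":      [4, 5, 3, 2, 4],
    "eng_persistence": [4, 4, 3, 1, 5],
    "eng_interest":    [5, 5, 2, 2, 4],
    "eng_enthusiasm":  [3, 4, 3, 2, 5],
})

# A simple composite: the mean of the items each respondent answered.
responses["academic_engagement"] = responses.mean(axis=1)
print(responses)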
As described in the logic model (Exhibit 2), it is hypothesized that POL participation is associated
with a variety of outcomes. Participation is expected to be associated with students’ academic
engagement and academic achievement. It is also expected to be related to the development of certain socio-emotional competencies such as self-confidence and empathy, and to help develop a variety of student skills such as public speaking and the ability to communicate effectively. Lastly, participation in POL is expected to
generate changes in attitudes toward poetry and how students interact with poetry in a
meaningful way. The following sections present each of the three domains; outline the
constructs of interest that will ultimately be included in the preliminary draft of the survey
instrument (see Appendix G), and make explicit how the evaluation objectives, the research
questions, and constructs are connected.
Academic Engagement and Motivation, and Achievement
Generally, student academic engagement can be defined as the level of participation and
intrinsic interest that a student demonstrates in school. Engagement in schoolwork involves a
variety of behaviors such as effort and persistence and a variety of attitudes toward learning and
toward school. Using the evaluation research questions as a guide, the research team selected
the constructs that would yield data to help inform the research questions. Drawing from the
literature, the team then searched for measurement scales for the constructs that were
identified. To select the measurement scales or survey items, the research team gave preference
to instruments that have previously been implemented and that provide thorough technical
information about the validity and reliability of the scales they utilize. Exhibit 9 shows how the
research questions related to academic engagement, motivation, and achievement are
connected to the constructs, and which indicators will measure each underlying construct. Lastly, it lists the survey instruments from which each indicator is drawn. Technical information
about the validity and reliability properties of the measurement scales is included in Appendix H.
Exhibit 9. Academic Engagement, Motivation, and Performance

Research Question: Does student participation in POL correlate with increased academic engagement in English classes and/or in school more generally?
Constructs: ELA Proficiency; Analytical Cap; Learning Engagement
Outcomes: Academic engagement in school; academic engagement in English classes; academic motivation in school; measure of school climate; college aspirations
Indicators: Survey questions assessing overall interest in learning (Q1); academic engagement in English classes, assessing interest in the class and the topics covered as well as the general disposition toward the class (Q2); academic motivation in school, assessing willingness to do well in school and ability to persevere (Q1); measure of school climate, including feelings of belonging and relationships with peers and teachers (Q3)
Survey Instruments: Student Engagement in School Questionnaire (SESQ) (Q1); Chicago Public Schools 5Essentials Survey (Q2); California Healthy Kids Survey (CHKS) (Q3)

Outcome: Academic achievement in English classes and in school
Indicator: Students' self-reported measure of academic achievement in school (Q7) and in their English class (Q8)
Survey Instrument: California Healthy Kids Survey (CHKS) (Q7; Q8)
At this stage, the draft of the survey instrument includes items that help answer the research
questions about the relationship between POL participation and academic engagement,
motivation, and achievement, and includes items intended to measure the development of
skills related to POL participation (e.g., communication skills and comfort with public speaking).
In addition, the instrument includes items about constructs that yield indirect, yet relevant,
information about students’ context, such as perceptions about school climate, frequency of
participation in extracurricular activities (a construct also present in the social and emotional
domain), and an item related to academic aspirations.
Social and Emotional Outcomes
Broadly speaking, social and emotional development encompasses a wide variety of processes
through which students acquire and effectively apply the knowledge, attitudes, and skills
necessary to understand and manage emotions, develop self‐awareness, feel and show empathy
for others, develop skills to establish and maintain positive relationships, and apply decision‐
making skills to social situations in ways that contribute to the well‐being of the student and her
community (Hamedani and Darling‐Hammond, 2015; Durlak, 2011). From this wide variety of
processes, the research team focused only on key constructs that are either more directly
related to the research questions guiding the evaluation or those for which there is previous
research highlighting their importance (see for example Goldstein, 2011).
Specifically, for this domain of social and emotional development, the survey instrument draft
includes measures related to the constructs of self‐confidence, sense of empowerment and self‐
expression, certain prosocial attitudes and behaviors, and other indicators of civic participation
and volunteerism. After identifying the constructs, the research team reviewed a variety of
existing instruments and selected specific scales and survey items. Exhibit 10 includes the
research questions in the social and emotional domain, lists the constructs, and identifies the indicators that serve as measures of those constructs. Notably, as researchers have indicated, attention to the social and emotional aspects of education has become more relevant because these aspects, also known as "non-cognitive factors" or "soft skills," are strong predictors of student success (Darling-Hammond, 2015; Goldstein, 2011).
Exhibit 10. Social and Emotional Development

Research Questions:
- Do students experience increased self-confidence in their public speaking abilities, social skills, intellectual abilities, or in general after participating in POL?
- Do students feel more secure, empowered, and/or articulate in expressing themselves after participating in POL?
- Are students more likely to engage in extracurricular activities during or after participation in POL?

Constructs: Confidence; Sense of Self; Community Engagement

Outcomes: Self-confidence; empowerment and leadership; in- and out-of-school engagement

Indicators: Scales measuring self-confidence in social skills and intellectual abilities (Q10); scale related to self-confidence in public speaking (Q6); scale measuring comfort participating in group discussions (engagement with peers) (Q6); scale related to comfort with self-expression (Q6/Q6a); scale related to confidence in solving issues (Q10) and taking leadership roles (Q11); survey scores related to participation in extracurricular activities, school clubs, and/or after school programs (Q5)

Survey Instruments: California Healthy Kids Survey – Youth Resilience & Development Module (Q10); California Healthy Kids Survey – Core Module (Q5; Q10); PRCA-24 (Q6/Q6a); Common Measure: Leadership Development (Q11)
Poetry Appreciation and Engagement
Several features of POL are intended to promote students’ poetry appreciation and engagement
and, therefore, it is hypothesized that participation in POL will impact constructs in this domain.
In contrast to the previous two domains, the measurement of poetry appreciation and
engagement is underdeveloped. During the literature review, the research team identified only a
handful of instruments, and none of them had information about their technical features readily available. To develop items, the research team either drew on or adapted questions from
existing instruments. In the end, the research team identified five different constructs: general
attitudes toward poetry, and attitudes toward reading, writing, memorizing, and reciting poetry.
To be able to provide information for research questions in the domain, the instrument includes
questions about sharing poetry with peers and sharing poetry via social media. Lastly, because, in high-performing schools, the majority of students will have been exposed to the program in
some form, the instrument includes a survey item about POL participation and student
participation in POL competitions in previous years.
Exhibit 11 shows the research questions guiding the poetry appreciation and engagement
domain and lists the constructs the research team identified. Along with the constructs, it
mentions the specific indicators that would serve to measure each of the constructs.
Exhibit 11. Poetry Appreciation and Engagement

Research Questions:
- Does participating in POL correlate with students' increasing their likelihood of reading or writing poetry for pleasure?
- Does POL promote the sharing of poems among students and, if so, by what means?
- Do students talk about poetry or POL on social media networks after participation versus before?
- Does POL participation correlate with any attitudinal changes toward poetry, academics, public speaking/performing, or post-high school aspirations?

Constructs: Arts Appreciation; Poetry Exposure

Outcomes and Indicators:
- Behaviors regarding reading poetry: survey item measuring agreement with reading poetry for pleasure (Q12)
- Behaviors regarding writing poetry: survey item measuring agreement with writing poetry for pleasure (Q12)
- Sharing poetry with peers: survey items measuring agreement with sharing poetry among peers (Q12)
- Sharing poetry via social media (Facebook, Instagram): survey items measuring frequency of poetry exchanges via social media (Q20)
- Attitudes toward poetry (general): survey item measuring attitudes toward poetry (Q12)
- Attitudes toward poetry (memorization): survey item measuring attitudes toward memorizing poetry (Q12)
- Attitudes toward poetry (recitation): survey item measuring attitudes toward reciting poetry (Q12)
- Attitudes toward public speaking*
- Post high school aspirations*

Survey Instruments: POL Student Survey and Koukis (2010) for the Q12 items; POL Student Survey and an internally generated survey item for Q20

Note: Survey items marked with an "*" were already included in prior sections.
The next steps in the survey design are to share the draft survey instrument (see Appendix G) with the NEA team and the technical review group and to gather their feedback to refine the survey. Once the review and revision period is complete, the research team will finalize the instrument and prepare to conduct cognitive testing of the survey instrument.
Cognitive Testing
The primary objective of the cognitive testing phase is to investigate whether the survey
questions are being interpreted as originally intended; that is, to determine whether
respondents understand the questions correctly and whether they can provide accurate
answers. During the cognitive testing process, researchers will be looking for survey items that
respondents misunderstand or items where respondents experience additional difficulty in
answering. Other problems can arise if respondents do not fully comprehend a question, interpret it differently than intended, or lack the prior information or understanding needed to answer it.
To prepare for cognitive testing, the research team will first select a purposive sample of high school students, preferably from different grades, genders, and ethnic/racial and linguistic backgrounds, to conduct nine or fewer cognitive interviews. Members of our research team will administer a draft questionnaire and conduct the cognitive interviews using common approaches: think-aloud (asking respondents to share their thoughts as they answer the questions) or active probing (asking specific, targeted questions). Once these interviews have been conducted, the researchers will meet to compare notes, summarize the results, and recommend modifications to the survey that address the issues encountered.
After collecting the survey data through cognitive interviews, the research team will then
conduct exploratory analyses to assess the instrument validity and reliability. These exploratory
analyses will assess the internal consistency of survey items intended to measure an underlying construct by calculating Cronbach's alphas. As part of the analyses, the research team will conduct preliminary construct and discriminant validity analyses of the instrument by calculating the correlations between items and determining their overall association. To do this,
the research team will look at construct validity and test whether survey items that are expected
to be related to each other are in fact closely co‐varying. The final step of the cognitive testing
process is to share a memo with NEA summarizing the findings.
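As a minimal illustration of these internal-consistency checks, the Python sketch below computes Cronbach's alpha and an inter-item correlation matrix for a hypothetical set of pilot responses; the item names and values are placeholders rather than the actual survey items, and with nine or fewer cognitive interviews such estimates would be treated as strictly exploratory.

import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of items assumed to measure one construct."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical pilot responses to three attitude items (placeholders only).
pilot = pd.DataFrame({
    "poetry_attitude_1": [5, 4, 4, 2, 3, 5, 1, 4],
    "poetry_attitude_2": [4, 4, 5, 2, 3, 5, 2, 4],
    "poetry_attitude_3": [5, 3, 4, 1, 2, 4, 1, 5],
})

print("Cronbach's alpha:", round(cronbach_alpha(pilot), 3))
# Inter-item correlations help flag items that do not co-vary with the rest.
print(pilot.corr().round(2))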
After cognitive testing and revisions to the survey instrument are complete, SPR will submit the survey instrument, along with the qualitative data collection instruments, to an
institutional review board (IRB) for review of the research design as it pertains to human
subjects. Thereafter, the team will also submit the survey, along with other required materials,
to the Office of Management and Budget (OMB) for approval under the Paperwork Reduction
Act. Both the IRB and the OMB review processes are described in the deliverables section below.
Student Administrative Records
We will collect student‐level administrative records from the selected schools after developing
data sharing agreements with each of the school districts. During that process, we will submit a
list of the student variables we are requesting. We intend to collect these variables for all
students enrolled in the schools selected for the study. During the process of recruiting schools, we will provide a parental consent form for the school to review and will request assistance in sharing the form with all parents or guardians of students. The data we will ask for include the
following:
1. Unique identifiers for all students (with student proxy id generated by the school district)
2. Participation in POL identifier for current and prior academic year
3. Student‐level demographic information (e.g., gender, race/ethnicity, free and reduced
lunch status, special education, English learner)
4. Grade level
5. Relevant assessment data in English Language Arts and language proficiency tests for
school year (SY) 2018-19 and, if applicable, one prior academic year (SY2016-17);
GPA and ELA end‐of‐course grades
6. Student‐level records of attendance, suspensions, and expulsions
Using a Baseline Test to Increase Precision
A large proportion of the variance in student test scores can be explained by students’ prior
achievement (Martin et al., 2013; Rivkin, Hanushek, & Kain, 2005). By estimating impacts with a
regression model that uses prior achievement as one of the covariates, we can substantially
increase the precision of the impact estimates, allowing us to detect smaller effects. The
research team will pursue this model specification strategy if possible. An important aspect to
consider is whether students have participated in POL in the prior year. Depending on what we find, we will adjust by using a different model for students who participated in the prior year and examining how they differ from students who have participated in POL only once.
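The sketch below illustrates, on simulated data with placeholder variable names, how including a prior achievement covariate in an ordinary least squares model can tighten the estimate of a POL participation effect; the actual model specification will depend on the variables districts can provide and would typically add demographic covariates and school indicators.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Simulated records; a real model would use the district-provided data.
n = 1200
df = pd.DataFrame({
    "pol": rng.integers(0, 2, n),
    "prior_ela": rng.normal(500, 50, n),
})
df["post_ela"] = 0.8 * df["prior_ela"] + 5 * df["pol"] + rng.normal(0, 20, n)

# Controlling for prior achievement absorbs much of the outcome variance,
# which shrinks the standard error on the POL coefficient.
model = smf.ols("post_ela ~ pol + prior_ela", data=df).fit()
print(model.summary())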
Transferring and Archiving Data Securely
Protecting the confidentiality of sensitive data is a priority of the study team. SPR has adopted
federal standards for the use, protection, processing, and storage of data. Our security policies,
procedures, and safeguards are consistent with the Privacy Act, the Federal Information Security
Management Act, OMB memoranda on data security and privacy, and National Institute of
Standards and Technology security standards. Our approach to implementing security controls
includes assigning dedicated security and privacy experts to each project and leveraging
company-wide secure computing infrastructure and data handling practices. SPR secures sensitive information and strictly controls access to it on a need-to-know and least-privilege basis.
The study team takes seriously the ethical and legal obligations associated with the collection of
confidential data. Ensuring the secure handling of confidential data is accomplished via several
mechanisms: obtaining suitability determinations for designated staff, training staff to recognize
and handle sensitive data, using secure data transfer protocols, protecting computer systems
from access by staff without favorable suitability determinations, limiting the use of personally
identifiable information (PII) in data, limiting access to secure data on a need‐to‐know basis, and
creating data extract files from which identifying information has been removed. The assurances
and limits of confidentiality will be made clear in all advance materials sent to recruit states and
respondents. All data that includes PII from all study components will be housed on a disc drive
in a locked cabinet at SPR; all non‐PII study materials will be kept in a secure project folder, to
which only the study team has access. Upon completion of the project, SPR will ensure the
secure destruction of all data originally provided (i.e., data containing PII), employing digital or
physical shredding of electronic or physical data. When disposing of electronic data containing
PII, SPR uses secure deletion software that overwrites disks a minimum of seven times for reusable media (USB drives and hard drives) and uses physical destruction (cross-cut shredding) for non-reusable media (e.g., CDs/DVDs).
Site Visits
In‐person data collection is at the core of our qualitative data collection. Interviews with
administrators, teachers, and students will allow us to document the “story” of POL in the
optimally implementing schools selected for the study, from the initiation of participation
through the reception of the program by students, including those who go on to compete at the
state and national levels. Interview data will allow us to contextualize the quantitative findings
from the pre/post assessments and the administrative data. We plan to visit at least 6 schools in
our sample and conduct the remaining “school visits” remotely through video conferencing.
During school visits, we will collect data by means of: (1) semi‐structured interviews with POL‐
participating students and teachers following a prepared interview protocol; state arts agency
administrators will be interviewed by phone prior to the site visit, also following a prepared
protocol; (2) focus groups with POL‐participating students; and (3) observations of program
activities using an observation template. Interviews and focus groups add different elements to
qualitative data collection. Interviews allow for intimate conversation between researchers and
interviewees, while focus groups primarily allow us to hear from a larger number of youth. Also,
one‐on‐one interviews can yield more personal information, while focus groups draw out the
social story – how a community of students together experience the program. SPR will
communicate with POL coordinators and teachers at our chosen schools well in advance to
select and schedule an optimal time to visit/hold interviews. When possible, on-site visits will occur at a time when we can observe POL activity―a poetry lesson being taught, students practicing for school or state competitions, or a competition being held.
Interviewers will use semi‐structured interviews to capture the perspectives of the students,
teachers, and administrators, and to document differences and similarities among respondents.
SPR will make every effort possible to collect robust, reliable data as part of the site visit
interviews. Site visitors will rely on structured interview protocols which will allow them to
collect data consistently and systematically across the schools while still providing for some
flexibility to pursue topics as they arise. We will also cover similar topics with each of the
respondents to gain a fuller picture of the program and its effect on students and teachers.
Actual schedules will differ depending on how many classrooms in each school are participating
in POL, how many teachers are available to meet with us, how many students the teachers are
able to line up for interviews and focus groups, and how many teachers are available to have us
observe in their classroom (on‐site visits only).
The interview and focus group protocols will focus on collecting data about the program, about
its organization and implementation, and about how the program influences the youth in the
different outcome domains the evaluation is interested in. The interview protocols are in
Appendix F. Exhibit 13 shows some of the topic areas to be covered.
Exhibit 13. Interview Topics

Interviews and Focus Groups
Respondents | Topics to be Explored
SAA Administrators | Relationship between SAA and POL, especially support that would help contextualize findings; SAA's role in implementation of POL; outcomes of interest to the SAA; impact on the larger arts community in the state; impact on SAA staff and organization
Teachers | Teacher background; experience with POL curriculum; intensity of POL programming; perceptions of student academic, socio-emotional, and poetry-specific outcomes; POL's influence on the teacher and on the teacher's teaching practice
Students | Participation in POL; feelings about poetry; poetry "use" (reading and writing for pleasure, sharing with peers, etc.); probing connections between poetry and academic outcomes and social and emotional development
Observations
If possible, we will observe a POL unit being taught, students practicing for competitions, or a
competition itself.
In advance of the visit, we will share parental passive consent forms with the school staff and
teachers who are helping to coordinate our visit so that students can share them with their
parents and gain passive consent to participate in interviews and focus groups. (That is, parents
who object to their child participating in the study can return the form signed and that child will
not be included in the study; all others will be included.) Youth participating in the study will
themselves provide a verbal or electronic assent to participate. The passive consent and verbal
and electronic assent language, which can be found in Appendix E, will go through the IRB
approval process, described in the section on Deliverables and Timeline, below.
After each site visit is complete, the site visitor will prepare detailed notes in the form of a site
visit write‐up. Write‐ups will describe key features of POL as observed and discussed at each
school. The research team will develop a write‐up template that mirrors the protocols, to ensure
that systematic information is recorded during each site visit. We will use these write‐ups to
conduct the analysis and reporting described in the next chapter.
Technical Review Group
We have established and will convene a technical review group (TRG) to provide us with critical
feedback to ensure high quality at the beginning, middle, and end of the study. The TRG will be
convened three times: once to review and give input into the evaluation design; a second time in
the middle of data collection, to review data collected thus far and any early analysis; and a third
time, toward the end of the study, to review a draft of the final report. The TRG, which has
already been recruited, includes experts in creative youth‐development research and evaluation,
data collection, and teaching. Recruited members to date are:
Creative Arts Education Research
Sarah Cunningham, Executive Director for Research & Director, Arts Research Institute,
School of the Arts, Virginia Commonwealth University
Jonathan Herman, Executive Director, National Guild for Community Arts Education
Data Collection
Jamal Abedi, Professor, University of California at Davis, School of Education
Educators
Philip de Sa e Silva, Educator, St. Paul Academy, Minnesota
Aimee Espiritu, Educator, Youth Speaks, Oakland, California
Derek Fenner, Educator, Alameda County Office of Education, Oakland, California
Andrea Santos, Educator, Logan High School, West Virginia
We consulted with POL program partners to develop a strong list of potential review group members, which helped us meet the partners' standards for quality assurance. We have budgeted to offer honoraria to individuals who serve as TRG members. We will allow ample time for the group to review the deliverables or other materials we share with them, and for our team to incorporate their feedback. To conserve costs, we will convene this group via
videoconference. Members of the NEA team will be invited to observe meetings and we will
provide minutes of each meeting to the NEA project director and to group members.
Data Analysis
In this section, we describe our plan for analyzing the data we collect.
Quantitative Data Methods
There are two quantitative components of the POL evaluation included in the design: a pre‐ and post‐student survey and the analysis of student administrative data using quasi‐experimental techniques. Each of these components is described below.
Pre/Post Survey Analysis
As mentioned in the data collection section above, the analysis of the survey will start with
examining preliminary data obtained through cognitive testing. After discussing the results of the
cognitive testing with NEA staff and TRG members and making the necessary modifications to
the survey items, the research team will then prepare to deploy the student online pre‐survey
across 10 schools. As pre‐survey data become available, the research team will take a second look at the reliability measures and evaluate the internal consistency of survey items for data quality assurance. Preliminary analyses will also examine the relationships between the outcomes of interest (e.g., academic engagement, poetry appreciation and engagement, and social and emotional development) and covariates such as demographics and school of attendance.
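For illustration only, the sketch below shows one way the internal‐consistency check described above could be run; the file name (pre_survey.csv) and column names (item_1 through item_5, grade) are placeholders rather than the final instrument's variable names.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of Likert-type items (rows = students, columns = items)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical pre-survey extract: five items from one scale plus a demographic covariate.
pre = pd.read_csv("pre_survey.csv")
scale_items = pre[[f"item_{i}" for i in range(1, 6)]]

print(f"Cronbach's alpha: {cronbach_alpha(scale_items):.2f}")  # flag scales below ~0.70 for review

# Preliminary look at how scale scores vary with a covariate such as grade level.
pre["scale_score"] = scale_items.mean(axis=1)
print(pre.groupby("grade")["scale_score"].describe())
```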
Once the team deploys and obtains post‐survey data, the first step will be identifying matched
pairs to assess changes in the outcomes of interest. Depending on the availability of data for
non‐POL participants, the research team will determine the appropriate data analysis methods
to examine change in outcomes between the pre‐ and the post‐survey. For instance, if data are not available for students who do not participate in POL, changes in POL participants' outcomes would be analyzed using paired t‐tests to determine whether the mean change from pre‐ to post‐survey differs significantly from zero.
If data are available for non‐POL participants, the research team would be able to conduct other
types of analysis including repeated measures ANOVA or analysis of covariance (ANCOVA).
Repeated measures ANOVA is designed to measure change for related (rather than independent) groups; it detects overall differences between related means, and thereby a treatment effect, but does not take pre‐test scores into account. The ANCOVA approach, by contrast, tests whether post‐survey means, adjusted for pre‐test scores, differ between the two groups: those who participate in POL and those who do not.
ANCOVA uses the pre‐scores as a covariate and the post‐scores as the dependent variable, and accounts for "treatment" as a factor. It is considered a more versatile method in situations where basic
ANOVA assumptions are violated.25 The general ANCOVA model is as follows:
Post-score = β0 + β1 × I(Intervention) + β2 × Pre-score + error
In this model, I(Intervention) is an indicator variable; for pretest‐posttest data with a single intervention, it takes the value "1" if a subject participated in POL and "0" if the subject did not. ANCOVA can account for differences that exist between the groups at pre‐test, which are likely given that the two groups will not be selected randomly (Bonate, 2000).
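To make these options concrete, the sketch below shows how the paired t‐test and the ANCOVA model above might be estimated; it is illustrative only, and the file and column names (matched_pre_post.csv, pre_score, post_score, pol) are placeholders rather than final specifications.

```python
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

# Matched pre/post records; 'pol' = 1 for POL participants, 0 otherwise (placeholder names).
df = pd.read_csv("matched_pre_post.csv")

# Option 1: POL participants only -- paired t-test on the pre-to-post change
# (a Wilcoxon signed-rank test is a non-parametric alternative; see footnote 25).
pol_only = df[df["pol"] == 1]
paired = stats.ttest_rel(pol_only["post_score"], pol_only["pre_score"])
print(f"Paired t-test: t = {paired.statistic:.2f}, p = {paired.pvalue:.3f}")

# Option 2: comparison data available -- ANCOVA, regressing the post-score on the
# pre-score plus a POL indicator (the model shown above).
ancova = smf.ols("post_score ~ pre_score + C(pol)", data=df).fit()
print(ancova.summary())  # the C(pol) coefficient is the adjusted group difference
```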
Missing Survey Data
Although we will conduct the survey using methods (electronic administration and close work
with school administrators and teachers) to ensure a reasonably high response rate (around
40%) to both pre‐ and post‐surveys, it is very likely that some students’ data will be missing in
either the pre‐ or the post‐survey. Because missing data can have a significant effect on the
conclusions that can be drawn from the data, SPR will look at whether certain responses are
more likely to be missing and whether certain groups are more likely to have missing values than
others. More specifically, we will calculate the response rate to understand survey non‐
response, one of the principal sources of error that can potentially bias the results.26 To do this
we will compare the characteristics of those who responded to the survey with the pool of
program participants on various demographic characteristics (e.g., age, grade, gender, race and
ethnicity). The pool of participants will be obtained from the student administrative data we
receive from schools. Through this comparison, we will determine whether there are statistically
significant differences between the actual and potential survey respondents. Depending on the
results, we will determine if there is need to address non‐response using additional statistical
procedures such as weighting.
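A minimal sketch of this non‐response check, assuming the participant roster and the respondent file share a demographic field (here gender, used only as an example): it compares the two distributions with a chi‐square test and derives simple cell‐based non‐response weights.

```python
import pandas as pd
from scipy import stats

roster = pd.read_csv("participant_roster.csv")       # all program participants (placeholder)
respondents = pd.read_csv("survey_respondents.csv")  # students who completed the survey

print(f"Response rate: {len(respondents) / len(roster):.1%}")

# Compare the demographic profile of respondents with the full participant pool.
pool_counts = roster["gender"].value_counts()
resp_counts = respondents["gender"].value_counts().reindex(pool_counts.index, fill_value=0)
expected = pool_counts / pool_counts.sum() * resp_counts.sum()
test = stats.chisquare(f_obs=resp_counts, f_exp=expected)
print(f"Respondents vs. pool: chi2 = {test.statistic:.2f}, p = {test.pvalue:.3f}")

# If the profiles differ, simple cell weights = pool share / respondent share.
weights = (pool_counts / pool_counts.sum()) / (resp_counts / resp_counts.sum())
respondents["weight"] = respondents["gender"].map(weights)
```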
Student‐level Administrative Data
To analyze student academic outcome measures, including student test scores in English
language arts and other student measures such as attendance and discipline records, we will use
a quasi‐experimental design (QED) approach. The QED component relies on student
administrative data because these data are available for a rich pool of students, which will allow us to draw a similar comparison group to compare student outcomes. As mentioned earlier, we expect districts to provide student‐level data for all students in the school.
25
ANOVA assumptions state that 1) the dependent variable should be measured at the continuous level; 2) the independent variables should consist of two or more categorical, independent (unrelated) groups; and 3) there should be independence of observations. Because dependent variables in the study will likely be measured using Likert scales, non‐parametric tests such as the McNemar and Wilcoxon signed‐rank tests will also be useful to assess pre‐ and post‐survey change. These tests are advisable because they make fewer assumptions about the distribution of responses among participants and do not rely on the data having normal distributions.
26
Nonresponse bias is the imbalance that results when respondents differ in meaningful ways from those who did not answer the survey.
The typical approach for evaluating the effectiveness of a program or intervention involves
comparing the outcomes of individuals who participated in it with those who did not.
Randomized experiments in which individuals are randomly assigned to the treatment and
control groups are widely considered the gold standard in research because those conditions ensure that the treated and control groups are similar in terms of both observed characteristics (e.g., race, sex) and unobserved characteristics (e.g., individual motivation). These initial similarities between the two groups allow researchers to attribute any differences in outcomes to the program under evaluation. Because conducting randomized controlled trials in educational settings is particularly difficult due to logistical, ethical, or practical considerations, it is common to use quasi‐experimental designs instead.
As noted above, identifying an appropriate comparison group is critical to assessing the impacts
of the POL programming. Thus, our quasi‐experimental design will draw a comparison group
from non‐POL participants from the same schools. We will use propensity score matching (PSM) to construct a comparison group that is as similar as possible to the group that participates in POL programming, at least on observable characteristics. To do this, we will select a set of covariates from which to estimate the propensity score. The selection of covariates will be based on previous research examining the relationships between the variables of interest (e.g., age, gender, race/ethnic background, English Learner status, prior academic achievement).27 Next, we will pool the data of POL and non‐POL participants to estimate the propensity score, Pr(X) = Pr(T = 1 | X), for each subject. To estimate the propensity score for each subject, we will use a logit regression, with POL
participation as the dependent measure, and a range of demographic and other characteristics
as independent measures to establish the relative weights for each of the independent measures
in "predicting" POL participation. It is useful to think of the propensity score as a prediction of the likelihood that an individual student participates in POL. The next step is to match each student who participated in POL to a student in the group that did not participate in the program. To do this, we plan to use the "nearest
neighbor” approach in the selection process, meaning that we will select the comparison group
member whose propensity score is closest to the respective POL participant. We also plan to use
replacement, so that a potential comparison group member can be matched to several POL
participants. Lastly, we will assess the matching and perform sensitivity tests to assess whether
other approaches would be preferable before estimating the average POL participation effect on
student outcomes.28
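The matching steps described above could be prototyped along the following lines; this is an illustrative sketch rather than the final estimation code, and the file, covariate, and outcome names are placeholders.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

# Pooled POL and non-POL administrative records (placeholder file and column names).
students = pd.read_csv("student_admin_data.csv")
covariates = ["age", "female", "ell_status", "prior_ela_score"]

# Step 1: estimate the propensity score Pr(T = 1 | X) with a logit model.
logit = LogisticRegression(max_iter=1000).fit(students[covariates], students["pol"])
students["pscore"] = logit.predict_proba(students[covariates])[:, 1]

treated = students[students["pol"] == 1]
control = students[students["pol"] == 0]

# Step 2: nearest-neighbor matching on the propensity score, with replacement, so a
# comparison student can be matched to more than one POL participant.
nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
matched_control = control.iloc[idx.ravel()]

# Step 3: naive matched contrast on one outcome; balance diagnostics and sensitivity
# tests would precede any reported effect.
effect = treated["ela_score"].mean() - matched_control["ela_score"].mean()
print(f"Matched difference in ELA scores: {effect:.2f}")
```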
The expectation is that matching individuals on baseline characteristics will reduce group differences that might bias the outcomes of interest (Rosenbaum and Rubin, 1983). Essentially, PSM ensures that the group that participated in POL and the comparison group are as similar as possible with respect to observable characteristics. Thus, any observed differences in outcomes could then be more confidently attributed to participation in POL.
27
For example, previous research has found that students who have higher academic achievement to begin with tend to be more involved in the arts. Thus, including a measure of achievement (baseline test scores) as a covariate will help address this potential source of bias.
28
Other matching methods include caliper and radius matching, stratification/interval matching, and kernel matching.
Later models will also use school fixed effects to account for the clustering of students within schools and its potential influence on the outcomes of interest; we will present results for both sets of models.
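As a brief illustration of the fixed‐effects specification (again with placeholder names), the matched analysis file could be re‐estimated with school indicators and school‐clustered standard errors:

```python
import pandas as pd
import statsmodels.formula.api as smf

matched = pd.read_csv("matched_sample.csv")  # e.g., output of the matching sketch above

# C(school_id) adds an indicator for each school, absorbing school-level differences;
# standard errors are clustered at the school level.
fe_model = smf.ols("ela_score ~ pol + C(school_id)", data=matched).fit(
    cov_type="cluster", cov_kwds={"groups": matched["school_id"]}
)
print(fe_model.params["pol"])  # POL coefficient net of school fixed effects
```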
Missing Administrative Data
Administrative data are typically available for the vast majority of students since schools have to
collect these data routinely to meet federal and state accountability requirements. Nevertheless,
as discussed previously with the survey data, we will assess the patterns of missing data and
determine whether it is necessary to correct for missing data using other methods.
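A small sketch of that assessment, assuming the administrative extract has been loaded into a data frame with a POL participation indicator (placeholder names again):

```python
import pandas as pd

admin = pd.read_csv("student_admin_data.csv")

# Share of missing values per field, overall and by POL participation.
print(admin.isna().mean().sort_values(ascending=False))
print(admin.drop(columns="pol").isna().groupby(admin["pol"]).mean())
```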
Qualitative Data
Data analysis in qualitative research is often most effective when it occurs as an iterative process
of collecting data, reflecting on emerging themes, refining hypotheses, and then starting another
round of the process (Kleinman, Copp, and Henderson, 1997). Accordingly, we have designed the
study so that data analysis begins during the interview visits, continues immediately after the
focus groups, and concludes as the team synthesizes results with other data collection activities
as part of the process of writing the final report.
Prior to conducting data analysis, SPR will send audio recordings of interviews and focus groups
to a transcription firm, and each transcript then will be de‐identified and assigned a unique
research ID to ensure respondents’ privacy. SPR will then analyze these transcripts, along with
site visitors’ observation notes, using qualitative data analysis software (NVivo). Using software
to analyze this type of information supports effective data management because it allows the
gathering of all qualitative data—in this case, observation notes, interview transcripts, and focus
group transcripts—into a single database, and enables the research team to adopt a transparent
and systematic approach to addressing each research question of interest. We will process the
data in the three key stages described below.
Classification. SPR will classify each interview transcript according to key characteristics of the individual interviewed (such as grade, gender, race/ethnicity, and eligibility for free and reduced lunch), the school she or he attends, and the state in which he or she resides (for details of which we will look primarily to the interview with the state arts agency administrators).
Focus group transcripts will also be classified based on school and state characteristics. These
metadata will allow SPR to query the database and filter searches by a participant, school, or
state characteristic to assess differences and trends. The metadata will also enable the study
team to keep the NEA apprised about data collection.
Coding. The analysis team will read all transcripts and code passages according to a hierarchical
coding scheme that maps to key research questions and mirrors the structure and themes of the
interview protocols. We will develop an initial set of codes and add codes as necessary based on
new themes identified during data collection debriefing meetings. One of the key benefits of
using software to code the data is that it allows SPR to submit the coding scheme to the NEA for
review and comment to ensure that the codes (1) align with the NEA’s priorities for the final
report and (2) link the research questions and analysis in a way that supports the NEA’s key
interests. Each analysis team member will be assigned a limited set of themes to code in the
data, ensuring that he or she is able to develop expertise in recognizing specific themes. In order to ensure inter‐rater reliability, we will have each member of the analysis team test‐code one to two interview or focus group transcripts and compare the results, discussing discrepancies in coding and collectively deciding how to code any contentious items.
Analysis. Once all transcripts have been coded, the team will use the software’s querying
features to explore the relationship between respondent and school or state characteristics, and
perceptions of and experiences with POL. The team will conduct this analysis by using the codes
identified during stakeholder interviews to query and filter for experiences with POL. Each team
member will then prepare an internal analysis memo summarizing his or her findings, and these
memos will form the basis for drafts of the final report.
Deliverables and Timeline
In this section we review key deliverables, the institutional review board (IRB) process, and the
study timeline.
There are five key deliverables associated with the study.
Key Deliverable #1: Study Framing Document. Completed March 31st, 2017. This report included
the components listed below.
1. Summaries of key findings from the contractor’s rapid scan of each of the following:
previous research and evaluation publications related to POL; a relevant sampling of
administrative and programmatic documents (e.g., grantees’ final descriptive reports,
student and teacher testimonials, interview transcripts, etc.) from POL grants to states
made during the five‐year period from fiscal year (FY) 2011 through FY 2015; and
published or 'gray' literature about evaluations of similar youth arts engagement projects.
2. Revised POL logic model and supporting narrative, including the program’s inputs,
activities, outputs, and short‐ and intermediate‐level outcomes.
3. Recommendations for indicators for each proposed outcome, and the data sources,
metrics, and/or methodologies that the contractor could use to assess these indicators.
Key Deliverable #2: POL Evaluation Plan. To be submitted October 27, 2017. The evaluation plan
detailed in the document includes:
1. An overview of research questions and POL program;
2. A description of the evaluation design;
3. A review of relevant literature;
4. Identification of data sources, data collection methods and instruments;
5. An analytical plan;
6. A sampling strategy;
7. An overall study timeline;
8. An approach to the protection of human subjects/consent strategy (as appropriate);
9. A communications and reporting plan.
Key Deliverable #3: Draft Supporting Statements for Paperwork Reduction Act Information
Collection Request. Prepare draft supporting statements to support the NEA’s information
collection request (ICR) package(s) under the Paperwork Reduction Act. SPR shall provide the
following documents to the NEA, which will then submit these documents to OMB.
1. 60 Day Federal Notice (submitted to NEA 3/31/17; posted in Federal Register 4/6/17)
2. 30 Day Federal Notice (TBD)
3. Information Clearance Package (Final Clearance Package estimated to be submitted to
NEA 12/15/17)
- OMB 83‐I form
- Supporting statement A
- Supporting statement B
- Survey instruments
- Interview protocols (student, teacher and administrator protocols)
- Recruitment material
- Participant assent/consent forms (student, teacher, and administrator)
- Administrative data release request
- IRB approval letter
Although not listed as a key deliverable, the research team will also need to submit the study design and instruments to an institutional review board (IRB): the team will be working with schools, takes seriously the ethical and legal obligations associated with human subject data collection, and references the IRB approval letter above. A comprehensive IRB application will be submitted to Solutions IRB, an established, independent, and fully accredited review board that provides ethical review of quantitative and qualitative research and is experienced in reviewing social, behavioral, and education‐focused research. The IRB application package will consist of a detailed description of all research activities, including sampling, recruitment, risks and benefits of the research, consent procedures, secure data storage, and protections for participant confidentiality. In
addition, supplemental documents will be submitted, including: interview guides, survey
questions, recruitment materials, and consent/assent forms. Following the submission of the
complete IRB package, the application will be reviewed by Solutions IRB. If necessary, SPR will
address any outstanding issues, questions, or concerns in a revised submission. No research
activities that involve human subjects will be conducted prior to receiving IRB approval. Any
modifications to the design of the research study or study materials will be submitted to the IRB
and will not be implemented until modifications are approved.
Key Deliverable #4: POL Evaluation Report. To be submitted November 16, 2019. The POL
evaluation report will not exceed 80 pages in length (inclusive of appendices). The report shall
include:
1. Executive summary;
2. Introduction and background;
3. Evaluation design and approach, including research questions, description of the
evaluation design and methods, with copies of data collection instruments presented in
an appendix;
4. Study findings, presented by research question;
5. Proposed revisions to the logic model and/or the measurement model from Key
Deliverable #1;
6. Conclusions and recommendations for future POL guidelines development, grantee
reporting requirements, and performance metrics.
Key Deliverable #5: Supplemental Products. To be submitted December 19, 2019. The
supplemental products may include the following:
1. Graphic fact sheet(s) that capture the study’s key findings in 1‐2 pages;
2. Set of PowerPoint slides that summarize the study and its findings;
3. Selected quotes from a highly visible educator and/or poet regarding POL;
4. Interview transcripts and other raw data (all documents must be de‐identified prior to
submission to the program partners, as the program partners reserve the right to make
the evaluation report and data resulting from this study publicly available).
Exhibit 14 shows the schedule of tasks and deliverables for the project.
Next Steps for the Evaluation
Under the new timeline for the project, data collection will begin in the Fall of 2018. In order to
be prepared for that data collection, the research team will undertake the following steps:
Convene the TRG in July 2017 to review the evaluation plan;
Revise the evaluation plan according to NEA, Poetry Foundation, and TRG comments in
July‐October 2017;
Conduct cognitive testing of the instruments in October 2017;
Submit the IRB package in November 2017. The IRB that the research team works with
has only a few weeks’ turnaround time once the package is submitted, compared to the
much longer turnaround time for OMB PRA approval, and the bulk of the content to be
included in the package will be developed with the evaluation plan and instrument
development;
Submit the PRA package to the NEA in December 2017 – February 2018;
Recruit schools in May to July 2018;
At the time of recruitment and right before data collection is to start in September 2018,
the research team will work with the schools that agree to participate to schedule site
visits, plan student record data transfer, and plan for administering the survey.
Exhibit 14. Schedule of Tasks and Deliverables
Note: Cognitive testing, technical review group, IRB, recruiting schools, and data collection are all projections subject to circumstances
Appendix A: Study Framing Document
Poetry Out Loud Evaluation Study
Framing Document
May 5, 2017
Prepared for:
National Endowment for the Arts
Prepared by:
Social Policy Research Associates
Contract #: C17‐05
Contents
Introduction to the Study Framing Document .................................................................................. 1
Poetry Out Loud (POL) Program Overview ........................................................................................ 1
Findings from the Previous Evaluation .............................................................................................. 1
Poetry Out Loud Logic Model ............................................................................................................ 3
Introduction to the POL Evaluation ................................................................................................... 8
Next Steps ........................................................................................................................................ 15
Introduction to the Study Framing Document
The National Endowment for the Arts (NEA) commissioned a multi‐year study to better
understand student‐level outcomes associated with the Poetry Out Loud program when it is implemented under optimal conditions. In December 2016, the NEA awarded Social
Policy Research Associates (SPR) a contract to conduct the study over a 29‐month period –
December 2016 through April 2019. The Study Framing Document is intended to ground the
evaluation by presenting the program’s conceptual framework and demonstrating the
relationship between the research questions and the planned data collection activities. The
document that follows opens with an introduction to POL, including an overview of findings from
the previous evaluation; it next presents the logic model; it then presents a preliminary
evaluation matrix, which maps outcomes and research questions to data collection activities and
indicators. Finally, we offer next steps for the evaluation.
Poetry Out Loud (POL) Program Overview
A national arts education program supported by the National Endowment for the Arts, Poetry
Foundation, and state and jurisdictional arts agencies, Poetry Out Loud encourages the nation’s
youth to learn about great poetry through memorization and recitation, helping students master
public speaking skills, build self‐confidence, and learn about literary history.1
The primary goals of the program are to strengthen students’ academic skills and performance,
to support students’ social and emotional well‐being, to help grow teachers’ knowledge of and
confidence in teaching poetry, and to increase student and overall community awareness and
appreciation of poetry and the arts. These goals map closely to the “impacts” envisioned in the
program’s logic model, which we discuss in more detail in a subsequent section of the
document.
The current evaluation will be the first since 2008 and will focus on assessing student outcomes
in poetry appreciation and engagement, social and emotional development, and academics using
a rigorous quasi‐experimental design combined with qualitative data collection and analysis of
program design and implementation in 10 sample sites.
Now in its eleventh year as a nationwide program (after a pilot year in 2005 during which the
program was launched in two cities), POL has grown to serve more than 3 million students and
50,000 teachers from 10,000 schools in every state, Washington, DC, the US Virgin Islands, and
Puerto Rico.2
Findings from the Previous Evaluation
In the interest of identifying continuities with the previous evaluation of POL as well as
identifying gaps in that research that can be informed by the new evaluation, we reviewed
1
Poetry Out Loud logic model
2
http://www.poetryoutloud.org/about
findings from the previous evaluation of POL.3 The previous evaluation of POL (Rockman et al,
2008) was informed by three years of data drawn from school coordinator surveys, teacher
surveys, state champion surveys, and teacher focus groups.
The annual reports produced by this prior evaluation focused on the reach, support, and
engagement with POL by students and participating schools, providing compelling evidence that the program had continued to grow (over the course of the three years) and reach
increasingly diverse students, rural schools, and schools with and without existing strong arts
programs. Additionally, the evaluation found that POL helped to facilitate both the engagement
and retention of teachers by providing them resources to bolster existing curricula. With respect
to student‐level outcomes, the evaluation focused largely on poetry appreciation and
engagement. The feedback solicited from students was a survey of students who were state
champions in POL competitions.
Results from the student state champion survey found that:
Every state champion/national finalist (100%) said they enjoyed studying and memorizing
the poems;
All the finalists (100%) said the experience got them excited about poetry recitation;
Every finalist (100%) said that they had found new favorite poems and poets;
90.9% of finalists said they enjoyed participating in the contest;
90.9% said poetry was more important to them after participation in the program;
86% said that listening to others recite poetry helped develop their recitation skills;
81.8% said participating in POL increased their confidence or made them feel more at
ease in front of an audience.
Teachers also echoed these points in their survey responses, suggesting that the finalist findings
might also extend to the general participating student population:
91% of the teachers completing the survey agreed (56% somewhat, 34% strongly) that
their students enjoyed studying and memorizing poems;
86% agreed their students will be more likely to read poetry for pleasure after
participating in the program.
These findings map well to the “awareness and appreciation of poetry and arts programming”
component of the logic model that we will describe in the next section. For example, these
findings suggest that POL students experienced increased exposure to poetry and that they
expressed appreciation for poetry. Some of the findings also pointed to the anticipated socio‐emotional health outcomes that the NEA hopes students might have accrued by virtue of participating in
the program. For example, the findings suggest that students’ confidence increased—one of the
measures of socio‐emotional health. However, there are numerous outcomes and impacts that
the more formative‐focused previous evaluation did not have a chance to explore. It is in those
areas―including students’ academic skills and performance, and a more in‐depth study of
students' social and emotional health―that the current evaluation can inform the NEA and the field at large. In addition, even for areas well‐explored by the previous evaluation―specifically
3
As planned by the project team, a more comprehensive literature review will be included in the Evaluation Plan.
student awareness and appreciation of poetry and arts programming―the current evaluation's
quasi‐experimental design will allow us to provide more rigorous analysis of those and the other
outcome areas. Having a comparison group of students not participating in POL will allow us to
say with greater confidence whether student outcomes are correlated with POL participation.
Further, the current study intends to collect data from all types of student participants, not just
competition winners (who represent a very small subset of the participating student population).
In the section that follows, we introduce the logic model as a foundational document for both
the program at large and the current evaluation.
Poetry Out Loud Logic Model
Logic models serve as visual representations of program resources, strategies, desired outcomes
and the relationships between them. They are useful tools that program staff and their
evaluators can use to ensure that programs as implemented have fidelity to the intended
program models and to track progress against specified goals. In February, SPR worked closely
with members of the NEA POL team to update the POL logic model which, at the time, reflected
the strategies and goals of POL when it was at a more nascent stage in its program development.
The revised logic model, which more accurately reflects the program’s current strategies and
goals, is included on page 8 in Exhibit 1.
The Poetry Foundation, co‐creator of Poetry Out Loud, seeks to elevate the visibility and
influence of poetry in our culture, and to “discover and celebrate the best poetry and place it
before the largest possible audience.” The POL logic model articulates how the program works
to meet these goals as well as other, more youth‐development centered goals focused on
academic outcomes and socio‐emotional well‐being. Stretched across the top of the logic model
is POL's mission, which is to encourage the nation's youth to learn about great poetry through
memorization and recitation, helping students master public speaking skills, build self‐confidence,
and learn about literary history. The mission provides contextual grounding for the elements of
the logic model listed beneath it. The core elements of the logic model include the inputs
provided by POL to support effective programming, contextual factors with the potential to
affect POL implementation and outcomes, key strategies in program implementation, the
anticipated outcomes that POL program partners hypothesize will result, and the key areas of
impact that program staff ultimately hope to achieve through POL. These elements are described
in detail in the following sections.
Inputs
Implementing POL on a national scale is a complex endeavor. The POL program and its
participants benefit from key partnerships with organizations and agencies, each of which bring
strong assets (e.g., networks, knowledge, expertise) to support the program and its visibility,
sustainability, and growth. Specifically:
The National Endowment for the Arts provides funding to support the effective
implementation of POL at multiple levels. In addition to funding the national POL
competition, NEA provides funds that support state arts agencies to implement the
program within their states and to provide resources for state‐ and local‐level partners,
teachers, and students.
The Poetry Foundation provides easy access to free educational materials for teachers
and students. This includes a comprehensive curriculum package, teacher guides, and a
robust online anthology of poetry that the Poetry Foundation maintains and updates
regularly.
State Arts Agencies oversee the implementation of POL in their states. To that end, their
primary goals are to publicize the program, recruit teachers and schools to participate,
and develop and conduct the state‐level finals program. Some state arts agencies also
supplement NEA funds with additional state funds and/or private donations, or establish
organizational partnerships that enable special trainings and workshops for participating
students and teachers or add to the prize amounts.
Contextual Factors
Contextual factors are presented in line with inputs as factors that can affect POL
implementation and outcomes. These include school, teacher, and student prior experience with
POL; teacher’s years of experience; socio‐economic factors such as poverty rates among the
student body that can impact learning; and the social and cultural context of the school. There
are also differences across states in terms of deployment of POL funds, which can also affect
outcomes. States spend their POL funds differently, depending on priorities and implementation
capacity. Similarly, the extent to which states can support or augment the POL program with
supplemental state and/or private resources differs by state and can also influence outcomes.
Strategies
The inputs and strategies listed in the logic model are intrinsically tied together. That is, the
Inputs column highlights the role of core project partners, each of which provide important
resources to support participants and ensure quality programming, while the Strategies column
articulates how these resources are operationalized. Core strategies include:
Providing Easy Access to a Robust Online Poetry Anthology. POL aims to provide students
and their communities with high quality poetry from authors from different time periods
and diverse backgrounds. POL provides participating schools with access to this literature
through its online poetry anthology. This anthology, which POL maintains and regularly
updates, provides access to over 900 poems as well as other information to support
poetry learning, such as a glossary of terms and poet biographies.
Providing Teachers with a Comprehensive Poetry Curriculum. Recognizing that teaching
poetry may feel challenging for some teachers, POL also developed a comprehensive
poetry curriculum, complete with classroom materials, teaching guides, and all the
information teachers would need to implement POL in their classrooms. POL offers this
curriculum to all teachers, free of charge.
Sponsoring the National POL Competition and Leading Its Publicity Efforts. Each year, POL
sponsors a national competition which celebrates student efforts and challenges
students to share their love of poetry through creative expression. This competition
ultimately results in the public performance of thousands of poetry recitations each year,
raising the visibility of poetry, the POL program, and the agencies that support this work.
Outcomes and Impacts
POL partners have identified four key areas of impact, each of which is connected to a set of
anticipated outcomes associated with participation in POL and that, as they are measured, serve
as indicators of progress. These include:
Students’ Academic Skills and Performance are Strengthened. The anticipated outcomes
associated with this impact area reflect the kinds of academic outcomes that are possible
through increased exposure to and dynamic engagement with poetry. For example,
exposure to poetry from artists from different generations who represent diverse
communities will increase student knowledge of poets and poetic styles. Furthermore, their analytical capacity will likely grow as they wrestle with complex texts to uncover meaning, as will their language arts proficiency as they engage in meaning making with different texts and in different ways. The POL program's
emphasis on the performative aspect of poetry engagement expands literacy practices
beyond ink and paper, enabling students to grow in their oral language development and
creative narrative expression. Finally, POL program partners hope that participation in
this dynamic program will result in increased engagement in learning in general.
Students’ Social and Emotional Health Improves. Socio‐emotional health is a complex and
broad arena, within which a multitude of variables can influence outcomes. The
anticipated outcomes associated with this impact area focus on factors that can
contribute to students’ socio‐emotional health and well‐being, such as student confidence
which may come as a result of participation in the performative aspects of the program
and/or through the success they may feel as they begin to uncover meaning within
poems that initially may have felt obscure. Rising confidence or the empowerment
students may feel through their learning may also have some influence over a student’s
sense of self or identity, as may a student’s choice of poetry. Given the large volume of
poems to choose from in the POL anthology, students have a multitude of options to
wrestle with works that resonate with them for any number of reasons, including the
ways in which poetry may speak to key aspects of their identity. Finally, POL partners
hypothesize that the dynamic and performative aspects of the POL program, the
competition structure, and the confidence that may grow through program participation
may result in increased engagement in the larger school community.
Teacher Knowledge of and Confidence in Teaching Poetry Increases. Part of the challenge
of learning poetry may stem from the fact that teaching poetry may feel challenging to
even the most seasoned English language arts instructors. Thus, this impact area includes
outcomes that support the building of teacher confidence and is directly tied to the use
of POL’s curricular support. For example, by being introduced to the POL program and
given access to POL’s teaching supports, more teachers will be exposed to the kinds of
arts education programming that is focused specifically on supporting them and their
work and strengthening their ability to effectively teach poetry. It is also assumed that by
immersing themselves in the POL materials and having access to such a rich anthology of
poems, teachers’ knowledge and appreciation of poetry will increase, as well as their
enthusiasm for teaching poetry.
Awareness and Appreciation of Poetry and Arts Programming Increases. While youth are
the primary target for Poetry Out Loud, POL partners hope that as program participation
grows, so will the visibility of poetry and the POL program, resulting in a greater overall
awareness and appreciation of poetry and of arts programming. Outcomes related to
awareness are focused on exposure, with an assumption that awareness and familiarity
with poetry in general, as well as familiarity with specific poems will increase as more
students and their communities4 are exposed to poetry, arts programming, and the
agencies and organizations that make arts programming possible. Outcomes related to
increased appreciation include the expression of appreciation by students and their
communities as well as increased participation in arts programming.
The POL logic model provides a strong guiding framework to help stakeholders understand the
relationship between the POL inputs and strategies and the program’s desired outcomes and
impacts. It also provides a strong foundation for the development of SPR’s evaluation framework
and plan, which will provide a detailed description of how the evaluation will track progress
toward key outcome areas. In the next section, we present a preliminary evaluation matrix
designed to guide the evaluation.
4
For the purposes of this logic model, the definition of community includes students, their families and friends, their
school communities (teachers, administrators, etc.) and the broader community in their locales that support the
work of local schools.
Exhibit 1. Logic Model
Introduction to the POL Evaluation
The purpose of the evaluation of the Poetry Out Loud program is to understand student‐level
outcomes associated with the implementation of POL under optimally implemented conditions.
(As discussed, teacher and community outcomes are also of interest to the NEA, but are not the
primary focus for the current evaluation.) In consultation with the NEA and project partners, SPR
will select a sample of 10 POL‐participating schools to conduct quantitative and qualitative data
collection activities. The study is guided by a series of research questions focused on the
assessment of the program’s impact in three different domains: students’ academic
engagement and performance, poetry engagement and appreciation, and social‐emotional
development. This section introduces a preliminary evaluation matrix that maps the intended
outcomes of the program (obtained from the logic model) to the research questions guiding the
evaluation. It also includes constructs and indicators that help build specificity and operationalize
the domains we intend to measure. Finally, it links the indicators to the intended modes of data
collection.
We first describe the components proposed for inclusion in the matrix and then present specific
examples of how the evaluation matrix might be completed for each of the three domains that
are part of this study. We intend for these preliminary examples to help facilitate and frame
subsequent discussions around each of the domains as we hone the constructs and indicators
that will be ultimately included in the evaluation plan.
We introduce the evaluation matrix by outlining the anticipated outcomes of the program
included in the program’s logic model. Anticipated outcomes, in this case, refer to the expected
outcomes that participants will experience as a result of the implementation of the program
strategies. As the samples in later exhibits show, most of the anticipated outcomes included in
the program logic model map to the specific research questions as found in the RFP (see Exhibits
4, 5, and 6). The preliminary model of the evaluation matrix we envision includes three
additional fields, one identifying the constructs of interest that we intend to measure, another
one identifying indicators that will help measure those constructs, and lastly a field indicating the
intended data sources. Exhibit 2 shows these suggested components.
Exhibit 2. Evaluation Matrix Components
Anticipated Outcomes | Research Questions | Constructs | Indicators | Data Sources
We chose to add a field after the research questions identifying more precise constructs.
Constructs are ways of re‐framing research questions that point toward measurability. They are
a logical step between research questions and the field we will discuss next, indicators.
Indicators are observable measures, and they occupy the position following constructs in the
evaluation matrix. Indicators reflect definite quantitative or qualitative means to measure the
constructs of interest. It might take multiple indicators to measure a given construct. Deciding
which indicators to include is an important exercise that needs to be guided by previous research
and is typically done in consultation with multiple stakeholders. For this reason, we expect to
refine these indicators as we continue with the literature review and engage with the technical
review group. The objective for the evaluation plan is to select the indicators that will inform the
development of the data collection instruments and that will provide metrics that reliably assess
things like knowledge, attitudes, and behaviors of interest.
Last in the matrix, we include a field that outlines the data sources we intend to use to gather the
indicators of interest. As mentioned earlier, SPR will use five sources of data for this study (three
of which are covered in bullet #3 below):
Student Administrative Records. SPR will coordinate the data extract and transfer process
with school districts to obtain student‐level data in the schools selected for the study.
The main objective in this activity is to obtain student records for all students in each of
the schools—those who participate in POL and those who do not. These data are the
basis for the quasi‐experimental component of the study and will include the following
fields:
- Unique identifiers for all students (with student proxy id generated by the school
district);
- Participation in POL identifier for current and prior academic years;
- Student‐level demographic information (e.g., gender, race/ethnicity, free and
reduced lunch status, special education, etc.);
- Standardized assessment data in ELA for current and prior academic year, and
GPA and ELA end‐of‐course grades
- Student‐level records of attendance, suspensions, and expulsions.
Student Pre‐ and Post‐Surveys. SPR will design pre‐ and post‐student survey instruments
that measure changes in the academic engagement, poetry appreciation and
engagement, a range of attitudes toward poetry and engagement in school, and social
and emotional development domains. The pre‐survey data will provide baseline
measurements of students’ knowledge, attitudes, and behaviors in the domains of
interest. The plan is to include the same measurements in both the pre‐survey and the post‐survey, and then to quantify changes in students' responses. To meet the
conditions for analyses, pre‐surveys need to be administered before a student has
experienced direct POL programming and the posttest needs to be administered directly
after the end of POL activities.
Teacher and Administrator Interviews, and Student Interviews and Focus Groups. These
data include on‐site, in‐person interviews with administrators and teachers, and
interviews and focus groups with students to document the “story” of POL in the
optimally implementing schools selected for the study, from the initiation of participation
through the reception of the program by students. During day‐long site visits, SPR will
interview teachers, students, and administrators, visit classrooms participating in POL
and, if possible, conduct focus groups with students. Interview protocols will be designed
such that they yield data that inform the research questions in the domains of interest.
In the remainder of the document we present three examples, one for each of the three
domains SPR will examine in the evaluation of POL: academic engagement and performance,
poetry appreciation and engagement, and social and emotional development. Each of the examples (Exhibits 4, 5, and 6) maps sample outcomes from the logic model to the research
questions, suggest constructs and indicators, and align them with the data sources. SPR will add
to and refine the evaluation matrix in preparation of the evaluation plan, in consultation with
members of the NEA and the technical review group. We will use the literature review to further
inform the constructs and indicators we select.
Understanding the Matrices
Note that the three research domains that we have developed matrices for closely overlap with
but do not exactly align with the four impact areas shown in the logic model. In part, this is due
to the fact that the NEA requested a study that is focused more on student outcomes than on teacher and community outcomes; thus, our research domains focus primarily on students. Outcomes
related to teachers and communities are not a focus of this study, though we will lift up findings
related to both in the likely event that they will emerge as a natural and related part of our
inquiry process. Slight alignment shifts are also due to the fact that the logic model has been
revised since the study was launched; assisting the NEA and program partners with the revision
was part of SPR’s charge in conducting the study. The first two matrices are focused on student
academic engagement and performance and student socio‐emotional development,
respectively. These represent domains that map almost exactly to their respective impact areas
on the logic model. The third research domain—student poetry appreciation and engagement—overlaps with its impact area, but with an important difference in that it focuses on outcomes
associated with students, whereas the logic model impact area encompasses community
members as well. Exhibit 3 shows the relationship of research domains to logic model impact
areas.
Exhibit 3. Research Domains and Logic Model Impact Areas
Research Domain | Logic Model Impact Area
Student Academic Engagement and Performance | Students' Academic Skills and Performance are Strengthened
Student Social and Emotional Development | Students' Social and Emotional Health Increases
Student Poetry Appreciation and Engagement | Awareness and Appreciation of Poetry and Arts Programming Increases33
(No corresponding research domain) | Teacher Knowledge of and Confidence in Teaching Poetry Increases
33
Like the research domain, this impact area includes students; however, unlike the research domain, it also
includes outcomes associated with community members.
What follows are the three matrices, each focused on a research domain. The first column of
each matrix lists study research questions that fall within the research domain. To the right of
the research questions column is a set of columns that list the anticipated outcomes associated
with this impact area/research domain. Outcomes directly tied to the research question are
marked with a check mark (✓). Constructs and indicators are described above, and columns for each are located to the right of the outcomes column. As noted, SPR will refine and add to these for inclusion, in more final form, in the evaluation plan. The final column, data sources,
lists the appropriate sources of data to measure the indicators, from among the five sources that
we will be collecting (student records, student surveys, student interviews and focus groups,
teacher interviews, and administrator interviews).
Exhibit 4. Academic Engagement and Performance
Research Questions:
- Does student participation in POL correlate with increased academic engagement in English classes and/or in school more generally?
- Are POL students more likely to be comfortable using metaphor, simile, or a wider vocabulary in writing or in speaking after the program?
- Does POL have a positive impact on students' reading comprehension and/or analytical skills (particularly regarding poetry)?
Outcomes (from the logic model): Learning Engagement; ELA Proficiency; Literary History; Analytical Capacity
Constructs: Academic engagement in English classes; academic engagement in school; comfort with different poetry forms and devices; vocabulary development; academic achievement in English classes and in school; reading comprehension; analytical skills (particularly regarding poetry)
Indicators: Relevant results from interviews and surveys; vocabulary test score gains; # absences; # suspensions; standardized ELA scale scores; standardized ELA proficiency scores; student GPA; scale scores in standardized reading comprehension tests; standardized formative student assessment scores
Data Sources: Student records; student surveys; student interviews; teacher interviews; administrator interviews
Exhibit 5. Social and Emotional Development
Research Questions:
- Do students experience increased self-confidence in their public speaking abilities, social skills, intellectual abilities, or in general after participating in POL?
- Do students feel more secure, empowered, and/or articulate in expressing themselves after participating in POL?
- Are students more likely to engage in civic activities during or after participation in POL?
- Are students more likely to engage in extracurricular activities during or after participation in POL?
Outcomes (from the logic model): Confidence; Sense of Self; Community Engagement
Constructs: Self-confidence; prosocial attitudes and behavior; empowerment; civic participation; in- and out-of-school engagement
Indicators: Scaled survey scores related to confidence in public speaking; survey scores related to frequency of peer engagement; scaled scores related to comfort with self-expression; survey scores related to volunteerism hours; survey scores related to participation in community activities; survey scores related to involvement in student leadership; survey scores related to participation in extracurricular activities, school clubs, and/or after school programs; relevant results from interviews
Data Sources: Student records; student surveys; student interviews; teacher interviews; administrator interviews
Exhibit 6. Poetry Appreciation and Engagement
Research Questions:
- Does participating in POL correlate with students' increasing their likelihood of reading or writing poetry for pleasure?
- Does a teacher or a school's participation in POL correlate with greater incorporation of poetry in classroom/school instruction?
- Does POL promote the sharing of poems among students and if so, by what means?
- Do students talk about poetry or POL on social media networks after the participation versus before?
- Does POL participation correlate with any attitudinal changes toward poetry, academics, public speaking/performing, or post high school aspirations?
Outcomes (from the logic model): Poetry exposure; Arts appreciation; Arts programming; Exposure to SAA/NEA/PF
Constructs: Attitudes toward reading poetry; attitudes toward writing poetry; sharing poetry with peers; sharing poetry via social media (Facebook, Instagram); increased poetry content in curriculum; attitudes toward poetry; attitudes toward public speaking; beliefs about post high school aspirations
Indicators: Frequency scale of poetry exchanges; frequency scale of poetry exchanges via social media type; frequency scale of poetry inclusion in curriculum; scale of attitude toward poetry; scale of comfort with public speaking; attitude about finishing HS; % planning to go to college; relevant results from interviews and surveys
Data Sources: Student records; student surveys; student interviews; teacher interviews; administrator interviews
Next Steps
The next steps for the evaluation are to solicit feedback from the NEA and program partners on
the Study Framing Document and revise it accordingly. Concurrently and following on our work
on the Study Framing Document, we will begin developing the evaluation plan that will serve as
the detailed blueprint for our evaluation design. The plan will include the following components:
(1) research questions, (2) evaluation design description, (3) detailed description of methods and
data collection instruments and sources, (4) analysis plan, (5) sampling strategy, (6) timeline, (7)
human subjects protection approach (i.e., IRB approval and plan for adherence to privacy laws),
(8) communications plan, and (9) reporting plan. The draft evaluation plan is scheduled for
submission on April 28, 2017, with a final evaluation plan due on May 19, 2017. The research
team will also begin preparing the OMB/PRA clearance package, including Federal Register
Notice, OMB 83‐I, supporting statements, and supplemental materials. The clearance package is
contractually scheduled for submission on August 18, 2017, but, in order to increase our chances
of receiving approval to collect data in the time frame we wish to collect it, we plan to submit
the package by June 30, 2017.34
34 Because of contract‐related delays, the evaluation plan was submitted in December 2017 and the Paperwork Reduction Act clearance package was submitted in April 2018.
This page was intentionally left blank.
Appendix B: Bibliography
This page was intentionally left blank.
Athanases, Steven Z. "Performing the Drama of the Poem: Workshop, Rehearsal, and
Reflection." English Journal 95, no. 1 (2005): 88‐96.
Beatty, Michael J. "Communication Apprehension as a Determinant of Avoidance, Withdrawal
and Performance Anxiety." Communication Quarterly 35, no. 2 (1987): 202‐217.
Beatty, Michael J., and Matthew H. Friedland. "Public Speaking State Anxiety as a Function of
Selected Situational and Predispositional Variables." Communication Education 39, no. 2
(1990): 142‐147.
Bonate, Peter L. Analysis of Pretest‐Posttest Designs. CRC Press, 2000.
Borsboom, Denny, Gideon J. Mellenbergh, and Jaap van Heerden. "The Theoretical Status of Latent Variables." Psychological Review 110, no. 2 (2003): 203‐219.
Bransford, John D., James S. Catterall, Richard J. Deasy, Paul D. Goren, Ann E. Harman, Doug
Herbert, Felice J. Levine, and Steve Seidel. The Arts and Education: New Opportunities for
Research. Washington, DC: Arts Education Partnership, 2004.
Bresler, Liora, and Margaret Macintyre Latta. "A Unified Poet Alliance: The Personal and Social
Outcomes of Youth Spoken Word Poetry Programming." International Journal of
Education and the Arts 11, no. 2 (2010): 1‐25.
Carifio, James, and Rocco Perla. "Resolving the 50‐Year Debate Around Using and Misusing Likert Scales." Medical Education 42, no. 12 (2008): 1150‐1152.
Catterall, James S. Doing Well and Doing Good by Doing Art: A 12‐year National Study of
Education in the Visual and Performing Arts. Los Angeles: I‐Group Books, 2009.
Catterall, James S., Susan A. Dumais and Gillian Hampden‐Thompson. "The Arts and Achievement
in At‐Risk Youth: Findings from Four Longitudinal Studies. Research Report# 55.” National
Endowment for the Arts, 2012.
Catterall, James, Richard Chapleau, and John Iwanaga. "Involvement in the Arts and Human
Development: General Involvement and Intensive Involvement in Music and Theater
Arts." Champions of Change: The Impact of the Arts on Learning, (1999): 1‐18.
Century, Jeanne, and Amy Cassata. "Implementation Research: Finding Common Ground on
What, How, Why, Where, and Who." Review of Research in Education 40, no. 1 (2016):
169‐215.
Clark, Lee Anna, and David Watson. "Constructing Validity: Basic Issues in Objective Scale Development." Psychological Assessment 7, no. 3 (1995): 309‐319.
Crawford Barniskis, Shannon. "Graffiti, Poetry, Dance: How Public Library Art Programs Affect
Teens Part 2: The Research Study and Its Practical Implications." Journal of Research on
Libraries and Young Adults 2, no. 3 (2012).
Crozer, Karen J. "American Poetry & A Paradigm of Play: Transforming Literature with Young
Children." PhD diss., The Claremont Graduate University, Claremont, 2014.
Darling‐Hammond, Linda. "Teachers and Teaching: Testing Policy Hypotheses from a National
Commission Report." Educational Researcher 27, no. 1 (1998): 5‐15.
Daykin, Norma, Judy Orme, David Evans, Debra Salmon, Malcolm McEachran, and Sarah Brain.
"The Impact of Participation in Performing Arts on Adolescent Health and Behavior: A
Systematic Review of the Literature." Journal of health psychology 13, no. 2 (2008): 251‐
264.
DeVellis, Robert F. Scale Development: Theory and Applications. Thousand Oaks, CA: Sage, 2003.
Durlak, Joseph A. Handbook of Social and Emotional Learning: Research and Practice. New York:
Guilford Publications, 2015.
Dwyer, M. Christine. "Reinvesting in Arts Education: Winning America's Future Through Creative
Schools." President's Committee on the Arts and the Humanities, (2011).
Eccles, Jacquelynne S., Bonnie L. Barber, Margaret Stone, and James Hunt. "Extracurricular
Activities and Adolescent Development." Journal of Social Issues 59, no. 4 (2003): 865‐
889.
Elpus, Kenneth. "Arts Education and Positive Youth Development: Cognitive, Behavioral, and
Social Outcomes of Adolescents Who Study the Arts." National Endowment for the Arts,
(2013).
ESSA and Arts Education: 7 Basics to Know. Washington D.C.: Americans for the Arts, 2015.
Fisher, Douglas, and Nancy Frey. "Implementing a Schoolwide Literacy Framework: Improving
Achievement in an Urban Elementary School." The Reading Teacher 61, no. 1 (2007): 32‐
43.
Fisher, Maisha. "Open Mics and Open Minds: Spoken Word Poetry in African Diaspora
Participatory Literacy Communities." Harvard Educational Review 73, no. 3 (2003): 362‐
389.
Fisher, Maisha T. Writing in Rhythm: Spoken Word Poetry in Urban Classrooms. New York:
Teachers College Press, 2007.
Fisher, Maisha T. "From the Coffee House to the School House: The Promise and Potential of
Spoken Word Poetry in School Contexts." English Education 37, no. 2 (2005): 115‐131.
Flanagan, Constance A., Amy K. Syvertsen, and Michael D. Stout. "Civic Measurement Models:
Tapping Adolescents' Civic Engagement. CIRCLE Working Paper 55." Center for
Information and Research on Civic Learning and Engagement (CIRCLE), (2007).
Fredricks, Jennifer, Wendy McColskey, Jane Meli, Joy Mordica, Bianca Montrosse, and Kathleen
Mooney. "Measuring Student Engagement in Upper Elementary through High School: A
Description of 21 Instruments. Issues & Answers. REL 2011‐No. 098." Regional
Educational Laboratory Southeast (2011).
Goldstein, Thalia R. "Correlations Among Social‐Cognitive Skills in Adolescents Involved in Acting
or Arts Classes." Mind, Brain, and Education 5, no. 2 (2011): 97‐103.
Hamedani, M. G., and Linda Darling‐Hammond. "Social Emotional Learning in High School: How
Three Urban High Schools Engage, Educate, and Empower Youth." Stanford Center for
Opportunity Policy in Education. March (2015).
Hanson, Thomas L., and Jin‐Ok Kim. "Measuring Resilience and Youth Development: The
Psychometric Properties of the Healthy Kids Survey. Issues & Answers. REL 2007‐No.
34." Regional Educational Laboratory West (2007).
Harding, David J., and Kristin S. Seefeldt. "Mixed Methods and Causal Analysis." In Handbook of
Causal Analysis for Social Research, pp. 91‐110. New York: Springer, 2013.
Harland, John, Kay Kinder, Pippa Lord, Alison Stott, Ian Schagen, Jo Haynes, Linda Cusworth,
Richard White, and Riana Paola. Arts Education in Secondary Schools: Effects and
Effectiveness. Massachusetts: National Foundation for Education Research, 2000.
Holbrook, Sara, and Michael Salinger. Outspoken!: How to Improve Writing and Speaking Skills
Through Poetry Performance. Portsmouth, NH: Heinemann Educational Books, 2006.
Hughes, Janette. "New Media, New Literacies and the Adolescent Learner." E‐Learning and
Digital Media 6, no. 3 (2009): 259‐271.
Inoa, Rafael, Gustave Weltsek, and Carmine Tabone. "A Study on the Relationship Between
Theater Arts and Student Literacy and Mathematics Achievement." Journal for Learning
Through the Arts 10, no. 1 (2014).
Ivey, Gay, and Karen Broaddus. "Just Plain Reading: A Survey of What Makes Students Want to
Read in Middle School Classrooms." Reading Research Quarterly 36, no. 4 (2001): 350‐
377.
Jerald, Craig D. "Benchmarking for Success: Ensuring US Students Receive a World‐Class
Education." National Governors Association, (2008).
Jocson, Korina M. Youth Poets: Empowering Literacies in and out of Schools. New York: Peter
Lang, 2008.
Jolliffe, Darrick, and David P. Farrington. "Development and Validation of the Basic Empathy
Scale." Journal of Adolescence 29, no. 4 (2006): 589‐611.
Joronen, Katja, Anne Konu, H. Sally Rankin, and Päivi Åstedt‐Kurki. "An Evaluation of a Drama
Program to Enhance Social Relationships and Anti‐Bullying at Elementary School: A
Controlled Study." Health Promotion International 27, no. 1 (2012): 5‐14.
Kim, Ah‐Jeong, and David Boyns. “Joining the Spectrum: An Interdisciplinary Inquiry into Theatre
as an Intervention for Autism Diagnosed Teens.” (working paper, 2015).
Kleinman, Sherryl, Martha A. Copp, and Karla A. Henderson. "Qualitatively Different: Teaching
Fieldwork to Graduate Students." Journal of Contemporary Ethnography 25, no. 4 (1997):
469‐499.
Koukis, Susan L. "At the Intersection of Poetry and a High School English Class: 9th Graders’
Participation in Poetry Reading Writing Workshop and The Relation to Social and
Academic Identities’ Development." PhD diss., Ohio State University, Ohio, 2010.
Lazzari, Marceline M., Kathryn A. Amundson, and Robert L. Jackson. “We Are More Than
Jailbirds: An Arts Program for Incarcerated Young Women." Affilia‐ Journal of Women and
Social Work 20, no. 2 (2005): 169‐185.
Martin, Andrew J., Marianne Mansour, Michael Anderson, Robyn Gibson, Gregory AD Liem, and
David Sudmalis. "The Role of Arts Participation in Students’ Academic and Nonacademic
Outcomes: A Longitudinal Study of School, Home, And Community Factors." Journal of
Educational Psychology 105, no. 3 (2013): 709.
McArdle, John J., Emilio Ferrer‐Caja, Fumiaki Hamagami, and Richard W. Woodcock.
"Comparative Longitudinal Structural Analyses of the Growth and Decline of Multiple
Intellectual Abilities Over the Life Span." Developmental Psychology 38, no. 1 (2002): 115.
McCroskey, James C., and Michael J. Beatty. "Communication Apprehension and Accumulated
Communication State Anxiety Experiences: A Research Note." (1984): 79‐84.
Morgan, Stephen L., and Christopher Winship. Counterfactuals and Causal Analysis: Methods and
Principles for Social Research. Cambridge: Cambridge University Press, (2007).
National Council of Teachers of English. Standards for the English Language Arts. Delaware:
International Reading Association, 1996.
National Research Council. Scientific Research in Education. Edited by Richard Shavelson and Lisa Towne. Washington, D.C.: National Academies Press, 2002.
Parsad, Basmat, and Maura Spiegelman. "Arts Education in Public Elementary and Secondary
Schools: 1999‐2000 and 2009‐10. NCES 2012‐014." National Center for Education
Statistics, 2012.
Podlozny, Ann. "Strengthening Verbal Skills Through the Use of Classroom Drama: A Clear
Link." Journal of Aesthetic Education 34, no. 3/4 (2000): 239‐275.
Rivkin, Steven G., Eric A. Hanushek, and John F. Kain. "Teachers, Schools, and Academic Achievement." Econometrica 73, no. 2 (2005): 417‐458.
Rosenbaum, Paul R., and Donald B. Rubin. "The Central Role of the Propensity Score in
Observational Studies for Causal Effects." Biometrika 70, no. 1 (1983): 41‐55.
Schwartz, Lisa K., Lisbeth Goble, Ned English, and Robert F. Bailey. "Poetry in America: Review of
the Findings." University of Chicago: Submitted to The Poetry Foundation (2006).
Thomas, M. Kathleen. "Music Education and Its Causal Impact on Student Engagement and
Success." (working paper, 2016).
Walsh‐Bowers, Richard, and Robert Basso. "Improving Early Adolescents' Peer Relations Through
Classroom Creative Drama: An Integrated Approach." Children & Schools 21, no. 1 (1999):
23‐32.
Vaughn, Kathryn, and Ellen Winner. "SAT Scores of Students Who Study the Arts: What We Can
and Cannot Conclude About the Association." Journal of Aesthetic Education 34, no. 3/4
(2000): 77‐89.
Winner, Ellen, and Lois Hetland. "Beyond the Soundbite: Arts Education and Academic
Outcomes." In Conference proceedings. J. Paul Getty Trust, publishers. 2001.
Winner, Ellen, and Monica Cooper. "Mute Those Claims: No Evidence (Yet) For a Causal Link
Between Arts Study and Academic Achievement." Journal of Aesthetic Education 34, no.
3/4 (2000): 11‐75.
Winner, Ellen. "The Arts and Academic Improvement: What the Evidence Shows: Executive
Summary of the Harvard Project Zero Reviewing Education and the Arts Project (REAP)."
(2000).
Wiseman, Angela M. "'Now I Believe If I Write I Can Do Anything': Using Poetry to Create Opportunities for Engagement and Learning in the Language Arts Classroom." Journal of Language and Literacy Education 6, no. 2 (2010): 22‐33.
Wiseman, Angela. "Powerful students, powerful words: Writing and Learning in a Poetry
Workshop." Literacy 45, no. 2 (2011): 70‐77.
Weiss, Jen, and Scott Herndon. Brave New Voices: The Youth Speaks Guide to Teaching Spoken
Word Poetry. Portsmouth, NH: Heinemann, 2001.
This page was intentionally left blank.
Appendix C: Power Analysis
This page was intentionally left blank.
Power Analyses
After a comprehensive review of empirical studies measuring the impact of programs on a broad range of student outcomes, we propose an evaluation sample of 360 POL participants at selected schools. Various studies show that programs or interventions similar to POL in duration, intensity, or content tend to have small effects (about one fifth of a standard deviation) on a variety of student outcomes. For
example, Durlak, Weissberg, & Pachan’s (2010) meta‐analysis studying the effects of 41 after‐
school programs on student outcomes—student self‐perceptions, bonding to school, positive
social behaviors, or other aspects of school performance such as achievement test scores,
grades, and school attendance—reveals effect sizes between 0.14 and 0.37. Conventionally,
differences of 0.2, 0.5, and 0.8 standard deviations are considered ‘small’, ‘medium’, and ‘large’
effect sizes respectively (Lipsey & Wilson, 2001).
More specifically, we examined two studies that share important similarities with the current evaluation. One examined the effect of infusing theater‐based learning into the curriculum on verbal test scores, and the other examined changes in aspects of socio‐emotional development and in attitudes toward a specific subject after participation in a short‐duration program. In the first study, Inoa, Weltsek, and Tabone (2014) found that students who participated in theater‐based learning had higher reading scores than students who did not participate. Because achievement is only one of the student outcomes of interest, we also considered additional outcomes. The second study, by Crombie, Walsh, and Trinneer (2003), examined changes in students' confidence, perceived importance, and intentions regarding a specific discipline (science and technology) after participation in a summer program. One similarity between this study and our evaluation is that it measured changes in students' perceptions using pre‐ and post‐surveys. Overall, the results showed small differences in student outcomes after participation in the program.
Both studies provided the statistical information needed to conduct our power analyses and to make the assumptions necessary to calculate minimum detectable effects for the current evaluation. Exhibit 1 presents the results of these analyses under those assumptions, for sample sizes that reflect different response rates.
Exhibit 1 estimates the minimum detectable effects (MDEs), which are the minimum value of the
difference between treatment and comparison groups that would pass the statistical significance
test given expected sample sizes, power level, and statistical confidence level. Since impact
analyses may be conducted for the entire group and subgroups, Exhibit 1 shows the MDEs for
three hypothetical scenarios, using the standard level of power usually desired for impact
analyses (80%), and estimates MDEs at the 90% confidence level. All the estimations assume
that the comparison group will be the same size as each treatment group and use two‐tailed tests of the difference between group means.
Exhibit 1:
Minimum Detectable Effects (MDEs) Observable Under Two Sample Sizes

Sample sizes                                   All Participants   Assuming 80% response rate   Assuming 50% response rate
Program group (n=360) (10 schools)                  3,600                   2,880                        1,800
Comparison group (n=360) (10 schools)               3,600                   2,880                        1,800

Maximum MDE with sample size
Average confidence scores (M=3.86, SD=.88)            ‐                      .06                          .08
Average reading scores (M=193.16, SD=22.74)         1.37
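To make the calculations behind Exhibit 1 concrete, the sketch below approximates MDEs for a two‐tailed test of the difference between two group means at the 90% confidence level and 80% power, assuming equal group sizes. It is illustrative only and omits any clustering or design‐effect adjustments the final analysis might apply, so its results will not match Exhibit 1 exactly.

# Approximate minimum detectable effect (MDE) for a two-tailed difference-in-means
# test at 90% confidence (alpha = 0.10) and 80% power, with equal group sizes.
from scipy.stats import norm

def mde(sd, n_per_group, alpha=0.10, power=0.80):
    z_alpha = norm.ppf(1 - alpha / 2)    # two-tailed critical value (about 1.645)
    z_power = norm.ppf(power)            # about 0.84 for 80% power
    se = sd * (2 / n_per_group) ** 0.5   # standard error of the difference in means
    return (z_alpha + z_power) * se

# Confidence scores (SD = .88) with roughly 2,880 respondents per group (80% response rate)
print(round(mde(0.88, 2880), 2))   # approximately .06
# Reading scores (SD = 22.74) with 3,600 students per group (all participants)
print(round(mde(22.74, 3600), 2))  # approximately 1.33, close to the 1.37 in Exhibit 1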
Based on these assumptions, it appears that a study conducted with POL participants and a comparison group (assuming an 80% response rate on the survey) will be able to detect a difference of .06 points or more between the participant and comparison groups in confidence scores (M=3.86, SD=.88) at 80% power (i.e., if the evaluator conducted the analysis 100 times, a true difference of that size would be statistically significant 80 percent of the time). Because MDEs are sensitive to sample size, the MDEs will be larger if response rates are lower (see Exhibit 1). Thus, we acknowledge that it is critical to implement strategies to boost survey response rates to ensure the success of the study. For reading scores (assuming we obtain data for all students in the school), the study will be able to detect a difference of 1.37 points or more (M=193.16, SD=22.74). It is important to note that states follow different testing regimes; for this reason, we may need to standardize academic scores before analysis.
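To illustrate the standardization step noted above, the sketch below (using hypothetical column names rather than the study's actual data layout) converts reading scores to z‐scores within each state so that scores from different state testing regimes are placed on a comparable scale before pooling; the final approach will be specified during analysis.

# Standardize reading scores within each state before pooled analysis.
# 'state' and 'reading_score' are hypothetical column names.
import pandas as pd

def standardize_within_state(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()
    by_state = df.groupby("state")["reading_score"]
    df["reading_z"] = (df["reading_score"] - by_state.transform("mean")) / by_state.transform("std")
    return df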
Note that MDEs are not a guarantee of program impacts. Rather, they show the minimum size of
the impact we expect to detect based on given sample sizes and our estimation of how much the
impacts will vary within each group (program vs. comparison). We make no assumptions about
the size of the impact (i.e., the average difference between the program and comparison group
outcomes). The impact may well be zero, in which case our tests of impact will likely not be able
to reject the null hypothesis of no difference between groups. However, if the impacts are
greater than zero, we will be able to detect them only if they are larger than the MDEs (and then only 80%
of the time). In short, the statistical power of the study is not directly related to whether impacts
exist. Because we expect to find some differences in program implementation across schools, we
also assume we will need to control for such differences by adding fixed effects in all statistical
models.
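As an illustration of what controlling for school‐level implementation differences could look like, the sketch below (with assumed variable names, not the evaluation's final model specification) adds school fixed effects to an ordinary least squares model of a student outcome.

# Fit an OLS model of a student outcome on POL participation with school fixed effects.
# 'outcome', 'pol_participant', and 'school' are hypothetical column names.
import statsmodels.formula.api as smf

def fit_school_fixed_effects(df):
    # C(school) adds an indicator (fixed effect) for each school in the sample.
    model = smf.ols("outcome ~ pol_participant + C(school)", data=df)
    return model.fit()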
References
Crombie, G., Walsh, J. P., & Trinneer, A. (2003). Positive effects of science and technology summer camps on confidence, values, and future intentions. Canadian Journal of Counseling, 37(4), 257‐269.
Durlak, J. A., Weissberg, R. P., & Pachan, M. (2010). A meta‐analysis of after‐school programs
that seek to promote personal and social skills in children and adolescents. American
Journal of Community Psychology, 45, 294‐309.
Inoa, R., Weltsek, G., and Tabone, C. (2014). A Study on the Relationship Between Theater Arts
and Student Literacy and Mathematics Achievement. Journal for Learning Through the
Arts 10, no. 1. 1‐21.
Lipsey M.W., Wilson, D. B. (2001). Practical meta‐analysis. London: SAGE.
Appendix D: Customizable Email for School Recruitment
This page was intentionally left blank.
[Contact - Prefix] [Contact - First Name] [Contact – Last Name]
[Contact - Title]
[District Name]
[District Address 1]
[District Address 2]
[City], [State] [ZIP]
Dear [Contact - Prefix] [Contact– Last Name]:
My name is ______ and I am contacting you from the State Arts Agency that serves schools in
your region, on behalf of the National Endowment for the Arts (NEA). I wanted to alert you
that we have recommended [insert school name] to participate in a research study about how
Poetry Out Loud (POL) is implemented in your school. We have high hopes for this study, since
it is an opportunity to examine how POL programming benefits youth, how it enriches the
English Language Arts curriculum, and how it affords opportunities for teachers to expand their
knowledge base in teaching poetry. Moreover, the results will yield important information
about how POL is implemented across schools, providing opportunities to learn more about
effective POL programmatic practices.
This study, commissioned by the NEA, is primarily focused on understanding the effect of POL
on student engagement, academic achievement, and social-emotional outcomes, and on examining the
variety of factors involved in the successful implementation of the program. A total of 10 schools
across the U.S. will be selected to take part in the study, and we hope yours is one of them.
To conduct the data collection and research activities for the study, the NEA has contracted with
Social Policy Research Associates (SPR). The study will comprise three main research activities: (1) pre
and post online student surveys of POL participants and non-POL participants; (2) site visits to
conduct interviews and focus groups with students, teachers, and administrators; and (3) the
provision of de-identified student level data. For additional details about the study please see the
enclosed one-page summary.
To continue this selection process, a research team member from SPR will be contacting you in
the coming weeks. We expect research activities will begin in the Fall of 2018 and take place
throughout the 2018-2019 school year. We sincerely hope that you will consider the possibility of
[insert school name]’s participation in this study. If you have any questions about the study, please
contact Melissa Mack at (510) 788-2478 or melissa_mack@spra.com.
Sincerely,
[insert name]
This page was intentionally left blank.
Appendix E: Consent and Assent Forms
This page was intentionally left blank.
PARENT/GUARDIAN PASSIVE CONSENT TO PARTICIPATE IN RESEARCH
Dear Parent:
[Name of School] will be participating in an evaluation of the Poetry Out Loud program that the
school offers to students [specify grade level, if appropriate]. Poetry Out Loud is a national arts
program that encourages high school‐aged youth to learn about poetry through memorization
and recitation. The purpose of the study is to understand how reading and performing poetry
affects student engagement, learning, and grades.
All students in the school will be given the opportunity to fill out two short surveys in the Fall and
Spring of this academic year about their experiences with poetry. The survey is anonymous and
voluntary. Your child’s grade does not depend on answering the questions. Your child does not
have to fill out any part of the questionnaire that makes him or her feel uncomfortable or that
you think your child should not answer.
Students who are participating in Poetry Out Loud may be invited to participate in an interview
or focus group with a researcher.
If for any reason you do not wish your son or daughter to participate in the survey, please sign
this form and return it by [date].
_______________________________________________
Student’s Name (please print)
_______________________________________________
Parent signature
Date:_______________
The study is funded by the National Endowment for the Arts and is being conducted by Social
Policy Research Associates. If you have any questions about the study or problems related to the
study you can talk to your principal; the project director at the NEA, [insert appropriate NEA
contact], who can be reached at [phone] or [email]; or the director of the study, Melissa Mack.
You can call or email her at (510) 788‐2478 or Melissa_Mack@spra.com. If you have questions
about the study but want to talk to someone else who is not a part of the study, you can call the
Solutions Independent Review Board at (855) 226‐446.
STUDENT ASSENT TO PARTICIPATE IN INTERVIEWS AND FOCUS GROUPS
(This form will be distributed to students who are asked to participate in interviews and focus groups.)
1. What will happen to me in this study?
Description of the study: This study looks at student experiences with poetry to understand how the
Poetry Out Loud program is working in classrooms like yours across different schools. This study
seeks to understand how reading and performing poetry affects student engagement, learning and
grades. As a participant in this study you may be asked to participate in an interview or focus group.
Interviews and focus groups will last between 30 minutes and one hour. During the interview/focus
group, you will be asked questions about your experiences at your high school and in your classes.
With your permission, the interview will be audio taped and we will selectively transcribe the tapes.
2. Can anything bad happen to me?
Risks or Discomforts of Participating in the Study: There is very minimal risk for participating in the
study. If you feel uncomfortable with any of the questions you are asked you do not have to answer
the question or continue participating in the interview.
3. Can anything good happen to me?
Benefits of Participating in the Study: We cannot guarantee any benefits of this study, but it might be
enjoyable to talk about your experiences at your high school, and what we learn about your
experiences in your school might help other schools and classrooms be better places for students.
4. Will anyone know who I am in the study?
Confidentiality: Your name will not be used in the study; we will identify you by your grade level.
5. Who can I talk to about the study?
Contact Information: If you have any questions about the study or problems related to the study you
can talk to your teacher, who can put you in touch with the study directors.
6. What if I do not want to do this?
Voluntary Participation: You can choose not to participate in the study, or you can change your mind
and stop being in the study at any time, and you do not have to answer any of the questions asked of
you. You will not get in trouble, and it will not affect your grades in school.
STUDENT ASSENT FORM TO PARTICIPATE IN SURVEYS
(A version of this form will appear on the front page of the electronic survey. Students will
indicate their assent to participate by clicking the “I Agree” button or something similar.)
1. What will happen to me in this study?
Description of the study: This survey is a part of a study that looks at student experiences with poetry
to understand how the Poetry Out Loud program is working in classrooms like yours across different
schools. This study seeks to understand how reading and performing poetry affects student
engagement, learning and grades. As a participant in this study you will be asked to participate in this
survey, which will take approximately 20-30 minutes to complete. The survey will ask you questions
about your experience in your community, your school and your classes.
2. Can anything bad happen to me?
Risks or Discomforts of Participating in the Study: There is minimal risk to participating in this
survey. If you feel uncomfortable with any of the questions you are asked you do not have to answer
the question or continue participating in the survey.
3. Can anything good happen to me?
Benefits of Participating in the Study: We cannot guarantee any benefits of this study but what we
learn about your experiences in your school might help other schools and classrooms be better places
for students.
4. Will anyone know who I am in the study?
Confidentiality: Your name will not be used in the study; we will identify you by your grade level.
5. Who can I talk to about the study?
Contact Information: If you have any questions about the study or problems related to the study you
can talk to your teachers, who can put you in touch with the study directors.
6. What if I do not want to do this?
Voluntary Participation: You can choose not to participate in the study, or you can change your mind
and stop being in the study at any time, and you do not have to answer any of the questions asked of
you. You will not get in trouble, and it will not affect your grades in school.
This page was intentionally left blank.
Appendix F: Interview and Focus Group Protocols
This page was intentionally left blank.
POL Student Interview Protocol
My name is _________, and I am with Social Policy Research Associates, the organization that is
conducting a study about the Poetry Out Loud program. Thank you for taking the time to talk to
us. Our goal today is to learn about your experiences in the Poetry Out Loud program.
Public reporting burden for this collection of information is estimated to average 45 minutes per
response, including the time for reviewing instructions, searching existing data sources, gathering
and maintaining the data needed, and completing and reviewing the collection of
information. This agency may not conduct or sponsor, and a person is not required to respond to,
a collection of information unless that collection displays a valid OMB control number XXXX‐XXXX,
expiring [date]. Our discussion here should last about 45 minutes.
Your participation in this interview is voluntary, and there is no right or wrong answer – we just
want to understand your experiences. There are no program consequences (i.e., loss of benefits)
for deciding not to participate in the interview, or for deciding not to answer any particular
question. Also please note that your name will not be associated with any information you provide.
We will keep your responses private to the extent permitted by law.
[if you will be recording] We will be taking notes so we can later recall your perspectives more
accurately. In addition, so we can stay focused on the conversation, we would like to record
today’s discussion. If at any point you would like me to pause or turn off the recorder, please let
me know. I want to let you know that Social Policy Research Associates will not use your name in
any reports.
Background
1. Tell me a little bit about how you feel about school. Do you like it? Dislike it? Do you have
favorite subjects and least favorite subjects?
2. What’s something you like to do outside of school?
Poetry Out Loud
3. Tell me about the poetry you’ve studied in school this year. (e.g., did it happen all in one
week, or does your teacher weave it in with other types of writing? do you read and
discuss particular poems as a class? get assignments to write your own poem?)
a. Are you participating in the Poetry Out Loud competition?
b. Have you picked your poems and started memorizing them yet?
c. Why did you pick the poems you did?
d. How have you prepared to perform your poems – e.g., imagining being the person
speaking, or trying on different roles and points of view?
4. Had you ever memorized a poem before Poetry Out Loud? If yes, when and what for? If
not, what has memorizing a poem for the first time been like?
Poetry Appreciation & Engagement
5. What are some of the things you like about poetry? What are some of the things you
dislike about poetry?
6. Do you have different feelings about different kinds of poetry that you’ve read? If so,
what are they?
7. How is poetry different from other kinds of writing? What makes it special?
8. What do you think poetry is good for, if anything?
9. Do you write poetry?
10. How does poetry work? (if poetry were a machine, how would it function? what makes it
GO?)
11. Has your feeling about poetry changed over time? If yes, what changed? When did the
change happen?
12. Do you read poetry outside of school (not as homework)? If yes, what do you read? How
often?
13. Do you write poetry on your own (that is, not as part of a school assignment)?
14. If you have access to social media, do you post poems on social media? If yes, what
platforms?
15. If yes, your own poems or those of other people? If no, why not?
Academic Engagement and Achievement
16. Do you think that studying poetry has helped you improve in your work in English class?
In what way? [Probe to help them articulate specifics―has it improved their writing?
Their reading comprehension? Their vocabulary?]
17. What are you learning through your study of poetry that is most meaningful or
interesting to you? [For probes, consider: creative expression, different modes of
expression, different perspectives/world views, etc.]
18. Has studying poetry changed the way you feel about studying English language and
literature? If yes, in what way?
19. How about your other subjects or school in general? Has your experience with poetry
studies changed the way you approach or think about other subjects? Or your interest in
learning in general?
Social & Emotional Development
20. Has studying poetry for POL helped you in any way? If so, how? (e.g., self‐confidence,
self‐expression, increased sense of belonging, etc.)
a. Has reading and thinking about the poetry helped you? How?
i. Has it changed the way you see yourself?
ii. Has it changed your relationships with your friends? With school?
iii. Has it affected your participation in other activities – extracurricular or
outside of school altogether?
iv.
b. Has memorizing and reciting poems for others helped you? [note: how students
are able to respond to this question will depend on whether the student has
started memorizing and practicing or even already performed the POL poem]
i. Has it changed the way you see yourself?
ii. Has it changed your relationships with your friends? With school?
iii. Has it affected your participation in other activities – extracurricular or
outside of school altogether?
Wrap Up
21. I think you’ve answered all my questions. Do you have any questions for me? Is there
anything else you would like to share about your relationship with poetry and
participating in POL?
POL Teacher Interview Protocol
My name is _________, and I am with Social Policy Research Associates, the organization that is
conducting a study about the Poetry Out Loud program. Thank you for taking the time to talk to
us. Our goal today is to learn about your experiences with the Poetry Out Loud program.
Public reporting burden for this collection of information is estimated to average 45 minutes per
response, including the time for reviewing instructions, searching existing data sources, gathering
and maintaining the data needed, and completing and reviewing the collection of
information. This agency may not conduct or sponsor, and a person is not required to respond to,
a collection of information unless that collection displays a valid OMB control number XXXX‐XXXX,
expiring [date]. Our discussion here should last about 45 minutes.
Your participation in this interview is voluntary, and there is no right or wrong answer – we just
want to understand your experiences. There are no program consequences (i.e., loss of benefits)
for deciding not to participate in the interview, or for deciding not to answer any particular
question. Also please note that your name will not be associated with any information you provide.
We will keep your responses private to the extent permitted by law.
[if you will be recording] We will be taking notes so we can later recall your perspectives more
accurately. In addition, so we can stay focused on the conversation, we would like to record
today’s discussion. If at any point you would like me to pause or turn off the recorder, please let
me know. I want to let you know that Social Policy Research Associates will not use your name in
any reports.
Teacher Background
1. I’d like to learn a little bit more about you—how long have you been a teacher? How long
have you been teaching at this school?
2. What do you love most about teaching English Language Arts? What do you find most
challenging?
3. How comfortable do you feel teaching poetry? What do you enjoy about it? What do you
find challenging?
Experience with POL Curriculum
4. How long have you been teaching the POL curriculum? [If not new teachers] How did you
get started with it?
5. Tell me a bit about your experience with the curriculum. Describe for me how it gets
implemented in your classroom. (e.g., How are poems chosen? How does the process of
choice, memorization, analysis, and performance work? How many total hours of class
time? How many days of lesson plans? Portion of POL compared to the rest of the ELA
curriculum?)
6. How has the POL program been helpful to you in your teaching? Are there ways that it
could be improved?
Perceptions of Student Outcomes‐Academic
7. What kinds of poetry are your students most drawn to? Why? Do you have a sense of
what they find most interesting about poetry? What do they find most challenging?
8. Are there any connections between your students’ participation in POL and their
performance in your English class? Probe around ELA‐specific outcomes such as:
a. engagement in learning
b. reading comprehension
c. vocabulary development
d. analytical skills
e. writing
9. Have you noticed any changes in your students’ attitudes about reading or writing poetry
after participation in POL? How about their appreciation for poetry?
Perceptions of Student Outcomes―Socio‐emotional Development
10. A unique aspect of the POL program is the performance component. What are the
benefits of this component (i.e., what do students gain through the experience of
performing poetry)? Can you share some specific examples? [Probe/listen for themes
around empowerment/self‐confidence]
11. Can you tell whether your students’ participation has had any impact on their
engagement in school in general? Can you describe? [Consider social interactions,
leadership opportunities, etc.]
12. Can you tell whether your students’ participation in POL has had any impact on the way
they engage socially? On their engagement with other students? How about their
engagement in learning overall? Describe.
13. To what extent does participation in POL affect students’ understanding of the world? Of
different cultures and perspectives?
14. To what extent does student participation influence students’ understanding of
themselves?
Wrap Up
15. Have your feelings about poetry and about teaching poetry changed since you started
participating in POL? If so, how?
a. Has this influenced your teaching practices?
b. Has it changed the extent to which you incorporate poetry into your curriculum?
16. Is there anything else you would like to share with me about the effects of POL
participation on your students or yourself?
State Arts Agency Administrator Interview Protocol
My name is _________, and I am with Social Policy Research Associates, the organization that is
conducting a study about the Poetry Out Loud program. Thank you for taking the time to talk to
us. Our goal today is to learn about your experiences with the Poetry Out Loud program.
Public reporting burden for this collection of information is estimated to average 30 minutes per
response, including the time for reviewing instructions, searching existing data sources, gathering
and maintaining the data needed, and completing and reviewing the collection of
information. This agency may not conduct or sponsor, and a person is not required to respond to,
a collection of information unless that collection displays a valid OMB control number XXXX‐XXXX,
expiring [date]. Our discussion here should last about 30 minutes.
Your participation in this interview is voluntary, and there is no right or wrong answer – we just
want to understand your experiences. There are no program consequences (i.e., loss of benefits)
for deciding not to participate in the interview, or for deciding not to answer any particular
question. Also please note that your name will not be associated with any information you provide.
We will keep your responses private to the extent permitted by law.
[if you will be recording] We will be taking notes so we can later recall your perspectives more
accurately. In addition, so we can stay focused on the conversation, we would like to record
today’s discussion. If at any point you would like me to pause or turn off the recorder, please let
me know. I want to let you know that Social Policy Research Associates will not use your name in
any reports.
Administrator Background
1. I’d like to learn a little bit more about you—how long have you been with the State Arts
Agency? What is your role there? (possible probes: do you have a background in
teaching? the arts?)
2. Did you play a key role in bringing POL to your state?
a. If so, please give a brief account of how that came to be. (e.g., did someone from
the NEA contact you? Did you hear about the program on your own and reach out
to them?)
b. If not, can you share what you know about how your state came to participate in
POL?
Relationship Between SAA and POL
3. Tell me a little bit about how the SAA supports POL.
a. Does it have a mission to actively expand the schools participating in it?
b. Does it host the state‐level finals?
c. Provide publicity?
d. Connect teaching artists (poets) to schools?
4. Does the SAA supplement the funding for POL that it receives from the NEA? If so, what
are additional sources of funds and how much compared to the allocation from the NEA?
5. Compared to the other arts programming that the SAA supports, what is special about
POL?
6. Why does the SAA continue to support POL? What does it see as the value of POL?
Implementation and Outcomes
7. Has the agency done any of its own research or evaluation on POL?
8. What is your perspective on the elements that contribute to successful implementation
of POL in a school?
9. Do you see differences in how successful the implementation of POL is at different
participating schools?
10. Do you think POL increases community exposure to the work of the SAA? If so, how?
11. Do you think POL increases community participation in public arts programming? If so,
how?
Wrap Up
12. Is there anything else you would like to share with me about the effects of POL
participation on schools, teachers, students, their families, the broader community?
POL Student Focus Group Protocol
My name is _________, and I am with Social Policy Research Associates, the organization that is
conducting a study about the Poetry Out Loud program. Thank you for taking the time to talk to
us. Our goal today is to learn about your experiences in the Poetry Out Loud program.
Public reporting burden for this collection of information is estimated to average 1 hour per
response, including the time for reviewing instructions, searching existing data sources, gathering
and maintaining the data needed, and completing and reviewing the collection of
information. This agency may not conduct or sponsor, and a person is not required to respond to,
a collection of information unless that collection displays a valid OMB control number XXXX‐XXXX,
expiring [date]. Our discussion here should last about 60 minutes.
Your participation in this interview is voluntary, and there is no right or wrong answer – we just
want to understand your experiences. There are no program consequences (i.e., loss of benefits)
for deciding not to participate in the interview, or for deciding not to answer any particular
question. Also please note that your name will not be associated with any information you provide.
We will keep your responses private to the extent permitted by law.
[if you will be recording] We will be taking notes so we can later recall your perspectives more
accurately. In addition, so we can stay focused on the conversation, we would like to record
today’s discussion. If at any point you would like me to pause or turn off the recorder, please let
me know. I want to let you know that Social Policy Research Associates will not use your name in
any reports.
Poetry Exposure/Appreciation
1. How many of you studied poetry before this year? [Count # of affirmative responses]
a. For those who respond in the affirmative, get some information about that—
when did they study it? What did that kind of study entail?
2. How many of you enjoy poetry?
a. What is it that you like about it?
b. Do you have favorite poets? Who are they? Why do you like their poetry?
3. How many of you dislike poetry or find it (or parts of it) difficult?
a. What is it about poetry that you find challenging? Is there something that you
think might help make it less challenging/more enjoyable?
4. Tell me a bit about what studying poetry in your English class is like.
a. What kinds of poems do you study?
b. Do you get to choose? If so, how do you make your choice? What are you drawn
to?
c. What does studying poetry involve? [Depending on where they are in the
curriculum, it should include memorization, analysis, and recitation. Feel free to
probe so you know what they are doing. You are asking this question so that you
know how to tailor some of the other questions, depending on where they are in
the program and what they have done to this point.]
d. Do you study by yourself? In pairs or in groups?
Poetry and Academic Outcomes
5. Do you think that studying poetry has helped you improve in your work in English class?
In what way? [Probe to help them articulate specifics―has it improved their writing?
Their reading comprehension? Their vocabulary?]
6. Tell me about the process of analyzing poetry. How would you describe it? What’s
challenging about it? Is there something rewarding about it?
7. What are you learning through your study of poetry that is most meaningful or
interesting to you? [For probes, consider: creative expression, different modes of
expression, different perspectives/world views, etc.]
8. Has studying poetry changed the way you feel about studying English language arts? If
yes, in what way?
9. How about your other subjects or school in general? Has your experience with poetry
studies changed the way you approach or think about other subjects? Or your interest in
learning in general?
Poetry and Socio‐emotional Development
10. Have your poetry studies introduced you to different ways of looking at the world and at
other people? Can you talk about that a little bit? [This is a more targeted question
around exposure to different cultures, perspectives, and world views.]
11. Has studying poetry had an effect on how you see yourself? If so, how?
12. How many of you have had experience speaking in front of audiences before? How many
of you are comfortable speaking in front of an audience?
a. For those of you who are uncomfortable with it, tell me a little bit about that.
What is it that makes you uncomfortable?
13. How many of you have had an opportunity to recite poetry in front of your classmates?
[Count # of affirmative responses]
a. [For those who responded affirmatively], what was that like?
b. Did that change at all your level of comfort speaking in front of people? Can you
describe how for me?
14. In studying poetry, you probably have seen lots of different ways that poets use language
to express themselves. Has this influenced the way you express yourself, or encouraged
you to think about different ways of expressing yourself? [If yes], tell me a bit about that.
a. For example, are you more comfortable using metaphor? Simile? A wider range of
words than you used to use? Other changes?
b. Do you find that these changes affect your writing? Your speaking? Both?
Wrap Up
15. Have your feelings about poetry changed since you started studying it in class?
16. Do you ever share poetry with your friends? Your family? If yes, how do you share it?
(e.g., social media?) What have been their responses?
17. Before studying poetry in this class, how many of you have tried writing your own
poetry? [If largely non‐affirmative response] Are any of you writing it now? What do you
write about? How do you feel when you write your own poetry?
a. For those who had been engaged in writing poetry before taking this class, has
the class had any influence on your writing process?
18. I think you’ve answered all my questions. Do you have any questions for me? Is there
anything else you would like to share about your participation in POL that we might not
have covered yet?
Appendix G: Survey Instrument
This page was intentionally left blank.
I. Academic Engagement and Outcomes
Constructs
Academic engagement in school
Academic engagement in English classes
School climate
Engagement in extracurricular activities
Other skills related to POL participation (engagement in group discussion, public
speaking)
Academic achievement in school
Academic achievement in English classes
Academic aspirations
[Academic Engagement, Motivation]
[Source: Student Engagement in School Questionnaire (SESQ)]
Q1. When thinking about school and ALL your classes in general, please tell us what you
think about the following statements...
Response options: Never / Rarely / Sometimes / Often / Always
‐ I am very interested in learning.
‐ I think what I am learning in school is interesting.
‐ I like what I am learning in school.
‐ I enjoy learning new things in class.
‐ I think learning is boring.
‐ I try hard to do well in school.
‐ In class, I work as hard as I can.
‐ When I’m in class, I participate in class activities.
‐ I pay attention in class.
‐ When I’m in class, I just act like I’m working.
‐ In school, I do just enough to get by.
‐ When I’m in class, my mind wanders.
‐ If I have trouble understanding a problem, I go over it again until I understand it.
‐ When I run into a difficult homework assignment, I keep working at it until I think I’ve solved it.
[Academic Engagement in English Class]
[Source: CPS 5 essentials; scale to 5 points, *added]
Q2. When thinking about your ENGLISH class last year, how strongly do you disagree or agree
with the following statements?
Response options: Strongly Disagree / Disagree / Neither Disagree Nor Agree / Agree / Strongly Agree
‐ I usually look forward to this class.
‐ I work hard to do my best in this class.
‐ Sometimes I get so interested in my work I don't want to stop.
‐ The topics I am studying are interesting and challenging.
‐ I like my English class more than any of my other classes.*
‐ I normally do well in my English class.*
[School Climate]
[Source: CHKS, High School Questionnaire Core Module]
Q3. How strongly do you disagree or agree with the following statements?
Response options: Strongly Disagree / Disagree / Neither Disagree Nor Agree / Agree / Strongly Agree
‐ I feel close to people at this school.
‐ I am happy to be at this school.
‐ I feel like I am part of this school.
‐ The teachers at this school treat students fairly.
‐ The teachers at this school encourage self‐expression.
‐ I feel safe in my school.
[Extracurricular Activities Engagement]
[Source: Chicago Public Schools: 5Essentials, modified*]
Q4. Overall, how do you feel about school?
Q5. Now we have specific questions about after‐school activities. In a typical 5‐day week, how
often:
Response options: Never / Once a week / Twice a week / Three days per week / Four days per week / Five or more days per week
‐ Do you participate in academic activities (e.g., getting writing coaching, tutoring, homework help, etc.) after school?
‐ Do you participate in music activities or classes after school?*
‐ Do you participate in visual arts such as drawing, painting, photography, or ceramics activities or classes after school?*
‐ Do you participate in literature clubs after school? This could include reading and discussing poetry or novels, or doing creative writing.*
‐ Do you participate in theater or drama clubs after school?*
‐ Do you participate in debate clubs after school?*
‐ Do you participate in any other enrichment activities such as chess club or sports/fitness activities after school?
‐ Do you participate in computer classes after school? These classes may include computer programming, robotics, or game design.
[Attitudes towards public speaking and self confidence in public speaking]
[Source: Personal Report of Communication Apprehension (PRCA‐24)]
Q6. How strongly do you disagree or agree with the following statements?
Response options: Strongly Disagree / Disagree / Neither Disagree Nor Agree / Agree / Strongly Agree
‐ I dislike participating in group discussions.
‐ Generally, I am comfortable while participating in group discussions.
‐ I like to get involved in group discussions.
‐ Engaging in a group discussion with new people makes me tense and nervous.
‐ I have no fear of giving a speech.
‐ I feel relaxed while giving a speech.
‐ My thoughts become confused and jumbled when I am giving a speech.
‐ I face the prospect of giving a speech with confidence.

Q6a. How strongly do you disagree or agree with the following statements?
Response options: Strongly Disagree / Disagree / Neither Disagree Nor Agree / Agree / Strongly Agree
‐ I dislike performing in public (e.g., dance, theater, music).
‐ Generally, I am comfortable while performing in public (e.g., dance, theater, music).
‐ I like to get involved in public performance.
‐ Engaging in public performance projects with new people makes me tense and nervous.
‐ I have no fear of performing in public.
‐ I feel relaxed while performing (e.g., dance, theater, music) in front of others.
‐ My thoughts become confused and jumbled when I am performing in public.
‐ I face the prospect of performing publicly (e.g., dance, theater, music) with confidence.
[Academic Achievement and Aspirations]
[Source: CPS 5 Essentials; CHSK]
Q7. What grades did you earn in school last year?
1. Mostly below D’s
2. Mostly D’s
3. About half C’s and half D’s
4. Mostly C’s
5. About half B’s and half C’s
6. Mostly B’s
7. About half B’s and half A’s
8. Mostly A’s

Q8. What grades did you earn in your ENGLISH/READING/LITERATURE class last year?
1. Mostly below D’s
2. Mostly D’s
3. About half C’s and half D’s
4. Mostly C’s
5. About half B’s and half C’s
6. Mostly B’s
7. About half B’s and half A’s
8. Mostly A’s

Q9. What is the highest level of education you plan to complete?
1. Not planning to complete high school
2. High school
3. Career/technical school
4. 2‐year community college or junior college
5. 4‐year college or university
6. Graduate or professional school
7. Undecided
II. Social & Emotional Outcomes
Constructs
Self‐confidence in general
Self‐confidence in intellectual abilities
Self‐confidence in social skills
Empowerment; self‐expression
Civic participation
[Self‐confidence (in general, in intellectual abilities, in social skills, in academic expectations)]
[Source: California Healthy Kids Survey: Resilience and Youth Development Module]
Q10. How TRUE are the following about you right now?
[Response options for each statement: Not at all true / A little true / Somewhat true / Mostly true / Completely true]
I have high goals and expectations for myself.
I am looking forward to a successful career.
I try to work out problems by talking or writing about them.
I can work out my problems.
I don’t expect very much of myself in the future.
I can do most things if I try.
I can work with someone who has different opinions than mine.
There are many things that I do well.
I listen to other students’ ideas.
I feel bad when people get their feelings hurt.
I try to understand what other people go through.
When I need help, I find someone to talk with.
I enjoy working together with other students on class activities.
When I work in school groups, I do my fair share.
I stand up for myself without putting others down.
I try to understand how other people feel and think.
I trust my ability to solve difficult problems.
I understand my moods and feelings.
I understand why I do what I do.
[Leadership/Empowerment]
[Source: Common Measure: Leadership Development, High School]
Q11. If you found out about a problem in your community that you wanted to do something
about (for example, illegal drugs were being sold near a school, or high levels of lead were
discovered in the local drinking water), how well do you think you would be able to do each of
the following?
[Response options for each item: I definitely can’t / I probably can’t / Maybe / I probably can / I definitely can]
Create a plan to address the problem.
Get other people to care about the problem.
Organize and run a meeting.
Express your views in front of a group of people.
Identify individuals or groups who could help you with the problem.
Write an opinion letter to a local newspaper.
Call someone on the phone that you had never met before to get their help with the problem.
Contact an elected official about the problem.
Organize a petition.
III. Poetry Appreciation & Engagement
Constructs
General attitudes toward poetry
Attitudes toward reading poetry
Attitudes toward writing poetry
Attitudes toward memorizing poetry
Attitudes toward reciting poetry
Sharing poetry with peers
Sharing poetry via social media
Indicator of POL participation
[Attitudes Toward Reading and Writing Poetry]
[Source: POL Student Survey AND Koukis 2010; adapted]
Q12. The following questions are related to your experiences with poetry. For each statement, please choose whether you strongly disagree, disagree, neither disagree nor agree, agree, or strongly agree.
[Response options for each statement: Strongly Disagree / Disagree / Neither Disagree Nor Agree / Agree / Strongly Agree]
I am familiar with poetry.
Poetry is important to me.
I enjoy learning how to interpret poetry.
I enjoy figuring out poems and thinking about what they mean.
My English teacher helps me understand poetry.
I read poetry with my family.
I read poetry in my spare time.
It is easy for me to read poetry.
I write my own poetry in my spare time.
It is easy for me to write poetry.
I enjoy memorizing poems.
It is easy for me to memorize poems.
I enjoy reciting poetry.
I appreciate poetry more when it is read aloud.
I enjoy reciting poems in front of my peers.
My English teacher encourages me to write my own poetry.
I am comfortable sharing poems I wrote with my peers.
I feel comfortable reciting poetry in front of my peers.
Q13. How do you feel about poetry in general? (e.g. What aspects of poetry do you like? Are
there things about poetry that you don’t like? If so, what are they?)
Poetry Out Loud encourages students to learn about great poetry through memorization and
recitation. This program helps students master public speaking skills, build self‐confidence, and
learn about literary history and contemporary life.
Q14. Have you ever participated in Poetry Out Loud?
1. Yes (Go to Q15)
2. No (SKIP TO Q17.)
3. Not sure (SKIP TO Q17.)
Q15. How many years have you participated in Poetry Out Loud?
1. One year
2. Two years
3. Three years
4. Four years
Q16. When did you participate in Poetry Out Loud most recently?
1. Last year
2. Two years ago
3. Other [SPECIFY]
Q17. Did you learn about poetry in your English/language arts class during the last year?
1. Yes
2. No (SKIP TO Q20)
3. Not sure (SKIP TO Q20)
Q18. Have you competed in Poetry Out Loud? Please choose at what level (mark all that apply):
1. Classroom
2. School contest
3. Regional contest
4. State contest
5. National contest
6. Did not compete at any of the above
Q19. Have any of your peers or family members participated or competed in Poetry Out Loud?
1. Yes
2. No
3. Not sure
[Use of Social Media]
[Source: CPS 5 Essentials (adapted)]
Q20. People sometimes use social media platforms, such as Facebook, Instagram, Snapchat, or Twitter, to create or share information or perspectives related to poetry. How often have you done the following?
[Response options for each item: Never / Less than once a month / Once or twice a month / Once a week / Several times a week]
How often do you share someone else’s poems on social media?
How often do you share someone else’s poems through email?
How often do you share your own poems on social media?
How often do you share your own poems through email?
How often do you comment or tweet about poetry since school started?
IV. Demographic Information
Q21. Are you:
1. Male
2. Female
3. Gender non‐conforming
4. Prefer not to say
Q22. Are you of Hispanic or Latino origin?
1. No
2. Yes
Q23. What is your race? (mark all that apply)
1. American Indian or Alaskan Native (A person having origins in any of the original peoples of
North America, and who maintains cultural identification through tribal affiliation or
community recognition.)
2. Asian (A person having origins in any of the original peoples of the Far East, Southeast Asia,
the Indian subcontinent, or the Pacific Islands. This area includes, for example, Cambodia,
China, India, Japan, Korea, Malaysia, Pakistan, the Philippine Islands, Thailand, and Vietnam.)
3. Native Hawaiian or Other Pacific Islander (A person having origins in any of the original
peoples of Hawaii, Guam, Samoa, or other Pacific Islands)
4. Black or African‐American (A person having origins in any of the black racial groups of Africa.)
5. White (A person having origins in any of the original peoples of Europe, North Africa, or the
Middle East)
Q24. Is English your first language?
1. Yes [SKIP to Q26a]
2. No
3. Don’t know
Q25a. [IF NO OR DON’T KNOW] How well do you understand, speak, read, and write English?
[Response options for each item: Very Well / Well / Not Well / Not at All]
Understand spoken English
Speak English
Read English
Write English
Q25b. What languages do you speak at home or with friends?
1. Spanish
2. Mandarin
3. Cantonese
4. Taiwanese
5. Tagalog
6. Vietnamese
7. Korean
8. French
9. Russian
10. German
11. Other (write‐in)
Q26a. What is the highest level of education reached by your mother or female guardian?
1. Did not finish high school
2. Finished high school
3. Attended but did not finish college
4. Finished two‐year college
5. Finished four‐year college
6. Finished graduate degree (e.g., MA, MD, PhD)
7. Don’t know/Not applicable
Q26b. What is the highest level of education reached by your father or male guardian?
1. Did not finish high school
2. Finished high school
3. Attended but did not finish college
4. Finished two‐year college
5. Finished four‐year college
6. Finished graduate degree (e.g., MA, MD, PhD)
7. Don’t know/Not applicable
Q27. What grade are you in?
1. 9th grade
2. 10th grade
3. 11th grade
4. 12th grade
Q28. What month were you born?
1. January
2. February
3. March
4. April
5. May
6. June
7. July
8. August
9. September
10. October
11. November
12. December
Q29. What are the first two letters of your first name?
Q30. What are the first two letters of your last name?
Appendix H: Survey Instrument Technical Details:
Scale Validity and Reliability
Student Engagement in School Questionnaire (SESQ)
Description: Measures high school students’ engagement and investment in school.
Engagement and investment are measured via students’ self‐reports of time spent on
homework assignments, attendance, concentration, and attention in class.
Developer/source: Dornbusch S. and Steinberg L.
Reliability: Cronbach’s alpha = .74–.86
Validity: Criterion‐related validity for this scale, including positive correlations with
grades, has been established (Taylor et al. 1994). Construct validity of the measure,
including correlations with student ratings of academic ability and perceptions of the
importance of school, has also been reported (Taylor et al. 1994).
CPS 5 Essentials
Description: The CPS 5 Essentials Survey measures both overall school engagement and
students’ engagement in specific courses and subjects.
Developer/source: Developed by University of Chicago Consortium on School Research
Reliability: Rasch reliability = 0.78
Validity: Developers established content validity via literature review, expert review, and
extensive multi‐year testing of items. Construct validity was established via Rasch
analyses and by relating the five central constructs to student performance indicators.
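For readers unfamiliar with the Rasch framework cited above, the reported reliability and validity evidence rest on modeling the probability that a student endorses an item as a function of the student’s latent engagement level and the item’s difficulty. The sketch below shows only the dichotomous Rasch item response function as a simplified illustration; the 5 Essentials items are polytomous and the developers’ estimation procedures are more involved, and the function name and example values here are ours, not theirs.

```python
import math

def rasch_probability(theta: float, b: float) -> float:
    """Dichotomous Rasch model: probability that a person with ability `theta`
    endorses an item with difficulty `b` (both expressed in logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Illustration: a student whose engagement is one logit above an item's difficulty
# endorses that item with probability of roughly 0.73.
print(round(rasch_probability(theta=0.5, b=-0.5), 2))  # 0.73
```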
CHKS ‐ Resilience & Youth Development Module
Description: The California Healthy Kids Survey, Resilience and Youth Development Module
measures factors associated with positive youth development. These include both internal
and external factors that contribute to positive developmental outcomes. Subscales used
include: School Connectedness, Self‐Efficacy, Empathy, Problem Solving, and Self‐
Awareness.
Developer/source: California Department of Education (CDE), WestEd, and Duerr
Evaluation Resources
Reliability: Cronbach’s alpha = .73–.88
Validity: Moderate construct validity has been established by examining the relationship
of subscales to related constructs (Hanson & Kim, 2007).
Personal Report of Communication Apprehension (PRCA‐24)
Description: The PRCA‐24 measures communication apprehension, with context‐specific
subscales assessing apprehension in public speaking, dyadic interactions, small groups,
and larger groups.
Developer/source: McCroskey J.
Reliability: Cronbach’s alpha > .90
Validity: Criterion and construct validity have been established in a body of published
studies (McCroskey & Beatty, 1994; Beatty, 1987; Beatty & Friedland, 1990).
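Because the Q6 and Q6a item sets mix positively and negatively worded statements, apprehension scores can only be computed after the negatively worded items are reverse‐coded. The sketch below illustrates that step under stated assumptions: responses are exported to a pandas DataFrame coded 1–5, and the column names and item groupings are hypothetical placeholders, not the actual export fields or the published PRCA‐24 scoring key.

```python
import pandas as pd

# Hypothetical column names for the Q6 items (1 = Strongly Disagree ... 5 = Strongly Agree).
POSITIVE_ITEMS = ["q6_comfortable", "q6_likes_discussions", "q6_no_fear",
                  "q6_relaxed", "q6_confident"]
NEGATIVE_ITEMS = ["q6_dislikes_discussions", "q6_tense_with_new_people",
                  "q6_thoughts_jumbled"]

def reverse_code(df: pd.DataFrame, cols, scale_max: int = 5) -> pd.DataFrame:
    """Flip negatively worded items so higher values consistently indicate less apprehension."""
    out = df.copy()
    out[cols] = (scale_max + 1) - out[cols]
    return out

def q6_composite_score(df: pd.DataFrame) -> pd.Series:
    """Sum all eight Q6 items after reverse-coding; returns one score per respondent."""
    recoded = reverse_code(df, NEGATIVE_ITEMS)
    return recoded[POSITIVE_ITEMS + NEGATIVE_ITEMS].sum(axis=1)
```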
Common Measure: Leadership Development, High School
Description: This scale asks students to rate their competence in skills related to civic
action and provides an assessment of students’ civic action efficacy.
Developer/source: Flanagan, C. A., Syvertsen, A. K., and Stout, M. D.
Reliability: Cronbach’s alpha = 0.90
Validity: Structural equation modeling provides evidence of adequate model fit and
construct validity.
Poetry Appreciation and Engagement Scale
Description: This measure was modified from the POL Student Survey and the Koukis
(2010) poetry reading assessment. The scale was designed to measure high school
students’ level of engagement with, and appreciation of, poetry.
Developer/source: Social Policy Research Associates, modified from POL Student Survey
and Koukis (2010)
Reliability: Not available
Validity: Not available
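Because reliability and validity figures are not yet available for this adapted scale, internal consistency will need to be estimated from the survey data collected for this study. Below is a minimal sketch of a Cronbach’s alpha calculation, assuming the Q12 item responses are held as columns of a pandas DataFrame coded 1–5; the variable names are illustrative only and do not describe the actual analysis programs.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of Likert items; rows are respondents, columns are items."""
    complete = items.dropna()                          # listwise deletion, for illustration
    k = complete.shape[1]                              # number of items in the scale
    item_variances = complete.var(axis=0, ddof=1)      # variance of each item
    total_variance = complete.sum(axis=1).var(ddof=1)  # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical usage, where `poetry_items` holds the Q12 columns for all respondents:
# alpha = cronbach_alpha(poetry_items)
```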