LCI_CIS Validation

Att_LCIandCISValidation.1850-004.pdf

National Study on Alternative Assessments (NSAA) Teacher Survey

OMB: 1850-0860

Assessment for Effective Intervention
http://aei.sagepub.com

Measuring the Enacted Curriculum for Students With Significant Cognitive Disabilities: A Preliminary
Investigation
Meagan Karvonen, Shawnee Y. Wakeman, Claudia Flowers and Diane M. Browder
Assessment for Effective Intervention 2007; 33; 29
DOI: 10.1177/15345084070330010401
The online version of this article can be found at:
http://aei.sagepub.com/cgi/content/abstract/33/1/29

Published by:
Hammill Institute on Disabilities

and
http://www.sagepublications.com



Measuring the Enacted Curriculum for Students With Significant Cognitive Disabilities: A Preliminary Investigation



Meagan Karvonen, Shawnee Y. Wakeman, Claudia Flowers, and Diane M. Browder

Based on recent federal legislation, alternate assessments for students with disabilities may now be based
on alternate achievement standards, modified achievement standards, or grade-level achievement standards.
Although all students with disabilities must access the
general curriculum, those with significant cognitive
disabilities often do so through extensions of grade-level
content standards. Because curriculum is individualized for students with disabilities, experts cannot immediately apply to this population the methods for
examining the taught curriculum in general education. The purpose of this article is to describe the development and use of a method for examining the
enacted curriculum for students who take alternate assessments. We present initial item development, survey
blueprint, expert review, and pilot test findings. Experts
can use this tool to investigate the alignment of curriculum with alternate assessments and state standards
and to design professional development for educators
learning to access the general curriculum.

Students with significant disabilities were first included in
large-scale assessments after 1997 amendments to the Individuals with Disabilities Education Act (IDEA; IDEA
Amendments of 1997) required alternate assessments be
provided for students who could not participate in typical tests, even with accommodations. Final No Child Left
Behind regulations permitted states to develop alternate
achievement standards for reporting adequate yearly
progress for students with significant cognitive disabilities, but the regulations stipulated that these alternate

achievement standards must align with a state’s academic
content standards, promote access to the general curriculum, and reflect the highest achievement standards possible
(200.1[d]; Title 1—Improving the Academic Achievement
of the Disadvantaged; Final Rule, 2003). The federal law
does not define “significant cognitive disabilities,” and individuals who participate in alternate assessments come
from several IDEA categories of disabilities. In this article
the population of focus is students who may be classified
through IDEA as having moderate to severe mental retardation, autism, or multiple disabilities, including mental
retardation.
The Alternate Achievement Standards for Students
With the Most Significant Cognitive Disabilities: Non-Regulatory Guidance states that the content of alternate
assessments should be “clearly related to grade-level content, although it may be restricted in scope or complexity or take the form of introductory or pre-requisite skills”
(U.S. Department of Education, 2005, p. 26). Educators
can begin with academic content standards for the grade
level in which the student is enrolled and then adapt or
“extend” these content standards for the individual with
disabilities.
Although some models of alignment have focused
on the relationship between standards and large-scale
assessments, Porter and Smithson (2001) characterized
alignment among three elements: (a) the intended curriculum, typically represented in content standards or curriculum frameworks; (b) the assessed curriculum; and (c) the
enacted curriculum, or what is actually taught in the classroom. It is the interaction of those three elements, along
with secondary elements (e.g., resources, professional development), that contributes to the learned curriculum
(student outcomes).
In general education settings, states and districts
sometimes use the Surveys of Enacted Curriculum (SECs;
Council of Chief State School Officers, 2003) to investigate
alignment of the enacted curriculum with state standards
and assessments. Procedures have been developed to use
these surveys for purposes ranging from the interpretation
of assessment results to school curriculum improvement
and alignment. Versions of the SEC exist for mathematics,
science, and English language arts (ELA) and reading based
on the content, materials, and methods typical in general
education settings. A longitudinal, randomized study on
the use of SEC data in professional development for math
and science teachers revealed improved alignment of instruction with state standards over a 2-year period (Council of Chief State School Officers, 2004). Additionally,
researchers have found that general curriculum access as
measured by teacher survey is a strong predictor of alternate assessment outcomes for students with significant
cognitive disabilities (Roach & Elliott, 2006).

Understanding Curriculum and Expectations for Students With Significant Disabilities
As described previously, students with significant cognitive disabilities are now expected to learn academic skills
and content. However, when these students are assessed,
their performance is not judged against typical (grade-level) achievement standards. The Alternate Achievement
Standards for Students With the Most Significant Cognitive
Disabilities: Non-Regulatory Guidance notes that alternate achievement expectations may reflect an expectation
for learning a narrower range of content (e.g., fewer objectives under a content standard) and learning content
that is less complex while still challenging (U.S. Department of Education, 2005). The skills the student acquires
may be associated with those that are typically acquired
at earlier grades or that are prerequisites to attaining
grade-level proficiency.
As this shift to academics represents a major curriculum change for this population (Browder et al., 2004),
teachers still need considerable help with planning curriculum; identifying, developing, and adapting materials;
and learning how to effectively teach academic skills to
students with significant cognitive disabilities. Surveys have
revealed that some teachers question the relevance of this
grade-level content for students with significant intellectual disabilities (Agran, Alper, & Wehmeyer, 2002) or do
not agree that alternate assessment promotes access to the
general curriculum standards (Flowers, Ahlgrim-Delzell,
Browder, & Spooner, 2005; Kleinert, Kennedy, & Kearns,
1999). Although some special educators have gained increased knowledge of general education through their

states’ professional development activities or through preservice training, many states continue to struggle with how
to build teacher competence in this area (see Note 1). Further complicating this curriculum shift are (a) the lack of
research-based strategies for teaching a wide range of ELA
and math content to the population (Browder, Ahlgrim-Delzell, Pugalee, & Jimenez, 2006; Browder, Wakeman,
Spooner, Ahlgrim-Delzell, & Algozzine, 2006); (b) a lack
of understanding of academics in general education, especially among special educators who teach students
with significant disabilities (Otis-Wilborn, Winn, Griffin, &
Kilgore, 2005); and (c) the need to combine academic
instruction for alternate achievement standards with individual curricular priorities represented in students’ Individualized Education Programs. Given these challenges,
how do teachers learn to identify the gaps in the curriculum and expand the range of academic content taught to
their students to cover the full range represented in the
state standards and on the assessment?
Because special education teachers need help with
curriculum planning as they adjust to the increased academic focus required by federal legislation, a mechanism
for measuring the enacted curriculum for students with
significant cognitive disabilities is now needed for both
alignment studies and teacher professional development.
We developed the Curriculum Indicators Survey (CIS) to
measure the enacted curriculum for students with significant cognitive disabilities who participate in alternate assessments based on alternate achievement standards. The
CIS also assesses some information about instructional
resources and professional development. The CIS incorporates elements of the SEC approach, but for several
reasons the existing SECs are not appropriate as valid
measures of the enacted curriculum for this population.
The SEC uses academic language that may be unfamiliar
to teachers of this student population, it includes items
about homework and classroom instruction procedures
that are irrelevant for students with significant disabilities, and it provides cognitive demand descriptors that are
not sufficiently wide ranging to capture the response
processes of students with the most significant cognitive
disabilities.
The purpose of this article is to describe the process
we used to develop and refine the CIS. We discuss four
stages of survey development: (a) initial survey development and blueprint, (b) procedures for survey completion, (c) findings from expert reviews of the surveys and
related materials, and (d) qualitative findings from pilot
implementation of the CIS. Prior to these four sections, we
provide a brief overview of the survey contents.

Overview of Survey Contents
We designed the CIS to measure the enacted curriculum in
ELA and mathematics across pre-kindergarten to 12th grade (pre-K–12). Part I of the survey collects information about
the teacher’s background, professional development,
classroom characteristics, instructional resources, use of
particular types of classroom assessment, and instructional influences. This part of the survey is answered
with all of the teacher’s students in mind (i.e., the “target
class”). If the teacher does not teach a self-contained
class but is instead responsible for students in multiple
settings (e.g., inclusion, homebound instruction), the target class consists of the teacher’s case load.
Part II of the survey has two separate versions for
ELA and math and is completed with a particular student
(“target student”) in mind. The Part II surveys contain a list of topics or strands (e.g., geometry) and specific content within each topic (see Figure 1 and Note 2). Teachers rate the intensity of coverage of each item within topics that they teach to the target student and also indicate the highest performance expectation (cognitive demand) for that student during the year. This combination of information can be used to generate
matrices that illustrate the percentage of instructional
time spent per strand, per level of cognitive demand (see
Table 1). This matrix provides a snapshot of areas of instructional emphasis and gaps. When compared with a
test blueprint organized in a similar matrix, it can identify
areas of strong and weak alignment. A third column indicates the grade level from which teachers adapted materials, activities, and contexts for items that were taught or
planned. The final page of each Part II survey asks about
the intensity of use of a variety of instructional strategies
(e.g., individualized instruction, independent practice)
and the level of expectation for student participation in
these activities. As the Part II surveys are designed to obtain the richest information about the enacted curriculum,
the following section is limited to item development for
those components of the CIS.
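To make the matrix construction concrete, the short Python sketch below is illustrative only: the sample ratings, the use of the coverage rating as a rough proxy for instructional time, and the Porter-style alignment index (one minus half the summed absolute differences in proportions, in the spirit of the alignment work by Porter and Smithson) are assumptions for illustration, not the authors' published scoring procedure.

    from collections import defaultdict

    # Each entry pairs a (strand, cognitive demand) cell with the teacher's
    # intensity-of-coverage rating (0 = no coverage ... 4 = intensive, systematic).
    responses = [
        ("Numbers and operations", "Attention", 4),
        ("Numbers and operations", "Performance", 3),
        ("Measurement", "Attention", 2),
        ("Geometry", "Memorize/recall", 1),
    ]

    def emphasis_matrix(items):
        """Convert coverage ratings into percentages of total rated emphasis."""
        totals = defaultdict(float)
        for strand, demand, rating in items:
            totals[(strand, demand)] += rating
        grand = sum(totals.values())
        return {cell: 100 * value / grand for cell, value in totals.items()}

    def alignment_index(enacted, blueprint):
        """Porter-style index: 1 - sum(|p - q|) / 2, computed on cell proportions."""
        cells = set(enacted) | set(blueprint)
        diff = sum(abs(enacted.get(c, 0) / 100 - blueprint.get(c, 0) / 100) for c in cells)
        return 1 - diff / 2

    enacted = emphasis_matrix(responses)
    blueprint = {("Numbers and operations", "Performance"): 50.0,
                 ("Geometry", "Memorize/recall"): 50.0}
    print(round(alignment_index(enacted, blueprint), 2))  # 0.4 for this toy example

A value near 1 would indicate that instructional emphasis mirrors the test blueprint; values near 0 would flag the kinds of gaps that a matrix like Table 1 is intended to reveal.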

Initial Development of Academic Content for the CIS
We established several priorities to guide the process of
generating items for ELA and math topics. First, we decided that the language in the items should be comprehensible to special educators who may lack the detailed
vocabulary of the content area specialist. To assist teachers of students with significant disabilities who are still
learning to access the general curriculum, some items include both the usual content-area terminology and a parenthetical interpretation. For example, the item “proportional
relationships in problem solving, modeling, and analysis”
contains parenthetical examples of unit pricing and map
interpretation. Our second priority was that the performance expectations ratings (cognitive demand) should be
accessible to students with a wide range of levels of symbolic communication, including those with no symbol use and limited intentionality. Finally, because the goal was to
develop content lists that are applicable across states, we
decided that the final list of topics and items should have
some social validity among a wide range of special educators, general educators, and measurement professionals
with expertise in this area.

Sources of ELA and Math Content
As the idea of general curriculum access and academics
for this population is still fairly new, there are few examples of agreement across multiple states on how to access
academic content standards. One exception is the Alternate
Assessment Collaborative (2004), a consortium of states and
nonprofit organizations that developed consensus frameworks and expanded benchmarks (hereafter, EAG benchmarks) for Grades 3 through 8 and high school in reading,
writing, math, and science. Additionally, the curricular
frameworks developed by Massachusetts (Massachusetts
Department of Education, 2001) are cited by the U.S. Department of Education (2005) as a model for states. The
Massachusetts frameworks, which reflect the same domains as the national standards developed by the National
Council of Teachers of Mathematics (www.nctm.org) and
the National Council of Teachers of English (www.ncte
.org), provide categories of content that can be used
across states for consideration of curriculum access. Although we cannot guarantee that all CIS items are reflected in various states’ curricular frameworks, summaries
of survey responses within categories (e.g., National
Council of Teachers of Mathematics strands) may still provide useful estimates of curricular emphasis across states.
We developed CIS items by gleaning items from both
the Massachusetts frameworks and EAG benchmarks.
Massachusetts’s frameworks, which extend downward to
early academic skills and feature skill progression across
grades, were the primary source of items. In each topic
(strand), we checked to be sure we had incorporated the contents of the EAG benchmarks.
Both the EAG benchmarks and Massachusetts frameworks were written with cognitive demand embedded in
the item (e.g., EAG = “demonstrate understanding of main
idea”; Massachusetts = “summarize main ideas and supporting details”). We stripped verbs signifying cognitive demand out of the items so that only content (e.g., “main
idea in text”) was evident in the item. A second member
of the team checked the first member’s edits to be sure
no content was lost and that the remaining items would
appear clear to special educators. The final versions of
the Part II content sections consisted of 250 items across
27 topics in ELA and 178 items in 5 strands in math. The
response options for emphasis on each item were on a 5-point scale ranging from no coverage to intensive, systematic coverage, with a separate option to indicate that
instruction is planned for later in the school year but has
not yet started.


FIGURE 1. Excerpted items and codes from Curriculum Indicators Survey Part II (mathematics).

Scale for Rating Performance Expectations
Using the categories described by Tileston (2004) and
based on Bloom’s taxonomy, we created a scheme for coding cognitive demand represented in the items. Tileston’s
original categories included (a) memorize/recall, (b) performance, (c) comprehension, (d) application, (e) analysis, and (f) synthesis/evaluation. To be sensitive to the level
of adaptation to general curriculum that may be needed
for some students with the most significant cognitive disabilities who may lack symbolic communication or even
intentionality, we extended the scale downward to include
attention, which includes such behaviors as eye gaze, touch,
vocalization, and recognition. To assist with accuracy and
precision of coding, we created lists of verbs that were
consistent with each category of cognitive demand (see
Table 2). We collapsed the upper two categories to yield
a 6-point scale ranging from attention to analysis, synthesis, evaluation. We made this change to reduce the potential difficulty in processing categorical response options, which may be greater than the difficulty for items measured on a continuous scale (Dillman, 2000).


TABLE 1. Example of Enacted Curriculum in Mathematics: Percentage of Instructional Time Spent per Strand, at Each Level of Cognitive Demand

Strand                    Attention   Memorize/recall   Performance   Comprehension   Application   Analysis/synthesis/evaluation
Numbers and operations        26             5               16              2               2                 0
Algebra                        6             2                1              1               1                 0
Geometry                       4             2                0              0               2                 0
Measurement                   16             0                4              0               0                 0
Probability                    4             0                4              0               0                 0

TABLE 2. Performance Expectations and Sample Behaviors
Attention: touch, look, vocalize, respond, attend, recognize
Memorize/recall: list, describe (facts), identify, state, define, label, recognize, record
Performance: perform, demonstrate, follow, choose, count, locate, read
Comprehension: explain, conclude, group, restate, review, translate, describe (concepts), paraphrase
Application: compute, organize, collect, apply, classify, construct, solve, use, order, develop, generate, interact with text
Analysis/synthesis/evaluation: pattern, analyze, compare, contrast, compose, predict, extend, plan, judge, evaluate, interpret, investigate, examine, cause and effect

Given (a) the large number of survey items and (b) the low density of responses expected at the upper end of the scale due to the nature of the population, we intended the collapsing of the upper categories to reduce response burden without eliminating useful information.
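As a concrete illustration of the coding scheme in Table 2, the sketch below assumes a simple keyword lookup that scans an objective statement and returns the first matching performance-expectation category, checking categories from lowest to highest demand. The CIS relies on human raters rather than software, so the function and the exact verb sets here are illustrative assumptions drawn loosely from Table 2.

    DEMAND_VERBS = {
        # Categories listed from lowest to highest cognitive demand.
        "attention": {"touch", "look", "vocalize", "respond", "attend", "recognize"},
        "memorize/recall": {"list", "identify", "state", "define", "label", "record"},
        "performance": {"perform", "demonstrate", "follow", "choose", "count", "locate", "read"},
        "comprehension": {"explain", "conclude", "group", "restate", "review", "translate", "paraphrase"},
        "application": {"compute", "organize", "collect", "apply", "classify", "construct", "solve", "use", "order"},
        "analysis/synthesis/evaluation": {"analyze", "compare", "contrast", "compose", "predict", "extend",
                                          "plan", "judge", "evaluate", "interpret", "investigate", "examine"},
    }

    def code_objective(objective: str) -> str:
        """Return the first category whose verb list matches a word in the objective."""
        words = [w.strip(",.").lower() for w in objective.split()]
        for category, verbs in DEMAND_VERBS.items():
            if any(w in verbs for w in words):
                return category
        return "uncoded"

    print(code_objective("Identify coins"))                                   # memorize/recall
    print(code_objective("Use the next dollar strategy to make a purchase"))  # application

The two examples mirror the training items described later in the article, where “identify coins” is rated at the memorize/recall level and use of the next dollar strategy is rated as application.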

Instructional Activity and Student Participation Items
We adapted the lists of instructional activities in Part II
from the SEC to include other instructional methods that
represent best practices for the population. Part II lists a
total of 18 activities in ELA and 17 in math. Teachers rate
the frequency of use of each method within the most recent week on a 5-point scale ranging from none to considerable (8 or more hr). In addition to frequency of use
of each method, teachers rate the extent of student participation expected during that activity on a 4-point scale
ranging from no participation to independent, active participation. We developed this scale based on the characteristics of the target students. The challenge for this
population is that there must be some trade-offs between
the amount of material targeted and the level of student
achievement. As mastery and/or independent performance is a focal point in the scoring criteria in several states,
it is necessary to determine the level of performance expected in teacher instruction. If teachers indicate that they
engage in many instructional activities with the student
and present content at high levels of cognitive demand
but expect nothing more than passive participation, this
provides rich information about the expectations for the
student’s learning of the enacted curriculum.

Survey Completion Procedures
We designed the CIS to be completed in a group setting
in which survey administration is prefaced by a brief presentation on alignment and the enacted curriculum. Following that presentation, teachers complete a brief exercise in which they classify their students according to four
levels of symbolic communication: (a) abstract symbolic
communication, (b) concrete symbolic communication,
(c) presymbolic communication, and (d) nonsymbolic communication with limited intentionality (awareness). Symbolic communication refers specifically to the types of
symbol systems the student currently utilizes, from abstract symbols (e.g., selects items from picture or word
menu) to concrete symbols (e.g., uses a photograph of an
apple to request an apple) to presymbolic (e.g., touches
the apple to ask for it) or awareness (may or may not
respond to the presence of food). A recent cluster analysis of teacher survey responses supported this classification system (Browder, Wakeman, & Flowers, 2007).
Because students who take alternate assessments based
on alternate achievement levels encompass a wide range
of abilities, this theoretical model guides a purposeful
sampling strategy for Part II surveys. While teachers complete Part I of the survey, researchers review the student
information forms and select a single target student as
the focus for Part II surveys. We acknowledge that the
request to respond about a particular student drastically
narrows the sample size upon which we may be able to
make generalizations about the enacted curriculum. However, this focus on a single student improves the precision
of measurement of the enacted curriculum given the individualized nature of curricular priorities, instructional
methods, and expectations that may exist within one
teacher’s classroom (and that indeed exist for the population of students with significant disabilities). Although
the sample of target students may be smaller, it may still
be representative of the broader population if teachers
are assigned to certain target students through a purposeful sampling method based on students’ levels of symbolic communication.
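The purposeful assignment of target students is described here only at a conceptual level; the short sketch below is one hypothetical way to implement it, assuming the researchers want the selected target students spread across the four levels of symbolic communication. The roster data, function name, and least-represented-level rule are assumptions for illustration, not the study's actual procedure.

    from collections import Counter
    import random

    def pick_target_students(rosters, seed=0):
        """rosters maps teacher -> [(student_id, communication_level), ...]."""
        rng = random.Random(seed)
        picked_levels = Counter()
        targets = {}
        for teacher, students in rosters.items():
            # Prefer the communication level least represented in the sample so far,
            # breaking ties at random.
            ordered = sorted(students, key=lambda s: (picked_levels[s[1]], rng.random()))
            student_id, level = ordered[0]
            targets[teacher] = student_id
            picked_levels[level] += 1
        return targets

    rosters = {
        "Teacher A": [("S1", "abstract symbolic"), ("S2", "presymbolic")],
        "Teacher B": [("S3", "presymbolic"), ("S4", "concrete symbolic")],
    }
    print(pick_target_students(rosters))

Balancing selections this way is one reading of the sampling goal stated above: a smaller sample of target students that still spans the population's range of symbolic communication.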
When teachers receive the Part II surveys they are
also provided with a page of code summaries that contain lists of verbs typical of each performance descriptor.
As a group, the researchers and teachers complete several example items using statements resembling Individualized Education Program objectives that might reflect
different levels of cognitive demand within certain CIS
items. For instance, under the item “coins and money,”
teachers would rate “identify coins” at the memorize/
recall level, whereas they would label “use the next dollar strategy to make a purchase” as application. Five examples in ELA and six examples in math are discussed,
and teachers have an opportunity to talk through some of
the distinctions between activities associated with certain
levels of cognitive demand. Finally, given the likely narrowing of the curriculum for the students, researchers emphasize that teachers should not expect to rate most or all
of the items on the Part II surveys for a particular student,
and that they may in fact find many topics that they do
not teach. This experience is normalized to avoid the potential negative affective responses that might occur if
teachers think they are not teaching “enough” to the target student. After the training examples, teachers complete both Part II surveys independently. Researchers
remain in the room and answer questions if teachers ask
for clarification.

Findings From Expert Reviews
The initial survey draft was reviewed by professionals with
content expertise in ELA and math, curriculum for students with significant disabilities, and the SEC model. ELA
and math content experts each had doctoral degrees and
more than 10 years of experience as university-based
teacher educators in their content areas. The SEC expert
also possessed a doctoral degree and was one of the original developers of the SEC model. The expert in curriculum for students with significant disabilities had a master’s
degree in special education, postgraduate coursework in
related areas, and significant work experience in providing training and technical assistance on general curriculum access and assessment for students with significant
disabilities.
Each expert received a set of specific questions to
address, along with background materials and copies of
the instruments. The content area experts and special education experts made several suggestions for revisions to
instructions, formatting, and item wording. We eliminated
five ELA items and added several math items based on
this feedback. Both content experts agreed that the scale
for cognitive demand was appropriate and clear and that
items and topics were consistent with national guidelines
in these disciplines.
The SEC expert generally agreed with the survey development procedures (past and planned) and provided
helpful suggestions about feedback to solicit from teachers and the design of a validity study (J. Smithson, personal communication, February 20, 2006). The expert also
made several specific suggestions about the clarity of instructions to teachers and about response options, particularly in areas where changes were made from the SEC
to the CIS. Finally, the expert noted important limitations
in the ability to analyze alignment with standards based
on decisions made to simplify the survey by removing
embedded levels of cognitive demand within each level
of intensity of coverage.

Pilot Test Method
Participants and Setting
We pilot tested the first complete draft of the CIS with 12
teachers of students with significant disabilities using the administration procedures described previously. Of the 12
teachers who participated in the pilot, 25% had between
4 and 10 years of teaching experience, whereas 33% had
between 11 and 20 years of experience and another 33%
had between 21 and 30 years of experience. One respondent was a student teacher. Two respondents (17%) had
bachelor’s degrees, whereas the remaining 9 (75%) had
master’s degrees. Ten (83%) had special education certifications, and 6 (50%) also had certifications in grade-level
subject areas. Four respondents were National Board
certified. The students with whom they worked were in
pre-K–12 self-contained special education classrooms; were
classified as having severe/profound disabilities, autism,
and moderate developmental disabilities; and participated
in their state’s alternate assessment. All students in the
teachers’ classrooms who were in the age range for the
accountability system (3–8 and 10) participated in alternate assessments judged against alternate achievement
standards. We conducted the CIS pilot in a meeting room
in a school setting.

Data Collection and Analysis
We used three sources of data to pilot the CIS: teachers’
completed CISs, observation notes on teacher conversations during the CIS administration, and a follow-up e-mail
survey (see Table 3). We used CIS responses to make refinements to areas such as response options, new and revised items, formatting, and estimates of time required to
complete the survey. We also used observation notes to
identify potential areas of confusion and to understand
teachers’ interpretations of the items in light of their own
instruction. Eight teachers responded to the follow-up
e-mail questions about the accuracy and thoroughness
of the coverage of the curriculum for that student, the appropriateness of the survey for all of their students with significant cognitive disabilities, and the clarity of response
options in each section of the survey.

Pilot Results: Teacher Perceptions of the CIS
All 12 teachers were able to complete the CIS after receiving instruction on its purpose and administration. The
total time for the orientation and completion of the instrument was 90 min. Six of the eight teachers responding to
the follow-up e-mail survey generally thought that the CIS
accurately and thoroughly covered the math and ELA curriculum taught to the target student this year. Several noted
that although it did capture the content for the target student, the survey also included much more advanced content that went beyond what the student was learning this
year (e.g., “significant characters in Greek, Roman, and
Norse mythology” on the ELA survey). One of the two
teachers who thought the survey did not accurately and
thoroughly address the content of the target student’s curriculum was a pre-K teacher, and the other thought the
survey was a mix of items that were either aimed at higher-functioning students or not specific enough when they did reflect the
student’s level. When she could positively endorse an item,
this teacher seemed to be frustrated by not being able to
describe the complexity of how content was taught.
Regarding the relevance of the items for all of the
teachers’ students, responses were similar to those about
the target student. One teacher who had students labeled
with mild, moderate, and severe/profound mental retardation said that all of her students were in some way working on academic skills reflected in the survey. Two other
teachers agreed about the similarity of the content being
taught but noted the different adaptations depending
upon disability and skill level. One teacher noted that she
did not believe some of the literature topics on the ELA survey were relevant.
TABLE 3. Follow-Up Teacher Survey Items
For Questions 1 and 2, think about the Part II survey you filled out with one student in mind (either in ELA or math). Remember
that our goal was to capture the ELA or math curriculum you are teaching to that target student this year, not other academics or
functional goals.
1. Do you think Part II of the survey accurately and thoroughly captured the ELA or math curriculum for the student you had in
mind when you filled out the survey? If not, what seemed inaccurate or missing?
2. Are there students in your class for whom the contents of the survey would not reflect the curriculum they are learning in ELA
or math this year? If so, in what way?
3. Think about both parts of the survey, and about the curriculum you teach this year. What are we not learning about the “enacted” curriculum by using this survey? What else is important to understand?
4. What parts of the survey were especially confusing or difficult to fill out? What didn’t seem to work? What should we revise?
Note. ELA = English language arts.

Another teacher believed the math items were not functional enough and did not include
some of her students’ current skills. At the pre-K level, the
teacher viewed the survey items as relevant for some of
the students but not for others.
Pilot test teachers expressed some concerns about the
content of the Part II surveys, including the source of the
items (which looked unfamiliar because they were not the
state’s standards) and whether teachers who were still
new to states’ academic content standards would be familiar with the language. Teachers also wondered whether
certain skills “counted” under particular CIS items. For example, an open house performance was associated with
the item “presentation based on a dramatic or literary production,” and one teacher asked whether the items in the
writing topic could be rated if the target student did adaptive writing (e.g., using adapted keyboards or writing
utensils such as a name stamp to facilitate student participation in the writing process).
One question in the e-mail survey asked teachers to
indicate what their responses to the CIS did not capture
about the curricula they teach and what else was important for us to understand about their students’ curricula.
The predominant theme underlying responses to this
question was, unsurprisingly, the individualization that
occurs. As one teacher stated, “Teaching our students is
not so ‘cut and dried.’” Given the diverse abilities, student
response modes, and use of assistive technologies, the
same skill may require extensive modification in its presentation to different students. Multiple teachers also
mentioned the importance of materials in adapting the
curriculum. Without commercially available materials that
are aligned to the curriculum and appropriate for their
students, teachers put considerable effort into adapting
materials for instruction. One teacher who used a specific
commercial curriculum in her classroom noted that although the curriculum could be described using the items
in the CIS, her CIS responses did not reflect information
about the integration of the academics into other parts of
the curriculum (e.g., vocational).

Pilot Test Discussion
Given that the CIS is a new instrument, and the first of its
kind developed for teachers of students with significant
cognitive disabilities, the purpose of this pilot study was
to validate its content with experts and teachers. Although
caution is warranted in interpreting the results due to the
small sample of pilot participants, we will use the information obtained to continue to refine the instrument for
use in future evaluations of instructional alignment.
The version of the CIS that was pilot tested with 12
teachers covered a broad range of ELA and math content
that, with few exceptions, captured the academic curriculum taught to students with moderate and severe disabilities. These students ranged from preschool to high school age and also represented all four categories used
to describe students’ level of symbolic communication.
Although some teachers indicated that their students
learned the “same” curriculum that was adapted differently, the sets of questions about instructional activities and
expected level of participation, as well as ratings of highest
performance expectations, should help explain the individualization that occurs when content is adapted for each
student. Although additional use of the CIS will be needed
to determine its true value to states, this preliminary study
suggests that it holds promise as a tool to measure the enacted curriculum for students who take alternate assessments based on alternate achievement standards.
States that may wish to use the CIS for alignment
studies or professional development purposes will need
to consider the trade-offs between the high resource demands required to obtain detailed information on the current version of the CIS and the lower resource demands
associated with other measures of the enacted curriculum. One possible next step in the development of the
CIS involves the creation of a short form. Although a short
form would provide a broader perspective on the curriculum taught by a large number of teachers, results would
lack the precision necessary to pinpoint specific gaps and
engage teachers in meaningful reflection about instruction. One possible compromise would be use of a matrix
sampling approach, in which some teachers complete
the long form but most complete the short form. This approach would provide states with instructional data that
could still yield alignment indices at the topic or strand
level without creating such a response burden that teachers would fail to respond to the survey.

Potential Limitations
There are several potential limitations to the CIS that we
should acknowledge. First, relying on teacher self-report
on instructional practices raises questions about the validity of the data obtained (Mayer, 1999). Multiple data
sources are needed to confirm teachers’ survey responses
and also to expand the information available about adaptations. On a small scale, classroom observations, analysis of instructional materials and other documents, and
interviews would serve to triangulate teachers’ survey responses. If used in practice rather than in a validation
study, these data may help teachers identify the way the
content of instruction identified in the CIS is intertwined
with materials, presentation modes, and response expectations. Observational data may also uncover discrepancies between what teachers intend to teach and what
curriculum they actually do teach.
Another limitation of the CIS is the source of topics
and items in Part II. Although we made efforts to draw

from nationally recognized frameworks, the survey contents do not perfectly represent any one state’s curriculum. Currently, alignment is determined by mapping CIS
topics onto strands in a state’s standards and reporting
discrepancies at a coarse “grain size.” If the survey is used
to make judgments about opportunity to learn, conclusions
are attenuated by the lack of precise matching to the state
standards. However, because the CIS aligns with national
strands in math (National Council of Teachers of Mathematics, n.d.) and ELA (National Council of Teachers of
English, n.d.), states with standards comparable to national recommendations may find that it is a close enough
match to be usable for consideration of instructional
alignment. The alternative solution would be for a state to
create item banks specific to its own state standards.
Finally, this phase of survey development did not include an analysis of the reliability of teachers’ responses.
Teachers completed the survey with one of their own students in mind, and none of the teachers in the pilot study
shared responsibility for the same students. Thus, there was
no opportunity to examine interrater agreement about
responses based on a single student. Tentative evidence of agreement on performance expectation categories emerged during teacher training, when teachers independently identified the level of expectation they saw in sample items and compared answers. Subsequent stages of
instrument development will include collection of reliability evidence.

Implications for Practice
Teacher perceptions of the survey provide some hints
about the possible use of the CIS as a self-assessment
tool for professional development on improved alignment.
Overall, teachers were positive about the survey during
the pilot administration; only one teacher responded negatively. One teacher indicated that completing the survey
had given her additional ideas about skills she could try
teaching to her students. These overall favorable responses may have been due to pairing the administration
of the CIS with a brief review of the purpose and practice
of general curriculum access for this population.
The extent to which the administration of the CIS is
embedded in meaningful professional development may
affect how teachers respond to its use. Because each Part
II CIS may take an hour to complete, teachers need to realize some benefits to themselves to offset the costs associated with their time commitment. Using the CIS as a
self-assessment tool for professional development based
on a curriculum development and problem-solving model
(cf. Sparks & Loucks-Horsley, 1990) may help teachers redesign instruction to be more deeply horizontally and
vertically aligned and more adaptable for their student
populations.

On a more basic level, the CIS may simply help
teachers realize the breadth and scope of the general
curriculum. One pilot study teacher volunteered that the
survey gave her new ideas about content to add to the student’s curriculum. Seeing this breadth and depth also may
stimulate teacher discussion about the complex challenge
of helping students meet state standards while providing
an individualized focus for instruction. Another pilot study
participant, an experienced teacher with a graduate degree
and multiple certifications, spoke to the tension between
accessing the general curriculum to prepare students for
alternate assessments and the historical need for Individualized Education Programs and priorities:
I work hard to make sure that I align the portfolio
goals with the curriculum. I struggle with teaching a
curriculum to students with very specific [Individualized Education Programs]. . . . How do we continue writing student-centered [Individualized
Education Programs] based upon their individual
needs and teach a specific curriculum at certain
grade levels? Sometimes my students need to learn or
practice a basic skill in the area of ELA or math that
is covered in a younger age group curriculum. . . . I
worry sometimes that by pushing grade level benchmarks, one of my students will not get the basic skill
needed. I know that I can adapt everything to make
it age appropriate, but it is sometimes difficult.
One potential benefit of the CIS is its use as a tool to
help teachers identify gaps between instruction and other
elements of the educational system. It may also help locate differences between what teachers intend to teach
and what they actually teach. It will be of utmost importance to make sure that teachers are trained to understand
the content within the standards and sound practices for
teaching that content.
In summary, the CIS is a tool that has potential to
evaluate the extent to which teachers are addressing the
general curriculum. This instructional alignment may be
of interest to state alternate assessment coordinators, professional development leaders, researchers who focus
on alternate assessment or general curriculum access,
and teachers themselves. Because additional research is
needed to build the validity of the CIS, those who utilize
its current form are encouraged also to collect information on how it is viewed by content experts and teacher
respondents.
ABOUT THE AUTHORS
MEAGAN KARVONEN, PhD, is an assistant professor of educational
research methods at Western Carolina University. She received her doctorate from the University of South Carolina. Her current interests include curriculum alignment, alternate assessments, and mixed methods
research and evaluation. SHAWNEE Y. WAKEMAN, PhD, is a research
associate for the National Alternate Assessment Center at the University
of North Carolina at Charlotte. CLAUDIA FLOWERS, PhD, is a professor of educational research at the University of North Carolina at Charlotte. DIANE M. BROWDER, PhD, is the Snyder Distinguished Professor of Special Education at the University of North Carolina. Her current
research interests include general curriculum access, literacy, and alternate assessment. Address: Meagan Karvonen, Department of Educational Leadership & Foundations, Western Carolina University, 250
Killian, Cullowhee, NC 28721; e-mail: karvonen@email.wcu.edu

NOTES
1. From state discussions at the fall seminars on Inclusive Assessment:
Evaluating and Improving Technical Quality of Alternate Assessments
(Denver, Colorado, October 9–10, 2006; and Washington, DC, October 24–25, 2006).
2. Additional information about the full CIS is available from Meagan
Karvonen.

REFERENCES
Agran, M., Alper, S., & Wehmeyer, M. (2002). Access to the general curriculum for students with significant disabilities: What it means to
teachers. Education and Training in Mental Retardation and Developmental Disabilities, 37, 123–133.
Alternate Assessment Collaborative. (2004). Guide to expanded benchmarks. Retrieved March 9, 2006, from http://www.cde.state.co.us/cdesped
Browder, D., Flowers, C., Ahlgrim-Delzell, L., Karvonen, M., Spooner, F.,
& Algozzine, R. (2004). The alignment of alternate assessment content with academic and functional curricula. The Journal of Special
Education, 37, 211–233.
Browder, D., Wakeman, S. Y., & Flowers, C. (2007). Level of symbolic communication classification for students with significant cognitive disabilities. Available at: http://education.uncc.edu/access/GCApowerpp.htm
Browder, D., Wakeman, S., Spooner, F., Ahlgrim-Delzell, L., & Algozzine,
B. (2006). Research on reading instruction for individuals with significant cognitive disabilities. Exceptional Children, 72, 392–408.
Browder, D. M., Ahlgrim-Delzell, L., Pugalee, D. K., & Jimenez, B. (2006).
Enhancing numeracy. In D. M. Browder & F. Spooner (Eds.), Teaching reading, math, and science to students with significant cognitive disabilities (pp. 171–196). Baltimore: Brookes.
Council of Chief State School Officers. (2003, June 9). Surveys of enacted curriculum: Tools for aligning instruction, standards, and assessments. Retrieved January 20, 2006, from http://www.ccsso.org/projects/Surveys_of_Enacted_Curriculum/
Council of Chief State School Officers. (2004). Data on enacted curriculum study: Experimental design study of effectiveness of DEC professional development model in urban middle schools. Retrieved March 10, 2006, from http://www.ccsso.org/content/pdfs/DECStudy.pdf
Dillman, D. A. (2000). Mail and Internet surveys: The tailored design
method (2nd ed.). New York: Wiley.
Flowers, C., Ahlgrim-Delzell, L., Browder, D., & Spooner, F. (2005).
Teachers’ perceptions of alternate assessment. Research and Practice for Persons With Severe Disabilities, 30, 81–92.
Individuals with Disabilities Education Act of 1990, 20 U.S.C. § 1400 et
seq. (1990) (amended 1997)
Kleinert, H. L., Kennedy, S., & Kearns, J. F. (1999). Impact of alternate
assessments: A statewide teacher survey. Journal of Special Education, 33, 93–102.
Massachusetts Department of Education. (2001). Resource guide to the Massachusetts curriculum frameworks for students with significant disabilities. Retrieved March 9, 2006, from http://www.doe.mass.edu/mcas/alt/
Mayer, D. P. (1999). Measuring instructional practice: Can policymakers
trust survey data? Educational Evaluation and Policy Analysis, 21,
29–45.
National Council of Teachers of English. (n.d.). Standards for the English language arts. Retrieved May 25, 2006, from http://www.ncte.org/about/over/standards/110846.htm
National Council of Teachers of Mathematics. (n.d.). Principles and standards for school mathematics. Retrieved May 25, 2006, from http://standards.nctm.org/index.htm
Otis-Wilborn, A., Winn, J., Griffin, C., & Kilgore, K. (2005). Beginning
special educators’ forays into general education. Teacher Education
and Special Education, 28, 143–152.
Porter, A. C., & Smithson, J. L. (2001, December). Defining, developing, and using curriculum indicators (CPRE Research Report Series RR-048). Retrieved January 20, 2006, from http://www.cpre.org/Publications/rr48.pdf
Roach, A. T., & Elliott, S. N. (2006). The influence of access to general
education curriculum on alternate assessment performance of students with significant cognitive disabilities. Educational Evaluation
and Policy Analysis, 28, 181–194.
Sparks, D., & Loucks-Horsley, S. (1990). Five models of staff development.
Oxford, OH: National Staff Development Council.
Tileston, D. W. (2004). What every teacher should know about special
learners. Thousand Oaks, CA: Corwin Press.
Title 1—Improving the Academic Achievement of the Disadvantaged; Final Rule, 68 Fed. Reg. 236 (Dec. 9, 2003).
U.S. Department of Education. (2005). Alternate achievement standards for students with the most significant cognitive disabilities: Non-regulatory guidance. Retrieved August 12, 2005, from http://www.ed.gov/policy/elsec/guid/altguidance.doc


STATE DEPARTMENT OF EDUCATION’S
CURRICULUM INDICATORS SURVEY (CIS) RESULTS

UNIVERSITY OF NORTH CAROLINA AT CHARLOTTE
NATIONAL ALTERNATE ASSESSMENT CENTER
AUGUST 27, 2007

TABLE OF CONTENTS
Introduction and methodology................................................................................................................... 3
Short version: Results................................................................................................................................... 5
Short version: Appendix tables................................................................................................................... 25
Long version: Results ................................................................................................................................... 43
Long version: Appendix tables ................................................................................................................... 48


CURRICULUM INDICATORS SURVEY (CIS) RESULTS
The Curriculum Indicators Survey (CIS) was administered as part of the alignment study on
the Alternate State School Assessment (alternate assessment) conducted by the University of North
Carolina at Charlotte under the auspices of the National Alternate Assessment Center in spring
2007. This report summarizes the methodology and findings from the CIS administration. In
addition, information is provided about the alignment of the enacted curriculum, as reported by
teachers, with the emphases in the alternate assessment.
Methodology
The CIS is a five-part survey designed to measure, through teacher self-report, the enacted
academic curriculum in English language arts (ELA), math, and science, for students with significant
cognitive disabilities who are eligible to take a state’s alternate assessment based on alternate
achievement standards. The CIS is based on the concepts in the Surveys of Enacted Curriculum but
is adapted for the unique needs of this population of teachers and students.
• Part 1 asks for background information on the teacher (e.g., educational experience,
characteristics of case load, and instructional influences in each academic subject).
• In Part 2, teachers provide information about the types of students on their case load, based
on students’ levels of symbolic communication. They are then asked to select a single
student on their case load who will serve as the “target student” for the remaining three
parts of the survey.
• Parts 3-5 measure the English language arts, math, and science curriculum being taught to
the target student during the current academic year. For each academic skill taught, teachers
rate three pieces of information: (1) the intensity of coverage of the topic, (2) the highest
performance expectation (depth of knowledge, or DOK) of the student on the topic, and (3)
the grade level or band from which activities, materials, and contexts were adapted for
instruction on that skill. There are also a few questions in Parts 3-5 about the types of
instructional methods used to teach the academic content.
In this alignment study, both the long and short versions of the CIS were administered. As an
incentive, respondents who completed all five sections were entered into a drawing to receive one of
ten $50 gift cards.
Short Version
Teachers were invited to complete the short version online in May 2007. Eligible teachers
were identified by the State Department of Education, Division of Assessment and Accountability.
Teachers were recruited via email sent directly by the State DOE. The State DOE sent email
invitations to 4,355 email addresses. A follow-up email was sent to six teachers who had completed
parts of the survey, but not the entire survey, five days before the deadline.
Through email and phone correspondence with potential short version participants, it
became clear that some teachers originally on the recruitment list were not eligible, while some
emails were sent to principals who then distributed the notice to teachers whose names were not on
the original list. A precise response rate cannot be determined because of the recruitment methods
used. Because not all teachers completed all sections, sample sizes for each section are reported with
the corresponding results.


CIS topics were reviewed by content experts to determine the match between survey topics
and the State’s content standards (Voluntary State Curriculum and High School Core Learning Goal
topics). These identified matches were used to analyze the alignment of the enacted curriculum as
reported by teachers with the alternate assessment.
Long Version
The long versions for Parts 3-5 were completed by alternate assessment facilitators during a
professional development meeting held in early June, 2007. Whereas teachers completed the short
version with a selected target student in mind, facilitators completed the surveys based on the
academic instruction provided by the “typical” teacher they worked with in 2006-07. They were then
asked to review the content not taught by the typical teacher and indicate whether the “best” teacher
they worked with in 2006-07 had taught that content.
Organization of the Findings
CIS short version results are organized into two sections: (a) respondents’ backgrounds and
(b) the enacted (taught) curriculum. Within the second section, results are reported for each subject
(English language arts, math, and science). Alignment of instructional emphases with emphases in the alternate assessment is reported for ELA and math only, since no alternate assessment in science was administered in 2007. An appendix contains supplemental tables.
Long version results are reported separately, following the short version results. These
findings provide a fine-grained analysis of the instructional priorities for students who took alternate
assessments in 2007. However, as the responses allow for inferences at the teacher level rather than
student level, results are purely descriptive and intended for professional development planning.
Analysis was not conducted on alignment of the CIS long version results with alternate assessment
content.


CIS Results: Short Version


Section 1: Respondents’ Backgrounds
A total of 55 teachers, including 50 (91%) females and 5 (9%) males, completed Part 1 of the CIS. The majority (69%) held master's degrees, while 27% had bachelor's degrees and two respondents (4%) had a six-year degree. Distributions of years of teaching experience are summarized in the table
below.
Percent of Respondents Reporting Years of Teaching Experience

Years of        Total       Teaching students with      Teaching   Teaching   Teaching
experience      Teaching    sig. cog. disabilities      ELA        Math       Science
0-10            40.0        45.5                        50.9       54.5       67.3
11-20           30.9        30.9                        27.3       25.5       30.9
21 or more      29.1        23.6                        11.8       20.0       10.9

While fewer than one-fifth (18.2%) of teachers had three or fewer years of total teaching experience and of teaching students with significant cognitive disabilities, that figure was higher for English language arts and math (27.3% each) and for science (41.8%). Relatively few respondents
held licensure in the academic subjects (13% in ELA, 3.8% in math, 7.5% in science). All but one
respondent (98.2%) held certification in special education, and 32.7% were certified in elementary
education. Fewer respondents had middle or secondary licensure or National Board certification
(9.1% each).
Teachers were also asked to report the amount of time in the past year that they had spent in
professional development on content standards and instructional strategies in each of the three
academic subjects. Response distributions are shown below. The most widely reported professional
development experiences were in ELA instructional strategies, followed by ELA content standards,
math content standards and instructional strategies. Approximately one-fourth of respondents
reported receiving any professional development in science within the previous year.
Time Spent in Professional Development in Past 12 Months

                                               none   1-5     6-10    11-15   > 15
                                                      hours   hours   hours   hours
Instructional strategies in teaching ELA       34.5   25.5    12.7    12.7    14.5
ELA content standards                          41.8   29.1    14.5    5.5     9.1
Instructional strategies in teaching math      56.4   27.3    7.3     3.6     5.5
Math content standards                         60.0   27.3    5.5     1.8     5.5
Instructional strategies in teaching science   80.0   7.3     7.3     5.5     0
Science content standards                      76.4   16.4    5.5     1.8     0

Section 2: Academics
Teachers completed surveys on the enacted curriculum for their students in English language arts
(ELA), math, and science. Fifty-six teachers completed Part 2 of the survey, in which the target
student was identified for Parts 3-5. Of the target students selected, 39% were enrolled in elementary
grades, 34% in middle grades, and 27% in high school. (One student reportedly had no assigned
grade.)
In order to understand the characteristics of the learners selected as target students, respondents
were asked to identify which of the three levels of communication best reflected what the student
could currently do.
Level 1 (awareness/presymbolic): Has not yet acquired the skills to discriminate between
pictures or other symbols (and does not use symbols to communicate). May or may not use
objects to communicate. May or may not use idiosyncratic gestures, sounds/vocalizations, and
movements/touch to communicate with others. A direct and immediate relationship between a
routine activity and the student's response may or may not be apparent. The student may have the
capacity to sort very different objects, though this may be by trial and error. Mouthing and manipulation of
objects leads to knowledge of how objects are used. May combine objects (e.g., place one block
on another).
Level 2 (early symbolic): May use some symbols to communicate (e.g., pictures, logos, objects).
Beginning to acquire symbols as part of a communication system. May have limited emerging
functional academic skills. Representations probably need to be related to the student’s immediate
environment and needs.
Level 3 (symbolic): Communicates with symbols (e.g., pictures) or words (e.g., spoken words,
assistive technology, ASL, home signs). May have emerging or basic functional academic skills.
Emerging writing or graphic representation for the purpose of conveying meaning through
writing, drawing, or computer keying.
The majority of teachers identified target students who had symbolic communication. Although three
target students were reported to be enrolled in grades pK-2, these responses were not excluded from
the descriptive portions of this report, given the small number involved and the possibility of a clerical
error (i.e., a student labeled pK-2 but of the chronological age to be enrolled in alternate assessment-eligible grades).
Communication Levels of Identified Target Students, by Enrolled Grade Band

Assigned grade band   Level 1   Level 2   Level 3   Total
pK-2                  1         1         1         3
3-4                   1         3         8         12
5-6                   2         0         9         11
7-8                   1         4         10        15
9-10                  4         1         9         14
Total                 9         9         37        55

While disability labels are not precise classifications in terms of students’ levels of functioning,
teachers were asked to provide this information about their target students for descriptive purposes.
The most frequently reported categories were mental retardation, multiple disabilities, autism, and
speech/language impairment. None of the respondents selected target students with deaf-blindness,
traumatic brain injury, or serious emotional disturbance.
Disability Labels of Target Students (N = 56)

IDEA Disability Label            % of Target Students
Mental Retardation               67.9
Multiple Disabilities            30.4
Autism                           28.6
Speech / Language Impairment     23.2
Other Health Impairment          12.5
Orthopedic Impairment            10.7
Visual Impairment                7.1
Specific Learning Disability     1.8
Hearing Impairment               1.8

Thus, in general, the target students on whom the remaining descriptions of enacted curriculum are
based are fairly evenly split among elementary, middle, and secondary grades. They primarily have
early symbolic or symbolic communication systems, and nearly one-third have multiple disabilities.
The State Department of Education should consider the remaining results in light of this profile,
particularly how representative these target students are of the full population of students who take the alternate assessment.


ENGLISH LANGUAGE ARTS
A total of 50 teachers completed the English language arts (ELA) section of the CIS, which includes
both reading and writing. This section of the report summarizes teacher responses to the ELA
section as well as ELA-related items from Part I of the survey (general background).

ELA Content

The table below provides an overview of the distributions of depth of knowledge (DOK) expected
of target students for items within each of the four ELA topics. Frequencies represent the number
of items, across target students, for which the content was taught in 2006-07. Distributions of DOK
expectations for each item within each topic are reported in Table E.1 in the appendix.
Distribution of ELA Content Taught, by Depth of Knowledge

                             Attention     Mem/Recall    Perform       Comprehend    Apply         An/Syn/Eval
Topic                   N    n     %       n     %       n     %       n     %       n     %       n     %
Language                237  52    21.9    51    21.5    64    27.0    27    11.4    37    15.6    6     2.5
Reading and Literature  433  128   29.6    122   28.2    93    21.5    43    9.9     39    9.0     8     1.8
Composition             193  32    16.6    50    25.9    70    36.3    20    10.4    19    9.8     2     1.0
Media                   48   22    45.8    15    31.3    4     8.3     2     4.2     5     10.4    0     0.0

The most frequently taught ELA topic was Reading and Literature. Forty-three percent of the
responses within this topic came from items related to beginning reading, understanding texts,
fiction, and nonfiction (see Table E.1). Language and Composition were the other two most
frequently reported ELA topics included in the enacted curriculum for target students in 2006-07.
The highest performance expectations for the target students in the current academic year tended to
be on attending to the content, memorizing or recalling the content, or performing rote tasks related
to the content. Very few of the target students were expected to analyze, synthesize, or evaluate
material.

Grade Level Materials, Activities, and Contexts

After identifying each type of ELA content and DOK at which the target students were taught,
teachers were also asked to identify the grade band or grade from which activities, materials, and
contexts were adapted to teach the corresponding ELA content. The table below summarizes the
distribution of responses to items within each ELA topic. (Respondents could identify more than
one grade band if applicable to the target student.)


The majority of ELA materials were adapted from elementary grades, either pK-2 or 3-5.
Percent of CIS items taught to target student with materials, activities, contexts in each grade band

                             pK-2          3-5           6-8           9-12         No grade band  Specific grade
Topic                   N    n     %       n     %       n     %       n     %      n     %        n     %
Language                280  92    32.9    66    23.6    40    14.3    16    5.7    52    18.6     14    5.0
Reading and Literature  488  143   29.3    144   29.5    113   23.2    43    8.8    37    7.6      8     1.6
Composition             222  70    31.5    62    27.9    48    21.6    17    7.7    25    11.3     0     0.0
Media                   59   19    32.2    13    22.0    11    18.6    4     6.8    11    18.6     1     1.7

Other ELA Instruction Information
Tables E.2 – E.5 in the appendix provide additional results related to ELA instruction. Highlights of
these findings are as follows:
• Instructional activities: The most frequently reported instructional methods used recently
with the target students in ELA were scaffolded instruction with supports, individualized
instruction, the use of manipulatives, and small group instruction. The highest rate of
expected independent, active performance within a lesson was seen in using computers or
assistive technology. Otherwise, fewer than one-fifth of respondents expected the target
student to perform independently in other ELA instructional activities. Instead, they
included some level of support or limited participation within the activity.
• Resources: Teachers reported using a wide range of materials to teach students who take the
Alternate assessment, including materials adapted from general education, teacher-made
materials, and age-appropriate materials designed for students with significant disabilities.
Nearly three-fourths (73%) also reported using assistive technologies. Most respondents also
reported using functional materials (78%) and other school settings (67%), although only
38% said their students received ELA instruction in inclusive settings. The majority of
teachers reported enlisting support from other special education teachers (51%) and
therapeutic support staff (73%) to assist with ELA instruction.
• Instructional influences: The strongest influences on teachers’ choices about ELA
instruction are student needs as documented in IEPs (96% moderate to strong influence),
classroom assessment results (91% moderate to strong influence), and alternate assessment
requirements (89% moderate to strong influence). Lesser influences included national ELA
standards (61% minimal to no influence), and prior alternate assessment results (38%
minimal to no influence).
• Classroom assessment: To assess their students in ELA, teachers reported using
observational data most frequently (80% once per week or more often), followed by
performance on-demand (76% once per week or more often) and objective tests
(65% weekly or more often).


Instructional Alignment

To investigate the alignment of the enacted ELA curriculum with the emphases in the alternate
assessment, CIS responses and alignment expert codes were both linked back to Voluntary State
Curriculum (VSC) and High School Core Learning Goal (CLG) topics. The distributions of CIS
items endorsed within each topic and each DOK level were converted to proportions based on the
total number of items endorsed within the topics. While some CIS items may have included content
related to multiple VSCs, only the best match was selected.
Proportional coverage on the alternate assessment topics was determined by examining content
experts’ ratings of those items during the spring 2007 alignment study. After VSC/CLG topic
matches for each CIS item were identified, comparisons of topic x DOK proportions in the CIS
responses and alternate assessment ratings were made.
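The calculation can be illustrated with a brief Python sketch. This is an illustration only, not project
code: the data structures and names below are hypothetical placeholders for the CIS item endorsements
and the alignment experts' ratings, and the sketch simply mirrors the steps described above (tally
endorsements by topic and DOK, convert each tally to a proportion of the total, and take cell-by-cell
differences).

```python
from collections import Counter

# Hypothetical sketch of the proportional-coverage calculation described above.
# "endorsements" stands in for either the CIS item endorsements or the alignment
# experts' ratings of alternate assessment entries; each record is a (topic, DOK) pair.

DOK_LEVELS = ["Attention", "Memorize/Recall", "Perform",
              "Comprehend", "Apply", "Analyze/Synthesize/Evaluate"]

def proportion_matrix(endorsements):
    """Tally (topic, DOK) pairs and convert each tally to a proportion of the
    total number of endorsements (e.g., N = 576 CIS item endorsements)."""
    counts = Counter(endorsements)
    total = sum(counts.values())
    return {cell: count / total for cell, count in counts.items()}

def discrepancy_matrix(cis_props, aa_props, topics):
    """Cell-by-cell CIS minus alternate assessment proportions; cells with no
    coverage in a matrix are treated as zero."""
    return {(topic, dok): cis_props.get((topic, dok), 0.0) - aa_props.get((topic, dok), 0.0)
            for topic in topics for dok in DOK_LEVELS}
```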
Grades 3-8
The correspondence between CIS topics and State's VSC topics in ELA is shown below. All but
three CIS items matched one of the State VSC topics, and two VSC topics (Fluency and Listening)
had no CIS links.
CIS Topic                                                VSC Topic
Discussion                                               7.1
Questioning, Listening, Contributing                     7.1
Oral Presentation                                        7.1
Vocabulary & Concept Development                         1.3
Structure and Origins of Modern English                  5.1
Formal and Informal English                              7.1
Beginning Reading                                        1.1
Understanding Text                                       1.4
Making Connections                                       3.1
Genre                                                    3.1
Theme                                                    3.1
Fiction                                                  3.1
Nonfiction                                               2.1
Poetry                                                   3.1
Style and Language                                       3.1
Myth, Traditional Narrative, and Classical Literature    3.1
Dramatic Literature                                      3.1
Dramatic Reading and Performance                         7.1
Writing                                                  4.1
Consideration of Audience and Purpose                    4.1
Revising                                                 4.1
Standard English Conventions                             5.1
Organizing Ideas in Writing                              4.1
Research                                                 4.1
Evaluating Writing and Presentations                     None
Analysis of Media                                        None
Media Production                                         None

The following table reflects proportional coverage of CIS ELA items identified as being aligned with
VSC Topics.
CIS Emphases, Grades 3-8 (N = 576 item endorsements, 38 respondents)

                                          Attention  Mem/Recall  Perform  Comprehend  Apply  An/Syn/Eval
1.1 Phonics                               0.005      0.005       0.021    0.002       0.016  0.003
1.2 Fluency                               *          *           *        *           *      *
1.3 Vocabulary                            0.007      0.016       0.009    0.010       0.010  0.002
1.4 Comprehension                         0.010      0.012       0.012    0.009       0.012  0.000
2.1 Comprehension of Informational Text   0.010      0.014       0.014    0.007       0.009  0.000
3.1 Comprehension of Literary Text        0.095      0.115       0.063    0.036       0.007  0.002
4.1 Writing                               0.033      0.056       0.052    0.016       0.014  0.000
5.1 Controlling language                  0.010      0.019       0.031    0.007       0.016  0.002
6.1 Listening                             *          *           *        *           *      *
7.1 Speaking                              0.049      0.043       0.063    0.014       0.040  0.003
* N/A – no match to content standards

The table below shows the proportional content x DOK coverage of alternate assessment Reading
assessments in the alignment study sample. None of the portfolios had ELA items identified at the
attention level of DOK, or in the topics of writing, controlling language, listening, or speaking.
Alternate assessment Emphases (N = 1,551)

                                          Attention  Mem/Recall  Perform  Comprehend  Apply  An/Syn/Eval
1.1 Phonics                                          0.005       0.006    0.001       0.001
1.2 Fluency                                          0.132       0.006    0.063       0.002  0.003
1.3 Vocabulary                                       0.148       0.010    0.047       0.005  0.032
1.4 Comprehension                                    0.035       0.003    0.009       0.002  0.010
2.1 Comprehension of Informational Text              0.187       0.001    0.027       0.015  0.001
3.1 Comprehension of Literary Text                   0.221       0.001    0.027       0.001  0.001
4.1 Writing
5.1 Controlling language
6.1 Listening
7.1 Speaking

On the whole, teacher-reported curriculum emphases covered a broader range of content and DOK
than what was emphasized in the alternate assessment. There were 3 of 45 cells with no coverage on
the CIS ELA items, and 31 of 60 with no coverage on the alternate assessment.


The CIS and alternate assessment matrices were then compared cell by cell to identify areas of
consistency and discrepancy between the enacted curriculum reported by teachers in 2006-07 and
the emphases in the alternate assessment Reading content in the sampled portfolios. The table
below summarizes this comparison. Small discrepancies exist within the Phonics and
Comprehension topics. CIS responses showed lesser emphases at the memorize/recall level,
especially in Vocabulary, Comprehension of Informational Text, and Comprehension of Literary
Text, as indicated by negative numbers. The alternate assessment places less emphasis on
Comprehension of Literary Text at the attention and performance levels of knowledge compared
with teachers’ instructional reports.
Boxes around cells are used to highlight places where the proportional discrepancies were greater
than 0.05 (essentially, more than five percentage points). Topics below the dark line are those not
intended to be measured by the alternate assessment in Reading.
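The flagging rule can be expressed as a short, hypothetical Python sketch that continues the earlier
illustration; the 0.05 threshold is the only value taken from the report, and the discrepancies are
interpreted here as absolute differences.

```python
def flag_large_discrepancies(discrepancies, threshold=0.05):
    """Return the topic-by-DOK cells whose CIS minus alternate assessment
    difference exceeds the threshold in absolute value (the cells the report
    marks with boxes)."""
    return {cell: diff for cell, diff in discrepancies.items() if abs(diff) > threshold}

# Example, using the hypothetical names from the earlier sketch:
# boxed_cells = flag_large_discrepancies(discrepancy_matrix(cis_props, aa_props, topics))
```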
Discrepancy between CIS and Alternate Assessment Emphases (CIS – AA)

                                          Attention  Mem/Recall  Perform  Comprehend  Apply   An/Syn/Eval
1.1 Phonics                               0.005      0.000       0.015    0.000       0.015   0.003
1.2 Fluency                               0.000     -0.132      -0.006   -0.063      -0.002  -0.003
1.3 Vocabulary                            0.007     -0.132      -0.001   -0.037       0.006  -0.030
1.4 Comprehension                         0.010     -0.023       0.009    0.000       0.010  -0.010
2.1 Comprehension of Informational Text   0.010     -0.173       0.013   -0.020      -0.007  -0.001
3.1 Comprehension of Literary Text        0.095     -0.107       0.062    0.009       0.006   0.000
4.1 Writing                               0.033      0.056       0.052    0.016       0.014   0.000
5.1 Controlling language                  0.010      0.019       0.031    0.007       0.016   0.002
6.1 Listening
7.1 Speaking                              0.049      0.043       0.063    0.014       0.040   0.003

Blank cells indicate no coverage in either CIS or Alternate Assessment.
High School
The correspondence between CIS topics and State's CLG topics in ELA is shown below. All but
five CIS items matched one of the State CLG topics.
CIS Topic                                                CLG Topic
Discussion                                               3
Questioning, Listening, Contributing                     3
Oral Presentation                                        3
Vocabulary & Concept Development                         1
Structure and Origins of Modern English                  None
Formal and Informal English                              3
Beginning Reading                                        None
Understanding Text                                       1
Making Connections                                       1
Genre                                                    1
Theme                                                    1
Fiction                                                  1
Nonfiction                                               1
Poetry                                                   1
Style and Language                                       1
Myth, Traditional Narrative, and Classical Literature    1
Dramatic Literature                                      3
Dramatic Reading and Performance                         3
Writing                                                  2
Consideration of Audience and Purpose                    2
Revising                                                 2
Standard English Conventions                             3
Organizing Ideas in Writing                              2
Research                                                 None
Evaluating Writing and Presentations                     4
Analysis of Media                                        None
Media Production                                         None

The following table reflects proportional coverage of CIS ELA items identified as being aligned with
CLG Topics.
CIS Emphases, High School (N = 197 item endorsements; 14 respondents)

                                                 Attention  Mem/Recall  Perform  Comprehend  Apply  An/Syn/Eval
1. Reading, review, and responding to texts      0.162      0.112       0.086    0.056       0.061  0.015
2. Composing in a variety of modes               0.030      0.020       0.066    0.041       0.005  0.005
3. Controlling language                          0.086      0.046       0.107    0.036       0.010  0.025
4. Evaluating content, organization, and
   language use of texts                         0.010      0.005       0.015

The table below shows the proportional content by DOK coverage of Reading alternate assessments
in the alignment study sample. None of the portfolios had evidence identified at the attention level
of DOK, or in the topics of Controlling Language or Evaluating Content, Organization, and
Language Use of Texts. On the whole, teacher-reported curriculum emphases covered a broader
range of content and DOK than what was emphasized in the alternate assessment.
Alternate assessment Emphases (N = 266)

                                                 Attention  Mem/Recall  Perform  Comprehend  Apply  An/Syn/Eval
1. Reading, review, and responding to texts                 0.692       0.045    0.128       0.071  0.041
2. Composing in a variety of modes                          0.015       0.004                       0.004
3. Controlling language
4. Evaluating content, organization, and
   language use of texts

The CIS and alternate assessment matrices were then compared cell by cell to identify areas of
consistency and discrepancy between the enacted curriculum reported by teachers in 2006-07 and
the emphases in the alternate assessment Reading content in the sampled portfolios. The table
below summarizes this comparison. Small discrepancies exist within the Composing topic. CIS
responses showed lesser emphases in the Reading, Reviewing, and Responding to Texts topic at the
memorize/recall and comprehension levels (as indicated by negative numbers), but greater emphasis
on that topic at the attention level.
Boxes around cells are used to highlight places where the proportional discrepancies were greater
than 0.05. Topics below the dark line are those not intended to be measured at the high school level
by the alternate assessment in Reading.
Discrepancy between CIS and Alternate Assessment Reading Emphases (CIS – AA)

                                                 Attention  Mem/Recall  Perform  Comprehend  Apply   An/Syn/Eval
1. Reading, review, and responding to texts      0.162     -0.580       0.041   -0.072      -0.011  -0.026
2. Composing in a variety of modes               0.030      0.005       0.062    0.041       0.005   0.001
3. Controlling language                          0.086      0.046       0.107    0.036       0.010   0.025
4. Evaluating content, organization, and
   language use of texts                         0.010      0.005       0.015

Blank cells indicate no coverage in either CIS or Alternate Assessment.


MATHEMATICS
A total of 47 teachers completed the math section of the CIS. This section summarizes teacher
responses to the math section as well as math-related items from Part I of the survey (general
background).

Math Content

The table below provides an overview of the distributions of depth of knowledge (DOK) expected
of target students for items within each of the five math topics. Frequencies represent the number
of items, across target students, for which the content was taught in 2006-07. Distributions of DOK
expectations for each item within each topic are reported in Table M.1 in the appendix.
Distribution of Math Content Taught, by Depth of Knowledge

                                    Attention     Mem/Recall    Perform       Comprehend    Apply         An/Syn/Eval
Topic                          N    n     %       n     %       n     %       n     %       n     %       n     %
Number Sense and Operations    116  20    17.2    14    12.1    34    29.3    15    12.9    28    24.1    5     4.3
Patterns, Relations, and
  Algebra                      121  35    28.9    19    15.7    33    27.3    11    9.1     20    16.5    3     2.5
Geometry                       128  33    25.8    24    18.8    38    29.7    15    11.7    13    10.2    5     3.9
Measurement                    111  35    31.5    16    14.4    27    24.3    11    9.9     19    17.1    3     2.7
Data Analysis, Statistics,
  and Probability              67   25    37.3    13    19.4    15    22.4    7     10.4    2     3.0     5     7.5

The most frequently taught math topic was Geometry. Roughly one-third of the responses within
this category came from the item related to characteristics of geometric shapes (see Table M.1).
Patterns, Relations, and Algebra; and Number Sense and Operations were the other two most
frequently reported math topics included in the enacted curriculum for target students in 2006-07.
The highest performance expectations for the target students in the current academic year tended to
be on attending to the content, memorizing or recalling the content, or performing rote tasks related
to the content. Roughly one-third of expectations were at higher levels of cognitive demand
(comprehension and application) in Number Sense and Operations, and one-fourth were at higher
levels in Patterns, Relations, and Algebra; Geometry; and Measurement. Very few students were
expected to analyze, synthesize, or evaluate material.

Grade Level Materials, Activities, and Contexts

After identifying each type of math content and DOK at which the target students were taught,
teachers were also asked to identify the grade band or grade from which activities, materials, and
contexts were adapted to teach the corresponding math content. The table below summarizes the
distribution of responses to items within each math topic. (Respondents could identify more than
one grade band if applicable to the target student.)
The majority of math materials were adapted from elementary grades, either preK-2 or 3-5. Between
10% and 15% of items were taught with materials and activities that were not unique to a specific
grade band.


Percent of CIS items taught to target student with materials, activities, contexts in each grade band

                                    pK-2          3-5           6-8          9-12        No grade band  Specific grade
Topic                          N    n     %       n     %       n     %      n     %     n     %        n     %
Number Sense and Operations    135  64    47.4    41    30.4    13    9.6    4     3.0   13    9.6      0     0.0
Patterns, Relations, and
  Algebra                      146  63    43.2    47    32.2    12    8.2    9     6.2   14    9.6      1     0.7
Geometry                       154  71    46.1    44    28.6    17    11.0   6     3.9   16    10.4     0     0.0
Measurement                    131  58    44.3    36    27.5    15    11.5   4     3.1   16    12.2     2     1.5
Data Analysis, Statistics,
  and Probability              80   33    41.3    23    28.8    7     8.8    5     6.3   12    15.0     0     0.0

Other Math Instruction Information
Tables M.2 – M.5 in the appendix provide additional results related to math instruction. Highlights
of these findings are as follows:
• Instructional activities: The most frequently reported instructional methods used recently
with the target students in math were small or large group instruction and the use of
manipulatives to solve problems, followed by individualized instruction. The highest rates of
expected independent, active performance within a lesson were seen in using computers or
calculators, independent work, and rote counting. Otherwise, the expectation for the target
student tended to include some level of support or limited participation rather than
independent performance of skills within the activity.
• Resources: Teachers most often reported using teacher-made materials or commercially
prepared materials adapted from general education in order to teach math lessons. The vast
majority used functional, real-life materials, although fewer than half taught math concepts in
inclusive classrooms. Fewer than half of teachers reported enlisting support from other
special education teachers and therapeutic support staff to assist with math instruction, or
adopting activities and materials used by general educators in their school. Roughly one-fourth reported enlisting support from non-disabled peers.
• Instructional influences: The strongest influences on teachers’ choices about math
instruction are student needs as documented in IEPs (96% moderate or strong influence),
classroom assessment results (87% moderate to strong influence), and alternate assessment
requirements (86% moderate to strong influence). Less endorsed items included national
math standards (38% moderate to strong influence), and math content, materials, and
activities used by general education teachers in the school (44% moderate or strong
influence).
• Classroom assessment: For the purpose of assessing their students who take the alternate
assessment in Mathematics, teachers reported using observational data most frequently (84%
once per week or more frequently), followed by performance on-demand (73% once per
week or more often). Approximately half (53%) reported frequent use of objective tests for
assessment purposes.


Instructional Alignment
To investigate the alignment of the enacted math curriculum with the emphases in the alternate
assessment, CIS responses and alignment expert codes were both linked back to VSC/CLG topics.
The distributions of CIS items endorsed within each topic and each DOK level were converted to
proportions based on the total number of items endorsed within the topics. While some CIS items
may have included content related to multiple VSCs, only the best match was selected.
Proportional coverage on the alternate assessment topics was determined by examining content
experts’ ratings of those items during the spring 2007 alignment study. After VSC/CLG topic
matches for each CIS item were identified, comparisons of topic by DOK proportions in the CIS
responses and alternate assessment ratings were made.
Grades 3-8
The correspondence between CIS topics and State's Math VSC topics is shown below. All CIS
items matched one of the State VSC topics, and one VSC topic (Process of Mathematics) had no
direct CIS links.
CIS Topic                                                VSC Topic
Number Sense                                             6
Operations                                               6
Computation and Estimation                               6
Patterns, relations, and functions                       1
Algebra                                                  1
Relations and mathematical models                        1
Variables and change                                     1
Characteristics of geometric shapes                      2
Spatial relationships and coordinate geometry            2
Transformation and symmetry                              2
Visualization/spatial reasoning/geometric modeling       2
Measurement tools                                        3
Concepts and attributes of measurement                   3
Formulas of measurement                                  3
Data and statistics                                      4
Probability                                              5

The following table reflects proportional coverage of CIS Math items identified as being aligned
with VSC Topics.
CIS Math Emphases, Grades 3-8 (N = 368 item endorsements, 38 respondents)

                                     Attention  Mem/Recall  Perform  Comprehend  Apply  An/Syn/Eval
1. Algebra, patterns, and functions  0.057      0.041       0.060    0.0245      0.043  0.003
2. Geometry                          0.041      0.049       0.071    0.0326      0.035  0.014
3. Measurement                       0.049      0.033       0.052    0.0272      0.038  0.003
4. Statistics                        0.022      0.008       0.024    0.0054      0.003  0.005
5. Probability                       0.022      0.016       0.005    0.0054      0.003
6. Number relationships and
   computation/arithmetic            0.022      0.027       0.063    0.038       0.052  0.008
7. Process of mathematics

The table below shows the proportional content by DOK coverage of Math alternate assessments in
the grades 3-8 alignment study sample.
Math Alternate Assessment Emphases, Grades 3-8 (N = 1,421)

                                     Attention  Mem/Recall  Perform  Comprehend  Apply  An/Syn/Eval
1. Algebra, patterns, and functions             0.008       0.359    0.003       0.203  0.427
2. Geometry                                     0.081       0.565    0.063       0.063  0.164
3. Measurement                                  0.010       0.318    0.005       0.057  0.508
4. Statistics                                   0.005       0.299    0.000       0.018  0.544
5. Probability
6. Number relationships and
   computation/arithmetic                       0.143       0.346    0.008       0.378  0.122
7. Process of mathematics

On the whole, teacher-reported curriculum emphases in grades 3-8 covered a broader range of
content and DOK than what was emphasized in the alternate assessment. While Probability had
some emphasis on the CIS, it was not represented in the alternate assessment sample. Processes of
mathematics were represented in neither the CIS nor the alternate assessment portfolios.
The CIS and alternate assessment matrices were then compared cell by cell to identify areas of
consistency and discrepancy between the enacted curriculum reported by teachers in 2006-07 and
the emphases in the alternate assessment math samples from grades 3-8. The table below
summarizes this comparison. CIS responses showed greater emphases at the attention and
comprehension levels, as indicated by positive numbers in those DOK columns. The alternate
assessment items had greater emphasis at the performance, application, and
analysis/synthesis/evaluation levels than what teachers report teaching to the target students this
year.


Boxes around cells are used to highlight places where the proportional discrepancies were greater
than 0.05 (essentially, more than five percentage points).
Discrepancy between CIS and Grades 3-8 Alternate Assessment Math Emphases (CIS – AA)

                                     Attention  Mem/Recall  Perform  Comprehend  Apply   An/Syn/Eval
1. Algebra, patterns, and functions  0.057      0.033      -0.300    0.022      -0.160  -0.424
2. Geometry                          0.041     -0.032      -0.494   -0.030      -0.027  -0.150
3. Measurement                       0.049      0.022      -0.266    0.022      -0.019  -0.505
4. Statistics                        0.022      0.003      -0.275    0.005      -0.016  -0.539
5. Probability                       0.022      0.016       0.005    0.005       0.003
6. Number relationships and
   computation/arithmetic            0.022     -0.116      -0.284    0.030      -0.326  -0.114
7. Process of mathematics

Blank cells indicate no coverage in either CIS or Alternate Assessment.
High School
The correspondence between CIS topics and State's Math CLG topics is shown below. All but
three CIS items matched one of the State CLG topics, and all CLG topics had direct CIS links.
CIS Topic                                                CLG Topic
Number Sense                                             None
Operations                                               None
Computation and Estimation                               None
Patterns, relations, and functions                       1
Algebra                                                  1
Relations and mathematical models                        1
Variables and change                                     1
Characteristics of geometric shapes                      2
Spatial relationships and coordinate geometry            2
Transformation and symmetry                              2
Visualization/spatial reasoning/geometric modeling       2
Measurement tools                                        2
Concepts and attributes of measurement                   2
Formulas of measurement                                  2
Data and statistics                                      3
Probability                                              3

The following table reflects proportional coverage of CIS Math items identified as being aligned
with CLG Topics.
CIS Math Emphases, High School (N = 110 item endorsements, 14 respondents)

                                          Attention  Mem/Recall  Perform  Comprehend  Apply  An/Syn/Eval
1. Functions and Algebra                  0.118      0.027       0.082    0.0091      0.036  0.018
2. Geometry, Measurement, and Reasoning   0.282      0.073       0.100    0.0182      0.045  0.018
3. Data analysis and probability          0.082      0.036       0.009    0.0182             0.027

The table below shows the proportional content x DOK coverage of Math alternate assessments in
the alignment study sample.
Math Alternate Assessment Emphases, High School (N = 315)

                                          Attention  Mem/Recall  Perform  Comprehend  Apply  An/Syn/Eval
1. Functions and Algebra                             0.025       0.244    0.025       0.067  0.327
2. Geometry, Measurement, and Reasoning              0.041       0.108                0.063  0.098
3. Data analysis and probability

On the whole, teacher-reported curriculum emphases covered a broader range of content and DOK
than what was emphasized in the alternate assessment. CIS content covered all three topics and all
levels of DOK, with the exception of the application level within Data Analysis and Probability. The
alternate assessment sample included no evidence of Data Analysis and Probability, or any content
assessed at the attention level.
The CIS and alternate assessment matrices were then compared cell by cell to identify areas of
consistency and discrepancy between the enacted curriculum reported by teachers in 2006-07 and
the emphases in the high school alternate assessment math samples. The table below summarizes
this comparison. CIS responses showed greater emphases at the attention and memorize/recall
levels, as indicated by positive numbers in those DOK columns. The alternate assessment evidence
had greater emphasis on Functions and Algebra, and in content at the higher levels of DOK than
what teachers report teaching to the target students this year.
Boxes around cells are used to highlight places where the proportional discrepancies were greater
than 0.05.
Discrepancy between CIS and Alternate Assessment Emphases, High School (CIS – AA)

                                          Attention  Mem/Recall  Perform  Comprehend  Apply   An/Syn/Eval
1. Functions and Algebra                  0.118      0.002      -0.163   -0.016      -0.030  -0.309
2. Geometry, Measurement, and Reasoning   0.282      0.031      -0.008    0.018      -0.018  -0.080
3. Data analysis and probability          0.082      0.036       0.009    0.018               0.027

Blank cells indicate no coverage in either CIS or Alternate Assessment.


SCIENCE
A total of 47 teachers completed the science section of the CIS. This section summarizes teacher
responses to the science section as well as science-related items from Part I of the survey (general
background).

Science Content

The table below provides an overview of the distributions of depth of knowledge (DOK) expected
of target students for items within each of the six science topics. Frequencies represent the number
of items, across target students, for which the content was taught in 2006-07. Distributions of DOK
expectations for each item within each topic are reported in Table S.1 in the appendix.
Distribution of Science Content Taught, by Depth of Knowledge

                               Attention     Mem/Recall    Perform       Comprehend    Apply         An/Syn/Eval
Topic                     N    n     %       n     %       n     %       n     %       n     %       n     %
Earth and Space Science   88   30    34.1    15    17.0    16    18.2    17    19.3    7     8.0     3     3.4
Life Science (Biology)    219  98    44.7    44    20.1    31    14.2    28    12.8    13    5.9     5     2.3
Physical Science
  (Chemistry & Physics)   110  38    34.5    25    22.7    24    21.8    15    13.6    4     3.6     4     3.6
Technology/Engineering    23   6     26.1    4     17.4    6     26.1    2     8.7     4     17.4    1     4.3
History/Nature of Science 47   19    40.4    12    25.5    12    25.5    2     4.3     0     0.0     2     4.3
Science as inquiry        32   13    40.6    6     18.8    8     25.0    2     6.3     1     3.1     2     6.3

The most frequently taught science subject was life science. The most frequent responses within
this category were for the items related to personal and community health (18%), characteristics of
organisms (18%), and environments, populations, and ecosystems (15%; see Table S.1). Physical
Science and Earth and Space Science were the other two most frequently reported science topics
included in the enacted curriculum for target students in 2006-07. In most topics, the highest
performance expectation for more than half the target students in the current academic year was
either attending to the content, or memorizing and recalling the content. There was a higher
proportion expected to apply knowledge in Technology/Engineering compared with other topics.
Very few students were required to analyze, synthesize, or evaluate material.

Grade Level Materials, Activities, and Contexts

After identifying each type of science content and DOK at which the target students were taught,
teachers were also asked to identify the grade band or grade from which activities, materials, and
contexts were adapted to teach the corresponding science content. The table below summarizes the
distribution of responses to items within each science topic. (Respondents could identify more than
one grade band if applicable to the target student.)
More than half of science materials, activities, and contexts were adapted from elementary grade
bands, with the exception of Technology/Engineering (48% from elementary grade bands). The
Technology/Engineering topic also had the highest proportion of materials, activities, and contexts
taught that were not linked to a specific grade band (24%).


Percent of CIS items taught to target student with materials, activities, contexts in each grade band

                               pK-2          3-5           6-8           9-12         No grade band  Specific grade
Topic                     N    n     %       n     %       n     %       n     %      n     %        n     %
Earth and Space Science   114  39    34.2    32    28.1    23    20.2    8     7.0    8     7.0      4     3.5
Life Science (Biology)    290  98    33.8    69    23.8    63    21.7    26    9.0    24    8.3      10    3.4
Physical Science
  (Chemistry & Physics)   145  50    34.5    37    25.5    30    20.7    11    7.6    7     4.8      10    6.9
Technology/Engineering    29   9     31.0    5     17.2    4     13.8    3     10.3   7     24.1     1     3.4
History/Nature of Science 57   19    33.3    16    28.1    14    24.6    4     7.0    0     0.0      4     7.0
Science as inquiry        40   17    42.5    10    25.0    7     17.5    3     7.5    1     2.5      2     5.0

Other Science Instruction Information
Tables S.2 – S.5 in the appendix provide additional results related to science instruction. Highlights
of these findings are as follows:
• Instructional activities: The most frequently reported instructional methods used recently
with the target students in science were small group instruction, scaffolded instruction with
supports, and the use of hands-on materials and manipulatives. Science instruction may not
have a large emphasis in target students’ overall educational program, as evidenced by the
high rates at which science methods were reported to have been used one hour or less in the
past week, or not at all. When certain science instruction methods were used, the expectation
for the target student tended to include some level of support or limited participation, rather
than independent, active performance within the lesson.
• Resources: Teachers most often reported using teacher-made materials or commercially
prepared materials adapted from general education in order to teach science lessons. Many
used functional, real-life materials (69%), although fewer taught science concepts in real-life
settings. Fewer than half of teachers reported enlisting support from general educators,
support staff, or nondisabled peers to assist with science instruction.
• Instructional influences: The strongest influences on teachers’ choices about science
instruction are student needs as documented in IEPs (96% moderate to strong influence),
classroom assessment results (91% moderate to strong influence), and alternate assessment
requirements (89% moderate to strong influence). The items most often rated as having
minimal to no influence on respondents’ science instructional choices were national science
standards and science content used by general education teachers at the school.
• Classroom assessment: For the purpose of assessing their students in science, teachers
reported using observational data most frequently (69% once per week or more frequently),
followed by performance on-demand (56% once per week or more often) and objective tests
(49%).



CIS SHORT VERSION: CONCLUSIONS
Teachers who completed the short version of the CIS are teaching a broad range of content in
English language arts, math, and science. In ELA, the range of content includes topics covered by
the alternate assessment in Reading as well as other components of ELA, although the greatest
emphasis was in Reading. In science and math, the range of content taught was also broader than
what was emphasized in the alternate assessment. In general, there was also a range of DOK
reported for this sample of target students. In some cases, the span of DOK was wider than what
was reflected in the alternate assessments.
The State Department of Education may want to further consider discrepancies between the
symbolic communication skills of students in the sample and evidence of high expectations in
instruction. For example, while the majority of target students in the sample (84%) had early
symbolic or symbolic communication, teachers frequently reported teaching content at the
“attention” level – requiring only eye gaze, vocalization, or some other form of minimal, intentional
response. Similarly, there were low rates of expected independent, active participation of these
students in most instructional activities.
According to federal guidelines, alternate assessments judged against alternate academic achievement
standards are supposed to be aligned to grade-level expectations; however, the activities, materials,
and contexts teachers report using during instruction tend to be adapted from grades pK-2 or 3-5.
The frequency with which materials were adapted from high school was not consistent with the
composition of the target student group identified for this study (27% high school). In order to
provide instruction that is more consistent with the content of alternate assessments aligned to grade
level expectations, teachers may require more professional development on how to adapt materials
and activities from grade levels that match the chronological age of their students.
Alignment results should be interpreted with caution, based on the low response rates and
characteristics of the target students identified for the survey (e.g., more elementary and middle
grades than high school; primarily students with some symbolic communication). Areas of
discrepancy between instructional and alternate assessment emphases are identified for State DOE’s
formative use, and are not intended to be conclusive, summative statements about the quality of
alignment of instruction with the assessment.
Finally, teachers’ responses to survey questions about instructional influences suggest that there may
be room for growth in their ways of building access to the general curriculum. While most report
that state standards have a strong influence on what they teach, respondents give comparatively little
weight to what their general education counterparts are teaching in the content areas or to the
general education academic priorities within their school or district.
Increasing student access to the general education curriculum and better aligning instruction in order
to increase academic achievement may require more professional development and strengthened
relationships with general educators in the same schools.


Short version: Appendix

English Language Arts
E1  Distribution of ELA Content Taught, by Depth of Knowledge
E2  ELA Instructional Methods and Level of Student Participation
E3  Percent of Teachers Using Various Resources to Teach ELA
E4  Teacher-Reported Influences on ELA Instruction
E5  Frequency of Use of Classroom Assessments – ELA

Math
M1  Distribution of Math Content Taught, by Depth of Knowledge
M2  Math Instructional Methods and Level of Student Participation
M3  Percent of Teachers Using Various Resources to Teach Math
M4  Teacher-Reported Influences on Math Instruction
M5  Frequency of Use of Classroom Assessments – Math

Science
S1  Distribution of Science Content Taught, by Depth of Knowledge
S2  Science Instructional Methods and Level of Student Participation
S3  Percent of Teachers Using Various Resources to Teach Science
S4  Teacher-Reported Influences on Science Instruction
S5  Frequency of Use of Classroom Assessments – Science

In each subject, the first two tables are based on the academic section of the CIS (Part 3, 4, or 5;
referenced to the target student), while the last three are based on Part 1 (general classroom
information; not about a specific target student).


Table E.1. Distribution of ELA Content Taught, by Depth of Knowledge (N = 50)

Each row shows N followed by n (%) for Attention, Memorize/Recall, Perform, Comprehend, Apply,
and Analyze/Synthesize/Evaluate.

LANGUAGE
A1 Discussion (discussion rules, group interactions): N = 45; 10 (22.2), 9 (20.0), 13 (28.9), 2 (4.4), 9 (20.0), 2 (4.4)
A2 Questioning, Listening, and Contributing (class discussion contributions, gathering information): N = 47; 9 (19.1), 7 (14.9), 12 (25.5), 10 (21.3), 7 (14.9), 2 (4.3)
A3 Oral Presentation (presentation elements and techniques, presentation preparation): N = 39; 13 (33.3), 4 (10.3), 15 (38.5), 3 (7.7), 4 (10.3), 0 (0.0)
A4 Vocabulary and Concept Development (antonyms, synonyms, compound words, prefixes, suffixes, dictionary use, use in context): N = 47; 9 (19.1), 12 (25.5), 7 (14.9), 9 (19.1), 9 (19.1), 1 (2.1)
A5 Structure and Origins of Modern English (grammar, mechanics, parts of speech): N = 32; 6 (18.8), 9 (28.1), 11 (34.4), 2 (6.3), 4 (12.5), 0 (0.0)
A6 Formal and Informal English (standard vs. conversational language): N = 27; 5 (18.5), 10 (37.0), 6 (22.2), 1 (3.7), 4 (14.8), 1 (3.7)
Total: N = 237; 52 (21.9), 51 (21.5), 64 (27.0), 27 (11.4), 37 (15.6), 6 (2.5)

READING AND LITERATURE
B1 Beginning Reading (letters, handling of a book, phonemic awareness, letter/sound combinations, decode words): N = 45; 7 (15.6), 6 (13.3), 17 (37.8), 2 (4.4), 11 (24.4), 2 (4.4)
B2 Understanding a Text (predictions, retell stories, cause/effect, story elements, imagery, symbolism): N = 47; 10 (21.3), 10 (21.3), 10 (21.3), 6 (12.8), 9 (19.1), 2 (4.3)
B3 Making Connections (compare authors, illustrators, settings): N = 40; 8 (20.0), 13 (32.5), 11 (27.5), 3 (7.5), 5 (12.5), 0 (0.0)
B4 Genre (forms of literature: poetry, prose, fiction, nonfiction, drama): N = 39; 13 (33.3), 15 (38.5), 5 (12.8), 4 (10.3), 2 (5.1), 0 (0.0)
B5 Theme (lessons of folktales, fables, myths, theme identification): N = 31; 11 (35.5), 8 (25.8), 6 (19.4), 5 (16.1), 1 (3.2), 0 (0.0)
B6 Fiction (plot, character, setting identification of stories): N = 47; 12 (25.5), 16 (34.0), 7 (14.9), 8 (17.0), 3 (6.4), 1 (2.1)
B7 Nonfiction (meaning, prediction, and fact identification of informational material): N = 46; 12 (26.1), 10 (21.7), 8 (17.4), 7 (15.2), 8 (17.4), 1 (2.2)
B8 Poetry (rhythm and rhyme, repetition, imagery, figurative language): N = 33; 15 (45.5), 11 (33.3), 6 (18.2), 1 (3.0), 0 (0.0), 0 (0.0)
B9 Style and Language (words that appeal to the senses, imagery, figurative language, flow): N = 29; 7 (24.1), 10 (34.5), 7 (24.1), 5 (17.2), 0 (0.0), 0 (0.0)
B10 Myth, Traditional Narrative, and Classical Literature (characters in mythology, adventures/exploits of characters): N = 26; 11 (42.3), 10 (38.5), 3 (11.5), 2 (7.7), 0 (0.0), 0 (0.0)
B11 Dramatic Literature (elements of dialogue, elements of drama, role play): N = 25; 12 (48.0), 6 (24.0), 6 (24.0), 0 (0.0), 0 (0.0), 1 (4.0)
B12 Dramatic Reading and Performance (rehearsal and performance of stories, plays, poems, voice inflection): N = 25; 10 (40.0), 7 (28.0), 7 (28.0), 0 (0.0), 0 (0.0), 1 (4.0)
Total: N = 433; 128 (29.6), 122 (28.2), 93 (21.5), 43 (9.9), 39 (9.0), 8 (1.8)

COMPOSITION
C1 Writing (use of pictures, letters, words to write stories, poems, letters, reports): N = 38; 5 (13.2), 3 (7.9), 17 (44.7), 6 (15.8), 7 (18.4), 0 (0.0)
C2 Consideration of Audience and Purpose (language to match audience and purpose: entertain, persuade, inform): N = 25; 6 (24.0), 9 (36.0), 4 (16.0), 4 (16.0), 2 (8.0), 0 (0.0)
C3 Revising (clarification/rethinking for logic and expression): N = 27; 7 (25.9), 9 (33.3), 10 (37.0), 0 (0.0), 1 (3.7), 0 (0.0)
C4 Standard English Conventions (legible print/cursive, spacing of words, spelling, end marks, punctuation): N = 32; 1 (3.1), 5 (15.6), 14 (43.8), 4 (12.5), 7 (21.9), 1 (3.1)
C5 Organizing Ideas in Writing (order of events, details, logical progression): N = 31; 4 (12.9), 9 (29.0), 11 (35.5), 5 (16.1), 1 (3.2), 1 (3.2)
C6 Research (gather information about a topic, steps of conducting research): N = 22; 4 (18.2), 9 (40.9), 8 (36.4), 0 (0.0), 1 (4.5), 0 (0.0)
C7 Evaluating Writing and Presentations (decisions and judgments about writing; use of scoring rubrics): N = 18; 5 (27.8), 6 (33.3), 6 (33.3), 1 (5.6), 0 (0.0), 0 (0.0)
Total: N = 193; 32 (16.6), 50 (25.9), 70 (36.3), 20 (10.4), 19 (9.8), 2 (1.0)

MEDIA
D1 Analysis of Media (text/film/play/website comparison): N = 22; 9 (40.9), 9 (40.9), 1 (4.5), 2 (9.1), 1 (4.5), 0 (0.0)
D2 Media Production (PowerPoint or other technological presentation, video/audio tape): N = 26; 13 (50.0), 6 (23.1), 3 (11.5), 0 (0.0), 4 (15.4), 0 (0.0)
Total: N = 48; 22 (45.8), 15 (31.3), 4 (8.3), 2 (4.2), 5 (10.4), 0 (0.0)

Table E.2. ELA Instructional Methods and Level of Target Student’s Participation (N = 50)
ELA/reading instructional time
during the past week in which the
target student engaged in each of
the following

0
None

Receive individualized instruction
Receive instruction in a small
group
Collect, summarize, or analyze
information
Engage in writing process
Learn to use resources
Use hands-on or manipulatives
Receive instruction with prompts
or scaffolded support
Use computers or other assistive
technology
Work independently
Perform assessment skills for data
collection/grading
Take a test
Practice skills in different setting
Practice skills with a variety of
similar materials
Engage in read aloud activities
View multi media presentations
Engage in speech or presentation
Use work center
Learn/demonstrate skills in
repeated opportunity/direct
instruction trials

2
Some
(2-4
hours last
week)
24.0
18.0

3
Moderate
(5-7 hours
last week)

0
2.0

1
Little
(1 hour
or less
last week)
16.0
2.0

22.0

28.0

32.0
32.0
4.0
2.0

Level of Student Participation
P
AS
IA
Passive
Active
Independent
ParticiParticipaActive
pation
tion with
Participation
Supports
6.0
80.0
12.0
12.0
66.0
18.0

N
No
Participation

20.0
34.0

4
Considerable
(8 or more
hours last
week)
40.0
44.0

24.0

18.0

8.0

34.0

26.0

36.0

4.0

14.0
30.0
10.0
6.0

24.0
24.0
6.0
12.0

20.0
14.0
38.0
28.0

10.0
0
42.0
52.0

36.0
34.0
2.0
2.0

10.0
28.0
10.0
12.0

50.0
36.0
72.0
80.0

4.0
2.0
16.0
6.0

2.0

20.0

30.0

32.0

16.0

4.0

18.0

52.0

26.0

20.0
20.0

42.0
20.0

28.0
38.0

10.0
18.0

0
4.0

34.0
26.0

16.0
18.0

32.0
50.0

18.0
6.0

50.0
18.0
8.0

36.0
26.0
26.0

6.0
42.0
44.0

6.0
10.0
18.0

2.0
4.0
4.0

52.0
14.0
10.0

8.0
28.0
28.0

30.0
54.0
58.0

10.0
4.0
4.0

16.0
24.0
62.0
42.0
8.0

38.0
32.0
24.0
10.0
16.0

14.0
26.0
4.0
32.0
28.0

22.0
10.0
2.0
12.0
30.0

10.0
8.0
8.0
4.0
18.0

22.0
28.0
56.0
40.0
8.0

22.0
28.0
16.0
14.0
16.0

48.0
28.0
24.0
40.0
62.0

8.0
16.0
4.0
6.0
14.0

2.0
4.0


Table E.3. Percent of Teachers Using Various Resources to Teach ELA (N = 55)

Percent reporting each resource was used to teach ELA/Reading:

Materials
  Commercially made materials adapted (by you or someone else) from general education: 89.1
  Commercially made manipulatives adapted (by you or someone else) from general education: 65.5
  Age-appropriate, commercially made print or text materials designed for this type of student: 72.7
  Age-appropriate, commercially made manipulatives designed for this type of student: 54.5
  Other commercially made print or text materials designed for this type of student: 54.5
  Other commercially made age-appropriate manipulatives designed for this type of student: 45.5
  Teacher-made books, workbooks, materials: 96.4
  Teacher-made manipulatives: 89.1
  Materials or lessons from websites: 80.0
  Computer: 81.8
  Assistive technologies (e.g., CheapTalk, Big Mac, Dynavox, text reader, talking calculator, etc.): 72.7

Settings
  Real life or natural setting materials (e.g., coins, community signs, telephones): 78.2
  Inclusive class setting: 38.2
  Other settings in my school: 67.3
  Other settings in the community: 54.5

People
  Nondisabled peers: 27.3
  Teachers from other disciplines (e.g., academic or special subject areas): 34.5
  Another staff member at the school (e.g., speech/occupational/physical therapist): 72.7
  Other special education teachers: 50.9


Table E.4. Teacher-Reported Influences on ELA Instruction (N = 55)

                                                          No         Minimal    Moderate   Strong
                                                          influence  influence  influence  influence
State curriculum framework or content standards            0          27.3       25.5       47.3
Instructional materials                                     1.9        14.8       37.0       46.3
State alternate assessment requirements                     0          10.9       25.5       63.6
State alternate assessment results from previous years     18.2        20.0       36.4       25.5
National ELA standards                                     31.5        29.6       24.1       14.8
ELA content, materials, and/or activities used by
  general education teachers in my school                  33.3        20.4       25.9       20.4
Training from my degree program (undergraduate or
  graduate)                                                18.2        18.2       34.5       29.1
Students' needs as documented on IEPs                       0           3.6        3.6       92.7
School or district initiatives or priorities                7.4        29.6       27.8       35.2
Principal or other administrator expectations               7.3        25.5       36.4       30.9
Professional development experiences                        5.5        18.2       49.1       27.3
Classroom assessment results                                0           9.1       18.2       72.7

Table E.5. Percent Reporting Frequency of Use of Classroom Assessments – ELA (N = 55)

                                                    Not at  < 1 time   1-4 times  1-4 times  > 4 times
                                                    all     per month  a month    a week     a week
Objective questions (e.g., true/false, multiple
  choice, yes/no)                                   11.1    11.1       13.0       35.2       29.6
Performance on-demand (e.g., task analysis steps,
  repeated trials, incidence recording)             5.5     1.8        16.4       32.7       43.6
Teacher observation (e.g., anecdotal or
  descriptive data)                                 1.8     7.3        10.9       20.0       60.0


Table M.1. Distribution of Math Content Taught, by Depth of Knowledge (N = 47)

Each row shows N followed by n (%) for Attention, Memorize/Recall, Perform, Comprehend, Apply,
and Analyze/Synthesize/Evaluate.

NUMBER SENSE AND OPERATIONS
A1 Number Sense (whole numbers, fractions, odd & even, sorting, matching, grouping, ordering; money): N = 47; 8 (17.0), 5 (10.6), 11 (23.4), 5 (10.6), 17 (36.2), 1 (2.1)
A2 Operations (+, -, x, /, commutative properties, order of operations): N = 35; 6 (17.1), 4 (11.4), 12 (34.3), 5 (14.3), 6 (17.1), 2 (5.7)
A3 Computation and Estimation (comparisons, rounding, properties of addition, subtraction, multiplication, division): N = 34; 6 (17.6), 5 (14.7), 11 (32.4), 5 (14.7), 5 (14.7), 2 (5.9)
Total: N = 116; 20 (17.2), 14 (12.1), 34 (29.3), 15 (12.9), 28 (24.1), 5 (4.3)

PATTERNS, RELATIONS, AND ALGEBRA
B1 Patterns, Relations, and Functions (identify, reproduce, create, count in patterns): N = 46; 8 (17.4), 4 (8.7), 15 (32.6), 5 (10.9), 12 (26.1), 2 (4.3)
B2 Algebra (symbolic representations, variables, algebraic equations): N = 27; 8 (29.6), 7 (25.9), 6 (22.2), 2 (7.4), 3 (11.1), 1 (3.7)
B3 Relationships and Mathematical Models (equivalent measurements, mathematical relationships, proportions): N = 32; 11 (34.4), 5 (15.6), 9 (28.1), 3 (9.4), 4 (12.5), 0 (0.0)
B4 Variables and Change (process and rates of change, linear equations): N = 16; 8 (50.0), 3 (18.8), 3 (18.8), 1 (6.3), 1 (6.3), 0 (0.0)
Total: N = 121; 35 (28.9), 19 (15.7), 33 (27.3), 11 (9.1), 20 (16.5), 3 (2.5)

GEOMETRY
C1 Characteristics of Geometric Shapes (two and three dimensional shapes, congruent shapes): N = 42; 10 (23.8), 6 (14.3), 11 (26.2), 7 (16.7), 5 (11.9), 3 (7.1)
C2 Spatial Relationships/Coordinate Geometry (coordinates, points on a line): N = 27; 7 (25.9), 4 (14.8), 8 (29.6), 4 (14.8), 3 (11.1), 1 (3.7)
C3 Transformation/Symmetry (flipped, turned shapes, line and rotational symmetry): N = 29; 6 (20.7), 11 (37.9), 7 (24.1), 2 (6.9), 2 (6.9), 1 (3.4)
C4 Visualization/Spatial Reasoning/Geometric Modeling (assembled and dissembled shapes, use of tools (e.g., ruler, compass) to create geometric figures): N = 30; 10 (33.3), 3 (10.0), 12 (40.0), 2 (6.7), 3 (10.0), 0 (0.0)
Total: N = 128; 33 (25.8), 24 (18.8), 38 (29.7), 15 (11.7), 13 (10.2), 5 (3.9)

MEASUREMENT
D1 Measurement Tools (clock, calendar, cylinder, tape measure, ruler): N = 46; 10 (21.7), 7 (15.2), 13 (28.3), 4 (8.7), 9 (19.6), 3 (6.5)
D2 Concepts and Attributes of Measurement (length, weight, volume, capacity): N = 44; 18 (40.9), 3 (6.8), 12 (27.3), 3 (6.8), 8 (18.2), 0 (0.0)
D3 Formulas of Measurement (area, perimeter, radius, diameter, circumference): N = 21; 7 (33.3), 6 (28.6), 2 (9.5), 4 (19.0), 2 (9.5), 0 (0.0)
Total: N = 111; 35 (31.5), 16 (14.4), 27 (24.3), 11 (9.9), 19 (17.1), 3 (2.7)

DATA ANALYSIS, STATISTICS, AND PROBABILITY
E1 Data and Statistics (data collection and organization, mean, median, mode, use of plots and graphs): N = 37; 13 (35.1), 5 (13.5), 12 (32.4), 2 (5.4), 1 (2.7), 4 (10.8)
E2 Probability (cause/effect, probabilities, combinations of potential outcomes): N = 30; 12 (40.0), 8 (26.7), 3 (10.0), 5 (16.7), 1 (3.3), 1 (3.3)
Total: N = 67; 25 (37.3), 13 (19.4), 15 (22.4), 7 (10.4), 2 (3.0), 5 (7.5)


Table M.2. Math Instructional Methods and Level of Target Student’s Participation (N = 47)
Amount of math instructional
time during the past week in
which the target student engaged
in each of the following…
Receive individualized instruction
Receive instruction in a small or
large group
Collect, summarize, or analyze
information
Complete symbolic math
problems
Learn to use resources
Use hands-on or manipulatives to
count or solve mathematical
problems
Receive instruction with prompts
or scaffolded support
Use computers, calculators or
other assistive technology
Work independently
Perform assessment skills for data
collection/grading
Take a test
Practice skills in different setting
Rote count
Practice skills with a variety of
materials
Apply mathematical concepts to
real world applications
Use work center
Learn/demonstrate skills in
repeated opportunity/direct
instruction trials

None

Some
(2-4 hours
last week)

Moderate
(5-7 hours
last week)

0
4.3

Little
(1 hour
or less
last week)
10.6
6.4

36.2
19.1

29.8

21.3

29.8

Level of student participation*
Passive
Active
Independent
ParticiParticipaActive
pation
tion with
Participation
Supports
10.6
80.9
8.5
8.5
74.5
12.8

No
Participation

23.4
34.0

Considerable
(8 or more
hours last
week)
29.8
36.2

38.3

10.6

0

36.2

21.3

42.6

0

23.4

21.3

23.4

2.1

27.7

12.8

51.1

8.5

29.8
4.3

29.8
8.5

21.3
17.0

17.0
34.0

2.1
36.2

31.9
2.1

21.3
12.8

44.7
68.1

2.1
17.0

2.1

12.8

19.1

36.2

29.8

0

12.8

83.0

4.3

10.6

17.0

31.9

14.9

25.5

12.8

6.4

59.6

21.3

34.0
27.7

23.4
34.0

19.1
21.3

21.3
14.9

2.1
2.1

29.8
27.7

14.9
12.8

38.3
59.6

17.0
0

53.2
19.1
27.7
4.3

23.4
29.8
23.4
14.9

14.9
23.4
25.5
40.4

4.3
27.7
14.9
27.7

4.3
0
8.5
12.8

53.2
17.0
25.5
4.3

8.5
14.9
4.3
14.9

34.0
61.7
53.2
70.2

4.3
6.4
17.0
10.6

6.4

25.5

36.2

23.4

8.5

6.4

21.3

68.1

4.3

40.4
6.4

25.5
27.7

10.6
21.3

17.0
34.0

6.4
10.6

38.3
10.6

14.9
10.6

36.2
74.5

10.6
4.3

0
4.3

* Rated only for target students who received little, some, moderate, or considerable instruction using this method.


Table M.3. Percent of Teachers Using Various Resources to Teach Math (N = 55)

Resource | Used to teach math (%)

Materials
Commercially made materials adapted (by you or someone else) from general education | 81.8
Commercially made manipulatives adapted (by you or someone else) from general education | 87.3
Age-appropriate, commercially made print or text materials designed for this type of student | 54.5
Age-appropriate, commercially made manipulatives designed for this type of student | 63.6
Other commercially made print or text materials designed for this type of student | 49.1
Other commercially made age-appropriate manipulatives designed for this type of student | 52.7
Teacher-made books, workbooks, materials | 96.4
Teacher-made manipulatives | 96.4
Materials or lessons from websites | 70.9
Computer | 81.8
Assistive technologies (e.g., CheapTalk, Big Mac, Dynavox, text reader, talking calculator, etc.) | 65.5

Settings
Real life or natural setting materials (e.g., coins, community signs, telephones) | 90.9
Inclusive class setting | 40.0
Other settings in my school | 61.8
Other settings in the community | 58.2

People
Nondisabled peers | 25.5
Teachers from other disciplines (e.g., academic or special subject areas) | 27.3
Another staff member at the school (e.g., speech/occupational/physical therapist) | 38.2
Other special education teachers | 45.5

Table M.4. Teacher-Reported Influences on Math Instruction (N = 55)

Influence | No influence | Minimal influence | Moderate influence | Strong influence
State curriculum framework or content standards | 5.5 | 21.8 | 25.5 | 47.3
Instructional materials | 3.6 | 16.4 | 47.3 | 32.7
State alternate assessment requirements | 1.8 | 12.7 | 20.0 | 65.5
State alternate assessment results from previous years | 16.7 | 31.5 | 27.8 | 24.1
National math standards | 27.3 | 34.5 | 27.3 | 10.9
Math content, materials, and/or activities used by general education teachers in my school | 29.1 | 27.3 | 29.1 | 14.5
Training from my degree program (undergraduate or graduate) | 20.0 | 21.8 | 32.7 | 25.5
Students’ needs as documented on IEPs | 1.8 | 1.8 | 3.6 | 92.7
School or district initiatives or priorities | 7.5 | 37.7 | 24.5 | 30.2
Principal or other administrator expectations | 9.1 | 32.7 | 27.3 | 30.9
Professional development experiences | 7.4 | 22.2 | 40.7 | 29.6
Classroom assessment results | 0 | 31.0 | 20.4 | 66.7

Table M.5. Percent Reporting Frequency of Use of Classroom Assessments – Math (N = 55)

Assessment type | Not at all | <1 time per month | 1-4 times a month | 1-4 times a week | >4 times a week
Objective questions (e.g., true/false, multiple choice) | 20.0 | 12.7 | 14.5 | 34.5 | 18.2
Performance on-demand (e.g., data collected on student performance of task analysis steps) | 1.8 | 5.5 | 20.0 | 29.1 | 43.6
Teacher observation | 0 | 5.5 | 10.9 | 27.3 | 56.4

Table S.1. Distribution of Science Content Taught, by Depth of Knowledge (N = 47)

Item | N | Attention n (%) | Memorize/Recall n (%) | Perform n (%) | Comprehend n (%) | Apply n (%) | An/Syn/Eval n (%)

Earth and Space Science
A1 Structure and energy in the Earth’s system (weather, minerals, rocks) | 41 | 10 (24.4) | 8 (19.5) | 8 (19.5) | 7 (17.1) | 5 (12.2) | 3 (7.3)
A2 History, origin, and evolution of the earth and the universe (changes in the Earth’s surface, Big Bang Theory) | 20 | 10 (50.0) | 3 (15.0) | 3 (15.0) | 4 (20.0) | 0 (0.0) | 0 (0.0)
A3 Earth, the Solar System, and objects in the sky (moon phases, tides, tilt of the earth, motion of the Earth) | 27 | 10 (37.0) | 4 (14.8) | 5 (18.5) | 6 (22.2) | 2 (7.4) | 0 (0.0)
Total | 88 | 30 (34.1) | 15 (17.0) | 16 (18.2) | 17 (19.3) | 7 (8.0) | 3 (3.4)

Life Science (Biology)
B1 Characteristics of organisms (organ systems, plants and animals, plant structures) | 39 | 12 (30.8) | 15 (38.5) | 6 (15.4) | 4 (10.3) | 1 (2.6) | 1 (2.6)
B2 Life cycles of organisms (birth, development, reproduction, death) | 25 | 12 (48.0) | 7 (28.0) | 0 (0.0) | 5 (20.0) | 1 (4.0) | 0 (0.0)
B3 Organisms and environments, populations, and ecosystems (extinction, food web, changes in ecosystems) | 33 | 19 (57.6) | 5 (15.2) | 2 (6.1) | 4 (12.1) | 2 (6.1) | 1 (3.0)
B4 Cellular and molecular basis of life (animal cells, multicellular organisms, organic molecules, types of cells, organells) | 17 | 8 (47.1) | 4 (23.5) | 2 (11.8) | 3 (17.6) | 0 (0.0) | 0 (0.0)
B5 Reproduction and heredity, diversity, adaptations, and evolution of organisms (traits and genes, reproduction, Mendel, Punnett squares, DNA, natural selection, biodiversity) | 17 | 9 (52.9) | 2 (11.8) | 3 (17.6) | 2 (11.8) | 1 (5.9) | 0 (0.0)
B6 Regulation and behavior of organisms (instinct and learned behavior, animal and plant behaviors, interaction with the environment) | 27 | 14 (51.9) | 6 (22.2) | 3 (11.1) | 2 (7.4) | 2 (7.4) | 0 (0.0)
B7 Matter, energy, and organization in living systems | 23 | 10 (43.5) | 2 (8.7) | 6 (26.1) | 4 (17.4) | 1 (4.3) | 0 (0.0)
B8 Personal and Community Health (diseases, nutrition, fitness, environmental hazards) | 38 | 14 (36.8) | 3 (7.9) | 9 (23.7) | 4 (10.5) | 5 (13.2) | 3 (7.9)
Total | 219 | 98 (44.7) | 44 (20.1) | 31 (14.2) | 28 (12.8) | 13 (5.9) | 5 (2.3)

Physical Science (Chemistry and Physics)
C1 Properties of matter (size, shape, color, states of matter, weight and mass, elements and compounds, periodic table) | 34 | 9 (26.5) | 9 (26.5) | 8 (23.5) | 5 (14.7) | 2 (5.9) | 1 (2.9)
C2 Chemical and physical changes in matter (changes in state, boiling and melting points, bonding, reactions, chemical equations, acids and bases) | 23 | 8 (34.8) | 4 (17.4) | 6 (26.1) | 2 (8.7) | 2 (8.7) | 1 (4.3)
C3 Motion and forces (speed and velocity, mass and inertia, vectors, Newton’s laws, waves) | 18 | 7 (38.9) | 3 (16.7) | 3 (16.7) | 4 (22.2) | 0 (0.0) | 1 (5.6)
C4 Energy (conservation of energy, forms of energy, electricity, magnets, light, sound, heat, potential and kinetic energy, temperature) | 30 | 11 (36.7) | 8 (26.7) | 7 (23.3) | 3 (10.0) | 0 (0.0) | 1 (3.3)
C5 Atomic theory (atoms and molecules, fission and fusion, nuclear reactions, Lewis dot structures) | 5 | 3 (60.0) | 1 (20.0) | 0 (0.0) | 1 (20.0) | 0 (0.0) | 0 (0.0)
Total | 110 | 38 (34.5) | 25 (22.7) | 24 (21.8) | 15 (13.6) | 4 (3.6) | 4 (3.6)

Technology/Engineering
D1 Materials and Tools (uses of materials, proper uses, machines, technology, invention) | 23 | 6 (26.1) | 4 (17.4) | 6 (26.1) | 2 (8.7) | 4 (17.4) | 1 (4.3)
Total | 23 | 6 (26.1) | 4 (17.4) | 6 (26.1) | 2 (8.7) | 4 (17.4) | 1 (4.3)

History/Nature of Science
E1 Science as a human endeavor (diversity among scientists, talents and skills of scientists) | 12 | 4 (33.3) | 4 (33.3) | 3 (25.0) | 0 (0.0) | 0 (0.0) | 1 (8.3)
E2 Nature of science (scientific method, hypotheses, laws, and theories) | 23 | 8 (34.8) | 5 (21.7) | 7 (30.4) | 2 (8.7) | 0 (0.0) | 1 (4.3)
E3 History of science (science in different cultures, rate of advancement, scientific revolutions) | 12 | 7 (58.3) | 3 (25.0) | 2 (16.7) | 0 (0.0) | 0 (0.0) | 0 (0.0)
Total | 47 | 19 (40.4) | 12 (25.5) | 12 (25.5) | 2 (4.3) | 0 (0.0) | 2 (4.3)

Science as Inquiry
F1 Understanding of and abilities necessary to do scientific inquiry (asking questions, forming hypotheses, conducting experiments) | 32 | 13 (40.6) | 6 (18.8) | 8 (25.0) | 2 (6.3) | 1 (3.1) | 2 (6.3)
Total | 32 | 13 (40.6) | 6 (18.8) | 8 (25.0) | 2 (6.3) | 1 (3.1) | 2 (6.3)
Table S.2. Science Instructional Methods and Level of Target Student’s Participation (N = 47)
0
None

Receive individualized instruction
Receive instruction in a small
group
Collect, summarize, or analyze
information
Engage in inquiry processes
Learn to use resources
Use hands-on materials or
manipulatives
Receive instruction with prompts
or scaffolded support
Use computers or other assistive
technology
Work independently
Perform assessment skills for data
collection/grading
Take a test
Practice skills in different setting
Practice skills with a variety of
similar materials
Engage in read aloud activities
View multi media presentations
Engage in speech or presentation
Use work center
Learn/demonstrate skills in
repeated opportunity/direct
instruction trials

Level of student participation
N
P
AS
No
Passive
Active
ParticiParticiParticipapation
pation
tion with
Supports
2.6
28.2
64.1
2.3
25.6
65.1

2
Some
(2-4
hours last
week)
21.3
38.3

3
Moderate
(5-7 hours
last week)

17.0
8.5

1
Little
(1 hour
or less
last week)
40.4
19.1

8.5
14.9

4
Considerable
(8 or more
hours last
week)
12.8
19.1

31.9

34.0

29.8

4.3

0

9.4

25.0

65.6

0

42.6
42.6
8.5

21.3
23.4
23.4

31.9
19.1
42.6

4.3
14.9
12.8

0
0
12.8

22.2
18.5
12.8

22.2
14.8
21.3

55.6
66.7
59.6

0
0
6.4

8.5

27.7

34.0

19.1

10.6

10.6

19.1

70.2

0

38.3

25.5

19.1

10.6

6.4

31.9

19.1

40.4

8.5

48.9
46.8

29.8
36.2

14.9
12.8

6.4
4.3

0
0

55.3
42.6

10.6
19.1

29.8
38.3

4.3
0

59.6
38.3
25.5

25.5
31.9
38.3

8.5
21.3
27.7

4.3
8.5
6.4

2.1
0
2.1

61.7
36.2
27.7

6.4
17.0
14.9

23.4
46.8
57.4

8.5
0
0

44.7
40.4
63.8
59.6
27.7

21.3
25.5
19.1
21.3
31.9

21.3
23.4
17.0
14.9
25.5

10.6
6.4
0
2.1
12.8

2.1
4.3
0
2.1
2.1

48.9
44.7
63.8
57.4
31.9

10.6
19.1
10.6
14.9
10.6

38.3
29.8
25.5
27.7
57.4

2.1
6.4
0
0
0

* Rated only for target students who received little, some, moderate, or considerable instruction using this method


IA
Independent
Active
Participation
5.1
7.0

Table S.3. Percent of Teachers Using Various Resources to Teach Science (N = 55)

Resource | Used to teach science (%)

Materials
Commercially made materials adapted (by you or someone else) from general education | 78.2
Commercially made manipulatives adapted (by you or someone else) from general education | 65.5
Age-appropriate, commercially made print or text materials designed for this type of student | 52.7
Age-appropriate, commercially made manipulatives designed for this type of student | 50.9
Other commercially made print or text materials designed for this type of student | 34.5
Other commercially made age-appropriate manipulatives designed for this type of student | 38.2
Teacher-made books, workbooks, materials | 80.0
Teacher-made manipulatives | 80.0
Materials or lessons from websites | 74.5
Computer | 67.3
Assistive technologies (e.g., CheapTalk, Big Mac, Dynavox, text reader, talking calculator, etc.) | 54.5

Settings
Real life or natural setting materials (e.g., coins, community signs, telephones) | 69.1
Inclusive class setting | 41.8
Other settings in my school | 50.9
Other settings in the community | 45.5

People
Nondisabled peers | 27.3
Teachers from other disciplines (e.g., academic or special subject areas) | 40.0
Another staff member at the school (e.g., speech/occupational/physical therapist) | 38.2
Other special education teachers | 41.8

Table S.4. Teacher-Reported Influences on Science Instruction (N = 55)

Influence | No influence | Minimal influence | Moderate influence | Strong influence
State curriculum framework or content standards | 17.0 | 17.0 | 28.3 | 37.7
Instructional materials | 7.5 | 11.3 | 37.7 | 43.4
State alternate assessment requirements | 9.4 | 15.1 | 15.1 | 60.4
State alternate assessment results from previous years | 35.3 | 19.6 | 21.6 | 23.5
National science standards | 37.7 | 39.6 | 13.2 | 9.4
Science content, materials, and/or activities used by general education teachers in my school | 30.2 | 13.2 | 39.6 | 17.0
Training from my degree program (undergraduate or graduate) | 30.2 | 28.3 | 24.5 | 17.0
Students’ needs as documented on IEPs | 9.4 | 3.8 | 11.3 | 75.5
School or district initiatives or priorities | 20.8 | 24.5 | 26.4 | 28.3
Principal or other administrator expectations | 22.6 | 26.4 | 26.4 | 24.5
Professional development experiences | 24.5 | 20.8 | 35.8 | 18.9
Classroom assessment results | 9.6 | 13.5 | 25.0 | 51.9

Table S.5. Percent Reporting Frequency of Use of Classroom Assessments – Science (N = 55)

Assessment type | Not at all | <1 time per month | 1-4 times a month | 1-4 times a week | >4 times a week
Objective questions (e.g., true/false, multiple choice, yes/no) | 18.2 | 14.5 | 18.2 | 32.7 | 16.4
Performance on-demand (e.g., task analysis steps, repeated trials, incidence recording) | 12.7 | 5.5 | 25.5 | 36.4 | 20.0
Teacher observation (e.g., anecdotal or descriptive data) | 9.1 | 5.5 | 16.4 | 32.7 | 36.4

CIS Results: Long Version


LONG VERSION: FACILITATOR RESPONSES
Twenty-two facilitators completed the long version of the CIS in June 2007. While they were instructed to
complete the surveys based on the instructional practices of the “typical” teacher they worked with in
2006-07 and then identify any additional content taught by their “best” teacher, respondents did not
differentiate between “typical” and “best” teachers. Therefore, the following results are intended to reflect
a sample of the academic instruction of the “typical” teachers, as seen by facilitators.
Detailed tables providing frequency distributions for content within each academic subject and topic are
provided in the appendix following this summary. In both the appendix and the summary tables that
follow, two kinds of information are provided:
• The “intensity” column provides a rough estimate of the frequency with which the content was taught, across teachers. Responses on the 0-4 scale were summed across the 22 facilitator responses, yielding a total possible score of 88 per item (which would mean that all “typical” teachers taught the content systematically, with daily or nearly daily coverage throughout the entire school year). A score of zero would mean that all of the responding facilitators indicated that no “typical” teacher taught that content during 2006-07. In the summary tables that follow, the maximum possible intensity score varies depending on the number of items within the topic; more detailed information about intensity is provided within the appendix tables. (A brief computational sketch of the intensity score follows this list.)
• The remaining columns provide the frequency distribution of item endorsements at each level of depth of knowledge (DOK).
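To make the intensity calculation concrete, here is a minimal sketch in Python. The item names are topics from this report, but the ratings themselves are invented for illustration and do not come from the survey data.

    # Minimal sketch of the "intensity" calculation described above, using hypothetical ratings.
    # Each of the 22 facilitators rates an item from 0 (not taught) to 4; the intensity score is
    # the sum across facilitators, so the maximum per item is 22 x 4 = 88.

    ratings_by_item = {
        "Discussion": [4, 3, 2, 0, 4] + [2] * 17,               # 22 hypothetical ratings
        "Beginning Reading": [4] * 22,                          # taught daily by all -> intensity 88
        "Evaluating Writing and Presentations": [0] * 22,       # taught by no one -> intensity 0
    }

    for item, ratings in ratings_by_item.items():
        assert len(ratings) == 22 and all(0 <= r <= 4 for r in ratings)
        intensity = sum(ratings)  # ranges from 0 to 88 for a single item
        print(f"{item}: intensity = {intensity} of 88")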
Following is a list of highlights from the survey responses for each subject.


English language arts (ELA)
• The areas with the greatest instructional emphases were Vocabulary and Concept Development; Questioning, Listening, and Contributing; Beginning Reading; and Discussion.
• Evaluating Writing and Presentations and Style and Language were the topics with the least instructional emphasis.
• There were no clear patterns related to differences across topics in performance expectations (DOK) at which facilitators reported the teachers typically taught. In general, the analysis/synthesis/evaluation level was least frequently endorsed.

Intensity and Distribution of DOK for Content Taught within ELA Topics

Topic | Int* | Attention n (%) | Mem/Rec n (%) | Perform n (%) | Comprehension n (%) | Application n (%) | An/Syn/Eval n (%)
Discussion | 55 | 13 (23.6) | 1 (1.8) | 29 (52.7) | 5 (9.1) | 5 (9.1) | 2 (3.6)
Questioning, Listening and Contributing | 58 | 5 (8.6) | 7 (12.1) | 26 (44.8) | 10 (17.2) | 9 (15.5) | 1 (1.7)
Oral Presentation | 40 | 2 (5.0) | 3 (7.5) | 28 (70.0) | 3 (7.5) | 3 (7.5) | 1 (2.5)
Vocabulary and Concept Development | 60 | 5 (8.3) | 20 (33.3) | 22 (36.7) | 6 (10.0) | 7 (11.7) | 0 (0.0)
Structure & Origins of Modern English | 36 | 7 (19.4) | 3 (8.3) | 19 (52.8) | 3 (8.3) | 4 (11.1) | 0 (0.0)
Formal and Informal English | 43 | 5 (11.6) | 10 (23.3) | 14 (32.6) | 7 (16.3) | 6 (14.0) | 1 (2.3)
Beginning Reading | 57 | 4 (7.0) | 8 (14.0) | 32 (56.1) | 7 (12.3) | 6 (10.5) | 0 (0.0)
Understanding Text | 49 | 8 (16.3) | 10 (20.4) | 11 (22.4) | 9 (18.4) | 6 (12.2) | 5 (10.2)
Making Connections | 30 | 13 (43.3) | 4 (13.3) | 8 (26.7) | 3 (10.0) | 1 (3.3) | 1 (3.3)
Genre | 29 | 9 (31.0) | 5 (17.2) | 7 (24.1) | 6 (20.7) | 2 (6.9) | 0 (0.0)
Theme | 32 | 6 (18.8) | 8 (25.0) | 11 (34.4) | 6 (18.8) | 0 (0.0) | 1 (3.1)
Fiction | 45 | 2 (4.4) | 11 (24.4) | 17 (37.8) | 11 (24.4) | 2 (4.4) | 2 (4.4)
Nonfiction | 38 | 3 (7.9) | 17 (44.7) | 5 (13.2) | 11 (28.9) | 0 (0.0) | 2 (5.3)
Poetry | 28 | 16 (57.1) | 3 (10.7) | 9 (32.1) | 0 (0.0) | 0 (0.0) | 0 (0.0)
Style and Language | 9 | 6 (66.7) | 0 (0.0) | 0 (0.0) | 0 (0.0) | 2 (22.2) | 1 (11.1)
Myth, Traditional Narrative, and Classical Literature | 17 | 7 (41.2) | 3 (17.6) | 1 (5.9) | 3 (17.6) | 3 (17.6) | 0 (0.0)
Dramatic Literature | 21 | 6 (28.6) | 7 (33.3) | 2 (9.5) | 1 (4.8) | 4 (19.0) | 1 (4.8)
Dramatic Reading and Performance | 17 | 0 (0.0) | 6 (35.3) | 5 (29.4) | 0 (0.0) | 5 (29.4) | 1 (5.9)
Writing | 24 | 2 (8.3) | 8 (33.3) | 5 (20.8) | 2 (8.3) | 5 (20.8) | 2 (8.3)
Consideration of Audience and Purpose | 19 | 4 (21.1) | 7 (36.8) | 0 (0.0) | 1 (5.3) | 6 (31.6) | 1 (5.3)
Revising | 24 | 0 (0.0) | 7 (29.2) | 5 (20.8) | 0 (0.0) | 12 (50.0) | 0 (0.0)
Standard English Conventions | 39 | 3 (7.7) | 17 (43.6) | 7 (17.9) | 2 (5.1) | 8 (20.5) | 2 (5.1)
Organizing Ideas in Writing | 16 | 7 (43.8) | 2 (12.5) | 1 (6.3) | 0 (0.0) | 6 (37.5) | 0 (0.0)
Research | 18 | 5 (27.8) | 1 (5.6) | 9 (50.0) | 0 (0.0) | 3 (16.7) | 0 (0.0)
Evaluating Writing and Presentations | 8 | 1 (12.5) | 0 (0.0) | 1 (12.5) | 0 (0.0) | 3 (37.5) | 3 (37.5)
Analysis of Media | 16 | 1 (6.3) | 2 (12.5) | 5 (31.3) | 0 (0.0) | 3 (18.8) | 5 (31.3)
Media Production | 21 | 6 (28.6) | 0 (0.0) | 7 (33.3) | 0 (0.0) | 5 (23.8) | 3 (14.3)

*Int = Intensity

Mathematics
• The areas with the greatest instructional emphases were Numbers and Operations, Measurement, and Characteristics of Geometric Shapes.
• Variables and Change and Probability were the topics with the least instructional emphasis.
• In general, the most frequently reported DOKs were memorize/recall, performance, and application.
Topic | Int* | N | Attention n (%) | Mem/Rec n (%) | Perform n (%) | Comprehension n (%) | Application n (%) | An/Syn/Eval n (%)
Numbers & Operations | 2009 | 749 | 43 (5.7) | 139 (18.6) | 344 (45.9) | 30 (4.0) | 189 (25.2) | 4 (0.5)
Patterns, relations, and functions | 446 | 165 | 10 (6.1) | 28 (17.0) | 72 (43.6) | 13 (7.9) | 41 (24.8) | 1 (0.6)
Algebra | 131 | 66 | 3 (4.5) | 15 (22.7) | 20 (30.3) | 0 (0.0) | 28 (42.4) | 0 (0.0)
Relations and mathematical models | 208 | 81 | 0 (0.0) | 28 (34.6) | 25 (30.9) | 2 (2.5) | 25 (30.9) | 1 (1.2)
Variables and change | 28 | 18 | 0 (0.0) | 1 (5.6) | 2 (11.1) | 0 (0.0) | 15 (83.3) | 0 (0.0)
Characteristics of geometric shapes | 531 | 222 | 39 (17.6) | 73 (32.9) | 58 (26.1) | 9 (4.1) | 43 (19.4) | 0 (0.0)
Spatial relationships and coordinate geometry | 121 | 59 | 18 (30.5) | 12 (20.3) | 10 (16.9) | 3 (5.1) | 16 (27.1) | 0 (0.0)
Transformation and symmetry | 166 | 88 | 13 (14.8) | 11 (12.5) | 43 (48.9) | 0 (0.0) | 21 (23.9) | 0 (0.0)
Visualization/spatial reasoning/geometric modeling | 103 | 47 | 4 (8.5) | 15 (31.9) | 13 (27.7) | 1 (2.1) | 14 (29.8) | 0 (0.0)
Measurement | 576 | 220 | 6 (2.7) | 65 (29.5) | 81 (36.8) | 11 (5.0) | 57 (25.9) | 0 (0.0)
Data & Statistics | 455 | 191 | 12 (6.3) | 43 (22.5) | 70 (36.6) | 4 (2.1) | 55 (28.8) | 7 (3.7)
Probability | 95 | 52 | 8 (15.4) | 6 (11.5) | 8 (15.4) | 1 (1.9) | 22 (42.3) | 7 (13.5)

*Int = Intensity

Science
• The areas with the greatest instructional emphases were Life Science and Earth and Space Science.
• History and Nature of Science, and Technology and Engineering were the topics with the least instructional emphasis.
• Science instruction was most frequently provided with expectations for memorization/recall and application levels.
Topic | Int* | N | Attention n (%) | Mem/Rec n (%) | Perform n (%) | Comprehension n (%) | Application n (%) | An/Syn/Eval n (%)
Earth and Space Science | 826 | 493 | 79 (16.0) | 205 (41.6) | 46 (9.3) | 69 (14.0) | 86 (17.4) | 8 (1.6)
Life Science (Biology) | 1025 | 519 | 58 (11.2) | 211 (40.7) | 48 (9.2) | 65 (12.5) | 115 (22.2) | 22 (4.2)
Physical Science and Chemistry | 417 | 234 | 21 (9.0) | 70 (29.9) | 32 (13.7) | 34 (14.5) | 61 (26.1) | 16 (6.8)
Technology and Engineering | 88 | 50 | 10 (20.0) | 8 (16.0) | 9 (18.0) | 6 (12.0) | 17 (34.0) | 0 (0.0)
History and Nature of Science | 55 | 24 | 2 (8.3) | 6 (25.0) | 4 (16.7) | 2 (8.3) | 6 (25.0) | 4 (16.7)
Science as Inquiry | 132 | 58 | 9 (15.5) | 11 (19.0) | 12 (20.7) | 6 (10.3) | 17 (29.3) | 3 (5.2)

*Int = Intensity

Running head: LEARNER CHARACTERISTICS INVENTORY

An Analysis of the Learning Characteristics of Students Taking Alternate Assessments
Based on Alternate Achievement Standards

This manuscript was supported, in part, by the U.S. Department of Education Office of
Special Education Programs (Grant No. H3244040001). However, the opinions expressed
do not necessarily reflect the position or policy of the U.S. Office of Special Education
Programs and no official endorsement should be inferred.

Abstract
This study examines the learner characteristics of students in the alternate assessment
based on alternate achievement standards in three geographically and demographically
different states. Based on our results, it can be argued that this population includes at least
two distinct sub-groups. The first set of learners
has either symbolic or emerging symbolic levels of communication, evidences social
engagement, and possesses at least some level of functional reading and math skills. The
second set of students in our sample has not yet acquired a formal, symbolic
communication system, may not initiate, maintain, or respond to social interactions
consistently, and has no awareness of print, Braille, or numbers. This article provides
implications and considerations of the findings of the Learner Characteristics Inventory
(LCI) for states and practitioners in developing alternate assessments based on alternate
achievement standards (AA-AAS).

Knowing What Students Know: Defining the Student Population Taking Alternate
Assessments Based on Alternate Achievement Standards
As a field, alternate assessment for students with disabilities is in its infancy.
Originating in Kentucky in 1992 (Kleinert, Kearns, & Kennedy, 1997), alternate
assessment was conceptualized originally for students with more severe disabilities
(Kleinert & Thurlow, 2001). Alternate assessment was mandated nationally by IDEA 97
as a mechanism for inclusion in large-scale educational assessments for those students
who could not participate in regular state and district assessments, even with
accommodations and modifications. Although IDEA did not limit alternate assessment to
students with the most significant disabilities, most states designed their original alternate
assessments for that small population of students. The No Child Left Behind Act of 2001
(NCLB) and subsequent regulations reinforced the requirement that states develop
alternate assessments for students with significant cognitive disabilities, and allowed states
to set alternate achievement standards on alternate assessments designed for those
students. Regardless of whether alternate or grade-level achievement standards are set, all
assessment options are to be aligned to grade-level content standards (U.S. Department of
Education, 2004).
Alternate assessments on alternate achievement standards (AA-AAS) for students
with significant cognitive disabilities must evidence a rigorous technical quality
comparable to large-scale assessments for all students. For the “infant” field of alternate
assessment, this is no easy task. When IDEA 97 was passed, states had only two
examples of a statewide alternate assessment to consider (Kleinert et al., 1997; Kleinert,
Haigh, Kearns, & Kennedy, 2000), and only three years to design alternate assessments

of their own. It is not surprising, then, that states have varied widely in the alternate
assessment formats they have developed, in how they have aligned their alternate
assessments to the state academic content standards identified for all students, and in the
technical qualities of their alternate assessments. Further, with the exception of studies of
Kentucky’s alternate assessment (see Kampfer, Horvath, Kleinert, & Kearns, 2001;
Kleinert & Kearns, 1999; Kleinert, Kennedy, & Kearns, 1999; and Turner, Baldwin,
Kleinert, & Kearns, 2000) and subsequent work by Browder and colleagues (Browder,
Spooner, Algozzine, Ahlgrim-Delzell, Flowers, & Karvonen, 2003; Flowers, Ahlgrim-Delzell, Browder, & Spooner, 2005), little is known about how alternate assessments
have impacted teacher practice, access to the general curriculum, and most importantly,
student outcomes. The challenges are extremely complex. At this time, many states are
not only struggling with issues of technical quality in alternate assessment, but are also in
the midst of a challenging paradigm shift from functional or below-grade-level developmental instruction and assessment for some students with disabilities to
instruction and assessment linked to grade-level academic content standards for all
students.
A Conceptual Framework
The framework for our research comes from the National Research Council’s
Committee on the Foundations of Assessment’s conception of the “assessment triangle”
(Pellegrino, Chudowsky, & Glaser, 2001). The triangle focuses our attention on how
models of large-scale assessment reflect the characteristics of good teaching and learning,
and specifically how diverse groups of students demonstrate that learning within the
academic domains.

The assessment triangle consists of: “a model of student cognition in the domain, a
set of beliefs about the kinds of observations that will provide evidence of the students’
competencies, and an interpretation process for making sense of the evidence”
(Pellegrino et al., 2001, p. 44). Pellegrino et al. (2001) defined three pillars on which
every assessment must rest: “a model of how students represent knowledge and develop
competence in the subject domain, tasks or situations that allow one to observe students’
performance, and an interpretation method for drawing inferences from the performance
evidence thus obtained” (p. 2). They suggest that these pillars make up an assessment
triangle, and that this triangle—cognition, observation, interpretation—must be
articulated, aligned, and coherent for inferences drawn from the assessment to have
integrity. The triangle is illustrated in Figure 1. This study intends to examine a critical
part of the assessment triangle - the cognition vertex, and more precisely, one element of
that vertex - the learner characteristics of the students who are assessed with AA-AAS.
The students for whom AA-AAS is appropriate represent two problems that
challenge traditional measurement theory. First, they represent a small percentage
(estimated in NCLB regulation as 1% or less) of the total assessed population of students
with and without disabilities. Secondly, they are reportedly a highly diverse group
particularly with regard to learner characteristics, available response repertoires, and
often competing complex medical conditions (Heward, 2006; Orelove, Sobsey, &
Silberman, 2004). However, little empirical data exist to verify the extent to which
students with these learning characteristics are represented in the assessed population.
Who are these students?

According to IDEA 1997 and 2004, alternate assessments are designed for a very
small percentage of the student population for whom traditional assessments, even with
appropriate accommodations, would be an inappropriate measure of student progress
within the general education curriculum. Indeed, these students represent multiple
categories of disability under IDEA including: mental retardation, autism, and multiple
disabilities (US Department of Education, 2003). Qualitative data collected from state
participation criteria for alternate assessments (Midsouth Regional Resource Center,
2004) suggest that the following characteristics describe the population. These students
typically: a) have an Individualized Education Program (IEP), b) have a cognitive
disability, c) require instruction under multiple conditions to generalize learning, and d)
may receive a “functional curriculum”. However, there is little evidence of how states are
monitoring the use of participation guidelines in making assessment decisions, and thus
how consistently states are identifying students according to their own participation
criteria. In a further attempt to describe this population, Almond and Bechard (2005)
found, in a pilot of an alternate assessment on alternate achievement standards across five
states, that these students were most likely to have a different curricular focus, require
communication supports and assistive technology, and require physical supports.
Validity Evaluation
Based on the conceptual framework of Pellegrino et al. (2001), the learning
characteristics of the assessed population have significant implications for the
assessment’s validity. Specifically, the validity evaluation of an assessment should
consider two questions. First, we need to know whether the assessment is appropriate for
the intended population. Secondly, in high stakes accountability environments, we want

to ensure that the appropriate population is, in fact, the population being assessed. This
study represents the first systematic attempt to address each of these two questions.
Methodology
Research Design
A survey research design was used to gather data on the learning characteristics of
students participating in the alternate assessment judged against alternate achievement
standards (AA-AAS) in three states. Table 1 outlines the data collection options for each of
the three states. Although the survey could be completed in different modalities (i.e.,
online or paper/pencil), the directions for completing the survey were consistent: a)
teachers were to complete an LCI for each student participating in the AA-AAS, and b) for each item on the survey, teachers were to choose the best answer that
most appropriately described the student. The following outlines the specific data
collection options used in each state.
All special education teachers in State 1 were sent an email inviting them to
complete a Learner Characteristics Inventory (LCI) for each student they had
participating in the AA-AAS during the 2005-2006 school year. In the email, teachers
were offered three ways in which to complete the LCI:
1) Teachers could click on a link that directed them to the inventory where they
could complete it for each child participating in the alternate assessment (thus a
teacher with three students in the alternate assessment would complete the LCI for
each of the three students). If teachers completed the LCI online, they were asked
to print the completion page at the end of the survey and bring it to the scoring

site when dropping off the assessment. In this way, they would not be asked again
if they had completed the inventory for their student(s).
2) Teachers could complete the inventory by printing off the version attached to the
invitation email. Teachers were asked to print the inventory for each student
participating in the alternate assessment and bring the LCI(s) with them when
dropping off the assessment(s) at their scoring site.
3) If teachers chose not to complete the inventory, forgot to bring it with them to the
site, or chose to complete it upon arrival to the scoring site, inventories were
available for them at the scoring site. At all times, teachers were given the choice
not to participate in the LCI.
In State 2, all district administrators were sent an email from the Chief of the
Bureau of Assessment. District administrators were asked to forward an attached email to
teachers inviting them to complete an LCI for each student participating in the AA-AAS
during the 2005-2006 school year. In this state, teachers were only allowed the option to
complete the LCI online. Teachers were given a three week window to complete the
inventory for their student(s) and then the inventory was taken offline.
In State 3, an email invitation was sent to 247 teachers who attended alternate
assessment regional trainings. From this group of attendees, teachers administering the
alternate assessment that year were invited to complete the LCI for each of their students
participating in the alternate assessment. The invitation provided a brief description and
the purposes of the survey and asked teachers to click on the link to the online survey.
Once the teachers clicked on the link, they were directed to the online survey and
completed it for each of their students. The survey was available for two weeks. After the

first week, a friendly reminder was sent to teachers. The online survey was extended by
one week, and teachers received another friendly reminder.
Participants
All teachers who had students participating in the AA-AAS in three states were
asked to complete the LCI for each student completing the assessment that year who was
on their caseload. One state (State 1) was a southern state, largely rural. The second state
was a northeastern state, largely urban and suburban. The third state was a western state,
largely rural. To collect data on this population in an efficient and timely manner,
researchers developed the instrument to be a quick and easy instrument completed by the
students’ teachers which could eventually be incorporated into the assessment process
(such as when registering students to take the assessment or as part of the materials
submitted with the assessment). As we were interested in student and not teacher
descriptive data, we did not ask teachers to complete demographic data on themselves.
Instrumentation
The Learner Characteristics Inventory (LCI) was developed by researchers at the
National Alternate Assessment Center (NAAC) in conjunction with experts in the fields
of Occupational Therapy, Physical Therapy, Speech/Language Pathology/
Communication Disorders, Deaf-blindness, Reading, Mathematics, and Special
Education. The LCI went through an expert validation process, and changes to the
categories were made based on thoughtful feedback from the experts. The LCI was emailed to 10 experts,
across these fields, with a structured evaluation form. The form required experts to give
feedback on the survey as a whole (i.e., clarity, utility, accuracy, understandability), but
for the questions that tapped individual expertise, experts were asked to provide specific

recommendations on content and clarity for those questions. Each item on the survey
included a purpose statement and rationale for the importance of including it on the
survey. Experts were asked to indicate if changes were needed for each question and to
precisely explain the changes necessary to improve the instrument.
The survey was then piloted with a small sample of teachers (approximately 25
from across elementary, middle, and high school grade levels). Teachers were asked to
choose a partner respondent (such as a speech/language pathologist, school psychologist, or
general education teacher), and both were to independently score an LCI for a single
student so that interrater agreement could be calculated. Interrater agreement was 84%, and
teachers made suggestions for changes to the categories. These suggestions were
considered by researchers at NAAC, and a final version of the LCI was piloted once more
with a small sample of approximately 15 teachers from across grade levels and their
independent partner respondents. The average interrater agreement per variable was 95%,
indicating that the instrument could be used reliably to investigate the learning characteristics of students
with the most significant cognitive disabilities.
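The agreement check described above can be illustrated with a short sketch in Python. The ratings below are invented for the example; they are not the pilot data, and the aggregation shown (agreement per item, averaged across items) follows the "per variable" description in the text.

    # Sketch of a per-variable percent-agreement check (hypothetical data, two teacher-partner pairs).
    pairs = [  # (teacher ratings, partner ratings) for one student, 10 LCI items each
        ([1, 2, 1, 1, 3, 2, 1, 1, 2, 1], [1, 2, 1, 1, 3, 2, 1, 2, 2, 1]),
        ([2, 2, 1, 1, 4, 2, 1, 1, 3, 2], [2, 2, 1, 1, 4, 2, 1, 1, 3, 2]),
    ]

    n_items = 10
    per_item_agreement = []
    for i in range(n_items):
        agree = sum(teacher[i] == partner[i] for teacher, partner in pairs)
        per_item_agreement.append(100 * agree / len(pairs))

    average_agreement = sum(per_item_agreement) / n_items
    print(per_item_agreement, average_agreement)  # 95.0 for these invented ratings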
The instrument includes 10 questions: nine are rated on a continuum of skills in the
areas of expressive communication, receptive language, vision, hearing, motor,
engagement, health issues/attendance, reading, and mathematics, and the tenth is a
dichotomous variable that asks whether the student uses an augmentative communication
system. Teachers were asked to rate where each student in their class participating in an
AA-AAS fell on the continuum (or dichotomy) for each variable. Please email the lead
author for a copy of the survey.
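For readers who want to work with LCI responses programmatically, one record per student might be represented as in the minimal sketch below. The field names and numeric coding are hypothetical illustrations, not the survey's own wording.

    # Hypothetical representation of a single LCI response: nine ordinal items
    # (coded low to high, with higher values representing more complex skills)
    # plus one dichotomous item for augmentative communication use.
    from dataclasses import dataclass

    @dataclass
    class LCIRecord:
        expressive_communication: int
        receptive_language: int
        vision: int
        hearing: int
        motor: int
        engagement: int
        health_attendance: int
        reading: int
        math: int
        uses_augmentative_communication: bool

    example = LCIRecord(3, 3, 2, 2, 2, 3, 3, 4, 3, False)  # invented values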
Data Analysis

The variables of expressive communication, receptive language, vision, hearing,
motor, engagement, health issues/attendance, reading and mathematics are continuous
variables, and we chose to measure them as such for data analysis purposes. Each item
within each variable was given a numerical value (low to high with high representing
more complex abilities). When coding the data in SPSS, multiple responses and missing
data were coded as exclusionary data. Descriptive statistics (frequencies and percentages)
were computed for each of the 10 questions on the LCI. In addition, correlational
analyses were conducted to investigate the relationships between expressive and
receptive communication and reading and mathematics skills, along with other variables.
In the results section, we outline response rate, descriptive statistics, and findings from
the correlational analyses.
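The analyses were run in SPSS; an equivalent sketch in Python is shown below for clarity. The file name and column names are hypothetical, and the sketch assumes items have already been recoded numerically with excluded (multiple or missing) responses stored as missing values.

    # Sketch of the descriptive statistics and bivariate Pearson correlations described above.
    import pandas as pd
    from scipy.stats import pearsonr

    lci = pd.read_csv("lci_state1.csv")  # hypothetical file: one row per student

    # Frequencies and percentages for each LCI variable
    for item in ["expressive", "receptive", "reading", "math"]:
        counts = lci[item].value_counts(dropna=True).sort_index()
        print(item, (100 * counts / counts.sum()).round(1).to_dict())

    # Correlations between communication variables and reading/math skills
    for x, y in [("expressive", "reading"), ("expressive", "math"),
                 ("receptive", "reading"), ("receptive", "math")]:
        pair = lci[[x, y]].dropna()  # pairwise exclusion of missing codes
        r, p = pearsonr(pair[x], pair[y])
        print(f"r({x}, {y}) = {r:.2f}, p = {p:.3f}")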
Results
During the 2005-2006 school year, there were approximately 1,394 students who
completed an AA-AAS in State 1 from grades 4, 8, and 12. Teachers completed LCIs for
1,120 students during the Spring of 2006. The response rate was 80%. In State 2, there
were approximately 2,800 students who completed an AA-AAS from grades 3-8 and 10.
Teachers completed LCIs for 201 students also in the Spring of 2006. The response rate
was approximately 7%. It is possible the response rate was reduced in State 2 for two
reasons: a) the time of year in which the inventory was conducted (a very busy period),
and b) the need to reach teachers through district administrators, who had to forward the
invitation email, which increased attrition. During the 2006-2007 school year,
teachers completed LCIs for 219 students in State 3 in the Spring of 2007. There were

approximately 467 students who completed an AA-AAS from grades 3-8 and 11. The
response rate was approximately 47%.
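The reported response rates follow directly from the counts above; a quick check in Python:

    # Response rate = completed LCIs / students who completed an AA-AAS (counts from the text).
    completed = {"State 1": 1120, "State 2": 201, "State 3": 219}
    assessed = {"State 1": 1394, "State 2": 2800, "State 3": 467}

    for state in completed:
        print(f"{state}: {100 * completed[state] / assessed[state]:.0f}%")  # ~80%, 7%, 47%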
Descriptive analyses
Table 2 includes the total number of respondents and frequencies for each
variable in each state. To communicate expressively, most students in each state used
verbal or written words, signs, Braille, or language-based augmentative systems to
request, initiate, and respond to questions, describe things or events, and express refusal
(71%, 63%, and 74% respectively in States 1, 2, and 3). A smaller group of the
population in each state used understandable communication through such modes as
gestures, pictures, objects/textures, points, etc., to clearly express a variety of intentions
(17%, 26%, and 17% respectively). An even smaller group of students primarily used
cries, facial expressions, change in muscle tone, etc., to communicate, but these students
had no clear use of objects/textures, regularized gestures, pictures, signs, etc., to
communicate (8%, 11%, and 8% respectively).
Receptively, students in each state fell into two primary groups: those students
who independently followed 1-2 step directions presented through words (e.g., words
could be spoken, signed, printed, or any combination) while not requiring additional cues
(46%, 34%, and 56% respectively in States 1, 2, and 3); or those students who required
additional cues (e.g., gestures, pictures, objects, or demonstrations/models) to follow 1-2
step directions (41%, 54%, and 33%). A smaller group (10%, 10%, and 7%) alerted to
sensory input from another person (auditory, visual, touch, movement) but required
actual physical assistance to follow simple directions. Finally, less than three percent of

the population in each state displayed an uncertain response to sensory stimuli (e.g.,
sound/voice; sight/gesture; touch; movement; smell).
Overall, only a minority of students in each state used an augmentative
communication system, in addition to or in place of oral speech (18%, 30%, and 15%
respectively). Perhaps most significantly, only 57% of students in State 1, 36% of
students in State 2, and 33% of students in State 3 who communicated primarily through
cries, facial expressions, change in muscle tone, etc., used a formalized augmentative
communication system. Further, only 42% of the students in State 1, 44% of the students
in State 2, and 43% of students in State 3 who communicated through such modes as
gestures, pictures, objects/textures, points, etc., used a formalized augmentative
communication system in place of oral speech.
The LCI also investigated individual students’ reading and mathematics skills.
For each of the five options under reading and math, teachers were asked to select the
option that best described their student’s present performance in that area. In State 1 and
State 3, teachers noted that over 2% of the population read fluently with critical
understanding in print or Braille. State 2 did not provide this option on the inventory.
Almost 14% of the students in State 1, 12% in State 2, and 33% in State 3 were rated as
being able to read fluently with basic (literal) understanding from paragraphs/short
passages with narrative/informational texts in print or Braille. The largest group from all
three states (50%, 47%, and 33%) was rated as being able to read basic sight words,
simple sentences, directions, bullets, and/or lists in print or Braille, but not fluently from
text with understanding. A smaller percentage of students (17%, 14%, and 18%) were
rated as not yet having a sight word vocabulary, but being aware of text/Braille,

following directionality, making letter distinctions, or telling a story from pictures.
Finally, teachers noted 15% of students in State 1, 25% of students in State 2, and 13% of
students in State 3 had no observable awareness of print or Braille.
Under math skills, teachers were again asked to select the performance
description that best indicated the skill level of their student(s). At the highest level, 2%
of students in State 1 and 4% of students in States 2 and 3 applied computational
procedures to solve real-life or routine word problems from a variety of contexts. The
largest category of students within each state (57%, 37%, and 51% respectively) was able
to complete computational procedures with or without a calculator. Nearly 19% of
students in State 1, 24% of students in State 2, and 27% of students in State 3 were
described as performing at the more basic level of counting with one-to-one
correspondence to at least 10 and/or making numbered sets of items. A smaller percentage
still (7%, 10%, and 6%) were described as being able to count by rote to 5, but without
the higher skill sequences of one-to-one correspondence or computation. Finally, teachers
noted that nearly 13% of students in State 1, 22% of students in State 2, and 11% of
students in State 3 had no observable awareness or use of numbers.
Most students in all three states (90%, 85%, and 89%) had normal vision or
corrected vision within normal limits. However, in State 1 nearly 9%, in State 2 almost
15%, and in State 3 exactly 10% of all students represented in our survey had low vision
or no functional use of vision for activities of daily living. As with vision, most students
in all three states (95%, 93%, and 97%) had hearing within normal limits or corrected
hearing loss within normal limits. A small percentage of the population in States 1 and 2
(2% and 5% respectively) had significant and profound hearing loss, even with aids. No

students in State 3 had these characteristics. For almost 2% of the population in all three
states, teachers were unable to determine functional use of hearing for their students.
When asked to rate students’ motor abilities, teachers rated approximately 76% of
students in States 1 and 2 and 81% of students in State 3 as having no significant motor
dysfunction that required adaptations. However, the remaining 24% of students in States
1 and 2 and 18% of students in State 3 had a range of motor abilities from requiring
adaptations to support motor functioning to needing personal assistance for most/all
motor activities. Overall, there was clearly a wide variety of abilities and needs for this
student population related to motor functioning.
Engagement (awareness and interaction with others) is another variable
investigated by the LCI. Approximately 89% of students in State 1, 85% of the students
in State 2, and 91% of students in State 3 were able to initiate and sustain social
interactions or respond to social interactions (without initiating or sustaining them).
However, 8% of students in State 1, 11% of students in State 2, and 7% of students in
State 3 only alerted to other people. Approximately 2% of students in State 1, 4% in State
2, and 1% of students in State 3 did not alert to other people.
As the students who take AA-AAS are those with the most significant cognitive
disabilities who may also have special medical needs or considerations, the final variable
on the LCI investigated attendance in school. Remarkably, 94% of students in State 1,
99% of students in State 2, and 96% of students in State 3 attended at least 75% of school
days, with absences primarily due to health issues. In States 1 and 3, 2% of the
population attended approximately 50% or less of school days with absences primarily
due to health issues; in State 2 that percentage was 1%.

Correlational analyses
Correlational analyses were also conducted between expressive language,
receptive communication, and reading and math (Results for all three states can be found
in Table 3). A bivariate Pearson correlation was used to investigate the relationship
between expressive language and reading and math and receptive communication and
reading and math. In all three states, a statistically significant correlation was found
between the level of the student’s expressive language and the student’s level of reading.
As might be expected, students who were symbolic learners were also reading at a higher
level than those who were not. In addition, a significant correlation was also found
between the level of a student’s receptive communication and level of reading in all three
states. Consequently, students with a higher level of receptive communication were also
reading at a higher level. Furthermore, significant correlations were found between the
level of a student’s expressive language and mathematics and receptive communication
and mathematics in all three states. As again might be expected, students with higher
levels of expressive language and receptive communication were working at a higher
level in mathematics.
Correlational analyses were also conducted to investigate the relationship between
receptive language and engagement, motor, and health issues/attendance. These analyses
resulted in statistically significant correlations for receptive language and engagement (r = .55, p < .01)
and motor (r = .57, p < .01), and a weaker correlation for health issues/attendance (r = .17, p > .01), in
State 1. Similarly, in State 2, analyses resulted in significant correlations for receptive
language and engagement (r = .58, p < .01) and motor (r = .48, p < .01), and a weaker
correlation for health issues/attendance (r = .18, p > .01). In State 3, analyses yielded statistically significant

correlations for receptive language and engagement (r = .68, p < .01), motor (r = .56, p <
.01), and health issues/attendance (r = .41, p < .01).
Discussion
The No Child Left Behind Act of 2001 requires that all educational assessments,
including AA-AAS, that are used for determining school and state-level adequate yearly
progress (AYP), meet high standards of technical adequacy. As noted by Pellegrino et al.
(2001), two critical elements in determining technical adequacy are a) precisely defining
the target set of students for whom the assessment has been designed, and b) determining
if the learners for whom that assessment has been designed are, in fact, the students who
are taking it. The purpose of this paper was to describe the learner characteristics of
students taking AA-AAS in three demographically and geographically dissimilar states.
In order to describe the population of the students in the AA-AAS for these three states,
we created a brief scale – the Learner Characteristics Inventory (LCI) – across nine
separate dimensions in which students with significant cognitive disabilities are known to
have highly variable abilities (expressive communication, receptive communication,
social engagement, motor, hearing, vision, health, reading, and math) (Heward, 2006;
Orelove et al., 2004). As might be expected, teachers’ ratings for individual students
ranged across the gamut of performance descriptions within each area assessed by the
LCI, but there are still some important conclusions that can be drawn.
1) Students in these three states who are being identified to take the AA-AAS are, for the most part, students for whom the regular assessment, even with accommodations, would probably not be appropriate. For example, only 2% to 4% of the total students in the AA-AAS in these states are able to “read fluently with critical understanding” or “apply computational procedures to solve real-life or routine word problems”. Both of the above skills would be required for the successful completion of grade-level reading and math assessments under NCLB.

2) Yet the majority of students taking the AA-AAS represented in our survey from these three states do have functional reading and math skills. For example, over 66% of the students in our survey from State 1 could at least read basic sight words or simple sentences in print or Braille, and 59% of the students in the AA-AAS from State 1 could, at a minimum, do computational problems with or without a calculator.

3) Within each of these three states, there would appear to be a small but significant number of students (approximately 11% or less) in the AA-AAS whose language skills could best be described as pre-symbolic (Bates, 1976). That percentage appears consistent for both expressive and receptive communication. Moreover, these percentages are also consistent with the percentage of students in each state whom teachers report do not respond to social interactions.

4) Even larger percentages of students in each of the three states have no observable awareness of print or Braille (15%, 25%, and 13% for the three states respectively) and no observable awareness or use of numbers (13%, 22%, and 11% respectively).

5) As might be expected, there were strong correlations between levels of receptive and expressive communication skills and the reading and math measures for students in the AA-AAS in each of the three states. The strongest correlations, as also might be expected, were between the academic ratings in math and reading for the students in these states (.78, .84, and .85 respectively), indicating a very strong relationship between math and reading performance on the LCI for these students.
Our findings suggest that while the majority of students in our sample in their respective
states’ AA-AAS did have functional math and reading skills, there is a smaller percentage
of students whose lack of a formalized, symbolic communication system, or whose lack
of awareness of the basic building blocks of reading and math (i.e., print and numbers),
may create tremendous challenges in building alternate assessments that a) capture
meaningful skills that these students have achieved and b) are linked to grade-level
content standards.
Our results appear consistent with those of Almond and Bechard (2005), who also
found a broad range of communication skills in the students in their study (i.e., 10% of
the students in their sample did not use words to communicate, but almost 40% used 200
words or more in functional communication) and in their motor skills (students in their
sample ranged from not being able to perform any components of the task due to severe
motor deficits, to students able to perform the task without any supports). Our findings,
together with those of Almond and Bechard, highlight the extreme heterogeneity of the
population of students in the AA-AAS, making the development of valid and reliable
assessments for these students an even more formidable task.
Limitations
One of the most significant limitations in this study is the difficulty in describing
communication levels of students in a way in which all communication experts would
agree. Describing students’ levels of expressive communication can become confusing,
since various experts use varying terms for this purpose. Bates (1976), a pioneer in
identifying the emergence and levels of symbolic and language-based communication,
spoke of three major stages of development. Locution, or the highest level, occurs when
an individual uses formal language to express intent. Formal language includes those
systems that are rule based such as oral speech, Braille, print, various forms of sign
language, or formalized augmentative communication boards or electronic systems (level
1 of Expressive Communication in the LCI). These are clearly symbolic systems. The use
of regularized gestures, points or objects to express communicative intent (level 2 in the
LCI), while understandable, falls at the level of illocution and can be considered at an
emergent symbolic level, but not formalized language. Finally, the individual who uses
less differentiated cries, muscle tone changes, etc., to communicate (level 3 in the LCI)
may require interpretation on the part of the listener and while these individuals are
definitely communicative, they would not be considered at a symbolic level of
communication. Mirenda (2003), a noted authority in functional and augmentative
communication development for students with significant disabilities, has listed multiple
options for “symbols” which can be used for functional communication. These might
include sign, pictures, partial objects, gestures, etc. When reviewing the vast literature in
this area it is difficult to determine which descriptors to use when describing a given
student’s communicative or expressive acts. Is one at a “symbolic level” of development
when he/she uses any symbol as a representation, even a real object, or should he/she be
utilizing a standardized language system to be considered “symbolic”? In designing the
LCI, we separated the students who used formalized language (print, speech, sign,
formalized augmentative communication systems) at level 1 of expressive
communication from those who used some symbols (such as pictures, gestures, points,

etc.) in level 2 of expressive communication to determine the complexity of their
communication development. We recognize that not all researchers in this area would
interpret symbolic communication in the same sense that we used for our scale.
A second limitation is that the LCI is our own instrument, but no other measures
existed that would succinctly capture the essential dimensions in which we needed to
describe the population of students potentially eligible for the alternate assessment on
alternate achievement standards. In order to ensure that we did construct a valid measure
of student characteristics, we designed the LCI in conjunction with experts in the fields of
Occupational Therapy, Physical Therapy, Speech/Language Pathology/Communication
Disorders, Deaf-blindness, Reading, Mathematics, and Special Education; piloted the
survey with a small sample of teachers and “partner respondents” to achieve an
acceptable level of inter-rater agreement; and achieved a final interrater agreement of
95% upon subsequent revisions based upon expert panel and teacher comments.
However, the lack of a previously validated research tool for our study is a limitation.
In addition, a third limitation of this study is the use of teacher ratings to describe
the characteristics of students participating in AA-AAS. Certainly, there are limitations to
gathering data that require teachers to rate students’ abilities (e.g., abilities may be
underestimated), but such ratings were necessary for gathering data on the learning
characteristics of students taking AA-AAS. In the future, researchers may want to
consider gathering descriptive data on the
respondent or having parents and teachers complete the same inventory to check for
consistency in reporting. Additionally, states used varied data collection techniques,
which we recognize as a limitation. However, the consistency in directions for

completing the LCI was maintained across each of the states and across each of the data
collection techniques.
The fourth significant limitation is, of course, the very low response rate for State
2. With a response rate of approximately only 7%, it would be impossible to generalize
the results from State 2 to the entire population of students in that state who are eligible
for the AA-AAS. Despite this limitation, we did include the results from this state for two
reasons: 1) we did have over 200 individual responses from the state; and 2) while this
was a very limited sample, in general the student characteristic results from State 2
mirror those of States 1 and 3, for which we had response rates of 80% and 47%
respectively. This is especially true in the overall percentage of students in each state who
score at either Level 1 (Symbolic) or Level 2 (Emerging Symbolic) for both the
Expressive and Receptive Language items, and for the overall percentage of students in
each state who initiate/sustain or respond to social interactions. While State 2 teachers did
report a higher incidence of students who used an augmentative communication system,
had no observable awareness of print or numbers, or required assistance for all motor
activities than did teachers from States 1 and 3, we
simply cannot determine whether this is a real difference or an artifact of the small sample from
that state. Further research is clearly needed to establish how states differ in their
identified populations for their alternate assessments.
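One way future research could examine such between-state differences is to test whether the LCI response distributions differ by more than sampling error would explain, for example with a chi-square test of homogeneity. The sketch below applies such a test to the expressive communication counts for States 1 and 2 reported in Table 2 (item responses only, excluding multiple answers and no responses); it is offered as an illustration of the approach rather than an analysis conducted in this study, and, as noted above, no significance test can rule out nonresponse bias in a sample with a 7% response rate.

    # Illustrative sketch: chi-square test of homogeneity on LCI expressive
    # communication counts for two states (counts taken from Table 2).
    from scipy.stats import chi2_contingency

    observed = [
        [799, 193, 92],   # State 1: symbolic, emerging symbolic, pre-symbolic
        [127,  52, 22],   # State 2: symbolic, emerging symbolic, pre-symbolic
    ]
    chi2, p_value, dof, expected = chi2_contingency(observed)
    print(f"chi-square = {chi2:.1f}, df = {dof}, p = {p_value:.3f}")
    # A small p-value indicates the two distributions differ, but it cannot say
    # whether the difference is real or an artifact of State 2's small,
    # potentially nonrepresentative sample.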
Contributing, in all probability, to the low response rate for State 2 in our study
were the timing of the survey and the fact that the survey was electronically
“passed down” from administrators to teachers. Future studies should ensure that teachers
have direct access to the Learner Characteristics Inventory or a similar instrument, and
that the survey is not timed to coincide with other major due dates or year-end activities
for teachers.
Future Research Considerations
There are important considerations for future research investigating the learning
characteristics of students with the most significant cognitive disabilities as well as
possible uses of the LCI instrument. To begin, we have no current data that outline how
many students with the most significant cognitive disabilities are also English Language
Learners (ELL) who participate in the AA-AAS. This is an important consideration to
add to the LCI instrument in order to identify the number of students who are both
English language learners and students with significant cognitive disabilities. In addition,
this information will help states ensure that teachers are providing appropriate instruction based on these
particular students’ learning needs.
Secondly, the AA-AAS for every state is being used to determine AYP for these
students and, in some states, is also part of student and school accountability measures
that have considerable impact (graduation status for individual students, rewards and
sanctions for schools). It is important to know what student characteristics are most
correlated with performance on the AA-AAS. For example, is it possible for states to
design their AA-AAS in such a way that even students at the emerging and pre-symbolic
levels of communication can demonstrate what they know and can do on content linked
to grade level content standards? Further research that links student characteristics on the
LCI with actual AA-AAS scores can begin to answer these questions.
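As an illustration of how such a linkage could be examined, an ordinal LCI rating can be related to AA-AAS scale scores with a rank-order correlation, paralleling the correlations among LCI items reported in Table 3. The sketch below is hypothetical: the scores are invented, and no LCI-to-score linkage was performed in this study.

    # Illustrative sketch: rank-order correlation between an ordinal LCI rating
    # and AA-AAS scale scores. All data below are invented for illustration.
    from scipy.stats import spearmanr

    # 1 = symbolic, 2 = emerging symbolic, 3 = pre-symbolic (expressive communication)
    lci_expressive = [1, 1, 2, 3, 1, 2, 3, 1, 2, 1, 3, 2]
    aa_aas_score = [46, 52, 38, 22, 49, 35, 18, 50, 33, 44, 25, 30]

    rho, p_value = spearmanr(lci_expressive, aa_aas_score)
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")
    # A strong negative rho would indicate that students rated at less symbolic
    # levels tend to earn lower scores; with real, state-linked data this kind of
    # analysis could show whether pre-symbolic students are able to demonstrate
    # what they know on current assessment designs.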
Thirdly, research with the LCI, or similar measures that can reliably and validly
identify the learner characteristics of this population, would be useful in increasing
general public awareness about strengths and challenges for students taking alternate
assessments, and in delineating the extent to which states truly are assessing similar
populations of students in their respective alternate assessments on alternate achievement
standards. For states that may be over-identifying students for their AA-AAS (e.g.,
exceeding the 1% cap on students whose AA-AAS scores can be counted as proficient),
instruments such as the LCI can be useful in determining whether students with more advanced
academic skills (e.g., reading with critical understanding) are being placed into the AA-AAS
and could perhaps be more appropriately placed into other assessment options
(Alternate Assessments on Grade Level Standards, or Alternate Assessments under
Modified Achievement Standards) allowed under NCLB.
Finally, professional development has been identified as a key variable for
teachers with students in the AA-AAS (Browder, Karvonen, Davis, Fallin, & Courtade-Little, 2005). Instruments such as the LCI could be used to tailor professional
development on the AA-AAS to ensure that teachers receive inservice training that
addresses the communication levels of their students, as an essential variable in accessing
the grade level curriculum.
Implications for Practitioners
There are two critical implications for practitioners from this study. We will
discuss each in turn. First, the U.S. Department of Education (2004, 2005) clearly
requires that states develop alternate achievement standards that are linked to grade-level
content standards for students with significant cognitive disabilities. In its NCLB Peer
Review Guidance for states, the U.S. Department of Education (2004) has made this
linkage to grade-level content explicit:

For alternate assessments in grades 3 through 8 based on alternate achievement
standards, the assessment materials should show a clear link to the content
standards for the grade in which the student is enrolled although the grade-level
content may be reduced in complexity or modified to reflect pre-requisite skills.
(p. 15)
The challenge for both state-level policy makers and practitioners is how this
linkage is to be made for students who are functioning at a pre-symbolic level of
communication. It is important to note that this term is not used to describe students who
simply have not been provided with the expressive means (or symbols) to convey content that
they may really know, but rather students who are also functioning at a pre-symbolic level
receptively. Academic content is, by definition, symbolic content; that content becomes
increasingly complex and abstract at higher grade levels. For students at a pre-symbolic
level, then, teachers must teach the development of symbolic communication through the
grade-level content. As noted by Browder, Wallace, Snell, and Kleinert (2005), this
means simultaneously teaching the content while also teaching the symbols by which that
content is represented. For example, for students who are learning to identify key
characters in a story by selecting pictures of those characters, this means learning that
pictures are symbols that can represent actual characters, while learning about the
characters themselves. As a field focused on curriculum and instruction for students with
significant cognitive disabilities, we simply have not yet developed a research base for
how these two important, but very distinct, skill sets (one a developmental and
communicative skill and the other an academic and core content skill) can be effectively
taught in tandem.

The second implication is, in part, recognition of the first. In consideration of the
heterogeneity of learners who are eligible for alternate assessments on grade-level
content standards, NCLB allows multiple alternate achievement standards (U.S.
Department of Education, 2005). According to the U.S. Department of Education (2005),
if a state:
chooses to define multiple alternate achievement standards, it must employ
commonly accepted professional practices to define the standards; it must
document the relationship among the alternate achievement standards as part of
its coherent assessment plan…One reason why a State might choose to develop
more than one alternate achievement standard is to promote access to the general
curriculum and to ensure that students are appropriately challenged to meet the
highest standards possible. (p. 22)
The results of this survey provide some evidence that states might want this option. Given that the one
percent of students with significant cognitive disabilities for whom the AA-AAS is
designed includes both symbolic learners who evidence skills in reading and math and
pre-symbolic learners who display limited social engagement and no awareness of
print or numbers, it would appear to be a reasonable and coherent assessment approach
to consider separate alternate achievement standards for these two sets of students.
Certainly what might be defined as an appropriately challenging alternate achievement
standard in reading for a student who reads basic sight words or sentences (or even reads
fluently with basic understanding from paragraphs) would be defined at a different level
of complexity or scope than for a student with no clear use of gestures, pictures, or signs
to communicate and who had no observable awareness of print. Or conversely, what
would be an appropriately challenging math standard for a student “who could do
computational problems with or without a calculator” would appear to be different for a
student who had no observable awareness of numbers. Still, of course, the caveat remains
that even for students at a pre-symbolic level of communication, states are to consider
alternate achievement standards linked to grade-level content standards, and that if a state
does adopt multiple achievement standards, each set of those alternate standards must
reflect that linkage.
We should also note that, if a state chooses to adopt multiple alternate
achievement standards, the U.S. Department of Education (2005) has described the
relationships that should exist between those multiple sets of standards, specifically: “If,
however, a State chooses to define multiple alternate achievement standards, it must
employ commonly accepted professional practices to define the standards; it must
document the relationship among the alternate achievement standards as part of its
coherent assessment plan” (p. 23). We would argue that, based on the results of this
study, a decision to create multiple alternate achievement standards based upon students’
symbolic use of language does represent a coherent distinction in the students who
participate in the alternate assessment, and also provides a mechanism for relating how
students might move from one set of alternate achievement standards to a more complex
set of standards as they attain formalized, symbolic modes of communicating and
representing what they know.
Conclusion
This study has examined the learner characteristics of students in the alternate
assessment on alternate achievement standards in three very geographically and
demographically different states. Based on our results, it can be argued that students in
the alternate assessment include at least two sub-groups within this population, although it
should be noted that there is no distinct line between the two; the boundary is most likely a
continuum rather than a precise demarcation of symbolic language levels. The first set (and the
majority of the students in our sample) have either symbolic or emerging symbolic levels
of communication, evidence social engagement, and possess at least some level of
functional reading and math skills. The second set of students in our sample (10% to 25%
of our students depending upon the measure and the state) have not yet acquired a formal,
symbolic communication system, do not initiate, maintain, or respond to social
interactions, and have no awareness of print, Braille or numbers. Between these two sets
of students are those who most likely represent skills and abilities characteristic, in part,
of each of these groups. States must consider the educational needs of all these students
in designing their alternate assessments on alternate achievement standards. Most
importantly, states will need to thoughtfully consider, especially for students at a pre-symbolic level of communication, how to ensure linkage to grade-level content standards
in ways that provide meaningful and useful educational targets for those students.

References

Almond, P., & Bechard, S. (2005). In-depth look at students who take alternate assessments: What do we know now? Retrieved March 16, 2006, from http://www.measuredprogress.org/Resources/SpecialEd.html
Bates, E. (1976). Language in context: Studies in the acquisition of pragmatics. New York: Academic Press.
Browder, D., Karvonen, M., Davis, S., Fallin, K., & Courtade-Little. (2005). The impact of teacher training on state alternate assessment scores. Exceptional Children, 71(3), 267-282.
Browder, D. M., Spooner, F., Algozzine, R., Ahlgrim-Delzell, L., Flowers, C., & Karvonen, M. (2003). What we know and need to know about alternate assessment. Exceptional Children, 70, 45-61.
Browder, D., Wallace, T., Snell, M., & Kleinert, H. (2005). The use of progress monitoring with students with significant cognitive disabilities. Washington, DC: American Institutes for Research, National Center on Student Progress Monitoring.
Flowers, C., Ahlgrim-Delzell, L., Browder, D., & Spooner, F. (2005). Teachers’ perceptions of alternate assessments. Research and Practice for Persons with Severe Disabilities, 30(2), 81-92.
Heward, W. (2006). Exceptional children: An introduction to special education (8th ed.). Upper Saddle River, NJ: Merrill/Prentice Hall.
Individuals with Disabilities Education Act Amendments of 1997 (IDEA), PL 105-17, 20 U.S.C. §§ 1400 et seq.
Individuals with Disabilities Education Improvement Act of 2004 (IDEA), PL 108-446, 20 U.S.C. §§ 1400 et seq.
Kampfer, S., Horvath, L., Kleinert, H., & Kearns, J. (2001). Teachers’ perceptions of one state’s alternate assessment portfolio program: Implications for practice and preparation. Exceptional Children, 67(3), 361-374.
Kleinert, H., Haigh, J., Kearns, J., & Kennedy, S. (2000). Alternate assessments: Lessons learned and roads to be taken. Exceptional Children, 67(1), 51-66.
Kleinert, H., & Kearns, J. (1999). A validation study of the performance indicators and learner outcomes of Kentucky’s alternate assessment for students with significant disabilities. Journal of the Association for Persons with Severe Handicaps, 24(2), 100-110.
Kleinert, H., Kearns, J., & Kennedy, S. (1997). Accountability for all students: Kentucky’s Alternate Portfolio system for students with moderate and severe cognitive disabilities. Journal of the Association for Persons with Severe Handicaps, 22(2), 88-101.
Kleinert, H., Kennedy, S., & Kearns, J. (1999). Impact of alternate assessments: A statewide teacher survey. Journal of Special Education, 33(2), 93-102.
Kleinert, H., & Thurlow, M. (2001). An introduction to alternate assessment. In H. Kleinert & J. Kearns (Eds.), Alternate assessment: Measuring outcomes and supports for students with disabilities (pp. 1-15). Baltimore: Paul Brookes.
Midsouth Regional Resource Center. (2004). Compilation of state alternate assessment participation guidelines. Retrieved March 15, 2007, from http://www.rrfcnetwork.org/images/stories/MSRRC/DOCS/ASSESSMENT/alt%20assess%20participation%20guidelines%202.04.doc
Mirenda, P. (2003). Toward functional augmentative and alternative communication for students with autism: Manual signs, graphic symbols, and voice output communication aids. Language, Speech, and Hearing Services in Schools, 34, 203-216.
National Alternate Assessment Center (NAAC). (2005). Access and alignment to grade level content for students with significant cognitive disabilities. Pre-session conducted at the meeting of the Council of Chief State School Officers, San Antonio, TX.
No Child Left Behind Act of 2001, PL 107-110, 115 Stat. 1425, 20 U.S.C. §§ 6301 et seq.
Orelove, F., Sobsey, D., & Silberman, R. (Eds.). (2004). Educating children with multiple disabilities: A collaborative approach (4th ed.). Baltimore, MD: Paul Brookes.
Pellegrino, J., Chudowsky, N., & Glaser, R. (Eds.). (2001). Knowing what students know: The science and design of educational assessment. Washington, DC: Committee on the Foundations of Assessment, National Academy Press.
Turner, M., Baldwin, L., Kleinert, H., & Kearns, J. (2000). An examination of the concurrent validity of Kentucky’s alternate assessment system. Journal of Special Education, 34(2), 69-76.
U.S. Department of Education. (2002-2003). Education Week analysis of data from the Office of Special Education Programs, Data Analysis System.
U.S. Department of Education. (2004). Standards and assessment peer review guidance. Washington, DC: U.S. Department of Education, Office of Elementary and Secondary Education.
U.S. Department of Education. (2005). Alternate achievement standards for students with the most significant cognitive disabilities: Non-regulatory guidance. Washington, DC: U.S. Department of Education, Office of Elementary and Secondary Education.

Figure 1
The Assessment Triangle (Pellegrino et al., 2001)
[Triangle diagram with the three vertices labeled Cognition, Observation, and Interpretation.]

Table 1
Data Collection Techniques for the LCI in States 1, 2, and 3

State      Data Collection Technique
State 1    Online survey
           Paper/pencil version brought to scoring site
           Paper/pencil version completed at scoring site
State 2    Online survey
State 3    Online survey

Table 2
Number of Responses and Percentages for Each Variable for States 1, 2, and 3

Values are N (percent) for State 1, State 2, and State 3, respectively. (— = value not legible in the source document.)

Expressive Language
Uses symbolic language to communicate: Student uses verbal or written words, signs, Braille, or language-based augmentative systems to request, initiate, and respond to questions, describe things or events, and express refusal.
    State 1: 799 (71%); State 2: 127 (63%); State 3: 163 (74%)
Uses intentional communication, but not at a symbolic language level: Student uses understandable communication through such modes as gestures, pictures, objects/textures, points, etc., to clearly express a variety of intentions.
    State 1: 193 (17%); State 2: 52 (26%); State 3: 37 (17%)
Student communicates primarily through cries, facial expressions, change in muscle tone, etc., but no clear use of objects/textures, regularized gestures, pictures, signs, etc., to communicate.
    State 1: 92 (8%); State 2: 22 (11%); State 3: 17 (8%)
Multiple answers
    State 1: 6 (1%); State 2: 0 (0%); State 3: 0 (0%)
No response
    State 1: 30 (3%); State 2: 0 (0%); State 3: 2 (1%)
Total
    State 1: 1120 (100%); State 2: 201 (100%); State 3: 219 (100%)

Receptive Language
Independently follows 1-2 step directions presented through words (e.g., words may be spoken, signed, printed, or any combination) and does NOT need additional cues.
    State 1: 523 (46%); State 2: 68 (34%); State 3: 122 (56%)
Requires additional cues (e.g., gestures, pictures, objects, or demonstrations/models) to follow 1-2 step directions.
    State 1: 461 (41%); State 2: 109 (54%); State 3: 73 (33%)
Alerts to sensory input from another person (auditory, visual, touch, movement) BUT requires actual physical assistance to follow simple directions.
    State 1: 109 (10%); State 2: 21 (10%); State 3: 16 (7%)
Uncertain response to sensory stimuli (e.g., sound/voice; sight/gesture; touch; movement; smell).
    State 1: 18 (2%); State 2: 3 (2%); State 3: 6 (3%)
Multiple answers
    State 1: 1 (0%); State 2: 0 (0%); State 3: 0 (0%)
No response
    State 1: 8 (1%); State 2: 0 (0%); State 3: 2 (1%)
Total
    State 1: 1120 (100%); State 2: 201 (100%); State 3: 219 (100%)

Communication System
Does your student use an augmentative communication system in addition to or in place of oral speech?
Yes
    State 1: 202 (18%); State 2: 60 (30%); State 3: 33 (15%)
No
    State 1: 878 (78%); State 2: 141 (70%); State 3: 184 (84%)
Multiple answers
    State 1: 0 (0%); State 2: 0 (0%); State 3: 0 (0%)
No response
    State 1: 40 (4%); State 2: 0 (0%); State 3: 2 (1%)
Total
    State 1: 1120 (100%); State 2: 201 (100%); State 3: 219 (100%)

Reading
Reads fluently with critical understanding in print or Braille (e.g., to differentiate fact/opinion, point of view, emotional response, etc.).
    State 1: 27 (2%); State 2: NA; State 3: —
Reads fluently with basic (literal) understanding from paragraphs/short passages with narrative/informational texts in print or Braille.
    State 1: 153 (14%); State 2: 24 (12%); State 3: —
Reads basic sight words, simple sentences, directions, bullets, and/or lists in print or Braille.
    State 1: 562 (50%); State 2: 95 (47%); State 3: 71 (33%)
Aware of text/Braille, follows directionality, makes letter distinctions, or tells a story from the pictures that is not linked to the text.
    State 1: 192 (17%); State 2: 28 (14%); State 3: 40 (18%)
No observable awareness of print or Braille.
    State 1: 172 (15%); State 2: 50 (25%); State 3: 28 (13%)
Multiple answers
    State 1: 6 (1%); State 2: 0 (0%); State 3: 0 (0%)
No response
    State 1: 8 (1%); State 2: 4 (2%); State 3: 2 (1%)
Total
    State 1: 1120 (100%); State 2: 201 (100%); State 3: 219 (100%)

Mathematics
Applies computational procedures to solve real-life or routine word problems from a variety of contexts.
    State 1: 29 (2%); State 2: 8 (4%); State 3: 9 (4%)
Does computational procedures with or without a calculator.
    State 1: 641 (57%); State 2: 75 (38%); State 3: 111 (51%)
Counts with 1:1 correspondence to at least 10, and/or makes numbered sets of items.
    State 1: 211 (19%); State 2: 49 (24%); State 3: 59 (27%)
Counts by rote to 5.
    State 1: 76 (7%); State 2: 20 (10%); State 3: 13 (6%)
No observable awareness or use of numbers.
    State 1: 144 (13%); State 2: 45 (22%); State 3: 25 (11%)
Multiple answers
    State 1: 8 (1%); State 2: 0 (0%); State 3: 0 (0%)
No response
    State 1: 11 (1%); State 2: 4 (2%); State 3: 2 (1%)
Total
    State 1: 1120 (100%); State 2: 201 (100%); State 3: 219 (100%)

Vision
Vision within normal limits.
    State 1: 686 (61%); State 2: 136 (68%); State 3: 110 (50%)
Corrected vision within normal limits.
    State 1: 331 (29%); State 2: 35 (17%); State 3: 87 (39%)
Low vision; uses vision for some activities of daily living.
    State 1: 74 (7%); State 2: 22 (11%); State 3: 10 (5%)
No functional use of vision for activities of daily living, or unable to determine functional use of vision.
    State 1: 23 (2%); State 2: 8 (4%); State 3: 10 (5%)
Multiple answers
    State 1: 0 (0%); State 2: 0 (0%); State 3: 0 (0%)
No response
    State 1: 6 (1%); State 2: 0 (0%); State 3: 2 (1%)
Total
    State 1: 1120 (100%); State 2: 201 (100%); State 3: 219 (100%)

Hearing
Hearing within normal limits.
    State 1: 1040 (93%); State 2: 187 (93%); State 3: 208 (95%)
Corrected hearing loss within normal limits.
    State 1: 29 (2%); State 2: 1 (1%); State 3: 4 (2%)
Hearing loss aided but still with significant loss.
    State 1: 12 (1%); State 2: 6 (3%); State 3: 0 (0%)
Profound loss, even with aids.
    State 1: 10 (1%); State 2: 4 (2%); State 3: 0 (0%)
Unable to determine functional use of hearing.
    State 1: 20 (2%); State 2: 3 (1%); State 3: 5 (2%)
Multiple answers
    State 1: 0 (0%); State 2: 0 (0%); State 3: 0 (0%)
No response
    State 1: 9 (1%); State 2: 0 (0%); State 3: 2 (1%)
Total
    State 1: 1120 (100%); State 2: 201 (100%); State 3: 219 (100%)

Motor
No significant motor dysfunction that requires adaptations.
    State 1: 850 (76%); State 2: 153 (76%); State 3: 177 (81%)
Requires adaptations to support motor functioning (e.g., walker, adapted utensils, and/or keyboard).
    State 1: 127 (11%); State 2: 20 (10%); State 3: 15 (7%)
Uses wheelchair, positioning equipment, and/or assistive devices for most activities.
    State 1: 55 (5%); State 2: 3 (2%); State 3: 11 (5%)
Needs personal assistance for most/all motor activities.
    State 1: 73 (6%); State 2: 25 (12%); State 3: 14 (6%)
Multiple answers
    State 1: 4 (1%); State 2: 0 (0%); State 3: 0 (0%)
No response
    State 1: 11 (1%); State 2: 0 (0%); State 3: 2 (1%)
Total
    State 1: 1120 (100%); State 2: 201 (100%); State 3: 219 (100%)

Engagement
Initiates and sustains social interactions.
    State 1: 587 (52%); State 2: 85 (42%); State 3: 130 (59%)
Responds with social interaction, but does not initiate or sustain social interactions.
    State 1: 414 (37%); State 2: 87 (43%); State 3: 69 (32%)
Alerts to others.
    State 1: 84 (8%); State 2: 22 (11%); State 3: 16 (7%)
Does not alert to others.
    State 1: 21 (2%); State 2: 7 (4%); State 3: 2 (1%)
Multiple answers
    State 1: 2 (0%); State 2: 0 (0%); State 3: 0 (0%)
No response
    State 1: 12 (1%); State 2: 0 (0%); State 3: 2 (1%)
Total
    State 1: 1120 (100%); State 2: 201 (100%); State 3: 219 (100%)

Health Issues/Attendance
Attends at least 90% of school days.
    State 1: 901 (80%); State 2: 173 (86%); State 3: 183 (84%)
Attends approximately 75% of school days; absences primarily due to health issues.
    State 1: 156 (14%); State 2: 27 (13%); State 3: 26 (12%)
Attends approximately 50% or less of school days; absences primarily due to health issues.
    State 1: 27 (2%); State 2: 1 (1%); State 3: 5 (2%)
Receives Homebound Instruction due to health issues.
    State 1: 6 (1%); State 2: 0 (0%); State 3: 0 (0%)
Highly irregular attendance or homebound instruction due to issues other than health.
    State 1: 21 (2%); State 2: 0 (0%); State 3: 3 (1%)
Multiple answers
    State 1: 2 (0%); State 2: 0 (0%); State 3: 0 (0%)
No response
    State 1: 7 (1%); State 2: 0 (0%); State 3: 2 (1%)
Total
    State 1: 1120 (100%); State 2: 201 (100%); State 3: 219 (100%)

Table 3
Relationship between Expressive Communication, Receptive Language, Reading, and Mathematics

Variables                        1        2        3        4
State 1
  1. Expressive Communication    -
  2. Receptive Language          .576*    -
  3. Reading                     .574*    .648*    -
  4. Mathematics                 .559*    .634*    .783*    -
State 2
  1. Expressive Communication    -
  2. Receptive Language          .659*    -
  3. Reading                     .674*    .686*    -
  4. Mathematics                 .577*    .568*    .836*    -
State 3
  1. Expressive Communication    -
  2. Receptive Language          .721*    -
  3. Reading                     .649*    .718*    -
  4. Mathematics                 .678*    .694*    .847*    -

* p < .01



File Typeapplication/pdf
File Title05 Karvonen p29
AuthorEric Pendley
File Modified2008-08-11
File Created2007-10-30

© 2024 OMB.report | Privacy Policy