National Study on Alternate Assessments (NSAA)
This study responds to a congressional mandate in Section 664(c) of the 2004 reauthorization of IDEA, which calls for a “study on ensuring accountability for students who are held to alternative achievement standards.” More specifically, this legislation requires “a national study or studies to examine (1) the criteria that States use to determine (A) eligibility for alternate assessments; and (B) the number and type of children who take those assessments and are held accountable to alternative achievement standards; (2) the validity and reliability of alternate assessment instruments and procedures; (3) the alignment of alternate assessments and alternative achievement standards to State academic content standards in reading, mathematics, and science; and (4) the use and effectiveness of alternate assessments in appropriately measuring student progress and outcomes specific to individualized instructional need.”
The Individuals with Disabilities Education Act Amendments of 1997 (IDEA ’97) first directed states to develop, and to implement by 2000, alternate assessments as an option for students with disabilities who cannot participate in regular assessments, even with accommodations. In response, states developed a variety of approaches to the design and implementation of such assessments, including portfolios, rating scales, and performance events (Thompson and Thurlow 2001). However, it was not always clear how such assessments would link to state academic content standards, meet standards for technical adequacy, or be incorporated into accountability reporting (Quenemoen, Rigney, and Thurlow 2002). Subsequently, the No Child Left Behind Act of 2001 (NCLB) required states to implement statewide accountability systems for all public schools that are based on challenging state standards in reading, mathematics, and science, and on annual testing of students. States must establish three levels of performance (basic, proficient, and advanced) on the grade-level assessments and set annual performance targets against which to measure adequate yearly progress (AYP) to ensure that all groups of students remain on a trajectory toward proficiency by 2014. AYP targets must be determined, met, and reported for specific subgroups of students, including those with disabilities and those who participate in alternate assessment systems. As a result, state alternate assessment systems not only vary in their approach, collected evidence, eligibility criteria, and technical characteristics, but also continue to be modified, so that the national perspective on them is in a state of flux. This flux reflects the ongoing evolution of state systems in response to NCLB and IDEA, as well as to several federal rules governing the inclusion of alternate assessment results in accountability frameworks.
In 2005, the National Center for Special Education Research in the Institute of Education Sciences of the U.S. Department of Education awarded a 4-year contract to SRI International to conduct the National Study on Alternate Assessments (NSAA). This project has three major objectives: (1) to produce state profiles for all 50 states and the District of Columbia, plus a national summary profile; (2) to describe and explicate in a selected sample of states (a) the characteristics of alternate assessments, processes of student placement, alignment with content standards, and uses of data; (b) the state and local processes that facilitate or impede the implementation of alternate assessments, alternate achievement standards, and modified academic achievement standards; and (c) consequences for students with disabilities; and (3) to conduct a quantitative analysis of the relationships between variables in alternate assessment systems and student outcomes. The project will accomplish these objectives by means of an analysis of state documents; a national telephone interview survey; case studies of states, local districts, schools, and students with disabilities; and quantitative analysis of data on alternate assessments and student outcomes. SRI International is partnering in this project with the University of Minnesota’s National Center on Educational Outcomes (NCEO) and Policy Studies Associates (PSA). A schedule of project activities is included in Appendix A. The national telephone interview survey (Task 4) is the subject of the current information collection request.
To date, the NSAA is the only in-depth and comprehensive national study of alternate assessments. Members of Congress, Department of Education program and evaluation staff, state and local policymakers, researchers, and practitioners need the information that will be compiled in this study to help ensure that this and future federal programs have their intended effect: giving students with disabilities, including those with significant cognitive disabilities, the same opportunity to achieve high standards and holding them to the same high expectations as all other students in each state.
The data collected for this task will be used to develop a better understanding of how alternate assessment systems nationally can influence student outcomes and access to the general curriculum. More specifically, the NSAA data will be used
by ED evaluation staff to disseminate information on effective and ineffective practices to state and local policymakers, who may use the data to support the improvement of alternate assessments and accountability systems;
by Congress (the Health, Education, Labor, and Pensions Committee of the Senate and the Committee on Education and the Workforce of the House of Representatives) to inform future legislation for promoting access and accountability for students with significant cognitive disabilities; and
by researchers, who may use the data to inform future studies on types of alternate assessments, technical adequacy, alignment, and accountability.
During the data collection period, a telephone number and an e-mail address will be available to permit respondents to contact the contractor with questions or requests for assistance. The telephone number and e-mail address will be printed on all data collection instruments. As noted above, an electronic template has been developed to record state data. This computer-based system has multiple functionalities: in addition to facilitating data entry, the NSAA data collection system can create a tailored list of questions for the state interviews and support data analysis and reporting. The electronic system will also be used to monitor the flow of data collection activities, from survey administration to processing and coding to entry into the database. This monitoring will help to ensure the efficiency and completeness of the data collection process.
This data collection activity is one of the U.S. Department of Education’s primary efforts to evaluate alternate assessments, including student outcomes and the quality of standards and accountability systems. The contractor is working to minimize the potential burden on participating states by working with ED to collect only data that are not available from secondary sources (such as the peer review documents submitted by states under NCLB) or are not being collected by other research studies supported by the federal government. The NSAA has established collaborative relationships with other research and technical assistance projects to avoid duplication of efforts and maximize the shared use of data.
No small businesses or entities will be involved as respondents.
Failure to collect this information will prevent Congress and ED from evaluating important aspects of the quality of alternate assessments nationally, as mandated under IDEA. The study will be collecting information that has not been systematically acquired and analyzed by other data collection efforts for alternate assessments. This information will be collected only once.
None of the special circumstances listed apply to this data collection.
A notice about the study will be published in the Federal Register when this package is submitted to provide the opportunity for public comment. In addition, throughout this study, the contractor will draw on the experience and expertise of a technical working group (TWG) that provides a diverse range of experience and perspectives, including representatives from the school, district, and state levels, as well as researchers with expertise in relevant methodological and content areas. The members of this group and their affiliations are listed in exhibit 1. The TWG members were informed of state data collection activities at their meeting on November 29-30, 2005.
Exhibit 1
Technical Working Group Membership
Member | Affiliation | Interest & Expertise
Diane Browder | University of North Carolina | Large-scale assessments, alternate assessment methodologies, students with significant cognitive disabilities, special education policy research
Lizanne DeStefano | University of Illinois, Champaign-Urbana | Alternate assessment, transition, accountability, local implementation
Stephen Elliott | Vanderbilt University | Large-scale assessments, alternate assessment methodologies, psychometrics, special education policy research
Janet Filbin | Jefferson County School District, Colorado | Educational document analysis, large-scale assessments, alternate assessment methodologies, students with significant cognitive disabilities
Margaret Goertz | University of Pennsylvania | Educational document analysis, case study methodology, special education policy research, accountability
Brian Gong | The National Center for the Improvement of Educational Assessment (NCIEA) | Psychometrics, accountability, alternate assessment
Jacqui Kearns | University of Kentucky | Case study methodology, large-scale assessments, alternate assessment methodologies, students with significant cognitive disabilities, special education policy research
Scott Marion | The National Center for the Improvement of Educational Assessment (NCIEA) | Psychometrics, accountability, alternate assessment
Kevin McGrew | Institute for Applied Psychometrics | Sampling methodology, large-scale assessments, quantitative methodologies, alternate assessment methodologies, psychometrics, students with significant cognitive disabilities, special education policy research
Gerald Tindal | University of Oregon | Sampling methodology, large-scale assessments, quantitative methodologies, alternate assessment methodologies, students with significant cognitive disabilities, special education policy research
Dan Wiener | Massachusetts State Department of Education | Large-scale assessments, alternate assessment methodologies, special education policy research, state implementation and accountability
The first 60-day public comment period was announced in the Federal Register on October 11, 2006, with comments due by December 11, 2006. The following two comments were received. Following each public comment is a response from the program sponsor.
Public Comment #1: “The schools still attempt to not fully take care of disabled children. They make it so very very difficult for a parent to get the proper care for their children and turn away so many parents in this fashion of requiring endless meetings and endless turn downs. This happens in NJ. You have to be a very rich parent and a very determined parent with lots of free time to get the proper care for a disabled child in New Jersey. So the school reports are not accurate as to what is being done for these children, many of whom can grow up to be fully productive citizens if they get the help in the beginning of their lives.”
Program Sponsor Response: The National Study on Alternate Assessments focuses on alternate assessments that are used primarily for accountability purposes and will not directly focus on the process of evaluation, eligibility determination, and placement to which this public comment refers. Some of the findings of the study may be relevant to the concerns of this commenter, but that is not its primary purpose, and no changes will be made to the data collection.
Public Comment #2: “The Minnesota Department of Education is currently in the process of developing a new alternate assessment based on alternate achievement standards, as the previously-developed alternate assessments did not meet NCLB requirements. I understand that a pilot study to collect information on alternate assessments is planned for fall, 2006 and the official study will follow. I have reviewed the information needed for this study and believe that it would be advantageous to delay the beginning of the study so that Minnesota is able to respond more fully to the requests for information.”
Program Sponsor Response: A number of states are revising their alternate assessments, and the process of evaluation and revision of assessment instruments may continue for several years. Since the National Study on Alternate Assessments is required under federal law, it is not feasible to wait until all states have reached long-term stability in their assessment policies and instruments. Thus, the program sponsor determined that the survey should study the 2005-06 academic year, which was the focus of the recent NCLB peer review process, but that the survey should also obtain information about anticipated revisions to capture the type of information discussed by the commenter. The instrument already contains items to this effect.
No payments or gifts will be made to respondents.
SRI, Policy Studies Associates (PSA), and the University of Minnesota are dedicated to maintaining the confidentiality of information on human subjects and sensitive data. The contractors recognize the following minimum rights of every subject in the study: (1) the right to privacy, (2) the right to informed consent, and (3) the right to refuse participation at any point during the study. Respondents will be assured of confidentiality to the extent offered by law in the initial invitation to participate in the study, and this assurance will be reiterated at the time data collection begins. A set of standards and procedures has been established by the contractors to safeguard the privacy of participants and the security of data as they are collected, processed, stored, and reported. These standards and procedures are summarized below.
Project team members will be educated about the confidentiality assurances given to respondents and the sensitive nature of the materials and data to be handled. Each person assigned to the study will be cautioned not to discuss confidential data and will be required to sign a written statement attesting to his or her understanding of the significance of this requirement.
In training the interviewers, emphasis will be placed on the privacy and confidentiality aspects of the study and on the fact that any violation of procedures could have serious consequences for research participants. Personnel will be cautioned not to discuss interview data with anyone outside the study and to restrict discussion within the project to the essential needs of the data collection activity.
Participants will be informed of the purposes of the data collection and the uses that may be made of the data collected.
Access to the database will be limited to authorized project members. Multilevel user codes will be used, and entry passwords will be changed frequently.
All surveys and other documents will be stored in secure areas accessible only to authorized staff members. Computer-generated printouts containing identifiable data will be maintained under these same conditions.
As required, data tapes or disks containing sensitive data will be degaussed prior to their reuse.
All basic computer files will be duplicated on backup disks to allow for file restoration in the event of unrecoverable loss of the original data. These backup files will be stored under secure conditions in an area separate from the location of the original data.
In addition, SRI maintains its own Institutional Review Board. All proposals for studies in which human subjects might be used are reviewed by SRI’s Human Subjects Committee, appointed by the President and Chief Executive Officer. For consideration by the reviewing committee, proposals must include information on the nature of the research and its purpose; anticipated results; the subjects involved and any risks to subjects, including sources of substantial stress or discomfort; and the safeguards to be taken against any risks described. Although the data gathered from each state will be used to develop individual state profiles, individual respondents will not be identified.
No questions of a sensitive nature will be included in the state telephone interviews.
The estimates in exhibit 2 reflect the burden for notification of study participants and their participation in the following activities:
Time for respondent(s) to review and verify the information in the data summary and to prepare for discussion during the interview. If state staff other than the person initially contacted are required to review the summary, the number of hours per participant will be reduced, but the total number of hours will remain the same.
Time associated with completing the state telephone interview. If state staff other than the person initially contacted are required to answer any of the telephone interview items, the number of hours per participant will be reduced, but the total number of hours will remain the same.
Exhibit 2
Estimated Burden for Alternate Assessment Data Summary Verification and Follow-up Telephone Interview

Activity | No. of Participants | No. of Hours per Activity | Total No. of Hours | Estimated Burden
Verification of state alternate assessment system summary | 51 states | 4.0a | 204 | $8,160
Follow-up telephone interview | 51 states | 2.0a | 102 | $4,080
Total | 102 | 6.0 | 306 | $12,240
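The totals in exhibit 2 follow directly from the per-activity figures. As a minimal arithmetic check, assuming the $40-per-hour respondent cost rate implied by the table (the rate referenced by the table's footnote marker is not reproduced in this document):

\[ 51 \times 4.0\ \text{hours} = 204\ \text{hours}; \qquad 204 \times \$40 = \$8{,}160 \]
\[ 51 \times 2.0\ \text{hours} = 102\ \text{hours}; \qquad 102 \times \$40 = \$4{,}080 \]
\[ 204 + 102 = 306\ \text{hours}; \qquad \$8{,}160 + \$4{,}080 = \$12{,}240 \]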
There are no additional respondent costs associated with this data collection other than the hour burden estimated in item 12.
The annual costs to the federal government for this survey, as specified in the contract, are:
Fiscal year 2006 | $145,840
Fiscal year 2007 | $395,670
Total | $541,510
This request is for a new information collection.
As part of the NSAA, SRI will produce five reports in the course of the study, including reports on the document analysis, telephone interview, state and national profiles (using data from document analysis and telephone interview), case studies, and quantitative analyses (see exhibit 3 for dissemination schedule). The focus of the reports will vary, based on the data collection activities.
Exhibit 3
Schedule for Dissemination of Study Results
Activity/Deliverable | Due Date
Document Analysis Report, first draft | 2/28/07
Document Analysis Report, final version | 4/27/07
Telephone Interview Report, first draft | 5/31/07
Telephone Interview Report, final version | 7/31/07
State and National Profiles, first draft | 12/29/07
State and National Profiles, final version | 3/29/08
Case Study Report, first draft | 1/30/09
Case Study Report, final version | 5/29/09
Quantitative Analysis Report, first draft | 6/30/09
Quantitative Analysis Report, final version | 9/30/09
All data collection instruments will include the OMB expiration date.
No exceptions are requested.
NSAA Schedule
Management Matrix/Timeline of Project Activities and Milestones for National Study on Alternate Assessments
Years 1 and 2
[Timeline chart: project months 1-24, October 2005 through September 2007]
1. Communication with ED (Blackorby/Cameto)
1.1 Kickoff meeting with ED
1.2 Submit monthly progress reports
2. Technical Work Group (TWG) (Blackorby/Cameto)
2.1 Select and recruit TWG
2.2 Convene initial TWG meeting
2.3 Convene periodic TWG meetings
3. Study Design (Blackorby/Cameto)
3.1 Draft study design
3.2 Final study design
3.3 Document IRB approval
3.4 Databases & tracking systems
5. Document Analysis (Cameto)
5.1 Identify and obtain documents and data sources (peer review documents)
5.2 Develop materials and procedures for summarizing and analyzing documents (develop scoring system/rubric to review)
5.3 Submit report of document analysis
4. Telephone Interview Survey (Cameto)
4.1 Develop survey
4.2 Pilot test and finalize survey
4.3 Develop and submit OMB Clearance Package
4.4 Recruit respondents
4.5 Train interviewers
4.6 Conduct telephone interview survey
4.7 Submit report of survey results
6. State and National Profiles (Cameto)
6.1 Develop templates for profiles
6.2 Develop state and national profiles
7. Case Studies (Padilla)
7.1 Develop sampling strategy
7.2 Develop case study protocols and procedures
7.3 Develop & submit OMB Clearance Package
7.4 Recruit case study participants
7.5 Recruit and train data collectors
7.6 Collect case study data
7.7 Submit case study report
Management Matrix/Timeline of Project Activities and Milestones for National Study on Alternate Assessments
Years 3-4
[Timeline chart: project months 25-48, October 2007 through September 2009]
1. Communication with ED (Blackorby/Cameto)
1.2 Submit monthly progress reports
2. Technical Work Group (TWG) (Blackorby/Cameto)
2.3 Convene periodic TWG meetings
3. Study Design (Blackorby/Cameto)
3.4 CD-ROM of study-specific information and data
6. State and National Profiles (Cameto)
6.2 Develop state and national profiles
7. Case Studies (Padilla)
7.4 Recruit case study participants
7.5 Recruit and train data collectors
7.6 Collect case study data
7.7 Submit case study report
8. Quantitative Analysis (Lash/Blackorby)
8.1 Develop detailed analysis plan
8.2 Collect additional data as needed
8.3 Conduct quantitative analysis
8.4 Submit report of quantitative analysis