
Eighth Grade Access to Algebra I: A Study of Virtual Algebra

OMB Clearance Request

Supporting Statement
Part A

December 2007

Prepared For:

Institute of Education Sciences

United States Department of Education

Contract No. ED‑06‑CO‑0025

Prepared By:

Regional Educational Laboratory—Northeast and the Islands

55 Chapel Street

Newton, MA 02458-1060

Supporting Statement

Request for Clearance of Information Collection Forms for

Eighth Grade Access to Algebra I: A Study of Virtual Algebra



A1. Justification

The Institute of Education Sciences of the U.S. Department of Education is conducting a study of expanding 8th graders’ access to Algebra I by delivering the course online. The research is to be carried out by the Regional Educational Laboratory—Northeast and the Islands (REL-NEI) through several of its partner organizations: the Education Development Center (EDC), the American Institutes for Research (AIR), Windwalker Corporation, Nimble Assessment Systems, Class.com, and the Northwest Evaluation Association (NWEA). The current authorization for the Regional Educational Laboratories (REL) program is under the Education Sciences Reform Act of 2002, PL107-279, Part D, Section 174, administered by the Institute of Education Sciences’ National Center for Education Evaluation and Regional Assistance (see Attachment 1).

The national priority for the 2005-2010 REL awards is addressing the goals of the reauthorized Elementary and Secondary Education Act (ESEA). ESEA requires that schools and districts measure academic performance in reading and mathematics in grades three through eight plus one grade in high school to identify weaknesses and make changes appropriately. Schools are considered to have made adequate yearly progress only if all student groups, including poor and minority students, students with limited English proficiency, and students with disabilities, meet the state’s adequate yearly progress targets. Schools and districts that do not make sufficient progress toward meeting their State’s adequate yearly progress targets, based on State-defined academic achievement standards and adequate yearly progress measures, are classified as in need of improvement. The Regional Educational Laboratories are charged with

…carrying out applied research projects that are designed to serve the particular educational needs (in prekindergarten through grade 16) of the region in which the regional educational laboratory is located, that reflect findings from scientifically valid research, and that result in user-friendly, replicable school-based classroom applications geared toward promoting increased student achievement, including using applied research to assist in solving site-specific problems and assisting in development activities (including high-quality and on-going professional development and effective parental involvement strategies) (PL107-279, Part D, Section 174, paragraph f, subparagraph 4).

The use of virtual courses grows each year, yet little evidence-based research has established their effectiveness. Virtual courses delivered through online communications have rapidly become a significant part of the U.S. education system. At the college level, 90% of public institutions offer some virtual courses to their students, and an estimated 2.6 million college students took online courses in the United States in 2004 (Allen & Seaman, 2004).

Although colleges have led the movement to virtual courses, their use is growing rapidly at the secondary level, showing a tenfold increase in 4 years, from an estimated 40,000–50,000 students in 2001–2002 to more than 520,000 in 2004–2005 (McLeod, Hughes, Brown, Choi, & Maeda, 2005). As of October 2006, 24 states have statewide online learning programs, and all 50 states have cyber schools and/or district-level online programs (Watson, 2006). The number of schools offering online courses is large and growing dramatically, as is the number of students participating in those courses. The Sloan Consortium, an online learning association, recently conducted a national survey of K–12 school districts and found that nearly two thirds (63%) of U.S. school districts have one or more students enrolled in online courses. The surveyed districts predicted that their online enrollments would grow by approximately 20% over the next 2 years (Picciano & Seaman, 2007). The National Education Association (NEA) estimated in 2006 that a majority of high school students will have taken an online course before graduating (NEA, 2006).

Offering coursework virtually is a strategy schools use to expand the curricula available to their students. Virtual programs often fill gaps in curricula, such as providing Advanced Placement and other courses that are not otherwise offered. Virtual programs also provide courses that help students make up credits for missed or failed classes. This is particularly important for schools that have trouble offering particular courses to students who are ready for them, as well as schools that have trouble recruiting and retaining a sufficient number of highly qualified teachers. High-poverty, low-performing schools most often face this persistent problem, both in rural and urban areas (Coulter, 2007). Using online courses to expand curricular access is of particular interest in rural areas, where limited resources and small student bodies often make it difficult to provide students with a full range of course offerings. Virtual courses are potentially an important solution to ensuring that students in small schools and isolated communities have access to critical courses, especially in science, technology, engineering, and mathematics (STEM) subjects (Tucker, 2007).

There is a distinct national and regional need for further information about the effectiveness of virtual coursework across a range of contexts. The Northeast region is home to several thriving virtual school programs, including the Virtual High School consortium (based in Massachusetts) and Accelerate U (based in New York). However, the growth of online learning in this region is slower than in other parts of the country, though the need—particularly in rural areas and high-needs urban areas—is great. A key obstacle to growth of online learning in the region has been a lack of consistent policy guidance. Some jurisdictions in the region, such as Vermont and New Hampshire, have recently begun to address the issue more systematically by forming state task forces or policy groups on online learning. The proposed study will provide critical guidance to these efforts and inform decisions about whether and how to continue to invest in building comprehensive and coordinated policy guidance on the use of virtual courses in the region to target specific purposes and populations.

Many states and districts have made commitments to the goal of widespread access to Algebra I in grade 8. However, 10 years after the recommendation from the U.S. Department of Education, schools throughout the country and throughout the Northeast region are delivering algebra courses to only a small proportion of their 8th graders. According to the NCES (2007) transcript study, in 2005, only 20% of U.S. 8th graders in rural schools took Algebra I, whereas 28% in “urban fringe” and 28% in “central city” settings did so.

A scan of Northern Tier (Vermont, New Hampshire, Maine) schools in spring 2007 found a wide range of availability and strategies for giving students access to algebra in 8th grade. In many schools, teachers or administrators believe some but not all 8th grade students are ready to take Algebra I. Some teachers and administrators are not able to provide any Algebra I instruction, but are eager to find the resources that would allow them to do so. Others are able to provide such a course on a pull-out basis or as a freestanding course to a limited number of students. Still others, particularly in Maine, are serving these students by delivering an integrated mathematics curriculum that introduces some algebraic concepts, suggesting that they believe most students are ready for the course content in the eighth grade.

Currently in these schools (in the Northern Tier states and elsewhere), 8th graders who do not have access to Algebra I in their own school do not take the course until their first year of high school. In rare cases, these students travel to the high school to take the course, a challenging solution for both the schools and the students. Online courses can allow any number of 8th graders within a school to participate in a full Algebra I course, interact with peers around the curriculum, and be taught by a highly qualified teacher, even in cases where teachers have determined that only a small number of students are ready for the course.

Because Maine has a strong technology initiative that can support the infrastructure schools need to offer an online course, we have selected Maine as the target state for the study. Eighth grade students in Maine currently use laptops in their daily instruction, so engaging with information delivered online is already a familiar mode of learning. By targeting this state, we are able to focus the study where there is a real need for online courses, where efforts are being made to offer Algebra I earlier, in 8th grade,1 and where the technology infrastructure necessary for conducting this study is in place. Although we are initially targeting Maine, we are prepared to expand the study into Vermont and New Hampshire, if necessary.2

The goal of this randomized controlled trial is to ascertain the impact of these courses on student achievement in terms of test scores and later mathematics course-taking. In particular, this study addresses the following policy questions:

  • Given that schools need to support students who are ready to take algebra in 8th grade, is it better for those students, in schools where the course is not otherwise offered, to take Algebra I online in 8th grade than to take the available 8th-grade math course?

  • Given that schools need to continue to find ways to improve mathematics achievement, is taking virtual algebra in 8th grade beneficial for the students whom their schools judge to be ready, as measured by mathematics achievement, courses taken, and credits earned?

The primary research questions of this study are as follows:

  • Do students who have access to an online Algebra I course perform better in terms of mathematics achievement at the end of 8th grade and at the beginning of 10th grade than they would have in the absence of access to an online Algebra I course?

  • Does a greater proportion of students from schools that offer Algebra I online to 8th graders enroll and succeed in higher-level math courses in 9th and 10th grade than students from schools that do not offer Algebra I to 8th graders?

To test these research questions, we will randomly assign schools in Maine that do not currently offer one full section of Algebra I to 8th graders to implement an online Algebra I course (the intervention condition) or not (the control condition). The impact of the online Algebra I course on student outcomes will be measured by comparing student achievement and early high school course-taking for students from schools that offered the online course to students from schools that did not. We will use hierarchical linear modeling (HLM) to estimate the short-term effect of taking Algebra I online in 8th grade on students’ mathematics achievement, as well as the long-term effect on 9th and 10th grade course-taking patterns (see response to A16 for details regarding our planned data analyses).



Request for OMB Clearance

This rigorous study has been designed to provide causally valid answers and follows IES standards, as described in its authorizing legislation, for studies of effectiveness through field tests based on experimental designs. This submission is a request for approval of the data collection instruments (student and teacher surveys) that will be used to support the evaluation of offering Algebra I online as a means of increasing student access to the course. This statement describes the study approach and methodology for collecting and analyzing data. All instruments are appended to this submission, which addresses OMB concerns regarding respondent burden and paperwork control. This submission has been prepared according to guidelines for completing the justification statement to accompany OMB IC Parts 1 and 2.



A2. Purpose and Use of the Information Collected

Several measures will be used, including instruments to measure student outcomes, surveys of students and teachers, classroom observation measures, and administrative data collection protocols. Each is described in detail below. This research study is a well-powered, randomized field trial intended to provide a large-scale, rigorous test of Virtual Algebra’s effectiveness in increasing access to Algebra I, and in enhancing advanced course-taking. The data collected during this evaluation will be useful for state and local policymakers, districts, and schools. An important goal of this evaluation is to produce findings that will help decision-makers to determine how best to offer Algebra I to 8th graders, and to document how offering Algebra I in 8th rather than 9th grade creates opportunities for taking additional math and science courses (which is consistent with the goals of the recently enacted America COMPETES Act). As noted previously, the proposed study will provide critical guidance and inform decisions about whether and how to continue to invest in building comprehensive and coordinated policy guidance on the use of virtual courses in the region to target specific purposes and populations.

Please note that based on 5 CFR 1320.3(d), we do not seek clearance or declare burden for the student achievement measures, classroom observation measures, or administrative data collection protocols. According to NCEE guidelines, student cognitive measures, classroom observations, and administrative data are not counted towards burden estimates. Please see our response to A9 for additional details.

Student Outcomes

We propose to examine the impact of virtual algebra for 8th graders on student achievement and on course-taking in high school. Our research questions are reiterated here:

  • Do students who have access to an online Algebra I course perform better in terms of mathematics achievement at the end of 8th grade and at the beginning of 10th grade than they would have in the absence of access to an online Algebra I course?

  • Does a greater proportion of students from schools that offer Algebra I online to 8th graders enroll and succeed in higher-level math courses in 9th and 10th grade than students from schools that do not offer Algebra I to 8th graders?

These questions, and the measures we will use to address them, are directly aligned with the policy-related questions that underlie the study. Previous literature and policy goals support the premise that access to Algebra I in 8th grade benefits students over time in terms of achievement and their course-taking sequence. Our study goal is to measure the extent to which access to Algebra I in the form of the virtual course leads to these outcomes.

The outcome measures for the study reflect the longitudinal nature of the design. We will use the following measures at the specified points in time:

  1. At the end of 8th grade (spring 2009), we will gather student scores on the state mathematics assessment. In addition to overall composite scores, we will work with Maine’s math assessment experts to identify and extract the strand of items that test algebraic concepts, if possible.

  2. At the end of 8th grade (spring 2009), we will administer a mathematics assessment—the NWEA MAP—to students in the study. The NWEA MAP math assessment uses a computerized adaptive assessment program based on item response theory and reports scores as reference-normed scores (Rasch index scores) and percentile scores. The NWEA MAP is a flexible instrument that can be tailored for the purpose of the study. We will define a set of parameters, including the length of the test (e.g., 30 minutes) and the proportion of items that test algebraic concepts (e.g., 50% of the test) versus other math concepts appropriate for 8th graders (e.g., 50% of the test). The test is reliable, with strong psychometric properties, and is currently being used as an outcome measure in AIR’s IES-funded evaluation of the impact of professional development in mathematics. The purpose of administering an assessment at the end of 8th grade in addition to using the state test scores is that we have been advised that the MEA may not include enough algebra items to serve as the only outcome measure to address our research questions.3

  3. At the end of 8th grade (spring 2009), we will gather final grades in the 8th-grade mathematics courses taken by all students in the study.

  4. At the end of 9th grade (spring 2010), we will gather transcript data for the students in the study to record course credits and grades earned during 9th grade. We will review and code each transcript to capture the type and rigor of mathematics (and science) courses completed and the grade earned.

  5. At the beginning of 10th grade (fall 2010), we will gather enrollment information for students in the study to record whether students are enrolled in math and science courses and which courses they are taking. This information will be coded to capture the type and rigor of mathematics (and science) courses in which each student is enrolled.

  6. In fall 2010, we will also collect individual student scores on the PSAT. As entering 10th graders, students in the study will be required by state mandate to take the PSAT along with all other entering 10th graders in Maine. These scores will serve as another policy-relevant outcome measure of the impact of virtual algebra in 8th grade over time.

The coding of the transcript data for analysis will be conducted using the methods employed by NCES for the NAEP and Education Longitudinal Study (ELS) transcript studies to ensure validity of measurement across sites. Transcript coding forms will be developed to guide the extraction of course identifiers and grades. Trained coders will code math courses based on the NCES coding system, the Classification of Secondary School Courses (CSSC), which is based on information available in school catalogs and other information sources (NCES, 2007). We will use Web-based data entry and aggregation forms to facilitate the timely collection and coding of the transcript data. Information about course names, credits earned, and course grades is transcribed and standardized. Course credits are converted to standardized Carnegie units, and letter grades (A–F) are converted to a point system (0–4). Points are then weighted by the number of Carnegie units earned by course type to yield each student’s score for the math class taken in 9th grade.4, 5
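To illustrate the conversion described above, the following minimal sketch computes a Carnegie-unit-weighted course-taking score. It is an illustration only, not the study’s coding software; the record fields and the interpretation of the weighting as a unit-weighted average are assumptions based on the description above.

```python
# Minimal sketch of the course-taking score computation described above.
# The record fields are hypothetical; the A-F -> 4-0 point mapping and the
# Carnegie-unit weighting follow the description in the text.

GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

def course_taking_score(courses):
    """Weight grade points by Carnegie units across a student's 9th-grade
    math courses and return the unit-weighted average, or None if the
    student earned no math credits."""
    total_units = sum(c["carnegie_units"] for c in courses)
    if total_units == 0:
        return None
    weighted = sum(GRADE_POINTS[c["grade"]] * c["carnegie_units"]
                   for c in courses)
    return weighted / total_units

# Example: a full-credit Algebra I course with a B plus a half-credit
# math elective with an A yields (3.0*1.0 + 4.0*0.5) / 1.5 = 3.33.
print(course_taking_score([
    {"carnegie_units": 1.0, "grade": "B"},
    {"carnegie_units": 0.5, "grade": "A"},
]))
```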

The course-taking scores for spring 2009 (exiting 9th graders) are composite measures of the number and type of course taken and grade earned. The course-taking scores for fall 2010 (courses enrolled in as entering 10th graders) are simpler measures of type of course enrolled in (because grades will not yet be available). All course-taking scores will be linked with the other data on each student by their statewide student identifier.

Our plans for tracking the students in the study over time rely on REL-NEI’s relationships with Maine’s Department of Education. As of the 2006–2007 school year, each student in Maine has a statewide unique identifier (Data Quality Campaign, 2007). Though the state database does not include student-level transcript records that would provide information on courses completed and grades earned, it does contain student-level school enrollments and demographic information. Therefore, we will use the state database as the source for our achievement outcome measures, and we will also use the state database as the source for identifying the high schools that the students in the study attend during the 2009–2010 and 2010–2011 school years.

Before the students leave 8th grade, however, we will collect information to help us track each student over time. On the student survey that we administer in the spring, we will ask students to identify the high school they plan to attend the following fall, as well as to provide contact information for family members. Because we do not plan to administer an assessment to the students again once they leave 8th grade, our main challenge in tracking the students into high school is identifying their data over time in administrative records at the state, district, and school levels. We do anticipate the need to contact the high schools individually to track down transcripts, though with only 125 high schools statewide and with support from Maine’s Department of Education, we will have the capacity to collect these data.

Student and Teacher Background/Demographic Data

We will collect and use background and demographic data from the schools, teachers, and students that participate in the study.

We will obtain school data from administrative records from the state, districts, and/or schools themselves. Some of this information will be collected during the recruitment process (e.g., the number of math teachers on staff for 8th-grade math, curriculum used), and some of this information will be gathered prior to recruitment for the purpose of identifying potentially eligible schools (e.g., school enrollments and locale).

Although student math achievement is our primary outcome measure, we believe it is important to gather background and attitude data to provide additional context for our findings. To do this, we will administer a brief web-based survey (about 20 minutes) to all 8th-grade students in treatment and control schools at the same time that we administer the NWEA MAP assessment, in the spring of 2009 (see Attachment 2). We will include questions regarding comfort with using technology, and a measure of attitudes toward mathematics (including engagement), along with a course evaluation. The course evaluation will contain three sections, one focusing on the quality of the material (e.g., assignments, textbook), another focusing on the quality of instruction (e.g., the teacher’s organization, preparation, communication), and the last on overall judgments of the experience (e.g., how difficult the course was, how well-prepared one feels for the next math class).

We will also administer a brief, web-based teacher survey to all of the 8th-grade math teachers in treatment and control schools, including the virtual algebra teachers (see Attachment 3). The primary purpose of the teacher survey, administered in spring 2009, is to gather data from teachers on their background characteristics (years of experience, certification status), the amount and type of professional development in mathematics in which they participate, the amount of algebra they teach to 8th graders, and the instructional methods they use to deliver algebra content.

Classroom Instruction and Fidelity of Treatment

Field staff will visit a random subsample of schools (approximately one-third of the schools in the study) once per semester (i.e., twice during the school year) to conduct classroom observations. The classroom observations for both conditions will use concrete, low-inference measures in the form of discrete questions that will prompt observers to indicate whether or not a relevant event has occurred during the class period.

During these site visits, field staff will also use a simple protocol to guide the collection of classroom materials for the regular, face-to-face eighth-grade math classes, including course syllabi, a sample of teacher assignments, and exams. These materials will be coded for content to quantify the amount and type of algebra concepts that are taught in the regular eighth-grade math classes.

The purpose of the classroom observations is twofold: to provide useful implementation information that will help us ground our findings and to gather contextual information about how teacher instruction and student engagement differ between an online classroom and a regular 8th-grade math classroom. Measuring classroom instruction is perhaps the most challenging measurement task in the study, as researchers have found it difficult to reliably and validly measure teachers’ classroom instruction (Burstein et al., 1995; Henke, Chen, & Goldman, 1999; Mayer, 1999; Stigler et al., 1999). The added challenge in this study is that the interactions between student and teacher will be occurring through very different modes (i.e., in person vs. online), and these differences may be related to variability in the quality of teacher instruction and the level of student engagement within each of the classroom types. The qualitative data gathered during these observations will be used to help interpret findings and to generate relevant examples to include in the final report.

Examples of the dimensions of teacher instruction and student engagement for both conditions and fidelity of implementation for the online condition we seek to measure through the classroom observations are included in Table 1. We will draw on existing observation protocols for observing instruction in traditional (face-to-face) classrooms—such as those being used for the impact study of professional development in mathematics currently being conducted by the American Institutes for Research (AIR), as well as protocols for tracking online interactions. We will consult with our Technical Working Group to review all the constructs and measures as well as to review the protocol instruments. Full development of the observation instruments will occur between December 2008 and April 2009.

Table 1. Examples of Constructs and Indicators for Classroom Observations

| Dimension | Construct | Example Indicators: Virtual Algebra Condition | Example Indicators: Control Condition |
|---|---|---|---|
| Fidelity of implementation: The degree to which the intervention has been implemented in line with the standards created by Class.com. | Ease of use | Are students able to navigate the Web site easily? | |
| | Algebra curriculum | Is the student at the appropriate chapter? | |
| | Equipment | Did the students have access to the appropriate computer equipment (e.g., high-speed Internet connection, fast computer)? | |
| Student engagement: The degree to which students are immersed and involved in the course. | Number of content-related questions | What proportion of questions asked by students is related to the algebra curriculum? | What proportion of questions asked by the students is related to the algebra curriculum? What proportion is related to the general math curriculum? |
| | Engagement with others | Did students engage with each other on algebra-related concepts? Did the teacher create pairs or small groups or encourage the formation of these smaller discussion groups? | Did students engage with each other on algebra-related concepts? Did students engage with each other on general math-related concepts? Did the teacher pair students or use small groups? |
| | Key mathematical ideas | What is the evidence of understanding of the key mathematical ideas (e.g., posting a solution to another student’s or to the teacher’s question)? | What is the evidence of understanding of the key mathematical ideas (e.g., answering another student’s or the teacher’s question)? |
| | Minutes/hours online | How many minutes/hours per day did the student spend online using the software? Working on assignments? How many questions did the student post? | |
| Teacher instruction: How is classroom time being spent? Is there a difference in the quality of instruction? Does the teacher appear to engage students? | Proportion of time spent on nonalgebra issues | What proportion of time did the online teacher answer non-algebra-related questions (e.g., questions about the technology, administration questions)? | What proportion of time did the classroom teacher answer non-algebra-related questions (e.g., administration questions)? What proportion of time did the teacher spend on student discipline or other disruptions in the class? |
| | Constructive feedback | What proportion of teacher feedback to students is constructive? | What proportion of teacher feedback to students is constructive? |
| | Mathematical thinking | Does the teacher frame questions that would require higher-level thinking? | Does the teacher frame questions that would require higher-level thinking? |
| | Connections to prior mathematics classes or lessons | Does the teacher connect the current lesson with past lessons or mathematics classes? | Does the teacher connect the current lesson with past lessons or mathematics classes? |
| | Use of materials | Does the teacher use other materials beyond the textbook? | Does the teacher use other materials or media beyond the textbook? |



The online classroom observations will involve a two-step process. In the first step, the observer will observe the “physical” aspect, which includes the location in the school where the student is working and the computer hardware. For example, where does the student access the Internet (e.g., library, computer lab)? Does the student experience any difficulty locating or navigating the Web site? In the second step, the observer will examine the online teacher’s interactions with his or her students and the implementation of the program by “lurking.” These interactions can then be assessed for themes and the proportion of the discussion that relates to certain topics can be coded and assessed. The remote system monitors student progress through the lesson and can be reviewed to measure time spent on algebra content. The system also generates statistics about the number of times a student posts discussion items to the site, allowing us to measure student participation. Interactions between student and teacher and posting rates can be reviewed multiple times during the year without interfering with class instruction, and do not necessarily need to occur at the same time that the physical aspects of the environment are being observed. (We will review the online logs/transcripts that take place during the same time span as the observations of the traditional classrooms.)
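As a concrete illustration of how posting counts and time online might be tabulated from the system’s logs, consider the sketch below. The log record format is hypothetical; the document does not specify how the Class.com system exports its activity data.

```python
# Sketch: tallying time online and discussion posts from course activity logs.
# The (student_id, event, timestamp) record format is an assumption for
# illustration; the actual Class.com export format is not specified here.
from collections import defaultdict
from datetime import datetime

log = [
    ("S001", "login",  "2009-01-12T09:00:00"),
    ("S001", "post",   "2009-01-12T09:20:00"),
    ("S001", "logout", "2009-01-12T09:45:00"),
    ("S002", "login",  "2009-01-12T09:05:00"),
    ("S002", "logout", "2009-01-12T09:30:00"),
]

minutes_online = defaultdict(float)
post_counts = defaultdict(int)
session_start = {}

for student, event, stamp in log:
    t = datetime.fromisoformat(stamp)
    if event == "login":
        session_start[student] = t
    elif event == "logout" and student in session_start:
        minutes_online[student] += (t - session_start.pop(student)).total_seconds() / 60
    elif event == "post":
        post_counts[student] += 1

for student in sorted(minutes_online):
    print(student, f"{minutes_online[student]:.0f} min online,",
          post_counts[student], "posts")
```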

The data collection schedule for this study is shown in Table 2. The full year of implementation is scheduled for 2008–2009 and the follow-up year for 2009–2010, with a last round of follow-up data collection in fall 2010.

Table 2. Data Collection Schedule

| Measures/Data to Be Collected | Primary Purpose | Data Collection Schedule |
|---|---|---|
| Administrative data | | |
| Seventh-grade MEA scores | Provide context/covariates | Spring–Fall 2008 |
| Middle school characteristics | Provide context/covariates | Spring 2008 |
| Eighth-grade math course grades | Measure outcomes | Spring 2009 |
| High school characteristics | Provide context/covariates | Fall 2009–Fall 2010 |
| Transcript data | Measure outcomes | Spring 2010 |
| Course enrollment data | Measure outcomes | Fall 2010 |
| PSAT scores | Measure outcomes | Fall 2010 |
| Data collected directly from students and teachers | | |
| Eighth-grade posttest (NWEA MAP) | Measure outcomes | Spring 2009 |
| Eighth-grade MEA scores | Measure outcomes | Spring 2009 |
| Eighth-grade student survey | Provide context/covariates | Spring 2009 |
| Eighth-grade math teacher survey | Provide context/covariates | Spring 2009 |
| Classroom observation data | | |
| Observation protocols—online teacher training | Provide context/covariates | Fall 2008; Winter/Spring 2009 |
| Classroom materials | Provide context/covariates | Fall 2008; Winter/Spring 2009 |

A3. Use of Technology in Information Collection

The data for this study will primarily be collected electronically, through secure, web-based systems, because of the technology capability in Maine’s schools at the target grade level. The measures that we will administer electronically include the spring 2009 posttest math assessment (NWEA MAP, a computer-adaptive test) and the spring 2009 student and teacher surveys. The administrative data, including 7th and 8th grade MEA test scores, 8th grade mathematics course grades, 9th grade course transcripts, 10th grade enrollments, school demographics, and PSAT scores, will also be collected electronically by study team members during site visits to state, district, and school records departments. We expect to collect administrative data in a variety of forms, but all these data will be entered and stored electronically.


A4. Efforts to Identify Duplication

To our knowledge, there are no other randomized controlled trials that examine the long- and short-term effects of taking virtual algebra in 8th grade. None of the information we request in the student and teacher surveys, or in the classroom observations, is available elsewhere. We are relying heavily on administrative data for the 9th and 10th grade outcomes. While we will be conducting secondary analyses of state assessment data, we do require NWEA MAP scores as an additional, more precise algebra outcome measure (i.e., the NWEA MAP contains more algebra-specific items than the 8th grade MEA).


A5. Burden on Small Entities

The primary entities for the study are schools. All administrative data collection will be coordinated by AIR employees, and all the online mathematics assessments will be proctored by research team members, to reduce the burden on state, district, and school employees. As noted in our response to A3, data collection for this study will be done electronically. We have also tried to limit the data collected to the minimum: that which is directly relevant to measuring the outcomes of interest, plus brief background surveys of students and teachers (these data will be used as covariates in our analyses). We will send observers to visit each classroom twice during the 2008-2009 school year, so that we may gather information on how instruction, processes, and engagement differ between the virtual algebra and the control group classrooms. We believe it is important to gather this information to document the nature of mathematics instruction in the two types of classrooms. Because classroom observations do not impose any burden on the persons being observed, the frequency of the classroom observations should not create a hardship on the teachers or students.

No special provisions are necessary for small organizations or small businesses. The size of the program is not relevant to this data collection effort.


A6. Consequences of Less Frequent Collection

If the proposed data were not collected, IES and REL-NEI would be unable to provide information on the efficacy of using virtual algebra to increase access for 8th grade students. We have made every effort to limit the frequency of data collection: the student and teacher surveys are brief and administered only once, in spring 2009; all data are collected electronically; and field staff will compile the administrative data we will use in secondary analyses (MEA and PSAT test scores, transcript, and enrollment data), so that state, district, and school staff are not burdened.


A7. Special Circumstances

No special circumstances apply to this study.


A8. Federal Register Announcement and Outside Consultations

a. Federal Register Announcement

The 60-day notice for this collection was published in the Federal Register on [insert date] and ended [insert date].



b. Consultation Outside the Agency

The Technical Working Group (TWG) assembled for this project is composed of nine leading researchers who can provide invaluable expertise in the fields most relevant to this study (literacy, randomized controlled trials, evaluation design, and statistics). The members of the TWG are:

  • Dr. J. Lawrence Aber, New York University

  • Dr. Anthony Bryk, Stanford University

  • Dr. Larry Hedges, Northwestern University

  • Dr. Stephen Klein, RAND Corporation

  • Dr. Don Leu, University of Connecticut

  • Dr. Richard Murnane, Harvard University

  • Dr. Michael Nettles, Educational Testing Service

  • Dr. Aline Sayer, University of Massachusetts, Amherst

  • Dr. Barbara Schneider, Michigan State University



c. Unresolved Issues

None.


A9. Payment or Gift to Respondents

While there will be no direct payment to participating schools, all participating schools will receive an online course free of charge. The schools randomized into the treatment condition will receive a Class.com Algebra I course (enough to enroll one class section, or a maximum of approximately 24 students) during the 2008-2009 school year. The schools randomized into the control condition will receive a Class.com online course during the 2009-2010 school year.


A10. Assurance of Confidentiality

All project staff will follow the confidentiality and data protection requirements of IES (The Education Sciences Reform Act of 2002, Title I, Part E, Section 183). We will protect the confidentiality of all information collected for the study and will use it for research purposes only. No information that identifies any study participant will be released. Information from participating institutions and respondents will be presented at aggregate levels in reports. Information on respondents will be linked to their institution but not to any individually-identifiable information. No individually-identifiable information will be maintained by the study team. All institution-level identifiable information will be kept in secured locations, and identifiers will be destroyed as soon as they are no longer required. Each of the partner organizations – Education Development Center (EDC), American Institutes for Research (AIR), Nimble Assessment Systems, and Windwalker Corporation – obtains signed NCEE Affidavits of Nondisclosure from all employees, subcontractors, and consultants who may have access to these data and submits them to our NCEE COR (OK Park). In addition, all members of the study team having access to the institution-level data have been certified by AIR’s Institutional Review Board as having received training in the importance of confidentiality and data security. Finally, the following language will appear on all letters, brochures, consent forms, surveys, and other study materials:

You do not have to answer questions that you do not want to answer. Results will be used only for statistical purposes and all results are kept strictly confidential. Each participant will be assigned a study identification number in place of their names. The reports prepared for this study will summarize findings and will not associate responses with a specific school or individual. We will not provide information that identifies you or your school to anyone outside the study team, except as required by law.


A11. Sensitive Questions

None of the requested information on the teacher background survey is sensitive in the traditional sense. Teachers will be asked information about their education and professional backgrounds. Such items may be sensitive to some respondents; however, they are key variables that may be associated with student outcomes. Analyses of all items will be presented in the aggregate.

Some of the items on the student survey may be perceived to be sensitive, such as the student’s educational goals. However, the instructions for both the student and teacher surveys will clearly state the confidential nature of the data (see A10, for text to be placed on all study materials), and that analyses of all items will be presented in the aggregate, which should help mitigate concerns about one’s responses to items perceived to be sensitive in nature.


A12. Estimates of Hour Burden

Based on 5 CFR 1320.3(d) we do not seek clearance or declare burden for the student achievement measures, classroom observation protocols, or administrative data.

  • Study staff, rather than the teachers, will be administering the student achievement tests during the school day, in the classrooms where they typically convene (i.e., there is no burden on the school or the teachers).6

  • Because the students in this study will be younger than age 16 during the spring 2009 semester, no monetary burden on students is included below (only hourly burden for completing the student survey). The students are neither old enough to be earning a wage, nor are they being tested outside of school hours (which might limit their wage earning).

  • We do not state burden for observational protocols because teachers are not being asked to provide responses to questions as would be required in an information collection such as a survey. Observers will be using checklists unobtrusively, to record instances of different pedagogical techniques, while the teacher proceeds with her/his typical instruction.

  • Because field staff will be collecting the administrative data during site visits to state, district, and school data departments, there is no burden on the participants.

Accordingly, we present burden projections only for the student and teacher surveys. We are projecting an 80% response rate for the student surveys and a 90% response rate for the teacher surveys. In contrast to a survey of the general population (where response rates are typically 50-60%), the REL-NEI field staff will maintain close contact, often face-to-face, with the 60 participating schools. As a result, commitment and motivation will be high, and follow-up with respondents will be more direct and straightforward. Therefore, the response rates of 80% and 90% are not unreasonable expectations and are similar to response rates obtained in other randomized field trials (e.g., the Collaborative Strategic Learning study being conducted through REL-SW).

Table 3 is a summary of the respondent burden, based on the student and teacher surveys. The estimated monetary cost of burden was computed using data from the National Compensation Survey, conducted by the Bureau of Labor Statistics.7 The hourly wage used for teachers was $40.86, which is the October 2005 (the most current year available) published average for secondary teachers in the Boston-Worcester-Lawrence MA-NH-ME-CT region.



Table 3. Estimates of Hour Burden

| Task | Total Sample Size | Estimated Response Rate | Number of Respondents | Time Estimate (in hours) | Number of Administrations | Hours | Estimated Monetary Cost of Burden |
|---|---|---|---|---|---|---|---|
| Student Spring 2009 survey | 1,800 | 80% | 1,440 | 0.33 | 1 | 475.20 | $0.00 |
| Teacher Spring 2009 survey | 60 | 90% | 54 | 0.33 | 1 | 17.82 | $728.13 |
| Total | | | | | | 493.02 | $728.13 |
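The figures in Table 3 follow directly from the stated sample sizes, response rates, time estimate, and BLS wage; a short sketch reproduces the arithmetic.

```python
# Reproduce the Table 3 burden arithmetic from the figures stated above.
TEACHER_WAGE = 40.86  # BLS October 2005 average for secondary teachers

def burden(sample_size, response_rate, hours_each, hourly_wage=0.0):
    respondents = sample_size * response_rate
    hours = respondents * hours_each
    return hours, hours * hourly_wage

student_hours, _ = burden(1800, 0.80, 0.33)  # no wage cost for students
teacher_hours, teacher_cost = burden(60, 0.90, 0.33, TEACHER_WAGE)

print(round(student_hours, 2))                  # 475.2 hours
print(round(teacher_hours, 2))                  # 17.82 hours
print(round(teacher_cost, 2))                   # 728.13 (dollars)
print(round(student_hours + teacher_hours, 2))  # 493.02 total hours
```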



A13. Estimate of Annual Cost Burden to Respondents

There are no additional respondent costs associated with this data collection other than the hour burden accounted for in item 12.


A14. Estimate of Annual Cost to the Federal Government

The estimated cost for this study, including development of a detailed study design, intervention and implementation plan, data collection instruments, justification package, data collection, data analysis, and preparation of reports, is $5,021,676.20 overall, with an annualized cost of $1,255,419.05 per year.

Table 4. Estimates of Annual Cost to the Federal Government

| Study Year (dates) | Activities | Total Study Costs per Year |
|---|---|---|
| Year 1 (11/1/07 to 03/14/08) | Finalizing study plan and observation protocol, securing OMB and IRB clearances, developing recruitment materials, programming project databases | $340,312.71 |
| Year 2 (03/15/08 to 03/14/09) | Recruitment; randomizing schools; obtaining 7th grade MEA scores; documenting schools’ processes for identifying “eligibles”; obtaining consent; teacher recruitment and training; training classroom observers; collecting fall observation data; coding and entering observation data; creating and cleaning MEA and observation data; finalizing the NWEA MAP adaptive test; designing and implementing online data collection systems | $1,972,363.29 |
| Year 3 (03/15/09 to 03/14/10) | Refresher training of observers; collecting teacher and student surveys; collecting NWEA MAP posttest data and 8th grade MEA data from the state; downloading survey and NWEA MAP data; coding and entering observation data; creating, cleaning, and merging MEA, NWEA MAP, and survey data; conducting and writing up impact analyses of 8th grade achievement and observation data | $1,659,717.14 |
| Year 4 (03/15/10 to 03/14/11) | Collecting, coding, and entering 9th grade record and 10th grade enrollment data; creating and cleaning administrative datasets; merging all datasets; conducting impact analyses; drafting, revising, and completing final report | $1,049,283.06 |
| Total | | $5,021,676.20 |



A15. Program Changes or Adjustments

Not applicable. This request is for a new information collection.


A16. Plans for Tabulation and Publication of Results

The study timeline is shown in the table below.

Table 5. Study Timeline (2007 Q1 through 2011 Q1)

Finalize study plan

Obtain OMB clearance

  • Develop OMB submissions package
  • OMB review

Develop measures

  • Develop student survey
  • Refine student survey based on pilot
  • Develop (adapt) observation measures
  • Refine observation measures

Develop database

  • Develop system
  • Download pilot data
  • Refine system based on pilot results
  • Upload measures to system
  • Download study data
  • Provide technical assistance to users
  • Maintain database

Recruitment

  • Identify possible schools
  • Create and distribute materials about study
  • Recruit schools
  • Recruit online teachers
  • Train online teachers
  • Schools identify “eligible” students ready for Algebra I
  • Confirm school participation
  • Review blocking and randomization plan with IES and ATS
  • Conduct random assignment of schools
  • Obtain student/parental consent to take VA
  • Obtain class rosters

Data collection

  • Collect state achievement test scores—seventh grade
  • Conduct classroom observations and collect classroom materials
  • Administer posttest—eighth grade (NWEA MAP)
  • Collect state achievement test scores—eighth grade
  • Analyze posttest data
  • Collect transcript data
  • Code and analyze transcript data
  • Collect PSAT scores
  • Analyze PSAT scores

Report writing and documentation

  • Draft and refine interim report
  • Draft and refine final report
  • Document and archive data
  • Document and archive online system


Data Analysis

Given the nested data structure (i.e., students nested within schools), the primary analytic method for this study will be hierarchical linear modeling (HLM; Raudenbush & Bryk, 2002). HLM is a statistical method particularly well suited for analyzing data of a nested structure, as is often the case in the field of education as well as other social science disciplines. It analyzes data at different levels simultaneously while explicitly taking into account the dependence or clustering among individuals nested within the same higher-level units (e.g., classrooms or schools). Compared with the traditional regression model, the HLM method generates more accurate standard errors for parameter estimates and thus allows more valid inference about the intervention’s effects when the data are of a multilevel structure. Moreover, it enables researchers to address questions that could not be addressed via the traditional regression model (e.g., heterogeneity of regression slopes).

Consistent with the assumptions and design on which our power calculations are based, we are employing a two-level random-effects HLM. We will estimate virtual algebra’s effects on mathematics achievement and course-taking outcomes by comparing student outcomes in the schools implementing virtual algebra with their counterparts in control schools.

We will conduct the impact analysis at the end of each of the three rounds of data collection (end of 8th grade, end of 9th grade, beginning of 10th grade). The analysis following the year of implementation of virtual algebra will estimate the short-term effect, whereas the analysis conducted after the next rounds of data collection will give a better indication of the long-term effectiveness of having access to Algebra I in 8th grade in terms of achievement and course-taking patterns. Our approach to the impact analyses is described below.

Data Quality and Outlier Analysis

We will begin data analyses with data quality checks and an outlier analysis of all collected data. We will report missing cases, missing data, unusual response patterns, and outliers. Cases that are statistical outliers will be flagged, but will not be automatically removed from the data sets. Rather, flagged cases will be closely checked to make sure nothing inappropriate occurred (e.g., scoring error, miscoding, or data entry mistake). Based on the results of the case examination, outlier data will then either be used as is or modified appropriately if there was an error. Our focus is to use ITT analysis (as well as TOT analysis), and therefore we do not plan to remove extreme outcome variable values unless there is evidence of error. However, we will check for sensitivity to outliers to explore whether results are dependent on one or more very extreme scores, even in cases that are not determined to be data collection errors.
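A minimal sketch of the flag-but-do-not-remove approach described above follows. The |z| > 3 cutoff is a conventional choice used for illustration; the study’s actual screening rules are analytic decisions not fixed in this document.

```python
# Sketch: flag statistical outliers for manual review without removing them.
# The |z| > 3 cutoff is illustrative, not a rule fixed by the study.
import statistics

def flag_outliers(scores, z_cutoff=3.0):
    mean = statistics.mean(scores)
    sd = statistics.stdev(scores)
    return [i for i, y in enumerate(scores)
            if sd > 0 and abs(y - mean) / sd > z_cutoff]

scores = [210, 215, 221, 208, 217, 219, 214, 212, 120, 218, 216, 211]
for i in flag_outliers(scores):
    # Flagged cases are checked for scoring, coding, or data entry errors;
    # values are modified only if an error is confirmed (per the text).
    print(f"case {i}: score {scores[i]} flagged for review")
```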

Baseline Group Equivalence

The primary purpose of preliminary data analyses is to describe sample characteristics and establish group equivalence at baseline. We will conduct descriptive analyses of school- and student-level sample characteristics (e.g., demographic composition) for the full sample and the treatment and control groups separately. These study groups include ALL of the 8th graders in the treatment and control schools.

Although the random assignment of schools is expected to produce two study groups (i.e., 8th graders in treatment schools and 8th graders in control schools) that are statistically equivalent on all measured as well as unmeasured characteristics, there may still be differences between the groups because of sampling error. Moreover, postrandomization attrition of the study participants may also affect the baseline equivalence of the treatment and control groups. Therefore, we will examine equivalence of student characteristics between treatment and control groups prior to conducting the impact analyses.

Specifically, we will assess group equivalence of the analytic sample by comparing the virtual algebra schools to the control schools on the following school and student characteristics:

  • School characteristics, including school size based on enrollments, percentage of students eligible for free or reduced-price lunch, and background characteristics of the 8th-grade math teachers in terms of years of experience and certification status.

  • Student characteristics, including achievement pretest (7th-grade MEA) scores, free or reduced-price lunch status, and gender.

We will test for differences in the above characteristics using a model that accounts for the clustered data structure and blocking used for randomization. Following the analysis of baseline equivalence for all 8th graders in treatment and control schools, we will seek to establish baseline equivalence for the “eligible” students in treatment and control schools.

Descriptive Analyses

For this study, we will collect data that will provide contextual information about the implementation of virtual algebra in treatment schools and the condition of “business as usual” in control schools that will aid interpretation of the results. These data will derive from the observation instruments and the review of classroom materials in both treatment and control schools.

We will conduct observations in both treatment and control schools, providing rich data regarding classroom instruction practices and the content delivered in both virtual algebra classrooms as well as “regular” 8th-grade math classrooms. These data will allow us to understand and describe how schools implement the virtual algebra course, how much algebra instruction occurs in control schools, and variations in the quality of 8th-grade math instruction in treatment and control schools. The observation data will be buttressed by contextual information based on review of classroom materials and teacher survey data—both of which will be used to quantify the amount of algebra taught in 8th-grade math classes in nonvirtual algebra classes in treatment schools and in control schools, and the instructional methods used to deliver algebra content.

Impact Analyses

Our proposed approach to estimating the effects of virtual algebra has several core features:

  • A focus on impacts based directly on the experimental design. These are the most compelling, transparent, and reliable impacts because they involve a minimum of assumptions.

  • Estimation of impacts in ways that account for the randomization of schools (not teachers) and the blocking of schools by locale and curriculum type.

  • Estimation of impacts using baseline covariates to increase precision.

  • Estimation of impacts separately for each of the three follow-up periods (spring of 2009 and 2010, fall 2010) for each student outcome measure.

  • Estimation of impacts separately for (a) “eligibles” in treatment schools versus “eligibles” in control schools, (b) all 8th graders in treatment schools versus all 8th graders in control schools, and (c) “noneligibles” in treatment schools versus “noneligibles” in control schools. That is, our impact estimates will be based on three types of comparisons:

  1. Comparison of outcomes for “eligibles” in treatment schools with “eligibles” in control schools.

    • The “eligibles-treatment” versus “eligibles-control” comparison allows us to estimate the impact of virtual algebra on students who are considered ready for Algebra I in 8th grade.

    • Because some of the “eligibles” in treatment schools will choose not to participate in virtual algebra, we will conduct both ITT (intent-to-treat) and TOT (treatment-on-the-treated) analyses for these comparisons.

  2. Comparison of “ALL” students in treatment schools with “ALL” students in control schools.

    • This comparison is important because the intervention will pull some students out of regular 8th-grade math, which may affect outcomes for students who are left. Because the subset of “ready” students will take virtual algebra, the students remaining in regular 8th-grade math may have less able peers and their class sizes may be smaller.

    • It is important to note that we conducted our power analysis calculations for the “eligibles” versus “eligibles” comparison, and the ALL vs. ALL comparison may be underpowered. The number of students will be higher, which increases power, but the anticipated effect of access to virtual algebra will be smaller, because it affects only a subset of the students in the treatment schools. For example, if 20% of the 8th graders in treatment schools are considered eligible, and if the effect size (ES) for eligible students is 0.20 and the ES for noneligibles is zero, the overall impact would be just 0.20 × 20% = 0.04 (see the sketch following this list).

  3. Comparison of “noneligibles” in treatment schools with “noneligibles” in control schools.

    • This analysis is important because it may not be safe to assume that the ES for noneligibles is zero. It is possible that the educational experience of nonparticipating students in treatment schools is affected by the presence of the virtual algebra course, through reduction in class sizes, increased homogeneity, or other mechanisms at play in the regular 8th-grade math classes. The comparison of “noneligibles” will allow us to directly address the extent to which offering access to Algebra I through virtual algebra affects outcomes for students who remain in regular 8th-grade math.
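The dilution noted in the second comparison is simple weighted-average arithmetic; the sketch below reproduces it using the illustrative numbers from the text (these are not study estimates).

```python
# Sketch of the effect-size dilution for the ALL-vs-ALL comparison.
# The 20% eligibility rate and 0.20 effect size are the illustrative
# values from the text, not study estimates.
def overall_effect(p_eligible, es_eligible, es_noneligible=0.0):
    """Average effect across all 8th graders when only a fraction
    of them are eligible for the virtual algebra course."""
    return p_eligible * es_eligible + (1 - p_eligible) * es_noneligible

print(round(overall_effect(p_eligible=0.20, es_eligible=0.20), 2))  # 0.04
```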

Taken together, these three analytic approaches will allow the study to generate policy-relevant findings, not about the efficacy of online versus face-to-face instruction, but about the overall impact on a school’s 8th-grade mathematics program of adding algebra I to its course offerings by participating in a virtual school or other online program.

The basic logic of our analytic strategy is to compare schools that are randomly assigned to receive the intervention with those that are not. Because treatment status is determined at the school level, the primary unit of analysis will be the school. Because the data for this study are hierarchical or nested (i.e., students are nested within schools), units at the same level are not statistically independent of one another, and the most appropriate way to estimate the effect of the intervention on students’ algebra achievement, and to correctly estimate the statistical precision of those estimates, is to apply a multilevel model through hierarchical linear modeling (HLM).

Analyses using HLM will allow us to model the effects of student- and school-level factors as well as interactions between the levels. We will estimate a two-level model in which students are level 1 and schools are level 2. The multilevel models will be estimated using HLM software or, equivalently, rewritten as a composite equation and estimated using SAS Proc Mixed. To make the interpretation of results more meaningful, the data will be centered on the average student pretest score.
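
For illustration only, an equivalent two-level model can be fit with open-source tools. The following is a minimal sketch (not the study’s actual code) in Python using statsmodels; all variable names are hypothetical.

# Minimal sketch of the two-level model with a random intercept per school.
# Assumes a hypothetical student-level file with columns: posttest,
# pretest_c (grand-mean-centered pretest), treatment (0/1), rural (0/1),
# nontrad_curric (0/1), and school_id.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("students.csv")  # hypothetical analysis file

model = smf.mixedlm(
    "posttest ~ treatment + rural + nontrad_curric + pretest_c",
    data=df,
    groups="school_id",  # random intercept for each school (the r0j term)
)
result = model.fit()
print(result.summary())  # the 'treatment' coefficient is the impact estimate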

Intent to Treat Estimates

In estimating impacts, we will begin with an ITT analysis, in which students in treatment schools are compared with students in control schools regardless of their actual participation in the treatment (i.e., the virtual algebra course). The ITT estimates represent the effect of assigning schools to offer virtual algebra to students considered ready for algebra I, rather than the effect of students’ actual participation in the course. The ITT analysis will be used to estimate impacts of virtual algebra for all three types of comparisons:

  1. “eligibles” in treatment schools versus “eligibles” in control schools,

  2. “ALL” 8th graders in treatment schools versus “ALL” 8th graders in control schools, and

  3. “noneligibles” in treatment schools versus “noneligibles” in control schools.

We will estimate conditional models at the student and school levels in which we will assess how much of the variability in algebra achievement is accounted for by assignment to the intervention or control group and by other predictor variables, including prior achievement, free or reduced-price lunch status, student attitudes toward mathematics, comfort with technology, and course evaluation measures.

Level 1: Students-Within-Schools

Our system of equations begins at the student level. Equation 1 describes the relationship between student achievement, individual background characteristics, and random variation among the students in each school.

Yij = π0j + π1jXij + eij (1)



In this model,

Yij = the outcome for student i in school j and

Xij = individual student characteristics (i.e., prior academic achievement, free and reduced-price lunch status, math attitudes, comfort with technology, and perceived quality of 8th-grade math course) of student i in school j, centered on the grand mean across the sample.

Therefore,

π0j = the average outcome (math achievement) at school j, for students with average characteristics and prior achievement;

π1j = the relationship between individual student characteristics and student outcomes within school j;

eij = the difference between the outcome of student i and the average outcome in school j (adjusted for student background characteristics); eij ~ N(0, σ2).

Level 2: Schools

Given that random assignment occurs at the school level, program effects are estimated at this level of the system of equations.

In the level 2, or school-level, model there will be multiple equations, one for each coefficient in the student-level model. The basic model is presented below. We will test whether the program has an effect on algebra achievement by examining the statistical significance of γ01.

π0j = γ00 + γ01VAj + γ02B1j + γ03B2j + r0j (2)

π1j = γ10 (3)

Where:

VAj = dummy variable for treatment condition: 1 if school j is in the treatment group, 0 otherwise;

B1j = dummy variable for the blocking variable “locale”: 1 if school j is considered rural, 0 otherwise;

B2j = dummy variable for the blocking variable “curriculum type”: 1 if school j uses “nontraditional” curricula for 8th-grade math, 0 otherwise;

γ00 = the average student outcome across the population of schools;

γ01 = the difference between average achievement at schools randomly assigned to the treatment group and schools assigned to the control group (i.e., the effect of the intervention on student outcomes);

γ10 = the average effects of the student-level covariates (pretest, eligibility for free or reduced-price lunch, gender, attitudes toward math, comfort with technology, and student evaluations of the course materials, quality, and overall experience) on student outcomes across all schools; and

r0j = the random error associated with school j’s average student outcome; r0j ~ N(0, τ00).
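
Substituting equations 2 and 3 into equation 1 yields the composite (mixed-model) form, which is the single equation that would be estimated in SAS Proc Mixed:

Yij = γ00 + γ01VAj + γ02B1j + γ03B2j + γ10Xij + r0j + eij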

Effect of Treatment on the Treated

The impacts described in the previous section are ITT estimates: they estimate the impact of random assignment to virtual algebra or control (for the “eligibles” vs. “eligibles” comparison). Because the incentive to earn credit for algebra I in 8th grade can be considered high, we anticipate that students given the option (i.e., the “eligibles”) in the treatment schools will be interested in participating in the virtual algebra course. However, not all eligible students in treatment schools will necessarily take the course; some may opt out of participating. This type of nonparticipation has been observed in many RCT settings, particularly those investigating programs associated with choice (Wolf, Gutmann, Puma, & Silverberg, 2006). Comparing virtual algebra participants in treatment schools with “eligibles” in control schools could give a biased estimate of the impact of the treatment on the treated if a number of “eligibles” in the treatment schools do not consent to take the virtual algebra course. To estimate the average impact of the online course on actual participants, we will apply the “Bloom Adjustment” (Bloom, 1984).

A TOT impact involves rescaling the comparison of “eligibles” in treatment schools to “eligibles” in control schools (i.e., the ITT estimates described above) to account for the fact that a known fraction of the treatment group “eligibles” did not take (or complete) the virtual algebra course. The average treatment impact generated from a mix of participating “eligibles” (EP) and nonparticipating “eligibles” (ENP) is attributed only to the EP group by dividing the average treatment impact by the proportion of “eligibles” who actually took the course. We anticipate that the TOT estimate will be larger than the ITT estimate, and the difference between the two will depend on the proportion of nonparticipating “eligibles” in the treatment schools.
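
As a minimal sketch of this rescaling (hypothetical numbers, not study estimates; written in Python for illustration):

# Bloom (1984) adjustment: rescale an ITT estimate into a TOT estimate.
# Assumes eligible nonparticipants experience zero impact, which is the
# standard assumption behind the adjustment.
def bloom_adjusted_tot(itt: float, participation_rate: float) -> float:
    if not 0.0 < participation_rate <= 1.0:
        raise ValueError("participation_rate must be in (0, 1]")
    return itt / participation_rate

# Illustration: if the ITT effect size is 0.15 and 75% of treatment-school
# "eligibles" actually take the course, the TOT estimate is 0.15 / 0.75 = 0.20.
print(bloom_adjusted_tot(0.15, 0.75))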

Impact estimates for course-taking outcomes will be calculated in the same way as for achievement outcomes, using the course-taking score for math (and science) as the outcome measure, Yij.

Interpreting the Impact Findings

In interpreting the results of the impact analyses described above, we will emphasize that any observed impacts of the online algebra intervention on student outcomes are the result of a multidimensional intervention with multiple programmatic attributes. We will exercise caution about generalizing the findings to the same or similar online courses with different programmatic attributes, such as the same online algebra course delivered in larger, non-ability-grouped classes, or a different online algebra course delivered by less highly trained staff.

Subgroup Analyses

Analyses of student subgroups will have somewhat lower statistical power than the full-sample analyses, so all subgroup analyses are considered exploratory; no causal conclusions will be drawn from them.

Perhaps the most important subgroup analysis will test whether virtual algebra has differential effects for students with different levels of “readiness” for algebra I when they entered 8th grade. The designations given to the “eligible” students prior to random assignment (e.g., “definitely ready,” “probably ready,” “maybe ready”) will form subgroups for which we can test for differential effects. It is important to note that these designations reflect teacher perceptions of student readiness. They are therefore not grounded in objective criteria such as cut-points on a common measure; rather, they reflect perceptions that nonetheless play a distinct role in shaping students’ experiences and pathways through school. In follow-up analyses, we will also form subgroups within the group of “eligibles” based on prior achievement (e.g., 7th-grade MEA test scores) to test for differential impacts that are based on cut-points on a common measure.

Though these analyses also are not fully statistically powered, we are interested in estimating the effects of virtual algebra within the blocks defined by each of our blocking variables. Doing so will address these questions:

  • Is virtual algebra differentially effective in particular locales (e.g., rural vs. other)?

  • Is virtual algebra differentially effective in schools that use nontraditional (vs. traditional) math curricula for the 8th-grade general math classes?

Variables measuring the interactions of these blocking variables with treatment condition (virtual algebra vs. control) will be included at the school level (level 2) of the HLM model to assess these subgroup effects.
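
Extending the earlier illustrative sketch (again with hypothetical column names), the interaction terms would enter the model formula as follows:

# Treatment-by-block interactions added to the earlier illustrative model.
import statsmodels.formula.api as smf

model = smf.mixedlm(
    "posttest ~ treatment * rural + treatment * nontrad_curric + pretest_c",
    data=df,             # the same hypothetical student-level DataFrame
    groups="school_id",  # random intercept for each school
)
result = model.fit()
# The treatment:rural and treatment:nontrad_curric coefficients test
# whether impacts differ across the locale and curriculum-type blocks.
print(result.summary())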

Missing Data Analyses

For achievement outcomes based on administrative data, including standardized assessment scores and course-taking scores based on transcript data, missing data will likely indicate data quality problems, not actual nonresponse. We will work with staff at the Maine Department of Education and in the districts that house the demographic, achievement, and transcript data to ensure the quality and accuracy of the data received by the REL-NEI.

For survey instruments, item-level missing data will be minimized by using low-burden survey instruments that are well designed with clear question and response structures (e.g., logical skip patterns). We will analyze each item for degree of missing data, note unusual nonresponse rates, and examine possible reasons for missing answers. Missing baseline data issues are commonly addressed by imputing the missing values. The appropriate imputation strategy depends on the extent to which data are missing and the missing data mechanism. If the level of missing data is low and appears random, single imputation methods, such as mean imputation or hot-deck imputation, are unlikely to introduce a serious bias. If the level of missing data is high, more complex imputation techniques will be considered, such as multiple imputation.
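
For illustration only, the following minimal Python sketch (toy data, hypothetical column names) shows both strategies:

# Two imputation strategies for item-level missing survey data.
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# Toy survey data with item-level missingness.
df = pd.DataFrame({
    "math_attitude": [3.0, np.nan, 4.0, 2.0],
    "tech_comfort": [4.0, 5.0, np.nan, 3.0],
})

# Low, apparently random missingness: simple mean imputation.
mean_imputed = df.fillna(df.mean())

# Higher missingness: multiple imputation. Sampling from the posterior with
# different seeds yields several completed data sets; analyses are run on
# each and the results combined across imputations.
completed = [
    pd.DataFrame(
        IterativeImputer(sample_posterior=True, random_state=s).fit_transform(df),
        columns=df.columns,
    )
    for s in range(5)
]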

Dissemination

The REL-NEI study team, composed of staff from EDC, AIR, Nimble Assessment Systems, and Windwalker, will produce the study reports. AIR will analyze the data and produce the final report, which will contain an appendix, written by EDC, describing the progress of implementation in the 30 schools with online Algebra I classrooms. After the final report has been approved, REL-NEI will develop two additional products: a 10-page study summary for schools and districts and a 1-page summary for parents and the general public. We will revise this list of products as needed to align with the coordinated dissemination plans being developed across the regional educational laboratories.


A17. Approval to Not Display OMB Expiration

All data collection instruments will include the OMB expiration date.


A18. Explanation of Exceptions

We request no exceptions.


1 In Maine, 21% of students statewide took Algebra I in 2003 (NCES, 2005). Our initial estimates, based on information collected about mathematics and science teaching in Maine, suggest that between 40% and 50% of schools that serve 8th but not 9th graders do not currently offer a full Algebra I course to their 8th graders. Even in the 50–60% of Maine schools that do offer a stand-alone algebra class, well under 100% of students participate, yielding the overall statewide rate of 21%.

2 We acknowledge that the results of this study would generalize only to other schools that have similar levels of computer availability in place and a similar implementation of the intervention.

3 Although we do not want to use an outcome measure that is exclusively an algebra test, because students in the control schools would not be prepared to take it, we do want to ensure that the outcome measure includes some algebra. Therefore, we propose to use both the state assessment (policy relevant) and a tailored test for the study. The NWEA MAP test can produce two relevant outcome measures for analysis, the overall score and the algebra-specific score, as long as there are enough items (~15) contributing to the algebra strand, which will be the case here.

4 Though science is not a direct focus of the study, research suggests that taking algebra in 8th grade is linked with subsequent science course-taking as well as math course-taking. For this reason, we plan to additionally analyze science course-taking in 9th and 10th grade.

5 The NCES transcript standardization process does not assign additional grade-point weight to Advanced Placement (AP), International Baccalaureate (IB), and other honors classes. We will explore options for including such weights in our course-taking standardization process because we consider the “rigor” of the courses taken in 9th or 10th grade to be a critical dimension of the course-taking outcomes for students in the study.

6 According to NCEE's definition of burden, cognitive tests such as student assessments are not counted as part of burden. The reasoning for this is similar to the reasoning behind Exemption #1 of 45 CFR 46.101(b)(1), which exempts from IRB review studies that involve “Research conducted in established or commonly accepted educational settings, involving normal educational practices, such as (i) research on regular and special education instructional strategies, or (ii) research on the effectiveness of or the comparison among instructional techniques, curricula, or classroom management methods.”
