TO: Brooks Bowden, OK Park (IES)
FROM: Teresa Duncan, Jessica Heppen (AIR), Peggy Clements, Cheryl Rose, Katie Culp, Craig Hoyle, Jill Weber (EDC)
DATE: May 7, 2008
RE: Responses to follow-up OMB questions
Thank you for forwarding the questions from OMB. Our responses are below; we look forward to hearing back from OMB.
I. Student survey
On pages 6-24 of this memo, we have provided a listing of all the items on the student survey, indicating which items are new, which ones were removed, and the sources for each item.
Only items 16a and 16b are new.
We will pilot the student survey with nine 8th graders this month (May). One of our staff members in Maine has a daughter in 8th grade and has indicated that her daughter and her daughter's friends are willing to take the survey so that we can determine how long it takes students to complete it.
We will send OMB the results of the pilot and the final version of the student survey.
II. Teacher survey
Our plan is to administer the teacher survey to all the treatment (online) and control (face-to-face) group teachers.
The purpose of the teacher survey is to provide context for the results; the data are also important for providing background information to establish comparability (in training, experience, etc.) between teachers in the treatment and control groups.
Because we are working with small schools that have only one or possibly two teachers in a building, we do not have the degrees of freedom to estimate a teacher-level effect. Again, the teacher data are intended to provide context and background information.
Accordingly, the 2-level random effects model described in the OMB submission package is appropriate and consistent with the assumptions and design on which our power calculations are based. Please recall that this study involves randomization at the school level and so our HLM analyses represent impacts at the student and school levels. What we have in the OMB package is the basic model for determining the intervention’s impacts.
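For reference, the basic two-level random effects model can be written as follows. This is a generic random-intercept formulation consistent with the description above; T_j denotes the school-level treatment indicator, and any student- or school-level covariates in the full specification are omitted here for simplicity.

Level 1 (student i in school j): Y_{ij} = \beta_{0j} + \varepsilon_{ij}, with \varepsilon_{ij} \sim N(0, \sigma^2)
Level 2 (school j): \beta_{0j} = \gamma_{00} + \gamma_{01} T_j + u_{0j}, with u_{0j} \sim N(0, \tau^2)

The intervention impact is the school-level coefficient \gamma_{01}.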
Should we find a significant intervention impact, we plan to use the teacher data to conduct post hoc analyses and test for interaction effects related to teacher characteristics.
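As a sketch of how such a post hoc test could be expressed within the same two-level framework, the school-level equation above can be extended with a teacher characteristic W_j (for example, years of teaching experience; the specific moderators would be drawn from the teacher survey and are not fixed here):

\beta_{0j} = \gamma_{00} + \gamma_{01} T_j + \gamma_{02} W_j + \gamma_{03} (T_j \times W_j) + u_{0j}

where \gamma_{03} captures the interaction between treatment status and the teacher characteristic.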
We had not planned to administer any surveys to the Classroom Proctors, but acknowledge OMB’s interest in capturing information about who the proctors are and what they are doing during the class periods. The Classroom Proctors are scheduled to undergo a day-long training session prior to the implementation of the course, to help standardize the nature of the support they provide to their classrooms. During this training, we will collect background data (items 1-9 of the teacher survey) from the Classroom Proctors so that we may document the range in training and experience represented in the group.
To capture what the Classroom Proctors are doing in the classroom, we propose to do the following. Because the Classroom Proctors are required to keep close contact with the Online Teacher, we will ask the Online Teachers to keep a running log of the reports/feedback that they receive from each of their Classroom Proctors. This documentation will help us keep track of both minor events (e.g., a student leaving the classroom during the math period) and major events (e.g., the causes of a server problem and when the server is expected to be back online). The back-and-forth between the Classroom Proctors and Online Teachers is a fundamental part of implementing the intervention; all we are asking them to do is to maintain a daily log of those conversations.
Given the importance of online instruction to this region, we expect high initial response rates. Because of the relatively small number of respondents, we will be able to conduct multiple follow-ups with non-respondents (by individual telephone interviews, if need be) to reach a 90% response rate. The table below details the dates, activities, and expected response rates resulting from those activities; a brief check of the expected-count arithmetic follows the table.
Date (2009) | Action | Expected response rate | Approx. N (out of 120)
May 4 | Email teachers about the online survey going live on May 11 (due May 15) | |
May 11 | Email announcement to teachers about the online survey (include logins) | |
May 13 | Email reminder about the online survey; have School Liaisons remind each teacher | 50% | 60
May 15 | Email to announce that the survey is due today | 65% | 78
May 18 | First follow-up email | 75% | 90
May 25 | Second follow-up email | 80% | 96
June 1 | Begin telephone calls; offer to interview by telephone or send a paper copy | 83% | 100
June 8 | Continue telephone calls (reminders/interviews); have School Liaisons collect completed paper copies (in sealed envelopes) | 87% | 104
June 15 | Complete telephone reminders/interviews; have School Liaisons collect completed paper copies | 90% | 108
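As a simple check of the table above, each Approx. N value is the cumulative expected response rate applied to the 120 teachers, rounded to the nearest whole teacher. The short Python sketch below reproduces that arithmetic; it is illustrative only and not part of the study's analysis code.

TOTAL_TEACHERS = 120

# Cumulative expected response rate at each milestone from the table above
milestones = {
    "May 13": 0.50,
    "May 15": 0.65,
    "May 18": 0.75,
    "May 25": 0.80,
    "June 1": 0.83,
    "June 8": 0.87,
    "June 15": 0.90,
}

for date, rate in milestones.items():
    expected_n = round(rate * TOTAL_TEACHERS)   # e.g., 0.90 * 120 = 108
    print(f"{date}: {rate:.0%} of {TOTAL_TEACHERS} teachers = {expected_n} responses")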
III. Sample Selection
This study investigates the use of an online algebra course to expand access to Algebra I for eighth graders who are ready to take the course but are unable to do so because they attend schools, often small and in rural locations, that do not offer the course until high school.
The design is a randomized controlled trial with randomization at the school level. Schools that do not currently offer a full section of Algebra I to eighth graders will be randomly assigned to receive a virtual algebra course (at no cost to them) or no virtual algebra course.
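For illustration only, school-level random assignment of this kind could be carried out as in the sketch below. The function name, the equal-split allocation, and the example school identifiers are placeholders; the study's actual randomization procedure (including any blocking or stratification) is described in the submission package.

import random

def assign_schools(school_ids, seed=2008):
    """Randomly split schools into treatment (online course) and control (no online course) groups."""
    rng = random.Random(seed)            # fixed seed so the assignment is reproducible and auditable
    ids = list(school_ids)
    rng.shuffle(ids)
    midpoint = len(ids) // 2
    treatment = sorted(ids[:midpoint])   # receive the virtual algebra course at no cost
    control = sorted(ids[midpoint:])     # do not receive the virtual course
    return treatment, control

# Example with hypothetical school identifiers:
# treatment, control = assign_schools(["ME-001", "ME-002", "ME-003", "ME-004"])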
We have decided to focus this study on Maine because of:
the high degree of interest in virtual courses for students,
low overall enrollments in Algebra I among eighth graders across the state, and
the state's strong technology initiative, which can support the infrastructure schools need to offer an online course. Eighth-grade students in Maine already use laptop computers in their daily instruction, so working with content delivered online is a familiar part of their schooling.
Because this infrastructure is already in place, implementing the study will be more straightforward, and we anticipate a shorter start-up time than in states with more limited technology capacity.
However, we will need to take this contextual factor into account when interpreting the study's findings, as it may affect the generalizability of the results. In locations where technology problems are more likely to occur, especially at start-up, educators should not expect the results we observe in Maine until they achieve similar levels of technology integration.
The target population of schools consists of those in Maine that:
serve students in Grade 8 and below, but not Grade 9 and above;1
do not currently offer a full section of Algebra I to eighth graders; and
would offer Algebra I to some of their eighth graders if they could.
Based on data from the CCD and information that we have about the local schools, we estimate that there are approximately 80-100 schools in the target population.2
The target population of students is eighth graders attending these schools who are considered to be "ready for algebra." By "ready for algebra" we mean students who are considered by their schools (teachers, principals), their parents, and themselves to have sufficient mastery of pre-algebra concepts to take Algebra I.
Schools currently make decisions about which students are ready on the basis of teacher perceptions of preparedness, grades in prior math classes up through seventh-grade math, and, more rarely, scores on assessments such as algebra readiness tests (e.g., Iowa Algebra Aptitude Test, Orleans-Hanna Algebra Prognosis Test).
See figure on page 5 of this memo for a visual display of the Virtual Algebra study’s sample selection process.
VIRTUAL ALGEBRA STUDENT SURVEY
Item # (Original) | Item # (New) | Item | Source | Known Psychometric Properties | Construct | Potentially Affected by Treatment
1-4 | 1-4 | School and class identifiers | Standard | | ID | No
5-7 | 5-7 | Demographics | Standard | | Background | No
8 | 8 | Think about the grades you earned during 6th, 7th, and 8th grade. How would you describe your overall grades in MATH classes? | NAEP Background Survey (2000) | Used with nationally representative samples of 8th-grade students (OMB-reviewed federal survey) | Background | No
9 | 9 | Which of the following math classes do you expect to take next year (starting next fall, Fall 2009)? | NAEP Background Survey (2000) | Used with nationally representative samples of 8th-grade students (OMB-reviewed federal survey) | Background | Yes
10 | 10 | Which of the following math classes do you expect to take while you are in high school? | NAEP Background Survey (2000) | Used with nationally representative samples of 8th-grade students (OMB-reviewed federal survey) | | Yes
11 | 11 | Which of the following best describes your educational goals? (Will not finish high school; Graduate high school; Some education after high school; Graduate college; Go to graduate school; I don't know) | NAEP Background Survey (2000) | Used with nationally representative samples of 8th-grade students (OMB-reviewed federal survey) | Background | Yes
12 | 12 | My math teacher: | Consortium on Chicago School Research | Classroom Personalism: used with all students in Chicago Public Schools. Elementary level: individual separation = 1.66, individual-level reliability = 0.73, school-level reliability = 0.84. High school level: individual separation = 1.61, individual-level reliability = 0.72, school-level reliability = 0.95 | a-d = Academic Press; d-i = Classroom Personalism | Yes
13 | 13 | How much do you agree with the following statements about your math class? | Consortium on Chicago School Research | Engagement: used with all students in Chicago Public Schools. Elementary level: individual separation = 1.56, individual-level reliability = 0.71, school-level reliability = 0.89. High school level: individual separation = 1.33, individual-level reliability = 0.71, school-level reliability = 0.97 | Student Engagement | Yes
14 | 14 | In your math class, how often: | Consortium on Chicago School Research | Academic Press: used with all students in Chicago Public Schools. Elementary level: individual separation = 1.37, individual-level reliability = 0.65, school-level reliability = 0.79. High school level: individual separation = 1.68, individual-level reliability = 0.74, school-level reliability = 0.93 | Academic Press | Yes
15 | 15a, 15b | Original item: On a typical day, how much time do you spend studying or doing homework for your math class, outside of class time? (0 None; 1 Less than 30 minutes; 2 30-60 minutes; 3 1-2 hours; 4 More than 2 hours). Replaced with two items from the TIMSS survey: (A) How often does your teacher give you homework in mathematics? (0 Never; 1 Less than once a week; 2 1 or 2 times a week; 3 3 or 4 times a week; 4 Every day); (B) When your teacher gives you mathematics homework, about how many minutes are you usually given? (0 Fewer than 15 minutes; 1 15-30 minutes; 2 31-60 minutes; 3 61-90 minutes; 4 More than 90 minutes) | GE Foundation survey (original); TIMSS Contextual Background Questionnaire (2003), Grade 8 (replacement) | N/A (dropped); used with international samples of 8th-grade students (including U.S.) | Amount of homework | Yes
16a & 16b | 16a & 16b | Please answer the following questions about your math class this year. (16a) Materials in my math class: (16b) The instruction in my math class: | New; adapted from course evaluation forms | Will be pilot tested | Course evaluation | Yes
17 | 17 | Approximately how many courses (including your current courses) have you taken that were delivered in the following modes? (a) Totally online course; (b) Hybrid course (a mix of online and regular face-to-face instruction in the same course); (c) Supplemental course (extra instruction for a particular course that requires use of a computer program) | University of Minnesota Multi-College Student Survey: Experiences with Instructional Technology | Field tested with a random sample of 1,100 students from four colleges associated with the University of Minnesota (Jorn et al., 2001); reliability statistics for this item were requested but are likely unavailable because the item's reliability would depend on verification from other respondents or administrative data | Previous experience with online courses | No (controlling for Virtual Algebra course participation in treatment schools)
18 | 18 | What type of Internet connection do you have at home? (0 Low speed (dial-up); 1 High speed (cable, DSL, T1); 2 No Internet connection at home) | University of Minnesota Multi-College Student Survey: Experiences with Instructional Technology | Field tested with a random sample of 1,100 students from four colleges associated with the University of Minnesota (Jorn et al., 2001); reliability statistics for this item were requested but are likely unavailable because the item's reliability would depend on verification from other respondents or administrative data | Background / Context | No
19 | 19 | How much do you agree with the following statements? | Computer Attitudes Questionnaire | Psychometrics examined with 1,300 students in 7th and 8th grade in rural Texas (Knezek & Christensen, 1995); internal consistency for Computer Enjoyment: Cronbach's alpha = 0.82 | Computer Enjoyment (possible mediator) | Yes
20 | 20 | Original item set ("How much do you agree with the following statements?") dropped; items b and c to be included with item set 19. Replaced with items from the ELS 2002 survey on extent of use of computers in mathematics: In your current or most recent mathematics class, how often do/did you use computers in the following ways? | Computer Attitudes Questionnaire and Project Links (original); Educational Longitudinal Study (2002) (NCES) (replacement) | N/A (dropped); used with nationally representative samples of 10th-grade students (OMB-reviewed federal survey) | Extent of use of computers in math | Yes
21 | 21 | How much do you agree with the following statements? | Project Links Student Survey (PLSS; University of Maryland) | Field tested; item statistics not available (can be dropped if requested) | Value of Technology in Math (possible mediator) | Yes
22 | | Dropped ("How much do you agree with the following statements?"). In the study, the online course will not use email, so the data yielded by these items would be inaccurate. | Student Computer Attitudes Questionnaire (SCAQ) | N/A (dropped) | Comfort with Technology (possible mediator) | Yes
23 | | Dropped ("How much do you agree with the following statements?") in favor of math attitudes items from one source. | Attitudes Toward Math Inventory (ATMI), Project Links Survey (PLSS), NAEP (2000, 2003) | N/A (dropped) | Math Self-Confidence (possible mediator) | Yes
| 22 | Replacement: How much do you agree with the following statements? | TIMSS Contextual Background Questionnaire (2003), Grade 8 | Used with international samples of 8th-grade students; with U.S. samples, factor validity is attained and Cronbach's alphas for SCLM, LM, and UM, respectively, are 0.83, 0.71, and 0.79 (Kadijevich, 2006) | Math Attitudes: Self-Confidence in Learning Math (SCLM), Liking Math (LM), Usefulness of Math (UM) (possible mediators) | Yes
24 | | Dropped ("How much do you agree with the following statements?") in favor of math attitudes items from one source. | Attitudes Toward Math Inventory, Project Links Survey, NAEP | N/A (dropped) | Value of Math (possible mediator) | Yes
| 23 | Replacement: How much do you agree with the following statements? | TIMSS Contextual Background Questionnaire (2003), Grade 8 | Used with international samples of 8th-grade students; with U.S. samples, factor validity is attained and Cronbach's alphas for SCLM, LM, and UM, respectively, are 0.83, 0.71, and 0.79 | Math Attitudes: SCLM, LM, UM (possible mediators) | Yes
25 | | Dropped ("How much do you agree with the following statements?") in favor of math attitudes items from one source (see new #22 and #23). | | | |
26 | 24 | How much do you agree with the following statements? | Attitudes Toward Mathematics Inventory | Psychometrics derived from use of the survey with a sample of 545 students in high school mathematics classes (Tapia & Marsh, 2004); Cronbach's alpha = 0.88, test-retest reliability = 0.78 | Math Motivation | Yes
27 | | Dropped ("How much do you agree with the following statements?") in favor of math attitudes items from one source (see new #22 and #23). | Project Links Student Survey | N/A (dropped) | Attitudes toward Math | Yes
28 | 25 | How difficult was this math test? (0 Very difficult; 1 Somewhat difficult; 2 Normal; 3 Fairly easy; 4 Very easy) | NAEP Background Survey (2000) | Used with nationally representative samples of 8th-grade students (OMB-reviewed federal survey) | | Yes
29 | 26 | How important was your success on this math test to you? (0 Not very important; 1 Somewhat important; 2 Important; 3 Very important) | NAEP Background Survey (2000) | Used with nationally representative samples of 8th-grade students (OMB-reviewed federal survey) | | Yes
30 | 27 | Please describe your level of effort on this math test. (0 Could have tried much harder; 1 Could have tried harder; 2 Tried about as hard as I could; 3 Tried very hard) | NAEP Background Survey (2000) | Used with nationally representative samples of 8th-grade students (OMB-reviewed federal survey) | | Yes
References in this table:
Jorn, L., Martyr-Wagner, M., et al. (2001). Multi-college student survey: Experiences with instructional technology. Twin Cities, MN: University of Minnesota.
Kadijevich, D. (2006). Developing trustworthy TIMSS background measures: A case study on mathematics attitudes. The Teaching of Mathematics, IX, 41–51. http://elib.mi.sanu.ac.yu/files/journals/tm/17/tm924.pdf
Knezek, G., & Christensen, R. (1995). A comparison of two computer curricular programs at a Texas Jr. High School using the Computer Attitude Questionnaire (CAQ) (Technical Report 95). Texas Center for Educational Technology, Telecommunications and Informatics Laboratory.
Tapia, M., & Marsh, G. E., II (2004). An instrument to measure mathematics attitudes. Academic Exchange Quarterly, 8(2), 16-21.
NOTES and SOURCES:
Item 16 (a and b) is the only new item set in the student survey. As suggested by OMB, we will pilot test the survey with a sample of nine eighth graders in Maine. In doing so, we will conduct respondent debriefing and a data review. We will report to OMB the results of the pretesting and any changes made to the survey instrument based on the findings.
Respondent Debriefing
Respondent debriefing typically consists of follow-up questions at the end of an interview that are designed to obtain quantitative information about respondents’ interpretations of survey questions. These questions help researchers determine whether concepts and questions were understood by respondents in the same way that the survey designers intended. In an interviewer-administered survey, the debriefing questions may be followed by a discussion between respondent and interviewer, to further probe the respondent’s reaction to and comprehension of the questions in the survey instrument.
Data Review
A data review of the pilot test results is conducted to identify questions that have higher than expected or desired levels of non-response (either don’t knows or refusals). High item nonresponse in a pilot test could indicate poor question wording, generally unavailable data, or non-applicability of the question to a significant subset of respondents. Because data review involves examination of quantitative results from the pilot test, larger numbers of respondents may be needed with more complex instruments to ensure that an adequate number of respondents are asked each question.
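As an illustration of the data review step, the Python sketch below computes item-level nonresponse rates from pilot data, assuming the responses are loaded into a pandas DataFrame with one column per survey item and missing answers (don't knows or refusals) coded as empty cells. The file name, column layout, and 10% flagging threshold are assumptions for the example, not features of the actual pilot.

import pandas as pd

def item_nonresponse_rates(responses: pd.DataFrame, threshold: float = 0.10) -> pd.Series:
    """Return the share of missing answers per item, flagging items above the threshold for review."""
    rates = responses.isna().mean().sort_values(ascending=False)   # fraction missing per column
    flagged = rates[rates > threshold]
    print(f"Items exceeding {threshold:.0%} nonresponse: {list(flagged.index)}")
    return rates

# Example:
# pilot = pd.read_csv("pilot_student_survey.csv")   # hypothetical file name
# item_nonresponse_rates(pilot)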
SOURCES:
ATMI: The Attitudes Toward Mathematics Inventory (http://www.rapidintellect.com/AEQweb/cho25344l.htm) was designed to investigate the underlying dimensions of attitudes toward mathematics. The 49 items of the ATMI were constructed in the domain of attitudes toward mathematics to address factors reported to be important in the research literature. Items were constructed to assess confidence, anxiety, value, enjoyment, motivation, and parent/teacher expectations. Consideration was given to previous research as follows:
Motivation (Singh, Granville, & Dika, 2002; Thorndike-Christ, 1991). The motivation category was designed to measure interest in mathematics and desire to pursue studies in mathematics.
Exploratory factor analysis of the ATMI (Tapia & Marsh, 2004) resulted in four factors identified as Self-confidence, Value of mathematics, Enjoyment of mathematics, and Motivation. The Self-confidence factor consists of 15 items; the Value factor and the Enjoyment factor each consist of 10 items; and the Motivation factor consists of five items. (In the source article, Table 1 shows sample items from each of the factors, and the complete inventory is available from the first author upon request.) Alpha coefficients for the scores on these scales were found to be .95, .89, .89, and .88, respectively (Tapia & Marsh, 2004). Adapted from http://www.thefreelibrary.com/Attitudes+toward+mathematics+of+precalculus+and+calculus+students-a0163980003
Tapia, M. & Marsh, G. E., II (2004). An instrument to measure mathematics attitudes. Academic Exchange Quarterly, 8(2), 16-21.
PLSS: Project Links Student Survey (1997), University of Maryland Physics Department. This survey was developed by the Project Links evaluation team to measure how interaction with the computer modules developed by the project affects students' attitudes, beliefs, and expectations toward mathematics. The current draft version of the survey includes:
54 general statements about mathematics and students' views of mathematics, which students rate on a five-point agree/disagree scale, and
6 items specifically rating the module used in the class on a 10-point scale.
(Some of these items were not appropriate for 8th-grade students.)
TIMSS (2003) Grade 8 Student Survey (see http://timss.bc.edu/timss2003i/PDF/T03 Student 8.pdf).
Kadijevich, D. (2006). Developing trustworthy TIMSS background measures: A case study on mathematics attitudes. In The Teaching of Mathematics, Vol. IX, pp. 41–51. http://elib.mi.sanu.ac.yu/files/journals/tm/17/tm924.pdf
Statements 8a–8d, 8f, 8g, and 9a–9e of the TIMSS 2003 Grade 8 Student Questionnaire were used as indicators. Item 8e was not used because of its inappropriate loading on the first underlying factor concerning all twelve statements. Self-Confidence in Learning Mathematics (SCLM) was measured by a 4-item Likert scale administered by means of the statements "I usually do well in mathematics", "Mathematics is more difficult for me than for many of my classmates", "Mathematics is not one of my strengths", and "I learn things quickly in mathematics" (statements 8a, 8c, 8f, and 8g of the Questionnaire; to achieve positive meaning, scoring 1–4 was reversed for items 8a and 8g). Usefulness of Mathematics (UM) was measured by a 4-item Likert scale administered by means of the statements "I think learning mathematics will help me in my daily life", "I need mathematics to learn other school subjects", "I need to do well in mathematics to get into the faculty/university of my choice", and "I need to do well in mathematics to get the job I want" (statements 9a, 9b, 9c, and 9e; to achieve positive meaning, scoring 1–4 was reversed for all these items). Liking Mathematics (LM) was measured by a 3-item Likert scale administered by means of the statements "I would like to take more mathematics in school", "I enjoy learning mathematics", and "I would like a job that involved using mathematics" (statements 8b, 8d, and 9d; to achieve positive meaning, scoring 1–4 was reversed for all these items).
CAQ: Student Computer Attitude Questionnaire - http://www.tcet.unt.edu/research/
Used by Attitudes Toward Information Technology, Project for the Longitudinal Assessment of New Information Technology Attitudes in Education (Dr. Gerald Knezek and Dr. Rhonda Christensen); developed by the Texas Center for Educational Technology.
Knezek, G., & Christensen, R. (1995). A comparison of two computer curricular programs at a Texas Jr. High School using the Computer Attitude Questionnaire (CAQ) (Technical Report 95). Texas Center for Educational Technology, Telecommunications and Informatics Laboratory.
ELS 2002: Education Longitudinal Study of 2002 – Extent of use of computers in mathematics: http://nces.ed.gov/surveys/els2002/pdf/StudentQ_baseyear.pdf
NAEP (2000) & (2003) Grade 8 Student Survey
CCSR Surveys: Consortium on Chicago School Research: http://ccsr.uchicago.edu/content/page.php?cat=4&content_id=25
MCSS: University of Minnesota Multi-College Student Survey: Experiences with Instructional Technology http://dmc.umn.edu/surveys/student-eval/student-eval.pdf
Jorn, L., Martyr-Wagner, M., et al. (2001). Multi-college student survey: Experiences with instructional technology. Twin Cities, MN: University of Minnesota
1 We exclude schools that serve grades above Grade 8 because we assume those schools will have far more eighth-grade students who can take the 9th- or 10th-grade Algebra I class within the same building.
2 The CCD indicates that there are about 197 schools in Maine that serve Grade 8 and below, but not Grade 9 and above. Of those 197 schools, we estimate that 40–50% do not currently provide Algebra I as a stand-alone class for eighth graders, which yields approximately 79–99 schools and hence the 80–100 estimate above.