OLL OMB_3.10.09_Section B_Clean

Evaluation of Evidence-Based Practices in Online Learning

OMB: 1875-0251

B. Collection of Information Employing Statistical Methods

1. Respondent Universe

Case studies will be conducted at up to twenty-four sites: up to twelve K-12 sites and up to twelve teacher professional development (TPD) sites. For K-12 case studies, the contractor will visit both typical and exemplary sites throughout the state of Florida in order to compare practices across these two settings and isolate the practices associated with exemplary sites. During both K-12 and TPD developer site visits, we expect to meet with between three and eight individuals, including directors and senior officials, instructional developers, evaluators or quality assurance personnel, and teachers, if available. At school-based sites, we expect to meet with program administrators, information technology specialists, between one and three instructors, and a focus group of between three and eight students using the application. During virtual site visits, we will interview one or two instructors and conduct a focus group of between three and eight students.

The contractor has identified FLVS and a large suburban school district as the primary K-12 developers of interest based on a two-step process for identifying states or districts with viable data systems. The contractor first sought nominations from experts for states most likely to have the necessary data, and then proceeded to interview representatives from candidate states to gather additional information. In all, the contractor contacted seven states about the availability of data for the secondary data analysis: Alabama, Arkansas, California, Florida, Georgia, South Carolina, and Tennessee. Of these, only Florida, Tennessee, and Arkansas appeared to have the appropriate data and number of enrollments necessary to conduct a detailed analysis. A large suburban school district was also suggested as a possibility for inclusion in the evaluation during additional discussions with online learning experts.

Following this nomination process, the contractor engaged in conversations with the nominated sites to determine the viability of their online and comparison student data. Of the states and districts that were nominated, only Florida and the large suburban school district were found to possess the data necessary to compare online and face-to-face student outcomes while controlling for prior achievement.

The contractor has obtained student data for online and face-to-face students in 2006-07 from FLVS and the Florida Department of Education. The contractor expects that FLVS and Florida data from the 2007-08 year will be ready for use in summer 2009. The large suburban school district has agreed to provide the contractor with both 2006-07 and 2007-08 data in spring 2009.

Once the contractor has verified and cleaned the FLVS and Florida 2006-07 data, districts with “high enrollment” will be identified. “High enrollment” districts are defined as those sites with at least 20 students enrolled in online courses and at least 10 students enrolled in a given course segment. The contractor is focusing on these districts to ensure that there is an adequate level of online learning occurring to merit site visits. Within the pool of high enrollment districts, statistical analysis will be used to examine student pass rates and student achievement, controlling for a variety of student and school characteristics. As described in detail in the text that follows, the contractor will perform statistical analysis to identify sites with above-average achievement as well as average achievement, as measured by final course grades. Student retention rates in individual math and English courses will also be computed.
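The high-enrollment screen described above reduces to a simple two-threshold filter. The sketch below illustrates the rule; the record layout, field names, and sample districts are hypothetical, as the document does not specify the data format.

```python
# Sketch of the "high enrollment" district screen described above.
# Record structure and data are hypothetical; the rule comes from the text:
# a district qualifies if at least 20 students are enrolled in online
# courses AND at least 10 students are enrolled in some course segment.

def high_enrollment_districts(districts):
    """Return names of districts meeting both enrollment thresholds."""
    qualifying = []
    for d in districts:
        total_online = sum(d["segment_enrollments"].values())
        max_segment = max(d["segment_enrollments"].values(), default=0)
        if total_online >= 20 and max_segment >= 10:
            qualifying.append(d["name"])
    return qualifying

# Illustrative records: segment_enrollments maps course segment -> students.
sample = [
    {"name": "District A", "segment_enrollments": {"Algebra I, Seg 1": 15, "English I, Seg 1": 12}},
    {"name": "District B", "segment_enrollments": {"Algebra I, Seg 1": 8, "English I, Seg 1": 7}},
    {"name": "District C", "segment_enrollments": {"Algebra I, Seg 1": 25, "English I, Seg 1": 4}},
]

print(high_enrollment_districts(sample))  # District A and District C qualify
```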

District-level student achievement will be analyzed using linear regression with a set of covariates representing student and school-level characteristics. Covariates in the model will include student grade level, a flag for limited English proficiency status, free or reduced price lunch status in 8th grade, prior-year FCAT scores in reading (for English courses) or math (for math courses), and a dummy variable that indicates if the student previously attempted and failed the course. The model will also include median school Stanford 10 scores in math and reading for grade 9 in 2006-07 as a control for prior achievement. The model will be estimated without a constant term and with a dummy variable representing each district, so that the estimated mean for each district can be compared to the overall mean. Once the regression model is estimated, the contractor will use an F-test to compare the mean estimated final grade for each district to the estimated mean across all districts in order to identify those sites with achievement that is statistically different from the average.
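A minimal numpy sketch of this model follows, using synthetic data and a single stand-in covariate; the actual model includes the full covariate set listed above (grade level, LEP status, lunch status, prior FCAT scores, and so on), and the variable names here are invented for illustration. The design matrix has one dummy column per district and no constant, so each district coefficient is that district's adjusted mean, and each district is tested against the unweighted average of the district means.

```python
import numpy as np

# Illustrative sketch of the district-level achievement model described above:
# OLS with no constant and one dummy per district, so each district
# coefficient estimates that district's adjusted mean final grade.
# Synthetic data and a single stand-in covariate; the actual model also
# includes grade level, LEP status, lunch status, prior FCAT scores, etc.

rng = np.random.default_rng(0)
n_districts, per_district = 5, 40
district = np.repeat(np.arange(n_districts), per_district)
prior_score = rng.normal(0.0, 1.0, district.size)          # stand-in covariate
true_means = np.array([70.0, 72.0, 75.0, 80.0, 73.0])
y = true_means[district] + 2.0 * prior_score + rng.normal(0.0, 3.0, district.size)

# Design matrix: one dummy column per district plus the covariate, no intercept.
X = np.column_stack([np.eye(n_districts)[district], prior_score])

beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sigma2 = resid @ resid / (y.size - X.shape[1])
cov = sigma2 * np.linalg.inv(X.T @ X)

# Test each district mean against the unweighted mean of the district means
# (an F-test with 1 numerator df, i.e., a squared t-test on the contrast).
f_stats = []
for j in range(n_districts):
    c = np.zeros(X.shape[1])
    c[:n_districts] = -1.0 / n_districts
    c[j] += 1.0
    f_stats.append(float((c @ beta) ** 2 / (c @ cov @ c)))

print([round(b, 1) for b in beta[:n_districts]])  # estimated district means
print([round(f, 1) for f in f_stats])             # district-vs-average F stats
```

Districts whose F statistic exceeds the critical value would be flagged as statistically different from average; in this synthetic example the district with true mean 80 sits furthest from the overall mean and produces the largest statistic.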

Similarly, district pass rates will be analyzed using logistic regression with the same set of covariates. After estimating the regression model, the contractor plans to use a Wald test to compare the estimated odds of passing the course for each district to the estimated odds across all districts in order to identify those sites with pass rates that are statistically different from the average pass rate.
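The logistic counterpart can be sketched the same way. The example below fits a dummy-only logit by Newton-Raphson on synthetic pass/fail data (the real model shares the covariates of the linear model, omitted here for brevity) and forms a Wald statistic for each district's log-odds against the average log-odds across districts; all names and data are invented for illustration.

```python
import numpy as np

# Illustrative sketch of the district pass-rate model described above:
# logistic regression with one dummy per district and no constant, fit by
# Newton-Raphson, followed by Wald tests comparing each district's log-odds
# of passing to the average log-odds across districts. Synthetic data; the
# real model shares the covariates of the linear achievement model.

rng = np.random.default_rng(1)
n_districts, per_district = 4, 300
district = np.repeat(np.arange(n_districts), per_district)
true_logit = np.array([0.5, 0.8, 0.6, 1.8])               # district log-odds
passed = rng.random(district.size) < 1 / (1 + np.exp(-true_logit[district]))

X = np.eye(n_districts)[district]                          # dummy-only design
beta = np.zeros(n_districts)
for _ in range(25):                                        # Newton-Raphson
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)
    grad = X.T @ (passed - p)
    hess = X.T @ (X * W[:, None])
    step = np.linalg.solve(hess, grad)
    beta = beta + step
    if np.max(np.abs(step)) < 1e-10:
        break

cov = np.linalg.inv(hess)                  # asymptotic covariance of beta

# Wald statistic for H0: district j's log-odds equals the district average.
wald = []
for j in range(n_districts):
    c = np.full(n_districts, -1.0 / n_districts)
    c[j] += 1.0
    wald.append(float((c @ beta) ** 2 / (c @ cov @ c)))

print([round(b, 2) for b in beta])   # estimated district log-odds
print([round(w, 1) for w in wald])   # district-vs-average Wald statistics
```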

Once district course grades and pass rates have been estimated through these linear and logistic regression models, the contractor will select up to five implementation sites with “exemplary” achievement, including pass rates, and up to five sites with “typical” achievement, using the coefficients on the district dummy variables and the results of the significance tests. The contractor will also ensure that the selected implementation sites are not homogeneous (e.g., not all large districts) and that there is some diversity in the characteristics of students who take online courses. A similar process will be conducted with data from the large suburban school district, pending its approval of school-based site visits.
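The selection step above can be sketched as a small rule over the estimated district means and significance flags. The record structure, field names, and cutoffs below are hypothetical; the text specifies only that up to five exemplary and up to five typical sites are chosen from the dummy-variable coefficients and test results.

```python
# Sketch of the site-selection step described above: "exemplary" sites are
# significantly above the overall mean; "typical" sites are not statistically
# different from it. Records and field names are hypothetical.

def select_sites(districts, max_per_group=5):
    exemplary = [d for d in districts
                 if d["significant"] and d["est_mean"] > d["overall_mean"]]
    typical = [d for d in districts if not d["significant"]]
    # Rank exemplary candidates by how far they sit above the overall mean.
    exemplary.sort(key=lambda d: d["est_mean"] - d["overall_mean"], reverse=True)
    return exemplary[:max_per_group], typical[:max_per_group]

overall = 74.0
candidates = [
    {"name": "D1", "est_mean": 80.1, "overall_mean": overall, "significant": True},
    {"name": "D2", "est_mean": 74.3, "overall_mean": overall, "significant": False},
    {"name": "D3", "est_mean": 68.0, "overall_mean": overall, "significant": True},
    {"name": "D4", "est_mean": 77.5, "overall_mean": overall, "significant": True},
]

exemplary, typical = select_sites(candidates)
print([d["name"] for d in exemplary])  # significantly above average
print([d["name"] for d in typical])    # not statistically different
```

In practice the diversity screen described above (district size, student characteristics) would be applied on top of this ranking.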

The contractor expects that these sites will include both physical and virtual locations. For physical locations, such as schools, the contractor will screen potential sample schools for student achievement, particularly for subjects and grades associated with the offering. The proposed implementation sites will be submitted to the Department for comment and approval. Sites will be notified of their selection only after Department approval of their inclusion in the study.

The contractor has developed the following guidelines for selecting the 12 teacher professional development case studies.

  1. The learning experience must prepare either pre-service or in-service teachers to teach online.

  2. Courses, graduate degree programs, and mentoring programs should be comparable to traditional offerings in duration and intensity (e.g., courses of eight to twelve weeks of instruction covering multiple sub-topics associated with the course topic; mentoring should be at least one school year in length).

  3. The learning experience should include a significant asynchronous component, whereby twenty percent or more of instruction is designed to take place asynchronously. The balance may be synchronous online communication (e.g., text chat, Voice-over-IP) and/or face-to-face interaction.

The contractor will classify TPD interventions that meet the above criteria as either “entirely online” or “blended.” Interventions that are classified as entirely online will not require any face-to-face interaction between students and teachers or among students. All communication, support, and activity will occur online. Blended approaches will require some face-to-face interaction; the intervention may be implemented in a traditional school and classroom where teachers or other faculty provide support and feedback, or the intervention may require teachers to attend short workshops before, during, or after attending an online course.

When choosing among several possible candidates for both the K-12 and TPD implementation sites, the contractor will give priority to interventions that have demonstrated strong, significant effects on student learning. The contractor will strive for diversity in terms of geographic locations and type of provider. Providers can be state or local governments, universities, non-profits, or for-profit entities.

2. Data Collection Procedures

The data collection plan will address the evaluation questions articulated in Section I:

  • What conditions influence the quality of online learning?

  • What implementation practices support effective online learning?

  • What are emergent, promising practices for effective use of online learning to replace face-to-face K-12 courses?

  • What are emergent, promising practices for effective use of online learning to enhance face-to-face instruction in K-12 settings?

The plan for this data collection will be submitted to the Department in winter 2009. The plan will carefully describe the types and characteristics of the selected online instructional offerings in Florida and a large suburban school district; the results of the literature review and meta-analysis; and federal and state policy relevance. The initial and revised plan for the online learning case studies will cover the following core areas: key evaluation questions for the data collection; data collection methods; potential survey item indices, if applicable; site visit interview, observation and focus group protocols, if applicable; proposed notification procedures; data collection timeline; and preliminary plan for data analysis.

The data collection plan will cover the site visits to be conducted during Year 3 of the project and will describe the criteria for selecting sites as well as goals associated with the case study visits. We propose three types of site visits: developer site visits, school site visits, and virtual site visits. Through interviews and observations, users at the selected FLVS and TPD implementation sites will help identify particular practices as helpful or detrimental, and what additional supports might be necessary. When user data is triangulated with direct observations and outcome data, the questions of “why” and “how” can be addressed. The case studies will provide researchers with important information about the design of these online applications. Site visits will also allow researchers to obtain detailed information concerning implementation of the offering, as well as on support practices of schools, instructors, and students. This level of detail would not be possible to collect through survey data or analysis of secondary datasets.

For developer site visits, the contractor will meet with both K-12 and TPD providers in order to learn more about the functionality associated with the particular offering and collect information associated with the use of the offering and associated outcomes. Before conducting a site visit, researchers will conduct a thorough review of material available on the Web about the product. During developer visits, researchers will learn about the offering through interviews and system demonstrations.

For virtual courses and related offerings, the contractor will seek teachers, tutors, and support staff located at or near the developer site for interviews. In many cases, developers collect information about site use, frequency of interaction between students and teachers, and related information, and the contractor will collect this information where it is available in aggregate form (thus protecting student privacy). Curriculum and support materials, ancillary services, policies, and guidelines associated with the online offering will also be collected when available. The contractor will request log-in information so that researchers can interact with the offering after the site visit.

For school site visits, the contractor will collect data through interviews, focus groups, and observations. Interviews will be conducted with key staff associated with the offering, including principals, technology coordinators, and teachers. Researcher queries will ask for details about the goals and rationales for use, the frequency and context of use of the offering, and the supports and barriers to its use at the site. (For a list of topics and draft items to be covered, see Appendix D for the K-12 and teacher professional development protocol items.) The contractor will conduct observations of individual or classroom use of the offering. The likely focus of these observations will be on the nature of interaction between students and teachers, both online and in school, as well as on the ease of use of the offering. Where the offering is not part of a whole-school effort, the contractor may also want to identify teachers of appropriate grade levels and subjects who do not use the offering in order to compare curriculum and instruction in the school with and without the offering.

The contractor will conduct student focus groups to collect data from students about their experiences using the online offering. Researchers will ask about strengths and weaknesses of the offering, additional supports or activities they think would better incorporate the online offering with the course of their education, and perceived attitudes and outcomes associated with the offering. The contractor will also collect the following types of documents from school officials: associated curriculum and support materials, ancillary services, policies, and guidelines for use. The contractor will collect information about student academic outcomes when possible.

Two researchers will conduct each school site visit. In most cases, one site visitor will conduct the focus group or interview while the other site visitor concentrates on recording the interview. (Site visitors will switch roles across respondents to avoid excessive fatigue). Regardless of the form of data collection, an internal data quality review team will check data for completeness and plausibility prior to preparation of the data set for analysis.

Permission and privacy concerns in virtual focus groups will be handled in much the same way as for physical site visits. Participating staff in K-12 sites will receive student consent and parent permission forms for distribution to students and parents. The principal or other site administrator will forward permission slips to parents of students who may participate in virtual focus groups. Signed consent forms will be faxed or emailed back to the contractor. Students, particularly those under the age of eighteen who are participating in virtual focus groups, either via chat or videoconference, will be addressed by first names only.

In cases where researchers are given access to online systems containing identifying and other personal information or discourse (e.g., teacher-developed materials, participant profiles, student work, discussions, and chats), the contractor will work with the provider in advance to identify specific areas, groups, or courses that the researchers will access, and obtain advance consent from instructors and/or learners to enter the designated area to collect data. Typically, gaining consent from instructors and students can be accomplished directly through the system (e.g., by posting a message) or through other pre-established online means of communication (e.g., course e-mail list). This is the online analog to researchers walking into a face-to-face course and observing instruction. Researchers will remove identifying information from discussion transcripts and other materials used to document the case. The contractor will treat log-in names and passwords provided by developers confidentially and will request that any accounts given be canceled at the completion of the data collection.

Project-wide training for site visitors, including role play with the interview and focus group protocols, will be conducted at the contractor’s offices.

a. Cross-Site Analysis

Before conducting the K-12 and TPD site visits, the contractor will collect and review relevant documents (e.g., information about FLVS or the implementation site that is available online). Although the K-12 and TPD cases will be treated separately, in both cases the analytic process will begin as the contractor uses these documents in conjunction with the conceptual framework to generate initial hypotheses about the features, capabilities, and practices associated with each online learning application.

Analysis will continue during the site visits as researchers gather data and compare findings with the study’s conceptual framework. Two researchers will conduct each site visit, and throughout the visit, the team will informally discuss their initial impressions about key features of the online learning implementation and the degree to which the emerging story matches or contradicts the study’s hypotheses. More formally, the site visitors will meet each day of the visit to go through the case study debriefing form and formulate preliminary responses.

Once each site visit is completed, site visitors will draft case study reports. Drafting such reports requires researchers to reduce their field notes to descriptive prose within the structure of a formal debriefing form. This translation of field notes to a case study report involves sorting all of the available data at each site (including interviews, observations, and document reviews) by topic areas that define the sections of the protocols and debriefing forms.

To facilitate the analysis of the qualitative data, ATLAS.ti qualitative data analysis software will be used to store, code, and organize all of the interview and observation data. ATLAS has advantages over other qualitative analysis software in that it was designed for online use by multiple coders and analysts. ATLAS will allow structuring the data set by site or by offering (i.e., grouping data from the three related site visits for each offering). Using ATLAS, the contractor’s researchers will perform the initial coding of the data reporting templates for each school, using a coding scheme linked to the site visit protocols and developed before the first site visit. Additional codes will be developed that correspond to new phenomena unaccounted for previously and will be integrated into the coding scheme throughout the course of the data collection. Using a variety of ATLAS-generated reports (e.g., code frequency counts, memo lists), researchers will examine the range of categories and begin answering this study’s evaluation questions. Staff have extensive experience with qualitative data coding, particularly using ATLAS.ti.
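The code-frequency reports mentioned above are, at bottom, tallies of coded segments by code and by site. The sketch below illustrates that computation with Python's standard library; in the study itself these reports are produced within ATLAS.ti, and the codes, sites, and segments here are entirely hypothetical.

```python
from collections import Counter

# Illustrative sketch of a code-frequency report of the kind described above
# (produced in the study itself by ATLAS.ti). Codes and site names are
# hypothetical; each coded segment is tagged with a site and a code.

coded_segments = [
    ("Site 1", "teacher-student interaction"),
    ("Site 1", "ease of use"),
    ("Site 1", "teacher-student interaction"),
    ("Site 2", "ease of use"),
    ("Site 2", "technical barriers"),
    ("Site 3", "teacher-student interaction"),
]

# Frequency of each code across all sites.
overall = Counter(code for _, code in coded_segments)

# Frequency of each code broken out by site.
by_site = {}
for site, code in coded_segments:
    by_site.setdefault(site, Counter())[code] += 1

print(overall.most_common())
print({site: dict(counts) for site, counts in sorted(by_site.items())})
```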

The goal of the cross-site analysis is to compare, contrast, and synthesize findings and propositions from each case, in order to make general statements about the larger sample. The contractor will use a three-step hierarchical procedure for analyzing the data during the cross-site analysis.

Step 1: The cross-site analysis process will begin with a team debriefing meeting involving all site visitors. Debriefings of this type are an efficient means of developing themes that will govern cross-site analyses. This process will focus on general themes across the thirty cases, identifying general statements that address the evaluation questions on promising practices in online learning. These themes will be added as codes in the ATLAS coding structure.

Step 2: Once the data capture forms have been coded, major topics and themes will be divided among the researchers who will focus their attention on more in-depth analysis. Each analyst will have the entire set of case reports available in ATLAS. Analysts will conduct ATLAS queries on their assigned issues and topics to pull up relevant examples and descriptions across the thirty cases. They will be able to develop additional codes during this process and add them to the data set at this stage.

Step 3: Researchers will document the specific cases that account for interesting findings and pursue a particular theme by synthesizing important details, making explanatory judgments, and refining previous hypotheses and propositions. The researchers will draft relevant text that describes and explains the findings relevant to the cases for the final report.

Once the cross-site analysis has been completed and researchers have interpreted and refined the data from the case studies, the contractor will incorporate these findings into a final evaluation report. This reader-friendly document will ultimately provide research-based guidance to policymakers, administrators, and educators on how to optimally implement online learning for K-12 education and teacher preparation.

b. Secondary Data Sources

The use of secondary data sources will guide case study selection, enhance the project’s analyses, and avoid duplication of effort. As described earlier, our systematic review of online learning literature will help focus the project on those conditions and practices most likely to influence academic achievement. A second literature review, currently underway, will identify how teachers are prepared to teach online; the programs, conditions, and practices identified through this review will help guide TPD case study selection. In addition, by gathering secondary sources about the online implementations, the contractor can focus interview time on issues not covered in existing documents. The contractor will also collect data for two cohorts of K-12 students (one in 2006-07 and the other in 2007-08) from Florida and a large suburban school district, in order to examine the effects that online learning has on K-12 student achievement as compared to face-to-face learning, after controlling for prior achievement.

c. Prepare Notification Materials and Gain District and School Cooperation

The notification process will include a letter to online developers and program administrators from the U.S. Department of Education. The notification letter will include a description of the study and contact information for one of the contractor’s researchers whom the recipient may contact in case there are questions or concerns. The letter will describe the purpose, objectives, and importance of the study. A researcher will follow up with developers and administrators to schedule appropriate times for site visits.

3. Methods to Maximize Response Rate

The contractor will select online implementations based on their demonstrated student outcomes and the exemplary practices associated with online learning, an approach that makes participation attractive to developers and implementation sites. In addition, past experience has shown that working with interviewees to schedule site visits at convenient times facilitates data collection activities. The contractor will make clear in communications with potential respondents that it will do everything possible to minimize the burden associated with data collection activities.

4. Pilot Testing

To improve the quality of data collection instruments and control burden on respondents, all instruments for use with the K-12 cases have been pre-tested. A separate document reporting on the results of pilot-testing has been provided to OMB under separate cover. The TPD protocols were based heavily on the K-12 protocols, but have not yet been specifically pre-tested. Pre-testing of TPD site visit protocols will begin after the next TWG meeting, scheduled for spring 2009, in order to incorporate feedback from the TWG prior to pre-testing. Several members of the TWG are themselves involved with online development, and by walking through objectives and topics with them, the contractor will gain additional information about what developers and implementers of online offerings are likely to be able to provide. In addition, the contractor will seek to pilot instructor protocols with individuals who are currently teaching or have taught teachers to teach online.

The protocols will standardize data collection efforts across each site, while still providing flexibility for the site visitor to customize questions to individual sites.

5. Contact Information

Dr. Marianne Bakia is the Co-Project Director for the study. Her mailing address is SRI International, 1100 Wilson Blvd., Arlington, VA 22209. Dr. Bakia can also be reached at 703-247-8571 or via e-mail at marianne.bakia@sri.com.

Dr. Barbara Means is the Co-Project Director for the study. Her mailing address is SRI International, 333 Ravenswood Avenue, Menlo Park, CA 94025. Dr. Means can also be reached at 650-859-4004 or via e-mail at barbara.means@sri.com.
