NCES Quick Response Information System
FRSS Supporting Statement
OMB: 1850-0733

TO:       Rochelle W. Martinez

DATE:     June 11, 2008

THROUGH:  Kathy Axt

FROM:     Edith McArthur

SUBJECT:  Request for Clearance for the Proposed Fast Response Survey System (FRSS) 92: Educational Technology in U.S. Public Schools, Fall 2008


Justification


The National Center for Education Statistics (NCES), U.S. Department of Education, proposes to employ the Fast Response Survey System (FRSS) to conduct a survey of educational technology in public elementary and secondary schools. The survey was requested by the Office of Educational Technology (OET) to provide national data on current and emerging educational technology within the nation’s public school system. The survey is included in the National Educational Technology Leadership national activities spending plan.


The proposed survey will provide data that can be compared to some of the results from the FRSS survey series Internet Access in U.S. Public Schools and Classrooms: 1994-2005, which focused on access to computers and the Internet. For example, these surveys found that in 2005, the ratio of students to instructional computers with Internet access in public schools was 3.8 to 1, a decrease from the 12.1 to 1 ratio in 1998. The proposed survey will also collect information needed to calculate the current ratio of students to instructional computers.


In addition, the new survey will cover a broader range of educational technology topics. It will provide current national statistics on the availability and types of (1) hardware (e.g., computers, hand-held devices, peripherals); (2) network and Internet access (e.g., wireless access); and (3) operating systems. An important issue for technology usage is the ability of the school staff to integrate technology into the curriculum. Therefore, the proposed survey will collect information on the support within the school to help staff integrate technology into instruction, as well as the provision of technical support. To obtain information on the climate for educational technology and identify potential barriers within schools, respondents will be asked to report their perceptions about technology issues in their school and district. The number of instructional classrooms will be collected in order to report the average number of computers per classroom. Finally, to update and/or verify information from the sampling frame, the survey will collect information on the percentage of students eligible for free- or reduced-price lunch and the grades taught at the school. By addressing access to and support for current and emerging educational technology in public schools, the survey will provide valuable national data for OET and other educational policymakers at the national, state, and district levels.


This is one of three proposed surveys that OET has requested be conducted with the FRSS. The other two are district-level and teacher-level surveys.1 OET envisions the three new surveys as a barometer of technology access and use within public elementary and secondary school districts, schools, and classrooms.


The FRSS survey, under OMB clearance #1850-0733, is authorized under Section 153 (a) of the Education Sciences Reform Act of 2002 (Public Law 107-279), which states that the purpose of NCES is “to collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations.”



Overview of Data Collection


Westat will collect the information for the Early Childhood, International and Crosscutting Studies Division, NCES, U.S. Department of Education, using the FRSS. Westat is responsible for the questionnaire development; sample design and selection; data collection; telephone follow up; editing, coding, keying, and verification of the data; and production of tabulations and the report detailing the results of the survey.


Because this survey includes new topics, substantial development work was conducted. The development of the survey involved several phases. First, after discussions with OET about desired survey topics, Westat conducted a brief literature review and a search of existing survey instruments. The initial draft instrument included some newly crafted items as well as some items adapted from existing surveys. Second, Westat conducted four rounds of feasibility calls to test and improve the instrument. During the feasibility calls, respondents were not asked to complete the questionnaire; instead, they reviewed the draft instrument and gave feedback in telephone interviews. These calls were conducted over an extended period (April 2007 through January 2008), and the questionnaire for each round was substantially different from that used in the previous rounds. We contacted nine or fewer respondents in each round. Respondents were asked about the clarity and relevance of the survey items, and about whether they could answer each question without undue burden. After each round of calls, the instrument was revised and submitted to OET and NCES for review and further revision. Following the NCES Questionnaire Review Board (QRB) meeting, the questionnaire was revised and submitted for NCES review and approval. This questionnaire draft was then pretested through calls to technology specialists at selected public elementary and secondary schools. Following the pretest, the questionnaire was revised again and is being submitted with this official request for OMB clearance.


We propose a nested sample design that links districts, schools, and teachers. The proposed design includes a nationally representative sample of about 2,000 schools selected from the NCES Common Core of Data (CCD) Public School Universe File. The data collection will be accomplished by means of a self-administered survey. Respondents will have the option of completing the survey on a traditional paper and pencil questionnaire or on a Web version of the questionnaire that will be accessed through the Internet. The questionnaire is limited to three pages of information readily available to respondents and can be completed by most respondents in 30 minutes or less. These procedures are typical for FRSS surveys and result in minimal burden on respondents.


Prior to contacting schools for survey collection, a courtesy information packet consisting of a cover letter and copy of the questionnaire will be mailed to the superintendent of each district with schools selected for participation. The packet also will include a list of the schools within the districts that are in the sample. Any special requirements that districts have for approval of surveys will be met before schools in those districts are contacted.


To minimize the burden on schools, Westat will coordinate the collection of the school survey with the collection of teacher sampling lists (the teacher sampling lists are covered in a separate OMB package for the teacher survey). Collection and follow-up activities for the school surveys and teacher lists will be handled by the same Westat staff to minimize the number of contacts made to principals and other school staff.


Questionnaires and information needed to access the Web survey will be mailed in September 2008 to the principals of each sampled school. One week after mailout, we will send thank-you/reminder postcards thanking those who responded and reminding those who have not yet responded. Telephone follow-up for nonresponse will begin about 3 weeks after the questionnaires have been mailed to the schools. Experienced telephone interviewers will be trained to conduct the nonresponse follow-up and will be monitored by Westat supervisory personnel. The response rates for FRSS surveys of schools typically have been 90 percent or greater.


Data Collection Instrument


The questionnaire package will include two cover letters: (1) a principal cover letter asking that the survey be completed by the person most knowledgeable about educational technology within the school; and (2) a cover letter for the survey respondent. These cover letters are enclosed as Attachments 1 and 2. Both cover letters request participation and introduce the purpose and content of the survey. The cover letters also include instructions on how to complete and return the survey, as well as contact information in case of queries. Included in the mailing will be information about the option to complete a Web version of the survey.


The questionnaire (Attachment 3) collects information on various aspects of educational technology availability and use in public schools, as summarized below.


  • Questions 1-4 collect counts of computers by characteristics (e.g., instructional use, type, location, Internet access, age).


  • Question 5 asks about the operating system(s) for the instructional computers in the school.


  • Questions 6-7 collect counts of hand-held devices (e.g., Palm OS, Windows CE, Pocket PCs, BlackBerries) and other hardware (e.g., LCD projectors, interactive whiteboards, digital cameras) in the school.


  • Questions 8-9 collect information on the type of wireless network access in the school and on the use of the district network and Internet for various activities.


  • Questions 10-12 collect information on time required for technical support activities (question 10) and the leadership and support available to help integrate technology into instruction and provide technical support (questions 11 and 12).


  • Question 13 asks respondents to report their perceptions about technology issues in the school and district.


  • Question 14 collects the number of instructional classrooms so that the average number of computers per classroom can be reported.


  • Questions 15-16 collect information on the percentage of students eligible for free- or reduced-price lunch and the grades taught at the school, which will be used to update and/or verify information from the sampling frame.



Review by Persons Outside the Agency


All development work occurred in close collaboration with the Office of Educational Technology. The various draft versions of the instrument were also tested with individuals in the field, for example, educational technology specialists in schools. In addition to multiple rounds of feasibility calls, the questionnaire was most recently pretested through calls to educational technology specialists in schools. Based on input from these respondents, NCES, and OET, the questionnaire was revised and submitted as Attachment 3 in this official request for OMB clearance.


Survey Cost


The survey is estimated to cost the Federal government about $330,000, including about $300,000 for contractual costs and $30,000 for salaries and expenses. Based upon costs of past FRSS sample surveys, contractual costs are divided into the subtask costs shown in Exhibit 1.


Exhibit 1. Estimated contractual costs by subtask


Subtask                                          Cost

Sampling                                      $10,000
Survey preparation                            $50,000
Data collection                              $125,000
Data analysis                                 $40,000
Report preparation and dissemination          $75,000

Total                                        $300,000


Time Schedule


Mailing of the survey is planned for September 2008. One week after mailout, we will send thank-you/reminder postcards thanking those who responded and reminding those who have not yet responded. About 3 weeks after mailout of the surveys, Westat will begin telephone follow-up for nonresponse. Data collection is scheduled for completion about 12 weeks after the initial mailout. Exhibit 2 shows the anticipated schedule.


Exhibit 2. Anticipated data collection schedule


                                         Cumulative workdays
                               From submission         From RIMG/OMB
Activity                       to RIMG/OMB             approval

Package to OMB                        0                      -
Package approved by OMB              45                      0
Mail-out of questionnaire            55                     10
Telephone follow-up started          70                     25
Follow-up completed                 115                     70


Plan for Tabulation and Publication


Most of the analyses of the questionnaire data will be descriptive in nature, providing NCES, OET, and other data users with tables and appropriate explanatory text. Survey responses will be weighted to produce national estimates. Tabulations will be produced for each data item. Crosstabulations of data items will be made with selected classification variables, such as the following:


  • School level (elementary and secondary/combined);

  • School enrollment (less than 300, 300-999, and 1,000 or more);

  • Geographical region (Northeast, Southeast, Central, and West);

  • Locale (city, urban fringe, town, rural);

  • Percent minority enrollment (less than 6 percent, 6-20 percent, 21-49 percent, 50 percent or more); and

  • Percent of students eligible for free or reduced-price lunch (less than 35 percent, 35-49 percent, 50-74 percent, 75 percent or more).


Reports of the findings will be distributed to the data requester, survey respondents, and, upon request, to other interested individuals and organizations, as well as published on the NCES website.


Statistical Methodology


Reviewing Statisticians


Adam Chu, Senior Statistician, Westat, (301) 251-4326, was consulted about the statistical aspects of the design.


Respondent Universe


The respondent universe for the proposed survey on educational technology will include the individuals most knowledgeable about educational technology in all regular public elementary and secondary/combined schools in the United States. This survey is one of three related surveys to be conducted under a nested design involving a sample of districts, schools within districts, and teachers within schools. For the purpose of this survey, elementary schools are defined to be those with a high grade of 8 or less and a low grade of 6 or less. All other schools are considered to be "secondary/combined" schools. Vocational education, special education, alternative/other non-regular schools, and schools operated by the Department of Defense or Bureau of Indian Affairs are ineligible for the survey, as are schools with a high grade of kindergarten or lower, ungraded schools, and schools in the outlying U.S. territories. As described in the following section, a stratified sample of approximately 1,000 elementary schools and 1,000 secondary/combined schools will be selected from the most up-to-date NCES Common Core of Data (CCD) Public School Universe File. Table 1 summarizes the distribution of schools in the CCD Public School Universe File by level, enrollment size class, and percent of students eligible for free/reduced price lunch. Note that while the counts in the table are based on 2005-06 CCD data, the more current 2006-07 CCD file will be used for sampling if it is available.
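

The level classification above is a simple rule on the CCD grade-span fields. The sketch below is a minimal illustration of that rule, not the actual frame-processing code; it assumes grades are coded numerically (e.g., kindergarten as 0), which is an assumption about the file layout.

    def school_level(low_grade, high_grade):
        """Classify a school under the survey definition: elementary if
        the high grade is 8 or less and the low grade is 6 or less;
        otherwise secondary/combined."""
        if high_grade <= 8 and low_grade <= 6:
            return "elementary"
        return "secondary/combined"

    print(school_level(0, 5))    # K-5 school  -> elementary
    print(school_level(7, 12))   # 7-12 school -> secondary/combined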


Table 1. Number of schools in the 2005-06 CCD Public School Universe File by level, enrollment size class, and percent of students eligible for free/reduced price lunch


                                            Percent of students eligible for
                                               free/reduced price lunch
Level/enrollment      Number of    ----------------------------------------------------
size class*           schools      Missing   Less than 35   35 to 49   50 to 75    75+

Elementary
  Less than 300         16,335        348        5,294        3,230      4,366    3,097
  300 to 499            20,895         96        7,705        3,701      5,183    4,210
  500 to 599             8,731         38        3,539        1,332      2,067    1,755
  600 to 749             8,405         37        3,370        1,272      1,990    1,736
  750 or more            9,281         31        3,777        1,371      1,978    2,124

Secondary/combined
  Less than 300          6,718        296        2,594        1,449      1,439      940
  300 to 499             3,622         52        1,620          780        753      417
  500 to 999             5,615         43        2,910        1,118      1,070      474
  1,000 to 1,499         2,842         23        1,687          510        467      155
  1,500 or more          3,275         54        1,998          581        490      152

TOTAL†                  85,719      1,018       34,494       15,344     19,803   15,060

* For sampling purposes, schools with a low grade of 6 or less and a high grade of 8 or less are considered to be "elementary" schools. All other schools are considered to be "secondary/combined" schools.

† Counts in this table are given for illustration. The more up-to-date 2006-07 CCD file will be used for sampling if it is available.


Sample Design


The sample design for the school survey on educational technology will be a stratified sample with primary strata defined by level, enrollment size class, and percent of students eligible for free/reduced price lunch (see Table 1). Stratification by size class and the five categories of percent eligible for free/reduced price lunch will ensure that schools of all sizes and all poverty levels are appropriately represented in the sample. It should be noted that the free/reduced price lunch information required for stratification is missing for about 1 percent of the schools in the CCD frame. Although it will not be possible to assign these schools to the appropriate stratum for sampling purposes, all such schools will be given a chance of selection for the survey.


A total of 2,000 schools will be selected for the survey, including approximately 1,000 elementary schools and 1,000 secondary/combined schools. Initially, the 1,000 elementary schools and 1,000 secondary/combined schools will be allocated to the primary strata in rough proportion to the aggregate measure of size of the schools in the stratum, where the measure of size is defined to be the square root of the number of teachers (FTE) in the school. Such an allocation is expected to be reasonably efficient for jointly estimating school-level characteristics and quantitative measures correlated with the number of teachers and/or school enrollment. Within the primary strata defined above, schools in the sampling frame will be sorted by type of locale (city, urban fringe, town, rural) and Office of Education (OE) region. When used in conjunction with systematic sampling, the sorting will induce additional implicit substratification within the primary strata. Within each stratum, the specified sample of schools will then be selected systematically with probabilities proportionate to the measure of size. Although the use of the measure of size to select the schools will increase unequal weighting design effects for school-level estimates, it will also help control the variation in teacher sample sizes across schools for the subsequent teacher survey. The expected numbers of schools to be selected under the proposed design by level and enrollment size class are summarized in the last column of Table 2.
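

To make the within-stratum selection step concrete, the sketch below implements systematic sampling with probability proportionate to size, with the frame sorted by locale and region to induce the implicit substratification described above. It is a minimal illustration under stated assumptions, not Westat's production procedure; the field names and the handling of very large schools are illustrative.

    import math
    import random

    def systematic_pps(stratum_frame, n_sample):
        """Select n_sample schools systematically with probability
        proportionate to size, where size = sqrt(FTE teachers)."""
        # Sorting by locale and OE region induces implicit substrata
        # when combined with systematic selection.
        frame = sorted(stratum_frame, key=lambda s: (s["locale"], s["region"]))
        sizes = [math.sqrt(s["fte_teachers"]) for s in frame]
        interval = sum(sizes) / n_sample      # sampling interval
        hit = random.uniform(0, interval)     # random start
        selected, cum = [], 0.0
        for school, size in zip(frame, sizes):
            cum += size
            # A school whose size exceeds the interval can be hit more
            # than once; in practice such schools are taken with certainty.
            while hit < cum and len(selected) < n_sample:
                selected.append(school)
                hit += interval
        return selected

    # Illustrative stratum of three schools; select two.
    stratum = [
        {"locale": "city",  "region": "NE", "fte_teachers": 40},
        {"locale": "rural", "region": "W",  "fte_teachers": 12},
        {"locale": "town",  "region": "SE", "fte_teachers": 25},
    ]
    print(systematic_pps(stratum, 2))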


Table 2. Proposed allocation of the public school sample for survey on educational technology by level and size class


Instructional level       Enrollment size class    Number of schools to be sampled

1. Elementary             Less than 300                          173
                          300 to 499                             314
                          500 to 599                             150
                          600 to 749                             158
                          750 or more                            205

2. Secondary/combined     Less than 300                          165
                          300 to 499                             135
                          500 to 999                             272
                          1,000 to 1,499                         175
                          1,500 or more                          253

TOTAL                                                          2,000


Expected Levels of Precision


Table 3 summarizes the approximate sample sizes and standard errors to be expected under the proposed design for selected domains. Note that the standard errors in Table 3 include approximate design effects ranging from 1.05 to 1.40 to reflect the increase in variance due to the use of variable sampling fractions. Under the proposed stratified sample design, (1) large schools will be sampled at relatively higher rates (i.e., have smaller sampling weights) than small schools, and (2) secondary/combined schools will be sampled at relatively higher rates than elementary schools to improve subgroup comparisons within the major instructional levels. Since the sample sizes in Table 3 are based on preliminary tabulations of the CCD file, the actual sample sizes may differ from those shown. Also, note that the sample sizes represent the expected numbers of schools returning completed questionnaires, and not the initial numbers of schools to be sampled. The standard errors in Table 3 can be converted to 95 percent confidence bounds by multiplying the entries by 2. For example, an estimated proportion on the order of 20 percent (P = 0.20) for the total sample would be subject to a margin of error of ±2.2 percent at the 95 percent confidence level. Similarly, an estimated proportion on the order of 50 percent (P = 0.50) for elementary schools would be subject to a margin of error of ±3.4 percent at the 95 percent confidence level.
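

As a rough check on these figures, the entries in Table 3 are consistent with the usual approximation SE = sqrt(deff × P(1 − P)/n). The sketch below reproduces the two examples above; the per-domain design effects used here are assumptions chosen within the stated 1.05 to 1.40 range, not values taken from the design documents.

    import math

    def approx_se(p, n, deff=1.0):
        """Approximate standard error of an estimated proportion p
        based on n responding schools, inflated by a design effect."""
        return math.sqrt(deff * p * (1 - p) / n)

    # Total sample: P = 0.20, n = 1,800, assuming deff near 1.4
    se_total = approx_se(0.20, 1800, deff=1.4)    # ~0.011
    margin_total = 2 * se_total                   # ~0.022, i.e., +/-2.2 percent

    # Elementary schools: P = 0.50, n = 900, assuming deff near 1.05
    se_elem = approx_se(0.50, 900, deff=1.05)     # ~0.017, i.e., +/-3.4 percent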


Table 3. Expected sample sizes (number of responding schools) and corresponding standard errors for estimates of proportions for selected analytic domains


                                                  Standard error† of an estimated
                                                     proportion equal to ...
Analytic domain                   Sample size*    P = 0.20   P = 0.33   P = 0.50

Total sample                         1,800          0.011      0.013      0.014

Instructional level
  Elementary                           900          0.014      0.016      0.017
  Secondary/combined                   900          0.014      0.017      0.018

Type of locale
  City                                 472          0.020      0.024      0.025
  Urban fringe                         641          0.017      0.020      0.021
  Town                                 170          0.033      0.039      0.042
  Rural                                518          0.019      0.022      0.024

Percent eligible for free/reduced price lunch
  Less than 35 percent                 817          0.015      0.018      0.019
  35 to 49 percent                     330          0.024      0.028      0.030
  50 to 75 percent                     387          0.022      0.026      0.028
  75 percent or more                   266          0.027      0.031      0.033

Region
  Northeast                            362          0.023      0.027      0.029
  Southeast                            415          0.021      0.025      0.027
  Central                              457          0.020      0.024      0.025
  West                                 566          0.018      0.021      0.023

Level by enrollment size class
  Elementary
    Less than 300                      156          0.033      0.039      0.041
    300 to 499                         283          0.024      0.029      0.030
    500 to 749                         277          0.025      0.029      0.031
    750 or more                        185          0.030      0.035      0.038
  Secondary/combined
    Less than 500                      149          0.034      0.040      0.043
    500 to 999                         366          0.022      0.026      0.027
    1,000 or more                      385          0.021      0.025      0.027

* Expected number of responding schools, assuming a 90 percent survey response rate.

† Assumes design effects ranging from 1.05 to 1.33 to reflect the increase in variance due to disproportionate allocation to instructional levels and size classes.


Estimation and Calculation of Sampling Errors


For estimation purposes, sampling weights reflecting the overall probabilities of selection and adjustments for nonresponse will be attached to each data record. To properly reflect the complex features of the sample design, standard errors of the survey-based estimates will be calculated using jackknife replication. Under the jackknife replication approach, 50 subsamples or "replicates" will be formed in a way that preserves the basic features of the full sample design. A set of estimation weights (referred to as "replicate weights") will then be constructed for each jackknife replicate. Using the full sample weights and the replicate weights, estimates of any survey statistic can be calculated for the full sample and each of the 50 jackknife replicates. The variability of the replicate estimates is used to obtain a measure of the variance (standard error) of the survey statistic. Previous surveys, using similar sample designs, have yielded relative standard errors (i.e., coefficients of variation) in the range of 2 to 10 percent for most national estimates. Similar results are expected for this survey.
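

A minimal sketch of this computation is shown below, assuming the 50 sets of replicate weights have already been constructed and attached to the data records. The (R − 1)/R scaling is a common JK1-style convention and is an assumption here; the exact replicate formation and scaling depend on the design and are not specified in this statement.

    def weighted_proportion(values, weights):
        """Weighted estimate of a proportion (values coded 0/1)."""
        return sum(v * w for v, w in zip(values, weights)) / sum(weights)

    def jackknife_se(full_estimate, replicate_estimates):
        """Standard error from the variability of the replicate estimates
        around the full-sample estimate (JK1-style scaling assumed)."""
        r = len(replicate_estimates)          # 50 replicates here
        variance = (r - 1) / r * sum(
            (t - full_estimate) ** 2 for t in replicate_estimates)
        return variance ** 0.5

    # theta_hat is computed with the full-sample weights; each theta_r is
    # the same statistic computed with one of the 50 sets of replicate
    # weights.
    # se = jackknife_se(theta_hat, [theta_1, ..., theta_50])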



1 Separate OMB clearance packages are being submitted for the district and teacher surveys.

