Study of Digital Learning Resources for Instructing English Learner Students

OMB: 1875-0279

Task Order 27
Study of Digital Learning Resources for
Instructing English Learner Students
Task 4.4 (Part 2) Revised Draft OMB Part B

July 11, 2016

Submitted to:
U.S. Department of Education
Office of Planning, Evaluation and Policy Development

Submitted by:
Westat
1600 Research Boulevard
Rockville, Maryland 20850-3129
(301) 251-1500

Study of Digital Learning Resources (DLRs) for Instructing English Learner Students

Supporting Statement for Paperwork Reduction Act Submission
PART B: Collection of Information Employing Statistical Methods

Table of Contents

B. Collection of Information Employing Statistical Methods
   Introduction
   Overview of the Study
   B.1. Respondent Universe and Sampling Methods
   B.2. Statistical Methods for Sample Selection and Degree of Accuracy Needed
   B.3. Methods to Maximize Response Rates and Deal With Nonresponse
   B.4. Test of Procedures and Methods to be Undertaken
   B.5. Individuals Consulted on Statistical Aspects of the Design
References

Task 4.4 Revised Draft OMB. Part B: Collection of Information Employing Statistical Methods


Exhibits

B.1 Definition of English learner (EL) representation categories
B.2 English learner (EL) distribution across cells defining EL student representation in the district
B.3 Expected EL-teacher sample distribution after nonresponse: Sample distribution by teacher type (mainstream and specialist teachers of ELs) and grade level taught (elementary and secondary)

Part B. Collection of Information Employing Statistical Methods
Introduction
The Study of Digital Learning Resources for Instructing English Learner Students will address a critical gap in research and in supports for practice in the instruction of students identified as English learners (ELs). The limited focus in the field on the use of digital learning resources (DLRs) in instruction of EL students is a particularly critical issue given recent demographic changes. Research on DLRs lags in addressing the reality of K–12 classrooms today, which increasingly present a "new mainstream" (Enright, 2011). Growing immigrant and EL populations are challenging many districts and schools to meet the needs of EL students (Capps et al., 2005; NCELA, 2014). Teachers report using DLRs to meet EL students' needs; however, many do so without the guidance they need to best support their EL students (Warschauer et al., 2004; Zehler et al., 2008).
The goal of this research effort is to provide an understanding of the current use of DLRs for instructing EL students in order to inform further research and policy development efforts. To achieve its goal the study will survey school districts and teachers, conduct interviews and observations through site visits to districts and schools, collect information through DLR demonstrations and document reviews, and conduct market research on existing DLRs for K–12 instruction.

Overview of the Study
The goal of this research effort is to provide an understanding of the current use of DLRs for instructing
EL students in order to inform further research and policy development efforts. The study addresses the
following main research questions:
1. How do districts and teachers identify and select DLRs in general? How do districts and teachers identify and select DLRs specifically to support EL students?

2. What types of DLRs do districts report using to support English learners? What types of DLRs do teachers report using in instructing and structuring learning activities for their EL students?

3. How do teachers of EL students use DLRs in the instruction of EL students?

4. To what extent do teachers receive professional development (PD) in effective use of DLRs for instruction? Which professional development approaches do teachers report to be most helpful in supporting their use of DLRs in instruction?

5. What are barriers to and supports for (1) the use of DLRs in instruction of EL students and (2) the use of DLRs by students at home? How can districts, schools, and DLR developers address these?

6. How do districts and teachers define and measure the success of their use of technology to support EL students?

7. How could developers and practitioners improve the usefulness of DLRs for instructing EL students?

The project consists of five components:


1) Market research on existing DLRs for K–12 instruction to support students' second language acquisition and learning of academic English, content, and skills in core academic content areas;

2) Survey of school district administrators who are responsible for making instructional and technology decisions for schools;

3) Survey of teachers of EL students;

4) Case studies including site visits to districts and schools to conduct interviews, observations, DLR demonstrations, and document review; and

5) Expert guidance from a Technical Working Group (TWG) and an Expert Panel including technology developers, practitioners, and education researchers.

The study will culminate in a final report, a guide for educators on DLR use, and a toolkit for technology developers.

B.1.

Respondent Universe and Sampling Methods

Key subgroups are defined for districts and teachers. The study will include: (1) a stratified sample of
districts that ensures representation of districts that serve significant populations of EL students; and
(2) a sample of teachers of EL students that includes both specialist teachers of ELs and mainstream
teachers of ELs.

B.1.1. District Sample
Respondent Universe
The respondent universe for the district survey is all public school districts that serve at least one EL
student according to the most recent NCES Common Core of Data (CCD) Local Education Agency
Universe File (2013–14). Districts will be selected using a stratified random sample design to ensure adequate representation of districts with significant, moderate, and low-incidence levels of EL student representation. A total of 999 districts will be selected. This sample size will enable the study team to address the key research questions with a reasonable level of power to support key comparisons by the district EL-representation categories.

Sampling Methods
District Stratification: Definition of EL Student Representation in Districts
Our definition of EL representation in a district considers both the number of EL students enrolled and
the proportion of total K–12 enrollment that EL students represent. Our experience and knowledge of
the available research (e.g., Zehler et al., 2008, 2011) lead us to believe that there may be important
differences in the use of DLRs for instruction of EL students depending on the district context. In our
design, we propose to define this context in terms of both the number of ELs and the level of
representation of ELs in the district. Neither alone is an adequate basis for definition. For example,

number of ELs alone is not sufficient since 300 students could represent a very small proportion of a
district population or a majority of students in a district. In the latter case, district administrators will be
more likely to take EL students’ needs into account in technology decisions, and there may be different
choices made in technology purchases by the district.
However, the proportion of EL students alone is similarly not sufficient since the number of ELs in a
district can in some cases be a critical factor in determining decision-making related to use of DLRs. For
example, large numbers of students are more likely to trigger thresholds that provide access to federal
and state funding to support instruction of EL students, including resources such as DLRs. They are also
more likely to be identified as a subgroup for planning purposes in the district.
The plan for defining the district sample utilizes criteria both in terms of number of EL students and
percentage of EL students in the district to define three strata for EL representation: districts with
significant EL representation, moderate EL representation, and a low-incidence stratum representing all
districts with small numbers of EL students enrolled. This stratification plan responds to the
Department’s requirement that the study address districts with significant populations of EL students. In
the plan, districts with a Significant representation of EL students are those that enroll more than 1,000 EL students and in which EL students are 10 percent or more of total enrollment.1 The plan defines Moderate districts as those with more than 1,000 ELs where ELs are less than 10 percent of total student enrollment, as well as those with between 101 and 1,000 ELs enrolled.1
Low-incidence districts comprise the third stratum. We define a low-incidence category of districts on
the basis of the argument that it is equally important to examine DLR use in districts that include small
numbers of EL students as it is to examine DLR use in districts with significant EL representation. The
importance of including low-incidence districts as a separate stratification category is underlined by the
fact that approximately 70 percent2 of districts with at least one EL student fall within this low-incidence
category. The sampling plan defines low-incidence districts as all districts enrolling small numbers of EL
students, specifically 100 or fewer ELs.
EL students in low-incidence districts are more likely to be dispersed across schools and grades with only
one or a few EL students at any one grade level. Often low-incidence districts are rural districts that are
likely to encounter additional challenges due to geographically dispersed schools and students and due
to the district’s location in areas remote from sources of EL-related expertise. Given small numbers of
ELs in a district, (1) it is also less likely that there would be a focus on EL students as a key district subgroup and (2) the numbers of EL students are less likely to trigger additional state or federal funding or
other resources for ELs, including DLRs. In these districts, the contexts for EL student instruction, the
student needs to be addressed, and the resources available to serve ELs are likely to be very different
from those found for districts with large numbers of ELs. Correspondingly, the use of DLRs may be very
different in goals and in instructional practices. Of further note, many low-incidence districts may be districts with emerging EL communities, that is, districts where ELs are a recent or newly enrolling population. In such districts administrators and teachers may be just gaining awareness of EL students as a population requiring services, and beginning to build capacity to serve EL students. Their DLR use may also reflect these different needs.
Thus for a number of reasons, the goals for DLR use; the processes for identification, selection, and access to DLRs; and the use of DLRs in low-incidence districts may differ in important ways from those of districts with significant or moderate EL student representation. Our sampling plan includes low-incidence districts on the assumption that these differences will be important to understand. The findings from the low-incidence districts may contribute key additional policy-relevant information to the Department and further inform the field. Specific to the goals of this study, the low-incidence district findings are anticipated to be important in further guiding the development of the educator toolkit. The criteria for determining significant, moderate, and low-incidence strata are discussed further in the sampling section below.

1 We selected 1,000 as representing a population that we judged would be considered large, supported by the fact that the 2003 nationally representative Descriptive Study of Services for Limited English Proficient (LEP) Students and LEP Students with Disabilities (Zehler et al., 2003) included 1,000 as an analytic break point in examining variables by size of the EL student population in the district. (The full set of analytic cut-points for district size was: 1–24; 25–99; 100–999; 1,000–9,999; and 10,000 or more.) Similarly, we selected 100 as the cut-off for numbers of ELs that would be considered small and low-incidence, based on our judgment and supported by the cut-offs in the 2003 study.

2 This statistic is based on analysis of the National Center for Education Statistics, Common Core of Data, 2013–14 district data.

Expected Response Rates
We are seeking a minimum response rate of 85 percent. Thus the sample of 999 districts will result in at
least 850 completed district surveys.

B.1.2. Teacher Sample
Respondent Universe
The teacher survey addresses practices in the use of DLRs by teachers of one or more English learner students in grades K–12 in public schools in the United States. The survey addresses two types of teachers: mainstream or general education teachers who instruct EL students among other students in their classes, and EL-specialist teachers who provide specialized programs of instruction and services to EL students specifically directed toward meeting their needs as English learners. Teachers will provide background information in the survey so that it will be possible to examine teachers by key characteristics, such as level of experience teaching grades K–12, level of education, and certification.
The identification of the teacher survey universe is a two-step process. The first step is to randomly
select a subsample of 600 districts from the 999 sampled for the district survey. The second step is to
randomly select one school within each of the 600 districts. The universe of teachers is the teachers
within the sampled schools who teach EL students either as specialists or as mainstream teachers. From
among this group one EL specialist teacher and one mainstream teacher will be randomly selected from
each school for a total sample of 1,200 teachers.


Sampling Methods
Definition of Teachers of EL Students
The teacher survey will include as respondents two categories of teachers of EL students: (1) English as a
second language or bilingual teachers and (2) other teachers of ELs. The Department is interested in DLR
use by teachers who are specialists in instructing EL students as compared to mainstream teachers of EL
students.
The term "teachers of EL students" refers to all teachers who instruct one or more EL students. Each teacher will be identified as either an EL-specialist or a mainstream teacher of ELs, as follows:

(1) EL-Specialist teachers of EL students will include:

• English as a Second Language (ESL) teachers;

• Teachers of bilingual program self-contained classrooms or, in departmentalized programs, teachers of bilingual content area classes;

• Teachers of classrooms providing all-English programs specifically designed for instruction of EL students (such as structured immersion or SDAIE programs); and

• Teachers of dual-language programs for EL students and English-proficient students in which the goal is full proficiency in both English and the EL students' native language for all students in the program.

One challenge in defining EL-Specialist teachers may occur in districts with small numbers of EL students. In these districts there may not be an EL-Specialist teacher on staff at the school. Instead, the school may be served by an itinerant specialist. In such cases, we will identify the itinerant specialist assigned to the selected school.

(2) Mainstream teachers of EL students will include general education teachers whose primary responsibility is instruction of students in a school's main grade-level classroom or content area (and whose instruction is not focused on specialized instruction for EL students), and who instruct one or more EL students. Mainstream teachers of EL students may include self-contained classroom teachers or, in a departmentalized program, teachers of academic content classes designed for the general student population. Resource teachers (such as Title I or reading specialist teachers) will not be included, in order to more consistently represent in the sample general education teachers of ELs who provide core instruction in academic content areas.

Grade Level of Schools
The sample of schools from which teachers are drawn will include both elementary and secondary
schools (drawing a systematic sample using grade level as a sort variable, see section B.2.1 below).
Grade level is defined as follows:


• Elementary schools will be defined as schools whose lowest grade falls within grades PK–3 and whose highest grade falls within grades PK–8.

• Secondary schools include middle and high schools. Middle schools have a lowest grade within grades 4–7 and a highest grade within grades 4–9; high schools have a lowest grade within grades 7–12 and grade 12 as the highest grade. Other categories of schools will be considered "combined" or "other."

For the purposes of this study, where a K–12 school is the only school present, such as may be the case in a rural area, the selection of teachers will be based on either elementary or secondary grade levels only from that school.

Expected Response Rates
The teacher sample will include 600 mainstream teachers of ELs and 600 EL-specialist teachers, for a
total sample of 1,200 teachers. We anticipate a minimum response rate of 85 percent resulting in at
least 1,020 completed teacher surveys.

B.1.3. Case Study Sample
Universe
The universe for case study site selection will be the stratified list of districts that complete the district survey.

Site Selection
The case study sample will be composed of six school districts and two schools within each district (12
schools across the six sites). Given the exploratory nature of the research, a multiple, comparative case
study design is appropriate as it enables the research team to address the overarching questions about
decision-making, support, use, and evaluation of DLRs with EL students (Yin, 2014). Moreover, a multiple
case design facilitates the analysis of similarities and differences across districts with respect to DLR
decision-making and use and identification of key areas in which administrators and teachers require
support with respect to DLR use with EL students. These findings can thereby inform the design of the
educator guide and developer toolkit.
District case study sites. To identify the six school districts for the case studies, the sampling approach
will include two strategies. First, to address the goal of illustrating DLR decision-making and use in
different types of EL district contexts, the case study sample selection will reflect the EL-representation
stratification categories used in the survey sample. The research team will initially identify a random
sample of six candidate school districts drawn from within each of the three stratification categories
(i.e., a total of an initial 18 candidate school districts).
Next, we will conduct a purposive sampling process to narrow the 18 candidate districts down to a
sample of six districts, including two districts from each EL-representation stratification category
(significant, moderate, and low-incidence). The goal will be to select a final set of six case study districts
that allows for maximum variation, identifying districts that vary from one another with respect to
selected key characteristics, including:


(a) First languages of the EL students, to include EL districts with a Spanish-speaking majority as
well as districts with a non-Spanish-speaking EL majority and/or an EL population comprising
students from multiple language groups;
(b) District experience with ELs, to include districts with a long history of high EL concentration
(and, consequently, several years of experience working with EL students) as well as districts
experiencing a more recent influx of EL students (and, therefore, with less experience of
working with EL students); and
(c) Variation in the types of EL instructional services, to include programs that use EL students’
languages in instruction and those that do not.
While a set of six school districts cannot exhaustively sample for geographical variation, we anticipate
that our case study sample will be distributed across the country and also will include at least one to two
rural districts.
To obtain information about the key variables described above and make our case study sample
selection, we will consult publicly available sources of information about the 18 candidate districts,
including the Common Core of Data and information available on district websites. Additionally, we will
obtain information through contacts with the district personnel to inform final selection decisions.
School case study sites. In each district, we will select two schools, one elementary school and one
secondary school. First, we will select the school sampled for the teacher survey. This will be either an
elementary school with ELs or a secondary school with ELs. We will select the second school such that
there is one elementary and one secondary school with ELs included in the case study visit.

Selection of Respondents
District-level respondents. Study participants at the district level will be district administrators,
including individuals responsible for EL student services, curriculum and professional development, and
technology. At the district offices we will conduct interviews with three to six individuals including
administrators involved in identifying and selecting DLRs and providing PD or coaching to teachers
around the use of DLRs, particularly with EL students. While the respondents will be identified and
tailored to each site, they are likely to include technology coordinators, EL coordinators, and curriculum
and professional development specialists.
School-level respondents. In each case study school we will conduct interviews with the principal,
instructional and/or technology coaches, and EL services coordinators (where those positions exist), and
three to five teachers of EL students. The sample of teachers within each school will be identified based
on the following: random selection of one to two EL-specialist teachers (where there is more than one),
and identification via the recommendation of the principal or school EL coordinator of one mainstream
teacher of ELs who is known in the school for use of DLRs. The remaining teachers to be interviewed will
be randomly selected from the list of mainstream teachers who teach EL students. Interviews will be
conducted with one to two EL specialists and two to three mainstream teachers of ELs. Among
mainstream teachers of ELs, we will look for diversity in terms of subject matter focus across the case
studies. Of the teachers interviewed at each school, we will conduct an observation in the classroom of
one teacher who is known to use DLRs for instruction.


Expected Response Rate
Since we will be visiting case study districts and schools for the purposes of data collection, we expect a high response rate, above the 85 percent minimum. Although unlikely, nonresponse may occur if a district or school administrator or teacher is unavailable at the time the case study visit is scheduled.

B.2. Statistical Methods for Sample Selection and Degree of Accuracy Needed

B.2.1. Statistical Methodology for Stratification and Sample Selection
The proposed sample design is a stratified sample of 999 school districts that serve at least one EL
student according to the most recent NCES Common Core of Data (CCD) Local Education Agency
Universe File (2013–14). We expect that the final sample of 999 districts will result in approximately 850
districts cooperating, assuming an 85 percent cooperation rate (methods for ensuring this response rate
are discussed in the overview of the study implementation).

School District Sampling
Three district-level strata representing level of EL student representation in the district will be defined
by cross-classifying two variables: (a) the percent EL students ([1] 25%–100%, [2] 10%–<25%,
[3] 5%–<10%, [4] 2%–<5%, [5] 1%–<2%, and [6] > 0%–<1%) and (b) the number of EL students ([1] more
than 1,000, [2] 101–1,000, and [3] 1–100). Additional variables, such as region and locale, will be used to
sort the districts within the sampling strata prior to a systematic sample selection. Sorting is a form of
implicit stratification that helps ensure that districts with these characteristics are adequately
represented in the sample. A sample of 333 districts will be selected with equal probability from each of
the sampling strata. An equal sample size per stratum is the recommended approach for computing estimates for these groups. However, this will increase the design effect for estimates for the whole population due to the differential sampling rates by stratum. Exhibits B.1 and B.2 summarize the proposed stratification of districts.
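The selection just described (equal probability within stratum, equal allocation of 333 districts per stratum, and implicit stratification by sorting on region and locale before a systematic draw) can be sketched as follows. This is a minimal illustration on synthetic records, not the study's sampling program: the real frame is the CCD Local Education Agency Universe File, and all field names and counts here are hypothetical.

```python
import random

random.seed(1)  # reproducible illustration

def systematic_sample(frame, n):
    """Systematic sampling from an ordered frame: a random start within the
    first interval, then a fixed skip. The frame's sort order (region, locale)
    is thereby spread evenly across the selections."""
    interval = len(frame) / n
    start = random.uniform(0, interval)
    return [frame[int(start + j * interval)] for j in range(n)]

# Synthetic district records standing in for the CCD frame (hypothetical fields).
districts = [{"id": i,
              "stratum": random.choice(["significant", "moderate", "low-incidence"]),
              "region": random.choice(["Northeast", "Midwest", "South", "West"]),
              "locale": random.choice(["city", "suburb", "town", "rural"])}
             for i in range(5000)]

sample = []
for stratum in ["significant", "moderate", "low-incidence"]:
    frame = [d for d in districts if d["stratum"] == stratum]
    # Implicit stratification: sort within the stratum before the systematic draw.
    frame.sort(key=lambda d: (d["region"], d["locale"], d["id"]))
    sample.extend(systematic_sample(frame, 333))  # equal allocation: 333 per stratum

print(len(sample))  # 999
```

Because the skip interval exceeds one record whenever the stratum is larger than its allocation, the 333 draws per stratum are distinct, and sorting before the draw spreads the sample across regions and locales without defining explicit substrata.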
Since the goal of the study is to provide estimates for broadly defined subgroups of districts as well as
overall national estimates, a stratified sample design is believed to be an effective way to meet these
objectives. Specification of explicit strata for sampling purposes will allow for the selection of districts at
varying rates to (a) ensure that key subgroups are adequately represented in the sample, and
(b) improve sampling precision for selected subgroup estimates.
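The trade-off noted above — equal allocation sharpens subgroup comparisons but inflates whole-population variance — can be quantified with Kish's approximate design effect from unequal weighting, deff ≈ 1 + cv² of the weights. The sketch below uses assumed stratum counts (hypothetical, not the study's actual frame) to show the penalty from equal allocation across strata of unequal size:

```python
def kish_design_effect(weights):
    """Kish's approximation: deff = n * sum(w^2) / (sum(w))^2, i.e. 1 + cv^2
    of the analysis weights. Values above 1 mean the effective sample size is
    reduced relative to an equal-probability design."""
    n = len(weights)
    return n * sum(w * w for w in weights) / sum(weights) ** 2

# Hypothetical stratum counts; equal allocation of 333 per stratum gives each
# district a base weight of (stratum size / 333).
stratum_sizes = [1200, 3000, 7000]
weights = []
for size in stratum_sizes:
    weights += [size / 333] * 333

deff = kish_design_effect(weights)
print(round(deff, 3))  # 1.422 — effective n is roughly 999 / 1.422
```

Under these assumed counts, overall estimates behave as if the sample were about 700 equal-probability districts rather than 999, while each stratum still gets its full 333 for subgroup comparisons.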


Exhibit B.1. Definition of English learner (EL) representation categories

EL student representation in district    Definition
1. Significant*      Districts with large numbers of ELs (more than 1,000) and a high percentage of ELs (10 percent or more).
2. Moderate          Districts with either: (a) large numbers of ELs (more than 1,000) but a low percentage of ELs (<10 percent); or (b) medium numbers of ELs (101–1,000).
3. Low-incidence     Districts with a low number of ELs (1–100), regardless of the percentage of ELs.

*The 10 largest districts (based on number of ELs) will be selected with certainty.
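The cross-classification in Exhibit B.1 reduces to two rules on the EL count and percentage. As an illustrative sketch (the function and its name are ours, not part of the study instruments, and the certainty selection of the 10 largest districts is a separate sampling step):

```python
def el_representation(n_el: int, pct_el: float) -> str:
    """Classify a district into the Exhibit B.1 categories (illustrative helper).

    n_el   : number of EL students enrolled (the frame includes only districts with >= 1)
    pct_el : EL students as a percentage of total district enrollment
    """
    if n_el < 1:
        raise ValueError("districts with no EL students are out of scope")
    if n_el > 1000 and pct_el >= 10:
        return "significant"    # large number AND high percentage
    if n_el >= 101:
        return "moderate"       # >1,000 ELs at a low percentage, or 101-1,000 ELs
    return "low-incidence"      # 1-100 ELs, regardless of percentage

print(el_representation(5000, 22.0))  # significant
print(el_representation(1500, 4.0))   # moderate
print(el_representation(80, 35.0))    # low-incidence
```

Note that a district of 1,000 students with 500 ELs is Moderate despite its 50 percent EL share, because its count falls in the 101–1,000 band; the percentage threshold matters only above 1,000 ELs.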

Exhibit B.2. English learner (EL) distribution across cells defining EL student representation in the district

EL students as percentage          Number of EL students in the district
of total district enrollment   More than 1,000   101–1,000      1–100        Total
1. 25%–100%                      1,497,151        130,384       7,275    1,634,810
                                   (33.5%)         (2.9%)      (0.2%)      (36.5%)
2. 10%–<25%                      1,548,532        249,098      21,240    1,818,870
                                   (35.0%)         (6.0%)      (0.0%)      (41.0%)
3. 5%–<10%                         355,144        219,788      27,378      602,310
                                    (7.9%)         (4.9%)      (0.6%)      (13.5%)
4. 2%–<5%                           75,423        182,068      53,012      310,503
                                    (1.7%)         (4.1%)      (1.2%)       (6.9%)
5. 1%–<2%                                0         33,661      39,456       73,117
                                    (0.0%)         (0.8%)      (0.9%)       (1.6%)
6. >0%–<1%                           2,015          3,693      30,568       36,276
                                    (0.1%)         (0.1%)      (0.7%)       (0.8%)
Total                            3,478,265        818,692     178,929    4,475,886
                                   (77.7%)        (18.3%)      (4.0%)     (100.0%)

Note: EL student representation in the district is defined by considering both the number of EL students enrolled and ELs as a proportion of total enrollment in the district.
Key: In the original exhibit, light grey cells indicate significant representation; medium grey cells, moderate representation; and dark grey cells, low-incidence of EL students. The percentages in parentheses on the second line in each cell indicate the percentage of total EL students for all districts with at least one EL student enrolled.
Exhibit reads: There are 1,497,151 EL students in districts with more than 1,000 EL students enrolled and where EL students represent 25 percent or more of total district enrollment. The EL students in this cell are 33.5 percent of all EL students enrolled in districts with ELs.
Source: U.S. Department of Education, Common Core of Data, 2013–14 school year.


Teacher Sampling
The selection of the sample for the teacher survey will occur in three steps. First, from the sample of 999
districts we will select a subsample of 600 districts. Second, since information about which teachers
instruct EL students in most cases will not be available at the district level, in each of the 600
subsampled districts we will select one school that serves EL students and obtain a roster of teachers. In
the third step, in each sampled school, we will select two teachers of EL students: one mainstream
teacher of at least one EL student, and one EL-specialist teacher of at least one EL student, for a total of
1,200 selected teachers. Below, we describe each of these steps in more detail.

Step One: Select a Subsample of 600 Districts
As a first step, we will randomly select 600 districts from the 999 districts in the district survey sample by
subsampling 200 districts from each stratum defined by EL student representation in district. By
subsampling the districts, we reduce the number of districts that need to be contacted in order to
obtain information for the school sample.

Step Two: Select 600 Schools
We will select a sample of 600 schools (one school per district) before selecting the sample of teachers
of EL students, since information about teachers of EL students will not be directly available from the
district.
Only those schools that serve EL students will be eligible for the selection into the school sample. To
minimize the impact of differential weighting, which leads to an increased design effect, schools will be
sampled PPS (probability proportional to size sampling) within each sampling stratum. The ideal
measure of size (MOS) for sampling schools is the number of teachers of EL students at each school.
However, since this information is not available from districts, we propose to use total number of
teachers in the schools as a proxy for the MOS of the school for selecting the schools within the
subsample of districts.
Each of the 600 districts in the subsample will be asked to provide the following information:

• Number and list of schools in the district;

• Number of EL students in each school within the district (so that schools without ELs can be excluded from sampling);

• Number of teachers serving each school, including any itinerant ESL or EL-specialist teacher assigned to the school, for all schools in the district; and

• Grade levels served by each school in the district.

Using this information, a sample of 600 schools serving EL students will be selected from the districts.
Schools with no EL students will be excluded from sampling. Schools with one or more EL students will
be identified as elementary or secondary (middle or high school) and will be sampled using PPS with the
number of teachers as the measure of size. Schools will be selected in a manner so that no more than
one school is selected per district. To ensure an adequate representation of grade spans within
elementary and secondary groups we will use the school grade span as a sort variable and take a
systematic sample. This will ensure representation of lower and upper elementary schools in the
elementary sample, and of middle and high schools in the secondary sample. A sample of 300
elementary schools and 300 secondary schools will be selected. An equal sample size for these groups is
the best approach for comparisons between the groups; however, it increases the design effect for the
whole population due to differential sampling.
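The systematic PPS selection described in Step Two can be sketched as follows. Field names are illustrative, and for simplicity this sketch omits the certainty handling that a production system would apply to any school whose MOS exceeds the sampling interval.

```python
import random

def pps_systematic(schools, n, mos_key="n_teachers", sort_key="grade_span", seed=7):
    """Systematic PPS selection within a stratum: sort by grade span
    (implicit stratification), lay the cumulative measure of size on a
    line, and take hits at a fixed interval from a random start."""
    random.seed(seed)
    ordered = sorted(schools, key=lambda s: s[sort_key])
    total = sum(s[mos_key] for s in ordered)
    interval = total / n
    start = random.random() * interval          # random start in [0, interval)
    hits, cum, i = [], 0.0, 0
    for k in range(n):
        target = start + k * interval
        # walk forward to the unit whose cumulative MOS span covers the target
        while cum + ordered[i][mos_key] <= target:
            cum += ordered[i][mos_key]
            i += 1
        hits.append(ordered[i])
    return hits
```

Sorting by grade span before taking the systematic sample is what spreads the selections across lower/upper elementary (or middle/high) schools, as described above.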

Step Three: Select 1,200 Teachers
After selecting the sample of 600 schools, we will contact the schools to obtain a complete list of the
teachers serving these schools and information on:

•  the teacher’s role (EL-specialist or mainstream teacher), and
•  identification of teachers that teach one or more EL students.

The information in these lists will include contact information of all EL-specialist (including itinerant
teachers) and all mainstream teachers of EL students in the school. The list of eligible teachers in the
sampled schools will then be stratified into two groups: (1) EL-specialist teachers (including itinerant
teachers) and (2) mainstream teachers of EL students. One teacher will be randomly sampled from each
group within the school. If no EL specialists serve the school, then two mainstream teachers of EL
students will be selected. A total of 1,200 teachers will be sampled, and we expect to receive 1,020
completed surveys (510 from each teacher type and 510 from each school type), assuming a response
rate of 85 percent. An equal sample size of teachers by these groups is the best approach for
comparisons by type of teachers; however, it increases the design effect for the whole population due
to differential sampling. Exhibit B.3 summarizes the proposed stratification of teachers.
Exhibit B.3. Expected EL-teacher sample distribution after nonresponse: Sample distribution by
teacher type (mainstream and specialist teachers of ELs) and grade level taught
(elementary and secondary)

                        Elementary            Secondary             Total sample
                            Completed             Completed             Completed
Type of teacher    Sample   cases        Sample   cases        Sample   cases
Mainstream            300     255           300     255           600     510
EL-Specialist         300     255           300     255           600     510
Total sample          600     510           600     510         1,200   1,020
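The per-school selection rule in Step Three can be sketched as follows. The roster layout and the "role" field are illustrative assumptions; the sketch also assumes each sampled school's roster contains at least one mainstream teacher of ELs.

```python
import random

def select_teachers(roster, seed=42):
    """Select the study's two teachers from one school roster: one
    EL-specialist (itinerant teachers included) plus one mainstream
    teacher of ELs; if the school has no specialist, fall back to two
    mainstream teachers of ELs."""
    random.seed(seed)
    specialists = [t for t in roster if t["role"] == "specialist"]
    mainstream = [t for t in roster if t["role"] == "mainstream"]
    if specialists:
        # one teacher drawn at random from each stratum within the school
        return [random.choice(specialists), random.choice(mainstream)]
    # no EL specialist serves this school: take two mainstream teachers
    return random.sample(mainstream, min(2, len(mainstream)))
```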

B.2.2. Estimation Procedures
Weighting
Separate district-level and teacher-level weights will be produced to represent the two domains being
studied. Our sample design attempts to balance the need to obtain a large enough sample size for
analysis of subdomains with the desire to have as small a design effect as possible for each population.

Sampling weights reflecting the overall probabilities of selection and adjustments for nonresponse will
be attached to each data record. Two sets of weights will be provided to enable estimation of district
and teacher characteristics. To generate district weights, an initial weight will be assigned to each
sampled district. This weight will be the inverse of the district’s probability of being selected for the
survey. An adjustment will then be applied to the initial weight to compensate for district-level
nonresponse. This will result in the final district weight for use in calculating estimates for districts.
To generate teacher estimates, an additional sequence of steps will be performed. First, the final district
weight will be adjusted to account for the subsampling of schools from the cooperating districts. We will
then adjust the weight further to compensate for school-level nonresponse. Then we will adjust the
weight for the subsampling of teachers from among the sampled schools. Finally, to create the final
teacher-level weight, we will adjust for nonresponding teachers.
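The weight composition described above amounts to multiplying stage-level factors. A minimal sketch, with hypothetical argument names; in practice each nonresponse adjustment is estimated within weighting cells rather than passed in directly:

```python
def teacher_weight(p_district, adj_district, p_school, adj_school,
                   p_teacher, adj_teacher):
    """Compose the final teacher weight: the inverse of the product of
    the stage selection probabilities (district, school, teacher), times
    a nonresponse adjustment factor at each stage."""
    base_weight = 1.0 / (p_district * p_school * p_teacher)
    return base_weight * adj_district * adj_school * adj_teacher
```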
To properly reflect the complex features of the sample design, standard errors of the survey-based
estimates will be calculated using jackknife replication. Under the jackknife replication approach,
50–100 subsamples or "replicates" will be formed in a way that preserves the basic features of the full
sample design. A set of estimation weights (referred to as "replicate weights") will then be constructed
for each jackknife replicate. Using the full sample weights and the replicate weights, estimates of any
survey statistic can be calculated for the full sample and each of the jackknife replicates. The variability
of the replicate estimates is used to obtain a measure of the variance (standard error) of the survey
statistic.
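The variance computation is straightforward once the replicate estimates have been produced with their replicate weights; a minimal sketch:

```python
def jackknife_se(full_estimate, replicate_estimates):
    """Jackknife standard error: a scaled sum of squared deviations of
    the replicate estimates from the full-sample estimate. The (R-1)/R
    factor shown is the delete-one-group (JK1) form; the exact factor
    depends on how the replicates are constructed."""
    r = len(replicate_estimates)
    variance = (r - 1) / r * sum((est - full_estimate) ** 2
                                 for est in replicate_estimates)
    return variance ** 0.5
```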

Estimation
Given the use of a statistical sample, survey data presented for districts and teachers will be weighted to
national totals (tabulations will provide standard errors for the reported estimated statistics). At the end
of data collection we will create analysis weights for districts and teachers that reflect the sample design
and how the samples are selected (i.e., the sample of districts is selected from a stratified frame and the
sample of teachers is drawn from a sample of schools within the sample districts). The weights will be
adjusted for nonresponse at the different levels of selection (district, school and teacher level). These
weights are needed to produce unbiased estimates for districts and teacher characteristics and for
comparisons among the different groups of districts and teachers. In addition, the descriptive tables will
indicate where differences between subgroups are statistically significant. We will use Chi-Square tests
to test for significant differences among distributions and t-tests for differences in means. Tabulations
will be included in the reports where appropriate.

B.2.3. Degree of Accuracy Needed
District survey. The proposed district sample size (333 per stratum) is sufficient for a Minimum Detectable
Effect Size (MDES) of 21.62 percentage points (39.19 to 60.81 percent) for single comparisons among the
three groups of EL student representation in a district (significant, moderate, and low-incidence), for an
assumed proportion with a true value p=0.5, a significance level α=0.05, a statistical power
1−β=0.80, and an assumed design effect of 1.2. However, the MDES for district estimates will differ
depending upon the variable being analyzed and will be larger for variables with larger design effects.

Teacher survey. The proposed teacher sample size is sufficient to provide an MDES of 19.87 percentage
points (40.07 to 59.94 percent) for comparisons between mainstream and specialist teachers of EL
students, for a proportion with a true value p=0.5, a significance level α=0.05, a statistical power
1−β=0.80, and a design effect of 1.29. The same MDES would result for paired comparisons between
elementary and secondary teachers under the same assumed proportion, significance level, power, and
design effect. As with the district estimates, the MDES for teacher estimates depends upon the variable
being analyzed and will be larger for variables with larger design effects.

B.2.4. Unusual Problems Requiring Specialized Sampling Procedures
There are no unusual problems to be addressed in the sampling.

B.2.5. Use of Periodic (less than annual) Data Collection Cycles to Reduce Burden
This is a one-time data collection.

B.3. Methods to Maximize Response Rates and Deal With Nonresponse

B.3.1 District and Teacher Surveys
This study will use a variety of strategies to achieve a response rate of 85 percent. Westat has
conducted many nationally representative studies that achieved the required level of response, including
studies for the Department of Education’s National Center for Education Statistics. Obtaining high
response rates will be critical to the success of the study. It will be particularly important that response
rates are not only high overall but also approximately equal across the district stratification groups and
across both teacher groups. As a first step toward ensuring high response rates, we have developed
survey instruments that are clear and that respondents can easily navigate.
Once the sample has been identified, the survey staff will develop databases of district and school
contact information using available online resources. Upon receipt of OMB clearance, notifications of
selection will be sent out to the districts (via email). The survey operations staff will immediately reach
out to the sampled districts, and will confirm or complete contact information as needed and update the
survey contact information in the survey operations database. The staff will use email and/or phone to
make the necessary contacts, and may use regular mail if required.
In addition, the staff will begin to submit applications for research approval for those districts where
such approvals are required. Based on our past experience conducting national education studies, our
staff is aware of many such districts in advance, is familiar with the types of requirements these districts
present, and understands how to prepare responsive applications. Submitting the applications will
be an immediate priority component of the preparation for the data collection.
In further preparation for a successful survey data collection with a high response rate, Westat assigns a
specialist survey operations manager who is experienced in implementing surveys. The survey
operations manager uses proven systems for tracking survey responses on an ongoing basis and for
following up on non-response cases using a variety of methods. The survey operations team comprises
skilled survey staff with many years of experience in conducting such follow-up contacts. In addition,
Westat employs a number of strategies to ensure that the required response rate is obtained. The survey
staff employs a comprehensive strategy for contacting non-respondents, using a combination of
reminder emails, follow-up letters and telephone calls to encourage respondents to complete the
surveys. In addition, the strategies will be tailored to districts and teachers specifically as outlined
below.
Maximizing district survey response rates. We will design our notification and follow-up materials to
encourage districts to take part based on an interest in furthering knowledge in an understudied but
critically important area. We anticipate that at least some districts will be motivated by knowing that
their completion of the surveys will help address a new area of research, especially research on the use
of technology, an area that presents many opportunities and challenges to administrators. Districts may
also see value in responding, since the study
findings are expected to meaningfully shape the guidance products that will be developed at the end of
the study, and that these products will be of greater value to developers and to educators if their
perspectives and practices are included. Districts will be informed that their input and that of teachers in
their district will be important in ensuring that their perspectives and needs contribute to the findings
that will help define the content of the guidance documents.
The main district respondent will be the EL services coordinator or another person most focused on EL
student instructional services. Thus, our follow-up with non-respondents will encourage their completion
of the survey as a means of assisting educators to better understand how to select and use DLRs to
support EL students. In follow-up contacts, the study staff will note that the educator’s guide for
technology use with ELs will be provided at the end of the study. This may also serve as an incentive
since it will be a resource that the coordinator can share with schools and teachers. However, where
needed, we will also contact the district superintendent’s office to request alternative respondents
knowledgeable about district EL services and instructional practices and/or technology use in instructing
ELs to replace the non-respondent.
A further consideration in maximizing district response rates is that many districts will require district
research approval applications. In the section below on study timeline (section 10), we outline a
proposed revised data collection timeline, with data collection beginning in fall 2016, rather than June
2016. One of the benefits of the revision is that it allows for advance time to contact districts to identify
and comply with their research approval requirements. This will also assist in ensuring higher response
rates from districts.
Maximizing teacher survey response rates. We will provide teachers with an incentive of $25 for
completing the teacher survey. It is important to provide this incentive since teacher participation in the
study is voluntary and we recognize that teachers have many demands on their time. We anticipate that
the incentive will assist in reducing the scale of the nonresponse follow-up necessary to achieve the
desired response rate. In addition, we expect the incentive to increase teacher interest and to
encourage teachers to prioritize participation over other activities that do not provide a monetary
reward. The incentive, partially compensating them
for their time and effort, may also help ensure a higher quality of data, since teachers will understand
that there is acknowledgment of their time required to complete the survey. Further, we anticipate that
it will be an incentive to some teachers to know that what is learned will in very real ways inform the
guide for teachers and also lead to important guidance for DLR developers. Some may be motivated by
the fact that this survey is addressing an area in which there has not been much research and input is
very much needed. Our materials will be designed to ensure teachers are aware of the importance of
their contributions.

B.3.2 Case Studies
It is essential that a high percentage of nominated districts and schools agree to participate in the study
because the study is using a limited number of case studies, and each case study district and school is
being used to collect in-depth data for a specific context. If districts and schools do not participate, then
important contextual information may not be gathered.
As noted above, the research team has extensive experience in gaining access to schools and districts for
research purposes. Prior to site visits, the research team will identify a school district contact. We will
have initial contact information based on the teacher sampling process, in which we will be contacting
districts and schools. We will identify the district person most responsible for EL services as our main
contact for the survey; for consistency, the same person will also serve as the main contact for the
district site visit.
We will provide our district contact with a letter describing the study and its importance for the field,
and the contribution that districts and schools will make, through their participation, toward advancing
the development and use of DLRs with ELs. This letter will also include the purpose of the case studies
and the major interview topics, and will provide information on how to learn more about the study.
Additional strategies that the research team will use include: assigning one researcher as the primary
contact to ensure consistency in contacts; using multiple methods to communicate with districts and
schools (in addition to email, using phone and mail if necessary); providing ample opportunities for
district and school contacts to ask questions about the study; building in flexibility in working with
multiple local coordinators for scheduling where necessary; selecting mutually convenient dates; and
providing easy-to-use tools such as scheduling templates to minimize the burden on sites.
The research team will work closely with districts and schools to select respondents based on their role
and will be flexible in scheduling the time and location of the interviews. To ensure that each relevant
respondent group is represented in each case study, the research team will conduct interviews by phone
at a later date in any case where respondents are unable to schedule a meeting during the site visit or
become unavailable on short notice. Through implementation of these multiple strategies, we anticipate
meeting and exceeding the 85 percent minimum response rate.

B.4. Test of Procedures and Methods to be Undertaken

Surveys
The district and teacher surveys were constructed to ensure that each research question is addressed by
one or more items and that each item addresses a specific research question. After iterative internal
reviews, including Department input, the draft district and teacher surveys were pilot tested with up to
three respondents each. The pilot tests, with follow-up cognitive interviews, were carried out to confirm
time estimates and to obtain input on how best to ensure the clarity of the items and the
comprehensiveness of the response options. The project team randomly identified up to two districts
(e.g., one with significant EL representation and one with low-incidence representation) based on the U.S.
Department of Education database of districts. The staff contacted the identified respondent by email
and/or phone to request assistance in informing the development of the relevant survey. The survey
operations staff established procedures for reaching out to districts and schools for such initial piloting
and cognitive interviews. The pilot respondents each completed the relevant survey and then discussed
the individual items and the survey overall with the project staff. They identified items that were unclear
and/or suggested ways to ensure that the items are relevant to their experience. Each respondent also
provided information on the amount of time required to complete the survey. No more than nine
respondents participated in pilot tests of the surveys.

Case Studies
Internal pretesting of the interview protocol items was conducted to ensure clarity. The case study
protocols were pilot-tested with appropriate representatives from districts and schools (no more than
nine respondents were interviewed). The protocols were revised as needed for clarity and
comprehensiveness in relation to the study objectives. Participants in the case studies were identified
through project team contacts and recommendations of districts with varying levels of EL representation. The
pilot tests and cognitive interviews ensure that all protocols are aligned with the constructs detailed in
the conceptual framework (described in the introduction of Supporting Statement A) and map back to
the research questions and subquestions, ensuring that the protocols will capture all the information
needed. The research team also constructed a matrix to ensure that the protocol for each respondent
type addresses all topics relevant to that role.

B.5. Individuals Consulted on Statistical Aspects of the Design

Name                     Affiliation  Telephone Number  e-mail
Annette Zehler           Westat       301-251-8231      AnnetteZehler@westat.com
Ismael Flores-Cervantes  Westat       301-251-4204      IsmaelFloresCervantes@westat.com
Atsushi Miyaoka          Westat       301-610-4948      MIYAOKA1@westat.com
Phil Vahey               SRI          650-859-2143      Philip.Vahey@sri.com
Savitha Moorthy          SRI          650-859-5143      Savitha.Moorthy@sri.com
Julie Warner             ED/PPSS      202-453-6043      Julie.Warner@ed.gov
Libia Gil                ED/OELA      202-401-4300      Libia.Gil@ed.gov
James Collins            ED/OET       202-401-1444      James.Collins@ed.gov
