
National Survey of Charter School Facilities

OMB: 1855-0024



SUPPORTING STATEMENT

FOR PAPERWORK REDUCTION ACT SUBMISSION

OMB Number: 1855-0024


B. Collection of Information Employing Statistical Methods

The agency should be prepared to justify its decision not to use statistical methods in any case where such methods might reduce burden or improve accuracy of results. The following documentation should be provided with the Supporting Statement Part A to the extent that it applies to the methods proposed. For further information, please obtain a copy of the FAQs for statistical surveys from the Office of Management and Budget. The standards and guidelines are available from ICCD's SharePoint site.

  1. Describe the potential respondent universe (including a numerical estimate) and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, state and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

Respondent Universe and Sample Size

The National Charter School Facility Survey (NCSFS) will use a stratified, systematic sample of charter schools selected from a list of U.S. charter schools. We will draw our sample from a list of charter schools in the 50 states and the District of Columbia from the Common Core of Data (CCD) school survey, a survey of all elementary and secondary schools in the U.S. For our purposes, we specified the school as the sampling unit and generated a population list of 7,573 charter schools in the U.S. as of the 2018–19 school year. The target sample size is 700 charter schools, approximately 9 percent of the total charter school population.

We expect an 80 percent response rate; the previous state-level surveys achieved an average response rate of 80 percent. The table below shows the sample allocation by stratum.





Stratum | Description | Population Size | Proportion | Sample Size | Contact Sample Size
1 | City, Elementary, Title-I eligible | 1,824 | 26.17% | 183 | 229
2 | Non-city, Elementary, Title-I eligible | 1,383 | 19.84% | 139 | 174
3 | City, Secondary, Title-I eligible | 1,344 | 19.28% | 135 | 169
4 | Non-city, Secondary, Title-I eligible | 795 | 11.40% | 80 | 100
5 | Non-city, Elementary, Title-I ineligible | 588 | 8.43% | 59 | 74
6 | City, Elementary, Title-I ineligible | 370 | 5.31% | 37 | 46
7 | City, Secondary, Title-I ineligible | 339 | 4.86% | 34 | 43
8 | Non-city, Secondary, Title-I ineligible | 328 | 4.71% | 33 | 41
Total | | 6,971 | 100.00% | 700 | 875


Table reads: Stratum 1 includes Title-I eligible elementary charter schools located in cities. There are 1,824 schools in this stratum, accounting for 26.17 percent of the 6,971 charter schools with available data in the Common Core of Data. We will need complete responses from 183 schools in stratum 1 to form a nationally representative sample of 700 charter schools. Assuming a 20 percent non-response rate, we need to contact 229 charter schools in stratum 1.
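The allocation in the table can be reproduced arithmetically: each stratum receives a proportional share of the 700-school sample, and contact counts are inflated for the expected 20 percent non-response. A minimal sketch follows; the half-up rounding rule is our assumption, and note that the per-stratum contact counts sum to 876, one more than the table's rounded total of 875 (= 700 / 0.8).

```python
# Reproduce the sample allocation table: proportional allocation of a
# 700-school sample across strata, inflated for an 80% response rate.
# The half-up rounding rule is an assumption made for this sketch.
POPULATION = {  # stratum -> number of schools on the CCD frame
    1: 1824, 2: 1383, 3: 1344, 4: 795,
    5: 588, 6: 370, 7: 339, 8: 328,
}
TARGET_SAMPLE = 700
RESPONSE_RATE = 0.80
N = sum(POPULATION.values())  # 6,971 schools with complete data

def half_up(x):
    """Round half away from zero (for non-negative x)."""
    return int(x + 0.5)

allocation = {}
for stratum, pop in POPULATION.items():
    n_h = half_up(TARGET_SAMPLE * pop / N)     # proportional sample size
    contact_h = half_up(n_h / RESPONSE_RATE)   # schools to contact
    allocation[stratum] = (n_h, contact_h)

print(allocation[1])  # (183, 229) -- stratum 1 in the table
```

The per-stratum sample sizes sum exactly to 700 under this rounding, matching the table.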


  2. Describe the procedures for the collection of information, including:

  • Statistical methodology for stratification and sample selection.

  • Estimation procedure.

  • Degree of accuracy needed for the purpose described in the justification.

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

Sampling Method

Prior to sampling, we will stratify the list of charter schools, i.e., group schools before selection, to reduce the variances of the sample estimates under a proportionally allocated sampling scheme. We will explicitly stratify schools by:

1. Urbanicity: city versus others

2. School instructional level: schools serving elementary grades (any grade between prekindergarten and fifth grade) versus schools serving secondary grades only, and

3. Percentage of students from low-income families: eligible for Title-I programs versus others

We will stratify charter schools by these characteristics to create homogeneous groups for sampling. These homogeneous groups help reduce sample variances and therefore increase the precision of parameter estimates.

Within each stratum, we will sort schools by school age (i.e., the number of years since the establishment of the charter school) and by school enrollment in the 2018–19 school year. We will randomly pick a starting school within each stratum and then select schools at a sampling interval of eight, i.e., we will select every 8th school after the randomly selected school.
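The within-stratum selection can be sketched as follows. This is illustrative only: `sorted_ids` stands in for a stratum's list of schools, pre-sorted by age and enrollment as described above.

```python
import random

def systematic_sample(sorted_ids, interval=8, seed=None):
    """Systematic sample: pick a random start within the first interval,
    then take every `interval`-th unit after it."""
    rng = random.Random(seed)
    start = rng.randrange(interval)
    return sorted_ids[start::interval]

# e.g., a stratum the size of stratum 1 (1,824 schools)
stratum_1 = [f"school_{i:04d}" for i in range(1824)]
selected = systematic_sample(stratum_1, interval=8, seed=1)
print(len(selected))  # 228 selections (1,824 / 8), whatever the random start
```

Because consecutive selections are always eight positions apart in the sorted list, the sample spreads evenly across school age and enrollment within the stratum.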

Estimation Procedure


This survey data collection is designed to answer three research questions:

1. What are the physical conditions of charter school buildings?

2. What resources do charter schools use to access and maintain facilities?

3. What challenges do charter schools have in accessing and maintaining facilities?


To answer these research questions, we will develop a data analysis plan detailing data sources, key variables to be used in the analysis, the type of analysis, and the significance level to be used. We will apply weights appropriate for the sample design to calculate population estimates that reflect the design effect resulting from a complex survey design.


Prior to producing estimates, we will establish criteria for determining thresholds for acceptable sampling and non-sampling error associated with a direct survey estimate or model-based estimate. We will calculate variance estimates that take into account the probabilities of selection, stratification, effects of non-response, and post-stratification adjustments if needed. We will develop model-based estimates of population means and standard deviations for measures of interest. We also plan to compare these estimates across subgroups if the sample sizes of these groups suffice. Sensitivity analysis will be performed to determine whether changes in key model inputs cause key model outputs to respond in a sensible fashion. We will also compare results from our analysis to previous national school facility surveys to validate model performance and examine differences in facility conditions between charter schools and public schools overall.
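As a simple illustration of the design weighting (base weight only; the actual weights will also carry the non-response and post-stratification adjustments described above), each responding school represents its stratum population divided by the number of respondents in that stratum:

```python
def base_weight(stratum_pop, respondents):
    """Base design weight: schools in the stratum per responding school.
    Non-response and post-stratification adjustments would be applied
    on top of this in practice."""
    return stratum_pop / respondents

w = base_weight(1824, 183)  # stratum 1: 1,824 schools, 183 respondents
print(round(w, 2))  # 9.97 -- each respondent stands for about 10 schools
```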

Degree of Accuracy

To see how sensitive the facility measures are to changes in the sample size, we identified three key measures from the NCSFS questionnaire that were reported in Condition of Public School Facilities: 2012-13:

  • Percentage of schools with permanent buildings

  • Percentage of schools reporting their permanent building condition as excellent

  • Percentage of schools reporting needs for spending money on repairs

We obtained means and standard deviations of these measures as reported in the Condition of Public School Facilities: 2012–13, and computed, under assumptions of simple random sampling, the precision of these measures, i.e., standard errors and confidence intervals (CI). With a sample size of 700, the standard errors of these measures are around 0.02, small enough to produce precise estimates.




Measure | Mean | Std. Deviation | Std. Error (n=700) | 95% CI Lower | 95% CI Upper
Percentage with a permanent building | 0.85 | 0.43 | 0.02 | 0.72 | 0.78
Percentage with a permanent building in excellent condition | 0.20 | 0.40 | 0.02 | 0.17 | 0.23
Percentage with the need to spend money on repairs | 0.53 | 0.50 | 0.02 | 0.49 | 0.57
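Under simple random sampling, the standard error is sd/√n and the 95 percent CI is mean ± 1.96 × SE. A quick check against the "excellent condition" measure reported above:

```python
import math

def srs_precision(mean, sd, n=700, z=1.96):
    """Standard error and 95% confidence interval under simple
    random sampling assumptions."""
    se = sd / math.sqrt(n)
    return se, (mean - z * se, mean + z * se)

# "Permanent building in excellent condition": mean 0.20, sd 0.40
se, (lo, hi) = srs_precision(0.20, 0.40)
print(round(se, 2), round(lo, 2), round(hi, 2))  # 0.02 0.17 0.23
```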

There are no unusual problems requiring specialized sampling procedures, and data collection will take place only once under this contract.

  3. Describe methods to maximize response and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for the intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.

Efforts to maximize response

We will work with Charter Support Organizations (CSOs) to identify and recruit data collectors across states to onboard selected schools. These data collectors will also follow up with schools to facilitate survey completion and maximize the response rate. We will create multiple channels of communication to facilitate recruitment and data collection, including a data collection monitoring dashboard. Listed below are the communications with schools for which we will develop protocols for project staff.


For recruitment:

  • 1 email to schools asking them to participate in the survey

  • 2 follow-up emails

  • 2 follow-up calls

  • 1 follow-up email


For data collection:

  • 1 email to schools to send the URL for the survey questionnaire

  • 2 follow-up emails

  • 2 follow-up calls

  • 1 follow-up email

  • 1 follow-up postcard

  • 1 thank-you letter


All data collectors will receive training to learn about the survey study, protocols for school recruitment and survey follow-ups, and the internal communication channels.


Handling non-response and testing reliability


The survey estimates are subject to nonsampling errors due to non-response or noncoverage, errors of reporting, and other errors made in data collection. These errors can bias the data and are hard to measure.


To minimize the potential for non-sampling error, we conducted a pilot test of the questionnaire with charter school personnel knowledgeable about school facilities. During the design and piloting of the survey instruments, we made great efforts to check the clarity of instructions and question items to ensure consistent interpretation. The questionnaire and other data collection protocols were extensively reviewed by staff in the Charter School Programs office and the Institute of Education Sciences.


During the analysis, we will develop a code file to check data accuracy and consistency. We will create a source file to generate a codebook, which will serve as the main tool for coding, editing, and processing completed questionnaires. We will use the codebook to identify cases requiring data retrieval or clarification and to verify non-response items in line with skip patterns. We will conduct manual logic checks and automated checks using SAS.
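The automated checks will be implemented in SAS; purely as an illustration of the shape of such a check (the item names `has_permanent_building` and `building_condition` are hypothetical, not items from the NCSFS instrument), a skip-pattern edit looks like this:

```python
def check_skip_pattern(record):
    """Flag a record that answers a follow-up item its skip pattern
    should have suppressed. Item names are hypothetical examples."""
    errors = []
    if record.get("has_permanent_building") == "no" and record.get("building_condition"):
        errors.append("building_condition answered, but school reports no permanent building")
    return errors

# A record that violates the skip pattern produces one error message
print(check_skip_pattern({"has_permanent_building": "no",
                          "building_condition": "excellent"}))
```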

  4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

The National Charter School Resource Center (NCSRC) executed pilot test activities with five charter school leaders from March 2020 through May 2020 in preparation for the administration of the National Charter School Facilities Survey that will take place in 2021. The purpose of the pilot test was to design clear questions in concise language that are relevant to the charter school facilities field. The pilot test also allowed us to adjust and refine the recruitment and data collection processes so we can maximize response rates for the national survey administration.

During the pilot test process, we drafted recruitment and data collection protocols, created a communications plan, conducted cognitive interviews, recruited schools, disseminated the survey questionnaire, monitored data collection, and conducted exit interviews. We improved our survey questionnaire based on feedback collected from pilot test participants, tested the viability of the data collection platform, and developed a more precise estimate on participant burden.

  5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other persons who will actually collect and/or analyze the information for the agency.

James Lepkowski, Ph.D., Advisor, 734-649-8652

Ying Zhang, Ph.D., Survey Director, Manhattan Strategy Group

Hannah Sullivan, Senior Analyst, Manhattan Strategy Group

Aubrey DeBoer, Education Research Analyst, Manhattan Strategy Group


