
Evaluation of the Implementation of the Carl D. Perkins Career and Technical Education Act of 2006



OMB Forms Clearance Package: Supporting Statement Part A (OMB Control Number 1875-0253)







July 23, 2009



Prepared for:

Policy and Program Studies Service

Office of Planning, Evaluation and Policy Development

U.S. Department of Education







Prepared by:

MPR Associates, Inc.

205 SE Spokane Street, Suite 344

Portland, OR 97219






Contents

Introduction
National Assessment of Career and Technical Education (NACTE)
Overview
Purpose of the Study
Conceptual Framework
Theory of Action
Outputs and Impact
Supporting Statement for Paperwork Reduction Act Submission
Part A: Justification
A.1 Importance of the Information
A.2 Purposes and Uses of the Data
A.3 Use of Electronic Technologies to Reduce Data Collection Burden
A.4 Efforts to Reduce Duplication
A.5 Impacts on Small Businesses or Other Entities
A.6 Consequences If Data are Not Collected or are Collected Less Frequently
A.7 Special Circumstances Relating to Data Collection
A.8 Federal Register Announcement and Consultations Outside the Agency
A.9 Payments or Gifts to Respondents
A.10 Assurances of Confidentiality to Respondents
A.11 Justification for Questions of a Sensitive Nature
A.12 Estimates of Information Collection Burden
A.13 Estimate of Total Cost Burden
A.14 Estimates of Annualized Cost to the Federal Government
A.15 Explanation of Program Changes or Adjustments
A.16 Project Time Schedule
A.17 OMB Expiration Date
A.18 Exceptions to Certification Statement

Part B: Collection of Information Employing Statistical Methods
B.1 Respondent Universe
B.2 Information Collection Procedures
B.3 Methods for Maximizing Response Rates
B.4 Test of Procedures
B.5 Individuals Consulted on Statistical Aspects of the Design




List of Exhibits


Exhibit 1: Perkins IV Logic Model



List of Tables


Table 1: Survey Response Times
Table 2a: Survey Response Burden
Table 2b: Fiscal Allocation Response Burden
Table 3: Costs for Collecting Survey and Fiscal Allocation Data
Table 4: Project Activities
Table 5: Population and Sample Counts for LEAs, Excluding Area CTE Centers, by Strata
Table 6: Population and Sample Counts for IHEs, by Strata





Appendix


Appendix A: Notification Materials: Survey and Fiscal Allocation Data Collections
Appendix B: State Secondary and Postsecondary Surveys
Appendix C: Local Secondary and Postsecondary Surveys

Introduction


The Policy and Program Studies Service (PPSS), Office of Planning, Evaluation and Policy Development, U.S. Department of Education (ED), is conducting the congressionally mandated independent evaluation and assessment of career and technical education programs under the Carl D. Perkins Career and Technical Education Act of 2006 (Act or Perkins IV), including the implementation of the Act. This assessment is referred to as the National Assessment of Career and Technical Education (NACTE). PPSS requests clearance for the design of survey instruments and fiscal data collection tools for the NACTE.

NACTE focuses on three key aspects of state and local implementation of career and technical education (CTE) programs funded with federal Perkins IV resources, among others identified in Perkins IV:

  1. Programs of Study—how states and locals are creating sequenced, nonduplicative coursework aligned with challenging academic standards and rigorous technical content;

  2. Accountability Systems—how well, and in what manner, new performance reporting requirements are working to promote accountability and program improvement; and

  3. Finance Systems—how financing of, and state and local priorities for, CTE have changed as a result of new legislative provisions.

Clearance is requested for the design, sampling strategy, survey instruments, and fiscal data collection tools to be employed as part of the evaluation.


National Assessment of Career and Technical Education (NACTE)

Overview

The Carl D. Perkins Career and Technical Education Act of 2006 (Act or Perkins IV) reinforces and elaborates a longstanding federal commitment to supporting career and technical education (CTE). Although federal contributions to the enterprise account for only a fraction of state and local spending, federal policy has had, and continues to exert, a catalytic influence on state and local programs and policies. Over time, as national attention has turned to globalization and technological change, and their implications for a more highly skilled workforce, so too has the Act's emphasis shifted, with current legislation aimed at raising academic and technical rigor and aligning secondary and postsecondary CTE coursework to prepare students for entry into high-skill, high-wage, or high-demand occupations.

In FY 2009, the federal government allocated more than $1.2 billion in support of CTE programs offered by secondary local education agencies (LEAs) and public institutions of higher education (IHEs) throughout the 50 states, the District of Columbia (D.C.), the Commonwealth of Puerto Rico, and outlying areas. The U.S. Department of Education (ED) has responsibility for monitoring recipients' use of federal funds to ensure that resources are spent in an optimal manner and in accordance with congressional intent.

Perkins IV mandated that the Secretary provide for an independent evaluation and assessment of career and technical education programs under the Act, referred to as the National Assessment of Career and Technical Education (NACTE), with the guidance of an Independent Advisory Panel (IAP) of CTE experts. This evaluation study, which is one of multiple CTE evaluation efforts directed by PPSS, focuses on evaluating how eligible agencies (state boards responsible for administration of CTE) and eligible recipients (secondary LEAs and eligible institutions) are responding to the new legislation. The study’s complex research design, which calls for collecting data through state and local surveys and state fiscal allocation records, supplemented by case studies, web searches, expert panel reviews, and state agency and local provider interviews, promises to yield a wealth of information on the immediate roll-out of Perkins IV.

This study focuses on three key legislative changes introduced in the 2006 Act. These include 1) the requirement that all eligible recipients offer at least one Program of Study (POS) to organize CTE coursework; 2) the requirement that eligible agencies design and implement separate performance measures for secondary, postsecondary, and Tech Prep programs and extend these accountability provisions to the local level; and 3) provisions that promote and provide for increased state and local flexibility in the use of federal funds. These changes will require that states and local service providers make substantial changes in how they administer programs, deliver services, and collect and report data on student and program outcomes.

To capture state and local staff perceptions of the Act’s implementation, researchers will administer surveys to the secondary and postsecondary directors of CTE within eligible agencies in each of the 50 states, the District of Columbia, Puerto Rico, the Virgin Islands, and outlying areas. A second set of surveys will be administered to a nationally representative sample of 1,265 secondary LEAs and 765 IHEs. To assess the distribution of federal resources to LEAs and IHEs, researchers will collect 2008-09 fiscal allocation data from state secondary and postsecondary eligible agencies administering federal Perkins dollars.

Purpose of the Study

The NACTE will assist federal policymakers in assessing the effect of legislative changes introduced in Perkins IV on the Act’s implementation and help to inform possible reauthorization of the Perkins Act. Study activities call for evaluating three aspects of state and local implementation of programs funded by Perkins IV, with the goal of answering the following evaluation questions, among others:


Programs of Study

  • How many POS are offered within LEAs and IHEs, and what are the characteristics of these programs?

  • How were POS developed and who participated in their creation?

  • What strategies have been used to implement POS at the local level, and what types of assistance were provided?

Accountability Systems

  • How have states designed and implemented new Perkins performance measures?

  • What approaches are states using to apply CTE performance accountability systems at the local levels?

  • How well, and to what extent, are the new performance requirements working to promote accountability and program improvement?

Finance Systems

  • How has financing of, and state and local priorities for, CTE changed as a result of new legislative provisions?

  • How has increased flexibility in the use of federal funds, and Tech Prep funds, in particular, affected eligible agencies and local CTE programs?



Conceptual Framework

Perkins IV mandates that all participating eligible recipients offer at least one Program of Study (POS) that provides students sequenced coursework that integrates challenging academic standards and career and technical content that leads to an industry-recognized credential or certificate at the postsecondary level, or an associate or baccalaureate degree. The Act also affords states increased flexibility in the allocation of funds by permitting eligible agencies to merge their Tech Prep funding (Title II) into their Basic Grant (Title I).


The theory of action underlying Perkins IV holds that the organization of career and technical knowledge and skills into a standards-based, aligned and articulated, non-duplicative sequence of courses leading to an industry-recognized credential, certificate, or degree will lead to increased academic and technical achievement by students, along with higher levels of student persistence, program completion, and college and career readiness.

[Exhibit 1: Perkins IV Logic Model. The graphic maps inputs (federal legislation, funding, and nonregulatory guidance; state legislation, funding, policies, and guidance; local funding and program support) through activities (integrated academic and CTE content and instruction; collaborative secondary and postsecondary partnerships; focused professional development for improved instructional quality; accountability for program improvement and management) to outputs and impacts.]
Theory of Action

Although the law retains many of its central components, Perkins IV introduces some important advances. New legislative features contained within the Act form the basis of a theory of action directing eligible recipients to undertake activities for promoting program improvement and student attainment of academic as well as technical proficiency. In many ways, the evolution of Perkins IV parallels that of the No Child Left Behind Act (NCLB), which seeks to hold local programs accountable for achieving educational gains as part of a continuous improvement process.

Arguably the most substantive change introduced in Perkins IV is the requirement that all eligible recipients offer at least one program of study (POS) to help students prepare for postsecondary education and career entry, including military service. Serving as a unifying framework for guiding the development of CTE programs, POS encompass a number of elements that fall within two key dimensions.

The first is that POS have coherent and rigorous content, that is to say, they are based on challenging academic standards and relevant career and technical content. At the secondary level, Perkins IV places a priority on promoting student attainment of challenging academic standards identified within states’ NCLB systems and integrating these standards with relevant technical content within a given CTE area. Instruction is intended to provide students with the academic and technical skills they need to pursue advanced education or training, military service, or workforce entry. Skill specificity becomes progressively more advanced as students transition to postsecondary education, with increasing emphasis placed on students’ attainment of an industry-recognized credential or certificate or an associate or baccalaureate degree.

The second dimension is that POS offer a systemic focus to program design. The technical definition of POS specified in Perkins IV is that a POS is a sequence of courses that (1) incorporates secondary education and postsecondary education elements; (2) presents challenging academic standards and relevant CTE content in a coordinated, non-duplicative progression of courses that prepares high school students to succeed in postsecondary education and the workforce; (3) may include the opportunity for secondary students to participate in dual or concurrent enrollment programs or offer other ways to acquire postsecondary education credits; and (4) leads to an industry-recognized credential or certificate at the postsecondary level, or associate or baccalaureate degree.

Perkins IV’s statewide performance accountability requirements support and reinforce POS by specifying CTE goals in the form of performance measures and negotiated performance levels. As within NCLB, the development of statewide accountability systems becomes viable once rigorous content is identified. Perkins IV imposes performance accountability by elaborating a set of discrete secondary, postsecondary, and Tech Prep accountability measures to track students’ academic and career and technical skill attainment, along with program completion, and placement in employment, military service, apprenticeship programs, or advanced training or postsecondary education. In keeping with the Act’s focus on equity, Perkins IV requires that states disaggregate performance data by students’ race-ethnicity and special population status to monitor whether all students are benefiting from program services.

Another important advancement in Perkins IV is its extension of accountability expectations to the local level. In addition to negotiating performance targets with the Office of Vocational and Adult Education (OVAE) in the U.S. Department of Education (as earlier Perkins Acts required), states must now negotiate individual performance targets with each eligible recipient or require that all local programs adopt state-established targets. Eligible recipients falling short of negotiated targets face progressive sanctions, beginning with the requirement that they develop a local program improvement plan to address identified deficiencies and culminating in the loss of some or all of their Perkins IV funding. In expanding accountability to the local level, Perkins IV echoes NCLB expectations that all students can succeed and establishes a mechanism for holding local grantees accountable to this vision.

These changes occur against a backdrop of federal fiscal allocation policies that have remained, in large part, constant over time. The Perkins IV allocation formula for distributing grants remains essentially unchanged. States may continue to reserve a portion of their federal funds for state leadership and administration, allocating remaining resources using the formula contained in the preceding legislation. The Act also continues its focus on equity and addressing the needs of special populations. States are still required to monitor and take steps to improve student access to program services and to collect and report data on student and program performance, overall and disaggregated by demographic characteristics.



Outputs and Impact

Perkins IV offers a set of administrative and programmatic requirements that lay the groundwork for a comprehensive transformation of the CTE enterprise. To understand the Act's anticipated effect, it is first necessary to catalog the outputs and the short- and intermediate-term impacts expected to result from the legislation. These measurement points are summarized below.

Outcome 1: Integrated Academic and Technical Content

One of the criticisms leveled at traditional vocational education was its single-minded focus on technical skill instruction, devoid of academic substance or rigor. Perkins IV directs educators to identify technical skills that reflect what an individual needs to succeed in the workforce and to couple them with academic standards. Achieving an integrated curriculum will require the collaboration of academic and technical instructors, engaged in directed, content-focused reviews of subject area curricula. Outputs will include the identification and cataloguing of challenging academic content standards to be incorporated with career and technical content within a given POS and used in the development of CTE curricula.

Impacts

Successful integration of academic and technical skills should have direct, measurable effects on student outcomes. These may include:


  • Increased academic rigor within CTE programs—Students who are taught using integrated curricula should benefit by achieving higher levels of academic skill proficiency, as measured by valid and reliable assessments.

  • Clarity of instructional outcomes for CTE programs—Instructional programs that are founded on industry-recognized standards will enable secondary teachers and postsecondary faculty to detail explicitly the expected knowledge and skills students should have when they complete their studies and to measure students’ attainment of those career and technical skill proficiencies.


Outcome 2: Collaborative Secondary and Postsecondary Partnerships

The introduction of POS in Perkins IV is intended to reinforce the connection between secondary and postsecondary instruction. When fully implemented, an aligned secondary–postsecondary program sequence will provide a seamless transition between education sectors, benefiting both students and institutions. Developing comprehensive partnerships will demand the commitment of state secondary and postsecondary system directors, who will need to develop articulation agreements, specify criteria for establishing regional or statewide articulation, agree on formulas for sharing resources across educational sectors, and codify policies for awarding and transferring credit. High school teachers and college faculty will also need to play a role in aligning standards, curricula, and assessments across educational sectors to ensure that students entering a postsecondary institution do so with the requisite skills.

Impacts

Meaningful secondary and postsecondary partnerships should lead to improved outcomes for students participating in POS. These may include:


  • Reduction or elimination of postsecondary remediation for students entering from secondary programs—High school students participating in a POS would be expected to be more proficient in the educational prerequisites they will need to complete their secondary coursework and enroll in a postsecondary institution.

  • Increased concurrent enrollment and dual credit opportunities—Courses that offer opportunities to earn credit at both the secondary and postsecondary levels allow secondary students to accelerate their pace through POS and provide incentives for motivated students to achieve.

  • Increased postsecondary placement and employment rates for CTE graduates—Secondary CTE students will pursue postsecondary CTE at increasing rates as POS become accepted as a single pathway, spanning the secondary and postsecondary sectors.

  • Student attainment of industry credential or postsecondary certificate or degree—An expected output of Perkins IV could be increased numbers of students entering (1) employment, military service, and postsecondary education with career and technical skill proficiencies aligned with industry-recognized standards and (2) postsecondary education with clear certificate or degree objectives in mind and better understanding of how to achieve them.




Outcome 3: Focused Professional Development

Achieving the transformational changes that Perkins IV envisions will require outfitting secondary teachers and postsecondary faculty with the skills they need to develop curricula and build partnerships across educational sectors. Professional development activities specified in the Act are intended to: (1) promote integration of coherent and rigorous academic and technical content; (2) encourage academic and CTE teachers to collaborate on the development and implementation of curricula and instructional practices; (3) increase the percentage of teachers who meet teacher certification or licensing requirements; (4) be high quality, sustained, and focused on instruction, and increase CTE teachers' academic knowledge and understanding of industry standards; (5) encourage applied learning that contributes to the academic and technical knowledge of the student; (6) assist teachers and faculty in accessing and utilizing student achievement and assessment data; and (7) promote integration with NCLB professional development activities at the secondary level.

Impacts

Secondary instructors and postsecondary faculty who participate in professional development should have a better understanding of how to structure their coursework to support student learning. Impacts may include:

  • Improvements in classroom instruction—Instructors trained in the use of POS will understand the knowledge and skills students should have when they complete their component of a POS.

  • Strengthened capacity of CTE instructors to enhance and reinforce academic content—Targeted professional development, in conjunction with the introduction of guided curricular materials, will assist CTE educators in integrating academic standards into their teaching of technical skills.


Outcome 4: Accountability for Program Improvement and Monitoring

Perkins IV establishes specific expectations for the development of state and local performance accountability systems. These systems require programs to assemble data and evidence that will promote continuous improvement and inform policymakers and the public of the return on program investments. To motivate change, the Act imposes a progressive set of sanctions on state agencies and local grantees that fail to achieve negotiated levels of performance on their performance measures. Consequences begin with the expectation that underperforming grantees develop program improvement plans to address identified deficiencies, over time escalating to a loss of some or all Perkins IV funding if improvements in meeting targets are not realized.

Impacts

Performance accountability systems are intended to promote continuous improvement. Impacts may include:


  • Valid and reliable metrics to measure program outcomes—As states adapt to OVAE guidance, the validity and reliability of state and local accountability definitions and measures, and in particular, technical skill assessments, should improve over time.

  • Increased program effectiveness and efficiency—Quantitative data from performance measures can be combined with qualitative data received from program stakeholders to create a clear picture of programmatic operations and areas in need of improvement.

  • Expanded industry recognition of and confidence in CTE graduates—As employers come to trust that program graduates have the requisite skills to aid the industry’s economic success, they will be more likely to voice support for CTE programs. Such support may lead to increases in student mentoring by industry professionals, hosting of work-based learning experiences, donations of materials and equipment, and employee incentives for individuals graduating from a recognized POS.



Supporting Statement for Paperwork Reduction Act Submission


Section A. Justification


A.1 Importance of the Information

The Carl D. Perkins Career and Technical Education Act of 2006 (Act or Perkins IV) builds upon and extends prior congressional legislation aimed at increasing the academic knowledge and technical skills of secondary and postsecondary students enrolling in CTE programs. To ensure that federal funds are expended for purposes detailed in the Act, the Perkins IV legislation mandates a set of national activities (PL 109-270, section 114) to describe and evaluate the status of state and local implementation of the Act’s requirements.

Information collected from the NACTE study will be submitted to Congress in the form of an interim report (due January 1, 2010) and final report (due July 1, 2011), which are mandated in the legislation. These reports are to give an account of the condition of CTE and the success of eligible agencies (state boards responsible for administration of CTE) and eligible recipients (secondary LEAs and postsecondary IHEs) in implementing the Act and improving the quality and effectiveness of CTE services.

Perkins IV introduced some significant changes in how eligible agencies and eligible recipients use federal resources. As such, there is a need to assess how legislative innovations affect state administration of CTE programs and the delivery and outcomes of eligible recipients’ services. This study will focus on collecting information relating to three key areas:

1. Programs of Study

All participating eligible recipients are now required to offer at least one Program of Study (POS) that provides students a non-duplicative progression of secondary and postsecondary coursework that integrates challenging academic standards and career and technical content and that leads to an industry-recognized credential or certificate at the postsecondary level, or an associate or baccalaureate degree.

2. Performance Accountability

Reauthorization has permitted Congress to tailor accountability requirements within educational sectors and to improve the validity and reliability of performance indicators contained in the Act. Perkins IV introduces separate core indicators for secondary and postsecondary education and a new, distinct set of measures for Tech Prep programs. Congress has made an effort to link the Perkins IV measures to other federal initiatives, and in particular, to the No Child Left Behind Act (NCLB). Academic skill gains and completion rates of secondary students taking a threshold level of CTE coursework must now be calculated using the methodology of NCLB, with CTE concentrators now expected to achieve at the same rate as all other students.

Congress also has expanded accountability from the state agency to the local level. Secondary and postsecondary local providers are now required to negotiate with their eligible agency to establish performance targets for each accountability measure. Eligible recipients falling short of their targets face sanctions that begin with the need to develop a program improvement plan and culminate with the loss of some or all of their federal Perkins IV funding.

3. Finance

Perkins IV generally retains the formulas previously used to allocate federal resources to eligible agencies and to eligible recipients (LEAs and eligible institutions). Given the importance that Congress has attached to targeting resources to populations living below the poverty line at the secondary level and to Pell grant recipients at the postsecondary level, there is a need to update past fiscal analyses to assess whether the legislation continues to target resources where they are intended.

The Act does, however, introduce increased flexibility in states' use of reserve funding and in the manner in which Tech Prep programs may be funded. In particular, the Act permits states to merge their Title II Tech Prep funds into their Title I Basic Grant and, in so doing, sidestep the separate fiscal and accountability requirements for Tech Prep contained in the Act. Information on the extent to which states are taking advantage of these new provisions, and on their effect on program offerings, also needs to be considered.


A.2 Purposes and Uses of the Data

Data collected for the NACTE study will be used by Congress and other stakeholders to assess the extent to which state secondary and postsecondary eligible agencies are implementing legislative requirements introduced in the 2006 legislation. Specifically, survey data collected at the state director level will be used to assess whether, and if so, how, eligible agencies have guided POS development and modified their state accountability systems to improve the validity and reliability of CTE population definitions and performance measures. State fiscal allocation data also will be used to assess the distribution of federal resources to local providers in the 2008-09 program year and to conduct trend analyses using 2000-01 allocation data reported in the 2004 National Assessment of Vocational Education.

Local survey data will be collected from CTE program directors within eligible recipients to determine how new legislative requirements, which require locals to offer at least one POS to be eligible for funding, are being implemented. Information will also be collected on the effect that new accountability requirements are having on locals' collection and use of student and program data. Collection of these data is critical for assessing the effect that federal policies have at the local level, both directly and through their translation into policy and guidance provided by a state to its secondary and postsecondary eligible recipients. If state and federal officials plan to use Perkins data to make important policy decisions and to guide subsequent reauthorizations, it is essential that any benefits and limitations associated with the legislation be surfaced and documented.


A.3 Use of Electronic Technologies to Reduce Data Collection Burden

The contractor will employ a web-based survey as the primary mechanism for data collection. The web-collection system works by storing the survey instrument on an SQL server and displaying questions for respondents in program-controlled sequences on the computer screen. Through computer control of the instrument administration process, web-based self-reporting offers the capacity for substantial improvements in data quality and data collection efficiency over a standard survey conducted using paper and pencil.

The incidence of missing or inconsistent data is greatly reduced with web-based approaches, since questionnaire skip patterns are computer controlled. Moreover, invalid entries or entries inconsistent with previous responses are rejected by the computer, requiring that the respondent enter corrected information at the time the survey is conducted. The system also adds considerable flexibility to the interviewing process, since questions can be quickly modified should an unforeseen complication arise.

Additional features of the system include: (1) on-line help to assist respondents in providing responses; (2) full documentation of all instrument components, including variable ranges, formats, record layouts, labels, question wording, and flow logic; (3) an integrated case-level control system to track the status of each sample-survey respondent or institution across the various data collection activities (lead-letter mailing, reminder mailing, web data collection, etc.); and (4) automatic audit file creation and timed backup to ensure that, if an instrument is exited prematurely and later restarted, all data entered previously will be retrieved.
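To illustrate the skip-pattern and validation behavior described above, the brief sketch below shows the general approach in simplified form. It is a hypothetical illustration only: the question identifiers, wording, routing, and validation rules are placeholders and do not represent the contractor's actual web-collection system or the NACTE instruments.

# A hypothetical, simplified sketch of program-controlled skip patterns and
# entry validation of the kind described in Section A.3. Question IDs, wording,
# and rules are placeholders, not the actual NACTE instrument.

QUESTIONS = {
    "q1": {"text": "Does your agency offer at least one Program of Study?",
           "valid": lambda a: a in {"yes", "no"},
           "next": lambda a: "q2" if a == "yes" else "q3"},
    "q2": {"text": "How many Programs of Study are currently offered?",
           "valid": lambda a: a.isdigit() and 0 < int(a) <= 500,
           "next": lambda a: "q3"},
    "q3": {"text": "Did your agency merge Tech Prep funds into the Basic Grant?",
           "valid": lambda a: a in {"yes", "no"},
           "next": lambda a: None},          # end of (abbreviated) instrument
}

def administer(responses):
    """Walk the instrument in a program-controlled sequence.

    Invalid entries are rejected at collection time, and questions skipped by
    the routing logic are never presented, which is how a web instrument
    reduces missing or inconsistent data relative to paper and pencil.
    """
    current, collected = "q1", {}
    while current is not None:
        rule = QUESTIONS[current]
        answer = responses[current]
        if not rule["valid"](answer):
            raise ValueError(f"Invalid entry for {current}: {answer!r}")
        collected[current] = answer
        current = rule["next"](answer)       # computer-controlled skip pattern
    return collected

# Example: a "no" on q1 routes the respondent past q2 entirely.
print(administer({"q1": "no", "q3": "yes"}))     # {'q1': 'no', 'q3': 'yes'}

In a production system, the routing and validation logic would be driven by the instrument stored on the server rather than by a hard-coded dictionary; the sketch is intended only to convey the mechanism.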

Achieving the target response rate of 85 percent will require flexibility. Since not all individuals have access to, or are comfortable working with, web-based technologies, respondents will be offered the opportunity to complete the survey via the web, by phone, or using a paper questionnaire. The proposed data collection strategy incorporates the best features of mail communication, rapport building through telephone contacts, and the robustness of a web-survey instrument. Additionally, the contractor will remain flexible during the data collection process, implementing additional strategies to reduce nonresponse as required.


A.4 Efforts to Reduce Duplication

The contractor has taken steps to reduce duplication of efforts and to minimize the reporting burden placed on respondents, and in particular, state directors of CTE who annually are asked to respond to multiple data collection requests from different state and federal agencies. Since this is a new data collection, focused on obtaining state CTE director and local CTE program director perceptions on the implementation of new legislative provisions, it is not possible to use or modify other data sources to achieve the reporting requirements contained in the Act.

As a first step to reduce data burden, project staff met with federal representatives at PPSS to obtain copies of survey instruments and data collection methodologies used for the Spring 1992 and Summer 2004 National Assessment of Vocational Education (NAVE) surveys. Project team members also obtained copies of raw fiscal allocation data files used to produce the finance section of previous studies.

In November 2008, NACTE project researchers met with National Center for Education Statistics (NCES) staff after determining that NCES was planning to survey state secondary and postsecondary CTE directors in early 2009 using the agency's Fast Response Survey System. Since this survey effort would have overlapped the content of the NACTE state-level survey and would have been conducted several months prior to the NACTE collection, it was agreed that it would be in the best interests of both agencies and the field if the NCES survey were delayed and, if possible, refocused on a different topic. Project staff continue to consult with NCES to obtain survey development guidance and will, where appropriate and permissible, share survey and fiscal findings from this effort.

NACTE staff also consulted with representatives of the Government Accountability Office (GAO), which is planning to conduct a survey of state directors of CTE in the first quarter of 2009 to assess their perceptions of the implementation of Perkins IV. Since this survey effort is being conducted at the request of congressional members, it is not possible to delay or cancel the planned survey. To reduce duplication, NACTE researchers conducted a conference call with GAO staff to share information on the proposed survey effort.

In addition to supplying GAO with recommendations for improving their survey content, NACTE staff used information gleaned from their survey review to minimize overlapping questions. Although the GAO is, at present, unsure if the agency will be permitted to share raw data from its survey with NACTE researchers, GAO has agreed to share its overall study findings with the NACTE study team for potential use in this study.

Other than the two identified studies, this data collection effort is the only one that will collect information about the implementation of the Perkins IV legislation. NACTE researchers have contacted representatives of other CTE stakeholder groups, including the National Association of State Directors of Career and Technical Education, the National Research Center for Career and Technical Education, the Association for Career and Technical Education, and the National Association for Career and Technical Education Information to advise their leadership of the proposed survey and to solicit their support.


A.5 Impacts on Small Businesses or Other Entities

No small businesses or other entities will be involved in the survey effort. All respondents will be employees of 1) state secondary or postsecondary departments of education (or other eligible agencies that administer federal Perkins funds); 2) LEAs or area CTE centers serving public school districts; or 3) public IHEs, including less-than-2-year institutions, 2-year colleges, tribally controlled colleges, and area or regional schools funded with postsecondary resources.


A.6 Consequences if Data are not Collected or are Collected Less Frequently

The Perkins legislation mandates that the national evaluation of CTE address the extent to which state, local, and tribal agencies have developed, implemented, or improved state and local CTE programs funded through the Act. Survey and fiscal allocation data, which will be collected once during the lifetime of this project, will constitute a major source of information about how programs are implemented under Perkins IV. Survey findings and fiscal allocation data will be used to respond to requests from the Department of Education, other government officials, and congressional legislators who wish to know how, and how well, the Perkins IV legislation is working and how effectively taxpayers' dollars are being invested. Study results will also assist Congress in reauthorizing the Perkins Act.

If survey and fiscal allocation data are not collected from state secondary and postsecondary directors of CTE, the project team will be unable to respond to the Congressional mandate concerning how Perkins IV and U.S. Department of Education guidance has been translated into state policies regarding programs of study, accountability, and finance. This will prevent ED and the Independent Advisory Panel (IAP) from determining whether, and if so, to what extent states are successfully implementing the new Perkins legislation. Lack of fiscal data on Perkins IV allocations to local providers will also prevent ED and the IAP from providing Congress with information on the usefulness and equity of federal funding.

Absence of survey information from program directors of LEAs and IHEs will preclude ED and the IAP from assessing how new Perkins requirements are being applied at the local level and whether implementation patterns differ as a function of provider characteristics. Given that the Act introduces a number of provisions, including that eligible recipients negotiate program performance levels and develop at least one program of study, the collection of provider-level data is critical to assessing the effect of federal legislative changes in achieving congressional intent.


A.7 Special Circumstances Relating to Data Collection

No special circumstances apply to the collection of survey or fiscal allocation information.


A.8 Federal Register Announcement and Consultations Outside the Agency

In accordance with the Paperwork Reduction Act of 1995, a 60-day notice to solicit public comments was published in the Federal Register in Volume 74, page 22160, on May 12, 2009. No public comments were received at the close of the comment period.

Project team members have consulted with ED on the research and sample design, survey instruments, and data sources and collection strategies. These consultations have been used to ensure that study activities align with federal expectations and congressional reporting needs.


MPR Associates Inc. and its subcontractors AED and RTI have collaborated with federal staff to identify and secure existing data sources and to obtain input on the study design. Key staff from these organizations are listed below:

MPR Associates, Inc.
  Steve Klein, Director, Preparation for College and Careers
  Jim Schoelkopf, Senior Research Associate
  Elliott Medrich, Director, External Affairs and Development

AED
  Ivan Charner, Vice President and Director, National Institute for Work and Learning
  Robin White, Senior Program and Policy Director
  Corinne Alfeld, Senior Research and Evaluation Specialist

RTI
  James Isaac, Survey Director
  Darryl Creel, Research Statistician

A.9 Payments or Gifts to Respondents

There are currently no plans to pay or provide gifts to state directors, as these individuals have a vested interest in the project and historically have evidenced high cooperation rates. In circumstances where local program respondents clearly demonstrate a lack of resources to complete the survey, we propose to offer a modest reimbursement. This reimbursement is intended primarily for institutions with limited staff resources, allowing an LEA or IHE to pay someone to complete the data entry (e.g., through overtime or additional hours).

We anticipate that reimbursement will be needed for no more than 15 percent of potential non-respondents, and that the per-hour reimbursement rate will range from $25 to $40.1 Given that we anticipate a total of 3,047 local respondents (2,041 LEAs and 1,006 IHEs), a potential nonresponse rate of 15 percent would translate to 457 respondents (0.15 * 3,047). Assuming a reimbursement rate of no more than $40 per person, the total reimbursement cost should not exceed $22,850 ($40 * 1.25 hrs * 457 respondents).
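The reimbursement ceiling can be verified with a short calculation that simply restates the assumptions given above (a 15 percent reimbursement rate among 3,047 local respondents, a $40 maximum hourly rate, and 1.25 hours of effort per survey); the sketch below is illustrative only.

# Reproduces the Section A.9 reimbursement ceiling from the assumptions stated
# in the text; the figures are estimates, not fixed payment amounts.
local_respondents = 2041 + 1006                  # LEAs + IHEs = 3,047
reimbursed = round(0.15 * local_respondents)     # at most 15% -> 457 respondents
max_rate = 40.0                                  # upper end of the $25-$40 hourly range
hours_per_survey = 1.25                          # assumed effort per local survey
ceiling = max_rate * hours_per_survey * reimbursed
print(f"{reimbursed} respondents, up to ${ceiling:,.0f}")   # 457 respondents, up to $22,850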


A.10 Assurances of Confidentiality to Respondents

Responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific secondary LEA, postsecondary institution, or individual at the local or state levels. Moreover, study team members will not release any information that identifies a subject or district to anyone outside the study team, except as required by law.

Additionally, RTI, which is overseeing the data collection process, maintains a standing Committee on Human Subjects to ensure that all surveys of human populations comply with applicable regulations concerning informed consent, confidentiality, and protection of privacy. This group serves as the organization's Institutional Review Board (IRB) as required by law (45 CFR 46). Policy requires that the IRB independently review and approve the study design, instruments, and procedures, and monitor the study to ensure that sample members' rights are fully protected.

An advance letter will be sent to all survey participants, describing the voluntary nature of the survey, and conveying the extent to which respondent identifiers and all responses will be kept confidential.

A.11 Justification for Questions of a Sensitive Nature

No questions of a sensitive nature will be asked. Questions focus on state-level or local level provider information rather than on personal information about individuals. Published data from the surveys will be presented in aggregate form that does not identify individual respondents.

A.12 Estimates of Information Collection Burden

Congress has mandated that the U.S. Department of Education conduct an independent evaluation and assessment of the Perkins Act each time the Act has been reauthorized. Researchers routinely have employed survey research methods to collect study information, querying secondary and postsecondary state directors in both the 1992 and 2004 national assessments, and secondary and postsecondary local program directors in the 1992 study.2 To be sensitive to the work demands on state and local program administrators, an effort has been made to reduce the burden associated with collecting data. In addition to employing web-based technologies to collect information, researchers have also substantially shortened the state agency and local program director surveys relative to those used in earlier data collections.

As shown in Table 1, the length of time allocated for survey collection for the 2011 NACTE has either remained constant or been substantially condensed from prior study efforts. Reductions reflect NACTE researchers’ efforts to focus the local study on specific topics—identified in the theory of action—that are hypothesized to affect the Act’s operation.

Table 1

Survey Response Times (in minutes) — Secondary and Postsecondary State Directors and Local Program Administrators

                                   1992 Survey   2004 Survey*   2011 Survey
Secondary State Director                   138            95*            90
Postsecondary State Director               132             90            90
Secondary Program Director                 150             NA            75
Postsecondary Program Director             192             NA            75

*Includes time allocated for a Tech Prep survey (30 minutes) and a subsequent follow-up survey (20 minutes) in addition to the original survey instrument.

NA: Not applicable


Table 2a presents estimates of the reporting burden state directors and local program providers will incur in completing the surveys, which will be administered in the second year of the study. Time estimates are based on feedback provided by participants in a February 2009 pilot study of the survey and on our experience using similar instruments.3 There are no direct monetary costs to respondents other than their time to participate in the study.

Table 2a

Survey Response Burden — Secondary and Postsecondary State Directors and Local Program Administrators

Survey Respondent                  Number of     Time per Response   Total Hours   Cost per Hour   Cost
                                   Respondents   (minutes)
Secondary State Director                  53            90                   80       $25-$40      $2,000 - $3,200
Postsecondary State Director              53            90                   80       $25-$40      $2,000 - $3,200
Secondary Program Director             2,041            75                2,552       $25-$40      $63,800 - $102,080
Postsecondary Program Director         1,006            75                1,258       $25-$40      $31,450 - $50,320
Total                                  3,153                              3,970                    $99,250 - $158,800
Annualized Average                     1,051                              1,323                    $33,083 - $52,933

Note: Respondents include the 50 states, the District of Columbia, Puerto Rico, and the Virgin Islands.

Source: AFT Public Employees 2007 Compensation Survey.
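For transparency, the Table 2a figures follow a single calculation: total hours equal the number of respondents multiplied by the time per response in minutes, divided by 60 (rounded up to a whole hour, as the table's entries suggest), and the cost range applies the $25 and $40 hourly bounds to those hours. The sketch below simply reproduces the table's inputs to illustrate that calculation; Table 2b follows the same logic but reports fractional hours.

# Recomputes the Table 2a burden and cost estimates from the respondent counts
# and per-response times stated in the text. Hours are rounded up to whole
# hours, consistent with the table; costs use the $25-$40 hourly range.
import math

rows = [
    ("Secondary State Director",       53,   90),
    ("Postsecondary State Director",   53,   90),
    ("Secondary Program Director",     2041, 75),
    ("Postsecondary Program Director", 1006, 75),
]

total_hours, low_cost, high_cost = 0, 0, 0
for name, respondents, minutes in rows:
    hours = math.ceil(respondents * minutes / 60)
    total_hours += hours
    low_cost += hours * 25
    high_cost += hours * 40
    print(f"{name}: {hours:,} hours, ${hours * 25:,} - ${hours * 40:,}")

print(f"Total: {total_hours:,} hours, ${low_cost:,} - ${high_cost:,}")
# Total: 3,970 hours, $99,250 - $158,800; annualized over the three-year
# clearance this is roughly 1,323 hours and $33,083 - $52,933, consistent
# with the table.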


Table 2b

Fiscal Allocation Response Burden — State Data Directors and Data Analysts

Respondent                          Number of     Time per Response   Total Hours   Cost per Hour   Cost
                                    Respondents   (minutes)
State Secondary Director                   53            30                 26.5       $25-$40      $663 - $1,060
State Secondary Data Analyst               53            60                 53         $25-$40      $1,325 - $2,120
State Postsecondary Director               53            30                 26.5       $25-$40      $663 - $1,060
State Postsecondary Data Analyst           53            60                 53         $25-$40      $1,325 - $2,120
Total                                     212                              159                      $3,976 - $6,360
Annualized Average                         71                               53                      $1,325 - $2,120

Note: The 53 respondents include the 50 states, the District of Columbia, Puerto Rico, and the Virgin Islands.

Source: AFT Public Employees 2007 Compensation Survey.


State directors also will be asked to submit an electronic file containing 2006-07 and 2009-10 fiscal allocation data for all LEAs and IHEs awarded a federal Perkins grant. The information contained in this file will be drawn from existing state administrative records used to award and administer federal grants. Time estimates for this request, detailed in Table 2b, assume that the state director will review the original request, assign a data analyst to complete the request, and review the data prior to submission. Assembling the data will require that a state data analyst identify relevant data files and transfer information from them to an electronic data file using a data layout provided by the project team. State size should have no bearing on the time required to produce the electronic file, since data will already be in standardized electronic formats.
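Purely as an illustration of the kind of electronic submission described above, a state's fiscal allocation file and a minimal completeness check might resemble the hypothetical sketch below. The field names, identifiers, values, and file format shown are placeholders; the actual data layout will be the one provided by the project team.

# Purely illustrative: a hypothetical fiscal allocation record layout and a
# minimal completeness check. The actual layout, field names, and file format
# will be specified by the project team, not by this sketch.
import csv, io

HYPOTHETICAL_FIELDS = ["state", "sector", "recipient_id", "recipient_name",
                       "program_year", "perkins_allocation"]

sample = """state,sector,recipient_id,recipient_name,program_year,perkins_allocation
OR,secondary,1234567,Example School District,2009-10,125000
OR,postsecondary,765432,Example Community College,2009-10,310500
"""

def check_submission(text):
    """Flag rows with missing fields or non-numeric allocation amounts."""
    problems = []
    for i, row in enumerate(csv.DictReader(io.StringIO(text)), start=1):
        if any(not row.get(field) for field in HYPOTHETICAL_FIELDS):
            problems.append((i, "missing field"))
        elif not row["perkins_allocation"].replace(".", "", 1).isdigit():
            problems.append((i, "non-numeric allocation"))
    return problems

print(check_submission(sample))   # [] -> no problems found in the toy file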

Similar to the procedures described in section A.9, nominal monetary reimbursement will be made available in those cases where use of a data analyst would otherwise not be practical or possible. We anticipate that reimbursement will be needed for no more than 15 percent of potential non-respondents, and that the per-hour reimbursement rate for data analysts will not exceed $40.


A.13 Estimate of Total Cost Burden

There are no additional respondent costs associated with this data collection other than the time and cost burden estimated in item 12.


A.14 Estimate of Annualized Cost to the Federal Government

The estimated total annual cost to the federal government for collecting survey and state fiscal allocation data from local agencies is included in Table 3, along with cost breakdowns for the individual survey and fiscal allocation collections. Cost data are based on research and support staff hours required to design instruments and collect information, as well as associated operational costs, including overhead and other direct costs, such as travel, communications, and printing.


Table 3

Costs for Collecting Survey and Fiscal Allocation Data

Study Year (Dates)                      Total Study Costs   Survey Data Collection   Fiscal Allocation Data
Year 1 (10-01-2008 to 09-30-2009)                $438,445                 $372,678                  $65,767
Year 2 (10-01-2009 to 09-30-2010)                $380,201                 $323,171                  $57,030
Year 3 (10-01-2010 to 09-30-2011)                 $33,433                  $28,418                   $5,015
Total                                            $852,079                 $724,267                 $127,812
Annualized Average                               $284,026                 $241,422                  $42,604

Note: It is estimated that development of the fiscal allocation data collection instrument and collection of finance data will account for roughly 15 percent of the total level of effort for the survey and data collection.



A.15 Explanation of Program Changes or Adjustments

Because this is a new collection, the change in annual reporting burden is an increase of 1,323 hours for the survey and 53 hours for the fiscal data collection, for a total of 1,376 hours.

A.16 Project Time Schedule

This is a three-year study that will make use of descriptive statistics to tabulate findings. No advanced analytical techniques will be used. Project findings will be incorporated into an interim report and final report. Key development and deliverable dates are noted in Table 4.


Table 4

Project Activities

Activity                                                End Date

Survey Design
  Draft management plan                                 October 2008
  Final management plan                                 December 2008

Instrument Design
  Draft survey and fiscal data collection tool          January 2009
  Survey pilot test (8 respondents)                     February 2009
  Final survey and fiscal data collection tool          March 2009

Study Sample
  Sample identification*                                April 2009

Data Collection
  Survey notification                                   As soon as OMB approval is received
  Data collection - State Directors                     November 2009
  Data collection - LEAs and IHEs                       January 2010

Draft Report: POS
  Draft                                                 July 2010
  Final                                                 August 2010

Final Report: Accountability
  Draft                                                 May 2010
  Final                                                 July 2010

Final Report: Finance
  Draft                                                 July 2010
  Final                                                 September 2010

* Sample identification entailed generating a random sample of secondary school districts using the Common Core of Data and postsecondary institutions using the Integrated Postsecondary Education Data System.
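As a general illustration of the sampling mechanics referenced in the footnote above, the sketch below shows a stratified random draw from a sampling frame. The frame fields, strata, and per-stratum counts are placeholders; the study's actual stratification and sample sizes are documented in Part B (Tables 5 and 6).

# Hypothetical sketch of a stratified random draw from a sampling frame such as
# the CCD (districts) or IPEDS (institutions). Field names, strata, and the
# per-stratum allocation are placeholders, not the study's actual design.
import random

def stratified_sample(frame, stratum_key, allocation, seed=2009):
    """Draw a simple random sample within each stratum of the frame."""
    rng = random.Random(seed)
    sample = []
    for stratum, n in allocation.items():
        members = [unit for unit in frame if unit[stratum_key] == stratum]
        sample.extend(rng.sample(members, min(n, len(members))))
    return sample

# Toy frame standing in for CCD district records.
frame = [{"lea_id": i, "size_class": "large" if i % 3 == 0 else "small"}
         for i in range(1, 101)]
drawn = stratified_sample(frame, "size_class", {"large": 5, "small": 10})
print(len(drawn))   # 15 sampled districts across the two placeholder strata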

A.17 OMB Expiration Date

All data collection instruments will display the OMB expiration date.


A.18 Exceptions to Certification Statement

No exceptions are requested.

1 To calculate payment rates, researchers consulted the AFT Public Employees 2007 Compensation Survey, available at http://www.aft.org/pubemps/pubs-reports/PEcompsurvey.htm. To calculate compensation rates for state and local program directors, the minimum and maximum average annual salaries for a state Educational Specialist in 2007 were identified, adjusted by 6 percent per year, and rounded up to the next $5,000.
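The adjustment described in this footnote can be expressed as a short calculation. The base salary and the two-year adjustment period used below are placeholders, since the footnote does not reproduce the underlying AFT figures; the conversion from an annual salary to an hourly rate is not described in the footnote and is therefore omitted here.

# Illustrates the footnote's growth-and-rounding rule with placeholder inputs;
# the actual 2007 AFT Educational Specialist salary figures are not reproduced.
import math

def adjusted_salary(base_salary_2007, years=2, growth=0.06, increment=5000):
    """Apply 6% annual growth, then round up to the next $5,000 increment."""
    projected = base_salary_2007 * (1 + growth) ** years
    return math.ceil(projected / increment) * increment

# Placeholder example: $52,000 grows to about $58,427 after two 6% adjustments
# and rounds up to $60,000.
print(adjusted_salary(52000))   # 60000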

2 To provide for trend analyses, a subset of questions used in both the 1992 and 2004 surveys has been replicated in the current survey effort. However, the Perkins IV legislation changed the federal accountability system and introduced Programs of Study as an organizing principle for offering CTE programs. As such, the current survey instruments differ substantially from those previously approved by OMB in 1992 (OMB #1850-0664) and 2004 (OMB #1875-0210).

3 Pilot study members included a total of eight individuals: two former state directors of secondary and two former state directors of postsecondary state agencies, and four current program directors recruited from the field, with two representatives from secondary LEAs and two from postsecondary IHEs. Since the pilot study involved fewer than nine individuals in total (i.e., four reviewers for the state director surveys and four reviewers for the local surveys), OMB approval was not requested prior to survey administration.
