OMB: 3145-0211

Supporting Statement

Request for Clearance: National Science Foundation, Directorate for Education and Human Resources, Division of Graduate Education

Data Collection for the Evaluation of the Computer & Information Science & Engineering’s Pathways to a Revitalized Undergraduate Computing Education Program (CPATH)


SECTION A

Introduction

This request for Office of Management and Budget (OMB) review asks for a 3-year clearance of the evaluation plan and the initial data collection for the Evaluation of the Computer & Information Science & Engineering’s Pathways to a Revitalized Undergraduate Computing Education Program (CPATH), which is administered by the National Science Foundation (NSF)’s Directorate for Computer and Information Science and Engineering (CISE). The evaluation is funded through the Directorate for Education and Human Resources (EHR).

NSF funds research and education in science and engineering. It does this through grants, contracts, and cooperative agreements to more than 2,000 colleges, universities, and other research and/or education institutions in all parts of the United States. NSF accounts for about 20 percent of Federal support to academic institutions for basic research. EHR is the directorate within NSF that is responsible for the health and continued vitality of the Nation’s science, technology, engineering, and mathematics (STEM) education and for providing leadership in the effort to improve education in these areas.


This package describes plans for the evaluation of the CPATH program, including two types of data collection. The first consists of interviews with six respondent types, conducted face to face at CPATH project sites or by telephone with off-site respondents. The interview protocols for the six respondent types are contained in Appendix A. These interviews will be conducted in Spring 2010, 2011, 2012, and 2013. The second data collection – a survey of CPATH faculty – will be submitted for clearance in 2010.


Overview of Program: CISE Pathways to a Revitalized Undergraduate Computing Education Program (CPATH)

As part of its mission, CISE contributes to the education and training of the next generation of computer scientists and engineers. Through the CPATH program, CISE is challenging its community partners – colleges, universities, and other stakeholders committed to advancing the field of computing and its impact – to transform undergraduate computing education on a national scale in order to meet the challenges and opportunities of a world where computing is essential to U.S. leadership and economic competitiveness across all sectors of society.

The use of computers has permeated, and in many cases transformed, almost all aspects of our everyday lives. As computer use becomes more important in all sectors of society, so does the need to prepare a globally competitive U.S. workforce with knowledge and understanding of critical computing concepts, methodologies, and techniques. Thus, upgrading undergraduate computing education to keep abreast of the multitude of rapid changes in computing is paramount for the U.S. economy and competitiveness.

The CPATH vision is of a U.S. workforce with the computing competencies and skills imperative to the Nation’s health, security, and prosperity in the 21st century. This workforce includes a cadre of computing professionals prepared to contribute to sustained U.S. leadership in computing in a wide range of application domains and career fields, and a broader professional workforce with knowledge and understanding of critical computing concepts, methodologies, and techniques.

To achieve this vision, CISE is calling for colleges and universities to work together and with other stakeholders (industry, professional societies, and other types of organizations) to formulate and implement plans to revitalize undergraduate computing education in the United States. The full engagement of faculty and other individuals in CISE disciplines will be critical to success. Common challenges are fluctuating enrollments in traditional computer science programs, changes and trends in workforce demographics, the imperative to integrate fast-paced computing innovations into the curriculum, and the need to integrate computing concepts and methodologies into the undergraduate curriculum at large. Goals and strategies must be developed to address these and other challenges. Successful CPATH projects will be systemic in nature, address a broad range of issues, and have significant potential to contribute to the transformation and revitalization of undergraduate computing education on a national scale.

CPATH was first announced in FY 2006. From FY 2007 to FY 2009, NSF made awards to 96 institutions involved in 69 CPATH projects. Awards in FY 2007 and FY 2008 were made in the following four categories (all four were used in FY 2007; only the first two were used in FY 2008).

  • Community Building (CB). CB awards support community-building efforts that bring stakeholders together to discuss the challenges and opportunities inherent in transforming undergraduate computing education, and to identify creative strategies to do so. The types of activities supported by CB grants include, but are not limited to: a) development of forums and opportunities for community stakeholders to come together to explore common interests, share lessons learned, and identify promising practices; b) engagement of stakeholders in undergraduate computing education including administrators and faculty from computer science and other disciplines in which computing is playing an increasingly important role, within one institution or more broadly; and c) efforts focused on developing strong partnerships among academic, industrial, and not-for-profit organizations with a stake in undergraduate computing education. The scope of CB activities is deliberately broad. CISE encourages the community to develop creative strategies likely to effect transformation in undergraduate computing education at the institutional, local, regional, and/or national levels and across all institution types.

  • Institutional Transformation/Transformative Implementation (IT). IT grants support the implementation of innovative, integrative models for undergraduate computing education that have potential to serve as national models. IT projects are expected to: a) develop and implement innovative, integrative organizational models for undergraduate computing education at one or more institutions; b) create sustainable changes in culture and practice within the participating organizations; and c) serve as models and resources for the national computing community. Single-institution IT projects must engage multiple academic units or disciplines. IT grants also support the work of multiple institutions committed to the implementation of common or related models of undergraduate computing education.

  • Evaluation, Adoption, and Extension (EAE). EAE awards support the ongoing work of institutions that have already discovered and begun to implement innovative undergraduate computing education models and approaches to realize the CPATH vision, as well as organizations wishing to emulate and/or evolve those models. Specifically, EAE awards support efforts to: a) evaluate the success and impact of new models currently being implemented; b) engage additional institutions in their implementation; and/or c) expand the scope of ongoing efforts. EAE grants support the originating institutions, the institutions committed to replicating or evolving the promising model, or both. EAE grantees are expected to disseminate lessons learned and promising practices so that other institutions and organizations may benefit from the project outputs.

  • CISE Distinguished Education Fellow (CDEF) projects. CDEF grants recognize accomplished, creative, and talented computing professionals who have the potential to serve as national leaders or spokespersons for change in undergraduate computing education. CDEF awards are made to individuals who have achieved distinction in the computing profession, who are committed to transforming undergraduate computing education, and who have innovative ideas on how to do so. CDEF recipients may spend significant time and effort on projects focused on innovative, original, and possibly untested ideas that will benefit undergraduate computing education on a national scale.


The FY 2009 solicitation classified CPATH grants by budget level, although awarded projects still had to meet the criteria described above. Class I projects have budgets totaling no more than $300,000 for 1-, 2-, or 3-year durations. Class II projects have budgets totaling no more than $800,000 for 2- or 3-year durations.


Overview of Evaluation Plan


NSF has contracted with SRI International to conduct a five-year evaluation of the CPATH program. The goals of this program evaluation are to:

  • Document the overall CPATH program delivery through its funding of awards;

  • Describe how the program is being implemented through different project strategies; and

  • Evaluate the extent to which the program is accumulating evidence supporting the transformation of undergraduate computing education.


A primary focus of this evaluation will be to describe and document the program strategies used to infuse computational thinking across different contexts and disciplines. CPATH is a new program for NSF and addresses a relatively complex problem for higher education institutions – how to reform undergraduate computing education for a world that has rapidly embraced technology in almost every facet of life and work. Therefore, this evaluation will focus on providing a comprehensive description of all project types, including curricular and pedagogical innovations and promising models of institutional change in higher education institutions. Given the interdisciplinary nature of computing, this evaluation will also examine the development of communities of practitioners and the dissemination of best practices around computational thinking. Additionally, the evaluation will examine partnerships between the different sectors with a stake in computing education. Finally, this evaluation will examine preliminary evidence on how the CPATH program is preparing students for career options in the STEM workforce.


The four main sources of information for the CPATH evaluation will be:

  • Site visits to a sample of awardees (conducted annually for four years)

  • Survey of a sample of faculty (conducted twice)

  • Project documents (reviewed annually)

  • Evaluation reports from a subset of projects that are supported to use quasi-experimental designs (reviewed annually)


In addition, programmatic data collected from project Principal Investigators through the CPATH data monitoring system will provide the numbers and types of people working on CPATH projects as well as other background information required for monitoring NSF programs. The data monitoring system is awaiting approval under EHR’s generic clearance.


Logic Model. The conceptual logic model that will guide the CPATH evaluation is depicted in the figure below. The model presents a common framework for understanding the context of the CPATH program, its specifications, and its goals, strategies, and outputs. The CPATH evaluation will employ mixed-method evaluation strategies, including document analyses, site visit interviews (conducted mostly face-to-face but some by telephone), and faculty surveys, to assess and measure site-based outputs as they accumulate over the period of the evaluation. The survey instruments have not yet been developed; they will be informed by the first round of site visits and presented for clearance in 2010.



[Figure: CPATH evaluation logic model]

Research Questions. Five overarching evaluation questions will guide the CPATH evaluation:

  1. How is the CPATH program infusing computational thinking into a wide range of disciplines serving undergraduate education?

  2. What is the evidence that university and community college departments and faculty are integrating computational thinking into their courses?

  3. What is the evidence that the program is supporting the development of promising models of institutional change?

  4. What is the evidence that the program is developing communities of practitioners (among the different program stakeholders) that regularly share best practices across communities?

  5. How has the CPATH program promoted sustainable multi-sector partnerships that represent a broad range of stakeholders (e.g., industry, higher education, K–12) and contributed to the growth of a workforce that ensures continued U.S. leadership in innovation?


Appendix B presents a crosswalk between these five overarching evaluation questions and subquestions and the items in the site visit protocols that will provide data to address the evaluation questions. The crosswalk documents the pathways for gathering data to shed light on the infusion of computational thinking across disciplines, on the development of models of institutional change, on community building, and on the creation of partnerships.

A.1. Circumstances Requiring the Collection of Data

The CPATH program was initiated in FY 2007 and has not been evaluated previously by any agency or individual. Although CPATH projects are required to have a project-level evaluation, no data are available on the extent to which expected programmatic outputs are being achieved.

The evaluation of CPATH at the program level requires a different approach than evaluation of the individual projects it supports. A program evaluation determines the overall value of a collection of projects that address an identified issue. It assumes variation in approaches to addressing the issue but looks for patterns that can help explain both successes and challenges. Some data used to track individual project activities and measure their impacts can be used for program-level evaluation, but program evaluation is more than an aggregation of project-level evaluation data.

A second reason aggregation of project data will not suffice for evaluation in this case is that the CPATH program itself may influence the community, including those who are not directly involved in the program. Although difficult to identify, these “spill-over” influences can be significant. If CPATH were an unchanging program that supported nearly uniform projects, or if this were a summative evaluation, a fixed evaluation plan would be methodologically appropriate. However, by its very nature CPATH requires a flexible and adaptive evaluation plan. This evaluation is therefore designed to have elements that track the same data over time through the use of mixed methods that will capture the changing dynamics within undergraduate education.

The CPATH program falls under the jurisdiction of the America Creating Opportunities to Meaningfully Promote Excellence in Technology, Education, and Science (COMPETES) Act (H.R. 2272), a statute that provides increased support for education programs in the United States to ensure that STEM students, teachers, workers, and businesses remain competitive in the global economy. The Act mandates, through the oversight of the Academic Competitiveness Council, that Federal STEM education programs receiving support undergo rigorous evaluation. In the case of CPATH, rigorous evaluation includes general data collection from all the projects as well as implementation of quasi-experimental designs (using comparison groups) with a subset of projects to collect more in-depth data and build evidence for causal claims about the program's effectiveness. In addition to the proposed original qualitative data collection activities, the program evaluation will draw on reports of these project-level evaluations to the extent practical, using an analytic approach called narrative review to summarize evaluation results.

A.2. Purposes and Uses of the Data

The overall purpose of the data collections is program evaluation. The data obtained from the data collections will be used to document the effectiveness and outputs of the CPATH program and assess achievement of program goals.


The specific purpose of the site visit interviews is to learn more about the CPATH projects, how they are being implemented, the organizations and stakeholders involved, and the effect that the projects are having on various groups and organizations.


The interview protocols for the Principal Investigators (including Co-PIs), administrators, faculty, project staff, and external partners seek to obtain background information about the individual and/or organization, strategies being used for the CPATH project, factors related to implementation of the project, perceived outputs, and community building and development of partnerships. Many of the questions are the same, or very similar, across all five protocols. In some cases, wording is revised slightly to make it more relevant to a particular group. Some questions do not appear on a particular protocol because they are not relevant to that group. More detail on the specific content being collected via these five protocols is provided below. Specifics on the student focus group protocol are described after the other protocols.


Background information: roles at the institution (PI protocol only); length of service; other positions held; role/responsibilities related to the CPATH project; prior involvement in undergraduate reform activities (for administrators, past and current reform initiatives on campus); perceptions of the most pressing issues in improving undergraduate education (not on staff protocol); how/why the institution/person became involved with CPATH; principal goals of CPATH project (staff protocol only). The protocol for external partners includes questions on the primary mission and focus of the organization, its size, its history including the year it was established, other major projects it is involved in, and its main sources of support.


Project strategies: nature of the teaching/learning environment for computing that the project is trying to create at the institution/partner sites; core strategies being used and how successful they have been; core computing concepts/competencies that the project focuses on (not on administrator protocol); how the project has integrated these concepts into courses outside of traditional computing disciplines (not on administrator or partner protocols); the primary beneficiaries of the project (not on partner protocol).


Implementation factors: highlights and successes/failures thus far; challenges/barriers; lessons learned; support offered by the institution to help faculty with curriculum development or devising pedagogical strategies for teaching computational thinking (not on staff or partner protocols); factors that have supported project implementation; sustaining the project beyond the end of the funding period (not on faculty protocol).


Outputs: influences of the project on faculty and students, e.g., enrollment in computing courses (not on partner protocol); changes in faculty culture (not on partner protocol); faculty publications on computational thinking (not on staff or partner protocols); institutional changes that can be attributed to the project, e.g., integration of computational thinking into other disciplines; rewards/incentive structure at the institution (not on staff protocol); whether or not the project has created a model that could be used at other institutions. The administrator protocol includes a question on how institutional changes might be documented.


Community building and partnership development: kinds of stakeholders and how information is shared with them; how inclusive the partnerships are; whether or not there is a shared understanding about computing competencies and/or computational thinking among stakeholders; communication among partners/sharing best practices; other organizations with which the partner has an ongoing relationship because of the project; extent to which NSF funding of the project has created new opportunities for partnerships among multiple sectors; extent to which the partnerships leveraged pre-existing relationships or new opportunities; interdependency of all partners (not on administrator protocol); effectiveness of partnership (not on administrator protocol); ways/barriers to sustaining multi-sector partnerships to replicate computational thinking models over the long term.


The focus group protocol for undergraduate students seeks to obtain information on the background of the students, their participation in the CPATH project, outputs of their participation, and their future plans. Specific items being collected are identified below.


Background information: year in college; major.


Participation in CPATH project: courses taken/currently enrolled in; other non-classroom activities that involve computing.


Outputs: benefits emerging from participating in computing courses/activities (interest in computing, computing ability, impact on future career plans, contacts with faculty/companies/other organizations, jobs/internships/interviews/mentoring obtained through the CPATH project, whether classmates would say the same); description of a learning experience in the classroom or lab that involved computing (including resources/tools used, role of instructors/TAs, teamwork); how CPATH computing courses/experiences compare to previous ones (including caliber and diversity of students); what “computational thinking” means and how computing courses/experiences through CPATH have shaped that understanding; perceptions of CPATH’s goals for students at their institution and whether or not the goals are being met.


Future plans: for continued involvement in computing courses/activities; for graduate school vs. a job using computing vs. a job in a non-computing specialty; how computing courses/experiences have influenced the students’ post-graduation plans.

A.3. Use of Information Technology to Reduce Burden

The data collection for which clearance is requested at this time will take place in Spring 2010, 2011, 2012, and 2013. This collection will involve face-to-face interviews during site visits to institutions with CPATH projects, as well as some telephone interviews with off-site partners. Because these interviews will be conducted in person or by telephone, information technology will not be used to reduce burden; Section A.3 therefore does not apply to this data collection.

A.4. Efforts to Identify Duplication

The evaluation of the CPATH program does not duplicate other NSF efforts. The data being collected for this program evaluation have not been and currently are not being collected by NSF or other institutions. As noted earlier, the program evaluation will draw from rigorous project-level evaluations to the extent practical, using an analysis approach called narrative review to summarize evaluation results.

A.5. Small Business

It is unlikely that this program evaluation will have an impact on small businesses. Site visits will include speaking with CPATH grant partners, who may represent large companies, K-12 school districts, higher education institutions, government offices, non-profits, and professional membership organizations. Partners will be asked questions about their CPATH project, how it is being implemented, and the extent to which various organizations and stakeholders have been involved in and affected by the project. If the program ultimately succeeds in reforming undergraduate computing education, many small businesses may benefit in the longer term.

A.6. Consequences of Not Collecting the Information

If the information is not collected, NSF will be unable to document the effectiveness and outputs of the CPATH program. Moreover, it will not be able to meet its accountability requirements because it will be unable to assess the degree to which the program is meeting its goals. This lack of information may hamper program management. In addition, NSF will be unable to comply fully with the Congressional mandate that NSF evaluate its STEM education programs.

A.7. Special Circumstances Justifying Inconsistencies with Guidelines in 5 CFR 1320.6

The data collection will comply with 5 CFR 1320.6.

A.8. Consultation Outside the Agency

Two notices have been published to solicit comments from the public. The first notice was published in the Federal Register on September 22, 2009 (Volume 74, Number 182, pages 48316–17). The second notice was published on December 8, 2009 (Volume 74, Number 234, pages 64743–44). This package is being submitted following the second notice. The text of both notices is included in Appendix C. No public comments were received in response to the first notice, and none have been received to date in response to the second.


The evaluation design was developed in consultation with NSF staff in the Directorate for Education and Human Resources (EHR), through which the evaluation of the CPATH program is funded, and with staff in the Directorate for Computer and Information Science and Engineering (CISE), which funds and administers the CPATH program.

A.9. Payments or Gifts to Respondents

No payments or gifts will be provided to participants in any data collection activities.

A.10. Assurance of Confidentiality

Interviewees will be advised that any information on specific individuals will be maintained in accordance with the Privacy Act of 1974. The data that are collected will be available only to NSF officials and staff and to the evaluation contractor. The data will be processed according to Federal and State privacy statutes. Detailed procedures for making information available to various categories of users are specified in the Education and Training System of Records (63 Fed. Reg. 264, 272; January 5, 1998). That system limits access to personally identifiable information to authorized users. The data will be used in accordance with criteria established by NSF for monitoring research and education grants and in response to Public Law 99-383 and 42 USC 1885c. The information requested may be disclosed to qualified researchers and contractors in order to coordinate programs, and to a Federal agency, court, or party in a court or Federal administrative proceeding if the government is a party.


Participants in the site visit interviews will be assured that the information they provide will not be released in any form that identifies them as individuals. Evaluation findings about the CPATH projects will be reported in aggregate form in all reports. The contractor, SRI International, has extensive experience in collecting information and maintaining the confidentiality, security, and integrity of data.


The following standards and procedures will safeguard the privacy of interviewees and the security of the data that are collected, processed, stored, and reported.


  • Project team members will be educated about NSF Privacy Act Systems of Records, the need to assure study participants of the confidentiality of their responses, and the ways data and other sensitive materials are to be handled. They will be cautioned not to discuss interview results with anyone outside the evaluation. Within the evaluation team, discussions will be restricted to the essential needs of a particular set of site visits.

  • An initial letter of invitation from the National Science Foundation will inform all individuals that their participation in the CPATH evaluation study is voluntary and that, if they are willing to participate, their privacy will be assured. This assurance will be reiterated at the time data collection begins. Participants will also be informed of the purposes of the data collection and the potential uses of the data collected.

  • Prospective interviewees will be given a Consent Form that states that the data will be protected under NSF Privacy Act Systems of Records and that describes the purposes of the study, potential risks and discomforts, and benefits of participation.

  • Personal information (names, addresses, phone numbers, email addresses) will be collected solely for the purpose of identifying and contacting study participants, and will not be distributed outside the site visit team.

  • All electronic recordings of interviews, interview notes, and other project-related documents will be stored in secure areas that are accessible only to authorized staff members. Electronic files and databases will be stored on a secure server and will be accessible only to authorized staff members. Access to response databases, as well as to other electronic and hard-copy materials used to record collected data, will be limited to Nancy Adelman (PI), Raymond McGhee (Co-PI), and those researchers granted access by the PI or Co-PI.

  • All interview results recorded on paper containing identifiable data will be shredded as soon as the need for the hard copies no longer exists.

  • All basic computer files will be duplicated on backup servers to allow files to be restored in the event of unrecoverable loss of the original data. These backup files will be stored under secure conditions in an area separate from the location of the original data.

  • Reports to NSF will include participants’ responses only in aggregate form. Responses will not be associated with any specific institution or individual. No information that could be used to identify individuals or their institution will be revealed to anyone outside the study team.

A.11. Questions of a Sensitive Nature

The interview protocols for Principal Investigators (and Co-PIs), administrators, faculty, and project staff request information on the participation of females, students with disabilities, and minorities (especially groups underrepresented in undergraduate computing education), and on which groups, if any, are being targeted by the project. The student focus group protocol asks students about the degree of diversity in their project-related classes with respect to these same groups. No personal information is requested about any particular individual that could be used to identify that person, and reporting of the requested information is voluntary. Respondents may choose not to provide information that they feel is privileged. The interview protocol for external partners does not include any questions of a sensitive nature.


The basic data on these demographic groups are being collected because diversity in education is of key interest to NSF and to the entire Nation. This information will provide context in the study analyses and will be reported only in aggregate form.

A.12. Estimates of Response Burden

For this clearance request, the evaluation study relies on interviews with CPATH Principal Investigators (and Co-PIs), administrators, university faculty, project staff, external partners, and students. The interview protocols used in this data collection appear in Appendix A. This section provides estimates for only the site visit interviews. Estimates for the faculty survey will be provided in 2010, when that data collection is submitted for clearance.

A.12.1. Number of Respondents, Frequency of Response, and Annual Hour Burden

The data collection instruments requiring approval at this time are the six interview protocols. For all groups except Principal Investigators/Co-PIs, respondent burden consists of the time spent being interviewed at their sites, expected to average one hour. Principal Investigators will spend an estimated four hours, in addition to the interview time, working with their staff on: compiling lists of faculty, staff, administrators, students, and partners to be interviewed; helping to arrange interviews; gathering documents; and meeting with the site visitors. Respondents will not incur any equipment, postage, or travel costs. The interviews will be conducted once a year for four consecutive years, from 2010 to 2013.

The table below shows the annual number of respondents and the annual hour burden for the site visit interviews and other PI activities. The total number of individuals to be interviewed during site visits in 2010–2013 is estimated to be 176 annually. The annual burden for the interviews, across all six interview protocols, is estimated at 176 person-hours. In addition, the 12 PIs whose sites will be visited will spend an additional 48 hours (combined) on advance preparations for the site visit interviews and in meetings with contractor staff during the site visits. The annual burden is calculated by multiplying the number of anticipated respondents by the estimated response burden per person.
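Worked out explicitly, the two components of the annual burden combine as:

\[
\underbrace{176 \times 1}_{\text{interview hours}} \;+\; \underbrace{12 \times 4}_{\text{PI preparation hours}} \;=\; 176 + 48 \;=\; 224 \ \text{hours}
\]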

Annual Burden Hours by Instrument/Activity

Instrument/Activity (Years of Data Collection) | Respondent Type | Number of Respondents | Burden Hours Per Respondent | Annual Burden Hours
Site Visit Interview Protocols (2010, 2011, 2012, 2013) | CPATH Principal Investigators/Co-PIs, administrators, faculty, project staff, external partners, students | 176 | 1 | 176
Preparation for Site Visits; On-Site Meetings (2010, 2011, 2012, 2013) | CPATH Principal Investigators | 12 (also included above) | 4 | 48

A.12.2. Hour Burden Estimates by Each Form and Aggregate Hour Burdens

Section A.12.1 above gives estimates of the combined annual burden for the six site visit interview protocols. This section provides burden estimates for each of the six protocols. The annual burden for each of the data collection instruments is calculated by multiplying the number of anticipated respondents by the estimated response burden per person. The time estimates are based on prior experience with interviews of this nature.


Annual Burden Hours by Respondent Type

Respondent Type | Number of Respondents | Burden Hours Per Respondent | Annual Burden Hours
CPATH Principal Investigators | 12 | 5 | 60
University Administrators | 20 | 1 | 20
CPATH Faculty | 32 | 1 | 32
CPATH Project Staff | 20 | 1 | 20
CPATH External Partners | 20 | 1 | 20
CPATH Students | 72 | 1 | 72
TOTAL, All Interviewees | 176 | 5 for 12 PIs; 1 for 164 others | 224


In FYs 2007–2009, NSF made awards to 96 institutions for 69 CPATH projects. Site visits will occur annually over four years, with the first series to begin in March 2010. For the entire 4-year duration of this data collection activity, the total number of respondents is estimated to be 704, with a total burden of 896 hours.
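These 4-year totals follow directly from the annual figures:

\[
176 \ \text{respondents/year} \times 4 \ \text{years} = 704, \qquad 224 \ \text{hours/year} \times 4 \ \text{years} = 896
\]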


Interviews will be conducted at 10–12 sites each year. Sites will be selected on the basis of three of the project types used to classify awards in FY 2007 and FY 2008 (community building; institutional transformation; evaluation, adoption, and extension); project maturity; models and approaches used (new curricular or pedagogical development, supporting communities of practice, other); Carnegie classification of institution; and geographic location. Interviews will be conducted with a variety of key informants, including Principal Investigators/Co-PIs, university administrators, faculty, project staff, external partners, and students. With the possible exception of university administrators, participants will be actively involved in the site’s CPATH project or very knowledgeable about it. The PI(s) for each project will work with the site visitors to identify a representative number of individuals from the various groups to be interviewed.


Approximately 14 participants will be interviewed at each site, for an annual total of 176 and an estimated annual burden of 224 hours. The number of respondents in each group is approximate; the final distribution across groups, and the total number, may vary across CPATH awardees as needed to document the implementation of each awardee in the sample.


Principal Investigators: The PI of each site will be interviewed. The total number to be interviewed annually is estimated at 12 (1 PI per site). It is expected that 100% of these individuals will be interviewed, as the meetings will be pre-arranged. Each interview will take about 1 hour. The 12 primary PIs will also be responsible for helping to arrange interviews, gathering documents, and meeting with contractor staff on-site, for an estimated 4 additional hours per PI. (SRI will select the faculty to be interviewed based on participant lists and annual reports.) The total respondent burden per primary PI is thus estimated at 5 hours, and the annual burden for the 12 primary PIs is estimated at 60 hours.


Administrators: The total number of administrators to be interviewed annually is estimated at 20 (1.5 per site). It is expected that 100% of these individuals will be interviewed as the meetings will be pre-arranged. Each interview will take about 1 hour. The total annual burden for this group is estimated at 20 hours.


Faculty: The total number of faculty members to be interviewed annually is estimated at 32 (approximately 3 per site). It is expected that 100% of these individuals will be interviewed as the meetings will be pre-arranged. Each interview will take about 1 hour. The total annual burden for this group is estimated at 32 hours.


Project Staff: The total number of project staff to be interviewed annually is estimated at 20 (1.5 per site). It is expected that 100% of these individuals will be interviewed as the meetings will be pre-arranged. Each interview will take about 1 hour. The total annual burden for this group is estimated at 20 hours.


External Partners: The total number of external partners to be interviewed annually is estimated at 20 (1.5 per site). It is expected that 100% of these individuals will be interviewed as the meetings will be pre-arranged. Each interview will take about 1 hour. The total annual burden for this group is estimated at 20 hours.


Students (focus group): The total number of undergraduate students expected to participate in focus groups is 72 (6 per site). It is expected that 100% of these individuals will participate as the focus groups will be pre-arranged. Each focus group session will take about 1 hour. The total annual burden for this group is estimated at 72 hours.

A.12.3. Estimates of Annualized Cost to Respondents for the Hour Burdens

The table below gives the overall annual cost, based on labor burden, for all site visit interviewees, and also the annual cost for each type of respondent. The total annual cost for all interviewees is estimated to be $6,374.20. The cost for each type of respondent is calculated by multiplying the total annual burden hours by their average hourly rate.
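Each row of the table applies the same computation; for example, for the Principal Investigators:

\[
60 \ \text{hours} \times \$35.45/\text{hour} = \$2{,}127.00
\]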

Annual Cost to Site Visit Interviewees for Burden Hours, by Respondent Type

Respondent Type | Number of Respondents | Burden Hours Per Respondent | Total Annual Burden Hours | Average Hourly Rate | Estimated Total Annual Costs
CPATH Principal Investigators | 12 | 5 | 60 | $35.45 | $2,127.00
University Administrators | 20 | 1 | 20 | $71.00 | $1,420.00
CPATH Faculty | 32 | 1 | 32 | $35.45 | $1,134.40
CPATH Project Staff | 20 | 1 | 20 | $23.09 | $461.80
CPATH External Partners | 20 | 1 | 20 | $35.45 | $709.00
CPATH Students (undergraduates) | 72 | 1 | 72 | $7.25 | $522.00
TOTAL, All Interviewees | 176 | 5 for 12 PIs; 1 for 164 others | 224 | $7.25–$71.00 | $6,374.20


The estimated hourly rate for PIs, Co-PIs, and faculty is based on national median salaries for associate professors in computer and information sciences, education, engineering, engineering technologies, and mathematics and statistics. The average of the median salaries for these five fields is $73,740. Divided by the 2,080 hours in a standard work year, this yields an average hourly rate of $35.45. The source of this information is the 2008/2009 National Faculty Salary Survey, conducted by the College and University Professional Association for Human Resources (CUPA-HR), www.higheredjobs.com/salary.
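All of the hourly rates in this section are derived in the same way:

\[
\text{hourly rate} = \frac{\text{average median annual salary}}{2{,}080 \ \text{hours/year}}, \qquad \text{e.g., } \frac{\$73{,}740}{2{,}080} \approx \$35.45
\]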


The faculty rate is also used for the external partners to be interviewed. Because partners include individuals from a variety of sectors (e.g., business, K–12 schools), there is likely to be a wide range of actual salaries. The faculty rate is considered to be a reasonable proxy for a mix of partner rates.


The rate for university administrators is based on national median salaries in higher education for the job titles of CEO, single institution (e.g., President); Provost; and Deans of Arts and Sciences, Computer and Information Sciences, Education, Sciences, and Undergraduate Programs. The average of the median salaries for these seven positions is $147,692. Divided by the 2,080 hours in a standard work year, this yields an average hourly rate of $71.00. The source of this information is the 2008/2009 Administrative Compensation Survey, conducted by the College and University Professional Association for Human Resources (CUPA-HR), www.higheredjobs.com/salary.


The rate for project staff is based on national median salaries for the job titles of research computer specialist, research assistant, and senior research assistant in the natural/physical sciences. The average of the median salaries for these three job titles is $48,029. Divided by the 2,080 hours in a standard work year, this yields an average hourly rate of $23.09. The source of this information is the 2008/2009 Mid-Level Administrative and Professional Salary Survey, conducted by the College and University Professional Association for Human Resources (CUPA-HR), www.higheredjobs.com/salary.


The hourly rate for undergraduate students is based on minimum wage information effective July 24, 2009, obtained from the U.S. Department of Labor at http://www.dol.gov/dol/topic/wages/minimumwage.htm.

A.13. Estimate of Annualized Capital and Maintenance Costs to Respondents

There are no respondent costs associated with these data collections beyond those included in the estimates presented in Section A.12.

A.14. Estimates of Annualized Costs to the Federal Government

The estimated total cost to the Federal government of all data collection, analysis, and reporting activities associated with the CPATH program evaluation is $2,066,429. The average annual cost to the Federal government is estimated at $413,286. The CPATH contract period covers five years, from FY 2009 to FY 2013. Data collection will take place in the last four years. Site visit interviews will be conducted in 2010, 2011, 2012, and 2013. Faculty surveys will be fielded in 2010 and 2012. Site visit briefing reports, summaries of survey results, and annual reports will be drafted over the course of the project. The final report will be delivered in September 2013.


The table below breaks down the total cost of the CPATH evaluation to the Federal government by labor vs. other direct costs. Figures are given for the overall 5-year period of the contract and for an average year. The average annual cost was obtained by dividing the total cost by 5. These estimates are based on actual figures for the first year and modified contractual amounts for the second year. The projected budgets for the final three years are expected to be about the same as the contractual amounts for the second year.
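As a check on the table below, the average annual total is:

\[
\$2{,}066{,}429 \div 5 = \$413{,}285.80 \approx \$413{,}286
\]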


Estimated Total and Annual Cost to the Federal Government


Cost Category | Total Cost for 5 Years | Average Annual Cost
Labor | $1,886,119 | $377,224
Other Direct Costs* | $180,310 | $36,062
Total, All Costs | $2,066,429 | $413,286

* Includes local and non-local travel, materials and supplies, report production, telephone and fax communications, shipping, support costs, and G&A on support costs.

A.15. Changes in Burden

There are no changes in burden as this is the initial request for clearance.

A.16. Schedule and Plans for Data Collection and Reports

Timeline for Data Collection, Analysis, and Report Writing. The evaluation of the CPATH program is being conducted over the course of five fiscal years, FY 2009 through FY 2013. Work on this evaluation began in late 2008 with a review of existing CPATH project reports and project evaluations, and development of a logic model and evaluation plan to guide the overall evaluation. Data collection will start in March–May 2010 with visits to CPATH project sites to interview Principal Investigators/Co-PIs, administrators, faculty, project staff, external partners, and students. Additional site visit interviews will be conducted annually in January–May of 2011–2013.


Analysis will be ongoing from the beginning of data collection in Spring 2010 through September 2013, when the final report will be delivered to NSF. Briefing reports on the site visits will be prepared in May 2010–2013. In August–September of 2010 and 2012, a faculty survey will be administered. Summaries of the survey results will be prepared in November 2010 and 2012.


Annual reports will synthesize the findings to date from all document reviews and data collections relative to the evaluation questions. The reports in the first two years will focus on identifying trends and patterns of implementation across CPATH awards, implementation challenges, levels of participation, the extent to which new models are developing, and how community building is occurring throughout the field of computing education. These early reports will highlight new models for advancing computational thinking and identify promising breakthrough interventions. Reports in the final three years will focus on examining the accumulation of evidence on program effectiveness.


The schedule for CPATH data collections and reports is presented in the table below.


Schedule of CPATH Data Collections and Reports

Project Activity | Time Frame
Site Visit Interviews: Conduct site visits | January–May 2010, 2011, 2012, 2013
Site Visit Interviews: Prepare briefing report on site visits | May 2010, 2011, 2012, 2013
Faculty Survey: Administer survey | August–September 2010, 2012
Faculty Survey: Prepare summary of survey results | November 2010, 2012
Annual Reports: Draft report, revise after NSF review, submit revised report to NSF | January–August, 2009–2013
Final Report: Deliver final version of last annual report reflecting all accumulated evidence | September 2013


Publications. Like many agencies, NSF is reducing its reliance on traditional publication methods and formats. SRI International, the contractor for this evaluation, is contractually forbidden from publishing results unless NSF makes a specific exception. In short, all products of the collections are the property of NSF. After the products are delivered, NSF determines whether their quality warrants verbatim publication by NSF; that is, NSF is the exclusive publisher of the information being gathered. Often it is only after seeing the quality of the information delivered by the study that NSF decides the format (raw or analytical) and manner (in the NSF-numbered product Online Document System (ODS) or simply a page on the NSF Web site) in which to publish.


NSF’s plans for publishing any reports from the CPATH evaluation are undetermined at this time.

A.17. Approval to Not Display Expiration Date

Not applicable.

A.18. Exceptions to Item 19 of OMB Form 83-I

No exceptions apply.




