TABLE OF CONTENTS
A.1. Circumstances That Make the Collection of Information Necessary
A.2. Purpose and Use of Information Collection
A.3. Use of Improved Information Technology and Burden Reduction
A.4. Efforts to Identify Duplication
A.5. Impact on Small Businesses or Other Small Entities
A.6. Consequences if Information Collected Less Frequently
A.7. Consistency with Guidelines of 5 CFR 1320.8(d)
A.8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency
A.9. Explanation of any Payment or Gift to Respondents
A.10. Assurance of Confidentiality Provided to Respondents
A.11. Justification for Sensitive Questions
A.12. Estimates of Annualized Hour Burden and Costs
A.13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers
A.14. Estimates of Annualized Costs to the Federal Government
A.15. Explanation for Program Changes or Adjustments
A.16. Plans for Tabulation and Publication and Project Time Schedule
LIST OF TABLES
Exhibit 1: Key Evaluation Questions
Exhibit 2: Estimated Burden Hours
Exhibit 3: Timetable for Data Collection and Publication for Other Data Collection Efforts
ATTACHMENTS
Attachment 1: Feasibility Study Site Visit Script
Attachment 2: School Principal Feasibility Study Site Visit Protocol
Attachment 3: Internal Coach Feasibility Study Site Visit Protocol
Attachment 4: Teacher Feasibility Study Site Visit Protocol
Attachment 5: AmeriCorps Member Feasibility Study Site Visit Protocol
Attachment 6: Process Evaluation Site Visit Script
Attachment 7: Initial Letters Informing Schools of Selection
Attachment 8: School Principal Process Evaluation Site Visit Protocol
Attachment 9: Internal Coach Process Evaluation Site Visit Protocol
Attachment 10: Teacher Process Evaluation Site Visit Protocol
Attachment 11: Teacher Focus Group Process Evaluation Site Visit Protocol
Attachment 12: AmeriCorps Member Process Evaluation Site Visit Protocol
Attachment 13: Email Invitation to Complete the MRC Baseline Member Survey
Attachment 14: MRC Baseline Member Survey Consent Form
Attachment 15: MRC Baseline Member Survey
Office of Strategy and Special Initiatives
Corporation for National and Community Service
Process and Impact Evaluation of
the Minnesota Reading Corps (MRC)
Supporting Statement
A.1. Circumstances That Make the Collection of Information Necessary
The Corporation for National & Community Service (CNCS) is requesting Office of Management and Budget (OMB) approval for data collection associated with the Process and Impact Evaluation of the Minnesota Reading Corps (MRC) Program. The MRC was started in 2003 to provide reading and literacy tutoring to children in preschool programs, and in 2006 the program was expanded to serve students in kindergarten through third grade. The goal of the MRC is to ensure that students become successful readers and meet reading proficiency targets by the end of the third grade. The core activities of the program are training, placing, and monitoring AmeriCorps members who serve as literacy tutors in school-based settings, implementing research-based early-literacy strategies. AmeriCorps members are monitored at schools through observations made by an Internal Coach. The MRC is one of the largest education-focused AmeriCorps grantees funded by CNCS.
The process and impact evaluation of the MRC consists of three phases: Phase I, a feasibility study; Phase II, a process assessment; and Phase III, a future impact evaluation. The feasibility phase will explore options for designing a future random assignment impact evaluation of the MRC that assesses the impact of the program on students' literacy levels. The purpose of the process assessment is to thoroughly understand the MRC model, assess the effects of serving as an AmeriCorps member in MRC, and capture lessons learned for future program replication. The primary goal of the future impact evaluation will be to determine whether the MRC is having an impact, using a rigorous randomized controlled trial (RCT); the secondary goal is to understand why. This package addresses the first two phases of the project, which include the following data collection instruments to be reviewed:
Feasibility Interview Protocols: Site visits to 75 sampled MRC programs (increased from 50 in the original notice; see Section A.15) to assess their suitability for participation in the later impact evaluation and to inform them about its implementation. The selected sites will be evaluated for participation in the random assignment study to take place in Phase III;
Process Assessment Interview Protocols and Focus Group Guide: Site visits to up to 20 MRC programs to better understand variations in the MRC model and how to replicate the program in other locations; and
Baseline Member Applicant Survey: A baseline survey of new applicants to the 2012-13 MRC program. Approximately 2,000 subjects will be asked to participate in the survey, of whom approximately 1,000 will be MRC members (treatment group) and 1,000 will be individuals selected by the program as alternates (comparison group).
The third phase of the project, the Impact Evaluation, will be funded in FY2013 based on the findings from the current feasibility study.
A.2. Purpose and Use of Information Collection
CNCS anticipates that this study will not only examine whether the MRC is having an impact on service recipients (students) and what the effects of the program are for AmeriCorps members who serve as literacy tutors, but also provide stakeholders with an understanding of the extent to which trained volunteer tutoring programs can effectively provide reading assistance to students at low cost to school districts.
CNCS is relying on the study findings to assist with future funding decisions around other education-focused AmeriCorps programs. The feasibility site visits will provide essential information for designing a later impact evaluation that assesses the impact of the program on students' literacy levels and for planning the random assignment process in the 2012-13 school year. In addition, the process site visits will collect necessary information on the processes and procedures of the MRC to inform the findings on the program's impact on students and its possible effects on AmeriCorps members. The baseline and follow-up member surveys will measure changes in AmeriCorps members' educational goals and civic engagement.
In addition to assessing the suitability of sites for participation in the future random assignment impact evaluation, other goals of the feasibility study are to better understand the MRC program model and the alternative supplemental assistance available to students (i.e., the counterfactual); obtain stakeholder buy-in; and provide design recommendations for an impact study to be funded in a future project phase. The feasibility site visits will include on-site interviews with the internal coach, principal, teachers, and AmeriCorps members at each selected site (we anticipate visiting up to 75 sites). The goal of the interviews is to understand and identify program responsibilities, staffing and management responsibilities, the number and type of students being served, alternate programs offered at the school (the counterfactual), MRC program implementation issues, organizational support, data capabilities, future plans, and program concerns around the randomization process.
The process assessment site visits will include interviews with a similar group of program and school staff (the internal coach, principal, teachers, and AmeriCorps members), and possibly a teacher focus group, at 20 selected sites. The goal of the interviews and focus group is to understand the history of the program, including its implementation at the school; the staffing and management structure of the program; the AmeriCorps member selection process; training and orientation (AmeriCorps members only); responsibilities; interventions with students; alternate programs offered at the school; organizational support; community engagement; facilitators of and barriers to program implementation; and results and lessons learned.
The process assessment also involves the administration of a baseline member survey to two groups of applicants to the MRC program: approximately 1,000 new applicants selected to serve as AmeriCorps members (treatment) for the 2012-13 school year and up to 1,000 new applicants identified as possible alternates (comparison). Applicants who previously served with MRC will not be eligible to take the survey. The survey includes questions on the applicant’s background before applying to the MRC program, reasons for applying to the program, the applicant’s interests and values as they relate to tutoring, and what the applicant thinks he/she might do after the program. A follow-up survey will be administered to the same groups of program applicants as part of a future round of funding.
The contractor will achieve these objectives by answering the key research questions provided in Exhibit 1. Research questions for the later impact evaluation will be finalized based on findings and recommendations resulting from the feasibility study phase.
Exhibit 1: Key Evaluation Questions
Feasibility Study
Is the MRC (or a similar reading tutoring program) reaching its intended service recipients (e.g., by demographics, geographic location, educational needs)? Are there populations that are underrepresented? What are the characteristics of service recipients?
Are service recipients receiving the full extent of the program's intended dosage? Are service recipients receiving the intended program components?
What are the characteristics of AmeriCorps members? Are AmeriCorps members receiving the intended program components?
How similar is the MRC to other reading tutoring models and/or programs?
What is the program theory of change? What are the immediate, intermediate, and long-term outcomes that the program(s) are attempting to achieve? Is there fidelity to the program model? If fidelity does not exist, what processes and policies can be implemented to improve fidelity?
Is the program attaining its performance goals? Are the performance goals appropriate for the program?
Does the reading tutoring program have adequate data collection history and resources? Is there a commitment to evaluation and using evaluation findings?
Is the MRC (or a similar reading tutoring program) 'ready' for a rigorous impact evaluation? Is the program sufficiently mature, and does it have a sufficient sample size?
What are the design options for the impact evaluation?
Process Assessment
How is the MRC (or a similar reading tutoring program) reaching its intended service recipients (e.g., by demographics, geographic location, educational needs)?
Are there characteristics of AmeriCorps members that are particularly effective with service recipients?
Are AmeriCorps members receiving the appropriate training and supervision? Are there characteristics of members or service recipients that make them more prone to drop out of the program? What is the effect of member training and supervision on tutee outcomes?
What is the effect of the program on AmeriCorps members? Are members experiencing transformation, are they more civically engaged, are they more likely to go into education-related careers post program, and are they satisfied with their service experience?
Is the program attaining its performance goals? Are the performance goals appropriate for the program?
How is the program achieving its immediate, intermediate, and long-term outcomes? How does the program's design and administration lead to the achievement of these target outcomes?
Which findings and lessons learned from the MRC can be applied to other models and/or programs? Are there characteristics that are suitable for similar reading tutoring programs to replicate?
A.3. Use of Improved Information Technology and Burden Reduction
The baseline member applicant survey will rely on data gathered from a self-administered, Web-based survey of applicants to the MRC program. (Please see our sampling plan in Section B.1 for more detail.) We know that individual respondents will have access to, and be familiar with, the necessary technology to complete the survey because they had to apply to the MRC program online. The survey will be administered electronically to minimize the burden on respondents. The Web-based survey permits respondents to complete the survey at their preferred time, and respondents who begin the survey but are unable to complete it in one attempt will be able to save their responses and resume work on the survey later. The Web-based format will incorporate skip patterns that ensure respondents automatically skip past sections of the survey that are not relevant to their experiences. The study will have a centralized case management system (CMS) linked to the Web survey, as well as prompting and receipt control systems, which will allow for review of case status at any time. The CMS will allow for effective follow-up with nonrespondents, including ensuring that no sample member is prompted for a survey response once they have completed the Web survey. All MRC applicants will be emailed an invitation letter with Web survey access, including a unique Personal Identification Number (PIN) and password. This initial contact will be followed up with additional emails encouraging participation. If necessary, follow-up phone calls may be used to encourage participation when email prompts fail.
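To illustrate the survey mechanics described above, the following is a minimal sketch in Python of how skip patterns and save/resume behavior might work; the question identifiers, question text, and in-memory stores are hypothetical assumptions for illustration, not the contractor's actual system.

```python
# A minimal sketch (not the contractor's actual system) of two survey
# features described above: skip patterns and save/resume. Question ids,
# question text, and the in-memory stores are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Question:
    qid: str
    text: str
    # Maps an answer value to the question id to jump to; answers not
    # listed fall through to the next question in sequence.
    skip_to: dict = field(default_factory=dict)

SURVEY = {
    "q1": Question("q1", "Have you tutored before?", {"no": "q3"}),
    "q2": Question("q2", "In what setting did you tutor?"),
    "q3": Question("q3", "Why did you apply to the MRC?"),
}
ORDER = ["q1", "q2", "q3"]

def next_question(current: str, answer: str) -> str | None:
    """Apply the skip pattern, else fall through to the next question."""
    q = SURVEY[current]
    if answer in q.skip_to:
        return q.skip_to[answer]
    idx = ORDER.index(current)
    return ORDER[idx + 1] if idx + 1 < len(ORDER) else None

# Save/resume: partial responses are persisted under the respondent's PIN
# so a respondent can close the browser and pick up where they left off.
SESSIONS: dict[str, dict] = {}

def save_progress(pin: str, qid: str, answer: str) -> None:
    SESSIONS.setdefault(pin, {})[qid] = answer

def resume_point(pin: str) -> str:
    """Walk the answered questions and return the first unanswered one."""
    answered = SESSIONS.get(pin, {})
    cur = ORDER[0]
    while cur in answered:
        nxt = next_question(cur, answered[cur])
        if nxt is None:
            return cur  # survey complete
        cur = nxt
    return cur

save_progress("PIN123", "q1", "no")
print(resume_point("PIN123"))  # -> "q3": the skip pattern bypasses q2
```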
A.4. Efforts to Identify Duplication
The information necessary for this evaluation has not been collected elsewhere in any format that could be adapted to address the research objectives of the evaluation. Based on our thorough review of existing information, including a project literature review and field assessment, no survey or other mode of data collection has captured the needed information on the MRC program's processes and procedures or on the program's effects on AmeriCorps members' educational goals and civic engagement, nor is this information available elsewhere. No other data are currently being collected to answer these specific research questions; however, any existing information that is useful for the research questions will be used whenever possible. The information sought as part of this study is unique and will set the framework for the future impact evaluation.
A.5. Impact on Small Businesses or Other Small Entities
No small businesses are involved, as respondents are all MRC applicants, principals, AmeriCorps members, teachers, or coaches.
A.6. Consequences if Information Collected Less Frequently
Both the feasibility and process site visits and the member survey will be administered a single time. Some items on the baseline member survey may be repeated in the later follow-up survey to compare changes experienced by AmeriCorps members before and after program participation. This information is not currently being collected in any other form, making the current data collection request necessary for achieving the goals of the evaluation.
A.7. Consistency with Guidelines of 5 CFR 1320.8(d)
This data collection request is fully consistent with the guidelines in 5 CFR 1320.8(d). There are no special circumstances required for the collection of information in this data collection.
A.8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency
In accordance with the Paperwork Reduction Act of 1995, the notice required in 5 CFR 1320.8(d) has been published in the Federal Register announcing CNCS' intention to request an OMB review of data collection activities. This notice was published on Friday, March 9, 2012, in volume 77, number 47, on pages 14353 and 14354, and provided a 60-day period for public comment. No comments were received during the 60-day period.
The feasibility interview protocols and process assessment interview protocols and focus group guide were developed by CNCS and its contractor, NORC at the University of Chicago. Input and feedback on the protocol instruments and guide were sought from AmeriCorps program staff, the project’s Technical Working Group (TWG) and ServeMinnesota (the commission that oversees the operations of the MRC program) as part of the development process. The group of technical experts reviewed the protocols and provided feedback and suggestions to CNCS' contractor. Meetings were held between CNCS, its contractor, and ServeMinnesota to discuss and revise the interview protocols and focus group guide.
The survey instrument was also developed by CNCS and its contractor. Input and feedback on the survey instrument were likewise obtained from AmeriCorps program staff, the project's TWG, and ServeMinnesota. In addition, the instrument was pretested with a small group of five AmeriCorps members currently serving with MRC.
Since December 2011, CNCS has consulted with the following persons regarding this information collection:
Dr. Robert Boruch, University of Pennsylvania
Dana M. Stein, Civic Works, Inc.
Dr. Robert LaLonde, University of Chicago
Dr. Christopher Hulleman, James Madison University
Dr. Matthew Stagner, Chapin Hall at the University of Chicago
Dr. Elizabeth Albro, Institute of Education Sciences
Dr. Terry A. Ackerman, University of North Carolina at Greensboro
Dr. Ann Casey, Minneapolis Public Schools, Minnesota
Kerry Bollman, Saint Croix River Education District, Minnesota
Audrey Suker, ServeMinnesota
Sadie O'Connor, ServeMinnesota
A.9. Explanation of any Payment or Gift to Respondents
No payment or gift will be offered to respondents for their participation in the data collection.
A.10. Assurance of Confidentiality Provided to Respondents
Participation in this study is voluntary. Respondents will be told the purposes for which the information is collected and assured that any identifiable information about them will not be used or disclosed for any other purpose.
MRC will first contact and notify the principal and internal coach at the schools selected for the feasibility and process assessment site visits. The project team will then contact the internal coach, who serves as the program coordinator, by phone to schedule the site visit and individual interviews. Verbal consent will be obtained from the internal coach, principal, teachers, and AmeriCorps members prior to beginning the face-to-face interviews and any teacher focus groups for the feasibility and process evaluation site visits. Consent will be sought once from each individual, and the project team member leading the discussion will be responsible for seeking it after reading the informed consent script to all interviewees and focus group participants. The consent script provides a brief overview of the project; informs the respondent that the interview is completely voluntary and that they can skip any question or terminate the interview at any point; and explains that the information provided to the contractor will be summarized in a report in which no individual respondent's name will be identified. If the subject consents to participate in the discussion, the interview will proceed; if the subject declines, the interview will be terminated.
A select group of subjects who apply to be MRC tutors will be eligible to complete the baseline member survey. MRC will provide the contractor with a list of all MRC applicants selected to be tutors and alternates (estimated to be about 2,000 persons). The applicants will be contacted by email in July 2012 with a request to complete the survey. Respondents will click on a survey link in the email and will be required to enter a unique user ID and password. Survey participants will first see a screen that provides a brief overview of the study, informs participants about confidentiality and privacy, requests their voluntary participation, and provides a frequently-asked-questions link, a toll-free telephone number, and an email address to use if participants have any questions about the survey. By clicking a button at the bottom of the consent screen, the survey participant provides voluntary consent to participate in the survey.
The data collection plan and instruments have been reviewed and approved by the Institutional Review Board of CNCS's contractor, NORC at the University of Chicago. Data collection procedures will incorporate numerous safeguards for the data. While collecting data, information that could identify a particular sample member will be stored in a separate file from survey data collected from that person. Each sample member will be assigned a unique identifier, and this identifier will be used to store identifying information (such as name, address, etc.) in a separate database from the survey response data.
The contractor will collect names, addresses, dates of birth, telephone numbers, and email addresses as part of the member survey. This information will be used to contact respondents for a follow-up study. With regard to confidentiality, responses will be de-identified and identified by ID number only. The survey data will be tabulated and analyzed statistically with no individual names or responses ever identified. Names, addresses, dates of birth, telephone numbers, and email addresses will be retained in a secure location on the server, available only to project staff, to be used as part of the future follow-up survey for the impact evaluation or to be securely provided to another contractor, if one is selected. Data will be coded such that obvious identifiers are replaced with a unique identifying number. The contractor will retain a master list linking study codes and direct identifiers. The master list will be saved on the contractor's secure servers and then on the contractor's archive server. All systems used to store electronic survey data are secure by design and protected by passwords available only to authorized study staff.
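As a concrete illustration of this separation of identifiers from response data, below is a hedged Python sketch of the coding scheme described above; the field names, storage structures, and the enroll/record functions are illustrative assumptions rather than the contractor's actual implementation.

```python
# Hedged sketch of the de-identification scheme described above: direct
# identifiers are replaced with a unique study ID, the identifying
# information and the survey responses live in separate stores, and only
# a master crosswalk links them. All names and structures are assumptions.
import secrets

pii_store: dict[str, dict] = {}       # study_id -> identifying info
response_store: dict[str, dict] = {}  # study_id -> survey answers only
crosswalk: dict[str, str] = {}        # email -> study_id (master list)

def enroll(applicant: dict) -> str:
    """Assign a unique study ID and split PII from analysis data."""
    study_id = secrets.token_hex(8)
    crosswalk[applicant["email"]] = study_id
    pii_store[study_id] = {k: applicant[k]
                           for k in ("name", "email", "phone", "dob")}
    return study_id

def record_response(study_id: str, answers: dict) -> None:
    # Analysts see only the study ID, never the direct identifiers.
    response_store[study_id] = answers

sid = enroll({"name": "Jane Doe", "email": "jd@example.org",
              "phone": "555-0100", "dob": "1990-01-01"})
record_response(sid, {"q1": "yes", "q2": "school"})
print(response_store[sid])  # de-identified analysis record
```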
Special steps will be taken to ensure that data collected via the Web questionnaire are secure. First, access to the Web instrument is allowed only with a valid Personal Identification Number (PIN) and password. Second, data will be transmitted using the Secure Sockets Layer (SSL) protocol, which applies strong encryption during transmission over the Internet. If a respondent keeps a Web survey open without any activity, the Web server will close the survey after a short period of inactivity, preserving the data up to the break-off point and securely closing the connection. Both development and production servers are backed up nightly.
CNCS and its contractor will publish aggregate statistics of the survey responses in a report, along with information obtained during the site visits. Individual respondents will not be identified in any report, publication, or presentation of this study or its results. Upon completion of the project, the names, addresses, dates of birth, telephone numbers, and email addresses of the survey respondents will either be retained for the next phase of the study, the impact evaluation, or be turned over through a secure mechanism (such as a secure FTP site) if CNCS selects a different contractor for the impact evaluation. This information will be used to contact respondents for a follow-up survey. All telephone interview and site visit materials and notes will be destroyed upon completion of the project. No identifying information will be shared except in the case of such a secure transfer to a different contractor.
A.11. Justification for Sensitive Questions
The baseline member survey will ask applicants to self-identify their race and ethnicity using the federally approved questions from the U.S. Census. These questions are necessary in order to conduct subgroup analyses and to understand how applicant characteristics relate to participation in tutoring programs, civic engagement, and the pursuit of education-related careers.
No other questions of a sensitive nature are asked during the site visit interviews, focus group, or baseline member survey.
A.12. Estimates of Annualized Hour Burden and Costs
With regard to the feasibility site visits, the contractor will interview one principal, one internal coach, up to three teachers, and up to four AmeriCorps members at approximately 75 sites, for a total of up to 675 individuals. For the process assessment site visits, the contractor will interview one principal, one internal coach, up to five teachers (to be interviewed individually or in a focus group), and up to four AmeriCorps members at approximately 20 sites, for a total of up to 220 individuals. If a focus group takes place with teachers, it will be conducted instead of individual interviews and is estimated to take about 45 minutes, the same amount of time as the individual interviews.
CNCS estimates that it will contact approximately 2,000 AmeriCorps member applicants to the Minnesota Reading Corps program to complete the Web-based member baseline survey. Of this number, we anticipate obtaining responses from 80 percent of the sample (1,600 respondents). We consider this response rate a reasonable estimate because the survey takes a short amount of time to complete (20 minutes, based on a pretest conducted with five current AmeriCorps members serving in the MRC program), respondents are well-immersed in the use of email and the Web, and there is a high level of enthusiasm among applicants for the MRC program. If a lower than expected response rate results, we will conduct non-response bias tests to determine whether any bias resulted from the lower response rate. If these tests provide evidence of bias, we will adjust our results with weight adjustments and/or response imputation.
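The weight adjustments mentioned above could take several forms; one common approach is a weighting-class nonresponse adjustment, sketched below in Python under illustrative assumptions (the cells, counts, and base weights are hypothetical, and the actual adjustment method would be chosen after the non-response bias tests).

```python
# Illustrative weighting-class nonresponse adjustment: within each
# adjustment cell (here, treatment vs. comparison), respondents' base
# weights are inflated by the inverse of the cell's response rate so
# respondents also represent nonrespondents. Data are hypothetical.
from collections import defaultdict

sample = [
    # (study_id, cell, responded)
    ("a1", "treatment", True), ("a2", "treatment", False),
    ("a3", "comparison", True), ("a4", "comparison", True),
    ("a5", "comparison", False),
]

invited = defaultdict(int)
responded = defaultdict(int)
for _, cell, ok in sample:
    invited[cell] += 1
    responded[cell] += ok

weights = {}
for sid, cell, ok in sample:
    if ok:
        # base weight 1.0 times the inverse response rate in the cell
        weights[sid] = 1.0 * invited[cell] / responded[cell]

print(weights)  # {'a1': 2.0, 'a3': 1.5, 'a4': 1.5}
```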
Exhibit 2 presents estimates of the reporting burden for respondents. The average burden per response is approximately 45 minutes. The total cost to respondents for this burden is estimated to be $0 because the site visit interviews will be conducted during professional hours as part of respondents' usual job responsibilities.
A.13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers
There are no annualized capital/startup or ongoing operation and maintenance costs involved in collecting the information. Other than their time to complete the survey or interview, which is estimated in Exhibit 2, there are no direct monetary costs to respondents.
A.14. Estimates of Annualized Costs to the Federal Government
The estimated cost to the Federal Government for the Process and Impact Evaluation of the Minnesota Reading Corps (MRC) data collection activities is $526,000. This is the cost to our Federal contractor, NORC at the University of Chicago, for data collection activities associated with this submission.
A.15. Explanation for Program Changes or Adjustments
The original burden estimate contained two errors on the part of CNCS. The number of respondents at process assessment sites was calculated in error assuming one respondent per site (1 X 20 = 20). This figure should have been calculated assuming up to 11 respondents per site (11 X 20 = 220). The second error involved the number of feasibility site visits, which was 50 in the original notice and is now 75, with a resulting increase in potential respondents from 450 to 675. The increase was suggested by a master coach at a Minnesota Reading Corps site, who raised the possibility that some sites may have difficulty scheduling times when all types of respondents (i.e., principals, coaches, teachers, and members) might be available. Raising the number of feasibility sites will accommodate such possible challenges.
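As a simple check, the corrected counts above imply the totals shown in Exhibit 2; the short Python sketch below reproduces those figures from the stated per-site respondent counts and the 45-minute average burden used in the exhibit's total row.

```python
# Arithmetic check of the burden totals in Exhibit 2, using the counts
# stated in the text: 75 feasibility sites x up to 9 respondents, 20
# process sites x up to 11 respondents, 2,000 survey invitees, and a
# 45-minute (0.75-hour) average burden applied across all responses.
survey_respondents = 2_000
feasibility = 75 * (1 + 1 + 3 + 4)  # principal, coach, teachers, members
process = 20 * (1 + 1 + 5 + 4)

total_respondents = survey_respondents + feasibility + process
avg_burden_hours = 45 / 60

print(feasibility, process, total_respondents)  # 675 220 2895
print(total_respondents * avg_burden_hours)     # 2171.25 hours
```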
Exhibit 2: Estimated Burden Hours

| Forms | Type of Respondent | Number of Respondents per Year | Number of Responses per Respondent | Average Burden Hours per Response | Total Burden Hours |
| Web-based Member Baseline Survey | AmeriCorps member applicants to the Minnesota Reading Corps program | 2,000 | 1 | 45/60 = 0.75 | 1,500 |
| Feasibility Study protocols for 75 site visits | Principals, coaches, teachers, and AmeriCorps members participating in the Minnesota Reading Corps program (1 principal, 1 coach, up to 3 teachers, and up to 4 AmeriCorps members per site) | 675 | 1 | 45/60 = 0.75 | 506.25 |
| Process Assessment protocols for 20 site visits | Principals, coaches, teachers, and AmeriCorps members participating in the Minnesota Reading Corps program (1 principal, 1 coach, up to 5 teachers, and up to 4 AmeriCorps members per site) | 220 | 1 | 45/60 = 0.75 | 165 |
| Total | | 2,895 | | 45/60 = 0.75 | 2,171.25 |
A.16. Plans for Tabulation and Publication and Project Time Schedule
An interim study report will be based on the findings from a cross-site analysis of the information obtained through the process assessment site visits, as well as other forms of information provided through program administrative data, a field assessment, and a literature review. The final report will include the following sections:
Executive Summary. The executive summary will be written so that it is useful as a stand-alone document for individuals who do not have time to review the entire report. It will highlight the objectives, key findings, and the implications of these findings for the MRC program and future evaluation efforts.
Methodology. This section will describe the methods used for developing, implementing, and analyzing the survey and site visits.
Key Issues and Findings. This section will discuss findings around each of the key research questions.
Conclusions. Conclusions will include recommendations and suggestions for future research and policy initiatives.
A separate summary memo will be provided after completing each site visit for the feasibility study and a final design report will be developed based on individual reviews and assessments of sites’ readiness to participate in the later random assignment study. The design report will provide similar information on methods and key findings. In addition, it will provide design recommendations for the later impact evaluation.
Data for the Web-based survey will be collected once, over a two-month period starting in July 2012, contingent on receiving OMB approval. Analysis will begin shortly after data are collected from the feasibility and process evaluation site visits and will result in the production of a design report in December 2012 upon completion of the analysis. Some basic frequencies and weighting will be produced from the baseline survey; however, the final analysis will not take place until data are collected in a later follow-up survey, to be administered in May 2013 at the end of the AmeriCorps members' service. Data from the baseline survey will be combined with information provided through the follow-up survey with the same respondents to provide evidence of possible effects of the program on AmeriCorps members.
Exhibit 3 provides the data collection and reporting schedule for the entire study.
Exhibit 3: Timetable for Data Collection and Publication for Other Data Collection Efforts

| Activity | Estimated Start Date | Estimated End Date |
| Develop Instruments for Data Collection and Site Visits | | |
| Develop AC member survey | December 2011 | February 2012 |
| Finalize supporting materials (FAQs, consent, etc.) | February 2012 | February 2012 |
| Obtain IRB approval | February 2012 | March 2012 |
| Obtain OMB approval | March 2012 | July 2012 |
| Develop draft site visit protocols (feasibility and process) | November 2011 | March 2012 |
| Develop Sampling Plan | | |
| Draft plan for Member Survey | February 2012 | March 2012 |
| Develop sampling frame of sites (schools) for feasibility study selection | January 2012 | March 2012 |
| Finalize selection of sites for process assessment site visits | February 2012 | March 2012 |
| Implement Data Collection | | |
| Program on-line Member Survey | March 2012 | July 2012 |
| Beta-test on-line Member Survey | June 2012 | July 2012 |
| Survey AC members prior to beginning service (baseline) | July 2012 | September 2012 |
| Develop contact scripts, training materials, and checklists for feasibility visits | February 2012 | March 2012 |
| Conduct site visits and recruitment for the feasibility study | July 2012 | August 2012 |
| Conduct site visits for the process assessment | September 2012 | October 2012 |
| Process Site Visits and Other Data | | |
| Interim Report | December 2012 | December 2012 |
| Feasibility Design Report | | |
| Site visit summaries | August 2012 | August 2012 |
| Final design report | September 2012 | September 2012 |
All data collection materials will display the OMB expiration date.
CNCS certifies that the collection of information encompassed by this request complies with 5 CFR 1320.9 and the related provisions of 5 CFR 1320.8(b)(3).