Feedback Survey IES FY19 RFA 84.305A

Generic Clearance for the Collection of Qualitative Feedback on Agency Service Delivery


OCTAE LINCS CLS Survey and IES Feedback Survey on FY 2019 Request for Applications (RFA)

OMB: 1880-0542


FY 2019 84.305A Survey – Education Research Grants

https://surveys.ies.ed.gov/?305A_FY2019


According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid OMB control number. The valid OMB control number for this information collection is 1880-0542. Public reporting burden for this collection of information is estimated to average 20 minutes per response, including time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. The obligation to respond to this collection is voluntary. If you have comments or concerns regarding the status of your individual submission of this survey, please contact Phill Gagné directly at: U.S. Department of Education, Institute of Education Sciences, 400 Maryland Ave. SW, PCP-4122, Washington, DC 20202. [Note: Please do not return the completed survey to this address.]


Thank you for participating in this survey. Your feedback is important in helping IES improve its grants program. This survey pertains to your experience reading the FY 2019 Request for Applications (RFA) for the Education Research Grants Program, 84.305A, and submitting an application in response to it. The RFA can be found online at https://ies.ed.gov/funding/pdf/2019_84305A.pdf.


If you need assistance completing this survey, please contact IES/NCER by email at NCER.Commissioner@ed.gov.


The password for this survey is 2019RFA.


Please enter the password to access this survey:

START




  1. Including the application(s) you submitted to the Education Research Grants (CFDA# 84.305A) FY 2019 Request for Applications (RFA), how many IES grant applications have you submitted as the Principal Investigator? (Count previous submissions of the same application as separate applications.)

  • 1

  • 2-3

  • 4+


  2. Have you previously been the PI or co-PI on a grant funded by IES?

  • Yes

  • No


  3. To which Topic(s) did you apply in response to the FY19 RFA?

  • Career and Technical Education

  • Cognition and Student Learning

  • Early Learning Programs and Policies

  • Education Leadership

  • Education Technology

  • Effective Teachers and Effective Teaching

  • English Learners

  • Improving Education Systems

  • Postsecondary and Adult Education

  • Reading and Writing

  • Science, Technology, Engineering and Mathematics (STEM) Education

  • Social and Behavioral Context for Academic Learning

  • Special Topic (i.e., Foreign Language Education or Social Studies)


  4. Rate the usefulness of the Topic descriptions for focusing your research idea.

  • Very Useful

  • Useful

  • Marginally Useful

  • Not Useful


  5. Did your project(s) seem to fit more than one Topic?

  • Yes

  • No

  • If #5 = Yes, then go to #6

  • If #5 = No, then go to #7


  6. What factors did you consider for your final decision about the Topic to which you should apply?


Text Box – Maximum 4,000 characters (about 500 words). Longer responses may be truncated.




  7. Rate the clarity of the information in the Requirements section of the RFA (i.e., Sample, Outcomes, and Setting) for the Topic(s) to which you applied.

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear

  • I did not read this section of the RFA


  8. Each RFA Topic section includes a description of Research Gaps. Rate the usefulness of these descriptions.

  • Very Useful

  • Useful

  • Marginally Useful

  • Not Useful

  • I did not read the Research Gaps


  9. Did you contact an IES program officer as you prepared your application(s) for the FY 2019 competition?

  • Yes

  • No

  • If #9 = Yes, then go to #10

  • If #9 = No, then go to #11


  10. For what reason(s) did you contact an IES program officer? (Please check all that apply.)

  • Question(s) about the suitability of the study for the Education Research Grants program

  • Question(s) about the Topics described in the RFA

  • Question(s) about the Goals described in the RFA

  • Question(s) about the budget for your proposed study

  • Question(s) about your eligibility to apply

  • Question(s) about the application process

  • Question(s) about the review process

  • Question(s) about resubmitting a previous application that was not funded

  • Other

  • If #10 = Other, then go to #12

  • If #10 = all other responses, then go to #13


  11. For what reason(s) did you not contact an IES program officer? (Please check all that apply.)

  • I found the information I needed in the RFA

  • I found the information I needed from a colleague

  • I did not feel the need to contact a program officer

  • I did not know I could contact a program officer

  • Other

  • If #11 = Other, then go to #12

  • If #11 = all other responses, then go to #13


  12. Please describe the other reason(s).


Text Box – Maximum 4,000 characters (about 500 words). Longer responses may be truncated.





  13. Rate the usefulness of the section on Changes in the FY19 RFA.

  • Very Useful

  • Useful

  • Marginally Useful

  • Not Useful

  • I did not read the section on Changes in the FY19 RFA


  14. The RFA provides definitions of Authentic Education Settings (pp. 4-5). What, if any, additional settings should IES consider for future RFAs?


Text Box – Maximum 4,000 characters (about 500 words). Longer responses may be truncated.





  15. Rate the clarity of the distinction between required student education outcomes (i.e., those which must be included) and optional student outcomes (i.e., those which may be included, if appropriate).

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear


  16. Rate the level of difficulty of locating important material in the RFA.

  • Not at all Difficult

  • Somewhat Difficult

  • Difficult

  • Very Difficult

  • If #16 = Not at all Difficult, then go to #19

  • If #16 = All other responses, then go to #17


  17. What information did you have difficulty locating in the RFA? (Please check all that apply.)

  • Changes to the RFA

  • Topic requirements

  • Goal requirements

  • Budget limits

  • Grant duration limits

  • Eligibility criteria

  • Authentic education setting requirements

  • Student education outcome requirements

  • Dissemination plan requirements

  • Data management plan requirements

  • How to prepare biosketches for senior/key personnel

  • How to upload applications

  • How to use Workspace on Grants.gov

  • How to fill out the budget and budget narrative

  • How to fill out other specific application forms

  • Other

  • If #17 = Other, then go to #18

  • If #17 = all other responses, then go to #19


  18. What other information did you have difficulty locating in the RFA?


Text Box – Maximum 4,000 characters (about 500 words). Longer responses may be truncated.





  19. The RFA currently includes application requirements and information about preparing and submitting your application through Grants.gov. Is it more useful for all of this information to be in a single document, or would it be more useful if it were split into two documents (i.e., an RFA and an Application Submission Guide)?

  • It would be more useful for the application requirements and the application submission information to continue to be in one document

  • It would be more useful to have the application submission information in a separate document

  • I have no opinion about this


  20. Rate the usefulness of the Glossary.

  • Very Useful

  • Useful

  • Marginally Useful

  • Not Useful

  • I did not notice the Glossary

  • If #20 = Very Useful or Useful, then go to #22

  • If #20 = Otherwise, then go to #21


  21. What would potentially make the Glossary more useful?


Text Box – Maximum 4,000 characters (about 500 words). Longer responses may be truncated.





==============================Goal 1==============================


  22. How carefully did you read the Exploration Goal (Goal 1) of the RFA?

  • Did not read it

  • Casually

  • Thoroughly

  • If #22 = Did not read it or Casually, then go to #27.

  • If #22 = Thoroughly, then go to #23.


  23. Rate the clarity of…

    a. The purpose of the Exploration Goal.

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear

    b. The description of the Research Plan section of the application.

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear

    c. The distinction between primary data analysis and secondary data analysis.

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear

    d. The statements explaining that meta-analysis could be proposed as a secondary data analysis.

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear

    e. The definition of malleable factors.

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear


  24. Rate the clarity of the distinction between requirements and recommendations for Goal 1.

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear


  25. Rate the usefulness of the recommendations for preparing your application.

  • Very Useful

  • Useful

  • Marginally Useful

  • Not Useful


  26. If you felt that any aspects of the Exploration Goal were unclear, then please indicate the way(s) in which the clarity could be improved.


Text Box – Maximum 4,000 characters (about 500 words). Longer responses may be truncated.



==============================Goal 2==============================


  27. How carefully did you read the Development & Innovation Goal (Goal 2) of the RFA?

  • Did not read it

  • Casually

  • Thoroughly

  • If #27 = Did not read it or Casually, then go to #32.

  • If #27 = Thoroughly, then go to #28.


  28. Rate the clarity of…

    a. The purpose of the Development & Innovation Goal.

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear

    b. The description of the theory of change.

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear

    c. The expectations for the iterative development process.

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear

    d. The distinction between feasibility and usability.

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear

    e. The description of developing/refining measures of fidelity of implementation as part of a Development & Innovation project.

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear

    f. The requirements and recommendations for the pilot study (e.g., the types of research designs that may be appropriate).

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear

    g. The requirements for the Cost Analysis.

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear


  29. Rate the clarity of the distinction between requirements and recommendations for Goal 2.

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear


  30. Rate the usefulness of the recommendations for preparing your application.

  • Very Useful

  • Useful

  • Marginally Useful

  • Not Useful


  31. If you felt that any aspects of the Development & Innovation Goal were unclear, then please indicate the way(s) in which the clarity could be improved.


Text Box – Maximum 4,000 characters (about 500 words). Longer responses may be truncated.



==============================Goal 3==============================


  32. How carefully did you read the Efficacy & Follow-Up Goal (Goal 3) of the RFA?

  • Did not read it

  • Casually

  • Thoroughly

  • If #32 = Did not read it or Casually, then go to #37.

  • If #32 = Thoroughly, then go to #33.


  33. Rate the clarity of…

    a. The purpose of the Efficacy & Follow-Up Goal.

      • Very Clear

      • Somewhat Clear

      • Somewhat Unclear

      • Very Unclear

    b. The distinction between the purposes and types of studies supported under Efficacy & Follow-Up (Goal 3) versus Replication: Efficacy & Effectiveness (Goal 4).

      • Very Clear

      • Somewhat Clear

      • Somewhat Unclear

      • Very Unclear

      • I did not read the Replication: Efficacy & Effectiveness Goal

    c. The differences among the types of studies supported under the Efficacy & Follow-Up Goal (i.e., Initial Efficacy, Follow-up, and Retrospective).

      • Very Clear

      • Somewhat Clear

      • Somewhat Unclear

      • Very Unclear

    d. The definition of an Initial Efficacy Study.

      • Very Clear

      • Somewhat Clear

      • Somewhat Unclear

      • Very Unclear

    e. The requirements for the Cost Analysis.

      • Very Clear

      • Somewhat Clear

      • Somewhat Unclear

      • Very Unclear

    f. The requirements for the Cost-Effectiveness Analysis.

      • Very Clear

      • Somewhat Clear

      • Somewhat Unclear

      • Very Unclear

    g. The recommendations regarding objectivity of the research and the roles of personnel involved in the development of the intervention.

      • Very Clear

      • Somewhat Clear

      • Somewhat Unclear

      • Very Unclear

    h. The requirements for the Data Management Plan.

      • Very Clear

      • Somewhat Clear

      • Somewhat Unclear

      • Very Unclear


  34. Rate the clarity of the distinction between requirements and recommendations for Goal 3.

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear


  35. Rate the usefulness of the recommendations for preparing your application.

  • Very Useful

  • Useful

  • Marginally Useful

  • Not Useful


  36. If you felt that any aspects of the Efficacy & Follow-Up Goal were unclear, then please indicate the way(s) in which the clarity could be improved.


Text Box – Maximum 4,000 characters (about 500 words). Longer responses may be truncated.



==============================Goal 4==============================


  37. How carefully did you read the Replication: Efficacy & Effectiveness Goal (Goal 4) of the RFA?

  • Did not read it

  • Casually

  • Thoroughly

  • If #37 = Did not read it or Casually, then go to #42.

  • If #37 = Thoroughly, then go to #38.


  38. Rate the clarity of…

    a. The purpose of the Replication: Efficacy & Effectiveness Goal.

      • Very Clear

      • Somewhat Clear

      • Somewhat Unclear

      • Very Unclear

    b. The level of evidence from a previous causal impact study needed to justify a Goal 4 study.

      • Very Clear

      • Somewhat Clear

      • Somewhat Unclear

      • Very Unclear

    c. The distinction between the purposes and types of studies supported under Efficacy & Follow-Up (Goal 3) versus Replication: Efficacy & Effectiveness (Goal 4).

      • Very Clear

      • Somewhat Clear

      • Somewhat Unclear

      • Very Unclear

      • I did not read the Efficacy & Follow-Up Goal

    d. The differences among the types of studies supported under the Replication: Efficacy & Effectiveness Goal (i.e., Effectiveness, Efficacy Replication, and Re-analysis).

      • Very Clear

      • Somewhat Clear

      • Somewhat Unclear

      • Very Unclear

    e. The distinction between direct and conceptual replications.

      • Very Clear

      • Somewhat Clear

      • Somewhat Unclear

      • Very Unclear

    f. The requirements for Mediator and Moderator Analyses.

      • Very Clear

      • Somewhat Clear

      • Somewhat Unclear

      • Very Unclear

    g. The requirements for an Implementation Study.

      • Very Clear

      • Somewhat Clear

      • Somewhat Unclear

      • Very Unclear

    h. The difference between measuring and analyzing Fidelity of Implementation and conducting an Implementation Study.

      • Very Clear

      • Somewhat Clear

      • Somewhat Unclear

      • Very Unclear

    i. The requirements for the Cost Analysis.

      • Very Clear

      • Somewhat Clear

      • Somewhat Unclear

      • Very Unclear

    j. The requirements for the Cost-Effectiveness Analysis.

      • Very Clear

      • Somewhat Clear

      • Somewhat Unclear

      • Very Unclear

    k. The recommendations regarding objectivity of the research and the roles of personnel involved in the development of the intervention.

      • Very Clear

      • Somewhat Clear

      • Somewhat Unclear

      • Very Unclear

    l. The requirements for the Data Management Plan.

      • Very Clear

      • Somewhat Clear

      • Somewhat Unclear

      • Very Unclear


  39. Rate the clarity of the distinction between requirements and recommendations for Goal 4.

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear


  40. Rate the usefulness of the recommendations for preparing your application.

  • Very Useful

  • Useful

  • Marginally Useful

  • Not Useful


  41. If you felt that any aspects of the Replication: Efficacy & Effectiveness Goal were unclear, then please indicate the way(s) in which the clarity could be improved.


Text Box – Maximum 4,000 characters (about 500 words). Longer responses may be truncated.



==============================Goal 5==============================


  42. How carefully did you read the Measurement Goal (Goal 5) of the RFA?

  • Did not read it

  • Casually

  • Thoroughly

  • If #42 = Did not read it or Casually, then go to #47.

  • If #42 = Thoroughly, then go to #43.


  43. Rate the clarity of…

    a. The purpose of the Measurement Goal.

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear

    b. The differences among the types of studies supported by the Measurement Goal (i.e., Development/Refinement projects and Validation projects).

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear

    c. The description of the assessment framework.

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear


  44. Rate the clarity of the distinction between requirements and recommendations for Goal 5.

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear


  45. Rate the usefulness of the recommendations for preparing your application.

  • Very Useful

  • Useful

  • Marginally Useful

  • Not Useful


  46. If you felt that any aspects of the Measurement Goal were unclear, then please indicate the way(s) in which the clarity could be improved.


Text Box – Maximum 4,000 characters (about 500 words). Longer responses may be truncated.



==============================The Rest==============================


  47. After reading the Goal sections, did you have a clear sense of the Goal best suited for your research?

  • Yes

  • No

  • If #47 = Yes, then go to #49.

  • If #47 = No, then go to #48.


  48. In what way(s) was it unclear which Goal was best suited for your research?


Text Box – Maximum 4,000 characters (about 500 words). Longer responses may be truncated.





  49. Rate the clarity of the requirements for the dissemination plan.

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear


  50. Rate the clarity of the requirements for making data and results publicly available.

  • Very Clear

  • Somewhat Clear

  • Somewhat Unclear

  • Very Unclear


  51. For FY 2019, the Institute revised Goals 3 and 4 in order to better support research that goes beyond a single efficacy study and to build a coherent body of work to support evidence-based decision making. To what extent do you endorse the following statements about the revisions:

    a. The changes will better support systematic replication studies that contribute to the larger evidence base on education interventions that have prior evidence of efficacy.

  • Strongly Agree

  • Somewhat Agree

  • Neither Agree nor Disagree

  • Somewhat Disagree

  • Strongly Disagree

  • I am not familiar enough with the changes to agree or disagree

  • If #51a = I am not familiar enough with the changes to agree or disagree, then go to #54

  • If #51a = Any other response, then go to #51b

    b. The changes provide clearer expectations and guidance around designing and conducting a variety of replication studies.

  • Strongly Agree

  • Somewhat Agree

  • Neither Agree nor Disagree

  • Somewhat Disagree

  • Strongly Disagree

    c. The changes will advance our knowledge of the impact of interventions, including the conditions under which and for whom an intervention may or may not be effective.

  • Strongly Agree

  • Somewhat Agree

  • Neither Agree nor Disagree

  • Somewhat Disagree

  • Strongly Disagree


  52. What is your overall perspective on the changes to Goals 3 and 4?

    • I support the changes

    • I prefer the previous version

    • I don’t prefer the previous version, but I dislike the changes

    • I am indifferent to the changes

    • Other


  53. Please provide any additional feedback you have on the changes to Goals 3 and 4.


Text Box – Maximum 4,000 characters (about 500 words). Longer responses may be truncated.





  54. Please comment on any language or instructions in the RFA that were unclear to you. Provide specific examples if possible.


Text Box – Maximum 4,000 characters (about 500 words). Longer responses may be truncated.





  55. Please give us any additional feedback you may have about the RFA, including comments on the length, the changes, the level of detail, and the organization.


Text Box – Maximum 4,000 characters (about 500 words). Longer responses may be truncated.





Thank you for contributing your time and thoughtful responses to this important survey! If you have any questions about this survey, please feel free to contact IES/NCER by email at NCER.Commissioner@ed.gov.


According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid OMB control number. The valid OMB control number for this information collection is 1880-0542. Public reporting burden for this collection of information is estimated to average 20 minutes per response, including time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. The obligation to respond to this collection is voluntary. If you have comments or concerns regarding the status of your individual submission of this survey, please contact Phill Gagné directly at: U.S. Department of Education, Institute of Education Sciences, 400 Maryland Ave. SW, PCP-4122, Washington, DC 20202. [Note: Please do not return the completed survey to this address.]
