Supporting Statement B draft 3-4-16

Web-Survey for DOE's Small Business Vouchers Pilot of Participating and Nonparticipating Small Business

OMB: 1910-5180


United States Department of Energy

Web-survey for DOE's Small Business Vouchers Pilot of Participating and Nonparticipating Small Businesses

Supporting Statement B

OMB Number: 1910-5180



  1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole.


DOE expects the number of unduplicated survey responses to be 140 or fewer for the survey conducted after the first year of vouchers has been completed (i.e., the second year of the pilot): 70 participants in the Small Business Vouchers (SBV) Pilot and 70 small businesses with characteristics similar to those of participants but that did not participate in the SBV Pilot. DOE expects approximately 200 unduplicated survey responses from SBV participants to the survey conducted in year three of the pilot; approximately 300 unduplicated responses from SBV participants to the survey conducted in year four; and 300 unduplicated responses from SBV participants plus 100 unduplicated responses from non-SBV participants to the survey conducted in year five. Response rates are expected to be close to 100 percent for SBV participants in the first survey and no lower than 70 percent in year five, as all firms still in business are required by contract to report in year five.


The nonparticipants will serve as a comparison group to the SBV participant sample. We used guidance from the Coalition for Evidence-Based Policy1 on the comparison-group designs most likely to produce valid results to define the characteristics relevant to our selection. In addition to not having received a voucher for work done at a national laboratory, an SBV comparison group should have the following elements:

  • The program and comparison groups are as similar as feasible in observable pre-program characteristics, such as the characteristics of participants and of the institutions providing the assistance (including geographic location), pre-program measures of the outcome the program seeks to improve, and characteristics of the program funder.

  • Program and comparison group members are likely to be similar in motivation, e.g., because the study uses an eligibility "cutoff" to form the two groups (same motivation and objectives).

  • Outcome data are collected in the same way for both groups – e.g., the same survey administered at the same point in time to both groups (with a similar time passed since start of work).


We also considered the existence of, and ease of access to, data and metrics for selecting a comparison group. The nonparticipant sample will consist of small businesses that applied to the SBV pilot but were not selected, and small businesses that completed Cooperative Research and Development Agreement (CRADA) projects (preferably in the energy efficiency and renewable energy [EERE] sector at the same Laboratories) within the previous year at the National Laboratories participating in the SBV Pilot.


There were 450 applicants to the first of three rounds of the SBV pilot. Assuming similar levels of applications in rounds two and three and approximately 100 voucher awards in total across the first three rounds (depending on the amount of funds per voucher), there will be approximately 1,250 non-selected firms in the small business universe. These firms will be ranked by reviewer scores, and the highest-ranking firms will be asked to participate until the number of nonparticipants who will complete the survey equals the number of participants. Not all of this group will still be in existence and reachable in year five. The surveys in years three, four, and five of the pilot will draw upon small businesses selected to receive vouchers in years one, two, and three of the SBV pilot (the information collection assumes there will be comparable levels of funding and participating SBV firms in 2017 and 2018).
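The selection procedure described above can be sketched as follows. This is an illustrative sketch only, not part of the study design document; the field names and applicant records are hypothetical, and the actual ranking would use the pilot's reviewer scores.

```python
# Hypothetical sketch: rank non-selected applicant firms by reviewer score
# and recruit from the top until the comparison group matches the number
# of SBV participants to be surveyed. Data and field names are illustrative.

def select_comparison_group(non_selected, n_participants):
    """Return the highest-scoring non-selected firms, up to the
    number of SBV participants to be surveyed."""
    ranked = sorted(non_selected,
                    key=lambda firm: firm["reviewer_score"],
                    reverse=True)
    return ranked[:n_participants]

# Illustrative use with made-up applicant records:
applicants = [
    {"firm": "A", "reviewer_score": 88},
    {"firm": "B", "reviewer_score": 95},
    {"firm": "C", "reviewer_score": 72},
]
comparison = select_comparison_group(applicants, 2)
# comparison holds the two highest-scoring firms, B then A
```

In practice the quota would be refilled from the ranked list as firms decline or become unreachable, which is why the full ranked pool of roughly 1,250 non-selected firms matters even though only up to 70 (and later up to 100) are surveyed.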


To be a good match, the comparison-group CRADAs should be one year or less in duration and recently completed, with a contact person who can be surveyed. Existing CRADAs of longer duration that reach the one-year mark during this first year of the voucher pilot could also be included. At this point it is not known how many CRADAs will fit this description at the selected Laboratories, but recent technology transfer data show that across all of the Laboratories there were 66 CRADAs with small businesses in FY2014, including Labs working in the EERE sector.


There are no plans to generalize the data beyond the scope of the samples studied. The data will inform assessment of the impact of the SBV Pilot and will not be generalized to the universe of small businesses or of small businesses that have engaged in CRADAs with National Laboratories.


Population                                    Universe    Sample
SBV Participants, Pilot Year Two              100         Up to 70
SBV Non-Participants, Pilot Year Two          1,300       Up to 70
SBV Participants, Pilot Year Three            200         Up to 200
SBV Participants, Pilot Year Four             300         Up to 300
SBV Participants, Pilot Year Five             300         Up to 300
SBV Non-Participants, Pilot Year Five         1,300       Up to 100




  2. Describe the procedures for the collection of information including:

  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


Once candidate respondents have been identified, DOE will issue a solicitation email to the selected small businesses (participating and nonparticipating). The email will explain the importance of their participation, assure confidentiality, and explain the steps to access the web-survey. The information will be collected via a secure web-survey administered by the participating National Laboratories.


In order to make valid conclusions about the impact of the SBV Pilot, the comparison group must be carefully selected so that the measured impacts can be justifiably attributed to the Pilot activities. Hence, it is important when selecting the small businesses for the comparison group, that we have accurate information pertaining to the characteristics outlined in the answer to Question 1 above.


This is an annual information collection activity designed to maintain contact with participating SBV firms, evaluate the effectiveness and impacts of the SBV Pilot, and capture lessons learned from the pilot. It is not possible to reduce the frequency of data collection.


  3. Describe methods to maximize response rates and to deal with issues of non-response.

The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


The small businesses that participated in the SBV Pilot will have received substantial support from the National Laboratories and will have developed ongoing relationships with key staff there. DOE is confident that this history of support and collaboration is sufficiently strong to obtain a nearly 100 percent response rate from the participating small businesses. Additionally, DOE expects the participating Pilot Laboratories to encourage and support data collection from the small businesses selected for the comparison group. If low response rates arise, DOE will issue additional email solicitation notices or otherwise promote the web-survey through its normal communications with these populations.


This information collection requires sampling, but the results will not be generalized to the universe of all small businesses that conduct research and development with National Laboratories. Generalizing is not permissible under the sampling plan required by the study design: we are employing a comparison-group design, which requires careful and targeted selection of respondents and does not yield generalizable data.


  4. Describe any tests of procedures or methods to be undertaken.

Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.



Wherever possible, questions were borrowed from the tested SBIR surveys. The survey instruments will be reviewed by a group of external experts in evaluation. The web surveys themselves will be tested prior to release by the evaluation team and some of the SBV staff.


  5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Greg Clendenning of NMR Group, Inc. (617-284-6230, extension 3), Marjorie McRae of Research into Action (503-287-9136), and Gretchen Jordan (831-920-2790), study leaders of the evaluation team, will oversee the data collection and analysis.



The design of the data collection instruments and the analysis plan for the study have been developed with input from Jeff Dowd at DOE's EERE Office. His contact information is below:


Jeff Dowd

US Department of Energy, EE-61P

1000 Independence Ave. SW

Washington DC 20585

Jeff.Dowd@ee.doe.gov

202-395-4718







1 http://coalition4evidence.org/wp-content/uploads/2014/01/Validity-of-comparison-group-designs-updated-January-2014.pdf
