9000-0204 Supporting Statement Part B 08.04.2023

Acquisition 360 Voluntary Survey; FAR Section Affected: 52.201-01

OMB: 9000-0204


Alternative Supporting Statement for Information Collections Designed for Research, Public Health Surveillance, and Program Evaluation Purposes



Acquisition 360 – Improving the Acquisition Process Through Timely Feedback from External and Internal Stakeholders


OMB Information Collection Request

New Collection

9000-0204



Supporting Statement

Part B

March 2023


Submitted By:

Office of Federal Procurement Policy

Office of Management and Budget

Executive Office of the President


725 17th Street NW

Washington, D.C. 20503


Project Officer: Porter Glock



B1. Objectives

Study Objectives

This information collection covers the Acquisition 360 Survey, which captures data on agencies’ preaward and debriefing processes for new acquisition solicitations. The objectives are:

  1. To understand actual and potential offeror pain points in order to improve the training, guidance, and outreach efforts of the Office of Management and Budget’s Office of Federal Procurement Policy (OFPP) and of agencies.

  2. To enable comparisons between contracting offices and/or purchase categories by generalizing within the government contracting context (such as within a Product Service Code or dollar threshold of procurements) to inform targeted guidance for improving specific categories.

  3. To increase the share of contracts awarded to small disadvantaged businesses by providing additional insights on how to improve experiences for small businesses for future solicitations.

Generalizability of Results

The Acquisition 360 Survey is intended to provide insights into the difficulties faced during the preaward and debriefing process by actual/potential offerors, government contracting offices, and government program offices (customers), in order to identify trends year over year, within certain agencies, or within specific areas of interest. Only the actual/potential offerors portion is for public use and is therefore the only portion of the survey covered by the burden estimated under this information collection.

Appropriateness of Study Design and Methods for Planned Uses

The Acquisition 360 Survey data is not expected to meet the threshold of influential or highly influential scientific information. The survey data is intended to be used to identify areas where procurement officials should direct attention and gather more insights, to inform training, guidance, and outreach efforts.

This survey will use an opt-in sample. Per FAR 1.102-3, Evaluating agency acquisition processes, provision 52.201-01 will be placed into solicitations “in accordance with agency procedures”: agencies, either on their own or following consideration by the Chief Acquisition Officers Council (CAOC), will allow the provision to be inserted in certain areas of interest (e.g., information technology or high-dollar awards). Agencies cannot require participation in the survey but can, in accordance with those procedures, post the survey and encourage actual and potential offerors to participate.

Efforts to encourage participation include publicizing the Acquisition 360 Survey to groups that are less likely to respond (e.g., businesses bidding for their first contract or smaller program offices). Instituting the survey as voluntary reduces the burden on actual and potential offerors, especially small business entities, who face considerable obstacles in being eligible to do business with the Federal government. The questions, formatting, and access point will be identical across agencies and participation groups.



B2. Methods and Design

Target Population

The survey’s target population is entities interested in fulfilling government contracts. The population is estimated at 122,000 entities: for fiscal year 2022, approximately 82,000 entities had active Federal awards, and the Office of Federal Procurement Policy (OFPP) estimates an additional 40,000 would like to be active in the Federal marketplace. Of those 82,000 entities, approximately 61,000 were categorized as small businesses under at least one North American Industry Classification System (NAICS) code.

Sampling

While this will be an opt-in survey, the CAOC, in coordination with OFPP, will identify priority areas and issue governmentwide guidance to use the Acquisition 360 Survey. For example, the CAOC and OFPP may decide to focus on information technology contracts above a certain dollar threshold and instruct agencies to include FAR provision 52.201-01 in every solicitation that meets that requirement. Responses to the survey will remain entirely voluntary. If necessary, additional Supporting Statements will be developed at a later date for focus groups with a sample of ‘discouraged’ or ‘non-active’ respondents. When data are posted internally, disclaimers will clearly state the number of responses for the selected category. Agencies cannot track vendor information for a vendor who decides not to apply for an award and does not fill out the survey, but they can track vendor information for those who apply for the award and do not fill out the survey. This will assist with computing the necessary weights.

B3. Design of Data Collection Instruments

Development of Data Collection Instrument(s)

The Acquisition 360 Survey questions have been tested, used, and commented on through two governmentwide pilot efforts, as well as through the advance notice of proposed rulemaking (ANPRM) and notice of proposed rulemaking (NPRM) for FAR case 2017-014, Use of Acquisition 360 to Encourage Vendor Feedback (see Supporting Statement Part A.8). Modifications have been made to the final set of questions for clarity and applicability. Examples of these modifications include:

  • Public Comment on FAR Case 2017-014, Use of Acquisition 360 To Encourage Vendor Feedback, suggested that “Acquisition 360” techniques would be most effective by implementing “a true 360-degree rollout, including adding questions that would identify both contract type and non-bidders.”

    • The result was to add a question asking whether the respondent had submitted a bid.

  • Based on feedback from the Department of Defense, the phrase “Requirements Development Process” was changed to “Presolicitation Phase”.

  • Additionally, “proposals” was largely changed to “offers,” and “based on the requirement and associated risks” was added to the question about appropriate contract type.



B4. Collection of Data and Quality Control

The project team (comprised of members of OFPP and the General Services Administration (GSA) Office of Shared Solutions & Performance Improvement) will be responsible for collecting the data using the Qualtrics platform. Each question is a required field, but there is a “Not Applicable” option if the respondent feels the question is not relevant. The remaining options form a scale from “Extremely Satisfied” to “Extremely Dissatisfied” with three intermediate responses. GSA’s Office of Shared Solutions & Performance Improvement will manually screen a selected percentage of responses to look for anomalies or incorrect data; examples include verifying that all Notice IDs/Solicitation IDs are valid and removing any anomalies, or, for set-aside procurements, removing any responses indicating the respondent is a large business.
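The screening step described above can be illustrated with a short sketch. The field names, record values, and ID pattern below are illustrative assumptions only; the actual Qualtrics export schema is not specified here, and real validation would check IDs against FPDS as the text describes.

```python
import re

# Hypothetical response records; field names are illustrative, not the
# actual Qualtrics export schema.
responses = [
    {"solicitation_id": "47QTCA23R0001", "set_aside": True,  "business_size": "small"},
    {"solicitation_id": "not-a-real-id", "set_aside": False, "business_size": "large"},
    {"solicitation_id": "W9124J23R0002", "set_aside": True,  "business_size": "large"},
]

# Assumed shape for a plausible solicitation ID (uppercase alphanumeric,
# six or more characters); a stand-in for the FPDS lookup.
VALID_ID = re.compile(r"^[A-Z0-9]{6,}$")

def screen(records):
    """Drop responses with malformed IDs, and, for set-aside procurements,
    drop responses indicating the respondent is a large business."""
    kept = []
    for r in records:
        if not VALID_ID.match(r["solicitation_id"]):
            continue  # anomalous or invalid Notice ID/Solicitation ID
        if r["set_aside"] and r["business_size"] == "large":
            continue  # large business responding on a set-aside procurement
        kept.append(r)
    return kept
```

In this illustrative data, only the first record survives both checks: the second has a malformed ID, and the third is a large-business response on a set-aside.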

The Qualtrics survey platform is covered by a System of Records Notice, according to the Chief Privacy Officer at GSA. Within GSA, the data will be hosted on Data to Decisions (D2D) and will utilize a SQL server. This server will act as the system within GSA for analysis and as the connection point between the Acquisition 360 Survey data and FPDS data to confirm Notice IDs/Solicitation IDs. D2D data are not retrieved using a “personal identifier,” so D2D does not have a SORN.

OFPP will conduct a communication and outreach effort to accompany the rollout of the survey and FAR rule change to increase understanding and awareness of the effort within the contractor community. OFPP intends to partner with several contractor organizations to assist in getting information out.



Instrument | Type of Respondent | Number of Respondents per collection cycle (fiscal year) | Number of data collections per 3-year OMB clearance
Survey Template (Instrument 1) | Actual and potential offerors | 4,500 | 13,500
Survey Template (Instrument 1) | Contracting Office | 765 | 2,295
Survey Template (Instrument 1) | Government Program Office | 765 | 2,295



*The estimates for the number of respondents are based on the pilots conducted in 2015 and 2016, during which the response rate for vendors was 33%. GSA averaged 92,373 contract actions per year across FY 2017, 2018, and 2019. It is assumed that agencies may include the survey in 5% of contract actions, an estimated 4,619 contract actions for GSA. Assuming three offers are received for each contract action, this results in an estimated 13,857 actual and potential offerors. Applying the 33% response rate, GSA anticipates 4,573, or approximately 4,500, responses.
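The arithmetic behind the estimate in the note above can be reproduced directly; all figures come from the note itself:

```python
# Reproducing the burden-estimate arithmetic from the note above.
avg_contract_actions = 92_373        # GSA average contract actions, FY 2017-2019
inclusion_rate = 0.05                # assumed share of actions carrying the survey
offers_per_action = 3                # assumed offers received per contract action
response_rate = 0.33                 # vendor response rate observed in the pilots

actions_with_survey = round(avg_contract_actions * inclusion_rate)   # 4,619
potential_respondents = actions_with_survey * offers_per_action      # 13,857
expected_responses = round(potential_respondents * response_rate)    # 4,573
```

Rounding 4,573 down to a conservative 4,500 gives the per-cycle respondent count used in the burden table.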



B5. Response Rates and Potential Nonresponse Bias

Response Rates

Initially, the aim is a 33% response rate from actual and potential offerors, in line with the earlier pilots. In 2015, the survey was sent to 1,150 vendors, and 310 responded. Ideally, as the Acquisition 360 Survey becomes institutionalized and actual/potential offerors become more familiar with the process, the response rate will increase. Additionally, marketing and communication efforts from the Federal government will seek to promote the survey and increase participation.

Nonresponse

The survey is voluntary, so there will be instances of nonresponse. The survey is also anonymous, so nonresponse bias analysis is not feasible. The more familiar actual/potential offerors become with the Acquisition 360 Survey, the more likely they are to participate. One way the project team can track nonresponse is by comparing the number of offers received for a solicitation ID with the number of survey responses for that solicitation ID. There is no ability to track potential offerors who do not fill out the survey and do not submit an offer.
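The engagement-tracking comparison described above amounts to a per-solicitation ratio of survey responses to offers received. A minimal sketch, with purely illustrative solicitation IDs and counts:

```python
# Offers received and survey responses per solicitation ID.
# IDs and counts are illustrative assumptions, not real data.
offers_received = {"SOL-001": 6, "SOL-002": 4}
survey_responses = {"SOL-001": 2, "SOL-002": 4}

# Engagement rate per solicitation: responses divided by offers received.
engagement = {
    sol: survey_responses.get(sol, 0) / offers
    for sol, offers in offers_received.items()
}
```

A ratio well below 1.0 flags a solicitation where offerors submitted bids but largely skipped the survey.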

B6. Production of Estimates and Projections

In terms of the entire Federal acquisition community, the Acquisition 360 Survey data are not representative and are not generalizable. No estimates or projections will be made. The survey data will be used to identify pain points or obstacles that can be targeted for further research and evaluation. The project team will share our findings with the acquisition community through a presentation or research brief on an annual basis and, as appropriate, will update the public on areas identified for further training or policy implementation. Limitations around generalizability will be noted in all shared information.


B7. Data Handling and Analysis

Data Handling

The project team will check data collected through the survey for a randomized sample of responses (the lesser of 100 surveys or 5% of responses). Where possible, solicitations for which responses are received from potential offerors, contracting offices, and government program offices will provide a full-circle view of the process. These instances will be identified and grouped together as a select subset of case studies to be presented in the annual presentation.
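The quality-control sampling rule above, the lesser of 100 surveys or 5% of responses, can be sketched as follows (the function name is illustrative):

```python
import random

def qc_sample(responses):
    """Select the lesser of 100 surveys or 5% of responses for manual review."""
    n = min(100, round(0.05 * len(responses)))
    return random.sample(responses, n)  # random draw without replacement
```

For example, 5,000 responses yield a 100-survey sample, while 1,000 responses yield a 50-survey sample.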

As Acquisition 360 enters its second and third years, survey responses can be compared with those of previous years in a year-over-year comparison, with appropriate caveats and weights depending on response rates. During the first year, the project team can compare data quality and the survey response rate with those of the two pilot years, recognizing that it will not be an exact match because the questions have changed and the promotional efforts to encourage participation will be much greater.

Data Analysis

The survey data are largely quantitative, with minimal qualitative data added in the open-ended question spaces. An added element of deep-dive interviews with potential offerors is being considered for future implementation; it would be submitted as part of a separate information collection clearance. The project team will manage and systematically analyze the survey data to ensure results are valid and reliable. To maximize the ability to answer the evaluation’s overarching research questions, the project team will conduct descriptive analyses and develop data visualizations using the survey data. These data will yield important details about the pain points experienced by vendors, contracting offices, and government program offices throughout the solicitation process.

Quantitative Data: All data collected through the annual survey will be cleaned and analyzed to generate descriptive statistics (i.e., counts, ranges, frequencies, means, and standard deviations) using Excel or SAS. The ordinal variables (Extremely Satisfied, Satisfied, etc.) will be assigned a numerical value from one to five, with one representing Extremely Dissatisfied and five representing Extremely Satisfied. “Not Applicable” will not be given a score and will be assigned a null value. Analyses of these data will include a detailed summary using appropriate descriptive statistics. The average score for each survey question will be generated as follows: (1) add together all the assigned scores for the questions or statements related to that survey question; and (2) divide by the total number of ratings for those questions (this number equals the number of respondents, excluding null values).
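The scoring and averaging steps can be sketched as follows. The mapping assumes a 1-to-5 five-point scale and intermediate label names chosen for illustration (the document names only the two endpoints and "Not Applicable"):

```python
# Assumed 1-5 mapping; the intermediate labels are illustrative guesses.
SCORES = {
    "Extremely Dissatisfied": 1,
    "Dissatisfied": 2,
    "Neither Satisfied nor Dissatisfied": 3,
    "Satisfied": 4,
    "Extremely Satisfied": 5,
    "Not Applicable": None,  # null value, excluded from averages
}

def average_score(answers):
    """Mean of assigned scores, excluding null ("Not Applicable") values."""
    scored = [SCORES[a] for a in answers if SCORES[a] is not None]
    return sum(scored) / len(scored) if scored else None

average_score(["Extremely Satisfied", "Satisfied", "Not Applicable", "Dissatisfied"])
# -> (5 + 4 + 2) / 3
```

Note that the divisor is the count of scored ratings (three here), not the count of respondents, matching step (2) above.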

Each year, survey data will be analyzed and summarized at the overall category (e.g., IT or Small Business) level. Because the survey is designed to be administered anonymously, respondents will not be asked to provide their name or the name of their organization on the survey. The survey will, however, include a set of background questions for respondents, including the solicitation ID, role in the acquisition process (actual/potential offeror, government contracting office, or government program office), and a question about whether the actual or potential offeror is a small business, with the opportunity to select all of the socioeconomic small business (SESB) categories that apply. This basic background information will allow analyses to be grouped by SESB category for insights.

To the extent possible, the project team will compare and contrast scores within agencies by looking at responses from potential offerors, contracting offices, and government program offices within each agency. The team will also compare and contrast scores across the aforementioned categories to identify common themes related to difficulties in the preaward phase. Yearly administration of the survey will allow the team to examine patterns and trends in the data as they evolve over time, both at the agency level and across different prioritized equity-focused and spend categories.

Data Use

The survey data may be archived for restricted data use. If archived, internal supporting documentation will be developed that includes variable names, the predetermined categories where the survey was posted, and the weights used, and that describes the major analyses performed.

B8. Contact Person(s)


Name | Organization | Role on Contract | Phone/email
Neil Miller | GSA (OGP/MY) | | neil.miller@gsa.gov
Matthew Childers | GSA (OGP/MD) | | matthew.childers@gsa.gov
Porter Glock | OFPP | | Porter_O_Glock@omb.eop.gov; 202-395-3145

Attachments

Acquisition 360 Survey

