Supporting Statement B
“E-Government Website Customer Satisfaction Survey (Formerly American Customer Satisfaction Index (ACSI) E-Government Website Customer Satisfaction Survey)”
OMB Control Number 1090-0008
Collection of Information Employing Statistical Methods
The agency should be prepared to justify its decision not to use statistical methods in any case where such methods might reduce burden or improve accuracy of results. When the question “Does this ICR contain surveys, censuses, or employ statistical methods?” is checked "Yes," the following documentation should be included in Supporting Statement B to the extent that it applies to the methods proposed:
1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.
All surveys covered under this clearance will have these specific characteristics:
They will be conducted using the Verint/ForeSee CXA methodology.
They will measure customer satisfaction with federal government websites and related media.
Only a small percentage of each website’s visitors will generally be qualified to take the survey.
The criteria for qualification will vary by agency and will determine when and where the survey will be presented.
The survey will be served up randomly as visitors experience the website.
Collection of personal information through the surveys is not required by the Verint/ForeSee CXA methodology.
Government agencies will be the sole owner of the data results.
The Verint/ForeSee CXA model measures multi-variable components that are reported as indices. Various models have been developed to meet the objectives of each type of website. The models contain between 25 and 40 questions, depending on the agency's need for additional custom questions. To maximize response rates and minimize obtrusiveness to website visitors, we decrease the actual number of questions presented to each respondent and use a statistical imputation methodology to estimate the missing values in a data set from the information that is available. Because multiple questions are used for each element of the econometric model, stable results are obtained with a sample size of 300 respondents.
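The proprietary imputation procedure is not specified in this statement; as a hedged illustration only, the general idea of estimating missing item responses from the answers that are present can be sketched with simple mean imputation (the function name and data layout below are hypothetical):

```python
# Hypothetical sketch: fills each missing answer (None) with the mean of the
# observed answers to the same question across respondents. The actual
# Verint/ForeSee imputation method is more sophisticated and is not described here.
def impute_missing(responses):
    """responses: list of per-respondent answer lists on a 1-10 scale,
    with None marking questions that were not presented."""
    n_questions = len(responses[0])
    # Mean of the observed answers for each question.
    means = []
    for q in range(n_questions):
        observed = [r[q] for r in responses if r[q] is not None]
        means.append(sum(observed) / len(observed))
    # Replace each missing answer with its question's observed mean.
    return [
        [r[q] if r[q] is not None else means[q] for q in range(n_questions)]
        for r in responses
    ]
```

This lets every respondent answer only a subset of the model's questions while the full set of model inputs is still estimated for analysis.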
While there is a quota of 300 respondents per sample, each respondent is selected through a random probability selection from the universe of visitors on the website at any given time. Thus, random sampling is maintained at the individual level rather than at the total sample level. The historical response rate for surveys of government websites conducted under this clearance ranges between 1.5% and 35% and has averaged about 5% over the last 3 years. These results are comparable to the response rates for surveys of private-sector websites. No follow-up is attempted if a citizen selected to take a survey fails to complete part or all of the questionnaire.
2. Describe the procedures for the collection of information including:
* Statistical methodology for stratification and sample selection,
* Estimation procedure,
* Degree of accuracy needed for the purpose described in the justification,
* Unusual problems requiring specialized sampling procedures, and
* Any use of periodic (less frequent than annual) data collection cycles to reduce burden.
Data will be collected through an on-line survey that is presented to website visitors randomly. Survey respondents are identified through a number of conditions that are contingent on the website’s traffic and architecture. The trigger code – which causes the survey to be presented – has a number of options that can be customized to the specific website. One variable causes the survey to be presented randomly to a percentage of website visitors as they experience the website. The trigger can also utilize another variable referred to as a loyalty factor, which prevents a visitor from receiving the survey until the visitor has viewed a specified number of pages. The survey will run continuously over the time period specified by the agency, and the agency will be able to access and analyze all data collected over that period.
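The actual trigger is site-specific code deployed on each website; the selection logic it describes (a random sampling percentage plus a loyalty-factor page threshold, with at most one invitation per visitor) can be sketched roughly as follows, with all parameter names illustrative rather than taken from the real trigger:

```python
import random

# Hypothetical sketch of the trigger logic described above; the real trigger
# code is customized per website, and these names are illustrative only.
def should_present_survey(pages_viewed, sampling_pct=10, loyalty_factor=3,
                          already_invited=False, rng=random.random):
    """Return True if this visitor should be offered the survey."""
    if already_invited:
        return False               # at most one invitation per visitor
    if pages_viewed < loyalty_factor:
        return False               # loyalty factor: minimum page views first
    return rng() * 100 < sampling_pct  # random percentage of eligible visitors
```

Accepting `rng` as a parameter also makes the selection behavior easy to verify deterministically in testing.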
The survey is presented continuously throughout the subscription period because the Internet is constantly changing and websites are continuously updated. Government agencies must keep a steady pulse on what is taking place on their websites through the various scores and analyses they receive. Another reason for the survey to run continuously is that it is not possible to know in advance how quickly the necessary data will be collected or how often agency management will need results.
No personal or demographic information of the respondents is acquired through the ACSI methodology on the survey. The government agency has the option to add custom questions that are deemed necessary or beneficial in understanding citizen concerns and priorities. Often those questions are demographic in nature.
An on-line reporting facility will be available for government personnel to access the results of the data collection. The facility is hosted on a secure remote server, and a username and password are established for clients to retrieve their data. All data will be owned by the government agency involved. The results are used to create indices, which are compiled from aggregated data and measurements.
The projected estimate for each fiscal year 2015 – 2017 is as follows:
The total sample is 250 × 5,000 = 1,250,000. This number is derived from the number of surveys being conducted (250) multiplied by the estimated number of respondents per survey per year (5,000).
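The annual burden arithmetic above can be verified directly:

```python
# Sanity check of the annual sample estimate stated above.
surveys_per_year = 250          # number of surveys being conducted
respondents_per_survey = 5_000  # estimated respondents per survey per year
total_sample = surveys_per_year * respondents_per_survey
print(total_sample)  # prints 1250000
```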
3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collection must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.
To maximize the response rate, the surveys are kept short and take only 2-3 minutes to complete. Questions are brief and easy to answer. The welcome text will indicate that the data is being collected by an independent third party; that the purpose of the survey is to improve citizen satisfaction with the agency website; and that no data will be used for sales calls or other purposes. Also, the survey is presented only to a small sample of the visitor universe yet provides truly actionable information. Citizens offered a survey are unlikely to receive more than one invitation to take the survey during the one-year subscription period.
The index approach employs multiple questions to create each index, and the 1-10 rating scale used for the majority of the questions generates a mean (as compared to a proportion) that is then converted to a 0 – 100 scale. Because we know empirically that the standard deviation of this survey data tends to be approximately 20, a sample size of approximately 300 yields confidence intervals in the range of +/- 1.5 to +/- 3.5 at the 95% confidence level, and confidence intervals of +/- 2.3 to +/- 2.6 at the 90% confidence level, on the 0 – 100 scale.
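As a quick check under the usual normal approximation, taking the empirical standard deviation of 20 and the sample size of 300 stated above (the function name below is illustrative), the 95% margin of error works out to roughly +/- 2.3, which falls within the stated +/- 1.5 to +/- 3.5 range; other points in that range correspond to standard deviations somewhat above or below 20.

```python
import math

# Normal-approximation margin of error for a mean:
# z * sd / sqrt(n), with z = 1.96 for a 95% confidence level.
def margin_of_error(sd, n, z=1.96):
    return z * sd / math.sqrt(n)

moe_95 = margin_of_error(sd=20, n=300)  # about 2.26 on the 0-100 scale
```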
The intended purpose of these data collections is to guide leaders and managers in making managerial decisions about ways to improve the quality of government websites and experience by visitors to the sites. Data collected in these surveys are not used to make policy decisions.
4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.
Background
Verint/ForeSee CXA Methodology is founded on, and is an application of, the publicly available ACSI Methodology. For the past 20 years, Verint (which acquired ForeSee in 2018) has provided a predictive methodology and thought leadership to the federal government. The methodology was developed at the University of Michigan's National Quality Research Center at the Ross School of Business.
ForeSee was created in 2001 to adapt the published ACSI Methodology (i.e., the scientific measurement and analysis of customer experience or satisfaction) to the online environment, including websites, and it has been actively engaged in that mission ever since. Since that time, Verint/ForeSee has utilized survey questions to collect data on customer expectations, perceptions of quality, and perceptions of value, and has then utilized a cause-and-effect model, a statistically rigorous “structural equation model,” to quantitatively analyze the answers to those questions and generate customer satisfaction scores and benchmarks. Those scores provide a predictive framework for analyzing how likely a customer or consumer is to return for the same product or service in the future.
Since 2001, Verint/ForeSee has used that methodology and since at least 2004, Verint/ForeSee has tailored those questions to government customers’ specific needs and circumstances and provided scores and benchmarks that could be reasonably compared to what is commonly referred to as “ACSI scores.”
Verint continues to evolve the practice of the methodology and to hone the algorithms and data collection processes as websites, mobile websites, mobile apps, and market needs evolve. We remain committed to continual evolution of our practices while maintaining consistency in the services provided. Verint/ForeSee has demonstrated this drive for sustained improvement when, for example, it moved from legacy algorithms, such as PLS, to GSCA over the last decade. Additional advancements are planned through adoption of the OLS algorithm, all while presenting equally comparable results within margins of error and with general improvements in efficiency. In essence, Verint/ForeSee provides FCG customers with the advantage of a state-of-the-art application of the publicly available methodology while retaining comparability with legacy practices of the same rooted methodology.
The purpose of using a consistent methodology across interactions is to help improve the quality of goods and services available to American citizens, while the goal of using the Verint/ForeSee CXA methodology in this instance is to improve the quality of government websites to American citizens.
The Verint/ForeSee CXA methodology produces an econometric model that enables agencies to obtain insights for valuable, high-return, customer-focused decisions. An important advantage, in contrast to methods that rely solely on survey questions, is that it produces results with statistical stability and low chance variation. This helps ensure uniform and consistent results that allow cross-agency, cross-company, and cross-industry comparisons.
Testing
No tests of procedures or methods will be undertaken. The Verint/ForeSee CXA methodology is patented, and the standard questionnaire has undergone extensive, rigorous testing and study in academia and through empirical studies with several million respondents in both government and the private sector. More specifically, the E-Government website surveys have been used in the Federal Government at numerous agencies over the past 14 years and also have undergone extensive testing to identify the set of questions that increase reliability and utility while reducing burden.
5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
Questions regarding any statistical aspects employed or data collection procedures used should be directed to:
Jose Benkí, Director of Research Science
Verint/ForeSee
2500 Green Road, Suite 200
Ann Arbor, MI 48103
Email: jose.benki@verint.com
Web: www.Verint.com
Administrative questions regarding the use of this generic clearance by the U.S. Department of the Interior’s Federal Consulting Group should be directed to:
Jessica Reed
Director, Federal Consulting Group
1849 C Street, NW
Washington, DC 20240
Telephone: (202) 208-4699
Fax: (202) 513-7686
Email: Jessica_Reed@ios.doi.gov
Web: www.fcg.gov