CONSUMER FINANCIAL PROTECTION BUREAU
INFORMATION COLLECTION REQUEST – SUPPORTING STATEMENT
PART B - COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
SURVEY FOR CONSUMER ATTITUDES, UNDERSTANDING, AND BEHAVIORS WITH
RESPECT TO FINANCIAL SERVICES AND PRODUCTS
(OMB CONTROL NUMBER: 3170-0034)
1. Respondent Universe and Selection Methods
RECRUITMENT METHODOLOGY
Knowledge Networks (KN) allows households to use their own Internet-connected computers to take
surveys. Additionally, Windows-based laptop computers and netbooks are provided to
non-Internet households. KN uses an address-based sample (ABS) frame instead of the national
Random Digit Dialing (RDD) frame. This is in response to the growing number of cell phone-only
households, including many young adults and minorities, who are outside the traditional RDD
landline telephone frame. Also, the decision to use ABS instead of RDD is motivated by declining
RDD response rates.
ABS involves probability-based sampling of addresses from the U.S. Postal Service’s Delivery
Sequence File. Randomly sampled addresses are invited to join KnowledgePanel through a series of
mailings (English and Spanish materials) and by telephone follow-up to non-responders when a
telephone number can be matched to the sampled address. This ensures that households who do not
have a computer in the home are invited to participate in the panel. Invited households can join the
panel by one of several means: completing and mailing back an acceptance form in a postage-paid
envelope; calling a toll-free hotline staffed by bilingual recruitment agents; or going to a dedicated
KN recruitment Web site and completing the recruitment information online.
In 2008, KN constructed KnowledgePanel Latino(SM), improving online panel representation by
providing netbooks and Internet service to the roughly 40% of Latinos who did not have Internet
access at the time of recruitment.
For all new panel members, demographic information such as gender, age, race/ethnicity, income,
education, and, for Latino members, language proficiency is collected in an online “profile” survey.
This information is used to determine eligibility for specific studies and eliminates the need for
gathering basic demographic information on each panel survey. After this survey is completed, the
panel member is regarded as active and ready to be sampled for other surveys. Additionally,
Knowledge Networks asks all Hispanic panel members a series of questions that its clients can use
to apply an acculturation scale. Such questions include media use, country of birth, number of years
in the U.S., and other attitude and values questions.
As of May 2010, all households entering the panel without computers are provided by Knowledge
Networks with a netbook computer, training, and Internet access.
PANEL SURVEY SAMPLING
Once panel members are profiled, they become “active” for selection for specific surveys. Profiling
consists of collecting demographic information such as gender, age, race/ethnicity, income,
education, and for Latino members, language proficiency in an online survey for new panel
members. Samples are drawn from among active members using a probability proportional to size
(PPS) weighted sampling approach. Customized stratified random sampling based on profile data is
also conducted, as required by specific studies.
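To illustrate the PPS selection step, the following sketch draws a sample of active panel members with selection probabilities proportional to an assumed size measure. It is a minimal, hypothetical Python illustration; the size measure, sample size, and random seed are invented for the example and do not represent KN's actual implementation.

    import numpy as np

    # Hypothetical frame of active panel members; each member carries a size
    # measure (e.g., a design weight) that drives the selection probability.
    rng = np.random.default_rng(seed=1)
    panel_size = 10_000
    size_measure = rng.gamma(shape=2.0, scale=1.0, size=panel_size)  # illustrative only

    # Probability proportional to size: normalize the size measures so they sum
    # to one, then draw without replacement (a sequential draw that approximates
    # PPS inclusion probabilities).
    selection_prob = size_measure / size_measure.sum()
    sample_n = 1_500
    sample_idx = rng.choice(panel_size, size=sample_n, replace=False, p=selection_prob)

    print(f"Selected {sample_idx.size} of {panel_size} active panel members")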
In September 2007, KN was assigned a patent (U.S. Patent No. 7,269,570) for its unique
methodology for selecting multiple online survey samples from a panel. The selection methodology,
which has been used by KN since 2000, assures that multiple sequential KnowledgePanel samples
from a finite panel membership will each reliably represent the U.S. population.
This sampling methodology was developed by KN in recognition of the practical issue that different
survey samples may target different panel subpopulations. It is not unusual that only panel members
with certain characteristics are selected for a survey. This selectivity can skew the remaining panel
membership demographics and affect the representativeness of later survey samples. The patented
sampling methodology was developed to correct for this in panel sampling; see U.S. Patent No.
7,269,570 for more information.
CFPB anticipates a 60% completion rate (sometimes called cooperation rate), similar to other
surveys conducted using KnowledgePanel (please see Appendix A “Response Rate and Survey
Completion Rate” for an explanation of response rate and completion rate calculations for a panel).
Panel members agreed to take online surveys when they joined the panel and are therefore more
likely to respond. A survey invitation will be emailed to the selected sample along
with three email reminders to non-respondents.
The typical response rate calculation required by OMB (as discussed on page 14 of the OMB
Statistical Standards) is similar to the AAPOR standard response rate #3. This response rate
calculation is widely used for surveys. Appendix A, “Response Rate and Survey Completion Rate,”
explains how response rates for panels are calculated differently because of the four stages of
recruiting for any one survey, starting from panel recruiting and ending with survey recruiting. As
described in Appendix A, the cumulative response rate for this survey is expected to be about 4%.
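For reference, AAPOR response rate #3 counts complete interviews against all eligible cases plus an estimated share of cases of unknown eligibility. The sketch below computes it for a set of invented case counts; the counts and the eligibility estimate e are hypothetical, and the AAPOR Standard Definitions remain the authoritative source for the formula.

    # AAPOR RR3 with hypothetical case counts (illustration only).
    I  = 600    # complete interviews
    P  = 50     # partial interviews
    R  = 300    # refusals and break-offs
    NC = 150    # non-contacts
    O  = 25     # other non-response
    UH = 200    # unknown if household/occupied
    UO = 75     # unknown, other
    e  = 0.80   # estimated share of unknown-eligibility cases that are eligible

    rr3 = I / ((I + P) + (R + NC + O) + e * (UH + UO))
    print(f"AAPOR RR3: {rr3:.1%}")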
2. Information Collection Procedures
See PANEL SURVEY SAMPLING above.
STATISTICAL WEIGHTING
KnowledgePanel sampling begins as an equal probability sample that is self-weighting with several
enhancements incorporated to improve efficiency. Since any alteration in the selection process is a
deviation from a pure equal probability sample design, statistical weighting adjustments are made to
the data to offset known selection deviations. These adjustments are incorporated in the sample’s
base weight.
There are also several sources of survey error that are an inherent part of any survey process, such as
non-coverage and non-response due to panel recruitment methods and to inevitable panel attrition.
These sources of sampling and non-sampling error are addressed using a panel demographic post-stratification weight as an additional adjustment.
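A minimal sketch of the kind of demographic post-stratification adjustment described above, assuming a single hypothetical weighting dimension (age group) and invented population benchmarks; the actual KnowledgePanel weighting cells and benchmark sources are not specified in this document.

    import pandas as pd

    # Hypothetical respondent file carrying a base weight and one demographic
    # cell variable. Real post-stratification uses several crossed dimensions
    # benchmarked to population figures.
    resp = pd.DataFrame({
        "base_weight": [1.0, 1.0, 1.2, 0.8, 1.0, 1.1],
        "age_group":   ["18-34", "18-34", "35-54", "35-54", "55+", "55+"],
    })

    # Illustrative population shares for each cell (assumed benchmarks).
    pop_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

    # Post-stratification factor: population share divided by the weighted
    # sample share within each cell.
    total_w = resp["base_weight"].sum()
    cell_w = resp.groupby("age_group")["base_weight"].transform("sum")
    resp["post_strat_weight"] = (
        resp["base_weight"] * resp["age_group"].map(pop_share) / (cell_w / total_w)
    )

    print(resp[["age_group", "base_weight", "post_strat_weight"]])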
Even with this weighting, it is important to note that the panel includes people who use the Internet
but does not include those who would refuse even a free Internet connection. Therefore, the results
of the survey will be applicable to those who are willing to use the Internet, whether at home (paid
or free) or at any other location, such as at work or a library. To better understand how those who
refuse to use the Internet might differ, we will analyze differences in the responses of members who
first obtained an Internet connection when they joined the panel and of members who use the
Internet less frequently.
3. Methods to Maximize Response Rates and Address Issues of Non-Response
Every effort is made to obtain responses from all invited respondents. However, some degree of
non-response is expected in every survey. In order to minimize non-response, KnowledgePanel
employs the following procedures:
• Ensure that all surveys contain clear language about their intent, use, and purpose;
• Ensure that respondents receive a survey instrument that is well structured and contains only those
  questions that are necessary for the intended purpose;
• Support survey respondents with staffed help lines during virtually all daytime hours;
• Provide periodic reminder emails to alert respondents that they have survey invitations and to
  explain how to access them; and
• Provide an ongoing loyalty program that incentivizes responses to surveys by providing a “Thank
  You” for respondents’ efforts. The standard incentive has two classes: those who provide their own
  computer and ISP and those who use a computer and ISP provided as part of their recruitment into
  KnowledgePanel. Those who use their own computer and ISP receive 1,000 loyalty points for
  completing this survey; 1,000 loyalty points is equivalent to $1.00 and is deposited into their
  account for future use. When a computer and ISP are provided to a respondent, Internet access
  when not taking a KnowledgePanel survey represents the standard incentive.
In addition, the demographic information for those who do not complete a survey is available to
researchers to allow for examination of the demographics of non-respondents. A full non-response
analysis will not be conducted due to the nature of this survey. The core objective of the survey is to
measure consumers’ awareness, understanding, and behaviors with respect to consumer financial
services and products, and to use this knowledge to inform agency consumer engagement choices.
The survey is not at all intended to inform public policy decisions, nor is it intended to be
representative of the American public as a whole, but is only intended to provide insights for the
agency to guide consumer engagement choices.
As discussed in #1 above, the typical response rate calculation required by OMB (as discussed on
page 14 of the OMB Statistical Standards) is widely used for surveys. Appendix A, “Response Rate
and Survey Completion Rate,” explains how response rates for panels are calculated differently
because of the four stages of recruiting for any one survey, starting from panel recruiting and ending
with survey recruiting.
4. Testing of Procedures or Methods
The survey has been successfully conducted annually over the past several years.
5. Contact Information for Statistical Aspects of the Design
• Dr. Brian Griepentrog, Senior Vice President, ForsMarsh Group, LLC, 571-858-3798
• Dr. Bill Walton, Project Leader, ForsMarsh Group, LLC, 571-858-3794
Appendix A:
Response Rate and Survey Completion Rate
Calculation of a response rate for a freshly recruited sample survey is fairly simple. As published in
Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys, THE
AMERICAN ASSOCIATION FOR PUBLIC OPINION RESEARCH (AAPOR), Revised 2011, the
calculation of the response rate is the number of completed interviews divided by the number of
eligible reporting units in the initial sample.
Response rates - The number of complete interviews with reporting units divided by the
number of eligible reporting units in the sample. The report provides six definitions of
response rates, ranging from the definition that yields the lowest rate to the definition that
yields the highest rate, depending on how partial interviews are considered and how cases of
unknown eligibility are handled.
The response rate for a Panel sample is a bit more complicated. Respondents must first agree to join a
panel in order to participate in an ongoing array of surveys. In the case of KnowledgePanel®, they
must then complete an initial Profile survey. Some will ultimately leave the Panel, either by choice or
after they have been Panelists for a number of years. Finally, they must complete the specific survey
for which the calculation is made. Thus, the Response Rate is a composite of the following items:
household recruitment rate x
household profile rate x
panel retention rate x
survey completion rate
The composite of these items is likely to result in Response Rate calculations in the 6% - 8% range
for panels more than 10 years old (KnowledgePanel was initially recruited in 1999). The specific
values for each stage of the recruitment can only be calculated when the sample is identified.
As an example, the following calculation was performed for an article entitled Computing Response
Metrics for Online Panels by Mario Callegaro and Charles DiSogra (Public Opinion Quarterly, Vol.
72, No. 5, 2008, pp. 1008–1032).
HOUSEHOLD RECRUITMENT RATE (RECR) = 0.326
HOUSEHOLD PROFILE RATE (PROR) = 0.568
HOUSEHOLD RETENTION RATE (RETR) = 0.390
STUDY COMPLETION RATE (COMR) = 0.845
This example resulted in a Response Rate of 6.1%. If the first three stages of the calculation for the
current survey remained valid and the Study Completion Rate was 60%, the Response rate would be
4.3%. It is unlikely that any two surveys will have identical values for the first three stages but the
results are likely to be similar.
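The arithmetic behind these figures can be reproduced directly: multiply the four stage rates from the Callegaro and DiSogra example, then substitute the 60% completion rate assumed for the current survey. The short calculation below does exactly that.

    # Stage rates from the Callegaro and DiSogra (2008) example.
    recruitment_rate = 0.326   # household recruitment rate (RECR)
    profile_rate     = 0.568   # household profile rate (PROR)
    retention_rate   = 0.390   # household retention rate (RETR)
    completion_rate  = 0.845   # study completion rate (COMR)

    published_rr = recruitment_rate * profile_rate * retention_rate * completion_rate
    print(f"Published example response rate: {published_rr:.1%}")   # roughly 6.1%

    # Substituting the 60% completion rate assumed for the current survey.
    expected_rr = recruitment_rate * profile_rate * retention_rate * 0.60
    print(f"Expected response rate at 60% completion: {expected_rr:.1%}")  # roughly 4.3%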
It should be noted that a recent Pew Research Center report (Assessing the Representativeness of
Public Opinion Surveys, May 15, 2012) found typical response rates of 9% for recent Random Digit
Dial telephone surveys, even with the known advantage of the social pressure to participate
associated with interaction with a human interviewer.
It should be understood that respondents to a specific Panel survey are also a well-known group,
having completed multiple Profile surveys during their Panel tenure. Their characteristics
(demographic, health, financial, etc.) are well known. These same characteristics are known for those
who choose not to respond to the specific survey. Thus, survey responders and non-responders can be
evaluated for demographic and other differences quite efficiently. (Note: this can also be done for
any stage of the recruitment effort but the cost and complexity will be considerable. Prior efforts of
this type have not resulted in identifiable non-response bias.)
These Profile data are also valuable in identifying potential respondents for a specific survey. Panel
surveys can be directed at the specific set of individuals of interest, thereby potentially reducing the
screening effort and respondent burden of the survey. This is especially true for surveys targeting
identifiable subpopulations for which the Profile information is available to directly target the sample
of interest.
The Survey Completion Rate, on the other hand, is fairly simple to calculate. It is the number of
qualified responses divided by the number of invitations sent. Again, from the AAPOR publication:
Cooperation rates - The proportion of all cases interviewed of all eligible units ever
contacted. The report provides four definitions of cooperation rates, ranging from a minimum
or lowest rate, to a maximum or highest rate.
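As a simple illustration of the completion rate calculation described above, using invented counts:

    # Hypothetical counts for illustration only.
    invitations_sent    = 2_500
    qualified_responses = 1_500

    completion_rate = qualified_responses / invitations_sent
    print(f"Survey completion rate: {completion_rate:.0%}")  # 60%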
For KnowledgePanel surveys, this value generally ranges from 60% to more than 80% and is
influenced by a number of items including:
• Survey content and salience
• Survey length (shorter is better)
• Field period (longer is better)
• Survey sponsorship (name recognition generally improves cooperation)
• Use of survey-specific incentives (incentives are in the form of loyalty points or sweepstakes prizes)
Our assumption of a relatively low cooperation rate is based upon the survey content and length, a
relatively unknown sponsor, and the lack of survey-specific incentives. We believe this is a
conservative estimate. We might expect to achieve greater cooperation if we can remain in the field
for a longer period.