SUPPORTING STATEMENT B
U.S. Department of Commerce
U.S. Census Bureau
Census Household Panel
OMB Control No. 0607-XXXX
B. Collections of Information Employing Statistical Methods
The targeted initial invite sample size from the Department of Defense universe files will be 3,400 service members and 3,400 spouses. The recruitment rate to the initial screener/baseline instrument is expected to be approximately 30%, resulting in 1,000 active-duty service members and 1,000 spouses of active-duty service members being included in the baseline sample. The participation rate for each bi-monthly topical data collection is expected to be around 70%, for a conditional cumulative response rate of 21% (AAPOR, 2023)¹, resulting in a sample size of approximately 1,400 for each data collection.
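As a rough check on these planning figures, the expected yields can be sketched as follows; the rates are the stated expectations, not observed values, and the variable names are illustrative only.

```python
# Sketch of the expected yields described above, using the stated planning rates.
invited_per_file = 3400   # initial invitations per universe file (service members; spouses identical)
recruitment_rate = 0.30   # expected recruitment rate to the screener/baseline
topical_rate = 0.70       # expected participation rate in each bi-monthly topical survey

enrolled_per_file = invited_per_file * recruitment_rate   # ~1,020, targeted as 1,000 per file
cumulative_rate = recruitment_rate * topical_rate         # 0.21 conditional cumulative response rate
topical_completes = (1000 + 1000) * topical_rate          # ~1,400 completes per topical collection

print(enrolled_per_file, cumulative_rate, topical_completes)
```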
Describe the procedures for the collection of information including:
Statistical methodology for stratification and sample selection,
Estimation procedure,
Degree of accuracy needed for the purpose described in the justification,
Unusual problems requiring specialized sampling procedures, and
Any use of periodic (less frequent than annual) data collection cycles to reduce burden.
The Census Military Panel sample design will be a stratified simple random sample, with strata formed by panelist type (service member/spouse) and sub-strata formed by gender, race, marital status, and branch of service. The sample is distributed equally between the two strata and proportionally among the sub-strata. Because response rates are expected to be low, and to differ between the service member and spouse files, an initial oversample of service members and spouses will be selected. The targeted final panel is 1,000 service members and 1,000 spouses, with the sample size in each sub-stratum proportionate to the total number of panelists in the stratum.
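A minimal sketch of the proportional allocation within sub-strata, assuming hypothetical universe counts and the targeted 1,000 final panelists per stratum (the counts and cell labels are placeholders, not figures from the DoD files):

```python
# Proportional allocation of a fixed stratum target across sub-strata.
# Universe counts below are hypothetical placeholders for illustration only.
universe_counts = {
    # (gender, race, marital status, branch) -> count of service members in the universe file
    ("F", "White", "Married", "Army"): 52000,
    ("M", "Black", "Single", "Navy"): 38000,
    ("F", "Hispanic", "Married", "Air Force"): 27000,
}
stratum_target = 1000  # targeted final panelists in the service member stratum
total = sum(universe_counts.values())

# Each sub-stratum receives a share of the target proportional to its universe count.
allocation = {cell: round(stratum_target * count / total)
              for cell, count in universe_counts.items()}
print(allocation)
```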
Future refreshment samples will be drawn from the DoD universe files based on response rates observed throughout the data collection cycle. The refreshment sample will boost the panel back to the original expected sample sizes. These refreshment samples will be selected randomly within the same strata as the original sample design.
The final Census Military Panel Survey weights are designed to represent the number of enlisted service members and spouses, enabling the production of estimates at the branch of service (Army, Navy, Marine Corps, Air Force) level.
The final Census Military Panel Survey weights are created by multiplying the base weights by a nonresponse adjustment factor within each stratum and then ratio adjusting the nonresponse-adjusted weights to the population controls (the number of overall records) in the universe file provided by the DoD.
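Expressed compactly (a restatement of the steps above, not an additional adjustment), the final weight for a respondent is:

final weight = base weight x nonresponse adjustment factor (for the respondent's stratum) x ratio adjustment factor (for the respondent's calibration cells).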
Within each stratum, the nonresponse adjustment is calculated by taking the total number of sampled cases (the number of respondents and nonrespondents) within the stratum and dividing by the number of respondents from that stratum:
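Stated as a formula (restating the sentence above), for stratum h:

nonresponse adjustment factor for stratum h = (total number of sampled cases in stratum h) / (number of respondents in stratum h).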
The next step in the weighting process is a ratio adjustment. This step calibrates the weights to universe file totals within each stratum by raking to the overall total, branch of service totals, marital status totals, gender totals, race totals, and the service-by-gender and service-by-race cross totals.
For example, the ratio adjustment for married males in the Navy is the stratum population count of married males in the Navy divided by the sum of the base weights of respondents who are married male Navy service members.
Ratio adjustments will be calculated for both the service member and spouse sample files.
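A minimal sketch of the raking step under the margins listed above, implemented as iterative proportional fitting; the margin structure, column names, and totals are hypothetical, and the production adjustment may be implemented differently.

```python
import pandas as pd

def rake(df, weight_col, margins, n_iter=25):
    """Iteratively ratio-adjust weights so weighted totals match each margin.

    margins maps a tuple of column names (one raking dimension, e.g. ("branch",)
    or ("branch", "gender")) to a dict of population totals keyed by category
    (or tuple of categories). Hypothetical structure, for illustration only.
    """
    w = df[weight_col].astype(float).copy()
    for _ in range(n_iter):
        for cols, totals in margins.items():
            cols = list(cols)
            current = w.groupby([df[c] for c in cols]).transform("sum")
            key = df[cols].apply(tuple, axis=1) if len(cols) > 1 else df[cols[0]]
            target = key.map(totals)
            w = w * target / current  # scale every case in the cell by target/current
    return w

# Example call with hypothetical margins for the service member file:
# margins = {("branch",): {"Army": 480000, "Navy": 340000},
#            ("branch", "gender"): {("Army", "F"): 84000, ("Navy", "F"): 68000}}
# df["final_weight"] = rake(df, "nr_adjusted_weight", margins)
```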
To enroll in the panel, potential panelists must complete a screener/baseline questionnaire.
The Baseline Questionnaire is a 15-minute questionnaire that verifies the sampled participant, collects detailed demographic information about household members, and invites the sampled participant to join the panel. A $5 prepaid incentive will accompany the invitation to the baseline questionnaire. Panelists who complete the baseline questionnaire will receive a $20 incentive for completing the questionnaire and enrolling in the panel. These data will establish important benchmarks for subsequent analyses, including examination of the characteristics of nonrespondents and of panel members who attrit over time. The baseline questionnaire will also collect detailed contact information and permission to send text messages for survey invitations and nonresponse follow-up. A panelist is considered enrolled after completing the baseline questionnaire.
For initial recruitment, we will mail an invitation to complete the Baseline Questionnaire to the sampled participant with a visible $5 prepaid incentive. The letter will contain a unique link to the Baseline Questionnaire with a QR code, a phone number for inbound calling, and a brochure describing the panel and incentive structure. The questionnaire will include a household roster and will be programmed using Qualtrics. Respondents will complete the questionnaire on their computer, tablet, or smartphone. Those who choose to complete via phone will call into a phone line provided by NPC. Phone interviewers will have access to a Qualtrics instrument for data entry.
Three days after the initial invitation, all cases will be mailed a first reminder with a web link and inbound CATI number. This first reminder will be sent in a pressure-sealed envelope. One week later (10 days after the initial invitation), nonresponding cases with an associated phone number will receive a phone call reminder. A final mailing of the survey invitation with web link and inbound CATI number will be sent one week after the phone call reminder. Panel recruitment is expected to take place over an 8-week period. After 8 weeks, the baseline invitation will close, and enrolled panelists will be mailed a $20 incentive for completing the baseline questionnaire.
Once enrolled, panelists will be invited to respond to bi-monthly topical surveys. Invitations will be sent by email, text message (opt-in), and inbound CATI. Phone-only panelists will complete topical surveys via inbound CATI.
Data collection for each topical survey will take place in a 2-week window. New panelists will receive the first topical survey invitation 4 weeks after the initial recruitment period ends. Each topical survey will be approximately 15 minutes long and panelists will receive up to two reminders to complete a topical survey. Panelists who complete a topical survey will be mailed a thank you letter with a $10 cash incentive about 10 days after the topical survey field period closes.
Content for topical surveys will come from DoD and be provided to the Census Bureau. Some content space will be dedicated to monitoring data quality and to items necessary for nonresponse bias analysis. Census Bureau staff should be notified of content changes as soon as possible, and no less than 8 weeks before the content change is to be fielded. This allows time for cognitive testing of new items as well as expert review and programming changes. Content changes will happen no more than three times each year. Each topical survey will offer panelists an opportunity to update contact information and verify their address for incentive mailings.
Keeping panelists engaged helps prevent attrition and maintain the representativeness of the panel. We anticipate sending panelists one topical survey every other month to keep them engaged. To keep burden low and reduce panel conditioning, panelists will not be eligible for more than one survey per data collection month.
We plan to use the Audience Management functionality in Qualtrics to create a web page where panelists can view their upcoming surveys, check for mailing of incentives for past questionnaires, update their contact information, access technical assistance, and opt-out of panel participation. At least once a year, panelists will be asked to verify or update information from their original Baseline Questionnaire to ensure information about the panelist and their household is current.
Census Household Panel members will be asked to complete approximately one questionnaire every other month and will receive an incentive for each questionnaire. Panelists will be enrolled for three years and will exit the panel after that period. In addition to this three-year limit, we expect attrition due to inactivity and to requests to disenroll. Attrition can bias the panel estimates, making the development of a panel member replenishment plan vitally important (Herzing & Blom, 2019; Lugtig et al., 2014; Schifeling et al., 2015; Toepoel & Schonlau, 2017).
Panelist requests to disenroll from the panel will be identified and processed according to forthcoming protocols. Periodic nonresponse to, or refusal of, the bi-monthly requests by otherwise active panelists is expected. An inactive panelist is defined as follows:
No response or active refusal to:
a survey request for two consecutive months; or
more than 50% of survey requests within a 12-month period.
A particular questionnaire may be classified as “no response” due to unit nonresponse (i.e., no questionnaire initiation), item nonresponse that renders the interview unusable for analysis (e.g., item nonresponse to questions deemed critical for analysis, or high item nonresponse overall or after data review), or poor-quality data resulting in an unusable interview. Inactive panelists will remain members of the Census Military Panel if reengagement is desired by DoD, especially for newer service members and other targeted groups.
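As an illustrative sketch of how the inactivity rule above might be operationalized (the reading of "two consecutive months" as two consecutive bi-monthly requests, and the function itself, are assumptions; the production rule would also reflect the data-quality determinations described above):

```python
def is_inactive(responses):
    """responses: list of booleans, one per bi-monthly survey request in order,
    where True means a usable completed questionnaire (not a "no response" as
    defined above). Returns True if either inactivity trigger is met.
    The interpretation of the triggers here is an assumption for illustration."""
    # Trigger 1: no usable response to two consecutive survey requests.
    for prev, cur in zip(responses, responses[1:]):
        if not prev and not cur:
            return True
    # Trigger 2: more than 50% of requests missed within a 12-month window,
    # i.e., roughly the last six bi-monthly requests.
    recent = responses[-6:]
    if recent and sum(1 for r in recent if not r) / len(recent) > 0.5:
        return True
    return False

# Example: missed the last two bi-monthly requests -> inactive.
print(is_inactive([True, True, True, True, False, False]))  # True
```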
We will assess on an ongoing basis (and no less than quarterly) the generalizability of the panel estimates to represent the target population. Evaluative methods will include precision within important demographic and geographic characteristics, R-indicators, propensity scores, and nonresponse bias analyses (Bianchi & Biffignandi, 2017; Eckman et al., 2021; Groves & Peytcheva, 2008; Peytcheva & Groves, 2009; Rosen et al., 2014). All analyses will be released to the public in coordination with the Department of Defense.
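As one example of these evaluative methods, a sample-based R-indicator can be computed from estimated response propensities roughly as follows; this is a sketch that assumes propensities have already been modeled (e.g., by regressing response status on frame variables), and the weighting choice shown is an assumption.

```python
import numpy as np

def r_indicator(propensities, weights=None):
    """R = 1 - 2 * S(rho), where S(rho) is the (weighted) standard deviation of
    estimated response propensities; values closer to 1 suggest more
    representative response. Sketch only; see the cited R-indicator literature
    for design-based variants and bias corrections."""
    p = np.asarray(propensities, dtype=float)
    w = np.ones_like(p) if weights is None else np.asarray(weights, dtype=float)
    mean = np.average(p, weights=w)
    variance = np.average((p - mean) ** 2, weights=w)
    return 1.0 - 2.0 * np.sqrt(variance)

# Example with illustrative propensities for five sampled cases.
print(r_indicator([0.55, 0.60, 0.48, 0.52, 0.65]))
```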
Based on results from multiple analyses, we will identify any subgroups requiring replenishment. New members will be sampled and recruited using the same protocol as for initial enrollment.
Because incentives remain one of the most effective ways to encourage survey participation, the current incentive design includes the following:
Initial Invitation: $5 visible prepaid incentive with the initial invitation to complete the screener.
Baseline Questionnaire: $20 baseline contingent incentive after initial recruitment field period.
Topical Surveys: $10 for each topical survey (~15-minute average; once every other month).
Respondents will be mailed cash incentives for survey completion. NPC will coordinate incentive distribution. The incentive structure could be amended to facilitate ongoing engagement of panelists, particularly for groups of panelists that are rare or historically undercounted.
The Ask U.S. Panel Pilot was developed to test methods for a federally sponsored, probability-based, nationally representative survey panel that would include historically undercounted populations. The Pilot was designed to answer critical methodological questions about our ability to recruit and retain historically undercounted population groups in a panel. To address two related challenges that may contribute to nonresponse bias in estimates (how to engage those who are unlikely to complete an online screening questionnaire and how to include the population without internet access), we launched a two-phase panel recruitment design with subsampling for nonresponse. We oversampled populations historically missing from online surveys, namely those with low internet penetration and Hispanics, and evaluated recruitment protocols that may increase response rates and minimize the potential for nonresponse bias. We found that nonresponse follow-up efforts allowed us to reach more diverse households, such as those who do not own their home, who speak a language other than English at home, or who receive financial assistance.
Experimentally, we focused on two design elements: sponsorship and prepaid incentives. In a 2x2 design, we compared explicit government sponsorship versus no explicit sponsorship, crossed with whether a $5 prepaid incentive sent with the initial recruitment letter was visible. We found that both the explicit government sponsorship and the visible $5 incentive had a positive and significant influence on response rates. The effects remained significant even after controlling for design variables. The interaction of the two experimental conditions was also significant, such that the condition with the visible incentive and the Census Bureau brand had the highest response rate. These findings are described in the Ask U.S. Panel Pilot General Population Final Report (census.gov).
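For illustration only, a 2x2 test of this kind might be specified as a logistic regression of response status on the two treatment indicators, their interaction, and design covariates; the data below are simulated placeholders, not Pilot data, and this is not the analysis actually conducted for the Pilot.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated case-level recruitment file: one row per sampled address with 0/1
# treatment indicators and a design covariate (all values are placeholders).
rng = np.random.default_rng(0)
n = 2000
cases = pd.DataFrame({
    "visible_incentive": rng.integers(0, 2, n),
    "census_sponsor": rng.integers(0, 2, n),
    "low_internet_stratum": rng.integers(0, 2, n),
})
logit_p = (-1.0 + 0.3 * cases.visible_incentive + 0.3 * cases.census_sponsor
           + 0.2 * cases.visible_incentive * cases.census_sponsor)
cases["responded"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(int)

# Main effects, their interaction, and a design covariate as a control.
model = smf.logit("responded ~ visible_incentive * census_sponsor + low_internet_stratum",
                  data=cases).fit()
print(model.summary())
```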
We plan to continue to experiment with ways to maximize recruitment and retention using incentives for the Census Military Panel. Any experiment will be submitted to OMB as it is planned.
Statistical Design:
Anthony Tersine
Demographic Statistical Methods Division
Demographic Programs Directorate
Anthony.g.tersine@census.gov
Data Collection/Survey Design:
Jason Fields
Social Economic and Housing Statistics Division
Demographic Programs Directorate
jason.m.fields@census.gov
Jennifer Hunter Childs
Center for Behavioral Science Methods
Associate Director Research and Methodology
jennifer.hunter.childs@census.gov
Statistical Analysis:
David Waddington
Social Economic and Housing Statistics Division
Demographic Programs Directorate
david.g.waddington@census.gov
1 Standard Definitions - AAPOR (https://aapor.org/wp-content/uploads/2023/05/Standards-Definitions-10th-edition.pdf)