
Supporting Statement A

for Request for Revision to a Currently Approved Collection

Generic Clearance:

Questionnaire Testing and Methodological Research for the

Medicare Current Beneficiary Survey (MCBS)

OMB Control No: 0938-1275

Contact Information:

William S. Long

Contracting Officer’s Representative, Medicare Current Beneficiary Survey

Office of Enterprise Data and Analytics/CMS

7500 Security Boulevard, Mail Stop B2-04-12

Baltimore, MD 21244

(410) 786-7927 william.long@cms.hhs.gov

(410) 786-5515 (fax)

December 18, 2020

LIST OF ATTACHMENTS

Attachment 1: 60-day Federal Register Notice

Attachment 2: Fall 2019 Advance Letter Experiment Report

Attachment 3: Request for Approval to Test Advance Mail Materials

A. JUSTIFICATION

A1. Circumstances Making the Collection of Information Necessary

The Centers for Medicare & Medicaid Services (CMS) is the largest single payer of health care in the United States. CMS plays a direct or indirect role in administering health insurance coverage for more than 120 million people across the Medicare, Medicaid, CHIP, and Exchange populations. A critical aim for CMS is to be an effective steward, major force, and trustworthy partner in supporting innovative approaches to improving quality, accessibility, and affordability in healthcare. CMS also aims to put patients first in the delivery of their health care needs.

CMS activities result in substantial data generation. Although administrative data are a critical resource for CMS and its partners, there remains an important need for self-reported data in order to obtain information that is not captured through other CMS operations. For example, a Medicare beneficiary’s satisfaction with, access to, and quality of care are important pieces of information that can be captured by obtaining the beneficiary’s unique perspective. Healthcare services not covered by Medicare, including dental, vision, and hearing visits, are collected by surveying beneficiaries, as these data are not currently available to CMS via administrative data. In addition, information on beneficiary insurance coverage and payments from non-Medicare sources (including beneficiary out-of-pocket spending) are also collected by surveying beneficiaries. Survey-collected data elements, combined with CMS administrative data, complete the picture of a beneficiary’s health care experience and provide a vital component in the development and evaluation of models and analysis conducted by CMS.

The Medicare Current Beneficiary Survey (MCBS) is the most comprehensive and complete survey available on the Medicare population and is essential in capturing data not otherwise collected through CMS operations. The MCBS is a nationally-representative, longitudinal survey of Medicare beneficiaries that is sponsored by CMS and directed by the Office of Enterprise Data and Analytics (OEDA). Interviews are usually conducted in-person using computer-assisted personal interviewing (CAPI); however, conducting interviews by phone is also permitted on the MCBS and has been since its origin.

CMS collects administrative information on the Medicare population through its claims records. However, the current administrative information collected by CMS does not provide the complete picture needed for CMS to evaluate its programs and comply with legislative mandates found in both:

  1. Section 1115A of the Social Security Act, as established by Section 3021 of the Affordable Care Act (ACA) of 2010; and

  2. Section 723 of the Medicare Prescription Drug, Improvement and Modernization Act (MMA) of 2003.

Data produced as part of the MCBS are enhanced with CMS administrative data to provide users with more accurate and complete estimates of total health care costs and utilization. The MCBS captures information on beneficiaries whether they are aged or disabled, live in the community or in a facility, or are served by managed care or fee-for-service. The MCBS has been continuously fielded since September 1991 and consists of three interviews per year for each survey participant. The MCBS has been at the forefront of in-person survey collection and data processing, most notably as one of the first surveys to successfully 1) implement computer-assisted personal interviewing (CAPI) and 2) match survey and claims data to adjust and correct for underreporting of health care utilization in survey responses.

The CMS vision for the MCBS is to continue to provide unique, high-quality, and high-value data in a timely manner; to continue to break ground in innovative, efficient, and analytically powerful new areas of survey administration, design, and development; and to increase the survey’s ability to develop, monitor, assess, and evaluate the impact of Center for Medicare & Medicaid Innovation (CMMI) care delivery and payment models.

To succeed in these areas, the generic clearance for MCBS Questionnaire Testing and Methodological Research encompasses development and testing of MCBS questionnaires, instrumentation, and data collection protocols, as well as a mechanism for conducting methodological experiments. The generic clearance for research and testing activities allows CMS to accomplish the following goals:

  • Improve data quality and accuracy by evaluating and revising existing questionnaire items;

  • Address emerging policy and program issues by testing new questionnaire items;

  • Reduce respondent burden by improving questionnaire items, response categories, and questionnaire flow;

  • Reduce survey costs and implement efficiencies by improving questionnaire items and interview flow, as well as considering new methods and modes of data collection;

  • Increase response rates by improving respondent materials and improving questionnaire content and flow to reduce survey length.

The current clearance includes conducting field tests and experiments, including split-ballot experiments, within the MCBS production environment. This revision expands the methods to allow for field tests outside of MCBS production. The key difference is that tests conducted within production do not incur any additional burden on respondents, whereas tests conducted outside production must account for additional respondent burden. For example, on May 7, 2020, OMB approved CMS-10549 GenIC #7, MCBS COVID-19 Rapid Response Supplement Testing, under this Generic Clearance. The field test was conducted with MCBS respondents living in the community from June 10 to July 15, 2020. While it was conducted with MCBS respondents, it was a separate supplement to the main MCBS.

This clearance requests approval for six types of potential research activities:

  1. cognitive interviewing

  2. focus groups

  3. usability testing

  4. field testing

    a. within the MCBS production environment

    b. outside the MCBS production environment

  5. respondent debriefings

  6. research about incentives

Whether within production or outside of production, the field tests will be used to test questionnaire items or changes in protocols for potential future inclusion in the MCBS. CMS will submit individual collection requests under this generic clearance, and will provide OMB with a memo explaining the specific purpose and procedures for each collection, as well as copies of all questionnaires, protocols, consent forms, and debriefing materials in advance of any testing activity.

NORC at the University of Chicago, under contract with CMS to administer the MCBS, will conduct activities under this generic clearance. NORC employs methodological specialists, research scientists and public health analysts, who will collaborate with CMS to examine questionnaire and data collection protocols from MCBS and compare those with the “state of the science” in other federal agencies, or other academic or professional institutions. Specific topics to be addressed will be outlined in individual collection requests under the generic clearance. All data collection and analysis will be performed in compliance with OMB, Privacy Act, and Protection of Human Subjects requirements.

The general methods proposed for the six types of research activities under this revised clearance are described below.

1. Cognitive Interviewing. Cognitive pretesting is an important innovation in the development and testing of survey questionnaires that emerged over 30 years ago. Its chief strength is in providing a structured methodology for ascertaining whether the respondent has understood the questions in the way CMS and researchers intend them to be understood, and to assess the ability of respondents to provide meaningful and accurate information. Cognitive interviewing is done through the administration of questions by a specially trained and experienced cognitive interviewer, followed by probes to ascertain comprehension, memory, judgment processes, and topic sensitivity. A secondary purpose is to make sure that issues pertinent to the research are covered adequately. The cognitive interviewing process often includes techniques, such as observation and coding of respondent behaviors (e.g., responses of “don’t know” and requests for question clarification), and in-depth debriefings with respondents, survey methodologists and interviewers to fully understand the functioning of a survey questionnaire.

Cognitive interviewing offers a detailed depiction of question interpretations and the processes used by respondents to answer questions; these processes ultimately produce the survey data. Cognitive interviewing is useful not only for investigating individual questionnaire items, but also for understanding how contextual factors, such as instructions to the respondent or question order, can influence responses and contribute to measurement error. As such, the method offers insight that can transform understanding of question validity and response error.

Respondents are typically not selected through a random process, but rather are selected for specific characteristics such as age, health status or some other attribute that is relevant to the type of questions being tested. Because the goal is to identify the presence of problems, as opposed to making estimations or causal statements, a randomly drawn sample is not required.

The interview structure consists of respondents first answering a draft survey question and then providing explanations to reveal the processes involved in answering the test question. Specifically, cognitive interview respondents are asked to describe how and why they answered the question as they did. Through the interviewing process, various types of question-response problems that would not normally be identified in a traditional survey interview, such as interpretive errors and recall accuracy, are uncovered.

Data collection procedures for cognitive interviewing are different from survey interviewing. While survey interviewers strictly adhere to scripted questionnaires, cognitive interviewers use survey questions as starting points to begin a more detailed discussion of questions themselves: how respondents interpret key concepts, their ability to recall the requested information and to formulate an answer, and the appropriateness of response categories. Because the interviews generate narrative responses rather than statistics, results are analyzed using qualitative methods. This type of in-depth analysis reveals problems in particular survey questions and, as a result, can help to improve the overall quality of the MCBS. Results of cognitive interviews will be used to make questionnaire design decisions that minimize survey response error; to enhance our understanding of the question response process; to develop better standards for questionnaire design; and to improve data collection procedures. Because of the programming costs involved, cognitive testing is typically administered by a trained questionnaire methodologist using a paper form that simulates asking the questionnaire items via CAPI or phone.

Cognitive interviewing methodology identifies problems that are missed by traditional field tests. Field interviewers may not be sufficiently trained to identify questionnaire problems, and such tests are often conducted too late to allow for substantial revisions to be made. Nevertheless, field tests are a vital complement to cognitive interviews because they can provide important information about how new or revised questions perform in a production environment.

  2. Focus Groups. Focus groups are used to obtain insights into target respondent perceptions, attitudes, and experiences during questionnaire and materials development and testing. Focus groups are usually composed of 8 to 10 people who have characteristics similar to the target survey population, or subgroups of the target population. The groups are conducted by a professional moderator who keeps the session on track while allowing respondents to talk openly and spontaneously. The moderator uses a loosely structured discussion outline, which allows the moderator to change direction as the discussion unfolds and new topics emerge. The interactive nature of a focus group often encourages a richer discussion than would be possible in individual interviews.

  3. Usability Testing. Research on computer-user interface designs for computer-assisted instruments is often referred to as “usability testing.” This research examines how survey questions, instructions, and supplemental information are presented on computer instruments (e.g., CAPI or Computer-Assisted Self-Interviewing (CASI) instruments, Audio Computer-Assisted Self-Interview (ACASI) instruments, or web-based instruments) and investigates how their presentation affects the ability of users to effectively utilize and interact with these instruments. Authors of computer-assisted instruments make numerous design decisions: how to position the survey question on a computer screen; how to display interviewer instructions that are not to be read to respondents; the maximum amount of information that can be effectively presented on one screen; how supplemental information such as “help screens” should be accessed; whether to use different colors for different types of information presented on the screen; and so on. Research has shown that these decisions can have a significant effect on the time required to administer survey questions, the accuracy of question-reading, the accuracy of data entry, and the full exploitation of resources available to help the user complete his or her task.

Usability testing has many obvious similarities to questionnaire-based cognitive research, since it focuses on the ability of individuals to understand and process information in order to accurately complete survey data collection. It is also somewhat different, in that the typical user can be a field interviewer (in the case of CAPI instruments) as well as a respondent (in the case of CASI/ACASI instruments). It also focuses more heavily on matters of formatting and presentation of information than traditional cognitive testing. In addition, usability testing can be informative to the development of web-based surveys. While MCBS does not currently include a web-based design, future investigations could focus on testing web-based responses, especially for facility data collection.

  4. Field tests. Field tests are a well-established method to ensure that changes to survey materials, protocols, and questions do not result in measurement error or bias. They can be used with relatively small numbers of respondents or with a larger group, as in a split-ballot experiment. In addition to testing variations of in-person and telephone data collection, methodological experiments could also involve testing other modes of data collection, such as a self-administered paper questionnaire (SAQ) or a self-administered web survey. For the MCBS, CMS has demonstrated that field tests conducted both within the production environment and outside of production (e.g., as a short standalone survey) have been useful and efficient while limiting increases to respondent burden.

    a. Within the MCBS production environment. This research program will evaluate changes to the questionnaire and/or data collection procedures through tests or experiments within the production environment. Conducting tests within the production environment maximizes efficiency, reduces costs, and does not increase respondent burden. For example, CMS-10549 GenIC #4, MCBS Testing of Revised Advance Letter, provided a mechanism to conduct a split-ballot experiment within production to determine whether a new advance letter would improve response. The experiment was conducted from July 2019 through December 2019, and results indicated that the test letter performed slightly better than the original letter. Because it was conducted within the production environment, there was no additional respondent burden, as the burden is accounted for in the main MCBS clearance (0938-0568). See Attachment 2 for a report on this research.

Field tests within the production environment will be used after cognitive interviews or other survey research methods have been completed to test new or revised questions, revisions to questionnaire flow, or data collection methods. Professional MCBS field interviewers will be trained to administer these test questions or changes to the instrument flow. A subset of these interviews may be observed by a survey professional from CMS and/or NORC. When an interview is observed, the observer compiles notes on respondent misunderstandings or difficulty answering, questions that interviewers have difficulty administering, and difficulties with new data collection methodologies; these notes help identify potential question revisions. Analysis of outcome data such as response rates and response distributions for key items, paradata (e.g., response times), interviewer observations, and respondent debriefing data will be planned and described in information collection requests to OMB. Subject matter staff are debriefed on these findings, and if changes are required, the results of the field test will be used to modify the questionnaire or data collection procedures for follow-up field tests prior to recommending changes to the production instrument.

    b. Field tests outside of the MCBS production environment. These field tests will include new or revised survey items, usually with existing MCBS respondents or with respondents from expired panels (e.g., from the exit round panel composed of respondents who have recently completed their final, 11th MCBS interview), but occasionally with new respondents recruited only for the purpose of the test. The recruitment approach will be detailed in the information collection requests submitted for OMB approval. Based on CMS’ recent experience developing and testing the MCBS COVID-19 Rapid Response Supplement (CMS-10549 GenIC #7, approved May 7, 2020) as a field test outside of the production environment, we have demonstrated a cost-efficient method to test new survey questions, especially to address critical or emerging health policy concerns that must be fast-tracked to obtain measures. Tests conducted outside production incur additional respondent burden, as specified in the GenIC request to OMB, because that burden is not accounted for in the main MCBS clearance (0938-0568).

As with field tests within production, the main objective of field tests outside of production is to evaluate new questionnaire items, with the goal of assessing how well they perform in the field, obtaining accurate administration timings, and identifying any issues with question wording or non-response. Professional MCBS field interviewers will be trained to administer these test questions; depending on the objective of the test, administration could be in various modes. A subset of these interviews may be observed by a survey professional from CMS and/or NORC. When an interview is observed, the observer compiles notes on respondent misunderstandings or difficulty answering, questions that interviewers have difficulty administering, and difficulties with new data collection methodologies; these notes help identify potential question revisions. Subject matter staff are debriefed on these findings, and if changes are required, the results of the field test will be used to modify the questionnaire or data collection procedures for follow-up field tests prior to recommending changes to the production instrument.

  5. Respondent debriefings. In this method, standardized debriefings are administered to respondents who have participated in a field test. The debriefing form is administered at the end of the interview and contains questions that probe to determine how respondents interpret the questions, whether they have problems in completing the survey/questionnaire, or whether they have questions or concerns about new procedures being tested. This structured approach to debriefing enables quantitative analysis of data from a representative sample of respondents, to learn whether respondents can answer the questions and whether they interpret them in the manner intended by the questionnaire designers. The debriefing would be administered by professionally trained MCBS field interviewers.

  6. Research about incentives. In the original design of the MCBS, $3.00 was provided to each community survey participant at each interview. In the early 1990s, the battery life of CAPI laptops could be unreliable, especially if an interviewer was conducting multiple interviews in the course of a day. Therefore, interviewers were instructed to plug in their laptops, if they could, while conducting the in-person interview. Because many MCBS survey participants live on limited incomes, and to be mindful of any added costs, interviewers offered $3.00 to cover any cost associated with the electricity used during the interview. This approach was cleared in the original OMB clearance and all subsequent applicable clearances.

In 2008 the MCBS faced a challenging budget year. As a result, CMS, in consultation with the MCBS contractor at the time, determined that the $3.00 electrical usage compensation was no longer a necessity. Laptops were commonplace in the community, and there was no longer the apprehension associated with plugging them into a respondent’s outlet that there once was. In actuality, the $3.00 compensation was seen by most survey participants as a very small token of appreciation. Starting in 2009, the compensation was phased out over the course of four years for continuing survey participants and was eliminated for all new panels entering the survey.

Independent of the prior use of the $3.00 compensation, CMS, like other national survey sponsors, has seen a small but steady drop in response rates over time. The response rate for the incoming panel has declined from 84 percent in 2001 to 56 percent in 2018. These respondents enter the survey each fall, and their cooperation rates have a long-lasting impact on the quality of the data over their four-year period of participation. Therefore, incentives to improve the response rates of the incoming panel would be targeted at gaining initial cooperation. Of similar interest would be incentive experiments targeted at reducing attrition over the life of enrolled respondents.

CMS may, in the future, request approval to evaluate what impact incentives could have on the MCBS response rate. This evaluation would begin with an environmental scan of the state of the science on respondent incentives in longitudinal surveys and other Federal surveys. Based on these findings, CMS would consult with OMB about the kinds of experiments that would both inform the statistical community at large and provide information about improving the quality of MCBS data and potentially reducing survey costs.

This clearance also includes a request for approval to test advance mail materials. This request was approved by OMB on May 2, 2020 (CMS-10549 GenIC #5). The original plan was to field the test during the Fall 2020 Round 88 MCBS production cycle. However, due to the coronavirus pandemic, data collection for that round was conducted only by phone. Since the materials are designed for in-person data collection, the test was postponed. Therefore, this revision to the Generic Clearance also includes, in Attachment 3, a request to conduct the experiment in Fall 2021 Round 91.

A2. Purpose and Use of Information Collection

The information collected will be used by CMS staff to evaluate and improve the quality of the data in the MCBS survey. The MCBS has remained virtually unchanged in methodology and content since it was first fielded in 1991, while the state of the science has adapted to the ever-changing Medicare and health care survey environment. To address the need for modernization, the MCBS, through its contractor, will conduct cognitive interviews, focus groups, usability testing, field tests, respondent debriefings, and research on incentives.

The qualitative and quantitative data collected under this testing research program will aid CMS in its overarching goals for administering the MCBS: improving data quality; addressing emerging issues; reducing respondent burden; reducing survey costs and implementing efficiencies; and increasing response rates.

A3. Use of Information Technology and Burden Reduction

Appropriate technology will be used during testing to keep respondent burden to a minimum. All cognitive testing will be facilitated by an interviewer; however, automated data collection methods such as Computer-Assisted Personal Interviewing (CAPI) and Audio Computer-Assisted Self-Interview (ACASI), as well as web-based interviews, may be used to reduce respondent burden. Field testing conducted with MCBS samples during the course of regular fieldwork (i.e., within the production environment) will typically employ the usual CAPI data collection method used on the MCBS but could also include tests of other modes of survey administration. Field testing conducted outside of the production environment could be conducted using CAPI, phone, or other modes of survey administration.

A4. Efforts to Identify Duplication and Use of Similar Information

This testing and methodological research program does not duplicate any other questionnaire design work being done by CMS or other Federal agencies. No information to be obtained from the proposed testing currently exists. The research may involve collaboration with staff from other agencies. All efforts will be collaborative and no duplication in this area is anticipated.

A5. Impact on Small Businesses and Other Small Entities

Most of the data collected under the MCBS Generic Clearance will be from individuals in households. However, it is possible that some testing will be conducted in long-term care facilities. Medicare beneficiaries selected in the MCBS sample also reside in government-sponsored, non-profit, and for-profit institutions such as nursing and personal care homes. Some of these institutions likely qualify as small businesses. For data collected on sample persons in these institutions, their employees serve as proxies for each sample person in their care. The data collection procedures are designed to minimize the burden on facility staff by utilizing as much administrative data as possible to streamline the data collection process.

A6. Consequences of Collecting the Information Less Frequently

This clearance involves one-time data collection for each testing activity. If the research program is not conducted, new or revised questions or data collection protocols cannot be tested, thus potentially impacting the quality of the data if implemented without testing.

A7. Special Circumstances Relating to Guidelines of 5 CFR 1320.5

None of the special circumstances listed by OMB apply to this MCBS research program.

A8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside Agencies

The 60-day Federal Register notice was published on October 21, 2020 (85 FR 66991).

No comments were received.


The 30-day Federal Register notice was published on December 23, 2020 (85 FR 83967).

A9. Explanation of Any Payment or Gift to Respondents

Respondents for testing activities conducted using cognitive interviews, usability testing, and focus groups under this clearance will receive a small incentive. This practice has proven necessary and effective in recruiting subjects to participate in this type of small-scale research, and is also employed by other Federal cognitive laboratories such as the National Center for Health Statistics Collaborating Center for Questionnaire Design and Evaluation Research. The standard incentive for participation in a cognitive interview is $40 for adults, and for participation in a focus group it is $50 - $75, unless approval is granted by OMB on a case-by-case basis to pay a higher incentive. Respondents for methods that are generally administered as part of field test activities (whether within production or outside of the production environment) will not receive payment unless there are extenuating circumstances that warrant it. Any planned use of incentives will be included in the specific information collection requests submitted to OMB for approval.

A10. Assurances of Confidentiality Provided to Respondents

On February 14, 2018, CMS published in the Federal Register a notice of a modified or altered System of Records (SOR), System No. 09-70-0519 (83 FR 6591).

All respondents who participate in research under this clearance will be informed that the information they provide will be kept private and that their participation is voluntary.

For field testing activities, MCBS advance letters contain a reference to the Privacy Act of 1974, as amended. Additional materials (including a handout sheet provided to the household respondent at the door and the nursing home administrator and proxy respondents) contain a statement of privacy consistent with the Privacy Act of 1974 and the Paperwork Reduction Act of 1995.

For field testing outside of the production environment, respondents will usually receive an advance letter letting them know that they will be contacted by a field interviewer to conduct a test of new or revised questionnaire items. Prior to beginning the survey, the interviewers will read a consent script that explains the purpose of the test and informs respondents that the information they provide will be kept private and that their participation is voluntary. If respondents agree to participate, the interviewer will begin administering the survey items. Again, the specific protocol will be detailed in the specific information collection request submitted to OMB for approval.

Interviewer training stresses the importance of maintaining privacy. The MCBS interviewer's manual specifically addresses this, and it is part of the training for all interviewers, whether the interview takes place in the household or in a long-term care facility. Procedures have been established to maintain and ensure privacy. These include computer security procedures (laptop password encryption).

Any data published from this research will exclude information that might lead to the identification of specific individuals (e.g., ID number, claim numbers, and location codes). CMS will take precautionary measures to minimize the risks of unauthorized access to the records and the potential harm to the individual privacy or other personal or property rights of the individual.

All MCBS survey staff directly involved in MCBS data collection and/or analysis activities are required to sign confidentiality agreements. Furthermore, all MCBS patient-level data are protected from public disclosure in accordance with the Privacy Act of 1974, as amended.

A11. Justification for Sensitive Questions

In general, the MCBS does not ask sensitive questions. However, for a small number of respondents, there may be some questionnaire items that are considered to be sensitive, including questions regarding income and assets, food security, alcohol use, obesity screening, mental health screening, HIV testing, and respondents’ perceptions of their health care. All interviewers are trained on how to handle respondent concerns about questions being sensitive. It is possible that some potentially sensitive questions may be included in questionnaire items that are tested under this clearance. One of the purposes of the testing is to identify such questions, determine sources of sensitivity, and alleviate them insofar as possible before the items are incorporated into the main MCBS questionnaires. If there is a need to test sensitive questions, it will be explained and justified in the specific information collection submitted to OMB for approval.

A12. Estimates of Annualized Burden Hours and Costs

Table 1 is based on the maximum number of data collections expected on an annual basis under this generic clearance. The total estimated respondent burden and costs are calculated below. Please note that for field tests within production, our plan is to conduct these efforts with respondents from active MCBS panels; therefore, the burden for their time is captured in the main MCBS clearance, 0938-0568. The request to test advance mail materials found in Attachment 3 will be conducted during Fall 2021 Round 91 production; therefore, no burden is reported for this test in this generic clearance request.

Table 1. Estimated Annual Reporting Burden, by Anticipated Data Collection Methods

Data Collection Method                          Number of     Frequency of   Hours Per   Total
                                                Respondents   Response       Response    Hours
Cognitive Interviews                                     75        1           1.50       112.50
Focus Group Interviews                                   40        1           1.50        60
Usability Testing Sessions                               40        1           1.50        60
Field tests outside of production environment        11,000        1           0.33      3,630
Respondent Debriefing                                   500        1           0.167        84
TOTAL                                                11,655                              3,947

The estimated annualized costs to respondents are based on Bureau of Labor Statistics (BLS) data from May 2019 (http://www.bls.gov/oes/current/oes_nat.htm). The mean hourly wage for all occupations is $25.72.

The estimated annualized costs are outlined in Table 2.

Table 2. Estimated Annual Costs

Data Collection Method                          Mean Hourly Wage   Total Hours   Total Costs
Cognitive Interviews                                 $25.72           112.50        $2,894
Focus Group Interviews                               $25.72            60           $1,543
Usability Testing Sessions                           $25.72            60           $1,543
Field Tests outside of production                    $25.72         3,630          $93,364
Respondent Debriefings                               $25.72            84           $2,160
TOTAL                                                               3,947         $101,540
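
For reference, each Table 1 row multiplies the number of respondents by the frequency of response and the hours per response, and each Table 2 row multiplies the resulting hours by the $25.72 mean hourly wage, rounded to whole dollars. The short Python sketch below is illustrative only; the script and the names in it are not part of the MCBS or this clearance, and the input values are simply copied from Table 1. It reproduces the burden-hour arithmetic behind the table.

```python
# Illustrative sketch only: reproduces the burden-hour arithmetic behind Table 1.
# Values are copied from Table 1; the script and its variable names are not part of the MCBS.

# activity: (number of respondents, frequency of response, hours per response)
activities = {
    "Cognitive Interviews": (75, 1, 1.50),
    "Focus Group Interviews": (40, 1, 1.50),
    "Usability Testing Sessions": (40, 1, 1.50),
    "Field tests outside of production": (11_000, 1, 0.33),
    "Respondent Debriefing": (500, 1, 0.167),
}

total_respondents = 0
total_hours = 0.0
for name, (respondents, frequency, hours_per_response) in activities.items():
    burden_hours = respondents * frequency * hours_per_response  # Table 1 "Total Hours"
    total_respondents += respondents
    total_hours += burden_hours
    print(f"{name}: {burden_hours:,.2f} hours")

# Table 1 rounds the Respondent Debriefing row (500 x 0.167 = 83.5 hours) to 84 and
# reports a rounded grand total of 3,947 hours; the unrounded rows sum to 3,946 hours.
print(f"Total respondents: {total_respondents:,}")   # 11,655
print(f"Total annual burden: {total_hours:,.1f} hours")
```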

A13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers

None.

A14. Annualized Costs to the Federal Government

At this time, we cannot anticipate the actual number of participants, length of interview, and/or mode of data collection for the surveys to be conducted under this clearance. Thus, it is impossible to estimate in advance the cost to the Federal Government. Costs will be covered by CMS under the existing MCBS budget.

A15. Explanation for Program Changes or Adjustments

The currently approved estimated burden for this generic clearance is 3,295 hours. An increase of 652 hours is being requested to accommodate additional field testing conducted outside of the production environment (MCBS production is fielded under 0938-0568). This request would bring the total estimated burden for this generic clearance to 3,947 hours.

A16. Plans for Tabulation and Publication and Project Time Schedule

This clearance request is for questionnaire development activities and for developmental work that will guide future questionnaire design and data collection protocols. The majority of testing (cognitive interviews, focus groups) will be analyzed qualitatively. The survey designers and methodologists serve as interviewers and use detailed notes and transcriptions from the in-depth cognitive interviews to conduct analyses. Final reports will document how the questions performed in the interviews, including question problems as well as the phenomena captured by the survey questions. Reports are used to guide the redesign of questions prior to fielding and to assist end users when analyzing the survey data. For field test activities, qualitative and quantitative analysis will be performed to determine whether the new or revised questionnaire items are performing as expected and whether there are any issues with changes to the data collection procedures. Because CMS is using state-of-the-science questionnaire development techniques, methodological papers will be written that may include descriptions of response problems, recall strategies used, and quantitative analysis of frequency counts of the classes of problems uncovered through the cognitive interview and observation techniques.

A17. Reason(s) Display of OMB Expiration Date is Inappropriate

No exemption is requested.

A18. Exceptions to Certification for Paperwork Reduction Act Submissions

There are no exceptions to this certification statement.
