Crosswalk: Appendix E1

Appendix E2 2022 MUC Data Template Crosswalk.docx

Quality Payment Program (QPP)/Merit-Based Incentive Payment System (MIPS) (CMS-10621)

OMB: 0938-1314

MUC Data Template Crosswalk

CY 2022 Final Versus CY 2023 Final

Burden Impact: The changes to this form do not reflect policies in the CY 2023 Physician Fee Schedule (PFS) Final Rule for the Quality Payment Program. There are no impacts to burden as a result of any changes reflected here.

Change #1

Location: Title (Page 1)

Reason for Change: Updated date of document.

CY 2022 Final Rule text: Measures under Consideration 2021

CY 2023 Final Rule text: Measures under Consideration 2022

Change #2

Location: Instructions (Page 1)

Reason for Change: Revised the instructions to align with the current CY 2023 version of the template and to reference the measure submission tool, CMS MERIT.

CY 2022 Final Rule text:

  1. Before accessing the CMS MERIT (Measures Under Consideration Entry/Review and Information Tool) online system, you are invited to complete the measure template below by entering your candidate measure information in the column titled “Add Your Content Here.”

  2. All rows that have an asterisk symbol * in the Field Label require a response.

  3. For each row, the “Guidance” column provides details on how to complete the template and what kinds of data to include.

  4. For check boxes, note whether the field is “select one” or “select all that apply.” You can click on the box to place or remove the “X.”

  5. Row numbers are for convenience only and do not appear on the MERIT user interface.

  6. Send any questions to MMSsupport@battelle.org.

CY 2023 Final Rule text:

  1. Before accessing the CMS MERIT (Measures Under Consideration Entry/Review and Information Tool) online system, you are invited to complete the measure template below by entering your candidate measure information in the column titled “Add Your Content Here.”

  2. All rows that have an asterisk symbol * in the Field Label require a response.

  3. For each row, the “Guidance” column provides details on how to complete the template and what kinds of data to include. Unless otherwise specified, the character limit for text fields in CMS MERIT is 8,000 characters.

  4. For check boxes, note whether the field is “select one” or “select all that apply.” You can click on the box to place or remove the “X.”

  5. Numeric fields are noted, where applicable, in the “Add Your Content Here” column.

  6. Note that CMS MERIT does not accommodate text formatting, including nested tables, carriage returns, and indented bulleted lists.

  7. Row numbers are for convenience only and do not appear on the CMS MERIT user interface.

  8. Send any questions to MMSsupport@battelle.org.

Change #3

Location: Page 1, Measure Information, Row 001, Guidance column

Reason for Change: To provide additional information on measure title.

CY 2022 Final Rule text: N/A

CY 2023 Final Rule text: For additional information on measure title, see: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Downloads/Blueprint.pdf

Change #4

Location: Whole Document – Add Your Content Here Column

Reason for Change: Added to provide guidance on where to add content.

CY 2022 Final Rule text: N/A

CY 2023 Final Rule text: ADD YOUR CONTENT HERE

Change #5

Location: Page 2, Measure Information, Row 002, Guidance

Reason for Change: Added to provide additional information on measure description.

CY 2022 Final Rule text: N/A

CY 2023 Final Rule text: For additional information on measure description, see: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Downloads/Blueprint.pdf

Change #6

Location: Page 2, Measure Information, Row 003

Reason for Change: Relocated the program selection row to earlier in the template.

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

[ADD YOUR CONTENT HERE]

N/A

102

*Select the CMS program(s) for which the measure is being submitted.

Select all that apply.

If you are submitting for MIPS, there are two choices of program. Choose MIPS-Quality for measures that pertain to quality and/or efficiency. Choose MIPS-Cost only for measures that pertain to cost. Do not enter both MIPS-Quality and MIPS-Cost for the same measure.

Because you selected MIPS, you are required to download the MIPS Peer Reviewed Journal Article Template and attach the completed form to your submission using the “Attachments” page.

  • Ambulatory Surgical Center Quality Reporting Program

  • End-Stage Renal Disease (ESRD) Quality Incentive Program

  • Home Health Quality Reporting Program

  • Hospice Quality Reporting Program

  • Hospital-Acquired Condition Reduction Program

  • Hospital Inpatient Quality Reporting Program

  • Hospital Outpatient Quality Reporting Program

  • Hospital Readmissions Reduction Program

  • Hospital Value-Based Purchasing Program

  • Inpatient Psychiatric Facility Quality Reporting Program

  • Inpatient Rehabilitation Facility Quality Reporting Program

  • Long-Term Care (LTC) Hospital Quality Reporting Program

  • Medicare and Medicaid Promoting Interoperability Program for Eligible Hospitals and Critical Access Hospitals (CAHs)

  • Medicare Shared Savings Program

  • Merit-based Incentive Payment System-Cost

  • Merit-based Incentive Payment System-Quality

  • Part C and D Star Ratings [Medicare]

  • Prospective Payment System-Exempt Cancer Hospital Quality Reporting Program

  • Skilled Nursing Facility Quality Reporting Program

  • Skilled Nursing Facility Value-Based Purchasing Program

CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Information

003

*Select the CMS program(s) for which the measure is being submitted.

Select all that apply. Please note, measures specified and intended for use at more than one level of analysis must be submitted separately for each level of analysis (e.g., individual clinician, facility). If you choose multiple programs for this submission, please ensure the programs fall under the same level of analysis. If you choose multiple programs and need guidance as to whether your selection represents multiple levels of analysis, please contact MMSSupport@battelle.org. There is functionality within CMS MERIT to decrease the data entry process for multiple submissions of the same measure. Please reach out to MMSSupport@battelle.org for guidance and support. If you are submitting for MIPS, there are two choices of program. Do NOT enter both MIPS-Quality and MIPS-Cost for the same measure. Choose MIPS-Quality for measures that pertain to quality and/or efficiency. Choose MIPS-Cost only for measures that pertain to cost. Because you selected MIPS, you are required to download the MIPS Peer Reviewed Journal Article Template and attach the completed form to your submission using the “Attachments” page.

  • Ambulatory Surgical Center Quality Reporting Program

  • End-Stage Renal Disease (ESRD) Quality Incentive Program

  • Home Health Quality Reporting Program

  • Hospice Quality Reporting Program

  • Hospital-Acquired Condition Reduction Program

  • Hospital Inpatient Quality Reporting Program

  • Hospital Outpatient Quality Reporting Program

  • Hospital Readmissions Reduction Program

  • Hospital Value-Based Purchasing Program

  • Inpatient Psychiatric Facility Quality Reporting Program

  • Inpatient Rehabilitation Facility Quality Reporting Program

  • Long-Term Care (LTC) Hospital Quality Reporting Program

  • Medicare and Medicaid Promoting Interoperability Program for Eligible Hospitals and Critical Access Hospitals (CAHs)

  • Medicare Shared Savings Program

  • Merit-based Incentive Payment System-Cost

  • Merit-based Incentive Payment System-Quality

  • Part C and D Star Ratings [Medicare]

  • Prospective Payment System-Exempt Cancer Hospital Quality Reporting Program

  • Skilled Nursing Facility Quality Reporting Program

  • Skilled Nursing Facility Value-Based Purchasing Program

Change #7

Location: Page 2

Reason for Change: Added an instructional row indicating that Row 004 becomes optional if “Merit-based Incentive Payment System-Quality” is selected in Row 003.

CY 2022 Final Rule text: N/A

CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

n/a

n/a

If you select “Merit-based Incentive Payment System-Quality” in Row 003, then Row 004 becomes an optional field.

n/a

This is not a data entry field.

Change #8

Location: Page 3

Reason for Change: Relocated row to earlier in the template.

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

N/A

103

MIPS Quality: Identify any links with related Cost measures and Improvement Activities

For MIPS Quality measures only: Where available, provide description of linkages and a rationale that correlates this MIPS quality measure to other performance category measures and activities.


CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Information

004

MIPS Quality: Identify any links with related Cost measures and Improvement Activities

For MIPS Quality measures only: Where available, provide description of linkages and a rationale that correlates this MIPS quality measure to other performance category measures and activities.

ADD YOUR CONTENT HERE

Change #9

Location: Page 3

Reason for Change: Relocated row to earlier in the template.

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

State of Devel.

021

*State of Development

Select all that apply. Before selecting “Conceptualization” or “Specification,” or “Field Testing,” check program requirements.

  • Conceptualization

  • Specification

  • Field Testing

  • Fully Developed

CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Information

005

*State of Development

Select one. Note that fully developed measures are highly preferred. See the definition of fully developed measure within CMS MERIT for guidance. For additional information regarding state of development, see: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Downloads/Blueprint.pdf

  • Conceptualization

  • Specification

  • Field (Beta) Testing

  • Fully Developed

Change #10

Location: Page 3

Reason for Change: Added a new conditional row: Row 006 is required if you select “Conceptualization,” “Specification,” or “Field (Beta) Testing” in Row 005, and may be skipped if you select “Fully Developed.”

CY 2022 Final Rule text: N/A

CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

n/a

n/a

If you select “Conceptualization,” “Specification”, or “Field (Beta) Testing” in Row 005, then Row 006 becomes a required field. If you select “Fully Developed” in Row 005, then skip to Row 007.

n/a

This is not a data entry field.

Change #11

Location: Page 3

Reason for Change: Relocated row to earlier in the template.

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

State of Devel.

022

State of Development Details

If “Conceptualization,” or “Specification,” describe when testing is planned (i.e., specific dates), what type of testing is planned (e.g., alpha, beta) as well as the types of facilities in which the measure will be tested.

If “Field Testing” or “Fully Developed,” describe what testing (e.g., alpha, beta) has taken place in addition to the results of that testing.

Summarize results from validity testing and reliability testing. For additional information, see: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Downloads/Blueprint.pdf


CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Information

006

*State of Development Details

If “Conceptualization,” “Specification,” or “Field (Beta) Testing,” describe when testing is planned (i.e., specific dates), what type of testing is planned (e.g., alpha, beta) as well as the types of facilities in which the measure will be tested. For additional information, see: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Downloads/Blueprint.pdf

ADD YOUR CONTENT HERE

Change #12

Location: Page 4

Reason for Change: Relocated row to earlier in the template.

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

State of Devel.

023

*At what level(s) of analysis was the measure tested?

Select all that apply

  • Clinician

  • Group

  • Facility

  • Health plan

  • Medicaid program (e.g., Health Home or 1115)

  • State

  • Not yet tested

  • Other (enter here):

CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Information

007

*Level of Analysis

Select the level of analysis at which the measure is specified and intended for use. If the measure is specified and intended for use at more than one level, submit the others separately. Any testing results provided in subsequent sections of this submission must be conducted at the level of analysis selected here. For MIPS submissions, you must report the results of individual clinician-level testing. If group-level testing is available, you may submit those results as an attachment.

  • Clinician - Individual

  • Clinician - Group

  • Facility

  • Health plan

  • Medicaid program (e.g., Health Home or 1115)

  • State

  • Other (enter here):

Change #13

Location: Page 4

Reason for Change: Relocated row to earlier in the template.

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

State of Devel.

024

*In which setting was this measure tested?

Select all that apply.

  • Ambulatory surgery center

  • Ambulatory/office-based care

  • Behavioral health clinic or inpatient psychiatric facility

  • Community hospital

  • Dialysis facility

  • Emergency department

  • Federally qualified health center (FQHC)

  • Hospital outpatient department (HOD)

  • Home health

  • Hospice

  • Hospital inpatient acute care facility

  • Inpatient rehabilitation facility

  • Long-term care hospital

  • Nursing home

  • PPS-exempt cancer hospital

  • Skilled nursing facility

  • Veterans Health Administration facility

  • Other (enter here):

CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Information

008

*In which setting(s) was this measure tested?

Select all that apply.

  • Ambulatory surgery center

  • Ambulatory/office-based care

  • Behavioral health clinic

  • Inpatient psychiatric facility

  • Community hospital

  • Dialysis facility

  • Emergency department

  • Federally qualified health center (FQHC)

  • Hospital outpatient department (HOD)

  • Home health

  • Hospice

  • Hospital inpatient acute care facility

  • Inpatient rehabilitation facility

  • Long-term care hospital

  • Nursing home

  • PPS-exempt cancer hospital

  • Skilled nursing facility

  • Veterans Health Administration facility

  • Not yet tested

  • Other (enter here):

Change #14

Location: Page 5

Reason for Change: New row added.

CY 2022 Final Rule text: N/A

CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Information

009

*Multiple Scores

Does the submitter recommend that more than one measure score be reported for this measure (e.g., 7- and 30-day rate, rates for different procedure types, etc.)? If yes, describe the different scores and rationale for reporting both. Note: If “Yes”, indicate which score will be described in this form. Submit separate attachments for each of the other scores.

  • Yes (enter here):

  • No

Change #15

Location: Page 6

Reason for Change: Renumbered Numerator from Row 003 to Row 010 and added placeholder text to the “Add Your Content Here” column.

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Information

003

*Numerator

The upper portion of a fraction used to calculate a rate, proportion, or ratio. An action to be counted as meeting a measure's requirements. For all fields, especially Numerator and Denominator, use plain text whenever possible. If needed, convert any special symbols, math expressions, or equations to plain text (keyboard alphanumeric, such as + - * /). This will help reduce errors and speed up data conversion, team evaluation, and MUC report formatting. For all free-text fields: Be sure to spell out all abbreviations and define special terms at their first occurrence. This will save time and revision/editing cycles during clearance.


CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Information

010

*Numerator

The upper portion of a fraction used to calculate a rate, proportion, or ratio. An action to be counted as meeting a measure's requirements. For all fields, especially Numerator and Denominator, use plain text whenever possible. If needed, convert any special symbols, math expressions, or equations to plain text (keyboard alphanumeric, such as + - * /). This will help reduce errors and speed up data conversion, team evaluation, and MUC report formatting. For all free-text fields: Be sure to spell out all abbreviations and define special terms at their first occurrence. This will save time and revision/editing cycles during clearance.

ADD YOUR CONTENT HERE

Change #16

Location: Page 6

Reason for Change: Renumbered Numerator Exclusions from Row 004 to Row 011.

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Information

004

*Numerator Exclusions

For additional information on exclusions/exceptions, see: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Downloads/Blueprint.pdf. If not applicable, enter ‘N/A.’



CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Information

011

*Numerator Exclusions

For additional information on exclusions/exceptions, see: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Downloads/Blueprint.pdf. If not applicable, enter ‘N/A.’


ADD YOUR CONTENT HERE

Change #17

Location: Page 6, Row 012

Reason for Change: Renumbered Denominator from Row 005 to Row 012.

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Information

005

*Denominator

The lower part of a fraction used to calculate a rate, proportion, or ratio. The denominator is associated with a given population that may be counted as eligible to meet a measure’s inclusion requirements.


CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Information

012

*Denominator

The lower part of a fraction used to calculate a rate, proportion, or ratio. The denominator is associated with a given population that may be counted as eligible to meet a measure’s inclusion requirements.

ADD YOUR CONTENT HERE

Change #18

Location: Page 6

Reason for Change: Renumbered Denominator Exclusions from Row 006 to Row 013.

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Information

006

*Denominator Exclusions

For additional information on exclusions/exceptions, see: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Downloads/Blueprint.pdf. If not applicable, enter ‘N/A.’


CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Information

013

*Denominator Exclusions

For additional information on exclusions/exceptions, see: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Downloads/Blueprint.pdf. If not applicable, enter ‘N/A.’

ADD YOUR CONTENT HERE

Change #19

Location: Page 7

Reason for Change: Renumbered Denominator Exceptions from Row 007 to Row 014.

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Information

007

*Denominator Exceptions

For additional information on exclusions/exceptions, see: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Downloads/Blueprint.pdf. If not applicable, enter ‘N/A.’


CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Information

014

*Denominator Exceptions

For additional information on exclusions/exceptions, see: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Downloads/Blueprint.pdf. If not applicable, enter ‘N/A.’

ADD YOUR CONTENT HERE

Change #20

Location: Page 6

Reason for Change: Rows removed.

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

The rows below were deleted.

Measure Information

008

*Briefly describe the peer reviewed evidence justifying this measure

Add description of evidence. If you have lengthy text, add the evidence as an attachment, named to clearly indicate the related form field. You may attach the completed CMS consensus-based entity “Evidence Attachment” if applicable.


Measure Information

009

Evidence that the measure can be operationalized

Provide evidence that the data source used by the measure is readily available to CMS. Summarize how CMS would operationalize the measure. For electronic clinical quality measures (eCQMs), attach feasibility scorecard or other quantitative evidence indicating measure can be reported by the intended reporting entities. If you have lengthy text, add the evidence as an attachment, named to clearly indicate the related form field.


CY 2023 Final Rule text: N/A

Change #21

Location: Page 6

Reason for Change: Relocated Burden Section

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

[ADD YOUR CONTENT HERE]

Burden

010

*Burden for Patient: Does the measure require survey data from the patient?

Select one

  • Yes

  • No

Burden

011

*If yes, what is the estimated time to complete the survey?

Enter time in minutes. If unknown, enter 0.


Burden

012

*If yes, what is the frequency of requests for survey data per year?

Enter the number of requests per patient per year.


Burden

013

*If yes, are the survey data to be collected during or outside of a visit?

Select all that apply

  • Prior to visit

  • During visit

  • After visit

Burden

014

*Burden for Provider: Was a provider workflow analysis conducted?

Select one

  • Yes

  • No

Burden

015

*If yes, how many sites were evaluated in the provider workflow analysis?

Enter the number of sites that were evaluated in the provider workflow analysis.


Burden

016

*Did the provider workflow have to be modified to accommodate the new measure?

Select one

  • Yes

  • No

Burden

017

*If yes, how would you describe the degree of effort?

Select one

  • 1 (little to no effort)

  • 2

  • 3

  • 4

  • 5 (substantial effort)

Burden

018

*Does the measure require manual abstraction?

Select one

  • Yes

  • No

Burden

019

*If yes, what is the estimated time per record to abstract data?

Enter time in minutes. If unknown, enter 0.


Burden

020

*How many data elements will be collected for the measure?

Enter number of elements. If a data element has to be abstracted more than once per record (e.g., medication dose is abstracted once for each of the patient’s medications), estimate the average number of times it would be abstracted per eligible case and include that in the total number of data elements.


CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Burden

021

*Burden for Provider: Was a provider workflow analysis conducted?

Select one

  • Yes

  • No

n/a

n/a

If you select “Yes” in Row 021, then Rows 022 and 023 become required fields. If you select “No” in Row 021, then skip to Row 024.

n/a

This is not a data entry field.

Burden

022

*If yes, how many sites were evaluated in the provider workflow analysis?

Enter the number of sites that were evaluated in the provider workflow analysis.

Numeric field

Burden

023

*Did the provider workflow have to be modified to accommodate the new measure?

Select one

  • Yes

  • No

Change #22

Location: Page 7

Reason for Change: Relocated from Data Sources section to Measure Implementation section

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Data Sources

064

*Feasibility of Data Elements

To what extent are the specified data elements available in electronically defined fields? Select all that apply. For a PRO-PM, select the data collection format(s).

  • ALL data elements are in defined fields in administrative claims

  • ALL data elements are in defined fields in electronic health records (EHRs)

  • ALL data elements are in defined fields in electronic clinical data (e.g., clinical registry, nursing home minimum data set, or MDS, home health Outcome and Assessment Information Set, or OASIS)

  • ALL data elements are in defined fields in a combination of electronic sources

  • Some data elements are in defined fields in electronic sources

  • No data elements are in defined fields in electronic sources

  • Patient/family-reported information: electronic

  • Patient/family-reported information: paper

CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Implementation

015

*Feasibility of Data Elements

Select the extent to which the specified data elements are available in electronic fields. Select all that apply. For a PRO-PM, select the data collection format(s). Electronic fields should include a designated location and format for the data in claims, EHRs, registries, etc.

  • ALL data elements are in defined fields in electronic sources

  • Some data elements are in defined fields in electronic sources

  • No data elements are in defined fields in electronic sources

Change #23

Location: Page 7

Reason for Change: Added new row.

CY 2022 Final Rule text: N/A

CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Implementation

016

*Feasibility Assessment

Summarize how you evaluated the feasibility of the data elements included in your measure. For claims-based measures, indicate whether the codes included in the measure appear in the claims used to calculate the measure (e.g., if based on Medicare claims, does Medicare cover the services included in the measure?). For electronic clinical quality measures (eCQMs), attach the feasibility scorecard and other quantitative evidence (if available) indicating that the data required to calculate the measure can be feasibly obtained from the data source. For registry-based or other third-party measures, describe what testing was done to evaluate the feasibility of transferring the data between the provider and the third party. For manually abstracted measures, discuss whether abstractors were able to consistently locate the information required for the measure in the medical records.

ADD YOUR CONTENT HERE


Change #24

Location: Page 8

Reason for Change: Added new row

CY 2022 Final Rule text: N/A

CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Implementation

017

*Method of measure calculation

Select the method used to calculate measure scores. If the measure can be calculated two or more ways, select all that apply (e.g., measure is fully specified as an eCQM for providers with EHRs and fully specified for manual abstraction for providers without an EHR). Please review guidance before making selections. Select “Claims” if the measure can be calculated entirely from claims data submitted for billing or other purposes. If the measure requires supplemental data codes to be submitted with claims (e.g., MIPS measures that require Part B quality data codes), select “Hybrid.” Select “eCQM” if the measure is specified entirely using accepted national standards for eCQMs (https://ecqi.healthit.gov/ecqm-standards). If the measure only uses some eCQM data elements (e.g., clinical eCQM data is merged with claims data), select “Hybrid.” Select “Other digital method” if the measure is not specified using accepted national standards for eCQMs but can be calculated electronically (e.g., registry, MDS, OASIS). If data needs to be manually abstracted prior to measure calculation (e.g., provider inputs data into registry or online portal manually), select “Hybrid.” Select “Manual abstraction” if all data elements in the measure require manual review of records prior to measure calculation.

  • Claims

  • eCQM

  • Other digital method

  • Manual abstraction

  • Hybrid

  • Other (enter here):

Change #25

Location: Page 8-9

Reason for Change: Measure Implementation rows added

CY 2022 Final Rule text: N/A

CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Implementation

n/a

If you select “Hybrid” in Row 017, then Row 018 becomes an optional field.

n/a

This is not a data entry field.

Measure Implementation

018

Hybrid measure: Methods of calculation

Select all methods that apply

  • Claims

  • eCQM

  • Other digital method

  • Manual abstraction


Measure Implementation

019

*How is the measure expected to be reported to the program?

This is the anticipated data submission method. Select all that apply. Use the “Submitter Comments” field to specify or elaborate on the type of reporting data, if needed to define your measure.

  • eCQM

  • Clinical Quality Measure (CQM) Registry

  • Claims

  • Web interface

  • Other (enter here):

Measure Implementation

020

*Stratification

Does the submitter recommend that measure scores be stratified (e.g., by provider characteristics, by patient characteristics)? If “Yes”, describe the different strata and recommended method for stratifying the results. Note whether overall results will be reported in addition to stratified results. Note: If “Yes”, include the stratified results as an attachment

  • Yes (enter here):

  • No

Change #26

Location: Page 9

Reason for Change: State of Development rows removed

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

[ADD YOUR CONTENT HERE]

State of Devel.

021

*State of Development

Select all that apply. Before selecting “Conceptualization” or “Specification,” or “Field Testing,” check program requirements.

  • Conceptualization

  • Specification

  • Field Testing

  • Fully Developed

State of Devel.

022

State of Development Details

If “Conceptualization,” or “Specification,” describe when testing is planned (i.e., specific dates), what type of testing is planned (e.g., alpha, beta) as well as the types of facilities in which the measure will be tested.

If “Field Testing” or “Fully Developed,” describe what testing (e.g., alpha, beta) has taken place in addition to the results of that testing.

Summarize results from validity testing and reliability testing. For additional information, see: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Downloads/Blueprint.pdf


State of Devel.

023

*At what level(s) of analysis was the measure tested?

Select all that apply

  • Clinician

  • Group

  • Facility

  • Health plan

  • Medicaid program (e.g., Health Home or 1115)

  • State

  • Not yet tested

  • Other (enter here):

CY 2023 Final Rule text: N/A

Change #27

Location: Page 10-11

Reason for Change: Rows relocated, and language changed.

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

[ADD YOUR CONTENT HERE]

Reliability Testing

025

*Type of Reliability Testing

Select all that apply

  • Measure Score Reliability

  • Data Element Reliability

Reliability Testing

026

*Reliability Testing: Type of Testing Analysis

Select all that apply

  • Signal to Noise

  • Random Split Half Correlation

  • IRR (Inter-rater reliability)

  • ICC (Intraclass correlation coefficient)

  • Test-Retest

  • Internal Consistency

  • Other (enter here):

Reliability Testing

027

*Reliability testing sample size

For the reliability testing provided, indicate the number of measured entities sampled.


Reliability Testing

028

*Reliability testing statistical result

For the reliability testing provided, indicate the statistical result(s) of the testing analysis. If data element reliability was conducted, provide the scores for the critical data elements tested. If signal-to-noise was conducted for measure score reliability, give the range of reliability scores for measured entities in addition to the mean.


Reliability Testing

029

*Reliability testing interpretation of results

For the reliability testing provided, briefly describe the interpretation of results.


Reliability Testing

030

Reliability Testing: Was a minimum number of denominator cases per measured entity established to achieve sufficient measure score reliability?

Select one


  • Yes

  • No

Reliability Testing

031

If yes, specify the number of cases and the percentage of providers

Enter the minimum number of denominator cases required for each measured entity to report on this measure.

Also, specify the percentage of providers in the test sample that met the minimum denominator requirement.


CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Score Level (Accountable Entity Level) Testing

024

*Reliability

Indicate whether reliability testing was conducted for the accountable entity-level measure scores. For more information on accountable entity level reliability testing, refer to the CMS Measures Management System Blueprint (https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Downloads/Blueprint.pdf)

Note: This section refers to the reliability of the accountable entity level measure scores in the final performance measure. Refer to the Patient-Reported Data section for testing of surveys or patient reported tools.

Note: for MIPS submissions, please provide individual clinician-level results. If the measure was also tested at the clinician group level, you may include those results in an attachment.

  • Yes

  • No

n/a

n/a

If you select “Yes” in Row 024, then Row 025 becomes a required field. If you select “No” in Row 024, then skip to Row 038.

n/a

This is not a data entry field.

Measure Score Level (Accountable Entity Level) Testing

025

*Reliability: Type of analysis

Select all that apply.

Signal-to-noise (or inter-unit reliability) is the precision attributed to an actual construct versus random variation (e.g., ratio of between unit variance to total variance) (Adams J. The reliability of provider profiling: a tutorial. Santa Monica, CA: RAND; 2009. http://www.rand.org/pubs/technical_reports/TR653.html).

Random split-half correlation is the agreement between two measures of the same concept derived from split samples drawn from the same entity at a single point in time.

  • Signal-to-Noise

  • Random Split-Half Correlation

  • Other (enter here):

n/a

n/a

If you select “Signal-to-Noise” in Row 025, then Rows 026-029 become required fields. If you select “Random Split-Half Correlation” in Row 025, then Rows 030-033 become required fields. If you select “Other” in Row 025, then Rows 034-037 become required fields.

n/a

This is not a data entry field.

Measure Score Level (Accountable Entity Level) Testing

026

*Signal-to-Noise: Name of statistic

Enter specific name of analysis that was conducted, as applicable.

ADD YOUR CONTENT HERE


Measure Score Level (Accountable Entity Level) Testing

027

*Signal-to-Noise: Sample size

Indicate the number of accountable entities sampled to test the final performance measure.

Numeric field

Measure Score Level (Accountable Entity Level) Testing

028

*Signal-to-Noise: Statistical result

Indicate the median result for the signal-to-noise analysis used to assess accountable entity level reliability. Results should range from 0.00 to 1.00. Calculate reliability as the measure is intended to be implemented (e.g., after applying minimum denominator requirements, appropriate type of setting, provider, etc.).

Numeric field

Measure Score Level (Accountable Entity Level) Testing

029

*Signal-to-Noise: Interpretation of results

Describe the interpretation of the results (e.g., low, moderate, high). List accepted thresholds referenced and provide a citation. If applicable, include the precision of the statistical result (e.g., 95% confidence interval) and/or an assessment of statistical significance (e.g., p-value)

ADD YOUR CONTENT HERE


Change #28

Location: Page 12 - 16

Reason for Change: Rows relocated with some language changed.

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

[ADD YOUR CONTENT HERE]

Validity Testing

032

* Type of Validity Testing

Select all that apply

  • Measure Score Validity

  • Data Element Validity

Validity Testing

033

*Validity Testing: Type of Validity Testing Analysis

Select all that apply

  • Correlation

  • Face Validity

  • Construct Validity

  • Gold Standard Comparison

  • Internal Consistency

  • Predictive Validity

  • Structural Validity

  • Other (enter here):

Validity Testing

034

*Validity testing sample size

For the validity testing provided, indicate the number of measured entities sampled.


Validity Testing

035

*Validity testing statistical result

For the validity testing provided, indicate the statistical result(s) of the testing analysis. If data element validity was conducted, provide the scores for the critical data elements tested. If face validity was conducted, list the total number of voting members in addition to the percentage that voted in favor of the measure’s face validity.


Validity Testing

036

*Validity testing interpretation of results

For the validity testing provided, indicate the interpretation of results.


CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Score Level (Accountable Entity Level) Testing

030

*Random Split-Half Correlation: Name of statistic

Enter specific name of analysis that was conducted, as applicable.

ADD YOUR CONTENT HERE


Measure Score Level (Accountable Entity Level) Testing

031

*Random Split-Half Correlation: Sample size

Indicate the number of accountable entities sampled to test the final performance measure. If number varied by sample, use the largest number of measured entities.

Numeric field

Measure Score Level (Accountable Entity Level) Testing

032

*Random Split-Half Correlation: Statistical result

Indicate the statistical result for the random split-half correlation analysis used to assess accountable entity level reliability. Results should range from -1.00 to 1.00. Calculate reliability as the measure is intended to be implemented (e.g., after applying minimum denominator requirements, appropriate type of setting, provider, etc.).

Numeric field

Measure Score Level (Accountable Entity Level) Testing

033

*Random Split-Half Correlation: Interpretation of results

Describe the interpretation of the results (e.g., low, moderate, high). List accepted thresholds referenced and provide a citation. If applicable, include the precision of the statistical result (e.g., 95% confidence interval) and/or an assessment of statistical significance (e.g., p-value).

ADD YOUR CONTENT HERE


Measure Score Level (Accountable Entity Level) Testing

034

*Other: Name of statistic

Enter specific name of statistic.

ADD YOUR CONTENT HERE


Measure Score Level (Accountable Entity Level) Testing

035

*Other: Sample size

Indicate the number of accountable entities sampled to test the final performance measure.

Numeric field

Measure Score Level (Accountable Entity Level) Testing

036

*Other: Statistical result

Indicate the statistical result for the analysis used to assess accountable entity level reliability. Calculate reliability as the measure is intended to be implemented (e.g., after applying minimum denominator requirements, appropriate type of setting, provider, etc.).

Numeric field

Measure Score Level (Accountable Entity Level) Testing

037

*Other: Interpretation of results

Describe the interpretation of the results (e.g., low, moderate, high). List accepted thresholds referenced and provide a citation. If applicable, include the precision of the statistical result (e.g., 95% confidence interval) and/or an assessment of statistical significance (e.g., p-value).

ADD YOUR CONTENT HERE


Measure Score Level (Accountable Entity Level) Testing

038

*Empiric Validity

Indicate whether empiric validity testing was conducted for the accountable entity-level measure scores. For more information on accountable entity level empiric validity testing, refer to the CMS Measures Management System Blueprint (https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Downloads/Blueprint.pdf)

Note: This section refers to the empiric validity of the accountable entity level measure scores in the final performance measure. Refer to the Patient-Reported Data section for testing of surveys or patient reported tools.

Note: for MIPS submissions, please provide individual clinician-level results. If the measure was also tested at the clinician group level, you may include those results in an attachment.

  • Yes

  • No

n/a

n/a

If you select “Yes” in Row 038, then Rows 039-043 become required fields. If you select “No” in Row 038, then skip to Row 044.

n/a

This is not a data entry field.

Measure Score Level (Accountable Entity Level) Testing

039

*Empiric Validity: Statistic name

Indicate the name for the statistic used to assess accountable entity level validity. Describe whether the result is a relative risk, odds ratio, relative difference in scores, etc.

If more than one test or comparison was conducted, describe the statistic that most strongly supported the validity of the measure and provide the full testing results under the “Methods and findings” question or as an attachment.

ADD YOUR CONTENT HERE


Measure Score Level (Accountable Entity Level) Testing

040

*Empiric Validity: Sample size

Indicate the number of accountable entities sampled to test the final performance measure.


ADD YOUR CONTENT HERE


Measure Score Level (Accountable Entity Level) Testing

041

*Empiric Validity: Statistical result

Indicate the statistical result. Calculate empiric validity as the measure is intended to be implemented (e.g., after applying minimum denominator requirements, etc.).

If more than one test or comparison was conducted, provide the result that most strongly supports the validity of the measure and provide the full testing results under the “Methods and findings” question or as an attachment.

Numeric field

Measure Score Level (Accountable Entity Level) Testing

042

*Empiric Validity: Methods and findings

Describe the methods used to assess accountable entity level validity. Describe the comparison groups or constructs used to verify the validity of the measure scores, including hypothesized relationships (e.g., expected to be positively or negatively correlated). Describe your findings for each analysis conducted, including the statistical result provided above and the strongest and weakest results across analyses. If applicable, include the precision of the statistical result(s) (e.g., 95% confidence interval) and/or an assessment of statistical significance (e.g., p-value). If methods and results require more space, include as an attachment.

ADD YOUR CONTENT HERE


Measure Score Level (Accountable Entity Level) Testing

043

*Empiric Validity: Interpretation of results

Indicate whether the statistical result affirmed the hypothesized relationship for the analysis conducted.

  • Yes

  • No

Measure Score Level (Accountable Entity Level) Testing

044

*Face validity

Indicate if a vote was conducted among experts and patients/caregivers on whether the final performance measure scores can be used to differentiate good from poor quality of care.

Select “No” if experts and patients/caregivers did not provide feedback on the final performance measure at the specified level of analysis or if the feedback was related to a property of the measure unrelated to its ability to differentiate performance among measured entities.

  • Yes

  • No

n/a

n/a

If you select “Yes” in Row 044, then Rows 045-046 become required fields. If you select “No” in Row 044, then skip to Row 047.

n/a

This is not a data entry field.

Measure Score Level (Accountable Entity Level) Testing

045

*Face validity: Number of voting experts and patients/caregivers

Indicate the number of experts and patients/caregivers who voted on face validity.

Numeric field

Measure Score Level (Accountable Entity Level) Testing

046

*Face validity: Result

Indicate the number of experts and patients/caregivers who voted in agreement that the measure could differentiate good from poor quality care among accountable entities. If votes were conducted using a scale, sum all responses in agreement with the statement. Do not include neutral votes. If more than one question was asked of the experts and patients/caregivers, only provide results from the question relating to the ability of the final performance measure to differentiate good from poor quality care.

Numeric field

Change #29

Location: Page 16

Reason for Change: Added a new row (with Subsection, Row, Field Label, and Guidance entries) to support collection of additional information.

CY 2022 Final Rule text: N/A

CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Patient/Encounter Level (Data Element Level) Testing

047

*Patient/Encounter Level Testing

Indicate whether patient/encounter level testing of the individual data elements in the final performance measure was conducted. Select “No” if testing was not conducted for each critical data element required to identify the denominator and numerator. If testing was conducted for a subset of critical data elements only, select “No” and submit these results as an attachment. Note: This section includes tests of both data element reliability and validity. Note: for MIPS submissions, please provide individual clinician-level results. If the measure was also tested at the clinician group level, you may include those results in an attachment.

  • Yes

  • No

Change #30

Location: Page 16 - 21

Reason for Change: Added new rows to support collection of additional information.

CY 2022 Final Rule text: N/A

CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

n/a

n/a

If you select “Yes” in Row 047, then Rows 048-052 become required fields. If you select “No” in Row 047, then skip to Row 053.

n/a

This is not a data entry field.

Patient/Encounter Level (Data Element Level) Testing

048

*Type of Analysis

Select all that apply. For more information on patient/encounter level testing, refer to the CMS Measures Management System Blueprint (https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Downloads/Blueprint.pdf). Note: This section refers to the patient/encounter level data elements in the final performance measure. Refer to the Patient-Reported Data section for testing of patient/encounter level data elements in surveys or patient reported tools.

  • Agreement between two manual reviewers

  • Agreement between eCQM and manual reviewer

  • Agreement between other gold standard and manual reviewer

  • Other (enter here):

Patient/Encounter Level (Data Element Level) Testing

049

*Sample Size

Indicate the number of patients/encounters sampled.

Numeric field

Patient/Encounter Level (Data Element Level) Testing

050

*Statistic Name

Indicate the statistic used to assess agreement (e.g., percent agreement, kappa, positive predictive value, etc.). If more than one type of statistic was calculated, list the one that best depicts the reliability and/or validity of the data elements in your measure.

  • Percent agreement

  • Kappa

  • ICC

  • Pearson correlation coefficient

  • Sensitivity

  • Positive Predictive Value

  • Other (enter here):

Patient/Encounter Level (Data Element Level) Testing

051

*Statistical Results

Indicate the lowest critical data element result of the statistic selected above.

Numeric field

Patient/Encounter Level (Data Element Level) Testing

052

*Interpretation of results

Briefly describe the interpretation of results including summary results for the overall denominator (with inclusion, exclusion, and exception criteria) and numerator. Include 95% confidence intervals for the overall denominator and numerator results, as applicable. If any data element has low reliability or validity, describe the anticipated impact and whether it could introduce bias to measure scores. If there is variation in reliability or validity scores across test sites/measured entities, describe how this variation impacts overall interpretation of the results. Include a list of all data elements tested that includes their frequency, statistical results, and 95% confidence intervals, as applicable. Provide results broken down by test site if reliability/validity varied between sites. If more room is needed, include as an attachment.

ADD YOUR CONTENT HERE

Patient-Reported Data

053

*Does the performance measure use survey or patient-reported data?

Indicate whether the performance measure utilizes data from structured surveys or patient-reported tools.

  • Yes

  • No

n/a

n/a

If you select “Yes” in Row 053, then Rows 054-059 become required fields. If you select “No” in Row 053, then skip to Row 062.

n/a

This is not a data entry field.

Patient-Reported Data

054

*Surveys or patient-reported outcome tools

List each survey or patient-reported outcome tool accepted by the performance measure and indicate whether the tool(s) have been validated by a peer reviewed study or empirical testing. Indicate whether the tool(s) are being used as originally specified and tested or if modifications are required. If available, provide each survey or tool as a link or attachment. Describe the mode(s) of administration available (e.g., electronic, phone, mail) and the number of languages the survey(s) or tool(s) are available in. Indicate whether any of the surveys or tools is proprietary, requiring licenses or fees for use.

ADD YOUR CONTENT HERE


Patient-Reported Data

055

*Meaningful to Patients: Number consulted

Indicate the number of patients and/or caregiver representatives who provided feedback on whether the survey or tool meaningfully informs the care they receive and/or helps them better understand their condition or treatment. If the measure uses an established survey or tool, include information from the original development of the survey or tool. If the measure uses a modified version of the survey or applies the survey to a new patient population, it is recommended to obtain patient feedback on the survey as it would be used for the purposes of the performance measure. If the measure allows for the use of more than one survey or tool, include the number of patients consulted on the most relevant or primary survey or tool in this field and provide feedback on the other tools as an attachment.

Numeric field

Patient-Reported Data

056

*Meaningful to Patients: Number indicating survey/tool is meaningful

Indicate the number of patients and/or caregiver representatives who agreed the survey or tool meaningfully informs the care they receive and/or helps them better understand their condition or treatment. If the measure allows for the use of more than one survey or tool, include patient feedback on the most relevant or primary survey or tool in this field and provide feedback on the other tools as an attachment.

Numeric field

Patient-Reported Data

057

*Meaningful to Clinicians: Number consulted

Indicate the number of clinicians who provided feedback on whether the survey or tool meaningfully informs the care they provide their patients. If the measure uses an established survey or tool, include information from the original development of the survey or tool. If the measure uses a modified version of the survey or applies the survey to a new patient population, it is recommended to obtain clinician feedback on the survey as it would be used for the purposes of the performance measure. If the measure allows for the use of more than one survey or tool, include the number of clinicians consulted on the most relevant or primary survey or tool in this field and provide feedback on the other tools as an attachment.

Numeric field

Patient-Reported Data

058

*Meaningful to Clinicians: Number indicating survey/tool is meaningful

Indicate the number of clinicians who agreed that the survey or tool meaningfully informs the care they provide their patients. If the measure allows for the use of more than one survey or tool, include the number of clinicians consulted on the most relevant or primary survey or tool in this field and provide feedback on the other tools as an attachment.

Numeric field

Patient-Reported Data

059

*Survey level testing

Indicate whether survey level testing was conducted. For a list of acceptable types of testing, please refer to the latest CMS Blueprint version (https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Downloads/Blueprint.pdf). Select “Yes” if you can provide relevant testing of the survey or tool conducted either prior to development of the performance measure or as part of the development of the performance measure.

  • Yes

  • No

n/a

n/a

If you select “Yes” in Row 059, then Rows 060-061 become required fields. If you select “No” in Row 059, then skip to Row 062.

n/a

This is not a data entry field.

Patient-Reported Data

060

*Type of testing analysis

Select all that apply.

  • Internal Consistency

  • Construct Validity

  • Other (enter here):

Patient-Reported Data

061

*Testing methodology and results

Briefly describe the method used to psychometrically test or validate the patient survey or patient-reported outcome tool (e.g., Cronbach’s alpha, ICC, Pearson correlation coefficient, Kuder-Richardson test). If the survey or tool was developed prior to the development of the performance measure, describe how the intended use of the survey or tools for the performance measure aligns with the survey or tool as originally designed and tested. Indicate whether the measure uses all components within a tool, or only parts of the tool. Summarize the statistical results and briefly describe the interpretation of results.

ADD YOUR CONTENT HERE


Change #31

Location: Page 20-21

Reason for Change: Relocated Measure Performance section

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

[ADD YOUR CONTENT HERE]

Measure Performance

037

*Measure performance - type of score

Select one

  • Proportion

  • Ratio

  • Mean

  • Median

  • Continuous Variable

  • Other (enter here):

Measure Performance

038

*Measure performance score interpretation

Select one

  • Higher score is better

  • Lower score is better

  • Score falling within a defined interval

  • Passing Score

  • Other (enter here):

Measure Performance

039

*Provide mean performance rate and standard deviation for each submission method a measure has or is anticipated to have

Provide the mean performance rate and standard deviation for the measure’s submission method(s). If the measure has more than one submission method, provide all that are available, indicating which results correspond to which method.


Measure Performance

040

*Benchmark, if applicable

Provide the benchmark for the measure’s performance rate. If not applicable, type “not applicable.”


CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Performance

062

*Measure performance - type of score

Select one

  • Proportion

  • Ratio

  • Continuous Variable Mean

  • Continuous Variable Median

  • Other (enter here):

Measure Performance

063

*Measure performance score interpretation

Select one

  • Higher score is better

  • Lower score is better

  • Score falling within a defined interval

  • Passing score

  • Never event

  • Other (enter here):

Measure Performance

064

*Mean performance score

Provide the mean performance score across accountable entities in the test sample that is relevant to the intended use of the measure. Note: for MIPS submissions, please provide individual clinician-level results. If the measure was also tested at the clinician group level, you may include those results in an attachment.

Numeric field

Measure Performance

065

*Median performance score

Provide the median performance score for the testing sample that is relevant to the intended use of the measure.

Numeric field

Measure Performance

066

*Minimum performance score

Provide the minimum performance score for the testing sample that is relevant to the intended use of the measure.

Numeric field

Measure Performance

067

*Maximum performance score

Provide the maximum performance score for the testing sample that is relevant to the intended use of the measure.

Numeric field

Measure Performance

068

*Standard deviation of performance scores

Provide the standard deviation of performance scores for the testing sample that is relevant to the intended use of the measure.

Numeric field

Change #32

Location: Page 21-23

Reason for Change: Relocated Impact Section

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

[ADD YOUR CONTENT HERE]

Impact

041

* Meaningful to Patients. Was input collected from patient and/or caregiver?

Select one

  • Yes

  • No

Impact

042

*If yes, choose all methods of obtaining patient/caregiver information.

Select all that apply

  • Standard Technical Expert Panel (TEP) inclusive of patient/caregiver representatives

  • TEP consisting of ONLY patients or family representatives

  • Focus groups

  • Working groups

  • One-on-one interviews

  • Surveys

  • Virtual communities

  • Other (enter here):

Impact

043

How many times and at what phase(s) of measure development was the patient/caregiver engaged?

Specify the number of times the patient/caregiver representatives were engaged and at what phases of measure development. For example, patient/caregiver representatives were engaged a total of 2 times: once during conceptualization and once at the conclusion of specification.


Impact

044

*Total number of patients and/or caregivers consulted

Indicate number


Impact

045

Specify the ratio of patients/caregivers to policy/clinician experts engaged in TEP or working groups

Number of patients/caregivers : number of policy/clinician experts. For example, 1:2


Impact

046

*Total number of patients/caregivers who agreed that the measure information helps inform care and make decisions

Indicate number


Impact

047

*Meaningful to Clinicians. Were clinicians and/or providers consulted?

Select one

  • Yes

  • No

Impact

048

*If yes, choose all methods that obtained clinician and/or provider input

Select all that apply

  • Standard TEP

  • TEP consisting of ONLY clinicians

  • Focus groups

  • Working groups

  • One-on-one interviews

  • Surveys

  • Virtual communities

  • Other (enter here)

Impact

049

*Total number of clinicians/providers consulted

Indicate number


Impact

050

*Total number of clinicians/providers who agreed that the measure was actionable to improve quality of care

Indicate number


Impact

051

*Estimated impact of the measure: Estimate of annual denominator size

Enter numerical value or “unable to determine.”


Impact

052

*Estimate of annual improvement in measure score

Enter numerical value or “not applicable.” State the expected improvement in absolute terms in the units expressed by the measure, for example, percentage points or patients per 1000. Using the estimated annual denominator size and median measure scores from your test data, estimate the number of additional numerator events or outcomes that would be achieved during each performance period if measured entities below the median score achieved at least the median measure score. For inverse measures, estimate the number of additional numerator events or outcomes avoided if measured entities above the median score achieved the median measure score.


CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Impact

069

* Meaningful to Patients. Was input on the final performance measure collected from patient and/or caregiver?

Select one

  • Yes

  • No

n/a

n/a

If you select “Yes” in Row 069, then Rows 070-071 become required fields. If you select “No” in Row 069, then skip to Row 072.

n/a

This is not a data entry field.

Impact

070

*Total number of patients and/or caregivers who responded to the question asking them whether the final performance measure helps inform care and decision making

Indicate number

Numeric field

Impact

071

*Total number of patients/caregivers who agreed that the final performance measure information helps inform care and decision making

Indicate number using the total number of patients who responded.

Numeric field

Impact

072

*Meaningful to Clinicians. Were clinicians and/or providers consulted on the final performance measure?

Select one

  • Yes

  • No

n/a

n/a

If you select “Yes” in Row 072, then Rows 073-074 become required fields. If you select “No” in Row 072, then skip to Row 075.

n/a

This is not a data entry field.

Impact

073

*Total number of clinicians/providers who responded when asked if the final performance measure was actionable to improve quality of care

Indicate number

Numeric field

Impact

074

*Total number of clinicians/providers who agreed that the final performance measure was actionable to improve quality of care

Indicate the total number who responded. This is separate from any face validity testing conducted.

Numeric field

Impact

075

*Estimated impact of the measure: Estimate of annual denominator size

Enter the numerical value of the estimated annual denominator size across accountable entities eligible to report the measure. This can be estimated from the average entity-level denominator in the test sample multiplied by the approximate number of eligible entities that may report the measure. If the measure requires a multi-year denominator, divide the estimate to report the estimated number of denominator cases per year rather than for the full denominator period. If it is not possible to estimate based on the testing sample and other publicly available information, enter 0000.

Numeric field

Change #33

Location: Page 24

Reason for Change: Relocated Cost Factors and updated language

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

[ADD YOUR CONTENT HERE]

Cost Factors

053

*Estimated Cost Avoided by the Measure: Estimate of average cost savings per event

Numeric dollar value, “not applicable,” or “unable to determine.” Enter the estimated average net cost avoided per event as a numeric dollar value. If there is no anticipated impact, state “none.” If you are unable to estimate costs avoided, state “unable to determine.” If costs avoided are not an appropriate metric for your measure focus (e.g., mortality), state “not applicable.”


Cost Factors

054

*Cost avoided annually by Medicare/Provider

Using the estimate for improvement and the estimated average cost savings per event, provide the costs that would be avoided by Medicare/provider annually as a numeric dollar value. If there is no anticipated impact, state “none.” If you are unable to estimate costs avoided, state “unable to determine.” If costs avoided are not an appropriate metric for your measure focus (e.g., mortality), state “not applicable.”


Cost Factors

055

*Source of estimate

Briefly describe the assumptions for your cost estimates and cite the sources of cost information. If you did not identify sources of cost information, state “none.” If costs avoided are not an appropriate metric for your measure focus (e.g., mortality), state “not applicable.”


Cost Factors

056

*Year of cost literature cited

Provide the year of the cost estimate (e.g., 2016 dollars). If adjusted for inflation, provide the year the estimate was adjusted to (e.g., 2020 dollars after adjusting for inflation). If you did not identify sources of cost information, state “none.” If costs avoided are not an appropriate metric for your measure focus (e.g., mortality), state “not applicable.”


CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Cost Factors

076

Cost estimate completed

Indicate whether an estimate of the impact on healthcare costs was completed as part of the business case or development process.

  • Yes

  • No

n/a

n/a

If you select “Yes” in Row 076, then Row 077 becomes an optional field.

n/a

This is not a data entry field.

Cost Factors

077

Cost estimate methods and results

Briefly describe the methods and assumptions for your cost estimates and cite the sources of cost information. Provide the year of the cost estimate (e.g., 2016 dollars). If adjusted for inflation, provide the year the estimate was adjusted to (e.g., 2020 dollars after adjusting for inflation). Summarize the range of healthcare cost impacts based on your analysis.

ADD YOUR CONTENT HERE


Change #34

Location: Page 24-26

Reason for Change: Relocated Background Information and updated language

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

[ADD YOUR CONTENT HERE]

Background Information

057

*What is the history or background for including this measure on the current year MUC list?

Select one

  • New measure never reviewed by Measure Applications Partnership (MAP) Workgroup or used in a CMS program

  • Measure previously submitted to MAP, refined and resubmitted per MAP recommendation

  • Measure currently used in a CMS program being submitted as-is for a new or different program

  • Measure currently used in a CMS program, but the measure is undergoing substantial change

Background Information

058

If currently used: Range of year(s) this measure has been used by CMS Program(s).

For example: Hospice Quality Reporting (2012-2018)


Background Information

059

If currently used: What other federal programs are currently using this measure?

Select all that apply. These should be current use programs only, not programs for the upcoming year’s submittal.

  • Ambulatory Surgical Center Quality Reporting Program

  • End-Stage Renal Disease Quality Incentive Program

  • Home Health Quality Reporting Program

  • Hospice Quality Reporting Program

  • Hospital-Acquired Condition Reduction Program

  • Hospital Inpatient Quality Reporting Program

  • Hospital Outpatient Quality Reporting Program

  • Hospital Readmissions Reduction Program

  • Hospital Value-Based Purchasing Program

  • Inpatient Psychiatric Facility Quality Reporting Program

  • Inpatient Rehabilitation Facility Quality Reporting Program

  • Long-Term Care Hospital Quality Reporting Program

  • Medicare and Medicaid Promoting Interoperability Program for Eligible Hospitals and Critical Access Hospitals (CAHs)

  • Medicare Shared Savings Program

  • Merit-based Incentive Payment System

  • Part C and D Star Ratings [Medicare]

  • Prospective Payment System-Exempt Cancer Hospital Quality Reporting Program

  • Quality Health Plan Quality Rating System

  • Skilled Nursing Facility Quality Reporting Program

  • Skilled Nursing Facility Value-Based Purchasing Program

  • Other (enter here):

CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Background Information

078

*What is the history or background for including this measure on the current year MUC List?

Select one

  • New measure never previously submitted to the MUC List, reviewed by Measure Applications Partnership (MAP) Workgroup, or used in a CMS program

  • Measure previously submitted but not included on the MUC List

  • Measure previously submitted to MAP, refined and resubmitted per MAP recommendation

  • Measure currently used in a CMS program being submitted as-is for a new or different program

  • Measure currently used in a CMS program, but the measure is undergoing substantial change

n/a

n/a

If you select “New measure never previously submitted to the MUC List, reviewed by Measure Applications Partnership (MAP) Workgroup, or used in a CMS program” in Row 078, then skip to Row 081. If you select “Measure currently used in a CMS program being submitted as-is for a new or different program” or “Measure currently used in a CMS program, but the measure is undergoing substantial change” in Row 078, then Rows 079-080 become required fields.

n/a

This is not a data entry field.

Background Information

079

*Range of year(s) this measure has been used by CMS Program(s).

For example: Hospice Quality Reporting (2012-2018)

ADD YOUR CONTENT HERE


Background Information

080

*What other federal programs are currently using this measure?

Select all that apply. These should be current use programs only, not programs for the upcoming year’s submittal.

  • Ambulatory Surgical Center Quality Reporting Program

  • End-Stage Renal Disease Quality Incentive Program

  • Home Health Quality Reporting Program

  • Hospice Quality Reporting Program

  • Hospital-Acquired Condition Reduction Program

  • Hospital Inpatient Quality Reporting Program

  • Hospital Outpatient Quality Reporting Program

  • Hospital Readmissions Reduction Program

  • Hospital Value-Based Purchasing Program

  • Inpatient Psychiatric Facility Quality Reporting Program

  • Inpatient Rehabilitation Facility Quality Reporting Program

  • Long-Term Care Hospital Quality Reporting Program

  • Medicare and Medicaid Promoting Interoperability Program for Eligible Hospitals and Critical Access Hospitals (CAHs)

  • Medicare Shared Savings Program

  • Merit-based Incentive Payment System

  • Part C and D Star Ratings [Medicare]

  • Prospective Payment System-Exempt Cancer Hospital Quality Reporting Program

  • Quality Health Plan Quality Rating System

  • Skilled Nursing Facility Quality Reporting Program

  • Skilled Nursing Facility Value-Based Purchasing Program

  • Other (enter here):

Change #35

Location: Page 26

Reason for Change: Relocated Data Sources section and updated language

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

[ADD YOUR CONTENT HERE]

Data Sources

060

*What data sources are used for the measure?

Select all that apply.

Use the next field to specify or elaborate on the type of data source, if needed to define your measure.

  • Administrative Data (non-claims)

  • Claims Data

  • Electronic Clinical Data (non-EHR)

  • Electronic Health Record

  • Paper Medical Records

  • Standardized Patient Assessments

  • Patient Reported Data and Surveys

  • Registries

  • Hybrid

  • Other (enter here):

Data Sources

061

If applicable, specify the data source(s)

Use this field to specify or elaborate on the type of data source, if needed, to define your measure.


Data Sources

062

If EHR or Claims or Chart-Abstracted Data, description of parts related to these sources

Describe the parts or elements of the measure that are relevant to these data sources


Data Sources

063

*How is the measure expected to be reported to the program?

This is the anticipated data submission method. Select all that apply. Use the 'Comments' field to specify or elaborate on the type of reporting data, if needed to define your measure.

  • eCQM

  • Clinical Quality Measure (CQM) Registry

  • Claims

  • Web interface

  • Other (enter here):

Data Sources

064

*Feasibility of Data Elements

To what extent are the specified data elements available in electronically defined fields? Select all that apply. For a PRO-PM, select the data collection format(s).

  • ALL data elements are in defined fields in administrative claims

  • ALL data elements are in defined fields in electronic health records (EHRs)

  • ALL data elements are in defined fields in electronic clinical data (e.g., clinical registry, nursing home minimum data set, or MDS, home health Outcome and Assessment Information Set, or OASIS)

  • ALL data elements are in defined fields in a combination of electronic sources

  • Some data elements are in defined fields in electronic sources

  • No data elements are in defined fields in electronic sources

  • Patient/family-reported information: electronic

  • Patient/family-reported information: paper

CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Data Sources

081

*What data sources are used for the measure?

Select all that apply.

Use the next field to specify or elaborate on the type of data source, if needed to define your measure.

  • Administrative Data (non-claims)

  • Claims Data

  • Electronic Clinical Data (non-EHR)

  • Electronic Health Record

  • Paper Medical Records

  • Standardized Patient Assessments

  • Patient Reported Data and Surveys

  • Registries

  • Other (enter here):

Data Sources

082

If applicable, specify the data source

Use this field to specify or elaborate on the type of data source, if needed, to define your measure.

ADD YOUR CONTENT HERE


Data Sources

083

Description of parts related to each data source

Describe the parts or elements of the measure that are relevant to the selected data sources

ADD YOUR CONTENT HERE


Change #36

Location: Page 26-28

Reason for Change: Relocated STEWARD section.

CY 2022 Final Rule text:

STEWARD

Subsection

Row

Field Label

Guidance

[ADD YOUR CONTENT HERE]

Steward Information

065

*Measure steward

Enter the current Measure Steward. Select all that apply.

See Appendix A.065-067 for list choices. Copy/paste or enter your choices here:


Steward Information

066

*Measure Steward Contact Information

Last name, First name; Affiliation (if different); Telephone number; Email address.


Long-Term Steward Information

067

Long-Term Measure Steward (if different)

Entity or entities that will be the permanent measure steward(s), responsible for maintaining the measure and conducting endorsement maintenance review. Select all that apply.

See Appendix A.065-067 for list choices. Copy/paste or enter your choices here:


Long-Term Steward Information

068

Long-Term Measure Steward Contact Information

If different from Steward above: Last name, First name; Affiliation; Telephone number; Email address.


Submitter Information

069

Is primary submitter the same as steward?

Select “Yes” or “No.”

  • Yes

  • No

Submitter Information

070

*Primary Submitter Contact Information

If different from Steward above: Last name, First name; Affiliation; Telephone number; Email address. NOTE: The primary and secondary submitters entered here do not automatically have read/write/change access to modify this measure in MERIT. To request such access for others, when logged into the MERIT interface, navigate to “About” and “Contact Us,” and indicate the name and e-mail address of the person(s) to be added.


Submitter Information

071

Secondary Submitter Contact Information

If different from name(s) above: Last name, First name; Affiliation; Telephone number; Email address.


CY 2023 Final Rule text:

STEWARD

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Steward Information

084

*Measure Steward

Enter the current Measure Steward.

See Appendix A.084-086 for list choices. Copy/paste or enter your choices here:


Steward Information

085

*Measure Steward Contact Information

Please provide the contact information of the measure steward.

ADD YOUR CONTENT HERE


Long-Term Steward Information

086

Long-Term Measure Steward (if different)

Entity or entities that will be the permanent measure steward(s), responsible for maintaining the measure and conducting CBE endorsement maintenance review. Select all that apply.

See Appendix A.084-086 for list choices. Copy/paste or enter your choices here:


Long-Term Steward Information

087

Long-Term Measure Steward Contact Information

If different from Steward above: Last name, First name; Affiliation; Telephone number; Email address.

ADD YOUR CONTENT HERE


Submitter Information

088

Is primary submitter the same as steward?

Select “Yes” or “No.”

  • Yes

  • No

Submitter Information

089

*Primary Submitter Contact Information

If different from Steward above: Last name, First name; Affiliation; Telephone number; Email address. NOTE: The primary and secondary submitters entered here do not automatically have read/write/change access to modify this measure in CMS MERIT. To request such access for others, when logged into the CMS MERIT interface, navigate to “About” and “Contact Us,” and indicate the name and e-mail address of the person(s) to be added.

ADD YOUR CONTENT HERE


Submitter Information

090

Secondary Submitter Contact Information

If different from name(s) above: Last name, First name; Affiliation; Telephone number; Email address.

ADD YOUR CONTENT HERE


Change #37

Location: Page 27-28

Reason for Change: Relocated CHARACTERISTICS section.

CY 2022 Final Rule text:

CHARACTERISTICS

Subsection

Row

Field Label

Guidance

[ADD YOUR CONTENT HERE]

General Characteristics

072

*Measure Type

Select only one type of measure. For definitions, visit this web site: https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/QualityMeasures/Pre-Rule-Making.html .

  • Access

  • Communication and Care Coordination

  • Composite

  • Cost/Resource

  • Cost/Resource Use

  • Efficiency

  • Intermediate Outcome

  • Not Specified

  • Outcome

  • Patient Engagement/Experience

  • Patient Perspective

  • Patient Reported Outcome

  • Process

  • Structure

  • Other (enter here):

General Characteristics

073

*Is the measure a composite or component of a composite?

Select one

  • Yes

  • No

General Characteristics

074

*Is this measure in the CMS Measures Inventory Tool (CMIT)?

Select Yes or No. Current measures can be found at https://cmit.cms.gov/CMIT_public/ListMeasures

  • Yes

  • No

General Characteristics

075

*If yes, enter the CMIT ID

If the measure is currently in CMIT, enter the 4-digit CMIT ID. Current measures and CMIT IDs can be found at https://cmit.cms.gov/CMIT_public/ListMeasures


General Characteristics

076

Alternate Measure ID

DO NOT enter consensus-based entity (endorsement) ID, CMIT ID, or previous year MUC ID in this field. This is an alphanumeric identifier (if applicable), such as a recognized program ID number for this measure (20 characters or less). Examples: 199 GPRO HF-5; ACO 28; CTM-3; PQI #08.


General Characteristics

077

Outline the clinical guideline(s) supporting this measure. Also see note at Rows 082 and 083 below.

Provide a detailed description of which guideline supports the measure and how the measure will enhance compliance with the clinical guidelines. Indicate whether the guideline is evidence-based or consensus-based.


General Characteristics

078

*What is the target population of the measure?

What populations are included in this measure? e.g., Medicare Fee for Service, Medicare Advantage, Medicaid, Children’s Health Insurance Program (CHIP), All Payer, etc.


General Characteristics

079

*Select ALL areas of specialty the measure is aimed at, or which specialties are most likely to report this measure

Select all areas of specialty that apply.

See Appendix A.079 for list choices. Copy/paste or enter your choice(s) here:


General Characteristics

080

*Evidence of performance gap

Evidence of a performance gap among the units of analysis in which the measure will be implemented. Provide analytic evidence that the units of analysis have room for improvement and, therefore, that the implementation of the measure would be meaningful. If you have lengthy text, add the evidence as an attachment, named to clearly indicate the related form field.


General Characteristics

081

*Unintended consequences

Summary of potential unintended consequences if the measure is implemented. Information can be taken from the CMS consensus-based entity Consensus Development Process (CDP) manuscripts or documents. If referencing CDP documents, you must submit the document or a link to the document, and the page being referenced.


CY 2023 Final Rule text:

CHARACTERISTICS

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

General Characteristics

091

*Measure Type

Select only one type of measure. For definitions, see:

https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Downloads/Blueprint.pdf.

  • Cost/Resource Use

  • Efficiency

  • Intermediate Outcome

  • Outcome

  • Outcome - (PRO-PM)

  • Process

  • Structure

  • Other (enter here):

n/a

n/a

If you select “Outcome” or “Outcome - (PRO-PM)” in Row 091, then Row 121 in the Evidence section becomes a required field. Continue to complete required General Characteristics and Evidence questions.

n/a

This is not a data entry field.

General Characteristics

092

*Is the measure a composite or component of a composite?

Select one


  • Composite measure

  • Component of a composite measure

  • Not a composite or component of a composite measure

General Characteristics

093

*Is this measure in the CMS Measures Inventory Tool (CMIT)?

Select Yes or No. Current measures can be found at https://cmit.cms.gov/CMIT_public/ListMeasures

  • Yes

  • No

n/a

n/a

If you select “Yes” in Row 093, then Row 094 becomes a required field.

n/a

This is not a data entry field.

General Characteristics

094

*CMIT ID

If the measure is currently in CMIT, enter the CMIT ID in the format #####-X-XXXXXXX. Current measures and CMIT IDs can be found at https://cmit.cms.gov/CMIT_public/ListMeasures

ADD YOUR CONTENT HERE


General Characteristics

095

Alternate Measure ID

DO NOT enter consensus-based entity (endorsement) ID, CMIT ID, or previous year MUC ID in this field. This is an alphanumeric identifier (if applicable), such as a recognized program ID number for this measure (20 characters or less). Examples: 199 GPRO HF-5; ACO 28; CTM-3; PQI #08.

ADD YOUR CONTENT HERE


General Characteristics

096

*What is the target population of the measure?

What populations are included in this measure? e.g., Medicare Fee for Service, Medicare Advantage, Medicaid, Children’s Health Insurance Program (CHIP), All Payer, etc.

ADD YOUR CONTENT HERE


General Characteristics

097

*What one area of specialty is the measure aimed at, or which specialty is most likely to report this measure?

Select one.

See Appendix A.097 for list choices. Copy/paste or enter your choice(s) here:


General Characteristics

098

*Evidence of performance gap

Evidence of a performance gap among the units of analysis in which the measure will be implemented. Provide analytic evidence that the units of analysis have room for improvement and, therefore, that the implementation of the measure would be meaningful. If you have lengthy text, add the evidence as an attachment, named to clearly indicate the related form field.

ADD YOUR CONTENT HERE


General Characteristics

099

*Unintended consequences

Summary of potential unintended consequences if the measure is implemented. Information can be taken from the CMS consensus-based entity Consensus Development Process (CDP) manuscripts or documents. If referencing CDP documents, you must submit the document or a link to the document, and the page being referenced.

ADD YOUR CONTENT HERE


Change #38

Location: Page 29-38

Reason for Change: Relocated Evidence section and added rows with updated language

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

[ADD YOUR CONTENT HERE]

Evidence

082

*Type of evidence to support the measure

Select all that apply

  • Clinical Guidelines

  • USPSTF (U.S. Preventive Services Task Force) Guidelines

  • Systematic Review

  • Empirical data

  • Other (enter here):

Evidence

083

If you select Clinical Guidelines and/or USPSTF Guidelines in Row 082 above, then Row 077 (Outline the Clinical Guidelines) becomes a required field.

n/a

This is not a data entry field.

Evidence

084

*Were the guidelines graded?

Select one

  • Yes

  • No

Evidence

085

*If yes, who graded the guidelines?

Specify the agency or organization(s) that graded the guidelines.


Evidence

086

*If yes, what was the grade?

Specify the grade that was assigned to the guidelines.


CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Evidence

100

*Type of evidence to support the measure

Select all that apply. Refer to the latest CMS Blueprint version (https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Downloads/Blueprint.pdf) and the supplementary material related to evidence review (https://www.cms.gov/files/document/blueprint-environmental-scans.pdf) to obtain updated guidance.

  • Clinical Guidelines or USPSTF (U.S. Preventive Services Task Force) Guidelines

  • Peer-Reviewed Systematic Review

  • Empirical data

  • Other (enter here):

n/a

n/a

If you select “Clinical Guidelines or USPSTF (U.S. Preventive Services Task Force) Guidelines” in Row 100, then Rows 101-102 become required fields. If you select “Peer-Reviewed Systematic Review” in Row 100, then Rows 115-116 become required fields. If you select “Empirical data” in Row 100, then Rows 117-118 become required fields. If you select “Other” in Row 100, then Rows 119-120 become required fields.

n/a

This is not a data entry field.

Evidence

101

*Number of clinical guidelines, including USPSTF guidelines that address this topic

Enter a numerical value of ≥1. Count all guidelines that are relevant to this measure topic including those that offer contradictory guidance.

Numeric field

Evidence

102

*Outline the clinical guideline(s) supporting this measure

Provide a detailed description of which guideline(s) support the measure and indicate for each whether it is evidence-based or consensus-based. Summarize the meaning/rationale of the guideline statements that are being referenced, their relation to the measure concept, how they support the measure whether directly or indirectly, and how the guideline statement(s) relate to the measure’s intended accountable entity. Describe the body of evidence that supports the statement(s) by describing the quantity, quality, and consistency of the studies that are pertinent to the guideline statements/sentence. Quantity of studies represents the number of studies and not the number of publications associated with a study. If the statement is advised by 3 publications reporting outcomes from the same RCT at 3 different time points, this is considered a single study and not 3 studies. If referencing a standard norm, which may or may not be driven by evidence, provide the description and rationale for this norm or threshold as reasoned by the guideline panel. If this is an outcome measure or PRO-PM, indicate how the evidence supports or demonstrates a link between at least one process, structure, or intervention and the outcome. Document the criteria used to assess the quality of the clinical guidelines, such as those proposed by the Institute of Medicine or the ECRI Guidelines Trust (see the CMS Blueprint (https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Downloads/Blueprint.pdf) and the supplementary material related to evidence review (https://www.cms.gov/files/document/blueprint-environmental-scans.pdf)). If there is lengthy text, describe the guidelines in an evidence attachment, named to clearly indicate the related form field.

ADD YOUR CONTENT HERE


Evidence

103

*Name the guideline developer/entity

If the response to the Number of clinical guidelines, including USPSTF guidelines, that address this measure topic is >1, identify the guideline that most closely aligns with and supports your measure concept. This is now referred to as the primary clinical guideline. Spell out the primary clinical guideline entity’s name followed by the appropriate acronym, if available.

For example: United States Preventive Services Task Force (USPSTF)

ADD YOUR CONTENT HERE


Evidence

104

*Publication year

Provide the publication year for the primary clinical guideline.

Use the 4-digit format (e.g., 2016).

Numeric field (4-digit year)

Evidence

105

*Full citation +/- URL

Provide the full citation for the primary clinical guideline in any established citation style (e.g., AMA, APA, Chicago, Vancouver, etc.) and the accompanying URL, if available.

ADD YOUR CONTENT HERE


Evidence

106

*Is this an evidence-based clinical guideline?

There are disparate methods of developing clinical guidance documents. An evidence-based guideline is one that uses evidence to inform the development of its recommendations. The evidence must be reviewed in a deliberate, systematic manner. To determine this, the developer must have provided a description of a systematic literature search and the search strategy, which includes the dates of the literature covered, the databases consulted, and a screening, review, and data extraction process. Select “No” for clinical guidelines that are based purely on expert consensus, with or without supplementation by a narrative (non-systematic) literature review.

  • Yes

  • No

Evidence

107

*Is the guideline graded?

A graded guideline is one which explicitly provides evidence rating and recommendation grading conventions in the document itself. Grades are usually found next to each recommendation statement. Select one.

  • Yes

  • No

n/a

n/a

If you select “Yes” in Row 107, then Rows 108-113 become required fields. If you select “No” in Row 107, then skip to Row 114.

n/a

This is not a data entry field.

Evidence

108

*List the guideline statement that most closely aligns with the measure concept.

If there is more than one statement from this clinical guideline that may be relevant to this measure concept, document the statement that most closely aligns with the measure concept as it is written in the guideline document. For example, Statement 1: In patients aged 65 years and older who have prediabetes, we recommend a lifestyle program similar to the Diabetes Prevention Program to delay progression to diabetes. No more than one statement should be written in the text box. All other relevant statements should be submitted in a separate evidence attachment.

ADD YOUR CONTENT HERE


Evidence

109

*What evidence grading system did the guideline use to describe strength of recommendation?

Select the evidence grading system used by the clinical guideline (e.g., GRADE or USPSTF) to describe the guideline statement’s strength of recommendation.

  • GRADE method

  • Modified GRADE

  • USPSTF

  • Other (enter here)

Evidence

110

*List all categories and corresponding definitions for the evidence grading system used to describe strength of recommendation in the guideline.

Insert the complete list of grading categories and their definitions.

ADD YOUR CONTENT HERE


Evidence

111

*For the guideline statement that most closely aligns with the measure concept, what is the associated strength of recommendation?

Select the associated strength of recommendation using the convention used by the guideline developer. Select one.

  • USPSTF Grade A, Strong recommendation or similar

  • USPSTF Grade B or D, Moderate recommendation or similar

  • USPSTF Grade C or I, Conditional/weak recommendation or similar

  • Expert Opinion

  • Other (enter here)

Evidence

112

*List all categories and corresponding definitions for the evidence grading system used to describe level of evidence or level of certainty in the evidence.

Insert the complete list of grading categories and their definitions.

ADD YOUR CONTENT HERE


Evidence

113

*For the guideline statement that most closely aligns with the measure concept, what is the associated level of evidence or level of certainty in the evidence?

Select the associated level of evidence or certainty of evidence using the convention used by the guideline developer. Select one.

  • High or similar

  • Moderate or similar

  • Low, Very Low or similar

  • Other (enter here)

Evidence

114

*List the guideline statement that most closely aligns with the measure concept.

If there is more than one statement from this clinical guideline that may be relevant to this measure concept, document the statement that most closely aligns with the measure concept as it is written in the guideline document. For example, Statement 1: In patients aged 65 years and older who have prediabetes, we recommend a lifestyle program similar to the Diabetes Prevention Program to delay progression to diabetes. No more than one statement should be written in the text box. All other relevant statements should be submitted in a separate evidence attachment.

ADD YOUR CONTENT HERE


Evidence

115

*Number of systematic reviews that inform this measure concept

Insert the number of peer-reviewed systematic reviews that address this measure topic. This includes systematic reviews that address the same intervention/process/structure but may have conflicting conclusions. Enter a numerical value of greater than or equal to 1.

Numeric field

Evidence

116

*Briefly summarize the peer-reviewed systematic review(s) that inform this measure concept

Summarize the peer-reviewed systematic review(s) that address this measure concept. For each systematic review, provide the number of studies within the systematic review that addressed the specifics defined in this measure concept, indicate whether a study-specific risk of bias/quality assessment was performed for each study, and describe the consistency of findings. Number of studies is not equivalent to the number of publications. If there are three publications from a single cohort study cited in the systematic review, report one when indicating the number of studies. For every systematic review cited, provide full citations using any established citation style. If this is an outcome measure or PRO-PM, indicate how the evidence supports or demonstrates a link between at least one process, structure, or intervention and the outcome. If there is lengthy text, submit details via an evidence attachment.

ADD YOUR CONTENT HERE


Evidence

117

*Source of empirical data

Select all that apply

  • Published, peer-reviewed original research

  • Published and publicly available reports (e.g., from agencies)

  • Internal data analysis

  • Other (enter here)

Evidence

118

*Summarize the empirical data

Provide a summary of the empirical data and how it informs this measure concept. Describe the limitations of the data and provide a full citation for each source of empirical data in any established citation style. If this is an outcome measure or PRO-PM, indicate how the evidence supports or demonstrates a link between at least one process, structure, or intervention and the outcome. If there is lengthy text, include details in a separate evidence attachment.

ADD YOUR CONTENT HERE


Evidence

119

*Name evidence type

If citing evidence other than clinical guidelines, peer-reviewed systematic reviews and empirical data, state the type of evidence referenced to inform this measure concept.

ADD YOUR CONTENT HERE


Evidence

120

*Summarize the evidence

Provide a summary of the other type(s) of evidence used to inform this measure concept. Describe the limitations of the data and provide a full citation for each piece of evidence cited in any established citation style. If this is an outcome measure or PRO-PM, indicate how the evidence supports or demonstrates a link between at least one process, structure, or intervention and the outcome. If there is lengthy text, include details in a separate evidence attachment.

ADD YOUR CONTENT HERE


Evidence

121

*Does the evidence discuss a link between at least one process, structure, or intervention and the outcome?

Select “Yes” if the evidence discussed in the evidence section demonstrates a link between at least one process, structure, or intervention and the outcome.

  • Yes

  • No

Change #39

Location: Page 35-38

Reason for Change: Relocated Risk Adjustment section and added rows with updated language

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

[ADD YOUR CONTENT HERE]

Risk Adjustment

087

*Is the measure risk adjusted, stratified, or both?

Select as many as apply.

  • Risk adjusted

  • Stratified

  • None


Risk Adjustment

088

*Are social determinants of health built into the risk adjustment model?

Select one. If it was determined that risk adjustment for social determinants of health was not appropriate for the risk model used, select “not applicable.” If risk adjustments for social determinants of health were appropriate but are not currently built in, select “no.”

  • Yes

  • No

  • Not Applicable

CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Risk Adjustment

122

*Is the measure risk adjusted?

Select one.

  • Yes

  • No

n/a

n/a

If you select “Yes” in Row 122, then Rows 123-124 become required fields and you should not answer Row 125. If you select “Yes” in Row 122 you are also encouraged to upload documentation about your risk adjustment model as an attachment. If you select “No” in Row 122, then skip to Row 125.

n/a

This is not a data entry field.

Risk Adjustment

123

*Risk adjustment variables

Select ALL risk adjustment variable types that are included in your final risk model. For more information on how to select risk factors for accountability measures, refer to the CMS Measures Management System Blueprint (https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Downloads/Blueprint.pdf). Select “Patient-level demographics” if the measure uses information related to each patient’s age, sex, race/ethnicity, etc. Select “Patient-level health status & clinical conditions” if the measure uses information specific to each individual patient about their health status prior to the start of care (e.g., case-mix adjustment). Select “Patient functional status” if the measure uses information specific to each individual patient’s functional status prior to the start of care (e.g., body function, ability to perform activities of daily living, etc.). Select “Patient-level social risk factors” if the measure uses patient-reported information related to their individual social risks (e.g., income, living alone, etc.). Select “Proxy social risk factors” if the measure uses data related to characteristics of the people in the patient’s community (e.g., neighborhood-level income from the census). Select “Patient community characteristics” if the measure uses information about the patient’s community (e.g., percent of vacant houses, crime rate). Select “Other” if the risk factor is related to the healthcare provider, health system, or other factor that is not related to the patient.

  • Patient-level demographics

  • Patient-level health status & clinical conditions

  • Patient functional status

  • Patient-level social risk factors

  • Proxy social risk factors

  • Patient community characteristics

  • Other (enter here):

Risk Adjustment

n/a

If you select “Patient-level demographics” in Row 123, then Row 124 becomes a required field. If you select “Patient-level health status & clinical conditions” in Row 123, then Row 125 becomes a required field. If you select “Patient functional status” in Row 123, then Row 126 becomes a required field. If you select “Patient-level social risk factors” in Row 123, then Row 127 becomes a required field. If you select “Proxy social risk factors” in Row 123, then Row 128 becomes a required field. If you select “Patient community characteristics” in Row 123, then Row 129 becomes a required field.

n/a

This is not a data entry field.

Risk Adjustment

124

*Patient-level demographics: please select all that apply

Select all that apply

  • Age

  • Sex

  • Gender

  • Race/ethnicity

  • Other (enter here):

Risk Adjustment

125

*Patient-level health status & clinical conditions: please select all that apply

Select all that apply

  • Case-Mix Adjustment

  • Severity of Illness

  • Health behaviors/health choices

  • Other (enter here):

Risk Adjustment

126

*Patient functional status: please select all that apply

Select all that apply

  • Body Function

  • Ability to perform activities of daily living

  • Other (enter here):

Risk Adjustment

127

*Patient-level social risk factors: please select all that apply

Select all that apply

  • Income

  • Education

  • Wealth

  • Living Alone

  • Social Support

  • Other (enter here):

Risk Adjustment

128

*Proxy social risk factors: please select all that apply

Select all that apply

  • Neighborhood Level Income from the Census

  • Dual Eligibility for Medicare and Medicaid

  • Other (enter here):

Risk Adjustment

129

*Patient community characteristics: please select all that apply

Select all that apply

  • Percent of Vacant Houses

  • Crime Rate

  • Urban/Rural

  • Other (enter here):

Risk Adjustment

130

*Risk model performance

Provide empirical evidence that the risk model adequately accounts for confounding factors (e.g., c-statistics). Describe your interpretation of the results.

ADD YOUR CONTENT HERE


Risk Adjustment

131

*Rationale for not using risk adjustment

Select ALL reasons for not implementing a risk adjustment model in the measure. For more information on measure types that do not require risk adjustment, refer to the CMS Measures Management System Blueprint (https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Downloads/Blueprint.pdf).

  • Addressed through exclusions (e.g., process measures)

  • Addressed through stratification of results

  • Not conceptually or empirically indicated (enter here):

  • Other (enter here):

Change #40

Location: Page 39

Reason for Change: Relocated Healthcare Domain section

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

[ADD YOUR CONTENT HERE]

Healthcare Domain

089

*What one healthcare domain applies to this measure?

Select the ONE most applicable healthcare domain. For more information, see: https://www.cms.gov/meaningful-measures-20-moving-measure-reduction-modernization

  • Person-Centered Care

  • Equity

  • Safety

  • Affordability and Efficiency

  • Chronic Conditions

  • Wellness and Prevention

  • Seamless Care Coordination

  • Behavioral Health

CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Healthcare Domain

132

*What one healthcare domain applies to this measure?

Select the ONE most applicable healthcare domain. For more information, see: https://www.cms.gov/meaningful-measures-20-moving-measure-reduction-modernization

  • Person-Centered Care

  • Equity

  • Safety

  • Affordability and Efficiency

  • Chronic Conditions

  • Wellness and Prevention

  • Seamless Care Coordination

  • Behavioral Health

Change #41

Location: Page 39-40

Reason for Change: Relocated Endorsement Characteristics section with updated language

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

[ADD YOUR CONTENT HERE]

Endorsement Characteristics

090

*What is the endorsement status of the measure?

Select only one. For information on consensus-based entity (CMS contractor) endorsement, measure ID, and other information, refer to: http://www.qualityforum.org/QPS/

  • Endorsed

  • Endorsement Removed

  • Submitted

  • Failed endorsement

  • Never submitted

Endorsement Characteristics

091

*CBE ID (CMS consensus-based entity, or endorsement ID)

Four- or five-character identifier with leading zeros and following letter if needed. Add a letter after the ID (e.g., 0064e) and place zeros ahead of ID if necessary (e.g., 0064). If no CBE ID number is known, enter numerals 9999.


Endorsement Characteristics

092

If endorsed: Is the measure being submitted exactly as endorsed by the CMS CBE?

Select 'Yes' or 'No'. Note that 'Yes' should only be selected if the submission is an EXACT match to the CBE-endorsed measure.

  • Yes

  • No

Endorsement Characteristics

093

If not exactly as endorsed, specify the locations of the differences

Indicate which specification fields are different. Select all that apply.

  • Measure title

  • Description

  • Numerator

  • Denominator

  • Exclusions

  • Target Population

  • Setting (for testing)

  • Level of analysis

  • Data source

  • eCQM status

  • Other (enter here and see next field):

Endorsement Characteristics

094

If not exactly as endorsed, describe the nature of the differences

Briefly describe the differences


Endorsement Characteristics

095

If endorsed: Year of most recent CDP endorsement

Select one

  • None

  • 2017

  • 2018

  • 2019

  • 2020

  • 2021

Endorsement Characteristics

096

Year of next anticipated CDP endorsement review

Select one. If you are submitting for initial endorsement, select the anticipated year.

  • None

  • 2021

  • 2022

  • 2023

  • 2024

  • 2025

CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Endorsement Characteristics

133

*What is the endorsement status of the measure?

Select only one. For information on consensus-based entity (CMS contractor) endorsement, measure ID, and other information, refer to: http://www.qualityforum.org/QPS/

  • Endorsed

  • Endorsement removed

  • Submitted

  • Failed endorsement

  • Never submitted

Endorsement Characteristics

134

*CBE ID (CMS consensus-based entity, or endorsement ID)

Four- or five-character identifier with leading zeros and following letter if needed. Add a letter after the ID (e.g., 0064e) and place zeros ahead of ID if necessary (e.g., 0064). If no CBE ID number is known, enter numerals 9999.

ADD YOUR CONTENT HERE


Endorsement Characteristics

135

If endorsed: Is the measure being submitted exactly as endorsed by the CMS CBE?

Select 'Yes' or 'No'. Note that 'Yes' should only be selected if the submission is an EXACT match to the CBE-endorsed measure.

  • Yes

  • No

n/a

n/a

If you select “No” in Row 135, then Rows 136-137 become required fields.

n/a

This is not a data entry field.

Endorsement Characteristics

136

If not exactly as endorsed, specify the locations of the differences

Indicate which specification fields are different. Select all that apply.

  • Measure title

  • Description

  • Numerator

  • Denominator

  • Exclusions

  • Target population

  • Setting (for testing)

  • Level of analysis

  • Data source

  • eCQM status

  • Other (enter here and see next field):

Endorsement Characteristics

137

If not exactly as endorsed, describe the nature of the differences

Briefly describe the differences

ADD YOUR CONTENT HERE


Endorsement Characteristics

138

If endorsed: Year of most recent CDP endorsement

Select one

  • None

  • 2018

  • 2019

  • 2020

  • 2021

  • 2022

Endorsement Characteristics

139

Year of next anticipated CDP endorsement review

Select one. If you are submitting for initial endorsement, select the anticipated year.

  • None

  • 2022

  • 2023

  • 2024

  • 2025

  • 2026

Change #42

Location: Page 40-41

Reason for Change: Relocated GROUPS section with added rows and language

CY 2022 Final Rule text:

GROUPS

Subsection

Row

Field Label

Guidance

[ADD YOUR CONTENT HERE]

N/A

097

*Is this measure an electronic clinical quality measure (eCQM)?

Select 'Yes' or 'No'. If your answer is yes, the Measure Authoring Tool (MAT) ID number must be provided below. For more information on eCQMs, see: https://www.emeasuretool.cms.gov/

  • Yes

  • No

N/A

098

*If eCQM: Measure Authoring Tool (MAT) Number

You must attach Bonnie test cases for this measure, with 100% logic coverage (test cases should be appended), attestation that value sets are published in Value Set Authority Center (VSAC), and feasibility scorecard. If not an eCQM, or if MAT number is not available, enter 0.


N/A

099

* If eCQM, does the measure have a Health Quality Measures Format (HQMF) specification in alignment with the latest HQMF and eCQM standards, and does the measure align with Clinical Quality Language (CQL) and Quality Data Model (QDM)?

Select 'Yes' or 'No'. For additional information on HQMF standards, see: https://ecqi.healthit.gov/tool/hqmf

  • Yes

  • No

Burden

100

* If this measure is an eCQM, does any electronic health record (EHR) system tested need to be modified?

Select one

  • Yes

  • No

Burden

101

*If yes, how would you describe the degree of effort?

Select one

  • 1 (little to no effort)

  • 2

  • 3

  • 4

  • 5 (substantial effort)

CY 2023 Final Rule text:

GROUPS

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

n/a

140

*Is this measure an electronic clinical quality measure (eCQM)?

Select 'Yes' or 'No'. If your answer is yes, the Measure Authoring Tool (MAT) ID number must be provided below. For more information on eCQMs, see: https://www.emeasuretool.cms.gov/

  • Yes

  • No

n/a

n/a

If you select “Yes” in Row 140, then Rows 141-143 become required fields. If you select “No” in Row 140, then skip to Row 144.

n/a

This is not a data entry field.

n/a

141

* Measure Authoring Tool (MAT) Number

You must attach Bonnie test cases for this measure, with 100% logic coverage (test cases should be appended), attestation that value sets are published in Value Set Authority Center (VSAC), and feasibility scorecard. If not an eCQM, or if MAT number is not available, enter 0.

ADD YOUR CONTENT HERE


n/a

142

* If eCQM, does the measure have a Health Quality Measures Format (HQMF) specification in alignment with the latest HQMF and eCQM standards, and does the measure align with Clinical Quality Language (CQL) and Quality Data Model (QDM)?

Select 'Yes' or 'No'. For additional information on HQMF standards, see: https://ecqi.healthit.gov/tool/hqmf

  • Yes

  • No

n/a

143

* If eCQM, does any electronic health record (EHR) system tested need to be modified?

Select “Yes” if any of the EHR systems tested had to modify how data were entered by providers or stored to facilitate calculation of the eCQM. Select “No” if the data needed to calculate the eCQM were already included in structured fields in the EHR systems tested and none of them needed to be modified.

  • Yes

  • No

Change #43

Location: Page 41-42

Reason for Change: Relocated Similar Measures section with added language

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

[ADD YOUR CONTENT HERE]

Similar In-Use Measures

104

*Is this measure similar to and/or competing with measure(s) already in a program?

Select either Yes or No. Consider other measures with similar purposes.

  • Yes

  • No

Similar In-Use Measures

105

If Yes: Which measure(s) already in a program is your measure similar to and/or competing with?

Identify the other measure(s) including title and any other unique identifier.


Similar In-Use Measures

106

If Yes: How will this measure add value to the CMS program?

Describe benefits of this measure, in comparison to measure(s) already in a program.


Similar In-Use Measures

107

If Yes: How will this measure be distinguished from other similar and/or competing measures?

Describe key differences that set this measure apart from others.


CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Similar In-Use Measures

144

*Is this measure similar to and/or competing with measure(s) already in a program?

Select either Yes or No. Consider other measures with similar purposes.

  • Yes

  • No

n/a

n/a

If you select “Yes” in Row 144, then Rows 145-147 become required fields. If you select “No” in Row 144, then skip to Row 148.

n/a

This is not a data entry field.

Similar In-Use Measures

145

If Yes: Which measure(s) already in a program is your measure similar to and/or competing with?

Identify the other measure(s) including title and any other unique identifier.

ADD YOUR CONTENT HERE


Similar In-Use Measures

146

If Yes: How will this measure add value to the CMS program?

Describe benefits of this measure, in comparison to measure(s) already in a program.

ADD YOUR CONTENT HERE


Similar In-Use Measures

147

If Yes: How will this measure be distinguished from other similar and/or competing measures?

Describe key differences that set this measure apart from others.

ADD YOUR CONTENT HERE


Change #44

Location: Page 42-43

Reason for Change: Relocated Previous measures section and added language

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

[ADD YOUR CONTENT HERE]

Previous Measures

108

*Was this measure published on a previous year's Measures under Consideration list?

Select 'Yes' or 'No'. If yes, you are submitting an existing measure for expansion into additional CMS programs or the measure has substantially changed since originally published.

  • Yes

  • No

Previous Measures

109

In what prior year(s) was this measure published?

Select all that apply. NOTE: If your measure was published on more than one prior annual MUC List, as you use the MERIT interface, click “Add Another Measure” and complete the information section for each of those years.

  • None

  • 2011

  • 2012

  • 2013

  • 2014

  • 2015

  • 2016

  • 2017

  • 2018

  • 2019

  • 2020

  • Other (enter here):

Previous Measures

110

What were the MUC IDs for the measure in each year?

List both the year and the associated MUC ID number in each year. If unknown, enter N/A.


Previous Measures

111

List the CMS CBE MAP workgroup(s) in each year

List both the year and the associated workgroup name in each year. Workgroup options: Clinician; Hospital; Post-Acute Care/Long-Term Care; Coordinating Committee. Example: "Clinician, 2014."


Previous Measures

112

What were the programs that MAP reviewed the measure for in each year?

List both the year and the associated program name in each year.


Previous Measures

113

What was the MAP recommendation in each year?

List the year(s), the program(s), and the associated recommendation(s) in each year. Options: Support; Do Not Support; Conditionally Support; Refine and Resubmit.


Previous Measures

114

Why was the measure not recommended by the MAP workgroups in those year(s)?

Briefly describe the reason(s) if known.


Previous Measures

115

MAP report page number being referenced for each year

List both the year and the associated MAP report page number for each year.


Previous Measures

116

If this measure is being submitted to meet a statutory requirement, list the corresponding statute

List title and other identifying citation information.


CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Previous Measures

148

*Was this measure published on a previous year's Measures Under Consideration List?

Select 'Yes' or 'No'. If yes, you are submitting an existing measure for expansion into additional CMS programs or the measure has substantially changed since originally published.

  • Yes

  • No

n/a

n/a

If you select “Yes” in Row 148, then Rows 149-154 become required fields. If you select “No” in Row 148, then skip to Row 155.

n/a

This is not a data entry field.

Previous Measures

149

*In what prior year(s) was this measure published?

Select all that apply. NOTE: If your measure was published on more than one prior annual MUC List, as you use the MERIT interface, click “Add Another Measure” and complete the information section for each of those years.

  • None

  • 2012

  • 2013

  • 2014

  • 2015

  • 2016

  • 2017

  • 2018

  • 2019

  • 2020

  • 2021

  • Other (enter here):

Previous Measures

150

*What were the MUC IDs for the measure in each year?

List both the year and the associated MUC ID number in each year. If unknown, enter N/A.

ADD YOUR CONTENT HERE


Previous Measures

151

*List the CMS CBE MAP workgroup(s) in each year

List both the year and the associated workgroup name in each year. Workgroup options: Clinician; Hospital; Post-Acute Care/Long-Term Care; Coordinating Committee. Example: "Clinician, 2014."

ADD YOUR CONTENT HERE


Previous Measures

152

*What were the programs that MAP reviewed the measure for in each year?

List both the year and the associated program name in each year.

ADD YOUR CONTENT HERE


Previous Measures

153

*What was the MAP recommendation in each year?

List the year(s), the program(s), and the associated recommendation(s) in each year. Options: Support; Do Not Support; Conditionally Support; Refine and Resubmit.

ADD YOUR CONTENT HERE


Previous Measures

154

*Why was the measure not recommended by the MAP workgroups in those year(s)?

Briefly describe the reason(s) if known.

ADD YOUR CONTENT HERE


Previous Measures

155

*MAP report page number being referenced for each year

List both the year and the associated MAP report page number for each year.

ADD YOUR CONTENT HERE


Previous Measures

156

*If this measure is being submitted to meet a statutory requirement, list the corresponding statute

List title and other identifying citation information.

ADD YOUR CONTENT HERE


Change #45

Location: Page 44

Reason for Change: Relocated the Attachment(s) section and added language.

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

[ADD YOUR CONTENT HERE]

N/A

117

Attachment(s)

You are encouraged to attach the measure information form (MIF) if available. This is a detailed description of the measure used by the CMS consensus-based entity (CBE) during endorsement proceedings. If a MIF is not available, comprehensive measure methodology documents are encouraged.

If you are submitting for MIPS (either Quality or Cost), you are required to download the MIPS Peer Reviewed Journal Article Template and attach the completed form to your submission using the “Attachments” feature. See https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/QualityMeasures/Pre-Rulemaking. If eCQM, you must attach MAT Output/HQMF, Bonnie test cases for this measure, with 100% logic coverage (test cases should be appended), attestation that value sets are published in VSAC, and feasibility scorecard.


N/A

118

MIPS Peer Reviewed Journal Article Template

Select Yes or No. For those submitting measures to MIPS program, enter “Yes.” Attach your completed Peer Reviewed Journal Article Template.

  • Yes

  • No

CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

N/A

157

Attachment(s)

You are encouraged to attach the measure information form (MIF) if available. This is a detailed description of the measure used by the CMS consensus-based entity (CBE) during endorsement proceedings. If a MIF is not available, comprehensive measure methodology documents are encouraged.

If you are submitting for MIPS (either Quality or Cost), you are required to download the MIPS Peer Reviewed Journal Article Template and attach the completed form to your submission using the “Attachments” feature. See https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/QualityMeasures/Pre-Rulemaking. If your measure is risk adjusted, you are encouraged to attach documentation that provides additional detail about the measure risk adjustment model such as variables included, associated code system codes, and risk adjustment model coefficients. If eCQM, you must attach MAT Output/HQMF, Bonnie test cases for this measure, with 100% logic coverage (test cases should be appended), attestation that value sets are published in VSAC, and feasibility scorecard.

ADD YOUR CONTENT HERE


N/A

158

MIPS Peer Reviewed Journal Article Template

Select Yes or No. For those submitting measures to MIPS program, enter “Yes.” Attach your completed Peer Reviewed Journal Article Template.

  • Yes

  • No

Change #46

Location: Page 44

Reason for Change: Relocated the Submitter Comments section.

CY 2022 Final Rule text:

Subsection

Row

Field Label

Guidance

[ADD YOUR CONTENT HERE]

N/A

119

Submitter Comments

Any notes, qualifiers, external references, or other information not specified above.


CY 2023 Final Rule text:

Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

N/A

159

Submitter Comments

Any notes, qualifiers, external references, or other information not specified above.

ADD YOUR CONTENT HERE


Change #47

Location: Pages 45-46, Appendix

Reason for Change: Renumbered choices for Measure Steward and Long-Term Measure Steward.

CY 2022 Final Rule text: A.065-067 Choices for Measure Steward (065) and Long-Term Measure Steward (if different) (067)

CY 2023 Final Rule text: A.084-086 Choices for Measure Steward (084) and Long-Term Measure Steward (if different) (086)

Change #48

Location: Page 46, Appendix – Choices for Areas of Specialty

Reason for Change: Renumbered choices for areas of specialty.

CY 2022 Final Rule text: A.079 Choices for Areas of specialty (079)

CY 2023 Final Rule text: A.097 Choices for Areas of specialty (097)

Change #49

Location: Page 46, Row 96

Reason for Change: Changed expiration date from 1/31/2022 to 1/31/2025

CY 2022 Final Rule text: According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this information collection is 0938-1314 (Expiration date: 01/31/2022). The time required to complete this information collection is estimated to average 3.5 hours per response, including the time to review instructions, search existing data resources, gather the data needed, and complete and review the information collection. If you have comments concerning the accuracy of the time estimate(s) or suggestions for improving this form, please write to: CMS, 7500 Security Boulevard, Attn: PRA Reports Clearance Officer, Mail Stop C4-26-05, Baltimore, Maryland 21244-1850. ****CMS Disclosure**** Please do not send applications, claims, payments, medical records or any documents containing sensitive information to the PRA Reports Clearance Office. Please note that any correspondence not pertaining to the information collection burden approved under the associated OMB control number listed on this form will not be reviewed, forwarded, or retained. If you have questions or concerns regarding where to submit your documents, please contact QPP at qpp@cms.hhs.gov

CY 2023 Final Rule text: According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this information collection is 0938-1314 (Expiration date: 01/31/2025). The time required to complete this information collection is estimated to average 3.5 hours per response, including the time to review instructions, search existing data resources, gather the data needed, and complete and review the information collection. If you have comments concerning the accuracy of the time estimate(s) or suggestions for improving this form, please write to: CMS, 7500 Security Boulevard, Attn: PRA Reports Clearance Officer, Mail Stop C4-26-05, Baltimore, Maryland 21244-1850. ****CMS Disclosure**** Please do not send applications, claims, payments, medical records or any documents containing sensitive information to the PRA Reports Clearance Office. Please note that any correspondence not pertaining to the information collection burden approved under the associated OMB control number listed on this form will not be reviewed, forwarded, or retained. If you have questions or concerns regarding where to submit your documents, please contact QPP at qpp@cms.hhs.gov
