Survey on Core Competencies for the Direct Service Workforce (CMS-10512)

OMB: 0938-1229


OMB Package Supporting Statement B Updated 10/28/2013

Direct Service Workforce Resource Center CC Survey Instrument

Section B

1—Respondent Universe and Sampling Methods

The survey is expected to have a respondent universe of 4,800. This universe will include respondents from every respondent sector and service population in each of the four states selected for data collection.

The Direct Service Workforce Resource Center (DSW RC) seeks to obtain an equal number of respondents per sector per state. Exhibit 1 shows the projected breakdown of respondents by population, sector and state.

Exhibit 1: Sampling Strategy at the State Level

Respondent sector                     | Aging | Behavioral Health | ID/DD | Physical Disabilities | Total per state | Total across four states
Direct service workers                | 150   | 150               | 150   | 150                   | 600             | 2,400
Front Line Supervisors and Managers   | 50    | 50                | 50    | 50                    | 200             | 800
Agency Administrators and Directors   | 50    | 50                | 50    | 50                    | 200             | 800
Self-Advocates and Guardians          | 50    | 50                | 50    | 50                    | 200             | 800
Total per state                       | 300   | 300               | 300   | 300                   | 1,200           | 4,800
Total across four states              | 1,200 | 1,200             | 1,200 | 1,200                 |                 | 4,800


The DSW RC plans to administer the survey in four states and expects to receive responses from 1,200 respondents in each state, for a total sample of 4,800 respondents. Assuming a 20% response rate, the DSW RC and its partners will send the survey to 24,000 prospective respondents.
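For illustration only, the short Python sketch below back-calculates the mailing counts implied by the Exhibit 1 targets under the assumed uniform 20% response rate; the per-sector invitation figures it prints are arithmetic consequences of those assumptions, not part of the approved mailing plan.

```python
# Back-calculate invitation counts from the Exhibit 1 targets, assuming a
# uniform 20% response rate across all sectors, populations, and states
# (an assumption for illustration; actual rates will vary by sector).
RESPONSE_RATE = 0.20
STATES = 4
POPULATIONS = 4  # Aging, Behavioral Health, ID/DD, Physical Disabilities

targets_per_population_per_state = {
    "Direct service workers": 150,
    "Front Line Supervisors and Managers": 50,
    "Agency Administrators and Directors": 50,
    "Self-Advocates and Guardians": 50,
}

total_invitations = 0
for sector, target in targets_per_population_per_state.items():
    per_cell = target / RESPONSE_RATE          # invitations per population, per state
    sector_total = per_cell * POPULATIONS * STATES
    total_invitations += sector_total
    print(f"{sector}: {per_cell:.0f} per population per state, "
          f"{sector_total:,.0f} overall")

print(f"Total invitations: {total_invitations:,.0f}")  # 24,000 for a 4,800-respondent sample
```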

No individual may participate in multiple projects associated with the Core Competency Validation. Accordingly, pretest respondents and people who have otherwise given feedback on the Core Competencies are ineligible to participate in the survey. Similarly, survey respondents will be unable to provide feedback on the set of core competencies following survey completion. This will ensure that the DSW RC collects an unduplicated data set.

The DSW RC will control the number of respondents from each sector. Once the survey closes, the DSW RC will adjust response numbers from each sector as appropriate, either by seeking additional data and respondents from under-represented sectors or by reducing the number of surveys retained from over-represented sectors. This will ensure that each population is fairly represented in data analysis and presentation.
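The DSW RC's specific adjustment procedure is not detailed in this statement. As a rough sketch of one common approach, the Python fragment below randomly down-samples any sector that exceeds its Exhibit 1 target so that no sector is over-represented in the analysis file; the record layout (a "sector" key on each response) and the use of a simple random cap are assumptions made for illustration.

```python
import random

# Illustrative only: down-sample over-represented sectors to their
# Exhibit 1 targets so no sector dominates the analysis file.
SECTOR_TARGETS = {
    "Direct service workers": 2400,
    "Front Line Supervisors and Managers": 800,
    "Agency Administrators and Directors": 800,
    "Self-Advocates and Guardians": 800,
}

def rebalance_by_sector(responses, targets=SECTOR_TARGETS, seed=0):
    rng = random.Random(seed)
    by_sector = {}
    for record in responses:
        by_sector.setdefault(record["sector"], []).append(record)

    balanced = []
    for sector, records in by_sector.items():
        cap = targets.get(sector, len(records))
        if len(records) > cap:
            records = rng.sample(records, cap)  # drop surplus responses at random
        balanced.extend(records)
    return balanced
```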

2—Recruitment Strategy

The survey will be distributed to prospective respondents in four states. These states will be selected based on ethnic and racial diversity, diversity in urban/rural populations, service delivery models, and geography. Further, states with existing relationships with the DSW RC and its partners will be prioritized to maximize response rates.

To enhance the likelihood of a high participation rate in this survey, the DSW RC and CMS have developed the following contact/recruitment strategy. To begin, the DSW RC will work with its contacts in appropriate state agencies who can navigate the cross-sector recruitment efforts. These contacts will help identify provider agencies, fiscal support entities, centers for independent living, and other appropriate partners to assist in facilitating and conducting recruitment of survey respondents.

The survey will use a purposeful convenience sampling approach. The DSW RC will provide all documents and protocols needed for the mailings or emails used in recruitment, including the introductory emails respondents receive, links to the survey website, explanations of the survey’s purpose, uses of response data, and other items as needed.

3—Methods to Maximize Response Rates and Deal with Nonresponse

The Direct Service Workforce Resource Center will use a number of proven methods to maximize participation in the survey. These include:

  • Use of a relatively short and uncomplicated survey instrument with clear instructions for completion;

  • Limited number of open-ended questions in the survey instrument;

  • Flexibility about the time of response (eight weeks); and

  • Tracking of responses and follow-up with non-respondents via email, phone, and postcard mailings.

The Direct Service Workforce Resource Center team will calculate a nonresponse rate for the overall survey. To evaluate nonresponse bias, we will compare respondents and non-respondents on information available from the sampling frames compiled by the states with assistance from the DSW RC, as described above, to determine whether response rates vary across those attributes and whether respondents and non-respondents differ on those characteristics. We will also assess potential nonresponse bias by analyzing differences between respondents and initial refusals who later responded, broken out by the level of effort required to obtain a response (e.g., the number of mail reminders sent or telephone calls made).
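As a rough sketch of the frame-based comparison described above, the Python (pandas) fragment below tabulates response rates by attributes carried on the sampling frame. The column names ("responded", "state", "sector") and the single combined frame file are assumptions made for illustration, not the actual frame layout.

```python
import pandas as pd

# Sketch: compute response rates by frame attribute to see whether
# respondents and non-respondents differ on known characteristics.
def response_rates_by_attribute(frame: pd.DataFrame, attributes):
    summaries = {}
    for attr in attributes:
        summaries[attr] = (
            frame.groupby(attr)["responded"]   # "responded" is a 0/1 flag (assumed)
                 .agg(n="size", response_rate="mean")
        )
    return summaries

# Example use (file name is hypothetical):
# frame = pd.read_csv("combined_sampling_frame.csv")
# for attr, table in response_rates_by_attribute(frame, ["state", "sector"]).items():
#     print(attr); print(table)
```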

In addition, we will evaluate nonresponse bias at the item level.

The data entry system will provide administrative codes for each question so that data entry staff can designate non-responses and confusing or contradictory responses. Missing data will be assigned a “missing” code, and the number of respondents will be noted for each statistic reported.
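A minimal sketch of this item-level check is shown below, assuming a rectangular analysis file in which missing answers carry a sentinel “missing” code; the sentinel value and column layout are assumptions for illustration only.

```python
import pandas as pd

MISSING_CODE = -9  # assumed sentinel for the "missing" designation

# For each survey item, count how many respondents gave a usable answer
# and what share were coded missing, so the N can be reported alongside
# each statistic.
def item_nonresponse_summary(data: pd.DataFrame, item_columns):
    rows = []
    for item in item_columns:
        answered = data[item].ne(MISSING_CODE) & data[item].notna()
        rows.append({
            "item": item,
            "n_answered": int(answered.sum()),
            "pct_missing": round(100 * (1 - answered.mean()), 1),
        })
    return pd.DataFrame(rows)
```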

4—Tests of Procedures or Methods to be Undertaken

Reading level testing

Microsoft Word’s reading level tool will be used to test the reading level of the survey’s questions. The tool uses the Flesch-Kincaid Grade Level test, a validated and published measure that rates text on a U.S. school grade level. Question and instruction language were modified to appropriate grade reading levels, based on previous research indicating the reading-level range needed to ensure that the surveys are understandable by this workforce.
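Microsoft Word reports this score automatically. Purely for reference, the sketch below applies the published Flesch-Kincaid Grade Level formula, 0.39 * (words per sentence) + 11.8 * (syllables per word) - 15.59, using a simple vowel-group syllable heuristic that will not match Word’s internal syllable counts exactly.

```python
import re

def count_syllables(word):
    # Rough heuristic: count groups of consecutive vowels and drop a
    # trailing silent "e"; Word's internal counts will differ slightly.
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text):
    # Flesch-Kincaid Grade Level:
    #   0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words)
            - 15.59)

print(round(flesch_kincaid_grade(
    "Direct service workers help people live in their own homes."), 1))
```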

Pretesting

The DSW RC will conduct pretesting to confirm the burden of the survey as well as the applicability, appropriateness, and accessibility of its questions. Pretesting will be conducted online using the same survey software planned for the final version. The survey software records how much time the survey takes to complete, and this will be used to estimate the hour burden. Immediately after the pretest is completed, an interactive teleconference will be held with the respondents and representatives from the DSW RC to gather information about difficult questions and language, the survey’s burden, and general thoughts and/or comments about the survey and the survey software. The DSW RC estimates the pretest will take 1.5 hours per respondent: 30 minutes for completing the instrument and 60 minutes for the teleconference. The DSW RC will facilitate and host the teleconference. Pretest respondents will be representative of the various audiences the final survey will reach, and there will be no more than nine pretest respondents in total. All feedback from pretest respondents will be taken into consideration and reflected in the final survey instrument.

5—Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The investigators chosen by CMS to conduct this study include the following individuals:

  • Carrie Blakeway, Managing Consultant, The Lewin Group, 703-269-5711, Project Director

  • Erika Robbins, Senior Consultant, The Lewin Group, 937-828-1500, Project Manager

  • Lori Sedlezky, Director of Knowledge Translation, Research and Training Center, University of Minnesota, 612-624-7668

  • Annie Johnson Sirek, Project Coordinator, Research and Training Center, University of Minnesota, 612-626-0535

  • Betsy Dilla, Senior Research Analyst, The Lewin Group, 703-269-5574

  • Brendan Flinn, Senior Research Analyst, The Lewin Group, 703-269-5722

  • Corey Lipow, Research Consultant, The Lewin Group, 703-269-5628

  • Michael Hoge, Professor & Director of Clinical Training in Psychology, Department of Psychiatry, Yale University School of Medicine, 203-785-5629

  • Amy Hewitt, Senior Research Associate, Research and Training Center, Institute on Community Living, University of Minnesota, 612-625-1098, hewit005@umn.edu

The CMS task order officers for this study are:

  • Kathryn King, Project Officer, Division of Community Systems Transformation, Disabled & Elderly Health Programs Group, Center for Medicaid and State Operations, 410-786-1283, Kathryn.king@cms.hhs.gov

  • Susan Joslin, Health Insurance Specialist, Division of Community Systems Transformation, Disabled & Elderly Health Programs Group, Centers for Medicare & Medicaid Services, 410-786-0268, Susan.Joslin@cms.hhs.gov

